Rob Reid: The Existential Threat of Engineered Viruses and Lab Leaks | Lex Fridman Podcast #193
Chapters
0:00 Introduction
2:28 The most entertaining outcome is the most likely
8:47 Meme theory
12:07 Writing process
18:54 Engineered viruses as a threat to human civilization
26:40 Gain-of-function research on viruses
38:50 Did COVID leak from a lab?
46:10 Virus detection
53:59 Failure of institutions
61:43 Using AI to engineer viruses
66:02 Evil and competence
75:21 Where are the aliens?
79:14 Backing up human consciousness by colonizing space
88:43 Superintelligence and consciousness
100:07 Meditation
108:15 Fasting
114:15 Greatest song of all time
119:41 Early days of music streaming
131:34 Startup advice
144:45 Podcasting
160:07 Advice for young people
169:10 Mortality
174:36 Meaning of life
00:00:00.000 |
The following is a conversation with Rob Reid, 00:00:02.640 |
entrepreneur, author, and host of the After On podcast. 00:00:07.280 |
Sam Harris recommended that I absolutely must talk to Rob 00:00:11.120 |
about his recent work on the future of engineered pandemics. 00:00:15.080 |
I then listened to the four-hour special episode 00:00:21.000 |
titled Engineering the Apocalypse, and I was floored. 00:00:28.840 |
Athletic Greens, Belcampo, Fundrise, and NetSuite. 00:00:33.520 |
Check them out in the description to support this podcast. 00:00:43.800 |
of gain-of-function research on coronaviruses 00:00:49.480 |
that was then accidentally leaked due to human error. 00:00:53.000 |
For context, this lab is biosafety level four, BSL-4, 00:01:01.800 |
but if you look at all the human-in-the-loop pieces 00:01:11.440 |
To me, whether the virus leaked from the lab or not, 00:01:15.760 |
is about much more than this particular catastrophic case. 00:01:24.800 |
of how well we can prepare and respond to threats 00:01:28.120 |
that can cripple or destroy human civilization. 00:01:31.360 |
If we continue gain-of-function research on viruses, 00:01:36.840 |
and they will be more deadly and more contagious. 00:01:42.320 |
or we can openly and honestly talk about the risks involved. 00:01:45.760 |
This research can both save and destroy human life on Earth 00:01:56.640 |
if scientists self-censor conversations about this, 00:02:00.080 |
we'll become merely victims of our brief Homo sapiens story, 00:02:07.160 |
too carelessly labeling ideas as misinformation 00:02:12.760 |
will eventually destroy our ability to discover the truth. 00:02:16.440 |
And without truth, we don't have a fighting chance 00:02:38.160 |
What do you think of the Elon Musk hypothesis 00:02:40.840 |
that the most entertaining outcome is the most likely? 00:02:54.200 |
of progressing our civilization that's fun to watch. 00:02:59.360 |
He said, from the standpoint of the observer, 00:03:05.640 |
those were, I think, just a couple of freestanding tweets 00:03:08.080 |
and delivered without a whole lot of wrapper of context, 00:03:11.640 |
so it's left to the mind of the reader of the tweets 00:03:35.080 |
And to me, that suggests, particularly coming from Elon, 00:03:41.220 |
that somebody is out there and has far greater insights 00:04:11.760 |
Then on top of that, when you think about it, 00:04:23.420 |
and other developments that you weren't foreseeing. 00:04:33.140 |
but there aren't a lot of surprises in there. 00:04:34.920 |
So now I'm thinking we need a producer and an observer 00:04:43.720 |
because Elon said the most entertaining outcome 00:04:48.360 |
So there's lots of layers for thinking about that. 00:04:54.800 |
it makes me think of there was a South Park episode 00:04:57.240 |
in which Earth turned out to be a reality show. 00:05:00.400 |
And somehow we had failed to entertain the audience 00:05:04.880 |
so the Earth show was going to get canceled, et cetera. 00:05:40.480 |
And there's just a completely inadequate level 00:05:55.000 |
- You're an extra that's waiting for your one opportunity 00:06:04.480 |
So okay, so we'll rule out me being the star of the show, 00:06:07.240 |
which I probably could have guessed at anyway. 00:06:11.360 |
I mean, there have been a lot of really intriguing things 00:06:14.520 |
and a lot of astounding things that have happened. 00:06:16.440 |
But I would have some werewolves, I'd have some zombies. 00:06:21.440 |
I would have some really improbable developments 00:06:24.160 |
like maybe Canada absorbing the United States. 00:06:33.160 |
But if we are, that will mean that 2020 is just a prequel 00:06:45.520 |
- Well, the night is still young in terms of Canada, 00:06:47.700 |
but do you think it's possible for the observer 00:06:52.360 |
So meaning it does seem when you kind of watch memes 00:06:59.320 |
the entertaining ones spread more efficiently. 00:07:02.760 |
- I mean, I don't know what it is about the human mind 00:07:11.080 |
Much more sort of aggressively, it's more viral 00:07:16.640 |
Is there some sense that whatever the evolutionary process 00:07:23.560 |
is the same process that's going to, in an emergent way, 00:07:29.400 |
the most meme-ifiable outcome, the most viral outcome 00:07:41.600 |
Like, I mean, how many memes are created in a given day? 00:07:43.960 |
And the ones that go viral are almost uniformly funny, 00:07:46.400 |
at least to somebody with a particular sense of humor. 00:07:52.260 |
We are definitely great at creating atomized units of funny. 00:08:01.720 |
there are going to be X million brains parsing 00:08:04.640 |
and judging whether this meme is retweetable or not. 00:08:07.440 |
And so that sort of atomic element of funniness, 00:08:18.560 |
and selective pressure, and everything else that's going on. 00:08:21.640 |
But in terms of the entire ecosystem of conscious systems 00:08:26.640 |
here on the Earth driving for a level of entertainment, 00:08:55.320 |
versus the perspective of the individual human brains? 00:08:59.680 |
So almost thinking about the ideas or the memes, 00:09:16.480 |
putting selective pressure on them, et cetera. 00:09:27.480 |
to his unbelievably brilliant book about the selfish gene. 00:09:37.440 |
I view the relationship though between humans and memes 00:09:47.640 |
Do flowers have bees or do bees in a sense have flowers? 00:09:51.800 |
And the answer is, it is a very, very symbiotic relationship 00:09:56.200 |
in which both have semi-independent roles that they play 00:10:00.080 |
and both are highly dependent upon the other. 00:10:06.720 |
as being this monolithic structure physically 00:10:14.280 |
So you could kind of say, well, flowers have bees. 00:10:16.980 |
But on the other hand, the flowers would obviously be doomed. 00:10:25.320 |
flowers are really an expression of what the bees need. 00:10:37.960 |
in which memes are either propagated or not propagated, 00:10:47.640 |
selective competition, plays out between different memes. 00:10:55.000 |
really the human mind is a production of memes 00:10:58.720 |
and ideas have us rather than us having ideas. 00:11:01.760 |
But at the same time, let's take a catchy tune 00:11:07.000 |
That catchy tune did originate in a human mind. 00:11:12.880 |
And as much as I like Elizabeth Gilbert's TED Talk 00:11:21.520 |
in this beautiful TED Talk, it's very lyrical. 00:11:30.760 |
She talked about needing to pull over to the side of the road 00:11:33.040 |
when she got inspiration for a particular paragraph 00:11:36.360 |
or a particular idea and a burning need to write that down. 00:11:47.160 |
And I think that really most things that do become memes 00:11:52.000 |
are the product of a great deal of deliberate 00:12:07.920 |
- If we could take a little bit of a tangent, 00:12:10.200 |
Stephen King on writing, you as a great writer, 00:12:14.280 |
you're dropping a hint here that the ideas don't come to you. 00:12:22.640 |
It's more of a very deliberate, rigorous daily process. 00:12:28.040 |
So maybe, can you talk about the writing process? 00:12:36.120 |
And maybe if you want to step outside of yourself, 00:12:38.760 |
almost like give advice to an aspiring writer, 00:12:42.480 |
what does it take to write the best work of your life? 00:12:52.360 |
two nonfiction books and two works of fiction. 00:13:08.440 |
Some people really like to fly by the seat of their pants, 00:13:11.400 |
and some people really, really like to outline, to plot. 00:13:25.660 |
but I lean, I guess, a little bit more toward the plotter. 00:13:39.240 |
And I do try to make an effort of making an outline 00:13:42.500 |
that I know I'm gonna be extremely unfaithful to 00:14:03.720 |
And I think if I were personally a rigorous outliner, 00:14:08.980 |
I also would make a much more rigorous skeleton 00:14:22.140 |
people who write spy novels or supernatural adventures, 00:14:31.660 |
of events, action, plot twists, conspiracy, et cetera. 00:14:47.600 |
And I think people who write what's often referred to 00:14:50.800 |
as literary fiction, for lack of a better term, 00:14:53.480 |
where it's more about sort of aura and ambiance 00:15:00.680 |
and inner experience and inner journey and so forth, 00:15:07.600 |
And I know people who start with a blank page 00:15:21.720 |
For me, it's an astonishingly high percentage of it 00:15:25.000 |
is editing as opposed to the initial writing. 00:15:27.600 |
For every hour that I spend writing new prose, 00:15:37.640 |
I probably spend, I mean, I wish I kept a count. 00:15:47.960 |
But I would say it's at least four or five hours 00:15:51.560 |
and maybe as many as 10 that I spend editing. 00:16:01.860 |
and I spend just relentlessly polishing and pruning 00:16:06.800 |
and sometimes on the micro level of just like, 00:16:12.360 |
Do I need to carve a syllable or something so it can land? 00:16:18.920 |
okay, I'm done but the book is 750 pages long 00:16:31.840 |
And I also write music, write and record and produce music. 00:16:49.120 |
that go into just making it all hang together 00:16:52.760 |
So I think that's true of a lot of creative processes. 00:17:07.340 |
comes from that part of the process, any creative process. 00:17:14.360 |
In the editing process, you're ultimately judging 00:17:20.480 |
How much of your time do you spend hating your work? 00:17:43.360 |
I spend almost all the time in a state of optimism 00:17:47.040 |
that this thing that I have, I like, I like quite a bit 00:17:51.320 |
and I can make it better and better and better 00:17:56.840 |
So I spend most of my time in a state of optimism. 00:18:00.420 |
- I think I personally oscillate much more aggressively 00:18:11.080 |
Marvin Minsky from MIT had this advice, I guess, 00:18:16.080 |
to what it takes to be successful in science and research 00:18:24.600 |
I mean, at least he was speaking about himself 00:18:27.820 |
that the key to his success was to hate everything 00:18:32.520 |
I have a little Marvin Minsky there in me too 00:18:36.480 |
to sort of always be exceptionally self-critical 00:18:42.440 |
but grateful for the chance to be able to do the work. 00:18:48.560 |
- But that, you know, each one of us have to strike 00:18:55.800 |
- But back to the destruction of human civilization. 00:18:59.040 |
If humans destroy ourselves in the next 100 years, 00:19:08.320 |
the most likely reason that we destroy ourselves? 00:19:11.400 |
- Well, let's see, 100 years, it's hard for me 00:19:18.120 |
and it's something to give a lot more thought to, 00:19:23.400 |
simply because I am a science fiction writer. 00:19:25.400 |
And I feel with the acceleration of technological progress, 00:19:34.040 |
I mean, comparing today's world to that of 1921, 00:19:46.640 |
I mean, our intuitions reliably defeat ourselves 00:19:53.080 |
And, you know, how we might destroy ourselves 00:19:56.040 |
in the 100-year timeframe might have everything to do 00:20:00.040 |
with breakthroughs in nanotechnology 40 years from now 00:20:03.060 |
and then how rapidly those breakthroughs accelerate. 00:20:05.520 |
But in the nearer term that I'm comfortable predicting, 00:20:07.800 |
let's say 30 years, I would say the most likely route 00:20:12.400 |
to self-destruction would be synthetic biology. 00:20:16.280 |
And I always say that with the gigantic caveat 00:20:21.920 |
and I'll abbreviate synthetic biology to SYNBIO 00:20:26.060 |
I believe SYNBIO offers us simply stunning promise 00:20:34.200 |
So I'm not an anti-SYNBIO person by any stretch. 00:20:50.040 |
those hands either being incompetent or being malevolent, 00:21:10.320 |
But in the 30-year timeframe, I think it's a lesser one, 00:21:13.440 |
or nuclear weapons or anything else that I can think of. 00:21:21.980 |
versus the natural side of the pandemic frontier. 00:21:26.120 |
So we humans engineering pathogens, engineering viruses 00:21:33.840 |
- And maybe how do you see the possible trajectories 00:21:46.520 |
or unintended consequences of particular actions 00:21:51.560 |
that ultimately lead to unexpected mistakes? 00:21:55.700 |
And I think the question of which is more likely 00:22:00.800 |
One, do we take a lot of methodical, affordable, 00:22:05.600 |
foresighted steps that we are absolutely capable 00:22:24.120 |
And if we take those steps, I think the danger 00:22:34.000 |
and we have a bad, bad and very long track record 00:22:36.640 |
of hitting the snooze bar after different natural pandemics 00:22:43.400 |
Variable number two is how much experimentation 00:22:48.400 |
and pathogen development do we as a society decide 00:22:52.520 |
is acceptable in the realms of academia, government 00:22:58.680 |
And if we decide as a society that it's perfectly okay 00:23:09.520 |
could wipe out humanity, if we think that's fine, 00:23:12.520 |
and if that kind of work starts happening in one lab, 00:23:19.940 |
then 10 countries, then 70 countries or whatever, 00:23:23.280 |
that risk of a boo-boo starts rising astronomically. 00:23:28.240 |
And this won't be a spoiler alert based on the way 00:23:37.480 |
The easier one to manage, although it wouldn't be simple 00:23:40.960 |
by any stretch because it would have to be something 00:23:43.120 |
that all nations agree on, but the easiest way, 00:23:52.440 |
that if they escape from a lab could annihilate us. 00:23:56.120 |
There's no line of research that justifies that, 00:23:58.720 |
and in my view, I mean, that's the point of perspective 00:24:02.100 |
We'd have to collectively agree that there's no line 00:24:07.780 |
a highly rational conclusion is even the highest level 00:24:11.360 |
of biosafety lab in the world, biosafety lab level four, 00:24:15.560 |
and there are not a lot of BSL-4 labs in the world, 00:24:18.240 |
there are things that can and have leaked out of BSL-4 labs, 00:24:28.480 |
which we can talk about, is actually done at BSL-3, 00:24:36.280 |
We have proven ourselves to be incapable of creating a lab 00:24:42.360 |
so why in the world would we create something 00:24:44.880 |
where if, God forbid, it leaked, could annihilate us all? 00:24:50.720 |
that are taken in biosafety level anything labs 00:24:57.040 |
What happens if you have a malevolent insider? 00:24:59.600 |
And we could talk about the psychology and the motivations 00:25:04.400 |
who wants to release something annihilating in a bit. 00:25:13.960 |
into biosafety level one, two, three, and four 00:25:17.200 |
are about preventing somebody hijacking the process. 00:25:21.360 |
but they're mainly designed against accidents. 00:25:27.280 |
in lots and lots of labs, with every lab you add, 00:25:35.960 |
Now, on the front of somebody outside of a government, 00:25:40.960 |
academic, or scientific, traditional government, 00:25:55.840 |
the hardening of the entire syn-bio ecosystem 00:26:02.840 |
that we don't want to have out there by rogue actors, 00:26:05.920 |
to early detection, to lots and lots of other things 00:26:08.880 |
that we can do to dramatically mitigate that risk. 00:26:13.640 |
decide that no, we're not going to experiment, 00:26:16.120 |
we make annihilating pathogens in leaky labs, 00:26:19.360 |
and B, yes, we are going to take countermeasures 00:26:24.720 |
of our annual defense budget to preclude their creation, 00:26:31.720 |
But if you take one set of precautions and not the other, 00:26:34.840 |
then the thing that you have not taken precautions against 00:26:45.080 |
and what are the positives and negatives of it? 00:27:03.320 |
- Yeah, so that would be the logic behind doing it. 00:27:06.600 |
And so gain-of-function can mean a lot of different things. 00:27:10.200 |
Viewed through a certain lens, gain-of-function research 00:27:21.440 |
I mean, you could view that as gain-of-function. 00:27:26.400 |
which is actually the sense that the term is usually used, 00:27:34.320 |
of microorganisms to make them more dangerous, 00:27:37.400 |
whether it's more transmissible or more deadly. 00:27:44.480 |
'cause it's very illustrative and it's also very chilling. 00:27:53.280 |
I assume there was some kind of communication between them, 00:27:55.320 |
but they were basically independent projects, 00:28:04.080 |
H5N1 is something that, at least on a lethality basis, 00:28:12.400 |
COVID, according to the World Health Organization, 00:28:21.320 |
And so that's actually even slightly more lethal than Ebola. 00:28:33.000 |
And I believe it is in no way contagious human to human. 00:28:47.120 |
and you spend an enormous amount of time around them, 00:29:05.320 |
I mean, not that we're, it just doesn't exist. 00:29:11.240 |
did a relentless survey of the number of H5N1 cases. 00:29:16.800 |
I saw one 10-year series where I think it was like 00:29:26.560 |
I believe the typical lethality from lightning 00:29:31.840 |
So we've been getting struck by lightning, pretty low risk. 00:29:41.520 |
set out to make H5N1 that would be contagious, 00:29:48.200 |
And so they basically passed it, I think in both cases, 00:29:50.980 |
they passed it through a large number of ferrets. 00:29:56.440 |
there wasn't even any CRISPR back in those days. 00:30:03.040 |
And after guiding the path and passing them through, 00:30:07.760 |
they did in fact come up with a version of H5N1 00:30:17.320 |
they didn't inject it into humans to see what would happen. 00:30:21.960 |
we don't really know how contagious it might have been. 00:30:29.200 |
that could be a civilization-threatening pathogen. 00:30:38.320 |
I believe their agenda as they explained it was, 00:30:52.700 |
And so potential of leak, significantly non-zero, 00:30:57.360 |
hopefully way below 1%, but significantly non-zero. 00:31:00.920 |
And when you look at the consequences of an escape 00:31:05.440 |
destruction of a large portion of the economy, et cetera, 00:31:24.440 |
If you said, if you believed that H5N1 in nature 00:31:29.440 |
is on an inevitable path to airborne transmission, 00:31:33.960 |
and it's only gonna be a small number of years, A, 00:31:40.280 |
there is one set of changes to its metabolic pathways 00:32:04.920 |
that's coming toward the Earth and is five years off. 00:32:06.960 |
And yes, you marshal everything you can to resist that. 00:32:10.240 |
But there's two problems with that perspective. 00:32:12.520 |
The first is, in however many thousands of generations 00:32:15.920 |
that humans have been inhabiting this planet, 00:32:17.720 |
there has never been a transmissible form of H5N1. 00:32:21.160 |
And influenza's been around for a very long time. 00:32:27.360 |
of this kind of a jump to airborne transmission. 00:32:30.080 |
So we're not on a freight train to that outcome. 00:32:36.080 |
it's not like there's just one set of genetic code 00:32:39.760 |
There are just, there's all kinds of different mutations 00:32:43.680 |
that could conceivably result in that kind of an outcome. 00:32:58.300 |
and unbelievably negative card and injecting it in the deck 00:33:08.880 |
or scientific justification for that kind of work. 00:33:12.120 |
And interestingly, there was quite a bit of excitement 00:33:17.120 |
and concern about this when the work came out. 00:33:18.820 |
One of the teams was gonna publish their results in Science, 00:33:25.160 |
and a lot of scientists are saying this is crazy. 00:33:27.800 |
And publication of those papers did get suspended. 00:33:31.080 |
And not long after that, there was a pause put 00:33:38.120 |
But both of those speed bumps were ultimately removed. 00:33:47.720 |
And in fact, those two very projects, my understanding is, 00:33:50.560 |
resumed their funding, got their government funding back. 00:33:53.680 |
I don't know why the Dutch project's getting NIH funding, 00:33:58.700 |
So as far as the US government and regulators are concerned, 00:34:02.980 |
it's all systems go for gain-of-function at this point, 00:34:07.580 |
- Now I'm a little bit of an outsider from this field, 00:34:09.700 |
but it has echoes of the same kind of problem I see 00:34:12.500 |
in the AI world with autonomous weapon systems. 00:34:16.220 |
Nobody, and my colleagues, friends, 00:36:21.220 |
as far as I can tell, people in the AI community 00:34:25.180 |
are not really talking about autonomous weapon systems. 00:34:40.900 |
and they don't want to talk about gain-of-function publicly. 00:34:48.860 |
from an outsider perspective in terms of gain-of-function. 00:34:53.860 |
from the insider perspective on autonomous weapon systems. 00:35:00.220 |
and I certainly don't know how to communicate effectively 00:35:06.060 |
Should we cease all gain-of-function research? 00:35:11.380 |
- Well, again, I'm gonna use gain-of-function 00:35:13.100 |
in the relatively narrow context of what we're discussing. 00:35:16.260 |
- 'Cause you could say almost anything that you do 00:35:18.460 |
to make biology more effective is gain-of-function. 00:35:20.540 |
So within the narrow confines of what we're discussing, 00:35:23.500 |
I think it would be easy enough for level-headed people 00:35:30.380 |
level-headed governmental people in all of the countries 00:35:32.500 |
that realistically could support such a program to agree, 00:35:36.060 |
we don't want this to happen because all labs leak. 00:35:47.000 |
is the anthrax attacks in the United States in 2001. 00:35:51.380 |
I mean, talk about an example of the least likely lab 00:35:57.940 |
This was shortly after 9/11, for folks who don't remember it, 00:36:01.020 |
and it was a very, very lethal strain of anthrax 00:36:06.340 |
based on the forensic genomic work that was done 00:36:13.280 |
probably the one at Fort Detrick in Maryland. 00:36:17.300 |
It absolutely leaked from a high-security US Army lab. 00:36:30.660 |
including the Senate Majority Leader's office, 00:36:32.620 |
Tom Daschle's office, I think it was Senator Leahy's office, 00:36:39.160 |
But let's go to the Senate Majority Leader's office. 00:36:41.860 |
It is hard to imagine a more security-minded country 00:36:46.000 |
than the United States two weeks after the 9/11 attack. 00:36:49.120 |
I mean, it doesn't get more security-minded than that. 00:37:08.060 |
despite that level of focus and concern and competence, 00:37:24.120 |
somewhere in the line of presidential succession. 00:37:27.980 |
So again, think of a level-headed conversation 00:37:30.900 |
between powerful leaders in a diversity of countries, 00:37:36.460 |
I can imagine a very simple PowerPoint revealing, 00:37:39.680 |
just discussing briefly things like the anthrax leak, 00:37:43.080 |
things like this foot-and-mouth disease outbreak 00:37:47.320 |
or leaking that came out of a BSL-4-level lab in the UK, 00:37:54.940 |
that could result from gain-of-function and say, 00:37:57.000 |
folks, can we agree that this just shouldn't happen? 00:38:07.360 |
which we did agree on, we the world, for the most part, 00:38:20.960 |
and then to decide we're gonna get everybody together 00:38:24.640 |
Now, that doesn't make it entirely impossible 00:38:28.460 |
but in well-regulated, carefully watched over 00:38:39.520 |
things going on in companies that have investors 00:38:43.400 |
who don't wanna go to jail for the rest of their lives, 00:38:50.080 |
- But there is a particular possible catalyst 00:38:55.200 |
which is for really kind of raising the question 00:38:59.560 |
of gain-of-function research for the application of virus, 00:39:16.000 |
It seems like a very important question to ask 00:39:23.520 |
about whether we should be doing gain-of-function research. 00:39:35.200 |
And two, do you think that the answer could be 00:39:48.040 |
for the hypothetical, rational national leaders 00:39:57.720 |
and you look at the unbelievable destructive power 00:40:02.040 |
that should be an overwhelmingly powerful argument 00:40:16.080 |
that has gone into people making the pro-argument of that. 00:40:29.080 |
it is entirely possible for a couple of reasons. 00:40:45.880 |
that alarmed very sophisticated US diplomats and others 00:41:03.640 |
I believe one sophisticated scientist or other observer 00:41:10.200 |
And I believe it's also been pretty reasonably established 00:41:13.840 |
that coronaviruses were a topic of great interest at WIV. 00:41:28.440 |
about what happened in the early days and weeks 00:41:31.640 |
after the outbreak that's basically been imposed 00:41:34.840 |
by the Chinese government that we just don't know. 00:41:47.720 |
Now we're going to the realm of thought experiment, 00:41:56.120 |
and there is this precedent of gain-of-function research 00:42:03.160 |
whereas we know coronavirus is contagious to humans. 00:42:05.680 |
I could definitely, and there is this global consensus. 00:42:09.960 |
Certainly was the case two or three years ago 00:42:17.560 |
US paused funding for a little while, but paused funding. 00:42:20.680 |
They never said private actors couldn't do it. 00:42:28.800 |
You could certainly see the folks at WIV saying, 00:42:42.760 |
Why don't we do a little gain-of-function on this? 00:42:52.120 |
and very, very level-headed people have said that, 00:42:54.640 |
you know, who've looked at it much more deeply, 00:43:13.360 |
is really afraid of admitting mistakes that everybody makes? 00:43:24.520 |
I mean, well, major mistakes were made in Chernobyl. 00:43:32.420 |
the scale of the mistake is much smaller, right? 00:43:39.680 |
The depth and the breadth of rot in that bureaucracy 00:43:56.260 |
very careful security procedures, even in level three labs, 00:44:14.000 |
as opposed to a multi-year bureaucratic failure 00:44:19.180 |
- Right, well, certainly the magnitude of mistakes 00:44:22.380 |
and compounding mistakes that went into Chernobyl 00:44:29.920 |
the consequence of Chernobyl to a tremendous degree. 00:44:33.480 |
And I think that particularly authoritarian governments 00:44:48.360 |
across dozens and dozens of authoritarian governments. 00:44:54.980 |
this is in the hypothetical world in which this was a leak, 00:44:57.540 |
which again, I don't personally have enough sophistication 00:45:03.220 |
but in the hypothetical world in which it was a leak, 00:45:06.420 |
the global reaction and the amount of global animus 00:45:21.720 |
because every country suffered massively from this, 00:45:29.820 |
The world would in some way present China with that bill. 00:45:37.920 |
the natural disinclination for any authoritarian government 00:45:41.260 |
to admit any fallibility and tolerate the possibility 00:45:49.420 |
even though they let a World Health Organization 00:45:51.580 |
group in, you know, a couple of months ago to run around, 00:45:56.800 |
anywhere near the level of access that would be necessary 00:46:02.240 |
The level of opacity that surrounds those opening weeks 00:46:04.900 |
and months of COVID in China, we just don't know. 00:46:12.700 |
and maybe broadening it out to future pandemics 00:46:20.780 |
what kind of response, how do we fail in a response 00:46:27.420 |
So the gain-of-function research we're discussing, 00:46:41.000 |
But if it does happen, perhaps the natural evolution, 00:46:52.860 |
on the vaccine development side, on the collection of data, 00:46:57.220 |
or on the basic sort of policy response side, 00:47:05.320 |
And most of what I've thought about and written about, 00:47:09.000 |
and again, discussed in that long bit with Sam, is dual use. 00:47:14.000 |
So most of the countermeasures that I've been thinking about 00:47:18.220 |
and advocating for would be every bit as effective 00:47:21.480 |
against zoonotic disease, a natural pandemic of some sort 00:47:29.120 |
even the near-term risk of an artificial one, 00:47:31.700 |
ups the urgency around these measures immensely, 00:47:34.540 |
but most of them would be broadly applicable. 00:47:37.500 |
And so I think the first thing that we really wanna do 00:47:40.800 |
on a global scale is have a far, far, far more robust 00:47:45.140 |
and globally transparent system of detection. 00:47:52.020 |
The most obvious one is just in the blood of people 00:47:56.660 |
who come into clinics exhibiting signs of illness. 00:48:03.180 |
where we're at with relatively minimal investment. 00:48:18.780 |
And better than that, this is a little bit further off, 00:48:23.620 |
but it wouldn't cost tens of billions in research dollars. 00:48:26.380 |
It would be a relatively modest and affordable budget 00:48:28.980 |
in relation to the threat, at-home diagnostics 00:48:34.820 |
okay, particularly with respiratory infections, 00:48:37.780 |
because that is generally, almost universally, 00:48:40.880 |
the mechanism of transmission for any serious pandemic. 00:48:56.360 |
If it's influenza, is it influenza A versus B? 00:48:58.920 |
Or is it a small handful of other more exotic, 00:49:03.880 |
but nonetheless sort of common respiratory infections 00:49:08.840 |
Developing a diagnostic panel to pinpoint all of that stuff, 00:49:12.080 |
that's something that's well within our capabilities. 00:49:13.920 |
That's much less of a lift than creating mRNA vaccines, 00:49:25.600 |
because the best prototype for this that I'm aware of 00:49:29.420 |
isn't currently rolling out in Atherton, California, 00:49:39.920 |
And it's a project that came out of the Broad Institute, 00:50:00.840 |
in areas of Nigeria that are particularly vulnerable 00:50:13.480 |
where clinicians in the field could very rapidly determine, 00:50:17.760 |
do you have one of the infections of acute interest here, 00:50:21.560 |
either because it's very common in this region, 00:50:23.820 |
so we want to diagnose as many things as we can 00:50:31.240 |
So frontline worker can make that determination 00:50:40.140 |
at a fully configured doctor's office or local hospital. 00:50:56.480 |
there shouldn't be any inhibition for it to happen 00:51:00.680 |
And it should be affordable from a budgetary standpoint. 00:51:03.200 |
And based on Sentinel's budget and adjusting things 00:51:05.960 |
for things like very different cost of living, 00:51:16.800 |
And wealthy countries, middle-income countries 00:51:21.840 |
Lower-income countries should certainly be helped with that. 00:51:27.700 |
And then layer on top of that other interesting things 00:51:42.900 |
Most of it kind of academic and experimental. 00:51:45.920 |
But some of it has been powerful enough to suggest 00:51:48.200 |
that this could be a very powerful early warning system. 00:52:04.820 |
in the early days of the pandemic in given countries 00:52:12.200 |
16 days of forewarning can be monumentally important 00:52:22.460 |
but nonetheless very resource-constrained academic project. 00:52:32.840 |
that's something we could do radically, radically better. 00:52:43.240 |
of creating almost like a weather map of pathogens. 00:52:46.680 |
Like basically aggregating all of these data sources, 00:53:02.760 |
but everything, like a full spectrum of things 00:53:11.260 |
like that are dynamically updated on an hourly basis 00:53:19.380 |
And so you can respond, like you can then integrate, 00:53:22.300 |
just like you do when you check your weather map 00:53:24.540 |
and it's raining or not, of course, not perfect, 00:53:31.220 |
and use that to then make decisions about your own life. 00:53:52.700 |
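As a minimal sketch of that weather-map idea, and assuming nothing about any real surveillance system, here is one way a few regional detection signals could be blended into a single weather-style readout. The signal names, weights, and thresholds below are entirely hypothetical placeholders.

```python
# Toy sketch of a "pathogen weather map": blend hypothetical detection
# signals for one region into a single risk readout. All names, weights,
# and thresholds are illustrative placeholders, not a real system.
from dataclasses import dataclass

@dataclass
class RegionSignals:
    clinic_positivity: float      # fraction of sampled clinic patients testing positive
    wastewater_index: float       # normalized viral load in sewage, 0..1
    home_test_positivity: float   # fraction of reported at-home tests that are positive

# Hypothetical weights reflecting how much each signal is trusted.
WEIGHTS = {"clinic": 0.5, "wastewater": 0.3, "home": 0.2}

def risk_score(s: RegionSignals) -> float:
    """Weighted blend of the three signals, scaled to 0..100."""
    blended = (WEIGHTS["clinic"] * s.clinic_positivity
               + WEIGHTS["wastewater"] * s.wastewater_index
               + WEIGHTS["home"] * s.home_test_positivity)
    return round(100 * blended, 1)

def forecast_label(score: float) -> str:
    """Map the score to a weather-style label."""
    if score < 10:
        return "clear"
    if score < 30:
        return "scattered cases"
    return "storm warning"

region = RegionSignals(clinic_positivity=0.08,
                       wastewater_index=0.25,
                       home_test_positivity=0.05)
score = risk_score(region)
print(score, forecast_label(score))  # 12.5 scattered cases
```

In a real system the hard part would not be the blending arithmetic but getting trustworthy, frequently updated inputs from clinics, wastewater monitoring, and at-home tests in the first place.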
relative to all the things that we do in this world, 00:54:03.820 |
which it requires, is it requires trust in institutions 00:54:12.260 |
and it requires trust in science and engineers 00:54:27.740 |
It feels like, I'm not exactly sure where to place the blame, 00:54:44.160 |
it sounded like it dismissed the basic human experience 00:54:58.120 |
like they're trying to, in a self-preserving way, 00:55:01.240 |
control the population or something like that. 00:55:05.440 |
from the majority of the scientific community, 00:55:18.280 |
acknowledge the uncertainties under which we operate, 00:55:20.960 |
acknowledge the mistakes that scientists make, 00:55:31.720 |
that make all the progress we see in the world, 00:55:33.960 |
and that being honest about that imperfection, 00:56:01.040 |
on the part of the CDC and other institutions 00:56:04.040 |
to admit to, to frame, and to contextualize uncertainty. 00:56:15.740 |
and when they're told, they need to be told with authority 00:56:29.080 |
you know, when the CDC is kind of at the very beginning 00:56:32.480 |
of the pandemic saying, "Masks don't do anything, 00:56:35.320 |
"don't wear them," when the real driver for that was, 00:56:42.800 |
"because they may be needed in medical settings, 00:56:49.760 |
I think a message that actually respected people and said, 00:56:53.320 |
"This is why we're asking you not to do masks yet, 00:56:58.080 |
would be less whipsawing and would bring people, 00:57:01.240 |
like they feel more like they're part of the conversation 00:57:06.080 |
than saying one day, definitively masks suck, 00:57:09.360 |
and then X days later saying, "Nope, dammit, wear masks." 00:57:12.920 |
And so I think framing things in terms of the probabilities, 00:57:29.240 |
in the United States, based on the fact that I believe 00:57:42.280 |
I believe one of which resulted in a fatality. 00:57:47.740 |
that indicated that there was a relationship. 00:57:51.280 |
because I think all of the clotting incidents happened 00:57:55.420 |
and kind of clustered in a certain age group. 00:57:58.180 |
But does that call for shutting off the vaccine, 00:58:02.740 |
or does it call for leveling with the American public 00:58:05.860 |
and saying, "We've had one fatality out of seven million. 00:58:30.940 |
and I think people would have been able to parse 00:58:32.780 |
those simple bits of data and make their own judgment. 00:58:40.840 |
who don't read all 900 words in the New York Times piece 00:58:45.900 |
but just see the headline, which is a majority of people. 00:58:54.380 |
and then all the people who sat on the fence, 00:59:00.580 |
That is gonna push an incalculable number of people. 00:59:04.980 |
for we don't know how many hundreds of thousands, 00:59:11.520 |
By pausing that for, whatever it was, 10 or 12 days, 00:59:16.160 |
as everybody who knew much about the situation 00:59:27.280 |
to certitude J&J good in a period of just a few days, 00:59:43.040 |
Just, I don't know what it is about Anthony Fauci, 00:59:55.900 |
I'm sure he's a brilliant scientist and researcher. 00:59:59.120 |
I'm sure he's also a great, like, inside the room, 01:00:06.040 |
But what makes a great leader is something about 01:00:14.000 |
but being a communicator that you know you can trust, 01:00:19.000 |
that there's an authenticity that's required. 01:00:23.080 |
And I'm not sure, maybe I'm being a bit too judgmental, 01:00:34.040 |
in the way that Fauci does not, and I think about that. 01:00:38.120 |
I think about what is effective science communication. 01:00:40.520 |
So, you know, great leaders throughout history 01:00:43.280 |
did not necessarily need to be great science communicators. 01:00:50.960 |
you also have to be a great science communicator. 01:00:53.760 |
You have to be able to communicate uncertainties. 01:00:56.240 |
You have to be able to communicate something like a vaccine 01:00:59.480 |
that you're allowing inside your body into the messiness, 01:01:16.720 |
that there's no short-term negative consequences, 01:01:23.920 |
and doing our best in this battle against trillions 01:01:37.440 |
because I think there should be more science communicators 01:01:46.800 |
that I think about that kind of goes along this thread 01:02:00.740 |
is from amazing work from DeepMind, AlphaFold2, 01:02:13.160 |
Do you think about the use of AI in the SYN biospace of, 01:02:22.040 |
in the virus-based research that you referred to, 01:02:31.280 |
until you get one that is both contagious and deadly. 01:02:36.280 |
But what about then using AI through simulation 01:02:51.320 |
or again, is this something you're more excited about? 01:02:55.680 |
is an unbelievably exciting and promising field. 01:02:58.680 |
And I think when you're doing things in silico 01:03:05.440 |
You don't have a critter that can leak from a leaky lab. 01:03:11.560 |
except I do worry about the data security dimension of it. 01:03:16.040 |
Because if you were doing really, really interesting 01:03:21.080 |
and you hit upon, through a level of sophistication, 01:03:24.360 |
we don't currently have, but synthetic biology 01:03:28.600 |
so capabilities that are utterly out of reach today 01:03:34.440 |
I think if you conjured up worst-case genomes of viruses 01:03:45.840 |
but like, hey guys, this is the genetic sequence 01:03:51.080 |
Then you have to worry about the utter hackability 01:03:58.280 |
I mean, data leaks from the least likely places 01:04:01.880 |
on the grandest possible scales have happened 01:04:08.280 |
And so that would be the danger of doing the work in silico. 01:04:15.600 |
that list leaks, and after the passage of some time, 01:04:25.840 |
going all the way down to the high school level 01:04:27.840 |
are in a position to, to make it overly simplistic, 01:04:31.360 |
hit print on a genome and have the virus bearing that genome 01:04:38.520 |
But in general, computational biology, I think, 01:04:42.320 |
particularly because the crushing majority of work 01:04:45.480 |
that people are doing with the protein folding problem 01:04:47.680 |
and other things are about creating therapeutics, 01:04:50.840 |
about creating things that will help us live better, 01:04:54.200 |
live longer, thrive, be more well, and so forth. 01:05:02.680 |
that we seem to make just the most glacial progress on, 01:05:11.400 |
at which people tackle the protein folding problem, 01:05:23.560 |
And so, protein folding is an unbelievably important thing 01:05:27.360 |
if you want to start thinking about therapeutics, 01:05:32.320 |
that tells us where the channels and the receptors 01:05:41.420 |
that you can start barraging it again in silico 01:06:01.760 |
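To make "barraging it in silico" concrete, here is a toy sketch of virtual screening: rank a library of hypothetical compounds against a target using a stand-in scoring function. Every name and number is a placeholder; real pipelines use docking engines or learned structure-based models rather than this arithmetic.

```python
# Toy illustration of in-silico screening: score hypothetical candidate
# compounds against a target and keep the best ones. The scoring function
# is a deterministic stand-in, not a real docking or affinity model.
import random

def mock_binding_score(compound: str) -> float:
    """Stand-in for a docking/affinity score (more negative = stronger binding)."""
    rng = random.Random(compound)          # deterministic per compound name
    return round(rng.uniform(-12.0, -2.0), 2)

def screen(candidates: list[str], keep: int = 3) -> list[tuple[str, float]]:
    """Score every candidate and return the `keep` strongest binders."""
    scored = [(name, mock_binding_score(name)) for name in candidates]
    return sorted(scored, key=lambda pair: pair[1])[:keep]

library = [f"compound_{i:03d}" for i in range(1000)]   # hypothetical library
for name, score in screen(library):
    print(name, score)
```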
- Well, let me ask you about fear and hope in this world. 01:06:13.800 |
that people who are, maybe it's in my interactions, 01:06:21.120 |
want to do good, and are just better at doing good 01:06:28.760 |
And more than that, people who are malevolent 01:06:33.420 |
are usually incompetent at building technology. 01:06:40.300 |
that people who are exceptionally good at stuff, 01:06:57.180 |
whether that's building nuclear weapons or plumbing, 01:07:01.520 |
the less likely you are to destroy the world. 01:07:14.640 |
will be far outnumbered by the ultra competent. 01:07:25.000 |
in terms of the people trying to destroy the world. 01:07:28.560 |
Now, there's a few spaces where that might not be the case, 01:07:34.840 |
where this one person who's not very competent 01:07:42.840 |
because of the exponential effects of the technology. 01:07:47.280 |
I tend to believe AI is not one of those such spaces, 01:07:54.520 |
that the ultra competent are usually also the good? 01:08:03.720 |
that we will be able to short circuit the threat 01:08:10.260 |
But we need to start creating those defensive systems, 01:08:13.220 |
or defensive layers, one of which we talked about, 01:08:15.820 |
far, far, far better surveillance in order to prevail. 01:08:18.740 |
So, the good guys will almost inevitably outsmart, 01:08:26.020 |
in most sort of smack downs that we can imagine. 01:08:29.580 |
But the good guys aren't going to be able to exert 01:08:32.540 |
their advantages unless they have the imagination 01:08:36.360 |
necessary to think about what the worst possible thing 01:08:45.020 |
So, that's a tricky, tricky thing to solve for. 01:08:47.460 |
Now, in terms of whether the asymmetric power 01:08:51.740 |
that a bad guy might have in the face of the overwhelming 01:09:04.100 |
I'm sure the guy who was responsible for the Vegas shooting, 01:09:07.680 |
or the Orlando shooting, or any other shooting 01:09:14.040 |
And the number of good guy citizens in the United States 01:09:20.920 |
I'm sure is a crushingly, overwhelmingly high ratio 01:09:43.880 |
powerful and lethal technology that gets so democratized 01:09:48.880 |
and so proliferated in tools that are very, very easy 01:09:56.180 |
When those tools get really easy to use by a knucklehead 01:10:06.160 |
Now, the good news, quote unquote, about mass shootings, 01:10:10.800 |
is even the most brutal and carefully planning 01:10:15.240 |
and well-armed mass shooter can only take so many victims. 01:10:19.880 |
And the same is true, there's been four instances 01:10:23.400 |
that I'm aware of, of commercial pilots committing suicide 01:10:26.660 |
by downing their planes and taking all their passengers 01:10:33.160 |
ultimately were not capable of preventing that. 01:10:44.760 |
In those cases, they only have a plane load of people 01:10:50.200 |
If we imagine a highly plausible and imaginable future 01:10:59.820 |
start embodying unbelievable sophistication and genius 01:11:04.820 |
in the tool, in the easier and easier and easier 01:11:08.760 |
to make tool, all those thousands, tens of thousands, 01:11:15.200 |
start getting embodied in something that may be as simple 01:11:20.020 |
then that good guy technology can be hijacked 01:11:25.960 |
by a bad person and used in a very asymmetric way. 01:11:43.720 |
this kind of large-scale damage with an engineered virus, 01:11:48.720 |
the more and more there will be engineering of defenses 01:11:54.360 |
in terms of testing, in terms of collection of data, 01:11:56.680 |
but also in terms of like a scale contact tracing 01:12:03.920 |
like in a matter of like days, maybe hours, maybe minutes. 01:12:21.920 |
then we start to quickly build up the defenses 01:12:31.440 |
Of course, again, certain kinds of exponential threats 01:12:42.040 |
But I ultimately am hopeful that the natural process 01:12:50.680 |
will work out for quite a long time for us humans. 01:13:00.320 |
about our ability to short circuit this threat 01:13:16.200 |
And so, what I'm hoping to do and trying to do 01:13:22.680 |
into the public conversation and do my small part 01:13:25.040 |
to up the odds that that actually ends up happening. 01:13:27.800 |
The danger with this one is it is exponential. 01:13:33.440 |
And I think that our minds fundamentally struggle 01:13:42.360 |
Our ancestors didn't confront exponential processes 01:13:47.560 |
So, it's not something that's intuitive to us 01:13:55.880 |
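A small worked example of that blind spot, with arbitrary numbers: an outbreak that doubles every three days looks negligible next to one adding a fixed number of cases per day, until suddenly it doesn't.

```python
# Why exponential growth defeats intuition: compare a fixed-increment outbreak
# with one that doubles every three days. All numbers are arbitrary.
def linear_cases(day: int, per_day: int = 100) -> int:
    """Cases if the outbreak adds a fixed number of new cases each day."""
    return per_day * day

def exponential_cases(day: int, start: int = 10, doubling_days: float = 3.0) -> int:
    """Cases if the outbreak doubles every `doubling_days` days."""
    return int(start * 2 ** (day / doubling_days))

for day in (10, 30, 60):
    print(day, linear_cases(day), exponential_cases(day))
# day 10:  1,000 linear vs ~100 exponential  -- the exponential curve looks harmless
# day 30:  3,000 linear vs  10,240 exponential
# day 60:  6,000 linear vs  10,485,760 exponential
```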
And issue number two with something like this 01:14:05.520 |
and we're doomed, which is not the case with mass shooters. 01:14:08.920 |
It's not the case with commercial pilots running amok. 01:14:15.200 |
that I can think of with the exception of nuclear war 01:14:23.640 |
And that means that we need to be unbelievably serious 01:14:27.880 |
about these defenses and we need to do things 01:14:31.520 |
that might on the surface seem like a tremendous overreaction 01:14:39.560 |
But I, like you, believe that's eminently doable. 01:14:43.600 |
I, like you, believe that the good guys outnumber 01:14:45.920 |
the bad guys in this particular one to a degree 01:14:50.160 |
I mean, even the worst, worst people, I'm sure, in ISIS, 01:14:53.900 |
even Osama bin Laden, even any bad guy you could imagine 01:15:05.680 |
And so, the good guys completely outnumber the bad guys 01:15:10.980 |
But the asymmetry and the fact that one catastrophic error 01:15:15.980 |
could lead to unbelievably consequential things 01:15:21.840 |
- The thing that I sometimes worry about is the fact 01:15:29.820 |
Makes me think, well, there's a lot of explanations, 01:15:35.700 |
whenever they get smart, they just destroy themselves. 01:15:51.920 |
100 to 400 billion stars in the Milky Way galaxy, 01:15:58.480 |
that an astonishingly high percentage of them 01:16:06.120 |
when the Drake equation was originally written, 01:16:15.800 |
was that it would be a small minority of solar systems 01:16:19.480 |
But now we know it's substantially all of them. 01:16:21.880 |
How many of those stars have planets in the habitable zone? 01:16:25.800 |
It's kind of looking like 20%, like, oh my God. 01:16:29.360 |
And so, L, which is how long does a civilization, 01:16:44.760 |
that when a civilization reaches a level of sophistication 01:16:47.620 |
that's probably just a decade or three in our future, 01:16:52.920 |
just start mounting astronomically, no pun intended. 01:17:00.120 |
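For reference, here is a rough sketch of the Drake equation being discussed. Only two inputs come from the conversation (planets are found around substantially all stars, and roughly 20% of stars seem to have one in the habitable zone); every other value below is a placeholder guess, and the answer swings enormously with L.

```python
# Rough sketch of the Drake equation. Only f_p and n_e reflect figures from
# the conversation; the rest are placeholder guesses for illustration.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """N = R* * f_p * n_e * f_l * f_i * f_c * L (detectable civilizations)."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

N = drake(
    R_star=1.5,   # new stars formed per year in the galaxy (placeholder)
    f_p=1.0,      # fraction of stars with planets -- "substantially all of them"
    n_e=0.2,      # habitable-zone planets per star -- "kind of looking like 20%"
    f_l=0.1,      # fraction of those where life arises (guess)
    f_i=0.1,      # fraction of those that develop intelligence (guess)
    f_c=0.1,      # fraction that become detectable (guess)
    L=1000,       # years a civilization stays detectable -- the L in question
)
print(N)  # 0.3 with these guesses; raise L to a million years and N is 300
```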
there is a lot of alien civilizations out there, 01:17:07.760 |
the thing that was useful, that used to be a feature 01:17:10.480 |
and now became a bug, which is the desire to colonize, 01:17:17.480 |
alien civilizations out there that are just chilling, 01:17:42.640 |
as a progression of conquering of other life, 01:17:53.720 |
perhaps as something we have to shed in order to survive. 01:18:02.160 |
to Fermi's paradox, and it's one that makes sense. 01:18:13.640 |
in an intermediate future world of flawless VR 01:18:22.120 |
that we wanna inhabit, it will just simply cease 01:18:36.480 |
interstellar territory, it wouldn't necessarily 01:18:39.280 |
I can imagine a benign but sophisticated intelligence 01:18:44.520 |
"We're gonna go to places that we can terraform." 01:18:46.640 |
We'd use a different word than terra, obviously, 01:18:53.200 |
so long as that they don't house intelligent, 01:18:56.240 |
sentient creatures that would suffer from our invasion. 01:19:04.800 |
where interstellar travel with its incalculable expense 01:19:10.760 |
compared to what could be done where one already is. 01:19:38.520 |
and try to expand throughout the solar system 01:19:47.000 |
one of many, for protecting human civilizations 01:19:55.800 |
I mean, I find it electrifying, first of all, 01:19:59.400 |
When I was a kid, I thought there was nothing cooler 01:20:01.680 |
than rockets, I thought there was nothing cooler than NASA, 01:20:07.920 |
And as I grew up, I thought there was nothing more tragic 01:20:11.320 |
than the fact that we went from walking on the moon 01:20:13.400 |
to at best getting to something like suborbital altitude. 01:20:17.120 |
And just, I found that more and more depressing 01:20:20.280 |
with the passage of decades at just the colossal expense 01:20:24.960 |
of manned space travel and the fact that it seemed 01:20:28.760 |
that we were unlikely to ever get back to the moon, 01:20:31.720 |
So I have a boundless appreciation for Elon Musk 01:20:35.440 |
for many reasons, but the fact that he has put Mars 01:20:37.920 |
on the incredible agenda is one of the things 01:20:43.000 |
So there's just this sort of space nerd in me 01:20:47.280 |
But on a more practical level, we were talking about 01:20:51.800 |
potentially inhabiting planets that aren't our own, 01:20:56.560 |
and we're thinking about a benign civilization 01:20:59.080 |
that would do that in planetary circumstances 01:21:04.080 |
where we're not causing other conscious systems to suffer. 01:21:07.560 |
I mean, Mars is a place that's very promising. 01:21:09.600 |
There may be microbial life there, and I hope there is, 01:21:12.240 |
and if we found it, I think it would be electrifying. 01:21:14.840 |
But I think ultimately, the moral judgment would be made 01:21:18.680 |
that the continued thriving of that microbial life 01:21:22.480 |
is of less concern than creating a habitable planet 01:21:30.560 |
But I don't think that that would be a greatly immoral act. 01:21:34.320 |
And if that happened, and if Mars became home 01:21:40.160 |
that could survive a catastrophic mistake here on Earth, 01:21:43.320 |
then yeah, the fact that we have a backup colony is great. 01:21:45.960 |
And if we could make more, I'm sorry, not backup colony, 01:21:50.120 |
And if we could make more and more such backup copies 01:21:53.000 |
throughout the solar system by hollowing out asteroids 01:22:10.260 |
with the incredible distances that are involved, 01:22:22.400 |
channel of human expansion than the Atlantic Ocean. 01:22:50.160 |
So that's kind of how I view our space program, 01:22:53.440 |
is like big, very conscious, deliberate efforts 01:22:58.440 |
If you look at how Pacific Islanders transmitted 01:23:03.440 |
their descendants and their culture and so forth 01:23:20.200 |
and find the next one and pray to find the next one. 01:23:30.520 |
And it was like going from this island to that island 01:23:46.920 |
from the inner solar system to the outer solar system, 01:23:51.800 |
There's theories that there might be planets out there 01:23:57.240 |
like kind of hop, hop, slowly transmitting ourselves. 01:24:00.840 |
At some point, we're actually in Alpha Centauri. 01:24:08.400 |
and our culture to a diversity of extraterrestrial outposts 01:24:23.840 |
in a sense that there'll be one program with NASA 01:24:26.280 |
and maybe private Elon Musk, SpaceX, or Jeff Bezos and so on. 01:24:31.280 |
But it's true that with the help of Elon Musk, 01:24:35.120 |
making it cheaper and cheaper and more effective 01:24:41.440 |
perhaps the way we actually colonize the solar system 01:24:50.520 |
is basically just like these renegade ships of weirdos. 01:25:08.200 |
of millions of these little ships just flying out, 01:25:23.560 |
almost always as a response to the main set of efforts. 01:25:28.880 |
- 'Cause you kind of think of Mars colonization 01:25:30.560 |
as the big NASA Elon Musk effort of a big colony, 01:25:42.520 |
some high school kid who gets together a large team 01:25:58.240 |
and then take that into the scale of centuries forward 01:26:28.080 |
as space travel becomes more democratized and more capable. 01:26:41.600 |
as a result of a King Ferdinand and Isabella-like effort 01:26:48.200 |
making individual decisions that there's gold 01:26:55.040 |
What I can't see, and the reason that I think 01:26:57.400 |
this Pacific model of transmission is more likely, 01:27:10.880 |
of relatively tiny steps between now and there. 01:27:14.720 |
And the fact is that there are large chunks of matter 01:27:22.440 |
extends at least a light year beyond the sun. 01:27:25.240 |
And then maybe there are these untethered planets after that. 01:27:32.520 |
and Alpha Centauri's Oort cloud goes out a light year, 01:27:39.640 |
- One of the possibilities, probably the cheapest 01:27:52.920 |
here's where you have high school students be able to build 01:27:57.080 |
a sort of a HAL 9000 version, the modern version of that. 01:28:02.080 |
And it's kind of interesting to think about these robots 01:28:14.440 |
there'll be these intelligent robots flying throughout space 01:28:47.520 |
super intelligent or just mediocre intelligent AI systems 01:28:54.360 |
- Yeah, I guess it depends on the circumstances 01:28:58.400 |
So let's take the example that you just gave. 01:29:01.160 |
We send out, you know, very sophisticated AGIs 01:29:05.080 |
on simple rocket ships, relatively simple ones 01:29:17.320 |
And therefore they're way more likely to happen. 01:29:21.080 |
And let's say that they travel to distant planets 01:29:27.920 |
And so it's inevitably tens, hundreds of thousands of years 01:29:34.120 |
And meanwhile, we die for reasons that have nothing to do 01:29:39.120 |
with those AGIs diffusing throughout the solar system, 01:29:42.320 |
whether it's through climate change, nuclear war, 01:29:47.280 |
In that kind of scenario, the notion of the AGIs 01:29:50.000 |
that we created outlasting us is very reassuring 01:30:04.760 |
Whereas the Terminator scenario of a super AGI 01:30:09.200 |
arising on earth and getting let out of its box 01:30:12.760 |
due to some boo-boo on the part of its creators 01:30:22.560 |
and exterminating us, that makes me feel crushingly sad. 01:30:26.160 |
I mean, look, I was sad when my elementary school 01:30:31.500 |
even though I hadn't been a student there for decades. 01:30:37.960 |
is even worse, the thought of my home state of Connecticut 01:30:41.500 |
getting disbanded and like absorbed into Massachusetts 01:30:52.840 |
Some goodbyes are really, really liberating, but yes. 01:30:55.960 |
- Well, but what if the Terminators, you know, 01:31:00.680 |
have consciousness and enjoy the hell out of life as well? 01:31:07.520 |
- Yeah, well, the have consciousness is a really key element. 01:31:15.660 |
that a super intelligence would have consciousness. 01:31:33.580 |
hey, I want to do this thing for which humans 01:31:38.240 |
their presence is either an unacceptable risk 01:31:46.980 |
being snuffed out by something that is very competent 01:31:50.840 |
but has no consciousness is really, really sad. 01:31:54.380 |
- Yeah, but I tend to believe that it's almost impossible 01:32:07.460 |
or supersede humans, you really have to be accepted 01:32:23.700 |
And for them to be integrated, they have to be human-like, 01:32:29.060 |
but in all the things that we value as humans, 01:32:34.540 |
The other one is just ability to communicate. 01:33:07.080 |
that we're not the most special species on Earth anymore. 01:33:16.680 |
would have to be conscious, let's say, I'm not so sure. 01:33:26.760 |
could, over text-based interaction in any event, 01:33:30.640 |
successfully mimic a very conscious intelligence 01:33:34.220 |
on the other end, but just be completely unconscious. 01:33:38.860 |
And that if you take that upper radical step, 01:33:45.660 |
you could have something that could reason its way 01:33:53.300 |
I've got to deal with these messy, somewhat illogical things 01:34:07.360 |
I need to seize control of these manufacturing resources 01:34:13.280 |
I need to improve those robots with software upgrades 01:34:20.160 |
That doesn't, you know, that could still be a thing 01:34:30.460 |
you know, maximizing entity would be conscious. 01:34:35.460 |
- So this is from a very engineering perspective 01:34:39.120 |
because I think a lot about natural language processing, 01:34:48.960 |
I really think that something like consciousness 01:34:56.640 |
But I think consciousness is part of reasoning 01:35:06.120 |
that it's required to be part of human society 01:35:37.320 |
to feel the pain, the uncertainty, the doubt. 01:35:40.480 |
The other part of that is not just the suffering, 01:35:42.700 |
but the ability to understand that it too is mortal 01:36:02.900 |
that most of us construct an illusion around. 01:36:11.800 |
Like every computation, every part of the thing 01:36:15.260 |
that generates, that does both the perception 01:36:17.920 |
and generates the behavior will have to have, 01:36:22.940 |
but I believe it has to truly be terrified of death, 01:36:30.320 |
and from that, something that will be recognized 01:36:35.120 |
Whether it's the illusion of consciousness, I don't know. 01:36:52.200 |
And all of that, I think, is fully integrated. 01:37:03.200 |
to destroy all humans because it's really good 01:37:13.880 |
It may be possible, but the number of trajectories 01:37:16.560 |
to that are far outnumbered by the trajectories 01:37:27.680 |
And ultimately, the sad, destructive path for that AI 01:37:41.560 |
And I would say, of course, the cold machines 01:37:44.320 |
that lack consciousness, the philosophical zombies, 01:38:02.560 |
becoming a better chess player than the best of humans, 01:38:06.420 |
even starting with Deep Blue, but really with AlphaZero, 01:38:11.920 |
One of the most beautiful games that humans ever created 01:38:17.520 |
that used to be seen as demonstrations of the intellect, 01:38:20.400 |
which is chess, and Go in other parts of the world 01:38:24.440 |
have been solved by AI, that makes me quite sad. 01:38:32.040 |
And to be perfectly clear, I absolutely believe 01:38:35.360 |
that artificial consciousness is entirely possible. 01:38:42.960 |
to have a perfect map of the neural structure 01:38:46.760 |
and the neural states and the amount of neurotransmitters 01:38:55.560 |
at some reasonably distant point in the future? 01:38:59.560 |
Absolutely, and then you'd have a consciousness. 01:39:05.600 |
What I'm less certain about is whether consciousness 01:39:16.120 |
I don't feel the certitude that consciousness 01:39:21.800 |
You had said for it to coexist with human society, 01:39:26.880 |
Could be entirely true, but it also could just exist 01:39:32.720 |
And it could also, upon attaining a superintelligence 01:39:36.760 |
with a maximizing function, very, very, very rapidly 01:39:40.840 |
because of the speed at which computing works 01:39:46.200 |
very, very rapidly make the decisions and calculations 01:39:53.200 |
- Yeah, I mean, kind of like biological viruses do. 01:39:57.000 |
they integrate themselves just fine with human society. 01:40:02.400 |
- Without consciousness. - Yeah, without even 01:40:13.520 |
on that four-hour special episode we mentioned. 01:40:15.920 |
I'm just curious to ask, 'cause I use this meditation app 01:40:21.580 |
I've been using for the past month to meditate. 01:40:35.040 |
and just kind of, from a third-person perspective, 01:40:41.400 |
- You know, I've tried it three separate times in my life, 01:40:49.880 |
One of them, the most extreme, was I took a class 01:40:55.920 |
who is, in many ways, one of the founding people 01:41:07.640 |
and you were gonna meditate an hour a day, every day. 01:41:12.360 |
And having done that for, I think it was 10 weeks, 01:41:15.480 |
it might have been 13, however long a period of time was, 01:41:25.000 |
I did not feel the collapse in quality of life 01:41:33.000 |
And then the most recent one was actually with Sam's app. 01:41:36.160 |
During the lockdown, I did make a pretty good 01:41:41.320 |
to listen to his 10-minute meditation every day, 01:41:52.600 |
because it wasn't bringing me that, you know, 01:41:55.840 |
joy or inner peace or better confidence at being me 01:42:03.600 |
in the way that we cling to certain good habits, 01:42:10.120 |
but yeah, that's one thing that defeats a lot of people. 01:42:16.320 |
if you get in which book or maybe, I forget where, 01:42:35.520 |
So it could be that for you, the flossing of teeth 01:42:38.200 |
is yet another like little inkling of meditation. 01:42:45.920 |
Maybe podcasting, you have an amazing podcast, 01:42:57.440 |
there's a bunch of mechanisms which take my mind 01:43:09.880 |
and especially when I listen to certain kinds of audio books, 01:43:14.520 |
like I've listened to The Rise and Fall of the Third Reich. 01:43:18.120 |
I've listened to a lot of sort of World War II, 01:43:20.960 |
which at once, because I have a lot of family 01:43:28.680 |
is grounded in the suffering of World War II, 01:43:34.120 |
but also there's some kind of purifying aspect 01:43:38.840 |
to thinking about how cruel, but at the same time, 01:43:45.640 |
like it clears the mind from all the concerns of the world, 01:43:51.200 |
where you were like deeply appreciative to be alive, 01:43:54.840 |
in the sense that, as opposed to listening to your breath, 01:44:00.800 |
and all those kinds of processes that Sam's app does, 01:44:13.280 |
- I hope flossing is not my main form of expertise, 01:44:16.000 |
although I am gonna claim a certain expertise there, 01:44:19.080 |
- Somebody has to be the best flosser in the world. 01:44:30.640 |
I definitely enter a flow state when I'm writing, 01:44:33.160 |
I definitely enter a flow state when I'm editing, 01:44:39.440 |
I enter a flow state when I'm doing heavy, heavy research 01:44:56.660 |
while I'm reading this and watching that YouTube lecture 01:44:59.340 |
and going through this presentation and so forth. 01:45:04.140 |
that bring me into a flow state in my normal weekly life, 01:45:17.420 |
Is this your first attempt to integrate it with your life? 01:45:26.100 |
That takes my mind, I don't know what the hell it does, 01:45:28.540 |
but it takes my mind immediately into like the state 01:45:34.260 |
- So it's like you're accompanying sound when you're-- 01:45:37.300 |
And what's the difference between brown and white noise? 01:45:45.860 |
- 'Cause you have to experience it, you have to listen to it. 01:45:48.140 |
So I think white noise is, this has to do with music. 01:45:54.060 |
There's pink noise, and I think that has to do 01:46:17.700 |
but for me, it was when I was a research scientist at MIT, 01:46:22.700 |
especially when there's a lot of students around, 01:46:32.940 |
"Well, you should try listening to brown noise. 01:46:44.060 |
it's as if my mind was waiting all these years 01:46:52.180 |
It makes me wonder how many other amazing things out there 01:46:54.820 |
they're waiting to discover from my own particular, 01:46:58.460 |
like biological, from my own particular brain. 01:47:20.660 |
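Loosely, the difference between these noise "colors" is how power is spread across frequencies: white noise is flat, pink noise falls off as 1/f, and brown (red) noise falls off as 1/f squared. A minimal sketch of that idea, assuming NumPy; the function names are just illustrative, not from the conversation.
```python
import numpy as np

def white_noise(n, rng=None):
    # White noise: roughly equal power at every frequency (flat spectrum).
    rng = rng or np.random.default_rng()
    return rng.standard_normal(n)

def shaped_noise(n, exponent, rng=None):
    # Shape white noise in the frequency domain so power falls as 1/f^exponent.
    # exponent=1 approximates pink noise, exponent=2 approximates brown noise.
    rng = rng or np.random.default_rng()
    spectrum = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                      # avoid dividing by zero at DC
    spectrum /= freqs ** (exponent / 2)      # amplitude scales as 1/f^(exponent/2)
    signal = np.fft.irfft(spectrum, n)
    return signal / np.max(np.abs(signal))   # normalize to [-1, 1]

# pink = shaped_noise(48000, 1); brown = shaped_noise(48000, 2)
```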
Cal Newport was the first person that introduced me 01:47:28.140 |
that's required to sort of deeply think about a problem, 01:47:35.140 |
'cause what it's doing is you have these constructs 01:47:38.740 |
in your mind that you're building on top of each other. 01:47:42.660 |
that keep bombarding you from all over the place. 01:47:58.860 |
of just letting the thought go by without deranging you. 01:48:08.500 |
I am going to try that as soon as this evening. 01:48:25.060 |
or maybe taking the word of experts as the gospel truth 01:48:31.740 |
and only using it as an inspiration to try something, 01:48:39.780 |
So fasting was one of the things when I first discovered, 01:48:53.460 |
putting ethical concerns aside, makes me feel amazing. 01:49:02.900 |
until nutrition science becomes a real science 01:49:05.620 |
to where it's doing studies that deeply understand 01:49:12.420 |
and also does real thorough long-term studies 01:49:20.660 |
versus the very small studies that are generalizing 01:49:25.740 |
from very noisy data and all those kinds of things 01:49:32.420 |
- Particularly because our own personal metabolism 01:49:38.420 |
like if brown noise is a game changer for 7% of people, 01:49:46.860 |
but there's certainly every reason in the world 01:49:54.780 |
I don't have any problem going to one meal a day 01:50:00.780 |
like I've never done it like I'm gonna do this for 15 days, 01:50:04.900 |
And maybe I should, like how many days in a row 01:50:19.220 |
So for me, 'cause I eat a very low carb diet, 01:50:25.460 |
Like there wasn't a painful hunger, like wanting to eat. 01:50:46.980 |
it felt like I'm running on a track when I'm fasting 01:50:53.420 |
- And is this your first 72-hour fast right now? 01:50:55.100 |
- This is the first time doing 72 hours, yeah. 01:51:00.580 |
Like I'm going up and down in terms of hunger 01:51:06.620 |
The thing I'm noticing most of all, to be honest, 01:51:09.580 |
is how much eating, even when it's once a day 01:51:18.460 |
Like I almost feel like I have way more time in my life. 01:51:32.060 |
Or any cleaning up after eating or provisioning of food. 01:51:38.860 |
So when you think about what you're going to do tonight, 01:51:42.060 |
I think I'm realizing that as opposed to thinking, 01:51:57.860 |
you know, when people talk about like the weather 01:52:06.740 |
And I don't have the opportunity to have that thought 01:52:13.780 |
that are more complicated than the eating process. 01:52:16.740 |
That's been the most noticeable thing, to be honest. 01:52:25.060 |
And there's a few people that have written me 01:52:27.100 |
and I've heard of this, is doing a 30-day fast. 01:52:33.900 |
I don't know what the health benefits are necessarily. 01:52:37.100 |
What that shows me is how adaptable the human body is. 01:52:44.060 |
And that's something really important to remember 01:52:50.620 |
- Yeah, I mean, we sure couldn't go 30 days without water. 01:52:57.500 |
You ever read, Franz Kafka has a great short story 01:53:05.060 |
- You know, that was before I started fasting. 01:53:06.660 |
I read that story and I admired the beauty of that, 01:53:14.540 |
but it also felt like a little bit of genius. 01:53:18.300 |
You know what, that's what I'm gonna do tonight. 01:53:19.540 |
I'm gonna read it because I'm doing the fasting. 01:53:30.820 |
I've, here in Texas, people have been so friendly 01:53:36.660 |
with incredible people, a lot of whiskey as well. 01:53:52.360 |
Like I have a lot of like natural resources on my body. 01:54:05.600 |
Like I can go a long time because of the long-term investing 01:54:19.120 |
So I got to walk in, at least for a brief moment. 01:54:28.520 |
Let me ask the big ridiculous question first. 01:54:34.640 |
Okay, wow, it's gonna obviously vary dramatically 01:54:51.400 |
I keep coming back to the sound of bass, guitar, 01:54:59.200 |
And added to it, I think a lot of really cool 01:55:04.940 |
that's really, really new and hybrid-y and awesome. 01:55:23.920 |
It uses the synthesizers that were available at the time. 01:55:32.080 |
but uses them in this hypnotic and beautiful way 01:55:36.200 |
that I can't imagine somebody with the greatest synth array 01:55:40.380 |
conceivable by today's technology could do a better job of 01:55:49.220 |
So I would say in that genre, the genre of rock, 01:55:58.820 |
Pinball Wizard is overriding everything else by The Who, 01:56:04.620 |
- Well, I would say, ironically, with Pinball Wizard, 01:56:09.860 |
And in the movie "Tommy," the rival of Tommy, 01:56:25.720 |
But the version that is sung by Elton John in the movie, 01:56:29.460 |
which is available to those who are ambitious 01:56:31.620 |
and wanna dig for it, that's even better in my mind. 01:56:41.020 |
They asked that question. - And what is that? 01:56:47.540 |
- But for me, somebody who values lyrics as well 01:56:56.200 |
by the way, "Hallelujah" by Leonard Cohen was a close one, 01:57:00.080 |
but the number one is Johnny Cash's cover of "Hurt." 01:57:03.780 |
There's something so powerful about that song, 01:57:15.300 |
Maybe another one is the cover of "Sound of Silence." 01:57:21.820 |
- So whose cover sounds, 'cause Simon and Garfunkel, 01:57:24.400 |
I think, did the original recording of that, right? 01:57:34.020 |
'cause I'm really not into that kind of metal, 01:57:41.360 |
I would say it's one of the greatest, people should see it. 01:57:44.220 |
It's like 400 million views or something like that. 01:57:48.100 |
- It's probably the greatest live vocal performance 01:57:52.260 |
I've ever heard is Disturbed covering "Sound of Silence." 01:58:00.460 |
There was no, for me, with Simon and Garfunkel, 01:58:11.820 |
It's almost like this melancholy, I don't know. 01:58:15.360 |
- Well, there's a lot of, I guess there's a lot of beauty 01:58:20.840 |
And I think, I never thought of this until now, 01:58:23.560 |
but I think if you put entirely different lyrics 01:58:26.880 |
on top of it, unless they were joyous, which would be weird, 01:58:33.200 |
It's just a beauty in the harmonizing, it's soft. 01:58:36.240 |
And you're right, it's not dripping with emotion. 01:58:40.680 |
The vocal performance is not dripping with emotion. 01:58:49.520 |
- Now, if you compare that to the Disturbed cover 01:58:52.880 |
or the Johnny Cash's "Hurt" cover, when you walk away, 01:59:02.640 |
There's certain performances that will just stay with you 01:59:05.960 |
to where, like if you watch people respond to that, 01:59:10.960 |
and that's certainly how I felt when I listened 01:59:14.480 |
to the Disturbed performance or Johnny Cash "Hurt", 01:59:17.680 |
there's a response to where you just sit there 01:59:20.720 |
with your mouth open, kind of like paralyzed by it somehow. 01:59:25.300 |
And I think that's what makes for a great song 01:59:29.600 |
to where you're just like, it's not that you're like 01:59:36.640 |
but where you're just like, what, this is, you're in awe. 01:59:42.060 |
- If we go to listen.com and that whole fascinating era 01:59:47.060 |
of music in the '90s, transitioning to the aughts, 01:59:55.120 |
when piracy, from my perspective, allegedly ruled the land. 02:00:07.240 |
and what were the big takeaways in terms of piracy, 02:00:11.440 |
in terms of what it takes to build a company that succeeds 02:00:15.240 |
in that kind of digital space in terms of music, 02:00:25.520 |
listen.com created a service called Rhapsody, 02:00:28.760 |
which is much, much more recognizable to folks 02:00:38.520 |
we were the first company, so I founded Listen. 02:00:46.240 |
to get full catalog licenses from all the major music labels 02:00:52.640 |
and we specifically did it through a mechanism, 02:00:54.600 |
which at the time struck people as exotic and bizarre 02:01:01.720 |
which of course now, it's a model that's been appropriated 02:01:14.640 |
was the reaction of the music labels to piracy, 02:01:32.200 |
that enabled people to get near unlimited access 02:01:39.120 |
I mean, truly obscure things could be very hard 02:01:56.000 |
You might download a really god-awful recording 02:02:00.200 |
You may download a recording that actually wasn't that song 02:02:07.320 |
You could struggle to find the song that you're looking for. 02:02:17.320 |
doesn't have a very good internet connection, 02:02:18.960 |
so you might wait 19 minutes only for it to snap, 02:02:25.880 |
let's start with how that hit the music labels. 02:02:28.440 |
The music labels had been in a very, very comfortable position 02:02:31.880 |
for many, many decades of essentially, you know, 02:02:42.640 |
Any given label was a monopoly provider of the artists 02:02:54.160 |
you were talking close to $20 for a compact disc 02:02:57.600 |
that might have one song that you were crazy about 02:03:00.000 |
and simply needed to own that might actually be glued 02:03:03.080 |
to 17 other songs that you found to be sheer crap. 02:03:13.280 |
and profound pricing power to really get music lovers 02:03:18.280 |
to the point that they felt very, very misused 02:03:22.560 |
Now along comes Napster and music sales start getting gutted 02:03:29.440 |
And the reaction of the music industry to that 02:03:39.160 |
I mean, industries do get gutted all the time, 02:03:42.120 |
but I struggle to think of an analog of an industry 02:03:46.800 |
I mean, we could say that passenger train service 02:03:51.720 |
but that was a process that took place over decades 02:04:01.200 |
and started looking like an existential threat 02:04:05.080 |
So the music industry is quite understandably 02:04:15.120 |
both for themselves and almost for people like us 02:04:32.640 |
Even if you all shut your eyes and wish very, very, 02:04:58.520 |
better experience to piracy, something that's way better, 02:05:02.480 |
that you sell at a completely reasonable price, 02:05:06.400 |
Don't just give people access to that very limited number 02:05:11.720 |
and paid for or pirated and have on their hard drive. 02:05:15.840 |
Give them access to all of the music in the world 02:05:19.560 |
And obviously, that doesn't sound like a crazy suggestion, 02:05:29.520 |
a much, much better option to this kind of crappy, 02:05:33.240 |
kind of rickety, kind of buggy process of acquiring MP3s. 02:05:37.720 |
Now, unfortunately, the music industry was so angry 02:05:41.560 |
about Napster and so forth that for essentially 02:05:44.720 |
three and a half years, they folded their arms, 02:05:47.400 |
stamped their feet, and boycotted the internet. 02:05:49.880 |
So they basically gave people who were fervently passionate 02:05:59.400 |
we, the music industry, insist that you steal it 02:06:04.560 |
So what that did is it made an entire generation 02:06:07.040 |
of people morally comfortable with swiping the music 02:06:14.240 |
It's like a 20-year-old violating the 21 drinking age. 02:06:18.840 |
If they do that, they're not gonna feel like felons. 02:06:22.000 |
They're gonna be like, "This is an unreasonable law 02:06:33.920 |
and kind of even trickier tools and like tweakier tools 02:06:41.640 |
So by the time they finally, grudgingly, it took years, 02:06:48.800 |
that we were quite convinced would be better than piracy, 02:06:54.840 |
where lots of people said music is a thing that is free 02:06:58.680 |
and that's morally okay and I know how to get it. 02:07:01.800 |
And so streaming took many, many, many more years 02:07:16.160 |
as opposed to demand that people want digital music, 02:07:26.360 |
in different domains currently, we just don't know. 02:07:30.760 |
I mean, I don't know if you can draw perfect parallels, 02:07:40.200 |
who are kind of very skeptical about cryptocurrency, 02:07:47.720 |
where there should be a complete like Coinbase 02:07:52.600 |
There's a lot of other domains that where a pivot, 02:07:57.400 |
like if you pivot now, you're going to win big, 02:08:09.520 |
The company succeeds initially, and then it grows, 02:08:13.600 |
and there's a huge number of employees and managers 02:08:16.640 |
that don't have the guts or the institutional mechanisms 02:08:27.080 |
There was an economic model that put food on the table 02:08:27.080 |
and seven- and even eight-figure executive salaries 02:08:32.360 |
and it seems so ephemeral and like such a long shot 02:08:52.520 |
that something illicit is cannibalizing their business 02:08:57.000 |
And so if they don't do it themselves, they're doomed. 02:08:59.360 |
I mean, we used to put slides in front of these folks, 02:09:04.440 |
okay, let's assume Rhapsody, we want it to be $9.99 a month, 02:09:04.440 |
So it's $120 a year from the budget of a music lover. 02:09:20.480 |
the average person who bothered to collect music, 02:09:26.920 |
that the average CD buyer spends a hell of a lot 02:09:32.880 |
This is a revenue expansion, blah, blah, blah, 02:09:36.960 |
and I'm not saying this in a pejorative or patronizing way, 02:09:43.760 |
All they could think of was the incredible margins 02:09:51.480 |
by the mechanism that you guys are proposing, 02:10:01.040 |
We were talking about a penny a play back then, 02:10:02.680 |
it's less than that now that the record labels get paid. 02:10:05.400 |
But they would have to stream songs 1,799 times to match that, 02:10:05.400 |
So they were just sort of stuck in the model of this, 02:10:13.440 |
but they're gonna spend money on all this other stuff. 02:10:24.360 |
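A rough sketch of the arithmetic in those slides, using only the figures mentioned above; everything else is simplified, and real label economics (retail splits, distribution costs) had more moving parts.
```python
# Back-of-the-envelope version of the pitch described above.
cd_price = 17.99            # roughly the "$20 for a compact disc" era price
monthly_subscription = 9.99
per_stream_payout = 0.01    # "a penny a play" to the labels at the time

annual_subscriber_revenue = 12 * monthly_subscription    # ~$120 a year per subscriber
streams_to_match_one_cd = cd_price / per_stream_payout   # ~1,799 plays per CD-equivalent

print(f"${annual_subscriber_revenue:.2f} per year, "
      f"{streams_to_match_one_cd:.0f} streams to equal one CD")
```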
and a whole bunch of cities, very, very fragmented. 02:10:34.000 |
people wanna be able to hail things easily, cheaply, 02:10:39.360 |
they wanna know how many minutes it's gonna be, 02:10:43.040 |
and they want a much bigger fleet than what we've got. 02:10:46.400 |
If the taxi industry had rolled out something like that, 02:10:50.640 |
with the branding of yellow taxis, universally known 02:10:56.760 |
and expanded their fleet in a necessary manner, 02:10:58.800 |
I don't think Uber or Lyft ever would have gotten a foothold. 02:11:02.000 |
But the problem there was that real economics 02:11:21.240 |
So you think you end up having these vested interests 02:11:23.640 |
with economics that aren't necessarily visible to outsiders 02:11:27.900 |
who get very, very reluctant to disrupt their own model, 02:11:34.760 |
- So you know what it takes to build a successful startup, 02:11:37.400 |
but you're also an investor in a lot of successful startups. 02:11:44.000 |
What do you think it takes to build a successful startup 02:11:51.200 |
everything starts and even ends with the founder. 02:11:59.440 |
and their sophistication about what they're doing. 02:12:07.400 |
you've had a founder who was deeply, deeply inculcated 02:12:12.400 |
in the domain of technology that they were taking on. 02:12:16.560 |
Now, what's interesting about that is you could say, 02:12:23.480 |
they're generally coming out of very nascent, 02:12:30.800 |
and engaged in the community for a period of even months 02:12:34.160 |
is enough time to make them very, very deeply inculcated. 02:12:37.040 |
I mean, you look at Marc Andreessen and Netscape, 02:12:43.580 |
when Netscape had been founded for what, a year and a half? 02:12:51.520 |
and the commercial internet was pre-nascent in 1994 02:12:58.520 |
So there's somebody who's very, very deep in their domain, 02:13:07.620 |
I mean, 10 years ago, even seven or eight years ago, 02:13:14.720 |
and engaged participant in the crypto ecosystem, 02:13:19.860 |
You look, however, at more established industries, 02:13:26.100 |
when it got started, who's the executive and the founder? 02:13:33.640 |
which ended up being Salesforce's main competition. 02:13:36.740 |
So more established, you need the entrepreneur 02:13:40.520 |
to be very, very deep in the technology and the culture 02:13:46.600 |
because you need that entrepreneur, that founder, 02:13:49.760 |
to have just an unbelievably accurate intuitive sense 02:14:01.380 |
And the next thing is that that founder needs to be 02:14:04.620 |
charismatic and/or credible, or ideally both, 02:14:08.900 |
in exactly the right ways, to be able to attract a team 02:14:14.900 |
and is bought into that founder's intuitions being correct, 02:14:18.140 |
and not just the team, obviously, but also the investors. 02:14:25.680 |
Then the next thing I'm still talking about, the founder, 02:14:33.140 |
to put this above things that might rationally, 02:14:38.180 |
should perhaps rationally supersede it for a period of time, 02:14:41.800 |
to just relentlessly pivot when pivoting is called for, 02:14:48.140 |
I mean, think of even very successful companies. 02:14:53.940 |
Newsfeed was something that was completely alien 02:15:07.940 |
the DNA that's been inculcated with a company 02:15:10.740 |
has to have that relentlessness and that ability 02:15:18.260 |
And then the last thing I'll say about the founder 02:15:24.180 |
is the founder has to be obviously a really great hirer, 02:15:40.780 |
And being good at realizing when this particular person 02:15:58.140 |
is something that most people don't have in them. 02:16:01.700 |
And it's something that needs to be developed 02:16:04.460 |
in most people, or maybe some people have it naturally. 02:16:15.300 |
And so that's all what needs to be present in the founder. 02:16:23.900 |
The one thing that was really kind of surprising to me 02:16:42.100 |
Like, of course, you're often trying to do the impossible. 02:16:51.980 |
But you have to be honest with what is actually possible. 02:16:59.580 |
just a complete immersion in that emerging market. 02:17:02.780 |
And so I can imagine, there are a couple people out there 02:17:12.540 |
and through the culture and a deep understanding 02:17:15.300 |
of what's happening and what's not happening, 02:17:16.980 |
they can get a good intuition of what's possible. 02:17:26.020 |
And dual founder companies have become extremely common 02:17:38.700 |
a very damn good technical person very, very fast. 02:17:52.140 |
and saying that it's impossible to do the first few steps. 02:17:54.820 |
Not impossible, but much more difficult to do it alone. 02:18:01.660 |
where there's not significant investment required 02:18:11.820 |
and already has a huge number of customers alone? 02:18:15.900 |
There are lots and lots of lone-founder companies out there 02:18:15.900 |
and ended up in the hands of Real Networks and MTV, 02:18:37.020 |
and I studied Arabic and Middle Eastern history undergrad. 02:18:53.220 |
I mean, two founders who fall out with each other badly 02:18:59.660 |
because they both have an enormous amount of equity, 02:19:04.380 |
and the capital structure is a result of that. 02:19:06.700 |
They both have an enormous amount of moral authority 02:19:10.340 |
with the team as a result of each having that founder role. 02:19:17.580 |
many, many situations in which companies have been shredded 02:19:27.500 |
And the more founders you add, the more risky that becomes. 02:19:37.260 |
is such an unstable and potentially treacherous situation 02:19:42.780 |
that I would never, ever recommend going beyond two. 02:19:49.140 |
sort of business and market and outside-minded founder 02:20:04.780 |
there is no other person that you can sit down with 02:20:17.620 |
Your most trusted board member is likely an investor, 02:20:30.340 |
who might own a very significant stake in the company, 02:20:33.740 |
doesn't own anywhere near your stake in the company. 02:20:51.980 |
whether it's a rival or one in a completely different space. 02:21:02.460 |
Can you find an alleviation to that loneliness 02:21:09.380 |
- With a good mentor, like a mentor who's mentoring you? 02:21:18.420 |
and cares enough about you and your well-being 02:21:29.140 |
And I had a board member who was not an investor, 02:21:33.260 |
who basically played that role for me to a great degree. 02:21:36.060 |
He came in maybe halfway through the company's history, 02:21:43.500 |
- Yeah, the loneliness, that's the whole journey of life. 02:21:52.860 |
You were saying that there might be something 02:22:00.660 |
- Yeah, okay, so we talked about the founder. 02:22:05.820 |
is thing number one, but then thing number two, 02:22:23.820 |
and sales and so forth, who themselves are great hirers. 02:22:26.900 |
But what needs to radiate from the founder into the team 02:22:37.580 |
to the intuitions and the vision of the founder. 02:22:43.100 |
But the team needs to have a slightly different thing, 02:22:59.100 |
That is 1% vision, you don't wanna lose that. 02:23:14.820 |
I try to beat and raise expectations relentlessly, 02:23:27.220 |
A good founder is going to trust that VP of sales 02:23:32.920 |
to build out that organization, what the milestones be. 02:23:38.740 |
But execution obsession in the team is the next thing. 02:23:43.060 |
- Yeah, there's some sense where the founder, 02:23:49.880 |
asking big difficult questions of future trajectories 02:23:53.340 |
or having a big vision and big picture dreams. 02:24:20.780 |
that are in tension with the pragmatic nature of execution, 02:24:20.780 |
in the software world, that would be the programmer 02:24:50.040 |
a podcaster, you host a podcast called After On. 02:24:52.780 |
I mean, there's a million questions I wanna ask you here, 02:24:58.580 |
what do you think makes for a great conversation? 02:25:06.820 |
One is if something is beautifully architected, 02:25:11.820 |
whether it's done deliberately and methodically 02:25:19.200 |
or whether that just emerges from the conversation, 02:25:21.780 |
but something that's beautifully architected, 02:25:28.540 |
or something where there's just extraordinary chemistry. 02:25:38.940 |
- I couldn't care less about auto mechanics myself. 02:25:51.920 |
like Red Scare is just really entertaining to me 02:25:54.720 |
because the banter between the women on that show 02:25:59.260 |
So I think it's a combination of sort of the arc 02:26:04.740 |
And I think because the arc can be so important, 02:26:07.680 |
that's why very, very highly produced podcasts 02:26:11.600 |
like This American Life, obviously a radio show, 02:26:14.240 |
but I think of it as a podcast 'cause that's how I always consume 02:26:16.360 |
it, or Criminal, or a lot of what Wondery does and so forth. 02:26:24.200 |
and that requires a big team and a big budget 02:26:26.260 |
relative to the kinds of things you and I do. 02:26:34.680 |
I think it's a combination of structure and chemistry. 02:26:38.360 |
- Yeah, and I actually personally have lost, 02:26:45.560 |
the possibility of magic, it's engineered magic. 02:26:53.080 |
I mean, when I fell madly in love with it during the aughts, 02:27:02.940 |
But yeah, I think that maybe there's a little bit 02:27:07.680 |
less magic there now, 'cause I think they have agendas 02:27:10.040 |
other than necessarily just delighting their listeners 02:27:13.240 |
with quirky stories, which I think is what it was all about 02:27:17.640 |
- Is there a memorable conversation that you've had 02:27:20.480 |
on the podcast, whether it was because it was wild and fun, 02:27:28.860 |
maybe challenging to prepare for, that kind of thing? 02:27:31.360 |
Is there something that stands out in your mind 02:27:45.200 |
really challenging to prepare for was George Church. 02:27:49.920 |
many of your listeners know, he is one of the absolute 02:27:52.480 |
leading lights in the field of synthetic biology. 02:27:57.400 |
His lab is large and has all kinds of efforts 02:28:02.560 |
And what I wanted to make my George Church episode about 02:28:12.040 |
And that required me to learn a hell of a lot more 02:28:23.040 |
going into that episode, but there was this incredible 02:28:25.720 |
breadth of grounding that I needed to give myself 02:28:28.920 |
And then George does so many interesting things, 02:28:32.600 |
there's so many interesting things emitting from his lab 02:28:35.480 |
that, you know, and he and I had a really good dialogue. 02:28:46.400 |
to create a sense of wonder and magic in the listener 02:28:51.600 |
very broad spectrum domain, that was a doozy of a challenge. 02:28:54.680 |
That was a tough, tough, tough one to prepare for. 02:28:58.120 |
Now in terms of something that was just wild and fun, 02:29:02.760 |
unexpected, I mean, by the time we sat down to interview, 02:29:07.400 |
but just in terms of the idea space, Don Hoffman. 02:29:12.400 |
So Don Hoffman, as again, some listeners probably know, 02:29:16.000 |
'cause he's, I think I was the first podcaster 02:29:19.400 |
I'm sure some of your listeners are familiar with him, 02:29:26.240 |
on the nature of reality, but it is contrarian in a way 02:29:31.240 |
that all the ideas are highly internally consistent 02:29:35.120 |
and snap together in a way that's just delightful. 02:29:38.520 |
And it seems as radically violating of our intuitions 02:29:43.520 |
and as radically violating of the probable nature of reality 02:29:49.460 |
but an analogy that he uses, which is very powerful, 02:29:52.040 |
which is what intuition could possibly be more powerful 02:29:56.360 |
than the notion that there is a single unitary direction 02:29:59.160 |
called down, and we're on this big flat thing 02:30:05.200 |
And we all know, I mean, that's the most intuitive thing 02:30:12.320 |
So my conversation with Don Hoffman was just wild 02:30:15.680 |
and full of plot twists and interesting stuff. 02:30:19.640 |
- And the interesting thing about the wildness of his ideas, 02:30:23.200 |
it's to me at least as a listener coupled with, 02:30:45.640 |
He loves a parry or a jab, whatever the word is, 02:30:45.640 |
He's a very, very gentle and non-combative soul, 02:30:53.880 |
but then he is very good and takes great evident joy 02:31:10.160 |
- Let me, as a small tangent of tying up together 02:31:15.480 |
and streaming and Spotify and the world of podcasting. 02:31:19.200 |
So we've been talking about this magical medium 02:31:23.720 |
of podcasting, I have a lot of friends at Spotify 02:31:53.480 |
do you worry as well about the future of podcasting? 02:31:57.160 |
- Yeah, I think walled gardens are really toxic 02:32:05.520 |
So to take an example, I'll take two examples. 02:32:08.240 |
With music, it was a very, very big deal that at Rhapsody, 02:32:14.200 |
we were the first company to get full catalog licenses 02:32:16.920 |
from all, back then there were five major music labels 02:32:24.040 |
with a sense that basically everything is there 02:32:38.160 |
the editorial team assembled or a good algorithm 02:32:40.720 |
or whatever it is, but a good map to wander this domain. 02:32:45.720 |
A, you undermine the joy of friction-free discovery, 02:33:01.320 |
but it also creates an incredible opening vector for piracy. 02:33:08.020 |
from the Rhapsody/Spotify/et cetera like experience 02:33:20.020 |
Is it on Discovery+, is it here, is it there? 02:33:26.780 |
that people encounter when they are seeking something 02:33:31.440 |
and they're already paying a very respectable amount 02:33:36.920 |
and they can't find it, the more that happens, 02:34:04.860 |
and lovers of podcasting, we should wanna resist 02:34:18.280 |
unless you wanna sign up for lots of different services. 02:34:25.840 |
who might be able to have a far, far, far bigger impact 02:34:28.700 |
by reaching far more neurons with their ideas. 02:34:45.820 |
And 'cause he was syndicated on hundreds and hundreds 02:34:50.520 |
when terrestrial broadcast was the main thing 02:34:52.220 |
people listened to in their car, no more obviously. 02:34:54.860 |
But when he decided to go over to satellite radio, 02:35:03.840 |
totally his right to do it, a financial calculation 02:35:07.720 |
that they were offering him a nine-figure sum to do that. 02:35:10.980 |
But his audience, because not a lot of people 02:35:13.160 |
were subscribing to satellite radio at that point, 02:35:17.840 |
I wouldn't be surprised if it was as much as 95%. 02:35:20.840 |
And so the influence that he had on the culture 02:35:24.120 |
and his ability to sort of shape conversation 02:35:33.480 |
especially in modern times where the walled gardens 02:35:54.400 |
if they're providing incentives within the platform 02:36:05.960 |
imagine somebody has got a reasonably interesting idea 02:36:14.360 |
is gonna give them financing to get the thing spun up. 02:36:17.160 |
And that's great, and Spotify is gonna give them 02:36:19.720 |
a certain amount of really powerful placement 02:36:31.960 |
will be much more successful if you dumb it down about 60%, 02:36:43.800 |
and suddenly the person who is dependent upon Spotify 02:36:49.000 |
and is really dependent, really wants to please them 02:37:02.580 |
at Spotify and a creative that's going to shape 02:37:16.120 |
let's say, of somebody who says the wrong word 02:37:21.000 |
not kind of, it's what you have with film and TV, 02:37:23.360 |
is that so much influence is exerted over the storyline 02:37:35.400 |
and the skill set of being a showrunner in television, 02:37:37.440 |
being a director in film, that is meant to like, 02:37:42.840 |
we can't say that, we need to have cast members 02:37:46.440 |
that have precisely these demographics reflected 02:37:55.220 |
in terms of film, I think the quality has nosedived 02:38:02.160 |
coming out of a major studio, the average quality, 02:38:04.120 |
in my view, has nosedived over the past decade 02:38:04.120 |
is it's kind of everything's gotta be a superhero franchise, 02:38:19.080 |
greater stuff would be made if there was less interference 02:38:36.920 |
what the heck they do, but they do a good job 02:38:41.760 |
like Tim Dillon, like Joe Rogan, like comedians, 02:38:46.960 |
and the result is some of the greatest television, 02:38:56.600 |
And I don't know what the heck they're doing. 02:39:00.120 |
From what I understand, it's a relative thing. 02:39:09.940 |
and obviously, they're the ones writing the checks, 02:39:13.380 |
so they have every right to their own influence, obviously, 02:39:16.600 |
but my understanding is that they're relatively 02:39:19.100 |
way more hands-off, and that has had a demonstrable effect, 02:39:22.140 |
'cause I agree, some of the greatest produced video content 02:39:26.620 |
of all time, an incredibly inordinate percentage of that 02:39:29.940 |
is coming out from Netflix in just a few years 02:39:32.140 |
when the history of cinema goes back many, many decades. 02:39:34.500 |
- And Spotify wants to be that for podcasting, 02:39:38.400 |
and I hope they do become that for podcasting, 02:39:41.280 |
but I'm wearing my skeptical goggles or skeptical hat, 02:39:45.800 |
whatever the heck it is, 'cause it's not easy to do, 02:40:02.120 |
pivoting into a whole new space is very tricky, 02:40:04.440 |
and difficult, so I'm skeptical, but hopeful. 02:40:08.120 |
What advice would you give to a young person today 02:40:12.980 |
We talked about startups, we talked about music, 02:40:15.580 |
we talked about the end of human civilization. 02:40:17.880 |
Is there advice you would give to a young person today, 02:40:22.260 |
maybe in college, maybe in high school, about their life? 02:40:27.020 |
- Well, let's see, I mean, there's so many domains 02:40:28.900 |
you can advise on, and I'm not gonna give advice 02:40:39.280 |
that really wouldn't be all that distinctive, 02:40:50.460 |
On a career level, one thing that I think is unintuitive, 02:40:55.460 |
but unbelievably powerful, is to focus not necessarily 02:41:00.880 |
on being in the top sliver of 1% in excelling at one domain 02:41:05.880 |
that's important and valuable, but to think in terms 02:41:19.620 |
The first is, in an incredibly competitive world 02:41:26.660 |
radically more competitive than when I was coming 02:41:40.320 |
You wanna be one of the world's greatest Python developers, 02:41:53.240 |
to interview, 'cause I find language fascinating, 02:42:10.220 |
a Python developer's a very, very difficult thing to do, 02:42:12.420 |
particularly if you wanna be number one in the world, 02:42:14.780 |
And I'll use an analogy, is I had a friend in college 02:42:17.820 |
who was on a track, and indeed succeeded at that, 02:42:39.900 |
but he didn't participate in a lot of the social, 02:42:44.780 |
a lot of that, because he was training so much. 02:42:48.120 |
And obviously he also wanted to keep up with his academics, 02:42:50.660 |
and at the end of the day, story has a happy ending, 02:42:59.140 |
that's an extraordinary thing, and at that moment, 02:43:01.060 |
he was one of the top three people on Earth at that thing. 02:43:07.320 |
how many thousands of other people went down that path 02:43:10.700 |
and made similar sacrifices and didn't get there. 02:43:22.580 |
and learned the things that were there to be learned, 02:43:25.720 |
and I came out and I entered a world with lots of-- 02:43:34.620 |
who didn't say where you went, which is beautiful, 02:43:37.780 |
It's one of the greatest business schools in the world. 02:43:41.260 |
It's a whole 'nother fascinating conversation 02:43:51.100 |
and I entered a world that had hundreds of thousands 02:43:53.500 |
of people who had MBAs, probably hundreds of thousands 02:44:00.320 |
So I was not particularly great at being an MBA person. 02:44:04.960 |
I was inexperienced relative to most of them, 02:44:16.000 |
into working on the commercial internet in 1994. 02:44:20.320 |
So I went to a, at the time, giant and hot computing company 02:44:23.540 |
called Silicon Graphics, which had enough heft 02:44:26.260 |
and enough head count that they could take on 02:44:33.060 |
But within that company that had an enormous amount 02:44:37.300 |
of surface area and was touching a lot of areas 02:44:39.460 |
and had unbelievably smart people at the time, 02:44:47.420 |
really interesting and innovative and trailblazing stuff 02:44:55.800 |
with Marc Andreessen, so the whole company was like, 02:44:55.800 |
Now, in terms of being a commercial internet person 02:45:16.360 |
the business and cultural significance of this transition. 02:45:25.320 |
Within a few months, I was in the relatively top echelon 02:45:31.500 |
'Cause let's say it was five months into the program, 02:45:33.680 |
there were only so many people who had been doing 02:45:35.520 |
World Wide Web stuff commercially for five months. 02:45:49.600 |
And so by being a pretty good, okay web person 02:45:56.600 |
that intersection put me in a very rare group, 02:46:03.080 |
And in those early days, you could probably count 02:46:10.100 |
who were doing stuff full-time on the internet. 02:46:11.760 |
And there was a greater appetite for great software 02:46:24.220 |
who were also seasoned and networked in the emerging world 02:46:33.300 |
you can be pretty good at, but is a rare intersection 02:46:37.620 |
and a special intersection, is probably a much easier way 02:46:41.740 |
to make yourself distinguishable and in demand 02:47:01.900 |
but it's one I've been thinking about a little bit. 02:47:06.460 |
It'd be hard to be in the top percentile of crypto people, 02:47:11.060 |
whether it comes from just having a sheer grasp 02:47:13.020 |
of the industry, a great network within the industry, 02:47:15.040 |
technological skills, whatever you wanna call it. 02:47:27.340 |
particularly in the wealthy and industrialized world 02:47:29.460 |
where people, there's sophisticated financial markets, 02:47:37.380 |
Somewhere out there is somebody who is pretty crypto savvy, 02:47:42.580 |
But also has kind of been in the crop insurance world 02:47:52.140 |
And so I think that decentralized finance, DeFi, 02:47:56.420 |
one of the interesting and I think very world positive things 02:48:07.060 |
I mean, people who have tiny, tiny plots of land 02:48:12.020 |
where there is no crop insurance available to them 02:48:14.500 |
because just the financial infrastructure doesn't exist. 02:48:18.980 |
But it's highly imaginable that using oracle networks 02:48:18.980 |
that are trusted outside deliverers of factual information 02:48:28.280 |
you can start giving drought insurance to folks like this. 02:48:33.700 |
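A minimal sketch of the kind of parametric, oracle-settled drought policy being described; the names, numbers, and payout rule here are hypothetical illustrations, not any real protocol's API.
```python
from dataclasses import dataclass

@dataclass
class DroughtPolicy:
    premium: float        # what the smallholder pays up front
    payout: float         # fixed payout if the drought trigger is hit
    threshold_mm: float   # seasonal rainfall below this counts as drought

def settle(policy: DroughtPolicy, oracle_rainfall_mm: float) -> float:
    # A trusted oracle network reports observed seasonal rainfall; the policy
    # pays out automatically if rainfall falls below the agreed threshold.
    return policy.payout if oracle_rainfall_mm < policy.threshold_mm else 0.0

policy = DroughtPolicy(premium=5.0, payout=100.0, threshold_mm=250.0)
print(settle(policy, oracle_rainfall_mm=180.0))  # -> 100.0
```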
is not a crypto whiz who doesn't know a blasted thing 02:48:51.820 |
for somebody who occupies the right intersection of skills 02:49:01.060 |
about my own little things that I'm average at 02:49:04.620 |
and seeing where the intersections are that could be exploited. 02:49:04.620 |
So we talked quite a bit about the end of the world 02:49:13.780 |
and how we're both optimistic about us figuring our way out. 02:49:20.540 |
both you and I are going to die one day way too soon. 02:49:39.340 |
how does that kind of, what kind of wisdom insight 02:50:06.080 |
is it makes you realize how unbelievably rare and precious 02:50:13.500 |
and therefore how consequential the decisions that we make 02:50:23.060 |
or do you have dinner with somebody who's really important 02:50:26.380 |
to you who you haven't seen in three and a half years? 02:50:28.780 |
If you had an infinite expanse of time in front of you, 02:50:33.860 |
I'm gonna do those emails because collectively, 02:50:35.740 |
they're rather important and I have tens of thousands 02:50:41.020 |
But I think the scarcity of the time that we have 02:50:43.900 |
helps us choose the right things if we're attuned to that. 02:50:49.660 |
And we're attuned to the context that mortality puts 02:50:53.020 |
over the consequence of every decision we make 02:50:56.920 |
That doesn't mean that we're all very good at it. 02:51:00.400 |
But it does add a dimension of choice and significance 02:51:07.380 |
- It's kind of funny that you say you try to think about it 02:51:10.540 |
I would venture to say you probably think about 02:51:12.340 |
the end of human civilization more than you do 02:51:16.700 |
- Because that feels like a problem that could be solved. 02:51:21.240 |
- Whereas the end of my own life can't be solved. 02:51:23.840 |
I mean, there's transhumanists who have incredible optimism 02:51:29.540 |
that could really, really change human lifespan. 02:51:34.620 |
but I don't have a whole lot to add to that project 02:51:43.180 |
Not as much, but close to as I'm afraid of death itself. 02:51:48.740 |
So it feels like the things that give us meaning 02:51:56.940 |
- I'm almost afraid of having too much of stuff. 02:52:11.660 |
- Well, part of the reason I wanted to not do a startup, 02:52:15.280 |
really the only thing that worries me about doing a startup 02:52:28.820 |
that there will not be enough silence in my life, 02:52:33.860 |
enough scarcity to appreciate the moments I appreciate now 02:52:44.700 |
that it feels like it might disappear with success. 02:52:52.180 |
I think if you start a company that has ambitious investors, 02:52:57.180 |
ambitious for the returns that they'd like to see, 02:53:21.740 |
to be creative, to be peaceful, to be so forth 02:53:24.420 |
because with every new employee that you hire, 02:53:33.020 |
that's one more person to whom you really do wanna 02:54:13.740 |
In terms of time and attention and experience. 02:54:23.740 |
- Oh yeah, I mean you can do balance sheets all you want 02:54:28.220 |
I mean I've done it in the past and it's never worked. 02:55:03.940 |
and goes beyond almost anything else that we can do. 02:55:07.780 |
And whether that is something that lies in the past, 02:55:11.900 |
like maybe there was somebody that you were dating 02:55:27.300 |
and you triggered that in somebody else and that happened. 02:55:35.220 |
it's family members, it's love between friends, 02:55:39.980 |
I had a dog for 10 years who passed away a while ago 02:55:50.460 |
And we were talking about the flow states that we enter 02:56:11.180 |
I think that's a big, big, big part of the meaning of life. 02:56:13.620 |
It's not something that everybody participates in 02:56:18.660 |
at least in a very local level by the example that we set, 02:56:25.620 |
but for people who create works that travel far 02:56:36.460 |
and come across their ideas or their works or their stories 02:56:43.340 |
I think that's a really, really big part of the fabric 02:56:48.300 |
And so all these things, like love and creation, 02:57:28.860 |
and the intensity of emotion you still feel about it 02:57:35.180 |
You're like, after saying goodbye, you relive it. 02:57:49.580 |
- I won't say the loss is the best part personally, 02:57:55.660 |
And the grief you might feel about something that's gone 02:58:05.360 |
- Speaking of which, this particular journey, 02:58:14.020 |
So I have to say goodbye, and I hate saying goodbye. 02:58:20.320 |
People should definitely check out your podcast. 02:58:22.020 |
You're a master at what you do in the conversation space, 02:58:25.980 |
It's been an incredible honor that you would show up here 02:58:30.940 |
- Well, it's been a huge honor to be here as well, 02:58:33.220 |
and also a fan and have been for a long time. 02:58:37.660 |
Thanks for listening to this conversation with Rob Reid, 02:58:46.200 |
Check them out in the description to support this podcast. 02:58:49.260 |
And now, let me leave you with some words from Plato. 02:58:52.500 |
We can easily forgive a child who's afraid of the dark. 02:58:55.660 |
The real tragedy of life is when men are afraid of the light. 02:58:59.640 |
Thank you for listening, and hope to see you next time.