
Jamie Metzl: Lab Leak Theory | Lex Fridman Podcast #247


Chapters

0:00 Introduction
1:27 Lab leak
60:01 Gain-of-function research
69:32 Anthony Fauci
79:14 Francis Collins
83:56 Joe Rogan, Bret Weinstein, and Sam Harris
113:53 Xi Jinping
128:24 Patient Zero
141:38 WHO
165:28 Government transparency
187:28 Likelihood of a cover-up
189:16 Future of reproduction
224:55 Jon Stewart
230:14 Joe Rogan and Sanjay Gupta
255:19 Ultramarathons
265:21 Chocolate
273:34 One Shared World
288:37 Hope for the future


00:00:00.000 | The following is a conversation with Jamie Metzl,
00:00:02.440 | author specializing in topics of genetic engineering,
00:00:05.900 | biotechnology, and geopolitics.
00:00:09.200 | In the past two years, he has been outspoken
00:00:12.160 | about the need to investigate and keep an open mind
00:00:15.240 | about the origins of COVID-19.
00:00:18.200 | In particular, he has been keeping an extensive
00:00:20.940 | up-to-date collection of circumstantial evidence
00:00:23.520 | in support of what is colloquially known
00:00:25.760 | as lab leak hypothesis.
00:00:27.860 | That COVID-19 leaked in 2019
00:00:30.600 | from the Wuhan Institute of Virology.
00:00:33.580 | In part, I wanted to explore the idea
00:00:35.560 | in response to the thoughtful criticism
00:00:37.920 | to parts of the Francis Collins episode.
00:00:40.760 | I will have more and more difficult conversations like this
00:00:43.780 | with people from all walks of life
00:00:45.840 | and with all kinds of ideas.
00:00:47.800 | I promise to do my best to keep an open mind
00:00:50.200 | and yet to ask hard questions
00:00:52.320 | while together searching for the beautiful
00:00:54.240 | and the inspiring in the mind of the other person.
00:00:57.620 | It's a hard line to walk gracefully,
00:00:59.920 | especially for someone like me,
00:01:01.640 | who's a bit of an awkward introvert
00:01:03.520 | with barely the grasp of the English language
00:01:06.480 | or any language, except maybe Python and C++.
00:01:10.200 | But I hope you stick around, be patient and empathetic,
00:01:13.920 | and maybe learn something new together with me.
00:01:17.300 | This is the Lex Fridman Podcast.
00:01:19.500 | To support it, please check out our sponsors
00:01:21.640 | in the description.
00:01:22.840 | And now, here's my conversation with Jamie Metzl.
00:01:27.620 | What is the probability in your mind
00:01:29.560 | that COVID-19 leaked from a lab?
00:01:32.080 | In your write-up, I believe you said 85%.
00:01:35.960 | I know it's just a percentage.
00:01:37.720 | We can't really be exact with these kinds of things,
00:01:39.840 | but it gives us a sense where your mind is,
00:01:42.360 | where your intuition is.
00:01:43.320 | So as it stands today,
00:01:44.880 | what would you say is that probability?
00:01:46.960 | - I would stand by what I've been saying
00:01:48.960 | since really the middle of last year.
00:01:51.960 | It's more likely than not, in my opinion,
00:01:55.600 | that the pandemic stems
00:01:56.880 | from an accidental lab incident in Wuhan.
00:01:59.400 | Is it 90%, is it 65%?
00:02:02.720 | I mean, that's kind of arbitrary,
00:02:04.200 | but when I stack up all of the available evidence,
00:02:08.100 | and all of it on both sides is circumstantial,
00:02:10.980 | it weighs very significantly toward a lab incident origin.
00:02:14.400 | - So before we dive into the specifics at a high level,
00:02:17.920 | what types of evidence, what intuition,
00:02:21.480 | what ideas are leading you
00:02:22.800 | to have that kind of estimate?
00:02:25.800 | Is it possible to kind of condense?
00:02:28.020 | When you look at the wall of evidence before you,
00:02:32.160 | where's your source,
00:02:33.920 | the strongest source of your intuition?
00:02:36.160 | - Yeah, and I would have to say
00:02:37.720 | it's just logic and deductive reasoning.
00:02:40.600 | So before I make the case
00:02:41.940 | for why I think it's most likely a lab incident origin,
00:02:44.960 | let's just say why it could be,
00:02:46.720 | and still could be, a natural origin.
00:02:49.400 | All of this is a natural origin
00:02:50.960 | in the sense that it's a bat virus backbone,
00:02:54.400 | horseshoe bat virus backbone.
00:02:56.960 | - Okay, I'm gonna keep pausing you to define stuff.
00:03:00.340 | So maybe it's useful to say,
00:03:02.360 | what do we mean by lab leak?
00:03:04.120 | What do we mean by natural origin?
00:03:05.880 | What do we mean by virus backbone?
00:03:07.800 | - Okay, great questions.
00:03:09.760 | So viruses come from somewhere.
00:03:11.400 | Viruses have been around for 3.5 billion years,
00:03:14.900 | and they've been around for such a long time
00:03:17.800 | because they are adaptive, and they're growing,
00:03:20.680 | and they're always changing, and they're morphing.
00:03:23.600 | And that's why viruses are,
00:03:26.280 | I mean, they've been very successful,
00:03:28.020 | and we are their victims.
00:03:29.480 | Sometimes we're beneficiaries.
00:03:30.660 | We have viral DNA that has morphed into our genomes,
00:03:34.760 | but now, certainly in the case of COVID-19,
00:03:37.680 | we are victims of the success of viruses.
00:03:42.680 | And so when we talk about a backbone,
00:03:44.760 | so the SARS-CoV-2 virus, it has a history,
00:03:50.040 | and these viruses don't come out of whole cloth.
00:03:52.640 | There are viruses that morph.
00:03:54.640 | And so we know that at some period,
00:03:59.120 | maybe 20 years ago or whatever,
00:04:01.400 | the virus that is SARS-CoV-2 existed in horseshoe bats.
00:04:07.760 | It was a horseshoe bat virus, and it evolved somewhere.
00:04:13.160 | And there are some people who say
00:04:15.680 | there's no evidence of this, but it's a plausible theory
00:04:18.800 | based on how things have happened in the past.
00:04:21.080 | Maybe that virus jumped from the horseshoe bat
00:04:25.100 | through some intermediate species.
00:04:27.160 | So it's like, let's say there's a bat,
00:04:29.080 | and then it infects some other animal.
00:04:31.820 | Let's say it's a pig or a raccoon dog or a civet cat.
00:04:36.240 | Or a pangolin, there are all sorts of animals
00:04:38.280 | that have been considered.
00:04:40.320 | And then that virus adapts into that new host,
00:04:43.920 | and it changes and grows.
00:04:46.000 | And then according to the quote unquote
00:04:48.120 | natural origins hypothesis,
00:04:49.520 | it jumps from that animal into humans.
00:04:53.160 | And so what you could imagine,
00:04:55.120 | and some of the people who are making the case,
00:04:57.000 | all of the people actually who are making the case
00:04:58.800 | for a natural origin of the virus,
00:05:00.960 | what they're saying is it went from bat
00:05:03.440 | to some intermediate species,
00:05:05.760 | and then from that intermediate species, most likely,
00:05:08.420 | there's some people who say it went directly bat to human,
00:05:10.920 | but through some intermediate species,
00:05:13.000 | and then humans interacted with that species,
00:05:16.600 | and then it jumped from that whatever it is to humans.
00:05:20.000 | And that's a very plausible theory.
00:05:21.740 | It's just that there's no evidence for it.
00:05:23.360 | - And the nature of the interaction is,
00:05:25.720 | do most people kind of suggest this,
00:05:27.440 | that the, like wet markets.
00:05:29.680 | So the interaction of the humans with the animal
00:05:32.560 | is in the form of, it's either a live animal
00:05:35.320 | that's being sold to be eaten,
00:05:36.640 | or a recently live animal,
00:05:39.640 | but newly dead animal being sold to be eaten.
00:05:41.920 | - That's certainly one very possible possibility,
00:05:46.120 | a possible possibility, I don't know if that's a word,
00:05:48.920 | but the people who believe in the wet market origin,
00:05:52.040 | that's what they're saying.
00:05:52.980 | So they had one of these animals,
00:05:55.440 | they were cutting it up, let's say, in a market,
00:05:57.920 | and maybe some of the blood got into somebody's,
00:06:01.040 | maybe had a cut on their hand,
00:06:02.600 | or maybe it was aerosolized, and so somebody breathed it,
00:06:05.800 | and then that virus found this new host,
00:06:09.120 | and that was the human host.
00:06:11.640 | But you could also have that happen in, let's say, a farm.
00:06:14.960 | So it's happened in the past,
00:06:17.080 | that let's say that there are farms,
00:06:19.000 | and because of human encroachment into wild spaces,
00:06:23.080 | we're pushing our farms and our animal farms
00:06:26.160 | further and further into what used to be
00:06:28.760 | just the natural habitats.
00:06:31.080 | And so it's happened in the past, for example,
00:06:33.020 | that there were bats roosting over pig pens,
00:06:35.920 | and the bat droppings went into the pig pens,
00:06:38.960 | the viruses in those droppings infected the pigs,
00:06:43.360 | and then the pigs infected the humans.
00:06:46.000 | And that's why it's a plausible theory,
00:06:48.560 | it's just that there's basically no evidence for it.
00:06:52.000 | If it was the case that SARS-CoV-2
00:06:55.220 | comes from this type of interaction,
00:06:58.000 | as in most of the at least recent past outbreaks,
00:07:01.840 | we'd see evidence of that.
00:07:03.200 | Viruses are messy,
00:07:05.240 | they're constantly undergoing Darwinian evolution,
00:07:08.000 | and they're changing,
00:07:09.000 | and it's not that they're just ready for prime time,
00:07:11.800 | ready to infect humans on day one.
00:07:13.840 | Normally you can trace the viral evolution
00:07:17.680 | prior to the time when it infects humans.
00:07:20.680 | But for SARS-CoV-2, it just showed up on the scene
00:07:24.760 | ready to infect humans.
00:07:26.780 | And there's no history that anybody has found so far
00:07:30.620 | of that kind of a viral evolution.
00:07:33.080 | With the first SARS, you could track it
00:07:35.600 | by the genome sequencing that it was experimenting.
00:07:39.920 | And SARS-CoV-2 was very, very stable,
00:07:44.160 | and meaning it had already adapted to humans
00:07:47.540 | by the time it interacted with us.
00:07:49.800 | - Fully adapted.
00:07:50.720 | So with SARS, there's a rapid evolution
00:07:55.160 | when it first kind of hooks onto a human.
00:07:58.520 | - Yeah, 'cause it's trying.
00:07:59.640 | Like a virus, its goal is to survive and replicate.
00:08:02.880 | No, it's true, it's like, oh, we're gonna try this,
00:08:04.400 | oh, that didn't work, we'll try it.
00:08:05.680 | Exactly like a startup.
00:08:08.280 | And so we don't see that.
00:08:10.440 | And so there are some people who say,
00:08:12.000 | all right, well, one hypothesis is,
00:08:14.760 | you have a totally isolated group of humans,
00:08:17.360 | maybe in Southern China,
00:08:18.920 | which is more than 1,000 miles away from Wuhan.
00:08:22.880 | And maybe they're doing their animal farming
00:08:26.880 | right next to these areas where there are these horseshoe bats
00:08:31.960 | and maybe in this totally isolated place
00:08:34.600 | that no one's ever heard of,
00:08:35.600 | they're not connected to any other place,
00:08:37.880 | one person gets infected.
00:08:39.720 | And it doesn't spread to anybody else
00:08:42.400 | because they're so isolated.
00:08:44.080 | They're like, I don't know,
00:08:45.560 | I mean, I can't even imagine that this is the case.
00:08:48.120 | Then somebody gets in a car and drives all night,
00:08:52.560 | more than 1,000 miles through crappy roads to get to Wuhan.
00:08:56.200 | Doesn't stop for anything,
00:08:57.560 | doesn't infect anybody on the way,
00:08:59.120 | no one else in that person's village infects anyone.
00:09:01.880 | And then that person goes straight
00:09:03.560 | to the Huanan seafood market,
00:09:05.720 | according to this, in my mind, not very credible theory,
00:09:09.280 | and then unloads his stuff and everybody gets infected
00:09:12.640 | and they're only delivering those animals
00:09:15.240 | to the Wuhan market,
00:09:17.040 | which doesn't even sell very many of these kinds of animals
00:09:20.240 | that are likely intermediate species and not anywhere else.
00:09:23.120 | So that's, I mean, it's a little bit of a straw man.
00:09:25.880 | But on top of that,
00:09:27.720 | the Chinese have sequenced more than 80,000 animal samples
00:09:32.440 | and there's no evidence of this type of viral evolution
00:09:36.280 | that we would otherwise expect.
00:09:37.400 | - Let's try to, at this moment,
00:09:41.400 | steel man the argument for the natural origin of the virus.
00:09:46.400 | So just to clarify,
00:09:48.920 | so Wuhan is actually,
00:09:50.240 | despite what it might sound like to people,
00:09:52.200 | is a pretty big city.
00:09:53.880 | There's a lot of people that live in it.
00:09:55.160 | - 11 million.
00:09:56.000 | - So not only is there the Wuhan Institute of Virology,
00:09:59.640 | there's other centers that do work on viruses.
00:10:03.040 | - Yes.
00:10:03.960 | - But there's also a giant number of markets
00:10:06.600 | and everything we're talking about here
00:10:08.000 | is pretty close together.
00:10:09.560 | So when I kind of look at the geography of this,
00:10:13.600 | I think when you zoom out, it's all Wuhan.
00:10:16.000 | But when you zoom in,
00:10:17.280 | there's just a lot of interesting dynamics
00:10:19.240 | that could be happening
00:10:20.080 | and what the cases are popping up
00:10:21.600 | and what's being reported, all that kind of stuff.
00:10:24.040 | So I think the people that argue for the natural origin,
00:10:29.040 | and there's a few recent papers that come out arguing this,
00:10:32.940 | it's kind of fascinating to watch this whole thing.
00:10:35.640 | But I think what they're arguing
00:10:36.920 | is that there's this Huanan market,
00:10:39.080 | that's one of the major markets,
00:10:41.600 | the wet markets in Wuhan,
00:10:44.080 | that there's a bunch of cases
00:10:49.200 | that were reported from there.
00:10:52.560 | So if I look at, for example,
00:10:54.200 | the Michael Worobey perspective that he wrote in Science,
00:10:58.960 | he argues, he wrote this a few days ago,
00:11:01.960 | the predominance of early COVID cases
00:11:05.080 | linked to the Huanan market,
00:11:06.680 | and this can't be dismissed as ascertainment bias,
00:11:09.800 | which I think is what people argue,
00:11:11.120 | that you're just kind of focusing on this region
00:11:13.360 | 'cause a lot of cases came,
00:11:14.400 | but there could be a huge number of other cases.
00:11:18.280 | So people who argue against this
00:11:19.760 | say that this is a later stage already.
00:11:22.640 | So he says no.
00:11:26.300 | He says this is the epicenter,
00:11:29.760 | and this is a clear evidence that,
00:11:34.760 | circumstantial evidence, but evidence nevertheless,
00:11:37.920 | that this is where the jump happened to humans,
00:11:42.320 | the big explosion.
00:11:43.680 | Maybe not case zero, I don't know if he argues that,
00:11:45.900 | but the early cases.
00:11:47.400 | So what do you make of this whole idea?
00:11:49.160 | Can you steel man it before we talk about the outside?
00:11:52.520 | - And my goal here isn't to attack people on the other side,
00:11:56.120 | and my feeling is if there is evidence that's presented
00:12:00.760 | that should change my view,
00:12:02.840 | I hope that I'll be open-minded enough to change my view.
00:12:06.520 | And certainly Michael Worobey is a thoughtful person,
00:12:09.400 | a respected scientist,
00:12:11.200 | and I think this work is contributive work,
00:12:14.080 | but I just don't think that it's as significant
00:12:19.680 | as has been reported in the press.
00:12:22.600 | And so what his argument is,
00:12:24.840 | is that there's an early cluster in December of 2019
00:12:29.840 | around the Huanan seafood market.
00:12:33.320 | And even though he himself argues
00:12:36.360 | that the original breakthrough case,
00:12:39.520 | the original case, the index case
00:12:41.480 | where the first person infected happened earlier,
00:12:46.120 | happened in October or November.
00:12:48.240 | So not in December.
00:12:49.760 | His argument is, well, what are the odds
00:12:52.840 | that you would have this number,
00:12:54.480 | this cluster of cases in the Huanan seafood market,
00:12:58.040 | and if the origin had happened someplace else,
00:13:00.880 | wouldn't you expect other clusters?
00:13:03.240 | And it's not an entirely implausible argument,
00:13:06.120 | but there are reasons why I think
00:13:08.040 | that this is not nearly as determinative
00:13:11.360 | as has been reported,
00:13:12.800 | and I certainly had a lot of,
00:13:14.200 | I and others had tweeted a lot about this.
00:13:17.600 | And that is first,
00:13:19.240 | the people who were infected in this cluster,
00:13:22.760 | it's not the earliest known version of the virus.
00:13:26.480 | The SARS-CoV-2 had already begun mutating,
00:13:28.520 | so this is not the original SARS-CoV-2 there.
00:13:31.800 | So it had to have happened someplace else.
00:13:34.600 | Two, the people who were infected in the market
00:13:38.520 | weren't infected in the part of the market
00:13:41.480 | where they had these kinds of animals
00:13:43.640 | that are considered to be candidates
00:13:46.200 | as an intermediary species.
00:13:49.440 | And third, there was a bias,
00:13:51.760 | and actually I'll have four things.
00:13:53.160 | Third, there was a bias in the early assessment in China
00:13:58.160 | of what they were looking for.
00:13:59.720 | They were asked, did you have exposure to the market?
00:14:02.320 | Because I think in the early days
00:14:03.360 | when people were figuring things out,
00:14:05.120 | that was one of the questions that was asked.
00:14:08.720 | And fourth, and probably most significantly,
00:14:11.480 | we have so little information about those early cases
00:14:16.120 | in China, and that's really unfortunate.
00:14:18.520 | I know we'll talk about this later
00:14:19.760 | because the Chinese government is preventing access
00:14:23.000 | to all of that information, which they have,
00:14:26.080 | which could easily help us get to the bottom,
00:14:28.880 | at least know a ton more about how this pandemic started.
00:14:31.840 | And so this is, it's like grasping at straws
00:14:36.840 | in the dark with gloves on.
00:14:38.440 | - That's right.
00:14:39.680 | But to steel man the argument,
00:14:41.720 | we have this evidence from this market,
00:14:45.040 | and yes, the Chinese government
00:14:46.920 | has turned off the lights essentially,
00:14:48.920 | so we have very little data to work with,
00:14:51.280 | but this is the data we have.
00:14:53.400 | So who's to say that this data doesn't represent
00:14:56.360 | a much bigger data set that a lot of people got infected
00:15:00.400 | at this market, even at the parts,
00:15:03.880 | or especially at the parts where the meat,
00:15:06.840 | the infected meat was being sold.
00:15:09.160 | - So that could be true, and it probably is true.
00:15:13.640 | The question is, is this the source?
00:15:16.840 | Is this the place where this began?
00:15:18.720 | Or was this just a place where it was amplified?
00:15:22.000 | And I certainly think that it's extremely likely
00:15:26.640 | that the Huanan seafood market was a point of amplification.
00:15:31.240 | And it's just answering a different question.
00:15:33.000 | - Basically what you're saying is it's very difficult
00:15:35.320 | to use the market as evidence for anything
00:15:38.200 | because it's probably not even the starting point.
00:15:42.720 | So it's just a good place for it to continue spreading.
00:15:45.800 | - That's certainly my view.
00:15:47.560 | What Michael Worobey's argument is,
00:15:50.400 | is that, well, what are the odds of that,
00:15:53.120 | that we're seeing this amplification in the market?
00:15:57.000 | And if we, let me put it this way,
00:16:00.240 | if we had all of the information,
00:16:02.520 | if the Chinese government hadn't blocked access
00:16:06.000 | to all of this, 'cause there's blood bank information,
00:16:08.480 | there's all sorts of information,
00:16:10.600 | and based on a full and complete understanding,
00:16:15.000 | we came to believe that all of the early cases
00:16:18.360 | were at this market.
00:16:19.520 | I think that would be a stronger argument
00:16:22.280 | than what this is so far.
00:16:24.000 | But everything leads to the fact that why is it
00:16:26.720 | that the Chinese government, which was frankly,
00:16:30.000 | after a slow start, the gold standard
00:16:32.520 | of doing viral tracking for SARS-1,
00:16:36.720 | why have they apparently done so little
00:16:39.840 | and shared so little?
00:16:41.120 | I think it asks, it begs a lot of questions.
00:16:45.040 | - Okay, so let's then talk about the Chinese government.
00:16:49.000 | There's several governments, right?
00:16:51.960 | So one is the local government of Wuhan.
00:16:54.340 | And not just the Chinese government,
00:16:56.720 | let's talk about government.
00:16:58.120 | No, let's talk about human nature.
00:17:00.800 | This just keeps zooming out.
00:17:03.440 | Let's talk about planet Earth.
00:17:04.920 | No, so there's the Wuhan local government,
00:17:08.440 | there's the Chinese government led by Xi Jinping,
00:17:12.540 | and there's governments in general.
00:17:16.480 | I'm trying to empathize.
00:17:18.080 | So my father was involved with Chernobyl.
00:17:21.440 | I'm trying to put myself into the mind of local officials,
00:17:25.680 | of people who are like, oh shit,
00:17:28.040 | there's a potential catastrophic event happening here,
00:17:33.560 | and it's my ass because there's incompetence
00:17:38.560 | all over the place.
00:17:39.920 | Human nature is such that there's incompetence
00:17:41.600 | all over the place, and you're always trying
00:17:43.240 | to cover it up.
00:17:44.600 | And so given that context,
00:17:47.120 | I want to lay out all the possible incompetence,
00:17:52.720 | all the possible malevolence,
00:17:54.960 | all the possible geopolitical tensions here.
00:18:01.280 | - All right, where in your sense did the cover-up start?
00:18:06.280 | So there's this suspicious fact,
00:18:10.680 | it seems like, that the Wuhan Institute of Virology
00:18:16.720 | had a public database of thousands
00:18:19.200 | of sampled bat coronavirus sequences,
00:18:22.640 | and that went offline in September of 2019.
00:18:26.280 | What's that about?
00:18:28.200 | - So let me talk about that specific,
00:18:30.520 | and then I'll also follow your path of zooming out,
00:18:33.560 | and it's a really important--
00:18:34.480 | - Is that a good starting point?
00:18:35.320 | - It's a great starting point, yeah, yeah.
00:18:37.600 | But there's a bigger story, but let me talk about that.
00:18:42.520 | So the Wuhan Institute of Virology,
00:18:45.320 | and we can go into the whole history
00:18:48.040 | of the Wuhan Institute of Virology,
00:18:49.560 | either now or later, 'cause I think it's very relevant
00:18:51.760 | to the story, but let's focus for now on this database.
00:18:55.400 | They had a database of 22,000 viral samples,
00:19:00.360 | and sequence information about viruses
00:19:02.800 | that they had collected, some of which,
00:19:06.360 | the collection of some of which was supported
00:19:08.680 | through funding from the NIH,
00:19:10.360 | not a huge amount, through the EcoHealth Alliance.
00:19:12.760 | It's a relatively small amount, $600,000, but not nothing.
00:19:17.200 | The goal of this database was so that we could understand
00:19:23.040 | viral evolution, so that exactly for this kind of moment
00:19:27.280 | where we had an unknown virus, we could say,
00:19:30.160 | well, is this like anything that we've seen before?
00:19:33.360 | And that would help us both understand what we're facing
00:19:35.800 | and be better able to respond.
00:19:39.320 | So this was a password-protected public access database.
00:19:44.320 | In 2019, in September 2019, it became inaccessible.
00:19:49.960 | And then the whole, a few months later,
00:19:53.440 | the entire database disappeared.
00:19:55.840 | What the Chinese have said is that
00:19:58.600 | because there were all kinds of computer attacks
00:20:01.760 | on this database, but why would that happen in September 2019
00:20:06.760 | before the pandemic, at least as far as we know?
00:20:11.480 | - So just to clarify. - Yes.
00:20:14.400 | - It went down in September 2019,
00:20:17.400 | just so we get the year straight.
00:20:19.680 | January 2020 is when the virus
00:20:22.120 | really started getting the press.
00:20:25.920 | So we're talking about December 2019,
00:20:29.000 | a lot of early infections happened.
00:20:30.960 | September 2019 is when this database goes down.
00:20:34.880 | Just to clarify, 'cause you said it quickly,
00:20:37.200 | the Chinese government said that their database
00:20:42.200 | was getting hacked, therefore--
00:20:45.400 | - Shi Zhengli, the director of this part
00:20:48.600 | of the Wuhan Institute of Virology said that.
00:20:50.760 | - Oh, really? - She said it.
00:20:52.160 | - She was the one that said it?
00:20:53.200 | - She was the one who said it.
00:20:54.040 | - Oh, boy. - Yeah.
00:20:54.920 | - I did not even know that part.
00:20:56.360 | - Yeah. - Okay.
00:20:57.200 | Well, she's an interesting character.
00:20:58.600 | We'll talk about her. - Yeah.
00:21:00.200 | (laughs)
00:21:01.400 | - So the excuse is that it's getting cyber attacked a lot,
00:21:06.400 | so we're gonna take it down without any further explanation,
00:21:10.800 | which seems very suspicious.
00:21:12.120 | And then this virus starts to emerge
00:21:15.600 | in October, November, December.
00:21:17.480 | There's a lot of argument about that, but after.
00:21:19.480 | - Sorry to interrupt, but some people are saying
00:21:21.400 | that the first outbreak could have happened
00:21:23.240 | as early as September.
00:21:25.480 | I think it's more likely it's October, November,
00:21:27.640 | but for the people who are saying that the first outbreak,
00:21:31.360 | the first incident of a known outbreak,
00:21:34.920 | at least to somebody, happened in September,
00:21:37.360 | they make the argument, well, what if that also happened
00:21:40.440 | in mid-September of 2019?
00:21:42.440 | I'm not prepared to go there,
00:21:43.800 | but there are some people who make that argument.
00:21:45.080 | - But I think, again, if I were to put myself
00:21:47.960 | in the mind of officials, whether it's officials
00:21:51.080 | within the Wuhan Institute of Virology
00:21:53.680 | or Wuhan local officials,
00:21:55.800 | I think if I notice some major problem,
00:22:01.960 | like somebody got sick, some sign of,
00:22:06.080 | oh, shit, we screwed up, that's when you kind of do the slow,
00:22:11.880 | there's like a Homer Simpson meme
00:22:13.680 | where you slowly start backing out,
00:22:15.520 | and I would probably start hiding stuff.
00:22:20.680 | - CYA, yeah. - Yeah, and then coming up
00:22:23.800 | with really shady excuses.
00:22:25.960 | It's like you're in a relationship
00:22:27.880 | and your girlfriend wants to see your phone,
00:22:30.000 | and you're like, I'm sorry,
00:22:31.360 | I'm just getting attacked by the Russians now.
00:22:33.520 | The cybersecurity, I can't. - Yeah, I wish I could.
00:22:36.760 | - I wish I could, it's just unsafe right now.
00:22:39.440 | - So would it be okay if I give you my kind of macro view
00:22:42.760 | of the whole information space
00:22:44.560 | and why I believe this has been so contentious?
00:22:49.040 | - So here's, if I had to give my best guess,
00:22:52.400 | and I underline the word guess of what happened,
00:22:55.960 | and your background, your family background with Chernobyl
00:22:59.480 | I think is highly relevant here.
00:23:02.040 | So after the first SARS, there was a recognition
00:23:06.280 | that we needed to distribute knowledge about virology
00:23:09.800 | and epidemiology around the world,
00:23:11.600 | that people in China and Africa, in Southeast Asia,
00:23:14.880 | they were the frontline workers,
00:23:16.680 | and they needed to be doing a lot of the viral monitoring
00:23:20.600 | and assessment so that we could have an early alarm system.
00:23:24.720 | And that was why there was a lot of investment
00:23:28.240 | in all of those places in building capacity,
00:23:30.960 | in training people,
00:23:31.960 | and helping to build institutional capacity.
00:23:34.600 | And the Chinese government,
00:23:36.760 | they recognized that they needed to ramp things up.
00:23:40.800 | And then the World Health Organization
00:23:43.280 | and the World Health Assembly,
00:23:44.640 | they had their international health regulations
00:23:47.560 | that were designed to create a stronger infrastructure.
00:23:50.080 | So that was the goal.
00:23:52.480 | There were a lot of investments,
00:23:54.280 | and I know we'll talk later
00:23:55.400 | about the Wuhan Institute of Virology,
00:23:56.880 | and I won't go into that right now.
00:23:59.400 | So there was all of this distributed capacity.
00:24:02.600 | And so in the early days, there's a breakout in Wuhan.
00:24:07.280 | We don't know, is it September, October, November,
00:24:10.720 | maybe December is when the local authorities
00:24:14.640 | start to recognize that something's happening.
00:24:16.680 | But at some point in late 2019,
00:24:19.240 | local officials in Wuhan understand that something is up.
00:24:23.560 | And exactly like in Chernobyl,
00:24:26.040 | these guys exist within a hierarchical system.
00:24:29.200 | And they are going to be rewarded if good things happen,
00:24:32.360 | and they're going to be in big trouble
00:24:34.000 | if bad things happen under their watch.
00:24:36.280 | So their initial instinct is to squash it.
00:24:40.360 | And my guess is they think,
00:24:42.160 | well, if we squash this information,
00:24:44.320 | we can most likely beat back this outbreak,
00:24:47.120 | 'cause lots of outbreaks happen all the time,
00:24:49.320 | including of SARS-1,
00:24:51.440 | where there was multiple lab incidents
00:24:53.960 | out of a lab in Beijing.
00:24:56.800 | And so they start their cover-up on day one.
00:24:59.760 | They start screening social media.
00:25:02.560 | They send nasty letters to different doctors and others
00:25:06.480 | who are starting to speak up.
00:25:08.560 | But then it becomes clear that there's a bigger issue.
00:25:12.040 | And then the national government of China --
00:25:14.720 | again, this is just a hypothesis --
00:25:17.400 | the national government gets involved.
00:25:19.600 | They say, "All right, this is getting much bigger."
00:25:21.840 | They go in, and they realize
00:25:24.000 | that we have a big problem on our hands.
00:25:26.120 | They relatively quickly know
00:25:28.280 | that it's spreading human to human.
00:25:30.600 | And so the right thing for them to do then
00:25:32.840 | is what the South African government is doing now,
00:25:35.320 | is to say, "We have this outbreak.
00:25:37.760 | We don't know everything, but we know it's serious.
00:25:40.760 | We need help."
00:25:41.640 | But that's not the instinct of people in most governments
00:25:44.680 | and certainly not in authoritarian governments
00:25:47.000 | like China.
00:25:48.320 | And so the national government,
00:25:50.920 | they have a choice at that point.
00:25:52.800 | They can do option one,
00:25:55.000 | which is what we would here call the right thing,
00:25:57.480 | which is total transparency.
00:25:59.560 | They criticize the local officials
00:26:01.520 | for having this cover-up,
00:26:03.560 | and they say, "Now we're going to be totally transparent."
00:26:05.760 | But what does that do in a system
00:26:07.880 | like the former Soviet Union, like China now?
00:26:10.640 | If local officials say, "Wait a second.
00:26:12.360 | I thought my job was to cover everything up,
00:26:15.880 | to support this alternative reality
00:26:18.240 | that authoritarian systems need in order to survive.
00:26:22.200 | Well, now I'm going to be held accountable
00:26:23.760 | if I'm not totally transparent,"
00:26:26.640 | like your whole system would collapse.
00:26:29.320 | So the national government, they have that choice,
00:26:32.480 | and their only choice, according to the logic of their system,
00:26:37.200 | is to be all in on a cover-up.
00:26:38.920 | And that's why they block the World Health Organization
00:26:41.520 | from sending its team to Wuhan for over three weeks.
00:26:45.480 | They overtly lie to the World Health Organization
00:26:48.480 | about human-to-human transmission,
00:26:51.200 | and then they begin their cover-up.
00:26:53.440 | So they begin very, very quickly destroying samples,
00:26:56.840 | hiding records.
00:26:58.000 | They start imprisoning people for asking basic questions.
00:27:02.840 | Soon after, they establish a gag order,
00:27:05.720 | preventing Chinese scientists from writing or saying
00:27:08.400 | anything about pandemic origins
00:27:10.560 | without prior government approval.
00:27:12.160 | And what that means is that there isn't a lot of data,
00:27:16.120 | there's not nearly enough data coming out of China.
00:27:19.280 | And so lots of responsible scientists outside of China
00:27:22.800 | who are data-driven say,
00:27:24.760 | "Well, I don't have enough information to draw conclusions."
00:27:29.240 | And then into that vacuum step a relatively small number
00:27:34.240 | of largely virologists, but also others,
00:27:39.160 | respected scientists, and I know we'll talk about
00:27:42.160 | the, I think, infamous Peter Daszak,
00:27:44.800 | who say, well, without any real foundation in the evidence,
00:27:52.680 | they say, "We know pretty much this comes from nature,
00:27:56.680 | "and anyone who's raising the possibility
00:28:00.200 | "of a lab incident origin is a conspiracy theorist."
00:28:03.360 | So that message starts to percolate.
00:28:07.840 | And then in the United States, we have Donald Trump,
00:28:11.560 | and he's starting to get criticized for America's failure
00:28:15.360 | to prepare for and respond adequately
00:28:17.600 | to the outbreak.
00:28:19.000 | And so he starts saying, "Well, I know,"
00:28:22.480 | first after praising Xi Jinping, he starts saying,
00:28:25.040 | "Well, I know that China did it, and the WHO did it,"
00:28:28.560 | and he's kind of pointing fingers at everybody but himself.
00:28:32.960 | And then we have a media here that had shifted
00:28:36.080 | from the traditional model of he said, she said journalism,
00:28:40.680 | so-and-so said X, and so-and-so said Y,
00:28:43.160 | and then we'll present both of those views.
00:28:45.800 | With Donald Trump,
00:28:47.400 | he would make outlandish starting positions.
00:28:50.520 | So he would say, "Lex is an ax murderer."
00:28:53.400 | And then in the early days, they would say,
00:28:55.280 | "Lex is an ax murderer.
00:28:57.320 | "Lex's friend says he's not an ax murderer."
00:29:00.000 | And we'd have a four-day debate, is he or isn't he?
00:29:02.560 | And then at day four, someone would say,
00:29:04.540 | "Why are we having this debate at all?
00:29:06.680 | "Because the original point is baseless."
00:29:11.680 | And so the media just got in the habit,
00:29:13.720 | "Here's what Trump said, and here's why it's wrong."
00:29:16.600 | - It's very complicated to figure out
00:29:20.000 | what is the role of a politician,
00:29:21.600 | what is the role of a leader in this kind of game of politics
00:29:25.560 | but certainly when there's a tragedy,
00:29:29.600 | when there's a catastrophic event,
00:29:32.120 | what it takes to be a leader
00:29:34.080 | is to see clearly through the fog
00:29:36.640 | and to make big, bold decisions
00:29:38.760 | and to speak to the truth of things.
00:29:41.160 | And even if it's unpopular truth,
00:29:43.140 | to listen to the people, to listen to all sides,
00:29:48.120 | to the opinions, to the controversial ideas
00:29:51.800 | and to see past all the bullshit,
00:29:54.680 | all the political bullshit and just speak to the people,
00:29:59.240 | speak to the world and make bold, big decisions.
00:30:02.440 | That's probably what was needed in terms of leadership.
00:30:04.880 | And I'm not so willing to criticize
00:30:07.480 | whether it's Joe Biden or Donald Trump on this.
00:30:11.280 | I think most people cannot be great leaders
00:30:15.780 | but that's why when great leaders step up,
00:30:18.640 | we write books about them.
00:30:20.280 | - And I agree.
00:30:21.240 | And even though, I mean, I think of myself
00:30:24.760 | as a progressive person, I certainly was a critic
00:30:27.960 | of a lot of what President Trump did.
00:30:32.960 | But on this particular case,
00:30:36.160 | even though he may have said it in an uncouth way,
00:30:39.440 | Donald Trump was actually, in my view, right.
00:30:43.600 | I mean, when he said, "Hey, let's look at this lab."
00:30:46.660 | Maybe he said, "I have evidence, I can't tell you."
00:30:48.640 | I don't think he even had the evidence.
00:30:51.280 | But his intuition that this probably comes from a lab,
00:30:55.440 | in my view, was a correct intuition.
00:30:58.140 | And certainly I started speaking up
00:30:59.980 | about pandemic origins early in 2020.
00:31:04.420 | And my friends, my Democratic friends were brutal with me
00:31:08.740 | saying, "What are you doing?
00:31:09.700 | You're supporting Trump in an election year."
00:31:11.780 | And I said, "Just because Donald Trump is saying something
00:31:15.460 | doesn't mean that I need to oppose it.
00:31:17.980 | If Donald Trump says something that I think is correct,
00:31:21.820 | well, I wanna say it's correct.
00:31:22.960 | Just as if he says something that I don't like,
00:31:25.260 | I'm gonna speak up about that."
00:31:26.880 | - Good, you walked through the fire.
00:31:28.540 | So you laid out the story here.
00:31:31.920 | And I think in many ways, it's a human story.
00:31:36.540 | It's a story of politics, it's a story of human nature.
00:31:41.420 | But let's talk about the story of the virus.
00:31:45.540 | And let's talk about the Wuhan Institute of Virology.
00:31:48.340 | So maybe this is a good time to try to talk about its history,
00:31:52.580 | about its origins, about what kind of stuff it works on,
00:31:55.920 | about biosafety levels, and about Batwoman.
00:32:00.780 | - Yeah, Shi Zhengli, yes.
00:32:02.340 | - Shi Zhengli.
00:32:03.780 | So what is the Wuhan Institute of Virology?
00:32:06.540 | When did it start?
00:32:07.540 | - Yeah, so it's a great question.
00:32:09.060 | So after SARS-1, which was in the early 2000s,
00:32:13.140 | 2003, 2004, there was this effort to enhance,
00:32:18.140 | as I mentioned before, global capacity, including in China.
00:32:23.780 | So the Wuhan Institute of Virology
00:32:25.500 | had been around for decades before then.
00:32:29.220 | But there was an agreement between the French
00:32:32.100 | and the Chinese governments to build the largest BSL-4 lab,
00:32:38.280 | so biosafety level four.
00:32:39.900 | So in these what are called high containment labs,
00:32:42.340 | there's level four, which is the highest level.
00:32:44.460 | And people have seen that on TV and elsewhere
00:32:47.300 | where you have the people in the different,
00:32:49.900 | in suits and all of these protections.
00:32:51.860 | And then there's level three, which is still very serious,
00:32:55.480 | but not as much as level four.
00:32:58.780 | And then level two is just kind of goggles and some gloves
00:33:03.180 | and maybe a face mask, much less.
00:33:05.660 | So the French and the Chinese governments agreed
00:33:10.240 | that France would help build the first
00:33:13.480 | and still the largest BSL-4 plus some mobile BSL-3 labs.
00:33:18.480 | And they were going to do it in Wuhan.
00:33:22.180 | And Wuhan is kind of like China's Chicago.
00:33:24.920 | And I had actually been, it's a different story.
00:33:26.680 | I'd been in Wuhan relatively not that long
00:33:30.640 | before the pandemic broke out.
00:33:32.880 | And that was why I knew that Wuhan,
00:33:34.840 | it's not some backwater where there are a bunch
00:33:36.760 | of yokels eating bats for dinner every night.
00:33:40.220 | This is a really sophisticated, wealthy,
00:33:42.720 | highly educated and cultured city.
00:33:45.800 | And so I knew that it wasn't like,
00:33:48.360 | that even the Huanan seafood market wasn't like
00:33:51.200 | some of these seafood markets that they have
00:33:53.520 | in Southern China or in Cambodia,
00:33:55.020 | where I lived for two years.
00:33:57.120 | I mean, it was a totally different thing.
00:33:59.280 | - I'm going to have to talk to you about some of the food,
00:34:01.320 | including the Wuhan market,
00:34:02.640 | just some of the wild food going on here.
00:34:04.640 | 'Cause you've traveled that part of the world.
00:34:06.360 | But let's not get there.
00:34:07.200 | Let's not get distracted.
00:34:08.880 | - Good, as I was telling you, Lex, before,
00:34:11.080 | and this is maybe an advertisement,
00:34:13.800 | is having now listened to a number of your podcasts
00:34:18.000 | when I'm doing long ultra training runs
00:34:20.360 | or driving in the mountains.
00:34:21.880 | Because in the beginning, we have to talk about
00:34:24.800 | whatever it is is the topic.
00:34:26.080 | But the really good stuff happens later.
00:34:28.320 | So stay tuned. - So friends,
00:34:29.160 | you should listen to the end.
00:34:31.360 | I have to say, as I was telling you before,
00:34:34.880 | like when I heard your long podcast with Jaron Lanier
00:34:37.760 | and he talked about his mother at the very end,
00:34:41.120 | I mean, just beautiful stuff.
00:34:42.720 | So I don't know whether I can match beautiful stuff,
00:34:45.480 | but I'm gonna do my best.
00:34:47.800 | - You're gonna have to find out.
00:34:49.120 | - Exactly, stay tuned.
00:34:50.840 | So France had this agreement
00:34:54.920 | that they were going to help design
00:34:56.800 | and help build this BSL-4 lab in Wuhan.
00:35:01.040 | And it was going to be with French standards,
00:35:04.880 | and there were going to be 50 French experts
00:35:07.480 | who were going to work there
00:35:09.400 | and supervise the work that happened
00:35:12.680 | even after the Wuhan Institute of Virology
00:35:15.360 | in the new location started operating.
00:35:21.440 | But then when they started building it,
00:35:24.440 | the French contractors, the French overseers
00:35:28.520 | were increasingly appalled
00:35:31.480 | that they had less and less control,
00:35:33.440 | that the Chinese contractors were swapping out new things,
00:35:37.120 | it wasn't built up to French standards,
00:35:39.600 | so much that at the end, when it was finally built,
00:35:43.680 | the person who was the vice chairman of the project
00:35:47.880 | and a leading French industrialist named Mérieux
00:35:51.400 | refused to sign off.
00:35:53.000 | And he said, "We can't support,
00:35:55.560 | "we have no idea what this is,
00:35:58.160 | "whether it's safe or not."
00:36:00.240 | And when this lab opened,
00:36:03.000 | remember, it was supposed to have 50 French experts,
00:36:06.000 | it had one French expert.
00:36:07.880 | And so the French were really disgusted.
00:36:11.320 | And actually, when the Wuhan Institute of Virology
00:36:14.720 | in its new location opened in 2018, two things happened.
00:36:19.440 | One, French intelligence
00:36:20.920 | privately approached US intelligence saying,
00:36:23.760 | "We have a lot of concerns
00:36:25.000 | "about the Wuhan Institute of Virology,
00:36:27.200 | "about its safety,
00:36:28.360 | "and we don't even know who's operating there.
00:36:30.320 | "Is it being used as a dual-use facility?"
00:36:34.040 | And also in 2018, the US embassy in Beijing
00:36:38.560 | sent some people down to Wuhan to go and look at,
00:36:42.240 | well, at this laboratory.
00:36:44.720 | And they wrote a scathing cable
00:36:47.360 | that Josh Rogin from the Washington Post
00:36:49.720 | later got his hands on saying,
00:36:52.440 | "This is really unsafe.
00:36:54.200 | "They're doing work on dangerous bat coronaviruses
00:36:58.760 | "in conditions where a leak is possible."
00:37:05.760 | And so then you mentioned Shi Zhengli,
00:37:05.760 | and I'll connect that to these virologists
00:37:08.720 | who I was talking about.
00:37:11.160 | So there's a very credible thesis
00:37:14.280 | that because these pathogenic outbreaks
00:37:17.400 | happen in other parts of the world,
00:37:19.800 | having partnerships with experts
00:37:22.120 | in those parts of the world
00:37:24.720 | must be a foundation of our efforts.
00:37:28.080 | We can't just bring everything home
00:37:30.080 | because we know that viruses
00:37:31.680 | don't care about borders and boundaries,
00:37:33.760 | and so if something happens there,
00:37:35.040 | it's going to come here.
00:37:36.240 | So very correctly, we have all kinds of partnerships
00:37:41.240 | with experts in these labs,
00:37:44.040 | and Shi Zhengli was one of those partners.
00:37:47.320 | And her closest relationship was with Peter Daszak,
00:37:51.480 | who's a British, I think now American,
00:37:54.000 | but the president of a thing called EcoHealth Alliance,
00:37:57.600 | which was getting money from NIH.
00:37:59.280 | And basically, EcoHealth Alliance
00:38:01.800 | was a pass-through organization,
00:38:03.480 | and over the years, it was only about $600,000,
00:38:06.640 | so almost all of her funding
00:38:08.000 | came from the Chinese government,
00:38:09.280 | but there's a little bit that came from the United States.
00:38:11.960 | And so she became their kind of leading expert
00:38:15.160 | and the point of contact
00:38:17.640 | between the Wuhan Institute of Virology
00:38:20.240 | and certainly Peter Daszak, but also with others.
00:38:25.240 | And that was why in the earliest days of the outbreak,
00:38:28.880 | I didn't mention that,
00:38:30.400 | I did mention that there were these virologists
00:38:32.920 | who had this fake certainty
00:38:34.960 | that they knew it came from nature
00:38:36.680 | and it didn't come from a lab,
00:38:38.880 | and they called people like me, conspiracy theorists,
00:38:41.400 | just for raising that possibility.
00:38:43.920 | But when Peter Daszak was organizing that effort
00:38:46.880 | in February of 2020,
00:38:50.040 | what he said is,
00:38:51.040 | "We need to rally behind our Chinese colleagues."
00:38:54.120 | And the basic idea was
00:38:55.960 | these international collaborations are under threat.
00:38:59.680 | And I think it was because of that,
00:39:01.320 | because Peter Daszak's basically,
00:39:03.520 | his major contribution as a scientist
00:39:07.680 | was just tacking his name on work
00:39:09.680 | that Shi Zhengli had largely done.
00:39:13.000 | He was defending a lot,
00:39:14.480 | certainly for himself and his organization.
00:39:16.840 | - So you think EcoHealth Alliance and Peter
00:39:20.640 | is less about money,
00:39:22.000 | it's more about kind of almost like legacy,
00:39:24.760 | 'cause you're so attached to this work?
00:39:26.680 | Is this just on a human level?
00:39:27.520 | - I think so, I think so.
00:39:29.120 | I mean, I've been criticized for being actually,
00:39:31.680 | I'm certainly a big critic of Peter Daszak,
00:39:34.800 | but I've been criticized by some for being too lenient.
00:39:38.520 | I mean, it's so easy to say,
00:39:39.920 | "Oh, somebody, they're like an evil ogre
00:39:43.160 | and just trying to do evil
00:39:45.480 | and cackling in their closet or whatever."
00:39:49.720 | But I think for most of us,
00:39:51.160 | even those of us who do terrible, horrible things,
00:39:55.480 | the story that we tell ourselves
00:39:57.640 | and we really believe is that we're doing the thing
00:40:01.080 | that we most believe in.
00:40:02.560 | I mean, I did my PhD dissertation
00:40:04.560 | on the Khmer Rouge in Cambodia.
00:40:06.800 | They genuinely saw themselves as idealists.
00:40:09.920 | They thought, "Well, we need to make radical change
00:40:13.880 | to build a better future."
00:40:15.680 | And what they described as,
00:40:17.760 | what they felt was radical change
00:40:19.080 | was, to us, monstrous atrocities.
00:40:21.880 | - So the criticism here of Peter
00:40:24.080 | is that he was part of an organization
00:40:31.840 | that was kind of, well, funding an effort
00:40:36.480 | that was an unsafe implementation
00:40:39.000 | of a biosafety level four laboratory.
00:40:42.080 | - Well, a few things.
00:40:43.320 | So what he thought he was doing was,
00:40:48.000 | and then what he thought he was doing
00:40:49.560 | is itself highly controversial
00:40:51.720 | because, and I know you've talked about this
00:40:56.320 | with other guests,
00:40:57.860 | in 2011,
00:41:00.600 | there were the first published papers
00:41:04.000 | on this now infamous gain of function research.
00:41:07.480 | And basically what they did,
00:41:10.520 | both in different labs and certainly in the United States,
00:41:14.600 | in Wisconsin and in the Netherlands,
00:41:18.640 | was they had a bird flu virus that was very dangerous,
00:41:23.640 | but not massively transmissive.
00:41:28.540 | And they had a gain of function process
00:41:31.620 | through what's called serial passage,
00:41:33.640 | which means basically passing a virus,
00:41:35.560 | like natural selection, but forcing natural selection
00:41:39.220 | by just passing a virus through a different cell cultures
00:41:42.460 | and then selecting for what it is that you want.
00:41:46.320 | So relatively easily, they took this deadly,
00:41:49.480 | but not massively transmissive virus
00:41:51.880 | and turned it into, in a lab,
00:41:53.920 | a deadly and transmissive virus.
00:41:57.320 | And that showed that this is really dangerous.
00:41:59.560 | And so there were, at that point,
00:42:01.160 | there was a huge controversy.
00:42:03.040 | There were some people like Richard Ebright
00:42:07.120 | and Mark Lipsitch at Harvard who were saying
00:42:11.160 | that this is really dangerous.
00:42:12.480 | The idea that we need to create monsters
00:42:16.880 | to study monsters.
00:42:17.920 | I think maybe even you have said that in the past.
00:42:21.160 | It doesn't make sense because there's an unlimited number
00:42:24.000 | of monsters.
00:42:24.820 | And so what are we gonna do?
00:42:25.800 | Create an unlimited number of monsters.
00:42:27.480 | And if we do that, eventually the monsters
00:42:29.800 | are going to get out.
00:42:31.440 | Then there was the Peter Daszak camp,
00:42:33.680 | and he got a lot of funding,
00:42:35.480 | particularly from the United States, who said,
00:42:37.680 | well, and certainly Collins and Fauci
00:42:40.400 | were supportive of this.
00:42:42.080 | And they thought, well, there's a safe way
00:42:44.700 | to go out into the world
00:42:46.400 | to collect the world's most dangerous viruses
00:42:49.640 | and to poke and prod them to figure out
00:42:52.940 | how they might mutate, how they might become more dangerous
00:42:56.480 | with the goal of predicting future pandemics,
00:43:01.480 | and that certainly never happened,
00:43:03.680 | and with the goal of creating vaccines and treatments.
00:43:08.000 | And that largely never happened.
00:43:11.240 | But that was, so Peter Daszak kind of epitomized
00:43:14.440 | that second approach.
00:43:18.240 | And as you've talked about in the past, in 2014,
00:43:22.200 | there was a funding moratorium in the United States.
00:43:25.220 | And then in 2017, that was lifted.
00:43:27.960 | It didn't affect the funding
00:43:29.080 | that went to the EcoHealth Alliance.
00:43:33.360 | So when this happened in the beginning,
00:43:37.000 | and again, coming back to Peter's motivations,
00:43:40.060 | I don't think, here's the best case scenario for Peter.
00:43:44.040 | I'm going to give you what I imagine he was thinking,
00:43:47.080 | and then I'll tell you what I actually think.
00:43:49.760 | So I think here's what he's thinking.
00:43:51.720 | This is most likely a natural origin outbreak.
00:43:56.340 | Just like SARS-1, and again, in Peter's hypothetical mind,
00:44:00.420 | just like SARS-1, this is most likely a natural outbreak.
00:44:04.520 | We need to have an international coalition
00:44:06.980 | in order to fight it.
00:44:08.680 | If we allow these political attacks
00:44:11.980 | to undermine our Chinese counterparts
00:44:14.140 | and the trust in these relationships
00:44:16.180 | that we've built over many years,
00:44:18.480 | we're really screwed
00:44:19.520 | because they have the most local knowledge
00:44:22.680 | of these outbreaks.
00:44:23.800 | And even though, and this gets a lot more complicated,
00:44:27.720 | even though there are basic questions
00:44:30.720 | that anybody would ask,
00:44:32.520 | and that Shi Zhengli herself did ask
00:44:34.720 | about the origins of this pandemic,
00:44:37.740 | even though Peter Daszak,
00:44:39.380 | and I'll describe this in a moment,
00:44:42.000 | had secret information that we didn't have,
00:44:45.020 | that in my mind massively increases
00:44:47.480 | the possibility of a lab incident origin,
00:44:49.940 | I, Peter Daszak, would like to guide
00:44:53.720 | the public conversation in the direction
00:44:57.120 | where I think it should go,
00:44:58.840 | and in support of the kind of international collaboration
00:45:03.560 | that I think is necessary.
00:45:04.400 | - That's a strong, positive discussion
00:45:06.240 | because it's true that there's a lot of political BS
00:45:11.240 | and a lot of kind of just bickering and lies,
00:45:17.360 | as we've talked about.
00:45:18.920 | And so it's very convenient to say,
00:45:20.560 | you know what, let's just ignore
00:45:22.120 | all of these quote unquote lies,
00:45:24.080 | and my favorite word, misinformation.
00:45:26.220 | And then, because the way out from this serious pandemic
00:45:31.800 | is for us to work together.
00:45:33.440 | So let's strengthen our partnerships,
00:45:36.240 | and everything else is just like noise.
00:45:38.440 | - Yeah, so let's, and so then now I wanna do
00:45:40.960 | my personal indictment of Peter Daszak,
00:45:43.480 | because that's my view, but I wanted to fairly,
00:45:46.840 | because I think that we all tell ourselves stories,
00:45:51.840 | and also when you're a science communicator,
00:45:56.400 | you can't, in your public communications,
00:45:59.200 | give every doubt that you have, or every nuance.
00:46:03.320 | You kind of have to summarize things.
00:46:05.820 | And so I think that he was, again,
00:46:07.560 | in this benign interpretation,
00:46:09.720 | trying to summarize in the way that he thought
00:46:12.200 | the conversations should go.
00:46:14.120 | Here's my indictment of Peter Daszak,
00:46:16.760 | and I feel like Brutus here,
00:46:20.960 | but I've not come here to praise Peter Daszak,
00:46:25.960 | because while Peter Daszak was doing all of this,
00:46:29.280 | and making all of these statements about,
00:46:31.680 | well, we pretty much know it's a natural origin,
00:46:34.400 | then there was this February 2020 Lancet letter,
00:46:38.180 | where it turns out, and we only knew this later,
00:46:40.960 | that he was highly manipulative.
00:46:42.840 | So he was recruiting all of these people.
00:46:45.280 | He drafted the infamous letter,
00:46:47.520 | calling people like me conspiracy theorists.
00:46:50.720 | He then wrote to people like Ralph Baric and Lin-Fa Wang,
00:46:54.560 | who are also very high-profile virologists,
00:46:57.000 | saying, well, let's not put our names on it,
00:46:59.120 | so it doesn't look like we're doing it,
00:47:01.960 | even though they were doing it.
00:47:04.520 | He didn't disclose a lot of information that they had.
00:47:09.440 | - It was a strategic move,
00:47:10.880 | so just in case people are not familiar,
00:47:13.720 | February 2020 Lancet letter was,
00:47:18.080 | TLDR is Lab Leak Hypothesis is a Conspiracy Theory.
00:47:24.080 | - Essentially, yes.
00:47:26.640 | - So like, with the authority of science,
00:47:29.680 | not saying it's highly likely, saying it's obvious,
00:47:34.680 | duh, it's natural origin,
00:47:37.800 | everybody else is just,
00:47:40.840 | everything else is just misinformation,
00:47:42.840 | and look, there's a bunch of really smart people
00:47:44.960 | that signed this, therefore it's true.
00:47:46.440 | - Yeah, not only that,
00:47:47.400 | so there were the people who,
00:47:50.040 | 27 people signed that letter,
00:47:51.760 | and then after President Trump cut funding
00:47:54.440 | to EcoHealth Alliance,
00:47:55.280 | then he organized 77 Nobel laureates
00:47:59.040 | to have a public letter criticizing that.
00:48:01.760 | But what Peter knew then,
00:48:03.640 | that we didn't fully know,
00:48:06.440 | is that in March of 2018,
00:48:09.680 | EcoHealth Alliance,
00:48:11.000 | in partnership with the Wuhan Institute of Virology
00:48:13.680 | and others,
00:48:14.760 | had applied for a $14 million grant to DARPA,
00:48:19.360 | which is kind of like the venture capital side
00:48:22.160 | of the Defense Department,
00:48:25.560 | where they pursue big ideas.
00:48:29.400 | - By the way, as a tiny tangent,
00:48:31.200 | I've gotten a lot of funding from DARPA,
00:48:33.960 | they fund a lot of excellent robotics research.
00:48:36.360 | - And DARPA's incredible,
00:48:37.760 | and among the things that they applied for
00:48:39.960 | is that we, meaning Wuhan Institute of Virology,
00:48:42.440 | is gonna go and it's gonna collect
00:48:45.000 | the most dangerous bat coronaviruses in Southern China,
00:48:49.400 | and then we, as this group,
00:48:52.320 | are going to genetically engineer these viruses
00:48:56.600 | to insert a furin cleavage site,
00:48:59.760 | so I think when everyone's now seeing the image
00:49:02.600 | of the SARS-CoV-2 virus,
00:49:04.560 | it has these little spike proteins,
00:49:06.320 | these little things that stick out,
00:49:07.960 | which is why they call it a coronavirus,
00:49:09.920 | within that spike protein are these furin cleavage sites,
00:49:12.400 | which basically help with the virus
00:49:14.840 | getting access into our cells,
00:49:18.080 | and they were going to genetically engineer
00:49:20.120 | these furin cleavage sites into these bat coronaviruses,
00:49:26.360 | the sarbecoviruses,
00:49:30.680 | and then, a year and a half later,
00:49:30.680 | what do we see?
00:49:31.720 | We see a bat coronavirus with a furin cleavage site
00:49:36.720 | unlike anything that we've ever seen before
00:49:39.840 | in that category of SARS-like coronaviruses.
00:49:44.200 | That, well, yes, I mean,
00:49:46.480 | DARPA, very correctly, didn't support that application.
00:49:50.880 | - Well, let's actually, let's like pause on that.
00:49:53.120 | So for a lot of people, that's like the smoking gun.
00:49:55.680 | - Yeah.
00:49:56.520 | - Okay, let's talk about this 2018 proposal to DARPA,
00:50:04.560 | so I guess, who drafted the proposal? Is it--
00:50:04.560 | - It's EcoHealth.
00:50:05.520 | - EcoHealth, but the proposal is to do,
00:50:09.120 | so EcoHealth is technically a US-funded organization.
00:50:14.120 | - Primarily.
00:50:15.520 | - And then the idea was to do work
00:50:18.520 | at Wuhan Institute of Virology.
00:50:20.880 | - With, yeah, so it was--
00:50:22.320 | - With EcoHealth.
00:50:23.160 | - Yeah, so EcoHealth, basically,
00:50:25.000 | the Wuhan Institute of Virology was gonna go,
00:50:27.600 | and they were gonna collect these viruses
00:50:29.880 | and store them at Wuhan Institute of Virology.
00:50:31.840 | - But they're also gonna do the actual--
00:50:33.680 | - According, it's a really important point,
00:50:35.480 | according to their proposal,
00:50:37.360 | the actual work was going to be done
00:50:39.640 | at the lab of Ralph Baric
00:50:41.760 | at the University of North Carolina,
00:50:43.400 | who's probably the world's leading expert on coronaviruses.
00:50:48.080 | And so we know that DARPA didn't fund that work.
00:50:54.400 | We know, I think, quite well that Ralph Baric's lab,
00:51:00.320 | in part because it was not funded by DARPA,
00:51:05.280 | they didn't do that specific work.
00:51:07.480 | What we don't know is, well, what work was done
00:51:11.680 | at the Wuhan Institute of Virology,
00:51:13.720 | because WIV was part of this proposal,
00:51:16.940 | they had access to all of the plans,
00:51:19.720 | they had their own capacity,
00:51:22.440 | and they had already done a lot of work
00:51:25.000 | in genetically altering this exact category of viruses
00:51:30.000 | they had created, chimeric mixed viruses they had done,
00:51:35.480 | they had mastered pretty much all of the steps
00:51:39.080 | in order to achieve this thing
00:51:40.640 | that they applied for funding with EcoHealth to do.
00:51:44.200 | And so the question is,
00:51:46.280 | did the Wuhan Institute of Virology
00:51:49.200 | go through with that research anyway?
00:51:51.840 | And in my mind, that's a very, very real possibility.
00:51:55.320 | It would certainly explain
00:51:56.160 | why they're giving no information.
00:51:58.520 | And as you know, I've been a member
00:52:01.480 | of the World Health Organization Expert Advisory Committee
00:52:04.120 | on Human Genome Editing,
00:52:05.920 | which Dr. Tedros created in the aftermath
00:52:08.440 | of the announcement of the world's first CRISPR babies,
00:52:11.720 | and it was just basically the exact same story.
00:52:14.320 | So He Jiankui, a Chinese scientist,
00:52:16.760 | he was not a first-tier scientist,
00:52:18.160 | but a perfectly adequate second-tier scientist,
00:52:21.240 | came to the United States,
00:52:22.280 | learned all of these capacities,
00:52:23.860 | went back to China and said,
00:52:25.160 | well, there's a much more permissive environment,
00:52:28.160 | I'm gonna be a world leader,
00:52:30.560 | I'm gonna establish both myself and China.
00:52:33.280 | So in every scientific field,
00:52:35.360 | we're seeing this same thing
00:52:37.280 | where you kind of learn a model,
00:52:39.440 | and then you do it in China.
00:52:41.540 | So is it possible that the Wuhan Institute of Virology,
00:52:45.660 | with this exact game plan, was doing it anyway?
00:52:49.920 | Is it possible?
00:52:51.800 | We have no clue what work was being done
00:52:55.080 | at the Wuhan Institute of Virology.
00:52:56.880 | It seems extremely likely
00:52:59.840 | that at the Wuhan Institute of Virology,
00:53:02.080 | this is certainly the US government position,
00:53:04.520 | there was the work that was being done in Dr. Shi's lab,
00:53:08.320 | but that wasn't the whole WIV.
00:53:10.320 | We know, at least according to the United States government,
00:53:12.600 | that there was the Chinese military,
00:53:14.180 | the PLA, was doing work there.
00:53:17.560 | Were they doing this kind of work?
00:53:19.800 | Not to create a bioweapon,
00:53:22.120 | but in order to understand these viruses,
00:53:25.120 | maybe to develop vaccines and treatments.
00:53:28.240 | It seems like a very, very logical possibility.
00:53:33.160 | And then, so we know that the Wuhan Institute of Virology
00:53:36.280 | had all of the skills,
00:53:37.600 | we know that they were part of this proposal.
00:53:40.680 | And then you have Peter Daszak, who knows all of this,
00:53:43.800 | that at that time, in February of 2020, we didn't know.
00:53:47.380 | But then he comes swinging out of the gate,
00:53:49.580 | saying anybody who's raising this possibility
00:53:52.920 | of a lab incident origin is a conspiracy theorist.
00:53:57.040 | I mean, it really makes him look,
00:53:59.600 | in my mind, very, very bad.
00:54:01.720 | - Not to at least be somewhat open-minded on this,
00:54:04.480 | because he knows all the details.
00:54:06.080 | He knows that it's not 0%.
00:54:09.000 | I mean, there's no way in his mind he could even argue that.
00:54:12.080 | So it's potential because of the bias,
00:54:14.200 | because of your focus.
00:54:16.160 | I mean, it could be the Anthony Fauci masks thing,
00:54:20.280 | where he knows there's some significant probability
00:54:23.320 | that this is happening.
00:54:24.240 | But in order to preserve good relations
00:54:28.680 | with our Chinese colleagues,
00:54:30.120 | we want to make sure we tell a certain kind of narrative.
00:54:33.020 | So it's not really lying,
00:54:34.720 | it's doing the best possible action at this time
00:54:39.220 | to help the world.
00:54:40.400 | Not that I'm saying that's what happened.
00:54:42.200 | But that's how, like--
00:54:43.760 | - I think it's quite likely that that was the story
00:54:47.480 | that he was telling himself.
00:54:49.440 | But that lack of transparency, in my mind,
00:54:54.440 | is fraudulent, given that we were struggling
00:54:59.320 | to understand something that we didn't understand.
00:55:02.520 | And that I just think that people who possess
00:55:05.280 | that kind of information,
00:55:07.000 | especially when the existence,
00:55:10.040 | like the entire career of Peter Daszak
00:55:12.280 | is funded by US taxpayers.
00:55:14.300 | There's a debt that comes with that.
00:55:16.760 | And that debt is honesty and transparency.
00:55:19.040 | And for all of us,
00:55:19.960 | you talked about your girlfriend checking your phone.
00:55:22.880 | For all of us, being honest and transparent
00:55:26.320 | in the most difficult times is really difficult.
00:55:29.240 | If it were easy, everybody would do it.
00:55:31.360 | And that's, I just feel that Peter
00:55:34.400 | was the opposite of transparent,
00:55:37.320 | and then went on the offensive.
00:55:39.720 | And then had the gall to join,
00:55:44.320 | I know we can talk about this,
00:55:46.040 | this highly compromised joint study process
00:55:51.040 | with the international experts
00:55:54.320 | and their Chinese government counterparts.
00:55:56.580 | And used that as a way of furthering this,
00:56:00.520 | in my mind, fraudulent narrative
00:56:05.040 | that it almost certainly came from natural origins
00:56:09.080 | and a lab origin was extremely unlikely.
00:56:12.800 | - Just to stick briefly on the proposal
00:56:15.120 | to wrap that up, because I do think
00:56:17.280 | in a kind of Jon Stewart way,
00:56:21.960 | if you heard that bit. - Yeah, loved it.
00:56:24.800 | - Yeah, sort of kinda like common sense way,
00:56:29.120 | the 2018 proposal to DARPA from EcoHealth Alliance
00:56:35.360 | and Wuhan Institute of Virology
00:56:38.200 | just seems like a bit of a smoking gun to me.
00:56:41.000 | Like that, so there's this excellent book
00:56:45.280 | that people should read called
00:56:46.960 | "Viral, The Search for the Origin of COVID-19."
00:56:50.240 | Matt Ridley and Alina Chan, I think Alina is in MIT.
00:56:54.440 | Probably-- - She's at the Broad, yeah.
00:56:55.720 | - At Broad Institute, yeah, yeah.
00:56:57.760 | So she, I heard her in an interview
00:57:00.800 | give this analogy of unicorns. - Yeah.
00:57:04.000 | - And where basically somebody writes a proposal
00:57:09.160 | to add horns to horses, the proposal is rejected,
00:57:14.160 | and then a couple of years later or a year later,
00:57:17.760 | a unicorn shows up. (laughs)
00:57:19.880 | - In the place where they're proposing to do it.
00:57:23.200 | I mean, that's so, I had--
00:57:24.040 | - And then everyone's like, it's natural origin.
00:57:26.800 | It's like, it's possible it's natural origin.
00:57:28.960 | Like, we haven't detected a unicorn yet,
00:57:31.000 | and this is the first time we've detected a unicorn.
00:57:33.680 | Or it could be this massive organization
00:57:36.760 | that was planning, is fully equipped,
00:57:39.480 | has like a history of being able to do this stuff,
00:57:42.800 | has the world experts to do it, has the funding,
00:57:45.080 | has the motivation to add horns to horses,
00:57:48.560 | and now a unicorn shows up and they're saying,
00:57:51.000 | nope, definitely natural origin.
00:57:54.360 | - That connects to your first question
00:57:56.800 | of how I get to my 85%,
00:57:59.160 | and here's a summary of that answer.
00:58:03.120 | And so it's what I said in my 60 Minutes interview
00:58:06.320 | a long time ago, of all the gin joints
00:58:08.040 | in all the towns in all the world,
00:58:09.280 | the quote from Casablanca.
00:58:11.880 | And so of all the places in the world
00:58:14.680 | where we have an outbreak of a SARS-like bat coronavirus,
00:58:19.680 | it's not in the area of the natural habitat
00:58:23.640 | of the horseshoe bats, it's the one city in China
00:58:28.640 | with the first and largest level four virology lab,
00:58:33.420 | which they actually weren't even using;
00:58:35.000 | they were doing this work at level three and level two,
00:58:38.480 | where they had the world's largest collection
00:58:40.160 | of bat coronaviruses,
00:58:42.280 | where they were doing aggressive experiments
00:58:45.520 | designed to make these scary viruses scarier,
00:58:50.080 | where they had been part of an application
00:58:53.280 | to insert a furin cleavage site
00:58:56.400 | to make these viruses better able to infect human cells,
00:59:00.260 | and where when the outbreak happened,
00:59:03.320 | we had a virus that was ready for action
00:59:06.720 | to infect humans and to this day,
00:59:08.200 | better able to infect humans than any other species,
00:59:11.600 | including bats.
00:59:13.800 | And then from day one, there's this massive coverup.
00:59:17.720 | And then on top of that, in spite of lots of efforts
00:59:21.320 | by lots of people, there's basically no evidence
00:59:24.960 | for the natural origin hypothesis.
00:59:27.600 | Everything that I've described just now is circumstantial,
00:59:30.240 | but there's a certain point
00:59:32.080 | of where you add up the circumstances
00:59:34.520 | and you see this seems pretty, pretty likely.
00:59:37.820 | I mean, in terms of getting to 100%,
00:59:39.840 | we are not at 100% by any means.
00:59:42.440 | There still is a possibility of a natural origin.
00:59:45.640 | And if we find that, great.
00:59:47.120 | But from everything that I know,
00:59:49.240 | that's how I get to my 85.
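(For readers who want to picture what "stacking up" circumstantial evidence does to a probability estimate, one common framing is Bayesian odds updating, where each piece of evidence multiplies the prior odds by a likelihood ratio. The sketch below is purely illustrative: the prior and the likelihood ratios are hypothetical placeholders chosen for the example, not numbers given in this conversation.)

```python
# Illustrative sketch only: Bayesian odds updating as one way to "stack"
# independent pieces of circumstantial evidence. All numbers below are
# hypothetical placeholders, not figures from the speakers.

def update_probability(prior: float, likelihood_ratios: list[float]) -> float:
    """Convert a prior probability to odds, multiply in each likelihood ratio
    (P(evidence | hypothesis A) / P(evidence | hypothesis B)), and convert
    the resulting odds back to a probability for hypothesis A."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical example: start at a 50/50 prior and fold in three pieces of
# evidence, each judged (subjectively) twice as likely under hypothesis A.
posterior = update_probability(0.5, [2.0, 2.0, 2.0])
print(f"{posterior:.0%}")  # prints 89%
```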
00:59:51.200 | - And we'll talk about why this matters
00:59:54.240 | in a political sense, in a human sense,
00:59:56.600 | in a science, in the realm of science,
00:59:59.360 | all of those factors.
01:00:00.440 | But first, as Nietzsche said,
01:00:02.800 | "Let us look into the abyss
01:00:04.320 | "and the games we'll play with monsters."
01:00:06.400 | That is colloquially called gain-of-function research.
01:00:10.660 | Let me ask the kind of political-sounding question,
01:00:14.720 | which is how people usually phrase it.
01:00:16.880 | Did Anthony Fauci fund gain-of-function research
01:00:21.880 | at the Wuhan Institute of Virology?
01:00:26.820 | - So it depends.
01:00:28.000 | I've obviously been very closely monitoring this.
01:00:31.560 | I've spoken a lot about it.
01:00:32.800 | I've written about it.
01:00:34.680 | And it depends on, I mean, not to quote Bill Clinton,
01:00:37.800 | but to quote Bill Clinton,
01:00:39.200 | it depends on what the definition of is is.
01:00:41.760 | And so if you use a common-sense definition
01:00:45.740 | of gain-of-function, and by gain-of-function,
01:00:47.920 | there are lots of things like gene therapies
01:00:49.440 | that are gain-of-function.
01:00:50.280 | But here, what we mean is gain-of-function
01:00:52.640 | for pathogens potentially able to create
01:00:57.900 | human pandemics.
01:01:00.220 | But if you use the kind of common-sense language,
01:01:03.900 | well, then he probably did.
01:01:05.580 | If you use the technical language from a 2017 NIH document,
01:01:10.580 | and you read that language very narrowly,
01:01:14.860 | I think you can make a credible argument that he did not.
01:01:19.380 | There's a question, though, and Francis Collins
01:01:22.780 | talked about that in his interview with you,
01:01:25.620 | but then there's a question, now that we have
01:01:27.540 | the information from the reports
01:01:31.080 | submitted by EcoHealth Alliance to the NIH,
01:01:35.480 | some of which were late or not even delivered,
01:01:38.900 | that some of this research was done on MERS,
01:01:42.200 | Middle Eastern Respiratory Syndrome virus.
01:01:45.920 | And if that was the case,
01:01:48.240 | there is, I think, a colorable argument
01:01:51.080 | that that would be considered gain-of-function research
01:01:54.920 | even by the narrow language of that 2017 document.
01:01:59.640 | But I definitely think, and I've said this repeatedly,
01:02:02.960 | that Rand Paul can be right, and Tony Fauci can be right.
01:02:07.960 | And the question is, how are we defining gain-of-function?
01:02:12.400 | And that's why I've always said the question in my mind
01:02:14.520 | isn't was it or wasn't it gain-of-function,
01:02:17.700 | as if that's like a binary thing,
01:02:19.920 | if not, great, and if yes, guilty.
01:02:23.980 | The question is just what work was being done
01:02:26.680 | at the Wuhan Institute of Virology?
01:02:28.760 | What role, if any, did US government funding play
01:02:33.760 | in supporting that work?
01:02:36.840 | And what rights do we all have as human beings
01:02:41.360 | and as American citizens and taxpayers
01:02:43.760 | to get all of the relevant information about them?
01:02:47.120 | - So let's try to kinda dissect this.
01:02:51.240 | So who frustrates you more, Rand Paul,
01:02:54.540 | or Anthony Fauci's discussion, or the discussion itself?
01:02:57.640 | So for example, gain-of-function is a term
01:03:01.860 | that's kind of more used just to mean
01:03:06.860 | playing with viruses in a lab
01:03:11.620 | to try to develop more dangerous viruses.
01:03:14.980 | Is this kind of research a good idea?
01:03:22.140 | Is it also a good idea for us to talk about it in public
01:03:26.780 | in the political way that it's been talked about?
01:03:29.880 | Is it okay that the US may have funded
01:03:34.880 | gain-of-function research elsewhere?
01:03:39.020 | I mean, it's kind of assumed, just like with Bill Clinton,
01:03:43.580 | there was very little discussion of, I think,
01:03:46.720 | correct me if I'm wrong, but whether it's okay
01:03:50.620 | for a president, male or female,
01:03:54.100 | to have extramarital sex, okay?
01:03:57.540 | Or is it okay for a president to have extramarital sex
01:04:02.540 | with people on his staff or her staff?
01:04:07.960 | It was more the discussion of lying, I think.
01:04:12.220 | It was, did you lie about having sex or not?
01:04:16.060 | And in this gain-of-function discussion,
01:04:18.340 | what frustrates me personally
01:04:20.580 | is there's not a deep philosophical discussion
01:04:23.140 | about whether we should be doing this kind of research
01:04:25.740 | and what kinds, like what are the ethical lines?
01:04:30.580 | Research on animals at all.
01:04:32.320 | Those are fascinating questions.
01:04:33.640 | Instead, it's a gotcha thing.
01:04:36.440 | Did you or did you not fund research on gain-of-function?
01:04:40.380 | And did you fund, it's almost like a bioweapon,
01:04:43.700 | did you give money to China to develop this bioweapon
01:04:47.780 | that now attacked the rest of the world?
01:04:49.740 | So, I mean, all those things are pretty frustrating,
01:04:52.780 | but is there, I think the thing you can untangle
01:04:57.340 | about Anthony Fauci and gain-of-function research
01:04:59.660 | in the United States and EcoHealth Alliance
01:05:01.740 | and Wuhan Institute of Virology that's kind of,
01:05:04.940 | that's clarifying, what were the mistakes made?
01:05:08.820 | - Sure, so on gain-of-function,
01:05:11.060 | there actually has been a lot of debate.
01:05:14.220 | And I mentioned before in 2011, these first papers,
01:05:18.020 | there was a big debate.
01:05:20.160 | Marc Lipsitch, who was formerly at Harvard,
01:05:22.400 | now with the US government, working in the president's
01:05:26.020 | office, he led a thing called the Cambridge Group
01:05:29.660 | that was highly critical of this work,
01:05:32.580 | basically saying we're creating monsters.
01:05:35.680 | They had the funding pause in 2014.
01:05:39.220 | They spent three years putting together a framework
01:05:42.740 | and then they lifted it in 2017.
01:05:45.300 | So we had a thoughtful conversation.
01:05:47.180 | Unfortunately, it didn't work.
01:05:48.620 | And I think that's where we are now.
01:05:50.900 | So I absolutely think that there are real issues
01:05:54.940 | with the relationship between the United States government
01:05:58.680 | and EcoHealth Alliance, and through that,
01:06:01.660 | EcoHealth Alliance with the Wuhan Institute of Virology.
01:06:05.380 | And one issue is just essential transparency,
01:06:08.260 | because as I see it, it's most likely the case
01:06:11.020 | that we transferred a lot of our knowledge and plans and things
01:06:14.500 | to the Wuhan Institute of Virology.
01:06:16.660 | And again, I'm sure that Shi Zhengli is not herself a monster.
01:06:23.140 | I'm sure of that, even though I've never met her.
01:06:26.240 | But there are just a different set
01:06:28.540 | of pressures on people working in an authoritarian system
01:06:31.380 | than people who are working in other systems.
01:06:33.340 | That doesn't mean it's entirely different.
01:06:36.500 | And so I absolutely think that we shouldn't give $1
01:06:40.660 | to an organization, and certainly a virology institute,
01:06:44.100 | where you don't have full access to their records,
01:06:47.820 | to their databases.
01:06:49.020 | We don't know what work is happening there.
01:06:52.580 | And I think that we need to have that kind of full examination.
01:06:56.860 | And that's why-- so I understand what Dr. Fauci is doing.
01:07:00.860 | He's saying, hey,
01:07:02.500 | what I hear from you, Rand Paul,
01:07:04.940 | is you're accusing me of starting this pandemic.
01:07:08.780 | And you're using gain of function as a proxy for that.
01:07:11.420 | And we have, when there are Senate hearings,
01:07:13.380 | every senator gets five minutes.
01:07:15.340 | And the name of the game is to translate your five minutes
01:07:18.580 | into a clip that's going to run on the news.
01:07:22.140 | And so I get that there is that kind of--
01:07:23.900 | It's a dark, dark game.
01:07:24.860 | --gotcha.
01:07:25.980 | But I also think that Dr. Fauci and the National
01:07:31.780 | Institute of Allergy and Infectious Diseases and the NIH
01:07:35.220 | should have been more transparent.
01:07:37.140 | Because I think that in this day and age, where there
01:07:41.380 | are a lot of people poking around
01:07:42.940 | in this whole story of COVID origins,
01:07:45.380 | we would not be where we are if it wasn't for a relatively
01:07:49.300 | small number of people.
01:07:51.140 | And I'm part of-- there are, as far as I know, two groups.
01:07:54.740 | One is these internet sleuths known as DRASTIC.
01:07:58.100 | And a number of them are part of a group
01:08:00.620 | that I'm part of called--
01:08:02.020 | it's not our official name, but called the Paris Group.
01:08:04.300 | It's about two dozen experts around the world,
01:08:08.420 | but centered around some very high-level French academics.
01:08:13.420 | So we've all been digging and meeting with each other
01:08:16.660 | regularly since last year.
01:08:19.660 | And our governments across the board, certainly China,
01:08:22.180 | but including the United States, haven't
01:08:23.900 | been as transparent as they need to be.
01:08:27.380 | So there were definitely mistakes made on all sides.
01:08:31.540 | And that's why, for me, from day one,
01:08:33.800 | I've been calling for a comprehensive investigation
01:08:37.180 | into this issue that certainly, obviously, looks at China.
01:08:40.700 | But we have to look at ourselves.
01:08:42.020 | We did not get this right.
01:08:43.580 | So to you--
01:08:45.740 | I'm just going to put Rand Paul aside here.
01:08:50.540 | Politician playing political games, it's very frustrating.
01:08:53.420 | But it is what it is on all sides.
01:08:56.580 | Anthony Fauci, you think, should have been more transparent.
01:09:01.460 | And maybe more eloquent in expressing
01:09:08.100 | the complexity of all of this, the uncertainty in all of this.
01:09:12.780 | Yeah, and I get that it's really hard to do that.
01:09:16.500 | Because let's say you have one--
01:09:20.100 | you speak a paragraph, and it's got four sentences.
01:09:23.140 | And one of those sentences is the thing that's
01:09:25.860 | going to be turned into Twitter.
01:09:27.140 | And--
01:09:27.640 | Let me push back.
01:09:28.300 | I get really-- so I'll try not to be
01:09:31.220 | emotional about this.
01:09:32.460 | But I've heard Anthony Fauci a couple of times
01:09:39.500 | now say that he represents science.
01:09:43.660 | I know what he means by that.
01:09:45.580 | He means in the political bickering,
01:09:48.940 | all that kind of stuff, that for a lot of people,
01:09:52.120 | he represents science.
01:09:54.980 | But words matter.
01:09:56.460 | And this isn't just clips.
01:09:58.620 | I mean, maybe I'm distinctly aware of that
01:10:00.380 | doing this podcast.
01:10:01.300 | Like, yeah, I talk for hundreds of hours now,
01:10:05.140 | maybe over 1,000 hours.
01:10:07.180 | But I'm still careful with the words.
01:10:11.420 | I'm trying not to be an asshole.
01:10:13.180 | And I'm aware when I'm an asshole,
01:10:14.780 | and I'll apologize for it.
01:10:17.660 | If the words I represent science left my mouth,
01:10:21.340 | which they very well could, I would sure
01:10:24.420 | as hell be apologizing for it.
01:10:26.020 | And not in a because I got in trouble,
01:10:29.120 | I would just feel bad about saying something like that.
01:10:31.580 | And even that little phrase, I represent science,
01:10:36.180 | no, Dr. Fauci, you do not represent science.
01:10:39.100 | I love science.
01:10:40.100 | The millions of scientists that inspired me to get into it,
01:10:45.140 | to fall in love with the scientific method
01:10:48.060 | in the exploration of ideas through the rigor of science,
01:10:53.860 | that Anthony Fauci does not represent.
01:10:56.180 | He's one, I believe, great scientist of millions.
01:11:00.600 | He does not represent anybody.
01:11:03.280 | He's just one scientist.
01:11:04.760 | And I think the greatness of a scientist
01:11:07.880 | is best exemplified in humility.
01:11:11.000 | Because the scientific method basically
01:11:13.440 | says you're standing before the fog, the mystery of it all,
01:11:19.940 | and slowly chipping away at the mystery.
01:11:23.300 | And it's embarrassing.
01:11:25.760 | It's humiliating how little you know.
01:11:28.520 | That's the experience.
01:11:29.880 | So the great scientists have to have humility, to me.
01:11:33.560 | And especially in their communication,
01:11:35.400 | they have to have humility.
01:11:36.800 | And I don't know.
01:11:38.120 | And some of it is also words matter.
01:11:40.040 | Because great leaders have to have the poetry of action.
01:11:46.160 | They have to be bold and inspire action
01:11:50.160 | across millions of people.
01:11:52.720 | But you also have to, through that poetry of words,
01:11:58.960 | express the complexity of the uncertainty
01:12:02.800 | you're operating under.
01:12:04.280 | Be humble in the face of not being
01:12:06.560 | able to predict the future, or understand the past,
01:12:09.640 | or really know what's the right thing to do,
01:12:11.600 | but we have to do something.
01:12:13.320 | And through that, you have to be a great leader that
01:12:16.840 | inspires action.
01:12:17.960 | And some of that is just words.
01:12:20.080 | And he chose words poorly.
01:12:22.000 | I mean, so I'm all torn about this.
01:12:25.400 | And then there's politicians that are taking those words
01:12:28.200 | and magnifying them and playing games with them.
01:12:31.800 | And of course, that's a disincentive
01:12:34.280 | for the people who do, the scientific leaders that
01:12:37.840 | step into the limelight to say any more words.
01:12:41.640 | So they kind of become more conservative
01:12:43.480 | with the words they use.
01:12:45.360 | I mean, it just becomes a giant mess.
01:12:47.040 | But I think the solution is to ignore all of that
01:12:52.360 | and to be transparent, to be honest, to be vulnerable,
01:12:56.840 | and to express the full uncertainty of what you're
01:13:02.440 | operating under, to present all the possible actions,
01:13:06.000 | and to be honest about the mistakes
01:13:07.600 | they made in the past.
01:13:08.360 | I mean, there's something-- even if you're not directly
01:13:10.760 | responsible for those mistakes, taking responsibility for them
01:13:15.280 | is a way to win people over.
01:13:18.040 | I don't think leaders realize this often in the modern age.
01:13:21.760 | In the internet age, they can see through your bullshit.
01:13:24.960 | And it's really inspiring when you take ownership.
01:13:29.160 | So to do the thought experiment in public,
01:13:32.680 | do a thought experiment, if there was a lab leak,
01:13:35.360 | and then lay out all the funding, the EcoHealth
01:13:37.920 | Alliance, all the incredible science going on
01:13:41.800 | at the Wuhan Institute of Virology and the NIH.
01:13:46.360 | Lay out all the possible ethical problems.
01:13:48.440 | Lay out all the possible mistakes
01:13:52.520 | that could have been made.
01:13:53.600 | And say, this could have happened.
01:13:56.280 | And if this happened, here's the best way to respond to it
01:13:59.600 | and to prevent it in the future.
01:14:00.920 | And just lay all that complexity out.
01:14:03.360 | I wish we would have seen that.
01:14:05.960 | And I have hope that this conversation, conversations
01:14:09.960 | like it, your work, and books on this topic
01:14:13.680 | will inspire young people today, when
01:14:15.920 | they end up in Anthony Fauci's role,
01:14:19.680 | to be much more transparent and much more humble
01:14:22.080 | and all those kinds of things.
01:14:23.520 | That this is just a relic of the past.
01:14:26.000 | When there's a person, no offense to me,
01:14:27.920 | in a suit that has to stand up and speak
01:14:31.120 | with clarity and certainty, I mean,
01:14:33.200 | that's just a relic of the past.
01:14:34.600 | This is my hope.
01:14:39.280 | But--
01:14:40.400 | Do you mind if I--
01:14:41.200 | Yes, please.
01:14:41.800 | --agree with a great deal of what you said.
01:14:46.160 | And it's really unfortunate that certainly the Chinese
01:14:50.480 | government, as I said before, our government
01:14:53.480 | wasn't as transparent as I feel they should have been,
01:14:56.880 | particularly in the early days of the pandemic,
01:14:59.600 | and particularly with regard to the issue of pandemic origins.
01:15:03.360 | I mean, we know that Dr. Fauci was on calls
01:15:06.720 | with people like Kristian Andersen at Scripps
01:15:09.800 | and others in those early days, raising questions,
01:15:13.440 | is this an engineered virus?
01:15:15.640 | There were a lot of questions.
01:15:17.800 | And it's kind of sad.
01:15:19.520 | I mean, as I mentioned before, I've been one.
01:15:23.520 | I mean, and certainly there were others.
01:15:25.600 | But there weren't a lot of us, of the people
01:15:28.240 | who from the earliest days of the pandemic
01:15:30.800 | were raising questions about, hey, not so fast here.
01:15:34.920 | And I launched my website on pandemic origins
01:15:38.480 | in April of last year, April 2020.
01:15:41.120 | It got a huge amount of attention.
01:15:42.520 | And actually, my friend Matt Pottinger,
01:15:44.880 | who is the deputy national security advisor,
01:15:46.840 | when he was reaching out to people in the US government
01:15:50.360 | and in allied governments saying, hey,
01:15:52.520 | we should look into this, what he was sending them
01:15:55.640 | was my website.
01:15:56.680 | It wasn't some US government information.
01:15:59.800 | And so--
01:16:00.320 | And by the way, people should still go to the website.
01:16:02.600 | You keep updating it.
01:16:05.200 | And it's an incredible resource.
01:16:07.360 | Thank you.
01:16:07.880 | Thank you.
01:16:08.360 | JamieMetzl.com.
01:16:10.800 | And it's really unfortunate that our governments
01:16:13.480 | and international institutions for pretty much all of 2020
01:16:18.360 | weren't doing their jobs of really probing this issue.
01:16:22.560 | People were hiding behind this kind of false consensus.
01:16:26.560 | And I'm critical of many people.
01:16:28.520 | Even when I heard Francis Collins' interview with you,
01:16:32.480 | I just felt, well, he wasn't as balanced
01:16:35.240 | on the issue of COVID origins.
01:16:37.400 | Certainly, Dr. Fauci could have, in his conversation
01:16:41.480 | with Rand Paul-- it wasn't even a conversation,
01:16:43.480 | but in some process in the aftermath,
01:16:46.560 | could have laid things out a bit better.
01:16:48.440 | He did say, and Francis Collins did
01:16:50.400 | say, that we don't know the origins.
01:16:52.240 | And that was a shift.
01:16:54.240 | And we need to have an investigation.
01:16:58.040 | But having said all of that, I do kind of--
01:17:02.120 | one, I have tremendous respect for Dr. Fauci for the work
01:17:05.320 | that he's done on HIV/AIDS.
01:17:07.000 | I have been vaccinated with the Moderna vaccine.
01:17:10.840 | Dr. Fauci was a big part of the story of getting us
01:17:14.760 | these vaccines that have saved millions and millions of lives.
01:17:19.480 | And so I don't think--
01:17:21.040 | I mean, there's a lot to this story.
01:17:23.440 | And then the second thing is it's really hard
01:17:26.400 | to be a public health expert, because you have--
01:17:29.320 | your mission is public health.
01:17:32.120 | And so you have to--
01:17:33.760 | if you are leading with all of your uncertainty,
01:17:37.800 | it's a really hard way to do things.
01:17:40.440 | And so even now, if I go to CVS and I get a Tylenol,
01:17:45.920 | somebody has done a calculation of how many people
01:17:49.520 | will die from taking Tylenol.
01:17:51.480 | And they say, well, we can live with that.
01:17:54.200 | And that's why we have regulation.
01:17:55.920 | And so all of us are doing kind of summaries.
01:17:59.120 | And then we have people in public health who are saying,
01:18:01.440 | well, we've summed it all up.
01:18:03.360 | And you should do X. You should get your kids vaccinated
01:18:07.320 | for measles.
01:18:09.480 | You should not drive your car at 100 miles an hour.
01:18:12.880 | You should-- don't drink lighter fluid,
01:18:15.200 | whatever these things are.
01:18:17.360 | And we want them to kind of give us broad guidelines.
01:18:21.000 | And yet now our information world
01:18:23.720 | is so fragmented that if you're not
01:18:27.240 | being honest about something, something material,
01:18:31.280 | someone's going to find out.
01:18:33.240 | And it's going to undermine your credibility.
01:18:35.080 | And so I agree with you that there is a greater requirement
01:18:41.040 | for transparency now.
01:18:43.320 | Maybe there always has been.
01:18:44.480 | But there's an even greater requirement for it now.
01:18:48.480 | Because people want to trust that you're speaking honestly
01:18:53.800 | and that you're saying, well, here's what I know.
01:18:56.080 | And this is based on what I know.
01:18:58.080 | Here are the conclusions that I draw.
01:19:00.480 | But if it's just-- and again, I don't think the words "I'm
01:19:04.040 | science" or whatever it was are the right words.
01:19:07.000 | But if it's just, trust me because of who I am,
01:19:10.840 | I don't think that flies anywhere anymore.
01:19:14.720 | - Can I just ask you about the Francis Collins interview
01:19:18.040 | that I did, if you got a chance to hear that part?
01:19:20.000 | I think in the beginning we talked about the lab leak.
01:19:23.720 | What are your thoughts about his response?
01:19:25.960 | Basically saying it's worthy of an investigation.
01:19:28.840 | But I don't know how you would interpret it.
01:19:33.920 | See, it's funny because I heard it in the moment
01:19:39.680 | as it's great for the head of NIH to be open-minded on this.
01:19:46.960 | But then the internet and Mr. Joe Rogan and a bunch of friends
01:19:52.160 | and colleagues told me that, yeah, well, that's
01:19:56.560 | too late and too little.
01:19:58.640 | - Yeah, so first, let me say I've
01:20:01.200 | been on Joe's podcast twice.
01:20:03.240 | And I love the guy, which doesn't
01:20:04.840 | mean that I agree with everything he does or says.
01:20:10.200 | And on this issue--
01:20:11.920 | and I'm normally a pretty calm and measured guy.
01:20:15.000 | And when you're just out running with your AirPods on
01:20:18.960 | and you start yelling into the wind in Central Park,
01:20:24.000 | nobody else knows why you're yelling.
01:20:27.200 | But while--
01:20:27.700 | - So you had such a moment?
01:20:28.960 | - I had a moment with Collins.
01:20:30.560 | And again, Francis Collins is someone I respect enormously.
01:20:34.040 | I mean, I live a big chunk of my life
01:20:36.880 | living in the world of genetics and biotech.
01:20:40.240 | And my book, Hacking Darwin, is about the future
01:20:43.280 | of human genetic engineering.
01:20:45.200 | And his work on the Human Genome Project and so many other
01:20:48.080 | things have been fantastic.
01:20:49.200 | And I'm a huge fan of the work of NIH.
01:20:53.320 | And he was right to say that the Chinese government hasn't
01:20:56.360 | been forthcoming, and we need to look into it.
01:20:58.760 | But then you asked him, well, how will we know?
01:21:01.960 | And then his answer was, we need to find the intermediate host.
01:21:06.080 | Remember I said before?
01:21:07.400 | And so that made it clear that he thought, well,
01:21:11.040 | we should have an investigation.
01:21:13.080 | But it comes from nature.
01:21:15.040 | And we just need to find that whatever it is,
01:21:17.880 | that intermediate animal host in the wild,
01:21:21.320 | and that'll tell us the story.
01:21:22.160 | - So here he had the conclusion in mind,
01:21:25.600 | and he's just waiting for the evidence
01:21:27.280 | to support the conclusion.
01:21:28.360 | - That was my feeling.
01:21:29.640 | I felt like he was open in general, but he was tilting.
01:21:33.160 | And again, your first question was, where do I fall?
01:21:36.440 | I was like, I'm 85% or whatever it is, 80, 75, 90,
01:21:41.440 | whatever it is in the direction of a lab incident.
01:21:44.960 | It made it feel that he was 90/10 in the other direction,
01:21:49.680 | which still means that he's open-minded
01:21:52.240 | about the possibility.
01:21:53.920 | And that's why, in my view, every single person
01:21:57.960 | who talks about this issue, I think the right answer,
01:22:00.720 | in my view, is we don't know conclusively.
01:22:04.000 | In my, and this is my personal view,
01:22:06.720 | the circumstantial evidence is strongly in favor
01:22:09.480 | of a lab incident origin, but that could immediately shift
01:22:12.480 | with additional information.
01:22:14.640 | We need transparency, but we should come together
01:22:19.440 | in absolutely condemning the outrageous coverup
01:22:24.040 | carried out by the Chinese government,
01:22:26.040 | which to this day is preventing any meaningful investigation
01:22:31.040 | into pandemic origins.
01:22:32.440 | We have, if you use The Economist's numbers,
01:22:36.000 | 15 million people who are dead as a result of this pandemic.
01:22:40.480 | And I believe that the actions of the Chinese government
01:22:44.120 | are disgracing the memory of these 15 million dead.
01:22:50.120 | They're insulting the families and the billions of people
01:22:55.360 | around the world who have suffered
01:22:57.040 | from this totally avoidable pandemic.
01:23:00.560 | And whatever the origin, the fact, the criminal coverup
01:23:04.360 | carried out by the Chinese government,
01:23:07.000 | which continues to this day, but most intensely
01:23:09.600 | in the first months following the outbreak,
01:23:12.160 | that's the reason why we have so many dead.
01:23:16.840 | And certainly, as I was saying before,
01:23:19.160 | I and a small number of others have been carrying this flame
01:23:23.000 | since early last year, but it's kind of crazy
01:23:27.080 | that our governments haven't been demanding it.
01:23:29.840 | And we can talk about the World Health Organization process,
01:23:33.600 | which was deeply compromised in the beginning.
01:23:36.040 | Now it's become much, much better.
01:23:38.880 | But again, it was the pressure of outsiders
01:23:42.960 | that played such an important role
01:23:44.680 | in shifting our national and international institutions.
01:23:49.320 | And while that's better than nothing,
01:23:50.840 | it would have been far better if our governments
01:23:53.760 | and international organizations
01:23:55.400 | had done the right thing from the start.
01:23:56.800 | - If I could just make a couple of comments about Joe Rogan.
01:24:01.560 | So there's a bunch of people in my life
01:24:08.560 | who have inspired me, who have taught me a lot,
01:24:11.720 | who I even look up to.
01:24:14.080 | Many of them are alive, most of them are dead.
01:24:19.000 | I wanna say that Joe said a few critical words
01:24:22.640 | about the conversation with Francis Collins,
01:24:24.680 | most of it offline,
01:24:26.280 | with a lot of great conversations about it,
01:24:30.080 | some he said publicly.
01:24:37.560 | He was also critical in saying that
01:24:40.520 | me asking hard questions in an interview
01:24:45.920 | is not my strong suit.
01:24:47.360 | And I really want to kinda respond to that,
01:24:53.240 | which I did privately as well,
01:24:54.600 | but publicly to say that Joe's 100% right on that.
01:24:59.600 | But that doesn't mean that always has to be the case.
01:25:04.560 | And that is definitely something I wanna work on,
01:25:06.680 | 'cause most of the conversations I have,
01:25:08.760 | I wanna see the beautiful ideas in people's minds.
01:25:12.280 | But there's some times
01:25:15.080 | where you have to ask the hard questions
01:25:17.280 | to bring out the beautiful ideas.
01:25:19.680 | And it's hard to do.
01:25:23.000 | It's a skill.
01:25:24.080 | And Joe is very good at this.
01:25:26.520 | The way he put it in his criticisms,
01:25:29.880 | and he does this in his conversations,
01:25:31.680 | is, whoa, whoa, whoa, stop, stop, stop, stop, stop.
01:25:34.760 | There's a kinda sense like,
01:25:36.520 | did you just say what you said?
01:25:38.560 | Let's make sure we get to the bottom,
01:25:42.560 | we clarify what you mean.
01:25:44.340 | 'Cause sometimes really big negative or difficult ideas
01:25:52.640 | can be said as a quick aside in a sentence.
01:25:59.080 | Like it's nothing.
01:26:00.120 | But it could be everything.
01:26:02.680 | And you wanna make sure you catch that
01:26:04.600 | and you talk about it.
01:26:06.320 | And not as a gotcha,
01:26:08.320 | not as a kinda way to destroy another human being,
01:26:12.400 | but to reveal something profound.
01:26:14.840 | And that's definitely something I wanna work on.
01:26:16.960 | I also want to say that,
01:26:21.200 | as you said, you disagree with Joe on quite a lot of things.
01:26:24.440 | So for a long time,
01:26:25.680 | Joe was somebody that I was just a fan of, listened to.
01:26:28.080 | He's now a good friend.
01:26:30.920 | And I would say we disagree more than we agree.
01:26:34.560 | And I love doing that.
01:26:36.520 | But at the same time, I learned from that.
01:26:41.680 | So it's like dual,
01:26:43.600 | like nobody in this world can tell me what to think.
01:26:48.600 | But I think everybody has a lesson to teach me.
01:26:53.700 | I think that's a good way to approach it.
01:26:57.080 | Whenever somebody has words of criticism,
01:27:00.320 | I assume they're right and walk around with that idea.
01:27:05.240 | To really sort of empathize with that idea
01:27:07.360 | because there's a lesson there.
01:27:09.120 | And oftentimes, my understanding
01:27:13.000 | of a topic
01:27:18.400 | is altered completely, or it becomes much more nuanced,
01:27:22.480 | much richer,
01:27:25.140 | for that kind of empathetic process.
01:27:27.180 | But definitely, I do not allow anybody
01:27:31.440 | to tell me what to think,
01:27:32.660 | whether it's Joe Rogan or Fyodor Dostoevsky or Nietzsche
01:27:37.580 | or my parents or the proverbial girlfriend,
01:27:42.580 | which I don't actually have.
01:27:46.280 | - But she's still busting my balls.
01:27:49.700 | - Exactly, exactly.
01:27:51.260 | In my imagination, I have a girlfriend in Canada.
01:27:54.220 | Yeah, that I have imagined, exactly.
01:27:58.740 | Imagining conversations.
01:27:59.780 | I so wanna mention that.
01:28:01.420 | But also, I don't know if you've gotten a chance
01:28:03.980 | to see this, I'd love to also mention this Twitter feud
01:28:08.980 | between two other interesting people,
01:28:12.780 | which is Brett Weinstein and Sam Harris,
01:28:15.820 | or Sam Harris and others in general.
01:28:18.500 | And it kind of breaks my heart
01:28:21.060 | that these two people I listen to
01:28:22.980 | that are very thoughtful about a bunch of issues.
01:28:25.580 | Let's put COVID aside
01:28:27.860 | 'cause people are very emotional about this topic.
01:28:30.500 | I mean, I think they're deeply thoughtful and intelligent,
01:28:35.500 | whether you agree with them or not.
01:28:39.020 | And I always learn something from their conversations.
01:28:42.100 | And they are legitimately,
01:28:43.880 | or have been for a long time, friends.
01:28:46.540 | And it's a little bit heartbreaking to me
01:28:49.380 | to see that they basically don't talk in private anymore.
01:28:53.300 | And there's occasional jabs on Twitter.
01:28:57.080 | And I hope that changes.
01:29:00.060 | I hope that changes in general around COVID,
01:29:02.220 | because COVID brought out, I would say,
01:29:06.100 | the most emotional sides of people, the worst in people.
01:29:10.100 | And I think there hasn't been enough love
01:29:14.260 | and empathy and compassion.
01:29:16.620 | And to see two people from whom I've learned a lot,
01:29:20.740 | whether it's Eric Weinstein, Brett Weinstein, Sam Harris,
01:29:23.720 | you can criticize them as much as you want,
01:29:25.500 | their ideas as much as you want.
01:29:27.360 | But if you're not sufficiently open-minded
01:29:30.220 | to admit that you have a lot to learn
01:29:33.060 | from their conversations, I think you're not being honest.
01:29:37.700 | And so I do hope they have those conversations.
01:29:40.380 | And I hope we can kinda,
01:29:42.040 | I think there's a lot of repairing to be done post-COVID
01:29:44.900 | of relationships, of conversations.
01:29:49.580 | And I think empathy and love can help a lot there.
01:29:53.860 | This is also just a,
01:29:55.620 | I talked to Sam privately,
01:29:58.480 | but this is also a public call out
01:30:01.040 | to put a little bit more love in the world
01:30:06.040 | and for these difficult conversations to happen.
01:30:13.020 | Because Brett Weinstein could be very wrong
01:30:19.200 | about a bunch of topics here around COVID,
01:30:25.480 | but he could also be right.
01:30:27.440 | And the only way to find out is to have those conversations.
01:30:31.220 | 'Cause there's a lot of people listening
01:30:32.760 | to both Sam Harris and Brett Weinstein.
01:30:35.500 | And if you go into these silos
01:30:39.520 | where you just keep telling each other
01:30:42.160 | that you are the possessors of truth
01:30:46.660 | and nobody else is the possessor of truth,
01:30:48.800 | what starts happening is you both lose track of the truth,
01:30:53.520 | or the capability of arriving at the truth,
01:30:56.080 | 'cause nobody's in the possession of the truth.
01:30:58.440 | So anyway, this is the call out
01:30:59.920 | that we should have a little bit more conversation,
01:31:01.560 | a little bit more love.
01:31:02.400 | - I totally agree.
01:31:04.240 | And both of those guys are guys who I respect.
01:31:08.000 | And as you know, Brett, and again, as I mentioned,
01:31:11.560 | there are just a handful of us
01:31:13.760 | who were the early people raising questions
01:31:16.320 | about the origins of this pandemic.
01:31:19.560 | - He was there also talking.
01:31:21.200 | So people have heard him speak quite a bit
01:31:24.040 | about antiviral drugs and all that kind of stuff,
01:31:26.960 | but he was also raising concerns about lab leak early on.
01:31:30.640 | - Yeah, exactly.
01:31:31.480 | And so, but I completely agree with you
01:31:33.720 | that we don't have to agree with everybody,
01:31:37.480 | but it's great to have healthy conversations.
01:31:40.780 | That's how we grow.
01:31:42.160 | And absolutely, we live in a world where we're kind of,
01:31:46.680 | if we're not careful,
01:31:47.800 | pushed into these little information pockets.
01:31:49.960 | And certainly on social media,
01:31:51.840 | I have different parts of my life.
01:31:54.080 | One is focusing on issues of COVID origins,
01:31:58.680 | and then I have genetics and biotechnology.
01:32:01.040 | And then I have, which maybe we'll talk about later,
01:32:03.080 | one shared world, which is about
01:32:04.320 | how do we build a safer future?
01:32:06.760 | And when I say critical things like, the Chinese government--
01:32:10.440 | we have to demand a full investigation
01:32:13.000 | into pandemic origins, this is an outrage--
01:32:15.200 | then it's really popular.
01:32:16.800 | When I say, let's build a better future
01:32:19.120 | for everyone in peace and love,
01:32:20.640 | it's like, wow, three people liked it,
01:32:23.040 | and one was my mother.
01:32:25.080 | And so I just feel like we need to build,
01:32:28.080 | we used to have that connectivity just built in
01:32:32.120 | because we had these town squares
01:32:34.080 | and you couldn't get away from them.
01:32:36.080 | Now we can get away from them.
01:32:37.760 | So engaging with people who have a different background
01:32:41.240 | is really essential.
01:32:42.520 | And I'm on Fox News sometimes three, four times a week,
01:32:48.120 | and I wouldn't, in my normal life,
01:32:49.840 | I'm not watching that much of Fox News
01:32:53.640 | or even television more generally,
01:32:56.320 | but I just feel like if I just speak to people
01:32:59.040 | who are very similar to me, it'll be comfortable,
01:33:04.040 | but what have I contributed?
01:33:05.840 | So I think we really have to have
01:33:07.800 | those kinds of conversations and recognize
01:33:11.840 | that at the end of the day,
01:33:13.160 | most people wanna be happy,
01:33:15.880 | they wanna live in a better world,
01:33:17.360 | maybe have different paths to get there,
01:33:20.200 | but if we just break into camps
01:33:22.640 | that don't even connect with each other,
01:33:24.720 | that's a much more dangerous world.
01:33:26.560 | - Let's dive back into the difficult pool.
01:33:31.400 | - Yes.
01:33:32.720 | - Just like you said, in the English speaking world,
01:33:36.840 | it seems popular, almost easy to demonize China,
01:33:41.840 | the Chinese government, I should say.
01:33:46.960 | But even China, there's this kind of a gray area
01:33:50.160 | that people just fall into,
01:33:52.320 | and I'm really uncomfortable with that,
01:33:54.920 | perhaps because in my mind, in my heart, in my blood
01:33:58.440 | are echoes of the Cold War and that kind of tension.
01:34:01.120 | It feels like we almost desire conflict,
01:34:08.880 | so we see demons when there is none.
01:34:12.040 | So I'm a little cautious to demonize,
01:34:15.680 | but at the same time, you have to be honest.
01:34:18.160 | So it's like honest with the demons that are there
01:34:22.320 | and honest when they're not.
01:34:23.800 | This is kind of a geopolitical therapy session of sorts.
01:34:29.280 | So let's keep talking about China
01:34:31.760 | a little bit from different angles.
01:34:33.000 | So let's return to the director
01:34:37.160 | of the Center of Emerging Infectious Diseases
01:34:39.280 | at the Wuhan Institute of Virology, Shi Zhengli,
01:34:44.120 | colloquially referred to as Batwoman.
01:34:46.000 | So do you think she's lying?
01:34:51.440 | - Yes.
01:34:52.280 | - Do you think she's being forced to lie?
01:34:54.720 | - Yes.
01:34:55.560 | - I've known a bunch of virologists
01:34:58.240 | in private and public conversation
01:35:00.200 | that respect her as a human being, as a scientist.
01:35:03.560 | - I respect her as a human being.
01:35:05.120 | - Sorry, as a scientist, not a human being,
01:35:07.080 | 'cause I think they don't know the human,
01:35:08.640 | they know the scientist,
01:35:09.520 | and they respect her a lot as a scientist.
01:35:11.360 | - Yeah, I respect her, and I've never met her,
01:35:13.920 | and we had one exchange, which I'll mention in a second
01:35:16.160 | in a virtual forum, but I do respect her.
01:35:18.800 | I actually, I think that she is somebody
01:35:20.560 | who has tried to do the right thing.
01:35:23.120 | She was one of the heroes of tracking down
01:35:25.240 | the origins of SARS-1, and that was a major contribution.
01:35:29.920 | But as we talked about earlier,
01:35:33.640 | it's a different thing living, being a scientist,
01:35:38.520 | or really kind of anything.
01:35:40.640 | It's different being one of those people
01:35:43.960 | in an authoritarian society
01:35:47.240 | than it is being in a different type of society.
01:35:50.520 | And so when Shi Zhengli said that the reason
01:35:54.240 | the WIV database was taken offline in September '19
01:36:00.320 | was because of computer hacks,
01:36:02.280 | I don't think that's the story.
01:36:03.960 | I don't think she thinks that's the story.
01:36:07.520 | When I asked her in March of 2021, March of this year,
01:36:12.520 | in a Rutgers online forum,
01:36:15.480 | when I asked her whether the Chinese military
01:36:18.280 | had any engagement with the Wuhan Institute of Virology
01:36:22.200 | in any way, and she said, "Absolutely not," paraphrasing,
01:36:26.480 | I think she was lying.
01:36:27.720 | Do I think that she had the ability to say,
01:36:30.680 | "Well, either one, yes, but I can't talk about it,
01:36:34.400 | "or I know there are a lot of things
01:36:36.520 | "that are happening at this institute
01:36:38.080 | "that I don't know about, and that could be one."
01:36:42.120 | Could she have said that the personnel
01:36:46.480 | at the Wuhan Institute of Virology
01:36:48.200 | have all had to go through classification training
01:36:52.920 | so that they can know about what can and can't be said?
01:36:57.640 | Like, she could have said all those things,
01:36:59.560 | but she couldn't say all of those things.
01:37:01.960 | And so, and I think that's why so many,
01:37:06.200 | at least in my view, so many people
01:37:08.080 | certainly in the Western world got this story wrong
01:37:12.760 | from the beginning, because if your only prism
01:37:16.600 | was the science, and you just assumed
01:37:19.080 | this is a science question to be left to the scientists,
01:37:22.720 | Shi Zhengli is just like any scientist
01:37:26.040 | working in Switzerland or Norway,
01:37:29.320 | the Chinese government isn't interfering in any way,
01:37:33.400 | and we can trust them.
01:37:34.800 | That would lead you down one path.
01:37:37.280 | In my view, the reason why I progressed as I did
01:37:40.680 | is I felt like I had two keys,
01:37:42.280 | and I had one key as I live in the science world
01:37:45.960 | through my work with WHO and my books and things like that,
01:37:50.120 | but I also have another part of my life
01:37:52.320 | in the world of geopolitics as an Asia, quote-unquote,
01:37:56.960 | expert and former National Security Council official
01:38:00.400 | and other things, and I felt, for me,
01:38:02.120 | I needed both keys to open that door,
01:38:06.360 | but if I only had the science key,
01:38:08.960 | I wouldn't have had the level of doubt
01:38:11.280 | and suspicion that I have, but if my starting point
01:38:14.320 | was only doubt and suspicion, well, it's coming from China,
01:38:17.540 | it must be that the government is guilty.
01:38:20.600 | Like, that wouldn't help either.
01:38:22.200 | - I wonder what's in her mind,
01:38:26.600 | whether it's fear or habit,
01:38:30.880 | 'cause I think a lot of people in the former Soviet Union,
01:38:35.800 | it's like Chernobyl, it's not really fear.
01:38:38.740 | It's almost like a momentum.
01:38:40.600 | It's like the reason I showed up to this interview
01:38:45.200 | wearing clothes as opposed to being naked.
01:38:48.000 | It's like, all right.
01:38:49.360 | It's like, it's just all of us are doing the clothes thing.
01:38:55.760 | - Although there was a startup years ago called Naked News.
01:38:59.560 | Did you ever hear about that?
01:39:00.440 | - They just would read the exact news.
01:39:02.520 | - With naked.
01:39:03.360 | - No, they would, after each story,
01:39:04.800 | they'd take something off until the end.
01:39:07.080 | I don't think--
01:39:07.920 | - It's a good idea for a podcast.
01:39:09.560 | - They have an IPO.
01:39:10.400 | - Stay tuned, next time I'm with Michael Malice.
01:39:13.960 | Okay.
01:39:14.800 | So what do you think, I mean,
01:39:19.040 | 'cause the reason I asked that question is,
01:39:21.200 | how do we kind of take steps to improve
01:39:25.680 | without any kind of revolutionary action?
01:39:27.960 | You could say we need to inspire the Chinese people
01:39:32.800 | to elect, to sort of revolutionize the system from within,
01:39:37.800 | but like, who are we to suggest that?
01:39:43.480 | Because we have our flaws too.
01:39:45.520 | We should be working on our flaws as well.
01:39:48.000 | And so, but at the individual scientist level,
01:39:52.440 | what are the small acts of rebellion that can be done?
01:39:56.400 | How can we improve this?
01:39:58.640 | - Well, I don't know about small acts of rebellion,
01:40:02.440 | but I'll try to answer your question
01:40:04.640 | from a few different perspectives.
01:40:07.920 | So right now, actually, as we speak,
01:40:11.040 | there is a special session
01:40:12.940 | of the World Health Assembly going on.
01:40:14.720 | And the World Health Assembly is the governing authority
01:40:17.560 | over the World Health Organization,
01:40:19.360 | where it's represented by states and territories,
01:40:22.560 | 194 of them, tragically not including Taiwan,
01:40:26.920 | because of the Chinese government's insistence.
01:40:29.320 | But they're now beginning a process
01:40:31.520 | of trying to negotiate a global pandemic treaty
01:40:35.480 | to try to have a better process
01:40:37.640 | for responding to crises exactly like we're in.
01:40:42.200 | But unfortunately, for the exact same reasons
01:40:45.600 | that we have failed, I mean,
01:40:47.280 | we had a similar process after the first SARS.
01:40:49.900 | We set up what we thought was the best available system,
01:40:52.640 | and it has totally failed here.
01:40:55.440 | And it's failed here because of the inherent pathologies
01:41:00.440 | of the Chinese government system.
01:41:03.000 | We are suffering from a pandemic that exists
01:41:06.840 | because of the internal pathologies of the Chinese state.
01:41:11.440 | And that's why on one hand, I totally get this impulse.
01:41:14.960 | Well, we do it our way, they do it their way.
01:41:18.000 | Who's to say that one way is better?
01:41:21.140 | And certainly right now in the United States,
01:41:23.580 | we're at each other's throats.
01:41:25.300 | We have a hard time getting anything meaningful done.
01:41:29.020 | And I'm sure there are people who are saying,
01:41:31.300 | well, that model looks appealing.
01:41:33.660 | But just as people could look to the United States and say,
01:41:37.980 | well, because the United States has such a massive reach,
01:41:40.700 | what we do domestically has huge implications
01:41:43.180 | for the rest of the world,
01:41:44.760 | they become stakeholders in our politics.
01:41:48.180 | And that's why I think for a lot of years,
01:41:49.900 | people have just been looking at US politics,
01:41:52.220 | not 'cause it's interesting,
01:41:53.220 | but because the decisions that we make
01:41:55.340 | have big implications for their lives.
01:41:58.100 | The same is true for ours.
01:41:59.760 | You could say that the lack of civil and political rights
01:42:04.180 | in China is up to the Chinese,
01:42:08.660 | not even the people, 'cause they have no say,
01:42:10.340 | but to their government.
01:42:12.620 | Even though they weren't democratically elected,
01:42:14.260 | they are recognized as the government.
01:42:17.660 | But some significant percentage of the 15 million people
01:42:22.660 | now dead from COVID are dead
01:42:26.000 | because in the earliest days following the outbreak,
01:42:29.660 | whatever the origin,
01:42:31.560 | the voices of people sounding the alarm were suppressed.
01:42:35.260 | That the Chinese government had,
01:42:37.220 | just like in Chernobyl,
01:42:38.220 | the Chinese government had a greater incentive
01:42:41.200 | to lie to the international community
01:42:44.020 | than to tell the truth.
01:42:46.580 | And everybody was incentivized
01:42:49.260 | to pretty much do the wrong thing.
01:42:51.900 | And so that's why I think one of the big messages
01:42:55.180 | of this pandemic is that all of our fates
01:42:57.600 | are tied to everybody else's fates.
01:43:00.260 | And so while we can say and should say,
01:43:02.740 | well, let's focus on our own communities and our countries,
01:43:06.460 | we're all stakeholders in what happens elsewhere.
01:43:09.940 | - Can I ask you a weird question?
01:43:14.940 | So I'm gonna do a few podcast interviews
01:43:19.980 | with interesting people in Russia,
01:43:23.020 | in the Russian language, 'cause I could speak Russian.
01:43:25.720 | And a lot of those people have,
01:43:29.180 | you know, are not usually speaking
01:43:32.500 | in these kinds of formats.
01:43:36.580 | Do you think it's possible to interview Shi Zhengli?
01:43:40.460 | Do you think it's possible to interview somebody like her
01:43:44.500 | or anyone in the Chinese government?
01:43:47.320 | - I think not.
01:43:48.660 | And I think the reason is because I think they would,
01:43:53.900 | one, be uncomfortable being in any environment
01:43:57.620 | where really unknown questions will be asked.
01:44:02.060 | And I actually, I was, so as you know, on this topic,
01:44:06.060 | the Chinese, as I mentioned earlier,
01:44:07.380 | the Chinese government has a gag order on Chinese scientists.
01:44:10.820 | They can't speak without prior government approval.
01:44:13.020 | Shi Zhengli has been able to speak,
01:44:15.580 | and she's spoken at a number of forums.
01:44:17.180 | I mentioned this Rutgers event.
01:44:19.940 | - What was the nature of that forum, the Rutgers event?
01:44:23.100 | - All of them were kind of science conversations
01:44:26.340 | about the pandemic, including the origins issue.
01:44:33.660 | But I think that she, in her response to my question,
01:44:37.100 | it was kind of this funny thing.
01:44:38.700 | So they had this event organized by Rutgers,
01:44:42.700 | and I went on, it was an online event on Zoom,
01:44:45.700 | but I got on there, and I just realized
01:44:47.920 | it was very poorly organized.
01:44:49.500 | Like normally the controls that you would have
01:44:51.780 | about who gets to chat to who, who gets to ask questions,
01:44:54.360 | none of them were set.
01:44:56.940 | And so I kind of couldn't believe it.
01:44:58.420 | I was just sitting at home in my neon green fleece,
01:45:02.240 | and I just started sending chat messages to Shi Zhengli.
01:45:06.580 | - So you could, anybody could send--
01:45:08.180 | - Anybody could, it was insane.
01:45:09.780 | But I thought, wow, this is incredible.
01:45:11.700 | And so then it was unclear who got to ask questions.
01:45:16.060 | And so I was like posting questions,
01:45:18.300 | and then I was sending chats to the organizers of the event
01:45:21.380 | saying I really have a question.
01:45:23.500 | And first they said, well, you can submit your questions,
01:45:27.380 | and we'll have submitted questions,
01:45:29.240 | and then if we have time, we'll open up.
01:45:31.180 | So I just, I mean, I just thought, well, what the hell?
01:45:33.040 | I just sent messages to everybody.
01:45:34.740 | And then the event was already done.
01:45:37.160 | They were 15 minutes over time.
01:45:39.440 | And then they said, all right,
01:45:40.460 | we have time just for one question, and it's Jamie Metzl.
01:45:44.880 | And like I said, I'm sitting there in my running clothes.
01:45:47.240 | Like I wasn't, I was like multitasking,
01:45:49.220 | and I heard my name.
01:45:50.440 | And so I went diving back, and I asked this question
01:45:55.440 | about did you know all of the work
01:45:59.280 | that was happening at the Wuhan Institute of Virology,
01:46:03.080 | not just your work?
01:46:06.320 | And can you confirm that US intelligence has said
01:46:10.560 | that the military played a role,
01:46:14.200 | it was engaged with the Wuhan Institute of Virology,
01:46:16.760 | do you deny that the Chinese military was involved
01:46:19.160 | in any way with the Wuhan Institute of Virology?
01:46:21.720 | And as I said before, she said, this is crazy.
01:46:24.780 | Absolutely not.
01:46:26.280 | It actually got, that one question got covered in the media
01:46:29.040 | 'cause it was like, I think an essential question.
01:46:32.320 | But I just think that since then, to my knowledge,
01:46:35.280 | she's not been in any public forums.
01:46:38.320 | But that's why most people would be shocked
01:46:41.220 | that to date there has been no comprehensive
01:46:43.700 | international investigation into pandemic origins.
01:46:46.640 | There is no whistleblower provision.
01:46:49.040 | So if you're, my guess is there are at least tens,
01:46:52.800 | maybe hundreds of people in China
01:46:55.240 | who have relevant information about the origins
01:46:57.760 | of the pandemic who are terrified and don't dare share it.
01:47:01.840 | And let's just say somebody wanted
01:47:04.400 | to get that information out, to send it somewhere.
01:47:08.520 | There's no official address.
01:47:10.480 | The WHO doesn't have that.
01:47:12.960 | Nobody has that.
01:47:13.960 | And so I would love, I mean, you may as well ask.
01:47:16.720 | I don't think it's likely that there'll be a yes,
01:47:21.160 | but it could well be that there are defectors
01:47:23.720 | who will want to speak.
01:47:25.440 | So let me also push back.
01:47:28.920 | So one, I want to ask if the language barrier is a thing.
01:47:33.440 | Because I've talked, so I understand Russian culture,
01:47:37.760 | I think, or not understand.
01:47:39.480 | (laughs)
01:47:41.560 | I don't understand basically anything in this world.
01:47:45.480 | But I mean, I hear the music that is Russian culture,
01:47:50.440 | and I enjoy it.
01:47:51.840 | I don't hear that music for Chinese culture.
01:47:55.360 | It's just not something I've experienced.
01:47:57.000 | So it's a beautiful, rich, complex culture.
01:48:00.160 | And from my sense, it seems distant to me.
01:48:04.920 | Like I, like whenever I look,
01:48:08.560 | even like we mentioned offline Japan and so on,
01:48:12.240 | I probably don't even understand Japanese culture.
01:48:14.400 | I believe I kind of do
01:48:15.760 | 'cause I did martial arts my whole life,
01:48:17.280 | but even that, it's just so distant.
01:48:20.160 | People who've lived in Japan, foreigners,
01:48:22.040 | for like 20 years say the exact same thing.
01:48:24.760 | Yeah.
01:48:25.600 | - This makes me sad.
01:48:26.440 | It makes me sad 'cause I can't,
01:48:27.840 | I will never be able to fully appreciate the literature,
01:48:31.920 | the conversations, the people,
01:48:35.120 | the little humor and the subtleties.
01:48:38.160 | And those are all essential to understand
01:48:40.000 | even this cold topics of science.
01:48:43.640 | - Yeah.
01:48:44.480 | - 'Cause all of that is important to understand.
01:48:46.200 | So that's a question for me
01:48:47.440 | if you think language barrier is a thing.
01:48:49.800 | But the other thing I just wanna kinda comment on
01:48:52.120 | is the criticism of journalism
01:48:57.120 | that somebody like Shi Zhengli or even Xi Jinping,
01:49:02.840 | just anybody in China,
01:49:05.880 | would be very skeptical to have real conversations
01:49:08.760 | with anybody in the Western media.
01:49:11.120 | - Yeah.
01:49:11.960 | - 'Cause it's like, what are the odds
01:49:14.400 | that they will try to bring out
01:49:17.480 | the beautiful ideas in the person?
01:49:19.600 | And honestly, just this is a harsh criticism.
01:49:23.480 | I apologize, but I kind of mean it.
01:49:26.080 | Is the journalists that have some
01:49:31.200 | of these high profile conversations
01:49:34.040 | often don't do the work.
01:49:36.800 | They come off as not very intelligent.
01:49:39.480 | And I know they're intelligent people.
01:49:41.360 | They have not done the research.
01:49:43.480 | They have not come up and like read a bunch of books.
01:49:46.520 | They have not even read the Wikipedia article,
01:49:48.880 | meaning put in the minimal effort to empathize,
01:49:52.480 | to try to understand the culture of the people,
01:49:54.800 | all the complexities, all the different ideas in the spaces,
01:49:58.760 | do all the incredible, not all,
01:50:00.960 | but some of the incredible work that you've done initially.
01:50:04.240 | Like that, you have to do that work to earn the right
01:50:07.160 | to have a deep, real conversation with some of these folks.
01:50:11.240 | And it's just disappointing to me
01:50:13.400 | that journalists often don't do that work.
01:50:15.320 | - Yeah, so on that,
01:50:16.840 | just first I completely agree with you.
01:50:19.640 | I mean, there is just an incredible beauty
01:50:22.600 | in Chinese culture and I think all cultures,
01:50:25.240 | but certainly China has such a deep and rich history,
01:50:29.120 | amazing literature and art and just human beings.
01:50:34.120 | I mean, I'm a massive critic of the Chinese government.
01:50:38.000 | I'm very vociferous about what really is a genocide in Xinjiang,
01:50:42.720 | the absolute effort to destroy Tibetan culture,
01:50:46.560 | the destruction of democracy in Hong Kong,
01:50:51.040 | incredibly illegal efforts to seize
01:50:54.160 | basically the entire South China Sea.
01:50:56.520 | And I could go on and on and on,
01:50:59.640 | but Chinese culture is fantastic.
01:51:02.680 | And I can't speak to every technical field,
01:51:05.720 | but just in terms of having journalists,
01:51:09.000 | and I'll speak to American journalists,
01:51:10.560 | people like Peter Hessler,
01:51:11.800 | who have really invested the time to live in China,
01:51:16.200 | to learn the language, learn the culture.
01:51:19.480 | Peter himself, who is maybe one of our best journalists
01:51:22.920 | covering China from a soul level,
01:51:26.080 | he was kicked out of China.
01:51:27.920 | So it's very, very difficult.
01:51:30.600 | - It's tough.
01:51:31.440 | - Yeah, it's really.
01:51:32.280 | And so for me, you talked about my website
01:51:35.520 | on pandemic origins.
01:51:36.640 | So when I launched it, I had it,
01:51:38.920 | I'm not a Chinese speaker,
01:51:40.520 | but I had the entire site translated into Chinese
01:51:44.120 | and I have it up on my website,
01:51:47.160 | just because I felt like, well, if somebody,
01:51:49.320 | I mean, the great firewall makes it very, very difficult
01:51:52.840 | for people in China to access that kind of information.
01:51:56.320 | I figured if somebody gets there
01:51:58.520 | and they wanna have it in their own language,
01:52:02.000 | but it's hard because the Chinese government
01:52:04.320 | is represented by these quote unquote wolf warriors,
01:52:08.560 | which is, it's like these basic ruffians.
01:52:11.840 | And I personally was condemned by name
01:52:15.440 | by the spokesman of the Chinese foreign ministry
01:52:17.560 | from the podium in Beijing.
01:52:20.680 | And so it's really hard because I absolutely think
01:52:24.160 | the American people and the Chinese people,
01:52:27.800 | I mean, maybe all people, but we have so much in common.
01:52:31.280 | I mean, yes, China is an ancient civilization,
01:52:36.200 | but they kind of wiped out their own civilization
01:51:38.880 | in the Great Leap Forward and Cultural Revolution.
01:52:41.240 | They burned their scrolls, they smashed their artworks.
01:52:44.560 | And so it's a very young society,
01:52:47.440 | kind of like America is a young society.
01:52:50.600 | So we have a lot in common.
01:52:52.760 | And if we just kind of got out of our own ways,
01:52:57.040 | we could have a beautiful relationship,
01:52:59.640 | but there's a lot of things that are happening.
01:53:01.960 | Certainly the United States feels responsible
01:53:04.120 | to defend the post-war international order
01:53:07.280 | that past generations helped build.
01:53:09.760 | And I'm a certain believer in that,
01:53:11.760 | and China is challenging that.
01:53:14.440 | And the Chinese government,
01:53:16.560 | and they've shared that view with the Chinese people,
01:53:19.400 | feel that they haven't been adequately respected.
01:53:21.760 | And now they're building a massive nuclear arsenal
01:53:25.960 | and all these other things
01:53:27.240 | to try to position themselves in the world
01:53:29.800 | with an articulated goal
01:53:31.120 | of being the lead country in the world.
01:53:33.000 | And that puts them at odds with the United States.
01:53:34.840 | So there are a lot of real reasons
01:53:37.520 | that we need to be honest about for division.
01:53:39.920 | But if that's all we focus on,
01:53:42.440 | if we don't say that there's another side of the story
01:53:45.640 | that brings us together,
01:53:47.640 | we'll put ourselves on an inevitable glide path
01:53:51.280 | to a terrible outcome.
01:53:52.400 | - What do you make of Xi Jinping?
01:53:55.800 | So two questions.
01:53:57.880 | So one in general,
01:53:59.360 | and two more on lab leak and his meeting
01:54:01.760 | with our president Biden in discussion of lab leak.
01:54:07.520 | - Yeah, so I feel that Xi Jinping
01:54:10.720 | has a very narrow, articulated goal
01:54:15.720 | of establishing China as the lead country in the world
01:54:20.160 | by the 100th anniversary of the founding
01:54:22.920 | of the modern Chinese state.
01:54:26.400 | And it's ruthless and it's strategic.
01:54:29.760 | There's a great book called "The Long Game" by Rush Doshi,
01:54:33.720 | who's actually now working in the White House
01:54:36.720 | about this pretty clearly articulated goal
01:54:41.400 | to subvert the post-war international order
01:54:45.080 | in China's interest.
01:54:47.240 | And maybe every leader wants to organize the world
01:54:50.600 | around their interests,
01:54:51.480 | but I feel that his vision of what that entails
01:54:56.360 | is not one that I think is shareable
01:54:59.120 | for the rest of the world.
01:55:00.080 | I mean, the strength of the United States
01:55:01.640 | with all of our flaws is particularly
01:55:04.520 | in that post-war period,
01:55:06.880 | we put forward a model that was desirable
01:55:10.320 | to a lot of people.
01:55:11.160 | Certainly it was desirable to people in Western Europe
01:55:13.680 | and then Eastern Europe and Japan and Korea.
01:55:17.000 | Doesn't mean it's perfect.
01:55:18.840 | The United States is deeply flawed.
01:55:21.560 | As articulated to date,
01:55:23.560 | I don't think most people and countries
01:55:26.840 | would like to live in a Sinocentric world.
01:55:30.320 | And so I certainly, as I mentioned before,
01:55:32.480 | I'm a huge critic of what Xi Jinping is doing,
01:55:35.000 | the incredible brutality in Xinjiang,
01:55:39.240 | in Tibet and elsewhere.
01:55:42.600 | - Yeah, the censorship one,
01:55:44.200 | it gives me a lot of trouble.
01:55:47.920 | On the science realm and in just in journalism
01:55:51.120 | and just the world,
01:55:52.280 | it prevents us from having conversations with each other.
01:55:56.200 | Do you know about the Winnie the Pooh thing?
01:55:58.680 | - Yes.
01:55:59.600 | I mean, it's ridiculous.
01:56:00.800 | So to me, that's such a good illustration
01:56:03.680 | of censorship being petty.
01:56:07.400 | - The censorship has to be petty
01:56:09.720 | because the goal of censorship,
01:56:12.160 | as maybe you experienced in the Soviet Union,
01:56:14.720 | is to get into your head.
01:56:16.520 | Like if it's just censorship,
01:56:17.840 | like you say down with the state
01:56:20.720 | and like you can't say that,
01:56:22.720 | but you can say all the other things up to that point,
01:56:26.200 | eventually people will feel empowered
01:56:28.520 | to say down with the state.
01:56:29.600 | And so I think the goal of this kind
01:56:31.800 | of authoritarian censorship
01:56:33.760 | is to turn you into the censor.
01:56:36.200 | And so the--
01:56:38.120 | - Like self-censorship, whatever.
01:56:39.440 | - Because they almost have to have you think,
01:56:41.480 | well, if I'm gonna make any criticism,
01:56:43.560 | maybe they're gonna come and get me.
01:56:45.880 | So it's safer to not do it.
01:56:48.560 | I mean, I've traveled through North Korea pretty extensively
01:56:51.360 | and I've seen that in its ultimate form,
01:56:53.480 | but that's what they're trying to do in China too.
01:56:56.760 | - Yeah, so for people who are not familiar,
01:57:00.080 | it's such a clear illustration
01:57:01.680 | of just the pettiness of censorship and leaders,
01:57:05.160 | the corrupting nature of power.
01:57:07.200 | But there's a meme of Xi Jinping
01:57:10.880 | with I guess Barack Obama.
01:57:13.880 | And the meme is that he looks like Winnie the Pooh
01:57:18.400 | in that picture.
01:57:20.800 | And the point was that the president, Xi Jinping,
01:57:25.200 | looks like Winnie the Pooh.
01:57:26.360 | And I guess that became, because that got censored,
01:57:30.040 | or like mentions of Winnie the Pooh
01:57:31.560 | got censored all across China.
01:57:33.800 | Winnie the Pooh became the unknowing revolutionary hero
01:57:38.240 | that represents freedom of speech and so on.
01:57:41.600 | But it's just so absurd,
01:57:44.840 | 'cause we spent so much time in this conversation
01:57:47.840 | talking about the censorship,
01:57:49.760 | that's a little bit more understandable to me,
01:57:52.440 | which is like, we messed up.
01:57:55.360 | And it wasn't, maybe it's almost understandable errors
01:57:59.720 | that happen in the progress of science.
01:58:02.720 | I mean, you could always argue
01:58:05.560 | that there's a lot of mistakes along the way
01:58:09.200 | and the censorship along the way caused the big mistake.
01:58:11.960 | You can argue that same way for the Chernobyl,
01:58:14.320 | but those are sort of understandable and difficult topics.
01:58:18.320 | Like Winnie the Pooh.
01:58:19.800 | - But in your message, it shows both sides of the story.
01:58:22.160 | I mean, one, how petty authoritarian censors
01:58:25.240 | have to be, and that's why the messaging
01:58:28.280 | from the Chinese government is so consistent.
01:58:30.640 | No matter who you are,
01:58:32.480 | you have to be careful what you say.
01:58:33.960 | And that's why it's the story of Peng Shuai,
01:58:37.200 | the tennis player, she dared raise her voice
01:58:41.280 | in an individual way.
01:58:42.360 | Jack Ma, the richest man in China,
01:58:46.840 | had a minor criticism of the Chinese government.
01:58:50.440 | He had basically disappeared from the public eye.
01:58:54.600 | Fan Bingbing, who's like one of the leading
01:58:57.560 | Chinese movie stars, she was seen as not loyal enough
01:59:02.200 | and she just vanished.
01:59:03.360 | And so the message is no matter who you are,
01:59:06.280 | no matter what level, if you don't mind everything you say,
01:59:11.280 | you could lose everything.
01:59:12.680 | - I'm pretty hopeful, optimistic about a lot of things.
01:59:15.160 | And so for me, if the Chinese government stays
01:59:18.920 | with its current structure, I think what I hope
01:59:23.360 | they start fixing is the freedom of speech.
01:59:26.600 | - But they can't.
01:59:27.440 | I mean, the thing is, if they open up freedom of speech,
01:59:32.240 | really in a meaningful way, they can't maintain
01:59:36.160 | their current form of government.
01:59:38.360 | And it's connected, as I was saying before,
01:59:40.320 | to the origins of the pandemic.
01:59:42.520 | And if my hypothesis was right, that was the big choice
01:59:46.440 | that the national government had.
01:59:48.880 | Do we really investigate the origins of the pandemic?
01:59:51.800 | Do we deliver a message that transparency is required?
01:59:55.640 | Public transparency is required from local officials?
01:59:58.880 | If they do that, the entire system collapses.
02:00:02.080 | Pretty much everybody in China has a relative
02:00:06.640 | who has died as a result of the actions
02:00:09.840 | of the Communist Party, particularly
02:00:11.440 | in the Great Leap Forward.
02:00:12.640 | Nearly 50 million people died as a result
02:00:15.840 | of Mao's disastrous policies.
02:00:18.600 | And yet, why is Mao's picture still on Tiananmen Square
02:00:21.800 | and it's on the money?
02:00:23.560 | Because maintaining that fiction is the foundation
02:00:27.480 | of the legitimacy of the Chinese state.
02:00:29.920 | If people were allowed, just say what you want.
02:00:32.320 | Do you really think Mao was such a great guy,
02:00:35.200 | even though your own relatives are dead as a result?
02:00:39.320 | Do you really buy, even on this story
02:00:43.880 | that China did nothing wrong, even though
02:00:45.800 | in the earliest days of the pandemic,
02:00:48.240 | these two, at least, Chinese scientists themselves
02:00:51.160 | courageously issued a preprint paper that was later almost
02:00:56.760 | certainly forcibly retracted saying,
02:00:59.000 | well, this looks like this comes from one of the Wuhan labs
02:01:02.240 | that we're studying.
02:01:03.360 | Like, if you opened up that window,
02:01:07.120 | I think that the Chinese government would not
02:01:10.760 | be able to continue in its current form.
02:01:12.520 | And that's why they cracked down at Tiananmen Square.
02:01:15.000 | That's why with Peng Shuai, the tennis player,
02:01:17.880 | if they had let her accuse somebody
02:01:21.080 | from the Communist Party of sexual assault,
02:01:24.760 | and they said, OK, now people, you can use social media
02:01:28.640 | and you can have your Me Too moment
02:01:30.680 | and let us know who in the Chinese Communist Party
02:01:34.640 | or your boss in a business has assaulted you,
02:01:37.520 | just like in every society, I'm sure there's tons of women
02:01:40.060 | who've been sexually assaulted, manipulated, abused by men.
02:01:45.520 | And so I certainly hope that there
02:01:48.880 | can be that kind of opening.
02:01:51.200 | But if I were an authoritarian dictator,
02:01:54.320 | that's the thing I would be most afraid of.
02:01:56.640 | - Yeah, dictator perhaps.
02:01:58.000 | But I think you can gradually increase the freedom of speech.
02:02:01.280 | So I think you can maintain control over the freedom
02:02:04.800 | of press first.
02:02:06.120 | So control the press more, but let the lower levels sort
02:02:11.160 | of open up YouTube, right?
02:02:13.560 | Open up where individual citizens can make content.
02:02:16.960 | I mean, there's a lot of benefits to that.
02:02:19.080 | And then from an authoritarian perspective,
02:02:22.720 | you can just say that's misinformation,
02:02:25.560 | that's conspiracy theories, all those kinds of things.
02:02:28.400 | But at least I think if you open up that freedom of speech
02:02:32.520 | at the level of the individual citizen,
02:02:34.560 | that's good for entrepreneurship,
02:02:37.960 | for the development of ideas, of exchange of ideas,
02:02:40.520 | all that kind of stuff.
02:02:41.400 | I just think that increased the GDP of the country.
02:02:44.000 | So I think there's a lot of benefits.
02:02:46.040 | I feel like you can still play,
02:02:48.320 | we're playing some dark thoughts here,
02:02:50.760 | but I feel like you could still play the game of thrones,
02:02:54.560 | still maintain power while giving freedom to the citizenry.
02:02:59.560 | I think just like with North Korea is a good example
02:03:04.200 | of where cracking down too much
02:03:07.320 | can completely destroy your country.
02:03:10.360 | Like there's some balance you can strike in your evil mind
02:03:14.520 | and still maintain authoritarian control over the country.
02:03:17.360 | Obviously, it's not obvious,
02:03:20.960 | but I'm a big supporter of freedom of speech.
02:03:24.040 | I mean, it seems to work really well.
02:03:26.520 | I don't know what the failure cases
02:03:27.960 | for freedom of speech are.
02:03:29.280 | Probably we're experiencing them with Twitter
02:03:32.760 | and like where the nature of truth
02:03:34.280 | is being completely kind of flipped upside down.
02:03:38.920 | But it seems like on the whole ability to defeat lies
02:03:43.920 | with more, not through censorship,
02:03:48.360 | but through more conversations,
02:03:50.760 | more information is the right way to go.
02:03:53.120 | - Can I tell you a little story,
02:03:54.160 | two stories about North Korea?
02:03:56.000 | So a number of years ago,
02:03:57.080 | I was invited to be part of a small six-person delegation
02:04:01.720 | advising the government of North Korea
02:04:04.800 | on how to establish special economic zones
02:04:07.360 | because other countries have used these SEZs
02:04:10.560 | as a way of building their economies.
02:04:13.000 | And when I was invited, I thought,
02:04:15.640 | well, maybe there's an opening.
02:04:17.880 | And I certainly believe in that.
02:04:19.960 | So we flew to China,
02:04:21.880 | crossed the border into North Korea,
02:04:24.520 | and then we were met by our partners
02:04:27.360 | from the North Korean Development Organization.
02:04:30.040 | And then we zigzagged the country for almost two weeks,
02:04:33.960 | visiting all these sites
02:04:35.600 | where they intended
02:04:38.080 | to create these special economic zones.
02:04:39.880 | And in each site, they had their local officials
02:04:42.720 | and they had a map and they showed us
02:04:44.840 | where everything that was going to be built.
02:04:47.120 | And the other people who were like really technical experts
02:04:50.360 | on how to set up a special economic zone,
02:04:52.160 | they were asking questions,
02:04:53.160 | well, like, should you put the entrance over here
02:04:55.800 | or shouldn't you put it over there?
02:04:57.000 | And what if there's flooding?
02:04:58.520 | And I kept asking just these basic questions,
02:05:00.520 | like, what do you think you're gonna do here?
02:05:03.000 | Why do you think you can be competitive?
02:05:05.280 | Do you know anything about who you're competing against?
02:05:07.880 | Are you empowering your workers to innovate
02:05:10.600 | because everybody else is innovating?
02:05:12.600 | So at the end of the trip, they flew us to Pyongyang
02:05:14.880 | and they put us in this,
02:05:15.840 | it looked kind of like the United Nations.
02:05:17.440 | They probably had 500 people there.
02:05:20.200 | And I gave a speech to them.
02:05:22.920 | I obviously was in English and it was translated.
02:05:26.520 | And I figured, you know, I've come all this way,
02:05:29.920 | I'm just gonna be honest.
02:05:30.880 | If they arrest me for being honest, that's on them.
02:05:34.680 | And I said, I'm here because I believe
02:05:37.960 | we can never give up hope
02:05:39.640 | that we always have to try to connect.
02:05:41.400 | I'm also here because I think that North Korea
02:05:45.360 | connecting to the world economy
02:05:46.960 | is an important first step.
02:05:50.200 | But having visited all of your special economic zone sites
02:05:52.640 | and having met with all of your, or many of your officials,
02:05:56.040 | I don't think your plan has any chance of succeeding
02:05:59.360 | because you're trying to sell into a global market,
02:06:03.600 | but you need to have market information that,
02:06:07.520 | and I gave examples of GE and others,
02:06:10.880 | that the innovation can't only happen at one place.
02:06:14.600 | And if you want innovation to happen
02:06:17.760 | from the people who are doing this,
02:06:19.320 | you have to empower them.
02:06:20.520 | They have to have access.
02:06:22.360 | They have to have voice.
02:06:23.480 | I mean, nobody, I mean,
02:06:27.560 | the people after they kind of had to condemn me
02:06:30.120 | because what I was saying was challenging.
02:06:32.560 | So I certainly agree with you.
02:06:33.840 | And then just one side story of them that night,
02:06:36.760 | and it was just kind of bizarre
02:06:38.960 | 'cause North Korea is, it's so desperately poor,
02:06:42.360 | but they were trying to impress us.
02:06:44.360 | And so we had these embarrassingly sumptuous banquets.
02:06:49.240 | And so for our final dinner that night,
02:06:51.560 | really it looked like something from "Beauty and the Beast."
02:06:54.680 | I mean, it was like china and waiters in tuxedos,
02:06:59.480 | and they had this beautiful dinner.
02:07:02.080 | And then afterwards,
02:07:04.240 | because we'd now spent two weeks
02:07:05.560 | with our North Korean partners,
02:07:07.040 | they brought out this karaoke machine
02:07:08.960 | and our North Korean counterparts,
02:07:11.040 | they sang songs to us in Korean.
02:07:15.240 | And so I said, "Well, we wanna reciprocate.
02:07:18.080 | Do you have any English songs on your karaoke machine?"
02:07:21.240 | It's North Korea, obviously they didn't.
02:07:23.320 | But there was, I said, "Well, I have an idea."
02:07:25.560 | And so there was one of the women
02:07:27.120 | who'd been part of the North Korean delegation.
02:07:30.200 | She was able just to play the piano,
02:07:32.640 | just like you could hum a tune
02:07:34.360 | and she could play it on the piano.
02:07:36.160 | And so I said, "All right, here's this tune,"
02:07:39.480 | which I whispered in her ear,
02:07:40.900 | "When I give you the signal,
02:07:42.440 | just play this tune over and over."
02:07:45.400 | And so I got these, I mean, there were the six of us
02:07:48.200 | and maybe 20 North Koreans, and we are all in a circle.
02:07:51.080 | I said, "All right, everybody hold hands."
02:07:52.880 | And then put your right,
02:07:54.920 | just try put your right foot in front of your left
02:07:57.600 | and then left foot in front of the right going sideways.
02:08:00.560 | And I said, "All right, hit it."
02:08:02.480 | And she played a North Korean version of "Hava Nagila."
02:08:06.800 | And I think it was the first and only hora
02:08:09.400 | that they've ever done in North Korea.
02:08:11.720 | - That's hilarious. - I survived to tell the tale.
02:08:13.000 | - Was this recorded or no?
02:08:14.200 | - It was not. - Oh, no.
02:08:15.600 | - Yeah, if they had free YouTube,
02:08:18.160 | this would have been a big one.
02:08:21.520 | Let's return to the beginning and just,
02:08:23.920 | patient zero.
02:08:27.000 | It's kind of always incredible to think
02:08:30.760 | that there's one human at which it all started.
02:08:33.840 | Who do you think was patient zero?
02:08:38.760 | Do you think it was somebody that worked
02:08:42.640 | at Wuhan Institute of Virology?
02:08:47.400 | Do you think it was,
02:08:50.280 | do you think there was a leak of some other kind
02:08:52.440 | that led to the infection?
02:08:55.040 | What do we know?
02:08:55.920 | Because there's this December 8th/December 16th case
02:08:59.800 | of, maybe you can describe what that is.
02:09:04.400 | And then there's like, what's his name?
02:09:09.400 | Michael Worobey has a nice timeline.
02:09:12.960 | I'm sure you have a timeline.
02:09:14.480 | But he has a nice timeline that puts the average
02:09:17.880 | at like November something, like November 16th or 18th,
02:09:22.240 | as the average estimate for when the patient zero
02:09:27.200 | got infected, when the first human infection happened.
02:09:30.320 | - Yeah, so just two points.
02:09:32.080 | One is, it may be that there's infectee zero
02:09:36.520 | and patient zero.
02:09:37.640 | It could be that the first person infected was asymptomatic
02:09:41.240 | 'cause we know there's a lot of people who are asymptomatic.
02:09:44.200 | And then there's the question of, well, who is patient zero?
02:09:47.600 | Meaning the first person to present themselves
02:09:51.000 | in some kind of health facility
02:09:53.560 | where that diagnosis could be made.
02:09:56.400 | - So can we actually linger on that definition?
02:09:58.640 | - Yeah.
02:09:59.480 | - So is that to you a good definition of patient zero?
02:10:02.600 | Okay, there's a bunch of stuff here
02:10:04.560 | 'cause this virus is weird.
02:10:06.640 | So one is who gets infected, one who is infectious,
02:10:11.120 | or the first person to infect others?
02:10:14.400 | - Yeah, so--
02:10:15.240 | - And who shows up to a hospital,
02:10:17.320 | that would be sick. - Yeah, so I think
02:10:18.160 | that's why I'm calling the first person
02:10:19.480 | to show up to a hospital who's diagnosed with COVID-19,
02:10:22.600 | I'm calling that person patient zero.
02:10:24.440 | There's also, there is somewhere,
02:10:27.800 | the first person to be infected.
02:10:30.600 | And that person maybe never showed up in a hospital
02:10:33.680 | because maybe they were asymptomatic and never got sick.
02:10:37.840 | So let me start with what I'm calling infectee zero.
02:10:41.360 | Here are some options.
02:10:42.560 | I talked before about some person who was a villager
02:10:47.200 | in some remote village.
02:10:48.600 | It's almost impossible to imagine,
02:10:51.000 | but possible to imagine because strange things happen.
02:10:55.080 | And that person somehow gets to Wuhan.
02:10:58.160 | - By the way, just to steal a minute,
02:11:00.160 | that argument, it's not an argument, it's a statement,
02:11:02.920 | but strange things happen all the time.
02:11:06.440 | - No, I agree.
02:11:07.640 | It doesn't mean that logic doesn't apply
02:11:10.120 | and probabilities don't apply,
02:11:11.520 | but we all, I mean, in general principle,
02:11:15.040 | everyone, if we were honest,
02:11:17.400 | should be agnostic about everything.
02:11:20.000 | Like I think I'm Jamie, but is there a 0.01% chance
02:11:24.480 | or 0.001% chance that I'm not?
02:11:27.200 | But it could be.
02:11:28.200 | I mean, how would I know?
02:11:29.040 | - But there's a large number of people arguing
02:11:30.400 | about the meaning of the word I,
02:11:31.920 | and that I'm Jamie. - Yeah, exactly.
02:11:33.120 | So exactly. - What is conscious?
02:11:34.320 | - Exactly, exactly.
02:11:35.200 | So we could spend another three hours going into that one.
02:11:39.200 | So one possibility is there's some remote villager.
02:11:42.160 | Another possibility is there's somehow, bizarrely,
02:11:47.160 | there are these infected animals
02:11:49.080 | that come from Southern China, most likely.
02:11:51.900 | Maybe there's only one of them that's infected,
02:11:55.840 | which how could that possibly be?
02:11:58.120 | And it's only sent to Wuhan.
02:11:59.880 | It's not sent anywhere else,
02:12:02.840 | to any of the markets there or whatever,
02:12:04.200 | and then maybe somebody in a market is infected.
02:12:06.800 | That's one remote possibility, but a possibility.
02:12:10.440 | Another is that researchers
02:12:13.280 | from the Wuhan Institute of Virology
02:12:15.680 | go down to Southern China.
02:12:17.560 | We haven't talked about it yet, but in 2012,
02:12:20.360 | there were six miners sent into a copper mine
02:12:24.120 | in Southern China in Yunnan province.
02:12:26.420 | All of them got very sick
02:12:28.120 | with what now appear like COVID-19 like symptoms.
02:12:31.840 | Half of them died.
02:12:34.320 | Blood samples from them were taken
02:12:37.640 | to the Wuhan Institute of Virology and elsewhere.
02:12:40.800 | And then after that,
02:12:42.280 | there were multiple site visits to that mine
02:12:48.320 | collecting viral samples that were brought
02:12:51.480 | to the Wuhan Institute of Virology.
02:12:53.660 | Included among those samples
02:12:57.080 | was this now infamous RaTG13 virus,
02:13:00.220 | which is among the genetically closest viruses to SARS-CoV-2.
02:13:04.960 | There were other, nine other or eight other viruses
02:13:08.480 | that were collected from that mine
02:13:10.080 | that were presumably very similar to that.
02:13:12.880 | And again, we have no access to the information
02:13:15.680 | about those and many of the other,
02:13:18.960 | most, almost all of the other viruses.
02:13:20.800 | So could it be that one of the people
02:13:25.100 | who was sent from the Wuhan Institute of Virology
02:13:27.360 | or the Wuhan Centers for Disease Control,
02:13:30.680 | they went down there to collect
02:13:32.200 | and they got infected asymptomatically
02:13:34.480 | and brought it back?
02:13:35.560 | Could it be that they were working on these viruses
02:13:38.360 | in the laboratory and there was an issue with waste disposal
02:13:42.800 | and we know that the Wuhan CDC
02:13:44.640 | had a major problem with waste disposal.
02:13:47.640 | And just before the pandemic,
02:13:49.520 | one, they put out an RFP to fix their waste disposal.
02:13:54.520 | And in early 2019, they moved to their new site,
02:13:59.720 | which was basically across the street
02:14:01.720 | from the Huanan Seafood Market.
02:14:04.840 | So could there have been an issue of somebody infected
02:14:07.240 | in the lab, or an issue with waste disposal?
02:14:09.520 | Could a laboratory animal have been involved?
02:14:10.960 | There have been such experiences in China.
02:14:12.160 | Actually, China just recently passed a law
02:14:13.840 | saying it's illegal to sell laboratory animals
02:14:17.760 | in the market, because there were scientists,
02:14:20.400 | or at least one scientist, who was selling laboratory animals
02:14:23.800 | in the market and people would just come and buy them.
02:14:27.400 | - It's insane.
02:14:28.720 | So there's so many scenarios,
02:14:31.760 | but if I, again, connect it to my 85% number,
02:14:35.600 | I think in the whole category
02:14:37.560 | of laboratory-related incidents,
02:14:40.480 | whether it's collection, waste,
02:14:42.720 | something connected to the lab,
02:14:44.440 | I think that's the most likely.
02:14:47.400 | But there are other credible people
02:14:49.880 | who would say they think it's not the most likely
02:14:52.680 | and I welcome their views
02:14:54.760 | and we need to have this conversation.
02:14:56.240 | - So in your write-up,
02:14:57.640 | what's the URL?
02:15:00.200 | 'Cause I always find it by doing Jamie Metzl Lab Leak.
02:15:04.320 | That's probably the easiest, just Google that.
02:15:06.280 | - No, no, but if you just go to JamieMetzl.com,
02:15:08.840 | J-A-M-I-E-M-E-T-Z-L.com,
02:15:11.960 | then there's just a thing, it's COVID origins.
02:15:14.800 | - COVID origins.
02:15:16.000 | Or you could just Google Jamie Metzl Lab Leak.
02:15:19.600 | Google's search engine is such a powerful thing.
02:15:23.040 | You mentioned in that write-up
02:15:25.400 | that you don't think,
02:15:27.440 | this could be just me misreading it
02:15:29.080 | or it's just slightly miswritten,
02:15:31.240 | but you don't think that the viruses
02:15:34.880 | from that 2012 mine, which is fascinating,
02:15:38.160 | could be the backbone for SARS-CoV-2.
02:15:40.800 | - So what I mean, just the specific virus,
02:15:43.800 | which I mentioned, RaTG13,
02:15:45.920 | and there's a whole history of that
02:15:48.120 | because it had a different name
02:15:50.440 | and Shi Zhengli provided wrong information
02:15:54.920 | about when it had been sequenced.
02:15:57.200 | I mean, there was a whole issue connected to that.
02:16:00.280 | But the genetic difference,
02:16:02.880 | even though it's 96.2% similar
02:16:07.880 | to the SARS-CoV-2 virus,
02:16:10.100 | that's actually a significant difference,
02:16:12.400 | even though that and a virus called BANAL-52
02:16:17.400 | that was collected in Laos are the two most similar,
02:16:20.200 | there still are differences.
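As a rough back-of-envelope sketch of what that 96.2% figure means in raw nucleotide terms (assuming the commonly cited ~29,903-nucleotide SARS-CoV-2 genome length and a simple per-site comparison; both are assumptions, not figures from the conversation):

```python
# Back-of-envelope: how many nucleotide differences "96.2% similar" implies.
# Assumptions (not from the conversation): a ~29,903-nucleotide SARS-CoV-2
# genome and a per-site comparison that ignores insertions/deletions.

GENOME_LENGTH = 29_903   # approximate SARS-CoV-2 genome length, in nucleotides
IDENTITY = 0.962         # RaTG13 vs. SARS-CoV-2 identity cited in the conversation

differences = round(GENOME_LENGTH * (1 - IDENTITY))
print(f"~{differences} nucleotide differences across {GENOME_LENGTH} sites")
# Prints roughly 1,100 differences -- which is why a 96.2% match still leaves
# RaTG13 far too distant to be a direct backbone of SARS-CoV-2.
```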
02:16:22.320 | So I'm not saying RaTG13 is the backbone,
02:16:25.720 | but is there, I believe there is a possibility
02:16:28.800 | that other viruses that were collected
02:16:32.400 | either in that mine in Yunnan in Southern China
02:16:36.920 | or in Laos or Cambodia,
02:16:39.360 | because that was with the EcoHealth Alliance proposals
02:16:44.240 | and documents,
02:16:45.080 | their plan was to collect viruses in Laos
02:16:49.400 | and Cambodia and elsewhere
02:16:51.200 | and bring them to the Wuhan Institute of Virology
02:16:54.280 | so that there are people,
02:16:55.680 | as a matter of fact,
02:16:56.520 | just when I was sitting here before this message,
02:16:58.720 | I got, before this interview,
02:17:01.040 | I got a message from somebody who was saying,
02:17:03.640 | well, Peter Dezak is telling everybody
02:17:06.080 | that the viral sample, the Binal-52 from Laos
02:17:10.160 | proves that there's not a lab incident origin
02:17:13.760 | of the pandemic,
02:17:14.580 | and it actually doesn't prove that at all
02:17:17.080 | because these viruses were being collected
02:17:20.540 | in places like Laos and Cambodia
02:17:24.120 | and being brought to the Wuhan Institute of Virology.
02:17:28.400 | - Those are like early, early, like the prequel.
02:17:32.800 | So these are, they're not sufficiently similar
02:17:35.760 | to be, to serve as a backbone,
02:17:37.400 | but they kind of tell a story
02:17:38.680 | that they could have been brought to the lab
02:17:40.600 | through several processes,
02:17:43.320 | including genetic modification
02:17:44.960 | or through the natural evolution processes,
02:17:47.560 | accelerated evolution,
02:17:48.520 | they could have arrived at something that has
02:17:51.160 | the spike protein and the cleavage,
02:17:53.960 | the furin cleavage site and all that kind of stuff.
02:17:56.960 | - So what I'm saying is,
02:17:58.560 | the essential point is if we had access,
02:18:02.200 | if we knew everything that was being,
02:18:04.120 | every virus that was being held
02:18:05.960 | at the Wuhan Institute of Virology
02:18:08.120 | and the Wuhan CDC, we had full access.
02:18:10.640 | We had full access to everybody's lab notes.
02:18:13.820 | And we did just the kind of forensic investigation
02:18:16.720 | that has been so desperately required since day one.
02:18:21.360 | We'd be able to say, well, what did you have?
02:18:24.320 | Because if we knew, if it should come out
02:18:26.960 | that the Wuhan Institute of Virology
02:18:28.960 | had in its repository prior to the outbreak,
02:18:32.240 | either SARS-CoV-2 or a reasonable precursor to it,
02:18:36.640 | that would prove the lab incident hypothesis.
02:18:39.480 | In my mind, that's almost certainly why
02:18:41.320 | they are preventing any kind of meaningful investigation.
02:18:46.120 | So my hypothesis is not about what RaTG13 says specifically,
02:18:51.120 | because as I mentioned earlier,
02:18:54.480 | the genomes of these viruses are constantly recombining.
02:18:58.720 | So that what that means is if you have,
02:19:01.880 | you don't have very many total outlier viruses
02:19:06.080 | in a bat community because these viruses
02:19:08.640 | are always mixing and matching with each other.
02:19:11.360 | And so if you have RaTG13,
02:19:13.720 | which is relatively similar to SARS-CoV-2,
02:19:17.560 | there's a pretty decent likelihood
02:19:19.120 | there was other stuff that was collected
02:19:22.680 | at this mine, the Mojiang mine in Yunnan Province,
02:19:27.680 | maybe in Laos and Cambodia.
02:19:30.920 | And that's why we need to have that information.
02:19:34.040 | - Do you think somebody knows who patient zero is
02:19:39.040 | within China? - Yes.
02:19:39.880 | - So do you think that is information?
02:19:40.920 | - Well, there's two things.
02:19:42.080 | One is I think somebody and people probably know.
02:19:45.600 | And then two, it's been incredibly curious
02:19:47.760 | that the best virus chasers in the world are in China
02:19:52.400 | and they are in Wuhan.
02:19:54.480 | And when, I mean, we can talk about this deeply compromised,
02:19:58.640 | now vastly improved World Health Organization process.
02:20:03.160 | But when they went there, the Chinese,
02:20:05.400 | the local and national Chinese authorities say,
02:20:07.400 | oh, we haven't done, we haven't tested the samples
02:20:10.600 | in our blood center.
02:20:11.880 | We haven't done any of this tracing.
02:20:13.560 | And these deeply compromised people
02:20:16.840 | who were part of the international part
02:20:20.840 | of the joint study tour,
02:20:23.840 | when they came out with their,
02:20:25.040 | they had their visit earlier this year
02:20:26.440 | and came out with their report,
02:20:28.040 | they had, in my mind, just an absurd letter
02:20:32.160 | to the editor in Nature saying,
02:20:34.160 | well, if we don't hurry back,
02:20:35.720 | we're not gonna know what happened.
02:20:37.440 | Assuming that the people in China are like bumpkins
02:20:41.400 | who on their own don't know how to trace the origin of a virus.
02:20:44.960 | And the opposite is the case.
02:20:47.400 | So I think there are people in China
02:20:49.960 | who at least know a lot.
02:20:51.920 | They know a lot more than they're saying.
02:20:55.120 | And the best case scenario is the Chinese government
02:20:58.960 | wants to prevent any investigation, including by them.
02:21:03.200 | The worst case scenario is that there are people
02:21:06.480 | who already know.
02:21:07.320 | And that's why, again, my point from day one has been,
02:21:10.280 | we need a comprehensive international investigation
02:21:14.760 | in Wuhan with full access to all relevant records,
02:21:18.240 | samples, and personnel.
02:21:19.320 | When this, again, deeply flawed.
02:21:22.120 | Can I give you a little history of this WHO process?
02:21:27.120 | - Okay.
02:21:28.280 | Who are the, that's funny.
02:21:30.560 | - Who's on first?
02:21:33.520 | - Who's on first?
02:21:34.440 | I'm so funny with the jokes.
02:21:36.360 | Look at me go.
02:21:37.200 | Who are the WHO?
02:21:39.760 | So what is this organization?
02:21:41.440 | What is its purpose?
02:21:43.080 | What role did it play in the pandemic?
02:21:45.960 | It certainly was demonized in the realm of politics.
02:21:49.200 | This is an institution that was supposed to save us
02:21:55.080 | from this pandemic.
02:21:57.080 | A lot of people believe it failed.
02:21:58.960 | Has it failed?
02:22:00.280 | Why did it fail?
02:22:01.680 | And you said it's improving.
02:22:02.920 | How is it improving?
02:22:04.280 | - Great.
02:22:05.120 | All right.
02:22:05.960 | I hope you don't mind.
02:22:06.780 | I'm gonna have to talk for a little bit of extra time.
02:22:08.840 | - I love this.
02:22:09.680 | - Okay.
02:22:10.520 | Good, good, good, good.
02:22:12.080 | So the WHO is an absolutely essential organization
02:22:16.200 | created in 1948 in that wonderful period
02:22:20.560 | after the second world war
02:22:22.200 | when the United States and allied countries
02:22:24.520 | asked the big, bold questions,
02:22:27.200 | how do we build a safer world for everyone?
02:22:30.640 | And so that's the WHO.
02:22:33.320 | If we, although there are many critics of the WHO,
02:22:36.760 | if we didn't have it, we would need to invent it
02:22:39.120 | because the whole nature of these big public health issues
02:22:44.120 | and certainly for pandemics, but all sorts of things
02:22:48.280 | is that they are transnational in nature.
02:22:50.880 | And so we cannot just build moats.
02:22:54.160 | We cannot build walls.
02:22:55.720 | We're all connected to it.
02:22:56.960 | So that's the idea.
02:22:59.560 | There's a political process because the United Nations
02:23:02.760 | and the WHO is part of it,
02:23:04.480 | it exists within a political context.
02:23:08.840 | And so the current director general
02:23:11.640 | of the World Health Organization,
02:23:13.120 | who was just reelected for his second five-year term
02:23:16.880 | is Dr. Tedros Adhanom Ghebreyesus,
02:23:20.200 | who is from Ethiopia, Tigrayan from Ethiopia.
02:23:25.200 | And in full disclosure, I have a lot of respect for Tedros.
02:23:30.220 | Tedros got his job.
02:23:33.260 | He was not America's candidate.
02:23:35.360 | He was not Britain's candidate.
02:23:37.200 | Our candidate was a guy named David Nabarro,
02:23:39.960 | who I also know and have tremendous respect for.
02:23:42.740 | China led the process of putting Tedros in this position.
02:23:48.760 | And in the earliest days of the pandemic,
02:23:54.560 | Tedros, in my view,
02:23:55.960 | even though I have tremendous respect for him,
02:23:57.800 | I think he made a mistake.
02:23:59.320 | The WHO doesn't have
02:24:00.960 | its own independent surveillance network.
02:24:04.300 | It's not organized to have it
02:24:05.800 | and the states have not allowed it.
02:24:07.240 | So it's dependent on member states
02:24:10.320 | for providing it information.
02:24:13.200 | And because it's a poorly funded organization,
02:24:17.120 | dependent on its bosses, who are these governments,
02:24:20.120 | its natural instinct isn't to condemn its bosses.
02:24:24.760 | It's to say, well, let's quietly work with everybody.
02:24:28.300 | Having said that,
02:24:29.440 | the Chinese government knowingly lied to Tedros.
02:24:33.360 | And Tedros in repeating the position
02:24:36.400 | of the Chinese government,
02:24:37.240 | which incidentally I'll say Donald Trump
02:24:39.440 | also did the exact same thing.
02:24:41.160 | Donald Trump had a private conversation with Xi Jinping
02:24:43.960 | and then repeated what Xi had told him.
02:24:48.560 | Both of them were wrong.
02:24:50.440 | Dr. Tedros, I think,
02:24:54.480 | when the Chinese government was lying,
02:24:55.880 | knowingly lying, saying there's no human
02:24:57.960 | to human transmission, Dr. Tedros repeated that.
02:25:01.140 | And even though within the World Health Organization,
02:25:04.860 | there were private critiques saying China
02:25:07.720 | is now doing exactly what it did in SARS-1.
02:25:10.520 | It's not providing access.
02:25:11.960 | It's not providing information.
02:25:13.960 | Tedros' instinct because of his background,
02:25:16.720 | because of his role and wrongly,
02:25:21.040 | was to have a more collaborative relationship with China,
02:25:25.560 | particularly by making assertions
02:25:28.440 | based on the information that was wrong.
02:25:30.360 | - Don't call people liars.
02:25:32.100 | They're not gonna be happy with you.
02:25:33.840 | - They're not gonna be happy.
02:25:34.680 | And the job of the WHO isn't to condemn states.
02:25:38.880 | It's to do the best possible job of addressing problems.
02:25:41.920 | And I think that the culture was,
02:25:43.720 | well, let's do the most that we can.
02:25:45.840 | If we totally alienate China on day one,
02:25:49.260 | we're in even worse shape than if we call them out for--
02:25:52.560 | - I'm not exactly sure, by the way,
02:25:54.260 | maybe you can also steelman that argument.
02:25:58.160 | Like it's not completely obvious
02:26:00.120 | that that's a terrible decision.
02:26:02.880 | Like if you and I were in that role,
02:26:05.480 | we would make that decision.
02:26:06.880 | It's complicated because you want China in your side
02:26:10.700 | to help solve this.
02:26:11.540 | - Right, so I would have made a different decision,
02:26:14.520 | which is why I never would have been selected
02:26:17.560 | as the director general.
02:26:18.680 | There's a selection criteria
02:26:20.640 | that everybody kind of needs to support you.
02:26:24.400 | And so let me just, this is just the beginning.
02:26:27.120 | - Can you also just elaborate or kind of restate,
02:26:30.880 | what were the inaccuracies that you quickly mentioned?
02:26:34.200 | So human to human transmission, what were the things--
02:26:36.720 | - So the most important, there were a few things.
02:26:41.360 | One, China didn't report the outbreak.
02:26:46.360 | Two, they had the sequenced genome of the SARS-CoV-2 virus
02:26:51.400 | and they didn't share it for two critical weeks.
02:26:55.880 | And when they did share it, it was inadvertent.
02:26:59.280 | I mean, there was a very, very courageous scientist
02:27:01.760 | who essentially leaked it
02:27:03.540 | and was later punished for leaking it,
02:27:05.520 | even though the Chinese government is now saying
02:27:07.400 | we were so great by releasing the sequenced genome.
02:27:10.240 | - Wait, I was really confused.
02:27:11.080 | Really?
02:27:11.900 | So I'm so clueless about this as most things.
02:27:15.080 | 'Cause I thought, 'cause there's a celebration of,
02:27:19.360 | isn't this amazing that we got the sequence,
02:27:23.480 | like this amazing, and then the scientific community
02:27:27.160 | across the world stepped up and were able to do a lot
02:27:30.400 | of stuff really quickly with that sharing.
02:27:32.240 | 'Cause I thought the Chinese government shared it.
02:27:34.480 | - No, no, so they sat on it for two weeks.
02:27:37.000 | When they shared it, against their will, it was incredible.
02:27:40.640 | Moderna, 48 hours later after getting the information,
02:27:45.440 | getting the sequenced genome, they had the formulation
02:27:48.180 | for what's now the Moderna COVID-19 vaccine.
02:27:51.960 | But that's two critical weeks.
02:27:55.240 | In those early days, they blocked
02:27:58.680 | the World Health Organization from sending its experts
02:28:02.240 | to Wuhan for more than three weeks.
02:28:04.940 | I said they lied about human-to-human transmission.
02:28:08.140 | During that time, they were aggressively enacting
02:28:11.160 | their cover-up, destroying records, hiding samples,
02:28:15.880 | imprisoning people who were asking tough questions.
02:28:19.640 | They soon after established their gag order.
02:28:24.120 | They fought internally in the World Health Organization
02:28:27.900 | to prevent the declaration of a global emergency.
02:28:32.100 | So China definitely, I mean, I couldn't be stronger
02:28:36.120 | in my critique of China, particularly what it did
02:28:40.040 | in those early days, but it really, what it's doing
02:28:42.320 | even to today is outrageous.
02:28:44.640 | So then there was the question of, well,
02:28:48.340 | how do we examine what actually happened?
02:28:51.280 | And the prime minister of Australia, then and now,
02:28:54.280 | Scott Morrison, was incredibly courageous.
02:28:57.000 | And he said, we need a full investigation.
02:28:59.640 | And because of that, the Chinese government
02:29:02.120 | attacked him personally and imposed trade sanctions
02:29:05.920 | on Australia to try to, not just to punish Australia,
02:29:09.640 | but to deliver a message to every other country,
02:29:12.160 | if you ask questions, we're going
02:29:14.120 | to punish you ruthlessly.
02:29:15.880 | And that certainly was the message that was delivered.
02:29:21.380 | The Australians brought that idea of a full investigation
02:29:24.780 | to the World Health Assembly in May of 2020.
02:29:28.420 | As I mentioned before, the WHA is the governing authority
02:29:32.220 | of member states that sits above the World Health Organization.
02:29:37.340 | But instead of passing a resolution calling
02:29:40.600 | for a full investigation, what ended up, ironically
02:29:44.540 | and tragically, passing with Chinese support
02:29:48.260 | was a mandate to have essentially
02:29:50.420 | a Chinese-controlled joint study where half of the team,
02:29:55.140 | a little more than half of the team,
02:29:56.580 | was Chinese experts, government-affiliated Chinese
02:29:59.020 | experts, and half were independent international
02:30:03.060 | experts, but organized by the WHO.
02:30:08.340 | And then it took six months to negotiate
02:30:11.220 | the terms of reference.
02:30:12.340 | And again, while China was doing all this cover-up,
02:30:14.900 | they delayed and delayed and delayed.
02:30:16.700 | And by the terms of reference that were negotiated,
02:30:19.420 | China had veto power over who got
02:30:21.820 | to be a member of the international group.
02:30:25.780 | And that group was not entitled to access to raw data.
02:30:31.100 | The Chinese side would give them conclusions
02:30:34.680 | based on their own analysis of the raw data, which
02:30:37.740 | was totally outrageous.
02:30:39.820 | So then-- and I was a big--
02:30:41.420 | I and others-- now a friend of mine,
02:30:44.500 | although we've never met in person,
02:30:46.100 | Gilles Demaneuf in New Zealand.
02:30:48.020 | He did a great job of chronicling just the letter
02:30:50.920 | by letter of the terms of reference.
02:30:54.340 | So then it took--
02:30:56.260 | now it's January of this year, January 2021.
02:31:00.740 | This deeply flawed, deeply compromised international group
02:31:05.980 | is sent to Wuhan.
02:31:07.420 | So what's the connection between this group
02:31:09.460 | and the joint study?
02:31:10.340 | So the joint study, it had the Chinese side
02:31:12.300 | and the international side.
02:31:13.500 | So these international experts, then part of their examination
02:31:17.540 | was going for one month to Wuhan.
02:31:19.940 | And the nature of the flaws of this international group--
02:31:23.300 | It's-- OK, really important point.
02:31:24.940 | And I'm sorry I wasn't clear on that.
02:31:26.860 | Rather, the mandate of what they were doing
02:31:31.060 | was not to investigate the origins of the pandemic.
02:31:35.060 | It was to have a joint study into the zoonotic origins
02:31:39.580 | of the virus, which means--
02:31:41.660 | which was interpreted to mean the natural origins
02:31:44.940 | hypothesis.
02:31:45.540 | They weren't empowered--
02:31:46.540 | To find evidence for--
02:31:47.460 | For a single hypothesis, not--
02:31:49.580 | so they weren't empowered to examine the lab incident
02:31:53.500 | origin.
02:31:54.060 | They were there to look at the natural origin hypothesis.
02:31:56.900 | To shop for some meat at some markets?
02:31:58.820 | Yeah, that was--
02:31:59.500 | How do you do this investigation?
02:32:01.540 | So then they were there for a month.
02:32:04.380 | Of the makeup of the team, guess who was--
02:32:09.740 | so the United States government proposed
02:32:11.700 | three experts for this team, people
02:32:13.900 | who had a lot of background.
02:32:15.260 | This was the Trump administration, people
02:32:17.500 | who had a lot of background, including
02:32:20.660 | in investigating lab incidents.
02:32:23.020 | None of those people were accepted.
02:32:24.860 | The one American who was accepted--
02:32:27.060 | Don't tell me it's Peter Daszak.
02:32:28.420 | Peter Daszak.
02:32:29.100 | Peter Daszak, who had this funding relationship
02:32:33.180 | for many years with the Wuhan Institute of Virology,
02:32:35.660 | whose entire, basically, professional reputation
02:32:39.460 | was based on his collaboration with Shi Zhengli, and who
02:32:43.540 | had organized the February 2020 Lancet letter saying it comes
02:32:48.620 | from natural origin
02:32:49.580 | and anybody who's suggesting otherwise
02:32:51.780 | is a conspiracy theorist, and who, at least according to me,
02:32:55.380 | had been at the very, very least the opposite of transparent,
02:33:00.260 | and at most engaged in a massive disinformation campaign.
02:33:03.420 | He is the one American who's on this.
02:33:07.820 | So they go there.
02:33:08.900 | They have one month in Wuhan.
02:33:10.900 | Two weeks of it are spent in quarantine,
02:33:14.340 | just in their hotel rooms.
02:33:16.020 | So then they have two weeks, but really it's just 10 working
02:33:19.140 | days.
02:33:20.140 | One of the earliest-- and so then they're kind of--
02:33:22.460 | we've all seen the pictures.
02:33:23.660 | They're traveling around Wuhan in little buses.
02:33:27.540 | One of the first visits they have
02:33:30.500 | is to this museum exhibition, which is basically
02:33:34.580 | a propaganda exhibition on Xi Jinping
02:33:38.300 | and his success in fighting COVID.
02:33:40.300 | And they said, well, we had to show respect
02:33:41.940 | to our Chinese hosts.
02:33:42.820 | But I think what the Chinese hosts were saying is,
02:33:44.940 | let's just--
02:33:45.580 | we're just going to rub your noses in this.
02:33:47.380 | You're going to go where we tell you.
02:33:48.960 | You're going to hear what we want you to hear.
02:33:52.880 | So they have that little short time.
02:33:54.380 | They spend a few hours--
02:33:56.260 | - Because they weren't in control of where the bus goes.
02:33:59.380 | - No.
02:33:59.940 | They made recommendations.
02:34:01.980 | Many of their recommendations were accepted.
02:34:05.260 | But when they went to the Wuhan Institute of Virology,
02:34:08.820 | and some of them did, they weren't
02:34:11.620 | able to do any kind of audit.
02:34:13.060 | When they asked for access to raw data,
02:34:15.620 | they weren't provided that.
02:34:20.260 | As I said in my "60 Minutes" interview,
02:34:22.660 | it was a chaperoned study tour.
02:34:24.980 | It was not even remotely close to an investigation.
02:34:28.020 | And the thing they were looking at
02:34:29.340 | wasn't the origins of the pandemic.
02:34:32.220 | It was the single hypothesis of a, quote,
02:34:36.020 | unquote, "natural origins."
02:34:39.060 | Then-- I mean, it was really so shocking for me.
02:34:43.260 | On February 9 of this year, in Wuhan,
02:34:46.860 | the Chinese government sets up a joint press event
02:34:51.020 | where it's the Chinese side and the international side.
02:34:55.060 | And during that press event, a guy named Peter Ben Embarek--
02:34:59.860 | and it's a little confusing.
02:35:01.020 | He was basically the head of this delegation.
02:35:04.140 | And he works for the WHO, even though this
02:35:07.100 | was an independent committee.
02:35:09.420 | It was organized by the WHO.
02:35:11.700 | So Peter Ben Embarek gets up there and says,
02:35:16.140 | we think it's most likely it comes from nature.
02:35:19.300 | Then he says, we think it's possible it
02:35:22.180 | comes through frozen food, which is absolutely outrageous.
02:35:25.900 | I mean, it's basically preposterous.
02:35:27.780 | Alina Chan calls this popsicle origins.
02:35:32.140 | But it's really, really unlikely.
02:35:34.300 | But then most significantly, he says that we've all
02:35:39.340 | agreed that a lab incident origin is, quote, unquote,
02:35:43.380 | "extremely unlikely" and shouldn't be investigated.
02:35:47.820 | We later learn that the way they came up with that determination
02:35:52.060 | was by a show of hands vote of the international experts
02:35:56.580 | and the Chinese experts.
02:35:57.980 | And the Chinese experts had to do their vote
02:36:00.660 | in front of the Chinese government officials
02:36:03.300 | who were constantly there.
02:36:05.500 | So even if whatever they thought,
02:36:07.340 | there was no possibility that someone raises their hand
02:36:09.820 | and says, oh, yeah, I think it's a lab origin.
02:36:12.300 | So that was outrageous thing number one.
02:36:15.260 | Outrageous thing number two, which I'll come back
02:36:18.700 | to my response in February, outrageous thing number two
02:36:22.740 | is months later, Peter Ben Embarek
02:36:25.300 | does an interview on Danish television.
02:36:28.180 | And he says, actually, I was lying about extremely unlikely
02:36:32.620 | because the Chinese side, they didn't
02:36:34.860 | want any mention of a lab incident origin
02:36:38.180 | anywhere, including in the report that later came out.
02:36:42.700 | And so the deal we made, even though he himself thought
02:36:46.260 | that at least some manifestation of a lab incident origin
02:36:49.020 | was likely and that there should be an investigation,
02:36:52.780 | particularly, he said, well, that's
02:36:54.220 | kind of weird that the Wuhan CDC moved just across
02:36:57.260 | from the Huanan seafood market just before the beginning
02:37:00.660 | of the pandemic.
02:37:03.900 | But he said, as a horse trading deal
02:37:06.700 | with the Chinese authorities, it shouldn't be--
02:37:10.980 | that he agreed to say it was extremely unlikely
02:37:13.300 | and shouldn't be investigated.
02:37:15.340 | So I was actually in Colorado staying with my parents.
02:37:18.860 | And I stayed up late watching this press event.
02:37:23.340 | And I was appalled because I knew after two weeks,
02:37:25.980 | there was no way they could possibly
02:37:27.900 | come to that conclusion.
02:37:29.740 | So I immediately sent a private message
02:37:32.660 | to Tedros, the WHO director general,
02:37:36.660 | essentially saying there's no way
02:37:38.740 | they had enough access to come to this conclusion.
02:37:42.500 | If the WHO doesn't distance itself from this,
02:37:47.340 | the WHO itself is going to be in danger
02:37:49.900 | because it's going to be basically
02:37:51.620 | institutional capture by the Chinese.
02:37:54.020 | This was repeating the Chinese government's propaganda points.
02:37:57.660 | And Tedros sent me a really--
02:38:00.340 | again, why I have so much respect for Tedros--
02:38:02.220 | sent me a private note saying, don't worry.
02:38:05.580 | We are determined to do the right thing.
02:38:08.580 | So I got that private message.
02:38:10.020 | And again, I really like Tedros.
02:38:11.900 | But I thought, well, what are you going to do?
02:38:14.780 | Three days later, Tedros makes a public statement.
02:38:19.180 | And he says, I've heard this thing.
02:38:23.140 | I don't think that this is a final answer.
02:38:25.420 | We need to have a full investigation
02:38:28.540 | into this process.
02:38:29.340 | He then released two more statements
02:38:32.820 | saying we need to have a full investigation with access
02:38:37.820 | to raw data, and we need a full audit of the Wuhan labs.
02:38:42.860 | So that part was really, really great.
02:38:46.380 | But then this saga continues.
02:38:49.220 | So I was part of a group, as I mentioned before,
02:38:51.780 | this Paris group.
02:38:53.500 | It was about two dozen or so experts.
02:38:55.620 | And we'd been meeting since 2020, having regular meetings.
02:39:00.020 | And we'd just present papers, present data, debate,
02:39:02.740 | to try to really get to the bottom of things.
02:39:04.580 | And it was all private.
02:39:06.380 | So I went to this group, and I said, look,
02:39:09.220 | this playing field is now skewed.
02:39:12.940 | These guys, they've put out this thing.
02:39:14.980 | Lab incident origin, extremely unlikely.
02:39:16.860 | It's in every newspaper in the world.
02:39:19.620 | We can't just be our own little private group
02:39:21.900 | talking to each other.
02:39:23.540 | So I led the political process of drafting
02:39:26.820 | what became four open letters that many of us signed,
02:39:31.380 | most of us signed, that saying, all right,
02:39:35.100 | here's why this investigation, this study group and the report
02:39:40.260 | are not credible.
02:39:41.060 | Here's what's wrong.
02:39:42.460 | Here's what a full investigation would look like.
02:39:45.900 | Here's a treasure map of all the resources
02:39:48.380 | where people can look.
02:39:50.580 | And we demand a comprehensive investigation.
02:39:52.700 | So those four open letters were in pretty much
02:39:56.220 | every newspaper in the world.
02:39:58.500 | And it played a really significant role,
02:40:01.020 | along with some other things.
02:40:02.660 | There was later, there was a letter,
02:40:04.940 | a short letter in Science, making basically similar points
02:40:10.340 | in a much more condensed way.
02:40:11.740 | There were some higher profile articles
02:40:14.580 | by Nicholas Wade and Nicholson Baker and others.
02:40:19.820 | And those collectively shifted the conversation.
02:40:24.500 | And then really impressively, the WHO,
02:40:29.500 | with Tedros' leadership, did something
02:40:31.420 | that was really incredible.
02:40:33.580 | And that is, earlier this year, they,
02:40:36.660 | meaning the leadership of the WHO,
02:40:38.340 | not the World Health Assembly, but the leadership of the WHO,
02:40:43.500 | announced the establishment of what's
02:40:45.540 | called SAGO, the Scientific Advisory Group on the Origins
02:40:49.460 | of Novel Pathogens.
02:40:51.420 | And basically, what they did was overrule their own governing
02:40:55.500 | board and say, we're going to create our own entity.
02:40:59.100 | And so it basically dissolved that international,
02:41:02.140 | deeply flawed international joint study group.
02:41:04.420 | And a lot of those people, like the Chinese,
02:41:06.860 | have become very critical of Tedros.
02:41:11.100 | So then they had an open call for nominations
02:41:15.260 | to be part of SAGO.
02:41:17.580 | And so a lot of people put in their nominations.
02:41:23.100 | They selected 26 people.
02:41:25.500 | But our group, we had a meeting.
02:41:26.900 | And we were unhappy with that list of 26.
02:41:30.820 | It still felt skewed toward the natural origin hypothesis.
02:41:35.260 | So again, I drafted and we worked on together
02:41:38.500 | an open letter, which we submitted to the WHO,
02:41:42.100 | saying, we think this list, it's a step in the right direction,
02:41:45.300 | but it's not good enough.
02:41:46.820 | And we call on these three people to be removed.
02:41:50.740 | And we have these three people who we think should be added.
02:41:54.020 | Incredibly, and I was in private touch
02:41:56.540 | with the WHO, after announcing the 26 people,
02:42:00.260 | the WHO said, we're reopening the process.
02:42:03.860 | So send in more.
02:42:04.860 | And so then they added two more people, one of whom
02:42:08.580 | is an expert in the auditing of lab incidents.
02:42:13.820 | And then one of the--
02:42:15.940 | so they added those two.
02:42:17.060 | And then when they just released the list of people
02:42:20.260 | who are part of SAGO, this one woman,
02:42:23.140 | a highly respected Dutch virologist named
02:42:26.020 | Marion Koopmans, who had been part
02:42:28.100 | of that deeply flawed and compromised international study
02:42:32.500 | group, who had called, who has consistently called a lab
02:42:35.620 | incident origin, quote unquote, a debunked conspiracy theory.
02:42:39.360 | As of now, her name is not on the list.
02:42:42.780 | We haven't seen any announcements.
02:42:44.820 | So in summary, and I'm sorry to go on for so long
02:42:48.100 | and to be so animated about this.
02:42:49.580 | I genuinely feel that the WHO is trying to do the right thing.
02:42:55.740 | But they exist within a political context.
02:42:59.500 | And they're kind of-- it's like they're pushing at the edges.
02:43:03.780 | But there's only so far that they can go.
02:43:07.660 | And that's why we definitely need
02:43:10.100 | to have full accountability for the WHO.
02:43:12.460 | We need to expand the mandate of the WHO.
02:43:14.820 | But we need to recognize that states have a big role.
02:43:18.180 | And China is an incredibly influential state
02:43:21.460 | that's doing everything possible to prevent
02:43:24.620 | the kind of full investigation into pandemic origins
02:43:27.380 | that's so desperately required.
02:43:28.860 | Well, it sounds like the leadership
02:43:31.180 | made all the difference in the WHO.
02:43:34.180 | So like the way to change the momentum of large institutions
02:43:37.180 | is through the leadership.
02:43:38.860 | Leadership and empowerment.
02:43:41.900 | As I mentioned, the World Health Assembly is meeting now.
02:43:45.140 | And I think that it shouldn't be that we require superhumans.
02:43:50.740 | And there are some people who are big critics of WHO.
02:43:53.700 | The leader of the WHO in SARS 1 was definitely more aggressive.
02:44:00.420 | She had a different set of powers at that time.
02:44:05.540 | But it can't be entirely--
02:44:07.980 | I mean, we definitely need strong-willed, aggressive,
02:44:11.540 | independent people in these kinds of roles.
02:44:14.500 | We also need a more empowered WHO.
02:44:17.820 | Like when the Chinese government,
02:44:19.100 | in the earliest days of the pandemic,
02:44:22.140 | said we're just not going to allow
02:44:24.340 | you to send a team to collect your own information.
02:44:26.980 | And we're not going to allow you to have any kind
02:44:31.420 | of independent surveillance.
02:44:34.180 | There was very little that the WHO
02:44:36.500 | could do because of the limitations of its mandate.
02:44:40.580 | And we can't just say we're going to have a WHO that only
02:44:44.020 | compromises Chinese sovereignty.
02:44:46.260 | If we want to have a powerful WHO,
02:44:48.540 | we should say you have emergency teams.
02:44:52.220 | When the director general says an emergency team
02:44:56.020 | needs to go somewhere, if they aren't
02:44:58.260 | allowed to go there that day, you
02:45:00.620 | could say there's an immediate referral to the Security
02:45:03.140 | Council.
02:45:03.620 | There needs to be something.
02:45:05.780 | But we have all these demands, rightfully so,
02:45:09.140 | of the WHO, which doesn't have the authorities.
02:45:13.260 | The WHO itself only controls 20% of its own budget.
02:45:16.180 | So the governments are saying we're
02:45:17.660 | going to give you money to do this or that.
02:45:21.660 | So we need a stronger WHO to protect us,
02:45:27.140 | but we also have to build that.
02:45:28.900 | - So looking a little bit into the future,
02:45:33.020 | let's first step into the past, sort
02:45:35.060 | of the philosophical question about China.
02:45:39.660 | If you were to put yourself in the shoes
02:45:42.820 | of the Chinese government, if they
02:45:46.860 | were to be more transparent, how should they
02:45:49.900 | be more transparent?
02:45:51.500 | Because it's easier to say we want to see this.
02:45:57.780 | But from a perspective of government,
02:45:59.300 | and not just the Chinese government,
02:46:00.780 | but any government on whose geographic territory,
02:46:07.860 | say, a lab leak occurred
02:46:11.260 | that has resulted in trillions of dollars of loss,
02:46:16.220 | countless of lives, just all kinds of damage to the world.
02:46:20.780 | If they were to admit or show data
02:46:27.780 | that could serve as evidence for a lab leak,
02:46:31.220 | that's something that people could, like in the worst case,
02:46:35.740 | start wars over, or in the most likely case,
02:46:41.220 | just constantly bring that up at every turn,
02:46:45.980 | making you powerless in negotiations.
02:46:50.300 | Whenever you want to do something
02:46:51.900 | in a geopolitical sense, the United States
02:46:54.900 | will bring up, oh, remember that time
02:46:57.140 | you cost us trillions of dollars because of your fuck up?
02:47:01.460 | So what is the incentive for the Chinese government
02:47:05.460 | to be transparent?
02:47:07.380 | And if it is to be transparent, how should it do it?
02:47:11.540 | So there's a bunch of people,
02:47:13.980 | like the reason I'm talking to you,
02:47:16.620 | as opposed to a bunch of other folks,
02:47:19.300 | because you are kind-hearted and thoughtful
02:47:21.940 | and open-minded and really respected,
02:47:24.500 | there's a bunch of people that are talking about lab leak
02:47:27.540 | that are a little bit less interested
02:47:30.380 | in building a better world, and more interested
02:47:34.180 | in pointing out the emperor has no clothes.
02:47:36.140 | They want step one, which is saying,
02:47:38.620 | basically tearing down the bullshitters.
02:47:43.540 | They don't wanna do the further steps of building.
02:47:48.340 | And so as a Chinese government,
02:47:50.500 | I would be nervous about being transparent
02:47:52.700 | with anybody that just wants to tear our power centers,
02:47:57.700 | our power structures down.
02:47:59.060 | Anyway, that's a long way to ask,
02:48:01.620 | how should the Chinese government be transparent
02:48:05.740 | now and in the future?
02:48:07.380 | - So maybe I'll break that down into a few sub-questions.
02:48:12.100 | The first is, what should, in an ideal world,
02:48:15.820 | what should the Chinese government do?
02:48:17.580 | And that's pretty straightforward.
02:48:20.180 | They should be totally transparent.
02:48:22.660 | The South African government now,
02:48:24.620 | there is an outbreak of this Omicron variant,
02:48:27.900 | and the South African government has done
02:48:30.060 | what we would want a government to do,
02:48:32.140 | say, hey, there's an outbreak,
02:48:33.260 | we don't have all of the information,
02:48:35.660 | we need help, we want to alert the world,
02:48:38.580 | and in some ways they're being punished for it
02:48:41.100 | through these travel bans, but it's a separate topic,
02:48:43.720 | but I actually think short-term travel bans
02:48:46.180 | actually are not a terrible idea.
02:48:48.060 | They should have, on day one,
02:48:50.980 | if they should have allowed WHO experts in,
02:48:55.100 | they should have shared information,
02:48:57.300 | they should have allowed a full
02:48:59.140 | and comprehensive investigation
02:49:02.140 | with international partnerships
02:49:04.260 | to understand what went wrong,
02:49:07.940 | they should have shared their raw data,
02:49:10.640 | they should have allowed their scientists
02:49:12.900 | to speak and write publicly,
02:49:14.740 | because nobody knows more about this stuff,
02:49:17.060 | certainly in the early days, than their scientists do,
02:49:20.620 | so it's relatively easy to say what they should do.
02:49:26.540 | - It's a hard question to say, well, what would happen?
02:49:29.420 | Let's just say, let's just say tomorrow,
02:49:33.020 | we prove for certain that this pandemic stems
02:49:36.660 | from both from an accidental lab incident
02:49:39.460 | and then from what I've consistently called
02:49:41.300 | a criminal cover-up, because the cover-up
02:49:44.620 | has done in many ways as much or more damage
02:49:47.580 | than the incident.
02:51:49.900 | So, what happens?
02:49:50.840 | You could easily imagine, Xi Jinping has had two terms
02:49:54.140 | as the leader of China.
02:49:57.700 | - And he can now have unlimited terms.
02:49:59.900 | - Well, they've changed the rules for that,
02:50:01.900 | but he's got a lot of enemies.
02:50:03.540 | I mean, there are a lot of people
02:50:04.500 | who are waiting in line to step up.
02:50:07.620 | So is there a chance that Xi Jinping could be deposed
02:50:11.160 | if it was proven that this comes from a lab?
02:50:13.220 | And I think there's a real possibility.
02:50:15.500 | Would people in the United States Congress, for example,
02:50:19.400 | demand reparations from China?
02:50:21.900 | So we've had $4.5 trillion of stimulus,
02:50:26.220 | all of the economic losses, and we owe a lot of money
02:50:29.780 | to China from our debt.
02:50:32.880 | I'm quite certain that members of Congress would say,
02:50:36.420 | we're just gonna wipe that out.
02:50:37.620 | It would destroy the global financial system,
02:50:39.860 | but I think they would be extremely likely.
02:50:42.300 | Would other countries like India that have lost millions
02:50:46.420 | of people and had terrible economic damages,
02:50:50.620 | would they demand reparations?
02:50:53.220 | So I think from a Chinese perspective, starting from now,
02:50:57.340 | it would have major geopolitical implications.
02:51:00.540 | And go back to Chernobyl, there was a reason
02:51:04.460 | why the Soviet Union went to such length to cover things up.
02:51:08.220 | And when it came out, I mean, there are different theories,
02:51:11.500 | but certainly Chernobyl played some role
02:51:15.780 | in the end of communist power in the Soviet Union.
02:51:20.780 | So the Chinese are very, very aware of that.
02:51:25.540 | - But the difference, of course, with Chernobyl,
02:51:27.340 | the damage to the rest of the world
02:51:28.660 | was not as nearly significant as with COVID.
02:51:32.020 | So you say that the coverup is a crime,
02:51:34.500 | but everything you just described,
02:51:37.700 | the response of the rest of the world is,
02:51:41.420 | I could say unfair.
02:51:45.020 | - Well, it's not-- - If it's a, so, okay.
02:51:46.980 | If we say the best possible version of the story,
02:51:50.360 | lab leaks happen.
02:51:53.980 | They shouldn't happen, but they happen.
02:51:57.780 | And how is that on the Chinese government?
02:52:01.060 | I mean, what's a good example?
02:52:03.980 | - Well, the Union Carbide.
02:52:05.060 | Union Carbide, there was this American company
02:52:07.420 | operating in India.
02:52:09.100 | They had this leak.
02:52:10.260 | All these people were killed.
02:52:12.580 | The company admitted responsibility.
02:52:15.060 | I was working in the White House
02:52:17.460 | when the United States government, in my view,
02:52:19.460 | accidentally, which I know to be the case,
02:52:20.960 | though other people in China think differently,
02:52:23.380 | bombed the Chinese embassy in Belgrade.
02:52:25.500 | And so the United States government
02:52:27.720 | allowed a full investigation.
02:52:29.580 | Then we paid reparations to the family, the families.
02:52:34.260 | And so to your question, if I were,
02:52:36.900 | let's just say I were the Chinese government,
02:52:39.620 | not, I mean, in kind of an idealized version
02:52:42.100 | of the Chinese government.
02:52:43.780 | And let's just say that they had come to the conclusion
02:52:46.980 | that it was a lab incident.
02:52:49.020 | And let's just say they knew that even if
02:52:52.800 | they continued to cover it up,
02:52:55.100 | eventually this information would come out.
02:52:58.660 | I mean, maybe there was a whistleblower.
02:53:00.180 | Maybe they knew of some evidence
02:53:01.980 | that we didn't know about or something.
02:53:05.260 | What would I do starting right now?
02:53:09.180 | What I would do is I would hold a press conference
02:53:12.540 | and I would say, we had this terrible accident.
02:53:16.820 | The reason why we were doing this research
02:53:19.540 | in Wuhan and elsewhere is that we had SARS-1
02:53:23.300 | and we felt a responsibility to do everything possible
02:53:26.400 | to prevent that kind of terrible thing happening again
02:53:29.380 | for our country and for the world.
02:53:31.460 | That was why we collaborated with France,
02:53:34.460 | with the United States in building up those capacities.
02:53:38.420 | We know that nothing is perfect,
02:53:40.460 | but we're a sovereign country and we have our own system.
02:53:42.820 | And so we had to adapt our systems
02:53:45.700 | so that they made sense internally.
02:53:49.420 | When this outbreak began, we didn't know how it started.
02:53:53.980 | And that was why we wanted to look into things.
02:53:57.700 | When the process of investigating became so political,
02:54:01.440 | it gave us pause and we were worried
02:54:03.780 | that our enemies were trying to use this investigation
02:54:07.480 | in order to undermine us.
02:54:09.580 | Having said that, now that we've dug deeper,
02:54:13.780 | we have recognized because we have access
02:54:16.740 | to additional information that we didn't have then,
02:54:19.540 | that this pandemic started from an accidental lab incident.
02:54:23.620 | And we feel really terribly about that.
02:54:25.860 | And we know that we were very aggressive
02:54:28.500 | in covering up information in the beginning,
02:54:31.100 | but the reason we were doing that is because we thoroughly,
02:54:34.380 | we fully believe that it came from a natural origin.
02:54:37.680 | Now that we see otherwise, we feel terribly.
02:54:41.340 | Therefore, we're doing a few different things.
02:54:44.940 | One is we are committing ourselves
02:54:48.200 | to establishing a stronger WHO,
02:54:51.660 | a new pandemic treaty that addresses
02:54:55.360 | the major challenges that we face
02:54:57.880 | and allows the World Health Organization
02:55:01.040 | to pierce the veil of absolute sovereignty
02:55:03.760 | because we know that when these pandemics happen,
02:55:06.920 | they affect everybody.
02:55:08.760 | We are also putting, and you can pick your number,
02:55:12.000 | but let's start with $5 trillion,
02:55:15.360 | some massive amount, into a fund
02:55:19.600 | that we will be distributing to the victims of COVID-19
02:55:24.560 | and their--
02:55:26.120 | - China would do that?
02:55:27.640 | - This is a fantasy speech.
02:55:29.520 | - But I disagree with your, I mean, okay.
02:55:32.900 | (sighs)
02:55:34.180 | - So you think China has a responsibility?
02:55:36.900 | - Well, so it's not the, like, just a lab leak.
02:55:40.500 | Like if China on day one had said,
02:55:43.300 | "We have this outbreak.
02:55:45.100 | We don't know where it came from.
02:55:47.060 | We want to have a full investigation.
02:55:50.100 | We call on international,
02:55:52.100 | responsible international partners
02:55:53.820 | to join us in that process.
02:55:55.620 | And we're going to do everything in our power
02:55:57.980 | to share the relevant information
02:56:00.340 | because however this started, we're all victims."
02:56:03.220 | That's a totally different story
02:56:05.140 | than punishing Australia, preventing WHO,
02:56:08.820 | blocking any investigation,
02:56:10.660 | condemning people who are trying to look.
02:56:12.740 | And so--
02:56:13.900 | - So cover up for a couple of weeks,
02:56:16.900 | you can understand maybe,
02:56:18.620 | 'cause there's so much uncertainty.
02:56:20.500 | You're like, "Oh, let's hide all the Winnie the Pooh pictures
02:56:24.620 | while we figure this out."
02:56:26.060 | But the moment you really figure out what happened,
02:56:29.800 | you always, as a Jew, I can say this,
02:56:32.360 | always find like a blame the Jews kind of situation.
02:56:35.280 | A little bit, just a little bit.
02:56:36.600 | You're like, "All right, it's not us."
02:56:38.520 | (laughs)
02:56:40.240 | I'm just kidding.
02:56:41.080 | But be proactive in saying--
02:56:44.760 | - Sorry to interrupt, but the joke about that is
02:56:47.640 | there's a big problem because a lot of people
02:56:50.120 | have to leave the Jewish socialist conspiracy
02:56:53.080 | to make it for the Jewish capitalist conspiracy meeting.
02:56:56.080 | - I love it.
02:56:57.080 | (laughs)
02:56:59.000 | - So I would say not 5 trillion, but some large amount,
02:57:03.440 | and I would really focus on the future,
02:57:05.160 | which is every time we talk about the lab leak,
02:57:07.980 | the unfortunate thing is I feel like people
02:57:11.760 | don't focus enough about the future.
02:57:13.320 | To me, the lab leak is important
02:57:15.080 | because we want to construct a kind of framework
02:57:20.160 | of thinking and a global conversation
02:57:23.680 | that minimizes the damage done by future lab leaks,
02:57:28.360 | which will almost certainly happen.
02:57:30.760 | And so to me, any lab leak is about the future.
02:57:35.760 | I would launch a giant investment in saying
02:57:40.040 | we're going to create a testing infrastructure,
02:57:43.800 | all of this kind of infrastructure investments
02:57:46.760 | that help minimize the damage of a lab leak
02:57:49.520 | here and the rest of the world.
02:57:51.280 | - So the challenge with that is one,
02:57:54.420 | it's hard to imagine a fully accountable future system
02:57:58.400 | to prevent these kinds of terrible pandemics
02:58:01.960 | that's built upon obfuscation and cover-up
02:58:06.080 | regarding the origins of this worst pandemic in a century.
02:58:09.600 | So it's just like that foundation isn't strong enough.
02:58:14.340 | Second, China across the fields of science
02:58:17.820 | is looking to leapfrog the rest of the world.
02:58:20.740 | So China now has current plans to build BSL-4 labs
02:58:25.120 | in every one of its provinces.
02:58:27.080 | - Yeah, they're scaling up the-
02:58:28.640 | - Scaling up everything, and with the plan of leading.
02:58:32.400 | And that's why, again, I was saying before,
02:58:33.720 | I think there's a lot of similarity between this story,
02:58:36.700 | at least as I see it, at least the most probable case,
02:58:40.240 | and these other areas where China gets knowledge
02:58:42.680 | and then tries to leapfrog.
02:58:44.140 | It's the same with AI and autonomous killer robots.
02:58:48.400 | It's the same with human genome editing,
02:58:50.320 | with animal experimentations, with so many,
02:58:52.800 | basically all areas of advanced science.
02:58:57.000 | So the question is, would China stop in that process?
02:59:01.800 | And then third, it's a little bit of a historical background
02:59:06.800 | but defending national sovereignty
02:59:10.640 | is one of the core principles of,
02:59:13.480 | certainly of the Chinese state.
02:59:16.080 | And the historical issue is,
02:59:18.480 | for those of us who come from the West,
02:59:21.200 | I mean, one of the lessons of the post-war planners
02:59:24.020 | was that absolute national sovereignty
02:59:26.920 | was actually a major feeder
02:59:29.680 | into the first and second world wars,
02:59:31.400 | that we had all these conflicting states.
02:59:34.280 | And therefore, the logic of the post-war system
02:59:37.420 | is we need to, in some ways, pool sovereignty
02:59:39.600 | that's like the EU and have transnational organizations
02:59:44.400 | like the UN organizations
02:59:45.960 | and the Bretton Woods organizations.
02:59:47.800 | For most Asian states, and also even for some African,
02:59:52.040 | the people who were kind of
02:59:52.880 | on the colonized side of history,
02:59:55.520 | sovereignty was the thing that was denied them.
02:59:58.720 | That was the thing that they wanted,
03:00:00.040 | the thing that the European powers denied them.
03:00:01.880 | And so the idea of giving up sovereignty
03:00:04.700 | was the absolute opposite.
03:00:07.040 | And so that's why China is,
03:00:10.120 | and again, I mentioned this Rush Doshi book,
03:00:12.400 | it's not that China is trying to strengthen
03:00:15.120 | this rules-based international order,
03:00:17.200 | which is based on the principle
03:00:19.040 | that, well, there are certain things that we share,
03:00:21.240 | and how do we build a governance system
03:00:23.840 | to protect those things?
03:00:25.680 | What it seems to be doing
03:00:27.000 | is trying to advance its own sovereignty.
03:00:31.300 | And so I think I agree with you,
03:00:33.040 | but I don't think that we can just go forward
03:00:37.060 | without some accountability for the present.
03:00:39.640 | - So the cover-up was a big problem.
03:00:41.480 | It's like, I often, I find myself playing devil's advocate
03:00:46.560 | 'cause I'm trying to sort of empathize,
03:00:49.520 | and then I forget that two or three people
03:00:54.280 | listen to this thing, and then they're like,
03:00:55.880 | look, Lex is defending the Chinese government
03:00:58.880 | with their cover-up.
03:00:59.720 | No, I'm not.
03:01:00.640 | I'm just trying to understand.
03:01:05.000 | I mean, it's the same reason I'm reading Mein Kampf now,
03:01:08.040 | is you have to really understand the minds of people
03:01:13.640 | as if I too could have done that.
03:01:18.360 | You have to understand that we're all the same
03:01:21.120 | to some degree, and that kind of empathy
03:01:24.680 | is required to figure out solutions for the future.
03:01:29.640 | It's just, in empathizing with the Chinese government
03:01:32.600 | in this whole situation, I'm still not sure I understand
03:01:37.600 | how to minimize the chance of a cover-up in the future,
03:01:41.800 | whether for China or for the United States.
03:01:44.040 | If the virus started in the United States,
03:01:45.680 | I'm not exactly sure we would be,
03:01:48.100 | with all the emphasis we put on freedom of speech,
03:01:52.480 | with all the emphasis we put on freedom of the press
03:01:57.480 | and access to the press, to sort of all aspects
03:02:01.800 | of government, I'm not sure the US government
03:02:04.120 | would do a similar kind of cover-up.
03:02:06.040 | - Let me put it this way.
03:02:07.280 | So we're in Texas now doing this interview.
03:02:09.920 | Imagine there's a kind of horseshoe bat
03:02:13.560 | that we'll call the Texas horseshoe bat.
03:02:17.080 | And the Texas--
03:02:17.920 | - There's a lot of bats in Austin,
03:02:19.840 | but it's a whole thing. - It's true, it's true.
03:02:21.960 | And so let's just say that the Texas horseshoe bats
03:02:25.560 | only exist in Texas.
03:02:28.400 | But in Montana, we have a thing,
03:02:32.080 | it's called the Montana Institute of Virology.
03:02:36.240 | And at the Montana Institute of Virology,
03:02:38.860 | they have the world's largest collection
03:02:41.440 | of Texas horseshoe bats, including horseshoe bats
03:02:45.600 | that are associated with a previous global pandemic
03:02:50.600 | called the Texas horseshoe bat pandemic.
03:02:55.000 | And let's just say that people in Montana,
03:02:58.880 | in the same town where this Montana Institute of Virology is,
03:03:03.320 | start getting a version of this Texas horseshoe bat
03:03:09.260 | syndrome that is genetically relatively similar
03:03:13.580 | to the outbreak in Texas.
03:03:15.260 | There are no horseshoe bats there.
03:03:18.020 | And the government says, it's your same point,
03:03:21.340 | Alina's point about the unicorns,
03:03:23.260 | like nothing to see here, just move along.
03:03:26.020 | - No, but we'll see-- - With Joe Rogan
03:03:30.100 | and Brett Weinstein and Josh Rogan,
03:03:33.300 | and I mean, would they say, oh, I guess,
03:03:35.380 | I mean, I just think that--
03:03:36.700 | - No, no, but the point is, is the government going to say it?
03:03:40.540 | So Joe Rogan is a comedian.
03:03:45.040 | Brett Weinstein is a podcaster.
03:03:48.900 | The point is what we want is not just those folks
03:03:53.480 | to have the freedom to speak, that's important,
03:03:56.220 | but you want the government to be transparent,
03:03:58.380 | like, I don't think Joe Rogan is enough
03:04:02.380 | to hold the government accountable.
03:04:04.380 | I think they're gonna do their thing anyway.
03:04:07.060 | - But I think that's our system,
03:04:09.100 | and that was the genius of the founding fathers.
03:04:12.300 | - Was that enough? - They said that the
03:04:13.540 | government probably is going to have a lot of instincts
03:04:16.960 | to do the wrong thing.
03:04:18.240 | That was the experience in England before.
03:04:22.380 | And so that's why we have free speech
03:04:25.780 | to hold the government accountable.
03:04:27.140 | I mean, I'm kind of broadly a gun control person,
03:04:30.160 | but the people who say, well,
03:04:31.540 | we need to have broad gun rights.
03:04:34.140 | - As somebody who's now in Texas, I am offended.
03:04:39.020 | - But their argument is, look,
03:04:40.420 | we don't fully trust the government.
03:04:42.900 | If the government, just like we fought against the British,
03:04:47.420 | if the government's wrong,
03:04:48.500 | we want to at least have some authority.
03:04:51.040 | So that's our system, is to have that kind of voice,
03:04:53.640 | and that is the public voice actually balances.
03:04:57.580 | 'Cause every government, as you correctly said,
03:05:00.060 | every government has the same instincts,
03:05:02.860 | and that's why we have, and it's imperfect here,
03:05:06.460 | but kind of these ideas of separation of powers,
03:05:08.620 | of inalienable rights, so that we can have,
03:05:11.260 | it's almost like a vast market where we can have balance.
03:05:14.620 | - So you think if a lab leak occurred in the United States,
03:05:17.920 | what probability would you put some kind of public report
03:05:23.900 | led by Rand Paul, would come out saying this was a lab leak?
03:05:29.220 | You have good confidence that that would happen?
03:05:31.020 | - I have pretty decent confidence,
03:05:32.380 | and the reason I say, I mentioned that I'm a,
03:05:35.180 | I think of myself, I'm sure I'm not anymore,
03:05:37.220 | 'cause as I get older, but as a progressive person,
03:05:39.860 | I'm a Democrat, and I worked in Democratic administrations,
03:05:44.300 | worked for President Clinton
03:05:45.300 | on the National Security Council.
03:05:46.900 | But my kind of best friend in the United States Senate,
03:05:52.540 | who I talk to all the time,
03:05:55.580 | is a Senator from Kansas named Roger Marshall.
03:05:59.460 | And Roger, I mean, if you just lined up our positions
03:06:03.380 | on all sorts of things, we're radically different.
03:06:07.420 | But we have a great relationship, we talk all the time,
03:06:13.260 | and we share a commitment to saying,
03:06:16.140 | well, let's ask the tough questions about how this started.
03:06:20.180 | And again, if we had,
03:06:22.620 | like, what is the United States government?
03:06:24.340 | Yeah, it's the executive branch, but there's also Congress.
03:06:27.300 | And Congress, you talk about Rand Paul,
03:06:29.940 | and as a former executive branch worker
03:06:33.500 | when I was on the National Security Council,
03:06:35.620 | and I guess technically when I was at the State Department,
03:06:38.940 | all of this stuff, all of this process,
03:06:40.860 | it just seems like a pain in the ass.
03:06:42.620 | It's like these F'ers, they're just attacking us.
03:06:46.740 | We tried to do this thing,
03:06:48.380 | we had all the best intentions,
03:06:49.900 | and now they're holding hearings,
03:06:51.020 | and they're trying to box us in, and whatever.
03:06:53.700 | But that's our process.
03:06:55.060 | And there's like a form of accountability as chaotic,
03:06:58.260 | as crazy as it is, and so it makes it really difficult.
03:07:03.260 | I mean, we have other problems of just chaos
03:07:05.660 | and everybody doing their own thing,
03:07:07.580 | but it makes it difficult
03:07:08.820 | to have a kind of systematic coverup.
03:07:11.620 | And again, all of that is predicated on my hypothesis,
03:07:15.900 | not fully proven, although I think likely,
03:07:18.620 | that there is a lab incident origin of this pandemic.
03:07:22.980 | - Well, I mean, we're having
03:07:24.660 | like several layers of conversation,
03:07:26.620 | but I think whether a lab leak hypothesis is true or not,
03:07:31.620 | it does seem that the likelihood of a coverup,
03:07:38.580 | if it leaked from a lab, is high.
03:07:41.500 | - What I'd say, yeah. - And that's the more
03:07:42.500 | important conversation to be having.
03:07:45.380 | Well, you could argue a lot of things,
03:07:47.980 | but to me, arguably, that's the more important conversation
03:07:51.500 | is about what is the likelihood of a coverup.
03:07:53.700 | - 100%.
03:07:54.740 | Like in my mind, there is a legitimate debate
03:07:59.420 | about the origins of the pandemic.
03:08:01.940 | There are people who I respect,
03:08:04.100 | who I don't necessarily agree with,
03:08:06.340 | people like Stuart Neil, who's a virologist in the UK,
03:08:10.020 | who's been very open-minded,
03:08:11.660 | engaged in productive debate about the origin,
03:08:16.140 | and you know where I stand.
03:08:18.200 | There is and can be no debate about whether or not
03:08:23.200 | there has been a coverup.
03:08:25.240 | There has been a coverup.
03:08:26.400 | There is, in my mind, no credible argument
03:08:29.600 | that there hasn't been a coverup,
03:08:31.320 | and we can just see it in the regulations,
03:08:35.440 | in the lack of access.
03:08:37.120 | There's an incredible woman named Zhang Zhan,
03:08:40.480 | who is a Chinese, we have to call her a citizen journalist
03:08:44.240 | because everything is controlled by the state,
03:08:46.060 | but in the early days of the pandemic,
03:08:47.720 | she went to Wuhan, started taking videos and posting them.
03:08:52.060 | She was imprisoned for "picking quarrels and provoking trouble,"
03:08:54.920 | which is kind of a catch-all charge,
03:08:57.440 | and now she's engaged in a hunger strike,
03:09:00.260 | and she's near death, and so there's no question
03:09:04.920 | that there has been a coverup,
03:09:06.680 | and there's no question in my mind
03:09:08.280 | that that coverup is responsible
03:09:10.400 | for a significant percentage of the total deaths
03:09:13.920 | due to COVID-19.
03:09:17.040 | - In a pivot, can I talk to you about sex?
03:09:22.040 | - Let's roll.
03:09:23.400 | - Okay, so you're the author of a book, "Hacking Darwin."
03:09:26.600 | So humans have used sex, allegedly,
03:09:34.100 | as I've read about, to mix genetic information
03:09:39.120 | to produce offspring, and sort of,
03:09:44.720 | through that kind of process, adapted their environment.
03:09:49.720 | - Lex, you mentioned earlier
03:09:51.440 | about your asking tough questions
03:09:53.680 | and people pushing you to ask tough questions.
03:09:56.640 | Is it okay if I just, so you said,
03:09:59.120 | "Have done this as I've read about."
03:10:01.240 | - As I've read about on the internet, yeah.
03:10:03.400 | - All I'm saying, as a person sitting with you,
03:10:06.280 | to people who would be open-minded
03:10:10.320 | about going from "as I've read about" to reality,
03:10:13.720 | what I would say is Lex Fridman is handsome, charming.
03:10:18.720 | - I'm gonna open a Tinder account and publish this.
03:10:23.160 | - Really a great guy.
03:10:24.000 | I'm sorry to interrupt.
03:10:24.840 | - Thank you, I appreciate that.
03:10:26.760 | So I was reading about this last night.
03:10:28.720 | I was gonna tweet it, but then I'm like,
03:10:30.000 | "This is going to be misinterpreted."
03:10:32.400 | But it's, and this is why I like podcasts,
03:10:36.120 | 'cause I can say stuff like this.
03:10:38.160 | It's kind of incredible to me
03:10:42.640 | that the average human male produces
03:10:46.480 | like 500 billion plus sperm cells in their lifetime.
03:10:51.160 | Like each one of those are genetically unique.
03:10:55.300 | Like they can produce like unique humans.
03:10:59.900 | Each one, 500 billion, there's like 100 billion people
03:11:09.380 | who've ever lived, maybe like 110, whatever the number is.
03:11:09.380 | So it's like five times the number of people
03:11:11.160 | who ever lived is produced by each male
03:11:15.760 | of genetic information.
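A quick check of the back-of-the-envelope arithmetic quoted here, taking the roughly 500 billion lifetime sperm cells and roughly 100 billion humans ever born at face value (both are loose estimates, not figures asserted elsewhere in this conversation):

$$
\frac{5 \times 10^{11} \ \text{sperm cells per male lifetime}}{1 \times 10^{11} \ \text{humans who have ever lived}} \approx 5
$$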
03:11:18.400 | So those are all possible trajectories of lives
03:11:20.800 | that could have lived.
03:11:21.960 | Like those are all little people that could have been.
03:11:25.560 | And like all the possible stories,
03:11:28.180 | all the Hitlers and Einsteins that could have been created
03:11:32.200 | and all that, I mean, I don't know,
03:11:34.120 | this kind of you're painting this possible future
03:11:38.280 | and we get to see only one little string of that.
03:11:40.720 | I mean, I suppose the magic of that is also captured
03:11:44.320 | by the, in the space of physics,
03:11:47.280 | having like multiple dimensions
03:11:49.680 | and the many worlds hypothesis of quantum mechanics,
03:11:53.440 | the interpretation that we're basically just,
03:11:57.720 | at every point, there's an infinite offspring
03:12:02.240 | of universes that are created.
03:12:04.720 | But I don't know, that's just like a magic
03:12:06.680 | of this game of genetics that we're playing.
03:12:10.820 | And the winning sperm is not the fastest.
03:12:15.180 | The winning sperm is basically the luckiest,
03:12:19.400 | has the right timing.
03:12:21.060 | So it's not, I also got into this whole,
03:12:24.820 | started reading papers about like,
03:12:29.360 | is there something to be said about who wins the race,
03:12:32.720 | right, genetically?
03:12:34.000 | So it's fascinating 'cause there's studies
03:12:36.000 | in animals and so on to answer that question
03:12:38.460 | 'cause it's interesting.
03:12:39.560 | 'Cause I'm a winner, right?
03:12:41.680 | I won, I won a race.
03:12:43.360 | - Yes. - And so you wanna know
03:12:44.560 | like what does that say about me
03:12:46.520 | in this fascinating genetic race against,
03:12:50.720 | I think, what is it, 200 million others, I think.
03:12:54.280 | So one pool of sperm cells is about,
03:12:59.280 | so something like 200 million.
03:13:02.920 | It could be, yes.
03:13:04.440 | But that, millions.
03:13:05.960 | - Yeah. - 'Cause I thought
03:13:06.800 | it was much, much lower than that.
03:13:08.040 | So like that, those are all brothers and sisters of mine
03:13:13.040 | and I beat 'em all out.
03:13:15.480 | - Yeah. - I won.
03:13:16.600 | And so it's interesting to know,
03:13:19.320 | there's a temptation to say I'm somehow better than them.
03:13:24.120 | Right?
03:13:25.400 | And now that goes into the next stage
03:13:28.360 | of something you're deeply thinking about,
03:13:33.200 | which is if we have more control now
03:13:38.200 | over the winning genetic code that becomes offspring,
03:13:43.840 | if we have first not even control,
03:13:46.640 | just information and then control,
03:13:50.320 | what do you think that world looks like
03:13:53.200 | from a biological perspective
03:13:54.980 | and from an ethical perspective
03:13:56.560 | when we start getting more information and more control?
03:14:00.800 | - Yeah, great question.
03:14:02.320 | So first on the sperm,
03:14:04.360 | there can be up to about 1.2 billion sperm cells
03:14:09.200 | in a male ejaculation.
03:14:10.600 | So as I mentioned in "Hacking Darwin,"
03:14:13.200 | male sperm, it's kind of a dime a dozen
03:14:15.400 | with all the guys in all the world
03:14:18.400 | just doing whatever they do with it.
03:14:20.320 | And it's an open question how competitive,
03:14:26.200 | I mean, there is an element of luck
03:14:28.520 | and there is an element of competition.
03:14:31.800 | And it's an open question how much that competition
03:14:36.720 | impacts the outcome or whether it's just luck.
03:14:40.240 | But my guess is there's some combination
03:14:42.840 | of fitness and luck.
03:14:45.220 | But you're absolutely right
03:14:46.960 | that all of those other sperm cells in the ejaculation,
03:14:51.320 | if that's how the union of the sperm and egg is happening,
03:14:55.840 | all of them represent a different future.
03:14:59.400 | - And there's a wonderful book
03:15:01.840 | called "Invisible Cities" by Italo Calvino.
03:15:05.480 | And he even talks about a city as something like this
03:15:09.180 | where everybody, you have your life,
03:15:12.200 | but then you have all these alternate lives.
03:15:14.280 | And every time you make any decision,
03:15:16.280 | but in this "Invisible Cities,"
03:15:19.320 | there's a little string that goes toward that alternate life.
03:15:23.320 | And then the city becomes this weaving of all the strings
03:15:27.360 | of people's real lives and the alternate lives
03:15:30.540 | that they could have taken
03:15:31.700 | had they made any other different steps.
03:15:34.040 | So that part, it's like a deep philosophical question.
03:15:37.640 | It's not just for us, it's for all of,
03:15:39.520 | I mean, it's baked into evolutionary biology.
03:15:43.300 | It's just what are the different strategies
03:15:45.320 | for different species to achieve fitness.
03:15:48.420 | And there's some of the different corals or other fish
03:15:52.020 | where they just kind of release the eggs into the water.
03:15:55.320 | I mean, there's all different kinds of ways.
03:15:58.920 | And then you're right in my book, "Hacking Darwin,"
03:16:03.160 | and the full title is "Hacking Darwin:
03:16:04.840 | Genetic Engineering and the Future of Humanity."
03:16:07.500 | I kind of go deep into exploring
03:16:12.400 | the big picture implications
03:16:14.520 | of the future of human reproduction.
03:16:17.960 | We are already participating
03:16:20.920 | in a revolutionary transformation,
03:16:23.440 | not just because of the diagnostics that we have,
03:16:26.360 | things like ultrasound,
03:16:28.360 | but because now an increasing number of us
03:16:31.120 | are being born through in vitro fertilization,
03:16:33.860 | which means the eggs are extracted from the mother,
03:16:36.540 | they're fertilized by the father's sperm in vitro in a lab,
03:16:41.200 | and then re-implanted in the mother.
03:16:45.120 | On top of that, there's a somewhat newer,
03:16:48.840 | but still now older technology
03:16:52.760 | called pre-implantation genetic testing.
03:16:55.560 | And so as everyone knows from high school biology,
03:16:58.640 | you have the fertilized egg,
03:17:01.360 | and then it goes one cell to two cells
03:17:03.960 | to four to eight and whatever.
03:17:05.280 | And after around five days in this PGT process,
03:17:09.600 | a few cells are extracted.
03:17:12.180 | So let's say you have 10 fertilized eggs,
03:17:15.440 | early stage embryos.
03:17:16.480 | A few cells are extracted from each,
03:17:18.520 | and those cells,
03:17:21.200 | the ones that are extracted,
03:17:22.120 | would have ended up becoming the placenta.
03:17:25.040 | But every one of our cells has,
03:17:26.840 | other than a few, has our full genome.
03:17:31.000 | And so then you sequence those cells,
03:17:33.080 | and with pre-implantation genetic testing now,
03:17:36.440 | what you can do is you can screen out
03:17:38.680 | deadly single gene mutation disorders,
03:17:43.680 | things that could be deadly or life-ruining.
03:17:47.120 | And so people use it to determine
03:17:49.400 | which of those 10 early stage embryos
03:17:53.260 | to implant in a mother.
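To make that screening step concrete, here is a minimal Python sketch of the kind of filtering pre-implantation genetic testing enables. The variant labels and embryo data are invented for illustration and are not real clinical data or any particular lab's pipeline.

# Minimal illustrative sketch of PGT-style screening: keep only the embryos
# whose sequenced cells show none of a list of known pathogenic variants.
# All variant labels and embryo data here are invented examples.

PATHOGENIC_VARIANTS = {
    "HEXA:c.1274_1277dup",  # label associated with Tay-Sachs (illustrative)
    "CFTR:F508del",         # label associated with cystic fibrosis (illustrative)
}

# Variants detected in the few cells extracted from each early stage embryo.
embryos = {
    "embryo_01": {"HEXA:c.1274_1277dup"},
    "embryo_02": set(),
    "embryo_03": {"CFTR:F508del"},
}

def screen(candidates, pathogenic):
    """Return candidates carrying none of the listed pathogenic variants."""
    return [name for name, variants in candidates.items()
            if not (variants & pathogenic)]

print(screen(embryos, PATHOGENIC_VARIANTS))  # ['embryo_02']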
03:17:55.680 | As we shift towards a much greater understanding
03:17:59.000 | of genetics, and that is part of our,
03:18:01.480 | just the broader genetics revolution,
03:18:04.180 | but within that, in our transition
03:18:06.360 | from personalized to precision healthcare,
03:18:09.440 | more and more of us are going to have
03:18:11.000 | our whole genome sequenced
03:18:12.400 | because it's going to be the foundation
03:18:13.920 | of getting personalized healthcare.
03:18:16.000 | We're going to have already millions,
03:18:18.160 | but very soon billions of people
03:18:20.720 | who've had their whole genome sequenced.
03:18:22.800 | And then we'll have big databases
03:18:24.920 | of people's genetic, or genotypic, information
03:18:27.640 | and their life, or phenotypic, information.
03:18:30.040 | And using, coming into your area,
03:18:32.280 | our tools of machine learning and data analytics,
03:18:35.120 | we're going to be able to increasingly understand
03:18:38.360 | patterns of genetic expression,
03:18:40.340 | even though we're all different.
03:18:41.180 | - So predict how the genetic information
03:18:43.480 | will get expressed.
03:18:44.320 | - Correct. - Yeah.
03:18:45.920 | - Never perfectly, perhaps, but more and more,
03:18:48.920 | always more and more.
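As a rough illustration of that idea, here is a toy Python sketch: fit a simple model on simulated genotype and phenotype data, then predict the trait for held-out people. The data is randomly generated and the model is deliberately naive; real polygenic prediction uses far more variants, far more careful statistics, and captures only part of the variance.

# Toy sketch of genotype-to-phenotype prediction (not a real polygenic model).
import numpy as np

rng = np.random.default_rng(0)
n_people, n_variants = 1000, 200

# Simulated genotype matrix: 0, 1, or 2 copies of an allele at each variant.
genotypes = rng.integers(0, 3, size=(n_people, n_variants)).astype(float)
true_effects = rng.normal(0.0, 0.1, size=n_variants)

# Phenotype = small genetic contribution + large environmental/noise term.
phenotype = genotypes @ true_effects + rng.normal(0.0, 1.0, size=n_people)

# Fit per-variant effect sizes on a training split, predict for held-out people.
train, test = slice(0, 800), slice(800, None)
weights, *_ = np.linalg.lstsq(genotypes[train], phenotype[train], rcond=None)
predicted = genotypes[test] @ weights

corr = np.corrcoef(predicted, phenotype[test])[0, 1]
print(f"held-out correlation: {corr:.2f}")  # partial, never perfect, prediction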
03:18:50.320 | And so with that information,
03:18:52.440 | we aren't going to just be selecting,
03:18:54.520 | as we are even now,
03:18:58.480 | based on which of these 10 early stage embryos
03:19:02.280 | is carrying a deadly genetic disorder,
03:19:04.820 | but we'll be able to know everything
03:19:07.040 | that can be partly or entirely predicted by genetics.
03:19:12.040 | And there's a lot of our humanity
03:19:14.760 | that fits into that category.
03:19:17.360 | And certainly simple traits like height and eye color
03:19:22.360 | and things like that.
03:19:23.960 | I mean, height is not at all simple,
03:19:25.600 | but it's, if you have good nutrition,
03:19:28.940 | it's entirely or mostly genetic.
03:19:31.520 | But even personality traits and personality styles,
03:19:34.480 | there are a lot of things that we see
03:19:35.960 | just as the experience, the beauty of life,
03:19:39.240 | that partly have a genetic foundation.
03:19:42.200 | So whatever part of these traits are definable
03:19:47.200 | and influenced by genetics,
03:19:50.000 | we're going to have greater and greater predictability
03:19:53.240 | within a range.
03:19:54.640 | And so selecting those embryos will be informed
03:19:59.640 | by that kind of knowledge.
03:20:02.540 | And that's why in "Hacking Darwin,"
03:20:04.640 | I talk about embryo selection as being a key driver
03:20:09.400 | of the future of human evolution.
03:20:11.680 | But then on top of that, there is in 2012,
03:20:15.400 | Shinya Yamanaka, an amazing Japanese scientist,
03:20:19.920 | won the Nobel Prize for developing a process
03:20:23.120 | for creating what are called
03:20:24.280 | induced pluripotent stem cells, IPS cells.
03:20:27.840 | And what IPS cells are is you can induce an adult cell
03:20:31.920 | to go back in developmental time and become a stem cell.
03:20:35.360 | And a stem cell is like when we're a fertilized egg,
03:20:39.780 | like our entire blueprint is in that one cell,
03:20:42.800 | and that cell can be anything,
03:20:44.340 | but then our cells start to specialize,
03:20:48.280 | and that's why we have skin cells and blood cells
03:20:50.600 | and all the different types of things.
03:20:51.760 | So with the Yamanaka process,
03:20:54.840 | we can induce an adult cell to become a stem cell.
03:20:59.840 | So the relevance to this story is what you can do,
03:21:03.120 | and it works now in animal models.
03:21:05.280 | And as far as I know, it hasn't yet been done in humans,
03:21:08.420 | but it works pretty well in animal models.
03:21:11.240 | You take any adult cell,
03:21:12.440 | but skin cells are probably the easiest.
03:21:15.160 | You induce this skin cell into a stem cell.
03:21:19.920 | And if you just take a little skin graft,
03:21:21.920 | we'd have millions of cells.
03:21:23.880 | So you induce those skin cells into stem cells.
03:21:26.900 | Then you induce those stem cells into egg precursor cells.
03:21:31.640 | Then you induce those egg precursor cells into eggs,
03:21:36.040 | egg cells.
03:21:37.260 | Then, because we have this massive overabundance
03:21:41.240 | of male sperm, you could fertilize,
03:21:45.400 | let's call it 10,000 of the mother's eggs.
03:21:49.720 | So you have 10,000 eggs which are fertilized.
03:21:52.800 | - Sounds like a party.
03:21:53.820 | - Yeah.
03:21:54.660 | Then you have an automated process
03:21:58.280 | for what I mentioned before
03:22:00.720 | in pre-implantation genetic testing.
03:22:02.440 | You grow them all for five days.
03:22:04.240 | You extract a few cells from each.
03:22:06.200 | You test them.
03:22:07.220 | And that's why I had a piece in the New York Times
03:22:09.180 | a couple of years ago,
03:22:10.060 | imagining what it would be like to go to a fertility clinic
03:22:13.020 | in the year 2050.
03:22:14.700 | And the choice is not--
03:22:15.540 | - No humans involved.
03:22:16.540 | - Yeah.
03:22:17.380 | Well, no, no, there are,
03:22:18.200 | but the choice is not,
03:22:19.500 | do you want a kid who does or doesn't have,
03:22:22.900 | let's call it Tay-Sachs.
03:22:25.560 | It's a whole range of possibilities,
03:22:28.740 | including very intimate traits
03:22:33.500 | like height, IQ, personality style.
03:22:36.020 | It doesn't mean you can predict everything,
03:22:38.000 | but it means there will be increasing predictability.
03:22:41.460 | So if you're choosing from 10,000 eggs,
03:22:45.340 | fertilized eggs, early stage embryos,
03:22:47.680 | that's a lot of choice.
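A toy sketch of why the number of candidates matters: the best of 10,000 random draws sits much further from the average than the best of 10. The simulated "predicted trait values" below are stand-ins, not any real embryo score.

# Sketch: why 10,000 candidates gives far more "choice" than 10.
import random

random.seed(0)

def simulate_candidates(n):
    # Each candidate embryo gets a noisy predicted value for some trait.
    return [random.gauss(0.0, 1.0) for _ in range(n)]

for n in (10, 10_000):
    best = max(simulate_candidates(n))
    print(f"best predicted value among {n:>6} candidates: {best:+.2f}")

# The maximum of 10,000 draws sits much further above average than the
# maximum of 10 draws -- the statistical core of selection at scale.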
03:22:49.680 | And on top of that,
03:22:52.200 | then we have the new technology of human genome editing.
03:22:57.200 | Many people have heard of CRISPR,
03:23:00.220 | but what I say is if you think of human genome editing
03:23:02.740 | as a pie,
03:23:04.980 | sorry, human genome engineering as a pie,
03:23:07.540 | genome editing is a slice,
03:23:09.140 | and CRISPR is just a sliver of that slice.
03:23:11.580 | It's just one of our tools for genome editing,
03:23:14.060 | and things are getting better and better.
03:23:16.660 | Then you can go in and change,
03:23:20.580 | let's say, I mean, again, it starts simple.
03:23:23.220 | A small number of genes,
03:23:24.880 | let's say you've selected from among the one of 10
03:23:27.900 | or the one of 10,000,
03:23:30.060 | but there are a number of changes
03:23:31.380 | that you would like to make to achieve some kind of outcome.
03:23:33.980 | And biology is incredibly complex,
03:23:36.340 | and it's not that one gene does one thing.
03:23:38.660 | One gene does probably a lot of things simultaneously,
03:23:42.620 | which is why the decision about changing one gene
03:23:45.500 | if it's causing deathly harm
03:23:47.540 | is easier than when we think about
03:23:49.620 | the complexity of biology.
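A tiny sketch of the pleiotropy point: if one gene touches several traits, any proposed edit has to be evaluated against all of them. The gene names and trait labels below are invented placeholders.

# Tiny sketch of pleiotropy: one gene can influence several traits, so any
# proposed edit has to be weighed against everything it might touch.
# Gene names and trait labels are invented placeholders.

GENE_EFFECTS = {
    "GENE_A": ["trait_1 (intended)", "trait_2 (side effect)", "trait_3 (poorly understood)"],
    "GENE_B": ["trait_4 (intended)"],
}

def affected_traits(proposed_edits):
    """List every trait touched by a set of proposed gene edits."""
    return sorted({t for gene in proposed_edits for t in GENE_EFFECTS.get(gene, [])})

print(affected_traits({"GENE_A"}))  # the "simple" edit still touches three traits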
03:23:52.100 | - But as the machine learning gets better
03:23:53.340 | and better at predicting the full complexity of biology,
03:23:56.060 | so as the prediction gets better,
03:23:58.500 | then your editing,
03:23:59.900 | your ability to reliably edit
03:24:03.220 | such that the outcomes are predictable,
03:24:05.100 | gets better and better.
03:24:05.940 | So the two are coupled together.
03:24:07.420 | - You got it, that's exactly it.
03:24:09.060 | And then so that's why,
03:24:10.420 | and people would say, well, that,
03:24:11.740 | I mean, I wrote about that in my two science fiction novels,
03:24:15.700 | Genesis Code and Eternal Sonata,
03:24:17.740 | years ago, especially with Genesis Code,
03:24:19.620 | I wrote about that,
03:24:20.980 | and as a sci-fi,
03:24:23.060 | and I had actually testified before Congress,
03:24:25.980 | but now 15 years ago,
03:24:27.280 | saying here's what the future looks like.
03:24:30.660 | But even I was surprised. My first edition of Hacking Darwin
03:24:35.660 | was already in production
03:24:39.460 | when, in November 2018,
03:24:42.620 | this scientist, He Jiankui,
03:24:44.780 | announced in Hong Kong
03:24:47.220 | that the world's first two,
03:24:50.460 | and later three, CRISPR babies had been born,
03:24:52.500 | whom he had genetically altered
03:24:54.460 | with a misguided and, in my view,
03:24:56.140 | dangerous
03:24:58.540 | goal of making it
03:25:02.540 | so they would have increased resistance to HIV.
03:25:06.920 | And so I called my publisher,
03:25:09.760 | and I said, I've got good news and bad news.
03:25:12.320 | I'll start with the bad news,
03:25:13.540 | is that the world's first CRISPR babies have been born,
03:25:17.660 | and so we need to pull my book out of production,
03:25:21.540 | because you can't have a book on the future
03:25:23.360 | of human genetic engineering,
03:25:25.380 | and have it not mention the first CRISPR babies
03:25:27.960 | that had been born.
03:25:29.220 | But the good news is, in the book,
03:25:32.020 | I had predicted that it's going to happen,
03:25:34.460 | and it's going to happen in China, and here's why.
03:25:37.780 | And all we need to do is add a few more sentences,
03:25:42.220 | and that was the hardback,
03:25:43.380 | and then I updated it more in the paperback,
03:25:45.220 | saying, and it happened,
03:25:46.740 | and it was announced on this day.
03:25:48.780 | - Yeah.
03:25:50.100 | Well, then let's fast forward.
03:25:52.980 | Given your predictions are slowly becoming reality,
03:25:58.260 | let's talk about some philosophy and ethics, I suppose.
03:26:02.640 | So, not to be too self-deprecating here,
03:26:07.860 | but if my parents had the choice,
03:26:12.300 | I would probably be less likely to come out the winner.
03:26:18.420 | We're all weird, and I'm certainly a very
03:26:21.500 | distinctly weird specimen of the human species.
03:26:26.940 | I can give the full long list of flaws,
03:26:29.760 | and we can be very poetic of saying
03:26:32.320 | those are features and so on, but they're not.
03:26:36.280 | If you look at the menu--
03:26:39.040 | - Again, for these women who are listening,
03:26:42.640 | apropos of your thing,
03:26:43.760 | they're all kind of charming individualities.
03:26:47.280 | - Yes, that's beautiful, yes, thank you.
03:26:50.240 | But anyway, on the full sort of individual,
03:26:52.960 | let's say IQ alone, right?
03:26:55.120 | That what do we do about a world where
03:27:00.120 | IQ could be selected in a menu when you're having children?
03:27:10.520 | What concerns you about that world?
03:27:15.680 | What excites you about that world?
03:27:17.920 | Are there certain metrics that excite you more than others?
03:27:25.000 | IQ has been a source of,
03:27:28.460 | I don't know, I'm not sure IQ as a measure,
03:27:36.120 | flawed as it is, has been used to celebrate
03:27:40.880 | the successes of the human species
03:27:43.680 | nearly as much as it has been used to divide people,
03:27:47.120 | to say negative things about people,
03:27:49.560 | to make negative claims about people.
03:27:54.120 | And in that same way, it seems like when there's
03:27:57.440 | a selection, a genetic selection based on IQ,
03:28:01.480 | you can start now having classes of citizenry,
03:28:05.160 | like further divide, the rich get richer.
03:28:09.500 | It'll be very rich people that'll be able to do
03:28:13.440 | kind of fine selection of IQ,
03:28:16.440 | and then they will start forming these classes
03:28:21.440 | of super intelligent people,
03:28:23.960 | and those super intelligent people in their minds
03:28:26.320 | would of course be the right people to be making
03:28:28.400 | global authoritarian decisions about everybody else,
03:28:31.320 | all the usual aspects of human nature,
03:28:33.980 | but now magnified with the new tools of technology.
03:28:38.280 | Anyway, all that to say is what's exciting to you,
03:28:42.480 | what's concerning to you?
03:28:44.280 | - It's a great question, and just stepping into the IQ,
03:28:48.880 | we'll call it a quagmire for now,
03:28:51.880 | but it raises a lot of big issues which are complicated.
03:28:56.880 | Maybe you've listened to Sam Harris's interview
03:29:02.240 | with Charles Murray, and then that spawned
03:29:05.040 | kind of a whole industry of debate.
03:29:10.040 | So first, just the background of IQ,
03:29:13.360 | and it's from the early 20th century,
03:29:16.440 | and there was the idea that we can measure
03:29:19.120 | people's general intelligence,
03:29:21.240 | and there are so many different kinds of intelligence.
03:29:23.720 | This was measuring a specific thing.
03:29:25.680 | So my feeling is that IQ is not a perfect measure
03:29:30.440 | of intelligence, but it's a perfect measure of IQ.
03:29:33.440 | Like it's measuring what it's measuring,
03:29:35.480 | but that thing correlates to a lot of things
03:29:40.040 | which are rewarded in our society.
03:29:43.480 | So every study of IQ has shown that people with higher IQs,
03:29:49.400 | they make more money, they live longer,
03:29:52.160 | they have more stable relationships.
03:29:53.960 | I mean, that could be something in the testing,
03:29:56.640 | but as Sam Harris has talked about a lot,
03:30:00.600 | you could line up all of these kinds of IQ
03:30:03.880 | and IQ-like tests, and they correlate with each other.
03:30:07.120 | So the people who score high on one score high on all of them
03:30:10.560 | and people think that IQ tests are like a thing where
03:30:15.080 | the Earl of Dorchester is coming for dinner.
03:30:18.840 | Does he have two forks or three forks
03:30:21.320 | or something like that?
03:30:22.200 | It's not that.
03:30:23.280 | A lot of them are things that I think a lot of us
03:30:26.520 | would recognize are relevant,
03:30:27.800 | just like how much stuff can you memorize?
03:30:30.640 | If you see some shapes, how can you position them
03:30:33.560 | and things like that?
03:30:36.000 | And so IQ, I mean, it really hit its stride
03:30:38.760 | in certainly in the second world war
03:30:40.760 | when our governments were just processing
03:30:43.000 | a lot of people and trying to figure out
03:30:44.440 | who to put in what job.
03:30:47.080 | So that's the starting point.
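As a toy illustration of the earlier claim that different cognitive tests correlate, the following Python snippet simulates several test scores that share a single common factor plus test-specific noise, then looks at their correlation matrix and how much variance sits on the first principal component. Everything here is simulated; it only mirrors the statistical structure being described, not any real test battery.

# Toy illustration: test scores sharing one common factor correlate,
# and most of their shared variance sits on a single principal component.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_tests = 5000, 4

g = rng.normal(size=n_people)              # shared ("general") factor
loadings = np.array([0.8, 0.7, 0.6, 0.5])  # assumed loadings, one per test
noise = rng.normal(size=(n_people, n_tests))

# Each test score = shared factor * loading + test-specific noise.
scores = g[:, None] * loadings + noise * np.sqrt(1.0 - loadings**2)

corr = np.corrcoef(scores, rowvar=False)   # tests correlate positively
eigvals = np.linalg.eigvalsh(corr)[::-1]   # descending eigenvalues
print(np.round(corr, 2))
print("share of variance on first component:", round(eigvals[0] / n_tests, 2))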
03:30:48.880 | Let me start first with the negatives.
03:30:51.960 | That in our societies, when we talk about diversity
03:30:56.720 | in Darwinian terms, it's not,
03:31:00.440 | from a Darwinian perspective,
03:31:01.960 | Oh, wouldn't it be nice if we have some moths
03:31:05.880 | of different colors because it'll be really fun
03:31:08.320 | to have different colored moths.
03:31:10.200 | Diversity is the sole survival strategy of our species
03:31:14.720 | and of every species.
03:31:15.960 | And it's impossible to predict
03:31:20.080 | what diversity is going to be rewarded.
03:31:22.920 | And I've said this before, if you went down
03:31:25.440 | and you spoke T-Rex, and you spoke
03:31:28.600 | to the dinosaurs and said, hey, you can select your kids,
03:31:31.920 | what criteria do you want?
03:31:33.440 | And they'd say, oh yeah, yeah, sharp teeth,
03:31:36.120 | cruel fangs, roar, whatever it is
03:31:39.560 | that makes you a great T-Rex.
03:31:42.440 | But the answer from an evolutionary perspective,
03:31:46.280 | from an earth perspective was, oh, it's much better
03:31:48.240 | to be like a cockroach or an alligator
03:31:50.800 | or some little nothing or a little shrew
03:31:53.680 | because the dinosaurs are gonna get wiped out
03:31:57.480 | when the asteroid hits.
03:31:58.880 | And so there's no better or worse in evolution.
03:32:01.960 | There's just better or worse suited for a given environment.
03:32:05.840 | And when that environment changes,
03:32:07.680 | the best suited person from the old system
03:32:11.640 | could be the worst suited person for the new one.
03:32:14.080 | So if we start selecting for the things
03:32:17.080 | that we value the most, including things like IQ,
03:32:21.720 | but even disease resistance, I mean, this is well-known,
03:32:26.360 | but people who are recessive carriers
03:32:29.240 | of sickle cell disease have increased resistance
03:32:33.480 | to malaria, which is the biggest reason
03:32:35.640 | why that trait hasn't just disappeared,
03:32:40.080 | given how deadly sickle cell disease is,
03:32:43.920 | biology is incredibly complex.
03:32:46.120 | We understand such a tiny percentage of it
03:32:49.400 | that we need to have, in your words,
03:32:51.000 | just a level of humility.
03:32:53.920 | There are huge equity issues, as you've articulated.
03:32:56.560 | Let's just say that it is the case that in our society,
03:32:59.760 | IQ and IQ-like traits are highly rewarded.
03:33:04.760 | There is an equity issue, but it works in both ways
03:33:08.160 | because my guess is, let's just say
03:33:09.920 | that we had a society where we were doing genome sequencing
03:33:13.440 | of everybody who was born,
03:33:15.400 | and we had some predictive model to predict IQ,
03:33:18.800 | and we had decided as a society
03:33:21.160 | that IQ was going to be what we were going to select for.
03:33:24.320 | We were gonna put the highest IQ people
03:33:26.280 | in these different roles.
03:33:28.680 | I guarantee you, the people in those roles
03:33:31.560 | would not be the people
03:33:32.680 | who are legacy admissions to Harvard.
03:33:36.040 | They would very likely be people who are born in slums,
03:33:40.800 | people who are born with no opportunity,
03:33:42.720 | or in refugee camps, who are just wasting away
03:33:46.520 | because we've thrown them away.
03:33:49.280 | It's the idea of just being able to look under the hood
03:33:57.120 | of our humanity is really scary for everybody,
03:34:02.120 | and it should be.
03:34:03.120 | I'm also an Ashkenazi Jew.
03:34:06.120 | My father was born in Austria.
03:34:08.040 | My father and grandparents came here as refugees.
03:34:11.120 | After the war, most of that side of the family was killed,
03:34:14.840 | so I get what it means to be on the other,
03:34:19.280 | and you said you're reading Mein Kampf,
03:34:20.520 | on the other side of the story,
03:34:22.200 | when someone said, "Oh, here's what's good,
03:34:24.560 | "and you're not good, and therefore you're,"
03:34:26.920 | so I totally get that.
03:34:29.800 | Having said that, I do believe that we're moving
03:34:34.760 | toward a new way of procreating,
03:34:38.360 | and we're going to have to decide what are the values
03:34:41.400 | that we would like to realize through that process.
03:34:45.120 | Is it randomness, which is what we currently have now,
03:34:48.240 | which is not totally random
03:34:49.760 | because we have a sort of mating through colleges
03:34:52.040 | and other things, but if it's--
03:34:54.600 | - Wait, mating through what, colleges?
03:34:56.280 | - It's sort of like if you go,
03:34:58.320 | if you go to Harvard or whatever,
03:35:01.320 | and your wife also goes to Harvard, it's like--
03:35:05.000 | - So it's location-based mating.
03:35:07.880 | - Well, it's not location, it's selection.
03:35:09.800 | It's like there are selections that are made
03:35:11.760 | about who gets to a certain place,
03:35:14.400 | and it's like Harvard admissions is a filter.
03:35:17.360 | So we're gonna have to decide what are the values
03:35:19.600 | that we want to realize through this process
03:35:21.400 | because diversity, it's just baked into our biology.
03:35:24.900 | We're the first species ever
03:35:27.080 | that has the opportunity to make choices
03:35:29.960 | about things that were otherwise baked into our biology,
03:35:33.720 | and there's a real danger that if we make bad choices,
03:35:37.520 | even with good intentions,
03:35:39.560 | it could even drive us toward extinction
03:35:42.440 | and certainly undermine our humanity,
03:35:45.120 | and that's why I always say, and like I said,
03:35:47.200 | I'm deeply involved with WHO and other things,
03:35:50.160 | that these aren't conversations about science.
03:35:53.080 | They're conversations, science brings us
03:35:55.320 | to the conversation, but the conversation
03:35:56.920 | is about values and ethics.
03:35:58.280 | - As you described, that world is wide open.
03:36:00.500 | It's not even a subtly different world.
03:36:04.920 | That world is fundamentally different
03:36:06.640 | from anything we understand about life on Earth
03:36:09.740 | because natural selection, this random process,
03:36:14.740 | is so fundamental how we think about life.
03:36:17.820 | Being able to program, I mean, it has the chance to,
03:36:23.360 | I mean, it'll probably make my question
03:36:25.820 | about the ethical concerns around IQ-based selection
03:36:29.880 | just meaningless because it'll change
03:36:35.000 | the nature of identity.
03:36:36.720 | Like, it's possible it will dissolve identity
03:36:40.760 | because we take so much pride
03:36:44.840 | in all the different characteristics that make us who we are.
03:36:48.640 | Whenever you have some control over those characteristics,
03:36:51.980 | those characteristics start losing meaning.
03:36:54.120 | And what may start gaining meaning
03:36:57.760 | is the ideas inside our heads, for example,
03:37:00.600 | versus the details of, is it a Commodore 64,
03:37:05.600 | is it a PC, is it a Mac?
03:37:10.680 | It's gonna be less important than the software
03:37:12.760 | that runs on it.
03:37:14.280 | So we can more and more be operating
03:37:16.040 | in the digital space, and identity could be something
03:37:18.400 | that borrows multiple bodies.
03:37:21.720 | The legacy of our ideas may become more important
03:37:24.760 | than the details of our physical embodiment.
03:37:28.220 | I mean, I'm saying perhaps ridiculous-sounding things,
03:37:31.620 | but the point is it will bring up
03:37:34.400 | so many new ethical concerns
03:37:38.060 | that our narrow-minded thinking
03:37:39.900 | about the current ethical concerns will not apply.
03:37:43.440 | But it's important to think about all this kind of stuff
03:37:47.340 | like actively.
03:37:49.160 | What are the right conversations to be having now?
03:37:51.560 | 'Cause it feels like it's an ongoing conversation
03:37:56.200 | that then continually evolves, like with an NIH involved.
03:37:59.680 | Like, do you do experiments with animals?
03:38:03.400 | Do you build these brain organoids?
03:38:06.200 | Do you, like, through that process you described
03:38:08.320 | with the stem cells, like, do you experiment
03:38:10.640 | with a bunch of organisms to see how genetic material,
03:38:15.640 | what formula that actually takes,
03:38:18.080 | how to minimize the chance of cancer,
03:38:19.520 | and all those kinds of things.
03:38:20.500 | What are the negative consequences of that?
03:38:22.640 | What are the positive consequences?
03:38:24.840 | Yeah, it's a fascinating world.
03:38:26.920 | It's a really fascinating world.
03:38:28.240 | - Yeah, but those conversations are just so essential.
03:38:32.240 | Like, we have to be talking about ethics,
03:38:34.280 | and then that raises the question of who is the we?
03:38:37.400 | And coming back to your conversation
03:38:39.800 | about science communication,
03:38:41.980 | maybe there was a time earlier
03:38:44.120 | when these conversations were held
03:38:47.440 | among a small number of experts
03:38:49.360 | who made decisions on behalf of everybody else.
03:38:53.140 | But what we're talking about here
03:38:54.820 | is really the future of our species,
03:38:57.100 | and I think that conversation is too important
03:39:01.140 | to be left just to experts and government officials.
03:39:04.980 | So I mentioned that I'm a member.
03:39:06.860 | We just ended our work after two years
03:39:09.060 | of the World Health Organization Expert Advisory Committee
03:39:11.580 | on Human Genome Editing, and my big push in that process
03:39:16.460 | was to have education, engagement, and empowerment
03:39:21.140 | of the broad public,
03:39:22.240 | to not just bring people into the conversation
03:39:25.980 | with the tools to be able to engage,
03:39:29.320 | but also into the decision-making process.
03:39:31.980 | And it's a real shift,
03:39:33.780 | and there are countries that are doing it
03:39:36.020 | better than others.
03:39:37.220 | I mean, Denmark is obviously a much smaller country
03:39:39.660 | than the United States,
03:39:40.900 | but they have a really well-developed infrastructure
03:39:44.400 | for public engagement
03:39:46.420 | around really complicated scientific issues.
03:39:49.100 | And I just think that we have to,
03:39:51.000 | like, it's great that we have Twitter
03:39:53.420 | and all these other things,
03:39:55.140 | but we need structured conversations
03:39:58.460 | where we can really bring people together
03:40:00.500 | and listen to each other,
03:40:02.380 | which feels like it's harder than ever.
03:40:05.240 | But even now in this process
03:40:08.420 | where all these people are shouting at each other,
03:40:11.340 | at least there are a bunch of people
03:40:12.840 | who are in the conversation,
03:40:14.540 | so we have a foundation,
03:40:16.540 | but we just really need to do more work.
03:40:20.440 | And again and again and again,
03:40:21.780 | it's about ethics and values
03:40:24.920 | because we're at an age,
03:40:27.240 | and this has become a cliche,
03:40:28.780 | of exponential technological change.
03:40:31.380 | And so the rate of change is faster going forward
03:40:35.500 | than it has been in the past.
03:40:36.680 | So in our minds,
03:40:37.780 | we underappreciate how quickly things are changing
03:40:41.080 | and will change.
03:40:43.420 | And if we're not careful,
03:40:44.660 | if we don't know who we are
03:40:46.900 | and what our values are,
03:40:48.680 | we're going to get lost.
03:40:49.820 | And we don't have to know technology.
03:40:52.260 | We have to know who we are.
03:40:53.780 | I mean, our values are hard won over thousands of years.
03:40:58.780 | No matter how new the technology is,
03:41:01.100 | we shouldn't and can't jettison our values
03:41:04.460 | 'cause that is our primary navigational tool.
03:41:07.600 | - Absurd question
03:41:09.780 | 'cause we were saying that sexual reproduction
03:41:14.780 | is not the best way to define the offspring.
03:41:17.860 | You think there'll be a day when humans stop having sex?
03:41:21.500 | - I don't think we'll stop having sex
03:41:23.280 | because it's so enjoyable,
03:41:25.540 | but we may significantly stop having sex for reproduction.
03:41:30.920 | Even today, most human sex is not for making babies.
03:41:35.100 | It's for other things,
03:41:36.240 | whether it's pleasure or love or pair bonding
03:41:39.000 | or whatever.
03:41:40.820 | - Intimacy.
03:41:41.660 | - Intimacy.
03:41:42.500 | I mean, some people do it for intimacy.
03:41:44.500 | Some people do it for pleasure with strangers.
03:41:47.020 | - I feel like the people that do it for pleasure,
03:41:48.920 | I feel like there will be better ways
03:41:50.940 | to achieve that same chemical pleasure, right?
03:41:54.660 | - You know, there's just so many different kinds of people.
03:41:57.600 | I just saw this on television,
03:42:00.060 | but there are people who put on those big bunny outfits
03:42:02.660 | and go and have sex with other people.
03:42:04.380 | I mean, there's just like an unlimited number
03:42:06.700 | of different kinds of people.
03:42:07.980 | - I think they're called,
03:42:09.500 | so I remember hearing about this,
03:42:11.020 | I think Dan Savage is the podcast.
03:42:14.100 | I think they're called furries.
03:42:17.580 | - Furries.
03:42:18.460 | - Like furry parties.
03:42:19.540 | - Yeah, exactly.
03:42:20.380 | So there's just-
03:42:21.660 | - I love people.
03:42:22.780 | - Yeah, well, that's like the thing.
03:42:25.220 | It's like, whenever you hear these words,
03:42:26.540 | like humans, what will they think of next?
03:42:30.800 | So, but I do think that,
03:42:32.020 | and I write about this in "Hacking Darwin,"
03:42:34.620 | that as people come to believe
03:42:39.020 | that making children through the application of science
03:42:43.820 | is safer and more beneficial
03:42:46.100 | than having children through sex,
03:42:48.780 | we'll start to see a shift over time
03:42:52.140 | toward reproduction through science.
03:42:54.420 | We'll still have sex for all the same great reasons
03:42:58.780 | that we do it now,
03:43:00.420 | it's just reproduction less and less through the act of sex.
03:43:04.420 | - Man, it's such a fascinating future.
03:43:07.080 | 'Cause as somebody, I value flaws.
03:43:11.180 | I think it's the goodwill hunting,
03:43:17.140 | that's the good stuff.
03:43:18.540 | The flaws, the weird quirks of humans,
03:43:21.900 | that's what makes us who we are, the weird.
03:43:25.380 | The weird is the beautiful,
03:43:26.660 | and there's a fear of optimization that I-
03:43:33.020 | - You should have it.
03:43:33.860 | - I mean, it's very healthy.
03:43:35.500 | And I think that's the danger of all of this selection
03:43:39.900 | is that we make selections just based on social norms
03:43:44.340 | that are so deeply internal
03:43:47.460 | that they feel like they're eternal truths.
03:43:51.180 | And so we talked about selecting for IQ.
03:43:54.580 | What about selecting for a kind heart?
03:43:56.780 | Like there are lots of you,
03:43:57.620 | and you talked about Hitler and Mein Kampf.
03:43:59.500 | Hitler certainly had a high IQ,
03:44:03.060 | or I guess higher than average IQ.
03:44:05.120 | If we just select, I mean, that's why I was saying before,
03:44:11.080 | diversity is baked into our biology.
03:44:13.500 | But the key lesson, and I've said this many times before,
03:44:16.180 | the key lesson of this moment in our history
03:44:19.020 | is that after nearly 4 billion years of evolution,
03:44:22.780 | our one species suddenly has the unique
03:44:26.380 | and increasing ability to read, write,
03:44:28.140 | and hack the code of life.
03:44:30.260 | And so as we apply these godlike powers
03:44:34.100 | that we've now assumed for ourselves,
03:44:37.940 | we better be pretty careful
03:44:39.820 | because it's so easy to make mistakes,
03:44:44.120 | particularly mistakes that are guided
03:44:47.060 | by our best intentions.
03:44:48.460 | - To jump briefly back onto lab leak,
03:44:52.180 | and I swear there's a reason for that.
03:44:55.660 | What did you think about the Jon Stewart,
03:44:58.180 | this moment, I forget when it was,
03:45:00.820 | maybe a few months ago, in the summer, I think, of 2021,
03:45:05.220 | where he went on Colbert Report,
03:45:07.900 | or not the Colbert Report, sorry,
03:45:09.340 | the Stephen Colbert's, whatever his show is.
03:45:13.340 | But again, Jon Stewart reminded us
03:45:15.700 | how valuable his wit and brilliance
03:45:18.860 | within the humor was for our culture.
03:45:22.220 | And so he did this whole bit
03:45:24.500 | that highlighted the common sense nature
03:45:27.460 | about what was the metaphor he used
03:45:30.020 | about the Hershey factory in Pennsylvania.
03:45:33.140 | So what'd you think about that whole bit?
03:45:34.860 | - I loved it.
03:45:36.580 | And so not to be overly self-referential,
03:45:40.940 | but it's hard not to be overly self-referential
03:45:43.180 | when you're doing a, whatever, however long we are,
03:45:45.020 | five-hour interview about yourself,
03:45:47.580 | which reminds me of when you had Bret Weinstein on.
03:45:50.660 | He said, "I have no ego,
03:45:52.500 | but these 57 people have screwed me over."
03:45:55.020 | And I deserve credit.
03:45:56.460 | - It's hard, it's hard to walk out.
03:45:58.540 | - I am a person, I will confess, it's enjoyable.
03:46:02.820 | Some people feel different.
03:46:03.660 | I kind of like talking about all this stuff
03:46:06.260 | and talking, period.
03:46:08.020 | So for me, in the earliest,
03:46:10.620 | I remember those early days when the pandemic started,
03:46:13.900 | I was just sitting down,
03:46:14.900 | it was late January, early February, 2020,
03:46:17.260 | and I just was laying out all of the evidence
03:46:20.220 | just that I could collect,
03:46:22.500 | trying to make sense of where does this come from?
03:46:26.420 | And it was just logic.
03:46:29.100 | I mean, it was all of the things that Jon Stewart said,
03:46:33.100 | which in some overly wordy form
03:46:36.380 | were all at that time on my website.
03:46:38.620 | Like, what are the odds of having this outbreak
03:46:42.820 | of a bat coronavirus more than a thousand miles away
03:46:45.900 | from where these bats have their natural habitat,
03:46:49.140 | where they have the largest collection
03:46:52.500 | of these bat coronaviruses in the world,
03:46:55.100 | and they're doing all these very aggressive
03:46:57.260 | research projects to make them more aggressive,
03:47:02.100 | and then you have the outbreak of a virus
03:47:04.260 | that's primed for human-to-human transmission.
03:47:07.860 | It was just logic was my first step,
03:47:12.540 | and I kept gathering the information.
03:47:17.500 | But Jon Stewart distilled that
03:47:20.060 | in a way that just everybody got.
03:47:23.300 | And I think that I loved it,
03:47:26.300 | and I just think that there's a way of reaching people.
03:47:28.980 | It's the reason why I write science fiction
03:47:31.500 | in addition to thinking and writing about the science,
03:47:34.180 | is that we kind of have to reach people where they are.
03:47:37.660 | And I just thought it was just,
03:47:40.340 | there was a lot of depth, I thought,
03:47:42.700 | and maybe that's too self-serving,
03:47:46.620 | but in the analysis, but he captured that
03:47:50.740 | into those things about, it's like the, whatever,
03:47:55.740 | the outbreak of chewy goodness near the Hershey factory.
03:48:00.500 | I wonder where that came from.
03:48:01.980 | - Yeah, the humor, there's metaphor.
03:48:05.260 | Also, the sticking with the joke when the audience is,
03:48:09.340 | the audience is Stephen Colbert.
03:48:14.020 | He was resisting it.
03:48:16.380 | He was very uncomfortable with it.
03:48:17.820 | Maybe that was part of the bit, I'm not sure,
03:48:20.220 | but it didn't look like it.
03:48:21.980 | So Stephen in that moment kind of represented
03:48:24.220 | the discomfort of the scientific community, I think.
03:48:26.780 | It's kind of interesting, that whole dynamic.
03:48:28.860 | And I think that was a pivotal moment.
03:48:31.500 | That just highlights the value of comedy,
03:48:35.780 | the value of, like when Joe Rogan says,
03:48:40.100 | "I'm just a comedian."
03:48:41.660 | I mean, that's such a funny thing to say.
03:48:46.220 | It's like saying, "I'm just a podcaster,"
03:48:48.100 | or, "I'm just a writer."
03:48:49.340 | I'm just a, you know, that ability in so few words
03:48:54.340 | to express what everybody else is thinking,
03:48:59.200 | it's so refreshing.
03:49:02.500 | And I wish the scientific communicators would do that too.
03:49:06.300 | A little humor, a little humor.
03:49:08.260 | I mean, that's why I love Elon Musk very much.
03:49:10.540 | So like the way he communicates is like,
03:49:14.900 | it's so refreshing for a CEO of a major company,
03:49:18.980 | several major companies, to just have a sense of humor
03:49:22.520 | and say ridiculous shit every once in a while.
03:49:24.820 | That's so, there's something to that.
03:49:26.960 | Like it shakes up the whole conversation
03:49:29.540 | to where it gives you freedom to like think publicly.
03:49:33.780 | If you're always trying to say the proper thing,
03:49:36.900 | you lose the freedom to think,
03:49:38.740 | to reason out, to be authentic and genuine.
03:49:42.740 | When you allow yourself the freedom
03:49:44.940 | to regularly say stupid shit, have fun, make fun of yourself,
03:49:49.940 | I think you give yourself freedom
03:49:53.100 | to really be a great scientist.
03:49:54.740 | Honestly, I think scientists have a lot
03:49:58.700 | to learn from comedians.
03:50:00.060 | - Well, for sure.
03:50:00.900 | I think we all do about just distilling and communicating
03:50:05.340 | in ways that people can hear.
03:50:07.280 | Like a lot of us say things and people just can't hear them
03:50:10.420 | either because of the way we're saying them
03:50:12.060 | or where they are.
03:50:13.540 | And like I said before, I'm a big fan of Joe Rogan.
03:50:18.540 | I've been on his show twice and whatever,
03:50:21.340 | but when Francis Collins was in his conversation with you,
03:50:25.500 | he said, which I think makes sense,
03:50:27.820 | is that when somebody has that kind of platform
03:50:30.980 | and people rightly or wrongly who follow them
03:50:34.040 | and look to them for guidance,
03:50:36.940 | I do think that there is some responsibility
03:50:40.740 | for people in those roles to make whatever judgment
03:50:45.020 | that they make and to share that.
03:50:47.220 | And as I mentioned to you when we were off mic,
03:50:50.660 | Sanjay Gupta is a very close friend of mine.
03:50:53.180 | We've been friends for many years
03:50:54.860 | and I fully supported Sanjay's instinct
03:50:58.420 | to go on the Joe Rogan show.
03:51:02.160 | I thought it was great.
03:51:05.180 | At the end of that whole conversation, Joe said,
03:51:09.220 | "Well, I'm just a comedian, what do I know?"
03:51:12.300 | And I just felt that yes, Joe Rogan is a comedian.
03:51:16.460 | I wouldn't say just a comedian among other things,
03:51:20.500 | but I also felt that he had a responsibility
03:51:23.780 | for just saying whatever he believed,
03:51:25.340 | even if he believed or believes, as I think is the case,
03:51:30.100 | that ivermectin should be studied more,
03:51:33.000 | with which I certainly agree,
03:51:35.860 | and that healthy people shouldn't get vaccinated,
03:51:40.860 | healthy young people, with which I don't agree.
03:51:43.380 | I just felt at the end of that conversation to say,
03:51:45.580 | "Well, I'm just a comedian, what do I know?"
03:51:48.300 | I feel like it didn't fully integrate the power
03:51:52.720 | that a person like Joe Rogan has to set the agenda.
03:51:55.960 | - So I think the reason he says, "I'm just a comedian,"
03:51:59.060 | is the same reason I say, "I'm an idiot,"
03:52:01.220 | which I truly believe.
03:52:02.860 | I can explain exactly what I mean by that,
03:52:04.580 | but it's more for him, or in this case for me,
03:52:09.580 | to just keep yourself humble.
03:52:11.840 | 'Cause I think it's a slippery slope
03:52:15.380 | when you think you have a responsibility
03:52:17.060 | to then think you actually have an authority,
03:52:20.320 | because a lot of people listen to you,
03:52:23.300 | you think you have an authority
03:52:24.620 | to actually speak to those people,
03:52:27.220 | and that you have enough authority
03:52:28.460 | to know what the hell you're talking about.
03:52:30.500 | And I think there's just a humility
03:52:32.460 | to just kind of make fun of yourself
03:52:34.640 | that's extremely valuable.
03:52:36.340 | And saying, "I'm just a comedian,"
03:52:38.140 | I think is a reminder to himself
03:52:41.820 | that he's often full of shit, as are all of us.
03:52:46.820 | And so that's a really powerful way for himself
03:52:51.740 | to keep himself humble.
03:52:52.980 | I mean, I think that's really useful
03:52:56.180 | in some kind of way for people in general
03:52:58.420 | to make fun of themselves a little bit,
03:53:01.300 | in whatever way that means.
03:53:02.660 | And saying, "I'm just a comedian,"
03:53:04.100 | is just one way to do that.
03:53:05.500 | Now, couple that with the responsibility
03:53:07.900 | of doing the research and really having an open mind
03:53:11.500 | and all those kinds of things,
03:53:13.000 | I think that's something that Joe does really well
03:53:15.860 | on a lot of topics, but he can't do that on everything.
03:53:18.820 | And so it's up to the people to decide
03:53:22.780 | how well he does it on certain topics and not others.
03:53:26.900 | But how do you think Sanjay did in that conversation?
03:53:29.620 | - So I know I'm gonna get myself into trouble here
03:53:32.460 | because Sanjay's a very close friend.
03:53:35.700 | Joe, my personal interaction with him
03:53:38.540 | has been our two interviews,
03:53:40.020 | but it's like my interview with you now.
03:53:41.500 | Sit down with somebody for four hours,
03:53:43.580 | it's a lot and great, and then private communication.
03:53:48.580 | So I am personally more sympathetic to the arguments
03:53:54.220 | that Sanjay was making or trying to make.
03:53:58.540 | I believe that the threat of the virus
03:54:01.380 | is greater than the threat of the vaccine.
03:54:04.300 | That doesn't mean that we can guarantee 100% safety
03:54:08.940 | for the vaccine,
03:54:09.900 | but these are really well tolerated vaccines.
03:54:13.740 | And we know for all the reasons we've been talking about,
03:54:15.700 | that this is a really scary virus.
03:54:18.500 | And particularly the mRNA vaccines,
03:54:20.940 | what they're basically doing is getting your body
03:54:23.420 | to replicate a tiny little piece of the virus,
03:54:26.180 | the spike protein,
03:54:27.180 | and then your body responds to that.
03:54:30.100 | And so that's a much less of an insult to your body
03:54:34.980 | than being infected by the virus.
03:54:37.460 | So I'm more sympathetic to the people who say,
03:54:41.500 | well, everybody should get vaccinated,
03:54:45.140 | but people who've already been infected,
03:54:48.060 | we should study whether they need to be vaccinated or not.
03:54:52.940 | Having said all of that,
03:54:54.980 | I felt that Joe Rogan won the debate.
03:54:59.980 | And the reason that I felt that he won the debate
03:55:03.780 | was they had two different categories of arguments.
03:55:08.620 | So Sanjay, what he was trying to do,
03:55:11.900 | which I totally respect,
03:55:13.140 | was saying there's so much animosity
03:55:14.980 | between these different sides,
03:55:16.620 | let's lower the temperature.
03:55:17.980 | Let's model that we can have a respectful dialogue
03:55:22.900 | with each other where we can actually listen.
03:55:25.020 | And Sanjay, again, I've known him for many years.
03:55:27.500 | He's a very empathic, humble,
03:55:30.780 | just an all around wonderful human being.
03:55:34.060 | And I really love him.
03:55:36.100 | And so he was making cases that were based on
03:55:38.780 | kind of averages, studies, and things like that.
03:55:41.500 | And Joe was saying, well, I know a guy
03:55:44.780 | whose sister's cousin had this experience.
03:55:48.420 | And I'm sure that it's all true
03:55:50.780 | in the sense that we have millions of people
03:55:53.060 | who are getting vaccinated and different things.
03:55:56.580 | And what Sanjay should have said was,
03:55:59.460 | I know that's anecdote.
03:56:01.740 | Here's another anecdote of like
03:56:03.900 | when Francis Collins was with you
03:56:05.420 | and he talked about the world wrestling guy
03:56:08.220 | who was like 6'6" and a big muscly guy.
03:56:10.660 | And then he got COVID and he was anti-vaxxed.
03:56:13.180 | And then he got COVID and almost died.
03:56:14.820 | And he said, I'm gonna--
03:56:15.660 | - By the way, I don't know if you know this part.
03:56:17.940 | - No.
03:56:18.780 | - Oh, this is funny.
03:56:19.620 | Joe's gonna listen to this.
03:56:20.740 | He's gonna be laughing.
03:56:22.020 | - Does Joe listened like to the four hours of this
03:56:24.660 | in addition to the three hours of his interviews every day?
03:56:28.340 | - No, not every day, but he listens to a lot of these.
03:56:30.780 | - I love it.
03:56:31.620 | - And we talk about it.
03:56:32.440 | - I love it.
03:56:33.280 | - We argue about it.
03:56:34.100 | - Hi, Joe.
03:56:34.940 | - Hey, Joe.
03:56:35.780 | We love you, Joe.
03:56:36.940 | But he, so that particular case,
03:56:40.260 | I don't know why Francis said what he said there,
03:56:43.220 | but that's not accurate.
03:56:45.100 | - Oh, really?
03:56:45.940 | - So the wrestler never,
03:56:49.500 | he didn't almost die.
03:56:50.820 | It was no big deal at all for him.
03:56:53.140 | And he said that to him.
03:56:54.500 | I think, I'm not sure.
03:56:56.860 | I think something got mixed up in Francis's memory.
03:57:00.220 | There was another case.
03:57:01.140 | He must have been like,
03:57:02.340 | 'cause I don't imagine he would bring that case up
03:57:05.380 | and just like make it up, you know, 'cause like why?
03:57:09.380 | But he, that was not at all,
03:57:11.540 | like that was a pretty public case.
03:57:13.140 | He had an interview with him.
03:57:15.340 | That wrestler, he was just fine.
03:57:17.500 | So that anecdotal case, I mean,
03:57:19.780 | Francis should not have done that.
03:57:21.060 | So if I have any, so I have a bunch of criticism
03:57:24.300 | of how that went.
03:57:26.180 | People who criticize that interview,
03:57:28.700 | I feel like don't give enough respect
03:57:32.340 | to the full range of things
03:57:33.820 | that Francis Collins has done in his career.
03:57:35.740 | He's an incredible scientist.
03:57:37.420 | And I also think a really good human being.
03:57:40.100 | But yes, that conversation was flawed in many ways.
03:57:44.620 | And one of them was why,
03:57:47.860 | when you're trying to present some kind of critical,
03:57:52.000 | like criticize Joe Rogan,
03:57:56.500 | why bring up anecdotal evidence at all?
03:57:58.740 | And if you do bring up anecdotal evidence,
03:58:02.060 | which is not scientific,
03:58:03.300 | if you're a scientist,
03:58:04.140 | you should not be using anecdotal evidence.
03:58:06.180 | If you do bring it up, why bring up one that's not,
03:58:10.260 | that's first not true and you know it's not true?
03:58:13.780 | - Well, I-- - So I know,
03:58:16.300 | pretend, so you don't know it's not true.
03:58:18.620 | So yes, that would have been,
03:58:20.100 | find another case where-- - Exactly.
03:58:23.220 | So the basic thing,
03:58:24.460 | coming back to Sanjay and Joe's conversation,
03:58:28.120 | was that Sanjay was trying to use statistical evidence
03:58:31.660 | and Joe was using anecdotal evidence.
03:58:33.660 | And so I think that for Sanjay,
03:58:36.060 | and there are all kinds of things,
03:58:38.040 | where there are debates,
03:58:39.340 | where often the person who's better at debating
03:58:42.820 | wins the debate regardless of the topic.
03:58:46.100 | So I think what Sanjay could have done,
03:58:49.500 | and Sanjay is such a smart guy,
03:58:52.980 | is to say, "Well, that's an anecdote,
03:58:55.820 | "here's another anecdote."
03:58:57.500 | And there are lots of different anecdotes.
03:59:00.020 | And there certainly are people who have taken the vaccine
03:59:03.660 | and have had problems that could reasonably be traced
03:59:07.100 | to the vaccines.
03:59:08.260 | And there certainly are lots of people,
03:59:10.620 | I would argue more people,
03:59:12.380 | who've not had the vaccine,
03:59:14.140 | but who've gotten COVID and have either died,
03:59:16.360 | or our hospitals are now full of people
03:59:19.180 | who weren't vaccinated.
03:59:20.420 | And in many ways,
03:59:22.100 | our emergency rooms are full of unvaccinated people
03:59:24.740 | here in the United States.
03:59:26.180 | So I think what Sanjay could have done,
03:59:28.700 | but there was a conflict between
03:59:30.420 | wanting to kind of win the debate
03:59:34.760 | and wanting to take the temperature down.
03:59:36.860 | And what he could have done is to say,
03:59:39.400 | "Well, here's an anecdote, I have a counter anecdote."
03:59:42.280 | And we can go on all day,
03:59:43.660 | but here's what the statistics show.
03:59:46.320 | And I think that was the thing.
03:59:48.460 | So I think it's a healthy conversation.
03:59:50.340 | We can't, I mean, there are a lot of people
03:59:52.620 | who are afraid of the vaccine.
03:59:55.220 | There are a lot of people who don't trust
03:59:56.680 | the scientific establishment.
03:59:58.260 | And lots of them have good reason.
03:59:59.980 | I mean, it's not just people think of like Trump Republicans,
04:00:04.500 | there are lots of people in the African-American community
04:00:08.600 | who've had a historically terrible experience
04:00:11.900 | with the Tuskegee experiments and all sorts of things.
04:00:14.420 | So they don't trust the messages
04:00:16.940 | that were being delivered.
04:00:18.100 | I live in New York City,
04:00:20.260 | and we had a piece in the New York Times
04:00:22.160 | where in the earliest days of the vaccines,
04:00:24.700 | there was this big movement,
04:00:25.820 | "Let's make sure that the poorest people in the city
04:00:28.940 | have first access to the vaccines,
04:00:31.860 | because they're the ones,
04:00:33.660 | they have higher density in their homes,
04:00:36.220 | they're relying on public transport."
04:00:37.860 | So there was this whole liberal effort.
04:00:39.380 | And then in the black community in New York,
04:00:42.420 | according to the New York Times,
04:00:43.500 | there was very low acceptance of the vaccines.
04:00:46.300 | And they interviewed people in that article,
04:00:48.020 | and they said, "Well, if the white people
04:00:50.860 | want us to have it first,
04:00:52.800 | there must be something wrong with it.
04:00:54.620 | They must be doing some..."
04:00:56.700 | And so we have to listen to each other.
04:00:59.820 | Like I would never,
04:01:04.700 | I have respect for everybody.
04:01:04.700 | And if somebody is cautious about the vaccine
04:01:07.860 | for themselves or for their children,
04:01:10.260 | we have to listen to them.
04:01:12.260 | At the same time,
04:01:13.340 | public health is about creating public health.
04:01:19.100 | And there's no doubt,
04:01:20.420 | I think Joe was absolutely right,
04:01:23.100 | that older people, obese people,
04:01:26.540 | are at greater risk for being harmed or killed
04:01:30.540 | by COVID-19 than young, healthy people.
04:01:34.500 | But by everybody getting vaccinated,
04:01:38.100 | we reduce the risk to everybody else.
04:01:41.180 | And so I feel like with everything,
04:01:43.700 | there's the individual benefit argument,
04:01:46.660 | and then there's the community argument.
04:01:49.300 | And I absolutely think-
04:01:50.420 | - But expressing that clearly,
04:01:51.660 | that there's a difference between the individual health
04:01:54.980 | and freedoms and the community health and freedoms,
04:01:59.180 | and steel manning each side of this.
04:02:01.900 | One of the problems that people don't do enough of
04:02:05.180 | is be able to, so how do you steel man an argument?
04:02:08.740 | You describe that argument in the best possible way.
04:02:11.540 | You have to first understand that argument.
04:02:14.540 | Let's go to the non-controversial thing like flat earth.
04:02:17.880 | Like most people, most colleagues of mine at MIT,
04:02:23.820 | don't even read about the full argument
04:02:28.820 | that the flat earthers make.
04:02:31.100 | I feel it's disingenuous for people in the physics community
04:02:36.100 | to roll their eyes at flat earthers
04:02:39.620 | if they haven't read their arguments.
04:02:42.220 | You should feel bad that you didn't read their arguments.
04:02:46.260 | And it's the rolling of the eyes that's a big problem.
04:02:50.700 | You haven't read it.
04:02:51.960 | Your intuition says that these are a bunch of crazy people.
04:02:55.260 | Okay, but you haven't earned the right to roll your eyes.
04:02:59.460 | You've earned your right to maybe not read it,
04:03:03.620 | but then don't have an opinion.
04:03:04.980 | Don't roll your eyes, don't do any of that dismissive stuff.
04:03:07.820 | And the same thing in the scientific community
04:03:11.200 | around COVID and so on, there's often this kind of saying,
04:03:14.380 | oh God, that's conspiracy theories, that's misinformation,
04:03:17.740 | without actually looking into what they're saying.
04:03:20.340 | If you haven't looked into what they're saying,
04:03:22.420 | then don't talk about it.
04:03:23.880 | Like if you're a scientific leader and the communicator,
04:03:26.260 | you need to look into it.
04:03:27.580 | It's not that much effort.
04:03:28.660 | I totally agree.
04:03:29.780 | And I think that humility, it's a constant theme
04:03:32.580 | of your podcast, and I love that.
04:03:35.260 | And so after the conversation, debate,
04:03:39.060 | whatever it was between Sanjay and Joe,
04:03:43.020 | I reached out on Twitter to someone
04:03:45.020 | I've never met in person, but I'm in touch privately,
04:03:47.940 | to a guy named Daniel Griffin, who's
04:03:50.660 | a professor at Columbia Medical School.
04:03:54.740 | And just so smart.
04:03:57.100 | He gives regular updates on COVID-19
04:04:00.940 | on a thing called TWIV, This Week in Virology.
04:04:03.700 | I'm a critic of TWIV for its coverage of pandemic origins.
04:04:09.620 | But on this issue, on just having regular updates,
04:04:12.220 | Daniel is great.
04:04:13.700 | And so I said to him, I said, why don't we
04:04:17.100 | have an honest process to get the people who
04:04:21.660 | are raising concerns about the vaccines, in their own words,
04:04:25.700 | to raise what are their concerns?
04:04:28.420 | And then let's do our best job of saying, well,
04:04:32.220 | here are these concerns.
04:04:34.060 | And then here is our evidence making a counterclaim.
04:04:38.420 | And here are links to, if you want
04:04:40.540 | to look at the studies upon which these claims are made,
04:04:43.860 | here they are.
04:04:45.380 | And Daniel, who's incredibly busy, I mean, he reads every--
04:04:50.140 | I mean, it seems every paper that comes out every week.
04:04:53.180 | And it's unbelievable.
04:04:55.420 | So but he sent me a link to the CDC Q&A page on the CDC
04:05:01.980 | website.
04:05:02.940 | And it wasn't that.
04:05:03.900 | It was people who were--
04:05:05.820 | I mean, it was written by people like me
04:05:07.820 | who were convinced in the benefit of these vaccines.
04:05:13.460 | So the questions were framed.
04:05:15.700 | They were kind of like--
04:05:16.940 | they weren't really the framing of the people
04:05:20.780 | with the concerns.
04:05:21.620 | They were the framing of people who were just
04:05:23.280 | kind of imagining something else.
04:05:25.220 | I mean, you always talk about kind of humility
04:05:27.420 | and active listening.
04:05:29.300 | I know you don't mean,
04:05:30.380 | and it doesn't mean, that we don't stand for something.
04:05:33.380 | Like, I certainly am a strong proponent of vaccines and masks
04:05:37.220 | and all of those things.
04:05:39.860 | But if we don't hear the other people,
04:05:42.140 | if we don't let them hear their voice in the conversation,
04:05:47.060 | if it's just saying, well, you may think this,
04:05:49.020 | and here's why it's wrong, the argument may be right.
04:05:52.420 | It'll just never break through.
04:05:53.740 | By the way, my interpretation of Joe and Sanjay,
04:05:56.180 | I listened to that conversation without looking
04:05:58.300 | at Twitter or the internet.
04:06:00.060 | And I thought that was a great conversation.
04:06:01.980 | And I thought Sanjay actually really succeeded
04:06:04.540 | at bringing the temperature down.
04:06:05.820 | To me, the goal was bringing the temperature down.
04:06:08.540 | I didn't even think of it as a debate.
04:06:10.140 | I was like, oh, cool.
04:06:11.100 | This isn't going to be some weird--
04:06:12.900 | it's like two friendly people talking.
04:06:15.180 | And then I look at the internet, and then the internet says,
04:06:18.420 | Joe Rogan slammed Sanjay, like as if it was a heated debate
04:06:23.660 | that Joe won.
04:06:24.540 | And it's like, all right.
04:06:26.060 | It was really the temperature being brought down,
04:06:30.140 | a real conversation between two humans.
04:06:32.100 | That wasn't really a debate.
04:06:34.220 | It was just a conversation.
04:06:36.540 | And that was a success.
04:06:38.020 | Yeah, and I definitely think it was a success.
04:06:41.020 | But I also felt that a takeaway--
04:06:47.820 | and again, because this is something
04:06:49.780 | that I don't agree with, even though I have great,
04:06:52.540 | as I've said, respect for Joe, I think a reasonable person
04:06:56.180 | listening to that conversation would come away
04:06:59.220 | with the conclusion that all in all,
04:07:02.260 | these vaccines are a good thing.
04:07:05.140 | But if you're young and healthy, you probably don't need it.
04:07:10.980 | And I just felt that there was a stronger case to be made,
04:07:16.220 | even though Sanjay made it.
04:07:17.460 | It wasn't that Sanjay didn't make it.
04:07:19.420 | It was just that in the flow of that conversation,
04:07:22.220 | I felt that the case for the vaccines,
04:07:26.380 | and the vaccines both as an individual choice--
04:07:28.980 | and certainly, again, as I said before,
04:07:31.140 | I think that while people can be afraid of the vaccines,
04:07:35.020 | the virus itself is much scarier.
04:07:37.580 | And we're seeing it now in real time
04:07:40.780 | with these variations and variants.
04:07:43.900 | I just felt that that was kind of the rough takeaway
04:07:47.700 | from that conversation.
04:07:50.820 | And I felt that Sanjay, again, whom I love,
04:07:54.180 | I felt could have made his case a little bit stronger.
04:07:56.780 | So the thing he succeeded is he didn't come off
04:08:00.220 | as a science expert looking down at everybody,
04:08:07.140 | talking down to everybody.
04:08:09.180 | So he succeeded in that, which is very respectful.
04:08:12.020 | But I also think making the case for taking the vaccine where
04:08:16.900 | when you're a young, healthy person, when you're sitting
04:08:19.980 | across from Joe Rogan, is like a high difficulty
04:08:24.460 | level on the video game.
04:08:26.060 | For sure.
04:08:26.740 | So it's not-- it's difficult to do.
04:08:30.180 | Yeah, for sure.
04:08:30.860 | It's difficult to do.
04:08:31.740 | And also, it's difficult to do because it's not like--
04:08:34.900 | it's not as simple as like, look at the data.
04:08:38.940 | There's a lot of data to go through here.
04:08:42.900 | And there's also a lot of non-data stuff,
04:08:45.700 | like the fact that-- first of all,
04:08:48.540 | questioning the sources of the data, the quality of the data.
04:08:51.660 | Because what's also disappointing about COVID
04:08:54.020 | is that the quality of the data is not great.
04:08:56.340 | But also questioning all the motivations
04:09:00.540 | of the different parties involved,
04:09:02.420 | whether it's major organizations that
04:09:04.300 | develop the vaccine, whether it's major institutions
04:09:07.380 | like NIH or NIAID that are sort of communicating to us
04:09:11.500 | about the vaccine, whether it's the CDC and the WHO,
04:09:15.580 | whether it's the Biden or the Trump administration,
04:09:18.060 | whether it's China and all those kinds of things.
04:09:20.220 | You have to-- that's part of the conversation here.
04:09:24.100 | I mean, vaccination is not just a public health tool.
04:09:28.700 | It's also a tool for a government
04:09:30.340 | to gain more control over the populace.
04:09:33.420 | Like, there's a lot of truth to that, too.
04:09:36.260 | Things that have a lot of benefit
04:09:39.940 | can also be used as a Trojan horse
04:09:44.180 | to increase bureaucracy and control.
04:09:47.220 | But that has to be on the table for a conversation.
04:09:49.860 | Yeah, I think it has to be part of the conversation.
04:09:51.980 | But I mean, your parents, when they were in the Soviet Union
04:09:56.540 | and here in the United States-- and actually,
04:09:58.460 | it was a big collaboration between US and Soviet Union--
04:10:02.140 | when the polio vaccine came out, there
04:10:04.260 | were people all around the world who had a different life
04:10:07.660 | trajectory, no longer living in fear.
04:10:09.820 | And all these people who were paralyzed or killed from polio,
04:10:13.660 | smallpox has been eradicated.
04:10:15.900 | It was one of the great successes in human history.
04:10:20.260 | And while it for sure is true that you
04:10:22.500 | could imagine some kind of fraudulent vaccination effort,
04:10:27.740 | but here I genuinely think, I mean, whatever the number--
04:10:31.180 | 15 million, 16 million is The Economist's number
04:10:34.260 | of dead from COVID-19--
04:10:37.020 | many, many, many more people would be dead
04:10:39.780 | but for these vaccines.
04:10:41.740 | And so I get that any activity that
04:10:44.540 | needs to be coordinated by a central government
04:10:47.700 | has the potential to increase bureaucracy
04:10:50.660 | and increase control.
04:10:53.580 | But there are certain things that central governments
04:10:56.620 | do, like the development, particularly,
04:10:59.580 | these mRNA vaccines, which is purely a US government
04:11:04.380 | victory.
04:11:05.060 | I mean, there was huge DARPA funding,
04:11:08.020 | and then the National Institute of Allergy and Infectious
04:11:11.500 | Diseases, NIH funding.
04:11:13.500 | I mean, this was a public-private partnership
04:11:16.140 | throughout.
04:11:17.100 | And that we got a working vaccine in 11 months
04:11:20.540 | was a miracle.
04:11:21.580 | It's not purely a victory.
04:11:24.340 | Again, you have to be open-minded.
04:11:27.260 | I'm with you here playing a bit of devil's advocate,
04:11:30.380 | but the people who discuss antiviral drugs like ivermectin
04:11:33.420 | and other alternatives would say that the extreme focus
04:11:37.180 | on the vaccine distracted us from considering
04:11:40.500 | other possibilities.
04:11:42.140 | And saying that this is purely a success
04:11:46.100 | is distracting from the story that there could
04:11:48.540 | have been other solutions.
04:11:50.660 | So yes, it's a huge success that the vaccine was developed
04:11:54.260 | so quickly and surprisingly way more effective
04:11:58.660 | than it was hoped for.
04:12:02.220 | But there could have been other solutions,
04:12:04.580 | and they completely distracted from that.
04:12:08.060 | In fact, it distracted us from looking
04:12:10.700 | into a bunch of things like the lab leak.
04:12:13.340 | So it's not a pure victory.
04:12:15.780 | - Fair enough.
04:12:16.620 | - And there's a lot of people that criticize
04:12:18.180 | the overreach of government in all of this.
04:12:20.700 | One of the things that makes the United States great
04:12:23.500 | is the individualism and the hesitancy toward ideas of mandates.
04:12:29.540 | Even if the mandates en masse will have a positive,
04:12:35.220 | even strongly positive result.
04:12:40.180 | Many Americans will still say no.
04:12:42.980 | Because in the long arc of history,
04:12:46.820 | saying no in that moment will actually lead
04:12:49.700 | to a better country and a better world.
04:12:53.620 | So that's a messed up aspect of America,
04:12:57.340 | but it's also a beautiful part.
04:12:59.940 | We're skeptical even about good things.
04:13:03.860 | - I agree, and certainly we should all be cautious
04:13:08.260 | about government overreach, absolutely.
04:13:11.540 | And it happens in all kinds of scenarios,
04:13:14.540 | with incarceration, with a thousand things.
04:13:17.340 | And we also should be afraid of government underreach.
04:13:20.940 | That if there is a problem,
04:13:22.660 | that could be solved by governments.
04:13:24.660 | And that's why we have governments in the first place,
04:13:26.860 | is that there's just certain things
04:13:28.180 | that individuals can't do on their own.
04:13:31.060 | And that's why we pool our resources,
04:13:33.980 | and we in some ways sacrifice our rights
04:13:38.180 | for this common thing.
04:13:39.140 | And that's why we don't have, hopefully,
04:13:41.140 | murderers marauding,
04:13:42.900 | or people driving 200 miles an hour down the street.
04:13:45.420 | That we have a process for arriving
04:13:48.140 | at a set of common rules.
04:13:49.820 | And so, while I fully agree that we need to respect,
04:13:52.660 | and we need to listen, we need to find that right balance.
04:13:56.380 | And you've raised the magic I word, ivermectin.
04:13:59.980 | And so on ivermectin, my view
04:14:03.780 | has always been: ivermectin could be effective.
04:14:08.100 | It could not be effective.
04:14:09.580 | Let's study it through a full process.
04:14:12.060 | And when you had Francis Collins with you,
04:14:14.420 | even while he was making up stories about this wrestler,
04:14:19.420 | he was saying-- - I can do it.
04:14:21.260 | - Exactly.
04:14:22.100 | But he was saying that they're going to do
04:14:25.140 | a full randomized highest level trial of ivermectin.
04:14:28.500 | And if ivermectin works,
04:14:30.140 | then that's another tool in our toolbox.
04:14:32.780 | And I think we should.
04:14:34.020 | And I think that Sanjay was absolutely correct
04:14:37.740 | to concede the point to Joe that it was disingenuous
04:14:42.180 | for people, including people on CNN,
04:14:45.220 | to say that ivermectin is for livestock.
04:14:48.980 | And so, I definitely think that we have to,
04:14:52.340 | like we have to have some kind of process
04:14:55.500 | that allows us to come together.
04:14:57.540 | And I totally agree that the great strength of America
04:15:01.860 | is that we empower individuals.
04:15:03.460 | It's the history of our frontier mentality in our country.
04:15:06.900 | So, I 100% agree that we have to allow that,
04:15:11.460 | even if sometimes it creates messy processes
04:15:15.580 | and uncomfortable feelings and all those sorts of things.
04:15:18.420 | - You are an ultra marathon runner.
04:15:22.660 | - Yes.
04:15:23.500 | - What are you running from?
04:15:28.020 | (laughing)
04:15:28.860 | - It's the right, it's the funny thing is,
04:15:30.660 | so I'm an ultra marathoner and I've done 13 Ironmans.
04:15:34.180 | And people say, "Oh my God, that's amazing.
04:15:36.220 | "13 Ironmans?"
04:15:37.060 | And what I always say is, "No, one Ironman is impressive."
04:15:41.060 | 13 Ironmans, there's something effing wrong with you.
04:15:44.340 | We just need to figure out what it is.
04:15:45.740 | - Yeah, there's some demons you're trying to work through.
04:15:48.500 | I mean, while you're doing the work though,
04:15:49.980 | most people just kinda let the demons sit in the attic.
04:15:53.940 | No, what have you learned about yourself,
04:15:57.260 | about your mind, about your body, about life,
04:16:00.260 | from taking your body to the limit in that kind of way,
04:16:05.380 | to running those kinds of distances?
04:16:06.980 | - Well, it's a great question.
04:16:08.220 | I know that you are also kind of exploring
04:16:10.100 | the limits of the physical.
04:16:12.500 | And so for me, in doing the Ironmans and the ultra marathons,
04:16:17.220 | it's always the same kind of lesson,
04:16:20.420 | which is just when you think you have nothing left,
04:16:25.420 | you actually have a ton left.
04:16:27.860 | There are a lot of resources that are there
04:16:31.140 | if you call on them.
04:16:32.820 | And the ability to call on them has to be cultivated.
04:16:37.820 | And so for me, especially in the Ironman,
04:16:42.380 | and Ironman in many ways is harder than the ultra marathons,
04:16:45.860 | 'cause I'll be at, I mean, it's 140 miles,
04:16:48.100 | I'll be at mile 100, mile 120,
04:16:51.260 | having done the swim and then the bike,
04:16:53.640 | and I'll be whatever, six miles into the run,
04:16:58.300 | and I'll think, "I feel like shit.
04:17:01.300 | "I have nothing left.
04:17:02.780 | "How am I possibly gonna run 20 miles more?"
04:17:07.660 | But there's always more.
04:17:10.500 | And I think that for me, these extreme sports
04:17:15.420 | are my process of exploring what's possible.
04:17:20.420 | And I feel like it applies
04:17:23.020 | in so many different areas of life
04:17:26.140 | where you're kind of pushing,
04:17:28.100 | and it feels like the limit.
04:17:30.500 | And a friend of mine
04:17:33.100 | who I just have so much respect for,
04:17:35.780 | who actually would be a great guest
04:17:37.260 | if you haven't already interviewed him,
04:17:39.300 | is Charlie Engle.
04:17:41.260 | And Charlie, he was a drug addict.
04:17:44.140 | He was in prison.
04:17:45.260 | His life was total shit.
04:17:47.180 | And somehow, and I can't remember the full story,
04:17:50.900 | he just started running around the prison yard.
04:17:55.620 | And it was like Forrest Gump.
04:17:56.820 | And he just kept running and running.
04:17:59.180 | And then he got out of prison,
04:18:00.700 | and he kept running,
04:18:02.340 | and he started doing ultramarathons,
04:18:04.780 | started inspiring all these other people.
04:18:07.620 | Now he's written all these books.
04:18:09.020 | As a matter of fact, we just spoke a few months ago
04:18:12.460 | that he's planning on running from the Dead Sea
04:18:17.460 | to, somehow, the top of Mount Everest,
04:18:20.300 | from the lowest point to the highest point on Earth.
04:18:23.380 | And I said, "Well, why are you stopping there?
04:18:25.780 | "Why don't you get whatever camera
04:18:29.180 | "and go down to the lowest part of the ocean?"
04:18:32.100 | Go to the lowest part of the ocean,
04:18:33.900 | and then talk to Elon Musk or Jeff Bezos
04:18:38.140 | and go to the kind of the highest place
04:18:40.100 | in the stratosphere you can get.
04:18:42.060 | But it's this thing of possibility.
04:18:44.700 | And I just feel like so many of us,
04:18:47.140 | and myself included,
04:18:49.140 | we get stuck in a sense of what we think is our range.
04:18:54.620 | And if we're not careful, that can become our range.
04:18:57.860 | And that's why for me in all of life,
04:19:00.900 | it's all about, like we've been talking about,
04:19:02.500 | challenging the limits, challenging assumptions,
04:19:06.220 | challenging ourselves.
04:19:08.100 | And hopefully, we do it in a way
04:19:10.540 | that kind of doesn't hurt anybody.
04:19:12.660 | You know, when I'm at the Ironman,
04:19:14.540 | they have all these little kids
04:19:15.900 | and they'll have these little shirts,
04:19:17.060 | and it'll say like, "My dad is a hero,"
04:19:19.220 | and have the little Ironman logo.
04:19:21.060 | And I wanna say it's like,
04:19:22.420 | no, your dad is actually a narcissistic dick
04:19:25.300 | who goes on eight-mile bike rides every Sunday
04:19:29.540 | rather than spend time with you.
04:19:31.540 | And so we shouldn't hurt anybody.
04:19:34.020 | But for me, and also I just find it very enjoyable.
04:19:38.700 | And I hope I'm not disclosing too much
04:19:41.380 | about our conversation before we went live,
04:19:44.100 | where you're doing so many different things
04:19:46.060 | with running and your martial arts.
04:19:48.780 | And I encouraged you to do ultra marathons
04:19:53.660 | because there's so many great ones in Texas.
04:19:55.260 | It's actually, surprisingly,
04:19:57.340 | a very enjoyable way to spend a day.
04:20:00.060 | - How would you recommend?
04:20:01.100 | So yeah, for people who might not know,
04:20:03.460 | I've never actually even run a marathon.
04:20:05.100 | I run 22 miles in one time at most.
04:20:08.420 | I did a four-by-four-by-48 challenge with David Goggins
04:20:13.420 | where you run four miles every four hours.
04:20:16.340 | It has less to do with the distance
04:20:19.380 | and more to do with the sleep deprivation.
04:20:21.620 | What advice would you give to a first-time ultra marathoner
04:20:24.980 | like me trying to run 50 or more miles
04:20:27.180 | or for anybody else interested
04:20:30.260 | in this kind of exploration of their range?
04:20:32.340 | - What I always tell people is the same advice: register.
04:20:36.940 | Pick your timeline of when you think you can be ready,
04:20:40.140 | depending on where you are now.
04:20:41.140 | Make it six months, make a year,
04:20:43.100 | and then register for the race.
04:20:45.300 | And then once you're registered,
04:20:47.060 | just work back from there, what's it going to take?
04:20:49.820 | But one of the things for people who are just getting going,
04:20:52.760 | you really do need to make sure
04:20:53.980 | that your body is ready for it.
04:20:55.940 | And so, particularly as we get older,
04:20:59.100 | strengthening is really important.
04:21:02.660 | So I'll do a plug for my brother, Jordan Metzl.
04:21:06.260 | He's a doctor at Hospital for Special Surgery,
04:21:08.380 | but his whole thing is functional strength.
04:21:10.740 | And so people know about it,
04:21:12.740 | and you can actually even go to his website.
04:21:15.140 | You can just Google Jordan Metzl Iron Strength.
04:21:17.420 | But it's all about burpees
04:21:19.180 | and just building your muscular strength
04:21:21.380 | so that you don't get injured as you increase.
04:21:25.060 | And then just increase your mileage in some steady way.
04:21:28.920 | Make sure that you take rest days and listen to your body,
04:21:32.840 | because people like you
04:21:34.540 | who are just very kind of mind over matter,
04:21:37.100 | like you were telling me before about you have an injury,
04:21:39.440 | but you kind of run a little bit differently.
04:21:41.760 | And we need to listen to our bodies
04:21:44.420 | 'cause our bodies are communicating.
04:21:47.640 | But I think it's that kind of thing:
04:21:49.780 | little by little, magic is possible.
04:21:53.420 | And what I will say is,
04:21:54.740 | and I also do, I've done lots and lots of marathons,
04:21:58.260 | and I always tell people that the ultra marathons,
04:22:01.580 | at least the ones that I do,
04:22:02.500 | and I shouldn't misrepresent myself.
04:22:04.060 | I mean, there are people who do 500 mile races.
04:22:07.460 | The ones that I do are 50K mountain trail runs,
04:22:11.160 | which is 32 miles.
04:22:12.580 | So I do the kind of the easier side of ultras,
04:22:17.580 | but it's actually much easier than a marathon
04:22:19.800 | because some of the mountain ones,
04:22:21.900 | sometimes it's so steep that you can't,
04:22:25.500 | you have to walk it
04:22:26.860 | 'cause walking is faster than running.
04:22:29.140 | And every four or five miles in the supported races,
04:22:32.100 | you stop and eat blintzes and boiled potatoes.
04:22:35.460 | It's actually quite enjoyable.
04:22:36.740 | But as I started to tell you before we went live,
04:22:42.080 | so I've done for lots of years,
04:22:43.820 | these 50K mountain trail runs,
04:22:45.500 | and I was going to Taiwan a number of years ago
04:22:48.180 | for something else.
04:22:49.260 | And I thought, well, wouldn't it be fun
04:22:50.780 | to do an ultra marathon in Taiwan?
04:22:56.900 | I looked, and the weekend after my visit,
04:22:56.900 | there was a marathon.
04:22:57.900 | It was called the, I mean, ultra marathon,
04:22:59.820 | it was called the Taiwan Beast.
04:23:01.780 | And I figured, oh, beasts, what are they talking about?
04:23:03.940 | It's 50K mountain trail, and I've done a million of them.
04:23:06.540 | And I went to register.
04:23:07.800 | And then as part of registration,
04:23:09.260 | they said, you need to have all of this equipment.
04:23:11.620 | And it was all this like wilderness survival equipment.
04:23:14.900 | And I was thinking, God, these Taiwanese,
04:23:17.900 | what a bunch of wimps. - Way too dramatic.
04:23:19.300 | - You have to carry, give me a break, 50K mountain trail.
04:23:22.300 | So I get there and the race starts at like 4:30
04:23:25.640 | in the morning in the middle of nowhere.
04:23:27.480 | And you have to wear headlamps
04:23:28.580 | and everyone's carrying all this stuff.
04:23:29.780 | And you kind of go running out into the rainforest.
04:23:34.780 | It was the hardest thing I've ever done.
04:23:36.660 | It took 19 hours.
04:23:38.140 | There were maybe 15 cliff faces,
04:23:40.500 | like a real cliff, and somebody had dangled
04:23:42.940 | like a little piece of string.
04:23:44.500 | And so you had to hold onto the string with one hand
04:23:46.860 | while it was in the pouring rain, climb up these cliffs.
04:23:49.340 | There were maybe 20 river crossings,
04:23:52.700 | but not just like a little stream, like a torrential river.
04:23:57.060 | There were some things where it was so steep
04:24:00.140 | that everyone was just climbing up
04:24:02.060 | and then you'd slide all the way down and climb up.
04:24:03.960 | And there were people who I met on the way out there
04:24:07.060 | who were saying, "Oh yeah, I did the Sahara 500 kilometer
04:24:11.140 | "race," and those people were just sprawled out.
04:24:15.220 | A lot of them didn't finish.
04:24:17.460 | So that was the hardest thing I've ever done.
04:24:19.780 | - So how do you get through something like that?
04:24:22.060 | You just, one step at a time?
04:24:23.620 | Was there, do you remember, is there a--
04:24:26.380 | - Yeah, the-- - Was there dark moments
04:24:28.500 | or is it kind of all spread out thinly?
04:24:30.900 | - It wasn't really dark moments.
04:24:34.100 | I mean, there was one thing where I'd been running so long,
04:24:36.620 | I thought, well, I must almost be done.
04:24:39.020 | And then I found out I had like 15 miles more.
04:24:43.060 | But I guess with all of these things,
04:24:48.420 | it's the messages that we tell ourselves.
04:24:51.700 | And so for me, it's like the message I always tell myself is
04:24:55.860 | quitting isn't an option.
04:24:58.740 | I mean, once in a while you kind of have to quit
04:25:00.700 | if like, listen to the universe,
04:25:02.420 | if whatever, you're gonna kill yourself or something.
04:25:04.820 | But for me, it was just, whatever it takes,
04:25:08.500 | there's no way I'm stopping.
04:25:10.140 | And if I have to go up this muddy hill 20 times
04:25:14.220 | because I keep sliding, I'm sure there's a way.
04:25:17.780 | It's probably a personality flaw.
04:25:19.620 | - Where does your love for chocolate come from?
04:25:23.900 | - Oh, it's a great question.
04:25:24.940 | And in both of my Joe Rogan interviews,
04:25:27.420 | that's the first question that he asked.
04:25:28.980 | So I'm glad that we've gotten to that.
04:25:30.940 | So one, I've always loved chocolate.
04:25:33.900 | And I call it like a secret,
04:25:36.620 | but now that I keep telling,
04:25:37.900 | if you keep telling the same secret,
04:25:39.540 | it's actually no longer a secret,
04:27:42.060 | that I have a secret, which is not a secret
04:27:45.180 | 'cause I'm telling you on a podcast:
04:27:47.740 | a secret life as a chocolate shaman.
04:25:50.140 | And so when I give keynotes at tech conferences,
04:25:53.540 | I always say, I'm happy to give a keynote,
04:25:55.540 | but I want to lead a sacred cacao ceremony in the night.
04:25:59.020 | I'm actually, believe it or not,
04:26:00.580 | the official chocolate shaman
04:26:02.540 | of what used to be called Exponential Medicine,
04:26:05.180 | which is part of Singularity University.
04:26:06.660 | Now, my friend Daniel Kraft, who runs it,
04:26:09.580 | it's going to be called NextMed.
04:26:12.580 | And so, but I'll have to go back,
04:26:14.780 | 'cause I was going to Berlin a lot of years ago,
04:26:19.420 | and I've always loved chocolate,
04:26:20.540 | but I was going to Berlin to give a keynote
04:26:23.820 | at a big conference called TOA, Tech Open Air.
04:26:28.460 | And so when I got there,
04:26:30.940 | the first night I was supposed to give a talk,
04:26:33.060 | but there had been some mix-up,
04:26:34.580 | they'd forgotten to reserve the room,
04:26:36.700 | and so the talk got canceled.
04:26:38.380 | And in the brochure,
04:26:40.020 | they had all these different events around Berlin
04:26:43.300 | that you could go to, and one of them was a cacao ceremony.
04:26:46.300 | And so I went there and actually met somebody,
04:26:49.700 | Viviana, who is still a friend, but I met,
04:26:52.740 | going in there, and there was this cacao ceremony,
04:26:54.540 | and there were these kind of hippie dudes,
04:26:56.660 | and then everybody got the cacao,
04:26:59.740 | and then they said, all right,
04:27:01.060 | as they talked a little bit about the process,
04:27:03.220 | and then they said, all right, everyone just stand,
04:27:05.860 | and kind of, we're going to spin around in a circle
04:27:08.460 | for 45 minutes.
04:27:09.620 | And so I spun around in the circle for like 10 minutes,
04:27:12.940 | but then I had to leave,
04:27:14.180 | because I had to go to something else.
04:27:15.940 | And so I thought that was that,
04:27:18.540 | but then I saw Viviana the next day,
04:27:21.100 | and I said, well, how did the cacao ceremony go?
04:27:23.300 | And she showed me these pictures
04:27:25.380 | of all of these people, mostly naked,
04:27:28.460 | like it turned into chaos.
04:27:30.660 | - Oh, that's awesome.
04:27:31.500 | - And it was like, oh, and so let me get this straight.
04:27:33.700 | People drank chocolate, then they spun around in a circle,
04:27:37.740 | and something else happened.
04:27:39.220 | And anyway, so then two days later,
04:27:41.460 | I was invited to another cacao ceremony,
04:27:43.860 | which was actually part of this TOA.
04:27:46.580 | And that was kind of more structured,
04:27:48.660 | and it was more sane, 'cause it was part of this thing.
04:27:51.180 | And at the end of that, I had this,
04:27:53.700 | I thought, one, how, the greatest thing ever,
04:27:56.900 | a sacred cacao ceremony,
04:27:58.540 | like you drink chocolate milk and everybody's free.
04:28:01.500 | And I love that idea, 'cause I've never done drugs,
04:28:05.700 | I don't drink, but just part of it is,
04:28:08.460 | 'cause I think whatever,
04:28:09.380 | like I was saying with the ultra running,
04:28:12.060 | all of the possibilities are within us,
04:28:14.100 | if we can get out of our own way.
04:28:16.820 | And then I thought, well, I think I can do a better job
04:28:20.180 | than what I experienced in Berlin.
04:28:22.300 | So I came back and I thought, all right,
04:28:23.500 | I'm gonna get accredited as a cacao shaman.
04:28:26.140 | And this will shock you,
04:28:27.540 | 'cause I know if you're gonna be like a rabbi or a priest
04:28:30.820 | or something, there's some process,
04:28:32.020 | but shockingly, there's no official process
04:28:35.100 | to become a chocolate shaman. - Shockingly.
04:28:38.140 | - And so I thought, all right, well, you know,
04:28:40.540 | I'm just gonna train myself, and when I'm ready,
04:28:42.940 | I'm gonna declare my chocolate shamanism.
04:28:45.140 | So I started studying different things,
04:28:47.380 | and when I was ready, I just said,
04:28:50.300 | now I'm a chocolate shaman, self-declared.
04:28:52.780 | - Self-declared.
04:28:53.620 | - And so, but I do these ceremonies,
04:28:55.540 | and I've done them at tech conferences.
04:28:58.340 | I did one in Soho House in New York.
04:29:01.060 | I've done it at a place, Rancho La Puerta in Mexico.
04:29:04.180 | And every time it's the same thing,
04:29:05.980 | 'cause it just, if people are given a license to be free,
04:29:10.220 | just to, it doesn't matter.
04:29:11.940 | And what I always say is,
04:29:13.460 | you're here for a sacred cacao ceremony,
04:29:15.260 | but the truth is there's no such thing as sacred cacao,
04:29:18.340 | and there's no sacred mountains,
04:29:19.700 | and there's no sacred people, and there's no sacred plants,
04:29:22.420 | because nothing is sacred if we don't attribute,
04:29:25.780 | ascribe sacredness to it.
04:29:28.820 | But if we recognize that everything is sacred,
04:29:32.420 | then we'll live different lives.
04:29:33.700 | And for the purpose of this ceremony,
04:29:36.180 | we're just gonna say, all right,
04:29:37.620 | we're gonna focus on this cacao,
04:29:39.940 | which actually has been used ceremonially for 5,000 years.
04:29:43.420 | It has all these wonderful properties.
04:29:46.120 | But it's just people who get that license,
04:29:49.500 | and then they're just free,
04:29:51.060 | and people are dancing, and all sorts of things.
04:29:52.780 | - Is the goal to celebrate life in general?
04:29:57.140 | Is it to celebrate the senses, like taste?
04:30:00.640 | Is it to celebrate yourself, each other?
04:30:03.020 | What is their--
04:30:04.100 | - I think the core is gratitude, and just appreciation.
04:30:09.100 | - All the experiences in life?
04:30:10.460 | - Yeah, just of being alive,
04:30:12.460 | of just living in this sacred world
04:30:15.100 | where we have all these things
04:30:16.340 | that we don't even pay any attention to.
04:30:19.540 | My friend, A.J. Jacobs, he had a wonderful book
04:30:23.100 | that I use the spirit of in the ceremonies.
04:30:27.020 | Not exactly, but he was in a restaurant
04:30:30.660 | in New York, a coffee shop,
04:30:31.740 | and his child said, "Hey, where does the coffee come from?"
04:30:35.040 | And he's like a wonderful big thinker.
04:30:38.820 | And he started really answering that question.
04:30:41.100 | Well, here's where the beans come from,
04:30:42.700 | but how did the beans get here,
04:30:44.320 | and who painted the yellow line on the street
04:30:46.260 | so the truck didn't crash, and who made the cup?
04:30:50.140 | And he spent a year making a full spreadsheet
04:30:53.120 | of all of the people who in one way or another
04:30:56.460 | played some role in that one cup of coffee.
04:31:00.300 | And he traveled all around the world thanking them.
04:31:03.460 | Like, it's like, thank you for painting
04:31:05.100 | the yellow line on the road.
04:31:06.900 | And so for me, with the cacao,
04:31:08.780 | part of when I do these ceremonies
04:31:10.740 | is just to say, like, you're drinking this cacao.
04:31:14.220 | But there's a person who planted the seed.
04:31:16.340 | There's a person who watered the plant.
04:31:18.060 | There's a person, and I just think that level of awareness,
04:31:22.420 | and it's true with anything.
04:31:24.260 | Like, you have in front of you a stuffed hedgehog.
04:31:27.500 | So-- - Somebody made that.
04:31:28.860 | - I love it, it's great.
04:31:30.220 | But like, if we just said,
04:31:32.100 | where does this stuffed hedgehog come from?
04:31:35.200 | We would have a full story of globalization,
04:31:38.980 | of the interconnection of people all around the world
04:31:42.100 | doing all sorts of things of human imagination.
04:31:45.140 | It's beyond our capacity in our daily,
04:31:47.380 | we'd go insane if every day,
04:31:49.580 | like we're speaking into a microphone,
04:31:51.260 | well, what are the hundreds of years of technology
04:31:54.540 | that make this possible?
04:31:56.060 | But if just once in a while,
04:31:57.300 | we just focus on one thing and say,
04:32:00.340 | this thing is sacred.
04:32:01.940 | And because I'm recognizing that,
04:32:04.100 | and I'm having an appreciation for the world around me,
04:32:07.000 | it just kind of makes my life feel more sacred.
04:32:10.420 | It makes me recognize my connection to others.
04:32:12.500 | So that's the gist of it.
04:32:14.740 | - Yeah, it's funny, I often look at
04:32:16.940 | things in this world and moments and just,
04:32:22.340 | I'm in awe of the full universe
04:32:27.340 | that brought that to be.
04:32:32.700 | In a similar way as you're saying,
04:32:35.620 | but I don't as often think about exactly what you're saying,
04:32:38.940 | which is the number of people behind
04:32:40.660 | every little thing we get to enjoy.
04:32:43.060 | I mean, yeah, this hedgehog, this microphone,
04:32:45.640 | directly thousands of people involved.
04:32:49.620 | - Millions. - And then indirectly
04:32:51.220 | is millions.
04:32:52.060 | And they're all, this microphone,
04:32:59.260 | there's artists, essentially,
04:33:01.980 | people who made it their life's work,
04:33:04.100 | all across, from the factories to the manufacturer,
04:33:07.420 | there are families that are fed by the production of this microphone
04:33:11.620 | and this hedgehog, because of the skill of this human
04:33:16.100 | that helped contribute to that development.
04:33:18.860 | - And like Isaac Newton and John von Neumann
04:33:22.420 | are in this microphone.
04:33:24.420 | - They're standing on the shoulders of giants,
04:33:26.180 | and we're standing on their shoulders.
04:33:28.600 | And somebody will be standing on ours.
04:33:34.100 | - You mentioned one shared world.
04:33:37.300 | - Yeah.
04:33:38.660 | - What is it?
04:33:39.980 | - Well, thanks for asking.
04:33:41.020 | And by the way, what I will say is the people
04:33:42.980 | who are listening, this is so incredible,
04:33:46.220 | and I'm so thrilled to have this kind of long conversation.
04:33:49.460 | - Hello, person who's listening.
04:33:51.540 | - Exactly, thank you.
04:33:52.380 | - Past the five hour mark.
04:33:54.260 | - Thanks, mom.
04:33:55.140 | - I salute you.
04:33:58.180 | Somebody was sleeping for the first four hours
04:34:00.780 | and just woke up.
04:34:01.740 | - Now's the good stuff.
04:34:02.980 | I've been saving it.
04:34:04.060 | And I have to say that so much of our lives
04:34:09.220 | is forced into these short bursts
04:34:11.260 | that I'm just so appreciative to have the chance
04:34:14.180 | to have this conversation.
04:34:15.740 | So thank you for that.
04:34:16.580 | - Some people would say five hours is short,
04:34:17.860 | so I'm not seeing that.
04:34:19.700 | - Let's go.
04:34:20.540 | That's what my girlfriend says,
04:34:25.820 | like if I was captured and tortured
04:34:29.220 | and they were gonna interrogate me,
04:34:30.820 | it's like at the end they'd say, all right.
04:34:32.700 | No, we're sick of this guy.
04:34:34.300 | We quit.
04:34:35.500 | Let him go.
04:34:36.340 | - I love it.
04:34:37.260 | - So background on One Shared World.
04:34:39.940 | I mentioned I'm on a faculty for Singularity University.
04:34:43.100 | In the earliest days of the pandemic,
04:34:45.300 | I was invited to give a talk on whether the tools
04:34:48.580 | of the genetics and biotech revolutions
04:34:50.340 | were a match for the outbreak.
04:34:52.820 | And my view was then as now
04:34:55.100 | that the answer to that question is yes.
04:34:57.460 | But I woke up that morning and I felt
04:34:59.500 | that that wasn't the most important talk that I could give.
04:35:03.300 | There was something else that was more pressing for me.
04:35:06.060 | And that was the realization,
04:35:08.220 | they were asking the question,
04:35:09.060 | well, why weren't we prepared for this pandemic?
04:35:12.020 | Because we could have been, we weren't.
04:35:14.380 | And because of that,
04:35:16.900 | why can't we respond adequately to this outbreak?
04:35:21.780 | And then there was the thing,
04:35:24.220 | well, even if we respond somehow miraculously,
04:35:27.660 | overcome this pandemic, it's a pyrrhic victory
04:35:31.380 | if we don't prepare ourselves to respond
04:35:35.460 | to the broader category of pandemics,
04:35:37.500 | particularly as we enter the age of synthetic biology.
04:35:40.620 | But if somehow miraculously we solve that problem,
04:35:44.820 | but we don't solve the problem of climate change,
04:35:47.300 | well, kind of who cares?
04:35:48.140 | We didn't have a pandemic,
04:35:49.140 | but we wiped everybody out from climate change.
04:35:51.420 | And let's just say, you get where this is going,
04:35:54.940 | that we organize ourselves and we solve climate change.
04:35:58.580 | And then we have a nuclear war
04:36:00.540 | because everybody's, particularly China now,
04:36:02.980 | but US, the former Soviet Union
04:36:05.100 | are building all these nuclear weapons.
04:36:07.020 | Who cares that we solved climate change
04:36:08.980 | because we're all gone anyway.
04:36:10.380 | And the meta category,
04:36:12.260 | bringing all of those things together,
04:36:14.820 | was this mismatch between the increasingly global
04:36:19.660 | and shared nature of the biggest challenges that we face
04:36:23.980 | and our inability to solve
04:36:26.220 | that entire category of problems.
04:36:29.380 | And there's a historical issue,
04:36:32.180 | which is that prior to the Thirty Years' War in the 17th century,
04:36:36.460 | we had all these different kinds of sovereignty
04:36:38.900 | and religious and different kinds
04:36:41.100 | of organizational principles.
04:36:42.900 | And everybody got in this war
04:36:44.100 | and in this series of treaties
04:36:46.780 | that together are called the Peace of Westphalia,
04:36:50.460 | the framework for the modern,
04:36:52.980 | what we now understand as the modern nation state was laid.
04:36:55.780 | And then through colonialism and other means,
04:36:58.540 | that idea of a state is what it is today,
04:37:03.180 | spread throughout the world.
04:37:05.980 | Then through particularly the late 19th
04:37:09.580 | and early 20th century,
04:37:10.980 | we realized how unstable that system was
04:37:14.180 | because you always had these jockeying
04:37:16.420 | between sovereign states and some were rising
04:37:18.300 | and some were falling and you ended up in war.
04:37:21.020 | And that was the genius of the generations
04:37:23.380 | who came together in 1945 in San Francisco
04:37:25.940 | and the planning had even started before then,
04:37:28.260 | who said, well, we can't just have that world,
04:37:30.940 | we need to have an overlay.
04:37:32.620 | And we talked about the UN and the WHO
04:37:34.940 | of systems which transcend our national sovereignties.
04:37:39.820 | They don't get rid of them,
04:37:41.380 | but they transcend them
04:37:42.260 | so we can solve this category of problems.
04:37:44.700 | But we're now reaching a point where our reach as humans,
04:37:47.780 | even individually, but collectively is so great
04:37:51.060 | that there's a mismatch between, as I said,
04:37:52.580 | the nature of the problems
04:37:53.860 | and the ability to solve those problems.
04:37:58.140 | And unless we can address
04:38:00.220 | that broader global collective action problem,
04:38:03.860 | we're going to extinct ourselves.
04:38:05.380 | And we see these different, what I call verticals,
04:38:08.220 | whether it's climate change
04:38:09.500 | or trying to prevent nuclear weapons proliferation
04:38:13.140 | or anything else, but none of those can succeed.
04:38:16.300 | And frankly, it doesn't even matter if one succeeds
04:38:18.980 | because all of them have the potential
04:38:21.700 | to lead to extinction level events.
04:38:24.460 | So anyways, I gave that talk and that talk went viral.
04:38:30.060 | I stayed up all night the next night and I drafted,
04:38:32.740 | I mean, I think it was like insanity,
04:38:35.180 | but I think a lot of us were manic
04:38:36.900 | in those early days of the pandemics
04:38:38.940 | wanting to do something.
04:38:40.260 | And so I stayed up all night
04:38:41.260 | and I drafted what I called
04:38:42.860 | a declaration of global interdependence.
04:38:45.540 | And I posted that on my website, jamiemetzl.com,
04:38:48.700 | it's still there, and that went viral.
04:38:51.820 | And so then I called a meeting
04:38:53.340 | just on the people on my personal email list.
04:38:56.340 | And so we had people from 25 countries.
04:38:59.660 | There were all of these people
04:39:00.940 | who were having the same thing.
04:39:02.540 | There's something wrong in the world.
04:39:03.980 | They wanted to be part of a process of fixing it.
04:39:07.740 | And so it was a crazy 35 days
04:39:10.540 | where we broke into eight different working groups.
04:39:12.980 | We had an amazing team that helped redraft
04:39:16.100 | what became the declaration of interdependence,
04:39:18.780 | which is now in 20 languages.
04:39:21.500 | We laid out a work plan.
04:39:23.420 | We founded this organization called One Shared World.
04:39:27.100 | The URL is oneshared.world.
04:39:28.980 | And it's been this incredible journey.
04:39:31.540 | We now have people who are participating
04:39:33.180 | in one way or another from 120 different countries.
04:39:36.980 | We have our public events exploring these issues,
04:39:40.380 | get millions of viewers.
04:39:43.060 | We have world leaders who are participating.
04:39:46.700 | - So the vision is to work on some of these big problems,
04:39:51.300 | arbitrary number of problems
04:39:52.540 | that present themselves in the world
04:39:54.340 | that face all of human civilization,
04:39:56.500 | and to be able to work together.
04:39:58.140 | - Well, that is, but there's a macro, a meta problem,
04:40:02.220 | which is the global collective action problem.
04:40:04.620 | And so the idea is even if we just focus on the verticals,
04:40:10.060 | on the manifestations
04:40:11.300 | of the global collective action problem,
04:40:13.740 | there'll be an infinite number of those things.
04:40:16.500 | So while we work on those things,
04:40:18.900 | like climate change, pandemics, WMD, and other things,
04:40:22.740 | we also have to ask the bigger questions
04:40:25.260 | of why can't we solve this category of problems?
04:40:27.420 | And the idea is, at least from my observation,
04:40:30.180 | is that whenever big decisions are being made,
04:40:35.060 | our national leaders and corporate leaders
04:40:37.700 | are doing exactly what we've hired them to do.
04:40:40.540 | They're maximizing for national interest,
04:40:43.660 | even, or corporate interest,
04:40:45.100 | even at the expense of everybody.
04:40:48.540 | And so it's not that we wanna get rid of states.
04:40:50.500 | States are essential in our world system.
04:40:52.700 | It's not that we wanna undermine the UN,
04:40:54.380 | which is also essential, but massively underperforming.
04:40:57.940 | What we wanna do
04:40:58.780 | is to create an empowered global constituency
04:41:01.940 | of people who are demanding
04:41:03.940 | that their leaders at all levels
04:41:06.260 | just do a better job of balancing
04:41:09.140 | broader and narrower interests.
04:41:10.620 | - I see.
04:41:11.460 | So this is more like a,
04:41:13.580 | make it more symmetric in terms of power.
04:41:17.860 | It's holding accountable the nations, the leaders.
04:41:22.860 | The problem is nations are powerful.
04:41:26.980 | We talked about China quite a bit.
04:41:29.300 | How do you have an organization of citizens of Earth
04:41:33.980 | that can solve this collective problem
04:41:36.900 | that holds China accountable?
04:41:39.180 | It's difficult 'cause UN,
04:41:41.540 | you could say a lot of things,
04:41:42.740 | but to call it effective is hard.
04:41:45.140 | - Yeah.
04:41:45.980 | - The internet almost is a kind of representation
04:41:50.340 | of a collective force that holds nations accountable.
04:41:55.340 | You know, Twitter, not to give Twitter too much credit,
04:41:58.800 | but social networks, broadly speaking.
04:42:03.380 | So you have hope that this is possible
04:42:05.260 | to build such collections of humans that resist China.
04:42:10.260 | - Not necessarily resist China,
04:42:12.740 | but human, I mean, our cultures change over time.
04:42:16.900 | I mean, the idea of the modern nation state
04:42:20.060 | would not have made sense to people
04:42:22.500 | in the 13th or 14th century.
04:42:24.540 | The idea that became the United Nations,
04:42:28.300 | I mean, it had its earliest days
04:42:30.180 | in the philosophies of Kant.
04:42:33.580 | It took a long time for these ideas to be realized.
04:42:38.580 | And so the idea, and we're far from successful.
04:42:44.460 | I mean, we've had little minor successes,
04:42:47.000 | which we're very proud of.
04:42:47.860 | We got the G20 leaders to incorporate the language
04:42:51.220 | that we provided on addressing the needs
04:42:53.600 | of the world's most vulnerable populations
04:42:56.180 | into the final summit communique
04:42:59.660 | from the G20 summit in Riyadh.
04:43:01.780 | This year, we're just on the verge of having our language
04:43:05.260 | on the same issue, ensuring everyone on earth
04:43:08.820 | has access to safe water, basic sanitation and hygiene,
04:43:11.540 | and essential pandemic protection by 2030,
04:43:14.620 | passed as part of a resolution
04:43:17.440 | in the United Nations General Assembly.
04:43:19.100 | And we're primarily, I mean,
04:43:21.300 | it's young people all around the world.
04:43:23.580 | And when I told them in the beginning of this year,
04:43:26.100 | this is our goal, we're gonna get the UN General Assembly
04:43:29.020 | to pass a resolution with our language in it.
04:43:32.340 | I mean, first, I think they all thought it was insane,
04:43:35.060 | but they were too young and inexperienced
04:43:38.220 | to know how insane it was.
04:43:39.900 | But now these young people are just so excited
04:43:42.180 | that it's actually happening.
04:43:43.780 | So what we're trying to do is really to create a movement,
04:43:48.780 | which we don't feel that we need to do from scratch,
04:43:51.620 | because there are a lot of movements.
04:43:53.140 | Like right now, we just had the Glasgow G20,
04:43:56.700 | I mean, I'm sorry, the Glasgow Climate Change, COP26.
04:44:00.420 | And then Greta Thunberg, who has a huge following
04:44:03.100 | and who is an amazing young woman,
04:44:05.840 | but I was kind of disappointed in what she said afterwards.
04:44:08.980 | It became like a meme on Twitter, which was blah, blah, blah.
04:44:13.100 | And basically it was like, blah, blah, blah,
04:44:15.140 | these old people are just screwing around
04:44:17.500 | and it's a waste of time.
04:44:18.900 | And definitely the critique is merited,
04:44:22.340 | but young people have never been more empowered,
04:44:25.660 | educated, connected than they are now.
04:44:29.300 | And so that's why we've had a process
04:44:32.660 | with One Shared World,
04:44:36.200 | where we partnered with the Model United Nations,
04:44:39.660 | the Aga Khan Foundation, the India Sanitation Coalition.
04:44:42.300 | And what we did is say, all right,
04:44:43.580 | we have this goal, water, sanitation, hygiene,
04:44:45.980 | and pandemic protection for everyone on earth by 2030.
04:44:49.460 | And we had debates and consultations
04:44:52.140 | using the Model UN Framework all around the world
04:44:55.020 | in multiple languages.
04:44:56.900 | And we said, come up with a plan
04:44:58.500 | for how this could be achieved.
04:44:59.660 | And these brilliant young people in every country,
04:45:02.720 | not every country, most countries,
04:45:04.860 | they all contributed and we had a plan.
04:45:06.980 | Then I recruited friends of mine,
04:45:09.020 | like my friend Hans Corell in Sweden,
04:45:11.280 | who's the former chief counsel of the whole United Nations,
04:45:15.180 | and asked him and others to work with these young people
04:45:18.440 | and representatives to turn that
04:45:20.860 | into what looks exactly like a UN resolution.
04:45:24.820 | It's just written by a bunch of kids all around the world.
04:45:28.420 | We then sent that to every permanent representative,
04:45:31.700 | every government representative at the UN.
04:45:34.300 | And that's why, working with the German
04:45:36.220 | and Spanish governments,
04:45:37.220 | the language synthesized from that document
04:45:40.100 | is about to pass the UN.
04:45:41.940 | And it doesn't mean that just passing
04:45:43.500 | a UN General Assembly resolution changes anything,
04:45:46.580 | but we think that there's a model of engaging people,
04:45:49.900 | just like you're talking about,
04:45:50.900 | these people who are outside
04:45:53.180 | of the traditional power structures
04:45:55.460 | and who want to have a voice,
04:45:58.020 | but I think we need to give a little bit of structure
04:46:00.100 | because just going, I'm a big fan of Global Citizen,
04:46:03.940 | but just going to a Global Citizen concert
04:46:06.860 | and waving your iPhone back and forth
04:46:09.100 | and tweeting about it isn't enough
04:46:11.740 | to drive the kind of change that's required.
04:46:14.380 | We need to come together, even in untraditional ways,
04:46:17.980 | and articulate the change we want
04:46:20.100 | and build popular movements to make that happen.
04:46:23.020 | - And popular means scale and then movements at scale
04:46:26.580 | that actually, like,
04:46:28.580 | where at the individual level people do something
04:46:30.500 | and that's then magnified with scale
04:46:33.500 | to actually have a significant impact.
04:46:35.380 | I mean, at its best,
04:46:37.780 | you hear a lot of folks talk about
04:46:40.260 | the various cryptocurrencies as possibly helping.
04:46:44.580 | You have young people get involved
04:46:47.060 | in challenging the power structures
04:46:49.460 | by challenging the monetary system.
04:46:51.300 | And there's, you know, some of it is number go up,
04:46:56.300 | people get excited when they can make a little bit of money,
04:47:01.020 | but that's actually almost like an entry point
04:47:04.900 | because then you almost feel empowered,
04:47:07.780 | and because of that,
04:47:09.420 | you start to think about some of these philosophical ideas
04:47:12.060 | that I, as a young person,
04:47:14.580 | have the power to change the world.
04:47:16.780 | All of these senior folks in the position of power,
04:47:20.820 | they were, like, first of all,
04:47:23.100 | they were once young and powerless like me,
04:47:26.300 | and I could be part of the next generation
04:47:29.500 | that makes a change.
04:47:30.340 | Well, all the things I see that are wrong with the world,
04:47:32.620 | I can make it better.
04:47:34.700 | And it's very true that the overly powerful nations
04:47:40.020 | of the world could be a relic of the past.
04:47:43.380 | That could be a 20th century and before idea
04:47:47.980 | that was tried, create a lot of benefit,
04:47:52.580 | but we also saw the problems with that kind of world,
04:47:56.580 | extreme nationalism.
04:47:58.060 | We see the benefits and the problems of the Cold War,
04:48:02.660 | arguably Cold War got us to the moon,
04:48:04.860 | but there could be a lot of other different mechanisms
04:48:11.260 | that inspire competition,
04:48:11.260 | especially friendly competition between nations
04:48:13.860 | versus adversarial competition that resulted
04:48:16.420 | in the response to COVID, for example,
04:48:18.140 | with China, the United States and Russia
04:48:20.540 | and the secrecy, the censorship.
04:48:23.700 | Yeah, and all the things that are basically
04:48:27.700 | against the spirit of science
04:48:31.420 | and resulted in the loss of trillions of dollars
04:48:34.140 | and the cost of countless lives.
04:48:37.660 | What gives you hope about the future, Jamie?
04:48:41.560 | Well, one of the things, I mean,
04:48:42.560 | you mentioned cryptocurrency and then as you know,
04:48:46.440 | better than most, there's cryptocurrency
04:48:49.120 | and then underneath the cryptocurrency,
04:48:51.160 | there's the blockchain and the distributed ledger.
04:48:54.120 | And then like we talked about,
04:48:55.880 | there are all these young people who are able
04:48:57.740 | to connect with each other, to organize in new ways.
04:49:02.740 | And I work with these young people every single day
04:49:07.140 | through One Shared World primarily, but also other things.
04:49:11.280 | And there's so much optimism.
04:49:13.240 | There's so much hope that I just have a lot of faith
04:49:18.240 | that we're gonna figure something out.
04:49:20.680 | I'm an optimist by nature.
04:49:23.720 | And that doesn't mean that we need to be blind
04:49:26.760 | to the dangers.
04:49:27.600 | There are very, very real dangers,
04:49:30.280 | but just given half the chance, people wanna be good.
04:49:34.640 | People want to do the right thing.
04:49:36.720 | And I do believe that there's a role,
04:49:39.940 | I mean, there's a role,
04:49:41.980 | at least near term, for governments,
04:49:44.240 | but there's always a role for leadership.
04:49:46.680 | And I'm, I guess, like a Gramscian in the sense
04:49:51.080 | that I think that we need to create frameworks
04:49:53.880 | and structures that allow leaders to emerge.
04:49:58.600 | And we need to build norms so that the leaders who emerge
04:50:02.880 | are leaders who call on us, inspire our best instincts,
04:50:07.880 | and not drive us toward our worst.
04:50:10.840 | But I really see a lot of hope.
04:50:13.400 | And I mean, you say this all the time in your podcast,
04:50:18.400 | and you may even be more optimistic than me
04:50:21.720 | 'cause you look at the darkest moments of human history
04:50:24.800 | and see hope, but we're kind of a crazy, wonderful species.
04:50:29.720 | I mean, yes, we figured out ways to slaughter each other
04:50:32.460 | at scale, but we've come up with these wonderful philosophies
04:50:36.060 | about love and all of those things.
04:50:38.560 | And yeah, maybe the bonobos have some love
04:50:41.920 | in their cultures, but this,
04:50:43.900 | we're kind of a wonderful, magical species.
04:50:46.140 | And if we just can create enough of an infrastructure,
04:50:49.960 | it doesn't need to be and shouldn't be controlling,
04:50:52.120 | just enough of an infrastructure
04:50:54.080 | so that people are stakeholders,
04:50:56.520 | feel like they're stakeholders
04:50:57.760 | in contributing to a positive story.
04:51:00.360 | I just really feel the sky is the limit.
04:51:04.520 | - So if there's somebody who's young right now,
04:51:06.880 | somebody in high school,
04:51:07.880 | somebody in college listening to you,
04:51:09.520 | you've done a lot of incredible things.
04:51:12.400 | You're respected by a lot of the elites.
04:51:17.400 | You're respected by the people.
04:51:21.140 | So you're both able to sort of speak to all groups,
04:51:26.140 | walk through the fire, like you mentioned,
04:51:30.140 | with this lab leak.
04:51:31.640 | What advice would you give to young kids today
04:51:37.040 | that are inspired by your story?
04:51:38.840 | - Well, thank you.
04:51:39.980 | I mean, I think there's one, there's lots of,
04:51:42.000 | I'm honored if anybody is inspired,
04:51:44.600 | but it's the same thing as I said with the science
04:51:49.800 | that it's all about values.
04:51:51.520 | The core of everything is knowing who you are.
04:51:55.480 | And so, yes, I mean, there's the broader thing
04:51:58.040 | of follow your passions, a creative mind,
04:52:02.780 | and an inquisitive mind is the core of everything
04:52:05.300 | because the knowledge base is constantly changing,
04:52:08.060 | so learning how to learn.
04:52:10.540 | But at the core of everything is investing
04:52:14.460 | in knowing who you are and what you stand for,
04:52:17.500 | because that's the way, that's the path
04:52:22.140 | to leading a meaningful life, to contributing,
04:52:25.300 | to not feeling alienated from your life as you get older.
04:52:29.660 | And just like you live, it's an ongoing process,
04:52:34.660 | and we all make mistakes, and we all kind of travel
04:52:38.460 | down wrong paths, and just have some love for yourself
04:52:42.300 | and recognize that just at every,
04:52:44.340 | like I was saying with the Iron Man,
04:52:46.520 | just when you think there's no possibility
04:52:50.300 | that you can go on, there's a 100% possibility
04:52:54.580 | that you can go on.
04:52:55.500 | And just when you think that nothing better
04:52:58.100 | will happen to you, there's a 100% chance
04:53:02.060 | that something better will happen to you,
04:53:03.680 | you just gotta keep going.
04:53:05.740 | - Jamie, I've been a fan of yours.
04:53:08.780 | I've, I think, first heard you on Joe Rogan Experience,
04:53:11.980 | but been following your work, your bold, fearless work
04:53:15.780 | with speaking about the lab leak
04:53:18.620 | and everything you represent, from your brilliance
04:53:21.620 | to your kindness, and the fact that you spend
04:53:25.460 | your valuable time with me today,
04:53:27.140 | and now I officially made you miss your flight,
04:53:31.660 | and the fact that you said that,
04:53:34.220 | whether you were being nice or not, I don't know,
04:53:36.380 | that you would be okay with that, means the world to me,
04:53:39.420 | and I'm really honored that you were spending
04:53:41.460 | your time with me today.
04:53:42.300 | - Well, really, it's been such a great pleasure,
04:53:45.020 | and thank you for creating a forum
04:53:48.140 | to have these kinds of long conversations.
04:53:51.060 | So I've really enjoyed it, and thank you,
04:53:53.900 | and if anybody has now listened for,
04:53:58.380 | what's it been, five and a half hours?
04:53:59.900 | - Yep.
04:54:00.740 | - Thank you for listening.
04:54:01.980 | - Welcome, five hour club.
04:54:03.620 | - Exactly. - Represent.
04:54:04.940 | (both laughing)
04:54:06.380 | Thank you, Jamie.
04:54:07.220 | - Thanks, Lex. - This was awesome.
04:54:09.020 | Thanks for listening to this conversation
04:54:10.580 | with Jamie Metzl.
04:54:11.820 | To support this podcast, please check out our sponsors
04:54:14.380 | in the description.
04:54:16.060 | And now, let me leave you with some words
04:54:18.140 | from Richard Feynman about science and religion,
04:54:21.600 | which I think also applies to science and geopolitics,
04:54:24.820 | because I believe scientists have the responsibility
04:54:27.260 | to think broadly about the world
04:54:29.180 | so that they may understand the bigger impact
04:54:31.580 | of their inventions.
04:54:33.480 | The quote goes like this.
04:54:34.900 | "In this age of specialization,
04:54:37.660 | "men who thoroughly know one field
04:54:39.580 | "are often incompetent to discuss another.
04:54:42.940 | "The old problems, such as the relation
04:54:45.060 | "of science and religion, are still with us,
04:54:47.700 | "and I believe present as difficult dilemmas as ever,
04:54:51.660 | "but they are not often publicly discussed
04:54:54.020 | "because of the limitations of specialization."
04:54:57.860 | Thank you for listening, and hope to see you next time.
04:55:00.780 | (upbeat music)
04:55:03.360 | (upbeat music)