Francis Collins: National Institutes of Health (NIH) | Lex Fridman Podcast #238


Chapters

0:00 Introduction
1:18 Lab-leak theory
4:26 Gain-of-function research of viruses
16:35 Bioterrorism
21:04 Tony Fauci
31:15 COVID Vaccines
37:20 Joe Rogan
44:23 Variants
49:05 Rapid at-home testing
53:18 Animal testing
58:44 Stepping down as director of the NIH
62:37 Barack Obama
64:40 Accelerating Medicines Partnership
75:18 Faith
80:47 Fear of death
83:49 Meaning of life

Transcript

00:00:00.000 | The following is a conversation with Francis Collins,
00:00:02.820 | director of the NIH, the National Institutes of Health,
00:00:06.920 | appointed and reappointed to the role by three presidents,
00:00:10.320 | Obama, Trump, and Biden.
00:00:13.160 | He oversees 27 separate institutes and centers,
00:00:16.680 | including NIAID, which makes him Anthony Fauci's boss.
00:00:20.840 | At the NIH, Francis helped launch
00:00:23.240 | and led a huge number of projects
00:00:25.920 | that pushed the frontiers of science, health, and medicine,
00:00:29.400 | including one of my favorites, the BRAIN Initiative,
00:00:32.880 | that seeks to map the human brain
00:00:34.680 | and understand how the function arises
00:00:37.360 | from neural circuitry.
00:00:39.600 | Before the NIH, Francis led the Human Genome Project,
00:00:43.220 | one of the largest and most ambitious efforts
00:00:45.800 | in the history of science.
00:00:47.900 | Given all that, Francis is a humble, thoughtful, kind man.
00:00:52.720 | And because of this, to me,
00:00:54.240 | he's one of the best representatives of science in the world.
00:00:58.320 | He's a man of God,
00:00:59.800 | and yet also a friend of the late Christopher Hitchens,
00:01:03.160 | who called him, quote,
00:01:04.880 | "One of the greatest living Americans."
00:01:08.280 | This is the Lex Fridman Podcast.
00:01:10.360 | To support it,
00:01:11.200 | please check out our sponsors in the description.
00:01:13.560 | And now, here's my conversation with Francis Collins.
00:01:17.420 | Science at its best is a source of hope.
00:01:21.320 | So for me, it's been difficult to watch,
00:01:23.480 | as it has during the pandemic,
00:01:25.760 | become at times a source of division.
00:01:28.200 | What I would love to do in this conversation with you
00:01:30.900 | is touch some difficult topics
00:01:32.920 | and do so with empathy and humility
00:01:35.160 | so that we may begin to regain a sense of trust in science,
00:01:39.040 | and that it may once again become a source of hope.
00:01:41.980 | I hope that's okay with you.
00:01:43.200 | - I love the goal.
00:01:45.020 | - Let's start with some hard questions.
00:01:47.620 | You called for, quote,
00:01:49.120 | "Thorough, expert-driven, and objective inquiry
00:01:51.920 | "into the origins of COVID-19."
00:01:54.240 | So let me ask,
00:01:55.520 | is there a reasonable chance that COVID-19 leaked
00:01:58.400 | from a lab?
00:01:59.240 | - I can't exclude that.
00:02:01.920 | I think it's fairly unlikely.
00:02:04.140 | I wish we had more ability to be able to ask questions
00:02:07.720 | of the Chinese government
00:02:09.040 | and learn more about what kind of records
00:02:11.820 | might have been in the lab
00:02:13.360 | that we've never been able to see.
00:02:15.600 | But most likely, this was a natural origin of a virus,
00:02:19.560 | probably starting in a bat,
00:02:20.880 | perhaps traveling through some other intermediate,
00:02:23.240 | yet to be identified host,
00:02:25.160 | and finding its way into humans.
00:02:27.440 | - Is answering this question within the realm of science,
00:02:29.880 | do you think, will we ever know?
00:02:31.920 | - I think we might know if we find that intermediate host.
00:02:35.560 | And there has not yet been a thorough enough investigation
00:02:39.560 | to say that that's not going to happen.
00:02:41.880 | And remember, it takes a while to do this.
00:02:44.680 | With SARS, it was 14 years before we figured out
00:02:47.620 | it was the civet cat that was the intermediate host.
00:02:50.760 | With MERS, it was a little quicker
00:02:52.280 | to discover it was the camel.
00:02:54.060 | With SARS-CoV-2, there's been some looking,
00:02:56.820 | but especially now, with everything really tense
00:02:59.800 | between the US and China,
00:03:01.480 | if there's looking going on,
00:03:02.600 | we're not getting told about it.
00:03:05.000 | - Do you think it's a scientific question
00:03:06.640 | or a political question?
00:03:08.320 | - It's a scientific question,
00:03:09.680 | but it has political implications.
00:03:12.320 | - So the world is full of scientists
00:03:14.560 | that are working together,
00:03:16.040 | but in the political space,
00:03:17.240 | in the politics-of-science space, there's tensions.
00:03:21.320 | What is it like to do great science
00:03:23.820 | in a time of a pandemic when there's political tensions?
00:03:27.860 | - It's very unfortunate.
00:03:29.960 | Pasteur said science knows no one country.
00:03:34.100 | He was right about that.
00:03:35.820 | My whole career, in genetics especially,
00:03:39.420 | has depended upon international collaboration
00:03:42.320 | between scientists as a way
00:03:44.580 | to make discoveries, get things done.
00:03:46.300 | Scientists, by their nature,
00:03:48.620 | like to be involved in international collaborations.
00:03:52.200 | The Human Genome Project, for heaven's sake,
00:03:54.800 | 2,400 scientists in six countries working together,
00:03:58.860 | not worrying who was gonna get the credit,
00:04:00.520 | giving all the data away.
00:04:02.240 | I was the person who was supposed
00:04:03.600 | to keep all that coordinated.
00:04:04.960 | It was a wonderful experience.
00:04:07.120 | And that included China.
00:04:08.380 | That was sort of their first real entry
00:04:10.640 | into a big international, big science kind of project,
00:04:14.600 | and they did their part.
00:04:15.940 | It's very different now.
00:04:19.100 | - Continuing the line of difficult questions,
00:04:21.840 | especially difficult ethical questions.
00:04:26.580 | In 2014, the US put a hold on gain-of-function research
00:04:30.740 | in response to a number of laboratory biosecurity incidents,
00:04:34.060 | including anthrax, smallpox, and influenza.
00:04:37.000 | In December 2017, NIH lifted this ban because, quote,
00:04:42.000 | "Gain-of-function research is important
00:04:44.020 | "in helping us identify, understand,
00:04:46.620 | "and develop strategies and effective countermeasures
00:04:49.040 | "against rapidly evolving pathogens
00:04:50.920 | "that pose a threat to public health."
00:04:53.120 | All difficult questions have arguments on both sides.
00:04:57.800 | Can you argue the pros and cons
00:04:59.800 | of gain-of-function research with viruses?
00:05:02.560 | - I can.
00:05:04.960 | And first, let me say this term, gain-of-function,
00:05:09.000 | is causing such confusion that I need to take a minute
00:05:12.360 | and just sort of talk about what the common scientific
00:05:16.260 | use of that term is and where it is very different
00:05:19.800 | when we're talking about the current oversight
00:05:22.680 | of potentially dangerous human pathogens.
00:05:25.320 | As you know, in science, we're doing gain-of-function
00:05:29.360 | experiments all the time.
00:05:31.000 | We support a lot of cancer immunotherapy at NIH.
00:05:36.240 | Right here in our clinical center,
00:05:37.960 | there are trials going on where people's immune cells
00:05:41.000 | are taken out of their body, treated with a genetic therapy
00:05:44.400 | that revs up their ability to discover the cancer
00:05:47.380 | that that patient currently has, maybe even at stage four,
00:05:50.620 | and then give them back as those little ninja warriors
00:05:54.940 | go after the cancer.
00:05:56.140 | And it sometimes works dramatically.
00:05:58.740 | That's gain-of-function.
00:06:00.180 | You gave that patient a gain in their immune function
00:06:02.800 | that may have saved their life.
00:06:04.640 | So we gotta be careful not to say,
00:06:05.860 | "Oh, gain-of-function is bad."
00:06:08.020 | Most of what we do in science that's good
00:06:11.600 | involves quite a bit of that.
00:06:14.180 | And we are all living with gains-of-function every day.
00:06:16.320 | I have a gain-of-function 'cause I'm wearing
00:06:17.960 | these eyeglasses, otherwise I would not be seeing you
00:06:21.000 | as clearly.
00:06:22.020 | I'm happy for that gain-of-function.
00:06:24.740 | So that's where a lot of confusion has happened.
00:06:27.800 | The kind of gain-of-function which is now subject
00:06:30.580 | to very rigorous and very carefully defined oversight
00:06:35.440 | is when you are working with an established human pathogen
00:06:38.880 | that is known to be potentially causing a pandemic,
00:06:42.740 | and you are enhancing or potentially enhancing
00:06:46.720 | its transmissibility or its virulence.
00:06:49.940 | We call that EPPP, Enhanced Potential Pandemic Pathogen.
00:06:54.940 | That requires this very stringent oversight,
00:07:01.040 | worked out over three years
00:07:03.480 | by the National Science Advisory Board for Biosecurity
00:07:08.240 | that needs to be looked at by a panel
00:07:11.340 | that goes well beyond NIH to decide,
00:07:13.720 | are the benefits worth the risks in that situation?
00:07:17.160 | Most of the time, it's not worth the risk.
00:07:21.320 | Only three times in the last three or four years
00:07:25.120 | have experiments been given permission to go forward.
00:07:28.320 | They were all on influenza.
00:07:29.800 | So I will argue that if you're worried
00:07:33.160 | about the next pandemic,
00:07:35.600 | the more you know about the coming enemy,
00:07:37.920 | the better chance you have to recognize
00:07:40.040 | when trouble is starting.
00:07:41.960 | And so if you can do it safely,
00:07:44.400 | studying influenza or coronaviruses like SARS, MERS,
00:07:48.840 | and SARS-CoV-2 would be a good thing
00:07:51.780 | to be able to know about,
00:07:53.200 | but you have to be able to do it safely
00:07:55.400 | because we all know lab accidents can happen.
00:07:59.160 | I mean, look at SARS where there have been lab accidents
00:08:02.640 | and people who have gotten sick as a result.
00:08:05.080 | We don't want to take that chance
00:08:06.760 | unless there's a compelling scientific reason.
00:08:08.800 | That's why we have this very stringent oversight.
00:08:13.360 | The experiments being done
00:08:15.000 | at the Wuhan Institute of Virology
00:08:17.680 | as a subaward to our grant to EcoHealth in New York
00:08:22.680 | did not meet that standard
00:08:24.840 | of requiring that kind of stringent oversight.
00:08:27.520 | I want to be really clear about that
00:08:29.160 | 'cause there's been so much thrown around about it.
00:08:32.160 | Was it gain of function?
00:08:33.560 | Well, in the standard use of that term
00:08:36.080 | that you would use in science in general,
00:08:38.000 | you might say it was,
00:08:38.960 | but in the use of that term that applies
00:08:41.720 | to this very specific example
00:08:46.200 | of a potential pandemic pathogen, absolutely not.
00:08:50.160 | So nothing went on there that should not have happened
00:08:54.040 | based upon the oversight.
00:08:55.800 | There was an instance where the grantee institution
00:08:59.920 | failed to notify us about the result of an experiment
00:09:02.960 | that they were supposed to tell us
00:09:04.120 | where they mixed and matched some viral genomes
00:09:08.000 | and got a somewhat larger viral load as a result,
00:09:11.760 | but it was not EPPP.
00:09:13.960 | It was not getting into that zone
00:09:16.680 | that would have required this higher level of scrutiny.
00:09:19.240 | It was all bat viruses.
00:09:20.920 | These were not human pathogens.
00:09:22.920 | - So they didn't cross a threshold
00:09:25.840 | within that gray area that makes for an EPPP?
00:09:29.160 | - They did not.
00:09:30.360 | And anybody who's willing to take the time
00:09:33.040 | to look at what EPPP means and what those experiments were
00:09:37.040 | would have to agree with what I just said.
00:09:39.080 | - What is the biggest reason
00:09:40.640 | it didn't cross that threshold?
00:09:42.000 | Is it because it wasn't jumping to humans?
00:09:45.440 | Is it because it did not have a sufficient increase
00:09:48.560 | in virulence or transmissibility?
00:09:49.960 | What's your sense?
00:09:51.480 | - EPPP only applies to agents
00:09:54.640 | that are known human pathogens of pandemic potential.
00:10:00.760 | These were all bat viruses derived in the wild,
00:10:04.560 | not shown to be infectious to humans.
00:10:07.240 | Just looking at what happened
00:10:08.520 | if you took four different bat viruses
00:10:10.640 | and you tried moving the spike protein gene
00:10:13.480 | from one into one of the others
00:10:15.120 | to see whether it would bind better to the ACE2 receptor.
00:10:18.960 | That doesn't get across that threshold.
00:10:21.080 | And let me also say,
00:10:22.320 | for those who are trying to connect the dots here,
00:10:25.280 | which is the most troubling part of this,
00:10:27.320 | and say, well, this is how SARS-CoV-2 got started,
00:10:31.080 | that is absolutely demonstrably false.
00:10:34.640 | These bat viruses that were being studied
00:10:37.560 | had only about 80% similarity in their genomes
00:10:41.240 | to SARS-CoV-2.
00:10:42.400 | They were like decades away in evolutionary terms.
00:10:45.800 | And it is really irresponsible for people
00:10:48.000 | to claim otherwise.
00:10:49.080 | - Speaking of people who claim otherwise,
00:10:54.760 | Rand Paul, what do you make of the battle of words
00:10:58.480 | between Senator Rand Paul and Dr. Anthony Fauci
00:11:01.840 | over this particular point?
00:11:03.560 | - I don't want to talk about specific members of Congress,
00:11:07.200 | but I will say it's really unfortunate
00:11:09.440 | that Tony Fauci, who is the epitome
00:11:12.440 | of a dedicated public servant,
00:11:14.800 | has now somehow been targeted for political reasons
00:11:19.200 | as somebody that certain figures are trying to discredit,
00:11:23.440 | perhaps to try to distract from their own failings.
00:11:26.440 | This never should have happened.
00:11:28.280 | Here's a person who's dedicated his whole life
00:11:31.960 | to trying to prevent illnesses from infectious diseases,
00:11:35.560 | including HIV, in the 1980s and '90s,
00:11:38.720 | and now probably the most knowledgeable
00:11:42.440 | infectious disease physician in the world,
00:11:45.160 | and also a really good communicator,
00:11:48.200 | is out there telling the truth
00:11:50.200 | about where we are with SARS-CoV-2
00:11:52.680 | to certain political figures who don't want to hear it,
00:11:55.520 | and who are therefore determined to discredit him,
00:11:58.600 | and that is disgraceful.
00:12:00.360 | - So with politicians,
00:12:01.760 | they often play games with black and white.
00:12:04.640 | They try to sort of use the gray areas of science
00:12:09.640 | and then paint their own picture.
00:12:12.000 | But I have a question about the gray areas of science.
00:12:14.960 | So like you mentioned, gain of function is a term
00:12:18.280 | that has very specific scientific meaning,
00:12:20.640 | but it also has a more general term.
00:12:23.280 | And it's very possible to argue that the,
00:12:26.320 | not to argue, not the way politicians argue,
00:12:28.880 | but just as human beings and scientists,
00:12:31.120 | that there was a gain of function achieved
00:12:34.840 | at the Wuhan Institute of Virology,
00:12:37.880 | but it didn't cross a threshold.
00:12:39.720 | I mean, it could have, too.
00:12:43.480 | So here's the thing.
00:12:44.520 | When you do these kinds of experiments,
00:12:47.240 | unexpected results may be achieved,
00:12:50.320 | and that's the gray area of science.
00:12:52.360 | You're taking risks with such experiments.
00:12:55.180 | And I am very uncomfortable that we can't discuss
00:13:00.920 | the uncertainty in the gray area of this.
00:13:03.640 | - Oh, I'm comfortable discussing the gray area.
00:13:06.240 | What I'm uncomfortable with is people deciding
00:13:08.820 | to define for themselves what that threshold is
00:13:12.400 | based on sort of some political argument.
00:13:14.840 | The threshold was very explicitly laid out.
00:13:18.740 | Everybody agreed to that on the basis
00:13:21.700 | of this three years of deliberation.
00:13:23.760 | So that's what it is.
00:13:24.760 | If that threshold needs to be reconsidered,
00:13:27.200 | let's reconsider it, but let's not try to take
00:13:30.380 | an experiment that's already been done
00:13:32.440 | and decide that the threshold isn't what it was,
00:13:35.160 | 'cause that really is doing a disservice
00:13:37.680 | to the whole process.
00:13:38.880 | - I wish there was a discussion,
00:13:40.160 | even in response to Rand Paul,
00:13:43.040 | and I know we're not talking about specific senators,
00:13:45.160 | but just that particular case, I'm saying stuff here.
00:13:48.360 | I wish there was an opportunity to talk about,
00:13:51.040 | given the current threshold, this is not gain of function,
00:13:54.960 | but maybe we need to reconsider the threshold
00:13:56.860 | and have an actual, that's an opportunity
00:13:58.480 | for a discussion about the ethics of gain of function.
00:14:01.160 | You said that there were three studies
00:14:03.000 | that passed that threshold with influenza.
00:14:05.340 | That's a fascinating human question,
00:14:07.200 | scientific question about ethics,
00:14:09.280 | because like you said, there's pros and cons.
00:14:13.880 | You're taking risks here to prevent
00:14:18.120 | horribly destructive viruses in the future,
00:14:21.640 | but you also are risking creating
00:14:25.000 | such viruses in the future.
00:14:26.880 | With nuclear weapons and nuclear energy,
00:14:29.180 | nuclear energy promises a lot of positive effects,
00:14:35.160 | and yet you're taking risks here.
00:14:37.200 | The same with mutually assured destruction
00:14:39.840 | among nations possessing nuclear weapons.
00:14:41.920 | - Oh my.
00:14:42.760 | I hope we're not going there.
00:14:45.080 | - Well, we're not, but a lot of people argue
00:14:47.840 | that that's the reason we've,
00:14:49.040 | nuclear weapons is the reason we've prevented world wars,
00:14:52.400 | and yet they also have the risk of starting world wars.
00:14:56.520 | And this is what we have to be honest about
00:14:59.240 | with the benefits and risks of science,
00:15:01.760 | that you have to make that calculation.
00:15:04.480 | What are the pros and what are the cons?
00:15:06.160 | - I'm totally with you, but I want to reassure you, Lex,
00:15:09.380 | that this is not an issue that's been ignored.
00:15:12.000 | - Yes.
00:15:12.840 | - That this issue about the kind of gain of function
00:15:15.360 | that might result in a serious human pathogen
00:15:18.360 | has been front and center in many deliberations
00:15:21.640 | for a decade or more,
00:15:23.080 | involved a lot of my time along the way, by the way,
00:15:26.140 | and has been discussed publicly on multiple occasions,
00:15:29.560 | including two major meetings
00:15:31.640 | of the National Academy of Sciences,
00:15:34.220 | getting input from everybody,
00:15:35.800 | and ultimately arriving at our current framework.
00:15:38.520 | Now, we actually, back in January of 2020,
00:15:43.360 | just before COVID-19 changed everything,
00:15:46.680 | had planned and even charged
00:15:48.560 | that same National Science Advisory Board for Biosecurity
00:15:53.560 | to reconvene and look at the current framework and say,
00:15:57.360 | "Do we have it right?
00:15:58.600 | "Let's look at the experience over those three years
00:16:01.180 | "and say, is the threshold too easy, too hard?
00:16:05.160 | "Do we need to reconsider it?
00:16:06.280 | "Let's look at the experience."
00:16:07.920 | COVID came along, the members of the board said,
00:16:10.460 | "Please, we're all infectious disease experts.
00:16:12.300 | "We don't have time for this right now,
00:16:14.120 | "but I think the time is right to do this."
00:16:16.600 | I'm totally supportive of that,
00:16:18.140 | and that should be just as public a discussion
00:16:20.280 | as you can imagine about what are the benefits and the risks.
00:16:23.000 | And if somebody decided, ultimately,
00:16:25.900 | this came together and said,
00:16:26.860 | "We just shouldn't be doing these experiments
00:16:28.680 | "under any circumstances,"
00:16:30.000 | if that was the conclusion,
00:16:31.320 | well, that would be the conclusion,
00:16:32.760 | but it hasn't been so far.
00:16:34.200 | - If we can briefly look out
00:16:36.640 | into the next hundred years on this.
00:16:40.820 | I apologize for the existential questions,
00:16:44.280 | but it seems obvious to me
00:16:47.380 | that as gain-of-function type of research and development
00:16:52.220 | becomes easier and cheaper,
00:16:54.000 | it will pose a greater and greater risk.
00:16:58.060 | So if it no longer needs to be contained
00:17:01.580 | within laboratories of high security,
00:17:04.900 | it feels like this is one of the greatest threats
00:17:08.540 | facing human civilization.
00:17:10.500 | Do you worry that at some point in the future,
00:17:12.880 | a leaked man-made virus
00:17:14.680 | may destroy most of human civilization?
00:17:18.640 | - I do worry about the risks.
00:17:20.680 | And at the moment where we have the greatest control,
00:17:24.420 | the greatest oversight,
00:17:26.040 | is when this is federally funded research.
00:17:29.080 | But as you're alluding,
00:17:30.280 | there's no reason to imagine that's the only place
00:17:33.160 | that this kind of activity would go on.
00:17:35.800 | If there was an evil source that wished to create a virus
00:17:40.800 | that was highly pathogenic in their garage,
00:17:43.540 | the technology does get easier.
00:17:46.580 | And there is no international oversight about this either
00:17:50.500 | that you could say has the same stringency
00:17:52.580 | as what we have in the United States.
00:17:54.920 | So yes, that is a concern.
00:17:58.100 | It would take a seriously deranged group or person
00:18:03.020 | to undertake this on purpose,
00:18:05.460 | given the likelihood that they too would go down.
00:18:08.420 | We don't imagine there are going to be bioweapons
00:18:13.420 | that only kill your enemies and don't kill you.
00:18:15.700 | Sorry, we're too much alike for that to work.
00:18:18.340 | So I don't see it as an imminent risk.
00:18:23.100 | There's lots of scary novels and movies written about it,
00:18:27.580 | but I do think it's something we have to consider.
00:18:30.940 | What are all the things that ought to be watched?
00:18:32.980 | You may not know that if somebody is ordering
00:18:36.340 | a particular oligonucleotide from one of the main suppliers
00:18:41.340 | and it happens to match smallpox, they're gonna get caught.
00:18:45.800 | So there is effort underway to try to track
00:18:50.220 | any nefarious actions that might be going on.
00:18:52.660 | - In the United States or internationally?
00:18:54.100 | Is there an international collaboration
00:18:55.780 | of trying to track this stuff?
00:18:57.340 | - There is some.
00:18:58.180 | I wish it were stronger.
00:19:00.480 | - This is a general issue, Lex, in terms of,
00:19:03.400 | do we have a mechanism, particularly when it comes
00:19:05.800 | to ethical issues, to be able to decide what's allowable
00:19:09.680 | and what's not and enforce it?
00:19:11.360 | I mean, look where we are with germline genome editing
00:19:14.200 | for humans, for instance.
00:19:15.640 | There is no enforcement mechanism.
00:19:17.600 | There's just bully pulpits and governments
00:19:19.920 | that get to decide for themselves.
00:19:21.880 | - So you talked about evil.
00:19:23.080 | What about incompetence?
00:19:24.440 | Does that worry you?
00:19:25.360 | I was born in the Soviet Union.
00:19:28.040 | My dad, a physicist, worked at Chernobyl.
00:19:31.000 | That comes to mind.
00:19:32.440 | That wasn't evil.
00:19:33.640 | That was, I don't know what word you wanna put it.
00:19:36.420 | Maybe incompetence is too harsh.
00:19:38.400 | Maybe it's the inherent incompetence of bureaucracy.
00:19:40.980 | I don't know.
00:19:41.820 | But for whatever reason, there was an accident.
00:19:43.880 | Does that worry you?
00:19:45.120 | - Of course it does.
00:19:46.800 | We know that SARS, for instance, did manage to leak
00:19:50.220 | out of a lab in China two or three times,
00:19:53.400 | and at least in some instances, people died.
00:19:55.720 | Unfortunately, quickly contained.
00:19:57.580 | All one can do in that circumstance,
00:20:00.680 | because you need to study the virus and understand it
00:20:04.320 | in order to keep it from causing a broader pandemic,
00:20:07.560 | but you need to insist upon the kind of biosecurity,
00:20:11.100 | the BSL-2, 3, and 4 framework
00:20:14.080 | under which those experiments have to be done.
00:20:17.080 | And certainly at NIH, we're extremely rigorous about that,
00:20:20.660 | but you can't count on every human being
00:20:23.640 | to always do exactly what they're supposed to.
00:20:26.200 | So there's a risk there, which is another reason
00:20:28.560 | why if we're contemplating supporting research
00:20:32.000 | on pathogens that might be the next pandemic,
00:20:35.000 | you have to factor that in,
00:20:36.440 | not just whether people are gonna do something
00:20:39.240 | that we couldn't have predicted,
00:20:40.520 | where all of a sudden they created a virus
00:20:42.280 | that's much worse without knowing they were gonna do that,
00:20:44.280 | but also just having an accident.
00:20:46.400 | That's in the mix when those estimates are done
00:20:49.980 | about whether the risk is worth it or not.
00:20:53.080 | - Continuing on the line of difficult questions.
00:20:56.360 | (laughing)
00:20:57.400 | - We're gonna get to fun stuff after a while.
00:20:58.920 | - We will soon, I promise.
00:21:00.800 | You are the director of the NIH.
00:21:06.820 | You are, technically, Dr. Anthony Fauci's boss.
00:21:11.920 | - Yep.
00:21:12.760 | - You have stood behind him.
00:21:14.440 | You have supported him,
00:21:16.080 | just like you did already in this conversation.
00:21:18.540 | It is painful for me to see division and distrust,
00:21:21.920 | but many people in politics and elsewhere
00:21:25.240 | have called for Anthony Fauci to be fired.
00:21:27.960 | When there's such calls of distrust in public
00:21:31.280 | about a leader like Anthony Fauci,
00:21:33.240 | who should garner trust, do you think he should be fired?
00:21:37.700 | - Absolutely not.
00:21:39.180 | To do so would be basically to give the opportunity
00:21:45.440 | for those who wanna make up stories about anybody
00:21:50.320 | to destroy them.
00:21:51.600 | There is nothing in the ways in which Tony Fauci
00:21:55.160 | has been targeted that is based upon truth.
00:21:58.540 | How could we then accept those cries for his firing
00:22:04.720 | as having legitimacy?
00:22:07.000 | It's a circular argument.
00:22:08.320 | They've decided they don't like Tony,
00:22:10.500 | so they make up stuff and they twist comments
00:22:13.600 | that he's made about things like gain of function,
00:22:15.960 | where he's referring to the very specific gain of function
00:22:19.440 | that's covered by this policy.
00:22:21.560 | And they're trying to say he lied to the Congress.
00:22:24.360 | That's simply not true.
00:22:26.280 | They don't like the fact that Tony changes
00:22:28.920 | the medical recommendations about what to do with COVID-19
00:22:33.240 | over the space of more than a year.
00:22:35.640 | And they call that flip-flopping and you can't trust the guy
00:22:37.920 | 'cause he says one thing last year and one thing this year.
00:22:40.320 | Well, the science has changed.
00:22:42.640 | Delta variant has changed everything.
00:22:44.860 | You don't want him to be saying the same thing
00:22:47.080 | he did a year ago.
00:22:48.040 | That would be wrong now.
00:22:49.060 | It was the best we could do then.
00:22:50.520 | People don't understand that
00:22:52.000 | or else they don't want to understand that.
00:22:54.800 | So when you basically whip up a largely political argument
00:22:59.600 | against a scientist and hammer at it over and over again
00:23:03.280 | to the point where he now has to have 24/7 security
00:23:06.640 | to protect him against people
00:23:08.100 | who really want to do violence to him,
00:23:10.560 | for that to be a reason to say that then he should be fired
00:23:13.520 | is to hand the evil forces the victory.
00:23:17.020 | I will not do that.
00:23:19.260 | (sighs)
00:23:21.540 | - Yet there's something difficult
00:23:23.580 | I'm going to try to express to you.
00:23:26.180 | So it may be your guitar playing.
00:23:28.720 | It may be something else, but there's a humility to you.
00:23:33.580 | It may be because you're a man of God.
00:23:36.260 | There's a humility to you that garners trust.
00:23:41.260 | And when you're in a leadership position,
00:23:47.980 | representing science,
00:23:49.220 | especially in catastrophic events like the pandemic,
00:23:52.640 | it feels like as a leader,
00:23:55.540 | you have to go far above and beyond your usual duties.
00:24:00.540 | And I think there's no question
00:24:02.680 | that Anthony Fauci has delivered on his duties,
00:24:06.980 | but it feels like he needs to go above
00:24:09.740 | as a science communicator.
00:24:11.080 | And if there's a large number of people
00:24:13.340 | that are distrusting him,
00:24:17.620 | it's also his responsibility to garner their trust,
00:24:20.780 | to gain their trust.
00:24:22.000 | As a person who's the face of science,
00:24:26.860 | are you torn on this?
00:24:28.180 | The responsibility of Anthony Fauci, of yourself,
00:24:31.340 | to represent science,
00:24:33.600 | not just the communication of advising what should be done,
00:24:37.860 | but giving people hope, giving people trust in science,
00:24:42.860 | and alleviating division.
00:24:45.420 | Do you think that's also a responsibility of a leader,
00:24:48.060 | or is that unfair to ask?
00:24:49.480 | - I think the best way you give people trust
00:24:52.660 | is to tell them the truth.
00:24:54.260 | And so they recognize that when you're sharing information,
00:24:57.580 | it's the best you've got at that point.
00:24:59.320 | And Tony Fauci does that at every moment.
00:25:02.900 | I don't think him expressing more humility
00:25:06.340 | would change the fact that they're looking for a target
00:25:09.220 | of somebody to blame,
00:25:11.300 | to basically distract people
00:25:13.460 | from the failings of their own political party.
00:25:16.820 | Maybe I'm less targeted, not because of a difference
00:25:20.380 | in the way in which I convey the information,
00:25:23.060 | I'm less visible.
00:25:24.820 | If Tony were out of the scene and I was placed in that role,
00:25:28.700 | I'd probably be seeing a ratcheting up
00:25:31.700 | of that same targeting.
00:25:33.000 | - I would like to believe that if Tony Fauci said
00:25:39.460 | that when I originally made recommendations
00:25:42.680 | not to wear masks,
00:25:44.600 | that was based on our best available data,
00:25:47.780 | and now we know that is a mistake.
00:25:50.440 | So admit with humility that there's an error.
00:25:53.060 | That's not actually correct,
00:25:56.900 | but that's a statement of humility.
00:25:59.940 | And I would like to believe, despite the attacks,
00:26:04.380 | he would win a lot of people over with that.
00:26:06.980 | So a lot of people, as you're saying,
00:26:09.100 | would use that, see that, here we go,
00:26:11.820 | here's that Dr. Anthony Fauci making mistakes.
00:26:15.140 | How can we trust him on anything?
00:26:16.820 | I believe if he was,
00:26:18.520 | that public display of humility to say that I made an error,
00:26:25.120 | that would win a lot of people over.
00:26:28.060 | That's kind of my sense,
00:26:31.980 | to face the fire of the attacks on politics,
00:26:34.980 | like politicians will attack no matter what.
00:26:37.900 | But the question is the people,
00:26:40.620 | to win over the people.
00:26:41.740 | The biggest concern I've had
00:26:44.180 | is that there was this distrust of science
00:26:47.820 | that's been brewing.
00:26:49.540 | And maybe you can correct me,
00:26:51.980 | but I'm a little bit unwilling to fully blame
00:26:54.580 | the politicians,
00:26:55.900 | 'cause politicians play their games no matter what.
00:26:58.620 | It just feels like this was an opportunity
00:27:02.800 | to inspire people with the power of science.
00:27:05.320 | The development of the vaccines,
00:27:07.500 | no matter what you think of those vaccines,
00:27:09.880 | is one of the greatest accomplishments
00:27:11.500 | in the history of science.
00:27:12.660 | - It is indeed.
00:27:13.720 | - And the fact that that's not inspiring,
00:27:17.140 | listen, I host a podcast.
00:27:18.900 | Whenever I say positive stuff about the vaccine,
00:27:21.200 | I get to hear a lot of different opinions.
00:27:23.980 | - I bet you do.
00:27:24.820 | - The fact that I do is a big problem to me,
00:27:28.400 | because it's an incredible,
00:27:30.360 | an incredible accomplishment of science.
00:27:33.320 | And so I'm sorry,
00:27:36.460 | but I have to put responsibility on the leaders,
00:27:40.020 | even if it's not their mistakes.
00:27:42.420 | That's what the leadership is.
00:27:43.800 | That's what leadership is.
00:27:44.840 | You take responsibility for the situation.
00:27:47.300 | I wonder if there's something
00:27:48.700 | that could have been done better
00:27:50.760 | to give people hope
00:27:54.480 | that science will save us
00:27:55.980 | as opposed to science will divide us.
00:27:58.220 | - I think you have more confidence
00:28:03.440 | in the ability to get beyond our current divisions
00:28:06.900 | than I do after seeing just how deep
00:28:09.700 | and dark they have become.
00:28:12.040 | Tony Fauci has said multiple times
00:28:14.660 | the recommendation about not wearing masks
00:28:17.620 | was for two reasons,
00:28:19.380 | a shortage of masks, which were needed in hospitals,
00:28:22.380 | and a lack of realization early
00:28:25.140 | in the course of the epidemic
00:28:27.180 | that this was a virus
00:28:28.620 | that could be heavily spread by asymptomatic people.
00:28:33.520 | Has that changed?
00:28:35.280 | He changed.
00:28:36.480 | Now, did he make an error?
00:28:37.600 | No, he was making a judgment
00:28:39.440 | based on the data available at the time,
00:28:41.720 | but he certainly made that clear over and over again.
00:28:45.240 | It has not stopped those who would like to demonize him
00:28:48.400 | from saying, "Well, he just flip-flopped.
00:28:50.400 | "You can't trust a guy.
00:28:52.580 | "He says one thing today and one thing tomorrow."
00:28:54.980 | - Well, masks is a tricky one.
00:28:58.320 | So I'm actually-- - It is a tricky one.
00:29:00.080 | - Early on, I'm a co-author on a paper,
00:29:02.140 | one of many, but this was a survey paper
00:29:04.880 | looking over the evidence.
00:29:08.400 | It's a summary of the evidence we have
00:29:10.320 | for the effectiveness of masks.
00:29:12.420 | It seems that it's difficult
00:29:15.640 | to do rigorous scientific study on masks.
00:29:18.480 | - It is difficult.
00:29:19.880 | - There's a lot of philosophical and ethical questions
00:29:22.040 | I want to ask you.
00:29:22.880 | Well, within this,
00:29:24.320 | it's back to your words and Anthony Fauci's words.
00:29:30.560 | When you're dealing with so much uncertainty
00:29:33.780 | and so much potential uncertainty
00:29:36.100 | about how catastrophic this virus is in the early days,
00:29:39.240 | and knowing that each word you say may create panic,
00:29:44.900 | how do you communicate science with the world?
00:29:48.480 | It's a philosophical, it's an ethical,
00:29:53.520 | it's a practical question.
00:29:55.640 | There was a discussion about masks a century ago
00:29:59.280 | and that too led to panic.
00:30:01.280 | So, I mean, I'm trying to put myself in your mind
00:30:08.160 | and the mind of Anthony Fauci in those early days,
00:30:10.400 | knowing that there's limited supply of masks.
00:30:13.160 | Like, what do you say?
00:30:15.040 | Do you fully convey the uncertainty of the situation,
00:30:18.880 | of the challenges of the supply chain?
00:30:22.840 | Or do you say that masks don't work?
00:30:26.680 | That's a complicated calculation.
00:30:29.680 | How do you make that calculation?
00:30:31.380 | - It is a complicated calculation.
00:30:35.400 | As a scientist, your temptation would be
00:30:39.080 | to give a full brain dump of all the details
00:30:43.360 | of the information about what's known and what isn't known
00:30:45.920 | and what experiments need to be done.
00:30:47.760 | Most of the time, that's not gonna play well
00:30:51.000 | in a soundbite on the evening news.
00:30:53.280 | So you have to kind of distill it down to a recommendation
00:30:56.040 | that is the best you can do at that time
00:30:58.600 | with the information you've got.
00:31:00.200 | - So you're a man of God.
00:31:03.400 | And we'll return to that to talk about
00:31:05.480 | some also unanswerable philosophical questions.
00:31:09.860 | But first, let's linger on the vaccine
00:31:13.160 | because in the religious, in the Christian community,
00:31:16.640 | there was some hesitancy with the vaccine.
00:31:18.960 | - Still is.
00:31:19.880 | - Still is.
00:31:20.720 | There's a lot of data showing high efficacy
00:31:24.160 | and safety of vaccines, of COVID vaccines,
00:31:27.760 | but still they are far from perfect as all vaccines are.
00:31:31.840 | Can you empathize with people who are hesitant
00:31:33.880 | to take the COVID vaccine
00:31:35.320 | or to have their children take the COVID vaccine?
00:31:38.960 | - I can totally empathize,
00:31:41.520 | especially when people are barraged
00:31:43.280 | by conflicting information coming at them
00:31:45.360 | from all kinds of directions.
00:31:47.100 | I've spent a lot of my time in the last year
00:31:50.600 | trying to figure out how to do a better job of listening
00:31:54.000 | because I think we have all got the risk
00:31:58.920 | of assuming we know the basis for somebody's hesitancy.
00:32:03.200 | And that often doesn't turn out to be what you thought.
00:32:07.600 | And the variety of reasons is quite broad.
00:32:11.640 | I think a big concern is just this sense of uncertainty
00:32:16.840 | about whether this was done too fast
00:32:19.240 | and that corners were cut.
00:32:20.640 | And there are good answers to that.
00:32:23.280 | Along with that, a sense that maybe this vaccine
00:32:28.200 | will have long-term effects that we won't know about
00:32:30.640 | for years to come.
00:32:32.340 | And one can say that hasn't been seen with other vaccines.
00:32:35.840 | And there's no particular reason to think
00:32:37.480 | this one's going to be different
00:32:38.800 | than the dozens of others that we have experience with.
00:32:41.100 | But you can't absolutely say,
00:32:43.200 | no, there's no chance of that.
00:32:46.040 | So it does come down to listening
00:32:49.320 | and then trying in a fashion that doesn't convey a message
00:32:54.320 | that you're smarter than the person you're talking to
00:32:58.320 | 'cause that isn't gonna help
00:32:59.760 | to really address what the substance is of the concerns.
00:33:03.980 | But my heart goes out to so many people
00:33:07.720 | who are fearful about this because of all the information
00:33:11.480 | that has been dumped on them.
00:33:14.980 | Some of it by politicians, a lot of it by the internet,
00:33:18.540 | some of it by parts of the media
00:33:20.920 | that seem to take pleasure in stirring up this kind of fear
00:33:25.920 | for their own reasons.
00:33:29.520 | And that is shameful.
00:33:31.440 | I'm really sympathetic with the people
00:33:34.160 | who are confused and fearful.
00:33:36.680 | I am not sympathetic with people
00:33:38.400 | who are distributing information that's demonstrably false
00:33:41.720 | and continue to do so.
00:33:43.480 | They're taking lives.
00:33:45.460 | I didn't realize how strong that sector of disinformation
00:33:50.460 | would be.
00:33:53.320 | And it's been in many ways more effective
00:33:56.740 | than the means of spreading the truth.
00:33:58.640 | This is gonna take us into another place.
00:34:02.200 | But Lex, if there's something I'm really worried about
00:34:06.040 | in this country, and it's not just this country,
00:34:08.500 | but it's the one I live in,
00:34:10.080 | is that we have another epidemic besides COVID-19,
00:34:14.480 | and it's an epidemic of the loss of the anchor of truth.
00:34:18.500 | The truth as a means of making decisions,
00:34:23.240 | truth as a means of figuring out
00:34:25.560 | how to wrestle with a question like,
00:34:28.720 | should I get this vaccine for myself or my children,
00:34:32.080 | seems to have lost its primacy.
00:34:34.460 | And instead, it's an opinion of somebody
00:34:38.840 | who expressed it very strongly,
00:34:42.600 | or some Facebook post that I read two hours ago.
00:34:46.760 | And for those to become substitutes for objective truth,
00:34:52.700 | not just, of course, for vaccines,
00:34:56.760 | but for many other issues,
00:34:58.080 | like was the 2020 election actually fair?
00:35:01.560 | This worries me deeply.
00:35:05.040 | It's bad enough to have polarization and divisions,
00:35:08.840 | but to have no way of resolving those
00:35:11.680 | by actually saying, okay, what's true here,
00:35:13.840 | makes me very worried about the path we're on.
00:35:17.440 | And I'm usually an optimist.
00:35:18.920 | - Well, to give you an optimistic angle on this,
00:35:22.880 | I actually think that this sense
00:35:26.880 | that there's no one place for truth
00:35:29.720 | is just a thing that will inspire leaders
00:35:33.480 | and science communicators to speak,
00:35:35.680 | not from a place of authority,
00:35:37.280 | but from a place of humility.
00:35:39.120 | I think it's just challenging people
00:35:41.080 | to communicate in a new way, to be listeners first.
00:35:45.040 | I think the problem isn't that
00:35:47.680 | there's a lot of misinformation.
00:35:49.860 | I think that people,
00:35:54.500 | the internet and the world are distrustful
00:36:00.680 | of people who speak as if they possess the truth
00:36:04.880 | with an authoritarian kind of tone,
00:36:08.160 | which was, I think, defining
00:36:10.120 | for what science was in the 20th century.
00:36:12.520 | I just think it has to sound different in the 21st.
00:36:15.320 | In the battle of ideas, I think humility and love wins.
00:36:21.280 | And that's how science wins,
00:36:24.680 | not through having quote-unquote truth.
00:36:27.140 | 'Cause now everybody can just say, "I have the truth."
00:36:30.840 | I think you have to speak, like I said,
00:36:34.140 | from humility, not authority.
00:36:35.500 | And so it just challenges our leaders to go back
00:36:39.100 | and learn to be, pardon my French, less assholes
00:36:43.260 | and more kind.
00:36:45.540 | And like you said, to listen,
00:36:47.940 | to listen to the experiences of people that are good people,
00:36:51.460 | not the ones who are trying to manipulate the system
00:36:53.620 | or play a game and so on,
00:36:55.140 | but real people who are just afraid of uncertainty,
00:36:59.300 | of hurting those they loved and so on.
00:37:02.540 | So I think it's just an opportunity for leaders
00:37:04.320 | to go back and take a class on effective communication.
00:37:07.420 | - I'm with you on shifting more from where we are
00:37:13.020 | to humility and love.
00:37:14.180 | That's gotta be the right answer.
00:37:15.460 | That's very biblical, by the way.
00:37:17.100 | - We'll get there.
00:37:19.280 | I have to bring up Joe Rogan.
00:37:22.460 | I don't know if you know who he is.
00:37:24.100 | - I do.
00:37:24.940 | - He's a podcaster, comedian, fighting commentator,
00:37:27.780 | and my now friend.
00:37:30.340 | - And Ivermectin believer too.
00:37:32.700 | - Yes, that is the question I have to ask you about.
00:37:35.700 | He has gotten some flack in the mainstream media
00:37:39.020 | for not getting vaccinated.
00:37:40.860 | And when he got COVID recently,
00:37:42.820 | taking Ivermectin as part of a cocktail of treatments.
00:37:46.940 | The NIH actually has a nice page on Ivermectin saying,
00:37:50.540 | quote, "There's insufficient evidence to recommend
00:37:53.740 | either for or against the use of Ivermectin
00:37:57.180 | for the treatment of COVID-19.
00:37:59.340 | Results from adequately powered, well-designed,
00:38:02.580 | and well-conducted clinical trials are needed
00:38:04.800 | to provide more specific evidence-based guidance
00:38:07.700 | on the role of Ivermectin in the treatment of COVID-19."
00:38:11.380 | So let me ask, why do you think there has been
00:38:14.660 | so much attack on Joe Rogan and anyone else
00:38:18.400 | that's talking about Ivermectin
00:38:20.260 | when there's insufficient evidence for or against?
00:38:24.400 | - Well, let's unpack that.
00:38:26.600 | First of all, I think the concerns about Joe
00:38:28.680 | are not limited to his taking Ivermectin.
00:38:32.080 | Much more seriously, his being fairly publicly negative
00:38:35.680 | about vaccines at a time where people are dying.
00:38:39.060 | 700,000 people have died from COVID-19.
00:38:43.160 | Estimates by Kaiser are that at least 100,000 of those
00:38:47.160 | were unnecessary deaths of unvaccinated people.
00:38:50.400 | And for Joe to promote that further,
00:38:52.760 | even as this pandemic rages through our population,
00:38:57.760 | is simply irresponsible.
00:39:00.680 | So yeah, the Ivermectin is just one other twist.
00:39:03.020 | Obviously, Ivermectin has been controversial
00:39:05.920 | for months and months.
00:39:07.360 | The reason that it got particular attention
00:39:10.440 | is because of the way in which it seemed
00:39:12.400 | to have captured the imagination of a lot of people,
00:39:16.320 | and to the point where they were taking doses
00:39:18.500 | that were intended for livestock.
00:39:20.960 | And some of them got pretty sick as a result
00:39:23.000 | from overdosing on this stuff.
00:39:25.160 | That was not good judgment.
00:39:26.680 | The drug itself remains uncertain.
00:39:31.560 | There's a recent review that looks at all of the studies
00:39:35.360 | of Ivermectin and basically concludes
00:39:38.300 | that it probably doesn't work.
00:39:40.600 | We are running a study right now.
00:39:42.160 | I looked at that data this morning
00:39:44.040 | in a trial called ACTIV-6, which is one of the ones
00:39:48.240 | that my public-private partnership is running.
00:39:51.000 | We're up to about 400 patients who've been randomized
00:39:53.920 | to Ivermectin or placebo, and should know,
00:39:57.720 | perhaps as soon as a month from now,
00:39:59.480 | in a very carefully controlled trial,
00:40:01.900 | did it help or did it not?
00:40:03.640 | So there will be an answer.
00:40:06.160 | Coming back to Joe, again, I don't think,
00:40:09.520 | the fact that he took Ivermectin,
00:40:11.160 | hoping it might work, is that big a knock against him.
00:40:14.700 | It's more the conveying of, we don't trust what science says,
00:40:19.300 | which is vaccines are gonna save your life.
00:40:21.100 | We're gonna trust what's on the internet
00:40:22.900 | that says Ivermectin and hydroxychloroquine
00:40:25.140 | really do work, even though the scientific community
00:40:27.280 | says probably not.
00:40:28.380 | - So let me push back on that a little bit.
00:40:31.220 | So he doesn't say, let's not listen to science.
00:40:35.380 | He doesn't say don't get vaccinated.
00:40:38.460 | He says it's okay to ask questions.
00:40:44.060 | - I'm okay with that.
00:40:44.980 | - How risky is the vaccine for certain populations?
00:40:48.900 | What are the benefits and risks?
00:40:51.860 | There's other friends of Joe and friends of mine,
00:40:55.900 | like Sam Harris, who says, if you look at the data,
00:40:59.820 | it's obvious that the benefits outweigh the risks.
00:41:03.180 | And what Joe says is, yes,
00:41:05.700 | but let's still openly talk about risks.
00:41:09.500 | And he often brings up anecdotal evidence
00:41:12.060 | of people who've had highly negative effects from vaccines.
00:41:17.060 | Science is not done with anecdotal evidence.
00:41:20.340 | And so you could infer a lot of stuff
00:41:23.500 | from the way he expresses it,
00:41:24.900 | but he also communicates a lot of interesting questions.
00:41:27.700 | And that's something maybe you can comment on is,
00:41:31.740 | there's certain groups that are healthy.
00:41:34.980 | They're younger, they exercise a lot,
00:41:39.980 | they get nutrition and all those kinds of things.
00:41:43.420 | He shows skepticism on whether it's so obvious
00:41:48.420 | that they should get vaccinated.
00:41:50.020 | And the same is he makes this,
00:41:52.980 | he kind of presents the same kind of skepticism for kids,
00:41:57.260 | for young kids.
00:41:58.380 | So with empathy and listening my Russian ineloquent
00:42:03.380 | description of what Joe believes,
00:42:09.260 | what is your kind of response to that?
00:42:12.220 | Why should certain categories of healthy and young people
00:42:16.260 | still get vaccinated, do you think?
00:42:18.180 | - Well, first, just to say,
00:42:19.180 | it's great for Joe to be a skeptic, to ask questions.
00:42:22.140 | We should all be doing that.
00:42:23.780 | But then the next step is to go and see what the data says
00:42:26.580 | and see if there are actually answers to those questions.
00:42:29.540 | So coming to healthy people,
00:42:31.380 | I've done a bunch of podcasts besides this one.
00:42:35.180 | The one I think I remember most
00:42:37.700 | was a podcast with a worldwide wrestling superstar.
00:42:42.700 | - Very nice.
00:42:43.780 | - He's about six foot six and just absolutely solid muscle.
00:42:48.140 | And he got COVID and he almost died.
00:42:51.360 | And recovering from that, he said,
00:42:54.540 | "I've got to let my supporters know."
00:42:58.460 | 'Cause you can imagine worldwide wrestling fans
00:43:00.980 | are probably not big embracers of the need for vaccines.
00:43:06.940 | And he just turned himself into a spokesperson
00:43:11.940 | for the fact that this virus doesn't care
00:43:15.020 | how healthy you are, how much you exercise,
00:43:17.900 | what a great specimen you are.
00:43:19.380 | It wiped him out.
00:43:21.900 | And we see that.
00:43:23.060 | The average person in the ICU right now with COVID-19
00:43:27.900 | is under age 50.
00:43:30.100 | I think there's a lot of people still thinking,
00:43:31.660 | "Oh, it's just those old people in the nursing homes.
00:43:33.700 | "That's not gonna be about me."
00:43:35.020 | They're wrong.
00:43:36.380 | There are plenty of instances of people
00:43:38.300 | who were totally healthy with no underlying diseases,
00:43:41.500 | taking good care of themselves, not obese,
00:43:43.620 | exercising who have died from this disease.
00:43:46.980 | 700 children have died from this disease.
00:43:52.620 | Yes, some of them had underlying factors like obesity,
00:43:55.880 | but a lot of them did not.
00:43:57.980 | So it's fair to say younger people are less susceptible
00:44:02.500 | to serious illness, kids even less so
00:44:05.860 | than young adults, but it ain't zero.
00:44:09.540 | And if the vaccine is really safe and really effective,
00:44:14.060 | then you probably want everybody to take advantage of that.
00:44:17.540 | Even though some are dropping their risks more than others,
00:44:20.820 | everybody's dropping their risks some.
00:44:22.940 | - Are you worried about variants?
00:44:26.060 | So looking out into the future,
00:44:28.140 | what's your vision for all the possible trajectories
00:44:32.460 | that this virus takes in human society?
00:44:34.940 | - I'm totally worried about the variants.
00:44:37.940 | Delta was such an impressive arrival on the scene
00:44:42.060 | in all the wrong ways.
00:44:43.300 | I mean, it took over the world
00:44:46.980 | in the space of just a couple months
00:44:49.140 | because of its extreme contagiousness.
00:44:52.740 | - Viruses would be beautiful if they weren't terrifying.
00:44:55.020 | - Yeah, exactly.
00:44:56.180 | I mean, this whole story of viral evolution,
00:44:58.460 | scientifically, is just amazingly elegant.
00:45:01.560 | Anybody who really wanted to understand
00:45:03.300 | how evolution works in real time, study SARS-CoV-2,
00:45:07.940 | 'cause it's not just Delta, it's Alpha,
00:45:09.780 | it's Beta, and it's Gamma,
00:45:10.980 | and it's the fact that these sweep through
00:45:14.700 | the world's population
00:45:16.540 | by fairly minor differences in fitness.
00:45:20.680 | So the real question many people are wrestling is,
00:45:23.580 | is Delta it?
00:45:24.940 | Is it such a fit virus
00:45:27.020 | that nothing else will be able to displace it?
00:45:30.260 | I don't know.
00:45:31.300 | I mean, there's now Delta AY.4,
00:45:34.740 | which is a variant of Delta
00:45:37.060 | that at least in the UK seems to be taking over
00:45:40.780 | the Delta population
00:45:42.240 | as though it's maybe even a little more contagious.
00:45:45.280 | That might be the first hint
00:45:47.020 | that we're seeing something new here.
00:45:49.180 | It's not a completely different virus.
00:45:51.660 | It's still Delta, but it's Delta plus.
00:45:53.980 | You know, the big worry, Lex,
00:45:57.540 | is what's out there that is so different
00:46:00.820 | that the vaccine protection doesn't work?
00:46:04.660 | (laughs)
00:46:07.340 | - And we don't know how different it needs to be
00:46:10.020 | for the vaccine to stop working.
00:46:11.540 | That's the terrifying thing about each of these variants.
00:46:14.860 | It's like, it's always a pleasant surprise
00:46:18.060 | that the vaccine seems to still have efficacy.
00:46:21.020 | - And hooray for our immune system, may I say,
00:46:23.920 | because the vaccine immunized you
00:46:26.180 | against that original Wuhan virus.
00:46:28.960 | Now we can see that especially after two doses
00:46:35.220 | and even more so after a booster,
00:46:37.540 | your immune system is so clever
00:46:39.700 | that it's also making a diversity of antibodies
00:46:43.460 | to cover some other things that might happen to that virus
00:46:46.660 | to make it a little different.
00:46:48.420 | And you're still getting really good coverage.
00:46:51.900 | Even for Beta, which was South Africa, B.1.351,
00:46:56.500 | which is the most different, it looks pretty good.
00:46:59.900 | But that doesn't mean it will always be as good as that
00:47:02.780 | if something gets really far away from the original virus.
00:47:06.140 | Now, the good news is we would know what to do
00:47:08.540 | in that situation.
00:47:10.140 | The mRNA vaccines allow you to redesign the vaccine
00:47:14.260 | like that and to quickly get it through
00:47:17.340 | a few thousand participants in a clinical trial
00:47:19.980 | to be sure it's raising antibodies
00:47:21.420 | and then bang, you could go.
00:47:23.580 | But I don't wanna have to do that.
00:47:25.220 | There will be people's lives at risk in the meantime.
00:47:28.580 | And what's the best way to keep that from happening?
00:47:30.380 | Well, try to cut down the number of infections
00:47:33.700 | 'cause you don't get variants
00:47:34.980 | unless the virus is replicating in a person.
00:47:37.300 | - So how do we solve this thing?
00:47:40.740 | How do we get out of this pandemic?
00:47:43.060 | What's, like if you had like a wand or something
00:47:46.300 | or you could really implement policies,
00:47:50.500 | what's the full cocktail of solutions here?
00:47:53.020 | - It's a full cocktail.
00:47:53.980 | It's not just one thing.
00:47:56.020 | In our own country here in the US,
00:47:58.020 | it would be getting those 64 million reluctant people
00:48:01.300 | to actually go ahead and get vaccinated.
00:48:03.020 | - There's 64 million people who didn't get vaccinated?
00:48:05.260 | - Adults, yes.
00:48:06.180 | Not even counting the kids.
00:48:07.940 | 64 million.
00:48:09.620 | Isn't that astounding?
00:48:10.740 | Get the kids vaccinated.
00:48:13.380 | Hopefully their parents will see that as a good thing too.
00:48:17.020 | Get those of us who are due for boosters boosted
00:48:19.660 | because that's gonna reduce our likelihood
00:48:21.340 | of having breakthrough infections and keep spreading it.
00:48:24.540 | Convince people that until we're really done with this
00:48:27.460 | and we're not now,
00:48:28.780 | that social distancing and mask wearing indoors
00:48:31.620 | are still critical to cut down the number of new infections.
00:48:35.260 | But of course, that's our country.
00:48:38.860 | This is a worldwide pandemic.
00:48:41.020 | I worry greatly about the fact
00:48:43.860 | that low and middle income countries
00:48:45.460 | have for the most part not even gotten started
00:48:47.900 | with access to vaccines.
00:48:49.500 | And we have to figure out a way to speed that up
00:48:51.900 | because otherwise that's where the next variant
00:48:56.140 | will probably arrive.
00:48:58.180 | And who knows how bad it will be
00:48:59.700 | and it will cross the world quickly
00:49:01.500 | as we've seen happen repeatedly in the last 22 months.
00:49:05.100 | - I think I'm really surprised, annoyed, frustrated
00:49:09.780 | that testing, rapid at-home testing
00:49:13.620 | from the very beginning
00:49:14.620 | wasn't a big, big part of the solution.
00:49:17.180 | First of all, it seems nobody's against it.
00:49:19.780 | That's one huge plus for testing.
00:49:22.220 | Everybody supports it.
00:49:24.220 | Second of all, that's what America is good at:
00:49:27.260 | mass manufacturing stuff,
00:49:29.860 | engineers stepping up
00:49:32.340 | and really deploying it.
00:49:33.980 | Plus, without the collection of data,
00:49:35.820 | it gives people freedom, gives them information,
00:49:39.700 | and then the freedom to decide what to do with that information.
00:49:42.620 | It's such a powerful solution.
00:49:44.060 | I don't understand.
00:49:45.300 | - Well, now the Biden administration
00:49:47.460 | has, I think, emphasized the scaling
00:49:50.380 | of test manufacturing.
00:49:51.540 | But I just feel like it's an obvious solution.
00:49:54.020 | Get a test that costs less than a dollar to manufacture,
00:49:57.420 | less than a dollar to buy,
00:49:59.340 | and just everybody gets tested every single day.
00:50:02.460 | Don't share that data with anyone.
00:50:03.900 | You just make the decisions.
00:50:05.140 | And I believe in the intelligence of people
00:50:07.660 | to make the right decision to stay at home
00:50:09.500 | when the test is positive.
00:50:11.140 | - I am so completely with you on that.
00:50:13.460 | And NIH has been smack in the middle
00:50:15.340 | of trying to make that dream come true.
00:50:17.780 | We're running a trial right now in Georgia,
00:50:21.820 | Indiana, Hawaii.
00:50:23.860 | And where's the other one?
00:50:26.460 | Oh, Kentucky.
00:50:28.900 | Basically blanketing a community with free testing.
00:50:32.980 | - That's beautiful.
00:50:33.820 | - And look to see what happens as far as stemming
00:50:36.540 | the spread of the epidemic and measuring it by wastewater
00:50:40.100 | 'cause you can really tell whether you've cut back
00:50:42.580 | the amount of infection in the community.
00:50:44.940 | Yeah, I'm so with you.
00:50:47.100 | We got off to such a bad start with testing.
00:50:49.580 | And of course, all the testing was being done
00:50:52.420 | for the first several months in big box laboratories
00:50:55.940 | where you had to send the sample off
00:50:57.820 | and put it through the mail somehow
00:50:59.300 | and get the result back sometimes five days later
00:51:01.460 | after you've already infected a dozen people.
00:51:03.900 | It was just a completely wrong model,
00:51:05.740 | but it's what we had.
00:51:06.780 | And everybody was like, oh, we got to stick with PCR
00:51:09.820 | because if you start using those home tests
00:51:12.020 | that are based on antigen lateral flow,
00:51:14.820 | probably there's gonna be false positives
00:51:16.580 | and false negatives.
00:51:17.420 | Okay, sure, no test is perfect,
00:51:20.460 | but having a test that's not available
00:51:23.420 | or accessible is the worst setting.
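For a sense of what those false positives and false negatives mean in practice, here is a short worked sketch using Bayes' rule, with assumed performance figures rather than any particular test's specs:

```python
# Worked sketch with assumed figures (not any specific product's specs):
# what an imperfect rapid test tells you, via Bayes' rule.

def predictive_values(sensitivity, specificity, prevalence):
    """Return (P(infected | positive test), P(healthy | negative test))."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    return (true_pos / (true_pos + false_pos),
            true_neg / (true_neg + false_neg))

# Assumed: 85% sensitive, 98% specific, 1% of the community infected.
ppv, npv = predictive_values(0.85, 0.98, 0.01)
print(f"a positive is right ~{ppv:.0%} of the time")   # ~30%
print(f"a negative is right ~{npv:.1%} of the time")   # ~99.8%
```

Under these assumed numbers, a positive at low prevalence is worth confirming, while a negative is highly reliable, and repeating the test daily compounds the effective sensitivity.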
00:51:26.100 | So we, NIH, with some requests from Congress,
00:51:29.580 | got a billion dollars to create this program
00:51:32.620 | called Rapid Acceleration of Diagnostics, RADx.
00:51:36.740 | And we turned into a venture capital organization
00:51:39.140 | and we invited every small business or academic lab
00:51:41.620 | that had a cool idea about how to do home testing
00:51:44.340 | to bring it forward.
00:51:45.700 | And we threw them into what we called our shark tank
00:51:48.300 | of business experts, engineers, technology people,
00:51:51.060 | people who understood how to deal
00:51:53.820 | with supply chains and manufacturing.
00:51:56.780 | And right now today, there are about 2 million tests
00:52:00.260 | being done based on what came out of that program,
00:52:03.540 | including most of the home tests
00:52:05.620 | that you can now buy on the pharmacy shelves.
00:52:07.620 | We did that and I wish we had done it faster,
00:52:10.340 | but it was an amazingly speedy effort.
00:52:13.700 | And you're right, companies are really good.
00:52:15.460 | Once they've gotten FDA emergency use authorization,
00:52:18.140 | and we helped a lot of them get that,
00:52:20.300 | they can scale up their manufacturing.
00:52:22.900 | I think in December, we should have about 410 million tests
00:52:27.900 | for that month ready to go.
00:52:30.260 | And if we can get one or two more platforms approved,
00:52:34.020 | and by the way, we are now helping FDA
00:52:36.700 | by being their validation lab.
00:52:39.140 | If we can get a couple more of these approved,
00:52:41.060 | we could be in the half a billion tests a month,
00:52:44.820 | which is really getting where we need to be.
00:52:46.860 | - Wow, yeah, that's a dream.
00:52:48.940 | That's a dream for me.
00:52:49.780 | It seems like an obvious solution, engineering solution.
00:52:52.780 | Everybody's behind it.
00:52:53.900 | It leads to hope versus division.
00:52:56.020 | I love it.
00:52:57.260 | Okay.
00:52:58.100 | - A happy story.
00:53:00.660 | - A happy story.
00:53:01.500 | - I was waiting for one.
00:53:02.740 | - Yeah, all right.
00:53:03.580 | Well, one last dive into the not happy,
00:53:06.420 | but you won't even have to comment on it.
00:53:08.940 | Well, you can comment on the broader philosophical question.
00:53:11.340 | So NIH, again, as I said, Joe Rogan
00:53:16.340 | was the first one who pointed me to this.
00:53:18.820 | NIH was recently accused of funding research of a paper
00:53:21.860 | that had images of sedated puppies
00:53:24.060 | with their heads inserted into small enclosures
00:53:26.460 | containing disease carrying sand flies.
00:53:28.780 | So I could just say that this story is not true.
00:53:34.620 | Or at least,
00:53:37.220 | I think it is true that the paper
00:53:39.020 | that showed those images cited NIH as a funding source,
00:53:43.300 | but that citation is not correct.
00:53:45.140 | - That was not correct.
00:53:45.980 | - Yeah.
00:53:47.500 | But that brings up a bigger philosophical question
00:53:51.580 | that it could have been correct.
00:53:54.620 | How difficult is it as a director of NIH
00:53:57.380 | or just NIH as an organization
00:53:58.860 | that's funding so many amazing deep research studies
00:54:03.060 | to ensure the ethical fortitude of those studies
00:54:07.260 | when, in the ethics of science,
00:54:10.100 | there's such a gray area between what is
00:54:12.260 | and what isn't ethical?
00:54:13.500 | - Well, tough issues.
00:54:16.700 | Certainly animal research is a tough issue.
00:54:20.020 | - I was going to bring up,
00:54:21.460 | it's a good example of that tough issue,
00:54:23.820 | is in 2015, you announced that NIH
00:54:27.020 | would no longer support any biomedical research
00:54:29.700 | involving chimpanzees.
00:54:31.900 | So that's like a one example of looking in the mirror,
00:54:36.780 | thinking deeply about what is and isn't ethical.
00:54:39.700 | And there was a conclusion that biomedical research
00:54:42.460 | on chimps is not ethical.
00:54:45.060 | - That was the conclusion.
00:54:46.420 | That was based on a lot of deep thinking
00:54:48.260 | and a lot of input from people
00:54:50.460 | who have considered this issue
00:54:52.020 | and a panel of the National Academy of Sciences
00:54:54.940 | that was asked to review the issue.
00:54:57.660 | I mean, the question that I wanted them to look at was,
00:55:01.420 | are we actually learning anything that's really essential
00:55:05.420 | from chimpanzee invasive research at this point?
00:55:08.980 | Or is it time to say that these closest relatives of ours
00:55:13.900 | should not be subjected to that any further
00:55:16.100 | and ought to be retired to a sanctuary?
00:55:19.100 | And that was the conclusion,
00:55:20.420 | that there was really no kind of medical experimentation
00:55:24.660 | that needed to be done on chimps in order to proceed.
00:55:27.820 | So why are we still doing this?
00:55:29.660 | Many of these were chimpanzees that were purchased
00:55:32.740 | because we thought they would be good hosts for HIV/AIDS,
00:55:37.740 | and they sort of weren't.
00:55:39.700 | And they were kept around in these primate laboratories
00:55:43.260 | with people coming up with other things to do,
00:55:45.820 | but they weren't compelling scientifically.
00:55:48.340 | So I think that was the right decision.
00:55:50.380 | I took a lot of flak
00:55:51.900 | from some in the scientific community, who said,
00:55:53.580 | "Well, you're caving in to the animal rights people.
00:55:56.620 | And now that you've said no more research on chimps,
00:55:59.300 | what's next?"
00:56:00.460 | Certainly when it comes to companion animals,
00:56:04.940 | everybody's heart starts to be hurting
00:56:09.020 | when you see anything done that seems harmful
00:56:12.460 | to a dog or a cat.
00:56:14.100 | I have a cat, I don't have a dog.
00:56:16.380 | And I understand that completely.
00:56:18.100 | That's why we have these oversight groups
00:56:21.420 | that decide before you do any of that kind of research,
00:56:24.380 | is it justified?
00:56:26.340 | And what kind of provision is going to be made
00:56:29.740 | to avoid pain and suffering?
00:56:31.500 | And those have input from the public
00:56:35.420 | as well as the scientific community.
00:56:37.900 | Is that completely saying that every step
00:56:40.500 | that's happening there is ethical by some standard
00:56:45.100 | that would be hard for anybody to agree to?
00:56:48.900 | No, but at least it's a consensus
00:56:50.820 | of what people think is acceptable.
00:56:54.380 | Dogs are the only host for some diseases
00:56:58.180 | like Leishmaniasis, which was that paper
00:57:01.580 | that we were not responsible for,
00:57:03.300 | but I know why they were doing the experiment.
00:57:05.540 | Or like lymphatic filariasis,
00:57:08.020 | which is an experiment that we are supporting in Georgia
00:57:11.780 | that involves dogs getting infected with a parasite
00:57:14.860 | because that's the only model we have to know
00:57:16.780 | whether a treatment is gonna work or not.
00:57:18.940 | So I will defend that.
00:57:21.980 | I am not in the place of those who think
00:57:24.420 | all animal research is evil,
00:57:27.220 | 'cause I think if there's something that's gonna be done
00:57:29.700 | to save a child from a terrible disease or an adult,
00:57:32.900 | and it involves animal research
00:57:34.340 | that's been carefully reviewed,
00:57:36.260 | then I think ethically, while it doesn't make me comfortable,
00:57:39.380 | it still seems like it's the right choice.
00:57:42.340 | I think to say all animal research
00:57:45.140 | should be taken off the table is also very unethical
00:57:48.860 | 'cause that means you have basically doomed a lot of people
00:57:52.860 | whose lives that research might have saved
00:57:55.020 | to having no more hope.
00:57:56.180 | - And to me personally,
00:57:59.340 | there's far greater concerns ethically
00:58:01.660 | in terms of factory farming, for example,
00:58:04.020 | the treatment of animals in other contexts.
00:58:06.300 | - Oh, there's so much that goes on
00:58:08.100 | outside of medical research that is much more troubling.
00:58:12.500 | - That said, I think all cats have to go.
00:58:15.500 | That's just my off the record opinion.
00:58:17.620 | That's why I'm not involved with any ethical decisions.
00:58:19.980 | I'm just joking, internet. I love cats.
00:58:22.300 | - You're a dog person.
00:58:23.260 | - I'm a dog person, I'm sorry.
00:58:25.060 | - You've seen the New Yorker cartoon
00:58:26.620 | where there are two dogs in a bar having a martini,
00:58:30.060 | dressed up in their business suits,
00:58:32.540 | and one says to the other,
00:58:34.060 | "You know, it's not enough for the dogs to win.
00:58:37.540 | "The cats have to lose."
00:58:39.340 | (laughing)
00:58:40.860 | - That's beautiful.
00:58:41.820 | So a few weeks ago, you announced
00:58:45.660 | that you're resigning from the NIH at the end of the year.
00:58:49.100 | - I'm stepping down.
00:58:50.220 | I'm still gonna be at NIH in a different capacity.
00:58:53.420 | - Different capacity, right.
00:58:54.860 | And it's over a decade of an incredible career
00:58:58.940 | overseeing the NIH as its director.
00:59:01.180 | What are the things you're most proud of
00:59:04.020 | of the NIH in your time here
00:59:06.980 | as its director, maybe memorable moments?
00:59:12.820 | - There's a lot in 12 years.
00:59:15.540 | Science has just progressed in amazing ways
00:59:19.700 | over those 12 years.
00:59:21.380 | Think about where we are right now.
00:59:25.100 | Something like gene editing,
00:59:26.580 | being able to make changes in DNA,
00:59:29.420 | even for therapeutic purposes,
00:59:31.060 | which is now curing sickle cell disease.
00:59:33.980 | Unthinkable when I became director in 2009.
00:59:38.140 | The ability to study single cells
00:59:41.580 | and ask them what they're doing and get an answer.
00:59:44.540 | Single cell biology just has emerged
00:59:47.140 | in this incredibly powerful way.
00:59:49.220 | Having the courage to be able to say,
00:59:53.900 | "We could actually understand the human brain,"
00:59:57.060 | seemed like so far out there.
00:59:59.020 | And we're in the process of doing that
01:00:01.340 | with the Brain Initiative.
01:00:02.940 | Taking all that we've learned about the genome
01:00:06.260 | and applying it to cancer,
01:00:09.020 | to make individual cancer treatment really precise,
01:00:13.220 | and developing cancer immunotherapy,
01:00:15.060 | which turned from sort of a backwater
01:00:17.340 | into some of the hottest science around.
01:00:19.340 | All those things sort of erupting,
01:00:22.620 | and much more to come, I'm sure.
01:00:23.860 | We're on an exponential curve of medical research advances,
01:00:27.980 | and that's glorious to watch.
01:00:30.380 | And of course, COVID-19,
01:00:31.900 | as a beneficiary of decades of basic science,
01:00:35.540 | understanding what mRNA is,
01:00:37.740 | understanding basics about coronaviruses and spike proteins,
01:00:41.220 | and how to combine structural biology,
01:00:43.540 | and immunology, and genomics into this package
01:00:46.500 | that allows you to make a vaccine in 11 months.
01:00:49.380 | Just, I would never have imagined that possible in 2009.
01:00:53.620 | So to have been able to kind of be the midwife,
01:00:56.460 | helping all of those things get birthed,
01:00:59.060 | that's been just an amazing 12 years.
01:01:02.300 | And as NIH director, you have this convening power,
01:01:06.700 | and this ability to look across the whole landscape
01:01:09.180 | of biomedical research,
01:01:10.340 | and identify areas that are just like ready
01:01:13.820 | for something big to happen,
01:01:15.820 | but isn't gonna happen spontaneously
01:01:17.600 | without some encouragement,
01:01:18.940 | without pulling people together from different disciplines
01:01:21.740 | who don't know each other,
01:01:22.780 | and maybe don't know how to quite understand
01:01:24.940 | each other's scientific language,
01:01:26.660 | and create an environment for that to happen.
01:01:29.060 | That has been just an amazing experience.
01:01:32.100 | I mean, I mentioned the BRAIN Initiative as one of those.
01:01:35.380 | The BRAIN Initiative right now,
01:01:36.540 | I think there's about 600 investigators working on this.
01:01:39.880 | Last week, the whole issue of Nature Magazine
01:01:43.420 | was about the output of the BRAIN Initiative,
01:01:45.980 | basically now giving us a cell census
01:01:48.700 | of what those cells in the brain are doing,
01:01:51.220 | which has just never been imaginable.
01:01:53.900 | And interestingly, more than half of the investigators
01:01:58.900 | in the BRAIN Initiative are engineers.
01:02:00.880 | They're not biologists in a traditional sense.
01:02:04.020 | I love that.
01:02:04.860 | Maybe partly 'cause my PhD is in quantum mechanics.
01:02:08.260 | So I think it's really a good idea
01:02:10.700 | to bring disciplines together and see what happens.
01:02:14.180 | That's an exciting thing.
01:02:15.520 | And I will not ever forget having the chance
01:02:19.700 | to announce that program in the East Room
01:02:22.700 | of the White House with President Obama,
01:02:25.860 | who totally got it and totally loved science,
01:02:28.620 | and working with him in some of those rare moments
01:02:32.380 | of sort of one-on-one conversation in the Oval Office,
01:02:35.100 | just him and me about science.
01:02:36.900 | That's a gift.
01:02:37.980 | - What's it like talking to Barack Obama about science?
01:02:41.500 | He seems to be a sponge.
01:02:43.860 | I've heard him, I'm an artificial intelligence person,
01:02:46.980 | and I've heard him talk about AI.
01:02:48.700 | And it was like, it made me think,
01:02:51.260 | is somebody like whispering in his ear or something?
01:02:53.300 | Because he was saying stuff that totally passed the BS test,
01:02:56.260 | like he really understands stuff.
01:02:58.500 | - He does.
01:02:59.500 | - That means he listened to a bunch of experts on AI.
01:03:02.380 | He was explaining the difference
01:03:03.940 | between narrow artificial intelligence and strong AI.
01:03:07.300 | He was saying all this,
01:03:08.180 | both technical and philosophical stuff.
01:03:10.420 | And it just made me, I don't know,
01:03:12.220 | it made me hopeful about the depth of understanding
01:03:15.980 | that a human being in political office can attain.
01:03:18.500 | - That gave me hope as well, and having those experiences.
01:03:22.220 | Oftentimes in a group, I mean, another example
01:03:24.940 | where I was trying to figure out,
01:03:26.900 | how do we take what we've learned about the genome
01:03:29.120 | and really apply it at scale
01:03:31.380 | to figure out how to prevent illness,
01:03:33.320 | not just treat it, but prevent it,
01:03:35.460 | out of which came this program called All of Us,
01:03:38.060 | this million strong American cohort of participants
01:03:42.740 | who make their electronic health records
01:03:44.740 | and their genome sequences and everything else available
01:03:46.980 | for researchers to look at.
01:03:48.620 | That came out of a couple of conversations
01:03:51.820 | with Obama and others in his office.
01:03:55.060 | And he asked the best questions.
01:03:58.580 | That was what struck me so much.
01:04:00.560 | I mean, a room full of scientists,
01:04:02.680 | and we'd be talking about the possible approaches,
01:04:05.160 | and he would come up with this
01:04:07.000 | incredibly insightful, penetrating question.
01:04:09.440 | Not that he knew what the answer was gonna be,
01:04:11.320 | but he knew what the right question was.
01:04:13.360 | - I think the core to that is curiosity.
01:04:17.040 | - Yeah.
01:04:18.080 | - I don't think he's even like,
01:04:19.240 | he's trying to be a good leader.
01:04:20.400 | He's legit curious.
01:04:22.360 | - Yes, legit.
01:04:24.800 | - That he, almost like a kid in a candy store
01:04:26.680 | gets to talk to the world experts.
01:04:28.280 | He somehow sneaked into this office
01:04:30.940 | and gets to talk to the world experts.
01:04:33.240 | And that's the kind of energy that I think leads
01:04:36.880 | to beautiful leadership in the space of science.
01:04:40.200 | - Indeed.
01:04:41.320 | Another thing I've been able to do as director
01:04:43.560 | is to try to break down some of the boundaries
01:04:45.720 | that seem to be traditional
01:04:47.280 | between the public and the private sectors
01:04:49.240 | when it comes to areas of science
01:04:50.840 | that really could and should be open access anyway.
01:04:53.920 | Why don't we work together?
01:04:56.280 | And that was obvious early on.
01:04:58.320 | And after identifying a few possible collaborators
01:05:03.320 | who are chief scientists of pharmaceutical companies,
01:05:07.480 | it looked like we might be able
01:05:08.800 | to do something in that space.
01:05:10.420 | Out of that was born something called
01:05:12.200 | the Accelerating Medicines Partnership, AMP.
01:05:15.040 | And it took a couple of years of convening people
01:05:19.320 | who usually didn't talk to each other.
01:05:21.440 | And there was a lot of suspicion, academic scientists saying,
01:05:24.880 | oh, those scientists in pharma, they're not that smart.
01:05:28.080 | They're just trying to make money.
01:05:30.000 | And the academic scientists getting the rap
01:05:33.140 | from the pharmaceutical scientists:
01:05:34.560 | all they wanna do is publish papers.
01:05:35.980 | They don't really care about helping anybody.
01:05:38.200 | And we found out both of those stereotypes were wrong.
01:05:41.060 | And over the course of that couple of years,
01:05:44.280 | built a momentum behind three starting projects,
01:05:47.600 | one on Alzheimer's, one on diabetes,
01:05:50.000 | one on rheumatoid arthritis and lupus.
01:05:52.100 | Very different, each one of them trying to identify
01:05:54.440 | what is an area that we both really need to see advance
01:05:58.480 | and we could do better together.
01:06:00.160 | And it's gonna have to be open access,
01:06:01.840 | otherwise NIH is not gonna play.
01:06:03.920 | And guess what, industry?
01:06:05.280 | If you really wanna do this,
01:06:06.600 | you gotta have skin in the game.
01:06:08.160 | We'll cover half the cost, you gotta cover the other half.
01:06:10.720 | - I love it.
01:06:11.640 | Enforcing open access, so resulting in open science.
01:06:16.200 | - Millions of dollars have gone into this.
01:06:17.880 | And it has been a wild success.
01:06:19.960 | After many people were skeptical,
01:06:22.440 | a couple of years later we had another project on Parkinson's;
01:06:27.120 | more recently, we've added one on schizophrenia.
01:06:29.560 | Just this week, we added one on gene therapy,
01:06:34.040 | on bespoke gene therapy for ultra rare diseases,
01:06:38.080 | which otherwise aren't gonna have enough commercial appeal.
01:06:41.160 | But if we did this together,
01:06:42.320 | especially with FDA at the table, and they have been,
01:06:45.560 | we could make something happen,
01:06:46.880 | turn this into a sort of standardized approach
01:06:49.800 | where everything didn't have to be a one-off.
01:06:52.480 | I'm really excited about that.
01:06:54.320 | So what began as three projects is six,
01:06:56.600 | and it's about to be seven next year
01:06:58.480 | with a heart failure project.
01:07:00.640 | And all of us have gotten to know each other.
01:07:03.600 | And if it weren't for that background,
01:07:05.520 | when COVID came along, it would have been a lot harder
01:07:08.840 | to build the partnership called ACTIV,
01:07:11.040 | which has been my passion for the last 20 months,
01:07:14.440 | accelerating COVID-19 therapeutic interventions
01:07:17.520 | and vaccines.
01:07:18.360 | I was at our leadership team meeting this morning.
01:07:21.120 | It was amazing what's been accomplished.
01:07:23.160 | That's pretty much a hundred people who dropped everything
01:07:26.880 | just to work on this, about half from industry
01:07:29.080 | and half from government and academia.
01:07:31.360 | And that's how we got vaccine master protocols designed.
01:07:36.520 | So we all agreed about what the end points had to be.
01:07:39.200 | And if you wondered why there are 30,000 participants
01:07:42.560 | in each of these trials?
01:07:43.560 | That's 'cause of ACTIV's group mapping out
01:07:47.040 | what the power needed to be for this to be convincing.
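A rough sketch of that kind of power calculation (design parameters here are assumptions for illustration, not the actual trial protocols): with equal arms, each confirmed case falls in the vaccine arm with probability (1 - VE) / (2 - VE), and the trial is sized so that enough cases accrue to reject a null of VE <= 30% when the true efficacy is 60%.

```python
# Rough sketch of a vaccine-trial power calculation (assumed parameters,
# not the actual protocols). With equal-size arms, a confirmed case falls
# in the vaccine arm with probability (1 - VE) / (2 - VE).

from math import sqrt

Z_ALPHA = 1.96     # one-sided 2.5% significance
Z_BETA = 1.2816    # 90% power

def cases_needed(ve_null=0.30, ve_true=0.60):
    """Cases required to reject VE <= ve_null when true efficacy is ve_true."""
    p0 = (1 - ve_null) / (2 - ve_null)
    p1 = (1 - ve_true) / (2 - ve_true)
    numerator = (Z_ALPHA * sqrt(p0 * (1 - p0))
                 + Z_BETA * sqrt(p1 * (1 - p1)))
    return (numerator / (p0 - p1)) ** 2

events = cases_needed()       # ~150 cases
attack_rate = 0.007           # assumed average infection risk over follow-up
print(f"~{events:.0f} cases -> ~{events / attack_rate:,.0f} participants")
```

With an assumed 0.7% average attack rate over follow-up, that works out to roughly 21,000 participants; padding for dropout and subgroup analyses is one plausible way designs land near the 30,000 figure.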
01:07:49.720 | Same with therapeutics.
01:07:53.160 | We have run at least 20 therapeutic agents through trials
01:07:57.920 | that ACTIV supported in record time.
01:08:00.960 | That's how we got monoclonal antibodies that we know work.
01:08:03.880 | That would not have been possible
01:08:08.840 | if I didn't already have a sense of how to work
01:08:12.440 | with the private sector that came out of AMP.
01:08:15.440 | AMP took two years to get started.
01:08:17.080 | ACTIV took two weeks.
01:08:18.440 | We just kept the lawyers-
01:08:20.600 | - Wow, to get a hundred people over?
01:08:21.880 | - Yeah, kept the lawyers out of the room and away.
01:08:24.440 | (laughing)
01:08:27.120 | - Now you're gonna get yourself in trouble.
01:08:28.960 | (laughing)
01:08:30.880 | I do hope one day the story of this incredible vaccine,
01:08:34.680 | the development of vaccine protocols and trials
01:08:36.440 | and all these kinds of details,
01:08:37.680 | the messy, beautiful details of science and engineering
01:08:41.600 | that led to the manufacturing, the deployment,
01:08:44.320 | and the scientific tests, gets told.
01:08:45.200 | It's such a nice dance between engineering
01:08:48.560 | in the space of manufacture of the vaccines.
01:08:50.520 | You start before the studies are complete,
01:08:53.120 | you start making the vaccines just in case
01:08:56.080 | if the studies prove to be positive,
01:08:58.040 | then you can start deploying them.
01:08:59.720 | Just like so many parties, like you said,
01:09:04.000 | private and public playing together.
01:09:05.680 | That's just a beautiful dance that is,
01:09:10.120 | for me, one of the sources of hope
01:09:11.960 | in this very tricky time where there's a lot
01:09:14.760 | of things to be cynical about in terms
01:09:19.320 | of the games politicians play, and the hardship experienced
01:09:23.280 | in the economy, and all those kinds of things.
01:09:25.360 | But to me, this dance of vaccine development
01:09:30.360 | was done just beautifully, and it gives me hope.
01:09:33.200 | - It does me as well.
01:09:34.440 | And it was in many ways, the finest hour
01:09:37.400 | that science has had in a long time being called upon
01:09:41.600 | when every day counted and making sure
01:09:44.080 | that time was not wasted
01:09:46.400 | and things were done rigorously, but quickly.
01:09:49.960 | - So you're incredibly good as a leader of the NIH.
01:09:54.840 | It seems like you're having a heck of a lot of fun.
01:09:57.920 | Why step down from this role after so much fun?
01:10:02.720 | - Well, no other NIH director has served more
01:10:06.920 | than one president; after being appointed
01:10:09.360 | by one, you're sort of done.
01:10:10.720 | And the idea of being carried over
01:10:12.960 | for a second presidency with Trump
01:10:14.760 | and now a third one with Biden is unheard of.
01:10:18.640 | I just think, Lex, that scientific organizations benefit
01:10:22.640 | from new vision and 12 years is a really long time
01:10:26.360 | to have the same leader.
01:10:28.480 | And if I wasn't gonna stick it out
01:10:30.240 | for the entire Biden four-year term,
01:10:33.200 | it's good not to wait too late during that
01:10:36.800 | to signal an intent to step down
01:10:38.680 | 'cause the president's gotta find the right person,
01:10:40.880 | gotta nominate them, gotta get the Senate to confirm them,
01:10:44.160 | which is an unpredictable process right now.
01:10:47.520 | And you don't wanna try to do that
01:10:49.240 | in the second half of somebody's term as president.
01:10:52.800 | This has gotta happen now.
01:10:53.920 | So I kind of decided back at the end of May
01:10:56.400 | that this should be my final year.
01:10:59.120 | And I'm okay with that.
01:11:00.800 | I do have some mixed emotions 'cause I love the NIH.
01:11:05.800 | I love the job.
01:11:07.720 | It's exhausting.
01:11:09.640 | I've been, for the last 20 months anyway,
01:11:13.320 | working 100 hours a week.
01:11:14.640 | It's just, that's what it takes to juggle all of this.
01:11:18.840 | And that keeps me from having a lot of time
01:11:21.320 | for anything else.
01:11:23.240 | And I wouldn't mind, 'cause I don't think I'm done yet.
01:11:25.840 | I wouldn't mind having some time to really think
01:11:29.000 | about what the next chapter should be.
01:11:31.360 | And I have none of that time right now.
01:11:33.840 | Do I have another calling?
01:11:35.200 | Is there something else I could contribute
01:11:37.000 | that's different than this?
01:11:38.360 | I'd like to find that out.
01:11:41.000 | - I think the right answer is you're just stepping down
01:11:46.040 | to focus on your music career.
01:11:47.640 | (laughing)
01:11:49.040 | But-- - That might not be
01:11:50.560 | a good plan for anything very sustainable.
01:11:53.960 | - But I think that is a sign of a great leader
01:11:56.120 | as George Washington did stepping down at the right time.
01:11:59.880 | - Ted Williams.
01:12:00.840 | - Yes.
01:12:03.080 | - He quit when, I think, he hit a home run on his last at-bat,
01:12:06.000 | and his average was .400 at the time.
01:12:08.520 | - Know when to walk away.
01:12:10.400 | I mean, it's hard, but it's beautiful to see in a leader.
01:12:13.240 | You also oversaw the Human Genome Project.
01:12:17.280 | You mentioned the Brain Initiative,
01:12:19.280 | which has, it's a dream to map the human brain.
01:12:24.280 | And there's the dream to map the human code,
01:12:28.960 | which was the Human Genome Project.
01:12:30.440 | And you've said, "It is humbling for me
01:12:32.840 | and awe-inspiring to realize
01:12:34.920 | that we have caught the first glimpse
01:12:37.120 | of our own instruction book, previously known only to God."
01:12:42.120 | How does that, if you can just kind of wax poetic
01:12:46.400 | for a second, how does it make you feel
01:12:49.280 | that we were able to map this instruction book,
01:12:52.760 | look into our own code and be able to reverse engineer it?
01:12:57.600 | - It's breathtaking.
01:13:00.840 | It's so fundamental.
01:13:02.680 | And yet, for all of human history,
01:13:05.720 | we were ignorant of the details
01:13:08.640 | of what that instruction book looked like.
01:13:11.240 | And then we crossed a bridge
01:13:13.480 | into the territory of the known.
01:13:16.200 | And we had that in front of us,
01:13:17.720 | still written in a language
01:13:19.120 | that we had to learn how to read.
01:13:21.600 | And we're in the process of doing that
01:13:23.240 | and will be for decades to come.
01:13:25.080 | But we owned it, we had it.
01:13:27.640 | And it has such profound consequences.
01:13:30.440 | It's both a book about our history.
01:13:33.720 | It's a book of sort of the parts list of a human being,
01:13:39.520 | the genes that are in there and how they're regulated.
01:13:43.000 | And it's also a medical textbook
01:13:45.400 | that can teach us things that will provide answers
01:13:48.720 | to illnesses we don't understand
01:13:50.960 | and alleviate suffering and premature death.
01:13:53.520 | So it's a pretty amazing thing to contemplate.
01:13:57.120 | And it has utterly transformed the way we do science.
01:14:00.240 | And it is in the process of transforming
01:14:02.360 | the way we do medicine,
01:14:04.480 | although much of that still lies ahead.
01:14:07.080 | You know, while we were working on the Genome Project,
01:14:11.200 | it was sort of hard to get this sense of a wow-ness
01:14:16.200 | because it was just hard work.
01:14:18.840 | And you were getting, you know, another megabase.
01:14:21.200 | Okay, this is good.
01:14:22.840 | - But when did you actually step back and say, "We did it"?
01:14:26.880 | The profoundness of that.
01:14:29.680 | - I mean, there were two points, I guess.
01:14:31.760 | One was the announcement on June 26, 2000,
01:14:34.600 | where the whole world heard, well, we don't quite have it,
01:14:37.240 | but we got a pretty good draft.
01:14:39.280 | And suddenly people were like realizing,
01:14:41.600 | oh, this is a big deal.
01:14:44.000 | For me, it was more when we got the full analysis of it,
01:14:48.280 | published it in February 2001, in that issue of Nature,
01:14:51.880 | the paper on which Eric Lander and Bob Waterston and I
01:14:54.360 | were the main authors.
01:14:55.800 | And we toiled over and tried to get as much insight
01:14:59.480 | as we could in there about what the meaning of all this was.
01:15:02.840 | But you also had this sense that we are
01:15:05.720 | such beginning readers here.
01:15:07.720 | We are still in kindergarten trying to make sense
01:15:11.360 | out of this 3 billion letter book.
01:15:14.640 | And we're gonna be at this for generations to come.
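A toy calculation of that book's scale, with assumed page density and reading speed:

```python
# Toy scale check (assumed page density and reading speed): how big is a
# "3 billion letter book," really?

GENOME_LETTERS = 3_000_000_000
LETTERS_PER_PAGE = 3_000                  # assumed dense printed page
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

pages = GENOME_LETTERS / LETTERS_PER_PAGE
years = GENOME_LETTERS / SECONDS_PER_YEAR   # reading one letter per second
print(f"~{pages:,.0f} pages; ~{years:.0f} years at one letter per second")
```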
01:15:17.400 | - You are a man of faith, Christian,
01:15:22.280 | and you are a man of science.
01:15:25.160 | What is the role of religion and of science in society
01:15:29.320 | and in the individual human mind and heart like yours?
01:15:34.120 | - Well, I was not a person of faith when I was growing up.
01:15:39.240 | I became a believer in my 20s,
01:15:42.200 | influenced as a medical student by a recognition
01:15:46.800 | that I hadn't really thought through the issues
01:15:48.880 | of what's the meaning of life?
01:15:51.960 | Why are we all here?
01:15:53.600 | What happens when you die?
01:15:55.280 | Is there a God?
01:15:56.840 | Science is not so helpful in answering those questions.
01:16:00.320 | So I had to look around in other places
01:16:02.680 | and ultimately came to my own conclusion that atheism,
01:16:07.040 | which is where I had been,
01:16:08.680 | was the least supportable of the choices
01:16:11.480 | because it was the assertion of a universal negative,
01:16:14.800 | which scientists aren't supposed to do.
01:16:16.760 | And agnosticism came across as an attractive option
01:16:21.840 | but felt a little bit like a cop-out.
01:16:23.520 | So I had to keep going, trying to figure out
01:16:26.000 | why do believers actually believe this stuff?
01:16:28.400 | And came to realize it was all pretty compelling.
01:16:32.560 | That there's no proof.
01:16:33.560 | I can't prove to you or anybody else that God exists,
01:16:36.280 | but I can say it's pretty darn plausible.
01:16:38.440 | And ultimately, asking what kind of God it is caused me
01:16:43.680 | to search through various religions and see,
01:16:46.320 | well, what do people think about that?
01:16:48.520 | And to my surprise, encountered the person of Jesus Christ
01:16:53.080 | as unique in every possible way
01:16:56.320 | and answering a lot of the questions
01:16:58.960 | I couldn't otherwise answer.
01:17:00.840 | And somewhat kicking and screaming, I became a Christian.
01:17:06.680 | Even though at the time, as a medical student
01:17:10.400 | already interested in genetics,
01:17:12.080 | people predicted my head would then explode
01:17:14.680 | 'cause these were incompatible worldviews.
01:17:18.080 | They really have not been for me.
01:17:20.600 | I am so fortunate, I think, that in a given day,
01:17:23.920 | wrestling with an issue,
01:17:27.160 | it can have both the rigorous scientific component
01:17:31.000 | and it can have the spiritual component.
01:17:33.040 | COVID-19 is a great example.
01:17:35.560 | These vaccines are both an amazing scientific achievement
01:17:39.640 | and an answer to prayer.
01:17:40.800 | When I'm wrestling with vaccine hesitancy
01:17:44.760 | and trying to figure out what answers to come up with,
01:17:47.320 | I get so frustrated sometimes and I'm comforted
01:17:51.680 | by reassurances that God is aware of that.
01:17:55.360 | I don't have to do this alone.
01:17:56.840 | So I know there are people like your friend, Sam Harris,
01:18:02.480 | who feel differently.
01:18:04.760 | Sam wrote a rather famous op-ed in the New York Times
01:18:09.280 | when I was nominated as the NIH director
01:18:11.800 | saying, "This is a terrible mistake."
01:18:14.800 | You can't do this. - Oh no, Sam.
01:18:17.960 | You can't have somebody who believes in God
01:18:20.200 | running the NIH.
01:18:21.200 | He's just gonna completely ruin the place.
01:18:23.800 | - Well, I have a testimonial.
01:18:26.960 | Christopher Hitchens, a devout atheist, if I could say so,
01:18:31.240 | was a friend of yours and referred to you as,
01:18:33.640 | quote, "One of the greatest living Americans,"
01:18:36.520 | and stated that you were one of the most devout believers
01:18:39.120 | he has ever met.
01:18:40.680 | He further stated that you were sequencing the genome
01:18:43.280 | of the cancer that would ultimately claim his life
01:18:46.080 | and that your friendship,
01:18:47.040 | despite their differing opinions on religion,
01:18:50.000 | was an example of the greatest armed truce in modern times.
01:18:55.000 | What did you learn from Christopher Hitchens about life
01:18:58.080 | or perhaps what is a fond memory you have of this man
01:19:00.960 | with whom you've disagreed but who is also your friend?
01:19:05.080 | - Yeah, I loved Hitch.
01:19:08.240 | I'm sorry he's gone.
01:19:10.640 | Iron sharpens iron.
01:19:12.520 | There's nothing better for trying to figure out
01:19:15.240 | where you are with your own situation
01:19:18.120 | and your own opinions, your own world views,
01:19:20.680 | than encountering somebody who's completely in another space
01:19:24.320 | and who's got the gift, as Hitch did,
01:19:26.520 | of challenging everything
01:19:28.200 | and doing so over a glass of scotch or two or three.
01:19:31.300 | Yeah, we got off to a rough start
01:19:35.240 | where in an interaction we had at a rather highbrow dinner,
01:19:39.640 | he was really deeply insulting of a question I was asking,
01:19:45.160 | but I was like, okay, that's fine.
01:19:47.800 | Let's figure out how we can have a more civil conversation.
01:19:51.360 | And then I really learned to greatly admire his intellect
01:19:54.960 | and to find the jousting with him,
01:19:58.680 | and it wasn't all about faith, although it often was,
01:20:01.760 | was really inspiring and energizing.
01:20:05.460 | And then when he got cancer,
01:20:07.960 | I became sort of his ally,
01:20:09.840 | trying to help him find pathways through the various options
01:20:13.920 | and maybe helped him to stay around on this planet
01:20:17.880 | for an extra six months or so.
01:20:19.760 | And I have the warmest feelings
01:20:23.120 | of being in his apartment downtown
01:20:25.920 | over a glass of wine, talking about whatever.
01:20:32.160 | Sometimes it was science, he was fascinated by science.
01:20:34.880 | Sometimes it was Thomas Jefferson.
01:20:37.840 | Sometimes it was faith.
01:20:40.840 | And I knew it would always be really interesting.
01:20:44.200 | - So he's now gone.
01:20:45.680 | - Yeah.
01:20:46.520 | - Do you think about your own mortality?
01:20:49.960 | Are you afraid of death?
01:20:51.680 | - I'm not afraid.
01:20:52.560 | I'm not looking forward to it.
01:20:53.840 | I don't wanna rush it,
01:20:55.080 | 'cause I feel like I got some things I can still do here.
01:20:58.880 | But as a person of faith, I don't think I'm afraid.
01:21:02.520 | I'm 71.
01:21:03.740 | I know I don't have an infinite amount of time left,
01:21:06.960 | and I wanna use the time I've got
01:21:09.400 | in some sort of way that matters.
01:21:12.240 | I'm not ready to become a full-time golfer.
01:21:15.120 | (laughing)
01:21:17.720 | But I don't quite know what that is.
01:21:20.000 | I do feel that I've had a chance
01:21:23.320 | to do amazingly powerful things as far as experiences,
01:21:27.840 | and maybe God has something else in mind.
01:21:30.600 | I wrote this book 16 years ago,
01:21:34.400 | "The Language of God," about science and faith,
01:21:37.220 | trying to explain how, from my perspective,
01:21:39.840 | these are compatible, these are in harmony.
01:21:43.840 | They're complementary if you are careful
01:21:46.280 | about which kind of question you're asking.
01:21:48.520 | And to my surprise, a lot of people
01:21:51.240 | seemed to be interested in that.
01:21:52.520 | They were tired of hearing the extreme voices,
01:21:55.420 | like Dawkins at one end, and people like Ken Ham
01:22:00.960 | and Answers in Genesis on the other end,
01:22:02.880 | saying if you trust science, you're going to hell.
01:22:05.720 | And they thought there must be a way
01:22:07.620 | that these things could get along,
01:22:09.200 | and that's what I tried to put forward.
01:22:10.680 | And then I started a foundation, BioLogos,
01:22:13.480 | which then I had to step away from to become NIH director,
01:22:16.760 | which has just flourished, maybe because I stepped away,
01:22:19.360 | I don't know. (laughing)
01:22:20.440 | But it now has millions of people
01:22:23.200 | who come to that website, and they run amazing meetings.
01:22:26.240 | And I think a lot of people have really come to a sense
01:22:29.120 | that this is okay, I can love science, and I can love God,
01:22:32.240 | and that's not a bad thing.
01:22:33.580 | So maybe there's something more I can do in that space.
01:22:37.360 | Maybe that book is ready for a second edition.
01:22:40.280 | - I think so.
01:22:41.480 | But when you look back, life is finite.
01:22:44.960 | What do you hope your legacy is?
01:22:47.440 | - I don't know, this whole legacy thing
01:22:51.240 | seems a little bit hard to embrace.
01:22:54.200 | It feels a little self-promoting, doesn't it?
01:22:56.240 | I sort of feel like in many ways,
01:22:57.840 | I went to my own funeral on October 5th
01:23:01.080 | when I announced that I was stepping down,
01:23:02.880 | and I got the most amazing responses from people,
01:23:06.760 | some of whom I knew really well,
01:23:08.000 | some of whom I didn't know at all,
01:23:10.120 | who were just telling me stories
01:23:12.080 | about something that I had contributed to
01:23:15.280 | that made a difference to them.
01:23:16.960 | And that was incredibly heartwarming, and that's enough.
01:23:19.760 | I don't wanna build an edifice,
01:23:22.440 | I don't have a plan for a monument or a statue,
01:23:25.720 | God help us. (laughing)
01:23:27.680 | I do feel like I've been incredibly fortunate,
01:23:30.280 | I've had the chance to play a role
01:23:32.800 | in things that were pretty profound
01:23:35.200 | from the Genome Project to NIH to COVID vaccines,
01:23:39.520 | and I ought to be plenty satisfied
01:23:41.600 | that I've had enough experiences here to feel pretty good
01:23:46.160 | about the way in which my life panned out.
01:23:49.360 | - We went through a bunch of difficult questions in this conversation,
01:23:53.400 | let me ask the most difficult one,
01:23:55.320 | that perhaps is the reason you turned to God,
01:24:01.060 | what is the meaning of life? (laughing)
01:24:05.120 | Have you figured it out yet?
01:24:07.160 | - Expect me to put that into three sentences?
01:24:10.200 | - We only have a couple of minutes, so please hurry up.
01:24:12.600 | (laughing)
01:24:15.520 | - Well, that's not a question
01:24:16.360 | that I think science helps me with,
01:24:18.080 | so you're gonna push me into the faith zone,
01:24:20.360 | which is where I'd wanna go with that.
01:24:23.480 | I think, well, what is the meaning, why are we here,
01:24:25.760 | what are we put here to do?
01:24:27.280 | I do believe we're here for just a blink of an eye,
01:24:31.640 | and that our existence somehow goes on beyond that
01:24:35.360 | in a way that I don't entirely understand,
01:24:37.760 | despite efforts to do so.
01:24:39.680 | I think we are called upon in this blink of an eye
01:24:43.920 | to try to make the world a better place,
01:24:45.760 | to try to love people, to try to do a better job
01:24:50.760 | of acting on our more altruistic instincts
01:24:55.800 | and less on our selfish instincts,
01:24:59.440 | to try to be what God calls us to be,
01:25:03.080 | people who are holy, not people who are
01:25:07.940 | driven by self-indulgence.
01:25:11.920 | And sometimes I'm better at that than others.
01:25:13.920 | (laughing)
01:25:15.320 | But I think that, for me as a Christian, is a pretty clear,
01:25:18.400 | I mean, it's to live out the Sermon on the Mount.
01:25:22.020 | Once I read that, I couldn't unread it,
01:25:27.480 | all those beatitudes, all the blesseds.
01:25:30.260 | That's what we're supposed to do.
01:25:32.960 | And the meaning of life is to strive for that standard,
01:25:36.240 | recognizing you're going to fail over and over again,
01:25:40.120 | and that God forgives you.
01:25:41.440 | - Hopefully to put a little bit of love
01:25:44.440 | out there into the world.
01:25:45.480 | - That's what it's about.
01:25:47.400 | - Francis, I'm truly humbled and inspired
01:25:52.400 | by both your brilliance and your humility,
01:25:56.080 | and that you would spend your extremely
01:25:58.200 | valuable time with me today.
01:25:59.400 | It was really an honor.
01:26:00.480 | Thank you so much for talking today.
01:26:02.080 | - I was glad to, and you ask really good questions.
01:26:05.040 | So your reputation as the best podcaster
01:26:08.720 | has borne itself out here this afternoon.
01:26:10.920 | - Thank you so much.
01:26:12.720 | Thanks for listening to this conversation
01:26:14.240 | with Francis Collins.
01:26:15.460 | To support this podcast, please check out
01:26:17.280 | our sponsors in the description.
01:26:19.160 | And now, let me leave you with some words
01:26:21.400 | from Isaac Newton, reflecting on his life and work.
01:26:25.720 | I seem to have been only like a boy,
01:26:28.120 | playing on the seashore and diverting myself
01:26:31.120 | in now and then finding a smoother pebble
01:26:33.820 | or a prettier shell than ordinary,
01:26:36.320 | whilst the great ocean of truth
01:26:38.720 | lay all undiscovered before me.
01:26:40.840 | Thank you for listening, and hope to see you next time.
01:26:44.760 | (upbeat music)
01:26:47.340 | (upbeat music)