
Bret Weinstein: Truth, Science, and Censorship in the Time of a Pandemic | Lex Fridman Podcast #194


Chapters

0:00 Introduction
3:14 Why biology is beautiful
10:47 Boston Dynamics
14:11 Being self-critical
24:19 Theory of close calls
32:56 Lab leak hypothesis
66:50 Joe Rogan
76:00 Censorship
112:49 Vaccines
126:35 The paper on mice with long telomeres
154:18 Martyrdom
163:03 Eric Weinstein
172:23 Monogamy
184:41 Advice for young people
191:25 Meaning of life

Whisper Transcript

00:00:00.000 | "The following is a conversation with Brett Weinstein,
00:00:03.100 | "evolutionary biologist, author,
00:00:05.320 | "co-host of the Dark Horse podcast,
00:00:07.520 | "and as he says, reluctant radical.
00:00:11.160 | "Even though we've never met or spoken before this,
00:00:14.160 | "we both felt like we've been friends for a long time.
00:00:17.400 | "I don't agree on everything with Brett,
00:00:19.660 | "but I'm sure as hell happy he exists
00:00:22.040 | "in this weird and wonderful world of ours.
00:00:25.100 | "Quick mention of our sponsors,
00:00:27.020 | "Jordan Harbinger Show, ExpressVPN,
00:00:29.880 | "Magic Spoon, and Four Sigmatic.
00:00:32.460 | "Check them out in the description
00:00:33.940 | "to support this podcast.
00:00:35.900 | "As a side note, let me say a few words
00:00:37.820 | "about COVID-19 and about science broadly.
00:00:41.220 | "I think science is beautiful and powerful.
00:00:44.200 | "It is the striving of the human mind
00:00:46.580 | "to understand and to solve the problems of the world.
00:00:50.080 | "But as an institution, it is susceptible
00:00:52.360 | "to the flaws of human nature,
00:00:54.240 | "to fear, to greed, power, and ego.
00:00:58.340 | "2020 is a story of all of these
00:01:00.580 | "that has both scientific triumph and tragedy.
00:01:04.120 | "We needed great leaders, and we didn't get them.
00:01:07.240 | "What we needed is leaders who communicate
00:01:09.360 | "in an honest, transparent, and authentic way
00:01:12.020 | "about the uncertainty of what we know
00:01:14.040 | "and the large-scale scientific efforts
00:01:16.080 | "to reduce that uncertainty and to develop solutions.
00:01:19.580 | "I believe there are several candidates for solutions
00:01:21.840 | "that could have all saved hundreds of billions of dollars
00:01:25.340 | "and lessened or eliminated
00:01:27.820 | "the suffering of millions of people.
00:01:30.180 | "Let me mention five of the categories of solutions,
00:01:33.000 | "masks, at-home testing, anonymized contact tracing,
00:01:37.020 | "antiviral drugs, and vaccines.
00:01:39.520 | "Within each of these categories,
00:01:41.260 | "institutional leaders should have constantly asked
00:01:44.100 | "and answered publicly, honestly,
00:01:46.860 | "the following three questions.
00:01:48.860 | "One, what data do we have on the solution
00:01:51.960 | "and what studies are we running
00:01:53.260 | "to get more and better data?
00:01:55.120 | "Two, given the current data and uncertainty,
00:01:57.900 | "how effective and how safe is the solution?
00:02:01.140 | "Three, what is the timeline and cost involved
00:02:04.140 | "with mass manufacturing distribution of the solution?
00:02:07.420 | "In the service of these questions,
00:02:08.960 | "no voices should have been silenced,
00:02:11.040 | "no ideas left off the table.
00:02:13.060 | "Open data, open science,
00:02:15.000 | "open, honest scientific communication and debate
00:02:17.560 | "was the way, not censorship.
00:02:20.240 | "There are a lot of ideas out there
00:02:21.900 | "that are bad, wrong, dangerous,
00:02:24.800 | "but the moment we have the hubris
00:02:26.780 | "to say we know which ideas those are
00:02:29.620 | "is the moment we'll lose our ability to find the truth,
00:02:32.460 | "to find solutions,
00:02:33.860 | "the very things that make science beautiful and powerful
00:02:37.780 | "in the face of all the dangers
00:02:39.120 | "that threaten the wellbeing and the existence
00:02:41.660 | "of humans on earth."
00:02:43.620 | This conversation with Bret
00:02:44.920 | is less about the ideas we talk about.
00:02:46.880 | We agree on some, disagree on others.
00:02:49.060 | It is much more about the very freedom to talk,
00:02:52.340 | to think, to share ideas.
00:02:55.020 | This freedom is our only hope.
00:02:57.360 | Bret should never have been censored.
00:03:00.100 | I asked Bret to do this podcast to show solidarity
00:03:03.400 | and to show that I have hope for science and for humanity.
00:03:07.400 | This is the Lex Fridman Podcast
00:03:10.200 | and here's my conversation with Bret Weinstein.
00:03:13.440 | What to you is beautiful about the study of biology?
00:03:18.080 | The science, the engineering, the philosophy of it?
00:03:21.000 | - It's a very interesting question.
00:03:22.320 | I must say at one level, it's not a conscious thing.
00:03:27.320 | I can say a lot about why as an adult
00:03:30.680 | I find biology compelling,
00:03:32.080 | but as a kid I was completely fascinated with animals.
00:03:36.720 | I loved to watch them and think about why they did
00:03:40.280 | what they did and that developed
00:03:43.200 | into a very conscious passion as an adult.
00:03:46.520 | But I think in the same way
00:03:49.400 | that one is drawn to a person,
00:03:51.800 | I was drawn to the never-ending series
00:03:56.800 | of near miracles that exists across biological nature.
00:04:02.480 | - When you see a living organism,
00:04:03.880 | do you see it from an evolutionary biology perspective
00:04:08.200 | of this entire thing that moves around in this world?
00:04:11.000 | Or do you see from an engineering perspective
00:04:14.240 | that first principles almost down to the physics,
00:04:18.040 | like the little components that build up hierarchies,
00:04:21.200 | that you have cells, first proteins and cells and organs
00:04:26.200 | and all that kind of stuff.
00:04:27.200 | So do you see low level or do you see high level?
00:04:29.920 | - Well, the human mind is a strange thing.
00:04:32.800 | And I think it's probably a bit like a time-sharing machine
00:04:37.800 | in which I have different modules.
00:04:40.360 | We don't know enough about biology for them to connect.
00:04:43.360 | So they exist in isolation
00:04:44.860 | and I'm always aware that they do connect.
00:04:47.320 | But I basically have to step into a module
00:04:50.720 | in order to see the evolutionary dynamics
00:04:53.760 | of the creature and the lineage that it belongs to.
00:04:56.940 | I have to step into a different module
00:04:58.660 | to think of that lineage over a very long time scale,
00:05:02.240 | a different module still to understand
00:05:04.400 | what the mechanisms inside would have to look like
00:05:06.940 | to account for what we can see from the outside.
00:05:11.080 | And I think that probably sounds really complicated,
00:05:15.960 | but one of the things about being involved
00:05:20.680 | in a topic like biology and doing so for one,
00:05:25.160 | really not even just my adult life, for my whole life,
00:05:27.480 | is that it becomes second nature.
00:05:29.400 | And when we see somebody do an amazing parkour routine
00:05:34.400 | or something like that,
00:05:37.020 | we think about what they must be doing
00:05:39.940 | in order to accomplish that.
00:05:41.460 | But of course, what they are doing
00:05:43.080 | is tapping into some kind of zone, right?
00:05:46.320 | They are in a zone in which they are in such command
00:05:51.320 | of their center of gravity, for example,
00:05:53.720 | that they know how to hurl it around a landscape
00:05:56.880 | so that they always land on their feet.
00:05:59.040 | And I would just say,
00:06:03.120 | for anyone who hasn't found a topic
00:06:05.000 | on which they can develop that kind of facility,
00:06:08.820 | it is absolutely worthwhile.
00:06:11.540 | It's really something that human beings are capable of doing
00:06:14.220 | across a wide range of topics,
00:06:16.320 | many things our ancestors didn't even have access to.
00:06:19.600 | And that flexibility of humans,
00:06:22.020 | that ability to repurpose our machinery
00:06:26.160 | for topics that are novel,
00:06:28.440 | means really the world is your oyster.
00:06:31.000 | You can figure out what your passion is
00:06:32.960 | and then figure out all of the angles
00:06:34.820 | that one would have to pursue to really deeply understand it.
00:06:38.200 | And it is well worth having at least one topic like that.
00:06:42.860 | - You mean embracing the full adaptability
00:06:45.740 | of both the body and the mind.
00:06:49.380 | So like, 'cause I don't know
00:06:51.340 | what to attribute the parkour to,
00:06:53.480 | like biomechanics of how our bodies can move,
00:06:56.620 | or is it the mind?
00:06:58.060 | Like how much percent wise?
00:07:00.160 | Is it the entirety of the hierarchies of biology
00:07:04.820 | that we've been talking about, or is it just all?
00:07:08.400 | The mind.
00:07:09.380 | - The way to think about creatures
00:07:10.740 | is that every creature is two things simultaneously.
00:07:14.180 | A creature is a machine of sorts, right?
00:07:17.920 | It's not a machine in the,
00:07:19.900 | you know, I call it an aqueous machine, right?
00:07:22.600 | And it's run by an aqueous computer, right?
00:07:24.840 | So it's not identical to our technological machines.
00:07:29.060 | But every creature is both a machine
00:07:31.900 | that does things in the world
00:07:33.180 | sufficient to accumulate enough resources
00:07:36.380 | to continue surviving, to reproduce.
00:07:39.620 | It is also a potential.
00:07:41.540 | So each creature is potentially,
00:07:44.980 | for example, the most recent common ancestor
00:07:47.220 | of some future clade of creatures
00:07:48.820 | that will look very different from it.
00:07:50.620 | And if a creature is very, very good at being a creature,
00:07:53.780 | but not very good in terms of the potential
00:07:56.260 | it has going forward,
00:07:57.760 | then that lineage will not last very long into the future
00:08:01.420 | because change will throw challenges at it
00:08:04.980 | that its descendants will not be able to meet.
00:08:07.260 | So the thing about humans is we are a generalist platform
00:08:12.260 | and we have the ability to swap out our software
00:08:17.500 | to exist in many, many different niches.
00:08:20.760 | And I was once watching an interview
00:08:23.780 | with this British group of parkour experts
00:08:27.060 | who were being, you know,
00:08:29.460 | they were discussing what it is they do and how it works.
00:08:31.780 | And what they essentially said is,
00:08:33.300 | look, you're tapping into deep monkey stuff, right?
00:08:38.300 | And I thought, yeah, that's about right.
00:08:41.740 | And you know, anybody who is proficient
00:08:45.660 | at something like skiing or skateboarding,
00:08:49.020 | you know, has the experience of flying down the hill
00:08:54.020 | on skis, for example,
00:08:56.220 | bouncing from the top of one mogul to the next.
00:08:59.920 | And if you really pay attention,
00:09:02.520 | you will discover that your conscious mind
00:09:04.360 | is actually a spectator.
00:09:05.620 | It's there, it's involved in the experience,
00:09:08.160 | but it's not driving.
00:09:09.060 | Some part of you knows how to ski
00:09:10.360 | and it's not the part of you that knows how to think.
00:09:12.720 | And I would just say that what accounts
00:09:17.180 | for this flexibility in humans
00:09:19.900 | is the ability to bootstrap a new software program
00:09:23.180 | and then drive it into the unconscious layer
00:09:27.120 | where it can be applied very rapidly.
00:09:30.000 | And you know, I will be shocked
00:09:31.740 | if the exact thing doesn't exist in robotics.
00:09:36.100 | You know, if you programmed a robot
00:09:37.940 | to deal with circumstances that were novel to it,
00:09:40.240 | how would you do it?
00:09:41.080 | It would have to look something like this.
00:09:43.100 | - There's a certain kind of magic,
00:09:45.340 | you're right, with the consciousness being an observer.
00:09:48.620 | When you play guitar, for example,
00:09:50.240 | or piano for me, music,
00:09:53.060 | when you get truly lost in it,
00:09:55.460 | I don't know what the heck is responsible
00:09:57.740 | for the flow of the music,
00:09:59.100 | the kind of the loudness of the music going up and down,
00:10:02.820 | the timing, the intricate,
00:10:05.140 | like even the mistakes, all those things,
00:10:06.940 | that doesn't seem to be the conscious mind.
00:10:09.260 | It is just observing.
00:10:12.020 | And yet it's somehow intricately involved.
00:10:14.540 | More like, 'cause you mentioned parkour,
00:10:17.180 | dance is like that too.
00:10:18.820 | When you start up in tango dancing,
00:10:20.820 | when you truly lose yourself in it,
00:10:24.020 | then it's just like you're an observer.
00:10:26.940 | And how the hell is the body able to do that?
00:10:29.100 | And not only that, it's the physical motion
00:10:31.220 | is also creating the emotion,
00:10:33.980 | the, like, "damn, it's good to be alive" feeling.
00:10:38.980 | (laughs)
00:10:42.260 | But then that's also intricately connected
00:10:44.460 | to the full biology stack that we're operating in.
00:10:47.660 | I don't know how difficult it is to replicate that.
00:10:50.140 | We were talking offline about Boston Dynamics robots.
00:10:54.900 | They've recently been, they did both parkour,
00:10:57.980 | they did flips, they've also done some dancing.
00:11:01.080 | And it's something I think a lot about
00:11:03.520 | because what most people don't realize
00:11:07.940 | because they don't look deep enough
00:11:09.580 | is those robots are hard-coded to do those things.
00:11:13.980 | The robots didn't figure it out by themselves.
00:11:16.940 | And yet the fundamental aspect of what it means to be human
00:11:20.620 | is that process of figuring out, of making mistakes.
00:11:24.140 | And then there's something about overcoming
00:11:26.240 | those challenges and the mistakes
00:11:27.940 | and like figuring out how to lose yourself
00:11:30.100 | in the magic of the dancing or just movement
00:11:34.120 | is what it means to be human, that learning process.
00:11:37.120 | So that's what I wanna do with the,
00:11:40.320 | almost as a fun side thing with the Boston Dynamics robots
00:11:44.740 | is to have them learn and see what they figure out.
00:11:48.420 | Even if they make mistakes.
00:11:50.940 | I wanna let Spot make mistakes
00:11:55.140 | and in so doing discover what it means to be alive,
00:12:00.140 | discover beauty, because I think that's
00:12:02.500 | the essential aspect of mistakes.
00:12:04.520 | Boston Dynamics folks want Spot to be perfect
00:12:09.300 | 'cause they don't want Spot to ever make mistakes
00:12:11.620 | because they want it to operate in factories,
00:12:13.540 | they want it to be very safe and so on.
00:12:16.580 | For me, if you construct the environment,
00:12:19.060 | if you construct a safe space for robots
00:12:21.380 | and allow them to make mistakes,
00:12:24.380 | something beautiful might be discovered.
00:12:26.740 | But that requires a lot of brain power.
00:12:29.820 | So Spot is currently very dumb
00:12:32.060 | and I'm gonna add, give it a brain.
00:12:34.300 | So first make it see, currently can't see,
00:12:36.780 | meaning computer vision.
00:12:38.100 | It has to understand its environment,
00:12:39.780 | it has to see all the humans,
00:12:41.340 | but then also has to be able to learn.
00:12:43.900 | Learn about its movement, learn how to use its body
00:12:47.480 | to communicate with others, all those kinds of things.
00:12:49.940 | The dogs know how to do well,
00:12:51.820 | humans know how to do somewhat well.
00:12:54.020 | I think that's a beautiful challenge,
00:12:56.540 | but first you have to allow the robot to make mistakes.
00:13:00.380 | - Well, I think your objective is laudable,
00:13:03.820 | but you're gonna realize that the Boston Dynamics folks
00:13:06.860 | are right the first time Spot poops on your rug.
00:13:09.300 | - I hear the same thing about kids and so on.
00:13:13.940 | I still wanna have kids.
00:13:14.980 | - No, you should, it's a great experience.
00:13:18.080 | So let me step back into what you said
00:13:19.760 | in a couple of different places.
00:13:21.240 | One, I have always believed that the missing element
00:13:24.720 | in robotics and artificial intelligence
00:13:27.920 | is a proper development.
00:13:30.440 | It is no accident, it is no mere coincidence
00:13:33.620 | that human beings are the most dominant species
00:13:36.400 | on planet Earth and that we have the longest childhoods
00:13:38.880 | of any creature on Earth by far.
00:13:42.000 | The development is the key to the flexibility
00:13:44.880 | and so the capability of a human at adulthood
00:13:49.880 | is the mirror image, it's the flip side
00:13:54.800 | of our helplessness at birth.
00:13:58.240 | So I'll be very interested to see what happens
00:14:00.920 | in your robot project if you do not end up
00:14:04.280 | reinventing childhood for robots,
00:14:06.400 | which of course is foreshadowed in 2001: A Space Odyssey quite brilliantly.
00:14:11.880 | But I also wanna point out, you can see this issue
00:14:16.040 | of your conscious mind becoming a spectator very well
00:14:19.760 | if you compare tennis to table tennis.
00:14:23.480 | If you watch a tennis game, you could imagine
00:14:28.640 | that the players are highly conscious as they play.
00:14:31.260 | You cannot imagine that if you've ever played
00:14:34.680 | ping pong decently.
00:14:36.680 | A volley in ping pong is so fast
00:14:39.440 | that your conscious mind, if your reactions
00:14:42.200 | had to go through your conscious mind,
00:14:43.740 | you wouldn't be able to play.
00:14:44.960 | So you can detect that your conscious mind,
00:14:47.140 | while very much present, isn't there.
00:14:49.720 | And you can also detect where consciousness
00:14:52.160 | does usefully intrude.
00:14:54.540 | If you go up against an opponent in table tennis
00:14:57.760 | that knows a trick that you don't know how to respond to,
00:15:01.800 | you will suddenly detect that something about your game
00:15:04.800 | is not effective and you will start thinking
00:15:07.260 | about what might be, how do you position yourself
00:15:09.520 | so that move that puts the ball just in that corner
00:15:11.880 | of the table or something like that
00:15:13.600 | doesn't catch you off guard.
00:15:15.440 | And this I believe is, we highly conscious folks,
00:15:20.440 | those of us who try to think through things
00:15:23.880 | very deliberately and carefully, mistake consciousness
00:15:27.580 | for like the highest kind of thinking.
00:15:30.280 | And I really think that this is an error.
00:15:33.120 | Consciousness is an intermediate level of thinking.
00:15:36.280 | What it does is it allows you,
00:15:37.720 | it's basically like uncompiled code.
00:15:40.120 | And it doesn't run very fast.
00:15:42.620 | It is capable of being adapted to new circumstances.
00:15:45.400 | But once the code is roughed in, right,
00:15:48.440 | it gets driven into the unconscious layer
00:15:50.480 | and you become highly effective at whatever it is.
00:15:52.840 | And from that point, your conscious mind
00:15:55.260 | basically remains there to detect things
00:15:57.240 | that aren't anticipated by the code you've already written.
00:16:00.040 | And so I don't exactly know how one would establish this,
00:16:05.340 | how one would demonstrate it.
00:16:07.220 | But it must be the case that the human mind
00:16:10.580 | contains sandboxes in which things are tested, right?
00:16:15.380 | Maybe you can build a piece of code
00:16:16.820 | and run it in parallel next to your active code
00:16:19.860 | so you can see how it would have done comparatively.
00:16:23.600 | But there's gotta be some way of writing new code
00:16:26.460 | and then swapping it in.
00:16:27.980 | And frankly, I think this has a lot to do
00:16:29.700 | with things like sleep cycles.
00:16:31.220 | Very often, you know, when I get good at something,
00:16:34.100 | I often don't get better at it while I'm doing it.
00:16:36.740 | I get better at it when I'm not doing it,
00:16:38.660 | especially if there's time to sleep and think on it.
00:16:41.740 | So there's some sort of, you know,
00:16:43.700 | new program swapping in for old program phenomenon,
00:16:46.940 | which, you know, will be a lot easier to see in machines.
00:16:50.860 | It's gonna be hard with the wetware.
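
A rough illustration of the metaphor being drawn here, as a C++ sketch with invented names, not anything either speaker specifies: deliberation is a slow, general fallback, and the "unconscious layer" is a cache of responses that, once roughed in, run without further thought.

```cpp
// Illustrative sketch: deliberate (slow, general) handling of novel inputs,
// cached (fast, automatic) handling of familiar ones.
#include <iostream>
#include <string>
#include <unordered_map>

class Skill {
    std::unordered_map<std::string, std::string> compiled_;  // the "unconscious layer"

    // Slow, flexible reasoning -- stands in for conscious deliberation.
    std::string deliberate(const std::string& situation) {
        return "worked-out response to " + situation;
    }

public:
    std::string respond(const std::string& situation) {
        auto it = compiled_.find(situation);
        if (it != compiled_.end()) {
            return it->second;                  // fast path: already "compiled"
        }
        std::string r = deliberate(situation);  // novel case: think it through
        compiled_[situation] = r;               // drive it into the fast layer
        return r;
    }
};

int main() {
    Skill skiing;
    std::cout << skiing.respond("mogul field") << "\n";  // deliberated the first time
    std::cout << skiing.respond("mogul field") << "\n";  // automatic thereafter
    return 0;
}
```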
00:16:53.820 | - I like, I mean, it is true,
00:16:55.500 | 'cause somebody that played,
00:16:56.780 | I played tennis for many years.
00:16:58.780 | I do still think the highest form of excellence in tennis
00:17:01.900 | is when the conscious mind is a spectator.
00:17:05.220 | So the compiled code is the highest form of being human.
00:17:10.220 | And then consciousness is just some like specific compiler.
00:17:16.380 | You have to have like Borland C++ compiler.
00:17:19.740 | You could just have different kinds of compilers.
00:17:22.180 | Ultimately, the thing that by which we measure
00:17:26.140 | the power of life, the intelligence of life
00:17:30.580 | is the compiled code.
00:17:31.820 | And you can probably do that compilation
00:17:33.300 | all kinds of ways.
00:17:34.900 | - Yep, yeah, I'm not saying that tennis
00:17:36.660 | is played consciously and table tennis isn't.
00:17:38.900 | I'm saying that because tennis is slowed down
00:17:41.460 | by the just the space on the court,
00:17:43.740 | you could imagine that it was your conscious mind playing.
00:17:47.060 | But when you shrink the court down--
00:17:48.740 | - It becomes obvious.
00:17:49.580 | - It becomes obvious that your conscious mind
00:17:51.420 | is just present rather than knowing where to put the paddle.
00:17:54.740 | And weirdly for me,
00:17:56.460 | I would say this probably isn't true in a podcast situation.
00:18:01.780 | But if I have to give a presentation,
00:18:03.980 | especially if I have not overly prepared,
00:18:06.940 | I often find the same phenomenon
00:18:08.700 | when I'm giving the presentation.
00:18:10.060 | My conscious mind is there watching
00:18:11.580 | some other part of me present,
00:18:13.540 | which is a little jarring, I have to say.
00:18:17.420 | - Well, that means you've gotten good at it.
00:18:20.620 | Not let the conscious mind get in the way
00:18:22.460 | of the flow of words.
00:18:24.700 | - Yeah, that's the sensation to be sure.
00:18:27.100 | - And that's the highest form of podcasting too.
00:18:28.980 | I mean, that's why I have,
00:18:30.900 | that's what it looks like when a podcast
00:18:32.900 | is really in the pocket,
00:18:34.180 | like Joe Rogan just having fun and just losing themselves.
00:18:39.180 | And that's something I aspire to as well,
00:18:41.820 | just losing yourself in conversation.
00:18:43.460 | Somebody that has a lot of anxiety with people,
00:18:45.660 | like I'm such an introvert, I'm scared.
00:18:47.940 | I was scared before you showed up, I'm scared right now.
00:18:50.900 | There's just anxiety, there's just, it's a giant mess.
00:18:55.180 | It's hard to lose yourself.
00:18:56.540 | It's hard to just get out of the way of your own mind.
00:19:00.660 | - Yeah, actually trust is a big component of that.
00:19:04.940 | Your conscious mind retains control
00:19:08.100 | if you are very uncertain.
00:19:10.040 | But when you do get into that zone when you're speaking,
00:19:14.260 | I realize it's different for you
00:19:15.420 | with English as a second language,
00:19:16.740 | although maybe you present in Russian and it happens.
00:19:20.280 | But do you ever hear yourself say something
00:19:22.540 | and you think, oh, that's really good, right?
00:19:25.060 | Like you didn't come up with it,
00:19:26.960 | some other part of you that you don't exactly know
00:19:30.140 | came up with it?
00:19:30.980 | - I don't think I've ever heard myself in that way
00:19:35.980 | because I have a much louder voice
00:19:38.180 | that's constantly yelling in my head,
00:19:40.460 | "Why the hell did you say that?"
00:19:43.540 | There's a very self-critical voice that's much louder.
00:19:47.380 | So I'm very, maybe I need to deal with that voice,
00:19:51.300 | but it's been like with a, what is it called?
00:19:53.140 | Like a megaphone just screaming,
00:19:54.620 | so I can't hear the voice that says,
00:19:56.600 | good job, you said that thing really nicely.
00:19:58.600 | So I'm kind of focused right now on the megaphone person
00:20:02.300 | in the audience versus the positive.
00:20:05.300 | But that's definitely something to think about.
00:20:07.180 | It's been productive, but the place where I find gratitude
00:20:12.180 | and beauty and appreciation of life
00:20:15.220 | is in the quiet moments when I don't talk,
00:20:18.740 | when I listen to the world around me,
00:20:20.340 | when I listen to others.
00:20:21.600 | When I talk, I'm extremely self-critical in my mind.
00:20:26.780 | When I produce anything out into the world
00:20:29.640 | that originated with me,
00:20:32.000 | like any kind of creation, extremely self-critical.
00:20:35.040 | It's good for productivity,
00:20:37.440 | for always striving to improve and so on.
00:20:40.760 | It might be bad for just appreciating
00:20:45.040 | the things you've created.
00:20:46.200 | I'm a little bit with Marvin Minsky on this,
00:20:49.600 | where he says the key to a productive life
00:20:54.840 | is to hate everything you've ever done in the past.
00:20:58.020 | - I didn't know he said that.
00:20:59.340 | I must say I resonate with it a bit.
00:21:01.240 | And unfortunately my life currently
00:21:06.060 | has me putting a lot of stuff into the world
00:21:08.700 | and I effectively watch almost none of it.
00:21:12.060 | I can't stand it.
00:21:12.900 | - Yeah, what do you make of that?
00:21:16.820 | I don't know.
00:21:18.060 | I just recently, I just yesterday read Metamorphosis by,
00:21:21.740 | re-read Metamorphosis by Kafka
00:21:23.660 | where he turns into a giant bug
00:21:25.840 | because of the stress that the world puts on him.
00:21:29.060 | His parents put on him to succeed.
00:21:31.620 | And I think that you have to find the balance
00:21:35.500 | 'cause if you allow the self-critical voice
00:21:39.640 | to become too heavy, the burden of the world,
00:21:41.800 | the pressure that the world puts on you
00:21:44.120 | to be the best version of yourself and so on to strive,
00:21:47.360 | then you become a bug and that's a big problem.
00:21:51.760 | And then the world turns against you because you're a bug.
00:21:56.720 | You become some kind of caricature of yourself.
00:21:59.760 | I don't know.
00:22:00.600 | Become the worst version of yourself
00:22:03.920 | and then thereby end up destroying yourself
00:22:07.920 | and then the world moves on.
00:22:09.600 | That's the story.
00:22:10.420 | - That's a lovely story.
00:22:11.460 | (laughing)
00:22:12.480 | I do think this is one of these places
00:22:14.840 | and frankly you could map this onto
00:22:17.680 | all of modern human experience.
00:22:19.500 | But this is one of these places
00:22:20.820 | where our ancestral programming
00:22:23.320 | does not serve our modern selves.
00:22:25.640 | So I used to talk to students
00:22:27.240 | about the question of dwelling on things.
00:22:30.720 | Dwelling on things is famously understood to be bad
00:22:34.320 | and it can't possibly be bad.
00:22:36.720 | It wouldn't exist, the tendency toward it wouldn't exist
00:22:38.920 | if it was bad.
00:22:40.320 | So what is bad is dwelling on things
00:22:42.720 | past the point of utility.
00:22:45.360 | And that's obviously easier to say than to operationalize.
00:22:49.660 | But if you realize that your dwelling is the key in fact
00:22:53.320 | to upgrading your program for future well-being
00:22:57.000 | and that there's a point presumably of diminishing returns
00:23:01.400 | if not counterproductivity,
00:23:03.780 | there is a point at which you should stop
00:23:05.600 | because that is what is in your best interest,
00:23:08.320 | then knowing that you're looking for that point is useful.
00:23:12.120 | This is the point at which it is no longer useful
00:23:14.120 | for me to dwell on this error I have made.
00:23:16.400 | That's what you're looking for.
00:23:17.960 | And it also gives you license.
00:23:21.060 | If some part of you feels like it's punishing you
00:23:23.820 | rather than searching,
00:23:25.600 | then that also has a point at which it's no longer valuable
00:23:29.680 | and there's some liberty in realizing,
00:23:32.400 | yep, even the part of me that was punishing me
00:23:35.260 | knows it's time to stop.
00:23:36.460 | - So if we map that onto compiled code discussion,
00:23:40.460 | as a computer science person, I find that very compelling.
00:23:45.280 | When you compile code, you get warnings sometimes.
00:23:48.720 | And usually if you're a good software engineer,
00:23:53.720 | you're going to make sure there's no,
00:23:56.840 | you treat warnings as errors.
00:23:58.900 | So you make sure that the compilation produces no warnings.
00:24:02.280 | But at a certain point when you have a large enough system,
00:24:05.160 | you just let the warnings go.
00:24:06.820 | It's fine.
00:24:07.860 | Like I don't know where that warning came from,
00:24:10.400 | but you know, it's just ultimately you need
00:24:14.280 | to compile the code and run with it
00:24:16.320 | and hope nothing terrible happens.
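
A minimal sketch of the practice Lex is describing, treating compiler warnings as errors; the file name below is made up, and `g++ -Wall -Wextra -Werror` is one standard way to make any warning fail the build.

```cpp
// warnings_demo.cpp (illustrative file name)
// Strict build:  g++ -Wall -Wextra -Werror warnings_demo.cpp -o demo   (build fails)
// Loose build:   g++ warnings_demo.cpp -o demo                         (warning slips by)
#include <iostream>

int main() {
    int unused = 42;  // -Wall reports an unused variable here;
                      // with -Werror that warning becomes a hard error.
    std::cout << "compiled anyway\n";
    return 0;
}
```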
00:24:19.200 | - Well, I think what you will find,
00:24:21.000 | and believe me, I think what you're talking about
00:24:24.520 | with respect to robots and learning
00:24:27.760 | is gonna end up having to go to a deep developmental state
00:24:31.920 | and a helplessness that evolves into hyper-competence
00:24:34.960 | and all of that.
00:24:36.040 | But I live, I noticed that I live by something
00:24:41.480 | that I, for lack of a better descriptor,
00:24:44.180 | call the theory of close calls.
00:24:46.120 | And the theory of close calls says
00:24:49.600 | that people typically miscategorize the events
00:24:54.600 | in their life where something almost went wrong.
00:24:57.840 | And you know, for example, if you,
00:25:01.080 | I have a friend who, I was walking down the street
00:25:04.800 | with my college friends and one of my friends
00:25:06.760 | stepped into the street thinking it was clear
00:25:08.740 | and was nearly hit by a car going 45 miles an hour.
00:25:12.120 | Would have been an absolute disaster,
00:25:13.920 | might have killed her, certainly would have
00:25:16.160 | permanently injured her.
00:25:17.400 | But she didn't, you know, car didn't touch her, right?
00:25:21.760 | Now you could walk away from that and think nothing of it
00:25:25.240 | because well, what is there to think?
00:25:26.680 | Nothing happened.
00:25:28.240 | Or you could think, well, what is the difference
00:25:30.680 | between what did happen and my death?
00:25:33.580 | The difference is luck.
00:25:35.120 | I never want that to be true, right?
00:25:37.360 | I never want the difference between what did happen
00:25:40.100 | and my death to be luck.
00:25:41.680 | Therefore, I should count this as very close to death.
00:25:45.320 | And I should prioritize coding so it doesn't happen again
00:25:48.760 | at a very high level.
00:25:50.480 | So anyway, my basic point is the accidents and disasters
00:25:55.480 | and misfortune describe a distribution that tells you
00:26:02.240 | what's really likely to get you in the end.
00:26:04.440 | And so personally, you can use them to figure out
00:26:09.440 | where the dangers are so that you can afford
00:26:12.040 | to take great risks because you have a really good sense
00:26:14.400 | of how they're gonna go wrong.
00:26:15.800 | But I would also point out civilization has this problem.
00:26:19.080 | Civilization is now producing these events
00:26:22.920 | that are major disasters,
00:26:24.400 | but they're not existential scale yet, right?
00:26:27.480 | They're very serious errors that we can see.
00:26:30.100 | And I would argue that the pattern is you discover
00:26:32.920 | that we are involved in some industrial process
00:26:35.160 | at the point it has gone wrong, right?
00:26:37.960 | So I'm now always asking the question,
00:26:40.800 | okay, in light of the Fukushima triple meltdown,
00:26:44.880 | the financial collapse of 2008,
00:26:47.000 | the Deepwater Horizon blowout, COVID-19
00:26:51.680 | and its probable origins in the Wuhan lab,
00:26:55.360 | what processes do I not know the name of yet
00:26:58.840 | that I will discover at the point
00:27:00.500 | that some gigantic accident has happened?
00:27:03.100 | And can we talk about the wisdom or lack thereof
00:27:06.220 | of engaging in that process before the accident, right?
00:27:09.100 | That's what a wise civilization would be doing.
00:27:11.500 | And yet we don't.
00:27:12.940 | - I just wanna mention something that happened
00:27:15.140 | a couple of days ago.
00:27:17.180 | I don't know if you know who J.B. Straubel is.
00:27:20.060 | He's the co-founder of Tesla,
00:27:21.800 | CTO of Tesla for many, many years.
00:27:24.100 | His wife just died.
00:27:26.380 | She was riding a bicycle.
00:27:28.600 | And in the same thin line between death and life
00:27:33.600 | that many of us have been in,
00:27:37.420 | where you walk into the intersection
00:27:39.540 | and there's this close call,
00:27:41.780 | every once in a while,
00:27:43.160 | you get the short straw.
00:27:49.520 | I wonder how much of our own individual lives
00:27:54.940 | and the entirety of the human civilization
00:27:57.500 | rests on this little roll of the dice.
00:27:59.760 | - Well, this is sort of my point about the close calls
00:28:03.280 | is that there's a level at which we can't control it, right?
00:28:06.780 | The gigantic asteroid that comes from deep space
00:28:11.080 | that you don't have time to do anything about.
00:28:13.060 | There's not a lot we can do to hedge that out,
00:28:15.600 | or at least not short term.
00:28:16.960 | But there are lots of other things.
00:28:20.400 | Obviously, the financial collapse of 2008
00:28:23.760 | didn't break down the entire world economy.
00:28:27.040 | It threatened to, but a Herculean effort
00:28:28.880 | managed to pull us back from the brink.
00:28:31.080 | The triple meltdown at Fukushima was awful,
00:28:34.380 | but every one of the seven fuel pools held.
00:28:37.400 | There wasn't a major fire that made it impossible
00:28:39.560 | to manage the disaster going forward.
00:28:41.640 | We got lucky.
00:28:42.660 | We could say the same thing about the blowout
00:28:47.800 | at the deep water horizon,
00:28:49.580 | where a hole in the ocean floor large enough
00:28:52.480 | that we couldn't have plugged it could have opened up.
00:28:54.320 | All of these things could have been much, much worse, right?
00:28:57.520 | And I think we can say the same thing about COVID,
00:28:59.160 | as terrible as it is.
00:29:00.960 | And we cannot say for sure that it came from the Wuhan lab,
00:29:04.720 | but there's a strong likelihood that it did.
00:29:07.000 | And it also could be much, much worse.
00:29:10.320 | So in each of these cases, something is telling us
00:29:13.820 | we have a process that is unfolding
00:29:16.320 | that keeps creating risks where it is luck
00:29:18.600 | that is the difference between us
00:29:19.920 | and some scale of disaster that is unimaginable.
00:29:22.700 | And that wisdom, you can be highly intelligent
00:29:26.720 | and cause these disasters.
00:29:28.920 | To be wise is to stop causing them, right?
00:29:31.760 | And that would require a process of restraint,
00:29:36.560 | a process that I don't see a lot of evidence of yet.
00:29:38.880 | So I think we have to generate it.
00:29:41.880 | And somehow, at the moment,
00:29:45.540 | we don't have a political structure
00:29:47.460 | that would be capable of taking a protective
00:29:53.000 | algorithm and actually deploying it, right?
00:29:55.740 | Because it would have important economic consequences.
00:29:57.780 | And so it would almost certainly be shot down.
00:30:00.360 | But we can obviously also say,
00:30:03.380 | we paid a huge price for all of the disasters
00:30:07.000 | that I've mentioned.
00:30:09.360 | And we have to factor that into the equation.
00:30:12.080 | Something can be very productive short-term
00:30:13.840 | and very destructive long-term.
00:30:15.380 | - Also, the question is how many disasters we avoided
00:30:20.900 | because of the ingenuity of humans
00:30:24.000 | or just the integrity and character of humans.
00:30:26.760 | That's sort of an open question.
00:30:30.280 | We may be more intelligent than lucky.
00:30:35.280 | That's the hope.
00:30:36.940 | Because the optimistic message here
00:30:38.640 | that you're getting at is maybe the process
00:30:42.200 | that we should be,
00:30:43.720 | that maybe we can overcome luck with ingenuity.
00:30:48.560 | Meaning, I guess you're suggesting the process
00:30:51.800 | is we should be listing all the ways
00:30:53.720 | that human civilization can destroy itself,
00:30:55.880 | assigning likelihood to it,
00:30:59.020 | and thinking through how can we avoid that.
00:31:03.260 | And being very honest with the data out there
00:31:06.740 | about the close calls and using those close calls
00:31:10.540 | to then create sort of mechanism
00:31:13.940 | by which we minimize the probability of those close calls.
00:31:17.460 | And just being honest and transparent
00:31:21.240 | with the data that's out there.
00:31:23.120 | - Well, I think we need to do a couple things
00:31:25.000 | for it to work.
00:31:26.220 | So I've been an advocate for the idea
00:31:30.060 | that sustainability is actually,
00:31:32.060 | it's difficult to operationalize,
00:31:33.660 | but it is an objective that we have to meet
00:31:35.880 | if we're to be around long-term.
00:31:38.560 | And I realize that we also need to have reversibility
00:31:41.460 | of all of our processes
00:31:43.280 | because processes very frequently when they start
00:31:46.320 | do not appear dangerous.
00:31:47.960 | And then when they scale, they become very dangerous.
00:31:51.280 | So for example, if you imagine
00:31:53.160 | the first internal combustion engine vehicle
00:31:58.280 | driving down the street,
00:31:59.680 | and you imagine somebody running after them saying,
00:32:01.640 | "Hey, if you do enough of that,
00:32:02.920 | "you're gonna alter the atmosphere
00:32:04.200 | "and it's gonna change the temperature of the planet."
00:32:05.960 | It's preposterous, right?
00:32:07.440 | Why would you stop the person
00:32:08.600 | who's invented this marvelous new contraption?
00:32:10.920 | But of course, eventually you do get to the place
00:32:13.160 | where you're doing enough of this
00:32:14.600 | that you do start changing the temperature of the planet.
00:32:17.040 | So if we built the capacity,
00:32:20.460 | if we basically said, "Look, you can't involve yourself
00:32:23.700 | "in any process that you couldn't reverse if you had to,"
00:32:27.520 | then progress would be slowed,
00:32:30.160 | but our safety would go up dramatically.
00:32:33.280 | And I think in some sense,
00:32:36.720 | if we are to be around long-term,
00:32:38.120 | we have to begin thinking that way.
00:32:40.280 | We're just involved in too many very dangerous processes.
00:32:43.880 | - So let's talk about one of the things that,
00:32:48.040 | if not threatened human civilization,
00:32:50.280 | certainly hurt it at a deep level, which is COVID-19.
00:32:54.920 | What percent probability would you currently place
00:32:59.480 | on the hypothesis that COVID-19 leaked
00:33:02.480 | from the Wuhan Institute of Virology?
00:33:06.320 | - So I maintain a flow chart
00:33:08.640 | of all the possible explanations,
00:33:10.800 | and it doesn't break down exactly that way.
00:33:14.280 | The likelihood that it emerged from a lab
00:33:17.920 | is very, very high.
00:33:19.160 | If it emerged from a lab,
00:33:21.400 | the likelihood that the lab was the Wuhan Institute
00:33:23.600 | is very, very high.
00:33:24.760 | There are multiple different kinds of evidence
00:33:30.500 | that point to the lab,
00:33:31.640 | and there is literally no evidence that points to nature.
00:33:35.080 | Either the evidence points nowhere,
00:33:36.760 | or it points to the lab,
00:33:38.160 | and the lab could mean any lab,
00:33:39.880 | but geographically, obviously,
00:33:41.880 | the labs in Wuhan are the most likely,
00:33:44.920 | and the lab that was most directly involved
00:33:46.720 | with research on viruses that look like SARS-CoV-2
00:33:51.720 | is obviously the place that one would start.
00:33:55.800 | But I would say the likelihood
00:33:57.680 | that this virus came from a lab is well above 95%.
00:34:02.680 | We can talk about the question of,
00:34:05.600 | could a virus have been brought into the lab
00:34:07.860 | and escaped from there without being modified?
00:34:09.920 | That's also possible,
00:34:11.220 | but it doesn't explain any of the anomalies
00:34:13.720 | in the genome of SARS-CoV-2.
00:34:16.520 | Could it have been delivered from another lab?
00:34:20.680 | Could Wuhan be a distraction
00:34:23.120 | in order that we would connect the dots in the wrong way?
00:34:26.600 | That's conceivable.
00:34:27.600 | I currently have that below 1% on my flow chart,
00:34:30.600 | but I think-- - It's a very dark thought
00:34:32.640 | that somebody would do that
00:34:34.160 | almost as a political attack on China.
00:34:37.760 | - Well, it depends.
00:34:39.120 | I don't even think that's one possibility.
00:34:42.400 | Sometimes when Eric and I talk about these issues,
00:34:44.480 | we will generate a scenario
00:34:47.680 | just to prove that something could live in that space,
00:34:50.800 | right, as a placeholder
00:34:51.960 | for whatever may actually have happened.
00:34:53.840 | And so it doesn't have to have been an attack on China.
00:34:57.320 | That's certainly one possibility.
00:34:59.320 | But I would point out,
00:35:00.720 | if you can predict the future in some unusual way
00:35:07.200 | better than others, you can print money, right?
00:35:10.280 | That's what markets that allow you to bet for or against
00:35:13.080 | virtually any sector allow you to do.
00:35:17.000 | So you can imagine a simply amoral person or entity
00:35:22.000 | generating a pandemic, attempting to cover their tracks
00:35:28.240 | because it would allow them to bet against things
00:35:30.520 | like cruise ships, air travel, whatever it is,
00:35:35.080 | and bet in favor of, I don't know,
00:35:37.840 | sanitizing gel and whatever else you would do.
00:35:43.280 | So am I saying that I think somebody did that?
00:35:46.360 | No, I really don't think it happened.
00:35:47.720 | We've seen zero evidence
00:35:49.000 | that this was intentionally released.
00:35:51.280 | However, were it to have been intentionally released
00:35:54.720 | by somebody who did not know,
00:35:56.160 | did not want it known where it had come from,
00:35:59.240 | releasing it into Wuhan
00:36:00.360 | would be one way to cover their tracks.
00:36:01.840 | So we have to leave the possibility formally open,
00:36:05.120 | but acknowledge there's no evidence.
00:36:07.360 | - And the probability therefore is low.
00:36:09.320 | I tend to believe,
00:36:10.560 | maybe this is the optimistic nature that I have,
00:36:14.480 | that people who are competent enough
00:36:18.440 | to do the kind of thing we just described
00:36:20.600 | are not going to do that
00:36:23.080 | because it requires a certain kind of,
00:36:26.200 | I don't wanna use the word evil,
00:36:27.400 | but whatever word you wanna use to describe
00:36:29.240 | the kind of disregard for human life required to do that,
00:36:34.240 | that's just not going to be coupled with competence.
00:36:40.560 | I feel like there's a trade-off chart
00:36:42.440 | where competence on one axis and evil is on the other.
00:36:45.920 | And the more evil you become,
00:36:47.400 | the crappier you are at doing great engineering,
00:36:52.400 | scientific work required to deliver weapons
00:36:55.720 | of different kinds,
00:36:56.840 | whether it's bioweapons or nuclear weapons
00:36:58.560 | and all those kinds of things.
00:36:59.840 | That seems to be the lessons I take from history,
00:37:02.720 | but that doesn't necessarily mean
00:37:04.360 | that's what's going to be happening in the future.
00:37:06.960 | But to stick on the lab leak idea,
00:37:11.280 | 'cause the flow chart is probably huge here
00:37:13.560 | 'cause there's a lot of fascinating possibilities.
00:37:16.200 | One question I wanna ask is
00:37:18.080 | what would evidence for natural origins look like?
00:37:20.760 | So one piece of evidence for natural origins
00:37:25.240 | is that it's happened in the past,
00:37:28.840 | that viruses have jumped.
00:37:33.600 | - Oh, they do jump.
00:37:35.080 | - So like that's one, like that's possible to have happened.
00:37:38.560 | So that's a sort of like a historical evidence,
00:37:42.320 | like, okay, well, it's possible that it happened.
00:37:45.520 | - It's not evidence of the kind you think it is.
00:37:48.120 | It's a justification for a presumption.
00:37:51.120 | So the presumption upon discovering a new virus
00:37:54.780 | circulating is certainly that it came from nature, right?
00:37:57.880 | The problem is the presumption evaporates
00:38:00.720 | in the face of evidence, or at least it logically should.
00:38:04.200 | And it didn't in this case.
00:38:05.800 | It was maintained by people who privately in their emails
00:38:09.000 | acknowledged that they had grave doubts
00:38:11.800 | about the natural origin of this virus.
00:38:14.880 | - Is there some other piece of evidence
00:38:17.040 | that we could look for and see that would say,
00:38:21.840 | this increases the probability that it's natural origins?
00:38:24.780 | - Yeah, in fact, there is evidence.
00:38:27.660 | I always worry that somebody is going to
00:38:30.780 | make up some evidence in order to reverse the flow.
00:38:34.700 | - Oh, boy.
00:38:35.540 | - Well, let's say I am--
00:38:36.380 | - There's a lot of incentive for that, actually.
00:38:38.220 | - There's a huge amount of incentive.
00:38:39.580 | On the other hand, why didn't the powers that be,
00:38:43.480 | the powers that lied to us about
00:38:44.820 | weapons of mass destruction in Iraq,
00:38:46.380 | why didn't they ever fake
00:38:47.780 | weapons of mass destruction in Iraq?
00:38:49.780 | Whatever force it is, I hope that force is here too.
00:38:52.680 | And so whatever evidence we find is real.
00:38:54.960 | - It's the competence thing I'm talking about.
00:38:56.880 | But okay, go ahead, sorry.
00:38:58.920 | - Well, we can get back to that.
00:39:00.200 | But I would say, yeah, the giant piece of evidence
00:39:03.880 | that will shift the probabilities in the other direction
00:39:07.160 | is the discovery of either a human population
00:39:10.540 | in which the virus circulated prior to showing up in Wuhan
00:39:14.320 | that would explain where the virus learned
00:39:16.320 | all of the tricks that it knew instantly
00:39:18.360 | upon spreading from Wuhan.
00:39:20.360 | So that would do it, or an animal population
00:39:24.080 | in which an ancestor epidemic can be found
00:39:27.880 | in which the virus learned this before jumping to humans.
00:39:30.600 | But I'd point out in that second case,
00:39:33.460 | you would certainly expect to see a great deal of evolution
00:39:36.600 | in the early epidemic, which we don't see.
00:39:39.720 | So there almost has to be a human population
00:39:42.880 | somewhere else that had the virus circulating
00:39:45.040 | or an ancestor of the virus that we first saw
00:39:47.580 | in Wuhan circulating.
00:39:48.640 | And it has to have gotten very sophisticated
00:39:50.960 | in that prior epidemic before hitting Wuhan
00:39:54.040 | in order to explain the total lack of evolution
00:39:56.560 | and extremely effective virus that emerged
00:40:00.120 | at the end of 2019.
00:40:01.640 | - So you don't believe in the magic of evolution
00:40:03.680 | to spring up with all the tricks already there.
00:40:05.640 | Like everybody who doesn't have the tricks,
00:40:07.440 | they die quickly.
00:40:09.440 | And then you just have this beautiful virus
00:40:11.480 | that comes in with a spike protein
00:40:13.440 | and through mutation and selection,
00:40:17.200 | just like the ones that succeed and succeed big
00:40:22.200 | are the ones that are going to just spring into life
00:40:25.780 | with the tricks.
00:40:26.780 | - Well, no, that's called a hopeful monster
00:40:30.580 | and hopeful monsters don't work.
00:40:33.160 | The job of becoming a new pandemic virus is too difficult.
00:40:37.680 | It involves two very difficult steps
00:40:39.340 | and they both have to work.
00:40:40.340 | One is the ability to infect a person
00:40:42.620 | and spread in their tissues
00:40:45.060 | sufficient to make an infection.
00:40:46.780 | And the other is to jump between individuals
00:40:49.140 | at a sufficient rate that it doesn't go extinct
00:40:51.540 | for one reason or another.
00:40:53.340 | Those are both very difficult jobs.
00:40:55.180 | They require, as you describe, selection.
00:40:58.060 | And the point is selection would leave a mark.
00:41:00.700 | We would see evidence that it was taking place.
00:41:02.260 | - In animals or humans, we would see.
00:41:04.060 | - Both, right?
00:41:05.820 | - And we see this evolutionary trace
00:41:07.300 | of the virus gathering the tricks up.
00:41:10.700 | - Yeah, you would see the virus,
00:41:12.300 | you would see the clumsy virus get better and better.
00:41:14.180 | And yes, I am a full believer
00:41:15.740 | in the power of that process.
00:41:17.220 | In fact, I believe it.
00:41:19.300 | What I know from studying the process
00:41:22.420 | is that it is much more powerful than most people imagine.
00:41:25.300 | That what we teach in the Evolution 101 textbook
00:41:28.900 | is too clumsy a process to do what we see it doing
00:41:32.100 | and that actually people should increase their expectation
00:41:35.220 | of the rapidity with which that process can produce
00:41:38.700 | just jaw-dropping adaptations.
00:41:42.660 | That said, we just don't see evidence
00:41:44.340 | that it happened here,
00:41:45.160 | which doesn't mean it doesn't exist,
00:41:46.660 | but it means in spite of immense pressure
00:41:49.380 | to find it somewhere, there's been no hint,
00:41:52.020 | which probably means it took place inside of a laboratory.
00:41:55.580 | - So inside the laboratory,
00:41:58.260 | gain-of-function research on viruses.
00:42:00.940 | And I believe most of that kind of research
00:42:04.940 | is doing this exact thing that you're referring to,
00:42:07.140 | which is accelerated evolution.
00:42:09.460 | And just watching evolution do its thing
00:42:11.220 | on a bunch of viruses
00:42:12.620 | and seeing what kind of tricks get developed.
00:42:16.180 | The other method is engineering viruses.
00:42:21.180 | So manually adding on the tricks.
00:42:24.480 | Which do you think we should be thinking about here?
00:42:30.960 | - So mind you, I learned what I know
00:42:33.660 | in the aftermath of this pandemic emerging.
00:42:36.000 | I started studying the question.
00:42:38.140 | And I would say based on the content of the genome
00:42:42.340 | and other evidence in publications from the various labs
00:42:45.700 | that were involved in generating this technology,
00:42:50.440 | a couple of things seem likely.
00:42:52.620 | This SARS-CoV-2 does not appear to be entirely the result
00:42:57.620 | of either a splicing process or serial passaging.
00:43:02.780 | It appears to have both things in its past,
00:43:07.340 | or it's at least highly likely that it does.
00:43:09.340 | So for example, the Fern Cleavage site
00:43:11.860 | looks very much like it was added in to the virus.
00:43:15.660 | And it was known that that would increase its infectivity
00:43:18.660 | in humans and increase its tropism.
00:43:21.280 | The virus appears to be excellent
00:43:27.180 | at spreading in humans and minks and ferrets.
00:43:32.800 | Now minks and ferrets are very closely related
00:43:34.520 | to each other and ferrets are very likely
00:43:36.440 | to have been used in a serial passage experiment.
00:43:39.180 | The reason being that they have an ACE2 receptor
00:43:41.900 | that looks very much like the human ACE2 receptor.
00:43:44.340 | And so were you going to passage the virus
00:43:47.500 | or its ancestor through an animal
00:43:49.820 | in order to increase its infectivity in humans,
00:43:52.220 | which would have been necessary?
00:43:54.500 | Ferrets would have been very likely.
00:43:56.200 | It is also quite likely that humanized mice were utilized
00:44:01.200 | and it is possible that human airway tissue was utilized.
00:44:05.700 | I think it is vital that we find out
00:44:07.940 | what the protocols were.
00:44:09.100 | If this came from the Wuhan Institute,
00:44:11.420 | we need to know it and we need to know
00:44:12.740 | what the protocols were exactly
00:44:15.000 | because they will actually give us some tools
00:44:17.300 | that would be useful in fighting SARS-CoV-2
00:44:20.940 | and hopefully driving it to extinction,
00:44:22.540 | which ought to be our priority.
00:44:24.460 | It is a priority that does not,
00:44:26.220 | it is not apparent from our behavior,
00:44:28.260 | but it really is, it should be our objective.
00:44:31.340 | If we understood where our interests lie,
00:44:33.440 | we would be much more focused on it.
00:44:36.620 | But those protocols would tell us a great deal.
00:44:39.380 | If it wasn't the Wuhan Institute, we need to know that.
00:44:42.180 | If it was nature, we need to know that.
00:44:44.460 | And if it was some other laboratory,
00:44:45.900 | we need to figure out who, what and where
00:44:48.540 | so that we can determine what is,
00:44:50.460 | what we can determine about what was done.
00:44:53.100 | - You're opening up my mind about why we should investigate,
00:44:57.380 | why we should know the truth of the origins of this virus.
00:45:01.380 | So for me personally, let me just tell the story
00:45:04.500 | of my own kind of journey.
00:45:05.980 | When I first started looking into the lab leak hypothesis,
00:45:11.460 | what became terrifying to me and important to understand
00:45:17.340 | and obvious is the sort of like Sam Harris way of thinking,
00:45:22.300 | which is it's obvious that a lab leak
00:45:25.900 | of a deadly virus will eventually happen.
00:45:28.260 | My mind was, it doesn't even matter
00:45:32.540 | if it happened in this case,
00:45:34.500 | it's obvious there's going to happen in the future.
00:45:37.460 | So why the hell are we not freaking out about this?
00:45:40.700 | And COVID-19 is not even that deadly
00:45:42.540 | relative to the possible future viruses.
00:45:45.100 | It's the way, I disagree with Sam on this,
00:45:47.660 | but he thinks this way about AGI as well,
00:45:50.620 | about artificial intelligence.
00:45:52.380 | It's a different discussion, I think,
00:45:54.060 | but with viruses, it seems like something
00:45:55.900 | that could happen on the scale of years,
00:45:58.380 | maybe a few decades.
00:46:00.060 | AGI is a little bit farther out for me,
00:46:02.380 | but it seemed the terrifying thing,
00:46:04.220 | it seemed obvious that this will happen very soon
00:46:08.460 | for a much deadlier virus as we get better and better
00:46:11.700 | at both engineering viruses
00:46:13.780 | and doing this kind of evolution-driven research,
00:46:16.660 | gain-of-function research.
00:46:18.620 | Okay, but then you started speaking out about this as well,
00:46:23.100 | but also started to say, no, no, no,
00:46:25.260 | we should hurry up and figure out the origins now
00:46:27.780 | because it will help us figure out
00:46:29.780 | how to actually respond to this particular virus,
00:46:34.780 | how to treat this particular virus,
00:46:37.660 | what to do in terms of vaccines, in terms of antiviral drugs,
00:46:40.460 | in terms of just all the responses we should have.
00:46:45.460 | Okay, I still am much more freaking out about the future.
00:46:51.420 | Maybe you can break that apart a little bit.
00:46:57.620 | Which are you most focused on now?
00:47:02.620 | Which are you most freaking out about now
00:47:06.820 | in terms of the importance of figuring out
00:47:08.540 | the origins of this virus?
00:47:10.580 | - I am most freaking out about both of them
00:47:13.620 | because they're both really important
00:47:15.300 | and we can put bounds on this.
00:47:18.060 | Let me say first that this is a perfect test case
00:47:20.820 | for the theory of close calls
00:47:22.380 | because as much as COVID is a disaster,
00:47:25.180 | it is also a close call from which we can learn much.
00:47:28.460 | You are absolutely right,
00:47:29.620 | if we keep playing this game in the lab,
00:47:31.860 | especially if we do it under pressure
00:47:36.460 | and when we are told that a virus
00:47:37.780 | is going to leap from nature any day
00:47:40.060 | and that the more we know,
00:47:41.220 | the better we'll be able to fight it,
00:47:42.460 | we're gonna create the disaster all the sooner.
00:47:46.460 | So yes, that should be an absolute focus.
00:47:49.380 | The fact that there were people saying
00:47:50.860 | that this was dangerous back in 2015
00:47:54.100 | ought to tell us something.
00:47:55.420 | The fact that the system bypassed a ban
00:47:57.740 | and offshored the work to China
00:48:00.100 | ought to tell us this is not a Chinese failure,
00:48:02.140 | this is a failure of something larger and harder to see.
00:48:05.780 | But I also think that there's a clock ticking
00:48:11.580 | with respect to SARS-CoV-2 and COVID,
00:48:14.780 | the disease that it creates.
00:48:16.900 | And that has to do with whether or not
00:48:18.340 | we are stuck with it permanently.
00:48:20.100 | So if you think about the cost to humanity
00:48:22.940 | of being stuck with influenza,
00:48:24.900 | it's an immense cost year after year.
00:48:27.180 | And we just stop thinking about it
00:48:28.580 | because it's there.
00:48:30.220 | Some years you get the flu, most years you don't.
00:48:32.380 | Maybe you get the vaccine to prevent it.
00:48:34.180 | Maybe the vaccine isn't particularly well targeted.
00:48:37.140 | But imagine just simply doubling that cost.
00:48:40.220 | Imagine we get stuck with SARS-CoV-2
00:48:43.980 | and its descendants going forward
00:48:45.940 | and that it just settles in
00:48:48.300 | and becomes a fact of modern human life.
00:48:51.300 | That would be a disaster, right?
00:48:52.820 | The number of people we will ultimately lose
00:48:54.620 | is incalculable.
00:48:55.940 | The amount of suffering that will be caused
00:48:57.380 | is incalculable.
00:48:58.300 | The loss of wellbeing and wealth, incalculable.
00:49:01.940 | So that ought to be a very high priority,
00:49:04.580 | driving this extinct before it becomes permanent.
00:49:08.020 | And the ability to drive it extinct goes down
00:49:11.940 | the longer we delay effective responses.
00:49:15.580 | To the extent that we let it have this very large canvas,
00:49:18.820 | large numbers of people who have the disease
00:49:21.500 | in whom mutation and selection can result in adaptations
00:49:24.980 | that we will not be able to counter,
00:49:26.900 | the greater its ability to figure out features
00:49:29.460 | of our immune system and use them to its advantage.
00:49:33.580 | So I'm feeling the pressure of driving it extinct.
00:49:37.260 | I believe we could have driven it extinct six months ago
00:49:40.340 | and we didn't do it because of very mundane concerns
00:49:43.300 | among a small number of people.
00:49:44.740 | And I'm not alleging that they were brazen
00:49:49.500 | or that they were callous about deaths that would be caused.
00:49:54.500 | I have the sense that they were working
00:49:56.700 | from a kind of autopilot in which,
00:49:59.260 | let's say you're in some kind of a corporation,
00:50:02.740 | a pharmaceutical corporation,
00:50:04.480 | you have a portfolio of therapies
00:50:08.540 | that in the context of a pandemic might be very lucrative.
00:50:11.580 | Those therapies have competitors.
00:50:13.920 | You of course wanna position your product
00:50:15.860 | so that it succeeds and the competitors don't.
00:50:18.980 | And lo and behold, at some point,
00:50:21.580 | through means that I think those of us on the outside
00:50:23.820 | can't really intuit, you end up saying things
00:50:28.580 | that aren't true about competing therapies that work better
00:50:30.940 | and much more safely than the ones you're selling,
00:50:33.940 | and that does cause people to die
00:50:36.460 | in large numbers.
00:50:38.660 | But it's some kind of autopilot, at least part of it is.
00:50:43.180 | - So there's a complicated coupling of the autopilot
00:50:47.460 | of institutions, companies, governments.
00:50:52.460 | And then there's also the geopolitical game theory
00:50:57.260 | thing going on where you wanna keep secrets.
00:51:00.540 | It's the Chernobyl thing where if you messed up,
00:51:03.920 | there's a big incentive, I think,
00:51:07.780 | to hide the fact that you messed up.
00:51:10.780 | So how do we fix this?
00:51:12.740 | And what's more important to fix?
00:51:14.460 | The autopilot, which is the response
00:51:18.420 | that we often criticize about our institutions,
00:51:21.980 | especially the leaders in those institutions,
00:51:23.900 | Anthony Fauci and so on,
00:51:25.700 | and some of the members of the scientific community.
00:51:29.100 | And the second part is the game with China
00:51:34.100 | of hiding the information
00:51:37.260 | in terms of the fight between nations.
00:51:39.460 | - Well, in our live streams on Dark Horse,
00:51:42.860 | Heather and I have been talking from the beginning
00:51:44.860 | about the fact that although, yes,
00:51:47.300 | what happened began in China,
00:51:50.020 | it very much looks like a failure
00:51:51.560 | of the international scientific community.
00:51:53.820 | That's frightening, but it's also hopeful in the sense
00:51:58.580 | that actually if we did the right thing now,
00:52:01.020 | we're not navigating a puzzle about Chinese responsibility.
00:52:05.380 | We're navigating a question of collective responsibility
00:52:10.380 | for something that has been terribly costly to all of us.
00:52:14.140 | So that's not a very happy process.
00:52:17.980 | But as you point out,
00:52:19.260 | what's at stake is in large measure,
00:52:21.060 | at the very least,
00:52:22.220 | the strong possibility this will happen again
00:52:24.980 | and that at some point it will be far worse.
00:52:27.140 | So just as a person that does not learn the lessons
00:52:32.140 | of their own errors doesn't get smarter
00:52:34.940 | and they remain in danger,
00:52:37.220 | we collectively, humanity, have to say,
00:52:39.980 | "Well, there sure is a lot of evidence
00:52:43.060 | "that suggests that this is a self-inflicted wound."
00:52:46.100 | When you have done something
00:52:47.900 | that has caused a massive self-inflicted wound,
00:52:52.620 | it makes sense to dwell on it
00:52:55.380 | exactly to the point that you have learned the lesson
00:52:57.700 | that makes it very, very unlikely
00:52:59.660 | that something similar will happen again.
00:53:01.860 | - I think this is a good place
00:53:02.940 | to kind of ask you to do almost like a thought experiment.
00:53:07.260 | Or to steel man the argument
00:53:12.260 | against the lab leak hypothesis.
00:53:15.500 | So if you were to argue,
00:53:20.500 | you know, you said 95% chance
00:53:23.340 | that the virus leaked from a lab,
00:53:25.540 | there's a bunch of ways I think you can argue
00:53:29.940 | that even talking about it is bad for the world.
00:53:35.940 | So if I just put something on the table,
00:53:40.180 | it's to say that, for one,
00:53:44.060 | it would be racism against Chinese people.
00:53:46.880 | That if you talk about it leaking from a lab,
00:53:51.180 | there's an immediate kind of blame
00:53:53.580 | and it can spiral down into this idea
00:53:56.260 | that somehow the people are responsible
00:53:58.900 | for the virus and this kind of thing.
00:54:02.060 | Is it possible for you to come up
00:54:03.300 | with other steel man arguments against talking
00:54:08.300 | or against the possibility of the lab leak hypothesis?
00:54:12.120 | - Well, so I think steel manning is a tool
00:54:16.740 | that is extremely valuable,
00:54:19.220 | but it's also possible to abuse it.
00:54:22.780 | I think that you can only steel man a good faith argument.
00:54:26.420 | And the problem is we now know
00:54:28.580 | that we have not been engaged with opponents
00:54:31.260 | who are wielding good faith arguments
00:54:32.680 | because privately their emails reflect their own doubts.
00:54:36.220 | And what they were doing publicly
00:54:37.500 | was actually a punishment, a public punishment
00:54:41.040 | for those of us who spoke up with, I think,
00:54:43.940 | the purpose of either backing us down
00:54:46.900 | or more likely warning others not to engage
00:54:49.860 | in the same kind of behavior.
00:54:51.260 | And obviously for people like you and me
00:54:53.380 | who regard science as our likely best hope
00:54:58.380 | for navigating difficult waters,
00:55:01.420 | shutting down people who are using those tools honorably
00:55:05.160 | is itself dishonorable.
00:55:07.500 | So I don't really feel
00:55:10.420 | that there's anything to steel man.
00:55:13.780 | And I also think that, you know,
00:55:16.900 | immediately at the point that the world suddenly
00:55:19.140 | with no new evidence on the table,
00:55:21.380 | switched gears with respect to the lab leak,
00:55:23.920 | you know, at the point that Nicholas Wade
00:55:25.240 | had published his article and suddenly the world
00:55:27.340 | was going to admit that this was at least a possibility,
00:55:30.180 | if not a likelihood, we got to see something
00:55:34.900 | of the rationalization process that had taken place
00:55:37.420 | inside the institutional world.
00:55:39.340 | And it very definitely involved the claim
00:55:41.760 | that what was being avoided was the targeting
00:55:45.060 | of Chinese scientists.
00:55:49.100 | And my point would be,
00:55:50.020 | I don't wanna see the targeting of anyone.
00:55:53.420 | I don't want to see racism of any kind.
00:55:55.860 | On the other hand, once you create license to lie
00:56:00.660 | in order to protect individuals,
00:56:03.900 | when the world has a stake in knowing what happened,
00:56:07.540 | then it is inevitable that that process,
00:56:10.200 | that license to lie will be used by the thing
00:56:12.980 | that captures institutions for its own purposes.
00:56:15.420 | So my sense is, it may be very unfortunate
00:56:19.700 | if the story of what happened here can be used
00:56:23.700 | against Chinese people, that would be very unfortunate.
00:56:28.140 | And as I think I mentioned,
00:56:31.180 | Heather and I have taken great pains to point out
00:56:33.940 | that this doesn't look like a Chinese failure,
00:56:35.780 | it looks like a failure
00:56:36.600 | of the international scientific community.
00:56:38.380 | So I think it is important to broadcast that message
00:56:41.100 | along with the analysis of the evidence,
00:56:43.880 | but no matter what happened, we have a right to know.
00:56:46.740 | And I frankly do not take the institutional layer
00:56:50.820 | at its word, that its motivations are honorable
00:56:53.460 | and that it was protecting good-hearted scientists
00:56:56.700 | at the expense of the world.
00:56:57.900 | That explanation does not add up.
00:57:00.280 | - Well, this is a very interesting question
00:57:03.260 | about whether it's ever okay to lie
00:57:06.380 | at the institutional layer to protect the populace.
00:57:16.260 | I think both you and I probably
00:57:21.900 | have the same sense that it's a slippery slope,
00:57:21.900 | even if it's an effective mechanism in the short term,
00:57:25.220 | in the long term, it's going to be destructive.
00:57:27.700 | This happened with masks, this happened with other things.
00:57:32.540 | If you look at just the history of pandemics,
00:57:34.620 | there's an idea that panic is destructive
00:57:40.420 | amongst the populace.
00:57:41.300 | So you want to construct a narrative,
00:57:44.440 | whether it's a lie or not, to minimize panic.
00:57:49.540 | But you're suggesting that almost in all cases,
00:57:52.780 | and I think that was the lesson from the pandemic
00:57:57.540 | in the early 20th century,
00:57:59.280 | that lying creates distrust and distrust
00:58:04.280 | in the institutions is ultimately destructive.
00:58:08.240 | That's your sense, that lying is not okay?
00:58:10.840 | - Well, okay.
00:58:12.400 | There are obviously places where complete transparency
00:58:15.820 | is not a good idea, right?
00:58:17.300 | To the extent that you broadcast a technology
00:58:19.980 | that allows one individual to hold the world hostage,
00:58:23.900 | right, obviously you've got something to be navigated.
00:58:27.660 | But in general, I don't believe that the scientific system
00:58:32.660 | should be lying to us.
00:58:36.100 | In the case of this particular lie,
00:58:39.820 | the idea that the well-being of Chinese scientists
00:58:45.380 | outweighs the well-being of the world is preposterous.
00:58:49.860 | - Yeah. - Right?
00:58:50.700 | As you point out, one thing that rests on this question
00:58:53.460 | is whether we continue to do this kind of research
00:58:55.940 | going forward.
00:58:56.780 | And the scientists in question, all of them,
00:58:58.900 | American, Chinese, all of them,
00:59:01.700 | were pushing the idea that the risk
00:59:04.080 | of a zoonotic spillover event causing a major
00:59:07.380 | and highly destructive pandemic was so great
00:59:09.820 | that we had to risk this.
00:59:12.220 | Now, if they themselves have caused it,
00:59:14.380 | and if they are wrong, as I believe they are,
00:59:16.740 | about the likelihood of a major world pandemic
00:59:19.220 | spilling out of nature in the way that they wrote
00:59:22.700 | into their grant applications,
00:59:24.620 | then the danger is, you know,
00:59:26.260 | the call is coming from inside the house.
00:59:27.980 | And we have to look at that.
00:59:31.200 | And yes, whatever we have to do to protect scientists
00:59:35.540 | from retribution, we should do.
00:59:38.260 | But protecting them by lying to the world,
00:59:42.880 | and even worse, by demonizing people like me,
00:59:47.880 | like Josh Rogin, like Yuri Deigin,
00:59:56.140 | the entire DRASTIC group on Twitter,
00:59:58.940 | by demonizing us for simply following the evidence
01:00:02.940 | is to set a terrible precedent, right?
01:00:05.700 | You're demonizing people for using the scientific method
01:00:08.380 | to evaluate evidence that is available to us in the world.
01:00:11.940 | What a terrible crime it is to teach that lesson, right?
01:00:16.020 | Thou shalt not use scientific tools?
01:00:18.060 | No, I'm sorry.
01:00:19.300 | Whatever your license to lie is, it doesn't extend to that.
01:00:22.540 | - Yeah, I've seen the attacks on you,
01:00:25.420 | the pressure on you has a very important effect
01:00:28.460 | on thousands of world-class biologists, actually.
01:00:34.540 | So at MIT, colleagues of mine, people I know,
01:00:40.020 | there's a slight pressure to not be allowed
01:00:44.100 | to one, speak publicly, and two, actually think.
01:00:49.100 | Like, to even think about these ideas.
01:00:53.580 | It sounds kind of ridiculous, but just in the privacy
01:00:57.500 | of your own home, to read things, to think,
01:01:01.540 | it's many people, many world-class biologists that I know
01:01:06.900 | will just avoid looking at the data.
01:01:10.700 | There's not even that many people that are publicly opposed
01:01:15.060 | to gain-of-function research.
01:01:15.060 | They're also like, it's not worth it.
01:01:18.820 | It's not worth the battle.
01:01:20.100 | And there's many people that kind of argue
01:01:21.660 | that those battles should be fought in private,
01:01:24.740 | with colleagues in the privacy of the scientific community,
01:01:31.260 | that the public is somehow not maybe intelligent enough
01:01:35.460 | to be able to deal with the complexities
01:01:38.260 | of this kind of discussion.
01:01:39.740 | I don't know, but the final result is,
01:01:41.940 | combined with the bullying of you
01:01:44.060 | and all the different pressures
01:01:47.060 | in the academic institutions,
01:01:48.460 | is that it's just people are self-censoring
01:01:51.220 | and silencing themselves.
01:01:53.380 | And silencing, the most important thing,
01:01:55.180 | which is the power of their brains.
01:01:57.100 | Like, these people are brilliant.
01:02:01.520 | And the fact that they're not utilizing their brain
01:02:04.500 | to come up with solutions
01:02:06.700 | outside of the conformist line of thinking is tragic.
01:02:11.500 | - Well, it is.
01:02:12.540 | I also think that we have to look at it
01:02:15.860 | and understand it for what it is.
01:02:17.740 | For one thing, it's kind of a cryptic totalitarianism.
01:02:20.860 | Somehow, people's sense of what they're allowed
01:02:23.500 | to think about, talk about, discuss,
01:02:25.740 | is causing them to self-censor.
01:02:27.340 | And I can tell you,
01:02:28.460 | it's causing many of them to rationalize,
01:02:30.340 | which is even worse.
01:02:31.220 | They are blinding themselves to what they can see.
01:02:34.620 | But it is also the case, I believe,
01:02:36.860 | that what you're describing about what people said,
01:02:40.140 | and a great many people understood
01:02:43.020 | that the lab leak hypothesis
01:02:45.020 | could not be taken off the table,
01:02:47.020 | but they didn't say so publicly.
01:02:48.860 | And I think that their discussions with each other
01:02:52.760 | about why they did not say what they understood,
01:02:55.660 | that's what capture sounds like on the inside.
01:02:59.160 | I don't know exactly what force captured the institutions.
01:03:02.900 | I don't think anybody knows for sure out here in public.
01:03:07.900 | I don't even know that it wasn't just simply a process.
01:03:10.020 | But you have these institutions.
01:03:13.620 | They are behaving towards a kind of somatic obligation.
01:03:18.620 | They have lost sight of what they were built to accomplish.
01:03:22.780 | And on the inside,
01:03:24.780 | the way they avoid going back to their original mission
01:03:28.420 | is to say things to themselves,
01:03:30.120 | like the public can't have this discussion,
01:03:32.580 | it can't be trusted with it.
01:03:34.180 | Yes, we need to be able to talk about this,
01:03:35.740 | but it has to be private.
01:03:36.760 | Whatever it is they say to themselves,
01:03:38.100 | that is what capture sounds like on the inside.
01:03:40.280 | It's an institutional rationalization mechanism.
01:03:44.420 | And it's very, very deadly.
01:03:46.420 | And at the point you go from lab leak to repurposed drugs,
01:03:50.600 | you can see that it's very deadly in a very direct way.
01:03:54.660 | - Yeah, I see this in my field
01:03:57.220 | with things like autonomous weapon systems.
01:04:01.720 | People in AI do not talk about the use of AI
01:04:04.460 | in weapon systems.
01:04:05.860 | They kind of avoid the idea that AI is used in the military.
01:04:09.780 | It's kind of funny.
01:04:11.460 | There's this like kind of discomfort.
01:04:13.060 | And they all hurry away,
01:04:14.420 | like something scary happens
01:04:17.180 | and a bunch of sheep kind of run away.
01:04:19.260 | That's what it looks like.
01:04:21.300 | And I don't even know what to do about it.
01:04:23.620 | And then I feel this natural pull
01:04:26.420 | every time I bring up autonomous weapon systems
01:04:29.220 | to go along with the sheep.
01:04:30.700 | There's a natural kind of pull towards that direction
01:04:33.640 | because it's like, what can I do as one person?
01:04:36.200 | Now, there's currently nothing destructive
01:04:39.820 | happening with autonomous weapon systems.
01:04:42.040 | So we're like in the early days of this race
01:04:44.960 | that in 10, 20 years might become a real problem.
01:04:48.140 | Now, with the discussion we're having now,
01:04:50.440 | we're facing the result of that in the space of viruses,
01:04:55.400 | after many years of avoiding the conversations here.
01:05:00.480 | I don't know what to do about that in the early days,
01:05:03.560 | but I think we have to, I guess,
01:05:04.920 | create institutions where people can stand out.
01:05:08.280 | People can stand out and like basically be individual
01:05:12.040 | thinkers and break out into all kinds of spaces of ideas
01:05:16.560 | that allow us to think freely, freedom of thought.
01:05:19.600 | And maybe that requires a decentralization of institutions.
01:05:22.840 | - Well, years ago, I came up with a concept
01:05:26.040 | called cultivated insecurity.
01:05:28.880 | And the idea is, let's just take the example
01:05:31.640 | of the average Joe, right?
01:05:34.040 | The average Joe has a job somewhere and their mortgage,
01:05:39.040 | their medical insurance, their retirement,
01:05:44.180 | their connection with the economy is to one degree
01:05:48.200 | or another dependent on their relationship
01:05:51.120 | with the employer.
01:05:52.520 | That means that there is a strong incentive,
01:05:57.300 | especially in any industry where it's not easy
01:06:00.240 | to move from one employer to the next,
01:06:01.960 | there's a strong incentive to stay
01:06:05.000 | in your employer's good graces, right?
01:06:06.960 | So it creates a very top-down dynamic,
01:06:09.160 | not only in terms of who gets to tell other people
01:06:13.600 | what to do, but it really comes down to
01:06:16.080 | who gets to tell other people how to think.
01:06:18.760 | So that's extremely dangerous.
01:06:21.200 | The way out of it is to cultivate security
01:06:25.460 | to the extent that somebody is in a position
01:06:28.320 | to go against the grain and have it not be a catastrophe
01:06:32.440 | for their family and their ability to earn,
01:06:34.920 | you will see that behavior a lot more.
01:06:36.540 | So I would argue that some of what you're talking about
01:06:38.680 | is just a simple, predictable consequence
01:06:41.880 | of the concentration of the sources of well-being
01:06:46.880 | and that this is a solvable problem.
01:06:50.500 | - You got a chance to talk with Joe Rogan yesterday?
01:06:55.320 | - Yes, I did.
01:06:56.560 | - And I just saw the episode was released
01:06:59.300 | and Ivermectin is trending on Twitter.
01:07:03.200 | Joe told me it was an incredible conversation.
01:07:06.020 | I look forward to listening to it today.
01:07:07.380 | Many people have probably, by the time this is released,
01:07:10.660 | have already listened to it.
01:07:12.060 | I think it would be interesting to discuss a postmortem.
01:07:18.100 | How do you feel how the conversation went?
01:07:21.060 | And maybe broadly, how do you see the story
01:07:25.600 | as it's unfolding of Ivermectin from the origins
01:07:30.400 | from before COVID-19 through 2020 to today?
01:07:34.980 | - I very much enjoyed talking to Joe
01:07:36.800 | and I'm indescribably grateful
01:07:41.380 | that he would take the risk of such a discussion,
01:07:44.360 | that he would, as he described it,
01:07:46.440 | do an emergency podcast on the subject,
01:07:49.400 | which I think that was not an exaggeration.
01:07:52.060 | This needed to happen for various reasons.
01:07:55.400 | That he took us down the road of talking
01:07:59.280 | about the censorship campaign against Ivermectin,
01:08:01.920 | which I find utterly shocking,
01:08:04.840 | and talking about the drug itself.
01:08:07.160 | And I should say we had Pierre Kory available.
01:08:10.880 | He came on the podcast as well.
01:08:12.760 | He is, of course, the face of the FLCCC,
01:08:17.080 | the Frontline COVID-19 Critical Care Alliance.
01:08:20.440 | These are doctors who have innovated ways
01:08:23.740 | of treating COVID patients, and they happened on Ivermectin,
01:08:26.720 | and have been using it.
01:08:29.560 | And I hesitate to use the word advocating for it,
01:08:32.960 | because that's not really the role of doctors or scientists,
01:08:36.200 | but they are advocating for it in the sense
01:08:38.280 | that there is this pressure not to talk
01:08:41.400 | about its effectiveness for reasons that we can go into.
01:08:44.880 | - So maybe step back and say what is Ivermectin,
01:08:48.680 | and how much studies have been done
01:08:52.060 | to show its effectiveness?
01:08:53.540 | - So Ivermectin is an interesting drug.
01:08:56.480 | It was discovered in the '70s
01:08:58.940 | by a Japanese scientist named Satoshi Omura,
01:09:03.380 | and he found it in soil near a Japanese golf course.
01:09:08.200 | So I would just point out in passing
01:09:10.480 | that if we were to stop self-silencing
01:09:12.980 | over the possibility that Asians will be demonized
01:09:17.320 | over the possible lab leak in Wuhan,
01:09:20.000 | and to recognize that actually the natural course
01:09:22.600 | of the story has a likely lab leak in China,
01:09:27.600 | it has an unlikely hero in Japan,
01:09:32.060 | the story is naturally not a simple one.
01:09:36.800 | But in any case, Omura discovered this molecule.
01:09:41.040 | He sent it to a friend who was at Merck,
01:09:45.700 | a scientist named Campbell.
01:09:47.020 | They won a Nobel Prize for the discovery
01:09:50.620 | of the Ivermectin molecule in 2015.
01:09:55.140 | Its initial use was in treating parasitic infections.
01:09:58.740 | It's very effective in treating the worm
01:10:02.620 | that causes river blindness,
01:10:05.060 | the pathogen that causes elephantiasis, scabies.
01:10:08.860 | It's a very effective anti-parasite drug.
01:10:11.020 | It's extremely safe.
01:10:12.140 | It's on the WHO's list of essential medications.
01:10:14.780 | It's safe for children.
01:10:17.020 | It has been administered something like four billion times
01:10:20.280 | in the last four decades.
01:10:22.420 | It has been given away in the millions of doses
01:10:25.620 | by Merck in Africa.
01:10:27.980 | People have been on it for long periods of time,
01:10:30.940 | and in fact, one of the reasons that Africa
01:10:33.300 | may have had less severe impacts from COVID-19
01:10:36.900 | is that Ivermectin is widely used there
01:10:39.380 | to prevent parasites,
01:10:40.940 | and the drug appears to have a long-lasting impact.
01:10:43.340 | So it's an interesting molecule.
01:10:45.700 | It was discovered some time ago, apparently,
01:10:49.540 | that it has antiviral properties,
01:10:51.140 | and so it was tested early in the COVID-19 pandemic
01:10:54.880 | to see if it might work to treat humans with COVID.
01:10:58.760 | It turned out to have very promising evidence
01:11:02.280 | that it did treat humans.
01:11:03.660 | It was tested in tissues.
01:11:05.140 | It was tested at a very high dosage, which confuses people.
01:11:08.900 | They think that those of us who believe
01:11:11.020 | that Ivermectin might be useful in confronting this disease
01:11:14.780 | are advocating those high doses, which is not the case.
01:11:17.540 | But in any case, there have been quite a number of studies.
01:11:21.920 | A wonderful meta-analysis was finally released.
01:11:25.000 | We had seen it in preprint version,
01:11:26.680 | but it was finally peer-reviewed
01:11:28.140 | and published this last week.
01:11:30.500 | It reveals that the drug,
01:11:33.460 | as clinicians have been telling us,
01:11:35.280 | those who have been using it,
01:11:36.380 | it's highly effective at treating people with the disease,
01:11:38.740 | especially if you get to them early,
01:11:40.620 | and it showed an 86% effectiveness as a prophylactic
01:11:44.540 | to prevent people from contracting COVID.
01:11:47.540 | And that number, 86%, is high enough
01:11:50.780 | to drive SARS-CoV-2 to extinction if we wished to deploy it.
01:11:55.780 | - First of all, the meta-analysis,
01:11:59.420 | is this the Ivermectin for COVID-19
01:12:02.580 | real-time meta-analysis of 60 studies?
01:12:05.280 | Or there's a bunch of meta-analysis there?
01:12:07.340 | 'Cause I was really impressed by the real-time meta-analysis
01:12:10.220 | that keeps getting updated.
01:12:12.340 | I don't know if it's the same kind.
01:12:13.180 | - The one at ivmmeta.com?
01:12:18.060 | - Well, I saw it, it's c19ivermectin.com.
01:12:21.820 | - No, this is not that meta-analysis.
01:12:24.020 | So that is, as you say, a living meta-analysis
01:12:26.140 | where you can watch as evidence rolls in.
01:12:27.820 | - Which is super cool, by the way.
01:12:29.100 | - It's really cool, and they've got some really nice graphics
01:12:32.220 | that allow you to understand, well, what is the evidence?
01:12:34.660 | You know, it's concentrated around this level
01:12:36.660 | of effectiveness, et cetera.
01:12:38.220 | So anyway, it's a great site,
01:12:39.300 | well worth paying attention to.
01:12:40.920 | No, this is a meta-analysis.
01:12:42.720 | I don't know any of the authors, but one.
01:12:46.420 | The second author is Tess Lawrie of the BIRD Group,
01:12:49.420 | BIRD being a group of analysts and doctors in Britain
01:12:54.420 | that is playing a role similar to the FLCCC here in the US.
01:12:59.620 | So anyway, this is a meta-analysis
01:13:02.060 | that Tess Lawrie and others did
01:13:06.260 | of all of the available evidence,
01:13:08.140 | and it's quite compelling.
01:13:10.740 | People can look for it on my Twitter.
01:13:12.880 | I will put it up and people can find it there.
01:13:15.160 | - So what about dose here?
01:13:17.420 | In terms of safety, what do we understand
01:13:22.300 | about the kind of dose required
01:13:23.820 | to have that level of effectiveness,
01:13:26.860 | and what do we understand about the safety
01:13:29.180 | of that kind of dose?
01:13:30.220 | - So let me just say, I'm not a medical doctor.
01:13:32.580 | I'm a biologist.
01:13:34.460 | I'm on ivermectin in lieu of vaccination.
01:13:39.240 | In terms of dosage, there is one reason for concern,
01:13:42.540 | which is that the most effective dose for prophylaxis
01:13:45.800 | involves something like weekly administration,
01:13:49.500 | and that because that is not a historical pattern
01:13:53.120 | of use for the drug, it is possible
01:13:56.420 | that there is some long-term implication
01:13:58.060 | of being on it weekly for a long period of time.
01:14:02.500 | There's not a strong indication of that
01:14:04.140 | in the safety signal that we have from people
01:14:07.060 | using the drug over many years and using it in high doses.
01:14:10.020 | In fact, Dr. Corey told me yesterday
01:14:12.300 | that there are cases in which people
01:14:15.340 | have made calculation errors
01:14:17.660 | and taken a massive overdose of the drug
01:14:19.820 | and had no ill effect.
01:14:21.440 | So anyway, there's lots of reasons
01:14:23.320 | to think the drug is comparatively safe,
01:14:24.860 | but no drug is perfectly safe,
01:14:27.060 | and I do worry about the long-term implications
01:14:29.740 | of taking it.
01:14:30.760 | I also think it's very likely,
01:14:32.580 | because the drug is administered in a dose of
01:14:39.500 | something like, let's say, 15 milligrams for somebody
01:14:43.060 | my size once a week, after you've gone through
01:14:45.740 | the initial double dose that you take 48 hours apart,
01:14:50.380 | that if the amount of drug in your system
01:14:55.340 | is sufficient to be protective at the end of the week,
01:14:58.260 | then it was probably far too high
01:15:00.100 | at the beginning of the week.
01:15:01.260 | So there's a question about whether or not
01:15:03.300 | you could flatten out the intake
01:15:05.900 | so that the amount of ivermectin goes down,
01:15:09.740 | but the protection remains.
01:15:10.900 | I have little doubt that that would be discovered
01:15:13.140 | if we looked for it.
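To make the dosing intuition above concrete, here is a rough sketch in Python of how peak and trough drug levels behave under weekly dosing versus the same total amount split into daily doses, assuming simple first-order elimination. The 18-hour half-life, the 15 mg figure, and the one-compartment model are illustrative assumptions, not something stated in the conversation and not clinical guidance.

```python
# Illustrative only: weekly dosing vs. the same weekly total split daily,
# under first-order elimination. Half-life and dose are assumed numbers.
import math

HALF_LIFE_H = 18.0                     # assumed elimination half-life, in hours
DECAY = math.log(2) / HALF_LIFE_H      # first-order decay constant

def peak_and_trough(dose_mg: float, interval_h: float, days: int = 28):
    """Peak and trough body burden after repeated dosing (instant absorption)."""
    amount = peak = trough = 0.0
    for _ in range(int(days * 24 / interval_h)):
        amount += dose_mg                        # take a dose
        peak = amount
        amount *= math.exp(-DECAY * interval_h)  # decay until the next dose
        trough = amount
    return peak, trough

weekly = peak_and_trough(15.0, 7 * 24)   # 15 mg once a week
daily = peak_and_trough(15.0 / 7, 24)    # same weekly total, split daily

print("weekly dose: peak %.2f mg, trough %.3f mg" % weekly)
print("daily split: peak %.2f mg, trough %.3f mg" % daily)
```

Under these assumptions the split schedule ends each interval with a much higher trough while the peak drops severalfold, which is the "flatten out the intake" idea: less drug on board at any moment, but protection that does not sag as far before the next dose.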
01:15:14.140 | But that said, it does seem to be quite safe,
01:15:18.860 | highly effective at preventing COVID.
01:15:20.980 | The 86% number is plenty high enough
01:15:23.940 | for us to drive SARS-CoV-2 to extinction
01:15:27.660 | in light of its R0 number of slightly more than two.
01:15:31.380 | And so why we are not using it is a bit of a mystery.
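As a sanity check on the arithmetic implied by that claim, here is a minimal sketch that uses the textbook herd-immunity threshold 1 - 1/R0 and takes the quoted figures (an R0 slightly above two, 86% prophylactic effectiveness) at face value. It deliberately ignores waning protection, imperfect adherence, and variants, so it only shows the shape of the claim, not a verdict on it.

```python
# Back-of-the-envelope check of the "86% is enough" claim, using the
# classic threshold 1 - 1/R0. The inputs are the numbers quoted above.
def coverage_needed(r0: float, efficacy: float) -> float:
    """Uptake required so the protected fraction exceeds the threshold."""
    threshold = 1.0 - 1.0 / r0      # herd-immunity threshold
    return threshold / efficacy     # required uptake at this efficacy

r0 = 2.1                            # "slightly more than two"
for efficacy in (0.86, 0.60, 0.40):
    c = coverage_needed(r0, efficacy)
    verdict = "achievable" if c <= 1.0 else "impossible at any uptake"
    print(f"efficacy {efficacy:.0%}: need ~{c:.0%} uptake ({verdict})")
```

With these inputs, 86% effectiveness clears the bar at roughly 61% uptake, while a 40% effective intervention could not reach the threshold at any uptake; that gap is the sense in which the quoted number is described as "plenty high enough."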
01:15:36.620 | - So even if everything you said now
01:15:39.140 | turns out to be not correct,
01:15:40.920 | it is nevertheless obvious
01:15:44.340 | that it's sufficiently promising,
01:15:46.820 | and it always has been,
01:15:48.020 | in order to merit rigorous scientific exploration,
01:15:52.440 | investigation, doing a lot of studies,
01:15:55.100 | and certainly not censoring the science
01:15:58.480 | or the discussion of it.
01:16:00.220 | So before we talk about the various vaccines for COVID-19,
01:16:05.220 | I'd like to talk to you about censorship.
01:16:08.180 | Given everything you're saying,
01:16:10.100 | why did YouTube and other places
01:16:14.740 | censor discussion of ivermectin?
01:16:21.580 | - Well, there's a question about why they say they did it,
01:16:24.040 | and there's a question about why they actually did it.
01:16:26.760 | Now, it is worth mentioning
01:16:29.320 | that YouTube is part of a consortium.
01:16:32.680 | It is partnered with Twitter, Facebook,
01:16:37.240 | Reuters, AP, Financial Times, Washington Post,
01:16:42.240 | some other notable organizations,
01:16:44.880 | and that this group has appointed itself
01:16:49.920 | the arbiter of truth.
01:16:51.640 | In effect, they have decided to control discussion
01:16:56.160 | ostensibly to prevent the distribution of misinformation.
01:16:59.560 | Now, how have they chosen to do that?
01:17:02.320 | In this case, they have chosen to simply utilize
01:17:06.540 | the recommendations of the WHO and the CDC
01:17:09.900 | and apply them as if they are synonymous
01:17:12.100 | with scientific truth.
01:17:13.200 | Problem, even at their best,
01:17:17.560 | the WHO and CDC are not scientific entities.
01:17:20.640 | They are entities that are about public health,
01:17:24.180 | and public health has this, whether it's right or not,
01:17:28.020 | and I disagree with it,
01:17:30.100 | but it has this self-assigned right to lie
01:17:35.100 | that comes from the fact that there is game theory
01:17:40.080 | that works against, for example,
01:17:41.860 | a successful vaccination campaign,
01:17:44.360 | that if everybody else takes a vaccine
01:17:48.160 | and therefore the herd becomes immune through vaccination,
01:17:51.800 | and you decide not to take a vaccine,
01:17:53.980 | then you benefit from the immunity of the herd
01:17:56.480 | without having taken the risk.
01:17:58.600 | So people who do best are the people who opt out.
01:18:02.100 | That's a hazard, and the WHO and CDC
01:18:04.560 | as public health entities effectively oversimplify stories
01:18:09.560 | in order that that game theory does not cause
01:18:12.800 | a predictable tragedy of the commons.
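The free-rider logic being described can be made concrete with a toy expected-cost model. Every number in it (the R0, the baseline infection risk, the costs, the crude linear risk curve) is invented purely to show the shape of the incentive, not to estimate anything about any real vaccine or disease.

```python
# Toy model of the vaccination free-rider problem. All parameters are
# made-up illustrations of the incentive structure, nothing more.
def infection_risk(coverage: float, r0: float = 2.5,
                   base_risk: float = 0.30, efficacy: float = 0.95) -> float:
    """Crude model: risk falls with coverage and hits zero past 1 - 1/R0."""
    threshold = 1.0 - 1.0 / r0
    effective = coverage * efficacy
    if effective >= threshold:
        return 0.0
    return base_risk * (1.0 - effective / threshold)

def expected_cost(vaccinate: bool, coverage: float,
                  disease_cost: float = 100.0, vaccine_cost: float = 1.0,
                  efficacy: float = 0.95) -> float:
    risk = infection_risk(coverage, efficacy=efficacy)
    if vaccinate:
        return vaccine_cost + (1.0 - efficacy) * risk * disease_cost
    return risk * disease_cost

for coverage in (0.0, 0.3, 0.6, 0.9):
    vax = expected_cost(True, coverage)
    skip = expected_cost(False, coverage)
    better = "vaccinate" if vax < skip else "free-ride"
    print(f"others' coverage {coverage:.0%}: vaccinate={vax:.2f}, opt out={skip:.2f} -> {better}")
```

At low coverage the vaccinated individual does far better; once everyone else has pushed effective coverage past the threshold, the cheapest individual strategy flips to opting out, which is exactly the tragedy-of-the-commons pressure that public health messaging is trying to head off.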
01:18:15.640 | With that said, once that right to lie exists,
01:18:19.280 | then it turns out to serve the interests of,
01:18:23.000 | for example, pharmaceutical companies,
01:18:25.080 | which have emergency use authorizations
01:18:27.140 | that require that there not be a safe and effective treatment
01:18:29.920 | and have immunity from liability
01:18:31.960 | for harms caused by their product.
01:18:34.780 | So that's a recipe for disaster, right?
01:18:37.440 | You don't need to be a sophisticated thinker
01:18:40.980 | about complex systems to see the hazard
01:18:43.160 | of immunizing a company from the harm of its own product
01:18:48.160 | at the same time that that product
01:18:50.360 | can only exist in the market
01:18:52.480 | if some other product that works better
01:18:55.160 | somehow fails to be noticed.
01:18:57.120 | So somehow YouTube is doing the bidding of Merck and others.
01:19:02.120 | Whether it knows that that's what it's doing,
01:19:04.000 | I have no idea.
01:19:05.020 | I think this may be another case of an autopilot
01:19:08.400 | that thinks it's doing the right thing
01:19:09.680 | because it's parroting the corrupt wisdom
01:19:13.000 | of the WHO and the CDC,
01:19:14.360 | but the WHO and the CDC have been wrong
01:19:16.340 | again and again in this pandemic.
01:19:18.080 | And the irony here is that with YouTube coming after me,
01:19:22.240 | well, my channel has been right
01:19:24.560 | where the WHO and CDC have been wrong
01:19:26.680 | consistently over the whole pandemic.
01:19:29.060 | So how is it that YouTube is censoring us
01:19:33.000 | because the WHO and CDC disagree with us
01:19:35.160 | when in fact in past disagreements,
01:19:36.960 | we've been right and they've been wrong?
01:19:38.640 | There's so much to talk about here.
01:19:46.480 | So I've heard this many times, actually,
01:19:48.960 | on the inside of YouTube
01:19:50.960 | and with colleagues that I've talked with:
01:19:55.520 | they kind of, in a very casual way,
01:19:59.600 | say their job is simply to slow
01:19:59.600 | or prevent the spread of misinformation.
01:20:01.880 | And they say like, that's an easy thing to do.
01:20:06.920 | Like to know what is true or not is an easy thing to do.
01:20:11.760 | And so from the YouTube perspective,
01:20:14.280 | I think they basically outsource
01:20:18.280 | the task of knowing what is true or not
01:20:23.080 | to public institutions that on a basic Google search
01:20:28.080 | claim to be the arbiters of truth.
01:20:32.960 | So if you were YouTube, which is exceptionally profitable
01:20:37.960 | and exceptionally powerful in terms of controlling
01:20:43.280 | what people get to see or not, what would you do?
01:20:46.980 | Would you take a stand,
01:20:48.960 | a public stand against the WHO, CDC?
01:20:52.880 | Or would you instead say, you know what,
01:20:57.460 | let's open the dam and let any video on anything fly?
01:21:02.920 | What do you do here?
01:21:04.520 | If, say, you were put,
01:21:05.920 | if Bret Weinstein was put in charge of YouTube for a month
01:21:10.920 | in this most critical of times
01:21:13.200 | where YouTube actually has incredible amounts of power
01:21:16.120 | to educate the populace,
01:21:18.720 | to give power of knowledge to the populace
01:21:22.360 | such that they can reform institutions,
01:21:24.240 | what would you do?
01:21:25.080 | How would you run YouTube?
01:21:26.840 | - Well, unfortunately or fortunately
01:21:29.560 | this is actually quite simple.
01:21:32.360 | The founders, the American founders
01:21:34.680 | settled on a counterintuitive formulation
01:21:37.880 | that people should be free to say anything.
01:21:41.120 | They should be free from the government
01:21:43.360 | blocking them from doing so.
01:21:45.400 | They did not imagine that in formulating that right,
01:21:48.640 | that most of what was said would be of high quality,
01:21:51.480 | nor did they imagine it would be free of harmful things.
01:21:54.520 | What they correctly reasoned was that the benefit
01:21:57.640 | of leaving everything so it can be said
01:22:01.240 | exceeds the cost,
01:22:02.480 | which everyone understands to be substantial.
01:22:04.820 | What I would say is they could not have anticipated
01:22:08.680 | the impact, the centrality of platforms
01:22:13.560 | like YouTube, Facebook, Twitter, et cetera.
01:22:17.000 | If they had, they would not have limited
01:22:20.120 | the First Amendment as they did.
01:22:21.960 | They clearly understood that the power
01:22:24.220 | of the federal government was so great
01:22:27.440 | that it needed to be limited by granting
01:22:31.200 | explicitly the right of citizens to say anything.
01:22:33.840 | In fact, YouTube, Twitter, Facebook
01:22:38.360 | may be more powerful in this moment
01:22:41.000 | than the federal government of their worst nightmares
01:22:43.240 | could have been.
01:22:44.140 | The power that these entities have to control thought
01:22:47.600 | and to shift civilization is so great
01:22:50.280 | that we need to have those same protections.
01:22:52.400 | It doesn't mean that harmful things won't be said,
01:22:54.500 | but it means that nothing has changed
01:22:56.360 | about the cost benefit analysis
01:22:59.000 | of building the right to censor.
01:23:01.040 | So if I were running YouTube,
01:23:03.240 | the limit of what should be allowed
01:23:06.080 | is the limit of the law, right?
01:23:08.360 | If what you are doing is legal,
01:23:10.560 | then it should not be YouTube's place
01:23:12.400 | to limit what gets said or who gets to hear it.
01:23:15.600 | That is between speakers and audience.
01:23:18.000 | Will harm come from that?
01:23:19.280 | Of course it will.
01:23:20.400 | But will net harm come from it?
01:23:22.960 | No, I don't believe it will.
01:23:24.240 | I believe that allowing everything to be said
01:23:26.240 | does allow a process in which better ideas
01:23:29.080 | do come to the fore and win out.
01:23:31.040 | - So you believe that in the end,
01:23:33.800 | when there's complete freedom to share ideas,
01:23:37.580 | that truth will win out.
01:23:39.760 | So what I've noticed, just as a brief side comment,
01:23:44.000 | is that certain things become viral
01:23:48.280 | regardless of their truth.
01:23:49.960 | I've noticed that things that are dramatic
01:23:54.400 | and or funny, like things that become memes
01:23:58.000 | don't have to be grounded in truth.
01:24:00.720 | And so what worries me there
01:24:03.120 | is that we basically maximize for drama
01:24:08.120 | versus maximize for truth
01:24:09.840 | in a system where everything is free.
01:24:12.800 | And that's worrying in the time of emergency.
01:24:16.280 | - Well, yes, it's all worrying in time of emergency,
01:24:18.880 | to be sure.
01:24:19.720 | But I want you to notice that what you've happened on
01:24:22.280 | is actually an analog for a much deeper and older problem.
01:24:26.740 | Human beings are not a blank slate,
01:24:31.500 | but we are the blankest slate that nature has ever devised.
01:24:34.080 | And there's a reason for that, right?
01:24:35.560 | It's where our flexibility comes from.
01:24:37.840 | We are effectively robots in which
01:24:43.200 | a large fraction of the cognitive capacity,
01:24:48.160 | or of the behavioral capacity, has been offloaded
01:24:51.500 | to the software layer, which gets written and rewritten
01:24:54.100 | over evolutionary time.
01:24:56.500 | That means effectively that much of what we are,
01:25:00.960 | in fact, the important part of what we are
01:25:02.680 | is housed in the cultural layer and the conscious layer
01:25:06.300 | and not in the hardware, hard coding layer.
01:25:08.900 | That layer is prone to make errors, right?
01:25:14.300 | And anybody who's watched a child grow up
01:25:17.720 | knows that children make absurd errors all the time, right?
01:25:20.640 | That's part of the process, as we were discussing earlier.
01:25:24.200 | It is also true that as you look across a field
01:25:27.760 | of people discussing things,
01:25:29.580 | a lot of what is said is pure nonsense, it's garbage.
01:25:33.040 | But the tendency of garbage to emerge
01:25:37.480 | and even to spread in the short term
01:25:39.800 | does not say that over the long term,
01:25:41.900 | what sticks is not the valuable ideas.
01:25:45.920 | So there is a high tendency for novelty to be created
01:25:50.440 | in the cultural space, but there's also a high tendency
01:25:52.520 | for it to go extinct.
01:25:54.000 | And you have to keep that in mind.
01:25:55.280 | It's not like the genome, right?
01:25:57.120 | Everything is happening at a much higher rate.
01:25:59.440 | Things are being created, they're being destroyed.
01:26:01.500 | And I can't say that, I mean, obviously,
01:26:04.600 | we've seen totalitarianism arise many times
01:26:08.040 | and it's very destructive each time it does.
01:26:10.600 | So it's not like, hey, freedom to come up
01:26:13.080 | with any idea you want hasn't produced
01:26:15.400 | a whole lot of carnage.
01:26:16.480 | But the question is over time,
01:26:18.600 | does it produce more open, fairer, more decent societies?
01:26:23.100 | And I believe that it does.
01:26:24.600 | I can't prove it, but that does seem to be the pattern.
01:26:27.640 | - I believe so as well.
01:26:29.640 | The thing is in the short term, freedom of speech,
01:26:34.640 | absolute freedom of speech can be quite destructive,
01:26:38.760 | but you nevertheless have to hold on to that
01:26:42.560 | because in the long term,
01:26:44.080 | I think you and I, I guess, are optimistic
01:26:47.600 | in the sense that good ideas will win out.
01:26:51.480 | I don't know how strongly I believe that it will work,
01:26:54.780 | but I will say I haven't heard a better idea.
01:26:56.780 | (both laughing)
01:26:58.180 | - Yeah.
01:26:59.020 | - I would also point out that there's something
01:27:01.720 | very significant in this question of the hubris involved
01:27:06.300 | in imagining that you're going to improve
01:27:08.320 | the discussion by censoring,
01:27:10.300 | which is the majority of concepts at the fringe are nonsense.
01:27:15.300 | That's automatic.
01:27:19.540 | But the heterodoxy at the fringe,
01:27:23.220 | which is indistinguishable at the beginning
01:27:25.920 | from the nonsense ideas, is the key to progress.
01:27:30.300 | So if you decide, hey, the fringe is 99% garbage,
01:27:34.180 | let's just get rid of it, right?
01:27:35.820 | Hey, that's a strong win.
01:27:36.860 | We're getting rid of 99% garbage for 1% something or other.
01:27:40.620 | And the point is, yeah, but that 1% something or other
01:27:42.700 | is the key.
01:27:43.760 | You're throwing out the key.
01:27:45.460 | And so that's what YouTube is doing.
01:27:48.320 | Frankly, I think at the point that it started censoring
01:27:50.860 | my channel, in the immediate aftermath
01:27:53.880 | of this major reversal on the lab leak,
01:27:56.660 | it should have looked at itself and said,
01:27:57.880 | well, what the hell are we doing?
01:27:59.500 | Who are we censoring?
01:28:00.340 | We're censoring somebody who was just right, right?
01:28:03.260 | In a conflict with the very same people
01:28:05.300 | on whose behalf we are now censoring, right?
01:28:07.540 | That should have caused them to wake up.
01:28:09.380 | - So you said one approach if you're on YouTube
01:28:11.420 | is just basically let all videos go
01:28:15.540 | that do not violate the law.
01:28:16.900 | - Well, I should fix that, okay?
01:28:18.540 | I believe that that is the basic principle.
01:28:20.740 | Eric makes an excellent point about the distinction
01:28:23.480 | between ideas and personal attacks,
01:28:26.780 | doxing, these other things.
01:28:28.460 | So I agree, there's no value in allowing people
01:28:31.460 | to destroy each other's lives,
01:28:33.220 | even if there's a technical legal defense for it.
01:28:36.820 | Now, how you draw that line, I don't know.
01:28:39.140 | But what I'm talking about is, yes,
01:28:41.960 | people should be free to traffic in bad ideas,
01:28:44.100 | and they should be free to expose that the ideas are bad.
01:28:47.180 | And hopefully that process results
01:28:49.060 | in better ideas winning out.
01:28:50.680 | - Yeah, there's an interesting line between ideas,
01:28:55.300 | like the earth is flat,
01:28:56.660 | which I believe you should not censor.
01:29:00.020 | And then you start to encroach on personal attacks.
01:29:04.420 | So not doxing, yes, but not even getting to that.
01:29:08.480 | There's a certain point where it's like,
01:29:10.740 | that's no longer ideas,
01:29:13.460 | that's somehow not productive, whereas
01:29:18.460 | it feels like believing the earth is flat
01:29:20.940 | is somehow still productive,
01:29:22.400 | because maybe there's a tiny percent chance it is.
01:29:25.820 | It just feels like personal attacks aren't,
01:29:31.420 | well, I'm torn on this because there's assholes
01:29:35.580 | in this world, there's fraudulent people in this world.
01:29:37.820 | So sometimes personal attacks are useful to reveal that,
01:29:40.740 | but there's a line you can cross.
01:29:44.500 | Like there's a comedy where people make fun of others.
01:29:48.420 | I think that's amazing, that's very powerful,
01:29:50.780 | and that's very useful, even if it's painful.
01:29:53.140 | But then there's like, once it gets to be,
01:29:55.780 | yeah, there's a certain line,
01:29:58.420 | it's a gray area where you cross,
01:30:00.100 | where it's no longer, in any possible world, productive.
01:30:04.900 | And that's a really weird gray area for YouTube
01:30:08.340 | to operate in, and it feels like it should be
01:30:10.300 | a crowdsourced thing where people vote on it,
01:30:13.800 | but then again, do you trust the majority to vote
01:30:16.560 | on what is crossing the line and not?
01:30:19.060 | I mean, this is where, this is really interesting
01:30:23.220 | on this particular, like the scientific aspect of this.
01:30:27.220 | Do you think YouTube should take more of a stance,
01:30:30.980 | not censoring, but to actually have scientists
01:30:35.780 | within YouTube having these kinds of discussions,
01:30:39.020 | and then be able to almost speak out in a transparent way,
01:30:42.180 | this is what we're going to let this video stand,
01:30:45.180 | but here's all these other opinions.
01:30:47.580 | Almost like take a more active role
01:30:49.820 | in its recommendation system
01:30:52.460 | in trying to present a full picture to you.
01:30:55.380 | Right now they're not, the recommender systems
01:30:58.540 | are not human fine-tuned.
01:31:01.020 | They're all based on how you click,
01:31:03.100 | and there's this clustering algorithms.
01:31:05.460 | They're not taking an active role
01:31:07.140 | on giving you the full spectrum of ideas
01:31:09.360 | in the space of science.
01:31:11.060 | They just censor or not.
01:31:12.980 | - Well, at the moment, it's gonna be pretty hard
01:31:16.340 | to convince me that these people should be trusted
01:31:18.800 | with any sort of curation or comment
01:31:22.480 | on matters of evidence, because they have demonstrated
01:31:26.060 | that they are incapable of doing it well.
01:31:28.100 | You could make such an argument,
01:31:30.900 | and I guess I'm open to the idea of institutions
01:31:34.260 | that would look something like YouTube,
01:31:36.300 | that would be capable of offering something valuable.
01:31:39.060 | I mean, and even just the fact of them
01:31:41.300 | literally curating things and putting some videos
01:31:43.700 | next to others implies something.
01:31:47.580 | So yeah, there's a question to be answered,
01:31:49.540 | but at the moment, no.
01:31:51.720 | At the moment, what it is doing
01:31:53.340 | is quite literally putting not only individual humans
01:31:57.340 | in tremendous jeopardy by censoring discussion
01:32:00.740 | of useful tools and making tools that are more hazardous
01:32:04.500 | than has been acknowledged seem safe, right?
01:32:07.460 | But it is also placing humanity in danger
01:32:10.560 | of a permanent relationship with this pathogen.
01:32:13.580 | I cannot emphasize enough how expensive that is.
01:32:16.820 | It's effectively incalculable.
01:32:18.800 | If the relationship becomes permanent,
01:32:20.500 | the number of people who will ultimately
01:32:22.740 | suffer and die from it is indefinitely large.
01:32:26.100 | - Yeah, there's currently the algorithm
01:32:28.140 | is very rabbit hole driven,
01:32:30.220 | meaning if you click on the flat earth videos,
01:32:35.220 | that's all you're going to be presented with,
01:32:38.620 | and you're not going to be nicely presented
01:32:41.020 | with arguments against the flat earth.
01:32:42.920 | And the flip side of that,
01:32:44.580 | if you watch like quantum mechanics videos,
01:32:48.580 | or no, general relativity videos,
01:32:50.820 | it's very rare you're going to get a recommendation,
01:32:53.180 | have you considered the earth is flat?
01:32:54.980 | And I think you should have both.
01:32:57.580 | Same with vaccine, videos that present the power
01:33:00.860 | and the incredible like biology, genetics,
01:33:04.660 | virology about the vaccine,
01:33:06.720 | you're rarely going to get videos
01:33:09.860 | from well-respected scientific minds
01:33:14.460 | presenting possible dangers of the vaccine.
01:33:16.740 | And the vice versa is true as well,
01:33:19.180 | which is if you're looking at the dangers
01:33:21.460 | of the vaccine on YouTube,
01:33:23.220 | you're not going to get the highest quality
01:33:25.660 | of videos recommended to you.
01:33:27.180 | And I'm not talking about like manually inserted CDC videos
01:33:30.900 | that are like the most untrustworthy things
01:33:33.740 | you can possibly watch about how everybody
01:33:36.020 | should take the vaccine, it's the safest thing ever.
01:33:38.440 | No, it's about incredible, again, MIT colleagues of mine,
01:33:42.100 | incredible biologists, virologists that talk about
01:33:45.340 | the details of how the mRNA vaccines work
01:33:49.140 | and all those kinds of things.
01:33:50.540 | I think maybe this is me with the AI hat on,
01:33:55.500 | but I think the algorithm can fix a lot of this
01:33:58.140 | and YouTube should build better algorithms
01:34:00.540 | and trust that,
01:34:02.740 | and couple it with complete freedom of speech
01:34:06.380 | to expand what people are able to think about,
01:34:10.900 | always present varied views,
01:34:12.540 | not balanced in some artificial,
01:34:15.060 | hard-coded way, but balanced in a way that's crowdsourced.
01:34:18.780 | I think that's an algorithm problem that can be solved
01:34:21.540 | because then you can delegate it to the algorithm
01:34:25.500 | as opposed to this hard code censorship
01:34:29.600 | of basically creating artificial boundaries
01:34:34.340 | on what can and can't be discussed.
01:34:36.140 | Instead, creating a full spectrum of exploration
01:34:39.860 | that can be done and trusting the intelligence of people
01:34:43.220 | to do the exploration.
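One way to picture the kind of algorithmic fix being gestured at here is a re-ranker that trades engagement against viewpoint diversity, loosely in the spirit of maximal marginal relevance. This is a toy sketch, not a description of YouTube's actual recommender; the video titles, engagement scores, and viewpoint tags are all invented for illustration.

```python
# Illustrative diversity-aware re-ranking: greedily pick items, penalizing
# viewpoints that have already been shown. Data and labels are invented.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    engagement: float   # what a pure engagement ranker would optimize
    viewpoint: str      # crude stand-in for a topic/perspective cluster

CANDIDATES = [
    Video("mRNA mechanism explained by a virologist", 0.70, "pro"),
    Video("Vaccine rollout news roundup", 0.65, "pro"),
    Video("Known side-effect data, discussed carefully", 0.40, "critical"),
    Video("Open questions a scientist still has", 0.35, "critical"),
    Video("Celebrity hot take", 0.90, "pro"),
]

def rerank(videos, k=3, diversity_weight=0.5):
    """Greedy selection of k videos with a penalty for repeated viewpoints."""
    chosen, shown, pool = [], set(), list(videos)
    for _ in range(k):
        best = max(pool, key=lambda v: v.engagement
                   - (diversity_weight if v.viewpoint in shown else 0.0))
        chosen.append(best)
        shown.add(best.viewpoint)
        pool.remove(best)
    return chosen

for v in rerank(CANDIDATES):
    print(f"{v.engagement:.2f} [{v.viewpoint}] {v.title}")
```

A pure engagement sort would surface three "pro" videos here; the diversity penalty pulls in at least one carefully argued critical video without any hard-coded editorial verdict. In a real system, the viewpoint clusters could come from the kind of crowdsourced or learned signals the passage imagines rather than from hand labels.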
01:34:45.320 | - Well, there's a lot there.
01:34:47.080 | I would say we have to keep in mind
01:34:49.280 | that we're talking about a publicly held company
01:34:53.680 | with shareholders and obligations to them
01:34:55.920 | and that that may make it impossible.
01:34:57.880 | And I remember many years ago,
01:35:01.560 | back in the early days of Google,
01:35:03.840 | I remember a sense of terror
01:35:06.520 | at the loss of general search.
01:35:10.560 | All right, it used to be that Google,
01:35:13.000 | if you searched, came up with the same thing for everyone.
01:35:16.120 | And then it got personalized.
01:35:18.520 | And for a while, it was possible
01:35:19.880 | to turn off the personalization,
01:35:21.480 | which was still not great
01:35:22.780 | because if everybody else is looking
01:35:24.160 | at a personalized search
01:35:25.640 | and you can tune into one that isn't personalized,
01:35:28.200 | that doesn't tell you why the world
01:35:31.040 | is sounding the way it is.
01:35:33.040 | But nonetheless, it was at least an option.
01:35:34.760 | And then that vanished.
01:35:35.960 | And the problem is,
01:35:37.520 | I think this is literally deranging us.
01:35:40.360 | That in effect, I mean,
01:35:42.640 | what you're describing is unthinkable.
01:35:44.400 | It is unthinkable that in the face of a campaign
01:35:48.200 | to vaccinate people in order to reach herd immunity,
01:35:51.780 | that YouTube would give you videos on hazards of vaccines
01:36:00.520 | when, you know,
01:36:00.520 | how hazardous the vaccines are is an unsettled question.
01:36:04.000 | - Why is it unthinkable?
01:36:06.000 | That doesn't make any sense.
01:36:07.440 | From a company perspective,
01:36:09.560 | if intelligent people in large amounts
01:36:14.560 | are open-minded and are thinking through the hazards
01:36:19.080 | and the benefits of a vaccine,
01:36:21.520 | a company should find the best videos
01:36:25.400 | to present what people are thinking about.
01:36:28.720 | - Well, let's come up with a hypothetical.
01:36:30.800 | Okay, let's come up with a very deadly disease
01:36:34.800 | for which there's a vaccine that is very safe,
01:36:37.760 | though not perfectly safe.
01:36:40.080 | And we are then faced with YouTube
01:36:43.000 | trying to figure out what to do
01:36:44.400 | for somebody searching on vaccine safety.
01:36:47.320 | Suppose it is necessary in order to drive the pathogen
01:36:50.760 | to extinction, something like smallpox,
01:36:53.160 | that people get on board with the vaccine.
01:36:55.360 | But there's a tiny fringe of people
01:36:59.280 | who thinks that the vaccine is a mind control agent.
01:37:03.360 | All right?
01:37:05.520 | So should YouTube direct people
01:37:09.160 | to the only claim against this vaccine,
01:37:12.600 | which is that it's a mind control agent,
01:37:15.320 | when in fact the vaccine is very safe,
01:37:20.320 | whatever that means.
01:37:22.440 | If that were the actual configuration of the puzzle,
01:37:25.560 | then YouTube would be doing active harm
01:37:28.040 | pointing you to this other video potentially.
01:37:33.040 | Now, yes, I would love to live in a world
01:37:35.120 | where people are up to the challenge of sorting that out.
01:37:39.120 | But my basic point would be,
01:37:41.000 | if it's an evidentiary question,
01:37:43.720 | and there is essentially no evidence
01:37:45.840 | that the vaccine is a mind control agent,
01:37:48.200 | and there's plenty of evidence that the vaccine is safe,
01:37:50.800 | then saying, well, you looked for this video,
01:37:52.760 | we're gonna give you this other one, puts them on a par, right?
01:37:55.360 | So for the mind that's tracking
01:37:57.000 | how much thought there is behind "it's safe"
01:38:00.680 | versus how much thought there is behind
01:38:02.920 | "it's a mind control agent,"
01:38:04.400 | it will result in artificially elevating the latter.
01:38:07.800 | Now, in the current case,
01:38:08.840 | what we've seen is not this at all.
01:38:11.080 | We have seen evidence obscured
01:38:13.800 | in order to create a false story about safety.
01:38:18.080 | And we saw the inverse with ivermectin.
01:38:22.080 | We saw a campaign to portray the drug
01:38:24.800 | as more dangerous and less effective
01:38:28.640 | than the evidence clearly suggested it was.
01:38:30.960 | So we're not talking about a comparable thing,
01:38:33.920 | but I guess my point is the algorithmic solution
01:38:36.040 | that you point to creates a problem of its own,
01:38:39.760 | which is that it means that the way to get exposure
01:38:42.840 | is to generate something fringy.
01:38:44.720 | If you're the only thing on some fringe,
01:38:46.860 | then suddenly YouTube would be recommending those things.
01:38:49.720 | And that's obviously a gameable system at best.
01:38:53.160 | - Yeah, but the solution to that,
01:38:54.880 | I know you're creating a thought experiment,
01:38:57.600 | maybe playing a little bit of a devil's advocate.
01:39:00.800 | I think the solution to that
01:39:01.880 | is not to limit the algorithm
01:39:03.920 | in the case of the super deadly virus.
01:39:05.880 | It's for the scientists to step up
01:39:08.280 | and become better communicators, more charismatic,
01:39:10.960 | to fight the battle of ideas,
01:39:14.080 | to create better videos.
01:39:15.700 | Like if the virus is truly deadly,
01:39:19.240 | you have a lot more ammunition, a lot more data,
01:39:22.080 | a lot more material to work with
01:39:23.720 | in terms of communicating with the public.
01:39:26.680 | So be better at communicating,
01:39:31.000 | and you have to start trusting the intelligence of people
01:39:33.800 | and also being transparent
01:39:35.400 | and playing the game of the internet,
01:39:37.160 | which is like, what is the internet hungry for?
01:39:39.920 | I believe, authenticity.
01:39:42.720 | Stop looking like you're full of shit.
01:39:44.680 | The scientific community,
01:39:47.520 | if there's any flaw that I currently see,
01:39:50.520 | especially in the people that are in public office,
01:39:53.080 | like Anthony Fauci,
01:39:54.520 | it's that they look like they're full of shit.
01:39:56.200 | And I know they're brilliant.
01:39:57.880 | Why don't they look more authentic?
01:39:59.920 | So they're losing that game.
01:40:01.400 | And I think a lot of people observing this entire system now,
01:40:05.120 | younger scientists are seeing this and saying,
01:40:09.360 | "Okay, if I want to continue being a scientist
01:40:12.560 | "in the public eye,
01:40:13.880 | "and I want to be effective at my job,
01:40:16.040 | "I'm going to have to be a lot more authentic."
01:40:18.180 | So they're learning the lesson.
01:40:19.360 | This evolutionary system is working.
01:40:21.240 | So there's just a younger generation of minds coming up
01:40:25.280 | that I think will do a much better job
01:40:27.120 | in this battle of ideas
01:40:28.640 | that when the much more dangerous virus comes along,
01:40:32.840 | they'll be able to be better communicators.
01:40:34.480 | At least that's the hope.
01:40:35.800 | Using the algorithm to control that is,
01:40:40.360 | I feel like is a big problem.
01:40:41.840 | So you're going to have the same problem with a deadly virus
01:40:45.140 | as with the current virus,
01:40:46.940 | if you let YouTube's hard lines be drawn
01:40:50.360 | by the PR and the marketing people
01:40:52.780 | rather than by the broad community of scientists.
01:40:56.200 | - Well, in some sense,
01:40:57.040 | you're suggesting something that's close kin
01:41:00.560 | to what I was saying about freedom of expression
01:41:04.400 | ultimately provides an advantage to better ideas.
01:41:07.800 | So I'm in agreement, broadly speaking.
01:41:10.800 | But I would also say there's probably some sort of,
01:41:13.720 | let's imagine the world that you propose
01:41:15.620 | where YouTube shows you the alternative point of view.
01:41:18.640 | That has the problem that I suggest.
01:41:21.420 | But one thing you could do is you could give us the tools
01:41:24.800 | to understand what we're looking at.
01:41:26.640 | You could give us, so first of all,
01:41:29.080 | there's something I think myopic, solipsistic,
01:41:34.080 | narcissistic about an algorithm that serves shareholders
01:41:38.840 | by showing you what you want to see
01:41:41.000 | rather than what you need to know.
01:41:42.840 | That's the distinction: flattering you,
01:41:45.840 | playing to your blind spot,
01:41:47.640 | is something that the algorithm will figure out,
01:41:49.680 | but it's not healthy for us all
01:41:51.000 | to have Google playing to our blind spot.
01:41:53.480 | It's very, very dangerous.
01:41:54.640 | So what I really want is analytics that allow me,
01:41:59.640 | or maybe options and analytics.
01:42:02.240 | The options should allow me to see
01:42:05.240 | what alternative perspectives are being explored.
01:42:09.200 | So here's the thing I'm searching
01:42:10.680 | and it leads me down this road.
01:42:12.480 | Let's say it's ivermectin.
01:42:14.280 | I find all of this evidence that ivermectin works.
01:42:16.360 | I find all of these discussions
01:42:17.520 | and people talk about various protocols and this and that.
01:42:20.320 | And then I could say, all right, what is the other side?
01:42:24.240 | And I could see who is searching not as individuals,
01:42:28.880 | but what demographics are searching alternatives.
01:42:32.320 | And maybe you could even combine it
01:42:33.880 | with something Reddit-like where, effectively,
01:42:37.360 | let's say that there was a position that,
01:42:39.360 | I don't know, a vaccine is a mind control device,
01:42:44.240 | and you could have a "steel man this argument"
01:42:47.200 | competition, effectively.
01:42:49.560 | And the answers that steel man it
01:42:51.160 | as well as possible would rise to the top.
01:42:53.360 | And so you could read the top three or four explanations
01:42:56.440 | about why this really credibly is a mind control product.
01:43:01.240 | And you can say, well, that doesn't really add up.
01:43:03.400 | I can check these three things myself
01:43:05.080 | and they can't possibly be right.
01:43:07.280 | And you could dismiss it.
01:43:08.200 | And then as an argument that was credible,
01:43:10.080 | let's say plate tectonics before
01:43:12.720 | that was an accepted concept,
01:43:15.120 | you'd say, wait a minute,
01:43:19.400 | there is evidence for plate tectonics.
01:43:19.400 | As crazy as it sounds that the continents
01:43:21.240 | are floating around on liquid,
01:43:23.400 | actually that's not so implausible.
01:43:26.120 | We've got these subduction zones,
01:43:27.960 | we've got a geology that is compatible,
01:43:30.520 | we've got puzzle piece continents
01:43:31.960 | that seem to fit together.
01:43:33.200 | Wow, that's a surprising amount of evidence
01:43:35.600 | for that position.
01:43:36.440 | So I'm gonna file some Bayesian probability with it
01:43:39.120 | that's updated for the fact that actually
01:43:40.800 | the steel man argument is better than I was expecting.
01:43:43.600 | So I could imagine something like that where,
01:43:45.880 | A, I would love the search to be indifferent
01:43:48.040 | to who's searching.
01:43:49.800 | The solipsistic thing is too dangerous.
01:43:51.960 | So the search could be general.
01:43:53.600 | So we would all get a sense for what everybody else
01:43:55.440 | was seeing too.
01:43:57.040 | And then some layer that didn't have anything to do
01:43:59.480 | with what YouTube points you to or not,
01:44:01.960 | but allowed you to see the general pattern
01:44:06.120 | of adherence to searching for information.
01:44:10.640 | And again, a layer in which those things could be defended.
01:44:14.760 | So you could hear what a good argument sounded like
01:44:17.080 | rather than just hear a caricatured argument.
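The Bayesian bookkeeping Bret gestures at above can be made concrete with the standard update rule; the numbers in this sketch are purely illustrative and are not drawn from the conversation.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | E) via Bayes' rule for a binary hypothesis H."""
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1.0 - prior))

# Illustrative numbers only: a fringe claim starts at 1% credibility, and a
# surprisingly strong steel man is assumed 4x likelier if the claim is true.
posterior = bayes_update(prior=0.01, p_evidence_given_h=0.8,
                         p_evidence_given_not_h=0.2)
print(round(posterior, 3))  # ~0.039: updated upward, but still far from likely
```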
01:44:19.320 | - Yeah, and also reward people, creators,
01:44:22.600 | that have demonstrated like a track record
01:44:24.840 | of open-mindedness and correctness,
01:44:31.040 | as much as it can be measured, over the long term.
01:44:31.040 | And sort of, I mean, a lot of this maps
01:44:36.040 | to incentivizing good long-term behavior,
01:44:41.880 | not immediate kind of dopamine rush kind of signals.
01:44:48.120 | I think ultimately the algorithm on the individual level
01:44:53.120 | should optimize for personal growth, long-term happiness,
01:45:00.880 | just growth intellectually, growth in terms of lifestyle,
01:45:06.480 | personally, and so on, as opposed to immediate.
01:45:10.440 | I think that's going to build a better society,
01:45:12.320 | not even just like truth,
01:45:13.480 | 'cause I think truth is a complicated thing.
01:45:16.200 | It's more just you growing as a person,
01:45:19.320 | exploring the space of ideas, changing your mind often,
01:45:23.200 | increasing the level to which you're open-minded,
01:45:25.360 | the knowledge base you're operating from,
01:45:28.040 | the willingness to empathize with others,
01:45:31.360 | all those kinds of things the algorithm should optimize for.
01:45:34.160 | Like creating a better human at the individual level.
01:45:36.800 | I think that's a great business model
01:45:40.320 | because the person that's using this tool
01:45:44.800 | will then be happier with themselves for having used it
01:45:47.320 | and will be a lifelong quote-unquote customer.
01:45:50.760 | I think it's a great business model
01:45:52.480 | to make a happy, open-minded,
01:45:55.840 | knowledgeable, better human being.
01:45:58.480 | - It's a terrible business model under the current system.
01:46:02.400 | What you want is to build the system
01:46:04.320 | in which it is a great business model.
01:46:05.640 | - Why is it a terrible model?
01:46:07.680 | - Because it will be decimated by those
01:46:10.560 | who play to the short term.
01:46:11.920 | - I don't think so.
01:46:14.320 | - Why?
01:46:15.160 | - I think we're living it.
01:46:16.160 | We're living it.
01:46:17.320 | - Well, no, because if you have the alternative
01:46:19.440 | that presents itself, it points out
01:46:22.320 | the emperor has no clothes.
01:46:24.000 | I mean, it points out that YouTube is operating in this way,
01:46:27.320 | Twitter's operating in this way,
01:46:28.920 | Facebook is operating in this way.
01:46:30.680 | - How long-term would you like the wisdom to prove out?
01:46:35.000 | - Even a week is better than what's currently happening.
01:46:40.040 | - Right, but the problem is,
01:46:42.200 | if a week loses out to an hour, right?
01:46:45.240 | - I don't think it loses out.
01:46:48.280 | It loses out in the short term.
01:46:49.720 | - That's my point.
01:46:50.560 | - At least you're a great communicator
01:46:52.080 | and you basically say, look, here's the metrics.
01:46:56.240 | A lot of it is how people actually feel.
01:46:58.640 | This is what people experience with social media.
01:47:02.560 | They look back at the previous month and say,
01:47:04.960 | I felt shitty in a lot of days because of social media.
01:47:09.240 | - Right.
01:47:11.160 | - If they look back at the previous few weeks and say,
01:47:14.680 | wow, I'm a better person because that month happened,
01:47:18.240 | they immediately choose the product
01:47:20.800 | that's going to lead to that.
01:47:22.240 | That's what love for products looks like.
01:47:24.560 | If you love, like a lot of people love their Tesla car,
01:47:28.020 | or iPhone, or beautiful design,
01:47:31.960 | that's what love looks like.
01:47:33.080 | You look back, I'm a better person
01:47:35.360 | for having used this thing.
01:47:36.600 | - Well, you gotta ask yourself the question, though.
01:47:38.400 | If this is such a great business model,
01:47:40.280 | why isn't it evolving?
01:47:41.640 | Why don't we see it?
01:47:44.360 | - Honestly, it's competence.
01:47:46.040 | It's not easy to build
01:47:50.760 | products, tools, and systems
01:47:55.560 | on new ideas.
01:47:57.680 | It's kind of a new idea.
01:47:59.160 | We've gone through this.
01:48:00.560 | Everything we're seeing now comes from the ideas
01:48:04.320 | of the initial birth of the internet.
01:48:06.360 | There just needs to be new sets of tools
01:48:08.240 | that are incentivizing long-term personal growth
01:48:12.520 | and happiness, that's it.
01:48:14.000 | - Right, but what we have is a market
01:48:16.640 | that doesn't favor this.
01:48:18.600 | For one thing, we had an alternative to Facebook
01:48:23.160 | that looked, you owned your own data,
01:48:25.960 | it wasn't exploitative,
01:48:27.920 | and Facebook bought a huge interest in it, and it died.
01:48:32.920 | Who do you know who's on Diaspora?
01:48:34.920 | - The execution there was not good.
01:48:37.480 | - Right, but it could have gotten better, right?
01:48:40.400 | - I don't think that the argument of
01:48:42.520 | "why hasn't somebody done it yet"
01:48:44.840 | is a good argument that it's not going to completely destroy
01:48:48.040 | all of Twitter and Facebook when somebody does do it,
01:48:51.080 | or that Twitter will catch up and pivot to that kind of algorithm.
01:48:54.680 | - This is not what I'm saying.
01:48:56.360 | There's obviously great ideas that remain unexplored
01:48:59.840 | because nobody has gotten to the foothill
01:49:01.760 | that would allow you to explore them.
01:49:03.080 | That's true, but an internet that was non-predatory
01:49:06.920 | is an obvious idea, and many of us know that we want it,
01:49:10.960 | and many of us have seen prototypes of it,
01:49:13.480 | and we don't move because there's no audience there,
01:49:15.640 | so the network effects cause you to stay
01:49:17.720 | with the predatory internet.
01:49:19.880 | But let me just, I wasn't kidding about build the system
01:49:24.480 | in which your idea is a great business plan.
01:49:27.620 | So in our upcoming book, Heather and I in our last chapter
01:49:32.600 | explore something called the fourth frontier,
01:49:34.760 | and fourth frontier has to do with sort of a 2.0 version
01:49:38.320 | of civilization, which we freely admit
01:49:40.480 | we can't tell you very much about.
01:49:42.440 | It's something that would have to be,
01:49:44.040 | we would have to prototype our way there.
01:49:45.680 | We would have to effectively navigate our way there.
01:49:48.220 | But the result would be very much
01:49:49.760 | like what you're describing.
01:49:51.040 | It would be something that effectively
01:49:53.400 | liberates humans meaningfully, and most importantly,
01:49:57.860 | it has to feel like growth without depending on growth.
01:50:02.200 | In other words, human beings,
01:50:05.140 | like every other creature,
01:50:07.160 | are effectively looking for growth.
01:50:09.720 | We are looking for underexploited
01:50:11.600 | or unexploited opportunities, and when we find them,
01:50:14.840 | our ancestors for example, if they happen into a new valley
01:50:18.240 | that was unexplored by people, their population would grow
01:50:22.020 | until it hit carrying capacity.
01:50:23.320 | So there would be this great feeling of there's abundance
01:50:25.660 | until you hit carrying capacity, which is inevitable,
01:50:27.920 | and then zero-sum dynamics would set in.
01:50:30.500 | So in order for human beings to flourish long-term,
01:50:34.040 | the way to get there is to satisfy the desire for growth
01:50:37.920 | without hooking it to actual growth,
01:50:39.680 | which only moves in fits and starts.
01:50:42.400 | And this is actually, I believe, the key
01:50:45.320 | to avoiding these spasms of human tragedy
01:50:48.960 | when in the absence of growth,
01:50:50.660 | people do something that causes their population
01:50:54.100 | to experience growth, which is they go and make war on
01:50:57.760 | or commit genocide against some other population,
01:50:59.980 | which is something we obviously have to stop.
01:51:02.920 | - By the way, this is "A Hunter-Gatherer's Guide
01:51:06.040 | to the 21st Century," co-authored.
01:51:08.400 | - That's right.
01:51:09.240 | - With your wife, Heather, being released this September.
01:51:11.200 | I believe you said you're going to do a little bit
01:51:14.080 | of a preview videos on each chapter
01:51:16.080 | leading up to the release.
01:51:17.240 | So I'm looking forward to the last chapter,
01:51:19.760 | as well as all the previous ones.
01:51:23.120 | I have a few questions on that.
01:51:24.580 | So, to clarify, you generally have faith
01:51:29.320 | that technology could be the thing
01:51:31.780 | that empowers this kind of future.
01:51:36.320 | - Well, if you just let technology evolve,
01:51:40.480 | it's going to be our undoing, right?
01:51:43.560 | One of the things that I fault my libertarian friends for
01:51:48.280 | is this faith that the market is going to find solutions
01:51:51.360 | without destroying us.
01:51:52.580 | And my sense is I'm a very strong believer in markets,
01:51:55.960 | right, I believe in their power
01:51:57.720 | even above some market fundamentalists.
01:52:00.280 | But what I don't believe is that they should be allowed
01:52:04.000 | to plot our course, right?
01:52:06.400 | Markets are very good at figuring out how to do things.
01:52:09.940 | They are not good at all about figuring out
01:52:12.120 | what we should do, right, what we should want.
01:52:14.880 | We have to tell markets what we want
01:52:16.640 | and then they can tell us how to do it best.
01:52:19.040 | And if we adopted that kind of pro-market
01:52:22.960 | but in a context where it's not steering,
01:52:25.800 | where human wellbeing is actually the driver,
01:52:28.920 | we can do remarkable things.
01:52:30.920 | And the technology that emerges
01:52:32.320 | would naturally be enhancing of human wellbeing.
01:52:35.420 | Perfectly so, no, but overwhelmingly so.
01:52:38.720 | But at the moment,
01:52:39.840 | markets are finding our every defect of character
01:52:42.720 | and exploiting them and making huge profits
01:52:44.940 | and making us worse to each other in the process.
01:52:48.260 | - Before we leave COVID-19,
01:52:52.480 | let me ask you about a very difficult topic,
01:52:56.380 | which is the vaccines.
01:52:59.280 | So I took the Pfizer vaccine, the two shots.
01:53:03.840 | You did not.
01:53:07.160 | You have been taking ivermectin.
01:53:10.520 | - Yep.
01:53:11.340 | - So one of the arguments against the discussion
01:53:16.520 | of ivermectin is that it prevents people
01:53:21.160 | from being fully willing to get the vaccine.
01:53:24.960 | How would you compare ivermectin
01:53:27.800 | and the vaccine for COVID-19?
01:53:30.520 | - All right, that's a good question.
01:53:33.360 | I would say, first of all,
01:53:34.600 | there are some hazards with the vaccine
01:53:37.080 | that people need to be aware of.
01:53:38.500 | There are some things that we cannot rule out
01:53:41.280 | and for which there is some evidence.
01:53:44.440 | The two that I think people should be tracking
01:53:46.800 | is the possibility, some would say a likelihood,
01:53:50.780 | that a vaccine of this nature,
01:53:53.640 | that is to say very narrowly focused on a single antigen,
01:53:58.640 | is an evolutionary pressure
01:54:02.380 | that will drive the emergence of variants
01:54:05.100 | that will escape the protection that comes from the vaccine.
01:54:08.740 | So this is a hazard.
01:54:11.660 | It is a particular hazard in light of the fact
01:54:14.020 | that these vaccines have a substantial number
01:54:16.800 | of breakthrough cases.
01:54:18.880 | So one danger is that a person who has been vaccinated
01:54:22.500 | will shed viruses that are specifically less visible
01:54:27.500 | or invisible to the immunity created by the vaccines.
01:54:30.680 | So we may be creating the next pandemic
01:54:34.220 | by applying the pressure of vaccines
01:54:37.300 | at a point that it doesn't make sense to.
01:54:39.360 | The other danger has to do
01:54:41.800 | with something called antibody-dependent enhancement,
01:54:45.220 | which is something that we see in certain diseases
01:54:47.180 | like dengue fever.
01:54:48.740 | You may know that dengue, one gets a case,
01:54:51.660 | and then their second case is much more devastating.
01:54:54.100 | So breakbone fever is when you get your second case
01:54:57.260 | of dengue, and dengue effectively utilizes
01:55:00.740 | the immune response that is produced by prior exposure
01:55:04.580 | to attack the body in ways that it is incapable
01:55:06.860 | of doing before exposure.
01:55:08.820 | So this is apparently,
01:55:09.940 | this pattern has apparently blocked past efforts
01:55:13.100 | to make vaccines against coronaviruses,
01:55:17.380 | whether it will happen here or not,
01:55:19.180 | it is still too early to say.
01:55:20.420 | But before we even get to the question
01:55:22.740 | of harm done to individuals by these vaccines,
01:55:26.900 | we have to ask about what the overall impact is going to be.
01:55:29.620 | And it's not clear in the way people think it is,
01:55:32.220 | that if we vaccinate enough people, the pandemic will end.
01:55:35.420 | It could be that we vaccinate people
01:55:37.020 | and make the pandemic worse.
01:55:38.660 | And while nobody can say for sure
01:55:40.540 | that that's where we're headed,
01:55:42.040 | it is at least something to be aware of.
01:55:43.900 | - So don't vaccines usually create
01:55:46.180 | that kind of evolutionary pressure
01:55:48.980 | to create deadlier different strains of the virus?
01:55:53.020 | So is there something particular with these mRNA vaccines
01:55:58.780 | that's uniquely dangerous in this regard?
01:56:01.420 | - Well, it's not even just the mRNA vaccines.
01:56:03.460 | The mRNA vaccines and the adenovector DNA vaccine
01:56:07.180 | all share the same vulnerability,
01:56:09.100 | which is they are very narrowly focused
01:56:11.260 | on one subunit of the spike protein.
01:56:14.220 | So that is a very concentrated evolutionary signal.
01:56:18.220 | We are also deploying it in mid pandemic
01:56:20.940 | and it takes time for immunity to develop.
01:56:23.500 | So part of the problem here,
01:56:25.780 | if you inoculated a population
01:56:28.340 | before encounter with a pathogen,
01:56:31.380 | then there might be substantial enough immunity
01:56:33.860 | to prevent this phenomenon from happening.
01:56:37.340 | But in this case, we are inoculating people
01:56:40.480 | as they are encountering those
01:56:42.220 | who are sick with the disease.
01:56:44.020 | And what that means is the disease is now faced
01:56:47.360 | with a lot of opportunities
01:56:48.580 | to effectively evolutionarily practice escape strategies.
01:56:52.460 | So one thing is the timing,
01:56:54.700 | the other thing is the narrow focus.
01:56:56.820 | Now in a traditional vaccine,
01:56:58.400 | you would typically not have one antigen, right?
01:57:01.240 | You would have basically a virus full of antigens
01:57:04.380 | and the immune system
01:57:05.540 | would therefore produce a broader response.
01:57:08.140 | So that is the case for people who have had COVID, right?
01:57:11.900 | They have an immunity that is broader
01:57:13.400 | because it wasn't so focused
01:57:14.880 | on one part of the spike protein.
01:57:17.100 | So anyway, there is something unique here.
01:57:19.240 | So these platforms create that special hazard.
01:57:21.840 | They also have components
01:57:23.600 | that we haven't used before in people.
01:57:25.980 | So for example, the lipid nanoparticles
01:57:28.240 | that coat the RNAs are distributing themselves
01:57:32.600 | around the body in a way
01:57:34.000 | that will have unknown consequences.
01:57:37.520 | So anyway, there's reason for concern.
01:57:40.200 | - Is it possible for you to steel man the argument
01:57:45.040 | that everybody should get vaccinated?
01:57:48.000 | - Of course.
01:57:49.040 | The argument that everybody should get vaccinated
01:57:51.360 | is that nothing is perfectly safe.
01:57:54.640 | Phase three trials showed good safety for the vaccines.
01:57:59.560 | Now that may or may not be actually true,
01:58:01.960 | but what we saw suggested a high degree of efficacy
01:58:06.000 | and a high degree of safety for the vaccines;
01:58:09.800 | that inoculating people quickly,
01:58:11.780 | and therefore dropping the landscape
01:58:14.400 | of available victims for the pathogen to a very low number
01:58:19.400 | so that herd immunity drives it to extinction,
01:58:22.040 | requires us all to take our share of the risk;
01:58:24.600 | and that, because driving it to extinction
01:58:30.400 | should be our highest priority,
01:58:31.680 | people really shouldn't think too much
01:58:35.300 | about the various nuances,
01:58:36.720 | because overwhelmingly fewer people will die
01:58:39.760 | from the vaccine if the population is vaccinated
01:58:43.200 | than will die from COVID if they're not vaccinated.
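The herd-immunity logic in this steel man rests on a standard textbook relationship: under the simplest homogeneous-mixing model, the immune fraction needed is 1 - 1/R0, scaled up when the vaccine is imperfect. The numbers below are placeholders, not estimates for SARS-CoV-2 or any real vaccine.

```python
def required_coverage(r0, vaccine_efficacy):
    """Vaccination coverage needed for herd immunity in the simplest
    homogeneous-mixing model: threshold = 1 - 1/R0, divided by efficacy."""
    return (1.0 - 1.0 / r0) / vaccine_efficacy

# Placeholder values only, not estimates for any real pathogen or vaccine.
print(round(required_coverage(r0=3.0, vaccine_efficacy=0.9), 2))  # 0.74
```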
01:58:45.400 | - And with the vaccine as it's currently being deployed,
01:58:48.240 | isn't it quite a likely scenario
01:58:50.400 | that the virus will fade away,
01:58:56.320 | in the following sense:
01:58:59.800 | the probability
01:59:01.520 | that a more dangerous strain will be created
01:59:04.520 | is non-zero, but it's not 50%.
01:59:08.540 | It's something smaller.
01:59:10.140 | And so,
01:59:11.320 | well, I don't know, maybe you disagree with that,
01:59:12.800 | but the most likely scenario,
01:59:15.480 | now that the vaccine is here,
01:59:17.360 | is that the effects of the virus will fade away.
01:59:21.620 | - First of all, I don't believe that the probability
01:59:23.520 | of creating a worse pandemic is low enough to discount.
01:59:27.380 | I think the probability is fairly high
01:59:29.420 | and frankly, we are seeing a wave of variants
01:59:32.880 | that we will have to do a careful analysis
01:59:37.160 | to figure out what exactly that has to do
01:59:39.080 | with campaigns of vaccination,
01:59:40.760 | where they have been, where they haven't been,
01:59:42.320 | where the variants emerged from.
01:59:43.940 | But I believe that what we are seeing
01:59:45.320 | is a disturbing pattern that reflects
01:59:47.760 | that those who were advising caution
01:59:50.840 | may well have been right.
01:59:51.840 | - The data here, by the way,
01:59:53.320 | and the small tangent is terrible.
01:59:55.200 | - Terrible, right.
01:59:56.440 | And why is it terrible is another question, right?
01:59:59.680 | - This is where I started getting angry.
02:00:01.240 | - Yes.
02:00:02.080 | - There's an obvious opportunity
02:00:04.120 | for exceptionally good data,
02:00:06.400 | for exceptionally rigorous collection.
02:00:07.600 | Like even the website
02:00:08.560 | for self-reporting side effects,
02:00:11.360 | well, not side effects, but negative effects.
02:00:13.760 | - Adverse events.
02:00:14.600 | - Adverse events, sorry, for the vaccine.
02:00:17.100 | Like there's many things I could say
02:00:20.520 | from both the study perspective,
02:00:22.200 | but mostly let me just put on my hat
02:00:24.480 | of like HTML and like web design.
02:00:29.800 | Like it's like the worst website.
02:00:32.680 | It makes it so unpleasant to report.
02:00:34.760 | It makes it so unclear what you're reporting.
02:00:37.080 | If somebody actually has serious effect,
02:00:38.840 | like if you have very mild effects,
02:00:40.600 | what are the incentives for you
02:00:42.480 | to even use that crappy website
02:00:44.480 | with many pages and forms that don't make any sense?
02:00:47.640 | If you have adverse effects,
02:00:49.180 | what are the incentives for you to use that website?
02:00:53.160 | What is the trust that you have
02:00:55.040 | that this information will be used well?
02:00:56.940 | All those kinds of things.
02:00:58.200 | And the data about who's getting vaccinated,
02:01:01.120 | anonymized data about who's getting vaccinated,
02:01:04.120 | where, when, with what vaccine,
02:01:06.900 | coupled with the adverse effects,
02:01:09.000 | all of that we should be collecting.
02:01:10.920 | Instead, we're completely not.
02:01:13.080 | We're doing it in a crappy way
02:01:14.940 | and using that crappy data to make conclusions
02:01:18.000 | that you then twist.
02:01:19.600 | You're basically collecting in a way
02:01:21.120 | that can arrive at whatever conclusions you want.
02:01:25.680 | And the data is being collected
02:01:26.880 | by the institutions, by governments.
02:01:30.400 | And so therefore, it's obviously,
02:01:32.240 | they're going to try to construct any kind of narratives
02:01:34.560 | they want based on this crappy data.
02:01:36.640 | It reminds me of much of psychology,
02:01:38.800 | the field that I love, but is flawed
02:01:40.800 | in many fundamental ways.
02:01:42.560 | So rant over, but coupled with the dangers
02:01:46.720 | that you're speaking to,
02:01:47.560 | we don't have even the data to understand the dangers.
02:01:52.120 | - Yeah, I'm gonna pick up on your rant and say,
02:01:55.600 | estimates of the degree of under-reporting in VAERS
02:02:00.600 | are that it captures somewhere between 10%
02:02:05.640 | and 1% of the real number. - And that's the system
02:02:07.160 | for reporting. - Yeah, the VAERS system
02:02:08.960 | is the system for reporting adverse events.
02:02:11.160 | So in the US, we have above 5,000 unexpected deaths
02:02:16.160 | that seem, in time, to be associated with vaccination.
02:02:22.000 | That is an undercount, almost certainly,
02:02:24.480 | and by a large factor.
02:02:27.960 | We don't know how large, I've seen estimates,
02:02:30.320 | 25,000 dead in the US alone.
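The arithmetic behind these competing figures is just a division by an assumed capture rate, and the disagreement is about what that rate is. This is a sketch of the scaling only; it says nothing about whether reported events were actually caused by vaccination.

```python
def estimated_true_count(reported, capture_rate):
    """Scale a passively reported count by an assumed capture rate."""
    return reported / capture_rate

# Hypothetical capture rates; 5,000 is the reported figure discussed above.
for rate in (0.01, 0.10, 0.20, 1.00):
    print(f"capture rate {rate:.0%}: ~{estimated_true_count(5000, rate):,.0f}")
# 1% -> 500,000; 10% -> 50,000; 20% -> 25,000; 100% -> 5,000
```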
02:02:34.720 | Now, you can make the argument that,
02:02:37.520 | okay, that's a large number,
02:02:39.320 | but the necessity of immunizing the population
02:02:42.920 | to drive SARS-CoV-2 to extinction
02:02:45.480 | is such that it's an acceptable number.
02:02:48.000 | But I would point out that that actually
02:02:49.760 | does not make any sense.
02:02:51.040 | And the reason it doesn't make any sense is,
02:02:53.120 | actually, there's several reasons.
02:02:54.320 | One, if that was really your point,
02:02:57.200 | that yes, many, many people are gonna die,
02:02:59.960 | but many more will die if we don't do this,
02:03:02.120 | were that your approach,
02:03:05.120 | you would not be inoculating people
02:03:07.160 | who had had COVID-19, which is a large population.
02:03:10.640 | There's no reason to expose those people to danger.
02:03:13.280 | Their risk of adverse events,
02:03:14.880 | in the case that they have them, is greater.
02:03:18.280 | So there's no reason that we would be allowing
02:03:20.560 | those people to face a risk of death
02:03:22.320 | if this was really about an acceptable number
02:03:25.120 | of deaths arising out of this set of vaccines.
02:03:29.240 | I would also point out,
02:03:30.520 | there's something incredibly bizarre,
02:03:32.840 | and I struggle to find language that is strong enough
02:03:37.840 | for the horror of vaccinating children in this case,
02:03:42.960 | because children suffer a greater risk of long-term effects
02:03:48.240 | because they are going to live longer,
02:03:49.880 | and because this is earlier in their development,
02:03:51.960 | therefore it impacts systems that are still forming.
02:03:55.680 | They tolerate COVID well,
02:03:57.920 | and so the benefit to them is very small.
02:04:01.480 | And so the only argument for doing this
02:04:04.080 | is that they may cryptically be carrying
02:04:05.920 | more COVID than we think,
02:04:07.560 | and therefore they may be integral
02:04:09.640 | to the way the virus spreads to the population.
02:04:11.960 | But if that's the reason that we are inoculating children,
02:04:14.280 | and there has been some revision in the last day or two
02:04:16.480 | about the recommendation on this
02:04:17.880 | because of the adverse events
02:04:19.840 | that have shown up in children,
02:04:21.000 | but to the extent that we were vaccinating children,
02:04:24.200 | we were doing it to protect old, infirm people
02:04:28.280 | who are the most likely to succumb to COVID-19.
02:04:32.760 | What society puts children in danger,
02:04:37.140 | robs children of life to save old, infirm people?
02:04:40.900 | That's upside down.
02:04:42.200 | So there's something about the way
02:04:45.200 | we are going about vaccinating, who we are vaccinating,
02:04:48.720 | what dangers we are pretending don't exist
02:04:52.280 | that suggests that to some set of people,
02:04:55.680 | vaccinating people is a good in and of itself,
02:04:58.660 | that that is the objective of the exercise,
02:05:00.480 | not herd immunity.
02:05:01.560 | And the last thing, and I'm sorry,
02:05:03.080 | I don't wanna prevent you from jumping in here,
02:05:05.540 | but the second reason,
02:05:06.560 | in addition to the fact that we're exposing people
02:05:08.480 | to danger that we should not be exposing them to.
02:05:11.840 | - By the way, as a tiny tangent,
02:05:13.680 | another huge part of this soup
02:05:16.240 | that should have been part of it,
02:05:17.560 | that's an incredible solution is large-scale testing.
02:05:20.880 | - Mm-hmm.
02:05:21.980 | - But that might be another couple of hours conversation,
02:05:26.400 | but there's these solutions that are obvious,
02:05:28.600 | that were available from the very beginning.
02:05:30.520 | So you could argue that ivermectin is not that obvious,
02:05:34.680 | but maybe the whole point is you have aggressive,
02:05:38.760 | very fast research that leads to a meta-analysis
02:05:43.080 | and then large-scale production and deployment.
02:05:46.120 | - Okay, at least that possibility
02:05:48.960 | should be seriously considered,
02:05:51.160 | coupled with a serious consideration
02:05:53.560 | of large-scale deployment of testing, at-home testing,
02:05:56.920 | that could have accelerated the speed
02:06:01.920 | at which we reached that herd immunity.
02:06:05.860 | But I don't even wanna--
02:06:08.640 | - Well, let me just say, I am also completely shocked
02:06:11.680 | that we did not get on high-quality testing early
02:06:15.000 | and that we are still suffering from this even now,
02:06:19.000 | because just the simple ability to track
02:06:21.560 | where the virus moves between people
02:06:23.680 | would tell us a lot about its mode of transmission,
02:06:26.080 | which would allow us to protect ourselves better.
02:06:28.800 | Instead, that information was hard won
02:06:32.280 | and for no good reason.
02:06:33.200 | So I also find this mysterious.
02:06:35.880 | - You've spoken with Eric Weinstein, your brother,
02:06:39.900 | on his podcast, "The Portal,"
02:06:41.440 | about the ideas that eventually led to the paper
02:06:45.160 | you published, titled "The Reserve Capacity Hypothesis."
02:06:48.720 | I think first, can you explain this paper
02:06:55.480 | and the ideas that led up to it?
02:06:59.900 | - Sure, easier to explain the conclusion of the paper.
02:07:03.980 | There's a question about why a creature
02:07:08.400 | that can replace its cells with new cells
02:07:12.000 | grows feeble and inefficient with age.
02:07:15.000 | We call that process, which is otherwise called aging,
02:07:18.580 | we call it senescence.
02:07:19.900 | And senescence, in this paper, it is hypothesized,
02:07:26.000 | is the unavoidable downside
02:07:29.280 | of a cancer prevention feature of our bodies,
02:07:36.320 | that each cell has a limit
02:07:38.400 | on the number of times it can divide.
02:07:40.760 | There are a few cells in the body that are exceptional,
02:07:43.000 | but most of our cells can only divide
02:07:45.040 | a limited number of times.
02:07:46.080 | That's called the Hayflick limit.
02:07:48.000 | And the Hayflick limit reduces the ability
02:07:52.480 | of the organism to replace tissues.
02:07:55.780 | It therefore results in a failure over time
02:07:58.800 | of maintenance and repair.
02:08:01.360 | And that explains why we become decrepit as we grow old.
02:08:06.200 | The question was, why would that be,
02:08:09.480 | especially in light of the fact that the mechanism
02:08:12.920 | that seems to limit the ability of cells to reproduce
02:08:16.600 | is something called a telomere.
02:08:18.840 | Telomere is a, it's not a gene,
02:08:21.280 | but it's a DNA sequence at the ends of our chromosomes
02:08:24.880 | that is just simply repetitive.
02:08:26.320 | And the number of repeats functions like a counter.
02:08:30.240 | So there's a number of repeats that you have
02:08:33.000 | after development is finished.
02:08:34.400 | And then each time the cell divides,
02:08:35.760 | a little bit of telomere is lost.
02:08:37.600 | And at the point that the telomere becomes critically short,
02:08:40.240 | the cell stops dividing,
02:08:41.880 | even though it still has the capacity to do so.
02:08:44.640 | Stops dividing and it starts transcribing different genes
02:08:47.380 | than it did when it had more telomere.
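The "counter" behavior described here can be captured in a toy model: telomere repeats tick down with each cell division until a critical length is reached, at which point division stops (the Hayflick limit). The numbers below are illustrative, not measured values.

```python
def divisions_until_senescence(initial_repeats, loss_per_division, critical_repeats):
    """Toy model of the telomere-as-counter idea: count how many times a
    cell lineage can divide before its telomere hits a critical length."""
    repeats, divisions = initial_repeats, 0
    while repeats - loss_per_division >= critical_repeats:
        repeats -= loss_per_division
        divisions += 1
    return divisions

# Illustrative numbers only; the output lands in the ballpark of the
# commonly cited ~40-60 division Hayflick limit.
print(divisions_until_senescence(10000, 100, 4000))  # 60
```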
02:08:50.240 | So what my work did was it looked at the fact
02:08:53.880 | that the telomeric shortening was being studied
02:08:56.660 | by two different groups.
02:08:57.880 | It was being studied by people who were interested
02:09:00.760 | in counteracting the aging process.
02:09:03.440 | And it was being studied in exactly the opposite fashion
02:09:06.520 | by people who were interested in tumorigenesis and cancer.
02:09:10.520 | The thought being, because it was true
02:09:12.600 | that when one looked into tumors,
02:09:14.080 | they always had telomerase active.
02:09:16.700 | That's the enzyme that lengthens our telomeres.
02:09:19.040 | So those folks were interested in bringing about a halt
02:09:24.040 | to the lengthening of telomeres
02:09:25.720 | in order to counteract cancer.
02:09:27.520 | And the folks who were studying the senescence process
02:09:30.640 | were interested in lengthening telomeres
02:09:32.280 | in order to generate greater repair capacity.
02:09:36.200 | And my point was evolutionarily speaking,
02:09:38.640 | this looks like a pleiotropic effect
02:09:42.880 | that the genes which create the tendency of the cells
02:09:47.880 | to be limited in their capacity to replace themselves
02:09:53.320 | are providing a benefit in youth,
02:09:55.600 | which is that we are largely free of tumors and cancer
02:09:59.320 | at the inevitable late life cost
02:10:01.160 | that we grow feeble and inefficient and eventually die.
02:10:04.840 | And that matches a very old hypothesis
02:10:08.520 | in evolutionary theory by somebody
02:10:11.280 | I was fortunate enough to know, George Williams,
02:10:13.560 | one of the great 20th century evolutionists,
02:10:16.640 | who argued that senescence would have to be caused
02:10:19.760 | by pleiotropic genes that cause early life benefits
02:10:23.800 | at unavoidable late life costs.
02:10:26.160 | And although this isn't the exact nature of the system
02:10:29.240 | he predicted, it matches what he was expecting
02:10:32.720 | in many regards to a shocking degree.
02:10:35.880 | - That said, the focus of the paper is about the,
02:10:40.880 | well, let me just read the abstract.
02:10:43.800 | "We observed that captive rodent breeding protocols
02:10:46.800 | "designed," this is the end of the abstract,
02:10:49.000 | "We observed that captive rodent breeding protocols
02:10:51.440 | "designed to increase reproductive output,
02:10:53.600 | "simultaneously exert strong selection
02:10:55.720 | "against reproductive senescence
02:10:57.960 | "and virtually eliminate selection
02:11:00.040 | "that would otherwise favor tumor suppression.
02:11:03.160 | "This appears to have greatly elongated the telomeres
02:11:06.280 | "of laboratory mice.
02:11:07.840 | "With their telomeric fail-safe effectively disabled,
02:11:10.640 | "these animals are unreliable models
02:11:12.520 | "of normal senescence and tumor formation."
02:11:15.280 | So basically using these mice is not going to lead
02:11:19.640 | to the right kinds of conclusions.
02:11:21.420 | "Safety tests employing these animals
02:11:24.080 | "likely overestimate cancer risks
02:11:26.600 | "and underestimate tissue damage
02:11:28.960 | "and consequent accelerated senescence."
02:11:31.780 | So I think, especially with your discussion with Eric,
02:11:37.280 | the conclusion of this paper has to do with the fact
02:11:42.260 | that we shouldn't be using these mice to test the safety
02:11:47.260 | or to make conclusions about cancer or senescence.
02:11:53.320 | Is that the basic takeaway?
02:11:55.040 | Like basically saying that the length of these telomeres
02:11:57.800 | is an important variable to consider.
02:12:00.080 | - Well, let's put it this way.
02:12:01.960 | I think there was a reason that the world of scientists
02:12:05.760 | who was working on telomeres
02:12:07.520 | did not spot the pleiotropic relationship
02:12:10.920 | that was the key argument in my paper.
02:12:15.360 | The reason they didn't spot it
02:12:17.520 | was that there was a result that everybody knew
02:12:20.600 | which seemed inconsistent.
02:12:23.040 | The result was that mice have very long telomeres,
02:12:26.960 | but they do not have very long lives.
02:12:30.260 | Now, we can talk about what the actual meaning
02:12:32.480 | of don't have very long lives is,
02:12:34.560 | but in the end, I was confronted with a hypothesis
02:12:39.240 | that would explain a great many features
02:12:41.560 | of the way mammals and indeed vertebrates age,
02:12:44.780 | but it was inconsistent with one result.
02:12:46.640 | And at first I thought,
02:12:48.360 | maybe there's something wrong with the result.
02:12:50.020 | Maybe this is one of these cases
02:12:51.340 | where the result was achieved once
02:12:54.360 | through some bad protocol and everybody else was repeating it.
02:12:57.680 | That didn't turn out to be the case.
02:12:58.760 | Many laboratories had established
02:13:00.440 | that mice had ultra long telomeres.
02:13:02.880 | And so I began to wonder whether or not
02:13:05.320 | there was something about the breeding protocols
02:13:09.580 | that generated these mice.
02:13:11.240 | And what that would predict is that the mice
02:13:13.840 | that have long telomeres would be laboratory mice
02:13:16.940 | and that wild mice would not.
02:13:18.720 | And Carol Greider, who agreed to collaborate with me,
02:13:23.220 | tested that hypothesis and showed that it was indeed true
02:13:27.660 | that wild derived mice, or at least mice
02:13:29.940 | that had been in captivity for a much shorter period of time
02:13:32.740 | did not have ultra long telomeres.
02:13:35.220 | Now, what this implied though, as you read,
02:13:38.700 | is that our breeding protocols
02:13:41.140 | generate lengthening of telomeres.
02:13:43.100 | And the implication of that is that the animals
02:13:45.740 | that have these very long telomeres
02:13:47.440 | will be hyper prone to create tumors.
02:13:50.540 | They will be extremely resistant to toxins
02:13:54.460 | because they have effectively an infinite capacity
02:13:56.940 | to replace any damaged tissue.
02:13:59.020 | And so ironically, if you give one of these
02:14:02.680 | ultra long telomere lab mice a toxin,
02:14:06.500 | if the toxin doesn't outright kill it,
02:14:08.340 | it may actually increase its lifespan
02:14:10.880 | because it functions as a kind of chemotherapy.
02:14:14.340 | So the reason that chemotherapy works
02:14:16.980 | is that dividing cells are more vulnerable
02:14:19.220 | than cells that are not dividing.
02:14:21.140 | And so if this mouse has effectively
02:14:23.580 | had its cancer protection turned off
02:14:26.320 | and it has cells dividing too rapidly
02:14:28.540 | and you give it a toxin,
02:14:29.940 | you will slow down its tumors faster
02:14:31.980 | than you harm its other tissues.
02:14:34.020 | And so you'll get a paradoxical result
02:14:35.740 | that actually some drug that's toxic
02:14:38.320 | seems to benefit the mouse.
02:14:40.500 | Now, I don't think that that was understood
02:14:43.180 | before I published my paper.
02:14:44.760 | Now, I'm pretty sure it has to be.
02:14:46.860 | And the problem is that this actually is a system
02:14:50.700 | that serves pharmaceutical companies
02:14:53.180 | that have the difficult job of bringing compounds to market,
02:14:57.180 | many of which will be toxic.
02:14:59.420 | Maybe all of them will be toxic.
02:15:01.780 | And these mice predispose our system
02:15:04.780 | to declare these toxic compounds safe.
02:15:07.580 | And in fact, I believe we've seen the errors
02:15:10.460 | that result from using these mice a number of times,
02:15:12.940 | most famously with Vioxx,
02:15:14.420 | which turned out to do conspicuous heart damage.
02:15:18.180 | - Why do you think this paper
02:15:20.060 | and this idea has not gotten significant traction?
02:15:23.700 | - Well, my collaborator, Carol Greider,
02:15:27.580 | said something to me that rings in my ears to this day.
02:15:32.180 | She initially, after she showed
02:15:34.380 | that laboratory mice have anomalously long telomeres
02:15:37.420 | and that wild mice don't have long telomeres,
02:15:39.980 | I asked her where she was going to publish that result
02:15:42.300 | so that I could cite it in my paper.
02:15:44.540 | And she said that she was going to keep the result in-house
02:15:47.500 | rather than publish it.
02:15:49.820 | And at the time, I was a young graduate student.
02:15:54.060 | I didn't really understand what she was saying.
02:15:56.820 | But in some sense,
02:15:58.620 | the knowledge that a model organism is broken
02:16:02.300 | in a way that creates the likelihood
02:16:04.820 | that certain results will be reliably generatable,
02:16:08.320 | you can publish a paper
02:16:09.740 | and make a big splash with such a thing,
02:16:11.940 | or you can exploit the fact
02:16:13.340 | that you know how those models will misbehave
02:16:16.140 | and other people don't.
02:16:17.700 | So there's a question.
02:16:19.260 | If somebody is motivated cynically
02:16:22.360 | and what they want to do
02:16:23.780 | is appear to have deeper insight into biology
02:16:26.460 | because they predict things better than others do,
02:16:29.140 | knowing where the flaw is
02:16:31.140 | so that your predictions come out true is advantageous.
02:16:34.980 | At the same time, I can't help but imagine
02:16:38.900 | that the pharmaceutical industry,
02:16:40.140 | when it figured out that the mice were predisposed
02:16:42.700 | to suggest that drugs were safe,
02:16:45.760 | didn't leap to fix the problem
02:16:47.840 | because in some sense,
02:16:49.080 | it was the perfect cover for the difficult job
02:16:51.780 | of bringing drugs to market
02:16:54.080 | and then discovering their actual toxicity profile.
02:16:57.860 | This made things look safer than they were,
02:16:59.540 | and I believe a lot of profits
02:17:01.260 | have likely been generated downstream.
02:17:03.160 | - So to kind of play devil's advocate,
02:17:06.740 | it's also possible that this particular variable,
02:17:10.420 | the length of the telomeres,
02:17:11.420 | is not a strong variable for the conclusions,
02:17:13.740 | for the drug development
02:17:15.240 | and for the conclusions that Carol
02:17:16.700 | and others have been studying.
02:17:18.540 | Is it possible for that to be the case?
02:17:21.120 | So one reason she and others could be ignoring this
02:17:27.140 | is because it's not a strong variable.
02:17:29.540 | - Well, I don't believe so.
02:17:30.460 | And in fact, at the point that I went to publish my paper,
02:17:34.200 | Carol published her result.
02:17:36.720 | She did so in a way that did not make a huge splash.
02:17:39.580 | - Did she, like, I apologize if I don't know how,
02:17:43.100 | what was the emphasis of her publication of that paper?
02:17:49.140 | Was it purely just kind of showing data
02:17:52.220 | or was there more,
02:17:53.380 | because in your paper,
02:17:54.220 | there's more of a philosophical statement as well.
02:17:57.580 | - Well, my paper was motivated by interest
02:18:00.900 | in the evolutionary dynamics around senescence.
02:18:03.540 | I wasn't pursuing grants or anything like that.
02:18:07.660 | I was just working on a puzzle I thought was interesting.
02:18:10.880 | Carol has, of course, gone on to win a Nobel Prize
02:18:14.320 | for her co-discovery with Elizabeth Blackburn
02:18:17.460 | of telomerase, the enzyme that lengthens telomeres.
02:18:20.400 | But anyway, she's a heavy hitter in the academic world.
02:18:25.720 | I don't know exactly what her purpose was.
02:18:27.800 | I do know that she told me she wasn't planning to publish.
02:18:30.240 | And I do know that I discovered
02:18:31.720 | that she was in the process of publishing very late.
02:18:34.400 | And when I asked her to send me the paper
02:18:36.880 | to see whether or not she had put evidence in it
02:18:41.000 | that the hypothesis had come from me,
02:18:43.560 | she grudgingly sent it to me
02:18:45.180 | and my name was nowhere mentioned
02:18:46.720 | and she broke contact at that point.
02:18:49.620 | What it is that motivated her, I don't know,
02:18:53.420 | but I don't think it can possibly be
02:18:55.060 | that this result is unimportant.
02:18:57.340 | The fact is, the reason I called her in the first place
02:19:00.600 | and established contact that generated our collaboration
02:19:04.840 | was that she was a leading light
02:19:06.320 | in the field of telomeric studies.
02:19:08.760 | And because of that, this question
02:19:12.280 | about whether the model organisms are distorting
02:19:15.640 | the understanding of the functioning of telomeres,
02:19:19.760 | it's central.
02:19:21.000 | - Do you feel like you've been,
02:19:23.600 | as a young graduate student, do you think Carol
02:19:27.200 | or do you think the scientific community
02:19:28.720 | broadly screwed you over?
02:19:30.760 | In some way.
02:19:31.600 | - You know, I don't think of it in those terms,
02:19:33.280 | probably partly because it's not productive,
02:19:37.520 | but I have a complex relationship with this story.
02:19:42.360 | On the one hand, I'm livid with Carol Greider
02:19:44.960 | for what she did.
02:19:46.400 | She absolutely pretended that I didn't exist in this story
02:19:50.040 | and I don't think I was a threat to her.
02:19:51.920 | My interest was as an evolutionary biologist,
02:19:54.720 | I had made an evolutionary contribution,
02:19:57.480 | she had tested a hypothesis,
02:19:59.000 | and frankly, I think it would have been better for her
02:20:01.560 | if she had acknowledged what I had done,
02:20:03.760 | I think it would have enhanced her work.
02:20:06.000 | And you know, I was, let's put it this way,
02:20:10.480 | when I watched her Nobel lecture,
02:20:12.440 | and I should say there's been a lot of confusion
02:20:13.960 | about this Nobel stuff, I've never said
02:20:15.720 | that I should have gotten a Nobel Prize,
02:20:17.840 | people have misportrayed that.
02:20:20.320 | In listening to her lecture,
02:20:25.840 | I had one of the most bizarre emotional experiences
02:20:29.080 | of my life, because she presented the work
02:20:33.160 | that resulted from my hypothesis.
02:20:35.960 | She presented it as she had in her paper,
02:20:38.840 | with no acknowledgement of where it had come from,
02:20:42.320 | and she had in fact portrayed the distortion
02:20:47.240 | of the telomeres as if it were a lucky fact,
02:20:50.600 | because it allowed testing hypotheses
02:20:53.180 | that would otherwise not be testable.
02:20:55.600 | You have to understand, as a young scientist,
02:21:00.680 | to watch work that you have done,
02:21:03.320 | presented in what's surely the most important lecture
02:21:07.680 | of her career, right?
02:21:10.360 | It's thrilling, it was thrilling to see, you know,
02:21:13.840 | her figures projected on the screen there, right?
02:21:18.640 | To have been part of work that was important enough
02:21:21.160 | for that felt great, and of course,
02:21:23.180 | to be erased from the story felt absolutely terrible.
02:21:26.340 | So anyway, that's sort of where I am with it.
02:21:30.120 | My sense is, what I'm really troubled by in the story
02:21:35.120 | is the fact that as far as I know,
02:21:41.020 | the flaw with the mice has not been addressed.
02:21:44.800 | And actually, Eric did some looking into this,
02:21:48.060 | he tried to establish, by calling the Jax Lab
02:21:50.700 | and trying to ascertain what had happened with the colonies,
02:21:54.340 | whether any change in protocol had occurred,
02:21:56.980 | and he couldn't get anywhere.
02:21:58.660 | There was seemingly no awareness that it was even an issue.
02:22:02.260 | So I'm very troubled by the fact that as a father,
02:22:06.060 | for example, I'm in no position to protect my family
02:22:10.420 | from the hazard that I believe
02:22:11.940 | lurks in our medicine cabinets, right?
02:22:14.980 | Even though I'm aware of where the hazard comes from,
02:22:17.340 | it doesn't tell me anything useful
02:22:18.900 | about which of these drugs will turn out to do damage
02:22:21.140 | if it is ultimately tested.
02:22:23.300 | And that's a very frustrating position to be in.
02:22:26.460 | On the other hand, there's a part of me
02:22:28.140 | that's even still grateful to Carol for taking my call.
02:22:31.600 | She didn't have to take my call
02:22:32.980 | and talk to some young graduate student
02:22:34.620 | who had some evolutionary idea
02:22:36.180 | that wasn't in her wheelhouse specifically,
02:22:40.380 | and yet she did.
02:22:41.300 | And for a while, she was a good collaborator.
02:22:43.860 | - Well, can I, I have to proceed carefully here
02:22:48.140 | because it's a complicated topic.
02:22:50.620 | So she took the call,
02:22:53.720 | and you kind of,
02:22:57.000 | you're kind of saying that she basically erased credit,
02:23:03.140 | pretending you didn't exist in some kind of,
02:23:07.140 | in a certain sense.
02:23:08.480 | Let me phrase it this way.
02:23:12.460 | As a research scientist at MIT,
02:23:17.920 | and especially as just part of
02:23:21.380 | a large set of collaborations,
02:23:25.380 | I've had a lot of students come to me
02:23:27.180 | and talk to me about ideas,
02:23:31.260 | perhaps less interesting than what we're discussing here
02:23:33.620 | in the space of AI,
02:23:35.500 | that I've been thinking about anyway.
02:23:37.700 | In general, with everything I'm doing with robotics,
02:23:41.820 | people would have told me a bunch of ideas
02:23:47.660 | that I'm already thinking about.
02:23:49.440 | The point is taking that idea,
02:23:52.260 | see, this is different because the idea has more power
02:23:54.840 | in the space that we're talking about here,
02:23:56.240 | and robotics is like, your idea means shit
02:23:58.400 | until you build it.
02:24:00.120 | Like, so the engineering world is a little different,
02:24:03.200 | but there's a kind of sense that I probably forgot
02:24:08.040 | a lot of brilliant ideas that have been told to me.
02:24:10.600 | Do you think she pretended you don't exist?
02:24:14.840 | Do you think she was so busy that she kind of forgot?
02:24:19.300 | She has this stream of brilliant people around her,
02:24:23.180 | there's a bunch of ideas that are swimming in the air,
02:24:26.340 | and you just kind of forget people
02:24:28.260 | that are a little bit on the periphery
02:23:30.300 | of the idea generation.
02:24:32.060 | Or is it some mix of both?
02:24:33.580 | - It's not a mix of both.
02:24:36.700 | I know that because we corresponded.
02:24:39.860 | She put a graduate student on this work.
02:24:41.820 | He emailed me excitedly when the results came in.
02:24:46.740 | So there was no ambiguity about what had happened.
02:24:50.160 | What's more, when I went to publish my work,
02:24:52.920 | I actually sent it to Carol in order to get her feedback
02:24:56.680 | because I wanted to be a good collaborator to her.
02:24:59.780 | And she absolutely panned it,
02:25:02.760 | made many critiques that were not valid,
02:25:06.080 | but it was clear at that point that she became an antagonist
02:25:10.160 | and none of this adds up.
02:25:12.500 | She couldn't possibly have forgotten the conversation.
02:25:15.360 | I believe I even sent her tissues at some point,
02:25:20.220 | in part, not related to this project,
02:25:23.080 | but as a favor, she was doing another project
02:25:25.040 | that involved telomeres and she needed samples
02:25:27.160 | that I could get ahold of
02:25:28.060 | because of the Museum of Zoology that I was in.
02:25:30.820 | So this was not a one-off conversation.
02:25:34.240 | I certainly know that those sorts of things can happen,
02:25:36.160 | but that's not what happened here.
02:25:37.840 | This was a relationship that existed
02:25:41.300 | and then was suddenly cut short at the point
02:25:44.020 | that she published her paper by surprise
02:25:46.360 | without saying where the hypothesis had come from
02:25:48.980 | and began to be an opposing force to my work.
02:25:53.980 | - Is there, there's a bunch of trajectories
02:25:57.000 | you could have taken through life.
02:25:58.700 | Do you think about the trajectory of being a researcher,
02:26:06.640 | of then going to war in the space of ideas,
02:26:10.140 | of publishing further papers along this line?
02:26:13.560 | I mean, that's often the dynamic of that fascinating space
02:26:18.560 | is you have a junior researcher with brilliant ideas
02:26:21.980 | and a senior researcher that starts out as a mentor
02:26:25.000 | then becomes a competitor.
02:26:26.000 | I mean, that happens, but then
02:26:30.440 | it's almost an opportunity to shine:
02:26:33.560 | to publish a bunch more papers in this space,
02:26:36.460 | to tear it apart, to dig into it,
02:26:39.280 | like really make it a war of ideas.
02:26:42.580 | Did you consider that possible trajectory?
02:26:45.400 | - I did.
02:26:46.240 | I have a couple of things to say about it.
02:26:48.400 | One, this work was not central for me.
02:26:51.800 | I took a year on the telomere project
02:26:54.480 | because something fascinating occurred to me
02:26:57.560 | and I pursued it and the more I pursued it,
02:26:59.560 | the clearer it was there was something there,
02:27:01.420 | but it wasn't the focus of my graduate work.
02:27:03.800 | And I didn't want to become a telomere researcher.
02:27:08.660 | What I want to do is to be an evolutionary biologist
02:27:12.260 | who upgrades the toolkit of evolutionary concepts
02:27:15.900 | so that we can see more clearly
02:27:17.780 | how organisms function and why.
02:27:20.180 | And telomeres was a proof of concept, right?
02:27:24.880 | That paper was a proof of concept
02:27:26.340 | that the toolkit in question works.
02:27:30.380 | As for the need to pursue it further,
02:27:35.380 | I think it's kind of absurd
02:27:37.080 | and you're not the first person to say
02:27:38.480 | maybe that was the way to go about it,
02:27:40.060 | but the basic point is, look, the work was good.
02:27:42.520 | It turned out to be highly predictive.
02:27:47.000 | Frankly, the model of senescence that I presented
02:27:50.100 | is now widely accepted and I don't feel any misgivings
02:27:55.020 | at all about having spent a year on it,
02:27:57.520 | said my piece and moved on to other things,
02:28:00.000 | which frankly I think are bigger.
02:28:02.280 | I think there's a lot of good to be done
02:28:03.640 | and it would be a waste to get overly narrowly focused.
02:28:08.080 | - There's so many ways through the space of science
02:28:12.880 | and the most common way is to just publish a lot.
02:28:16.760 | Just publish a lot of papers, do this incremental work
02:28:19.200 | and exploring the space kind of like ants looking for food.
02:28:24.200 | You're tossing out a bunch of different ideas.
02:28:26.840 | Some of them could be brilliant breakthrough ideas,
02:28:28.960 | nature, some of them are more confidence kind
02:28:31.200 | of publications, all those kinds of things.
02:28:33.360 | Did you consider that kind of path in science?
02:28:37.420 | - Of course I considered it, but I must say
02:28:41.100 | the experience of having my first encounter
02:28:44.600 | with the process of peer review be this story,
02:28:48.720 | which was frankly a debacle from one end to the other
02:28:52.460 | with respect to the process of publishing,
02:28:55.840 | was not a very good sales pitch
02:28:58.800 | for trying to make a difference through publication.
02:29:01.480 | And I would point out part of what I ran into
02:29:03.320 | and I think frankly part of what explains Carol's behavior
02:29:06.920 | is that in some parts of science,
02:29:10.640 | there is this dynamic where PIs parasitize their underlings
02:29:15.640 | and if you're very, very good, you rise to the level
02:29:20.320 | where one day instead of being parasitized,
02:29:23.160 | you get to parasitize others.
02:29:25.320 | Now I find that scientifically despicable
02:29:28.320 | and it wasn't the culture of the lab I grew up in at all.
02:29:31.200 | My lab, in fact, the PI, Dick Alexander, who's now gone,
02:29:35.840 | but who was an incredible mind and a great human being,
02:29:40.440 | he didn't want his graduate students working
02:29:42.520 | on the same topics he was on,
02:29:44.280 | not because it wouldn't have been useful and exciting,
02:29:47.480 | but because in effect, he did not want any confusion
02:29:51.240 | about who had done what,
02:29:54.460 | because he was a great mentor and the idea was actually,
02:29:57.920 | a great mentor is not stealing ideas
02:30:00.400 | and you don't want people thinking that they are.
02:30:03.500 | So anyway, my point would be,
02:30:05.920 | I wasn't up for being parasitized.
02:30:11.200 | I don't like the idea that if you are very good,
02:30:14.440 | you get parasitized until it's your turn
02:30:16.320 | to parasitize others.
02:30:17.680 | That doesn't make sense to me.
02:30:19.180 | A crossing over from evolution into cellular biology
02:30:23.660 | may have exposed me to that.
02:30:25.480 | That may have been par for the course,
02:30:27.000 | but it doesn't make it acceptable.
02:30:29.960 | And I would also point out that my work falls
02:30:33.160 | in the realm of synthesis.
02:30:35.540 | My work generally takes evidence accumulated by others
02:30:41.540 | and places it together in order to generate hypotheses
02:30:46.440 | that explain sets of phenomena
02:30:48.640 | that are otherwise intractable.
02:30:51.320 | And I am not sure that that is best done
02:30:55.260 | with narrow publications that are read by few.
02:30:59.700 | And in fact, I would point to the very conspicuous example
02:31:03.020 | of Richard Dawkins, who I must say,
02:31:04.840 | I've learned a tremendous amount from and I greatly admire.
02:31:07.940 | Dawkins has almost no publication record
02:31:12.300 | in the sense of peer reviewed papers in journals.
02:31:15.780 | What he's done instead is done synthetic work
02:31:18.200 | and he's published it in books,
02:31:19.620 | which are not peer reviewed in the same sense.
02:31:22.580 | And frankly, I think there's no doubting
02:31:24.860 | his contribution to the field.
02:31:27.080 | So my sense is if Richard Dawkins can illustrate
02:31:32.080 | that one can make contributions to the field
02:31:34.420 | without using journals as the primary mechanism
02:31:38.380 | for distributing what you've come to understand,
02:31:40.620 | then it's obviously a valid mechanism
02:31:42.380 | and it's a far better one from the point of view
02:31:44.700 | of accomplishing what I want to accomplish.
02:31:46.420 | - Yeah, it's really interesting.
02:31:47.900 | There are of course several levels at which
02:31:49.220 | you can do that kind of synthesis,
02:31:50.780 | and that does require a lot of both broad
02:31:53.940 | and deep thinking, which is exceptionally valuable.
02:31:56.180 | You could also publish, I'm working on something
02:31:58.860 | with Andrew Huberman now, you can also publish synthesis.
02:32:01.580 | - Sure.
02:32:02.420 | - That's like review papers,
02:32:03.260 | they're exceptionally valuable for the communities.
02:32:06.660 | It brings the community together, tells a history,
02:32:09.380 | tells a story of where the community has been.
02:32:11.080 | It paints a picture of where the path lies for the future.
02:32:14.380 | I think it's really valuable.
02:32:15.560 | And Richard Dawkins is a good example of somebody
02:32:17.780 | that does that in book form that he kind of walks the line
02:32:21.900 | really interestingly.
02:32:23.700 | You have somebody like Neil deGrasse Tyson,
02:32:26.460 | who's more like a science communicator.
02:32:28.820 | Richard Dawkins sometimes is a science communicator,
02:32:30.980 | but he gets close to the technical,
02:32:33.980 | to where it's not shying away
02:32:36.960 | from being really a contribution to science.
02:32:41.460 | - No, he's made real contributions.
02:32:44.140 | - In book form.
02:32:45.060 | - Yes, he really has.
02:32:45.900 | - Which is fascinating.
02:32:47.620 | I mean, Roger Penrose, I mean, similar kind of idea.
02:32:51.620 | That's interesting, that's interesting.
02:32:53.020 | Synthesis, especially synthesis work,
02:32:56.440 | work that synthesizes ideas, does not necessarily
02:33:00.220 | need to be peer reviewed.
02:33:02.140 | It's peer reviewed by peers reading it.
02:33:07.140 | - Well.
02:33:09.040 | - And reviewing it.
02:33:10.100 | - That's it, it is reviewed by peers,
02:33:11.680 | which is not synonymous with peer review.
02:33:13.460 | And that's the thing is people don't understand
02:33:15.540 | that the two things aren't the same, right?
02:33:17.700 | Peer review is an anonymous process
02:33:20.200 | that happens before publication
02:33:23.340 | in a place where there is a power dynamic, right?
02:33:26.580 | I mean, the joke of course is that peer review
02:33:28.220 | is actually peer preview, right?
02:33:30.340 | Your biggest competitors get to see your work
02:33:32.820 | before it sees the light of day
02:33:34.100 | and decide whether or not it gets published.
02:33:37.220 | And, you know, again, when your formative experience
02:33:41.220 | with the publication apparatus is the one I had
02:33:43.500 | with the Telomere paper, there's no way
02:33:46.860 | that that seems like the right way
02:33:48.300 | to advance important ideas.
02:33:50.100 | - Yeah.
02:33:50.940 | - And, you know, what's the harm in publishing them
02:33:53.980 | so that your peers have to review them in public
02:33:55.900 | where they actually, if they're gonna disagree with you,
02:33:58.580 | they actually have to take the risk of saying,
02:34:00.620 | I don't think this is right and here's why, right?
02:34:03.420 | With their name on it.
02:34:04.540 | I'd much rather that.
02:34:05.580 | It's not that I don't want my work reviewed by peers,
02:34:07.660 | but I want it done in the open, you know,
02:34:10.260 | for the same reason you don't meet
02:34:11.500 | with dangerous people in private.
02:34:13.540 | You meet at the cafe.
02:34:14.660 | I want the work reviewed out in public.
02:34:17.640 | - Can I ask you a difficult question?
02:34:20.900 | - Sure.
02:34:21.740 | - There is popularity in martyrdom.
02:34:26.680 | There's popularity in pointing out
02:34:30.600 | that the emperor has no clothes.
02:34:32.200 | That can become a drug in itself.
02:34:41.020 | I've confronted this in scientific work I've done at MIT
02:34:45.340 | where there are certain things that are not done well.
02:34:49.900 | People are not being the best version of themselves.
02:34:52.500 | And particular aspects of a particular field
02:34:58.940 | are in need of a revolution.
02:35:03.260 | And part of me wanted to point that out
02:35:07.240 | versus doing the hard work of publishing papers
02:35:12.240 | and doing the revolution.
02:35:14.200 | Basically just pointing out, look,
02:35:15.900 | you guys are doing it wrong and then just walking away.
02:35:20.240 | Are you aware of the drug of martyrdom?
02:35:23.360 | Of the ego involved in it,
02:35:28.360 | that it can cloud your thinking?
02:35:31.500 | - Probably one of the best questions I've ever been asked.
02:35:35.840 | So let me try to sort it out.
02:35:38.300 | First of all, we are all mysteries to ourself at some level.
02:35:43.560 | So it's possible there's stuff going on in me
02:35:46.240 | that I'm not aware of that's driving.
02:35:48.320 | But in general, I would say one of my better strengths
02:35:52.080 | is that I'm not especially ego-driven.
02:35:54.280 | I have an ego.
02:35:56.360 | I clearly think highly of myself, but it is not driving me.
02:36:00.300 | I do not crave that kind of validation.
02:36:03.220 | I do crave certain things.
02:36:05.000 | I do love a good Eureka moment.
02:36:07.880 | There is something great about it.
02:36:09.360 | And there's something even better about the phone calls
02:36:11.720 | you make next when you share it.
02:36:13.360 | It's pretty fun.
02:36:15.840 | I really like it.
02:36:17.380 | I also really like my subject.
02:36:20.880 | There's something about a walk in the forest
02:36:23.800 | when you have a toolkit in which you can actually look
02:36:26.920 | at creatures and see something deep.
02:36:29.600 | I like it.
02:36:31.600 | That drives me.
02:36:33.080 | And I could entertain myself for the rest of my life.
02:36:35.760 | If I was somehow isolated from the rest of the world,
02:36:39.840 | but I was in a place that was biologically interesting,
02:36:42.960 | hopefully I would be with people that I love
02:36:45.960 | and pets that I love, believe it or not.
02:36:48.400 | But if I were in that situation
02:36:50.780 | and I could just go out every day and look at cool stuff
02:36:53.400 | and figure out what it means, I could be all right with that.
02:36:56.700 | So I'm not heavily driven by the ego thing, as you put it.
02:37:02.720 | - So I'm completely the same,
02:37:05.640 | except instead of the pets, I would put robots.
02:37:09.240 | So it's not, it's the Eureka, it's the exploration
02:37:12.080 | of the subject that brings you joy and fulfillment.
02:37:16.040 | It's not the ego.
02:37:17.880 | - Well, there's more to say.
02:37:18.880 | No, I really don't think it's the ego thing.
02:37:21.520 | I will say I also have kind of a secondary passion
02:37:24.240 | for robot stuff.
02:37:25.160 | I've never made anything useful,
02:37:27.560 | but I do believe, I believe I found my calling.
02:37:30.680 | But if this wasn't my calling,
02:37:32.320 | my calling would have been inventing stuff.
02:37:34.480 | I really enjoy that too.
02:37:36.120 | So I get what you're saying about the analogy quite well.
02:37:39.940 | As far as the martyrdom thing,
02:37:43.060 | I understand the drug you're talking about,
02:37:47.920 | and I've seen it more than I felt it.
02:37:51.000 | I do, if I'm just to be completely candid,
02:37:53.800 | and this question is so good, it deserves a candid answer,
02:37:57.640 | I do like the fight.
02:38:00.360 | I like fighting against people I don't respect,
02:38:04.720 | and I like winning.
02:38:06.400 | But I have no interest in martyrdom.
02:38:10.020 | One of the reasons I have no interest in martyrdom
02:38:12.680 | is that I'm having too good a time.
02:38:14.380 | I very much enjoy my life.
02:38:17.560 | - That's such a good answer.
02:38:18.760 | - I have a wonderful wife, I have amazing children,
02:38:23.420 | I live in a lovely place, I don't wanna exit.
02:38:27.520 | Any quicker than I have to.
02:38:29.440 | That said, I also believe in things,
02:38:32.220 | and a willingness to exit, if that's the only way,
02:38:35.600 | is not exactly inviting martyrdom,
02:38:37.840 | but it is an acceptance that fighting is dangerous,
02:38:41.260 | and going up against powerful forces
02:38:43.720 | means who knows what will come of it.
02:38:46.040 | I don't have the sense that the thing is out there
02:38:48.840 | that used to kill inconvenient people.
02:38:51.420 | I don't think that's how it's done anymore.
02:38:52.860 | It's primarily done through destroying them reputationally,
02:38:56.860 | which is not something I relish the possibility of.
02:39:00.520 | But there's a difference between a willingness
02:39:04.880 | to face the hazard rather than a desire
02:39:10.440 | to face it because of the thrill.
02:39:13.560 | For me, the thrill is in fighting when I'm in the right.
02:39:18.560 | I feel that that is a worthwhile way
02:39:22.320 | to take what I see as the kind of brutality
02:39:27.320 | that is built into men,
02:39:29.380 | and to channel it to something useful.
02:39:33.580 | If it is not channeled into something useful,
02:39:35.260 | it will be channeled into something else,
02:39:36.580 | so it damn well better be channeled into something useful.
02:39:38.940 | - It's not motivated by fame or popularity,
02:39:41.100 | those kinds of things.
02:39:42.160 | You're just making me realize that enjoying the fight,
02:39:50.500 | fighting the powerful for an idea that you believe is right,
02:39:53.400 | is a kind of optimism for the human spirit.
02:40:00.360 | It's like, we can win this.
02:40:04.300 | It's almost like you're turning into action,
02:40:07.980 | into personal action, this hope for humanity,
02:40:12.980 | by saying like, we can win this.
02:40:15.720 | And that makes you feel good about the rest of humanity,
02:40:20.720 | that if there's people like me, then we're going to be okay.
02:40:27.840 | Even if your ideas might be wrong or not,
02:40:31.360 | but if you believe they're right,
02:40:33.060 | and you're fighting the powerful against all odds,
02:40:37.920 | then we're going to be okay.
02:40:39.540 | If I were to project, I mean,
02:40:44.560 | because I enjoy the fight as well,
02:40:46.320 | I think what brings me joy
02:40:50.080 | is that it's almost like optimism in action.
02:40:55.080 | - Well, it's a little different for me.
02:40:57.660 | And again, I recognize you,
02:41:01.640 | your construction's familiar, even if it isn't mine, right?
02:41:04.700 | For me, I actually expect us not to be okay.
02:41:10.480 | And I'm not okay with that.
02:41:12.240 | But what's really important, if I feel like,
02:41:15.720 | what I've said is I don't know of any reason
02:41:17.920 | that it's too late.
02:41:19.280 | As far as I know, we could still save humanity,
02:41:22.120 | and we could get to the fourth frontier
02:41:23.600 | or something akin to it.
02:41:25.760 | But I expect us not to, I expect us to fuck it up, right?
02:41:29.520 | I don't like that thought, but I've looked into the abyss,
02:41:31.880 | and I've done my calculations,
02:41:34.000 | and the number of ways we could not succeed are many,
02:41:38.440 | and the number of ways that we could manage
02:41:40.480 | to get out of this very dangerous phase of history is small.
02:41:43.600 | But the thing I don't have to worry about
02:41:47.000 | is that I didn't do enough, right?
02:41:50.580 | That I was a coward, that I prioritized other things.
02:41:55.580 | At the end of the day, I think I will be able
02:41:59.000 | to say to myself, and in fact,
02:42:00.460 | the thing that allows me to sleep
02:42:02.200 | is that when I saw clearly what needed to be done,
02:42:05.640 | I tried to do it to the extent that it was in my power.
02:42:08.480 | And if we fail, as I expect us to,
02:42:12.800 | I can't say, well, geez, that's on me.
02:42:16.240 | And frankly, I regard what I just said to you
02:42:18.480 | as something like a personality defect, right?
02:42:22.220 | I'm trying to free myself from the sense
02:42:24.300 | that this is my fault.
02:42:25.800 | On the other hand, my guess is that personality defect
02:42:28.240 | is probably good for humanity, right?
02:42:31.400 | It's a good one for me to have,
02:42:33.160 | the externalities of it are positive,
02:42:36.160 | so I don't feel too bad about it.
02:42:38.720 | - Yeah, that's funny.
02:42:39.560 | So yeah, our perspective on the world are different,
02:42:43.640 | but they rhyme, like you said.
02:42:45.240 | 'Cause I've also looked into the abyss
02:42:47.680 | and it kind of smiled nervously back.
02:42:51.720 | So I have a more optimistic sense
02:42:54.720 | that we're gonna win, more than likely we're going to be okay.
02:42:58.160 | - I'm right there with you, brother.
02:43:00.240 | I'm hoping you're right.
02:43:01.600 | I'm expecting me to be right.
02:43:03.800 | But back to Eric, we had a wonderful conversation.
02:43:07.320 | In that conversation, he played the big brother role
02:43:11.080 | and he was very happy about it.
02:43:13.400 | He was self-congratulatory about it.
02:43:15.460 | Can you talk to the ways in which Eric
02:43:20.880 | made you a better man throughout your life?
02:43:24.040 | - Yeah, hell yeah.
02:43:25.600 | I mean, for one thing, Eric and I are interestingly similar
02:43:30.280 | in some ways and radically different in some other ways.
02:43:33.080 | And it's often a matter of fascination
02:43:35.760 | to people who know us both,
02:43:36.980 | because almost always people meet one of us first
02:43:39.240 | and they sort of get used to that thing
02:43:40.920 | and then they meet the other
02:43:41.800 | and it throws the model into chaos.
02:43:44.400 | But I had a great advantage, which is I came second.
02:43:48.440 | So although it was kind of a pain in the ass
02:43:51.800 | to be born into a world that had Eric in it,
02:43:53.780 | because he's a force of nature,
02:43:55.920 | it was also terrifically useful,
02:43:58.000 | because A, he was a very awesome older brother
02:44:02.640 | who made interesting mistakes, learned from them
02:44:06.000 | and conveyed the wisdom of what he had discovered.
02:44:08.760 | And that was, I don't know who else ends up so lucky
02:44:13.560 | as to have that kind of person blazing the trail.
02:44:18.000 | And also, probably, my hypothesis
02:44:22.520 | for what birth order effects are
02:44:24.840 | is that they're actually adaptive.
02:44:26.720 | The reason that a second born is different
02:44:30.680 | than a first born is that they're not born
02:44:32.840 | into a world with the same niches in it.
02:44:35.320 | And so the thing about Eric
02:44:36.840 | is he's been completely dominant
02:44:39.520 | in the realm of fundamental thinking.
02:44:44.520 | What he's fascinated by is the fundamental of fundamentals.
02:44:48.420 | And he's excellent at it,
02:44:49.740 | which meant that I was born into a world
02:44:51.640 | where somebody was becoming excellent in that
02:44:53.600 | and for me to be anywhere near
02:44:55.560 | the fundamental of fundamentals was going to be pointless.
02:44:58.800 | I was gonna be playing second fiddle forever.
02:45:00.880 | And I think that that actually drove me
02:45:02.560 | to the other end of the continuum
02:45:04.680 | between fundamental and emergent.
02:45:06.560 | And so I became fascinated with biology
02:45:09.120 | and have been since I was three years old.
02:45:12.640 | I think Eric drove that.
02:45:15.240 | And I have to thank him for it because, you know, I mean--
02:45:19.360 | - I never thought of,
02:45:20.800 | so Eric drives towards the fundamental
02:45:24.080 | and you drive towards the emergent,
02:45:26.360 | the physics and the biology.
02:45:28.200 | - Right, opposite ends of the continuum.
02:45:30.080 | And as Eric would be quick to point out
02:45:32.640 | if he was sitting here,
02:45:34.160 | I treat the emergent layer,
02:45:36.160 | I seek the fundamentals in it,
02:45:37.760 | which is sort of an echo of Eric's style of thinking,
02:45:40.360 | but applied to the very far complexity.
02:45:43.440 | - He overpoweringly argues for the importance of physics,
02:45:48.440 | the fundamental of the fundamental.
02:45:52.160 | He's not here to defend himself.
02:45:57.420 | Is there an argument to be made against that?
02:46:00.200 | Or that biology, the emergent,
02:46:03.040 | the study of the thing that emerged
02:46:06.760 | when the fundamental acts at the universal,
02:46:09.680 | at the cosmic scale and then builds the beautiful thing
02:46:12.320 | that is us, is much more important?
02:46:14.380 | Psychology, biology,
02:46:18.360 | the systems that we're actually interacting with
02:46:21.640 | in this human world are much more important to understand
02:46:25.420 | than the low level theories of quantum mechanics
02:46:30.420 | and general relativity.
02:46:32.940 | - Yeah, I can't say that one is more important.
02:46:35.700 | I think there's probably a different timescale.
02:46:38.400 | I think understanding the emergent layer
02:46:40.720 | is more often useful,
02:46:42.960 | but the bang for the buck at the far fundamental layer
02:46:47.280 | may be much greater.
02:46:48.400 | So for example, the fourth frontier,
02:46:51.220 | I'm pretty sure it's gonna have to be fusion powered.
02:46:55.660 | I don't think anything else will do it,
02:46:57.260 | but once you had fusion power,
02:46:58.820 | assuming we didn't just dump fusion power on the market
02:47:01.460 | the way we would be likely to
02:47:02.620 | if it was invented usefully tomorrow.
02:47:05.720 | But if we had fusion power
02:47:08.580 | and we had a little bit more wisdom than we have,
02:47:10.980 | you could do an awful lot.
02:47:12.060 | And that's not gonna come from people like me
02:47:15.380 | who look at dynamics.
02:47:17.980 | - Can I argue against that?
02:47:19.420 | - Please.
02:47:21.280 | - I think the way to unlock fusion power
02:47:25.700 | is through artificial intelligence.
02:47:27.520 | So I think most of the breakthrough ideas
02:47:32.580 | in the futures of science will be developed by AI systems.
02:47:36.020 | And I think in order to build intelligent AI systems,
02:47:39.060 | you have to be a scholar of the fundamental of the emergent,
02:47:43.900 | of biology, of the neuroscience,
02:47:47.540 | of the way the brain works, of intelligence,
02:47:49.900 | of consciousness, and those things,
02:47:52.960 | at least directly, don't have anything to do with physics.
02:47:56.140 | - Well, you're making me a little bit sad
02:47:58.260 | because my addiction to the aha moment thing
02:48:02.080 | is incompatible with outsourcing that job.
02:48:06.740 | - You don't like to outsource that?
02:48:07.580 | - I don't wanna outsource that thing to the AI.
02:48:09.380 | - You reap the moment.
02:48:10.420 | - And actually I've seen this happen before
02:48:13.240 | because some of the people who trained Heather and me
02:48:16.400 | were phylogenetic systematists.
02:48:19.380 | Arnold Kluge in particular.
02:48:21.740 | And the problem with systematics is that to do it right
02:48:26.460 | when your technology is primitive,
02:48:28.800 | you have to be deeply embedded
02:48:30.580 | in the philosophical and the logical, right?
02:48:34.180 | Your method has to be based in the highest level of rigor.
02:48:39.180 | Once you can sequence genes,
02:48:42.420 | genes can spit so much data at you
02:48:44.180 | that you can overwhelm high quality work
02:48:46.840 | with just lots and lots and lots of automated work.
02:48:49.980 | And so in some sense,
02:48:51.780 | there's like a generation of phylogenetic systematists
02:48:54.980 | who are the last of the greats
02:48:56.660 | because what's replacing them is sequencers.
02:48:58.900 | So anyway, maybe you're right about the AI,
02:49:03.140 | and I guess I'm-- - Makes you sad.
02:49:04.900 | - I like figuring stuff out.
02:49:07.960 | - Is there something that you disagree with Eric on,
02:49:11.260 | you've been trying to convince him,
02:49:13.100 | you've failed so far, but you will eventually succeed?
02:49:17.720 | - You know, that is a very long list.
02:49:20.600 | Eric and I have tensions over certain things
02:49:24.120 | that recur all the time.
02:49:26.960 | And I'm trying to think what would be the ideal--
02:49:29.480 | - Is it in the space of science,
02:49:30.860 | in the space of philosophy, politics, family, love, robots?
02:49:35.860 | - Well, all right, let me,
02:49:39.720 | I'm just gonna use your podcast
02:49:42.500 | to make a bit of cryptic war
02:49:44.760 | and just say there are many places
02:49:47.120 | in which I believe that I have butted heads with Eric
02:49:50.720 | over the course of decades,
02:49:52.340 | and I have seen him move in my direction
02:49:55.040 | substantially over time. - So you've been winning.
02:49:57.560 | He might win a battle here or there,
02:49:59.460 | but you've been winning the war.
02:50:00.580 | - I would not say that.
02:50:01.860 | It's quite possible he could say the same thing about me.
02:50:04.780 | And in fact, I know that it's true.
02:50:06.220 | There are places where he's absolutely convinced me.
02:50:08.440 | But in any case, I do believe it's at least,
02:50:11.880 | it may not be a totally even fight,
02:50:13.220 | but it's more even than some will imagine.
02:50:15.340 | But yeah, we have,
02:50:17.440 | there are things I say that drive him nuts.
02:50:21.580 | Like when something, like you heard me talk about the,
02:50:27.140 | what was it?
02:50:29.540 | It was the autopilot that seems to be putting
02:50:33.820 | a great many humans in needless medical jeopardy
02:50:37.180 | over the COVID-19 pandemic.
02:50:40.300 | And my feeling is we can say this almost for sure.
02:50:44.460 | Anytime you have the appearance
02:50:47.020 | of some captured gigantic entity
02:50:50.680 | that is censoring you on YouTube
02:50:52.900 | and handing down dictates from the WHO and all of that,
02:50:56.780 | it is sure that there will be
02:50:59.180 | a certain amount of collusion, right?
02:51:01.300 | There's gonna be some embarrassing emails in some places
02:51:03.920 | that are gonna reveal some shocking connections.
02:51:05.780 | And then there's gonna be an awful lot of emergence
02:51:09.500 | that didn't involve collusion, right?
02:51:11.380 | In which people were doing their little part of a job
02:51:13.420 | and something was emerging.
02:51:14.460 | And you never know what the admixture is.
02:51:16.740 | How much are we looking at actual collusion
02:51:19.580 | and how much are we looking at an emergent process?
02:51:21.540 | But you should always walk in
02:51:23.060 | with the sense that it's gonna be a ratio.
02:51:24.820 | And the question is, what is the ratio in this case?
02:51:27.640 | I think this drives Eric nuts
02:51:29.820 | because he is very focused on the people.
02:51:32.500 | I think he's focused on the people who have a choice
02:51:34.980 | and make the wrong one.
02:51:36.900 | And anyway, he may--
02:51:38.580 | - Discussion of the ratio is a distraction from that.
02:51:41.340 | - I think he takes it almost as an offense
02:51:44.980 | because it grants cover to people who are harming others.
02:51:49.980 | And I think it offends him morally.
02:51:56.140 | And if I had to say,
02:51:57.900 | I would say it alters his judgment on the matter.
02:52:02.020 | But anyway, certainly useful just to leave open
02:52:05.300 | the two possibilities and say it's a ratio,
02:52:07.300 | but we don't know which one.
02:52:08.700 | - Brother to brother, do you love the guy?
02:52:12.840 | - Hell yeah, hell yeah.
02:52:15.460 | And I'd love him if he was just my brother,
02:52:18.220 | but he's also awesome.
02:52:19.260 | So I love him and I love him for who he is.
02:52:21.980 | - So let me ask you about, back to your book,
02:52:25.900 | "Hunter-Gatherer's Guide to the 21st Century."
02:52:29.680 | I can't wait both for the book
02:52:31.900 | and the videos you do on the book.
02:52:33.740 | That's really exciting that there's like a structured,
02:52:35.900 | organized way to present this.
02:52:37.680 | A kind of, from an evolutionary biology perspective,
02:52:44.620 | a guide for the future, using our past
02:52:48.420 | as the fundamental, the emergent way
02:52:52.340 | to present a picture of the future.
02:52:56.160 | Let me ask you about something that,
02:52:59.160 | I think about a little bit in this modern world,
02:53:02.720 | which is monogamy.
02:53:03.820 | So I personally value monogamy.
02:53:10.200 | One girl, ride or die.
02:53:12.420 | - There you go.
02:53:13.560 | Ride or, no, that's exactly it.
02:53:15.480 | - But that said, I don't know what's the right way
02:53:21.720 | to approach this, but from an evolutionary biology
02:53:26.480 | perspective or from just looking at modern society,
02:53:30.120 | that seems to be an idea that's not,
02:53:32.500 | what's the right way to put it, flourishing?
02:53:35.820 | - It is waning.
02:53:38.380 | - It's waning.
02:53:39.260 | So I suppose based on your reaction,
02:53:44.100 | you're also a supporter of monogamy
02:53:45.880 | or you value monogamy.
02:53:47.940 | Are you and I just delusional?
02:53:51.160 | What can you say about monogamy
02:53:56.400 | from the context of your book,
02:53:58.040 | from the context of evolutionary biology,
02:54:00.640 | from the context of being human?
02:54:02.560 | - Yeah, I can say that I fully believe
02:54:05.040 | that we are actually enlightened
02:54:06.600 | and that although monogamy is waning,
02:54:09.340 | that it is not waning because there is a superior system.
02:54:12.320 | It is waning for predictable other reasons.
02:54:15.380 | So let us just say there is a lot of pre-trans fallacy here
02:54:20.380 | where people go through a phase where they recognize
02:54:26.120 | that actually we know a lot about the evolution of monogamy
02:54:31.120 | and we can tell from the fact that humans
02:54:34.040 | are somewhat sexually dimorphic,
02:54:36.600 | that there has been a lot of polygyny in human history.
02:54:39.480 | And in fact, most of human history was largely polygynous.
02:54:45.120 | But it is also the case that most of the people
02:54:48.760 | on earth today belong to civilizations
02:54:51.200 | that are at least nominally monogamous
02:54:53.040 | and have practiced monogamy.
02:54:54.640 | And that's not anti-evolutionary.
02:54:58.060 | What that is, is part of what I mentioned before,
02:55:01.360 | where human beings can swap out their software program
02:55:05.280 | and different mating patterns are favored
02:55:09.840 | in different periods of history.
02:55:11.960 | So I would argue that the benefit of monogamy,
02:55:15.160 | the primary one that drives the evolution
02:55:17.360 | of monogamous patterns in humans,
02:55:19.560 | is that it brings all adults into child rearing.
02:55:23.520 | Now the reason that that matters
02:55:26.640 | is because human babies are very labor intensive.
02:55:29.880 | In order to raise them properly,
02:55:31.120 | having two parents is a huge asset
02:55:34.040 | and having more than two parents,
02:55:35.440 | having an extended family also is very important.
02:55:39.720 | But what that means is that for a population
02:55:43.480 | that is expanding, a monogamous mating system makes sense.
02:55:48.000 | It makes sense because it means that the number
02:55:50.140 | of offspring that can be raised is elevated.
02:55:52.840 | It's elevated because all potential parents
02:55:56.280 | are involved in parenting.
02:55:58.080 | Whereas if you sideline a bunch of males
02:56:00.160 | by having a polygynous system
02:56:01.520 | in which one male has many females,
02:56:03.240 | which is typically the way that works,
02:56:05.360 | what you do is you sideline all those males,
02:56:07.040 | which means the total amount of parental effort is lower
02:56:09.960 | and the population can't grow.
02:56:12.160 | So what I'm arguing is that you should expect
02:56:15.040 | to see populations that face the possibility
02:56:19.400 | of expansion endorse monogamy.
02:56:21.880 | And at the point that they have reached carrying capacity,
02:56:24.000 | you should expect to see polygyny break back out.
02:56:26.480 | And what we are seeing is a kind of false sophistication
02:56:30.120 | around polyamory, which will end up breaking down
02:56:33.640 | into polygyny, which will not be in the interest
02:56:36.740 | of most people.
02:56:37.580 | Really the only people whose interest it could be argued
02:56:39.760 | to be in would be the very small number of males
02:56:43.480 | at the top who have many partners
02:56:47.200 | and everybody else suffers.
02:56:48.960 | - Is it possible to make the argument,
02:56:51.240 | if we focus in on those males at the quote unquote top
02:56:55.040 | with many female partners, is it possible to say
02:56:59.280 | that that's a suboptimal life?
02:57:01.280 | That a single partner is the optimal life?
02:57:05.240 | - Well, it depends what you mean.
02:57:06.320 | I have a feeling that you and I wouldn't have
02:57:08.360 | to go very far to figure out that what might
02:57:12.900 | be evolutionarily optimal doesn't match my values
02:57:16.700 | as a person and I'm sure it doesn't match yours either.
02:57:19.080 | - Can we try to dig into that gap between those two?
02:57:23.200 | - Sure.
02:57:24.040 | I mean, we can do it very simply.
02:57:26.980 | Selection might favor your engaging in war
02:57:33.160 | against a defenseless enemy or genocide.
02:57:36.780 | It's not hard to figure out how that might put
02:57:40.880 | your genes at advantage.
02:57:43.280 | I don't know about you, Lex, I'm not getting involved
02:57:45.760 | in no genocide, it's not gonna happen.
02:57:47.760 | I won't do it, I will do anything to avoid it.
02:57:49.880 | So some part of me has decided that my conscious self
02:57:54.400 | and the values that I hold trump my evolutionary self
02:57:59.400 | and once you figure out that in some extreme case
02:58:03.080 | that's true and then you realize that that means
02:58:05.520 | it must be possible in many other cases
02:58:07.520 | and you start going through all of the things
02:58:09.120 | that selection would favor and you realize
02:58:10.720 | that a fair fraction of the time,
02:58:12.360 | actually you're not up for this.
02:58:14.000 | You don't wanna be some robot on a mission
02:58:17.160 | that involves genocide when necessary.
02:58:19.980 | You wanna be your own person and accomplish things
02:58:22.000 | that you think are valuable.
02:58:24.880 | And so among those are not advocating,
02:58:29.880 | let's suppose you were in a position
02:58:32.120 | to be one of those males at the top of a polygynous system.
02:58:35.040 | We both know why that would be rewarding, right?
02:58:38.180 | But we also both recognize--
02:58:39.560 | - Do we? - Yeah, sure.
02:58:41.440 | - Lots of sex? - Yeah.
02:58:42.940 | - Okay, what else? - Lots of sex
02:58:44.320 | and lots of variety, right?
02:58:45.960 | So look, every red-blooded American/Russian male
02:58:50.960 | can understand why that's appealing, right?
02:58:53.840 | On the other hand, it is up against an alternative
02:58:57.800 | which is having a partner with whom one is bonded
02:59:02.400 | especially closely, right?
02:59:06.200 | And so-- - AKA love.
02:59:08.240 | - Right, well, I don't wanna straw man
02:59:13.240 | the polygyny position.
02:59:14.860 | Obviously, polygyny is complex and there's nothing
02:59:17.320 | that stops a man presumably from loving multiple partners
02:59:22.320 | and from them loving him back.
02:59:24.260 | But in terms of, if love is your thing,
02:59:26.360 | there's a question about, okay, what is the quality of love
02:59:29.360 | if it is divided over multiple partners, right?
02:59:32.480 | And what is the net consequence for love in a society
02:59:36.900 | when multiple people will be frozen out
02:59:39.420 | for every individual male in this case who has it?
02:59:42.520 | And what I would argue is, and you know,
02:59:48.040 | this is weird to even talk about,
02:59:49.800 | but this is partially me just talking
02:59:51.520 | from personal experience.
02:59:53.000 | I think there actually is a monogamy program in us
02:59:55.920 | and it's not automatic.
02:59:57.640 | But if you take it seriously, you can find it.
03:00:02.520 | And frankly, marriage, and it doesn't have to be marriage,
03:00:06.500 | but whatever it is that results in a lifelong bond
03:00:09.320 | with a partner has gotten a very bad rap.
03:00:11.960 | You know, it's the butt of too many jokes.
03:00:14.280 | But the truth is, it's hugely rewarding.
03:00:17.800 | It's not easy.
03:00:19.680 | But if you know that you're looking for something, right?
03:00:22.400 | If you know that the objective actually exists
03:00:24.200 | and it's not some utopian fantasy that can't be found,
03:00:27.080 | if you know that there's some real world,
03:00:29.840 | you know, warts and all version of it,
03:00:32.440 | then you might actually think,
03:00:33.840 | hey, that is something I want.
03:00:35.120 | And you might pursue it.
03:00:35.960 | And my guess is you'd be very happy when you find it.
03:00:38.080 | - Yeah, I think there is,
03:00:39.640 | getting to the fundamentals of the emergent,
03:00:41.640 | I feel like there is some kind of physics of love.
03:00:43.620 | So one, there's a conservation thing going on.
03:00:47.240 | So if you have like many partners,
03:00:49.240 | yeah, in theory, you should be able
03:00:52.480 | to love all of them deeply.
03:00:54.160 | But it seems like in reality, that love gets split.
03:00:56.760 | - Yep.
03:00:58.880 | - Now there's another law that's interesting
03:01:01.080 | in terms of monogamy.
03:01:02.720 | I don't know if it's at the physics level,
03:01:04.600 | but if you are in a monogamous relationship by choice
03:01:09.280 | and almost as a slight rebellion against social norms,
03:01:15.280 | that's much more powerful.
03:01:17.760 | Like if you choose that one partnership,
03:01:20.920 | that's also more powerful.
03:01:22.680 | If everybody's in a monogamous relationship,
03:01:24.480 | there's this pressure to be married
03:01:26.280 | and this pressure of society, that's different.
03:01:28.880 | Because that's almost like a constraint on your freedom
03:01:32.200 | that is enforced by something other than your own ideals.
03:01:35.440 | It's by somebody else.
03:01:37.760 | When you yourself choose to, I guess,
03:01:40.620 | create these constraints, that enriches that love.
03:01:45.000 | So there's some kind of love function,
03:01:47.560 | like E equals MC squared, but for love,
03:01:50.160 | that I feel like if you have less partners
03:01:53.240 | and it's done by choice, that can maximize that.
03:01:56.520 | And that love can transcend the biology,
03:02:00.900 | transcend the evolutionary biology forces
03:02:03.520 | that have to do much more with survival
03:02:06.280 | and all those kinds of things.
03:02:07.800 | It can transcend to take us to a richer experience,
03:02:11.880 | which we have the luxury of having, exploring,
03:02:14.360 | of happiness, of joy, of fulfillment,
03:02:17.960 | all those kinds of things.
03:02:19.400 | - Totally agree with this.
03:02:21.040 | And there's no question that choosing it,
03:02:24.800 | when there are other choices, imbues it with meaning
03:02:27.960 | that it might not otherwise have.
03:02:30.760 | I would also say, I'm really struck by,
03:02:35.640 | and I have a hard time not feeling terrible sadness
03:02:40.120 | over what younger people are coming
03:02:44.360 | to think about this topic.
03:02:46.880 | I think they're missing something so important
03:02:49.520 | and so hard to phrase,
03:02:51.520 | and they don't even know that they're missing it.
03:02:54.280 | They might know that they're unhappy,
03:02:55.960 | but they don't understand what it is
03:02:58.000 | they're even looking for,
03:02:58.840 | because nobody's really been honest with them
03:03:00.760 | about what their choices are.
03:03:02.160 | And I have to say, if I was a young person,
03:03:05.160 | or if I was advising a young person,
03:03:06.960 | which I used to do, again, a million years ago
03:03:09.160 | when I was a college professor, four years ago,
03:03:12.000 | but I used to talk to students.
03:03:13.780 | I knew my students really well,
03:03:15.000 | and they would ask questions about this,
03:03:16.640 | and they were always curious,
03:03:17.640 | because Heather and I seemed to have a good relationship,
03:03:19.920 | and many of them knew both of us.
03:03:21.980 | So they would talk to us about this.
03:03:24.480 | If I was advising somebody, I would say,
03:03:28.040 | do not bypass the possibility
03:03:30.920 | that what you are supposed to do
03:03:32.680 | is find somebody worthy,
03:03:36.240 | somebody who can handle it,
03:03:37.840 | somebody who you are compatible with,
03:03:39.520 | and that you don't have to be perfectly compatible.
03:03:41.840 | It's not about dating until you find the one.
03:03:44.880 | It's about finding somebody whose underlying values
03:03:48.600 | and viewpoint are complementary to yours,
03:03:51.200 | sufficient that you fall in love.
03:03:53.060 | If you find that person, opt out together.
03:03:58.480 | Get out of this damn system
03:04:00.400 | that's telling you what's sophisticated to think about,
03:04:02.880 | love and romance and sex.
03:04:04.620 | Ignore it together, right?
03:04:06.520 | That's the key.
03:04:08.040 | And I believe you'll end up laughing in the end
03:04:11.560 | if you do it.
03:04:12.400 | And you'll discover, wow, that's a hellscape that I opted out of,
03:04:17.400 | and this thing I opted into, complicated, difficult,
03:04:21.480 | worth it.
03:04:22.680 | - Nothing that's worth it is ever not difficult.
03:04:25.840 | So we should even just skip the whole statement
03:04:29.320 | about difficult.
03:04:30.200 | - Yeah, right.
03:04:31.320 | I wanna be honest.
03:04:32.320 | It's not like, oh, it's nonstop joy.
03:04:35.160 | No, it's fricking complex.
03:04:36.960 | But worth it?
03:04:39.000 | No question in my mind.
03:04:41.100 | - Is there advice outside of love
03:04:43.000 | that you can give to young people?
03:04:45.400 | You were a million years ago a professor.
03:04:48.000 | Is there advice you can give to young people,
03:04:51.540 | high schoolers, college students,
03:04:54.120 | about career, about life?
03:04:56.780 | - Yeah, but it's not, they're not gonna like it
03:04:58.840 | 'cause it's not easy to operationalize.
03:05:00.680 | So, and this was a problem
03:05:01.840 | when I was a college professor too.
03:05:03.160 | People would ask me what they should do.
03:05:04.800 | Should they go to graduate school?
03:05:06.360 | I had almost nothing useful to say
03:05:08.360 | because the job market and the market of,
03:05:12.640 | you know, pre-job training and all of that,
03:05:15.040 | these things are all so distorted and corrupt
03:05:20.040 | that I didn't wanna point anybody to anything, right?
03:05:23.520 | Because it's all broken.
03:05:24.700 | And I would tell them that.
03:05:26.620 | But I would say that results in a kind of meta-level advice
03:05:31.380 | that I do think is useful.
03:05:33.460 | You don't know what's coming.
03:05:35.900 | You don't know where the opportunities will be.
03:05:38.760 | You should invest in tools rather than knowledge, right?
03:05:42.600 | To the extent that you can do things,
03:05:44.500 | you can repurpose that no matter what the future brings
03:05:47.920 | to the extent that, you know, if you, as a robot guy, right,
03:05:51.940 | you've got the skills of a robot guy.
03:05:53.880 | Now, if civilization failed
03:05:56.800 | and the stuff of robot building disappeared with it,
03:05:59.620 | you'd still have the mind of a robot guy.
03:06:02.600 | And the mind of a robot guy can retool
03:06:04.540 | around all kinds of things,
03:06:05.740 | whether you're, you know, forced to work with, you know,
03:06:09.320 | fibers that are made into ropes, right?
03:06:12.680 | Your mechanical mind would be useful in all kinds of places.
03:06:15.920 | So invest in tools like that that can be easily repurposed
03:06:19.240 | and invest in combinations of tools, right?
03:06:23.800 | If civilization keeps limping along,
03:06:27.800 | you're gonna be up against all sorts of people
03:06:30.840 | who have studied the things that you studied, right?
03:06:33.360 | If you think, hey, computer programming
03:06:34.840 | is really, really cool,
03:06:36.120 | and you pick up computer programming, guess what?
03:06:38.400 | You just entered a large group of people
03:06:40.660 | who have that skill and many of them
03:06:42.080 | will be better than you, almost certainly.
03:06:44.800 | On the other hand, if you combine that with something else
03:06:48.600 | that's very rarely combined with it,
03:06:50.800 | if you have, I don't know if it's carpentry
03:06:54.080 | and computer programming,
03:06:55.660 | if you take combinations of things that are,
03:06:59.060 | even if they're both common,
03:07:00.560 | but they're not commonly found together,
03:07:03.040 | then those combinations create a rarefied space
03:07:06.080 | that you inhabit.
03:07:07.240 | And even if the things don't even really touch,
03:07:10.440 | but nonetheless, they create a mind
03:07:12.240 | in which the two things are live
03:07:13.760 | and you can move back and forth between them
03:07:15.920 | and step out of your own perspective
03:07:18.360 | by moving from one to the other,
03:07:20.400 | that will increase what you can see
03:07:22.640 | and the quality of your tools.
03:07:24.480 | And so anyway, that isn't useful advice.
03:07:26.440 | It doesn't tell you whether you should go
03:07:27.520 | to graduate school or not,
03:07:29.040 | but it does tell you the one thing we can say for certain
03:07:32.880 | about the future is that it's uncertain
03:07:34.560 | and so prepare for it.
03:07:36.040 | - And like you said, there's cool things to be discovered
03:07:38.800 | in the intersection of fields and ideas.
03:07:42.480 | And I would look at grad school that way,
03:07:44.960 | actually, if you do go.
03:07:46.960 | Or I see, I mean, this is such a,
03:07:51.760 | like every course in grad school, undergrad too,
03:07:55.160 | was like this little journey that you're on
03:07:57.800 | that explores a particular field.
03:08:00.040 | And it's not immediately obvious how useful it is,
03:08:03.760 | but it allows you to discover intersections
03:08:08.360 | between that thing and some other thing.
03:08:11.120 | So you're bringing to the table these pieces of knowledge,
03:08:16.120 | some of which when intersected might create a niche
03:08:19.880 | that's completely novel, unique, and will bring you joy.
03:08:23.560 | I have that, I mean, I took a huge number of courses
03:08:25.760 | in theoretical computer science.
03:08:28.040 | Most of them seem useless,
03:08:29.840 | but they totally changed the way I see the world
03:08:32.720 | in ways that I'm not prepared to,
03:08:34.760 | or that are a little bit difficult to, kind of make explicit.
03:08:38.640 | But taken together, they've allowed me to see,
03:08:43.640 | for example, the world of robotics totally different
03:08:48.320 | and different from many of my colleagues and friends
03:08:51.160 | and so on.
03:08:52.000 | And I think that's a good way to see
03:08:54.280 | if you go to grad school as an opportunity
03:08:59.160 | to explore intersections of fields,
03:09:01.920 | even if the individual fields seem useless.
03:09:04.720 | - Yeah, and useless doesn't mean useless, right?
03:09:07.120 | Useless means not directly applicable.
03:09:09.280 | - Not directly.
03:09:10.120 | - A good useless course can be the best one you ever took.
03:09:12.840 | - Yeah, I took a course on James Joyce,
03:09:18.080 | and that was truly useless.
03:09:20.320 | - Well, I took immunobiology in the medical school
03:09:25.880 | when I was at Penn as, I guess I would have been
03:09:29.640 | a freshman or a sophomore.
03:09:30.880 | I wasn't supposed to be in this class.
03:09:33.040 | It blew my goddamn mind, and it still does, right?
03:09:37.040 | I mean, we had this, I don't even know who it was,
03:09:39.600 | but we had this great professor who was highly placed
03:09:42.720 | in the world of immunobiology.
03:09:44.080 | You know, the course is called immunobiology,
03:09:45.920 | not immunology, immunobiology.
03:09:49.040 | It had the right focus.
03:09:50.160 | And as I recall it, the professor stood sideways
03:09:54.560 | to the chalkboard, staring off into space,
03:09:57.240 | literally stroking his beard with this bemused look
03:10:01.240 | on his face through the entire lecture.
03:10:04.280 | And you know, you had all these medical students
03:10:05.760 | who were so furiously writing notes
03:10:07.160 | that I don't even think they were noticing
03:10:08.560 | the person delivering this thing.
03:10:09.900 | But you know, I got what this guy was smiling about.
03:10:13.600 | It was like so, what he was describing, you know,
03:10:15.960 | adaptive immunity is so marvelous, right,
03:10:18.680 | that it was like almost a privilege to even be saying it
03:10:21.380 | to a roomful of people who were listening, you know?
03:10:23.920 | But anyway, yeah, I took that course.
03:10:25.340 | And you know, lo and behold, COVID.
03:10:28.160 | - That's never gonna be useful.
03:10:29.000 | - Well, yeah, suddenly it's front and center
03:10:32.120 | and wow, am I glad I took it.
03:10:33.600 | But anyway, yeah, useless courses are great.
03:10:37.040 | And actually Eric gave me one of the greater pieces
03:10:40.200 | of advice, at least for college, that anyone's ever given,
03:10:43.380 | which was don't worry about the prereqs,
03:10:46.120 | take it anyway, right?
03:10:47.600 | But now I don't even know if kids can do this now
03:10:50.400 | because the prereqs are now enforced by a computer.
03:10:53.240 | But back in the day, if you didn't mention
03:10:56.760 | that you didn't have the prereqs,
03:10:57.960 | nobody stopped you from taking the course.
03:10:59.680 | And what he told me, which I didn't know,
03:11:01.400 | was that often the advanced courses are easier in some way.
03:11:06.320 | The material's complex, but you know,
03:11:09.720 | it's not like intro bio where you're learning
03:11:12.520 | a thousand things at once, right?
03:11:14.440 | It's like focused on something.
03:11:16.000 | So if you dedicate yourself, you can pull it off.
03:11:18.760 | - Yeah, stay with an idea for many weeks at a time.
03:11:21.480 | And it's ultimately rewarding
03:11:22.840 | and not as difficult as it looks.
03:11:24.560 | Can I ask you a ridiculous question?
03:11:27.240 | - Please.
03:11:28.060 | - What do you think is the meaning of life?
03:11:32.040 | - Well, I feel terrible having to give you the answer.
03:11:38.880 | I realize you asked the question, but if I tell you,
03:11:41.160 | you're gonna again feel bad.
03:11:43.280 | I don't wanna do that, but look, there's two--
03:11:46.080 | - It's gonna be a disappointment, isn't it?
03:11:47.320 | - No, it's gonna be a horror, right?
03:11:50.400 | Because we actually know the answer to the question.
03:11:52.680 | - Oh no.
03:11:53.840 | - It's completely meaningless.
03:11:56.200 | There is nothing that we can do that escapes
03:11:59.560 | the heat death of the universe or whatever it is
03:12:01.520 | that happens at the end.
03:12:02.760 | And we're not gonna make it there anyway,
03:12:04.780 | but even if you were optimistic about our ability
03:12:07.680 | to escape every existential hazard indefinitely,
03:12:12.680 | ultimately it's all for naught and we know it, right?
03:12:17.060 | That said, once you stare into that abyss
03:12:20.660 | and then it stares back and laughs or whatever happens,
03:12:24.420 | then the question is okay, given that,
03:12:27.140 | can I relax a little bit and figure out,
03:12:30.860 | well, what would make sense if that were true?
03:12:33.160 | And I think there's something very clear to me.
03:12:37.220 | I think if you do all of the,
03:12:38.980 | if I just take the values that I'm sure we share
03:12:41.780 | and extrapolate from them, I think the following thing
03:12:45.100 | is actually a moral imperative.
03:12:46.740 | Being a human and having opportunity
03:12:51.580 | is absolutely fucking awesome, right?
03:12:54.340 | A lot of people don't make use of the opportunity
03:12:56.020 | and a lot of people don't have opportunity, right?
03:12:58.020 | They get to be human, but they're too constrained
03:13:00.140 | by keeping a roof over their heads to really be free.
03:13:03.880 | But being a free human is fantastic.
03:13:07.460 | And being a free human on this beautiful planet,
03:13:10.100 | crippled as it may be, is unparalleled.
03:13:13.860 | I mean, what could be better?
03:13:15.300 | How lucky are we that we get that, right?
03:13:17.820 | So if that's true, that it is awesome to be human
03:13:21.380 | and to be free, then surely it is our obligation
03:13:25.300 | to deliver that opportunity to as many people as we can.
03:13:28.200 | And how do you do that?
03:13:30.760 | Well, I think I know what job one is.
03:13:33.340 | Job one is we have to get sustainable.
03:13:36.840 | The way to get the maximum number of humans
03:13:39.060 | to have that opportunity to be both here and free
03:13:42.720 | is to make sure that there isn't a limit
03:13:44.900 | on how long we can keep doing this.
03:13:46.780 | That effectively requires us to reach sustainability.
03:13:50.400 | And then at sustainability,
03:13:52.660 | you could have a horror show of sustainability, right?
03:13:55.500 | You could have a totalitarian sustainability.
03:13:58.860 | That's not the objective.
03:14:00.220 | The objective is to liberate people.
03:14:02.140 | And so the question,
03:14:03.420 | the whole fourth frontier question, frankly,
03:14:05.780 | is how do you get to a sustainable
03:14:08.400 | and indefinitely sustainable state
03:14:10.780 | in which people feel liberated,
03:14:13.080 | in which they are liberated
03:14:14.520 | to pursue the things that actually matter,
03:14:16.320 | to pursue beauty, truth, compassion, connection,
03:14:21.320 | all of those things that we could list as unalloyed goods,
03:14:27.280 | those are the things that people should be most liberated
03:14:29.460 | to do in a system that really functions.
03:14:31.680 | And anyway, my point is,
03:14:33.900 | I don't know how precise that calculation is,
03:14:37.060 | but I'm pretty sure it's not wrong.
03:14:38.600 | It's accurate enough.
03:14:39.960 | And if it is accurate enough,
03:14:41.880 | then the point is, okay, well, there's no ultimate meaning,
03:14:45.320 | but the proximate meaning is that one.
03:14:47.120 | How many people can we get to have this wonderful experience
03:14:50.200 | that we've gotten to have, right?
03:14:52.240 | And there's no way that's so wrong
03:14:54.800 | that if I invest my life in it,
03:14:56.680 | that I'm making some big error.
03:14:58.280 | I'm sure of that.
03:14:59.200 | - Life is awesome,
03:15:00.200 | and we wanna spread the awesome as much as possible.
03:15:03.560 | - Yeah, you sum it up that way, spread the awesome.
03:15:05.760 | - Spread the awesome.
03:15:06.720 | So that's the fourth frontier.
03:15:07.960 | And if that fails, if the fourth frontier fails,
03:15:10.600 | the fifth frontier will be defined by robots,
03:15:13.020 | and hopefully they'll learn the lessons
03:15:14.920 | of the mistakes that the humans made
03:15:18.400 | and build a better world.
03:15:19.680 | I hope with more awesome. - That they're very happy here
03:15:21.280 | and that they do a better job with the place
03:15:22.720 | than we did. (both laughing)
03:15:25.240 | - Brett, I can't believe it took us this long to talk.
03:15:29.080 | As I mentioned to you before,
03:15:31.660 | we haven't actually spoken, I think, at all.
03:15:35.800 | And I've always felt that we're already friends.
03:15:39.360 | I don't know how that works
03:15:40.760 | because I've listened to your podcast a lot.
03:15:42.980 | I've also sort of loved your brother.
03:15:46.120 | And so it was like we've known each other
03:15:49.040 | for the longest time,
03:15:50.000 | and I hope we can be friends and talk often again.
03:15:54.000 | I hope that you get a chance to meet
03:15:56.300 | some of my robot friends as well and fall in love.
03:15:59.080 | And I'm so glad that you love robots as well,
03:16:02.640 | so we get to share in that love.
03:16:04.120 | So I can't wait for us to interact together.
03:16:07.640 | So we went from talking about some of the worst failures
03:16:11.720 | of humanity to some of the most beautiful aspects
03:16:15.120 | of humanity.
03:16:16.400 | What else can you ask for from our conversation?
03:16:18.880 | Thank you so much for talking today.
03:16:20.620 | - You know, Lex, I feel the same way towards you,
03:16:23.180 | and I really appreciate it.
03:16:24.320 | This has been a lot of fun,
03:16:25.300 | and I'm looking forward to our next one.
03:16:27.920 | - Thanks for listening to this conversation
03:16:29.280 | with Brett Weinstein.
03:16:30.360 | A thank you to Jordan Harbinger Show, ExpressVPN,
03:16:34.100 | Magic Spoon, and Four Sigmatic.
03:16:36.520 | Check them out in the description to support this podcast.
03:16:39.840 | And now let me leave you with some words
03:16:41.840 | from Charles Darwin.
03:16:43.760 | "Ignorance more frequently begets confidence
03:16:46.560 | than does knowledge.
03:16:47.880 | It is those who know little, not those who know much,
03:16:51.520 | who so positively assert that this or that problem
03:16:55.120 | will never be solved by science."
03:16:57.720 | Thank you for listening, and hope to see you next time.
03:17:00.480 | (upbeat music)