Bret Weinstein: Truth, Science, and Censorship in the Time of a Pandemic | Lex Fridman Podcast #194
Chapters
0:00 Introduction
3:14 Why biology is beautiful
10:47 Boston Dynamics
14:11 Being self-critical
24:19 Theory of close calls
32:56 Lab leak hypothesis
66:50 Joe Rogan
76:00 Censorship
112:49 Vaccines
126:35 The paper on mice with long telomeres
154:18 Martyrdom
163:03 Eric Weinstein
172:23 Monogamy
184:41 Advice for young people
191:25 Meaning of life
"The following is a conversation with Brett Weinstein, 00:00:11.160 |
"Even though we've never met or spoken before this, 00:00:14.160 |
"we both felt like we've been friends for a long time. 00:00:46.580 |
"to understand and to solve the problems of the world. 00:01:00.580 |
"that has both scientific triumph and tragedy. 00:01:04.120 |
"We needed great leaders, and we didn't get them. 00:01:09.360 |
"in an honest, transparent, and authentic way 00:01:16.080 |
"to reduce that uncertainty and to develop solutions. 00:01:19.580 |
"I believe there are several candidates for solutions 00:01:21.840 |
"that could have all saved hundreds of billions of dollars 00:01:30.180 |
"Let me mention five of the categories of solutions, 00:01:33.000 |
"masks, at-home testing, anonymized contact tracing, 00:01:41.260 |
"institutional leaders should have constantly asked 00:01:55.120 |
"Two, given the current data and uncertainty, 00:02:01.140 |
"Three, what is the timeline and cost involved 00:02:04.140 |
"with mass manufacturing distribution of the solution? 00:02:15.000 |
"open, honest scientific communication and debate 00:02:29.620 |
"is the moment we'll lose our ability to find the truth, 00:02:33.860 |
"the very things that make science beautiful and powerful 00:02:39.120 |
"that threaten the wellbeing and the existence 00:02:49.060 |
It is much more about the very freedom to talk, 00:03:00.100 |
I asked Bret to do this podcast to show solidarity 00:03:03.400 |
and to show that I have hope for science and for humanity. 00:03:10.200 |
and here's my conversation with Bret Weinstein. 00:03:13.440 |
What to you is beautiful about the study of biology? 00:03:18.080 |
The science, the engineering, the philosophy of it? 00:03:22.320 |
I must say at one level, it's not a conscious thing. 00:03:32.080 |
but as a kid I was completely fascinated with animals. 00:03:36.720 |
I loved to watch them and think about why they did 00:03:56.800 |
of near miracles that exists across biological nature. 00:04:03.880 |
do you see it from an evolutionary biology perspective 00:04:08.200 |
of this entire thing that moves around in this world? 00:04:11.000 |
Or do you see from an engineering perspective 00:04:14.240 |
that first principles almost down to the physics, 00:04:18.040 |
like the little components that build up hierarchies, 00:04:21.200 |
that you have cells, first proteins and cells and organs 00:04:27.200 |
So do you see low level or do you see high level? 00:04:32.800 |
And I think it's probably a bit like a time-sharing machine 00:04:40.360 |
We don't know enough about biology for them to connect. 00:04:53.760 |
of the creature and the lineage that it belongs to. 00:04:58.660 |
to think of that lineage over a very long time scale, 00:05:04.400 |
what the mechanisms inside would have to look like 00:05:06.940 |
to account for what we can see from the outside. 00:05:11.080 |
And I think that probably sounds really complicated, 00:05:20.680 |
in a topic like biology and doing so for one, 00:05:25.160 |
really not even just my adult life, for my whole life, 00:05:29.400 |
And when we see somebody do an amazing parkour routine 00:05:46.320 |
They are in a zone in which they are in such command 00:05:53.720 |
that they know how to hurl it around a landscape 00:06:05.000 |
on which they can develop that kind of facility, 00:06:11.540 |
It's really something that human beings are capable of doing 00:06:16.320 |
many things our ancestors didn't even have access to. 00:06:34.820 |
that one would have to pursue to really deeply understand it. 00:06:38.200 |
And it is well worth having at least one topic like that. 00:06:53.480 |
like biomechanics of how our bodies can move, 00:07:00.160 |
Is it the entirety of the hierarchies of biology 00:07:04.820 |
that we've been talking about, or is it just all? 00:07:10.740 |
is that every creature is two things simultaneously. 00:07:19.900 |
you know, I call it an aqueous machine, right? 00:07:24.840 |
So it's not identical to our technological machines. 00:07:50.620 |
And if a creature is very, very good at being a creature, 00:07:57.760 |
then that lineage will not last very long into the future 00:08:04.980 |
that its descendants will not be able to meet. 00:08:07.260 |
So the thing about humans is we are a generalist platform 00:08:12.260 |
and we have the ability to swap out our software 00:08:29.460 |
they were discussing what it is they do and how it works. 00:08:33.300 |
look, you're tapping into deep monkey stuff, right? 00:08:49.020 |
you know, has the experience of flying down the hill 00:08:56.220 |
bouncing from the top of one mogul to the next. 00:09:10.360 |
and it's not the part of you that knows how to think. 00:09:19.900 |
is the ability to bootstrap a new software program 00:09:31.740 |
if the exact thing doesn't exist in robotics. 00:09:37.940 |
to deal with circumstances that were novel to it, 00:09:45.340 |
you're right, with the consciousness being an observer. 00:09:59.100 |
the kind of the loudness of the music going up and down, 00:10:26.940 |
And how the hell is the body able to do that? 00:10:33.980 |
the like, that damn is good to be alive feeling. 00:10:44.460 |
to the full biology stack that we're operating in. 00:10:47.660 |
I don't know how difficult it is to replicate that. 00:10:50.140 |
We were talking offline about Boston Dynamics robots. 00:10:54.900 |
They've recently been, they did both parkour, 00:10:57.980 |
they did flips, they've also done some dancing. 00:11:09.580 |
is those robots are hard-coded to do those things. 00:11:13.980 |
The robots didn't figure it out by themselves. 00:11:16.940 |
And yet the fundamental aspect of what it means to be human 00:11:20.620 |
is that process of figuring out, of making mistakes. 00:11:34.120 |
is what it means to be human, that learning process. 00:11:40.320 |
almost as a fun side thing with the Boston Dynamics robots 00:11:44.740 |
is to have them learn and see what they figure out. 00:11:55.140 |
and in so doing discover what it means to be alive, 00:12:04.520 |
Boston Dynamics folks want Spot to be perfect 00:12:09.300 |
'cause they don't want Spot to ever make mistakes 00:12:11.620 |
because they want it to operate in the factories, 00:12:43.900 |
Learn about its movement, learn how to use its body 00:12:47.480 |
to communicate with others, all those kinds of things. 00:12:56.540 |
but first you have to allow the robot to make mistakes. 00:13:03.820 |
but you're gonna realize that the Boston Dynamics folks 00:13:06.860 |
are right the first time Spot poops on your rug. 00:13:09.300 |
- I hear the same thing about kids and so on. 00:13:21.240 |
One, I have always believed that the missing element 00:13:33.620 |
that human beings are the most dominant species 00:13:36.400 |
on planet Earth and that we have the longest childhoods 00:13:42.000 |
The development is the key to the flexibility 00:13:44.880 |
and so the capability of a human at adulthood 00:13:58.240 |
So I'll be very interested to see what happens 00:14:06.400 |
which of course is foreshadowed in 2001 quite brilliantly. 00:14:11.880 |
But I also wanna point out, you can see this issue 00:14:16.040 |
of your conscious mind becoming a spectator very well 00:14:23.480 |
If you watch a tennis game, you could imagine 00:14:28.640 |
that the players are highly conscious as they play. 00:14:31.260 |
You cannot imagine that if you've ever played 00:14:54.540 |
If you go up against an opponent in table tennis 00:14:57.760 |
that knows a trick that you don't know how to respond to, 00:15:01.800 |
you will suddenly detect that something about your game 00:15:07.260 |
about what might be, how do you position yourself 00:15:09.520 |
so that move that puts the ball just in that corner 00:15:15.440 |
And this I believe is, we highly conscious folks, 00:15:23.880 |
very deliberately and carefully, mistake consciousness 00:15:33.120 |
Consciousness is an intermediate level of thinking. 00:15:42.620 |
It is capable of being adapted to new circumstances. 00:15:50.480 |
and you become highly effective at whatever it is. 00:15:57.240 |
that aren't anticipated by the code you've already written. 00:16:00.040 |
And so I don't exactly know how one would establish this, 00:16:10.580 |
contains sandboxes in which things are tested, right? 00:16:16.820 |
and run it in parallel next to your active code 00:16:19.860 |
so you can see how it would have done comparatively. 00:16:23.600 |
But there's gotta be some way of writing new code 00:16:31.220 |
Very often, you know, when I get good at something, 00:16:34.100 |
I often don't get better at it while I'm doing it. 00:16:38.660 |
especially if there's time to sleep and think on it. 00:16:43.700 |
new program swapping in for old program phenomenon, 00:16:46.940 |
which, you know, will be a lot easier to see in machines. 00:16:58.780 |
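A minimal sketch, in software terms, of the "run the new program in a sandbox alongside the active one, compare, then swap" idea described here; the task, the two routines, and the scoring are hypothetical placeholders rather than anything from the conversation.

```python
import random

def active_policy(x):
    # the current "compiled" routine: a rough heuristic
    return 2 * x

def candidate_policy(x):
    # a new routine being tried out in the sandbox
    return 2 * x + 1

def score(policy, trials=1000):
    # evaluate a routine offline against simulated situations
    total = 0.0
    for _ in range(trials):
        x = random.uniform(-1, 1)
        target = 2 * x + 1  # the "world" the routine is trying to match
        total -= (policy(x) - target) ** 2
    return total / trials

# shadow mode: run the candidate in parallel and compare before swapping
if score(candidate_policy) > score(active_policy):
    active_policy = candidate_policy  # new program swaps in for the old one
```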
I do still think the highest form of excellence in tennis 00:17:05.220 |
So the compiled code is the highest form of being human. 00:17:10.220 |
And then consciousness is just some like specific compiler. 00:17:19.740 |
You could just have different kinds of compilers. 00:17:22.180 |
Ultimately, the thing that by which we measure 00:17:36.660 |
is played consciously and table tennis isn't. 00:17:38.900 |
I'm saying that because tennis is slowed down 00:17:43.740 |
you could imagine that it was your conscious mind playing. 00:17:49.580 |
- It becomes obvious that your conscious mind 00:17:51.420 |
is just present rather than knowing where to put the paddle. 00:17:56.460 |
I would say this probably isn't true in a podcast situation. 00:18:27.100 |
- And that's the highest form of podcasting too. 00:18:34.180 |
like Joe Rogan just having fun and just losing themselves. 00:18:43.460 |
Somebody that has a lot of anxiety with people, 00:18:47.940 |
I was scared before you showed up, I'm scared right now. 00:18:50.900 |
There's just anxiety, there's just, it's a giant mess. 00:18:56.540 |
It's hard to just get out of the way of your own mind. 00:19:00.660 |
- Yeah, actually trust is a big component of that. 00:19:10.040 |
But when you do get into that zone when you're speaking, 00:19:16.740 |
although maybe you present in Russian and it happens. 00:19:22.540 |
and you think, oh, that's really good, right? 00:19:26.960 |
some other part of you that you don't exactly know 00:19:30.980 |
- I don't think I've ever heard myself in that way 00:19:43.540 |
There's a very self-critical voice that's much louder. 00:19:47.380 |
So I'm very, maybe I need to deal with that voice, 00:19:51.300 |
but it's been like with a, what is it called? 00:19:58.600 |
So I'm kind of focused right now on the megaphone person 00:20:05.300 |
But that's definitely something to think about. 00:20:07.180 |
It's been productive, but the place where I find gratitude 00:20:21.600 |
When I talk, I'm extremely self-critical in my mind. 00:20:32.000 |
like any kind of creation, extremely self-critical. 00:20:54.840 |
is to hate everything you've ever done in the past. 00:21:18.060 |
I just recently, I just yesterday read Metamorphosis by, 00:21:25.840 |
because of the stress that the world puts on him. 00:21:31.620 |
And I think that you have to find the balance 00:21:39.640 |
to become too heavy, the burden of the world, 00:21:44.120 |
to be the best version of yourself and so on to strive, 00:21:47.360 |
then you become a bug and that's a big problem. 00:21:51.760 |
And then the world turns against you because you're a bug. 00:21:56.720 |
You become some kind of caricature of yourself. 00:22:30.720 |
Dwelling on things is famously understood to be bad 00:22:36.720 |
It wouldn't exist, the tendency toward it wouldn't exist 00:22:45.360 |
And that's obviously easier to say than to operationalize. 00:22:49.660 |
But if you realize that your dwelling is the key in fact 00:22:53.320 |
to upgrading your program for future well-being 00:22:57.000 |
and that there's a point presumably from diminishing returns 00:23:05.600 |
because that is what is in your best interest, 00:23:08.320 |
then knowing that you're looking for that point is useful. 00:23:12.120 |
This is the point at which it is no longer useful 00:23:21.060 |
If some part of you feels like it's punishing you 00:23:25.600 |
then that also has a point at which it's no longer valuable 00:23:32.400 |
yep, even the part of me that was punishing me 00:23:36.460 |
- So if we map that onto compiled code discussion, 00:23:40.460 |
as a computer science person, I find that very compelling. 00:23:45.280 |
When you compile code, you get warnings sometimes. 00:23:48.720 |
And usually if you're a good software engineer, 00:23:58.900 |
So you make sure that the compilation produces no warnings. 00:24:02.280 |
But at a certain point when you have a large enough system, 00:24:07.860 |
Like I don't know where that warning came from, 00:24:21.000 |
and believe me, I think what you're talking about 00:24:27.760 |
is gonna end up having to go to a deep developmental state 00:24:31.920 |
and a helplessness that evolves into hyper-competence 00:24:36.040 |
But I live, I noticed that I live by something 00:24:49.600 |
that people typically miscategorize the events 00:24:54.600 |
in their life where something almost went wrong. 00:25:01.080 |
I have a friend who, I was walking down the street 00:25:04.800 |
with my college friends and one of my friends 00:25:06.760 |
stepped into the street thinking it was clear 00:25:08.740 |
and was nearly hit by a car going 45 miles an hour. 00:25:17.400 |
But she didn't, you know, car didn't touch her, right? 00:25:21.760 |
Now you could walk away from that and think nothing of it 00:25:28.240 |
Or you could think, well, what is the difference 00:25:37.360 |
I never want the difference between what did happen 00:25:41.680 |
Therefore, I should count this as very close to death. 00:25:45.320 |
And I should prioritize coding so it doesn't happen again 00:25:50.480 |
So anyway, my basic point is the accidents and disasters 00:25:55.480 |
and misfortune describe a distribution that tells you 00:26:04.440 |
And so personally, you can use them to figure out 00:26:12.040 |
to take great risks because you have a really good sense 00:26:15.800 |
But I would also point out civilization has this problem. 00:26:24.400 |
but they're not existential scale yet, right? 00:26:30.100 |
And I would argue that the pattern is you discover 00:26:32.920 |
that we are involved in some industrial process 00:26:40.800 |
okay, in light of the Fukushima triple meltdown, 00:27:03.100 |
And can we talk about the wisdom or lack thereof 00:27:06.220 |
of engaging in that process before the accident, right? 00:27:09.100 |
That's what a wise civilization would be doing. 00:27:12.940 |
- I just wanna mention something that happened 00:27:17.180 |
I don't know if you know who J.B. Straubel is. 00:27:28.600 |
And in the same thin line between death and life 00:27:49.520 |
I wonder how much of our own individual lives 00:27:59.760 |
- Well, this is sort of my point about the close calls 00:28:03.280 |
is that there's a level at which we can't control it, right? 00:28:06.780 |
The gigantic asteroid that comes from deep space 00:28:11.080 |
that you don't have time to do anything about. 00:28:13.060 |
There's not a lot we can do to hedge that out, 00:28:37.400 |
There wasn't a major fire that made it impossible 00:28:42.660 |
We could say the same thing about the blowout 00:28:52.480 |
that we couldn't have plugged it could have opened up. 00:28:54.320 |
All of these things could have been much, much worse, right? 00:28:57.520 |
And I think we can say the same thing about COVID, 00:29:00.960 |
And we cannot say for sure that it came from the Wuhan lab, 00:29:10.320 |
So in each of these cases, something is telling us 00:29:19.920 |
and some scale of disaster that is unimaginable. 00:29:22.700 |
And that wisdom, you can be highly intelligent 00:29:31.760 |
And that would require a process of restraint, 00:29:36.560 |
a process that I don't see a lot of evidence of yet. 00:29:47.460 |
that would be capable of taking a protective, 00:29:55.740 |
Because it would have important economic consequences. 00:29:57.780 |
And so it would almost certainly be shot down. 00:30:03.380 |
we paid a huge price for all of the disasters 00:30:09.360 |
And we have to factor that into the equation. 00:30:15.380 |
- Also, the question is how many disasters we avoided 00:30:24.000 |
or just the integrity and character of humans. 00:30:43.720 |
that maybe we can overcome luck with ingenuity. 00:30:48.560 |
Meaning, I guess you're suggesting the process 00:31:03.260 |
And being very honest with the data out there 00:31:06.740 |
about the close calls and using those close calls 00:31:13.940 |
by which we minimize the probability of those close calls. 00:31:23.120 |
- Well, I think we need to do a couple things 00:31:38.560 |
And I realize that we also need to have reversibility 00:31:43.280 |
because processes very frequently when they start 00:31:47.960 |
And then when they scale, they become very dangerous. 00:31:59.680 |
and you imagine somebody running after them saying, 00:32:04.200 |
"and it's gonna change the temperature of the planet." 00:32:08.600 |
who's invented this marvelous new contraption? 00:32:10.920 |
But of course, eventually you do get to the place 00:32:14.600 |
that you do start changing the temperature of the planet. 00:32:20.460 |
if we basically said, "Look, you can't involve yourself 00:32:23.700 |
"in any process that you couldn't reverse if you had to," 00:32:40.280 |
We're just involved in too many very dangerous processes. 00:32:43.880 |
- So let's talk about one of the things that, 00:32:50.280 |
certainly hurt it at a deep level, which is COVID-19. 00:32:54.920 |
What percent probability would you currently place 00:33:21.400 |
the likelihood that the lab was the Wuhan Institute 00:33:24.760 |
There are multiple different kinds of evidence 00:33:31.640 |
and there is literally no evidence that points to nature. 00:33:46.720 |
with research on viruses that look like SARS-CoV-2 00:33:57.680 |
that this virus came from a lab is well above 95%. 00:34:07.860 |
and escaped from there without being modified? 00:34:16.520 |
Could it have been delivered from another lab? 00:34:23.120 |
in order that we would connect the dots in the wrong way? 00:34:27.600 |
I currently have that below 1% on my flow chart, 00:34:42.400 |
Sometimes when Eric and I talk about these issues, 00:34:47.680 |
just to prove that something could live in that space, 00:34:53.840 |
And so it doesn't have to have been an attack on China. 00:35:00.720 |
if you can predict the future in some unusual way 00:35:07.200 |
better than others, you can print money, right? 00:35:10.280 |
That's what markets that allow you to bet for or against 00:35:17.000 |
So you can imagine a simply amoral person or entity 00:35:22.000 |
generating a pandemic, attempting to cover their tracks 00:35:28.240 |
because it would allow them to bet against things 00:35:30.520 |
like cruise ships, air travel, whatever it is, 00:35:37.840 |
sanitizing gel and whatever else you would do. 00:35:43.280 |
So am I saying that I think somebody did that? 00:35:51.280 |
However, were it to have been intentionally released 00:35:56.160 |
did not want it known where it had come from, 00:36:01.840 |
So we have to leave the possibility formally open, 00:36:10.560 |
maybe this is the optimistic nature that I have, 00:36:29.240 |
the kind of disregard for human life required to do that, 00:36:34.240 |
that's just not going to be coupled with competence. 00:36:42.440 |
where competence on one axis and evil is on the other. 00:36:47.400 |
the crappier you are at doing great engineering, 00:36:59.840 |
That seems to be the lessons I take from history, 00:37:04.360 |
that's what's going to be happening in the future. 00:37:13.560 |
'cause there's a lot of fascinating possibilities. 00:37:18.080 |
what would evidence for natural origins look like? 00:37:35.080 |
- So like that's one, like that's possible to have happened. 00:37:38.560 |
So that's a sort of like a historical evidence, 00:37:42.320 |
like, okay, well, it's possible that it happened. 00:37:45.520 |
- It's not evidence of the kind you think it is. 00:37:51.120 |
So the presumption upon discovering a new virus 00:37:54.780 |
circulating is certainly that it came from nature, right? 00:38:00.720 |
in the face of evidence, or at least it logically should. 00:38:05.800 |
It was maintained by people who privately in their emails 00:38:17.040 |
that we could look for and see that would say, 00:38:21.840 |
this increases the probability that it's natural origins? 00:38:30.780 |
make up some evidence in order to reverse the flow. 00:38:36.380 |
- There's a lot of incentive for that, actually. 00:38:39.580 |
On the other hand, why didn't the powers that be, 00:38:49.780 |
Whatever force it is, I hope that force is here too. 00:38:54.960 |
- It's the competence thing I'm talking about. 00:39:00.200 |
But I would say, yeah, the giant piece of evidence 00:39:03.880 |
that will shift the probabilities in the other direction 00:39:07.160 |
is the discovery of either a human population 00:39:10.540 |
in which the virus circulated prior to showing up in Wuhan 00:39:27.880 |
in which the virus learned this before jumping to humans. 00:39:33.460 |
you would certainly expect to see a great deal of evolution 00:39:42.880 |
somewhere else that had the virus circulating 00:39:45.040 |
or an ancestor of the virus that we first saw 00:39:54.040 |
in order to explain the total lack of evolution 00:40:01.640 |
- So you don't believe in the magic of evolution 00:40:03.680 |
to spring up with all the tricks already there. 00:40:17.200 |
just like the ones that succeed and succeed big 00:40:22.200 |
are the ones that are going to just spring into life 00:40:33.160 |
The job of becoming a new pandemic virus is too difficult. 00:40:49.140 |
at a sufficient rate that it doesn't go extinct 00:40:58.060 |
And the point is selection would leave a mark. 00:41:00.700 |
We would see evidence that it was taking place. 00:41:12.300 |
you would see the clumsy virus get better and better. 00:41:22.420 |
is that it is much more powerful than most people imagine. 00:41:25.300 |
That what we teach in the Evolution 101 textbook 00:41:28.900 |
is too clumsy a process to do what we see it doing 00:41:32.100 |
and that actually people should increase their expectation 00:41:35.220 |
of the rapidity with which that process can produce 00:41:52.020 |
which probably means it took place inside of a laboratory. 00:42:04.940 |
is doing this exact thing that you're referring to, 00:42:12.620 |
and seeing what kind of tricks get developed. 00:42:24.480 |
Which do you think we should be thinking about here? 00:42:38.140 |
And I would say based on the content of the genome 00:42:42.340 |
and other evidence in publications from the various labs 00:42:45.700 |
that were involved in generating this technology, 00:42:52.620 |
This SARS-CoV-2 does not appear to be entirely the result 00:42:57.620 |
of either a splicing process or serial passaging. 00:43:11.860 |
looks very much like it was added in to the virus. 00:43:15.660 |
And it was known that that would increase its infectivity 00:43:27.180 |
at spreading in humans and minks and ferrets. 00:43:32.800 |
Now minks and ferrets are very closely related 00:43:36.440 |
to have been used in a serial passage experiment. 00:43:39.180 |
The reason being that they have an ACE2 receptor 00:43:41.900 |
that looks very much like the human ACE2 receptor. 00:43:49.820 |
in order to increase its infectivity in humans, 00:43:56.200 |
It is also quite likely that humanized mice were utilized 00:44:01.200 |
and it is possible that human airway tissue was utilized. 00:44:15.000 |
because they will actually give us some tools 00:44:28.260 |
but it really is, it should be our objective. 00:44:36.620 |
But those protocols would tell us a great deal. 00:44:39.380 |
If it wasn't the Wuhan Institute, we need to know that. 00:44:53.100 |
- You're opening up my mind about why we should investigate, 00:44:57.380 |
why we should know the truth of the origins of this virus. 00:45:01.380 |
So for me personally, let me just tell the story 00:45:05.980 |
When I first started looking into the lab leak hypothesis, 00:45:11.460 |
what became terrifying to me and important to understand 00:45:17.340 |
and obvious is the sort of like Sam Harris way of thinking, 00:45:34.500 |
it's obvious there's going to happen in the future. 00:45:37.460 |
So why the hell are we not freaking out about this? 00:45:47.660 |
but he thinks about this way about AGI as well, 00:46:04.220 |
it seemed obvious that this will happen very soon 00:46:08.460 |
for a much deadlier virus as we get better and better 00:46:13.780 |
and doing this kind of evolutionary driven research, 00:46:18.620 |
Okay, but then you started speaking out about this as well, 00:46:25.260 |
we should hurry up and figure out the origins now 00:46:29.780 |
how to actually respond to this particular virus, 00:46:37.660 |
what is in terms of vaccines, in terms of antiviral drugs, 00:46:40.460 |
in terms of just all the number of responses we should have. 00:46:45.460 |
Okay, I still am much more freaking out about the future. 00:47:18.060 |
Let me say first that this is a perfect test case 00:47:25.180 |
it is also a close call from which we can learn much. 00:47:42.460 |
we're gonna create the disaster all the sooner. 00:48:00.100 |
ought to tell us this is not a Chinese failure, 00:48:02.140 |
this is a failure of something larger and harder to see. 00:48:05.780 |
But I also think that there's a clock ticking 00:48:30.220 |
Some years you get the flu, most years you don't. 00:48:34.180 |
Maybe the vaccine isn't particularly well targeted. 00:48:58.300 |
The loss of wellbeing and wealth, incalculable. 00:49:04.580 |
driving this extinct before it becomes permanent. 00:49:15.580 |
to the extent that we let it have this very large canvas, 00:49:21.500 |
in which mutation and selection can result in adaptation 00:49:26.900 |
the greater its ability to figure out features 00:49:29.460 |
of our immune system and use them to its advantage. 00:49:33.580 |
So I'm feeling the pressure of driving it extinct. 00:49:37.260 |
I believe we could have driven it extinct six months ago 00:49:40.340 |
and we didn't do it because of very mundane concerns 00:49:49.500 |
or that they were callous about deaths that would be caused. 00:49:59.260 |
let's say you're in some kind of a corporation, 00:50:08.540 |
that in the context of a pandemic might be very lucrative. 00:50:15.860 |
so that it succeeds and the competitors don't. 00:50:21.580 |
through means that I think those of us on the outside 00:50:23.820 |
can't really intuit, you end up saying things 00:50:30.940 |
and much more safely than the ones you're selling 00:50:38.660 |
But it's some kind of autopilot, at least part of it is. 00:50:43.180 |
- So there's a complicated coupling of the autopilot 00:50:52.460 |
And then there's also the geopolitical game theory 00:51:00.540 |
It's the Chernobyl thing where if you messed up, 00:51:18.420 |
that we often criticize about our institutions, 00:51:21.980 |
especially the leaders in those institutions, 00:51:25.700 |
as some of the members of the scientific community. 00:51:42.860 |
Heather and I have been talking from the beginning 00:51:53.820 |
That's frightening, but it's also hopeful in the sense 00:52:01.020 |
we're not navigating a puzzle about Chinese responsibility. 00:52:05.380 |
We're navigating a question of collective responsibility 00:52:10.380 |
for something that has been terribly costly to all of us. 00:52:22.220 |
the strong possibility this will happen again 00:52:27.140 |
So just as a person that does not learn the lessons 00:52:43.060 |
"that suggests that this is a self-inflicted wound." 00:52:47.900 |
that has caused a massive self-inflicted wound, 00:52:55.380 |
exactly to the point that you have learned the lesson 00:53:02.940 |
to kind of ask you to do almost like a thought experiment. 00:53:25.540 |
there's a bunch of ways I think you can argue 00:53:29.940 |
that even talking about it is bad for the world. 00:53:46.880 |
That talking about that it leaked from a lab, 00:54:03.300 |
with other steel man arguments against talking 00:54:08.300 |
or against the possibility of the lab leak hypothesis? 00:54:22.780 |
I think that you can only steel man a good faith argument. 00:54:32.680 |
because privately their emails reflect their own doubts. 00:54:37.500 |
was actually a punishment, a public punishment 00:55:01.420 |
shutting down people who are using those tools honorably 00:55:10.420 |
I don't feel that there's anything to steel man. 00:55:16.900 |
immediately at the point that the world suddenly 00:55:25.240 |
had published his article and suddenly the world 00:55:27.340 |
was going to admit that this was at least a possibility, 00:55:34.900 |
of the rationalization process that had taken place 00:55:41.760 |
that what was being avoided was the targeting 00:55:55.860 |
On the other hand, once you create license to lie 00:56:03.900 |
when the world has a stake in knowing what happened, 00:56:10.200 |
that license to lie will be used by the thing 00:56:12.980 |
that captures institutions for its own purposes. 00:56:19.700 |
if the story of what happened here can be used 00:56:23.700 |
against Chinese people, that would be very unfortunate. 00:56:31.180 |
Heather and I have taken great pains to point out 00:56:33.940 |
that this doesn't look like a Chinese failure, 00:56:38.380 |
So I think it is important to broadcast that message 00:56:43.880 |
but no matter what happened, we have a right to know. 00:56:46.740 |
And I frankly do not take the institutional layer 00:56:50.820 |
at its word, that its motivations are honorable 00:56:53.460 |
and that it was protecting good-hearted scientists 00:57:06.380 |
at the institutional layer to protect the populace. 00:57:11.180 |
I think both you and I are probably on the same, 00:57:16.260 |
have the same sense that it's a slippery slope, 00:57:21.900 |
even if it's an effective mechanism in the short term, 00:57:25.220 |
in the long term, it's going to be destructive. 00:57:27.700 |
This happened with masks, this happened with other things. 00:57:44.440 |
whether it's a lie or not, to minimize panic. 00:57:49.540 |
But you're suggesting that almost in all cases, 00:57:52.780 |
and I think that was the lesson from the pandemic 00:58:04.280 |
in the institutions is ultimately destructive. 00:58:12.400 |
There are obviously places where complete transparency 00:58:17.300 |
To the extent that you broadcast a technology 00:58:19.980 |
that allows one individual to hold the world hostage, 00:58:23.900 |
right, obviously you've got something to be navigated. 00:58:27.660 |
But in general, I don't believe that the scientific system 00:58:39.820 |
the idea that the well-being of Chinese scientists 00:58:45.380 |
outweighs the well-being of the world is preposterous. 00:58:50.700 |
As you point out, one thing that rests on this question 00:58:53.460 |
is whether we continue to do this kind of research 00:59:04.080 |
of a zoonotic spillover event causing a major 00:59:14.380 |
and if they are wrong, as I believe they are, 00:59:16.740 |
about the likelihood of a major world pandemic 00:59:19.220 |
spilling out of nature in the way that they wrote 00:59:31.200 |
And yes, whatever we have to do to protect scientists 00:59:38.260 |
But we cannot, protecting them by lying to the world, 00:59:42.880 |
and even worse, by demonizing people like me, 00:59:58.940 |
by demonizing us for simply following the evidence 01:00:05.700 |
You're demonizing people for using the scientific method 01:00:08.380 |
to evaluate evidence that is available to us in the world. 01:00:11.940 |
What a terrible crime it is to teach that lesson, right? 01:00:19.300 |
Whatever your license to lie is, it doesn't extend to that. 01:00:25.420 |
the pressure on you has a very important effect 01:00:28.460 |
on thousands of world-class biologists, actually. 01:00:34.540 |
So at MIT, colleagues of mine, people I know, 01:00:44.100 |
to one, speak publicly, and two, actually think. 01:00:53.580 |
It sounds kind of ridiculous, but just in the privacy 01:01:01.540 |
it's many people, many world-class biologists that I know 01:01:10.700 |
There's not even that many people that are publicly opposed 01:01:21.660 |
that those battles should be fought in private, 01:01:24.740 |
with colleagues in the privacy of the scientific community, 01:01:31.260 |
that the public is somehow not maybe intelligent enough 01:02:01.520 |
And the fact that they're not utilizing their brain 01:02:06.700 |
outside of the conformist line of thinking is tragic. 01:02:17.740 |
For one thing, it's kind of a cryptic totalitarianism. 01:02:20.860 |
Somehow, people's sense of what they're allowed 01:02:31.220 |
They are blinding themselves to what they can see. 01:02:36.860 |
that what you're describing about what people said, 01:02:48.860 |
And I think that their discussions with each other 01:02:52.760 |
about why they did not say what they understood, 01:02:55.660 |
that's what capture sounds like on the inside. 01:02:59.160 |
I don't know exactly what force captured the institutions. 01:03:02.900 |
I don't think anybody knows for sure out here in public. 01:03:07.900 |
I don't even know that it wasn't just simply a process. 01:03:13.620 |
They are behaving towards a kind of somatic obligation. 01:03:18.620 |
They have lost sight of what they were built to accomplish. 01:03:24.780 |
the way they avoid going back to their original mission 01:03:38.100 |
that is what capture sounds like on the inside. 01:03:40.280 |
It's an institutional rationalization mechanism. 01:03:46.420 |
And at the point you go from lab leak to repurposed drugs, 01:03:50.600 |
you can see that it's very deadly in a very direct way. 01:04:05.860 |
They kind of avoid the idea that AI is used in the military. 01:04:26.420 |
every time I bring up autonomous weapon systems 01:04:30.700 |
There's a natural kind of pull towards that direction 01:04:33.640 |
because it's like, what can I do as one person? 01:04:42.040 |
So we're in like in the early days of this race 01:04:44.960 |
that in 10, 20 years might become a real problem. 01:04:50.440 |
we're now facing the result of that in the space of viruses, 01:04:55.400 |
like for many years avoiding the conversations here. 01:05:00.480 |
I don't know what to do that in the early days, 01:05:04.920 |
create institutions where people can stand out. 01:05:08.280 |
People can stand out and like basically be individual 01:05:12.040 |
thinkers and break out into all kinds of spaces of ideas 01:05:16.560 |
that allow us to think freely, freedom of thought. 01:05:19.600 |
And maybe that requires a decentralization of institutions. 01:05:34.040 |
The average Joe has a job somewhere and their mortgage, 01:05:44.180 |
their connection with the economy is to one degree 01:05:57.300 |
especially in any industry where it's not easy 01:06:09.160 |
not only in terms of who gets to tell other people 01:06:28.320 |
to go against the grain and have it not be a catastrophe 01:06:36.540 |
So I would argue that some of what you're talking about 01:06:41.880 |
of the concentration of the sources of well-being 01:06:50.500 |
- You got a chance to talk with Joe Rogan yesterday? 01:07:03.200 |
Joe told me it was an incredible conversation. 01:07:07.380 |
Many people have probably, by the time this is released, 01:07:12.060 |
I think it would be interesting to discuss a postmortem. 01:07:25.600 |
as it's unfolding of Ivermectin from the origins 01:07:41.380 |
that he would take the risk of such a discussion, 01:07:59.280 |
about the censorship campaign against Ivermectin, 01:08:07.160 |
And I should say we had Pierre Kory available. 01:08:17.080 |
the Frontline COVID-19 Critical Care Alliance. 01:08:23.740 |
of treating COVID patients, and they happened on Ivermectin, 01:08:29.560 |
And I hesitate to use the word advocating for it, 01:08:32.960 |
because that's not really the role of doctors or scientists, 01:08:41.400 |
about its effectiveness for reasons that we can go into. 01:08:44.880 |
- So maybe step back and say what is Ivermectin, 01:09:03.380 |
and he found it in soil near a Japanese golf course. 01:09:12.980 |
over the possibility that Asians will be demonized 01:09:20.000 |
and to recognize that actually the natural course 01:09:36.800 |
But in any case, Omura discovered this molecule. 01:09:55.140 |
Its initial use was in treating parasitic infections. 01:10:05.060 |
the pathogen that causes elephantiasis, scabies. 01:10:12.140 |
It's on the WHO's list of essential medications. 01:10:17.020 |
It has been administered something like four billion times 01:10:22.420 |
It has been given away in the millions of doses 01:10:27.980 |
People have been on it for long periods of time, 01:10:33.300 |
may have had less severe impacts from COVID-19 01:10:40.940 |
and the drug appears to have a long-lasting impact. 01:10:51.140 |
and so it was tested early in the COVID-19 pandemic 01:10:54.880 |
to see if it might work to treat humans with COVID. 01:10:58.760 |
It turned out to have very promising evidence 01:11:05.140 |
It was tested at a very high dosage, which confuses people. 01:11:11.020 |
that Ivermectin might be useful in confronting this disease 01:11:14.780 |
are advocating those high doses, which is not the case. 01:11:17.540 |
But in any case, there have been quite a number of studies. 01:11:21.920 |
A wonderful meta-analysis was finally released. 01:11:36.380 |
it's highly effective at treating people with the disease, 01:11:40.620 |
and it showed an 86% effectiveness as a prophylactic 01:11:50.780 |
to drive SARS-CoV-2 to extinction if we wished to deploy it. 01:12:07.340 |
'Cause I was really impressed by the real-time meta-analysis 01:12:24.020 |
So that is, as you say, a living meta-analysis 01:12:29.100 |
- It's really cool, and they've got some really nice graphics 01:12:32.220 |
that allow you to understand, well, what is the evidence? 01:12:34.660 |
You know, it's concentrated around this level 01:12:46.420 |
Second author is Tess Lorry of the BIRD Group, 01:12:49.420 |
BIRD being a group of analysts and doctors in Britain 01:12:54.420 |
that is playing a role similar to the FLCCC here in the US. 01:13:12.880 |
I will put it up and people can find it there. 01:13:30.220 |
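As a rough illustration of what a meta-analysis like the one described here does under the hood, the sketch below pools a few invented study estimates by fixed-effect inverse-variance weighting; this is a generic textbook method, not the specific methodology of the site or group being discussed, and the numbers are made up.

```python
import math

# (effect estimate, standard error) for a handful of hypothetical studies,
# e.g. log risk ratios; values are invented purely for illustration
studies = [(-0.60, 0.30), (-0.20, 0.25), (-0.45, 0.40), (-0.10, 0.20)]

# fixed-effect inverse-variance pooling: more precise studies get more weight
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"pooled effect: {pooled:.2f} "
      f"(95% CI {pooled - 1.96*pooled_se:.2f} to {pooled + 1.96*pooled_se:.2f})")
```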
- So let me just say, I'm not a medical doctor. 01:13:39.240 |
In terms of dosage, there is one reason for concern, 01:13:42.540 |
which is that the most effective dose for prophylaxis 01:13:45.800 |
involves something like weekly administration, 01:13:49.500 |
and that because that is not a historical pattern 01:13:58.060 |
of being on it weekly for a long period of time. 01:14:07.060 |
using the drug over many years and using it in high doses. 01:14:27.060 |
and I do worry about the long-term implications 01:14:39.500 |
something like, let's say, 15 milligrams for somebody 01:14:43.060 |
my size once a week after you've gone through 01:14:45.740 |
the initial double dose that you take 48 hours apart, 01:14:50.380 |
it is apparent that if the amount of drug in your system 01:14:55.340 |
is sufficient to be protective at the end of the week, 01:15:10.900 |
I have little doubt that that would be discovered 01:15:14.140 |
But that said, it does seem to be quite safe, 01:15:27.660 |
in light of its R0 number of slightly more than two. 01:15:31.380 |
And so why we are not using it is a bit of a mystery. 01:15:48.020 |
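A back-of-the-envelope sketch of the arithmetic behind that claim, assuming the R0 of slightly more than two and the 86% prophylactic figure quoted above; the coverage levels are illustrative assumptions, and this ignores waning protection, imperfect adherence, and other real-world complications.

```python
R0 = 2.1          # reproduction number quoted above (slightly more than two)
efficacy = 0.86   # prophylactic effectiveness figure cited above

# effective reproduction number if a given fraction of people are protected;
# transmission dies out (the pathogen heads toward extinction) when R_eff < 1
for coverage in (0.3, 0.5, 0.7, 0.9):
    r_eff = R0 * (1 - coverage * efficacy)
    print(f"coverage {coverage:.0%}: R_eff = {r_eff:.2f}")
```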
in order to merit rigorous scientific exploration, 01:16:00.220 |
So before we talk about the various vaccines for COVID-19, 01:16:21.580 |
- Well, there's a question about why they say they did it, 01:16:24.040 |
and there's a question about why they actually did it. 01:16:37.240 |
Reuters, AP, Financial Times, Washington Post, 01:16:51.640 |
In effect, they have decided to control discussion 01:16:56.160 |
ostensibly to prevent the distribution of misinformation. 01:17:02.320 |
In this case, they have chosen to simply utilize 01:17:20.640 |
They are entities that are about public health, 01:17:24.180 |
and public health has this, whether it's right or not, 01:17:35.100 |
that comes from the fact that there is game theory 01:17:48.160 |
and therefore the herd becomes immune through vaccination, 01:17:53.980 |
then you benefit from the immunity of the herd 01:17:58.600 |
So people who do best are the people who opt out. 01:18:04.560 |
as public health entities effectively oversimplify stories 01:18:09.560 |
in order that that game theory does not cause 01:18:15.640 |
With that said, once that right to lie exists, 01:18:27.140 |
that require that there not be a safe and effective treatment 01:18:43.160 |
of immunizing a company from the harm of its own product 01:18:57.120 |
So somehow YouTube is doing the bidding of Merck and others. 01:19:02.120 |
Whether it knows that that's what it's doing, 01:19:05.020 |
I think this may be another case of an autopilot 01:19:18.080 |
And the irony here is that with YouTube coming after me, 01:20:01.880 |
And they say like, that's an easy thing to do. 01:20:06.920 |
Like to know what is true or not is an easy thing to do. 01:20:23.080 |
to public institutions that on a basic Google search 01:20:32.960 |
So if you were YouTube who are exceptionally profitable 01:20:37.960 |
and exceptionally powerful in terms of controlling 01:20:43.280 |
what people get to see or not, what would you do? 01:20:57.460 |
let's open the dam and let any video on anything fly? 01:21:05.920 |
if Bret Weinstein was put in charge of YouTube for a month 01:21:13.200 |
where YouTube actually has incredible amounts of power 01:21:45.400 |
They did not imagine that in formulating that right, 01:21:48.640 |
that most of what was said would be of high quality, 01:21:51.480 |
nor did they imagine it would be free of harmful things. 01:21:54.520 |
What they correctly reasoned was that the benefit 01:22:02.480 |
which everyone understands to be substantial. 01:22:04.820 |
What I would say is they could not have anticipated 01:22:31.200 |
explicitly the right of citizens to say anything. 01:22:41.000 |
than the federal government of their worst nightmares 01:22:44.140 |
The power that these entities have to control thought 01:22:52.400 |
It doesn't mean that harmful things won't be said, 01:23:12.400 |
to limit what gets said or who gets to hear it. 01:23:24.240 |
I believe that allowing everything to be said 01:23:33.800 |
when there's complete freedom to share ideas, 01:23:39.760 |
So what I've noticed, just as a brief side comment, 01:24:12.800 |
And that's worrying in the time of emergency. 01:24:16.280 |
- Well, yes, it's all worrying in time of emergency, 01:24:19.720 |
But I want you to notice that what you've happened on 01:24:22.280 |
is actually an analog for a much deeper and older problem. 01:24:26.740 |
Human beings are the, we are not a blank slate, 01:24:31.500 |
but we are the blankest slate that nature has ever devised. 01:24:43.200 |
a large fraction of the cognitive capacity has been, 01:24:48.160 |
or of the behavioral capacity has been offloaded 01:24:51.500 |
to the software layer, which gets written and rewritten 01:24:56.500 |
That means effectively that much of what we are, 01:25:02.680 |
is housed in the cultural layer and the conscious layer 01:25:17.720 |
knows that children make absurd errors all the time, right? 01:25:20.640 |
That's part of the process, as we were discussing earlier. 01:25:24.200 |
It is also true that as you look across a field 01:25:29.580 |
a lot of what is said is pure nonsense, it's garbage. 01:25:45.920 |
So there is a high tendency for novelty to be created 01:25:50.440 |
in the cultural space, but there's also a high tendency 01:25:57.120 |
Everything is happening at a much higher rate. 01:25:59.440 |
Things are being created, they're being destroyed. 01:26:18.600 |
does it produce more open, fairer, more decent societies? 01:26:24.600 |
I can't prove it, but that does seem to be the pattern. 01:26:29.640 |
The thing is in the short term, freedom of speech, 01:26:34.640 |
absolute freedom of speech can be quite destructive, 01:26:51.480 |
I don't know how strongly I believe that it will work, 01:26:54.780 |
but I will say I haven't heard a better idea. 01:26:59.020 |
- I would also point out that there's something 01:27:01.720 |
very significant in this question of the hubris involved 01:27:10.300 |
which is the majority of concepts at the fringe are nonsense. 01:27:25.920 |
from the nonsense ideas, is the key to progress. 01:27:30.300 |
So if you decide, hey, the fringe is 99% garbage, 01:27:36.860 |
We're getting rid of 99% garbage for 1% something or other. 01:27:40.620 |
And the point is, yeah, but that 1% something or other 01:27:48.320 |
Frankly, I think at the point that it started censoring 01:28:00.340 |
We're censoring somebody who was just right, right? 01:28:09.380 |
- So you said one approach if you're on YouTube 01:28:20.740 |
Eric makes an excellent point about the distinction 01:28:28.460 |
So I agree, there's no value in allowing people 01:28:33.220 |
even if there's a technical legal defense for it. 01:28:41.960 |
people should be free to traffic in bad ideas, 01:28:44.100 |
and they should be free to expose that the ideas are bad. 01:28:50.680 |
- Yeah, there's an interesting line between ideas, 01:29:00.020 |
And then you start to encroach on personal attacks. 01:29:04.420 |
So not doxing, yes, but not even getting to that. 01:29:22.400 |
because maybe there's a tiny percent chance it is. 01:29:25.820 |
It just feels like personal attacks, it doesn't, 01:29:31.420 |
well, I'm torn on this because there's assholes 01:29:35.580 |
in this world, there's fraudulent people in this world. 01:29:37.820 |
So sometimes personal attacks are useful to reveal that, 01:29:44.500 |
Like there's a comedy where people make fun of others. 01:29:48.420 |
I think that's amazing, that's very powerful, 01:29:50.780 |
and that's very useful, even if it's painful. 01:30:00.100 |
where it's no longer, in any possible world, productive. 01:30:04.900 |
And that's a really weird gray area for YouTube 01:30:08.340 |
to operate in, and it feels like it should be 01:30:10.300 |
a crowdsourced thing where people vote on it, 01:30:13.800 |
but then again, do you trust the majority to vote 01:30:19.060 |
I mean, this is where, this is really interesting 01:30:23.220 |
on this particular, like the scientific aspect of this. 01:30:27.220 |
Do you think YouTube should take more of a stance, 01:30:30.980 |
not censoring, but to actually have scientists 01:30:35.780 |
within YouTube having these kinds of discussions, 01:30:39.020 |
and then be able to almost speak out in a transparent way, 01:30:42.180 |
this is what we're going to let this video stand, 01:30:55.380 |
Right now they're not, the recommender systems 01:31:12.980 |
- Well, at the moment, it's gonna be pretty hard 01:31:16.340 |
to compel me that these people should be trusted 01:31:22.480 |
on matters of evidence, because they have demonstrated 01:31:30.900 |
and I guess I'm open to the idea of institutions 01:31:36.300 |
that would be capable of offering something valuable. 01:31:41.300 |
literally curating things and putting some videos 01:31:53.340 |
is quite literally putting not only individual humans 01:31:57.340 |
in tremendous jeopardy by censoring discussion 01:32:00.740 |
of useful tools and making tools that are more hazardous 01:32:10.560 |
of a permanent relationship with this pathogen. 01:32:13.580 |
I cannot emphasize enough how expensive that is. 01:32:22.740 |
suffer and die from it is indefinitely large. 01:32:30.220 |
meaning if you click on the flat earth videos, 01:32:35.220 |
that's all you're going to be presented with, 01:32:50.820 |
it's very rare you're going to get a recommendation, 01:32:57.580 |
Same with vaccine, videos that present the power 01:33:27.180 |
And I'm not talking about like manually inserted CDC videos 01:33:36.020 |
should take the vaccine, it's the safest thing ever. 01:33:38.440 |
No, it's about incredible, again, MIT colleagues of mine, 01:33:42.100 |
incredible biologists, virologists that talk about 01:33:55.500 |
is I think the algorithm can fix a lot of this 01:34:06.380 |
to expand what people are able to think about, 01:34:15.060 |
hard-coded way, but balanced in a way that's crowdsourced. 01:34:18.780 |
I think that's an algorithm problem that can be solved 01:34:21.540 |
because then you can delegate it to the algorithm 01:34:36.140 |
Instead, creating a full spectrum of exploration 01:34:39.860 |
that can be done and trusting the intelligence of people 01:34:49.280 |
that we're talking about a publicly held company 01:35:13.000 |
if you searched, came up with the same thing for everyone. 01:35:25.640 |
and you can tune into one that isn't personalized, 01:35:44.400 |
It is unthinkable that in the face of a campaign 01:35:48.200 |
to vaccinate people in order to reach herd immunity, 01:35:51.780 |
that YouTube would give you videos on hazards of vaccines 01:36:00.520 |
how hazardous the vaccines are is an unsettled question. 01:36:14.560 |
are open-minded and are thinking through the hazards 01:36:30.800 |
Okay, let's come up with a very deadly disease 01:36:34.800 |
for which there's a vaccine that is very safe, 01:36:47.320 |
Suppose it is necessary in order to drive the pathogen 01:36:59.280 |
who thinks that the vaccine is a mind control agent. 01:37:22.440 |
If that were the actual configuration of the puzzle, 01:37:28.040 |
pointing you to this other video potentially. 01:37:35.120 |
where people are up to the challenge of sorting that out. 01:37:48.200 |
and there's plenty of evidence that the vaccine is safe, 01:37:52.760 |
we're gonna give you this one, puts it on a par, right? 01:38:13.800 |
in order to create a false story about safety. 01:38:30.960 |
So we're not talking about a comparable thing, 01:38:33.920 |
but I guess my point is the algorithmic solution 01:38:36.040 |
that you point to creates a problem of its own, 01:38:39.760 |
which is that it means that the way to get exposure 01:38:46.860 |
then suddenly YouTube would be recommending those things. 01:38:49.720 |
And that's obviously a gameable system at best. 01:38:57.600 |
maybe playing a little bit of a devil's advocate. 01:39:08.280 |
and become better communicators, more charismatic, 01:39:19.240 |
you have a lot more ammunition, a lot more data, 01:39:26.680 |
So be better at communicating and stop being, 01:39:31.000 |
you have to start trusting the intelligence of people 01:39:37.160 |
which is like, what is the internet hungry for? 01:39:50.520 |
especially the people that are in public office, 01:40:01.400 |
And I think a lot of people observing this entire system now, 01:40:05.120 |
younger scientists are seeing this and saying, 01:40:09.360 |
"Okay, if I want to continue being a scientist 01:40:16.040 |
"I'm going to have to be a lot more authentic." 01:40:21.240 |
So there's just a younger generation of minds coming up 01:40:28.640 |
that when the much more dangerous virus comes along, 01:40:41.840 |
So you're going to have the same problem with a deadly virus 01:41:00.560 |
to what I was saying about freedom of expression 01:41:04.400 |
ultimately provides an advantage to better ideas. 01:41:10.800 |
But I would also say there's probably some sort of, 01:41:15.620 |
where YouTube shows you the alternative point of view. 01:41:21.420 |
But one thing you could do is you could give us the tools 01:41:29.080 |
there's something I think myopic, solipsistic, 01:41:34.080 |
narcissistic about an algorithm that serves shareholders 01:41:54.640 |
So what I really want is analytics that allow me, 01:42:05.240 |
what alternative perspectives are being explored. 01:42:14.280 |
I find all of this evidence that ivermectin works. 01:42:17.520 |
and people talk about various protocols and this and that. 01:42:20.320 |
And then I could say, all right, what is the other side? 01:42:24.240 |
And I could see who is searching not as individuals, 01:42:28.880 |
but what demographics are searching alternatives. 01:42:33.880 |
with something Reddit like where effectively, 01:42:39.360 |
I don't know, that a vaccine is a mind control device 01:42:51.160 |
and as well as possible would rise to the top. 01:42:53.360 |
And so you could read the top three or four explanations 01:42:56.440 |
about why this really credibly is a mind control product. 01:43:01.240 |
And you can say, well, that doesn't really add up. 01:43:36.440 |
So I'm gonna file some Bayesian probability with it 01:43:40.800 |
the steel man arguments better than I was expecting. 01:43:43.600 |
So I could imagine something like that where, 01:43:53.600 |
So we would all get a sense for what everybody else 01:43:57.040 |
And then some layer that didn't have anything to do 01:44:10.640 |
And again, a layer in which those things could be defended. 01:44:14.760 |
So you could hear what a good argument sounded like 01:44:17.080 |
rather than just hear a caricatured argument. 01:44:28.400 |
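A toy sketch of the Reddit-like layer described here, where the best-defended steel-man explanations rise to the top; the data structures, example claims, and vote-based scoring are invented placeholders, not a feature of any existing platform.

```python
from dataclasses import dataclass, field

@dataclass
class Explanation:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    def score(self) -> int:
        # simple net-vote score; a real system might use Wilson scoring, etc.
        return self.upvotes - self.downvotes

@dataclass
class Claim:
    statement: str
    explanations: list = field(default_factory=list)

    def top_steelmen(self, n=3):
        # surface the n best-defended arguments for this claim
        return sorted(self.explanations, key=lambda e: e.score(), reverse=True)[:n]

claim = Claim("Hypothetical claim under discussion")
claim.explanations += [
    Explanation("Strongest good-faith argument offered so far", upvotes=42, downvotes=3),
    Explanation("A weaker, caricatured version of the argument", upvotes=5, downvotes=20),
]
for e in claim.top_steelmen():
    print(e.score(), e.text)
```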
as much as it could be measured over a long-term. 01:44:41.880 |
not immediate kind of dopamine rush kind of signals. 01:44:48.120 |
I think ultimately the algorithm on the individual level 01:44:53.120 |
should optimize for personal growth, long-term happiness, 01:45:00.880 |
just growth intellectually, growth in terms of lifestyle, 01:45:06.480 |
personally, and so on, as opposed to immediate. 01:45:10.440 |
I think that's going to build a better society, 01:45:19.320 |
exploring the space of ideas, changing your mind often, 01:45:23.200 |
increasing the level to which you're open-minded, 01:45:31.360 |
all those kinds of things the algorithm should optimize for. 01:45:34.160 |
Like creating a better human at the individual level. 01:45:44.800 |
will then be happier with themselves for having used it 01:45:47.320 |
and will be a lifelong quote-unquote customer. 01:45:58.480 |
- It's a terrible business model under the current system. 01:46:17.320 |
- Well, no, because if you have the alternative 01:46:24.000 |
I mean, it points out that YouTube is operating in this way, 01:46:30.680 |
- How long-term would you like the wisdom to prove at? 01:46:35.000 |
- Even a week is better than what's currently happening. 01:46:52.080 |
and you basically say, look, here's the metrics. 01:46:58.640 |
This is what people experience with social media. 01:47:02.560 |
They look back at the previous month and say, 01:47:04.960 |
I felt shitty in a lot of days because of social media. 01:47:11.160 |
- If you look back at the previous few weeks and say, 01:47:14.680 |
wow, I'm a better person because that month happened, 01:47:24.560 |
If you love, like a lot of people love their Tesla car, 01:47:36.600 |
- Well, you gotta ask yourself the question, though. 01:47:50.760 |
it's not easy to build products, tools, systems 01:48:00.560 |
Everything we're seeing now comes from the ideas 01:48:08.240 |
that are incentivizing long-term personal growth 01:48:18.600 |
For one thing, we had an alternative to Facebook 01:48:27.920 |
and Facebook bought a huge interest in it, and it died. 01:48:37.480 |
- Right, but it could have gotten better, right? 01:48:44.840 |
a good argument for it's not going to completely destroy 01:48:48.040 |
all of Twitter and Facebook when somebody does it, 01:48:51.080 |
or Twitter will catch up and pivot to the algorithm. 01:48:56.360 |
There's obviously great ideas that remain unexplored 01:49:03.080 |
That's true, but an internet that was non-predatory 01:49:06.920 |
is an obvious idea, and many of us know that we want it, 01:49:13.480 |
and we don't move because there's no audience there, 01:49:19.880 |
But let me just, I wasn't kidding about build the system 01:49:27.620 |
So in our upcoming book, Heather and I in our last chapter 01:49:32.600 |
explore something called the fourth frontier, 01:49:34.760 |
and fourth frontier has to do with sort of a 2.0 version 01:49:45.680 |
We would have to effectively navigate our way there. 01:49:53.400 |
liberates humans meaningfully, and most importantly, 01:49:57.860 |
it has to feel like growth without depending on growth. 01:50:11.600 |
or unexploited opportunities, and when we find them, 01:50:14.840 |
our ancestors for example, if they happen into a new valley 01:50:18.240 |
that was unexplored by people, their population would grow 01:50:23.320 |
So there would be this great feeling of there's abundance 01:50:25.660 |
until you hit carrying capacity, which is inevitable, 01:50:30.500 |
So in order for human beings to flourish long-term, 01:50:34.040 |
the way to get there is to satisfy the desire for growth 01:50:50.660 |
people do something that causes their population 01:50:54.100 |
to experience growth, which is they go and make war on 01:50:57.760 |
or commit genocide against some other population, 01:50:59.980 |
which is something we obviously have to stop. 01:51:02.920 |
- By the way, this is a hunter-gatherer's guide 01:51:09.240 |
- With your wife, Heather, being released this September. 01:51:11.200 |
I believe you said you're going to do a little bit 01:51:43.560 |
One of the things that I fault my libertarian friends for 01:51:48.280 |
is this faith that the market is going to find solutions 01:51:52.580 |
And my sense is I'm a very strong believer in markets, 01:52:00.280 |
But what I don't believe is that they should be allowed 01:52:06.400 |
Markets are very good at figuring out how to do things. 01:52:12.120 |
what we should do, right, what we should want. 01:52:25.800 |
where human wellbeing is actually the driver, 01:52:32.320 |
would naturally be enhancing of human wellbeing. 01:52:39.840 |
markets are finding our every defect of character 01:52:44.940 |
and making us worse to each other in the process. 01:53:11.340 |
- So one of the arguments against the discussion 01:53:38.500 |
There are some things that we cannot rule out 01:53:44.440 |
The two that I think people should be tracking 01:53:46.800 |
is the possibility, some would say a likelihood, 01:53:53.640 |
that is to say very narrowly focused on a single antigen, 01:54:05.100 |
that will escape the protection that comes from the vaccine. 01:54:11.660 |
It is a particular hazard in light of the fact 01:54:14.020 |
that these vaccines have a substantial number 01:54:18.880 |
So one danger is that a person who has been vaccinated 01:54:22.500 |
will shed viruses that are specifically less visible 01:54:27.500 |
or invisible to the immunity created by the vaccines. 01:54:41.800 |
with something called antibody-dependent enhancement, 01:54:45.220 |
which is something that we see in certain diseases 01:54:51.660 |
and then their second case is much more devastating. 01:54:54.100 |
So breakbone fever is when you get your second case 01:55:00.740 |
the immune response that is produced by prior exposure 01:55:04.580 |
to attack the body in ways that it is incapable 01:55:09.940 |
this pattern has apparently blocked past efforts 01:55:22.740 |
of harm done to individuals by these vaccines, 01:55:26.900 |
we have to ask about what the overall impact is going to be. 01:55:29.620 |
And it's not clear in the way people think it is, 01:55:32.220 |
that if we vaccinate enough people, the pandemic will end. 01:55:48.980 |
to create deadlier different strains of the virus? 01:55:53.020 |
So is there something particular with these mRNA vaccines 01:56:01.420 |
- Well, it's not even just the mRNA vaccines. 01:56:03.460 |
The mRNA vaccines and the adenovector DNA vaccine 01:56:14.220 |
So that is a very concentrated evolutionary signal. 01:56:31.380 |
then there might be substantial enough immunity 01:56:44.020 |
And what that means is the disease is now faced 01:56:48.580 |
to effectively evolutionarily practice escape strategies. 01:56:58.400 |
you would typically not have one antigen, right? 01:57:01.240 |
You would have basically a virus full of antigens 01:57:08.140 |
So that is the case for people who have had COVID, right? 01:57:19.240 |
So these platforms create that special hazard. 01:57:28.240 |
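To show only the abstract shape of that argument, here is a toy calculation, not an epidemiological claim: if immunity recognizes a single site, one change can slip past it, while immunity spread across many sites requires many simultaneous changes. The mutation rate and site counts are arbitrary assumptions.

```python
# Toy model of "escape" probability when immunity targets 1 site vs. many sites.
# Purely illustrative; the rate and site counts are arbitrary assumptions.

def escape_probability(per_site_mutation_rate: float, targeted_sites: int) -> float:
    """Chance that a single genome changes *every* targeted site in one replication,
    assuming independent mutations (a deliberately crude simplification)."""
    return per_site_mutation_rate ** targeted_sites

mu = 1e-3  # hypothetical chance a given targeted site mutates in one replication
print(escape_probability(mu, targeted_sites=1))   # 1e-3: a narrow target is easy to slip past
print(escape_probability(mu, targeted_sites=10))  # 1e-30: a broad target, effectively never
```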
that coat the RNAs are distributing themselves 01:57:40.200 |
- Is it possible for you to steel man the argument 01:57:49.040 |
The argument that everybody should get vaccinated 01:57:54.640 |
Phase three trials showed good safety for the vaccines. 01:58:01.960 |
but what we saw suggested high degree of efficacy 01:58:14.400 |
of available victims for the pathogen to a very low number 01:58:19.400 |
so that herd immunity drives it to extinction 01:58:22.040 |
requires us all to take our share of the risk 01:58:39.760 |
if the population is vaccinated from the vaccine 01:58:43.200 |
than will die from COVID if they're not vaccinated. 01:58:45.400 |
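For reference, the textbook relation underneath that steel-man (not something stated in the conversation) is the classic herd-immunity threshold, 1 − 1/R0, adjusted for vaccine efficacy. The numbers below are purely illustrative.

```python
# Classic herd-immunity threshold arithmetic (textbook relation, illustrative numbers only).

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune for the effective R to drop below 1."""
    return 1.0 - 1.0 / r0

def required_coverage(r0: float, vaccine_efficacy: float) -> float:
    """Vaccination coverage needed if the vaccine blocks transmission with the given efficacy."""
    return herd_immunity_threshold(r0) / vaccine_efficacy

r0 = 3.0   # hypothetical basic reproduction number
ve = 0.9   # hypothetical transmission-blocking efficacy
print(herd_immunity_threshold(r0))   # ~0.67 of the population immune
print(required_coverage(r0, ve))     # ~0.74 coverage needed at that efficacy
```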
- And with the vaccine as it currently is being deployed, 01:59:11.320 |
well, I don't know, maybe you disagree with that, 01:59:17.360 |
is that the effects of the virus will fade away. 01:59:21.620 |
- First of all, I don't believe that the probability 01:59:23.520 |
of creating a worse pandemic is low enough to discount. 01:59:29.420 |
and frankly, we are seeing a wave of variants 01:59:40.760 |
where they have been, where they haven't been, 01:59:56.440 |
And why it is terrible is another question, right? 02:00:08.560 |
like the website for self-reporting side effects 02:00:34.760 |
It makes it so unclear what you're reporting. 02:00:44.480 |
with many pages and forms that don't make any sense? 02:00:49.180 |
what are the incentives for you to use that website? 02:01:01.120 |
anonymized data about who's getting vaccinated, 02:01:14.940 |
and using that crappy data to make conclusions 02:01:21.120 |
that can arrive at whatever conclusions you want. 02:01:32.240 |
they're going to try to construct any kind of narratives 02:01:47.560 |
we don't have even the data to understand the dangers. 02:01:52.120 |
- Yeah, I'm gonna pick up on your rant and say, 02:01:55.600 |
we, estimates of the degree of under-reporting in VAERS 02:02:11.160 |
So in the US, we have above 5,000 unexpected deaths 02:02:16.160 |
that seem, in time, to be associated with vaccination. 02:02:27.960 |
We don't know how large, I've seen estimates, 02:02:39.320 |
but the necessity of immunizing the population 02:03:07.160 |
who had had COVID-19, which is a large population. 02:03:10.640 |
There's no reason to expose those people to danger. 02:03:18.280 |
So there's no reason that we would be allowing 02:03:22.320 |
if this was really about an acceptable number 02:03:25.120 |
of deaths arising out of this set of vaccines. 02:03:32.840 |
and I struggle to find language that is strong enough 02:03:37.840 |
for the horror of vaccinating children in this case, 02:03:42.960 |
because children suffer a greater risk of long-term effects 02:03:49.880 |
and because this is earlier in their development, 02:03:51.960 |
therefore it impacts systems that are still forming. 02:04:09.640 |
to the way the virus spreads to the population. 02:04:11.960 |
But if that's the reason that we are inoculating children, 02:04:14.280 |
and there has been some revision in the last day or two 02:04:21.000 |
but to the extent that we were vaccinating children, 02:04:24.200 |
we were doing it to protect old, infirm people 02:04:28.280 |
who are the most likely to succumb to COVID-19. 02:04:37.140 |
robs children of life to save old, infirm people? 02:04:45.200 |
we are going about vaccinating, who we are vaccinating, 02:04:55.680 |
vaccinating people is a good in and of itself, 02:05:03.080 |
I don't wanna prevent you from jumping in here, 02:05:06.560 |
in addition to the fact that we're exposing people 02:05:08.480 |
to danger that we should not be exposing them to. 02:05:17.560 |
that's an incredible solution is large-scale testing. 02:05:21.980 |
- But that might be another couple of hours conversation, 02:05:26.400 |
but there's these solutions that are obvious, 02:05:30.520 |
So you could argue that ivermectin is not that obvious, 02:05:34.680 |
but maybe the whole point is you have aggressive, 02:05:38.760 |
very fast research that leads to a meta-analysis 02:05:43.080 |
and then large-scale production and deployment. 02:05:53.560 |
of large-scale deployment of testing, at-home testing, 02:06:08.640 |
- Well, let me just say, I am also completely shocked 02:06:11.680 |
that we did not get on high-quality testing early 02:06:15.000 |
and that we are still suffering from this even now, 02:06:23.680 |
would tell us a lot about its mode of transmission, 02:06:26.080 |
which would allow us to protect ourselves better. 02:06:35.880 |
- You've spoken with Eric Weinstein, your brother, 02:06:41.440 |
about the ideas that eventually led to the paper 02:06:45.160 |
you published titled "The Reserve-Capacity Hypothesis." 02:06:59.900 |
- Sure, easier to explain the conclusion of the paper. 02:07:15.000 |
We call that process, which is otherwise called aging, 02:07:19.900 |
And senescence, in this paper, it is hypothesized, 02:07:29.280 |
of a cancer prevention feature of our bodies, 02:07:40.760 |
There are a few cells in the body that are exceptional, 02:08:01.360 |
And that explains why we become decrepit as we grow old. 02:08:09.480 |
especially in light of the fact that the mechanism 02:08:12.920 |
that seems to limit the ability of cells to reproduce 02:08:21.280 |
but it's a DNA sequence at the ends of our chromosomes 02:08:26.320 |
And the number of repeats functions like a counter. 02:08:37.600 |
And at the point that the telomere becomes critically short, 02:08:41.880 |
even though it still has the capacity to do so. 02:08:44.640 |
Stops dividing and it starts transcribing different genes 02:08:50.240 |
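The "counter" logic being described can be sketched in a few lines. The repeat counts and critical threshold below are arbitrary illustrative numbers, purely to show how a shortening telomere acts as a division counter that eventually halts replacement.

```python
# Minimal sketch of a telomere acting as a division counter.
# Repeat counts and the critical threshold are arbitrary illustrative numbers.

class Cell:
    def __init__(self, telomere_repeats: int = 50, critical_length: int = 10):
        self.telomere_repeats = telomere_repeats
        self.critical_length = critical_length

    def can_divide(self) -> bool:
        # Below the critical length the cell stops dividing (senescence),
        # even though the replication machinery is otherwise intact.
        return self.telomere_repeats > self.critical_length

    def divide(self) -> "Cell":
        assert self.can_divide(), "senescent: stops dividing, switches gene expression"
        self.telomere_repeats -= 1          # each division shortens the telomere slightly
        return Cell(self.telomere_repeats, self.critical_length)

cell = Cell()
divisions = 0
while cell.can_divide():
    cell.divide()
    divisions += 1
print(divisions)  # 40 with these numbers: a built-in limit on replacement capacity
```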
So what my work did was it looked at the fact 02:08:53.880 |
that the telomeric shortening was being studied 02:08:57.880 |
It was being studied by people who were interested 02:09:03.440 |
And it was being studied in exactly the opposite fashion 02:09:06.520 |
by people who were interested in tumorigenesis and cancer. 02:09:16.700 |
That's the enzyme that lengthens our telomeres. 02:09:19.040 |
So those folks were interested in bringing about a halt 02:09:27.520 |
And the folks who were studying the senescence process 02:09:32.280 |
in order to generate greater repair capacity. 02:09:42.880 |
that the genes which create the tendency of the cells 02:09:47.880 |
to be limited in their capacity to replace themselves 02:09:55.600 |
which is that we are largely free of tumors and cancer 02:10:01.160 |
that we grow feeble and inefficient and eventually die. 02:10:11.280 |
I was fortunate enough to know, George Williams, 02:10:16.640 |
who argued that senescence would have to be caused 02:10:19.760 |
by pleiotropic genes that cause early life benefits 02:10:26.160 |
And although this isn't the exact nature of the system 02:10:29.240 |
he predicted, it matches what he was expecting 02:10:35.880 |
- That said, the focus of the paper is about the, 02:10:43.800 |
"We observed that captive rodent breeding protocols 02:10:49.000 |
"We observed that captive rodent breeding protocols 02:11:00.040 |
"that would otherwise favor tumor suppression. 02:11:03.160 |
"This appears to have greatly elongated the telomeres 02:11:07.840 |
"With their telomeric fail-safe effectively disabled, 02:11:15.280 |
So basically using these mice is not going to lead 02:11:31.780 |
So I think, especially with your discussion with Eric, 02:11:37.280 |
the conclusion of this paper has to do with the fact 02:11:42.260 |
that we shouldn't be using these mice to test the safety 02:11:47.260 |
or to make conclusions about cancer or senescence. 02:11:55.040 |
Like basically saying that the length of these telomeres 02:12:01.960 |
I think there was a reason that the world of scientists 02:12:17.520 |
was that there was a result that everybody knew 02:12:23.040 |
The result was that mice have very long telomeres, 02:12:30.260 |
Now, we can talk about what the actual meaning 02:12:34.560 |
but in the end, I was confronted with a hypothesis 02:12:41.560 |
of the way mammals and indeed vertebrates age, 02:12:48.360 |
maybe there's something wrong with the result. 02:12:54.360 |
through some bad protocol and everybody else was repeating it 02:13:05.320 |
there was something about the breeding protocols 02:13:13.840 |
that have long telomeres would be laboratory mice 02:13:18.720 |
And Carol Greider, who agreed to collaborate with me, 02:13:23.220 |
tested that hypothesis and showed that it was indeed true 02:13:29.940 |
that had been in captivity for a much shorter period of time 02:13:43.100 |
And the implication of that is that the animals 02:13:54.460 |
because they have effectively an infinite capacity 02:14:10.880 |
because it functions as a kind of chemotherapy. 02:14:46.860 |
And the problem is that this actually is a system 02:14:53.180 |
that have the difficult job of bringing compounds to market, 02:15:10.460 |
that result from using these mice a number of times, 02:15:14.420 |
which turned out to do conspicuous heart damage. 02:15:20.060 |
and this idea has not gotten significant traction? 02:15:27.580 |
said something to me that rings in my ears to this day. 02:15:34.380 |
that laboratory mice have anomalously long telomeres 02:15:37.420 |
and that wild mice don't have long telomeres, 02:15:39.980 |
I asked her where she was going to publish that result 02:15:44.540 |
And she said that she was going to keep the result in-house 02:15:49.820 |
And at the time, I was a young graduate student. 02:15:54.060 |
I didn't really understand what she was saying. 02:15:58.620 |
the knowledge that a model organism is broken 02:16:04.820 |
that certain results will be reliably generatable, 02:16:13.340 |
that you know how those models will misbehave 02:16:23.780 |
is appear to have deeper insight into biology 02:16:26.460 |
because they predict things better than others do, 02:16:31.140 |
so that your predictions come out true is advantageous. 02:16:40.140 |
when it figured out that the mice were predisposed 02:16:49.080 |
it was the perfect cover for the difficult job 02:16:54.080 |
and then discovering their actual toxicity profile. 02:17:11.420 |
is not a strong variable for the conclusions, 02:17:21.120 |
So one reason she and others could be ignoring this 02:17:30.460 |
And in fact, at the point that I went to publish my paper, 02:17:36.720 |
She did so in a way that did not make a huge splash. 02:17:39.580 |
- Did she, like, I apologize if I don't know how, 02:17:43.100 |
what was the emphasis of her publication of that paper? 02:17:54.220 |
there's a kind of more of a philosophical statement as well. 02:18:00.900 |
in the evolutionary dynamics around senescence. 02:18:03.540 |
I wasn't pursuing grants or anything like that. 02:18:07.660 |
I was just working on a puzzle I thought was interesting. 02:18:10.880 |
Carol has, of course, gone on to win a Nobel Prize 02:18:17.460 |
of telomerase, the enzyme that lengthens telomeres. 02:18:20.400 |
But anyway, she's a heavy hitter in the academic world. 02:18:27.800 |
I do know that she told me she wasn't planning to publish. 02:18:31.720 |
that she was in the process of publishing very late. 02:18:36.880 |
to see whether or not she had put evidence in it 02:18:57.340 |
The fact is, the reason I called her in the first place 02:19:00.600 |
and established contact that generated our collaboration 02:19:12.280 |
about whether the model organisms are distorting 02:19:15.640 |
the understanding of the functioning of telomeres, 02:19:23.600 |
as a young graduate student, do you think Carol 02:19:23.600 |
- You know, I don't think of it in those terms, 02:19:37.520 |
but I have a complex relationship with this story. 02:19:42.360 |
On the one hand, I'm livid with Carol Greider 02:19:42.360 |
She absolutely pretended that I didn't exist in this story 02:19:51.920 |
My interest was as an evolutionary biologist, 02:19:59.000 |
and frankly, I think it would have been better for her 02:20:12.440 |
and I should say there's been a lot of confusion 02:20:25.840 |
I had one of the most bizarre emotional experiences 02:20:38.840 |
with no acknowledgement of where it had come from, 02:20:55.600 |
You have to understand, as a young scientist, 02:21:03.320 |
presented in what's surely the most important lecture 02:21:10.360 |
It's thrilling, it was thrilling to see, you know, 02:21:13.840 |
her figures projected on the screen there, right? 02:21:18.640 |
To have been part of work that was important enough 02:21:23.180 |
to be erased from the story felt absolutely terrible. 02:21:26.340 |
So anyway, that's sort of where I am with it. 02:21:30.120 |
My sense is, what I'm really troubled by in the story 02:21:41.020 |
the flaw with the mice has not been addressed. 02:21:44.800 |
And actually, Eric did some looking into this, 02:21:48.060 |
he tried to establish, by calling the Jax Lab 02:21:50.700 |
and trying to ascertain what had happened with the colonies, 02:21:58.660 |
There was seemingly no awareness that it was even an issue. 02:22:02.260 |
So I'm very troubled by the fact that as a father, 02:22:06.060 |
for example, I'm in no position to protect my family 02:22:14.980 |
Even though I'm aware of where the hazard comes from, 02:22:18.900 |
about which of these drugs will turn out to do damage 02:22:23.300 |
And that's a very frustrating position to be in. 02:22:28.140 |
that's even still grateful to Carol for taking my call. 02:22:41.300 |
And for a while, she was a good collaborator. 02:22:43.860 |
- Well, can I, I have to proceed carefully here 02:22:57.000 |
you're kind of saying that she basically erased credit, 02:23:31.260 |
perhaps less interesting than what we're discussing here 02:23:37.700 |
In general, with everything I'm doing with robotics, 02:23:52.260 |
see, this is different because the idea has more power 02:24:00.120 |
Like, so the engineering world is a little different, 02:24:03.200 |
but there's a kind of sense that I probably forgot 02:24:08.040 |
a lot of brilliant ideas that have been told to me. 02:24:14.840 |
Do you think she was so busy that she kind of forgot? 02:24:19.300 |
She has this stream of brilliant people around her, 02:24:23.180 |
there's a bunch of ideas that are swimming in the air, 02:24:41.820 |
He emailed me excitedly when the results came in. 02:24:46.740 |
So there was no ambiguity about what had happened. 02:24:52.920 |
I actually sent it to Carol in order to get her feedback 02:24:56.680 |
because I wanted to be a good collaborator to her. 02:25:06.080 |
but it was clear at that point that she became an antagonist 02:25:12.500 |
She couldn't possibly have forgotten the conversation. 02:25:15.360 |
I believe I even sent her tissues at some point, 02:25:23.080 |
but as a favor, she was doing another project 02:25:25.040 |
that involved telomeres and she needed samples 02:25:28.060 |
because of the Museum of Zoology that I was in. 02:25:34.240 |
I certainly know that those sorts of things can happen, 02:25:46.360 |
without saying where the hypothesis had come from 02:25:58.700 |
Do you think about the trajectory of being a researcher, 02:26:10.140 |
of publishing further papers along this line? 02:26:13.560 |
I mean, that's often the dynamic of that fascinating space 02:26:18.560 |
is you have a junior researcher with brilliant ideas 02:26:21.980 |
and a senior researcher that starts out as a mentor 02:26:33.560 |
is to publish a bunch more papers in this place 02:26:59.560 |
the clearer it was there was something there, 02:27:03.800 |
And I didn't want to become a telomere researcher. 02:27:08.660 |
What I want to do is to be an evolutionary biologist 02:27:12.260 |
who upgrades the toolkit of evolutionary concepts 02:27:40.060 |
but the basic point is, look, the work was good. 02:27:47.000 |
Frankly, the model of senescence that I presented 02:27:50.100 |
is now widely accepted and I don't feel any misgivings 02:28:03.640 |
and it would be a waste to get overly narrowly focused. 02:28:08.080 |
- There's so many ways through the space of science 02:28:12.880 |
and the most common way is to just publish a lot. 02:28:16.760 |
Just publish a lot of papers, do these incremental work 02:28:19.200 |
and exploring the space kind of like ants looking for food. 02:28:24.200 |
You're tossing out a bunch of different ideas. 02:28:26.840 |
Some of them could be brilliant breakthrough ideas, 02:28:28.960 |
Nature-level papers, some of them are more conference kind 02:48:28.960 |
Did you consider that kind of path in science? 02:28:44.600 |
with the process of peer review be this story, 02:28:48.720 |
which was frankly a debacle from one end to the other 02:28:55.840 |
It did not, it was not a very good sales pitch 02:28:58.800 |
for trying to make a difference through publication. 02:29:01.480 |
And I would point out part of what I ran into 02:29:03.320 |
and I think frankly part of what explains Carol's behavior 02:29:10.640 |
there is this dynamic where PIs parasitize their underlings 02:29:15.640 |
and if you're very, very good, you rise to the level 02:29:28.320 |
and it wasn't the culture of the lab I grew up in at all. 02:29:31.200 |
My lab, in fact, the PI, Dick Alexander, who's now gone, 02:29:35.840 |
but who was an incredible mind and a great human being, 02:29:44.280 |
not because it wouldn't have been useful and exciting, 02:29:47.480 |
but because in effect, he did not want any confusion 02:29:54.460 |
because he was a great mentor and the idea was actually, 02:30:00.400 |
and you don't want people thinking that they are. 02:30:11.200 |
I don't like the idea that if you are very good, 02:30:19.180 |
A crossing over from evolution into cellular biology 02:30:29.960 |
And I would also point out that my work falls 02:30:35.540 |
My work generally takes evidence accumulated by others 02:30:41.540 |
and places it together in order to generate hypotheses 02:30:55.260 |
with narrow publications that are read by few. 02:30:59.700 |
And in fact, I would point to the very conspicuous example 02:31:04.840 |
I've learned a tremendous amount from and I greatly admire. 02:31:12.300 |
in the sense of peer reviewed papers in journals. 02:31:15.780 |
What he's done instead is done synthetic work 02:31:19.620 |
which are not peer reviewed in the same sense. 02:31:27.080 |
So my sense is if Richard Dawkins can illustrate 02:31:34.420 |
without using journals as the primary mechanism 02:31:38.380 |
for distributing what you've come to understand, 02:31:42.380 |
and it's a far better one from the point of view 02:31:56.180 |
You could also publish, I'm working on something 02:31:58.860 |
with Andrew Huberman now, you can also publish synthesis. 02:32:03.260 |
they're exceptionally valuable for the communities. 02:32:06.660 |
It brings the community together, tells a history, 02:32:09.380 |
tells a story of where the community has been. 02:32:11.080 |
It paints a picture of where the path lays for the future. 02:32:15.560 |
And Richard Dawkins is a good example of somebody 02:32:17.780 |
that does that in book form that he kind of walks the line 02:32:23.700 |
You have like somebody who like Neil deGrasse Tyson, 02:32:28.820 |
Richard Dawkins sometimes is a science communicator, 02:32:33.980 |
to where it's a little bit, it's not shying away 02:32:47.620 |
I mean, Roger Penrose, I mean, similar kind of idea. 02:32:53.020 |
Synthesis does not, especially synthesis work, 02:32:56.440 |
work that synthesizes ideas does not necessarily 02:33:13.460 |
And that's the thing is people don't understand 02:33:23.340 |
in a place where there is a power dynamic, right? 02:33:26.580 |
I mean, the joke of course is that peer review 02:33:30.340 |
Your biggest competitors get to see your work 02:33:37.220 |
And, you know, again, when your formative experience 02:33:41.220 |
with the publication apparatus is the one I had 02:33:50.940 |
- And, you know, what's the harm in publishing them 02:33:53.980 |
so that your peers have to review them in public 02:33:55.900 |
where they actually, if they're gonna disagree with you, 02:33:58.580 |
they actually have to take the risk of saying, 02:34:00.620 |
I don't think this is right and here's why, right? 02:34:05.580 |
It's not that I don't want my work reviewed by peers, 02:34:41.020 |
I've confronted this in scientific work I've done at MIT 02:34:45.340 |
where there are certain things that are not done well. 02:34:49.900 |
People are not being the best version of themselves. 02:35:07.240 |
versus doing the hard work of publishing papers 02:35:15.900 |
you guys are doing it wrong and then just walking away. 02:35:31.500 |
- Probably one of the best questions I've ever been asked. 02:35:38.300 |
First of all, we are all mysteries to ourself at some level. 02:35:43.560 |
So it's possible there's stuff going on in me 02:35:48.320 |
But in general, I would say one of my better strengths 02:35:56.360 |
I clearly think highly of myself, but it is not driving me. 02:36:09.360 |
And there's something even better about the phone calls 02:36:23.800 |
when you have a toolkit in which you can actually look 02:36:33.080 |
And I could entertain myself for the rest of my life. 02:36:35.760 |
If I was somehow isolated from the rest of the world, 02:36:39.840 |
but I was in a place that was biologically interesting, 02:36:50.780 |
and I could just go out every day and look at cool stuff 02:36:53.400 |
and figure out what it means, I could be all right with that. 02:36:56.700 |
So I'm not heavily driven by the ego thing, as you put it. 02:37:05.640 |
except instead of the pets, I would put robots. 02:37:09.240 |
So it's not, it's the Eureka, it's the exploration 02:37:12.080 |
of the subject that brings you joy and fulfillment. 02:37:21.520 |
I will say I also have kind of a secondary passion 02:37:27.560 |
but I do believe, I believe I found my calling. 02:37:36.120 |
So I get what you're saying about the analogy quite well. 02:37:53.800 |
and this question is so good, it deserves a candid answer, 02:38:00.360 |
I like fighting against people I don't respect, 02:38:10.020 |
One of the reasons I have no interest in martyrdom 02:38:18.760 |
- I have a wonderful wife, I have amazing children, 02:38:23.420 |
I live in a lovely place, I don't wanna exit. 02:38:32.220 |
and a willingness to exit, if that's the only way, 02:38:37.840 |
but it is an acceptance that fighting is dangerous, 02:38:46.040 |
I don't have the sense that the thing is out there 02:38:52.860 |
It's primarily done through destroying them reputationally, 02:38:56.860 |
which is not something I relish the possibility of. 02:39:00.520 |
But there's a difference between a willingness 02:39:13.560 |
For me, the thrill is in fighting when I'm in the right. 02:39:33.580 |
If it is not channeled into something useful, 02:39:36.580 |
so it damn well better be channeled into something useful. 02:39:42.160 |
You're just making me realize that enjoying the fight, 02:39:50.500 |
fighting the powerful and idea that you believe is right, 02:40:07.980 |
into personal action, this hope for humanity, 02:40:15.720 |
And that makes you feel good about the rest of humanity, 02:40:20.720 |
that if there's people like me, then we're going to be okay. 02:40:33.060 |
and you're fighting the powerful against all odds, 02:40:57.660 |
And again, I recognize you, you're a familiar, 02:41:01.640 |
your construction's familiar, even if it isn't mine, right? 02:41:19.280 |
As far as I know, we could still save humanity, 02:41:25.760 |
But I expect us not to, I expect us to fuck it up, right? 02:41:29.520 |
I don't like that thought, but I've looked into the abyss, 02:41:34.000 |
and the number of ways we could not succeed are many, 02:41:40.480 |
to get out of this very dangerous phase of history is small. 02:41:50.580 |
That I was a coward, that I prioritized other things. 02:41:55.580 |
At the end of the day, I think I will be able 02:42:02.200 |
is that when I saw clearly what needed to be done, 02:42:05.640 |
I tried to do it to the extent that it was in my power. 02:42:16.240 |
And frankly, I regard what I just said to you 02:42:18.480 |
as something like a personality defect, right? 02:42:25.800 |
On the other hand, my guess is that personality defect 02:42:39.560 |
So yeah, our perspective on the world are different, 02:42:54.720 |
that we're gonna win, more than likely we're going to be okay. 02:43:03.800 |
But back to Eric, he had a wonderful conversation. 02:43:07.320 |
In that conversation, he played the big brother role 02:43:25.600 |
I mean, for one thing, Eric and I are interestingly similar 02:43:30.280 |
in some ways and radically different in some other ways. 02:43:36.980 |
because almost always people meet one of us first 02:43:44.400 |
But I had a great advantage, which is I came second. 02:43:58.000 |
because A, he was a very awesome older brother 02:44:02.640 |
who made interesting mistakes, learned from them 02:44:06.000 |
and conveyed the wisdom of what he had discovered. 02:44:08.760 |
And that was, I don't know who else ends up so lucky 02:44:13.560 |
as to have that kind of person blazing the trail. 02:44:44.520 |
What he's fascinated by is the fundamental of fundamentals. 02:44:51.640 |
where somebody was becoming excellent in that 02:44:55.560 |
the fundamental of fundamentals was going to be pointless. 02:44:58.800 |
I was gonna be playing second fiddle forever. 02:45:15.240 |
And I have to thank him for it because, you know, I mean-- 02:45:37.760 |
which is sort of an echo of Eric's style of thinking, 02:45:43.440 |
- He overpoweringly argues for the importance of physics, 02:45:57.420 |
Is there an argument to be made against that? 02:46:09.680 |
at the cosmic scale and then builds the beautiful thing 02:46:18.360 |
the systems that we're actually interacting with 02:46:21.640 |
in this human world are much more important to understand 02:46:25.420 |
than the low level theories of quantum mechanics 02:46:32.940 |
- Yeah, I can't say that one is more important. 02:46:35.700 |
I think there's probably a different timescale. 02:46:42.960 |
but the bang for the buck at the far fundamental layer 02:46:51.220 |
I'm pretty sure it's gonna have to be fusion powered. 02:46:58.820 |
assuming we didn't just dump fusion power on the market 02:47:08.580 |
and we had a little bit more wisdom than we have, 02:47:12.060 |
And that's not gonna come from people like me 02:47:32.580 |
in the futures of science will be developed by AI systems. 02:47:36.020 |
And I think in order to build intelligent AI systems, 02:47:39.060 |
you have to be a scholar of the fundamental of the emergent, 02:47:52.960 |
at least directly, don't have anything to do with physics. 02:48:07.580 |
- I don't wanna outsource that thing to the AI. 02:48:13.240 |
because some of the people who trained Heather and me 02:48:21.740 |
And the problem with systematics is that to do it right 02:48:34.180 |
Your method has to be based in the highest level of rigor. 02:48:46.840 |
with just lots and lots and lots of automated work. 02:48:51.780 |
there's like a generation of phylogenetic systematists 02:49:07.960 |
- Is there something that you disagree with Eric on, 02:49:13.100 |
you've failed so far, but you will eventually succeed? 02:49:26.960 |
And I'm trying to think what would be the ideal-- 02:49:30.860 |
in the space of philosophy, politics, family, love, robots? 02:49:47.120 |
in which I believe that I have butted heads with Eric 02:49:55.040 |
substantially over time. - So you've been winning. 02:50:01.860 |
It's quite possible he could say the same thing about me. 02:50:06.220 |
There are places where he's absolutely convinced me. 02:50:21.580 |
Like when something, like you heard me talk about the, 02:50:29.540 |
It was the autopilot that seems to be putting 02:50:33.820 |
a great many humans in needless medical jeopardy 02:50:40.300 |
And my feeling is we can say this almost for sure. 02:50:52.900 |
and handing down dictates from the WHO and all of that, 02:51:01.300 |
There's gonna be some embarrassing emails in some places 02:51:03.920 |
that are gonna reveal some shocking connections. 02:51:05.780 |
And then there's gonna be an awful lot of emergence 02:51:11.380 |
In which people were doing their little part of a job 02:51:19.580 |
and how much are we looking at an emergent process? 02:51:24.820 |
And the question is, what is the ratio in this case? 02:51:32.500 |
I think he's focused on the people who have a choice 02:51:38.580 |
- Discussion of the ratio is a distraction to that. 02:51:44.980 |
because it grants cover to people who are harming others. 02:51:57.900 |
I would say it alters his judgment on the matter. 02:52:02.020 |
But anyway, certainly useful just to leave open 02:52:21.980 |
- So let me ask you about, back to your book, 02:52:25.900 |
"Hunter-Gatherer's Guide to the 21st Century." 02:52:33.740 |
That's really exciting that there's like a structured, 02:52:37.680 |
A kind of, from an evolutionary biology perspective, 02:52:59.160 |
I think about a little bit in this modern world, 02:53:15.480 |
- But that said, I don't know what's the right way 02:53:21.720 |
to approach this, but from an evolutionary biology 02:53:26.480 |
perspective or from just looking at modern society, 02:54:09.340 |
that it is not waning because there is a superior system. 02:54:15.380 |
So let us just say there is a lot of pre-trans fallacy here 02:54:20.380 |
where people go through a phase where they recognize 02:54:26.120 |
that actually we know a lot about the evolution of monogamy 02:54:36.600 |
that there has been a lot of polygyny in human history. 02:54:39.480 |
And in fact, most of human history was largely polygynous. 02:54:45.120 |
But it is also the case that most of the people 02:54:58.060 |
What that is, is part of what I mentioned before, 02:55:01.360 |
where human beings can swap out their software program 02:55:11.960 |
So I would argue that the benefit of monogamy, 02:55:19.560 |
is that it brings all adults into child rearing. 02:55:26.640 |
is because human babies are very labor intensive. 02:55:35.440 |
having an extended family also is very important. 02:55:43.480 |
that is expanding, a monogamous mating system makes sense. 02:55:48.000 |
It makes sense because it means that the number 02:56:07.040 |
which means the total amount of parental effort is lower 02:56:12.160 |
So what I'm arguing is that you should expect 02:56:21.880 |
And at the point that they have reached carrying capacity, 02:56:24.000 |
you should expect to see polygyny break back out. 02:56:26.480 |
And what we are seeing is a kind of false sophistication 02:56:30.120 |
around polyamory, which will end up breaking down 02:56:33.640 |
into polygyny, which will not be in the interest 02:56:37.580 |
Really the only people whose interest it could be argued 02:56:39.760 |
to be in would be the very small number of males 02:56:51.240 |
if we focus in on those males at the quote unquote top 02:56:55.040 |
with many female partners, is it possible to say 02:57:06.320 |
I have a feeling that you and I wouldn't have 02:57:12.900 |
be evolutionarily optimal doesn't match my values 02:57:16.700 |
as a person and I'm sure it doesn't match yours either. 02:57:19.080 |
- Can we try to dig into that gap between those two? 02:57:36.780 |
It's not hard to figure out how that might put 02:57:43.280 |
I don't know about you, Lex, I'm not getting involved 02:57:47.760 |
I won't do it, I will do anything to avoid it. 02:57:49.880 |
So some part of me has decided that my conscious self 02:57:54.400 |
and the values that I hold trump my evolutionary self 02:57:59.400 |
and once you figure out that in some extreme case 02:58:03.080 |
that's true and then you realize that that means 02:58:07.520 |
and you start going through all of the things 02:58:19.980 |
You wanna be your own person and accomplish things 02:58:32.120 |
to be one of those males at the top of a polygynous system. 02:58:35.040 |
We both know why that would be rewarding, right? 02:58:45.960 |
So look, every red-blooded American/Russian male 02:58:53.840 |
On the other hand, it is up against an alternative 02:58:57.800 |
which is having a partner with whom one is bonded 02:59:14.860 |
Obviously, polygyny is complex and there's nothing 02:59:17.320 |
that stops a man presumably from loving multiple partners 02:59:26.360 |
there's a question about, okay, what is the quality of love 02:59:29.360 |
if it is divided over multiple partners, right? 02:59:32.480 |
And what is the net consequence for love in a society 02:59:39.420 |
for every individual male in this case who has it? 02:59:53.000 |
I think there actually is a monogamy program in us 02:59:57.640 |
But if you take it seriously, you can find it. 03:00:02.520 |
And frankly, marriage, and it doesn't have to be marriage, 03:00:06.500 |
but whatever it is that results in a lifelong bond 03:00:19.680 |
But if you know that you're looking for something, right? 03:00:22.400 |
If you know that the objective actually exists 03:00:24.200 |
and it's not some utopian fantasy that can't be found, 03:00:35.960 |
And my guess is you'd be very happy when you find it. 03:00:41.640 |
I feel like there is some kind of physics of love. 03:00:43.620 |
So one, there's a conservation thing going on. 03:00:54.160 |
But it seems like in reality, that love gets split. 03:01:04.600 |
but if you are in a monogamous relationship by choice 03:01:09.280 |
and almost as in slight rebellion to social norms, 03:01:26.280 |
and this pressure of society, that's different. 03:01:28.880 |
Because that's almost like a constraint on your freedom 03:01:32.200 |
that is enforced by something other than your own ideals. 03:01:40.620 |
create these constraints, that enriches that love. 03:01:53.240 |
and it's done by choice, that can maximize that. 03:02:07.800 |
It can transcend to take us to a richer experience, 03:02:11.880 |
which we have the luxury of having, exploring, 03:02:24.800 |
when there are other choices, imbues it with meaning 03:02:35.640 |
and I have a hard time not feeling terrible sadness 03:02:46.880 |
I think they're missing something so important 03:02:51.520 |
and they don't even know that they're missing it. 03:02:58.840 |
because nobody's really been honest with them 03:03:06.960 |
which I used to do, again, a million years ago 03:03:09.160 |
when I was a college professor, four years ago, 03:03:17.640 |
because Heather and I seemed to have a good relationship, 03:03:39.520 |
and that you don't have to be perfectly compatible. 03:03:41.840 |
It's not about dating until you find the one. 03:03:44.880 |
It's about finding somebody whose underlying values 03:04:00.400 |
that's telling you what's sophisticated to think about, 03:04:08.040 |
And I believe you'll end up laughing in the end 03:04:12.400 |
But discover, wow, that's a hellscape that I opted out of, 03:04:17.400 |
and this thing I opted into, complicated, difficult, 03:04:22.680 |
- Nothing that's worth it is ever not difficult. 03:04:25.840 |
So we should even just skip the whole statement 03:04:48.000 |
Is there advice you can give to young people, 03:04:56.780 |
- Yeah, but it's not, they're not gonna like it 03:05:15.040 |
these things are all so distorted and corrupt 03:05:20.040 |
that I didn't wanna point anybody to anything, right? 03:05:26.620 |
But I would say that results in a kind of meta-level advice 03:05:35.900 |
You don't know where the opportunities will be. 03:05:38.760 |
You should invest in tools rather than knowledge, right? 03:05:44.500 |
you can repurpose that no matter what the future brings 03:05:47.920 |
to the extent that, you know, if you, as a robot guy, right, 03:05:56.800 |
and the stuff of robot building disappeared with it, 03:06:05.740 |
whether you're, you know, forced to work with, you know, 03:06:12.680 |
Your mechanical mind would be useful in all kinds of places. 03:06:15.920 |
So invest in tools like that that can be easily repurposed 03:06:27.800 |
you're gonna be up against all sorts of people 03:06:30.840 |
who have studied the things that you studied, right? 03:06:36.120 |
and you pick up computer programming, guess what? 03:06:44.800 |
On the other hand, if you combine that with something else 03:07:03.040 |
then those combinations create a rarefied space 03:07:07.240 |
And even if the things don't even really touch, 03:07:29.040 |
but it does tell you the one thing we can say for certain 03:07:36.040 |
- And like you said, there's cool things to be discovered 03:07:51.760 |
like every course in grad school, undergrad too, 03:08:00.040 |
And it's not immediately obvious how useful it is, 03:08:11.120 |
So you're bringing to the table these pieces of knowledge, 03:08:16.120 |
some of which when intersected might create a niche 03:08:19.880 |
that's completely novel, unique, and will bring you joy. 03:08:23.560 |
I have that, I mean, I took a huge number of courses 03:08:29.840 |
but they totally changed the way I see the world 03:08:34.760 |
or is a little bit difficult to kind of make explicit. 03:08:38.640 |
But taken together, they've allowed me to see, 03:08:43.640 |
for example, the world of robotics totally different 03:08:48.320 |
and different from many of my colleagues and friends 03:09:04.720 |
- Yeah, and useless doesn't mean useless, right? 03:09:10.120 |
- A good useless course can be the best one you ever took. 03:09:20.320 |
- Well, I took immunobiology in the medical school 03:09:25.880 |
when I was at Penn as, I guess I would have been 03:09:33.040 |
It blew my goddamn mind, and it still does, right? 03:09:37.040 |
I mean, we had this, I don't even know who it was, 03:09:39.600 |
but we had this great professor who was highly placed 03:09:44.080 |
You know, the course is called immunobiology, 03:09:50.160 |
And as I recall it, the professor stood sideways 03:09:57.240 |
literally stroking his beard with this bemused look 03:10:04.280 |
And you know, you had all these medical students 03:10:09.900 |
But you know, I got what this guy was smiling about. 03:10:13.600 |
It was like so, what he was describing, you know, 03:10:18.680 |
that it was like almost a privilege to even be saying it 03:10:21.380 |
to a roomful of people who were listening, you know? 03:10:37.040 |
And actually Eric gave me one of the greater pieces 03:10:40.200 |
of advice, at least for college, that anyone's ever given, 03:10:47.600 |
But now I don't even know if kids can do this now 03:10:50.400 |
because the prereqs are now enforced by a computer. 03:11:01.400 |
was that often the advanced courses are easier in some way. 03:11:09.720 |
it's not like intro bio where you're learning 03:11:16.000 |
So if you dedicate yourself, you can pull it off. 03:11:18.760 |
- Yeah, stay with an idea for many weeks at a time. 03:11:32.040 |
- Well, I feel terrible having to give you the answer. 03:11:38.880 |
I realize you asked the question, but if I tell you, 03:11:43.280 |
I don't wanna do that, but look, there's two-- 03:11:46.080 |
- There can be a disappointment, isn't there? 03:11:50.400 |
Because we actually know the answer to the question. 03:11:59.560 |
the heat death of the universe or whatever it is 03:12:04.780 |
but even if you were optimistic about our ability 03:12:07.680 |
to escape every existential hazard indefinitely, 03:12:12.680 |
ultimately it's all for naught and we know it, right? 03:12:20.660 |
and then it stares back and laughs or whatever happens, 03:12:30.860 |
well, what would make sense if that were true? 03:12:33.160 |
And I think there's something very clear to me. 03:12:38.980 |
if I just take the values that I'm sure we share 03:12:41.780 |
and extrapolate from them, I think the following thing 03:12:54.340 |
A lot of people don't make use of the opportunity 03:12:56.020 |
and a lot of people don't have opportunity, right? 03:12:58.020 |
They get to be human, but they're too constrained 03:13:00.140 |
by keeping a roof over their heads to really be free. 03:13:07.460 |
And being a free human on this beautiful planet, 03:13:17.820 |
So if that's true, that it is awesome to be human 03:13:21.380 |
and to be free, then surely it is our obligation 03:13:25.300 |
to deliver that opportunity to as many people as we can. 03:13:39.060 |
to have that opportunity to be both here and free 03:13:46.780 |
That effectively requires us to reach sustainability. 03:13:52.660 |
you could have a horror show of sustainability, right? 03:13:55.500 |
You could have a totalitarian sustainability. 03:14:16.320 |
to pursue beauty, truth, compassion, connection, 03:14:21.320 |
all of those things that we could list as unalloyed goods, 03:14:27.280 |
those are the things that people should be most liberated 03:14:33.900 |
I don't know how precise that calculation is, 03:14:41.880 |
then the point is, okay, well, there's no ultimate meaning, 03:14:47.120 |
How many people can we get to have this wonderful experience 03:15:00.200 |
and we wanna spread the awesome as much as possible. 03:15:03.560 |
- Yeah, you sum it up that way, spread the awesome. 03:15:07.960 |
And if that fails, if the fourth frontier fails, 03:15:10.600 |
the fifth frontier will be defined by robots, 03:15:19.680 |
I hope with more awesome. - They're very happy here 03:15:25.240 |
- Brett, I can't believe it took us this long to talk. 03:15:31.660 |
that we haven't actually spoken, I think, at all. 03:15:35.800 |
And I've always felt that we're already friends. 03:15:50.000 |
and I hope we can be friends and talk often again. 03:15:56.300 |
some of my robot friends as well and fall in love. 03:15:59.080 |
And I'm so glad that you love robots as well, 03:16:07.640 |
So we went from talking about some of the worst failures 03:16:11.720 |
of humanity to some of the most beautiful aspects 03:16:16.400 |
What else can you ask for from our conversation? 03:16:20.620 |
- You know, Lex, I feel the same way towards you, 03:16:20.620 |
A thank you to Jordan Harbinger's Show, ExpressVPN. 03:16:30.360 |
Check them out in the description to support this podcast. 03:16:47.880 |
It is those who know little, not those who know much, 03:16:51.520 |
who so positively assert that this or that problem 03:16:57.720 |
Thank you for listening, and hope to see you next time.