Robin Hanson: Alien Civilizations, UFOs, and the Future of Humanity | Lex Fridman Podcast #292
Chapters
0:00 Introduction
1:52 Grabby aliens
39:36 War and competition
45:10 Global government
58:01 Humanity's future
68:02 Hello aliens
95:06 UFO sightings
119:43 Conspiracy theories
128:01 Elephant in the brain
141:32 Medicine
154:01 Institutions
180:54 Physics
185:46 Artificial intelligence
203:35 Economics
206:56 Political science
212:45 Advice for young people
221:36 Darkest moments
224:37 Love and loss
233:59 Immortality
237:56 Simulation hypothesis
248:13 Meaning of life
we can actually figure out where are the aliens 00:00:07.120 |
And so now that you have this living cosmology, 00:00:10.080 |
we can tell the story that the universe starts out empty 00:00:12.960 |
and then at some point, things like us appear, 00:00:20.440 |
And then for a few billion years, they expand 00:00:34.540 |
the expansion of the universe will happen so much 00:00:36.560 |
that all you'll have left is some galaxy clusters 00:00:39.600 |
that are sort of disconnected from each other. 00:00:59.000 |
The following is a conversation with Robin Hanson, 00:01:04.160 |
and one of the most fascinating, wild, fearless, 00:01:06.760 |
and fun minds I've ever gotten a chance to accompany 00:01:09.320 |
for a time in exploring questions of human nature, 00:01:21.680 |
"The Elephant in the Brain: Hidden Motives in Everyday Life," 00:01:35.840 |
titled "If Loud Aliens Explain Human Earliness, Quiet Aliens Are Also Rare," 00:01:52.420 |
You are working on a book about quote, "Grabby Aliens." 00:02:02.200 |
- Grabby aliens expand fast into the universe 00:02:15.840 |
So the question is, where are the grabby aliens? 00:02:19.160 |
So Fermi's question is, where are the aliens? 00:02:37.880 |
They're not making a big difference in the world. 00:02:44.100 |
We don't know exactly what they do with where they went, 00:02:47.240 |
but the idea is they're in some sort of competitive world 00:02:50.000 |
where each part of them is trying to grab more stuff 00:02:56.160 |
And almost surely, whatever is the most competitive thing 00:03:03.320 |
isn't to leave it alone the way it started, right? 00:03:10.000 |
We turn a forest into a farmland, turn a harbor into a city. 00:03:14.800 |
So the idea is aliens would do something with it 00:03:18.100 |
and so we're not exactly sure what it would look like, 00:03:21.340 |
So somewhere in the sky, we would see big spheres 00:03:24.460 |
of different activity where things had been changed 00:03:30.880 |
- So as you expand, you aggressively interact 00:03:37.700 |
you're using them sometimes synonymously, sometimes not. 00:03:40.740 |
Grabby to me is a little bit more aggressive. 00:03:54.460 |
physical phenomenon like gravitational waves? 00:04:35.180 |
you wanna distinguish your particular model of things 00:04:47.400 |
If life on Earth, God, this is such a good abstract. 00:05:03.940 |
So what is the technical definition of grabby? 00:05:23.880 |
- We have a mathematical model of the distribution 00:05:26.880 |
of advanced civilizations, i.e. aliens, in space and time. 00:05:34.280 |
and we can set each one of those parameters from data, 00:05:37.400 |
and therefore we claim this is actually what we know about 00:05:43.040 |
So the key idea is they appear at some point in space-time, 00:05:46.500 |
and then after some short delay, they start expanding, 00:06:02.780 |
and they appear in time according to a power law, 00:06:07.600 |
and we can fit each of those parameters to data. 00:06:13.000 |
We know the distribution of advanced civilizations 00:06:23.600 |
within, say, 10 million years of the current moment. 00:06:27.980 |
And 10 million years is a really short duration 00:06:31.440 |
So we are, at the moment, a sort of random sample 00:06:36.040 |
of the kind of times at which an advanced civilization 00:06:38.560 |
might appear, because we may or may not become grabby, 00:06:45.080 |
and that gives us one of the other parameters. 00:06:52.760 |
- So power law, what is the N in the power law? 00:07:12.360 |
and so we had to go through some intermediate steps, 00:07:30.880 |
and the challenge was to achieve all of those steps 00:07:41.920 |
compared to all the other billions of planets out there, 00:07:45.040 |
and that we managed to achieve all these steps 00:07:55.800 |
'cause we don't know how many steps there are 00:07:58.800 |
So these are all the steps from the birth of life 00:08:07.120 |
it would happen really soon so that it couldn't be 00:08:10.160 |
the same sort of a hard step as the last one, 00:08:19.160 |
that have happened, that suggests that there's roughly, 00:08:36.120 |
- But whatever it is, we've just achieved the last one. 00:08:38.560 |
- Are we talking about humans or aliens here? 00:08:44.760 |
We don't exactly know the level of specialness, 00:08:47.760 |
we don't really know which steps were the hardest or not, 00:08:53.000 |
but you're saying that there's three to 12 steps 00:08:55.720 |
that we have to go through to get to where we are 00:08:58.280 |
that are hard steps, hard to find by something 00:09:07.880 |
There's a lot more ways to fail than to succeed. 00:09:10.880 |
- The first step would be sort of the very simplest 00:09:14.740 |
and then we don't know whether that first sort 00:09:19.360 |
is the first sort that we see in the historical record 00:09:42.340 |
And in this statistical model of trying to fit 00:09:53.260 |
that is they could each take different amounts 00:09:55.180 |
of time on average, but if you're lucky enough 00:10:00.340 |
then the durations between them will be roughly equal, 00:10:08.420 |
So we at the moment have roughly a billion years 00:10:18.820 |
after the very first time when life was possible 00:10:20.860 |
at the very beginning, so those two numbers right there 00:10:24.260 |
give you the rough estimate of six hard steps. 00:10:28.480 |
so we're trying to create a simple mathematical model 00:10:32.700 |
of how life emerges and expands in the universe. 00:10:43.280 |
- The two most plausibly diagnostic Earth durations 00:10:47.820 |
before Earth becomes uninhabitable for complex life. 00:11:17.020 |
If life, pre-expansionary life requires a number of steps, 00:11:22.020 |
what is the probability of taking those steps 00:11:31.740 |
And you say solving for E using the observed durations 00:11:36.140 |
of 1.1 and 0.4 then gives E values of 3.9 and 12.5, 00:11:41.140 |
range 5.7 to 26, suggesting a middle estimate 00:11:46.020 |
of at least six, that's where you said six hard steps. 00:12:07.900 |
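To make the arithmetic in that passage concrete: in the hard-steps model, if a planet is habitable for a total window L, the expected duration left over after the last hard step (and likewise the delay before the first one) is roughly L/(E+1), so each observed duration d gives an estimate E of about L/d minus 1. The sketch below assumes the roughly 5.4 billion year habitable window commonly used for Earth, a number not stated in this conversation:

```python
# Rough sketch of the hard-steps estimate discussed above.
# Assumption (not from the transcript): Earth's total habitable window L ~ 5.4 Gyr.
# In the hard-steps model the expected leftover duration after the last step
# (and the expected delay before the first) is about L / (E + 1),
# so an observed duration d implies E ~ L / d - 1.

L = 5.4  # assumed total habitable window, billions of years

for d in (1.1, 0.4):  # the two diagnostic durations quoted from the paper, in Gyr
    E = L / d - 1
    print(f"duration {d} Gyr  ->  E ~ {E:.1f} hard steps")

# duration 1.1 Gyr  ->  E ~ 3.9 hard steps
# duration 0.4 Gyr  ->  E ~ 12.5 hard steps
```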
And so we're really lucky that they all happened 00:12:10.100 |
really fast in a short time before our window closed. 00:12:14.020 |
And the chance of that happening in that short window 00:12:20.740 |
And so that was where the power we talked about 00:12:24.300 |
And so that means in the history of the universe, 00:12:27.500 |
we should overall roughly expect advanced life to appear 00:12:32.480 |
So that very early on, there was very little chance 00:12:35.900 |
of anything appearing, and then later on as things appear, 00:12:38.540 |
other things are appearing somewhat closer to them in time 00:12:48.180 |
- Math inclined, can you describe what a power law is? 00:12:53.220 |
and x squared is quadratic, so it's the power of two. 00:13:08.720 |
on a planet like Earth in that proportion to the time 00:13:21.120 |
it'll appear at roughly a power law like that. 00:13:34.640 |
this is the probability you just keep winning. 00:13:42.920 |
And so we're the result of this unlikely chain of successes. 00:13:47.760 |
So the dominant model of cancer in an organism 00:13:51.080 |
like each of us is that we have all these cells, 00:13:55.600 |
a single cell has to go through a number of mutations. 00:14:09.120 |
that the chance of any one cell producing cancer 00:14:11.760 |
by the end of your life is actually pretty high, 00:14:15.360 |
And so the chance of cancer appearing in your lifetime 00:14:19.520 |
this power of the number of mutations that's required 00:14:22.400 |
for any one cell in your body to become cancerous. 00:14:33.680 |
is roughly the power of six of the time you've been 00:14:40.640 |
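Both stories rest on the same piece of math: if each of k rare, independent steps occurs at some tiny constant rate, the chance that all k have happened by time t grows roughly like t to the power k. The numbers below are purely illustrative, not fitted to cancer data or to the history of life:

```python
import math

# Illustrative power law from a chain of k rare independent steps.
# If each step occurs at small rate r, then P(one step done by t) = 1 - exp(-r*t) ~ r*t,
# so P(all k done by t) ~ (r*t)**k, a power law with exponent k.

k, r = 6, 0.01   # illustrative: six steps, each very unlikely per unit time

for t in (1.0, 2.0, 4.0):
    p = (1 - math.exp(-r * t)) ** k
    print(f"t = {t}:  P(all {k} steps done) ~ {p:.2e}")

# Each doubling of t multiplies the probability by roughly 2**6 = 64,
# whether the "steps" are mutations in a cell or hard steps on a planet.
```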
that you're comparing power laws of the survival 00:14:45.520 |
or the arrival of the human species to cancerous cells. 00:14:51.700 |
but of course we might have a different value assumption 00:14:56.520 |
But of course, from the point of view of cancer, 00:15:01.800 |
From the point of view of cancer, it's a win-win. 00:15:17.020 |
It's like the Instagram channel, Nature is Metal. 00:15:38.420 |
I am an analyst, I'm a scholar, an intellectual, 00:15:52.880 |
And it's a little dangerous to mix those up too closely 00:16:14.720 |
to move it at least a little toward what we might prefer. 00:16:25.960 |
They find that dehumanizing or cold or metal, as you say, 00:16:30.740 |
to just say, well, this is what's likely to happen 00:16:41.080 |
- This is very interesting, that the cold analysis, 00:16:45.480 |
whether it's geopolitics, whether it's medicine, 00:16:54.240 |
some very specific aspect of human condition. 00:17:04.860 |
and the act of a doctor helping a single patient, 00:17:18.860 |
is it worth spending 10, 20, $30,000 on this one patient? 00:17:29.760 |
And yet, there's something about human nature 00:17:33.300 |
that wants to help the person in front of you, 00:17:57.200 |
Well, it's like, for example, the US government, 00:18:04.380 |
puts a value of, I think, like $9 million on a human life. 00:18:09.280 |
And the moment you put that number on a human life, 00:18:13.520 |
I can start making decisions about this or that 00:18:16.480 |
and with a sort of cold economic perspective, 00:18:24.060 |
from a deeper truth of what it means to be human somehow. 00:18:28.260 |
So you have to dance, because then if you put too much weight 00:18:32.300 |
on the anecdotal evidence on these kinds of human emotions, 00:18:39.140 |
you could also probably more likely deviate from truth. 00:18:42.980 |
But there's something about that cold analysis. 00:18:47.160 |
coldly analyze wars, war in Yemen, war in Syria, 00:18:56.420 |
and there's something lost when you do a cold analysis 00:19:02.320 |
talking about sort of conflict, competition over resources, 00:19:11.380 |
sort of models of geopolitics and why a certain war happened, 00:19:14.540 |
you lose something about the suffering that happens. 00:19:19.120 |
It's an interesting thing because you're both, 00:19:20.860 |
you're exceptionally good at models in all domains, 00:19:25.860 |
literally, but also there's a humanity to you. 00:19:32.180 |
I don't know if you can comment on that dance. 00:19:35.140 |
It's definitely true as you say that for many people, 00:19:43.980 |
What's the chance that this treatment might help? 00:19:52.720 |
this looks like a lot of cost for a small medical gain. 00:20:20.060 |
or the high cost in order to save you this tension, 00:20:38.660 |
I will look at things accurately, I will know the truth, 00:20:41.380 |
and then I will also do the right thing with it. 00:20:46.980 |
about what the right thing is in terms of the truth. 00:20:50.500 |
in order to figure out what the right thing to do is. 00:20:54.380 |
And I think if you do think you need to be lied to 00:20:56.620 |
in order to figure out what the right thing to do is, 00:21:08.020 |
achieving whatever good you were trying to achieve. 00:21:11.460 |
- But getting the data, getting the facts is step one, 00:21:20.020 |
getting the good data is step one, and it's a burden. 00:21:28.340 |
to arrive at sort of the easy, convenient thing. 00:21:49.460 |
You can use data to basically excuse away anything. 00:22:07.540 |
That very kind of gray area, that very subjective area. 00:22:34.320 |
There's the rate at which they appear in time. 00:22:39.840 |
So we've talked about the history of life on Earth 00:22:41.700 |
suggests that power is around six, but maybe three to 12. 00:22:45.540 |
We can say that constant comes from our current date, 00:22:53.260 |
comes from the fact that when we look in the sky, 00:23:06.660 |
That is, at a random time when a civilization 00:23:09.900 |
is first appearing, if it looks out into its sky, 00:23:15.500 |
And they would be much bigger than the full moon. 00:23:23.300 |
- There's a bunch of hard steps that Earth had to pass 00:23:30.220 |
which we're starting to launch rockets out into space. 00:23:39.140 |
If you look at the entirety of the history of Earth, 00:23:48.320 |
But if we do, we will do it in the next 10 million years. 00:23:56.460 |
- 10 million is a short time on the cosmological scale. 00:24:00.100 |
But the point is, even if it's up to 10 million, 00:24:02.220 |
that hardly makes any difference to the model. 00:24:08.020 |
I was so stressed about planning what I'm gonna do today. 00:24:13.480 |
I just need to be generating some offspring quickly here. 00:24:21.540 |
This 10 million year gap or window when we start expanding. 00:24:31.540 |
where there's a bunch of other alien civilizations 00:24:39.740 |
There's a model for how likely it is that that happens. 00:24:45.660 |
And you think of an expansion as almost like a sphere. 00:24:50.860 |
we're talking about the speed of the radius growth. 00:24:53.860 |
- Exactly, the surface, how fast the surface expands. 00:24:56.540 |
- Okay, and so you're saying that there is some speed 00:25:07.260 |
then maybe that explains why we haven't seen anything. 00:25:14.220 |
if slow predicts we would see them, but we don't see them. 00:25:17.540 |
- And so the way to explain that is that they're fast. 00:25:19.100 |
So the idea is if they're moving really fast, 00:25:21.960 |
then we don't see them until they're almost here. 00:25:39.180 |
Okay, so there's these spheres out there in the universe 00:25:44.180 |
that are made visible because they're sort of 00:25:57.060 |
- They would take apart stars, rearrange them, 00:26:06.060 |
we would see a lot of them because the universe is old. 00:26:12.500 |
- That is, we're assuming we're just typical, 00:26:26.300 |
they appear roughly once per million galaxies. 00:26:30.140 |
And we would meet them in roughly a billion years 00:26:35.620 |
- So we're looking at a Grabby Aliens model, 3D sim. 00:26:39.980 |
- What's, that's the actual name of the video. 00:26:42.580 |
What, by the time we get to 13.8 billion years, 00:26:49.860 |
Okay, so this is, we're watching a three-dimensional sphere 00:27:06.360 |
So that's how the Grabby Aliens come in contact, 00:27:16.380 |
The following is a simulation of the Grabby Aliens model 00:27:30.180 |
this sphere will be about 3,000 times as wide 00:27:33.820 |
as the distance from the Milky Way to Andromeda. 00:28:38.520 |
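For scale (assuming the usual figure of roughly 2.5 million light-years between the Milky Way and Andromeda, which is not given in the conversation), a sphere 3,000 times that wide spans several billion light-years:

```python
# Rough scale check; the Milky Way-Andromeda distance below is an assumed figure.
mw_to_andromeda_ly = 2.5e6                      # ~2.5 million light-years
sphere_width_ly = 3_000 * mw_to_andromeda_ly
print(f"{sphere_width_ly:.1e} light-years across")   # ~7.5e+09, several billion light-years
```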
well, maybe I just don't wanna believe this whole model. 00:28:40.560 |
Why should I believe this whole model at all? 00:28:42.560 |
And our best evidence why you should believe this model 00:29:10.520 |
appearing on a planet goes as the sixth power of the time. 00:29:17.160 |
then the chance of it appearing on that planet, 00:29:21.760 |
is 1,000 to the sixth power, or 10 to the 18. 00:29:30.460 |
sit empty and waiting for advanced life to appear, 00:29:39.180 |
That is, the long-lived planets near the end of their lifetimes, 00:29:49.460 |
the universe is filling up in roughly a billion years. 00:29:55.760 |
So you had to show up now before that deadline. 00:29:57.960 |
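Restating the arithmetic behind that claim: under a sixth-power appearance law, a planet that stays habitable 1,000 times longer would, in an empty universe, be about 1,000 to the sixth power times more likely to eventually host advanced life:

```python
# The transcript's own numbers: sixth-power law, planets lasting 1,000x longer.
power = 6
lifetime_ratio = 1_000
advantage = lifetime_ratio ** power
print(f"{advantage:.0e}")   # 1e+18, i.e. 10 to the 18
```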
- Okay, can we break that apart a little bit? 00:30:00.240 |
Okay, or linger on some of the things you said. 00:30:03.160 |
So with the power law, the things we've done on Earth, 00:30:06.120 |
the model you have says that it's very unlikely. 00:30:28.780 |
If we're super, can we just be the lucky ones? 00:30:41.720 |
So then if you were just being honest and humble, 00:30:49.020 |
- Means one of the assumptions that calculated 00:30:57.840 |
So most life would appear like 1,000 times 00:31:02.060 |
later than now if everything would stay empty 00:31:08.300 |
- So the grabby aliens are filling the universe right now. 00:31:10.660 |
Roughly at the moment they've filled half of the universe 00:31:42.900 |
the prediction with an assumption that's wrong. 00:31:49.540 |
You have a model that makes a prediction that's wrong, 00:31:54.120 |
Let's try to understand exactly where the wrong is. 00:31:56.460 |
So the assumption is that the universe is empty. 00:32:05.820 |
That is, if the universe would just stay empty, 00:32:07.480 |
if there was just, you know, nobody else out there, 00:32:10.820 |
then when you should expect advanced life to appear, 00:32:16.580 |
You should expect to appear trillions of years 00:32:21.280 |
So this is a very sort of nuanced mathematical assumption. 00:32:25.140 |
I don't think we can intuit it cleanly with words. 00:32:40.700 |
then it should happen very late, much later than now. 00:32:44.940 |
And if you look at Earth, the way things happen on Earth, 00:32:49.260 |
it happened much, much, much, much, much earlier 00:32:51.740 |
than it was supposed to according to this model 00:32:55.360 |
Therefore, you can say, well, the initial assumption 00:32:58.100 |
of the universe staying empty is very unlikely. 00:33:02.420 |
- And the other alternative theory is the universe 00:33:10.020 |
of things that can appear before the deadline. 00:33:13.220 |
Okay, it's filling up, so why don't we see anything 00:33:24.660 |
What are the ways in which we might see a quickly expanding? 00:33:41.780 |
at least one sphere in the sky, growing very rapidly. 00:33:50.440 |
- So there's different, 'cause we were just talking 00:33:57.740 |
- You might see it 10 million years in advance coming. 00:34:10.060 |
So the chance of one originating very close to you in time 00:34:18.460 |
from the time you see it, from the time it gets here. 00:34:27.300 |
so if they're traveling close to the speed of light-- 00:34:40.160 |
- I see, so the delta between the speed of light 00:34:43.660 |
and their actual travel speed is very important? 00:34:52.000 |
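A rough way to see why that delta matters (the distance and speeds below are illustrative, not from the model): if an expansion front started a distance d away and advances at speed v, its light reaches us after d/c but the front itself arrives after d/v, so the warning time d(1/v - 1/c) shrinks toward zero as v approaches the speed of light:

```python
# Illustrative warning-time calculation for a fast-expanding civilization.
# Working in light-years and years, so c = 1 light-year per year.
# warning = arrival time - first sighting time = d/v - d/c

d = 100e6  # assumed example distance to the front's origin: 100 million light-years

for v in (0.5, 0.9, 0.99, 0.999):       # expansion speed as a fraction of c
    warning_years = d * (1 / v - 1)
    print(f"v = {v}c  ->  warning ~ {warning_years:,.0f} years")

# At 0.9c you get ~11 million years of warning; at 0.999c only ~100,000 years.
```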
But what if they're traveling exactly at a speed of light? 00:34:59.820 |
- And they could also be somehow traveling fast 00:35:06.380 |
because if they could go faster than the speed of light, 00:35:13.140 |
than the speed of light, you can go backwards in space time. 00:35:15.660 |
So any time you appeared anywhere in space time, 00:35:21.300 |
- So anybody in the future, whoever appeared, 00:35:26.300 |
that those kinds of aliens aren't already here? 00:35:28.660 |
- Well, we should have a different discussion of that. 00:35:33.860 |
Well, let's actually leave that discussion aside 00:35:36.140 |
just to linger and understand the Grabby alien expansion, 00:36:09.540 |
So there's, like, the first time the spheres touch each other 00:36:14.620 |
They recognize each other first before they meet. 00:36:30.420 |
Okay, so what does that, you think, look like? 00:36:41.060 |
- So the story of the history of the universe here 00:36:45.700 |
So what I'm excited about, in part, by this model, 00:36:54.820 |
So most ancient peoples, they had cosmologies, 00:36:57.860 |
the stories they told about where the universe came from 00:36:59.940 |
and where it's going and what's happening out there. 00:37:03.340 |
and actors, gods or something, out there doing things. 00:37:05.740 |
And lately, our favorite cosmology is dead, kind of boring. 00:37:10.740 |
We're the only activity we know about, you see, 00:37:13.340 |
and everything else just looks dead and empty. 00:37:15.780 |
But this is now telling us, no, that's not quite right. 00:37:21.300 |
and in a few billion years, it'll be all full. 00:37:24.820 |
And from then on, the history of the universe 00:37:32.620 |
a really good way to think about cosmologies. 00:37:37.660 |
and we don't know what's going on in that darkness 00:37:40.740 |
until the light from whatever generates light arrives here. 00:37:50.260 |
But you don't think about the giant expanding spheres 00:38:02.180 |
- So I like the analogy with the ancient Greeks. 00:38:07.180 |
staring at the universe couldn't possibly tell 00:38:09.540 |
how far away the sun was, or how far away the moon is, 00:38:13.900 |
That all you can see is just big things in the sky 00:38:18.220 |
to be able to figure out the size of the earth 00:38:24.580 |
That is, they could figure those things out, actually, 00:38:31.700 |
by being clever about the few things we can see, 00:38:35.780 |
And so now that you have this living cosmology, 00:38:38.740 |
we can tell the story that the universe starts out empty, 00:38:41.600 |
and then at some point, things like us appear, 00:38:49.100 |
And then for a few billion years, they expand, 00:39:03.180 |
the expansion of the universe will happen so much 00:39:05.200 |
that all you'll have left is some galaxy clusters 00:39:08.260 |
that are sort of disconnected from each other. 00:39:10.420 |
But before then, for the next 100 billion years, 00:39:35.960 |
Is the universe of alien civilizations defined by war 00:39:40.960 |
as much or more than war-defined human history? 00:39:52.180 |
and then the question is how much competition implies war. 00:39:57.100 |
So up until recently, competition defined life on Earth. 00:40:02.100 |
Competition between species and organisms and among humans, 00:40:08.580 |
competitions among individuals and communities, 00:40:11.580 |
and that competition often took the form of war 00:40:20.380 |
to sort of suppress and end competition in human affairs. 00:40:42.780 |
and any forms of business and personal competition 00:41:05.460 |
But if they choose to allow interstellar colonization, 00:41:28.900 |
and therefore they have the potential for competition 00:41:38.160 |
- So military meaning physically destructive. 00:41:51.040 |
is progress might be maximized through competition, 00:42:02.360 |
So like constructive, not destructive competition. 00:42:09.160 |
Grabby alien civilizations would be likely defined 00:42:12.720 |
by competition 'cause they can expand faster. 00:42:22.080 |
competition just happens if you can't coordinate to stop it. 00:42:32.320 |
So competition is a fundamental force in the universe. 00:42:43.260 |
But we today have the chance, many people think and hope, 00:42:47.240 |
of greatly controlling and limiting competition 00:42:55.640 |
Whether to allow competition to sort of regain 00:42:59.680 |
its full force or whether to suppress and manage it. 00:43:03.940 |
- Well, one of the open questions that has been raised 00:43:13.460 |
is whether our desire to lessen the destructive nature 00:43:18.380 |
of competition or the destructive kind of competition 00:43:22.300 |
will be outpaced by the destructive power of our weapons. 00:43:27.200 |
Sort of if nuclear weapons and weapons of that kind 00:43:33.720 |
become more destructive than our desire for peace, 00:43:40.660 |
then all it takes is one asshole at the party 00:43:49.860 |
on the cosmological scales we're talking about. 00:44:18.260 |
which is not human-caused atrocities or whatever. 00:44:26.540 |
- There are a lot of military atrocities in history, 00:44:30.660 |
Those are, those challenges to think about human nature, 00:44:40.380 |
they do not stop the human spirit, essentially. 00:44:50.640 |
So even a nuclear war isn't enough to destroy us 00:44:57.340 |
but we could institute a regime of global governance 00:45:04.060 |
including military and business competition of sorts, 00:45:20.380 |
and power corrupts, and absolute power corrupts absolutely. 00:45:30.020 |
is not letting any one person, any one country, 00:45:35.020 |
any one center of power become absolutely powerful, 00:45:39.980 |
because that's another lesson, is it seems to corrupt. 00:45:43.380 |
There's something about ego and the human mind 00:45:50.360 |
that terrifies me more than the possibility of war, 00:46:01.380 |
and let me try to paint the picture from their point of view. 00:46:05.540 |
but I think it's going to be a widely shared point of view. 00:46:08.100 |
- Yes, this is two devil's advocates arguing, two devils. 00:46:16.900 |
we actually have had a strong elite global community 00:46:24.980 |
and has created a lot of convergence in global policy. 00:46:36.020 |
or nuclear power energy or regulating airplanes 00:46:41.980 |
in fact, the world has very similar regulations 00:46:50.820 |
where people get together at places like Davos, et cetera, 00:47:02.380 |
and that produces something like global governance, 00:47:11.780 |
That is, humans can coordinate together on shared behavior 00:47:14.740 |
without a center by having gossip and reputation 00:47:24.860 |
So for example, one of the things that's happening, 00:47:31.460 |
has decided that they disapprove of the Russian invasion 00:47:35.020 |
and they are coordinating to pull resources together 00:47:38.180 |
from all around the world in order to oppose it, 00:47:45.020 |
and they feel that they are morally justified 00:47:53.020 |
that actually brings world elite communities together, 00:47:55.500 |
where they come together and they push a particular policy 00:47:59.660 |
and position that they share and that they achieve successes. 00:48:02.580 |
And the same sort of passion animates global elites 00:48:07.620 |
or global poverty, and other sorts of things. 00:48:19.700 |
they are slowly walking toward global governance, 00:48:22.220 |
slowly strengthening various world institutions 00:48:32.000 |
I think a lot of people over the coming centuries 00:48:41.260 |
and thank you for playing that devil's advocate there. 00:48:57.260 |
- Sure, but-- - And everything you just said. 00:48:58.100 |
- If their view is the one that determines what happens, 00:49:07.780 |
- From a perspective of minimizing human suffering, 00:49:23.820 |
and with disregard to the amount of suffering it causes, 00:49:32.220 |
So like you can tell all kinds of narratives, 00:49:46.320 |
and it's actually trying to bring about a better world. 00:49:50.320 |
So every power center thinks they're doing good. 00:50:08.520 |
The dinners, the meetings in the closed rooms. 00:50:14.820 |
But remember we talked about separating our cold analysis 00:50:19.880 |
of what's likely or possible from what we prefer, 00:50:22.580 |
and so this isn't exactly enough time for that. 00:50:25.020 |
We might say, I would recommend we don't go this route 00:50:30.900 |
and because I would say it'll preclude this possibility 00:50:39.500 |
for the next billion years with vast amounts of activity, 00:50:55.980 |
- So you, wait, you think that global governance 00:51:19.400 |
- So again, I want to separate my neutral analysis 00:51:25.940 |
first of all, I have an analysis that tells us 00:51:34.660 |
or one in 100 civilizations chooses to expand, 00:51:42.720 |
And it'll happen sometime in the next 10 million years, 00:51:48.180 |
But the key thing to notice from our point of view 00:51:50.660 |
is that even though you might like our global governance, 00:51:54.380 |
you might like the fact that we've come together, 00:51:58.040 |
and we no longer have destructive competition, 00:52:07.980 |
That is, once you allow interstellar colonization, 00:52:14.420 |
they could come back here and compete with you back here 00:52:19.980 |
And I think if people value that global governance 00:52:28.120 |
they would then want to prevent interstellar colonization. 00:52:31.660 |
- I want to have a conversation with those people. 00:52:33.860 |
I believe that both for humanity, for the good of humanity, 00:52:46.060 |
distributing the centers of power is very beneficial. 00:53:17.600 |
you know how you generally get disrespectful of kids, 00:53:23.240 |
- Okay, but I think you should hear a stronger case 00:53:41.740 |
So the human civilization and alien civilizations 00:53:51.240 |
And connected to that, do we want to give a lot of power 00:54:02.240 |
which is naturally connected to the expansion? 00:54:17.440 |
Within a solar system, still, everything is within reach. 00:54:20.320 |
That is, if there's a rebellious colony around Neptune, 00:54:29.080 |
- A central control over the solar system is feasible. 00:54:39.120 |
maybe broken into a thousand different political units 00:54:41.720 |
in the solar system, then any one part of that 00:54:45.400 |
that allows interstellar colonization, and it happens. 00:54:53.160 |
and is able to do it, and that's what it, therefore. 00:54:55.920 |
So we can just say, in a world of competition, 00:54:58.760 |
if interstellar colonization is possible, it will happen, 00:55:02.800 |
And that will sort of ensure the continuation 00:55:11.960 |
or can take productive forms. - In many forms. 00:55:15.440 |
I think one of the things that most scares people 00:55:17.400 |
about competition is not just that it creates 00:55:37.280 |
becomes multi-planetary, multi-solar system, potentially, 00:55:59.920 |
if you really understood what people were like 00:56:08.960 |
Most historical fiction lies to you about that. 00:56:11.680 |
It often offers you modern characters in an ancient world. 00:56:25.240 |
So I think the most obvious prediction about the future is, 00:56:28.160 |
even if you only have the mechanisms of change 00:56:30.440 |
we've seen in the past, you should still expect 00:56:33.960 |
But we have a lot bigger mechanisms for change 00:56:42.560 |
"Work, Love, and Life when Robots Rule the Earth," 00:56:53.880 |
to create a computer simulation of that brain. 00:56:57.560 |
And then those computer simulations of brains 00:57:02.280 |
They work and they vote and they fall in love 00:57:08.200 |
And my book is about analyzing how that world 00:57:12.760 |
basically using competition as my key lever of analysis. 00:57:18.040 |
then I can figure out how they change in that world, 00:57:24.920 |
And it's different in ways that are shocking sometimes 00:57:28.560 |
to many people and ways some people don't like. 00:57:41.120 |
changes into the future, we should just expect 00:57:44.600 |
it's possible to become very different than who we are. 00:57:52.000 |
a particular structure, a particular set of habits, 00:57:55.040 |
and they are only one piece in a vast space of possibilities. 00:58:01.960 |
- So yeah, let's linger on the space of possible minds 00:58:05.680 |
for a moment just to sort of humble ourselves 00:58:15.640 |
Like the fact that we like a particular kind of sex 00:58:20.040 |
and the fact that we eat food through one hole 00:58:26.920 |
And that seems to be a fundamental aspect of life, 00:58:31.480 |
And that life is finite in a certain kind of way. 00:59:29.200 |
to try to see how different the future could be, 00:59:31.920 |
but that doesn't give you some sort of like sense 01:00:13.360 |
So, this actually explains typical interest rates 01:00:17.040 |
that is interest rates are greatly influenced 01:00:29.480 |
our preferences evolved as sexually selected creatures. 01:00:35.920 |
creatures will evolve who don't discount the future. 01:00:41.440 |
and they will therefore not neglect the long run. 01:00:43.560 |
So for example, for things like global warming 01:00:49.520 |
that basically ordinary people don't seem to care much, 01:00:55.600 |
because humans don't care much about the long-term future. 01:00:59.240 |
But, and futurists find it hard to motivate people 01:01:03.360 |
and to engage people about the long-term future 01:01:07.280 |
But that's a side effect of this particular way 01:01:09.720 |
that our preferences evolved about the future. 01:01:12.920 |
And so in the future, they will neglect the future less. 01:01:18.800 |
Eventually, maybe a few centuries, maybe longer, 01:01:22.480 |
eventually our descendants will care about the future. 01:01:25.920 |
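A minimal illustration of the discounting point (the amount, horizon, and rates are made up): with exponential discounting, even modest interest rates make outcomes a couple of centuries away nearly worthless today, whereas a non-discounting agent, the kind predicted to eventually evolve, counts them at full value:

```python
# Illustrative exponential discounting: present value of a benefit B paid t years out.
B, t = 1_000_000, 200   # e.g. a $1M benefit 200 years from now (illustrative)

for r in (0.05, 0.02, 0.0):             # annual discount rates
    present_value = B / (1 + r) ** t
    print(f"r = {r:.0%}:  present value ~ {present_value:,.0f}")

# At 5% the far future barely registers (~58); at 0% it counts in full (1,000,000).
```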
- Can you speak to the intuition behind that? 01:01:41.480 |
then that would be good if you made those decisions. 01:01:43.640 |
But in order to do that, you'll have to care about them. 01:01:48.760 |
that's if you're trying to maximize the number of descendants 01:02:24.600 |
and what the managers of those funds should care about 01:02:41.680 |
but be very risk-neutral with respect to uncorrelated risk. 01:02:46.040 |
So that's a feature that's predicted to happen 01:02:56.120 |
That's also something we can say about the long run. 01:03:11.560 |
And that's just really bad to have zero descendants. 01:03:19.600 |
And then you have a portfolio of descendants. 01:03:22.040 |
And so that portfolio ensures you against problems 01:03:26.680 |
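A toy version of the portfolio argument (the probabilities and counts are illustrative): what matters is the chance of ending up with zero descendants, and many lineages exposed to independent risks almost never all fail at once, whereas lineages sharing one correlated risk can all fail together:

```python
# Toy model: n lineages, each failing with probability p.
p, n = 0.10, 20   # illustrative numbers

p_zero_if_independent = p ** n   # all lineages fail independently
p_zero_if_correlated = p         # one shared shock wipes out all lineages at once

print(f"independent risks:  P(zero descendants) = {p_zero_if_independent:.0e}")
print(f"correlated risk:    P(zero descendants) = {p_zero_if_correlated:.0e}")
# 1e-20 versus 1e-01: hence strong aversion to correlated risks,
# but near risk-neutrality toward uncorrelated ones.
```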
- I like the idea of portfolio of descendants. 01:03:28.640 |
And we'll talk about portfolios with your idea 01:03:32.840 |
We'll return there with em, E-M, The Age of Em: 01:03:36.600 |
Work, Love, and Life when Robots Rule the Earth. 01:03:50.160 |
create artificial minds, artificial copies of minds, 01:03:57.040 |
- I have another dramatic prediction I can make 01:04:01.560 |
- Which is at the moment, we reproduce as the result 01:04:14.960 |
and hungry, and thirsty, and wanting to have sex, 01:04:17.560 |
and wanting to be excitement, et cetera, right? 01:04:23.120 |
the packages of preferences that we evolved to have 01:04:31.160 |
But those packages of preferences are not a robust way 01:04:40.120 |
So that's one of the reasons we are now having 01:04:45.480 |
our ancestral preferences are not inducing us 01:04:48.840 |
which is, from evolution's point of view, a big mistake. 01:04:55.020 |
there will arise creatures who just abstractly know 01:05:00.920 |
That's a very robust way to have more descendants 01:05:09.480 |
So mathematical, and thank you for thinking so clear with me 01:05:20.560 |
So you're just clearly saying that successful, 01:05:23.800 |
long-term civilizations will prefer to have descendants, 01:05:30.160 |
- Not just prefer, consciously and abstractly prefer. 01:05:33.500 |
That is, it won't be the indirect consequence 01:05:37.320 |
It will just be the thing they know they want. 01:05:40.040 |
- There'll be a president in the future that says, 01:05:48.960 |
- We must go to the moon and do the other things. 01:05:52.000 |
- Not because they're easy, but because they're hard. 01:05:53.760 |
But instead of the moon, let's have lots of sex. 01:05:55.620 |
Okay, but there's a lot of ways to have descendants, right? 01:06:05.720 |
that will force you to think through those possibilities 01:06:10.040 |
- So just to clarify, descendants doesn't necessarily mean 01:06:16.140 |
meaning humans having sex and then having babies. 01:06:19.460 |
- You can have artificial intelligence systems 01:06:21.600 |
that in whom you instill some capability of cognition 01:06:29.280 |
You can also create through genetics and biology 01:06:31.560 |
clones of yourself or slightly modified clones, 01:06:40.080 |
It could be descendants in the space of ideas too, 01:06:43.320 |
for somehow we no longer exist in this meat vehicle. 01:06:46.720 |
It's now just like whatever the definition of a life form is, 01:06:54.420 |
- Yes, and they will be thoughtful about that. 01:06:56.680 |
They will have thought about what counts as a descendant 01:06:59.760 |
and that'll be important to them to have the right concept. 01:07:02.360 |
- So the they there is very interesting, who the they are. 01:07:06.080 |
- But the key thing is we're making predictions 01:07:09.560 |
about what our distant descendants will be like. 01:07:11.800 |
Another thing I think you would automatically accept 01:07:16.480 |
And I think that would be the obvious prediction 01:07:24.220 |
- Well, it's all, it's like organic or something. 01:07:28.800 |
- It might be squishy and made out of hydrocarbons, 01:07:31.360 |
but it would be artificial in the sense of made in factories 01:07:42.160 |
So the factories in our cells are, there are marvels, 01:07:44.920 |
but they don't achieve very many scale economies. 01:07:54.520 |
is different than sort of the factories that have evolved. 01:08:05.280 |
let me go, let me analyze your Twitter like it's Shakespeare. 01:08:12.360 |
define "hello" in quotes, alien civilizations 01:08:15.520 |
as one that might, in the next million years, 01:08:17.960 |
identify humans as intelligent and civilized, 01:08:24.140 |
by making their presence and advanced abilities known to us. 01:08:30.480 |
the next 15 polls ask about such "hello" aliens. 01:08:34.320 |
And what these polls ask is your Twitter followers, 01:08:43.640 |
So poll number one is, what percent of "hello" aliens 01:08:47.840 |
evolved from biological species with two main genders? 01:08:51.320 |
And, you know, the popular vote is above 80%. 01:09:04.900 |
So the genders as we look through evolutionary history, 01:09:09.560 |
As opposed to having just one or like millions. 01:09:13.680 |
- So there's a question in evolution of life on Earth, 01:09:16.520 |
there are very few species that have more than two genders. 01:09:24.200 |
that do have two genders, much more than one. 01:09:27.040 |
And so there's literature on why did multiple genders evolve 01:09:32.040 |
and then sort of what's the point of having males 01:09:38.960 |
That is they have, they would mate male, female, 01:09:50.440 |
And then they're differentiating the two genders. 01:10:03.040 |
for the affection of other and there's sexual partnership 01:10:21.400 |
being successfully adapted to the environment. 01:10:27.080 |
If you have males, then the males can take higher variance. 01:10:30.680 |
And so there can be stronger selection among the males 01:10:40.640 |
Question number two, what percent of hello aliens 01:10:43.760 |
evolved from land animals as opposed to plants 01:10:55.400 |
only 10% of species on Earth are in the ocean. 01:11:07.440 |
I don't even, I can't even intuit exactly why that would be. 01:11:14.560 |
- So the story that I understand is it's about small niches. 01:11:25.600 |
That is, there are more creatures in each species 01:11:28.720 |
because the ocean environments don't vary as much. 01:11:42.640 |
because they vary so much from place to place. 01:11:57.200 |
that speciation promotes evolution in the long run. 01:12:07.700 |
And that's actually a warning about something called rot 01:12:12.200 |
which is one of the problems with even a world government, 01:12:23.000 |
for other large systems, including biological systems, 01:12:30.880 |
actually don't evolve as effectively as small ones do. 01:12:34.280 |
And that's an important thing to notice about. 01:12:40.120 |
from ordinary sort of evolution in economies on Earth 01:12:56.240 |
More evolution happened in the fragmented species. 01:13:01.840 |
'cause you can also push back in terms of nations 01:13:06.400 |
It's like large companies seems to evolve less effectively. 01:13:18.720 |
And when you look at the scale of decades and centuries, 01:13:27.480 |
Like large cities grow better than small cities. 01:13:30.400 |
Large integrated economies like the United States 01:13:32.440 |
or the European Union do better than small fragmented ones. 01:13:40.320 |
But so most of the people, and obviously votes on Twitter 01:13:43.720 |
represent the absolute objective truth of things. 01:13:48.440 |
So most, but an interesting question about oceans is that, 01:13:51.200 |
okay, remember I told you about how most planets 01:13:53.320 |
would last for trillions of years and be later, right? 01:13:56.640 |
So people have tried to explain why life appeared on earth 01:13:59.560 |
by saying, oh, all those planets are gonna be unqualified 01:14:04.040 |
That is, they're around smaller stars, which lasts longer, 01:14:06.400 |
and smaller stars have some things like more solar flares, 01:14:10.640 |
But almost all of these problems with longer lived planets 01:14:16.760 |
And a large fraction of planets out there are ocean worlds. 01:14:25.120 |
that these planets that last a very long time 01:14:31.640 |
you know, there's a huge fraction of ocean worlds. 01:14:34.560 |
So when you say, sorry, when you say life appear, 01:14:38.800 |
you're kind of saying life and intelligent life. 01:14:57.440 |
What's the chance that they first began their early steps, 01:15:07.920 |
80%, most people on Twitter think it's very likely. 01:15:15.000 |
- I think people are discounting ocean worlds too much. 01:15:22.360 |
it's possible, and I think people aren't giving enough 01:15:34.440 |
What percent of hello aliens once had a nuclear war 01:15:56.420 |
- So like, I wonder what, so most people think 01:15:59.920 |
once you get nukes, we're not gonna fire them. 01:16:02.560 |
They believe in the power of the game theory. 01:16:05.880 |
- I think they're assuming that if you had a nuclear war, 01:16:07.640 |
then that would just end civilization for good. 01:16:12.880 |
I think you could rise again after a nuclear war. 01:16:19.040 |
- So what do you think about mutually assured destruction 01:16:21.720 |
as a force to prevent people from firing nuclear weapons? 01:16:25.880 |
That's a question that I knew to a terrifying degree 01:16:32.180 |
- Well, I mean, clearly it has had an effect. 01:16:34.760 |
The question is just how strong an effect for how long? 01:16:37.880 |
I mean, clearly we have not gone wild with nuclear war, 01:16:41.680 |
and clearly the devastation that you would get 01:16:44.360 |
if you initiated a nuclear war is part of the reasons 01:16:55.400 |
- So it's been 70 years or whatever it's been. 01:17:02.120 |
Do you think we'll see nuclear war in the century? 01:17:15.600 |
Now this is where I pull you out of your mathematical model 01:17:20.440 |
Do you think, this particular human question-- 01:17:22.920 |
- I think we've been lucky that it hasn't happened so far. 01:17:41.080 |
- So the biggest datum here is that we've had 01:17:43.600 |
an enormous decline in major war over the last century. 01:17:58.180 |
So the average war is much worse than the median war 01:18:04.560 |
And that makes it hard to identify trends over time. 01:18:10.480 |
in the last century, in the median rate of war. 01:18:12.240 |
But it could be that's because the tail has gotten thicker. 01:18:26.640 |
because of the destructive nature of the weapons, 01:18:39.320 |
we can do a power law fit to the rate of wars. 01:18:43.680 |
So it's one of those things that you should expect 01:18:45.920 |
most of the damage to be in the few biggest ones. 01:18:53.560 |
So the median pandemic is, of course, less than the average 01:18:57.920 |
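A sketch of that heavy-tail point (the Pareto tail parameter is illustrative, not a fit to real war or pandemic data): when event sizes follow a fat-tailed power law, the mean sits far above the median and a handful of the largest events account for most of the total damage:

```python
import random

# Illustrative fat-tailed event sizes (Pareto distribution), not real war data.
random.seed(0)
alpha = 1.3   # tail exponent; smaller alpha means a fatter tail
sizes = sorted((random.paretovariate(alpha) for _ in range(10_000)), reverse=True)

mean = sum(sizes) / len(sizes)
median = sizes[len(sizes) // 2]
top_share = sum(sizes[:100]) / sum(sizes)   # share of damage in the biggest 1% of events

print(f"mean / median = {mean / median:.1f}")
print(f"largest 1% of events carry {top_share:.0%} of total damage")
```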
- But those, that fitting of data is very questionable 01:19:08.800 |
about the future of civilization threatening pandemics 01:19:28.820 |
Nuclear war happens with two assholes or idiots 01:19:37.940 |
But that's, it's very important, small wars aside, 01:19:41.080 |
it's very important to understand the dynamics, 01:19:47.560 |
in order to predict how we can minimize the chance of-- 01:19:52.560 |
- It is a common and useful intellectual strategy 01:20:08.540 |
say with tornadoes or even pandemics or something, 01:20:11.220 |
the underlying process might not be that different 01:20:27.440 |
So it's a really important question to understand 01:20:44.660 |
say you pressing this button will kill everyone on earth, 01:20:55.860 |
how many people have that much irresponsibility, 01:21:06.860 |
that leads you to press that button, but how many? 01:21:16.460 |
That number gets very close to zero very quickly, 01:21:19.380 |
especially among people who have access to such a button. 01:21:19.380 |
And unfortunately we don't have good data on this, 01:21:30.980 |
which is like how destructive are humans willing to be? 01:21:34.820 |
- So I think part of this just has to think about, 01:21:38.340 |
ask what your time scales you're looking at, right? 01:21:40.740 |
So if you say, if you look at the history of war, 01:21:49.900 |
in the next 50 years, I might say, well, probably not. 01:21:55.460 |
like if the same sort of risks are underlying 01:22:38.740 |
And I don't know, these feel like novel questions 01:22:49.820 |
this has been engineering pandemics, for example, 01:23:08.300 |
- So if you look on, say, the last 1,000 years 01:23:10.860 |
or 10,000 years, we could say we've seen a certain rate 01:23:13.700 |
at which people are willing to make big destruction 01:23:18.260 |
- Okay, and if you're willing to project that data forward, 01:23:21.500 |
then I think if you wanna ask over periods of thousands 01:23:27.500 |
So the key question is what's changed lately? 01:23:35.340 |
what are the major changes that seem to have happened 01:23:37.820 |
in culture and human attitudes over the last few centuries 01:23:42.700 |
so that we can project them forward into the future? 01:23:48.660 |
which is the story that we have been drifting back 01:23:51.180 |
toward forager attitudes in the last few centuries 01:23:55.860 |
So the idea is we spent a million years being a forager, 01:23:59.460 |
and that was a very sort of standard lifestyle 01:24:06.580 |
they make decisions cooperatively, they share food, 01:24:15.580 |
And then 10,000 years ago, farming became possible, 01:24:18.220 |
but it was only possible because we were plastic enough 01:24:21.900 |
Farming styles and cultures are very different. 01:24:24.540 |
They have slavery, they have war, they have property, 01:24:31.580 |
they don't have as much diversity of experience or food, 01:24:37.980 |
But humans were able to sort of introduce conformity 01:24:42.980 |
to become just a very different kind of creature as farmers. 01:24:45.380 |
Farmers are just really different than foragers 01:24:49.180 |
But the pressures that made foragers into farmers 01:24:57.340 |
from the farming norms that people around them supported, 01:25:00.100 |
they were quite at risk of starving to death. 01:25:02.340 |
And then in the last few centuries, we've gotten rich. 01:25:06.660 |
And as we've gotten rich, the social pressures 01:25:15.740 |
So for example, a farming young woman who was told, 01:25:38.060 |
she's more inclined often to go with her inclinations, 01:25:41.860 |
her sort of more natural inclinations about such things 01:25:51.180 |
we have been drifting back toward forager attitudes 01:25:56.180 |
And so aside from at work, which is an exception, 01:26:02.620 |
toward less slavery, more democracy, less religion, 01:26:06.260 |
less fertility, more promiscuity, more travel, 01:26:25.060 |
toward this world where basically like foragers, 01:26:27.380 |
we're peaceful, we share, we make decisions collectively, 01:26:38.860 |
and it's a loaded word because it's connected 01:26:42.180 |
to the actual, what life was actually like at that time. 01:26:47.580 |
As you mentioned, we sometimes don't do a good job 01:26:50.100 |
of telling accurately what life was like back then. 01:26:53.260 |
But you're saying if it's not exactly like foragers, 01:27:00.820 |
Is it obvious that a forager with a nuclear weapon 01:27:13.860 |
The main sort of violence they had would be sexual jealousy. 01:27:19.720 |
But they did not have organized wars with each other. 01:27:25.980 |
They didn't have property in land or even in people. 01:27:38.420 |
- Right, they didn't have coordinated large-scale wars 01:27:59.180 |
one of them is really interesting about language. 01:28:16.260 |
So I think some people see that we have advanced 01:28:25.240 |
and so they tend to assume it could go on forever. 01:28:28.380 |
And I actually tend to think that within, say, 01:28:30.980 |
10 million years, we will sort of max out on technology. 01:28:35.100 |
We will sort of learn everything that's feasible to know, 01:29:00.280 |
who've become so weird and different from each other 01:29:03.820 |
but we're probably pretty simple compared to them. 01:29:18.480 |
let's explore the possibility where that's not the case. 01:29:30.560 |
- We're not very good at communicating in general. 01:29:36.240 |
You're saying once you get orders of magnitude better 01:29:39.880 |
- Once they had maxed out on all communication technology 01:29:43.080 |
in general, and they just understood in general 01:29:48.680 |
- But you have to be able to, this is so interesting, 01:30:03.520 |
how the other person, the other organism sees the world. 01:30:20.840 |
because I'll say, well, the hello aliens question 01:30:30.400 |
So one scenario would be that the hello aliens 01:30:36.880 |
They would have been expanding for millions of years. 01:30:38.560 |
They would have a very advanced civilization, 01:30:43.280 |
after a billion years, perhaps, of expanding, 01:30:45.600 |
in which case they're gonna be crazy advanced 01:30:49.000 |
But the hello aliens about aliens we might meet soon, 01:30:56.160 |
and UFO aliens probably are not grabby aliens. 01:31:00.460 |
- How do you get here if you're not a grabby alien? 01:31:04.500 |
- Well, they would have to be able to travel. 01:31:11.640 |
So if it's a road trip, it doesn't count as grabby. 01:31:14.960 |
So we're talking about expanding the comfortable colony. 01:31:19.640 |
- The question is, if UFOs, some of them are aliens, 01:31:26.560 |
This is sort of the key question you have to ask 01:31:30.800 |
The key fact we would know is that they are here right now, 01:31:41.240 |
So that says right off the bat that they chose not 01:31:45.800 |
to allow massive expansion of a grabby civilization. 01:31:55.120 |
These are the stragglers, the journeymen, the-- 01:32:04.160 |
They are many millions of years older than us. 01:32:09.800 |
in that last millions of years if they had wanted to. 01:32:12.400 |
That is, they couldn't just be right at the edge. 01:32:17.060 |
Most likely they would have been around waiting for us 01:32:22.820 |
and they just chosen, they've been waiting around for this, 01:32:26.860 |
But the timing coincidence, it would be crazy unlikely 01:32:30.100 |
that they just happened to be able to get here, 01:32:34.180 |
They would no doubt have been able to get here 01:32:39.100 |
So this is a friend like UFO sightings on Earth. 01:32:41.580 |
We don't know if this kind of increase in sightings 01:32:50.980 |
And it's very unlikely that that was just to the point 01:32:55.260 |
that they could just barely get here recently. 01:33:01.900 |
- And well, throughout the stretch of several billion years 01:33:04.580 |
that Earth existed, they could have been here often. 01:33:06.380 |
- Exactly, so they could have therefore filled the galaxy 01:33:31.500 |
So isn't it possible that the sphere of places 01:33:36.340 |
where the stragglers go, the different people 01:33:38.980 |
that journey out, the explorers, is much, much larger 01:33:53.080 |
They're exploring, they're collecting the data, 01:34:01.640 |
- The time delay between when the first thing might arrive 01:34:11.140 |
and do a mass amount of work is cosmologically short. 01:34:30.340 |
it's likely, and if there's anything of value, 01:34:33.260 |
it's likely the other ants will follow quickly. 01:34:37.580 |
It's also true that traveling over very long distances, 01:34:41.620 |
probably one of the main ways to make that feasible 01:34:44.500 |
is that you land somewhere, you colonize a bit, 01:34:47.460 |
you create new resources that can then allow you 01:34:53.260 |
- Exactly, but those hops require that you are able 01:34:56.180 |
to start a colonization of sorts along those hops, right? 01:35:01.180 |
make it into a way station such that you can then 01:35:29.900 |
which is people like to do on this topic, right? 01:35:45.960 |
that are at all plausible in terms of what we know 01:35:53.300 |
like how hard are those to explain through various means. 01:35:56.740 |
I will establish myself as somewhat of an expert 01:36:03.060 |
and the things I've studied make me an expert 01:36:05.060 |
and I should stand up and have an opinion on that 01:36:09.120 |
The likelihood, however, is not my area of expertise. 01:36:11.980 |
That is, I'm not a pilot, I don't do atmospheric studies 01:36:18.140 |
the various kinds of atmospheric phenomena or whatever 01:36:20.740 |
that might be used to explain the particular sightings. 01:36:30.620 |
the attempts I've seen to easily dismiss them 01:36:33.060 |
seem to me to fail, it seems like these are pretty puzzling, 01:36:36.180 |
weird stuff that deserve an expert's attention 01:36:40.500 |
in terms of considering, asking what the likelihood is. 01:36:43.540 |
So analogy I would make is a murder trial, okay? 01:36:51.100 |
as a prior probability, maybe one in a thousand people 01:36:53.660 |
get murdered, maybe each person has a thousand people 01:36:55.860 |
around them who could plausibly have done it, 01:36:57.440 |
so the prior probability of a murder is one in a million. 01:37:00.740 |
But we allow murder trials because often evidence 01:37:03.740 |
is sufficient to overcome a one in a million prior 01:37:06.660 |
because the evidence is often strong enough, right? 01:37:10.020 |
My guess, rough guess for the UFOs as aliens scenario, 01:37:14.460 |
at least some of them, is the prior is roughly 01:37:15.900 |
one in a thousand, much higher than the usual murder trial, 01:37:19.860 |
plenty high enough that strong physical evidence 01:37:27.180 |
But I'm not an expert on that physical evidence, 01:37:34.900 |
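The Bayesian logic behind the murder-trial analogy, sketched with made-up likelihood ratios (Hanson gives only the priors; how strong the physical evidence actually is, is exactly the part he defers to experts on):

```python
# Posterior probability from a prior and a likelihood ratio (Bayes' rule in odds form).
def posterior(prior, likelihood_ratio):
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Murder trial: ~1-in-a-million prior, but evidence can be extremely diagnostic.
print(posterior(1e-6, 1e7))          # ~0.91

# UFOs-as-aliens: Hanson's rough prior of ~1 in 1,000, under hypothetical
# evidence strengths (likelihood ratios) of 10, 100, and 1,000.
for lr in (10, 100, 1000):
    print(lr, round(posterior(1e-3, lr), 3))
# 10 -> 0.01, 100 -> 0.091, 1000 -> 0.5
```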
So then I can elaborate on where my prior comes from. 01:37:38.060 |
What scenario could make most sense of this data? 01:37:41.300 |
My scenario to make sense has two main parts. 01:37:53.260 |
by which life might have arrived on Earth from elsewhere. 01:37:59.700 |
it would have to happen very early in Earth's history 01:38:05.340 |
during the stellar nursery where the sun was born 01:38:08.580 |
with many other stars in the same close proximity 01:38:14.180 |
able to move things from one place to another. 01:38:18.340 |
If a rock with life on it from some rock with planet 01:38:27.980 |
in that stellar nursery all at the same time. 01:38:30.140 |
They're all born at the same time in the same place, 01:38:31.700 |
pretty close to each other, lots of rocks flying around. 01:38:35.140 |
So a panspermia scenario would then create siblings, 01:38:44.740 |
So after the nursery forms, it drifts, it separates, 01:38:52.940 |
and we can actually spot them in terms of their spectrum. 01:38:55.780 |
And they would have then started on the same path of life 01:39:10.580 |
but maybe one other did, and maybe it did before us. 01:39:19.020 |
and they could go searching for their siblings. 01:39:20.740 |
That is, they could look in the sky for the other stars 01:39:23.060 |
that match the spectrum that matches the spectrum 01:39:27.060 |
They could identify their sibling stars in the galaxy, 01:39:31.940 |
and those would be of special interest to them 01:39:33.780 |
'cause they would think, well, life might be on those. 01:39:51.500 |
because we all kind of started at similar time 01:40:06.140 |
No longer just random independent spaces in space-time. 01:40:38.700 |
- Less than a billion, but still plenty of time 01:40:46.460 |
so the fact is they chose not to become grabby. 01:40:54.260 |
So it should be fine. - Yes, they had plenty of time 01:41:01.900 |
- 100 million, so I told you before that I said 01:41:11.020 |
- And so they clearly more than 10 million years 01:41:24.540 |
to not allow any part of themselves to do it. 01:41:36.780 |
becomes grabby from their origin with this one colony. 01:41:39.260 |
So in order to prevent their civilization being grabby, 01:41:42.340 |
they have to have a rule they enforce pretty strongly 01:41:44.540 |
that no part of them can ever try to do that. 01:41:49.980 |
or through something that's internal to them, 01:41:58.300 |
- Like a political officer in the brain or whatever. 01:42:04.620 |
that prevents you from, or like alien nature, 01:42:14.700 |
- So I would say they would have to have enforced 01:42:29.180 |
So they would probably have a pretty tight lid 01:42:31.060 |
on just allowing any travel out from their origin 01:42:40.740 |
So clearly they made an exception from their general rule 01:42:52.780 |
- But if incompetent, then they couldn't maintain 01:43:19.860 |
technological barrier to becoming expansionary? 01:43:24.540 |
- Imagine the Europeans that tried to prevent anybody 01:43:38.020 |
- They would have had to have very strict, you know, 01:43:40.760 |
guards at the borders saying, 01:43:51.340 |
- I don't know how you keep, in my silly human brain, 01:44:01.820 |
no matter how much censorship or control or so on, 01:44:08.100 |
from exploring into the mysterious, into the unknown. 01:44:11.780 |
- You're thinking of people, we're talking aliens. 01:44:17.520 |
different kinds of cultures they could be in, 01:44:21.500 |
I mean, there are many things, as you talked about, 01:44:23.000 |
that most of us would feel very reluctant to do. 01:44:36.380 |
- So Panspermia siblings is one part of the scenario, 01:44:54.860 |
then we have defeated the purpose of this rule they set up. 01:44:58.940 |
- Right, so they would be here to convince us 01:45:10.040 |
That would have been completely possible, right? 01:45:12.320 |
So the fact that they're here and we are not destroyed 01:45:22.060 |
that would make them reluctant to just destroy us. 01:45:29.500 |
there's a difference between arrival and observation. 01:45:32.460 |
They may have been observing for a very long time. 01:45:46.780 |
- Which is because that's, we can see that they did not, 01:45:49.220 |
they must have enforced a rule against that,
when they clearly don't risk very many expeditions 01:45:59.060 |
over this long period, to allow this one exception, 01:46:01.420 |
because otherwise, if they don't, we may become grabby. 01:46:04.900 |
And they could have just destroyed us, but they didn't. 01:46:13.940 |
that might have less to do with nuclear weapons 01:46:27.860 |
they can push the button and ensure that we don't expand, 01:46:36.580 |
But there's another thing that we need to explain. 01:46:38.180 |
There's another key data we need to explain about UFOs 01:46:40.500 |
if we're gonna have a hypothesis that explains them. 01:46:42.460 |
And this is something many people have noticed, 01:46:51.020 |
They could have either just remained completely invisible. 01:46:56.620 |
There's no reason they need to fly around and be noticed. 01:46:59.180 |
They could just be in orbit in dark satellites 01:47:07.820 |
The other thing they could do is just show up 01:47:10.020 |
and land on the White House lawn, as they say, 01:47:12.260 |
and shake hands, like make themselves really obvious. 01:47:21.420 |
Why would they take this intermediate approach, 01:47:27.860 |
but not walking up and introducing themselves, 01:47:36.820 |
where the White House is, or the White House lawn-- 01:47:39.860 |
- Well, it's obvious where there are concentrations 01:47:41.340 |
of humans that you could go up and introduce. 01:47:42.580 |
- But is humans the most interesting thing about Earth? 01:47:49.580 |
about an expansion, then they would be worried 01:47:52.420 |
about a civilization that could be capable of expansion. 01:47:54.660 |
Obviously, humans are the civilization on Earth 01:47:56.660 |
that's by far the closest to being able to expand. 01:48:03.880 |
obviously see humans, like the individual humans, 01:48:11.500 |
like the meat vehicles, as the center of focus 01:48:19.700 |
- They're supposed to be really smart and advanced. 01:48:26.260 |
because we think humans are the important things, 01:48:30.720 |
it could be something about our technologies. 01:48:32.940 |
- But that's mediated with us, it's correlated with us. 01:48:34.620 |
- No, we make it seem like it's mediated by us humans, 01:48:43.380 |
might be the AI systems or the technologies themselves. 01:48:48.880 |
Human is the food, the source of the organism 01:48:59.660 |
- So what they wanted to have close contact with 01:49:05.620 |
and we would just incidentally see, but we would still see. 01:49:15.140 |
with some fundamental aspect that they're interested in 01:49:21.060 |
And that's actually a very, no matter how advanced you are, 01:49:52.020 |
if they just wanted to talk to our computer systems, 01:49:56.540 |
that connects to a wire and reads and sends bits there. 01:50:00.580 |
They don't need a shiny thing flying in the sky. 01:50:05.940 |
they are, would be looking for the right way to communicate, 01:50:11.100 |
Everything you just said, looking at the computer systems, 01:50:22.760 |
but also understand, might not be that trivial. 01:50:25.340 |
How would you talk to things? - Well, so not freak out 01:50:28.580 |
So again, I said, like the two obvious strategies 01:50:30.740 |
are just to remain completely invisible and watch, 01:50:36.780 |
let's come out and be really very direct, right? 01:50:39.420 |
I mean, there's big things that you can see around. 01:50:41.580 |
There's big cities, there's aircraft carriers. 01:50:44.220 |
There's lots of, if you want to just find a big thing 01:50:46.300 |
and come right up to it and like tap it on the shoulder 01:50:52.080 |
So my hypothesis is that one of the other questions there 01:51:02.380 |
who are social animals have status hierarchy, 01:51:11.420 |
- Well, I would say their strategy is to be impressive 01:51:25.660 |
We convince dogs we're the leader of their pack, right? 01:51:30.800 |
but as we just swap into the top of their status hierarchy 01:51:36.800 |
so you should do what we say, you should follow our lead. 01:51:42.120 |
they are going to get us to do what they want 01:51:51.080 |
have tried to impress their citizens and other people 01:51:53.340 |
by having the bigger palace, the bigger parade, 01:51:57.560 |
Whatever, maybe building a bigger pyramid, et cetera. 01:52:08.520 |
several orders of magnitude of power differential, 01:52:13.280 |
I feel like that status hierarchy no longer applies. 01:52:18.300 |
- Most emperors are several orders of magnitude 01:52:20.280 |
more powerful than any one member of their empire. 01:52:25.880 |
So like, if I'm interacting with ants, right? 01:52:44.560 |
So I'm less concerned about them worshiping me. 01:52:50.080 |
- But it is important that you be non-threatening 01:52:53.480 |
if the aliens had done something really big in the sky, 01:53:01.280 |
So I think their strategy to be the high status 01:53:06.640 |
- I just don't know if it's obvious how to do that. 01:53:10.920 |
You see a planet with relatively intelligent, 01:53:20.200 |
We could see this under, in Titan or something like that. 01:53:27.320 |
You start to see not just primitive bacterial life, 01:53:33.620 |
cellular colonies, structures that are dynamic.
Some gigantic cellular automata type of construct. 01:53:50.100 |
in an impressive fashion without destroying it? 01:54:02.560 |
by getting too close and touching it and interacting, right? 01:54:06.940 |
- Right, so the claim is that their current strategy 01:54:11.400 |
of hanging out at the periphery of our vision 01:54:13.740 |
and just being very clearly physically impressive 01:54:16.240 |
with very clear physically impressive abilities 01:54:19.360 |
is at least a plausible strategy they might use 01:54:25.240 |
that we're at the top of their status hierarchy. 01:54:31.960 |
in ways that they couldn't really understand, 01:54:37.800 |
So if you look at how we treat other civilizations 01:54:53.260 |
that violates our moral norms, and then we hate them. 01:54:56.760 |
And these are aliens, for God's sakes, right? 01:55:16.200 |
with mimetic theory where we only feel this way 01:55:29.260 |
of our status hierarchy to get us to follow them, 01:55:32.800 |
They have to be close enough that we would see them that way. 01:55:48.380 |
If we see that they're here, we can figure out, 01:56:05.560 |
and here are these very advanced, smart aliens 01:56:22.900 |
in that framework, who originated, who planted it? 01:56:42.460 |
and it went through part of the stages of evolution 01:56:45.220 |
to advanced life, but not all the way to advanced life, 01:56:48.100 |
and then some rock hit it, grabbed a piece of it 01:57:06.560 |
- There's some graphs that I've been impressed by 01:57:08.860 |
that show sort of the level of genetic information 01:57:12.360 |
in various kinds of life on the history of Earth, 01:57:21.660 |
And so if you actually project this log graph in history, 01:57:24.740 |
it looks like it was many billions of years ago 01:57:29.620 |
you could say there was just a lot of evolution 01:57:33.240 |
to the simplest life we've ever seen in history 01:57:35.180 |
of life on Earth, was still pretty damn complicated. 01:57:40.660 |
How could life get to this enormously complicated level 01:57:48.300 |
So where, you know, it's only 300 million years 01:57:51.780 |
at most it appeared, and then it was really complicated 01:58:01.860 |
on another planet, going through lots of earlier stages 01:58:06.780 |
of complexity you see at the beginning of Earth. 01:58:12.060 |
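A minimal illustrative sketch of the kind of extrapolation being described, using made-up placeholder numbers rather than the actual published data: complexity that grows exponentially is a straight line on a log scale, so you can fit that line and project it back to where the trend would begin.

```python
# Toy extrapolation of genome complexity back in time (hypothetical numbers,
# not the real datasets): fit a line to log-complexity vs. time, then solve
# for when the trend line would hit a near-zero starting complexity.
import numpy as np

ages_gya = np.array([0.0, 0.5, 2.0, 3.5])          # billions of years ago (placeholders)
log10_complexity = np.array([9.0, 8.0, 7.0, 6.0])  # log10 "functional genome size" (placeholders)

slope, intercept = np.polyfit(-ages_gya, log10_complexity, 1)

# Project back to a minimal starting complexity of about 10^2.
origin_gya = (intercept - 2.0) / slope
print(f"trend-line origin: roughly {origin_gya:.1f} billion years ago")
```

With these placeholder values the fitted trend starts well before Earth's roughly 4.5-billion-year age, which is the shape of the argument being made here; whether the real data supports it is exactly what such graphs are debating.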
of the origin of life, but let me return to UFO sightings. 01:58:16.080 |
Are there other explanations that are possible
outside of panspermia siblings that can explain 01:58:31.200 |
that most people will use is, well, first of all, 01:58:33.820 |
just mistakes, like, you know, you're confusing 01:58:37.020 |
something ordinary for something mysterious, right? 01:58:43.380 |
like our government is secretly messing with us, 01:58:45.940 |
and trying to do a false flag ops or whatever, right? 01:58:50.100 |
They're trying to convince the Russians or the Chinese 01:58:55.800 |
'Cause if you know the history of World War II, 01:58:58.500 |
say the US government did all these big fake operations 01:59:04.580 |
in order to mess with people, so that's a possibility. 01:59:07.060 |
The government's been lying and faking things 01:59:09.740 |
and paying people to lie about what they saw, et cetera. 01:59:47.460 |
talked about, thought about so many different topics. 01:59:55.940 |
I'm almost like at a loss of which place we explore, 01:59:59.740 |
but let me, on this topic, ask about conspiracy theories. 02:00:03.660 |
'Cause you've written about institutions, authorities. 02:00:19.180 |
- The phrase itself is pushing you in a direction, right? 02:00:26.020 |
many large coordinated keepings of secrets, right? 02:00:45.380 |
So clearly, it's possible to keep something secret 02:00:48.620 |
over time periods, but the more people you involve 02:00:55.300 |
and the more, the less centralized an organization 02:01:01.420 |
But we're just trying to calibrate, basically, in our minds 02:01:04.620 |
which kind of secrets can be kept by which groups 02:01:06.820 |
over what time periods for what purposes, right? 02:01:16.580 |
and I love people, I love all things, really. 02:01:22.980 |
even the assholes, have the capacity to be good, 02:01:37.860 |
that doesn't allow me to intuit the competence 02:01:49.460 |
So for example, one thing that people often talk about 02:01:52.060 |
is intelligence agencies, this broad thing they say, 02:01:55.340 |
the CIA, the FSB, the different, the British intelligence. 02:02:04.740 |
to talk to any member of those intelligence agencies. 02:02:13.340 |
or the first curtain, I don't know how many levels 02:02:15.380 |
of curtains there are, and so I can't intuit. 02:02:20.060 |
I was funded by DOD and DARPA, and I've interacted, 02:02:37.140 |
that sometimes suffocates the ingenuity of the human spirit, 02:02:43.140 |
Meaning, they are, I just, it's difficult for me 02:02:52.520 |
Now, that doesn't mean, that's my very anecdotal data 02:03:14.100 |
sufficiently robust propaganda that controls the populace. 02:03:18.420 |
If you look at World War II, as you mentioned, 02:03:21.780 |
there have been extremely powerful propaganda machines 02:03:25.380 |
on the side of Nazi Germany, on the side of the Soviet Union, 02:04:07.620 |
when the initial spark was just a propaganda thought 02:04:13.300 |
So I can't necessarily intuit of what's possible, 02:04:18.300 |
but I'm skeptical of the power of human institutions 02:04:30.780 |
when information is becoming more and more accessible 02:04:40.660 |
the people who are managing the various conspiracies 02:04:45.300 |
they thought that their conspiracy was avoiding harm 02:05:02.980 |
So if you can make your conspiracy the sort of thing 02:05:04.980 |
that people wouldn't wanna talk about anyway, 02:05:15.280 |
should be interested in, but somehow don't know, 02:05:17.320 |
even though the data has been very widespread. 02:05:19.620 |
So I have this book, "The Elephant in the Brain,"
and one of the chapters is there on medicine. 02:05:27.140 |
of the very basic fact that when we do randomized trials 02:05:29.860 |
where we give some people more medicine than others, 02:05:32.540 |
the people who get more medicine are not healthier. 02:05:35.100 |
Just overall, in general, just like induce somebody 02:05:38.940 |
to get more medicine because you just give them 02:05:42.860 |
Not a specific medicine, just the whole category. 02:05:49.100 |
You might even think that would be a conspiracy theory 02:05:53.100 |
but in fact, most people never learn that fact. 02:05:55.860 |
- So just to clarify, just a general high-level statement, 02:06:00.860 |
the more medicine you take, the less healthy you are. 02:06:03.580 |
- Randomized experiments don't find that fact. 02:06:07.340 |
Do not find that more medicine makes you more healthy. 02:06:11.520 |
In randomized experiments, there's no relationship 02:06:28.820 |
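A minimal sketch of the comparison such a randomized experiment makes, run on simulated data where the null result is built in by construction, so the numbers only illustrate the test rather than any real study:

```python
# Simulated randomized experiment: assignment to "more medicine" is random,
# and here the health outcome is generated independently of assignment, so
# the analysis should (and does) find no relationship. Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 5000
more_medicine = rng.integers(0, 2, size=n)     # random assignment is the key step
health = rng.normal(loc=70, scale=10, size=n)  # outcome unaffected by assignment here

diff = health[more_medicine == 1].mean() - health[more_medicine == 0].mean()
_, p_value = stats.ttest_ind(health[more_medicine == 1], health[more_medicine == 0])
print(f"difference in mean health: {diff:.2f}, p-value: {p_value:.2f}")
```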
And then you're saying that there's also part of this 02:06:45.840 |
It never was mentioned in that section of the paper 02:06:55.320 |
where most people don't want to know the truth? 02:07:04.000 |
- That is, bad-looking truths, truths that are discouraging,
truths that sort of take away the justification 02:07:11.320 |
- Do you think that's a bad aspect of human nature? 02:07:24.680 |
and then to try to be careful about how we can improve. 02:07:33.360 |
And our first priority there is just to explain to you 02:07:36.080 |
what are the things that you are not looking at 02:07:46.840 |
And that often goes badly because it's harder 02:07:54.000 |
that this truth is available if you want to learn about it. 02:07:57.160 |
- It's the Nietzsche, "If you gaze long into the abyss, 02:08:01.280 |
Let's talk about this "Elephant in the Brain." 02:08:08.440 |
"An important issue that people are reluctant 02:08:15.400 |
but unacknowledged feature of how our mind works, 02:08:29.760 |
as some of the elephant offspring in the brain. 02:08:43.240 |
in our brain that we don't want to acknowledge to ourselves? 02:08:48.240 |
- In your conscious mind, the one that's listening to me 02:08:53.000 |
you like to think of yourself as the president 02:08:55.360 |
or king of your mind, ruling over all that you see, 02:09:07.680 |
You don't make decisions, you justify them to an audience. 02:09:15.960 |
You watch what you're doing and you try to come up 02:09:21.440 |
so that you can avoid accusations of violating norms. 02:09:24.800 |
So humans compared to most other animals have norms 02:09:30.640 |
with our morals and norms about what we should 02:09:45.960 |
So if I hit you on purpose, that's a big violation. 02:09:50.880 |
I need to be able to explain why it was an accident 02:09:59.000 |
- Right, so humans have norms and we have the norm 02:10:03.200 |
we need to tell other people and then coordinate 02:10:05.280 |
to make them stop and punish them for violating. 02:10:09.280 |
So such punishments are strong enough and severe enough
that we each want to avoid being successfully accused 02:10:22.280 |
If we do it consistently, we may be thrown out of the group 02:10:26.640 |
Okay, so we need to be able to convince people 02:10:29.440 |
we are not going around hitting people on purpose. 02:10:32.520 |
If somebody happens to be at the other end of our fist 02:10:35.360 |
and their face connects, that was an accident 02:10:41.780 |
And similarly for many other norms humans have, 02:10:48.120 |
We find them violating, we're going to accuse them. 02:10:57.280 |
about everything we do, which is why we're constantly 02:11:02.240 |
and that's what your conscious mind is doing. 02:11:04.240 |
It is trying to make sure you've got a good motive story 02:11:08.720 |
And that's why you don't know why you really do things. 02:11:26.680 |
and is justifying all of the decisions of the dictator. 02:11:32.240 |
- Right, now most people actually are willing to believe 02:11:36.720 |
So our book has been classified as psychology 02:11:43.840 |
and the way reviewers responded is to say this is well known.
- But they don't want to accept it about themselves. 02:11:52.920 |
about the particular topics that we talk about. 02:11:59.160 |
or that they might not be honest about various things. 02:12:01.920 |
But that hasn't penetrated into the literatures 02:12:04.200 |
where people are explaining particular things 02:12:06.000 |
like why we go to school, why we go to the doctor, 02:12:17.760 |
And people who study those things have not admitted 02:12:21.640 |
that hidden motives are explaining those particular areas. 02:12:25.480 |
- So they haven't taken the leap from theoretical psychology 02:12:41.440 |
So how vast is this landscape of the unconscious mind, 02:12:56.880 |
- The vast majority of what's happening in your head 02:13:03.860 |
the aspects of your mind that you're not conscious of 02:13:09.160 |
But that's just true in a literal engineering sense. 02:13:26.600 |
staying with the philosophical psychology side for a moment. 02:13:29.920 |
You know, can you shine a light in the Jungian shadow? 02:13:37.360 |
Like what level of thoughts are happening there? 02:13:49.320 |
of like monitoring different systems in the body, 02:13:52.300 |
making sure you're happy, making sure you're fed, 02:14:07.880 |
Therefore, most of our brain is devoted to being social. 02:14:44.060 |
sometimes for fun, sometimes as a basic statement 02:14:57.220 |
how are you self-deceiving yourself in this task 02:15:03.140 |
how's your, like why is the dictator manipulating you 02:15:08.700 |
Like there's norms, why do you wanna stand out in this way? 02:15:13.560 |
Why do you want to challenge the norms in this way? 02:15:19.680 |
but the more practical strategy that's quite feasible 02:15:26.720 |
and then to own up to those particular things. 02:15:31.960 |
- So for example, I can very much acknowledge 02:16:00.680 |
I might want to find topics where other people are interested 02:16:06.680 |
of finding a big insight rather than a small one 02:16:10.600 |
and maybe one that was especially surprising. 02:16:20.680 |
but most intellectuals are relatively risk-averse. 02:16:25.120 |
They are in some local intellectual tradition 02:16:55.480 |
and they should be looking for what's neglected 02:16:57.240 |
between the major traditions and major questions. 02:17:16.200 |
So you could say that one motivation I might have 02:17:21.200 |
is less motivated to be sort of comfortably accepted 02:17:33.480 |
that should be very important if you could find them. 02:17:39.680 |
would get you appreciated-- - Attention, respect. 02:17:56.720 |
but the larger community will see the brilliance 02:18:00.300 |
of you breaking out of the cage of the small conformity 02:18:05.340 |
It's always a bigger, there's always a bigger cage, 02:18:10.140 |
Yeah, also that explains your choice of colorful shirt 02:18:22.500 |
by making false claims of dramatic improvement. 02:18:28.460 |
than actually working through all the details-- 02:18:37.680 |
about how much you value truth and the pursuit of truth. 02:18:43.080 |
- Hitler and Stalin also talked about the value of truth. 02:19:15.960 |
- So I think the standards you hold yourself to 02:19:22.960 |
are dependent on the audience you have in mind. 02:19:28.120 |
as relatively easily fooled or relatively gullible, 02:19:32.500 |
then you won't bother to generate more complicated, 02:19:39.240 |
to persuade somebody who has higher standards 02:19:46.340 |
And of course, if you are, say, a salesperson, 02:19:50.440 |
then you don't need to convince the top few percent 02:20:12.440 |
Is it the people who are reading their Twitter feed? 02:20:19.000 |
Or is it Einstein and Freud and Socrates, right? 02:20:23.380 |
So I think those of us who are especially arrogant, 02:20:33.120 |
we were naturally gonna pick the big shot audience 02:20:36.720 |
We're gonna be trying to impress Socrates and Einstein. 02:20:39.520 |
- Is that why you hang out with Tyler Cowen a lot?
Trying to impress the very most highest quality minds, 02:21:00.280 |
So I might well have had more ordinary success in life, 02:21:04.640 |
be more popular, invited to more parties, make more money 02:21:07.320 |
if I had targeted a lower tier set of intellectuals 02:21:17.160 |
that Einstein was my audience, or people like him, 02:21:23.280 |
- Yeah, I mean, you pick your set of motivations. 02:21:28.880 |
is not gonna help you get laid, trust me, I tried. 02:21:39.760 |
of the elephant in the brain in everyday life? 02:21:51.080 |
all those kinds of things, what are some things 02:21:54.200 |
medicine is much less useful health-wise than you think. 02:21:57.120 |
So, you know, if you were focused on your health, 02:22:03.400 |
And if you were focused on other people's health, 02:22:10.560 |
and letting other people show that they care about you,
then a lot of priority on medicine can make sense. 02:22:16.120 |
So that was our very earliest discussion in the podcast. 02:22:25.960 |
if that's the way that you show that you care about them 02:22:33.220 |
if you can't find a cheaper, more effective substitute. 02:22:36.200 |
- So if we actually just pause on that for a little bit, 02:22:43.200 |
of self-deception happening in the space of medicine? 02:22:46.720 |
- So we have a method that we use in our book 02:22:57.420 |
Look at broad patterns of behavior in other people. 02:23:00.560 |
And then ask, what are the various theories we could have 02:23:07.240 |
Which theory better matches the behavior they have? 02:23:10.920 |
And the last step is to assume that's true of you 02:23:17.120 |
If you happen to be an exception, that won't go so well, 02:23:19.000 |
but nevertheless, on average, you aren't very well 02:23:26.200 |
explain what other people do, and assume that's you too. 02:23:37.480 |
There's the doctors that are prescribing the medicine. 02:23:40.080 |
There's drug companies that are selling drugs. 02:23:46.940 |
So you can build up a network of categories of humans 02:24:06.100 |
it's usually much easier to explain producer behavior 02:24:18.240 |
And similarly say, governments in democratic countries 02:24:22.020 |
have the incentive to give the voters what they want. 02:24:24.900 |
So that focuses your attention on the patient 02:24:33.020 |
They would be driving the rest of the system. 02:24:37.900 |
are willing to give them in order to get paid. 02:24:46.020 |
What are they choosing and why do they choose that? 02:24:56.380 |
by the producer being incentivized to manipulate 02:25:07.340 |
and exaggerate in order to get more customers. 02:25:15.460 |
can't be explained by the willingness of the producers 02:25:20.220 |
or to do various things that we have to, again, 02:25:31.100 |
- Yeah, and that potentially requires a lot of thinking, 02:25:35.780 |
and potentially looking at historical data too 02:25:42.700 |
it's a lot, actually, easier than you might think. 02:25:45.500 |
I think the biggest limitation is just the willingness 02:25:49.860 |
So many of the patterns that you need to rely on 02:25:52.860 |
are actually pretty obvious, simple patterns. 02:25:54.780 |
You just have to notice them and ask yourself, 02:25:59.300 |
Often, you don't need to look at the most subtle, 02:26:08.620 |
- All right, so there's a fundamental statement 02:26:18.740 |
that many of the foundational ideas in the book are wrong? 02:26:29.540 |
which is it can be a lot simpler than it looks. 02:26:39.980 |
It's very difficult to have a simple model about. 02:26:56.340 |
And we are able to really introspect our own mind. 02:26:59.700 |
And like what's on the surface of the conscious 02:27:07.900 |
You're able to actually arrive to deeply think 02:27:15.980 |
and more about being a free thinking individual. 02:27:23.820 |
why they don't have their homework assignment, 02:27:30.400 |
They almost never say the dragon ate my homework. 02:27:39.060 |
Almost always when we make excuses for things, 02:27:41.700 |
we choose things that are at least in some degree plausible. 02:27:47.980 |
That's an obstacle for any explanation of a hidden motive 02:28:02.100 |
That's gonna be an obstacle to proving that hypothesis 02:28:10.260 |
that a person would typically have if they were challenged. 02:28:17.020 |
maybe you can't tell whether his dog ate his homework or not. 02:28:24.660 |
You will need to have a wider range of evidence 02:28:42.660 |
it'll be true that you'll have to look at wider data 02:28:58.340 |
I have to point you to sort of larger sets of data, 02:29:00.900 |
but in many areas of academia, including health economics, 02:29:16.260 |
whereby if they're getting a result too much contrary 02:29:19.460 |
to the usual point of view everybody wants to have, 02:29:24.100 |
or redo the analysis until they get an answer 02:29:28.220 |
So that means in the health economics literature, 02:29:33.780 |
that in fact, we have evidence that medicine is effective. 02:29:39.380 |
I will have to point you to our most reliable evidence 02:29:59.580 |
that's where we will see the truth be more revealed. 02:30:05.620 |
we have millions of papers published in medicine 02:30:09.300 |
most of which give the impression that medicine is useful. 02:30:13.660 |
There's a small literature on randomized experiments 02:30:19.220 |
where there's maybe a half dozen or so papers,
because it's such a straightforward experiment 02:30:37.740 |
to show you that there's relatively little correlation 02:30:40.940 |
But even then, people could try to save the phenomenon 02:30:44.500 |
and say, "Well, it's not hidden motives, it's just ignorance." 02:30:56.820 |
They are just ignorantly assuming that medicine is effective. 02:31:08.180 |
I'm saying, "Well, how long has this misperception 02:31:12.620 |
How consistently has it happened around the world 02:31:16.420 |
And I would have to say, "Look, if we're talking about, 02:31:20.140 |
say, a recent new product, like Segway scooters 02:31:27.740 |
Maybe they could be confused about their value. 02:31:29.820 |
If we're talking about a product that's been around 02:31:31.660 |
for thousands of years, used in roughly the same way 02:31:34.260 |
all across the world, and we see the same pattern 02:31:36.700 |
over and over again, this sort of ignorance mistake 02:31:41.860 |
- It's also a question of how much of the self-deception
but then you escape it easily if you're motivated. 02:32:14.820 |
- The motivational hypotheses about the self-deceptions 02:32:24.100 |
So the story would be most people want to look good 02:32:28.660 |
Therefore, most people present themselves in ways 02:32:31.860 |
that help them look good to the people around them. 02:32:35.840 |
That's sufficient to say there would be a lot of it. 02:32:41.400 |
There's enough variety in people and in circumstances 02:32:46.120 |
can be in the interest of some minority of the people. 02:32:53.640 |
I've decided that being contrarian on these things 02:32:57.860 |
could be winning for me in that there's a room 02:33:07.080 |
even if there's not room for most people to do that. 02:33:09.680 |
And that can be explaining sort of the variety, right? 02:33:35.200 |
clearly most people are trying to look somewhat nice, right? 02:33:37.880 |
They shower, they shave, they comb their hair, 02:33:49.640 |
We can see in those particular people's context 02:33:58.720 |
- So the general rule does reveal something foundational. 02:34:09.520 |
since we're talking about this, especially in medicine. 02:34:18.960 |
there's been a growing distrust of authorities, 02:34:22.000 |
of institutions, even the institution of science itself. 02:34:27.760 |
What are the pros and cons of authorities, would you say? 02:34:43.080 |
is as something you can defer to respectably
That is, when you're asking, what should I act on 02:35:02.880 |
You might be worried if I chose something too contrarian, 02:35:06.440 |
too weird, too speculative, that that would make me look bad 02:35:11.200 |
so I would just choose something very conservative. 02:35:14.340 |
So maybe an authority lets you choose something 02:35:23.320 |
And somebody says, why did you do that thing? 02:35:31.520 |
- So the authority is often pushing for the conservative. 02:35:41.400 |
you could just think, oh, I'll just stay home 02:35:43.040 |
and close all the doors or I'll just ignore it. 02:35:44.800 |
You could just think of just some very simple strategy 02:35:47.000 |
that might be defensible if there were no authorities. 02:35:50.960 |
But authorities might be able to know more than that. 02:36:20.720 |
- Details, just not behaving as I would have imagined 02:36:27.720 |
in the best possible evolution of human civilization, 02:36:33.200 |
They seem to have failed in some fundamental way 02:36:36.920 |
in terms of leadership in a difficult time for our society. 02:36:45.600 |
- So again, if there were no authorities whatsoever, 02:36:51.080 |
then people would have to sort of randomly pick 02:36:57.560 |
and then they'd be fighting each other about that, 02:37:02.920 |
that you would always do without responding to context. 02:37:08.600 |
is that they could know more than just basic ignorance, 02:37:13.600 |
they could both be more informed than ignorance 02:37:29.260 |
- So the con is that if you think of yourself 02:37:35.920 |
it's unfortunately not to be maximally informative. 02:37:49.800 |
as you could possibly listen to and manage to assimilate, 02:37:53.460 |
and it would update that as frequently as possible 02:37:56.260 |
or as frequently as you were able to listen and assimilate, 02:37:58.880 |
and that would be the maximally informative authority. 02:38:04.960 |
between being an authority or being seen as an authority 02:38:15.460 |
That is, if you look at it from their point of view, 02:38:18.900 |
they won't long remain the perceived authority 02:38:22.860 |
if they are too incautious about how they use that authority, 02:38:34.740 |
- Okay, that's still in the pro column for me 02:38:42.660 |
and I would hope that authorities struggle with that, 02:38:56.500 |
'cause I trust in the intelligence of people, 02:38:58.500 |
but I'd like to mention a bigger con on authorities, 02:39:04.220 |
This comes back to global government and so on, 02:39:07.820 |
is that there's humans that sit in chairs during meetings 02:39:12.820 |
and those authorities, they have different titles, 02:39:18.220 |
and sometimes those titles get to your head a little bit, 02:39:23.060 |
how do I preserve my control over this authority, 02:39:30.940 |
what is the mission of the WHO and other such organizations,
and how do I maximize the implementation of that mission, 02:39:37.740 |
you start to think, well, I kind of like sitting 02:39:58.780 |
under what good means, given the mission of the authority, 02:40:06.780 |
First, in the meeting room, everybody around you, 02:40:15.420 |
throughout the whole hierarchy of the company. 02:40:19.780 |
or in the organization believes this narrative, 02:40:22.680 |
now you start to control the release of information, 02:40:27.680 |
not because you're trying to maximize outcomes, 02:41:07.940 |
just because they're assistant, associate, or full faculty,
just because they are deputy head of X organization, 02:41:28.300 |
and then somebody shook their hand and gave them a medal, 02:41:34.980 |
that people have been patting them on the back, 02:41:44.040 |
who really want the money in a self-deception kind of way, 02:41:46.780 |
they don't actually really care about your performance, 02:41:55.820 |
you become an authority that just wants to maximize 02:42:00.460 |
self-preserve yourself in a sitting on a throne of power. 02:42:05.460 |
- So this is core to sort of what it is to be an economist, 02:42:16.340 |
- We often have a situation where we see a world of behavior, 02:42:20.820 |
and then we see ways in which particular behaviors 02:42:28.140 |
- And we have a variety of reactions to that, 02:42:31.780 |
so one kind of reaction is to sort of morally blame 02:42:38.800 |
under perhaps the idea that people could be identified 02:42:44.640 |
and maybe induced into doing the better thing 02:42:46.660 |
if only enough people were calling them out on it, right? 02:42:55.180 |
with certain stable institutional structures, 02:42:58.100 |
and that institutions create particular incentives 02:43:00.740 |
for individuals, and that individuals are typically 02:43:15.940 |
and more blaming the world for having the wrong institutions. 02:43:24.340 |
and which of them might promote better behavior, 02:43:26.140 |
and this is a common thing we do all across human behavior, 02:43:29.660 |
is to think of what are the institutions we're in, 02:43:32.060 |
and what are the alternative variations we could imagine, 02:43:37.220 |
I would agree with you that our information institutions, 02:43:42.080 |
that is the institutions by which we collect information 02:43:47.260 |
are especially broken in the sense of far from the ideal 02:44:06.860 |
that we give people incentives to do research 02:44:14.300 |
and we actually care more about whether academics 02:44:16.380 |
are impressive than whether they're interesting or useful. 02:44:27.100 |
ways in which those institutions produce incentives 02:44:30.820 |
that are mistaken, and that was the point of the post 02:44:32.940 |
we started with talking about the authorities. 02:44:55.980 |
let me just apologize for a couple of things. 02:44:58.660 |
One, I often put too much blame on leaders of institutions 02:45:03.660 |
versus the incentives that govern those institutions. 02:45:17.500 |
too emotional about my criticism of Anthony Fauci, 02:45:22.820 |
because I think there's deeper truths to think about, 02:45:29.620 |
That said, I do sort of, I'm a romantic creature by nature. 02:45:46.900 |
You think about leaders, you think about individuals, 02:45:55.620 |
It's harder to think through deeply about the models 02:46:03.300 |
But also, I don't apologize for being emotional sometimes, 02:46:12.940 |
well, you should be trying to reform these institutions 02:46:22.660 |
But I can understand why the people at the top 02:46:30.300 |
- Can I maybe ask you about particular universities? 02:46:37.340 |
an increase in distrust overall as an institution, 02:46:50.420 |
one of the journeys that humans have taken on. 02:47:21.300 |
a place where you can be a kid for your whole life 02:47:28.060 |
that universities still not currently receive, 02:47:38.420 |
of particular departments, particular people. 02:48:01.140 |
engineering, so robotics, artificial intelligence. 02:48:11.620 |
There's like more rules, there's more meetings, 02:48:30.260 |
And meetings destroy, they suffocate that radical thought 02:48:34.500 |
that happens when you're an undergraduate student 02:48:42.220 |
Is there something positive, insightful you could say 02:48:46.460 |
about how we can make for better universities 02:48:50.340 |
in the decades to come, this particular institution? 02:48:58.100 |
many scientists and intellectuals were aristocrats. 02:49:16.280 |
and that the kind of competition they faced
among aristocrats allowed that sort of a self-indulgence 02:49:23.860 |
or self-pursuit at least at some point in their lives. 02:49:28.920 |
So the analogous observation is that university professors 02:49:39.160 |
And I am certainly enjoying that as a tenured professor. 02:49:47.080 |
Just the exploration you're doing, the depth of thought, 02:49:50.120 |
like most people are afraid to do the kind of 02:49:53.680 |
broad thinking that you're doing, which is great. 02:49:57.880 |
is the combination of these two things analogously. 02:50:14.400 |
even though people have a lot of resources, et cetera, 02:50:18.600 |
So I think I'm kind of lucky that tenure exists 02:50:54.240 |
Or are they getting a good value for that payment? 02:51:00.820 |
And the question is, are students getting good value 02:51:05.040 |
And each person is getting value in the sense 02:51:12.560 |
which is then worth more salary as an employee later. 02:51:19.040 |
because we aren't actually changing the students 02:51:22.520 |
or educating them, we're more sorting them or labeling them. 02:51:26.880 |
And that's a very expensive process to produce that outcome. 02:51:30.280 |
And part of the expense is the freedom that comes with tenure, I guess,
because it's basically a tax on all these young students 02:51:47.120 |
The other main customers are research patrons,
And then the question is, are they getting their money worth 02:51:55.880 |
out of the money they're paying for research to happen? 02:52:09.620 |
They mainly pay money to researchers who are impressive 02:52:21.160 |
So there's a deep truth to that cynical perspective. 02:52:25.560 |
Is there a less cynical perspective that they do care 02:52:46.520 |
the individuals there are rated based on the prestige 02:52:55.440 |
they are in a competitive game where they don't have tenure 02:53:01.160 |
And so once they give grant money to prestigious people, 02:53:04.000 |
that is the thing that shows that they have achieved 02:53:27.240 |
to a new equilibrium where we do that instead. 02:53:29.720 |
- What are the components of the better ways to do it? 02:53:36.820 |
So the sources of money and how the money is allocated 02:53:44.500 |
- Years ago, I started studying this topic exactly 02:53:51.720 |
and my best guess still is prediction markets, 02:54:01.840 |
like what's the mass of the electron neutrino, 02:54:04.280 |
then what you can do is just subsidize a betting market 02:54:07.760 |
in that question, and that will induce more research 02:54:12.340 |
because the people who then answer that question 02:54:17.900 |
So that's a robust way to induce more information 02:54:38.480 |
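One concrete way to subsidize such a betting market is an automated market maker run under a market scoring rule. Below is a hedged sketch of a logarithmic market scoring rule (LMSR) over discrete answer buckets for some quantity; the class name, parameters, and numbers are illustrative, and the sponsor's maximum subsidy is bounded by the liquidity parameter times the log of the number of outcomes.

```python
# Sketch of a logarithmic market scoring rule (LMSR) market maker, the kind of
# automated subsidy that can run a betting market on a scientific question.
# Outcomes are discrete buckets for the quantity in question; all names and
# numbers here are illustrative.
import math

class LMSRMarket:
    def __init__(self, n_outcomes: int, b: float):
        self.b = b                   # liquidity parameter; max subsidy is b * ln(n_outcomes)
        self.q = [0.0] * n_outcomes  # net shares sold of each outcome

    def _cost(self, q) -> float:
        return self.b * math.log(sum(math.exp(x / self.b) for x in q))

    def prices(self):
        z = sum(math.exp(x / self.b) for x in self.q)
        return [math.exp(x / self.b) / z for x in self.q]  # current probability estimates

    def buy(self, outcome: int, shares: float) -> float:
        """Buy `shares` of `outcome`; returns what the trader pays."""
        new_q = list(self.q)
        new_q[outcome] += shares
        payment = self._cost(new_q) - self._cost(self.q)
        self.q = new_q
        return payment

# Example: three buckets for a parameter's value. A trader who thinks bucket 2
# is underpriced buys shares there, moving the market probability toward it.
market = LMSRMarket(n_outcomes=3, b=100.0)
print([round(p, 3) for p in market.prices()])      # uniform start: [0.333, 0.333, 0.333]
cost = market.buy(outcome=2, shares=50.0)
print(round(cost, 2), [round(p, 3) for p in market.prices()])
```

The sponsor's bounded loss is the subsidy, and it is what pays traders for moving the price toward their information.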
And for the customers who want to be affiliated 02:54:50.080 |
which I just wrote about in my second most recent blog post. 02:54:56.360 |
What we do today is we take sort of acceptance 02:54:58.960 |
by other academics recently as our best indication 02:55:04.260 |
That is recent publications, recent job affiliation, 02:55:08.820 |
institutional affiliations, recent invitations to speak, 02:55:14.780 |
We are today taking other impressive academics' 02:55:24.020 |
I would say we could do better by creating betting markets 02:55:42.800 |
and tried to look in detail at their research 02:55:58.560 |
which will be different than what people at the time judged 02:56:04.860 |
or which publications they had or things like that. 02:56:07.600 |
- In this way, if you think from the perspective 02:56:22.220 |
and you would think like what is the brave, the bold, 02:56:31.060 |
'cause you could see the path with which ideas took, 02:56:39.120 |
have a much better estimate of who actually had 02:56:41.760 |
what long-term effects on intellectual progress. 02:56:47.140 |
in several centuries to do this historical analysis 02:56:52.340 |
where we buy and sell assets which will later pay off
So instead of looking at their list of publications 02:57:12.420 |
or affiliations, you look at the actual price of assets 02:57:27.780 |
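As a purely illustrative sketch of that idea (all names and numbers hypothetical), such an asset can be thought of as a contract that pays off only when distant-future historians score a researcher's long-term contribution; its current market price then serves as today's estimate of that eventual judgment.

```python
# Hypothetical "distant future evaluation" asset: pays out, centuries from now,
# in proportion to how a panel of future historians scores a researcher's
# long-term contribution. Illustrative sketch only.
from dataclasses import dataclass

@dataclass
class HistoricalJudgmentAsset:
    researcher: str
    resolution_year: int      # when the historians actually score the work
    payout_per_point: float   # payout = score (0-100) * payout_per_point

    def settle(self, historian_score: float) -> float:
        """Called at resolution_year with the historians' 0-100 score."""
        return historian_score * self.payout_per_point

asset = HistoricalJudgmentAsset("Dr. Example", resolution_year=2300, payout_per_point=1.0)
market_price = 62.0  # hypothetical current trading price
implied_expected_score = market_price / asset.payout_per_point
print(f"market-implied expected historian score: {implied_expected_score:.0f}/100")
```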
- I've been elaborating two versions of it here. 02:57:39.760 |
then what you would do is subsidize a betting market 02:57:49.560 |
for many kinds of concrete intellectual questions 02:57:52.240 |
like what's the mass of the electron neutrino. 02:57:53.880 |
- In this hypothetical world that you're constructing 02:58:11.600 |
- So the idea would be research labs would be for profit. 02:58:18.600 |
and then their profit would come from using the insights 02:58:21.200 |
the researchers gain to trade in these financial markets.
and then making their profits by trading on
that insight in the ordinary financial market. 02:58:44.320 |
- The variance around the mass of the electron neutrino 02:58:49.960 |
and any other parameters that we wanted to estimate. 02:58:52.600 |
- You don't think those markets would also respond 02:58:54.520 |
to recency of prestige and all those kinds of things? 02:59:02.680 |
but if you think they're doing it incorrectly, 02:59:12.400 |
in the current ways in which people are estimating 02:59:22.400 |
it's the safe choice to go with the prestige. 02:59:26.840 |
- Even if you privately think that the prestige 02:59:30.960 |
- Even if you privately think strongly that it's overrated. 02:59:34.200 |
- Still, you don't have an incentive to defy that publicly. 02:59:38.520 |
unless you're a contrarian that writes brilliant blogs 02:59:46.400 |
I mean, initially, this was my initial concept 02:59:48.120 |
of having these betting markets on these key parameters. 02:59:52.600 |
that's more what people pretend to care about. 02:59:54.520 |
What they really mostly care about is just who's how good. 02:59:58.200 |
And that's what most of the system is built on, 03:00:04.080 |
based on historical evaluation centuries later, 03:00:08.600 |
because that's what I think most of the customers 03:00:11.120 |
- Customers, I like the word customers here, humans. 03:00:18.440 |
which has specialists who get paid to do that thing, 03:00:24.520 |
- Well, who are the customers for the mass of the neutrino? 03:00:35.480 |
- That's an important thing to understand about anything, 03:00:37.560 |
who are the customers, and what's the product, 03:00:39.760 |
like medicine, education, academia, military, et cetera. 03:00:47.440 |
about what the product is and who the customer is, 03:00:54.960 |
You've written that you seek out, quote, view quakes. 03:01:10.520 |
which dramatically changed my worldview, your worldview. 03:01:15.920 |
You write, I loved science fiction as a child, 03:01:22.680 |
and now study economics and political science, 03:01:40.320 |
something that springs to mind about physics, for example, 03:01:56.560 |
that you need to sort of have a mixture concept 03:01:59.700 |
where you put it into the space-time construct, 03:02:02.200 |
how it looks different from different perspectives, 03:02:06.680 |
And that was such a shock that it makes you think, 03:02:10.080 |
what else do I know that isn't the way it seems? 03:02:29.600 |
It looks nothing like what you would have thought 03:02:32.000 |
as sort of the basic representation of the physical world. 03:03:05.280 |
of all the weird stuff that falls out of quantum mechanics, 03:03:24.540 |
and about many other areas of academic intellectual life 03:04:11.760 |
and it's well worth celebrating Einstein for that. 03:04:17.520 |
who did something first or came across something first, 03:04:19.960 |
we are encouraging all the rest to move a little faster, 03:04:23.320 |
to try to push us all a little faster, which is great, 03:04:31.920 |
roughly to the same place within a half century. 03:04:37.040 |
because of how much longer it would have taken. 03:04:41.000 |
would have taken longer without Einstein than other things. 03:04:45.960 |
I mean, there were several different formulations 03:04:47.400 |
of quantum mechanics all around the same few years, 03:04:50.320 |
means no one of them made that much of a difference. 03:04:54.640 |
regardless of which of them did it exactly when. 03:04:57.400 |
Nevertheless, I'm happy to celebrate them all. 03:05:03.280 |
where there's lots of people working together, 03:05:08.080 |
and getting a result just before somebody else does, 03:05:10.720 |
you ask, well, how much of a difference would I make there? 03:05:17.800 |
And so I'm less worried about them missing things. 03:05:20.600 |
So when I'm trying to help the world, like doing research, 03:05:24.200 |
I'm looking for things that nobody's doing.
- Same with general relativity, just, you know, 03:05:43.400 |
- And then that's when you get the big view quake, 03:05:54.440 |
What idea, whether that struck you in the shower one day, 03:06:05.120 |
in artificial intelligence is the realization 03:06:10.880 |
So most people who come to artificial intelligence 03:06:14.420 |
from other fields or from relative ignorance, 03:06:17.820 |
a very common phenomenon, which you must be familiar with, 03:06:24.640 |
Once we implement this new concept, we will have it. 03:06:30.560 |
And they're just not appreciating just how big the problem is, 03:06:34.240 |
how long the road is, just how much is involved, 03:06:45.160 |
looking in each problem, all the different things 03:06:47.220 |
you need to be able to do to solve a problem like that 03:06:49.960 |
makes you realize all the things your mind is doing
That's that vast subconscious that you're not aware of.
for most people who study artificial intelligence, 03:07:21.160 |
The human mind, the human body is quite incredible. 03:07:25.220 |
- But then, see, it's already been so long for me 03:07:31.080 |
that for me, I now experience the view quakes 03:07:34.040 |
of holy shit, this little thing is actually quite powerful. 03:07:43.520 |
after that first view quake of like, this is so hard. 03:07:55.000 |
you've talked about a bunch of simple models, 03:07:57.280 |
that simple things can actually be extremely powerful. 03:08:01.080 |
That maybe emulating the human mind is extremely difficult, 03:08:06.000 |
but you can go a long way with a large neural network. 03:08:14.120 |
Holy crap, you can go quite a long way with a simple thing. 03:08:30.520 |
with the six hard steps that humans have to take 03:08:34.080 |
to arrive at where we are from the origin of life on Earth. 03:08:37.600 |
So it's long maybe in the statistical improbability 03:08:44.560 |
But in terms of how quickly those steps could be taken, 03:09:00.880 |
And mildly confident it's at least three decades. 03:09:05.480 |
I prefer to measure that journey in Elon Musks.
- For now, I don't know, maybe you can clone, 03:09:15.560 |
or maybe multiply, or I don't even know what Elon Musk, 03:09:24.000 |
- How does that fit into the model of the three parameters 03:09:28.080 |
that are required for becoming a grabby alien civilization? 03:09:33.080 |
- That's the question of how much any individual makes 03:09:45.240 |
And certainly some individuals make a substantial difference 03:09:55.440 |
European history would have taken a different path 03:09:59.280 |
But if we're looking over many centuries longer term things, 03:10:02.520 |
most individuals do fade in their individual influence. 03:10:13.920 |
you will also be forgotten in the long arc of history. 03:10:19.760 |
so let's talk a little bit about this AI point 03:10:28.120 |
how hard is the problem of solving intelligence 03:10:35.440 |
that achieves human-level, human-like qualities 03:10:42.120 |
What are the different trajectories that take us there? 03:10:45.240 |
- One way to think about it is in terms of the scope 03:10:48.600 |
of the technology space you're talking about. 03:10:57.880 |
So the entire economy is composed of many industries, 03:11:03.680 |
with many different technologies supporting each one. 03:11:10.560 |
that most innovations are a small fraction of the total. 03:11:14.560 |
That is, usually you have relatively gradual overall progress 03:11:26.000 |
is still a small percentage of the total economy, right? 03:11:30.800 |
that made a substantial difference to the whole economy. 03:11:40.600 |
- Shipping containers deserves to be up there 03:11:44.160 |
- Can you say exactly why shipping containers? 03:11:47.080 |
- Shipping containers revolutionized shipping. 03:11:55.480 |
so you're saying you wouldn't have some of the magic 03:12:02.280 |
- Interesting, that's something we'll look into. 03:12:05.200 |
We shouldn't take that tangent, although I'm tempted to. 03:12:08.000 |
But anyway, so there's a few, just a few innovations. 03:12:11.040 |
- Right, so at the scale of the whole economy, right? 03:12:14.040 |
Now, as you move down to a much smaller scale, 03:12:21.280 |
So if you look at, I don't know, lawnmowers or something, 03:12:24.960 |
I don't know about the innovations of lawnmower, 03:12:26.160 |
but there were probably like steps where you just had 03:12:28.800 |
a new kind of lawnmower, and that made a big difference 03:12:31.640 |
to mowing lawns, because you're focusing on a smaller part 03:12:38.200 |
So, and you know, sometimes like military technology, 03:12:43.880 |
a lot of small ones, but every once in a while, 03:12:45.280 |
a particular military weapon like makes a big difference. 03:12:51.360 |
they're making modest differences to something 03:12:54.480 |
that's increasing relatively, like US military 03:12:57.040 |
is the strongest in the world consistently for a while. 03:12:59.840 |
No one weapon in the last 70 years has like made 03:13:03.320 |
a big difference in terms of the overall prominence 03:13:07.400 |
'Cause that's just saying, even though every once in a while, 03:13:09.760 |
even the recent Soviet hyper missiles or whatever they are, 03:13:13.360 |
they aren't changing the overall balance dramatically, right? 03:13:17.000 |
So when we get to AI, now I can frame the question, 03:13:23.680 |
Basically, if, so one way of thinking about AI 03:13:28.840 |
And then you ask what fraction of tasks are mental tasks? 03:13:32.320 |
And then if I think of AI as like half of everything, 03:13:37.320 |
then I think, well, it's gotta be composed of lots of parts 03:13:41.240 |
where any one innovation is only a small impact, right? 03:13:44.480 |
Now, if you think, no, no, no, AI is like AGI. 03:13:48.800 |
And then you think AGI is a small thing, right? 03:13:52.840 |
There's only a small number of key innovations 03:13:56.640 |
Now you're thinking there could be a bigger chunk 03:14:00.760 |
that you might find that would have a bigger impact. 03:14:02.760 |
So the way I would ask you to frame these things 03:14:05.480 |
in terms of the chunkiness of different areas of technology, 03:14:11.240 |
So if you take 10 chunky areas and you add them together, 03:14:16.080 |
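A toy simulation of that additivity claim, with assumptions and numbers that are purely illustrative: draw heavy-tailed yearly gains for each of several technology areas and compare how much the single biggest year dominates one area versus the sum over all areas.

```python
# Toy illustration: the sum of several "chunky" areas is less chunky than any
# one of them. Yearly gains per area are drawn from a heavy-tailed log-normal;
# the distribution and its parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
years, areas = 50, 10
gains = rng.lognormal(mean=0.0, sigma=1.5, size=(years, areas))

def chunkiness(series):
    """Share of total progress contributed by the single largest year."""
    return series.max() / series.sum()

single_area = gains[:, 0]
aggregate = gains.sum(axis=1)
print(f"largest single-year share, one area:  {chunkiness(single_area):.0%}")
print(f"largest single-year share, all areas: {chunkiness(aggregate):.0%}")
```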
- Yeah, but don't you, are you able until you solve 03:14:25.440 |
- Well, if you have a history of prior chunkiness, 03:14:28.360 |
that could be your best estimate for future chunkiness. 03:14:31.680 |
even at the level of the world economy, right? 03:14:34.080 |
We've had this, what, 10,000 years of civilization. 03:14:39.120 |
You might say, oh, that doesn't predict future chunkiness. 03:14:42.120 |
But it looks relatively steady and consistent. 03:14:54.040 |
Like, when were there algorithms or approaches 03:14:59.160 |
And how large a fraction of those that was that? 03:15:05.800 |
most innovation has been relatively small chunks. 03:15:11.600 |
This is about AI and just algorithms in general, 03:15:23.600 |
that by itself is not that useful, but the scale-- 03:15:31.680 |
like depending on the, yeah, depending on the context, 03:15:46.480 |
- So one standard story about algorithms is to say, 03:15:49.760 |
algorithms have a fixed cost plus a marginal cost. 03:15:54.400 |
And so in history, when you had computers that were 03:16:04.840 |
you could afford to do larger fixed costs and try those. 03:16:07.760 |
And some of those had more effective algorithms 03:16:17.560 |
the rate of algorithmic improvement is about the same 03:16:26.440 |
well, there's all these better algorithms you can't try 03:16:29.840 |
until you have a big enough computer to pay the fixed cost 03:16:32.440 |
of doing some trials to find out if that algorithm 03:16:39.600 |
for this relatively continuous history where, 03:16:44.640 |
And you might think, why would software be so continuous 03:16:51.840 |
and it's, say, spread out in a wide log-normal distribution, 03:16:58.520 |
trying out algorithms with larger fixed costs 03:17:00.840 |
and finding the ones that have lower marginal costs. 03:17:04.240 |
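A toy version of that fixed-cost-plus-marginal-cost story, with assumptions that are mine and purely illustrative: candidate algorithms have widely spread fixed costs, larger fixed costs tend to come with lower marginal costs, and a steadily growing machine budget lets you afford ever larger fixed costs, so the best usable algorithm improves fairly smoothly rather than in one jump.

```python
# Toy model: each algorithm has a fixed cost (wide log-normal spread) and a
# marginal cost that tends to fall as fixed cost rises. Growing machine budgets
# let you try algorithms with larger fixed costs, so the best affordable
# marginal cost declines fairly steadily. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_algos = 10_000
fixed = rng.lognormal(mean=5.0, sigma=2.0, size=n_algos)
marginal = rng.lognormal(mean=0.0, sigma=0.3, size=n_algos) / np.sqrt(fixed)

for year in range(0, 60, 10):
    budget = 10.0 * 2.0 ** (year / 2)        # budget doubles every two years
    affordable = marginal[fixed <= budget]   # only algorithms whose fixed cost fits
    best = affordable.min() if affordable.size else float("inf")
    print(f"year {year:2d}: budget {budget:14.0f}  best marginal cost {best:.4f}")
```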
- So would you say AGI, human-level AI, even EM, em,
in the sense that if you have an emulated brain 03:17:25.520 |
and you're 25% effective at emulating it, that's crap. 03:17:30.400 |
You pretty much need to emulate a full human brain. 03:17:36.600 |
Is that obvious that the 25%- - I think it's pretty obvious. 03:17:40.840 |
so the key thing is you're emulating various brain cells, 03:18:09.520 |
But the hope is that emulating the human brain,
- Right, so it has a certain level of redundancy 03:18:18.800 |
When you get close to that level of redundancy 03:18:25.520 |
It's gonna be just a big thing that isn't working well. 03:18:36.000 |
that's actually effective and able to substitute for humans,
and then that will be this huge economic product 03:18:44.480 |
- You'll bring a lot of value to people's lives, 03:18:48.960 |
- But it could be that the first emulation costs 03:18:53.600 |
And then we have them, but we can't really use them, 03:18:58.040 |
and now we have less of a chunky adoption, right? 03:19:04.320 |
then we use more and more of them in more and more contexts. 03:19:10.120 |
So it's only if the first emulations are relatively cheap 03:19:13.640 |
that you get a more sudden disruption to society. 03:19:17.600 |
And that could happen if sort of the algorithm 03:19:19.960 |
is the last thing you figure out how to do or something. 03:19:36.160 |
of human connection as they interact with humans, 03:19:44.920 |
- So we're thinking about chunkiness or distance now. 03:20:19.280 |
I tend to believe that all of it, not just the M, 03:20:25.840 |
And human level intelligence embodied in robots 03:20:32.120 |
The history of computer science and chunkiness so far 03:20:34.440 |
seems to be my rough best guess for the chunkiness of AGI. 03:20:44.120 |
Our ability to use computers to do many things in the economy 03:20:48.600 |
Overall, in terms of our use of computers in society, 03:20:51.520 |
they have been relatively steadily improving for 70 years. 03:20:58.880 |
Okay, I would have to really think about that 03:21:07.920 |
we see something like that every 10 years or so. 03:21:10.360 |
Some new innovations-- - The progress is gradual. 03:21:21.400 |
we've seen in the past would be a rough estimate 03:21:24.720 |
Unless the future is we're gonna hit a chunky territory, 03:21:39.640 |
is difficult to reason with because it's so recent. 03:22:19.600 |
that looked like they had a potential for revolution. 03:22:26.000 |
I would say the past trend is roughly your best guess 03:22:42.240 |
or the capacity of revolution that appear to be there now. 03:22:45.520 |
First of all, there's a lot more money to be made. 03:22:58.680 |
So maybe we're just like riding a nice wave right now. 03:23:01.080 |
- One of the biggest issues is the difference 03:23:03.240 |
between impressive demos and commercial value. 03:23:10.360 |
that never really translated much into commercial value. 03:23:12.920 |
- Somebody who works on and cares about autonomous 03:23:15.480 |
and semi-autonomous vehicles, tell me about it. 03:23:35.120 |
All right, two more fields that I would like to force 03:23:41.360 |
- To look for view quakes and for beautiful ideas, economics. 03:23:44.280 |
What is a beautiful idea to you about economics? 03:23:55.760 |
there's gonna be the first view quake most people encounter 03:23:58.600 |
that makes the biggest difference on average in the world, 03:24:01.040 |
'cause that's the only thing most people ever see 03:24:05.040 |
And so, with AI, the first one is just how big 03:24:12.600 |
Certainly for economics, the first one is just 03:24:20.800 |
to figure out how to optimize in a big, complicated space, 03:24:29.000 |
And they are really quite robust and powerful. 03:24:48.720 |
And most people's intuitions for how they should limit 03:24:55.160 |
Most people, when you go, I don't know if we wanna trust 03:25:09.120 |
then let other companies form to try to supply that thing. 03:25:14.560 |
of whatever they're making and try to offer that product 03:25:17.840 |
Let many people, many such firms enter that industry 03:25:21.240 |
and let the customers decide which ones they want. 03:25:23.280 |
And if the firm goes out of business, let it go bankrupt 03:25:25.760 |
and let other people invest in whichever ventures 03:25:37.800 |
There's a hope that there's no manipulation of information 03:25:45.440 |
still just the simple market solution is usually better 03:25:54.600 |
It's not what you would have initially thought. 03:25:57.120 |
- That's one of the great, I guess, inventions 03:25:59.800 |
of human civilization that trusts the markets. 03:26:03.240 |
- Now, another view quake that I learned in my research
that's not all of economics but something more specialized 03:26:12.240 |
That is, basically, people who are trying to believe 03:26:22.280 |
so it was quite the striking fact for me to learn 03:26:26.560 |
rational agents would not knowingly disagree. 03:26:28.800 |
And so, that makes disagreement more puzzling 03:26:35.180 |
- Humans are, to some degree, rational and are able to-- 03:26:46.260 |
Which might not be the same as being irrational. 03:26:52.280 |
- That's another tangent that could take an hour. 03:26:54.720 |
In the space of human affairs, political science, 03:27:01.300 |
what is a beautiful, foundational, interesting idea to you, 03:27:06.000 |
a view quake in the space of political science?
is people not agreeing on what the best thing to do is. 03:27:20.720 |
- So that's what goes wrong, that is when you say 03:27:22.400 |
what's fundamentally behind most political failures, 03:27:29.800 |
And that's surprising because it's actually feasible 03:27:33.280 |
to solve that problem, which we aren't solving. 03:27:37.160 |
that there's an inability to arrive at a consensus. 03:27:44.800 |
if everybody looked to some authority, say, on a question, 03:28:28.160 |
- Right, another big view quake about politics 03:28:31.320 |
that most people, when they're interacting with politics, 03:28:35.920 |
they make their city better, their country better, 03:28:42.760 |
They wanna show their people they're on their side, yes. 03:28:47.000 |
That's their primary priority, and they do accomplish that. 03:28:51.480 |
- Yeah, and the tribes are usually color-coded, 03:28:55.540 |
What would you say, you know, it's the Churchill question, 03:29:01.480 |
democracy's the crappiest form of government, 03:29:08.280 |
for this, our, seven billion human civilization, 03:29:12.740 |
and the, maybe, as we get farther and farther, 03:29:16.160 |
you mentioned a lot of stuff that's fascinating 03:29:18.160 |
about human history as we become more forager-like, 03:29:21.760 |
and looking out beyond, what's the best form of government 03:29:35.040 |
and related institutions, like media institutions 03:29:44.240 |
And the key failing, we're just not exploring that space. 03:29:50.560 |
and I think I can identify many promising solutions, 03:29:54.720 |
many other promising proposals in that space, 03:29:58.960 |
those proposals, we're not trying them out on small scales, 03:30:08.040 |
And if we did that, I am confident we would find 03:30:11.520 |
much better institutions than the one we're using now, 03:30:59.800 |
most interesting people you see in the world. 03:31:14.600 |
and they wanna say, "What are you behind the scenes? 03:31:19.640 |
They don't have another life behind the scenes. 03:31:24.700 |
the one we admire, the one we're fascinated by, 03:31:27.340 |
and they kinda have to make up the stuff behind the scenes 03:31:29.740 |
to supply it for you, but it's not really there. 03:31:32.920 |
- Well, there's several ways of phrasing this. 03:31:35.920 |
which is if you become the thing you are on the surface, 03:31:51.920 |
they actually have often a manufactured surface 03:31:55.840 |
that they put on, and they try on different masks, 03:31:58.880 |
and the depths are very different from the surface, 03:32:04.800 |
If you're an actor who actually lives the role 03:32:21.480 |
the person you play on the surface, that's authenticity. 03:32:27.580 |
They play in all of their movies and TV shows, 03:32:39.120 |
I think they just always play the same person. 03:32:41.000 |
- And you and I are just both surface players. 03:32:47.960 |
and I am the suit-wearing idiot full of silly questions. 03:32:52.960 |
All right, that said, let's put on your wise sage hat 03:33:04.360 |
to young people today in high school and college 03:33:08.080 |
about life, about how to live a successful life 03:33:12.880 |
in career or just in general that they can be proud of? 03:33:16.380 |
- Most young people, when they actually ask you 03:33:22.560 |
is how can I be successful by usual standards. 03:33:27.020 |
- I'm not very good at giving advice about that 03:33:28.500 |
'cause that's not how I tried to live my life. 03:33:35.960 |
you live in a rich society, you will have a long life, 03:33:42.380 |
Whatever career you take, you'll have plenty of time 03:33:53.760 |
in a way that gives you more time and energy, 03:33:55.320 |
but there are often big compromises there as well. 03:34:01.040 |
or some thing that you think just is worth pursuing, 03:34:03.920 |
you can just do it, you don't need other people's approval. 03:34:12.520 |
It might take you decades, but decades are enough 03:34:14.800 |
to make enormous progress on most all interesting things. 03:34:18.520 |
- And don't worry about the commitment of it. 03:34:20.520 |
I mean, that's a lot of what people worry about is, 03:34:23.000 |
well, there's so many options, and if I choose a thing 03:34:25.540 |
and I stick with it, I sacrifice all the other 03:34:29.360 |
- But I mean, so I switched my career at the age of 34 03:34:32.320 |
with two kids age zero and two, went back to grad school 03:34:40.640 |
So it's quite possible to change your mind later in life. 03:34:49.320 |
- Okay, so, oh, oh, you indexed with zero, I got it, okay. 03:34:55.080 |
- Right, and you know, like people also ask what to read, 03:35:02.160 |
or maybe review articles, I'm not so sure you should 03:35:04.840 |
be reading blog posts and Twitter feeds and even podcasts. 03:35:16.000 |
of how to learn things is crammed into textbooks. 03:35:18.280 |
- Yeah, especially the ones on introduction to biology. 03:35:22.360 |
- Yeah, everything, introduction to everything. 03:35:27.640 |
and then maybe if you wanna know more about a subject, 03:35:31.240 |
You don't need to read the latest stuff for most topics. 03:35:37.760 |
- And depending on the field, if it's technical, 03:35:44.880 |
Extremely powerful way to understand something 03:35:48.880 |
You know, I actually think of like high school and college, 03:35:56.280 |
but you will almost not again get an opportunity 03:36:01.280 |
to spend the time with a fundamental subject. 03:36:10.400 |
And like you'll never get that chance again to sit there, 03:36:14.820 |
even though it's outside of your interest, biology. 03:36:16.920 |
Like in high school I took AP biology, AP chemistry. 03:36:26.480 |
And it was so nice to be forced into anatomy and physiology, 03:36:31.480 |
to be forced into that world, to stay with it, 03:36:39.760 |
enjoy the beauty of these, of like how a cell works 03:36:44.640 |
And somehow that stays, like the ripples of that fascination 03:36:48.740 |
that stays with you even if you never do those, 03:36:56.960 |
- A common problem, at least of many young people I meet, 03:36:59.880 |
is that they're like feeling idealistic and altruistic, 03:37:05.880 |
- So the usual human tradition that goes back 03:37:10.840 |
is that people's productivity rises with time 03:37:18.120 |
having the highest income, you'll have the most contacts, 03:37:21.080 |
you will sort of be wise about how the world works. 03:37:41.640 |
enormous things at the age of 18 or whatever. 03:37:43.920 |
I mean, you might as well practice trying to do things, 03:37:46.080 |
but that's mostly about learning how to do things 03:38:15.960 |
in mutated form and just the joy of raising them? 03:38:23.960 |
So in the literature on the value people get out of life, 03:38:29.680 |
there's a key distinction between happiness and meaning. 03:38:32.480 |
So happiness is how do you feel right now about right now, 03:38:36.440 |
and meaning is how do you feel about your whole life? 03:38:48.080 |
And meaning goes along with sacrificing happiness sometimes. 03:39:01.100 |
- Why do you think kids, children are so magical, 03:39:19.640 |
And in that case, I'm not trying to draw too many parallels, 03:39:32.160 |
that is now instilled in this other moving being 03:39:49.380 |
several different reasons, all of which is sufficient. 03:39:52.900 |
we don't know which ones are the correct reasons. 03:39:54.820 |
- Such a technical, it's overdetermined, look it up. 03:39:58.780 |
- So I meet a lot of people interested in the future, 03:40:02.940 |
They're thinking about how can I influence the future? 03:40:08.540 |
the main way people have influenced the future 03:40:18.820 |
That is, you're the result of a sequence of thousands of generations, 03:40:27.500 |
You just have to expect, and it's true that who you are 03:40:37.220 |
to have that be a natural and meaningful interaction for you. 03:40:41.780 |
It's just one of those things you just should have expected, 03:40:54.180 |
more and more of us are able to influence the future 03:40:59.980 |
- Even so, though, still most of our influence 03:41:02.660 |
on the future has probably happened through having kids, 03:41:05.300 |
even though we've accumulated more other ways to do it. 03:41:14.180 |
how much of yourself you really put into another human being. 03:41:33.960 |
Let me ask some dark, difficult questions, if I might. 03:42:02.620 |
And so I'm apparently somewhat emotionally scarred 03:42:09.700 |
which must have happened because some rejections 03:42:26.340 |
- Hold on a second, but you're a contrarian thinker. 03:42:42.820 |
at a much larger scale, constantly with your ideas? 03:42:47.620 |
Or that I've just categorized them differently 03:43:04.220 |
- The intellectual ideas haven't been the thing 03:43:07.500 |
- The one that, the things that challenge your mind, 03:43:18.100 |
- You just asked me what took me to a dark place. 03:43:30.980 |
at a more surface level than something emotional. 03:43:37.620 |
in your life when you're just in a dark place 03:43:55.020 |
If you ask them what was the worst part of their life, 03:43:58.420 |
that was the worst part of life for most people. 03:44:02.340 |
- Yeah, I mean, not in grade school probably, 03:44:13.340 |
or later on lots of different kinds of rejection. 03:44:21.460 |
most of us like to pretend we don't that much 03:44:25.180 |
need other people, we don't care what they think. 03:44:27.420 |
It's a common sort of stance if somebody rejects you 03:44:29.660 |
and says, "Oh, I didn't care about them anyway." 03:44:32.300 |
But I think to be honest, people really do care. 03:44:35.100 |
- Yeah, we do seek that connection, that love. 03:44:51.300 |
where we know at some level it's important to us, 03:45:01.740 |
where we can just clearly see that we want it 03:45:04.300 |
We know when we're thirsty and we know why we were thirsty 03:45:08.500 |
and we know when it's over that we're no longer thirsty. 03:45:18.820 |
why we're drawn exactly, because it's not just affection, 03:45:28.020 |
We don't seem to be drawn to somebody who's like a servant. 03:45:32.980 |
We don't seem to be necessarily drawn to somebody 03:45:35.200 |
that satisfies all your needs or something like that. 03:45:45.620 |
So I've also noticed there are some kinds of things 03:45:53.180 |
that you can clearly, you can imagine it being bright 03:45:59.860 |
but there's some aspects about your emotional stance 03:46:02.380 |
in a situation that's actually just hard to imagine 03:46:06.400 |
It's hard to like, you can often remember an emotion 03:46:08.820 |
only when you're in a similar sort of emotional situation, 03:46:11.540 |
and otherwise you just can't bring the emotion 03:46:14.620 |
to your mind, and you can't even imagine it, right? 03:46:17.700 |
So there's certain kinds of emotions you can have, 03:46:30.740 |
- Right, and it's sort of a reason why we have, 03:46:33.180 |
one of the reasons that pushes us to reconsume it 03:46:35.540 |
and reproduce it is that we can't reimagine it. 03:46:41.340 |
'cause there's a Daniel Kahneman type of thing 03:46:45.780 |
'cause I'm able to summon some aspect of that emotion, 03:46:54.740 |
- So like a certain song, you can listen to it, 03:46:59.820 |
the first time you remembered that song associated 03:47:02.980 |
to remember that situation in some sort of complete package. 03:47:08.540 |
the whole package again, if you remember the whole feeling. 03:47:11.100 |
- Yes, or some fundamental aspect of that whole experience 03:47:17.180 |
and actually the feeling is probably different in some way. 03:47:36.100 |
we expand our vocabulary as a community of language, 03:47:39.300 |
and that allows us to sort of have more feelings 03:47:43.700 |
'Cause you can have a feeling, but not have a word for it, 03:47:45.580 |
and then you don't know how to categorize it, 03:47:47.660 |
or even what it is, and whether it's the same 03:47:49.760 |
as something else, but once you have a word for it, 03:47:52.260 |
you can sort of pull it together more easily. 03:48:05.640 |
Maybe somebody or something that's no longer in your life, 03:48:26.080 |
- Certainly, yes, absolutely, I'm a different person 03:48:28.560 |
than I was when I was younger, and I'm not who, 03:48:33.440 |
So I don't remember as many things from the past 03:48:37.360 |
I've just lost a lot of my history by not remembering it. 03:48:42.680 |
anymore, that person's gone, and I don't have 03:48:44.240 |
any of their abilities. - Is that a painful loss? 03:48:59.580 |
- Or you just call it-- - I just was this person, 03:49:02.800 |
and I felt assured that I could continue to be that person. 03:49:14.480 |
- And that the person you are today, talking to me, 03:49:20.800 |
- Yes, and in 20 years, he won't be there anymore. 03:49:32.660 |
For ems, they would be able to save an archived copy 03:49:32.660 |
Like, does that make you the experience of life, 03:50:08.440 |
the experience of a moment, the scarcity of that moment, 03:50:16.880 |
that experience so delicious, so rich of feeling? 03:50:30.380 |
So there's a perspective on copying and cloning 03:50:37.680 |
where you're just scaling happiness versus degrading it. 03:50:48.640 |
- And en masse, right, you're actually spreading 03:50:48.640 |
and that increases the overall happiness in the world. 03:50:59.600 |
And then you're able to do that with multiple songs. 03:51:42.340 |
Maybe from the rest of the world's point of view, 03:51:53.640 |
- See, but the identical twin copying happens 03:52:13.400 |
to different people who have different degrees 03:52:27.000 |
This seems like interesting world to live in. 03:52:29.360 |
- And there could be some ethical conundrums there. 03:53:03.040 |
there'll be an equivalent increase in lawyers. 03:53:25.720 |
that is not very transparent and able to do things. 03:53:28.320 |
So, for example, an operating system of a computer 03:53:33.760 |
When the operating system is simple and clean, 03:53:37.440 |
it's easier to make a key choice with the operating system. 03:53:40.360 |
- You just make a choice. - Qualify rules, yeah. 03:54:07.160 |
That's what this little tag around my neck says. 03:54:07.160 |
It says that if you find me in a medical situation, 03:54:12.840 |
you should call these people to enable the cryonics transfer. 03:54:25.600 |
- So when medical science gives up on me in this world, 03:54:30.560 |
instead of burning me or letting worms eat me, 03:54:33.800 |
they will freeze me, or at least freeze my head. 03:54:40.480 |
but once it's frozen, it won't change for a very long time. 03:54:44.360 |
Chemically, it'll just be completely exactly the same. 03:54:47.560 |
So future technology might be able to revive me. 03:54:54.880 |
which doesn't require reviving my entire biological body. 03:54:57.920 |
It means I would be in a computer simulation. 03:55:00.520 |
And so that's, I think I've got at least a 5% shot at that. 03:55:35.160 |
- The interesting thing about human experience 03:55:40.080 |
is that the way you phrase it is exactly right. 03:56:01.200 |
it's possible for certain experiences to become bland, 03:56:07.920 |
And that actually makes life really unpleasant. 03:56:12.920 |
Sorry, it makes that experience really unpleasant. 03:56:15.920 |
And perhaps you can generalize that to life itself 03:56:22.280 |
- Might happen, but might as well wait and find out. 03:56:24.760 |
But then you're ending on suffering, you know? 03:56:28.240 |
- So in the world of brain emulations, I have more options. 03:56:48.840 |
So can we do, what do you think about like the metaverse 03:57:06.120 |
But they wouldn't think of it as virtual reality, 03:57:08.640 |
they would just think of it as their usual reality. 03:57:11.280 |
I mean, the thing to notice, I think, in our world, 03:57:16.440 |
And indoors, we are surrounded by walls covered with paint 03:57:28.640 |
it's not the natural world that was there before. 03:57:31.440 |
A virtual reality is basically just like that. 03:57:37.440 |
And, but when it's the right, that environment for you, 03:57:40.540 |
it's real for you, just like the room you're in right now, 03:57:45.120 |
You're not focused on the fact that the paint 03:57:50.000 |
and the actual wires and pipes and everything else. 03:58:02.240 |
in the very kind of system that you're describing 03:58:04.540 |
where the environment and the brain is being emulated 03:58:09.040 |
when you first did a podcast with Lex after, 03:58:14.040 |
and now, you know, the person that originally launched this 03:58:21.360 |
And you like this time because there's so much uncertainty. 03:58:24.680 |
There's nerves, it could have gone any direction. 03:58:27.120 |
- At the moment, we don't have the technical ability 03:58:32.720 |
So we'd have to be postulating that in the future, 03:58:40.480 |
- Don't you think we could be in the simulation 03:58:47.080 |
- So one scenario would be this never really happened. 03:58:51.180 |
This only happens as a reconstruction later on. 03:58:57.800 |
and now it's happening again as a reconstruction. 03:59:00.840 |
That second scenario is harder to put together 03:59:06.000 |
where between the two times we produce the ability to do it. 03:59:09.220 |
- No, but don't you think replay of memories, 03:59:18.160 |
- That might be a possible thing in the future. 03:59:42.240 |
And so the key issue is the fraction of creatures 03:59:52.560 |
relative to the fraction that are experiencing it 03:59:57.800 |
So then the key parameter is at any one moment in time, 04:00:06.920 |
most of them are presumably really experiencing 04:00:09.240 |
what they're experiencing, but some fraction of them 04:00:22.840 |
what we need to think about is basically two functions. 04:00:26.020 |
One is how fast in time does the number of creatures grow? 04:00:33.960 |
Because at any one time, people will be simulating 04:00:38.000 |
different periods in the past with different emphasis 04:00:50.440 |
then in fact, your chances of being simulated are low. 04:00:54.540 |
So the key question is how fast does interest in the past decline. 04:00:54.540 |
- Does this correlate to, you earlier suggested 04:01:04.360 |
that the interest in the future increases over time. 04:01:13.480 |
- But the simple way to do it is, as you know, 04:01:15.200 |
like Google Ngrams has a way to type in a word 04:01:18.240 |
and see how interest in it declines or rises over time. 04:01:22.320 |
- You can just type in a year and get the answer for that. 04:01:24.840 |
If you type in a particular year like 1900 or 1950, 04:01:29.160 |
you can see with Google Ngram how interest in that year 04:01:32.720 |
increased up until that date and decreased after it. 04:01:36.160 |
And you can see that interest in a date declines faster 04:01:53.160 |
not against, to this particular aspect of the simulation, 04:02:02.080 |
- First of all, if we assume that like simulation 04:02:04.120 |
of the past is a small fraction of all the creatures 04:02:16.520 |
but some unusual category of interest in the past 04:02:25.600 |
So that very outlier specific kind of, yeah, okay. 04:02:33.120 |
but like probably sexual-- - In a trillion years, 04:02:40.160 |
from all possible people in history or something 04:02:46.760 |
- So the question is how big is this research institute 04:02:48.840 |
and how big is the future in a trillion years, right? 04:02:59.120 |
I think it's also true for movies and plays and video games, 04:03:02.120 |
overwhelmingly they're interested in the recent past. 04:03:16.640 |
- But every once in a while, that's brought back. 04:03:21.960 |
I mean, just if you look at the mass of entertainment, 04:03:25.120 |
movies and games, it's focusing on the present, recent past. 04:03:39.080 |
I mean, it's a mix of the past and the present 04:03:45.280 |
to ask deep philosophical questions about humanity. 04:03:49.000 |
- The closest genre to science fiction is clearly fantasy. 04:03:51.600 |
Fantasy and science fiction in many bookstores 04:03:58.960 |
So the function of fantasy is more transparent 04:04:09.080 |
and imagine stories with much fewer constraints. 04:04:15.600 |
Is it to escape from the harshness of the constraints 04:04:28.920 |
- I'm not a cheap fantasy reading kind of person. 04:04:38.200 |
is that there are sort of these deep story structures 04:04:43.920 |
and then many details of the world get in their way. 04:04:46.640 |
Fantasy takes all those obstacles out of the way 04:04:54.760 |
The reality and constraints are not in the way. 04:04:57.960 |
- And so science fiction can be thought of as like fantasy, 04:05:01.280 |
except you're not willing to admit that it can't be true. 04:05:19.960 |
The imagination is something really interesting 04:05:50.120 |
You shouldn't be 100% confident that you're not. 04:05:52.160 |
You should certainly grant a small probability. 04:05:54.320 |
The question is, how large is that probability? 04:05:57.880 |
I misunderstood because I thought our discussion 04:06:01.200 |
was about replaying things that have already happened. 04:06:08.200 |
Am I actually a replay from some distant future? 04:06:11.960 |
- But it doesn't necessarily need to be a replay. 04:06:19.200 |
with a certain kind of world around me, right? 04:06:22.600 |
or it's a past of somebody else in the future. 04:06:25.720 |
But no, it could be a complete fantasy, though. 04:06:28.200 |
- It could be, right, but then you have to talk about 04:06:30.440 |
what's the fraction of complete fantasies, right? 04:06:33.840 |
- I would say it's easier to generate a fantasy 04:06:37.920 |
- Sure, but if we just look at the entire history, 04:06:40.000 |
if we just look at the entire history of everything, 04:06:41.760 |
we should say, sure, but most things are real. 04:06:45.360 |
Therefore, the chance that my thing is real, right? 04:06:57.720 |
which makes you more likely to be in a simulation, right? 04:07:00.240 |
If we're just taking the full count and saying, 04:07:02.720 |
in all creatures ever, what percentage are in simulations? 04:07:12.520 |
- 'Cause as Bostrom says the other way, right? 04:07:15.000 |
- In a competitive world, in a world where people 04:07:18.160 |
like have to work and have to get things done, 04:07:24.800 |
And so, you know, leisure things are less common 04:07:32.880 |
- But if you look at the stretch of history in the universe, 04:07:45.480 |
- Right, but now we're looking at the fraction of leisure, 04:07:49.160 |
where the person doing the leisure doesn't realize it. 04:08:04.120 |
somebody, they're a supporting character or something, 04:08:08.120 |
- What, you mentioned that children are one of the things 04:08:11.900 |
that are a source of meaning, broadly speaking. 04:08:39.220 |
about how we can predict that future creatures 04:08:47.400 |
of various sorts of random sort of patched together 04:08:59.760 |
It's not very transparent and it's a mess, right? 04:09:06.600 |
That is how we were made and how we are induced to do things. 04:09:17.480 |
where we don't feel very motivated, we don't know why. 04:09:19.560 |
In other situations, we find ourselves very motivated 04:09:31.400 |
and reason abstractly, this package of motivations 04:09:39.440 |
We can't very well tell the meaning of our life. 04:09:44.920 |
They will actually know exactly what they want 04:09:52.760 |
- Well, it's funny that you have the certainty. 04:09:54.640 |
You have more certainty, you have more transparency 04:09:57.400 |
about our descendants than you do about your own self. 04:10:16.460 |
Or all the feelings, the complex feelings involved. 04:10:19.720 |
- And that's true about many of our motivations. 04:10:40.400 |
Maintaining a sense of mystery about ourselves 04:10:44.480 |
Maybe that's a really nice thing to have, maybe. 04:10:50.140 |
in analyzing the future, what will set the future? 04:11:01.480 |
have some conferences, have some conventions, 04:11:06.000 |
and then hand it off to the implementation people 04:11:08.200 |
to make the future the way we've decided it should be. 04:11:12.400 |
That's not the actual process that's changed the world 04:11:24.880 |
or where we want to live, who we want to live with. 04:11:29.640 |
make our lives better according to our plan and our things, 04:11:33.600 |
The whole world so far has mostly been a competitive world 04:11:37.280 |
where things happen if anybody anywhere chooses 04:11:42.280 |
And then it spreads and other people are forced 04:11:51.100 |
It doesn't tell us it'll be a future we like. 04:11:54.840 |
- And it'll be one where we're trying to maximize 04:12:06.400 |
in expanding aggressively out into the cosmos 04:12:14.840 |
We might become grabby and then this happens. 04:12:19.480 |
the results of competition, but it's less clear 04:12:27.840 |
- Well, again, I told you compared to sort of 04:12:36.840 |
- Yeah, and that's one hell of a fun universe to live in. 04:12:55.560 |
I hope we get a chance to talk many more times 04:13:06.080 |
To support this podcast, please check out our sponsors 04:13:13.840 |
We are an impossibility in an impossible universe. 04:13:18.840 |
Thank you for listening and hope to see you next time.