
Robin Hanson: Alien Civilizations, UFOs, and the Future of Humanity | Lex Fridman Podcast #292


Chapters

0:00 Introduction
1:52 Grabby aliens
39:36 War and competition
45:10 Global government
58:01 Humanity's future
68:02 Hello aliens
95:06 UFO sightings
119:43 Conspiracy theories
128:01 Elephant in the brain
141:32 Medicine
154:01 Institutions
180:54 Physics
185:46 Artificial intelligence
203:35 Economics
206:56 Political science
212:45 Advice for young people
221:36 Darkest moments
224:37 Love and loss
233:59 Immortality
237:56 Simulation hypothesis
248:13 Meaning of life

Whisper Transcript

00:00:00.000 | we can actually figure out where are the aliens
00:00:02.080 | out there in space time by being clever
00:00:03.840 | about the few things we can see,
00:00:05.280 | one of which is our current date.
00:00:07.120 | And so now that you have this living cosmology,
00:00:10.080 | we can tell the story that the universe starts out empty
00:00:12.960 | and then at some point, things like us appear,
00:00:15.520 | very primitive, and then some of those
00:00:18.240 | stop being quiet and expand.
00:00:20.440 | And then for a few billion years, they expand
00:00:22.360 | and then they meet each other.
00:00:23.580 | And then for the next hundred billion years,
00:00:26.080 | they commune with each other.
00:00:28.300 | That is the usual models of cosmology say
00:00:30.560 | that in roughly 100, 150 billion years,
00:00:34.540 | the expansion of the universe will happen so much
00:00:36.560 | that all you'll have left is some galaxy clusters
00:00:39.600 | and that are sort of disconnected from each other.
00:00:41.780 | But before then, they will interact.
00:00:44.500 | There will be this community
00:00:45.780 | of all the grabby alien civilizations
00:00:48.000 | and each one of them will hear about
00:00:49.340 | and even meet thousands of others.
00:00:51.860 | And we might hope to join them someday
00:00:55.120 | and become part of that community.
00:00:57.000 | (air whooshing)
00:00:59.000 | The following is a conversation with Robin Hanson,
00:01:01.680 | an economist at George Mason University
00:01:04.160 | and one of the most fascinating, wild, fearless,
00:01:06.760 | and fun minds I've ever gotten a chance to accompany
00:01:09.320 | for a time in exploring questions of human nature,
00:01:12.760 | human civilization, and alien life out there
00:01:16.560 | in our impossibly big universe.
00:01:19.240 | He is the co-author of a book titled
00:01:21.680 | "The Elephant in the Brain, Hidden Motives in Everyday Life,
00:01:25.280 | "The Age of M, Work, Love, and Life
00:01:28.040 | "When Robots Rule the Earth,"
00:01:29.800 | and a fascinating recent paper I recommend
00:01:33.040 | on quote, "Grabby Aliens,"
00:01:35.840 | titled "If Loud Aliens Explain Human Earliness,
00:01:39.260 | "Quiet Aliens Are Also Rare."
00:01:41.680 | This is the Lex Fridman Podcast.
00:01:44.480 | To support it, please check out our sponsors
00:01:46.580 | in the description.
00:01:47.960 | And now, dear friends, here's Robin Hanson.
00:01:52.420 | You are working on a book about quote, "Grabby Aliens."
00:01:56.800 | This is a technical term, like the Big Bang.
00:01:59.800 | So what are grabby aliens?
00:02:02.200 | - Grabby aliens expand fast into the universe
00:02:07.160 | and they change stuff.
00:02:08.420 | That's the key concept.
00:02:10.580 | So if they were out there, we would notice.
00:02:14.040 | That's the key idea.
00:02:15.840 | So the question is, where are the grabby aliens?
00:02:19.160 | So Fermi's question is, where are the aliens?
00:02:22.000 | And we could vary that in two terms, right?
00:02:24.580 | Where are the quiet, hard to see aliens
00:02:26.940 | and where are the big, loud grabby aliens?
00:02:30.240 | So it's actually hard to say
00:02:32.680 | where all the quiet ones are, right?
00:02:34.920 | There could be a lot of them out there
00:02:37.000 | 'cause they're not doing much.
00:02:37.880 | They're not making a big difference in the world.
00:02:39.760 | But the grabby aliens, by definition,
00:02:42.240 | are the ones you would see.
00:02:44.100 | We don't know exactly what they do with where they went,
00:02:47.240 | but the idea is they're in some sort of competitive world
00:02:50.000 | where each part of them is trying to grab more stuff
00:02:54.160 | and do something with it.
00:02:56.160 | And almost surely, whatever is the most competitive thing
00:03:00.600 | to do with all the stuff they grab
00:03:03.320 | isn't to leave it alone the way it started, right?
00:03:06.720 | So we humans, when we go around the Earth
00:03:08.640 | and use stuff, we change it.
00:03:10.000 | We turn a forest into a farmland, turn a harbor into a city.
00:03:14.800 | So the idea is aliens would do something with it
00:03:18.100 | and so we're not exactly sure what it would look like,
00:03:20.500 | but it would look different.
00:03:21.340 | So somewhere in the sky, we would see big spheres
00:03:24.460 | of different activity where things had been changed
00:03:27.300 | because they had been there.
00:03:29.020 | - Expanding spheres. - Right.
00:03:30.880 | - So as you expand, you aggressively interact
00:03:33.180 | and change the environment.
00:03:34.540 | So the word grabby versus loud,
00:03:37.700 | you're using them sometimes synonymously, sometimes not.
00:03:40.740 | Grabby to me is a little bit more aggressive.
00:03:45.740 | What does it mean to be loud?
00:03:47.620 | What does it mean to be grabby?
00:03:48.940 | What's the difference?
00:03:49.900 | And loud in what way?
00:03:51.340 | Is it visual, is it sound, is it some other
00:03:54.460 | physical phenomenon like gravitational waves?
00:03:57.180 | What, are you using this kind of
00:03:58.900 | in a broad philosophical sense?
00:04:01.020 | Or there's a specific thing that it means
00:04:04.440 | to be loud in this universe of ours?
00:04:07.380 | - My co-authors and I put together a paper
00:04:09.920 | with a particular mathematical model.
00:04:12.980 | And so we used the term grabby aliens
00:04:15.500 | to describe that more particular model.
00:04:17.700 | And the idea is it's a more particular model
00:04:19.940 | of the general concept of loud.
00:04:21.940 | So loud would just be the general idea
00:04:23.780 | that they would be really obvious.
00:04:25.900 | - So grabby is the technical term,
00:04:27.580 | is it in the title of the paper?
00:04:29.080 | - It's in the body.
00:04:30.080 | The title is actually about loud and quiet.
00:04:33.260 | - Right, loud like that.
00:04:34.100 | - So the idea is there's, you know,
00:04:35.180 | you wanna distinguish your particular model of things
00:04:37.460 | from the general category of things
00:04:39.000 | everybody else might talk about.
00:04:40.180 | So that's how we distinguish.
00:04:41.540 | - The paper title is,
00:04:42.380 | if loud aliens explain human earliness,
00:04:45.180 | quiet aliens are also rare.
00:04:47.400 | If life on Earth, God, this is such a good abstract.
00:04:50.540 | If life on Earth had to achieve--
00:04:53.460 | - N hard.
00:04:54.300 | - N hard steps to reach humanity's level,
00:04:57.300 | then the chance of this event rose as time
00:04:59.860 | to the Nth power.
00:05:00.920 | So we'll talk about power,
00:05:02.220 | we'll talk about linear increase.
00:05:03.940 | So what is the technical definition of grabby?
00:05:10.120 | How do you envision grabbiness?
00:05:12.640 | And why are, in contrast with humans,
00:05:16.760 | why aren't humans grabby?
00:05:18.160 | So like where's that line?
00:05:20.040 | Is it well definable?
00:05:21.400 | What is grabby, what is not grabby?
00:05:23.880 | - We have a mathematical model of the distribution
00:05:26.880 | of advanced civilizations, i.e. aliens, in space and time.
00:05:31.240 | That model has three parameters,
00:05:34.280 | and we can set each one of those parameters from data,
00:05:37.400 | and therefore we claim this is actually what we know about
00:05:41.240 | where they are in space-time.
00:05:43.040 | So the key idea is they appear at some point in space-time,
00:05:46.500 | and then after some short delay, they start expanding,
00:05:51.280 | and they expand at some speed.
00:05:53.460 | And the speed is one of those parameters.
00:05:55.400 | That's one of the three.
00:05:56.760 | And the other two parameters are about
00:05:58.960 | how they appear in time.
00:06:00.200 | That is, they appear at random places,
00:06:02.780 | and they appear in time according to a power law,
00:06:05.800 | and that power law has two parameters,
00:06:07.600 | and we can fit each of those parameters to data.
00:06:10.120 | And so then we can say, now we know.
00:06:13.000 | We know the distribution of advanced civilizations
00:06:15.360 | in space and time.
00:06:16.180 | So we are right now a new civilization,
00:06:19.380 | and we have not yet started to expand.
00:06:21.680 | But plausibly, we would start to do that
00:06:23.600 | within, say, 10 million years of the current moment.
00:06:26.880 | That's plenty of time.
00:06:27.980 | And 10 million years is a really short duration
00:06:30.360 | in the history of the universe.
00:06:31.440 | So we are, at the moment, a sort of random sample
00:06:36.040 | of the kind of times at which an advanced civilization
00:06:38.560 | might appear, because we may or may not become grabby,
00:06:41.280 | but if we do, we'll do it soon.
00:06:42.880 | And so our current date is a sample,
00:06:45.080 | and that gives us one of the other parameters.
00:06:47.280 | The second parameter is the constant
00:06:48.960 | in front of the power law,
00:06:50.020 | and that's derived from our current date.
00:06:52.760 | - So power law, what is the N in the power law?
00:06:58.600 | What is the constant?
00:06:59.880 | - That's a complicated thing to explain.
00:07:02.000 | Advanced life appeared by going through
00:07:04.560 | a sequence of hard steps.
00:07:07.400 | So starting with very simple life,
00:07:09.360 | and here we are at the end of this process
00:07:11.240 | at pretty advanced life,
00:07:12.360 | and so we had to go through some intermediate steps,
00:07:14.760 | such as sexual selection, photosynthesis,
00:07:18.760 | multicellular animals, and the idea is that
00:07:22.280 | each of those steps was hard.
00:07:23.740 | Evolution just took a long time searching
00:07:26.640 | in a big space of possibilities
00:07:28.200 | to find each of those steps,
00:07:30.880 | and the challenge was to achieve all of those steps
00:07:33.720 | by a deadline of when the planets
00:07:36.560 | would no longer host simple life.
00:07:39.400 | And so Earth has been really lucky
00:07:41.920 | compared to all the other billions of planets out there,
00:07:45.040 | and that we managed to achieve all these steps
00:07:47.200 | in the short time of the five billion years
00:07:50.920 | that Earth can support simple life.
00:07:53.960 | - So not all steps, but a lot of them,
00:07:55.800 | 'cause we don't know how many steps there are
00:07:57.480 | before you start the expansion.
00:07:58.800 | So these are all the steps from the birth of life
00:08:02.320 | to the initiation of major expansion.
00:08:05.640 | - Right, so we're pretty sure that
00:08:07.120 | it would happen really soon so that it couldn't be
00:08:10.160 | the same sort of a hard step as the last one,
00:08:12.080 | so in terms of taking a long time.
00:08:13.400 | So when we look at the history of Earth,
00:08:16.920 | we look at the durations of the major things
00:08:19.160 | that have happened, that suggests that there's roughly,
00:08:23.120 | say, six hard steps that happened,
00:08:25.400 | say, between three and 12,
00:08:27.720 | and that we have just achieved the last one
00:08:30.360 | that would take a long time.
00:08:32.360 | - Which is?
00:08:33.960 | - Well, we don't know.
00:08:35.280 | - Oh, okay.
00:08:36.120 | - But whatever it is, we've just achieved the last one.
00:08:38.560 | - Are we talking about humans or aliens here?
00:08:40.400 | So let's talk about some of these steps.
00:08:42.280 | So Earth is really special in some way.
00:08:44.760 | We don't exactly know the level of specialness,
00:08:47.760 | we don't really know which steps were the hardest or not,
00:08:50.920 | because we just have a sample of one,
00:08:53.000 | but you're saying that there's three to 12 steps
00:08:55.720 | that we have to go through to get to where we are
00:08:58.280 | that are hard steps, hard to find by something
00:09:00.440 | that took a long time and is unlikely.
00:09:05.440 | There's a lot of ways to fail.
00:09:07.880 | There's a lot more ways to fail than to succeed.
00:09:10.880 | - The first step would be sort of the very simplest
00:09:13.200 | form of life of any sort,
00:09:14.740 | and then we don't know whether that first sort
00:09:19.360 | is the first sort that we see in the historical record
00:09:21.640 | or not, but then some other steps are, say,
00:09:24.200 | the development of photosynthesis,
00:09:26.240 | the development of sexual reproduction,
00:09:29.480 | there's the development of eukaryotic cells,
00:09:32.160 | which are certain kind of complicated cell
00:09:34.040 | that seems to have only appeared once,
00:09:36.800 | and then there's multicellularity,
00:09:38.840 | that is multiple cells coming together
00:09:40.680 | to large organisms like us.
00:09:42.340 | And in this statistical model of trying to fit
00:09:46.640 | all these steps into a finite window,
00:09:49.380 | the model actually predicts that these steps
00:09:51.620 | could be of varying difficulties,
00:09:53.260 | that is they could each take different amounts
00:09:55.180 | of time on average, but if you're lucky enough
00:09:57.940 | that they all appear at a very short time,
00:10:00.340 | then the durations between them will be roughly equal,
00:10:03.600 | and the time remaining left over in the rest
00:10:06.460 | of the window will also be the same length.
00:10:08.420 | So we at the moment have roughly a billion years
00:10:11.220 | left on Earth until simple life like us
00:10:14.420 | would no longer be possible.
00:10:16.420 | Life appeared roughly 400 million years
00:10:18.820 | after the very first time when life was possible
00:10:20.860 | at the very beginning, so those two numbers right there
00:10:24.260 | give you the rough estimate of six hard steps.
00:10:26.940 | - Just to build up an intuition here,
00:10:28.480 | so we're trying to create a simple mathematical model
00:10:32.700 | of how life emerges and expands in the universe.
00:10:37.700 | And there's a section in this paper,
00:10:39.780 | how many hard steps, question mark.
00:10:41.780 | - Right.
00:10:43.280 | - The two most plausibly diagnostic Earth durations
00:10:45.820 | seem to be the one remaining after now
00:10:47.820 | before Earth becomes uninhabitable for complex life.
00:10:51.080 | So you estimate how long Earth lasts,
00:10:54.380 | how many hard steps, there's windows
00:10:58.380 | for doing different hard steps,
00:11:00.780 | and you can sort of like queuing theory,
00:11:03.460 | mathematically estimate of like the solution
00:11:08.460 | or the passing of the hard steps
00:11:11.420 | or the taking of the hard steps.
00:11:13.220 | Sort of like coldly mathematical look.
00:11:17.020 | If life, pre-expansionary life requires a number of steps,
00:11:22.020 | what is the probability of taking those steps
00:11:26.300 | on an Earth that lasts a billion years
00:11:28.300 | or two billion years or five billion years
00:11:30.140 | or 10 billion years?
00:11:31.740 | And you say solving for E using the observed durations
00:11:36.140 | of 1.1 and 0.4 then gives E values of 3.9 and 12.5,
00:11:41.140 | range 5.7 to 26, suggesting a middle estimate
00:11:46.020 | of at least six, that's where you said six hard steps.
00:11:49.980 | - Right.
00:11:50.820 | - Just to get to where we are.
00:11:53.140 | - Right.
00:11:53.980 | - We started at the bottom, now we're here,
00:11:55.420 | and that took six steps on average.
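A quick sketch of the arithmetic behind the E values quoted from the abstract, under the standard hard-steps result that when all E steps happen to land inside a habitable window of length L, the leftover time after the last step, and each gap between steps, averages about L/(E+1). The window length of roughly 5.4 billion years and the variable names are assumptions of this sketch, consistent with the "five billion years that Earth can support simple life" mentioned earlier; the durations 1.1 and 0.4 are the ones read from the abstract.

```python
# Sketch of the hard-steps estimate quoted above (not the paper's own code).
# Assumption: if E hard steps all succeed inside a habitable window of length
# L, the expected leftover duration after the last step, and each gap between
# steps, is roughly L / (E + 1).  So an observed duration d gives E ~ L/d - 1.

L = 5.4  # assumed habitable window for Earth, in billions of years

for label, d in [("time left before Earth is uninhabitable", 1.1),
                 ("delay from habitability to first life", 0.4)]:
    E = L / d - 1
    print(f"{label}: d = {d} Gyr -> E ~ {E:.1f}")

# Prints E ~ 3.9 and E ~ 12.5, matching the values read from the abstract
# and bracketing the "roughly six hard steps" middle estimate.
```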
00:11:57.780 | The key point is on average,
00:12:00.420 | these things on any one random planet
00:12:02.500 | would take trillions of trillions of years,
00:12:06.300 | just a really long time.
00:12:07.900 | And so we're really lucky that they all happened
00:12:10.100 | really fast in a short time before our window closed.
00:12:14.020 | And the chance of that happening in that short window
00:12:17.460 | goes as that time period to the power
00:12:19.760 | of the number of steps.
00:12:20.740 | And so that was where the power we talked about
00:12:22.780 | before it came from.
00:12:24.300 | And so that means in the history of the universe,
00:12:27.500 | we should overall roughly expect advanced life to appear
00:12:30.780 | as a power law in time.
00:12:32.480 | So that very early on, there was very little chance
00:12:35.900 | of anything appearing, and then later on as things appear,
00:12:38.540 | other things are appearing somewhat closer to them in time
00:12:41.780 | because they're all going as this power law.
00:12:44.260 | - What is a power law?
00:12:45.860 | Can we, for people who are not--
00:12:47.340 | - Sure.
00:12:48.180 | - Math inclined, can you describe what a power law is?
00:12:50.020 | - So, say the function x is linear,
00:12:53.220 | and x squared is quadratic, so it's the power of two.
00:12:57.220 | If we make x to the three, that's cubic,
00:13:00.100 | or the power of three.
00:13:01.420 | And so x to the sixth is the power of six.
00:13:05.020 | And so we'd say life appears in the universe
00:13:08.720 | on a planet like Earth in that proportion to the time
00:13:12.160 | that it's been ready for life to appear.
00:13:17.160 | And that over the universe in general,
00:13:21.120 | it'll appear at roughly a power law like that.
00:13:24.120 | - What is the x, what is n?
00:13:25.780 | Is it the number of hard steps?
00:13:28.160 | - Yes, the number of hard steps.
00:13:29.360 | So that's the idea.
00:13:30.200 | - Okay, so it's like if you're gambling
00:13:32.720 | and you're doubling up every time,
00:13:34.640 | this is the probability you just keep winning.
00:13:37.240 | (laughing)
00:13:39.000 | - So it gets very unlikely very quickly.
00:13:42.920 | And so we're the result of this unlikely chain of successes.
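As an illustrative aside, here is a small calculation of why "all the hard steps inside a short window" scales as the window length raised to the number of steps. The step count, mean, and function name are made-up assumptions for the sketch, not anything from the paper: if each step is an exponential waiting time whose mean is far longer than the window, the chance that all of them fit inside a window of length t is the Erlang CDF, which behaves like t to the power of the number of steps when t is small.

```python
import math

# Illustrative sketch: n_steps sequential hard steps, each an exponential
# waiting time whose mean is far longer than the available window.  The
# chance that their sum fits inside a window of length t is the Erlang CDF,
# computed here from its tail series so tiny probabilities stay accurate.

def p_all_steps_by(t, n_steps=6, mean=1000.0, terms=40):
    """P(sum of n_steps exponential(mean) waits <= t)."""
    x = t / mean
    return math.exp(-x) * sum(x**k / math.factorial(k)
                              for k in range(n_steps, n_steps + terms))

for t in (1.0, 2.0, 4.0):
    print(f"window t = {t}: P = {p_all_steps_by(t):.2e}")

# Doubling the window multiplies the chance by roughly 2**6 = 64,
# i.e. the success probability goes as t to the sixth power:
print("P(2)/P(1) =", p_all_steps_by(2.0) / p_all_steps_by(1.0))
```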
00:13:46.240 | - It's actually a lot like cancer.
00:13:47.760 | So the dominant model of cancer in an organism
00:13:51.080 | like each of us is that we have all these cells,
00:13:53.840 | and in order to become cancerous,
00:13:55.600 | a single cell has to go through a number of mutations.
00:13:58.980 | And these are very unlikely mutations,
00:14:00.800 | and so any one cell is very unlikely
00:14:02.560 | to have all these mutations happen
00:14:04.680 | by the time your lifespan's over.
00:14:06.880 | But we have enough cells in our body
00:14:09.120 | that the chance of any one cell producing cancer
00:14:11.760 | by the end of your life is actually pretty high,
00:14:13.420 | more like 40%.
00:14:15.360 | And so the chance of cancer appearing in your lifetime
00:14:18.360 | also goes as a power law,
00:14:19.520 | this power of the number of mutations that's required
00:14:22.400 | for any one cell in your body to become cancerous.
00:14:24.600 | - So the longer you live,
00:14:26.320 | the more likely you are to have cancer.
00:14:29.240 | - And the power is also roughly six.
00:14:30.980 | That is, the chance of you getting cancer
00:14:33.680 | is roughly the power of six of the time it's been
00:14:37.120 | since you were born.
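A hedged numerical sketch of the cancer analogy: the roughly 40% lifetime figure and the power of six are from the conversation, but the cell count, the calibration age, and the names below are illustrative assumptions, included only to show how steeply a sixth-power risk curve rises with age.

```python
import math

# Illustrative numbers only: calibrate the per-cell chance so the whole-body
# risk is ~40% by age 80 (the figure mentioned above), assuming the per-cell
# chance grows as age to the sixth power and ~3e13 at-risk cells.

N_CELLS = 3e13      # assumed number of at-risk cells (illustrative)
POWER = 6           # number of required mutations ("hard steps" per cell)
TARGET_RISK, TARGET_AGE = 0.40, 80.0

# Per-cell chance by age t is c * t**POWER; pick c to hit the calibration.
c = -math.log(1 - TARGET_RISK) / (N_CELLS * TARGET_AGE**POWER)

def body_risk(age):
    """Chance that at least one cell has completed all the mutations by `age`."""
    return 1 - math.exp(-N_CELLS * c * age**POWER)

for age in (20, 40, 60, 80):
    print(f"age {age}: risk so far ~ {body_risk(age):.1%}")
# The curve stays near zero for decades and then climbs steeply: that is
# what a sixth-power law looks like.
```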
00:14:37.960 | - It is perhaps not lost on people
00:14:40.640 | that you're comparing power laws of the survival
00:14:45.520 | or the arrival of the human species to cancerous cells.
00:14:50.360 | The same mathematical model,
00:14:51.700 | but of course we might have a different value assumption
00:14:55.160 | about the two outcomes.
00:14:56.520 | But of course, from the point of view of cancer,
00:14:58.720 | (both laughing)
00:15:00.000 | it's more similar.
00:15:01.800 | From the point of view of cancer, it's a win-win.
00:15:04.040 | We both get to thrive, I suppose.
00:15:08.000 | It is interesting to take the point of view
00:15:11.240 | of all kinds of life forms on earth,
00:15:13.400 | of viruses, of bacteria.
00:15:15.280 | They have a very different view.
00:15:17.020 | It's like the Instagram channel, Nature is Metal.
00:15:22.600 | The ethic under which nature operates
00:15:25.120 | doesn't often correlate with human morals.
00:15:31.520 | It seems cold and machine-like
00:15:36.120 | in the selection process that it performs.
00:15:38.420 | I am an analyst, I'm a scholar, an intellectual,
00:15:42.400 | and I feel I should carefully distinguish
00:15:45.640 | predicting what's likely to happen
00:15:48.240 | and then evaluating or judging
00:15:50.560 | what I think would be better to happen.
00:15:52.880 | And it's a little dangerous to mix those up too closely
00:15:55.620 | because then we can have wishful thinking.
00:15:58.800 | And so I try typically to just analyze
00:16:01.120 | what seems likely to happen,
00:16:02.440 | regardless of whether I like it
00:16:04.280 | or whether we do anything about it.
00:16:05.880 | And then once you see a rough picture
00:16:08.120 | of what's likely to happen if we do nothing,
00:16:10.540 | then we can ask, well, what might we prefer?
00:16:12.960 | And ask where could the levers be
00:16:14.720 | to move it at least a little toward what we might prefer.
00:16:17.920 | - It's good. - And that's a useful,
00:16:19.480 | but often doing that just analysis
00:16:21.680 | of what's likely to happen if we do nothing
00:16:24.320 | offends many people.
00:16:25.960 | They find that dehumanizing or cold or metal, as you say,
00:16:30.740 | to just say, well, this is what's likely to happen
00:16:33.280 | and it's not your favorite, sorry,
00:16:36.360 | but maybe we can do something,
00:16:38.800 | but maybe we can't do that much.
00:16:41.080 | - This is very interesting, that the cold analysis,
00:16:45.480 | whether it's geopolitics, whether it's medicine,
00:16:50.520 | whether it's economics, sometimes misses
00:16:54.240 | some very specific aspect of human condition.
00:16:59.860 | Like for example, when you look at a doctor
00:17:04.860 | and the act of a doctor helping a single patient,
00:17:09.140 | if you do the analysis of that doctor's time
00:17:12.460 | and cost of the medicine or the surgery
00:17:14.940 | or the transportation of the patient,
00:17:17.160 | this is the Paul Farmer question,
00:17:18.860 | is it worth spending 10, 20, $30,000 on this one patient?
00:17:25.060 | When you look at all the people
00:17:26.460 | that are suffering in the world,
00:17:27.500 | that money could be spent so much better.
00:17:29.760 | And yet, there's something about human nature
00:17:33.300 | that wants to help the person in front of you,
00:17:37.680 | and that is actually the right thing to do,
00:17:40.600 | despite the analysis.
00:17:42.880 | And sometimes when you do the analysis,
00:17:45.040 | there's something about the human mind
00:17:47.840 | that allows you to not take that leap,
00:17:50.320 | that irrational leap to act in this way,
00:17:55.040 | that the analysis explains it away.
00:17:57.200 | Well, it's like, for example, the US government,
00:18:01.180 | the DOT, Department of Transportation,
00:18:04.380 | puts a value of, I think, like $9 million on a human life.
00:18:09.280 | And the moment you put that number on a human life,
00:18:11.700 | you can start thinking, well, okay,
00:18:13.520 | I can start making decisions about this or that
00:18:16.480 | and with a sort of cold economic perspective,
00:18:20.380 | and then you might lose, you might deviate
00:18:24.060 | from a deeper truth of what it means to be human somehow.
00:18:28.260 | So you have to dance, because then if you put too much weight
00:18:32.300 | on the anecdotal evidence on these kinds of human emotions,
00:18:36.880 | then you're going to lose,
00:18:39.140 | you could also probably more likely deviate from truth.
00:18:42.980 | But there's something about that cold analysis.
00:18:45.420 | Like I've been listening to a lot of people
00:18:47.160 | coldly analyze wars, war in Yemen, war in Syria,
00:18:52.820 | Israel-Palestine, war in Ukraine,
00:18:56.420 | and there's something lost when you do a cold analysis
00:18:59.100 | of why something happened.
00:19:00.740 | When you talk about energy,
00:19:02.320 | talking about sort of conflict, competition over resources,
00:19:07.660 | when you talk about geopolitics,
00:19:11.380 | sort of models of geopolitics and why a certain war happened,
00:19:14.540 | you lose something about the suffering that happens.
00:19:18.120 | I don't know.
00:19:19.120 | It's an interesting thing because you're both,
00:19:20.860 | you're exceptionally good at models in all domains,
00:19:25.860 | literally, but also there's a humanity to you.
00:19:31.220 | So it's an interesting dance.
00:19:32.180 | I don't know if you can comment on that dance.
00:19:33.980 | - Sure.
00:19:35.140 | It's definitely true as you say that for many people,
00:19:39.340 | if you are accurate in your judgment of say,
00:19:42.180 | for a medical patient, right?
00:19:43.980 | What's the chance that this treatment might help?
00:19:47.700 | And what's the cost?
00:19:49.780 | And compare those to each other.
00:19:51.600 | And you might say,
00:19:52.720 | this looks like a lot of cost for a small medical gain.
00:19:57.660 | And at that point, knowing that fact,
00:20:01.580 | that might take the air out of your sails.
00:20:05.720 | You might not be willing to do the thing
00:20:08.500 | that maybe you feel is right anyway,
00:20:10.660 | which is still to pay for it.
00:20:12.300 | And then somebody knowing that
00:20:15.980 | might wanna keep that news from you,
00:20:17.540 | not tell you about the low chance of success
00:20:20.060 | or the high cost in order to save you this tension,
00:20:23.540 | this awkward moment where you might fail
00:20:26.420 | to do what they and you think is right.
00:20:28.580 | But I think the higher calling,
00:20:32.540 | the higher standard to hold you to,
00:20:34.900 | which many people can be held to is to say,
00:20:38.660 | I will look at things accurately, I will know the truth,
00:20:41.380 | and then I will also do the right thing with it.
00:20:44.980 | I will be at peace with my judgment
00:20:46.980 | about what the right thing is in terms of the truth.
00:20:49.200 | I don't need to be lied to
00:20:50.500 | in order to figure out what the right thing to do is.
00:20:54.380 | And I think if you do think you need to be lied to
00:20:56.620 | in order to figure out what the right thing to do is,
00:20:59.660 | you're at a great disadvantage
00:21:00.960 | because then people will be lying to you,
00:21:03.500 | you will be lying to yourself,
00:21:04.860 | and you won't be as effective
00:21:08.020 | achieving whatever good you were trying to achieve.
00:21:11.460 | - But getting the data, getting the facts is step one,
00:21:14.780 | not the final step. - Absolutely.
00:21:16.900 | So I would say having a good model,
00:21:20.020 | getting the good data is step one, and it's a burden.
00:21:23.780 | Because you can't just use that data
00:21:28.340 | to arrive at sort of the easy, convenient thing.
00:21:33.340 | You have to really deeply think
00:21:34.900 | about what is the right thing.
00:21:36.980 | You can't use, so the dark aspect of data,
00:21:40.820 | of models, is you can use it
00:21:45.500 | to excuse away actions that aren't ethical.
00:21:49.460 | You can use data to basically excuse away anything.
00:21:52.260 | - But not looking at data lets you--
00:21:54.660 | - Excuse yourself to pretend and think
00:21:57.260 | that you're doing good when you're not.
00:21:59.060 | - Exactly.
00:22:00.540 | But it is a burden.
00:22:01.780 | It doesn't excuse you from still being human
00:22:04.980 | and deeply thinking about what is right.
00:22:07.540 | That very kind of gray area, that very subjective area.
00:22:11.140 | That's part of the human condition.
00:22:15.260 | But let us return for a time to aliens.
00:22:18.340 | So you started to define sort of the model,
00:22:21.620 | the parameters of grabbiness.
00:22:25.260 | - Right.
00:22:26.100 | - Or the, as we approach grabbiness.
00:22:28.300 | So what happens?
00:22:29.420 | - So again, there was three parameters.
00:22:31.140 | - Yes.
00:22:32.060 | - There's the speed at which they expand.
00:22:34.320 | There's the rate at which they appear in time.
00:22:37.420 | And that rate has a constant and a power.
00:22:39.840 | So we've talked about the history of life on Earth
00:22:41.700 | suggests that power is around six, but maybe three to 12.
00:22:45.540 | We can say that constant comes from our current date,
00:22:48.420 | sort of sets the overall rate.
00:22:50.540 | And the speed, which is the last parameter,
00:22:53.260 | comes from the fact that when we look in the sky,
00:22:55.220 | we don't see them.
00:22:56.400 | So the model predicts very strongly
00:22:58.380 | that if they were expanding slowly,
00:22:59.900 | say 1% of the speed of light,
00:23:02.020 | our sky would be full of vast spheres
00:23:04.820 | that were full of activity.
00:23:06.660 | That is, at a random time when a civilization
00:23:09.900 | is first appearing, if it looks out into its sky,
00:23:12.420 | it would see many other
00:23:13.820 | grabby alien civilizations in the sky.
00:23:15.500 | And they would be much bigger than the full moon.
00:23:16.900 | They'd be huge spheres in the sky.
00:23:18.580 | And they would be visibly different.
00:23:20.460 | We don't see them.
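To keep the moving parts straight, here is a compact restatement of the three-parameter model as just described. This is a sketch, not the authors' code, and the class and field names are mine; the rough values are the ones that come up in this conversation: a hard-steps power of about six from Earth's history, an appearance rate on the order of one per million galaxies implied by our early date, and an expansion speed of at least about a third of light speed inferred from the empty sky.

```python
from dataclasses import dataclass

@dataclass
class GrabbyAliensModel:
    """The three free parameters of the grabby-aliens model, each pinned down
    by a different observation, with rough values quoted in this episode."""
    hard_steps_power: float = 6.0        # n: from durations in Earth's history (~3 to 12)
    rate_constant: float = 1e-6          # roughly one civilization per million galaxies,
                                         # set by how early our own date is
    expansion_speed: float = 1.0 / 3.0   # fraction of light speed, a lower bound
                                         # from the fact that the sky looks empty

    def relative_appearance_chance(self, t, t0=13.8):
        """Cumulative chance that a civilization like us has appeared by cosmic
        time t (in Gyr), relative to today (t0), if space stayed empty: it
        scales as time to the power n."""
        return (t / t0) ** self.hard_steps_power

model = GrabbyAliensModel()
print(model.relative_appearance_chance(27.6))  # twice as old -> 2**6 = 64x
```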
00:23:21.300 | - Can we pause for a second?
00:23:22.460 | - Okay.
00:23:23.300 | - There's a bunch of hard steps that Earth had to pass
00:23:27.680 | to arrive at this place we are currently,
00:23:30.220 | which we're starting to launch rockets out into space.
00:23:33.180 | We're kind of starting to expand.
00:23:34.820 | - A bit.
00:23:35.660 | - Very slowly.
00:23:36.620 | - Okay.
00:23:37.660 | - But this is like the birth.
00:23:39.140 | If you look at the entirety of the history of Earth,
00:23:43.180 | we're now at this precipice of expansion.
00:23:46.660 | - We could.
00:23:47.500 | We might not choose to.
00:23:48.320 | But if we do, we will do it in the next 10 million years.
00:23:51.820 | - 10 million, wow.
00:23:53.460 | Time flies when you're having fun.
00:23:55.620 | - I was thinking more like a--
00:23:56.460 | - 10 million is a short time on the cosmological scale.
00:23:58.400 | So that is, it might be only 1,000.
00:24:00.100 | But the point is, even if it's up to 10 million,
00:24:02.220 | that hardly makes any difference to the model.
00:24:03.860 | So I might as well give you 10 million.
00:24:05.500 | - This makes me feel,
00:24:08.020 | I was so stressed about planning what I'm gonna do today.
00:24:10.580 | And now--
00:24:11.420 | - Right, you've got plenty of time.
00:24:12.260 | - Plenty of time.
00:24:13.480 | I just need to be generating some offspring quickly here.
00:24:17.260 | Okay.
00:24:18.100 | So, and there's this moment.
00:24:21.540 | This 10 million year gap or window when we start expanding.
00:24:28.180 | And you're saying, okay,
00:24:29.480 | so this is an interesting moment
00:24:31.540 | where there's a bunch of other alien civilizations
00:24:33.900 | that might, at some history of the universe,
00:24:36.380 | arrived at this moment, we're here.
00:24:38.180 | They passed all the hard steps.
00:24:39.740 | There's a model for how likely it is that that happens.
00:24:44.140 | And then they start expanding.
00:24:45.660 | And you think of an expansion as almost like a sphere.
00:24:48.740 | - Right.
00:24:49.580 | - When you say speed,
00:24:50.860 | we're talking about the speed of the radius growth.
00:24:53.860 | - Exactly, the surface, how fast the surface expands.
00:24:56.540 | - Okay, and so you're saying that there is some speed
00:24:59.580 | for that expansion, average speed.
00:25:01.700 | And then we can play with that parameter.
00:25:04.460 | And if that speed is super slow,
00:25:07.260 | then maybe that explains why we haven't seen anything.
00:25:10.660 | If it's super fast--
00:25:11.700 | - Well, if the slow would create the puzzle,
00:25:14.220 | if slow predicts we would see them, but we don't see them.
00:25:16.700 | - Okay.
00:25:17.540 | - And so the way to explain that is that they're fast.
00:25:19.100 | So the idea is if they're moving really fast,
00:25:21.960 | then we don't see them until they're almost here.
00:25:25.100 | - And okay, this is counterintuitive.
00:25:26.580 | All right, hold on a second.
00:25:27.700 | So I think this works best
00:25:29.380 | when I say a bunch of dumb things.
00:25:31.100 | - Okay.
00:25:33.180 | - And then you elucidate the full complexity
00:25:37.660 | and the beauty of the dumbness.
00:25:39.180 | Okay, so there's these spheres out there in the universe
00:25:44.180 | that are made visible because they're sort of
00:25:47.180 | using a lot of energy.
00:25:48.580 | So they're generating a lot of light.
00:25:49.860 | - Doing stuff, they're changing things.
00:25:51.460 | - They're changing things.
00:25:52.300 | And change would be visible a long way off.
00:25:56.220 | - Yes.
00:25:57.060 | - They would take apart stars, rearrange them,
00:25:58.860 | restructure galaxies, they would just--
00:26:00.540 | - All kinds of fun. - Big, huge stuff.
00:26:02.580 | - Okay, if they're expanding slowly,
00:26:06.060 | we would see a lot of them because the universe is old.
00:26:09.860 | Is old enough to where we would see--
00:26:12.500 | - That is, we're assuming we're just typical,
00:26:15.100 | maybe at the 50th percentile of them.
00:26:16.820 | So like half of them have appeared so far,
00:26:18.820 | the other half will still appear later.
00:26:20.780 | And the math of our best estimate is that
00:26:26.300 | they appear roughly once per million galaxies.
00:26:30.140 | And we would meet them in roughly a billion years
00:26:33.180 | if we expanded out to meet them.
00:26:35.620 | - So we're looking at a Grabby Aliens model, 3D sim.
00:26:39.140 | - Right.
00:26:39.980 | - What's, that's the actual name of the video.
00:26:42.580 | What, by the time we get to 13.8 billion years,
00:26:47.580 | the fun begins.
00:26:49.860 | Okay, so this is, we're watching a three-dimensional sphere
00:26:54.860 | rotating, I presume that's the universe,
00:26:57.500 | and then Grabby Aliens are expanding
00:26:59.660 | and filling that universe--
00:27:01.020 | - Exactly.
00:27:01.860 | - With all kinds of fun.
00:27:04.140 | - Pretty soon it's all full.
00:27:05.540 | - It's full.
00:27:06.360 | So that's how the Grabby Aliens come in contact,
00:27:11.360 | first of all, with other aliens,
00:27:13.340 | and then with us, humans.
00:27:16.380 | The following is a simulation of the Grabby Aliens model
00:27:18.740 | of alien civilizations.
00:27:20.260 | Civilizations are born,
00:27:22.220 | they expand outwards at constant speed.
00:27:24.620 | A spherical region of space is shown.
00:27:27.040 | By the time we get to 13.8 billion years,
00:27:30.180 | this sphere will be about 3,000 times as wide
00:27:33.820 | as the distance from the Milky Way to Andromeda.
00:27:37.520 | Okay, this is fun.
00:27:39.000 | - It's huge.
00:27:39.840 | - Okay, it's huge.
00:27:41.540 | All right, so why don't we see,
00:27:45.560 | we're one little tiny, tiny, tiny, tiny dot
00:27:49.180 | in that giant, giant sphere.
00:27:50.820 | - Right.
00:27:51.660 | - Why don't we see any of the Grabby Aliens?
00:27:55.400 | - Depends on how fast they expand.
00:27:57.520 | So you could see that if they expanded
00:27:59.220 | at the speed of light,
00:28:00.620 | you wouldn't see them until they were here.
00:28:03.260 | So like out there,
00:28:04.100 | if somebody is destroying the universe
00:28:05.480 | with a vacuum decay,
00:28:07.760 | there's this doomsday scenario
00:28:10.760 | where somebody somewhere could change
00:28:13.040 | the vacuum of the universe,
00:28:14.480 | and that would expand at the speed of light
00:28:15.840 | and basically destroy everything it hit.
00:28:17.260 | But you'd never see that until it got here,
00:28:19.320 | 'cause it's expanding at the speed of light.
00:28:21.520 | If you're expanding really slow,
00:28:22.680 | then you see it from a long way off.
00:28:24.420 | So the fact we don't see anything in the sky
00:28:26.720 | tells us they're expanding fast,
00:28:28.800 | say over a third the speed of light,
00:28:30.760 | and that's really, really fast.
00:28:32.520 | But that's what you have to believe
00:28:35.140 | if you look out and you don't see anything.
00:28:37.680 | Now you might say,
00:28:38.520 | well, maybe I just don't wanna believe this whole model.
00:28:40.560 | Why should I believe this whole model at all?
00:28:42.560 | And our best evidence why you should believe this model
00:28:46.000 | is our early date.
00:28:47.280 | We are right now,
00:28:50.120 | almost 14 billion years into the universe,
00:28:52.980 | on a planet around a star
00:28:55.240 | that's roughly five billion years old.
00:28:57.280 | But the average star out there
00:29:00.840 | will last roughly five trillion years.
00:29:04.040 | That is 1,000 times longer.
00:29:06.960 | And remember that power law.
00:29:08.600 | It says that the chance of advanced life
00:29:10.520 | appearing on a planet goes as the sixth power of the time.
00:29:13.900 | So if a planet lasts 1,000 times longer,
00:29:17.160 | then the chance of it appearing on that planet,
00:29:20.120 | if everything would stay empty at least,
00:29:21.760 | is 1,000 to the sixth power, or 10 to the 18.
00:29:25.220 | So enormous, overwhelming chance
00:29:28.900 | that if the universe would just say,
00:29:30.460 | sit and empty and waiting for advanced life to appear,
00:29:33.020 | when it would appear would be way at the end
00:29:36.500 | of all these planet lifetimes.
00:29:39.180 | That is the long planets near the end of the lifetime,
00:29:42.220 | trillions of years into the future.
00:29:44.460 | But we're really early compared to that.
00:29:46.460 | And our explanation is,
00:29:47.820 | at the moment, as you saw in the video,
00:29:49.460 | the universe is filling up in roughly a billion years.
00:29:51.840 | It'll all be full.
00:29:52.880 | And at that point, it's too late
00:29:54.320 | for advanced life to show up.
00:29:55.760 | So you had to show up now before that deadline.
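The earliness arithmetic just described, spelled out with the numbers used above (a sketch; the variable names are mine): if typical stars last about 1,000 times longer than ours and the chance of advanced life having appeared grows as time to the sixth power, an empty universe "should" produce almost all of its advanced life far later than today.

```python
# The earliness argument in the numbers used above: stars lasting ~1,000
# times longer than ours, and an appearance chance growing as time to the
# sixth power.
longer_lived_factor = 1_000
hard_steps_power = 6

relative_chance = longer_lived_factor ** hard_steps_power
print(f"{relative_chance:.1e}")  # 1.0e+18, i.e. a billion billion

# If the universe were going to stay empty, that factor says advanced life
# should overwhelmingly appear trillions of years from now, near the end of
# the longest-lived planets' windows, not 13.8 billion years in.  Our early
# date is the main evidence that something (grabby expansion filling the
# universe) cuts off those late arrivals.
```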
00:29:57.960 | - Okay, can we break that apart a little bit?
00:30:00.240 | Okay, or linger on some of the things you said.
00:30:03.160 | So with the power law, the things we've done on Earth,
00:30:06.120 | the model you have says that it's very unlikely.
00:30:10.480 | Like we're lucky SOBs.
00:30:13.240 | Is that mathematically correct to say?
00:30:15.400 | - We're crazy early.
00:30:18.440 | - That is. - When early means like--
00:30:20.340 | - In the history of the universe.
00:30:21.700 | - In the history, okay, so given this model,
00:30:26.420 | how do we make sense of that?
00:30:28.780 | If we're super, can we just be the lucky ones?
00:30:31.740 | - Well, 10 to the 18 lucky, you know?
00:30:34.020 | How lucky do you feel?
00:30:35.100 | So, you know. (laughs)
00:30:38.020 | That's pretty lucky, right?
00:30:39.860 | 10 to the 18 is a billion billion.
00:30:41.720 | So then if you were just being honest and humble,
00:30:46.100 | that that means, what does that mean?
00:30:49.020 | - Means one of the assumptions that calculated
00:30:50.740 | this crazy early must be wrong.
00:30:52.860 | That's what it means.
00:30:53.700 | So the key assumption we suggest is
00:30:55.500 | that the universe would stay empty.
00:30:57.840 | So most life would appear like 1,000 times longer
00:31:02.060 | later than now if everything would stay empty
00:31:04.780 | waiting for it to appear.
00:31:06.480 | - So what does non-empty mean?
00:31:08.300 | - So the gravity aliens are filling the universe right now.
00:31:10.660 | Roughly at the moment they've filled half of the universe
00:31:13.120 | and they've changed it.
00:31:14.300 | And when they fill everything,
00:31:15.620 | it's too late for stuff like us to appear.
00:31:17.380 | - But wait, hold on a second.
00:31:18.820 | Did anyone help us get lucky?
00:31:23.040 | If it's so difficult, how do, like--
00:31:26.340 | - So it's like cancer, right?
00:31:28.260 | There's all these cells, each of which
00:31:29.940 | randomly does or doesn't get cancer.
00:31:32.340 | And eventually some cell gets cancer
00:31:34.380 | and, you know, we were one of those.
00:31:36.440 | - But hold on a second.
00:31:38.940 | Okay.
00:31:40.020 | But we got it early.
00:31:41.540 | We got it-- - Early compared to
00:31:42.900 | the prediction with an assumption that's wrong.
00:31:46.260 | So that's how we do a lot of, you know,
00:31:48.180 | theoretical analysis.
00:31:49.540 | You have a model that makes a prediction that's wrong,
00:31:51.420 | then that helps you reject that model.
00:31:53.300 | - Okay.
00:31:54.120 | Let's try to understand exactly where the wrong is.
00:31:56.460 | So the assumption is that the universe is empty.
00:31:59.620 | - Stays empty. - Stays empty.
00:32:01.380 | - And waits until this advanced life
00:32:03.780 | appears in trillions of years.
00:32:05.820 | That is, if the universe would just stay empty,
00:32:07.480 | if there was just, you know, nobody else out there,
00:32:10.820 | then when you should expect advanced life to appear,
00:32:14.180 | if you're the only one in the universe,
00:32:15.300 | when should you expect to appear?
00:32:16.580 | You should expect to appear trillions of years
00:32:18.380 | in the future.
00:32:19.620 | - I see.
00:32:20.460 | Right, right.
00:32:21.280 | So this is a very sort of nuanced mathematical assumption.
00:32:25.140 | I don't think we can intuit it cleanly with words.
00:32:29.420 | But if you assume that you're just,
00:32:32.740 | the universe stays empty and you're waiting
00:32:35.220 | for one life civilization to pop up,
00:32:40.700 | then it should happen very late, much later than now.
00:32:44.940 | And if you look at Earth, the way things happen on Earth,
00:32:49.260 | it happened much, much, much, much, much earlier
00:32:51.740 | than it was supposed to according to this model
00:32:53.540 | if you take the initial assumption.
00:32:55.360 | Therefore, you can say, well, the initial assumption
00:32:58.100 | of the universe staying empty is very unlikely.
00:33:00.740 | - Right.
00:33:01.580 | - Okay.
00:33:02.420 | - And the other alternative theory is the universe
00:33:04.900 | is filling up and will fill up soon.
00:33:07.140 | And so we are typical for the origin data
00:33:10.020 | of things that can appear before the deadline.
00:33:12.380 | - Before the deadline.
00:33:13.220 | Okay, it's filling up, so why don't we see anything
00:33:15.460 | if it's filling up?
00:33:16.420 | - Because they're expanding really fast.
00:33:19.060 | - Close to the speed of light.
00:33:19.980 | - Exactly.
00:33:20.820 | - So we will only see it when it's here.
00:33:22.420 | - Almost here.
00:33:23.820 | - Okay.
00:33:24.660 | What are the ways in which we might see a quickly expanding?
00:33:29.940 | - This is both exciting and terrifying.
00:33:32.860 | - It is terrifying.
00:33:33.700 | - It's like watching a truck driving at you
00:33:36.380 | at 100 miles an hour.
00:33:39.220 | - So we would see spheres in the sky,
00:33:41.780 | at least one sphere in the sky, growing very rapidly.
00:33:44.820 | - Like very rapidly.
00:33:48.300 | - Right, yes, very rapidly.
00:33:50.440 | - So there's different, 'cause we were just talking
00:33:55.180 | about 10 million years, this would be--
00:33:57.740 | - You might see it 10 million years in advance coming.
00:34:00.580 | I mean, you still might have a long warning.
00:34:02.660 | Again, the universe is 14 million years old.
00:34:06.100 | The typical origin times of these things
00:34:08.300 | are spread over several billion years.
00:34:10.060 | So the chance of one originating very close to you in time
00:34:13.880 | is very low.
00:34:14.820 | So it still might take millions of years
00:34:18.460 | from the time you see it, from the time it gets here.
00:34:21.540 | You've got a million years to be terrified
00:34:23.620 | of this fast sphere coming at you.
00:34:25.540 | - But coming at you very fast,
00:34:27.300 | so if they're traveling close to the speed of light--
00:34:29.380 | - But they're coming from a long way away.
00:34:31.380 | So remember, the rate at which they appear
00:34:33.620 | is one per million galaxies.
00:34:36.080 | - Right.
00:34:36.920 | - So they're roughly 100 galaxies away.
00:34:40.160 | - I see, so the delta between the speed of light
00:34:43.660 | and their actual travel speed is very important?
00:34:46.740 | - Right, so if they're going at, say,
00:34:48.060 | half the speed of light--
00:34:49.500 | - We'll have a long time.
00:34:50.940 | - Then-- - Yeah.
00:34:52.000 | But what if they're traveling exactly at a speed of light?
00:34:54.740 | Then we see 'em like--
00:34:56.220 | - Then we wouldn't have much warning,
00:34:57.260 | but that's less likely.
00:34:58.500 | Well, we can't exclude it.
00:34:59.820 | - And they could also be somehow traveling fast
00:35:03.060 | in the speed of light.
00:35:04.860 | - Well, I think we can exclude,
00:35:06.380 | because if they could go faster than the speed of light,
00:35:08.480 | then they would just already be everywhere.
00:35:11.020 | So in a universe where you can travel faster
00:35:13.140 | than the speed of light, you can go backwards in space time.
00:35:15.660 | So any time you appeared anywhere in space time,
00:35:17.820 | you could just fill up everything.
00:35:19.620 | - Yeah, and--
00:35:21.300 | - So anybody in the future, whoever appeared,
00:35:23.040 | they would have been here by now.
00:35:24.780 | - Can you exclude the possibility
00:35:26.300 | that those kinds of aliens aren't already here?
00:35:28.660 | - Well, we should have a different discussion of that.
00:35:33.020 | - Right, okay.
00:35:33.860 | Well, let's actually leave that discussion aside
00:35:36.140 | just to linger and understand the Grabby alien expansion,
00:35:39.700 | which is beautiful and fascinating.
00:35:42.220 | Okay.
00:35:43.060 | So there's these giant expanding--
00:35:46.660 | - Spheres.
00:35:47.500 | - Spheres of alien civilizations.
00:35:49.900 | Now, when those spheres collide,
00:35:54.900 | mathematically, it's very likely
00:36:01.020 | that we're not the first collision
00:36:04.020 | of Grabby alien civilizations,
00:36:07.380 | I suppose is one way to say it.
00:36:09.540 | So there's, like, the first time the spheres touch each other
00:36:12.620 | and recognize each other, they meet.
00:36:14.620 | They recognize each other first before they meet.
00:36:19.060 | - They see each other coming.
00:36:20.940 | - They see each other coming.
00:36:21.900 | And then, so there's a bunch of them,
00:36:23.780 | there's a combinatorial thing
00:36:25.300 | where they start seeing each other coming,
00:36:26.860 | and then there's a third neighbor,
00:36:28.220 | it's like, what the hell?
00:36:29.140 | And then there's a fourth one.
00:36:30.420 | Okay, so what does that, you think, look like?
00:36:33.580 | What lessons, from human nature,
00:36:37.300 | that's the only data we have?
00:36:39.140 | What can you draw--
00:36:41.060 | - So the story of the history of the universe here
00:36:43.660 | is what I would call a living cosmology.
00:36:45.700 | So what I'm excited about, in part, by this model,
00:36:49.420 | is that it lets us tell a story of cosmology
00:36:52.020 | where there are actors who have agendas.
00:36:54.820 | So most ancient peoples, they had cosmologies,
00:36:57.860 | the stories they told about where the universe came from
00:36:59.940 | and where it's going and what's happening out there.
00:37:01.540 | And their stories, they like to have agents
00:37:03.340 | and actors, gods or something, out there doing things.
00:37:05.740 | And lately, our favorite cosmology is dead, kind of boring.
00:37:10.740 | We're the only activity we know about or see
00:37:13.340 | and everything else just looks dead and empty.
00:37:15.780 | But this is now telling us, no, that's not quite right.
00:37:19.820 | At the moment, the universe is filling up,
00:37:21.300 | and in a few billion years, it'll be all full.
00:37:24.820 | And from then on, the history of the universe
00:37:26.820 | will be the universe full of aliens.
00:37:29.980 | - Yeah, so that's a really good reminder,
00:37:32.620 | a really good way to think about cosmologies.
00:37:35.300 | We're surrounded by a vast darkness,
00:37:37.660 | and we don't know what's going on in that darkness
00:37:40.740 | until the light from whatever generates light arrives here.
00:37:45.660 | So we kind of, yeah, we look up at the sky,
00:37:48.220 | okay, there's stars, oh, they're pretty.
00:37:50.260 | But you don't think about the giant expanding spheres
00:37:55.260 | of aliens. (laughs)
00:37:56.460 | - Right, 'cause you don't see them.
00:37:57.980 | But now our date, looking at the clock,
00:38:00.820 | if you're clever, the clock tells you.
00:38:02.180 | - So I like the analogy with the ancient Greeks.
00:38:04.020 | So you might think that an ancient Greek
00:38:07.180 | staring at the universe couldn't possibly tell
00:38:09.540 | how far away the sun was, or how far away the moon is,
00:38:12.320 | or how big the earth is.
00:38:13.900 | That all you can see is just big things in the sky
00:38:16.300 | you can't tell.
00:38:17.140 | But they were clever enough, actually,
00:38:18.220 | to be able to figure out the size of the earth
00:38:20.540 | and the distance to the moon and the sun
00:38:22.020 | and the size of the moon and sun.
00:38:24.580 | That is, they could figure those things out, actually,
00:38:26.820 | by being clever enough.
00:38:27.820 | And so similarly, we can actually figure out
00:38:29.860 | where are the aliens out there in space-time
00:38:31.700 | by being clever about the few things we can see,
00:38:33.900 | one of which is our current date.
00:38:35.780 | And so now that you have this living cosmology,
00:38:38.740 | we can tell the story that the universe starts out empty,
00:38:41.600 | and then at some point, things like us appear,
00:38:44.140 | very primitive, and then some of those
00:38:46.860 | stop being quiet and expand.
00:38:49.100 | And then for a few billion years, they expand,
00:38:51.000 | and then they meet each other.
00:38:52.220 | And then for the next 100 billion years,
00:38:54.700 | they commune with each other.
00:38:56.940 | That is, the usual models of cosmology say
00:38:59.180 | that in roughly 100, 150 billion years,
00:39:03.180 | the expansion of the universe will happen so much
00:39:05.200 | that all you'll have left is some galaxy clusters
00:39:08.260 | and that are sort of disconnected from each other.
00:39:10.420 | But before then, for the next 100 billion years,
00:39:15.420 | they will interact.
00:39:17.620 | There will be this community
00:39:18.880 | of all the grabby alien civilizations,
00:39:21.100 | and each one of them will hear about
00:39:22.460 | and even meet thousands of others.
00:39:24.980 | And we might hope to join them someday
00:39:28.220 | and become part of that community.
00:39:29.720 | That's an interesting thing to aspire to.
00:39:32.100 | - Yes, interesting is an interesting word.
00:39:35.960 | Is the universe of alien civilizations defined by war
00:39:40.960 | as much or more than war defined human history?
00:39:45.960 | I would say it's defined by competition,
00:39:52.180 | and then the question is how much competition implies war.
00:39:57.100 | So up until recently, competition defined life on Earth.
00:40:02.100 | Competition between species and organisms and among humans,
00:40:08.580 | competitions among individuals and communities,
00:40:11.580 | and that competition often took the form of war
00:40:13.640 | in the last 10,000 years.
00:40:15.200 | Many people now are hoping or even expecting
00:40:20.380 | to sort of suppress and end competition in human affairs.
00:40:25.340 | They regulate business competition,
00:40:27.340 | they prevent military competition,
00:40:30.460 | and that's a future I think a lot of people
00:40:33.260 | will like to continue and strengthen.
00:40:35.420 | People will like to have something close
00:40:37.020 | to world government or world governance
00:40:39.420 | or at least a world community,
00:40:40.920 | and they will like to suppress war
00:40:42.780 | and any forms of business and personal competition
00:40:45.680 | over the coming centuries.
00:40:46.980 | And they may like that so much
00:40:50.860 | that they prevent interstellar colonization,
00:40:53.180 | which would become the end of that era.
00:40:55.160 | That is, interstellar colonization
00:40:56.780 | would just return severe competition
00:40:59.460 | to human or our descendant affairs.
00:41:01.580 | And many civilizations may prefer that,
00:41:04.060 | and ours may prefer that.
00:41:05.460 | But if they choose to allow interstellar colonization,
00:41:09.100 | they will have chosen to allow competition
00:41:11.640 | to return with great force.
00:41:12.900 | That is, there's really not much of a way
00:41:14.700 | to centrally govern a rapidly expanding
00:41:17.380 | sphere of civilization.
00:41:19.220 | And so I think one of the most solid things
00:41:22.600 | we can predict about grabby aliens
00:41:23.940 | is they have accepted competition,
00:41:26.700 | and they have internal competition,
00:41:28.900 | and therefore they have the potential for competition
00:41:31.840 | when they meet each other at the borders.
00:41:33.360 | But whether that's military competition
00:41:36.300 | is more of an open question.
00:41:38.160 | - So military meaning physically destructive.
00:41:41.440 | - Right.
00:41:42.280 | - So there's a lot to say there.
00:41:47.700 | So one idea that you kind of proposed
00:41:51.040 | is progress might be maximized through competition,
00:41:56.040 | through some kind of healthy competition,
00:42:00.640 | some definition of healthy.
00:42:02.360 | So like constructive, not destructive competition.
00:42:06.480 | So like we would likely,
00:42:09.160 | Grabby alien civilizations would be likely defined
00:42:12.720 | by competition 'cause they can expand faster.
00:42:15.200 | Because competition allows innovation
00:42:18.380 | and sort of the battle of ideas.
00:42:19.840 | - The way I would take the logic is to say,
00:42:22.080 | competition just happens if you can't coordinate to stop it.
00:42:27.520 | And you probably can't coordinate to stop it
00:42:30.080 | in an expanding interstellar wave.
00:42:32.320 | So competition is a fundamental force in the universe.
00:42:37.320 | - It has been so far,
00:42:39.100 | and it would be within an expanding
00:42:42.080 | Grabby alien civilization.
00:42:43.260 | But we today have the chance, many people think and hope,
00:42:47.240 | of greatly controlling and limiting competition
00:42:50.120 | within our civilization for a while.
00:42:52.880 | And that's an interesting choice.
00:42:55.640 | Whether to allow competition to sort of regain
00:42:59.680 | its full force or whether to suppress and manage it.
00:43:03.940 | - Well, one of the open questions that has been raised
00:43:08.940 | in the past less than 100 years
00:43:13.460 | is whether our desire to lessen the destructive nature
00:43:18.380 | of competition or the destructive kind of competition
00:43:22.300 | will be outpaced by the destructive power of our weapons.
00:43:27.200 | Sort of if nuclear weapons and weapons of that kind
00:43:33.720 | become more destructive than our desire for peace,
00:43:40.660 | then all it takes is one asshole at the party
00:43:43.340 | to ruin the party.
00:43:45.580 | - It takes one asshole to make a delay,
00:43:48.140 | but not that much of a delay
00:43:49.860 | on the cosmological scales we're talking about.
00:43:52.500 | So even a vast nuclear war,
00:43:56.200 | if it happened here right now on Earth,
00:43:58.620 | it would not kill all humans.
00:44:01.880 | It certainly wouldn't kill all life.
00:44:05.380 | And so human civilization would return
00:44:07.980 | within 100,000 years.
00:44:10.220 | So all the history of atrocities,
00:44:14.260 | and if you look at the Black Plague,
00:44:18.260 | which isn't even a human-caused atrocity or whatever.
00:44:26.540 | - There are a lot of military atrocities in history,
00:44:29.300 | absolutely. - In the 20th century.
00:44:30.660 | Those are challenges to how we think about human nature,
00:44:36.620 | but the cosmic scale of time and space,
00:44:40.380 | they do not stop the human spirit, essentially.
00:44:44.580 | Humanity goes on.
00:44:46.980 | Through all the atrocities, it goes on.
00:44:49.500 | Life goes on. - Most likely.
00:44:50.640 | So even a nuclear war isn't enough to destroy us
00:44:53.900 | or to stop our potential from expanding,
00:44:57.340 | but we could institute a regime of global governance
00:45:02.340 | that limited competition,
00:45:04.060 | including military and business competition of sorts,
00:45:06.940 | and that could prevent our expansion.
00:45:09.660 | Of course, to play devil's advocate,
00:45:12.660 | global governance is centralized power,
00:45:20.380 | and power corrupts, and absolute power corrupts absolutely.
00:45:25.540 | One of the aspects of competition
00:45:27.980 | that's been very productive
00:45:30.020 | is not letting any one person, any one country,
00:45:35.020 | any one center of power become absolutely powerful,
00:45:39.980 | because that's another lesson: power seems to corrupt.
00:45:43.380 | There's something about ego and the human mind
00:45:45.540 | that seems to be corrupted by power,
00:45:47.660 | so when you say global governance,
00:45:50.360 | that terrifies me more than the possibility of war,
00:45:55.360 | because it's--
00:45:57.980 | - I think people will be less terrified
00:45:59.940 | than you are right now,
00:46:01.380 | and let me try to paint the picture from their point of view.
00:46:04.300 | This isn't my point of view,
00:46:05.540 | but I think it's going to be a widely shared point of view.
00:46:08.100 | - Yes, this is two devil's advocates arguing, two devils.
00:46:11.020 | - Okay, so for the last half century
00:46:15.060 | and into the continuing future,
00:46:16.900 | we actually have had a strong elite global community
00:46:21.900 | that shares a lot of values and beliefs
00:46:24.980 | and has created a lot of convergence in global policy.
00:46:29.940 | So if you look at electromagnetic spectrum
00:46:32.180 | or medical experiments or pandemic policy
00:46:36.020 | or nuclear power energy or regulating airplanes
00:46:39.980 | or just in a wide range of area,
00:46:41.980 | in fact, the world has very similar regulations
00:46:45.460 | and rules everywhere,
00:46:46.940 | and it's not a coincidence
00:46:48.220 | because they are part of a world community
00:46:50.820 | where people get together at places like Davos, et cetera,
00:46:53.540 | where world elites want to be respected
00:46:57.260 | by other world elites,
00:46:58.580 | and they have a convergence of opinion,
00:47:02.380 | and that produces something like global governance,
00:47:06.180 | but without a global center.
00:47:08.140 | And this is sort of what human mobs
00:47:10.020 | or communities have done for a long time.
00:47:11.780 | That is, humans can coordinate together on shared behavior
00:47:14.740 | without a center by having gossip and reputation
00:47:18.780 | within a community of elites.
00:47:21.020 | And that is what we have been doing
00:47:22.740 | and are likely to do a lot more of.
00:47:24.860 | So for example, one of the things that's happening,
00:47:27.980 | say, with the war in Ukraine
00:47:29.180 | is that this world community of elites
00:47:31.460 | has decided that they disapprove of the Russian invasion
00:47:35.020 | and they are coordinating to pull resources together
00:47:38.180 | from all around the world in order to oppose it,
00:47:40.540 | and they are proud of that,
00:47:42.140 | sharing that opinion in there,
00:47:45.020 | and they feel that they are morally justified
00:47:47.700 | in their stance there.
00:47:49.900 | And that's the kind of event
00:47:53.020 | that actually brings world elite communities together,
00:47:55.500 | where they come together and they push a particular policy
00:47:59.660 | and position that they share and that they achieve successes.
00:48:02.580 | And the same sort of passion animates global elites
00:48:05.500 | with respect to, say, global warming,
00:48:07.620 | or global poverty, and other sorts of things.
00:48:09.820 | And they are, in fact, making progress
00:48:12.420 | on those sorts of things
00:48:13.660 | through shared global community of elites.
00:48:18.500 | And in some sense,
00:48:19.700 | they are slowly walking toward global governance,
00:48:22.220 | slowly strengthening various world institutions
00:48:25.180 | of governance, but cautiously, carefully,
00:48:27.980 | watching out for the possibility
00:48:29.700 | of a single power that might corrupt it.
00:48:32.000 | I think a lot of people over the coming centuries
00:48:35.180 | will look at that history and like it.
00:48:37.080 | - It's an interesting thought,
00:48:41.260 | and thank you for playing that devil's advocate there.
00:48:45.820 | But I think the elites too easily lose touch
00:48:50.100 | with the morals, with the best of human nature,
00:48:55.900 | and power corrupts.
00:48:57.260 | - Sure, but-- - And everything you just said.
00:48:58.100 | - If their view is the one that determines what happens,
00:49:01.140 | their view may still end up there,
00:49:04.220 | even if you or I might criticize it
00:49:06.500 | from that point of view, so.
00:49:07.780 | - From a perspective of minimizing human suffering,
00:49:11.020 | elites can use topics of the war in Ukraine
00:49:15.820 | and climate change and all of those things
00:49:19.700 | to sell an idea to the world
00:49:23.820 | with disregard to the amount of suffering
00:49:30.000 | their actual actions cause.
00:49:32.220 | So like you can tell all kinds of narratives,
00:49:34.500 | that's the way propaganda works.
00:49:36.540 | Hitler really sold the idea
00:49:39.960 | that everything Germany is doing is either,
00:49:42.300 | it's the victim, is defending itself
00:49:44.420 | against the cruelty of the world,
00:49:46.320 | and it's actually trying to bring about a better world.
00:49:50.320 | So every power center thinks they're doing good.
00:49:54.420 | And so this is the positive of competition,
00:49:59.420 | of having multiple power centers.
00:50:02.260 | This kind of gathering of elites
00:50:04.380 | makes me very, very, very nervous.
00:50:08.520 | The dinners, the meetings in the closed rooms.
00:50:13.520 | I don't know.
00:50:14.820 | But remember we talked about separating our cold analysis
00:50:19.880 | of what's likely or possible from what we prefer,
00:50:22.580 | and so this is exactly a time for that.
00:50:25.020 | We might say, I would recommend we don't go this route
00:50:28.980 | of a strong world governance,
00:50:30.900 | and because I would say it'll preclude this possibility
00:50:35.100 | of becoming grabby aliens,
00:50:36.360 | of filling the nearest million galaxies
00:50:39.500 | for the next billion years with vast amounts of activity,
00:50:43.780 | and interest, and value of life out there.
00:50:47.080 | That's the thing we would lose
00:50:49.020 | by deciding that we wouldn't expand,
00:50:51.300 | that we would stay here
00:50:52.480 | and keep our comfortable shared governance.
00:50:55.980 | - So you, wait, you think that global governance
00:51:04.900 | makes it more likely or less likely
00:51:07.060 | that we expand out into the universe?
00:51:08.780 | - Less.
00:51:10.140 | - So okay.
00:51:10.980 | - This is the key point.
00:51:12.500 | - Great, right, so screw the elites.
00:51:15.560 | (laughing)
00:51:16.400 | - Right. - So if we want to,
00:51:17.500 | wait, do we want to expand?
00:51:19.400 | - So again, I want to separate my neutral analysis
00:51:23.360 | from my evaluation and say,
00:51:25.940 | first of all, I have an analysis that tells us
00:51:28.260 | this is a key choice that we will face,
00:51:30.340 | and that it's a key choice
00:51:31.260 | other aliens have faced out there.
00:51:33.220 | And it could be that only one in 10
00:51:34.660 | or one in 100 civilizations chooses to expand,
00:51:37.660 | and the rest of them stay quiet.
00:51:39.420 | And that's how it goes out there.
00:51:40.860 | And we face that choice too.
00:51:42.720 | And it'll happen sometime in the next 10 million years,
00:51:46.740 | maybe the next thousand.
00:51:48.180 | But the key thing to notice from our point of view
00:51:50.660 | is that even though you might like our global governance,
00:51:54.380 | you might like the fact that we've come together,
00:51:56.140 | we no longer have massive wars,
00:51:58.040 | and we no longer have destructive competition,
00:52:00.460 | and that we could continue that.
00:52:04.200 | The cost of continuing that would be
00:52:05.820 | to prevent interstellar colonization.
00:52:07.980 | That is, once you allow interstellar colonization,
00:52:10.300 | then you've lost control of those colonies,
00:52:12.620 | and whatever they change into,
00:52:14.420 | they could come back here and compete with you back here
00:52:18.040 | as a result of having lost control.
00:52:19.980 | And I think if people value that global governance
00:52:23.700 | and global community and regulation
00:52:26.420 | and all the things it can do enough,
00:52:28.120 | they would then want to prevent interstellar colonization.
00:52:31.660 | - I want to have a conversation with those people.
00:52:33.860 | I believe that both for humanity, for the good of humanity,
00:52:38.860 | for what I believe is good in humanity,
00:52:41.060 | and for expansion, exploration, innovation,
00:52:46.060 | distributing the centers of power is very beneficial.
00:52:50.000 | So this whole meeting of elites,
00:52:51.420 | and I've been very fortunate to meet
00:52:55.040 | quite a large number of elites,
00:52:56.780 | they make me nervous.
00:53:00.200 | Because it's easy to lose touch of reality.
00:53:05.200 | I'm nervous about that in myself,
00:53:09.220 | to make sure that you never lose touch
00:53:11.680 | as you get sort of older, wiser,
00:53:17.600 | you know how you generally get disrespectful of kids,
00:53:20.600 | kids these days.
00:53:21.920 | No, the kids are--
00:53:23.240 | - Okay, but I think you should hear a stronger case
00:53:25.920 | for their position, so I'm gonna play that.
00:53:27.880 | - For the elites.
00:53:29.120 | - Yes, well, for the limiting of expansion,
00:53:33.680 | and for the regulation of behavior.
00:53:37.040 | - Just, okay, can I linger on that?
00:53:39.240 | So you're saying those two are connected.
00:53:41.740 | So the human civilization and alien civilizations
00:53:45.040 | come to a crossroads, they have to decide,
00:53:48.840 | do we want to expand or not?
00:53:51.240 | And connected to that, do we want to give a lot of power
00:53:54.880 | to a central elite, or do we want to
00:53:58.680 | distribute the power centers,
00:54:02.240 | which is naturally connected to the expansion?
00:54:05.600 | When you expand, you distribute the power.
00:54:08.100 | - If, say, over the next thousand years,
00:54:11.280 | we fill up the solar system, right?
00:54:13.200 | We go out from Earth and we colonize Mars
00:54:15.440 | and we change a lot of things.
00:54:17.440 | Within a solar system, still, everything is within reach.
00:54:20.320 | That is, if there's a rebellious colony around Neptune,
00:54:22.800 | you can throw rocks at it and smash it,
00:54:24.480 | and teach them discipline, okay?
00:54:27.480 | - How does that work for the British?
00:54:29.080 | - A central control over the solar system is feasible.
00:54:32.100 | But once you let it escape the solar system,
00:54:35.100 | it's no longer feasible.
00:54:36.040 | But if you have a solar system
00:54:37.760 | that doesn't have a central control,
00:54:39.120 | maybe broken into a thousand different political units
00:54:41.720 | in the solar system, then if any one part of that
00:54:45.400 | allows interstellar colonization, it happens.
00:54:48.000 | That is, interstellar colonization happens
00:54:50.720 | when only one party chooses to do it,
00:54:53.160 | and is able to do it, and therefore it happens.
00:54:55.920 | So we can just say, in a world of competition,
00:54:58.760 | if interstellar colonization is possible, it will happen,
00:55:01.320 | and then competition will continue.
00:55:02.800 | And that will sort of ensure the continuation
00:55:04.720 | of competition into the indefinite future.
00:55:07.680 | - And competition, we don't know,
00:55:10.080 | but competition can take violent forms,
00:55:11.960 | or can take productive forms. - In many forms.
00:55:13.480 | And the case I was going to make is that,
00:55:15.440 | I think one of the things that most scares people
00:55:17.400 | about competition is not just that it creates
00:55:19.800 | holocausts and death on massive scales,
00:55:22.880 | is that it's likely to change who we are,
00:55:27.320 | and what we value. - Yes.
00:55:29.880 | So this is the other thing with power.
00:55:32.800 | As we grow, as human civilization grows,
00:55:37.280 | becomes multi-planetary, multi-solar system, potentially,
00:55:41.600 | how does that change us, do you think?
00:55:43.600 | - I think the more you think about it,
00:55:45.800 | the more you realize it can change us a lot.
00:55:48.040 | So, first of all, I would say--
00:55:49.480 | - This is pretty dark, by the way.
00:55:50.800 | - Well, it's-- - It's just honest.
00:55:53.640 | - Right, well, I'm trying to get you there.
00:55:55.040 | I think the first thing you should say,
00:55:56.040 | if you look at history, just human history
00:55:58.040 | over the last 10,000 years,
00:55:59.920 | if you really understood what people were like
00:56:02.320 | a long time ago, you'd realize
00:56:03.640 | they were really quite different.
00:56:05.440 | Ancient cultures created people
00:56:08.000 | who were really quite different.
00:56:08.960 | Most historical fiction lies to you about that.
00:56:11.680 | It often offers you modern characters in an ancient world.
00:56:14.640 | But if you actually study history,
00:56:16.960 | you will see just how different they were,
00:56:18.560 | and how differently they thought.
00:56:20.640 | And they've changed a lot, many times,
00:56:23.600 | and they've changed a lot across time.
00:56:25.240 | So I think the most obvious prediction about the future is,
00:56:28.160 | even if you only have the mechanisms of change
00:56:30.440 | we've seen in the past, you should still expect
00:56:32.360 | a lot of change in the future.
00:56:33.960 | But we have a lot bigger mechanisms for change
00:56:36.480 | in the future than we had in the past.
00:56:39.040 | So, I have this book called "The Age of Em:
00:56:42.560 | Work, Love, and Life When Robots Rule the Earth,"
00:56:44.840 | and it's about what happens
00:56:46.160 | if brain emulations become possible.
00:56:48.160 | So a brain emulation is where you take
00:56:49.840 | an actual human brain and you scan it
00:56:52.160 | in fine spatial and chemical detail
00:56:53.880 | to create a computer simulation of that brain.
00:56:57.560 | And then those computer simulations of brains
00:57:00.320 | are basically citizens in a new world.
00:57:02.280 | They work and they vote and they fall in love
00:57:04.720 | and they get mad and they lie to each other.
00:57:06.960 | And this is a whole new world.
00:57:08.200 | And my book is about analyzing how that world
00:57:10.480 | is different than our world,
00:57:12.760 | basically using competition as my key lever of analysis.
00:57:15.800 | That is, if that world remains competitive,
00:57:18.040 | then I can figure out how they change in that world,
00:57:20.600 | what they do differently than we do.
00:57:22.400 | And it's very different.
00:57:24.920 | And it's different in ways that are shocking sometimes
00:57:28.560 | to many people and ways some people don't like.
00:57:31.600 | I think it's an okay world,
00:57:32.760 | but I have to admit it's quite different.
00:57:34.720 | And that's just one technology.
00:57:38.080 | If we add dozens more technologies,
00:57:41.120 | changes into the future, we should just expect
00:57:44.600 | it's possible to become very different than who we are.
00:57:47.520 | I mean, in the space of all possible minds,
00:57:49.880 | our minds are a particular architecture,
00:57:52.000 | a particular structure, a particular set of habits,
00:57:55.040 | and they are only one piece in a vast space of possibilities.
00:57:59.240 | The space of possible minds is really huge.
00:58:01.960 | - So yeah, let's linger on the space of possible minds
00:58:05.680 | for a moment just to sort of humble ourselves
00:58:09.120 | how peculiar our peculiarities are.
00:58:15.640 | Like the fact that we like a particular kind of sex
00:58:20.040 | and the fact that we eat food through one hole
00:58:23.600 | and poop through another hole.
00:58:26.920 | And that seems to be a fundamental aspect of life,
00:58:29.960 | is very important to us.
00:58:31.480 | And that life is finite in a certain kind of way.
00:58:36.880 | We have a meat vehicle.
00:58:38.920 | So death is very important to us.
00:58:41.000 | I wonder which aspects are fundamental
00:58:43.320 | or would be common throughout human history
00:58:46.320 | and also throughout, sorry,
00:58:48.360 | throughout history of life on Earth
00:58:50.520 | and throughout other kinds of lives.
00:58:53.160 | Like what is really useful?
00:58:55.040 | You mentioned competition,
00:58:56.200 | seems to be a one fundamental thing.
00:58:57.800 | - I've tried to do analysis
00:58:59.800 | of where our distant descendants might go
00:59:02.560 | in terms of what are robust features
00:59:04.200 | we could predict about our descendants.
00:59:06.120 | So again, I have this analysis
00:59:07.800 | of sort of the next generation,
00:59:09.720 | so the next era after ours.
00:59:11.400 | If you think of human history
00:59:12.400 | as having three eras so far, right?
00:59:14.720 | There was the forager era,
00:59:15.960 | the farmer era and the industry era.
00:59:18.080 | Then my attempt in The Age of Em
00:59:19.640 | is to analyze the next era after that.
00:59:21.360 | And it's very different,
00:59:22.280 | but of course there could be
00:59:23.520 | more and more eras after that.
00:59:25.320 | So, analyzing a particular scenario
00:59:27.880 | and thinking it through is one way
00:59:29.200 | to try to see how different the future could be,
00:59:31.920 | but that doesn't give you some sort of like sense
00:59:34.320 | of what's typical.
00:59:35.240 | But I have tried to analyze what's typical.
00:59:39.120 | And so I have two predictions,
00:59:40.840 | I think I can make pretty solidly.
00:59:42.960 | One thing is that we know at the moment
00:59:45.840 | that humans discount the future rapidly.
00:59:49.040 | So, we discount the future
00:59:51.280 | in terms of caring about consequences
00:59:53.200 | roughly a factor of two per generation.
00:59:55.280 | And there's a solid evolutionary analysis
00:59:57.680 | why sexual creatures would do that.
00:59:59.920 | 'Cause basically your descendants
01:00:01.480 | only share half of your genes
01:00:03.240 | and your descendants are a generation away.
01:00:05.280 | - So we only care about our grandchildren.
01:00:08.560 | - Basically that's a factor of four
01:00:11.040 | because it's two generations later.
01:00:13.360 | So, this actually explains typical interest rates
01:00:16.200 | in the economy,
01:00:17.040 | that is interest rates are greatly influenced
01:00:19.360 | by our discount rates.
01:00:20.440 | And we basically discount the future
01:00:22.680 | by a factor of two per generation.
01:00:24.360 | But that's a side effect of the way
01:00:29.480 | our preferences evolved as sexually selected creatures.
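A rough sketch of the arithmetic here, assuming a 30-year generation length (an assumption for illustration, not a number from the conversation): halving concern once per generation puts a weight of one quarter on grandchildren and implies an annual discount rate of roughly 2%, in the ballpark of long-run real interest rates.

```python
# Sketch of "discount the future by a factor of two per generation".
# The 30-year generation length is illustrative, not stated in the transcript.

GENERATION_YEARS = 30

def future_weight(generations: int, per_generation_factor: float = 0.5) -> float:
    """Weight placed on consequences that are `generations` generations away."""
    return per_generation_factor ** generations

print(future_weight(1))   # children: 0.5
print(future_weight(2))   # grandchildren: 0.25, the "factor of four" above

# Annual discount rate implied by halving concern once per generation.
annual_factor = 0.5 ** (1 / GENERATION_YEARS)
print(f"implied annual discount rate: {1 - annual_factor:.1%}")  # about 2.3% per year
```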
01:00:33.640 | We should expect that in the longer run
01:00:35.920 | creatures will evolve who don't discount the future.
01:00:39.320 | They will care about the long run
01:00:41.440 | and they will therefore not neglect the long run.
01:00:43.560 | So for example, for things like global warming
01:00:45.240 | or things like that,
01:00:46.840 | at the moment many commenters are sad
01:00:49.520 | that basically ordinary people don't seem to care much,
01:00:51.520 | market prices don't seem to care much
01:00:53.080 | and for most ordinary people,
01:00:54.320 | it doesn't really impact them much
01:00:55.600 | because humans don't care much about the long-term future.
01:00:59.240 | And futurists find it hard to motivate people
01:01:03.360 | and to engage people about the long-term future
01:01:05.200 | because they just don't care that much.
01:01:07.280 | But that's a side effect of this particular way
01:01:09.720 | that our preferences evolved about the future.
01:01:12.920 | And so in the future, they will neglect the future less.
01:01:15.960 | And that's an interesting thing
01:01:17.560 | that we can predict robustly.
01:01:18.800 | Eventually, maybe a few centuries, maybe longer,
01:01:22.480 | eventually our descendants will care about the future.
01:01:25.920 | - Can you speak to the intuition behind that?
01:01:27.960 | Is it useful to think more about the future?
01:01:32.160 | - Right, if evolution rewards creatures
01:01:35.000 | for having many descendants,
01:01:37.000 | then if you have decisions that influence
01:01:39.720 | how many descendants you have,
01:01:41.480 | then that would be good if you made those decisions.
01:01:43.640 | But in order to do that, you'll have to care about them.
01:01:45.840 | You'll have to care about that future.
01:01:47.440 | - So to push back,
01:01:48.760 | that's if you're trying to maximize the number of descendants
01:01:52.040 | but the nice thing about not caring too much
01:01:54.680 | about the long-term future
01:01:56.520 | is you're more likely to take big risks
01:01:58.640 | or you're less risk-averse.
01:02:01.000 | And it's possible that both evolution
01:02:04.640 | and just life in the universe
01:02:07.640 | rewards the risk-takers.
01:02:11.560 | - Well, we actually have analysis
01:02:13.160 | of the ideal risk preferences too.
01:02:16.240 | So there's literature on ideal preferences
01:02:19.760 | that evolution should promote.
01:02:21.440 | And for example, there's literature
01:02:22.800 | on competing investment funds
01:02:24.600 | and what the managers of those funds should care about
01:02:27.320 | in terms of various kinds of risks
01:02:29.680 | and in terms of discounting.
01:02:30.960 | And so managers of investment funds
01:02:33.440 | should basically have logarithmic risk preferences
01:02:37.440 | in shared risk, in correlated risk,
01:02:41.680 | but be very risk-neutral with respect to uncorrelated risk.
01:02:46.040 | So that's a feature that's predicted to happen
01:02:50.400 | about individual personal choices in biology
01:02:54.240 | and also for investment funds.
01:02:55.280 | So that's other things.
01:02:56.120 | That's also something we can say about the long run.
01:02:58.240 | - What's correlated and uncorrelated risk?
01:03:00.840 | - If there's something that would affect
01:03:03.480 | all of your descendants,
01:03:04.840 | then if you take that risk,
01:03:08.440 | you might have more descendants,
01:03:09.920 | but you might have zero.
01:03:11.560 | And that's just really bad to have zero descendants.
01:03:14.920 | But an uncorrelated risk would be a risk
01:03:16.840 | that some of your descendants would suffer,
01:03:18.520 | but others wouldn't.
01:03:19.600 | And then you have a portfolio of descendants.
01:03:22.040 | And so that portfolio ensures you against problems
01:03:25.840 | with any one of them.
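To make the correlated-versus-uncorrelated distinction concrete, here is a small illustrative simulation under assumed payoff numbers, not the analyses being cited: one lineage repeatedly takes a single shared gamble, the other spreads the same gamble independently across many descendant branches.

```python
# Correlated vs. uncorrelated risk for a lineage of descendants (illustrative sketch).
import random

random.seed(0)

def lineage_growth(correlated: bool, generations: int = 200, branches: int = 1000,
                   win: float = 1.6, lose: float = 0.5, p: float = 0.5) -> float:
    """Lineage size after repeated gambles with arithmetic mean 1.05 per step."""
    size = 1.0
    for _ in range(generations):
        if correlated:
            # Every descendant shares one outcome: growth follows the geometric
            # mean, sqrt(1.6 * 0.5) ~ 0.89, so the lineage tends toward extinction.
            size *= win if random.random() < p else lose
        else:
            # Each branch draws independently: growth tracks the arithmetic
            # mean, ~1.05, so the portfolio of descendants compounds upward.
            size *= sum(win if random.random() < p else lose
                        for _ in range(branches)) / branches
    return size

print("correlated gamble:  ", lineage_growth(correlated=True))
print("uncorrelated gamble:", lineage_growth(correlated=False))
```

With these numbers a long-run descendant-maximizer should refuse the shared gamble but accept the independent one, which matches the pattern described above: effectively logarithmic in correlated risk, nearly risk-neutral in uncorrelated risk.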
01:03:26.680 | - I like the idea of portfolio of descendants.
01:03:28.640 | And we'll talk about portfolios with your idea
01:03:31.440 | that you briefly mentioned.
01:03:32.840 | We'll return there with Em, the age of Em,
01:03:36.600 | work, love, and life when robots rule the earth.
01:03:39.600 | Em, by the way, is emulated minds.
01:03:41.680 | So this, one of the--
01:03:44.360 | - Em is short for emulations.
01:03:46.320 | - Em is short for emulations,
01:03:47.720 | and it's kind of an idea of how we might
01:03:50.160 | create artificial minds, artificial copies of minds,
01:03:53.480 | or human-like intelligences.
01:03:57.040 | - I have another dramatic prediction I can make
01:03:59.320 | about long-term preferences.
01:04:00.720 | - Yes.
01:04:01.560 | - Which is at the moment, we reproduce as the result
01:04:05.080 | of a hodgepodge of preferences
01:04:07.000 | that aren't very well integrated,
01:04:09.120 | but sort of in our ancestral environment
01:04:11.040 | induced us to reproduce.
01:04:12.360 | So we have preferences over being sleepy,
01:04:14.960 | and hungry, and thirsty, and wanting to have sex,
01:04:17.560 | and wanting excitement, et cetera, right?
01:04:21.160 | And so in our ancestral environment,
01:04:23.120 | the packages of preferences that we evolved to have
01:04:25.920 | did induce us to have more descendants.
01:04:29.240 | That's why we're here.
01:04:31.160 | But those packages of preferences are not a robust way
01:04:35.000 | to promote having more descendants.
01:04:36.640 | They were tied to our ancestral environment,
01:04:39.280 | which is no longer true.
01:04:40.120 | So that's one of the reasons we are now having
01:04:41.780 | a big fertility decline,
01:04:43.680 | because in our current environment,
01:04:45.480 | our ancestral preferences are not inducing us
01:04:47.760 | to have a lot of kids,
01:04:48.840 | which is, from evolution's point of view, a big mistake.
01:04:52.440 | We can predict that in the longer run,
01:04:55.020 | there will arise creatures who just abstractly know
01:04:58.600 | that what they want is more descendants.
01:05:00.920 | That's a very robust way to have more descendants
01:05:04.040 | is to have that as your direct preference.
01:05:05.920 | - First of all, your thinking is so clear.
01:05:08.480 | I love it.
01:05:09.480 | So mathematical, and thank you for thinking so clearly with me
01:05:14.480 | and bearing with my interruptions
01:05:16.360 | and going on the tangents when we go there.
01:05:20.560 | So you're just clearly saying that successful,
01:05:23.800 | long-term civilizations will prefer to have descendants,
01:05:28.800 | more descendants.
01:05:30.160 | - Not just prefer, consciously and abstractly prefer.
01:05:33.500 | That is, it won't be the indirect consequence
01:05:36.480 | of other preferences.
01:05:37.320 | It will just be the thing they know they want.
01:05:40.040 | - There'll be a president in the future that says,
01:05:42.320 | "We must have more sex."
01:05:44.800 | - We must have more descendants
01:05:45.920 | and do whatever it takes to do that.
01:05:47.720 | - Whatever.
01:05:48.960 | - We must go to the moon and do the other things.
01:05:51.160 | - Right.
01:05:52.000 | - Not because they're easy, but because they're hard.
01:05:53.760 | But instead of the moon, let's have lots of sex.
01:05:55.620 | Okay, but there's a lot of ways to have descendants, right?
01:05:59.040 | - Right, but so that's the whole point.
01:06:00.280 | When the world gets more complicated
01:06:02.120 | and there are many possible strategies,
01:06:03.760 | it's having that as your abstract preference
01:06:05.720 | that will force you to think through those possibilities
01:06:08.440 | and pick the one that's most effective.
01:06:10.040 | - So just to clarify, descendants doesn't necessarily mean
01:06:14.400 | the narrow definition of descendants,
01:06:16.140 | meaning humans having sex and then having babies.
01:06:18.640 | - Exactly.
01:06:19.460 | - You can have artificial intelligence systems
01:06:21.600 | that in whom you instill some capability of cognition
01:06:26.600 | and perhaps even consciousness.
01:06:29.280 | You can also create through genetics and biology
01:06:31.560 | clones of yourself or slightly modified clones,
01:06:35.680 | thousands of them.
01:06:36.760 | - Right.
01:06:38.400 | - So all kinds of descendants.
01:06:40.080 | It could be descendants in the space of ideas too,
01:06:43.320 | for somehow we no longer exist in this meat vehicle.
01:06:46.720 | It's now just like whatever the definition of a life form is,
01:06:51.600 | you have descendants of those life forms.
01:06:54.420 | - Yes, and they will be thoughtful about that.
01:06:56.680 | They will have thought about what counts as a descendant
01:06:59.760 | and that'll be important to them to have the right concept.
01:07:02.360 | - So the they there is very interesting, who the they are.
01:07:06.080 | - But the key thing is we're making predictions
01:07:08.320 | that I think are somewhat robust
01:07:09.560 | about what our distant descendants will be like.
01:07:11.800 | Another thing I think you would automatically accept
01:07:14.020 | is they will almost entirely be artificial.
01:07:16.480 | And I think that would be the obvious prediction
01:07:18.000 | about any aliens we would meet.
01:07:19.520 | That is, they would long since have given up
01:07:22.400 | reproducing biologically.
01:07:24.220 | - Well, it's all, it's like organic or something.
01:07:27.220 | It's all real and it's--
01:07:28.800 | - It might be squishy and made out of hydrocarbons,
01:07:31.360 | but it would be artificial in the sense of made in factories
01:07:33.920 | with designs on CAD things, right?
01:07:36.040 | Factories with scale economies.
01:07:37.240 | So the factories we have made on Earth today
01:07:39.400 | have much larger scale economies
01:07:40.960 | than the factories in our cells.
01:07:42.160 | So the factories in our cells are marvels,
01:07:44.920 | but they don't achieve very many scale economies.
01:07:46.840 | They're tiny little factories.
01:07:48.120 | - But they're all factories.
01:07:49.160 | - Yes.
01:07:50.000 | - Factories on top of factories.
01:07:50.820 | So everything, the factories on top--
01:07:53.320 | - But the factories that are designed
01:07:54.520 | is different than sort of the factories that have evolved.
01:07:58.000 | - I think the nature of the word design
01:08:00.160 | is very interesting to uncover there.
01:08:02.320 | But let me, in terms of aliens,
01:08:05.280 | let me go, let me analyze your Twitter like it's Shakespeare.
01:08:09.500 | - Okay.
01:08:10.340 | - There's a tweet that says,
01:08:12.360 | define "hello" in quotes, alien civilizations
01:08:15.520 | as one that might, in the next million years,
01:08:17.960 | identify humans as intelligent and civilized,
01:08:21.020 | travel to Earth and say, "Hello,"
01:08:24.140 | by making their presence and advanced abilities known to us.
01:08:27.560 | The next 15 polls, this is a Twitter thread,
01:08:30.480 | the next 15 polls ask about such "hello" aliens.
01:08:34.320 | And what these polls ask your Twitter followers is
01:08:38.320 | what they think those aliens will be like.
01:08:41.720 | Certain particular qualities.
01:08:43.640 | So poll number one is, what percent of "hello" aliens
01:08:47.840 | evolved from biological species with two main genders?
01:08:51.320 | And, you know, the popular vote is above 80%.
01:08:56.320 | So most of them have two genders.
01:08:58.640 | What do you think about that?
01:08:59.760 | I'll ask you about some of these
01:09:00.760 | 'cause they're so interesting.
01:09:01.600 | It's such an interesting question.
01:09:02.440 | - It is a fun set of questions.
01:09:03.640 | - Yes, a fun set of questions.
01:09:04.900 | So the genders as we look through evolutionary history,
01:09:08.040 | what's the usefulness of that?
01:09:09.560 | As opposed to having just one or like millions.
01:09:13.680 | - So there's a question in evolution of life on Earth,
01:09:16.520 | there are very few species that have more than two genders.
01:09:19.840 | There are some, but they aren't very many.
01:09:22.240 | But there's an enormous number of species
01:09:24.200 | that do have two genders, much more than one.
01:09:27.040 | And so there's literature on why did multiple genders evolve
01:09:32.040 | and then sort of what's the point of having males
01:09:34.640 | and females versus hermaphrodites.
01:09:36.440 | So most plants are hermaphrodites.
01:09:38.960 | That is they have, they would mate male, female,
01:09:42.840 | but each plant can be either role.
01:09:45.840 | And then most animals have chosen to split
01:09:48.600 | into males and females.
01:09:50.440 | And then they're differentiating the two genders.
01:09:53.000 | And there's an interesting set of questions
01:09:55.440 | about why that happens.
01:09:56.560 | - 'Cause you can do selection.
01:09:58.520 | You basically have like one gender competes
01:10:03.040 | for the affection of the other, and there's sexual partnership
01:10:06.560 | that creates the offspring.
01:10:07.640 | So there's sexual selection.
01:10:08.880 | It's like at a party,
01:10:12.440 | it's nice to have dance partners.
01:10:13.880 | And then each one gets to choose
01:10:15.880 | based on certain characteristics.
01:10:17.440 | And that's an efficient mechanism
01:10:19.960 | for adapting to the environment,
01:10:21.400 | being successfully adapted to the environment.
01:10:24.440 | - It does look like there's an advantage.
01:10:27.080 | If you have males, then the males can take higher variance.
01:10:30.680 | And so there can be stronger selection among the males
01:10:32.800 | in terms of weeding out genetic mutations
01:10:34.960 | because the males have higher variance
01:10:37.400 | in their mating success.
01:10:39.120 | - Sure, okay.
01:10:40.640 | Question number two, what percent of hello aliens
01:10:43.760 | evolved from land animals as opposed to plants
01:10:46.360 | or ocean/air organisms?
01:10:51.280 | By the way, I did recently see that
01:10:55.400 | only 10% of species on Earth are in the ocean.
01:11:00.400 | So there's a lot more variety on land.
01:11:04.480 | - There is.
01:11:05.600 | - It's interesting.
01:11:06.520 | So why is that?
01:11:07.440 | I don't even, I can't even intuit exactly why that would be.
01:11:11.040 | Maybe survival on land is harder,
01:11:13.720 | and so you get a lot--
01:11:14.560 | - So the story that I understand is it's about small niches.
01:11:18.120 | So speciation can be promoted
01:11:21.240 | by having multiple different species.
01:11:23.240 | So in the ocean, species are larger.
01:11:25.600 | That is, there are more creatures in each species
01:11:28.720 | because the ocean environments don't vary as much.
01:11:31.320 | So if you're good in one place,
01:11:32.500 | you're good in many other places.
01:11:34.400 | But on land, and especially in rivers,
01:11:36.480 | rivers contain an enormous percentage
01:11:38.360 | of the kinds of species on land, you see,
01:11:42.640 | because they vary so much from place to place.
01:11:46.480 | And so a species can be good in one place,
01:11:48.560 | and then other species can't really compete
01:11:50.400 | because they came from a different place
01:11:52.400 | where things are different.
01:11:53.240 | So it's a remarkable fact, actually,
01:11:57.200 | that speciation promotes evolution in the long run.
01:12:00.360 | That is, more evolution has happened on land
01:12:02.700 | because there have been more species on land
01:12:05.200 | because each species has been smaller.
01:12:07.700 | And that's actually a warning about something called rot
01:12:10.920 | that I've thought a lot about,
01:12:12.200 | which is one of the problems with even a world government,
01:12:15.080 | which is large systems of software today
01:12:17.240 | just consistently rot and decay with time
01:12:19.720 | and have to be replaced.
01:12:20.720 | And that plausibly also is a problem
01:12:23.000 | for other large systems, including biological systems,
01:12:25.360 | legal systems, regulatory systems.
01:12:27.760 | And it seems like large species
01:12:30.880 | actually don't evolve as effectively as small ones do.
01:12:34.280 | And that's an important thing to notice.
01:12:38.040 | And that's actually, that's different
01:12:40.120 | from ordinary sort of evolution in economies on Earth
01:12:44.760 | in the last few centuries, say.
01:12:46.460 | On Earth, the more technical evolution
01:12:50.040 | and economic growth happens
01:12:51.120 | in larger integrated cities and nations.
01:12:54.520 | But in biology, it's the other way around.
01:12:56.240 | More evolution happened in the fragmented species.
01:12:59.560 | - Yeah, it's such a nuanced discussion
01:13:01.840 | 'cause you can also push back in terms of nations
01:13:04.480 | and at least companies.
01:13:06.400 | It's like large companies seem to evolve less effectively.
01:13:10.560 | There is something that, you know,
01:13:13.400 | they have more resources, but
01:13:15.860 | they don't even have better resilience.
01:13:18.720 | And when you look at the scale of decades and centuries,
01:13:22.520 | it seems like a lot of large companies die.
01:13:25.480 | - But still large economies do better.
01:13:27.480 | Like large cities grow better than small cities.
01:13:30.400 | Large integrated economies like the United States
01:13:32.440 | or the European Union do better than small fragmented ones.
01:13:35.320 | So even-- - Yeah, sure.
01:13:37.240 | That's a very interesting long discussion.
01:13:40.320 | But so most of the people, and obviously votes on Twitter
01:13:43.720 | represent the absolute objective truth of things.
01:13:48.440 | So most, but an interesting question about oceans is that,
01:13:51.200 | okay, remember I told you about how most planets
01:13:53.320 | would last for trillions of years and be later, right?
01:13:56.640 | So people have tried to explain why life appeared on earth
01:13:59.560 | by saying, oh, all those planets are gonna be unqualified
01:14:02.440 | for life because of various problems.
01:14:04.040 | That is, they're around smaller stars, which lasts longer,
01:14:06.400 | and smaller stars have some things like more solar flares,
01:14:09.280 | maybe more tidal locking.
01:14:10.640 | But almost all of these problems with longer lived planets
01:14:14.640 | aren't problems for ocean worlds.
01:14:16.760 | And a large fraction of planets out there are ocean worlds.
01:14:20.320 | So if life can appear on an ocean world,
01:14:22.560 | then that pretty much ensures
01:14:25.120 | that these planets that last a very long time
01:14:29.280 | could have advanced life because most,
01:14:31.640 | you know, there's a huge fraction of ocean worlds.
01:14:33.080 | - So that's actually an open question.
01:14:34.560 | So when you say, sorry, when you say life appear,
01:14:38.800 | you're kind of saying life and intelligent life.
01:14:42.000 | So like, so that's an open question.
01:14:46.360 | Is land, and that's I suppose the question
01:14:48.960 | behind the Twitter poll,
01:14:52.440 | which is a grabby alien civilization
01:14:55.280 | that comes to say hello.
01:14:57.440 | What's the chance that they first began their early steps,
01:15:02.120 | the difficult steps they took on land?
01:15:04.780 | What do you think?
01:15:07.920 | 80%, most people on Twitter think it's very likely.
01:15:14.040 | - Right. - What do you think?
01:15:15.000 | - I think people are discounting ocean worlds too much.
01:15:18.040 | That is, I think people tend to assume
01:15:20.080 | that whatever we did must be the only way
01:15:22.360 | it's possible, and I think people aren't giving enough
01:15:24.320 | credit for other possible paths, but.
01:15:26.320 | - Dolphins, water world, by the way,
01:15:28.840 | people criticize that movie.
01:15:30.160 | I love that movie.
01:15:31.040 | Kevin Costner can do me no wrong.
01:15:33.080 | Okay, next question.
01:15:34.440 | What percent of hello aliens once had a nuclear war
01:15:38.760 | with greater than 10 nukes fired in anger?
01:15:43.380 | So not out of incompetence or as an accident.
01:15:47.760 | - Intentional firing of nukes,
01:15:49.860 | and less than 20% was the most popular vote.
01:15:54.360 | - That just seems wrong to me.
01:15:56.420 | - So like, I wonder what, so most people think
01:15:59.920 | once you get nukes, we're not gonna fire them.
01:16:02.560 | They believe in the power of the game theory.
01:16:05.880 | - I think they're assuming that if you had a nuclear war,
01:16:07.640 | then that would just end civilization for good.
01:16:09.540 | I think that's the thinking.
01:16:10.840 | - That's the main thing.
01:16:11.680 | - Right, and I think that's just wrong.
01:16:12.880 | I think you could rise again after a nuclear war.
01:16:15.320 | It might take 10,000 years or 100,000 years,
01:16:17.640 | but it could rise again.
01:16:19.040 | - So what do you think about mutually assured destruction
01:16:21.720 | as a force to prevent people from firing nuclear weapons?
01:16:25.880 | That's a question that, you know, to a terrifying degree
01:16:29.840 | has been raised now with what's going on.
01:16:32.180 | - Well, I mean, clearly it has had an effect.
01:16:34.760 | The question is just how strong an effect for how long?
01:16:37.880 | I mean, clearly we have not gone wild with nuclear war,
01:16:41.680 | and clearly the devastation that you would get
01:16:44.360 | if you initiated a nuclear war is part of the reasons
01:16:46.440 | people have been reluctant to start a war.
01:16:48.120 | The question is just how reliably
01:16:50.800 | will that ensure the absence of a war?
01:16:52.960 | - Yeah, the night is still young.
01:16:54.560 | - Exactly.
01:16:55.400 | - So it's been 70 years or whatever it's been.
01:16:57.760 | I mean, but what do you think?
01:17:02.120 | Do you think we'll see nuclear war in the century?
01:17:07.040 | - I don't know if in the century,
01:17:07.960 | but it's the sort of thing
01:17:10.560 | that's likely to happen eventually.
01:17:12.960 | - That's a very loose statement.
01:17:14.400 | Okay, I understand.
01:17:15.600 | Now this is where I pull you out of your mathematical model
01:17:18.560 | and ask a human question.
01:17:20.440 | Do you think, this particular human question--
01:17:22.920 | - I think we've been lucky that it hasn't happened so far.
01:17:25.460 | - But what is the nature of nuclear war?
01:17:27.160 | Let's think about this.
01:17:29.040 | There is dictators, there's democracies,
01:17:32.960 | miscommunication, how did war start?
01:17:39.320 | World War I, World War II.
01:17:41.080 | - So the biggest datum here is that we've had
01:17:43.600 | an enormous decline in major war over the last century.
01:17:46.960 | So that has to be taken into account.
01:17:48.640 | Now, so the problem is,
01:17:51.720 | war is a process that has a very long tail.
01:17:54.760 | That is, there are rare, very large wars.
01:17:58.180 | So the average war is much worse than the median war
01:18:02.720 | because of this long tail.
01:18:04.560 | And that makes it hard to identify trends over time.
01:18:07.900 | So the median war has clearly gone way down
01:18:10.480 | in the last century, that is, the median rate of war.
01:18:12.240 | But it could be that's because the tail has gotten thicker.
01:18:15.440 | And in fact, the average war is just as bad.
01:18:17.400 | But most of the war damage is gonna be in big wars.
01:18:19.760 | So that's the thing we're not so sure about.
01:18:21.640 | - There's no strong data on wars which,
01:18:26.640 | because of the destructive nature of the weapons,
01:18:31.680 | kill hundreds of millions of people.
01:18:33.840 | There's no data on this.
01:18:35.440 | - Right.
01:18:36.280 | - But we can start intuiting.
01:18:37.560 | - But we can see that the power law,
01:18:39.320 | we can do a power law fit to the rate of wars.
01:18:41.180 | And it's a power law with a thick tail.
01:18:43.680 | So it's one of those things that you should expect
01:18:45.920 | most of the damage to be in the few biggest ones.
01:18:48.160 | So that's also true for pandemics
01:18:49.960 | and a few other things.
01:18:51.180 | For pandemics, most of the damage
01:18:52.480 | is in the few biggest ones.
01:18:53.560 | So the median pandemic is of course less than the average
01:18:56.460 | that you should expect in the future.
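As a numerical illustration of the thick-tail point, here is a small sketch with an assumed Pareto shape, not the actual fit being described: the median event looks mild even though the mean, dominated by the largest events, is several times bigger.

```python
# Heavy-tailed event sizes: median vs. mean (illustrative sketch, assumed Pareto shape).
import random
import statistics

random.seed(0)

def pareto_sizes(n: int, alpha: float = 1.2, x_min: float = 1.0) -> list:
    """Draw n event sizes from a Pareto(alpha) distribution via inverse-CDF sampling."""
    return [x_min / (1.0 - random.random()) ** (1.0 / alpha) for _ in range(n)]

sizes = sorted(pareto_sizes(100_000))
print("median size:", statistics.median(sizes))
print("mean size:  ", statistics.fmean(sizes))

# A large share of the total damage sits in the few biggest events.
top_share = sum(sizes[-100:]) / sum(sizes)
print(f"share of total damage in the top 0.1% of events: {top_share:.0%}")
```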
01:18:57.920 | - But those, that fitting of data is very questionable
01:19:02.920 | because everything you said is correct.
01:19:06.520 | The question is like, what can we infer
01:19:08.800 | about the future of civilization threatening pandemics
01:19:13.800 | or nuclear war from studying the history
01:19:19.380 | of the 20th century?
01:19:21.220 | So like, you can't just fit it to the data,
01:19:23.740 | the rate of wars and the destructive nature.
01:19:25.720 | Like that's not how nuclear war will happen.
01:19:28.820 | Nuclear war happens with two assholes or idiots
01:19:33.660 | that have access to a button.
01:19:35.180 | - Small wars happen that way too.
01:19:37.060 | - No, I understand that.
01:19:37.940 | But that's, it's very important, small wars aside,
01:19:41.080 | it's very important to understand the dynamics,
01:19:42.940 | the human dynamics and the geopolitics
01:19:45.160 | of the way nuclear war happens
01:19:47.560 | in order to predict how we can minimize the chance of--
01:19:52.560 | - It is a common and useful intellectual strategy
01:19:55.420 | to take something that could be really big
01:19:57.900 | or but is often very small
01:19:59.500 | and fit the distribution of the data,
01:20:01.340 | small things which you have a lot of them
01:20:02.720 | and then ask, do I believe the big things
01:20:04.900 | are really that different?
01:20:05.900 | - Right, I see.
01:20:06.740 | - So sometimes it's reasonable to say like,
01:20:08.540 | say with tornadoes or even pandemics or something,
01:20:11.220 | the underlying process might not be that different
01:20:14.620 | for the big and small ones.
01:20:15.940 | It might not be.
01:20:16.900 | The fact that mutual assured destruction
01:20:21.520 | seems to work to some degree
01:20:23.660 | shows you that to some degree it's different
01:20:26.300 | than the small wars.
01:20:27.440 | So it's a really important question to understand
01:20:36.500 | is are humans capable, one human,
01:20:40.460 | like how many humans on earth,
01:20:42.760 | if I give them a button now,
01:20:44.660 | say you pressing this button will kill everyone on earth,
01:20:48.020 | everyone, right?
01:20:49.060 | How many humans will press that button?
01:20:51.900 | I wanna know those numbers,
01:20:53.700 | like day to day, minute to minute,
01:20:55.860 | how many people have that much irresponsibility,
01:20:58.940 | evil, incompetence, ignorance,
01:21:03.260 | whatever word you wanna assign,
01:21:04.540 | there's a lot of dynamics in the psychology
01:21:06.860 | that leads you to press that button, but how many?
01:21:09.180 | My intuition is the number,
01:21:11.020 | the more destructive that press of a button,
01:21:14.740 | the fewer humans you find.
01:21:16.460 | That number gets very close to zero very quickly,
01:21:19.380 | especially for people who have access to such a button.
01:21:22.940 | But that's perhaps more a hope than a reality.
01:21:26.340 | And unfortunately we don't have good data on this,
01:21:30.980 | which is like how destructive are humans willing to be?
01:21:34.820 | - So I think part of this, you just have to
01:21:38.340 | ask what time scales you're looking at, right?
01:21:40.740 | So if you say, if you look at the history of war,
01:21:43.380 | we've had a lot of wars pretty consistently
01:21:45.660 | over many centuries.
01:21:47.400 | So if you ask, will we have a nuclear war
01:21:49.900 | in the next 50 years, I might say, well, probably not.
01:21:52.580 | If I say 500 or 5,000 years,
01:21:55.460 | like if the same sort of risks are underlying
01:21:57.620 | and they just continue,
01:21:58.480 | then you have to add that up over time
01:22:00.260 | and think the risk is getting a lot larger
01:22:02.540 | the longer a time scale we're looking at.
01:22:04.540 | - But, okay, let's generalize nuclear war
01:22:07.100 | because what I was more referring to
01:22:09.020 | is something that kills more than
01:22:12.180 | 20% of humans on earth
01:22:18.180 | and injures or makes
01:22:21.300 | the other 80% suffer horribly,
01:22:28.580 | survive but suffer.
01:22:30.120 | That's what I was referring to.
01:22:31.060 | So when you look at 500 years from now,
01:22:32.800 | there might not be nuclear war,
01:22:33.980 | there might be something else
01:22:35.740 | that has that destructive effect.
01:22:38.740 | And I don't know, these feel like novel questions
01:22:43.740 | in the history of humanity.
01:22:46.060 | I just don't know.
01:22:47.660 | I think since nuclear weapons,
01:22:49.820 | there's been engineered pandemics, for example,
01:22:55.620 | robotics, or nanobots,
01:22:59.080 | here's how I phrase the question.
01:23:01.260 | - It just seems like a real new possibility
01:23:03.340 | that we have to contend with
01:23:04.740 | and we don't have good models,
01:23:06.260 | or from my perspective.
01:23:08.300 | - So if you look on, say, the last 1,000 years
01:23:10.860 | or 10,000 years, we could say we've seen a certain rate
01:23:13.700 | at which people are willing to make big destruction
01:23:16.340 | in terms of war.
01:23:17.420 | - Yes.
01:23:18.260 | - Okay, and if you're willing to project that data forward,
01:23:21.500 | then I think if you wanna ask over periods of thousands
01:23:24.460 | or tens of thousands of years,
01:23:25.620 | you would have a reasonable data set.
01:23:27.500 | So the key question is what's changed lately?
01:23:30.300 | - Yes.
01:23:31.140 | - Okay, and so a big question
01:23:33.620 | of which I've given a lot of thought to,
01:23:35.340 | what are the major changes that seem to have happened
01:23:37.820 | in culture and human attitudes over the last few centuries
01:23:40.860 | and what's our best explanation for those
01:23:42.700 | so that we can project them forward into the future?
01:23:45.660 | And I have a story about that,
01:23:48.660 | which is the story that we have been drifting back
01:23:51.180 | toward forager attitudes in the last few centuries
01:23:54.820 | as we get rich.
01:23:55.860 | So the idea is we spent a million years being a forager,
01:23:59.460 | and that was a very sort of standard lifestyle
01:24:02.820 | that we know a lot about.
01:24:04.540 | Foragers sort of live in small bands,
01:24:06.580 | they make decisions cooperatively, they share food,
01:24:09.780 | they don't have much property, et cetera.
01:24:13.940 | And humans liked that.
01:24:15.580 | And then 10,000 years ago, farming became possible,
01:24:18.220 | but it was only possible because we were plastic enough
01:24:20.620 | to really change our culture.
01:24:21.900 | Farming styles and cultures are very different.
01:24:24.540 | They have slavery, they have war, they have property,
01:24:26.780 | they have inequality, they have kings,
01:24:29.340 | they stay in one place instead of wandering,
01:24:31.580 | they don't have as much diversity of experience or food,
01:24:34.780 | they have more disease.
01:24:36.460 | Farming life is just very different.
01:24:37.980 | But humans were able to sort of introduce conformity
01:24:41.700 | and religion and all sorts of things
01:24:42.980 | to become just a very different kind of creature as farmers.
01:24:45.380 | Farmers are just really different than foragers
01:24:47.140 | in terms of their values and their lives.
01:24:49.180 | But the pressures that made foragers into farmers
01:24:52.660 | were in part mediated by poverty.
01:24:54.420 | Farmers are poor, and if they deviated
01:24:57.340 | from the farming norms that people around them supported,
01:25:00.100 | they were quite at risk of starving to death.
01:25:02.340 | And then in the last few centuries, we've gotten rich.
01:25:06.660 | And as we've gotten rich, the social pressures
01:25:10.020 | that turned foragers into farmers
01:25:12.060 | have become less persuasive to us.
01:25:15.740 | So for example, a farming young woman who was told,
01:25:18.100 | "If you have a child out of wedlock,
01:25:19.660 | "you and your child may starve,"
01:25:21.420 | that was a credible threat.
01:25:22.700 | She would see actual examples around her
01:25:25.340 | to make that a believable threat.
01:25:27.620 | Today, if you say to a young woman,
01:25:29.340 | "You shouldn't have a child out of wedlock,"
01:25:30.900 | she will see other young women around her
01:25:32.580 | doing okay that way.
01:25:33.980 | We're all rich enough to be able to afford
01:25:35.860 | that sort of a thing, and therefore,
01:25:38.060 | she's more inclined often to go with her inclinations,
01:25:41.860 | her sort of more natural inclinations about such things
01:25:44.140 | rather than to be pressured to follow
01:25:46.380 | the official farming norms
01:25:48.420 | that say you shouldn't do that sort of thing.
01:25:49.660 | And all through our lives,
01:25:51.180 | we have been drifting back toward forager attitudes
01:25:54.580 | because we've been getting rich.
01:25:56.180 | And so aside from at work, which is an exception,
01:25:59.300 | but elsewhere, I think this explains trends
01:26:02.620 | toward less slavery, more democracy, less religion,
01:26:06.260 | less fertility, more promiscuity, more travel,
01:26:09.540 | more art, more leisure, fewer work hours.
01:26:14.220 | All of these trends are basically explained
01:26:16.420 | by becoming more forager-like.
01:26:19.100 | And much science fiction celebrates this.
01:26:21.180 | Star Trek or the culture novels,
01:26:22.740 | people like this image that we are moving
01:26:25.060 | toward this world where basically like foragers,
01:26:27.380 | we're peaceful, we share, we make decisions collectively,
01:26:30.460 | we have a lot of free time, we are into art.
01:26:33.940 | So forager, you know, forager is a word,
01:26:38.860 | and it's a loaded word because it's connected
01:26:42.180 | to the actual, what life was actually like at that time.
01:26:47.580 | As you mentioned, we sometimes don't do a good job
01:26:50.100 | of telling accurately what life was like back then.
01:26:53.260 | But you're saying if it's not exactly like foragers,
01:26:55.940 | it rhymes in some fundamental way.
01:26:58.340 | - Right. - 'Cause you also said peaceful.
01:27:00.820 | Is it obvious that a forager with a nuclear weapon
01:27:03.900 | would be peaceful?
01:27:07.740 | I don't know if that's 100% obvious.
01:27:09.540 | - So we know, again, we know a fair bit
01:27:11.380 | about what foragers' lives were like.
01:27:13.860 | The main sort of violence they had would be sexual jealousy.
01:27:16.900 | They were relatively promiscuous,
01:27:18.380 | and so there'd be a lot of jealousy.
01:27:19.720 | But they did not have organized wars with each other.
01:27:22.940 | That is, they were at peace
01:27:24.300 | with their neighboring forager bands.
01:27:25.980 | They didn't have property in land or even in people.
01:27:28.540 | They didn't really have marriage.
01:27:30.180 | And so they were, in fact, peaceful.
01:27:35.060 | - And when you think about large-scale wars,
01:27:37.580 | they don't start large-scale wars.
01:27:38.420 | - Right, they didn't have coordinated large-scale wars
01:27:40.500 | in the way chimpanzees do.
01:27:41.660 | Now, chimpanzees do have wars
01:27:43.560 | between one tribe of chimpanzees and others,
01:27:45.420 | but human foragers did not.
01:27:46.860 | Farmers returned to that, of course,
01:27:48.640 | the more chimpanzee-like styles.
01:27:50.500 | - Well, that's a hopeful message.
01:27:52.660 | If we could return real quick
01:27:54.340 | to the Hello Aliens Twitter thread,
01:27:59.180 | one of them is really interesting about language.
01:28:01.060 | What percent of Hello Aliens
01:28:02.460 | would be able to talk to us in our language?
01:28:05.460 | So this is the question of communication.
01:28:08.020 | It actually gets to the nature of language.
01:28:10.460 | - It also gets to the nature
01:28:13.180 | of how advanced you expect them to be.
01:28:16.260 | So I think some people see that we have advanced
01:28:21.100 | over the last thousands of years
01:28:22.980 | and we aren't reaching any sort of limit,
01:28:25.240 | and so they tend to assume it could go on forever.
01:28:28.380 | And I actually tend to think that within, say,
01:28:30.980 | 10 million years, we will sort of max out on technology.
01:28:35.100 | We will sort of learn everything that's feasible to know,
01:28:38.780 | for the most part.
01:28:39.940 | And then obstacles to understanding
01:28:42.280 | would more be about cultural differences,
01:28:44.840 | like ways in which different places
01:28:46.540 | have just chosen to do things differently.
01:28:49.140 | And so then the question is,
01:28:51.900 | is it even possible to communicate
01:28:54.540 | across some cultural distances?
01:28:57.180 | And I might think, yeah,
01:28:58.340 | I could imagine some maybe advanced aliens
01:29:00.280 | who've become so weird and different from each other
01:29:02.420 | they can't communicate with each other,
01:29:03.820 | but we're probably pretty simple compared to them.
01:29:07.540 | So I would think, sure, if they wanted to,
01:29:10.860 | they could communicate with us.
01:29:12.600 | - So it's the simplicity of the recipient.
01:29:15.020 | I tend to, just to push back,
01:29:18.480 | let's explore the possibility where that's not the case.
01:29:22.560 | Can we communicate with ants?
01:29:24.960 | I find that, like this idea that--
01:29:30.560 | - We're not very good at communicating in general.
01:29:33.400 | - Oh, you're saying, all right, I see.
01:29:36.240 | You're saying once you get orders of magnitude better
01:29:38.520 | at communicating.
01:29:39.880 | - Once they had maxed out on all communication technology
01:29:43.080 | in general, and they just understood in general
01:29:45.320 | how to communicate with lots of things,
01:29:46.920 | and had done that for millions of years.
01:29:48.680 | - But you have to be able to, this is so interesting,
01:29:51.240 | as somebody who cares a lot about empathy
01:29:53.040 | and imagining how other people feel,
01:29:55.220 | communication requires empathy,
01:30:00.240 | meaning you have to truly understand
01:30:03.520 | how the other person, the other organism sees the world.
01:30:08.880 | It's like a four-dimensional species
01:30:11.800 | talking to a two-dimensional species.
01:30:13.440 | It's not as trivial as, to me at least,
01:30:16.640 | as it might at first seem.
01:30:18.320 | - So let me reverse my position a little,
01:30:20.840 | because I'll say, well, the hello aliens question
01:30:24.480 | really combines two different scenarios
01:30:28.200 | that we're slipping over.
01:30:30.400 | So one scenario would be that the hello aliens
01:30:33.700 | would be like grabby aliens.
01:30:35.160 | They would be just fully advanced.
01:30:36.880 | They would have been expanding for millions of years.
01:30:38.560 | They would have a very advanced civilization,
01:30:40.880 | and then they would finally be arriving here
01:30:43.280 | after a billion years, perhaps, of expanding,
01:30:45.600 | in which case they're gonna be crazy advanced
01:30:47.720 | at some maximum level.
01:30:49.000 | But the hello aliens question is about aliens we might meet soon,
01:30:54.000 | which might be sort of UFO aliens,
01:30:56.160 | and UFO aliens probably are not grabby aliens.
01:31:00.460 | - How do you get here if you're not a grabby alien?
01:31:04.500 | - Well, they would have to be able to travel.
01:31:08.400 | But they would not be expansive.
01:31:11.640 | So if it's a road trip, it doesn't count as grabby.
01:31:14.960 | So we're talking about expanding the comfortable colony.
01:31:19.640 | - The question is, if UFOs, some of them are aliens,
01:31:24.180 | what kind of aliens would they be?
01:31:26.560 | This is sort of the key question you have to ask
01:31:28.600 | in order to try to interpret that scenario.
01:31:30.800 | The key fact we would know is that they are here right now,
01:31:36.140 | but the universe around us is not full
01:31:39.080 | of an alien civilization.
01:31:41.240 | So that says right off the bat that they chose not
01:31:45.800 | to allow massive expansion of a grabby civilization.
01:31:50.800 | - Is it possible that they chose it,
01:31:53.160 | but we just don't see them yet?
01:31:55.120 | These are the stragglers, the journeymen, the--
01:31:58.200 | - So the timing coincidence is,
01:32:00.440 | it's almost surely if they are here now,
01:32:02.920 | they are much older than us.
01:32:04.160 | They are many millions of years older than us.
01:32:07.340 | And so they could have filled the galaxy
01:32:09.800 | in that last millions of years if they had wanted to.
01:32:12.400 | That is, they couldn't just be right at the edge.
01:32:15.900 | Very unlikely.
01:32:17.060 | Most likely they would have been around waiting for us
01:32:19.540 | for a long time.
01:32:20.380 | They could have come here anytime
01:32:21.520 | in the last millions of years,
01:32:22.820 | and they've either been waiting around for this,
01:32:24.940 | or they just chose to come recently.
01:32:26.860 | But the timing coincidence, it would be crazy unlikely
01:32:30.100 | that they just happened to be able to get here,
01:32:32.140 | say in the last 100 years.
01:32:34.180 | They would no doubt have been able to get here
01:32:36.500 | far earlier than that.
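A back-of-the-envelope sketch of this timing-coincidence point, assuming arrival dates spread uniformly in time: the 100-year window echoes the conversation, while the ~100-million-year head start is an illustrative round number of the order discussed later, not a figure stated here.

```python
# Toy version of the timing-coincidence argument; the head-start figure and
# the uniform-arrival assumption are illustrative, not from the transcript.
head_start_years = 100_000_000   # assumed maturity head start over humanity
window_years = 100               # "just barely able to get here in the last 100 years"

# If their earliest-possible-arrival date were uniform over that head start,
# the chance it falls inside the last-century window is about one in a million.
print(window_years / head_start_years)  # 1e-06
```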
01:32:37.860 | - Again, we don't know.
01:32:39.100 | So this is a trend, like UFO sightings on Earth.
01:32:41.580 | We don't know if this kind of increase in sightings
01:32:44.580 | has anything to do with actual visitation.
01:32:46.580 | - I'm just talking about the timing.
01:32:48.340 | They arose at some point in space time.
01:32:50.980 | And it's very unlikely that that was just to the point
01:32:55.260 | that they could just barely get here recently.
01:32:57.680 | Almost surely they would have--
01:32:59.500 | - But they might have been here.
01:33:00.340 | - They could have gotten here much earlier.
01:33:02.420 | - And well, throughout the stretch of several billion years
01:33:04.580 | that Earth has existed, they could have been here often.
01:33:06.380 | - Exactly, so they could have therefore filled the galaxy
01:33:10.180 | a long time ago if they had wanted to.
01:33:11.980 | - Let's push back on that.
01:33:13.940 | The question to me is, isn't it possible
01:33:16.580 | that the expansion of a civilization
01:33:19.680 | is much harder than the travel,
01:33:23.700 | the sphere of the reachable is different
01:33:28.780 | than the sphere of the colonized.
01:33:31.500 | So isn't it possible that the sphere of places
01:33:36.340 | where the stragglers go, the different people
01:33:38.980 | that journey out, the explorers, is much, much larger
01:33:42.100 | and grows much faster than the civilization?
01:33:46.460 | So in which case, they would visit us.
01:33:49.560 | There's a lot of visitors,
01:33:50.700 | the grad students of the civilization.
01:33:53.080 | They're exploring, they're collecting the data,
01:33:55.780 | but we're not yet going to see them.
01:33:58.660 | And by yet, I mean across millions of years.
01:34:01.640 | - The time delay between when the first thing might arrive
01:34:07.540 | and then when colonists could arrive en masse
01:34:11.140 | and do a mass amount of work is cosmologically short.
01:34:14.540 | In human history, of course, sure,
01:34:16.420 | there might be a century between that,
01:34:18.700 | but a century is just a tiny amount of time
01:34:21.520 | on the scales we're talking about.
01:34:23.100 | - So this is, in computer science,
01:34:25.460 | there's ant colony optimization.
01:34:27.020 | It's true for ants.
01:34:28.500 | So it's like when the first ant shows up,
01:34:30.340 | it's likely, and if there's anything of value,
01:34:33.260 | it's likely the other ants will follow quickly.
01:34:36.700 | Yeah. - Relatively short.
01:34:37.580 | It's also true that traveling over very long distances,
01:34:41.620 | probably one of the main ways to make that feasible
01:34:44.500 | is that you land somewhere, you colonize a bit,
01:34:47.460 | you create new resources that can then allow you
01:34:49.940 | to go farther. - Many short hops
01:34:51.620 | as opposed to a giant, long journey.
01:34:53.260 | - Exactly, but those hops require that you are able
01:34:56.180 | to start a colonization of sorts along those hops, right?
01:34:59.460 | You have to be able to stop somewhere,
01:35:01.180 | make it into a way station such that it can then
01:35:04.460 | support you moving farther.
01:35:05.620 | - So what do you think of,
01:35:07.460 | there's been a lot of UFO sightings,
01:35:09.880 | what do you think about those UFO sightings
01:35:12.580 | and what do you think if any of them are
01:35:16.460 | of extraterrestrial origin and we don't see
01:35:21.820 | giant civilizations out in the sky,
01:35:25.220 | how do you make sense of that then?
01:35:27.620 | - I wanna do some clearing of throats,
01:35:29.900 | which is people like to do on this topic, right?
01:35:33.020 | They wanna make sure you understand
01:35:34.260 | they're saying this and not that, right?
01:35:36.060 | So I would say the analysis needs both
01:35:40.140 | a prior and a likelihood.
01:35:41.780 | So the prior is what are the scenarios
01:35:45.960 | that are at all plausible in terms of what we know
01:35:48.740 | about the universe and then the likelihood
01:35:50.820 | is the particular actual sightings,
01:35:53.300 | like how hard are those to explain through various means.
01:35:56.740 | I will establish myself as somewhat of an expert
01:36:00.340 | on the prior, I would say my studies
01:36:03.060 | and the things I've studied make me an expert
01:36:05.060 | and I should stand up and have an opinion on that
01:36:07.080 | and be able to explain it.
01:36:09.120 | The likelihood, however, is not my area of expertise.
01:36:11.980 | That is, I'm not a pilot, I don't do atmospheric studies;
01:36:16.300 | I haven't studied in detail
01:36:18.140 | the various kinds of atmospheric phenomena or whatever
01:36:20.740 | that might be used to explain the particular sightings.
01:36:23.260 | I can just say from my amateur stance,
01:36:25.260 | the sightings look damn puzzling.
01:36:27.860 | They do not look easy to dismiss,
01:36:30.620 | the attempts I've seen to easily dismiss them
01:36:33.060 | seem to me to fail, it seems like these are pretty puzzling,
01:36:36.180 | weird stuff that deserve an expert's attention
01:36:40.500 | in terms of considering, asking what the likelihood is.
01:36:43.540 | So analogy I would make is a murder trial, okay?
01:36:46.320 | On average, if we say what's the chance
01:36:49.140 | any one person murdered another person
01:36:51.100 | as a prior probability, maybe one in a thousand people
01:36:53.660 | get murdered, maybe each person has a thousand people
01:36:55.860 | around them who could plausibly have done it,
01:36:57.440 | so the prior probability of a murder is one in a million.
01:37:00.740 | But we allow murder trials because often evidence
01:37:03.740 | is sufficient to overcome a one in a million prior
01:37:06.660 | because the evidence is often strong enough, right?
01:37:10.020 | My guess, rough guess for the UFOs as aliens scenario,
01:37:14.460 | at least some of them, is the prior is roughly
01:37:15.900 | one in a thousand, much higher than the usual murder trial,
01:37:19.860 | plenty high enough that strong physical evidence
01:37:23.620 | could put you over the top to think
01:37:25.460 | it's more likely than not.
01:37:27.180 | But I'm not an expert on that physical evidence,
01:37:29.300 | I'm gonna leave that part to someone else.
01:37:31.060 | I'm gonna say the prior is pretty high,
01:37:33.540 | this isn't a crazy scenario.
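A minimal sketch of the prior-and-likelihood arithmetic being described here. The 1-in-a-million murder prior and the rough 1-in-a-thousand UFOs-as-aliens prior come from the conversation; the likelihood ratios are hypothetical placeholders, chosen only to show how strong evidence can overwhelm a small prior.

```python
# Illustrative Bayesian update for the murder-trial analogy. The priors echo
# the conversation; the likelihood ratios are made-up placeholders.

def posterior_probability(prior: float, likelihood_ratio: float) -> float:
    """Combine a prior probability with an evidence likelihood ratio."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Murder trial: ~1-in-a-million prior, but courtroom evidence carrying a
# likelihood ratio of ~10 million pushes the posterior above 90%.
print(posterior_probability(1e-6, 1e7))  # ~0.91

# UFOs-as-aliens: a ~1-in-a-thousand prior needs far weaker evidence
# (likelihood ratio ~1,000) just to reach even odds.
print(posterior_probability(1e-3, 1e3))  # ~0.50
```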
01:37:34.900 | So then I can elaborate on where my prior comes from.
01:37:38.060 | What scenario could make most sense of this data?
01:37:41.300 | My scenario to make sense has two main parts.
01:37:46.500 | First is panspermia siblings.
01:37:49.820 | So panspermia is the hypothesized process
01:37:53.260 | by which life might have arrived on Earth from elsewhere.
01:37:56.460 | And a plausible time for that, I mean,
01:37:59.700 | it would have to happen very early in Earth's history
01:38:01.780 | 'cause we see life early in history.
01:38:03.460 | And a plausible time could have been
01:38:05.340 | during the stellar nursery where the sun was born
01:38:08.580 | with many other stars in the same close proximity
01:38:12.580 | with lots of rocks flying around,
01:38:14.180 | able to move things from one place to another.
01:38:18.340 | If a rock with life on it, from some planet
01:38:22.580 | with life, came into that stellar nursery,
01:38:24.620 | it plausibly could have seeded many planets
01:38:27.980 | in that stellar nursery all at the same time.
01:38:30.140 | They're all born at the same time in the same place,
01:38:31.700 | pretty close to each other, lots of rocks flying around.
01:38:35.140 | So a panspermia scenario would then create siblings,
01:38:38.580 | i.e. there would be say a few thousand
01:38:42.220 | other planets out there.
01:38:44.740 | So after the nursery forms, it drifts, it separates,
01:38:47.380 | they drift apart.
01:38:48.420 | And so out there in the galaxy,
01:38:50.020 | there would now be a bunch of other stars
01:38:51.700 | all formed at the same time,
01:38:52.940 | and we can actually spot them in terms of their spectrum.
01:38:55.780 | And they would have then started on the same path of life
01:38:59.840 | as we did with that life being seeded,
01:39:02.020 | but they would move at different rates.
01:39:04.180 | And most likely, most of them would never
01:39:08.780 | reach an advanced level before the deadline,
01:39:10.580 | but maybe one other did, and maybe it did before us.
01:39:16.340 | So if they did, they could know all of this,
01:39:19.020 | and they could go searching for their siblings.
01:39:20.740 | That is, they could look in the sky for the other stars
01:39:23.060 | that match the spectrum
01:39:25.740 | that came from this nursery.
01:39:27.060 | They could identify their sibling stars in the galaxy,
01:39:30.260 | the thousand of them,
01:39:31.940 | and those would be of special interest to them
01:39:33.780 | 'cause they would think, well, life might be on those.
01:39:36.480 | And they could go looking for them.
01:39:39.580 | - That's just such a brilliant mathematical,
01:39:43.180 | philosophical, physical, biological idea
01:39:48.180 | of panspermia siblings
01:39:51.500 | because we all kind of started at similar time
01:39:54.060 | in this local pocket of the universe.
01:39:59.140 | And so that changes a lot of the math.
01:40:02.980 | - So that would create this correlation
01:40:04.580 | between when advanced life might appear.
01:40:06.140 | No longer just random independent places in space-time.
01:40:09.240 | There'd be this cluster, perhaps.
01:40:10.900 | And that allows interaction between--
01:40:13.500 | - The elements of the cluster, yes.
01:40:15.020 | - Non-grabby alien civilizations,
01:40:17.340 | like kind of primitive alien civilizations
01:40:21.660 | like us with others,
01:40:23.700 | and they might be a little bit ahead.
01:40:25.500 | That's so fascinating.
01:40:26.980 | - They would probably be a lot ahead.
01:40:28.420 | So the puzzle is-- - Sure, sure.
01:40:30.940 | - If they happen before us,
01:40:33.780 | they probably happened hundreds of millions
01:40:35.780 | of years before us.
01:40:37.180 | - But less than a billion.
01:40:38.700 | - Less than a billion, but still plenty of time
01:40:41.300 | that they could have become grabby
01:40:42.700 | and filled the galaxy and gone beyond.
01:40:45.580 | So there'd be plenty,
01:40:46.460 | so the fact is they chose not to become grabby.
01:40:49.080 | That would have to be the interpretation.
01:40:51.220 | If we have panspermia siblings--
01:40:52.060 | - Plenty of time to become grabby, you said.
01:40:54.260 | So it should be fine. - Yes, they had plenty of time
01:40:55.860 | and they chose not to.
01:40:56.980 | - Are we sure about this?
01:40:59.300 | So 100 million years is enough?
01:41:01.900 | - 100 million, so I told you before that I said
01:41:04.660 | within 10 million years,
01:41:06.300 | our descendants will become grabby or not.
01:41:09.020 | - And they'll have that choice, okay.
01:41:11.020 | - And so they clearly arose more than 10 million years
01:41:13.740 | earlier than us, so they chose not to.
01:41:16.380 | - But still go on vacation, look around,
01:41:19.500 | so it's not grabby.
01:41:20.460 | - If they chose not to expand,
01:41:22.860 | that's going to have to be a rule they set
01:41:24.540 | to not allow any part of themselves to do it.
01:41:26.500 | If they let any little ship fly away
01:41:30.820 | with the ability to create a colony,
01:41:33.140 | the game's over.
01:41:33.980 | Then they have failed to prevent it; the universe
01:41:36.780 | becomes grabby from their origin with this one colony.
01:41:39.260 | So in order to prevent their civilization being grabby,
01:41:42.340 | they have to have a rule they enforce pretty strongly
01:41:44.540 | that no part of them can ever try to do that.
01:41:47.100 | - Through a global authoritarian regime
01:41:49.980 | or through something that's internal to them,
01:41:52.900 | meaning it's part of the nature of life
01:41:55.940 | that it doesn't want to, as it becomes advanced--
01:41:58.300 | - Like a political officer in the brain or whatever.
01:42:00.540 | - Yes, there's something in human nature
01:42:04.620 | that prevents you from, or like alien nature,
01:42:08.780 | that as you get more advanced,
01:42:10.620 | you become lazier and lazier
01:42:12.380 | in terms of exploration and expansion.
01:42:14.700 | - So I would say they would have to have enforced
01:42:17.540 | a rule against expanding, and that rule
01:42:20.220 | would probably make them reluctant
01:42:22.000 | to let people leave very far.
01:42:24.380 | Any one vacation trip far away could risk
01:42:27.260 | an expansion from this vacation trip.
01:42:29.180 | So they would probably have a pretty tight lid
01:42:31.060 | on just allowing any travel out from their origin
01:42:34.220 | in order to enforce this rule.
01:42:35.840 | - Interesting. - But then we also know,
01:42:38.900 | well, they would have chosen to come here.
01:42:40.740 | So clearly they made an exception from their general rule
01:42:43.900 | to say, okay, but an expedition to Earth,
01:42:46.900 | that should be allowed.
01:42:48.260 | - It could be intentional exception
01:42:50.340 | or incompetent exception.
01:42:52.780 | - But if incompetent, then they couldn't maintain
01:42:54.840 | this over 100 million years,
01:42:56.700 | this policy of not allowing any expansion.
01:42:58.860 | So we have to see they have successfully,
01:43:01.180 | they not just had a policy to try,
01:43:02.820 | they succeeded over 100 million years
01:43:05.340 | in preventing the expansion.
01:43:07.700 | That's a substantial competence.
01:43:09.780 | - Let me think about this.
01:43:11.540 | So you don't think there could be a barrier
01:43:13.300 | in 100 million years, you don't think
01:43:15.100 | there could be a barrier, like
01:43:19.860 | a technological barrier, to becoming expansionary?
01:43:24.540 | - Imagine the Europeans that tried to prevent anybody
01:43:28.220 | from leaving Europe to go to the New World.
01:43:31.100 | And imagine what it would have taken
01:43:33.020 | to make that happen over 100 million years.
01:43:36.260 | - Yeah, it's impossible.
01:43:38.020 | - They would have had to have very strict, you know,
01:43:40.760 | guards at the borders saying,
01:43:43.060 | "No, you can't go."
01:43:44.260 | - But just to clarify, you're not suggesting
01:43:47.380 | that's actually possible.
01:43:49.040 | - I am suggesting it's possible.
01:43:51.340 | - I don't know how you keep, in my silly human brain,
01:43:56.020 | maybe it's a brain that values freedom,
01:43:57.780 | but I don't know how you can keep,
01:43:59.700 | no matter how much force,
01:44:01.820 | no matter how much censorship or control or so on,
01:44:05.140 | I just don't know how you can keep people
01:44:08.100 | from exploring into the mysterious, into the unknown.
01:44:11.780 | - You're thinking of people, we're talking aliens.
01:44:13.580 | So remember, there's a vast space
01:44:14.940 | of different possible social creatures
01:44:16.400 | they could have evolved from,
01:44:17.520 | different kinds of cultures they could be in,
01:44:19.580 | different kinds of threats.
01:44:21.500 | I mean, there are many things, as you talked about,
01:44:23.000 | that most of us would feel very reluctant to do.
01:44:25.820 | This isn't one of those, but--
01:44:27.100 | - Okay, so how, if the UFO sightings
01:44:29.740 | represent alien visitors,
01:44:32.300 | how the heck are they getting here
01:44:33.940 | under the Panspermia siblings?
01:44:36.380 | - So Panspermia siblings is one part of the scenario,
01:44:39.100 | which is that's where they came from.
01:44:41.500 | And from that, we can conclude
01:44:42.940 | they had this rule against expansion,
01:44:44.520 | and they've successfully enforced that.
01:44:46.980 | That also creates a plausible agenda
01:44:49.380 | for why they would be here,
01:44:50.820 | that is, to enforce that rule on us.
01:44:52.740 | That is, if we go out and expanding,
01:44:54.860 | then we have defeated the purpose of this rule they set up.
01:44:58.100 | - Interesting.
01:44:58.940 | - Right, so they would be here to convince us
01:45:02.660 | to not expand.
01:45:03.740 | - Convince in quotes.
01:45:05.260 | - Right, through various mechanisms.
01:45:06.860 | So obviously, one thing we conclude
01:45:08.620 | is they didn't just destroy us.
01:45:10.040 | That would have been completely possible, right?
01:45:12.320 | So the fact that they're here and we are not destroyed
01:45:14.980 | means that they chose not to destroy us.
01:45:17.900 | They have some degree of empathy
01:45:19.620 | or whatever their morals are
01:45:22.060 | that would make them reluctant to just destroy us.
01:45:24.460 | They would rather persuade us.
01:45:25.900 | - Rather than destroy their brethren.
01:45:27.820 | And so they may have been,
01:45:29.500 | there's a difference between arrival and observation.
01:45:32.460 | They may have been observing for a very long time.
01:45:34.580 | - Exactly.
01:45:35.420 | - And they arrived to try to, not to try,
01:45:40.420 | I don't think, to try. - To ensure.
01:45:42.380 | - To ensure that we don't become grabby.
01:45:46.780 | - Which is because that's, we can see that they did not,
01:45:49.220 | they must have enforced a rule against that,
01:45:51.340 | and they are therefore here to,
01:45:53.700 | that's a plausible interpretation
01:45:55.300 | why they would risk this expedition
01:45:56.860 | when they clearly don't risk very many expeditions
01:45:59.060 | over this long period, to allow this one exception,
01:46:01.420 | because otherwise, if they don't, we may become grabby.
01:46:04.900 | And they could have just destroyed us, but they didn't.
01:46:06.500 | - And they're closely monitoring
01:46:07.660 | the technological advancing of civilization,
01:46:10.900 | like what nuclear weapons is one thing,
01:46:12.740 | is that, all right, cool,
01:46:13.940 | that might have less to do with nuclear weapons
01:46:16.340 | and more with nuclear energy.
01:46:18.140 | Maybe they're monitoring fusion closely.
01:46:20.860 | Like, how clever are these apes getting?
01:46:23.460 | So no doubt, they have a button
01:46:25.420 | that if we get too uppity or risky,
01:46:27.860 | they can push the button and ensure that we don't expand,
01:46:30.940 | but they'd rather do it some other way.
01:46:32.540 | So now, that explains why they're here
01:46:35.740 | and why they aren't out there.
01:46:36.580 | But there's another thing that we need to explain.
01:46:38.180 | There's another key data we need to explain about UFOs
01:46:40.500 | if we're gonna have a hypothesis that explains them.
01:46:42.460 | And this is something many people have noticed,
01:46:44.740 | which is they had two extreme options
01:46:48.860 | they could have chosen and didn't choose.
01:46:51.020 | They could have either just remained completely invisible.
01:46:53.980 | Clearly, an advanced civilization
01:46:55.340 | could have been completely invisible.
01:46:56.620 | There's no reason they need to fly around and be noticed.
01:46:59.180 | They could just be in orbit in dark satellites
01:47:01.580 | that are completely invisible to us,
01:47:03.220 | watching whatever they wanna watch.
01:47:04.340 | That would be well within their abilities.
01:47:06.700 | That's one thing they could have done.
01:47:07.820 | The other thing they could do is just show up
01:47:10.020 | and land on the White House lawn, as they say,
01:47:12.260 | and shake hands, like make themselves really obvious.
01:47:15.180 | They could have done either of those,
01:47:16.980 | and they didn't do either of those.
01:47:18.260 | That's the next thing you need to explain
01:47:20.380 | about UFOs as aliens.
01:47:21.420 | Why would they take this intermediate approach,
01:47:23.500 | hanging out near the edge of visibility
01:47:26.300 | with somewhat impressive mechanisms,
01:47:27.860 | but not walking up and introducing themselves,
01:47:29.740 | nor just being completely invisible?
01:47:31.580 | - So, okay, a lot of questions there.
01:47:33.820 | So one, do you think it's obvious
01:47:36.820 | where the White House is, or the White House lawn--
01:47:39.860 | - Well, it's obvious where there are concentrations
01:47:41.340 | of humans that you could go up to and introduce yourself.
01:47:42.580 | - But is humans the most interesting thing about Earth?
01:47:46.100 | - Yeah, are you sure about this?
01:47:47.900 | Because-- - If they're worried
01:47:49.580 | about an expansion, then they would be worried
01:47:52.420 | about a civilization that could be capable of expansion.
01:47:54.660 | Obviously, humans are the civilization on Earth
01:47:56.660 | that's by far the closest to being able to expand.
01:47:59.820 | - I just don't know if aliens obviously see,
01:48:03.880 | obviously see humans, like the individual humans,
01:48:11.500 | like the meat vehicles, as the center of focus
01:48:16.420 | for observing a life on a planet.
01:48:19.700 | - They're supposed to be really smart and advanced.
01:48:21.460 | Like, this shouldn't be that hard for them.
01:48:23.740 | - But I think we're actually the dumb ones,
01:48:26.260 | because we think humans are the important things,
01:48:28.220 | but it could be our ideas,
01:48:30.720 | it could be something about our technologies.
01:48:32.940 | - But that's mediated with us, it's correlated with us.
01:48:34.620 | - No, we make it seem like it's mediated by us humans,
01:48:39.620 | but the focus for alien civilizations
01:48:43.380 | might be the AI systems or the technologies themselves.
01:48:47.300 | That might be the organism.
01:48:48.880 | Human is the food, the source of the organism
01:48:56.200 | that's under observation, versus like--
01:48:59.660 | - So what they wanted to have close contact with
01:49:01.500 | was something that was closely near humans,
01:49:03.460 | then they would be contacting those,
01:49:05.620 | and we would just incidentally see, but we would still see.
01:49:08.180 | - But don't you think, isn't it possible,
01:49:11.380 | taking their perspective, isn't it possible
01:49:13.780 | that they would want to interact
01:49:15.140 | with some fundamental aspect that they're interested in
01:49:18.540 | without interfering with it?
01:49:21.060 | And that's actually a very, no matter how advanced you are,
01:49:24.380 | it's very difficult to do, I think.
01:49:25.580 | - But that's puzzling.
01:49:26.820 | So, I mean, the prototypical UFO observation
01:49:30.820 | is a shiny, big object in the sky
01:49:35.620 | that has very rapid acceleration
01:49:38.340 | and no apparent surfaces for using air
01:49:42.700 | to manipulate its speed.
01:49:44.600 | And the question is, why that, right?
01:49:50.180 | Again, if they just, for example,
01:49:52.020 | if they just wanted to talk to our computer systems,
01:49:53.780 | they could move some sort of a little probe
01:49:56.540 | that connects to a wire and reads and sends bits there.
01:50:00.580 | They don't need a shiny thing flying in the sky.
01:50:02.980 | - But don't you think they would be,
01:50:05.940 | they are, would be looking for the right way to communicate,
01:50:09.060 | the right language to communicate?
01:50:11.100 | Everything you just said, looking at the computer systems,
01:50:14.380 | I mean, that's not a trivial thing.
01:50:16.900 | Coming up with a signal that us humans
01:50:19.880 | would not freak out too much about,
01:50:22.760 | but also understand, might not be that trivial.
01:50:25.340 | How would you talk to things? - Well, so the not-freak-out
01:50:26.500 | part is another interesting constraint.
01:50:28.580 | So again, I said, like the two obvious strategies
01:50:30.740 | are just to remain completely invisible and watch,
01:50:33.100 | which would be quite feasible,
01:50:34.340 | or to just directly interact,
01:50:36.780 | let's come out and be really very direct, right?
01:50:39.420 | I mean, there's big things that you can see around.
01:50:41.580 | There's big cities, there's aircraft carriers.
01:50:44.220 | There's lots of, if you want to just find a big thing
01:50:46.300 | and come right up to it and like tap it on the shoulder
01:50:48.940 | or whatever, that would be quite feasible.
01:50:50.420 | Then they're not doing that.
01:50:52.080 | So my hypothesis is that one of the other questions there
01:50:57.080 | was, do they have a status hierarchy?
01:50:59.540 | And I think most animals on earth
01:51:02.380 | who are social animals have status hierarchy,
01:51:05.060 | and they would reasonably presume
01:51:06.740 | that we have a status hierarchy.
01:51:08.340 | - Take me to your leader.
01:51:11.420 | - Well, I would say their strategy is to be impressive
01:51:15.340 | and sort of get us to see them
01:51:17.280 | at the top of our status hierarchy.
01:51:19.280 | Just to, you know, that's how, for example,
01:51:23.660 | we domesticate dogs, right?
01:51:25.660 | We convince dogs we're the leader of their pack, right?
01:51:29.100 | And we domesticate many animals that way,
01:51:30.800 | but basically we just swap into the top of their status hierarchy
01:51:34.140 | and we say, we're your top status animal,
01:51:36.800 | so you should do what we say, you should follow our lead.
01:51:39.600 | So the idea that would be,
01:51:42.120 | they are going to get us to do what they want
01:51:44.400 | by being top status.
01:51:47.760 | You know, all through history,
01:51:49.640 | kings and emperors, et cetera,
01:51:51.080 | have tried to impress their citizens and other people
01:51:53.340 | by having the bigger palace, the bigger parade,
01:51:55.400 | the bigger crown and diamonds, right?
01:51:57.560 | Whatever, maybe building a bigger pyramid, et cetera.
01:52:00.120 | Just, it's a very well-established trend
01:52:02.440 | that just be high status
01:52:04.000 | by being more impressive than the rest.
01:52:05.840 | - To push back, when there's an order of,
01:52:08.520 | several orders of magnitude of power differential,
01:52:11.680 | asymmetry of power,
01:52:13.280 | I feel like that status hierarchy no longer applies.
01:52:15.840 | It's like mimetic theory.
01:52:17.460 | It's like-
01:52:18.300 | - Most emperors are several orders of magnitude
01:52:20.280 | more powerful than any one member of their empire.
01:52:23.080 | - Let's increase that by even more.
01:52:25.880 | So like, if I'm interacting with ants, right?
01:52:29.760 | I no longer feel like I need to establish
01:52:32.640 | my power with ants.
01:52:34.120 | I actually want to lessen,
01:52:36.640 | I want to lower myself to the ants.
01:52:39.720 | I want to become the lowest possible ant
01:52:42.440 | so that they would welcome me.
01:52:44.560 | So I'm less concerned about them worshiping me.
01:52:46.980 | I'm more concerned about them welcoming me,
01:52:49.240 | integrating me into their world.
01:52:50.080 | - But it is important that you be non-threatening
01:52:51.720 | and that you be local.
01:52:52.640 | So I think, for example,
01:52:53.480 | if the aliens had done something really big in the sky,
01:52:55.780 | you know, a hundred light years away,
01:52:57.780 | that would be there, not here.
01:52:59.360 | - Yes.
01:53:00.200 | - And that could seem threatening.
01:53:01.280 | So I think their strategy to be the high status
01:53:03.720 | would have to be to be visible,
01:53:05.140 | but to be here and non-threatening.
01:53:06.640 | - I just don't know if it's obvious how to do that.
01:53:09.440 | Like, take your own perspective.
01:53:10.920 | You see a planet with relatively intelligent,
01:53:15.820 | like complex structures being formed,
01:53:18.360 | like, yeah, life forms.
01:53:20.200 | We could see this on Titan or something like that.
01:53:23.800 | The moon, you know, Europa.
01:53:27.320 | You start to see not just primitive bacterial life,
01:53:29.980 | but multicellular life,
01:53:31.300 | and it seems to form some very complicated
01:53:33.620 | cellular colonies, structures that they're dynamic.
01:53:38.620 | There's a lot of stuff going on.
01:53:40.360 | Some gigantic cellular automata type of construct.
01:53:45.360 | How do you make yourself known to them
01:53:50.100 | in an impressive fashion without destroying it?
01:53:54.640 | Like, we know how to destroy, potentially.
01:53:56.920 | - Right, so if you go touch stuff,
01:53:59.680 | you're likely to hurt it, right?
01:54:01.160 | There's a good risk of hurting something
01:54:02.560 | by getting too close and touching it and interacting, right?
01:54:05.080 | - Yeah, like landing on a White House lawn.
01:54:06.940 | - Right, so the claim is that their current strategy
01:54:11.400 | of hanging out at the periphery of our vision
01:54:13.740 | and just being very clearly physically impressive
01:54:16.240 | with very clear physically impressive abilities
01:54:19.360 | is at least a plausible strategy they might use
01:54:23.060 | to impress us and convince us
01:54:25.240 | that we're at the top of their status hierarchy.
01:54:28.160 | And I would say, if they came closer,
01:54:30.560 | not only would they risk hurting us
01:54:31.960 | in ways that they couldn't really understand,
01:54:33.840 | but more plausibly, they would reveal things
01:54:36.440 | about themselves we would hate.
01:54:37.800 | So if you look at how we treat other civilizations
01:54:41.120 | on Earth and other people,
01:54:42.920 | we are generally interested in foreigners
01:54:46.080 | and people from other distant lands,
01:54:47.840 | and we are generally interested
01:54:49.240 | in their varying cultures and customs, et cetera,
01:54:51.240 | until we find out that they do something
01:54:53.260 | that violates our moral norms, and then we hate them.
01:54:55.920 | (laughs)
01:54:56.760 | And these are aliens, for God's sakes, right?
01:54:59.280 | - Yeah. - There's just gonna be
01:55:00.360 | something about them that we hate.
01:55:01.760 | They eat babies, who knows what it is.
01:55:04.520 | But something they don't think is offensive,
01:55:05.920 | but that they think we might find offensive.
01:55:07.840 | And so they would be risking a lot
01:55:10.040 | by revealing a lot about themselves.
01:55:11.720 | We would find something we hated.
01:55:13.720 | - Interesting, but do you resonate at all
01:55:16.200 | with mimetic theory where we only feel this way
01:55:20.040 | about things that are very close to us?
01:55:21.920 | So aliens are sufficiently different
01:55:23.360 | to where we'll be like fascinated,
01:55:25.940 | terrified or fascinated, but not like--
01:55:27.880 | - Right, but if they wanna be at the top
01:55:29.260 | of our status hierarchy to get us to follow them,
01:55:31.500 | they can't be too distant.
01:55:32.800 | They have to be close enough that we would see them that way.
01:55:36.100 | - But pretend to be close enough, right,
01:55:38.180 | and not reveal much, that mystery,
01:55:40.300 | that old Clint Eastwood cowboy thing: say less.
01:55:43.180 | - I mean, the point is we're clever enough
01:55:45.420 | that we can figure out their agenda.
01:55:46.820 | That is, just from the fact that they're here.
01:55:48.380 | If we see that they're here, we can figure out,
01:55:49.980 | oh, they want us not to expand.
01:55:51.620 | And look, they are this huge power
01:55:53.780 | and they're very impressive,
01:55:54.620 | and a lot of us don't wanna expand,
01:55:56.660 | so that could easily tip us over the edge
01:55:58.780 | toward we already wanted to not expand.
01:56:01.980 | We already wanted to be able to regulate
01:56:03.820 | and have a central community,
01:56:05.560 | and here are these very advanced, smart aliens
01:56:08.020 | who have survived for 100 million years,
01:56:11.060 | and they're telling us not to expand either.
01:56:13.260 | - (laughs) This is brilliant.
01:56:16.100 | I love this so much.
01:56:17.200 | Returning to panspermia siblings,
01:56:21.540 | just to clarify one thing,
01:56:22.900 | in that framework, who originated, who planted it?
01:56:28.800 | Would it be a grabby alien civilization
01:56:33.060 | that planted the siblings or no?
01:56:35.820 | - The simple scenario is that life started
01:56:38.760 | on some other planet billions of years ago,
01:56:42.460 | and it went through part of the stages of evolution
01:56:45.220 | to advanced life, but not all the way to advanced life,
01:56:48.100 | and then some rock hit it, grabbed a piece of it
01:56:50.980 | on the rock, and that rock drifted
01:56:52.660 | for maybe a million years until it happened
01:56:55.260 | to prawn the stellar nursery,
01:56:57.060 | where it then seeded many stars.
01:56:59.220 | - And something about that life,
01:57:01.340 | without being super advanced,
01:57:02.860 | it was nevertheless resilient
01:57:04.580 | to the harsh conditions of space.
01:57:06.560 | - There's some graphs that I've been impressed by
01:57:08.860 | that show sort of the level of genetic information
01:57:12.360 | in various kinds of life over the history of Earth,
01:57:15.100 | and basically, we are now more complex
01:57:17.780 | than the earlier life, but the earlier life
01:57:19.620 | was still pretty damn complex.
01:57:21.660 | And so if you actually project this log graph in history,
01:57:24.740 | it looks like it was many billions of years ago
01:57:27.420 | when you get down to zero, so plausibly,
01:57:29.620 | you could say there was just a lot of evolution
01:57:31.500 | that had to happen before you could get
01:57:33.240 | to the simplest life we've ever seen in history
01:57:35.180 | of life on Earth, was still pretty damn complicated.
01:57:37.540 | Okay, and so that's always been this puzzle.
01:57:40.660 | How could life get to this enormously complicated level
01:57:43.860 | in the short period it seems to
01:57:46.420 | at the beginning of Earth history?
01:57:48.300 | So, you know, it appeared within at most 300 million years,
01:57:51.780 | and then it was really complicated
01:57:55.180 | at that point, so panspermia allows you
01:57:57.860 | to explain that complexity by saying,
01:57:59.220 | well, it's been another five billion years
01:58:01.860 | on another planet, going through lots of earlier stages
01:58:04.700 | where it was working its way up to the level
01:58:06.780 | of complexity you see at the beginning of Earth.
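A toy sketch of the kind of log-scale extrapolation being described: fit a straight line to log-complexity versus time and ask when it crosses zero. The sample points below are hypothetical placeholders, not data from the conversation or any particular study; the point is only that a steady log-linear trend can push the implied origin back before Earth formed.

```python
# Toy log-linear extrapolation of genetic complexity back in time.
# The (age, log10-complexity) points are hypothetical placeholders.
import numpy as np

ages_gyr = np.array([3.5, 2.0, 0.5, 0.0])          # billions of years before present
log10_complexity = np.array([5.0, 6.5, 8.0, 8.5])  # illustrative log10 values

# Fit a line in log space (x axis runs forward in time) and find where it hits zero.
slope, intercept = np.polyfit(-ages_gyr, log10_complexity, 1)
origin_gyr_ago = intercept / slope
print(f"extrapolated origin of life: ~{origin_gyr_ago:.1f} Gyr ago")  # ~8.5 with these points
```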
01:58:09.380 | - We'll try to talk about other ideas
01:58:12.060 | of the origin of life, but let me return to UFO sightings.
01:58:16.080 | Are there other explanations that are possible
01:58:18.620 | outside of panspermia siblings that can explain
01:58:22.180 | no grabby aliens in the sky,
01:58:24.700 | and yet alien arrival on Earth?
01:58:27.340 | - Well, the other categories of explanations
01:58:31.200 | that most people will use is, well, first of all,
01:58:33.820 | just mistakes, like, you know, you're confusing
01:58:37.020 | something ordinary for something mysterious, right?
01:58:40.780 | Or some sort of secret organization,
01:58:43.380 | like our government is secretly messing with us,
01:58:45.940 | and trying to do a false flag ops or whatever, right?
01:58:50.100 | They're trying to convince the Russians or the Chinese
01:58:52.180 | that there might be aliens and scare them
01:58:53.820 | into not attacking or something, right?
01:58:55.800 | 'Cause if you know the history of World War II,
01:58:58.500 | say the US government did all these big fake operations
01:59:02.700 | where they were faking a lot of big things
01:59:04.580 | in order to mess with people, so that's a possibility.
01:59:07.060 | The government's been lying and faking things
01:59:09.740 | and paying people to lie about what they saw, et cetera.
01:59:14.060 | That's a plausible set of explanations
01:59:16.380 | for the range of sightings seen.
01:59:18.980 | And another explanation people offer
01:59:20.540 | is some other hidden organization on Earth,
01:59:22.660 | or some secret organization somewhere
01:59:25.340 | that has much more advanced capabilities
01:59:27.220 | than anybody's given it credit for.
01:59:28.860 | For some reason, it's been keeping them secret.
01:59:31.020 | I mean, they all sound somewhat implausible,
01:59:34.060 | but again, we're looking for maybe
01:59:35.460 | one in a thousand sort of priors.
01:59:37.780 | Question is, you know, could they be
01:59:40.540 | at that level of plausibility?
01:59:42.860 | - Can we just linger on this?
01:59:44.060 | So you, first of all, you've written,
01:59:47.460 | talked about, thought about so many different topics.
01:59:51.260 | You're an incredible mind,
01:59:52.940 | and I just thank you for sitting down today.
01:59:55.940 | I'm almost like at a loss of which place we explore,
01:59:59.740 | but let me, on this topic, ask about conspiracy theories.
02:00:03.660 | 'Cause you've written about institutions, authorities.
02:00:11.100 | What, this is a bit of a therapy session,
02:00:14.180 | but what do we make of conspiracy theories?
02:00:19.180 | - The phrase itself is pushing you in a direction, right?
02:00:23.260 | So clearly, in history, we have had
02:00:26.020 | many large coordinated keepings of secrets, right?
02:00:29.140 | Say the Manhattan Project, right?
02:00:30.580 | And there were, what, hundreds of thousands
02:00:32.020 | of people working on that over many years.
02:00:34.420 | But they kept it a secret, right?
02:00:36.180 | Clearly, many large military operations
02:00:38.460 | have kept things secret over even decades
02:00:43.300 | with many thousands of people involved.
02:00:45.380 | So clearly, it's possible to keep something secret
02:00:48.620 | over time periods, but the more people you involve
02:00:53.460 | and the more time you're assuming,
02:00:55.300 | and the more, the less centralized an organization
02:00:58.820 | or the less discipline they have,
02:00:59.980 | the harder it gets to believe.
02:01:01.420 | But we're just trying to calibrate, basically, in our minds
02:01:04.620 | which kind of secrets can be kept by which groups
02:01:06.820 | over what time periods for what purposes, right?
02:01:08.860 | - But let me, I don't have enough data.
02:01:11.940 | So I'm somebody, I hang out with people,
02:01:16.580 | and I love people, I love all things, really.
02:01:19.620 | And I just, I think that most people,
02:01:22.980 | even the assholes, have the capacity to be good,
02:01:25.900 | and they're beautiful, and I enjoy them.
02:01:28.500 | So the kind of data my brain,
02:01:30.860 | whatever the chemistry of my brain is
02:01:32.400 | that sees the beautiful in things,
02:01:34.500 | is maybe collecting a subset of data
02:01:37.860 | that doesn't allow me to intuit the competence
02:01:42.460 | that humans are able to achieve
02:01:46.860 | in constructing a conspiracy theory.
02:01:49.460 | So for example, one thing that people often talk about
02:01:52.060 | is intelligence agencies, this broad thing they say,
02:01:55.340 | the CIA, the FSB, the different, the British intelligence.
02:01:58.900 | I have, fortunately or unfortunately,
02:02:01.980 | never gotten a chance that I know of
02:02:04.740 | to talk to any member of those intelligence agencies.
02:02:07.440 | Nor take a peek behind the curtain,
02:02:13.340 | or the first curtain, I don't know how many levels
02:02:15.380 | of curtains there are, and so I can't intuit.
02:02:18.580 | But my interactions with government,
02:02:20.060 | I was funded by DOD and DARPA, and I've interacted,
02:02:24.420 | been to the Pentagon.
02:02:25.660 | With all due respect to my friends,
02:02:30.720 | lovely friends in government,
02:02:32.140 | and there are a lot of incredible people,
02:02:34.100 | but there is a very giant bureaucracy
02:02:37.140 | that sometimes suffocates the ingenuity of the human spirit,
02:02:41.620 | is one way I can put it.
02:02:43.140 | Meaning, they are, I just, it's difficult for me
02:02:46.940 | to imagine extreme competence at a scale
02:02:50.100 | of hundreds or thousands of human beings.
02:02:52.520 | Now, that doesn't mean, that's my very anecdotal data
02:02:55.620 | of the situation.
02:02:56.900 | And so I try to build up my intuition
02:03:00.700 | about centralized system of government,
02:03:04.880 | how much conspiracy is possible,
02:03:07.900 | how much the intelligence agencies,
02:03:10.720 | or some other source can generate
02:03:14.100 | sufficiently robust propaganda that controls the populace.
02:03:18.420 | If you look at World War II, as you mentioned,
02:03:21.780 | there have been extremely powerful propaganda machines
02:03:25.380 | on the side of Nazi Germany, on the side of the Soviet Union,
02:03:30.260 | on the side of the United States,
02:03:32.340 | and all these different mechanisms.
02:03:35.780 | Sometimes they control the free press
02:03:38.660 | through social pressures.
02:03:40.220 | Sometimes they control the press
02:03:42.940 | through the threat of violence,
02:03:45.980 | as you do in authoritarian regimes.
02:03:48.100 | Sometimes it's like deliberately,
02:03:49.500 | the dictator writing the news,
02:03:52.420 | the headlines and literally announcing it.
02:03:54.460 | And something about human psychology
02:03:59.020 | forces you to embrace the narrative
02:04:03.580 | and believe the narrative.
02:04:05.020 | And at scale, that becomes reality
02:04:07.620 | when the initial spark was just a propaganda thought
02:04:11.360 | in a single individual's mind.
02:04:13.300 | So I can't necessarily intuit of what's possible,
02:04:18.300 | but I'm skeptical of the power of human institutions
02:04:23.380 | to construct conspiracy theories
02:04:26.480 | that cause suffering at scale.
02:04:28.860 | Especially in this modern age,
02:04:30.780 | when information is becoming more and more accessible
02:04:33.340 | by the populace.
02:04:34.420 | Anyway, I don't know if you can elucidate--
02:04:37.700 | - You said it causes suffering at scale,
02:04:39.020 | but of course, say during wartime,
02:04:40.660 | the people who are managing the various conspiracies
02:04:43.300 | like D-Day or Manhattan Project,
02:04:45.300 | they thought that their conspiracy was avoiding harm
02:04:49.260 | rather than causing harm.
02:04:51.040 | So if you can get a lot of people to think
02:04:52.560 | that supporting the conspiracy is helpful,
02:04:56.340 | then a lot more might do that.
02:04:58.820 | And there's just a lot of things
02:05:00.300 | that people just don't wanna see.
02:05:02.980 | So if you can make your conspiracy the sort of thing
02:05:04.980 | that people wouldn't wanna talk about anyway,
02:05:06.860 | even if they knew about it,
02:05:08.620 | you're most of the way there.
02:05:10.740 | So I have learned over the years,
02:05:13.020 | many things that most ordinary people
02:05:15.280 | should be interested in, but somehow don't know,
02:05:17.320 | even though the data has been very widespread.
02:05:19.620 | So I have this book, "The Elephant in the Brain,"
02:05:21.740 | and one of the chapters is there on medicine.
02:05:23.940 | And basically, most people seem ignorant
02:05:27.140 | of the very basic fact that when we do randomized trials
02:05:29.860 | where we give some people more medicine than others,
02:05:32.540 | the people who get more medicine are not healthier.
02:05:35.100 | Just overall, in general, just like induce somebody
02:05:38.940 | to get more medicine because you just give them
02:05:40.580 | more budget to buy medicine, say.
02:05:42.860 | Not a specific medicine, just the whole category.
02:05:45.820 | And you would think that would be something
02:05:47.360 | most people should know about medicine.
02:05:49.100 | You might even think that would be a conspiracy theory
02:05:51.940 | to think that would be hidden,
02:05:53.100 | but in fact, most people never learn that fact.
02:05:55.860 | - So just to clarify, just a general high-level statement,
02:06:00.860 | the more medicine you take, the less healthy you are.
02:06:03.580 | - Randomized experiments don't find that fact.
02:06:07.340 | Do not find that more medicine makes you more healthy.
02:06:10.220 | They're just no connection.
02:06:11.520 | In randomized experiments, there's no relationship
02:06:15.060 | between more medicine and being healthier.
02:06:16.540 | - So it's not a negative relationship,
02:06:18.300 | but it's just no relationship.
02:06:19.740 | - Right.
02:06:21.020 | - And so the conspiracy theory would say
02:06:25.080 | that the businesses that sell you medicine
02:06:27.100 | don't want you to know that fact.
02:06:28.820 | And then you're saying that there's also part of this
02:06:32.100 | is that people just don't wanna know.
02:06:33.860 | - They just don't wanna know.
02:06:35.300 | And so they don't learn this.
02:06:36.300 | So I've lived in the Washington area
02:06:38.620 | for several decades now,
02:06:39.940 | reading the Washington Post regularly.
02:06:41.660 | Every week there was a special section
02:06:44.940 | on health and medicine.
02:06:45.840 | It never was mentioned in that section of the paper
02:06:48.500 | in all the 20 years I read that.
02:06:50.740 | - So do you think there is some truth
02:06:52.080 | to this caricatured blue pill, red pill,
02:06:55.320 | where most people don't want to know the truth?
02:06:58.160 | - There are many things about which people
02:07:00.560 | don't want to know certain kinds of truths.
02:07:03.160 | - Yeah.
02:07:04.000 | - That is, bad-looking truths, truths that are discouraging,
02:07:06.640 | truths that sort of take away the justification
02:07:09.040 | for things they feel passionate about.
02:07:11.320 | - Do you think that's a bad aspect of human nature?
02:07:14.200 | That's something we should try to overcome?
02:07:16.360 | - Well, as we discussed, my first priority
02:07:20.060 | is to just tell people about it,
02:07:21.620 | to do the analysis and the cold facts
02:07:23.600 | of what's actually happening,
02:07:24.680 | and then to try to be careful about how we can improve.
02:07:27.260 | So our book, "The Elephant in the Brain,"
02:07:28.960 | coauthored with Kevin Simler,
02:07:30.280 | is about hidden motives in everyday life.
02:07:33.360 | And our first priority there is just to explain to you
02:07:36.080 | what are the things that you are not looking at
02:07:38.360 | that you are reluctant to look at.
02:07:40.540 | And many people try to take that book
02:07:42.200 | as a self-help book where they're trying
02:07:43.760 | to improve themselves and make sure
02:07:45.520 | they look at more things.
02:07:46.840 | And that often goes badly because it's harder
02:07:49.300 | to actually do that than you think.
02:07:51.240 | - Yeah.
02:07:52.080 | - And so, but we at least want you to know
02:07:54.000 | that this truth is available if you want to learn about it.
02:07:57.160 | - It's the Nietzsche, "If you gaze long into the abyss,
02:07:59.440 | the abyss gazes into you."
02:08:01.280 | Let's talk about this "Elephant in the Brain."
02:08:03.580 | Amazing book.
02:08:05.960 | "The Elephant in the Room" is, quote,
02:08:08.440 | "An important issue that people are reluctant
02:08:10.240 | to acknowledge or address; a social taboo.
02:08:12.920 | The elephant in the brain is an important
02:08:15.400 | but unacknowledged feature of how our minds work;
02:08:18.440 | an introspective taboo.
02:08:21.360 | You describe selfishness and self-deception
02:08:25.360 | as the core or some of the core elephants,
02:08:29.760 | as some of the elephant offspring in the brain.
02:08:33.480 | Selfishness and self-deception."
02:08:35.720 | All right.
02:08:38.200 | Can you explain?
02:08:39.240 | Can you explain why these are the taboos
02:08:43.240 | in our brain that we don't want to acknowledge to ourselves?
02:08:48.240 | - In your conscious mind, the one that's listening to me
02:08:51.040 | that I'm talking to at the moment,
02:08:53.000 | you like to think of yourself as the president
02:08:55.360 | or king of your mind, ruling over all that you see,
02:08:59.080 | issuing commands that immediately obeyed.
02:09:01.440 | You are instead better understood
02:09:04.720 | as the press secretary of your brain.
02:09:07.680 | You don't make decisions, you justify them to an audience.
02:09:11.600 | That's what your conscious mind is for.
02:09:15.960 | You watch what you're doing and you try to come up
02:09:19.440 | with stories that explain what you're doing
02:09:21.440 | so that you can avoid accusations of violating norms.
02:09:24.800 | So humans compared to most other animals have norms
02:09:28.280 | and this allows us to manage larger groups
02:09:30.640 | with our morals and norms about what we should
02:09:33.560 | or shouldn't be doing.
02:09:35.000 | This is so important to us that we needed
02:09:38.040 | to be constantly watching what we were doing
02:09:40.000 | in order to make sure we had a good story
02:09:42.560 | to avoid norm violations.
02:09:44.080 | So many norms are about motives.
02:09:45.960 | So if I hit you on purpose, that's a big violation.
02:09:48.720 | If I hit you accidentally, that's okay.
02:09:50.880 | I need to be able to explain why it was an accident
02:09:53.120 | and not on purpose.
02:09:54.160 | - So where does that need come from
02:09:57.080 | for your own self-preservation?
02:09:59.000 | - Right, so humans have norms and we have the norm
02:10:01.280 | that if we see anybody violating a norm,
02:10:03.200 | we need to tell other people and then coordinate
02:10:05.280 | to make them stop and punish them for violating.
02:10:09.280 | So such punishments are strong enough and severe enough
02:10:12.240 | that we each want to avoid being successfully accused
02:10:15.920 | of violating norms.
02:10:16.880 | So for example, hitting someone on purpose
02:10:20.600 | is a big clear norm violation.
02:10:22.280 | If we do it consistently, we may be thrown out of the group
02:10:24.760 | and that would mean we would die.
02:10:26.640 | Okay, so we need to be able to convince people
02:10:29.440 | we are not going around hitting people on purpose.
02:10:32.520 | If somebody happens to be at the other end of our fist
02:10:35.360 | and their face connects, that was an accident
02:10:38.240 | and we need to be able to explain that.
02:10:41.780 | And similarly for many other norms humans have,
02:10:44.720 | we are serious about these norms
02:10:46.200 | and we don't want people to violate.
02:10:48.120 | We find them violating, we're going to accuse them.
02:10:49.920 | But many norms have a motive component
02:10:52.400 | and so we are trying to explain ourselves
02:10:55.320 | and make sure we have a good motive story
02:10:57.280 | about everything we do, which is why we're constantly
02:11:01.000 | trying to explain what we're doing
02:11:02.240 | and that's what your conscious mind is doing.
02:11:04.240 | It is trying to make sure you've got a good motive story
02:11:07.480 | for everything you're doing.
02:11:08.720 | And that's why you don't know why you really do things.
02:11:11.240 | What you know is what the good story is
02:11:13.880 | about why you've been doing things.
02:11:15.480 | - And that's the self-deception.
02:11:17.320 | And you're saying that there is a machine,
02:11:19.380 | the actual dictator is selfish.
02:11:22.840 | And then you're just the press secretary
02:11:24.520 | who desperately doesn't want to get fired
02:11:26.680 | and is justifying all of the decisions of the dictator.
02:11:30.240 | And that's the self-deception.
02:11:32.240 | - Right, now most people actually are willing to believe
02:11:35.300 | that this is true in the abstract.
02:11:36.720 | So our book has been classified as psychology
02:11:39.200 | and it was reviewed by psychologists
02:11:40.880 | and the basic way that psychology referees
02:11:43.840 | and reviewers responded is to say this is well known.
02:11:47.080 | Most people accept that there's a fair bit
02:11:48.840 | of self-deception.
02:11:49.720 | - But they don't want to accept it about themselves.
02:11:52.080 | - Well, they don't want to accept it
02:11:52.920 | about the particular topics that we talk about.
02:11:55.180 | So people accept the idea in the abstract
02:11:57.800 | that they might be self-deceived
02:11:59.160 | or that they might not be honest about various things.
02:12:01.920 | But that hasn't penetrated into the literatures
02:12:04.200 | where people are explaining particular things
02:12:06.000 | like why we go to school, why we go to the doctor,
02:12:08.440 | why we vote, et cetera.
02:12:10.120 | So our book is mainly about 10 areas of life
02:12:13.080 | and explaining about in each area
02:12:15.160 | what our actual motives there are.
02:12:17.760 | And people who study those things have not admitted
02:12:21.640 | that hidden motives are explaining those particular areas.
02:12:25.480 | - So they haven't taken the leap from theoretical psychology
02:12:28.040 | to actual public policy.
02:12:30.240 | - Exactly.
02:12:31.080 | - And economics and all that kind of stuff.
02:12:32.720 | Well, let me just linger on this
02:12:34.680 | and bring up my old friends Sigmund Freud
02:12:39.960 | and Carl Jung.
02:12:41.440 | So how vast is this landscape of the unconscious mind,
02:12:46.440 | the power and the scope of the dictator?
02:12:50.860 | Is it only dark there?
02:12:53.680 | Is there some light?
02:12:55.720 | Is there some love?
02:12:56.880 | - The vast majority of what's happening in your head
02:12:59.480 | you are unaware of.
02:13:00.660 | So in a literal sense, the unconscious,
02:13:03.860 | the aspects of your mind that you're not conscious of
02:13:06.760 | is the overwhelming majority.
02:13:09.160 | But that's just true in a literal engineering sense.
02:13:11.960 | Your mind is doing lots of low-level things
02:13:13.920 | and you just can't be consciously aware
02:13:15.600 | of all that low-level stuff.
02:13:16.840 | But there's plenty of room there
02:13:18.120 | for lots of things you're not aware of.
02:13:20.080 | - But can we try to shine a light
02:13:22.920 | at the things we're unaware of,
02:13:25.200 | specifically, now again,
02:13:26.600 | staying with the philosophical psychology side for a moment.
02:13:29.920 | You know, can you shine a light in the Jungian shadow?
02:13:32.800 | Can you, what's going on there?
02:13:35.600 | What is this machine like?
02:13:37.360 | Like what level of thoughts are happening there?
02:13:40.440 | Is it something that we can even interpret?
02:13:43.320 | If we somehow could visualize it,
02:13:46.000 | is it something that's human interpretable?
02:13:47.680 | Or is it just a kind of chaos
02:13:49.320 | of like monitoring different systems in the body,
02:13:52.300 | making sure you're happy, making sure you're fed,
02:13:56.100 | all those kind of basic forces
02:13:57.640 | that form abstractions on top of each other
02:13:59.880 | and they're not introspective at all?
02:14:01.800 | - We humans are social creatures.
02:14:03.960 | Plausibly being social is the main reason
02:14:05.980 | we have these unusually large brains.
02:14:07.880 | Therefore, most of our brain is devoted to being social.
02:14:10.940 | And so the things we are very obsessed with
02:14:14.360 | and constantly paying attention to are,
02:14:16.680 | how do I look to others?
02:14:18.880 | What would others think of me
02:14:20.480 | if they knew these various things
02:14:22.120 | they might learn about me?
02:14:23.240 | - So that's close to being fundamental
02:14:25.080 | to what it means to be human,
02:14:26.560 | is caring what others think.
02:14:28.040 | - Right, to be trying to present a story
02:14:31.400 | that would look okay to others,
02:14:33.240 | and we're constantly thinking,
02:14:35.320 | what do other people think?
02:14:36.820 | - So let me ask you this question then
02:14:39.060 | about you, Robin Hanson, who in many places,
02:14:44.060 | sometimes for fun, sometimes as a basic statement
02:14:48.180 | of principle, likes to disagree
02:14:50.540 | with what the majority of people think.
02:14:53.220 | So how do you explain,
02:14:57.220 | how are you self-deceiving yourself in this task
02:15:01.220 | and how are you being self,
02:15:03.140 | how's your, like why is the dictator manipulating you
02:15:06.540 | inside your head to be self-critical?
02:15:08.700 | Like there's norms, why do you wanna stand out in this way?
02:15:13.560 | Why do you want to challenge the norms in this way?
02:15:15.560 | - Almost by definition, I can't tell you
02:15:17.480 | what I'm deceiving myself about,
02:15:19.680 | but the more practical strategy that's quite feasible
02:15:22.400 | is to ask about what are typical things
02:15:24.760 | that most people deceive themselves about
02:15:26.720 | and then to own up to those particular things.
02:15:29.360 | - Sure, what's a good one?
02:15:31.960 | - So for example, I can very much acknowledge
02:15:35.360 | that I would like to be well thought of,
02:15:38.560 | that I would be seeking attention and glory
02:15:43.360 | and praise from my intellectual work
02:15:47.480 | and that that would be a major agenda
02:15:49.380 | driving my intellectual attempts.
02:15:51.880 | So if there were topics that other people
02:15:55.980 | would find less interesting,
02:15:57.420 | I might be less interested in those
02:15:59.240 | for that reason, for example.
02:16:00.680 | I might want to find topics where other people are interested
02:16:03.080 | and I might want to go for the glory
02:16:06.680 | of finding a big insight rather than a small one
02:16:10.600 | and maybe one that was especially surprising.
02:16:15.520 | That's also of course consistent
02:16:16.880 | with some more ideal concept
02:16:18.920 | of what an intellectual should be,
02:16:20.680 | but most intellectuals are relatively risk-averse.
02:16:25.120 | They are in some local intellectual tradition
02:16:28.540 | and they are adding to that
02:16:30.160 | and they are staying conforming
02:16:32.160 | to the sort of usual assumptions
02:16:33.780 | and usual accepted beliefs and practices
02:16:36.320 | of a particular area
02:16:38.040 | so that they can be accepted in that area
02:16:40.540 | and treated as part of the community.
02:16:43.280 | But you might think for the purpose
02:16:47.380 | of the larger intellectual project
02:16:49.120 | of understanding the world better,
02:16:51.160 | people should be less eager
02:16:53.200 | to just add a little bit to some tradition
02:16:55.480 | and they should be looking for what's neglected
02:16:57.240 | between the major traditions and major questions.
02:16:59.400 | They should be looking for assumptions
02:17:00.840 | maybe we're making that are wrong.
02:17:02.520 | They should be looking at ways,
02:17:04.400 | things that are very surprising,
02:17:05.880 | like things that would be,
02:17:07.240 | you would have thought a priori unlikely
02:17:10.120 | that once you are convinced of it,
02:17:11.640 | you find that to be very important
02:17:13.480 | and a big update, right?
02:17:16.200 | So you could say that I might be
02:17:21.200 | less motivated to be sort of comfortably accepted
02:17:26.480 | into some particular intellectual community
02:17:28.340 | and more willing to just go for these
02:17:30.400 | more fundamental long shots
02:17:33.480 | that should be very important if you could find them.
02:17:36.760 | - Which would, if you can find them,
02:17:39.680 | would get you appreciated-- - Attention, respect.
02:17:43.680 | - Across a larger number of people
02:17:45.960 | across the longer time span of history.
02:17:49.000 | - Right.
02:17:49.840 | - So like maybe the small local community
02:17:52.840 | will say you suck.
02:17:54.640 | - Right.
02:17:55.480 | - You must conform,
02:17:56.720 | but the larger community will see the brilliance
02:18:00.300 | of you breaking out of the cage of the small conformity
02:18:03.540 | into a larger cage.
02:18:05.340 | It's always a bigger, there's always a bigger cage,
02:18:07.580 | and then you'll be remembered by more.
02:18:10.140 | Yeah, also that explains your choice of colorful shirt
02:18:14.300 | that looks great in a black background,
02:18:15.860 | so you definitely stand out.
02:18:17.940 | - Right, now of course, you could say,
02:18:20.860 | well, you could get all this attention
02:18:22.500 | by making false claims of dramatic improvement.
02:18:26.100 | - Sure.
02:18:26.940 | - And then wouldn't that be much easier
02:18:28.460 | than actually working through all the details--
02:18:30.180 | - Why not?
02:18:31.020 | - To make true claims, right?
02:18:31.840 | - Let me ask the press secretary, why not?
02:18:34.860 | Why, so of course, you spoke several times
02:18:37.680 | about how much you value truth and the pursuit of truth.
02:18:40.640 | That's a very nice narrative.
02:18:42.240 | - Right.
02:18:43.080 | - Hitler and Stalin also talked about the value of truth.
02:18:46.460 | Do you worry when you introspect,
02:18:49.440 | as broadly as all humans might,
02:18:52.860 | that it becomes a drug?
02:18:55.800 | This being a martyr,
02:19:00.360 | being the person who points out
02:19:03.620 | that the emperor wears no clothes,
02:19:05.740 | even when the emperor is obviously dressed,
02:19:09.280 | just to be the person who points out
02:19:12.360 | that the emperor is wearing no clothes.
02:19:14.760 | Do you think about that?
02:19:15.960 | - So I think the standards you hold yourself to
02:19:22.960 | are dependent on the audience you have in mind.
02:19:25.360 | So if you think of your audience
02:19:28.120 | as relatively easily fooled or relatively gullible,
02:19:32.500 | then you won't bother to generate more complicated,
02:19:35.400 | deep arguments and structures and evidence
02:19:39.240 | to persuade somebody who has higher standards
02:19:42.280 | because why bother?
02:19:43.520 | You can get away with something much easier.
02:19:46.340 | And of course, if you are, say, a salesperson,
02:19:49.200 | or you make money on sales,
02:19:50.440 | then you don't need to convince the top few percent
02:19:53.480 | of the most sharp customers.
02:19:54.700 | You can just go for the bottom 60%
02:19:57.440 | of the most gullible customers
02:19:58.640 | and make plenty of sales, right?
02:20:01.040 | So I think intellectuals have to vary.
02:20:05.280 | One of the main ways intellectuals vary
02:20:06.680 | is in who is their audience in their mind.
02:20:08.440 | Who are they trying to impress?
02:20:10.840 | Is it the people down the hall?
02:20:12.440 | Is it the people who are reading their Twitter feed?
02:20:14.840 | Is it their parents?
02:20:16.340 | Is it their high school teacher?
02:20:19.000 | Or is it Einstein and Freud and Socrates, right?
02:20:23.380 | So I think those of us who are especially arrogant,
02:20:27.560 | especially think that we're a really big shot
02:20:31.040 | or have a chance at being a really big shot,
02:20:33.120 | we're naturally gonna pick the big shot audience
02:20:35.880 | that we can.
02:20:36.720 | We're gonna be trying to impress Socrates and Einstein.
02:20:39.520 | - Is that why you hang out with Tyler Cowen a lot?
02:20:41.680 | - Sure, I mean.
02:20:42.760 | - Try to convince him and stuff.
02:20:44.040 | - You know, and you might think,
02:20:45.640 | from the point of view of just making money
02:20:47.480 | or having sex or other sorts of things,
02:20:49.360 | this is misdirected energy, right?
02:20:52.560 | Trying to impress the very most highest quality minds,
02:20:56.760 | that's such a small sample
02:20:57.920 | and they can't do that much for you anyway.
02:21:00.280 | So I might well have had more ordinary success in life,
02:21:04.640 | be more popular, invited to more parties, make more money
02:21:07.320 | if I had targeted a lower tier set of intellectuals
02:21:12.320 | with the standards they have.
02:21:14.560 | But for some reason, I decided early on
02:21:17.160 | that Einstein was my audience, or people like him,
02:21:20.640 | and I was gonna impress them.
02:21:23.280 | - Yeah, I mean, you pick your set of motivations.
02:21:26.800 | You know, convincing, impressing Tyler Cowen
02:21:28.880 | is not gonna help you get laid, trust me, I tried.
02:21:32.360 | All right.
02:21:33.200 | What are some notable sort of effects
02:21:39.760 | of the elephant in the brain in everyday life?
02:21:43.560 | So you mentioned when we tried to apply that
02:21:46.840 | to economics, to public policy,
02:21:49.160 | so when we think about medicine, education,
02:21:51.080 | all those kinds of things, what are some things
02:21:53.200 | that we just-- - Well, the key thing is,
02:21:54.200 | medicine is much less useful health-wise than you think.
02:21:57.120 | So, you know, if you were focused on your health,
02:22:01.280 | you would care a lot less about it.
02:22:03.400 | And if you were focused on other people's health,
02:22:05.160 | you would also care a lot less about it.
02:22:06.720 | But if medicine is, as we suggest,
02:22:09.420 | more about showing that you care
02:22:10.560 | and let other people showing that they care about you,
02:22:13.160 | then a lot of priority on medicine can make sense.
02:22:16.120 | So that was our very earliest discussion in the podcast.
02:22:19.280 | You were talking about, you know,
02:22:21.000 | should you give people a lot of medicine
02:22:22.840 | when it's not very effective?
02:22:24.120 | And then the answer then is, well,
02:22:25.960 | if that's the way that you show that you care about them
02:22:28.560 | and you really want them to know you care,
02:22:31.160 | then maybe that's what you need to do
02:22:33.220 | if you can't find a cheaper, more effective substitute.
02:22:36.200 | - So if we actually just pause on that for a little bit,
02:22:39.500 | how do we start to untangle the full set
02:22:43.200 | of self-deception happening in the space of medicine?
02:22:46.720 | - So we have a method that we use in our book
02:22:48.800 | that is what I recommend for people
02:22:50.600 | to use in all these sorts of topics.
02:22:52.200 | The straightforward method is,
02:22:53.360 | first, don't look at yourself.
02:22:55.800 | Look at other people.
02:22:57.420 | Look at broad patterns of behavior in other people.
02:23:00.560 | And then ask, what are the various theories we could have
02:23:03.720 | to explain these patterns of behavior?
02:23:05.700 | And then just do the simple matching.
02:23:07.240 | Which theory better matches the behavior they have?
02:23:10.920 | And the last step is to assume that's true of you.
02:23:13.980 | Don't assume you're an exception.
02:23:17.120 | If you happen to be an exception, that won't go so well,
02:23:19.000 | but nevertheless, on average, you aren't very well
02:23:21.680 | positioned to judge if you're an exception.
02:23:23.220 | So look at what other people do,
02:23:26.200 | explain what other people do, and assume that's you too.
02:23:29.220 | - But also, in the case of medicine,
02:23:31.080 | there's several parties to consider.
02:23:34.400 | So there's the individual person
02:23:35.960 | that's receiving the medicine.
02:23:37.480 | There's the doctors that are prescribing the medicine.
02:23:40.080 | There's drug companies that are selling drugs.
02:23:44.380 | There are governments that have regulations,
02:23:46.060 | there are lobbyists.
02:23:46.940 | So you can build up a network of categories of humans
02:23:50.980 | in this, and they each play their role.
02:23:53.900 | So how do you introspect,
02:23:56.140 | sort of analyze the system at a system scale
02:24:01.340 | versus at the individual scale?
02:24:03.820 | - So it turns out that in general,
02:24:06.100 | it's usually much easier to explain producer behavior
02:24:09.700 | than consumer behavior.
02:24:11.340 | That is, the drug companies or the doctors
02:24:14.280 | have relatively clear incentives
02:24:16.100 | to give the customers whatever they want.
02:24:18.240 | And similarly say, governments in democratic countries
02:24:22.020 | have the incentive to give the voters what they want.
02:24:24.900 | So that focuses your attention on the patient
02:24:29.020 | and the voter in this equation,
02:24:31.260 | and saying, what do they want?
02:24:33.020 | They would be driving the rest of the system.
02:24:36.040 | Whatever they want, the other parties
02:24:37.900 | are willing to give them in order to get paid.
02:24:40.560 | So now we're looking for puzzles
02:24:43.460 | in patient and voter behavior.
02:24:46.020 | What are they choosing and why do they choose that?
02:24:49.260 | - And how much exactly?
02:24:52.780 | And then we can explain that potentially,
02:24:54.500 | again, returning to the producer,
02:24:56.380 | by the producer being incentivized to manipulate
02:24:59.220 | the decision-making processes of the voter
02:25:01.420 | and the consumer.
02:25:02.240 | - Well now, in almost every industry,
02:25:04.340 | producers are, in general, happy to lie
02:25:07.340 | and exaggerate in order to get more customers.
02:25:09.700 | This is true of auto repair
02:25:10.900 | as much as human body repair in medicine.
02:25:13.560 | So the differences between these industries
02:25:15.460 | can't be explained by the willingness of the producers
02:25:18.500 | to give customers what they want
02:25:20.220 | or to do various things, so we have to, again,
02:25:22.540 | go to the customers.
02:25:23.500 | Why are customers treating body repair
02:25:26.420 | different than auto repair?
02:25:31.100 | - Yeah, and that potentially requires a lot of thinking,
02:25:34.500 | a lot of data collection,
02:25:35.780 | and potentially looking at historical data too
02:25:38.060 | 'cause things don't just happen overnight.
02:25:40.340 | Over time, there's trends.
02:25:41.180 | - In principle, it does, but actually,
02:25:42.700 | it's a lot easier than you might think.
02:25:45.500 | I think the biggest limitation is just the willingness
02:25:47.940 | to consider alternative hypotheses.
02:25:49.860 | So many of the patterns that you need to rely on
02:25:52.860 | are actually pretty obvious, simple patterns.
02:25:54.780 | You just have to notice them and ask yourself,
02:25:57.220 | how can I explain those?
02:25:59.300 | Often, you don't need to look at the most subtle,
02:26:01.340 | most difficult statistical evidence
02:26:04.420 | that might be out there.
02:26:05.540 | The simplest patterns are often enough.
02:26:08.620 | - All right, so there's a fundamental statement
02:26:10.780 | about self-deception in the book.
02:26:12.740 | There's the application of that,
02:26:14.060 | like we just did in medicine.
02:26:15.920 | Can you steel man the argument
02:26:18.740 | that many of the foundational ideas in the book are wrong?
02:26:25.260 | Meaning there's two that you just made,
02:26:29.540 | which is it can be a lot simpler than it looks.
02:26:33.100 | Can you steel man the case that it's,
02:26:35.780 | case by case, it's always super complicated.
02:26:38.900 | Like it's a complex system.
02:26:39.980 | It's very difficult to have a simple model about.
02:26:42.860 | It's very difficult to introspect.
02:26:44.140 | And the other one is that the human brain
02:26:46.500 | isn't just about self-deception,
02:26:51.340 | that there's a lot of motivations at play.
02:26:56.340 | And we are able to really introspect our own mind.
02:26:59.700 | And like what's on the surface of the conscious
02:27:02.340 | is actually quite a good representation
02:27:04.660 | of what's going on in the brain.
02:27:06.500 | And you're not deceiving yourself.
02:27:07.900 | You're able to actually deeply think
02:27:11.060 | about where your mind stands
02:27:12.560 | and what you think about the world.
02:27:14.180 | And it's less about impressing people
02:27:15.980 | and more about being a free thinking individual.
02:27:19.500 | - So when a child tries to explain
02:27:23.820 | why they don't have their homework assignment,
02:27:26.500 | they are sometimes inclined to say,
02:27:28.580 | the dog ate my homework.
02:27:30.400 | They almost never say the dragon ate my homework.
02:27:33.080 | The reason is the dragon
02:27:35.740 | is a completely implausible explanation.
02:27:39.060 | Almost always when we make excuses for things,
02:27:41.700 | we choose things that are at least in some degree plausible.
02:27:45.200 | It could perhaps have happened.
02:27:47.980 | That's an obstacle for any explanation of a hidden motive
02:27:52.140 | or a hidden feature of human behavior.
02:27:54.660 | If people are pretending one thing
02:27:57.260 | while really doing another,
02:27:58.500 | they're usually gonna pick as a pretense
02:28:00.340 | something that's somewhat plausible.
02:28:02.100 | That's gonna be an obstacle to proving that hypothesis
02:28:07.900 | if you are focused on sort of the local data
02:28:10.260 | that a person would typically have if they were challenged.
02:28:13.060 | So if you're just looking at one kid
02:28:14.460 | and his lack of homework,
02:28:17.020 | maybe you can't tell whether his dog ate his homework or not.
02:28:19.780 | If you happen to know he doesn't have a dog,
02:28:21.980 | you might have more confidence, right?
02:28:24.660 | You will need to have a wider range of evidence
02:28:26.780 | than a typical person would
02:28:28.340 | when they're encountering that actual excuse
02:28:30.420 | in order to see past the excuse.
02:28:32.340 | That will just be a general feature of it.
02:28:35.540 | So if I say, there's this usual story
02:28:38.860 | about why we go to the doctor
02:28:39.940 | and then there's this other explanation,
02:28:42.660 | it'll be true that you'll have to look at wider data
02:28:44.940 | in order to see that
02:28:46.140 | because people don't usually offer excuses
02:28:49.860 | unless in the local context of their excuse,
02:28:52.500 | they can get away with it.
02:28:54.060 | That is, it's hard to tell, right?
02:28:55.780 | So in the case of medicine,
02:28:58.340 | I have to point you to sort of larger sets of data,
02:29:00.900 | but in many areas of academia, including health economics,
02:29:06.860 | the researchers there also
02:29:09.100 | want to support the usual points of view.
02:29:11.820 | And so they will have selection effects
02:29:14.260 | in their publications and their analysis
02:29:16.260 | whereby if they're getting a result too much contrary
02:29:19.460 | to the usual point of view everybody wants to have,
02:29:21.420 | they will file drawer that paper
02:29:24.100 | or redo the analysis until they get an answer
02:29:26.500 | that's more to people's liking.
02:29:28.220 | So that means in the health economics literature,
02:29:31.620 | there are plenty of people who will claim
02:29:33.780 | that in fact, we have evidence that medicine is effective.
02:29:37.140 | And when I respond,
02:29:39.380 | I will have to point you to our most reliable evidence
02:29:43.940 | and ask you to consider the possibility
02:29:45.940 | that the literature is biased
02:29:47.540 | in that when the evidence isn't as reliable,
02:29:50.220 | when they have more degrees of freedom
02:29:51.660 | in order to get the answer they want,
02:29:53.260 | they do tend to get the answer they want.
02:29:55.020 | But when we get to the kind of evidence
02:29:57.460 | that's much harder to mess with,
02:29:59.580 | that's where we will see the truth be more revealed.
02:30:03.420 | So with respect to medicine,
02:30:05.620 | we have millions of papers published in medicine
02:30:08.460 | over the years,
02:30:09.300 | most of which give the impression that medicine is useful.
02:30:13.660 | There's a small literature on randomized experiments
02:30:16.980 | of the aggregate effects of medicine,
02:30:19.220 | where there are maybe a half dozen or so papers,
02:30:22.980 | where it would be the hardest to hide it
02:30:26.460 | because it's such a straightforward experiment
02:30:29.580 | done in a straightforward way
02:30:32.420 | that it's hard to manipulate.
02:30:36.060 | And that's where I will point you to,
02:30:37.740 | to show you that there's relatively little correlation
02:30:40.100 | between health and medicine.
02:30:40.940 | But even then, people could try to save the phenomenon
02:30:44.500 | and say, "Well, it's not hidden motives, it's just ignorance."
02:30:46.500 | They could say, for example,
02:30:48.020 | medicine's complicated,
02:30:50.700 | most people don't know the literature,
02:30:53.300 | therefore they can be excused for ignorance.
02:30:56.820 | They are just ignorantly assuming that medicine is effective.
02:30:59.700 | It's not that they have some other motive
02:31:01.380 | that they're trying to achieve.
02:31:02.940 | And then I will have to do,
02:31:06.020 | as with a conspiracy theory analysis,
02:31:08.180 | I'm saying, "Well, how long has this misperception
02:31:11.780 | been going on?
02:31:12.620 | How consistently has it happened around the world
02:31:15.140 | and across time?"
02:31:16.420 | And I would have to say, "Look, if we're talking about,
02:31:20.140 | say, a recent new product, like Segway scooters
02:31:24.220 | or something, I could say not so many people
02:31:26.700 | have seen them or used them.
02:31:27.740 | Maybe they could be confused about their value.
02:31:29.820 | If we're talking about a product that's been around
02:31:31.660 | for thousands of years, used in roughly the same way
02:31:34.260 | all across the world, and we see the same pattern
02:31:36.700 | over and over again, this sort of ignorance mistake
02:31:40.300 | just doesn't work so well."
02:31:41.860 | - It's also a question of how much of the self-deception
02:31:46.860 | is prevalent versus foundational?
02:31:50.580 | Because there's a kind of implied thing
02:31:52.500 | where it's foundational to human nature
02:31:54.980 | versus just a common pitfall.
02:31:58.700 | This is a question I have.
02:32:01.780 | So maybe human progress is made by people
02:32:05.740 | who don't fall into the self-deception.
02:32:09.220 | It's like a baser aspect of human nature,
02:32:11.740 | but then you escape it easily if you're motivated.
02:32:14.820 | - The motivational hypotheses about the self-deceptions
02:32:19.220 | are in terms of how it makes you look
02:32:21.180 | to the people around you.
02:32:22.020 | Again, the press secretary.
02:32:24.100 | So the story would be most people want to look good
02:32:27.340 | to the people around them.
02:32:28.660 | Therefore, most people present themselves in ways
02:32:31.860 | that help them look good to the people around them.
02:32:35.840 | That's sufficient to say there would be a lot of it.
02:32:39.000 | It doesn't need to be 100%, right?
02:32:41.400 | There's enough variety in people and in circumstances
02:32:44.000 | that sometimes taking a contrarian strategy
02:32:46.120 | can be in the interest of some minority of the people.
02:32:49.620 | So I might, for example, say that
02:32:51.520 | that's a strategy I've taken.
02:32:53.640 | I've decided that being contrarian on these things
02:32:57.860 | could be winning for me in that there's a room
02:33:01.040 | for a small number of people like me
02:33:02.880 | who have these sort of messages
02:33:05.440 | who can then get more attention,
02:33:07.080 | even if there's not room for most people to do that.
02:33:09.680 | And that can be explaining sort of the variety, right?
02:33:14.320 | Similarly, you might say, look,
02:33:16.280 | just look at the most obvious things.
02:33:17.580 | Most people would like to look good, right,
02:33:19.260 | in the sense of physically.
02:33:20.100 | Just you look good right now.
02:33:21.420 | You're wearing a nice suit.
02:33:22.260 | You have a haircut.
02:33:23.100 | You shaved, right?
02:33:24.560 | So, and we--
02:33:25.400 | - I cut my own hair, by the way.
02:33:26.240 | - Okay, well, then, all the more impressive.
02:33:28.120 | - That's a counter argument for your claim
02:33:31.840 | that most people wanna look good.
02:33:33.080 | - Clearly, if we look at most people
02:33:34.280 | and their physical appearance,
02:33:35.200 | clearly most people are trying to look somewhat nice, right?
02:33:37.880 | They shower, they shave, they comb their hair,
02:33:41.080 | but we certainly see some people around
02:33:42.760 | who are not trying to look so nice, right?
02:33:44.440 | Is that a big challenge,
02:33:46.120 | the hypothesis that people wanna look nice?
02:33:48.040 | Not that much, right?
02:33:49.640 | We can see in those particular people's context
02:33:53.800 | more particular reasons why they've chosen
02:33:55.880 | to be an exception to the more general rule.
02:33:58.720 | - So the general rule does reveal something foundational.
02:34:01.720 | - In general, really.
02:34:02.680 | - Right.
02:34:04.720 | - That's the way things work.
02:34:05.840 | Let me ask you, you wrote a blog post
02:34:07.840 | about the accuracy of authorities,
02:34:09.520 | since we're talking about this, especially in medicine.
02:34:12.520 | Just looking around us,
02:34:16.720 | especially during this time of the pandemic,
02:34:18.960 | there's been a growing distrust of authorities,
02:34:22.000 | of institutions, even the institution of science itself.
02:34:27.760 | What are the pros and cons of authorities, would you say?
02:34:32.520 | So what's nice about authorities?
02:34:35.880 | What's nice about institutions?
02:34:37.920 | And what are their pitfalls?
02:34:39.560 | - One standard function of authority
02:34:43.080 | is as something you can defer to respectably
02:34:45.800 | without needing to seem too submissive
02:34:48.560 | or ignorant or gullible.
02:34:54.680 | That is, when you're asking, what should I act on
02:34:59.680 | or what belief should I act on?
02:35:02.880 | You might be worried if I chose something too contrarian,
02:35:06.440 | too weird, too speculative, that that would make me look bad
02:35:11.200 | so I would just choose something very conservative.
02:35:14.340 | So maybe an authority lets you choose something
02:35:17.000 | a little less conservative
02:35:18.160 | because the authority is your authorization.
02:35:21.120 | The authority will let you do it.
02:35:23.320 | And somebody says, why did you do that thing?
02:35:25.640 | And they say, the authority authorized it.
02:35:27.680 | The authority tells me I should do this.
02:35:29.960 | Why aren't you doing it?
02:35:31.520 | - So the authority is often pushing for the conservative.
02:35:34.600 | - Well, no, the authority can do more.
02:35:36.760 | I mean, so for example, we just think about,
02:35:39.000 | I don't know, in a pandemic even,
02:35:41.400 | you could just think, oh, I'll just stay home
02:35:43.040 | and close all the doors or I'll just ignore it.
02:35:44.800 | You could just think of just some very simple strategy
02:35:47.000 | that might be defensible if there were no authorities.
02:35:50.960 | But authorities might be able to know more than that.
02:35:54.080 | They might be able to look at some evidence,
02:35:55.820 | draw a more context-dependent conclusion,
02:35:58.280 | declare it as the authority's opinion,
02:36:00.520 | and then other people might follow that,
02:36:01.760 | and that could be better than doing nothing.
02:36:03.640 | - So you mentioned WHO,
02:36:06.080 | the world's most beloved organization.
02:36:08.640 | So this is me speaking in general,
02:36:13.640 | WHO and CDC have been kind of,
02:36:17.760 | depending on degrees.
02:36:19.880 | - Right.
02:36:20.720 | - Details, just not behaving as I would have imagined
02:36:27.720 | in the best possible evolution of human civilization,
02:36:31.600 | authorities should act.
02:36:33.200 | They seem to have failed in some fundamental way
02:36:36.920 | in terms of leadership in a difficult time for our society.
02:36:40.880 | Can you say what are the pros and cons
02:36:43.200 | of this particular authority?
02:36:45.600 | - So again, if there were no authorities whatsoever,
02:36:49.080 | no accepted authorities,
02:36:51.080 | then people would have to sort of randomly pick
02:36:54.760 | different local authorities
02:36:56.360 | who would conflict with each other,
02:36:57.560 | and then they'd be fighting each other about that,
02:36:59.400 | or just not believe anybody
02:37:01.000 | and just do some initial default action
02:37:02.920 | that you would always do without responding to context.
02:37:05.640 | So the potential gain of an authority
02:37:08.600 | is that they could know more than just basic ignorance,
02:37:11.600 | and if people followed them,
02:37:13.600 | they could both be more informed than ignorance
02:37:16.280 | and all doing the same thing,
02:37:17.900 | so they're each protected from being accused
02:37:20.300 | or complained about.
02:37:21.200 | That's the idea of an authority.
02:37:23.200 | That would be the good--
02:37:24.720 | - What's the con of that?
02:37:26.760 | - Okay.
02:37:27.600 | - What's the negative?
02:37:28.440 | How does that go wrong?
02:37:29.260 | - So the con is that if you think of yourself
02:37:31.900 | as the authority and asking,
02:37:33.200 | "What's my best strategy as an authority?"
02:37:35.920 | it's unfortunately not to be maximally informative.
02:37:39.100 | So you might think the ideal authority
02:37:42.320 | would not just tell you more than ignorance,
02:37:44.760 | it would tell you as much as possible.
02:37:47.560 | Okay, it would give you as much detail
02:37:49.800 | as you could possibly listen to and manage to assimilate,
02:37:53.460 | and it would update that as frequently as possible
02:37:56.260 | or as frequently as you were able to listen and assimilate,
02:37:58.880 | and that would be the maximally informative authority.
02:38:02.060 | The problem is there's a conflict
02:38:04.960 | between being an authority or being seen as an authority
02:38:08.800 | and being maximally informative.
02:38:10.980 | That was the point of my blog post
02:38:12.840 | that you're pointing to here.
02:38:15.460 | That is, if you look at it from their point of view,
02:38:18.900 | they won't long remain the perceived authority
02:38:22.860 | if they are too incautious about how they use that authority,
02:38:27.860 | and one of the ways to be incautious
02:38:31.780 | would be to be too informative.
02:38:34.740 | - Okay, that's still in the pro column for me
02:38:37.140 | 'cause you're talking about the tensions
02:38:39.100 | that are very data-driven and very honest,
02:38:42.660 | and I would hope that authorities struggle with that,
02:38:46.140 | how much information to provide to people
02:38:49.140 | to maximize outcomes.
02:38:53.800 | Now, I'm generally somebody
02:38:54.940 | that believes more information is better
02:38:56.500 | 'cause I trust in the intelligence of people,
02:38:58.500 | but I'd like to mention a bigger con on authorities,
02:39:02.940 | which is the human question.
02:39:04.220 | This comes back to global government and so on,
02:39:07.820 | is that there's humans that sit in chairs during meetings
02:39:12.820 | and those authorities, they have different titles,
02:39:16.460 | humans form hierarchies,
02:39:18.220 | and sometimes those titles get to your head a little bit,
02:39:20.780 | and you start to want to think,
02:39:23.060 | how do I preserve my control over this authority,
02:39:26.380 | as opposed to thinking through
02:39:28.540 | what is the mission of the authority,
02:39:30.940 | what is the mission of WHO and the other such organization,
02:39:34.300 | and how do I maximize the implementation of that mission,
02:39:37.740 | you start to think, well, I kind of like sitting
02:39:40.420 | in this big chair at the head of the table,
02:39:42.620 | I'd like to sit there for another few years,
02:39:45.220 | or better yet, I want to be remembered
02:39:47.980 | as the person who in a time of crisis
02:39:50.300 | was at the head of this authority
02:39:52.660 | and did a lot of good things.
02:39:54.420 | So you stop trying to do good
02:39:58.780 | in the sense of what good means, given the mission of the authority,
02:40:02.100 | and you start to try to carve a narrative,
02:40:05.600 | to manipulate the narrative.
02:40:06.780 | First, in the meeting room, everybody around you,
02:40:09.900 | just a small little story you tell yourself,
02:40:12.380 | the new interns, the managers,
02:40:15.420 | throughout the whole hierarchy of the company.
02:40:17.540 | Okay, once everybody in the company,
02:40:19.780 | or in the organization believes this narrative,
02:40:22.680 | now you start to control the release of information,
02:40:27.680 | not because you're trying to maximize outcomes,
02:40:30.060 | but because you're trying to maximize
02:40:32.340 | the effectiveness of the narrative,
02:40:34.220 | that you are truly a great representative
02:40:38.020 | of this authority in human history.
02:40:40.220 | And I just feel like those human forces,
02:40:42.900 | whenever you have an authority,
02:40:45.620 | it starts getting to people's heads.
02:40:48.040 | One of the most, me as a scientist,
02:40:50.180 | one of the most disappointing things to see
02:40:52.100 | during the pandemic is the use of authority
02:40:56.480 | from colleagues of mine to roll their eyes,
02:41:01.580 | to dismiss other human beings,
02:41:05.340 | just because they got a PhD,
02:41:07.940 | just because they're an assistant, associate, or full professor,
02:41:11.860 | just because they are deputy head of X organization,
02:41:16.860 | NIH, whatever the heck the organization is,
02:41:20.740 | just because they got an award of some kind,
02:41:22.980 | and at a conference,
02:41:24.900 | they won a Best Paper award seven years ago,
02:41:28.300 | and then somebody shook their hand and gave them a medal,
02:41:31.140 | maybe it was a president,
02:41:32.460 | and it's been 20, 30 years
02:41:34.980 | that people have been patting them on the back,
02:41:36.820 | saying how special they are,
02:41:38.860 | especially when they're controlling money
02:41:40.700 | and getting sucked up to by other scientists
02:41:44.040 | who really want the money in a self-deception kind of way,
02:41:46.780 | they don't actually really care about your performance,
02:41:48.900 | and all of that gets to your head,
02:41:50.540 | and no longer are you the authority
02:41:52.700 | that's trying to do good
02:41:54.140 | and lessen the suffering in the world,
02:41:55.820 | you become an authority that just wants to maximize
02:42:00.460 | self-preservation, sitting on a throne of power.
02:42:05.460 | - So this is core to sort of what it is to be an economist,
02:42:08.860 | I'm a professor of economics.
02:42:11.740 | - There you go, with the authority again.
02:42:13.380 | - No, so it's about saying--
02:42:15.220 | - Just joking, yes.
02:42:16.340 | - We often have a situation where we see a world of behavior,
02:42:20.820 | and then we see ways in which particular behaviors
02:42:23.700 | are not sort of maximally socially useful.
02:42:27.100 | - Yes.
02:42:28.140 | - And we have a variety of reactions to that,
02:42:31.780 | so one kind of reaction is to sort of morally blame
02:42:34.640 | each individual for not doing
02:42:36.900 | the maximally socially useful thing,
02:42:38.800 | under perhaps the idea that people could be identified
02:42:43.740 | and shamed for that,
02:42:44.640 | and maybe induced into doing the better thing
02:42:46.660 | if only enough people were calling them out on it, right?
02:42:50.080 | But another way to think about it
02:42:52.460 | is to think that people sit in institutions
02:42:55.180 | with certain stable institutional structures,
02:42:58.100 | and that institutions create particular incentives
02:43:00.740 | for individuals, and that individuals are typically
02:43:04.220 | doing whatever is in their local interest
02:43:07.280 | in the context of that institution,
02:43:09.080 | and then perhaps to less blame individuals
02:43:13.340 | for winning their local institutional game,
02:43:15.940 | and more blaming the world for having the wrong institutions.
02:43:19.380 | So economists are often like wondering
02:43:21.420 | what other institutions we could have
02:43:22.980 | instead of the ones we have,
02:43:24.340 | and which of them might promote better behavior,
02:43:26.140 | and this is a common thing we do all across human behavior,
02:43:29.660 | is to think of what are the institutions we're in,
02:43:32.060 | and what are the alternative variations we could imagine,
02:43:34.500 | and then to say which institutions
02:43:35.980 | would be most productive.
02:43:37.220 | I would agree with you that our information institutions,
02:43:42.080 | that is the institutions by which we collect information
02:43:45.060 | and aggregate it and share it with people,
02:43:47.260 | are especially broken in the sense of far from the ideal
02:43:52.020 | of what would be the most cost-effective way
02:43:54.540 | to collect and share information,
02:43:56.740 | but then the challenge is to try
02:43:58.100 | to produce better institutions.
02:44:00.980 | And as an academic, I'm aware that academia
02:44:03.820 | is particularly broken in the sense
02:44:06.860 | that we give people incentives to do research
02:44:09.700 | that's not very interesting or important,
02:44:11.980 | because basically they're being impressive,
02:44:14.300 | and we actually care more about whether academics
02:44:16.380 | are impressive than whether they're interesting or useful.
02:44:20.260 | And I'm happy to go into detail
02:44:23.580 | with lots of different known institutions
02:44:25.380 | and their known institutional failings,
02:44:27.100 | ways in which those institutions produce incentives
02:44:30.820 | that are mistaken, and that was the point of the post
02:44:32.940 | we started with talking about the authorities.
02:44:34.740 | If I need to be seen as an authority,
02:44:37.620 | that's at odds with my being informative,
02:44:40.580 | and I might choose to be the authority
02:44:42.860 | instead of being informative,
02:44:43.860 | 'cause that's my institutional incentives.
02:44:46.300 | - And if I may, I'd like to,
02:44:48.220 | given that beautiful picture of incentives
02:44:53.100 | and individuals that you just painted,
02:44:55.980 | let me just apologize for a couple of things.
02:44:58.660 | One, I often put too much blame on leaders of institutions
02:45:03.660 | versus the incentives that govern those institutions.
02:45:09.620 | And as a result of that, I've been,
02:45:12.060 | I believe, too critical of Anthony Fauci,
02:45:17.500 | too emotional about my criticism of Anthony Fauci,
02:45:20.940 | and I'd like to apologize for that,
02:45:22.820 | because I think there's deeper truths to think about,
02:45:27.460 | there's deeper incentives to think about.
02:45:29.620 | That said, I do sort of, I'm a romantic creature by nature.
02:45:34.300 | I romanticize Winston Churchill,
02:45:37.220 | and when I think about Nazi Germany,
02:45:42.220 | I think about Hitler more than I do
02:45:44.500 | about the individual people of Nazi Germany.
02:45:46.900 | You think about leaders, you think about individuals,
02:45:49.380 | not necessarily the parameters,
02:45:51.020 | the incentives that govern the system,
02:45:53.380 | 'cause it's harder.
02:45:55.620 | It's harder to think through deeply about the models
02:45:58.900 | from which those individuals arise,
02:46:00.660 | but that's the right thing to do.
02:46:03.300 | But also, I don't apologize for being emotional sometimes,
02:46:07.860 | and being--
02:46:08.700 | - I'm happy to blame the individual leaders
02:46:11.060 | in the sense that I might say,
02:46:12.940 | well, you should be trying to reform these institutions.
02:46:15.180 | If you're just there to get promoted
02:46:17.420 | and look good at being at the top,
02:46:19.100 | then maybe I can blame you for your motives
02:46:21.060 | and your priorities in there.
02:46:22.660 | But I can understand why the people at the top
02:46:24.780 | would be the people who are selected
02:46:26.180 | for having the priority of primarily
02:46:27.940 | trying to get to the top.
02:46:29.180 | I get that.
02:46:30.300 | - Can I maybe ask you about universities in particular?
02:46:33.220 | They've received, like science has received
02:46:37.340 | an increase in distrust overall as an institution,
02:46:40.740 | which breaks my heart,
02:46:42.580 | because I think science is beautiful as a,
02:46:45.260 | not maybe not as an institution,
02:46:47.220 | but as one of the things,
02:46:50.420 | one of the journeys that humans have taken on.
02:46:54.540 | The other one is university.
02:46:55.820 | I think university is actually a place,
02:46:58.580 | for me at least, in the way I see it,
02:47:01.020 | is a place of freedom of exploring ideas,
02:47:05.740 | scientific ideas, engineering ideas,
02:47:10.100 | more than a corporate, more than a company,
02:47:13.820 | more than a lot of domains in life.
02:47:15.620 | It's not just in its ideal,
02:47:18.740 | but it's in its implementation,
02:47:21.300 | a place where you can be a kid for your whole life
02:47:24.500 | and play with ideas.
02:47:25.900 | And I think with all the criticism
02:47:28.060 | that universities currently receive,
02:47:32.140 | I don't think that criticism
02:47:34.500 | is representative of universities.
02:47:36.340 | They focus on very anecdotal evidence
02:47:38.420 | of particular departments, particular people.
02:47:40.740 | But I still feel like there's a lot of place
02:47:44.180 | for freedom of thought,
02:47:46.580 | at least at MIT,
02:47:50.060 | at least in the fields I care about,
02:47:52.900 | in particular kind of science,
02:47:56.060 | particular kind of technical fields,
02:47:57.940 | mathematics, computer science, physics,
02:48:01.140 | engineering, so robotics, artificial intelligence.
02:48:04.460 | This is a place where you get to be a kid.
02:48:06.620 | Yet there is bureaucracy that's rising up.
02:48:11.620 | There's like more rules, there's more meetings,
02:48:16.100 | and there's more administration,
02:48:18.000 | having like PowerPoint presentations,
02:48:21.420 | which to me, you should like
02:48:25.100 | be more of a renegade explorer of ideas.
02:48:30.260 | And meetings destroy, they suffocate that radical thought
02:48:34.500 | that happens when you're an undergraduate student
02:48:36.420 | and you can do all kinds of wild things
02:48:37.860 | when you're a graduate student.
02:48:39.340 | Anyway, all that to say,
02:48:40.700 | you've thought about this aspect too.
02:48:42.220 | Is there something positive, insightful you could say
02:48:46.460 | about how we can make for better universities
02:48:50.340 | in the decades to come, this particular institution?
02:48:53.380 | How can we improve them?
02:48:55.500 | - I hear that centuries ago,
02:48:58.100 | many scientists and intellectuals were aristocrats.
02:49:01.540 | They had time and could, if they chose,
02:49:06.460 | choose to be intellectuals.
02:49:07.820 | That's a feature of the combination
02:49:12.140 | that they had some source of resources
02:49:14.380 | that allowed them leisure,
02:49:16.280 | and that the kind of competition they faced
02:49:19.260 | among aristocrats allowed that sort of self-indulgence
02:49:23.860 | or self-pursuit at least at some point in their lives.
02:49:28.920 | So the analogous observation is that university professors
02:49:33.920 | often have sort of the freedom and space
02:49:37.080 | to do a wide range of things.
02:49:39.160 | And I am certainly enjoying that as a tenured professor.
02:49:41.960 | - You're a really, sorry to interrupt,
02:49:44.800 | a really good representative of that.
02:49:47.080 | Just the exploration you're doing, the depth of thought,
02:49:50.120 | like most people are afraid to do the kind of
02:49:53.680 | broad thinking that you're doing, which is great.
02:49:56.000 | - The fact that that can happen
02:49:57.880 | is the combination of these two things analogously.
02:50:00.080 | One is that we have fierce competition
02:50:02.520 | to become a tenured professor,
02:50:03.800 | but then once you become tenured,
02:50:05.160 | we give you the freedom to do what you like.
02:50:08.560 | And that's a happenstance.
02:50:10.720 | It didn't have to be that way.
02:50:12.640 | And in many other walks of life,
02:50:14.400 | even though people have a lot of resources, et cetera,
02:50:17.120 | they don't have that kind of freedom set up.
02:50:18.600 | So I think I'm kind of lucky that tenure exists
02:50:23.280 | and that I'm enjoying it.
02:50:25.440 | But I can't be too enthusiastic about this
02:50:28.480 | unless I can approve of sort of the source
02:50:30.280 | of the resources that's paying for all this.
02:50:32.480 | So for the aristocrat, if you thought
02:50:33.760 | they stole it in war or something,
02:50:36.800 | you wouldn't be so pleased,
02:50:38.000 | whereas if you thought they had earned it
02:50:39.560 | or their ancestors had earned this money
02:50:41.920 | that they were spending as an aristocrat,
02:50:43.480 | then you could be more okay with that.
02:50:45.340 | So for universities, I have to ask,
02:50:48.920 | where are the main sources of resources
02:50:50.640 | that are going to the universities
02:50:52.180 | and are they getting their money's worth?
02:50:54.240 | Or are they getting a good value for that payment?
02:50:56.960 | So first of all, there are the students.
02:51:00.820 | And the question is, are students getting good value
02:51:03.920 | for their education?
02:51:05.040 | And each person is getting value in the sense
02:51:08.340 | that they are identified and shown
02:51:10.720 | to be a more capable person,
02:51:12.560 | which is then worth more salary as an employee later.
02:51:16.180 | But there is a case for saying
02:51:17.560 | there's a big waste of the system
02:51:19.040 | because we aren't actually changing the students
02:51:22.520 | or educating them, we're more sorting them or labeling them.
02:51:26.880 | And that's a very expensive process to produce that outcome.
02:51:30.280 | And part of the expense is the freedom of tenure, I guess.
02:51:33.720 | So I feel like I can't be too proud of that
02:51:36.680 | because it's basically a tax on all these young students
02:51:39.600 | to pay this enormous amount of money
02:51:41.280 | in order to be labeled as better,
02:51:42.840 | whereas I feel like we should be able
02:51:44.600 | to find cheaper ways of doing that.
02:51:47.120 | The other main customer is researcher patrons
02:51:50.640 | like the government or other foundations.
02:51:53.360 | And then the question is, are they getting their money's worth
02:51:55.880 | out of the money they're paying for research to happen?
02:51:58.800 | And my analysis is they don't actually care
02:52:03.440 | about the research progress.
02:52:04.660 | They are mainly buying an affiliation
02:52:06.520 | with credentialed impressiveness
02:52:07.880 | on the part of the researchers.
02:52:09.620 | They mainly pay money to researchers who are impressive
02:52:12.680 | and have impressive affiliations,
02:52:15.240 | and they don't really much care
02:52:16.200 | what research project happens as a result.
02:52:19.000 | - Is that a cynical?
02:52:21.160 | So there's a deep truth to that cynical perspective.
02:52:25.560 | Is there a less cynical perspective that they do care
02:52:29.920 | about the long-term investment
02:52:31.760 | into the progress of science and humanity?
02:52:34.480 | - They might personally care,
02:52:35.780 | but they're stuck in an equilibrium.
02:52:37.680 | - Sure.
02:52:38.520 | - Wherein basically most foundations,
02:52:40.680 | like governments or research foundations
02:52:42.520 | such as the Ford Foundation,
02:52:46.520 | the individuals there are rated based on the prestige
02:52:49.780 | they bring to that organization.
02:52:51.720 | - Yeah.
02:52:52.560 | - And even if they might personally want
02:52:54.000 | to produce more intellectual progress,
02:52:55.440 | they are in a competitive game where they don't have tenure
02:52:58.520 | and they need to produce this prestige.
02:53:01.160 | And so once they give grant money to prestigious people,
02:53:04.000 | that is the thing that shows that they have achieved
02:53:06.100 | prestige for the organization,
02:53:07.520 | and that's what they need to do
02:53:08.840 | in order to retain their position.
02:53:11.080 | - And you do hope that there's a correlation
02:53:13.320 | between prestige and actual competence.
02:53:17.000 | - Of course there is a correlation.
02:53:18.320 | The question is just,
02:53:19.240 | could we do this better some other way?
02:53:21.160 | - Yes.
02:53:22.000 | - I think it's pretty clear we could.
02:53:24.700 | What it's harder to do is move the world
02:53:27.240 | to a new equilibrium where we do that instead.
02:53:29.720 | - What are the components of the better ways to do it?
02:53:33.440 | Is it money?
02:53:36.820 | So the sources of money and how the money is allocated
02:53:40.620 | to give the individual researchers freedom?
02:53:44.500 | - Years ago, I started studying this topic exactly
02:53:47.440 | because this was my issue
02:53:48.760 | and this was many decades ago now,
02:53:50.680 | and I spent a long time,
02:53:51.720 | and my best guess still is prediction markets,
02:53:55.340 | betting markets.
02:53:56.180 | So if you as a research patron want to know
02:53:59.920 | the answer to a particular question,
02:54:01.840 | like what's the mass of the electron neutrino,
02:54:04.280 | then what you can do is just subsidize a betting market
02:54:07.760 | in that question, and that will induce more research
02:54:11.180 | into answering that question
02:54:12.340 | because the people who then answer that question
02:54:14.320 | can then make money in that betting market
02:54:15.940 | with the new information they gain.
02:54:17.900 | So that's a robust way to induce more information
02:54:21.420 | on a topic.
02:54:22.260 | If you want to induce an accomplishment,
02:54:23.780 | you can create prizes,
02:54:25.040 | and there's of course a long history
02:54:26.820 | of prizes to induce accomplishments.
02:54:29.660 | And we moved away from prizes,
02:54:32.220 | even though we once used them far more often
02:54:34.980 | than we did today.
02:54:36.940 | And there's a history to that.
02:54:38.480 | And for the customers who want to be affiliated
02:54:42.860 | with impressive academics,
02:54:44.740 | which is what most of the customers want,
02:54:46.320 | students, journalists, and patrons,
02:54:48.460 | I think there's a better way of doing that,
02:54:50.080 | which I just wrote about in my second most recent blog post.
02:54:54.640 | - Can you explain?
02:54:55.520 | - Sure.
02:54:56.360 | What we do today is we take sort of acceptance
02:54:58.960 | by other academics recently as our best indication
02:55:02.560 | of their deserved prestige.
02:55:04.260 | That is recent publications, recent job affiliation,
02:55:08.820 | institutional affiliations, recent invitations to speak,
02:55:12.900 | recent grants.
02:55:14.780 | We are today taking other impressive academics'
02:55:19.060 | recent choices to affiliate with them
02:55:21.740 | as our best guesstimate of their prestige.
02:55:24.020 | I would say we could do better by creating betting markets
02:55:28.300 | in what the distant future will judge
02:55:30.700 | to have been their deserved prestige,
02:55:32.900 | looking back on them.
02:55:34.480 | I think most intellectuals, for example,
02:55:37.280 | think that if we looked back two centuries,
02:55:39.880 | say to intellectuals from two centuries ago,
02:55:42.800 | and tried to look in detail at their research
02:55:45.680 | and how it influenced future research
02:55:47.460 | and which path it was on,
02:55:49.240 | we could much more accurately judge
02:55:52.640 | their actual deserved prestige.
02:55:55.240 | That is who was actually on the right track,
02:55:57.040 | who actually helped,
02:55:58.560 | which will be different than what people at the time judged
02:56:01.200 | using the immediate indications of the time
02:56:03.740 | or which position they had
02:56:04.860 | or which publications they had or things like that.
02:56:07.600 | - In this way, if you think from the perspective
02:56:10.780 | of multiple centuries,
02:56:12.700 | you would prioritize true novelty more highly,
02:56:17.380 | you would disregard the temporal proximity,
02:56:19.980 | like how recent the thing is,
02:56:22.220 | and you would think like what is the brave, the bold,
02:56:25.700 | the big, novel idea that this,
02:56:28.420 | and you would actually--
02:56:29.260 | - You would be able to rate that
02:56:31.060 | 'cause you could see the paths that ideas took,
02:56:33.420 | which things had dead ends,
02:56:34.500 | which led to what other followings.
02:56:36.020 | You could, looking back centuries later,
02:56:39.120 | have a much better estimate of who actually had
02:56:41.760 | what long-term effects on intellectual progress.
02:56:44.100 | So my proposal is we actually pay people
02:56:47.140 | in several centuries to do this historical analysis
02:56:49.500 | and we have prediction markets today
02:56:52.340 | where we buy and sell assets which will later pay off
02:56:55.300 | in terms of those final evaluations.
02:56:57.660 | So now we'll be inducing people today
02:56:59.520 | to make their best estimate of those things
02:57:01.100 | by actually looking at the details of people
02:57:04.620 | and setting the prices accordingly.
02:57:06.260 | So my proposal would be we rate people today
02:57:08.820 | on those prices today.
02:57:10.220 | So instead of looking at their list of publications
02:57:12.420 | or affiliations, you look at the actual price of assets
02:57:15.340 | that represent people's best guess
02:57:17.880 | of what the future will say about them.
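(Aside, not from the conversation: a toy sketch of the payoff structure just described. Each researcher has an asset that eventually settles at a score assigned by paid historians far in the future; until then, the asset's market price is the standing estimate of that score, and that price is what we would rate people by today. The names, the averaging stand-in for an order book, and the 0-to-1 score scale are all illustrative assumptions, not the actual market design.)

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class PrestigeAsset:
    researcher: str                            # hypothetical name, illustration only
    bids: list = field(default_factory=list)   # traders' estimates of the final score in [0, 1]

    def trade(self, estimated_final_score: float):
        # A trader who thinks the current price is wrong pushes it toward
        # their own estimate of the eventual historical judgment.
        self.bids.append(estimated_final_score)

    def price(self) -> float:
        # Stand-in for a real order book: current price ~ consensus estimate.
        return mean(self.bids) if self.bids else 0.5

    def settle(self, historians_score: float) -> float:
        # Far-future settlement: the asset pays out the judged score per share.
        return historians_score

market = {name: PrestigeAsset(name) for name in ["researcher_A", "researcher_B"]}
market["researcher_A"].trade(0.8)   # a trader expects history to judge A well
market["researcher_B"].trade(0.3)
ranking = sorted(market, key=lambda n: market[n].price(), reverse=True)
print(ranking)                      # rate people today by price, not by recent affiliations
```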
02:57:19.380 | - That's brilliant.
02:57:20.220 | So this concept of idea futures,
02:57:23.540 | can you elaborate what this would entail?
02:57:27.780 | - I've been elaborating two versions of it here.
02:57:30.240 | So one is if there's a particular question,
02:57:33.040 | say the mass of the electron neutrino,
02:57:35.520 | and what you as a patron wanna do
02:57:37.160 | is get an answer to that question,
02:57:39.760 | then what you would do is subsidize a betting market
02:57:42.080 | in that question under the assumption
02:57:44.260 | that eventually we'll just know the answer
02:57:45.840 | and we can pay off the bets that way.
02:57:48.240 | And that is a plausible assumption
02:57:49.560 | for many kinds of concrete intellectual questions
02:57:52.240 | like what's the mass of the electron neutrino.
02:57:53.880 | - In this hypothetical world that you're constructing
02:57:56.400 | that may be a real world,
02:57:58.160 | do you mean literally financial?
02:58:01.200 | - Yes, literal, very literal.
02:58:03.520 | Very cash, very direct and literal, yes.
02:58:07.760 | - Or crypto.
02:58:09.100 | - Well, crypto is money.
02:58:10.760 | - Yes, true.
02:58:11.600 | - So the idea would be research labs would be for profit.
02:58:15.240 | They would have as their expense,
02:58:17.060 | paying researchers to study things,
02:58:18.600 | and then their profit would come from using the insights
02:58:21.200 | the researchers gain to trade in these financial markets.
02:58:25.540 | Just like hedge funds today make money
02:58:27.400 | by paying researchers to study firms
02:58:29.740 | and then making their profits by trading on those,
02:58:32.440 | that insight in the ordinary financial market.
02:58:34.280 | - And the market would, if it's efficient,
02:58:37.800 | would be able to become better and better
02:58:39.720 | predicting the powerful ideas
02:58:42.320 | that the individual is able to generate.
02:58:44.320 | - The variance around the mass of the electron neutrino
02:58:46.600 | would decrease with time as we learned
02:58:48.440 | that value of that parameter better
02:58:49.960 | and any other parameters that we wanted to estimate.
02:58:52.600 | - You don't think those markets would also respond
02:58:54.520 | to recency of prestige and all those kinds of things?
02:58:58.680 | - They would respond, but the question is
02:59:00.960 | whether they respond incorrectly,
02:59:02.680 | and if you think they're doing it incorrectly,
02:59:04.400 | you have a profit opportunity.
02:59:05.240 | - There'll be a correction mechanism.
02:59:06.760 | - You can go fix it.
02:59:08.120 | So we'd be inviting everybody to ask
02:59:10.360 | whether they can find any biases or errors
02:59:12.400 | in the current ways in which people are estimating
02:59:14.200 | these things from whatever clues they have.
02:59:15.760 | - Right, there's a big incentive
02:59:16.960 | for the correction mechanism.
02:59:18.640 | In academia currently, there's not,
02:59:22.400 | it's the safe choice to go with the prestige.
02:59:25.080 | - Exactly.
02:59:25.920 | - And there's no--
02:59:26.840 | - Even if you privately think that the prestige
02:59:28.800 | is overrated.
02:59:30.960 | - Even if you privately think strongly that it's overrated.
02:59:34.200 | - Still, you don't have an incentive to defy that publicly.
02:59:36.680 | - You're going to lose a lot,
02:59:38.520 | unless you're a contrarian that writes brilliant blogs
02:59:42.720 | and you could talk about it on a podcast.
02:59:45.560 | - Right.
02:59:46.400 | I mean, initially, this was my initial concept
02:59:48.120 | of having these betting markets on these key parameters.
02:59:50.240 | What I then realized over time was that
02:59:52.600 | that's more what people pretend to care about.
02:59:54.520 | What they really mostly care about is just who's how good.
02:59:58.200 | And that's what most of the system is built on,
02:59:59.960 | is trying to rate people and rank them.
03:00:01.840 | And so I designed this other alternative
03:00:04.080 | based on historical evaluation centuries later,
03:00:06.480 | just about who's how good,
03:00:08.600 | because that's what I think most of the customers
03:00:10.240 | really care about.
03:00:11.120 | - Customers, I like the word customers here, humans.
03:00:15.960 | - Right, well, every major area of life,
03:00:18.440 | which has specialists who get paid to do that thing,
03:00:21.080 | must have some customers from elsewhere
03:00:22.920 | who are paying for it.
03:00:24.520 | - Well, who are the customers for the mass of the neutrino?
03:00:27.400 | Yes, I understand, in a sense,
03:00:31.200 | people who are willing to pay for a thing.
03:00:35.480 | - That's an important thing to understand about anything,
03:00:37.560 | who are the customers, and what's the product,
03:00:39.760 | like medicine, education, academia, military, et cetera.
03:00:44.280 | That's part of the hidden motives analysis.
03:00:45.920 | Often people have a thing they say
03:00:47.440 | about what the product is and who the customer is,
03:00:49.240 | and maybe you need to dig a little deeper
03:00:51.120 | to find out what's really going on.
03:00:52.880 | - Or a lot deeper.
03:00:54.960 | You've written that you seek out, quote, view quakes.
03:00:59.960 | You're able, as an intelligent black box
03:01:02.880 | word generating machine,
03:01:04.320 | you're able to generate a lot of sexy words.
03:01:06.040 | I like it, I love it.
03:01:07.440 | View quakes, which are insights
03:01:10.520 | which dramatically changed my worldview, your worldview.
03:01:15.920 | You write, I loved science fiction as a child,
03:01:18.880 | studied physics and artificial intelligence
03:01:21.000 | for a long time each,
03:01:22.680 | and now study economics and political science,
03:01:25.600 | all fields full of such insights.
03:01:28.720 | So, let me ask, what are some view quakes,
03:01:31.940 | or a beautiful, surprising idea to you
03:01:34.600 | from each of those fields,
03:01:36.280 | physics, AI, economics, political science?
03:01:38.920 | I know it's a tough question,
03:01:40.320 | something that springs to mind about physics, for example,
03:01:43.180 | that just is beautiful to me.
03:01:44.360 | - Right from the beginning, say,
03:01:46.120 | special relativity was a big surprise.
03:01:48.460 | Most of us have a simple concept of time,
03:01:51.720 | and it seems perfectly adequate
03:01:53.280 | for everything we've ever seen.
03:01:55.080 | And to have it explained to you
03:01:56.560 | that you need to sort of have a mixture concept
03:01:58.800 | of time and space,
03:01:59.700 | where you put it into the space-time construct,
03:02:02.200 | how it looks different from different perspectives,
03:02:05.120 | that was quite a shock.
03:02:06.680 | And that was such a shock that it makes you think,
03:02:10.080 | what else do I know that isn't the way it seems?
03:02:13.440 | Quantum mechanics is certainly
03:02:15.200 | another enormous shock in terms of,
03:02:17.520 | from your viewpoint, you know,
03:02:18.680 | you have this idea that there's a space,
03:02:20.580 | and then there's particles at points,
03:02:23.440 | and maybe fields in between.
03:02:25.140 | And quantum mechanics is just
03:02:28.440 | a whole different representation.
03:02:29.600 | It looks nothing like what you would have thought
03:02:32.000 | as sort of the basic representation of the physical world.
03:02:35.140 | And that was quite a surprise.
03:02:36.880 | - What would you say is the catalyst
03:02:39.020 | for the view quake in theoretical physics
03:02:42.200 | in the 20th century?
03:02:43.080 | Where does that come from?
03:02:44.560 | So the interesting thing about Einstein,
03:02:46.000 | it seems like a lot of that came
03:02:47.240 | from like almost thought experiments.
03:02:49.200 | It almost wasn't experimentally driven.
03:02:51.860 | And with, actually, I don't know
03:02:56.480 | the full story of quantum mechanics,
03:02:58.240 | how much of it is experiment,
03:02:59.960 | like where, if you look at the full trace
03:03:03.300 | of idea generation there,
03:03:05.280 | of all the weird stuff that falls out of quantum mechanics,
03:03:08.720 | how much of that was the experimentalists,
03:03:10.640 | how much was it the theoreticians?
03:03:12.160 | But usually, in theoretical physics,
03:03:14.000 | the theories lead the way.
03:03:15.720 | So maybe can you elucidate,
03:03:19.720 | like what is the catalyst for these?
03:03:22.420 | - The remarkable thing about physics
03:03:24.540 | and about many other areas of academic intellectual life
03:03:28.000 | is that it just seems way over-determined.
03:03:30.680 | That is, if it hadn't been for Einstein
03:03:34.080 | or if it hadn't been for Heisenberg,
03:03:36.120 | certainly within a half a century,
03:03:37.840 | somebody else would have come up
03:03:39.440 | with essentially the same things.
03:03:42.120 | - Is that something you believe?
03:03:43.320 | - Yes. - Or is that something?
03:03:44.160 | - Yes, so I think when you look at
03:03:46.000 | sort of just the history of physics
03:03:47.600 | and the history of other areas,
03:03:49.480 | some areas like that,
03:03:50.360 | there's just this enormous convergence.
03:03:52.400 | That the different kind of evidence
03:03:54.000 | that was being collected was so redundant
03:03:57.600 | in the sense that so many different things
03:03:59.480 | revealed the same things that eventually
03:04:02.080 | you just kind of have to accept it
03:04:04.280 | because it just gets obvious.
03:04:06.580 | So if you look at the details, of course,
03:04:09.180 | Einstein did it before somebody else,
03:04:11.760 | and it's well worth celebrating Einstein for that.
03:04:14.240 | And we, by celebrating the particular people
03:04:17.520 | who did something first or came across something first,
03:04:19.960 | we are encouraging all the rest to move a little faster,
03:04:23.320 | to try to push us all a little faster, which is great,
03:04:28.760 | but I still think we would have gotten
03:04:31.920 | roughly to the same place within a half century.
03:04:34.520 | So sometimes people are special
03:04:37.040 | because of how much longer it would have taken.
03:04:39.400 | So some people say general relativity
03:04:41.000 | would have taken longer without Einstein than other things.
03:04:43.880 | I mean, Heisenberg quantum mechanics,
03:04:45.960 | I mean, there were several different formulations
03:04:47.400 | of quantum mechanics all around the same few years,
03:04:50.320 | which means no one of them made that much of a difference.
03:04:52.880 | We would have had pretty much the same thing
03:04:54.640 | regardless of which of them did it exactly when.
03:04:57.400 | Nevertheless, I'm happy to celebrate them all.
03:05:00.040 | But this is a choice I make in my research.
03:05:02.440 | That is, when there's an area
03:05:03.280 | where there's lots of people working together,
03:05:05.960 | who are sort of scooping each other
03:05:08.080 | and getting a result just before somebody else does,
03:05:10.720 | you ask, well, how much of a difference would I make there?
03:05:14.040 | At most, I could make something happen
03:05:15.640 | a few months before somebody else.
03:05:17.800 | And so I'm less worried about them missing things.
03:05:20.600 | So when I'm trying to help the world, like doing research,
03:05:23.080 | I'm looking for neglected things.
03:05:24.200 | I'm looking for things that nobody's doing.
03:05:26.200 | If I didn't do it, nobody would do it.
03:05:28.200 | - Nobody would do it.
03:05:29.120 | - Or at least for a long time.
03:05:29.960 | - In the next 10, 20 years kind of thing.
03:05:31.160 | - Exactly.
03:05:32.000 | - Same with general relativity, just, you know,
03:05:33.520 | the world would do it.
03:05:35.160 | It might take another 10, 20, 30, 50 years.
03:05:37.520 | - So that's the place
03:05:38.360 | where you can have the biggest impact,
03:05:39.760 | is finding the things that nobody would do
03:05:41.640 | unless you did them.
03:05:43.400 | - And then that's when you get the big view quake,
03:05:46.200 | the insight.
03:05:47.240 | So what about artificial intelligence?
03:05:49.440 | Would it be the EMs, the emulated minds?
03:05:54.440 | What idea, whether that struck you in the shower one day,
03:05:59.600 | or is it something you just observed?
03:06:03.440 | - Clearly, the biggest view quake
03:06:05.120 | in artificial intelligence is the realization
03:06:08.200 | of just how complicated our human minds are.
03:06:10.880 | So most people who come to artificial intelligence
03:06:14.420 | from other fields or from relative ignorance,
03:06:17.820 | a very common phenomenon, which you must be familiar with,
03:06:20.160 | is that they come up with some concept
03:06:22.360 | and then they think that must be it.
03:06:24.640 | Once we implement this new concept, we will have it.
03:06:27.200 | We will have full human level
03:06:28.880 | or higher artificial intelligence, right?
03:06:30.560 | And they're just not appreciating just how big the problem is,
03:06:34.240 | how long the road is, just how much is involved,
03:06:36.720 | because that's actually hard to appreciate.
03:06:38.680 | When we just think, it seems really simple.
03:06:41.600 | And studying artificial intelligence,
03:06:43.720 | going through many particular problems,
03:06:45.160 | looking in each problem, all the different things
03:06:47.220 | you need to be able to do to solve a problem like that
03:06:49.960 | makes you realize all the things your minds are doing
03:06:52.360 | that you are not aware of.
03:06:54.200 | That's that vast subconscious that you're not aware.
03:06:57.560 | That's the biggest view quake
03:06:58.560 | from artificial intelligence by far,
03:07:00.180 | for most people who study artificial intelligence,
03:07:02.320 | is to see just how hard it is.
03:07:05.080 | - I think that's a good point.
03:07:07.160 | But I think it's a very early view quake.
03:07:10.920 | It's when the Dunning-Kruger crashes hard.
03:07:15.920 | It's the first realization
03:07:18.840 | that humans are actually quite incredible.
03:07:21.160 | The human mind, the human body is quite incredible.
03:07:23.240 | - There's a lot of different parts to it.
03:07:25.220 | - But then, see, it's already been so long for me
03:07:29.460 | that I've experienced that view quake,
03:07:31.080 | that for me, I now experience the view quakes
03:07:34.040 | of holy shit, this little thing is actually quite powerful.
03:07:37.400 | Like neural networks, I'm amazed.
03:07:39.880 | 'Cause you've become more cynical
03:07:43.520 | after that first view quake of like, this is so hard.
03:07:48.000 | Like evolution did some incredible work
03:07:50.440 | to create the human mind.
03:07:52.240 | But then you realize, just as you have,
03:07:55.000 | you've talked about a bunch of simple models,
03:07:57.280 | that simple things can actually be extremely powerful.
03:08:01.080 | That maybe emulating the human mind is extremely difficult,
03:08:06.000 | but you can go a long way with a large neural network.
03:08:09.000 | You can go a long way with a dumb solution.
03:08:11.000 | It's that Stuart Russell thing
03:08:12.140 | with the reinforcement learning.
03:08:14.120 | Holy crap, you can go quite a long way with a simple thing.
03:08:17.480 | - But we still have a very long road to go,
03:08:19.160 | but nevertheless.
03:08:20.640 | - I can't, I refuse to sort of know.
03:08:23.400 | The road is full of surprises.
03:08:27.720 | So long is an interesting, like you said,
03:08:30.520 | with the six hard steps that humans have to take
03:08:34.080 | to arrive at where we are from the origin of life on Earth.
03:08:37.600 | So it's long maybe in the statistical improbability
03:08:42.040 | of the steps that have to be taken.
03:08:44.560 | But in terms of how quickly those steps could be taken,
03:08:48.000 | I don't know if my intuition says
03:08:50.800 | it's if it's hundreds of years away,
03:08:53.560 | or if it's a couple of years away.
03:08:57.480 | I prefer to measure--
03:08:59.000 | - Pretty confident it's at least a decade.
03:09:00.880 | And mildly confident it's at least three decades.
03:09:03.400 | - I can steel man either direction.
03:09:05.480 | I prefer to measure that journey in Elon Musks.
03:09:08.760 | That's a new--
03:09:09.720 | - We don't get Elon Musk very often,
03:09:11.080 | so that's a long timescale.
03:09:12.680 | - For now, I don't know, maybe you can clone,
03:09:15.560 | or maybe multiply, or I don't even know what Elon Musk,
03:09:18.880 | what that is, what is that?
03:09:21.080 | - That's a good question, exactly.
03:09:23.000 | Well, that's an excellent question.
03:09:24.000 | - How does that fit into the model of the three parameters
03:09:28.080 | that are required for becoming a grabby alien civilization?
03:09:33.080 | - That's the question of how much difference any individual makes
03:09:35.880 | in the long path of civilization over time.
03:09:38.720 | Yes, and it's a favorite topic of historians
03:09:41.320 | and people to try to focus on individuals
03:09:43.960 | and how much of a difference they make.
03:09:45.240 | And certainly some individuals make a substantial difference
03:09:48.080 | in the modest term, right?
03:09:50.400 | Like, certainly without Hitler being Hitler
03:09:53.880 | in the role he took,
03:09:55.440 | European history would have taken a different path
03:09:57.440 | for a while there.
03:09:59.280 | But if we're looking over many centuries longer term things,
03:10:02.520 | most individuals do fade in their individual influence.
03:10:05.800 | - So, I mean--
03:10:08.360 | - Even Einstein.
03:10:09.200 | - Even Einstein.
03:10:10.800 | No matter how sexy your hair is,
03:10:13.920 | you will also be forgotten in the long arc of history.
03:10:17.720 | So you said at least 10 years,
03:10:19.760 | so let's talk a little bit about this AI point
03:10:26.320 | of how we achieve,
03:10:28.120 | how hard is the problem of solving intelligence
03:10:31.120 | by engineering artificial intelligence
03:10:35.440 | that achieves human-level, human-like qualities
03:10:39.840 | that we associate with intelligence?
03:10:41.240 | How hard is this?
03:10:42.120 | What are the different trajectories that take us there?
03:10:45.240 | - One way to think about it is in terms of the scope
03:10:48.600 | of the technology space you're talking about.
03:10:50.920 | So let's take the biggest possible scope,
03:10:53.480 | all of human technology, right?
03:10:56.160 | The entire human economy.
03:10:57.880 | So the entire economy is composed of many industries,
03:11:02.320 | each of which have many products
03:11:03.680 | with many different technologies supporting each one.
03:11:06.320 | At that scale, I think we can accept
03:11:10.560 | that most innovations are a small fraction of the total.
03:11:14.560 | That is, usually you have relatively gradual overall progress
03:11:18.200 | and that individual innovations
03:11:21.920 | that have a substantial effect
03:11:23.400 | are rare, and their total effect
03:11:26.000 | is still a small percentage of the total economy, right?
03:11:29.000 | There's very few individual innovations
03:11:30.800 | that made a substantial difference to the whole economy.
03:11:34.120 | What are we talking, steam engine,
03:11:36.080 | shipping containers, a few things.
03:11:38.080 | - Shipping containers?
03:11:40.600 | - Shipping containers deserves to be up there
03:11:42.000 | with steam engines, honestly.
03:11:44.160 | - Can you say exactly why shipping containers?
03:11:47.080 | - Shipping containers revolutionized shipping.
03:11:49.800 | Shipping is very important.
03:11:51.240 | - But placing that at shipping containers,
03:11:55.480 | so you're saying you wouldn't have some of the magic
03:11:57.640 | of the supply chain and all that
03:11:59.240 | without shipping containers?
03:12:00.880 | - Made a big difference, absolutely.
03:12:02.280 | - Interesting, that's something we'll look into.
03:12:05.200 | We shouldn't take that tangent, although I'm tempted to.
03:12:08.000 | But anyway, so there's a few, just a few innovations.
03:12:11.040 | - Right, so at the scale of the whole economy, right?
03:12:14.040 | Now, as you move down to a much smaller scale,
03:12:17.480 | you will see individual innovations
03:12:19.560 | having a bigger effect, right?
03:12:21.280 | So if you look at, I don't know, lawnmowers or something,
03:12:24.960 | I don't know about the innovations of lawnmower,
03:12:26.160 | but there were probably like steps where you just had
03:12:28.800 | a new kind of lawnmower, and that made a big difference
03:12:31.640 | to mowing lawns, because you're focusing on a smaller part
03:12:35.800 | of the whole technology space, right?
03:12:38.200 | So, and you know, sometimes like military technology,
03:12:42.480 | there's a lot of military technologies,
03:12:43.880 | a lot of small ones, but every once in a while,
03:12:45.280 | a particular military weapon like makes a big difference.
03:12:48.840 | But still, even so, mostly overall,
03:12:51.360 | they're making modest differences to something
03:12:54.480 | that's increasing relatively, like US military
03:12:57.040 | is the strongest in the world consistently for a while.
03:12:59.840 | No one weapon in the last 70 years has like made
03:13:03.320 | a big difference in terms of the overall prominence
03:13:06.080 | of the US military, right?
03:13:07.400 | 'Cause that's just saying, even though every once in a while,
03:13:09.760 | even the recent Soviet hypersonic missiles or whatever they are,
03:13:13.360 | they aren't changing the overall balance dramatically, right?
03:13:17.000 | So when we get to AI, now I can frame the question,
03:13:21.560 | how big is AI?
03:13:23.680 | Basically, if, so one way of thinking about AI
03:13:26.760 | is it's just all mental tasks.
03:13:28.840 | And then you ask what fraction of tasks are mental tasks?
03:13:30.960 | And then I go, a lot.
03:13:32.320 | And then if I think of AI as like half of everything,
03:13:37.320 | then I think, well, it's gotta be composed of lots of parts
03:13:41.240 | where any one innovation is only a small impact, right?
03:13:44.480 | Now, if you think, no, no, no, AI is like AGI.
03:13:48.800 | And then you think AGI is a small thing, right?
03:13:52.840 | There's only a small number of key innovations
03:13:55.080 | that will enable it.
03:13:56.640 | Now you're thinking there could be a bigger chunk
03:14:00.760 | that you might find that would have a bigger impact.
03:14:02.760 | So the way I would ask you to frame these things is
03:14:05.480 | in terms of the chunkiness of different areas of technology,
03:14:09.840 | in part, in terms of how big they are.
03:14:11.240 | So if you take 10 chunky areas and you add them together,
03:14:14.480 | the total is less chunky.
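(Aside, not from the conversation: a quick numerical illustration of this aggregation point. In the toy model below, each area's progress is mostly small steps with rare big "breakthrough" steps; within any single area the largest step is a sizable fraction of that area's total, but in the sum over ten areas no single step is. The step counts and breakthrough sizes are made-up parameters.)

```python
import random
random.seed(0)

def chunky_series(steps=200):
    # Mostly small gains; roughly 2% of steps are 50x "breakthrough" gains.
    return [random.random() * (50 if random.random() < 0.02 else 1)
            for _ in range(steps)]

areas = [chunky_series() for _ in range(10)]

def biggest_jump_share(series):
    # How much of the total progress came in the single largest step?
    return max(series) / sum(series)

aggregate = [sum(step) for step in zip(*areas)]   # economy-wide progress per step

print(round(max(biggest_jump_share(a) for a in areas), 3))  # chunky within each area
print(round(biggest_jump_share(aggregate), 3))              # much less chunky in total
```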
03:14:16.080 | - Yeah, but are you able, until you solve
03:14:19.840 | the fundamental core parts of the problem,
03:14:22.680 | to estimate the chunkiness of that problem?
03:14:25.440 | - Well, if you have a history of prior chunkiness,
03:14:28.360 | that could be your best estimate for future chunkiness.
03:14:30.600 | So for example, I mean,
03:14:31.680 | even at the level of the world economy, right?
03:14:34.080 | We've had this, what, 10,000 years of civilization.
03:14:37.480 | Well, that's only a short time.
03:14:39.120 | You might say, oh, that doesn't predict future chunkiness.
03:14:42.120 | But it looks relatively steady and consistent.
03:14:46.280 | We can say, even in computer science,
03:14:48.600 | we've had seven years of computer science.
03:14:50.800 | We have enough data to look at chunkiness
03:14:52.520 | in computer science.
03:14:54.040 | Like, when were there algorithms or approaches
03:14:57.400 | that made a big, chunky difference?
03:14:59.160 | And how large a fraction of those that was that?
03:15:03.720 | And I'd say, mostly in computer science,
03:15:05.800 | most innovation has been relatively small chunks.
03:15:07.880 | The bigger chunks have been rare.
03:15:10.080 | - Well, this is the interesting thing.
03:15:11.600 | This is about AI and just algorithms in general,
03:15:14.480 | is page rank.
03:15:17.760 | So Google's, right?
03:15:19.640 | So sometimes it's a simple algorithm
03:15:23.600 | that by itself is not that useful, but the scale--
03:15:28.320 | - Context.
03:15:29.160 | - And in a context that's scalable,
03:15:31.680 | like depending on the, yeah, depending on the context,
03:15:34.280 | is all of a sudden the power is revealed.
03:15:36.480 | And there's something,
03:15:37.480 | I guess that's the nature of chunkiness,
03:15:39.720 | is that you could,
03:15:42.440 | things that can reach a lot of people simply
03:15:45.560 | can be quite chunky.
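(Aside, not from the conversation: the PageRank mentioned here really is a short algorithm; a basic power-iteration version fits in a few lines. The toy link graph below is made up; the point is that the leverage came from the web-scale context it was applied in, not from the size of the code.)

```python
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iters):
        new_rank = {node: (1 - damping) / n for node in nodes}
        for node, outs in links.items():
            if not outs:                        # dangling node: spread rank evenly
                for other in nodes:
                    new_rank[other] += damping * rank[node] / n
            else:
                for other in outs:
                    new_rank[other] += damping * rank[node] / len(outs)
        rank = new_rank
    return rank

toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print({page: round(score, 3) for page, score in pagerank(toy_web).items()})
```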
03:15:46.480 | - So one standard story about algorithms is to say,
03:15:49.760 | algorithms have a fixed cost plus a marginal cost.
03:15:54.400 | And so in history, when you had computers that were
03:15:57.000 | very small, you tried
03:15:58.800 | all the algorithms that had low fixed costs,
03:16:00.760 | and you looked for the best of those.
03:16:03.400 | But over time, as computers got bigger,
03:16:04.840 | you could afford to do larger fixed costs and try those.
03:16:07.760 | And some of those had more effective algorithms
03:16:10.600 | in terms of their marginal cost.
03:16:12.560 | And that, in fact, that roughly explains
03:16:15.720 | the long-term history where, in fact,
03:16:17.560 | the rate of algorithmic improvement is about the same
03:16:19.760 | as the rate of hardware improvement,
03:16:22.040 | which is a remarkable coincidence.
03:16:23.800 | But it would be explained by saying,
03:16:26.440 | well, there's all these better algorithms you can't try
03:16:29.840 | until you have a big enough computer to pay the fixed cost
03:16:32.440 | of doing some trials to find out if that algorithm
03:16:35.160 | actually saves you on the marginal cost.
03:16:37.320 | And so that's an explanation
03:16:39.600 | for this relatively continuous history where,
03:16:41.720 | so we have a good story about why hardware
03:16:43.480 | is so continuous, right?
03:16:44.640 | And you might think, why would software be so continuous
03:16:47.560 | with the hardware?
03:16:48.400 | But if there's a distribution of algorithms
03:16:49.960 | in terms of their fixed costs,
03:16:51.840 | and it's, say, spread out in a wide log-normal distribution,
03:16:55.280 | then we could be sort of marching
03:16:56.880 | through that log-normal distribution,
03:16:58.520 | trying out algorithms with larger fixed costs
03:17:00.840 | and finding the ones that have lower marginal costs.
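(Aside, not from the conversation: a toy simulation of the fixed-cost story just told. Fixed costs are drawn from a wide log-normal, algorithms with larger fixed costs tend, noisily, to have lower marginal costs, and the hardware budget doubles every year; as the budget grows, better algorithms keep becoming affordable at a fairly steady pace. All distributions and parameters here are made up to illustrate the mechanism, not fitted to any data.)

```python
import math
import random

random.seed(1)

# Candidate algorithms: (fixed cost, marginal cost) pairs.
algorithms = []
for _ in range(5000):
    fixed = random.lognormvariate(5, 4)                        # wide spread of fixed costs
    marginal = random.lognormvariate(0, 1) / math.sqrt(fixed)  # bigger fixed ~ cheaper marginal
    algorithms.append((fixed, marginal))

budget = 1.0                        # hardware budget, doubling every year
for year in range(0, 40, 5):        # print a snapshot every five years
    affordable = [m for f, m in algorithms if f <= budget]
    best = min(affordable) if affordable else float("inf")
    print(f"year {year:2d}  hardware budget {budget:.1e}  best marginal cost {best:.2e}")
    budget *= 2 ** 5                # five doublings between snapshots
```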
03:17:04.240 | - So would you say AGI, human-level, AI, even EM, M,
03:17:14.440 | emulated minds, is chunky?
03:17:18.560 | Like a few breakthroughs can take this?
03:17:20.040 | - So an M is by its nature chunky,
03:17:23.080 | in the sense that if you have an emulated brain
03:17:25.520 | and you're 25% effective at emulating it, that's crap.
03:17:29.560 | That's nothing.
03:17:30.400 | You pretty much need to emulate a full human brain.
03:17:35.160 | - Is that obvious?
03:17:36.600 | Is that obvious that the 25%- - I think it's pretty obvious.
03:17:38.760 | I'm talking about like, you know,
03:17:40.840 | so the key thing is you're emulating various brain cells,
03:17:43.600 | and so you have to emulate
03:17:44.560 | the input-output pattern of those cells.
03:17:46.520 | So if you get that pattern somewhat close,
03:17:49.040 | but not close enough, then the whole system
03:17:51.400 | just doesn't have the overall behavior
03:17:53.200 | you're looking for, right?
03:17:54.120 | - But it could have, functionally,
03:17:56.520 | some of the power of the overall system.
03:17:58.120 | - So there'll be some threshold.
03:17:59.160 | The point is, when you get close enough,
03:18:00.880 | then it goes over the threshold.
03:18:02.520 | It's like taking a computer chip
03:18:04.040 | and deleting every 1% of the gates, right?
03:18:07.200 | - No, that's very chunky.
03:18:09.520 | But the hope is that the emulating the human brain,
03:18:12.840 | I mean, the human brain itself is not-
03:18:14.520 | - Right, so it has a certain level of redundancy
03:18:16.680 | and a certain level of robustness.
03:18:17.800 | And so there's some threshold.
03:18:18.800 | When you get close to that level of redundancy
03:18:20.520 | and robustness, then it starts to work.
03:18:21.920 | But until you get to that level,
03:18:24.040 | it's just gonna be crap, right?
03:18:25.520 | It's gonna be just a big thing that isn't working well.
03:18:28.160 | So we can be pretty sure that emulations
03:18:31.160 | is a big chunk in an economic sense, right?
03:18:34.320 | At some point, you'll be able to make one
03:18:36.000 | that's actually effective and able to substitute for humans,
03:18:39.920 | and then that will be this huge economic product
03:18:43.040 | that people will try to buy like crazy.
03:18:44.480 | - You'll bring a lot of value to people's lives,
03:18:46.240 | so they'll be willing to pay for it.
03:18:48.960 | - But it could be that the first emulation costs
03:18:51.520 | a billion dollars each, right?
03:18:53.600 | And then we have them, but we can't really use them,
03:18:55.760 | they're too expensive.
03:18:56.600 | And then the cost slowly comes down,
03:18:58.040 | and now we have less of a chunky adoption, right?
03:19:02.440 | That as the cost comes down,
03:19:04.320 | then we use more and more of them in more and more contexts.
03:19:07.360 | And that's a more continuous curve.
03:19:10.120 | So it's only if the first emulations are relatively cheap
03:19:13.640 | that you get a more sudden disruption to society.
03:19:17.600 | And that could happen if sort of the algorithm
03:19:19.960 | is the last thing you figure out how to do or something.
03:19:21.840 | - What about robots that capture some magic
03:19:24.360 | in terms of social connection?
03:19:28.680 | The robots, like we have a robot dog
03:19:30.920 | on the carpet right there,
03:19:32.240 | robots that are able to capture some magic
03:19:36.160 | of human connection as they interact with humans,
03:19:39.960 | but are not emulating the brain.
03:19:42.000 | What about those, how far away?
03:19:44.920 | - So we're thinking about chunkiness or distance now.
03:19:48.840 | So if you ask how chunky is the task
03:19:51.160 | of making an emulatable robot or something,
03:19:54.800 | - Which chunkiness and time are correlated.
03:19:59.320 | - Right, but it's about how far away it is
03:20:02.040 | or how suddenly it would happen.
03:20:04.240 | Chunkiness is how suddenly,
03:20:06.160 | and difficulty is just how far away it is.
03:20:08.800 | But it could be a continuous difficulty.
03:20:10.720 | It could just be far away,
03:20:11.560 | but we'll slowly steadily get there.
03:20:13.080 | Or there could be these thresholds
03:20:14.480 | where we reach a threshold
03:20:15.440 | and suddenly we can do a lot better.
03:20:16.880 | - Yeah, that's a good question for both.
03:20:19.280 | I tend to believe that all of it, not just the M,
03:20:23.160 | but AGI too is chunky.
03:20:25.840 | And human level intelligence embodied in robots
03:20:30.840 | is also chunky.
03:20:32.120 | The history of computer science and chunkiness so far
03:20:34.440 | seems to be my rough best guess for the chunkiness of AGI.
03:20:37.840 | That is, it is chunky.
03:20:39.800 | Modestly chunky, not that chunky.
03:20:42.440 | Right?
03:20:43.280 | (laughing)
03:20:44.120 | Our ability to use computers to do many things in the economy
03:20:46.920 | has been moving relatively steadily.
03:20:48.600 | Overall, in terms of our use of computers in society,
03:20:51.520 | they have been relatively steadily improving for 70 years.
03:20:55.760 | - No, but I would say that's hard.
03:20:57.440 | Well, yeah, okay.
03:20:58.880 | Okay, I would have to really think about that
03:21:00.560 | 'cause neural networks are quite surprising.
03:21:03.880 | - Sure, but every once in a while
03:21:04.960 | we have a new thing that's surprising.
03:21:06.240 | But if you stand back,
03:21:07.920 | we see something like that every 10 years or so.
03:21:10.360 | Some new innovations-- - The progress is gradual.
03:21:12.280 | - That has a big effect.
03:21:13.640 | - So moderately chunky.
03:21:16.840 | Huh, yeah.
03:21:19.800 | - The history of the level of disruption
03:21:21.400 | we've seen in the past would be a rough estimate
03:21:23.240 | of the level of disruption in the future.
03:21:24.720 | Unless the future is we're gonna hit a chunky territory,
03:21:27.520 | much chunkier than we've seen in the past.
03:21:29.480 | - Well, I do think there's,
03:21:30.880 | it's like Kuhnian, like revolution type.
03:21:35.400 | It seems like the data, especially on AI,
03:21:39.640 | is difficult to reason with because it's so recent.
03:21:44.640 | It's such a recent field.
03:21:48.040 | - Well, I've been around for 50 years.
03:21:50.640 | - I mean, 50, 60, 70, 80 years being recent.
03:21:53.760 | - Okay.
03:21:54.600 | - That's how I'm--
03:21:56.240 | - It's enough time to see a lot of trends.
03:21:58.880 | - A few trends, a few trends.
03:22:01.080 | I think the internet, computing,
03:22:04.840 | there's really a lot of interesting stuff
03:22:07.040 | that's happened over the past 30 years
03:22:09.360 | that I think the possibility of revolutions
03:22:13.880 | is likelier than it was in the--
03:22:16.880 | - I think for the last 70 years,
03:22:18.040 | there have always been a lot of things
03:22:19.600 | that looked like they had a potential for revolution.
03:22:21.240 | - So we can't reason well about this.
03:22:23.200 | - I mean, we can reason well
03:22:24.600 | by looking at the past trends.
03:22:26.000 | I would say the past trend is roughly your best guess
03:22:28.480 | for the future.
03:22:29.320 | - No, but if I look back at the things
03:22:32.520 | that might have looked like revolutions
03:22:33.840 | in the '70s and '80s and '90s,
03:22:35.640 | they are less like the revolutions
03:22:40.400 | that appear to be happening now,
03:22:42.240 | or the capacity for revolution that appears to be there now.
03:22:45.520 | First of all, there's a lot more money to be made.
03:22:48.920 | So there's a lot more incentive for markets
03:22:50.800 | to do a lot of kind of innovation,
03:22:52.360 | it seems like, in the AI space.
03:22:54.680 | But then again, there's a history of winters
03:22:57.160 | and summers and so on.
03:22:58.680 | So maybe we're just like riding a nice wave right now.
03:23:01.080 | - One of the biggest issues is the difference
03:23:03.240 | between impressive demos and commercial value.
03:23:05.880 | - Yes.
03:23:06.720 | - So we often, through the history of AI,
03:23:08.320 | we saw very impressive demos
03:23:10.360 | that never really translated much into commercial value.
03:23:12.920 | - As somebody who works on and cares about autonomous
03:23:15.480 | and semi-autonomous vehicles, tell me about it.
03:23:18.080 | And there again, we return to the number
03:23:21.440 | of Elon Musks per Earth per year generated.
03:23:27.040 | That's the M.
03:23:28.040 | Coincidentally, same initials as the M.
03:23:30.800 | - Yeah.
03:23:31.640 | - Very suspicious, very suspicious.
03:23:33.240 | We're gonna have to look into that.
03:23:35.120 | All right, two more fields that I would like to force
03:23:39.320 | and twist your arm to.
03:23:40.520 | - All right.
03:23:41.360 | - To look for view quakes and for beautiful ideas, economics.
03:23:44.280 | What is a beautiful idea to you about economics?
03:23:49.280 | You've mentioned a lot of them already.
03:23:53.160 | - Sure, so as you said before,
03:23:55.760 | there's gonna be the first view quake most people encounter
03:23:58.600 | that makes the biggest difference on average in the world,
03:24:01.040 | 'cause that's the only thing most people ever see
03:24:03.360 | is the first one.
03:24:05.040 | And so, with AI, the first one is just how big
03:24:09.360 | the problem is, but once you get past that,
03:24:11.760 | you'll find others.
03:24:12.600 | Certainly for economics, the first one is just
03:24:16.080 | the power of markets.
03:24:17.240 | You might've thought it was just really hard
03:24:20.800 | to figure out how to optimize in a big, complicated space,
03:24:24.160 | and markets just do a good first pass
03:24:27.640 | for an awful lot of stuff.
03:24:29.000 | And they are really quite robust and powerful.
03:24:31.400 | And that's just quite the view quake,
03:24:35.080 | where you just say, you know, just let a,
03:24:37.600 | if you wanna get in the ballpark,
03:24:39.160 | just let a market handle it and step back.
03:24:42.120 | And that's true for a wide range of things.
03:24:44.720 | It's not true for everything,
03:24:45.560 | but it's a very good first approximation.
03:24:48.720 | And most people's intuitions for how they should limit
03:24:50.720 | markets are actually messing them up.
03:24:53.560 | They're that good, in a sense, right?
03:24:55.160 | Most people, when you go, I don't know if we wanna trust
03:24:57.200 | that, well, you should be trusting that.
03:24:59.120 | - What about, what are markets?
03:25:02.600 | Like just a couple of words.
03:25:05.520 | - So the idea is if people want something,
03:25:09.120 | then let other companies form to try to supply that thing.
03:25:12.520 | Let those people pay for their cost
03:25:14.560 | of whatever they're making and try to offer that product
03:25:17.000 | to those people.
03:25:17.840 | Let many people, many such firms enter that industry
03:25:21.240 | and let the customers decide which ones they want.
03:25:23.280 | And if the firm goes out of business, let it go bankrupt
03:25:25.760 | and let other people invest in whichever ventures
03:25:28.080 | they wanna try to attract customers
03:25:29.960 | to their version of the product.
03:25:31.840 | And that just works for a wide range
03:25:33.200 | of products and services.
03:25:34.440 | - And through all of this,
03:25:35.360 | there's a free exchange of information too.
03:25:37.800 | There's a hope that there's no manipulation of information
03:25:40.680 | and so on, that there, you're making--
03:25:43.400 | - Even when those things happen,
03:25:45.440 | still just the simple market solution is usually better
03:25:48.120 | than the things you'll try to do to fix it.
03:25:49.840 | - Than the alternative.
03:25:52.560 | - That's a view quake.
03:25:53.760 | It's surprising.
03:25:54.600 | It's not what you would have initially thought.
03:25:57.120 | - That's one of the great, I guess, inventions
03:25:59.800 | of human civilization that trusts the markets.
03:26:03.240 | - Now, another view quake that I learned in my research
03:26:06.680 | that's not all of economics but something more specialized
03:26:09.200 | is the rationality of disagreement.
03:26:12.240 | That is, basically, people who are trying to believe
03:26:14.720 | what's true in a complicated situation
03:26:17.560 | would not actually disagree.
03:26:18.960 | And of course, humans disagree all the time,
03:26:22.280 | so it was quite the striking fact for me to learn
03:26:24.400 | in grad school that actually,
03:26:26.560 | rational agents would not knowingly disagree.
03:26:28.800 | And so, that makes disagreement more puzzling
03:26:32.200 | and it makes you less willing to disagree.
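(Aside, not from the conversation: the formal result behind "rational agents would not knowingly disagree" is Aumann's 1976 agreement theorem. For two agents who share a common prior $P$ and have information partitions $\mathcal{I}_1$ and $\mathcal{I}_2$:

$$ \text{if the posteriors } q_i = P(E \mid \mathcal{I}_i) \text{ for an event } E \text{ are common knowledge, then } q_1 = q_2. $$

That is, Bayesians with common priors cannot agree to disagree about the probability of an event, which is why persistent human disagreement suggests something other than pure truth-seeking is going on.)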
03:26:35.180 | - Humans are, to some degree, rational and are able to--
03:26:42.520 | - Their priorities are different
03:26:44.560 | than just figuring out the truth.
03:26:46.260 | Which might not be the same as being irrational.
03:26:52.280 | - That's another tangent that could take an hour.
03:26:54.720 | In the space of human affairs, political science,
03:27:01.300 | what is a beautiful, foundational, interesting idea to you,
03:27:06.000 | a view quake, in the space of political science?
03:27:08.560 | - The main thing that goes wrong in politics
03:27:13.640 | is people not agreeing on what the best thing to do is.
03:27:17.760 | - That's a wrong thing.
03:27:20.720 | - So that's what goes wrong, that is when you say
03:27:22.400 | what's fundamentally behind most political failures,
03:27:25.340 | it's that people are ignorant
03:27:27.400 | of what the consequences of policy is.
03:27:29.800 | And that's surprising because it's actually feasible
03:27:33.280 | to solve that problem, which we aren't solving.
03:27:35.800 | - So it's a bug, not a feature,
03:27:37.160 | that there's an inability to arrive at a consensus.
03:27:42.160 | - So most political systems,
03:27:44.800 | if everybody looked to some authority, say, on a question,
03:27:47.640 | and that authority told them the answer,
03:27:49.320 | then most political systems are capable
03:27:51.040 | of just doing that thing.
03:27:52.280 | That is, and so it's the failure
03:27:57.840 | to have trustworthy authorities
03:28:00.120 | that is sort of the underlying failure
03:28:02.720 | behind most political failure.
03:28:04.480 | We fail, we invade Iraq, say,
03:28:07.120 | when we don't have an authority to tell us
03:28:09.080 | that's a really stupid thing to do.
03:28:10.840 | And it is possible to create
03:28:14.840 | more informative, trustworthy authorities,
03:28:17.760 | and that's a remarkable fact
03:28:19.280 | about the world of institutions,
03:28:22.080 | that we could do that, but we aren't.
03:28:24.660 | - Yeah, that's surprising.
03:28:26.640 | We could and we aren't.
03:28:28.160 | - Right, another big view quake about politics
03:28:30.120 | is from the elephant in the brain,
03:28:31.320 | that most people, when they're interacting with politics,
03:28:33.360 | they say they want to make the world better,
03:28:35.920 | they make their city better, their country better,
03:28:37.640 | and that's not their priority.
03:28:39.360 | - What is it?
03:28:40.200 | - They want to show loyalty to their allies.
03:28:42.760 | They wanna show their people they're on their side, yes.
03:28:47.000 | or the various tribes they're in.
03:28:47.000 | That's their primary priority, and they do accomplish that.
03:28:51.480 | - Yeah, and the tribes are usually color-coded,
03:28:54.240 | conveniently enough.
03:28:55.540 | What would you say, you know, it's the Churchill question,
03:29:01.480 | democracy's the crappiest form of government,
03:29:04.360 | but it's the best one we got.
03:29:05.800 | What's the best form of government
03:29:08.280 | for this, our, seven billion human civilization,
03:29:12.740 | and the, maybe, as we get farther and farther,
03:29:16.160 | you mentioned a lot of stuff that's fascinating
03:29:18.160 | about human history as we become more forager-like,
03:29:21.760 | and looking out beyond, what's the best form of government
03:29:25.160 | in the next 50, 100 years
03:29:26.680 | as we become a multi-planetary species?
03:29:28.520 | - So, the key failing is that
03:29:32.520 | we have existing political institutions
03:29:35.040 | and related institutions, like media institutions
03:29:37.680 | and other authority institutions,
03:29:39.300 | and these institutions sit in a vast space
03:29:42.600 | of possible institutions.
03:29:44.240 | And the key failing, we're just not exploring that space.
03:29:47.600 | So, I have made my proposals in that space,
03:29:50.560 | and I think I can identify many promising solutions,
03:29:53.600 | and many other people have made
03:29:54.720 | many other promising proposals in that space,
03:29:57.080 | but the key thing is we're just not pursuing
03:29:58.960 | those proposals, we're not trying them out on small scales,
03:30:01.360 | we're not doing tests, we're not exploring
03:30:04.240 | the space of these options.
03:30:05.760 | That is the key thing we're failing to do.
03:30:08.040 | And if we did that, I am confident we would find
03:30:11.520 | much better institutions than the one we're using now,
03:30:13.920 | but we would have to actually try.
03:30:15.940 | (Lex laughing)
03:30:17.400 | - So, a lot of those topics,
03:30:19.960 | I do hope we get a chance to talk again.
03:30:21.860 | You're a fascinating human being,
03:30:23.640 | so I'm skipping a lot of tangents on purpose
03:30:26.480 | that I would love to take.
03:30:28.040 | You're such a brilliant person
03:30:29.280 | on so many different topics.
03:30:31.160 | Let me take a stroll
03:30:35.240 | into the deep human psyche
03:30:39.820 | of Robin Hanson himself.
03:30:42.960 | So first-- - May not be that deep.
03:30:44.640 | (both laughing)
03:30:47.300 | I might just be all on the surface.
03:30:49.880 | What you see, what you get,
03:30:50.840 | there might not be much hiding behind it.
03:30:52.440 | - Some of the fun is on the surface.
03:30:55.000 | - I actually think this is true
03:30:57.880 | of many of the most successful,
03:30:59.800 | most interesting people you see in the world.
03:31:02.080 | That is, they have put so much effort
03:31:04.740 | into the surface that they've constructed,
03:31:07.640 | and that's where they put all their energy.
03:31:09.600 | So somebody might be a statesman
03:31:12.360 | or an actor or something else,
03:31:13.640 | and people wanna interview them,
03:31:14.600 | and they wanna say, "What are you behind the scenes?
03:31:16.760 | "What do you do in your free time?"
03:31:17.960 | You know what?
03:31:18.800 | Those people don't have free time.
03:31:19.640 | They don't have another life behind the scenes.
03:31:21.920 | They put all their energy into that surface,
03:31:24.700 | the one we admire, the one we're fascinated by,
03:31:27.340 | and they kinda have to make up the stuff behind the scenes
03:31:29.740 | to supply it for you, but it's not really there.
03:31:32.920 | - Well, there's several ways of phrasing this.
03:31:34.480 | So one of it is authenticity,
03:31:35.920 | which is if you become the thing you are on the surface,
03:31:40.920 | if the depths mirror the surface,
03:31:45.440 | then that's what authenticity is.
03:31:47.480 | You're not hiding something.
03:31:48.640 | You're not concealing something.
03:31:49.920 | To push back on the idea of actors,
03:31:51.920 | they actually have often a manufactured surface
03:31:55.840 | that they put on, and they try on different masks,
03:31:58.880 | and the depths are very different from the surface,
03:32:01.800 | and that's actually what makes them
03:32:02.880 | very not interesting to interview.
03:32:04.800 | If you're an actor who actually lives the role
03:32:09.800 | that you play, so like, I don't know,
03:32:13.440 | a Clint Eastwood-type character
03:32:14.720 | who clearly represents the cowboy,
03:32:17.760 | I mean, at least rhymes or echoes
03:32:21.480 | the person you play on the surface, that's authenticity.
03:32:24.560 | - Some people are typecasts,
03:32:25.880 | and they have basically one persona.
03:32:27.580 | They play in all of their movies and TV shows,
03:32:29.680 | and so those people, it probably is
03:32:31.460 | the actual persona that they are,
03:32:34.200 | or it has become that over time.
03:32:36.640 | Clint Eastwood would be one.
03:32:37.700 | I think of Tom Hanks as another.
03:32:39.120 | I think they just always play the same person.
03:32:41.000 | - And you and I are just both surface players.
03:32:44.600 | You're the fun, brilliant thinker,
03:32:47.960 | and I am the suit-wearing idiot full of silly questions.
03:32:52.960 | All right, that said, let's put on your wise sage hat
03:33:02.520 | and ask you what advice would you give
03:33:04.360 | to young people today in high school and college
03:33:08.080 | about life, about how to live a successful life
03:33:12.880 | in career or just in general that they can be proud of?
03:33:16.380 | - Most young people, when they actually ask you
03:33:20.640 | that question, what they usually mean
03:33:22.560 | is how can I be successful by usual standards.
03:33:26.200 | - Yeah.
03:33:27.020 | - I'm not very good at giving advice about that
03:33:28.500 | 'cause that's not how I tried to live my life.
03:33:31.680 | So I would more flip it around and say,
03:33:35.960 | you live in a rich society, you will have a long life,
03:33:40.080 | you have many resources available to you.
03:33:42.380 | Whatever career you take, you'll have plenty of time
03:33:46.580 | to make progress on something else.
03:33:49.440 | Yes, it might be better if you find a way
03:33:51.360 | to combine your career and your interests
03:33:53.760 | in a way that gives you more time and energy,
03:33:55.320 | but there are often big compromises there as well.
03:33:58.560 | So if you have a passion about some topic
03:34:01.040 | or some thing that you think just is worth pursuing,
03:34:03.920 | you can just do it, you don't need other people's approval.
03:34:06.880 | And you can just start doing whatever it is
03:34:10.640 | you think is worth doing.
03:34:12.520 | It might take you decades, but decades are enough
03:34:14.800 | to make enormous progress on most all interesting things.
03:34:18.520 | - And don't worry about the commitment of it.
03:34:20.520 | I mean, that's a lot of what people worry about is,
03:34:23.000 | well, there's so many options, and if I choose a thing
03:34:25.540 | and I stick with it, I sacrifice all the other
03:34:28.280 | paths I could have taken.
03:34:29.360 | - But I mean, so I switched my career at the age of 34
03:34:32.320 | with two kids age zero and two, went back to grad school
03:34:35.040 | in social science after being a
03:34:37.720 | research software engineer.
03:34:40.640 | So it's quite possible to change your mind later in life.
03:34:43.860 | - How can you have an age of zero?
03:34:46.980 | - Ah, less than one.
03:34:49.320 | - Okay, so, oh, oh, you indexed with zero, I got it, okay.
03:34:55.080 | - Right, and you know, like people also ask what to read,
03:34:57.720 | and I say textbooks.
03:34:59.480 | And until you've read lots of textbooks,
03:35:02.160 | or maybe review articles, I'm not so sure you should
03:35:04.840 | be reading blog posts and Twitter feeds and even podcasts.
03:35:09.840 | I would say at the beginning, read those.
03:35:13.800 | Humanity's best summary
03:35:16.000 | of how to learn things is crammed into textbooks.
03:35:18.280 | - Yeah, especially the ones on introduction to biology.
03:35:22.360 | - Yeah, everything, introduction to everything.
03:35:23.640 | Just read all the textbooks. - Algorithms.
03:35:25.420 | - Read as many textbooks as you can stomach
03:35:27.640 | and then maybe if you wanna know more about a subject,
03:35:29.600 | find review articles.
03:35:31.240 | You don't need to read the latest stuff for most topics.
03:35:33.600 | - Yeah, and actually textbooks often have
03:35:35.560 | the prettiest pictures. - There you go.
03:35:37.760 | - And depending on the field, if it's technical,
03:35:40.120 | then doing the homework problems at the end
03:35:42.600 | is actually extremely, extremely useful.
03:35:44.880 | Extremely powerful way to understand something
03:35:47.440 | if you allow it.
03:35:48.880 | You know, I actually think of like high school and college,
03:35:52.520 | which you kind of remind me of.
03:35:54.720 | People don't often think of it that way,
03:35:56.280 | but you will almost not again get an opportunity
03:36:01.280 | to spend the time with a fundamental subject.
03:36:04.360 | - Bring up lots of stuff.
03:36:05.720 | - And like, no. - All the basics.
03:36:06.760 | - And everybody's forcing you,
03:36:08.120 | like everybody wants you to do it.
03:36:10.400 | And like you'll never get that chance again to sit there,
03:36:14.820 | even though it's outside of your interest, biology.
03:36:16.920 | Like in high school I took AP biology, AP chemistry.
03:36:20.840 | I'm thinking of subjects I never again
03:36:24.840 | really visited seriously.
03:36:26.480 | And it was so nice to be forced into anatomy and physiology,
03:36:31.480 | to be forced into that world, to stay with it,
03:36:35.200 | to look at the pretty pictures,
03:36:36.840 | to certain moments to actually for a moment
03:36:39.760 | enjoy the beauty of these, of like how a cell works
03:36:43.520 | and all those kinds of things.
03:36:44.640 | And somehow that stays, like the ripples of that fascination
03:36:48.740 | that stays with you even if you never do those,
03:36:51.120 | even if you never utilize those learnings
03:36:56.040 | in your actual work.
03:36:56.960 | - A common problem, at least of many young people I meet,
03:36:59.880 | is that they're like feeling idealistic and altruistic,
03:37:03.320 | but in a rush.
03:37:05.040 | - Yes.
03:37:05.880 | - So the usual human tradition that goes back
03:37:09.640 | hundreds of thousands of years
03:37:10.840 | is that people's productivity rises with time
03:37:13.640 | and maybe peaks around the age of 40 or 50.
03:37:16.440 | The age of 40, 50 is when you will be
03:37:18.120 | having the highest income, you'll have the most contacts,
03:37:21.080 | you will sort of be wise about how the world works.
03:37:23.640 | Expect to have your biggest impact then.
03:37:27.720 | Before then, you can have impacts,
03:37:30.200 | but you're also mainly building up
03:37:32.060 | your resources and abilities.
03:37:33.580 | That's the usual human trajectory,
03:37:37.640 | expect that to be true of you too.
03:37:39.640 | Don't be in such a rush to like accomplish
03:37:41.640 | enormous things at the age of 18 or whatever.
03:37:43.920 | I mean, you might as well practice trying to do things,
03:37:46.080 | but that's mostly about learning how to do things
03:37:48.600 | by practicing.
03:37:49.440 | There's a lot of things you can't do
03:37:50.380 | unless you just keep trying them.
03:37:51.980 | - And when all else fails,
03:37:54.720 | try to maximize the number of offspring
03:37:56.600 | however way you can.
03:37:58.520 | - That's certainly something I've neglected.
03:38:00.280 | I would tell my younger version of myself,
03:38:03.360 | try to have more descendants.
03:38:04.820 | Yes, absolutely.
03:38:07.520 | It matters more than I realized at the time.
03:38:10.420 | - Both in terms of making copies of yourself
03:38:15.960 | in mutated form and just the joy of raising them?
03:38:20.960 | - Sure.
03:38:22.040 | I mean, the meaning even.
03:38:23.960 | So in the literature on the value people get out of life,
03:38:29.680 | there's a key distinction between happiness and meaning.
03:38:32.480 | So happiness is how do you feel right now about right now,
03:38:36.440 | and meaning is how do you feel about your whole life?
03:38:39.040 | And many things that produce happiness
03:38:43.340 | don't produce meaning as reliably,
03:38:44.880 | and if you have to choose between them,
03:38:46.000 | you'd rather have meaning.
03:38:48.080 | And meaning goes along with sacrificing happiness sometimes.
03:38:53.080 | And children are an example of that.
03:38:56.480 | You get a lot more meaning out of children,
03:38:58.880 | even if they're a lot more work.
03:39:01.100 | - Why do you think kids, children are so magical,
03:39:07.520 | like raising kids?
03:39:08.960 | 'Cause I would love to have kids,
03:39:12.120 | and whenever I work with robots,
03:39:16.080 | there's some of the same magic
03:39:17.340 | when there's an entity that comes to life.
03:39:19.640 | And in that case, I'm not trying to draw too many parallels,
03:39:23.860 | but there is some echo to it,
03:39:27.360 | which is when you program a robot,
03:39:29.620 | there's some aspect of your intellect
03:39:32.160 | that is now instilled in this other moving being
03:39:35.400 | that's kind of magical.
03:39:37.320 | Well, why do you think that's magical?
03:39:39.280 | And you said happiness and meaning.
03:39:41.940 | - Meaningful. - As opposed to short-term happiness,
03:39:44.820 | why is it meaningful?
03:39:46.320 | - Overdetermined, like I can give you
03:39:49.380 | several different reasons, each of which is sufficient.
03:39:51.860 | And so the question is,
03:39:52.900 | we don't know which ones are the correct reasons.
03:39:54.820 | - Such a technical, it's overdetermined, look it up.
03:39:58.780 | - So I meet a lot of people interested in the future,
03:40:01.460 | interested in thinking about the future.
03:40:02.940 | They're thinking about how can I influence the future?
03:40:05.220 | But overwhelmingly in history so far,
03:40:08.540 | the main way people have influenced the future
03:40:10.380 | is by having children, overwhelmingly.
03:40:13.740 | And that's just not an incidental fact.
03:40:16.820 | You are built for that.
03:40:18.820 | That is, you're the sequence of thousands of generations,
03:40:22.580 | each of which successfully had a descendant.
03:40:25.340 | And that affected who you are.
03:40:27.500 | You just have to expect, and it's true, that who you are
03:40:30.980 | is built to expect to have a child,
03:40:34.900 | to want to have a child,
03:40:37.220 | to have that be a natural and meaningful interaction for you.
03:40:40.260 | And it's just true.
03:40:41.780 | It's just one of those things you just should have expected,
03:40:44.020 | and it's not a surprise.
03:40:46.340 | - Well, to push back and sort of,
03:40:49.140 | in terms of influencing the future,
03:40:51.980 | as we get more and more technology,
03:40:54.180 | more and more of us are able to influence the future
03:40:56.780 | in all kinds of other ways, right?
03:40:58.740 | Being a teacher, educator.
03:40:59.980 | - Even so, though, still most of our influence
03:41:02.660 | on the future has probably been having kids,
03:41:05.300 | even though we've accumulated more other ways to do it.
03:41:08.780 | - You mean at scale.
03:41:09.900 | I guess the depth of influence,
03:41:11.940 | like really how much effort,
03:41:14.180 | how much of yourself you really put into another human being.
03:41:16.500 | Do you mean both the raising of a kid,
03:41:20.340 | or do you mean raw genetic information?
03:41:23.360 | - Well, both, but raw genetics
03:41:25.260 | is probably more than half of it.
03:41:27.140 | - More than half.
03:41:28.280 | More than half, even in this modern world?
03:41:31.220 | - Yeah.
03:41:32.860 | - Genetics.
03:41:33.960 | Let me ask some dark, difficult questions, if I might.
03:41:39.520 | Let's take a stroll into that place
03:41:43.300 | that may or may not exist, according to you.
03:41:46.900 | What's the darkest place you've ever gone to
03:41:48.940 | in your mind, in your life?
03:41:50.940 | A dark time, a challenging time in your life
03:41:54.060 | that you had to overcome?
03:41:55.320 | - Probably just feeling strongly rejected.
03:42:02.620 | And so I'm apparently somewhat emotionally scarred
03:42:07.260 | by just being very rejection-averse,
03:42:09.700 | which must have happened because some rejections
03:42:12.260 | were just very scarring.
03:42:14.460 | - At a scale, in what kinds of communities?
03:42:17.820 | On the individual scale?
03:42:19.860 | - I mean, lots of different scales, yeah.
03:42:22.380 | All the different, many different scales.
03:42:24.540 | Still, that rejection stings.
03:42:26.340 | - Hold on a second, but you're a contrarian thinker.
03:42:32.260 | You challenge the norms.
03:42:34.700 | Why, if you were scarred by rejection,
03:42:39.700 | why welcome it in so many ways
03:42:42.820 | at a much larger scale, constantly with your ideas?
03:42:45.460 | - It could be that I'm just stupid.
03:42:47.620 | Or that I've just categorized them differently
03:42:51.780 | than I should or something.
03:42:53.140 | Most rejection that I've faced hasn't been
03:42:58.300 | because of my intellectual ideas.
03:43:03.100 | - Oh, so that once--
03:43:04.220 | - The intellectual ideas haven't been the thing
03:43:06.140 | to risk the rejection.
03:43:07.500 | - The one that, the things that challenge your mind,
03:43:12.500 | taking you to a dark place,
03:43:15.900 | the more psychological rejections.
03:43:18.100 | - You just asked me what took me to a dark place.
03:43:21.780 | You didn't specify it as sort of
03:43:23.500 | an intellectual dark place, I guess.
03:43:25.020 | Yeah, I just meant like what--
03:43:27.260 | - So intellectual is disjoint, or at least
03:43:30.980 | at a more surface level than something emotional.
03:43:35.300 | - Yeah, I would just think there are times
03:43:37.620 | in your life when you're just in a dark place
03:43:40.020 | and that can have many different causes.
03:43:42.060 | Most intellectuals are still just people
03:43:45.620 | and most of the things that will affect them
03:43:47.460 | are the kinds of things that affect people.
03:43:49.260 | They aren't that different necessarily.
03:43:51.300 | I mean, that's gonna be true for,
03:43:52.380 | like I presume most basketball players
03:43:54.100 | are still just people.
03:43:55.020 | If you ask them what was the worst part of their life,
03:43:56.620 | it's gonna be this kind of thing
03:43:58.420 | that was the worst part of life for most people.
03:44:00.260 | - So rejection early in life?
03:44:02.340 | - Yeah, I mean, not in grade school probably,
03:44:05.820 | but yeah, sort of being a young nerdy guy
03:44:08.340 | and feeling not in much demand or interest
03:44:13.340 | or later on lots of different kinds of rejection.
03:44:18.860 | But yeah, but I think that's,
03:44:21.460 | most of us like to pretend we don't that much
03:44:25.180 | need other people, we don't care what they think.
03:44:27.420 | It's a common sort of stance if somebody rejects you
03:44:29.660 | and says, "Oh, I didn't care about them anyway."
03:44:32.300 | But I think to be honest, people really do care.
03:44:35.100 | - Yeah, we do seek that connection, that love.
03:44:37.900 | What do you think is the role of love
03:44:39.540 | in the human condition?
03:44:40.880 | - Opacity in part.
03:44:45.980 | That is, love is one of those things
03:44:51.300 | where we know at some level it's important to us,
03:44:54.220 | but it's not very clearly shown to us
03:44:56.940 | exactly how or why or in what ways.
03:44:59.580 | There are some kinds of things we want
03:45:01.740 | where we can just clearly see that we want it
03:45:04.300 | and why we want it, right?
03:45:04.300 | We know when we're thirsty and we know why we were thirsty
03:45:06.700 | and we know what to do about being thirsty
03:45:08.500 | and we know when it's over that we're no longer thirsty.
03:45:11.340 | Love isn't like that.
03:45:13.800 | - It's like, what do we seek from this?
03:45:16.620 | We're drawn to it, but we do not understand
03:45:18.820 | why we're drawn exactly, because it's not just affection,
03:45:23.660 | because if it was just affection,
03:45:25.180 | we don't seem to be drawn to pure affection.
03:45:28.020 | We don't seem to be drawn to somebody who's like a servant.
03:45:32.980 | We don't seem to be necessarily drawn to somebody
03:45:35.200 | that satisfies all your needs or something like that.
03:45:38.260 | - So it's clearly something we want or need,
03:45:41.540 | but we're not exactly very clear about it,
03:45:43.620 | and that is kind of important to it.
03:45:45.620 | So I've also noticed there are some kinds of things
03:45:48.220 | you can't imagine very well.
03:45:49.820 | So if you imagine a situation,
03:45:51.820 | there's some aspects of the situation
03:45:53.180 | that you can clearly, you can imagine it being bright
03:45:55.060 | or dim, you can imagine it being windy,
03:45:57.100 | or you can imagine it being hot or cold,
03:45:59.860 | but there's some aspects about your emotional stance
03:46:02.380 | in a situation that's actually just hard to imagine
03:46:05.560 | or even remember.
03:46:06.400 | It's hard to like, you can often remember an emotion
03:46:08.820 | only when you're in a similar sort of emotional situation,
03:46:11.540 | and otherwise you just can't bring the emotion
03:46:14.620 | to your mind, and you can't even imagine it, right?
03:46:17.700 | So there's certain kinds of emotions you can have,
03:46:20.340 | and when you're in that emotion,
03:46:21.380 | you can know that you have it,
03:46:22.460 | and you can have a name and it's associated,
03:46:23.980 | but later on I tell you, you know,
03:46:26.180 | remember joy, and it doesn't come to mind.
03:46:29.020 | - Not able to replay it.
03:46:30.740 | - Right, and it's sort of a reason why we have,
03:46:33.180 | one of the reasons that pushes us to reconsume it
03:46:35.540 | and reproduce it is that we can't reimagine it.
03:46:39.700 | - Well, there's a, it's interesting,
03:46:41.340 | 'cause there's a Daniel Kahneman type of thing
03:46:44.540 | of like reliving memories,
03:46:45.780 | 'cause I'm able to summon some aspect of that emotion,
03:46:49.660 | again, by thinking of that situation
03:46:51.820 | from which that emotion came.
03:46:53.900 | - Right.
03:46:54.740 | - So like a certain song, you can listen to it,
03:46:58.140 | and you can feel the same way you felt
03:46:59.820 | the first time you remembered that song associated
03:47:02.140 | with a certain-- - Right, but you need
03:47:02.980 | to remember that situation in some sort of complete package.
03:47:05.740 | - Yes, and then-- - You can't just take
03:47:06.780 | one part off of it, and then if you get
03:47:08.540 | the whole package again, if you remember the whole feeling.
03:47:11.100 | - Yes, or some fundamental aspect of that whole experience
03:47:14.540 | from which the feeling arose,
03:47:17.180 | and actually the feeling is probably different in some way.
03:47:20.500 | It could be more pleasant or less pleasant
03:47:22.220 | than the feeling you felt originally,
03:47:24.060 | and that morphs over time,
03:47:25.500 | every time you replay that memory.
03:47:27.340 | It is interesting, you're not able
03:47:28.660 | to replay the feeling perfectly.
03:47:31.540 | You don't remember the feeling,
03:47:32.700 | you remember the facts of the events.
03:47:34.500 | - So there's a sense in which, over time,
03:47:36.100 | we expand our vocabulary as a community of language,
03:47:39.300 | and that allows us to sort of have more feelings
03:47:41.540 | and know that we are feeling them.
03:47:43.700 | 'Cause you can have a feeling, but not have a word for it,
03:47:45.580 | and then you don't know how to categorize it,
03:47:47.660 | or even what it is, and whether it's the same
03:47:49.760 | as something else, but once you have a word for it,
03:47:52.260 | you can sort of pull it together more easily.
03:47:54.980 | And so I think, over time, we are having
03:47:57.160 | a richer palette of feelings,
03:47:59.340 | 'cause we have more words for them.
03:48:03.040 | - What has been a painful loss in your life?
03:48:05.640 | Maybe somebody or something that's no longer in your life,
03:48:09.820 | but played an important part in your life.
03:48:12.720 | - Youth? (laughs)
03:48:14.880 | - That's a concept, no, it has to be--
03:48:17.440 | - But I was once younger, I had more health,
03:48:19.720 | and I had vitality, I was handsomer,
03:48:21.340 | I mean, you know, I've lost that over time.
03:48:23.080 | - Do you see that as a different person?
03:48:24.480 | Maybe you've lost that person?
03:48:26.080 | - Certainly, yes, absolutely, I'm a different person
03:48:28.560 | than I was when I was younger, and I'm not who,
03:48:31.360 | I don't even remember exactly what he was.
03:48:33.440 | So I don't remember as many things from the past
03:48:35.880 | as many people do, so in some sense,
03:48:37.360 | I've just lost a lot of my history by not remembering it.
03:48:40.880 | - Does that-- - And I'm not that person
03:48:42.680 | anymore, that person's gone, and I don't have
03:48:44.240 | any of their abilities. - Is that a painful loss?
03:48:45.200 | Is it a painful loss, though?
03:48:46.960 | - Yeah. - Or is it a,
03:48:48.680 | why is it painful?
03:48:50.300 | 'Cause you're wiser, you're, I mean,
03:48:54.800 | there's so many things that are beneficial
03:48:57.120 | to getting older. - Right, but--
03:48:59.580 | - Or you just call it-- - I just was this person,
03:49:02.800 | and I felt assured that I could continue to be that person.
03:49:06.520 | - And you're no longer that person.
03:49:07.840 | - And he's gone, and I'm not him anymore,
03:49:10.280 | and he died without fanfare or a funeral.
03:49:14.480 | - And that the person you are today, talking to me,
03:49:16.800 | that person will be changed, too.
03:49:20.800 | - Yes, and in 20 years, he won't be there anymore.
03:49:24.440 | - And a future person, you have to,
03:49:26.600 | we'll look back, a future version of you--
03:49:30.680 | - For M's, this'll be less of a problem.
03:49:32.660 | For M's, they would be able to save an archived copy
03:49:35.160 | of themselves at each different age,
03:49:37.240 | and they could turn it on periodically
03:49:39.080 | and go back and talk to it.
03:49:40.120 | - To replay.
03:49:41.060 | You think some of that will be,
03:49:44.480 | so with emulated minds, with M's,
03:49:47.380 | there's a digital cloning that happens,
03:49:54.000 | and do you think that makes you
03:49:58.240 | less special, if you're clonable?
03:50:03.440 | Like, the experience of life,
03:50:08.440 | the experience of a moment, the scarcity of that moment,
03:50:12.840 | the scarcity of that experience,
03:50:14.960 | isn't that a fundamental part of what makes
03:50:16.880 | that experience so delicious, so rich of feeling?
03:50:20.280 | - I think if you think of a song
03:50:21.920 | that lots of people listen to
03:50:23.440 | that are copies all over the world,
03:50:25.040 | we're gonna call that a more special song.
03:50:27.140 | - Yeah, yeah.
03:50:30.380 | So there's a perspective on copying and cloning
03:50:37.680 | where you're just scaling happiness versus degrading it.
03:50:42.520 | Each copy of a song is less special
03:50:44.720 | if there are many copies,
03:50:45.680 | but the song itself is more special
03:50:47.580 | if there are many copies.
03:50:48.640 | - And en masse, right, you're actually spreading
03:50:52.960 | the happiness even if it diminishes
03:50:54.920 | over a larger number of people at scale,
03:50:56.800 | and that increases the overall happiness in the world.
03:50:59.600 | And then you're able to do that with multiple songs.
03:51:02.320 | - Is a person who has an identical twin
03:51:05.600 | more or less special?
03:51:07.880 | - Well, the problem with identical twins
03:51:12.800 | is it's just two, compared with M's.
03:51:17.240 | - But two is different than one.
03:51:19.600 | - But there's a diminishing--
03:51:20.440 | - I think an identical twin's life is richer
03:51:22.740 | for having this other identical twin,
03:51:24.400 | somebody who understands them better
03:51:25.640 | than anybody else can.
03:51:26.900 | From the point of view of an identical twin,
03:51:30.040 | I think they have a richer life
03:51:31.480 | for being part of this couple,
03:51:33.720 | each of which is very similar.
03:51:34.680 | Now if you said, will the world,
03:51:37.120 | if we lose one of the identical twins,
03:51:38.880 | will the world miss it as much
03:51:40.320 | because you've got the other one
03:51:41.360 | and they're pretty similar?
03:51:42.340 | Maybe from the rest of the world's point of view,
03:51:44.000 | they suffer less of a loss
03:51:46.400 | when they lose one of the identical twins,
03:51:48.240 | but from the point of view
03:51:49.160 | of the identical twin themselves,
03:51:51.440 | their life is enriched by having a twin.
03:51:53.640 | - See, but the identical twin copying happens
03:51:57.040 | at the place of birth.
03:51:58.580 | It's different than copying
03:52:01.400 | after you've done some of the environment,
03:52:04.120 | like the nurture at the teenage
03:52:06.680 | or in the 20s after going to college.
03:52:09.000 | - Yes, that'll be an interesting thing
03:52:10.040 | for M's to find out,
03:52:11.000 | all the different ways
03:52:11.960 | that they can have different relationships
03:52:13.400 | to different people who have different degrees
03:52:15.520 | of similarity to them in time.
03:52:17.760 | - Yeah.
03:52:18.600 | Yeah, man.
03:52:21.540 | - But it seems like a rich space to explore
03:52:25.760 | and I don't feel sorry for them.
03:52:27.000 | This seems like interesting world to live in.
03:52:29.360 | - And there could be some ethical conundrums there.
03:52:32.040 | - There will be many new choices to make
03:52:33.920 | that they don't make now.
03:52:36.200 | We discussed it,
03:52:37.040 | and I discussed that in the book "Age of M."
03:52:39.560 | Say you have a lover
03:52:41.040 | and you make a copy of yourself,
03:52:42.320 | but the lover doesn't make a copy.
03:52:43.560 | Well, now, which one of you,
03:52:45.920 | or are both still related to the lover?
03:52:48.880 | - Socially entitled to show up.
03:52:53.680 | - Yes, so you'll have to make choices then
03:52:56.380 | when you split yourself.
03:52:57.400 | Which of you inherit which unique things?
03:53:00.280 | - Yeah, and of course,
03:53:03.040 | there'll be an equivalent increase in lawyers.
03:53:08.040 | Well, I guess you can clone the lawyers
03:53:10.760 | to help manage some of these negotiations
03:53:14.800 | of how to split property.
03:53:16.320 | The nature of owning, I mean,
03:53:18.120 | property is connected to individuals, right?
03:53:22.240 | - You only really need lawyers for this
03:53:23.920 | with an inefficient, awkward law
03:53:25.720 | that is not very transparent and able to do things.
03:53:28.320 | So, for example, an operating system of a computer
03:53:31.980 | is a law for that computer.
03:53:33.760 | When the operating system is simple and clean,
03:53:35.560 | you don't need to hire a lawyer
03:53:37.440 | to make a key choice with the operating system.
03:53:39.040 | - You don't need a human in the loop.
03:53:40.360 | - You just make a choice. - Qualify rules, yeah.
03:53:42.640 | - Right, so ideally, we want a legal system
03:53:44.760 | that makes the common choices easy
03:53:47.920 | and not require much overhead.
03:53:49.520 | - And that's what the digitization of things
03:53:52.520 | further enables that.
03:53:54.500 | So the loss of a younger self.
03:53:59.040 | What about the loss of your life overall?
03:54:01.480 | Do you ponder your death, your mortality?
03:54:03.960 | Are you afraid of it?
03:54:05.240 | - I am a cryonics customer.
03:54:07.160 | That's what this little tag around my neck says.
03:54:09.720 | It says that if you find me in a medical situation,
03:54:12.840 | you should call these people to enable the cryonics transfer.
03:54:17.000 | So I am taking a long-shot chance
03:54:20.400 | at living a much longer life.
03:54:22.640 | - Can you explain what cryonics is?
03:54:25.600 | - So when medical science gives up on me in this world,
03:54:30.560 | instead of burning me or letting worms eat me,
03:54:33.800 | they will freeze me, or at least freeze my head.
03:54:37.120 | And there's damage that happens
03:54:39.080 | in the process of freezing the head,
03:54:40.480 | but once it's frozen, it won't change for a very long time.
03:54:44.360 | Chemically, it'll just be completely exactly the same.
03:54:47.560 | So future technology might be able to revive me.
03:54:51.080 | And in fact, I would be mainly counting
03:54:52.920 | on the brain emulation scenario,
03:54:54.880 | which doesn't require reviving my entire biological body.
03:54:57.920 | It means I would be in a computer simulation.
03:55:00.520 | And so that's, I think I've got at least a 5% shot at that.
03:55:06.560 | And that's immortality.
03:55:08.440 | - Are you, can you still--
03:55:10.920 | - Most likely it won't happen,
03:55:12.120 | and therefore I'm sad that it won't happen.
03:55:15.040 | - Do you think immortality is something
03:55:17.960 | that you would like to have?
03:55:19.360 | - Well, I mean, just like infinity,
03:55:22.800 | I mean, you can't know until forever,
03:55:25.640 | which means never, right?
03:55:26.920 | So all you can really, the better choice is,
03:55:29.760 | at each moment, do you wanna keep going?
03:55:31.640 | So I would like at every moment
03:55:33.520 | to have the option to keep going.
03:55:35.160 | - The interesting thing about human experience
03:55:40.080 | is that the way you phrase it is exactly right.
03:55:45.080 | At every moment, I would like to keep going.
03:55:48.840 | But the thing that happens,
03:55:50.500 | you know, I'll leave them wanting more
03:55:55.560 | of whatever that phrase is.
03:55:59.360 | The thing that happens is over time,
03:56:01.200 | it's possible for certain experiences to become bland,
03:56:05.680 | and you become tired of them.
03:56:07.920 | And that actually makes life really unpleasant.
03:56:12.920 | Sorry, it makes that experience really unpleasant.
03:56:15.920 | And perhaps you can generalize that to life itself
03:56:19.160 | if you have a long enough horizon.
03:56:21.440 | And so--
03:56:22.280 | - Might happen, but might as well wait and find out.
03:56:24.760 | But then you're ending on suffering, you know?
03:56:28.240 | - So in the world of brain emulations, I have more options.
03:56:31.980 | - You can return yourself to--
03:56:34.240 | - That is, I can make copies of myself,
03:56:36.920 | archive copies at various ages.
03:56:39.200 | And at a later age, I could decide
03:56:40.940 | that I'd rather replace myself
03:56:42.520 | with a new copy from a younger age.
03:56:44.800 | - So does a brain emulation still operate
03:56:47.960 | in physical space?
03:56:48.840 | So can we do, what do you think about like the metaverse
03:56:51.440 | and operating in virtual reality?
03:56:53.520 | So we can conjure up, not just emulate,
03:56:56.000 | not just your own brain and body,
03:56:59.440 | but the entirety of the environment.
03:57:01.480 | - Most brain emulations will in fact
03:57:03.680 | spend most of their time in virtual reality.
03:57:06.120 | But they wouldn't think of it as virtual reality,
03:57:08.640 | they would just think of it as their usual reality.
03:57:11.280 | I mean, the thing to notice, I think, in our world,
03:57:13.520 | most of us spend most time indoors.
03:57:16.440 | And indoors, we are surrounded by walls covered with paint
03:57:20.240 | and floors covered with tile or rugs.
03:57:23.520 | Most of our environment is artificial.
03:57:26.520 | It's constructed to be convenient for us,
03:57:28.640 | it's not the natural world that was there before.
03:57:31.440 | A virtual reality is basically just like that.
03:57:33.960 | It is the environment that's comfortable
03:57:35.920 | and convenient for you.
03:57:37.440 | But when it's the right environment for you,
03:57:40.540 | it's real for you, just like the room you're in right now,
03:57:43.320 | most likely is very real for you.
03:57:45.120 | You're not focused on the fact that the paint
03:57:47.240 | is hiding the actual studs behind the wall
03:57:50.000 | and the actual wires and pipes and everything else.
03:57:52.960 | The fact that we're hiding that from you
03:57:54.320 | doesn't make it fake or unreal.
03:57:56.280 | - What are the chances that we're actually
03:58:02.240 | in the very kind of system that you're describing
03:58:04.540 | where the environment and the brain is being emulated
03:58:07.400 | and you're just replaying an experience
03:58:09.040 | from when you first did a podcast with Lex,
03:58:14.040 | and now, you know, the person that originally launched this
03:58:17.980 | already did hundreds of podcasts with Lex.
03:58:19.880 | This is just the first time.
03:58:21.360 | And you like this time because there's so much uncertainty.
03:58:24.680 | There's nerves, it could have gone any direction.
03:58:27.120 | - At the moment, we don't have the technical ability
03:58:30.640 | to create that emulation.
03:58:32.720 | So we'd have to be postulating that in the future,
03:58:35.560 | we have that ability, and then they choose
03:58:37.680 | to evaluate this moment now, to simulate it.
03:58:40.480 | - Don't you think we could be in the simulation
03:58:43.840 | of that exact experience right now
03:58:45.440 | and we wouldn't be able to know?
03:58:47.080 | - So one scenario would be this never really happened.
03:58:51.180 | This only happens as a reconstruction later on.
03:58:54.440 | - Yeah.
03:58:55.280 | - That's different than the scenario
03:58:56.200 | that this did happen the first time
03:58:57.800 | and now it's happening again as a reconstruction.
03:59:00.840 | That second scenario is harder to put together
03:59:03.880 | because it requires this coincidence
03:59:06.000 | where between the two times we produce the ability to do it.
03:59:09.220 | - No, but don't you think replay of memories,
03:59:13.880 | poor replay of memories is something that--
03:59:18.160 | - That might be a possible thing in the future.
03:59:19.640 | - So you're saying it's harder
03:59:20.480 | than to conjure up things from scratch?
03:59:23.760 | - It's certainly possible.
03:59:25.120 | So the main way I would think about it
03:59:26.840 | is in terms of the demand for simulation
03:59:29.680 | versus other kinds of things.
03:59:31.160 | So I've given this a lot of thought
03:59:32.720 | because I first wrote about this long ago
03:59:35.580 | when Bostrom first wrote his papers
03:59:37.400 | about the simulation argument, and I wrote about
03:59:39.320 | how to live in a simulation.
03:59:42.240 | And so the key issue is the fraction of creatures
03:59:47.240 | in the universe that are really experiencing
03:59:50.680 | what you appear to be really experiencing
03:59:52.560 | relative to the fraction that are experiencing it
03:59:54.960 | in a simulation way, i.e. simulated.
03:59:57.800 | So then the key parameter is at any one moment in time,
04:00:02.800 | creatures at that time, many of them,
04:00:06.920 | most of them are presumably really experiencing
04:00:09.240 | what they're experiencing, but some fraction of them
04:00:11.200 | are experiencing some past time
04:00:14.600 | where that past time is being remembered
04:00:17.280 | via their simulation.
04:00:18.560 | So to figure out this ratio,
04:00:22.840 | what we need to think about is basically two functions.
04:00:26.020 | One is how fast in time does the number of creatures grow?
04:00:30.360 | And then how fast in time does the interest
04:00:32.760 | in the past decline?
04:00:33.960 | Because at any one time, people will be simulating
04:00:38.000 | different periods in the past with different emphasis
04:00:40.560 | based on-- - I love the way
04:00:41.400 | you think so much.
04:00:43.000 | That's exactly right, yeah.
04:00:44.240 | - So if the first function grows slower
04:00:48.120 | than the second one declines,
04:00:50.440 | then in fact, your chances of being simulated are low.
04:00:54.540 | So the key question is how fast does interest
04:00:57.960 | in the past decline relative to the rate
04:01:00.440 | at which the population grows with time?
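To make the comparison Robin sketches here concrete, here is a minimal toy model in Python. It is not Hanson's published math; the growth rate, interest-decay rate, simulation budget, and horizon below are made-up illustrative numbers. It simply pits the two functions against each other, how fast the number of creatures grows versus how fast interest in a given past year fades, and reports what fraction of experiences of that year would be later reconstructions.

# Toy model of the ratio described above: what fraction of experiences
# "of year 0" are later reconstructions rather than the real thing?
# Assumptions (all illustrative): population grows at `growth` per year,
# interest in a fixed past year fades at `decay` per year, and each later
# era spends a small share `sim_budget` of itself simulating the past.

def simulated_fraction(growth=0.02, decay=0.10, sim_budget=0.001, horizon=5000):
    real = 1.0  # creatures really living through year 0, normalized
    simulated = sum(
        ((1 + growth) ** t)      # how many creatures exist t years later
        * sim_budget             # share of them running past-simulations
        * ((1 - decay) ** t)     # how much they still care about year 0
        for t in range(1, horizon)
    )
    return simulated / (real + simulated)

# Interest fades faster than population grows: being simulated is unlikely.
print(simulated_fraction(growth=0.02, decay=0.10))  # ~0.01
# Population grows faster than interest fades: simulations dominate.
print(simulated_fraction(growth=0.10, decay=0.02))  # ~1.0

When interest fades faster than population grows, the fraction stays small, which is the condition Robin checks against the Ngram data just below.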
04:01:02.160 | - Does this correlate to, you earlier suggested
04:01:04.360 | that the interest in the future increases over time.
04:01:07.520 | Are those correlated, interest in the future
04:01:09.640 | versus interest in the past?
04:01:11.000 | Like why are we interested in the past?
04:01:13.480 | - But the simple way to do it is, as you know,
04:01:15.200 | like Google Ngrams has a way to type in a word
04:01:18.240 | and see how interest in it declines or rises over time.
04:01:21.480 | Right? - Yeah, yeah.
04:01:22.320 | - You can just type in a year and get the answer for that.
04:01:24.840 | If you type in a particular year like 1900 or 1950,
04:01:29.160 | you can see with Google Ngram how interest in that year
04:01:32.720 | increased up until that date and decreased after it.
04:01:36.160 | And you can see that interest in a date declines faster
04:01:39.720 | than does the population grow with time.
04:01:42.500 | - That is brilliant.
04:01:45.480 | - And so-- - That is so interesting.
04:01:46.760 | - You have the answer.
04:01:47.880 | - Wow.
04:01:50.720 | And that was your argument against,
04:01:53.160 | not against, to this particular aspect of the simulation,
04:01:56.280 | how much past simulation there will be,
04:02:00.520 | replay of past memories.
04:02:02.080 | - First of all, if we assume that like simulation
04:02:04.120 | of the past is a small fraction of all the creatures
04:02:06.560 | at that moment, right? - Yes.
04:02:08.960 | - And then it's about how fast.
04:02:10.520 | Now, some people have argued plausibly
04:02:12.440 | that maybe most interest in the past
04:02:15.360 | falls with this fast function,
04:02:16.520 | but some unusual category of interest in the past
04:02:19.000 | won't fall that quickly,
04:02:20.320 | and then that eventually would dominate.
04:02:22.200 | So that's another hypothesis.
04:02:24.280 | - Some category.
04:02:25.600 | So that very outlier specific kind of, yeah, okay.
04:02:28.880 | Yeah, yeah, yeah.
04:02:29.720 | Like really popular kinds of memories,
04:02:33.120 | but like probably sexual-- - In a trillion years,
04:02:36.200 | there's some small research institute
04:02:38.560 | that tries to randomly select
04:02:40.160 | from all possible people in history or something
04:02:42.240 | to simulate.
04:02:43.440 | - Yeah, yeah, yeah.
04:02:46.760 | - So the question is how big is this research institute
04:02:48.840 | and how big is the future in a trillion years, right?
04:02:51.320 | And that would be hard to say.
04:02:52.840 | But if we just look at the ordinary process
04:02:54.880 | by which people simulate recent,
04:02:56.920 | so if you look at,
04:02:59.120 | I think it's also true for movies and plays and video games,
04:03:02.120 | overwhelmingly they're interested in the recent past.
04:03:04.920 | There's very few video games
04:03:06.200 | where you play someone in the Roman Empire.
04:03:08.080 | - Right.
04:03:09.280 | - Even fewer where you play someone
04:03:10.840 | in the ancient Egyptian Empire.
04:03:12.600 | - Yeah, just different--
04:03:15.520 | - It's just declined very quickly.
04:03:16.640 | - But every once in a while, that's brought back.
04:03:19.360 | But yeah, you're right.
04:03:21.960 | I mean, just if you look at the mass of entertainment,
04:03:25.120 | movies and games, it's focusing on the present, recent past.
04:03:29.080 | And maybe some, I mean,
04:03:30.480 | where does science fiction fit into this?
04:03:32.320 | Because it's sort of,
04:03:35.200 | what is science fiction?
04:03:39.080 | I mean, it's a mix of the past and the present
04:03:40.960 | and some kind of manipulation of that
04:03:43.080 | to make it more efficient for us
04:03:45.280 | to ask deep philosophical questions about humanity.
04:03:49.000 | - The closest genre to science fiction is clearly fantasy.
04:03:51.600 | Fantasy and science fiction in many bookstores
04:03:53.400 | and even Netflix or whatever categories,
04:03:55.240 | they're just lumped together.
04:03:56.800 | So clearly they have a similar function.
04:03:58.960 | So the function of fantasy is more transparent
04:04:01.800 | than the function of science fiction.
04:04:02.920 | So use that as your guide.
04:04:04.960 | What's fantasy for?
04:04:05.880 | It's just to take away the constraints
04:04:08.240 | of the ordinary world
04:04:09.080 | and imagine stories with much fewer constraints.
04:04:11.400 | That's what fantasy is.
04:04:12.520 | You're much less constrained.
04:04:13.800 | - What's the purpose to remove constraints?
04:04:15.600 | Is it to escape from the harshness of the constraints
04:04:18.880 | of the real world?
04:04:20.280 | Or is it to just remove constraints
04:04:22.080 | in order to explore some,
04:04:24.480 | get a deeper understanding of our world?
04:04:26.920 | What is it?
04:04:27.760 | I mean, why do people read fantasy?
04:04:28.920 | - I'm not a cheap fantasy reading kind of person.
04:04:33.920 | So I need to...
04:04:35.480 | - One story that it sounds plausible to me
04:04:38.200 | is that there are sort of these deep story structures
04:04:40.880 | that we love and we want to realize,
04:04:43.920 | and then many details of the world get in their way.
04:04:46.640 | Fantasy takes all those obstacles out of the way
04:04:48.680 | and lets you tell the essential hero story
04:04:51.320 | or the essential love story,
04:04:52.360 | whatever essential story you want to tell.
04:04:54.760 | The reality and constraints are not in the way.
04:04:57.960 | - And so science fiction can be thought of as like fantasy,
04:05:01.280 | except you're not willing to admit that it can't be true.
04:05:04.320 | So the future gives the excuse of saying,
04:05:06.200 | "Well, it could happen."
04:05:07.400 | And you accept some more reality constraints
04:05:11.040 | for the illusion, at least,
04:05:12.960 | that maybe it could really happen.
04:05:14.720 | - Maybe it could happen,
04:05:18.000 | and that it stimulates the imagination.
04:05:19.960 | The imagination is something really interesting
04:05:23.000 | about human beings,
04:05:24.840 | and it seems also to be an important part
04:05:27.000 | of creating really special things,
04:05:28.520 | is to be able to first imagine them.
04:05:30.920 | With you and Nick Bostrom,
04:05:32.600 | where do you land on the simulation
04:05:34.840 | and all the mathematical ways of thinking it
04:05:38.560 | and just the thought experiment of it?
04:05:41.240 | Are we living in a simulation?
04:05:42.920 | - That was the discussion we just had.
04:05:46.720 | That is, you should grant the possibility
04:05:49.280 | of being in a simulation.
04:05:50.120 | You shouldn't be 100% confident that you're not.
04:05:52.160 | You should certainly grant a small probability.
04:05:54.320 | The question is, how large is that probability?
04:05:56.280 | - Are you saying we would be,
04:05:57.880 | I misunderstood because I thought our discussion
04:06:01.200 | was about replaying things that have already happened.
04:06:03.480 | - Right, but the whole question is,
04:06:05.400 | right now, is that what I am?
04:06:08.200 | Am I actually a replay from some distant future?
04:06:11.960 | - But it doesn't necessarily need to be a replay.
04:06:13.760 | It could be a totally new.
04:06:15.400 | You don't have to be an NPC.
04:06:16.920 | - Right, but clearly, I'm in a certain era
04:06:19.200 | with a certain kind of world around me, right?
04:06:21.040 | So either this is a complete fantasy
04:06:22.600 | or it's a past of somebody else in the future.
04:06:25.720 | - But no, it could be a complete fantasy, though.
04:06:28.200 | - It could be, right, but then you have to talk about
04:06:30.440 | what's the fraction of complete fantasies, right?
04:06:33.840 | - I would say it's easier to generate a fantasy
04:06:35.840 | than to replay a memory, right?
04:06:37.920 | - Sure, but if we just look at the entire history of everything,
04:06:41.760 | we should say, sure, but most things are real.
04:06:43.880 | Most things aren't fantasies, right?
04:06:45.360 | Therefore, the chance that my thing is real, right?
04:06:47.200 | So the simulation argument works stronger
04:06:49.680 | about sort of the past.
04:06:50.600 | We say, ah, but there's more future people
04:06:52.560 | than there are today.
04:06:53.840 | So you being in the past of the future
04:06:56.080 | makes you special relative to them,
04:06:57.720 | which makes you more likely to be in a simulation, right?
04:07:00.240 | If we're just taking the full count and saying,
04:07:02.720 | in all creatures ever, what percentage are in simulations?
04:07:05.120 | Probably no more than 10%.
04:07:07.600 | - So what's a good argument for that?
04:07:10.000 | That most things are real?
04:07:11.680 | - Yeah.
04:07:12.520 | - 'Cause as Bostrom says the other way, right?
04:07:15.000 | - In a competitive world, in a world where people
04:07:18.160 | like have to work and have to get things done,
04:07:21.040 | then they have a limited budget for leisure.
04:07:24.800 | And so, you know, leisure things are less common
04:07:28.240 | than work things, like real things, right?
04:07:31.080 | That's just--
04:07:32.880 | - But if you look at the stretch of history in the universe,
04:07:37.880 | doesn't the ratio of leisure increase?
04:07:41.980 | Isn't that where we, isn't that the--
04:07:45.480 | - Right, but now we're looking at the fraction of leisure,
04:07:47.480 | which takes the form of something
04:07:49.160 | where the person doing the leisure doesn't realize it.
04:07:52.120 | Now there could be some fraction of it,
04:07:53.320 | but that's much smaller, right?
04:07:55.240 | - Yeah.
04:07:56.080 | Clueless foragers.
04:07:59.040 | - Or somebody is clueless in the process
04:08:00.860 | of supporting this leisure, right?
04:08:02.800 | It might not be the person leisuring,
04:08:04.120 | somebody, they're a supporting character or something,
04:08:05.760 | but still, that's gotta be a pretty small
04:08:07.120 | fraction of leisure.
04:08:08.120 | - What, you mentioned that children are one of the things
04:08:11.900 | that are a source of meaning, broadly speaking.
04:08:14.960 | And let me ask the big question,
04:08:16.440 | what's the meaning of this whole thing?
04:08:19.160 | - The-- - Robin, meaning of life.
04:08:21.280 | What is the meaning of life?
04:08:23.200 | We talked about alien civilizations,
04:08:26.260 | but this is the one we got.
04:08:27.680 | Where are the aliens?
04:08:28.920 | Here are the humans.
04:08:30.960 | We seem to be conscious, able to introspect.
04:08:35.160 | Why are we here?
04:08:37.400 | - This is the thing I told you before
04:08:39.220 | about how we can predict that future creatures
04:08:41.560 | will be different from us.
04:08:42.860 | We, our preferences are this amalgam
04:08:47.400 | of various sorts of random sort of patched together
04:08:51.280 | preferences about thirst and sex and sleep
04:08:55.200 | and attention and all these sorts of things.
04:08:57.420 | So we don't understand that very well.
04:08:59.760 | It's not very transparent and it's a mess, right?
04:09:03.300 | That is the source of our motivation.
04:09:06.600 | That is how we were made and how we are induced to do things.
04:09:10.280 | But we can't summarize it very well
04:09:12.400 | and we don't even understand it very well.
04:09:14.440 | That's who we are.
04:09:15.740 | And often we find ourselves in a situation
04:09:17.480 | where we don't feel very motivated, we don't know why.
04:09:19.560 | In other situations, we find ourselves very motivated
04:09:21.740 | and we don't know why either.
04:09:23.200 | And so that's the nature of being a human
04:09:27.880 | of the sort that we are,
04:09:28.960 | because even though we can think abstractly
04:09:31.400 | and reason abstractly, this package of motivations
04:09:33.760 | is just opaque and a mess.
04:09:35.960 | And that's what it means to be a human today
04:09:38.500 | and the motivation.
04:09:39.440 | We can't very well tell the meaning of our life.
04:09:42.280 | It is this mess.
04:09:43.600 | But our descendants will be different.
04:09:44.920 | They will actually know exactly what they want
04:09:48.300 | and it will be to have more descendants.
04:09:51.080 | That will be the meaning for them.
04:09:52.760 | - Well, it's funny that you have the certainty.
04:09:54.640 | You have more certainty, you have more transparency
04:09:57.400 | about our descendants than you do about your own self.
04:10:01.960 | - Right.
04:10:02.800 | - So it's really interesting to think,
04:10:06.000 | 'cause you mentioned this about love,
04:10:08.040 | that something that's fundamental about love
04:10:12.200 | is this opaqueness, that we're not able
04:10:13.820 | to really introspect what the heck it is.
04:10:16.460 | Or all the feelings, the complex feelings involved.
04:10:19.720 | - And that's true about many of our motivations.
04:10:21.520 | - And that's what it means to be human
04:10:23.860 | of the 20th and the 21st century variety.
04:10:28.860 | Why is that not a feature that we want,
04:10:33.640 | will choose to persist in civilization then?
04:10:37.040 | This opaqueness, put another way, mystery.
04:10:40.400 | Maintaining a sense of mystery about ourselves
04:10:42.440 | and about those around us.
04:10:44.480 | Maybe that's a really nice thing to have, maybe.
04:10:47.040 | - So, I mean, this is the fundamental issue
04:10:50.140 | in analyzing the future, what will set the future?
04:10:53.780 | One theory about what will set the future
04:10:55.440 | is what do we want the future to be?
04:10:58.560 | So under that theory, we should sit and talk
04:11:00.320 | about what we want the future to be,
04:11:01.480 | have some conferences, have some conventions,
04:11:04.100 | discuss things, vote on it maybe,
04:11:06.000 | and then hand it off to the implementation people
04:11:08.200 | to make the future the way we've decided it should be.
04:11:12.400 | That's not the actual process that's changed the world
04:11:15.080 | over history up to this point.
04:11:16.800 | It has not been the result of us deciding
04:11:19.280 | what we want and making it happen.
04:11:21.440 | In our individual lives, we can do that.
04:11:23.400 | We might decide what career we want
04:11:24.880 | or where we want to live, who we want to live with.
04:11:26.900 | In our individual lives, we often do slowly
04:11:29.640 | make our lives better according to our plan and our things,
04:11:32.040 | but that's not the whole world.
04:11:33.600 | The whole world so far has mostly been a competitive world
04:11:37.280 | where things happen if anybody anywhere chooses
04:11:40.300 | to adopt them and they have an advantage.
04:11:42.280 | And then it spreads and other people are forced
04:11:43.920 | to adopt it by competitive pressures.
04:11:46.360 | So that's the kind of analysis I can use
04:11:48.360 | to predict the future, and I do use that
04:11:50.260 | to predict the future.
04:11:51.100 | It doesn't tell us it'll be a future we like.
04:11:52.640 | It just tells us what it'll be.
04:11:54.840 | - And it'll be one where we're trying to maximize
04:11:56.760 | the number of our descendants.
04:11:57.920 | - And we know that abstractly and directly,
04:12:00.600 | and it's not opaque.
04:12:01.680 | - With some probability that's non-zero
04:12:04.380 | that will lead us to become grabby
04:12:06.400 | in expanding aggressively out into the cosmos
04:12:12.080 | until we meet other aliens.
04:12:13.800 | - The timing isn't clear.
04:12:14.840 | We might become grabby and then this happens.
04:12:17.160 | Grabbiness and this are both
04:12:19.480 | the results of competition, but it's less clear
04:12:21.680 | which happens first.
04:12:22.680 | - Does this future excite you, scare you?
04:12:26.840 | How do you feel about this whole thing?
04:12:27.840 | - Well, again, I told you compared to sort of
04:12:30.000 | a dead cosmology, at least it's energizing
04:12:32.720 | and having a living story with real actors
04:12:34.920 | and characters and agendas, right?
04:12:36.840 | - Yeah, and that's one hell of a fun universe to live in.
04:12:40.880 | - Robin, you're one of the most fascinating,
04:12:42.960 | fun people to talk to, brilliant, humble,
04:12:47.200 | systematic in your analysis.
04:12:48.920 | - Hold on to my wallet here.
04:12:50.000 | What's he looking for?
04:12:50.840 | - I already stole your wallet long ago.
04:12:53.040 | I really, really appreciate you spending
04:12:54.560 | your valuable time with me.
04:12:55.560 | I hope we get a chance to talk many more times
04:12:58.880 | in the future.
04:12:59.720 | Thank you so much for sitting down.
04:13:01.440 | - Thank you.
04:13:02.260 | - Thanks for listening to this conversation
04:13:04.800 | with Robin Hanson.
04:13:06.080 | To support this podcast, please check out our sponsors
04:13:08.640 | in the description.
04:13:10.040 | And now let me leave you with some words
04:13:12.040 | from Ray Bradbury.
04:13:13.840 | We are an impossibility in an impossible universe.
04:13:18.840 | Thank you for listening and hope to see you next time.
04:13:21.600 | (upbeat music)