
Tim Urban: Elon Musk, Neuralink, AI, Aliens, and the Future of Humanity | Lex Fridman Podcast #264


Chapters

0:00 Introduction
0:38 The big and the small
8:28 Aliens
16:42 The pencil problem
23:27 Food abundance
25:31 Extinction of human civilization
30:49 Future politics of Mars
37:49 SpaceX
43:49 Elon Musk
69:17 Nuclear power
73:43 The higher mind
78:27 Echo chambers and idea labs
81:39 How our brain processes film and music
84:53 Neuralink
93:07 Future of physical interactions
97:18 AI
104:38 Free speech
108:41 How to read more
115:23 Spaced repetition
119:26 Procrastination
146:18 Goals for the future
151:36 Meaning of life

Whisper Transcript

00:00:00.000 | If you read a half hour a night,
00:00:01.640 | the calculation I came to is that you can read
00:00:03.520 | a thousand books in 50 years.
00:00:05.200 | - All of the components are there
00:00:07.160 | to engineer intimate experiences.
00:00:09.480 | - Extraterrestrial life is a true mystery,
00:00:11.540 | the most tantalizing mystery of all.
00:00:13.360 | - How many humans need to disappear
00:00:15.880 | for us to be completely lost?
00:00:17.660 | The following is a conversation with Tim Urban,
00:00:22.320 | author and illustrator of the amazing blog
00:00:25.480 | called "Wait But Why."
00:00:28.040 | This is the Lex Fridman Podcast.
00:00:30.080 | To support it, please check out our sponsors
00:00:32.200 | in the description.
00:00:33.320 | And now, dear friends, here's Tim Urban.
00:00:37.260 | You wrote a "Wait But Why" blog post
00:00:41.160 | about the big and the small,
00:00:42.720 | from the observable universe to the atom.
00:00:45.640 | What world do you find most mysterious or beautiful,
00:00:48.480 | the very big or the very small?
00:00:50.660 | - The very small seems a lot more mysterious.
00:00:54.080 | And the very big, I feel like we kind of understand.
00:00:56.600 | I mean, not the very, very big, not the multiverse,
00:01:00.120 | if there is a multiverse,
00:01:01.760 | not anything outside of the observable universe.
00:01:04.160 | But the very small, I think we really have no idea
00:01:08.180 | what's going on, or very, you know, much less idea.
00:01:11.280 | But I find that, so I think the small is more mysterious,
00:01:13.680 | but I think the big is sexier.
00:01:16.240 | I just cannot get enough of the bigness of space
00:01:22.360 | and the farness of stars.
00:01:26.320 | And it just continually blows my mind.
00:01:29.240 | - I mean, we still, the vastness of the observable universe
00:01:33.480 | has the mystery that we don't know what's out there.
00:01:35.820 | We know how it works, perhaps.
00:01:38.260 | Like, general relativity can tell us
00:01:40.920 | how the movement of bodies works,
00:01:42.800 | how they're born, all that kind of things.
00:01:45.400 | But like, how many civilizations are out there?
00:01:49.280 | How many, like, what are the weird things that are out there?
00:01:51.480 | - Oh yeah, life, well, extraterrestrial life
00:01:53.640 | is a true mystery,
00:01:54.720 | the most tantalizing mystery of all.
00:01:56.720 | But that's like our size.
00:02:00.080 | So that's maybe it's that the actual,
00:02:02.660 | the big and the small are really cool,
00:02:04.080 | but it's actually the things that are potentially our size
00:02:06.440 | that are the most tantalizing.
00:02:07.760 | - Potentially our size is probably the key word.
00:02:10.120 | - Yeah, I mean, I wonder how small intelligent life
00:02:12.560 | could get, probably not that small.
00:02:15.400 | And I assume that there's a limit that you're not gonna,
00:02:18.360 | I mean, you might have like a whale,
00:02:19.920 | blue whale-sized intelligent being,
00:02:21.840 | that would be kind of cool.
00:02:23.320 | But I feel like we're in the range
00:02:26.080 | of order of magnitude smaller and bigger than us for life.
00:02:28.720 | But maybe not, maybe you could have some giant life form.
00:02:32.200 | Just seems like, I don't know,
00:02:34.000 | there's gotta be some reason that anything intelligent
00:02:36.640 | is between kind of like a little tiny rodent or finger monkey
00:02:40.680 | and a blue whale on this planet.
00:02:43.640 | I don't know, maybe when you change the gravity
00:02:46.440 | and other things.
00:02:47.720 | - Well, you could think of life
00:02:49.120 | as a thing of self-assembling organisms
00:02:52.520 | and they just get bigger and bigger and bigger.
00:02:54.000 | Like there's no such thing as a human being.
00:02:55.920 | A human being is made up of a bunch of tiny organisms
00:02:58.880 | that are working together.
00:03:00.120 | And we somehow envision that as one entity
00:03:03.040 | 'cause it has consciousness.
00:03:04.560 | But maybe it's just organisms on top of organisms.
00:03:08.080 | Organisms all the way down, turtles all the way down.
00:03:10.600 | So like Earth can be seen as an organism
00:03:13.320 | for people, for alien species that's very different.
00:03:17.400 | Like why is the human the fundamental entity
00:03:21.520 | that is living and then everything else
00:03:25.000 | is just either a collection of humans
00:03:26.760 | or components of humans?
00:03:28.600 | - I think of it kind of as if you think about,
00:03:30.400 | I think of like an emergence elevator.
00:03:32.680 | And so you've got an ant is on one floor
00:03:36.800 | and then the ant colony is a floor above.
00:03:39.480 | Or maybe there's even units within the colony
00:03:41.320 | that's one floor above and the full colony
00:03:42.880 | is two floors above.
00:03:44.680 | And to me, I think that it's the colony
00:03:47.000 | that is closest to being the animal.
00:03:49.440 | It's like the individual thing that competes with others
00:03:53.920 | while the individual ants are like cells
00:03:57.680 | in the animal's body.
00:03:59.080 | We are more like a colony in that regard.
00:04:01.360 | But the humans are weird because we kind of,
00:04:04.680 | I think of it if emergence happens in an emergence tower,
00:04:08.280 | where you've got kind of, as I said, cells
00:04:10.000 | and then humans and communities and societies.
00:04:13.360 | Ants are very specific.
00:04:15.080 | The individual ants are always cooperating
00:04:16.960 | with each other for the sake of the colony.
00:04:19.280 | So the colony is this unit that is the competitive unit.
00:04:22.760 | Humans can kind of go, we take the elevator up and down
00:04:25.880 | emergence tower psychologically.
00:04:27.280 | Sometimes we are individuals that are competing
00:04:30.960 | with other individuals and that's where our mindset is.
00:04:33.160 | And then other times we get in this crazy zone,
00:04:35.920 | a protest or a sporting event,
00:04:37.960 | and you're just chanting and screaming
00:04:40.160 | and doing the same hand motions with all these other people
00:04:42.720 | and you feel like one.
00:04:43.840 | You feel like one and you'd sacrifice yourself.
00:04:45.760 | And now that's with soldiers.
00:04:46.960 | And so our brains can kind of psychologically
00:04:50.480 | go up and down this elevator in an interesting way.
00:04:53.600 | - Yeah, I wonder how much of that is just the narrative
00:04:56.120 | we tell ourselves.
00:04:57.080 | Maybe we are just like an ant colony.
00:04:59.040 | We're just collaborating always.
00:05:00.560 | Even in our stories of individualism,
00:05:04.160 | of like the freedom of the individual,
00:05:05.840 | like this kind of isolation,
00:05:09.040 | lone man on an island kind of thing.
00:05:11.400 | We're actually all part of this giant network.
00:05:14.160 | Maybe one of the things that makes humans who we are
00:05:17.560 | is probably that we're deeply social.
00:05:21.120 | The ability to maintain not just a single human intelligence
00:05:24.480 | but like a collective intelligence.
00:05:26.560 | And so this feeling of being an individual is just
00:05:29.840 | 'cause we woke up at this level of the hierarchy.
00:05:33.080 | So we make it special,
00:05:34.600 | but we very well could be just part of the ant colony.
00:05:39.600 | This whole conversation,
00:05:40.840 | I'm either going to be doing a Shakespearean analysis
00:05:44.000 | of your Twitter, your writing,
00:05:46.480 | or very specific statements that you've made.
00:05:49.560 | So you've written answers to a mailbag of questions.
00:05:54.560 | The questions are amazing, the ones you've chosen,
00:05:57.080 | and your answers are amazing.
00:05:58.640 | So on this topic of the big and the small,
00:06:00.200 | somebody asked, "Are we bigger than we are small
00:06:03.400 | "or smaller than we are big?"
00:06:05.000 | Who's asking these questions?
00:06:06.400 | This is really good.
00:06:08.000 | You have amazing fans.
00:06:09.120 | Okay, so where do we sit
00:06:13.600 | at this level of the very small to the very big?
00:06:18.480 | - So are we bigger or are we small?
00:06:20.040 | Are we bigger than we are small?
00:06:21.560 | I think it depends on what we're asking here.
00:06:23.840 | So if we're talking about the biggest thing
00:06:26.800 | that we kind of can talk about without just imagining
00:06:31.800 | is the observable universe, the Hubble sphere.
00:06:35.660 | And that's about 10 to the 26th meters in diameter.
00:06:40.660 | The smallest thing we talk about is a Planck length.
00:06:44.460 | But you could argue that that's kind of an imaginary thing.
00:06:47.220 | But that's 10 to the negative 35.
00:06:49.460 | Now we're about, conveniently, about 10 to the one.
00:06:52.540 | Not quite, 10 to the zero.
00:06:53.820 | We're about 10 to the zero meters long.
00:06:56.920 | So it's easy because you can just look and say,
00:06:59.100 | okay, well, for example,
00:07:01.300 | atoms are like 10 to the negative 15th
00:07:04.700 | or 10 to the negative 16th meters across, right?
00:07:08.540 | If you go up 10 to the 15th or 10 to the 16th,
00:07:10.580 | so that an atom is to us
00:07:11.980 | as we are to this thing,
00:07:13.860 | you get to, like, nebulas.
00:07:15.420 | Smaller than a galaxy and bigger than the biggest star.
00:07:18.500 | So we're right in between nebula and an atom.
00:07:22.740 | Now, if you want to go down to quark level,
00:07:24.100 | you might be able to get up to galaxy level.
00:07:26.260 | When you go up to the observable universe,
00:07:29.260 | you're getting down on the small side
00:07:31.460 | to things that we, I think,
00:07:33.100 | are mostly theoretically imagining are there
00:07:36.580 | and hypothesizing are there.
00:07:39.760 | So I think as far as real world objects
00:07:43.380 | that we really know a lot about,
00:07:45.280 | I would say we are smaller than we are big.
00:07:48.820 | But if you want to go down to the Planck length,
00:07:50.340 | we're very quickly, we're bigger than we are small.
00:07:52.380 | If you think about strings.
00:07:54.060 | - Yeah, strings, exactly, string theory and so on.
00:07:57.580 | That's interesting.
00:07:58.580 | But I think like you answered,
00:08:00.140 | no matter what, we're kind of middle-ish.
00:08:02.020 | - Yeah, I mean, here's something cool.
00:08:04.340 | If a human is a neutrino, and again,
00:08:06.380 | neutrino, the size doesn't really make sense.
00:08:08.580 | It's not really a size.
00:08:09.640 | But when we talk about some of these neutrinos,
00:08:11.660 | I mean, if a neutrino is a human, a proton is the sun.
00:08:15.500 | So that's like, I mean, a proton's real small,
00:08:18.660 | like really small.
00:08:19.740 | And so, yeah, the small gets like crazy small very quickly.
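
To make the orders-of-magnitude talk above concrete, here is a minimal Python sketch laying the scales out on a log-10 line. The numbers are standard ballpark values I'm supplying, not figures taken from the conversation. One note: the ~10^-15 m size quoted above is really the scale of an atomic nucleus or proton, while a whole atom is closer to 10^-10 m across, which is why the "atom is to us as we are to a nebula" comparison works best with nuclei.

```python
import math

# Rough length scales in meters (standard ballpark values, supplied here for illustration).
scales = {
    "Planck length":           1.6e-35,
    "proton / atomic nucleus": 1.0e-15,
    "atom":                    1.0e-10,
    "human":                   1.7e0,
    "nebula (~1 light-year)":  9.5e15,
    "observable universe":     8.8e26,
}

human = math.log10(scales["human"])
for name, meters in scales.items():
    offset = math.log10(meters) - human
    print(f"{name:>24}: ~10^{math.log10(meters):.0f} m ({offset:+.0f} orders of magnitude from us)")

# "Are we bigger than we are small?" -- compare the gap below us with the gap above us.
down = human - math.log10(scales["Planck length"])       # ~35 orders of magnitude down
up = math.log10(scales["observable universe"]) - human   # ~27 orders of magnitude up
print(f"down to the Planck length: ~{down:.0f} orders; up to the observable universe: ~{up:.0f} orders")
```
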
00:08:25.660 | - Let's talk about aliens.
00:08:30.420 | We already mentioned it.
00:08:31.660 | Let's start just by with the basic,
00:08:34.220 | what's your intuition as of today?
00:08:36.020 | This is a thing that could change day by day.
00:08:38.380 | But how many alien civilizations out there?
00:08:40.900 | Is it zero?
00:08:42.340 | Is it a handful?
00:08:45.180 | Is it almost endless?
00:08:46.540 | Like the observable universe,
00:08:50.260 | or the universe is teeming with life?
00:08:52.220 | - If I had a gun to my head, I'd have to take a guess.
00:08:56.340 | I would say it's teeming with life.
00:08:57.940 | I would say there is.
00:08:59.180 | I think running a Monte Carlo simulation,
00:09:02.340 | this paper by Anders Sandberg and Drexler
00:09:05.580 | and a few others a couple years ago,
00:09:07.300 | I think you probably know about it.
00:09:08.800 | I think the mean,
00:09:15.100 | you know,
00:09:17.780 | running through a randomized Drake equation multiplication,
00:09:21.780 | you ended up with 27 million as the mean
00:09:25.060 | of intelligent civilizations in the galaxy,
00:09:27.660 | in the Milky Way alone.
00:09:29.060 | And so then if you go outside the Milky Way,
00:09:31.860 | that would turn into trillions.
00:09:33.420 | That's the mean.
00:09:35.500 | Now what's interesting is that there's a long tail
00:09:38.220 | because of how uncertain they believe some of these multipliers
00:09:41.220 | in the Drake equation are.
00:09:42.060 | So for example, the probability that life starts
00:09:46.860 | in the first place,
00:09:48.180 | they think that the kind of range that we usually use
00:09:52.700 | for that variable is way too small.
00:09:55.460 | And that's constraining our possibilities.
00:09:58.660 | And if you actually extend it to, you know,
00:10:00.700 | some crazy number of orders of magnitude,
00:10:03.340 | like 200, they think that that variable should be,
00:10:06.480 | you get this long tail where,
00:10:09.620 | I forget the exact number,
00:10:10.700 | but it's like a third or a quarter
00:10:12.540 | of the total outcomes have us alone.
00:10:15.740 | Like, you know, I think it's like,
00:10:17.140 | I think it's a sizable percentage has us
00:10:19.900 | as the only intelligent life in the galaxy,
00:10:23.100 | but you can keep going.
00:10:23.940 | And I think there's like, you know,
00:10:25.580 | a non-zero, like, legitimate amount of outcomes there
00:10:29.060 | where the only life in the observable universe
00:10:32.460 | at all is on Earth.
00:10:33.500 | I mean, it seems incredibly counterintuitive.
00:10:36.260 | It seems like, you know,
00:10:37.100 | when you mention that, people think, you know,
00:10:39.340 | you must be an idiot because, you know,
00:10:41.580 | if you picked up one grain of sand on a beach
00:10:43.260 | and examined it and you found all these little things on it,
00:10:45.980 | it's like saying, well,
00:10:46.820 | maybe this is the only one that has that.
00:10:48.580 | And it's like, probably not.
00:10:50.020 | They're probably, most of the sand probably,
00:10:51.860 | or a lot of the sand, right?
00:10:53.420 | So, and then the other hand, we don't see anything.
00:10:55.700 | We don't see any evidence, you know,
00:10:57.020 | which of course people would say that the people
00:10:59.260 | who scoff at the concept that we're potentially alone,
00:11:03.000 | they say, well, of course,
00:11:05.860 | there's lots of reasons we wouldn't have seen anything
00:11:07.820 | and they can go list them and they're very compelling,
00:11:11.220 | but we don't know.
00:11:13.060 | And the truth is if this were a freak thing,
00:11:15.340 | I mean, we don't,
00:11:16.180 | if this were a completely freak thing that happened here,
00:11:18.900 | whether it's life at all
00:11:20.060 | or just getting to this level intelligence,
00:11:23.060 | that species, whoever it was,
00:11:25.940 | would think there must be lots of us out there
00:11:28.100 | and they'd be wrong.
00:11:29.380 | So, just being, again,
00:11:30.900 | using the same intuition that most people would use,
00:11:32.820 | I'd say there's probably lots of other things out there.
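
The paper being described here is, I believe, "Dissolving the Fermi Paradox" by Sandberg, Drexler, and Ord. Its core move is to push uncertainty through the Drake equation by Monte Carlo instead of multiplying point estimates. Below is a minimal, hypothetical sketch of that idea; the ranges and distributions are placeholders I chose for illustration, not the paper's actual priors, but they show how the mean number of civilizations can come out enormous while a sizable share of samples still leaves us alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # Monte Carlo samples

def log_uniform(lo, hi, size):
    """Sample uniformly in log10-space between lo and hi."""
    return 10.0 ** rng.uniform(np.log10(lo), np.log10(hi), size)

# Drake-equation factors, each drawn from a wide illustrative range (placeholders, not the paper's priors).
R_star = log_uniform(1, 100, n)      # star formation rate, stars/year
f_p    = log_uniform(0.1, 1, n)      # fraction of stars with planets
n_e    = log_uniform(0.1, 10, n)     # habitable planets per planet-bearing star
f_l    = log_uniform(1e-30, 1, n)    # probability life starts at all -- the hugely uncertain one
f_i    = log_uniform(1e-3, 1, n)     # fraction of biospheres that develop intelligence
f_c    = log_uniform(1e-2, 1, n)     # fraction of intelligent species that become detectable
L      = log_uniform(1e2, 1e8, n)    # years a detectable civilization lasts

N = R_star * f_p * n_e * f_l * f_i * f_c * L   # civilizations in the galaxy, per sample

print(f"mean of N:   {N.mean():,.0f}")             # the mean can be huge...
print(f"median of N: {np.median(N):.2g}")          # ...while the typical sample is tiny
print(f"P(alone in the galaxy, N < 1): {(N < 1).mean():.0%}")
```
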
00:11:35.480 | - Yeah, and you wrote a great blog post about it,
00:11:37.660 | but to me, the two interesting reasons
00:11:41.580 | that we haven't been in contact,
00:11:43.380 | I too have an intuition that the universe
00:11:46.180 | is teeming with life.
00:11:47.580 | So one interesting is around the great filter.
00:11:50.700 | So we either, the great filter's either behind us
00:11:54.100 | or in front of us.
00:11:55.220 | So the reason that's interesting
00:11:57.460 | is you get to think about what kind of things
00:12:00.260 | ensure the survival of an intelligent civilization
00:12:05.260 | or lead to the destruction of intelligent civilization.
00:12:08.480 | That's a very pragmatic, very important question
00:12:10.540 | to always be asking.
00:12:11.860 | And we'll talk about some of those.
00:12:13.380 | And then the other one is I'm saddened by the possibility
00:12:18.380 | that there could be aliens communicating with us
00:12:20.620 | all the time.
00:12:21.660 | In fact, they may have visited.
00:12:23.700 | And we're just too dumb to hear it, to see it.
00:12:29.620 | Like the idea that the kind of life that can evolve
00:12:34.540 | is just the range of life that can evolve is so large
00:12:38.140 | that our narrow view of what is life
00:12:40.660 | and what is intelligent life
00:12:42.860 | is preventing us from having communication with them.
00:12:45.780 | - But then they don't seem very smart
00:12:47.740 | because if they were trying to communicate with us,
00:12:50.380 | they would surely, if they were super intelligent,
00:12:53.420 | they would be very, I'm sure if there's lots of life,
00:12:56.420 | we're not that rare, we're not some crazy weird species
00:12:59.340 | that hears and has different kinds of ways
00:13:01.940 | of perceiving signals.
00:13:04.660 | So they would probably be able to,
00:13:06.980 | if you really wanted to communicate
00:13:08.140 | with an Earth-like species, with a human-like species,
00:13:11.220 | you would send out all kinds of things.
00:13:14.120 | You'd send out radio waves and you send out gravity waves.
00:13:19.340 | And lots of things.
00:13:20.700 | So if they're communicating in a way,
00:13:22.160 | they're trying to communicate with us
00:13:23.900 | and it's just we're too dumb to perceive the signals,
00:13:25.980 | it's like, well, they're not doing a great job
00:13:28.380 | of considering the primitive species we might be.
00:13:32.820 | So I don't know.
00:13:33.660 | I think if a super intelligent species
00:13:35.860 | wanted to get in touch with us and had the capability of,
00:13:40.860 | I think probably they would.
00:13:43.580 | - Well, they may be getting in touch with us.
00:13:48.640 | They're just getting in touch with the thing
00:13:50.100 | that we humans are not understanding
00:13:52.260 | that they're getting in touch with us with.
00:13:53.780 | I guess that's what I was trying to say is
00:13:55.980 | there could be something about Earth
00:13:57.540 | that's much more special than us humans.
00:14:00.240 | Like the nature of the intelligence that's on Earth
00:14:03.500 | or the thing that's of value and that's curious
00:14:07.580 | and that's complicated and fascinating and beautiful
00:14:10.620 | might be something that's not just like tweets, okay?
00:14:15.260 | Like English language that's interpretable
00:14:17.180 | or any kind of language or any kind of signal,
00:14:19.700 | whether it's gravity or radio signal
00:14:22.340 | that humans seem to appreciate.
00:14:24.520 | Why not the actual, it could be the process
00:14:28.500 | of evolution itself.
00:14:29.540 | There could be something about the way
00:14:31.100 | that Earth is breathing essentially
00:14:33.300 | through the creation of life
00:14:34.660 | and this complex growth of life.
00:14:37.380 | It's a whole different way to view organisms
00:14:41.740 | and view life that could be getting communicated with
00:14:44.940 | and we humans are just a tiny fingertip
00:14:47.140 | on top of that intelligence
00:14:48.980 | and the communication is happening
00:14:50.540 | with the main mothership of Earth
00:14:54.380 | versus us humans that seem to treat ourselves
00:14:56.860 | as super important and we're missing the big picture.
00:15:00.500 | I mean, it sounds crazy,
00:15:02.100 | but our understanding of what is intelligence,
00:15:04.660 | of what is life, what is consciousness is very limited
00:15:08.020 | and it seems to be, and just being very suspicious,
00:15:11.820 | it seems to be awfully human-centric.
00:15:14.380 | Like this story, it seems like the progress of science
00:15:17.860 | is constantly putting humans down on the importance,
00:15:22.860 | like on the cosmic importance,
00:15:28.220 | the ranking of how big we are, how important we are.
00:15:32.060 | That seems to be, the more we discover,
00:15:33.940 | that's what's happening and I think science is very young.
00:15:37.300 | And so I think eventually we might figure out
00:15:39.820 | that there's something much, much bigger going on,
00:15:41.880 | that humans are just a curious little side effect
00:15:45.060 | of the much bigger thing.
00:15:46.420 | That's what, I mean, as I'm saying, it just sounds insane.
00:15:50.620 | - Well, it sounds a little like religious.
00:15:54.220 | It sounds like a spiritual.
00:15:55.660 | It gets to that realm where there's something
00:15:59.940 | that more than meets the eye.
00:16:01.960 | - Well, yeah, but not, so religious and spiritual
00:16:06.960 | often have this kind of woo-woo characteristic,
00:16:11.060 | like when people write books about them
00:16:12.900 | and then go to wars over whatever the heck
00:16:14.580 | is written in those books.
00:16:16.140 | I mean more like it's possible that collective intelligence
00:16:19.820 | is more important than individual intelligence, right?
00:16:22.300 | It's the ant colony, what's the primal organism?
00:16:24.780 | Is it the ant colony or is it the ant?
00:16:26.900 | - Yeah, I mean, humans, just like any individual ant
00:16:29.580 | can't do shit, but the colony can do,
00:16:32.060 | make these incredible structures and has this intelligence.
00:16:35.380 | And we're exactly the same.
00:16:36.780 | I mean, you know the famous thing that no human
00:16:40.860 | knows how to make a pencil.
00:16:42.460 | You heard this?
00:16:44.660 | - No. - Basically, I mean.
00:16:45.780 | - This is great.
00:16:47.380 | - A single human out there
00:16:49.660 | has absolutely no idea how to make a pencil.
00:16:51.660 | So you have to think about, you have to get the wood,
00:16:54.020 | the paint, the different chemicals
00:16:56.160 | that make up the yellow paint.
00:16:57.660 | The eraser is a whole other thing.
00:16:59.460 | The metal has to be mined from somewhere
00:17:01.420 | and then the graphite, whatever that is.
00:17:04.660 | And there's not one person on earth
00:17:06.140 | who knows how to kind of collect all those materials
00:17:10.060 | and create a pencil.
00:17:11.180 | But together, that's child's play.
00:17:14.140 | It's just one of the easiest things.
00:17:15.300 | So, you know, the other thing I like to think about,
00:17:18.540 | I actually put this as a question on the blog once.
00:17:21.100 | There's a thought experiment
00:17:23.740 | and I actually wanna hear what you think.
00:17:25.980 | So if a witch, kind of a dickish witch comes around
00:17:30.780 | and she says, "I'm gonna cast a spell on all of humanity
00:17:34.980 | "and all material things that you've invented
00:17:39.300 | "are gonna disappear all at once."
00:17:41.380 | So suddenly we're all standing there naked.
00:17:42.960 | There's no buildings, there's no cars and boats and ships
00:17:47.400 | and no mines, nothing, right?
00:17:49.900 | It's just the stone age earth and a bunch of naked humans,
00:17:52.960 | but we're all the same, we have the same brain.
00:17:54.420 | So we all know what's going on.
00:17:55.900 | And we all got a note from her.
00:17:57.100 | So we understand the deal.
00:17:57.940 | And she says, she communicated to every human,
00:18:00.460 | "Here's the deal.
00:18:01.540 | "You lost all your stuff.
00:18:02.900 | "You guys need to make one working iPhone 13.
00:18:07.240 | "And you make one working iPhone 13
00:18:09.420 | "that could pass in the Apple store today,
00:18:12.060 | "in your previous world for an iPhone 13,
00:18:14.800 | "then I will restore everything."
00:18:16.400 | How long do you think?
00:18:17.340 | And so everyone knows, this is the mission.
00:18:19.060 | We're all aware of the mission, all humans.
00:18:21.460 | How long would it take us?
00:18:23.780 | - That's a really interesting question.
00:18:25.220 | So obviously if you do a random selection
00:18:28.000 | of 100 or 1,000 humans within the population,
00:18:31.260 | I think you're screwed to make that iPhone.
00:18:34.740 | I tend to believe that there's fascinating specialization
00:18:39.520 | among the human civilization.
00:18:40.840 | Like there's a few hackers out there
00:18:43.160 | that can like solo build an iPhone.
00:18:46.480 | - But with what materials?
00:18:48.520 | - So no materials whatsoever.
00:18:50.340 | It has to, I mean, it's virtually, I mean, okay.
00:18:54.920 | You have to build factories.
00:18:56.560 | I mean, to fabricate.
00:18:59.100 | Okay.
00:19:01.800 | - And how are you gonna mine them?
00:19:03.360 | You know, you gotta mine the materials
00:19:04.440 | or you don't have any cranes.
00:19:05.500 | You don't have any, you know.
00:19:06.740 | - Okay, you 100% have to have the, everybody's naked.
00:19:10.820 | - Everyone's naked and everyone's where they are.
00:19:12.320 | So you and I would currently be naked.
00:19:14.000 | It's on the ground in what used to be Manhattan.
00:19:16.360 | - So no buildings.
00:19:17.200 | - No, Grassy Island.
00:19:18.560 | - Yeah.
00:19:19.400 | So you need a naked Elon Musk type character
00:19:22.120 | to then start building a company.
00:19:23.840 | You have to have a large company then.
00:19:25.600 | - Right.
00:19:26.440 | He doesn't even know where he, you know, where is everyone?
00:19:28.200 | You know, oh shit, how am I gonna find other people
00:19:30.080 | I need to talk to?
00:19:30.920 | - But we have all the knowledge of.
00:19:32.000 | - Yeah, everyone has the knowledge
00:19:33.000 | that's in their current brains.
00:19:34.080 | - Yeah.
00:19:34.920 | I've met some legit engineers.
00:19:38.080 | - Crazy polymath people.
00:19:39.760 | - Yeah, but the actual labor of,
00:19:42.840 | 'cause you said, it's like the original Mac,
00:19:47.200 | like the Apple II, that can be built.
00:19:51.520 | - Even that, you know.
00:19:52.360 | - Even that's gonna be tough.
00:19:53.200 | - Well, I think part of it is a communication problem.
00:19:55.280 | If you could suddenly have, you know,
00:19:56.600 | if everyone had a walkie talkie
00:19:58.200 | and there was, you know, a couple, you know,
00:19:59.620 | 10 really smart people were designated the leaders,
00:20:01.920 | they could say, okay, I want, you know,
00:20:02.980 | everyone who can do this to walk West, you know,
00:20:05.960 | until you get to this little hub and everyone else,
00:20:08.240 | you know, and they could actually coordinate,
00:20:10.180 | but we don't have that.
00:20:11.080 | So it's like people just, you know,
00:20:12.200 | and then what I think about is,
00:20:14.040 | so you've got some people that are like trying to organize
00:20:15.920 | and you'll have a little community
00:20:17.080 | where a couple hundred people have come together
00:20:18.600 | and maybe a couple thousand have organized
00:20:20.080 | and they designated one person, you know, as the leader,
00:20:22.280 | and then they have sub leaders and okay,
00:20:24.320 | we have a start here, we have some organization.
00:20:26.280 | You're also gonna have some people that say, good,
00:20:28.840 | humans were a scourge upon the earth and this is good.
00:20:31.540 | And they're gonna try to sabotage.
00:20:32.520 | They're gonna try to murder the people
00:20:34.160 | who know what they're talking about.
00:20:35.960 | - The elite that possess the knowledge.
00:20:40.880 | - Well, and so maybe everyone's hopeful for the,
00:20:42.920 | you know, we're all civilized and hopeful
00:20:44.320 | for the first 30 days or something.
00:20:45.720 | And then things start to fall off.
00:20:47.120 | They, you know, people get, start to lose hope
00:20:48.800 | and there's new kinds of, you know,
00:20:50.280 | new kinds of governments popping up, you know,
00:20:52.200 | new kinds of societies and they're, you know,
00:20:56.000 | they don't play nicely with the other ones.
00:20:57.720 | And I think very quickly,
00:20:59.200 | I think a lot of people will just give up and say,
00:21:00.480 | you know what, this is it, we're back in the stone age.
00:21:02.040 | Let's just create, you know, agrarian,
00:21:03.800 | we also don't know how to farm.
00:21:05.040 | No one knows how to farm.
00:21:06.080 | There's like, even the farmers, you know,
00:21:08.520 | a lot of them are relying on their machines.
00:21:11.040 | And so we also, there's gonna be a lot of mass starvation.
00:21:14.680 | And that, you know, when you're trying to organize,
00:21:16.420 | a lot of people are, you know, coming in with, you know,
00:21:18.760 | Spears they fashioned and trying to murder everyone
00:21:21.880 | who has food.
00:21:22.720 | - That's an interesting question.
00:21:23.540 | Given today's society, how much violence would that be?
00:21:25.640 | We've gotten softer, less violent.
00:21:28.000 | - And we don't have weapons.
00:21:29.040 | So that's something. - We don't have weapons.
00:21:29.880 | - We have really primitive weapons now.
00:21:31.720 | - But we have, and also we have a kind of ethics
00:21:33.960 | where murder is bad.
00:21:35.680 | We used to be less, like human life was less valued.
00:21:38.960 | In the past, so murder was more okay, like ethically.
00:21:41.920 | - But in the past, they also were really good
00:21:44.340 | at figuring out how to have sustenance.
00:21:46.340 | They knew how to get food and water because they,
00:21:48.680 | so we have no idea.
00:21:49.840 | Like the ancient hunter-gatherer societies would laugh
00:21:52.020 | at what's going on here.
00:21:52.880 | They'd say, you guys, you don't know what you're,
00:21:54.320 | none of you know what you're doing.
00:21:55.840 | And also the amount of people,
00:21:57.360 | feeding this amount of people in a very, in a stone age,
00:22:01.120 | you know, civilization, that's not gonna happen.
00:22:03.840 | - So New York and San Francisco are screwed.
00:22:06.000 | - Well, whoever's not near water is really screwed.
00:22:07.880 | So that's funny. - That's true.
00:22:08.720 | - You're near a river, a freshwater river, and you know.
00:22:11.060 | Anyways, it's a very interesting question.
00:22:12.440 | And what it does, this and the pencil,
00:22:14.480 | it makes me feel so grateful and like excited about like,
00:22:19.240 | man, our civilization is so cool.
00:22:21.560 | And this is, talk about collective intelligence.
00:22:24.480 | Humans did not build any of this.
00:22:28.040 | It's collective human super,
00:22:30.480 | collective humans is a super intelligent, you know,
00:22:33.900 | being that is, that can do absolutely,
00:22:37.280 | especially over long periods of time,
00:22:38.640 | can do such magical things.
00:22:39.720 | And we just get to be born.
00:22:41.360 | When I go out, when I'm working and I'm hungry,
00:22:43.020 | I just go click, click, click, and like a salad's coming.
00:22:46.400 | The salad arrives.
00:22:47.240 | If you think about the incredible infrastructure
00:22:50.100 | that's in place for that to happen that quickly,
00:22:52.120 | the electricity,
00:22:54.440 | first of all, that's just powering the things, you know,
00:22:56.040 | the amount of structures
00:22:58.040 | that have to be created for that electricity
00:23:00.360 | to be there.
00:23:01.200 | And then you've got the, of course the internet.
00:23:02.640 | And then you have this system where delivery drivers
00:23:05.640 | and they have, they're riding bikes
00:23:06.680 | that were made by someone else.
00:23:07.720 | And they're going to get the salad
00:23:09.200 | and all those ingredients came from all over the place.
00:23:11.560 | I mean, it's just, so I think it's like,
00:23:13.640 | I like thinking about these things because it,
00:23:16.800 | it makes me feel like just so grateful.
00:23:19.280 | I'm like, man, it would be so awful if we didn't have this.
00:23:21.360 | And people, people who didn't have it would think
00:23:22.800 | this was such magic we live in and we do.
00:23:25.240 | And like, cool, that's fun.
00:23:27.600 | - Yeah, one of the most amazing things when I showed up,
00:23:29.720 | I came here at 13 from the Soviet Union
00:23:32.200 | and the supermarket was, people don't really realize that,
00:23:36.560 | but the abundance of food, it's not even,
00:23:39.960 | so bananas was the thing I was obsessed about.
00:23:43.440 | I just ate bananas every day for many, many months
00:23:45.760 | 'cause I haven't had bananas in Russia.
00:23:47.720 | And the fact that you can have as many bananas as you want,
00:23:49.800 | plus there were like somewhat inexpensive
00:23:53.320 | relative to the other food.
00:23:54.600 | And the fact that you can somehow have a system
00:23:58.080 | that brings bananas to you without having to wait
00:24:00.200 | in a long line, all of those things, it's magic.
00:24:03.240 | - I mean, also imagine, so first of all,
00:24:06.000 | the ancient hunter-gatherers, you know,
00:24:07.600 | you picture them out there gathering and eating
00:24:08.960 | all this fresh fruit.
00:24:09.800 | No, so do you know what an avocado used to look like?
00:24:11.960 | It was a little, like a sphere.
00:24:14.480 | And the fruit of it, the actual avocado part
00:24:17.080 | was like a little tiny layer around this big pit
00:24:19.120 | that took up almost the whole volume.
00:24:21.440 | We've made a crazy, like robot avocados today
00:24:25.840 | that have nothing to do with like what they,
00:24:28.320 | so same with bananas, these big, sweet, you know,
00:24:33.320 | not infested with bugs and, you know,
00:24:35.800 | they used to eat the shittiest food.
00:24:38.680 | And they're eating, you know, uncooked meat
00:24:40.880 | or maybe they cook it and they're just,
00:24:42.240 | it's gross and things rot.
00:24:44.720 | So you go to the supermarket and it's just,
00:24:46.720 | it's just A, it's like crazy, super engineered cartoon food,
00:24:50.000 | fruit and food.
00:24:51.400 | And then it's all this processed food,
00:24:52.520 | which, you know, we complain about.
00:24:53.680 | In our society, oh, you know, we complain about,
00:24:55.080 | you know, we need too much process.
00:24:56.760 | That's a, this is a good problem.
00:24:58.320 | I mean, if you imagine what they would think,
00:25:00.400 | oh my God, a cracker, you know how delicious
00:25:02.240 | a cracker would taste to them?
00:25:04.320 | You know, candy, you know, pasta and spaghetti.
00:25:08.160 | They never had anything like this.
00:25:09.400 | And then you have from all over the world,
00:25:11.480 | I mean, things that are grown all over the place,
00:25:13.960 | all here in nice little racks organized
00:25:16.080 | and on a middle-class salary,
00:25:17.960 | you can afford anything you want.
00:25:20.080 | I mean, it's, again, just like incredible gratitude.
00:25:23.240 | Like, ah, yeah.
00:25:25.160 | - And the question is how resilient is this whole thing?
00:25:27.440 | I mean, this is another darker version of your question
00:25:30.880 | is if we keep all the material possessions we have,
00:25:35.240 | but we start knocking out some percent of the population,
00:25:39.560 | how resilient is the system that we built up?
00:25:42.080 | Or if we rely on other humans and the knowledge
00:25:44.960 | built up in the past,
00:25:45.960 | the distributed nature of knowledge,
00:25:48.920 | how much does it take?
00:25:52.980 | How many humans need to disappear
00:25:55.520 | for us to be completely lost?
00:25:57.880 | - Well, I'm trying to go off one thing,
00:25:59.200 | which is Elon Musk says that he has this number,
00:26:02.520 | a million, in mind
00:26:03.720 | as the order of magnitude of people
00:26:05.960 | you need to be on Mars to truly be multi-planetary.
00:26:10.960 | Multi-planetary doesn't mean, you know,
00:26:13.760 | like when Neil Armstrong goes to the moon,
00:26:19.520 | they call it a great leap for mankind.
00:26:21.960 | It's not a great leap for anything.
00:26:23.780 | It is a great achievement for mankind.
00:26:26.740 | And I always like think about if the first fish
00:26:29.460 | to kind of go on land just kind of went up
00:26:31.940 | and gave the shore a high five
00:26:33.340 | and goes back into the water,
00:26:34.220 | that's not a great leap for life.
00:26:36.060 | That's a great achievement for that fish.
00:26:37.260 | And there should be a little statue of that fish
00:26:38.540 | and it's, you know, in the water
00:26:39.660 | and everyone should celebrate the fish.
00:26:41.380 | But it's, but when we talk about a great leap for life,
00:26:44.860 | it's permanent.
00:26:45.740 | It's something that now, from now on,
00:26:47.740 | this is how things are.
00:26:48.660 | So this is part of why I get so excited about Mars,
00:26:51.080 | by the way, is because you can count on one hand,
00:26:53.820 | like the number of great leaps that we've had,
00:26:57.140 | you know, like no life to life and single cell
00:27:00.860 | or simple cell to complex cell
00:27:02.680 | and single cell organisms to animals,
00:27:05.620 | to, you know, multi-cell animals,
00:27:07.940 | and then ocean to land,
00:27:09.220 | and then one planet to two planets.
00:27:11.860 | Anyway, diversion.
00:27:12.700 | But the point is that, officially,
00:27:16.540 | that leap for all of life, you know,
00:27:18.680 | has happened once the ships could stop coming from Earth
00:27:22.740 | because there's some horrible catastrophic World War III
00:27:24.780 | and everyone dies on Earth and they're fine
00:27:26.280 | and they can turn that certain X number of people
00:27:29.220 | into 7 billion, you know, population
00:27:31.740 | that's thriving just like Earth.
00:27:33.060 | They can build ships,
00:27:33.900 | they can come back and recolonize Earth
00:27:35.220 | 'cause now we are officially multi-planetary
00:27:37.180 | where it's a self-sustaining.
00:27:38.860 | He says a million people is about what he thinks.
00:27:40.860 | Now that might be a specialized group.
00:27:42.580 | That's a very specifically, you know,
00:27:44.620 | selected million that has very, very skilled million people,
00:27:49.280 | not just maybe the average million on Earth.
00:27:51.600 | But I think it depends what you're talking about.
00:27:53.180 | But I don't think, you know,
00:27:54.200 | so one million is 1/7000, 1/8000 of the current population.
00:27:58.480 | I think you need a very, very, very small fraction
00:28:02.160 | of humans on Earth to get by.
00:28:04.480 | Obviously, you're not gonna have
00:28:05.320 | the same thriving civilization
00:28:06.960 | if you get to a too small a number,
00:28:08.480 | but it depends who you're killing off, I guess,
00:28:10.680 | is part of the question.
00:28:11.960 | Yeah.
00:28:12.920 | If you killed off half of the people
00:28:14.320 | just randomly right now, I think we'd be fine.
00:28:15.720 | It would be obviously a great, awful tragedy.
00:28:18.740 | I think if you killed off 3/4 of all people randomly,
00:28:20.740 | just three out of every four people drops dead,
00:28:22.420 | I think we'd have, obviously, the stock market would crash.
00:28:24.820 | We'd have a rough patch,
00:28:27.020 | but I almost can assure you that the species would be fine.
00:28:30.540 | - Well, 'cause the million number,
00:28:31.780 | like you said, it is specialized.
00:28:33.780 | So I think, 'cause you have to do this,
00:28:38.780 | you have to basically do the iPhone experiment.
00:28:41.660 | Like, literally, you have to be able
00:28:43.000 | to manufacture computers.
00:28:46.120 | - Yeah, everything.
00:28:46.960 | If you're gonna have the self-sustaining,
00:28:48.240 | it means you can, any major important skill,
00:28:50.800 | any important piece of infrastructure on Earth
00:28:53.560 | can be built there just as well.
00:28:56.560 | - It'd be interesting to list out
00:29:00.160 | what are the important things,
00:29:01.920 | what are the important skills.
00:29:03.360 | - Yeah, I mean, you have to feed everyone.
00:29:05.960 | So mass farming, things like that.
00:29:08.840 | You have mining, these questions.
00:29:12.560 | It's like, the materials might be,
00:29:15.280 | I don't know, five miles, two miles underground,
00:29:17.640 | I don't know the actual, but like,
00:29:19.340 | it's amazing to me just that these things
00:29:22.040 | got built in the first place.
00:29:23.040 | And they never got, no one built the first,
00:29:25.280 | the mine that we're getting stuff for the iPhone for
00:29:28.160 | probably wasn't built for the iPhone.
00:29:30.680 | Or in general, early mining was for,
00:29:33.260 | I think, obviously, I assume the Industrial Revolution
00:29:35.480 | when we realized, oh, fossil fuels,
00:29:37.000 | we want to extract this magical energy source,
00:29:40.040 | I assume that mining took a huge leap.
00:29:42.200 | Without knowing very much about this,
00:29:43.360 | I think you're gonna need mining,
00:29:45.740 | you're gonna need a lot of electrical engineers.
00:29:48.480 | If you're gonna have a civilization like ours,
00:29:50.040 | now, of course, you could have oil and lanterns,
00:29:51.960 | we could go way back,
00:29:53.200 | but if you're trying to build our today thing,
00:29:55.280 | you're gonna need energy and electricity
00:29:58.360 | and mines that can bring materials,
00:30:00.600 | and then you're gonna need a ton of plumbing
00:30:03.360 | and everything that entails.
00:30:04.560 | - Yeah, and like you said, food,
00:30:06.300 | but also the manufacturers,
00:30:07.720 | so like turning raw materials into something useful,
00:30:10.800 | that whole thing, like factories,
00:30:13.040 | some supply chain, transportation.
00:30:15.560 | - Right, I mean, you think about,
00:30:17.480 | when we talk about the world hunger,
00:30:18.460 | one of the major problems is there's plenty of food,
00:30:21.800 | and by the time it arrives,
00:30:22.760 | most of it's gone bad in the truck,
00:30:24.720 | in kind of an impoverished place.
00:30:26.920 | So it's like, again, we take it so for granted,
00:30:29.640 | all the food in the supermarket is fresh,
00:30:32.760 | it's all there, and, which always stresses me out,
00:30:35.480 | if I were running a supermarket,
00:30:36.460 | I would always be so miserable
00:30:37.940 | about things going bad on the shelves,
00:30:41.660 | or if you don't have enough, that's not good,
00:30:43.100 | but if you have too much, it goes bad anyway.
00:30:44.380 | - Of course, there would be entertainers too.
00:30:46.900 | Like somebody would have a YouTube channel
00:30:48.980 | that's running on Mars.
00:30:50.860 | There is something different
00:30:53.180 | about a civilization on Mars and Earth existing
00:30:57.860 | versus a civilization in the United States
00:31:00.020 | versus Russia and China.
00:31:01.420 | Like that's a different,
00:31:03.260 | fundamentally different distance.
00:31:05.180 | Like philosophically.
00:31:07.040 | - Will it be like fuzzy?
00:31:08.160 | We know there'll be like a reality show on Mars
00:31:09.920 | that everyone on Earth is obsessed with.
00:31:11.400 | And I think if people are going back and forth enough,
00:31:15.160 | then it becomes fuzzy, it becomes like,
00:31:16.640 | oh, our friends on Mars,
00:31:17.600 | and there's like this Mars versus Earth,
00:31:20.360 | and it become like fun tribalism.
00:31:22.640 | I think if people don't really go back and forth,
00:31:25.240 | and it really, they're there for,
00:31:26.240 | I think if you get kind of like,
00:31:27.480 | oh, we hate a lot of like us versus them stuff going on.
00:31:30.360 | - There could be also war in space for territory.
00:31:33.520 | As first colony happens, China, Russia,
00:31:38.460 | or whoever, the European, different European nations,
00:31:41.700 | Switzerland finally gets their act together
00:31:43.580 | and starts wars.
00:31:44.780 | This is supposed to, staying out of all of them.
00:31:47.260 | - All kinds of crazy geopolitical things
00:31:49.060 | that like we have not even,
00:31:50.980 | no one's really even thought about too much yet
00:31:52.740 | that like, that could get weird.
00:31:54.420 | Think about the 1500s,
00:31:56.220 | when it was suddenly like a race to like,
00:31:58.540 | you know, colonize or capture land
00:32:00.380 | or discover new land that hasn't been,
00:32:02.140 | you know, so it was like this new frontiers.
00:32:04.200 | And there's not really, you know,
00:32:05.600 | the land is not, you know,
00:32:06.840 | the thing about Crimea was like,
00:32:08.480 | this huge thing, 'cause this tiny peninsula switched.
00:32:11.280 | That's how like optimized everything has become.
00:32:14.000 | Everything is just like really stuck.
00:32:16.400 | Mars is a whole new world of like,
00:32:18.080 | but you know, territory, naming things,
00:32:20.600 | and you know, and it's a chance
00:32:23.640 | for new kind of governments maybe,
00:32:24.960 | or maybe it's just the colonies of these governments,
00:32:27.440 | so we don't get that opportunity.
00:32:28.680 | I think it'd be cool if there's new countries being,
00:32:30.440 | you know, totally new experiments.
00:32:31.760 | - Yeah, and that's fascinating,
00:32:33.280 | 'cause Elon talks exactly about that,
00:32:35.200 | and I believe that very much.
00:32:36.600 | Like, that should be, like, from the start,
00:32:40.440 | they should determine their own sovereignty.
00:32:43.280 | Like, they should determine their own thing.
00:32:46.360 | - There was one modern democracy in the late 1700s, the US.
00:32:50.920 | I mean, it was the only modern democracy,
00:32:53.360 | and now, of course, there's hundreds,
00:32:56.120 | or dozen, many dozens.
00:32:57.640 | But I think part of the reason that was able to start,
00:32:59.760 | I mean, it's not that people didn't have the idea.
00:33:01.080 | People had the idea.
00:33:02.000 | It was that they had a clean slate, new place,
00:33:05.600 | you know, and they suddenly were,
00:33:06.760 | so I think it would be a great opportunity to have,
00:33:10.240 | 'cause a lot of people have done that, you know,
00:33:11.720 | oh, if I had my own government on an island,
00:33:13.920 | my own country, what would I do?
00:33:15.560 | And it's, the US founders actually had the opportunity,
00:33:20.120 | that fantasy, they were like, we can do it.
00:33:21.640 | Let's make, okay, what's the perfect country?
00:33:23.880 | And they tried to make something.
00:33:25.360 | Sometimes progress is, it's not held up by our imagination.
00:33:29.480 | It's held up by just, there's no, you know,
00:33:32.600 | blank canvas to try something on.
00:33:34.400 | - Yeah, it's an opportunity for a fresh start.
00:33:37.200 | You know, the funny thing about the conversation
00:33:39.000 | we're having is it's not often had.
00:33:41.320 | I mean, even by Elon, he's so focused on starship
00:33:43.760 | and actually putting the first human on Mars.
00:33:46.160 | I think thinking about this kind of stuff is inspiring.
00:33:51.160 | It makes us dream.
00:33:52.960 | It makes us hope for the future.
00:33:54.280 | So, and it makes us, somehow, like,
00:33:56.960 | thinking about civilization on Mars is helping us
00:34:00.960 | think about the civilization here on Earth.
00:34:03.260 | - Yeah, totally. - And how we should run it.
00:34:05.080 | What do you think are, like, in our lifetime?
00:34:07.640 | Are we gonna, I think any effort that goes to Mars,
00:34:11.220 | the goal is in this decade.
00:34:13.320 | Do you think that's actually gonna be achieved?
00:34:15.440 | - I have a big bet, $10,000, with a friend
00:34:18.880 | when I was drunk in an argument.
00:34:21.520 | - This is great.
00:34:22.360 | - That the Neil Armstrong of Mars,
00:34:24.000 | whoever he or she may be, will set foot
00:34:26.800 | by the end of 2030.
00:34:28.680 | Now, this was probably in 2018 when I had this argument.
00:34:30.960 | - So, like, what if-- - So, a human has to touch Mars
00:34:33.320 | by the end of 2030. - '30.
00:34:35.560 | Oh, by the year '30. - Yeah, by January 1st, 2031.
00:34:39.600 | - Yeah.
00:34:40.440 | - So-- - Did you agree
00:34:42.400 | on the time zone, or what?
00:34:44.080 | - No, no, yeah, if it's coming on that exact day,
00:34:46.080 | that's gonna be really stressful.
00:34:47.080 | But anyway, 'cause I think that there will be.
00:34:52.080 | That was 2018.
00:34:53.240 | I was more confident then.
00:34:54.920 | I think it's gonna be around this time.
00:34:56.240 | I mean, I still won the general bet,
00:34:58.280 | 'cause his point was, "You are crazy.
00:34:59.640 | "This is not gonna happen in our lifetime.
00:35:01.040 | "They're not for many, many decades."
00:35:02.640 | And I said, "You're wrong.
00:35:03.460 | "You don't know what's going on in SpaceX."
00:35:05.520 | I think if the world depended on it,
00:35:08.200 | I think probably SpaceX could probably,
00:35:10.480 | I mean, I don't know this,
00:35:11.520 | but I think the tech is almost there.
00:35:14.440 | Like, I don't think, of course,
00:35:16.160 | it's delayed many years by safety,
00:35:18.160 | so they first wanna send a ship around Mars,
00:35:20.360 | and they wanna land a cargo ship on Mars.
00:35:22.080 | - And there's the moon on the way, too.
00:35:23.320 | - Yeah, yeah, there's a lot.
00:35:24.560 | I think the moon, a decade before,
00:35:27.000 | seemed like magical tech that humans didn't have.
00:35:30.640 | This is like, no.
00:35:31.680 | It's totally conceivable that this,
00:35:35.600 | you've seen Starship,
00:35:36.760 | like it is an interplanetary transport system.
00:35:41.760 | That's what they used to call it.
00:35:45.920 | SpaceX, the way they do it is,
00:35:47.520 | every time they do a launch,
00:35:49.000 | something fails, usually, when they're testing,
00:35:52.120 | and they learn a thousand things.
00:35:54.280 | The amount of data they get,
00:35:55.640 | and they improve, each one has,
00:35:58.520 | it's like they've moved up eight generations in each one.
00:36:01.560 | Anyway, so it's not inconceivable that pretty soon,
00:36:04.480 | they could send a Starship to Mars and land it.
00:36:07.600 | There's just no good reason,
00:36:08.520 | I don't think that they couldn't do that.
00:36:09.800 | And so if they could do that,
00:36:10.840 | they could, in theory, send a person to Mars pretty soon.
00:36:13.880 | Now, taking off from Mars and coming back,
00:36:15.720 | again, I don't think anyone would want
00:36:17.920 | to be on that voyage today,
00:36:19.200 | because there's just,
00:36:21.160 | it's still amateur hour here,
00:36:22.920 | getting that perfect.
00:36:24.640 | I don't think we're too far away now,
00:36:25.920 | the question is,
00:36:26.760 | so every 26 months, Earth laps Mars, right?
00:36:31.680 | It's like the sinusoidal orbit,
00:36:34.240 | or whatever it's called, the period, 26 months.
00:36:37.200 | So it's right now, in the evens,
00:36:39.280 | 2022 is gonna have one of these,
00:36:42.000 | late 2024, so people could,
00:36:43.800 | this was the earliest estimate I heard.
00:36:45.200 | Elon said, maybe we can send people to Mars in 2024,
00:36:49.120 | to land in early 2025.
00:36:51.680 | That is not gonna happen,
00:36:52.680 | because that included 2022,
00:36:54.320 | sending a cargo ship to Mars,
00:36:56.520 | maybe even one in 2020,
00:36:58.480 | and so I think they're not quite on that schedule,
00:37:00.080 | but to win my bet,
00:37:01.840 | 2027, I have a chance,
00:37:03.360 | and 2029, I have another chance.
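
The 26-month figure is the Earth-Mars synodic period, the term Tim is reaching for with "sinusoidal." A quick check, using orbital periods I'm supplying here rather than numbers from the conversation:

```python
# Launch windows to Mars recur roughly once per synodic period:
#   1 / T_syn = 1 / T_earth - 1 / T_mars
T_earth = 365.25   # days, Earth's orbital period
T_mars = 687.0     # days, Mars's orbital period

T_syn = 1 / (1 / T_earth - 1 / T_mars)
print(f"synodic period: {T_syn:.0f} days = about {T_syn / 30.44:.1f} months")
# ~780 days, i.e. about 26 months -- matching the cadence of the windows discussed above.
```
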
00:37:04.960 | - Nice.
00:37:05.800 | - We're not very good at backing up
00:37:06.880 | and seeing the big picture,
00:37:07.720 | we're very distracted by what's going on today,
00:37:09.280 | and what we can believe,
00:37:10.800 | 'cause it's happening in front of our face.
00:37:12.440 | There's no way that a human's gonna be landing on Mars,
00:37:15.400 | and it's not gonna be the only thing
00:37:17.480 | everyone is talking about, right?
00:37:18.840 | I mean, it's gonna be the moon landing,
00:37:20.480 | an even bigger deal, going to another planet, right?
00:37:23.000 | And for it to start a colony,
00:37:24.600 | not just to, again, high five and come back.
00:37:27.080 | So this is like, the 2020s, maybe the 2030s,
00:37:31.760 | is gonna be the new 1960s,
00:37:33.160 | we're gonna have a space decade,
00:37:34.280 | I'm so excited about it.
00:37:36.000 | And again, it's one of the great leaps for all of life
00:37:38.080 | happening in our lifetimes, like that's wild.
00:37:40.500 | - To paint a slightly cynical possibility,
00:37:43.720 | which I don't see happening,
00:37:46.080 | but I just wanna put sort of value into leadership.
00:37:49.280 | I think it wasn't obvious that the moon landing
00:37:52.560 | would be so exciting for all of human civilization.
00:37:55.000 | Some of that had to do with the right speeches,
00:37:56.920 | with the space race.
00:37:58.880 | Like, space, depending on how it's presented,
00:38:01.720 | can be boring.
00:38:02.560 | I don't think it's been that so far, but I've actually--
00:38:07.160 | - I agree, I think space is quite boring right now.
00:38:10.200 | Not, you know, SpaceX is super,
00:38:11.880 | but like 10 years ago, space.
00:38:13.840 | Some writer, I forget who, wrote,
00:38:15.840 | it's like, the best magic trick in the show
00:38:17.680 | happened at the beginning,
00:38:18.560 | and now they're starting to do this easy magic.
00:38:20.440 | You know, it's like, you can't go in that direction.
00:38:21.720 | And the line that this writer said is like,
00:38:23.600 | watching astronauts go up to the space station
00:38:28.440 | after watching the moon landing is like
00:38:29.760 | watching Columbus sail to Ibiza.
00:38:31.800 | It's just like, you know, everything is so impractical.
00:38:34.880 | You're going up to the space station not to explore,
00:38:36.720 | but to do science experiments in microgravity.
00:38:38.600 | And you're sending rockets up, you know,
00:38:41.520 | mostly here and there there's a probe,
00:38:43.080 | but mostly you're sending them up to put satellites
00:38:44.960 | for DirecTV, you know, or whatever it is.
00:38:48.520 | It's kind of like lame earth industry, you know, usage.
00:38:51.840 | So I agree with you, space is boring there.
00:38:55.560 | The first human setting foot on Mars,
00:38:59.760 | that's gotta be a crazy global event.
00:39:01.680 | I can't imagine it not being.
00:39:02.880 | Maybe you're right, maybe I'm taking for granted
00:39:04.680 | the speeches and the space race and the--
00:39:06.480 | - I think the value of, I guess what I'm pushing
00:39:09.920 | is the value of people like Elon Musk
00:39:12.080 | and potentially other leaders that hopefully step up
00:39:14.480 | is extremely important here.
00:39:16.160 | Like I would argue without the publicity of SpaceX,
00:39:19.240 | it's not just the ingenuity of SpaceX,
00:39:21.360 | but like what they've done publicly
00:39:23.400 | by having a figure that tweets
00:39:25.720 | and all that kind of stuff like that,
00:39:27.640 | that's a source of inspiration.
00:39:29.560 | - Totally.
00:39:30.400 | - NASA wasn't able to quite pull that off with the shuttle.
00:39:32.680 | - That's one of his two reasons for doing this.
00:39:35.000 | SpaceX exists for two reasons.
00:39:37.240 | One, life insurance for the species.
00:39:40.040 | If we're on, you know, if you're,
00:39:41.640 | I always think about it this way,
00:39:42.560 | if you're an alien on some faraway planet
00:39:44.440 | and you're rooting against humanity
00:39:46.720 | and you win the bet if humanity goes extinct,
00:39:49.600 | you do not like SpaceX.
00:39:51.560 | You do not want them to have their eggs in two baskets now.
00:39:54.200 | - Yeah.
00:39:55.040 | (laughing)
00:39:55.880 | - You know, sure, it's like obviously this,
00:39:57.880 | you know, you could have some, you know,
00:39:59.160 | something that kills everyone on both planets,
00:40:00.640 | some AI war or something,
00:40:02.120 | but the point is obviously it's good for our chances,
00:40:06.880 | our long-term chances to be having, you know,
00:40:08.600 | two self-sustaining civilizations going on.
00:40:11.360 | The second reason, he values this,
00:40:14.200 | I think just as high is it's the greatest adventure
00:40:16.400 | in history, you know, going multi-planetary
00:40:18.920 | and that, you know, it's, you know,
00:40:20.160 | people need some reason to wake up in the morning
00:40:21.840 | and it'll just be this hopefully great uniting event too.
00:40:26.240 | I mean, I'm sure in today's nasty,
00:40:27.920 | awful political environment,
00:40:29.560 | which is like a whirlpool
00:40:31.720 | that sucks everything into it.
00:40:34.120 | I mean, you name a thing
00:40:35.640 | and it's become a nasty political topic.
00:40:38.000 | So I hope, I hope that space can, you know,
00:40:42.720 | Mars can just bring everyone together, but you know,
00:40:45.440 | it could become this hideous thing where it's, you know,
00:40:47.560 | oh, you know, billionaire,
00:40:48.640 | some annoying storyline gets built.
00:40:50.720 | So half the people think that anyone who's excited
00:40:52.680 | about Mars is an evil, you know, something.
00:40:55.280 | - Yeah.
00:40:56.120 | - Anyway, I hope it is super exciting.
00:40:57.920 | - So far space has been a uniting, inspiring thing.
00:41:02.920 | And in fact, especially during this time of a pandemic
00:41:06.000 | just a commercial entity
00:41:08.960 | putting humans into space for the first time
00:41:12.080 | was just one of the only big sources of hope.
00:41:15.280 | - Totally, and awe, just like watching this huge skyscraper
00:41:19.000 | go up in the air, flip over, come back down and land.
00:41:21.480 | I mean, it just makes everyone just want to sit back
00:41:23.200 | and clap and kind of like, you know,
00:41:25.000 | the way I look at something like SpaceX is
00:41:27.360 | it makes me proud to be a human.
00:41:30.280 | And I think it makes a lot of people feel that way.
00:41:31.520 | It's like good for our self-esteem.
00:41:33.280 | It's like, you know what, we're pretty, you know,
00:41:34.320 | we have a lot of problems, but like, we're kind of awesome.
00:41:36.960 | - Yeah, we're awesome.
00:41:37.800 | - And if we can put people on Mars, you know,
00:41:39.000 | sticking an Earth flag on Mars, like, damn, you know,
00:41:42.640 | we should be so proud of our like little family here.
00:41:45.400 | Like we did something cool.
00:41:46.480 | And by the way, I've made it clear to SpaceX people,
00:41:51.160 | including Elon, many times,
00:41:52.560 | and it's like once a year reminder
00:41:54.040 | that if they want to make this more exciting,
00:41:56.920 | they send the writer to Mars on, you know,
00:42:00.280 | I'd go on the thing and I'll blog about it.
00:42:02.360 | So I'm just, you know, continuing to throw this out there.
00:42:04.960 | - On which trip?
00:42:05.800 | - I'm trying to get them to send me to Mars.
00:42:07.480 | - I understand that.
00:42:08.880 | So I just want to clarify,
00:42:10.280 | on which trip does the writer want to go?
00:42:12.400 | - I think my dream one, to be honest,
00:42:14.120 | would be like the, you know, like the Apollo 8,
00:42:16.960 | where they just looped around the moon and came back.
00:42:19.680 | 'Cause landing on Mars--
00:42:21.800 | - Give you a lot of good content to write about.
00:42:24.840 | - Great content, right?
00:42:26.200 | I mean, the amount of kind of high-minded, you know,
00:42:28.640 | and so I would go into the thing and I would blog about it.
00:42:32.200 | And I'd be in microgravity,
00:42:34.080 | so I'd be bouncing around my little space.
00:42:35.480 | I get a little, they can just send me in a dragon.
00:42:37.040 | They don't need to do a whole starship.
00:42:38.520 | And I would bounce around and I would get to,
00:42:41.200 | and I've always had a dream of going to like
00:42:43.960 | one of those nice jails for a year.
00:42:47.280 | Because I just have nothing to do besides like read books
00:42:49.600 | and no responsibilities and no social plans.
00:42:51.560 | So this is the ultimate version of that.
00:42:53.560 | Anyway, it's a side topic, but I think it would be--
00:42:55.320 | - But also if you, I mean, to be honest,
00:42:57.280 | if you land on Mars, it's epic.
00:43:00.040 | And then if you die there, like finishing your writing,
00:43:04.080 | it will be just even that much more powerful
00:43:06.600 | for the impact and the performance.
00:43:08.960 | - But then I'm gone and I don't even get to like
00:43:10.800 | experience the publication of it,
00:43:12.360 | which is the whole point of this.
00:43:13.200 | - Well, some of the greatest writers in history
00:43:15.240 | didn't get a chance to experience
00:43:16.920 | the publication of their great--
00:43:18.080 | - I know, I don't really think that.
00:43:19.360 | I think like, I think back to Jesus and I'm like,
00:43:20.920 | oh man, that guy really like crushed it, you know?
00:43:23.200 | But then if you think about it, it doesn't like,
00:43:26.680 | you could literally die today
00:43:28.360 | and then become the next Jesus like 2000 years from now
00:43:31.280 | in this civilization that's like,
00:43:32.840 | they're like magical in the clouds
00:43:35.840 | and they're worshiping you.
00:43:37.280 | They're worshiping Lex.
00:43:38.760 | And like, that sounds like your ego probably would be like,
00:43:40.640 | wow, that's pretty cool, except irrelevant to you
00:43:43.080 | 'cause you never even knew it happened.
00:43:44.240 | - This feels like a Rick and Morty episode.
00:43:46.440 | - It does, it does.
00:43:47.920 | - Okay, you've talked to Elon quite a bit.
00:43:51.720 | You've written about him quite a bit.
00:43:53.840 | Just, it'd be cool to hear you talk about
00:43:58.280 | what are your ideas of what, you know,
00:44:00.160 | the magic sauce is you've written about with Elon.
00:44:02.920 | What makes him so successful?
00:44:06.040 | His style of thinking, his ambition, his dreams,
00:44:09.000 | his, the people he connects with,
00:44:11.760 | the kind of problems he tackles.
00:44:13.080 | Is there a kind of comments you can make
00:44:14.720 | about what makes him special?
00:44:16.200 | - I think that obviously there's a lot of things
00:44:18.440 | that he's very good at.
00:44:19.360 | He has, he's obviously super intelligent.
00:44:23.200 | His heart is very much in like, I think the right place.
00:44:25.880 | Like, you know, I really, really believe that.
00:44:27.760 | Like, and I think people can sense that, you know,
00:44:30.080 | he just doesn't seem like a grifter of any kind.
00:44:32.800 | He's truly trying to do these big things
00:44:34.560 | for the right reasons.
00:44:36.080 | And he's obviously crazy ambitious and hardworking, right?
00:44:38.360 | Not everyone is.
00:44:39.200 | Some people are as talented and have cool visions,
00:44:40.880 | but they just don't wanna spend their life that way.
00:44:43.800 | So, but that's, none of those alone is what makes Elon
00:44:47.800 | Elon. I mean, if it were, there'd be more of him
00:44:49.840 | because there's a lot of people that are very smart
00:44:51.480 | and smart enough to accumulate a lot of money and influence
00:44:53.920 | and they have great ambition and they have, you know,
00:44:56.920 | their heart's in the right place.
00:44:58.800 | To me, it is the very unusual quality he has
00:45:01.960 | is that he's sane in a way that almost every human is crazy.
00:45:06.360 | What I mean by that is we are programmed
00:45:08.800 | to trust conventional wisdom over our own reasoning
00:45:15.480 | for good reason.
00:45:19.120 | If you go back 50,000 years and conventional wisdom says,
00:45:22.840 | you know, don't eat that berry, you know,
00:45:26.000 | or this is the way you tie a spearhead to a spear,
00:45:29.000 | and you're thinking, I'm smarter than that.
00:45:31.680 | Like, you're not.
00:45:32.640 | You know, that comes from the accumulation
00:45:35.080 | of life experience, accumulation of observation
00:45:37.000 | and experience over many generations.
00:45:39.600 | And that's a little mini version
00:45:41.760 | of the collective super intelligence.
00:45:43.360 | It's like, you know, it's equivalent
00:45:45.320 | of like making a pencil today.
00:45:46.640 | Like people back then, like the conventional wisdom
00:45:51.640 | like had this super, this knowledge
00:45:54.160 | that no human could ever accumulate.
00:45:56.080 | So we're very wired to trust it.
00:45:57.720 | Plus the secondary thing is that the people who, you know,
00:46:00.800 | just say that they believe the mountain is,
00:46:02.680 | they worship the mountain as their God, right?
00:46:04.360 | And the mountain determines their fate.
00:46:06.080 | That's not true, right?
00:46:07.320 | And the conventional wisdom's wrong there,
00:46:08.560 | but believing it was helpful to survival
00:46:12.720 | because you were part of the crowd
00:46:15.240 | and you stayed in the tribe.
00:46:16.080 | And if you started to, you know,
00:46:18.240 | insult the mountain God and say, that's just a mountain,
00:46:21.240 | it's not, you know, you didn't fare very well, right?
00:46:23.300 | So for a lot of reasons, it was a great survival trait
00:46:25.840 | to just trust what other people said and believe it.
00:46:29.360 | And truly, you know, obviously, you know,
00:46:30.680 | the more you really believed it, the better.
00:46:32.940 | Today, conventional wisdom in a rapidly changing world
00:46:37.940 | and a huge, giant society,
00:46:42.360 | our brains are not built to understand that.
00:46:44.120 | They have a few settings, you know,
00:46:45.640 | and none of them is, you know,
00:46:47.720 | 300 million person society.
00:46:49.520 | So your brain is basically,
00:46:52.460 | is treating a lot of things like a small tribe,
00:46:57.700 | even though they're not.
00:46:58.540 | And they're treating conventional wisdom as, you know,
00:47:01.640 | very wise in a way that it's not.
00:47:03.680 | If you think about it this way,
00:47:04.520 | it's like picture a, like a bucket
00:47:06.400 | that's not moving very much,
00:47:08.200 | moving like a millimeter a year.
00:47:09.520 | And so it has time to collect a lot of water in it.
00:47:11.240 | That's like conventional wisdom in the old days
00:47:12.680 | when very few things changed.
00:47:13.680 | Like your 10, you know, great, great, great grandmother
00:47:15.940 | probably lived a similar life to you,
00:47:17.080 | maybe on the same piece of land.
00:47:19.300 | And so old people really knew what they were talking about.
00:47:21.180 | Today, the bucket's moving really quickly.
00:47:23.380 | And so, you know, the wisdom doesn't accumulate,
00:47:25.200 | but we think it does.
00:47:26.400 | 'Cause our brain settings don't have the,
00:47:27.880 | you know, quickly moving bucket setting on it.
00:47:31.520 | So my grandmother gives me advice all the time.
00:47:35.680 | And I have to decide, is this,
00:47:38.480 | so there are certain things that are not changing,
00:47:40.240 | like relationships and love and loyalty
00:47:42.800 | and things like this.
00:47:44.060 | Her advice on those things, I'll listen to it all day.
00:47:45.760 | She's one of the people who said,
00:47:46.880 | you've got to live near your people you love,
00:47:48.560 | live near your family, right?
00:47:50.040 | I think that is like tremendous wisdom, right?
00:47:52.880 | That is wisdom.
00:47:53.720 | 'Cause that happens to be something that hasn't,
00:47:54.960 | doesn't change from generation to generation.
00:47:56.720 | - For now.
00:47:57.560 | - For now.
00:47:58.400 | She, all right, for now.
00:47:59.680 | She's also telling, right, so I'll be the idiot
00:48:01.600 | telling my grandkids that,
00:48:02.600 | and they'll actually be in some metaverse,
00:48:04.080 | like being like, it doesn't matter.
00:48:05.800 | And I'm like, it's not the same when you're not in person.
00:48:08.200 | They're gonna say, it's exactly the same, grandpa.
00:48:10.480 | And they'll also be thinking to me with their Neuralink,
00:48:12.560 | and I'm gonna be like, slow down.
00:48:13.960 | I don't understand what you're,
00:48:14.800 | can you just talk like a normal person?
00:48:16.640 | Anyway, so my grandmother then, but then she says,
00:48:19.560 | you know, you're, I don't know about this writing
00:48:21.360 | you're doing, you should go to law school.
00:48:23.140 | And you know, you wanna be secure.
00:48:25.120 | And that's not good advice for me, you know,
00:48:27.120 | given the world I'm in and what I like to do
00:48:29.360 | and what I'm good at, that's not the right advice.
00:48:31.840 | But because the world is totally,
00:48:34.000 | she's in a different world.
00:48:34.840 | So she became wise for a world that's no longer here, right?
00:48:37.520 | Now, if you think about that,
00:48:38.840 | so then when we think about conventional wisdom,
00:48:40.680 | it's a little like my grandmother.
00:48:41.860 | And it's not maybe, you know,
00:48:45.120 | 60 years outdated, like her software.
00:48:47.200 | It's maybe 10 years outdated,
00:48:49.640 | the conventional wisdom, sometimes 20.
00:48:51.560 | So anyway, I think that we all continually
00:48:55.560 | don't have the confidence in our own reasoning
00:48:58.580 | when it conflicts with what everyone else thinks,
00:49:00.800 | with what seems right.
00:49:02.840 | We don't have the guts to act on that reasoning
00:49:06.180 | for that reason, right?
00:49:07.120 | You know, we, and so there's so many Elon examples.
00:49:11.060 | I mean, just from the beginning,
00:49:12.240 | building Zip2 was the first company.
00:49:14.760 | And it was internet advertising at the time
00:49:19.440 | when people said, you know, this internet was brand new,
00:49:22.120 | like kind of like kind of thinking of like the metaverse,
00:49:24.000 | VR metaverse today.
00:49:24.840 | And people have been like, oh, we're saying, you know,
00:49:26.080 | we, you know, we facilitate internet advertising.
00:49:29.120 | People are saying, yeah, people are gonna advertise
00:49:30.720 | on the internet, yeah, right.
00:49:32.000 | Actually, it wasn't that he's magical and saw the future,
00:49:34.720 | is that he looked at the present,
00:49:36.260 | looked at what the internet was,
00:49:38.480 | thought about, you know,
00:49:40.000 | the obvious like advertising opportunity
00:49:42.440 | this was gonna be.
00:49:43.960 | It wasn't rocket science, it wasn't genius, I don't believe.
00:49:46.760 | I think it was just seeing the truth.
00:49:48.800 | And when everyone else is laughing, saying,
00:49:50.440 | well, you're wrong, I mean, I did the math
00:49:53.740 | and here it is, right?
00:49:54.760 | Next company, you know, x.com,
00:49:56.720 | which became eventually PayPal.
00:49:58.540 | People said, oh yeah, people are gonna put
00:50:00.720 | their financial information on the internet.
00:50:02.760 | No way.
00:50:03.600 | To us, it seems so obvious.
00:50:06.200 | If you went back then, you would probably feel the same
00:50:08.040 | where you'd think this is, that is a fake company.
00:50:10.600 | That no, it's just obviously not a good idea.
00:50:12.880 | He looked around and said, you know, I see where this is.
00:50:14.400 | And so again, he could see where it was going
00:50:15.720 | 'cause he could see what it was that day
00:50:17.240 | and not what, you know,
00:50:18.560 | conventional wisdom thought, which was still a bunch of years earlier.
00:50:21.680 | SpaceX is the ultimate example.
00:50:23.520 | A friend of his apparently bought,
00:50:25.040 | actually compiled a montage, video montage
00:50:28.360 | of rockets blowing up to show him this is not a good idea.
00:50:32.120 | And if, but just even the bigger picture,
00:50:34.480 | the amount of billionaires who have like thought
00:50:36.240 | this was, I'm gonna start launching rockets
00:50:38.800 | and you know, the amount that failed.
00:50:40.380 | I mean, conventional wisdom said
00:50:43.560 | this is a bad endeavor.
00:50:44.640 | He was putting all of his money into it.
00:50:45.960 | - Yeah.
00:50:47.280 | - Landing rockets was another thing, you know.
00:50:49.440 | Well, if, you know, here's the classic kind of way
00:50:52.040 | we reason, which is, if this could be done,
00:50:55.480 | NASA would have done it a long time ago
00:50:57.280 | 'cause of the money it would save.
00:50:58.640 | This could be done, the Soviet Union would have done it
00:51:00.200 | back in the 60s.
00:51:01.680 | It's obviously something that can't be done.
00:51:03.480 | And the math on his envelope said,
00:51:06.280 | well, I think it can be done.
00:51:07.240 | And so he just did it.
00:51:08.140 | So in each of these cases, I think actually,
00:51:10.080 | in some ways, Elon gets too much credit as, you know,
00:51:12.280 | people think it's that he's, you know,
00:51:13.960 | it's that his Einstein intelligence
00:51:15.400 | or he can see the future.
00:51:16.680 | He has incredible, he has incredible guts.
00:51:19.560 | He's so, you know, courageous.
00:51:20.820 | I think if you actually are looking at reality,
00:51:23.580 | you're just assessing probabilities
00:51:26.680 | and you're ignoring all the noise, which is wrong,
00:51:29.160 | so wrong, right?
00:51:30.280 | And you just, then you just have to be, you know,
00:51:33.040 | pretty smart and, you know, pretty courageous.
00:51:36.440 | And you have to have this magical ability to be sane
00:51:39.720 | and trust your reasoning over conventional wisdom
00:51:42.240 | because your individual reasoning, you know,
00:51:44.560 | part of it is that we see that we can't build a pencil.
00:51:46.780 | We can't build, you know, the civilization on our own,
00:51:49.320 | right?
00:51:50.160 | We kind of kowtow, you know, kowtow to the collective
00:51:55.160 | for good reason, but this is different
00:51:57.060 | when it comes to kind of what's possible.
00:51:59.040 | You know, the Beatles were doing their kind of Motown-y
00:52:02.020 | chord patterns in the early '60s
00:52:03.760 | and they were doing what was normal.
00:52:05.560 | They were doing what was clearly this kind of sound
00:52:07.800 | is a hit.
00:52:08.920 | Then they started getting weird
00:52:10.060 | because they were so popular,
00:52:11.480 | they had this confidence to say,
00:52:12.600 | let's just, we're gonna start just experimenting.
00:52:15.120 | And it turns out that like, if you just,
00:52:17.600 | all these people are in this like one groove together
00:52:19.660 | doing music and it's just like,
00:52:20.840 | there's a lot of land over there.
00:52:22.880 | And it seems like, you know,
00:52:24.440 | I'm sure the managers would say,
00:52:25.760 | and all the record execs would say,
00:52:28.000 | no, you have to be here.
00:52:28.960 | This is what sells.
00:52:30.400 | And it's just not true.
00:52:31.900 | So I think that Elon is,
00:52:33.040 | so the term for this that actually Elon likes to use
00:52:36.880 | is reasoning from first principles, the physics term.
00:52:39.920 | First principles are your axioms.
00:52:41.680 | And physicists, they don't say, well, what's, you know,
00:52:43.600 | what do people think?
00:52:44.920 | No, they say, what are the axioms?
00:52:46.300 | Those are the puzzle pieces.
00:52:47.280 | Let's use those to build a conclusion.
00:52:49.300 | That's our hypothesis.
00:52:50.200 | Now let's test it, right?
00:52:51.040 | And they come up with all kinds of new things constantly
00:52:53.800 | by doing that.
00:52:55.040 | If Einstein was assuming conventional wisdom was right,
00:52:57.440 | he never would have even tried to create something
00:52:59.360 | that really disproved Newton's laws.
00:53:02.080 | And the other way to reason is reasoning by analogy,
00:53:05.440 | which is a great shortcut.
00:53:08.120 | It's when we look at other people's reasoning
00:53:10.460 | and we kind of photocopy it into our head, we steal it.
00:53:13.240 | So reasoning by analogy, we do all the time.
00:53:15.920 | And it's usually a good thing.
00:53:17.000 | I mean, we don't, if you,
00:53:17.840 | it takes a lot of mental energy and time
00:53:19.620 | to reason from first principles.
00:53:20.600 | It's actually, you know,
00:53:21.500 | you don't want to reinvent the wheel every time, right?
00:53:23.320 | You want to often copy other people's reasoning
00:53:26.400 | most of the time.
00:53:27.240 | And I, you know, most of us do it most of the time
00:53:28.740 | and that's good, but there's certain moments
00:53:29.980 | when you're, forget just for a second,
00:53:31.520 | like succeeding in like the world of like Elon,
00:53:34.500 | just who you're going to marry,
00:53:35.600 | where are you going to settle down?
00:53:36.600 | How are you going to raise your kids?
00:53:38.820 | How are you going to educate your kids?
00:53:40.880 | How you should educate yourself?
00:53:42.240 | What kind of career path? These moments,
00:53:44.520 | this is what on your deathbed, like you look back on,
00:53:46.480 | and that's what, these are the few number of choices
00:53:49.280 | that really define your life.
00:53:50.480 | Those should not be reasoned by analogy.
00:53:52.280 | You should absolutely try to reason from first principles.
00:53:55.280 | And Elon, not just by the way in his work,
00:53:58.320 | but in his personal life.
00:53:59.360 | I mean, if you just look at the way he is on Twitter,
00:54:01.440 | it's not how you're supposed to be
00:54:03.400 | when you're a super famous, you know, industry titan.
00:54:07.280 | You're not supposed to just be silly on Twitter
00:54:08.840 | and do memes and get in little quibbles.
00:54:11.800 | He just does things his own way,
00:54:14.300 | regardless of what you're supposed to do,
00:54:15.880 | which sometimes serves him and sometimes doesn't.
00:54:17.640 | But I think it has taken him where it has taken him.
00:54:21.520 | - Yeah, I mean, I probably wouldn't describe
00:54:23.400 | his approach to Twitter as first principles,
00:54:26.000 | but I guess it has the same element.
00:54:26.840 | - I don't think it is.
00:54:28.280 | Well, first of all, I will say that a lot of tweets,
00:54:30.040 | people think, oh, he's going to be done after that.
00:54:32.560 | He's fine, he just won Time Man of the Year.
00:54:35.480 | Like, it's something, it's not sinking him.
00:54:38.560 | And I think, you know, it's not that I think
00:54:40.600 | this is like super reasoned out.
00:54:41.760 | I think that, you know, Twitter is his silly side.
00:54:43.440 | But I think that he saw,
00:54:48.200 | with his reasoning did not feel like there was a giant risk
00:54:50.800 | in just being his silly self on Twitter,
00:54:52.520 | when a lot of billionaires would say,
00:54:53.600 | well, no one else is doing that.
00:54:55.560 | So there must be a good reason, right?
00:54:58.200 | - Well, I gotta say that he inspires me to,
00:55:01.120 | that it's okay to be silly.
00:55:02.920 | - Totally. - On Twitter.
00:55:04.160 | But yeah, you're right.
00:55:06.680 | The big inspiration is the willingness to do that
00:55:09.000 | when nobody else is doing it.
00:55:11.040 | - Yeah, and I think about all the great artists,
00:55:13.280 | you know, all the great inventors and entrepreneurs,
00:55:16.240 | almost all of them,
00:55:17.800 | they had a moment when they trusted their reasoning.
00:55:19.480 | I mean, Airbnb was 0 for 60 with VCs.
00:55:24.480 | A lot of people would say,
00:55:25.840 | obviously they know something we don't, right?
00:55:28.960 | But they didn't, they said, I think they're all wrong.
00:55:30.680 | I mean, that takes some kind of
00:55:32.120 | different wiring in your brain.
00:55:34.240 | - And then that's both for big picture
00:55:36.440 | and detailed like engineering problems.
00:55:39.520 | It's fun to talk to him.
00:55:40.760 | It's fun to talk to Jim Keller,
00:55:42.160 | who's a good example of this kind of thinking
00:55:44.520 | about like manufacturing, how to get costs down.
00:55:47.320 | They always talk about like,
00:55:49.040 | they talk about SpaceX rockets this way.
00:55:52.200 | They talk about manufacturing this way,
00:55:53.860 | like cost per pound or per ton
00:55:58.860 | to get to orbit or something like that.
00:56:02.560 | This is how they reason we need to get the cost down.
00:56:04.640 | It's a very kind of raw materials,
00:56:07.800 | like just very basic way of thinking.
00:56:10.360 | - First principles.
00:56:11.640 | - It's really, yeah.
00:56:12.480 | - And the first principles of a rocket
00:56:14.040 | are like the price of raw materials and gravity, you know,
00:56:18.360 | and wind.
00:56:19.840 | I mean, these are your first principles and fuel.
00:56:22.800 | Henry Ford, you know, what made Henry Ford
00:56:27.200 | blow up as an entrepreneur?
00:56:30.640 | The assembly line, right?
00:56:32.000 | I mean, he did, he thought for a second and said,
00:56:35.120 | this isn't how manufacturing is normally, you know,
00:56:37.280 | it is normally done this way,
00:56:38.280 | but I think this is a different kind of product.
00:56:40.680 | And that's what changed it.
00:56:41.600 | 'Cause you know, and then what happens is
00:56:42.840 | when someone reasons from first principles,
00:56:44.560 | they often fail and you're going out into the fog
00:56:47.640 | with no conventional wisdom to guide you.
00:56:49.320 | But when you succeed,
00:56:50.520 | what you notice is that everyone else turns and says,
00:56:52.200 | wait, what, what, what are they doing?
00:56:53.200 | What are they?
00:56:54.040 | And then they all, they flock over.
00:56:55.240 | Look at the iPhone.
00:56:56.480 | iPhone, you know, Steve Jobs was famously good
00:56:58.800 | at reasoning from first principles
00:57:00.120 | 'cause that guy had crazy self-confidence.
00:57:02.840 | He just said, you know, if I think this is right,
00:57:04.880 | like everyone, and that, I mean, I don't know how,
00:57:06.320 | I don't know how he does that.
00:57:07.520 | And, and I don't think Apple can do that anymore.
00:57:09.600 | I mean, they lost that, that one brain,
00:57:11.680 | his ability to do that, that made it
00:57:13.520 | a totally different company,
00:57:14.840 | even though there's tens of thousands of people there.
00:57:17.280 | He didn't say,
00:57:19.080 | and I'm giving a lot of credit to Steve Jobs,
00:57:21.560 | but of course it was a team at Apple,
00:57:23.840 | they didn't look at the flip phones and say,
00:57:26.320 | okay, what kind of, you know, let's make a keyboard
00:57:28.360 | that's like clicky and, you know,
00:57:29.400 | really cool Apple-y keyboard.
00:57:30.360 | They said, what should a mobile device be?
00:57:32.480 | You know, what, axioms, what are the axioms here?
00:57:35.680 | And none of them involved a keyboard necessarily.
00:57:37.520 | And by the time they pieced it up,
00:57:38.360 | there was no keyboard, 'cause it didn't make sense.
00:57:40.360 | Everyone suddenly is going, wait, what?
00:57:41.600 | What are they doing?
00:57:42.440 | And now every phone looks like the iPhone.
00:57:43.760 | I mean, that's, that's how it goes.
00:57:46.760 | - You tweeted, what's something
00:57:49.680 | you've changed your mind about?
00:57:51.400 | That's the question you've tweeted.
00:57:53.560 | Elon replied, brain transplants.
00:57:55.960 | Sam Harris responded, nuclear power.
00:57:58.040 | There's a bunch of people with cool responses there.
00:58:01.280 | In general, what are your thoughts
00:58:02.640 | about some of the responses?
00:58:03.840 | And what have you changed your mind about, big or small?
00:58:07.520 | Perhaps in doing the research for some of your writing.
00:58:10.600 | - So I'm writing right now, just finishing a book
00:58:13.840 | on kind of why our society is such a shit place
00:58:18.600 | at the moment, just polarized.
00:58:20.200 | And, you know, we have all these gifts,
00:58:21.880 | like we're talking about, just the supermarket.
00:58:23.520 | You know, we have these, it's exploding technology.
00:58:25.720 | Fewer and fewer people are in poverty.
00:58:28.060 | You know, it's, Louis C.K., you know, likes to say,
00:58:30.480 | you know, everything's amazing and no one's happy, right?
00:58:32.480 | But it's a really extreme moment right now,
00:58:34.960 | where it's like, hate is on the rise.
00:58:37.340 | Like, crazy things, right?
00:58:38.800 | - If I could interrupt briefly, you did tweet
00:58:41.720 | that you just wrote the last word.
00:58:43.360 | - I sure did.
00:58:44.320 | - And then there's some hilarious asshole who said,
00:58:46.920 | now you just have to work on all the ones in the middle.
00:58:49.840 | - Yeah, I earned that.
00:58:50.680 | I mean, when you earn a reputation
00:58:52.960 | as a tried and true procrastinator,
00:58:55.800 | you're just gonna get shit forever, and that's fine.
00:58:58.480 | I accept my fate there.
00:58:59.640 | - So do you mind sharing a little bit more
00:59:02.040 | about the details of what you're writing?
00:59:04.080 | - Yeah.
00:59:04.920 | - So you're, what, how do you approach this question
00:59:07.840 | about the state of society?
00:59:09.240 | - I wanted to figure out what was going on,
00:59:11.760 | because what I noticed was a bad trend.
00:59:15.040 | It's not that, you know, things are bad.
00:59:16.400 | It's that things are getting worse in certain ways.
00:59:19.360 | Not in every way.
00:59:20.200 | If you look at Max Roser's stuff, you know,
00:59:23.120 | he comes up with all these amazing graphs.
00:59:24.880 | This is what's weird, is that things are getting better
00:59:27.380 | in almost every important metric you can think of.
00:59:31.600 | Except the amount of people who hate other people
00:59:35.240 | in their own country, and the amount of people
00:59:37.780 | that hate their own country, the amount of Americans
00:59:40.880 | that hate America is on the rise, right?
00:59:42.800 | The amount of Americans that hate other Americans
00:59:45.580 | is on the rise.
00:59:47.160 | The amount of Americans that hate the president
00:59:49.120 | is on the rise, all these things,
00:59:50.200 | like on the very steep rise.
00:59:52.780 | So what the hell?
00:59:53.720 | What's going on?
00:59:54.560 | Like, there's something causing that.
00:59:56.120 | It's not that, you know, a bunch of new people were born
00:59:58.280 | who were just dicks.
00:59:59.280 | It's that something is going on.
01:00:01.280 | So I think of it as a very simple, oversimplified equation,
01:00:06.280 | human behavior is the output.
01:00:09.880 | And I think the two inputs are human nature
01:00:12.000 | and environment, right?
01:00:12.840 | And this is basic, you know, super, super kindergarten level
01:00:16.120 | like, you know, animal behavior.
01:00:18.440 | But I think it's worth thinking about.
01:00:20.080 | You've got human nature, which is not changing very much,
01:00:23.000 | right, and then you throw that nature
01:00:27.520 | into a certain environment,
01:00:29.000 | and it reacts to the environment, right?
01:00:30.800 | It's shaped by the environment.
01:00:32.520 | And then eventually what comes out is behavior, right?
01:00:36.440 | Human nature is not changing very much,
01:00:38.000 | but suddenly we're behaving differently, right?
01:00:40.280 | We are, again, you know, look at the polls.
01:00:42.560 | Like, it used to be that the president, you know,
01:00:45.120 | was liked by, I don't remember the exact numbers,
01:00:47.040 | but, you know, 80% or 70% of their own party,
01:00:51.160 | and, you know, 50% of the other party.
01:00:53.000 | And now it's like 40% of their own party
01:00:54.640 | and 10% of the other party, you know?
01:00:56.600 | And it's not that the presidents are getting worse,
01:00:59.200 | though maybe some people would argue that they are,
01:01:00.680 | but more so, and there have been a lot of, you know,
01:01:03.920 | idiot presidents throughout the years,
01:01:05.640 | what's going on is something in the environment is changing,
01:01:08.240 | and that's, that you're seeing is a change in behavior.
01:01:11.160 | An easy example here is that, you know,
01:01:12.760 | by a lot of metrics, racism is getting,
01:01:16.360 | is becoming less and less of a problem.
01:01:18.280 | You know, it's hard to measure, but there's metrics like,
01:01:21.840 | you know, how upset would you be
01:01:23.280 | if your kid married someone of another race?
01:01:26.520 | And that number is plummeting.
01:01:28.000 | But racial grievance is skyrocketing, right?
01:01:30.800 | There's a lot of examples like this.
01:01:32.000 | So I wanted to look around and say,
01:01:33.600 | and the reason I took it on,
01:01:34.720 | the reason I don't think this is just an unfortunate trend,
01:01:36.800 | unpleasant trend that hopefully we come out of,
01:01:38.880 | is that all this other stuff I like to write about,
01:01:40.440 | all this future stuff, right?
01:01:41.960 | And it's this magical, I always think of this,
01:01:43.720 | I'm very optimistic in a lot of ways,
01:01:45.280 | and I think that our world would be a utopia,
01:01:48.440 | would seem like actual heaven.
01:01:50.120 | Like whatever Thomas Jefferson was picturing as heaven,
01:01:53.640 | other than maybe the eternal life aspect,
01:01:55.440 | I think that if he came to 2021 US, it would be better.
01:01:59.160 | It's cooler than heaven.
01:02:00.880 | But we live in a place that's cooler than 1700s heaven.
01:02:03.840 | Again, other than the fact that we still die.
01:02:05.640 | Now, I think that future world
01:02:07.200 | actually probably would have quote, eternal life.
01:02:09.600 | I don't think anyone wants eternal life, actually,
01:02:11.960 | if people think they do.
01:02:13.280 | Eternal is a long time,
01:02:14.520 | but I think the choice to die when you want,
01:02:16.920 | maybe we're uploaded, maybe we can refresh our bodies,
01:02:19.400 | I don't know what it is.
01:02:20.240 | But the point is, I think about that utopia.
01:02:23.400 | And I do believe that if we don't botch this,
01:02:25.960 | we'd be heading towards somewhere
01:02:27.160 | that would seem like heaven, maybe in our lifetimes.
01:02:30.040 | Of course, if things go wrong,
01:02:32.920 | now think about the trends here.
01:02:34.920 | Just like the 20th century would seem like
01:02:36.980 | some magical utopia to someone from the 16th century,
01:02:39.680 | the bad things in the 20th century
01:02:43.880 | were kind of the worst things ever,
01:02:46.160 | in terms of just absolute magnitude.
01:02:47.960 | World War II, the biggest genocides ever.
01:02:52.060 | You've got maybe climate change,
01:02:55.060 | if it is the existential threat that many people think it is.
01:02:58.060 | I mean, we never had an existential threat
01:02:59.700 | on that level before.
01:03:01.020 | So the good is getting better and the bad's getting worse.
01:03:04.040 | And so what I think about the future,
01:03:05.400 | I think of us as in some kind of big,
01:03:08.200 | long canoe as a species.
01:03:10.960 | Five million mile long canoe,
01:03:13.840 | each of us sitting in a row.
01:03:14.960 | And we each have one oar,
01:03:16.320 | we can paddle on the left side or the right side.
01:03:18.540 | And what we know is there's a fork up there somewhere.
01:03:22.000 | And the river forks,
01:03:24.240 | and there's a utopia on one side
01:03:26.040 | and a dystopia on the other side.
01:03:27.240 | And I really believe that that's,
01:03:28.600 | we're probably not headed for just an okay future.
01:03:30.480 | It's just the way tech is exploding,
01:03:32.560 | it's probably gonna be really good or really bad.
01:03:34.240 | The question is, which side should we be rowing on?
01:03:35.840 | We can't see up there, right?
01:03:37.760 | But it really matters.
01:03:38.600 | So I'm writing about all this future stuff,
01:03:39.600 | and I'm saying none of this matters
01:03:40.660 | if we're squabbling our way
01:03:42.020 | into kind of like a civil war right now.
01:03:44.440 | So what's going on?
01:03:45.760 | - So it's a really important problem to solve.
01:03:48.600 | What are your sources of hope in this?
01:03:51.880 | So like, how do you steer the canoe?
01:03:54.920 | - One of my big sources of hope,
01:03:56.320 | and this is my answer to what I changed my mind on,
01:03:59.880 | is I think I always knew this,
01:04:01.700 | but it's easy to forget it.
01:04:03.680 | Our primitive brain does not remember this fact,
01:04:05.600 | which is that I don't think there are very many bad people.
01:04:10.600 | Now, you say bad, are there selfish people?
01:04:13.600 | Most of us, I think that if you think of people,
01:04:16.400 | there's digital languages, ones and zeros.
01:04:21.600 | And our primitive brain very quickly can get into the land
01:04:24.080 | where everyone's a one or a zero.
01:04:25.040 | Our tribe, we're all ones, we're perfect, I'm perfect,
01:04:27.400 | my family is that other family, it's that other tribe.
01:04:29.480 | There are zeros, and you dehumanize them, right?
01:04:31.840 | These people are awful.
01:04:33.920 | So zero is not a human place.
01:04:36.200 | No one's a zero and no one's a one.
01:04:37.680 | You're dehumanizing yourself.
01:04:39.240 | So when we get into this land,
01:04:41.080 | I call it political Disney world,
01:04:42.680 | 'cause the Disney movies have good guys,
01:04:44.800 | Scar is totally bad and Mufasa's totally good, right?
01:04:48.680 | You don't see Mufasa's character flaws.
01:04:50.360 | You don't see Scar's upbringing that made him like that,
01:04:53.000 | that humanizes him, no, lionizes him, whatever.
01:04:55.560 | You are-- - Well done.
01:04:57.360 | - Yeah. (laughing)
01:04:58.600 | Mufasa's a one and Scar's a zero, very simple.
01:05:01.820 | So political Disney world is a place,
01:05:03.880 | a psychological place that all of us have been in.
01:05:06.680 | And it can be religious Disney world,
01:05:08.200 | it can be national Disney world,
01:05:09.680 | and war, whatever it is,
01:05:11.040 | but it's a place where we fall into this delusion
01:05:13.400 | that there are protagonists and antagonists
01:05:14.880 | and that's it, right?
01:05:15.740 | That is not true.
01:05:17.000 | We are all 0.5s or maybe 0.6s to 0.4s
01:05:20.280 | in that we are also, on one hand,
01:05:22.320 | it's not that, I don't think there's that many
01:05:23.360 | really great people, frankly.
01:05:24.860 | I think if you get into it,
01:05:26.240 | people are kind of, a lot of people,
01:05:27.840 | most of us have, if you get really into
01:05:29.760 | our most shameful memories,
01:05:31.200 | the things we've done that are worse,
01:05:32.440 | the most shameful thoughts, the deep selfishness
01:05:34.440 | that some of us have in areas we wouldn't want to admit,
01:05:36.840 | right, most of us have a lot of unadmirable stuff, right?
01:05:40.580 | On the other hand, if you actually got into,
01:05:42.480 | really got into someone else's brain
01:05:44.240 | and you looked at their upbringing
01:05:45.320 | and you looked at the trauma that they've experienced
01:05:47.120 | and then you looked at the insecurities they have
01:05:49.640 | and you look at all their,
01:05:50.480 | if you assembled a highlight reel
01:05:52.000 | of your worst moments,
01:05:53.640 | the meanest things you've ever done,
01:05:55.520 | the worst, the most selfish,
01:05:56.680 | the times you stole something, whatever,
01:05:58.480 | and you just, people think,
01:05:59.320 | "Wow, Lex is an awful person."
01:06:00.960 | If you highlighted your,
01:06:01.800 | if you did a montage of your best moments,
01:06:03.620 | people would say, "Oh, he's a god," right?
01:06:05.000 | But of course, we all have both of those.
01:06:06.800 | So, I've started to really try to remind myself
01:06:11.520 | that everyone's a 0.5, right?
01:06:13.720 | And 0.5s are all worthy of criticism
01:06:15.920 | and we're all worthy of compassion.
01:06:17.720 | And the thing that makes me hopeful
01:06:19.840 | is that I really think that,
01:06:21.480 | there's a bunch of 0.5s,
01:06:22.640 | and 0.5s are good enough
01:06:24.720 | that we should be able to create a good society together.
01:06:26.480 | There's a lot of love in every human.
01:06:28.040 | And I think there's more love in humans than hate.
01:06:30.540 | You know, I always remember this moment.
01:06:34.320 | This is a weird anecdote,
01:06:35.560 | but I'm a Red Sox fan, Boston Red Sox baseball,
01:06:38.000 | and Derek Jeter is who we hate the most.
01:06:40.440 | He's on the Yankees.
01:06:41.360 | - Yes.
01:06:42.400 | - And hate, right?
01:06:44.320 | Ugh, Jeter, right?
01:06:45.840 | He was his last game in Fenway, he's retiring.
01:06:47.960 | And he got this rousing standing ovation
01:06:49.800 | and I almost cried.
01:06:51.120 | And it was like, what is going on?
01:06:52.480 | We hate this guy, but actually,
01:06:53.800 | there's so much love in all humans.
01:06:56.080 | It felt so good to just give a huge cheer
01:06:58.640 | to this guy we hate
01:06:59.480 | because it's like this moment of a little fist pound,
01:07:02.040 | being like, of course we all actually love each other.
01:07:04.040 | And I think there's so much of that.
01:07:06.200 | And so, the thing that I think I've come around on
01:07:08.080 | is I think that we are in an environment
01:07:11.160 | that's bringing out really bad stuff.
01:07:13.280 | I don't think it's,
01:07:14.100 | if I thought it was the people,
01:07:15.800 | I would be less hopeful.
01:07:16.640 | Like, if I thought it was human nature,
01:07:17.640 | I'd be more upset.
01:07:19.720 | It's the two independent variables here,
01:07:23.120 | or there's a fixed variable,
01:07:24.480 | there's a constant, which is human nature,
01:07:25.840 | and there's the independent variable, environment,
01:07:27.680 | and then behavior is the dependent variable.
01:07:30.480 | I like that the thing that I think is bad
01:07:32.360 | is the independent variable, the environment.
01:07:34.520 | Which means I think the environment can get better.
01:07:36.840 | And there's a lot of things I can go into
01:07:38.040 | about why the environment I think is bad,
01:07:39.800 | but I have hope because I think the thing that's bad for us
01:07:42.760 | is something that can change.
01:07:44.520 | - The first principle's idea here
01:07:46.720 | is that most people have the capacity
01:07:48.320 | to be a 0.7 to a 0.9
01:07:51.640 | if the environment is properly calibrated
01:07:56.640 | with the right incentives.
01:07:57.960 | - I think that, well, I think that maybe if we're all,
01:08:00.200 | yeah, if we're all 0.5s,
01:08:01.680 | I think that environments can bring out our good side.
01:08:05.320 | You know, yes, if maybe we're all
01:08:06.400 | on some kind of distribution.
01:08:08.880 | And the right environment can, yes,
01:08:11.000 | can bring out our higher sides.
01:08:12.720 | And I think in a lot of ways you could say it has.
01:08:15.080 | I mean, the US environment, we take for granted
01:08:18.880 | the liberal laws and liberal environment that we live in.
01:08:23.000 | I mean, like in New York City, right,
01:08:26.240 | if you walk down the street and you like assault someone,
01:08:29.360 | A, if anyone sees you, they're probably gonna yell at you.
01:08:31.000 | You might get your ass kicked by someone for doing that.
01:08:32.920 | You also might end up in jail, you know,
01:08:35.640 | if it's security cameras.
01:08:37.080 | And there's just norms.
01:08:38.040 | You know, we're all trained.
01:08:38.880 | That's what awful people do, right?
01:08:40.320 | So there's, it's not that human nature doesn't have it in it
01:08:43.000 | to be like that.
01:08:44.280 | It's that this environment we're in has made that
01:08:47.080 | a much, much, much smaller experience for people.
01:08:50.120 | There's so many examples like that where it's like, man,
01:08:52.160 | you don't realize how much of the worst human nature
01:08:54.520 | is contained by our environment.
01:08:56.200 | And, but I think that, you know,
01:08:58.800 | rapidly changing environment,
01:09:00.240 | which is what we have right now, social media starts.
01:09:02.240 | I mean, what a seismic change to the environment.
01:09:04.680 | There's a lot of examples like that.
01:09:05.760 | Rapidly changing environment
01:09:06.920 | can create rapidly changing behavior.
01:09:09.240 | And wisdom sometimes can't keep up.
01:09:11.680 | And so we, you know, we can,
01:09:14.240 | we can really kind of lose our grip
01:09:15.800 | on some of the good behavior.
01:09:17.280 | - Were you surprised by Elon's answer about brain transplants
01:09:21.760 | or Sam's about nuclear power?
01:09:23.320 | Or anything else just--
01:09:24.800 | - Sam's I think is, I have a friend,
01:09:27.760 | Isabelle Boemeke, who has a,
01:09:29.780 | who's a nuclear power, you know, influencer.
01:09:33.120 | I've become very convinced.
01:09:34.920 | And I've not done my deep dive on this.
01:09:37.960 | But here's, in this case,
01:09:39.480 | this is reasoning by analogy here.
01:09:41.460 | The amount of really smart people I respect,
01:09:45.120 | who all, who seem to have dug in,
01:09:46.960 | who all say nuclear power is clearly a good option.
01:09:50.000 | It's obviously emission free,
01:09:51.120 | but you know, the concerns about meltdowns and waste,
01:09:54.160 | they see it, they say, completely overblown.
01:09:56.880 | So judging from those people, secondary knowledge here,
01:10:00.260 | I will say I'm a strong advocate.
01:10:02.760 | I haven't done my own deep dive yet,
01:10:04.240 | but it does seem like a little bit odd
01:10:06.520 | that you've got people who are so concerned
01:10:09.200 | about climate change,
01:10:10.400 | who have, it seems like it's kind of an ideology
01:10:15.200 | where nuclear power doesn't fit,
01:10:17.200 | rather than rational, you know, fear of climate change
01:10:20.360 | that somehow is anti-nuclear power.
01:10:22.200 | It just, yeah.
01:10:23.200 | - I personally am uncomfortably reasoning
01:10:25.840 | by analogy with climate change.
01:10:27.960 | I actually have not done a deep dive myself.
01:10:29.640 | - Me neither, because it's so,
01:10:31.840 | man, it seems like a deep dive.
01:10:33.320 | - Yeah.
01:10:34.760 | And my reasoning by analogy there
01:10:37.240 | currently has me thinking it's a truly existential thing,
01:10:40.600 | but feeling hopeful.
01:10:41.840 | - So let me, this is me speaking,
01:10:44.360 | and this is speaking from a person
01:10:45.760 | who's not done the deep dive.
01:10:47.240 | I'm a little suspicious
01:10:50.500 | of the amount of fear mongering going on.
01:10:52.680 | I've, especially over the past couple of years,
01:10:54.720 | I've gotten uncomfortable with fear mongering
01:10:56.640 | in all walks of life.
01:10:59.140 | There's way too many people interested
01:11:01.520 | in manipulating the populace with fear.
01:11:04.240 | And so I don't like it.
01:11:05.480 | I should probably do a deep dive,
01:11:06.920 | 'cause to me it's, well, the big problem
01:11:10.640 | with the opposition to climate change,
01:11:13.160 | or whatever the fear mongering is,
01:11:15.160 | that it also grows the skepticism in science broadly.
01:11:19.720 | - Yeah.
01:11:20.560 | - It's like, and that,
01:11:21.380 | so I need to make sure I do that deep dive.
01:11:23.480 | I have listened to a few folks
01:11:25.120 | who kind of criticize the fear mongering
01:11:27.840 | and all those kinds of things,
01:11:29.120 | but they're few and far in between.
01:11:30.840 | And so it's like, all right, what is the truth here?
01:11:33.800 | And it feels lazy.
01:11:35.280 | But it also feels like it's hard to get to the,
01:11:38.480 | like there's a lot of kind of activists talking about idea
01:11:43.040 | versus sources of objective,
01:11:48.040 | like calm, first principles type reasoning.
01:11:52.440 | Like one of the things,
01:11:54.080 | I know it's supposed to be a very big problem,
01:11:57.880 | but when people talk about catastrophic effects
01:12:00.000 | of climate change,
01:12:01.680 | I haven't been able to like see really great deep analysis
01:12:06.360 | of what that looks like in 10, 20, 30 years,
01:12:09.480 | raising rising sea levels.
01:12:12.400 | What are the models of how that changes human behavior,
01:12:16.960 | society, what are the things that happen?
01:12:19.260 | There's going to be constraints on the resources
01:12:21.620 | and people are gonna have to move around.
01:12:23.320 | This is happening gradually.
01:12:24.900 | Are we gonna be able to respond to this?
01:12:26.720 | How would we respond to this?
01:12:27.840 | What are the best,
01:12:28.920 | like what are the best models for how everything goes wrong?
01:12:32.280 | Again, I was, this is a question I keep starting
01:12:36.180 | to ask myself without doing any research,
01:12:38.640 | like motivating myself to get up to this deep dive
01:12:43.400 | that I feel is deep.
01:12:45.040 | Just watching people not do a great job
01:12:47.200 | with that kind of modeling with the pandemic
01:12:49.480 | and sort of being caught off guard and wondering,
01:12:52.640 | okay, if we're not good with this pandemic,
01:12:55.140 | how are we going to respond to other kinds of tragedies?
01:12:57.560 | Well, this is part of why I wrote the book,
01:12:59.440 | 'cause I said, we're going to have more and more of these,
01:13:04.440 | big collective, what should we do here situations?
01:13:08.200 | Whether it's, how about when,
01:13:09.720 | we're probably not that far away from people being able
01:13:11.560 | to go and decide the IQ of their kid
01:13:14.520 | or make a bunch of embryos and actually pick
01:13:17.480 | the highest IQ.
01:13:18.320 | - What can possibly go wrong?
01:13:20.000 | - Yeah, and also imagine the political sides of that
01:13:23.000 | and that's something only wealthy people can afford at first
01:13:25.480 | and just a nightmare, right?
01:13:27.340 | We need to be able to have our wits about us as a species
01:13:29.620 | where we can actually get into a topic like that
01:13:32.880 | and come up with, where the collective brain can be smart.
01:13:36.720 | I think that there are certain topics where I think of this,
01:13:40.760 | and this is again, another simplistic model,
01:13:42.440 | but I think it works, is that there's a higher mind
01:13:44.080 | and a primitive mind, right?
01:13:45.200 | You can, in your head.
01:13:47.240 | And these team up with others.
01:13:48.920 | So when the higher minds are,
01:13:50.160 | and a higher mind is more rational and puts out ideas
01:13:53.000 | that it's not attached to.
01:13:54.040 | And so it can change its mind easily
01:13:56.440 | 'cause it's just an idea and the higher mind
01:13:58.460 | can get criticized.
01:14:00.400 | Their ideas can get criticized and it's no big deal.
01:14:02.880 | And so when the higher minds team up,
01:14:05.040 | it's like all these people in the room
01:14:06.480 | like throwing out ideas and kicking them
01:14:07.920 | and one idea goes out and everyone criticizes it,
01:14:10.040 | which is like shooting bows and arrows at it.
01:14:11.800 | And the truth, the true idea is,
01:14:16.920 | the arrows bounce off and so it rises up.
01:14:16.920 | And the other ones get shot down.
01:14:18.300 | So it's this incredible system.
01:14:19.520 | This is what good science institution is,
01:14:22.540 | is someone puts out a thing, criticism arrows come at it
01:14:25.320 | and most of them fall, and the needles in the haystack
01:14:29.040 | end up rising up, right?
01:14:30.040 | Incredible mechanism.
01:14:30.920 | So what that's happening is a bunch of people,
01:14:32.520 | a bunch of flawed medium scientists
01:14:34.800 | are creating super intelligence.
01:14:36.500 | Then there's the primitive mind,
01:14:40.480 | which is the more limbic systemy part of our brain.
01:14:44.160 | It's the part of us that is very much not living in 2021.
01:14:48.720 | It's living many tens of thousands of years ago.
01:14:51.360 | And it does not treat ideas like this separate thing.
01:14:54.280 | It identifies with its ideas.
01:14:56.400 | It only gets involved when it finds an idea sacred.
01:14:59.920 | It starts holding an idea sacred and it starts identifying.
01:15:02.080 | So what happens is they team up too.
01:15:04.440 | And so when you have a topic that a bunch of primitive,
01:15:08.360 | that really rouses a bunch of primitive minds,
01:15:10.660 | it quickly, the primitive minds team up
01:15:14.920 | and they create an echo chamber
01:15:16.680 | where suddenly no one can criticize this.
01:15:18.480 | And in fact, if it's powerful enough,
01:15:20.680 | people outside the community, no one can criticize it.
01:15:22.840 | We will get your paper retracted.
01:15:24.040 | We will get you fired, right?
01:15:25.840 | That's not higher mind behavior.
01:15:27.120 | That is crazy primitive mind.
01:15:28.560 | And so now what happens is the collective
01:15:30.600 | becomes dumber than an individual,
01:15:32.640 | dumber than a single reasoning individual.
01:15:34.940 | You have this collective is suddenly attached
01:15:37.120 | to this sacred scripture with the idea
01:15:40.800 | and they will not change their mind
01:15:43.200 | and they get dumber and dumber.
01:15:44.300 | And so climate change, what's worrisome
01:15:47.300 | is that climate change has in many ways
01:15:49.540 | become a sacred topic,
01:15:50.980 | where if you come up with a nuanced thing,
01:15:52.560 | you might get called branded a denier.
01:15:54.640 | So there goes the super intelligence,
01:15:57.680 | all the arrows, no arrows can be fired.
01:15:59.560 | But if you get called a denier,
01:16:01.040 | that's a social penalty for firing an arrow
01:16:03.480 | at a certain orthodoxy.
01:16:05.160 | And so what's happening is the big brain gets frozen
01:16:08.080 | and it becomes very stupid.
01:16:08.920 | Now, you can also say that
01:16:10.560 | about a lot of other topics right now.
01:16:12.600 | You just mentioned another one, I forget what it was,
01:16:16.600 | but that's also kind of like this.
01:16:19.040 | - The world of vaccine.
01:16:20.320 | - Yeah, COVID.
01:16:21.600 | - And here's my point earlier is that
01:16:23.360 | what I see is that the political divide
01:16:25.760 | has like a whirlpool that's pulling everything into it.
01:16:28.360 | And in that whirlpool,
01:16:29.800 | thinking is done with the primitive mind tribes.
01:16:34.880 | And so I get, okay, obviously something like race,
01:16:38.520 | that makes sense, that also right now,
01:16:40.240 | the topic of race, for example, or gender,
01:16:42.200 | these things are in the whirlpool.
01:16:43.860 | But that at least is like, okay,
01:16:45.680 | that's something that the primitive mind
01:16:47.080 | would always get really worked up about.
01:16:49.160 | It taps into like our deepest kind of like primal selves.
01:16:52.320 | COVID, maybe COVID in a way too,
01:16:56.080 | but climate change,
01:16:57.120 | like that should just be something
01:16:57.960 | that our rational brains are like,
01:16:58.920 | let's solve this complex problem.
01:17:00.380 | But the problem is that it's all gotten sucked
01:17:02.400 | into the red versus blue whirlpool.
01:17:04.300 | And once that happens,
01:17:05.140 | it's in the hands of the primitive minds.
01:17:06.360 | And we're losing our ability to be wise together,
01:17:09.200 | to make decisions.
01:17:10.200 | It's like the big species brain is like,
01:17:13.360 | or the big American brain is like,
01:17:15.240 | drunk at the wheel right now.
01:17:16.760 | And we're about to go into a future
01:17:18.180 | with more and more big technologies,
01:17:21.160 | scary things, we have to make big, right decisions.
01:17:24.160 | We're getting dumber as a collective,
01:17:26.080 | and that's part of this environmental problem.
01:17:28.440 | - So within the space of technologists
01:17:30.360 | and the space of scientists, we should allow the arrows.
01:17:33.320 | That's one of the saddest things to me,
01:17:35.080 | is that with the scientists, I've seen arrogance.
01:17:40.000 | There's a lot of mechanisms that maintain the tribe.
01:17:42.460 | It's the arrogance, it's how you build up this wall
01:17:46.520 | that defends against the arrows.
01:17:50.120 | It's arrogance, credentialism,
01:17:54.320 | like just ego, really.
01:17:58.200 | And then just, it protects you
01:17:59.580 | from actually challenging your own ideas.
01:18:01.960 | which is the ideal of science that makes science beautiful.
01:18:05.680 | In a time of fear, and in a time of division
01:18:09.600 | created by perhaps politicians that leverage the fear,
01:18:12.900 | it, like you said, makes the whole system dumber.
01:18:16.220 | The science system dumber,
01:18:17.880 | the tech developer system dumber,
01:18:22.720 | if they don't allow the challenging of ideas.
01:18:25.000 | - What's really bad is that like, in a normal environment,
01:18:28.640 | you're always gonna have echo chambers.
01:18:30.400 | And so what's the opposite of an echo chamber?
01:18:32.360 | I created a term for it, 'cause I think we need it,
01:18:34.140 | which is called an idea lab.
01:18:35.700 | An idea lab, right?
01:18:36.540 | It's like people treat,
01:18:37.720 | it's like people act like scientists,
01:18:38.960 | even if they're not doing science.
01:18:40.480 | They just treat their ideas like science experiments,
01:18:43.440 | and they toss them out there, and everyone disagrees.
01:18:45.360 | And disagreement is like the game.
01:18:46.940 | Everyone likes to disagree.
01:18:48.020 | I'm on a certain text thread where
01:18:50.900 | it's almost like someone throws something out,
01:18:52.260 | and it's just an impulse for the rest of the group to say,
01:18:54.140 | I think you're being overly general there.
01:18:56.620 | Or I think, aren't you kind of being,
01:18:58.340 | I think that's like your bias showing.
01:19:00.180 | And it's like, no one's getting offended,
01:19:01.620 | because it's like, we're all just messing,
01:19:02.780 | we all, of course, respect each other, obviously.
01:19:04.900 | We're just trashing each other's ideas,
01:19:08.460 | and that the whole group becomes smarter.
01:19:12.860 | You're always gonna have idea labs and echo chambers,
01:19:12.860 | in different communities.
01:19:13.680 | But most of us participate in both of them.
01:19:15.700 | And maybe your marriage is a great idea lab,
01:19:18.060 | you love to disagree with your spouse.
01:19:20.020 | But maybe in this group of friends,
01:19:21.980 | or your family at home, you know,
01:19:23.580 | in front of that sister, you do not bring up politics,
01:19:25.660 | because when that happens,
01:19:27.580 | her bullying forces the whole room
01:19:30.300 | to be an echo chamber to appease her.
01:19:32.880 | Now, what scares me is that usually you have
01:19:34.940 | these things existing kind of in bubbles,
01:19:36.940 | and they each have
01:19:38.780 | their natural defenses
01:19:40.060 | against each other.
01:19:41.220 | An echo chamber person stays in their echo chamber.
01:19:44.260 | They don't like, they will cut you out.
01:19:45.740 | They don't like to be friends with people
01:19:47.320 | who disagree with them.
01:19:48.160 | You notice that, they will cut you out.
01:19:49.220 | They'll cut out their parents,
01:19:50.300 | if they voted for Trump or whatever, right?
01:19:51.660 | So, that's how they do it.
01:19:54.460 | They will say, I'm gonna stay inside
01:19:56.080 | of an echo chamber safely.
01:19:57.220 | So my ideas, which I identify with,
01:19:59.480 | because my primitive mind is doing the thinking,
01:20:02.500 | are not gonna ever have to get challenged,
01:20:04.100 | 'cause it feels so scary and awful for that to happen.
01:20:07.400 | But if they leave and go into an idea lab
01:20:09.140 | environment, people are gonna disagree,
01:20:10.500 | and when the person tries to bully
01:20:11.820 | and says, that's really offensive,
01:20:13.240 | people are gonna say, no, it's not,
01:20:15.660 | and the echo chamber person
01:20:16.580 | is gonna immediately say,
01:20:17.960 | these people are assholes, right?
01:20:19.140 | So the echo chamber person, it doesn't have much power
01:20:23.260 | once they leave the echo chamber.
01:20:24.620 | Likewise, the idea lab person,
01:20:26.140 | they have this great environment,
01:20:27.260 | but if they go into an echo chamber
01:20:28.580 | where everyone else is, and they do that,
01:20:29.980 | they will get kicked out of the group.
01:20:30.980 | They will get branded as something,
01:20:32.460 | a denier, a racist, a right-winger, a radical,
01:20:36.660 | these nasty words.
01:20:39.980 | The thing that I don't like right now
01:20:41.460 | is that the echo chambers have found ways
01:20:44.740 | to forcefully expand into places that normally
01:20:49.740 | have a pretty good immune system against echo chambers,
01:20:52.440 | like universities, like science journals,
01:20:54.980 | places where usually it's like there's a strong
01:20:57.260 | idea lab culture. Veritas,
01:20:59.020 | you know, that's an idea lab slogan.
01:21:02.700 | What you have is that
01:21:06.140 | a lot of people have found a way to actually go out
01:21:08.620 | of their thing and keep their echo chamber
01:21:10.260 | by making sure that everyone is scared
01:21:11.980 | 'cause they can punish anyone,
01:21:13.060 | whether you're in their community or not.
01:21:15.100 | - That's all brilliantly put.
01:21:19.180 | When's the book coming out?
01:21:20.780 | Any idea?
01:21:21.620 | - June, July, we're not quite sure yet.
01:21:24.020 | - Okay, I can't wait.
01:21:25.420 | - Thanks. - It's awesome.
01:21:26.380 | Do you have a title yet or you can't talk about that?
01:21:28.220 | - Still working on it. - Okay.
01:21:30.580 | If it's okay, just a couple of questions from Mailbag.
01:21:32.980 | I just love these.
01:21:34.300 | I would love to hear you riff on these.
01:21:37.500 | So one is about film and music.
01:21:39.580 | Why do we prefer to watch, the question goes,
01:21:41.940 | why do we prefer to watch a film we haven't watched before,
01:21:45.560 | but we want to listen to songs
01:21:47.540 | that we have heard hundreds of times?
01:21:50.060 | This question and your answer really started
01:21:52.300 | to make me think like, yeah, that's true.
01:21:54.440 | That's really interesting.
01:21:55.740 | Like we draw that line somehow.
01:21:59.100 | So what's the answer?
01:22:00.240 | - So I think, let's use these two minds again.
01:22:02.700 | I think that when your higher mind is the one
01:22:04.940 | who's taking something in and they're really interested
01:22:07.140 | in what are the lyrics or I'm gonna learn something
01:22:08.780 | or reading a book or whatever,
01:22:11.500 | and the higher mind is trying to get information,
01:22:15.820 | and once it has it, there's no point in listening to it again
01:22:17.900 | it has the information.
01:22:19.060 | Your rational brain is like, I got it.
01:22:21.960 | But when you eat a good meal or have sex or whatever,
01:22:27.420 | that's something you can do again and again
01:22:28.780 | because it actually, your primitive brain loves it, right?
01:22:32.700 | And it never gets bored of things that it loves.
01:22:35.380 | So I think music is a very primal thing.
01:22:38.140 | I think music goes right into our primitive brain.
01:22:40.540 | I think it's of course, it's a collaboration.
01:22:43.900 | Your rational brain is absorbing the actual message,
01:22:47.500 | but I think it's all about emotions
01:22:48.940 | and even more than emotions, it literally,
01:22:51.460 | music taps into some very, very deep,
01:22:55.340 | primal part of us.
01:22:59.140 | And so when you hear a song once,
01:23:02.060 | even some of your favorite songs,
01:23:03.280 | the first time you heard it, you were like,
01:23:04.340 | I guess that's kind of catchy, yeah.
01:23:06.280 | And then you end up loving it on the 10th listen,
01:23:10.220 | but sometimes you even don't even like a song,
01:23:11.620 | you're like, oh, this song sucks.
01:23:12.540 | But suddenly you find yourself on the 40th time
01:23:15.260 | 'cause it's on the radio all the time,
01:23:16.240 | just kind of being like, oh, I love this song.
01:23:17.420 | And you're like, wait, I hate this song.
01:23:19.700 | And what's happening is that the sound is actually,
01:23:23.060 | music's actually carving a pathway in your brain.
01:23:27.220 | And it's a dance.
01:23:29.140 | And when your brain knows what's coming,
01:23:30.940 | it can dance, it knows the steps.
01:23:33.060 | So your brain is, your internal kind of,
01:23:35.740 | your brain is actually dancing with the music
01:23:37.500 | and it knows the steps and it can anticipate.
01:23:39.780 | And so there's something about knowing,
01:23:45.020 | having memorized the song
01:23:46.140 | that makes it incredibly enjoyable to us.
01:23:48.940 | But when we hear it for the first time,
01:23:49.980 | we don't know where it's gonna go.
01:23:50.820 | We're like an awkward dancer, we don't know the steps.
01:23:52.380 | And your primitive brain can't really have that much fun yet.
01:23:55.220 | That's how I feel.
01:23:56.060 | - And in the movies, that's more, that's less primitive.
01:23:59.820 | That's a story.
01:24:01.500 | You're taking in.
01:24:03.620 | - But a really good movie that we really love,
01:24:05.660 | often we will watch it like 12 times.
01:24:07.220 | You know, it's to like it.
01:24:08.340 | Not that many, but versus if you're watching a talk show,
01:24:11.620 | right, listening to, if you're listening to a pod,
01:24:13.300 | one of your podcasts is a perfect example.
01:24:15.260 | There's not many people that will listen to
01:24:17.020 | one of your podcasts, no matter how good it is, 12 times.
01:24:19.060 | Because it's, once you've got it, you got it.
01:24:21.380 | It's a form of information that's very higher mind focused.
01:24:25.300 | That's how I feel.
01:24:26.140 | - Well, you know, the funny thing is,
01:24:28.180 | there is people that listen to a podcast episode,
01:24:31.140 | many, many times.
01:24:32.180 | And often I think the reason for that
01:24:33.980 | is not because of the information,
01:24:35.320 | it's the chemistry, it's the music of the conversation.
01:24:38.100 | So it's not the actual--
01:24:38.940 | - It's just the art of it they like.
01:24:40.300 | - Yeah, they'll fall in love with some kind of person,
01:24:42.360 | some weird personality, and they'll just be listening to,
01:24:45.340 | they'll be captivated by the beat of that kind of person.
01:24:47.780 | - Or like a standup comic.
01:24:48.620 | I've watched like certain things,
01:24:49.700 | like episodes like 20 times, even though I, you know.
01:24:52.340 | - I have to ask you about the wizard hat.
01:24:56.460 | You wrote a blog about Neuralink.
01:24:58.100 | I got a chance to visit Neuralink a couple of times,
01:25:00.500 | hanging out with those folks.
01:25:01.940 | That was one of the pieces of writing you did
01:25:06.540 | that like changes culture
01:25:09.260 | and changes the way people think about a thing.
01:25:12.060 | The ridiculousness of your stick figure drawings
01:25:15.860 | is somehow, it's like, you know,
01:25:19.460 | it's like calling the origin of the universe, the Big Bang.
01:25:23.340 | It's a silly title, but it somehow sticks
01:25:25.860 | as the representative of that.
01:25:28.980 | In the same way, the wizard hat for the Neuralink
01:25:30.980 | was somehow a really powerful way to explain that.
01:25:35.340 | You actually proposed that the man of the year
01:25:37.780 | cover of Time should be--
01:25:39.740 | - One of my drawings.
01:25:40.580 | - One of your drawings.
01:25:41.420 | - In general, yes.
01:25:42.240 | - It's an outrage that it wasn't.
01:25:43.260 | - It was.
01:25:44.180 | - Okay, so what are your thoughts
01:25:47.700 | about like all those years later about Neuralink?
01:25:50.740 | Do you find this idea, like what excites you about it?
01:25:54.220 | Is it the big long-term philosophical things?
01:25:56.860 | Is it the practical things?
01:25:58.340 | Do you think it's super difficult to do
01:26:00.540 | on the neurosurgery side and the material engineering,
01:26:03.660 | the robotics side?
01:26:05.140 | Or do you think the machine learning side
01:26:08.220 | for the brain-computer interfaces
01:26:09.540 | where they get to learn about each other,
01:26:11.220 | all that kind of stuff?
01:26:12.060 | I would just love to get your thoughts
01:26:13.300 | 'cause you're one of the people
01:26:14.260 | that really considered this problem,
01:26:17.060 | really studied it, brain-computer interfaces.
01:26:19.620 | - I mean, I'm super excited about it.
01:26:21.500 | I really think it's actually Elon's most ambitious thing,
01:26:27.400 | more than colonizing Mars,
01:26:28.740 | because that's just a bunch of people going somewhere,
01:26:31.140 | even though it's somewhere far.
01:26:33.140 | Neuralink is changing what a person is eventually.
01:26:37.700 | Now, I think that Neuralink engineers and Elon himself
01:26:42.420 | would all be the first to admit that it is a maybe,
01:26:45.620 | whether they can achieve their goals here.
01:26:46.940 | I mean, it is so crazy ambitious to try to,
01:26:50.300 | I mean, their eventual goals are,
01:26:52.260 | of course, in the interim,
01:26:54.020 | they have a higher probability
01:26:55.100 | of accomplishing smaller things,
01:26:56.180 | which are still huge, like basically solving paralysis,
01:26:59.620 | strokes, Parkinson's, things like that.
01:27:02.600 | I mean, it can be unbelievable.
01:27:04.640 | Even for anyone who doesn't have one of these conditions,
01:27:06.440 | everyone should be very happy
01:27:08.840 | about this kind of help with different disabilities.
01:27:13.720 | But the thing that is so,
01:27:18.240 | the grand goal is this augmentation
01:27:21.440 | where you take someone who's totally healthy
01:27:26.220 | and you put a brain-machine interface in anyway,
01:27:26.220 | to give them superpowers.
01:27:27.700 | The possibilities if they can do this,
01:27:32.300 | if they can really, so they've already shown
01:27:35.460 | that they are for real, they've created this robot.
01:27:38.580 | Elon talks about it should be like LASIK,
01:27:40.700 | where it's not, it shouldn't be something
01:27:43.140 | that needs a surgeon.
01:27:44.260 | This shouldn't just be for rich people
01:27:46.540 | who have waited in line for six months.
01:27:48.220 | It should be for anyone who can afford LASIK
01:27:50.300 | and eventually, hopefully, something that's covered by
01:27:53.180 | insurance, or something that anyone can do.
01:27:56.320 | Something this big a deal should be something
01:27:57.760 | that anyone can afford eventually.
01:28:00.200 | And when we have this, again,
01:28:03.160 | I'm talking about a very advanced phase down the road.
01:28:05.720 | So maybe a less advanced phase,
01:28:07.680 | maybe right now, think about when you listen to a song,
01:28:13.280 | what's happening?
01:28:14.760 | Do you actually hear the sound?
01:28:16.760 | Well, not really.
01:28:18.600 | It's that the sound is coming out of the speaker.
01:28:22.280 | The speaker is vibrating.
01:28:23.540 | It's vibrating air molecules.
01:28:25.120 | Those air molecules get vibrated all the way
01:28:27.860 | to your head, the pressure wave,
01:28:31.780 | and then it vibrates your eardrum.
01:28:34.540 | Your eardrum is really the speaker now in your head
01:28:37.100 | that then vibrates bones and fluid,
01:28:40.360 | which then stimulates neurons in your auditory cortex,
01:28:44.880 | which give you the perception that you're hearing sound.
01:28:49.840 | Now, if you think about that,
01:28:52.280 | do we really need to have a speaker to do that?
01:28:55.160 | You could just somehow,
01:28:56.120 | if you had a little tiny thing
01:28:56.960 | that could vibrate eardrums, you could do it that way.
01:28:59.320 | That seems very hard, but really,
01:29:02.200 | if you go to the very end,
01:29:03.280 | the thing that really needs to happen
01:29:05.040 | is your auditory cortex neurons need to be stimulated
01:29:08.000 | in a certain way.
01:29:09.680 | If you have a ton of Neuralink things in there,
01:29:11.960 | Neuralink electrodes,
01:29:13.440 | and they get really good at stimulating things,
01:29:15.960 | you could play a song in your head that you hear
01:29:18.760 | that is not playing anywhere.
01:29:20.520 | There's no sound in the room,
01:29:21.920 | but you hear it and no one else could.
01:29:23.760 | It's not like they can get close to your head and hear it.
01:29:25.280 | There's no sound.
01:29:26.120 | They could not hear anything, but you hear sound.
01:29:28.360 | You can turn it up.
01:29:29.200 | So you open your phone, you have the Neuralink app.
01:29:31.320 | You open the Neuralink app,
01:29:32.640 | and so basically you can open your Spotify
01:29:35.640 | and you can play to your speaker,
01:29:38.800 | you can play to your computer,
01:29:39.680 | you can play right out of your phone to your headphones,
01:29:41.280 | or you now have a new one.
01:29:43.400 | You can play into your brain.
01:29:44.680 | And this is one of the earlier things.
01:29:47.160 | This is something that seems really doable.
01:29:50.640 | So no more headphones.
01:29:52.280 | I always think it's so annoying
01:29:53.320 | 'cause I can leave the house with just my phone,
01:29:56.760 | and nothing else, or even just an Apple Watch.
01:29:58.440 | But there's always this one thing,
01:29:59.280 | I'm like, and headphones.
01:30:00.120 | You do need your headphones, right?
01:30:01.440 | So I feel like that'll be the end of that.
01:30:03.080 | But there's so many things that you,
01:30:04.600 | and you keep going, the ability to think together.
01:30:08.560 | You can talk about super brains.
01:30:10.320 | I mean, one of the examples Elon uses
01:30:12.760 | is the low bandwidth of speech.
01:30:16.040 | If I go to a movie and I come out of a scary movie
01:30:19.360 | and you say, how was it?
01:30:20.200 | I say, oh, it was terrifying.
01:30:21.160 | Well, what did I just do?
01:30:22.960 | I just gave you, I had five buckets I could have given you.
01:30:26.440 | One was horrifying, terrifying, scary, eerie, creepy,
01:30:29.480 | whatever, that's about it.
01:30:31.860 | And I had a much more nuanced experience than that.
01:30:36.240 | And I don't, all I have is these words, right?
01:30:39.560 | And so instead I just hand you the bucket.
01:30:41.920 | I put the stuff in the bucket and give it to you,
01:30:44.240 | but all you have is the bucket.
01:30:45.440 | You just have to guess what I put into that bucket.
01:30:47.720 | All you can do is look at the label of the bucket and say,
01:30:50.480 | when I say terrifying, here's what I mean.
01:30:52.880 | So the point is it's very lossy.
01:30:55.120 | I had all this nuanced information
01:30:57.320 | of what I thought of the movie,
01:30:58.160 | and I'm sending you a very low res package
01:31:00.700 | that you're gonna now guess
01:31:02.040 | what the high res thing looked like.
01:31:04.080 | That's language in general.
01:31:05.680 | Our thoughts are much more nuanced.
01:31:07.160 | We can think to each other.
01:31:08.760 | We can do amazing things.
01:31:09.740 | We could A, have a brainstorm that doesn't feel like,
01:31:11.600 | oh, we're not talking in each other's heads.
01:31:13.080 | Not just that I hear your voice.
01:31:14.480 | No, no, no, we are just thinking.
01:31:15.760 | No words are being said internally or externally.
01:31:18.920 | The two brains are literally collaborating.
01:31:21.680 | It's something, it's a skill.
01:31:22.520 | I'm sure we'd have to get good at it.
01:31:23.360 | I'm sure young kids will be great at it
01:31:25.080 | and old people will be bad.
01:31:26.500 | But you think together and together you're like,
01:31:28.440 | ah, I had the joint epiphany.
01:31:29.600 | And now how about eight people in a room doing it, right?
01:31:31.760 | So it gets, you know, there's other examples.
01:31:34.040 | How about when you're a dress designer or a bridge designer
01:31:37.520 | and you want to show people what your dress looks like?
01:31:40.760 | Well, right now you gotta sketch it for a long time.
01:31:42.480 | Here, just beam it onto the screen from your head.
01:31:44.320 | So you can picture it.
01:31:45.160 | If you can picture a tree in your head,
01:31:47.400 | well, you can just suddenly,
01:31:48.840 | whatever's in your head can be pictured.
01:31:50.080 | So we'll have to get very good at it, right?
01:31:52.040 | And it'll take a skill, right?
01:31:52.920 | You know, you're gonna have to,
01:31:54.240 | but the possibilities, my God.
01:31:56.240 | Talk about like, I feel like if that works,
01:31:58.960 | if we really do have that as something,
01:32:01.440 | I think it'll almost be like a new AD/BC line.
01:32:04.280 | It's such a big change that the idea of like anyone living
01:32:07.300 | before everyone had brain machine interfaces
01:32:09.320 | is living in like before the common era.
01:32:12.840 | It's that level of like big change, if it can work.
01:32:16.080 | - Yeah, and like a replay of memories,
01:32:18.040 | just replaying stuff in your head.
01:32:19.400 | - Oh my God, yeah.
01:32:20.640 | And copying, you know, you can hopefully copy memories
01:32:23.040 | onto other things and you don't have to just rely
01:32:24.880 | on your wet circuitry.
01:32:27.920 | - It does make me sad 'cause you're right.
01:32:29.660 | The brain is incredibly neuroplastic
01:32:32.080 | and so it can adjust, it can learn how to do this.
01:32:34.620 | I think it will be a skill.
01:32:36.080 | But probably you and I will be too old to truly learn.
01:32:38.560 | - Well, maybe we can get, there'll be great trainings.
01:32:40.400 | You know, I'm spending the next three months
01:32:42.040 | in like one of the Neuralink trainings.
01:32:44.840 | - But it'll still be a bit of like grandpa can't--
01:32:47.520 | - Definitely.
01:32:48.360 | This is, you know, I was thinking,
01:32:49.180 | how am I gonna be old?
01:32:50.020 | I'm like, no, I'm gonna be great at the new phones.
01:32:51.440 | But it's not gonna be the phones.
01:32:52.960 | It's gonna be the, you know,
01:32:54.320 | the kid's gonna be thinking to me,
01:32:55.440 | I'm gonna be like, I just, can you just talk, please?
01:32:57.960 | And they're gonna be like, okay, I'll just talk
01:32:59.420 | and they're gonna, so that'll be the equivalent
01:33:02.160 | of, you know, yelling to your grandpa today.
01:33:04.360 | - I really suspect, I don't know what your thoughts are,
01:33:06.800 | but I grew up in a time when physical contact
01:33:10.480 | and interaction was valuable.
01:33:12.160 | I just feel like that's going to go away,
01:33:15.640 | that it's gonna disappear.
01:33:17.120 | - Why?
01:33:17.960 | I mean, is there anything more intimate
01:33:19.120 | than thinking with each other?
01:33:20.200 | I mean, that's, you talk about, you know,
01:33:21.520 | once we were all doing that, it might feel like,
01:33:22.920 | man, everyone was so isolated from each other before.
01:33:24.880 | - Yeah, sorry.
01:33:25.700 | So I didn't say that intimacy disappears.
01:33:27.580 | I just meant physical, having to be in the same,
01:33:29.600 | having to touch each other.
01:33:31.000 | - People like that.
01:33:33.640 | If it is important, won't there be whole waves
01:33:36.000 | of people who start to say, you know,
01:33:37.160 | there's all these articles that come out about how,
01:33:38.560 | you know, in our metaverse, we've lost something important.
01:33:41.080 | And then, you know,
01:33:42.320 | first the hippies start doing it,
01:33:43.600 | and eventually it becomes this big wave.
01:33:44.940 | And now everyone, you know,
01:33:46.800 | if something truly is lost, won't we recover it?
01:33:49.600 | - Well, I think from first principles,
01:33:51.640 | all of the components are there to engineer
01:33:54.680 | intimate experiences in the metaverse,
01:33:57.400 | or in the cyberspace.
01:34:00.140 | And so to me, it's, I don't see anything profoundly unique
01:34:05.140 | to the physical experience.
01:34:08.160 | Like I don't understand.
01:34:09.240 | - But then why are you saying there's a loss there?
01:34:12.200 | - No, I'm just sad because I won't,
01:34:13.640 | oh, it's a loss for me personally,
01:34:15.000 | because the world--
01:34:15.840 | - So then you do think there's something unique
01:34:17.320 | in the physical experience?
01:34:18.360 | - For me, because I was raised with it.
01:34:20.280 | - Oh. - Yeah, yeah, yeah.
01:34:21.120 | So whatever, so anything you're raised with,
01:34:24.240 | you fall in love with.
01:34:25.060 | Like people in this country came up with baseball.
01:34:27.320 | I was raised in the Soviet Union.
01:34:29.200 | I don't understand baseball.
01:34:30.400 | I get, I like it, but I don't love it
01:34:32.700 | the way Americans love it.
01:34:35.960 | Because a lot of times,
01:34:36.920 | they went to baseball games with their father,
01:34:39.520 | and there's that family connection.
01:34:41.680 | There's a young kid dreaming about, I don't know,
01:34:45.120 | becoming an MLB player himself.
01:34:49.840 | I don't know, something like that.
01:34:50.780 | But that's what you're raised with,
01:34:52.680 | obviously, is really important.
01:34:53.760 | But I mean, is it fundamental to the human experience?
01:34:57.280 | Listen, we're doing this podcast in person,
01:34:58.920 | so clearly I still value it, but--
01:35:01.800 | - But it's true.
01:35:02.640 | If this were, obviously if there were a screen,
01:35:04.040 | we all agree that's not the same.
01:35:05.280 | - Yeah, it's not the same.
01:35:06.120 | - But if this were some, you know,
01:35:07.000 | we had contact lenses on, and like,
01:35:09.120 | maybe Neuralink, you know, maybe, again, forget,
01:35:11.840 | again, this is all, the devices,
01:35:13.800 | even something as cool as a contact lens,
01:35:15.280 | that's all old school.
01:35:16.800 | Once you have the brain-machine interface,
01:35:18.640 | it'll just be projection of,
01:35:20.880 | it'll take over my visual cortex.
01:35:22.320 | My visual cortex will get put into a virtual room,
01:35:25.200 | and so will yours, so we will see,
01:35:27.040 | we will hear, really hear and see,
01:35:29.160 | as if we're, you won't have any masks,
01:35:30.900 | no VR mask needed.
01:35:32.400 | And at that point, it really will feel like,
01:35:34.480 | you'll forget, you'll say,
01:35:35.320 | "Were we together physically or not?"
01:35:37.440 | You won't even, it'd be so unimportant,
01:35:39.000 | you won't even remember.
01:35:40.200 | - And you're right, this is one of those shifts in society
01:35:45.040 | that changes everything.
01:35:46.760 | - Romantically, people still need to be together.
01:35:50.120 | There's a whole set of physical things
01:35:52.640 | with relationship that are needed.
01:35:55.560 | You know, like--
01:35:56.400 | - Like what, like sex?
01:35:57.560 | - Sex, but also just like, there's pheromones,
01:36:00.120 | like there's, the physical touch is such a,
01:36:03.160 | it's like music, it goes to such a deeply primitive
01:36:05.720 | part of us, that what physical touch
01:36:08.040 | with a romantic partner does, that I think that,
01:36:11.160 | so I'm sure there'll be a whole wave of people
01:36:13.440 | who, their new thing is that, you know,
01:36:14.880 | you're romantically involved with people
01:36:16.180 | you never actually are in person with,
01:36:17.440 | but, and I'm sure there'll be things
01:36:18.920 | where you can actually smell what's in the room,
01:36:20.360 | and you can--
01:36:21.200 | - Yeah, and touch.
01:36:22.040 | - Yeah, but I think that'll be one of the last things to go.
01:36:24.320 | I think there'll be, there's something,
01:36:26.840 | that to me seems like something that'll be,
01:36:29.120 | it'll be a while before people feel like
01:36:30.960 | there's nothing lost by not being in the same room.
01:36:32.960 | - It's very difficult to replicate the human interaction.
01:36:35.840 | - Although sex also, again, you could,
01:36:38.640 | not to get too weird, but you could have a thing
01:36:41.000 | where you basically, you know, or let's just do a massage,
01:36:45.480 | 'cause it's less awkward, but like, someone--
01:36:48.040 | - Everyone is still imagining sex, so go on.
01:36:50.400 | - A masseuse could massage a fake body,
01:36:54.720 | and you could feel whatever's happening, right?
01:36:57.000 | So you're lying down in your apartment alone,
01:36:58.960 | but you're feeling a full--
01:37:00.200 | - That'll be the new YouTube streaming,
01:37:02.880 | where it's one masseuse massaging one body,
01:37:05.040 | but like 1,000 people are experiencing it.
01:37:06.760 | - Exactly, right, now think about it, right now,
01:37:08.160 | you know what, Taylor Swift doesn't play for one person,
01:37:10.680 | she has to go around, and every one of her fans
01:37:12.320 | she has to go play for. Or a book, right?
01:37:14.440 | You do it, and it goes everywhere,
01:37:15.760 | so it'll be the same idea.
01:37:17.040 | - You've written and thought a lot about AI.
01:37:21.720 | So AI safety specifically, you've mentioned
01:37:26.160 | you're actually starting a podcast, which is awesome.
01:37:28.360 | You're so good at talking, so good at thinking,
01:37:30.560 | so good at being weird in the most beautiful of ways.
01:37:33.920 | But you've been thinking about this AI safety question,
01:37:38.600 | where today does your concern lie?
01:37:41.640 | For the near future, for the long-term future.
01:37:43.960 | Quite a bit of stuff happened,
01:37:46.600 | including with Elon's work with Tesla Autopilot,
01:37:48.640 | there's a bunch of amazing robots, there's Boston Dynamics,
01:37:52.160 | and everyone's favorite vacuum robot, iRobot, Roomba,
01:37:57.160 | and then there's obviously the applications
01:37:59.320 | of machine learning for recommender systems
01:38:01.880 | in Twitter, Facebook, and so on.
01:38:04.400 | And face recognition for surveillance,
01:38:07.820 | all these kinds of things are happening.
01:38:09.200 | Just a lot of incredible use of, not the face recognition,
01:38:12.640 | but the incredible use of deep learning, machine learning,
01:38:16.860 | to capture information about people
01:38:19.360 | and try to recommend to them what they wanna consume next.
01:38:24.200 | Some of that can be abused, some of that can be used
01:38:26.280 | for good, like for Netflix or something like that.
01:38:29.120 | What are your thoughts about all this?
01:38:30.280 | - Yeah, I mean, I really don't think humans are very smart,
01:38:35.280 | all things considered, I think we're limited.
01:38:38.680 | And we're dumb enough that we're very easily manipulable.
01:38:43.200 | Not just like, oh, our emotions.
01:38:46.120 | Our emotions can be pulled like puppet strings.
01:38:50.240 | I mean, again, I do look at what's going on
01:38:52.280 | with political polarization now,
01:38:53.440 | and I see a lot of puppet string emotions happening.
01:38:56.480 | So yeah, there's a lot to be scared of, for sure.
01:38:58.960 | Like very scared of.
01:39:00.040 | I get excited about a lot of things, very specific things.
01:39:04.200 | One of the things I get excited about is,
01:39:06.480 | so the future of wearables, right?
01:39:08.560 | Again, I think that they'll be like,
01:39:09.760 | oh, the wrist, the Fitbit around my wrist is gonna seem,
01:39:12.360 | or the whoop is gonna seem really hilariously old school
01:39:17.120 | in 20 years.
01:39:17.960 | - Like with Neuralink. - It's like a big bracelet.
01:39:21.040 | It's gonna turn into little sensors in our blood probably,
01:39:24.160 | or even infrared, just things that are gonna be,
01:39:28.720 | it's gonna be collecting 100 times more data
01:39:31.080 | than it collects now, more nuanced data,
01:39:32.600 | more specific to our body.
01:39:34.440 | And it's going to be super reliable,
01:39:36.680 | but that's the hardware side.
01:39:38.120 | And then the software is gonna be,
01:39:40.760 | I've not done my deep dive, this is all speculation,
01:39:42.640 | but the software is gonna get really good.
01:39:45.080 | And this is the AI component.
01:39:46.960 | And so I get excited about specific things like that.
01:39:49.200 | Like think about if hardware were able to collect,
01:39:54.200 | first of all, the hardware knows your whole genome.
01:39:56.680 | And we'll know a lot more about what a genome sequence means,
01:40:00.720 | 'cause you can collect your genome now,
01:40:03.440 | and we just don't have
01:40:04.280 | much to do with that information yet.
01:40:07.160 | As AI gets better, now you have your genome,
01:40:09.240 | you've got what's in your blood at any given moment,
01:40:11.280 | all the levels of everything, right?
01:40:13.240 | You have the exact width of your heart arteries
01:40:16.160 | at any given moment.
01:40:17.080 | You've got--
01:40:17.920 | - All the virions, all the viruses
01:40:21.400 | that ever visited your body,
01:40:22.800 | 'cause there's a trace of it.
01:40:24.040 | So you have all the pathogens,
01:40:25.720 | all the things that you should be concerned about health-wise
01:40:29.320 | and might have threatened you,
01:40:30.440 | or you might be immune from all of that kind of stuff.
01:40:32.920 | - And also, of course, it knows
01:40:34.200 | how fast your heart is beating,
01:40:35.640 | and it knows
01:40:37.120 | exactly the amount of exercise you get,
01:40:39.480 | knows your muscle mass and your weight and all that.
01:40:41.160 | But it also maybe can even know your emotions.
01:40:42.960 | I mean, if emotions, what are they?
01:40:44.960 | Where do they come from?
01:40:45.800 | Probably pretty obvious chemicals once we get in there.
01:40:48.840 | So again, Neuralink can be involved here
01:40:50.360 | maybe in collecting information.
01:40:51.960 | 'Cause right now you have to do the thing,
01:40:54.120 | what's your mood right now?
01:40:55.040 | And it's hard to even assess,
01:40:56.280 | and you're in a bad mood, it's hard to even, but--
01:40:58.840 | - By the way, just as a shout out,
01:41:00.600 | Lisa Feldman Barrett, who's a neuroscientist
01:41:03.760 | at Northeastern just wrote a,
01:41:05.440 | I mean, not just, like a few years ago,
01:41:07.040 | wrote a whole book saying,
01:41:08.320 | "Our expression of emotions has nothing to do
01:41:10.360 | "with the experience of emotions."
01:41:12.200 | So you really actually want to be measuring.
01:41:15.080 | - That's exactly.
01:41:16.480 | You can tell, 'cause one of these apps pops up
01:41:18.640 | and says, "How do you feel right now?
01:41:20.600 | "Good, bad?"
01:41:21.440 | I'm like, "I don't know.
01:41:22.260 | "I feel bad right now because the thing popping up
01:41:24.600 | "reminded me that I'm procrastinating
01:41:26.560 | "because I was on my phone and I should've been."
01:41:27.880 | You know, I'm like, "That's not my, you know."
01:41:29.320 | So I think we'll probably be able to
01:41:32.480 | get all this info, right?
01:41:34.660 | Now the AI can go to town.
01:41:37.280 | Think about when the AI gets really good at this.
01:41:39.720 | And it knows your genome, and it knows,
01:41:41.520 | it can just, I want the AI to just tell me what to do.
01:41:45.440 | Okay, so how about this,
01:41:47.360 | now imagine attaching that
01:41:48.960 | to a meal service, right?
01:41:50.540 | And the meal service has everything,
01:41:51.520 | you know, all the million ingredients
01:41:53.300 | and supplements and vitamins and everything.
01:41:55.880 | And I tell the AI my broad goals.
01:41:59.400 | I wanna gain muscle, or I want to, you know,
01:42:02.480 | maintain my weight, but I wanna have more energy,
01:42:04.200 | or whatever, or I just wanna be very healthy,
01:42:06.400 | and I wanna, obviously, everyone wants the same,
01:42:08.120 | like, 10 basic things, like you wanna avoid cancer,
01:42:10.400 | you wanna, you know, various things,
01:42:11.880 | you wanna age slower.
01:42:13.200 | So now the AI has my goals, and a drone comes at,
01:42:19.280 | you know, a little thing pops up,
01:42:21.600 | and it says, like, you know, beep, beep,
01:42:23.100 | like, you know, 15 minutes, you're gonna eat.
01:42:24.980 | 'Cause it knows that's a great,
01:42:25.960 | that's the right time for my body to eat.
01:42:27.860 | 15 minutes later, a little slot opens in my wall,
01:42:30.180 | where a drone has come from the factory,
01:42:32.140 | the food factory, and dropped
01:42:33.860 | the perfect meal for that moment for me,
01:42:36.780 | for my mood, for my genome, for my blood contents.
01:42:40.700 | And it's, 'cause it knows my goals,
01:42:42.740 | so, you know, it knows I wanna feel energy at this time,
01:42:44.620 | and then I wanna wind down here,
01:42:45.740 | so I, those things, you have to tell it.
01:42:47.420 | - Well, plus the pleasure thing,
01:42:48.980 | like, it knows what kind of components
01:42:51.020 | of a meal you've enjoyed in the past,
01:42:52.500 | so you can assemble the perfect meal.
01:42:53.940 | - Exactly, it knows you way better than you know yourself,
01:42:56.980 | better than any human could ever know you.
01:42:58.380 | And a little thing pops up,
01:42:59.980 | you still have some choice, right?
01:43:01.500 | So it pops up and it says, like, you know, coffee,
01:43:05.260 | because it knows, you know, my cutoff.
01:43:07.340 | It says, you know, I can have coffee
01:43:08.980 | for the next 15 minutes only,
01:43:10.260 | because at that point, it knows how long
01:43:12.260 | it stays in my system, it knows what my sleep is like
01:43:14.140 | when I have it too late, it knows I have to wake up
01:43:15.620 | at this time tomorrow, 'cause that's my calendar.
01:43:17.780 | And so this is, I think,
01:43:19.300 | something that humans are wrong about,
01:43:22.100 | is that most people will hear this and be like,
01:43:23.660 | that sounds awful, that sounds dystopian.
01:43:26.060 | No, it doesn't, it sounds incredible.
01:43:27.460 | And if we all had this, we would not look back
01:43:29.420 | and be like, I wish I was, like, making awful choices
01:43:31.560 | every day, like I was in the past.
01:43:33.660 | And then, this isn't, these aren't important decisions,
01:43:36.180 | your important decision-making energy,
01:43:38.860 | your important focus and your attention
01:43:41.420 | can go onto your kids and on your work
01:43:43.420 | and on, you know, helping other people
01:43:45.340 | and things that matter.
01:43:46.420 | And so I think AI, when I think about, like,
01:43:49.540 | personal lifestyle stuff like that,
01:43:51.700 | I really love, like, I love thinking about that.
01:43:55.420 | I think it's gonna be very, and I think we'll all be
01:43:57.020 | so much healthier, that when we look back at today,
01:44:00.460 | one of the things that's gonna look so primitive
01:44:02.060 | is the one-size-fits-all thing,
01:44:03.500 | getting, like, reading advice about keto.
01:44:05.540 | Each genome is gonna have very specific,
01:44:10.020 | you know, unique advice coming from AI, and so, yeah.
01:44:14.180 | - Yeah, the customization that's enabled
01:44:16.020 | by a collection of data and the use of AI,
01:44:18.900 | a lot of people think, what's the, like,
01:44:20.660 | they think of the worst-case scenario
01:44:22.340 | of that data being used by authoritarian governments
01:44:25.180 | to control you, all that kind of stuff.
01:44:26.580 | They don't think about, most likely,
01:44:28.420 | especially in a capitalist society,
01:44:30.040 | it's most likely going to be used as part of a competition
01:44:34.300 | to get you the most delicious and healthy meal possible
01:44:36.780 | as fast as possible. - Exactly.
01:44:38.340 | - Yeah, so the world will definitely be much better
01:44:40.700 | with the integration of data.
01:44:42.260 | But of course, you wanna be able to
01:44:45.220 | be transparent and honest about how that data is misused,
01:44:48.340 | and that's why it's important to have free speech
01:44:50.380 | and people to speak out, like,
01:44:51.700 | when some bullshit is being done by companies.
01:44:53.580 | - That we need to have our wits about us as a society.
01:44:55.940 | Like, this is what, free speech is the mechanism
01:45:00.480 | by which the big brain can think, can think for itself,
01:45:03.500 | can think straight, can see straight.
01:45:05.960 | When you take away free speech,
01:45:07.300 | when you start saying that, in every topic,
01:45:07.300 | when you start saying that any topic,
01:45:09.300 | any topic that's political,
01:45:10.940 | becomes treacherous to talk about.
01:45:15.660 | If the culture penalizes nuanced conversation
01:45:18.860 | about any topic that's political,
01:45:21.020 | and the politics is so all-consuming,
01:45:24.380 | and it's such an incredible market to polarize people,
01:45:29.380 | for media to polarize people,
01:45:31.420 | and to bring any topic it can into that
01:45:34.020 | and get people hooked on it as a political topic,
01:45:36.700 | we become a very dumb society.
01:45:38.100 | So free speech goes away, as far as it matters.
01:45:39.820 | People say, oh, people like to say,
01:45:41.220 | well, it's not, you don't even know what free speech is.
01:45:43.220 | Free speech is, you know, it's, you know,
01:45:45.860 | your free speech is not being violated.
01:45:46.900 | It's like, no, you're right.
01:45:48.380 | My First Amendment rights are not being violated.
01:45:50.780 | But the culture of free speech,
01:45:52.060 | which is the second ingredient of two.
01:45:53.880 | You need the First Amendment,
01:45:55.540 | and you need the culture of free speech,
01:45:57.260 | and now you have free speech.
01:45:58.380 | And the culture is much more specific.
01:46:00.940 | You obviously can have a culture
01:46:01.920 | that believes people right now.
01:46:03.960 | Take any topic, again, that has to do with, like,
01:46:06.100 | you know, some very sensitive topics,
01:46:08.920 | you know, police shootings, or, you know,
01:46:11.560 | what's going on in, you know, K-12 schools,
01:46:13.680 | or, you know, even, you know, climate change.
01:46:15.440 | You know, take any of these,
01:46:16.840 | and the First Amendment's still there.
01:46:19.920 | No one, you're not gonna get arrested,
01:46:21.520 | no matter what you say.
01:46:22.760 | The culture of free speech is gone,
01:46:25.640 | because you will be destroyed.
01:46:27.080 | Your life can be over, you know, as far as it matters,
01:46:29.840 | if you say the wrong thing.
01:46:31.720 | But, you know, in a culture of,
01:46:33.360 | a really vigorous culture of free speech,
01:46:35.280 | you get no penalty at all
01:46:36.560 | for even saying something super dumb.
01:46:38.360 | People will say, like, people will laugh,
01:46:40.160 | and be like, well, that was, like,
01:46:41.440 | kind of hilariously offensive,
01:46:42.640 | and, like, not at all correct.
01:46:43.920 | Like, you know, you're wrong, and here's why.
01:46:45.440 | But no one's, like, mad at you.
01:46:47.080 | Now, the brain is thinking at its best.
01:46:49.480 | The IQ of the big brain is, like,
01:46:50.980 | as high as it can be in that culture.
01:46:53.120 | And in the other culture, you say something wrong,
01:46:55.040 | and people say, oh, wow, you've changed.
01:46:56.500 | Oh, wow, like, look, these are his real colors.
01:46:58.800 | You know, the big brain is dumb.
01:47:01.560 | - You still have mutual respect for each other.
01:47:03.840 | So, like, you don't think lesser of others
01:47:06.960 | when they say a bunch of dumb things.
01:47:08.500 | You know it's just a play of ideas.
01:47:10.840 | But you still have respect, you still have love for them.
01:47:12.800 | Because I think the worst case
01:47:15.160 | is when you have a complete free, like,
01:47:17.640 | anarchy of ideas where it's, like,
01:47:19.720 | like, everybody lost hope that something like a truth
01:47:24.000 | can even be converged towards.
01:47:25.680 | Like, everybody has their own truth.
01:47:27.520 | Then it's just chaos.
01:47:29.540 | Like, if you have mutual respect,
01:47:31.440 | and a mutual goal of arriving at the truth,
01:47:33.760 | and the humility that you want to listen
01:47:35.800 | to other people's ideas, and a forgiveness
01:47:38.000 | that other people's ideas might be dumb as hell,
01:47:39.840 | that doesn't mean they're lesser beings,
01:47:41.320 | all that kind of stuff.
01:47:42.400 | But that's, like, a weird balance to strike.
01:47:44.560 | - Right now, people are being trained, little kids,
01:47:47.560 | college students, being trained
01:47:49.320 | to think the exact opposite way.
01:47:51.360 | To think that there's no such thing as objective truth.
01:47:53.240 | Which is, you know, the objective truth
01:47:54.640 | is the N on the compass for every thinker.
01:47:59.180 | Doesn't mean we're, you know, necessarily on our way,
01:48:01.720 | or we're finding, but we're all aiming in the same direction.
01:48:03.560 | And we all believe that there's a place
01:48:05.000 | we can eventually get closer to.
01:48:06.680 | No objective truth, you know, teaching them
01:48:09.720 | that disagreement is bad, that it's violence.
01:48:12.160 | You know, it's, you know, it's like, you know,
01:48:17.160 | you quickly sound like you're just going on,
01:48:18.480 | like, a political rant with this topic.
01:48:20.360 | But, like, it's really bad.
01:48:22.400 | It's, like, genuinely the worst.
01:48:24.760 | If I had my own country, I mean,
01:48:27.540 | it's like, I would teach kids some very specific things
01:48:31.760 | that this is doing the exact opposite of.
01:48:34.980 | And it sucks.
01:48:37.340 | It sucks.
01:48:38.380 | - Speaking of a way to escape this,
01:48:40.700 | you've tweeted, "30 minutes of reading a day equals,"
01:48:43.460 | yeah, this whole video, and it's cool to think about reading
01:48:46.020 | like, as a habit, and something that accumulates.
01:48:50.260 | You said, "30 minutes of reading a day
01:48:51.820 | "equals 1,000 books in 50 years."
01:48:54.300 | I love, like, thinking about this.
01:48:56.960 | Like, chipping away at the mountain.
01:48:59.360 | Can you expand on that?
01:49:00.740 | Sort of the habit of reading?
01:49:02.940 | How do you recommend people read?
01:49:04.660 | - Yeah, yeah, I mean, it's incredible.
01:49:06.980 | If you do something, a little of something every day,
01:49:10.620 | it compounds, it compounds.
01:49:12.820 | You know, I always think about, like,
01:49:13.860 | the people who achieve these incredible things in life,
01:49:16.740 | these great, like, famous, legendary people,
01:49:19.620 | they have the same number of days that you do,
01:49:21.340 | and it's not like they were doing magical days.
01:49:23.700 | They just, they got a little done every day,
01:49:26.660 | and that adds up
01:49:30.740 | to a monument, you know,
01:49:32.900 | they're putting one brick in a day.
01:49:33.980 | Eventually, they have this building,
01:49:35.020 | this legendary building.
01:49:36.300 | So, you can take writing. Say there's
01:49:39.100 | two aspiring writers, and one doesn't ever write,
01:49:42.140 | you know, manages to write
01:49:43.880 | zero pages a day,
01:49:45.060 | and the other one manages to do two pages a week, right?
01:49:49.380 | Not very much.
01:49:50.220 | One does zero pages a week, the other two pages a week.
01:49:51.740 | 98% of both of their time is the same.
01:49:55.480 | The second person, for just 2%, is doing one other thing.
01:49:58.620 | One year later, they have written,
01:50:00.780 | they write two books a year.
01:50:01.820 | This prolific person, you know, in 20 years,
01:50:04.500 | they've written 40 books.
01:50:05.580 | They're one of the most prolific writers of all time.
01:50:07.580 | They write two pages a week.
01:50:08.980 | Sorry, that's not true.
01:50:11.060 | That was two pages a day.
01:50:12.380 | Okay, at two pages a week, you're still writing
01:50:14.180 | about a book every two years.
01:50:15.940 | So, in 20 years, you've still written 10 books,
01:50:17.980 | also a prolific writer, right?
01:50:19.820 | Huge, massive writing career.
01:50:21.440 | You write two pages every Sunday morning.
01:50:23.380 | The other person has the same exact week,
01:50:25.300 | and they don't do that Sunday morning thing.
01:50:26.540 | They are a wannabe writer.
01:50:27.960 | They always said they could write.
01:50:29.180 | They talk about how they used to be,
01:50:30.940 | and nothing happens, right?
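The two-writers comparison above is just compounding arithmetic, and it is easy to sanity-check. Here is a minimal sketch, assuming a typical book of roughly 200 pages (my own figure, not one stated in the conversation); the exact totals shift with the assumed book length, but the gap between the two writers is the point.

```python
# Back-of-the-envelope check of the two-writers comparison.
# Assumption (mine, not from the conversation): a typical book is ~200 pages.

PAGES_PER_BOOK = 200
WEEKS_PER_YEAR = 52

def books_in(years: int, pages_per_week: float) -> float:
    """Books produced at a steady weekly page count over a span of years."""
    return pages_per_week * WEEKS_PER_YEAR * years / PAGES_PER_BOOK

print(books_in(20, 0))    # zero pages a week: the wannabe writer, 0 books
print(books_in(20, 2))    # two pages every Sunday morning: ~10.4 books in 20 years
print(books_in(20, 14))   # two pages a day: ~72.8 books in 20 years at this book length
```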
01:50:31.980 | So, it's inspiring, I think, for a lot of people
01:50:34.900 | who feel frustrated they're not doing anything.
01:50:36.540 | So, reading is another example
01:50:38.580 | where there's someone who, you know, doesn't read,
01:50:43.580 | and someone who's a prolific reader.
01:50:45.700 | You know, I always think about like the Tyler Cowen types.
01:50:47.580 | I'm like, how the hell do you read so much?
01:50:50.020 | It's infuriating, you know?
01:50:51.380 | Or like James Clear puts out his like,
01:50:53.860 | his 10 favorite books of the year,
01:50:56.260 | 20, his 20 favorite books of the year.
01:50:57.860 | I'm like, your 20 favorites?
01:51:00.700 | Like, I'm trying to just read 20 books,
01:51:02.220 | like, and it would be an amazing year.
01:51:03.580 | So, but the thing is,
01:51:06.060 | they're not doing something crazy and magical.
01:51:07.980 | They're just reading a half hour a night, you know?
01:51:09.980 | If you read a half hour a night,
01:51:11.620 | the calculation I came to is that
01:51:13.020 | you can read a thousand books in 50 years.
01:51:15.260 | So, someone who's 80, and they've read a thousand books,
01:51:18.620 | you know, between 30 and 80, they are extremely well read.
01:51:21.780 | They can delve deep into many non-fiction areas.
01:51:24.820 | They can be, you know, an amazing fiction reader,
01:51:27.860 | avid fiction reader.
01:51:29.380 | And again, that's a half hour a day.
01:51:30.580 | Some people can do an hour,
01:51:31.540 | a half hour in the morning, audiobook,
01:51:33.060 | half hour at night in bed.
01:51:34.140 | Now they've read 2000 books.
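The thousand-books figure works the same way. A minimal back-of-the-envelope sketch, where the roughly nine hours per book is my assumption rather than anything stated in the conversation:

```python
# Rough check of "half an hour a night equals a thousand books in 50 years".
# Assumption (mine, not from the conversation): an average book takes ~9 hours at this pace.

HOURS_PER_DAY = 0.5
DAYS_PER_YEAR = 365
YEARS = 50
HOURS_PER_BOOK = 9.0   # shorter books or a faster pace push the total higher

total_hours = HOURS_PER_DAY * DAYS_PER_YEAR * YEARS   # 9,125 hours of reading
books = total_hours / HOURS_PER_BOOK                  # roughly 1,014 books

print(f"{total_hours:,.0f} hours -> about {books:,.0f} books")
# Doubling to an hour a day (morning audiobook plus a half hour at night) gives ~2,000 books.
```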
01:51:35.140 | So, I think it's,
01:51:37.940 | it's just, it's motivating.
01:51:40.860 | And you realize that a lot of times you think
01:51:43.820 | that the people who are doing amazing things,
01:51:45.820 | and you're not, you think that there's a bigger gap
01:51:48.860 | between you and them than there really is.
01:51:51.420 | I, on the reading front, I'm a very slow reader,
01:51:54.020 | which is just a very frustrating fact about me.
01:51:58.420 | But I'm faster with audiobooks.
01:52:00.700 | And I also, I just, you know, I'll just,
01:52:02.500 | it's just hard to get myself to read.
01:52:04.340 | But I've started doing audiobooks,
01:52:05.700 | and I'll wake up, throw it on, do it in the shower,
01:52:09.100 | brushing my teeth, you know, making breakfast,
01:52:11.020 | dealing with the dogs, things like that, whatever,
01:52:12.860 | until I sit down.
01:52:14.180 | And that's, I can read, I can read a book a week,
01:52:17.860 | a book every 10 days at that clip.
01:52:20.460 | And suddenly I'm this big reader,
01:52:21.820 | because I'm just, while doing my morning stuff,
01:52:24.180 | I have it on, and also it's this fun,
01:52:25.380 | it makes the morning so fun.
01:52:26.260 | I'm like, having a great time the whole morning,
01:52:27.860 | so I'm like, oh, I'm so into this book.
01:52:29.500 | So I think that, you know, audiobooks is another amazing
01:52:32.220 | gift to people who have a hard time reading.
01:52:34.180 | - I find that that's actually an interesting skill.
01:52:36.300 | I do audiobooks quite a bit.
01:52:38.700 | Like, it's a skill to maintain, at least for me,
01:52:41.740 | probably because of the kind of books I read,
01:52:43.180 | which is often like history, or like,
01:52:45.620 | there's a lot of content, and if you miss parts of it,
01:52:49.660 | you miss out on stuff.
01:52:51.940 | And so, it's a skill to maintain focus, at least for me.
01:52:55.700 | - Well, the 10 second back button is very valuable.
01:52:58.900 | - Oh, interesting.
01:52:59.740 | - So I just, if I get lost, sometimes the book is so good
01:53:02.500 | that I'm thinking about what the person just said,
01:53:04.020 | and I just get, the skill for me is just remembering
01:53:05.980 | to pause, and if I don't, no problem,
01:53:07.980 | just back, back, back, back.
01:53:09.060 | Just three quick backs.
01:53:10.520 | So that, of course, is not that efficient,
01:53:11.820 | but I do the same thing when I'm reading.
01:53:14.060 | I'll read a whole paragraph and realize I was tuning out.
01:53:15.940 | - Yeah. - You know?
01:53:17.140 | I haven't actually even considered trying that.
01:53:19.780 | I've been so hard on myself maintaining focus,
01:53:22.940 | because you do get lost in thought.
01:53:24.340 | Maybe I should try that.
01:53:25.180 | - Yeah, and when you get lost in thought, by the way,
01:53:26.860 | you're processing the book.
01:53:27.940 | That's not wasted time.
01:53:29.260 | That's your brain really categorizing and cataloging
01:53:32.180 | what you just read, and like.
01:53:33.500 | - Well, there's several kinds of thoughts, right?
01:53:36.080 | There's thoughts related to the book,
01:53:37.580 | and there's thoughts that take you elsewhere.
01:53:40.100 | - Well, I find that if I am continually thinking
01:53:43.900 | about something else,
01:53:46.620 | I'll just pause the book.
01:53:47.460 | - Yeah, yeah, yeah. - Yeah.
01:53:48.280 | Especially in the shower or something,
01:53:49.260 | when like, that's sometimes
01:53:51.220 | when really great thoughts come out.
01:53:52.180 | If I'm having all these thoughts about other stuff,
01:53:53.900 | I'm saying, clearly my mind wants to work on something else.
01:53:55.900 | So I'll just pause it.
01:53:56.740 | - Yeah, quiet Dan Carlin.
01:53:57.820 | I'm thinking about something else right now.
01:53:59.260 | - Yeah, exactly, exactly.
01:54:00.660 | Also, you can, things like you have to head out to the store.
01:54:04.240 | Like, I'm gonna read 20 pages on that trip,
01:54:06.500 | just walking back and forth, going to the airport.
01:54:08.580 | I mean, flights, you know, the Uber,
01:54:10.460 | and then you're walking
01:54:12.100 | through the airport,
01:54:13.100 | you're standing in the security line.
01:54:14.020 | I'm reading the whole time.
01:54:15.260 | Like, I know this is not groundbreaking.
01:54:17.420 | People know what audio books are,
01:54:18.540 | but I think that more people
01:54:20.020 | should probably get into them than do.
01:54:21.420 | 'Cause I know a lot of people,
01:54:22.260 | they have this stubborn kind of thing.
01:54:23.500 | They say, I like to have the paper book.
01:54:25.380 | And sure, but like, it's pretty fun to be able to read.
01:54:27.940 | - I still, to this day,
01:54:29.420 | I listen to a huge number of audio books and podcasts,
01:54:31.920 | but I still, the most impactful experiences for me
01:54:35.860 | are still reading.
01:54:36.700 | And I read very, very slow.
01:54:38.380 | And it's very frustrating when,
01:54:40.140 | like you go to these websites,
01:54:42.120 | like that estimate how long a book takes on average,
01:54:45.660 | that those always annoy me.
01:54:46.600 | - They do like a page a minute. When I read, it's, like,
01:54:48.840 | a page every two minutes at best.
01:54:51.780 | - At best, when you're like really, like,
01:54:54.000 | - Right. - actually not pausing.
01:54:55.800 | - I just, my ADD, it's like, I just,
01:54:58.100 | it's hard to keep focusing.
01:54:59.880 | And I also like to really absorb.
01:55:01.880 | So, on the other side of things,
01:55:03.520 | when I finish a book, 10 years later,
01:55:05.560 | I'll be able to say, like, you know that scene
01:55:06.560 | when this happens? And another friend who'd read it
01:55:07.720 | will be like, what?
01:55:08.560 | I don't remember any, like, details.
01:55:09.460 | And I'm like, oh, I can tell you, like, the entire thing.
01:55:10.900 | So I absorb the shit out of it.
01:55:12.400 | But I don't think it's worth it to, like,
01:55:14.160 | have to read so much less in my life.
01:55:16.600 | - I actually, so, in terms of going to the airport,
01:55:18.720 | you know, in these, like, filler moments of life,
01:55:22.240 | I do a lot of, there's an app called Anki,
01:55:24.880 | I don't know if you know about it.
01:55:26.640 | It's a spaced repetition app.
01:55:29.200 | So, there's all of these facts I have.
01:55:31.000 | When I read, I write it down if I want to remember it.
01:55:33.960 | And it's this thing where you review it.
01:55:36.960 | And the things you remember,
01:55:39.160 | it takes longer and longer before it brings them back up.
01:55:41.380 | It's like flashcards, but a digital app.
01:55:43.660 | It's called Anki, I recommend it to a lot of people.
01:55:45.340 | There's a huge community of people
01:55:46.780 | that are just, like, obsessed with it.
01:55:48.380 | - A-N-K-E?
01:55:49.740 | - A-N-K-I.
01:57:51.660 | So, this is an extremely well-known app and idea,
01:55:55.620 | like, among students who are, like, medical students,
01:55:59.460 | like, people that really have to study.
01:56:01.860 | Like, this is not, like, fun stuff.
01:56:03.740 | They really have to memorize a lot of things.
01:56:06.300 | They have to remember them well.
01:56:07.420 | They have to be able to integrate them
01:56:08.540 | with a bunch of ideas, so.
01:56:10.180 | And I find it to be really useful for,
01:56:12.040 | like, when you read history,
01:56:14.820 | if there's this particular factoid,
01:56:17.820 | it'd probably be extremely useful for you.
01:56:19.780 | 'Cause, that'd be an interesting thought, actually,
01:56:23.140 | 'cause you're doing,
01:56:24.140 | you talked about, like, opening up a trillion tabs
01:56:26.740 | and reading things.
01:56:27.700 | You know, you probably want to remember some facts
01:56:32.260 | you read along the way.
01:56:33.560 | Like, you might remember, okay,
01:56:34.940 | this thing I can't directly put into the writing,
01:56:37.060 | but it's a cool little factoid.
01:56:38.860 | I wanna-- - All the time.
01:56:40.300 | - Store that in there. - Yeah.
01:56:41.420 | - And that's why I go Anki, drop it in.
01:56:43.700 | - Oh, you can just drop it in.
01:56:45.020 | - Yeah. - You drop it
01:56:46.020 | in a line of a podcast or, like, a video?
01:56:48.460 | - Well, no. - I guess I can type it,
01:56:50.300 | though. - So, yes.
01:56:51.300 | So, Anki, there's a bunch of,
01:56:53.780 | it's called spaced repetition,
01:56:54.900 | there's a bunch of apps that are much nicer than Anki.
01:56:57.060 | Anki is the ghetto, like, Craigslist version,
01:57:00.500 | but it has a giant community,
01:57:01.940 | because people are like, we don't want features.
01:57:04.540 | We want a text box.
01:57:06.620 | Like, it's very basic, very stripped down.
01:57:08.780 | So, you can drop in stuff, you can drop in--
01:57:10.540 | - That sounds really, I can't believe
01:57:12.060 | I have not come across this.
01:57:13.460 | - You, actually, once you look into it--
01:57:14.820 | - There's the amount of-- - You'll realize that,
01:57:16.660 | how have I not come, you are the person,
01:57:19.860 | I guarantee you'll probably write a blog about it.
01:57:22.020 | I can't believe you actually have--
01:57:23.260 | - Well, it's also just like-- - It's your people, too.
01:57:25.100 | - And my, people say, what do you write about?
01:57:28.220 | Literally anything I find interesting.
01:57:30.740 | And so, for me, once you start a blog,
01:57:33.740 | like, your entire worldview becomes,
01:57:36.740 | would this be a good blog post?
01:57:37.580 | Would this be, I mean, it's the lens
01:57:39.260 | I see everything through, but I'm constantly
01:57:42.100 | coming across something, or just a tweet, you know?
01:57:44.660 | Something that I'm like, ooh, I need to share this
01:57:47.340 | with my readers.
01:57:48.180 | My readers, to me, are like my friends,
01:57:52.180 | who I'm like, oh, I need to tell them about this.
01:57:55.300 | And so, I feel like just a place to,
01:57:56.460 | I mean, I collect things in a document right now,
01:57:58.140 | if it's really good, but it's the little factoids
01:58:00.500 | and stuff like that, I think, especially
01:58:01.820 | if I'm learning something, if I'm like--
01:58:03.100 | - So, the problem is, when you save stuff,
01:58:05.180 | when you look at it, a tweet and all that kind of stuff,
01:58:07.980 | is you also need to couple that with a system for review.
01:58:11.580 | 'Cause what Anki does is, like, literally,
01:58:14.100 | it determines for me, I don't have to do anything.
01:58:16.460 | There's this giant pile of things I've saved,
01:58:18.700 | and it brings up to me, okay, here's,
01:58:21.660 | I don't know, when Churchill did something, right?
01:58:27.740 | I'm reading about World War II a lot now.
01:58:29.900 | Like, a particular event, here's that.
01:58:31.900 | Do you remember what year that happened?
01:58:33.940 | And you say yes or no, or, like, you get to pick.
01:58:38.700 | You get to see the answer, and you get to self-evaluate
01:58:42.100 | how well you remember that fact.
01:58:43.380 | And if you remember it well, it'll be another month
01:58:45.660 | before you see it again.
01:58:46.900 | If you don't remember, it'll bring it up again.
01:58:48.900 | That's a way to review tweets, to review concepts.
01:58:52.260 | And it offloads the kind of, the process of selecting
01:58:55.860 | which parts you're supposed to review or not.
01:58:57.460 | And you can grow that library.
01:58:58.980 | I mean, obviously, medical students use it
01:59:00.820 | for, like, tens of thousands of facts.
01:59:03.220 | - It just gamifies it, too.
01:59:05.140 | It's like you can passively sit back and just,
01:59:07.780 | and the thing will, like, make sure
01:59:09.140 | you eventually learn it all.
01:59:10.340 | Versus, you know, you don't have to be the executive
01:59:13.500 | calling that, like, the memorization program,
01:59:16.300 | someone else is handling it.
01:59:17.500 | - I would love to hear about, like, you trying it out,
01:59:20.900 | or spaced repetition as an idea.
01:59:22.380 | There's a few other apps, but Anki's the big must.
01:59:24.860 | - I totally wanna try it.
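For readers curious how the review scheduling described above might look in practice, here is a minimal sketch of a spaced-repetition loop in Python. It is not Anki's actual algorithm; the Card fields, the simple interval-doubling rule, and the example card are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import List, Optional

@dataclass
class Card:
    front: str                 # prompt, e.g. a factoid saved while reading
    back: str                  # the answer you self-check against
    interval_days: int = 1     # current gap before the next review
    due: date = field(default_factory=date.today)

def review(card: Card, remembered: bool, today: Optional[date] = None) -> None:
    """Self-grade a card: the gap grows when you remember, resets when you don't."""
    today = today or date.today()
    if remembered:
        card.interval_days *= 2          # e.g. 1 -> 2 -> 4 -> 8 ... days
    else:
        card.interval_days = 1           # forgotten: see it again tomorrow
    card.due = today + timedelta(days=card.interval_days)

def due_today(deck: List[Card], today: Optional[date] = None) -> List[Card]:
    """The app decides what comes up; you never have to pick what to review."""
    today = today or date.today()
    return [c for c in deck if c.due <= today]

# Example card of the kind described in the conversation.
deck = [Card("Year Churchill became Prime Minister?", "1940")]
for card in due_today(deck):
    review(card, remembered=True)        # next review in 2 days, then 4, then 8...
```

Real systems like Anki grade recall on a finer scale and track a per-card ease factor (a variant of the SM-2 algorithm), but the shape is the same: longer and longer gaps for things you remember, an immediate retry for things you don't.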
01:59:26.180 | - You've written and spoken quite a bit
01:59:27.660 | about procrastination.
01:59:30.580 | I, like you, suffer from procrastination,
01:59:32.820 | like many other people.
01:59:34.620 | Suffer, in quotes.
01:59:35.940 | How do we avoid procrastination?
01:59:39.740 | - I don't think the suffer is in quotes.
01:59:42.020 | (Lex laughing)
01:59:43.620 | I think that's a huge part of the problem,
01:59:45.660 | is that it's treated like a silly problem.
01:59:48.900 | People don't take it seriously as a dire problem.
01:59:55.060 | But it can be.
01:59:57.900 | It can ruin your life.
02:00:00.900 | There's, like, we talked about the compiling concept
02:00:08.260 | with, you know, if you read a little, you know,
02:00:12.220 | if you write, if you write two pages a week,
02:00:14.620 | you write a book every two years,
02:00:16.700 | you're a prolific writer, right?
02:00:18.220 | And the difference between, you know, again,
02:00:20.580 | it's not that that person's working so hard,
02:00:22.580 | it's that they have the ability to,
02:00:24.660 | when they commit to something, like on Sunday mornings,
02:00:26.460 | I'm gonna write two pages.
02:00:27.740 | That's it.
02:00:28.580 | They respect, they have enough,
02:00:33.860 | they respect the part of them that made that decision
02:00:36.820 | is a respected character in their brain.
02:00:39.460 | And they say, well, I decided it, so I'm gonna do it.
02:00:42.140 | The procrastinator won't do those two pages.
02:00:46.780 | That's just exactly the kind of thing
02:00:48.580 | the procrastinator will keep on their list
02:00:50.060 | and they will not do.
02:00:51.020 | But that doesn't mean they're any less talented
02:00:53.420 | than the writer who does the two pages.
02:00:54.980 | Doesn't mean they want it any less.
02:00:57.380 | Maybe they want it even more.
02:00:58.900 | And it doesn't mean that they wouldn't be just as happy
02:01:01.300 | having done it as the writer who does it.
02:01:03.660 | So what they're missing out on,
02:01:05.140 | picture a writer who writes 10 books, you know,
02:01:08.020 | bestsellers, and they go on these book tours,
02:01:11.100 | and, you know, they, and they just are so gratified
02:01:15.300 | with their career, you know, and they,
02:01:17.340 | think about what the other person is missing
02:01:20.620 | who does none of that, right?
02:01:23.500 | So that is a massive loss, a massive loss.
02:01:27.820 | And it's because the internal mechanism in their brain
02:01:32.020 | is not doing what the other person's is.
02:01:33.940 | So they don't have the respect for the part of them
02:01:36.580 | that made the choice.
02:01:38.020 | They feel like it's someone they can disregard.
02:01:40.060 | And so to me, this is in the same boat
02:01:42.820 | as someone who is obese because their eating habits
02:01:47.220 | make them obese over time or their exercise habits?
02:01:52.100 | That, you know, that's a huge loss for that person.
02:01:54.300 | That person is, you know, the health problems
02:01:56.700 | and it's just probably making them miserable.
02:01:58.900 | And it's self-inflicted, right?
02:02:01.900 | It's self-defeating, but that doesn't make it
02:02:04.620 | an easy problem to fix just 'cause
02:02:06.420 | you're doing it to yourself.
02:02:07.500 | So to me, procrastination is another one of these
02:02:09.380 | where you are the only person in your own way.
02:02:11.800 | You are, you know, you are failing at something
02:02:15.220 | or not doing something that you really wanna do.
02:02:17.580 | You know, it does not have to be work.
02:02:18.900 | Maybe you're, you wanna get out of that marriage
02:02:21.260 | that you know, you realize, it hits you,
02:02:23.060 | you shouldn't be in this marriage, you should get divorced.
02:02:24.980 | And you wait 20 extra years before you do it
02:02:27.140 | or you don't do it at all.
02:02:28.440 | That is, you know, you're not living the life
02:02:33.660 | that you know you should be living, right?
02:02:35.080 | And so I think it's fascinating.
02:02:38.020 | Now, the problem is it's also a funny problem
02:02:39.940 | because there's short-term procrastination,
02:02:43.020 | which I talk about as, you know,
02:02:44.220 | the kind that has a deadline.
02:02:45.680 | Now, some people, you know, this is when I bring in,
02:02:49.340 | there's different characters,
02:02:50.540 | there's the panic monster comes in the room.
02:02:52.580 | And that's when you actually, you know,
02:02:53.940 | the procrastinator can, there's different levels.
02:02:57.780 | There's the kind that even when there's a deadline,
02:03:02.540 | they stop panicking, they just, they've given up
02:03:04.660 | and they really have a problem.
02:03:06.460 | Then there's the kind that when there's a deadline,
02:03:08.340 | they'll do it, but they'll wait to the last second.
02:03:10.340 | Both of those people, I think, have a huge problem
02:03:13.300 | once there's no deadline.
02:03:15.100 | Because, and most of the important things in life,
02:03:17.420 | there's no deadline, which is, you know,
02:03:18.500 | changing your career, you know,
02:03:21.140 | becoming a writer when you never have been before,
02:03:23.180 | getting out of your relationship,
02:03:24.880 | you're gonna be doing whatever you need to,
02:03:28.500 | the changes you need to make
02:03:29.580 | in order to get into a relationship.
02:03:31.020 | There's, the thing after--
02:03:32.660 | - Launching a startup.
02:03:33.760 | - Launching a startup, right?
02:03:35.340 | Or once you've launched a startup,
02:03:37.260 | firing someone that needs to be fired, right?
02:03:39.500 | - Yes.
02:03:40.340 | - I mean, going out for fundraising
02:03:41.900 | instead of just trying to, you know,
02:03:43.340 | there's so many moments when the big change
02:03:46.420 | that you know you should be making
02:03:48.340 | that would completely change your life if you just did it
02:03:51.900 | has no deadline.
02:03:53.180 | It just has to be coming from yourself.
02:03:55.580 | And I think that a ton of people have a problem
02:04:00.580 | where they have this delusion that,
02:04:04.980 | you know, I'm gonna do that,
02:04:05.820 | I'm definitely gonna do that, you know,
02:04:07.260 | but not this week, not this month,
02:04:09.380 | not today, 'cause whatever.
02:04:10.820 | And they make this excuse again and again,
02:04:12.260 | and it just sits there on their list, collecting dust.
02:04:16.300 | And so yeah, to me, it is very real suffering.
02:04:20.660 | - And the fixes and fixing the habits?
02:04:23.740 | Just--
02:04:24.580 | - I'm still working on the fix, first of all.
02:04:27.540 | So there's, okay, there is,
02:04:30.660 | there's, just say you have a boat that sucks
02:04:34.980 | and it's leaking and it's gonna sink,
02:04:37.300 | you can fix it with duct tape for a couple,
02:04:40.020 | you know, for one ride or whatever.
02:04:42.260 | That's not really fixing the boat,
02:04:43.740 | but it can get you by.
02:04:44.860 | So there's duct tape solutions.
02:04:47.300 | To me, so the panic monster is the character
02:04:49.180 | that rushes into the room once the deadline
02:04:51.500 | gets too close or once there's some scary external pressure,
02:04:54.560 | not just from yourself.
02:04:55.580 | And that's a huge aid to a lot of procrastinators.
02:04:59.140 | Again, there's a lot of people who won't, you know,
02:05:01.540 | do that thing, they've been writing that book
02:05:03.120 | they wanted to write, but there's way fewer people
02:05:05.060 | who will not show up to the exam.
02:05:07.100 | You know, most people show up to the exam.
02:05:08.940 | So that's because the panic monster
02:05:12.380 | is gonna freak out if they don't.
02:05:14.640 | So you can create a panic monster.
02:05:16.640 | If you wanna, you know, you really wanna write music,
02:05:19.340 | you really wanna become a singer, songwriter,
02:05:20.780 | well, book a venue, tell 40 people about it
02:05:25.620 | and say, hey, on this day, two months from now,
02:05:28.020 | come and see, I'm gonna play you some of my songs.
02:05:30.340 | You now have a panic monster, you're gonna write songs,
02:05:31.940 | you're gonna have to, right?
02:05:33.400 | So there's duct tape things, you know, you can do things,
02:05:38.300 | you know, people do, I've done a lot of this
02:05:40.280 | with a friend and I say, if I don't get X done
02:05:42.900 | by a week from now, I have to donate a lot of money
02:05:47.360 | somewhere I don't wanna donate.
02:05:48.200 | - And that's, you would put that in the category
02:05:49.880 | of duct tape solutions.
02:05:51.520 | - Yeah, because it's not, why do I need that, right?
02:05:55.240 | If I really had solved this, this is something
02:05:57.100 | I want to do for me, it's selfish.
02:05:59.000 | This is, I just literally just want to be selfish here
02:06:01.400 | and do the work I need to do to get the goals
02:06:03.400 | I wanna get, right?
02:06:04.320 | There's a, all the incentives should be in the right place
02:06:08.600 | and yet, if I don't say that, it'll be a week from now
02:06:11.120 | and I won't have done it.
02:06:12.080 | Something weird is going on, there's some resistance,
02:06:14.080 | there's some force that is in my own way, right?
02:06:17.800 | And so, doing something where I have to pay all this money,
02:06:21.160 | okay, now I'll panic and I'll do it.
02:06:22.520 | So that's duct tape.
02:06:23.920 | Fixing the boat is something where I don't have to do that,
02:06:27.360 | I just will do the things that I, again,
02:06:29.520 | it's not, I'm not talking about super crazy work ethic,
02:06:33.060 | just like, for example, okay, I have a lot of examples
02:06:36.560 | 'cause I have a serious problem that I've been working on
02:06:40.160 | and in some ways, I've gotten really successful
02:06:41.880 | at solving it and in other ways, I'm still floundering.
02:06:44.720 | So-- - You're the world's
02:06:45.640 | greatest duct taper.
02:06:47.520 | - Yes, well, I'm pretty good at duct taping,
02:06:49.960 | I probably could be even better and I'm like, and I'm--
02:06:52.400 | - You're procrastinating on becoming a better duct taper.
02:06:54.880 | - Literally, like yes, there's nothing I won't.
02:06:57.520 | So, here's what I know I should do as a writer, right?
02:07:00.760 | It's very obvious to me, it's that I should wake up,
02:07:03.640 | doesn't have to be crazy, I don't have to do 6 a.m.
02:07:05.000 | or anything insane, I'm not gonna be one of those
02:07:07.720 | crazy people with the 5:30 jogs.
02:07:10.640 | I'm gonna wake up at whatever, 7:30, 8, 8:30
02:07:14.400 | and I should have a block, just say nine to noon
02:07:17.280 | where I get up and I just really quick
02:07:22.040 | make some coffee and write.
02:07:23.600 | - It's obvious because all the great writers in history
02:07:26.880 | did exactly that, some-- - Some of them have done that,
02:07:29.320 | that's common, there's some that I like these writers,
02:07:31.520 | they do the late night sessions,
02:07:32.720 | but most of them, they do-- - But there's a session,
02:07:34.840 | but there's a session that's-- - Most writers write
02:07:37.440 | in the morning and there's a reason,
02:07:38.440 | I don't think I'm different than those people.
02:07:41.040 | It's a great time to write, you're fresh, right?
02:07:43.760 | Your ideas from dreaming have kind of collected,
02:07:46.800 | you have all the new answers that you didn't have yesterday
02:07:49.400 | and you can just go.
02:07:51.120 | But more importantly, if I just had a routine
02:07:53.320 | where I wrote from nine to noon, weekdays,
02:07:58.000 | every week would have a minimum of 15 focused hours
02:08:02.240 | of writing, which doesn't sound like a lot,
02:08:03.920 | but it's a lot, 15 hours, no, this is no joke,
02:08:06.920 | this is, your phone's away,
02:08:09.600 | you're not talking to anyone, you're not opening your email,
02:08:11.680 | you are focused writing for three hours, five days,
02:08:14.100 | that's a big week for most writers, right?
02:08:16.560 | So now what's happening is that every weekday
02:08:18.840 | is a minimum of a B, I'll give myself.
02:08:21.480 | I know an A might be, wow, I really just got into a flow
02:08:24.160 | and wrote for six hours and had, great,
02:08:26.080 | but it's a minimum of a B, I can keep going if I want,
02:08:28.720 | and every week is a minimum of a B, that's 15 hours.
02:08:31.360 | Right, and if I just had, talk about compiling,
02:08:33.320 | this is the two pages a week, if I just did that
02:08:35.880 | every week, I'd achieve all my writing goals in my life.
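A quick back-of-the-envelope of the "compiling" arithmetic being described here; a minimal sketch where the 200-page book length is an assumed figure, and the rest comes straight from the conversation.

```python
# Small, consistent output adds up: the "two pages a week" writer
# and the nine-to-noon weekday block.
pages_per_week = 2
weeks_in_two_years = 2 * 52
assumed_book_length = 200                 # pages (assumption, not from the episode)

pages_written = pages_per_week * weeks_in_two_years            # 208 pages
books_every_two_years = pages_written / assumed_book_length    # ~1 book

hours_per_morning = 3                     # nine to noon
weekday_mornings = 5
focused_hours_per_week = hours_per_morning * weekday_mornings  # 15 hours

print(pages_written, round(books_every_two_years, 2), focused_hours_per_week)
# -> 208 1.04 15
```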
02:08:39.040 | And yet, I wake up and most days I just,
02:08:42.600 | either I'll revenge-procrastinate late at night
02:08:44.920 | and go to bed way too late and then wake up later
02:08:46.360 | and get on a bad schedule and I just fall into
02:08:48.000 | these bad schedules, or I'll wake up and there's just,
02:08:50.440 | you know, I'll say I was gonna do a few emails
02:08:52.320 | and I'll open it up and suddenly I'm texting,
02:08:53.960 | I'm texting, or I'll just go and I'll make a phone call
02:08:56.880 | and I'll be on phone calls for three hours,
02:08:58.200 | it's always something.
02:09:00.080 | Or I'll start writing and then I hit a little bit of a wall,
02:09:02.080 | but because there's no sacred,
02:09:03.440 | this is a sacred writing block,
02:09:05.280 | I'll just hit the wall and say,
02:09:06.680 | well, this is icky and I'll go do something else.
02:09:08.920 | So, duct tape, what I've done is,
02:09:11.400 | Wait But Why has one employee, Alicia,
02:09:14.480 | she's the manager of lots of things, that's her role.
02:09:17.320 | She truly does lots of things.
02:09:19.020 | And one of the things we started doing is
02:09:23.960 | either she comes over and sits next to me
02:09:26.000 | where she can see my screen from nine to noon,
02:09:29.280 | that's all it takes.
02:09:30.120 | The thing about procrastination is usually
02:09:31.560 | they're not kicking and screaming,
02:09:32.880 | I don't want to do this.
02:09:33.760 | It's the feeling of, you know, in the old days
02:09:35.280 | when you had to go to class,
02:09:36.760 | you know, your lunch block is over and it's like,
02:09:38.520 | oh, shit, I have class in five minutes,
02:09:40.080 | or it's Monday morning, you go, oh.
02:09:41.780 | But you said, you know what, you go, you say, okay,
02:09:45.080 | and then you get to class and it's not that bad
02:09:46.480 | once you're there, right?
02:09:48.400 | You have a trainer and he says, okay, next set,
02:09:49.880 | and you go, oh, okay, and you do it.
02:09:51.600 | That's all it is.
02:09:52.440 | It's someone, some external thing being like,
02:09:55.440 | okay, I have to do this,
02:09:56.320 | and then you have that moment of like,
02:09:57.600 | it sucks, but I guess I'll do it.
02:09:59.200 | If no one's there, though,
02:10:00.320 | the problem with the procrastinator is
02:10:01.480 | they don't have that person in their head.
02:10:03.440 | Other people, I think, were raised with a sense of shame
02:10:05.480 | if they don't do stuff,
02:10:06.320 | and that stick in their head is hugely helpful.
02:10:09.720 | I don't really have that.
02:10:11.000 | And so, anyway, Alicia's sitting there next to me.
02:10:14.160 | She's doing her own work, but she can see my screen,
02:10:17.160 | and she, of all people, knows exactly what I should be doing
02:10:19.600 | and what I shouldn't be doing.
02:10:21.240 | That's all it takes.
02:10:22.480 | The shame of just having her see me
02:10:24.800 | while she's sitting there not working
02:10:26.040 | would just be too weird and too embarrassing.
02:10:27.760 | So I get it done, and it's amazing.
02:10:29.720 | It's a game changer for me.
02:10:31.600 | So duct tape can solve, sometimes duct tape is enough,
02:10:34.760 | but I'm curious to, I'm still trying to,
02:10:37.640 | what is going on?
02:10:39.000 | I think part of it is that we are actually wired.
02:10:43.600 | I think I'm being very sane,
02:10:46.240 | human, actually, is what's happening.
02:10:47.800 | Or not sane is not the right word.
02:10:48.880 | I'm being like, I'm being a natural human
02:10:52.080 | that we are not programmed to sit there
02:10:53.840 | and do homework of a certain kind
02:10:56.480 | that we get the results like six months later.
02:10:59.640 | Like that is not, so we're supposed to conserve energy
02:11:03.120 | and fulfill our needs as we need them
02:11:05.040 | and do immediate things.
02:11:06.680 | And we're overriding our natural ways
02:11:10.320 | when we wake up and get to it.
02:11:12.080 | And I think sometimes it's because the pain,
02:11:14.480 | I think a lot of times we're just avoiding suffering,
02:11:15.960 | and for a lot of people, the pain of not doing it
02:11:18.560 | is actually worse 'cause they feel shame.
02:11:20.640 | So if they don't get up and take a jog
02:11:22.200 | and get up early and get to work,
02:11:23.600 | I'll feel like a bad person,
02:11:25.840 | and that is worse than doing those things.
02:11:28.120 | And then it becomes a habit eventually,
02:11:29.560 | it becomes just easy and automatic.
02:11:31.400 | It just becomes I do it 'cause that's what I do.
02:11:33.320 | But I think that if you don't have
02:11:34.480 | a lot of shame necessarily,
02:11:36.820 | the pain of doing those things is worse
02:11:38.640 | in the immediate moment than not doing it.
02:11:42.320 | - But I think that there's this feeling
02:11:44.160 | that you've captured with your body language
02:11:46.320 | and so on, like the I don't wanna do another set,
02:11:50.140 | that feeling, the people I've seen
02:11:52.680 | that are good at not procrastinating
02:11:54.900 | are the ones that have trained themselves
02:11:57.040 | that the moment they would be having that feeling,
02:12:00.480 | it's like Zen, like Sam Harris style Zen,
02:12:03.800 | you don't experience that feeling.
02:12:05.560 | - Yeah. - You just march forward.
02:12:06.960 | Like I talked to Elon about this a lot actually offline.
02:12:09.840 | It's like he doesn't have this.
02:12:11.720 | - No, clearly not.
02:12:12.960 | - It's the way I think, at least he talks about it,
02:12:16.040 | and the way I think about it is it's like
02:12:18.080 | you just pretend you're a machine running an algorithm.
02:12:21.640 | Like you know you should be doing this,
02:12:24.560 | not because somebody told you so on.
02:12:26.080 | This is probably the thing you want to do.
02:12:28.520 | Like look at the big picture of your life
02:12:30.400 | and just run the algorithm.
02:12:31.560 | Like ignore your feelings, just run as if--
02:12:33.520 | - Like just framing, frame it differently.
02:12:35.240 | - Yeah. - You know, yeah,
02:12:36.480 | you can frame it as like, it can feel like homework
02:12:39.360 | or it can feel like you're living your best life
02:12:42.400 | or something when you're doing your work.
02:12:43.960 | - Yeah.
02:12:45.320 | Yeah, maybe you reframe it.
02:12:46.360 | But I think ultimately is whatever reframing you need to do,
02:12:49.900 | you just need to do it for a few weeks
02:12:52.320 | and that's how the habit is formed
02:12:55.240 | and you stick with it.
02:12:56.960 | Like I'm now on a kick where I exercise every day.
02:13:01.960 | It doesn't matter what that exercise is.
02:13:06.160 | It's not serious.
02:13:07.400 | It could be 200 pushups.
02:13:09.360 | But it's a thing that like I make sure I exercise every day
02:13:12.360 | and it's become way, way easier because of the habit.
02:13:15.520 | And I just, and I don't, like at least with exercise
02:13:19.640 | 'cause it's easier to replicate that feeling,
02:13:22.120 | I don't allow myself to go like,
02:13:24.400 | I don't feel like doing this.
02:13:25.360 | - Right.
02:13:26.200 | Well, I think about that even just like little things
02:13:27.520 | like I brush my teeth before I go to bed
02:13:29.320 | and it's just a habit.
02:13:30.320 | - Yeah.
02:13:31.160 | - And it is effort.
02:13:32.000 | Like if it were something else,
02:13:33.320 | I would be like, oh, I'm gonna go to the bathroom,
02:13:35.440 | I'm gonna do that.
02:13:36.280 | No, I just wanna like, I'm just gonna lie down right now.
02:13:37.680 | But it doesn't even cross my mind.
02:13:38.920 | It's just like that I just robotically go and do it.
02:13:41.440 | - Yeah.
02:13:42.280 | - And it almost has become like a nice routine.
02:13:43.680 | It's like, oh, this part of the night.
02:13:44.720 | You know, it's like, morning routine stuff for me
02:13:47.280 | is like, you know, that stuff is kind of just like
02:13:49.400 | automated.
02:13:50.240 | - It's funny 'cause you don't like go,
02:13:52.720 | like I don't think I've skipped many days.
02:13:54.480 | I don't think I skipped any days brushing my teeth.
02:13:56.520 | - Right.
02:13:57.360 | - Like unless I didn't have a toothbrush,
02:13:59.120 | like I was in the woods or something.
02:14:00.720 | And what is that?
02:14:01.560 | 'Cause it's annoying.
02:14:03.120 | - To me there is,
02:14:04.440 | so the character that makes me procrastinate
02:14:06.120 | is the instant gratification monkey.
02:14:08.160 | Now that's what I've labeled him, right?
02:14:09.560 | And there's the rational decision maker
02:14:11.280 | and the instant gratification monkey
02:14:12.440 | and these battle with each other.
02:14:14.200 | But for procrastinator, the monkey wins.
02:14:18.080 | - Yeah.
02:14:18.920 | - I think the monkeys, you know,
02:14:20.800 | you read about this kind of stuff.
02:14:21.840 | I think that this kind of more primitive brain
02:14:25.920 | is always winning.
02:14:26.840 | And in non-procrastinators,
02:14:28.200 | that primitive brain is on board for some reason
02:14:30.600 | and isn't resisting.
02:14:32.680 | So, but when I think about brushing my teeth,
02:14:35.120 | it's like the monkey doesn't even think
02:14:36.720 | there's an option to not do it.
02:14:38.560 | So it doesn't even like get,
02:14:39.800 | there's no hope.
02:14:40.640 | The monkey has no hope there.
02:14:41.880 | So it doesn't even like get involved.
02:14:43.160 | And it's just like, yeah, yeah, no,
02:14:44.000 | we have to just like kind of like robotically,
02:14:45.440 | just like, you know, it was kind of like Stockholm syndrome,
02:14:48.060 | just like, oh no, no, I have to do this.
02:14:50.440 | It doesn't even like wake up.
02:14:51.840 | It's like, yeah, we're doing this now.
02:14:53.240 | For other things, the monkey's like,
02:14:54.240 | ooh, no, no, no, most days I can win this one.
02:14:56.800 | And so the monkey puts up that like fierce resistance.
02:15:01.800 | And it's like, it's a lot of it's like
02:15:03.860 | the initial transition.
02:15:05.660 | So I think of it as like jumping in a cold pool,
02:15:09.040 | where it's like, I will spend the whole day
02:15:12.560 | pacing around the side of the pool in my bathing suit,
02:15:15.080 | just being like, I don't want to have that one second
02:15:17.560 | when you first jump in and it sucks.
02:15:18.800 | And then once I'm in, once I jump in,
02:15:21.440 | I'm usually, you know, once I start writing,
02:15:22.920 | suddenly I'm like, oh, this isn't so bad.
02:15:24.640 | Okay, I'm kind of into it.
02:15:25.480 | And then sometimes you can't tear me away.
02:15:27.040 | You know, then I suddenly I'm like, I get into a flow.
02:15:29.320 | So it's like, once I get in the cold water, I don't mind it.
02:15:31.380 | But I will spend hours standing around the side of the pool.
02:15:34.960 | And by the way, I do this in a more literal sense.
02:15:36.840 | When I go to the gym with a trainer,
02:15:38.640 | in 45 minutes, I do a full ass workout.
02:15:41.960 | And it's not because I'm having a good time,
02:15:44.800 | but it's because it's that,
02:15:47.440 | ugh, okay, I have to go to class feeling, right?
02:15:49.080 | But when I go to the gym alone,
02:15:50.520 | I will literally do a set and then dick around on my phone
02:15:55.480 | for 10 minutes before the next set.
02:15:57.380 | And I'll spend over an hour there and do way less.
02:15:59.680 | So it is the transition.
02:16:01.920 | Once I'm actually doing the set,
02:16:03.120 | I'm never like, I don't want to stop in the middle.
02:16:04.480 | Now it's just like, I'm going to do this.
02:16:05.560 | And I feel happy I just did it.
02:16:07.020 | So it's something, there's something about transitions
02:16:09.160 | that is very, that's why procrastinators are late
02:16:11.480 | a lot of places.
02:16:13.000 | It's, I will procrastinate getting ready
02:16:14.920 | to go to the airport,
02:16:16.880 | even though I know I should leave at three,
02:16:18.240 | so I cannot be stressed.
02:16:19.220 | I'll leave at 3:36 and I'll be super stressed.
02:16:22.440 | Once I'm on the way to the airport,
02:16:24.720 | immediately I'm like, why didn't I do this earlier?
02:16:26.360 | Now I'm back on my phone doing what I was doing.
02:16:28.660 | I just had to get in the damn car or whatever.
02:16:31.280 | So yeah, there's some very, very odd, irrational.
02:16:35.440 | - Yeah, like I was waiting for you to come
02:16:38.640 | and you said that you're running a few minutes late.
02:16:40.480 | And I was like-- - Which I did.
02:16:41.980 | - I was like, I'll go get a coffee
02:16:45.560 | because I can't possibly be the one who's early.
02:16:48.720 | - Right. - I can't,
02:16:49.600 | I don't understand, I'm always late to stuff.
02:16:51.800 | And I know it's disrespectful in the eyes of a lot of people.
02:16:55.280 | I can't help it. You know what I'm doing ahead of it?
02:16:58.160 | It's not like I don't care about the people.
02:17:00.120 | I'm often like, for like this conference,
02:17:03.000 | I'd be preparing more.
02:17:04.360 | - Right. - Like it's like,
02:17:06.040 | I obviously care about the person, but for some--
02:17:08.600 | - Yeah, it's misinterpreted as like,
02:17:10.080 | there are some people that like show up late
02:17:13.060 | because they like, they kind of like that quality
02:17:14.900 | in themselves and that's a dick, right?
02:17:16.580 | There's a lot of those people.
02:17:17.680 | But more often, it's someone who shows up frazzled
02:17:20.340 | and they feel awful and they're furious at themselves.
02:17:22.580 | - Yeah. - They're so regretful.
02:17:23.540 | - Exactly. - I mean, that's me.
02:17:24.940 | And I mean, also, all you have to do
02:17:26.580 | is look at those people alone running through the airport.
02:17:28.860 | Right? - Yeah.
02:17:29.820 | - They're not being disrespectful to anyone there.
02:17:31.180 | They just inflicted this on themselves.
02:17:33.100 | Like-- - It's hilarious.
02:17:34.060 | - Yeah.
02:17:35.300 | - You've tweeted a quote by James Baldwin saying,
02:17:38.260 | quote, "I imagine one of the reasons people cling
02:17:40.620 | "to their hates so stubbornly is because they sense
02:17:45.620 | "once hate is gone, they will be forced
02:17:48.180 | "to deal with the pain."
02:17:49.700 | What has been a painful but formative experience
02:17:53.860 | in your life?
02:17:55.340 | Or what's the flavor, the shape of your pain that fuels you?
02:17:59.040 | - I mean, honestly, the first thing that jumped to mind
02:18:02.700 | is my own battles against myself to get my work done
02:18:06.300 | because it affects everything.
02:18:07.700 | When I, I just took five years on this book
02:18:09.740 | and granted, it's a beast.
02:18:11.260 | Like, I probably would have taken two or three years,
02:18:13.420 | but it didn't need to take five.
02:18:14.540 | And that was a lot of, not just, you know,
02:18:16.660 | not just that I'm not working.
02:18:17.900 | It's that I'm over-researching.
02:18:19.900 | I'm making it, I'm adding in things I shouldn't
02:18:22.420 | because I'm perfectionist, you know,
02:18:23.860 | being a perfectionist about like, oh, well, I learned that.
02:18:25.700 | Now I want to get it in there.
02:18:26.840 | I know I'm going to end up cutting it later.
02:18:28.020 | Just, you know, or I over-outline, you know,
02:18:30.980 | something, you know, trying to get it perfect
02:18:32.700 | when I know that's not possible.
02:18:33.980 | Just making a lot of immature kind of,
02:18:36.700 | like, I'm not actually that much of a writing amateur.
02:18:38.820 | I've written, including my old blog,
02:18:40.700 | I've been a writer for 15 years.
02:18:42.300 | I know what I'm doing.
02:18:43.300 | I could advise other writers really well.
02:18:45.940 | And yet I do a bunch of amateur things
02:18:47.940 | that I know while I'm doing them,
02:18:50.000 | is I know I'm being an amateur.
02:18:51.260 | So that A, it hurts the actual product.
02:18:55.620 | It makes, you know, B, it wastes your precious time.
02:18:59.380 | C, when you're mad at yourself,
02:19:01.820 | when you're in a negative, you know, self-defeating spiral,
02:19:05.860 | it almost inevitably, you'll be less good to others.
02:19:10.020 | Like, you know, I'll just, I used to, you know,
02:19:12.280 | early on in my now marriage,
02:19:15.660 | one of the things we always used to do
02:19:17.260 | is I used to plan mystery dates.
02:19:19.060 | You know, New York City, great place for this.
02:19:20.860 | I'd find some weird little adventure for us.
02:19:23.060 | You know, it could be anything.
02:19:24.740 | And I wouldn't tell her what it was.
02:19:25.740 | I said, "I'm reserving you for Thursday night,
02:19:27.500 | "you know, at seven, okay?"
02:19:29.100 | And it was such a fun part of our relationship.
02:19:31.460 | Started writing this book and got into a really bad,
02:19:33.900 | you know, personal space where it was like,
02:19:35.940 | in my head, I was like,
02:19:36.780 | "I can't do anything until this is done."
02:19:38.260 | You know, like, no.
02:19:39.140 | And I just stopped, like, ever valuing, like,
02:19:42.620 | like, joy of any kind.
02:19:44.260 | Like, I was like, "No, no, that's when I'm done."
02:19:47.180 | And that's a trap, or very quickly, you know,
02:19:49.940 | 'cause I always think, you know,
02:19:50.780 | think it's gonna be six months away,
02:19:51.960 | but actually five years later, I'm like,
02:19:53.080 | "Wow, I really wasn't living fully."
02:19:55.820 | And five years is not nothing, we don't live very long.
02:19:58.300 | Like, you're talking about your prime decades.
02:19:59.860 | Like, that's like a sixth of my prime years.
02:20:01.820 | Like, wow, like, that's a huge loss.
02:20:04.900 | So to me, that was excruciating.
02:20:06.300 | And, you know, and it was a bad pattern,
02:20:10.060 | a very unproductive, unhelpful pattern for me,
02:20:13.020 | which is I'd wake up in the morning in this great mood.
02:20:15.580 | Great mood every morning.
02:20:16.620 | Wake up, thrilled to be awake.
02:20:18.140 | I have the whole day ahead of me.
02:20:19.420 | I'm gonna get so much work done today.
02:20:21.180 | And, but, you know, first I'm gonna do all these other things
02:20:23.900 | and it's all gonna be great.
02:20:24.740 | And then I ended up kind of failing for the day
02:20:27.900 | with those goals, sometimes miserably,
02:20:30.300 | sometimes only partially.
02:20:31.940 | And then I get in bed,
02:20:34.300 | probably a couple hours later than I want to.
02:20:36.060 | And that's when all of the real reality hits me.
02:20:39.060 | Suddenly, so much regret, so much anxiety,
02:20:42.060 | furious at myself, wishing I could take a time machine back
02:20:44.740 | three months, six months, a year,
02:20:46.740 | or just even to the beginning of that day.
02:20:49.020 | And just tossing and turning now.
02:20:51.940 | I mean, this is a very bad place.
02:20:53.180 | That's why I said suffering.
02:20:54.340 | Procrastinators suffer in a very serious way.
02:20:56.780 | So look, I, you know, I know this probably sounds
02:21:00.100 | like a lot of like first world problems, and it is,
02:21:02.140 | but it's real suffering as well.
02:21:03.580 | Like it's, so to me, it's like,
02:21:06.940 | it's painful because you're not being,
02:21:11.540 | you're not being as good a friend or a spouse
02:21:13.220 | or whatever as you could be.
02:21:14.340 | You're also not treating yourself very well.
02:21:16.100 | You're usually not being very healthy in these moments.
02:21:17.920 | You know, you're often, and you're not being,
02:21:20.340 | I'm not being good to my readers.
02:21:21.460 | So it's just a lot of this.
02:21:23.000 | And it's like, it feels like it's one small tweak away.
02:21:27.340 | Sometimes it's like, that's what I said.
02:21:28.660 | It's like, you just suddenly are just doing that nine to 12
02:21:31.220 | and you get in that habit.
02:21:32.300 | Everything else falls into place.
02:21:33.700 | All of this reverses.
02:21:35.400 | So I feel hopeful, but it's like, it is a,
02:21:39.060 | I have not figured, I haven't fixed the boat yet.
02:21:41.220 | I have some good duct tape though.
02:21:43.420 | - And you also don't want to romanticize it
02:21:45.460 | 'cause it is true that some of the greats in history,
02:21:48.460 | especially writers, suffer from all the same stuff.
02:21:51.220 | Like they weren't quite able, I mean,
02:21:53.820 | you might only write for two or three hours a day,
02:21:56.660 | but the rest of the day is often spent,
02:21:58.780 | you know, kind of tortured by--
02:22:00.620 | - Well, right, this is the irrational thing.
02:22:02.820 | And this goes for a lot of people's jobs,
02:22:05.100 | people especially who work for themselves.
02:22:07.180 | You'd be shocked how much you could do: wake up at nine
02:22:10.220 | or eight or seven or whatever,
02:22:11.980 | get to work and stop at one,
02:22:13.340 | but you're really focused in those hours.
02:22:15.420 | One or two, and do 25 really focused hours of stuff,
02:22:20.060 | productive stuff a week,
02:22:20.900 | and then there's 112 waking hours in the week, right?
02:22:23.980 | So we're talking about 80 something hours of free time.
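For concreteness, the arithmetic behind the "80 something hours" figure, as a minimal sketch; the 16 waking hours per day is an assumption used to reproduce the 112 total mentioned here.

```python
waking_hours_per_week = 16 * 7            # 112 waking hours (assumes 16 a day)
focused_hours = 25
free_hours = waking_hours_per_week - focused_hours
print(waking_hours_per_week, free_hours)  # 112, 87 -- the "80 something hours"
```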
02:22:27.220 | You can live, you know, if you're just really focused
02:22:29.540 | in your yin and yang of your time,
02:22:31.460 | that's my goal, black and white time.
02:22:34.140 | Really focused time and then totally, like,
02:22:37.300 | clean conscience free time.
02:22:39.460 | Right now I have neither, it's a lot of gray.
02:22:40.780 | It's a lot of I should be working, but I'm not,
02:22:42.420 | oh, I'm wasting this time, this is bad.
02:22:44.260 | And that's just as massive.
02:22:45.620 | So if you can just get really good at the black and the white
02:22:49.580 | so you just wake up and it's just like full work.
02:22:51.580 | And then I think a lot of people could have like,
02:22:53.580 | all this free time, but instead,
02:22:54.700 | I'll do those same three hours.
02:22:55.940 | It's like you said, I'll do them really late at night
02:22:57.700 | or whatever after having tortured myself the whole day
02:23:00.140 | and not had any fun.
02:23:01.060 | It's not like I'm having fun.
02:23:03.140 | I call it the dark playground, by the way,
02:23:04.620 | which is where you are when you know you should be working,
02:23:06.900 | but you're doing something else.
02:23:08.740 | You're doing something fun on paper,
02:23:10.780 | but it's never, it feels awful.
02:23:12.740 | And so, yeah, I spend a lot of time in the dark.
02:23:14.660 | - And you know you shouldn't be doing it
02:23:16.060 | and you still do it and yeah.
02:23:18.100 | - It's not clean conscience fun, it's bad, it's toxic.
02:23:21.340 | And I think that it's, there's something about,
02:23:23.780 | you know, you're draining yourself all the time.
02:23:25.420 | And if you just did your focused hours
02:23:26.820 | and then if you actually have good, clean, fun,
02:23:28.780 | fun can be anything.
02:23:29.620 | You're reading a book, can be hanging out with someone,
02:23:31.460 | it can be really fun.
02:23:32.300 | You can go and do something cool in the city.
02:23:33.900 | You know, that is critical.
02:23:36.660 | It's, you're recharging some part of your psyche there.
02:23:39.060 | And I think it makes it easier
02:23:39.940 | to actually work the next day.
02:23:40.940 | And I say this from the experiences when I have had,
02:23:43.340 | if you know, good stretches, it's like,
02:23:45.300 | it's, you know what it is?
02:23:46.660 | It's like, you feel like you're fist pounding.
02:23:48.380 | One part of your brain is fist pounding the other part.
02:23:50.300 | Like, you're like, we got, like, we treat ourselves well.
02:23:54.820 | Like, it's how you internally feel, like, I treat myself well.
02:23:56.980 | And it's like, yeah, no, of course it's work time.
02:23:58.380 | And then later you're like, now it's play time.
02:23:59.820 | And it's like, okay, back to work.
02:24:01.020 | And you're in this very healthy,
02:24:02.180 | like parent-child relationship in your head
02:24:04.100 | versus like this constant conflict.
02:24:06.940 | And like the kid doesn't respect the parent
02:24:08.660 | and parent hates the kid and like, yeah.
02:24:10.980 | - And you're right.
02:24:11.820 | It always feels like it's like one fix away.
02:24:14.540 | So there's hope.
02:24:15.860 | I mean, I guess, I mean,
02:24:18.620 | so much of what you said just rings so true.
02:24:21.700 | I guess I have the same kind of hope.
02:24:24.220 | - But you know, this podcast is very regular.
02:24:27.700 | I mean, I'm impressed.
02:24:29.060 | Like, and I think partially what,
02:24:32.660 | there is a bit of a duct tape solution here,
02:24:34.380 | which is you just, 'cause it's always easy
02:24:37.060 | to schedule stuff for the future for myself, right?
02:24:39.100 | Because that's future Tim and future Tim is not my problem.
02:24:42.480 | So I'll schedule all kinds of shit for future Tim
02:24:44.780 | and I will, and I will then not do it.
02:24:48.340 | But in this case, you can schedule podcasts
02:24:51.740 | and you have to show up.
02:24:53.020 | - Yeah, you have to show up.
02:24:53.860 | - Right, it seems like a good medium for procrastinators.
02:24:55.820 | - This is not my, this is what I do for fun.
02:24:58.020 | - I know, but at least this is the kind of thing,
02:25:00.540 | especially if it's not your main thing.
02:25:02.580 | Especially if it's not your main thing,
02:25:03.420 | it's the kind of thing that you would dream of doing
02:25:05.100 | and wanna do and never do.
02:25:06.420 | And I feel like your regular production here
02:25:10.100 | is a sign that something is working,
02:25:12.220 | at least in this regard.
02:25:13.380 | - Yeah, in this regard, but this,
02:25:14.300 | I'm sure you have this same kind of thing with the podcast.
02:25:17.100 | In fact, because you're gonna be doing the podcast,
02:25:19.060 | it's possible the podcast becomes
02:25:20.540 | what the podcast is for me.
02:25:22.740 | - This is you procrastinating.
02:25:24.460 | If you think about being 80
02:25:25.740 | and if you can get into that person's head
02:25:27.420 | and look back and be like, just deep regret,
02:25:29.780 | you just, you know, yearning,
02:25:31.520 | you could do anything to just go back
02:25:33.260 | and have done this differently, that is desperation.
02:25:36.120 | It's just you don't feel it yet.
02:25:37.180 | It's not in you yet.
02:25:38.800 | The other thing you could do is if you have a partner,
02:25:40.340 | if you wanna partner with someone,
02:25:41.380 | now you could say, we meet these 15 hours every week.
02:25:45.060 | And that point, you're gonna get it done.
02:25:46.440 | So working with someone can help.
02:25:48.980 | - Yeah, that's why they say like a co-founder
02:25:51.300 | is really powerful for many reasons,
02:25:52.820 | but that's kind of one of them.
02:25:54.580 | Because to actually, for the startup case,
02:25:57.060 | you, unlike writing, perhaps,
02:26:00.940 | it's really like a hundred hour plus thing.
02:26:03.260 | Like once you really launch, you go all in.
02:26:07.060 | Like everything else just disappears.
02:26:09.420 | Like you can't even have a hope of a balanced life
02:26:11.900 | for a little bit.
02:26:12.720 | So, and there co-founder really helps.
02:26:15.380 | That's the idea.
02:26:16.380 | When you, you're one of the most interesting people
02:26:20.940 | on the internet.
02:26:21.820 | So as a writer, you look out into the future.
02:26:26.580 | Do you dream about certain things you want to still create?
02:26:30.500 | Is there projects that you wanna write?
02:26:34.180 | Is there movies you want to write or direct or?
02:26:37.460 | - Endless.
02:26:38.700 | - So it's just endless sea of ideas.
02:26:41.020 | - No, there's specific list of things that really excite me,
02:26:44.180 | but it's a big list that I know
02:26:46.260 | I'll never get through them all.
02:26:47.380 | And that's part of why the last five years really like,
02:26:51.620 | when I feel like I'm not moving as quickly as I could,
02:26:54.380 | it bothers me because I have so much genuine excitement
02:26:57.220 | to try so many different things.
02:26:58.420 | And I get so much joy from finishing things.
02:27:00.240 | I don't like doing things, but a lot of writers are like that.
02:27:03.640 | Publishing something is hugely joyful
02:27:06.620 | and makes it all worth it.
02:27:08.140 | Or just finishing something you're proud of,
02:27:10.020 | putting it out there and have people appreciate it.
02:27:11.520 | It's like the best thing in the world, right?
02:27:13.300 | You know, a lot of every kid makes
02:27:14.420 | some little bargain with themselves,
02:27:15.740 | has a little, you know, a dream or, you know, something.
02:27:19.060 | And I feel like when I do something,
02:27:21.500 | that I make something, and this, you know,
02:27:23.700 | for me it's been mostly writing,
02:27:25.780 | and I feel proud of it and I put it out there.
02:27:27.700 | I feel like I like, again,
02:27:29.160 | I'm like fist pounding my seven year old self.
02:27:30.980 | Like there's a little like, I'm,
02:27:33.260 | I like, I owe it to myself to do certain things.
02:27:35.500 | And I just did one of the things I owe.
02:27:36.860 | I just paid off some debt to myself.
02:27:38.060 | I owed it and I paid it and it feels great.
02:27:40.260 | It feels like very like,
02:27:41.740 | you just feel very, a lot of inner peace when you do it.
02:27:43.860 | So the more things I can do, you know,
02:27:46.140 | and I just have fun doing it, right?
02:27:47.460 | So I just, it's, for me, that includes a lot more writing.
02:27:50.860 | I just, you know, short, short, no, short blog posts.
02:27:53.460 | I write very long blog posts,
02:27:54.460 | but basically short writing in the form
02:27:56.100 | of long blog posts is a great, I love that medium.
02:27:59.060 | I wanna do a lot more of that.
02:28:00.860 | Books yet to be seen.
02:28:01.900 | I'm gonna do this and I'm gonna have another book
02:28:03.260 | I'm gonna do right after,
02:28:04.100 | and we'll see if I like those two.
02:28:05.340 | And if I do, I'll do more, otherwise I won't.
02:28:07.600 | But I also wanna try other mediums.
02:28:08.740 | I wanna make more videos.
02:28:10.840 | I want to, I did a little travel series once.
02:28:14.540 | I love doing that.
02:28:15.380 | I wanna do, you know, more of that.
02:28:16.740 | - Almost like a vlog, like--
02:28:18.500 | - No, it was, I let readers in a survey
02:28:21.220 | pick five countries they wanted me to go to.
02:28:24.780 | - That's awesome, okay.
02:28:25.620 | - And they picked, they sent me to weird places.
02:28:27.940 | They sent me, I went to Siberia, I went to Japan.
02:28:32.460 | I went from there to, this is all in a row,
02:28:34.740 | on to Nigeria, from there to Iraq,
02:28:37.380 | and from there to Greenland.
02:28:39.140 | And then I went back to New York,
02:28:41.000 | like two weeks in each place.
02:28:42.440 | And I get to, you know, each one I got to, you know,
02:28:45.840 | have some weird experiences.
02:28:46.840 | I tried to like really dig in and have like,
02:28:49.600 | you know, some interesting experiences.
02:28:51.560 | And then I wrote about it,
02:28:52.480 | and I taught readers a little bit
02:28:53.840 | about the history of these places.
02:28:54.960 | And it was just, I love doing that.
02:28:56.280 | I love, so, you know, and I'm like,
02:28:58.480 | oh man, like I haven't done one of those in so long.
02:29:01.780 | And then I have a big like desire to do fictional stuff.
02:29:05.760 | Like I wanna write a sci-fi at some point,
02:29:07.680 | and I would love to write a musical.
02:29:09.700 | That's actually what I was doing before Wait But Why.
02:29:11.460 | I was with a partner, Ryan Langer.
02:29:14.380 | We were halfway through a musical,
02:29:15.820 | and he got tied up with his other musical,
02:29:19.780 | and Wait But Why started taking off,
02:29:21.380 | and we just haven't gotten back to it.
02:29:22.700 | But it's such a fun medium.
02:29:24.220 | It's such a silly medium, but it's so fun.
02:29:26.500 | - So you think about all of these mediums
02:29:28.420 | on which you can be creative and create something,
02:29:30.660 | and you like the variety of it.
02:29:32.580 | - Yeah, it's just that I,
02:29:34.820 | if there's a chance on a new medium,
02:29:37.180 | I could do something good, I wanna do it, I wanna try it.
02:29:40.260 | It sounds like so gratifying, so fun.
02:29:42.460 | - I think it's fun to just watch you actually sample these.
02:29:45.500 | So I can't wait for your podcast.
02:29:47.740 | I'll be listening to all of them.
02:29:49.460 | I mean, that's a cool medium to see like where it goes.
02:29:52.220 | The cool thing about podcasting or making videos,
02:29:54.340 | especially with a super creative mind like yours,
02:29:56.840 | you don't really know what you're gonna make of it
02:29:59.640 | until you try it.
02:30:01.160 | - Yeah, podcasts, I'm really excited about,
02:30:03.020 | but I'm like, I like going on other people's podcasts.
02:30:06.540 | And I never try to have my own.
02:30:08.140 | - So there's, with every medium,
02:30:09.980 | there's the challenges of how the sausage is made.
02:30:13.180 | So like the challenges of the challenge of actually.
02:30:15.380 | - Yeah, but it's also, I like to like,
02:30:16.700 | I'll go on like, as you know, long ass monologues.
02:30:19.420 | And you can't do it on, if you're the interviewer,
02:30:21.720 | like you're not supposed to do that as much.
02:30:23.420 | So I have to like rein it in.
02:30:24.780 | And that can be, that might be hard, but we'll see.
02:30:28.060 | - You could also do solo type stuff.
02:30:29.500 | - Yeah, maybe I'll do a little of each.
02:30:31.180 | - You know what's funny?
02:30:32.020 | I mean, some of my favorite is more like solo,
02:30:34.140 | but there's like a sidekick.
02:30:36.060 | So you're having a conversation, but you're like friends,
02:30:40.600 | but it's really you ranting,
02:30:43.180 | which I think you'd be extremely good at.
02:30:46.100 | - That's funny, yeah.
02:30:46.940 | Or even if it's 50/50, that's fine.
02:30:48.820 | Like if it's just a friend who I wanna like really riff with,
02:30:51.860 | I just don't, I don't like interviewing someone,
02:30:55.380 | which I won't, that's not what the podcast will be,
02:30:57.180 | but I can't help, I've tried moderating panels before,
02:30:59.940 | and I cannot help myself.
02:31:01.020 | I have to get involved.
02:31:02.140 | And no one likes a moderator who's too involved.
02:31:04.020 | It's very unappealing.
02:31:04.900 | So I'm interviewing someone and I'm like, I can't,
02:31:08.100 | I don't even know, I just, it's not my,
02:31:09.980 | I can grill someone, but that's different.
02:31:11.540 | That's my curiosity being like, wait, how about this?
02:31:13.740 | And I interrupt them and I'm trying to-
02:31:14.580 | - Yeah, I see the way your brain works is hilarious.
02:31:17.100 | It's awesome.
02:31:17.940 | It's like lights up with fire and excitement.
02:31:19.940 | Yeah, I actually, I love listening.
02:31:21.860 | I like watching people, I like listening to people.
02:31:24.100 | So this is like me right now,
02:31:25.700 | just listening to a podcast.
02:31:27.900 | This is me listening to your podcast right now.
02:31:28.740 | - I love listening to a podcast
02:31:30.780 | because then it's not even like,
02:31:31.820 | but once I'm in the room,
02:31:32.820 | I suddenly can't help myself by jumping in.
02:31:35.380 | - Okay, big last ridiculous question.
02:31:38.380 | What is the meaning of life?
02:31:40.340 | - The meaning of like an individual life?
02:31:42.380 | - Your existence here on earth,
02:31:44.260 | or maybe broadly this whole thing we got going on,
02:31:47.180 | descendants of apes, busily creating.
02:31:50.420 | - Yeah, well, there's, yeah.
02:31:51.260 | For me, I feel like I want to be around as long as I can.
02:31:55.580 | If I can do some kind of crazy life extension
02:31:57.860 | or upload myself, I'm gonna,
02:31:59.060 | because who doesn't want to see how cool
02:32:02.020 | the year 3000 is, imagine.
02:32:03.700 | - You did say mortality was not appealing.
02:32:06.300 | - No, it's not appealing at all to me.
02:32:08.100 | Now, it's ultimately appealing.
02:32:10.180 | As I said, no one wants eternal life, I believe,
02:32:12.260 | if they understood what eternity really was.
02:32:14.180 | And I did Graham's number as a post,
02:32:15.900 | and I was like, okay, no one wants to live that many years.
02:32:18.140 | But I'd like to choose.
02:32:19.340 | I'd like to say, you know what,
02:32:20.180 | I'm truly over it now, and I'm gonna have,
02:32:21.600 | you know, at that point, we'd have,
02:32:22.700 | our whole society would have like,
02:32:24.380 | we'd have a ceremony.
02:32:25.620 | We'd have a whole process of someone signing off,
02:32:28.060 | and you know, it would be beautiful,
02:32:29.420 | and it wouldn't be sad.
02:32:30.860 | - No, I think you'd be super depressed by that point.
02:32:33.540 | Like, who's gonna sign off when they're doing pretty good?
02:32:35.380 | - Maybe, maybe, yes, okay, maybe it's dark.
02:32:37.500 | But at least, but the point is,
02:32:38.740 | if I'm happy, I can stay around for, you know,
02:32:40.500 | but I'm thinking 50 centuries sounds great.
02:32:42.780 | Like, I don't know if I want more than that.
02:32:44.140 | 50 sounds like the right number,
02:32:45.540 | and so if you're thinking,
02:32:46.860 | if you would sign up for 50 if you had a choice,
02:32:48.780 | one, which is what I get, is bullshit.
02:32:50.500 | Like, if you're someone who wants 50,
02:32:52.820 | one is a hideous number, right?
02:32:55.500 | You know, anyway.
02:32:56.820 | So, for me personally,
02:32:59.540 | I wanna be around as long as I can.
02:33:01.140 | And then, honestly, the reason I love writing,
02:33:03.020 | the thing that I love most,
02:33:04.140 | is like, warm, fuzzy connection with other people, right?
02:33:08.540 | And that can be my friends,
02:33:10.340 | and it can be readers.
02:33:11.820 | And that's why I would never wanna be like a journalist,
02:33:13.980 | where their personality's like hidden behind the writing.
02:33:16.740 | Or like, even a biographer, you know?
02:33:19.660 | There's a lot of people who would do,
02:33:20.500 | who are great writers, but it's,
02:33:21.580 | I like to personally connect.
02:33:23.660 | And if I can take something that's in my head,
02:33:25.380 | and other people can say, oh my God, I think that too,
02:33:27.060 | and this made me feel so much better,
02:33:28.420 | it made me feel seen, like, that feels amazing.
02:33:30.260 | And I just feel like,
02:33:31.780 | we're all having such a weird common experience,
02:33:34.700 | on this one little rock,
02:33:35.980 | in this one little moment of time,
02:33:37.380 | where this weird,
02:33:38.900 | these weird four-limbed beings,
02:33:40.940 | and we're all the same,
02:33:41.760 | and it's like, we're all,
02:33:43.100 | the human experience,
02:33:43.940 | so I feel like so many of us suffer in the same ways,
02:33:46.220 | and we're all going through a lot of the same things.
02:33:48.100 | And to me, it is very,
02:33:49.780 | if I lived, if I,
02:33:50.900 | it was on my death bed,
02:33:51.740 | and I feel like I had like,
02:33:52.560 | I had a ton of human connection,
02:33:54.780 | and like, shared a lot of common experience,
02:33:56.540 | and made a lot of other people feel like,
02:33:58.540 | like, not alone.
02:34:01.500 | - Do you feel that as a writer?
02:34:02.820 | Do you like,
02:34:04.420 | hear and feel like,
02:34:06.820 | the inspiration,
02:34:08.100 | like, all the people that you make smile,
02:34:10.500 | and all the people you inspire?
02:34:11.460 | - Honestly, not, sometimes, you know,
02:34:13.540 | when we did an in-person event,
02:34:14.940 | and I, you know, meet a bunch of people,
02:34:16.300 | and it's incredibly gratifying,
02:34:18.140 | or you know, you just, you know, you get emails,
02:34:19.660 | but I think it is easy to forget that,
02:34:21.100 | how many people, sometimes your stuff--
02:34:22.940 | - 'Cause you're just sitting there alone, typing.
02:34:25.060 | - Yeah.
02:34:25.900 | - And you get procrastination.
02:34:26.900 | - But that's why publishing is so gratifying,
02:34:28.260 | 'cause that's the moment when all this connection happens.
02:34:30.060 | - Yeah.
02:34:30.900 | - And especially if I had to put my finger on it,
02:34:32.260 | it's like, it's having a bunch of people
02:34:34.660 | who feel lonely,
02:34:35.500 | and they're like, the existence is all realized,
02:34:37.100 | like, all, you know, connect, right?
02:34:38.500 | So that, if I do a lot of that,
02:34:40.260 | and that includes, of course,
02:34:41.780 | my actual spending, you know,
02:34:43.300 | a lot of really high quality time with friends and family,
02:34:45.460 | and like,
02:34:46.300 | and making the whole thing as heartbreaking
02:34:50.180 | as like, mortality in life can be,
02:34:51.940 | make the whole thing like, fun,
02:34:53.460 | and at least we can like, laugh at ourselves together
02:34:55.740 | while going through it.
02:34:56.940 | - Yeah.
02:34:57.780 | - And that to me is the, yeah.
02:34:58.620 | - And then your last blog post will be written from Mars,
02:35:02.620 | as you get the bad news that you're not able to return
02:35:05.420 | because of the malfunction in the rocket.
02:35:07.700 | - Yeah, I would like to go to Mars,
02:35:09.180 | and like, go there for a week,
02:35:10.140 | and be like, yay, here we are,
02:35:11.180 | and then come back.
02:35:12.020 | - No, I know that's what you want.
02:35:13.500 | - Staying there, yeah.
02:35:14.780 | And that's fine, by the way.
02:35:15.740 | If I, yeah, if, so you think,
02:35:17.780 | you're picturing me alone on Mars
02:35:19.220 | as the first person there,
02:35:20.740 | and then it malfunctions.
02:35:21.580 | - Right, no, you were supposed to return,
02:35:23.180 | but it malfunctions,
02:35:24.220 | and then there's this,
02:35:25.420 | so it's both the hope, the awe that you experience,
02:35:30.540 | which is how the blog starts,
02:35:32.420 | and then it's the overwhelming,
02:35:34.980 | like, feeling of existential dread,
02:35:38.060 | but then it returns to like, the love of humanity.
02:35:41.220 | - Well, that's the thing,
02:35:42.060 | is if I could be writing,
02:35:43.780 | and actually like, writing something
02:35:44.860 | that people would read back on,
02:35:46.060 | it would make it feel so much better.
02:35:47.260 | - Yeah.
02:35:48.100 | - You know, if I were just alone,
02:35:48.920 | and no one was gonna realize what happened.
02:35:50.620 | - No, no, no, you get to write.
02:35:51.620 | No, no, no, it's perfect as safe.
02:35:53.180 | - Well, also, that would bring out great writing.
02:35:54.940 | - Yeah, I think so.
02:35:55.780 | - You know, your deathbed on Mars alone.
02:35:56.980 | - I think so.
02:35:57.820 | - Yeah.
02:35:58.640 | - Well, that's exactly the future I hope for you, Tim.
02:36:02.780 | All right, this was an incredible conversation.
02:36:04.500 | You're a really special human being, Tim.
02:36:06.580 | Thank you so much for spending
02:36:08.980 | your really valuable time with me.
02:36:10.140 | I can't wait to hear your podcast.
02:36:11.900 | I can't wait to read your next blog post,
02:36:15.280 | which you said in a Twitter reply.
02:36:17.220 | You'll get more to--
02:36:19.460 | - Yeah, soon enough.
02:36:20.940 | - After the book,
02:36:21.820 | which add that to the long list of ideas to procrastinate.
02:36:26.140 | How about, Tim, thanks so much for talking to me, man.
02:36:28.700 | - Thank you.
02:36:29.540 | - Thanks for listening to this conversation with Tim Urban.
02:36:33.580 | To support this podcast,
02:36:34.940 | please check out our sponsors in the description.
02:36:37.780 | And now, let me leave you with some words
02:36:39.820 | from Tim Urban himself.
02:36:41.360 | Be humbler about what you know,
02:36:44.940 | more confident about what's possible,
02:36:47.420 | and less afraid of things that don't matter.
02:36:51.660 | Thanks for listening, and hope to see you next time.
02:36:54.220 | (upbeat music)
02:36:56.800 | (upbeat music)