Tim Urban: Elon Musk, Neuralink, AI, Aliens, and the Future of Humanity | Lex Fridman Podcast #264
Chapters
0:00 Introduction
0:38 The big and the small
8:28 Aliens
16:42 The pencil problem
23:27 Food abundance
25:31 Extinction of human civilization
30:49 Future politics of Mars
37:49 SpaceX
43:49 Elon Musk
69:17 Nuclear power
73:43 The higher mind
78:27 Echo chambers and idea labs
81:39 How our brain processes film and music
84:53 Neuralink
93:07 Future of physical interactions
97:18 AI
104:38 Free speech
108:41 How to read more
115:23 Spaced repetition
119:26 Procrastination
146:18 Goals for the future
151:36 Meaning of life
the calculation I came to is that you can read 00:00:17.660 |
The following is a conversation with Tim Urban, 00:00:45.640 |
What world do you find most mysterious or beautiful, 00:00:50.660 |
- The very small seems a lot more mysterious. 00:00:54.080 |
And the very big, I feel like we kind of understand. 00:00:56.600 |
I mean, not the very, very big, not the multiverse, 00:01:01.760 |
not anything outside of the observable universe. 00:01:04.160 |
But the very small, I think we really have no idea 00:01:08.180 |
what's going on, or very, you know, much less idea. 00:01:11.280 |
But I find that, so I think the small is more mysterious, 00:01:16.240 |
I just cannot get enough of the bigness of space 00:01:29.240 |
- I mean, we still, the vastness of the observable universe 00:01:33.480 |
has the mystery that we don't know what's out there. 00:01:45.400 |
But like, how many civilizations are out there? 00:01:49.280 |
How many, like, what are the weird things that are out there? 00:02:04.080 |
but it's actually the things that are potentially our size 00:02:07.760 |
- Potentially our size is probably the key word. 00:02:10.120 |
- Yeah, I mean, I wonder how small intelligent life 00:02:15.400 |
And I assume that there's a limit that you're not gonna, 00:02:26.080 |
of order of magnitude smaller and bigger than us for life. 00:02:28.720 |
But maybe not, maybe you could have some giant life form. 00:02:34.000 |
there's gotta be some reason that anything intelligent 00:02:36.640 |
between kind of like a little tiny rodent or finger monkey 00:02:43.640 |
I don't know, maybe when you change the gravity 00:02:52.520 |
and they just get bigger and bigger and bigger. 00:02:55.920 |
A human being is made up of a bunch of tiny organisms 00:03:04.560 |
But maybe it's just organisms on top of organisms. 00:03:08.080 |
Organisms all the way down, turtles all the way down. 00:03:13.320 |
for people, for alien species that's very different. 00:03:28.600 |
- I think of it kind of as if you think about, 00:03:39.480 |
Or maybe there's even units within the colony 00:03:49.440 |
It's like the individual thing that competes with others 00:04:04.680 |
I think of it if emergence happens in an emergence tower, 00:04:10.000 |
and then humans and communities and societies. 00:04:19.280 |
So the colony is this unit that is the competitive unit. 00:04:22.760 |
Humans can kind of go, we take the elevator up and down 00:04:27.280 |
Sometimes we are individuals that are competing 00:04:30.960 |
with other individuals and that's where our mindset is. 00:04:33.160 |
And then other times we get in this crazy zone, 00:04:40.160 |
and doing the same hand motions with all these other people 00:04:43.840 |
You feel like one and you'd sacrifice yourself. 00:04:46.960 |
And so our brains can kind of psychologically 00:04:50.480 |
go up and down this elevator in an interesting way. 00:04:53.600 |
- Yeah, I wonder how much of that is just the narrative 00:05:11.400 |
We're actually all part of this giant network 00:05:14.160 |
of maybe one of the things that makes humans who we are 00:05:21.120 |
The ability to maintain not just a single human intelligence 00:05:29.840 |
'cause we woke up at this level of the hierarchy. 00:05:34.600 |
but we very well could be just part of the ant colony. 00:05:40.840 |
I'm either going to be doing a Shakespearean analysis 00:05:46.480 |
or very specific statements that you've made. 00:05:49.560 |
So you've written answers to a mailbag of questions. 00:05:54.560 |
The questions are amazing, the ones you've chosen, 00:06:00.200 |
somebody asked, "Are we bigger than we are small 00:06:13.600 |
at this level of the very small to the very big? 00:06:21.560 |
I think it depends on what we're asking here. 00:06:26.800 |
that we kind of can talk about without just imagining 00:06:31.800 |
is the observable universe, the Hubble sphere. 00:06:35.660 |
And that's about 10 to the 26th meters in diameter. 00:06:40.660 |
The smallest thing we talk about is a Planck length. 00:06:44.460 |
But you could argue that that's kind of an imaginary thing. 00:06:49.460 |
Now we're, conveniently, about 10 to the one. 00:06:56.920 |
So it's easy because you can just look and say, 00:07:04.700 |
or 10 to the negative 16th meters across, right? 00:07:15.420 |
Smaller than a galaxy and bigger than the biggest star. 00:07:18.500 |
So we're right in between a nebula and an atom. 00:07:48.820 |
But if you want to go down to the Planck length, 00:07:50.340 |
we're very quickly, we're bigger than we are small. 00:07:54.060 |
- Yeah, strings, exactly, string theory and so on. 00:08:06.380 |
neutrino, the size doesn't really make sense. 00:08:09.640 |
But when we talk about some of these neutrinos, 00:08:11.660 |
I mean, if a neutrino is a human, a proton is the sun. 00:08:15.500 |
So that's like, I mean, a proton's real small, 00:08:19.740 |
And so, yeah, the small gets like crazy small very quickly. 00:08:36.020 |
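(A minimal sketch of the powers-of-ten arithmetic above, in Python. The sizes are rough values I'm assuming, not figures from the conversation: a human at ~1.7 m, the observable universe at ~8.8 x 10^26 m across, a proton at ~10^-15 m, and the Planck length at ~1.6 x 10^-35 m. Counting orders of magnitude on each side shows why we're smaller than we are big if the floor is the atomic scale, but bigger than we are small if the floor is the Planck length.)

```python
import math

# Rough sizes in meters -- assumed textbook values, not from the transcript.
PLANCK_LENGTH = 1.6e-35
PROTON = 1e-15
HUMAN = 1.7
OBSERVABLE_UNIVERSE = 8.8e26  # diameter of the Hubble sphere

def orders_between(small, big):
    """Number of powers of ten separating two sizes."""
    return math.log10(big / small)

up = orders_between(HUMAN, OBSERVABLE_UNIVERSE)     # ~26.7 orders above us
down_proton = orders_between(PROTON, HUMAN)         # ~15.2 orders below us
down_planck = orders_between(PLANCK_LENGTH, HUMAN)  # ~35.0 orders below us

print(f"up to the observable universe: {up:.1f} orders of magnitude")
print(f"down to a proton:              {down_proton:.1f} -> smaller than we are big")
print(f"down to the Planck length:     {down_planck:.1f} -> bigger than we are small")
```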
This is a thing that could change day by day. 00:08:52.220 |
- If I had a gun to my head, I'd have to take a guess. 00:09:17.780 |
running through a randomized Drake equation multiplication, 00:09:35.500 |
Now what's interesting is that there's a long tail 00:09:38.220 |
because they believe some of these multipliers 00:09:42.060 |
So for example, the probability that life starts 00:09:48.180 |
they think that the kind of range that we use 00:10:03.340 |
like 200, they think that that variable should be, 00:10:25.580 |
a non-zero like legitimate amount of outcomes there 00:10:29.060 |
that have us as the only life in the observable universe 00:10:33.500 |
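(A hedged sketch of the kind of randomized Drake-style multiplication being described, with illustrative factor ranges of my own; none of these numbers come from the conversation. It reproduces the long-tail point: if even one multiplier, like the chance that life starts, is uncertain across many orders of magnitude, a legitimate fraction of sampled outcomes leaves us effectively alone, even though the average number of civilizations is enormous.)

```python
import math
import random

random.seed(0)

def log_uniform(lo, hi):
    """Sample uniformly in log space -- for factors uncertain across orders of magnitude."""
    return 10 ** random.uniform(math.log10(lo), math.log10(hi))

N_STARS = 1e11  # rough Milky Way star count (assumed)
trials = 100_000
alone = 0
total = 0.0
for _ in range(trials):
    civs = (N_STARS
            * log_uniform(0.1, 1.0)    # fraction of stars with planets (assumed)
            * log_uniform(0.1, 1.0)    # habitable planets per system (assumed)
            * log_uniform(1e-30, 1.0)  # chance life starts -- the long-tailed factor
            * log_uniform(1e-3, 1.0)   # life -> intelligence (assumed)
            * log_uniform(1e-2, 1.0))  # intelligence -> detectable civilization (assumed)
    total += civs
    alone += civs < 1

print(f"mean civilizations per run: {total / trials:.3g}")
print(f"runs where we're alone:     {alone / trials:.1%}")
```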
I mean, it seems incredibly counterintuitive. 00:10:37.100 |
when you mentioned that people think you're, you know, 00:10:41.580 |
if you picked up one grain of sand on a beach 00:10:43.260 |
and examined it and you found all these little things on it, 00:10:53.420 |
So, and then the other hand, we don't see anything. 00:10:57.020 |
which of course people would say that the people 00:10:59.260 |
who scoff at the concept that we're potentially alone, 00:11:05.860 |
there's lots of reasons we wouldn't have seen anything 00:11:07.820 |
and they can go list them and they're very compelling, 00:11:16.180 |
if this were a completely freak thing that happened here, 00:11:25.940 |
would think there must be lots of us out there 00:11:30.900 |
using the same intuition that most people would use, 00:11:32.820 |
I'd say there's probably lots of other things out there. 00:11:35.480 |
- Yeah, and you wrote a great blog post about it, 00:11:47.580 |
So one interesting is around the great filter. 00:11:50.700 |
So we either, the great filter's either behind us 00:11:57.460 |
is you get to think about what kind of things 00:12:00.260 |
ensure the survival of an intelligent civilization 00:12:05.260 |
or lead to the destruction of intelligent civilization. 00:12:08.480 |
That's a very pragmatic, very important question 00:12:13.380 |
And then the other one is I'm saddened by the possibility 00:12:18.380 |
that there could be aliens communicating with us 00:12:23.700 |
And we're just too dumb to hear it, to see it. 00:12:29.620 |
Like the idea that the kind of life that can evolve 00:12:34.540 |
is just the range of life that can evolve is so large 00:12:42.860 |
is preventing us from having communication with them. 00:12:47.740 |
because if they were trying to communicate with us, 00:12:50.380 |
they would surely, if they were super intelligent, 00:12:53.420 |
they would be very, I'm sure if there's lots of life, 00:12:56.420 |
we're not that rare, we're not some crazy weird species 00:13:08.140 |
with an Earth-like species, with a human-like species, 00:13:14.120 |
You'd send out radio waves and you send out gravity waves. 00:13:23.900 |
and it's just we're too dumb to perceive the signals, 00:13:25.980 |
it's like, well, they're not doing a great job 00:13:28.380 |
of considering the primitive species we might be. 00:13:35.860 |
wanted to get in touch with us and had the capability of, 00:13:43.580 |
- Well, they may be getting in touch with us. 00:14:00.240 |
Like the nature of the intelligence that's on Earth 00:14:03.500 |
or the thing that's of value and that's curious 00:14:07.580 |
and that's complicated and fascinating and beautiful 00:14:10.620 |
might be something that's not just like tweets, okay? 00:14:17.180 |
or any kind of language or any kind of signal, 00:14:41.740 |
and view life that could be getting communicated with 00:14:54.380 |
versus us humans that seem to treat ourselves 00:14:56.860 |
as super important and we're missing the big picture. 00:15:02.100 |
but our understanding of what is intelligence, 00:15:04.660 |
of what is life, what is consciousness is very limited 00:15:08.020 |
and it seems to be, and just being very suspicious, 00:15:14.380 |
Like this story, it seems like the progress of science 00:15:17.860 |
is constantly putting humans down on the importance, 00:15:28.220 |
the ranking of how big we are, how important we are. 00:15:33.940 |
that's what's happening and I think science is very young. 00:15:37.300 |
And so I think eventually we might figure out 00:15:39.820 |
that there's something much, much bigger going on, 00:15:41.880 |
that humans are just a curious little side effect 00:15:46.420 |
That's what, I mean, as I'm saying, it just sounds insane. 00:15:55.660 |
It gets to that realm where there's something 00:16:01.960 |
- Well, yeah, but not, so religious and spiritual 00:16:06.960 |
often have this kind of woo-woo characteristic, 00:16:16.140 |
I mean more like it's possible that collective intelligence 00:16:19.820 |
is more important than individual intelligence, right? 00:16:22.300 |
It's the ant colony, what's the primal organism? 00:16:26.900 |
- Yeah, I mean, humans, just like any individual ant 00:16:32.060 |
make these incredible structures and has this intelligence. 00:16:36.780 |
I mean, you know the famous thing that no human 00:16:51.660 |
So you have to think about, you have to get the wood, 00:17:06.140 |
who knows how to kind of collect all those materials 00:17:15.300 |
So, you know, the other thing I like to think about, 00:17:18.540 |
I actually put this as a question on the blog once. 00:17:25.980 |
So if a witch, kind of a dickish witch comes around 00:17:30.780 |
and she says, "I'm gonna cast a spell on all of humanity 00:17:34.980 |
"and all material things that you've invented 00:17:42.960 |
There's no buildings, there's no cars and boats and ships 00:17:49.900 |
It's just the stone age earth and a bunch of naked humans, 00:17:52.960 |
but we're all the same, we have the same brain. 00:17:57.940 |
And she says, she communicated to every human, 00:18:02.900 |
"You guys need to make one working iPhone 13. 00:18:28.000 |
of 100 or 1,000 humans within the population, 00:18:34.740 |
I tend to believe that there's fascinating specialization 00:18:50.340 |
It has to, I mean, it's virtually, I mean, okay. 00:19:06.740 |
- Okay, you 100% have to have the, everybody's naked. 00:19:10.820 |
- Everyone's naked and everyone's where they are. 00:19:14.000 |
It's on the ground in what used to be Manhattan. 00:19:26.440 |
He doesn't even know where he, you know, where is everyone? 00:19:28.200 |
You know, oh shit, how am I gonna find other people 00:19:53.200 |
- Well, I think part of it is a communication problem. 00:19:59.620 |
10 really smart people were designated the leaders, 00:20:02.980 |
everyone who can do this to walk West, you know, 00:20:05.960 |
until you get to this little hub and everyone else, 00:20:08.240 |
you know, and they could actually coordinate, 00:20:14.040 |
so you've got some people that are like trying to organize 00:20:17.080 |
where a couple hundred people have come together 00:20:20.080 |
and they designated one person, you know, as the leader, 00:20:24.320 |
we have a start here, we have some organization. 00:20:26.280 |
You're also gonna have some people that say, good, 00:20:28.840 |
humans were a scourge upon the earth and this is good. 00:20:40.880 |
- Well, and so maybe everyone's hopeful for the, 00:20:47.120 |
They, you know, people get, start to lose hope 00:20:50.280 |
new kinds of governments popping up, you know, 00:20:52.200 |
new kinds of societies and they're, you know, 00:20:59.200 |
I think a lot of people will just give up and say, 00:21:00.480 |
you know what, this is it, we're back in the stone age. 00:21:11.040 |
And so we also, there's gonna be a lot of mass starvation. 00:21:14.680 |
And that, you know, when you're trying to organize, 00:21:16.420 |
a lot of people are, you know, coming in with, you know, 00:21:18.760 |
Spears they fashioned and trying to murder everyone 00:21:23.540 |
Given today's society, how much violence would that be? 00:21:29.040 |
So that's something. - We don't have weapons. 00:21:31.720 |
- But we have, and also we have a kind of ethics 00:21:35.680 |
We used to be less, like human life was less valued. 00:21:38.960 |
In the past, so murder was more okay, like ethically. 00:21:41.920 |
- But in the past, they also were really good 00:21:46.340 |
They knew how to get food and water because they, 00:21:49.840 |
Like the ancient hunter-gatherer societies would laugh 00:21:52.880 |
They'd say, you guys, you don't know what you're, 00:21:57.360 |
feeding this amount of people in a very, in a stone age, 00:22:01.120 |
you know, civilization, that's not gonna happen. 00:22:06.000 |
- Well, whoever's not near water is really screwed. 00:22:08.720 |
- You're near a river, a freshwater river, and you know. 00:22:14.480 |
it makes me feel so grateful and like excited about like, 00:22:21.560 |
And this is, talk about collective intelligence. 00:22:30.480 |
collective humans is a super intelligent, you know, 00:22:41.360 |
When I go out, when I'm working and I'm hungry, 00:22:43.020 |
I just go click, click, click, and like a salad's coming. 00:22:47.240 |
If you think about the incredible infrastructure 00:22:52.120 |
it just the internet to, you know, the electricity, 00:22:54.440 |
first of all, that's just powering the things, you know, 00:22:58.040 |
that have to be created and for that electricity 00:23:01.200 |
And then you've got the, of course the internet. 00:23:02.640 |
And then you have this system where delivery drivers 00:23:09.200 |
and all those ingredients came from all over the place. 00:23:13.640 |
I like thinking about these things because it, 00:23:19.280 |
I'm like, man, it would be so awful if we didn't have this. 00:23:21.360 |
And people, people who didn't have it would think 00:23:27.600 |
- Yeah, one of the most amazing things when I showed up, 00:23:32.200 |
and the supermarket was, people don't really realize that, 00:23:39.960 |
so bananas was the thing I was obsessed about. 00:23:43.440 |
I just ate bananas every day for many, many months 00:23:47.720 |
And the fact that you can have as many bananas as you want, 00:23:54.600 |
And the fact that you can somehow have a system 00:23:58.080 |
that brings bananas to you without having to wait 00:24:00.200 |
in a long line, all of those things, it's magic. 00:24:09.800 |
No, so do you know what an avocado used to look like? 00:24:17.080 |
was like a little tiny layer around this big pit 00:24:21.440 |
We've made a crazy, like robot avocados today 00:24:28.320 |
so same with bananas, these big, sweet, you know, 00:24:46.720 |
it's just A, it's like crazy, super engineered cartoon food, 00:24:53.680 |
In our society, oh, you know, we complain about, 00:24:58.320 |
I mean, if you imagine what they would think, 00:25:04.320 |
You know, candy, you know, pasta and spaghetti. 00:25:11.480 |
I mean, things that are grown all over the place, 00:25:20.080 |
I mean, it's, again, just like incredible gratitude. 00:25:25.160 |
- And the question is how resilient is this whole thing? 00:25:27.440 |
I mean, this is another darker version of your question 00:25:30.880 |
is if we keep all the material possessions we have, 00:25:35.240 |
but we start knocking out some percent of the population, 00:25:39.560 |
how resilient is the system that we built up? 00:25:42.080 |
Or if we rely on other humans and the knowledge 00:25:59.200 |
which is Elon Musk says that he has this number, 00:26:05.960 |
you need to be on Mars to truly be multi-planetary. 00:26:26.740 |
And I always like think about if the first fish 00:26:37.260 |
And there should be a little statue of that fish 00:26:41.380 |
But it's, but when we talk about a great leap for life, 00:26:48.660 |
So this is part of why I get so excited about Mars, 00:26:51.080 |
by the way, is because you can count on one hand, 00:26:53.820 |
like the number of great leaps that we've had, 00:26:57.140 |
you know, like no life to life and single cell 00:27:18.680 |
has happened once the ships could stop coming from Earth 00:27:22.740 |
because there's some horrible catastrophic World War III 00:27:26.280 |
and they can turn that certain X number of people 00:27:38.860 |
He says a million people is about what he thinks. 00:27:44.620 |
selected million that has very, very skilled million people, 00:27:51.600 |
But I think it depends what you're talking about. 00:27:54.200 |
so one million is 1/7000, 1/8000 of the current population. 00:27:58.480 |
I think you need a very, very, very small fraction 00:28:08.480 |
but it depends who you're killing off, I guess, 00:28:14.320 |
just randomly right now, I think we'd be fine. 00:28:15.720 |
It would be obviously a great, awful tragedy. 00:28:18.740 |
I think if you killed off 3/4 of all people randomly, 00:28:20.740 |
just three out of every four people drops dead, 00:28:22.420 |
I think we'd have, obviously, the stock market would crash. 00:28:27.020 |
but I almost can assure you that the species would be fine. 00:28:38.780 |
you have to basically do the iPhone experiment. 00:28:50.800 |
any important piece of infrastructure on Earth 00:29:15.280 |
I don't know, five miles, two miles underground, 00:29:25.280 |
the mine that we're getting stuff for the iPhone for 00:29:33.260 |
I think, obviously, I assume the Industrial Revolution 00:29:37.000 |
we want to extract this magical energy source, 00:29:45.740 |
you're gonna need a lot of electrical engineers. 00:29:48.480 |
If you're gonna have a civilization like ours, 00:29:50.040 |
now, of course, you could have oil and lanterns, 00:29:53.200 |
but if you're trying to build our today thing, 00:30:07.720 |
so like turning raw materials into something useful, 00:30:18.460 |
one of the major problems is there's plenty of food, 00:30:26.920 |
So it's like, again, we take it so for granted, 00:30:32.760 |
it's all there, and, which always stresses me out, 00:30:41.660 |
or if you don't have enough, that's not good, 00:30:43.100 |
but if you have too much, it goes bad anyway. 00:30:44.380 |
- Of course, there would be entertainers too. 00:30:53.180 |
about a civilization on Mars and Earth existing 00:31:08.160 |
We know there'll be like a reality show on Mars 00:31:11.400 |
And I think if people are going back and forth enough, 00:31:22.640 |
I think if people don't really go back and forth, 00:31:27.480 |
oh, we hate a lot of like us versus them stuff going on. 00:31:30.360 |
- There could also be war in space for territory. 00:31:38.460 |
or whoever, the European, different European nations, 00:31:44.780 |
This is supposed to, staying out of all of them. 00:31:50.980 |
no one's really even thought about too much yet 00:32:08.480 |
this huge thing, 'cause this tiny peninsula switched. 00:32:11.280 |
That's how like optimized everything has become. 00:32:24.960 |
or maybe it's just the colonies of these governments, 00:32:28.680 |
I think it'd be cool if there's new countries being, 00:32:46.360 |
- There was one modern democracy in the late 1700s, the US. 00:32:57.640 |
But I think part of the reason that was able to start, 00:32:59.760 |
I mean, it's not that people didn't have the idea. 00:33:02.000 |
It was that they had a clean slate, new place, 00:33:06.760 |
so I think it would be a great opportunity to have, 00:33:10.240 |
'cause a lot of people have done that, you know, 00:33:15.560 |
And it's, the US founders actually had the opportunity, 00:33:21.640 |
Let's make, okay, what's the perfect country? 00:33:25.360 |
Sometimes progress is, it's not held up by our imagination. 00:33:34.400 |
- Yeah, it's an opportunity for a fresh start. 00:33:37.200 |
You know, the funny thing about the conversation 00:33:41.320 |
I mean, even by Elon, he's so focused on starship 00:33:43.760 |
and actually putting the first human on Mars. 00:33:46.160 |
I think thinking about this kind of stuff is inspiring. 00:33:56.960 |
thinking about civilization on Mars is helping us 00:34:05.080 |
What do you think are, like, in our lifetime? 00:34:07.640 |
Are we gonna, I think any effort that goes to Mars, 00:34:13.320 |
Do you think that's actually gonna be achieved? 00:34:28.680 |
Now, this was probably in 2018 when I had this argument. 00:34:30.960 |
- So, like, what if-- - So, a human has to touch Mars 00:34:35.560 |
Oh, by the year '39. - Yeah, by January 1st, 2031. 00:34:44.080 |
- No, no, yeah, if it's coming on that exact day, 00:34:47.080 |
But anyway, 'cause I think that there will be. 00:35:27.000 |
seemed like magical tech that humans didn't have. 00:35:36.760 |
like it is an interplanetary transport system. 00:35:49.000 |
something fails, usually, when they're testing, 00:35:58.520 |
it's like they've moved up eight generations in each one. 00:36:01.560 |
Anyway, so it's not inconceivable that pretty soon, 00:36:04.480 |
they could send a Starship to Mars and land it. 00:36:10.840 |
they could, in theory, send a person to Mars pretty soon. 00:36:34.240 |
or whatever it's called, the period, 26 months. 00:36:45.200 |
Elon said, maybe we can send people to Mars in 2024, 00:36:58.480 |
and so I think they're not quite on that schedule, 00:37:07.720 |
we're very distracted by what's going on today, 00:37:12.440 |
There's no way that a human's gonna be landing on Mars, 00:37:20.480 |
an even bigger deal, going to another planet, right? 00:37:36.000 |
And again, it's one of the great leaps for all of life 00:37:38.080 |
happening in our lifetimes, like that's wild. 00:37:46.080 |
but I just wanna put sort of value into leadership. 00:37:49.280 |
I think it wasn't obvious that the moon landing 00:37:52.560 |
would be so exciting for all of human civilization. 00:37:55.000 |
Some of that had to do with the right speeches, 00:37:58.880 |
Like, space, depending on how it's presented, 00:38:02.560 |
I don't think it's been that so far, but I've actually-- 00:38:07.160 |
- I agree, I think space is quite boring right now. 00:38:18.560 |
and now they're starting to do this easy magic. 00:38:20.440 |
You know, it's like, you can't go in that direction. 00:38:23.600 |
watching astronauts go up to the space station 00:38:31.800 |
It's just like, you know, everything is so impractical. 00:38:34.880 |
You're going up to the space station not to explore, 00:38:36.720 |
but to do science experiments in microgravity. 00:38:43.080 |
but mostly you're sending them up to put satellites 00:38:48.520 |
It's kind of like lame earth industry, you know, usage. 00:39:02.880 |
Maybe you're right, maybe I'm taking for granted 00:39:06.480 |
- I think the value of, I guess what I'm pushing 00:39:12.080 |
and potentially other leaders that hopefully step up 00:39:16.160 |
Like I would argue without the publicity of SpaceX, 00:39:30.400 |
- NASA wasn't able to quite pull off with the shuttle. 00:39:32.680 |
- That's one of his two reasons for doing this. 00:39:46.720 |
and you win the bet if humanity goes extinct, 00:39:51.560 |
You do not want them to have their eggs in two baskets now. 00:39:59.160 |
something that kills everyone on both planets, 00:40:02.120 |
but the point is obviously it's good for our chances, 00:40:06.880 |
our long-term chances to be having, you know, 00:40:08.600 |
two self-sustaining civilizations going on. 00:40:14.200 |
I think just as high is it's the greatest adventure 00:40:20.160 |
people need some reason to wake up in the morning 00:40:21.840 |
and it'll just be this hopefully great uniting event too. 00:40:42.720 |
Mars can just bring everyone together, but you know, 00:40:45.440 |
it could become this hideous thing where it's, you know, 00:40:50.720 |
So half the people think that anyone who's excited 00:40:57.920 |
- So far space has been a uniting, inspiring thing. 00:41:02.920 |
And in fact, especially during this time of a pandemic 00:41:08.960 |
putting out humans into space for the first time 00:41:12.080 |
was just one of the only big sources of hope. 00:41:15.280 |
- Totally, and awe, just like watching this huge skyscraper 00:41:19.000 |
go up in the air, flip over, come back down and land. 00:41:21.480 |
I mean, it just makes everyone just want to sit back 00:41:30.280 |
And I think it makes a lot of people feel that way. 00:41:33.280 |
It's like, you know what, we're pretty, you know, 00:41:34.320 |
we have a lot of problems, but like, we're kind of awesome. 00:41:37.800 |
- And if we can put people on Mars, you know, 00:41:39.000 |
sticking an Earth flag on Mars, like, damn, you know, 00:41:42.640 |
we should be so proud of our like little family here. 00:41:46.480 |
And by the way, I've made it clear to SpaceX people, 00:41:54.040 |
that if they want to make this more exciting, 00:42:02.360 |
So I'm just, you know, continuing to throw this out there. 00:42:14.120 |
would be like the, you know, like the Apollo 8, 00:42:16.960 |
where they just looped around the moon and came back. 00:42:21.800 |
- Give you a lot of good content to write about. 00:42:26.200 |
I mean, the amount of kind of high-minded, you know, 00:42:28.640 |
and so I would go into the thing and I would blog about it. 00:42:35.480 |
I get a little, they can just send me in a dragon. 00:42:38.520 |
And I would bounce around and I would get to, 00:42:47.280 |
Because I just have nothing to do besides like read books 00:42:53.560 |
Anyway, it's a side topic, but I think it would be-- 00:43:00.040 |
And then if you die there, like finishing your writing, 00:43:08.960 |
- But then I'm gone and I don't even get to like 00:43:13.200 |
- Well, some of the greatest writers in history 00:43:19.360 |
I think like, I think back to Jesus and I'm like, 00:43:20.920 |
oh man, that guy really like crushed it, you know? 00:43:23.200 |
But then if you think about it, it doesn't like, 00:43:28.360 |
and then become the next Jesus like 2000 years from now 00:43:38.760 |
And like, that sounds like your ego probably would be like, 00:43:40.640 |
wow, that's pretty cool, except irrelevant to you 00:44:00.160 |
the magic sauce is you've written about with Elon. 00:44:06.040 |
His style of thinking, his ambition, his dreams, 00:44:16.200 |
- I think that obviously there's a lot of things 00:44:23.200 |
His heart is very much in like, I think the right place. 00:44:25.880 |
Like, you know, I really, really believe that. 00:44:27.760 |
Like, and I think people can sense that, you know, 00:44:30.080 |
he just doesn't seem like a grifter of any kind. 00:44:36.080 |
And he's obviously crazy ambitious and hardworking, right? 00:44:39.200 |
Some people are as talented and have cool visions, 00:44:40.880 |
but they just don't wanna spend their life that way. 00:44:43.800 |
So, but that's, none of those alone is what makes Elon. 00:44:47.800 |
Elon, I mean, if it were, there'd be more of him 00:44:49.840 |
because there's a lot of people that are very smart 00:44:51.480 |
and smart enough to accumulate a lot of money and influence 00:44:53.920 |
and they have great ambition and they have, you know, 00:45:01.960 |
is that he's sane in a way that almost every human is crazy. 00:45:08.800 |
to trust conventional wisdom over our own reasoning 00:45:19.120 |
If you go back 50,000 years and conventional wisdom says, 00:45:26.000 |
or this is the way you tie a spearhead to a spear, 00:45:35.080 |
of life experience, accumulation of observation 00:45:46.640 |
Like people back then, like the conventional wisdom 00:45:57.720 |
Plus the secondary thing is that the people who, you know, 00:46:02.680 |
they worship the mountain as their God, right? 00:46:18.240 |
insult the mountain God and say, that's just a mountain, 00:46:21.240 |
it's not, you know, you didn't fare very well, right? 00:46:23.300 |
So for a lot of reasons, it was a great survival trait 00:46:25.840 |
to just trust what other people said and believe it. 00:46:32.940 |
Today, conventional wisdom in a rapidly changing world 00:46:52.460 |
is treating a lot of things like a small tribe, 00:46:58.540 |
And they're treating conventional wisdom as, you know, 00:47:09.520 |
And so it has time to collect a lot of water in it. 00:47:11.240 |
That's like conventional wisdom in the old days 00:47:13.680 |
Like your 10, you know, great, great, great grandmother 00:47:19.300 |
And so old people really knew what they were talking about. 00:47:23.380 |
And so, you know, the wisdom doesn't accumulate, 00:47:27.880 |
in a, you know, quickly moving bucket. 00:47:31.520 |
So my grandmother gives me advice all the time. 00:47:38.480 |
so there are certain things that are not changing, 00:47:44.060 |
Her advice on those things, I'll listen to it all day. 00:47:46.880 |
you've got to live near your people you love, 00:47:50.040 |
I think that is like tremendous wisdom, right? 00:47:53.720 |
'Cause that happens to be something that hasn't, 00:47:54.960 |
doesn't change from generation to generation. 00:47:59.680 |
She's also telling, right, so I'll be the idiot 00:48:05.800 |
And I'm like, it's not the same when you're not in person. 00:48:08.200 |
They're gonna say, it's exactly the same, grandpa. 00:48:10.480 |
And they'll also be thinking to me with their Neuralink, 00:48:16.640 |
Anyway, so my grandmother then, but then she says, 00:48:19.560 |
you know, you're, I don't know about this writing 00:48:29.360 |
and what I'm good at, that's not the right advice. 00:48:34.840 |
So she became wise for a world that's no longer here, right? 00:48:38.840 |
so then when we think about conventional wisdom, 00:48:41.860 |
And there's a lot of, no, it's not maybe, you know, 00:48:55.560 |
don't have the confidence in our own reasoning 00:48:58.580 |
when it conflicts with what everyone else thinks, 00:49:02.840 |
We don't have the guts to act on that reasoning 00:49:07.120 |
You know, we, and so there's so many Elon examples. 00:49:19.440 |
when people said, you know, this internet was brand new, 00:49:22.120 |
like kind of like kind of thinking of like the metaverse, 00:49:24.840 |
And people have been like, oh, we're saying, you know, 00:49:26.080 |
we, you know, we facilitate internet advertising. 00:49:29.120 |
People are saying, yeah, people are gonna advertise 00:49:32.000 |
Actually, it wasn't that he's magical and saw the future, 00:49:43.960 |
It wasn't rocket science, it wasn't genius, I don't believe. 00:50:06.200 |
If you went back then, you would probably feel the same 00:50:08.040 |
where you'd think this is, that is a fake company. 00:50:10.600 |
That no, it's just obviously not a good idea. 00:50:12.880 |
He looked around and said, you know, I see where this is. 00:50:14.400 |
And so again, he could see where it was going 00:50:18.560 |
conventional wisdom was still a bunch of years earlier. 00:50:28.360 |
of rockets blowing up to show him this is not a good idea. 00:50:34.480 |
the amount of billionaires who have like thought 00:50:47.280 |
- Landing rockets was another thing, you know. 00:50:49.440 |
Well, if, you know, here's the classic kind of way 00:50:58.640 |
If this could be done, the Soviet Union would have done it 00:51:10.080 |
in some ways, Elon gets too much credit as, you know, 00:51:20.820 |
I think if you actually are looking at reality, 00:51:26.680 |
and you're ignoring all the noise, which is wrong, 00:51:30.280 |
And you just, then you just have to be, you know, 00:51:33.040 |
pretty smart and, you know, pretty courageous. 00:51:36.440 |
And you have to have this magical ability to be sane 00:51:39.720 |
and trust your reasoning over conventional wisdom 00:51:44.560 |
part of it is that we see that we can't build a pencil. 00:51:46.780 |
We can't build, you know, the civilization on our own, 00:51:50.160 |
We kind of kowtow, you know, kowtow to the collective 00:51:55.160 |
because for good reason, but this is different 00:51:59.040 |
You know, the Beatles were doing their kind of Motown-y 00:52:05.560 |
They were doing what was clearly this kind of sound 00:52:12.600 |
let's just, we're gonna start just experimenting. 00:52:17.600 |
all these people are in this like one groove together 00:52:33.040 |
so the term for this that actually Elon likes to use 00:52:36.880 |
is reasoning from first principles, the physics term. 00:52:41.680 |
And physicists, they don't say, well, what's, you know, 00:52:51.040 |
And they come up with all kinds of new things constantly 00:52:55.040 |
If Einstein was assuming conventional wisdom was right, 00:52:57.440 |
he never would have even tried to create something 00:53:02.080 |
And the other way to reason is reasoning by analogy, 00:53:08.120 |
It's when we look at other people's reasoning 00:53:10.460 |
and we kind of photocopy it into our head, we steal it. 00:53:21.500 |
you don't want to reinvent the wheel every time, right? 00:53:23.320 |
You want to often copy other people's reasoning 00:53:27.240 |
And I, you know, most of us do it most of the time 00:53:31.520 |
like succeeding in like the world of like Elon, 00:53:42.240 |
What kind of career paths in terms, these moments, 00:53:44.520 |
this is what on your deathbed, like you look back on, 00:53:46.480 |
and that's what, these are the few number of choices 00:53:52.280 |
You should absolutely try to reason from first principles. 00:53:59.360 |
I mean, if you just look at the way he is on Twitter, 00:54:03.400 |
when you're a super famous, you know, industry titan. 00:54:07.280 |
You're not supposed to just be silly on Twitter 00:54:15.880 |
which sometimes serves him and sometimes doesn't. 00:54:17.640 |
But I think it has taken him where it has taken him. 00:54:28.280 |
Well, first of all, I will say that a lot of tweets, 00:54:30.040 |
people think, oh, he's going to be done after that. 00:54:32.560 |
He's fine, he just won Time Man of the Year. 00:54:41.760 |
I think that, you know, Twitter is his silly side. 00:54:48.200 |
with his reasoning did not feel like there was a giant risk 00:55:06.680 |
The big inspiration is the willingness to do that 00:55:11.040 |
- Yeah, and I think about all the great artists, 00:55:13.280 |
you know, all the great inventors and entrepreneurs, 00:55:17.800 |
they had a moment when they trusted their reasoning. 00:55:25.840 |
obviously they know something we don't, right? 00:55:28.960 |
But they didn't, they said, I think they're all wrong. 00:55:42.160 |
who's a good example of this kind of thinking 00:55:44.520 |
about like manufacturing, how to get costs down. 00:56:02.560 |
This is how they reason we need to get the cost down. 00:56:14.040 |
are like the price of raw materials and gravity, you know, 00:56:19.840 |
I mean, these are your first principles and fuel. 00:56:32.000 |
I mean, he did, he thought for a second and said, 00:56:35.120 |
this isn't how manufacturing is normally, you know, 00:56:38.280 |
but I think this is a different kind of product. 00:56:44.560 |
they often fail and you're going out into the fog 00:56:50.520 |
what you notice is that everyone else turns and says, 00:56:56.480 |
iPhone, you know, Steve Jobs was famously good 00:57:02.840 |
He just said, you know, if I think this is right, 00:57:04.880 |
like everyone, and that, I mean, I don't know how, 00:57:07.520 |
And, and I don't think Apple can do that anymore. 00:57:14.840 |
even though there's tens of thousands of people there. 00:57:19.080 |
and I'm giving a lot of credit to Steve Jobs, 00:57:21.560 |
but of course it was a team at Apple who said 00:57:23.840 |
they didn't look at the flip phones and say, 00:57:26.320 |
okay, what kind of, you know, let's make a keyboard 00:57:32.480 |
You know, what, axioms, what are the axioms here? 00:57:35.680 |
And none of them involved a keyboard necessarily. 00:57:38.360 |
there was no keyboard, 'cause it didn't make sense. 00:57:58.040 |
There's a bunch of people with cool responses there. 00:58:03.840 |
And what have you changed your mind about, big or small? 00:58:07.520 |
Perhaps in doing the research for some of your writing. 00:58:10.600 |
- So I'm writing right now, just finishing a book 00:58:13.840 |
on kind of why our society is such a shit place 00:58:21.880 |
like we're talking about, just the supermarket. 00:58:23.520 |
You know, we have these, it's exploding technology. 00:58:28.060 |
You know, it's, Louis C.K., you know, likes to say, 00:58:30.480 |
you know, everything's amazing and no one's happy, right? 00:58:38.800 |
- If I could interrupt briefly, you did tweet 00:58:44.320 |
- And then there's some hilarious asshole who said, 00:58:46.920 |
now you just have to work on all the ones in the middle. 00:58:55.800 |
you're just gonna get shit forever, and that's fine. 00:59:04.920 |
- So you're, what, how do you approach this question 00:59:16.400 |
It's that things are getting worse in certain ways. 00:59:24.880 |
This is what's weird, is that things are getting better 00:59:27.380 |
in almost every important metric you can think of. 00:59:31.600 |
Except the amount of people who hate other people 00:59:35.240 |
in their own country, and the amount of people 00:59:37.780 |
that hate their own country, the amount of Americans 00:59:42.800 |
The amount of Americans that hate other Americans 00:59:47.160 |
The amount of Americans that hate the president 00:59:56.120 |
It's not that, you know, a bunch of new people were born 01:00:01.280 |
So I think of it as a very simple, oversimplified equation, 01:00:12.840 |
And this is basic, you know, super, super kindergarten level 01:00:20.080 |
You've got human nature, which is not changing very much, 01:00:32.520 |
And then eventually what comes out is behavior, right? 01:00:38.000 |
but suddenly we're behaving differently, right? 01:00:42.560 |
Like, it used to be that the president, you know, 01:00:45.120 |
was liked by, I don't remember the exact numbers, 01:00:47.040 |
but, you know, 80% or 70% of their own party, 01:00:56.600 |
And it's not that the presidents are getting worse, 01:00:59.200 |
though maybe some people would argue that they are, 01:01:05.640 |
what's going on is something in the environment is changing, 01:01:08.240 |
and what you're seeing is a change in behavior. 01:01:18.280 |
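(Written out as the oversimplified equation being described, in my notation, not Tim's: behavior is a function of a roughly fixed human nature and a changing environment, so any sudden shift in behavior has to be traced to the environment term.)

$$ B(t) = f(N,\ E(t)), \qquad N \approx \text{constant} \;\Rightarrow\; \Delta B \approx f(N,\ E + \Delta E) - f(N,\ E) $$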
You know, it's hard to measure, but there's metrics like, 01:01:34.720 |
the reason I don't think this is just an unfortunate trend, 01:01:36.800 |
unpleasant trend that hopefully we come out of, 01:01:38.880 |
is that all this other stuff I like to write about, 01:01:41.960 |
And it's this magical, I always think of this, 01:01:45.280 |
and I think that our world would be a utopia, 01:01:50.120 |
Like whatever Thomas Jefferson was picturing as heaven, 01:01:55.440 |
I think that if he came to 2021 US, it would be better. 01:02:00.880 |
But we live in a place that's cooler than 1700s heaven. 01:02:03.840 |
Again, other than the fact that we still die. 01:02:07.200 |
actually probably would have quote, eternal life. 01:02:09.600 |
I don't think anyone wants eternal life, actually, 01:02:16.920 |
maybe we're uploaded, maybe we can refresh our bodies, 01:02:23.400 |
And I do believe that if we don't botch this, 01:02:27.160 |
that would seem like heaven, maybe in our lifetimes. 01:02:36.980 |
some magical utopia to someone from the 16th century, 01:02:55.060 |
if it is the existential threat that many people think it is. 01:03:01.020 |
So the good is getting better and the bad's getting worse. 01:03:16.320 |
we can paddle on the left side or the right side. 01:03:18.540 |
And what we know is there's a fork up there somewhere. 01:03:28.600 |
we're probably not headed for just an okay future. 01:03:32.560 |
it's probably gonna be really good or really bad. 01:03:34.240 |
The question is, which side should we be rowing on? 01:03:45.760 |
- So it's a really important problem to solve. 01:03:56.320 |
and this is my answer to what I changed my mind on, 01:04:03.680 |
Our primitive brain does not remember this fact, 01:04:05.600 |
which is that I don't think there are very many bad people. 01:04:13.600 |
Most of us, I think that if you think of people, 01:04:21.600 |
And our primitive brain very quickly can get into the land 01:04:25.040 |
Our tribe, we're all ones, we're perfect, I'm perfect, 01:04:27.400 |
my family is that other family, it's that other tribe. 01:04:29.480 |
They're zeros, and you dehumanize them, right? 01:04:29.480 |
Scar is totally bad and Mufasa's totally good, right? 01:04:50.360 |
You don't see Scar's upbringing that made him like that, 01:04:53.000 |
that humanizes him, no, lionizes him, whatever. 01:04:58.600 |
Mufasa's a one and Scar's a zero, very simple. 01:05:03.880 |
a psychological place that all of us have been in. 01:05:11.040 |
but it's a place where we fall into this delusion 01:05:22.320 |
it's not that, I don't think there's that many 01:05:32.440 |
the most shameful thoughts, the deep selfishness 01:05:34.440 |
that some of us have in areas we wouldn't want to admit, 01:05:36.840 |
right, most of us have a lot of unadmirable stuff, right? 01:05:45.320 |
and you looked at the trauma that they've experienced 01:05:47.120 |
and then you looked at the insecurities they have 01:06:06.800 |
So, I've started to really try to remind myself 01:06:24.720 |
that we should be able to create a good society together. 01:06:28.040 |
And I think there's more love in humans than hate. 01:06:35.560 |
but I'm a Red Sox fan, Boston Red Sox baseball, 01:06:45.840 |
It was his last game in Fenway, he's retiring. 01:06:45.840 |
because it's like this moment of a little fist pound, 01:07:02.040 |
being like, of course we all actually love each other. 01:07:06.200 |
And so, the thing that I think I've come around on 01:07:25.840 |
and there's the independent variable, environment, 01:07:32.360 |
is the independent variable, the environment. 01:07:34.520 |
Which means I think the environment can get better. 01:07:39.800 |
but I have hope because I think the thing that's bad for us 01:07:57.960 |
- I think that, well, I think that maybe if we're all, 01:08:01.680 |
I think that environments can bring out our good side. 01:08:12.720 |
And I think in a lot of ways you could say it has. 01:08:15.080 |
I mean, the US environment, we take for granted 01:08:18.880 |
the liberal laws and liberal environment that we live in. 01:08:18.880 |
if you walk down the street and you like assault someone, 01:08:29.360 |
A, if anyone sees you, they're probably gonna yell at you. 01:08:31.000 |
You might get your ass kicked by someone for doing that. 01:08:40.320 |
So there's, it's not that human nature doesn't have it in it 01:08:44.280 |
It's that this environment we're in has made that 01:08:47.080 |
a much, much, much smaller experience for people. 01:08:50.120 |
There's so many examples like that where it's like, man, 01:08:52.160 |
you don't realize how much of the worst human nature 01:09:00.240 |
which is what we have right now, social media starts. 01:09:02.240 |
I mean, what a seismic change to the environment. 01:09:17.280 |
- Were you surprised by Elon's answer about brain transplants 01:09:46.960 |
who all say nuclear power is clearly a good option. 01:09:51.120 |
but you know, the concerns about meltdowns and waste, 01:09:56.880 |
So judging from those people, secondary knowledge here, 01:10:10.400 |
who have, it seems like it's kind of an ideology 01:10:17.200 |
rather than rational, you know, fear of climate change 01:10:27.960 |
I actually have not done a deep dive myself. 01:10:37.240 |
currently has me thinking it's a truly existential thing, 01:10:52.680 |
I've, especially over the past couple of years, 01:10:54.720 |
I've gotten uncomfortable with fear mongering 01:11:15.160 |
that it also grows the skepticism in science broadly. 01:11:30.840 |
And so it's like, all right, what is the truth here? 01:11:35.280 |
But it also feels like it's hard to get to the, 01:11:38.480 |
like there's a lot of kind of activists talking about idea 01:11:54.080 |
I know it's supposed to be a very big problem, 01:11:57.880 |
but when people talk about catastrophic effects 01:12:01.680 |
I haven't been able to like see really great deep analysis 01:12:12.400 |
What are the models of how that changes human behavior, 01:12:19.260 |
There's going to be constraints on the resources 01:12:28.920 |
like what are the best models for how everything goes wrong? 01:12:32.280 |
Again, I was, this is a question I keep starting 01:12:38.640 |
like motivating myself to get up to this deep dive 01:12:49.480 |
and sort of being caught off guard and wondering, 01:12:55.140 |
how are we going to respond to other kinds of tragedies? 01:12:59.440 |
'cause I said, we're going to have more and more of these, 01:13:04.440 |
big collective, what should we do here situations? 01:13:09.720 |
we're probably not that far away from people being able 01:13:20.000 |
- Yeah, and also imagine the political sides of that 01:13:23.000 |
and that's something only wealthy people can afford at first 01:13:27.340 |
We need to be able to have our wits about us as a species 01:13:29.620 |
where we can actually get into a topic like that 01:13:32.880 |
and come up with, where the collective brain can be smart. 01:13:36.720 |
I think that there are certain topics where I think of this, 01:13:42.440 |
but I think it works, is that there's a higher mind 01:13:50.160 |
and a higher mind is more rational and puts out ideas 01:14:00.400 |
Their ideas can get criticized and it's no big deal. 01:14:07.920 |
and one idea goes out and everyone criticizes it, 01:14:10.040 |
which is like shooting bows and arrows at it. 01:14:13.880 |
the arrows bounce off and it's so okay, it rises up. 01:14:22.540 |
is someone puts out a thing, criticism arrows come at it 01:14:25.320 |
and most of them fall and the needle is in the haystack, 01:14:30.920 |
So what that's happening is a bunch of people, 01:14:40.480 |
which is the more limbic systemy part of our brain. 01:14:44.160 |
It's the part of us that is very much not living in 2021. 01:14:48.720 |
It's living many tens of thousands of years ago. 01:14:51.360 |
And it does not treat ideas like this separate thing. 01:14:56.400 |
It only gets involved when it finds an idea sacred. 01:14:59.920 |
It starts holding an idea sacred and it starts identifying. 01:15:04.440 |
And so when you have a topic that a bunch of primitive, 01:15:08.360 |
that really rouses a bunch of primitive minds, 01:15:20.680 |
people outside the community, no one can criticize it. 01:15:34.940 |
You have this collective is suddenly attached 01:16:05.160 |
And so what's happening is the big brain gets frozen 01:16:12.600 |
You just mentioned another one, I forget what it was, 01:16:25.760 |
has like a whirlpool that's pulling everything into it. 01:16:29.800 |
thinking is done with the primitive mind tribes. 01:16:34.880 |
And so I get, okay, obviously something like race, 01:16:49.160 |
It taps into like our deepest kind of like primal selves. 01:17:00.380 |
But the problem is that it's all gotten sucked 01:17:06.360 |
And we're losing our ability to be wise together, 01:17:21.160 |
scary things, we have to make big, right decisions. 01:17:26.080 |
and that's part of this environmental problem. 01:17:30.360 |
and the space of scientists, we should allow the arrows. 01:17:33.320 |
That's one of the saddest things to me about, 01:17:35.080 |
is like the scientists, like I've seen arrogance. 01:17:40.000 |
There's a lot of mechanisms that maintain the tribe. 01:17:42.460 |
It's the arrogance, it's how you build up this mechanism 01:17:46.520 |
that defends this wall that defends against the arrows. 01:18:01.960 |
This ideal of science that makes science beautiful. 01:18:09.600 |
created by perhaps politicians that leverage the fear, 01:18:12.900 |
it, like you said, makes the whole system dumber. 01:18:22.720 |
if they don't allow the challenging of ideas. 01:18:25.000 |
- What's really bad is that like, in a normal environment, 01:18:30.400 |
And so what's the opposite of an echo chamber? 01:18:32.360 |
I created a term for it, 'cause I think we need it, 01:18:40.480 |
They just treat their ideas like science experiments, 01:18:43.440 |
and they toss them out there, and everyone disagrees. 01:18:48.020 |
On a certain text thread where everyone is just saying, 01:18:50.900 |
it's almost like someone throws something out, 01:18:52.260 |
and just as an impulse for the rest of the group to say, 01:19:02.780 |
we all, of course, respect each other, obviously. 01:19:10.620 |
You're always gonna have idea labs and echo chambers, 01:19:10.620 |
And maybe your marriage is a great idea lab, 01:19:15.700 |
in front of that sister, you do not bring up politics, 01:19:25.660 |
because she's now enforced, when that happens, 01:19:41.220 |
An echo chamber person stays in their echo chamber. 01:19:59.480 |
because my primitive mind is doing the thinking, 01:20:04.100 |
'cause it feels so scary and awful for that to happen. 01:20:07.400 |
But if they leave, and they go into an idea lab 01:20:09.140 |
environment, they're gonna, people are gonna say, 01:20:11.820 |
and they're gonna say, and the person's gonna try 01:20:13.240 |
to bully, they're gonna say, that's really offensive, 01:20:19.140 |
So the echo chamber person, it doesn't have much power 01:20:32.460 |
a denier, a racist, a right-winger, a radical, 01:20:44.740 |
to forcefully expand into places that normally 01:20:49.740 |
have a pretty good immune system against echo chambers, 01:20:54.980 |
places where usually it's like there's a strong 01:21:02.700 |
You have is that these people have found a way to, 01:21:06.140 |
a lot of people have found a way to actually go out 01:21:26.380 |
Do you have a title yet or you can't talk about that? 01:21:30.580 |
If it's okay, just a couple of questions from Mailbag. 01:21:39.580 |
Why do we prefer to watch, the question goes, 01:21:41.940 |
why do we prefer to watch a film we haven't watched before, 01:22:00.240 |
- So I think, let's use these two minds again. 01:22:02.700 |
I think that when your higher mind is the one 01:22:04.940 |
who's taking something in and they're really interested 01:22:07.140 |
in what are the lyrics or I'm gonna learn something 01:22:11.500 |
and the higher mind is trying to get information, 01:22:15.820 |
and once it has it, there's no point in listening to it again 01:22:21.960 |
But when you eat a good meal or have sex or whatever, 01:22:28.780 |
because it actually, your primitive brain loves it, right? 01:22:32.700 |
And it never gets bored of things that it loves. 01:22:38.140 |
I think music goes right into our primitive brain. 01:22:40.540 |
I think it's of course, it's a collaboration. 01:22:43.900 |
Your rational brain is absorbing the actual message, 01:23:06.280 |
And then you end up loving it on the 10th listen, 01:23:10.220 |
but sometimes you even don't even like a song, 01:23:12.540 |
But suddenly you find yourself on the 40th time 01:23:16.240 |
just kind of being like, oh, I love this song. 01:23:19.700 |
And what's happening is that the sound is actually, 01:23:23.060 |
music's actually carving a pathway in your brain. 01:23:35.740 |
your brain is actually dancing with the music 01:23:37.500 |
and it knows the steps and it can anticipate. 01:23:50.820 |
We're like an awkward dancer, we don't know the steps. 01:23:52.380 |
And your primitive brain can't really have that much fun yet. 01:23:56.060 |
- And in the movies, that's more, that's less primitive. 01:24:03.620 |
- But a really good movie that we really love, 01:24:08.340 |
Not that many, but versus if you're watching a talk show, 01:24:11.620 |
right, listening to, if you're listening to a pod, 01:24:17.020 |
one of your podcasts, no matter how good it is, 12 times. 01:24:19.060 |
Because it's, once you've got it, you got it. 01:24:21.380 |
It's a form of information that's very higher mind focused. 01:24:28.180 |
there is people that listen to a podcast episode, 01:24:35.320 |
it's the chemistry, it's the music of the conversation. 01:24:40.300 |
- Yeah, they'll fall in love with some kind of person, 01:24:42.360 |
some weird personality, and they'll just be listening to, 01:24:45.340 |
they'll be captivated by the beat of that kind of person. 01:24:49.700 |
like episodes like 20 times, even though I, you know. 01:24:58.100 |
I got a chance to visit Neuralink a couple of times, 01:25:01.940 |
That was one of the pieces of writing you did 01:25:09.260 |
and changes the way people think about a thing. 01:25:12.060 |
The ridiculousness of your stick figure drawings 01:25:19.460 |
it's like calling the origin of the universe, the Big Bang. 01:25:28.980 |
In the same way, the wizard hat for the Neuralink 01:25:30.980 |
is somehow was a really powerful way to explain that. 01:25:35.340 |
You actually proposed that the man of the year 01:25:47.700 |
about like all those years later about Neuralink? 01:25:50.740 |
Do you find this idea, like what excites you about it? 01:25:54.220 |
Is it the big long-term philosophical things? 01:26:00.540 |
on the neurosurgery side and the material engineering, 01:26:17.060 |
really studied it, brain-computer interfaces. 01:26:21.500 |
I really think it's actually Elon's most ambitious thing, 01:26:28.740 |
because that's just a bunch of people going somewhere, 01:26:33.140 |
Neuralink is changing what a person is eventually. 01:26:37.700 |
Now, I think that Neuralink engineers and Elon himself 01:26:42.420 |
would all be the first to admit that it is a maybe, 01:26:56.180 |
which are still huge, like basically solving paralysis, 01:27:08.840 |
about this kind of helping with different disabilities. 01:27:23.560 |
and you put a brain-machine interface in any way 01:27:35.460 |
that they are for real, they've created this robot. 01:27:50.300 |
and eventually, hopefully, something that isn't covered by 01:27:56.320 |
Something this big a deal should be something 01:28:03.160 |
I'm talking about a very advanced phase down the road. 01:28:07.680 |
maybe right now, think about when you listen to a song, 01:28:18.600 |
It's that the sound is coming out of the speaker. 01:28:34.540 |
Your eardrum is really the speaker now in your head 01:28:40.360 |
which then stimulates neurons in your auditory cortex, 01:28:44.880 |
which give you the perception that you're hearing sound. 01:28:52.280 |
do we really need to have a speaker to do that? 01:28:56.960 |
that could vibrate eardrums, you could do it that way. 01:28:59.320 |
That seems very hard, but really what you need, 01:29:05.040 |
is your auditory cortex neurons need to be stimulated 01:29:09.680 |
If you have a ton of neural link things in there, 01:29:13.440 |
and they get really good at stimulating things, 01:29:15.960 |
you could play a song in your head that you hear 01:29:23.760 |
It's not like they can get close to your head and hear it. 01:29:26.120 |
They could not hear anything, but you hear sound. 01:29:29.200 |
So you open your phone, you have the Neuralink app. 01:29:39.680 |
you can play right out of your phone to your headphones, 01:29:53.320 |
'cause I can leave the house with just my phone, 01:29:56.760 |
and nothing else, or even just an Apple Watch. 01:30:04.600 |
and you keep going, the ability to think together. 01:30:16.040 |
If I go to a movie and I come out of a scary movie 01:30:22.960 |
I just gave you, I had five buckets I could have given you. 01:30:26.440 |
One was horrifying, terrifying, scary, eerie, creepy, 01:30:31.860 |
And I had a much more nuanced experience than that. 01:30:36.240 |
And I don't, all I have is these words, right? 01:30:41.920 |
I put the stuff in the bucket and give it to you, 01:30:45.440 |
You just have to guess what I put into that bucket. 01:30:47.720 |
All you can do is look at the label of the bucket and say, 01:31:09.740 |
We could A, have a brainstorm that doesn't feel like, 01:31:15.760 |
No words are being said internally or externally. 01:31:26.500 |
But you think together and together you're like, 01:31:29.600 |
And now how about eight people in a room doing it, right? 01:31:31.760 |
So it gets, you know, there's other examples. 01:31:34.040 |
How about when you're a dress designer or a bridge designer 01:31:37.520 |
and you want to show people what your dress looks like? 01:31:40.760 |
Well, right now you gotta sketch it for a long time. 01:31:42.480 |
Here, just beam it onto the screen from your head. 01:31:48.840 |
whatever's in your head can be pictured. 01:32:01.440 |
I think it'll almost be like a new AD/BC line. 01:32:04.280 |
It's such a big change that the idea of like anyone living 01:32:12.840 |
It's that level of like big change, if it can work. 01:32:20.640 |
And copying, you know, you can hopefully copy memories 01:32:23.040 |
onto other things and you don't have to just rely 01:32:32.080 |
and so it can adjust, it can learn how to do this. 01:32:36.080 |
But probably you and I will be too old to truly learn. 01:32:38.560 |
- Well, maybe we can get, there'll be great trainings. 01:32:44.840 |
- But it'll still be a bit of like grandpa can't-- 01:32:50.020 |
I'm like, no, I'm gonna be great at the new phones. 01:32:55.440 |
I'm gonna be like, I just, can you just talk, please? 01:32:57.960 |
And they're gonna be like, okay, I'll just talk 01:32:59.420 |
and they're gonna, so that'll be the equivalent 01:33:04.360 |
- I really suspect, I don't know what your thoughts are, 01:33:06.800 |
but I grew up in a time when physical contact 01:33:21.520 |
once we were all doing that, it might feel like, 01:33:22.920 |
man, everyone was so isolated from each other before. 01:33:27.580 |
I just meant physical, having to be in the same, 01:33:33.640 |
If it is important, won't there be whole waves 01:33:37.160 |
there's all these articles that come out about how, 01:33:38.560 |
you know, in our metaverse, we've lost something important. 01:33:46.800 |
if something truly is lost, won't we recover it? 01:34:00.140 |
And so to me, it's, I don't see anything profoundly unique 01:34:09.240 |
- But then why are you saying there's a loss there? 01:34:15.840 |
- So then you do think there's something unique 01:34:25.060 |
Like people in this country came up with baseball. 01:34:36.920 |
they went to baseball games with their father, 01:34:41.680 |
There's a young kid dreaming about, I don't know, 01:34:53.760 |
But I mean, fundamentally to the human experience. 01:35:02.640 |
If this were, obviously if there were a screen, 01:35:09.120 |
maybe Neuralink, you know, maybe, again, forget, 01:35:22.320 |
My visual cortex will get put into a virtual room, 01:35:40.200 |
- And you're right, this is one of those shifts in society 01:35:46.760 |
- Romantically, people still need to be together. 01:35:57.560 |
- Sex, but also just like, there's pheromones, 01:36:03.160 |
it's like music, it goes to such a deeply primitive 01:36:08.040 |
with a romantic partner does, that I think that, 01:36:11.160 |
so I'm sure there'll be a whole wave of people 01:36:18.920 |
where you can actually smell what's in the room, 01:36:22.040 |
- Yeah, but I think that'll be one of the last things to go. 01:36:30.960 |
there's nothing lost by not being in the same room. 01:36:32.960 |
- It's very difficult to replicate the human interaction. 01:36:38.640 |
not to get too weird, but you could have a thing 01:36:41.000 |
where you basically, you know, or let's just do a massage, 01:36:45.480 |
'cause it's less awkward, but like, someone-- 01:36:54.720 |
and you could feel whatever's happening, right? 01:36:57.000 |
So you're lying down in your apartment alone, 01:37:06.760 |
- Exactly, right, now think about it, right now, 01:37:08.160 |
you know what, Taylor Swift doesn't play for one person, 01:37:10.680 |
it has to go around, and every one of her fans 01:37:26.160 |
you're actually starting a podcast, which is awesome. 01:37:28.360 |
You're so good at talking, so good at thinking, 01:37:30.560 |
so good at being weird in the most beautiful of ways. 01:37:33.920 |
But you've been thinking about this AI safety question, 01:37:41.640 |
For the near future, for the long-term future. 01:37:46.600 |
including with Elon's work with Tesla Autopilot, 01:37:48.640 |
there's a bunch of amazing robots, there's Boston Dynamics, 01:37:52.160 |
and everyone's favorite vacuum robot, iRobot, Roomba, 01:38:09.200 |
Just a lot of incredible use of, not the face recognition, 01:38:12.640 |
but the incredible use of deep learning, machine learning, 01:38:19.360 |
and try to recommend to them what they wanna consume next. 01:38:24.200 |
Some of that can be abused, some of that can be used 01:38:26.280 |
for good, like for Netflix or something like that. 01:38:30.280 |
- Yeah, I mean, I really don't think humans are very smart, 01:38:35.280 |
all things considered, I think we're limited. 01:38:38.680 |
And we're dumb enough that we're very easily manipulable. 01:38:46.120 |
Our emotions can be pulled like puppet strings. 01:38:53.440 |
and I see a lot of puppet string emotions happening. 01:38:56.480 |
So yeah, there's a lot to be scared of, for sure. 01:39:00.040 |
I get excited about a lot of things, very specific things. 01:39:09.760 |
oh, the wrist, the Fitbit around my wrist is gonna seem, 01:39:12.360 |
or the Whoop is gonna seem really hilariously old school 01:39:17.960 |
- Like with Neuralink. - We're like a big bracelet. 01:39:21.040 |
It's gonna turn into little sensors in our blood probably, 01:39:24.160 |
or even infrared, just things that are gonna be, 01:39:40.760 |
I've not done my deep dive, this is all speculation, 01:39:46.960 |
And so I get excited about specific things like that. 01:39:49.200 |
Like think about if hardware were able to collect, 01:39:54.200 |
first of all, the hardware knows your whole genome. 01:39:56.680 |
And we know a lot more about what a genome sequence means, 01:40:04.280 |
okay, we don't have much to do with that information. 01:40:09.240 |
you've got what's in your blood at any given moment, 01:40:13.240 |
You have the exact width of your heart arteries 01:40:25.720 |
all the things that you should be concerned about health-wise 01:40:30.440 |
or you might be immune from all of that kind of stuff. 01:40:39.480 |
knows your muscle mass and your weight and all that. 01:40:41.160 |
But it also maybe can even know your emotions. 01:40:45.800 |
Probably pretty obvious chemicals once we get in there. 01:40:56.280 |
and you're in a bad mood, it's hard to even, but-- 01:41:08.320 |
"Our expression of emotions has nothing to do 01:41:16.480 |
You can tell, 'cause one of these apps pops up 01:41:22.260 |
"I feel bad right now because the thing popping up 01:41:26.560 |
"because I was on my phone and I should've been." 01:41:27.880 |
You know, I'm like, "That's not my, you know." 01:41:37.280 |
Think about when the AI gets really good at this. 01:41:41.520 |
it can just, I want the AI to just tell me what to do. 01:41:47.360 |
so how about this, now imagine attaching that 01:41:55.880 |
And I give the, I tell the AI my broad goals. 01:42:02.480 |
maintain my weight, but I wanna have more energy, 01:42:04.200 |
or whatever, or I just wanna be very healthy, 01:42:06.400 |
and I wanna, obviously, everyone wants the same, 01:42:08.120 |
like, 10 basic things, like you wanna avoid cancer, 01:42:13.200 |
So now the AI has my goals, and a drone comes at, 01:42:23.100 |
like, you know, 15 minutes, you're gonna eat. 01:42:27.860 |
15 minutes later, a little slot opens in my wall, 01:42:36.780 |
for my mood, for my genome, for my blood contents. 01:42:42.740 |
so, you know, it knows I wanna feel energy at this time, 01:42:53.940 |
- Exactly, it knows you way better than you know yourself, 01:43:01.500 |
So it pops up and it says, like, you know, coffee, 01:43:12.260 |
it stays in my system, it knows what my sleep is like 01:43:14.140 |
when I have it too late, it knows I have to wake up 01:43:15.620 |
at this time tomorrow, 'cause that's my calendar. 01:43:19.300 |
this is, I think, something that humans are wrong about, 01:43:22.100 |
is that most people will hear this and be like, 01:43:27.460 |
And if we all had this, we would not look back 01:43:29.420 |
and be like, I wish I was, like, making awful choices 01:43:33.660 |
And then, this isn't, these aren't important decisions, 01:43:51.700 |
I really love, like, I love thinking about that. 01:43:55.420 |
I think it's gonna be very, and I think we'll all be 01:43:57.020 |
so much healthier, that when we look back today, 01:44:00.460 |
one of the things that's gonna look so primitive 01:44:10.020 |
one, you know, unique advice coming from AI, and so, yeah. 01:44:22.340 |
of that data being used by authoritarian governments 01:44:30.040 |
it's most likely going to be used as part of a competition 01:44:34.300 |
to get you the most delicious and healthy meal possible 01:44:38.340 |
- Yeah, so the world will definitely be much better 01:44:45.220 |
be transparent and honest about how that data is misused, 01:44:48.340 |
and that's why it's important to have free speech 01:44:51.700 |
when some bullshit is being done by companies. 01:44:53.580 |
- That we need to have our wits about us as a society. 01:44:55.940 |
Like, this is what, free speech is the mechanism 01:45:00.480 |
by which the big brain can think, can think for itself, 01:45:12.820 |
So forget the government taking away free speech. 01:45:15.660 |
If the culture penalizes nuanced conversation 01:45:24.380 |
and it's such an incredible market to polarize people, 01:45:24.380 |
and get people hooked on it as a political topic, 01:45:38.100 |
So free speech goes away, as far as it matters. 01:45:41.220 |
well, it's not, you don't even know what free speech is. 01:45:48.380 |
My First Amendment rights are not being violated. 01:46:03.960 |
Take any topic, again, that has to do with, like, 01:46:13.680 |
or, you know, even, you know, climate change. 01:46:27.080 |
Your life can be over, you know, as far as it matters, 01:46:43.920 |
Like, you know, you're wrong, and here's why. 01:46:53.120 |
And the culture of, and you say something wrong, 01:46:56.500 |
Oh, wow, like, look, this is his real, you know, colors. 01:47:01.560 |
- You still have mutual respect for each other. 01:47:10.840 |
But you still have respect, you still have love for them. 01:47:19.720 |
like, everybody lost hope that something like a truth 01:47:38.000 |
that other people's ideas might be dumb as hell, 01:47:44.560 |
- Right now, people are being trained, little kids, 01:47:51.360 |
To think that there's no such thing as objective truth. 01:47:59.180 |
Doesn't mean we're, you know, necessarily on our way, 01:48:01.720 |
or we're finding, but we're all aiming in the same direction. 01:48:12.160 |
You know, it's, you know, it's like, you know, 01:48:27.540 |
it's like, I would teach kids some very specific things 01:48:40.700 |
you've tweeted, "30 minutes of reading a day equals," 01:48:43.460 |
yeah, this whole video, and it's cool to think about reading 01:48:46.020 |
like, as a habit, and something that accumulates. 01:49:06.980 |
If you do something, a little of something every day, 01:49:13.860 |
the people who achieve these incredible things in life, 01:49:19.620 |
they have the same number of days that you do, 01:49:21.340 |
and it's not like they were doing magical days. 01:49:36.300 |
So, you can take writing, someone who, you know, 01:49:39.100 |
there's two aspiring writers, and one doesn't ever write, 01:49:42.140 |
doesn't, you know, manages to never, you know, 01:49:45.060 |
and the other one manages to do two pages a week, right? 01:49:50.220 |
One does zero pages a week, the other two pages a week. 01:49:50.220 |
The other person, just 2%, they're doing one other thing. 01:50:05.580 |
They're one of the most prolific writers of all time. 01:50:15.940 |
So, in 20 years, you've still written 10 books, 01:50:31.980 |
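To put rough numbers on that compounding (a back-of-the-envelope sketch, assuming a typical book runs about 200 pages, a figure not stated in the conversation):

$$2\ \tfrac{\text{pages}}{\text{week}} \times 52\ \tfrac{\text{weeks}}{\text{year}} \approx 104\ \tfrac{\text{pages}}{\text{year}}, \qquad 104 \times 20\ \text{years} \approx 2{,}080\ \text{pages} \approx 10\ \text{books}$$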
So, it's inspiring, I think, for a lot of people 01:50:34.900 |
who feel frustrated they're not doing anything. 01:50:38.580 |
where someone who reads very, you know, doesn't read, 01:50:45.700 |
You know, I always think about like the Tyler Cowen types. 01:51:06.060 |
they're not doing something crazy and magical. 01:51:07.980 |
They're just reading a half hour a night, you know? 01:51:15.260 |
So, if someone who's 80, and they've read a thousand books, 01:51:18.620 |
you know, between 30 and 80, they are extremely well read. 01:51:21.780 |
They can delve deep into many non-fiction areas. 01:51:24.820 |
They can be, you know, an amazing fiction reader, 01:51:40.860 |
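The arithmetic behind the thousand-books claim (a rough sketch, assuming roughly a page a minute, a pace mentioned later in the conversation, and books of a few hundred pages):

$$30\ \tfrac{\text{min}}{\text{day}} \times 365\ \tfrac{\text{days}}{\text{year}} \approx 11{,}000\ \tfrac{\text{pages}}{\text{year}} \approx 20\text{--}35\ \tfrac{\text{books}}{\text{year}}, \qquad 50\ \text{years} \Rightarrow 1{,}000{+}\ \text{books}$$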
And you realize that a lot of times you think 01:51:43.820 |
that the people who are doing amazing things, 01:51:45.820 |
and you're not, you think that there's a bigger gap 01:51:51.420 |
I, on the reading front, I'm a very slow reader, 01:51:54.020 |
which is just a very frustrating fact about me. 01:52:05.700 |
and I'll wake up, throw it on, do it in the shower, 01:52:09.100 |
brushing my teeth, you know, making breakfast, 01:52:11.020 |
dealing with the dogs, things like that, whatever, 01:52:14.180 |
And that's, I can read, I can read a book a week, 01:52:21.820 |
because I'm just, while doing my morning stuff, 01:52:26.260 |
I'm like, having a great time the whole morning, 01:52:29.500 |
So I think that, you know, audiobooks is another amazing 01:52:34.180 |
- I find that that's actually an interesting skill. 01:52:38.700 |
Like, it's a skill to maintain, at least for me, 01:52:45.620 |
there's a lot of content, and if you miss parts of it, 01:52:51.940 |
And so, it's a skill to maintain focus, at least for me. 01:52:55.700 |
- Well, the 10 second back button is very valuable. 01:52:59.740 |
- So I just, if I get lost, sometimes the book is so good 01:53:02.500 |
that I'm thinking about what the person just said, 01:53:04.020 |
and I just get, the skill for me is just remembering 01:53:11.820 |
but that's, but it's, I do the same thing when I'm reading. 01:53:14.060 |
I'll read a whole paragraph and realize I was tuning out. 01:53:17.140 |
I haven't actually even considered trying that. 01:53:19.780 |
I've been so hard on myself maintaining focus, 01:53:25.180 |
- Yeah, and when you get lost in thought, by the way, 01:53:29.260 |
That's your brain really categorizing and cataloging 01:53:33.500 |
- Well, there's several kinds of thoughts, right? 01:53:37.580 |
and there's a thought that it could take you elsewhere. 01:53:40.100 |
- Well, I find that if I am continually thinking 01:53:52.180 |
If I'm having all these thoughts about other stuff, 01:53:53.900 |
I'm saying, clearly my mind wants to work on something else. 01:54:00.660 |
Also, you can, things like you have to head out to the store. 01:54:06.500 |
just walking back and forth, going to the airport. 01:54:23.500 |
Say, I don't like, I like to have the paper book. 01:54:25.380 |
And sure, but like, it's pretty fun to be able to read. 01:54:29.420 |
I listen to a huge number of audio books and podcasts, 01:54:31.920 |
but I still, the most impactful experiences for me 01:54:42.120 |
like that estimate how long a book takes on average, 01:54:46.600 |
- They do like a page a minute when I read, like, 01:55:06.560 |
when this happens and another friend had read it, 01:55:09.460 |
I'm like, oh, I can tell you, like, the entire, 01:55:14.160 |
have to read so much less in my life. 01:55:14.160 |
- I actually, so, in terms of going to the airport, 01:55:18.720 |
you know, in these, like, filler moments of life, 01:55:31.000 |
When I read, I write it down if I want to remember it. 01:55:43.660 |
It's called Anki, I recommend it to a lot of people. 01:55:51.660 |
So, this is extremely well-known app and idea, 01:55:55.620 |
like, among students who are, like, medical students, 01:56:03.740 |
They really have to memorize a lot of things. 01:56:19.780 |
'Cause you're, that'd be interesting, actually, 01:56:24.140 |
you talked about, like, opening up a trillion tabs 01:56:27.700 |
You know, you probably want to remember some facts 01:56:34.940 |
this thing I can't directly put into the writing, 01:56:54.900 |
there's a bunch of apps that are much nicer than Anki. 01:56:57.060 |
Anki is the ghetto, like, Craigslist version, 01:57:01.940 |
because people are like, we don't want features. 01:57:14.820 |
- There's the amount of-- - You'll realize that, 01:57:19.860 |
I guarantee you'll probably write a blog about it. 01:57:23.260 |
- Well, it's also just like-- - It's your people, too. 01:57:25.100 |
- And my, people say, what do you write about? 01:57:42.100 |
coming across something, or just a tweet, you know? 01:57:44.660 |
Something that I'm like, ooh, I need to share this 01:57:52.180 |
who I'm like, oh, I need to tell them about this. 01:57:56.460 |
I mean, I collect things in a document right now, 01:57:58.140 |
if it's really good, but it's the little factoids 01:58:05.180 |
when you look at it, a tweet and all that kind of stuff, 01:58:07.980 |
is you also need to couple that with a system for review. 01:58:14.100 |
it determines for me, I don't have to do anything. 01:58:16.460 |
There's this giant pile of things I've saved, 01:58:21.660 |
I don't know, when Churchill did something, right? 01:58:33.940 |
And you say yes or no, or, like, you get to pick. 01:58:38.700 |
You get to see the answer, and you get to self-evaluate 01:58:43.380 |
And if you remember it well, it'll be another month 01:58:46.900 |
If you don't remember, it'll bring it up again. 01:58:48.900 |
That's a way to review tweets, to review concepts. 01:58:52.260 |
And it offloads the kind of, the process of selecting 01:58:55.860 |
which parts you're supposed to review or not. 01:59:05.140 |
It's like you can passively sit back and just, 01:59:10.340 |
Versus, you know, you don't have to be the executive 01:59:13.500 |
calling that, like, the program, the memorization program 01:59:17.500 |
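As a concrete picture of the review scheduling being described, here is a minimal sketch in Python. It uses a toy doubling interval, not Anki's actual algorithm (which is based on SM-2 with graded answers), and the Churchill card just echoes the example above:

```python
import datetime

class Card:
    """One saved prompt/answer pair plus its review schedule."""
    def __init__(self, prompt, answer):
        self.prompt = prompt
        self.answer = answer
        self.interval_days = 1              # days until the next review
        self.due = datetime.date.today()

    def review(self, remembered):
        """Self-evaluate after seeing the answer: the interval grows
        when you remember, and resets when you don't."""
        if remembered:
            self.interval_days *= 2         # 1 -> 2 -> 4 -> ... -> ~a month
        else:
            self.interval_days = 1          # forgotten, so it comes up again soon
        self.due = datetime.date.today() + datetime.timedelta(days=self.interval_days)

def due_today(deck):
    """The scheduler, not you, selects what to review today."""
    today = datetime.date.today()
    return [card for card in deck if card.due <= today]

# Usage: save a factoid once, then just answer whatever the app brings up.
deck = [Card("What year did Churchill become Prime Minister?", "1940")]
for card in due_today(deck):
    print(card.prompt)
    card.review(remembered=True)            # your yes/no self-evaluation
```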
- I would love to hear about, like, you trying it out, 01:59:22.380 |
There's a few other apps, but Anki's the big must. 01:59:48.900 |
People don't take it seriously as a dire problem. 02:00:00.900 |
There's, like, we talked about the compiling concept 02:00:08.260 |
with, you know, if you read a little, you know, 02:00:24.660 |
when they commit to something, like on Sunday mornings, 02:00:33.860 |
they respect the part of them that made that decision 02:00:39.460 |
And they say, well, I decided it, so I'm gonna do it. 02:00:51.020 |
But that doesn't mean they're any less talented 02:00:58.900 |
And it doesn't mean that they wouldn't be just as happy 02:01:05.140 |
picture a writer who writes 10 books, you know, 02:01:08.020 |
bestsellers, and they go on these book tours, 02:01:11.100 |
and, you know, they, and they just are so gratified 02:01:27.820 |
And it's because the internal mechanism in their brain 02:01:33.940 |
So they don't have the respect for the part of them 02:01:38.020 |
They feel like it's someone they can disregard. 02:01:42.820 |
as someone who is obese because their eating habits 02:01:47.220 |
make them obese over time or their exercise habits? 02:01:52.100 |
That, you know, that's a huge loss for that person. 02:01:54.300 |
That person is, you know, the health problems 02:01:56.700 |
and it's just probably making them miserable. 02:02:01.900 |
It's self-defeating, but that doesn't make it 02:02:07.500 |
So to me, procrastination is another one of these 02:02:09.380 |
where you are the only person in your own way. 02:02:11.800 |
You are, you know, you are failing at something 02:02:15.220 |
or not doing something that you really wanna do. 02:02:18.900 |
Maybe you're, you wanna get out of that marriage 02:02:23.060 |
you shouldn't be in this marriage, you should get divorced. 02:02:28.440 |
That is, you know, you're not living the life 02:02:38.020 |
Now, the problem is it's also a funny problem 02:02:45.680 |
Now, some people, you know, this is when I bring in, 02:02:53.940 |
the procrastinator can, there's different levels. 02:02:57.780 |
There's the kind that even when there's a deadline, 02:03:02.540 |
they stop panicking, they just, they've given up 02:03:06.460 |
Then there's the kind that when there's a deadline, 02:03:08.340 |
they'll do it, but they'll wait to the last second. 02:03:10.340 |
Both of those people, I think, have a huge problem 02:03:15.100 |
Because, and most of the important things in life, 02:03:21.140 |
becoming a writer when you never have been before, 02:03:37.260 |
firing is the right, someone that needs to be fired, right? 02:03:48.340 |
that would completely change your life if you just did it 02:03:55.580 |
And I think that a ton of people have a problem 02:04:00.580 |
where they have this delusion that, 02:04:00.580 |
and it just sits there on their list, collecting dust. 02:04:16.300 |
And so yeah, to me, it is very real suffering. 02:04:24.580 |
- I'm still working on the fix, first of all. 02:04:51.500 |
gets too close or once there's some scary external pressure, 02:04:55.580 |
And that's a huge aid to a lot of procrastinators. 02:04:59.140 |
Again, there's a lot of people who won't, you know, 02:05:01.540 |
do that thing, they've been writing that book 02:05:03.120 |
they wanted to write, but there's way fewer people 02:05:16.640 |
If you wanna, you know, you really wanna write music, 02:05:19.340 |
you really wanna become a singer, songwriter, 02:05:25.620 |
and say, hey, on this day, two months from now, 02:05:28.020 |
come and see, I'm gonna play you some of my songs. 02:05:30.340 |
You now have a panic monster, you're gonna write songs, 02:05:33.400 |
So there's duct tape things, you know, you can do things, 02:05:40.280 |
with a friend and I say, if I don't get X done 02:05:42.900 |
by a week from now, I have to donate a lot of money 02:05:48.200 |
- And that's, you would put that in the category 02:05:51.520 |
- Yeah, because it's not, why do I need that, right? 02:05:55.240 |
If I really had solved this, this is something 02:05:59.000 |
This is, I just literally just want to be selfish here 02:06:01.400 |
and do the work I need to do to get the goals 02:06:04.320 |
There's a, all the incentives should be in the right place 02:06:08.600 |
and yet, if I don't say that, it'll be a week from now 02:06:12.080 |
Something weird is going on, there's some resistance, 02:06:14.080 |
there's some force that is in my own way, right? 02:06:17.800 |
And so, doing something where I have to pay all this money, 02:06:23.920 |
Fixing the boat is something where I don't have to do that, 02:06:29.520 |
it's not, I'm not talking about super crazy work ethic, 02:06:33.060 |
just like, for example, okay, I have a lot of examples 02:06:36.560 |
'cause I have a serious problem that I've been working on 02:06:40.160 |
and in some ways, I've gotten really successful 02:06:41.880 |
at solving it and in other ways, I'm still floundering. 02:06:49.960 |
I probably could be even better and I'm like, and I'm-- 02:06:52.400 |
- You're procrastinating on becoming a better duct taper. 02:06:54.880 |
- Literally, like yes, there's nothing I won't. 02:06:57.520 |
So, here's what I know I should do as a writer, right? 02:07:00.760 |
It's very obvious to me, is that I should wake up, 02:07:03.640 |
doesn't have to be crazy, I don't have to do 6 a.m. 02:07:05.000 |
or anything insane or I'm not gonna be one of those 02:07:14.400 |
and I should have a block, just say nine to noon 02:07:23.600 |
- It's obvious because all the great writers in history 02:07:26.880 |
did exactly that, some-- - Some of them have done that, 02:07:29.320 |
that's common, there's some that I like these writers, 02:07:32.720 |
but most of them, they do-- - But there's a session, 02:07:34.840 |
but there's a session that's-- - Most writers write 02:07:38.440 |
I don't think I'm different than those people. 02:07:41.040 |
It's a great time to write, you're fresh, right? 02:07:43.760 |
Your ideas from dreaming have kind of collected, 02:07:46.800 |
you have all the new answers that you didn't have yesterday 02:07:51.120 |
But more importantly, if I just had a routine 02:07:58.000 |
every week would have a minimum of 15 focused hours 02:08:03.920 |
but it's a lot, a 15, 15, no, this is no joke, 02:08:09.600 |
you're not talking to anyone, you're not opening your email, 02:08:11.680 |
you are focused writing for three hours, five, 02:08:16.560 |
So now what's happening is that every weekday 02:08:21.480 |
I know an A might be, wow, I really just got into a flow 02:08:26.080 |
but it's a minimum of a B, I can keep going if I want, 02:08:28.720 |
and every week is a minimum of a B, that's 15 hours. 02:08:31.360 |
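Spelled out (with the yearly figure as an added extrapolation, assuming roughly 50 working weeks, which he doesn't state):

$$3\ \tfrac{\text{hours}}{\text{weekday}} \times 5 = 15\ \tfrac{\text{focused hours}}{\text{week}} \approx 750\ \text{focused hours a year}$$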
Right, and if I just had, talk about compiling, 02:08:33.320 |
this is the two pages a week, if I just did that 02:08:35.880 |
every week, I'd achieve all my writing goals in my life. 02:08:42.600 |
either I'll revenge-procrastinate late at night 02:08:42.600 |
and go to bed way too late and then wake up later 02:08:46.360 |
and get on a bad schedule and I just fall into 02:08:48.000 |
these bad schedules, or I'll wake up and there's just, 02:08:50.440 |
you know, I'll say I was gonna do a few emails 02:08:52.320 |
and I'll open it up and suddenly I'm texting, 02:08:53.960 |
I'm texting, or I'll just go and I'll make a phone call 02:09:00.080 |
Or I'll start writing and then I hit a little bit of a wall, 02:09:06.680 |
well, this is icky and I'll go do something else. 02:09:14.480 |
she's the manager of lots of things, that's her role. 02:09:26.000 |
where she can see my screen from nine to noon, 02:09:33.760 |
It's the feeling of, you know, in the old days 02:09:36.760 |
you know, your lunch block is over and it's like, 02:09:41.780 |
But you said, you know what, you go, you say, okay, 02:09:45.080 |
and then you get to class and it's not that bad 02:09:48.400 |
You have a trainer and he says, okay, next set, 02:09:52.440 |
It's someone, some external thing being like, 02:10:03.440 |
Other people, I think, were raised with a sense of shame 02:10:06.320 |
and that stick in their head is hugely helpful. 02:10:11.000 |
And so, anyway, Alicia's sitting there next to me. 02:10:14.160 |
She's doing her own work, but she can see my screen, 02:10:17.160 |
and she, of all people, knows exactly what I should be doing 02:10:26.040 |
would just be too weird and too embarrassing. 02:10:31.600 |
So duct tape can solve, sometimes duct tape is enough, 02:10:39.000 |
I think part of it is that we are actually wired. 02:10:56.480 |
that we get the results like six months later. 02:10:59.640 |
Like that is not, so we're supposed to conserve energy 02:11:14.480 |
I think a lot of times we're just avoiding suffering, 02:11:15.960 |
and for a lot of people, the pain of not doing it 02:11:31.400 |
It just becomes I do it 'cause that's what I do. 02:11:46.320 |
and so on, like the I don't wanna do another set, 02:11:57.040 |
that the moment they would be having that feeling, 02:12:06.960 |
Like I talked to Elon about this a lot actually offline. 02:12:12.960 |
- It's the way I think, at least he talks about it, 02:12:18.080 |
you just pretend you're a machine running an algorithm. 02:12:36.480 |
you can frame it as like, it can feel like homework 02:12:39.360 |
or it can feel like you're living your best life 02:12:46.360 |
But I think ultimately is whatever reframing you need to do, 02:12:56.960 |
Like I'm now on a kick where I exercise every day. 02:13:09.360 |
But it's a thing that like I make sure I exercise every day 02:13:12.360 |
and it's become way, way easier because of the habit. 02:13:15.520 |
And I just, and I don't, like at least with exercise 02:13:19.640 |
'cause it's easier to replicate that feeling, 02:13:26.200 |
Well, I think about that even just like little things 02:13:33.320 |
I would be like, oh, I'm gonna go to the bathroom, 02:13:36.280 |
No, I just wanna like, I'm just gonna lie down right now. 02:13:38.920 |
It's just that I robotically go and do it. 02:13:38.920 |
- And it almost has become like a nice routine. 02:13:44.720 |
You know, it's like a morning routine for me stuff 02:13:47.280 |
is like, you know, that stuff is kind of just like 02:13:54.480 |
I don't think I skipped any days brushing my teeth. 02:14:21.840 |
I think that this kind of more primitive brain 02:14:28.200 |
that primitive brain is on board for some reason 02:14:32.680 |
So, but when I think about brushing my teeth, 02:14:44.000 |
we have to just like kind of like robotically, 02:14:45.440 |
just like, you know, it was kind of like Stockholm syndrome, 02:14:54.240 |
ooh, no, no, no, most days I can win this one. 02:14:56.800 |
And so the monkey puts up that like fierce resistance. 02:15:05.660 |
So I think of it as like jumping in a cold pool, 02:15:12.560 |
pacing around the side of the pool in my bathing suit, 02:15:15.080 |
just being like, I don't want to have that one second 02:15:27.040 |
You know, then I suddenly I'm like, I get into a flow. 02:15:29.320 |
So it's like, once I get in the cold water, I don't mind it. 02:15:31.380 |
But I will spend hours standing around the side of the pool. 02:15:34.960 |
And by the way, I do this in a more literal sense. 02:15:47.440 |
ugh, okay, I have to go to class feeling, right? 02:15:50.520 |
I will literally do a set and then dick around on my phone 02:15:57.380 |
And I'll spend over an hour there and do way less. 02:16:03.120 |
I'm never like, I don't want to stop in the middle. 02:16:07.020 |
So it's something, there's something about transitions 02:16:09.160 |
that is very, that's why procrastinators are late 02:16:19.220 |
I'll leave at 3:36 and I'll be super stressed. 02:16:19.220 |
immediately I'm like, why didn't I do this earlier? 02:16:26.360 |
Now I'm back on my phone doing what I was doing. 02:16:28.660 |
I just had to get in the damn car or whatever. 02:16:31.280 |
So yeah, there's some very, very odd, irrational. 02:16:38.640 |
and you said that you're running a few minutes late. 02:16:45.560 |
because I can't possibly be the one who's early. 02:16:49.600 |
I don't understand, I'm always late to stuff. 02:16:51.800 |
And I know it's disrespectful in the eyes of a lot of people. 02:16:55.280 |
I can't help, you know what I'm doing ahead of it? 02:17:06.040 |
I obviously care about the person, but for some-- 02:17:13.060 |
because they like, they kind of like that quality 02:17:17.680 |
But more often, it's someone who shows up frazzled 02:17:20.340 |
and they feel awful and they're furious at themselves. 02:17:26.580 |
is look at those people alone running through the airport. 02:17:29.820 |
- They're not being disrespectful to anyone there. 02:17:35.300 |
- You've tweeted a quote by James Baldwin saying, 02:17:38.260 |
quote, "I imagine one of the reasons people cling 02:17:40.620 |
"to their hates so stubbornly is because they sense 02:17:49.700 |
What has been a painful but formative experience 02:17:55.340 |
Or what's the flavor, the shape of your pain that fuels you? 02:17:59.040 |
- I mean, honestly, the first thing that jumped to mind 02:18:02.700 |
is my own battles against myself to get my work done 02:18:11.260 |
Like, I probably would have taken two or three years, 02:18:19.900 |
I'm making it, I'm adding in things I shouldn't 02:18:23.860 |
being a perfectionist about like, oh, well, I learned that. 02:18:30.980 |
something, you know, trying to get it perfect 02:18:36.700 |
like, I'm not actually that much of a writing amateur. 02:18:55.620 |
It makes, you know, B, it wastes your precious time. 02:19:01.820 |
when you're in a negative, you know, self-defeating spiral, 02:19:05.860 |
it almost inevitably, you'll be less good to others. 02:19:10.020 |
Like, you know, I'll just, I used to, you know, 02:19:19.060 |
You know, New York City, great place for this. 02:19:25.740 |
I said, "I'm reserving you for Thursday night, 02:19:29.100 |
And it was such a fun part of our relationship. 02:19:31.460 |
Started writing this book and got into a really bad, 02:19:39.140 |
And I just stopped, like, ever valuing, like, 02:19:44.260 |
Like, I was like, "No, no, that's when I'm done." 02:19:47.180 |
And that's a trap, or very quickly, you know, 02:19:55.820 |
And five years is not nothing, we don't live very long. 02:19:55.820 |
Like, you're talking about your prime decades. 02:20:10.060 |
a very unproductive, unhelpful pattern for me, 02:20:13.020 |
which is I'd wake up in the morning in this great mood. 02:20:21.180 |
And, but, you know, first I'm gonna do all these other things 02:20:24.740 |
And then I ended up kind of failing for the day 02:20:34.300 |
probably a couple hours later than I want to. 02:20:36.060 |
And that's when all of the real reality hits me. 02:20:42.060 |
furious at myself, wishing I could take a time machine back 02:20:54.340 |
Procrastinators suffer in a very serious way. 02:20:56.780 |
So look, I, you know, I know this probably sounds 02:21:00.100 |
like a lot of like first world problems, and it is, 02:21:11.540 |
you're not being as good a friend or a spouse 02:21:16.100 |
You're usually not being very healthy in these moments. 02:21:17.920 |
You know, you're often, and you're not being, 02:21:23.000 |
And it's like, it feels like it's one small tweak away. 02:21:28.660 |
It's like, you just suddenly are just doing that nine to 12 02:21:39.060 |
I have not figured, I haven't fixed the boat yet. 02:21:45.460 |
'cause it is true that some of the greats in history, 02:21:48.460 |
especially writers, suffer from all the same stuff. 02:21:53.820 |
you might only write for two or three hours a day, 02:22:07.180 |
You'd be shocked how much you could wake up at nine 02:22:15.420 |
One or two, and do 25 really focused hours of stuff, 02:22:20.900 |
and then there's 112 waking hours in the week, right? 02:22:23.980 |
So we're talking about 80 something hours of free time. 02:22:27.220 |
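The arithmetic being gestured at (a sketch, assuming roughly 16 waking hours a day, which the numbers imply but he doesn't state):

$$16\ \tfrac{\text{hours}}{\text{day}} \times 7 = 112\ \text{waking hours}, \qquad 112 - 25\ \text{focused hours} = 87 \approx \text{``80 something'' hours of free time}$$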
You can live, you know, if you're just really focused 02:22:39.460 |
Right now I have neither, it's a lot of gray. 02:22:40.780 |
It's a lot of I should be working, but I'm not, 02:22:45.620 |
So if you can just get really good at the black and the white 02:22:49.580 |
so you just wake up and it's just like full work. 02:22:51.580 |
And then I think a lot of people could have like, 02:22:55.940 |
It's like you said, I'll do them really late at night 02:22:57.700 |
or whatever after having tortured myself the whole day 02:23:04.620 |
which is where you are when you know you should be working, 02:23:12.740 |
And so, yeah, I spend a lot of time in the dark. 02:23:18.100 |
- It's not clean conscience fun, it's bad, it's toxic. 02:23:21.340 |
And I think that it's, there's something about, 02:23:23.780 |
you know, you're draining yourself all the time. 02:23:26.820 |
and then if you actually have good, clean, fun, 02:23:29.620 |
You're reading a book, can be hanging out with someone, 02:23:32.300 |
You can go and do something cool in the city. 02:23:36.660 |
It's, you're recharging some part of your psyche there. 02:23:40.940 |
And I say this from the experiences when I have had, 02:23:46.660 |
It's like, you feel like you're fist pounding. 02:23:48.380 |
One part of your brain is fist pounding the other part. 02:23:50.300 |
Like, you're like, we got, like, we treat ourselves well. 02:23:54.820 |
Like, it's how you're internally feel like, I treat myself. 02:23:56.980 |
And it's like, yeah, no, of course it's work time. 02:23:58.380 |
And then later you're like, now it's play time. 02:24:24.220 |
- But you know, this podcast is very regular. 02:24:37.060 |
to schedule stuff for the future for myself, right? 02:24:39.100 |
Because that's future Tim and future Tim is not my problem. 02:24:42.480 |
So I'll schedule all kinds of shit for future Tim 02:24:53.860 |
- Right, it seems like a good medium for procrastinating. 02:24:58.020 |
- I know, but at least this is the kind of thing, 02:25:03.420 |
it's the kind of thing that you would dream of doing 02:25:14.300 |
I'm sure you have this same kind of thing with the podcast. 02:25:17.100 |
In fact, because you're gonna be doing the podcast, 02:25:33.260 |
and have done this differently, that is desperation. 02:25:38.800 |
The other thing you could do is if you have a partner, 02:25:41.380 |
now you could say, we meet these 15 hours every week. 02:25:48.980 |
- Yeah, that's why they say like a co-founder 02:26:09.420 |
Like you can't even have a hope of a balanced life 02:26:16.380 |
When you, you're one of the most interesting people 02:26:21.820 |
So as a writer, you look out into the future. 02:26:26.580 |
Do you dream about certain things you want to still create? 02:26:34.180 |
Is there movies you want to write or direct or? 02:26:41.020 |
- No, there's a specific list of things that really excite me, 02:26:41.020 |
And that's part of why the last five years really like, 02:26:51.620 |
when I feel like I'm not moving as quickly as I could, 02:26:54.380 |
it bothers me because I have so much genuine excitement 02:27:00.240 |
I don't like doing things, but a lot of writers are like that. 02:27:10.020 |
putting it out there and having people appreciate it. 02:27:10.020 |
It's like the best thing in the world, right? 02:27:15.740 |
has a little, you know, a dream or, you know, something. 02:27:25.780 |
and I feel proud of it and I put it out there. 02:27:29.160 |
I'm like fist pounding my seven year old self. 02:27:33.260 |
I like, I owe it to myself to do certain things. 02:27:41.740 |
you just feel very, a lot of inner peace when you do it. 02:27:47.460 |
So I just, it's, for me, that includes a lot more writing. 02:27:50.860 |
I just, you know, short, short, no, short blog posts. 02:27:56.100 |
of long blog posts is a great, I love that medium. 02:28:01.900 |
I'm gonna do this and I'm gonna have another book 02:28:05.340 |
And if I do, I'll do more, otherwise I won't. 02:28:10.840 |
I want to, I did a little travel series once. 02:28:25.620 |
- And they picked, they sent me to weird places. 02:28:27.940 |
They sent me, I went to Siberia, I went to Japan. 02:28:42.440 |
And I get to, you know, each one I got to, you know, 02:28:58.480 |
oh man, like I haven't done one of those in so long. 02:29:01.780 |
And then I have a big like desire to do fictional stuff. 02:29:09.700 |
That's actually what I was doing before Wait But Why. 02:29:28.420 |
on which you can be creative and create something, 02:29:37.180 |
I could do something good, I wanna do it, I wanna try it. 02:29:42.460 |
- I think it's fun to just watch you actually sample these. 02:29:49.460 |
I mean, that's a cool medium to see like where it goes. 02:29:52.220 |
The cool thing about podcasting or making videos, 02:29:54.340 |
especially with a super creative mind like yours, 02:29:56.840 |
you don't really know what you're gonna make of it 02:30:03.020 |
but I'm like, I like going on other people's podcasts. 02:30:09.980 |
there's the challenges of how the sausage is made. 02:30:13.180 |
So like the challenges of the challenge of actually. 02:30:16.700 |
I'll go on like, as you know, long ass monologues. 02:30:19.420 |
And you can't do it on, if you're the interviewer, 02:30:24.780 |
And that can be, that might be hard, but we'll see. 02:30:32.020 |
I mean, some of my favorite is more like solo, 02:30:36.060 |
So you're having a conversation, but you're like friends, 02:30:48.820 |
Like if it's just a friend who I wanna like really riff with, 02:30:51.860 |
I just don't, I don't like interviewing someone, 02:30:55.380 |
which I won't, that's not what the podcast will be, 02:30:57.180 |
but I can't help, I've tried moderating panels before, 02:31:02.140 |
And no one likes a moderator who's too involved. 02:31:04.900 |
So I'm interviewing someone and I'm like, I can't, 02:31:11.540 |
That's my curiosity being like, wait, how about this? 02:31:14.580 |
- Yeah, I see the way your brain works is hilarious. 02:31:17.940 |
It's like lights up with fire and excitement. 02:31:21.860 |
I like watching people, I like listening to people. 02:31:27.900 |
This is me listening to your podcast right now. 02:31:44.260 |
or maybe broadly this whole thing we got going on, 02:31:51.260 |
For me, I feel like I want to be around as long as I can. 02:31:55.580 |
If I can do some kind of crazy life extension 02:32:10.180 |
As I said, no one wants eternal life, I believe, 02:32:15.900 |
and I was like, okay, no one wants to live that many years. 02:32:25.620 |
We'd have a whole process of someone signing off, 02:32:30.860 |
- No, I think you'd be super depressed by that point. 02:32:33.540 |
Like, who's gonna sign off when they're doing pretty good? 02:32:38.740 |
if I'm happy, I can stay around for, you know, 02:32:46.860 |
if you would sign up for 50 if you had a choice, 02:33:01.140 |
And then, honestly, the reason I love writing, 02:33:04.140 |
is like, warm, fuzzy connection with other people, right? 02:33:11.820 |
And that's why I would never wanna be like a journalist, 02:33:13.980 |
where their personality's like hidden behind the writing. 02:33:23.660 |
And if I can take something that's in my head, 02:33:25.380 |
and other people can say, oh my God, I think that too, 02:33:28.420 |
it made me feel seen, like, that feels amazing. 02:33:31.780 |
we're all having such a weird common experience, 02:33:43.940 |
so I feel like so many of us suffer in the same ways, 02:33:46.220 |
and we're all going through a lot of the same things. 02:34:18.140 |
or you know, you just, you know, you get emails, 02:34:22.940 |
- 'Cause you're just sitting there alone, typing. 02:34:26.900 |
- But that's why publishing is so gratifying, 02:34:28.260 |
'cause that's the moment when all this connection happens. 02:34:30.900 |
- And especially if I had to put my finger on it, 02:34:35.500 |
and they're like, the existence is all realized, 02:34:43.300 |
a lot of really high quality time with friends and family, 02:34:53.460 |
and at least we can like, laugh at ourselves together 02:34:58.620 |
- And then your last blog post will be written from Mars, 02:35:02.620 |
as you get the bad news that you're not able to return 02:35:25.420 |
so it's both the hope, the awe that you experience, 02:35:38.060 |
but then it returns to like, the love of humanity. 02:35:53.180 |
- Well, also, that would bring out great writing. 02:35:58.640 |
- Well, that's exactly the future I hope for you, Tim. 02:36:02.780 |
All right, this was an incredible conversation. 02:36:21.820 |
which add that to the long list of ideas to procrastinate. 02:36:26.140 |
How about, Tim, thanks so much for talking to me, man. 02:36:29.540 |
- Thanks for listening to this conversation with Tim Urban. 02:36:34.940 |
please check out our sponsors in the description. 02:36:51.660 |
Thanks for listening, and hope to see you next time.