Elon Musk: SpaceX, Mars, Tesla Autopilot, Self-Driving, Robotics, and AI | Lex Fridman Podcast #252
Chapters
0:00 Introduction
0:07 Elon singing
0:55 SpaceX human spaceflight
7:40 Starship
16:16 Quitting is not in my nature
17:51 Thinking process
27:25 Humans on Mars
32:55 Colonizing Mars
36:41 Wormholes
41:19 Forms of government on Mars
48:22 Smart contracts
49:52 Dogecoin
51:24 Cryptocurrency and Money
57:33 Bitcoin vs Dogecoin
60:16 Satoshi Nakamoto
62:38 Tesla Autopilot
65:44 Tesla Self-Driving
77:48 Neural networks
86:44 When will Tesla solve self-driving?
88:48 Tesla FSD v11
96:21 Tesla Bot
107:01 History
114:52 Putin
120:32 Meme Review
134:58 Stand-up comedy
136:31 Rick and Morty
138:10 Advice for young people
146:08 Love
149:01 Meaning of life
00:00:00.000 |
The following is a conversation with Elon Musk, 00:00:02.540 |
his third time on this, the Lex Fridman Podcast. 00:00:14.540 |
I mean, how close do I need to get at the same thing? 00:01:07.400 |
These human spaceflight missions were a beacon of hope 00:01:10.600 |
to me and to millions over the past two years 00:01:14.640 |
one of the most difficult periods in recent human history. 00:01:18.040 |
We saw, we see the rise of division, fear, cynicism, 00:01:47.200 |
and I think we should make sure we do everything 00:01:52.200 |
we can to have a good future and an exciting future 00:01:54.600 |
and one that maximizes the happiness of the people. 00:02:11.360 |
- Yeah, no, that was extremely stressful, no question. 00:02:17.320 |
We obviously could not let them down in any way. 00:02:21.720 |
So extremely stressful, I'd say, to say the least. 00:02:26.720 |
I was confident that at the time that we launched 00:02:30.920 |
that no one could think of anything at all to do 00:02:35.920 |
that would improve the probability of success. 00:02:39.080 |
And we racked our brains to think of any possible way 00:02:44.840 |
We could not think of anything more and nor could NASA. 00:02:48.080 |
And so that's just the best that we could do. 00:03:12.760 |
and when they returned back home or back to Earth? 00:03:33.480 |
and proved out the systems, 'cause we really, 00:03:49.840 |
was actually very inspiring, the Inspiration 4 mission. 00:03:58.860 |
And it really is, I was actually inspired by that. 00:04:13.760 |
it's the all-civilian, first time all-civilian 00:04:19.400 |
- Yeah, and it was, I think, the highest orbit 00:04:22.480 |
that in like, I don't know, 30 or 40 years or something. 00:04:26.120 |
The only one that was higher was the one shuttle, 00:04:32.160 |
And then before that, it would have been Apollo in '72. 00:04:40.480 |
I think as a species, we want to be continuing 00:04:49.720 |
And I think it would be tragic, extremely tragic, 00:04:52.880 |
if Apollo was the high-water mark for humanity, 00:05:09.040 |
and so almost half a century, and we've not been back. 00:05:27.160 |
I think we could learn a lot about the nature 00:05:28.880 |
of the universe if we have a proper science base 00:05:45.680 |
and then get people to Mars and get out there 00:05:54.480 |
but since you're so busy with the hard engineering 00:06:00.440 |
are you still able to marvel at the magic of it all, 00:06:03.000 |
of space travel, of every time the rocket goes up, 00:06:08.140 |
Or are you just so overwhelmed with all the challenges 00:06:16.240 |
the reason I wanted to ask this question of May 30th, 00:06:23.480 |
It's already, at the time, it was an engineering problem, 00:06:33.680 |
To me, that or something like that, maybe Inspiration 4, 00:06:38.440 |
one of those will be remembered as the early steps 00:06:46.620 |
so I mean, the thing I think maybe some people know, 00:06:50.180 |
is I'm actually the chief engineer of SpaceX, 00:07:03.100 |
with that vehicle, it's fundamentally my fault. 00:07:09.780 |
So I'm really just thinking about all the things that, 00:07:21.500 |
It's like, other people see, oh, this is a spacecraft 00:07:46.500 |
in more detail about Starship in the near future, perhaps. 00:07:49.580 |
- Yeah, we can talk about it now if you want. 00:08:08.640 |
perfectly, one engineering problem perfectly, 00:08:15.540 |
So is it maybe related to the efficiency of the engine, 00:08:22.700 |
maybe the controls of the crazy thing it has to do to land? 00:08:28.900 |
the biggest thing absorbing my time is engine production. 00:08:50.240 |
'Cause I say currently the best rocket engine ever 00:09:03.700 |
And still, I think an engine should only count 00:09:09.020 |
So our engine has not gotten anything to orbit yet, 00:09:11.920 |
but it's the first engine that's actually better 00:09:16.980 |
than the Russian RD engines, which were an amazing design. 00:09:25.500 |
What are the different aspects of it that make it, 00:09:31.940 |
if the whole thing works in terms of efficiency, 00:09:37.120 |
- Well, the Raptor is a full-flow staged combustion engine 00:09:42.120 |
and it's operating at a very high chamber pressure. 00:10:06.940 |
possibly maybe higher, that's 300 atmospheres. 00:10:10.320 |
So the record right now for operational engine 00:10:15.320 |
is the RD engine that I mentioned, the Russian RD, 00:10:27.860 |
So 10% more chamber pressure is more like 50% more difficult. 00:10:38.640 |
that is what allows you to get a very high power density 00:10:45.240 |
So enabling a very high thrust to weight ratio 00:11:23.980 |
which is the ratio between the exit nozzle and the throat. 00:11:45.940 |
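On the chamber-pressure comparison a moment earlier, here is a minimal sketch using approximate public figures; both numbers are assumptions on my part rather than quotes from the conversation.

```python
# Minimal sketch, not from the conversation: approximate chamber pressures in
# bar (atmospheres), to illustrate the "10% more chamber pressure" remark.
raptor_chamber_bar = 300   # assumed Raptor target chamber pressure (~300 bar)
rd_chamber_bar = 267       # assumed RD-180 figure, roughly the prior operational record

increase = (raptor_chamber_bar - rd_chamber_bar) / rd_chamber_bar
print(f"Raptor runs roughly {increase:.0%} higher chamber pressure than the RD engine")
# Prints roughly 12%; per the conversation, each ~10% of extra chamber
# pressure is far more than 10% extra engineering difficulty.
```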
- So why is it such a hard engine to manufacture at scale? 00:12:07.020 |
that don't exist in order to make this engine work. 00:12:38.100 |
And they all have a recursive effect on each other. 00:12:51.480 |
Like there's a reason no one's made this before. 00:12:55.060 |
And the reason we're doing a staged combustion full flow 00:13:17.740 |
which that's the really the holy grail of orbital rocketry, 00:13:23.200 |
you have to have, everything's gotta be the best. 00:13:40.620 |
You've got to shed mass in any possible way that you can. 00:13:45.100 |
For example, we are, instead of putting landing legs 00:13:53.220 |
So that's like, I mean, we're talking about catching 00:14:06.660 |
It's like a karate kid with the fly, but much bigger. 00:14:15.480 |
Anyway, so this is bananas, this is banana stuff. 00:14:19.820 |
- So you mentioned that you doubt, well, not you doubt, 00:14:52.660 |
before we're able to have full and rapid reusability. 00:15:18.340 |
that success was in the set of possible outcomes, 00:15:39.580 |
They're working night and day to make it happen. 00:15:41.980 |
And like I said, the critical thing to achieve 00:15:48.300 |
for the revolution in space flight and for humanity 00:15:52.020 |
is to have a fully and rapidly reusable rocket, 00:16:00.100 |
and this has always been the holy grail of rocketry. 00:16:09.700 |
have tried to do this before and they've not succeeded, 00:16:16.100 |
- What's your source of belief in situations like this, 00:16:21.220 |
when the engineering problem is so difficult? 00:16:23.580 |
There's a lot of experts, many of whom you admire, 00:16:34.060 |
a lot of experts, maybe journalists, all the kind of, 00:16:42.820 |
And you yourself know that even if it's a non-null set, 00:17:02.540 |
And to keep going with the project, take it to completion? 00:17:19.820 |
It doesn't really matter how I think about things. 00:17:22.320 |
I mean, for me, it's simply this is something 00:17:28.100 |
And we should just keep doing it or die trying. 00:17:41.720 |
- And I don't care about optimism or pessimism. 00:17:48.900 |
Can you then zoom back in to specific problems 00:17:55.180 |
with Starship or any engineering problems you work on? 00:18:02.260 |
your thinking process, and describe how you think 00:18:10.100 |
You've spoken about first principles thinking, 00:18:14.180 |
- Well, yeah, I like saying like physics is law 00:18:21.700 |
Like I've met a lot of people that can break the law, 00:18:23.220 |
but I haven't met anyone who could break physics. 00:18:25.620 |
So first, for any kind of technology problem, 00:18:49.840 |
It's really just saying, let's boil something down 00:18:58.420 |
the things that we are most confident are true 00:19:18.120 |
oh, you're violating conservation of energy or momentum 00:19:20.280 |
or something like that, then it's not gonna work. 00:19:30.760 |
And another good physics tool is thinking about things 00:19:41.680 |
or to a very small number, how do things change? 00:19:51.640 |
- Yeah, let's say take an example of manufacturing, 00:19:55.920 |
which I think is just a very underrated problem. 00:20:17.800 |
is, like, why is this part or product expensive? 00:20:22.800 |
Is it because of something fundamentally foolish 00:20:27.400 |
that we're doing, or is it because our volume is too low? 00:20:32.200 |
what if our volume was a million units a year? 00:20:35.760 |
That's what I mean by thinking about things in the limit. 00:20:38.140 |
If it's still expensive at a million units a year, 00:20:40.140 |
then volume is not the reason why your thing is expensive. 00:20:42.520 |
There's something fundamental about the design. 00:20:44.700 |
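A minimal sketch of that limit argument, with made-up numbers purely for illustration:

```python
# Toy illustration of "thinking about things in the limit" for unit cost.
# All numbers here are hypothetical, not from the conversation.
def unit_cost(fixed_costs, marginal_cost, volume):
    """Amortized cost per unit: tooling/overhead spread over volume, plus
    the per-unit cost that comes from the design itself."""
    return fixed_costs / volume + marginal_cost

FIXED = 50_000_000   # hypothetical tooling + overhead
MARGINAL = 120.0     # hypothetical per-unit cost driven by the design

for volume in (1_000, 100_000, 1_000_000):
    print(f"{volume:>9} units/year -> ${unit_cost(FIXED, MARGINAL, volume):,.2f} per unit")

# As volume grows, the amortized term vanishes and cost approaches the
# marginal cost. If the part is still expensive at a million units a year,
# volume isn't the problem -- something about the design is.
```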
- And then you then can focus on reducing the complexity 00:21:04.100 |
well, it's expensive because our unit volume is low. 00:21:06.460 |
And if we were in automotive or something like that, 00:21:08.740 |
or consumer electronics, then our costs would be lower. 00:21:31.760 |
like how we're gonna make the supply chain work here? 00:21:35.720 |
- And then the cost of materials, things like that, 00:21:41.860 |
like you're thinking about things in the limit, 00:21:56.060 |
and say, if you look at the raw materials in the rocket, 00:22:11.780 |
and you say, what's the weight of the constituent elements 00:22:38.740 |
like just a pile of these raw materials here, 00:22:43.620 |
and rearrange the atoms into the final shape, 00:22:56.380 |
So then what's actually causing things to be expensive 00:23:01.160 |
is how you put the atoms into the desired shape. 00:23:05.960 |
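A similarly hedged sketch of the raw-materials version of that limit; the masses and commodity prices below are hypothetical placeholders.

```python
# Toy sketch of the raw-material cost floor for a product -- the "pile of
# atoms" price. Masses (kg) and prices ($/kg) below are hypothetical.
materials_kg = {"aluminum": 900, "steel": 2_000, "copper": 150, "carbon_fiber": 300}
price_per_kg = {"aluminum": 2.5, "steel": 0.8, "copper": 9.0, "carbon_fiber": 25.0}

floor = sum(materials_kg[m] * price_per_kg[m] for m in materials_kg)
print(f"Raw-material floor: ${floor:,.0f}")

# Anything paid above this floor (plus any intellectual-property licensing) is
# the cost of getting the atoms into the desired shape -- which is where a
# really good manufacturer attacks the problem.
```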
- Yeah, actually, if you don't mind me taking a tiny tangent, 00:23:11.400 |
who's somebody that worked with you as a friend. 00:23:20.480 |
of the same kind of thinking that you're talking about now. 00:23:52.120 |
Because I've gotten the chance to interact quite a bit, 00:23:55.960 |
obviously in the academic circles with humanoid robots, 00:23:59.480 |
and then Boston Dynamics and stuff like that, 00:24:04.520 |
And then Jim kind of schooled me on saying like, 00:24:07.600 |
okay, like this kind of first principles thinking 00:24:10.080 |
of how can we get the cost of manufacture down? 00:24:12.440 |
I suppose you do that, you have done that kind of thinking 00:24:16.840 |
for Tesla bot and for all kinds of complex systems 00:24:23.680 |
And you say, okay, how can we simplify everything down? 00:24:28.520 |
I mean, I think if you are really good at manufacturing, 00:24:37.520 |
that asymptotically approaches the raw material value 00:24:42.040 |
of the constituents, plus any intellectual property 00:24:50.280 |
It's not like, that's a very hard thing to do, 00:25:05.400 |
So what will often happen in trying to design a product 00:25:08.520 |
is people will start with the tools and parts and methods 00:25:19.560 |
The other way to think about it is actually imagine the, 00:25:25.080 |
try to imagine the platonic ideal of the perfect product, 00:25:31.360 |
And so what is the perfect arrangement of atoms 00:25:52.080 |
and you really should think about it in this way, 00:25:58.180 |
if you think you might fall victim to the momentum 00:26:07.680 |
people will want to use the same tools and methods 00:26:12.040 |
They just, that's what they'll do by default. 00:26:15.720 |
And then that will lead to an outcome of things 00:26:18.720 |
that can be made with those tools and methods, 00:26:28.480 |
it's good to think of things in both directions. 00:26:30.120 |
So like, what can we build with the tools that we have? 00:26:39.560 |
is gonna be a moving target, 'cause as you learn more, 00:26:42.700 |
the definition for that perfect product will change, 00:26:46.920 |
'cause you don't actually know what the perfect product is, 00:26:48.720 |
but you can successfully approximate a more perfect product. 00:26:53.080 |
So thinking about it like that, and then saying, 00:26:56.680 |
okay, now, what tools, methods, materials, whatever, 00:26:59.920 |
do we need to create in order to get the atoms in that shape? 00:27:11.780 |
- I should mention that the brilliant Shivon Zilis 00:27:15.800 |
is hanging out with us, in case you hear a voice of wisdom 00:27:30.440 |
to put a base on the moon to do some research, 00:27:43.800 |
When do you think SpaceX will land a human being on Mars? 00:27:51.200 |
Best case is about five years, worst case, 10 years. 00:28:15.820 |
- What are the determining factors, would you say, 00:28:28.960 |
I mean, Starship is the most complex and advanced rocket 00:28:51.540 |
and ultimately cost per ton to the surface of Mars. 00:28:56.400 |
but it is actually the thing that needs to be optimized. 00:28:59.300 |
Like, there is a certain cost per ton to the surface of Mars 00:29:04.060 |
where we can afford to establish a self-sustaining city, 00:29:08.820 |
and then above that, we cannot afford to do it. 00:29:11.620 |
So right now, you can fly to Mars for a trillion dollars, 00:29:16.780 |
no amount of money could get you a ticket to Mars. 00:29:22.460 |
to get that, like, something that is actually possible 00:29:25.480 |
But then, but that's, we don't just wanna have, 00:29:37.940 |
In order to pass a very important great filter, 00:29:42.940 |
I think we need to be a multi-planet species. 00:29:45.700 |
This may sound somewhat esoteric to a lot of people, 00:29:55.720 |
there's something, the Earth is likely to experience 00:30:05.260 |
that humans do to themselves or an external event, 00:30:09.580 |
And, but eventually, and if none of that happens, 00:30:39.340 |
And so if you think about, like, the current situation, 00:30:43.220 |
it's really remarkable, and kind of hard to believe, 00:30:50.100 |
and this is the first time in 4 1/2 billion years 00:30:52.340 |
that it's been possible to extend life beyond Earth. 00:30:55.700 |
And that window of opportunity may be open for a long time, 00:30:58.820 |
and I hope it is, but it also may be open for a short time. 00:31:13.780 |
- Yeah, the existence of nuclear weapons, pandemics, 00:31:17.820 |
all kinds of threats should kind of give us some motivation. 00:31:26.800 |
could die with a bang or a whimper, you know, 00:31:37.260 |
But if it's World War III, it's more of a bang. 00:31:42.120 |
I mean, it's important to think of these things 00:31:44.540 |
and just, you know, think of things as probabilities, 00:31:48.620 |
There's a certain probability that something bad 00:31:55.540 |
But there's, like, let's say for argument's sake, 00:31:59.420 |
a 1% chance per century of a civilization ending event. 00:32:10.420 |
So then, you know, we should basically think of this 00:32:19.500 |
just like taking out insurance for life itself. 00:32:23.200 |
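Taking the for-argument's-sake figure above literally, a minimal sketch of how that small per-century risk compounds:

```python
# Compounding the "1% chance per century" figure used for argument's sake.
p_per_century = 0.01

for centuries in (1, 10, 50, 100):
    p_survive = (1 - p_per_century) ** centuries
    print(f"{centuries:>3} centuries: {1 - p_survive:.1%} cumulative chance "
          f"of a civilization-ending event")
# Even a small per-century risk becomes large over long horizons, which is
# the "insurance for life itself" argument for becoming multi-planetary.
```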
- Wow, it's turned into an infomercial real quick. 00:32:31.540 |
And, you know, we can bring the creatures from, 00:32:35.700 |
you know, plants and animals from Earth to Mars 00:32:52.260 |
when the sun expands anyway, and then that'll be it. 00:32:56.180 |
- What do you think is the most difficult aspect 00:32:58.520 |
of building a civilization on Mars, terraforming Mars, 00:33:03.720 |
from a financial perspective, human perspective, 00:33:28.420 |
Like the ones that go to Mars, we need them back. 00:33:32.620 |
But we can't just not have the spaceships come back. 00:33:38.740 |
- I mean, do you think about the terraforming aspect, 00:33:40.660 |
like actually building, are you so focused right now 00:33:43.180 |
on the spaceships part that's so critical to get to Mars? 00:34:04.860 |
and the launch and everything, you need like heat shield, 00:34:15.020 |
So like rough approximation would be a billion dollars 00:34:29.080 |
So we need to improve that by at least a factor of 1,000. 00:34:38.460 |
- Yes, ideally much less than a million a ton. 00:34:44.260 |
you ever say like, well, how much can society afford 00:35:05.300 |
even if the spaceships from Earth stop coming 00:35:07.700 |
for any reason, doesn't matter what the reason is, 00:35:13.700 |
And if there's even one critical ingredient missing, 00:35:22.580 |
it's only a matter of time, you're gonna die. 00:35:30.820 |
I'm not sure this will really happen in my lifetime, 00:35:33.860 |
but I hope to see it at least have a lot of momentum. 00:35:50.420 |
'cause you have to set up a lot of infrastructure on Mars. 00:36:00.980 |
you can't be missing, like you need semiconductor fabs, 00:36:15.700 |
but it's definitely a fixer-upper of a planet. 00:36:20.220 |
- Earth is pretty good. - Earth is like easy. 00:36:22.180 |
- And also we should clarify in the solar system. 00:36:29.780 |
- There might be some great planets out there, 00:36:41.980 |
So you did mention physics as the first starting point. 00:36:55.700 |
by humans to travel faster than the speed of light? 00:37:21.380 |
Like so, you can only move at the speed of light 00:37:26.380 |
through space, but if you can make space itself move, 00:37:34.520 |
Space is capable of moving faster than the speed of light. 00:38:03.100 |
- So all the work you've done with propulsion, 00:38:05.060 |
how much innovation is possible with rocket propulsion? 00:38:11.140 |
and you're constantly innovating in every aspect. 00:38:24.620 |
- Well, as I was saying, really the Holy Grail 00:38:27.940 |
is a fully and rapidly reusable orbital system. 00:38:36.500 |
the Falcon 9 is the only reusable rocket out there. 00:39:07.860 |
So, that upper stage is at least $10 million. 00:39:15.460 |
the booster is not as rapidly and completely reusable 00:39:27.140 |
is on the order of $15 to $20 million, maybe. 00:39:38.040 |
it's by far better than any rocket ever in history. 00:39:54.400 |
like imagine if you had an aircraft or something, 00:40:08.320 |
But in fact, you just refuel the car or recharge the car, 00:40:15.480 |
like, I don't know, a thousand times cheaper. 00:40:30.560 |
and have to throw even any significant part of it away, 00:40:40.320 |
could do a cost per launch of like a million, 00:41:00.720 |
versus like some kind of brilliant breakthrough 00:41:11.240 |
This is an extremely difficult engineering problem. 00:41:19.320 |
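A rough sketch of the cost gap being described, using the approximate figures from the conversation (the mid-point of the $15-20 million marginal cost is my own simplification):

```python
# Rough sketch with the approximate figures mentioned above. Falcon 9 expends
# its upper stage every flight, which puts a floor on marginal cost, while a
# fully and rapidly reusable vehicle would mainly pay for propellant and ops.
falcon9_upper_stage = 10_000_000     # expended each flight (at least this much)
falcon9_marginal_cost = 17_500_000   # assumed mid-point of the $15-20M estimate
full_reuse_target = 1_000_000        # aspirational cost per launch with full reuse

print(f"Falcon 9 marginal cost vs. full-reuse target: "
      f"{falcon9_marginal_cost / full_reuse_target:.0f}x")
# Throwing away nothing significant is what makes that reduction conceivable;
# it's a grind of many small engineering wins, not one brilliant breakthrough.
```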
Let me ask a slightly philosophical fun question. 00:41:29.960 |
political system do you think would work best 00:41:36.580 |
I mean, the interesting reason to talk about this stuff, 00:41:56.960 |
and an opportunity to rethink the whole nature of government 00:41:59.640 |
just as was done in the creation of the United States. 00:42:02.680 |
So, I mean, I would suggest having direct democracy, 00:42:25.120 |
and a coercion of the politicians and that kind of thing. 00:42:29.580 |
So I'd recommend that there's just direct democracy. 00:42:46.880 |
- Yeah, and then keeping a well-informed populace, 00:42:49.920 |
like really being transparent about all the information, 00:42:54.800 |
- Yeah, and not make it as annoying as those cookies 00:42:59.720 |
There's always a slight amount of trepidation 00:43:05.960 |
I feel as though there's perhaps a very tiny chance 00:43:09.040 |
that'll open a portal to hell or something like that. 00:43:13.800 |
- Why do they, why do they keep wanting me to accept it? 00:43:24.000 |
It's so annoying to keep accepting all these cookies. 00:43:28.960 |
- Yes, you can have my damn cookie, I don't care, whatever. 00:43:32.400 |
- Heard it from Elon first, he accepts all your damn cookies. 00:43:50.200 |
- Yeah, it's somebody who has some good intentions 00:43:52.520 |
of privacy or whatever, but now everyone just has 00:43:56.120 |
to accept cookies and it's not, you have billions 00:43:59.000 |
of people who have to keep clicking accept cookie, 00:44:11.280 |
like a world war or something like that in a while, 00:44:14.320 |
and obviously we'd like to not have world wars, 00:44:31.120 |
So World Wars I and II, there were huge resets 00:44:39.400 |
and there's no cleansing function or garbage collection 00:44:41.720 |
for rules and regulations, then rules and regulations 00:44:43.840 |
will accumulate every year 'cause they're immortal. 00:44:46.440 |
There's no actual, humans die, but the laws don't. 00:44:59.000 |
that are put in place will be counterproductive. 00:45:01.920 |
Done with good intentions, but counterproductive. 00:45:05.600 |
So if rules and regulations just accumulate every year 00:45:12.360 |
then eventually you won't be able to do anything. 00:45:24.560 |
like basically all economies that have been around 00:45:36.560 |
but they don't put effort into removing them. 00:45:38.680 |
And I think that's very important that we put effort 00:45:42.080 |
But it gets tough 'cause you get special interests 00:45:59.360 |
with the constitution is it's kind of like C versus Java 00:46:04.120 |
'cause it doesn't have any garbage collection built in. 00:46:07.920 |
when you first said that the metaphor of garbage collection, 00:46:14.320 |
It would be interesting if the laws themselves 00:46:20.720 |
unless somebody explicitly publicly defends them. 00:46:23.600 |
So that's sort of, it's not like somebody has to kill them, 00:46:43.760 |
or just the civilization's arteries just harden over time 00:46:50.840 |
because there's just a rule against everything. 00:46:53.200 |
So I think, I don't know, for Mars or whatever, 00:46:57.920 |
I'd say, or even for, obviously for Earth as well, 00:47:00.240 |
like I think there should be an active process 00:47:11.560 |
'cause rules and regulations can also think of as like 00:47:21.320 |
So it's not like we shouldn't have rules and regulations, 00:47:23.000 |
but you have code accumulation, but no code removal. 00:47:32.280 |
And it's just, it makes it hard for things to progress. 00:47:37.920 |
So I don't know, maybe Mars, you'd have like, 00:47:57.040 |
Ultimately, it will be up to the people on Mars to decide. 00:48:00.680 |
But I think it should be easier to remove a law 00:48:15.240 |
you need like say 60% vote to have a law take effect, 00:48:36.320 |
- I mean, that's happened to me so many times. 00:48:43.480 |
there's any room for ideas of smart contracts or so on? 00:48:49.320 |
That's an interesting use of things like smart contracts 00:48:53.000 |
to implement the laws by which governments function. 00:48:58.920 |
or maybe a Dogecoin that enables smart contracts somehow. 00:49:03.920 |
- I don't quite understand this whole smart contract thing. 00:49:09.960 |
I mean, I'm too dumb to understand smart contracts. 00:49:16.000 |
- I mean, my general approach to any kind of deal 00:49:25.800 |
And just keep any kind of deal very short and simple, 00:49:29.640 |
plain language, and just make sure everyone understands 00:49:41.360 |
But usually deals are, business deals or whatever, 00:49:47.200 |
are way too long and complex and overly layered 00:49:52.720 |
- You mentioned that Doge is the people's coin. 00:49:57.880 |
- And you said that you were literally going, 00:49:59.640 |
SpaceX may consider literally putting a Dogecoin 00:50:13.640 |
we've talked about political systems on Mars, 00:50:16.080 |
that Dogecoin is the official currency of Mars 00:50:27.040 |
because you can't synchronize due to speed of light. 00:50:32.680 |
- So it must be completely standalone from Earth. 00:50:36.480 |
- Well, yeah, 'cause Mars, at closest approach, 00:50:45.000 |
it's roughly 20 light minutes away, maybe a little more. 00:50:48.480 |
So you can't really have something synchronizing 00:50:52.480 |
if you've got a 20 minute speed of light issue, 00:51:00.000 |
So Mars would, I don't know if Mars would have 00:51:07.640 |
But it would be some kind of localized thing on Mars. 00:51:16.520 |
The future of Mars should be up to the Martians. 00:51:25.800 |
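For reference, a small sketch of the one-way light delay that drives the "can't synchronize with Earth" point; the distances are standard approximate values I'm supplying, not figures from the conversation.

```python
# One-way light travel time between Earth and Mars at the extremes of their
# relative distance. Distances are standard approximate values.
C_KM_PER_S = 299_792.458
closest_km = 54_600_000      # rare close approach
farthest_km = 401_000_000    # near conjunction, on opposite sides of the Sun

for label, d in (("closest", closest_km), ("farthest", farthest_km)):
    minutes = d / C_KM_PER_S / 60
    print(f"{label}: ~{minutes:.1f} light-minutes one way")
# Roughly 3 minutes at best and over 20 minutes at worst -- far too slow for
# a currency or database on Mars to synchronize transactions with Earth.
```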
is an interesting approach to reducing the error 00:51:42.920 |
of what money actually is on a practical day-to-day basis 00:51:59.960 |
is really a bunch of heterogeneous mainframes 00:52:26.040 |
And banks are still buying mainframes in 2021 00:52:32.040 |
And the Federal Reserve is probably even older 00:52:41.080 |
And so the government effectively has editing privileges 00:52:56.420 |
And this increases the error in the database that is money. 00:53:04.580 |
And so it's kind of like an internet connection. 00:53:25.460 |
I think that's probably the right way to think of it. 00:53:35.180 |
And crypto is an attempt to reduce the error in money 00:54:05.740 |
like cryptocurrency takes us into the 21st century 00:54:14.200 |
- Like I said, just think of money as information. 00:54:35.020 |
of thinking about things in the limit is helpful. 00:54:39.980 |
and you have a trillion dollars, it's useless. 00:54:52.660 |
There's no resource to allocate except yourself. 00:54:56.060 |
If you're stranded on a desert island with no food, 00:55:04.100 |
all the Bitcoin in the world will not stop you from starving. 00:55:20.820 |
for resource allocation across time and space. 00:55:24.000 |
In what form should that database or data system, 00:55:45.020 |
in that the transaction volume is very limited. 00:55:48.740 |
And the latency for a properly confirmed transaction 00:56:19.940 |
or an accounting of relative obligations, I suppose. 00:56:34.340 |
- Yeah, Lightning Network and the Layer 2 technologies 00:56:40.860 |
but the point is, it's kind of brilliant to say 00:56:48.060 |
that exchange of information. - Yeah, so say like 00:56:55.300 |
for the efficient, to have efficient value ratios 00:57:01.380 |
So you've got this massive number of products and services, 00:57:12.100 |
a ratio of exchange between goods and services. 00:57:26.620 |
debt and equity shift obligations across time. 00:57:37.060 |
even though it was obviously created as a joke, 00:57:56.020 |
Like right now, if you want to do a Bitcoin transaction, 00:57:58.260 |
the price of doing that transaction is very high. 00:58:00.460 |
So you could not use it effectively for most things. 00:58:04.380 |
And nor could it even scale to a high volume. 00:58:16.220 |
the internet connections were much worse than they are today. 00:58:26.860 |
So like having a small block size or whatever is, 00:58:45.380 |
And I think there's some value to having a linear increase 00:59:07.820 |
if a currency is expected to increase in value over time, 00:59:15.180 |
'Cause you're like, oh, I'll just hold it and not spend it 00:59:18.500 |
because it's scarcity is increasing with time. 00:59:20.580 |
So if I spend it now, then I will regret spending it. 00:59:26.020 |
But if there's some dilution of the currency occurring 00:59:41.860 |
or hash strings that are generated every year. 00:59:49.620 |
So there's some inflation, but it's not percentage-based. 01:00:00.580 |
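A minimal sketch of that point, with an assumed fixed annual issuance and starting supply; both numbers are my own illustrative assumptions, not exact figures.

```python
# Illustrating fixed absolute issuance vs. percentage-based inflation.
new_coins_per_year = 5_000_000_000      # assumed fixed number of new coins per year
supply = 130_000_000_000                # assumed current circulating supply

for year in range(1, 6):
    print(f"year {year}: ~{new_coins_per_year / supply:.2%} dilution")
    supply += new_coins_per_year
# The same absolute number of new coins each year is a shrinking percentage of
# a growing supply, so the currency dilutes a little, but not at a fixed rate --
# which also softens the incentive to hoard rather than spend.
```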
So it just, I'm not saying that it's like the ideal system 01:00:09.500 |
just fundamentally better than anything else I've seen, 01:00:19.740 |
so you're not, you know, some people suggested, 01:00:37.260 |
It's an interesting kind of quirk of human history 01:00:43.580 |
that is a completely anonymous inventor or creator. 01:00:47.020 |
- Well, I mean, you can look at the evolution of ideas 01:01:26.100 |
but the evolution of ideas is pretty clear before that. 01:01:33.020 |
is probably more than anyone else responsible 01:01:41.900 |
but I'm not sure that's neither here nor there. 01:01:48.300 |
for the ideas behind Bitcoin than anyone else. 01:01:53.360 |
aren't even as important as the figures involved 01:01:56.260 |
in the evolution of ideas that led to a thing. 01:02:03.620 |
but maybe most names will be forgotten anyway. 01:02:13.680 |
- I think Shakespeare had a thing about roses and stuff, 01:02:43.680 |
Tesla Autopilot has been through an incredible journey 01:02:55.180 |
- I think that's where we first connected really 01:03:01.920 |
- The whole journey was incredible to me to watch. 01:03:07.720 |
Because I knew, well, part of it is I was at MIT 01:03:10.360 |
and I knew the difficulty of computer vision. 01:03:13.200 |
And I knew the whole, I had a lot of colleagues and friends 01:03:15.800 |
about the DARPA challenge and knew how difficult it is. 01:03:20.160 |
When I first drove a Tesla with the initial system 01:03:26.540 |
So at first when I got in, I thought there's no way 01:03:29.920 |
this car could maintain, like stay in the lane 01:03:35.900 |
So my intuition initially was that the lane keeping problem 01:03:41.760 |
- Oh, lane keeping, yeah, that's relatively easy. 01:03:48.440 |
we talked about previous is prototype versus a thing 01:03:54.400 |
over hundreds of thousands of miles and millions. 01:03:59.360 |
- We had to wrap a lot of code around the Mobileye thing. 01:04:09.680 |
Sometimes at first you kind of see what's out there 01:04:14.340 |
That was one of the boldest decisions I've seen 01:04:28.900 |
What I see now with everything, the hardware, the compute, 01:04:32.500 |
the sensors, the things I maybe care and love about most 01:04:40.040 |
with the data set selection, the whole data engine process, 01:04:50.920 |
versus the ImageNet model of computer vision, 01:05:01.340 |
And Andrej's awesome and obviously plays an important role, 01:05:04.220 |
but we have a lot of really talented people driving things. 01:05:07.760 |
And Ashok is actually the head of autopilot engineering. 01:05:17.700 |
So yeah, I'm aware that there's an incredible team 01:05:23.020 |
obviously people will give me too much credit 01:05:28.660 |
- And people should realize how much is going on 01:05:34.680 |
The Tesla Autopilot AI team is extremely talented. 01:05:40.000 |
It's like some of the smartest people in the world. 01:05:54.220 |
So you leaped in having some sort of first principles, 01:05:59.220 |
kinds of intuitions, but nobody knows how difficult 01:06:04.540 |
- Like the problem-- - I thought the self-driving 01:06:06.220 |
problem would be hard, but it was harder than I thought. 01:06:10.740 |
but it was actually way harder than even that. 01:06:14.140 |
So I mean, what it comes down to at the end of the day 01:06:22.660 |
you basically need to recreate what humans do to drive, 01:06:33.800 |
And so in order to, that's how the entire road system 01:06:37.860 |
is designed to work, with basically passive optical 01:06:48.460 |
for full self-driving to work, we have to recreate that 01:06:51.880 |
So we have to, that means cameras with advanced 01:07:02.560 |
And then it will obviously solve for full self-driving. 01:07:07.940 |
That's the only way, I don't think there's any other way. 01:07:10.260 |
- But the question is, what aspects of human nature 01:07:12.880 |
do you have to encode into the machine, right? 01:07:15.500 |
So you have to solve the perception problem, like detect, 01:07:18.740 |
and then you first, well, realize what is the perception 01:07:22.340 |
problem for driving, like all the kinds of things 01:07:27.900 |
There's, I just recently heard Andrej talk at MIT 01:07:27.900 |
about car doors, I think it was the world's greatest talk 01:07:44.420 |
So like the ontology of that, that's a perception problem. 01:07:51.620 |
And then there's the control and the planning 01:07:54.980 |
You have to figure out like what's involved in driving, 01:07:58.300 |
like especially in all the different edge cases. 01:08:00.880 |
And then, I mean, maybe you can comment on this, 01:08:06.540 |
how much game theoretic kind of stuff needs to be involved, 01:08:14.020 |
As humans, when we drive, our actions affect the world. 01:08:20.780 |
Most autonomous driving, you're usually just responding 01:08:25.220 |
to the scene as opposed to like really asserting yourself 01:08:33.140 |
- I think these sort of control logic conundrums 01:09:14.420 |
and then you have this massive bit stream in image space, 01:09:26.780 |
a massive bit stream corresponding to photons 01:09:34.580 |
that knocked off an electron in a camera sensor, 01:09:49.600 |
you've got cars and humans and lane lines and curves 01:10:08.500 |
the control problem is similar to that of a video game, 01:10:17.700 |
I wouldn't say it's trivial, it's not trivial, 01:10:25.300 |
Having an accurate vector space is very difficult. 01:10:32.140 |
- Yeah, I think we humans don't give enough respect 01:10:35.540 |
to how incredible the human perception system is, 01:10:41.500 |
to the vector space representation in our heads. 01:10:44.660 |
- Your brain is doing an incredible amount of processing 01:10:47.420 |
and giving you an image that is a very cleaned up image. 01:10:53.380 |
like you see color in the corners of your eyes, 01:10:59.420 |
like cone receptors in the peripheral vision. 01:11:02.260 |
Your eyes are painting color in the peripheral vision. 01:11:09.060 |
And your eyes also have like this blood vessels 01:11:12.280 |
and also some gnarly things, and there's a blind spot, 01:11:16.380 |
No, your brain is painting in the missing, the blind spot. 01:11:24.980 |
and look at this point and then look at this point. 01:11:30.480 |
your brain will just fill in the missing bits. 01:11:35.220 |
- Yeah. - Makes you realize all the illusions 01:11:38.020 |
and so it makes you realize just how incredible the brain is. 01:11:40.640 |
- The brain is doing crazy amount of post-processing 01:11:46.680 |
So, and then even once you get all those vision signals, 01:11:59.520 |
perhaps the weakest thing about the brain is memory. 01:12:01.920 |
So because memory is so expensive to our brain 01:12:06.700 |
your brain is trying to forget as much as possible 01:12:12.180 |
into the smallest amounts of information possible. 01:12:16.540 |
So your brain is trying to not just get to a vector space, 01:12:25.080 |
And I think like, you can sort of look inside your brain, 01:12:46.600 |
you don't have eyes in the back of your head or the side. 01:12:53.180 |
you basically have like two cameras on a slow gimbal. 01:13:07.220 |
and doing all sorts of things they shouldn't do in a car, 01:13:19.140 |
like when's the last time you looked right and left 01:13:32.420 |
and what your mind is doing is trying to still, 01:13:38.460 |
basically objects with a position and motion, 01:13:41.200 |
and then editing that down to the least amount 01:13:53.260 |
or compress it even further into things like concepts. 01:13:57.660 |
the human mind seems to go sometimes beyond vector space 01:14:01.260 |
to sort of space of concepts to where you'll see a thing. 01:14:05.100 |
It's no longer represented spatially somehow. 01:14:07.540 |
It's almost like a concept that you should be aware of. 01:14:17.500 |
you don't need to fully represent those things. 01:14:27.740 |
and then actually have predictions for those vector spaces. 01:14:57.420 |
you saw that there were some kids about to cross the road 01:15:05.940 |
okay, those kids are probably gonna pass by the truck 01:15:08.980 |
and cross the road, even though you cannot see them. 01:15:15.920 |
you have to need to remember that there were kids there 01:15:30.820 |
even when it just walks behind a tree and reappears, 01:15:37.100 |
it's tracking through occlusions, it's very difficult. 01:15:45.380 |
like the same thing happens with humans as with neural nets. 01:15:45.380 |
and you put it behind your back and you pop it out, 01:16:02.640 |
if they don't, before they have object permanence, 01:16:05.860 |
It's like, whoa, this toy went, poof, disappeared, 01:16:08.260 |
and now it's back again, and they can't believe it. 01:16:18.220 |
then they realize, oh no, the object is not gone, 01:16:21.760 |
- Sometimes I wish we never did figure out object permanence. 01:16:47.460 |
like how long do you want to remember things for? 01:16:48.940 |
And there's a cost to remembering things for a long time. 01:17:03.540 |
And then you also need things that are remembered over time. 01:17:10.640 |
for argument's sake, five seconds of memory on a time basis, 01:17:14.580 |
but like, let's say you're parked at a light, 01:17:25.200 |
and you can't quite see them because of an occlusion, 01:17:30.560 |
before the light changes for them to cross the road. 01:17:33.160 |
You still need to remember that that's where they were, 01:17:36.920 |
and that they're probably going to cross the road 01:17:40.580 |
So even if that exceeds your time-based memory, 01:17:47.020 |
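A toy sketch of that idea, purely illustrative and not Tesla's implementation: tracked objects persist through occlusions, ordinary tracks expire on a time basis, and safety-critical ones are held longer.

```python
# Toy illustration only -- not Tesla's software. Tracks persist briefly through
# occlusion; safety-critical classes (e.g., pedestrians) are kept much longer.
from dataclasses import dataclass

@dataclass
class Track:
    kind: str           # "pedestrian", "car", ...
    last_seen_s: float  # timestamp of the last direct observation

DEFAULT_MEMORY_S = 5.0      # ordinary time-based memory window
CRITICAL_MEMORY_S = 60.0    # hold pedestrians near the road much longer

def still_remembered(track: Track, now_s: float) -> bool:
    horizon = CRITICAL_MEMORY_S if track.kind == "pedestrian" else DEFAULT_MEMORY_S
    return now_s - track.last_seen_s <= horizon

kids = Track(kind="pedestrian", last_seen_s=0.0)
print(still_remembered(kids, now_s=12.0))   # True: kept beyond the 5 s window
```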
- And I just think the data engine side of that, 01:17:50.520 |
so getting the data to learn all of the concepts 01:17:53.520 |
that you're saying now is an incredible process. 01:18:05.400 |
- Okay, I'm sure it'll be equally as Rick and Morty-like. 01:18:13.760 |
the neural nets in the cars so many times, it's crazy. 01:18:18.000 |
- Oh, so every time there's a new major version, 01:18:20.040 |
you'll rename it to something more ridiculous, 01:18:32.520 |
that are operating in the car, it kind of boggles the mind. 01:18:48.880 |
that were basically image recognition on a single frame 01:18:53.880 |
from a single camera, and then trying to net those together 01:19:04.360 |
I should say we're really primarily running C here, 01:19:15.800 |
and are continuing to optimize our C compiler 01:19:32.640 |
'cause there's all kinds of compute, there's CPU, GPU, 01:19:37.360 |
and you have to somehow figure out the scheduling 01:19:43.480 |
So that's why there's a lot of people involved. 01:19:46.800 |
- There's a lot of hardcore software engineering 01:19:57.240 |
that's constrained to our full self-driving computer. 01:20:02.240 |
And we wanna try to have the highest frames per second 01:20:26.040 |
by some very talented software engineers at Tesla 01:20:38.920 |
which are basically doing matrix math dot products, 01:20:49.560 |
It's like, compute-wise, like 99% dot products. 01:20:57.040 |
- And you wanna achieve as many high frame rates, 01:21:16.000 |
we're moving towards now is no post-processing 01:21:20.560 |
of the image through the image signal processor. 01:21:40.280 |
And so we don't care about pictures looking pretty. 01:21:56.360 |
than what you'd see if you represented it on a camera. 01:22:02.560 |
you can see that there's a small photon count difference 01:22:08.800 |
which means that, so it can see in the dark incredibly well 01:22:16.160 |
Like much better than you could possibly imagine. 01:22:19.840 |
So, and then we also save 13 milliseconds on a latency. 01:22:28.280 |
- From removing the post-processing in the image? 01:22:42.000 |
maybe 1.6 milliseconds of latency for each camera. 01:22:56.320 |
gets us back 13 milliseconds of latency, which is important. 01:23:09.080 |
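A quick back-of-the-envelope on that saving; the camera count is my assumption of the standard eight-camera setup.

```python
# Back-of-the-envelope for the latency saved by skipping the image signal
# processor's post-processing. Assumes the standard eight-camera setup.
per_camera_ms = 1.6
cameras = 8
print(f"~{per_camera_ms * cameras:.1f} ms saved per frame across all cameras")
# ~12.8 ms, i.e. roughly the 13 milliseconds mentioned above.
```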
go through the various neural nets and the C code. 01:23:13.320 |
And there's a little bit of C++ there as well. 01:23:32.000 |
to accelerate, the brakes to slow down, the steering, 01:23:41.800 |
And like some of these controllers have an update frequency 01:23:44.120 |
that's maybe 10 Hertz or something like that, which is slow. 01:23:47.280 |
That's like, now you lose a hundred milliseconds potentially. 01:23:58.720 |
to have more like a hundred Hertz instead of 10 Hertz. 01:24:04.400 |
instead of a hundred milliseconds, worst case latency. 01:24:06.400 |
And actually jitter is more of a challenge than latency. 01:24:09.520 |
'Cause latency is like, you can anticipate and predict, 01:24:22.120 |
If you have a stack up of tolerances, of timing tolerances, 01:24:52.400 |
So you can make like robust control decisions. 01:24:56.940 |
Yeah, so the jitter is in the sensor information 01:25:01.600 |
or the jitter can occur at any stage in the pipeline. 01:25:04.960 |
- You can, if you have just, if you have fixed latency, 01:25:11.920 |
we know that our information is for argument's sake, 01:25:30.820 |
So then you can say, okay, well, we're gonna, 01:25:44.240 |
However, if you've got then 150 milliseconds of latency 01:25:52.520 |
so then your latency could be from 150 to 250 milliseconds. 01:25:59.940 |
So getting rid of jitter is extremely important. 01:26:07.900 |
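A small combined sketch of why jitter hurts more than fixed latency, using the 150 ms / 100 ms example and the 10 Hz controller figure from the conversation; combining them into one worst case is my own simplification.

```python
# Fixed latency can be compensated by predicting where the car and other
# objects will be; jitter cannot, so control has to plan for the worst case.
fixed_latency_ms = 150
jitter_ms = 100           # uncertainty on top of the fixed latency
controller_hz = 10        # a slow 10 Hz actuator interface adds up to one period
controller_worst_ms = 1000 / controller_hz

worst_case_ms = fixed_latency_ms + jitter_ms + controller_worst_ms
print(f"Effective latency range: {fixed_latency_ms}-{fixed_latency_ms + jitter_ms} ms")
print(f"Worst case with a 10 Hz controller: ~{worst_case_ms:.0f} ms")
# Raising the controller rate to 100 Hz cuts its contribution from ~100 ms
# to ~10 ms, and removing jitter lets the planner predict instead of pad.
```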
- Yeah, the car's just gonna fundamentally maneuver better 01:26:13.760 |
- And the cars will maneuver with superhuman ability 01:26:36.400 |
- That's exactly what I was imagining in my mind, 01:26:39.280 |
- It's like impossible maneuvers that a human couldn't do. 01:26:45.040 |
- Well, let me ask sort of looking back the six years, 01:26:51.800 |
how hard do you think this full self-driving problem, 01:26:55.360 |
when do you think Tesla will solve level four FSD? 01:27:07.000 |
Is it the current pool of FSD beta candidates, 01:27:15.720 |
and then there's a certain level beyond which 01:27:26.480 |
anybody who's been following the full self-driving beta 01:27:29.320 |
closely will see that the rate of disengagements 01:27:37.400 |
So, like, a disengagement being where the driver intervenes 01:27:37.400 |
it happens next year, is that the probability 01:28:04.200 |
of an accident on FSD is less than that of the average human, 01:28:09.200 |
and then significantly less than that of the average human. 01:28:13.920 |
So it certainly appears like we will get there next year. 01:28:18.920 |
Then of course, then there's gonna be a case of, 01:28:24.760 |
okay, well, we now have to prove this to regulators 01:28:34.240 |
I think it's gotta be at least two or three times 01:28:39.000 |
So two or three times lower probability of injury 01:28:41.360 |
than a human before we would actually say like, 01:28:45.920 |
It's not gonna be equivalent, it's gonna be much better. 01:28:48.360 |
- So if you look at 10 point, FSD 10.6 just came out 01:28:55.440 |
Maybe 11 is on the way somewhere in the future. 01:28:58.440 |
- Yeah, we were hoping to get 11 out this year, 01:29:04.880 |
of fundamental rewrites on the neural net architecture, 01:29:10.840 |
and some fundamental improvements in creating vector space. 01:29:40.540 |
neural net architecture changes that will allow 01:29:51.120 |
So like we have this working on like sort of alpha software 01:29:54.680 |
and it's good, but it's basically taking a whole bunch 01:30:04.600 |
of C++ code and replacing it with a neural net. 01:30:09.160 |
which is like neural nets are kind of eating software. 01:30:12.500 |
Over time there's like less and less conventional software, 01:30:15.880 |
more and more neural net, which is still software, 01:30:18.260 |
but it's, still comes out the lines of software, 01:30:21.100 |
but more neural net stuff and less heuristics basically. 01:30:40.120 |
And, you know, like one of the big changes will be, 01:30:55.900 |
a giant bag of points to the C++ or C and C++ code. 01:31:16.960 |
Then you've got to assemble this giant bag of points 01:31:28.820 |
but it's, we want to just, we need another layer 01:31:34.240 |
of neural nets on top of that to take the giant bag 01:31:37.420 |
of points and distill that down to vector space 01:31:45.320 |
as opposed to the heuristics part of the software. 01:31:50.080 |
- Neural nets all the way down, is what you want. 01:31:55.200 |
but it's, this will be just a, this is a game changer 01:31:58.920 |
to not have the bag of points, the giant bag of points 01:32:02.800 |
that has to be assembled with many lines of C++ 01:32:12.200 |
So the neural net is outputting much, much less data. 01:32:17.200 |
It's outputting, this is a lane line, this is a curb, 01:32:24.480 |
this is a pedestrian or cyclist or something like that. 01:32:29.240 |
It's outputting proper vectors to the C++ control code 01:32:34.240 |
as opposed to the sort of constructing the vectors in C. 01:32:52.240 |
but it's, we're kind of hitting a local maximum 01:33:10.280 |
And all of the training needs to move to surround video 01:33:18.760 |
And then we need to move everything to raw photon counts 01:33:31.440 |
'cause the system's trained on post-processed images. 01:33:43.960 |
- So ultimately it's kind of reducing the complexity 01:33:56.240 |
reducing the complexity of having to deal with-- 01:34:07.520 |
- Yeah, well actually you need to incorporate sound as well 01:34:11.240 |
'cause you need to listen for ambulance sirens 01:34:14.600 |
or fire trucks, if somebody like yelling at you 01:34:28.960 |
- Honestly, frankly, the ideas are the easy thing 01:34:35.080 |
Like the idea of going to the moon is the easy part. 01:34:42.720 |
that's gotta get done at the hardware and software level, 01:34:53.960 |
If we don't do this, the system will not work properly. 01:35:05.560 |
but they are critical to the success of the situation. 01:35:11.640 |
everything that's going on outside of what Andre's doing. 01:35:15.360 |
Just the whole infrastructure of the software. 01:35:17.880 |
I mean, everything that's going on with Data Engine, 01:35:20.040 |
whatever it's called, the whole process is just work of art. 01:35:26.800 |
Like the training, the amount of work done with, 01:35:30.680 |
for training and labeling and to do auto-labeling. 01:35:36.320 |
'Cause especially when you've got surround video, 01:35:41.200 |
it's very difficult to label surround video from scratch. 01:35:50.320 |
to even label one video clip, like several hours. 01:35:57.400 |
like heavy duty, like a lot of compute to the video clips 01:36:02.400 |
to pre-assign and guess what all the things are 01:36:10.360 |
- Yeah, and then all the human has to do is like tweak, 01:36:10.360 |
This is like increases productivity by a hundred or more. 01:36:25.160 |
First of all, I think humanoid robots are incredible. 01:36:28.320 |
From a fan of robotics, I think the elegance of movement 01:36:32.360 |
that humanoid robots, that bipedal robots show 01:36:38.000 |
So it's really interesting that you're working on this 01:36:40.760 |
and also talking about applying the same kind of, 01:36:43.240 |
all the ideas of some of which we've talked about 01:36:45.440 |
with data engine, all the things that we're talking about 01:36:47.920 |
with Tesla autopilot, just transferring that over 01:36:54.560 |
I have to ask, since I care about human-robot interactions, 01:36:59.200 |
so you've talked about mostly in the factory. 01:37:01.600 |
Do you see part of this problem that Tesla Bot has to solve 01:37:06.040 |
is interacting with humans and potentially having a place 01:37:10.520 |
So interacting, not just not replacing labor, 01:37:18.000 |
- Yeah, yeah, I think the possibilities are endless. 01:37:29.760 |
it's not quite in Tesla's primary mission direction 01:37:42.120 |
that is capable of interacting with the world 01:37:59.280 |
it's like, I think work will become optional. 01:38:16.480 |
it's like, eh, even if you really like washing dishes, 01:38:19.720 |
you really wanna do it for eight hours a day every day. 01:38:29.320 |
has like potential for repetitive stress injury, 01:38:36.600 |
humanoid robots would add the most value initially. 01:38:53.480 |
with some kind of universal basic income in the future. 01:38:59.220 |
- So do you see a world when there's like hundreds 01:39:07.480 |
performing different tasks throughout the world? 01:39:14.240 |
but I guess that there may be something like that. 01:39:21.720 |
So the number of Tesla cars has been accelerating, 01:39:33.800 |
when there'll be more Tesla bots than Tesla cars? 01:39:39.460 |
- Yeah, actually it's funny you asked this question 01:39:44.240 |
because normally I do try to think pretty far 01:39:46.720 |
into the future, but I haven't really thought 01:39:59.320 |
Because it's not like a giant transformer robot. 01:40:04.400 |
So, but it's meant to be a general purpose, helpful bot. 01:40:33.520 |
and like a lot of hardcore low-level software 01:40:38.040 |
to have it run efficiently and be power efficient, 01:40:43.480 |
if you've got a gigantic server room with 10,000 computers, 01:40:45.800 |
but now let's say you have to now distill that down 01:40:48.680 |
into one computer that's running at low power 01:40:54.480 |
A lot of hardcore software work is required for that. 01:41:02.840 |
navigate the real world with neural nets problem for cars, 01:41:07.840 |
which are kind of like robots with four wheels, 01:41:10.080 |
then it's like kind of a natural extension of that 01:41:12.480 |
is to put it in a robot with arms and legs and actuators. 01:41:33.120 |
to interact in a sensible way with the environment. 01:41:38.800 |
and you need to be very good at manufacturing, 01:41:52.200 |
basically means developing custom motors and sensors 01:42:00.760 |
that are different from what a car would use. 01:42:10.320 |
in developing advanced electric motors and power electronics. 01:42:15.200 |
So, it just has to be for a humanoid robot application 01:42:25.720 |
So, let me ask, this isn't like for like sex robots 01:42:28.600 |
or something like that. - Love is the answer. 01:42:35.640 |
not compelling, but we connect with humanoid robots 01:42:45.360 |
there's a huge amount of loneliness in this world. 01:42:48.200 |
All of us seek companionship with other humans, 01:42:52.720 |
We have a lot of here in Austin, a lot of people have dogs. 01:43:11.560 |
Do you think about that, would test about it all 01:43:16.920 |
of performing specific tasks, not connecting with humans? 01:43:21.120 |
- I mean, to be honest, I have not actually thought about it 01:43:47.400 |
And that personality could evolve to be, you know, 01:43:53.400 |
match the owner or the, you know, I guess, the owner. 01:44:11.080 |
'Cause, you know, like there's a Japanese phrase, 01:44:17.480 |
I like, wabi-sabi, you know, the subtle imperfections 01:44:17.480 |
And the subtle imperfections of the personality of the robot 01:44:36.480 |
but could actually make an incredible buddy, basically. 01:44:43.640 |
- Like R2D2 or like a C3PO sort of thing, you know? 01:44:49.920 |
I think the flaws being a feature is really nice. 01:44:56.160 |
for quite a while in the general home environment 01:44:58.920 |
or in the general world, and that's kind of adorable. 01:45:06.720 |
So it's very different than autonomous driving, 01:45:13.200 |
And so it's more fun to be a robot in the home. 01:45:33.960 |
- I think they definitely added a lot to the story. 01:45:45.480 |
It was like it made them relatable, I don't know, 01:45:51.680 |
So yeah, I think that that could be something 01:45:57.320 |
But our initial focus is just to make it useful. 01:46:11.240 |
a decent prototype towards the end of next year 01:46:15.520 |
- And it's cool that it's connected to Tesla, the car. 01:46:22.640 |
it would use the autopilot inference computer, 01:46:25.240 |
and a lot of the training that we've done for cars 01:46:35.800 |
But there's a lot of custom actuators and sensors 01:46:42.480 |
- And an extra module on top of the vector space for love. 01:46:57.440 |
Like you said, a lot of people argue in the car, 01:47:03.560 |
fan of Dan Carlin's Hardcore History podcast. 01:47:21.120 |
He said you guys went military and all that kind of stuff. 01:47:36.640 |
then engineering plays a pivotal role in victory and battle. 01:47:47.280 |
- It was mostly, well, it was supposed to be a deep dive 01:47:50.760 |
on fighters and bomber technology in World War II, 01:47:55.360 |
but that ended up being more wide ranging than that. 01:48:00.400 |
of studying all of the fighters and bombers in World War II 01:48:12.240 |
and that country would make a plane to beat that, 01:48:15.520 |
And really what matters is the pace of innovation 01:48:18.560 |
and also access to high quality fuel and raw materials. 01:48:31.520 |
And they had a real problem with the oil and fuel, basically. 01:48:42.360 |
- Yeah, the US had kick-ass fuel that was very consistent. 01:48:47.240 |
The problem is if you make a very high performance 01:48:48.840 |
aircraft engine, in order to make high performance, 01:49:14.080 |
And Germany just never had good access to oil. 01:49:16.280 |
Like they tried to get it by invading the Caucasus, 01:49:27.560 |
So Germany was always struggling with basically shitty oil. 01:49:31.000 |
And so then they couldn't count on high quality fuel 01:49:48.280 |
So that allowed the British and the Americans 01:49:51.160 |
to design aircraft engines that were super high performance, 01:50:01.040 |
And then also the quality of the aluminum alloys 01:50:05.800 |
that they were getting was also not that great. 01:50:09.280 |
- Is this like, you talked about all this with Dan? 01:50:13.640 |
Broadly looking at history, when you look at Genghis Khan, 01:50:23.920 |
Does it help you gain insight about human nature, 01:50:30.520 |
or just the behavior of people, any aspects of history? 01:50:43.020 |
I mean, there's just a lot of incredible things 01:50:52.960 |
that they help you understand the nature of civilization. 01:51:21.640 |
I mean, if you, like there's a lot of human history. 01:51:25.320 |
Most of it is actually people just getting on 01:51:36.280 |
Those are actually just, those are intermittent and rare. 01:51:38.920 |
And if they weren't, then humans would soon cease to exist. 01:51:43.260 |
But it's just that wars tend to be written about a lot, 01:51:54.200 |
well, a normal year where nothing major happened 01:51:58.480 |
But that's, you know, most people just like farming 01:52:01.160 |
and kind of like living their life, you know? 01:52:25.920 |
But the book about Stalin, "The Court of the Red Tsar," 01:52:25.920 |
The '30s, there's a lot of lessons there to me, 01:52:48.360 |
like all of us have that, it's the old Solzhenitsyn line, 01:53:01.200 |
that all of us have to tend towards the good. 01:53:19.400 |
to do evil onto each other, onto your family, onto others. 01:53:23.400 |
And so it's like our responsibility to do good. 01:53:26.720 |
It's not like now is somehow different from history. 01:53:29.700 |
That can happen again, all of it can happen again. 01:53:42.100 |
the quality of life was way worse back in the day, 01:54:10.380 |
died of the plague, starvation, freezing to death, 01:54:27.680 |
the primary goal of most people throughout history, 01:54:32.840 |
to last through the winter and not freeze or whatever. 01:54:43.560 |
- Well, yeah, the lesson there is to be grateful 01:54:58.480 |
If I sat down for a long-form in-person conversation 01:55:03.740 |
with the president of Russia, Vladimir Putin, 01:55:06.960 |
would you potentially want to call in for a few minutes 01:55:17.400 |
- You've shown interest in the Russian language. 01:55:23.520 |
in history of linguistics, culture, general curiosity? 01:55:37.580 |
Once you know what the Cyrillic characters stand for, 01:55:43.140 |
actually, then reading Russian becomes a lot easier 01:55:47.100 |
'cause there are a lot of words that are actually the same. 01:55:52.840 |
- So find the words that are exactly the same 01:55:57.740 |
and now you start to understand Cyrillic, yeah. 01:56:24.140 |
So do you draw inspiration from that history? 01:56:30.540 |
I mean, one of the sad things is because of the language, 01:56:35.380 |
because it's not translated, all those kinds of, 01:56:37.480 |
because it is in some ways an isolated culture. 01:56:47.780 |
from the history of science and engineering there? 01:56:51.500 |
- I mean, the Soviet Union, Russia, and Ukraine as well, 01:56:56.500 |
and have a really strong history in space flight. 01:57:01.780 |
Like some of the most advanced and impressive things 01:57:16.260 |
and one cannot help but admire the impressive 01:57:38.060 |
that it was happening before the Soviet Union 01:57:46.740 |
- Yeah, I mean, there's Roscosmos, the Russian agency. 01:57:51.420 |
I look forward to a time when those countries 01:58:00.020 |
Maybe a little bit of friendly competition, but-- 01:58:06.300 |
and the only thing slower than one government 01:58:18.060 |
And people wouldn't try hard to run fast and stuff. 01:58:22.380 |
So I think friendly competition is a good thing. 01:58:24.780 |
- This is also a good place to give a shout out 01:58:28.120 |
to a video titled "The Entire Soviet Rocket Engine 01:58:30.780 |
Family Tree" by Tim Dodd, aka Everyday Astronaut. 01:58:53.940 |
If you're interested in anything to do with space, 01:58:56.220 |
he's, in terms of explaining rocket technology 01:59:03.220 |
And I should say, like, the part of the reason 01:59:12.180 |
Raptor at one point was gonna be a hydrogen engine. 01:59:12.180 |
or Soviet Union, Russia, and Ukraine, primarily, 01:59:43.700 |
were actually in the process of switching to methalox. 01:59:48.700 |
And there was some interesting test and data for ISP. 01:59:57.300 |
up to like a 380 second ISP with a methalox engine. 02:00:05.420 |
So I think you could actually get a much lower cost, 02:00:24.340 |
And I was partly inspired by the Russian work 02:00:31.480 |
- And now for something completely different. 02:00:38.340 |
in the spirit of the great, the powerful PewDiePie? 02:01:05.220 |
- So you get it because he likes impaling things? 02:01:15.860 |
This is ground in some engineering, some history. 02:01:35.740 |
in a place that is not subject to extreme natural disasters, 02:01:41.780 |
I think nuclear power is a great way to generate electricity. 02:01:59.900 |
there's a lot of fear of radiation and stuff. 02:02:02.620 |
And I guess the problem is a lot of people just don't, 02:02:15.820 |
So they can't calibrate what radiation means. 02:02:19.120 |
But radiation is much less dangerous than you'd think. 02:02:43.620 |
if they should worry about radiation from Fukushima. 02:03:09.020 |
And I actually, I donated a solar power system 02:03:15.300 |
And I made a point of eating locally grown vegetables 02:03:31.500 |
- So it's not even that the risk of these events is low, 02:03:40.060 |
- It's people, people don't know what radiation is. 02:03:55.460 |
let's say photons, what frequency or wavelength? 02:04:01.820 |
Like, do you know that everything's radiating all the time? 02:04:06.020 |
Like, yeah, everything's radiating all the time. 02:04:17.420 |
to stand in front of nuclear fire, go outside. 02:04:32.540 |
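The "everything is radiating all the time" remark is standard blackbody physics: any object above absolute zero emits thermal radiation, and Wien's displacement law gives the wavelength where that emission peaks. A minimal sketch using the standard constant:

```python
# Wien's displacement law: peak emission wavelength of a blackbody, lambda = b / T.
WIEN_B = 2.897771955e-3   # Wien's displacement constant, m*K

def peak_wavelength_um(temperature_kelvin: float) -> float:
    """Peak emission wavelength in micrometers for a blackbody at the given temperature."""
    return WIEN_B / temperature_kelvin * 1e6

# A human body (~310 K), room-temperature walls (~293 K), and the Sun (~5778 K).
for label, temp_k in [("human body", 310.0), ("room walls", 293.0), ("Sun", 5778.0)]:
    print(f"{label:>10}: ~{peak_wavelength_um(temp_k):.2f} micrometers")
# human body: ~9.35 um (infrared), room walls: ~9.89 um, Sun: ~0.50 um (visible light)
```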
- Yeah, I guess radiation is one of the words 02:04:42.420 |
- I mean, that's the way to fight that fear, I suppose, 02:04:51.620 |
And say how many people have died from coal plants, 02:04:59.380 |
So, like, obviously we should not be starting up coal plants 02:05:24.340 |
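One hedged way to "share the data" is to compare estimated deaths per unit of electricity generated. The figures below are approximate, commonly cited order-of-magnitude estimates (for example, Our World in Data's safest-energy-sources numbers, which combine accident and air-pollution deaths); they are illustrative only, are not from the conversation, and should be checked against current sources.

```python
# Rough comparison of mortality per unit of electricity generated.
# Approximate, order-of-magnitude values as commonly cited; verify before reuse.
DEATHS_PER_TWH_APPROX = {
    "coal": 24.6,
    "oil": 18.4,
    "natural gas": 2.8,
    "hydro": 1.3,
    "nuclear": 0.03,
    "wind": 0.04,
    "solar": 0.02,
}

for source, rate in sorted(DEATHS_PER_TWH_APPROX.items(), key=lambda kv: -kv[1]):
    print(f"{source:>12}: ~{rate:g} deaths per TWh")
# On these estimates, coal is roughly three orders of magnitude deadlier
# per terawatt-hour than nuclear.
```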
everybody loves the math, nobody gives a shit about 270. 02:05:44.300 |
- The United States oscillating between establishing 02:05:51.540 |
Yeah, it's a seven out of 10, it's kinda true. 02:06:02.020 |
- Or it's like referring to Laika or something? 02:06:11.500 |
And then the last one is him with his eyes closed 02:06:19.140 |
- They don't tell you the full story of, you know, 02:06:28.300 |
- Oh yeah, this keeps going on the Russian theme. 02:06:31.680 |
First man in space, nobody cares, first man on the moon. 02:06:38.940 |
- Yuri Gagarin's name will be forever in history, I think. 02:06:48.000 |
like, stepping foot onto another totally foreign land. 02:06:52.820 |
It's not the journey, like, people that explore the oceans. 02:07:06.260 |
Oh yeah, I'd love to get your comment on this. 02:07:08.100 |
Elon Musk, after sending $6.6 billion to the UN 02:07:31.860 |
Like, we don't have a caloric constraint at this point. 02:07:35.840 |
So, where there is hunger, it is almost always due to, 02:07:47.360 |
extremely rare for it to be just a matter of, like, 02:07:59.180 |
literally trying to starve the other part of the country. 02:08:11.340 |
it's money, monetary systems, all that kind of stuff. 02:08:22.340 |
among low-income families, obesity is actually 02:08:31.020 |
It's like too much, you know, too many calories. 02:09:08.340 |
these historical artifacts from all around the world 02:09:16.020 |
So, it is a convenient place to see these ancient artifacts 02:09:32.700 |
- It's like you wanna make these historical artifacts 02:09:37.140 |
And the British Museum, I think, does a good job of that. 02:09:44.860 |
whatever the empire is, however things were done, 02:09:52.060 |
You can't sort of erase that history, unfortunately. 02:10:00.860 |
how are we gonna pass moral judgment on these things? 02:10:03.980 |
Like, it's like, you know, if one is gonna judge, 02:10:09.500 |
say, the British Empire, you gotta judge, you know, 02:10:13.500 |
and how were the British relative to everyone. 02:10:20.020 |
like a relatively good grade, relatively good grade, 02:10:24.180 |
but compared to what everyone else was doing, 02:10:56.060 |
you're basically gonna give everyone a failing grade. 02:10:59.860 |
I don't think anyone would get a passing grade 02:11:02.380 |
in their morality of like, you go back 300 years ago, 02:11:26.820 |
The Life of Brian and the Quest of the Holy Grail 02:11:39.740 |
Is that, how does that affect your leadership? 02:12:02.220 |
- I like this, like Shakespearean analysis of memes. 02:12:11.380 |
- Yeah, yeah, it must come from the eyebrows. 02:12:44.260 |
- I mean, when was the cheeseburger invented? 02:12:59.900 |
And then you start getting, is it a pizza sandwich? 02:13:05.980 |
- Yeah, but everybody knows if you order a burger 02:13:08.860 |
and you get tomato and some lettuce and onions 02:13:11.460 |
and whatever and mayo and ketchup and mustard, 02:13:16.380 |
- Yeah, but I'm sure they've had bread and meat separately 02:13:21.140 |
on the same plate, but somebody who actually combined them 02:13:23.660 |
into the same thing and you bite it and hold it 02:13:30.260 |
Like, your hands don't get dirty and whatever. 02:13:41.180 |
- But everyone knows, like, if you order a cheeseburger, 02:13:53.420 |
I mean, they're the devil, but fries are awesome. 02:14:06.620 |
- What about the Matthew McConaughey Austinite here? 02:14:11.980 |
President Kennedy, do you know how to put men on the moon 02:14:16.020 |
President Kennedy, it'd be a lot cooler if you did. 02:14:18.820 |
- Pretty much, sure, six, six or seven, I suppose. 02:14:30.620 |
- Someone drew a bunch of dicks all over the walls, 02:14:38.460 |
- This is our highest ranking meme for today. 02:14:42.420 |
- I mean, it's true, like, how do they get away with that? 02:14:46.540 |
- Dick pics are, I mean, just something throughout history. 02:14:49.500 |
As long as people can draw things, there's been a dick pic. 02:15:12.620 |
- Full-on stand-up, is that in there, or is that-- 02:15:16.620 |
- It's extremely difficult, at least that's what 02:15:29.500 |
- You know, I have done stand-up for friends, 02:15:32.700 |
just impromptu, you know, I'll get on like a roof, 02:15:37.700 |
and they do laugh, but they're our friends too. 02:15:41.340 |
So I don't know if you've got to call, you know, 02:15:44.060 |
like a room of strangers, are they gonna actually 02:15:46.220 |
also find it funny, but I could try, see what happens. 02:15:54.900 |
- I love both when you bomb and when you do great, 02:16:02.140 |
It's so difficult, you're so fragile up there. 02:16:06.940 |
It's just you, and you think you're gonna be funny, 02:16:11.300 |
it's just beautiful to see people deal with that. 02:16:15.380 |
- Yeah, I might have enough material to do stand-up. 02:16:30.220 |
- What's your favorite Rick and Morty concept? 02:16:36.700 |
There's a lot of sort of scientific engineering ideas 02:16:48.540 |
from an alternate dimension showed up there, Elon Tusk. 02:17:16.220 |
it's like you don't wanna have like super intelligence 02:17:22.540 |
if we're talking about from the engineering perspective 02:17:25.100 |
of super intelligence, like with Marvin the robot, 02:17:28.420 |
like is it, it seems like it might be very easy 02:17:46.700 |
If you don't do a good job on building a robot, 02:17:56.620 |
So I guess if you let it evolve without tinkering 02:18:20.340 |
If we think about young people in high school, 02:18:23.460 |
maybe in college, what advice would you give to them 02:18:27.860 |
about if they want to try to do something big in this world, 02:18:31.660 |
they want to really have a big positive impact, 02:18:33.580 |
what advice would you give them about their career, 02:18:40.620 |
You do things that are useful to your fellow human beings, 02:18:49.740 |
You know, are you contributing more than you consume? 02:18:57.140 |
You know, like can you try to have a positive net 02:19:17.020 |
A lot of time, the people you want as leaders 02:19:29.300 |
that is a good life, a life worth having lived. 02:19:36.180 |
You know, and like I said, I would encourage people 02:19:49.900 |
- When you think about education and self-education, 02:19:53.820 |
So, there's the university, there's self-study, 02:19:58.740 |
there is hands-on sort of finding a company or a place 02:20:03.140 |
or a set of people that do the thing you're passionate about 02:20:07.420 |
There's taking a road trip across Europe for a few years 02:20:16.500 |
in terms of learning about how you can become useful, 02:20:22.580 |
as you mentioned, how you can have the most positive impact? 02:20:25.580 |
- Well, I'd encourage people to read a lot of books. 02:20:42.000 |
And try to also just develop a good general knowledge. 02:20:45.880 |
So, you at least have, like, a rough lay of the land 02:20:53.080 |
Like, try to learn a little bit about a lot of things. 02:20:57.060 |
'Cause you might not know what you're really interested in. 02:20:59.800 |
How would you know what you're really interested in 02:21:01.040 |
if you at least aren't, like, doing a peripheral exploration 02:21:10.140 |
And talk to people from different walks of life 02:21:14.940 |
and different industries and professions and skills 02:21:28.700 |
- Isn't the whole thing a search for meaning? 02:21:33.100 |
- Yeah, what's the meaning of life and all, you know. 02:21:42.540 |
And then try to find something where there's an overlap 02:21:48.340 |
of your talents and what you're interested in. 02:21:53.540 |
or they may have skill at a particular thing, 02:21:57.740 |
So, you want to try to find a thing where you're, 02:22:06.660 |
that you're inherently good at, but you also like doing. 02:22:09.780 |
- And reading is a super fast shortcut to figure out 02:22:16.500 |
which, where are you both good at it, you like doing it, 02:22:23.260 |
- Well, you gotta learn about things somehow. 02:22:24.900 |
So, reading, a broad range, just really read it. 02:22:42.140 |
I didn't even know existed, well, a lot, so obviously. 02:22:57.420 |
of the Encyclopedia Britannica, I'd recommend that. 02:23:07.620 |
So, read the encyclopedia or skim through it. 02:23:11.500 |
But I put a lot of stock and certainly have a lot of respect 02:23:24.940 |
And just generally to have not a zero-sum mindset, 02:23:47.420 |
like doing things that seem morally questionable, 02:24:00.100 |
they don't realize they have a zero-sum mindset, 02:24:02.820 |
or at least they don't realize it consciously. 02:24:21.420 |
has grown dramatically over time, the economic pie. 02:24:37.380 |
So, you really wanna make sure you're not operating 02:24:41.740 |
without realizing it from a zero-sum mindset, 02:24:49.540 |
trying to take things from others, which is not good. 02:24:52.020 |
It's much better to work on adding to the economic pie. 02:25:02.060 |
creating more than you consume, doing more than you, yeah. 02:25:09.340 |
I think there's a fair number of people in finance 02:25:38.940 |
I mean, like the resources become less scarce, 02:25:43.940 |
and that applies in a lot of kinds of domains. 02:25:46.740 |
It applies in academia, where a lot of people are very, 02:25:49.820 |
see funding for academic research as zero-sum. 02:25:49.820 |
What is the role of love in the human condition, 02:26:21.720 |
made you a better person, a better human being? 02:26:28.080 |
- Now you're asking really perplexing questions. 02:26:31.660 |
It's hard to give a, I mean, there are many books, 02:27:06.420 |
so many inspiring things, like be useful in the world, 02:27:08.420 |
sort of like solve problems, alleviate suffering, 02:27:11.760 |
but it seems like connection between humans is a source, 02:27:15.520 |
you know, it's a source of joy, it's a source of meaning, 02:27:21.980 |
love, I just wonder if you think about that kind of thing, 02:27:40.260 |
if we're just alone and conscious and intelligent, 02:27:44.380 |
it doesn't mean nearly as much as if we're with others, 02:27:48.180 |
right, and there's some magic created when we're together. 02:27:52.060 |
The friendship of it, and I think the highest form of it 02:27:56.180 |
is love, which I think broadly is much bigger 02:27:59.340 |
than just sort of romantic, but also, yes, romantic love 02:28:06.100 |
- Well, I mean, the reason I guess I care about 02:28:32.980 |
the human history, all the people who's ever lived, 02:28:35.020 |
all the people alive now, it's pretty, we're okay. 02:28:41.740 |
On the whole, we're a pretty interesting bunch. 02:29:01.860 |
What do you think is the meaning of this whole thing? 02:29:08.780 |
- Yeah, well, really, I think what Douglas Adams 02:29:11.260 |
was saying in "The Hitchhiker's Guide to the Galaxy" 02:29:25.580 |
and that the question is really the hard part, 02:29:30.500 |
then the answer, relatively speaking, is easy. 02:29:42.260 |
we need to expand the scope and scale of consciousness 02:29:58.540 |
- Thereby elevating the role of the interviewer. 02:30:15.260 |
But yeah, it's like that is the foundation of my philosophy 02:30:18.140 |
is that I am curious about the nature of the universe, 02:30:22.700 |
and obviously I will die, I don't know when I'll die, 02:30:30.960 |
but I would like to know that we're on a path 02:30:40.740 |
and so if we expand the scope and scale of humanity 02:30:47.900 |
then that seems like a fundamentally good thing. 02:30:58.980 |
that you would spend your extremely valuable time 02:31:04.260 |
millions of people hope in this difficult time, 02:31:11.300 |
so I hope you do continue doing what you're doing. 02:31:18.700 |
- Thanks for listening to this conversation with Elon Musk. 02:31:22.700 |
please check out our sponsors in the description, 02:31:29.540 |
When something is important enough, you do it even if the odds are not in your favor. 02:31:29.540 |
Thank you for listening, and hope to see you next time.