
Elon Musk: SpaceX, Mars, Tesla Autopilot, Self-Driving, Robotics, and AI | Lex Fridman Podcast #252


Chapters

0:00 Introduction
0:07 Elon singing
0:55 SpaceX human spaceflight
7:40 Starship
16:16 Quitting is not in my nature
17:51 Thinking process
27:25 Humans on Mars
32:55 Colonizing Mars
36:41 Wormholes
41:19 Forms of government on Mars
48:22 Smart contracts
49:52 Dogecoin
51:24 Cryptocurrency and Money
57:33 Bitcoin vs Dogecoin
60:16 Satoshi Nakamoto
62:38 Tesla Autopilot
65:44 Tesla Self-Driving
77:48 Neural networks
86:44 When will Tesla solve self-driving?
88:48 Tesla FSD v11
96:21 Tesla Bot
107:01 History
114:52 Putin
120:32 Meme Review
134:58 Stand-up comedy
136:31 Rick and Morty
138:10 Advice for young people
146:08 Love
149:01 Meaning of life

Whisper Transcript

00:00:00.000 | "The following is a conversation with Elon Musk,
00:00:02.540 | "his third time on this The Lex Friedman Podcast."
00:00:07.080 | Yeah, make yourself comfortable.
00:00:08.520 | - Boo.
00:00:09.360 | - Oh, wow, okay.
00:00:10.200 | (laughing)
00:00:11.120 | - You don't do the headphones thing?
00:00:12.880 | - No.
00:00:13.700 | - Okay.
00:00:14.540 | I mean, how close do I need to get to this thing?
00:00:15.920 | - The closer you are, the sexier you sound.
00:00:17.680 | - Hey, babe.
00:00:18.520 | - Yep.
00:00:19.340 | - Can't get enough of the oil, baby.
00:00:21.000 | (laughing)
00:00:22.960 | - I'm gonna clip that out
00:00:24.760 | anytime somebody messages me about it.
00:00:26.760 | - You want my body and you think I'm sexy.
00:00:29.960 | ♪ Come right out and tell me so ♪
00:00:33.960 | (humming)
00:00:36.120 | - So good.
00:00:39.760 | Okay, serious mode activate.
00:00:41.920 | All right.
00:00:42.760 | - Serious mode.
00:00:44.400 | Come on, you're Russian, you can be serious.
00:00:45.760 | - Yeah, I know.
00:00:46.600 | - Everyone's serious all the time in Russia.
00:00:47.720 | - Yeah.
00:00:48.560 | (laughing)
00:00:50.080 | We'll get there, we'll get there.
00:00:51.920 | - Yeah.
00:00:52.760 | - I've gotten soft.
00:00:53.600 | Allow me to say that the SpaceX launch
00:00:57.760 | of human beings to orbit on May 30th, 2020
00:01:00.920 | was seen by many as the first step
00:01:03.880 | in a new era of human space exploration.
00:01:07.400 | These human spaceflight missions were a beacon of hope
00:01:10.600 | to me and to millions over the past two years
00:01:12.720 | as our world has been going through
00:01:14.640 | one of the most difficult periods in recent human history.
00:01:18.040 | We saw, we see the rise of division, fear, cynicism,
00:01:21.720 | and the loss of common humanity
00:01:24.600 | right when it is needed most.
00:01:26.360 | So first, Elon, let me say thank you
00:01:29.040 | for giving the world hope and reason
00:01:30.600 | to be excited about the future.
00:01:32.480 | - Oh, it's kind of you to say.
00:01:34.160 | I do want to do that.
00:01:35.440 | Humanity has obviously a lot of issues
00:01:38.280 | and people at times do bad things,
00:01:42.600 | but despite all that, I love humanity
00:01:47.200 | and I think we should make sure we do everything
00:01:52.200 | we can to have a good future and an exciting future
00:01:54.600 | and one that maximizes the happiness of the people.
00:01:58.320 | - Let me ask about Crew Dragon Demo 2,
00:02:00.960 | so that first flight with humans on board.
00:02:04.360 | How did you feel leading up to that launch?
00:02:06.160 | Were you scared?
00:02:07.240 | Were you excited?
00:02:08.240 | What was going through your mind?
00:02:09.640 | So much was at stake.
00:02:11.360 | - Yeah, no, that was extremely stressful, no question.
00:02:17.320 | We obviously could not let them down in any way.
00:02:21.720 | So extremely stressful, I'd say, to say the least.
00:02:26.720 | I was confident that at the time that we launched
00:02:30.920 | that no one could think of anything at all to do
00:02:35.920 | that would improve the probability of success.
00:02:39.080 | And we racked our brains to think of any possible way
00:02:43.360 | to improve the probability of success.
00:02:44.840 | We could not think of anything more and nor could NASA.
00:02:48.080 | And so that's just the best that we could do.
00:02:51.480 | So then we went ahead and launched.
00:02:55.720 | Now I'm not a religious person,
00:02:57.840 | but I nonetheless got on my knees and prayed
00:03:01.640 | for that mission.
00:03:02.480 | - Were you able to sleep?
00:03:05.840 | - No.
00:03:06.680 | - How did it feel when it was a success?
00:03:10.800 | First when the launch was a success
00:03:12.760 | and when they returned back home or back to Earth?
00:03:15.600 | - It was a great relief.
00:03:18.160 | Yeah.
00:03:22.080 | For high-stress situations,
00:03:23.480 | I find it's not so much elation as relief.
00:03:26.040 | And I think once, as we got more comfortable
00:03:33.480 | and proved out the systems, 'cause we really,
00:03:36.500 | you gotta make sure everything works.
00:03:41.280 | It was definitely a lot more enjoyable
00:03:43.200 | with the subsequent astronaut missions.
00:03:46.520 | And I thought the Inspiration mission
00:03:49.840 | was actually very inspiring, the Inspiration 4 mission.
00:03:53.960 | I'd encourage people to watch
00:03:55.080 | the Inspiration documentary on Netflix.
00:03:57.560 | It's actually really good.
00:03:58.860 | And it really is, I was actually inspired by that.
00:04:03.360 | And so that one, I felt I was kind of able
00:04:07.760 | to enjoy the actual mission
00:04:09.320 | and not just be super stressed all the time.
00:04:10.880 | - So for people that somehow don't know,
00:04:13.760 | it's the all-civilian, first time all-civilian
00:04:17.240 | out to space, out to orbit.
00:04:19.400 | - Yeah, and it was, I think, the highest orbit
00:04:22.480 | that in like, I don't know, 30 or 40 years or something.
00:04:26.120 | The only one that was higher was the one shuttle,
00:04:29.520 | sorry, a Hubble servicing mission.
00:04:32.160 | And then before that, it would have been Apollo in '72.
00:04:35.980 | It's pretty wild.
00:04:39.180 | So it's cool.
00:04:40.480 | I think as a species, we want to be continuing
00:04:46.640 | to do better and reach higher ground.
00:04:49.720 | And I think it would be tragic, extremely tragic,
00:04:52.880 | if Apollo was the high-water mark for humanity,
00:04:57.040 | and that's as far as we ever got.
00:04:58.800 | And it's concerning that here we are,
00:05:04.040 | 49 years after the last mission to the moon,
00:05:09.040 | and so almost half a century, and we've not been back.
00:05:14.040 | And that's worrying.
00:05:16.280 | It's like, does that mean we've peaked
00:05:19.360 | as a civilization or what?
00:05:20.960 | So I think we've got to get back to the moon
00:05:24.480 | and build a base there, a science base.
00:05:27.160 | I think we could learn a lot about the nature
00:05:28.880 | of the universe if we have a proper science base
00:05:31.120 | on the moon.
00:05:31.940 | Like we have a science base in Antarctica
00:05:35.520 | and many other parts of the world.
00:05:38.080 | And so that's, I think, the next big thing.
00:05:41.880 | We've got to have a serious moon base
00:05:45.680 | and then get people to Mars and get out there
00:05:50.000 | and be a space-faring civilization.
00:05:52.600 | - I'll ask you about some of those details,
00:05:54.480 | but since you're so busy with the hard engineering
00:05:57.520 | challenges of everything that's involved,
00:06:00.440 | are you still able to marvel at the magic of it all,
00:06:03.000 | of space travel, of every time the rocket goes up,
00:06:05.880 | especially when it's a crewed mission?
00:06:08.140 | Or are you just so overwhelmed with all the challenges
00:06:11.440 | that you have to solve?
00:06:13.480 | - And actually, sort of to add to that,
00:06:16.240 | the reason I wanted to ask this question of May 30th,
00:06:19.200 | it's been some time, so you can look back
00:06:21.800 | and think about the impact already.
00:06:23.480 | It's already, at the time, it was an engineering problem,
00:06:26.320 | maybe, now it's becoming a historic moment.
00:06:29.280 | Like it's a moment that, how many moments
00:06:31.600 | will be remembered about the 21st century?
00:06:33.680 | To me, that or something like that, maybe Inspiration 4,
00:06:38.440 | one of those will be remembered as the early steps
00:06:40.720 | of a new age of space exploration.
00:06:44.180 | - Yeah, I mean, during the launches itself,
00:06:46.620 | so I mean, the thing I think maybe some people know,
00:06:49.100 | but a lot of people don't know,
00:06:50.180 | is I'm actually the chief engineer of SpaceX,
00:06:52.100 | so I've signed off on pretty much
00:06:56.660 | all the design decisions.
00:06:58.000 | And so if there's something that goes wrong
00:07:03.100 | with that vehicle, it's fundamentally my fault.
00:07:09.780 | So I'm really just thinking about all the things that,
00:07:13.780 | like so when I see the rocket,
00:07:15.580 | I see all the things that could go wrong
00:07:17.300 | and the things that could be better,
00:07:18.500 | and the same with the Dragon spacecraft.
00:07:21.500 | It's like, other people see, oh, this is a spacecraft
00:07:25.120 | or a rocket, and this looks really cool.
00:07:27.020 | I'm like, I have like a readout of like,
00:07:29.580 | this is the, these are the risks,
00:07:31.200 | these are the problems, that's what I see.
00:07:33.820 | Like, ch-ch-ch-ch-ch-ch-ch.
00:07:35.220 | (laughing)
00:07:36.300 | So it's not what other people see
00:07:38.820 | when they see the product, you know?
00:07:40.420 | - So let me ask you then to analyze Starship
00:07:43.020 | in that same way.
00:07:44.340 | I know you have, you'll talk about,
00:07:46.500 | in more detail about Starship in the near future, perhaps.
00:07:49.580 | - Yeah, we can talk about it now if you want.
00:07:51.820 | - But just in that same way, like you said,
00:07:54.140 | you see, when you see a rocket,
00:07:57.620 | you see a sort of a list of risks.
00:07:59.140 | In that same way, you said that Starship
00:08:01.060 | is a really hard problem.
00:08:03.260 | So there's many ways I can ask this,
00:08:05.460 | but if you magically could solve one problem
00:08:08.640 | perfectly, one engineering problem perfectly,
00:08:11.740 | which one would it be?
00:08:13.020 | - On Starship?
00:08:14.140 | - On, sorry, on Starship.
00:08:15.540 | So is it maybe related to the efficiency of the engine,
00:08:20.060 | the weight of the different components,
00:08:21.380 | the complexity of various things,
00:08:22.700 | maybe the controls of the crazy thing it has to do to land?
00:08:26.740 | - No, it's actually, by far,
00:08:28.900 | the biggest thing absorbing my time is engine production.
00:08:33.740 | Not the design of the engine,
00:08:36.500 | but I've often said prototypes are easy,
00:08:41.500 | production is hard.
00:08:42.840 | So we have the most advanced rocket engine
00:08:48.680 | that's ever been designed.
00:08:50.240 | 'Cause I say currently the best rocket engine ever
00:08:55.240 | is probably the RD-180 or RD-170,
00:08:58.580 | that's the Russian engine, basically.
00:09:03.700 | And still, I think an engine should only count
00:09:06.780 | if it's gotten something to orbit.
00:09:09.020 | So our engine has not gotten anything to orbit yet,
00:09:11.920 | but it's the first engine that's actually better
00:09:16.980 | than the Russian RD engines, which were amazing design.
00:09:21.980 | - So you're talking about Raptor engine,
00:09:24.140 | what makes it amazing?
00:09:25.500 | What are the different aspects of it that make it,
00:09:29.260 | like what are you the most excited about
00:09:31.940 | if the whole thing works in terms of efficiency,
00:09:35.080 | all those kinds of things?
00:09:37.120 | - Well, the Raptor is a full-flow staged combustion engine
00:09:42.120 | and it's operating at a very high chamber pressure.
00:09:50.840 | So one of the key figures of merit,
00:09:52.640 | perhaps the key figure of merit is
00:09:55.640 | what is the chamber pressure
00:09:58.680 | at which the rocket engine can operate?
00:10:00.800 | That's the combustion chamber pressure.
00:10:03.060 | So Raptor is designed to operate at 300 bar,
00:10:06.940 | possibly maybe higher, that's 300 atmospheres.
00:10:10.320 | So the record right now for operational engine
00:10:15.320 | is the RD engine that I mentioned, the Russian RD,
00:10:17.860 | which is, I believe, around 267 bar.
00:10:21.280 | And the difficulty of the chamber pressure
00:10:26.140 | increases on a nonlinear basis.
00:10:27.860 | So 10% more chamber pressure is more like 50% more difficult.
00:10:32.860 | But that chamber pressure is,
00:10:38.640 | that is what allows you to get a very high power density
00:10:42.940 | for the engine.
00:10:45.240 | So enabling a very high thrust to weight ratio
00:10:51.140 | and a very high specific impulse.
00:10:57.020 | So specific impulse is like a measure
00:10:59.020 | of the efficiency of a rocket engine.
00:11:01.660 | It's really the effective exhaust velocity
00:11:06.660 | of the gas coming out of the engine.
00:11:10.120 | So with a very high chamber pressure,
00:11:17.140 | you can have a compact engine
00:11:21.580 | that nonetheless has a high expansion ratio,
00:11:23.980 | which is the ratio between the exit nozzle and the throat.
00:11:28.980 | So you know, engines got,
00:11:32.860 | like you see a rocket engine's got like,
00:11:33.980 | sort of like a hourglass shape.
00:11:35.980 | It's like a chamber and then it necks down
00:11:38.140 | and there's a nozzle.
00:11:39.700 | And the ratio of the exit diameter
00:11:41.860 | to the throat is the expansion ratio.
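A minimal sketch of the figures of merit just described: specific impulse as effective exhaust velocity divided by standard gravity, and nozzle expansion ratio (conventionally the exit-to-throat area ratio). The engine numbers below are illustrative assumptions, not Raptor or RD-180 figures.

```python
# Rocket-engine figures of merit discussed above. Numbers are illustrative
# assumptions only, not actual Raptor or RD-180 data.

G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse_s(effective_exhaust_velocity_m_s: float) -> float:
    """Specific impulse (seconds) = effective exhaust velocity / g0."""
    return effective_exhaust_velocity_m_s / G0

def expansion_ratio(exit_diameter_m: float, throat_diameter_m: float) -> float:
    """Nozzle expansion ratio: exit area divided by throat area."""
    return (exit_diameter_m / throat_diameter_m) ** 2

# Hypothetical engine: ~3,300 m/s effective exhaust velocity,
# 2.4 m exit diameter over a 0.6 m throat.
print(round(specific_impulse_s(3300), 1))   # ~336.5 s
print(round(expansion_ratio(2.4, 0.6), 1))  # 16.0
```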
00:11:45.940 | - So why is it such a hard engine to manufacture at scale?
00:11:49.660 | - It's very complex.
00:11:53.100 | - So what does complexity mean here?
00:11:55.180 | There's a lot of components involved.
00:11:56.740 | - There's a lot of components
00:11:58.620 | and a lot of unique materials.
00:12:02.020 | So we had to invent several alloys
00:12:07.020 | that don't exist in order to make this engine work.
00:12:09.560 | - So it's a materials problem too.
00:12:13.020 | - It's a materials problem.
00:12:16.220 | And in a staged combustion,
00:12:19.740 | a full flow staged combustion,
00:12:21.220 | there are many feedback loops in the system.
00:12:24.180 | So basically you've got propellants
00:12:29.180 | and hot gas flowing simultaneously
00:12:33.060 | to so many different places on the engine.
00:12:38.100 | And they all have a recursive effect on each other.
00:12:43.780 | So you change one thing here,
00:12:44.900 | it has a recursive effect here,
00:12:46.100 | changes something over there.
00:12:47.500 | And it's quite hard to control.
00:12:51.480 | Like there's a reason no one's made this before.
00:12:55.060 | And the reason we're doing a staged combustion full flow
00:13:03.860 | is because it has the highest
00:13:06.700 | theoretical possible efficiency.
00:13:12.740 | So in order to make a fully reusable rocket,
00:13:17.740 | which that's the really the holy grail of orbital rocketry,
00:13:23.200 | you have to have, everything's gotta be the best.
00:13:28.420 | It's gotta be the best engine,
00:13:30.080 | the best airframe, the best heat shield,
00:13:32.100 | extremely light avionics,
00:13:35.920 | very clever control mechanisms.
00:13:40.620 | You've got to shed mass in any possible way that you can.
00:13:45.100 | For example, we are, instead of putting landing legs
00:13:47.620 | on the booster and ship,
00:13:48.660 | we are gonna catch them with a tower
00:13:50.740 | to save the weight of the landing legs.
00:13:53.220 | So that's like, I mean, we're talking about catching
00:13:58.220 | the largest flying object ever made
00:14:00.180 | on a giant tower with chopstick arms.
00:14:06.660 | It's like Karate Kid with the fly, but much bigger.
00:14:10.500 | (laughing)
00:14:12.060 | - I mean, pulling something like that--
00:14:12.900 | - This probably won't work the first time.
00:14:15.480 | Anyway, so this is bananas, this is banana stuff.
00:14:19.820 | - So you mentioned that you doubt, well, not you doubt,
00:14:23.100 | but there's days or moments when you doubt
00:14:26.460 | that this is even possible.
00:14:28.340 | It's so difficult.
00:14:30.140 | - The possible part is, well, at this point,
00:14:34.120 | I think we'll get Starship to work.
00:14:38.300 | There's a question of timing.
00:14:42.860 | How long will it take us to do this?
00:14:45.460 | How long will it take us to actually achieve
00:14:47.700 | full and rapid reusability?
00:14:49.060 | 'Cause it will take probably many launches
00:14:52.660 | before we're able to have full and rapid reusability.
00:14:55.680 | But I can say that the physics pencils out.
00:15:06.180 | At this point, I'd say we're confident that,
00:15:09.080 | like let's say, I'm very confident success
00:15:12.020 | is in the set of all possible outcomes.
00:15:13.980 | For a while there, I was not convinced
00:15:18.340 | that success was in the set of possible outcomes,
00:15:20.780 | which is very important, actually.
00:15:23.460 | - You're saying there's a chance.
00:15:29.700 | - I'm saying there's a chance, exactly.
00:15:33.260 | I'm just not sure how long it will take.
00:15:37.080 | We have a very talented team.
00:15:39.580 | They're working night and day to make it happen.
00:15:41.980 | And like I said, the critical thing to achieve
00:15:48.300 | for the revolution in space flight and for humanity
00:15:50.580 | to be a space-faring civilization
00:15:52.020 | is to have a fully and rapidly reusable rocket,
00:15:54.380 | orbital rocket.
00:15:55.220 | There's not even been any orbital rocket
00:15:58.820 | that's been fully reusable ever,
00:16:00.100 | and this has always been the holy grail of rocketry.
00:16:05.040 | And many smart people, very smart people,
00:16:09.700 | have tried to do this before and they've not succeeded,
00:16:12.500 | 'cause it's such a hard problem.
00:16:16.100 | - What's your source of belief in situations like this,
00:16:21.220 | when the engineering problem is so difficult?
00:16:23.580 | There's a lot of experts, many of whom you admire,
00:16:27.500 | who have failed in the past.
00:16:29.220 | - Yes.
00:16:30.060 | - And a lot of people,
00:16:34.060 | a lot of experts, maybe journalists, all the kind of,
00:16:38.900 | the public in general have a lot of doubt
00:16:40.900 | about whether it's possible.
00:16:42.820 | And you yourself know that even if it's a non-null set,
00:16:47.340 | non-empty set of success,
00:16:49.540 | it's still unlikely or very difficult.
00:16:52.140 | Like, where do you go to, both personally,
00:16:54.740 | intellectually as an engineer, as a team,
00:16:58.940 | like for source of strength needed
00:17:00.820 | to sort of persevere through this?
00:17:02.540 | And to keep going with the project, take it to completion?
00:17:06.440 | - A source of strength, hmm.
00:17:19.820 | It doesn't really matter how I think about things.
00:17:22.320 | I mean, for me, it's simply this is something
00:17:25.540 | that is important to get done.
00:17:28.100 | And we should just keep doing it or die trying.
00:17:32.500 | And I don't need a source of strength.
00:17:35.980 | - So quitting is not even like--
00:17:38.180 | - That's not in my nature.
00:17:40.900 | - Okay.
00:17:41.720 | - And I don't care about optimism or pessimism.
00:17:45.120 | Fuck that, we're gonna get it done.
00:17:47.900 | - Gonna get it done.
00:17:48.900 | Can you then zoom back in to specific problems
00:17:55.180 | with Starship or any engineering problems you work on?
00:17:58.340 | Can you try to introspect
00:18:00.060 | your particular biological neural network,
00:18:02.260 | your thinking process, and describe how you think
00:18:05.380 | through problems, the different engineering
00:18:07.180 | and design problems?
00:18:08.020 | Is there like a systematic process?
00:18:10.100 | You've spoken about first principles thinking,
00:18:11.900 | but is there a kind of process to it?
00:18:14.180 | - Well, yeah, I like saying like physics is law
00:18:19.180 | and everything else is a recommendation.
00:18:21.700 | Like I've met a lot of people that can break the law,
00:18:23.220 | but I haven't met anyone who could break physics.
00:18:25.620 | So first, for any kind of technology problem,
00:18:31.740 | you have to sort of just make sure
00:18:34.540 | you're not violating physics.
00:18:37.860 | And first principles analysis,
00:18:44.820 | I think is something that can be applied
00:18:45.980 | to really any walk of life, anything really.
00:18:49.840 | It's really just saying, let's boil something down
00:18:54.680 | to the most fundamental principles,
00:18:58.420 | the things that we are most confident are true
00:19:00.840 | at a foundational level.
00:19:02.480 | And that sets your axiomatic base,
00:19:05.040 | and then you reason up from there,
00:19:07.000 | and then you cross-check your conclusion
00:19:09.020 | against the axiomatic truths.
00:19:11.320 | So some basics in physics would be like,
00:19:18.120 | oh, you're violating conservation of energy or momentum
00:19:20.280 | or something like that, then it's not gonna work.
00:19:23.700 | So that's just to establish, is it possible?
00:19:30.760 | And another good physics tool is thinking about things
00:19:35.720 | in the limit.
00:19:36.560 | If you take a particular thing
00:19:38.360 | and you scale it to a very large number
00:19:41.680 | or to a very small number, how do things change?
00:19:45.960 | - Both in number of things you manufacture
00:19:48.920 | or something like that, and then in time?
00:19:51.640 | - Yeah, let's say take an example of manufacturing,
00:19:55.920 | which I think is just a very underrated problem.
00:19:59.140 | And like I said, it's much harder
00:20:04.520 | to take an advanced technology product
00:20:09.520 | and bring it into volume manufacturing
00:20:11.000 | than it is to design it in the first place,
00:20:13.360 | or as an attitude.
00:20:14.500 | So let's say you're trying to figure out
00:20:17.800 | is, like, why is this part or product expensive?
00:20:22.800 | Is it because of something fundamentally foolish
00:20:27.400 | that we're doing, or is it because our volume is too low?
00:20:31.320 | And so then you say, okay, well,
00:20:32.200 | what if our volume was a million units a year?
00:20:34.240 | Is it still expensive?
00:20:35.760 | That's what I mean by thinking about things in the limit.
00:20:38.140 | If it's still expensive at a million units a year,
00:20:40.140 | then volume is not the reason why your thing is expensive.
00:20:42.520 | There's something fundamental about the design.
00:20:44.700 | - And then you then can focus on reducing the complexity
00:20:47.460 | or something like that in the design?
00:20:48.300 | - You could change the design to,
00:20:49.780 | change the part to be something
00:20:51.260 | that is not fundamentally expensive.
00:20:55.580 | But that's a common thing in rocketry,
00:20:58.940 | 'cause the unit volume is relatively low,
00:21:01.820 | and so a common excuse would be,
00:21:04.100 | well, it's expensive because our unit volume is low.
00:21:06.460 | And if we were in automotive or something like that,
00:21:08.740 | or consumer electronics, then our costs would be lower.
00:21:10.900 | I'm like, okay, so let's say,
00:21:13.240 | now you're making a million units a year.
00:21:14.640 | Is it still expensive?
00:21:16.040 | If the answer is yes,
00:21:17.640 | then economies of scale are not the issue.
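A toy rendering of the "think in the limit" check described above: split unit cost into amortized fixed costs plus per-unit cost and see what survives at a hypothetical million units a year. The cost model and all numbers are made-up assumptions for illustration.

```python
# "Is it still expensive at a million units a year?" -- if yes, volume is not
# the problem; something in the design is. All numbers are invented.

def unit_cost(fixed_costs_per_year: float, variable_cost_per_unit: float,
              units_per_year: int) -> float:
    """Amortized fixed costs plus per-unit variable cost."""
    return fixed_costs_per_year / units_per_year + variable_cost_per_unit

low_volume   = unit_cost(50e6, 8_000, 500)        # fixed costs dominate
in_the_limit = unit_cost(50e6, 8_000, 1_000_000)  # approaches variable cost

print(f"at 500/yr:       ${low_volume:,.0f}")     # ~$108,000
print(f"at 1,000,000/yr: ${in_the_limit:,.0f}")   # ~$8,050 -- if that is still
                                                  # too high, fix the design
```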
00:21:22.080 | - Do you throw into manufacturing,
00:21:24.120 | do you throw supply chain?
00:21:26.040 | You talked about resources and materials
00:21:27.760 | and stuff like that.
00:21:28.600 | Do you throw that into the calculation
00:21:29.920 | of trying to reason from first principles,
00:21:31.760 | like how we're gonna make the supply chain work here?
00:21:34.600 | - Yeah, yeah.
00:21:35.720 | - And then the cost of materials, things like that,
00:21:37.800 | or is that too much?
00:21:38.920 | - Exactly, so like a good example,
00:21:41.860 | like you're thinking about things in the limit,
00:21:44.380 | is if you take any product,
00:21:47.860 | any machine or whatever,
00:21:53.100 | like take a rocket or whatever,
00:21:56.060 | and say, if you look at the raw materials in the rocket,
00:22:01.060 | so you're gonna have like aluminum, steel,
00:22:06.780 | titanium, Inconel, specialty alloys, copper,
00:22:11.780 | and you say, what's the weight of the constituent elements
00:22:18.480 | of each of these elements,
00:22:20.440 | and what is their raw material value?
00:22:22.600 | And that sets the asymptotic limit
00:22:25.720 | for how low the cost of the vehicle can be
00:22:29.480 | unless you change the materials.
00:22:31.780 | So, and then when you do that,
00:22:33.960 | I call it like maybe the magic wand number
00:22:35.760 | or something like that.
00:22:36.600 | So that would be like, if you had the,
00:22:38.740 | like just a pile of these raw materials here,
00:22:42.520 | and you could wave the magic wand
00:22:43.620 | and rearrange the atoms into the final shape,
00:22:45.880 | that would be the lowest possible cost
00:22:49.560 | that you could make this thing for
00:22:50.920 | unless you change the materials.
00:22:52.680 | So then, and that is always,
00:22:54.760 | almost always a very low number.
00:22:56.380 | So then what's actually causing things to be expensive
00:23:01.160 | is how you put the atoms into the desired shape.
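A small sketch of the "magic wand number" idea just described: weigh the constituent raw materials, multiply by commodity prices, and treat the sum as the asymptotic floor on cost unless the materials change. Masses and prices below are rough illustrative assumptions, not a real vehicle's bill of materials.

```python
# Asymptotic raw-material cost floor ("magic wand number"). All masses and
# prices are illustrative assumptions.

material_mass_kg = {
    "aluminum": 50_000,
    "steel":    80_000,
    "titanium":  2_000,
    "inconel":   1_500,
    "copper":    1_000,
}
price_per_kg_usd = {
    "aluminum": 2.5,
    "steel":    0.8,
    "titanium": 12.0,
    "inconel":  25.0,
    "copper":    9.0,
}

magic_wand_number = sum(mass * price_per_kg_usd[name]
                        for name, mass in material_mass_kg.items())
print(f"raw-material floor: ${magic_wand_number:,.0f}")  # ~$259,500
# Everything above this floor is the cost of getting the atoms into shape.
```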
00:23:05.960 | - Yeah, actually, if you don't mind me taking a tiny tangent,
00:23:08.960 | I often talk to Jim Keller,
00:23:11.400 | who's somebody that worked with you as a friend.
00:23:14.440 | - Jim was, yeah, did great work at Tesla.
00:23:17.760 | - So I suppose he carries the flame
00:23:20.480 | of the same kind of thinking that you're talking about now.
00:23:24.560 | And I guess I see that same thing at Tesla
00:23:28.600 | and SpaceX folks who work there,
00:23:31.760 | they kind of learn this way of thinking,
00:23:33.760 | and it kind of becomes obvious almost.
00:23:36.720 | But anyway, I had argument, not argument,
00:23:39.680 | he educated me about how cheap it might be
00:23:44.800 | to manufacture a Tesla bot.
00:23:46.600 | We just, we had an argument.
00:23:48.360 | How can you reduce the cost of the scale
00:23:50.520 | of producing a robot?
00:23:52.120 | Because I've gotten the chance to interact quite a bit,
00:23:55.960 | obviously in the academic circles with humanoid robots,
00:23:59.480 | and then with Boston Dynamics and stuff like that,
00:24:01.920 | and they're very expensive to build.
00:24:04.520 | And then Jim kind of schooled me on saying like,
00:24:07.600 | okay, like this kind of first principles thinking
00:24:10.080 | of how can we get the cost of manufacture down?
00:24:12.440 | I suppose you do that, you have done that kind of thinking
00:24:16.840 | for Tesla bot and for all kinds of complex systems
00:24:21.840 | that are traditionally seen as complex.
00:24:23.680 | And you say, okay, how can we simplify everything down?
00:24:27.160 | - Yeah.
00:24:28.520 | I mean, I think if you are really good at manufacturing,
00:24:31.440 | you can basically make, at high volume,
00:24:35.320 | you can basically make anything for a cost
00:24:37.520 | that asymptotically approaches the raw material value
00:24:42.040 | of the constituents, plus any intellectual property
00:24:44.600 | that you need to license.
00:24:46.520 | Anything.
00:24:47.360 | But it's hard.
00:24:50.280 | It's not like, that's a very hard thing to do,
00:24:51.840 | but it is possible for anything.
00:24:54.720 | Anything in volume can be made, like I said,
00:24:57.480 | for a cost that asymptotically approaches
00:25:00.440 | this raw material constituents,
00:25:02.800 | plus intellectual property license rights.
00:25:05.400 | So what will often happen in trying to design a product
00:25:08.520 | is people will start with the tools and parts and methods
00:25:12.440 | that they are familiar with,
00:25:14.520 | and then try to create a product
00:25:17.480 | using their existing tools and methods.
00:25:19.560 | The other way to think about it is actually imagine the,
00:25:25.080 | try to imagine the platonic ideal of the perfect product,
00:25:28.800 | or technology, whatever it might be.
00:25:31.360 | And so what is the perfect arrangement of atoms
00:25:35.680 | that would be the best possible product?
00:25:38.560 | And now let us try to figure out
00:25:39.840 | how to get the atoms in that shape.
00:25:41.600 | - I mean, it sounds,
00:25:45.620 | it's almost like Rick and Morty absurd
00:25:50.400 | until you start to really think about it,
00:25:52.080 | and you really should think about it in this way,
00:25:56.440 | 'cause everything else is kind of,
00:25:58.180 | if you think you might fall victim to the momentum
00:26:03.040 | of the way things were done in the past,
00:26:04.440 | unless you think in this way.
00:26:06.000 | - Well, just as a function of inertia,
00:26:07.680 | people will want to use the same tools and methods
00:26:10.640 | that they are familiar with.
00:26:12.040 | They just, that's what they'll do by default.
00:26:15.720 | And then that will lead to an outcome of things
00:26:18.720 | that can be made with those tools and methods,
00:26:20.480 | but is unlikely to be the platonic ideal
00:26:23.880 | of the perfect product.
00:26:25.040 | So then, so that's why,
00:26:28.480 | it's good to think of things in both directions.
00:26:30.120 | So like, what can we build with the tools that we have?
00:26:32.200 | But then, but also what is the perfect,
00:26:35.600 | the theoretical perfect product look like?
00:26:37.240 | And that theoretical perfect product
00:26:39.560 | is gonna be a moving target, 'cause as you learn more,
00:26:42.700 | the definition for that perfect product will change,
00:26:46.920 | 'cause you don't actually know what the perfect product is,
00:26:48.720 | but you can successfully approximate a more perfect product.
00:26:53.080 | So thinking about it like that, and then saying,
00:26:56.680 | okay, now, what tools, methods, materials, whatever,
00:26:59.920 | do we need to create in order to get the atoms in that shape?
00:27:04.680 | But people rarely think about it that way.
00:27:10.160 | But it's a powerful tool.
00:27:11.780 | - I should mention that the brilliant Shivon Zilis
00:27:15.800 | is hanging out with us, in case you hear a voice of wisdom
00:27:20.800 | from outside, from up above.
00:27:24.120 | Okay, so let me ask you about Mars.
00:27:28.200 | You mentioned it would be great for science
00:27:30.440 | to put a base on the moon to do some research,
00:27:35.000 | but the truly big leap, again,
00:27:38.800 | in this category of seemingly impossible,
00:27:40.880 | is to put a human being on Mars.
00:27:43.800 | When do you think SpaceX will land a human being on Mars?
00:27:46.920 | - Hmm.
00:27:51.200 | Best case is about five years, worst case, 10 years.
00:28:13.660 | (sighing)
00:28:15.820 | - What are the determining factors, would you say,
00:28:19.580 | from an engineering perspective,
00:28:21.180 | or is that not the bottlenecks?
00:28:23.040 | - You know, it's fundamentally,
00:28:25.620 | you know, engineering the vehicle.
00:28:28.960 | I mean, Starship is the most complex and advanced rocket
00:28:36.580 | that's ever been made by, I don't know,
00:28:39.620 | what, a magnitude or something like that.
00:28:40.820 | It's a lot, it's really next level, so.
00:28:43.640 | And the fundamental optimization of Starship
00:28:49.260 | is minimizing cost per ton to orbit,
00:28:51.540 | and ultimately cost per ton to the surface of Mars.
00:28:54.820 | This may seem like a mercantile objective,
00:28:56.400 | but it is actually the thing that needs to be optimized.
00:28:59.300 | Like, there is a certain cost per ton to the surface of Mars
00:29:04.060 | where we can afford to establish a self-sustaining city,
00:29:08.820 | and then above that, we cannot afford to do it.
00:29:11.620 | So right now, you can't fly to Mars for a trillion dollars,
00:29:16.780 | no amount of money could get you a ticket to Mars.
00:29:19.140 | So we need to get that, you know,
00:29:22.460 | to get that to, like, something that is actually possible
00:29:24.640 | at all.
00:29:25.480 | But then, but that's, we don't just wanna have,
00:29:32.100 | you know, with Mars, flags and footprints,
00:29:33.840 | and then not come back for a half century,
00:29:35.900 | like we did with the Moon.
00:29:37.940 | In order to pass a very important great filter,
00:29:42.940 | I think we need to be a multi-planet species.
00:29:45.700 | This may sound somewhat esoteric to a lot of people,
00:29:51.460 | but like, eventually, given enough time,
00:29:55.720 | there's something, the Earth is likely to experience
00:30:00.500 | some calamity that could be something
00:30:05.260 | that humans do to themselves or an external event,
00:30:08.040 | like happened to the dinosaurs.
00:30:09.580 | And, but eventually, and if none of that happens,
00:30:16.140 | and somehow, magically, we keep going,
00:30:21.300 | then the Sun is gradually expanding,
00:30:24.540 | and will engulf the Earth,
00:30:26.620 | and probably Earth gets too hot for life
00:30:31.060 | in about 500 million years.
00:30:34.780 | It's a long time, but that's only 10% longer
00:30:37.280 | than Earth has been around.
00:30:39.340 | And so if you think about, like, the current situation,
00:30:43.220 | it's really remarkable, and kind of hard to believe,
00:30:45.620 | but Earth's been around 4 1/2 billion years,
00:30:50.100 | and this is the first time in 4 1/2 billion years
00:30:52.340 | that it's been possible to extend life beyond Earth.
00:30:55.700 | And that window of opportunity may be open for a long time,
00:30:58.820 | and I hope it is, but it also may be open for a short time.
00:31:01.740 | And we should, I think it is wise for us to
00:31:05.660 | act quickly while the window is open,
00:31:11.020 | just in case it closes.
00:31:13.780 | - Yeah, the existence of nuclear weapons, pandemics,
00:31:17.820 | all kinds of threats should kind of give us some motivation.
00:31:22.820 | - I mean, civilization could get,
00:31:26.800 | could die with a bang or a whimper, you know,
00:31:31.500 | if it dies a demographic collapse,
00:31:35.020 | then it's more of a whimper, obviously.
00:31:37.260 | But if it's World War III, it's more of a bang.
00:31:40.700 | But these are all risks.
00:31:42.120 | I mean, it's important to think of these things
00:31:44.540 | and just, you know, think of things as probabilities,
00:31:47.060 | not certainties.
00:31:48.620 | There's a certain probability that something bad
00:31:51.460 | will happen on Earth.
00:31:53.300 | I think most likely the future will be good.
00:31:55.540 | But there's, like, let's say for argument's sake,
00:31:59.420 | a 1% chance per century of a civilization ending event.
00:32:03.580 | Like, that was Stephen Hawking's estimate.
00:32:05.680 | I think he might be right about that.
00:32:10.420 | So then, you know, we should basically think of this
00:32:15.420 | like being a multi-planet species,
00:32:19.500 | just like taking out insurance for life itself.
00:32:21.460 | Like life insurance for life.
00:32:23.200 | - Wow, it's turned into an infomercial real quick.
00:32:29.300 | - Life insurance for life, yes.
00:32:31.540 | And, you know, we can bring the creatures from,
00:32:35.700 | you know, plants and animals from Earth to Mars
00:32:37.980 | and breathe life into the planet
00:32:39.580 | and have a second planet with life.
00:32:43.680 | That would be great.
00:32:45.940 | They can't bring themselves there, you know,
00:32:47.500 | so if we don't bring them to Mars,
00:32:49.980 | then they will just for sure all die
00:32:52.260 | when the sun expands anyway, and then that'll be it.
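To put the "1% chance per century" estimate mentioned above in perspective, here is that risk compounded over many centuries, assuming the risk is independent each century (a simplifying assumption, not something stated in the conversation).

```python
# Compounding a 1%-per-century civilization-ending risk, assuming independence.

risk_per_century = 0.01

for centuries in (1, 10, 50, 100):
    survival = (1 - risk_per_century) ** centuries
    print(f"{centuries:>3} centuries: {survival:.1%} chance of no such event")
# 1: 99.0%   10: 90.4%   50: 60.5%   100: 36.6%
```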
00:32:56.180 | - What do you think is the most difficult aspect
00:32:58.520 | of building a civilization on Mars, terraforming Mars,
00:33:02.220 | like from engineering perspective,
00:33:03.720 | from a financial perspective, human perspective,
00:33:07.180 | to get a large number of folks there
00:33:12.180 | who will never return back to Earth?
00:33:15.820 | - No, they could certainly return.
00:33:17.100 | Some will return back to Earth.
00:33:18.420 | - They will choose to stay there
00:33:19.900 | for the rest of their lives.
00:33:21.260 | - Yeah, many will.
00:33:22.140 | But, you know, we need the spaceships back.
00:33:28.420 | Like the ones that go to Mars, we need them back.
00:33:29.980 | So you can hop on if you want, you know.
00:33:32.620 | But we can't just not have the spaceships come back.
00:33:34.940 | Those things are expensive.
00:33:35.780 | We need them back.
00:33:36.600 | I'd like to come back and do another trip.
00:33:38.740 | - I mean, do you think about the terraforming aspect,
00:33:40.660 | like actually building, are you so focused right now
00:33:43.180 | on the spaceships part that's so critical to get to Mars?
00:33:46.700 | - We absolutely, if you can't get there,
00:33:48.220 | nothing else matters.
00:33:49.620 | So, and like I said, we can't get there
00:33:52.220 | with some extraordinarily high cost.
00:33:54.660 | I mean, the current cost of, let's say,
00:33:57.780 | one ton to the surface of Mars
00:34:00.060 | is on the order of a billion dollars.
00:34:02.660 | So, 'cause you don't just need the rocket
00:34:04.860 | and the launch and everything, you need like heat shield,
00:34:06.660 | you need guidance system,
00:34:09.860 | you need deep space communications,
00:34:12.260 | you need some kind of landing system.
00:34:15.020 | So like rough approximation would be a billion dollars
00:34:19.540 | per ton to the surface of Mars right now.
00:34:22.180 | This is obviously way too expensive
00:34:26.900 | to create a self-sustaining civilization.
00:34:29.080 | So we need to improve that by at least a factor of 1,000.
00:34:35.680 | - A million per ton?
00:34:38.460 | - Yes, ideally much less than a million a ton.
00:34:40.880 | But if it's not, like it's gotta be,
00:34:44.260 | you ever say like, well, how much can society afford
00:34:47.620 | to spend or want to spend
00:34:49.460 | on a self-sustaining city on Mars?
00:34:52.380 | The self-sustaining part is important.
00:34:53.780 | Like it's just the key threshold,
00:35:01.340 | the great filter will have been passed
00:35:01.340 | when the city on Mars can survive
00:35:05.300 | even if the spaceships from Earth stop coming
00:35:07.700 | for any reason, doesn't matter what the reason is,
00:35:09.980 | but if they stop coming for any reason,
00:35:12.140 | will it die out or will it not?
00:35:13.700 | And if there's even one critical ingredient missing,
00:35:16.400 | then it still doesn't count.
00:35:18.340 | It's like, if you're on a long sea voyage
00:35:20.460 | and you've got everything except vitamin C,
00:35:22.580 | it's only a matter of time, you're gonna die.
00:35:26.260 | So we gotta get a Mars city
00:35:28.860 | to the point where it's self-sustaining.
00:35:30.820 | I'm not sure this will really happen in my lifetime,
00:35:33.860 | but I hope to see it at least have a lot of momentum.
00:35:37.300 | And then you could say, okay,
00:35:38.540 | what is the minimum tonnage necessary
00:35:40.380 | to have a self-sustaining city?
00:35:43.060 | And there's a lot of uncertainty about this.
00:35:46.460 | You could say, I don't know,
00:35:48.540 | it's probably at least a million tons
00:35:50.420 | 'cause you have to set up a lot of infrastructure on Mars.
00:35:55.460 | Like I said, you can't be missing anything
00:35:58.700 | that in order to be self-sustaining,
00:36:00.980 | you can't be missing, like you need semiconductor fabs,
00:36:04.460 | you need iron ore refineries,
00:36:07.140 | like you need lots of things.
00:36:08.840 | And Mars is not super hospitable.
00:36:13.460 | It's the least inhospitable planet,
00:36:15.700 | but it's definitely a fixer-upper of a planet.
00:36:18.140 | - Outside of Earth.
00:36:19.380 | - Yes.
00:36:20.220 | - Earth is pretty good. - Earth is like easy.
00:36:22.180 | - And also we should clarify in the solar system.
00:36:25.500 | - Yes, in the solar system.
00:36:26.660 | - There might be nice vacation spots.
00:36:29.780 | - There might be some great planets out there,
00:36:31.140 | but it's hopeless. - Too hard to get there?
00:36:33.820 | - Yeah, way, way, way, way, way too hard,
00:36:36.660 | to say the least.
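Pulling together the rough figures quoted in this stretch of the conversation: about a billion dollars per ton to the surface of Mars today, a target improvement of at least a factor of 1,000, and on the order of a million tons for a self-sustaining city.

```python
# Back-of-the-envelope arithmetic using the figures quoted above.

current_cost_per_ton = 1e9        # ~$1B/ton today (rough approximation)
improvement_factor   = 1_000      # "improve that by at least a factor of 1,000"
target_cost_per_ton  = current_cost_per_ton / improvement_factor

min_city_tonnage  = 1_000_000     # "probably at least a million tons"
implied_city_cost = target_cost_per_ton * min_city_tonnage

print(f"target cost per ton: ${target_cost_per_ton:,.0f}")  # $1,000,000
print(f"implied city cost:   ${implied_city_cost:,.0f}")    # ~$1 trillion
```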
00:36:37.500 | - Let me push back on that.
00:36:38.700 | Not really a pushback,
00:36:39.700 | but a quick curveball of a question.
00:36:41.980 | So you did mention physics as the first starting point.
00:36:44.820 | So general relativity allows for wormholes.
00:36:49.820 | They technically can exist.
00:36:53.080 | Do you think those can ever be leveraged
00:36:55.700 | by humans to travel fast in the speed of light?
00:36:58.100 | - Well--
00:37:00.260 | - Are you saying--
00:37:02.300 | - The wormhole thing is debatable.
00:37:03.620 | We currently do not know of any means
00:37:08.220 | of going faster than the speed of light.
00:37:11.780 | There is like,
00:37:13.380 | there are some ideas about having space.
00:37:21.380 | Like so, you can only move at the speed of light
00:37:26.380 | through space, but if you can make space itself move,
00:37:30.720 | that's what we mean by space.
00:37:34.520 | Space is capable of moving faster than the speed of light.
00:37:39.580 | - Right.
00:37:40.740 | - Like the universe, in the Big Bang,
00:37:42.100 | the universe expanded much more
00:37:44.620 | than the speed of light, by a lot.
00:37:46.820 | - Yeah.
00:37:47.660 | - So,
00:37:49.680 | but the,
00:37:53.040 | if this is possible,
00:37:56.660 | the amount of energy required to move space
00:37:58.660 | is so gigantic, it boggles the mind.
00:38:03.100 | - So all the work you've done with propulsion,
00:38:05.060 | how much innovation is possible with rocket propulsion?
00:38:08.100 | Is this, I mean, you've seen it all,
00:38:11.140 | and you're constantly innovating in every aspect.
00:38:14.420 | How much is possible?
00:38:15.340 | Like how much, can you get 10x somehow?
00:38:17.380 | Is there something in there, in physics,
00:38:19.660 | that you can get significant improvement
00:38:21.220 | in terms of efficiency of engines
00:38:22.580 | and all those kinds of things?
00:38:24.620 | - Well, as I was saying, really the Holy Grail
00:38:27.940 | is a fully and rapidly reusable orbital system.
00:38:31.480 | So, right now,
00:38:36.500 | the Falcon 9 is the only reusable rocket out there.
00:38:41.500 | But the booster comes back and lands,
00:38:44.300 | and you've seen the videos,
00:38:46.180 | and we get the nose cone or fairing back,
00:38:47.620 | but we do not get the upper stage back.
00:38:50.580 | that means that we have a minimum cost
00:38:54.260 | of building an upper stage.
00:38:56.660 | You can think of like a two-stage rocket
00:38:59.300 | of sort of like two airplanes,
00:39:00.700 | like a big airplane and a small airplane,
00:39:03.180 | and we get the big airplane back,
00:39:04.600 | but not the small airplane.
00:39:05.860 | And so, it still costs a lot.
00:39:07.860 | So, that upper stage is at least $10 million.
00:39:11.340 | And then the degree of,
00:39:15.460 | the booster is not as rapidly and completely reusable
00:39:19.060 | as we'd like, nor are the fairings.
00:39:20.900 | So, our kind of minimum marginal cost,
00:39:25.140 | not counting overhead, per flight
00:39:27.140 | is on the order of $15 to $20 million, maybe.
00:39:33.360 | So, that's extremely good for,
00:39:38.040 | it's by far better than any rocket ever in history.
00:39:40.600 | But with full and rapid reusability,
00:39:45.320 | we can reduce the cost per ton to orbit
00:39:48.800 | by a factor of 100.
00:39:51.820 | But just think of it like,
00:39:54.400 | like imagine if you had an aircraft or something,
00:39:57.140 | or a car,
00:39:58.880 | and if you had to buy a new car
00:40:02.920 | every time you went for a drive,
00:40:05.080 | it would be very expensive.
00:40:06.760 | It would be silly, frankly.
00:40:08.320 | But in fact, you just refuel the car or recharge the car,
00:40:13.320 | and that makes your trip,
00:40:15.480 | like, I don't know, a thousand times cheaper.
00:40:20.320 | So, it's the same for rockets.
00:40:23.320 | Very difficult to make this complex machine
00:40:27.240 | that can go to orbit.
00:40:28.680 | And so, if you cannot reuse it,
00:40:30.560 | and have to throw even any significant part of it away,
00:40:34.000 | that massively increases the cost.
00:40:36.660 | So, you know, Starship in theory
00:40:40.320 | could do a cost per launch of like a million,
00:40:44.360 | maybe $2 million or something like that,
00:40:46.360 | and put over 100 tons in orbit.
00:40:51.520 | This is crazy.
00:40:53.560 | - Yeah.
00:40:55.000 | That's incredible.
00:40:55.960 | So, you're saying like it's by far
00:40:57.600 | the biggest bang for the buck
00:40:58.720 | is to make it fully reusable,
00:41:00.720 | versus like some kind of brilliant breakthrough
00:41:04.160 | in theoretical physics.
00:41:05.780 | - No, no.
00:41:06.620 | There's no brilliant break, no.
00:41:07.960 | There's no, just make the rocket reusable.
00:41:11.240 | This is an extremely difficult engineering problem.
00:41:13.480 | - Got it.
00:41:14.320 | - But no new physics is required.
00:41:15.720 | - Just brilliant engineering.
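A rough cost-per-kilogram comparison implied by the numbers above: a Falcon 9 marginal cost of roughly $15-20 million per flight versus a fully reusable Starship at, in theory, $1-2 million per launch with over 100 tons to orbit. The Falcon 9 payload figure below is an assumed round number for illustration, not an official spec.

```python
# Cost per kg to orbit under the figures discussed above. The Falcon 9
# payload is an assumed round number.

def cost_per_kg(launch_cost_usd: float, payload_kg: float) -> float:
    return launch_cost_usd / payload_kg

falcon9  = cost_per_kg(17.5e6, 15_000)   # ~$15-20M marginal, assumed ~15 t payload
starship = cost_per_kg(1.5e6, 100_000)   # ~$1-2M launch, "over 100 tons in orbit"

print(f"Falcon 9 (marginal):  ~${falcon9:,.0f}/kg")   # ~$1,167/kg
print(f"Starship (in theory): ~${starship:,.0f}/kg")  # ~$15/kg
print(f"improvement:          ~{falcon9 / starship:.0f}x")
```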
00:41:19.320 | Let me ask a slightly philosophical fun question.
00:41:22.200 | Gotta ask.
00:41:23.040 | I know you're focused on getting to Mars,
00:41:24.840 | but once we're there on Mars,
00:41:26.720 | what form of government, economic system,
00:41:29.960 | political system do you think would work best
00:41:33.760 | for an early civilization of humans?
00:41:36.580 | I mean, the interesting reason to talk about this stuff,
00:41:41.560 | it also helps people dream about the future.
00:41:44.320 | I know you're really focused
00:41:45.560 | about the short-term engineering dream,
00:41:48.020 | but it's like, I don't know,
00:41:49.280 | there's something about imagining
00:41:50.680 | an actual civilization on Mars
00:41:52.320 | that gives people, really gives people hope.
00:41:55.440 | - Well, it would be a new frontier
00:41:56.960 | and an opportunity to rethink the whole nature of government
00:41:59.640 | just as was done in the creation of the United States.
00:42:02.680 | So, I mean, I would suggest having direct democracy,
00:42:07.680 | like people vote directly on things
00:42:16.200 | as opposed to representative democracy.
00:42:18.440 | So representative democracy, I think,
00:42:21.520 | is too subject to special interests
00:42:25.120 | and a coercion of the politicians and that kind of thing.
00:42:29.580 | So I'd recommend that there's just direct democracy.
00:42:36.320 | People vote on laws,
00:42:40.560 | the population votes on laws themselves,
00:42:42.680 | and then the laws must be short enough
00:42:44.520 | that people can understand them.
00:42:46.880 | - Yeah, and then keeping a well-informed populace,
00:42:49.920 | like really being transparent about all the information,
00:42:52.240 | about what they're voting for.
00:42:53.080 | - Yeah, absolute transparency.
00:42:54.800 | - Yeah, and not make it as annoying as those cookies
00:42:57.480 | we have to accept.
00:42:58.800 | - Accept cookies.
00:42:59.720 | There's always a slight amount of trepidation
00:43:03.640 | when you click accept cookies.
00:43:05.960 | I feel as though there's perhaps a very tiny chance
00:43:09.040 | that'll open a portal to hell or something like that.
00:43:12.280 | - That's exactly how I feel.
00:43:13.800 | - Why do they, why do they keep wanting me to accept it?
00:43:16.640 | What do they want with this cookie?
00:43:19.200 | Somebody got upset with accepting cookies
00:43:21.040 | or something somewhere, who cares?
00:43:24.000 | It's so annoying to keep accepting all these cookies.
00:43:26.960 | - To me, this is just a great--
00:43:28.960 | - Yes, you can have my damn cookie, I don't care, whatever.
00:43:32.400 | - Heard it from Elon first, he accepts all your damn cookies.
00:43:35.960 | - Yeah.
00:43:36.800 | (laughing)
00:43:38.080 | Stop asking me.
00:43:38.920 | It's annoying.
00:43:41.400 | - Yeah, it's one example of implementation
00:43:46.000 | of a good idea done really horribly.
00:43:50.200 | - Yeah, it's somebody who has some good intentions
00:43:52.520 | of privacy or whatever, but now everyone just has
00:43:56.120 | to accept cookies and it's not, you have billions
00:43:59.000 | of people who have to keep clicking accept cookie,
00:44:00.720 | it's super annoying.
00:44:02.360 | Then just accept the damn cookie, it's fine.
00:44:05.000 | There is, I think, a fundamental problem
00:44:07.960 | that because we've not really had a major,
00:44:11.280 | like a world war or something like that in a while,
00:44:14.320 | and obviously we'd like to not have world wars,
00:44:16.720 | there's not been a cleansing function
00:44:19.800 | for rules and regulations.
00:44:22.360 | So wars did have some sort of silver lining
00:44:25.760 | in that there would be a reset on rules
00:44:29.000 | and regulations after a war.
00:44:31.120 | So World Wars I and II, there were huge resets
00:44:33.000 | on rules and regulations.
00:44:34.440 | Now, if society does not have a war
00:44:39.400 | and there's no cleansing function or garbage collection
00:44:41.720 | for rules and regulations, then rules and regulations
00:44:43.840 | will accumulate every year 'cause they're immortal.
00:44:46.440 | There's no actual, humans die, but the laws don't.
00:44:50.320 | So we need a garbage collection function
00:44:53.160 | for rules and regulations.
00:44:54.560 | They should not just be immortal
00:44:57.480 | 'cause some of the rules and regulations
00:44:59.000 | that are put in place will be counterproductive.
00:45:01.920 | Done with good intentions, but counterproductive.
00:45:03.720 | Sometimes not done with good intentions.
00:45:05.600 | So if rules and regulations just accumulate every year
00:45:10.520 | and you get more and more of them,
00:45:12.360 | then eventually you won't be able to do anything.
00:45:14.840 | You're just like Gulliver tied down
00:45:17.320 | by thousands of little strings.
00:45:19.560 | And we see that in US and,
00:45:24.560 | like basically all economies that have been around
00:45:29.280 | for a while and regulators and legislators
00:45:34.280 | create new rules and regulations every year,
00:45:36.560 | but they don't put effort into removing them.
00:45:38.680 | And I think that's very important that we put effort
00:45:40.280 | into removing rules and regulations.
00:45:42.080 | But it gets tough 'cause you get special interests
00:45:45.560 | that then are dependent on,
00:45:47.320 | like they have a vested interest
00:45:50.520 | in that whatever rule and regulation
00:45:51.920 | and then they fight to not get it removed.
00:45:55.560 | - Yeah, so I mean, I guess the problem
00:45:59.360 | with the constitution is it's kind of like C versus Java
00:46:04.120 | 'cause it doesn't have any garbage collection built in.
00:46:06.720 | I think there should be,
00:46:07.920 | when you first said that the metaphor of garbage collection,
00:46:10.800 | I loved it. - Yeah, it's from
00:46:11.640 | a coding standpoint.
00:46:12.480 | - From a coding standpoint, yeah, yeah.
00:46:14.320 | It would be interesting if the laws themselves
00:46:16.880 | kind of had a built-in thing
00:46:19.040 | where they kind of die after a while
00:46:20.720 | unless somebody explicitly publicly defends them.
00:46:23.600 | So that's sort of, it's not like somebody has to kill them,
00:46:26.240 | they kind of die themselves, they disappear.
00:46:29.160 | - Yeah.
00:46:30.000 | - Not to defend Java or anything, but C++,
00:46:36.640 | you could also have great garbage collection
00:46:38.520 | in Python and so on.
00:46:39.920 | - Yeah, so yeah, something needs to happen
00:46:43.760 | or just the civilization's arteries just harden over time
00:46:48.600 | and you can just get less and less done
00:46:50.840 | because there's just a rule against everything.
00:46:53.200 | So I think, I don't know, for Mars or whatever,
00:46:57.920 | I'd say, or even for, obviously for Earth as well,
00:47:00.240 | like I think there should be an active process
00:47:02.640 | for removing rules and regulations
00:47:04.960 | and questioning their existence.
00:47:07.120 | Just like if we've got a function
00:47:10.280 | for creating rules and regulations,
00:47:11.560 | 'cause rules and regulations can also be thought of as like
00:47:13.480 | they're like software or lines of code
00:47:15.800 | for operating civilization.
00:47:18.880 | That's the rules and regulations.
00:47:21.320 | So it's not like we shouldn't have rules and regulations,
00:47:23.000 | but you have code accumulation, but no code removal.
00:47:27.120 | And so it just gets to become
00:47:29.960 | basically archaic bloatware after a while.
00:47:32.280 | And it's just, it makes it hard for things to progress.
00:47:37.920 | So I don't know, maybe Mars, you'd have like,
00:47:42.000 | any given law must have a sunset
00:47:43.560 | and require active voting to restore,
00:47:50.160 | to keep it up there.
00:47:51.160 | And I actually also say like,
00:47:54.080 | and these are just, I don't know,
00:47:55.160 | recommendations or thoughts.
00:47:57.040 | Ultimately, it will be up to the people on Mars to decide.
00:48:00.680 | But I think it should be easier to remove a law
00:48:05.680 | than to add one because of the,
00:48:08.600 | just to overcome the inertia of laws.
00:48:10.720 | So maybe it's like, for argument's sake,
00:48:15.240 | you need like say 60% vote to have a law take effect,
00:48:19.800 | but only a 40% vote to remove it.
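A toy sketch of the sunset-plus-asymmetric-thresholds idea floated above: laws lapse unless actively renewed, with 60% needed to enact and only 40% to remove (the thresholds suggested "for argument's sake"). The sunset period is an assumed placeholder.

```python
# Sunset laws with asymmetric enact/remove thresholds, as floated above.
from dataclasses import dataclass

ENACT_THRESHOLD  = 0.60   # "say 60% vote to have a law take effect"
REMOVE_THRESHOLD = 0.40   # "only a 40% vote to remove it"

@dataclass
class Law:
    text: str
    enacted_year: int
    sunset_years: int = 10   # assumed default sunset period

    def expired(self, current_year: int) -> bool:
        return current_year >= self.enacted_year + self.sunset_years

def enact(yes_fraction: float) -> bool:
    return yes_fraction >= ENACT_THRESHOLD

def remove(yes_fraction: float) -> bool:
    return yes_fraction >= REMOVE_THRESHOLD

law = Law("Laws must be short enough that people can understand them.", 2030)
print(enact(0.62))        # True: clears the 60% bar
print(remove(0.45))       # True: 40% is enough to strike it
print(law.expired(2041))  # True: lapses unless actively renewed by vote
```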
00:48:22.040 | - So let me be the guy,
00:48:24.800 | you posted a meme on Twitter recently
00:48:26.640 | where there's like a row of urinals
00:48:30.160 | and a guy just walks all the way across.
00:48:32.600 | - Oh sure, yeah.
00:48:33.440 | - And he tells you about crypto.
00:48:34.640 | (laughing)
00:48:36.320 | - I mean, that's happened to me so many times.
00:48:38.360 | I think maybe even literally.
00:48:40.400 | - Yeah.
00:48:41.800 | Do you think technologically speaking,
00:48:43.480 | there's any room for ideas of smart contracts or so on?
00:48:47.360 | 'Cause you mentioned laws.
00:48:49.320 | That's an interesting use of things like smart contracts
00:48:53.000 | to implement the laws by which governments function.
00:48:56.000 | Like something built on Ethereum
00:48:58.920 | or maybe Dogecoin that enables smart contracts somehow.
00:49:03.920 | - I don't quite understand this whole smart contract thing.
00:49:08.280 | You know?
00:49:09.960 | I mean, I'm too dumb to understand smart contracts.
00:49:14.960 | - That's a good line.
00:49:16.000 | - I mean, my general approach to any kind of deal
00:49:21.000 | or whatever is just make sure
00:49:22.280 | there's clarity of understanding.
00:49:23.760 | That's the most important thing.
00:49:25.800 | And just keep any kind of deal very short and simple,
00:49:29.640 | plain language, and just make sure everyone understands
00:49:33.440 | this is the deal, is it clear?
00:49:38.600 | And what are the consequences
00:49:39.800 | if various things don't happen?
00:49:41.360 | But usually deals are, business deals or whatever,
00:49:47.200 | are way too long and complex and overly layered
00:49:50.920 | and pointlessly.
00:49:52.720 | - You mentioned that Doge is the people's coin.
00:49:57.040 | - Yeah.
00:49:57.880 | - And you said that you were literally going,
00:49:59.640 | SpaceX may consider literally putting a Doge coin
00:50:04.640 | on the moon.
00:50:07.480 | Is this something you're still considering?
00:50:09.960 | Mars, perhaps?
00:50:11.960 | Do you think there's some chance,
00:50:13.640 | we've talked about political systems on Mars,
00:50:16.080 | that Doge coin is the official currency of Mars
00:50:20.240 | at some point in the future?
00:50:21.680 | - Well, I think Mars itself will need
00:50:25.400 | to have a different currency
00:50:27.040 | because you can't synchronize due to speed of light.
00:50:30.320 | Or not easily.
00:50:32.680 | - So it must be completely standalone from Earth.
00:50:36.480 | - Well, yeah, 'cause Mars, at closest approach,
00:50:41.320 | it's four light minutes away, roughly.
00:50:43.000 | And then at furthest approach,
00:50:45.000 | it's roughly 20 light minutes away, maybe a little more.
00:50:48.480 | So you can't really have something synchronizing
00:50:52.480 | if you've got a 20 minute speed of light issue,
00:50:55.480 | if it's got a one minute blockchain.
00:50:58.200 | It's not gonna synchronize properly.
00:51:00.000 | So Mars would, I don't know if Mars would have
00:51:03.960 | a cryptocurrency as a thing, but probably.
00:51:06.040 | Seems likely.
00:51:07.640 | But it would be some kind of localized thing on Mars.
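A rough back-of-envelope of the light-delay point. The Earth-Mars distances are approximate published figures, not numbers from the conversation; they land in the same ballpark as the 4-to-20-plus light-minute range mentioned above.

```python
C_KM_S = 299_792.458  # speed of light, km/s

# Approximate Earth-Mars distances; rough published figures, assumptions for illustration.
closest_km = 54.6e6
farthest_km = 401e6

for label, d_km in [("closest approach", closest_km), ("farthest approach", farthest_km)]:
    one_way_min = d_km / C_KM_S / 60
    print(f"{label}: ~{one_way_min:.0f} light-minutes one way")

# With a roughly one-minute block interval, dozens of blocks would be produced
# before a Mars node even hears about the latest Earth block.
block_interval_min = 1.0
print(f"blocks produced during the one-way delay at farthest approach: "
      f"~{farthest_km / C_KM_S / 60 / block_interval_min:.0f}")
```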
00:51:10.240 | - And you let the people decide.
00:51:13.920 | - Yeah, absolutely.
00:51:16.520 | The future of Mars should be up to the Martians.
00:51:20.760 | Yeah, so.
00:51:21.600 | I mean, I think the cryptocurrency thing
00:51:25.800 | is an interesting approach to reducing the error
00:51:34.080 | in the database that is called money.
00:51:38.060 | I think I have a pretty deep understanding
00:51:42.920 | of what money actually is on a practical day-to-day basis
00:51:46.760 | because of PayPal.
00:51:48.080 | We really got in deep there.
00:51:52.960 | And right now, the money system,
00:51:57.280 | actually for practical purposes,
00:51:59.960 | is really a bunch of heterogeneous mainframes
00:52:04.080 | running old COBOL.
00:52:05.800 | - Okay, you mean literally?
00:52:08.720 | - Literally.
00:52:09.560 | - That's literally what's happening.
00:52:10.840 | - In batch mode.
00:52:11.680 | Okay.
00:52:13.600 | - In batch mode.
00:52:14.440 | - Yeah, pity the poor bastards
00:52:16.440 | who have to maintain that code.
00:52:19.000 | Okay, that's a pain.
00:52:22.240 | - Not even Fortran, it's COBOL, yep.
00:52:24.240 | - It's COBOL.
00:52:26.040 | And banks are still buying mainframes in 2021
00:52:30.120 | and running ancient COBOL code.
00:52:32.040 | And the Federal Reserve is probably even older
00:52:37.800 | than what the banks have,
00:52:38.960 | and they have an old COBOL mainframe.
00:52:41.080 | And so the government effectively has editing privileges
00:52:46.960 | on the money database.
00:52:48.660 | And they use those editing privileges
00:52:51.520 | to make more money whenever they want.
00:52:56.420 | And this increases the error in the database that is money.
00:53:00.020 | So I think money should really be viewed
00:53:01.820 | through the lens of information theory.
00:53:04.580 | And so it's kind of like an internet connection.
00:53:09.260 | Like what's the bandwidth, total bit rate,
00:53:13.500 | what is the latency, jitter, packet drop,
00:53:17.220 | errors in network communication.
00:53:21.780 | So you think money like that, basically.
00:53:25.460 | I think that's probably the right way to think of it.
00:53:27.420 | And then say what system,
00:53:29.640 | from an information theory standpoint,
00:53:32.900 | allows an economy to function the best.
00:53:35.180 | And crypto is an attempt to reduce the error in money
00:53:41.540 | that is contributed by governments
00:53:50.740 | diluting the money supply
00:53:53.260 | as basically a pernicious form of taxation.
00:53:57.180 | - So both policy in terms of with inflation
00:54:01.860 | and actual technological COBOL,
00:54:05.740 | like cryptocurrency takes us into the 21st century
00:54:08.860 | in terms of the actual systems
00:54:10.700 | that allow you to do the transaction,
00:54:12.100 | to store wealth, all those kinds of things.
00:54:14.200 | - Like I said, just think of money as information.
00:54:18.540 | People often will think of money
00:54:20.860 | as having power in and of itself.
00:54:22.700 | It does not.
00:54:24.960 | Money is information.
00:54:26.820 | And it does not have power in and of itself.
00:54:29.860 | Like applying the physics tools
00:54:35.020 | of thinking about things in the limit is helpful.
00:54:37.500 | If you are stranded on a tropical island
00:54:39.980 | and you have a trillion dollars, it's useless.
00:54:45.600 | 'Cause there's no resource allocation.
00:54:50.540 | Money is a database for resource allocation.
00:54:52.660 | There's no resource to allocate except yourself.
00:54:55.020 | So money is useless.
00:54:56.060 | If you're stranded on a desert island with no food,
00:55:04.100 | all the Bitcoin in the world will not stop you from starving.
00:55:15.940 | So just think of money as a database
00:55:20.820 | for resource allocation across time and space.
00:55:24.000 | In what form should that database or data system,
00:55:36.160 | what would be most effective?
00:55:39.060 | Now, there is a fundamental issue with, say,
00:55:42.980 | Bitcoin in its current form,
00:55:45.020 | in that the transaction volume is very limited.
00:55:48.740 | And the latency for a properly confirmed transaction
00:55:55.700 | is too long, much longer than you'd like.
00:56:00.140 | So it's actually not great
00:56:02.180 | from a transaction volume standpoint
00:56:04.980 | or a latency standpoint.
00:56:09.140 | So it is perhaps useful to solve an aspect
00:56:14.140 | of the money database problem,
00:56:16.180 | which is this sort of store of wealth
00:56:19.940 | or an accounting of relative obligations, I suppose.
00:56:24.700 | But it is not useful as a currency,
00:56:29.420 | as a day-to-day currency.
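A hedged back-of-envelope on base-layer Bitcoin throughput and confirmation time. The block interval, block space, transaction size, and six-confirmation convention are rough, widely cited figures assumed for illustration, not numbers given in the conversation.

```python
block_interval_s = 600         # target of roughly one block every 10 minutes
block_space_bytes = 1_000_000  # on the order of 1 MB of base block space
avg_tx_bytes = 250             # a typical transaction size, very roughly

tps = (block_space_bytes / avg_tx_bytes) / block_interval_s
print(f"~{tps:.0f} transactions per second at the base layer")

confirmations = 6              # a common convention for a well-confirmed payment
print(f"~{confirmations * block_interval_s / 60:.0f} minutes to be well confirmed")
```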
00:56:30.820 | - But people have proposed
00:56:31.900 | different technological solutions.
00:56:33.340 | - Like Lightning.
00:56:34.340 | - Yeah, Lightning Network and the Layer 2 technologies
00:56:36.760 | on top of that.
00:56:38.980 | It seems to be all kind of a trade-off,
00:56:40.860 | but the point is, it's kind of brilliant to say
00:56:43.140 | that just think about information,
00:56:44.500 | think about what kind of database,
00:56:46.180 | what kind of infrastructure enables
00:56:48.060 | that exchange of information. - Yeah, so say like
00:56:48.900 | you're operating an economy,
00:56:50.200 | and you need to have something that allows
00:56:55.300 | for the efficient, to have efficient value ratios
00:56:59.740 | between products and services.
00:57:01.380 | So you've got this massive number of products and services,
00:57:03.900 | and you need to, you can't just barter.
00:57:07.300 | That would be extremely unwieldy.
00:57:09.620 | So you need something that gives you
00:57:12.100 | a ratio of exchange between goods and services.
00:57:19.240 | And then something that allows you
00:57:22.620 | to shift obligations across time, like debt,
00:57:26.620 | debt and equity shift obligations across time.
00:57:29.220 | Then what does the best job of that?
00:57:31.380 | Part of the reason why I think
00:57:34.180 | there's some merit to Dogecoin,
00:57:37.060 | even though it was obviously created as a joke,
00:57:39.500 | is that it actually does have a much higher
00:57:44.980 | transaction volume capability than Bitcoin.
00:57:48.340 | And the costs of doing a transaction,
00:57:53.020 | the Dogecoin fee is very low.
00:57:56.020 | Like right now, if you want to do a Bitcoin transaction,
00:57:58.260 | the price of doing that transaction is very high.
00:58:00.460 | So you could not use it effectively for most things.
00:58:04.380 | And nor could it even scale to a high volume.
00:58:06.900 | And when Bitcoin was started,
00:58:14.020 | I guess around 2008 or something like that,
00:58:16.220 | the internet connections were much worse than they are today.
00:58:20.860 | Like order of magnitude, I mean,
00:58:23.580 | they were way, way worse in 2008.
00:58:26.860 | So like having a small block size or whatever is,
00:58:33.180 | and a long synchronization time is,
00:58:36.660 | made sense in 2008.
00:58:37.780 | But 2021, or fast forward 10 years,
00:58:41.500 | it's like comically low.
00:58:45.380 | And I think there's some value to having a linear increase
00:58:55.540 | in the amount of currency that is generated.
00:58:58.620 | So because some amount of the currency,
00:59:02.940 | if a currency is too deflationary,
00:59:06.340 | or like, or should say,
00:59:07.820 | if a currency is expected to increase in value over time,
00:59:12.820 | there's reluctance to spend it.
00:59:15.180 | 'Cause you're like, oh, I'll just hold it and not spend it
00:59:18.500 | because its scarcity is increasing with time.
00:59:20.580 | So if I spend it now, then I will regret spending it.
00:59:23.380 | So I will just, you know, hodl it.
00:59:26.020 | But if there's some dilution of the currency occurring
00:59:30.940 | over time, that's more of an incentive
00:59:32.820 | to use it as a currency.
00:59:34.260 | So Dogecoin, somewhat randomly,
00:59:36.860 | just has a fixed number of sort of coins
00:59:41.860 | or hash strings that are generated every year.
00:59:49.620 | So there's some inflation, but it's not a percentage base.
00:59:52.780 | It's a fixed number.
00:59:55.700 | So the percentage of inflation
00:59:58.380 | will necessarily decline over time.
01:00:00.580 | So it just, I'm not saying that it's like the ideal system
01:00:05.580 | for a currency, but I think it actually is
01:00:09.500 | just fundamentally better than anything else I've seen,
01:00:13.460 | just by accident.
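A small sketch of the point that a fixed absolute issuance means the percentage inflation necessarily declines over time. The roughly-five-billion-coins-per-year figure and the starting supply are approximations assumed for illustration.

```python
annual_issuance = 5_000_000_000     # roughly fixed new coins per year
supply = 130_000_000_000            # rough starting supply; an assumption

for year in (0, 5, 10, 20):
    s = supply + annual_issuance * year
    print(f"year {year:2d}: supply ~{s / 1e9:.0f}B, inflation ~{annual_issuance / s * 100:.2f}% per year")
```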
01:00:16.620 | - I like how you said around 2008,
01:00:19.740 | so you're not, you know, some people suggested,
01:00:23.420 | you might be Satoshi Nakamoto,
01:00:24.860 | you've previously said you're not.
01:00:26.060 | Let me ask, you're not for sure.
01:00:28.780 | Would you tell us if you were?
01:00:30.140 | - Yes.
01:00:30.980 | - Okay.
01:00:31.820 | Do you think it's a feature or bug
01:00:34.780 | that he's anonymous or she or they?
01:00:37.260 | It's an interesting kind of quirk of human history
01:00:41.680 | that there is a particular technology
01:00:43.580 | that is a completely anonymous inventor or creator.
01:00:47.020 | - Well, I mean, you can look at the evolution of ideas
01:00:52.020 | before the launch of Bitcoin
01:01:12.860 | and see who wrote about those ideas.
01:01:17.860 | And then, I don't know exactly,
01:01:22.820 | obviously I don't know who created Bitcoin
01:01:25.180 | for practical purposes,
01:01:26.100 | but the evolution of ideas is pretty clear before that.
01:01:29.940 | And it seems as though Nick Szabo
01:01:33.020 | is probably more than anyone else responsible
01:01:36.240 | for the evolution of those ideas.
01:01:38.060 | So he claims not to be Nakamoto,
01:01:41.900 | but I'm not sure that's neither here nor there.
01:01:45.380 | But he seems to be the one more responsible
01:01:48.300 | for the ideas behind Bitcoin than anyone else.
01:01:50.900 | - So it's not, perhaps singular figures
01:01:53.360 | aren't even as important as the figures involved
01:01:56.260 | in the evolution of ideas that led to a thing.
01:01:58.620 | - Yeah.
01:01:59.460 | - Yeah, you know.
01:02:00.500 | And perhaps it's sad to think about history,
01:02:03.620 | but maybe most names will be forgotten anyway.
01:02:06.900 | - What is a name anyway?
01:02:08.020 | It's a name attached to an idea.
01:02:11.720 | - What does it even mean really?
01:02:13.680 | - I think Shakespeare had a thing about roses and stuff,
01:02:16.240 | whatever he said.
01:02:17.200 | - Rose by any other name, it smells sweet.
01:02:19.480 | - I got Elon to quote Shakespeare.
01:02:24.280 | I feel like I accomplished something today.
01:02:26.880 | - Shall I compare thee to a summer's day?
01:02:28.920 | (laughing)
01:02:30.800 | - I'm gonna clip that out.
01:02:32.100 | (laughing)
01:02:34.000 | - Not more temperate and more fair.
01:02:35.800 | (laughing)
01:02:38.880 | - Autopilot, Tesla Autopilot.
01:02:41.440 | (laughing)
01:02:43.680 | Tesla Autopilot has been through an incredible journey
01:02:48.520 | over the past six years,
01:02:50.540 | or perhaps even longer in your mind
01:02:53.520 | and the minds of many involved.
01:02:55.180 | - I think that's where we first connected really
01:02:59.000 | was the Autopilot stuff, autonomy.
01:03:01.920 | - The whole journey was incredible to me to watch.
01:03:07.720 | Because I knew, well, part of it is I was at MIT
01:03:10.360 | and I knew the difficulty of computer vision.
01:03:13.200 | And I knew the whole, I had a lot of colleagues and friends
01:03:15.800 | who worked on the DARPA Challenge, and knew how difficult it is.
01:03:18.440 | And so there was a natural skepticism.
01:03:20.160 | When I first drove a Tesla with the initial system
01:03:23.720 | based on Mobileye, I thought there's no way.
01:03:26.540 | So at first when I got in, I thought there's no way
01:03:29.920 | this car could maintain, like stay in the lane
01:03:34.180 | and create a comfortable experience.
01:03:35.900 | So my intuition initially was that the lane keeping problem
01:03:39.520 | is way too difficult to solve.
01:03:41.760 | - Oh, lane keeping, yeah, that's relatively easy.
01:03:44.160 | - Well, but solve in the way that we just,
01:03:48.440 | we talked about previous is prototype versus a thing
01:03:52.640 | that actually creates a pleasant experience
01:03:54.400 | over hundreds of thousands of miles and millions.
01:03:57.480 | Yeah, so I was proven wrong pretty quickly.
01:03:59.360 | - We had to wrap a lot of code around the Mobileye thing.
01:04:01.720 | It doesn't just work by itself.
01:04:04.400 | - I mean, that's part of the story
01:04:06.420 | of how you approach things sometimes.
01:04:07.980 | Sometimes you do things from scratch.
01:04:09.680 | Sometimes at first you kind of see what's out there
01:04:12.460 | and then you decide to do from scratch.
01:04:14.340 | That was one of the boldest decisions I've seen
01:04:17.180 | is both on the hardware and the software
01:04:18.820 | to decide to eventually go from scratch.
01:04:21.000 | I thought, again, I was skeptical
01:04:22.700 | whether that's going to be able to work out
01:04:24.500 | 'cause it's such a difficult problem.
01:04:26.860 | And so it was an incredible journey.
01:04:28.900 | What I see now with everything, the hardware, the compute,
01:04:32.500 | the sensors, the things I maybe care and love about most
01:04:37.300 | is the stuff that Andrej Karpathy is leading
01:04:40.040 | with the data set selection, the whole data engine process,
01:04:43.100 | the neural network architectures,
01:04:45.020 | the way that's in the real world,
01:04:47.300 | that network is tested, validated,
01:04:49.380 | all the different test sets,
01:04:50.920 | versus the ImageNet model of computer vision,
01:04:54.740 | like what's in academia
01:04:56.220 | is like real world artificial intelligence.
01:05:01.340 | And Andrej's awesome and obviously plays an important role,
01:05:04.220 | but we have a lot of really talented people driving things.
01:05:07.760 | And Ashok is actually the head of autopilot engineering.
01:05:12.740 | Andrej is the director of AI.
01:05:16.380 | - AI stuff, yeah, yeah, yeah.
01:05:17.700 | So yeah, I'm aware that there's an incredible team
01:05:20.580 | of just a lot going on.
01:05:22.060 | - Yeah, just, you know,
01:05:23.020 | obviously people will give me too much credit
01:05:26.020 | and they'll give Andrej too much credit.
01:05:28.660 | - And people should realize how much is going on
01:05:31.660 | under the hood. - Yeah, it's just a lot
01:05:33.380 | of really talented people.
01:05:34.680 | The Tesla Autopilot AI team is extremely talented.
01:05:40.000 | It's like some of the smartest people in the world.
01:05:42.540 | So yeah, we're getting it done.
01:05:45.020 | - What are some insights you've gained
01:05:47.620 | over those five, six years of autopilot
01:05:51.260 | about the problem of autonomous driving?
01:05:54.220 | So you leaped in having some sort of first principles,
01:05:59.220 | kinds of intuitions, but nobody knows how difficult
01:06:02.980 | the problem-- - Yeah.
01:06:04.540 | - Like the problem-- - I thought the self-driving
01:06:06.220 | problem would be hard, but it was harder than I thought.
01:06:08.940 | It's not like I thought it would be easy.
01:06:09.900 | I thought it would be very hard,
01:06:10.740 | but it was actually way harder than even that.
01:06:14.140 | So I mean, what it comes down to at the end of the day
01:06:16.980 | is to solve self-driving, you have to solve,
01:06:22.660 | you basically need to recreate what humans do to drive,
01:06:27.660 | which is humans drive with optical sensors,
01:06:31.500 | eyes, and biological neural nets.
01:06:33.800 | And so in order to, that's how the entire road system
01:06:37.860 | is designed to work, with basically passive optical
01:06:42.860 | and neural nets, biologically.
01:06:46.060 | And now that we need to, so for actually
01:06:48.460 | for full self-driving to work, we have to recreate that
01:06:51.020 | in digital form.
01:06:51.880 | So we have to, that means cameras with advanced
01:06:57.460 | neural nets in silicon form.
01:07:02.560 | And then it will obviously solve for full self-driving.
01:07:07.940 | That's the only way, I don't think there's any other way.
01:07:10.260 | - But the question is, what aspects of human nature
01:07:12.880 | do you have to encode into the machine, right?
01:07:15.500 | So you have to solve the perception problem, like detect,
01:07:18.740 | and then you first, well, realize what is the perception
01:07:22.340 | problem for driving, like all the kinds of things
01:07:24.220 | you have to be able to see.
01:07:25.420 | Like what do we even look at when we drive?
01:07:27.900 | There's, I just recently heard Andrej talk at MIT
01:07:32.420 | about car doors, I think it was the world's greatest talk
01:07:35.560 | of all time about car doors.
01:07:37.020 | - Yeah.
01:07:37.860 | - The fine details of car doors.
01:07:41.380 | Like what is even an open car door, man?
01:07:44.420 | So like the ontology of that, that's a perception problem.
01:07:47.980 | We humans solve that perception problem,
01:07:49.860 | and Tesla has to solve that problem.
01:07:51.620 | And then there's the control and the planning
01:07:53.380 | coupled with the perception.
01:07:54.980 | You have to figure out like what's involved in driving,
01:07:58.300 | like especially in all the different edge cases.
01:08:00.880 | And then, I mean, maybe you can comment on this,
01:08:06.540 | how much game theoretic kind of stuff needs to be involved,
01:08:10.460 | you know, at a four-way stop sign.
01:08:14.020 | As humans, when we drive, our actions affect the world.
01:08:18.060 | Like it changes how others behave.
01:08:20.780 | Most autonomous driving, you're usually just responding
01:08:25.220 | to the scene as opposed to like really asserting yourself
01:08:30.980 | in the scene.
01:08:31.820 | Do you think?
01:08:33.140 | - I think these sort of control logic conundrums
01:08:37.580 | are not the hard part.
01:08:43.060 | Let's see.
01:08:43.900 | - What do you think is the hard part
01:08:46.860 | in this whole beautiful, complex problem?
01:08:50.580 | - So it's a lot of frigging software, man.
01:08:52.980 | A lot of smart lines of code.
01:08:54.420 | For sure, in order to have,
01:08:59.660 | create an accurate vector space.
01:09:03.980 | So like you're coming from image space,
01:09:08.280 | which is like this flow of photons,
01:09:12.580 | you're going to the cameras,
01:09:14.420 | and then you have this massive bit stream in image space,
01:09:19.420 | and then you have to effectively compress
01:09:26.780 | a massive bit stream corresponding to photons
01:09:34.580 | that knocked off an electron in a camera sensor,
01:09:41.380 | and turn that bit stream into vector space.
01:09:45.280 | By vector space, I mean like,
01:09:49.600 | you've got cars and humans and lane lines and curves
01:09:56.620 | and traffic lights and that kind of thing.
01:10:02.380 | Once you have an accurate vector space,
01:10:08.500 | the control problem is similar to that of a video game,
01:10:11.680 | like a Grand Theft Auto or Cyberpunk,
01:10:14.100 | if you have accurate vector space.
01:10:16.260 | It's the control problem is,
01:10:17.700 | I wouldn't say it's trivial, it's not trivial,
01:10:20.300 | but it's not like some insurmountable thing.
01:10:25.300 | Having an accurate vector space is very difficult.
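A toy illustration of what "vector space" means here: a short list of typed objects with position and motion, rather than raw pixels. Purely illustrative assumptions; not Tesla's actual representation or interfaces.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    kind: str                        # "car", "pedestrian", "lane_line", "traffic_light", ...
    position_m: tuple                # (x, y) in the ego vehicle's frame, meters
    velocity_mps: tuple = (0.0, 0.0)

@dataclass
class VectorSpace:
    objects: list = field(default_factory=list)

scene = VectorSpace(objects=[
    TrackedObject("car", (12.0, -1.5), (8.0, 0.0)),
    TrackedObject("pedestrian", (25.0, 4.0), (-1.2, 0.0)),
    TrackedObject("lane_line", (0.0, 1.8)),
])

# Planning against `scene` is closer to a video game: a handful of typed
# vectors rather than the raw bit stream of photons from the cameras.
print(len(scene.objects), "objects instead of millions of pixels per frame")
```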
01:10:32.140 | - Yeah, I think we humans don't give enough respect
01:10:35.540 | to how incredible the human perception system is,
01:10:37.900 | to mapping the raw photons
01:10:41.500 | to the vector space representation in our heads.
01:10:44.660 | - Your brain is doing an incredible amount of processing
01:10:47.420 | and giving you an image that is a very cleaned up image.
01:10:51.360 | Like when we look around here,
01:10:53.380 | like you see color in the corners of your eyes,
01:10:55.340 | but actually your eyes have very few cones,
01:10:59.420 | like cone receptors in the peripheral vision.
01:11:02.260 | Your eyes are painting color in the peripheral vision.
01:11:05.700 | You don't realize it,
01:11:06.520 | but your eyes are actually painting color.
01:11:09.060 | And your eyes also have like these blood vessels
01:11:12.280 | and also some gnarly things, and there's a blind spot,
01:11:14.660 | but do you see your blind spot?
01:11:16.380 | No, your brain is painting in the missing, the blind spot.
01:11:21.180 | You can do these, like,
01:11:22.820 | see these things online where you look here
01:11:24.980 | and look at this point and then look at this point.
01:11:27.380 | And if it's in your blind spot,
01:11:30.480 | your brain will just fill in the missing bits.
01:11:33.660 | - The peripheral vision is so cool.
01:11:35.220 | - Yeah. - Makes you realize all the illusions
01:11:37.180 | for vision science,
01:11:38.020 | and so it makes you realize just how incredible the brain is.
01:11:40.640 | - The brain is doing crazy amount of post-processing
01:11:42.640 | on the vision signals for your eyes.
01:11:44.760 | It's insane.
01:11:46.680 | So, and then even once you get all those vision signals,
01:11:51.920 | your brain is constantly trying
01:11:54.720 | to forget as much as possible.
01:11:57.720 | So human memory is,
01:11:59.520 | perhaps the weakest thing about the brain is memory.
01:12:01.920 | So because memory is so expensive to our brain
01:12:05.280 | and so limited,
01:12:06.700 | your brain is trying to forget as much as possible
01:12:09.780 | and distill the things that you see
01:12:12.180 | into the smallest amounts of information possible.
01:12:16.540 | So your brain is trying to not just get to a vector space,
01:12:19.380 | but get to a vector space
01:12:20.920 | that is the smallest possible vector space
01:12:23.580 | of only relevant objects.
01:12:25.080 | And I think like, you can sort of look inside your brain,
01:12:29.900 | or at least I can,
01:12:31.340 | like when you drive down the road
01:12:33.160 | and try to think about what your brain
01:12:35.960 | is actually doing consciously.
01:12:38.980 | And it's like, you'll see a car,
01:12:43.980 | because you don't have cameras,
01:12:46.600 | you don't have eyes in the back of your head or the side.
01:12:48.980 | So you say like,
01:12:49.820 | you basically, your head is like a,
01:12:53.180 | you basically have like two cameras on a slow gimbal.
01:12:56.840 | And what's your,
01:13:00.460 | and eyesight's not that great.
01:13:01.700 | Okay, human eyes are,
01:13:02.900 | and people are constantly distracted
01:13:05.740 | and thinking about things and texting
01:13:07.220 | and doing all sorts of things they shouldn't do in a car,
01:13:09.300 | changing the radio station.
01:13:10.980 | Or having arguments.
01:13:15.060 | so then like, say like,
01:13:19.140 | like when's the last time you looked right and left
01:13:23.020 | and, or and rearward,
01:13:25.340 | or even diagonally forward
01:13:28.420 | to actually refresh your vector space.
01:13:31.220 | So you're glancing around
01:13:32.420 | and what your mind is doing is trying to distill
01:13:35.940 | relevant vectors,
01:13:38.460 | basically objects with a position and motion,
01:13:41.200 | and then editing that down to the least amount
01:13:47.860 | that's necessary for you to drive.
01:13:49.940 | - It does seem to be able to edit it down
01:13:53.260 | or compress it even further into things like concepts.
01:13:55.780 | So it's not, it's like it goes beyond,
01:13:57.660 | the human mind seems to go sometimes beyond vector space
01:14:01.260 | to sort of space of concepts to where you'll see a thing.
01:14:05.100 | It's no longer represented spatially somehow.
01:14:07.540 | It's almost like a concept that you should be aware of.
01:14:10.080 | Like if this is a school zone,
01:14:12.340 | you'll remember that as a concept,
01:14:14.940 | which is a weird thing to represent,
01:14:16.420 | but perhaps for driving,
01:14:17.500 | you don't need to fully represent those things.
01:14:20.460 | Or maybe you get those kind of indirectly.
01:14:26.300 | - You need to establish vector space
01:14:27.740 | and then actually have predictions for those vector spaces.
01:14:32.740 | So like if you drive past, say a bus,
01:14:39.100 | and you see that there's people,
01:14:47.340 | before you drove past the bus,
01:14:48.500 | you saw people crossing,
01:14:50.260 | or just imagine there's a large truck
01:14:52.820 | or something blocking site,
01:14:55.500 | but before you came out to the truck,
01:14:57.420 | you saw that there were some kids about to cross the road
01:15:00.700 | in front of the truck.
01:15:01.540 | Now you can no longer see the kids,
01:15:03.220 | but you need to be able,
01:15:04.940 | but you would now know,
01:15:05.940 | okay, those kids are probably gonna pass by the truck
01:15:08.980 | and cross the road, even though you cannot see them.
01:15:11.980 | So you have to have memory,
01:15:15.920 | you have to need to remember that there were kids there
01:15:19.060 | and you need to have some forward prediction
01:15:21.780 | of what their position will be--
01:15:23.940 | - That's a really hard problem.
01:15:24.780 | - At the time of relevance.
01:15:25.620 | - So with occlusions and computer vision,
01:15:28.500 | when you can't see an object anymore,
01:15:30.820 | even when it just walks behind a tree and reappears,
01:15:33.420 | that's a really, really,
01:15:35.100 | I mean, at least in academic literature,
01:15:37.100 | it's tracking through occlusions, it's very difficult.
01:15:40.600 | - Yeah, we're doing it.
01:15:41.940 | - I understand this.
01:15:43.460 | So some of it--
01:15:44.300 | - It's like object permanence,
01:15:45.380 | like same thing happens with the humans with neural nets.
01:15:47.700 | Like when a toddler grows up,
01:15:50.580 | there's a point in time where they develop,
01:15:54.420 | they have a sense of object permanence.
01:15:56.200 | So before a certain age,
01:15:57.380 | if you have a ball or a toy or whatever,
01:15:59.940 | and you put it behind your back and you pop it out,
01:16:02.640 | if they don't, before they have object permanence,
01:16:04.660 | it's like a new thing every time.
01:16:05.860 | It's like, whoa, this toy went, poof, disappeared,
01:16:08.260 | and now it's back again, and they can't believe it.
01:16:09.980 | And that they can play peekaboo all day long
01:16:12.140 | because peekaboo is fresh every time.
01:16:13.940 | (both laughing)
01:16:16.220 | But then we figured out object permanence,
01:16:18.220 | then they realize, oh no, the object is not gone,
01:16:20.360 | it's just behind your back.
01:16:21.760 | - Sometimes I wish we never did figure out object permanence.
01:16:26.380 | - Yeah, so that's--
01:16:27.740 | - That's an important problem to solve.
01:16:31.620 | - Yes, so like an important evolution
01:16:33.960 | of the neural nets in the car is
01:16:35.740 | memory across both time and space.
01:16:43.860 | So now you can't remember,
01:16:46.620 | like you have to say,
01:16:47.460 | like how long do you want to remember things for?
01:16:48.940 | And there's a cost to remembering things for a long time.
01:16:53.260 | So you run out of memory
01:16:55.720 | to try to remember too much for too long.
01:16:58.100 | And then you also have things that are stale
01:17:01.240 | if you remember them for too long.
01:17:03.540 | And then you also need things that are remembered over time.
01:17:06.880 | So even if you, like say, have like,
01:17:10.640 | for argument's sake, five seconds of memory on a time basis,
01:17:14.580 | but like, let's say you're parked at a light,
01:17:17.080 | and you saw, use a pedestrian example,
01:17:20.640 | that people were waiting to cross the road,
01:17:25.200 | and you can't quite see them because of an occlusion,
01:17:27.900 | but they might wait for a minute
01:17:30.560 | before the light changes for them to cross the road.
01:17:33.160 | You still need to remember that that's where they were,
01:17:36.920 | and that they're probably going to cross the road
01:17:39.040 | type of thing.
01:17:40.580 | So even if that exceeds your time-based memory,
01:17:44.720 | it should not exceed your space memory.
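A minimal sketch of memory across time and space as described: keep a track alive while it is occluded and predict where it would be now. Names, numbers, and structure are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Track:
    kind: str
    last_pos: tuple     # (x, y) in meters when last actually seen
    velocity: tuple     # (vx, vy) in m/s
    last_seen_s: float  # timestamp of the last detection

    def predicted_pos(self, now_s: float) -> tuple:
        dt = now_s - self.last_seen_s
        return (self.last_pos[0] + self.velocity[0] * dt,
                self.last_pos[1] + self.velocity[1] * dt)

# Kids seen walking toward the road before a truck occluded them.
kids = Track("pedestrian", last_pos=(20.0, 3.0), velocity=(0.0, -1.3), last_seen_s=0.0)

# Four seconds later they are still hidden, but the planner keeps a forward
# prediction instead of forgetting them just because no pixel shows them.
print(kids.predicted_pos(now_s=4.0))  # (20.0, -2.2)
```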
01:17:47.020 | - And I just think the data engine side of that,
01:17:50.520 | so getting the data to learn all of the concepts
01:17:53.520 | that you're saying now is an incredible process.
01:17:56.200 | It's this iterative process of just,
01:17:58.400 | it's this hydranet of many--
01:18:00.680 | - Hydranet.
01:18:01.520 | We're changing the name to something else.
01:18:05.400 | - Okay, I'm sure it'll be equally as Rick and Morty-like.
01:18:09.720 | - There's a lot of, yeah.
01:18:11.480 | We've re-architected the neural net,
01:18:13.760 | the neural nets in the cars so many times, it's crazy.
01:18:18.000 | - Oh, so every time there's a new major version,
01:18:20.040 | you'll rename it to something more ridiculous,
01:18:21.960 | or memorable and beautiful?
01:18:24.560 | Sorry, not ridiculous, of course.
01:18:26.340 | - If you see the full array of neural nets
01:18:32.520 | that are operating in the car, it kind of boggles the mind.
01:18:36.440 | There's so many layers, it's crazy.
01:18:39.920 | So, yeah.
01:18:41.920 | We started off with simple neural nets
01:18:48.880 | that were basically image recognition on a single frame
01:18:53.880 | from a single camera, and then trying to knit those together
01:18:59.880 | with C.
01:19:04.360 | I should say we're really primarily running C here,
01:19:07.680 | 'cause C++ is too much overhead.
01:19:10.000 | And we have our own C compiler.
01:19:11.640 | So, to get maximum performance,
01:19:13.560 | we actually wrote our own C compiler
01:19:15.800 | and are continuing to optimize our C compiler
01:19:18.040 | for maximum efficiency.
01:19:20.040 | In fact, we've just recently done a new rev
01:19:23.000 | on our C compiler that'll compile directly
01:19:25.240 | to our autopilot hardware.
01:19:26.960 | - So you wanna compile the whole thing down
01:19:28.960 | with your own compiler.
01:19:30.440 | - Yeah, absolutely. - So efficiency here,
01:19:32.640 | 'cause there's all kinds of compute, there's CPU, GPU,
01:19:34.960 | there's basic types of things,
01:19:37.360 | and you have to somehow figure out the scheduling
01:19:39.080 | across all of those things.
01:19:40.120 | And so you're compiling the code down.
01:19:42.040 | - Yeah. - It does all, okay.
01:19:43.480 | So that's why there's a lot of people involved.
01:19:46.800 | - There's a lot of hardcore software engineering
01:19:50.600 | at a very sort of bare metal level,
01:19:54.680 | 'cause we're trying to do a lot of compute
01:19:57.240 | that's constrained to our full self-driving computer.
01:20:02.240 | And we wanna try to have the highest frames per second
01:20:07.720 | possible with a sort of very finite amount
01:20:12.560 | of compute and power.
01:20:15.160 | So we really put a lot of effort
01:20:19.800 | into the efficiency of our compute.
01:20:21.720 | And so there's actually a lot of work done
01:20:26.040 | by some very talented software engineers at Tesla
01:20:29.640 | that at a very foundational level
01:20:33.120 | to improve the efficiency of compute
01:20:35.200 | and how we use the trip accelerators,
01:20:38.920 | which are basically doing matrix math dot products,
01:20:43.920 | like a bazillion dot products.
01:20:47.360 | And it's like, what are neural nets?
01:20:49.560 | It's like, compute-wise, like 99% dot products.
01:20:53.160 | (laughs)
01:20:54.320 | So, you know.
01:20:57.040 | - And you wanna achieve as many high frame rates,
01:20:59.680 | like a video game.
01:21:00.560 | You want full resolution, high frame rate.
01:21:04.960 | - High frame rate, low latency, low jitter.
01:21:09.960 | So I think one of the things
01:21:16.000 | we're moving towards now is no post-processing
01:21:20.560 | of the image through the image signal processor.
01:21:25.560 | So like, what happens for cameras is that,
01:21:32.600 | well, almost all cameras is they,
01:21:34.280 | there's a lot of post-processing done
01:21:37.720 | in order to make pictures look pretty.
01:21:40.280 | And so we don't care about pictures looking pretty.
01:21:43.040 | We just want the data.
01:21:45.400 | So we're moving to just raw photon counts.
01:21:48.800 | So the system will, like the image
01:21:52.640 | that the computer sees is actually much more
01:21:56.360 | than what you'd see if you represented it on a camera.
01:21:59.080 | It's got much more data.
01:22:00.800 | And even in very low light conditions,
01:22:02.560 | you can see that there's a small photon count difference
01:22:05.240 | between this spot here and that spot there,
01:22:08.800 | which means that, so it can see in the dark incredibly well
01:22:12.920 | because it can detect these tiny differences
01:22:15.160 | in photon counts.
01:22:16.160 | Like much better than you could possibly imagine.
01:22:19.840 | So, and then we also save 13 milliseconds on a latency.
01:22:28.280 | - From removing the post-processing in the image?
01:22:31.240 | - Yes. - Yeah.
01:22:32.080 | - It's like, 'cause we've got eight cameras
01:22:35.840 | and then there's roughly, I don't know,
01:22:39.800 | one and a half milliseconds or so,
01:22:42.000 | maybe 1.6 milliseconds of latency for each camera.
01:22:46.240 | And so like going to just,
01:22:51.240 | basically bypassing the image processor
01:22:56.320 | gets us back 13 milliseconds of latency, which is important.
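The arithmetic implied here, spelled out. The per-camera ISP latency and camera count are the figures from the conversation, summed as the conversation describes.

```python
cameras = 8
isp_latency_ms = 1.6  # roughly 1.5-1.6 ms of image-signal-processor latency per camera
print(f"~{cameras * isp_latency_ms:.0f} ms recovered by bypassing the ISP")  # ~13 ms
```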
01:23:01.520 | And we track latency all the way from,
01:23:03.440 | photon hits the camera to all the steps
01:23:07.400 | that it's gotta go through to get,
01:23:09.080 | go through the various neural nets and the C code.
01:23:13.320 | And there's a little bit of C++ there as well.
01:23:16.540 | Well, maybe a lot, but the core stuff is,
01:23:21.360 | the heavy duty computers all in C.
01:23:23.080 | And so we track that latency all the way
01:23:28.840 | to an output command to the drive unit
01:23:32.000 | to accelerate the brakes, just to slow down the steering,
01:23:36.600 | you know, turn left or right.
01:23:38.040 | So, 'cause you got to output a command
01:23:40.680 | that's gonna go to a controller.
01:23:41.800 | And like some of these controllers have an update frequency
01:23:44.120 | that's maybe 10 Hertz or something like that, which is slow.
01:23:47.280 | That's like, now you lose a hundred milliseconds potentially.
01:23:50.240 | So then we wanna update the drivers on the,
01:23:55.240 | like say steering and braking control
01:23:58.720 | to have more like a hundred Hertz instead of 10 Hertz.
01:24:02.920 | And you've got a 10 millisecond latency
01:24:04.400 | instead of a hundred milliseconds, worst case latency.
01:24:06.400 | And actually jitter is more of a challenge than latency.
01:24:09.520 | 'Cause latency is like, you can anticipate and predict,
01:24:12.080 | but if you've got a stack up of things
01:24:14.680 | going from the camera to the computer,
01:24:17.200 | through then a series of other computers,
01:24:19.400 | and finally to an actuator on the car.
01:24:22.120 | If you have a stack up of tolerances, of timing tolerances,
01:24:26.960 | then you can have quite a variable latency,
01:24:29.000 | which is called jitter.
01:24:30.360 | And that makes it hard to anticipate exactly
01:24:35.360 | how you should turn the car or accelerate.
01:24:37.440 | Because if you've got maybe a hundred,
01:24:40.560 | 150, 200 milliseconds of jitter,
01:24:42.400 | then you could be off by up to 0.2 seconds.
01:24:45.560 | And this could make a big difference.
01:24:47.560 | - So you have to interpolate somehow
01:24:48.880 | to deal with the effects of jitter.
01:24:52.400 | So you can make like robust control decisions.
01:24:56.940 | Yeah, so the jitter is in the sensor information
01:25:01.600 | or the jitter can occur at any stage in the pipeline.
01:25:04.960 | - You can, if you have just, if you have fixed latency,
01:25:07.760 | you can anticipate and like say, okay,
01:25:11.920 | we know that our information is for argument's sake,
01:25:16.520 | 150 milliseconds stale.
01:25:19.000 | Like, so for argument's sake,
01:25:22.080 | 150 milliseconds from photon hitting the camera
01:25:24.720 | to where you can measure a change
01:25:28.480 | in the acceleration of the vehicle.
01:25:30.820 | So then you can say, okay, well, we're gonna,
01:25:37.200 | we know it's 150 milliseconds,
01:25:39.400 | so we're gonna take that into account
01:25:40.960 | and compensate for that latency.
01:25:44.240 | However, if you've got then 150 milliseconds of latency
01:25:47.320 | plus 100 milliseconds of jitter,
01:25:49.640 | which could be anywhere from zero
01:25:51.000 | to 100 milliseconds on top,
01:25:52.520 | so then your latency could be from 150 to 250 milliseconds.
01:25:55.760 | Now you've got 100 milliseconds
01:25:56.720 | that you don't know what to do with.
01:25:58.320 | And that's basically random.
01:25:59.940 | So getting rid of jitter is extremely important.
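A small sketch of why jitter is harder to handle than fixed latency, using the 150 ms and 100 ms figures from the conversation; the vehicle speed is an illustrative assumption.

```python
fixed_latency_s = 0.150
jitter_s = 0.100
speed_mps = 20.0  # roughly 45 mph, an assumed speed for illustration

# With a known, fixed latency you simply predict the world forward by exactly 150 ms.
# With jitter, the true age of your data lies anywhere in [150, 250] ms,
# so up to 100 ms of motion is unaccounted for.
worst_case_position_error_m = jitter_s * speed_mps
print(f"up to {worst_case_position_error_m:.1f} m of position uncertainty from jitter alone")
```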
01:26:04.280 | - And that affects your control decisions
01:26:05.880 | and all those kinds of things, okay.
01:26:07.900 | - Yeah, the car's just gonna fundamentally maneuver better
01:26:11.160 | with lower jitter.
01:26:12.000 | - Got it.
01:26:13.760 | - And the cars will maneuver with superhuman ability
01:26:16.760 | and reaction time much faster than a human.
01:26:19.020 | I mean, I think over time,
01:26:21.960 | the autopilot, the full self-driving
01:26:24.880 | will be capable of maneuvers that
01:26:26.800 | are far more than what James Bond could do
01:26:34.840 | in the best movie type of thing.
01:26:36.400 | - That's exactly what I was imagining in my mind,
01:26:38.440 | as you said.
01:26:39.280 | - It's like impossible maneuvers that a human couldn't do.
01:26:45.040 | - Well, let me ask sort of looking back the six years,
01:26:48.680 | looking out into the future,
01:26:50.240 | based on your current understanding,
01:26:51.800 | how hard do you think this full self-driving problem,
01:26:55.360 | when do you think Tesla will solve level four FSD?
01:26:58.780 | - I mean, it's looking quite likely
01:27:02.300 | that it will be next year.
01:27:03.600 | - And what does the solution look like?
01:27:07.000 | Is it the current pool of FSD beta candidates,
01:27:10.160 | they start getting greater and greater
01:27:13.120 | as they have been degrees of autonomy,
01:27:15.720 | and then there's a certain level beyond which
01:27:18.680 | they can do their own thing, they can read a book.
01:27:22.840 | - Yeah, so, I mean, you can see that
01:27:26.480 | anybody who's been following the full self-driving beta
01:27:29.320 | closely will see that the rate of disengagements
01:27:34.320 | has been dropping rapidly.
01:27:37.400 | So like a disengagement being where the driver intervenes
01:27:40.680 | to prevent the car from doing something.
01:27:43.080 | That's dangerous potentially.
01:27:44.520 | So the interventions per million miles
01:27:53.200 | has been dropping dramatically.
01:27:55.160 | At some point, and that trend looks like
01:27:59.760 | it happens next year, is that the probability
01:28:04.200 | of an accident on FSD is less than that of the average human,
01:28:09.200 | and then significantly less than that of the average human.
01:28:13.920 | So it certainly appears like we will get there next year.
01:28:18.920 | Then of course, then there's gonna be a case of,
01:28:24.760 | okay, well, we now have to prove this to regulators
01:28:26.840 | and prove it to, and we want a standard
01:28:29.440 | that is not just equivalent to a human,
01:28:31.900 | but much better than the average human.
01:28:34.240 | I think it's gotta be at least two or three times
01:28:36.920 | higher safety than a human.
01:28:39.000 | So two or three times lower probability of injury
01:28:41.360 | than a human before we would actually say like,
01:28:44.840 | okay, it's okay to go.
01:28:45.920 | It's not gonna be equivalent, it's gonna be much better.
01:28:48.360 | - So if you look at 10 point, FSD 10.6 just came out
01:28:52.840 | recently, 10.7 is on the way.
01:28:55.440 | Maybe 11 is on the way somewhere in the future.
01:28:58.440 | - Yeah, we were hoping to get 11 out this year,
01:29:01.040 | but it's, 11 actually has a whole bunch
01:29:04.880 | of fundamental rewrites on the neural net architecture,
01:29:10.840 | and some fundamental improvements in creating vector space.
01:29:18.320 | - There is some fundamental leap
01:29:22.240 | that really deserves the 11.
01:29:23.960 | I mean, that's a pretty cool number.
01:29:25.160 | - Yeah.
01:29:26.000 | 11 would be a single stack for all,
01:29:31.040 | one stack to rule them all.
01:29:32.400 | But there are just some really fundamental
01:29:40.540 | neural net architecture changes that will allow
01:29:44.720 | for much more capability, but at first
01:29:49.720 | they're gonna have issues.
01:29:51.120 | So like we have this working on like sort of alpha software
01:29:54.680 | and it's good, but it's basically taking a whole bunch
01:29:59.680 | of C, C++ code and deleting a massive amount
01:30:04.600 | of C++ code and replacing it with a neural net.
01:30:06.440 | And Andre makes this point a lot,
01:30:09.160 | which is like neural nets are kind of eating software.
01:30:12.500 | Over time there's like less and less conventional software,
01:30:15.880 | more and more neural net, which is still software,
01:30:18.260 | but it still comes out as lines of software,
01:30:21.100 | but more neural net stuff and less heuristics basically.
01:30:26.100 | If you're, more matrix-based stuff
01:30:36.720 | and less heuristics-based stuff.
01:30:40.120 | And, you know, like one of the big changes will be,
01:30:47.660 | like right now the neural nets will deliver
01:30:55.900 | a giant bag of points to the C++ or C and C++ code.
01:31:03.720 | - Yeah.
01:31:05.820 | - We call it the giant bag of points.
01:31:07.400 | - Yeah.
01:31:08.240 | - And it's like, so you've got a pixel
01:31:09.660 | and something associated with that pixel.
01:31:13.140 | Like this pixel is probably car,
01:31:14.980 | this pixel is probably lane line.
01:31:16.960 | Then you've got to assemble this giant bag of points
01:31:20.460 | in the C code and turn it into vectors.
01:31:24.380 | And we do a pretty good job of it,
01:31:28.820 | but it's, we want to just, we need another layer
01:31:34.240 | of neural nets on top of that to take the giant bag
01:31:37.420 | of points and distill that down to vector space
01:31:42.360 | in the neural net part of the software,
01:31:45.320 | as opposed to the heuristics part of the software.
01:31:48.680 | This is a big improvement.
01:31:50.080 | - Neural nets all the way down, is what you want.
01:31:53.120 | - It's not even neural, all neural nets,
01:31:55.200 | but it's, this will be just a, this is a game changer
01:31:58.920 | to not have the bag of points, the giant bag of points
01:32:02.800 | that has to be assembled with many lines of C++
01:32:06.800 | and have the, and have a neural net
01:32:10.240 | just assemble those into a vector.
01:32:12.200 | So the neural net is outputting much, much less data.
01:32:17.200 | It's outputting, this is a lane line, this is a curb,
01:32:22.680 | this is drivable space, this is a car,
01:32:24.480 | this is a pedestrian or cyclist or something like that.
01:32:29.240 | It's outputting proper vectors to the C++ control code
01:32:34.240 | as opposed to the sort of constructing the vectors in C.
01:32:44.800 | We've done, I think, quite a good job of,
01:32:52.240 | but it's, we're kind of hitting a local maximum
01:32:55.780 | on how well the C can do this.
01:32:58.760 | So this is really a big deal.
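A schematic contrast of the two approaches described: the per-pixel "giant bag of points" plus heuristic post-processing versus a net that emits object vectors directly. Shapes, names, and structure are illustrative assumptions, not Tesla's actual code.

```python
import numpy as np

H, W, CLASSES = 120, 160, 4  # tiny per-pixel score map for illustration

# "Giant bag of points": per-pixel class scores that hand-written post-processing
# code has to cluster and assemble into objects.
bag_of_points = np.random.rand(H, W, CLASSES)

def heuristic_post_process(scores):
    # stand-in for many lines of C/C++ clustering and assembly logic
    labels = scores.argmax(axis=-1)
    return [{"class": k, "pixels": int((labels == k).sum())} for k in range(scores.shape[-1])]

# The described alternative: a further network stage emits object vectors directly,
# so the downstream control code consumes a short typed list instead of a pixel soup.
object_vectors = [
    {"kind": "lane_line", "points": [(0.0, 1.8), (30.0, 1.9)]},
    {"kind": "car", "pos": (12.0, -1.5), "vel": (8.0, 0.0)},
    {"kind": "pedestrian", "pos": (25.0, 4.0), "vel": (-1.2, 0.0)},
]

print(H * W, "scored pixels vs", len(object_vectors), "vectors")
```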
01:33:02.800 | And just all of the networks in the car
01:33:04.760 | need to move to surround video.
01:33:06.720 | There's still some legacy networks
01:33:07.960 | that are not surround video.
01:33:10.280 | And all of the training needs to move to surround video
01:33:14.880 | and the efficiency of the training,
01:33:17.040 | it needs to get better and it is.
01:33:18.760 | And then we need to move everything to raw photon counts
01:33:25.480 | as opposed to processed images.
01:33:28.480 | Which is quite a big reset on the training
01:33:31.440 | 'cause the system's trained on post-processed images.
01:33:35.240 | So we need to redo all the training
01:33:36.960 | to train against the raw photon counts
01:33:41.960 | instead of the post-processed image.
01:33:43.960 | - So ultimately it's kind of reducing the complexity
01:33:46.440 | of the whole thing.
01:33:47.280 | So reducing-- - Yeah.
01:33:49.360 | - Reducing-- - Lines of code
01:33:50.360 | will actually go lower.
01:33:51.960 | - Yeah, that's fascinating.
01:33:54.400 | So you're doing fusion of all the sensors,
01:33:56.240 | reducing the complexity of having to deal with--
01:33:57.840 | - Fusion of the cameras.
01:33:58.680 | There's a lot of cameras really.
01:34:00.200 | - Right, yes.
01:34:01.040 | Same with humans.
01:34:04.920 | I guess we got ears too, okay.
01:34:07.520 | - Yeah, well actually you need to incorporate sound as well
01:34:11.240 | 'cause you need to listen for ambulance sirens
01:34:14.600 | or fire trucks, if somebody like yelling at you
01:34:19.600 | or something, I don't know.
01:34:21.120 | There's a little bit of audio
01:34:23.040 | that needs to be incorporated as well.
01:34:24.800 | - Do you need to go back to break?
01:34:26.280 | - Yeah, sure, let's take a break.
01:34:27.600 | - Okay.
01:34:28.960 | - Honestly, frankly, the ideas are the easy thing
01:34:33.520 | and the implementation is the hard thing.
01:34:35.080 | Like the idea of going to the moon is the easy part.
01:34:37.760 | Now going to the moon is the hard part.
01:34:39.160 | - It's the hard part.
01:34:40.400 | - And there's a lot of hardcore engineering
01:34:42.720 | that's gotta get done at the hardware and software level,
01:34:46.880 | like I said, optimizing the C compiler
01:34:48.960 | and just cutting out latency everywhere.
01:34:53.960 | If we don't do this, the system will not work properly.
01:35:00.040 | So the work of the engineers doing this,
01:35:03.040 | they are like the unsung heroes,
01:35:05.560 | but they are critical to the success of the situation.
01:35:09.080 | - I think you made it clear.
01:35:10.120 | I mean, at least to me, it's super exciting,
01:35:11.640 | everything that's going on outside of what Andre's doing.
01:35:15.360 | Just the whole infrastructure of the software.
01:35:17.880 | I mean, everything that's going on with Data Engine,
01:35:20.040 | whatever it's called, the whole process is just work of art.
01:35:24.760 | - The sheer scale of it boggles the mind.
01:35:26.800 | Like the training, the amount of work done with,
01:35:29.160 | like we've written all this custom software
01:35:30.680 | for training and labeling and to do auto-labeling.
01:35:34.960 | Auto-labeling is essential.
01:35:36.320 | 'Cause especially when you've got surround video,
01:35:41.200 | it's very difficult to label surround video from scratch.
01:35:45.160 | It's extremely difficult.
01:35:46.520 | It would take a human such a long time
01:35:50.320 | to even label one video clip, like several hours.
01:35:52.880 | Or the auto-labeler, basically we just apply
01:35:57.400 | like heavy duty, like a lot of compute to the video clips
01:36:02.400 | to pre-assign and guess what all the things are
01:36:07.400 | that are going on in the surround video.
01:36:08.960 | - And then there's like correcting it.
01:36:10.360 | - Yeah, and then all the human has to do is like tweak,
01:36:12.920 | like say, adjust what is incorrect.
01:36:15.960 | This like increases productivity by a hundred or more.
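Rough arithmetic behind the hundred-or-more productivity point, read here as roughly a hundredfold. Both the from-scratch labeling time ("several hours" per clip, per the conversation) and the review time are assumptions.

```python
from_scratch_hours = 5.0  # labeling one surround clip by hand from scratch
review_minutes = 3.0      # assumed time to correct an auto-labeled clip

gain = from_scratch_hours * 60 / review_minutes
print(f"~{gain:.0f}x more clips labeled per human-hour")  # ~100x
```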
01:36:20.960 | - Yeah, so you've presented TeslaBot
01:36:23.440 | as primarily useful in the factory.
01:36:25.160 | First of all, I think humanoid robots are incredible.
01:36:28.320 | From a fan of robotics, I think the elegance of movement
01:36:32.360 | that humanoid robots, that bipedal robots show
01:36:36.960 | are just so cool.
01:36:38.000 | So it's really interesting that you're working on this
01:36:40.760 | and also talking about applying the same kind of,
01:36:43.240 | all the ideas of some of which we've talked about
01:36:45.440 | with data engine, all the things that we're talking about
01:36:47.920 | with Tesla autopilot, just transferring that over
01:36:51.400 | to the just yet another robotics problem.
01:36:54.560 | I have to ask, since I care about human-robot interactions,
01:36:57.840 | so the human side of that,
01:36:59.200 | so you've talked about mostly in the factory.
01:37:01.600 | Do you see part of this problem that TeslaBot has to solve
01:37:06.040 | is interacting with humans and potentially having a place
01:37:08.840 | like in the home?
01:37:10.520 | So interacting, not just not replacing labor,
01:37:13.320 | but also like, I don't know, being a friend
01:37:17.160 | or an assistant or something like that.
01:37:18.000 | - Yeah, yeah, I think the possibilities are endless.
01:37:22.440 | Yeah, I mean, it's obviously,
01:37:29.760 | it's not quite in Tesla's primary mission direction
01:37:32.680 | of accelerating sustainable energy,
01:37:34.440 | but it is an extremely useful thing
01:37:37.800 | that we can do for the world,
01:37:38.840 | which is to make a useful humanoid robot
01:37:42.120 | that is capable of interacting with the world
01:37:44.720 | and helping in many different ways.
01:37:49.080 | So in fact, reason, and really just,
01:37:53.800 | I mean, I think if you say like extrapolate
01:37:57.480 | to many years in the future,
01:37:59.280 | it's like, I think work will become optional.
01:38:04.080 | So like there's a lot of jobs that if people
01:38:09.480 | weren't paid to do it, they wouldn't do it.
01:38:12.000 | Like it's not fun, necessarily.
01:38:14.280 | Like if you're washing dishes all day,
01:38:16.480 | it's like, eh, even if you really like washing dishes,
01:38:19.720 | do you really wanna do it for eight hours a day, every day?
01:38:22.120 | Probably not.
01:38:22.960 | And then there's like dangerous work.
01:38:27.320 | And basically if it's dangerous, boring,
01:38:29.320 | has like potential for repetitive stress injury,
01:38:33.000 | that kind of thing, then that's really where
01:38:36.600 | humanoid robots would add the most value initially.
01:38:41.040 | So that's what we're aiming for is to,
01:38:44.460 | for the humanoid robots to do jobs
01:38:47.840 | that people don't voluntarily want to do.
01:38:50.160 | And then we'll have to pair that obviously
01:38:53.480 | with some kind of universal basic income in the future.
01:38:57.160 | So I think.
01:38:59.220 | - So do you see a world when there's like hundreds
01:39:02.920 | of millions of Tesla bots doing different,
01:39:07.480 | performing different tasks throughout the world?
01:39:10.760 | - Yeah, I haven't really thought about it
01:39:13.400 | that far into the future,
01:39:14.240 | but I guess that there may be something like that.
01:39:16.740 | - Can I ask a wild question?
01:39:21.720 | So the number of Tesla cars has been accelerating,
01:39:25.520 | has been close to 2 million produced.
01:39:28.260 | Many of them have autopilot.
01:39:30.320 | - I think we're over 2 million now, yeah.
01:39:32.400 | - Do you think there will ever be a time
01:39:33.800 | when there'll be more Tesla bots than Tesla cars?
01:39:37.280 | (silence)
01:39:39.460 | - Yeah, actually it's funny you asked this question
01:39:44.240 | because normally I do try to think pretty far
01:39:46.720 | into the future, but I haven't really thought
01:39:47.800 | that far into the future with the Tesla bot
01:39:52.600 | or by its codename, Optimus.
01:39:54.920 | I call it Optimus Subprime.
01:39:56.940 | (laughs)
01:39:59.320 | Because it's not like a giant transformer robot.
01:40:04.400 | So, but it's meant to be a general purpose, helpful bot.
01:40:09.400 | And basically, like the things that we're,
01:40:17.080 | basically, like Tesla, I think,
01:40:20.640 | has the most advanced real-world AI
01:40:24.240 | for interacting with the real world,
01:40:26.400 | which we've developed as a function
01:40:28.160 | to make self-driving work.
01:40:29.600 | And so, along with custom hardware
01:40:33.520 | and like a lot of hardcore low-level software
01:40:38.040 | to have it run efficiently and be power efficient,
01:40:41.240 | because it's one thing to do neural nets
01:40:43.480 | if you've got a gigantic server room with 10,000 computers,
01:40:45.800 | but now let's say you have to now distill that down
01:40:48.680 | into one computer that's running at low power
01:40:51.120 | in a humanoid robot or a car.
01:40:53.520 | That's actually very difficult.
01:40:54.480 | A lot of hardcore software work is required for that.
01:40:57.040 | So, since we're kind of like solving the,
01:41:02.840 | navigate the real world with neural nets problem for cars,
01:41:07.840 | which are kind of like robots with four wheels,
01:41:10.080 | then it's like kind of a natural extension of that
01:41:12.480 | is to put it in a robot with arms and legs and actuators.
01:41:17.480 | So, like the two hard things are,
01:41:25.400 | like you basically need to make the,
01:41:31.480 | have the robot be intelligent enough
01:41:33.120 | to interact in a sensible way with the environment.
01:41:35.680 | So, you just need real world AI
01:41:38.800 | and you need to be very good at manufacturing,
01:41:43.560 | which is a very hard problem.
01:41:44.880 | Tesla's very good at manufacturing
01:41:46.720 | and also has the real world AI.
01:41:49.400 | So, making the humanoid robot work is,
01:41:52.200 | basically means developing custom motors and sensors
01:42:00.760 | that are different from what a car would use.
01:42:03.000 | But we've also, we have,
01:42:05.920 | I think we have the best expertise
01:42:10.320 | in developing advanced electric motors and power electronics.
01:42:15.200 | So, it just has to be for a humanoid robot application
01:42:19.600 | not a car.
01:42:20.440 | - Still, you do talk about love sometimes.
01:42:25.720 | So, let me ask, this isn't like for like sex robots
01:42:28.600 | or something like that. - Love is the answer.
01:42:31.560 | There is something compelling to us,
01:42:35.640 | not compelling, but we connect with humanoid robots
01:42:39.520 | or even legged robots, like with a dog
01:42:41.480 | in shapes of dogs.
01:42:43.160 | It just, it seems like, you know,
01:42:45.360 | there's a huge amount of loneliness in this world.
01:42:48.200 | All of us seek companionship with other humans,
01:42:50.840 | friendship and all those kinds of things.
01:42:52.720 | Here in Austin, a lot of people have dogs.
01:42:55.960 | - Sure.
01:42:56.800 | - There seems to be a huge opportunity
01:42:58.480 | to also have robots that decrease
01:43:01.160 | the amount of loneliness in the world
01:43:06.200 | or help us humans connect with each other.
01:43:09.720 | So, in a way that dogs can.
01:43:11.560 | Do you think about that with Tesla Bot at all,
01:43:14.720 | or is it really focused on the problem
01:43:16.920 | of performing specific tasks, not connecting with humans?
01:43:21.120 | - I mean, to be honest, I have not actually thought about it
01:43:26.000 | from the companionship standpoint,
01:43:27.760 | but I think it actually would end up being,
01:43:30.160 | it could be actually a very good companion.
01:43:32.320 | And it could, you develop like a personality
01:43:39.040 | over time that is like unique.
01:43:44.040 | Like, you know, it's not like they're just,
01:43:45.880 | all the robots are the same.
01:43:47.400 | And that personality could evolve to be, you know,
01:43:53.400 | match the owner or the, you know, I guess, the owner.
01:43:58.400 | - Well--
01:44:00.560 | - Whatever you wanna call it.
01:44:02.240 | - The other half, right?
01:44:04.200 | In the same way that friends do.
01:44:06.560 | See, I think that's a huge opportunity.
01:44:08.840 | I think--
01:44:09.680 | - Yeah, no, that's interesting.
01:44:11.080 | 'Cause, you know, like there's a Japanese phrase,
01:44:17.480 | I like the, wabi-sabi, you know, the subtle imperfections
01:44:22.080 | are what makes something special.
01:44:24.080 | And the subtle imperfections of the personality of the robot
01:44:27.880 | mapped to the subtle imperfections
01:44:29.680 | of the robot's human friend, I don't know,
01:44:34.680 | owner sounds like maybe the wrong word,
01:44:36.480 | but could actually make an incredible buddy, basically.
01:44:41.480 | - And in that way, the imperfections--
01:44:43.640 | - Like R2D2 or like a C3PO sort of thing, you know?
01:44:46.440 | - So from a machine learning perspective,
01:44:49.920 | I think the flaws being a feature is really nice.
01:44:54.000 | You could be quite terrible at being a robot
01:44:56.160 | for quite a while in the general home environment
01:44:58.920 | or in the general world, and that's kind of adorable.
01:45:02.720 | And that's like, those are your flaws,
01:45:04.720 | and you fall in love with those flaws.
01:45:06.720 | So it's very different than autonomous driving,
01:45:09.880 | where it's a very high stakes environment
01:45:11.600 | you cannot mess up.
01:45:13.200 | And so it's more fun to be a robot in the home.
01:45:17.800 | - Yeah, in fact, if you think of like a C3PO
01:45:19.640 | and R2D2, they actually had a lot of flaws
01:45:23.440 | and imperfections and silly things,
01:45:25.280 | and they would argue with each other.
01:45:27.120 | - Were they actually good at doing anything?
01:45:31.240 | I'm not exactly sure.
01:45:33.960 | - I think they definitely added a lot to the story.
01:45:36.400 | But there's sort of quirky elements
01:45:40.600 | that they would make mistakes and do things.
01:45:45.480 | It was like it made them relatable, I don't know,
01:47:49.720 | endearing.
01:45:51.680 | So yeah, I think that that could be something
01:45:55.440 | that probably would happen.
01:45:57.320 | But our initial focus is just to make it useful.
01:46:01.920 | I'm confident we'll get it done.
01:46:06.760 | I'm not sure what the exact timeframe is,
01:46:08.200 | but we'll probably have, I don't know,
01:46:11.240 | a decent prototype towards the end of next year
01:46:13.720 | or something like that.
01:46:15.520 | - And it's cool that it's connected to Tesla, the car.
01:46:18.840 | - Yeah, it's using a lot of,
01:46:22.640 | it would use the autopilot inference computer,
01:46:25.240 | and a lot of the training that we've done for cars
01:46:29.280 | in terms of recognizing real-world things
01:46:32.240 | could be applied directly to the robot.
01:46:35.800 | But there's a lot of custom actuators and sensors
01:46:40.760 | that need to be developed.
01:46:42.480 | - And an extra module on top of the vector space for love.
01:46:46.440 | That's me saying.
01:46:48.200 | Okay.
01:46:51.440 | - We can add that to the car too.
01:46:53.280 | - That's true.
01:46:54.120 | That could be useful in all environments.
01:46:57.440 | Like you said, a lot of people argue in the car,
01:46:59.320 | so maybe we can help them out.
01:47:00.840 | You're a student of history,
01:47:03.560 | fan of Dan Carlin's Hardcore History podcast.
01:47:06.120 | - Yeah, it's great.
01:47:06.960 | - Greatest podcast ever.
01:47:08.520 | - Yeah, I think it is, actually.
01:47:12.080 | It almost doesn't really count as a podcast.
01:47:13.800 | - Yeah, it's more like an audio book.
01:47:16.760 | - So you were on the podcast with Dan,
01:47:18.760 | just had a chat with him about it.
01:47:21.120 | He said you guys went military and all that kind of stuff.
01:47:23.560 | - Yeah, it was basically,
01:47:26.140 | it should be titled Engineer Wars.
01:47:31.840 | Essentially, when there's a rapid change
01:47:35.240 | in the rate of technology,
01:47:36.640 | then engineering plays a pivotal role in victory in battle.
01:47:42.240 | - How far back in history did you go?
01:47:45.480 | Did you go World War II?
01:47:47.280 | - It was mostly, well, it was supposed to be a deep dive
01:47:50.760 | on fighters and bomber technology in World War II,
01:47:55.360 | but that ended up being more wide ranging than that.
01:47:58.600 | 'Cause I just went down the total rat hole
01:48:00.400 | of studying all of the fighters and bombers in World War II
01:48:04.760 | and the constant rock, paper, scissors game
01:48:07.320 | that one country would make this plane,
01:48:11.000 | and then it'd make a plane to beat that,
01:48:12.240 | and that country would make a plane to beat that,
01:48:13.840 | and then they'd do it.
01:48:15.520 | And really what matters is the pace of innovation
01:48:18.560 | and also access to high quality fuel and raw materials.
01:48:23.560 | So Germany had some amazing designs,
01:48:27.720 | but they couldn't make them
01:48:29.120 | because they couldn't get the raw materials.
01:48:31.520 | And they had a real problem with the oil and fuel, basically.
01:48:36.520 | The fuel quality was extremely variable.
01:48:39.960 | - So the design wasn't the bottleneck?
01:48:42.360 | - Yeah, the US had kick-ass fuel that was very consistent.
01:48:47.240 | The problem is if you make a very high performance
01:48:48.840 | aircraft engine, in order to make high performance,
01:48:51.520 | you have to, the fuel, the aviation gas,
01:48:56.520 | has to be a consistent mixture
01:49:01.440 | and it has to have a high octane.
01:49:05.360 | High octane is the most important thing,
01:49:08.960 | but also can't have impurities and stuff
01:49:11.680 | 'cause you'll foul up the engine.
01:49:14.080 | And Germany just never had good access to oil.
01:49:16.280 | Like they tried to get it by invading the Caucasus,
01:49:19.080 | but that didn't work too well.
01:49:20.580 | (laughing)
01:49:21.840 | - Never works well.
01:49:22.680 | - Didn't work out for them.
01:49:24.280 | - See you, Jerry.
01:49:25.120 | See you, Jerry.
01:49:26.520 | - Nice to meet you.
01:49:27.560 | So Germany was always struggling with basically shitty oil.
01:49:31.000 | And so then they couldn't count on high quality fuel
01:49:36.000 | for their aircraft, so then they had to have
01:49:38.160 | all these additives and stuff.
01:49:40.080 | Whereas the US had awesome fuel
01:49:45.520 | and they provided that to Britain as well.
01:49:48.280 | So that allowed the British and the Americans
01:49:51.160 | to design aircraft engines that were super high performance,
01:49:55.000 | better than anything else in the world.
01:49:56.840 | Germany could design the engines,
01:49:59.920 | they just didn't have the fuel.
01:50:01.040 | And then also the quality of the aluminum alloys
01:50:05.800 | that they were getting was also not that great.
01:50:07.400 | And so, you know.
01:50:09.280 | - Is this like, you talked about all this with Dan?
01:50:11.560 | - Yep.
01:50:12.400 | - Awesome.
01:50:13.640 | Broadly looking at history, when you look at Genghis Khan,
01:50:16.760 | when you look at Stalin, Hitler,
01:50:19.360 | the darkest moments of human history,
01:50:21.260 | what do you take away from those moments?
01:50:23.920 | Does it help you gain insight about human nature,
01:50:26.000 | about human behavior today?
01:50:27.480 | Whether it's the wars or the individuals
01:50:30.520 | or just the behavior of people, any aspects of history?
01:50:33.560 | (computer mouse clicking)
01:50:37.060 | - Yeah, I find history fascinating.
01:50:43.020 | I mean, there's just a lot of incredible things
01:50:51.320 | that have been done, good and bad,
01:50:52.960 | that they help you understand the nature of civilization.
01:51:03.520 | And individuals and--
01:51:05.080 | - Does it make you sad that humans
01:51:07.480 | do these kinds of things to each other?
01:51:09.560 | You look at the 20th century, World War II,
01:51:12.360 | the cruelty, the abuse of power.
01:51:15.720 | Talk about communism, Marxism, Stalin.
01:51:19.360 | - I mean, some of these things do,
01:51:21.640 | I mean, if you, like there's a lot of human history.
01:51:25.320 | Most of it is actually people just getting on
01:51:27.560 | with their lives, you know?
01:51:28.800 | And it's not like human history is just
01:51:33.000 | nonstop war and disaster.
01:51:36.280 | Those are actually just, those are intermittent and rare.
01:51:38.920 | And if they weren't, then humans would soon cease to exist.
01:51:43.260 | But it's just that wars tend to be written about a lot,
01:51:50.080 | and whereas like, something being like,
01:51:54.200 | well, a normal year where nothing major happened
01:51:56.960 | just doesn't get written about much.
01:51:58.480 | But that's, you know, most people just like farming
01:52:01.160 | and kind of like living their life, you know?
01:52:03.720 | Being a villager somewhere.
01:52:06.580 | And every now and again, there's a war.
01:52:12.040 | And yeah, what I would say, like,
01:52:18.200 | there aren't very many books that I,
01:52:21.520 | where I just had to stop reading
01:52:22.960 | 'cause it was just too dark.
01:52:25.920 | But the book about Stalin, "The Court of the Red Tsar,"
01:52:30.820 | I had to stop reading.
01:52:32.200 | It was just too dark, rough.
01:52:37.200 | - Yeah.
01:52:38.120 | The '30s, there's a lot of lessons there to me,
01:52:44.280 | in particular that it feels like humans,
01:52:48.360 | like all of us have that, it's the old Solzhenitsyn line,
01:52:51.620 | that the line between good and evil
01:52:54.720 | runs through the heart of every man,
01:52:56.160 | that all of us are capable of evil,
01:52:57.880 | all of us are capable of good.
01:52:59.340 | It's almost like this kind of responsibility
01:53:01.200 | that all of us have to tend towards the good.
01:53:06.200 | And so, like, to me, looking at history
01:53:09.400 | is almost like an example of,
01:53:11.420 | look, you have some charismatic leader
01:53:14.080 | that convinces you of things.
01:53:16.560 | It's too easy, based on that story,
01:53:19.400 | to do evil onto each other, onto your family, onto others.
01:53:23.400 | And so it's like our responsibility to do good.
01:53:26.720 | It's not like now is somehow different from history.
01:53:29.700 | That can happen again, all of it can happen again.
01:53:32.820 | And yes, most of the time, you're right.
01:53:34.980 | I mean, the optimistic view here is
01:53:37.180 | mostly people are just living life.
01:53:38.900 | And as you've often memed about,
01:53:42.100 | the quality of life was way worse back in the day,
01:53:44.980 | and it keeps improving over time
01:53:46.940 | through innovation, through technology.
01:53:48.800 | But still, it's somehow notable
01:53:50.300 | that these blips of atrocities happen.
01:53:54.380 | - Sure.
01:53:56.560 | Yeah, I mean, life was really tough
01:53:58.740 | for most of history.
01:54:02.020 | I mean, really for most of human history,
01:54:04.220 | a good year would be one where
01:54:07.820 | not that many people in your village
01:54:10.380 | died of the plague, starvation, freezing to death,
01:54:13.620 | or being killed by a neighboring village.
01:54:15.620 | It's like, well, it wasn't that bad.
01:54:17.860 | You know, it was only like, you know,
01:54:19.020 | we lost 5% this year.
01:54:20.140 | That was a good year.
01:54:23.340 | That would be par for the course.
01:54:25.220 | Just not starving to death would have been
01:54:27.680 | the primary goal of most people throughout history,
01:54:31.160 | is making sure we'll have enough food
01:54:32.840 | to last through the winter and not freeze or whatever.
01:54:36.680 | Now food is plentiful.
01:54:42.280 | I have an obesity problem.
01:54:43.560 | - Well, yeah, the lesson there is to be grateful
01:54:48.640 | for the way things are now for some of us.
01:54:51.540 | We've spoken about this offline.
01:54:55.880 | I'd love to get your thought about it here.
01:54:58.480 | If I sat down for a long-form in-person conversation
01:55:03.740 | with the president of Russia, Vladimir Putin,
01:55:06.960 | would you potentially want to call in for a few minutes
01:55:09.880 | to join in on a conversation with him,
01:55:12.600 | moderated and translated by me?
01:55:14.360 | - Sure, yeah, sure, I'd be happy to do that.
01:55:17.400 | - You've shown interest in the Russian language.
01:55:22.040 | Is this grounded in your interest
01:55:23.520 | in history of linguistics, culture, general curiosity?
01:55:27.780 | - I think it sounds cool.
01:55:29.660 | - Sounds cool, not looks cool.
01:55:31.300 | - Well, it takes a moment to read Cyrillic.
01:55:37.580 | Once you know what the Cyrillic characters stand for,
01:55:43.140 | actually, then reading Russian becomes a lot easier
01:55:47.100 | 'cause there are a lot of words that are actually the same.
01:55:49.340 | Like bank is bank.
01:55:50.580 | (laughing)
01:55:52.840 | - So find the words that are exactly the same
01:55:57.740 | and now you start to understand Cyrillic, yeah.
01:55:59.700 | - If you can sound it out,
01:56:01.700 | there's at least some commonality of words.
01:56:06.600 | - What about the culture?
01:56:07.920 | You love great engineering, physics.
01:56:12.100 | There's a tradition of the sciences there.
01:56:14.780 | You look at the 20th century from rocketry,
01:56:16.880 | so some of the greatest rockets,
01:56:19.340 | some of the space exploration has been done
01:56:21.740 | in the Soviet and the former Soviet Union.
01:56:24.140 | So do you draw inspiration from that history?
01:56:27.700 | Just how this culture that in many ways,
01:56:30.540 | I mean, one of the sad things is because of the language,
01:56:33.200 | a lot of it is lost to history
01:56:35.380 | because it's not translated, all those kinds of,
01:56:37.480 | because it is in some ways an isolated culture.
01:56:40.900 | It flourishes within its borders.
01:56:43.380 | So do you draw inspiration from those folks,
01:56:47.780 | from the history of science and engineering there?
01:56:51.500 | - I mean, the Soviet Union, Russia, and Ukraine as well,
01:56:56.500 | have a really strong history in space flight.
01:57:01.780 | Like some of the most advanced and impressive things
01:57:04.580 | in history were done by the Soviet Union.
01:57:09.580 | So one cannot help but admire the history
01:57:16.260 | and one cannot help but admire the impressive
01:57:18.380 | rocket technology that was developed.
01:57:20.220 | After the fall of the Soviet Union,
01:57:24.220 | there's much less that happened.
01:57:29.040 | But still things are happening,
01:57:34.380 | but it's not quite at the frenetic pace
01:57:38.060 | that it was happening before the Soviet Union
01:57:41.940 | kind of dissolved into separate republics.
01:57:46.740 | - Yeah, I mean, there's Roscosmos, the Russian agency.
01:57:51.420 | I look forward to a time when those countries,
01:57:55.780 | with China, with the United States,
01:57:58.020 | are all working together.
01:58:00.020 | Maybe a little bit of friendly competition, but--
01:58:01.900 | - I think friendly competition is good.
01:58:03.800 | You know, governments are slow,
01:58:06.300 | and the only thing slower than one government
01:58:07.860 | is a collection of governments.
01:58:09.460 | So the Olympics would be boring
01:58:14.140 | if everyone just crossed the finishing line
01:58:15.780 | at the same time.
01:58:17.060 | Nobody would watch.
01:58:18.060 | And people wouldn't try hard to run fast and stuff.
01:58:22.380 | So I think friendly competition is a good thing.
01:58:24.780 | - This is also a good place to give a shout out
01:58:28.120 | to a video titled "The Entire Soviet Rocket Engine
01:58:30.780 | Family Tree" by Tim Dodd, aka Everyday Astronaut.
01:58:34.300 | It's like an hour and a half.
01:58:35.420 | It gives a full history of Soviet rockets.
01:58:38.620 | And people should definitely go check out
01:58:40.260 | and support Tim in general.
01:58:41.500 | That guy is super excited about the future,
01:58:43.900 | super excited about space flight.
01:58:45.940 | Every time I see anything by him,
01:58:47.180 | I just have a stupid smile on my face
01:58:48.980 | 'cause he's so excited about stuff.
01:58:51.220 | - Yeah. - I love people like that.
01:58:52.060 | - Tim Dodd is really great.
01:58:53.940 | If you're interested in anything to do with space,
01:58:56.220 | he's, in terms of explaining rocket technology
01:58:59.980 | to your average person, he's awesome.
01:59:02.220 | The best, I'd say.
01:59:03.220 | And I should say, like, part of the reason
01:59:08.380 | I switched us, like,
01:59:12.180 | Raptor at one point was gonna be a hydrogen engine.
01:59:14.740 | But hydrogen has a lot of challenges.
01:59:17.460 | It's very low density.
01:59:18.500 | It's a deep cryogen, so it's only liquid
01:59:20.500 | at a very close to absolute zero.
01:59:23.380 | Requires a lot of insulation.
01:59:24.980 | So there's a lot of challenges there.
01:59:28.880 | And I was actually reading a bit
01:59:32.940 | about Russian rocket engine development.
01:59:35.780 | And at least the impression I had was that,
01:59:39.940 | or Soviet Union, Russia, and Ukraine, primarily,
01:59:43.700 | were actually in the process of switching to methalox.
01:59:48.700 | And there was some interesting test data for ISP.
01:59:55.500 | Like, they were able to get, like,
01:59:57.300 | up to like a 380 second ISP with a methalox engine.
02:00:01.340 | And I was like, whoa, okay,
02:00:02.180 | that's actually really impressive.
02:00:05.420 | So I think you could actually get a much lower cost,
02:00:10.420 | like, in optimizing cost per ton to orbit,
02:00:16.040 | cost per ton to Mars.
02:00:17.140 | It's, I think,
02:00:20.680 | methane oxygen is the way to go.
02:00:24.340 | And I was partly inspired by the Russian work
02:00:27.900 | on the test stands with methalox engines.
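[As a back-of-the-envelope aside, my own arithmetic rather than anything said in the conversation: a specific impulse of about 380 seconds converts to an effective exhaust velocity via $v_e = I_{sp}\,g_0$.]

$$ v_e = I_{sp}\, g_0 \approx 380\ \mathrm{s} \times 9.81\ \mathrm{m/s^2} \approx 3.7\ \mathrm{km/s} $$

[A higher exhaust velocity means less propellant per ton delivered, which is the connection to cost per ton to orbit or to Mars.]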
02:00:31.480 | - And now for something completely different.
02:00:35.180 | Do you mind doing a bit of a meme review
02:00:38.340 | in the spirit of the great, the powerful PewDiePie?
02:00:41.740 | Let's say one to 11,
02:00:42.860 | just go over a few documents printed out.
02:00:45.660 | - We can try.
02:00:46.760 | - Let's try this.
02:00:47.600 | I present to you document numero uno.
02:00:52.620 | - I don't know, okay.
02:00:58.740 | - Vlad the Impaler discovers marshmallows.
02:01:04.060 | - Yeah, that's not bad.
02:01:05.220 | - So you get it because he likes impaling things?
02:01:11.660 | - Yes, I get it.
02:01:12.500 | I don't know, three, whatever.
02:01:14.700 | - That's not very good.
02:01:15.860 | This is grounded in some engineering, some history.
02:01:23.420 | - Yeah, I give this an eight out of 10.
02:01:31.300 | - What do you think about nuclear power?
02:01:33.180 | - I'm in favor of nuclear power.
02:01:34.900 | I think it's,
02:01:35.740 | in a place that is not subject to extreme natural disasters,
02:01:41.780 | I think nuclear power is a great way to generate electricity.
02:01:46.460 | I don't think we should be shutting down
02:01:49.580 | nuclear power stations.
02:01:50.740 | - Yeah, but what about Chernobyl?
02:01:53.540 | - Exactly.
02:01:54.380 | So I think people,
02:01:59.900 | there's a lot of fear of radiation and stuff.
02:02:02.620 | And I guess the problem is a lot of people just don't,
02:02:07.780 | they didn't study engineering or physics,
02:02:12.140 | so they don't,
02:02:13.540 | just the word radiation just sounds scary.
02:02:15.820 | So they can't calibrate what radiation means.
02:02:19.120 | But radiation is much less dangerous than you'd think.
02:02:28.180 | like for example, Fukushima,
02:02:32.180 | when the Fukushima problem happened,
02:02:36.540 | due to the tsunami,
02:02:39.820 | I got people in California asking me
02:02:43.620 | if they should worry about radiation from Fukushima.
02:02:46.420 | And I'm like, definitely not,
02:02:48.660 | not even slightly, not at all.
02:02:51.520 | That is crazy.
02:02:52.440 | And just to
02:02:57.380 | show, like, look, this is how,
02:03:00.380 | like the danger is so much overplayed
02:03:04.140 | compared to what it really is,
02:03:06.380 | that I actually flew to Fukushima.
02:03:09.020 | And I actually, I donated a solar power system
02:03:13.740 | for a water treatment plant.
02:03:15.300 | And I made a point of eating locally grown vegetables
02:03:20.300 | on TV in Fukushima.
02:03:26.540 | (silence)
02:03:28.700 | Like, I'm still alive, okay.
02:03:31.500 | - So it's not even that the risk of these events is low,
02:03:34.100 | but the impact of them is--
02:03:36.060 | - Impact is greatly exaggerated.
02:03:38.420 | It's just great. - It's human nature.
02:03:40.060 | - It's people, people don't know what radiation is.
02:03:41.900 | Like I've had people ask me, like,
02:03:42.980 | what about radiation from cell phones,
02:03:44.980 | causing brain cancer?
02:03:46.240 | I'm like, when you say radiation,
02:03:47.700 | do you mean photons or particles?
02:03:49.620 | Then they're like, I don't know what,
02:03:51.140 | what do you mean photons, particles?
02:03:52.220 | So do you mean,
02:03:55.460 | let's say photons, what frequency or wavelength?
02:03:59.380 | And they're like, no, I have no idea.
02:04:01.820 | Like, do you know that everything's radiating all the time?
02:04:04.780 | Like, what do you mean?
02:04:06.020 | Like, yeah, everything's radiating all the time.
02:04:08.700 | Photons are being emitted by all objects
02:04:11.500 | all the time, basically.
02:04:12.420 | So, and if you wanna know what it means
02:04:17.420 | to stand in front of nuclear fire, go outside.
02:04:21.660 | The sun is a gigantic thermonuclear reactor
02:02:26.660 | that you're staring right at.
02:04:28.580 | - Yeah. - Are you still alive?
02:04:29.620 | Yes, okay, amazing. (laughs)
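[To put the frequency-and-wavelength question in perspective, a rough calculation of my own, not from the conversation: a cell-phone photon at roughly 2 GHz carries an energy of]

$$ E = h\nu \approx 6.6\times10^{-34}\ \mathrm{J\,s} \times 2\times10^{9}\ \mathrm{Hz} \approx 1.3\times10^{-24}\ \mathrm{J} \approx 8\times10^{-6}\ \mathrm{eV}, $$

[which is orders of magnitude below the few electron-volts needed to ionize atoms; that kind of distinction is what the question about photon frequency is getting at.]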
02:04:32.540 | - Yeah, I guess radiation is one of the words
02:04:34.460 | that can be used as a tool to fear monger
02:04:38.660 | by certain people, that's it.
02:04:40.260 | - I think people just don't understand, so.
02:04:42.420 | - I mean, that's the way to fight that fear, I suppose,
02:04:44.820 | is to understand, is to learn.
02:04:46.660 | - Yeah, just say like, okay, how many people
02:04:48.260 | have actually died from nuclear accidents?
02:04:50.340 | Like, practically nothing.
02:04:51.620 | And say how many people have died from coal plants,
02:04:56.620 | and it's a very big number.
02:04:59.380 | So, like, obviously we should not be starting up coal plants
02:05:04.180 | and shutting down nuclear plants.
02:05:05.300 | Just doesn't make any sense at all.
02:05:07.480 | Coal plants are, like, I don't know,
02:05:09.460 | 100 to 1,000 times worse for health
02:05:12.740 | than nuclear power plants.
02:05:14.060 | - You wanna go to the next one?
02:05:16.340 | It's really bad.
02:05:17.860 | (laughing)
02:05:20.180 | - It's that 90, 180, and 360 degrees,
02:05:24.340 | everybody loves the math, nobody gives a shit about 270.
02:05:28.020 | - It's not super funny.
02:05:30.060 | I don't know, two or three.
02:05:31.340 | This is not, you know, an LOL situation.
02:05:35.840 | (both laugh)
02:05:37.460 | - Yeah.
02:05:38.300 | (both laugh)
02:05:43.460 | - That was pretty good.
02:05:44.300 | - The United States oscillating between establishing
02:05:46.780 | and destroying dictatorships.
02:05:48.540 | It's like a, is that a metric?
02:05:49.940 | - Yeah, metronome.
02:05:51.540 | Yeah, it's a seven out of 10, it's kinda true.
02:05:54.260 | - Oh yeah, this is kinda personal for me.
02:05:57.660 | Next one.
02:05:58.500 | - Oh man, this is Laika?
02:06:00.500 | - Yeah, well, no, this is--
02:06:02.020 | - Or it's like referring to Laika or something?
02:06:03.900 | - As Laika's, like, husband.
02:06:06.980 | - Husband, yeah, yeah.
02:06:08.140 | - Hello, yes, this is dog.
02:06:09.460 | Your wife was launched into space.
02:06:11.500 | And then the last one is him with his eyes closed
02:06:14.780 | and a bottle of vodka.
02:06:16.540 | - Yeah, Laika didn't come back.
02:06:17.940 | - No.
02:06:19.140 | - They don't tell you the full story of, you know,
02:06:21.260 | the impact they had on the loved ones.
02:06:25.100 | - True.
02:06:25.940 | - That one gets an 11 from me.
02:06:26.860 | - Sure, what's your one?
02:06:28.300 | - Oh yeah, this keeps going on the Russian theme.
02:06:31.680 | First man in space, nobody cares, first man on the moon.
02:06:36.860 | - Well, I think people do care.
02:06:37.940 | - No, I know, but--
02:06:38.940 | - Yuri Gagarin's name will be forever in history, I think.
02:06:45.620 | - There is something special about placing,
02:06:48.000 | like, stepping foot onto another totally foreign land.
02:06:52.820 | It's not the journey, like, people that explore the oceans.
02:06:56.900 | It's not as important to explore the oceans
02:06:58.580 | as to land on a whole new continent.
02:07:01.100 | - Yeah.
02:07:03.940 | - Well, this is about you.
02:07:05.300 | (laughing)
02:07:06.260 | Oh yeah, I'd love to get your comment on this.
02:07:08.100 | Elon Musk, after sending $6.6 billion to the UN
02:07:12.300 | to end world hunger, you have three hours.
02:07:17.860 | - Yeah, well, I mean, obviously $6 billion
02:07:19.420 | is not gonna end world hunger, so.
02:07:21.120 | So, I mean, the reality is, at this point,
02:07:26.920 | the world is producing far more food
02:07:29.980 | than it can really consume.
02:07:31.860 | Like, we don't have a caloric constraint at this point.
02:07:35.840 | So, where there is hunger, it is almost always due to,
02:07:39.520 | it's like civil war or strife or some, like,
02:07:45.700 | it's not a thing that is,
02:07:47.360 | extremely rare for it to be just a matter of, like,
02:07:52.660 | lack of money.
02:07:53.500 | It's like, you know, it's like some,
02:07:56.180 | the civil war in some country,
02:07:57.900 | and like one part of the country's
02:07:59.180 | literally trying to starve the other part of the country.
02:08:01.460 | - So, it's much more complex
02:08:04.140 | than something that money could solve.
02:08:05.700 | It's geopolitics, it's a lot of things.
02:08:09.760 | It's human nature, it's governments,
02:08:11.340 | it's money, monetary systems, all that kind of stuff.
02:08:14.660 | - Yeah, food is extremely cheap these days.
02:08:17.660 | I mean, the US at this point,
02:08:22.340 | among low-income families, obesity is actually
02:08:27.180 | another problem.
02:08:28.020 | It's not, like, obesity is, it's not hunger.
02:08:31.020 | It's like too much, you know, too many calories.
02:08:34.540 | So, it's not that nobody's hungry anywhere.
02:08:37.860 | It's just, this is not a simple matter
02:08:42.460 | of adding money and solving it.
02:08:43.980 | - Hmm.
02:08:44.820 | What do you think that one gets?
02:08:48.740 | It's getting-- - I don't know.
02:08:51.420 | (laughing)
02:08:54.020 | - This is going after empires.
02:08:55.260 | World, where did you get those artifacts?
02:08:57.500 | The British Museum.
02:08:59.340 | Shout out to Monty Python, we found them.
02:09:02.980 | - Yeah, the British Museum is pretty great.
02:09:05.860 | I mean, admittedly, Britain did take
02:09:08.340 | these historical artifacts from all around the world
02:09:10.300 | and put them in London, but, you know,
02:09:12.900 | it's not like people can't go see them.
02:09:16.020 | So, it is a convenient place to see these ancient artifacts
02:09:19.900 | is London for a large segment of the world.
02:09:24.900 | So, I think, you know, on balance,
02:09:26.420 | the British Museum is a net good.
02:09:28.140 | Although, I'm sure that a lot of countries
02:09:30.300 | would argue about that.
02:09:31.220 | - Yeah.
02:09:32.700 | - It's like you wanna make these historical artifacts
02:09:34.740 | accessible to as many people as possible.
02:09:37.140 | And the British Museum, I think, does a good job of that.
02:09:41.220 | Even if there's a darker aspect
02:09:42.900 | to like the history of empire in general,
02:09:44.860 | whatever the empire is, however things were done,
02:09:47.900 | it is the history that happened.
02:09:52.060 | You can't sort of erase that history, unfortunately.
02:09:54.220 | You could just become better in the future.
02:09:56.580 | That's the point.
02:09:58.340 | - Yeah, I mean, it's like, well,
02:10:00.860 | how are we gonna pass moral judgment on these things?
02:10:03.980 | Like, it's like, you know, if one is gonna judge,
02:10:09.500 | say, the British Empire, you gotta judge, you know,
02:10:11.580 | what everyone was doing at the time
02:10:13.500 | and how were the British relative to everyone.
02:10:15.980 | And I think the British would actually get
02:10:20.020 | like a relatively good grade, relatively good grade,
02:10:22.980 | not in absolute terms,
02:10:24.180 | but compared to what everyone else was doing,
02:10:27.620 | they were not the worst.
02:10:31.500 | Like I said, you gotta look at these things
02:10:32.740 | in the context of the history at the time
02:10:35.300 | and say, what were the alternatives
02:10:37.100 | and what are you comparing it against?
02:10:38.540 | - Yes.
02:10:39.380 | I do not think it would be the case
02:10:41.820 | that Britain would get a bad grade
02:10:46.820 | when looking at history at the time.
02:10:48.980 | Now, if you judge history from, you know,
02:10:53.340 | from what is morally acceptable today,
02:10:56.060 | you're basically gonna give everyone a failing grade.
02:10:58.860 | I'm not clear.
02:10:59.860 | I don't think anyone would get a passing grade
02:11:02.380 | in their morality of like, you go back 300 years ago,
02:11:06.380 | like who's getting a passing grade?
02:11:08.660 | Basically no one.
02:11:09.580 | - And we might not get a passing grade
02:11:13.060 | from generations that come after us.
02:11:15.780 | What does that one get?
02:11:18.460 | - Sure, six, seven.
02:11:22.420 | - For the Monty Python, maybe.
02:11:24.180 | - I always love Monty Python, they're great.
02:11:26.820 | The Life of Brian and the Quest of the Holy Grail
02:11:28.380 | are incredible.
02:11:29.300 | - Yeah, yeah.
02:11:30.260 | - Damn, those serious eyebrows.
02:11:32.580 | - Brashen up.
02:11:33.420 | How important do you think is facial hair
02:11:35.940 | to great leadership?
02:11:38.300 | You got a new haircut.
02:11:39.740 | Is that, how does that affect your leadership?
02:11:42.860 | - I don't know, hopefully not.
02:11:45.780 | It doesn't.
02:11:46.620 | - Is that the second no one?
02:11:48.460 | - Yeah, the second is no one.
02:11:49.900 | - There is no one competing with Brezhnev.
02:11:52.860 | - No one two.
02:11:53.740 | - Those are like epic eyebrows.
02:11:55.260 | So, sure.
02:11:58.500 | - That's ridiculous.
02:12:00.380 | - Give it a six or seven, I don't know.
02:12:02.220 | - I like this, like Shakespearean analysis of memes.
02:12:05.700 | - Brezhnev, he had a flair for drama as well.
02:12:08.420 | Like, you know, showmanship.
02:12:11.380 | - Yeah, yeah, it must come from the eyebrows.
02:12:14.180 | All right, invention, great engineering.
02:12:18.140 | Look what I invented.
02:12:19.740 | That's the best thing since ripped up bread.
02:12:22.060 | - Yeah.
02:12:22.900 | - 'Cause they invented sliced bread.
02:12:25.860 | Am I just explaining memes at this point?
02:12:27.900 | This is what my life has become.
02:12:31.580 | - He's a meme lord, he's a meme explainer.
02:12:35.340 | - Yeah, I'm a meme, like a scribe
02:12:39.700 | that runs around with the kings
02:12:41.460 | and just writes down memes.
02:12:44.260 | - I mean, when was the cheeseburger invented?
02:12:46.140 | That's like an epic invention.
02:12:47.540 | - Yeah.
02:12:48.380 | - Like, wow.
02:12:49.820 | - Versus just like a burger?
02:12:53.300 | - Or a burger, I guess a burger in general.
02:12:55.500 | - Then there's like, what is a burger?
02:12:58.780 | What's a sandwich?
02:12:59.900 | And then you start getting, is it a pizza sandwich?
02:13:01.900 | And what is the original?
02:13:04.140 | It gets into an ontology argument.
02:13:05.980 | - Yeah, but everybody knows if you order a burger
02:13:08.020 | or a cheeseburger or whatever,
02:13:08.860 | and you get tomato and some lettuce and onions
02:13:11.460 | and whatever and mayo and ketchup and mustard,
02:13:15.220 | it's like epic.
02:13:16.380 | - Yeah, but I'm sure they've had bread and meat separately
02:13:19.020 | for a long time, and it was kind of a burger
02:13:21.140 | on the same plate, but somebody who actually combined them
02:13:23.660 | into the same thing and you bite it and hold it
02:13:27.620 | makes it convenient.
02:13:28.940 | It's a materials problem.
02:13:30.260 | Like, your hands don't get dirty and whatever.
02:13:33.380 | - Yeah, it's brilliant.
02:13:34.540 | - Well--
02:13:39.260 | - That is not what I would have guessed.
02:13:41.180 | - But everyone knows, like, if you order a cheeseburger,
02:13:43.740 | you know what you're getting.
02:13:44.660 | It's not like some obtuse, like,
02:13:47.100 | I wonder what I'll get, you know?
02:13:48.740 | Fries are, I mean, great.
02:13:53.420 | I mean, they're the devil, but fries are awesome.
02:13:56.020 | And yeah, pizza is incredible.
02:14:01.180 | - Food innovation doesn't get enough love,
02:14:03.780 | I guess is what we're getting at.
02:14:05.780 | - It's great.
02:14:06.620 | - What about the Matthew McConaughey Austinite here?
02:14:11.980 | President Kennedy, do you know how to put men on the moon
02:14:14.940 | yet and ask them no?
02:14:16.020 | President Kennedy, it'd be a lot cooler if you did.
02:14:18.820 | - Pretty much, sure, six, six or seven, I suppose.
02:14:23.100 | - And this is the last one.
02:14:27.540 | - That's funny.
02:14:30.620 | - Someone drew a bunch of dicks all over the walls,
02:14:33.620 | Sistine Chapel, boys' bathroom.
02:14:35.620 | - Sure, I'll give it nine.
02:14:36.980 | It's really true.
02:14:38.460 | - This is our highest ranking meme for today.
02:14:42.420 | - I mean, it's true, like, how do they get away with that?
02:14:44.660 | - Lots of nakedness.
02:14:46.540 | - Dick pics are, I mean, just something throughout history.
02:14:49.500 | As long as people can draw things, there's been a dick pic.
02:14:53.660 | - It's a staple of human history.
02:14:55.060 | - It's a staple.
02:14:56.460 | Consistent throughout human history.
02:14:58.420 | - You tweeted that you aspire to comedy.
02:15:00.620 | You're friends with Joe Rogan.
02:15:02.380 | Might you do a short stand-up comedy set
02:15:05.100 | at some point in the future?
02:15:06.620 | Maybe open for Joe, something like that.
02:15:09.580 | Is that-- - Really, stand-up?
02:15:11.260 | Actual, just full-on stand-up?
02:15:12.620 | - Full-on stand-up, is that in there, or is that--
02:15:15.020 | - I've never thought about that.
02:15:16.620 | - It's extremely difficult, at least that's what
02:15:21.580 | Joe says and the comedians say.
02:15:23.260 | - Huh, I wonder if I could.
02:15:27.260 | There's only one way to find out.
02:15:29.500 | - You know, I have done stand-up for friends,
02:15:32.700 | just impromptu, you know, I'll get on like a roof,
02:15:37.700 | and they do laugh, but they're our friends too.
02:15:41.340 | So I don't know if you've got to call, you know,
02:15:44.060 | like a room of strangers, are they gonna actually
02:15:46.220 | also find it funny, but I could try, see what happens.
02:15:51.220 | - I think you'd learn something either way.
02:15:54.060 | - Yeah.
02:15:54.900 | - I love both when you bomb and when you do great,
02:15:59.580 | just watching people, how they deal with it.
02:16:02.140 | It's so difficult, you're so fragile up there.
02:16:06.940 | It's just you, and you think you're gonna be funny,
02:16:09.740 | and when it completely falls flat,
02:16:11.300 | it's just beautiful to see people deal with that.
02:16:15.380 | - Yeah, I might have enough material to do stand-up.
02:16:17.900 | No, I've never thought about it,
02:16:19.420 | but I might have enough material.
02:16:23.620 | I don't know, like 15 minutes or something.
02:16:25.620 | - Oh yeah, yeah, do a Netflix special.
02:16:28.260 | - A Netflix special, sure.
02:16:30.220 | - What's your favorite Rick and Morty concept?
02:16:35.100 | Just to spring that on you.
02:16:36.700 | There's a lot of sort of scientific engineering ideas
02:16:39.020 | explored there, there's the--
02:16:40.580 | - Favorite Rick and Morty.
02:16:41.420 | - There's the butter robot.
02:16:42.860 | - It's great, yeah, it's a great show.
02:16:44.700 | - You like it?
02:16:45.540 | - Yeah, Rick and Morty's awesome.
02:16:47.140 | - Somebody that's exactly like you
02:16:48.540 | from an alternate dimension showed up there, Elon Tusk.
02:16:52.060 | - Yeah, that's right.
02:16:52.900 | - And he's voiced.
02:16:53.740 | - Yeah, Rick and Morty certainly explores
02:16:55.500 | a lot of interesting concepts.
02:16:57.180 | So like what's a favorite one, I don't know.
02:17:00.820 | The butter robot certainly is, you know,
02:17:03.180 | it's like, it's certainly possible
02:17:04.620 | to have too much sentience in a device.
02:17:06.880 | Like you don't wanna have your toaster
02:17:08.900 | be like a super genius toaster.
02:17:11.960 | It's gonna hate life 'cause all it could do
02:17:14.300 | is make toast, but if, you know,
02:17:16.220 | it's like you don't wanna have like super intelligence
02:17:17.700 | stuck in a very limited device.
02:17:21.220 | - Do you think it's too easy
02:17:22.540 | if we're talking about from the engineering perspective
02:17:25.100 | of super intelligence, like with Marvin the robot,
02:17:28.420 | like is it, it seems like it might be very easy
02:17:31.620 | to engineer just a depressed robot.
02:17:34.100 | Like it's not obvious to engineer a robot
02:17:37.220 | that's going to find a fulfilling existence.
02:17:40.560 | Same as humans, I suppose.
02:17:42.780 | But I wonder if that's like the default.
02:17:46.700 | If you don't do a good job on building a robot,
02:17:49.540 | it's going to be sad a lot.
02:17:51.980 | - Well, we can reprogram robots
02:17:54.820 | easier than we can reprogram humans.
02:17:56.620 | So I guess if you let it evolve without tinkering
02:18:03.020 | then it might get sad,
02:18:05.300 | but you can change the optimization function
02:18:08.100 | and have it be a cheery robot.
02:18:10.240 | - You, like I mentioned with SpaceX,
02:18:15.700 | you give a lot of people hope
02:18:17.140 | and a lot of people look up to you,
02:18:18.260 | millions of people look up to you.
02:18:20.340 | If we think about young people in high school,
02:18:23.460 | maybe in college, what advice would you give to them
02:18:27.860 | about if they want to try to do something big in this world,
02:18:31.660 | they want to really have a big positive impact,
02:18:33.580 | what advice would you give them about their career,
02:18:36.160 | maybe about life in general?
02:18:37.620 | - Try to be useful.
02:18:40.620 | You do things that are useful to your fellow human beings,
02:18:45.260 | to the world.
02:18:46.340 | It's very hard to be useful.
02:18:48.900 | Very hard.
02:18:49.740 | You know, are you contributing more than you consume?
02:18:57.140 | You know, like can you try to have a positive net
02:19:02.140 | contribution to society?
02:19:05.640 | I think that's the thing to aim for.
02:19:09.220 | You know, not to try to be sort of a leader
02:19:11.740 | for the sake of being a leader or whatever.
02:19:17.020 | A lot of time, the people you want as leaders
02:19:21.420 | are the people who don't want to be leaders.
02:19:24.300 | So, if you can live a useful life,
02:19:29.300 | that is a good life, a life worth having lived.
02:19:36.180 | You know, and like I said, I would encourage people
02:19:41.740 | to use the mental tools of physics
02:19:46.340 | and apply them broadly in life.
02:19:48.260 | They're the best tools.
02:19:49.900 | - When you think about education and self-education,
02:19:52.220 | what do you recommend?
02:19:53.820 | So, there's the university, there's self-study,
02:19:58.740 | there is hands-on sort of finding a company or a place
02:20:03.140 | or a set of people that do the thing you're passionate about
02:20:05.460 | and joining them as early as possible.
02:20:07.420 | There's taking a road trip across Europe for a few years
02:20:12.180 | and writing some poetry.
02:20:13.340 | Which trajectory do you suggest
02:20:16.500 | in terms of learning about how you can become useful,
02:20:22.580 | as you mentioned, how you can have the most positive impact?
02:20:25.580 | - Well, I'd encourage people to read a lot of books.
02:20:35.900 | Just read, like, basically try to ingest
02:20:39.220 | as much information as you can.
02:20:42.000 | And try to also just develop a good general knowledge.
02:20:45.880 | So, you at least have, like, a rough lay of the land
02:20:51.240 | of the knowledge landscape.
02:20:53.080 | Like, try to learn a little bit about a lot of things.
02:20:57.060 | 'Cause you might not know what you're really interested in.
02:20:59.800 | How would you know what you're really interested in
02:21:01.040 | if you at least aren't, like, doing a peripheral exploration
02:21:05.400 | broadly, of the knowledge landscape?
02:21:10.140 | And talk to people from different walks of life
02:21:14.940 | and different industries and professions and skills
02:21:18.940 | and occupations.
02:21:20.340 | Like, just try to learn as much as possible.
02:21:24.800 | And then search for meaning.
02:21:28.700 | - Isn't the whole thing a search for meaning?
02:21:33.100 | - Yeah, what's the meaning of life and all, you know.
02:21:37.220 | But just generally, like I said,
02:21:38.260 | I would encourage people to read broadly
02:21:40.940 | in many different subject areas.
02:21:42.540 | And then try to find something where there's an overlap
02:21:48.340 | of your talents and what you're interested in.
02:21:51.820 | So, people may be good at something,
02:21:53.540 | or they may have skill at a particular thing,
02:21:56.300 | but they don't like doing it.
02:21:57.740 | So, you want to try to find a thing where you're,
02:22:01.900 | that's a good combination of the things
02:22:06.660 | that you're inherently good at, but you also like doing.
02:22:09.780 | - And reading is a super fast shortcut to figure out
02:22:16.500 | where you're both good at it, you like doing it,
02:22:20.820 | and it will actually have a positive impact.
02:22:23.260 | - Well, you gotta learn about things somehow.
02:22:24.900 | So, reading, a broad range, just really read it.
02:22:29.900 | You know, more important when I was a kid,
02:22:32.780 | I read through the encyclopedia.
02:22:36.260 | So, that was pretty helpful.
02:22:38.380 | And there were all sorts of things
02:22:42.140 | I didn't even know existed, well, a lot, so obviously.
02:22:44.300 | - Check as broad as it gets.
02:22:45.940 | - Encyclopedias were digestible, I think,
02:22:48.060 | whatever, 40 years ago.
02:22:50.900 | So, maybe read through the condensed version
02:22:57.420 | of the Encyclopedia Britannica, I'd recommend that.
02:22:59.980 | You can always skip subjects
02:23:02.580 | where you read a few paragraphs
02:23:03.980 | and you know you're not interested,
02:23:05.700 | just jump to the next one.
02:23:07.620 | So, read the encyclopedia or skim through it.
02:23:11.500 | But I put a lot of stock and certainly have a lot of respect
02:23:20.020 | for someone who puts in an honest day's work
02:23:23.780 | to do useful things.
02:23:24.940 | And just generally to have not a zero-sum mindset,
02:23:31.260 | or have more of a grow the pie mindset.
02:23:35.660 | If you sort of say, when we see people,
02:23:41.420 | perhaps, including some very smart people,
02:23:44.140 | kind of taking an attitude of,
02:23:47.420 | like doing things that seem morally questionable,
02:23:50.980 | it's often because they have, at a base,
02:23:53.540 | sort of axiomatic level, a zero-sum mindset.
02:23:56.560 | And they, without realizing it,
02:24:00.100 | they don't realize they have a zero-sum mindset,
02:24:02.820 | or at least they don't realize it consciously.
02:24:05.860 | And so, if you have a zero-sum mindset,
02:24:07.640 | then the only way to get ahead
02:24:08.780 | is by taking things from others.
02:24:10.380 | If it's like, if the pie is fixed,
02:24:15.180 | then the only way to have more pie
02:24:17.220 | is to take someone else's pie.
02:24:19.260 | But this is false, like obviously the pie
02:24:21.420 | has grown dramatically over time, the economic pie.
02:24:24.540 | So, in reality, you can have,
02:24:27.340 | I don't wanna overuse this analogy,
02:24:31.140 | we can have a lot of, there's a lot of pie.
02:24:34.040 | (laughing)
02:24:35.940 | - Yeah. - Pie is not fixed.
02:24:37.380 | So, you really wanna make sure you're not operating
02:24:41.740 | without realizing it from a zero-sum mindset,
02:24:45.460 | where the only way to get ahead
02:24:47.380 | is to take things from others,
02:24:48.300 | then that's gonna result in you
02:24:49.540 | trying to take things from others, which is not good.
02:24:52.020 | It's much better to work on adding to the economic pie.
02:24:57.060 | So, creating, like I said,
02:25:02.060 | creating more than you consume, doing more than you, yeah.
02:25:06.360 | So, that's a big deal.
02:25:09.340 | I think there's a fair number of people in finance
02:25:13.580 | that do have a bit of a zero-sum mindset.
02:25:17.340 | - I mean, it's all walks of life.
02:25:18.940 | I've seen that.
02:25:20.100 | One of the reasons Rogan inspires me
02:25:24.500 | is he celebrates others a lot.
02:25:26.420 | This is not creating a constant competition.
02:25:29.340 | Like, there's a scarcity of resources.
02:25:31.260 | What happens when you celebrate others
02:25:33.220 | and you promote others, the ideas of others,
02:25:36.420 | it actually grows that pie.
02:25:38.940 | I mean, like the resources become less scarce,
02:25:43.940 | and that applies in a lot of kinds of domains.
02:25:46.740 | It applies in academia, where a lot of people are very,
02:25:49.820 | see some funding for academic research is zero-sum.
02:25:53.140 | It is not.
02:25:54.060 | If you celebrate each other,
02:25:55.180 | if you get everybody to be excited about AI,
02:25:58.020 | about physics, about mathematics,
02:26:00.300 | I think there'll be more and more funding,
02:26:02.340 | and I think everybody wins.
02:26:03.860 | Yeah, that applies, I think, broadly.
02:26:05.700 | - Yeah, yeah, exactly.
02:26:08.580 | - So, last question about love and meaning.
02:26:11.600 | What is the role of love in the human condition,
02:26:16.260 | broadly and more specific to you?
02:26:18.540 | How has love, romantic love or otherwise,
02:26:21.720 | made you a better person, a better human being?
02:26:24.980 | - Better engineer?
02:26:28.080 | - Now you're asking really perplexing questions.
02:26:31.660 | It's hard to give a, I mean, there are many books,
02:26:39.260 | poems, and songs written about what is love,
02:26:42.060 | and what exactly, you know,
02:26:45.500 | what is love, baby, don't hurt me.
02:26:49.800 | (both laughing)
02:26:52.500 | - That's one of the great ones, yes.
02:26:54.540 | You have earlier quoted Shakespeare,
02:26:56.260 | but that's really up there.
02:26:57.940 | - Yeah.
02:26:58.780 | Love is a many-splendored thing.
02:27:01.420 | - I mean, there's, 'cause we've talked about
02:27:06.420 | so many inspiring things, like be useful in the world,
02:27:08.420 | sort of like solve problems, alleviate suffering,
02:27:11.760 | but it seems like connection between humans is a source,
02:27:15.520 | you know, it's a source of joy, it's a source of meaning,
02:27:20.340 | and that's what love is, friendship,
02:27:21.980 | love, I just wonder if you think about that kind of thing,
02:27:26.620 | when you talk about preserving the light
02:27:29.380 | of human consciousness, and us becoming
02:27:31.540 | a multi-planetary species.
02:27:34.340 | I mean, to me at least, that means, like,
02:27:40.260 | if we're just alone and conscious and intelligent,
02:27:44.380 | it doesn't mean nearly as much as if we're with others,
02:27:48.180 | right, and there's some magic created when we're together.
02:27:52.060 | The friendship of it, and I think the highest form of it
02:27:56.180 | is love, which I think broadly is much bigger
02:27:59.340 | than just sort of romantic, but also, yes, romantic love
02:28:02.860 | and family and those kinds of things.
02:28:06.100 | - Well, I mean, the reason I guess I care about
02:28:08.580 | us becoming a multi-planet species
02:28:10.140 | in a space-faring civilization is,
02:28:13.140 | foundationally, I love humanity.
02:28:15.280 | And so I wish to see it prosper,
02:28:20.420 | and do great things, and be happy,
02:28:23.580 | and if I did not love humanity,
02:28:27.380 | I would not care about these things.
02:28:29.180 | - So when you look at the whole of it,
02:28:32.980 | the human history, all the people who's ever lived,
02:28:35.020 | all the people alive now, it's pretty, we're okay.
02:28:40.020 | (laughing)
02:28:41.740 | On the whole, we're a pretty interesting bunch.
02:28:45.300 | - Yeah, all things considered,
02:28:48.260 | and I've read a lot of history,
02:28:49.860 | including the darkest, worst parts of it,
02:28:52.700 | and despite all that, I think on balance,
02:28:56.980 | I still love humanity.
02:28:58.140 | - You joked about it with the 42.
02:29:01.860 | What do you think is the meaning of this whole thing?
02:29:04.460 | (laughing)
02:29:05.500 | Is there a non-numerical representation?
02:29:08.780 | - Yeah, well, really, I think what Douglas Adams
02:29:11.260 | was saying in "The Hitchhiker's Guide to the Galaxy"
02:29:13.140 | is that the universe is the answer,
02:29:17.980 | and what we really need to figure out
02:29:21.860 | are what questions to ask about the answer
02:29:24.460 | that is the universe,
02:29:25.580 | and that the question is really the hard part,
02:29:28.620 | and if you can properly frame the question,
02:29:30.500 | then the answer, relatively speaking, is easy.
02:29:32.740 | So therefore, if you want to understand
02:29:37.540 | what questions to ask about the universe,
02:29:40.820 | you want to understand the meaning of life,
02:29:42.260 | we need to expand the scope and scale of consciousness
02:29:45.900 | so that we're better able to understand
02:29:48.820 | the nature of the universe
02:29:50.620 | and understand the meaning of life.
02:29:52.460 | - And ultimately, the most important part
02:29:54.540 | would be to ask the right question.
02:29:56.720 | - Yes.
02:29:58.540 | - Thereby elevating the role of the interviewer.
02:30:01.780 | - Yes, exactly.
02:30:03.700 | - As the most important human in the room.
02:30:05.980 | - Absolutely.
02:30:06.820 | (laughing)
02:30:07.740 | Good questions are, you know,
02:30:10.220 | it's hard to come up with good questions.
02:30:13.180 | Absolutely.
02:30:14.020 | (laughing)
02:30:15.260 | But yeah, it's like that is the foundation of my philosophy
02:30:18.140 | is that I am curious about the nature of the universe,
02:30:22.700 | and obviously I will die, I don't know when I'll die,
02:30:29.620 | but I won't live forever,
02:30:30.960 | but I would like to know that we're on a path
02:30:34.620 | to understanding the nature of the universe
02:30:36.680 | and the meaning of life
02:30:37.520 | and what questions to ask about the answer
02:30:39.700 | that is the universe,
02:30:40.740 | and so if we expand the scope and scale of humanity
02:30:44.540 | and consciousness in general,
02:30:46.540 | which includes Silicon consciousness,
02:30:47.900 | then that seems like a fundamentally good thing.
02:30:52.900 | - Elon, like I said, I'm deeply grateful
02:30:58.980 | that you would spend your extremely valuable time
02:31:00.860 | with me today, and also that you have given
02:31:04.260 | millions of people hope in this difficult time,
02:31:07.280 | this divisive time, in this cynical time,
02:31:11.300 | so I hope you do continue doing what you're doing.
02:31:13.980 | Thank you so much for talking today.
02:31:15.420 | - Oh, you're welcome.
02:31:16.340 | Thanks for your excellent questions.
02:31:18.700 | - Thanks for listening to this conversation with Elon Musk.
02:31:21.460 | To support this podcast,
02:31:22.700 | please check out our sponsors in the description,
02:31:25.380 | and now let me leave you with some words
02:31:27.360 | from Elon Musk himself.
02:31:29.540 | When something is important enough, you do it,
02:31:32.220 | even if the odds are not in your favor.
02:31:34.620 | Thank you for listening, and hope to see you next time.
02:31:38.740 | (upbeat music)
02:31:41.320 | (upbeat music)