In conversation with Elon Musk | All-In Summit 2024
Chapters
0:00 Besties intro Elon Musk!
4:01 The Battle of Free Speech
13:03 Potential government efficiency agency
30:23 SpaceX updates, overreaching regulations
38:48 Thoughts on Boeing's culture
41:05 The 80/20 AI Future
56:41 Elon and Jason share unaired SNL skits
00:00:00.000 |
After buying Twitter for $44 billion, Musk's time as CEO has been a whirlwind. Shares of 00:00:10.000 |
Musk's other major company, Tesla, have plummeted more than 30 percent since he took over Twitter. 00:00:16.840 |
As is often the case, his next move is unclear. 00:00:20.440 |
I'd go as far as to say that he's demonstrating some erratic behavior. 00:00:24.240 |
Go fuck yourself. Is that clear? I hope it is. Hey, Bob, if you're in the audience. 00:00:34.600 |
Elon Musk's cooperation and/or relationships with other countries is worthy of being looked 00:00:44.120 |
The Biden administration has just announced its second investigation into Elon Musk in 00:00:49.840 |
Both at Tesla and SpaceX, there's a product roadmap that they're on, and that whether 00:00:55.360 |
Elon is in the building or not is not going to impact the plan that they have. 00:01:02.360 |
People said he'd never get the rocket in space. He did that. People said the Roadsters would 00:01:05.880 |
never get delivered. He did that. People said he'd never get 100 of them done. He's got 00:01:11.360 |
As an entrepreneur, you can't listen to the noise. And you certainly can't listen to losers 00:01:15.560 |
who have never accomplished anything with their life, who are obsessing about you. Please. 00:01:22.640 |
We're out there among the stars and we're a multi-planet species across many planets 00:01:36.920 |
This is a great future, and that's what we should strive for. 00:01:45.760 |
Nearly every VC I speak with, every CEO is looking to Elon's behavior and saying that's 00:01:50.760 |
a model for how you can challenge your team to achieve the impossible in an impossibly 00:01:55.600 |
And you can see those grid fins on your left-hand screen rotating and turning to guide the booster. 00:02:08.600 |
And you can see the water below and we have blast zone. 00:02:26.480 |
Elon seems to be on track to be not only the world's richest man, but the world's first 00:02:31.840 |
Elon basically has had over the last 10 or 15 years an incredible amount of challenges 00:02:38.640 |
Probably had to deal with stuff that most of us would have broken under, and he just 00:02:44.000 |
And the guy just basically bent all the haters until he crushed their souls. 00:02:53.240 |
The greatest entrepreneur of this generation, Elon Musk. 00:03:42.640 |
I mean, any given week, it just seems like the thing is getting out of hand. 00:03:50.640 |
I mean, look, if we are in some alien Netflix series, I think the ratings are high. 00:04:04.040 |
This is a - you've been at war for two years now. 00:04:11.160 |
The price of freedom of speech is not cheap, is it? 00:04:13.640 |
I think it's like 44 billion, something like that. 00:04:25.680 |
There is like this weird movement to quell free speech kind of around the world. 00:04:32.440 |
And that`s something we should be very concerned about. 00:04:37.040 |
You have to ask, like, why was the First Amendment like a high priority? 00:04:43.760 |
It's because people came from countries where if you spoke freely, you would be imprisoned 00:04:51.800 |
And they were like, well, we'd like to not have that here. 00:04:59.080 |
And actually, you know, there's a lot of places in the world right now, if you are critical 00:05:03.160 |
of the government, you get imprisoned or killed. 00:05:13.560 |
I mean, I suspect this is a receptive audience to that message. 00:05:24.560 |
I think we always thought that the West was the exception to that, that we knew there 00:05:29.120 |
were authoritarian places around the world, but we thought that in the West, we'd have 00:05:33.280 |
And we've seen, like you said, it seems like a global movement. 00:05:36.440 |
In Britain, you`ve got teenagers being put in prison for memes, opposing -- 00:05:41.660 |
It's like, you like a -- you like a Facebook post, throw them in the prison. 00:05:48.040 |
People have gone to actual, you know, prison for, like, obscure comments on social media. 00:06:07.800 |
Pavel in France, and then, of course, we got Brazil with Judge Voldemort. 00:06:12.800 |
That one seems like the one that impacts you the most. 00:06:19.380 |
Well, we -- I guess we are trying to figure out is there some reasonable solution in Brazil. 00:06:30.080 |
The -- you know, the concern -- I mean, I want to just make sure that this is framed 00:06:36.400 |
And, you know, funny memes aside, the nature of the concern was that, at least at XCorp, 00:06:46.240 |
we had the perception that we were being asked to do things that violated Brazilian law. 00:06:53.240 |
So, obviously, we cannot, as an American company, impose American laws and values on other countries 00:07:02.080 |
that -- you know, we wouldn't get very far if we did that. 00:07:06.120 |
But we do, you know, think that if a country`s laws are a particular way, and we are being 00:07:13.360 |
asked to -- what we think we are being asked to break them, then -- and be silent about 00:07:21.960 |
So I just want to be clear, because sometimes it comes across as Elon is trying to just 00:07:29.080 |
be a crazy whatever, billionaire, and demand outrageous things from other countries. 00:07:36.280 |
And, you know, while that is true, in addition, there are other things that I think are -- I 00:07:51.920 |
think are valid, which is, like, we obviously can't -- you know, I think any given thing 00:07:58.400 |
that we do at XCorp, we've got to be able to explain in the light of day, and not feel 00:08:04.400 |
that it was dishonorable, or, you know, we did the wrong thing, you know? 00:08:11.760 |
So we don't -- that was the -- that's the nature of the concern. 00:08:15.560 |
So we actually are in sort of discussions with the, you know, judicial authorities in 00:08:24.160 |
Brazil to try to, you know, run this to ground, like, what's actually going on? 00:08:32.320 |
Like, if we're being asked to break the law, Brazilian law, then that obviously should 00:08:37.640 |
not be -- should not sit well with the Brazilian judiciary. 00:08:42.120 |
And if we're not, and we're mistaken, we'd like to understand how we're mistaken. 00:08:46.140 |
I think that's a -- that's a pretty reasonable position. 00:08:48.320 |
I'm a bit concerned, as your friend, that you're going to go to one of these countries, 00:08:55.840 |
and I'm going to wake up one day, and you're going to get arrested, and, like, I'm going 00:09:05.040 |
Like, they're literally saying, like, you know, it's not just Biden saying, like, we 00:09:08.640 |
have to look into that guy, now it's become quite literal, like, this -- I don't know, 00:09:12.800 |
who was the guy who just wrote the -- was it the Guardian piece about, like -- 00:09:16.600 |
Oh, yeah, yeah, there've been three articles, and I think in the past three weeks -- 00:09:29.760 |
Calling for me to be imprisoned in the Guardian, you know, Guardian of what? 00:09:47.840 |
But the premise here is that you bought this thing, this online forum, this communication 00:09:53.000 |
platform, and you're allowing people to use it to express themselves, therefore, you have 00:10:03.080 |
So, what do you think they're actually afraid of at this point? 00:10:08.000 |
Well, I mean, I think -- if somebody's afraid -- if somebody's sort of trying to push a false 00:10:13.440 |
premise on the world, and that premise can be undermined with public dialogue, then they 00:10:20.480 |
will be opposed to public dialogue on that premise, because they wish that false premise 00:10:26.760 |
So, that's, I think, you know, the issue there is, if they don't like the truth, you know, 00:10:35.640 |
So, now, you know, the sort of -- what we're trying to do with XCorp is -- I distinguish 00:10:50.000 |
You have parental goals, and then you have goals for the company. 00:10:59.640 |
So, what we're trying to do is simply adhere to the, you know, the laws in a country. 00:11:09.960 |
So, if something is illegal in the United States, or if it's illegal in, you know, Europe 00:11:15.760 |
or Brazil or wherever it might be, then we will take it down, and we'll suspend the account, 00:11:22.520 |
because we're not, you know, there to make the laws, we -- but if speech is not illegal, 00:11:32.280 |
Now, we're injecting ourselves in as a censor, and where does it stop, and who decides? 00:11:45.800 |
So, if the people in a country want the laws to be different, they should make the laws 00:11:52.760 |
But otherwise, we're going to obey the law in each jurisdiction. 00:12:02.960 |
We're trying to adhere to the law, and if the laws change, we will change. 00:12:08.320 |
We're just literally trying to adhere to the law. 00:12:15.280 |
And if somebody thinks we're not adhering to the law, well, they can file a lawsuit. 00:12:21.720 |
But what about countries that don't want people to promote Nazi propaganda? 00:12:29.640 |
And in those countries, if somebody puts that up, you take it down. 00:12:33.280 |
But they typically file something and say, "Take this down." 00:12:36.760 |
No, in some cases, it is just obviously illegal. 00:12:39.360 |
Like, you don't need to file a lawsuit for, you know, if something is just, you know, 00:12:51.160 |
You know, you don't need -- like, if somebody is stealing, you don't need -- let me check 00:13:05.440 |
And you know, one of the things is there's this image on X of, like, basically, like, 00:13:09.880 |
you, Bobby, Trump, and J.D. are like the Avengers, I guess. 00:13:15.960 |
And then there's another meme where you're in front of a desk where it says, "D-O-G-E." 00:13:26.560 |
I made it using Grok, the Grok image generator. 00:13:46.600 |
I think with great difficulty, but, you know, look, it's been a long time since there was 00:13:52.120 |
a serious effort to reduce the size of government and to remove absurd regulations. 00:14:00.120 |
And, you know, last time there was a really concerted effort on that front was Reagan 00:14:05.240 |
We're 40 years away from a serious effort to remove, you know, regulations that don't 00:14:14.200 |
serve the greater good and reduce the size of government. 00:14:18.120 |
And I think it's just -- if we don't do that, then what's happening is that we get regulations 00:14:23.480 |
and laws accumulating every year until eventually everything's illegal. 00:14:28.920 |
And that's why we can't get major infrastructure projects done in the United States. 00:14:32.800 |
Like, if you look at the absurdity of the California high-speed rail, I think they spent 00:14:37.160 |
$7 billion and have a 1,600-foot segment that doesn't actually have rail in it. 00:14:46.240 |
That's the expense of 1,600 feet of concrete, you know. 00:14:50.640 |
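Taking the figures quoted here at face value, the implied unit cost is simple arithmetic; a quick Python sketch using only the numbers from the conversation:

```python
# Implied cost per foot of the high-speed rail segment described above,
# using only the figures quoted in the conversation ($7B, 1,600 feet).
spent_dollars = 7e9
segment_feet = 1_600
print(f"${spent_dollars / segment_feet:,.0f} per foot")  # ~$4.4M per foot
```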
And I mean, I think it's like, you know, I realize sometimes I'm perhaps a little optimistic 00:14:57.480 |
with schedules, but, you know, I mean, I wouldn't be doing the things I'm doing if I was, you 00:15:06.360 |
know, not an optimist, so -- but at the current trend, you know, California high-speed rail 00:15:28.320 |
So, I think you can really think of, you know, the United States and many countries, it's 00:15:34.360 |
arguably worse than the EU, as being like Gulliver tied down by a million little strings. 00:15:40.360 |
And like any one given regulation is not that bad, but you've got a million of them, or 00:15:48.760 |
And then eventually you just can't get anything done, and this is a massive tax on the consumer, 00:15:56.680 |
It's just they don't realize that there's this massive tax in the form of irrational 00:16:03.360 |
I'm going to give you a recent example that, you know, is just insane, is that like SpaceX 00:16:10.880 |
was fined by the EPA $140,000 for what they claimed was dumping potable water on the ground, drinking 00:16:20.200 |
So, and we're like, this is at Starbase, and we're like, it's -- we're in a tropical thunderstorm 00:16:26.400 |
region, that stuff comes from the sky all the time. 00:16:30.640 |
And there was no actual harm done, you know, it was just water to cool the launch pad during 00:16:40.440 |
Like, and they're like, they agree, yes, there's zero harm done. 00:16:42.240 |
And we're like, okay, so there's no harm done. 00:16:44.800 |
And you want us to pay a $140,000 fine, it's like, yes, because you didn't have a permit. 00:16:50.320 |
We didn't know there was a permit needed for zero harm, fresh water being on the ground 00:16:56.600 |
in a place where fresh water falls from the sky all the time. 00:17:03.400 |
Because there's a little bit of water there, too. 00:17:06.520 |
I mean, sometimes it rains so much, the roads are flooded. 00:17:08.520 |
So we're like, you know, how does this make any sense? 00:17:13.360 |
And then, like, they were like, well, we're not going to process any more of your -- any 00:17:17.880 |
more of your applications for launch, for Starship launch, unless you pay this $140,000. 00:17:21.880 |
They just ransomed us, and we're like, okay, so we paid the $140,000, but it was -- it's like, 00:17:28.320 |
At this rate, we're never going to get to Mars. 00:17:30.880 |
I mean, that's the -- that's the confounding part here is we're acting against our own 00:17:38.520 |
You know, when you look at -- we do have to make -- putting aside fresh water, but, hey, 00:17:44.280 |
you know, there -- the rocket makes a lot of noise. 00:17:47.360 |
So I'm certain there's some complaints about noise once in a while. 00:17:50.600 |
But sometimes you want to have a party, or you want to make progress, and there's a little 00:17:53.680 |
bit of noise, therefore, you know, we trade off a little bit of noise for massive progress 00:17:59.840 |
So, like, when did we stop being able to make those tradeoffs? 00:18:03.400 |
But talk about the difference between California and Texas, where you and I now reside. 00:18:10.240 |
Texas, you were able to build the Gigafactory, I remember when you got the plot of land, 00:18:15.600 |
and then our -- it seemed like it was less than two years when you had the party to open 00:18:23.280 |
From start of construction to completion was 14 months. 00:18:30.200 |
Is there anywhere on the planet that would go faster? 00:18:36.640 |
So, Texas, China, 11 and 14 months, California, how many months? 00:18:42.120 |
And just to give you a sense of size, the -- Tesla Gigafactory in China is three times 00:18:50.120 |
No, there were bigger buildings, but the Pentagon was a pretty big one. 00:19:04.480 |
Just the regulatory approvals in California would have taken two years. 00:19:14.280 |
Like for the people that will say, "We need some checks and balances. 00:19:17.000 |
We can't have some -- because for every good actor like you, there'll be a bad actor." 00:19:23.280 |
I mean, I haven't -- sort of, you know, in sort of doing a sensible deregulation and 00:19:32.080 |
reduction in the size of government is just, like, be very public about it and say, like, 00:19:36.520 |
which of these rules do you -- if the public is really excited about a rule and wants to 00:19:42.720 |
And here, the thing about the rule is, if, like, if the rule is, you know, turns out 00:19:51.960 |
It's, like, it's easy to add rules, but we don't actually have a process for getting 00:19:59.560 |
When we were watching you work, David and I and Antonio, in that first month at Twitter, 00:20:08.640 |
which was all hands on deck, and you were doing zero-based budgeting, you really quickly 00:20:15.520 |
And then, miraculously, everybody said this site will go down, and you added 50 more features. 00:20:24.640 |
Yeah, there were, like, so many articles, like, that this is -- Twitter is dead forever. 00:20:29.680 |
There's no way it could possibly even continue at all. 00:20:33.960 |
It was almost like the press was rooting for you to fail. 00:20:37.960 |
And they were all saying their goodbyes on Twitter. 00:20:42.440 |
They were all leaving and saying their goodbyes, because the site was going to melt down. 00:20:48.440 |
Which is, if you ever want to, like, hang out with a bunch of hall monitors, oh my god, 00:20:53.760 |
Every time I go over there and post, they're, like, they're really triggered. 00:20:58.880 |
I mean, if you like being condemned repeatedly, then, you know, for reasons that make no sense, 00:21:05.320 |
It's really, it's the most miserable place on earth. 00:21:08.680 |
If Disney's the happiest, this is the anti-Disney. 00:21:12.000 |
But if we were to go into government, you went into the Department of Education or pick 00:21:22.720 |
But if you could just pare 2%, 3%, 4%, 5% of those organizations, what kind of impact 00:21:31.880 |
I mean, I think we'd need to do more than that. 00:21:39.840 |
I mean, it would be better than what's happening now. 00:21:43.640 |
Look, I think we've, you know, if Trump wins, and I suspect there are people with mixed feelings 00:21:55.160 |
about whether that should happen, but we do have an opportunity to do kind of a once-in-a-lifetime 00:22:03.360 |
deregulation and reduction in the size of government. 00:22:06.640 |
Because the other thing, besides the regulations, America is also going bankrupt extremely quickly. 00:22:12.360 |
And nobody seems to, everyone seems to be sort of whistling past the graveyard on this 00:22:21.040 |
Everyone's stuffing their pockets with the silverware before the Titanic sinks. 00:24:24.720 |
Well, you know, the Defense Department budget is a very big budget, OK? 00:22:30.120 |
It's a trillion dollars a year, DOD, Intel, it's a trillion dollars. 00:22:36.280 |
And interest payments on the national debt just exceeded the Defense Department budget. 00:22:42.680 |
They're over a trillion dollars a year, just in interest, and rising. 00:22:48.640 |
We're adding a trillion dollars to our debt, which our kids and grandkids are going to 00:22:55.120 |
have to pay somehow, you know, every three months. 00:23:01.160 |
And then soon it's going to be every two months, and then every month. 00:23:04.600 |
And then the only thing we'll be able to pay is interest. 00:23:08.320 |
And it's just, you know, it's just like a person at scale that has racked up too much 00:23:14.760 |
credit card debt, and this does not have a good ending. 00:23:26.000 |
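The debt arithmetic in this exchange is easy to reproduce; here is a rough back-of-the-envelope sketch in Python, where the debt level and average interest rate are assumed round numbers for illustration rather than figures from the conversation:

```python
# Back-of-the-envelope version of the debt dynamic described above.
# Debt level and average rate are assumed round numbers, not quoted figures.
debt = 35e12             # assumed national debt, dollars
avg_rate = 0.03          # assumed average interest rate on that debt
defense_budget = 1e12    # "a trillion dollars a year" per the conversation

interest = debt * avg_rate
print(f"Annual interest: ~${interest / 1e12:.2f}T")  # ~$1.05T, above the DOD budget

# "Adding a trillion dollars every three months" compounds the problem:
# each new trillion of debt adds avg_rate * $1T of permanent annual interest.
new_debt_per_year = 4 * 1e12
print(f"Added interest burden: ~${new_debt_per_year * avg_rate / 1e9:.0f}B per year")
```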
Let me ask one question, because I've brought this up a lot, and the counterargument I hear, 00:23:30.760 |
which I disagree with, but the counterargument I hear from a lot of politicians is if we 00:23:35.400 |
reduce spending, because right now, if you add up federal, state, and local government 00:23:39.840 |
spending, it's between 40 and 50 percent of GDP. 00:23:44.180 |
So nearly half of our economy is supported by government spending, and nearly half of 00:23:48.960 |
people in the United States are dependent directly or indirectly on government checks, 00:23:54.520 |
and either through contractors that the government pays, or they're employed by a government 00:23:59.200 |
So if you go in and you take too hard an ax too fast, you will have significant contraction, 00:24:10.300 |
Just thinking realistically, because I'm 100 percent on board with you, the next set of 00:24:15.600 |
steps, however, assume Trump wins and you become the chief D-O-G-E, D-O-G-E, D-O, like 00:24:26.960 |
Yeah, and I think the challenge is how quickly can we go in, how quickly can things change, 00:24:46.360 |
So I guess, how do you really address it when so much of the economy and so many people's 00:24:49.240 |
jobs and livelihoods are dependent on government spending? 00:24:52.240 |
Well, I mean, I do think it's sort of, you know, it's a false dichotomy. 00:24:59.760 |
It's not like no government spending is going to happen. 00:25:03.320 |
You really have to say, like, is it the right level? 00:25:06.820 |
And just remember that, you know, any given person, if they are doing things in a less 00:25:13.220 |
efficient organization versus a more efficient organization, their contribution to the economy, 00:25:17.760 |
their net output of goods and services will reduce. 00:25:20.600 |
I mean, you've got a couple of clear examples between East Germany and West Germany, North 00:25:32.440 |
It's the compounding effect of productivity gains. 00:25:36.800 |
And so in North Korea, you've got 100% government, and in South Korea, you've got probably, I 00:25:45.360 |
And yet you've got a standard of living that is probably 10 times higher in South Korea. 00:25:52.020 |
And then East and West Germany, in West Germany, just thinking in terms of cars, I mean, you 00:26:02.360 |
And East Germany, which is a random line on a map, the only car you could get was a Trabant, 00:26:09.200 |
which is basically a lawnmower with a shell in it. 00:26:18.080 |
So you put your kid on the list as soon as they're conceived. 00:26:24.800 |
And even then, only, I think, a quarter of people maybe got this lousy car. 00:26:33.320 |
So that's just an interesting example of basically the same people, different operating system. 00:26:38.560 |
And it's not like West Germany was some capitalist heaven. 00:26:49.920 |
So when you look, probably it was half government in West Germany and 100% government in East 00:26:57.000 |
And again, sort of a 5 to 10x standard of living difference, and even qualitatively 00:27:06.000 |
And it's obviously, so many people have these, amazingly, in this modern era, this debate 00:27:14.120 |
The one that doesn't need to build the wall to keep people in, OK? 00:27:24.200 |
Are they climbing the wall to get out or come in? 00:27:28.400 |
You have to build a barrier to keep people in. 00:27:35.200 |
It wasn't West Berlin that built the wall, OK? 00:27:37.800 |
They were like, you know, anyone who wants to flee West Berlin, go ahead. 00:27:44.780 |
And if you look at sort of the flux of boats from Cuba, there's a large number of boats 00:27:49.880 |
from Cuba, and there's a bunch of free boats that anyone can take to go back to Cuba. 00:27:59.680 |
I could use this boat to go to Cuba, where they have communism. 00:28:04.720 |
And yet nobody picks up those boats and does it. 00:28:13.520 |
Wait, so your point is jobs will be created if we cut government spending in half. 00:28:16.880 |
Jobs will be created fast enough to make up for it, right, just like in those countries. 00:28:21.760 |
Obviously, you know, I'm not suggesting that people, you know, are like immediately tossed 00:28:27.160 |
out with no severance and, you know, now can't pay their mortgage. 00:28:31.000 |
There needs to be some reasonable off-ramp where, yeah, so a reasonable off-ramp where, 00:28:36.440 |
you know, they're still, you know, earning, they're still receiving money, but have like, 00:28:41.040 |
I don't know, a year or two to find jobs in the private sector, which they will find. 00:28:46.000 |
And then they will be in a different operating system. 00:28:50.440 |
East Germany was incorporated into West Germany. 00:28:52.440 |
Living standards in East Germany rose dramatically. 00:28:55.760 |
Well, in four years, if you could shrink the size of the government with Trump, what would 00:29:01.840 |
be a good target, just in terms of like ballpark? 00:29:04.600 |
I mean, are you trying to get me assassinated before this even happens? 00:29:11.040 |
I mean, you know, there's that old phrase, "Go postal." 00:29:16.760 |
I mean, I'm going to need a lot of security details, guys. 00:29:20.640 |
I mean, the sheer number of disgruntled workers, former government employees, is quite a scary 00:29:30.720 |
I was saying low single digits every year for four years would be palatable. 00:29:35.640 |
Yeah, but the thing is that if it's not done, like if you have a once in a lifetime, once 00:29:40.920 |
in a generation opportunity, and you don't take serious action, and then you have four 00:29:46.800 |
years to get it done, and if it doesn't get done, then. 00:29:58.840 |
No, I think actually the reality is that if we get rid of nonsense regulations and shift 00:30:02.680 |
people from the government sector to the private sector, we will have immense prosperity, and 00:30:10.080 |
I think we will have a golden age in this country. 00:30:13.480 |
You have a bunch of critical milestones coming up. 00:30:24.580 |
In fact, there's a very exciting launch that is maybe happening tonight, so if the weather 00:30:31.840 |
is holding up, then I'm going to leave here, head to Cape Canaveral for the Polaris Dawn 00:30:38.400 |
mission, which is a private mission, so funded by Jared Isaacman. 00:30:42.640 |
And he's an awesome guy, and this will be the first commercial spacewalk, and it'll 00:30:52.200 |
be at the highest altitude since Apollo, so it's the furthest from Earth that anyone's 00:31:12.720 |
Yeah, astronaut safety is, man, if I had all the wishes I could say about, that'd be the 00:31:32.200 |
So yeah, the next milestone after that would be the next flight of Starship, which, you 00:31:44.040 |
know, Starship is, the next flight of Starship is ready to fly. 00:31:52.640 |
You know, it really should not be possible to build a giant rocket faster than the paper 00:32:11.360 |
You ever see that movie Zootopia, there's like a sloth coming in for the approval. 00:32:22.640 |
Yeah, they accidentally tell a joke, and I was like, "Oh no, this is going to take a 00:32:29.560 |
Yeah, Zootopia, you know, the funny thing is, like, so I went to the DMV about, I don't 00:32:38.280 |
know, a year later after Zootopia to get my license renewal, and the guy in an exercise 00:32:45.320 |
of incredible self-awareness had the sloth from Zootopia in his cube, and he was actually 00:33:00.560 |
No, I mean, sometimes people think the government is more competent than it is. 00:33:08.520 |
I'm not saying that there aren't competent people in the government, they're just in 00:33:13.840 |
Once you move them to a more efficient operating system, their output is dramatically greater, 00:33:18.520 |
as we've seen, you know, when East Germany was reintegrated with West Germany, and the 00:33:25.480 |
same people were vastly more prosperous, with a basically half-capitalist operating system. 00:33:33.400 |
So, but I mean, for a lot of people, like, their most direct experience with the government 00:33:42.400 |
is the DMV, and then the important thing to remember is that the government is the DMV 00:33:58.480 |
Elon, sorry, can you go back to Chamath's question on Starship? 00:34:03.600 |
So you announced just the other day Starship going to Mars in two years, and- 00:34:12.080 |
And then four years for a crewed aspirational launch in the next window? 00:34:16.520 |
And how much is the government involved, and NASA involved? 00:34:18.360 |
I'm not saying, like, set your watch by these, you know, but based on our current 00:34:25.120 |
progress where with Starship we were able to successfully reach orbital velocity twice, 00:34:31.680 |
we were able to achieve soft landings of the booster and the ship in water, and that's 00:34:37.680 |
despite the ship having, you know, half its flaps cooked off. 00:34:41.000 |
You can see the video on the X platform, it's quite exciting. 00:34:45.200 |
So, you know, we think we'll be able to launch reliably and repeatedly and quite quickly. 00:34:54.600 |
And the fundamental Holy Grail breakthrough for rocketry, the fundamental breakthrough 00:35:01.760 |
that is needed for life to become multi-planetary is a rapidly reusable, reliable rocket. 00:35:08.880 |
With a pirate, somehow, throw a pirate in there. 00:35:18.560 |
So Starship is the first rocket design where success is one of the possible outcomes with 00:35:28.460 |
So, you know, for any given project you have to say, this is the circle, so we'll draw 00:35:34.120 |
that, here's the circle, and is the success dot in the circle, is success in 00:35:43.840 |
That's, you know, it sounds pretty obvious, but there are often projects where that success 00:35:53.240 |
And so Starship not only is full reusability in the set of possible outcomes, it is being 00:36:00.080 |
proven with each launch, and I'm confident we'll succeed, it's simply a matter of time. 00:36:06.240 |
And if we can get some improvement in the speed of regulation, we could actually move 00:36:17.720 |
So that would be very helpful, and in fact, if something isn't done about reducing regulation 00:36:27.040 |
and sort of speeding up approvals, and to be clear, I'm not talking about anything unsafe, 00:36:32.120 |
it's simply the processing of the safe thing can be done as fast as the rocket is built, 00:36:39.320 |
not slower, then we could become a space-faring civilization and a multi-planet species and 00:36:49.000 |
And it's incredibly important that we have things that we find inspiring, that you look 00:37:02.640 |
to the future and say the future's going to be better than the past, things to look forward 00:37:08.880 |
Like kids are a good way to assess this, like what are kids fired up about? 00:37:14.880 |
If you could say, you could be an astronaut on Mars, you could maybe one day go beyond 00:37:23.280 |
the solar system, we could make Star Trek, Starfleet Academy real. 00:37:34.200 |
I mean, you need things that move your heart. 00:37:39.720 |
Life can't just be about solving one miserable problem after another, there's got to be things 00:37:54.920 |
Yeah, and do you think you might have to move it to a different jurisdiction to move faster? 00:38:03.320 |
Rocket technology is considered advanced weapons technology, so we can't just go do it... 00:38:10.360 |
And if we don't do it, other countries could do it. 00:38:12.040 |
I mean, they're so far behind us, but theoretically, there is a national security, you know, justification 00:38:21.360 |
If somebody can put their thinking caps on, like, do we want to have this technology that 00:38:24.560 |
you're building, the team's working so hard on, stolen by other countries? 00:38:28.080 |
And then, you know, maybe they don't have as much red tape. 00:38:35.000 |
So no one's trying to steal it, it's just too crazy, basically. 00:38:46.360 |
Elon, what do you think is going on that led to Boeing building the Starliner the way that 00:39:09.280 |
Well, I mean, I think Boeing is a company that is, you know, they actually do so much 00:39:16.960 |
business with the government, they have sort of impedance matched to the government. 00:39:21.100 |
So they're like basically one notch away from the government, maybe, they're not far from 00:39:25.800 |
the government from an efficiency standpoint, because they derive so much of their revenue 00:39:30.080 |
And a lot of people think, well, SpaceX is super dependent on the government. 00:39:34.240 |
And actually, no, most of our revenue is commercial. 00:39:42.780 |
And there's, I think, at least up until perhaps recently, because they have a new CEO who 00:39:53.320 |
And the CEO before that, I think, had a degree in accounting and never went to the factory, 00:40:02.240 |
So I think if you are in charge of a company that makes airplanes fly, and a spacecraft 00:40:10.140 |
go to orbit, then it can't be a total mystery as to how they work. 00:40:18.200 |
So you know, I'm like, sure, if somebody is like running Coke or Pepsi, and they're like 00:40:25.000 |
great at marketing or whatever, that's fine, because it's not a sort of technology-dependent 00:40:32.880 |
Or if they're running a financial consultancy, and their degree is in accounting, that makes 00:40:40.600 |
But I think if you're the cavalry captain, you should know how to ride a horse. 00:40:50.440 |
It's like, it's disconcerting if the cavalry captain just falls off the horse. 00:41:04.040 |
Shifting gears to AI, Peter was here earlier, and he was talking about how so far the only 00:41:08.400 |
company to really make money off AI is NVIDIA with the chips. 00:41:11.920 |
Do you have a sense yet of where you think the big applications will be from AI? 00:41:18.640 |
Is it going to be in enabling self-driving, is it going to be enabling robots, is it transforming 00:41:24.200 |
I mean, it's still, I think, early in terms of where the big business impact is going 00:41:42.960 |
I think the spending on AI probably runs ahead of, I mean, it does run ahead of the revenue 00:41:53.800 |
But the rate of improvement of AI is faster than any technology I've ever seen by far. 00:42:02.120 |
And it's, I mean, for example, the Turing test used to be a thing. 00:42:10.360 |
Now your basic open source, random LLM, you're running on a frigging Raspberry Pi probably 00:42:22.560 |
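That claim is easy to try firsthand; below is a minimal sketch using the open-source llama-cpp-python bindings, which can run small quantized models on CPU-only hardware like a Raspberry Pi (the model file name is a placeholder, not a specific recommendation):

```python
# Minimal sketch: a small quantized open-source LLM on CPU-only hardware,
# e.g. a Raspberry Pi, via llama-cpp-python. The GGUF file is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="small-model.Q4_K_M.gguf",  # placeholder: any small GGUF model
    n_ctx=512,                             # modest context to fit Pi-class RAM
    n_threads=4,                           # Pi-class CPUs have ~4 cores
)

out = llm("Q: Are you a human or a machine?\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```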
So there's, I think actually the good future of AI is one of immense prosperity, where 00:42:36.440 |
there is an age of abundance, no shortage of goods and services. 00:42:43.280 |
Everyone can have whatever they want, except for things we artificially define to 00:42:52.120 |
But anything that is a manufactured good or provided service will, I think, with the advent 00:42:57.240 |
of AI plus robotics, that the cost of goods and services will trend to zero. 00:43:06.320 |
I'm not saying it'll be actually zero, but it'll be, everyone will be able to have anything 00:43:16.640 |
Of course, in my view, that's probably 80% likely. 00:43:20.840 |
Look on the bright side, only 20% probability of annihilation, nothing. 00:43:33.520 |
I mean, frankly, I do have to go engage in some degree of deliberate suspension of disbelief 00:43:38.200 |
with respect to AI in order to sleep well, and even then, because I think the actual 00:43:46.280 |
issue, the most likely issue is like, well, how do we find meaning in a world where AI 00:43:53.640 |
That is perhaps the bigger challenge, although at this point, I know more and more people 00:44:00.360 |
who are retired, and they seem to enjoy that life, but I think that maybe there'll be some 00:44:07.800 |
crisis of meaning, because the computer can do everything you can do, but better, so maybe 00:44:13.880 |
that'll be a challenge, but really, you need the end effectors, you need the autonomous 00:44:25.040 |
cars, and you need the humanoid robots, or general purpose robots, but once you have 00:44:32.760 |
general purpose humanoid robots, and autonomous vehicles, really, you can build anything, 00:44:44.760 |
and I think that there's no actual limit to the size of the economy. 00:44:48.920 |
I mean, there's obviously the mass of Earth, that'll be one limit, but the economy is really 00:44:57.600 |
just the average productivity per person times number of people. 00:45:03.840 |
And if you've got humanoid robots that can do, where there's no real limit on the number 00:45:10.840 |
of humanoid robots, and they can operate very intelligently, then there's no actual limit 00:45:17.680 |
to the economy, there's no meaningful limit to the economy. 00:45:21.240 |
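The formula stated here is simple enough to write down; a toy version in Python, where every number is an illustrative assumption:

```python
# Toy version of the formula above: economic output is average productivity
# per worker times the number of workers. All numbers are assumptions.
def output(productivity_per_worker: float, workers: float) -> float:
    return productivity_per_worker * workers

humans = 8e9
print(f"Humans only: ${output(12_000, humans) / 1e12:.0f}T")  # ~world-GDP scale

# If general-purpose robots effectively multiply the workforce, output scales
# with them; the remaining limits are materials and energy, not headcount.
robots = 2 * humans
print(f"With robots: ${output(12_000, humans + robots) / 1e12:.0f}T")
```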
You guys just turned on Colossus, which is like the largest private compute cluster, 00:45:30.080 |
It's the most powerful supercomputer of any kind. 00:45:34.440 |
Which sort of speaks to what David said, and kind of what Peter said, which is a lot of 00:45:38.200 |
the kind of economic value so far of AI has entirely gone to NVIDIA, but there are people 00:45:45.360 |
with alternatives, and you're actually one with an alternative, now you have a very specific 00:45:48.800 |
case, because Dojo's really about images, and large images, huge video. 00:45:53.920 |
Yeah, I mean, the Tesla problem is different from the sort of LLM problem. 00:46:03.120 |
The nature of the intelligence is actually, and what matters in the AI is different to 00:46:11.920 |
the point you just made, which is that in Tesla's case, the context is very long. 00:46:19.480 |
Yeah, you've got billions of tokens of context, a nutty amount of context, because you've 00:46:27.640 |
got seven cameras, and if you've got several, let's say you've got a minute of several high 00:46:38.400 |
So the Tesla problem is you've got to compress a gigantic context into the pixels that actually 00:46:47.080 |
matter, and condense that over a time, so you've got to, in both the time dimension 00:46:56.760 |
and the space dimension, you've got to compress the pixels in space and the pixels in time, 00:47:05.080 |
and then have that inference done on a tiny computer, relatively speaking, a few hundred 00:47:12.320 |
gigabytes, it's a Tesla-designed AI inference computer, which is by the way still the best, 00:47:18.320 |
there isn't a better thing we could buy from suppliers. 00:47:21.160 |
So the Tesla-designed AI inference computer that's in the cars is better than anything 00:47:25.720 |
we could buy from any supplier, just by the way, that's kind of a, the Tesla AI chip team 00:47:32.900 |
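To make the compression point concrete, here is a toy numpy sketch of the idea as described, pooling a multi-camera video context in both time and space before a small on-board model sees it; the shapes and the pooling scheme are illustrative assumptions, not Tesla's actual pipeline:

```python
# Toy sketch of the idea above: compress a multi-camera video context in both
# time and space before on-board inference. Shapes are deliberately small;
# average pooling is an illustrative stand-in, not Tesla's pipeline.
import numpy as np

t, cams, h, w, c = 60, 7, 96, 128, 3           # toy-sized frames, 7 cameras
context = np.random.rand(t, cams, h, w, c).astype(np.float32)

def pool(x: np.ndarray, axis: int, k: int) -> np.ndarray:
    """Average-pool `axis` by a factor of k (axis length must divide by k)."""
    shape = list(x.shape)
    shape[axis] //= k
    shape.insert(axis + 1, k)
    return x.reshape(shape).mean(axis=axis + 1)

out = pool(context, axis=0, k=4)               # 4x compression in time
out = pool(out, axis=2, k=8)                   # 8x compression in height
out = pool(out, axis=3, k=8)                   # 8x compression in width
print(context.nbytes // out.nbytes)            # 256x less data to infer over
```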
You guys, in the design, there was a technical paper, and there was a deck that somebody 00:47:36.280 |
on your team from Tesla published, and it was stunning to me. 00:47:40.560 |
You designed your own transport control layer over Ethernet, you were like, "Ah, Ethernet's 00:47:44.560 |
not good enough for us," and you have this TTPoE or something, and you're like, "Oh, 00:47:48.880 |
we're just going to reinvent Ethernet and string these chips," it's pretty incredible 00:47:55.640 |
No, the Tesla chip design team is extremely good. 00:48:01.120 |
But is there a world where, for example, other people over time that need some sort of video 00:48:07.380 |
use case or image use case could theoretically, you'd say, "Oh, why not, I have some extra 00:48:12.680 |
cycles over here," which would kind of make you a competitor of NVIDIA, it's not intentionally 00:48:16.960 |
per se, but ... Yeah, I mean, there's this training and inference, 00:48:25.200 |
and we do have those two projects at Tesla, we've got Dojo, which is the training computer, 00:48:31.280 |
and then our inference chip, which is in every car, inference computer, and Dojo, we've only 00:48:42.420 |
had Dojo 1, Dojo 2 is, we should have Dojo 2 in volume towards the end of next year, 00:48:51.020 |
and that will be, we think, sort of comparable to a B200 type system, a training system, 00:49:02.700 |
and so, I guess there's some potential for that to be used as a service, but Dojo is 00:49:15.880 |
just kind of like, I guess I have some improved confidence in Dojo, but I think we won't really 00:49:26.340 |
know how good Dojo is until probably version 3, it usually takes three major iterations 00:49:33.200 |
on a technology for it to be excellent, and we'll only have the second major iteration 00:49:38.720 |
next year, the third iteration, I don't know, maybe late, you know, 26 or something like 00:49:47.760 |
How's the Optimus project going, I remember when we talked last, and you said this publicly, 00:49:52.300 |
that it's doing some light testing inside the factory, so it's actually being useful, 00:49:58.820 |
what's the bill of materials, and when, you know, for something like that at scale, so 00:50:03.380 |
when you start making it like you're making the Model 3 now, and there's a million of 00:50:06.260 |
them coming off the factory line, what would they cost, $20,000, $30,000, $40,000 you think? 00:50:11.660 |
Yeah, I mean, I've discovered really that anything made in sufficient volume will asymptotically 00:50:19.200 |
approach the cost of its materials, so some things are constrained by the cost of intellectual 00:50:29.580 |
property and like paying for patents and stuff, so a lot of what's in a chip is like paying 00:50:36.580 |
royalties and depreciation of the chip fab, but the actual marginal cost of the chips 00:50:42.620 |
is very low, so Optimus is obviously a humanoid robot, it weighs much less and is much smaller 00:50:51.020 |
than a car, so you could expect that in high volume, and I'd say you also probably need 00:51:00.420 |
three production versions of Optimus, so you need to refine the design at least three 00:51:07.060 |
major times, and then you need to scale production to sort of the million unit plus per year 00:51:13.500 |
level, and I think at that point, the labor and materials on Optimus is probably not 00:51:28.580 |
Basically, think of it like Optimus will cost less than a small car, so at scale volume 00:51:39.020 |
with the three major iterations of technology, and so if a small car costs $25,000, it's 00:51:47.380 |
probably like $20,000 for an Optimus, for a humanoid robot that can be your buddy like 00:51:59.820 |
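The "asymptotically approach the cost of materials" claim from earlier in this answer can be written as a one-line cost model; a toy Python illustration where every number is an assumption:

```python
# Toy cost curve for the claim above: amortized unit cost approaches the
# materials cost as annual volume grows. Every number here is an assumption.
materials = 15_000        # assumed per-unit materials and labor, dollars
fixed = 2e9               # assumed tooling/R&D cost to amortize, dollars

def unit_cost(volume_per_year: int) -> float:
    return materials + fixed / volume_per_year

for v in (10_000, 100_000, 1_000_000):
    print(f"{v:>9,} units/yr -> ${unit_cost(v):,.0f} per unit")
# 10k units: $215,000; 100k: $35,000; 1M: $17,000 -> approaching materials cost
```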
I honestly think people are going to get really attached to their humanoid robot, because 00:52:03.500 |
I mean, like you look at sort of, you watch Star Wars, and it's like R2-D2 and C3PO, I 00:52:07.880 |
love those guys, you know, they're awesome, and their personality, and I mean, all R2 00:52:16.500 |
could do is just beep at you, can't speak English, and C3PO to translate the beeps. 00:52:24.020 |
So you're in year two of that, if you did two or three years per iteration or something, 00:52:28.420 |
it's a decade-long journey for this to hit some sort of scale? 00:52:32.420 |
I would say the major iterations are less than two years, so it's probably on the order 00:52:38.980 |
of five years, maybe six to get to a million units a year. 00:52:46.540 |
And at that price point, everybody can afford one, on planet Earth. 00:52:49.860 |
I mean, it's going to be that one-to-one, two-to-one, what do you think ultimately, 00:52:53.420 |
if we're sitting here in 30 years, the number of robots on the planet versus humans? 00:52:58.620 |
Yeah, I think the number of robots will vastly exceed the number of humans. 00:53:05.060 |
I mean, you have to say, who would not want their robot buddy? 00:53:12.020 |
You know, this is like, especially if it can, you know, it can take care of your, take your 00:53:19.740 |
dog for a walk, it could mow the lawn, it could watch your kids, it could, you know, 00:53:32.020 |
We could send a lot of robots to Mars to do the work needed to make it a colonized planet 00:53:39.180 |
There's like a whole bunch of, you know, robots, like rovers and... 00:53:45.540 |
So yeah, no, I think the sort of useful humanoid robot opportunity is the single biggest opportunity 00:54:01.300 |
Because if you assume like, I mean, the ratio of humanoid robots to humans is going to be 00:54:06.460 |
at least two to one, maybe three to one, because everybody will want one, and then there'll 00:54:11.340 |
be a bunch of robots that you don't see that are making goods and services. 00:54:14.220 |
And you think it's a general, one generalized robot that then learns how to do different 00:54:28.940 |
Yeah, I mean, I'm operating my meat puppet, you know. 00:54:35.660 |
And by the way, it turns out like, as we're designing Optimus, we sort of learn more and 00:54:40.780 |
more about why humans are shaped the way they're shaped. 00:54:45.380 |
And you know, and why we have five fingers and why your little finger is smaller than 00:54:50.020 |
your index finger, obviously why you have opposable thumbs, but also why, for example, 00:54:58.340 |
the muscles, the major muscles that operate your hand are actually in your forearm. 00:55:04.340 |
And your fingers are primarily operated, like... 00:55:09.900 |
The muscles that actuate your fingers are located, the vast majority of your finger 00:55:15.740 |
strength is actually coming from your forearm, and your fingers are being operated by tendons, 00:55:24.900 |
And so the current version of the Optimus hand has the actuators in the hand and has 00:55:31.540 |
only 11 degrees of freedom, so it doesn't have all the degrees of freedom of a human hand, 00:55:36.940 |
which has, depending on how you count it, roughly 25 degrees of freedom. 00:55:45.040 |
And it's also like, not strong enough in certain ways, because the actuators have to fit in 00:55:51.460 |
So the next generation Optimus hand, which we have in prototype form, the actuators have 00:55:57.740 |
moved to the forearm, just like a human, and they operate the fingers through cables, just 00:56:05.300 |
And then the next generation hand has 22 degrees of freedom, which we think is enough to do 00:56:18.300 |
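A toy model of the tendon-drive idea described here, with the actuators "in the forearm" pulling cables that rotate finger joints; the linear cable-to-joint mapping and all numbers are illustrative assumptions, not the Optimus design:

```python
# Toy model of tendon-driven fingers: actuators in the "forearm" reel in
# cable, and each joint rotates by cable_travel / pulley_radius (a linear
# stand-in; the real kinematics and all numbers here are assumptions).
import numpy as np

n_joints = 22                  # degrees of freedom mentioned for the new hand
pulley_radius_m = 0.005        # assumed effective joint pulley radius, meters

def joint_angles_deg(cable_travel_m: np.ndarray) -> np.ndarray:
    return np.degrees(cable_travel_m / pulley_radius_m)

travel = np.full(n_joints, 0.002)       # each actuator reels in 2 mm of cable
print(joint_angles_deg(travel))         # every joint rotates ~22.9 degrees
```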
And presumably, I think it was written that X and Tesla may work together and provide 00:56:25.420 |
services, but my immediate thought went to, "Oh, if you just provide Grok to the robot, 00:56:28.940 |
then the robot has a personality and can process voice and video and images and all of that 00:56:37.940 |
I think everybody talks about all the projects you're working on, but people don't know you 00:56:48.020 |
People don't see it, but I would say, I know for me, the funniest week of my life, or one 00:56:52.740 |
of the funniest, was when you did SNL and I got to tag along. 00:57:00.380 |
Maybe behind the scenes, some of your funniest recollections of that chaotic, insane week 00:57:10.180 |
It was a little terrorizing on the first couple of days. 00:57:13.100 |
Yeah, I was a bit worried at the beginning there because frankly, nothing was funny. 00:57:24.260 |
Well, it's like a rule, but can't you guys just say it? 00:57:32.500 |
The funniest skits were the ones they didn't let you do. 00:57:36.220 |
There were a couple of funny ones, yeah, that they didn't let you do. 00:57:37.220 |
You can say it, so that he doesn't get ... I mean, how much time do we have here? 00:57:40.780 |
Well, we should just give him one or two because it was ... In your mind, which one do we regret 00:57:59.780 |
So, one of the things that I think everyone's been sort of wondering this whole time is, 00:58:20.100 |
Do they have like a delay or like just in case there's a wardrobe malfunction or something 00:58:36.100 |
There's a way to test this, which is we don't tell them what's going on. 00:58:41.900 |
I walk on and say, "This is the script," and I throw it on the ground. 00:58:54.220 |
And the way that we're going to do this is I'm going to take my cock out. 00:59:29.760 |
We're pitching this on Zoom on like a Monday afternoon. 00:59:32.760 |
We're like kind of hungover from the weekend. 00:59:40.760 |
My friends who I think are sort of quite funny, Jason's quite funny. 00:59:48.980 |
I think Jason's the closest thing to Cartman that exists in real life. 00:59:53.980 |
We have a joke going that he's Butters and I'm Cartman. 01:00:05.620 |
So we come in like just like guns blazing with like ideas. 01:00:11.500 |
And we didn't realize actually that's not how it works and that's normally like actors 01:00:17.720 |
and they just get told what to do and like, oh, you mean we can't just like do funny things 01:00:25.940 |
And on the Zoom, they're aghast at Elon's pitch. 01:00:39.140 |
And then after a long silence, like Mike just says the word "crickets." 01:00:48.860 |
And then Elon explains the punchline, which is. 01:01:00.740 |
So then I'm like, so, so, so, so I said like, I'm, I'm, I'm gonna, I'm going to reach down 01:01:10.700 |
into my pants and I stuck my hand in my pants and I'm going to, and I'm, and I want to pull 01:01:15.540 |
my cock out and I tell this to the audience and the audience is going to be like, what? 01:01:22.340 |
And then, and then, and then, and then I pull out a baby rooster, you know? 01:01:31.060 |
And it's like, okay, this is kind of PG, you know, it's like, not that bad. 01:01:44.820 |
And so then, and do you think it's a nice cock? 01:01:48.820 |
And I pitch, I'm like, and then Kate McKinnon walks out. 01:01:53.020 |
And I'm like, oh no, but you haven't heard half of it, so Kate McKinnon comes out and 01:01:56.340 |
she says, Elon, I expected you would have a bigger cock. 01:02:01.940 |
I was like, I don't mean to disappoint you, Kate, but yeah. 01:02:10.260 |
Kate's got to come out with, with, with her cat. 01:02:20.380 |
That's, that's a, that's a, that's a nice pussy you've got there, Kate. 01:02:36.980 |
It's like, oh no, Elon, actually, can I hold your cock? 01:02:41.940 |
Of course, Kate, you definitely hold my cock. 01:02:45.740 |
And then, you know, we exchanged and I think just the audio version of this was pretty 01:02:50.860 |
And, and, and, you know, it's just like, wow, you, I really like stroking your cock. 01:02:57.580 |
And I was like, I'm really enjoying stroking your pussy. 01:03:07.020 |
And yeah, so, you know, they're looking at us like, oh my God, what have we done inviting 01:03:16.660 |
And then they said, they said like, well, um, it is, uh, it is Mother's Day. 01:03:21.420 |
It's Mother's Day, we might not want to go with this. 01:03:26.780 |
A lot of moms in the audience, and I'm like, well, that's a good point. 01:03:31.540 |
It might be a bit uncomfortable for all the moms in the audience, maybe, I don't know. 01:03:37.460 |
Uh, so, uh, yeah, that was, that's the, um, that's the, that's the, that's the, um, cold 01:03:47.140 |
We didn't get that on the air, um, but, uh, we did fight for Doge. 01:03:55.020 |
Well, I mean, there's a bunch of things that I said that were just not on the script. 01:03:57.180 |
Like if they have these like cue cards for what you're supposed to say, and I just didn't 01:04:14.940 |
Elon wanted to do Doge on late night and he says, um, hey, Jake, how can you, um, make 01:04:21.540 |
Like you sort of redo the, you know, that scene from, uh, the, the, the Godfather. 01:04:25.460 |
I mean, you kind of need the music to cue things up. 01:04:28.020 |
You bring me on my daughter's wedding and you ask for Doge. 01:04:42.300 |
You got to have the tuxedo and the sort of job office and the, you know, and you're gonna 01:04:46.260 |
have like Marlon Brando and I said, you come to me on this day of my Doge's wedding and 01:05:14.500 |
So they come to me and I'm, I'm talking to Colin, um, Jost, who's got a great sense 01:05:14.500 |
He loves Elon and he's like, we can't do it because of the law and stuff like that. 01:05:29.220 |
Elon called Comcast and he put in an offer and they just accepted it, NBC. 01:05:39.640 |
And Colin Jost looks at me and I sold it so good and he's like, you're serious. 01:05:45.560 |
I'm like, yep, we own NBC now and he's like, okay, well that kind of changes things, doesn't 01:05:56.880 |
And then he's like, you're fucking with me and I'm like, I'm fucking with you. 01:06:03.720 |
It was the greatest week of, and that like is like two of 10 stories. 01:06:12.120 |
But it was, and I was just so happy for you to see you have a great week of just joy and 01:06:19.600 |
Cause you were launching rockets, you're dealing with so much bullshit in your life to have 01:06:30.480 |
I think we gotta, we gotta get you back on SNL. 01:06:32.480 |
All right, ladies and gentlemen, our bestie, Elon Musk.