
All-In Summit: Elon Musk on Ukraine, X, the creator economy, China, AI, & more


Chapters

0:00 Besties welcome Elon via Starlink
5:31 Ukraine and Starlink
19:10 green shoots of X
22:24 the creator economy and optimizing the X experience
26:43 the ADL, free speech, and advocating for peace
32:41 China
37:20 AI
44:30 where are we with self-driving?


00:00:00.000 | (audience applauding)
00:00:03.160 | - Okay, wait.
00:00:11.080 | I'll just read off all of your companies, Elon.
00:00:13.840 | I know them, but I'm just gonna read them
00:00:15.280 | to make sure I don't miss one,
00:00:16.120 | 'cause there's so many now.
00:00:17.480 | Founder, CEO, chief engineer of SpaceX,
00:00:21.040 | CEO, product architect, and chairman of Tesla,
00:00:24.440 | owner, chairman, CTO of X,
00:00:28.520 | SpaceX.com, founder of Boring Company,
00:00:30.920 | co-founder of Neuralink and OpenAI,
00:00:33.760 | and president of the Musk Foundation.
00:00:35.480 | Did I get everything? - Yeah.
00:00:36.880 | (upbeat music)
00:00:39.460 | - Where are you?
00:00:54.680 | - I am, here's Connie.
00:00:56.280 | (laughing)
00:00:57.680 | - It's kind of absurd.
00:00:58.560 | Where are you, at Starbase?
00:00:59.920 | - I'm in flight, currently.
00:01:02.760 | So, this is a Starlink in-flight connection.
00:01:07.160 | - Are you kidding me?
00:01:08.000 | That's, oh yeah, that works pretty well, huh?
00:01:09.960 | (laughing)
00:01:11.560 | I think there's only one.
00:01:12.840 | (laughing)
00:01:14.760 | Wait, I think it's one of one.
00:01:15.600 | - We're doing a test of how Starlink works
00:01:18.320 | in an airplane at altitude.
00:01:21.520 | - There's only one of those in existence, right?
00:01:23.440 | It's on your plane, that's it, one of one?
00:01:25.800 | - There are a number of airliners that have Starlink,
00:01:28.280 | and there'll be a lot more in the future.
00:01:30.480 | - Nice.
00:01:31.320 | - The Starlink connection,
00:01:32.400 | when something is working properly,
00:01:34.040 | is that you won't even be able to tell
00:01:36.120 | whether you're on the ground or in the air.
00:01:37.960 | 'Cause unlike a geosynchronous satellite,
00:01:41.840 | the latency, you know,
00:01:44.440 | really can be less than 20 milliseconds.
00:01:46.440 | In fact,
00:01:52.240 | I think for some people, the Starlink connection
00:01:53.760 | on the plane will be better
00:01:54.600 | than the connection at their house.
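For intuition on that latency claim, here's a back-of-the-envelope sketch (ours, not from the interview; the altitudes are nominal public figures for geostationary orbit and Starlink's low-Earth-orbit shells):

```python
# Rough propagation-delay comparison: why a low-Earth-orbit constellation
# can beat a geosynchronous satellite on latency. Nominal altitudes only;
# ignores processing, queuing, and ground-segment delays.
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def bent_pipe_ms(altitude_km: float) -> float:
    """One user -> satellite -> ground-station pass, satellite straight overhead."""
    return 2 * altitude_km / C_KM_PER_S * 1_000

for name, alt_km in [("Geosynchronous", 35_786), ("Starlink LEO", 550)]:
    print(f"{name:>14}: {bent_pipe_ms(alt_km):6.1f} ms per hop")
# Geosynchronous: ~238.7 ms per hop (roughly half a second for a ping),
# versus ~3.7 ms per hop for LEO, so sub-20 ms pings are geometrically plausible.
```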
00:01:56.520 | - That'd be pretty great.
00:01:57.800 | How is the Starship doing?
00:02:01.600 | It was incredible to see the first launch,
00:02:04.480 | but I understand you're closing in on the second.
00:02:06.520 | I know you've been working really hard on that,
00:02:07.960 | and the team's working hard on it.
00:02:09.660 | When do you think you're gonna get the next one up,
00:02:12.480 | and what are the chances it makes it to orbit?
00:02:14.780 | - Well, we have the second one stacked at Starbase,
00:02:19.240 | so it's ready to go.
00:02:20.560 | And we finished that up in the last week.
00:02:25.120 | We believe we've completed the remaining items
00:02:28.760 | requested by the FAA,
00:02:30.280 | so we should get our license hopefully soon.
00:02:32.640 | But really, the only thing holding back
00:02:36.480 | the second flight of Starship at this point
00:02:37.800 | is regulatory approval.
00:02:39.680 | - Wow.
00:02:41.200 | What's your expectation, or your hope,
00:02:44.480 | in terms of the probability that it gets to orbit?
00:02:47.360 | - You know, it's just a question of timing.
00:02:49.560 | How long does it take to get the approval,
00:02:52.440 | paperwork, whatnot?
00:02:54.200 | So that's really up to the FAA at this point.
00:02:57.000 | - But what about making it to orbit?
00:02:58.880 | Do you think you got a shot this time?
00:03:00.840 | - We are doing a new staging technique called hot staging,
00:03:05.080 | where you light the upper stage engines, or the ship engines,
00:03:11.040 | while the boost stage is still firing.
00:03:15.920 | And this is the most efficient way to do stage separation
00:03:20.120 | of a rocket going to orbit,
00:03:22.120 | but we did not try that on the last mission,
00:03:23.880 | and we're trying it on this mission.
00:03:25.560 | We think it will be overall better.
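As a gloss on why lighting the ship's engines before separation is more efficient (our sketch, not Musk's words): in a conventional separation the vehicle coasts unpowered for a few seconds, and on a near-vertical ascent gravity drains roughly g·t of velocity during that gap. The coast durations below are made-up examples, not SpaceX figures:

```python
# Hypothetical illustration of gravity losses during the unpowered coast
# between booster cutoff and upper-stage ignition.
G = 9.81  # m/s^2, gravitational acceleration near Earth

def gravity_loss_m_s(coast_seconds: float) -> float:
    """Upper bound on velocity lost to gravity during a near-vertical coast."""
    return G * coast_seconds

for t in (2.0, 5.0, 10.0):
    print(f"{t:4.1f} s coast -> up to {gravity_loss_m_s(t):5.1f} m/s lost")
# Hot staging shrinks this unpowered gap toward zero, which is the
# efficiency gain described above.
```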
00:03:28.040 | But I think probably about,
00:03:31.040 | I hope, we'll have a 50% chance
00:03:33.320 | of getting to stage separation.
00:03:35.000 | And maybe a close to 50% chance of getting to orbit
00:03:39.480 | if the hot staging,
00:03:40.960 | the new separation method, works.
00:03:45.080 | I'd say maybe it's like a,
00:03:47.040 | I'd say probably above 30% chance
00:03:50.080 | of getting to orbit this time,
00:03:51.200 | whereas previously I said below 50.
00:03:52.920 | - Is this, in terms of complexity,
00:03:57.680 | how complex is this of a problem
00:03:59.840 | compared to the other problems
00:04:00.920 | you've worked on in your career?
00:04:02.520 | - Well, I mean, making a rocket
00:04:07.400 | that is more than twice the size of the Saturn V,
00:04:10.120 | you know, it's a, in fact,
00:04:14.920 | with the next iteration of the rocket,
00:04:16.120 | it'll have roughly three times the thrust
00:04:17.480 | of a Saturn V moon rocket.
00:04:19.760 | And the rocket's designed to be fully and rapidly reusable,
00:04:22.640 | whereas the Saturn V was completely expendable.
00:04:27.640 | And with Falcon 9, we still expend the upper stage,
00:04:32.640 | but we bring back the booster,
00:04:34.440 | as people have probably seen in the rocket landing videos.
00:04:37.160 | And we are also able to recover the fairing
00:04:40.600 | with Falcon 9.
00:04:43.040 | But these things do land typically out at sea.
00:04:45.600 | So it takes a while to bring them back to port
00:04:48.200 | and get them flying again.
00:04:49.600 | So there's the scale of Starship,
00:04:54.480 | but then also the fact that it is designed
00:04:55.960 | for full and rapid reusability.
00:04:57.680 | So both the booster and the ship
00:04:59.360 | come back to the launch site.
00:05:00.720 | They get caught by these giant Godzilla arms.
00:05:03.520 | You've seen Kong versus Godzilla.
00:05:07.080 | It's basically that.
00:05:08.800 | Catches this giant rocket out of thin air
00:05:12.880 | and puts it back on the launch stand
00:05:14.680 | and gets ready for launch.
00:05:16.160 | So it will be capable of basically
00:05:19.560 | aircraft level flight rates.
00:05:22.000 | It's much bigger than say a 747 or an A380.
00:05:26.920 | - Elon, can we talk about the events of,
00:05:33.000 | was it last weekend, the whole Ukraine Starlink thing?
00:05:35.640 | Can you give us the tick-tock of what's going on
00:05:39.960 | and how you're being forced to decide?
00:05:43.080 | (laughing)
00:05:46.200 | But what is it like in that decision room,
00:05:49.200 | if there was one, or wherever you were,
00:05:50.960 | where you're trying to figure out,
00:05:52.040 | am I keeping this on, do I turn it off,
00:05:53.520 | what is going on?
00:05:54.560 | People must have been bombarding you.
00:05:56.080 | Whatever you can share about what that was like,
00:05:58.440 | how you made the decision.
00:05:59.720 | - Yeah, so I think someone was actually mistaken
00:06:03.800 | a little bit in his understanding of the situation.
00:06:06.960 | Obviously, SpaceX has provided Starlink connectivity
00:06:14.560 | to Ukraine really since the beginning of the war,
00:06:19.120 | really within a few days of the war starting.
00:06:22.920 | And as the Ukrainian government said,
00:06:26.480 | the Starlink was instrumental in the defense of Ukraine.
00:06:31.080 | So they've said that really many times,
00:06:34.160 | although the media forgets to mention that.
00:06:36.320 | So, and in fact, they've said it on X,
00:06:41.200 | formerly known as Twitter.
00:06:42.640 | (laughing)
00:06:44.000 | It's gonna take a while to get that right, yeah.
00:06:46.600 | It'll take a little time.
00:06:47.640 | - Okay.
00:06:48.480 | (laughing)
00:06:50.720 | So, you don't have to take my word for it,
00:06:58.040 | you can just read what they posted.
00:07:00.080 | So Starlink has been incredibly helpful
00:07:06.080 | to the Ukrainian war effort.
00:07:07.480 | We've gone out of pocket very significantly to help them.
00:07:13.440 | And at the time this happened,
00:07:17.520 | the region around Crimea was actually turned off.
00:07:22.520 | Now the reason it was turned off
00:07:25.400 | originally was because the United States
00:07:27.440 | had sanctions against Russia,
00:07:29.160 | and those sanctions
00:07:32.760 | include Crimea.
00:07:35.120 | And we're not allowed to actually turn on connectivity
00:07:39.360 | to a sanctioned country without explicit government approval.
00:07:43.640 | Which we did not have from the US government.
00:07:45.880 | So, basically the,
00:07:50.360 | look, Ukraine didn't give us any advance warning
00:07:54.760 | or heads up or anything.
00:07:56.760 | We just got these sort of urgent calls
00:08:00.840 | from the Ukrainian government saying
00:08:02.120 | that we needed to turn on Crimea.
00:08:04.160 | It's like in the middle of the night, basically.
00:08:06.200 | We're like, what are you talking about?
00:08:08.560 | - You asked?
00:08:09.400 | - What's it for?
00:08:11.240 | (laughing)
00:08:13.480 | We basically figured out that this was
00:08:20.800 | kind of like a Pearl Harbor type attack on the Russian fleet,
00:08:25.720 | on the Russian fleet in Sevastopol.
00:08:27.680 | So they were really asking us to practically
00:08:30.000 | take part in a major act of war.
00:08:32.880 | And, well, we certainly have huge support
00:08:39.240 | for the Ukrainian government.
00:08:41.520 | Ukrainian government is not in charge of US people
00:08:46.160 | or companies.
00:08:47.440 | (audience cheering)
00:08:51.480 | - And Elon, if I could just--
00:08:54.160 | - No, but I should say that,
00:08:56.920 | although I'm not President Biden's biggest fan,
00:08:59.040 | if I had received a presidential directive to turn it on,
00:09:03.280 | I would have done so.
00:09:04.360 | Because I do regard the president
00:09:06.160 | as the chief executive officer of the country,
00:09:08.640 | whether I want that to be the president or not,
00:09:12.360 | I still respect the office.
00:09:14.080 | And so if I'd gotten a request from the president
00:09:19.080 | type of thing, from the American president,
00:09:21.520 | to be clear,
00:09:22.360 | (audience laughing)
00:09:25.600 | then I would have turned it on.
00:09:27.680 | So, but no such request came through.
00:09:29.680 | - That's a really interesting point.
00:09:32.800 | And I mean, what Jamal's referring to
00:09:36.600 | is you're now being attacked.
00:09:37.840 | I saw there was Jake Tapper the other day on CNN
00:09:41.640 | interviewing our Secretary of State,
00:09:43.400 | was just, he was all lathered up,
00:09:45.560 | basically attacking you for this.
00:09:47.720 | David-- - Yeah.
00:09:48.600 | - David-- - I mean, to his credit,
00:09:51.080 | Secretary Blinken was actually quite supportive,
00:09:54.200 | despite the absurd accusations and leading questions
00:09:58.720 | of Jake Tapper on CNN.
00:10:01.080 | - Yeah, he didn't take the bait, to his credit.
00:10:03.280 | - No, kudos to Secretary Blinken, in that regard,
00:10:07.000 | for not taking the bait at all.
00:10:10.720 | - Yeah, well, to me, this is an example
00:10:13.560 | of no good deed goes unpunished,
00:10:15.520 | because if you had never given--
00:10:17.720 | - I hope some good deeds go unpunished.
00:10:21.040 | (audience laughing)
00:10:22.840 | - I mean, if you had never given Starlink--
00:10:25.920 | - I just aspire to that.
00:10:27.520 | - Yeah. (laughing)
00:10:29.520 | But yeah, I mean, my point is just,
00:10:31.240 | if you had never given Starlink
00:10:32.800 | to the Ukrainian government for free, voluntarily,
00:10:35.600 | you just volunteered it,
00:10:37.000 | then no one would be attacking you right now
00:10:39.240 | for not turning it on
00:10:40.440 | so they could do their attack on Crimea.
00:10:42.440 | Also, one other thing I'll note
00:10:45.360 | is that your reason for not turning it on,
00:10:47.440 | which is you don't want to be part
00:10:48.520 | of what could be a major escalation,
00:10:50.960 | was exactly the reason the Biden administration
00:10:54.440 | did not give ATACMS missiles to Ukraine
00:10:58.600 | at that point in the war.
00:10:59.680 | Now, they may be changing their minds,
00:11:01.320 | but they were very worried
00:11:03.040 | the administration was, about
00:11:05.480 | an attack on Crimea
00:11:07.160 | triggering some huge escalation in this war.
00:11:09.800 | So not only did you not receive a directive
00:11:13.040 | from President Biden,
00:11:15.680 | your thinking was very much in line
00:11:17.000 | with theirs at the time,
00:11:19.040 | and yet, and you're being attacked for that now.
00:11:21.480 | - There's something you mentioned,
00:11:24.200 | which is that you did this
00:11:25.640 | at a lot of economic cost to SpaceX.
00:11:28.880 | Can you just talk about that for a second?
00:11:30.600 | 'Cause I'm not sure people understand
00:11:32.520 | who's paying for what right now
00:11:33.840 | and who hasn't been paid,
00:11:34.920 | and you know, et cetera, et cetera.
00:11:36.720 | - Yeah, there is,
00:11:38.640 | well, I should say,
00:11:40.760 | like a lot of people contributed to the effort.
00:11:43.240 | Stalingrad is the fundamental communication backbone
00:11:46.480 | of the Ukrainian government and essential services,
00:11:49.760 | like first responders and that kind of thing.
00:11:51.840 | And it's used, we hope, peacefully,
00:11:55.280 | relatively peacefully on the war front.
00:11:56.880 | It is the only thing that works on the war front.
00:11:59.000 | Everything else has been jammed by the Russians.
00:12:01.360 | So it's the only thing that works.
00:12:04.000 | Not one of the things.
00:12:05.400 | (audience applauding)
00:12:08.560 | But I think you have to sort of think of, say,
00:12:15.160 | taking the actual example of Pearl Harbor
00:12:20.320 | and say, well, how did that work out for Japan?
00:12:23.880 | It didn't work out well at all
00:12:25.560 | because it was a tactical victory, a strategic defeat.
00:12:31.440 | It enraged the American public.
00:12:34.320 | The nation sort of wanted vengeance
00:12:36.640 | for this act, you know.
00:12:39.360 | And I think that, you know,
00:12:41.320 | while I don't think it's on the same scale,
00:12:42.880 | there was certainly that potential
00:12:44.760 | of sort of a mini Pearl Harbor
00:12:47.680 | which results in a massive escalation of hostilities.
00:12:52.120 | But this would not defeat Russia.
00:12:56.600 | It would enrage Russia.
00:12:58.240 | - Do you donate the network, or do they pay you for it?
00:13:02.960 | - Sorry?
00:13:03.800 | - Yeah.
00:13:05.480 | - So I'm actually not sure what the final accounting is
00:13:08.640 | at this point, but I think at one point
00:13:12.240 | we calculated our sort of cost
00:13:15.440 | of supporting Ukraine at roughly $100 million.
00:13:17.600 | Now, the $100 million does not count the massive risk
00:13:21.480 | to the entire Starlink constellation
00:13:24.120 | because Russia would like to have the entire thing deleted.
00:13:31.040 | You know, nobody compensating us for that.
00:13:33.160 | And so if we were to get, say,
00:13:37.480 | our control center were take down in a cyber attack,
00:13:41.480 | they, you know, they could command all the satellites
00:13:43.480 | to be over and destroy the entire system
00:13:46.760 | or use anti-satellite weapons.
00:13:49.880 | So, you know, these are, this is a pretty significant risk
00:13:56.000 | for which we have not received any compensation.
00:14:00.200 | And obviously it would be catastrophic
00:14:01.640 | to the entire Starlink system,
00:14:04.240 | which is, you know, approaching $10 billion.
00:14:07.160 | - Elon, do you think the current government administration--
00:14:11.040 | - I'm not saying, hey, pay us $10 billion.
00:14:13.040 | And then actually, I'd say
00:14:14.280 | one of the rather interesting things was,
00:14:17.560 | as you've seen this, there's a very large amount of money
00:14:20.920 | that's been appropriated for Ukraine.
00:14:23.880 | You know, I'm not sure what the total is at this point,
00:14:26.920 | but it must be a hundred, close to a hundred billion
00:14:29.680 | or somewhere between 80 and a hundred billion.
00:14:32.200 | You know, now all of the, you know,
00:14:36.880 | other sort of providers, the US providers of support
00:14:40.160 | to Ukraine are being paid.
00:14:42.400 | So then why should SpaceX be excluded?
00:14:44.640 | That doesn't make sense.
00:14:46.840 | We're doing one of the most valuable things
00:14:49.360 | and yet are getting the least money in the system.
00:14:52.200 | But, you know, despite that,
00:14:55.200 | we're still happy to keep going.
00:14:57.240 | And--
00:14:58.080 | - Elon, does the Biden administration have it out for you?
00:15:03.120 | And why?
00:15:03.960 | - Why, whatever would give you that idea?
00:15:06.920 | - Yeah.
00:15:07.760 | But let me ask, you own and control--
00:15:13.440 | - I don't know if the whole administration has it out for me.
00:15:16.000 | I think there's probably aspects of the administration
00:15:18.480 | that are not, or, you know, aspects of, you know,
00:15:24.080 | interests aligned with President Biden,
00:15:29.080 | who probably do not wish good things for me.
00:15:34.160 | I don't know, you know, really what their issue is,
00:15:38.680 | but there does seem to be a significant increase
00:15:42.680 | in the weaponization of government.
00:15:44.480 | And I would say really sort of misuse
00:15:51.120 | of prosecutorial discretion in many areas where,
00:15:55.640 | and I think this is really a dangerous thing,
00:15:59.560 | you know,
00:16:04.040 | for there to be partisan politics with government agencies.
00:16:10.120 | It's just really, and then I think from, you know,
00:16:15.120 | from say, you know, a Democratic Party standpoint
00:16:20.480 | or say a Biden administration standpoint,
00:16:22.840 | I think the danger here is that if there's
00:16:27.960 | a significant misuse of prosecutorial discretion,
00:16:31.560 | let's say one says, okay, everyone's equal under the law.
00:16:34.200 | Yes, but who are you choosing to pursue?
00:16:37.000 | And if you're pursuing what appear to independent voters
00:16:42.400 | to be trivial cases while ignoring serious crimes,
00:16:48.400 | it's hard to imagine
00:16:53.400 | that's gonna win over a lot of thoughtful independent voters.
00:16:56.960 | - Did things change when you bought Twitter?
00:17:02.720 | - Yeah, I think they did change somewhat.
00:17:04.840 | You know, our goal with the sort of, you know,
00:17:10.800 | the X platform is really to be a level playing field,
00:17:16.000 | a public square that is supportive of, you know,
00:17:21.000 | most of the country, let's say that the middle 80%
00:17:23.320 | or something like that.
00:17:24.480 | Now, that has not been the case really for all social media.
00:17:30.640 | All social media have been really very left leaning,
00:17:34.040 | far left leaning, and really Twitter was far left leaning.
00:17:37.000 | You know, the suspensions of say Republican candidates
00:17:43.360 | or interests or voices was really at least 10 times
00:17:47.880 | the rate of suppression of left-wing voices on old Twitter.
00:17:52.880 | So, you know, what we're trying to do is move it
00:18:00.320 | to the middle, which from the standpoint of, say, the left
00:18:04.320 | appears to be moving to the right. Everything's relative
00:18:08.080 | if you're standing on the left.
00:18:09.640 | (laughing)
00:18:10.800 | But it's not, it's simply moving to the middle, that's all,
00:18:15.000 | in an attempt to actually represent the whole country
00:18:19.240 | and not just, you know, half the country
00:18:22.800 | or even maybe less than half the country.
00:18:25.040 | So that's it really.
00:18:27.280 | So I think there's really nothing to be alarmed about here.
00:18:29.840 | It's just that it's intended to be a town square
00:18:34.840 | inclusive of the whole country and the world, that's all.
00:18:39.480 | - It's been, I guess you took over X Twitter
00:18:44.480 | on Halloween weekend, if I remember correctly,
00:18:47.880 | when you got to the building and you got the keys.
00:18:50.640 | David and I were lucky to be there with you
00:18:53.240 | when you got the keys and we got to check things out.
00:18:57.560 | This is 10 months into the turnaround
00:18:59.880 | and it wasn't a high functioning organization,
00:19:02.960 | I think, when you took it over.
00:19:04.640 | Where is the company at now and are you pleased with,
00:19:08.360 | I guess, the progress because it looks like
00:19:10.200 | new features are getting launched,
00:19:11.560 | the product velocity is great.
00:19:13.800 | Obviously advertising's been challenging,
00:19:16.560 | but it feels like there's some green shoots.
00:19:18.200 | So how do you feel about the purchase now?
00:19:20.760 | - Yeah, well I should say we've recently seen
00:19:24.320 | a significant increase in advertising, which is great.
00:19:27.160 | So that's, if that trend continues,
00:19:31.400 | I think the company will be in very good financial shape
00:19:34.720 | on the advertising front.
00:19:36.840 | So that, in terms of positive developments,
00:19:39.200 | that seems to be one of them.
00:19:41.240 | And from a feature standpoint,
00:19:46.640 | I think that those who are using the system,
00:19:48.960 | I think we might have delivered more new features
00:19:53.760 | in the last year than
00:19:56.240 | old Twitter did in the last five years.
00:19:59.000 | The feature development pace is very rapid.
00:20:06.280 | And this is being done with really about
00:20:09.480 | 15% of the original company.
00:20:11.200 | Maybe a little more, 15 to 20%.
00:20:16.080 | So it's really efficient.
00:20:19.480 | At the end of the day, you have to say,
00:20:21.760 | how complicated is a system like the X Twitter platform?
00:20:26.760 | How different is it from a group chat, frankly?
00:20:31.600 | It's like a group chat at scale.
00:20:34.080 | So I don't think you need an army to maintain a group chat.
00:20:39.080 | - Yeah.
00:20:40.880 | (audience laughing)
00:20:42.920 | (audience applauding)
00:20:44.160 | Yeah.
00:20:45.000 | I mean, it's not the self-driving platform,
00:20:49.360 | and it had maybe 10 times as many people working on it
00:20:53.200 | as the self-driving platform at Tesla, which seems crazy.
00:20:58.200 | - The entire self-driving AI software team is 200 people.
00:21:03.400 | And what they're doing is much more complex than Twitter.
00:21:06.680 | - Yeah.
00:21:07.520 | - Or, you know, much more.
00:21:08.360 | (audience laughing)
00:21:11.360 | - Well, you know.
00:21:12.200 | - There's other things that obviously need to be done,
00:21:15.360 | like advertising sales,
00:21:17.080 | obviously network operations.
00:21:20.320 | - How, can you talk to us about that?
00:21:23.080 | - It's really not, it's not a huge,
00:21:25.840 | I don't know, amount of people needed for what we're doing here.
00:21:29.560 | And I think, you know, people that are still at the company
00:21:33.840 | are obviously being very productive
00:21:36.200 | in creating and delivering new features.
00:21:39.560 | And, you know, we keep seeing sort of broad-based usage.
00:21:44.920 | And the most rigorous number is really the user seconds
00:21:49.720 | as reported by the mobile device, especially iOS.
00:21:53.080 | The iOS, what iOS reports as the screen time
00:21:58.080 | is the least gameable metric.
00:22:01.400 | And those numbers are very good.
00:22:04.720 | So, you know, I think, of course,
00:22:09.600 | I'm pretty optimistic about where things are headed.
00:22:11.760 | And I feel like the company's just, you know,
00:22:14.880 | just recently turned a corner.
00:22:17.240 | - Tell us about.
00:22:18.240 | - You know, it's turned to, well, at least moderate prosperity
00:22:21.280 | and hopefully significant prosperity.
00:22:23.080 | - Tell us about the success of sharing revenue.
00:22:26.640 | Why did you do it?
00:22:27.800 | And then just the vision you have
00:22:29.480 | for just the creator economy
00:22:30.840 | and what you want that to evolve into and build into.
00:22:33.520 | - Yeah, I mean, it stands to reason that
00:22:36.800 | if you're a creator, you need to make a living
00:22:41.800 | from what you're doing.
00:22:44.480 | So there's gotta be, you know, fair compensation,
00:22:49.480 | competitive compensation for a creator,
00:22:52.320 | whether they're doing, you know,
00:22:54.440 | writing or pictures, video, whatever the case may be.
00:22:58.680 | And so we're not really advancing anything new here.
00:23:05.000 | We're just, you know, as YouTube does with creators,
00:23:07.720 | they will do rev share with advertising.
00:23:10.800 | And so we're doing rev share with advertising.
00:23:13.400 | We're also, you know,
00:23:15.800 | obviously they have enabled direct subscription
00:23:18.400 | to accounts where whatever that somebody, you know,
00:23:21.880 | you could be doing audio, video, long form text, anything,
00:23:25.400 | and you could subscribe to someone.
00:23:27.360 | And that's, you know,
00:23:29.520 | obviously that's where you're first subscriber
00:23:30.920 | and make a living as well, you know,
00:23:33.400 | for a creator to make a living.
00:23:34.960 | So the intent is for the X platform
00:23:37.280 | to be the best home for creators,
00:23:40.720 | where if you've got interesting content,
00:23:42.720 | then you'd want to put it on our platform.
00:23:45.440 | And, you know, there's a lot of questions
00:23:48.400 | about like sort of the algorithm and whatnot.
00:23:50.440 | I should mention like the algorithm is,
00:23:53.640 | I think almost all of it is open sourced
00:23:55.840 | and we will, I think quite soon,
00:23:57.800 | have the entire thing open source.
00:24:00.520 | The only reason it really hasn't been done
00:24:03.040 | entirely open source yet
00:24:04.000 | is because we're somewhat embarrassed with the code
00:24:05.680 | and need to just clean it up
00:24:07.280 | before putting something extremely embarrassing out there.
00:24:11.040 | But the point is that, like, transparency builds trust.
00:24:15.080 | And if you can recreate the results
00:24:20.760 | on the X platform, of how viral a post is gonna be,
00:24:25.240 | independently using the public algorithm,
00:24:29.600 | you know, the open source algorithm,
00:24:31.520 | that's really where we want to get to.
00:24:34.400 | So you kind of know what to expect
00:24:38.160 | and why something happened.
00:24:40.560 | Now I should say that we are trying to optimize
00:24:45.040 | for user time on the platform.
00:24:51.040 | What this naturally means is that posting content
00:24:56.040 | that someone looks at longer
00:24:59.520 | is gonna get higher priority than content that is short.
00:25:03.520 | Just because the system is trying to maximize,
00:25:07.840 | it's aspiring to maximize
00:25:11.160 | unregretted user minutes is what I call it.
00:25:15.320 | So like basically how do we,
00:25:17.520 | if we're succeeding,
00:25:19.160 | you want to spend more time on the platform
00:25:21.800 | and you want to, and after having spent that time,
00:25:24.400 | you don't want to regret it.
00:25:25.960 | Speaking of TikTok,
00:25:28.120 | you know, I've had a lot of people tell me
00:25:31.760 | they spent a lot of time on TikTok and they regret it.
00:25:34.520 | We don't want to be,
00:25:36.800 | we want it to be that you spent a lot of time
00:25:39.200 | on the X platform and you learned a lot,
00:25:42.840 | you were entertained and you don't regret it.
00:25:46.400 | So when you're optimizing for user minutes,
00:25:50.280 | and like I said, aspirationally unregretted user minutes,
00:25:53.280 | if you, the more content that you post on the system,
00:25:58.360 | the more reach that thing will get
00:26:01.000 | because the system is saying,
00:26:02.400 | oh, the user is spending more time on the platform
00:26:05.960 | because they're seeing say your podcast
00:26:08.680 | or reading a long form article or watching some video.
00:26:15.840 | That's going to get a lot more time than say,
00:26:18.120 | if you link to a video elsewhere
00:26:20.720 | or you link to an article elsewhere,
00:26:23.080 | that's just, that means people will view that post
00:26:27.160 | for a very short period of time.
00:26:28.960 | And so the system will be like,
00:26:30.360 | okay, that did not increase user time.
00:26:33.160 | So it will, it won't be excluded,
00:26:35.000 | but it will get less attention
00:26:37.280 | than actually posting content natively on the system.
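To make that mechanism concrete, here is a toy sketch of ranking by expected on-platform time. The field names, weights, and the down-weight for outbound links are hypothetical illustrations of what Musk describes, not X's actual ranking code (the portions that have been open-sourced live at github.com/twitter/the-algorithm):

```python
# Toy feed ranker in the spirit of "maximize (unregretted) user minutes."
from dataclasses import dataclass

@dataclass
class Post:
    predicted_dwell_seconds: float  # model's estimate of time spent on the post
    is_outbound_link: bool          # True if the post sends the user off-platform

def rank_score(post: Post) -> float:
    score = post.predicted_dwell_seconds
    if post.is_outbound_link:
        # Off-platform links accrue little on-platform time; they are not
        # excluded, just down-weighted (the factor is a made-up example).
        score *= 0.3
    return score

feed = [Post(240.0, False), Post(15.0, True), Post(90.0, False)]
for post in sorted(feed, key=rank_score, reverse=True):
    print(f"{rank_score(post):6.1f}  {post}")
```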
00:26:40.800 | - Do you want to talk about the ADL
00:26:45.800 | and you sort of what the status of that is,
00:26:49.000 | whether you're pursuing a lawsuit or not,
00:26:51.080 | or where that stands?
00:26:52.400 | - I think we'll have to see about that.
00:26:55.240 | I mean, yeah, I mean, the fact of the matter
00:26:59.800 | is that ADL did initiate a boycott.
00:27:03.680 | They don't call it a boycott, they call it a pause.
00:27:05.920 | But you know, a pause that is never-ending is a boycott.
00:27:10.040 | So it's the same thing.
00:27:12.840 | So, and we just, we saw a massive drop in US advertising.
00:27:17.840 | We saw basically no change in advertising in Asia,
00:27:23.080 | but domestically, where the ADL is strong,
00:27:25.320 | we saw a 60% drop in advertising.
00:27:28.480 | So, you know, that's pretty intense.
00:27:32.000 | And this is despite, you know,
00:27:36.480 | showing repeated analysis of the system,
00:27:39.920 | including third party analysis of the system,
00:27:42.720 | which actually showed that the number of views
00:27:46.400 | of hateful content declined.
00:27:49.200 | So, you know, the third parties who have all the data
00:27:54.920 | analyzed and said, actually, there's less hate speech.
00:27:57.680 | The issue, I think, with the ADL
00:28:01.480 | is not a question of hate speech.
00:28:03.400 | It's not a question of anti-Semitism, obviously.
00:28:06.000 | It's that the ADL and a lot of other organizations
00:28:09.360 | have become activist organizations
00:28:12.640 | which are acting far beyond their stated mandate
00:28:16.840 | or their original mandate.
00:28:18.440 | And I think far beyond what donors
00:28:20.640 | to those organizations think they are doing.
00:28:22.840 | You know, one of the things that the ADL
00:28:27.040 | was extremely opposed to,
00:28:29.080 | and in fact was instrumental in making happen,
00:28:31.520 | was that the ADL was instrumental
00:28:32.880 | in getting Donald Trump deplatformed.
00:28:36.080 | And then when we, you know, we restored the account
00:28:42.040 | that they made it super clear
00:28:43.720 | that they regarded simply restoring his account
00:28:46.280 | on Twitter and Alex, that constituted hateful speech.
00:28:51.280 | I mean, he hasn't even said anything, you know.
00:28:55.000 | He hasn't at least said something or posted something
00:28:59.480 | for there to be incremental hateful content.
00:29:02.800 | This is absurd.
00:29:04.120 | And what's this got to do with anti-Semitism?
00:29:06.320 | Donald Trump's son-in-law is Jewish.
00:29:09.480 | His Jewish grandchildren are pretty sure
00:29:10.960 | he's not anti-Semitic, okay.
00:29:13.040 | You know, it's at the wedding.
00:29:15.560 | (laughing)
00:29:17.800 | The problem is that a lot of these organizations,
00:29:23.280 | like I said, they're really being captured
00:29:24.600 | by the woke agenda and they're pushing, you know,
00:29:27.880 | a series of beliefs and values that I think are often
00:29:30.440 | contrary to what their donors believe.
00:29:34.000 | And that's what we have in this situation.
00:29:37.960 | - Yeah, I'll note that the two positions
00:29:40.480 | that you've taken that have brought the most heat on you,
00:29:43.660 | number one, defending free speech,
00:29:45.920 | number two, advocating peace.
00:29:48.440 | (laughing)
00:29:49.800 | - How dare you, you lot mother.
00:29:51.920 | - How dare you?
00:29:52.960 | - How dare you?
00:29:54.000 | - And there's an article--
00:29:56.560 | - I feel like I'm in the opposite world or something.
00:29:58.120 | - Yeah, we're living in an upside down world.
00:30:00.160 | There's an article in today's New Yorker
00:30:02.400 | calling you a super villain because you're advocating peace
00:30:05.560 | and protecting the First Amendment.
00:30:06.840 | I mean, it's like completely upside down.
00:30:09.120 | (laughing)
00:30:11.960 | - Do you want people to eat their vegetables?
00:30:14.960 | - At this point, you literally cannot tell
00:30:17.520 | actual press from parody.
00:30:19.880 | - No.
00:30:20.720 | - If that was in the Babylon Bee or the Onion,
00:30:24.160 | (laughing)
00:30:25.120 | - No, literally.
00:30:26.280 | - You're doing it--
00:30:27.120 | - And change the banner to, you know, Babylon Bee,
00:30:31.160 | whatever, Onion or something like that,
00:30:32.480 | have some parody thing and be like,
00:30:34.600 | oh, that's a good joke.
00:30:36.120 | (laughing)
00:30:37.800 | - Yeah, super villains normally advocate for peace.
00:30:40.000 | That's, you know, of course.
00:30:41.520 | (laughing)
00:30:43.600 | - We wanna get rid of all the nuclear weapons.
00:30:46.600 | Hey, you want--
00:30:47.720 | - Well, hold on, hold on.
00:30:48.560 | - There was a, that's a, what?
00:30:52.320 | (laughing)
00:30:53.160 | - The funniest skit that didn't make it on SNL
00:30:57.120 | that we were workshopping was probably "Woke James Bond."
00:31:01.400 | And we wanted to do like this "Woke James Bond"
00:31:04.960 | and then we were like,
00:31:05.800 | "Oh, we're gonna do this."
00:31:06.640 | And then we were like,
00:31:07.480 | "Oh, we're gonna do this."
00:31:08.320 | And then we were like,
00:31:09.160 | "Oh, we're gonna do this."
00:31:10.000 | And then we were like,
00:31:10.840 | "Oh, we're gonna do this."
00:31:11.680 | And then we were like,
00:31:12.520 | "Oh, we're gonna do this."
00:31:13.360 | And then we were like,
00:31:14.200 | "Oh, we're gonna do this."
00:31:15.040 | And then we were like,
00:31:15.880 | "Oh, we're gonna do this."
00:31:16.720 | And then we were like,
00:31:17.560 | "Oh, we're gonna do this."
00:31:18.400 | And then we were like,
00:31:19.240 | "Oh, we're gonna do this."
00:31:20.080 | And then we were like,
00:31:20.920 | "Oh, we're gonna do this."
00:31:21.760 | And then we were like,
00:31:22.600 | "Oh, we're gonna do this."
00:31:23.440 | And then we were like,
00:31:24.640 | - We had a Graham Allison.
00:31:29.280 | - In reality, also, like, you know,
00:31:34.280 | the list of conspiracy theories that haven't come true
00:31:34.280 | is quite short.
00:31:37.280 | (laughing)
00:31:39.520 | And we really need more conspiracies generated
00:31:46.200 | because we're running out of--
00:31:47.360 | - To find the truth.
00:31:48.300 | (laughing)
00:31:51.160 | - The to be checked off is accurate.
00:31:53.120 | So I don't know who's responsible
00:31:57.800 | for these conspiracy theories,
00:31:59.360 | but we've seen some more material.
00:32:03.080 | - Paging Alex Jones.
00:32:04.760 | - Elon, we had Graham Allison here today.
00:32:06.520 | I know you talked about his book.
00:32:08.200 | We had Ray Dalio here.
00:32:09.760 | We had Ro Khanna.
00:32:11.400 | And we talked a lot about China,
00:32:13.040 | the US relationship with China.
00:32:14.960 | You have several businesses that have deep supplier
00:32:18.240 | and customer relationships in China.
00:32:21.320 | Given what's going on,
00:32:22.280 | and clearly the tenor has changed,
00:32:24.360 | the mood has changed with respect to US policy
00:32:27.360 | towards China, what it's like in DC,
00:32:29.360 | what it's like in Silicon Valley,
00:32:30.680 | and how everyone talks about the relationship
00:32:32.480 | with China today.
00:32:33.560 | It's pretty crazy how quick things have changed.
00:32:36.480 | As a business leader with all these business relationships
00:32:39.420 | with China, how do you make decisions
00:32:41.320 | as things are changing?
00:32:43.080 | And how do you think about where this is headed?
00:32:46.480 | - Sure, well, I mean, let's just clarify here.
00:32:49.320 | You know, SpaceX has no,
00:32:51.040 | SpaceX and Starlink have no business in China whatsoever.
00:32:55.760 | They're not allowed, you know,
00:32:58.120 | SpaceX doesn't launch Chinese satellites,
00:33:01.200 | and Starlink is banned in China.
00:33:04.720 | So, to be clear, SpaceX, Starlink, zero business in China.
00:33:08.280 | In the case of Tesla,
00:33:12.960 | one of our four vehicle factories,
00:33:16.720 | one is in China.
00:33:19.880 | So, you know, it's a significant car market,
00:33:24.840 | but it is, you know,
00:33:27.160 | so what I'm trying to say is,
00:33:28.960 | by far the bulk of my business interests,
00:33:31.880 | if I were purely mercantile, which I aspire not to be,
00:33:35.120 | are outside of China, let's just be clear about that.
00:33:39.520 | Then, with respect to,
00:33:42.040 | now, that said, I think I understand China well.
00:33:45.000 | I've been there many times,
00:33:46.040 | I've met with senior leadership
00:33:48.040 | at many levels in China for many years.
00:33:52.080 | And so, I think I've got a pretty good understanding,
00:33:55.280 | at least as an outsider of China.
00:33:57.880 | So, and Tesla has been very successful domestically in China.
00:34:02.400 | So, you know, the fundamental thing here
00:34:07.400 | is really Taiwan.
00:34:10.400 | China has, well, really,
00:34:14.840 | for like half a century or so,
00:34:18.640 | maybe longer at this point,
00:34:21.280 | had a policy to sort of reunite
00:34:26.280 | Taiwan with China.
00:34:28.080 | From their standpoint, you know,
00:34:29.400 | it may be analogous to like Hawaii or something like that.
00:34:32.480 | Like an integral part of China
00:34:34.080 | that is arbitrarily not part of China.
00:34:38.560 | Mostly because of the US,
00:34:41.000 | as the US Pacific Fleet has stopped
00:34:44.840 | any sort of reunification effort by force.
00:34:49.840 | So, now, really things get to the point,
00:34:54.840 | increasingly year over year,
00:34:58.040 | where China's military strength is increasing,
00:35:00.640 | and ours is more or less static.
00:35:03.720 | And strategically, you know,
00:35:05.920 | you can imagine trying to defend Taiwan is not easy,
00:35:10.920 | 'cause it's very close to the coast of China.
00:35:13.600 | So, there will come a point,
00:35:17.680 | you know, probably in the not too distant future,
00:35:20.840 | where China's military strength in that region
00:35:23.480 | exceeds US military strength in that region.
00:35:26.680 | And if one is to take China's policy literally,
00:35:31.680 | and probably one should,
00:35:34.440 | then there will be some forceful action,
00:35:36.880 | force will be used, you know,
00:35:42.320 | to incorporate Taiwan into China.
00:35:44.280 | This is what they've said.
00:35:47.680 | That if there's not a diplomatic solution,
00:35:50.480 | there will be a solution by force.
00:35:53.280 | - Let me, if I can--
00:35:55.280 | - And so, really what's going on here,
00:35:57.640 | and you've seen, you know, this in many areas,
00:36:01.080 | and I think this tempo's gonna increase,
00:36:02.920 | is that, you know, both China and the US
00:36:07.360 | are preparing for a potential showdown,
00:36:12.360 | you know, in the South China Sea.
00:36:15.560 | So, that's why you're seeing increasing restrictions
00:36:20.560 | on export of US technology to China,
00:36:23.960 | such as Nvidia's, you know,
00:36:26.280 | the Nvidia H100s being banned,
00:36:28.120 | you cannot ship them to China.
00:36:30.080 | And I think there'll be more and more,
00:36:31.600 | they also won't allow shipping
00:36:33.800 | of advanced chip-making equipment to China.
00:36:36.640 | So, and I suspect, you know,
00:36:39.520 | China's gonna respond with some reciprocal sanctions,
00:36:42.760 | and I think you'll see this kind of tit-for-tat,
00:36:45.920 | reciprocal sanctions increasing in the next few years.
00:36:50.200 | So, I think the temperature gets quite hot.
00:36:53.360 | And then we'll see this,
00:36:56.360 | is there gonna be a diplomatic solution
00:37:01.320 | to reunification or a non-diplomatic solution?
00:37:06.320 | - You--
00:37:07.800 | - But they have made clear that there will,
00:37:09.880 | one way or another, be a solution,
00:37:12.040 | from their standpoint.
00:37:13.320 | - You mentioned Nvidia, so let me just talk about AI
00:37:16.720 | and bring it back to that for a second.
00:37:18.320 | Can you tell us your regrets,
00:37:23.320 | but also the positives of the experience
00:37:25.720 | you had with OpenAI,
00:37:26.720 | and then what your goals are with XAI?
00:37:29.160 | - Well, the AI discussion is certainly a long one,
00:37:34.160 | or could be a long one.
00:37:39.040 | Digital super intelligence,
00:37:49.600 | that might be the most significant technology
00:37:52.640 | that humanity ever creates.
00:37:54.280 | And it has the potential to be more dangerous
00:37:57.400 | than nuclear weapons.
00:38:01.160 | You know, in the case of OpenAI,
00:38:08.560 | the goal was for there not to be a unipolar world
00:38:11.920 | where Google, with its subsidiary DeepMind,
00:38:16.040 | you know, would control an overwhelming amount
00:38:19.560 | of AI talent and compute and resources,
00:38:24.480 | which then is somewhat dependent on,
00:38:27.040 | basically how Larry Page and Sergey Brin
00:38:32.040 | and Eric Schmidt, believe things should go,
00:38:34.840 | 'cause they, between the three of them,
00:38:36.400 | or two out of three, have control over Alphabet,
00:38:40.640 | 'cause they've got super voting rights.
00:38:42.640 | And, you know, I was quite concerned,
00:38:45.080 | based on some conversations I had with Larry Page,
00:38:48.000 | where, you know, he did call me a speciesist,
00:38:52.000 | for being pro-humanity.
00:38:54.040 | And, so I'm like, what side are you on, Larry?
00:38:58.080 | You know, not ours, apparently.
00:39:01.240 | You know, I think, and so, so, you know,
00:39:05.320 | I felt like uncomfortable having the entire future
00:39:09.720 | of digital super intelligence be in the hands
00:39:11.800 | of someone who called me a speciesist,
00:39:13.200 | for being pro-humanity.
00:39:14.400 | You know, how could I not be?
00:39:18.040 | So, as OpenAI was originally created
00:39:23.080 | as an open source non-profit,
00:39:24.880 | and now is closed, so I suppose
00:39:27.960 | it should be renamed Closed for Maximal Profit AI.
00:39:30.440 | (audience laughing)
00:39:33.640 | It is closed, and they are aiming to,
00:39:36.400 | I think they're trying to make $100 billion,
00:39:38.520 | I think according to Sam Altman,
00:39:40.520 | get $100 billion from somewhere
00:39:43.320 | for some vast amount of compute to create digital God.
00:39:47.760 | Apparently all the weights are stored
00:39:50.440 | in a comma-separated value file, by the way, so.
00:39:53.200 | Our digital God will be a CSV file.
00:39:55.400 | (audience laughing)
00:39:57.080 | - How do we import it?
00:39:58.200 | File, import, God.
00:40:00.680 | - Yeah, just, yeah.
00:40:02.440 | So, anyway, so it's now, OpenAI is more,
00:40:07.440 | it's also very closely aligned with Microsoft.
00:40:15.320 | You know, Microsoft is really,
00:40:16.760 | the OpenAI servers are running on,
00:40:20.240 | in Azure and Microsoft's data centers.
00:40:22.360 | So really, what you have is,
00:40:25.360 | I think at the end of the day,
00:40:26.200 | Microsoft's having more control than OpenAI.
00:40:28.560 | They have access to all the source code,
00:40:30.560 | they have access to all the weights
00:40:32.080 | of GPT-4 and future versions.
00:40:35.760 | So they have all rights to this, I think.
00:40:38.520 | At any point, really,
00:40:42.440 | they could cut off OpenAI.
00:40:43.680 | I don't think OpenAI quite realizes
00:40:45.600 | the dependence on Microsoft.
00:40:48.680 | And even if Microsoft does break some contract,
00:40:50.680 | they'll just be tied up in litigation for years.
00:40:54.040 | So really, you've got a contest
00:40:58.800 | between Google and Microsoft.
00:41:03.200 | Google, as I mentioned, I'm concerned about,
00:41:05.840 | you know, not caring enough about AI safety,
00:41:09.920 | and for good reason.
00:41:12.720 | And then Microsoft just is, I think,
00:41:15.920 | you know, a profit-seeking organization.
00:41:19.080 | And I think that's fine,
00:41:21.960 | but I can't say, like, you know,
00:41:25.120 | it would be difficult to say that Microsoft
00:41:27.520 | has an amazing track record in moral decision-making.
00:41:31.280 | (audience laughing)
00:41:34.280 | - Diplomatic.
00:41:36.560 | - Anyway, so I was like, okay, look,
00:41:42.040 | let's just, I think let's try to create
00:41:44.920 | a third company that is competitive.
00:41:48.560 | I do think Tesla is underrated from an AI standpoint,
00:41:51.480 | in terms of real-world AI.
00:41:52.600 | Tesla has the best real-world AI.
00:41:55.320 | So, you know, hopefully between XAI and Tesla,
00:42:00.320 | there's kind of a third contender
00:42:04.920 | for digital superintelligence.
00:42:06.720 | - Look, you've done, you open-sourced your patents at Tesla.
00:42:10.000 | You are very pro-open-source, your source code at X.
00:42:13.200 | Would you ever consider releasing Dojo and FSD
00:42:16.280 | more as a platform substrate for everybody else,
00:42:18.920 | or that's sort of off the table right now?
00:42:21.080 | - Well, I don't know that, you know,
00:42:26.840 | in the case of, say, Dojo,
00:42:28.440 | or our inference hardware that's in the car,
00:42:32.720 | our inference computer,
00:42:34.680 | which is actually a lot more compute than Dojo, by the way.
00:42:37.600 | You know, we've got somewhere in the order
00:42:41.840 | of four million cars that have high-speed AI
00:42:45.360 | inference computers in them.
00:42:46.760 | Like, open-sourcing chip designs
00:42:51.360 | doesn't mean you suddenly get that thing.
00:42:53.440 | - Yeah.
00:42:54.280 | - You know, so.
00:42:55.760 | You can open-source the software,
00:43:02.080 | but with chip designs,
00:43:03.240 | you'd need someone to actually make those chips,
00:43:08.280 | or really, yeah,
00:43:11.080 | it would be someone that's willing to spend
00:43:14.440 | many billions of dollars on
00:43:16.040 | compute development.
00:43:19.880 | So, anyway, I think in the case of,
00:43:23.600 | you know, Dojo's interesting,
00:43:26.440 | Optimus is really interesting.
00:43:28.040 | Anyway, I think just in general,
00:43:32.080 | Tesla is one of the world's leading AI companies,
00:43:37.080 | and in some respects, the leading AI company
00:43:39.600 | when it comes to real-world AI,
00:43:41.480 | understanding the real world
00:43:43.880 | and actually reacting to that with self-driving.
00:43:46.280 | And I think that will become part of a solution
00:43:52.360 | for AGI, or General Superintelligence.
00:43:55.640 | So, in the case of Tesla,
00:44:00.360 | I think we've got a sort of a good governance structure,
00:44:03.920 | in that there's no super-voting rights
00:44:05.360 | or anything like that.
00:44:06.600 | So, if I, you know, go crazy,
00:44:09.720 | the shareholders of Tesla can vote me out.
00:44:11.840 | You know, I have enough stock to be,
00:44:15.520 | you know, I think, moderately influential,
00:44:17.480 | but not enough to stay in,
00:44:19.120 | even if I'm doing crazy stuff.
00:44:21.720 | So, I think that's actually good.
00:44:23.360 | - Great.
00:44:28.080 | I was told we have to wrap up.
00:44:29.080 | - Oh, okay.
00:44:30.000 | Just on FSD, before we wrap, I'll let you go.
00:44:33.280 | We were talking earlier this year,
00:44:34.640 | and you said, hey, maybe a ChatGPT-like moment
00:44:39.640 | for self-driving was coming.
00:44:42.480 | And I've been playing with the beta,
00:44:44.840 | and yeah, how close does it feel to you?
00:44:48.800 | Because some of the rides it's been doing for me
00:44:51.840 | are pretty darn impressive.
00:44:54.400 | - The latest beta's pretty incredible.
00:44:55.800 | - Yeah, it's pretty neat.
00:44:57.720 | You know, I used to love it on the highways,
00:44:59.600 | but on the streets
00:45:00.440 | I'd be like, okay.
00:45:01.560 | But now I'm using it increasingly on the streets.
00:45:03.800 | So, how do you feel about it right now?
00:45:06.840 | And I guess you made a lot of predictions on it
00:45:08.840 | over the years, but it does feel like
00:45:11.880 | it's getting pretty close.
00:45:13.440 | - Yeah, I think it's very close to,
00:45:16.400 | you know, being in a situation where,
00:45:19.120 | even if there's no human oversight or intervention,
00:45:21.400 | that the probability of a safe journey is higher
00:45:25.520 | with FSD and no supervision,
00:45:28.240 | like even if you're asleep in the car,
00:45:30.040 | than if a person is driving.
00:45:33.200 | We're very close to that.
00:45:34.920 | You know, those that have the FSD beta,
00:45:38.240 | which really anyone can get at this point.
00:45:40.440 | So, the miles we see driven under the FSD beta
00:45:46.360 | currently are much safer than the miles
00:45:49.600 | that are driven without it.
00:45:50.960 | So, that's already a very good milestone.
00:45:56.840 | You can just see that it's getting better,
00:46:02.200 | but if you compare the FSD beta today
00:46:07.200 | versus six months ago, versus a year ago,
00:46:11.280 | versus 18 months ago,
00:46:13.000 | it's really, the improvement is dramatic.
00:46:15.160 | And we've got the final piece of the puzzle,
00:46:20.640 | which is to have the control part of the car
00:46:24.160 | transition from about 300,000 lines of C++ code
00:46:27.720 | to also neural network.
00:46:29.840 | So, the whole system will be neural network.
00:46:34.200 | Photons in to controls out.
00:46:37.440 | And that's kind of the final piece of the puzzle
00:46:40.200 | for full self-driving being significantly better than human.
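As a schematic of what "photons in, controls out" means (our illustration; the layer sizes and shapes are arbitrary, and this is not Tesla's actual architecture): a single network maps raw camera pixels to control outputs, replacing hand-written planning rules.

```python
# Minimal end-to-end sketch: camera frames in, [steering, acceleration] out.
import torch
import torch.nn as nn

class PhotonsToControls(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(          # "photons in": raw pixels
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(             # "controls out": two scalars
            nn.Linear(32, 64), nn.ReLU(),
            nn.Linear(64, 2), nn.Tanh(),       # [steering, accel] in [-1, 1]
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(frames))

controls = PhotonsToControls()(torch.rand(1, 3, 240, 320))  # one camera frame
print(controls)  # tensor of shape (1, 2): [steering, acceleration]
```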
00:46:44.440 | - Awesome.
00:46:45.720 | Thanks for taking the time, buddy.
00:46:47.080 | Fly safe, and I'll see you shortly.
00:46:49.560 | Ladies and gentlemen, Elon Musk.
00:46:51.920 | (audience applauding)
00:46:53.320 | Thanks, bud.
00:46:54.160 | (audience applauding)
00:46:57.240 | (upbeat music)
00:46:58.080 | - Rain Man David Sacks.
00:46:59.480 | (upbeat music)
00:47:02.080 | - And instead.
00:47:02.920 | - We open sourced it to the fans,
00:47:04.600 | and they've just gone crazy with it.
00:47:06.440 | - Love you, Wes.
00:47:07.280 | - I'm the queen of quinoa.
00:47:08.600 | (upbeat music)
00:47:11.200 | - Besties are back.
00:47:16.520 | (laughing)
00:47:18.760 | - We should all just get a room
00:47:26.800 | and just have one big huge orgy
00:47:28.400 | 'cause they're all just useless.
00:47:29.640 | It's like this like sexual tension
00:47:31.120 | that we just need to release somehow.
00:47:32.920 | - What about B?
00:47:35.560 | What about B?
00:47:36.400 | - What about B?
00:47:38.080 | - What?
00:47:38.920 | - We need to get merch.
00:47:39.920 | - Besties are back.
00:47:40.760 | ♪ I'm going all in ♪
00:47:45.760 | ♪ I'm going all in ♪