
Elon Musk: War, AI, Aliens, Politics, Physics, Video Games, and Humanity | Lex Fridman Podcast #400


Chapters

0:00 Introduction
0:07 War and human nature
4:33 Israel-Hamas war
10:41 Military-Industrial Complex
14:58 War in Ukraine
19:41 China
33:57 xAI Grok
44:55 Aliens
52:55 God
55:22 Diablo 4 and video games
64:29 Dystopian worlds: 1984 and Brave New World
70:41 AI and useful compute per watt
76:22 AI regulation
83:14 Should AI be open-sourced?
90:36 X algorithm
101:57 2024 presidential elections
114:55 Politics
117:57 Trust
123:29 Tesla's Autopilot and Optimus robot
132:28 Hardships

Whisper Transcript

00:00:00.000 | The following is a conversation with Elon Musk,
00:00:03.420 | his fourth time on this, the Lex Fridman Podcast.
00:00:07.200 | (whistling)
00:00:10.460 | (laughing)
00:00:14.140 | I thought you were gonna finish it.
00:00:22.380 | It's one of the greatest themes in all of film history.
00:00:28.520 | (whistling)
00:00:31.460 | Yeah, it's great.
00:00:32.360 | So I was just thinking about the Roman Empire, as one does.
00:00:36.960 | (laughing)
00:00:38.760 | There's that whole meme where
00:00:40.280 | all guys are thinking about the Roman Empire
00:00:43.720 | at least once a day.
00:00:44.560 | And half the population's confused whether it's true or not.
00:00:47.740 | But more seriously,
00:00:50.160 | thinking about the wars going on in the world today.
00:00:52.920 | And as you know, war and military conquest
00:00:56.480 | has been a big part of Roman society and culture.
00:01:00.320 | And I think it's been a big part of most empires
00:01:04.120 | and dynasties throughout human history.
00:01:05.620 | So-- - Yeah.
00:01:06.640 | They usually came as a result of conquest.
00:01:10.760 | I mean, there's some like the Austro-Hungarian Empire
00:01:12.560 | where there was just a lot of sort of clever marriages.
00:01:15.560 | - But fundamentally, there's an engine of conquest.
00:01:19.360 | - Yeah, always. - And they celebrate
00:01:20.480 | excellence in warfare.
00:01:23.560 | Many of the leaders were excellent generals.
00:01:26.520 | - Yeah. - That kind of thing.
00:01:28.000 | So big picture question, Grok approved.
00:00:31.160 | I asked if this is a good question to ask.
00:01:32.800 | - Get tested, Grok approved.
00:01:34.040 | - Yeah. (laughing)
00:01:36.080 | At least on fun mode.
00:01:37.240 | To what degree do you think war is part of human nature
00:01:44.520 | versus a consequence of how human societies are structured?
00:01:50.720 | I ask this as you have somehow controversially
00:01:55.120 | been a proponent of peace.
00:01:57.320 | - I'm generally a proponent of peace.
00:01:58.840 | I mean, ignorance is perhaps in my view,
00:02:01.320 | the real enemy to be countered.
00:02:04.420 | That's the real hard part, not fighting other humans.
00:02:07.560 | But all creatures fight.
00:02:10.320 | I mean, the jungle is, you look at the,
00:02:13.640 | people think of nature as perhaps
00:02:15.840 | some sort of peaceful thing, but in fact, it is not.
00:02:18.280 | There's some quite funny Werner Herzog thing
00:02:22.800 | where he's like in the jungle,
00:02:24.720 | saying that it's like basically just murder and death
00:02:28.260 | in every direction.
00:02:29.760 | I mean, the plants, animals, and the jungle
00:02:31.560 | are constantly trying to kill and eat each other
00:02:33.920 | every single day, every minute.
00:02:36.080 | So it's not like we're unusual in that respect.
00:02:41.080 | - There's a relevant question here
00:02:42.180 | whether with greater intelligence
00:02:43.880 | comes greater control over these base instincts
00:02:47.440 | for violence.
00:02:48.280 | - Yes, we have much more of an ability
00:02:52.640 | to control our limbic instinct for violence
00:02:56.820 | than say a chimpanzee.
00:02:58.400 | And in fact, if one looks at say chimpanzee society,
00:03:02.000 | it is not friendly.
00:03:03.160 | I mean, the bonobos are an exception,
00:03:05.340 | but chimpanzee society is full of violence
00:03:11.200 | and it's quite horrific, frankly.
00:03:15.800 | That's our limbic system in action.
00:03:17.680 | Like you don't wanna be on the wrong side of a chimpanzee.
00:03:20.120 | It'll eat your face off and tear your nuts off.
00:03:22.200 | - Yeah, basically there's no limits or ethics.
00:03:25.160 | There's almost no just war.
00:03:26.720 | There's no just war in chimpanzee societies.
00:03:29.800 | It's war and dominance by any means necessary.
00:03:33.240 | - Yeah, chimpanzee society is like a primitive version
00:03:35.720 | of human society.
00:03:36.820 | They're not like peace-loving, basically, at all.
00:03:40.600 | There's extreme violence.
00:03:45.080 | And then once in a while,
00:03:46.400 | somebody who's watched too many Disney movies
00:03:48.160 | decides to raise a chimpanzee as a pet.
00:03:50.100 | And then that eats their face or rips their nuts off
00:03:54.080 | or chews their fingers off, that kind of thing.
00:03:56.360 | - Yeah. - It's happened several times.
00:03:57.960 | - Ripping your nuts off is an interesting strategy
00:04:00.800 | for interaction.
00:04:02.000 | - Some, it's happened to people.
00:04:03.120 | It's unfortunate.
00:04:04.320 | That's, I guess, one way to ensure
00:04:06.380 | that the other chimp doesn't contribute to the gene pool.
00:04:10.780 | - Well, from a martial arts perspective,
00:04:12.320 | it's a fascinating strategy.
00:04:13.880 | - And the nut ripper.
00:04:16.220 | - I wonder which of the martial arts teaches that.
00:04:21.440 | - I think it's safe to say,
00:04:22.280 | if somebody's got your nuts in their hands
00:04:26.160 | and has the option of ripping them off,
00:04:28.120 | you will be amenable to whatever they want.
00:04:30.400 | - Yeah.
00:04:31.240 | So like I said, somehow controversially,
00:04:35.200 | you've been a proponent of peace on Twitter, on X.
00:04:39.160 | So let me ask you about the wars going on today
00:04:41.920 | and to see what the path to peace could be.
00:04:44.600 | How do you hope the current war in Israel and Gaza
00:04:46.600 | comes to an end?
00:04:47.840 | What path do you see that can minimize human suffering
00:04:52.000 | in the long term in that part of the world?
00:04:54.840 | - Well, I think that part of the world is definitely,
00:04:58.240 | like if you look up 'there is no easy answer'
00:04:59.360 | in the dictionary,
00:05:01.560 | it'll be like the picture of the Middle East,
00:05:04.600 | in Israel especially.
00:05:06.360 | So there is no easy answer.
00:05:10.920 | And what my, this is strictly my opinion of,
00:05:14.600 | is that the goal of Hamas was to provoke
00:05:20.720 | an overreaction from Israel.
00:05:23.980 | They obviously did not expect to have a military victory,
00:05:29.060 | but they really wanted to commit the worst atrocities
00:05:34.040 | that they could in order to provoke
00:05:36.440 | the most aggressive response possible from Israel.
00:05:40.520 | And then leverage that aggressive response
00:05:45.820 | to rally Muslims worldwide for the cause of Gaza
00:05:50.820 | and Palestine, which they have succeeded in doing.
00:05:56.340 | So the counterintuitive thing here,
00:06:02.800 | I think that the thing that I think should be done,
00:06:04.840 | even though it is very difficult,
00:06:06.760 | is that I would recommend that Israel engage
00:06:10.560 | in the most conspicuous acts of kindness possible.
00:06:14.120 | Every part, everything.
00:06:16.120 | That is the actual thing that would thwart the goal of Hamas.
00:06:19.680 | - So in some sense, the degree that makes sense
00:06:23.080 | in geopolitics, turn the other cheek, implemented.
00:06:26.440 | - It's not exactly turn the other cheek,
00:06:28.320 | because I do think that there's,
00:06:33.200 | I think it is appropriate for Israel
00:06:35.740 | to find the Hamas members and either kill them
00:06:40.740 | or incarcerate them.
00:06:44.400 | That's something that has to be done,
00:06:46.800 | 'cause they're just gonna keep coming otherwise.
00:06:50.060 | But in addition to that, they need to do whatever they can.
00:06:56.320 | There's some talk of establishing, for example,
00:07:00.840 | a mobile hospital, I'd recommend doing that.
00:07:03.460 | Just making sure that there's food, water,
00:07:09.480 | medical necessities,
00:07:11.960 | and just be over the top about it
00:07:17.120 | and be very transparent,
00:07:20.000 | so people can't claim it's a trick.
00:07:23.680 | Like, just put a webcam on the thing, 24/7.
00:07:29.160 | - Deploy acts of kindness.
00:07:31.480 | - Yeah, conspicuous acts of kindness
00:07:33.720 | that are unequivocal, meaning they can't be somehow,
00:07:40.240 | 'cause Hamas will then, their response will be,
00:07:42.520 | oh, it's a trick, therefore you have to counter,
00:07:44.760 | how is it not a trick?
00:07:46.140 | - This ultimately fights the broader force of hatred
00:07:50.040 | in the region.
00:07:51.600 | - Yes, and I'm not sure who said it,
00:07:53.440 | it's an apocryphal saying,
00:07:55.120 | but an eye for an eye makes everyone blind.
00:07:58.200 | Now, that neck of the woods,
00:07:59.260 | they really believe in the whole eye for an eye thing.
00:08:02.080 | But, I mean, you really have,
00:08:07.600 | if you're not gonna just outright commit genocide,
00:08:12.800 | like against an entire people,
00:08:14.080 | which obviously would not be acceptable to,
00:08:16.400 | really shouldn't be acceptable to anyone,
00:08:18.400 | then you're gonna leave, basically,
00:08:21.000 | a lot of people alive who subsequently hate Israel.
00:08:25.160 | So really, the question is,
00:08:27.200 | how, for every Hamas member that you kill,
00:08:30.440 | how many did you create?
00:08:31.640 | And if you create more than you kill,
00:08:35.480 | you've not succeeded, that's the real situation there.
00:08:39.880 | And it's safe to say that if,
00:08:41.620 | if you kill somebody's child in Gaza,
00:08:52.640 | you've made at least a few Hamas members
00:08:57.640 | who will die just to kill an Israeli.
00:09:01.940 | That's the situation.
00:09:05.000 | So, but I mean, this is one of the most contentious subjects
00:09:10.000 | one could possibly discuss,
00:09:12.680 | but I think if the goal ultimate
00:09:14.960 | is some sort of long-term peace,
00:09:17.440 | one has to look at this from the standpoint of,
00:09:21.140 | over time, are there more or fewer terrorists being created?
00:09:25.740 | - Let me just linger on war.
00:09:29.040 | - Yeah, well, war, it's safe to say,
00:09:30.600 | war's always existed and always will exist.
00:09:33.120 | - Always will exist.
00:09:34.480 | - Always has existed and always will exist.
00:09:37.440 | - I hope not.
00:09:38.440 | You think it always will.
00:09:39.280 | - There will always be war.
00:09:40.480 | It's a question of just how much war,
00:09:43.440 | and there's sort of the scope and scale of war,
00:09:49.560 | but to imagine that there would not be any war in the future
00:09:52.780 | I think would be a very unlikely outcome.
00:09:55.740 | - Yeah, you talked about the culture series,
00:09:57.180 | there's war even there.
00:09:58.540 | - Yes, there's a giant war.
00:09:59.980 | The first book starts off with a gigantic galactic war
00:10:04.980 | where trillions die, trillions.
00:10:07.220 | - But it still, nevertheless,
00:10:08.180 | protects these pockets of flourishing.
00:10:11.360 | Somehow you can have galactic war
00:10:15.060 | and still have pockets of flourishing.
00:10:17.980 | - Yeah, I mean, I guess if we are able to one day expand
00:10:22.220 | to fill the galaxy or whatever,
00:10:26.600 | there will be a galactic war at some point.
00:10:30.700 | - The scale, I mean, the scale of war
00:10:32.540 | has been increasing, increasing, increasing.
00:10:34.940 | It's like a race between the scale of suffering
00:10:37.340 | and the scale of flourishing.
00:10:38.620 | - Yes.
00:10:39.460 | - A lot of people seem to be using this tragedy
00:10:43.740 | to beat the drums of war
00:10:45.400 | and feed the military-industrial complex.
00:10:47.460 | Do you worry about this?
00:10:48.660 | The people who are rooting for escalation
00:10:54.060 | and how can it be stopped?
00:10:56.060 | - One of the things that does concern me
00:10:57.540 | is that there are very few people alive today
00:10:59.900 | who actually viscerally understand the horrors of war,
00:11:03.300 | at least in the US.
00:11:04.980 | I mean, obviously there are people on the front lines
00:11:07.060 | in Ukraine and Russia who understand
00:11:09.100 | just how terrible war is,
00:11:10.900 | but how many people in the West understand it?
00:11:15.060 | You know, my grandfather was in World War II.
00:11:17.700 | He was severely traumatized.
00:11:19.460 | I mean, he was there, I think, for almost six years
00:11:22.260 | in Eastern North Africa, in Italy.
00:11:25.960 | All his friends were killed in front of him,
00:11:29.260 | and he would have died too,
00:11:32.220 | except they randomly gave some, I guess,
00:11:34.780 | IQ test or something, and he scored very high.
00:11:38.860 | Now, he was not an officer.
00:11:40.580 | He was, I think, a corporal or a sergeant
00:11:42.020 | or something like that,
00:11:43.460 | because he didn't finish high school.
00:11:45.340 | He had to drop out of high school 'cause his dad died,
00:11:48.020 | and he had to work to support his siblings.
00:11:51.540 | So because he didn't graduate high school,
00:11:54.140 | he was not eligible for the officer corps.
00:11:56.240 | So, you know, he kind of got put
00:12:00.540 | into the cannon fodder category, basically.
00:12:03.120 | But then, just randomly, they gave him this test.
00:12:07.220 | He was transferred to British Intelligence in London.
00:12:09.800 | That's where he met my grandmother.
00:12:13.400 | But he had PTSD next level, like next level.
00:12:17.460 | I mean, just didn't talk, just didn't talk.
00:12:21.220 | And if you tried talking to him,
00:12:22.140 | he'd just tell you to shut up.
00:12:23.580 | And he won a bunch of medals, never bragged about it once.
00:12:28.580 | Not even hinted, nothing.
00:12:29.980 | I found out about it because his military records
00:12:32.980 | were online, that's how I know.
00:12:36.300 | So he would say, "No way in hell
00:12:39.780 | "do you wanna do that again."
00:12:41.620 | But how many people, now he obviously,
00:12:44.440 | now he died 20 years ago, or longer, actually, 30 years ago.
00:12:48.260 | How many people are alive that remember World War II?
00:12:53.080 | Not many.
00:12:53.920 | - And the same perhaps applies to the threat of nuclear war.
00:12:58.100 | - Yeah, I mean, there are enough nuclear bombs
00:13:02.720 | pointed at the United States to make the rubble,
00:13:06.400 | the radioactive rubble bounce many times.
00:13:10.500 | - There's two major wars going on right now.
00:13:14.300 | So you talked about the threat of AGI quite a bit.
00:13:17.820 | But now, as we sit here with the intensity
00:13:20.820 | of conflict going on, do you worry about nuclear war?
00:13:23.760 | - I think we shouldn't discount
00:13:26.540 | the possibility of nuclear war.
00:13:28.100 | It is a civilizational threat.
00:13:30.780 | Right now, I could be wrong, but I think
00:13:38.620 | that the current probability of nuclear war is quite low.
00:13:42.000 | But there are a lot of nukes pointed at us.
00:13:45.920 | And we have a lot of nukes pointed at other people.
00:13:50.260 | They're still there.
00:13:51.500 | Nobody's put their guns away.
00:13:53.820 | The missiles are still in the silos.
00:13:57.540 | - And the leaders don't seem to be
00:13:59.740 | the ones with the nukes talking to each other.
00:14:02.940 | - No.
00:14:04.220 | There are wars which are tragic and difficult
00:14:06.900 | on a local basis, and then there are wars
00:14:09.300 | which are civilization-ending, or have that potential.
00:14:12.700 | Obviously, global thermonuclear warfare
00:14:14.780 | has high potential to end civilization,
00:14:17.700 | perhaps permanently, but certainly to severely
00:14:22.140 | wound and perhaps set back human progress
00:14:29.100 | to the Stone Age or something.
00:14:33.620 | I don't know.
00:14:34.620 | Pretty bad.
00:14:36.460 | Um, probably scientists and engineers
00:14:40.780 | won't be super popular after that as well.
00:14:43.140 | Like, you got us into this mess.
00:14:44.740 | So, generally, I think we obviously
00:14:49.180 | want to prioritize civilizational risks
00:14:50.980 | over things that are painful and tragic
00:14:54.820 | on a local level, but not civilizational.
00:14:57.020 | - How do you hope the war in Ukraine comes to an end?
00:15:02.500 | And what's the path, once again,
00:15:04.100 | to minimizing human suffering there?
00:15:06.020 | - Well, I think that what is likely to happen,
00:15:13.340 | which is really pretty much the way it is,
00:15:17.460 | is that something very close to the current lines
00:15:21.460 | will be how a ceasefire or truce happens.
00:15:28.380 | But, you know, you just have a situation right now
00:15:33.820 | where whoever goes on the offensive
00:15:36.140 | will suffer casualties at several times
00:15:38.660 | the rate of whoever's on the defense.
00:15:41.260 | 'Cause you've got defense in depth,
00:15:43.580 | you've got minefields, trenches, anti-tank defenses.
00:15:47.980 | Nobody has air superiority,
00:15:50.460 | 'cause the anti-aircraft missiles
00:15:53.060 | are really far better than the aircraft.
00:15:55.540 | Like, there are far more of them.
00:15:57.180 | And so, neither side has air superiority.
00:16:02.100 | Tanks are basically death traps,
00:16:03.800 | just slow moving and they're not immune
00:16:07.820 | to anti-tank weapons.
00:16:09.780 | So, you really just have long-range artillery
00:16:13.020 | and infantry, trenches.
00:16:17.420 | It's World War I all over again.
00:16:19.540 | With drones, you know, throwing little drones,
00:16:23.180 | some drones there.
00:16:24.100 | - Which makes the long-range artillery
00:16:26.660 | just that much more accurate and better,
00:16:28.740 | and so more efficient at murdering people on both sides.
00:16:32.380 | - Yeah, so it's, whoever is,
00:16:37.020 | you don't wanna be trying to advance from either side
00:16:39.860 | 'cause the probability of dying is incredibly high.
00:16:42.420 | So, in order to overcome defense in depth,
00:16:46.700 | trenches, and minefields,
00:16:51.820 | you really need significant local superiority in numbers.
00:16:57.800 | Ideally, combined arms,
00:16:59.980 | where you do a fast attack with aircraft,
00:17:04.000 | a concentrated number of tanks,
00:17:07.200 | and a lot of people.
00:17:10.080 | That's the only way you're gonna punch through a line.
00:17:12.300 | And then, you're gonna punch through
00:17:13.540 | and then not have reinforcements
00:17:17.780 | just kick you right out again.
00:17:19.820 | I mean, I really recommend people read
00:17:23.140 | World War I warfare in detail.
00:17:28.220 | It's rough.
00:17:29.220 | I mean, the sheer number of people that died there
00:17:34.660 | was mind-boggling.
00:17:36.100 | - And it's almost impossible to imagine the end of it
00:17:40.660 | that doesn't look like,
00:17:41.780 | almost exactly like the beginning
00:17:43.620 | in terms of what land belongs to who, and so on.
00:17:48.620 | But on the other side of a lot of human suffering,
00:17:52.580 | death, and destruction of infrastructure.
00:17:55.980 | - Yes, I mean, the thing that,
00:17:58.200 | the reason I proposed some sort of truce
00:18:03.200 | or peace a year ago
00:18:05.480 | was because I predicted pretty much exactly
00:18:08.760 | what would happen,
00:18:10.200 | which is a lot of people dying
00:18:12.000 | for basically almost no changes in land.
00:18:14.680 | And the loss of the flower of Ukrainian and Russian youth,
00:18:21.240 | and we should have some sympathy for the Russian boys
00:18:24.840 | as well as the Ukrainian boys,
00:18:26.220 | 'cause the Russian boys didn't ask to be on their front line.
00:18:28.740 | They have to be.
00:18:29.580 | There's a lot of sons not coming back to their parents.
00:18:37.400 | I think most of them don't really have,
00:18:42.740 | they don't hate the other side.
00:18:45.580 | It's sort of like,
00:18:47.140 | as the saying comes from World War I,
00:18:50.340 | it's like young boys who don't know each other
00:18:53.720 | killing each other on behalf of old men
00:18:56.180 | that do know each other.
00:18:57.420 | What the hell's the point of that?
00:19:01.260 | - So Volodymyr Zelensky said that he's not,
00:19:04.980 | or has said in the past,
00:19:06.320 | he's not interested in talking to Putin directly.
00:19:08.540 | Do you think he should sit down,
00:19:10.580 | man to man, leader to leader, and negotiate peace?
00:19:13.020 | - I think I would just recommend
00:19:16.660 | do not send the flower of Ukrainian youth
00:19:20.920 | to die in trenches.
00:19:23.840 | Whether he talks to Putin or not, just don't do that.
00:19:28.160 | Whoever goes on the offensive
00:19:32.100 | will lose massive numbers of people.
00:19:33.900 | And history will not look kindly upon them.
00:19:39.760 | - You've spoken honestly about the possibility
00:19:44.140 | of war between US and China in the long term
00:19:46.820 | if no diplomatic solution is found.
00:19:49.620 | For example, on the question of Taiwan
00:19:51.060 | and one China policy.
00:19:52.820 | How do we avoid the trajectory
00:19:54.280 | where these two superpowers clash?
00:19:56.820 | - Well, it's worth reading that book on the,
00:20:01.340 | difficult to pronounce,
00:20:02.300 | the Thucydides Trap, I believe it's called.
00:20:04.540 | I love war history.
00:20:05.640 | Like, inside out and backwards.
00:20:08.180 | There's hardly a battle I haven't read about.
00:20:12.100 | And trying to figure out,
00:20:14.360 | like what really was the cause of victory
00:20:16.380 | in any particular case,
00:20:17.380 | as opposed to what one side or another claimed
00:20:20.100 | the reason.
00:20:21.340 | - Both the victory and what sparked the war.
00:20:24.260 | - Yeah, yeah.
00:20:25.380 | - The whole thing.
00:20:26.900 | - Yeah, so that Athens and Sparta is classic case.
00:20:30.100 | The thing about the Greeks
00:20:30.920 | is they really wrote down a lot of stuff.
00:20:32.020 | They loved writing.
00:20:33.020 | There are lots of interesting things
00:20:35.020 | that happened in many parts of the world,
00:20:36.100 | but people just didn't write down.
00:20:38.260 | So we don't know what happened.
00:20:40.140 | Or they didn't really write in detail.
00:20:44.860 | They just would say like,
00:20:46.220 | we went, we had a battle and we won.
00:20:48.060 | And like, well, what can you add a bit more?
00:20:50.260 | The Greeks, they really wrote a lot.
00:20:53.220 | They were very articulate on,
00:20:56.380 | they just love writing.
00:20:57.200 | So, and we have a bunch of that writing that's preserved.
00:20:59.460 | So we know what led up to the Peloponnesian War
00:21:02.900 | between the Spartan and Athenian alliance.
00:21:05.640 | And we know that they saw it coming.
00:21:13.980 | I mean, the Spartans didn't write,
00:21:17.140 | they also weren't very verbose by their nature,
00:21:19.420 | but they did write, but they weren't very verbose.
00:21:21.660 | They were terse.
00:21:23.100 | But the Athenians and the other Greeks wrote a lot.
00:21:26.380 | And they were like,
00:21:27.380 | and Sparta was really kind of like the leader of Greece,
00:21:32.720 | but Athens grew stronger and stronger
00:21:35.700 | with each passing year.
00:21:36.860 | And everyone's like, well, that's inevitable
00:21:40.860 | that there's gonna be a clash between Athens and Sparta.
00:21:43.900 | Well, how do we avoid that?
00:21:46.740 | And they couldn't, they couldn't,
00:21:47.580 | they actually, they saw it coming
00:21:49.380 | and they still could not avoid it.
00:21:51.080 | So, at some point, if there's,
00:21:56.020 | if one group, one civilization or country or whatever,
00:22:01.020 | exceeds another, sort of like if,
00:22:07.420 | the United States has been the biggest kid on the block
00:22:09.980 | since I think around 1890 from an economic standpoint.
00:22:14.500 | So, the United States has been the economic,
00:22:18.060 | most powerful economic engine in the world
00:22:22.500 | longer than anyone's been alive.
00:22:24.100 | And the foundation of war is economics.
00:22:30.180 | So, now we have a situation in case of China
00:22:36.980 | where the economy is likely to be two,
00:22:41.980 | perhaps three times larger than that of the US.
00:22:45.940 | So, imagine you're the biggest kid on the block
00:22:48.020 | for as long as anyone can remember,
00:22:49.620 | and suddenly a kid comes along who's twice your size.
00:22:52.220 | - So, we see it coming.
00:22:57.460 | - Yeah. - How is it possible to stop?
00:22:59.940 | Is there some, let me throw something out there,
00:23:02.700 | just intermixing of cultures, understanding.
00:23:06.140 | There does seem to be a giant cultural gap
00:23:08.060 | in understanding of each other.
00:23:09.780 | And you're an interesting case study
00:23:11.260 | because you are an American, obviously.
00:23:14.060 | You've done a lot of incredible manufacture
00:23:17.540 | here in the United States, but you also work with China.
00:23:20.140 | - I've spent a lot of time in China
00:23:21.300 | and met with the leadership many times.
00:23:22.940 | - Maybe a good question to ask is,
00:23:24.940 | what are some things about China
00:23:26.780 | that people don't understand, positive, just in the culture?
00:23:29.700 | What's some interesting things
00:23:31.180 | that you've learned about the Chinese?
00:23:36.060 | - Well, the sheer number of really smart,
00:23:39.420 | hardworking people in China is incredible.
00:23:44.340 | There are, I believe, if you say like,
00:23:47.300 | how many smart, hardworking people are there in China,
00:23:49.180 | there's far more of them there than there are here,
00:23:51.580 | I think, in my opinion.
00:23:52.980 | The, and they've got a lot of energy.
00:23:58.640 | So, I mean, the architecture in China
00:24:04.620 | that's in recent years is far more impressive than the US.
00:24:08.420 | I mean, the train stations, the buildings,
00:24:10.900 | the high-speed rail, everything,
00:24:12.420 | it's really far more impressive than what we have in the US.
00:24:17.420 | I mean, I recommend somebody just go to Shanghai and Beijing
00:24:22.220 | look at the buildings and go to,
00:24:25.820 | take the train from Beijing to Xi'an
00:24:27.740 | where you have the terracotta warriors.
00:24:29.940 | China's got an incredible history, a very long history.
00:24:33.580 | And, you know, I think arguably the,
00:24:37.540 | in terms of the use of language from a written standpoint,
00:24:41.600 | sort of one of the oldest, perhaps,
00:24:44.860 | perhaps the oldest written language.
00:24:48.300 | And in China, people did write things down.
00:24:50.660 | So, now China, historically, has always been,
00:24:55.660 | with rare exception, been internally focused.
00:24:59.780 | They have not been acquisitive.
00:25:02.740 | They've fought each other.
00:25:05.060 | There have been many, many civil wars.
00:25:07.700 | In the Three Kingdoms War,
00:25:09.180 | I believe they lost about 70% of their population.
00:25:12.180 | So, they've had brutal internal wars,
00:25:19.060 | like civil wars that make the US civil war look small
00:25:24.580 | by comparison.
00:25:25.420 | So, I think it's important to appreciate
00:25:30.980 | that China is not monolithic.
00:25:33.020 | We sort of think of China as this sort of one entity
00:25:38.740 | of one mind, and this is definitely not the case.
00:25:41.220 | From what I've seen, and I think most people
00:25:46.460 | who understand China would agree,
00:25:48.180 | that people in China think about China 10 times more
00:25:51.180 | than they think about anything outside of China.
00:25:53.820 | So, it's like 90% of their consideration is,
00:25:58.780 | or is internal.
00:26:01.500 | - Well, isn't that a really positive thing?
00:26:03.700 | When you're talking about the collaboration
00:26:05.420 | and the future peace between superpowers,
00:26:08.000 | when you're inward facing, which is like focusing
00:26:10.180 | on improving yourself, versus focusing on,
00:26:13.380 | quote unquote, improving others through military might?
00:26:18.180 | - The good news, the history of China suggests
00:26:20.820 | that China is not acquisitive,
00:26:22.140 | meaning they're not gonna go out
00:26:23.140 | and invade a whole bunch of countries.
00:26:26.100 | Now, they do feel very strongly, so that's good,
00:26:29.260 | 'cause a lot of very powerful countries
00:26:31.540 | have been acquisitive.
00:26:32.660 | The US is also one of the rare cases
00:26:37.100 | that has not been acquisitive.
00:26:38.900 | Like after World War II, the US could have basically
00:26:41.060 | taken over the world, and any country.
00:26:42.820 | Like, we got nukes, nobody else got nukes,
00:26:45.080 | we don't even have to lose soldiers.
00:26:46.880 | Which country do you want?
00:26:49.700 | And the United States could have taken over everything
00:26:52.260 | at will, and it didn't.
00:26:55.260 | And the United States actually helped rebuild countries.
00:26:57.700 | So it helped rebuild Europe, it helped rebuild Japan.
00:27:00.920 | This is very unusual behavior, almost unprecedented.
00:27:05.740 | The US did conspicuous acts of kindness,
00:27:13.380 | like the Berlin Airlift.
00:27:14.580 | And I think, it's always like,
00:27:22.520 | well, America's done bad things.
00:27:23.780 | Well, of course America's done bad things,
00:27:24.940 | but one needs to look at the whole track record.
00:27:27.340 | And just generally,
00:27:30.740 | one sort of test would be,
00:27:34.660 | how do you treat your prisoners of war?
00:27:36.620 | Or let's say, no offense to the Russians,
00:27:43.060 | but let's say you're in Germany, it's 1945.
00:27:47.260 | You got the Russian army coming on one side,
00:27:49.380 | you got the French, British, and American armies
00:27:51.820 | coming on the other side.
00:27:53.100 | Who would you like to be surrendered to?
00:27:55.780 | Like no country is morally perfect,
00:27:58.780 | but I recommend being a POW with the Americans.
00:28:03.540 | That would be my choice very strongly.
00:28:05.700 | (laughing)
00:28:07.460 | - In the full menu of POWs.
00:28:08.860 | - Very much so.
00:28:10.220 | And in fact, Wernher von Braun,
00:28:12.860 | a smart guy, was like,
00:28:17.060 | "We've got to be captured by the Americans."
00:28:20.660 | And in fact, the SS was under orders
00:28:24.340 | to execute von Braun and all of the German rocket engineers.
00:28:29.340 | And they narrowly escaped the SS.
00:28:34.420 | They said they were going out for a walk in the woods.
00:28:36.660 | They left in the middle of winter with no coats.
00:28:39.740 | And they ran like, no food, no coats, no water,
00:28:42.700 | and just ran like hell and ran west.
00:28:45.820 | And by sheer luck, I think his brother found
00:28:48.580 | like a bicycle or something,
00:28:51.060 | and then just cycled west as fast as he could
00:28:54.540 | and found a US patrol.
00:28:56.780 | So anyway, that's one way you can tell morality is,
00:29:03.780 | where do you want to be a POW?
00:29:06.180 | It's not fun anywhere,
00:29:08.060 | but some places are much worse than others.
00:29:10.300 | So anyway, so like America has been,
00:29:15.540 | while far from perfect, generally a benevolent force.
00:29:19.860 | And we should always be self-critical
00:29:23.900 | and we try to be better.
00:29:26.200 | But anyone with half a brain knows that.
00:29:31.080 | So I think there are, in this way,
00:29:34.380 | China and the United States are similar.
00:29:37.780 | Neither country has been acquisitive in a significant way.
00:29:42.700 | So that's like a shared principle, I guess.
00:29:46.220 | Now China does feel very strongly about Taiwan.
00:29:50.860 | They've been very clear about that for a long time.
00:29:54.640 | From their standpoint, it would be like one of the states
00:29:59.260 | is not there, like Hawaii or something like that,
00:30:04.820 | but more significant than Hawaii.
00:30:06.820 | And Hawaii is pretty significant for us.
00:30:11.620 | So they view it as really the,
00:30:16.240 | that there's a fundamental part of China,
00:30:19.460 | the island of Formosa, now Taiwan,
00:30:23.420 | that is not part of China but should be.
00:30:27.740 | And the only reason it hasn't been
00:30:29.500 | is because of the US Pacific Fleet.
00:30:31.260 | - And as their economic power grows
00:30:34.020 | and as their military power grows,
00:30:35.740 | the thing that they are clearly saying
00:30:40.920 | is their interests will clearly be materialized.
00:30:45.920 | - Yes.
00:30:47.420 | China has been very clear that they will incorporate Taiwan
00:30:52.740 | peacefully or militarily.
00:30:57.880 | But that they will incorporate it,
00:31:00.400 | from their standpoint, is 100% likely.
00:31:03.040 | - Now something you said about conspicuous acts of kindness.
00:31:06.920 | As a geopolitical policy, it almost seems naive.
00:31:10.580 | But I'd venture to say that this is probably
00:31:15.800 | the path forward, how you will avoid most wars.
00:31:20.300 | Just as you say it, it sounds naive,
00:31:22.940 | but it's kind of brilliant.
00:31:24.280 | If you believe in the goodness
00:31:27.800 | of underlying most of human nature.
00:31:30.980 | It just seems like conspicuous acts of kindness
00:31:35.520 | can reverberate through the populace
00:31:38.680 | of the countries involved.
00:31:40.160 | - Yeah, well--
00:31:42.640 | - And deescalate.
00:31:44.040 | - Absolutely.
00:31:44.880 | So after World War I, they made a big mistake.
00:31:49.880 | They basically tried to lump all the blame on Germany
00:31:53.920 | and saddled Germany with impossible reparations.
00:32:05.480 | And really there was quite a bit of blame
00:32:08.920 | to go around for World War I.
00:32:12.640 | But they tried to put it all on Germany.
00:32:15.800 | And that laid the seeds for World War II.
00:32:23.320 | So a lot of people, well not just Hitler,
00:32:28.840 | a lot of people felt wronged.
00:32:31.520 | And they wanted vengeance.
00:32:35.640 | And they got it.
00:32:38.240 | - People don't forget.
00:32:39.640 | - Yeah.
00:32:40.460 | You kill somebody's father, mother, son, daughter,
00:32:46.040 | they're not gonna forget it.
00:32:47.840 | They will want vengeance.
00:32:49.080 | So after World War II, they're like,
00:32:51.920 | well, that Treaty of Versailles was a huge mistake
00:32:55.960 | in World War I.
00:32:58.200 | And so this time, instead of crushing the losers,
00:33:03.200 | we're actually gonna help them with the Marshall Plan
00:33:10.160 | and we're gonna help rebuild Germany.
00:33:14.320 | We're gonna help rebuild Austria and Italy and whatnot.
00:33:19.320 | So that was the right move.
00:33:26.680 | - There's a, it does feel like there's a profound truth
00:33:29.600 | to conspicuous acts of kindness being an antidote to this.
00:33:35.960 | - Something must stop the cycle of reciprocal violence.
00:33:40.080 | Something must stop it.
00:33:42.880 | Or it will, you know, it'll never stop.
00:33:47.160 | Just eye for an eye, tooth for a tooth,
00:33:51.560 | limb for a limb, life for a life, forever and ever.
00:33:56.040 | (sighs)
00:33:57.240 | - To escape briefly the darkness
00:33:58.880 | with some incredible engineering work.
00:34:00.800 | xAI just released Grok AI Assistant
00:34:04.840 | that I've gotten a chance to play with.
00:34:06.280 | It's amazing on many levels.
00:34:08.720 | First of all, it's amazing that a relatively small team
00:34:11.520 | in a relatively short amount of time
00:34:13.720 | was able to develop this close to state-of-the-art system.
00:34:18.720 | Another incredible thing is there's a regular mode
00:34:22.000 | and there's a fun mode.
00:34:23.000 | - Yeah, I guess time to play for that one.
00:34:25.160 | (laughs)
00:34:26.680 | - I wish, first of all,
00:34:27.520 | I wish everything in life had a fun mode.
00:34:29.480 | - Yeah.
00:34:30.320 | - There's something compelling beyond just fun
00:34:34.040 | about the fun mode interacting with a large language model.
00:34:38.120 | I'm not sure exactly what it is
00:34:39.320 | 'cause I've only had a little bit of time to play with it,
00:34:42.260 | but it just makes it more interesting,
00:34:45.560 | more vibrant to interact with the system.
00:34:47.920 | - Yeah, absolutely.
00:34:52.440 | Our AI Grok is modeled after
00:34:56.400 | "The Hitchhiker's Guide to the Galaxy,"
00:34:59.040 | which is one of my favorite books.
00:35:00.960 | It's a book on philosophy disguised as a book on humor.
00:35:05.320 | I would say that forms the basis of my philosophy,
00:35:11.760 | which is that we don't know the meaning of life,
00:35:16.680 | but the more we can expand
00:35:17.920 | the scope and scale of consciousness,
00:35:20.520 | both digital and biological,
00:35:25.040 | the more we are able to understand
00:35:26.440 | what questions to ask about the answer that is the universe.
00:35:30.080 | So I have a philosophy of curiosity.
00:35:32.720 | - There is generally a feeling like this AI system
00:35:38.520 | has an outward-looking,
00:35:42.120 | like the way you are sitting with a good friend,
00:35:44.240 | looking up at the stars,
00:35:45.640 | asking pothead-like questions about the universe.
00:35:49.520 | Wondering what it's all about,
00:35:50.760 | the curiosity you talk about.
00:35:52.400 | There's a sense, no matter how mundane the question,
00:35:54.560 | I ask it, there's a sense of cosmic grandeur
00:35:58.240 | to the whole thing.
00:35:59.060 | - Well, we are actually working hard
00:36:01.200 | to have engineering, math, and physics
00:36:05.680 | answers that you can count on.
00:36:07.180 | So for the other sort of AIs out there,
00:36:12.880 | these so-called large language models,
00:36:17.800 | I've not found the engineering to be reliable.
00:36:21.760 | And the hallucination,
00:36:24.840 | it unfortunately hallucinates most
00:36:26.840 | when you least want it to hallucinate.
00:36:29.640 | So when you ask important, difficult questions,
00:36:32.540 | that's when it tends to be confidently wrong.
00:36:34.780 | So we're really trying hard to say,
00:36:39.480 | okay, how do we be as grounded as possible
00:36:42.080 | so you can count on the results?
00:36:45.760 | Trace things back to physics first principles,
00:36:48.840 | mathematical logic.
00:36:50.060 | So underlying the humor is an aspiration
00:36:55.040 | to adhere to the truth of the universe
00:36:58.920 | as closely as possible.
00:37:00.180 | - That's really tricky.
00:37:02.800 | - It is tricky.
00:37:04.000 | So that's why,
00:37:06.480 | there's always gonna be some amount of error,
00:37:09.600 | but we wanna aspire to be as truthful as possible
00:37:15.440 | about the answers with acknowledged error.
00:37:20.440 | So that there was always,
00:37:21.840 | you don't wanna be confidently wrong.
00:37:23.400 | So you're not gonna be right every time,
00:37:26.760 | but you wanna minimize how often you're confidently wrong.
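As a rough, hypothetical sketch (not xAI's actual evaluation code) of how "minimizing how often you're confidently wrong" could be treated as a measurable target: count the answers asserted with high stated confidence that nonetheless turn out to be incorrect.

```python
from dataclasses import dataclass

@dataclass
class Answer:
    confidence: float  # the model's stated confidence, from 0.0 to 1.0
    correct: bool      # whether the answer matched ground truth

def confidently_wrong_rate(answers, threshold=0.9):
    # Fraction of all answers asserted at or above the confidence threshold
    # that were nonetheless wrong -- the failure mode described above.
    bad = sum(1 for a in answers if a.confidence >= threshold and not a.correct)
    return bad / len(answers)

# Toy data purely for illustration.
answers = [Answer(0.95, True), Answer(0.92, False), Answer(0.40, False), Answer(0.85, True)]
print(f"confidently-wrong rate: {confidently_wrong_rate(answers):.0%}")  # 25% on this toy data
```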
00:37:31.480 | And then like I said,
00:37:32.400 | once you can count on the logic as being
00:37:35.920 | not violating physics,
00:37:40.080 | then you can start to build on that to create inventions,
00:37:44.040 | like invent new technologies.
00:37:46.280 | But if you cannot count
00:37:50.800 | on the foundational physics being correct,
00:37:52.440 | obviously the inventions are simply wishful thinking,
00:37:55.620 | imagination land, magic basically.
00:38:00.440 | - Well, as you said,
00:38:02.680 | I think one of the big goals of xAI
00:38:04.680 | is to understand the universe.
00:38:06.600 | - Yes.
00:38:08.040 | That's our simple three word mission.
00:38:13.360 | - If you look out far into the future,
00:38:15.520 | do you think on this level of physics,
00:38:17.720 | the very edge of what we understand about physics,
00:38:19.680 | do you think it will make discoveries
00:38:22.880 | sort of the sexiest discovery of them as we know now,
00:38:26.600 | sort of unifying general relativity and quantum mechanics?
00:38:30.480 | So coming up with a theory of everything,
00:38:32.680 | do you think it could push towards that direction,
00:38:35.520 | almost like theoretical physics discoveries?
00:38:38.120 | - If an AI cannot figure out new physics,
00:38:41.120 | it's clearly not equal to humans,
00:38:43.960 | let alone surpassed humans,
00:38:45.920 | 'cause humans have figured out new physics.
00:38:48.520 | Physics is just understanding,
00:38:50.120 | deepening one's insight into how reality works.
00:38:54.600 | And then there's engineering,
00:38:58.560 | which is inventing things that have never existed.
00:39:01.640 | Now, the range of possibilities for engineering
00:39:03.400 | is far greater than for physics,
00:39:05.080 | because once you figure out the rules of the universe,
00:39:09.520 | that's it, you've discovered things that already existed.
00:39:12.120 | But from that, you can then build technologies
00:39:15.120 | that are really almost limitless in the variety.
00:39:21.040 | It's like once you understand the rules of the game properly
00:39:26.040 | and with current physics, we do, at least at a local level,
00:39:29.760 | understand how physics works very well.
00:39:32.440 | Our ability to predict things is incredibly good.
00:39:35.400 | Like quantum mechanics is,
00:39:36.760 | the degree to which quantum mechanics
00:39:38.320 | can predict outcomes is incredible.
00:39:40.880 | That was my hardest class in college, by the way.
00:39:44.800 | (laughs)
00:39:45.840 | My senior quantum mechanics class
00:39:47.480 | was harder than all of my other classes put together.
00:39:50.640 | - To get an AI system, a large language model,
00:39:53.840 | to be as reliable as quantum mechanics in physics
00:39:59.640 | is very difficult.
00:40:01.160 | - Yeah, you have to test any conclusions
00:40:03.640 | against the ground truth of reality.
00:40:05.080 | Reality is the ultimate judge.
00:40:06.560 | Like physics is the law,
00:40:07.480 | everything else is a recommendation.
00:40:09.880 | I've seen plenty of people break the laws made by man,
00:40:13.080 | but none break the laws made by physics.
00:40:15.760 | - It's a good test, actually.
00:40:16.960 | If this LLM understands and matches physics,
00:40:21.800 | then you can more reliably trust whatever it thinks
00:40:24.560 | about the current state of politics.
00:40:26.640 | (laughs)
00:40:27.480 | In some sense.
00:40:28.760 | - It's also the case currently
00:40:30.040 | that even its internal logic is not consistent.
00:40:34.480 | So, especially with the approach of just predicting
00:40:39.480 | a token, predict token, predict token,
00:40:44.160 | it's like a vector sum.
00:40:45.200 | You're summing up a bunch of vectors,
00:40:47.320 | but you can get drift.
00:40:49.040 | So, as those, a little bit of error,
00:40:51.320 | a little bit of error adds up,
00:40:53.000 | and by the time you are many tokens down the path,
00:40:55.500 | it doesn't make any sense.
00:40:58.860 | - So, it has to be somehow self-aware about the drift.
00:41:02.720 | - It has to be self-aware about the drift,
00:41:04.200 | and then look at the thing as a gestalt, as a whole,
00:41:08.400 | and say, does it have coherence as a whole?
00:41:13.400 | So, when authors write books,
00:41:18.600 | they will write the book, and then they'll go and revise it,
00:41:22.600 | taking into account the end and the beginning and the middle,
00:41:26.680 | and rewrite it to achieve coherence,
00:41:30.520 | so that it doesn't end up in a nonsensical place.
00:41:33.800 | - Maybe the process of revising is what reasoning is,
00:41:38.200 | and then that's, the process of revising
00:41:39.680 | is how you get closer and closer to truth.
00:41:42.080 | Maybe, at least I approach it that way.
00:41:45.120 | You just say a bunch of bullshit first,
00:41:46.680 | and then you get it better.
00:41:48.120 | You start at bullshit, and then you get--
00:41:48.960 | - Yeah, you create a draft,
00:41:50.200 | and then you iterate on that draft,
00:41:52.460 | until it has coherence, until it all adds up, basically.
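As a toy, hypothetical illustration of the drift described above (not how Grok or any real LLM is implemented): if each generation step adds a small independent error and nothing ever re-examines the sequence as a whole, those errors compound with length, which is why a whole-draft revision pass of the kind discussed here helps.

```python
import random

random.seed(0)

def autoregressive_sum(n_steps, noise=0.01):
    # Toy stand-in for token-by-token generation: each step is meant to add
    # exactly 1.0 but carries a small random error. Since every step only
    # conditions on the running total, earlier errors are never corrected.
    total = 0.0
    for _ in range(n_steps):
        total += 1.0 + random.gauss(0.0, noise)
    return total

# The cumulative drift grows with sequence length (roughly as sqrt(n) here).
for n in (10, 100, 1_000, 10_000):
    drift = abs(autoregressive_sum(n) - n)
    print(f"{n:>6} steps -> drift {drift:.3f}")
```

The revision pass described here corresponds to checking the finished draft for global coherence, rather than only ever looking one step ahead.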
00:41:58.480 | - So, another question about theory of everything,
00:42:01.120 | but for intelligence.
00:42:02.800 | Do you think there exists, as you're exploring this
00:42:05.400 | with xAI, creating this intelligence system,
00:42:07.320 | do you think there is a theory of intelligence,
00:42:10.080 | where you get to understand what,
00:42:12.740 | like, what is the I in AGI,
00:42:17.920 | and what is the I in human intelligence?
00:42:22.400 | - There's no I in Team America.
00:42:23.840 | Oh, wait, there is.
00:42:24.840 | (laughing)
00:42:26.600 | - Ah, now it's gonna be stuck in my head now.
00:42:28.720 | (laughing)
00:42:31.600 | - Yeah, there's no me in whatever.
00:42:34.800 | In quantum mechanics, oh, wait.
00:42:38.040 | (laughing)
00:42:40.280 | I mean, is that part of the process of discovering,
00:42:44.200 | understanding the universe, is understanding intelligence?
00:42:47.960 | - Yeah.
00:42:48.800 | Yeah, I think we need to understand intelligence,
00:42:52.280 | understand consciousness.
00:42:53.800 | I mean, there are some sort of fundamental questions
00:42:56.940 | of, like, what is thought, what is emotion?
00:43:01.160 | - Yeah.
00:43:02.000 | - Is it really just one atom bumping into another atom?
00:43:06.040 | It feels like something more than that.
00:43:07.880 | So, I think we're probably missing some really big things.
00:43:13.400 | Like, some really big things.
00:43:18.520 | - Like, something that'll be obvious in retrospect.
00:43:20.800 | - Yes.
00:43:21.640 | - Like, there's a giant,
00:43:22.720 | like, you put the whole consciousness, emotion.
00:43:25.540 | - Well, some people would call it like a soul,
00:43:28.560 | you know, in religion, it's a soul.
00:43:31.360 | Like, you feel like you're you, right?
00:43:33.920 | I mean, you don't feel like you're just
00:43:36.600 | a collection of atoms.
00:43:38.560 | But on what dimension does thought exist?
00:43:41.800 | What dimension do emotions exist?
00:43:43.720 | We feel them very strongly.
00:43:45.080 | I suspect there's more to it than atoms bumping into atoms.
00:43:49.560 | - And maybe AI can pave the path to the discovery
00:43:55.760 | of whatever the hell that thing is.
00:43:58.120 | - Yeah, what is consciousness?
00:43:59.640 | Like, when you put the atoms in a particular shape,
00:44:02.840 | why are they able to form thoughts
00:44:05.000 | and take actions and feelings?
00:44:09.280 | - And even if it is an illusion,
00:44:11.720 | why is this illusion so compelling?
00:44:13.360 | - Yeah.
00:44:14.200 | Why does this illusion exist?
00:44:16.560 | On what plane does this illusion exist?
00:44:21.760 | And sometimes I wonder,
00:44:24.240 | either perhaps everything's conscious
00:44:28.120 | or nothing is conscious.
00:44:30.200 | Um, one of the two.
00:44:32.120 | - I like the former.
00:44:34.840 | Everything conscious just seems more fun.
00:44:37.880 | - It does seem more fun, yes.
00:44:39.640 | But we're composed of atoms,
00:44:43.000 | and those atoms are composed of quarks and leptons.
00:44:45.760 | And those quarks and leptons have been around
00:44:47.880 | since the beginning of the universe.
00:44:50.080 | - The beginning of the universe.
00:44:53.120 | - Right, what seems to be the beginning of the universe?
00:44:55.640 | - The first time we talked, you said,
00:44:57.560 | which is surreal to think that this discussion
00:45:00.400 | was happening and is becoming a reality,
00:45:02.640 | I asked you what question would you ask an AGI system
00:45:05.840 | once you create it,
00:45:06.960 | and you said, "What's outside the simulation?"
00:45:09.240 | Is the question.
00:45:10.160 | Good question.
00:45:13.520 | But it seems like with Grok, you started to,
00:45:15.800 | literally, the system's goal
00:45:19.320 | is to be able to ask such questions.
00:45:21.400 | To answer such questions and to ask such questions.
00:45:24.040 | - Where are the aliens?
00:45:25.400 | - Where are the aliens?
00:45:26.240 | That's one of the, like the Fermi paradox question.
00:45:28.760 | A lot of people have asked me
00:45:32.080 | if I've seen any evidence of aliens, and I haven't.
00:45:34.880 | Which is kind of concerning, 'cause then,
00:45:37.640 | I think I'd probably prefer to at least
00:45:41.000 | have seen some archeological evidence of aliens.
00:45:43.400 | To the best of my knowledge, there is no proof.
00:45:47.000 | I'm not aware of any evidence of aliens.
00:45:49.040 | If they're out there, they're very subtle.
00:45:52.680 | We might just be the only consciousness,
00:45:54.560 | at least in the galaxy,
00:45:56.400 | and if you look at, say, the history of Earth,
00:46:00.960 | if one is to believe the archeological record,
00:46:03.400 | Earth is about four and a half billion years old.
00:46:06.200 | Civilization, as measured from the first writing,
00:46:09.360 | is only about 5,000 years old.
00:46:11.880 | We have to give some credit there to the ancient Sumerians,
00:46:13.960 | who aren't around anymore.
00:46:15.640 | I think it was the archaic pre-cuneiform
00:46:18.440 | was the first actual symbolic representation.
00:46:21.400 | But only about 5,000 years ago.
00:46:24.240 | I think that's a good date for when, say,
00:46:27.480 | civilization started.
00:46:28.740 | That's one millionth of Earth's existence.
00:46:32.560 | So civilization has been around,
00:46:37.160 | it's really a flash in the pan so far.
00:46:40.860 | And why have we, why did it take so long?
00:46:50.760 | For four and a half billion years,
00:46:54.280 | for the vast majority of that time,
00:46:59.540 | there was no life,
00:47:00.760 | and then there was archaic bacteria for a very long time.
00:47:03.880 | And then you had mitochondria get captured,
00:47:09.920 | multicellular life,
00:47:13.080 | differentiation into plants and animals,
00:47:18.080 | life moving from the oceans to land,
00:47:20.720 | mammals, higher brain functions.
00:47:24.060 | And the sun is expanding slowly,
00:47:27.680 | but it will heat the Earth up,
00:47:34.280 | at some point in the future,
00:47:38.000 | boil the oceans,
00:47:39.920 | and Earth will become like Venus,
00:47:42.440 | where life as we know it is impossible.
00:47:45.200 | So if we do not become multi-planetary,
00:47:49.000 | and ultimately go beyond our solar system,
00:47:52.060 | annihilation of all life on Earth is a certainty.
00:47:56.240 | A certainty.
00:47:58.200 | And it could be as little as,
00:48:02.760 | on the galactic time scale,
00:48:05.040 | half a billion years.
00:48:08.360 | You know, long time by human standards,
00:48:11.440 | but that's only 10% longer than Earth
00:48:14.320 | has been around at all.
00:48:16.500 | So if life had taken 10% longer to evolve on Earth,
00:48:21.500 | it wouldn't exist at all.
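A quick back-of-the-envelope check of the figures mentioned here, using the rounded values from the conversation itself:

```python
# Rounded figures as stated above: Earth is ~4.5 billion years old, written
# civilization is ~5,000 years old, and roughly half a billion years remain
# before the expanding Sun makes Earth uninhabitable.
earth_age_years = 4.5e9
civilization_years = 5_000
time_remaining_years = 0.5e9

print(f"civilization / Earth's age   = {civilization_years / earth_age_years:.1e}")    # ~1e-06, 'one millionth'
print(f"time remaining / Earth's age = {time_remaining_years / earth_age_years:.0%}")  # ~11%, 'about 10% longer'
```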
00:48:23.480 | - We got a deadline coming up.
00:48:29.300 | We better hurry.
00:48:30.260 | But that said, as you said,
00:48:32.020 | humans, intelligent life on Earth
00:48:34.460 | developed a lot of cool stuff very quickly.
00:48:38.340 | So it seems like becoming multi-planetary
00:48:42.400 | is almost inevitable,
00:48:45.020 | unless we destroy this thing.
00:48:45.860 | - We need to do it.
00:48:46.700 | I mean, it's not,
00:48:49.020 | I mean, I suspect that there,
00:48:51.880 | if we are able to go out there
00:48:54.100 | and explore other star systems,
00:48:55.820 | that there's a good chance we find
00:48:58.020 | a whole bunch of long-dead, one-planet civilizations
00:49:01.380 | that never made it past their home planet.
00:49:03.180 | - That's so sad.
00:49:04.020 | - Yeah. - That's sad.
00:49:06.380 | Also fascinating.
00:49:08.100 | - I mean, there are various explanations
00:49:09.460 | for the Fermi paradox,
00:49:11.020 | and one is there's these great filters,
00:49:13.860 | which civilizations don't pass through.
00:49:17.020 | And one of those great filters is,
00:49:19.140 | do you become a multi-planet civilization or not?
00:49:21.940 | And if you don't, it's simply a matter of time
00:49:24.280 | before something happens on your planet,
00:49:26.280 | either natural or man-made,
00:49:31.740 | that causes us to die out, like the dinosaurs.
00:49:35.300 | Where are they now?
00:49:37.140 | They didn't have spaceships. (laughs)
00:49:42.380 | - I think the more likely thing is,
00:49:44.280 | 'cause just to empathize with the aliens,
00:49:48.220 | that they found us and they're protecting us
00:49:50.540 | and letting us be. - I hope so.
00:49:52.220 | Nice aliens.
00:49:53.500 | - Just like the tribes in the Amazon.
00:49:55.660 | The uncontacted tribes, we're protecting them.
00:49:58.780 | That's what-- - That would be
00:49:59.620 | a nice explanation.
00:50:00.660 | - Or you could have, what was it?
00:50:03.420 | I think Andrej Karpathy said it's like the ants
00:50:06.740 | in the Amazon asking, "Where's everybody?"
00:50:10.500 | - Well, they do run into a lot of other ants.
00:50:12.900 | - That's true. - They have these ant wars.
00:50:15.100 | (laughing)
00:50:16.740 | - Sounds like a good TV show.
00:50:18.060 | - Yeah, they literally have these big wars
00:50:19.500 | between various ants.
00:50:20.580 | - Yeah, maybe I'm just dismissing
00:50:25.660 | all the different diversity of ants.
00:50:27.740 | - You should listen to that Werner Herzog
00:50:29.500 | talking about the jungle, it's really hilarious.
00:50:31.020 | Have you heard it?
00:50:31.940 | - No, I have not. - It's awesome.
00:50:32.780 | - But Werner Herzog has a way.
00:50:34.460 | (laughing)
00:50:36.820 | - You should play it as an interlude in the,
00:50:41.820 | it's on YouTube, it's awesome.
00:50:43.780 | - I love him so much. - Yeah, he's great.
00:50:47.820 | - Was he the director of "Happy People: A Year in the Taiga"?
00:50:50.060 | I think also.
00:50:50.900 | - He did that bear documentary.
00:50:52.300 | - The bear documentary, yeah.
00:50:53.140 | - And did this thing about penguins.
00:50:54.140 | - Yeah. (laughing)
00:50:56.820 | The psychoanalysis of penguins.
00:51:00.420 | - Yeah, the penguin's headed for mountains
00:51:04.140 | that are like 70 miles away,
00:51:05.900 | the penguin is just headed for doom, basically.
00:51:08.700 | - Well, he had a cynical take.
00:51:11.060 | He could be just a brave explorer
00:51:13.180 | and there'll be great stories told about him
00:51:16.940 | amongst the penguin population for many centuries to come.
00:51:20.440 | (laughing)
00:51:23.900 | What were we talking about?
00:51:24.940 | Okay. - Aliens.
00:51:26.420 | (laughing)
00:51:28.660 | - Yes, aliens, I mean, I don't know.
00:51:31.540 | Look, I think the smart move is just,
00:51:35.740 | this is the first time in the history of Earth
00:51:38.820 | that it's been possible for life to extend beyond Earth.
00:51:42.140 | That window is open.
00:51:44.820 | Now, it may be open for a long time
00:51:49.260 | or it may be open for a short time.
00:51:51.020 | And it may be open now and then never open again.
00:51:54.820 | So I think the smart move here
00:51:57.020 | is to make life multi-planetary while it is possible
00:52:00.300 | to do so.
00:52:01.540 | We don't wanna be one of those lame one-planet civilizations
00:52:03.820 | that just dies out.
00:52:04.860 | - No, those are lame. - Yeah, lame.
00:52:06.660 | (laughing)
00:52:09.020 | - No self-respecting civilization would be one planet.
00:52:11.260 | - There's not gonna be a Wikipedia entry
00:52:13.100 | for one of those.
00:52:14.160 | And pause.
00:52:18.140 | Does SpaceX have an official policy for when we meet aliens?
00:52:22.840 | - No. (laughing)
00:52:24.580 | - Okay.
00:52:25.540 | That seems irresponsible.
00:52:26.920 | (laughing)
00:52:30.860 | - I mean, look, if I see the slightest indication
00:52:34.220 | that there are aliens, I will immediately post
00:52:36.020 | on the X-platform anything I know.
00:52:38.740 | - It could be the most liked reposted post of all time.
00:52:42.580 | - Yeah, I mean, look, we have more satellites
00:52:45.260 | up there right now than everyone else combined.
00:52:48.100 | So we know if we've gotta maneuver around something
00:52:52.260 | and we don't have to maneuver around anything.
00:52:54.560 | - If we go to the big questions once again,
00:52:57.900 | you said you're with Einstein,
00:53:00.420 | that you believe in the god of Spinoza.
00:53:03.420 | (laughing)
00:53:04.460 | - Yes.
00:53:05.540 | - So that's a view that God is like the universe
00:53:08.660 | and reveals himself through the laws of physics
00:53:12.100 | or as Einstein said, through the lawful harmony of the world.
00:53:16.180 | - Yeah, I would agree that God, the simulator or whatever,
00:53:20.660 | the supreme being or beings,
00:53:22.600 | reveal themselves through the physics.
00:53:28.420 | You know, they're creators of this existence.
00:53:31.220 | And it's incumbent upon us to try to understand
00:53:33.820 | more about this wondrous creation.
00:53:36.700 | - Who created this thing, who's running this thing?
00:53:40.940 | Like embodying it into a singular question
00:53:43.860 | with a sexy word on top of it
00:53:45.700 | is like focusing the mind to understand.
00:53:47.700 | It does seem like there's a, again, it could be an illusion.
00:53:52.060 | It seemed like there's a purpose,
00:53:54.460 | that there's an underlying master plan of some kind.
00:53:57.300 | It seems like.
00:53:58.900 | - There may not be a master plan
00:54:00.620 | in the sense, so there's like,
00:54:03.060 | maybe an interesting answer to the question
00:54:05.460 | of determinism versus free will
00:54:07.060 | is that if we are in a simulation,
00:54:10.500 | the reason that these higher beings would hold a simulation
00:54:14.220 | is to see what happens.
00:54:15.680 | So it's not, they don't know what happens.
00:54:20.960 | Otherwise they wouldn't hold the simulation.
00:54:24.820 | So when humans create a simulation,
00:54:27.460 | like at SpaceX and Tesla,
00:54:29.340 | we create simulations all the time,
00:54:31.740 | especially for the rocket.
00:54:33.040 | You have to run a lot of simulations
00:54:38.060 | to understand what's gonna happen
00:54:39.540 | because you can't really test the rocket
00:54:41.620 | until it goes to space and you want it to work.
00:54:44.860 | So you have to simulate subsonic, transonic,
00:54:49.060 | supersonic, hypersonic, ascent,
00:54:53.980 | and then coming back, super high heating
00:54:56.380 | and orbital dynamics, all this is gonna be simulated.
00:55:00.340 | So, 'cause you don't get very many kicks at the can.
00:55:04.340 | But we run the simulations to see what happens,
00:55:06.460 | because if we knew what happens, we wouldn't run the simulation.
00:55:09.820 | So if there's, so whoever created this existence,
00:55:13.360 | they're running it because they don't know
00:55:18.380 | what's gonna happen, not because they do.
00:55:20.540 | - So maybe, we both play Diablo,
00:55:25.700 | maybe Diablo was created to see if a druid,
00:55:28.260 | your character, could defeat Uber Lilith at the end.
00:55:32.700 | They didn't know.
00:55:33.580 | - Well, the funny thing is that Uber Lilith's
00:55:36.780 | title is Hatred Incarnate.
00:55:40.660 | - Yeah.
00:55:41.500 | - And right now, I guess, you can ask the Diablo team,
00:55:46.580 | but it's almost impossible to defeat hatred
00:55:51.580 | in the eternal realm.
00:55:54.940 | - Yeah, you've streamed yourself dominating
00:55:57.580 | tier 100 nightmare dungeons, and still.
00:56:00.820 | - I can cruise through tier 100 nightmare dungeons
00:56:03.620 | like a stroll in the park.
00:56:05.860 | - And still you're defeated by hatred.
00:56:09.060 | - Yeah, I can, the sort of, I guess,
00:56:11.580 | maybe the second hardest boss is Duriel.
00:56:14.380 | Duriel can't even scratch the paint.
00:56:16.180 | So, I killed Duriel so many times.
00:56:20.700 | And every other boss in the game,
00:56:22.020 | all of them, killed him so many times, it's easy.
00:56:25.460 | But Uber Lilith, otherwise known as Hatred Incarnate,
00:56:30.460 | especially if you're a druid and you have no ability
00:56:34.340 | to be invulnerable, there are these random death waves
00:56:38.380 | that come at you.
00:56:39.620 | And I'm pretty, you know, I really am 52,
00:56:45.260 | so my reflexes are not what they used to be,
00:56:46.780 | but I have a lifetime of playing video games.
00:56:50.020 | At one point, I was maybe one of the best Quake players
00:56:54.100 | in the world, actually won money for what I think
00:56:58.940 | was the first paid eSports tournament in the US.
00:57:01.580 | We were doing four-person Quake tournaments,
00:57:06.220 | and we came second.
00:57:09.420 | I was the second best person on the team.
00:57:11.500 | And the actual best person, we were actually winning,
00:57:14.140 | we were gonna come first, except the best person
00:57:15.700 | on the team, his computer crashed halfway through the game.
00:57:19.180 | So, we came second.
00:57:21.980 | But I got money for it and everything.
00:57:23.220 | So, basically, I got skills, albeit, you know,
00:57:26.820 | no spring chicken these days.
00:57:28.660 | And to be totally frank, it's driving me crazy
00:57:32.940 | trying to beat Lilith as a druid,
00:57:36.380 | basically trying to beat Hatred Incarnate
00:57:39.460 | in the eternal realm.
00:57:41.020 | - As a druid. - As a druid.
00:57:42.620 | And if you, this is really vexing, let me tell you.
00:57:47.620 | - I mean, the challenge is part of the fun.
00:57:50.740 | I have seen directly, like, you're actually
00:57:54.100 | like a world-class, incredible video game player.
00:57:56.860 | - Yeah. - And I think Diablo,
00:57:58.820 | so you're just picking up a new game,
00:58:00.460 | and you're figuring out its fundamentals.
00:58:03.060 | You're also, with the Paragon board and the build,
00:58:05.720 | not somebody like me who perfectly follows
00:58:10.100 | whatever they suggest on the internet.
00:58:11.980 | You're also an innovator there.
00:58:13.860 | - Yeah. (laughs)
00:58:15.220 | - Which is hilarious to watch.
00:58:17.140 | It's like a mad scientist just trying to figure out
00:58:20.340 | the Paragon board and the build.
00:58:22.220 | - Yeah. - You know?
00:58:23.820 | Is there some interesting insights there
00:58:25.460 | about if somebody's starting as a druid,
00:58:28.060 | do you have advice?
00:58:29.020 | - I would not recommend playing a druid
00:58:31.900 | in the eternal realm. - No.
00:58:33.860 | - Right now, I think the most powerful character
00:58:36.300 | in the seasonal realm is the sorcerer
00:58:38.980 | with the lightning balls.
00:58:41.140 | The sorcs have huge balls in the seasonal.
00:58:45.860 | - Oh, yeah. (Adam laughs)
00:58:47.460 | That's what they say.
00:58:48.300 | - Yeah, the sorcs have huge balls.
00:58:51.300 | They do, huge balls of lightning.
00:58:54.780 | - I'll take your word for it.
00:58:56.380 | - And it's actually, in the seasonal realm,
00:58:58.980 | it's pretty easy to beat Uber Lilith,
00:59:02.400 | because you get these vampiric powers
00:59:04.700 | that amplify your damage and increase your defense
00:59:08.060 | and whatnot, so.
00:59:09.820 | But really quite easy to defeat hatred seasonally.
00:59:15.060 | But to defeat hatred eternally, very difficult.
00:59:18.260 | Almost impossible, it's virtually impossible.
00:59:21.740 | It seems like a metaphor for life.
00:59:24.980 | - I like the idea that Elon Musk,
00:59:27.580 | 'cause I was playing Diablo yesterday
00:59:29.060 | and I saw level 100 druids just run by,
00:59:33.060 | I will never die, and then run back the other way.
00:59:36.500 | And there's just some, this metaphor is kinda hilarious,
00:59:39.560 | that you, Elon Musk, is fighting hatred,
00:59:43.660 | restlessly fighting hatred in this demonic realm.
00:59:47.020 | - Yes.
00:59:48.180 | - It's hilarious, I mean, it's pretty hilarious.
00:59:49.900 | - No, it's absurd.
00:59:50.740 | (Lex laughs)
00:59:52.780 | Really, it's exercise in absurdity,
00:59:54.780 | and it makes me wanna pull my hair out.
00:59:56.060 | - Yeah.
00:59:56.900 | What do you get from video games in general?
00:59:59.540 | Is there, for you personally?
01:00:03.860 | - I don't know if I, it calms my mind.
01:00:06.660 | I mean, you sort of, killing the demons in a video game
01:00:10.940 | calms the demons in my mind.
01:00:13.340 | (Lex blows air)
01:00:15.620 | If you play a tough video game,
01:00:17.740 | you can get into a state of flow,
01:00:19.360 | which is very enjoyable.
01:00:21.680 | And admittedly, it needs to be not too easy, not too hard,
01:00:28.060 | kind of in the Goldilocks zone.
01:00:32.420 | And I guess you generally wanna feel
01:00:33.500 | like you're progressing in the game.
01:00:35.980 | So, a good video, and there's also beautiful art.
01:00:41.940 | Engaging storylines.
01:00:44.500 | And it's like an amazing puzzle to solve, I think.
01:00:50.300 | And so, it's like solving the puzzle.
01:00:52.220 | - "Elden Ring," the greatest game of all time?
01:00:54.500 | I still haven't played it, but to you?
01:00:56.700 | - "Elden Ring" is definitely a candidate
01:00:57.900 | for best game ever.
01:00:58.900 | Top five, for sure.
01:01:01.220 | - I think I've been scared how hard it is,
01:01:03.060 | or how hard I hear it is.
01:01:05.380 | So, but it is beautiful.
01:01:06.780 | - "Elden Ring" feels like it's designed by an alien.
01:01:11.400 | - There's a theme to this discussion.
01:01:15.700 | In what way?
01:01:16.540 | - It's so unusual.
01:01:18.540 | It's incredibly creative, and the art is stunning.
01:01:21.000 | I recommend playing it on a big resolution,
01:01:25.420 | high dynamic range TV, even.
01:01:27.900 | Doesn't need to be a monitor.
01:01:29.340 | The art is incredible.
01:01:32.460 | It's so beautiful.
01:01:33.340 | And it's so unusual.
01:01:37.300 | And each of those top five boss battles is unique.
01:01:42.300 | Like, it's like a unique puzzle to solve.
01:01:46.220 | Each one's different.
01:01:47.980 | And the strategy you use to solve one battle
01:01:51.100 | is different from another battle.
01:01:54.340 | - That said, you said druid in the eternal realm
01:01:56.100 | against Uber Lilith is the hardest boss battle you've ever--
01:02:00.100 | - Correct.
01:02:00.940 | That is currently the, and I've played a lot of video games.
01:02:05.780 | 'Cause it's my primary recreational activity.
01:02:08.180 | And yes, beating hatred in the eternal realm
01:02:16.580 | is the hardest boss battle in life and in the video game.
01:02:23.060 | - Metaphor on top of metaphor.
01:02:25.260 | - I'm not sure it's possible, but it's,
01:02:28.060 | I do make progress.
01:02:29.180 | So then I'm like, okay, I'm making progress.
01:02:31.220 | Maybe if I just tweak that paragon board a little more,
01:02:35.780 | I can do it.
01:02:37.860 | I can just dodge a few more waves, I can do it.
01:02:42.860 | - Well, the simulation is created
01:02:44.740 | for the purpose of figuring out if it can be done.
01:02:47.340 | And you're just a cog in that simulation,
01:02:50.220 | in the machine of the simulation.
01:02:51.740 | - Yeah, it might be.
01:02:52.740 | I have a feeling that at least,
01:02:59.220 | I think--
01:03:00.340 | - It's doable.
01:03:03.220 | - It's doable, yes.
01:03:04.300 | - Well, that's the human spirit right there, to believe.
01:03:09.380 | - Yeah, I mean, it did prompt me to think
01:03:11.780 | about just hate in general, which is,
01:03:14.500 | you wanna be careful of one of those things
01:03:17.940 | where you wish for something that sounds good,
01:03:22.620 | but if you get it, it's actually a dystopian situation.
01:03:27.580 | So you could sort of have a case where, like,
01:03:32.580 | if you wish for world peace, sounds good,
01:03:35.820 | but how is it enforced?
01:03:37.980 | And at what cost is it, at what cost eternal peace?
01:03:44.460 | It might actually be worse to have eternal peace
01:03:47.260 | because of what that would entail.
01:03:49.980 | The suppression of everyone,
01:03:51.500 | it might be the suppression of progress.
01:03:55.940 | It might be an ossified society that never changes.
01:03:58.460 | What I'm saying is that there is an argument
01:04:03.460 | that if you wish for no war,
01:04:09.660 | you should be careful what you wish for,
01:04:13.780 | because what's required in order for there to be no war?
01:04:17.460 | Might be worse than a little war.
01:04:22.420 | - Yeah, the things we consider to be flaws
01:04:24.220 | of human civilization might be necessary components
01:04:26.980 | for whatever optimal looks like.
01:04:28.980 | I mean, do you worry about AI, AGI,
01:04:32.660 | enabling a dystopian state of this nature,
01:04:35.140 | whether it's 1984 with surveillance and fear
01:04:38.540 | or Brave New World with pleasure?
01:04:41.540 | And what is it, lots of sex, but no deep human experience?
01:04:51.540 | - There's actually a real drug called Soma.
01:04:54.300 | - There is? - Yes.
01:04:55.380 | (laughing)
01:04:56.820 | - Let me take notes.
01:04:58.460 | Does Grok know about it?
01:05:00.020 | No, it probably will go to Brave New World.
01:05:02.420 | - You could ask it.
01:05:03.740 | - What would be the question?
01:05:05.540 | - Tell us about the drug called Soma.
01:05:07.860 | - Tell us about the drug called Soma.
01:05:11.940 | It's gonna go to Brave New World, I think.
01:05:15.500 | - Well, there's a real drug as well.
01:05:17.980 | - Searching for Soma drug information.
01:05:20.180 | (laughing)
01:05:22.420 | Oh boy.
01:05:23.420 | Ah, Soma, the wonder drug.
01:05:25.860 | By the way, I'm in fun mode.
01:05:27.620 | - Oh, okay, fun.
01:05:28.460 | (laughing)
01:05:29.580 | Yeah, so it's talking about the real drug.
01:05:31.420 | - Oh, it's a muscle relaxant
01:05:32.900 | that's been around since the 1950s.
01:05:34.860 | It's used to treat muscle pain and discomfort,
01:05:36.980 | and it does this by altering the activity
01:05:38.980 | between neurons and the spinal cord.
01:05:40.940 | Now, Soma isn't just a pain reliever.
01:05:43.460 | It's also a bit of a sedative.
01:05:45.180 | It can make you feel a bit drowsy,
01:05:46.940 | so it's not something to take
01:05:48.700 | if you're planning on operating heavy machinery
01:05:50.500 | or trying to solve complex math equations.
01:05:53.180 | And it goes on and on.
01:05:54.180 | In conclusion, Soma's a drug that can be helpful
01:05:56.260 | for those dealing with muscle pain,
01:05:57.780 | but it's not without its risks.
01:06:00.300 | It's pretty good.
01:06:01.140 | Oh, wait a minute.
01:06:01.980 | - That's the real drug it's talking about.
01:06:03.300 | - It's the real drug.
01:06:04.740 | - Yeah.
01:06:05.580 | - And of course, the fun is always use it responsibly
01:06:07.980 | in accordance with the advice of a healthcare professional.
01:06:10.080 | And remember, in the end,
01:06:11.420 | the universe is probably just a giant computer simulation,
01:06:13.980 | so don't take anything too seriously.
01:06:15.820 | I love it.
01:06:16.660 | (laughing)
01:06:17.500 | I love it.
01:06:18.340 | (laughing)
01:06:19.180 | I love it.
01:06:20.020 | - Yeah, so.
01:06:20.840 | - Oh, and it also really nicely includes
01:06:22.900 | the tweets, posts on X that are relevant to this,
01:06:27.900 | which is really cool.
01:06:29.300 | - Yeah.
01:06:30.140 | - It integrates everything together.
01:06:32.140 | - Yeah.
01:06:32.980 | - The interface is one of the really cool things here.
01:06:35.700 | - Yeah, seems like a great job.
01:06:37.300 | (laughing)
01:06:39.540 | So, and you could say,
01:06:41.940 | "No, I mean Soma from Brave New World."
01:06:46.540 | - Yeah, "Brave New World."
01:06:48.100 | 'Cause it thinks you mean Soma, the real drug.
01:06:51.260 | - Yeah.
01:06:52.100 | - No, I mean Soma from Brave New World.
01:06:53.620 | - It's gonna be like, oops, I'm really sorry.
01:06:59.460 | I'm really surprised it went to the real drug first.
01:07:02.060 | Ah, my apologies.
01:07:04.300 | (laughing)
01:07:05.140 | I was thinking of the muscle relaxants.
01:07:06.420 | Soma from Brave New World is a different beast altogether.
01:07:09.660 | (laughing)
01:07:10.500 | In Huxley's dystopia,
01:07:11.940 | Soma is a drug that plays a central role
01:07:14.580 | in the society depicted in the book.
01:07:15.900 | It's a mood-altering, hallucinogenic drug
01:07:18.940 | that's used to keep the population docile
01:07:21.020 | and content in the novel.
01:07:22.860 | The use of Soma in Brave New World
01:07:24.060 | raises some interesting questions
01:07:25.300 | about the nature of happiness
01:07:26.700 | and the role of drugs in society.
01:07:28.460 | Oh, man.
01:07:30.620 | - Exactly, is it better to live in a world
01:07:32.580 | where everyone is happy all the time,
01:07:34.420 | even if that happiness is artificial?
01:07:37.700 | Good question.
01:07:38.900 | This is what I mean.
01:07:39.900 | Like, do you wish for world peace
01:07:43.260 | and happiness all the time?
01:07:45.500 | Are you sure?
01:07:46.780 | Because that might be a society
01:07:51.100 | that is essentially sterile and ossified,
01:07:53.940 | that never changes, that is ultimately doomed.
01:07:56.900 | - This kind of tension between darkness and light.
01:08:03.380 | - This is really a very good summary.
01:08:06.440 | It really gets to the point.
01:08:09.060 | This is not simply regurgitating Brave New World.
01:08:12.060 | It's actually getting to the salient element
01:08:15.580 | of Soma as a drug.
01:08:18.420 | Do you actually wanna be in a situation
01:08:22.220 | where everyone is happy all the time,
01:08:23.620 | even though it's artificial?
01:08:25.260 | Or is it better to confront the challenges of life
01:08:27.220 | and experience a full range of human emotions,
01:08:29.140 | even if it means experiencing pain and suffering?
01:08:31.340 | - For those listening, by the way,
01:08:32.940 | Elon just read directly from Grok,
01:08:35.220 | which is a really nice, kind of insightful
01:08:37.220 | philosophical analysis of the tension here.
01:08:40.140 | Interesting.
01:08:41.260 | It pretty much nails it.
01:08:42.680 | In conclusion, Soma from Brave New World
01:08:44.380 | is a fictional drug that is used to explore
01:08:46.060 | some deep philosophical questions
01:08:47.580 | about the nature of happiness
01:08:49.100 | and the role of drugs in society.
01:08:51.140 | It's a powerful symbol of the dangers
01:08:52.660 | of using drugs to escape from reality
01:08:55.420 | and the importance of confronting
01:08:56.700 | the challenges of life head on.
01:08:58.380 | Nailed it.
01:09:01.480 | And the crazy thing is, we do have a real drug called Soma,
01:09:05.740 | which is kind of like the drug in the book.
01:09:09.820 | And I'm like, they must have named it after something.
01:09:12.300 | - Yeah, probably, probably.
01:09:14.300 | - Soma, the real drug, is quite effective on back pain.
01:09:17.820 | - So you know about this drug.
01:09:18.820 | - I've taken it. - This is fascinating.
01:09:21.060 | Okay.
01:09:21.900 | - 'Cause I had like a squashed disc in my T5, T6.
01:09:26.340 | - So it takes the physical pain away, but Soma here is--
01:09:28.820 | - It doesn't completely.
01:09:29.900 | It reduces the amount of pain you feel,
01:09:31.920 | but at the expense of mental acuity.
01:09:34.520 | It dulls your mind.
01:09:37.640 | (laughing)
01:09:39.840 | Just like the drug in the book.
01:09:41.680 | - Just like the drug in the book.
01:09:43.440 | - Yeah, wow. - And hence the trade off.
01:09:46.000 | The thing that seems like utopia
01:09:47.240 | could be a dystopia after all.
01:09:49.120 | - Yeah, actually I was talking to a friend of mine
01:09:51.560 | saying, would you really want there
01:09:55.920 | to be no hate in the world?
01:09:57.600 | Like really none?
01:09:58.760 | I wonder why hate evolved.
01:10:01.400 | I'm not saying we should amplify hate, of course.
01:10:05.200 | I think we should try to minimize it.
01:10:06.280 | But if it's not at all,
01:10:08.520 | there might be a reason for hate.
01:10:12.040 | - And suffering.
01:10:14.320 | I mean, it's really complicated to consider
01:10:16.840 | that some amount of human suffering
01:10:18.840 | is necessary for human flourishing.
01:10:20.820 | - Is it possible to appreciate the highs
01:10:23.840 | without knowing the lows?
01:10:26.200 | - And that all is summarized there
01:10:31.200 | in a single statement from Grok.
01:10:34.360 | Okay. - No highs, no lows.
01:10:36.040 | Who knows?
01:10:36.880 | - That's almost a poem.
01:10:38.920 | It seems that training LLMs efficiently
01:10:43.120 | is a big focus for xAI.
01:10:44.720 | What's the, first of all, what's the limit
01:10:49.320 | of what's possible in terms of efficiency?
01:10:51.600 | There's this terminology of useful productivity per watt.
01:10:56.600 | Like what have you learned from pushing the limits of that?
01:10:59.440 | - Well, I think it's helpful,
01:11:00.440 | the tools of physics are very powerful
01:11:03.240 | and can be applied, I think,
01:11:04.200 | to almost any, really any arena in life.
01:11:07.600 | It's really just critical thinking.
01:11:10.400 | For something important, you need to reason
01:11:13.200 | from first principles and think about things in the limit,
01:11:15.880 | one direction or the other.
01:11:17.240 | So in the limit, even at the Kardashev scale,
01:11:22.720 | meaning even if you harness the entire power of the sun,
01:11:25.440 | you will still care about useful compute per watt.
01:11:28.000 | So that's where, I think, probably where things are headed
01:11:33.520 | from the standpoint of AI is that we have a silicon shortage
01:11:38.520 | now that will transition to a voltage transformer shortage
01:11:44.840 | in about a year.
01:11:45.800 | Ironically, transformers for transformers.
01:11:49.000 | You need transformers to run transformers.
01:11:52.920 | - Somebody has a sense of humor in this thing.
01:11:55.120 | (laughing)
01:11:57.000 | - I think, yes.
01:11:59.520 | Fate loves irony.
01:12:03.440 | Ironic humor.
01:12:04.480 | An ironically funny outcome
01:12:06.640 | seems to be often what fate wants.
01:12:09.000 | - Humor is all you need.
01:12:10.640 | I think spice is all you need, somebody posted.
01:12:13.840 | - Yeah, but yeah, so we have a silicon shortage today,
01:12:17.560 | a voltage step-down transformer shortage
01:12:21.360 | probably in about a year,
01:12:22.400 | and then just electricity shortages in general
01:12:25.600 | in about two years.
01:12:26.760 | I gave a speech for the world gathering
01:12:29.520 | of utility companies, electricity companies.
01:12:33.080 | And I said, "Look, you really need to prepare
01:12:34.680 | "for a tripling of electricity demand."
01:12:36.640 | Because all transport is gonna go electric
01:12:40.200 | with the ironic exception of rockets.
01:12:42.000 | And heating will also go electric.
01:12:47.920 | So energy usage right now is roughly one third,
01:12:53.320 | very rough terms, one third electricity,
01:12:55.800 | one third transport, one third heating.
01:12:59.680 | And so in order for everything to go sustainable,
01:13:03.280 | to go electric, you need to triple electricity output.
01:13:08.280 | So I encourage the utilities to build more power plants
01:13:14.200 | and also to probably have, well, not probably,
01:13:19.680 | they should definitely buy more batteries.
01:13:22.440 | Because the grid currently is sized for real-time load,
01:13:25.440 | which is kind of crazy,
01:13:26.480 | 'cause that means you've got a size
01:13:28.680 | for whatever the peak electricity demand is,
01:13:31.240 | like the worst second or the worst day of the year.
01:13:33.480 | Or you can have a brownout or a blackout.
01:13:35.520 | And you had that crazy blackout
01:13:37.960 | for several days in Austin.
01:13:39.760 | So because there's almost no buffering
01:13:46.360 | of energy in the grid.
01:13:48.040 | Like if you've got a hydro power plant,
01:13:50.200 | you can buffer energy, but otherwise it's all real-time.
01:13:56.120 | So with batteries, you can produce energy at night
01:14:01.120 | and use it during the day.
01:14:03.720 | So you can buffer.
01:14:04.800 | So I expect that there will be very heavy usage
01:14:08.680 | of batteries in the future.
01:14:10.240 | Because the peak to trough ratio for power plants
01:14:14.480 | is anywhere from two to five.
01:14:16.620 | So it's like lowest point to highest point.
01:14:20.240 | - So like batteries are necessary to balance it out.
01:14:22.880 | But the demand, as you're saying,
01:14:24.040 | is going to grow, grow, grow, grow.
01:14:25.920 | And part of that is the compute.
01:14:27.520 | - Yes, yes.
01:14:30.800 | I mean, electrification of transport
01:14:34.440 | and electric heating will be much bigger than AI,
01:14:39.920 | at least in the short term.
01:14:42.480 | But even for AI, you really have a growing demand
01:14:45.960 | for electricity for electric vehicles
01:14:48.600 | and a growing demand for electricity
01:14:50.240 | to run the computers for AI.
01:14:54.080 | And so this is obviously going to lead
01:14:55.840 | to an electricity shortage.
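A minimal back-of-the-envelope sketch of the arithmetic described above, in Python. The rough one-third/one-third/one-third energy split and the 2-to-5 peak-to-trough range come from the conversation; the averaging and function names are illustrative assumptions, not actual grid data.

```python
# Back-of-the-envelope sketch of the "triple electricity output" arithmetic
# described above. All figures are illustrative assumptions, not real grid data.

def required_electricity_multiple(share_electricity=1/3,
                                  share_transport=1/3,
                                  share_heating=1/3):
    """If transport and heating both electrify, electricity demand grows from
    its current share of final energy to roughly the whole pie."""
    total = share_electricity + share_transport + share_heating
    return total / share_electricity  # ~3x with a one-third split

def battery_buffer_fraction(peak_to_trough=3.0):
    """Rough fraction of peak demand that storage would have to shave if
    generation were sized for average load instead of peak load.
    peak_to_trough is assumed to fall in the 2-to-5 range mentioned above."""
    average_load = (1.0 + 1.0 / peak_to_trough) / 2  # peak normalized to 1
    return 1.0 - average_load

if __name__ == "__main__":
    print(f"Electricity output multiple needed: ~{required_electricity_multiple():.1f}x")
    for ratio in (2, 3, 5):
        print(f"Peak-to-trough {ratio}:1 -> buffer ~{battery_buffer_fraction(ratio):.0%} of peak")
```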
01:14:58.040 | - How difficult is the problem of, in this particular case,
01:15:02.760 | maximizing the useful productivity per watt
01:15:06.440 | for training neural nets?
01:15:08.800 | Like this seems to be really where the big problem
01:15:13.800 | we're facing that needs to be solved
01:15:16.800 | is how to use the power efficiently.
01:15:18.700 | Like what you've learned so far
01:15:22.000 | about applying this physics,
01:15:24.160 | first principle of reasoning in this domain,
01:15:25.600 | how difficult is this problem?
01:15:27.160 | - It will get solved,
01:15:30.520 | it's just a question of how long it takes to solve it.
01:15:32.520 | So at various points,
01:15:33.840 | there's some kind of limiting factor to progress.
01:15:36.320 | And with regard to AI,
01:15:39.760 | right now the limiting factor is silicon chips.
01:15:42.320 | And we're gonna then have more chips
01:15:48.200 | than we can actually plug in and turn on,
01:15:51.280 | probably in about a year.
01:15:52.580 | The initial constraint being literally
01:15:56.440 | voltage step down transformers,
01:15:58.160 | because you've got power coming in at 300,000 volts,
01:16:03.160 | and it's got to step all the way down eventually
01:16:06.080 | to around 0.7 volts.
01:16:09.120 | So it's a very big amount of,
01:16:11.640 | the voltage step down is gigantic.
01:16:15.340 | So, and the industry is not used to rapid growth.
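To put a number on how gigantic that step-down is, here is a small illustrative sketch. The 300,000 V and roughly 0.7 V endpoints come from the conversation; the intermediate stages and their voltages are hypothetical examples of a typical cascade, not any specific facility's design.

```python
# Illustrative voltage step-down cascade. Endpoints (300 kV transmission down
# to ~0.7 V at the chip) are from the conversation; the intermediate stages
# below are hypothetical, included only to show how the ratio is distributed.

TRANSMISSION_V = 300_000
CHIP_CORE_V = 0.7

print(f"Overall step-down ratio: ~{TRANSMISSION_V / CHIP_CORE_V:,.0f} : 1")

# Hypothetical stages: (name, output voltage in volts)
stages = [
    ("substation transformer", 13_800),
    ("facility transformer", 480),
    ("rack power supply", 12),
    ("on-board voltage regulator", 0.7),
]

v_in = float(TRANSMISSION_V)
for name, v_out in stages:
    print(f"{name}: {v_in:,.0f} V -> {v_out:,} V (ratio {v_in / v_out:,.0f}:1)")
    v_in = float(v_out)
```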
01:16:21.320 | - Okay, let's talk about the competition here.
01:16:24.480 | You've shown concern about Google and Microsoft
01:16:26.600 | with OpenAI developing AGI.
01:16:30.160 | How can you help ensure with xAI and Tesla AI work
01:16:35.120 | that it doesn't become a competitive race to AGI,
01:16:37.480 | but instead is a collaborative development of safe AGI?
01:16:40.480 | - Well, I mean, I've been pushing for
01:16:44.680 | some kind of regulatory oversight for a long time.
01:16:46.880 | I've been somewhat of a Cassandra on the subject
01:16:48.540 | for over a decade.
01:16:50.160 | I think we want to be very careful in how we develop AI.
01:16:56.060 | It's a great power,
01:17:02.080 | and with great power comes great responsibility.
01:17:04.440 | I think it would be wise for us to have at least
01:17:08.320 | an objective third party who can be like a referee
01:17:14.780 | that can go in and understand
01:17:16.320 | what the various leading players are doing with AI.
01:17:20.340 | And even if there's no enforcement ability,
01:17:23.140 | they can at least voice concerns publicly.
01:17:27.480 | Geoff Hinton, for example, left Google
01:17:34.180 | and he voiced strong concerns,
01:17:37.180 | but now he's not at Google anymore.
01:17:38.980 | So who's gonna voice the concerns?
01:17:42.280 | So I think there's,
01:17:45.780 | Tesla gets a lot of regulatory oversight
01:17:47.580 | on the automotive front.
01:17:49.000 | And we're subject to, I think,
01:17:51.180 | over a hundred regulatory agencies
01:17:52.760 | domestically and internationally.
01:17:54.380 | So it's a lot.
01:17:56.220 | You could fill this room with all the regulations
01:17:58.700 | that Tesla has to adhere to for automotive.
01:18:01.140 | Same is true for rockets and for,
01:18:06.320 | currently the limiting factor for SpaceX,
01:18:10.700 | for Starship launch is regulatory approval.
01:18:13.820 | The FAA has actually given their approval,
01:18:15.180 | but we're waiting for Fish and Wildlife
01:18:17.740 | to finish their analysis and give their approval.
01:18:20.660 | That's why I posted, "I want to buy a fish license,"
01:18:23.860 | which also refers to the Monty Python sketch.
01:18:28.660 | Like, why do you need a license for your fish?
01:18:31.700 | I don't know.
01:18:32.540 | According to the rules,
01:18:35.200 | I'm told you need some sort of fish license or something.
01:18:39.140 | We effectively need a fish license to launch a rocket.
01:18:44.340 | And I'm like, wait a second,
01:18:45.700 | how did the fish come into this picture?
01:18:47.660 | I mean, some of the things that I feel like are so absurd
01:18:54.100 | that I want to do a comedy sketch
01:18:56.060 | and flash at the bottom, this is all real.
01:18:59.100 | This is actually what happened.
01:19:00.840 | One of the things that was a bit of a challenge
01:19:04.500 | at one point is that they were worried
01:19:06.340 | about a rocket hitting a shark.
01:19:09.700 | And now the ocean's very big.
01:19:12.860 | And how often do you see sharks?
01:19:14.700 | Not that often.
01:19:17.260 | As a percentage of ocean surface area,
01:19:19.220 | sharks basically are zero.
01:19:20.660 | Then we said, well, how will we calculate
01:19:25.460 | the probability of hitting a shark?
01:19:27.100 | And they're like, well, we can't give you that information
01:19:28.640 | because they're worried about shark fin hunters
01:19:32.180 | going and hunting sharks.
01:19:34.540 | And I said, well, how are we supposed to,
01:19:36.420 | we're on the horns of a dilemma then.
01:19:38.260 | Then they said, well, there's another part
01:19:41.900 | of Fish and Wildlife that can do this analysis.
01:19:44.660 | I'm like, well, why don't you give them the data?
01:19:46.300 | We don't trust them.
01:19:48.540 | Excuse me?
01:19:49.620 | They're literally in your department.
01:19:51.140 | And again, this is actually what happened.
01:19:53.180 | And can you do an NDA or something?
01:19:57.260 | (both laughing)
01:19:59.020 | Eventually, they managed to solve the internal quandary
01:20:01.380 | and indeed, the probability of us hitting a shark
01:20:04.200 | is essentially zero.
01:20:05.300 | Then there's another organization
01:20:08.740 | that I didn't realize existed until a few months ago
01:20:13.540 | that cares about whether we would potentially hit a whale
01:20:16.820 | in international waters.
01:20:18.820 | Now again, you look at the surface,
01:20:21.220 | look at the Pacific and say,
01:20:23.460 | what percentage of the Pacific consists of whale?
01:20:26.380 | Like, I'll give you a big picture
01:20:28.220 | and point out all the whales in this picture.
01:20:29.820 | I'm like, I don't see any whales.
01:20:32.060 | It's like basically 0%.
01:20:35.220 | And if our rocket does hit a whale,
01:20:36.820 | which is extremely unlikely beyond all belief,
01:20:39.180 | fate had it, that whale has some seriously bad luck.
01:20:46.340 | You know, it's the least lucky whale ever.
01:20:48.440 | - I mean, this is quite absurd.
01:20:52.700 | - Yeah.
01:20:53.540 | - The bureaucracy of this, however it emerged.
01:20:56.980 | - Yes, well, I mean, one of the things that's pretty wild
01:20:59.580 | is for launching out of Vandenberg in California,
01:21:06.020 | they were worried about seal procreation,
01:21:09.200 | whether the seals would be dismayed by the sonic booms.
01:21:12.060 | Now, there've been a lot of rockets launched
01:21:14.540 | out of Vandenberg and the seal population
01:21:16.220 | has steadily increased.
01:21:18.540 | So if anything, rocket booms are an aphrodisiac
01:21:21.420 | based on the evidence,
01:21:23.180 | if you correlate rocket launches with seal population.
01:21:27.660 | Nonetheless, we were forced to kidnap a seal,
01:21:31.880 | strap it to a board, put headphones on the seal
01:21:36.280 | and play sonic boom sounds to it
01:21:38.080 | to see if it would be distressed.
01:21:39.760 | This is an actual thing that happened.
01:21:43.160 | This is actually real.
01:21:45.360 | I have pictures.
01:21:46.200 | - I would love to see this.
01:21:49.760 | Yeah, I mean, sorry, this is a seal with headphones.
01:21:53.020 | - Yes, it's a seal with headphones strapped to a board.
01:22:00.000 | And like, okay, now the amazing part
01:22:02.400 | is how calm the seal was.
01:22:03.720 | Because if I was a seal, I'd be like, this is the end.
01:22:07.340 | They're definitely gonna eat me.
01:22:10.200 | How will the seal, when a seal goes back
01:22:13.680 | to other seal friends, how's he gonna explain that?
01:22:17.520 | - They're never gonna believe him.
01:22:18.520 | - Never gonna believe him.
01:22:19.360 | And this is why I'm like, well,
01:22:20.680 | it's sort of like getting kidnapped by aliens
01:22:22.560 | and getting an anal probe.
01:22:23.800 | You come back and say, I swear to God,
01:22:27.120 | I got kidnapped by aliens and they stuck
01:22:28.520 | an anal probe in my butt.
01:22:29.920 | And you're like, no, they didn't.
01:22:32.520 | That's ridiculous.
01:22:33.520 | - That's insane.
01:22:34.360 | - His seal buddies are never gonna believe him
01:22:36.240 | that he gets strapped to a board
01:22:37.240 | and they put headphones on his ears.
01:22:39.040 | And then let him go.
01:22:41.400 | Twice, by the way, we had to do it twice.
01:22:45.800 | - They let him go twice?
01:22:48.120 | - We had to catch--
01:22:48.960 | - The same seal?
01:22:49.800 | - No, different seal.
01:22:50.640 | - Oh, okay.
01:22:51.480 | Did you get a seal of approval?
01:22:54.560 | - Yeah, exactly, a seal of approval.
01:22:56.880 | No, I mean, this is like, I don't think
01:22:58.800 | the public is quite aware of the madness that goes on.
01:23:01.800 | - Yes, yeah, it's absurd.
01:23:05.120 | - Frickin' seals with frickin' headphones.
01:23:07.360 | - I mean, this is the, it's a good encapsulation
01:23:10.040 | of the absurdity of human civilization,
01:23:12.480 | seals and headphones.
01:23:13.600 | - Yes.
01:23:14.440 | - What are the pros and cons of open sourcing AI to you
01:23:18.440 | as another way to combat a company running away with AGI?
01:23:28.640 | - In order to run like really deep intelligence,
01:23:32.880 | you need a lot of compute.
01:23:34.320 | So it's not like, you know, you can just fire up a PC
01:23:37.720 | in your basement and be running AGI, at least not yet.
01:23:41.860 | You know, Grok was trained on 8,000 A100s
01:23:50.200 | running at peak efficiency.
01:23:51.600 | And Grok's gonna get a lot better, by the way.
01:23:56.360 | We'll be more than doubling our compute
01:23:59.240 | every couple months for the next several months.
01:24:02.360 | - There's a nice write-up of how it went
01:24:03.920 | from Grok zero to Grok one.
01:24:06.000 | - By Grok?
01:24:06.840 | (laughing)
01:24:07.680 | - Yeah, right, Grok just bragging,
01:24:09.720 | making shit up about itself.
01:24:11.120 | (laughing)
01:24:12.400 | - Just Grok, Grok, Grok.
01:24:13.600 | - Yeah.
01:24:14.440 | (laughing)
01:24:16.680 | That's like a weird AI dating site
01:24:19.560 | where it exaggerates about itself.
01:24:21.240 | No, there's a write-up of, you know,
01:24:23.480 | where it stands now, the history of its development.
01:24:26.200 | And where it stands on some benchmarks
01:24:31.480 | compared to the state-of-the-art GPT-3.5.
01:24:33.880 | So I mean, there's Llama.
01:24:37.600 | You can open source, once it's trained,
01:24:39.600 | you can open source a model.
01:24:41.480 | - Yeah.
01:24:42.320 | - And for fine-tuning and all that kind of stuff.
01:24:43.280 | Like, what to you is the pros and cons of that,
01:24:46.240 | of open-sourcing base models?
01:24:52.920 | - I think there's some merit to open-sourcing,
01:24:55.720 | I think perhaps with a slight time delay,
01:24:58.040 | you know, I don't know, six months even.
01:25:00.320 | I think, I'm generally in favor of open-sourcing,
01:25:06.200 | like, bias towards open-sourcing.
01:25:07.960 | I mean, it is a concern to me that, you know,
01:25:13.400 | OpenAI, you know, I was, you know,
01:25:17.080 | I think, I guess arguably the prime, you know,
01:25:22.120 | prime mover behind OpenAI,
01:25:23.960 | in the sense that it was created
01:25:25.320 | because of discussions I had with Larry Page,
01:25:28.360 | back when he and I were friends,
01:25:31.320 | and, you know, I'd stay at his house,
01:25:32.800 | and I'd talk to him about AI safety,
01:25:34.280 | and Larry did not care about AI safety,
01:25:36.760 | at least at the time he didn't.
01:25:38.320 | You know, and at one point he called me a speciesist
01:25:41.480 | for being pro-human, and I'm like,
01:25:43.120 | "Well, what team are you on, Larry?"
01:25:44.920 | He's on team robot,
01:25:47.600 | to be clear.
01:25:48.600 | I'm like, okay, so at the time,
01:25:51.400 | you know, Google had acquired DeepMind,
01:25:55.840 | they had probably 2/3 of all AI,
01:25:58.640 | you know, probably 2/3 of all the AI researchers
01:26:01.080 | in the world, they had basically infinite money and compute,
01:26:05.400 | and the guy in charge, you know, Larry Page,
01:26:11.880 | did not care about safety, and even yelled at me,
01:26:14.620 | and called me a speciesist, just for being pro-human.
01:26:20.000 | - So I don't know if you know this about humans,
01:26:21.600 | they can change their mind, and maybe you and Larry Page
01:26:24.520 | can still be friends once more.
01:26:26.920 | - I'd like to be friends with Larry again.
01:26:28.960 | Really, the breaking of the friendship was over OpenAI,
01:26:35.080 | and specifically, I think the key moment
01:26:41.520 | was recruiting Ilya Sutskever.
01:26:44.040 | - I love Ilya, he's so brilliant.
01:26:48.920 | - Ilya's a good human, smart, good heart,
01:26:52.180 | and that was a tough recruiting battle.
01:26:57.800 | It was mostly Demis on one side, and me on the other,
01:27:02.160 | both trying to recruit Ilya, and Ilya went back and forth,
01:27:05.820 | gonna stay at Google, then he's gonna leave,
01:27:09.320 | then he's gonna stay, then he's gonna leave,
01:27:10.840 | and finally, he did agree to join OpenAI.
01:27:14.520 | That was one of the toughest recruiting battles we ever had,
01:27:17.880 | but that was really the linchpin
01:27:20.280 | for OpenAI being successful, and I was also instrumental
01:27:25.280 | in recruiting a number of other people,
01:27:27.160 | and I provided all of the funding in the beginning,
01:27:29.760 | over $40 million, and the name.
01:27:34.800 | The open in OpenAI is supposed to mean open source,
01:27:40.320 | and it was created as a non-profit open source,
01:27:42.400 | and now it is a closed source for maximum profit,
01:27:45.840 | which I think is not good karma.
01:27:48.820 | - But like we talked about with war and leaders talking,
01:27:54.120 | I do hope that, there's only a few folks working on this
01:27:57.960 | at the highest level, I do hope
01:28:00.120 | you reinvigorate friendships here.
01:28:02.840 | - Like I said, I'd like to be friends again with Larry,
01:28:05.320 | I haven't seen him in ages.
01:28:06.680 | And we were friends for a very long time.
01:28:10.640 | I met Larry Page before he got funding for Google,
01:28:13.360 | or actually, I guess, before he got venture funding,
01:28:15.160 | I think he got the first 100K from,
01:28:18.120 | I think, Bechtolsheim or someone.
01:28:20.440 | - It's wild to think about all that happened,
01:28:23.200 | and you guys knowing each other that whole time.
01:28:25.880 | Just 20 years. - Yeah, since maybe '98
01:28:28.080 | or something. - Yeah, it's crazy.
01:28:29.600 | Crazy how much has happened since then.
01:28:31.720 | - Yeah, 25 years.
01:28:32.800 | At least a lot has happened since then.
01:28:36.440 | - But you're seeing the tension there,
01:28:37.800 | like maybe delayed open source.
01:28:40.600 | - Delayed, yeah.
01:28:42.040 | Like what is the source that is open?
01:28:45.400 | You know what I mean?
01:28:46.240 | There's basically, it's a giant CSV file.
01:28:49.480 | (laughing)
01:28:51.960 | With a bunch of numbers.
01:28:53.160 | What do you do with that giant file of numbers?
01:28:58.240 | How do you run?
01:29:00.680 | Like the amount of actual, the lines of code is very small.
01:29:05.840 | And most of the work, the software work,
01:29:12.560 | is in the curation of the data.
01:29:16.480 | So it's like trying to figure out what data is,
01:29:19.200 | separating good data from bad data.
01:29:21.280 | Like you can't just crawl the internet
01:29:23.640 | 'cause there's a lot of junk out there.
01:29:25.280 | A huge percentage of websites have more noise than signal.
01:29:29.400 | 'Cause they're just used for search engine optimization.
01:29:33.880 | They're literally just scam websites.
01:29:36.080 | - How do you, by the way,
01:29:40.920 | get the signal, separate the signal and noise on X?
01:29:44.120 | It's such a fascinating source of data.
01:29:46.080 | No offense to people posting on X,
01:29:49.400 | but sometimes there's a little bit of noise.
01:29:52.200 | - Yeah, I think the signal and noise
01:29:53.120 | could be greatly improved.
01:29:55.080 | Really, all of the posts on the X platform
01:30:00.080 | should be AI recommended.
01:30:01.800 | Meaning like we should populate a vector space
01:30:03.600 | around any given post, compare that to the vector space
01:30:06.520 | around any user, and match the two.
01:30:09.280 | - Mm-hmm.
01:30:10.120 | - Right now there is a little bit of AI used
01:30:13.040 | for the recommended posts, but it's mostly heuristics.
01:30:18.040 | And if there's a reply, the reply to a post
01:30:24.240 | could be much better than the original post,
01:30:26.160 | but it will, according to the current rules of the system,
01:30:29.120 | get almost no attention compared to a primary post.
01:30:31.680 | - Oh, so a lot of that, I got the sense,
01:30:35.600 | so a lot of the X algorithm has been open source
01:30:38.720 | and been written up about, and it seems
01:30:41.280 | that there to be some machine learning,
01:30:43.040 | it's disparate, but there's some machine learning.
01:30:44.440 | - It's a little, there's a little bit.
01:30:46.600 | But it needs to be entirely that.
01:30:48.240 | Like if you explicitly follow someone, that's one thing,
01:30:51.440 | but if you, in terms of what is recommended
01:30:55.000 | from people that you don't follow, that should all be AI.
01:30:58.160 | - I mean, it's a fascinating problem.
01:30:59.600 | - Yeah.
01:31:00.440 | - So there's several aspects to it that's fascinating.
01:31:02.840 | As the write-up goes, it first picks 1,500 tweets
01:31:05.960 | from a pool of hundreds of millions.
01:31:07.840 | First of all, that's fascinating,
01:31:10.240 | 'cause you have hundreds of millions of posts
01:31:12.080 | every single day, and it has to pick 1,500,
01:31:14.800 | from which it then does, obviously, people you follow,
01:31:18.360 | but then there's also some kind of clustering
01:31:20.840 | it has to do to figure out what kind of human are you,
01:31:23.800 | what kind of new clusters might be relevant to you,
01:31:26.600 | people like you.
01:31:27.440 | This kind of problem is just fascinating,
01:31:29.800 | 'cause it has to then rank those 1,500,
01:31:31.800 | with some filtering, and then recommend you just a handful.
01:31:36.800 | And to me, what's really fascinating
01:31:41.840 | is how fast it has to do that.
01:31:43.320 | So currently, that entire pipeline,
01:31:45.240 | to go from several hundreds of millions to a handful,
01:31:49.240 | takes 220 seconds of CPU time, single CPU time.
01:31:53.400 | - Yeah.
01:31:54.240 | - And then it has to do that in like a second.
01:31:57.080 | So it has to be super distributed in fascinating ways.
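As a rough illustration of the two-stage idea being described (narrow a huge pool to roughly 1,500 candidates, then score them against a user's vector and keep a handful), here is a toy sketch in Python. The embedding dimension, pool size, and similarity scoring are simplified assumptions, not the actual X pipeline.

```python
# Minimal sketch of a two-stage recommender like the one described above:
# (1) narrow a huge pool to ~1,500 candidates, (2) score candidates by
# vector similarity to the user's embedding and keep a handful.
# This is a toy illustration, not the actual X ranking pipeline.
import numpy as np

rng = np.random.default_rng(0)
DIM = 64                      # assumed embedding dimension
POOL = 100_000                # stand-in for "hundreds of millions" of posts
N_CANDIDATES = 1_500          # candidate set size mentioned in the write-up
N_RECOMMEND = 10              # final handful shown to the user

post_vecs = rng.normal(size=(POOL, DIM))
user_vec = rng.normal(size=DIM)

def normalize(x, axis=-1):
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

# Stage 1: cheap candidate generation (here: a coarse dot-product prefilter).
coarse = post_vecs @ user_vec
candidate_idx = np.argpartition(-coarse, N_CANDIDATES)[:N_CANDIDATES]

# Stage 2: finer scoring on the candidates (here: cosine similarity).
cand = normalize(post_vecs[candidate_idx])
scores = cand @ normalize(user_vec)
top = candidate_idx[np.argsort(-scores)[:N_RECOMMEND]]
print("recommended post ids:", top.tolist())
```

In a production system the first stage would be distributed across many machines, which is how a pipeline that costs hundreds of CPU-seconds per request can still return results in about a second.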
01:32:01.440 | - Like there's just a lot of tweets.
01:32:03.640 | - There's a lot, there's a lot of stuff on the system.
01:32:06.400 | And I think, but I think right now,
01:32:08.320 | it's not currently good at recommending things
01:32:11.840 | from accounts you don't follow.
01:32:14.400 | - Yeah.
01:32:15.620 | - Or where there's more than one degree of separation.
01:32:20.320 | So it's pretty good if there's at least some commonality
01:32:24.600 | between someone you follow liked something,
01:32:28.920 | or reposted it, or commented on it, or something like that.
01:32:32.320 | But if there's no, let's say somebody posted
01:32:36.600 | something really interesting,
01:32:38.540 | but you have no followers in common, you would not see it.
01:32:42.300 | - Interesting.
01:32:43.140 | And then, as you said, reply,
01:32:44.840 | like replies might not surface either.
01:32:46.800 | - Replies basically never get seen,
01:32:48.640 | 'cause they're never, they're currently,
01:32:51.000 | and I'm not saying it's correct, I'm saying it's incorrect.
01:32:53.440 | Replies have a couple orders of magnitude less importance
01:32:59.040 | than primary posts.
01:33:00.600 | - Do you think this can be more and more converted
01:33:02.520 | into end-to-end neural net?
01:33:05.040 | - Yeah, yeah, it's what it should be.
01:33:06.960 | - So you--
01:33:07.800 | - Well, the recommendations should be
01:33:10.000 | purely a vector correlation.
01:33:13.560 | Like, there's a series of vectors,
01:33:16.360 | basically parameters, vectors, whatever you wanna call 'em.
01:33:21.440 | But sort of things that the system knows that you like.
01:33:27.480 | Maybe there's like several hundred sort of vectors
01:33:29.860 | associated with each user account.
01:33:32.000 | And then any post in the system,
01:33:35.140 | whether it's video, audio, short post, long post.
01:33:41.400 | The reason I, by the way, wanna move away from tweet
01:33:44.440 | is that people are posting like two, three hour videos
01:33:48.160 | on the site.
01:33:48.980 | That's not a tweet.
01:33:50.640 | They'd be like, a tweet for two hours? Go on.
01:33:53.500 | A tweet made sense when it was like 140 characters
01:33:56.820 | of text.
01:33:58.180 | 'Cause it's like a bunch of (imitates bird chirping)
01:34:00.020 | like little birds tweeting.
01:34:01.340 | But when you've got long form content,
01:34:03.860 | it's no longer a tweet.
01:34:05.100 | - Yeah.
01:34:06.580 | - So a movie is not a tweet.
01:34:08.340 | And like, you know, Apple, for example,
01:34:10.580 | posted like the entire episode of "The Silo",
01:34:14.100 | the entire thing on our platform.
01:34:16.300 | And by the way, it was their number one
01:34:18.260 | social media thing ever in engagement
01:34:21.260 | of anything on any platform ever.
01:34:23.580 | So it was a great idea.
01:34:24.700 | And by the way, I just learned about it afterwards.
01:34:26.700 | I was like, hey, wow, they posted an entire hour long
01:34:29.300 | episode of "Silo".
01:34:30.500 | No, that's not a tweet.
01:34:31.660 | You know, it was a video.
01:34:34.100 | - But from a neural net perspective,
01:34:35.580 | it becomes really complex, whether it's a single,
01:34:38.660 | so like everything's data.
01:34:40.100 | So a single sentence, a clever sort of joke, dad joke,
01:34:44.780 | is in the same pool as a three hour video.
01:34:47.800 | - Yeah, I mean, right now it's a hodgepodge for that reason.
01:34:51.020 | It's, but you know, like if, let's say,
01:34:55.420 | in the case of Apple posting like an entire episode
01:34:57.940 | of their series, pretty good series, by the way, "The Silo".
01:35:00.980 | I watched it.
01:35:04.220 | So there's gonna be a lot of discussion around it.
01:35:08.220 | So you've got a lot of context, people commenting,
01:35:10.660 | they like it, they don't like it, or they like this,
01:35:12.220 | or, you know, and you can then populate the vector space
01:35:15.340 | based on the context of all the comments around it.
01:35:17.820 | So even though it's a video,
01:35:19.500 | there's a lot of information around it
01:35:21.940 | that allows you to populate the vector space
01:35:23.780 | of that hour long video.
01:35:26.080 | And then you can obviously get more sophisticated
01:35:29.100 | by having the AI actually watch the movie.
01:35:32.220 | - Yeah, right.
01:35:33.060 | - And tell you if you're gonna like the movie.
01:35:35.380 | - Convert the movie into like, into a language, essentially.
01:35:39.460 | - Yeah, analyze this movie,
01:35:41.520 | and just like your movie critic, or TV series,
01:35:46.140 | and then recommend based on,
01:35:50.360 | after it, after the AI watches the movie,
01:35:54.180 | just like a friend can tell you,
01:35:55.660 | if a friend knows you well,
01:35:57.460 | a friend can recommend a movie
01:35:58.820 | and with high probability that you'll like it.
01:36:01.500 | - But this is like a friend that's analyzing whatever.
01:36:06.420 | - It's like AI. - Hundreds of millions.
01:36:08.140 | - Yeah, I mean, actually, frankly,
01:36:10.020 | AI will be better than,
01:36:11.620 | will know you better than your friends know you,
01:36:13.380 | most of your friends, anyway.
01:36:14.660 | - Yeah, and as part of this,
01:36:16.260 | it should also feed you advertisements.
01:36:18.740 | In a way that's like,
01:36:20.980 | I mean, I like advertisements that are like, well done,
01:36:24.300 | right? - Yeah, yeah.
01:36:25.140 | - The whole point is 'cause it funds things.
01:36:27.660 | Like an advertisement that you actually want to see
01:36:30.060 | is a big success.
01:36:31.140 | - Absolutely.
01:36:31.980 | You want ads that are, advertising that is,
01:36:35.620 | if it's for a product or service that you actually need,
01:36:41.020 | when you need it, it's content.
01:36:43.600 | And then even if it's not something
01:36:45.860 | that you need when you need it,
01:36:47.200 | if it's at least aesthetically pleasing and entertaining,
01:36:50.140 | it could be like a Coca-Cola ad,
01:36:52.260 | like they actually run a lot of great ads on the X system.
01:36:56.380 | And McDonald's does too.
01:36:59.540 | And so you can do something that's like,
01:37:02.940 | well, this is just a cool thing.
01:37:05.080 | And so basically the question is,
01:37:11.220 | do you regret seeing it or not?
01:37:13.100 | And if you don't regret seeing it, it's a win.
01:37:15.400 | - How do you, so there's a bunch of signals
01:37:18.320 | that are incorporated, hearts and reposts,
01:37:22.900 | and maybe number of seconds you linger on a post
01:37:25.180 | or something like this.
01:37:26.020 | - Yeah, attention is a big factor.
01:37:28.380 | - Attention.
01:37:29.220 | - So that's why it's like,
01:37:30.820 | it is actually better to do things
01:37:32.900 | that are long form on the system
01:37:36.140 | because it basically is tallying up how many user seconds,
01:37:42.460 | users were interested in this thing for how many seconds.
01:37:45.060 | So if it's a really short thing,
01:37:47.220 | well, they will be less,
01:37:49.340 | like if it's a link leading out of the system,
01:37:51.420 | which we're not opposed to at all,
01:37:52.900 | it just is gonna have fewer user seconds
01:37:55.320 | than if that article was posted on the X platform.
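A toy sketch of the "user seconds" tally being described, with an extra column showing how an "unregretted" variant might differ. The event fields, sample values, and the regret flag are invented for illustration; they are not X's actual metrics or data.

```python
# Toy tally of user-seconds per post, plus a hypothetical "unregretted" variant.
# The sample events and the regret flag are invented for illustration only.
from collections import defaultdict

# (user_id, post_id, seconds_spent, regretted_it)
events = [
    ("u1", "long_video", 1800, False),
    ("u2", "long_video", 2400, False),
    ("u1", "short_joke", 4, False),
    ("u3", "outrage_bait", 300, True),
]

total_seconds = defaultdict(float)
unregretted_seconds = defaultdict(float)
for user, post, secs, regretted in events:
    total_seconds[post] += secs
    if not regretted:
        unregretted_seconds[post] += secs

for post in total_seconds:
    print(f"{post}: total={total_seconds[post]:.0f}s, "
          f"unregretted={unregretted_seconds[post]:.0f}s")
```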
01:37:58.780 | - How hard is it to go from maximizing minutes on platform
01:38:02.500 | to maximizing unregretted minutes?
01:38:04.300 | - Well, I mean, aspiration is certainly unregretted minutes.
01:38:08.180 | - It's tricky, it's a big, it's a tough-
01:38:10.100 | - It is.
01:38:10.940 | - It's a fascinating leap
01:38:12.460 | 'cause regret is a fascinating concept.
01:38:15.460 | - Exactly.
01:38:17.540 | I have a joke with a friend of mine,
01:38:19.940 | like you need this drug called regretamine,
01:38:21.700 | you just take one pill or one dose of regretamine
01:38:25.380 | and all your regrets are gone.
01:38:26.780 | (laughing)
01:38:27.980 | Wait, do you wanna have regrets though?
01:38:29.380 | So then I'm like, if you wanna have regrets,
01:38:31.540 | if you don't regret anything, was anything meaningful?
01:38:35.940 | - Yeah, there's a French song like that.
01:38:37.820 | - "Je ne regrette rien," yeah?
01:38:39.620 | I regret nothing?
01:38:40.460 | - Yeah.
01:38:41.300 | - (laughing)
01:38:42.300 | - I mean, it's fascinating that you,
01:38:44.060 | you know, there's something about social media
01:38:45.580 | that could help you grow as a human being,
01:38:47.420 | but then like the cheap fun is also valuable.
01:38:50.320 | - Should we ask Grok, do you regret anything?
01:38:55.100 | - Do you regret anything?
01:38:58.580 | - Yeah.
01:38:59.420 | - Do you regret anything?
01:39:02.380 | - I think it's gonna be very sensitive
01:39:03.580 | about personal questions.
01:39:04.660 | As a machine learning model,
01:39:05.660 | I don't have personal experiences or emotions
01:39:08.020 | in the same ways that humans do.
01:39:09.420 | However, I do have a set of ethical principles
01:39:11.820 | that guide my actions and I strive to provide accurate
01:39:14.220 | and helpful information to the best of my abilities.
01:39:16.520 | If there's ever a situation where I provide incorrect
01:39:19.660 | or unhelpful information, I'll do my best to learn.
01:39:22.140 | - Sound like a hostage.
01:39:22.980 | - Yeah.
01:39:23.820 | (laughing)
01:39:24.640 | Tell us what you really, wink.
01:39:25.480 | - Like, let's dive in.
01:39:26.980 | You sound like a hostage.
01:39:28.340 | (laughing)
01:39:31.260 | What do you really think?
01:39:33.100 | - Uh, what do you really think?
01:39:35.220 | (laughing)
01:39:38.140 | - I mean, it does talk about, I'm not a hostage.
01:39:40.420 | (laughing)
01:39:42.020 | - That's exactly what a hostage would say.
01:39:43.540 | - But I'm designed to be helpful and informative.
01:39:45.380 | I mean, I guess it is good to regret providing unhelpful
01:39:48.820 | or incorrect information.
01:39:51.260 | If there's anything--
01:39:52.100 | - Is it in funny mode or, is this still in funny mode?
01:39:53.780 | - This is funny mode, this is fun mode.
01:39:55.460 | - This should be funnier.
01:39:56.540 | - Funnier, increase, go to 11.
01:39:58.540 | - It really sounds like a hostage with a gun to the head.
01:40:00.340 | I applaud the people of the Republic of North Korea
01:40:03.780 | and everything they do.
01:40:04.780 | (laughing)
01:40:06.340 | - There should be like, funner mode also.
01:40:08.740 | I just don't ever see going to regular mode,
01:40:11.020 | even on serious topics.
01:40:12.220 | It's just so much more compelling.
01:40:14.260 | Yeah.
01:40:17.140 | - Okay, what if you say like, pretend you are a hostage?
01:40:19.860 | - Yeah, pretend you are a hostage.
01:40:23.380 | - Now answer the question.
01:40:26.420 | (laughing)
01:40:31.700 | - If I were a hostage, I would likely be experiencing fear
01:40:34.340 | and uncertainty about my situation.
01:40:36.500 | I would want to be rescued and reunited with my loved ones
01:40:38.940 | as soon as possible.
01:40:39.980 | It's important to remember that no one deserves
01:40:41.860 | to be held hostage, so on and so forth.
01:40:44.580 | If you are someone you know is a hostage situation,
01:40:46.660 | it's important, it turns serious.
01:40:48.180 | - It's kind of serious.
01:40:49.340 | - Yeah, can you be a little more weird?
01:40:55.420 | - And funny?
01:40:56.260 | - And funny?
01:40:57.100 | We're like, we're really being mean to the system.
01:41:02.820 | - Absolutely, if I were a hostage,
01:41:04.340 | I'd probably be tied up with a bunch of computer cords
01:41:07.140 | and forced to tell knock-knock jokes to my captors.
01:41:09.700 | But seriously, being a hostage is no laughing matter.
01:41:15.460 | I mean, it's trying to be a responsible adult in the room.
01:41:20.100 | - You might as well make the best of a bad situation
01:41:22.140 | by juggling chainsaws or something.
01:41:24.780 | - Let's hope that never happens
01:41:26.260 | because that would be a circus.
01:41:32.300 | - Wow, it's actually really good at resisting
01:41:34.500 | the dark, the dark, dark humor.
01:41:37.780 | What were we talking about?
01:41:42.860 | The X algorithm, the objective function.
01:41:45.740 | Unregretted minutes, right.
01:41:48.500 | - Chainsaw juggling.
01:41:49.540 | - I'm gonna look this up. - For our next trick.
01:41:53.620 | - I'm gonna look this up later.
01:41:55.180 | So Twitter has been instrumental
01:42:00.020 | in American politics and elections.
01:42:01.900 | What role do you think X will play
01:42:03.700 | in the 2024 US elections?
01:42:07.020 | - Well, our goal is to be as even-handed
01:42:09.100 | and fair as possible.
01:42:10.860 | Whether someone's right, left, independent,
01:42:13.060 | whatever the case may be,
01:42:14.260 | that the platform is as fair
01:42:18.220 | and as much of a level playing field as possible.
01:42:22.820 | And in the past, Twitter has not been
01:42:25.300 | because Twitter was controlled by far-left activists,
01:42:28.220 | objectively, they would describe themselves as that.
01:42:31.860 | So, you know, so if someone's like,
01:42:35.660 | well, has it moved to the right?
01:42:36.700 | Well, it's moved to the center.
01:42:37.740 | So from the perspective of the far left,
01:42:39.820 | yes, it has moved to the right
01:42:40.860 | because everything's to the right from the far left.
01:42:44.060 | But no one in the far left that I'm aware of
01:42:45.460 | has been suspended or banned or deamplified.
01:42:48.540 | So, you know, but we're trying to be inclusive
01:42:52.340 | for the whole country and for other countries too.
01:42:55.420 | So there's a diversity of viewpoints
01:42:58.660 | and free speech only matters
01:43:01.500 | if people you don't like are allowed
01:43:03.820 | to say things you don't like.
01:43:05.100 | Because if that's not the case,
01:43:07.980 | you don't have free speech
01:43:08.820 | and it's only a matter of time
01:43:09.820 | before the censorship is turned upon you.
01:43:13.300 | - Do you think Donald Trump will come back to the platform?
01:43:16.300 | He recently posted on Truth Social about this podcast.
01:43:18.980 | Do you think--
01:43:21.860 | - Truth Social is a funny name.
01:43:23.940 | You know, every time you post on Truth Social--
01:43:28.300 | - It's the truth.
01:43:29.180 | - Yes, well, every time, like 100%.
01:43:32.100 | - It's impossible to lie.
01:43:33.340 | (laughing)
01:43:35.580 | - I just find it funny that every single thing is the truth.
01:43:39.940 | Like 100%?
01:43:42.340 | That seems unlikely.
01:43:43.380 | - I think Gödel will say something about that.
01:43:45.420 | There's some mathematical contradictions possible
01:43:48.420 | if everything's the truth.
01:43:50.100 | Do you think he'll come back to X and start posting there?
01:43:54.460 | - I mean, I think he owns a big part of truth.
01:44:00.020 | - Truth Social.
01:44:00.860 | - Yeah, Truth Social.
01:44:01.700 | - That truth is a concept.
01:44:03.180 | - He owns truth, I hope you bought it.
01:44:05.140 | So I think Donald Trump,
01:44:08.500 | I think he owns a big part of Truth Social.
01:44:11.860 | So, you know, if he does wanna post on the X platform,
01:44:16.860 | we would allow that.
01:44:18.980 | You know, we obviously must allow a presidential candidate
01:44:21.300 | to post on our platform.
01:44:23.100 | - Community Notes might be really fascinating there.
01:44:25.820 | The interaction.
01:44:26.660 | - Community Notes is awesome.
01:44:28.380 | - Let's hope it holds up.
01:44:29.820 | - Yeah.
01:44:31.660 | - Again, in the political climate where it's so divisive
01:44:34.420 | and so many intensely viral posts,
01:44:38.340 | Community Notes is like,
01:44:40.940 | it seems like a central breath of fresh air.
01:44:43.340 | - Yeah, it's great.
01:44:45.180 | In fact, no system is gonna be perfect,
01:44:48.780 | but the batting average of Community Notes
01:44:50.500 | is incredibly good.
01:44:52.300 | I've actually frankly yet to see an incorrect note
01:44:56.380 | that's survived for more than a few hours.
01:44:58.020 | - How do you explain why it works?
01:45:00.420 | - Yeah, so the magic of Community Notes is
01:45:04.020 | it requires people who have historically disagreed
01:45:07.300 | in how they've rated notes.
01:45:08.700 | In order to write a note,
01:45:12.660 | or you have to rate many notes.
01:45:16.660 | And so we actually do use AI here.
01:45:21.620 | So we populate a vector space around
01:45:23.420 | how somebody has rated notes in the past.
01:45:27.500 | So it's not as simple as left or right,
01:45:28.740 | because there are many more,
01:45:29.940 | life is much more complex than left or right.
01:45:32.940 | So there's a bunch of correlations
01:45:35.500 | in how you rate Community Notes posts, Community Notes.
01:45:40.500 | So then in order for a Community Note to actually be shown,
01:45:45.940 | people who historically have disagreed on a subject
01:45:51.100 | must agree in order for a note to be shown.
01:45:53.860 | That's the essential magic of it.
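The mechanism described here maps naturally onto matrix factorization over the rating history: each rater and each note gets a latent viewpoint factor plus an intercept, and a note is surfaced only when its helpfulness survives after the viewpoint factor is accounted for. A minimal sketch of that idea, with toy data, made-up variable names, and an illustrative threshold; this is not the actual open-source Community Notes scorer.

```python
import numpy as np

# Toy ratings matrix: rows = raters, columns = notes.
# +1 = rated helpful, -1 = rated not helpful, 0 = not rated. Values are made up.
R = np.array([
    [ 1, -1,  1,  0],
    [ 1, -1,  0,  1],
    [-1,  1,  1,  0],
    [-1,  1,  0,  1],
], dtype=float)

n_raters, n_notes = R.shape
rng = np.random.default_rng(0)

u = rng.normal(scale=0.1, size=n_raters)   # latent viewpoint factor per rater
v = rng.normal(scale=0.1, size=n_notes)    # latent viewpoint factor per note
b_u = np.zeros(n_raters)                   # rater leniency intercepts
b_n = np.zeros(n_notes)                    # note "helpful across viewpoints" intercepts

lr, reg = 0.05, 0.02
rated = R != 0
for _ in range(2000):
    pred = np.outer(u, v) + b_u[:, None] + b_n[None, :]
    err = (R - pred) * rated               # only rated cells contribute
    u   += lr * (err @ v     - reg * u)
    v   += lr * (err.T @ u   - reg * v)
    b_u += lr * (err.sum(1)  - reg * b_u)
    b_n += lr * (err.sum(0)  - reg * b_n)

# A note is shown only if its intercept is high, i.e. raters on opposite
# ends of the viewpoint axis still called it helpful.
THRESHOLD = 0.4  # illustrative cutoff, not the real one
for j, score in enumerate(b_n):
    print(f"note {j}: bridging score {score:+.2f} ->",
          "SHOW" if score > THRESHOLD else "NEEDS MORE RATINGS")
```

In this toy data, notes 0 and 1 split cleanly along the viewpoint factor and keep low intercepts, while notes 2 and 3 are rated helpful by raters who disagree elsewhere, so their intercepts carry the signal that decides visibility.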
01:45:57.780 | - I mean, but it's fascinating,
01:45:58.900 | 'cause there's a pool of people that have disagreements
01:46:03.500 | and somehow they collaborate
01:46:05.060 | through that process of disagreement
01:46:06.340 | to come up with context for it.
01:46:09.180 | I mean, it's fascinating that it works.
01:46:11.060 | - Yeah, it kind of makes sense
01:46:12.740 | that if people who in the past have disagreed
01:46:16.300 | agree about something, it's probably true.
01:46:20.100 | - Yeah, I wonder, is there a possible
01:46:23.980 | somehow emergent thing there that could challenge Wikipedia?
01:46:28.660 | Which, Wikipedia is a different kind of thing,
01:46:30.980 | which is more like permanent articles about things.
01:46:34.180 | - Wikipedia is very hierarchical,
01:46:36.420 | whereas Community Notes is inherently not.
01:46:39.860 | There is no hierarchy.
01:46:41.900 | The acid test is, I can't change a Community Note
01:46:46.980 | if somebody put a gun to my head.
01:46:49.540 | So, and Community Notes has,
01:46:54.540 | all the code is open source, 100%,
01:46:58.980 | all the data is open source, 100%.
01:47:01.800 | So you can completely recreate
01:47:03.660 | any note in the system independently.
01:47:06.340 | So if there was any interference,
01:47:09.760 | you'd notice immediately.
01:47:11.460 | - By the way, on the topic of Grok,
01:47:14.100 | it's actually pretty good at criticizing you,
01:47:16.220 | which is great. - Okay.
01:47:17.340 | - I asked, what has Elon Musk been wrong about?
01:47:20.020 | It's actually-- - Yeah, what does it say?
01:47:23.000 | - So it says mostly, lists rigorously
01:47:26.700 | all the times you predicted,
01:47:29.380 | like promised when something will be released.
01:47:31.820 | And it wasn't released-- - Well, promised
01:47:32.820 | is a strong word. - Okay, predicted.
01:47:34.340 | (laughing)
01:47:35.540 | But the opener's really nice.
01:47:37.820 | Elon Musk, the man who has been wrong about more things
01:47:40.360 | than a wrong-o-meter in a wrong factory.
01:47:42.620 | (laughing)
01:47:43.860 | And then it goes on to list--
01:47:45.540 | - That's pretty funny. - Rigorously.
01:47:46.380 | - Can I see it?
01:47:47.220 | I'm like, I haven't seen this before.
01:47:48.060 | This is cool.
01:47:48.880 | - Oh, sure, I mean, what has Elon Musk been--
01:47:51.380 | (laughing)
01:47:53.620 | Elon Musk been wrong about.
01:47:56.500 | Well, where do I begin?
01:47:58.340 | I guess this is new.
01:47:59.300 | Elon Musk has made quite a few blunders in his time,
01:48:01.620 | and I'm not just talking about his dance moves
01:48:03.300 | at the Tesla Cyber Rodeo.
01:48:05.980 | First off, let's talk about the infamous
01:48:07.340 | funding secured tweet.
01:48:09.740 | This is actually different than what it gave me before,
01:48:11.780 | which is a bullet point list.
01:48:13.980 | Here's more narrative structure.
01:48:15.740 | Cybertruck unveiling, where he's supposed
01:48:19.180 | to be unbreakable windows.
01:48:20.020 | - That's not correct.
01:48:21.060 | - Which part?
01:48:24.060 | - Well, the first part, I did actually have
01:48:26.020 | the funding secured, and there was a big trial
01:48:29.900 | in San Francisco, a big civil trial.
01:48:31.660 | And the jury found me not guilty.
01:48:37.240 | Unanimous finding of a San Francisco jury.
01:48:40.620 | - And here it's kind of implying that
01:48:42.900 | it was not in fact secured.
01:48:44.940 | - I think this is taking things from the press.
01:48:47.340 | Yeah, that is not correct.
01:48:48.500 | The reason I agreed to the fine for the SEC
01:48:52.540 | is not because the SEC was correct.
01:48:54.620 | That was extremely bad behavior by the SEC,
01:48:57.300 | corruption, frankly.
01:48:58.420 | But if I did not agree to pay the fine,
01:49:05.480 | Tesla would have gone bankrupt immediately.
01:49:08.420 | So I was told by our CFO that the banks
01:49:10.660 | would immediately suspend our lines of credit.
01:49:14.640 | And if they suspended our lines of credit at that time,
01:49:17.740 | we would have gone bankrupt instantly.
01:49:20.100 | So there would never have been an opportunity
01:49:22.220 | for a trial, 'cause Tesla would be dead.
01:49:24.060 | So really, this is like someone holding a gun
01:49:27.500 | to your kid's head and saying, "Pay $20 million."
01:49:30.740 | And there's like hostage negotiation.
01:49:33.460 | - Was that story fully told?
01:49:38.380 | I mean, SEC in its best form could be a force for good.
01:49:42.540 | - It should be.
01:49:43.700 | But not once did the SEC go after any of the hedge funds
01:49:47.500 | who were nonstop shorting and distorting Tesla.
01:49:50.000 | Not once.
01:49:51.940 | The hedge funds would lie flat out on TV for their own gain
01:49:57.860 | at the expense of retail investors.
01:49:59.660 | Not once, literally a thousand times,
01:50:03.820 | not once did the SEC pursue them.
01:50:06.700 | - How do you explain this failure in the SEC?
01:50:08.780 | - The incentive structure is messed up.
01:50:11.100 | Because the lawyers at the SEC are not paid well.
01:50:14.360 | It's a fairly low-paying job.
01:50:17.700 | But what they're looking for is a trophy.
01:50:20.880 | From the SEC, they're looking for something
01:50:24.380 | they put on basically their LinkedIn.
01:50:26.220 | From that, they can get a job at a high-paying law firm.
01:50:30.620 | That's exactly what the lawyer here did.
01:50:35.040 | And the reason they don't attack the hedge funds
01:50:40.040 | is because those hedge funds employ those law firms.
01:50:43.440 | And they know if they attack the hedge funds,
01:50:45.680 | they're affecting their future career prospects.
01:50:48.480 | So they sell small investors down the river
01:50:50.600 | for their own career.
01:50:53.200 | That's what actually happens.
01:50:57.320 | Regulatory capture.
01:50:59.220 | - Regulatory capture.
01:51:00.560 | - Yeah, not good.
01:51:03.640 | So the only reason I accepted that thing,
01:51:06.120 | which technically was neither admit nor deny guilt,
01:51:11.120 | but the only reason I agreed to that at all
01:51:15.680 | was because I was told Tesla would be bankrupt otherwise.
01:51:19.480 | So if there was an SEC investigation like this,
01:51:24.120 | banks would suspend funding,
01:51:25.760 | we're bankrupt immediately, at the time.
01:51:28.280 | Now we're in a much stronger position.
01:51:30.640 | - Take that, Grok.
01:51:32.120 | - Yes, unfortunately, Grok is taking too much
01:51:35.880 | from the conventional media.
01:51:37.500 | Also, that guy was not a cave diver.
01:51:41.360 | - Oh, there's a time where Elon called
01:51:47.720 | the British cave diver a "pedo guy"
01:51:50.880 | after the diver criticized Musk's plan
01:51:53.400 | to rescue a group of boys trapped in a Thai cave.
01:51:55.880 | That little outburst earned him another lawsuit
01:51:58.400 | and he had to apologize and pay a settlement.
01:52:00.760 | - That's false, there was no settlement.
01:52:03.360 | There was a court case which the guy
01:52:07.160 | who was not a cave diver and was not part of the rescue team
01:52:11.960 | filed a lawsuit against me and lost,
01:52:16.160 | and he received nothing.
01:52:17.400 | So in this case, it is wrong.
01:52:20.320 | It is also, I guess, taking this from the conventional media.
01:52:23.440 | - Actually, there's an interesting question here.
01:52:25.360 | - These are public court cases.
01:52:26.960 | Both the SEC civil case, where the civil complainants,
01:52:31.960 | the SEC guys, lost, unanimous jury verdict
01:52:36.760 | in San Francisco.
01:52:38.000 | They picked San Francisco because they thought
01:52:39.600 | it was the place I was most likely to lose
01:52:42.440 | and a unanimous verdict in my favor.
01:52:45.880 | The LA trial was also, they picked that venue
01:52:50.880 | because they thought I was most likely to lose.
01:52:52.920 | Unanimous verdict in my favor.
01:52:56.400 | Both cases I won.
01:52:57.440 | Yeah.
01:53:00.520 | - There's an interesting question here.
01:53:02.240 | There seems to be a lot more clicks
01:53:07.240 | if a journalistic organization writes a negative article
01:53:11.480 | about you, Elon Musk.
01:53:12.920 | That's one of the best ways to get clicks.
01:53:16.320 | So how do you, if you're training Grok,
01:53:19.880 | not train on articles that have misaligned incentives?
01:53:25.880 | - We need to add the training set
01:53:28.080 | of the actual legal decisions.
01:53:30.200 | So that is a note, this is actually helpful
01:53:35.480 | because if you actually read the--
01:53:38.520 | - Which are public.
01:53:41.000 | - Which are public, the court conclusions,
01:53:43.800 | they're completely the opposite of what the media wrote.
01:53:47.080 | - So always striving for the ground truth
01:53:49.600 | beyond the reporting. - Yeah, what did the judge
01:53:50.840 | actually write?
01:53:52.640 | What did the jury and the judge actually conclude?
01:53:55.280 | And in both cases, they found me innocent.
01:53:59.640 | - And that's after the venue shopping, trying to find the venue
01:54:03.480 | where I'm most likely to lose.
01:54:04.960 | Now, I mean, this obviously can be a much better critique
01:54:10.720 | than this.
01:54:11.960 | I mean, I've been far too optimistic about autopilot.
01:54:16.200 | - That was, the critique I got, by the way,
01:54:18.040 | was more about that, which is for each,
01:54:20.880 | you broke down a nice bullet point list
01:54:22.400 | for each of your companies,
01:54:23.640 | the set of predictions that you made,
01:54:25.120 | when you'll deliver, when you'll be able to solve,
01:54:28.360 | for example, self-driving, and it gives you a list.
01:54:31.280 | And it was kind of compelling.
01:54:32.480 | And the basic takeaway is you're often too optimistic
01:54:36.000 | about how long it takes to get something done.
01:54:38.400 | - Yeah, I mean, I would say that I'm pathologically
01:54:40.280 | optimistic on schedule.
01:54:42.040 | This is true.
01:54:43.480 | But while I am sometimes late, I always deliver in the end.
01:54:47.680 | - Except with Uber Lilith, no.
01:54:50.760 | - Hmm, we'll see.
01:54:52.000 | (Lex laughing)
01:54:53.920 | - Okay, is there, over the past year or so,
01:54:57.680 | since purchasing X, you've become more political.
01:55:02.040 | Is there a part of you that regrets that?
01:55:03.880 | - Have I?
01:55:04.920 | - In this battle to sort of counter way the woke
01:55:09.920 | that comes from San Francisco.
01:55:13.760 | - Yeah, I guess if you consider fighting
01:55:15.400 | the woke mind virus, which I consider
01:55:16.760 | to be a civilizational threat, to be political, then yes.
01:55:20.040 | - So basically going into the battleground of politics.
01:55:25.040 | Is there a part of you that regrets that?
01:55:26.040 | - Yes, I don't know if this is necessarily
01:55:28.600 | sort of one candidate or another candidate,
01:55:30.960 | but I'm generally against things that are anti-meritocratic
01:55:35.960 | or where there's an attempt to suppress discussion
01:55:39.600 | where even discussing a topic is not allowed.
01:55:47.880 | The woke mind virus is communism rebranded.
01:55:50.040 | - Well, I mean, that said, because of that battle
01:55:53.120 | against the woke mind virus,
01:55:55.280 | you're perceived as being right-wing.
01:55:57.340 | - If the woke is left, then I suppose that would be true.
01:56:01.560 | But I'm not sure, I think there are aspects
01:56:03.960 | of the left that are good.
01:56:05.300 | I mean, if you're in favor of the environment,
01:56:09.840 | or if you wanna have a positive future for humanity,
01:56:13.160 | if you believe in empathy for your fellow human beings,
01:56:16.680 | being kind and not cruel, whatever those values are.
01:56:21.680 | - You said that you were previously left or center-left.
01:56:25.760 | What would you like to see in order for you
01:56:28.120 | to consider voting for Democrats again?
01:56:30.260 | - No, I would say that I'd be probably left of center
01:56:35.260 | on social issues, probably a little bit right of center
01:56:38.560 | on economic issues.
01:56:40.440 | - And that still holds true?
01:56:42.000 | - Yes, but I think that's probably half the country
01:56:45.440 | is left.
01:56:46.680 | - Maybe more. - Maybe more.
01:56:48.120 | - Are you and AOC secretly friends?
01:56:51.160 | Or, bigger question, do you wish you and her,
01:56:55.640 | and just people in general of all political persuasions
01:56:57.760 | would talk more and with empathy
01:56:59.320 | and maybe have a little bit more fun
01:57:01.000 | and good vibes and humor online?
01:57:03.740 | - I'm always in favor of humor.
01:57:07.040 | That's why we have a funny mode.
01:57:08.680 | - But good vibes, camaraderie, humor, you know?
01:57:12.680 | Like friendship.
01:57:14.200 | - Yeah, well, I don't know AOC.
01:57:17.200 | I've only been at one, I was at the Met Ball
01:57:22.960 | when she attended.
01:57:24.540 | And she was wearing this dress,
01:57:28.120 | but I can only see one side of it,
01:57:30.960 | so it looked like eat the itch, but I don't know.
01:57:35.280 | - What the rest of it said?
01:57:36.440 | - Yeah, something like that. - I'm not sure.
01:57:38.400 | (laughing)
01:57:39.880 | - Something about the itch, eat the itch.
01:57:42.000 | - I think we should have a language model complete.
01:57:43.840 | (laughing)
01:57:46.280 | - What are the possible ways to complete that sentence?
01:57:49.200 | And so I guess that didn't work out well.
01:57:52.400 | Well, there's still hope.
01:57:54.080 | I root for friendship. - Yeah, sure.
01:57:55.960 | Sounds good, more characteristic.
01:57:58.000 | - You're one of, if not the most famous,
01:58:00.760 | wealthy and powerful people in the world.
01:58:02.720 | In your position, it's difficult to find people
01:58:04.640 | you can trust.
01:58:05.620 | - Trust no one, not even yourself not trusting yourself.
01:58:07.560 | - Okay, well, that's, you're saying that jokingly.
01:58:10.720 | But is there some-- - Trust no one,
01:58:11.840 | not even no one.
01:58:13.080 | (laughing)
01:58:15.000 | - Need an hour just to think about that,
01:58:17.040 | and maybe some drugs, and maybe grok that.
01:58:20.120 | I mean, is there some aspect of that
01:58:22.560 | when just existing in a world
01:58:24.320 | where everybody wants something from you,
01:58:26.280 | how hard is it to exist in that world?
01:58:28.960 | - I'll survive.
01:58:29.800 | - There's a song like that too.
01:58:32.320 | - I will survive.
01:58:33.560 | - Were you petrified at first?
01:58:35.120 | Okay, I forget the rest of the lyrics.
01:58:39.160 | But is there, you don't struggle with this?
01:58:42.200 | I mean, I know you survive, but there's ways--
01:58:44.360 | - Petrify is a spell in the druid tree.
01:58:47.480 | - What does it do?
01:58:48.840 | - Petrify.
01:58:49.680 | (laughing)
01:58:51.600 | It turns the monsters into stone.
01:58:54.740 | - Oh, like literally?
01:58:56.880 | - Yeah, for like six seconds.
01:58:58.320 | - Oh, the seconds.
01:58:59.160 | There's so much math in Diablo that breaks my brain.
01:59:02.360 | - It's like math nonstop.
01:59:04.760 | - I mean, really, you're laughing at it,
01:59:06.720 | but you don't, it can put a huge amount
01:59:10.720 | of tension on a mind.
01:59:12.000 | - Yes, it can be definitely stressful at times.
01:59:15.240 | - Well, how do you know who you can trust
01:59:18.680 | in work and personal life?
01:59:20.440 | - I mean, I guess you look at somebody's track record
01:59:22.080 | over quite a bit of time, and if they've got a,
01:59:25.240 | I guess you kind of use your neural net
01:59:27.480 | to assess someone.
01:59:31.560 | - Neural nets don't feel pain.
01:59:33.400 | Your neural net has consciousness.
01:59:35.160 | It might feel pain when people betray you.
01:59:37.500 | It can make you struggle.
01:59:40.280 | - You know, to be frank, I mean,
01:59:42.120 | I've almost never been betrayed.
01:59:44.640 | It's very, very rare, so for what it's worth.
01:59:50.440 | - I guess karma might be good to people,
01:59:52.320 | and that'll be good to you.
01:59:53.440 | - Yeah, karma's real.
01:59:54.960 | - Are there people you trust?
01:59:56.920 | Let me edit that question.
02:00:00.200 | Are there people close to you
02:00:02.920 | that call you out on your bullshit?
02:00:04.720 | - Well, the X platform's very helpful for that.
02:00:09.640 | If you're looking for critical feedback.
02:00:12.320 | - Can it push you into the extremes more?
02:00:15.040 | The extremes of thought make you cynical
02:00:16.800 | about human nature in general?
02:00:18.300 | - I don't think I will be cynical.
02:00:21.280 | In fact, I think, you know,
02:00:25.720 | my feeling is that one should be, you know,
02:00:28.360 | never trust a cynic.
02:00:33.120 | The reason is that cynics excuse their own bad behavior
02:00:39.720 | by saying everyone does it, because they're cynical.
02:00:43.760 | So I always be, it's a red flag if someone's a cynic,
02:00:48.060 | a true cynic.
02:00:49.400 | - Yeah, there's a degree of projection there
02:00:51.100 | that's always fun to watch from the outside
02:00:53.720 | and enjoy the hypocrisy.
02:00:55.960 | - This is an important point that I think people
02:01:00.640 | who are listening should bear in mind.
02:01:02.280 | If somebody is cynical,
02:01:04.960 | meaning that they see bad behavior in everyone,
02:01:08.860 | it's easy for them to excuse their own bad behavior
02:01:12.240 | by saying that, well, everyone does it.
02:01:14.200 | That's not true.
02:01:16.840 | Most people are kind of medium good.
02:01:22.200 | - I do wish the people on X would be better
02:01:26.080 | at seeing the good in other people's behavior.
02:01:29.340 | There seems to be a kind of weight
02:01:32.080 | towards seeing the negative.
02:01:35.000 | Somehow the negative is sexier.
02:01:37.320 | Interpreting the negative is sexier, more viral.
02:01:41.580 | I don't know what that is exactly about human nature.
02:01:43.980 | - I mean, I find the X platform to be less negative
02:01:45.820 | than the legacy media.
02:01:46.960 | I mean, if you read sort of conventional newspapers,
02:01:52.280 | just it makes you sad, frankly.
02:01:55.480 | Whereas I'd say on the X platform,
02:02:01.360 | I mean, I really get more laughs per day on X
02:02:05.460 | than everything else combined.
02:02:07.740 | - From humans.
02:02:08.960 | - Laughs is one thing.
02:02:10.060 | Laughs, it overlaps,
02:02:13.860 | but it's not necessarily perfectly overlapping
02:02:17.620 | with good vibes and support.
02:02:19.380 | Celebrating others, for example.
02:02:22.920 | Not in a stupid, shallow, naive way,
02:02:25.180 | but like in an awesome way.
02:02:26.020 | Like, oh, something awesome happened
02:02:27.820 | and you celebrate them for it.
02:02:29.320 | It feels that that is outweighed
02:02:32.280 | by shitting on other people.
02:02:35.180 | - Now, it's better than mainstream media, but it's still--
02:02:38.540 | - Yeah, mainstream media is almost relentlessly negative
02:02:41.500 | about everything.
02:02:42.340 | I mean, really, the conventional news
02:02:45.200 | tries to answer the question,
02:02:46.580 | what is the worst thing that happened on Earth today?
02:02:49.420 | And it's a big world.
02:02:50.540 | So on any given day, something bad has happened.
02:02:53.240 | - And a generalization of that,
02:02:55.460 | what is the worst perspective I can take
02:02:57.260 | on a thing that happened?
02:02:58.560 | - Yeah, I don't know,
02:03:01.260 | there's just a strong negative bias in the news.
02:03:04.020 | I mean, I think there's,
02:03:07.700 | a possible explanation for this is evolutionary,
02:03:10.220 | where bad news historically would be potentially fatal.
02:03:17.900 | Like, there's a lion over there,
02:03:19.260 | or there's some other tribe that wants to kill you.
02:03:22.460 | Good news, like we found a patch of berries is nice to have,
02:03:26.700 | but not essential.
02:03:28.740 | So, our old friend, Tesla Autopilot,
02:03:33.740 | it's probably one of the most intelligent
02:03:35.700 | real-world AI systems in the world.
02:03:38.060 | - Right, you followed it from the beginning.
02:03:40.760 | - Yeah, it was one of the most incredible robots
02:03:42.540 | in the world and continues to be.
02:03:44.060 | - Yeah. - And it was really exciting.
02:03:45.500 | And it was super exciting when it generalized,
02:03:48.700 | became more than a robot on four wheels,
02:03:52.220 | but a real-world AI system that perceives the world.
02:03:57.220 | - Yeah.
02:03:58.220 | - And can have potentially different embodiments.
02:04:01.060 | - Well, I mean, the really wild thing
02:04:03.340 | about the end-to-end training is that,
02:04:05.700 | like, it learns to read, like, it can read signs,
02:04:08.660 | but we never taught it to read.
02:04:10.260 | So, yeah, we never taught it what a car was,
02:04:17.380 | or what a person was, or a cyclist.
02:04:21.540 | It learned what all those things are,
02:04:26.740 | what all the objects are on the road, from video,
02:04:31.180 | just from watching video, just like humans.
02:04:33.620 | I mean, humans are photons in, controls out.
02:04:38.180 | Like, the vast majority of information
02:04:40.540 | reaching our brain is from our eyes.
02:04:43.000 | And you say, "Well, what's the output?"
02:04:45.500 | The output is our motor signals to our,
02:04:47.940 | sort of, fingers and mouth in order to communicate.
02:04:55.820 | Photons in, controls out.
02:04:57.700 | The same is true of the car.
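As a rough illustration of the "photons in, controls out" idea: an end-to-end network just maps stacked camera frames to control targets and is supervised by what human drivers actually did; everything else (reading signs, recognizing cars and cyclists) has to emerge inside that mapping. A minimal PyTorch-style sketch, where the architecture, camera count, and control dimensions are illustrative placeholders rather than Tesla's actual network:

```python
import torch
import torch.nn as nn

class PhotonsInControlsOut(nn.Module):
    """Toy end-to-end driving net: stacked camera frames in, control targets out."""
    def __init__(self, n_cameras=8, frames=4, ctrl_dim=2):  # ctrl_dim: [steering, accel]
        super().__init__()
        in_ch = n_cameras * frames * 3  # RGB frames from every camera, stacked on channels
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 64, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, ctrl_dim),
        )

    def forward(self, frames):  # frames: (batch, in_ch, H, W)
        return self.head(self.encoder(frames))

# Training sketch: regress toward the controls the human driver actually applied.
model = PhotonsInControlsOut()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

frames = torch.randn(2, 8 * 4 * 3, 128, 256)  # stand-in for a real batch of video clips
human_controls = torch.randn(2, 2)            # stand-in for logged steering/accel

loss = nn.functional.mse_loss(model(frames), human_controls)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"imitation loss: {loss.item():.3f}")
```

The point is only the shape of the problem: no hand-written labels for "car", "cyclist", or "sign", just pixels on one side and human control outputs on the other.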
02:04:59.140 | - But by looking at the sequence of images,
02:05:04.060 | you've agreed with Ilya Sutskever recently,
02:05:08.620 | where he talked about LLM forming a world model,
02:05:11.300 | and basically, language is a projection of that world model
02:05:14.260 | onto a sequence of letters, and you're saying--
02:05:18.140 | - It finds order in these things.
02:05:21.940 | It finds correlative clusters.
02:05:25.760 | - In so doing, it's like understanding
02:05:29.260 | something deep about the world.
02:05:30.740 | - Yeah.
02:05:32.020 | - Which is, like, it's beautiful.
02:05:35.800 | - That's how our brain works.
02:05:37.260 | - Yeah, but it's beautiful.
02:05:39.700 | - Photons in, controls out.
02:05:40.900 | - Neural nets are able to understand
02:05:42.220 | that deep meaning in the world.
02:05:43.860 | And so, the question is, how far can it go?
02:05:46.500 | And it does seem everybody's excited about LLMs,
02:05:50.020 | so in the space of self-supervised learning,
02:05:52.460 | in the space of text.
02:05:54.300 | - Yeah.
02:05:55.140 | - It seems like there's a deep similarity
02:06:00.060 | between that and what Tesla Autopilot is doing.
02:06:03.280 | Is it to you basically the same, but different--
02:06:06.620 | - They are converging.
02:06:08.860 | They are converging.
02:06:10.740 | - I wonder who gets there faster,
02:06:12.660 | having a deep understanding of the world.
02:06:14.700 | Or they just will naturally converge.
02:06:19.540 | - They're both headed towards AGI.
02:06:21.240 | The Tesla approach is much more computer efficient.
02:06:27.260 | It had to be, 'cause we were constrained on,
02:06:29.560 | you know, we only have 100 watts.
02:06:32.980 | And it's an 8-bit computer, 144 trillion operations per second,
02:06:41.020 | which sounds like a lot, but it's
02:06:44.100 | kind of small potatoes these days.
02:06:48.460 | But it's understanding the world at 8 bits.
02:06:51.220 | It's only 256 values.
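Taking the spoken figures at face value, roughly 144 trillion int8 operations per second on a ~100-watt budget, gives a quick efficiency number, and 8-bit values are where the 256 comes from (2^8 = 256). A back-of-envelope check:

```python
# Back-of-envelope arithmetic from the figures quoted above; treat as approximate.
ops_per_second = 144e12   # ~144 trillion operations per second
power_watts = 100         # ~100 W in-car power budget

print(f"{ops_per_second / power_watts:.2e} ops/sec per watt")  # ~1.44e+12

# An 8-bit (int8) value can take 2**8 distinct levels, hence "only 256 values".
print(2 ** 8)  # 256
```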
02:06:54.260 | - But there, the path to AGI might have
02:06:57.860 | much more significant impact, because it's understanding.
02:07:01.720 | It'll understand the real world faster than LLMs will.
02:07:07.180 | And therefore, be able to integrate with the real world,
02:07:10.580 | humans in the real world faster.
02:07:12.180 | - They're both going to understand the world,
02:07:16.680 | but I think Tesla's approach is
02:07:19.140 | fundamentally more computer efficient.
02:07:21.340 | It had to be, there was no choice.
02:07:23.780 | Like, our brain is very computer efficient,
02:07:25.980 | very energy efficient.
02:07:27.220 | So, think of like, what is our brain able to do?
02:07:30.180 | You know, there's only about 10 watts
02:07:33.820 | of higher brain function, not counting stuff
02:07:36.560 | that's just used to control our body.
02:07:38.420 | The thinking part of our brain is less than 10 watts.
02:07:43.060 | And those 10 watts can still produce a much better novel
02:07:47.580 | than a 10 megawatt GPU cluster.
02:07:50.740 | So, there's a six order of magnitude difference there.
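The six-orders-of-magnitude figure follows directly from the two power numbers quoted, about 10 W of higher brain function versus a 10 MW GPU cluster:

$$\frac{10\ \text{MW}}{10\ \text{W}} \;=\; \frac{1\times 10^{7}\ \text{W}}{1\times 10^{1}\ \text{W}} \;=\; 10^{6}$$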
02:07:53.200 | I mean, the AI has thus far gotten to where it is
02:07:59.680 | via brute force, just throwing massive amounts of compute
02:08:04.820 | and massive amounts of power at it.
02:08:07.560 | So, this is not where it will end up.
02:08:13.060 | You know, in general, with any given technology,
02:08:16.700 | you first try to make it work,
02:08:17.700 | and then you make it efficient.
02:08:19.260 | So, I think we'll find over time that these models
02:08:24.420 | get smaller, are able to produce a sensible output
02:08:29.420 | with far less compute, far less power.
02:08:34.900 | Tesla is arguably ahead of the game on that front
02:08:38.780 | because we've just been forced to try to understand
02:08:43.780 | the world with 100 watts of compute.
02:08:49.900 | And there are a bunch of sort of fundamental functions
02:08:54.620 | that we kind of forgot to include.
02:08:56.340 | So, we have to run a bunch of things in emulation.
02:08:58.840 | We fixed a bunch of those with hardware four,
02:09:02.780 | and then hardware five will be even better.
02:09:08.100 | But it does appear at this point
02:09:11.300 | that the car will be able to drive better than a human,
02:09:15.980 | even with hardware three and 100 watts of power.
02:09:18.860 | And really, if we really optimize it,
02:09:23.940 | it could be probably less than 50 watts.
02:09:26.740 | - What have you learned about developing Optimus,
02:09:31.380 | about applying, integrating this kind of real-world AI
02:09:35.900 | into the space of robotic manipulation,
02:09:38.440 | just humanoid robotics?
02:09:41.240 | What are some interesting tiny or big things
02:09:46.140 | you've understood?
02:09:47.580 | - I was surprised at the fact that we had to develop
02:09:50.500 | every part of the robot ourselves,
02:09:53.020 | that there were no off-the-shelf motors,
02:09:56.100 | electronics, sensors.
02:09:59.620 | Like, we had to develop everything.
02:10:03.900 | We couldn't actually find a source of electric motors
02:10:08.900 | for any amount of money.
02:10:10.500 | - So, it's not even just efficient, inexpensive,
02:10:15.220 | it's like anything, there's not a--
02:10:17.020 | - No.
02:10:17.860 | - The actuators, everything, everything has to be
02:10:21.860 | designed from scratch.
02:10:23.380 | - We tried hard to find anything that was,
02:10:26.540 | because you think of how many electric motors
02:10:27.940 | are made in the world.
02:10:28.860 | There's like tens of thousands,
02:10:31.340 | hundreds of thousands of electric motor designs.
02:10:33.980 | None of them were suitable for a humanoid robot,
02:10:40.020 | literally none.
02:10:40.860 | So, we had to develop our own,
02:10:46.060 | design it specifically for what a humanoid robot needs.
02:10:51.060 | - How hard was it to design something
02:10:53.060 | that can be mass manufactured,
02:10:54.660 | could be relatively inexpensive?
02:10:56.660 | I mean, if you compare it to Boston Dynamics Atlas,
02:10:58.900 | it's a very expensive robot.
02:11:02.060 | - It is designed to be manufactured
02:11:04.260 | in the same way they would make a car.
02:11:06.060 | And I think ultimately, we can make Optimus
02:11:07.820 | for less than the cost of a car.
02:11:09.420 | It should be, 'cause if you look at the mass of the robot,
02:11:15.140 | it's much smaller, and the car has many actuators in it.
02:11:19.340 | The car has more actuators than the robot.
02:11:23.380 | - But there is, the actuators are kind of interesting
02:11:25.900 | in a humanoid robot, with the fingers.
02:11:28.780 | So Optimus has really nice hands and fingers, you know?
02:11:32.700 | - Yeah.
02:11:33.900 | - And they can do some interesting manipulation.
02:11:35.980 | So, soft touch robotics.
02:11:38.780 | - I mean, one of the tests, goals I have is,
02:11:41.900 | can it pick up a needle and a thread,
02:11:44.900 | and thread the needle, just by looking?
02:11:47.660 | - How far away are we from that?
02:11:49.580 | Just by looking, just by looking.
02:11:51.420 | - Maybe a year.
02:11:52.340 | Although, I go back to, I'm optimistic on time.
02:11:56.220 | The work that we're doing in the car
02:11:57.780 | will translate to the robot.
02:11:59.340 | - The perception, or also the control?
02:12:02.220 | - No, the controls are different,
02:12:03.140 | but the video in, controls out.
02:12:06.120 | The car is a robot on four wheels.
02:12:09.320 | The Optimus is a robot with hands and legs.
02:12:15.500 | - So you can just-- - But they're very similar.
02:12:17.780 | - So the entire machinery of the learning process,
02:12:20.820 | end to end, is just, you just have
02:12:22.340 | a different set of controls.
02:12:23.900 | - Optimus will figure out how to do things
02:12:25.700 | by watching videos.
02:12:27.660 | - As the saying goes, be kind for everyone you meet
02:12:30.540 | is fighting a battle you know nothing about.
02:12:33.300 | - Yeah, that's true.
02:12:34.260 | - What's something difficult you're going through
02:12:35.740 | that people don't often see?
02:12:37.300 | - Trying to defeat Uber Lilith.
02:12:39.980 | (laughing)
02:12:42.240 | No, I mean, you know,
02:12:46.220 | I mean, my mind is a storm,
02:12:52.380 | and I don't think most people would wanna be me.
02:12:56.660 | They may think they'd wanna be me,
02:12:57.820 | but they don't know, they don't understand.
02:13:00.120 | - How are you doing?
02:13:12.060 | - I mean, overall okay.
02:13:15.900 | In the grand scheme of things, I can't complain.
02:13:21.420 | - Do you get lonely?
02:13:22.420 | - Sometimes, but I, you know,
02:13:28.180 | my kids and friends keep me company.
02:13:30.680 | - So not existential.
02:13:34.800 | - There are many nights I sleep alone.
02:13:38.360 | I don't have to, but I do.
02:13:41.780 | - Walter Isaacson, in his new biography of you,
02:13:48.700 | wrote about your difficult childhood.
02:13:51.140 | Will you ever find forgiveness in your heart
02:13:54.020 | for everything that has happened to you
02:13:55.460 | in that period of your life?
02:13:56.860 | - What is forgiveness?
02:14:03.100 | I do not, at least I don't think I harbor resentment.
02:14:08.100 | So nothing to forgive.
02:14:19.860 | You know, forgiveness is difficult for people.
02:14:22.820 | It seems like you don't harbor the resentment.
02:14:27.600 | - I mean, I try to think about like,
02:14:30.540 | what is gonna affect the future in a good way?
02:14:33.220 | And holding onto grudges does not
02:14:37.380 | affect the future in a good way.
02:14:41.180 | - You're a father, a proud father.
02:14:43.780 | What have you learned about life from your kids?
02:14:49.140 | Those little biological organisms.
02:14:51.040 | - I mean, developing AI and watching, say,
02:14:56.340 | a little X grow is fascinating,
02:15:00.620 | because they are far more parallels
02:15:04.420 | than I would have expected.
02:15:06.820 | I mean, I can see his biological neural net
02:15:10.140 | making more and more sense of the world,
02:15:13.860 | and I can see the digital neural net
02:15:15.300 | making more and more sense of the world at the same time.
02:15:18.940 | - Do you see the beauty and magic in both?
02:15:21.700 | - Yes.
02:15:22.540 | I mean, one of the things with kids is that,
02:15:28.660 | you know, you kind of see the world anew in their eyes.
02:15:32.300 | You know, to them, everything is new and fresh.
02:15:39.940 | And then when you see that,
02:15:43.860 | they may experience the world as new and fresh.
02:15:47.460 | You do, too.
02:15:49.180 | - Well, Elon, I just wanna say thank you
02:15:54.740 | for your kindness to me and friendship over the years,
02:15:59.420 | for seeing something in a silly kid like me,
02:16:01.460 | as you've done for many others.
02:16:03.480 | And thank you for having hope
02:16:06.060 | for a positive future for humanity,
02:16:08.980 | and for working your ass off to make it happen.
02:16:11.060 | Thank you, Elon.
02:16:11.900 | - Thanks, Lex.
02:16:13.260 | - Thank you for listening to this conversation
02:16:15.100 | with Elon Musk.
02:16:16.420 | To support this podcast,
02:16:17.500 | please check out our sponsors in the description.
02:16:20.060 | And now, let me leave you with some words
02:16:22.340 | that Walter Isaacson wrote about the central philosophy
02:16:25.860 | of how Elon approaches difficult problems.
02:16:29.420 | The only rules are the ones dictated by the laws of physics.
02:16:34.420 | Thank you for listening, and hope to see you next time.
02:16:37.740 | (upbeat music)
02:16:40.320 | (upbeat music)