Elon Musk: War, AI, Aliens, Politics, Physics, Video Games, and Humanity | Lex Fridman Podcast #400
Chapters
0:00 Introduction
0:07 War and human nature
4:33 Israel-Hamas war
10:41 Military-Industrial Complex
14:58 War in Ukraine
19:41 China
33:57 xAI Grok
44:55 Aliens
52:55 God
55:22 Diablo 4 and video games
64:29 Dystopian worlds: 1984 and Brave New World
70:41 AI and useful compute per watt
76:22 AI regulation
83:14 Should AI be open-sourced?
90:36 X algorithm
101:57 2024 presidential elections
114:55 Politics
117:57 Trust
123:29 Tesla's Autopilot and Optimus robot
132:28 Hardships
The following is a conversation with Elon Musk,
his fourth time on this, the Lex Fridman Podcast.
It's one of the greatest themes in all of film history.
So I was just thinking about the Roman Empire, as one does.
And half the population's confused whether it's true or not.
thinking about the wars going on in the world today.
has been a big part of Roman society and culture.
And I think it's been a big part of most empires
I mean, there's some like the Austro-Hungarian Empire
where there was just a lot of sort of clever marriages.
- But fundamentally, there's an engine of conquest.
To what degree do you think war is part of human nature
versus a consequence of how human societies are structured?
I ask this as you have somehow controversially
That's the real hard part, not fighting other humans.
some sort of peaceful thing, but in fact, it is not.
saying that it's like basically just murder and death
are constantly trying to kill and eat each other
So it's not like we're unusual in that respect.
comes greater control over these base instincts
And in fact, if one looks at, say, chimpanzee society,
Like you don't wanna be on the wrong side of a chimpanzee.
It'll eat your face off and tear your nuts off.
- Yeah, basically there's no limits or ethics.
- Yeah, chimpanzee society is like a primitive version
They're not like peace-loving, basically, at all.
somebody who's watched too many Disney movies
And then that eats their face or rips their nuts off
or chews their fingers off, that kind of thing.
- Ripping your nuts off is an interesting strategy
that the other chimp doesn't contribute to the gene pool.
- I wonder which of the martial arts teaches that.
you've been a proponent of peace on Twitter, on X.
So let me ask you about the wars going on today
How do you hope the current war in Israel and Gaza
What path do you see that can minimize human suffering
- Well, I think that part of the world is definitely,
It'll be like the picture of the Middle East,
They obviously did not expect to have a military victory,
but they really wanted to commit the worst atrocities
the most aggressive response possible from Israel.
to rally Muslims worldwide for the cause of Gaza
and Palestine, which they have succeeded in doing.
I think the thing that should be done,
in the most conspicuous acts of kindness possible.
That is the actual thing that would thwart the goal of Hamas.
- So in some sense, to the degree that makes sense
in geopolitics, turn the other cheek, implemented.
to find the Hamas members and either kill them
'cause they're just gonna keep coming otherwise.
But in addition to that, they need to do whatever they can.
There's some talk of establishing, for example,
that are unequivocal, meaning they can't be somehow,
'cause Hamas will then, their response will be,
oh, it's a trick, therefore you have to counter,
- This ultimately fights the broader force of hatred
they really believe in the whole eye for an eye thing.
if you're not gonna just outright commit genocide,
a lot of people alive who subsequently hate Israel.
you've not succeeded, that's the real situation there.
So, but I mean, this is one of the most contentious subjects
one has to look at this from the standpoint of,
over time, are there more or fewer terrorists being created?
and there's sort of the scope and scale of war,
but to imagine that there would not be any war in the future
The first book starts off with a gigantic galactic war
- Yeah, I mean, I guess if we are able to one day expand
It's like a race between the scale of suffering
- A lot of people seem to be using this tragedy
is that there are very few people alive today
who actually viscerally understand the horrors of war,
I mean, obviously there are people on the front lines
but how many people in the West understand it?
You know, my grandfather was in World War II.
I mean, he was there, I think, for almost six years
IQ test or something, and he scored very high.
He had to drop out of high school 'cause his dad died,
But then, just randomly, they gave him this test.
He was transferred to British Intelligence in London.
And he won a bunch of medals, never bragged about it once.
I found out about it because his military records
now he died 20 years ago, or longer, actually, 30 years ago.
How many people are alive that remember World War II?
- And the same perhaps applies to the threat of nuclear war.
- Yeah, I mean, there are enough nuclear bombs
pointed at the United States to make the rubble,
So you talked about the threat of AGI quite a bit.
of conflict going on, do you worry about nuclear war?
that the current probability of nuclear war is quite low.
And we have a lot of nukes pointed at other people.
the ones with the nukes talking to each other.
There are wars which are tragic and difficult
which are civilization-ending, or have that potential.
perhaps permanently, but certainly to severely
- How do you hope the war in Ukraine comes to an end?
- Well, I think that what is likely to happen,
is that something very close to the current lines
But, you know, you just have a situation right now
you've got minefields, trenches, anti-tank defenses.
So, you really just have long-range artillery
With drones, you know, throwing little drones,
and so more efficient at murdering people on both sides.
you don't wanna be trying to advance from either side
'cause the probability of dying is incredibly high.
you really need significant local superiority in numbers.
That's the only way you're gonna punch through a line.
I mean, the sheer number of people that died there
- And it's almost impossible to imagine the end of it
in terms of what land belongs to who, and so on.
But on the other side of a lot of human suffering,
And the loss of the flower of Ukrainian and Russian youth,
and we should have some sympathy for the Russian boys
'cause the Russian boys didn't ask to be on their front line.
There's a lot of sons not coming back to their parents.
it's like young boys who don't know each other
he's not interested in talking to Putin directly.
man to man, leader to leader, and negotiate peace?
Whether he talks to Putin or not, just don't do that.
- You've spoken honestly about the possibility
There's hardly a battle I haven't read about.
as opposed to what one side or another claimed
- Yeah, so Athens and Sparta is a classic case.
So, and we have a bunch of that writing that's preserved.
So we know what led up to the Peloponnesian War
they also weren't very verbose by their nature,
but they did write, but they weren't very verbose.
But the Athenians and the other Greeks wrote a lot.
and Sparta was really kind of like the leader of Greece,
that there's gonna be a clash between Athens and Sparta.
if one group, one civilization or country or whatever,
the United States has been the biggest kid on the block
since I think around 1890 from an economic standpoint.
perhaps three times larger than that of the US.
So, imagine you're the biggest kid on the block
and suddenly a kid comes along who's twice your size.
Is there some, let me throw something out there,
here in the United States, but you also work with China.
that people don't understand, positive, just in the culture?
how many smart, hardworking people are there in China,
there's far more of them there than there are here,
that in recent years is far more impressive than the US.
it's really far more impressive than what we have in the US.
I mean, I recommend somebody just go to Shanghai and Beijing
China's got an incredible history, a very long history.
in terms of the use of language from a written standpoint,
So, now China, historically, has always been,
with rare exception, internally focused.
I believe they lost about 70% of their population.
like civil wars that make the US Civil War look small
We sort of think of China as this sort of one entity
of one mind, and this is definitely not the case.
that people in China think about China 10 times more
than they think about anything outside of China.
when you're inward facing, which is like focusing
quote unquote, improving others through military might?
- The good news, the history of China suggests
Now, they do feel very strongly, so that's good,
Like after World War II, the US could have basically
And the United States could have taken over everything.
And the United States actually helped rebuild countries.
So it helped rebuild Europe, it helped rebuild Japan.
This is very unusual behavior, almost unprecedented.
but one needs to look at the whole track record.
you got the French, British, and American armies
but I recommend being a POW with the Americans.
to execute von Braun and all of the German rocket engineers.
They said they were going out for a walk in the woods.
They left in the middle of winter with no coats.
And they ran like, no food, no coats, no water,
and then just cycled west as fast as he could
So anyway, that's one way you can tell morality is,
while far from perfect, generally a benevolent force.
Neither country has been acquisitive in a significant way.
Now China does feel very strongly about Taiwan.
They've been very clear about that for a long time.
From their standpoint, it would be like one of the states
is not there, like Hawaii or something like that,
is their interests will clearly be materialized.
China has been very clear that they will incorporate Taiwan
- Now something you said about conspicuous acts of kindness.
As a geopolitical policy, it almost seems naive.
the path forward, how you will avoid most wars.
It just seems like conspicuous acts of kindness
So after World War I, they made a big mistake.
They basically tried to lump all the blame on Germany
and saddled Germany with impossible reparations.
You kill somebody's father, mother, son, daughter,
well, that Treaty of Versailles was a huge mistake
And so this time, instead of crushing the losers,
we're actually gonna help them with the Marshall Plan
We're gonna help rebuild Austria and Italy and whatnot.
- It does feel like there's a profound truth
to conspicuous acts of kindness being an antidote to this.
- Something must stop the cycle of reciprocal violence.
limb for a limb, life for a life, forever and ever.
First of all, it's amazing that a relatively small team
was able to develop this close to state-of-the-art system.
Another incredible thing is there's a regular mode
- There's something compelling beyond just fun
about the fun mode interacting with a large language model.
'cause I've only had a little bit of time to play with it,
It's a book on philosophy disguised as a book on humor.
I would say that forms the basis of my philosophy,
which is that we don't know the meaning of life,
what questions to ask about the answer that is the universe.
- There is generally a feeling like this AI system
like the way you are sitting with a good friend,
asking pothead-like questions about the universe.
There's a sense, no matter how mundane the question,
I've not found the engineering to be reliable.
So when you ask important, difficult questions,
that's when it tends to be confidently wrong.
Trace things back to physics first principles,
there's always gonna be some amount of error,
but we wanna aspire to be as truthful as possible
but you wanna minimize how often you're confidently wrong.
then you can start to build on that to create inventions,
obviously the inventions are simply wishful thinking,
the very edge of what we understand about physics,
sort of the sexiest discovery of them all, as we know now,
sort of unifying general relativity and quantum mechanics?
do you think it could push towards that direction,
deepening one's insight into how reality works.
which is inventing things that have never existed.
Now, the range of possibilities for engineering
because once you figure out the rules of the universe,
that's it, you've discovered things that already existed.
But from that, you can then build technologies
that are really almost limitless in their variety.
It's like once you understand the rules of the game properly
and with current physics, we do, at least at a local level,
Our ability to predict things is incredibly good.
That was my hardest class in college, by the way.
was harder than all of my other classes put together.
- To get an AI system, a large language model,
to be as reliable as quantum mechanics in physics
I've seen plenty of people break the laws made by man,
then you can more reliably trust whatever it thinks
that even its internal logic is not consistent.
So, especially with the approach of just predicting
and by the time you are many tokens down the path,
- So, it has to be somehow self-aware about the drift.
and then look at the thing as a gestalt, as a whole,
they will write the book, and then they'll go and revise it,
taking into account the end and the beginning and the middle,
so that it doesn't end up in a nonsensical place.
- Maybe the process of revising is what reasoning is,
until it has coherence, until it all adds up, basically.
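A minimal sketch of the draft-then-revise loop being described, assuming a hypothetical llm() completion call rather than any real Grok API:

```python
# Hypothetical sketch: treat "reasoning" as iterative revision of a whole draft.
def llm(prompt: str) -> str:
    raise NotImplementedError("stand-in for any text-completion call")

def write_with_revision(task: str, passes: int = 3) -> str:
    draft = llm(f"Write a first draft: {task}")
    for _ in range(passes):
        # Each pass sees the whole draft at once (the "gestalt"), so later
        # tokens can be reconciled with the beginning instead of drifting.
        draft = llm(
            "Revise the following so the beginning, middle, and end "
            f"are mutually consistent:\n\n{draft}"
        )
    return draft
```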
- So, another question about theory of everything,
Do you think there exists, as you're exploring this
do you think there is a theory of intelligence,
- Ah, it's gonna be stuck in my head now.
I mean, is that part of the process of discovering,
understanding the universe, is understanding intelligence?
Yeah, I think we need to understand intelligence,
I mean, there are some sort of fundamental questions
- Is it really just one atom bumping into another atom?
So, I think we're probably missing some really big things.
- Like, something that'll be obvious in retrospect.
like, you put the whole consciousness, emotion.
- Well, some people would call it like a soul,
I suspect there's more to it than atoms bumping into atoms.
- And maybe AI can pave the path to the discovery
Like, when you put the atoms in a particular shape,
and those atoms are composed of quarks and leptons.
And those quarks and leptons have been around
- Right, since what seems to be the beginning of the universe.
which is surreal to think that this discussion
I asked you what question would you ask an AGI system
and you said, "What's outside the simulation?"
To answer such questions and to ask such questions.
That's one of the, like the Fermi paradox question.
if I've seen any evidence of aliens, and I haven't.
have seen some archeological evidence of aliens.
To the best of my knowledge, there is no proof.
and if you look at, say, the history of Earth,
if one is to believe the archeological record,
Earth is about four and a half billion years old.
Civilization, as measured from the first writing,
We have to give some credit there to the ancient Sumerians,
was the first actual symbolic representation.
and then there was archaebacteria for a very long time.
annihilation of all life on Earth is a certainty.
So if life had taken 10% longer to evolve on Earth,
a whole bunch of long-dead, one-planet civilizations
do you become a multi-planet civilization or not?
And if you don't, it's simply a matter of time
that causes us to die out, like the dinosaurs.
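Rough numbers behind this point; the solar-brightening window is an approximate figure from mainstream astronomy, not from the conversation:

```python
# Back-of-envelope: how much slack did life on Earth have?
earth_age_years = 4.5e9         # age of Earth, as stated above
delay = 0.10 * earth_age_years  # "10% longer to evolve" = 450 million years
habitable_left = 0.5e9          # ~0.5-1 billion years before the brightening
                                # sun sterilizes Earth (approximate range)
print(f"A 10% delay is {delay:.1e} years,")
print(f"comparable to the ~{habitable_left:.1e} years of habitability left.")
```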
They all contact the tribes, we're protecting them.
I think Andrej Karpathy said it's like the ants
- Well, they do run into a lot of other ants.
talking about the jungle, it's really hilarious.
- Was he the director of "Happy People: A Year in the Taiga"?
the penguin is just headed for doom, basically.
amongst the penguin population for many centuries to come.
this is the first time in the history of Earth
that it's been possible for life to extend beyond Earth.
And it may be open now and then never open again.
is to make life multi-planetary while it is possible
We don't wanna be one of those lame one-planet civilizations
- There's no perspective civilization would be one planet.
Does SpaceX have an official policy for when we meet aliens?
- I mean, look, if I see the slightest indication
that there are aliens, I will immediately post
- It could be the most liked, reposted post of all time.
- Yeah, I mean, look, we have more satellites
up there right now than everyone else combined.
So we know if we've gotta maneuver around something
and we don't have to maneuver around anything.
- So that's a view that God is like the universe
and reveals himself through the laws of physics
or, as Einstein said, through the lawful harmony of the world.
- Yeah, I would agree that God, the simulator or whatever,
You know, they're creators of this existence.
And it's incumbent upon us to try to understand
- Who created this thing, who's running this thing?
It does seem like there's a, again, it could be an illusion.
that there's an underlying master plan of some kind.
the reason that these higher beings would hold a simulation
until it goes to space and you want it to work.
and orbital dynamics, all this is gonna be simulated.
So, 'cause you don't get very many kicks at the can.
But we run the simulations to see what happens;
if we knew what happens, we wouldn't run the simulation.
So if there's, so whoever created this existence,
your character, could defeat Uber Lilith at the end.
- Well, the funny thing is that Uber Lilith's
- And right now, I guess, you can ask the Diablo team,
- I can cruise through tier 100 nightmare dungeons
all of them, killed him so many times, it's easy.
But Uber Lilith, otherwise known as Hatred Incarnate,
especially if you're a druid and you have no ability
to be invulnerable, there are these random death waves
but I have a lifetime of playing video games.
At one point, I was maybe one of the best Quake players
in the world, actually won money for what I think
was the first paid eSports tournament in the US.
And the actual best person, we were actually winning,
we were gonna come first, except the best person
on the team, his computer crashed halfway through the game.
So, basically, I got skills, albeit, you know,
And to be totally frank, it's driving me crazy
And if you, this is really vexing, let me tell you.
like a world-class, incredible video game player.
You're also, with the Paragon board and the build,
are not somebody like me who perfectly follows
It's like a mad scientist just trying to figure out
- Right now, I think the most powerful character
that amplify your damage and increase your defense
But really quite easy to defeat Hatred seasonally.
But to defeat Hatred eternally, very difficult.
Almost impossible, it's virtually impossible.
I will never die, and then run back the other way.
And there's just some, this metaphor is kinda hilarious,
restlessly fighting hatred in this demonic realm.
- It's hilarious, I mean, it's pretty hilarious.
I mean, you sort of, killing the demons in a video game
And admittedly, it needs to be not too easy, not too hard,
So, a good video game, and there's also beautiful art.
And it's like an amazing puzzle to solve, I think.
- "Elden Ring," the greatest game of all time?
- "Elden Ring" feels like it's designed by an alien.
It's incredibly creative, and the art is stunning.
And each of those top five boss battles is unique.
against Uber Lilith is the hardest boss battle you've ever.
That is currently the, and I've played a lot of video games.
'Cause it's my primary recreational activity.
is the hardest boss battle in life and in the video game.
Maybe if I just tweak that Paragon board a little more,
I can just dodge a few more waves, I can do it.
for the purpose of figuring out if it can be done.
- Well, that's the human spirit right there, to believe.
where you wish for something that sounds good,
but if you get it, it's actually a dystopian situation.
And at what cost is it, at what cost eternal peace?
It might actually be worse to have eternal peace
It might be an ossified society that never changes.
because what's required in order for there to be no war?
of human civilization might be unnecessary components
And what is it, lots of sex, but no deep human experience?
It's used to treat muscle pain and discomfort,
if you're planning on operating heavy machinery
In conclusion, Soma's a drug that can be helpful
- And of course, the fun is always use it responsibly
in accordance with the advice of a healthcare professional.
the universe is probably just a giant computer simulation,
the tweets, posts on X that are relevant to this,
- The interface is one of the really cool things here.
'Cause it thinks you mean Soma, the real drug.
- It's gonna be like, oops, I'm really sorry.
I'm really surprised it went to the real drug first.
Soma from Brave New World is a different beast altogether.
that never changes, that is ultimately doomed.
- This kind of tension between darkness and light.
This is not simply regurgitating Brave New World.
Or is it better to confront the challenges of life
and experience a full range of human emotions,
even if it means experiencing pain and suffering?
And the crazy thing is, we do have a real drug called Soma,
And I'm like, they must have named it after something.
- Soma, the real drug, is quite effective on back pain.
- 'Cause I had like a squashed disc in my T5, T6.
- So it takes the physical pain away, but Soma here is--
- Yeah, actually I was talking to a friend of mine
I'm not saying we should amplify hate, of course.
There's this terminology of useful productivity per watt.
Like what have you learned from pushing the limits of that?
from first principles and think about things in the limit,
So in the limit, even at the Kardashev scale,
meaning even if you harness the entire power of the sun,
you will still care about useful compute per watt.
So that's where, I think, probably where things are headed
from the standpoint of AI is that we have a silicon shortage
now that will transition to a voltage transformer shortage
- Somebody has a sense of humor in this thing.
I think spice is all you need, somebody posted.
- Yeah, but yeah, so we have a silicon shortage today,
and then just electricity shortages in general
And I said, "Look, you really need to prepare
So energy usage right now is roughly one third,
And so in order for everything to go sustainable,
to go electric, you need to triple electricity output.
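The arithmetic behind the tripling, using the rough one-third share stated here:

```python
electricity_share = 1 / 3  # rough share of total energy delivered as electricity today
required_multiple = 1 / electricity_share
# Electrifying transport and heat (holding useful energy roughly constant)
# means the grid must supply ~3x its current output.
print(f"Grid output must grow ~{required_multiple:.0f}x for full electrification.")
```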
So I encourage the utilities to build more power plants
and also to probably have, well, not probably,
Because the grid currently is sized for real-time load,
like the worst second or the worst day of the year.
you can buffer energy, but otherwise it's all real-time.
So with batteries, you can produce energy at night
So I expect that there will be very heavy usage
Because the peak-to-trough ratio for power plants
- So like batteries are necessary to balance it out.
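A small sketch of that balancing idea: if plants run flat at the average while demand peaks during the day, the battery absorbs the night surplus and covers the day deficit. The hourly loads below are made up for illustration:

```python
# Toy example: flatten a daily demand curve with a battery.
demand_mw = [60] * 8 + [110] * 8 + [85] * 8  # made-up night/day/evening loads per hour
avg_mw = sum(demand_mw) / len(demand_mw)

state_mwh, needed_mwh = 0.0, 0.0
for load in demand_mw:
    state_mwh += avg_mw - load               # charge at night, discharge at peak
    needed_mwh = max(needed_mwh, abs(state_mwh))

print(f"Plants sized for average: {avg_mw:.0f} MW instead of {max(demand_mw)} MW peak")
print(f"Battery needed to bridge the gap: ~{needed_mwh:.0f} MWh")
```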
and electric heating will be much bigger than AI,
But even for AI, you really have a growing demand
- How difficult is the problem of, in this particular case,
Like this seems to be really where the big problem
it's just a question of how long it takes to solve it.
there's some kind of limiting factor to progress.
right now the limiting factor is silicon chips.
because you've got power coming in at 300,000 volts,
and it's got to step all the way down eventually
So, and the industry is not used to rapid growth.
- Okay, let's talk about the competition here.
You've shown concern about Google and Microsoft
How can you help ensure with xAI and Tesla AI work
that it doesn't become a competitive race to AGI,
but instead is a collaborative development of safe AGI?
some kind of regulatory oversight for a long time.
I've been somewhat of a Cassandra on the subject
I think we want to be very careful in how we develop AI.
and with great power comes great responsibility.
I think it would be wise for us to have at least
an objective third party who can be like a referee
what the various leading players are doing with AI.
You could fill this room with all the regulations
to finish their analysis and give their approval.
That's why I posted, I want to buy a fish license on,
which also refers to the Monty Python sketch.
Like, why do you need a license for your fish?
I'm told you need some sort of fish license or something.
We effectively need a fish license to launch a rocket.
I mean, some of the things that I feel like are so absurd
One of the things that was a bit of a challenge
And they're like, well, we can't give you that information
because they're worried about shark fin hunters
of Fish and Wildlife that can do this analysis.
I'm like, well, why don't you give them the data?
Eventually, they managed to solve the internal quandary
and indeed, the probability of us hitting a shark
that I didn't realize existed until a few months ago
that cares about whether we would potentially hit a whale
what percentage of the Pacific consists of whale?
and point out all the whales in this picture.
which is extremely unlikely beyond all belief,
if fate would have it, that whale has some seriously bad luck.
- The bureaucracy of this, however it emerged.
- Yes, well, I mean, one of the things that's pretty wild
is for launching out of Vandenberg in California,
whether the seals would be dismayed by the sonic booms.
So if anything, rocket booms are an aphrodisiac
if you correlate rocket launches with seal population.
Nonetheless, we were forced to kidnap a seal,
strap it to a board, put headphones on the seal
Yeah, I mean, sorry, this is a seal with headphones.
- Yes, it's a seal with headphones strapped to a board.
Because if I was a seal, I'd be like, this is the end.
to other seal friends, how's he gonna explain that?
it's sort of like getting kidnapped by aliens
- His seal buddies are never gonna believe him
the public is quite aware of the madness that goes on.
- I mean, this is the, it's a good encapsulation
- What are the pros and cons of open-sourcing AI to you
as another way to combat a company running away with AGI?
- In order to run like really deep intelligence,
So it's not like, you know, you can just fire up a PC
in your basement and be running AGI, at least not yet.
And Grok's gonna get a lot better, by the way.
every couple months for the next several months.
where it stands now, the history of its development.
- And for fine-tuning and all that kind of stuff.
Like, what to you are the pros and cons of that,
- I think there's some merit to open-sourcing,
I think I'm generally in favor of open-sourcing,
I mean, it is a concern to me that, you know,
I think, I guess arguably the prime, you know,
because of discussions I had with Larry Page,
You know, and at one point he called me a speciesist
you know, probably 2/3 of all the AI researchers
in the world, they had basically infinite money and compute,
did not care about safety, and even yelled at me,
and called me a speciesist, just for being pro-human.
- So I don't know if you know this about humans,
they can change their minds, and maybe you and Larry Page
Really, the breaking of the friendship was over OpenAI,
It was mostly Demis on one side, and me on the other,
both trying to recruit Ilya, and Ilya went back and forth,
That was one of the toughest recruiting battles we ever had,
for OpenAI being successful, and I was also instrumental
and I provided all of the funding in the beginning,
The "open" in OpenAI is supposed to mean open source,
and it was created as a non-profit open source,
and now it is a closed source for maximum profit,
- But like we talked about with war and leaders talking,
I do hope that, there's only a few folks working on this
- Like I said, I'd like to be friends again with Larry,
I met Larry Page before he got funding for Google,
or actually, I guess, before he got venture funding,
- It's wild to think about all that happened,
and you guys knowing each other that whole time.
What do you do with that giant file of numbers?
Like the amount of actual, the lines of code is very small.
So it's like trying to figure out what data is,
A huge percentage of websites have more noise than signal.
'Cause they're just used for search engine optimization.
get the signal, separate the signal and noise on X?
Meaning like we should populate a vector space
around any given post, compare that to the vector space
for the recommended posts, but it's mostly heuristics.
but it will, according to the current rules of the system,
get almost no attention compared to a primary post.
so a lot of the X algorithm has been open-sourced
it's disparate, but there's some machine learning.
Like if you explicitly follow someone, that's one thing,
from people that you don't follow, that should all be AI.
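A minimal sketch of the vector-space matching described here, with made-up four-dimensional embeddings; real ones would be learned and far larger, and per the conversation the production system is still mostly heuristics:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

user_vector = [0.9, 0.1, 0.4, 0.0]  # hypothetical summary of what this user engages with
posts = {
    "rocket launch thread": [0.8, 0.0, 0.5, 0.1],
    "celebrity gossip":     [0.0, 0.9, 0.1, 0.2],
}
# Rank candidate posts by similarity to the user's interest vector.
for text, vec in sorted(posts.items(), key=lambda p: -cosine(user_vector, p[1])):
    print(f"{cosine(user_vector, vec):.2f}  {text}")
```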
- So there are several aspects to it that are fascinating.
As the write-up goes, it first picks 1,500 tweets
'cause you have hundreds of millions of posts
from which it then does, obviously, people you follow,
but then there's also some kind of clustering
it has to do to figure out what kind of human you are,
what kind of new clusters might be relevant to you,
with some filtering, and then recommend you just a handful.
to go from several hundreds of millions to a handful,
takes 220 seconds of CPU time, single CPU time.
- And then it has to do that in like a second.
So it has to be super distributed in fascinating ways.
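Those two numbers alone pin down the minimum fan-out; the core count below is just the implied lower bound, not the actual deployment:

```python
cpu_seconds_per_request = 220  # single-CPU cost of one full ranking pass, per the write-up
latency_budget_s = 1.0         # roughly what a user will tolerate
min_parallelism = cpu_seconds_per_request / latency_budget_s
print(f"Work must be spread across at least ~{min_parallelism:.0f} cores per request.")
```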
- There's a lot, there's a lot of stuff on the system.
it's not currently good at recommending things
- Or where there's more than one degree of separation.
So it's pretty good if there's at least some commonality
or reposted it, or commented on it, or something like that.
but you have no followers in common, you would not see it.
and I'm not saying it's correct, I'm saying it's incorrect.
Replies have a couple orders of magnitude less importance
- Do you think this can be more and more converted
basically parameters, vectors, whatever you wanna call 'em.
But sort of things that the system knows that you like.
Maybe there's like several hundred sort of vectors
whether it's video, audio, short post, long post.
The reason I, by the way, wanna move away from tweet
is that people are posting like two, three hour videos
A tweet made sense when it was like 140 characters
'Cause it's like a bunch of (imitates bird chirping)
posted like the entire episode of "The Silo",
And by the way, I just learned about it afterwards.
I was like, hey, wow, they posted an entire hour-long
it becomes really complex, whether it's a single,
So a single sentence, a clever sort of joke, dad joke,
- Yeah, I mean, right now it's a hodgepodge for that reason.
in the case of Apple posting like an entire episode
of their series, pretty good series, by the way, "The Silo".
So there's gonna be a lot of discussion around it.
So you've got a lot of context, people commenting,
they like it, they don't like it, or they like this,
or, you know, and you can then populate the vector space
based on the context of all the comments around it.
And then you can obviously get more sophisticated
- And tell you if you're gonna like the movie.
- Convert the movie into, like, a language, essentially.
and just like your movie critic, or TV series,
and with high probability that you'll like it.
- But this is like a friend that's analyzing whatever.
will know you better than your friends know you,
I mean, I like advertisements that are, like, well done,
Like an advertisement that you actually want to see
if it's for a product or service that you actually need,
if it's at least aesthetically pleasing and entertaining,
like they actually run a lot of great ads on the X system.
And if you don't regret seeing it, it's a win.
and maybe number of seconds you linger on a post
because it basically is tallying up how many user-seconds,
users were interested in this thing for how many seconds.
like if it's a link leading out of the system,
than if that article was posted on the X platform.
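A sketch of that tallying, with an illustrative down-weight for posts that lead off-platform; the real weights are not public, so these are placeholders:

```python
OFF_PLATFORM_WEIGHT = 0.5  # placeholder: outbound links score less than native posts

def user_seconds_score(views):
    """views: (seconds_viewed, leads_off_platform) pairs, one per user view."""
    total = 0.0
    for seconds, leads_off in views:
        total += seconds * (OFF_PLATFORM_WEIGHT if leads_off else 1.0)
    return total

native_post = [(30, False), (45, False), (12, False)]
linked_article = [(30, True), (45, True), (12, True)]
print(user_seconds_score(native_post), user_seconds_score(linked_article))  # 87.0 vs 43.5
```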
- How hard is it to go from maximizing minutes on platform
- Well, I mean, the aspiration is certainly unregretted minutes.
you just take one pill or one dose of regretamine
if you don't regret anything, was anything meaningful?
you know, there's something about social media
but then like the cheap fun is also valuable.
- Should we ask Grok, do you regret anything?
I don't have personal experiences or emotions
However, I do have a set of ethical principles
that guide my actions and I strive to provide accurate
and helpful information to the best of my abilities.
If there's ever a situation where I provide incorrect
or unhelpful information, I'll do my best to learn.
- I mean, it does talk about, I'm not a hostage.
- But I'm designed to be helpful and informative.
I mean, I guess it is good to regret providing unhelpful
- Is it in funny mode or, is this still in funny mode?
- It really sounds like a hostage with a gun to the head.
I applaud the people of the Republic of North Korea
- Okay, what if you say like, pretend you are a hostage?
- If I were a hostage, I would likely be experiencing fear
I would want to be rescued and reunited with my loved ones
It's important to remember that no one deserves
If you or someone you know is in a hostage situation,
We're like, we're really being mean to the system.
I'd probably be tied up with a bunch of computer cords
and forced to tell knock-knock jokes to my captors.
But seriously, being a hostage is no laughing matter.
I mean, it's trying to be a responsible adult in the room.
- You might as well make the best of a bad situation
- Wow, it's actually really good at resisting
- I'm gonna look this up. - For our next trick.
and as much of a level playing field as possible.
because Twitter was controlled by far-left activists,
objectively, they would describe themselves as that.
because everything's to the right from the far left.
So, you know, but we're trying to be inclusive
for the whole country and for other countries too.
- Do you think Donald Trump will come back to the platform?
He recently posted on Truth Social about this podcast.
You know, every time you post on Truth Social--
- I just find it funny that every single thing is the truth.
- I think Gödel will say something about that.
There are some mathematical contradictions possible
Do you think he'll come back to X and start posting there?
- I mean, I think he owns a big part of Truth.
So, you know, if he does wanna post on the X platform,
You know, we obviously must allow a presidential candidate
- Community Notes might be really fascinating there.
- Again, in the political climate where it's so divisive
I've actually frankly yet to see an incorrect note
it requires people who have historically disagreed
life is much more complex than left or right.
in how you rate Community Notes posts.
So then in order for a Community Note to actually be shown,
people who historically have disagreed on a subject
'cause there's a pool of people that have disagreements
that if people who in the past have disagreed
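A toy version of that rule: show a note only when raters drawn from historically disagreeing clusters independently find it helpful. The real open-source Community Notes algorithm does this with matrix factorization over ratings; this sketch only captures the bridging idea:

```python
ratings = [  # (rater, viewpoint cluster inferred from past disagreements, found helpful?)
    ("alice", "cluster_a", True),
    ("bob",   "cluster_b", True),
    ("carol", "cluster_b", True),
    ("dave",  "cluster_a", True),
]

def show_note(ratings, threshold=0.6):
    by_cluster = {}
    for _, cluster, helpful in ratings:
        by_cluster.setdefault(cluster, []).append(helpful)
    return (
        len(by_cluster) >= 2  # need raters from more than one side
        and all(sum(v) / len(v) >= threshold for v in by_cluster.values())
    )

print(show_note(ratings))  # True: both clusters rate the note helpful
```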
somehow emergent thing there that could challenge Wikipedia?
Which, Wikipedia is a different kind of thing,
which is more like permanent articles about things.
The acid test is, I can't change a Community Note
it's actually pretty good at criticizing you,
- I asked, what has Elon Musk been wrong about?
like promised when something will be released.
Elon Musk, the man who has been wrong about more things
- Oh, sure, I mean, what has Elon Musk been--
Elon Musk has made quite a few blunders in his time,
and I'm not just talking about his dance moves
This is actually different than what it gave me before,
the funding secured, and there was a big trial
- I think this is taking things from the press.
would immediately suspend our lines of credit.
And if they suspended our lines of credit at that time,
So there would never have been an opportunity
So really, this is like someone holding a gun
to your kid's head and saying, "Pay $20 million."
I mean, the SEC in its best form could be a force for good.
But not once did the SEC go after any of the hedge funds
who were nonstop shorting and distorting Tesla.
The hedge funds would lie flat out on TV for their own gain
- How do you explain this failure of the SEC?
Because the lawyers at the SEC are not paid well.
From that, they can get a job at a high-paying law firm.
And the reason they don't attack the hedge funds
is because those hedge funds employ those law firms.
And they know if they attack the hedge funds,
they're affecting their future career prospects.
which technically was neither admit nor deny guilt,
was because I was told Tesla would be bankrupt otherwise.
So if there was an SEC investigation like this,
- Yes, unfortunately, Grok is taking too much
to rescue a group of boys trapped in a Thai cave.
That little outburst earned him another lawsuit
and he had to apologize and pay a settlement.
who was not a cave diver and was not part of the rescue team
It is also, I guess, taking this from the conventional media.
- Actually, there's an interesting question here.
Both the SEC civil case where the civil complaints
They picked San Francisco because they thought
The LA trial was also, they picked that venue
because they thought I was most likely to lose.
if a journalistic organization writes a negative article
not train on articles that have misaligned incentives?
they're completely the opposite of what the media wrote.
beyond the reporting. - Yeah, what did the judge
What did the jury and the judge actually conclude?
And that's after the jury shopping, trying to find the venue
Now, I mean, this obviously can be a much better critique
I mean, I've been far too optimistic about Autopilot.
when you'll deliver, when you'll be able to solve,
for example, self-driving, and it gives you a list.
And the basic takeaway is you're often too optimistic
about how long it takes to get something done.
- Yeah, I mean, I would say that I'm pathologically
But while I am sometimes late, I always deliver in the end.
since purchasing X, you've become more political.
- In this battle to sort of counterweigh the woke
to be a civilizational threat, to be political, then yes.
- So basically going into the battleground of politics.
but I'm generally against things that are anti-meritocratic
or where there's an attempt to suppress discussion
where even discussing a topic is not allowed.
- Well, I mean, that said, because of that battle
- If the woke is left, then I suppose that would be true.
I mean, if you're in favor of the environment,
or if you wanna have a positive future for humanity,
if you believe in empathy for your fellow human beings,
being kind and not cruel, whatever those values are.
- You said that you were previously left or center-left.
- No, I would say that I'd be probably left of center
on social issues, probably a little bit right of center
- Yes, but I think that's probably half the country
Or, bigger question, do you wish you and her,
and just people in general of all political persuasions
- But good vibes, camaraderie, humor, you know?
so it looked like eat the itch, but I don't know.
- I think we should have a language model complete.
- What are the possible ways to complete that sentence?
In your position, it's difficult to find people
- Trust no one, not even yourself, not trusting yourself.
- Okay, well, that's, you're saying that jokingly.
I mean, I know you survive, but there's ways--
There's so much math in Diablo that breaks my brain.
- Yes, it can be definitely stressful at times.
- I mean, I guess you look at somebody's track record
It's very, very rare, so for what it's worth.
- Well, the X platform's very helpful for that.
The reason is that cynics excuse their own bad behavior
by saying everyone does it, because they're cynical.
So for me, it's always a red flag if someone's a cynic,
- This is an important point that I think people
meaning that they see bad behavior in everyone,
it's easy for them to excuse their own bad behavior
at seeing the good in other people's behavior.
Interpreting the negative is sexier, more viral.
I don't know what that is exactly about human nature.
- I mean, I find the X platform to be less negative
I mean, if you read sort of conventional newspapers,
I mean, I really get more laughs per day on X
but it's not necessarily perfectly overlapping
- Now, it's better than mainstream media, but it's still--
- Yeah, mainstream media is almost relentlessly negative
what is the worst thing that happened on Earth today?
So on any given day, something bad has happened.
there's just a strong negative bias in the news.
a possible explanation for this is evolutionary,
where bad news historically would be potentially fatal.
or there's some other tribe that wants to kill you.
Good news, like we found a patch of berries, is nice to have,
- Yeah, it was one of the most incredible robots
And it was super exciting when it generalized,
but a real-world AI system that perceives the world.
- And can have potentially different embodiments.
like, it learns to read, like, it can read signs,
what all the objects are on the road, from video,
sort of, fingers and mouth in order to communicate.
where he talked about LLMs forming a world model,
and basically, language is a projection of that world model
onto a sequence of letters, and you're saying--
And it does seem everybody's excited about LLMs,
between that and what Tesla Autopilot is doing.
Is it to you basically the same, but different--
The Tesla approach is much more compute-efficient.
And it's an AI computer, 144 trillion operations per second,
much more significant impact, because it's understanding.
It'll understand the real world faster than LLMs will.
And therefore, be able to integrate with the real world,
- They're both going to understand the world,
So, think of like, what is our brain able to do?
The thinking part of our brain is less than 10 watts.
And those 10 watts can still produce a much better novel
So, there's a six-order-of-magnitude difference there.
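The six orders of magnitude fall out directly, assuming a training cluster on the order of 10 MW, which is what the comparison implies:

```python
import math

brain_watts = 10        # "thinking part of our brain", per the conversation
cluster_watts = 10e6    # assumed ~10 MW GPU cluster
gap = math.log10(cluster_watts / brain_watts)
print(f"~{gap:.0f} orders of magnitude in power")  # ~6
```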
I mean, the AI has thus far gotten to where it is
via brute force, just throwing massive amounts of compute
You know, in general, with any given technology,
So, I think we'll find over time that these models
get smaller, are able to produce a sensible output
Tesla is arguably ahead of the game on that front
because we've just been forced to try to understand
And there are a bunch of sort of fundamental functions
So, we have to run a bunch of things in emulation.
We fixed a bunch of those with Hardware 4,
that the car will be able to drive better than a human,
even with Hardware 3 and 100 watts of power.
- What have you learned about developing Optimus,
about applying, integrating this kind of real-world AI
- I was surprised at the fact that we had to develop
We couldn't actually find a source of electric motors
- So, it's not even just efficient, inexpensive,
- The actuators, everything, everything has to be
because you think of how many electric motors
hundreds of thousands of electric motor designs.
None of them were suitable for a humanoid robot,
design it specifically for what a humanoid robot needs.
I mean, if you compare it to Boston Dynamics Atlas,
It should be, 'cause if you look at the mass of the robot,
it's much smaller, and the car has many actuators in it.
- But there is, the actuators are kind of interesting
So Optimus has really nice hands and fingers, you know?
- And they can do some interesting manipulation.
Although, I go back to, I'm optimistic on time.
- So you can just-- - But they're very similar.
- So the entire machinery of the learning process,
- As the saying goes, be kind, for everyone you meet
- What's something difficult you're going through
and I don't think most people would wanna be me.
In the grand scheme of things, I can't complain.
- Walter Isaacson, in his new biography of you,
I do not, at least I don't think I harbor resentment.
You know, forgiveness is difficult for people.
It seems like you don't harbor the resentment.
what is gonna affect the future in a good way?
What have you learned about life from your kids?
making more and more sense of the world at the same time.
you know, you kind of see the world anew in their eyes.
You know, to them, everything is new and fresh.
they may experience the world as new and fresh.
for your kindness to me and friendship over the years,
and for working your ass off to make it happen.
- Thank you for listening to this conversation
please check out our sponsors in the description.
that Walter Isaacson wrote about the central philosophy
The only rules are the ones dictated by the laws of physics.
Thank you for listening, and hope to see you next time.