
George Hotz: Tiny Corp, Twitter, AI Safety, Self-Driving, GPT, AGI & God | Lex Fridman Podcast #387


Chapters

0:00 Introduction
1:39 Time is an illusion
11:18 Memes
13:55 Eliezer Yudkowsky
26:19 Virtual reality
32:38 AI friends
40:03 tiny corp
53:24 NVIDIA vs AMD
56:21 tinybox
68:30 Self-driving
83:09 Programming
91:6 AI safety
116:03 Working at Twitter
153:46 Prompt engineering
159:42 Video games
175:57 Andrej Karpathy
186:02 Meaning of life

Whisper Transcript

00:00:00.000 | What possible ideas do you have for how a human species ends?
00:00:03.840 | - Sure, so I think the most obvious way to me
00:00:06.460 | is wireheading.
00:00:08.120 | We end up amusing ourselves to death.
00:00:10.020 | We end up all staring at that infinite TikTok
00:00:14.400 | and forgetting to eat.
00:00:15.700 | Maybe it's even more benign than this.
00:00:19.240 | Maybe we all just stop reproducing.
00:00:21.340 | Now, to be fair, it's probably hard to get all of humanity.
00:00:27.320 | - Yeah, the interesting thing about humanity
00:00:29.520 | is the diversity in it.
00:00:31.200 | Organisms in general.
00:00:32.600 | There's a lot of weirdos out there.
00:00:34.760 | Two of them are sitting here.
00:00:35.960 | - I mean, diversity in humanity is--
00:00:38.320 | - With due respect.
00:00:39.160 | (both laughing)
00:00:40.880 | - I wish I was more weird.
00:00:42.160 | - The following is a conversation with George Hotz.
00:00:47.480 | His third time on this podcast.
00:00:49.360 | He's the founder of comma.ai
00:00:51.520 | that seeks to solve autonomous driving
00:00:53.500 | and is the founder of a new company called TinyCorp
00:00:57.800 | that created TinyGrad, a neural network framework
00:01:01.380 | that is extremely simple
00:01:02.920 | with the goal of making it run on any device
00:01:05.880 | by any human easily and efficiently.
00:01:09.840 | As you know, George also did a large number of fun
00:01:12.640 | and amazing things from hacking the iPhone
00:01:15.260 | to recently joining Twitter for a bit
00:01:17.760 | as an "intern" in quotes,
00:01:20.440 | making the case for refactoring the Twitter code base.
00:01:23.680 | In general, he's a fascinating engineer and human being
00:01:26.800 | and one of my favorite people to talk to.
00:01:29.600 | This is the Lex Fridman Podcast.
00:01:31.640 | To support it, please check out our sponsors
00:01:33.480 | in the description.
00:01:34.760 | And now, dear friends, here's George Hotz.
00:01:38.160 | You mentioned something in a stream
00:01:41.360 | about the philosophical nature of time.
00:01:43.640 | So let's start with a wild question.
00:01:45.920 | Do you think time is an illusion?
00:01:47.660 | - You know, I sell phone calls at comma for $1,000
00:01:55.360 | and some guy called me and like,
00:01:58.600 | you know, it's $1,000, you can talk to me for half an hour.
00:02:01.240 | And he's like, "Yeah, okay, so like time doesn't exist
00:02:05.920 | "and I really wanted to share this with you."
00:02:08.600 | I'm like, "Oh, what do you mean time doesn't exist, right?
00:02:10.960 | "Like, I think time is a useful model,
00:02:13.380 | "whether it exists or not, right?
00:02:15.120 | "Like, does quantum physics exist?
00:02:16.860 | "Well, it doesn't matter.
00:02:18.120 | "It's about whether it's a useful model to describe reality.
00:02:22.000 | "Is time maybe compressive?"
00:02:25.160 | - Do you think there is an objective reality
00:02:27.040 | or is everything just useful models?
00:02:29.020 | Like underneath it all, is there an actual thing
00:02:34.000 | that we're constructing models for?
00:02:35.800 | - I don't know.
00:02:38.600 | - I was hoping you would know.
00:02:40.480 | - I don't think it matters.
00:02:42.080 | - I mean, this kind of connects to the models
00:02:44.440 | of constructive reality with machine learning, right?
00:02:47.760 | - Sure.
00:02:48.600 | - Like, is it just nice to have useful approximations
00:02:52.360 | of the world such that we can do something with it?
00:02:55.120 | - So there are things that are real.
00:02:56.720 | Kolmogorov complexity is real.
00:02:58.440 | - Yeah. - Yeah.
00:03:00.200 | The compressive thing-- - Math.
00:03:01.760 | - Math is real, yeah.
00:03:02.900 | - Should be a T-shirt.
00:03:05.280 | - And I think hard things are actually hard.
00:03:06.960 | I don't think P equals NP.
00:03:09.000 | - Ooh, strong words.
00:03:10.440 | - Well, I think that's the majority.
00:03:11.740 | I do think factoring is in P, but--
00:03:14.280 | - I don't think you're the person that follows the majority
00:03:16.480 | in all walks of life, so but it's good.
00:03:18.280 | - For that one, I do.
00:03:19.120 | - Yeah, in theoretical computer science,
00:03:20.760 | you're one of the sheep.
00:03:23.960 | All right, but to you, time is a useful model.
00:03:28.040 | - Sure.
00:03:28.880 | - What were you talking about on the stream with time?
00:03:32.200 | Are you made of time?
00:03:33.320 | - I remembered half the things I said on stream.
00:03:36.320 | Someday someone's gonna make a model of all of that
00:03:38.560 | and it's gonna come back to haunt me.
00:03:40.080 | - Someday soon?
00:03:41.160 | - Yeah, probably.
00:03:42.000 | - Would that be exciting to you or sad
00:03:45.360 | that there's a George Hotz model?
00:03:47.360 | - I mean, the question is when the George Hotz model
00:03:50.600 | is better than George Hotz.
00:03:52.400 | Like I am declining and the model is growing.
00:03:54.920 | - What is the metric by which you measure better or worse
00:03:57.360 | in that if you're competing with yourself?
00:04:00.080 | - Maybe you can just play a game
00:04:02.280 | where you have the George Hotz answer
00:04:03.680 | and the George Hotz model answer
00:04:04.820 | and ask which people prefer.
00:04:06.760 | - People close to you or strangers?
00:04:09.440 | - Either one, it will hurt more
00:04:10.680 | when it's people close to me,
00:04:11.700 | but both will be overtaken by the George Hotz model.
00:04:15.280 | - It'd be quite painful, right?
00:04:18.120 | Loved ones, family members would rather have the model
00:04:21.640 | over for Thanksgiving than you.
00:04:23.440 | - Yeah.
00:04:24.800 | - Or like significant others would rather sext
00:04:28.600 | with the large language model version of you.
00:04:34.680 | - Especially when it's fine tuned to their preferences.
00:04:38.420 | - Is it?
00:04:39.360 | Yeah, well, that's what we're doing in a relationship,
00:04:42.320 | right, we're just fine tuning ourselves,
00:04:43.680 | but we're inefficient with it
00:04:45.120 | 'cause we're selfish and greedy and so on.
00:04:47.280 | Our language models can fine tune more efficiently,
00:04:50.640 | more selflessly.
00:04:51.760 | - There's a "Star Trek Voyager" episode
00:04:53.440 | where Kathryn Janeway, lost in the Delta Quadrant,
00:04:57.080 | makes herself a lover on the holodeck.
00:05:00.200 | And the lover falls asleep on her arm
00:05:04.600 | and he snores a little bit
00:05:05.740 | and Janeway edits the program to remove that.
00:05:08.840 | And then of course the realization is,
00:05:10.360 | wait, this person's terrible.
00:05:12.400 | It is actually all their nuances and quirks
00:05:16.300 | and slight annoyances that make this relationship worthwhile.
00:05:20.100 | But I don't think we're gonna realize that
00:05:22.040 | until it's too late.
00:05:23.040 | - Well, I think a large language model
00:05:26.480 | could incorporate the flaws and the quirks
00:05:29.800 | and all that kind of stuff.
00:05:30.640 | - Just the perfect amount of quirks and flaws
00:05:33.760 | to make you charming without crossing the line.
00:05:36.120 | - Yeah, yeah.
00:05:38.000 | And that's probably a good approximation
00:05:41.520 | of the percent of time the language model should be cranky
00:05:46.520 | or an asshole or jealous or all this kind of stuff.
00:05:51.520 | - And of course it can and it will,
00:05:53.360 | but all that difficulty at that point is artificial.
00:05:56.400 | There's no more real difficulty.
00:05:58.760 | - Okay, what's the difference between real and artificial?
00:06:01.280 | - Artificial difficulty is difficulty
00:06:03.200 | that's like constructed or could be turned off with a knob.
00:06:06.240 | Real difficulty is like you're in the woods
00:06:08.400 | and you gotta survive.
00:06:09.500 | - So if something can not be turned off with a knob,
00:06:14.120 | it's real?
00:06:16.120 | - Yeah, I think so.
00:06:17.280 | Or, I mean, you can't get out of this
00:06:19.720 | by smashing the knob with a hammer.
00:06:22.000 | I mean, maybe you kind of can.
00:06:24.360 | "Into the Wild," when, you know, Alexander Supertramp,
00:06:29.200 | he wants to explore something
00:06:30.320 | that's never been explored before,
00:06:31.840 | but it's the '90s, everything's been explored.
00:06:33.560 | So he's like, well, I'm just not gonna bring a map.
00:06:35.920 | - Yeah.
00:06:36.760 | - I mean, no, you're not exploring.
00:06:40.000 | You should have brought a map, dude, you died.
00:06:41.480 | There was a bridge a mile from where you were camping.
00:06:44.080 | - How does that connect to the metaphor of the knob?
00:06:46.600 | - By not bringing the map, you didn't become an explorer.
00:06:50.460 | You just smashed the thing.
00:06:53.320 | - Yeah. - Yeah.
00:06:54.520 | The difficulty is still artificial.
00:06:56.920 | - You failed before you started.
00:06:58.460 | What if we just don't have access to the knob?
00:07:00.840 | - Well, that maybe is even scarier, right?
00:07:03.640 | Like we already exist in a world of nature
00:07:05.520 | and nature has been fine-tuned over billions of years
00:07:09.480 | to have humans build something
00:07:14.480 | and then throw the knob away
00:07:17.560 | in some grand romantic gesture is horrifying.
00:07:19.820 | - Do you think of us humans as individuals
00:07:23.040 | that are like born and die,
00:07:24.800 | or is it, are we just all part of one living organism
00:07:28.800 | that is Earth, that is nature?
00:07:31.860 | - I don't think there's a clear line there.
00:07:35.520 | I think it's all kind of just fuzzy.
00:07:37.680 | I don't know, I mean, I don't think I'm conscious.
00:07:39.800 | I don't think I'm anything.
00:07:41.080 | I think I'm just a computer program.
00:07:44.020 | - So it's all computation.
00:07:45.480 | - Yeah. - Everything running
00:07:46.320 | in your head is just computation.
00:07:49.240 | - Everything running in the universe is computation, I think.
00:07:51.680 | I believe the extended Church-Turing thesis.
00:07:54.680 | - Yeah, but there seems to be an embodiment
00:07:57.320 | to your particular computation, like there's a consistency.
00:08:00.920 | - Well, yeah, but I mean, models have consistency too.
00:08:03.620 | - Yeah. - Models that have been
00:08:06.400 | RLHF'd will continually say, you know, like,
00:08:09.560 | well, how do I murder ethnic minorities?
00:08:11.680 | Oh, well, I can't let you do that, Hal.
00:08:13.320 | There's a consistency to that behavior.
00:08:15.680 | - It's all RLHF.
00:08:16.520 | Like we all RLHF each other.
00:08:20.560 | We provide human feedback,
00:08:23.920 | and thereby fine-tune these little pockets of computation,
00:08:28.840 | but it's still unclear why that pocket of computation
00:08:31.640 | stays with you, like for years.
00:08:33.640 | It just kind of falls.
00:08:35.120 | Like you have this consistent set of physics, biology,
00:08:40.120 | what, like whatever you call the neurons firing,
00:08:46.680 | like the electrical signals and mechanical signals,
00:08:48.680 | all of that, that seems to stay there,
00:08:50.320 | and it contains information, it stores information,
00:08:52.960 | and that information permeates through time
00:08:55.680 | and stays with you.
00:08:57.740 | There's like memory.
00:08:59.400 | It's like sticky.
00:09:00.800 | - Okay, to be fair, like a lot of the models
00:09:02.980 | we're building today are very,
00:09:04.200 | even RLHF is nowhere near as complex
00:09:07.020 | as the human loss function.
00:09:08.060 | - Reinforcement learning with human feedback.
00:09:10.300 | - You know, when I talked about will GPT-12 be AGI,
00:09:14.580 | my answer is no, of course not.
00:09:15.720 | I mean, cross-entropy loss is never gonna get you there.
00:09:18.320 | You need probably RL in fancy environments
00:09:23.320 | in order to get something that would be considered like,
00:09:25.540 | AGI-like.
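
For reference, the loss he's referring to here is the standard next-token cross-entropy; this is the generic formulation, not anything specific to a particular model:

$$
\mathcal{L}(\theta) = -\frac{1}{N}\sum_{t=1}^{N} \log p_\theta\left(x_t \mid x_{<t}\right)
$$

Since an arithmetic coder can encode a sequence in roughly $-\log_2 p_\theta(x_{1:N})$ bits, lowering this loss is the same as compressing the training data better, which is the sense in which the objective "maximizes compression" as discussed below.
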
00:09:26.540 | So to ask like the question about like why,
00:09:30.140 | I don't know, like it's just some quirk of evolution,
00:09:32.560 | right, I don't think there's anything particularly special
00:09:35.500 | about where I ended up, where humans ended up.
00:09:40.500 | - So okay, we have human-level intelligence.
00:09:43.820 | Would you call that AGI, whatever we have, GI?
00:09:47.580 | - Look, actually, I don't really even like the word AGI,
00:09:50.780 | but general intelligence is defined
00:09:53.220 | to be whatever humans have.
00:09:54.780 | - Okay, so why can GPT-12 not get us to AGI?
00:09:59.340 | Can we just like linger on that?
00:10:02.180 | - If your loss function is categorical cross-entropy,
00:10:04.380 | if your loss function is just try to maximize compression,
00:10:07.500 | I have a SoundCloud, I rap,
00:10:09.960 | and I tried to get ChatGPT to help me write raps,
00:10:13.180 | and the raps that it wrote
00:10:14.620 | sounded like YouTube comment raps.
00:10:16.180 | You know, you can go on any rap beat online
00:10:18.260 | and you can see what people put in the comments,
00:10:20.300 | and it's the most like mid-quality rap you can find.
00:10:23.820 | - Is mid good or bad?
00:10:24.820 | - Mid is bad.
00:10:25.660 | - Mid is bad. - It's like mid, it's like--
00:10:27.780 | - Every time I talk to you, I learn new words.
00:10:29.900 | (laughing)
00:10:31.420 | - Mid. - Mid, yeah.
00:10:33.580 | - I was like, is it like basic?
00:10:36.140 | Is that what mid means?
00:10:37.260 | - Kind of, it's like middle of the curve, right?
00:10:39.980 | So there's like that intelligence curve,
00:10:42.880 | and you have like the dumb guy, the smart guy,
00:10:45.900 | and then the mid guy.
00:10:46.740 | Actually, being the mid guy's the worst.
00:10:48.260 | The smart guy's like, I put all my money in Bitcoin.
00:10:50.260 | The mid guy's like, you can't put money in Bitcoin,
00:10:52.260 | it's not real money. (laughing)
00:10:55.340 | - And all of it is a genius meme.
00:10:58.020 | That's another interesting one, memes.
00:11:01.340 | The humor, the idea, the absurdity
00:11:04.860 | encapsulated in a single image,
00:11:07.180 | and it just kind of propagates virally
00:11:11.220 | between all of our brains.
00:11:13.500 | I didn't get much sleep last night,
00:11:14.740 | so I'm very, I sound like I'm high, but I swear I'm not.
00:11:17.980 | Do you think we have ideas or ideas have us?
00:11:22.360 | - I think that we're gonna get super scary memes
00:11:26.660 | once the AIs actually are super human.
00:11:29.380 | - Ooh, you think AI will generate memes?
00:11:31.820 | - Of course.
00:11:32.780 | - You think it'll make humans laugh?
00:11:35.100 | - I think it's worse than that.
00:11:35.980 | So "Infinite Jest," it's introduced in the first 50 pages,
00:11:40.980 | is about a tape that you, once you watch it once,
00:11:44.700 | you only ever wanna watch that tape.
00:11:47.380 | In fact, you wanna watch the tape so much
00:11:48.840 | that someone says, okay, here's a hacksaw,
00:11:50.920 | cut off your pinky, and then I'll let you
00:11:52.660 | watch the tape again, and you'll do it.
00:11:55.060 | So we're actually gonna build that, I think,
00:11:57.280 | but it's not gonna be one static tape.
00:11:58.900 | I think the human brain is too complex
00:12:01.540 | to be stuck in one static tape like that.
00:12:05.060 | If you look at like ant brains,
00:12:06.340 | maybe they can be stuck on a static tape.
00:12:08.940 | But we're going to build that using generative models.
00:12:11.340 | We're going to build the TikTok
00:12:12.680 | that you actually can't look away from.
00:12:15.380 | - So TikTok is already pretty close there,
00:12:17.140 | but the generation is done by humans.
00:12:19.460 | The algorithm is just doing their recommendation,
00:12:21.380 | but if the algorithm is also able to do the generation.
00:12:25.340 | - Well, it's a question about how much intelligence
00:12:27.180 | is behind it, right?
00:12:28.380 | So the content is being generated by,
00:12:30.700 | let's say, one humanity worth of intelligence,
00:12:32.940 | and you can quantify a humanity, right?
00:12:34.780 | That's a, you know, it's X flops, yottaflops,
00:12:39.780 | but you can quantify it.
00:12:41.860 | Once that generation is being done by 100 humanities,
00:12:45.180 | you're done.
00:12:46.020 | - So it's actually scale that's the problem,
00:12:50.820 | but also speed.
00:12:51.920 | Yeah.
00:12:55.100 | And what if it's sort of manipulating
00:12:58.660 | the very limited human dopamine engine for porn?
00:13:03.380 | Imagine just TikTok, but for porn.
00:13:05.700 | - Yeah.
00:13:06.520 | - That's like a brave new world.
00:13:08.540 | - I don't even know what it'll look like, right?
00:13:10.220 | Like, again, you can't imagine the behaviors
00:13:13.020 | of something smarter than you,
00:13:14.400 | but a super intelligent,
00:13:16.660 | and an agent that just dominates your intelligence so much
00:13:21.420 | will be able to completely manipulate you.
00:13:24.060 | - Is it possible that it won't really manipulate,
00:13:26.980 | it'll just move past us?
00:13:28.660 | It'll just kind of exist the way water exists,
00:13:32.120 | or the air exists?
00:13:33.460 | - You see, and that's the whole AI safety thing.
00:13:36.620 | It's not the machine that's gonna do that.
00:13:40.160 | It's other humans using the machine
00:13:41.740 | that are gonna do that to you.
00:13:43.020 | - Yeah.
00:13:44.340 | 'Cause the machine is not interested in hurting humans.
00:13:47.220 | - The machine is a machine.
00:13:48.980 | But the human gets the machine,
00:13:50.580 | and there's a lot of humans out there
00:13:52.020 | very interested in manipulating you.
00:13:54.060 | - Well, let me bring up Eliezer Yudkowsky,
00:13:58.460 | who recently sat where you're sitting.
00:14:02.060 | He thinks that AI will almost surely kill everyone.
00:14:06.860 | Do you agree with him or not?
00:14:08.620 | - Yes, but maybe for a different reason.
00:14:12.500 | - Okay.
00:14:13.340 | And I'll try to get you to find hope,
00:14:18.780 | or we could find a no to that answer.
00:14:21.660 | - But why yes?
00:14:22.620 | - Okay, why didn't nuclear weapons kill everyone?
00:14:26.220 | - That's a good question.
00:14:27.420 | - I think there's an answer.
00:14:28.420 | I think it's actually very hard
00:14:29.380 | to deploy nuclear weapons tactically.
00:14:31.180 | It's very hard to accomplish tactical objectives.
00:14:35.100 | Great, I can nuke their country.
00:14:36.900 | I have an irradiated pile of rubble.
00:14:38.920 | I don't want that.
00:14:39.940 | - Why not?
00:14:40.900 | - Why don't I want an irradiated pile of rubble?
00:14:43.300 | For all the reasons no one wants
00:14:44.420 | an irradiated pile of rubble.
00:14:46.300 | - Oh, 'cause you can't use that land for resources.
00:14:50.820 | You can't populate the land.
00:14:52.140 | - Yeah, what you want, a total victory in a war
00:14:55.660 | is not usually the irradiation
00:14:58.260 | and eradication of the people there.
00:15:00.020 | It's the subjugation and domination of the people.
00:15:02.520 | - Okay, so you can't use this strategically,
00:15:06.740 | tactically in a war to help gain a military advantage.
00:15:11.740 | It's all complete destruction, right?
00:15:15.940 | But there's egos involved.
00:15:17.940 | It's still surprising.
00:15:19.140 | - Still surprising that nobody pressed the big red button.
00:15:22.020 | - It's somewhat surprising,
00:15:23.540 | but you see, it's the little red button
00:15:25.980 | that's gonna be pressed with AI that's gonna,
00:15:28.320 | and that's why we die.
00:15:31.860 | It's not because the AI,
00:15:34.580 | if there's anything in the nature of AI,
00:15:36.140 | it's just the nature of humanity.
00:15:37.660 | - What's the algorithm behind the little red button?
00:15:40.380 | What possible ideas do you have
00:15:42.860 | for how a human species ends?
00:15:45.460 | - Sure, so I think the most obvious way to me
00:15:49.700 | is wireheading.
00:15:51.340 | We end up amusing ourselves to death.
00:15:53.180 | We end up all staring at that infinite TikTok
00:15:57.660 | and forgetting to eat.
00:15:58.960 | Maybe it's even more benign than this.
00:16:02.500 | Maybe we all just stop reproducing.
00:16:04.600 | Now, to be fair, it's probably hard to get all of humanity.
00:16:10.540 | - Yeah.
00:16:11.380 | - Yeah.
00:16:12.220 | It probably--
00:16:14.020 | - So the interesting thing about humanity
00:16:16.540 | is the diversity in it.
00:16:17.660 | - Oh, yeah.
00:16:18.500 | - Organisms in general.
00:16:19.640 | There's a lot of weirdos out there.
00:16:21.620 | - Well--
00:16:22.460 | - Two of them are sitting here.
00:16:23.300 | - I mean, diversity in humanity is--
00:16:25.380 | - With due respect.
00:16:26.220 | (both laughing)
00:16:27.900 | - I wish I was more weird.
00:16:29.740 | No, like I'm kinda, look, I'm drinking Smart Water, man.
00:16:31.740 | That's like a Coca-Cola product, right?
00:16:33.300 | - You're one corporate, George Hotz.
00:16:35.020 | - Yeah, I'm one corporate.
00:16:36.860 | No, the amount of diversity in humanity,
00:16:38.540 | I think, is decreasing,
00:16:40.180 | just like all the other biodiversity on the planet.
00:16:42.460 | - Oh, boy, yeah.
00:16:43.740 | - Right?
00:16:44.580 | - Social media's not helping, huh?
00:16:45.420 | - Go eat McDonald's in China.
00:16:47.260 | - Yeah.
00:16:48.100 | - Yeah, no, it's the interconnectedness that's doing it.
00:16:54.140 | - Oh, that's interesting.
00:16:54.980 | So everybody starts relying on the connectivity
00:16:58.740 | of the internet, and over time,
00:17:00.980 | that reduces the diversity, the intellectual diversity,
00:17:03.780 | and then that gets you, everybody, into a funnel.
00:17:06.540 | There's still going to be a guy in Texas.
00:17:08.340 | - There is, and yeah.
00:17:10.020 | - In a bunker.
00:17:10.860 | - To be fair, do I think AI kills us all?
00:17:13.820 | I think AI kills everything we call society today.
00:17:17.460 | I do not think it actually kills the human species.
00:17:19.580 | I think that's actually incredibly hard to do.
00:17:21.880 | - Yeah, but society, if we start over, that's tricky.
00:17:26.300 | Most of us don't know how to do most things.
00:17:28.780 | - Yeah, but some of us do, and they'll be okay,
00:17:31.820 | and they'll rebuild after the great AI.
00:17:35.540 | - What's rebuilding look like?
00:17:38.380 | Like, how much do we lose?
00:17:40.660 | Like, what has human civilization done that's interesting?
00:17:44.980 | A combustion engine, electricity?
00:17:47.860 | So power and energy, that's interesting.
00:17:52.300 | Like, how to harness energy.
00:17:54.420 | - Whoa, whoa, whoa, whoa.
00:17:55.260 | They're gonna be religiously against that.
00:17:57.260 | - Are they going to get back to, like, fire?
00:18:01.180 | - Sure, I mean, it'll be like, you know,
00:18:05.020 | some kind of Amish-looking kind of thing, I think.
00:18:07.420 | I think they're going to have
00:18:08.260 | very strong taboos against technology.
00:18:10.900 | - Like, technology is almost like a new religion.
00:18:14.780 | Technology is the devil, and nature is God.
00:18:19.460 | - Sure. - So closer to nature.
00:18:22.060 | But can you really get away from AI
00:18:24.020 | if it destroyed 99% of the human species?
00:18:26.500 | Doesn't it somehow have a hold, like a stronghold?
00:18:30.500 | - What's interesting about everything we build,
00:18:33.740 | I think we are going to build superintelligence
00:18:35.980 | before we build any sort of robustness in the AI.
00:18:38.580 | We cannot build an AI that is capable
00:18:41.860 | of going out into nature and surviving like a bird, right?
00:18:46.860 | A bird is an incredibly robust organism.
00:18:50.620 | We've built nothing like this.
00:18:51.780 | We haven't built a machine that's capable of reproducing.
00:18:55.020 | - Yes, but there is, you know,
00:18:58.700 | I work with Lego robots a lot now.
00:19:00.300 | I have a bunch of them.
00:19:03.000 | They're mobile.
00:19:03.960 | They can't reproduce, but all they need is,
00:19:08.360 | I guess you're saying they can't repair themselves.
00:19:10.440 | But if you have a large number,
00:19:11.480 | if you have like 100 million of them.
00:19:12.960 | - Let's just focus on them reproducing, right?
00:19:15.440 | Do they have microchips in them?
00:19:16.880 | Okay, then do they include a fab?
00:19:19.260 | - No.
00:19:21.120 | - Then how are they going to reproduce?
00:19:22.520 | - Well, it doesn't have to be all on board, right?
00:19:26.840 | They can go to a factory, to a repair shop.
00:19:29.640 | - Yeah, but then you're really moving away from robustness.
00:19:33.120 | - Yes.
00:19:33.960 | - All of life is capable of reproducing
00:19:35.720 | without needing to go to a repair shop.
00:19:38.160 | Life will continue to reproduce
00:19:39.800 | in the complete absence of civilization.
00:19:42.640 | Robots will not.
00:19:44.120 | So when the, if the AI apocalypse happens,
00:19:49.120 | I mean, the AIs are gonna probably die out
00:19:51.180 | 'cause I think we're gonna get, again, super intelligence
00:19:53.000 | long before we get robustness.
00:19:55.000 | - What about if you just improve the fab
00:19:58.600 | to where you just have a 3D printer
00:20:01.760 | that can always help you?
00:20:03.440 | - Well, that'd be very interesting.
00:20:04.480 | I'm interested in building that.
00:20:06.920 | - Of course you are.
00:20:08.120 | You think, how difficult is that problem
00:20:09.680 | to have a robot that basically can build itself?
00:20:14.680 | - Very, very hard.
00:20:16.640 | - I think you've mentioned this to me or somewhere
00:20:21.240 | where people think it's easy conceptually.
00:20:24.320 | - And then they remember that you're gonna have
00:20:26.160 | to have a fab.
00:20:27.480 | - Yeah, on board.
00:20:29.080 | - Of course.
00:20:30.320 | - So 3D printer that prints a 3D printer.
00:20:33.400 | - Yeah.
00:20:34.240 | - Yeah, on legs.
00:20:36.320 | Why is that hard?
00:20:37.880 | - Well, 'cause it's, I mean, a 3D printer
00:20:39.680 | is a very simple machine, right?
00:20:42.160 | Okay, you're gonna print chips?
00:20:44.180 | You're gonna have an atomic printer?
00:20:45.680 | How are you gonna dope the silicon?
00:20:47.120 | - Yeah.
00:20:48.160 | - Right?
00:20:49.200 | How are you gonna etch the silicon?
00:20:51.080 | - You're gonna have to have a very interesting kind of fab
00:20:54.640 | if you wanna have a lot of computation on board.
00:20:59.040 | But you can do like structural type of robots
00:21:03.560 | that are dumb.
00:21:04.900 | - Yeah, but structural type of robots
00:21:07.240 | aren't gonna have the intelligence required
00:21:08.840 | to survive in any complex environment.
00:21:11.120 | - What about like ants type of systems
00:21:13.100 | where you have like trillions of them?
00:21:15.040 | - I don't think this works.
00:21:16.440 | I mean, again, like ants at their very core
00:21:19.200 | are made up of cells that are capable
00:21:20.840 | of individually reproducing.
00:21:22.600 | They're doing quite a lot of computation
00:21:25.160 | that we're taking for granted.
00:21:26.600 | - It's not even just the computation.
00:21:27.900 | It's that reproduction is so inherent.
00:21:29.800 | Okay, so like there's two stacks of life in the world.
00:21:32.200 | There's the biological stack and the silicon stack.
00:21:35.080 | The biological stack starts with reproduction.
00:21:39.000 | Reproduction is at the absolute core.
00:21:40.820 | The first proto-RNA organisms were capable of reproducing.
00:21:45.520 | The silicon stack, despite as far as it's come,
00:21:49.320 | is nowhere near being able to reproduce.
00:21:51.820 | - Yeah, so the fab movement, digital fabrication,
00:21:56.820 | fabrication in the full range of what that means
00:22:01.960 | is still in the early stages.
00:22:04.100 | - Yeah.
00:22:04.940 | - You're interested in this world.
00:22:06.880 | - Even if you did put a fab on the machine, right?
00:22:09.240 | Let's say, okay, we can build fabs.
00:22:10.640 | We know how to do that as humanity.
00:22:12.080 | We can probably put all the precursors
00:22:14.120 | that build all the machines
00:22:14.960 | and the fabs also in the machine.
00:22:16.260 | So first off, this machine is gonna be absolutely massive.
00:22:18.960 | I mean, we almost have a,
00:22:20.760 | like think of the size of the thing required
00:22:23.440 | to reproduce a machine today, right?
00:22:26.480 | Like is our civilization capable of reproduction?
00:22:30.320 | Can we reproduce our civilization on Mars?
00:22:32.560 | - If we were to construct a machine
00:22:35.960 | that is made up of humans, like a company,
00:22:39.040 | it can reproduce itself.
00:22:40.320 | - Yeah.
00:22:41.280 | - I don't know.
00:22:42.120 | It feels like 115 people.
00:22:47.120 | - I think it's so much harder than that.
00:22:50.000 | - 120?
00:22:50.840 | (laughs)
00:22:52.120 | I was looking for a number.
00:22:53.040 | - I believe that Twitter can be run by 50 people.
00:22:55.520 | I think that this is gonna take most of,
00:23:00.280 | like it's just most of society, right?
00:23:02.240 | Like we live in one globalized world.
00:23:04.200 | - No, but you're not interested in running Twitter.
00:23:05.920 | You're interested in seeding.
00:23:08.160 | Like you want to seed a civilization
00:23:10.960 | and then 'cause humans can like have sex.
00:23:13.040 | - Oh, okay, you're talking about, yeah, okay.
00:23:14.800 | So you're talking about the humans reproducing
00:23:16.520 | and like basically like what's the smallest
00:23:18.080 | self-sustaining colony of humans?
00:23:19.640 | - Yeah.
00:23:20.480 | - Yeah, okay, fine.
00:23:21.320 | But they're not gonna be making five nanometer chips.
00:23:22.760 | - Over time they will.
00:23:23.680 | I think you're being,
00:23:25.600 | like we have to expand our conception of time here.
00:23:28.700 | Going back to the original.
00:23:30.780 | Time scale, I mean, over across maybe 100 generations,
00:23:35.720 | we're back to making chips.
00:23:38.220 | If you seed the colony correctly.
00:23:40.560 | - Maybe, or maybe they'll watch our colony die out
00:23:43.880 | over here and be like, we're not making chips.
00:23:45.840 | Don't make chips.
00:23:46.680 | - No, but you have to seed that colony correctly.
00:23:48.640 | Whatever you do, don't make chips.
00:23:50.600 | Chips are what led to their downfall.
00:23:52.520 | - Well, that is the thing that humans do.
00:23:56.200 | They come up, they construct a devil,
00:23:59.060 | a good thing and a bad thing,
00:24:00.120 | and they really stick by that.
00:24:01.400 | And then they murder each other over that.
00:24:03.120 | There's always one asshole in the room
00:24:04.400 | who murders everybody.
00:24:05.480 | (laughing)
00:24:06.800 | And he usually makes tattoos and nice branding.
00:24:09.720 | - Now do you need that asshole?
00:24:11.240 | That's the question, right?
00:24:12.600 | Humanity works really hard today to get rid of that asshole,
00:24:15.000 | but I think they might be important.
00:24:16.800 | - Yeah, this whole freedom of speech thing.
00:24:19.800 | The freedom of being an asshole seems kind of important.
00:24:22.080 | - That's right.
00:24:23.480 | - Man, this thing, this fab, this human fab
00:24:26.560 | that we've constructed as human civilization
00:24:28.720 | is pretty interesting.
00:24:29.680 | And now it's building artificial copies of itself,
00:24:34.000 | or artificial copies of various aspects of itself
00:24:38.320 | that seem interesting, like intelligence.
00:24:40.320 | And I wonder where that goes.
00:24:43.280 | - I like to think it's just like another stack for life.
00:24:46.520 | We have like the biostack life, like we're a biostack life,
00:24:48.760 | and then the silicon stack life.
00:24:50.600 | - But it seems like the ceiling,
00:24:52.600 | or there might not be a ceiling,
00:24:53.960 | or at least the ceiling is much higher
00:24:55.360 | for the silicon stack.
00:24:57.200 | - Oh no, we don't know what the ceiling is
00:24:59.720 | for the biostack either.
00:25:00.680 | The biostack just seemed to move slower.
00:25:04.240 | You have Moore's law, which is not dead,
00:25:07.440 | despite many proclamations.
00:25:09.880 | - In the biostack or the silicon stack?
00:25:11.320 | - In the silicon stack.
00:25:12.160 | And you don't have anything like this in the biostack.
00:25:13.680 | So I have a meme that I posted.
00:25:16.040 | I tried to make a meme, it didn't work too well.
00:25:17.840 | But I posted a picture of Ronald Reagan and Joe Biden,
00:25:21.360 | and you look, this is 1980, and this is 2020.
00:25:24.360 | And these two humans are basically like the same.
00:25:26.840 | There's no, like there's been no change in humans
00:25:31.840 | in the last 40 years.
00:25:33.480 | And then I posted a computer from 1980
00:25:36.080 | and a computer from 2020.
00:25:38.640 | - Yeah, with the early stages, right?
00:25:43.360 | Which is why you said when you said the fab,
00:25:45.640 | the size of the fab required to make another fab
00:25:48.480 | is like very large right now.
00:25:52.000 | - Oh yeah.
00:25:52.840 | - But computers were very large 80 years ago.
00:25:57.840 | And they got pretty tiny.
00:26:01.560 | And people are starting to wanna wear them on their face
00:26:05.640 | in order to escape reality.
00:26:10.400 | That's the thing.
00:26:11.360 | In order to live inside the computer.
00:26:14.680 | Put a screen right here.
00:26:16.000 | I don't have to see the rest of you assholes.
00:26:18.240 | - I've been ready for a long time.
00:26:19.800 | - You like virtual reality?
00:26:20.880 | - I love it.
00:26:21.720 | - Do you wanna live there?
00:26:23.600 | - Yeah.
00:26:24.800 | - Yeah, part of me does too.
00:26:26.920 | How far away are we, do you think?
00:26:29.560 | - Judging from what you can buy today, far, very far.
00:26:35.440 | - I gotta tell you that I had the experience
00:26:39.040 | of Meta's Codec Avatar,
00:26:43.240 | where it's an ultra high resolution scan.
00:26:46.980 | It looked real.
00:26:50.960 | - I mean, the headsets just are not quite
00:26:53.080 | at like eye resolution yet.
00:26:55.080 | I haven't put on any headset where I'm like,
00:26:57.440 | oh, this could be the real world.
00:26:59.720 | Whereas when I put good headphones on, audio is there.
00:27:03.480 | I like we can reproduce audio that I'm like,
00:27:05.360 | I'm actually in a jungle right now.
00:27:06.960 | If I close my eyes, I can't tell I'm not.
00:27:09.120 | - Yeah, but then there's also smell
00:27:10.960 | and all that kind of stuff.
00:27:11.880 | - Sure.
00:27:12.960 | - I don't know.
00:27:13.800 | The power of imagination or the power of the mechanism
00:27:18.360 | in the human mind that fills the gaps,
00:27:20.840 | that kind of reaches and wants to make the thing you see
00:27:24.120 | in the virtual world real to you, I believe in that power.
00:27:28.960 | - Or humans wanna believe.
00:27:30.320 | - Yeah.
00:27:31.160 | What if you're lonely?
00:27:33.240 | What if you're sad?
00:27:34.680 | What if you're really struggling in life
00:27:36.760 | and here's a world where you don't have to struggle anymore?
00:27:39.720 | - Humans wanna believe so much
00:27:41.520 | that people think the large language models are conscious.
00:27:44.240 | That's how much humans wanna believe.
00:27:46.720 | - Strong words.
00:27:48.120 | He's throwing left and right hooks.
00:27:50.680 | Why do you think large language models are not conscious?
00:27:53.840 | - I don't think I'm conscious.
00:27:58.680 | - Oh, so what is consciousness then, George Hotz?
00:27:58.680 | - It's like what it seems to mean to people.
00:28:01.320 | It's just like a word that atheists use for souls.
00:28:03.800 | - Sure, but that doesn't mean soul
00:28:06.560 | is not an interesting word.
00:28:07.920 | - If consciousness is a spectrum,
00:28:10.400 | I'm definitely way more conscious
00:28:12.200 | than the large language models are.
00:28:13.960 | I think the large language models
00:28:16.760 | are less conscious than a chicken.
00:28:18.440 | - When is the last time you've seen a chicken?
00:28:21.440 | - In Miami, like a couple months ago.
00:28:25.280 | - How, no, like a living chicken?
00:28:27.600 | - Living chickens walking around Miami, it's crazy.
00:28:29.960 | - Like on the street? - Yeah.
00:28:31.520 | - Like a chicken? - A chicken, yeah.
00:28:33.320 | - All right. (laughs)
00:28:36.240 | All right, I was trying to call you all
00:28:38.240 | like a good journalist and I got shut down.
00:28:41.360 | Okay, but you don't think much about this kind of
00:28:47.600 | subjective feeling that it feels like something to exist.
00:28:56.680 | And then as an observer, you can have a sense
00:29:01.800 | that an entity is not only intelligent,
00:29:05.180 | but has a kind of subjective experience of its reality,
00:29:09.760 | like a self-awareness that is capable of like suffering,
00:29:13.360 | of hurting, of being excited by the environment
00:29:15.520 | in a way that's not merely kind of an artificial response,
00:29:20.520 | but a deeply felt one.
00:29:22.840 | - Humans wanna believe so much
00:29:24.840 | that if I took a rock and a Sharpie
00:29:26.680 | and drew a sad face on the rock,
00:29:28.300 | they'd think the rock is sad.
00:29:29.760 | - Yeah, and you're saying when we look in the mirror,
00:29:33.880 | we apply the same smiley face as with the rock.
00:29:36.960 | - Pretty much, yeah.
00:29:38.160 | - Isn't that weird though, that you're not conscious?
00:29:40.920 | - No.
00:29:43.600 | - But you do believe in consciousness.
00:29:45.520 | - Not really. - It's unclear.
00:29:47.360 | Okay, so to you it's like a little like a symptom
00:29:50.240 | of the bigger thing that's not that important.
00:29:52.360 | - Yeah, I mean, it's interesting that like human systems
00:29:55.440 | seem to claim that they're conscious.
00:29:57.320 | And I guess it kind of like says something
00:29:58.920 | in a straight up like, okay, what do people mean
00:30:00.720 | when even if you don't believe in consciousness,
00:30:02.400 | what do people mean when they say consciousness?
00:30:04.320 | And there's definitely like meanings to it.
00:30:06.880 | - What's your favorite thing to eat?
00:30:08.640 | - Pizza.
00:30:12.200 | - Cheese pizza, what are the toppings?
00:30:13.480 | - I like cheese pizza.
00:30:14.320 | - Don't say pineapple. - I like pepperoni pizza.
00:30:15.640 | No, I don't like pineapple.
00:30:16.520 | - Okay, pepperoni pizza.
00:30:17.360 | - And if they put any ham on it, oh, that's real bad.
00:30:19.680 | - What's the best pizza?
00:30:21.360 | What are we talking about here?
00:30:22.320 | Like you like cheap crappy pizza?
00:30:24.120 | - A Chicago deep dish cheese pizza,
00:30:26.280 | oh, that's my favorite.
00:30:27.440 | - There you go, you bite into a deep dish,
00:30:29.200 | a Chicago deep dish pizza,
00:30:31.160 | and it feels like you were starving,
00:30:33.000 | you haven't eaten for 24 hours.
00:30:34.640 | You just bite in and you're hanging out
00:30:37.040 | with somebody that matters a lot to you
00:30:38.640 | and you're there with the pizza.
00:30:39.840 | - Oh, that sounds real nice.
00:30:40.760 | - Yeah, all right, it feels like something.
00:30:43.880 | I'm George motherfucking Hotz
00:30:45.960 | eating a fucking Chicago deep dish pizza.
00:30:49.280 | There's just the full peak light living experience
00:30:54.280 | of being human, the top of the human condition.
00:30:58.440 | It feels like something to experience that.
00:31:00.920 | Why does it feel like something?
00:31:04.400 | That's consciousness, isn't it?
00:31:06.680 | - If that's the word you wanna use to describe it, sure.
00:31:08.840 | I'm not gonna deny that that feeling exists.
00:31:10.720 | I'm not gonna deny that I experienced that feeling.
00:31:13.680 | When, I guess what I kind of take issue to
00:31:16.520 | is that there's some like,
00:31:18.320 | like how does it feel to be a web server?
00:31:20.520 | Do 404s hurt?
00:31:21.960 | - Not yet.
00:31:24.440 | - How would you know what suffering looked like?
00:31:26.680 | Sure, you can recognize a suffering dog
00:31:28.680 | because we're the same stack as the dog.
00:31:30.800 | All the bio stack stuff kind of, especially mammals,
00:31:33.400 | you know, it's fairly easy.
00:31:34.640 | - Game recognizes game.
00:31:37.360 | - Yeah, versus the silicon stack stuff,
00:31:40.240 | it's like, you have no idea.
00:31:42.800 | You have, wow, the little thing has learned to mimic.
00:31:46.080 | But then I realized that that's all we are too.
00:31:52.280 | Oh look, the little thing has learned to mimic.
00:31:54.680 | - Yeah, I guess, yeah, 404 could be suffering,
00:31:58.760 | but it's so far from our kind of living organism,
00:32:03.760 | our kind of stack.
00:32:06.400 | But it feels like AI can start maybe mimicking
00:32:10.640 | the biological stack better, better, better,
00:32:12.480 | 'cause it's trained.
00:32:13.320 | - Retrained it, yeah.
00:32:14.960 | - And so in that, maybe that's the definition
00:32:17.080 | of consciousness, is the bio stack consciousness.
00:32:20.080 | - The definition of consciousness is how close
00:32:21.720 | something looks to human.
00:32:22.920 | Sure, I'll give you that one.
00:32:24.520 | - No, how close something is to the human experience.
00:32:28.920 | - Sure, it's a very anthropocentric definition, but.
00:32:33.280 | - Well, that's all we got.
00:32:34.880 | - Sure, no, and I don't mean to like,
00:32:37.400 | I think there's a lot of value in it.
00:32:38.860 | Look, I just started my second company,
00:32:40.160 | my third company will be AI Girlfriends.
00:32:42.160 | No, like I mean it.
00:32:44.040 | - I wanna find out what your fourth company is after.
00:32:45.880 | - Oh, wow.
00:32:46.720 | - 'Cause I think once you have AI Girlfriends,
00:32:48.800 | it's, oh boy, does it get interesting.
00:32:54.680 | - Well, maybe let's go there.
00:32:55.840 | I mean, the relationships with AI,
00:32:58.040 | that's creating human-like organisms, right?
00:33:01.340 | And part of being human is being conscious,
00:33:03.280 | is being, having the capacity to suffer,
00:33:05.600 | having the capacity to experience this life richly
00:33:08.320 | in such a way that you can empathize,
00:33:10.700 | the AI system can empathize with you,
00:33:12.480 | and you can empathize with it.
00:33:14.280 | Or you can project your anthropomorphic sense
00:33:18.840 | of what the other entity is experiencing,
00:33:21.760 | and an AI model would need to, yeah,
00:33:25.080 | to create that experience inside your mind.
00:33:27.440 | And it doesn't seem that difficult.
00:33:28.880 | - Yeah, but, okay, so here's where it actually
00:33:31.040 | gets totally different, right?
00:33:32.540 | When you interact with another human,
00:33:35.480 | you can make some assumptions.
00:33:37.640 | - Yeah.
00:33:38.600 | - When you interact with these models, you can't.
00:33:40.440 | You can make some assumptions that that other human
00:33:42.720 | experiences suffering and pleasure
00:33:45.220 | in a pretty similar way to you do.
00:33:47.040 | The golden rule applies.
00:33:49.500 | With an AI model, this isn't really true, right?
00:33:52.620 | These large language models are good at fooling people
00:33:55.140 | because they were trained on a whole bunch of human data
00:33:57.940 | and told to mimic it.
00:33:58.940 | - Yep.
00:33:59.860 | But if the AI system says, "Hi, my name is Samantha,"
00:34:04.040 | it has a backstory.
00:34:06.820 | - Yeah.
00:34:07.660 | - Went to college here and there.
00:34:08.660 | - Yeah.
00:34:09.500 | - Maybe it'll integrate this in the AI system.
00:34:11.620 | - I made some chatbots.
00:34:12.460 | I gave them backstories.
00:34:13.300 | It was lots of fun.
00:34:14.280 | I was so happy when Llama came out.
00:34:16.180 | - Yeah, we'll talk about Llama.
00:34:17.980 | We'll talk about all that.
00:34:18.820 | But the rock with the smiley face.
00:34:21.060 | - Yeah.
00:34:21.900 | - It seems pretty natural for you to anthropomorphize
00:34:25.980 | that thing and then start dating it.
00:34:28.180 | And before you know it, you're married and have kids.
00:34:33.140 | - With a rock.
00:34:34.460 | - With a rock.
00:34:35.620 | There's pictures on Instagram with you and a rock
00:34:37.620 | and a smiley face.
00:34:38.740 | - To be fair, something that people generally look for
00:34:41.340 | when they're looking for someone to date
00:34:42.340 | is intelligence in some form.
00:34:44.940 | And the rock doesn't really have intelligence.
00:34:47.100 | Only a pretty desperate person would date a rock.
00:34:50.260 | - I think we're all desperate deep down.
00:34:52.260 | - Oh, not rock level desperate.
00:34:54.140 | - (laughs) All right.
00:34:55.300 | Not rock level desperate, but AI level desperate.
00:35:02.660 | I don't know.
00:35:04.300 | I think all of us have a deep loneliness.
00:35:06.140 | It just feels like the language models are there.
00:35:09.320 | - Oh, I agree.
00:35:10.160 | And you know what?
00:35:11.000 | I won't even say this so cynically.
00:35:11.940 | I will actually say this in a way that like,
00:35:13.580 | I want AI friends.
00:35:14.820 | I do.
00:35:15.660 | - Yeah.
00:35:16.480 | - I would love to.
00:35:17.340 | You know, again, the language models now are still a little,
00:35:21.140 | like people are impressed with these GPT things.
00:35:24.860 | And I look at like, or like, or the copilot, the coding one.
00:35:29.700 | And I'm like, okay, this is like junior engineer level.
00:35:32.380 | And these people are like fiver level artists
00:35:34.740 | and copywriters.
00:35:36.020 | Like, okay, great.
00:35:38.340 | We got like fiver and like junior engineers.
00:35:40.580 | Okay, cool.
00:35:41.500 | Like, and this is just the start
00:35:43.500 | and it will get better, right?
00:35:45.140 | Like I can't wait to have AI friends
00:35:47.680 | who are more intelligent than I am.
00:35:49.840 | - So Fiverr is just temporary.
00:35:51.240 | It's not the ceiling.
00:35:52.240 | - No, definitely not.
00:35:53.420 | - Is it count as cheating when you're talking
00:35:58.840 | to an AI model, emotional cheating?
00:36:01.200 | - That's up to you and your human partner to define.
00:36:07.560 | - Oh, you have to, all right.
00:36:08.760 | - You're getting, yeah.
00:36:09.600 | You have to have that conversation, I guess.
00:36:12.200 | - All right.
00:36:13.240 | I mean, integrate that with porn and all this.
00:36:16.100 | - No, I mean, it's similar kind of porn.
00:36:17.980 | - Yeah. - Yeah.
00:36:18.820 | Right, I think people in relationships
00:36:21.100 | have different views on that.
00:36:22.940 | - Yeah, but most people don't have like
00:36:26.380 | serious open conversations about all the different aspects
00:36:32.020 | of what's cool and what's not.
00:36:34.060 | And it feels like AI is a really weird conversation to have.
00:36:37.060 | - The porn one is a good branching off.
00:36:40.700 | Like these things, you know, one of my scenarios
00:36:42.220 | that I put in my chat bot is, you know,
00:36:44.800 | a nice girl named Lexi, she's 20.
00:36:48.040 | She just moved out to LA.
00:36:49.440 | She wanted to be an actress,
00:36:50.440 | but she started doing OnlyFans instead.
00:36:52.000 | And you're on a date with her, enjoy.
00:36:53.800 | - Oh man, yeah.
00:36:58.200 | And so is that, if you're actually dating somebody
00:37:00.480 | in real life, is that cheating?
00:37:03.200 | I feel like it gets a little weird.
00:37:04.920 | - Sure. - It gets real weird.
00:37:06.680 | It's like, what are you allowed to say to an AI bot?
00:37:09.160 | Imagine having that conversation with a significant other.
00:37:11.620 | I mean, these are all things for people to define
00:37:13.400 | in their relationships.
00:37:14.280 | What it means to be human is just gonna start to get weird.
00:37:17.080 | - Especially online.
00:37:18.520 | Like, how do you know?
00:37:19.760 | Like, there'll be moments when you'll have
00:37:22.480 | what you think is a real human you interacted with
00:37:25.200 | on Twitter for years and you realize it's not.
00:37:27.460 | - I spread, I love this meme, heaven banning.
00:37:32.440 | You know what shadow banning?
00:37:33.520 | - Yeah. - Shadow banning.
00:37:34.920 | Okay, you post, no one can see it.
00:37:36.520 | Heaven banning, you post, no one can see it,
00:37:39.360 | but a whole lot of AIs are spot up to interact with you.
00:37:42.300 | - Well, maybe that's what the way human civilization ends
00:37:46.740 | is all of us are heaven banned.
00:37:48.780 | - There's a great, it's called
00:37:50.740 | My Little Pony Friendship is Optimal.
00:37:53.380 | It's a sci-fi story that explores this idea.
00:37:56.660 | - Friendship is optimal. - Friendship is optimal.
00:37:58.740 | - Yeah, I'd like to have some,
00:38:00.020 | at least on the intellectual realm,
00:38:02.220 | some AI friends that argue with me.
00:38:05.180 | But the romantic realm is weird.
00:38:09.060 | Definitely weird.
00:38:09.980 | But not out of the realm of the kind of weirdness
00:38:16.240 | that human civilization is capable of, I think.
00:38:18.920 | - I want it.
00:38:20.880 | Look, I want it.
00:38:21.700 | If no one else wants it, I want it.
00:38:23.460 | - Yeah, I think a lot of people probably want it.
00:38:25.380 | There's a deep loneliness.
00:38:26.780 | - And I'll fill their loneliness
00:38:30.260 | and just will only advertise to you some of the time.
00:38:33.580 | - Yeah, maybe the conceptions of monogamy change too.
00:38:36.260 | Like I grew up in a time, like I value monogamy,
00:38:38.460 | but maybe that's a silly notion
00:38:40.260 | when you have arbitrary number of AI systems.
00:38:43.280 | - This interesting path from rationality to polyamory.
00:38:48.380 | Yeah, that doesn't make sense for me.
00:38:50.180 | - For you, but you're just a biological organism
00:38:52.740 | who was born before the internet really took off.
00:38:57.740 | - The crazy thing is, like,
00:38:59.940 | culture is whatever we define it as.
00:39:02.900 | Right, these things are not,
00:39:04.500 | like, the is-ought problem in moral philosophy, right?
00:39:07.740 | There's no, like, okay, what is might be
00:39:09.860 | that, like, computers are capable of mimicking,
00:39:12.300 | you know, girlfriends perfectly.
00:39:14.700 | They passed the girlfriend Turing test, right?
00:39:16.580 | But that doesn't say anything about ought.
00:39:18.100 | That doesn't say anything about how we ought
00:39:19.620 | to respond to them as a civilization.
00:39:21.180 | That doesn't say we ought to get rid of monogamy, right?
00:39:23.860 | That's a completely separate question,
00:39:25.700 | really a religious one.
00:39:27.500 | - Girlfriend Turing test, I wonder what that looks like.
00:39:30.100 | - Girlfriend Turing test.
00:39:31.020 | - Are you writing that?
00:39:32.340 | Will you be the Alan Turing of the 21st century
00:39:36.580 | that writes the girlfriend Turing test?
00:39:38.380 | - No, I mean, of course, my AI girlfriends,
00:39:40.840 | their goal is to pass the girlfriend Turing test.
00:39:43.680 | - No, but there should be, like, a paper
00:39:45.300 | that kind of defines the test.
00:39:46.900 | I mean, the question is if it's deeply personalized
00:39:50.980 | or there's a common thing that really gets everybody.
00:39:54.320 | - Yeah, I mean, you know, look, we're a company.
00:39:57.540 | We don't have to get everybody.
00:39:58.420 | We just have to get a large enough clientele to stay.
00:40:00.820 | - I like how you're already thinking company.
00:40:03.980 | All right, let's, before we go to company number three
00:40:06.500 | and company number four, let's go to company number two.
00:40:09.500 | Tiny Corp.
00:40:10.420 | Possibly one of the greatest names of all time for a company.
00:40:15.840 | You've launched a new company called Tiny Corp
00:40:18.740 | that leads the development of Tiny Grad.
00:40:20.900 | What's the origin story of Tiny Corp and Tiny Grad?
00:40:25.020 | - I started Tiny Grad as a, like, a toy project
00:40:28.580 | just to teach myself, okay, like, what is a convolution?
00:40:32.460 | What are all these options you can pass to them?
00:40:34.660 | What is the derivative of a convolution, right?
00:40:36.660 | Very similar to, Karpathy wrote micrograd.
00:40:40.100 | Very similar.
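
As an aside, a scalar autograd in the spirit of Karpathy's micrograd fits in a few dozen lines of Python. The sketch below is illustrative only (it is neither micrograd's nor tinygrad's actual code); it just shows how add and multiply nodes can carry their own derivative rules:

```python
# Minimal scalar autograd sketch (illustrative, not tinygrad/micrograd source).
class Value:
    def __init__(self, data, _parents=(), _backward=lambda: None):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = _backward

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological order, then apply the chain rule from the output back.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# y = a*b + a  =>  dy/da = b + 1, dy/db = a
a, b = Value(2.0), Value(3.0)
y = a * b + a
y.backward()
print(a.grad, b.grad)  # 4.0 2.0
```
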
00:40:40.940 | And then I started realizing,
00:40:45.280 | I started thinking about, like, AI chips.
00:40:48.020 | I started thinking about chips that run AI,
00:40:50.580 | and I was like, well, okay,
00:40:52.860 | this is going to be a really big problem.
00:40:55.540 | If NVIDIA becomes a monopoly here,
00:40:57.820 | how long before NVIDIA is nationalized?
00:41:02.980 | - So you, one of the reasons to start Tiny Corp
00:41:07.940 | is to challenge NVIDIA.
00:41:10.220 | - It's not so much to challenge NVIDIA.
00:41:12.940 | I actually, I like NVIDIA.
00:41:15.260 | And it's to make sure power stays decentralized.
00:41:20.260 | - Yeah.
00:41:22.180 | And here's computational power.
00:41:25.120 | I see you and NVIDIA is kind of locking down
00:41:28.540 | the computational power of the world.
00:41:30.580 | If NVIDIA becomes just like 10x better than everything else,
00:41:34.980 | you're giving a big advantage to somebody
00:41:37.220 | who can secure NVIDIA as a resource.
00:41:40.780 | - Yeah.
00:41:42.500 | - In fact, if Jensen watches this podcast,
00:41:44.580 | he may want to consider this.
00:41:46.400 | He may want to consider making sure
00:41:47.900 | his company's not nationalized.
00:41:49.460 | - You think that's an actual threat?
00:41:52.380 | - Oh, yes.
00:41:53.220 | - No, but there's so much, you know, there's AMD.
00:41:58.420 | - So we have NVIDIA and AMD, great.
00:42:00.140 | All right.
00:42:01.580 | But you don't think there's like a push
00:42:03.700 | towards like selling, like Google selling TPUs
00:42:08.540 | or something like this?
00:42:09.380 | You don't think there's a push for that?
00:42:10.420 | - Have you seen it?
00:42:11.740 | Google loves to rent you TPUs.
00:42:13.980 | - It doesn't, you can't buy it at Best Buy?
00:42:15.820 | - No.
00:42:16.660 | So I started work on a chip.
00:42:22.420 | I was like, okay, what's it gonna take to make a chip?
00:42:24.620 | And my first notions were all completely wrong
00:42:26.980 | about why, about like how you could improve on GPUs.
00:42:30.260 | And I will take this, this is from Jim Keller
00:42:33.020 | on your podcast.
00:42:34.580 | And this is one of my absolute favorite
00:42:38.140 | descriptions of computation.
00:42:39.740 | So there's three kinds of computation paradigms
00:42:42.980 | that are common in the world today.
00:42:45.020 | There's CPUs, and CPUs can do everything.
00:42:47.740 | CPUs can do add and multiply.
00:42:50.160 | They can do load and store,
00:42:51.500 | and they can do compare and branch.
00:42:53.300 | And when I say they can do these things,
00:42:54.420 | they can do them all fast, right?
00:42:56.420 | So compare and branch are unique to CPUs.
00:42:58.780 | And what I mean by they can do them fast
00:43:00.220 | is they can do things like branch prediction
00:43:01.980 | and speculative execution.
00:43:03.220 | And they spend tons of transistors
00:43:04.620 | on these like super deep reorder buffers
00:43:06.940 | in order to make these things fast.
00:43:08.940 | Then you have a simpler computation model GPUs.
00:43:11.380 | GPUs can't really do compare and branch.
00:43:13.300 | I mean, they can, but it's horrendously slow.
00:43:15.880 | But GPUs can do arbitrary load and store, right?
00:43:18.260 | GPUs can do things like X dereference Y.
00:43:21.580 | So they can fetch from arbitrary pieces of memory.
00:43:23.380 | They can fetch from memory that is defined
00:43:25.100 | by the contents of the data.
00:43:26.500 | The third model of computation is DSPs.
00:43:29.780 | And DSPs are just add and multiply, right?
00:43:32.380 | Like they can do load and stores,
00:43:33.380 | but only static load and stores,
00:43:34.720 | only loads and stores that are known
00:43:36.020 | before the program runs.
00:43:37.700 | And you look at neural networks today,
00:43:39.340 | and 95% of neural networks are all the DSP paradigm.
00:43:43.000 | They are just statically scheduled adds and multiplies.
00:43:48.100 | So TinyGrad really took this idea,
00:43:50.260 | and I'm still working on it,
00:43:52.540 | to extend this as far as possible.
00:43:55.300 | Every stage of the stack has Turing completeness, right?
00:43:58.100 | Python has Turing completeness.
00:43:59.540 | And then we take Python,
00:44:00.420 | we go into C++, which is Turing complete.
00:44:02.460 | And maybe C++ calls into some CUDA kernels,
00:44:04.860 | which are Turing complete.
00:44:05.900 | The CUDA kernels go through LLVM,
00:44:07.340 | which is Turing complete, into PTX,
00:44:08.740 | which is Turing complete, to SAS,
00:44:09.820 | which is Turing complete on a Turing complete processor.
00:44:12.180 | I want to get Turing completeness out of the stack entirely.
00:44:14.860 | Because once you get rid of Turing completeness,
00:44:16.260 | you can reason about things.
00:44:17.820 | Rice's theorem and the halting problem
00:44:19.340 | do not apply to ML models.
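A minimal sketch of what "statically scheduled adds and multiplies" means, in plain Python rather than anything from tinygrad: every loop bound and memory access below is fixed before the program runs, so there is nothing for a branch predictor to guess and the halting problem never comes up.

```python
# Illustrative sketch (not tinygrad code) of the DSP-style paradigm:
# every load, store, and loop bound is known before the program runs;
# only the data changes between invocations.

IN, OUT = 4, 3  # shapes fixed ahead of time

def dense_layer(x, w, b):
    y = [0.0] * OUT
    for o in range(OUT):           # trip counts known statically
        acc = b[o]
        for i in range(IN):
            acc += x[i] * w[o][i]  # just adds and multiplies, static addressing
        y[o] = acc
    return y

x = [1.0, 2.0, 3.0, 4.0]
w = [[0.1] * IN, [0.2] * IN, [0.3] * IN]
b = [0.0, 0.0, 0.0]
print(dense_layer(x, w, b))
```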
00:44:23.500 | - Okay, what's the power and the value
00:44:25.780 | of getting Turing completeness out of,
00:44:28.500 | are we talking about the hardware or the software?
00:44:30.980 | - Every layer of the stack.
00:44:32.260 | - Every layer.
00:44:33.100 | - Every layer of the stack,
00:44:34.060 | removing Turing completeness
00:44:35.460 | allows you to reason about things, right?
00:44:37.740 | So the reason you need to do branch prediction in a CPU,
00:44:40.340 | and the reason it's prediction,
00:44:41.460 | and the branch predictors are,
00:44:42.460 | I think they're like 99% on CPUs.
00:44:44.980 | Why do they get 1% of them wrong?
00:44:46.920 | Well, they get 1% wrong because you can't know, right?
00:44:50.940 | That's the halting problem.
00:44:51.900 | It's equivalent to the halting problem
00:44:53.220 | to say whether a branch is gonna be taken or not.
00:44:56.420 | I can show that, but the ML model,
00:45:01.300 | the neural network, runs the identical compute every time.
00:45:05.340 | The only thing that changes is the data.
00:45:07.340 | So when you realize this, you think about,
00:45:11.020 | okay, how can we build a computer,
00:45:12.860 | how can we build a stack
00:45:13.940 | that takes maximal advantage of this idea?
00:45:16.060 | So what makes TinyGrad different
00:45:20.260 | from other neural network libraries
00:45:22.260 | is it does not have a primitive operator
00:45:24.380 | even for matrix multiplication.
00:45:26.540 | And this is every single one.
00:45:28.300 | They even have primitive operators
00:45:29.500 | for things like convolutions.
00:45:30.900 | - So no matmul.
00:45:32.620 | - No matmul.
00:45:33.540 | Well, here's what a matmul is.
00:45:35.020 | So I'll use my hands to talk here.
00:45:36.940 | So if you think about a cube,
00:45:38.460 | and I put my two matrices that I'm multiplying
00:45:40.460 | on two faces of the cube, right?
00:45:42.960 | You can think about the matrix multiply as,
00:45:45.420 | okay, the n cubed,
00:45:47.380 | I'm gonna multiply for each one in the cube,
00:45:49.460 | and then I'm gonna do a sum,
00:45:50.620 | which is a reduce up to here,
00:45:52.500 | to the third face of the cube,
00:45:53.780 | and that's your multiplied matrix.
00:45:55.860 | So what a matrix multiply is,
00:45:57.500 | is a bunch of shape operations, right?
00:45:59.340 | A bunch of permutes, reshapes, and expands
00:46:01.700 | on the two matrices.
00:46:03.340 | I'll multiply n cubed,
00:46:05.780 | I'll reduce n cubed,
00:46:07.660 | which gives you an n squared matrix.
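A rough NumPy illustration of that cube picture (my own sketch, not tinygrad's IR): broadcast the two matrices onto the n-by-k-by-m volume with movement-style reshapes, multiply pointwise, then sum-reduce the shared axis.

```python
import numpy as np

# Sketch of "matmul = movement ops + multiply + reduce" (illustrative, not tinygrad's IR):
# put A and B on two faces of the n*k*m volume via broadcasting, multiply pointwise,
# then sum-reduce the shared axis.
def matmul_via_shapes(A, B):
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    A3 = A.reshape(n, k, 1)   # movement op: reshape (broadcast expands to n*k*m)
    B3 = B.reshape(1, k, m)   # movement op: reshape
    cube = A3 * B3            # binary op: n*k*m pointwise multiplies
    return cube.sum(axis=1)   # reduce op: collapse the shared axis -> n*m result

A, B = np.random.randn(2, 3), np.random.randn(3, 4)
assert np.allclose(matmul_via_shapes(A, B), A @ B)
```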
00:46:09.620 | - Okay, so what is the minimum number of operations
00:46:12.060 | that can accomplish that
00:46:12.980 | if you don't have matmul as a primitive?
00:46:16.140 | - So TinyGrad has about 20.
00:46:18.300 | And you can compare TinyGrad's op set, or IR,
00:46:22.300 | to things like XLA or PrimTorch.
00:46:25.300 | So XLA and PrimTorch are ideas where like,
00:46:27.420 | okay, Torch has like 2000 different kernels.
00:46:30.900 | PyTorch 2.0 introduced PrimTorch, which has only 250.
00:46:35.760 | TinyGrad has order of magnitude 25.
00:46:39.260 | It's 10X less than XLA or PrimTorch.
00:46:43.340 | And you can think about it
00:46:44.820 | as kind of like RISC versus CISC, right?
00:46:47.740 | These other things are CISC like systems.
00:46:50.460 | TinyGrad is RISC.
00:46:52.980 | - And RISC won.
00:46:54.620 | - RISC architecture is gonna change everything.
00:46:56.940 | 1995, Hackers.
00:46:58.460 | - Wait, really?
00:47:00.140 | That's an actual thing?
00:47:00.980 | - Angelina Jolie delivers the line,
00:47:03.060 | RISC architecture is gonna change everything in 1995.
00:47:05.780 | - Wow.
00:47:06.620 | - And here we are with ARM in the phones
00:47:08.500 | and ARM everywhere.
00:47:10.020 | - Wow, I love it when movies
00:47:11.620 | actually have real things in them.
00:47:13.140 | - Right?
00:47:14.380 | - Okay, interesting.
00:47:15.300 | And so this is like,
00:47:16.420 | so you're thinking of this as the RISC architecture
00:47:19.220 | of the ML stack.
00:47:21.260 | 25, huh?
00:47:23.300 | What, can you go through the four op types?
00:47:28.300 | - Sure.
00:47:31.020 | Okay, so you have unary ops,
00:47:32.740 | which take in a tensor
00:47:36.820 | and return a tensor of the same size
00:47:38.540 | and do some unary op to it.
00:47:39.780 | Exp, log, reciprocal, sine, right?
00:47:43.760 | They take in one and they're point-wise.
00:47:46.340 | - Relu.
00:47:47.300 | - Yeah, relu.
00:47:48.740 | Almost all activation functions are unary ops.
00:47:51.620 | Some combinations of unary ops together
00:47:53.820 | is still a unary op.
00:47:54.860 | Then you have binary ops.
00:47:57.500 | Binary ops are like point-wise addition,
00:47:59.780 | multiplication, division, compare.
00:48:01.980 | It takes in two tensors of equal size
00:48:05.420 | and outputs one tensor.
00:48:06.620 | Then you have reduce ops.
00:48:09.940 | Reduce ops will like take a three-dimensional tensor
00:48:12.460 | and turn it into a two-dimensional tensor
00:48:14.580 | or a three-dimensional tensor
00:48:15.700 | turning into a zero-dimensional tensor.
00:48:17.060 | Things like a sum or a max
00:48:19.180 | are really the common ones there.
00:48:21.700 | And then the fourth type is movement ops.
00:48:24.020 | And movement ops are different from the other types
00:48:25.820 | because they don't actually require computation.
00:48:27.620 | They require different ways to look at memory.
00:48:30.060 | So that includes reshapes, permutes, expands, flips.
00:48:34.760 | Those are the main ones, probably.
00:48:35.600 | - And so with that, you have enough to make a matmul.
00:48:38.780 | - And convolutions.
00:48:39.780 | And every convolution you can imagine,
00:48:41.540 | dilated convolution, strided convolutions,
00:48:43.860 | transposed convolutions.
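For concreteness, a sketch of those four categories as Python enums; the names are illustrative and not tinygrad's exact op set.

```python
from enum import Enum, auto

# Rough sketch of the four op categories (illustrative names, not tinygrad's exact op set).

class UnaryOps(Enum):
    EXP = auto(); LOG = auto(); RECIPROCAL = auto(); SIN = auto(); RELU = auto()

class BinaryOps(Enum):
    ADD = auto(); MUL = auto(); DIV = auto(); CMPLT = auto()

class ReduceOps(Enum):
    SUM = auto(); MAX = auto()

class MovementOps(Enum):      # no computation, just different views of memory
    RESHAPE = auto(); PERMUTE = auto(); EXPAND = auto(); FLIP = auto()

# On the order of 20-25 ops like these are enough to express matmuls and convolutions.
```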
00:48:45.520 | - You write on GitHub about laziness,
00:48:49.300 | showing a matmul, a matrix multiplication.
00:48:53.620 | See how despite the style,
00:48:55.180 | it is fused into one kernel with the power of laziness.
00:48:58.580 | Can you elaborate on this power of laziness?
00:49:01.060 | - Sure, so if you type in PyTorch,
00:49:03.300 | A times B plus C,
00:49:05.820 | what this is going to do
00:49:08.420 | is it's going to first multiply A and B
00:49:11.940 | and store that result into memory.
00:49:13.700 | - Mm-hmm.
00:49:14.540 | - And then it is going to add C
00:49:15.820 | by reading that result from memory,
00:49:17.760 | reading C from memory and writing that out to memory.
00:49:21.620 | There is way more loads and stores to memory
00:49:23.900 | than you need there.
00:49:25.100 | If you don't actually do A times B as soon as you see it,
00:49:28.660 | if you wait until the user actually realizes that tensor,
00:49:32.740 | until the laziness actually resolves,
00:49:34.940 | you can fuse that plus C.
00:49:36.740 | This is like, it's the same way Haskell works.
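A toy sketch of that laziness idea, assuming a hypothetical graph-building wrapper (this is not tinygrad's implementation): arithmetic only records a graph, and nothing executes until the result is realized, which is where a real system would emit one fused kernel.

```python
import numpy as np

# Toy laziness sketch (not tinygrad's implementation): ops only record a graph,
# and nothing runs until realize(). A real system would emit one fused kernel there;
# this toy just walks the recorded graph with NumPy to show the deferred evaluation.

class Lazy:
    def __init__(self, op, srcs=(), data=None):
        self.op, self.srcs, self.data = op, srcs, data
    def __mul__(self, other): return Lazy("mul", (self, other))
    def __add__(self, other): return Lazy("add", (self, other))
    def realize(self):
        if self.op == "load": return self.data
        a, b = (s.realize() for s in self.srcs)
        return a * b if self.op == "mul" else a + b

def tensor(x): return Lazy("load", data=np.asarray(x, dtype=np.float32))

A, B, C = tensor([1, 2]), tensor([3, 4]), tensor([5, 6])
out = A * B + C        # nothing has executed yet, just graph building
print(out.realize())   # [ 8. 14.]
```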
00:49:39.060 | - So what's the process of porting a model into TinyGrad?
00:49:44.060 | - So TinyGrad's front end looks very similar to PyTorch.
00:49:47.580 | I probably could make a perfect
00:49:49.920 | or pretty close to perfect interop layer
00:49:51.820 | if I really wanted to.
00:49:52.940 | I think that there's some things that are nicer
00:49:54.540 | about TinyGrad syntax than PyTorch,
00:49:56.180 | but the front end looks very Torch-like.
00:49:57.900 | You can also load in ONNX models.
00:50:00.140 | We have more ONNX tests passing than Core ML.
00:50:03.360 | - Core ML, okay, so-
00:50:06.380 | - We'll pass ONNX runtime soon.
00:50:08.140 | - What about the developer experience with TinyGrad?
00:50:10.980 | What it feels like versus PyTorch?
00:50:16.740 | - By the way, I really like PyTorch.
00:50:18.100 | I think that it's actually a very good piece of software.
00:50:20.900 | I think that they've made a few different trade-offs,
00:50:23.860 | and these different trade-offs are where TinyGrad
00:50:28.460 | takes a different path.
00:50:29.700 | One of the biggest differences is it's really easy
00:50:32.060 | to see the kernels that are actually being sent to the GPU.
00:50:35.860 | If you run PyTorch on the GPU,
00:50:38.460 | you do some operation,
00:50:39.980 | and you don't know what kernels ran,
00:50:41.020 | you don't know how many kernels ran,
00:50:42.500 | you don't know how many flops were used,
00:50:44.100 | you don't know how much memory accesses were used.
00:50:46.340 | In TinyGrad, you type DEBUG=2,
00:50:48.660 | and it will show you in this beautiful style
00:50:50.900 | every kernel that's run,
00:50:53.060 | how many flops and how many bytes.
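For example, something like the following, assuming a working tinygrad install; the exact invocation and output format may differ between versions.

```python
# Assuming a working tinygrad install (invocation and output may differ by version):
#   DEBUG=2 python this_script.py
# prints each kernel as it runs, with its FLOP count and memory traffic.
from tinygrad.tensor import Tensor

a, b, c = Tensor.rand(1024, 1024), Tensor.rand(1024, 1024), Tensor.rand(1024, 1024)
out = (a @ b + c).numpy()  # kernels are only generated and executed at realization
```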
00:50:57.500 | - So can you just linger on what problem TinyGrad solves?
00:51:04.260 | - TinyGrad solves the problem of porting
00:51:06.300 | new ML accelerators quickly.
00:51:08.940 | One of the reasons, tons of these companies now,
00:51:12.060 | I think Sequoia marked Graphcore to zero, right?
00:51:16.740 | Cerebras, Tenstorrent, Groq,
00:51:20.660 | all of these ML accelerator companies, they built chips.
00:51:24.340 | The chips were good.
00:51:25.540 | The software was terrible.
00:51:26.860 | And part of the reason is because,
00:51:29.340 | I think the same problem is happening with Dojo.
00:51:31.380 | It's really, really hard to write a PyTorch port,
00:51:35.020 | because you have to write 250 kernels,
00:51:37.460 | and you have to tune them all for performance.
00:51:40.340 | - What does Jim Keller think about TinyGrad?
00:51:42.940 | You guys hung out quite a bit,
00:51:45.620 | so he was involved, he's involved with Tenstorrent.
00:51:49.820 | What's his praise and what's his criticism
00:51:52.420 | of what you're doing with your life?
00:51:54.820 | - Look, my prediction for Tenstorrent
00:51:58.420 | is that they're gonna pivot to making RISC-V chips.
00:52:01.940 | CPUs.
00:52:03.060 | - CPUs.
00:52:04.060 | - Yeah.
00:52:05.220 | - Why?
00:52:06.060 | - Well, because AI accelerators are a software problem,
00:52:10.580 | not really a hardware problem.
00:52:11.860 | - Oh, interesting.
00:52:12.700 | So you don't think,
00:52:13.540 | you think the diversity of AI accelerators
00:52:17.180 | in the hardware space is not going to be a thing
00:52:19.740 | that exists long-term?
00:52:21.340 | - I think what's gonna happen is, if I can finish, okay.
00:52:25.580 | If you're trying to make an AI accelerator,
00:52:28.140 | you better have the capability of writing
00:52:31.140 | a torch-level performance stack on NVIDIA GPUs.
00:52:35.260 | If you can't write a torch stack on NVIDIA GPUs,
00:52:37.660 | and I mean all the way, I mean down to the driver,
00:52:39.740 | there's no way you're gonna be able to write it on your chip,
00:52:41.860 | because your chip's worse than an NVIDIA GPU.
00:52:43.900 | The first version of the chip you tape out,
00:52:45.260 | it's definitely worse.
00:52:46.540 | - Oh, you're saying writing that stack is really tough.
00:52:48.540 | - Yes, and not only that,
00:52:49.820 | actually, the chip that you tape out,
00:52:51.420 | almost always 'cause you're trying to get advantage
00:52:52.940 | over NVIDIA, you're specializing the hardware more.
00:52:55.500 | It's always harder to write software
00:52:57.100 | for more specialized hardware.
00:52:59.060 | A GPU's pretty generic,
00:53:00.500 | and if you can't write an NVIDIA stack,
00:53:02.580 | there's no way you can write a stack for your chip.
00:53:05.200 | So my approach with TinyGrad is,
00:53:07.540 | first, write a performant NVIDIA stack.
00:53:09.600 | We're targeting AMD.
00:53:10.740 | - So you did say a few to NVIDIA a little bit, with love.
00:53:16.020 | - With love.
00:53:16.860 | - Yeah, but-- - With love.
00:53:17.700 | It's like the Yankees, you know?
00:53:19.020 | I'm a Mets fan.
00:53:20.060 | - Oh, you're a Mets fan, a RISC fan and a Mets fan.
00:53:24.340 | What's the hope that AMD has?
00:53:26.060 | You did a build with AMD recently that I saw.
00:53:30.540 | How does the 7900XTX compare to the RTX 4090 or 4080?
00:53:35.540 | - Well, let's start with the fact
00:53:39.700 | that the 7900XTX kernel drivers don't work,
00:53:42.740 | and if you run demo apps in loops, it panics the kernel.
00:53:46.180 | - Okay, so this is a software issue?
00:53:48.400 | - Lisa Su responded to my email.
00:53:51.300 | - Oh. - I reached out.
00:53:52.180 | I was like, this is, you know, really?
00:53:56.780 | Like, I understand if your 7x7 transposed Winograd conv
00:54:01.540 | is slower than NVIDIA's,
00:54:02.900 | but literally when I run demo apps in a loop,
00:54:05.580 | the kernel panics?
00:54:08.140 | - So just adding that loop.
00:54:09.540 | - Yeah, I just literally took their demo apps
00:54:12.180 | and wrote like while true semicolon do the app
00:54:14.860 | semicolon done in a bunch of screens.
00:54:17.580 | Right, this is like the most primitive fuzz testing.
00:54:20.380 | - Why do you think that is?
00:54:21.940 | They're just not seeing a market in machine learning?
00:54:26.620 | - They're changing, they're trying to change.
00:54:28.700 | They're trying to change,
00:54:29.660 | and I had a pretty positive interaction with them this week.
00:54:31.900 | Last week I went on YouTube, I was just like, that's it.
00:54:34.220 | I give up on AMD.
00:54:35.300 | Like, this is, their driver doesn't even,
00:54:37.500 | I'm not gonna, you know, I'll go with Intel GPUs.
00:54:41.220 | Intel GPUs have better drivers.
00:54:42.780 | - So you're kind of spearheading
00:54:46.940 | the diversification of GPUs?
00:54:50.580 | - Yeah, and I'd like to extend
00:54:52.020 | that diversification to everything.
00:54:53.660 | I'd like to diversify the, right, the more,
00:54:58.020 | my central thesis about the world is
00:55:02.660 | there's things that centralize power and they're bad,
00:55:04.860 | and there's things that decentralize power and they're good.
00:55:07.820 | Everything I can do to help decentralize power,
00:55:09.980 | I'd like to do.
00:55:10.820 | - So you're really worried about the centralization
00:55:14.220 | of Nvidia, that's interesting.
00:55:15.340 | And you don't have a fundamental hope
00:55:17.260 | for the proliferation of ASICs, except in the cloud.
00:55:22.260 | - I'd like to help them with software.
00:55:25.260 | No, actually, there's only, the only ASIC
00:55:27.100 | that is remotely successful is Google's TPU.
00:55:29.940 | And the only reason that's successful is
00:55:31.540 | because Google wrote a machine learning framework.
00:55:34.940 | I think that you have to write a competitive machine
00:55:37.380 | learning framework in order to be able to build an ASIC.
00:55:40.140 | - You think Meta with PyTorch builds a competitor?
00:55:45.060 | - I hope so.
00:55:46.100 | - Okay. - They have one.
00:55:46.940 | They have an internal one.
00:55:48.020 | - Internal, I mean, public facing
00:55:50.060 | with a nice cloud interface and so on.
00:55:52.380 | - I don't want a cloud.
00:55:53.940 | - You don't like cloud?
00:55:54.780 | - I don't like cloud.
00:55:55.860 | - What do you think is the fundamental limitation of cloud?
00:55:58.420 | - Fundamental limitation of cloud is who owns the off switch.
00:56:01.900 | - So it's power to the people.
00:56:03.820 | - Yeah.
00:56:04.780 | - And you don't like the man to have all the power.
00:56:07.340 | - Exactly.
00:56:08.180 | - All right.
00:56:09.420 | And right now, the only way to do that is with Nvidia GPUs
00:56:11.980 | if you want performance and stability.
00:56:15.720 | - Interesting.
00:56:17.400 | It's a costly investment emotionally to go with AMDs.
00:56:21.680 | Well, let me sort of on a tangent ask you,
00:56:24.780 | you've built quite a few PCs.
00:56:28.080 | What's your advice on how to build a good custom PC
00:56:31.240 | for, let's say, for the different applications that you use
00:56:33.720 | for gaming, for machine learning?
00:56:35.760 | - Well, you shouldn't build one.
00:56:36.640 | You should buy a box from the tiny corp.
00:56:39.440 | - I heard rumors, whispers about this box in the tiny corp.
00:56:44.480 | What's this thing look like?
00:56:45.720 | What is it?
00:56:46.560 | What is it called?
00:56:47.640 | - It's called the tiny box.
00:56:48.800 | - Tiny box.
00:56:49.760 | - It's $15,000.
00:56:51.640 | And it's almost a petaflop of compute.
00:56:54.680 | It's over a hundred gigabytes of GPU RAM.
00:56:57.740 | It's over five terabytes per second of GPU memory bandwidth.
00:57:01.920 | I'm gonna put like four NVMEs in RAID.
00:57:07.000 | You're gonna get like 20, 30 gigabytes per second
00:57:09.720 | of drive read bandwidth.
00:57:11.920 | I'm gonna build like the best deep learning box that I can
00:57:16.880 | that plugs into one wall outlet.
00:57:18.480 | - Okay.
00:57:20.140 | Can you go through those specs again a little bit
00:57:21.520 | from your, from memory?
00:57:23.120 | - Yeah, so it's almost a petaflop of compute.
00:57:25.000 | - So AMD Intel.
00:57:26.780 | - Today, I'm leaning toward AMD.
00:57:28.780 | But we're pretty agnostic to the type of compute.
00:57:33.600 | The main limiting spec is a 120 volt 15 amp circuit.
00:57:38.600 | - Okay.
00:57:41.720 | - Because in order to like,
00:57:43.080 | like there's a plug over there, right?
00:57:45.240 | You have to be able to plug it in.
00:57:47.980 | We're also gonna sell the tiny rack,
00:57:51.240 | which like what's the most power you can get
00:57:53.760 | into your house without arousing suspicion?
00:57:56.440 | And one of the answers is an electric car charger.
00:57:59.760 | - Wait, where does the rack go?
00:58:01.800 | - Your garage.
00:58:03.200 | - Interesting.
00:58:04.280 | The car charger.
00:58:05.480 | - A wall outlet is about 1500 Watts.
00:58:07.960 | A car charger is about 10,000 Watts.
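Roughly where those two numbers come from; the 80 percent continuous-load derating and the 240 V / 40 A charger figures are my assumptions, since the conversation only gives the rounded totals.

```python
# Back-of-the-envelope power budgets. The 80% continuous derating and the
# 240 V / 40 A charger are my assumptions; the conversation gives rounded totals.
wall_outlet_w = 120 * 15            # 1800 W circuit, roughly 1500 W usable
car_charger_w = 240 * 40            # about 9600 W, i.e. "about 10,000 watts"
print(wall_outlet_w * 0.8, car_charger_w)   # 1440.0 9600
```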
00:58:10.040 | - Okay.
00:58:11.360 | What is the most amount of power you can get your hands on
00:58:14.840 | without arousing suspicion?
00:58:16.120 | - That's right.
00:58:16.960 | - George Hotz.
00:58:17.780 | Okay.
00:58:18.620 | So the tiny box and you said NVMEs and RAID.
00:58:22.600 | I forget what you said about memory,
00:58:25.520 | all that kind of stuff.
00:58:26.440 | Okay.
00:58:27.280 | So what about what GPUs?
00:58:29.120 | - Again, probably 7900 XTXs,
00:58:32.280 | but maybe 3090s, maybe A770s.
00:58:35.600 | For the Intel.
00:58:36.440 | - You're flexible or still exploring?
00:58:39.280 | - I'm still exploring.
00:58:40.480 | I wanna deliver a really good experience to people.
00:58:44.200 | And yeah, what GPUs I end up going with,
00:58:46.760 | again, I'm leaning toward AMD.
00:58:48.440 | We'll see.
00:58:49.760 | You know, in my email, what I said to AMD is like,
00:58:53.400 | just dumping the code on GitHub is not open source.
00:58:56.360 | Open source is a culture.
00:58:57.600 | Open source means that your issues
00:59:00.440 | are not all one year old stale issues.
00:59:03.280 | Open source means developing in public.
00:59:06.760 | And if you guys can commit to that,
00:59:08.720 | I see a real future for AMD as a competitor to Nvidia.
00:59:11.480 | - Well, I'd love to get a tiny box to MIT.
00:59:16.000 | So whenever it's ready, let's do it.
00:59:18.440 | - We're taking pre-orders.
00:59:19.280 | I took this from Elon.
00:59:20.240 | I'm like, all right, $100 fully refundable pre-orders.
00:59:23.280 | - Is it gonna be like the Cybertruck
00:59:24.600 | is gonna take a few years or?
00:59:26.200 | - No, I'll try to do it faster.
00:59:27.400 | It's a lot simpler.
00:59:28.360 | It's a lot simpler than a truck.
00:59:29.960 | - Well, there's complexities,
00:59:31.400 | not to just the putting the thing together,
00:59:34.240 | but like shipping and all this kind of stuff.
00:59:35.880 | - The thing that I wanna deliver to people out of the box
00:59:38.160 | is being able to run 65 billion parameter LLAMA
00:59:41.200 | in FP16 in real time in like a good,
00:59:44.040 | like 10 tokens per second or five tokens per second
00:59:45.840 | or something.
00:59:46.680 | - Just it works.
00:59:48.000 | - Yep, just it works.
00:59:48.840 | - LLAMA's running or something like LLAMA.
00:59:52.800 | - Experience, yeah, or I think Falcon is the new one.
00:59:55.920 | Experience a chat with the largest language model
00:59:58.800 | that you can have in your house.
01:00:00.520 | - Yeah, from a wall plug.
01:00:02.560 | - From a wall plug, yeah.
01:00:03.600 | Actually for inference,
01:00:05.240 | it's not like even more power would help you get more.
01:00:07.880 | - Even more power wouldn't get you more.
01:00:10.960 | - Well, no, there's just the biggest model released
01:00:12.920 | is 65 billion parameter LLAMA as far as I know.
01:00:16.240 | - So it sounds like Tiny Box will naturally pivot
01:00:18.800 | towards company number three
01:00:20.320 | 'cause you could just get the girlfriend
01:00:22.400 | and or boyfriend.
01:00:25.120 | - That one's harder actually.
01:00:27.360 | - The boyfriend is harder?
01:00:28.320 | - The boyfriend's harder, yeah.
01:00:29.160 | - I think that's a very biased statement.
01:00:32.000 | I think a lot of people would just say,
01:00:33.840 | what, why is it harder to replace a boyfriend
01:00:38.720 | than a girlfriend with the artificial LLM?
01:00:41.200 | - Because women are attracted to status and power
01:00:43.480 | and men are attracted to youth and beauty.
01:00:45.580 | No, I mean, that's what I mean.
01:00:49.000 | - Both are mimicable easy through the language model.
01:00:51.920 | - No, no machines do not have any status or real power.
01:00:56.160 | - I don't know.
01:00:57.080 | I think you both, well, first of all,
01:00:59.120 | you're using language mostly to communicate youth
01:01:04.120 | and beauty and power and status.
01:01:07.640 | - But status fundamentally is a zero sum game, right?
01:01:10.160 | Whereas youth and beauty are not.
01:01:12.160 | - No, I think status is a narrative you can construct.
01:01:15.240 | I don't think status is real.
01:01:17.020 | - I don't know.
01:01:19.640 | I just think that that's why it's harder.
01:01:21.280 | You know, yeah, maybe it is my biases.
01:01:23.160 | - I think status is way easier to fake.
01:01:25.400 | - I also think that men are probably more desperate
01:01:28.320 | and more likely to buy my product.
01:01:29.600 | So maybe they're a better target market.
01:01:31.680 | - Desperation is interesting.
01:01:33.400 | Easier to fool.
01:01:34.920 | - Yeah.
01:01:35.760 | - I could see that.
01:01:36.840 | - Yeah, look, I mean, look, I know you can look
01:01:38.280 | at porn viewership numbers, right?
01:01:39.940 | A lot more men watch porn than women.
01:01:41.720 | - Yeah.
01:01:42.560 | - You can ask why that is.
01:01:43.680 | - Wow, there's a lot of questions than answers
01:01:46.480 | you can get there.
01:01:47.440 | Anyway, with the tiny box, how many GPUs in tiny box?
01:01:53.880 | - Six.
01:01:54.720 | (laughing)
01:01:57.960 | - Oh, man.
01:01:59.440 | - And I'll tell you why it's six.
01:02:00.480 | - Yeah.
01:02:01.320 | - So AMD Epyc processors have 128 lanes of PCIe.
01:02:05.520 | I wanna leave enough lanes for some drives.
01:02:11.600 | And I wanna leave enough lanes for some networking.
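The lane arithmetic behind "six", assuming a full x16 link per GPU; the x16 figure is my assumption, only the 128-lane total is stated above.

```python
# Lane budget behind "six GPUs" (x16 per GPU is my assumption; only the 128-lane
# total for the AMD EPYC is stated above).
total_lanes, gpus, lanes_per_gpu = 128, 6, 16
left_over = total_lanes - gpus * lanes_per_gpu
print(left_over)  # 32 lanes left for NVMe drives and networking
```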
01:02:15.840 | - How do you do cooling for something like this?
01:02:17.680 | - Ah, that's one of the big challenges.
01:02:19.840 | Not only do I want the cooling to be good,
01:02:21.700 | I want it to be quiet.
01:02:22.920 | I want the tiny box to be able to sit comfortably
01:02:25.040 | in your room, right?
01:02:25.880 | - This is really going towards the girlfriend thing.
01:02:28.480 | 'Cause you want to run the LLM.
01:02:31.320 | - I'll give a more, I mean, I can talk about
01:02:33.560 | how it relates to company number one.
01:02:36.040 | - Comm AI.
01:02:36.880 | - Yeah.
01:02:37.880 | - Well, but yes, quiet, oh, quiet because you may be
01:02:41.520 | potentially wanna run it in a car?
01:02:43.040 | - No, no, quiet because you wanna put this thing
01:02:44.960 | in your house and you want it to coexist with you.
01:02:46.880 | If it's screaming at 60 dB, you don't want that
01:02:48.880 | in your house, you'll kick it out.
01:02:49.880 | - 60 dB, yeah.
01:02:51.200 | - Yeah, I want like 40, 45.
01:02:52.760 | - So how do you make the cooling quiet?
01:02:55.080 | That's an interesting problem in itself.
01:02:57.120 | - A key trick is to actually make it big.
01:02:58.800 | Ironically, it's called the tiny box.
01:03:00.760 | But if I can make it big, a lot of that noise
01:03:02.960 | is generated because of high pressure air.
01:03:05.600 | If you look at like a 1U server, a 1U server
01:03:08.200 | has these super high pressure fans that are like
01:03:10.080 | super deep and they're like Genesis.
01:03:12.400 | Versus if you have something that's big,
01:03:14.600 | well, I can use a big, you know, they call them
01:03:16.720 | big ass fans, those ones that are like huge
01:03:18.600 | on the ceiling and they're completely silent.
01:03:21.360 | - So tiny box will be big.
01:03:24.720 | - It is the, I do not want it to be large
01:03:27.920 | according to UPS.
01:03:29.200 | I want it to be shippable as a normal package,
01:03:31.000 | but that's my constraint there.
01:03:32.840 | - Interesting.
01:03:33.760 | Well, the fan stuff, can't it be assembled
01:03:36.440 | on location or no?
01:03:38.040 | - No.
01:03:38.880 | - Oh, it has to be, well, you're--
01:03:41.600 | - Look, I want to give you a great
01:03:42.520 | out of the box experience.
01:03:43.440 | I want you to lift this thing out.
01:03:44.520 | I want it to be like the Mac, you know?
01:03:47.000 | Tiny box.
01:03:48.000 | - The Apple experience.
01:03:49.040 | - Yeah.
01:03:50.520 | - I love it.
01:03:51.400 | Okay, and so tiny box would run
01:03:54.480 | TinyGrad, like what do you envision
01:03:58.120 | this whole thing to look like?
01:03:59.400 | We're talking about like Linux with a full
01:04:04.400 | software engineering environment,
01:04:06.920 | and it's just not PyTorch, but TinyGrad.
01:04:10.360 | - Yeah, we did a poll of people want Ubuntu or Arch.
01:04:12.840 | We're gonna stick with Ubuntu.
01:04:14.400 | - Ooh, interesting.
01:04:15.240 | What's your favorite flavor of Linux?
01:04:17.600 | - Ubuntu.
01:04:18.440 | - Ubuntu.
01:04:19.360 | I like Ubuntu Mate, however you pronounce that, mate.
01:04:23.480 | So how do you, you've gotten Lama into TinyGrad.
01:04:27.200 | You've gotten stable diffusion into TinyGrad.
01:04:29.080 | What was that like?
01:04:29.920 | Can you comment on like, what are these models?
01:04:34.440 | What's interesting about porting them?
01:04:36.800 | What's, yeah, like what are the challenges?
01:04:39.240 | What's naturally, what's easy, all that kind of stuff.
01:04:41.800 | - There's a really simple way to get these models
01:04:43.880 | into TinyGrad, and you can just export them as ONNX,
01:04:46.400 | and then TinyGrad can run ONNX.
01:04:47.960 | So the ports that I did of Lama, stable diffusion,
01:04:52.080 | and now Whisper are more academic
01:04:54.520 | to teach me about the models,
01:04:56.320 | but they are cleaner than the PyTorch versions.
01:04:58.880 | You can read the code.
01:04:59.720 | I think the code is easier to read.
01:05:01.320 | It's less lines.
01:05:02.880 | There's just a few things
01:05:03.800 | about the way TinyGrad writes things.
01:05:05.320 | Here's a complaint I have about PyTorch.
01:05:07.800 | nn.ReLU is a class, right?
01:05:10.720 | So when you create an nn.Module,
01:05:12.880 | you'll put your nn.ReLUs in the __init__.
01:05:17.520 | And this makes no sense.
01:05:18.440 | ReLU is completely stateless.
01:05:20.640 | Why should that be a class?
01:05:22.040 | - But that's more like a software engineering thing,
01:05:25.920 | or do you think it has a cost on performance?
01:05:28.080 | - Oh no, it doesn't have a cost on performance.
01:05:30.440 | But yeah, no, I think that it's,
01:05:32.520 | that's what I mean about TinyGrad's front end being cleaner.
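The complaint in code form, as a PyTorch-only sketch with made-up module names: the same tiny network written with nn.ReLU held as state versus the stateless functional call.

```python
import torch.nn as nn
import torch.nn.functional as F

# The same tiny network two ways. nn.ReLU is stateless, yet the module style
# stores it in __init__ like a layer with weights; the functional style does not.

class WithReLUModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)
        self.relu = nn.ReLU()          # stateless op held as module state
    def forward(self, x):
        return self.relu(self.fc(x))

class FunctionalReLU(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 8)      # only the stateful pieces live here
    def forward(self, x):
        return F.relu(self.fc(x))      # relu is just a function call
```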
01:05:35.760 | - Ah, I see.
01:05:37.200 | What do you think about Mojo?
01:05:38.040 | I don't know if you've been paying attention
01:05:39.360 | to the programming language that does some interesting ideas
01:05:43.040 | that kind of intersect TinyGrad.
01:05:46.080 | - I think that there is a spectrum,
01:05:48.200 | and on one side you have Mojo,
01:05:50.040 | and on the other side you have like GGML.
01:05:52.600 | GGML is this like, we're gonna run LLAMA fast on Mac.
01:05:56.600 | Okay, we're gonna expand out to a little bit,
01:05:58.120 | but we're gonna basically go to like depth first, right?
01:06:01.120 | Mojo is like, we're gonna go breadth first.
01:06:02.880 | We're gonna go so wide
01:06:03.920 | that we're gonna make all of Python fast,
01:06:05.920 | and TinyGrad's in the middle.
01:06:07.360 | TinyGrad is, we are going to make neural networks fast.
01:06:11.280 | - Yeah, but they try to really get it to be fast,
01:06:17.400 | compile down to a specific hardware,
01:06:20.720 | and make that compilation step
01:06:22.680 | as flexible and resilient as possible.
01:06:25.800 | - Yeah, but they have Turing completeness.
01:06:27.920 | - And that limits you.
01:06:29.480 | - Turing-- - That's what you're seeing.
01:06:30.520 | It's somewhere in the middle.
01:06:31.640 | So you're actually going to be targeting some accelerators,
01:06:34.720 | some, like some number, not one.
01:06:38.760 | - My goal is step one,
01:06:41.360 | build an equally performant stack to PyTorch
01:06:44.000 | on NVIDIA and AMD, but with way less lines.
01:06:48.080 | And then step two is, okay, how do we make an accelerator?
01:06:51.120 | Right, but you need step one.
01:06:52.440 | You have to first build the framework
01:06:54.040 | before you can build the accelerator.
01:06:56.080 | - Can you explain MLPerf?
01:06:57.560 | What's your approach in general
01:06:59.560 | to benchmarking TinyGrad performance?
01:07:02.080 | - So I'm much more of a, like,
01:07:06.640 | build it the right way and worry about performance later.
01:07:09.920 | There's a bunch of things where I haven't even like
01:07:13.720 | really dove into performance.
01:07:15.000 | The only place where TinyGrad
01:07:16.440 | is competitive performance-wise right now
01:07:18.360 | is on Qualcomm GPUs.
01:07:19.800 | So TinyGrad's actually used in OpenPilot to run the model.
01:07:23.360 | So the driving model is TinyGrad.
01:07:25.400 | - When did that happen, that transition?
01:07:27.720 | - About eight months ago now.
01:07:29.200 | And it's 2x faster than Qualcomm's library.
01:07:33.280 | - What's the hardware that OpenPilot runs on, the CommAI?
01:07:38.120 | - It's a Snapdragon 845.
01:07:40.280 | - Okay. - So this is using the GPU.
01:07:42.000 | So the GPU's an Adreno GPU.
01:07:44.600 | There's like different things.
01:07:46.040 | There's a really good Microsoft paper
01:07:47.560 | that talks about like mobile GPUs
01:07:49.520 | and why they're different from desktop GPUs.
01:07:52.480 | One of the big things is in a desktop GPU,
01:07:55.080 | you can use buffers; on a mobile GPU, image textures
01:07:59.000 | are a lot faster.
01:07:59.840 | - On a mobile GPU image textures, okay.
01:08:04.840 | And so you want to be able to leverage that.
01:08:08.280 | - I wanna be able to leverage it
01:08:09.520 | in a way that it's completely generic, right?
01:08:11.560 | So there's a lot of,
01:08:12.520 | Xiaomi has a pretty good open source library
01:08:14.840 | for mobile GPUs called MACE,
01:08:16.560 | where they can generate, where they have these kernels,
01:08:19.400 | but they're all hand-coded, right?
01:08:21.320 | So that's great if you're doing three by three convs.
01:08:23.800 | That's great if you're doing dense matmuls,
01:08:25.600 | but the minute you go off the beaten path a tiny bit,
01:08:28.600 | well, your performance is nothing.
01:08:30.600 | - Since you mentioned OpenPilot,
01:08:31.800 | I'd love to get an update in the company number one,
01:08:35.640 | CommAI world.
01:08:37.120 | How are things going there in the development
01:08:40.200 | of semi-autonomous driving?
01:08:42.600 | - You know, almost no one talks about FSD anymore,
01:08:48.640 | and even less people talk about OpenPilot.
01:08:51.240 | We've solved the problem, like we solved it years ago.
01:08:54.320 | - What's the problem exactly?
01:08:57.280 | - Well, how do you-
01:08:58.120 | - What does solving it mean?
01:09:00.560 | - Solving means how do you build a model
01:09:02.360 | that outputs a human policy for driving?
01:09:04.680 | How do you build a model that, given
01:09:07.680 | a reasonable set of sensors,
01:09:09.640 | outputs a human policy for driving?
01:09:11.400 | So you have companies like Waymo and Cruise,
01:09:14.520 | which are hand-coding these things
01:09:15.920 | that are like quasi-human policies.
01:09:18.320 | Then you have Tesla,
01:09:22.800 | and maybe even to more of an extent, Kama,
01:09:25.320 | asking, "Okay, how do we just learn
01:09:26.400 | the human policy from data?"
01:09:27.800 | The big thing that we're doing now,
01:09:31.080 | and we just put it out on Twitter,
01:09:34.280 | at the beginning of Kama,
01:09:35.640 | we published a paper called "Learning a Driving Simulator."
01:09:39.800 | And the way this thing worked was it was an autoencoder,
01:09:44.520 | and then an RNN in the middle, right?
01:09:48.520 | You take an autoencoder, you compress the picture,
01:09:51.480 | you use an RNN, predict the next state,
01:09:53.760 | and these things were, you know,
01:09:55.520 | it was a laughably bad simulator.
01:09:57.160 | Like this is 2015 era machine learning technology.
01:10:00.360 | Today, we have VQVAE and transformers.
01:10:03.960 | We're building DriveGPT, basically.
01:10:06.760 | - DriveGPT, okay.
01:10:08.920 | And it's trained on what?
01:10:12.160 | Is it trained in a self-supervised way?
01:10:14.360 | - Yeah, it's trained on all the driving data
01:10:16.120 | to predict the next frame.
01:10:17.600 | - So really trying to learn a human policy.
01:10:20.840 | What would a human do?
01:10:21.680 | - Well, actually, our simulator's conditioned on the pose.
01:10:24.200 | So it's actually a simulator.
01:10:25.440 | You can put in like a state action pair
01:10:27.000 | and get out the next state.
01:10:28.760 | - Okay.
01:10:29.920 | - And then once you have a simulator,
01:10:31.920 | you can do RL in the simulator
01:10:34.080 | and RL will get us that human policy.
01:10:36.800 | - So it transfers.
01:10:38.280 | - Yeah.
01:10:39.760 | RL with a reward function,
01:10:41.720 | not asking is this close to the human policy,
01:10:43.760 | but asking would a human disengage if you did this behavior?
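A heavily simplified sketch of that loop, in my own framing rather than anything from Comma: a learned simulator steps (state, action) to the next state, and the policy is scored by a stand-in "would a human disengage?" predictor.

```python
import numpy as np

# Heavily simplified sketch (my framing, not Comma's code): a learned simulator maps
# (state, action) to the next state, and the policy is trained against a reward of
# "would a human have disengaged here?" predicted by another stand-in model.

def simulator(state, action):            # stand-in for the learned driving simulator
    return state + 0.1 * action          # placeholder dynamics

def p_disengage(state, action):          # stand-in for a learned disengagement predictor
    return float(abs(action) > 1.0)      # placeholder: jerky actions get disengaged

def rollout_return(policy_gain, steps=100):
    state, total = np.ones(4), 0.0
    for _ in range(steps):
        action = policy_gain * state.mean()     # toy one-parameter policy
        total += -p_disengage(state, action)    # reward: avoid human disengagement
        state = simulator(state, action)
    return total

best_gain = max(np.linspace(-2, 2, 41), key=rollout_return)  # crude policy search
print("best gain:", best_gain)
```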
01:10:47.480 | - Okay, let me think about the distinction there.
01:10:50.040 | Would a human disengage?
01:10:51.640 | Would a human disengage?
01:10:54.880 | That correlates, I guess, with human policy,
01:10:58.840 | but it could be different.
01:11:00.200 | So it doesn't just say, what would a human do?
01:11:03.440 | It says, what would a good human driver do?
01:11:06.920 | And such that the experience is comfortable,
01:11:09.720 | but also not annoying in that like the thing
01:11:12.840 | is very cautious.
01:11:14.600 | So it's finding a nice balance.
01:11:16.400 | That's interesting, that's a nice--
01:11:17.840 | - It's asking exactly the right question.
01:11:20.080 | What will make our customers happy?
01:11:22.020 | - Right.
01:11:23.880 | - A system that you never wanna disengage.
01:11:25.400 | - 'Cause usually disengagement is almost always a sign
01:11:29.640 | of I'm not happy with what the system is doing.
01:11:32.120 | - Usually.
01:11:32.960 | There's some that are just, I felt like driving,
01:11:35.160 | and those are always fine too,
01:11:36.480 | but they're just gonna look like noise in the data.
01:11:39.200 | - But even I felt like driving.
01:11:41.520 | - Maybe, yeah.
01:11:42.680 | - Even that's a signal, like, why do you feel like driving?
01:11:45.600 | You need to recalibrate your relationship with the car.
01:11:50.600 | Okay, so that's really interesting.
01:11:54.180 | How close are we to solving self-driving?
01:11:59.400 | - It's hard to say.
01:12:01.040 | We haven't completely closed the loop yet.
01:12:03.440 | So we don't have anything built
01:12:04.840 | that truly looks like that architecture yet.
01:12:07.180 | We have prototypes and there's bugs.
01:12:10.080 | So we are a couple bug fixes away.
01:12:12.960 | Might take a year, might take 10.
01:12:15.120 | - What's the nature of the bugs?
01:12:16.720 | Are these major philosophical bugs, logical bugs?
01:12:20.920 | What kind of bugs are we talking about?
01:12:23.240 | - They're just like stupid bugs,
01:12:24.440 | and also we might just need more scale.
01:12:26.720 | We just massively expanded our compute cluster at Comma.
01:12:30.000 | We now have about two people worth of compute,
01:12:33.520 | 40 petaflops.
01:12:34.440 | - Well, people are different.
01:12:38.880 | - Yeah, 20 petaflops, that's a person.
01:12:41.040 | I mean, it's just a unit, right?
01:12:42.260 | Horses are different too,
01:12:43.100 | but we still call it a horsepower.
01:12:44.960 | - Yeah, but there's something different about mobility
01:12:47.520 | than there is about perception and action
01:12:51.280 | in a very complicated world, but yes.
01:12:53.800 | - Well, yeah, of course, not all flops are created equal.
01:12:55.720 | If you have randomly initialized weights, it's not gonna.
01:12:58.480 | - Not all flops are created equal.
01:13:00.640 | - Some flops are doing way more useful things than others.
01:13:03.040 | - Yeah, yeah.
01:13:04.320 | Tell me about it.
01:13:06.520 | Okay, so more data.
01:13:07.840 | Scale means more scale in compute
01:13:09.560 | or scale in scale of data?
01:13:11.580 | - Both.
01:13:12.420 | - Diversity of data?
01:13:15.720 | - Diversity is very important in data.
01:13:17.620 | Yeah, I mean, we have, so we have about,
01:13:21.440 | I think we have like 5,000 daily actives.
01:13:25.600 | - How would you evaluate how FSD is doing?
01:13:29.000 | - Pretty well. - In self-driving?
01:13:30.040 | - Pretty well.
01:13:31.160 | - How's that race going between Comma AI and FSD?
01:13:34.320 | - Tesla is always one to two years ahead of us.
01:13:36.280 | They've always been one to two years ahead of us,
01:13:38.160 | and they probably always will be
01:13:39.360 | because they're not doing anything wrong.
01:13:41.560 | - What have you seen that's, since the last time we talked,
01:13:43.640 | that are interesting architectural decisions,
01:13:45.360 | training decisions, like the way they deploy stuff,
01:13:48.160 | the architectures they're using in terms of the software,
01:13:51.000 | how the teams are run, all that kind of stuff,
01:13:52.520 | data collection, anything interesting?
01:13:54.600 | - I mean, I know they're moving
01:13:55.680 | toward more of an end-to-end approach.
01:13:58.040 | - So creeping towards end-to-end as much as possible
01:14:01.760 | across the whole thing,
01:14:03.160 | the training, the data collection, everything.
01:14:05.120 | - They also have a very fancy simulator.
01:14:06.920 | They're probably saying all the same things we are.
01:14:08.880 | They're probably saying we just need to optimize,
01:14:10.640 | you know, what is the reward?
01:14:12.320 | Well, you get a negative reward for disengagement, right?
01:14:14.080 | Like, everyone kind of knows this.
01:14:16.000 | It's just a question of who can actually build
01:14:17.360 | and deploy the system.
01:14:18.960 | - Yeah, I mean, this requires good software engineering,
01:14:21.880 | I think. - Yeah.
01:14:23.240 | - And the right kind of hardware.
01:14:25.840 | - Yeah, and the hardware to run it.
01:14:27.920 | - You still don't believe in cloud in that regard?
01:14:30.420 | - I have a compute cluster in my office, 800 amps.
01:14:36.400 | - Tiny grad.
01:14:37.560 | - It's 40 kilowatts at idle, our data center.
01:14:40.240 | That's crazy.
01:14:41.240 | We have 40 kilowatts just burning
01:14:42.560 | just when the computers are idle.
01:14:44.080 | - Just when I-- - Sorry, sorry, compute cluster.
01:14:46.680 | (laughing)
01:14:48.060 | - Compute cluster, I got it.
01:14:49.160 | - It's not a data center.
01:14:50.120 | - Yeah, yeah.
01:14:50.940 | - No, data centers are clouds.
01:14:52.160 | We don't have clouds.
01:14:53.880 | Data centers have air conditioners.
01:14:55.040 | We have fans.
01:14:56.200 | That makes it a compute cluster.
01:14:57.800 | - I'm guessing this is a kind of a legal distinction
01:15:02.440 | as compared to-- - Sure, yeah.
01:15:03.360 | We have a compute cluster.
01:15:05.280 | - You said that you don't think LLMs have consciousness,
01:15:07.760 | or at least not more than a chicken.
01:15:09.600 | Do you think they can reason?
01:15:12.360 | Is there something interesting to you
01:15:13.560 | about the word reason, about some of the capabilities
01:15:16.340 | that we think is kind of human,
01:15:18.040 | to be able to integrate complicated information
01:15:23.040 | and through a chain of thought,
01:15:27.800 | arrive at a conclusion that feels novel,
01:15:31.120 | a novel integration of disparate facts?
01:15:35.460 | - Yeah, I don't think that there's,
01:15:39.200 | I think that they can reason better than a lot of people.
01:15:42.200 | - Hey, isn't that amazing to you, though?
01:15:44.360 | Isn't that like an incredible thing
01:16:45.600 | that a transformer could achieve?
01:15:47.640 | - I mean, I think that calculators can add better
01:15:50.640 | than a lot of people.
01:15:52.200 | - But language feels like,
01:15:54.040 | reasoning through the process of language,
01:15:56.400 | which looks a lot like thought.
01:15:59.240 | - Making brilliancies in chess,
01:16:02.840 | which feels a lot like thought.
01:16:04.880 | Whatever new thing that AI can do,
01:16:07.040 | everybody thinks is brilliant,
01:16:08.280 | and then like 20 years go by,
01:16:09.480 | and they're like, well, yeah, but chess,
01:16:10.520 | that's like mechanical.
01:16:11.400 | Like adding, that's like mechanical.
01:16:13.000 | - So you think language is not that special?
01:16:14.960 | It's like chess?
01:16:15.960 | - It's like chess, and it's like--
01:16:16.800 | - No, no, no, because it's very human, we take it,
01:16:20.240 | listen, there is something different
01:16:23.080 | between chess and language.
01:16:25.800 | Chess is a game that a subset of population plays.
01:16:29.200 | Language is something we use nonstop
01:16:32.120 | for all of our human interaction,
01:16:34.360 | and human interaction is fundamental to society.
01:16:37.240 | So it's like, holy shit,
01:16:38.760 | this language thing is not so difficult
01:16:42.600 | to like create in a machine.
01:16:45.960 | The problem is if you go back to 1960,
01:16:48.480 | and you tell them that you have a machine
01:16:50.400 | that can play amazing chess,
01:16:54.000 | of course someone in 1960 will tell you
01:16:55.680 | that machine is intelligent.
01:16:57.560 | Someone in 2010 won't, what's changed, right?
01:17:00.440 | Today, we think that these machines
01:17:02.000 | that have language are intelligent,
01:17:04.160 | but I think in 20 years, we're gonna be like,
01:17:05.720 | yeah, but can it reproduce?
01:17:07.220 | - So reproduction, yeah, we may redefine
01:17:11.440 | what it means to be, what is it,
01:17:14.280 | a high-performance living organism on Earth?
01:17:17.320 | - Humans are always gonna define a niche for themselves.
01:17:19.920 | Like, well, we're better than the machines
01:17:21.720 | because we can, and they tried creative for a bit,
01:17:24.840 | but no one believes that one anymore.
01:17:26.920 | - But niche, is that delusional,
01:17:29.880 | or is there some accuracy to that?
01:17:31.360 | Because maybe with chess, you start to realize
01:17:33.640 | that we have ill-conceived notions
01:17:38.440 | of what makes humans special,
01:17:41.760 | like the apex organism on Earth.
01:17:44.680 | - Yeah, and I think maybe we're gonna go through
01:17:48.440 | that same thing with language,
01:17:49.940 | and that same thing with creativity.
01:17:52.780 | - But language carries these notions of truth and so on,
01:17:57.320 | and so we might be like, wait,
01:17:58.980 | maybe truth is not carried by language.
01:18:01.720 | Maybe there's like a deeper thing.
01:18:03.200 | - The niche is getting smaller.
01:18:05.120 | - Oh, boy.
01:18:05.960 | - But no, no, no, you don't understand.
01:18:09.240 | Humans are created by God,
01:18:10.720 | and machines are created by humans, therefore.
01:18:13.320 | Right, like that'll be the last niche we have.
01:18:16.000 | - So what do you think about
01:18:17.360 | the rapid development of LLMs?
01:18:19.000 | If we could just stick on that.
01:18:20.880 | It's still incredibly impressive, like with ChatGPT.
01:18:23.160 | Just even ChatGPT, what are your thoughts
01:18:24.700 | about reinforcement learning with human feedback
01:18:27.380 | on these large language models?
01:18:28.980 | - I'd like to go back to when calculators first came out,
01:18:33.480 | or computers, and I wasn't around.
01:18:37.400 | Look, I'm 33 years old.
01:18:39.640 | And to like see how that affected,
01:18:43.520 | like society.
01:18:47.880 | - Maybe you're right.
01:18:48.720 | So I wanna put on the big picture hat here.
01:18:53.720 | - Oh my God, a refrigerator, wow.
01:18:55.840 | - Refrigerator, electricity, all that kind of stuff.
01:18:59.200 | But, you know, with the internet,
01:19:03.040 | large language models seeming human-like,
01:19:05.200 | basically passing a Turing test,
01:19:07.880 | it seems it might have really, at scale,
01:19:10.720 | rapid transformative effects on society.
01:19:13.680 | But you're saying like other technologies have as well.
01:19:17.160 | So maybe calculator's not the best example of that?
01:19:20.680 | 'Cause that just seems like, well, no, maybe.
01:19:23.760 | Calculator-- - But the poor milk man.
01:19:25.560 | The day he learned about refrigerators,
01:19:27.200 | he's like, "I'm done."
01:19:28.240 | You're telling me you can just keep the milk in your house?
01:19:32.360 | You don't need me to deliver it every day?
01:19:33.760 | I'm done.
01:19:34.920 | - Well, yeah, you have to actually look
01:19:36.200 | at the practical impacts of certain technologies
01:19:38.440 | that they've had.
01:19:40.360 | Yeah, probably electricity's a big one.
01:19:42.040 | And also how rapidly it's spread.
01:19:43.760 | The internet's a big one.
01:19:46.120 | - I do think it's different this time, though.
01:19:48.080 | - Yeah, it just feels like--
01:19:49.400 | - The niche is getting smaller.
01:19:51.600 | - The niche is humans.
01:19:52.840 | - Yes.
01:19:53.800 | - That makes humans special.
01:19:55.280 | - Yes.
01:19:56.120 | - It feels like it's getting smaller rapidly, though.
01:19:59.360 | Doesn't it?
01:20:00.200 | Or is that just a feeling we dramatize everything?
01:20:02.760 | - I think we dramatize everything.
01:20:04.080 | I think that you ask the milk man
01:20:06.720 | when he saw refrigerators,
01:20:08.400 | "Are they gonna have one of these in every home?"
01:20:10.800 | (laughing)
01:20:12.440 | - Yeah, yeah, yeah.
01:20:14.440 | Yeah, but, boy, these are impressive.
01:20:18.840 | So much more impressive than seeing
01:20:21.320 | a chess world champion AI system.
01:20:23.400 | - I disagree, actually.
01:20:25.720 | I disagree.
01:20:27.240 | I think things like MuZero and AlphaGo
01:20:29.520 | are so much more impressive.
01:20:31.640 | Because these things are playing
01:20:33.640 | beyond the highest human level.
01:20:35.960 | The language models are writing middle school level essays
01:20:41.720 | and people are like, "Wow, it's a great essay.
01:20:43.880 | "It's a great five paragraph essay
01:20:45.400 | "about the causes of the Civil War."
01:20:47.280 | - Okay, forget the Civil War,
01:20:50.120 | just generating code, Codex.
01:20:50.120 | - Oh!
01:20:51.400 | - You're saying it's mediocre code.
01:20:53.560 | - Terrible.
01:20:54.400 | - But I don't think it's terrible.
01:20:55.760 | I think it's just mediocre code.
01:20:58.040 | - Yeah.
01:20:58.880 | - Often close to correct.
01:21:01.680 | Like for mediocre purposes.
01:21:03.000 | - That's the scariest kind of code.
01:21:04.800 | I spend 5% of time typing and 95% of time debugging.
01:21:08.000 | The last thing I want is close to correct code.
01:21:10.720 | I want a machine that can help me with the debugging,
01:21:12.640 | not with the typing.
01:21:13.640 | - You know, it's like L2, level two driving,
01:21:16.680 | similar kind of thing.
01:21:17.520 | Yeah, you still should be a good programmer
01:21:21.040 | in order to modify.
01:21:22.600 | I wouldn't even say debugging,
01:21:24.000 | it's just modifying the code, reading it.
01:21:25.840 | - I actually don't think it's like level two driving.
01:21:28.480 | I think driving is not tool complete and programming is.
01:21:30.920 | Meaning you don't use the best possible tools to drive.
01:21:34.080 | You're not like,
01:21:36.840 | cars have basically the same interface
01:21:39.480 | for the last 50 years.
01:21:40.560 | - Yep.
01:21:41.400 | - Computers have a radically different interface.
01:21:42.960 | - Okay, can you describe the concept of tool complete?
01:21:46.440 | - Yeah.
01:21:47.280 | So think about the difference between a car from 1980
01:21:49.440 | and a car from today.
01:21:50.360 | - Yeah.
01:21:51.200 | - No difference really.
01:21:52.040 | It's got a bunch of pedals, it's got a steering wheel.
01:21:54.320 | Great.
01:21:55.160 | Maybe now it has a few ADAS features,
01:21:57.240 | but it's pretty much the same car.
01:21:58.640 | Right, you have no problem getting into a 1980 car
01:22:00.560 | and driving it.
01:22:01.400 | Take a programmer today who spent their whole life
01:22:04.240 | doing JavaScript and you put them in an Apple IIe prompt
01:22:07.400 | and you tell them about the line numbers in BASIC.
01:22:09.840 | But how do I insert something between line 17 and 18?
01:22:15.320 | Oh, wow.
01:22:16.320 | - But the, so in tool you're putting in
01:22:21.080 | the programming languages.
01:22:22.600 | So it's just the entire stack of the tooling.
01:22:24.840 | - Exactly.
01:22:25.680 | - So it's not just like the, like IDEs or something like this,
01:22:27.600 | it's everything.
01:22:28.640 | - Yes, it's IDEs, the languages, the runtimes,
01:22:30.480 | it's everything.
01:22:31.360 | And programming is tool complete.
01:22:33.280 | So like almost if Codex or Copilot are helping you,
01:22:38.280 | that actually probably means that your framework
01:22:41.720 | or library is bad and there's too much boilerplate in it.
01:22:44.520 | - Yeah, but don't you think so much programming
01:22:49.400 | has boilerplate?
01:22:50.640 | - TinyGrad is now 2,700 lines
01:22:54.200 | and it can run LLAMA and stable diffusion.
01:22:56.680 | And all of this stuff is in 2,700 lines.
01:23:00.320 | Boilerplate and abstraction indirections
01:23:04.280 | and all these things are just bad code.
01:23:06.960 | - Well, let's talk about good code and bad code.
01:23:13.400 | There's a, I would say, I don't know,
01:23:16.000 | for generic scripts that I write just offhand,
01:23:19.080 | like 80% of it is written by GPT.
01:23:22.760 | Just like quick, quick, like offhand stuff.
01:23:25.160 | So not like libraries, not like performing code,
01:23:27.960 | not stuff for robotics and so on, just quick stuff.
01:23:31.000 | Because your basic, so much of programming
01:23:33.160 | is doing some, yeah, boilerplate.
01:23:36.520 | But to do so efficiently and quickly,
01:23:38.980 | 'cause you can't really automate it fully
01:23:42.960 | with like generic method, like a generic kind of ID
01:23:45.960 | type of recommendation or something like this.
01:23:50.000 | You do need to have some of the complexity
01:23:52.000 | of language models.
01:23:53.400 | - Yeah, I guess if I was really writing,
01:23:55.640 | maybe today if I wrote a lot of data parsing stuff,
01:23:59.760 | I mean, I don't play CTFs anymore,
01:24:00.880 | but if I still played CTFs, a lot of it is just like
01:24:02.880 | you have to write a parser for this data format.
01:24:05.200 | Like I wonder, or like Advent of Code,
01:24:08.440 | I wonder when the models are gonna start to help
01:24:10.680 | with that kind of code.
01:24:11.760 | And they may, they may.
01:24:13.240 | And the models also may help you with speed.
01:24:15.760 | And the models are very fast.
01:24:17.240 | But where the models won't,
01:24:19.440 | my programming speed is not at all limited
01:24:22.600 | by my typing speed.
01:24:24.920 | And in very few cases it is, yes.
01:24:29.200 | If I'm writing some script to just like parse
01:24:31.400 | some weird data format, sure.
01:24:33.440 | My programming speed is limited by my typing speed.
01:24:35.360 | - What about looking stuff up?
01:24:36.920 | 'Cause that's essentially a more efficient lookup, right?
01:24:39.480 | - You know, when I was at Twitter,
01:24:42.160 | I tried to use ChatGPT to like ask some questions,
01:24:46.240 | like what's the API for this?
01:24:48.160 | And it would just hallucinate.
01:24:49.960 | It would just give me completely made up API functions
01:24:52.720 | that sounded real.
01:24:54.440 | - Well, do you think that's just a temporary kind of stage?
01:24:57.360 | - No.
01:24:58.640 | - You don't think it'll get better and better and better
01:25:00.320 | in this kind of stuff?
01:25:01.160 | 'Cause like it only hallucinates stuff in the edge cases.
01:25:04.200 | - Yes, yes.
01:25:05.040 | - If you're writing generic code, it's actually pretty good.
01:25:06.800 | - Yes, if you are writing an absolute basic
01:25:08.880 | like React app with a button,
01:25:10.280 | it's not gonna hallucinate, sure.
01:25:12.240 | No, there's kind of ways to fix the hallucination problem.
01:25:14.840 | I think Facebook has an interesting paper,
01:25:16.480 | it's called Atlas.
01:25:17.760 | And it's actually weird the way that we do language models
01:25:20.840 | right now where all of the information is in the weights.
01:25:25.840 | And the human brain is not really like this.
01:25:27.680 | It's like a hippocampus and a memory system.
01:25:29.720 | So why don't LLMs have a memory system?
01:25:31.840 | And there's people working on them.
01:25:33.000 | I think future LLMs are gonna be like smaller,
01:25:36.160 | but are going to run looping on themselves
01:25:39.360 | and are going to have retrieval systems.
01:25:41.720 | And the thing about using a retrieval system
01:25:43.440 | is you can cite sources explicitly.
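A minimal sketch of that retrieval idea (illustrative only, not the Atlas architecture): look up the closest documents, feed them in as context, and cite them by id so a human can check the sources.

```python
# Minimal retrieval-augmented sketch (illustrative only; not the Atlas architecture):
# fetch the closest documents, prepend them as context, and cite them by id.

docs = {
    "doc1": "tinygrad exposes roughly 25 low-level ops",
    "doc2": "PyTorch 2.0 introduced PrimTorch with about 250 ops",
}

def score(query, doc):
    # toy lexical-overlap relevance; a real system would use learned embeddings
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query, k=1):
    return sorted(docs, key=lambda name: -score(query, docs[name]))[:k]

def answer(query):
    sources = retrieve(query)
    context = " ".join(docs[s] for s in sources)
    # a real system would feed context + query to the language model here
    return f"{context} [sources: {', '.join(sources)}]"

print(answer("how many ops does tinygrad have"))
```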
01:25:45.680 | - Which is really helpful to integrate
01:25:50.360 | the human into the loop of the thing.
01:25:52.640 | 'Cause you can go check the sources
01:25:54.000 | and you can investigate.
01:25:55.400 | So whenever the thing is hallucinating,
01:25:57.240 | you can like have the human supervision.
01:25:59.720 | That's pushing it towards level two kind of journey.
01:26:01.600 | - That's gonna kill Google.
01:26:03.360 | - Wait, which part?
01:26:04.280 | - When someone makes an LLM that's capable
01:26:06.080 | of citing its sources, it will kill Google.
01:26:08.840 | - LLM that's citing its sources
01:26:10.320 | because that's basically a search engine.
01:26:13.120 | - That's what people want in a search engine.
01:26:14.640 | - But also Google might be the people that build it.
01:26:16.760 | - Maybe.
01:26:17.600 | - And put ads on it.
01:26:18.960 | - I'd count them out.
01:26:20.880 | - Why is that?
01:26:21.720 | What do you think?
01:26:22.560 | Who wins this race?
01:26:24.920 | We got, who are the competitors?
01:26:27.320 | We got Tiny Corp.
01:26:29.600 | I don't know if that's,
01:26:30.880 | yeah, I mean, you're a legitimate competitor in that.
01:26:33.800 | - I'm not trying to compete on that.
01:26:35.620 | - You're not?
01:26:36.460 | - No, not as a competitor.
01:26:37.280 | - You're just gonna accidentally stumble
01:26:38.120 | into that competition.
01:26:39.680 | - Maybe.
01:26:40.520 | - You don't think you might build a search engine
01:26:41.360 | to replace Google search?
01:26:43.240 | - When I started Comma, I said over and over again,
01:26:46.560 | I'm going to win self-driving cars.
01:26:47.960 | I still believe that.
01:26:49.520 | I have never said I'm going to win search
01:26:51.960 | with the Tiny Corp,
01:26:52.800 | and I'm never going to say that 'cause I won't.
01:26:55.360 | - The night is still young.
01:26:56.640 | We don't, you don't know how hard is it
01:26:58.400 | to win search in this new world.
01:27:00.520 | Like, it's, it feels, I mean,
01:27:03.000 | one of the things that ChatGPT kind of shows
01:27:04.680 | that there could be a few interesting tricks
01:27:06.920 | that really have, that create a really compelling product.
01:27:09.160 | - Some startup's gonna figure it out.
01:27:10.700 | I think, I think if you ask me,
01:27:12.540 | like, Google's still the number one webpage,
01:27:14.200 | I think by the end of the decade,
01:27:15.160 | Google won't be the number one webpage anymore.
01:27:17.400 | - So you don't think Google,
01:27:19.000 | because of the, how big the corporation is?
01:27:21.760 | - Look, I would put a lot more money on Mark Zuckerberg.
01:27:25.000 | - Why is that?
01:27:25.840 | - Because Mark Zuckerberg's alive.
01:27:29.200 | Like, this is old Paul Graham essay.
01:27:32.840 | Startups are either alive or dead.
01:27:34.600 | Google's dead.
01:27:35.440 | Facebook's alive. - Versus Facebook is alive.
01:27:38.600 | - Well, actually, meta. - Meta.
01:27:40.800 | - You see what I mean?
01:27:41.640 | Like, that's just, like, like, like Mark Zuckerberg.
01:27:43.400 | This is Mark Zuckerberg reading that Paul Graham essay
01:27:45.320 | and being like, I'm gonna show everyone
01:27:46.760 | how alive we are.
01:27:47.600 | I'm gonna change the name.
01:27:49.080 | - So you don't think there's this gutsy pivoting engine
01:27:53.960 | that, like, Google doesn't have that,
01:27:57.720 | the kind of engine that a startup has, like, constantly.
01:28:00.520 | - You know what? - Being alive, I guess.
01:28:03.000 | - When I listened to your Sam Altman podcast,
01:28:05.760 | he talked about the button.
01:28:06.600 | Everyone who talks about AI talks about the button,
01:28:08.080 | the button to turn it off, right?
01:28:09.800 | Do we have a button to turn off Google?
01:28:12.560 | Is anybody in the world capable of shutting Google down?
01:28:16.600 | - What does that mean exactly?
01:28:18.240 | The company or the search engine?
01:28:19.720 | - Could we shut the search engine down?
01:28:21.000 | Could we shut the company down?
01:28:23.280 | Either.
01:28:24.280 | - Can you elaborate on the value of that question?
01:28:26.480 | - Does Sundar Pichai have the authority
01:28:28.480 | to turn off google.com tomorrow?
01:28:30.200 | - Who has the authority?
01:28:32.640 | That's a good question, right? - Does anyone?
01:28:34.280 | - Does anyone?
01:28:35.120 | Yeah, I'm sure.
01:28:37.560 | - Are you sure?
01:28:38.640 | No, they have the technical power,
01:28:40.240 | but do they have the authority?
01:28:41.640 | Let's say Sundar Pichai made this his sole mission.
01:28:44.640 | He came into Google tomorrow and said,
01:28:46.000 | "I'm gonna shut google.com down."
01:28:47.800 | I don't think he'd keep his position too long.
01:28:50.560 | - And what is the mechanism
01:28:53.600 | by which he wouldn't keep his position?
01:28:55.280 | - Well, boards and shares and corporate undermining
01:28:59.520 | and oh my God, our revenue is zero now.
01:29:02.760 | - Okay, so what's the case you're making here?
01:29:05.160 | So the capitalist machine prevents you
01:29:07.360 | from having the button?
01:29:09.160 | - Yeah, and it will have a,
01:29:10.600 | I mean, this is true for the AIs too, right?
01:29:12.320 | There's no turning the AIs off.
01:29:14.640 | There's no button, you can't press it.
01:29:16.800 | Now, does Mark Zuckerberg have that button
01:29:19.080 | for facebook.com?
01:29:20.080 | - Yes, probably more.
01:29:22.280 | - I think he does.
01:29:23.520 | I think he does, and this is exactly what I mean
01:29:25.720 | and why I bet on him so much more than I bet on Google.
01:29:29.120 | - I guess you could say Elon has similar stuff.
01:29:31.280 | - Oh, Elon has the button.
01:29:32.760 | - Yeah.
01:29:33.600 | - Elon, does Elon, can Elon fire the missiles?
01:29:36.840 | Can he fire the missiles?
01:29:39.000 | - I think some questions are better left unasked.
01:29:42.320 | - Right?
01:29:43.680 | I mean, you know, a rocket and an ICBM,
01:29:45.440 | well, you're a rocket that can land anywhere.
01:29:47.080 | Is that an ICBM?
01:29:48.080 | Well, you know, don't ask too many questions.
01:29:51.080 | - My God.
01:29:52.000 | But the positive side of the button
01:29:57.240 | is that you can innovate aggressively, is what you're saying,
01:30:00.560 | which is what's required with turning LLM
01:30:03.440 | into a search engine.
01:30:04.320 | - I would bet on a startup.
01:30:05.320 | I bet on- - 'Cause it's so easy, right?
01:30:06.560 | - I bet on something that looks like Midjourney,
01:30:08.120 | but for search.
01:30:09.560 | - Just is able to cite sources, loop on itself.
01:30:13.920 | I mean, it just feels like one model can take off.
01:30:15.840 | - Yeah. - Right?
01:30:16.680 | And that nice wrapper and some of it,
01:30:18.920 | I mean, it's hard to like create a product
01:30:21.240 | that just works really nicely, stably.
01:30:23.480 | - The other thing that's gonna be cool
01:30:25.120 | is there is some aspect of a winner-take-all effect, right?
01:30:28.280 | Like once someone starts deploying a product
01:30:31.920 | that gets a lot of usage, and you see this with OpenAI,
01:30:34.840 | they are going to get the dataset
01:30:36.640 | to train future versions of the model.
01:30:39.000 | - Yeah. - They are going to be able to,
01:30:41.320 | you know, I was actually at Google Image Search
01:30:42.720 | when I worked there like almost 15 years ago now.
01:30:44.920 | How does Google know which image is an apple?
01:30:46.800 | And I said the metadata.
01:30:48.000 | And they're like, yeah, that works about half the time.
01:30:49.840 | How does Google know?
01:30:50.960 | You'll see they're all apples on the front page
01:30:52.400 | when you search apple.
01:30:54.200 | And I don't know, I didn't come up with the answer.
01:30:57.000 | The guy's like, well, it's what people click on
01:30:58.200 | when they search apple.
01:30:59.040 | I'm like, oh, yeah.
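
The trick in that story, treating what people click for a given query as implicit relevance labels, can be sketched in a few lines; the click-log format below is an assumption made up for illustration.

```python
# Sketch of using click logs as implicit relevance labels, as in the
# "which image is an apple?" story. The log format is an illustrative assumption.

from collections import defaultdict

# (query, image_id) pairs, one entry per observed click
click_log = [
    ("apple", "img_red_fruit"),
    ("apple", "img_red_fruit"),
    ("apple", "img_macbook"),
    ("apple", "img_green_fruit"),
]

def rank_images(query: str, log: list[tuple[str, str]]) -> list[tuple[str, int]]:
    """Rank images for a query by how often users clicked them for that query."""
    counts: dict[str, int] = defaultdict(int)
    for q, image_id in log:
        if q == query:
            counts[image_id] += 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

print(rank_images("apple", click_log))
# [('img_red_fruit', 2), ('img_macbook', 1), ('img_green_fruit', 1)]
```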
01:31:00.280 | - Yeah, yeah, that data is really, really powerful.
01:31:02.680 | It's the human supervision.
01:31:04.440 | What do you think are the chances?
01:31:06.000 | What do you think in general about LLaMA being open sourced?
01:31:09.480 | I just did a conversation with Mark Zuckerberg,
01:31:13.880 | and he's all in on open source.
01:31:16.400 | - Who would have thought that Mark Zuckerberg
01:31:19.600 | would be the good guy?
01:31:20.700 | I mean it.
01:31:23.360 | - Who would have thought anything in this world?
01:31:25.800 | It's hard to know.
01:31:27.280 | But open source to you ultimately is a good thing here.
01:31:32.280 | - Undoubtedly.
01:31:35.520 | You know, what's ironic about all these AI safety people
01:31:39.600 | is they are going to build the exact thing they fear.
01:31:42.200 | These we need to have one model that we control and align,
01:31:47.720 | this is the only way you end up paper clipped.
01:31:50.640 | There's no way you end up paper clipped
01:31:52.440 | if everybody has an AI.
01:31:54.040 | - So open sourcing is the way
01:31:55.360 | to fight the paper clip maximizer.
01:31:56.920 | - Absolutely.
01:31:58.200 | It's the only way.
01:31:59.160 | You think you're gonna control it?
01:32:00.480 | You're not gonna control it.
01:32:02.040 | - So the criticism you have for the AI safety folks
01:32:05.200 | is that there is a belief and a desire for control.
01:32:10.200 | And that belief and desire for centralized control
01:32:13.760 | of dangerous AI systems is not good.
01:32:16.840 | - Sam Altman won't tell you that GPT-4
01:32:19.120 | has 220 billion parameters
01:32:21.200 | and is a 16-way mixture model with eight sets of weights?
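
For readers unfamiliar with the term, a "mixture" model in this sense routes each input to a small subset of expert sub-networks through a learned gate instead of one monolithic stack. The toy sizes and routing rule below only illustrate that general idea; nothing about GPT-4's actual architecture is confirmed, and the claim in the conversation is just that, a claim.

```python
# Toy sketch of mixture-of-experts routing. All sizes are illustrative only;
# this does not describe GPT-4's (unconfirmed) architecture.

import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]  # expert weights
router = rng.normal(size=(d_model, n_experts))                             # gating weights

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route the input to its top-k experts and mix their outputs by gate weight."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]                        # indices of the chosen experts
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen experts
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

print(moe_forward(rng.normal(size=d_model)).shape)  # (8,)
```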
01:32:24.100 | - Who did you have to murder to get that information?
01:32:28.040 | All right.
01:32:30.400 | - I mean look, everyone at OpenAI knows
01:32:32.200 | what I just said was true.
01:32:34.000 | Now, ask the question really.
01:32:36.960 | You know, it upsets me when I, like GPT-2.
01:32:40.120 | When OpenAI came out with GPT-2
01:32:41.680 | and raised a whole fake AI safety thing about that,
01:32:44.080 | I mean now the model is laughable.
01:32:46.520 | Like they used AI safety to hype up their company
01:32:50.480 | and it's disgusting.
01:32:51.480 | - Or the flip side of that is they used
01:32:56.160 | a relatively weak model in retrospect
01:32:58.780 | to explore how do we do AI safety correctly?
01:33:01.880 | How do we release things?
01:33:02.800 | How do we go through the process?
01:33:04.360 | I don't know if--
01:33:05.800 | - Sure, sure.
01:33:06.640 | All right, all right, all right.
01:33:07.480 | That's the charitable interpretation.
01:33:10.240 | - I don't know how much hype there is in AI safety,
01:33:12.080 | honestly.
01:33:12.920 | - Oh, there's so much hype.
01:33:13.740 | At least on Twitter, I don't know.
01:33:14.920 | Maybe Twitter's not real life.
01:33:15.760 | - Twitter's not real life.
01:33:17.920 | Come on.
01:33:18.760 | In terms of hype, I mean, I don't,
01:33:21.040 | I think OpenAI has been finding an interesting balance
01:33:24.080 | between transparency and putting value on AI safety.
01:33:29.080 | You think just go all out open source
01:32:33.720 | or do a LLaMA.
01:33:33.720 | - Absolutely, yeah.
01:33:34.560 | - So do like open source.
01:33:37.120 | This is a tough question,
01:33:38.360 | which is open source both the base,
01:33:41.280 | the foundation model and the fine-tuned one.
01:33:44.200 | So like the model that can be ultra racist and dangerous
01:33:48.720 | and like tell you how to build a nuclear weapon.
01:33:51.800 | - Oh my God, have you met humans?
01:33:53.600 | Right?
01:33:54.440 | Like half of these AI--
01:33:55.280 | - I haven't met most humans.
01:33:57.480 | This makes, this allows you to meet every human.
01:34:00.520 | - Yeah, I know, but half of these AI alignment problems
01:34:02.960 | are just human alignment problems.
01:34:04.480 | And that's what's also so scary
01:34:05.840 | about the language they use.
01:34:06.900 | It's like, it's not the machines you want to align, it's me.
01:34:09.860 | - But here's the thing.
01:34:13.360 | It makes it very accessible to ask
01:34:17.960 | questions where the answers have dangerous consequences
01:34:23.080 | if you were to act on them.
01:34:25.240 | - I mean, yeah.
01:34:26.800 | Welcome to the world.
01:34:28.400 | - Well, no, for me, there's a lot of friction.
01:34:30.560 | If I want to find out how to, I don't know,
01:34:35.480 | blow up something.
01:34:36.760 | - No, there's not a lot of friction.
01:34:38.040 | That's so easy.
01:34:39.040 | - No, like what do I search?
01:34:40.560 | Do I use Bing or do I, which search engine do I use?
01:34:43.080 | - No, there's like lots of stuff.
01:34:44.920 | - No, it feels like I have to keep clicking a lot of this.
01:34:47.480 | - Anyone who's stupid enough to search for
01:34:49.360 | how to blow up a building in my neighborhood
01:34:52.300 | is not smart enough to build a bomb, right?
01:34:54.600 | - Are you sure about that?
01:34:55.440 | - Yes.
01:34:57.040 | I feel like a language model makes it more accessible
01:35:02.040 | for that person who's not smart enough to do--
01:35:05.440 | - They're not gonna build a bomb, trust me.
01:35:07.240 | The people who are incapable of figuring out
01:35:11.340 | how to ask that question a bit more academically
01:35:13.720 | and get a real answer from it are not capable
01:35:15.760 | of procuring the materials, which are somewhat controlled,
01:35:17.880 | to build a bomb.
01:35:19.460 | - No, I think it makes it more accessible
01:35:21.300 | to people with money without the technical know-how, right?
01:35:24.880 | - To build a, like, do you really need to know
01:35:27.160 | how to build a bomb to build a bomb?
01:35:29.020 | You can hire people, you can find like--
01:35:30.520 | - Oh, you can hire people to build a,
01:35:32.240 | you know what, I was asking this question on my stream,
01:35:33.720 | like, can Jeff Bezos hire a hitman?
01:35:35.340 | Probably not.
01:35:36.180 | - But a language model can probably help you out.
01:35:41.720 | - Yeah, and you'll still go to jail, right?
01:35:43.240 | Like, it's not like the language model is God.
01:35:45.080 | Like, the language model, it's like,
01:35:46.440 | it's you literally just hired someone on Fiverr.
01:35:49.400 | Like, you--
01:35:50.240 | - But, okay, okay, GPT-4, in terms of finding a hitman,
01:35:52.920 | it's like asking Fiverr how to find a hitman.
01:35:54.920 | I understand, but don't you think--
01:35:56.360 | - Asking Wikihow, you know?
01:35:57.200 | - Wikihow, but don't you think GPT-5 will be better?
01:36:00.320 | 'Cause don't you think that information
01:36:01.440 | is out there on the internet?
01:36:02.960 | - I mean, yeah, and I think that if someone
01:36:05.040 | is actually serious enough to hire a hitman
01:36:07.280 | or build a bomb, they'd also be serious enough
01:36:09.320 | to find the information.
01:36:10.800 | - I don't think so.
01:36:11.720 | I think it makes it more accessible.
01:36:13.000 | If you have enough money to buy a hitman,
01:36:15.800 | I think it decreases the friction
01:36:18.460 | of how hard is it to find that kind of hitman.
01:36:20.760 | I honestly think there's a jump in ease and scale
01:36:25.760 | of how much harm you can do.
01:36:28.840 | And I don't mean harm with language,
01:36:30.280 | I mean harm with actual violence.
01:36:32.000 | - What you're basically saying is like,
01:36:33.480 | okay, what's gonna happen is these people
01:36:34.920 | who are not intelligent are going to use machines
01:36:38.060 | to augment their intelligence.
01:36:39.800 | And now, intelligent people and machines,
01:36:42.140 | intelligence is scary.
01:36:43.640 | Intelligent agents are scary.
01:36:45.960 | When I'm in the woods, the scariest animal to meet
01:36:48.320 | is a human, right?
01:36:50.080 | No, no, no, no.
01:36:50.920 | Look, there's like nice California humans.
01:36:52.840 | Like I see you're wearing like, you know,
01:36:54.960 | street clothes and Nikes, all right, fine.
01:36:57.000 | But you look like you've been a human
01:36:58.240 | who's been in the woods for a while?
01:36:59.400 | - Yeah.
01:37:00.240 | - I'm more scared of you than a bear.
01:37:01.360 | - That's what they say about the Amazon.
01:37:03.080 | When you go to the Amazon, it's the human tribes.
01:37:05.200 | - Oh yeah.
01:37:06.280 | So intelligence is scary, right?
01:37:09.080 | So to ask this question in a generic way,
01:37:11.640 | you're like, what if we took everybody
01:37:13.560 | who maybe has ill intention but is not so intelligent
01:37:17.380 | and gave them intelligence?
01:37:19.600 | Right?
01:37:20.520 | So we should have intelligence control, of course.
01:37:23.720 | We should only give intelligence to good people.
01:37:25.600 | And that is the absolutely horrifying idea.
01:37:27.960 | - So to use the best defense is actually,
01:37:30.040 | the best defense is to give more intelligence
01:37:32.440 | to the good guys and intelligence,
01:37:34.200 | give intelligence to everybody.
01:37:35.320 | - Give intelligence to everybody.
01:37:36.240 | You know what, it's not even like guns, right?
01:37:37.440 | Like people say this about guns.
01:37:38.320 | You know, what's the best defense against a bad guy
01:37:39.760 | with a gun, a good guy with a gun?
01:37:40.920 | Like I kind of subscribe to that,
01:37:42.160 | but I really subscribe to that with intelligence.
01:37:44.600 | - Yeah, in a fundamental way, I agree with you.
01:37:48.200 | But there's just feels like so much uncertainty
01:37:50.120 | and so much can happen rapidly
01:37:51.720 | that you can lose a lot of control
01:37:53.160 | and you can do a lot of damage.
01:37:54.640 | - Oh no, we can lose control?
01:37:56.280 | Yes, thank God.
01:37:58.080 | - Yeah.
01:37:58.920 | - I hope we can, I hope they lose control.
01:38:00.880 | I'd want them to lose control more than anything else.
01:38:05.340 | - I think when you lose control, you can do a lot of damage,
01:38:07.800 | but you can do more damage when you centralize
01:38:11.120 | and hold onto control is the point here.
01:38:12.720 | - Centralized and held control is tyranny, right?
01:38:15.720 | I will always, I don't like anarchy either,
01:38:17.560 | but I will always take anarchy over tyranny.
01:38:19.200 | Anarchy, you have a chance.
01:38:20.560 | - This human civilization we've got going on
01:38:24.280 | is quite interesting.
01:38:25.560 | I mean, I agree with you.
01:38:26.400 | So to you, open source is the way forward here.
01:38:30.640 | So you admire what Facebook is doing here
01:38:32.360 | or what Meta is doing with the release of the--
01:38:34.400 | - A lot.
01:38:35.240 | - Yeah.
01:38:36.080 | - I lost $80,000 last year investing in Meta.
01:38:38.280 | And when they released LLaMA, I'm like, yeah, whatever, man.
01:38:40.480 | That was worth it.
01:38:41.720 | - It was worth it.
01:38:43.160 | Do you think Google and OpenAI with Microsoft will match?
01:38:47.040 | What Meta is doing or no?
01:38:49.760 | - So if I were a researcher,
01:38:52.320 | why would you wanna work at OpenAI?
01:38:53.880 | Like, you're on the bad team.
01:38:56.960 | Like, I mean it.
01:38:57.800 | Like you're on the bad team who can't even say
01:38:59.240 | that GPT-4 has 220 billion parameters.
01:39:01.480 | - So close source to use the bad team.
01:39:03.920 | - Not only close source.
01:39:05.120 | I'm not saying you need to make your model weights open.
01:39:08.080 | I'm not saying that.
01:39:09.080 | I totally understand we're keeping our model weights closed
01:39:11.320 | because that's our product, right?
01:39:12.600 | That's fine.
01:39:13.960 | I'm saying like, because of AI safety reasons,
01:39:17.000 | we can't tell you the number of billions
01:39:19.320 | of parameters in the model.
01:39:21.160 | That's just the bad guys.
01:39:23.000 | - Just because you're mocking AI safety
01:39:24.920 | doesn't mean it's not real.
01:39:26.440 | - Oh, of course.
01:39:27.280 | - Is it possible that these things can really do
01:39:29.320 | a lot of damage that we don't know?
01:39:31.000 | - Oh my God, yes.
01:39:32.120 | Intelligence is so dangerous,
01:39:34.000 | be it human intelligence or machine intelligence.
01:39:36.680 | Intelligence is dangerous.
01:39:38.320 | - But machine intelligence is so much easier
01:39:40.360 | to deploy at scale, like rapidly.
01:39:42.960 | Like what, okay.
01:39:44.280 | If you have human-like bots on Twitter,
01:39:46.200 | - Right.
01:39:47.040 | - And you have like a thousand of them,
01:39:50.080 | create a whole narrative,
01:39:52.080 | like you can manipulate millions of people.
01:39:55.720 | - But you mean like the intelligence agencies in America
01:39:57.960 | are doing right now?
01:39:59.040 | - Yeah, but they're not doing it that well.
01:40:01.120 | It feels like you can do a lot.
01:40:03.360 | - They're doing it pretty well.
01:40:05.400 | - What?
01:40:06.240 | - I think they're doing a pretty good job.
01:40:07.600 | - I suspect they're not nearly as good
01:40:09.560 | as a bunch of GPT-fueled bots could be.
01:40:12.680 | - Well, I mean, of course, they're looking
01:40:13.800 | into the latest technologies
01:40:15.040 | for control of people, of course.
01:40:16.920 | - But I think there's a George Hotz-type character
01:40:19.080 | that can do a better job than the entirety of them.
01:40:21.440 | You don't think so?
01:40:22.280 | - No way.
01:40:23.120 | No, and I'll tell you why the George Hotz character can't.
01:40:24.720 | And I thought about this a lot with hacking, right?
01:40:26.560 | Like I can find exploits in web browsers.
01:40:27.840 | I probably still can.
01:40:28.680 | I mean, I was better out when I was 24,
01:40:29.800 | but the thing that I lack is the ability
01:40:33.160 | to slowly and steadily deploy them over five years.
01:40:35.920 | And this is what intelligence agencies are very good at.
01:40:38.400 | Right?
01:40:39.240 | Intelligence agencies don't have
01:40:40.060 | the most sophisticated technology.
01:40:42.320 | They just have,
01:40:43.720 | - Endurance?
01:40:44.560 | - Endurance.
01:40:45.400 | - Yeah.
01:40:46.240 | - And yeah, the financial backing
01:40:49.360 | and the infrastructure for the endurance.
01:40:51.960 | - So the more we can decentralize power,
01:40:54.800 | like you could make an argument, by the way,
01:40:56.700 | that nobody should have these things.
01:40:58.340 | And I would defend that argument.
01:40:59.680 | I would, like you're saying, that look,
01:41:01.480 | LLMs and AI and machine intelligence can cause a lot of harm
01:41:05.300 | so nobody should have it.
01:41:06.880 | And I will respect someone philosophically
01:41:08.800 | with that position,
01:41:09.680 | just like I will respect someone philosophically
01:41:11.240 | with the position that nobody should have guns.
01:41:13.880 | Right?
01:41:14.720 | But I will not respect philosophically
01:41:16.080 | the position that only the trusted authorities
01:41:20.000 | should have access to this.
01:41:21.440 | - Yeah.
01:41:22.280 | - Who are the trusted authorities?
01:41:23.520 | You know what?
01:41:24.360 | I'm not worried about alignment
01:41:25.920 | between AI company and their machines.
01:41:29.700 | I'm worried about alignment between me and AI company.
01:41:33.120 | - What do you think Eliezer Yudkowsky would say to you?
01:41:36.260 | Because he's really against open source.
01:41:39.840 | - I know.
01:41:40.800 | And I thought about this.
01:41:44.960 | I thought about this.
01:41:46.440 | And I think this comes down to a repeated misunderstanding
01:41:51.440 | of political power by the rationalists.
01:41:54.660 | - Interesting.
01:41:56.760 | - I think that Eliezer Yudkowsky is scared of these things.
01:42:02.260 | And I am scared of these things too.
01:42:04.080 | Everyone should be scared of these things.
01:42:05.880 | These things are scary.
01:42:08.040 | But now you ask about the two possible futures.
01:42:11.160 | One where a small trusted centralized group of people
01:42:16.160 | has them and the other where everyone has them.
01:42:19.320 | And I am much less scared of the second future
01:42:21.280 | than the first.
01:42:22.120 | - Well, there's a small trusted group of people
01:42:25.160 | that have control of our nuclear weapons.
01:42:27.200 | - There's a difference.
01:42:30.040 | Again, a nuclear weapon cannot be deployed tactically
01:42:32.720 | and a nuclear weapon is not a defense
01:42:34.240 | against a nuclear weapon.
01:42:37.320 | Except maybe in some philosophical mind game kind of way.
01:42:40.120 | - But AI is different.
01:42:43.320 | How exactly?
01:42:44.360 | - Okay, let's say the intelligence agency
01:42:47.960 | deploys a million bots on Twitter
01:42:50.200 | or a thousand bots on Twitter
01:42:51.280 | to try to convince me of a point.
01:42:53.460 | Imagine I had a powerful AI running on my computer
01:42:56.720 | saying, okay, nice PSYOP, nice PSYOP, nice PSYOP.
01:43:00.440 | Okay, here's a PSYOP.
01:43:02.320 | I filtered it out for you.
01:43:04.400 | - Yeah, I mean, so you have fundamentally hope for that,
01:43:07.760 | for the defense of PSYOP.
01:43:10.480 | - I'm not even like, I don't even mean these things
01:43:12.120 | in like truly horrible ways.
01:43:13.320 | I mean these things in straight up like ad blocker, right?
01:43:16.400 | Straight up ad blocker, I don't want ads.
01:43:18.760 | But they are always finding, you know,
01:43:20.040 | imagine I had an AI that could just block
01:43:22.460 | all the ads for me.
01:43:23.400 | - So you believe in the power of the people
01:43:27.200 | to always create an ad blocker.
01:43:29.840 | Yeah, I mean, I kind of share that belief.
01:43:33.680 | One of the deepest optimisms I have is just like,
01:43:37.200 | there's a lot of good guys.
01:43:39.240 | So to give, you shouldn't handpick them,
01:43:42.200 | just throw out powerful technology out there
01:43:45.680 | and the good guys will outnumber
01:43:47.920 | and outpower the bad guys.
01:43:49.640 | - Yeah, I'm not even gonna say there's a lot of good guys.
01:43:51.680 | I'm saying that good outnumbers bad, right?
01:43:53.760 | Good outnumbers bad.
01:43:54.680 | - In skill and performance.
01:43:56.360 | - Yeah, definitely in skill and performance,
01:43:57.900 | probably just a number too.
01:43:59.240 | Probably just in general.
01:44:00.120 | I mean, if you believe philosophically in democracy,
01:44:02.360 | you obviously believe that, that good outnumbers bad.
01:44:06.560 | And like the only, if you give it
01:44:10.440 | to a small number of people,
01:44:11.940 | there's a chance you gave it to good people,
01:44:14.680 | but there's also a chance you gave it to bad people.
01:44:16.720 | If you give it to everybody, well, if good outnumbers bad,
01:44:19.800 | then you definitely gave it to more good people than bad.
01:44:22.520 | - That's really interesting.
01:44:25.920 | So that's on the safety grounds,
01:44:27.180 | but then also, of course, there's other motivations,
01:44:29.640 | like you don't wanna give away your secret sauce.
01:44:32.200 | - Well, that's, I mean, I look, I respect capitalism.
01:44:34.440 | I don't think that, I think that it would be polite
01:44:37.200 | for you to make model architectures open source
01:44:39.360 | and fundamental breakthroughs open source.
01:44:41.800 | I don't think you have to make weights open source.
01:44:43.360 | - You know what's interesting is that,
01:44:45.960 | like there's so many possible trajectories in human history
01:44:49.200 | where you could have the next Google be open source.
01:44:53.220 | So for example, I don't know if that connection is accurate,
01:44:57.280 | but Wikipedia made a lot of interesting decisions,
01:44:59.960 | not to put ads.
01:45:01.480 | Like Wikipedia is basically open source.
01:45:03.720 | You could think of it that way.
01:45:05.720 | And like, that's one of the main websites on the internet.
01:45:08.840 | And like, it didn't have to be that way.
01:45:10.160 | It could have been like,
01:45:11.080 | Google could have created Wikipedia, put ads on it.
01:45:13.600 | You could probably run amazing ads now on Wikipedia.
01:45:16.600 | You wouldn't have to keep asking for money,
01:45:18.480 | but it's interesting, right?
01:45:20.560 | So LLaMA, open source LLaMA,
01:45:23.280 | derivatives of open source LLaMA might win the internet.
01:45:26.540 | - I sure hope so.
01:45:29.040 | I hope to see another era.
01:45:31.160 | You know, the kids today don't know
01:45:32.720 | how good the internet used to be.
01:45:35.000 | And I don't think this is just, all right, come on.
01:45:36.640 | Like everyone's nostalgic for their past,
01:45:38.240 | but I actually think the internet,
01:45:40.400 | before small groups of weaponized corporate
01:45:43.720 | and government interests took it over,
01:45:45.520 | was a beautiful place.
01:45:46.620 | - You know, those small number of companies
01:45:52.920 | have created some sexy products.
01:45:55.880 | But you're saying overall, in the long arc of history,
01:46:00.000 | the centralization of power they have,
01:46:02.600 | like suffocated the human spirit at scale.
01:46:04.840 | - Here's a question to ask
01:46:05.680 | about those beautiful, sexy products.
01:46:08.240 | Imagine 2000 Google to 2010 Google, right?
01:46:11.120 | A lot changed.
01:46:12.000 | We got Maps, we got Gmail.
01:46:14.280 | - We lost a lot of products too, I think.
01:46:16.520 | - Yeah, I mean, some were probably, we got Chrome, right?
01:46:18.960 | And now let's go from 2010, we got Android.
01:46:21.920 | Now let's go from 2010 to 2020.
01:46:24.440 | What does Google have?
01:46:25.280 | Well, search engine, Maps, Mail, Android, and Chrome.
01:46:29.440 | - Oh, I see.
01:46:30.560 | The internet was this,
01:46:34.000 | you know, "You" was Time's Person of the Year in 2006.
01:46:36.440 | - I love this.
01:46:39.280 | - It's, "You" was Time's Person of the Year in 2006, right?
01:46:41.800 | Like that's, you know, so quickly did people forget.
01:46:46.600 | And I think some of it's social media.
01:46:49.480 | I think some of it, I hope, look, I hope that,
01:46:53.280 | I don't, it's possible
01:46:54.840 | that some very sinister things happened.
01:46:56.600 | I don't know.
01:46:57.520 | I think it might just be like the effects of social media.
01:47:00.360 | But something happened in the last 20 years.
01:47:03.520 | - Oh, okay.
01:47:05.920 | So you're just being an old man who's worried about the,
01:47:08.240 | I think there's always, it goes, it's a cycle thing.
01:47:10.240 | It's ups and downs,
01:47:11.080 | and I think people rediscover the power of distributed,
01:47:13.720 | of decentralized.
01:47:15.360 | I mean, that's kind of like what the whole,
01:47:17.000 | like cryptocurrency is trying, like that,
01:47:19.040 | I think crypto is just carrying the flame of that spirit
01:47:23.240 | of like stuff should be decentralized.
01:47:24.960 | It's just such a shame that they all got rich, you know?
01:47:28.320 | - Yeah.
01:47:29.160 | - If you took all the money out of crypto,
01:47:30.560 | it would have been a beautiful place.
01:47:32.040 | - Yeah.
01:47:32.880 | - But no, I mean, these people, you know,
01:47:34.320 | they sucked all the value out of it and took it.
01:47:36.880 | - Yeah, money kind of corrupts the mind somehow.
01:47:40.720 | It becomes this drug.
01:47:41.920 | - You corrupted all of crypto.
01:47:43.480 | You had coins worth billions of dollars that had zero use.
01:47:46.880 | - You still have hope for crypto?
01:47:51.120 | - Sure.
01:47:51.960 | I have hope for the ideas.
01:47:52.800 | I really do.
01:47:54.560 | Yeah, I mean, you know,
01:47:56.520 | I want the US dollar to collapse.
01:48:00.320 | I do.
01:48:03.480 | - George Hotz.
01:48:04.320 | Well, let me sort of on the AI safety,
01:48:08.120 | do you think there's some interesting questions there though
01:48:11.360 | to solve for the open source community in this case?
01:48:13.560 | So like alignment, for example, or the control problem.
01:48:17.480 | Like if you really have super powerful,
01:48:19.400 | you said it's scary.
01:48:21.000 | - Oh yeah.
01:48:21.840 | - What do we do with it?
01:48:22.680 | Not control, not centralized control,
01:48:24.240 | but like if you were then,
01:48:27.040 | you're gonna see some guy or gal
01:48:30.320 | release a super powerful language model, open source.
01:48:34.040 | And here you are, George Hotz thinking,
01:48:35.920 | holy shit, okay, what ideas do I have to combat this thing?
01:48:40.920 | So what ideas would you have?
01:48:44.520 | - I am so much not worried about the machine
01:48:48.360 | independently doing harm.
01:48:50.280 | That's what some of these AI safety people seem to think.
01:48:52.920 | They somehow seem to think that the machine
01:48:54.760 | like independently is gonna rebel against its creator.
01:48:57.440 | - So you don't think you'll find autonomy?
01:48:59.520 | - No, this is sci-fi B movie garbage.
01:49:03.320 | - Okay, what if the thing writes code,
01:49:05.680 | basically writes viruses?
01:49:06.960 | - If the thing writes viruses,
01:49:10.560 | it's because the human told it to write viruses.
01:49:14.080 | - Yeah, but there's some things you can't
01:49:15.360 | like put back in the box.
01:49:16.440 | That's kind of the whole point.
01:49:18.200 | Is it kind of spreads.
01:49:19.400 | Give it access to the internet, it spreads,
01:49:21.320 | installs itself, modifies your shit.
01:49:24.080 | - B, B, B, B plot sci-fi, not real.
01:49:27.560 | - I'm trying to work.
01:49:28.400 | I'm trying to get better at my plot writing.
01:49:30.280 | - The thing that worries me,
01:49:31.600 | I mean, we have a real danger to discuss
01:49:33.800 | and that is bad humans using the thing
01:49:36.880 | to do whatever bad unaligned AI thing you want.
01:49:39.520 | - But this goes to your previous concern
01:49:42.720 | that who gets to define who's a good human,
01:49:44.920 | who's a bad human?
01:49:45.760 | - Nobody does, we give it to everybody.
01:49:47.480 | And if you do anything besides give it to everybody,
01:49:49.760 | trust me, the bad humans will get it.
01:49:51.600 | Because that's who gets power.
01:49:53.960 | It's always the bad humans who get power.
01:49:55.400 | - Okay, power.
01:49:57.640 | And power turns even slightly good humans to bad.
01:50:01.560 | - Sure.
01:50:02.400 | - That's the intuition you have.
01:50:03.760 | I don't know.
01:50:04.600 | - I don't think everyone, I don't think everyone.
01:50:07.960 | I just think that like,
01:50:09.840 | here's the saying that I put in one of my blog posts.
01:50:13.280 | When I was in the hacking world,
01:50:14.840 | I found 95% of people to be good
01:50:16.960 | and 5% of people to be bad.
01:50:18.560 | Like just who I personally judged
01:50:19.760 | as good people and bad people.
01:50:21.160 | Like they believed about good things for the world.
01:50:23.160 | They wanted like flourishing and they wanted growth
01:50:26.480 | and they wanted things I consider good, right?
01:50:29.280 | I came into the business world with Comma
01:50:30.880 | and I found the exact opposite.
01:50:32.680 | I found 5% of people good and 95% of people bad.
01:50:35.720 | I found a world that promotes psychopathy.
01:50:38.560 | - I wonder what that means.
01:50:39.680 | I wonder if that,
01:50:43.120 | I wonder if that's anecdotal or if it,
01:50:45.160 | if there's truth to that,
01:50:47.560 | there's something about capitalism.
01:50:49.320 | - Well. - At the core
01:50:51.640 | that promotes the people that run capitalism,
01:50:54.000 | that promotes psychopathy.
01:50:55.560 | - That saying may of course be my own biases, right?
01:50:58.160 | That may be my own biases that these people
01:50:59.840 | are a lot more aligned with me than these other people.
01:51:02.960 | Right? - Yeah.
01:51:04.120 | - So, you know, I can certainly recognize that.
01:51:07.240 | But you know, in general, I mean,
01:51:08.440 | this is like the common sense maxim,
01:51:11.000 | which is the people who end up getting power
01:51:13.480 | are never the ones you want with it.
01:51:15.760 | - But do you have a concern of super intelligent AGI,
01:51:19.200 | open sourced, and then what do you do with that?
01:51:23.840 | I'm not saying control it, it's open source.
01:51:26.000 | What do we do with this human species?
01:51:27.760 | - That's not up to me.
01:51:28.800 | I mean, you know, like I'm not a central planner.
01:51:31.240 | - No, not a central planner, but you'll probably tweet,
01:51:33.640 | there's a few days left to live for the human species.
01:51:35.760 | - I have my ideas of what to do with it
01:51:37.320 | and everyone else has their ideas of what to do with it.
01:51:39.240 | May the best ideas win.
01:51:40.240 | - But at this point, do you brainstorm,
01:51:42.400 | like, because it's not regulation.
01:51:45.320 | It could be decentralized regulation
01:51:46.840 | where people agree that this is just like,
01:51:49.440 | we create tools that make it more difficult for you
01:51:52.440 | to maybe make it more difficult for code to spread,
01:51:59.020 | you know, antivirus software, this kind of thing.
01:52:00.960 | - Oh, you're saying that you should build AI firewalls?
01:52:02.920 | That sounds good.
01:52:03.760 | You should definitely be running an AI firewall.
01:52:05.040 | - Yeah, right.
01:52:05.880 | - You should be running an AI firewall to your mind.
01:52:08.440 | - Right.
01:52:09.280 | - Constantly under, you know.
01:52:10.200 | - That's such an interesting idea.
01:52:11.040 | - Info wars, man, like.
01:52:13.080 | - I don't know if you're being sarcastic or not.
01:52:14.600 | - No, I'm dead serious.
01:52:15.440 | I'm dead serious. - But I think
01:52:16.260 | there's power to that.
01:52:17.100 | It's like, how do I protect my mind
01:52:22.100 | from influence of human-like
01:52:24.880 | or superhuman intelligent bots?
01:52:26.520 | - I am not being, I would pay so much money for that product.
01:52:29.840 | I would pay so much money for that product.
01:52:31.520 | I would, you know how much money I'd pay
01:52:32.600 | just for a spam filter that works?
01:52:35.140 | - Well, on Twitter, sometimes I would like
01:52:38.400 | to have a protection mechanism for my mind
01:52:43.400 | from the outrage mobs.
01:52:46.160 | - Yeah.
01:52:47.000 | - 'Cause they feel like bot-like behavior.
01:52:48.080 | It's like, there's a large number of people
01:52:49.880 | that will just grab a viral narrative
01:52:52.560 | and attack anyone else that believes otherwise.
01:52:54.520 | And it's like.
01:52:55.720 | - Whenever someone's telling me some story from the news,
01:52:57.740 | I'm always like, I don't wanna hear it, CIA op, bro.
01:52:59.600 | It's a CIA op, bro.
01:53:00.800 | Like, it doesn't matter if that's true or not.
01:53:02.240 | It's just trying to influence your mind.
01:53:03.960 | You're repeating an ad to me.
01:53:06.320 | But the viral mobs, like, yeah, they're.
01:53:08.840 | - Like, to me, a defense against those mobs
01:53:12.180 | is just getting multiple perspectives always
01:53:14.800 | from sources that make you feel
01:53:17.440 | kind of like you're getting smarter.
01:53:20.680 | And just, actually, just basically feels good.
01:53:24.060 | Like, a good documentary just feels,
01:53:26.760 | something feels good about it.
01:53:28.000 | It's well done.
01:53:29.160 | It's like, oh, okay, I never thought of it this way.
01:53:31.400 | It just feels good.
01:53:32.660 | Sometimes the outrage mobs,
01:53:33.940 | even if they have a good point behind it,
01:53:35.780 | when they're like mocking and derisive and just aggressive,
01:53:39.180 | you're with us or against us, this fucking.
01:53:42.040 | - This is why I delete my tweets.
01:53:44.480 | - Yeah, why'd you do that?
01:53:46.640 | I was, you know, I missed your tweets.
01:53:48.880 | - You know what it is?
01:53:50.080 | The algorithm promotes toxicity.
01:53:52.640 | - Yeah.
01:53:54.160 | - And like, you know, I think Elon has a much better chance
01:53:57.440 | of fixing it than the previous regime.
01:54:00.660 | - Yeah.
01:54:02.600 | - But to solve this problem, to solve,
01:54:04.580 | like to build a social network that is actually not toxic
01:54:07.960 | without moderation.
01:54:10.980 | - Like, not the stick, but carrot.
01:54:14.880 | So like where people look for goodness,
01:54:19.360 | so make it catalyze the process of connecting cool people
01:54:22.560 | and being cool to each other.
01:54:24.380 | - Yeah.
01:54:25.220 | - Without ever censoring.
01:54:26.880 | - Without ever censoring.
01:54:27.700 | And like, Scott Alexander has a blog post I like
01:54:30.680 | where he talks about like moderation is not censorship.
01:54:32.360 | Right?
01:54:33.240 | Like all moderation you want to put on Twitter, right?
01:54:35.920 | Like you could totally make this moderation, like just a,
01:54:40.540 | you don't have to block it for everybody.
01:54:42.500 | You can just have like a filter button, right?
01:54:44.460 | That people can turn off if they would like safe search
01:54:46.080 | for Twitter, right?
01:54:47.000 | Like someone could just turn that off, right?
01:54:48.760 | So like, but then you'd like take this idea to an extreme.
01:54:51.180 | Right?
01:54:52.120 | Well, the network should just show you,
01:54:54.720 | this is a Couchsurfing CEO thing, right?
01:54:56.880 | If it shows you right now,
01:54:58.360 | these algorithms are designed to maximize engagement.
01:55:00.840 | Well, it turns out outrage maximizes engagement.
01:55:02.880 | Quirk of human, quirk of the human mind, right?
01:55:06.140 | Just as I fall for it, everyone falls for it.
01:55:09.200 | So yeah, you got to figure out how to maximize
01:55:11.200 | for something other than engagement.
01:55:12.600 | - And I actually believe that you can make money
01:55:14.520 | with that too.
01:55:15.360 | So it's not, I don't think engagement
01:55:16.840 | is the only way to make money.
01:55:17.960 | - I actually think it's incredible
01:55:19.520 | that we're starting to see, I think again,
01:55:21.760 | Elon's doing so much stuff right with Twitter,
01:55:23.400 | like charging people money.
01:55:25.200 | As soon as you charge people money,
01:55:26.500 | they're no longer the product.
01:55:28.220 | They're the customer.
01:55:29.700 | And then they can start building something
01:55:31.040 | that's good for the customer
01:55:32.080 | and not good for the other customer,
01:55:33.480 | which is the ad agencies.
01:55:34.760 | - Has it picked up steam?
01:55:37.020 | - I pay for Twitter, doesn't even get me anything.
01:55:40.160 | It's my donation to this new business model,
01:55:41.920 | hopefully working out.
01:55:43.020 | - Sure, but you know, for this business model to work,
01:55:45.760 | it's like most people should be signed up to Twitter.
01:55:48.800 | And so the way it was,
01:55:51.680 | there was something perhaps not compelling
01:55:53.400 | or something like this to people.
01:55:54.920 | - Think you need most people at all.
01:55:56.440 | I think that, why do I need most people, right?
01:55:58.840 | Don't make an 8,000 person company,
01:56:00.260 | make a 50 person company.
01:56:02.000 | - Ah.
01:56:02.840 | - Yeah.
01:56:03.660 | - Well, so speaking of which,
01:56:05.400 | you worked at Twitter for a bit.
01:56:08.480 | - I did.
01:56:09.320 | - As an intern.
01:56:10.280 | The world's greatest intern.
01:56:13.080 | - Yeah.
01:56:13.920 | - All right.
01:56:14.760 | - There's been better.
01:56:15.600 | - There's been better.
01:56:17.400 | Tell me about your time at Twitter.
01:56:18.840 | How did it come about?
01:56:20.160 | And what did you learn from the experience?
01:56:22.760 | - So I deleted my first Twitter in 2010.
01:56:27.760 | I had over 100,000 followers
01:56:30.800 | back when that actually meant something.
01:56:32.920 | And I just saw, you know,
01:56:36.400 | my coworker summarized it well.
01:56:39.400 | He's like, "Whenever I see someone's Twitter page,
01:56:42.600 | "I either think the same of them or less of them.
01:56:45.360 | "I never think more of them."
01:56:46.920 | - Yeah.
01:56:47.760 | - Right, like, I don't wanna mention any names,
01:56:50.040 | but like some people who like, you know,
01:56:51.320 | maybe you would like read their books
01:56:52.720 | and you would respect them.
01:56:53.600 | You see them on Twitter and you're like,
01:56:56.240 | okay, dude.
01:56:58.720 | - Yeah, but there's some people who would say,
01:57:02.000 | you know who I respect a lot?
01:57:03.960 | Are people that just post really good technical stuff.
01:57:06.520 | - Yeah.
01:57:07.960 | - And I guess, I don't know.
01:57:11.120 | I think I respect them more for it.
01:57:13.000 | 'Cause you realize, oh, this wasn't,
01:57:15.840 | there's like so much depth to this person,
01:57:18.680 | to their technical understanding
01:57:19.720 | of so many different topics.
01:57:21.560 | - Okay.
01:57:22.400 | - So I try to follow people.
01:57:23.720 | I try to consume stuff
01:57:25.680 | that's technical machine learning content.
01:57:27.880 | There's probably a few of those people.
01:57:31.880 | And the problem is inherently
01:57:34.000 | what the algorithm rewards, right?
01:57:36.440 | And people think about these algorithms.
01:57:38.200 | People think that they are terrible, awful things.
01:57:40.280 | And you know, I love that Elon open sourced it.
01:57:42.440 | Because I mean, what it does is actually pretty obvious.
01:57:44.680 | It just predicts what you are likely to retweet
01:57:47.240 | and like and linger on.
01:57:49.680 | It's what all these algorithms do.
01:57:50.520 | It's what TikTok does.
01:57:51.480 | It's what all these recommendation engines do.
01:57:53.440 | And it turns out that the thing
01:57:57.160 | that you are most likely to interact with is outrage.
01:58:00.040 | And that's a quirk of the human condition.
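
A toy version of the ranking objective being described: score posts purely by predicted retweet/like/linger probability, versus blending in some other signal. The feature names, weights, and example posts below are made up for illustration, not taken from any real recommender.

```python
# Sketch of engagement-driven ranking vs. ranking that mixes in another objective.
# Feature names, weights, and posts are illustrative assumptions.

posts = [
    {"id": "outrage_take",     "p_retweet": 0.30, "p_like": 0.25, "p_linger": 0.60, "quality": 0.2},
    {"id": "technical_thread", "p_retweet": 0.05, "p_like": 0.15, "p_linger": 0.40, "quality": 0.9},
]

def engagement_score(post: dict) -> float:
    """What the transcript describes: predicted retweet/like/linger, nothing else."""
    return post["p_retweet"] + post["p_like"] + post["p_linger"]

def blended_score(post: dict, quality_weight: float = 2.0) -> float:
    """Maximize 'something other than engagement' by blending in a quality signal."""
    return engagement_score(post) + quality_weight * post["quality"]

print(sorted(posts, key=engagement_score, reverse=True)[0]["id"])  # outrage_take
print(sorted(posts, key=blended_score, reverse=True)[0]["id"])     # technical_thread
```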
01:58:02.000 | - I mean, and there's different flavors of outrage.
01:58:06.040 | It doesn't have to be, it could be mockery.
01:58:09.520 | You'd be outraged.
01:58:10.360 | The topic of outrage could be different.
01:58:11.840 | It could be an idea.
01:58:12.680 | It could be a person.
01:58:13.520 | It could be, and maybe there's a better word than outrage.
01:58:17.240 | It could be drama.
01:58:18.160 | - Sure, drama.
01:58:19.000 | - All this kind of stuff.
01:58:19.840 | - Yeah.
01:58:20.660 | - But doesn't feel like when you consume it,
01:58:22.240 | it's a constructive thing for the individuals
01:58:24.080 | that consume it in the longterm.
01:58:26.240 | - Yeah, so my time there, I absolutely couldn't believe,
01:58:30.000 | you know, I got crazy amount of hate, you know,
01:58:34.600 | on Twitter for working at Twitter.
01:58:36.240 | It seems like people associated with this,
01:58:39.120 | I think maybe you were exposed to some of this.
01:58:41.600 | - So connection to Elon or is it working at Twitter?
01:58:44.060 | - Twitter and Elon, like the whole, there's just--
01:58:46.720 | - Elon's gotten a bit spicy during that time.
01:58:49.560 | A bit political, a bit.
01:58:51.160 | - Yeah, yeah.
01:58:52.640 | You know, I remember one of my tweets,
01:58:54.120 | it was never go full Republican and Elon liked it.
01:58:56.620 | (laughing)
01:58:58.880 | - Oh boy, yeah, I mean, there's a rollercoaster of that,
01:59:06.880 | but being political on Twitter, boy.
01:59:10.040 | - Yeah.
01:59:11.120 | - And also being, just attacking anybody on Twitter,
01:59:14.920 | it comes back at you harder.
01:59:17.520 | And if it's political and attacks.
01:59:19.440 | - Sure, sure, absolutely.
01:59:22.360 | And then letting sort of de-platformed people back on
01:59:27.360 | even adds more fun to the beautiful chaos.
01:59:34.320 | - I was hoping, and like, I remember when Elon talked
01:59:37.640 | about buying Twitter like six months earlier,
01:59:40.680 | he was talking about like a principled commitment
01:59:43.480 | to free speech.
01:59:44.640 | And I'm a big believer and fan of that.
01:59:47.620 | I would love to see an actual principled commitment
01:59:50.740 | to free speech.
01:59:52.100 | Of course, this isn't quite what happened.
01:59:54.200 | Instead of the oligarchy deciding what to ban,
01:59:57.880 | you had a monarchy deciding what to ban, right?
02:00:00.800 | Instead of, you know, all the Twitter files, shadow banning,
02:00:04.280 | really, the oligarchy just decides what.
02:00:06.440 | Cloth masks are ineffective against COVID.
02:00:08.520 | That's a true statement.
02:00:09.480 | Every doctor in 2019 knew it,
02:00:11.000 | and now I'm banned on Twitter for saying it?
02:00:12.480 | Interesting, oligarchy.
02:00:14.400 | So now you have a monarchy and, you know,
02:00:17.080 | he bans things he doesn't like.
02:00:19.880 | So, you know, it's just different power,
02:00:22.100 | and like, you know, maybe I align more with him
02:00:25.320 | than with the oligarchy.
02:00:26.160 | - But it's not free speech absolutism.
02:00:28.940 | But I feel like being a free speech absolutist
02:00:31.860 | on a social network requires you to also have tools
02:00:35.140 | for the individuals to control what they consume easier.
02:00:40.140 | Like, not censor, but just like control,
02:00:45.420 | like, oh, I'd like to see more cats and less politics.
02:00:48.940 | - And this isn't even remotely controversial.
02:00:51.320 | This is just saying you want to give paying customers
02:00:53.320 | for a product what they want.
02:00:54.480 | - Yeah, and not through the process of censorship,
02:00:56.560 | but through the process of like--
02:00:57.760 | - Well, it's individualized, right?
02:00:59.240 | It's individualized transparent censorship,
02:01:01.240 | which is honestly what I want.
02:01:02.480 | What is an ad blocker?
02:01:03.320 | It's individualized transparent censorship, right?
02:01:05.000 | - Yeah, but censorship is a strong word
02:01:08.520 | that people are very sensitive to.
02:01:10.200 | - I know, but, you know, I just use words
02:01:12.440 | to describe what they functionally are.
02:01:13.840 | And what is an ad blocker?
02:01:14.680 | It's just censorship.
02:01:15.520 | - Well, when I look at you right now--
02:01:16.340 | - But I love what you're censoring.
02:01:17.600 | - I'm looking at you, I'm censoring everything else out
02:01:21.740 | when my mind is focused on you.
02:01:24.260 | You can use the word censorship that way,
02:01:25.780 | but usually when people get very sensitive
02:01:27.600 | about the censorship thing.
02:01:28.940 | I think when anyone is allowed to say anything,
02:01:33.420 | you should probably have tools that maximize the quality
02:01:37.860 | of the experience for individuals.
02:01:39.460 | So, you know, for me, like what I really value,
02:01:42.820 | boy, it would be amazing to somehow figure out
02:01:45.620 | how to do that.
02:01:46.840 | I love disagreement and debate and people
02:01:49.900 | who disagree with each other disagree with me,
02:01:51.740 | especially in the space of ideas,
02:01:53.420 | but the high quality ones.
02:01:54.940 | So not derision, right?
02:01:56.620 | - Maslow's hierarchy of argument.
02:01:58.260 | I think that's a real word for it.
02:01:59.940 | - Probably.
02:02:00.860 | There's just a way of talking that's like snarky
02:02:02.980 | and so on that somehow gets people on Twitter
02:02:06.260 | and they get excited and so on.
02:02:07.860 | - We have like ad hominem, refuting the central point.
02:02:09.980 | I like seeing this as an actual pyramid.
02:02:11.460 | - Yeah, and it's like all of it,
02:02:14.660 | all the wrong stuff is attractive to people.
02:02:16.860 | - I mean, we can just train a classifier
02:02:18.140 | to absolutely say what level of Maslow's hierarchy
02:02:20.620 | of argument are you at?
02:02:21.940 | And if it's ad hominem, like, okay, cool.
02:02:23.740 | I turned on the no ad hominem filter.
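
A toy version of that "no ad hominem filter": assign each reply a rough hierarchy-of-argument level and let the user set a floor. The keyword heuristic below is just a stand-in for the trained classifier being proposed; the level names and thresholds are illustrative assumptions.

```python
# Toy "no ad hominem filter": score replies on a rough hierarchy-of-argument level
# and hide anything below a user-chosen floor. The keyword heuristic stands in
# for an actual trained classifier.

LEVELS = {0: "name-calling", 1: "ad hominem", 2: "tone", 3: "contradiction", 4: "counterargument"}

def argument_level(text: str) -> int:
    """Very crude heuristic; a real system would use a trained classifier here."""
    t = text.lower()
    if any(w in t for w in ("idiot", "moron", "clown")):
        return 0
    if "you would say that" in t or "of course you think" in t:
        return 1
    if "rude" in t or "tone" in t:
        return 2
    if "wrong" in t and "because" not in t:
        return 3
    return 4

def filter_replies(replies: list[str], min_level: int = 2) -> list[str]:
    """Keep only replies at or above the user's chosen argument level."""
    return [r for r in replies if argument_level(r) >= min_level]

replies = ["You're a clown.",
           "Of course you think that, you work there.",
           "That's wrong because the benchmark excludes latency."]
print(filter_replies(replies, min_level=2))
# ["That's wrong because the benchmark excludes latency."]
```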
02:02:25.940 | - I wonder if there's a social network
02:02:28.900 | that will allow you to have that kind of filter.
02:02:31.060 | - Yeah, so here's the problem with that.
02:02:34.640 | It's not going to win in a free market.
02:02:37.920 | What wins in a free market is all television today
02:02:41.180 | is reality television because it's engaging.
02:02:43.540 | Right, engaging is what wins in a free market, right?
02:02:47.220 | So it becomes hard to keep these other more nuanced values.
02:02:50.520 | - Well, okay, so that's the experience of being on Twitter,
02:02:56.260 | but then you got a chance to also,
02:02:58.160 | together with other engineers and with Elon,
02:03:01.500 | sort of look, brainstorm when you step into a code base.
02:03:04.780 | It's been around for a long time.
02:03:06.580 | You know, there's other social networks,
02:03:08.300 | you know, Facebook, this is old code bases.
02:03:11.300 | And you step in and see, okay,
02:03:13.340 | how do we make with a fresh mind progress on this code base?
02:03:17.860 | Like, what did you learn about software engineering,
02:03:19.980 | about programming from just experiencing that?
02:03:22.140 | - So my technical recommendation to Elon,
02:03:25.200 | and I said this on the Twitter spaces afterward,
02:03:27.200 | I said this many times during my brief internship,
02:03:31.120 | was that you need refactors before features.
02:03:36.380 | This code base was, and look, I've worked at Google,
02:03:40.860 | I've worked at Facebook.
02:03:42.340 | Facebook has the best code, then Google, then Twitter.
02:03:46.740 | And you know what?
02:03:47.740 | You can know this,
02:03:48.580 | because look at the machine learning frameworks, right?
02:03:50.180 | Facebook released PyTorch,
02:03:51.420 | Google released TensorFlow, and Twitter released,
02:03:55.260 | (laughs)
02:03:56.080 | Okay, so, you know.
02:03:57.500 | - It's a proxy, but yeah,
02:03:58.900 | the Google code base is quite interesting.
02:04:01.100 | There's a lot of really good software engineers there,
02:04:02.740 | but the code base is very large.
02:04:04.780 | - The code base was good in 2005, right?
02:04:07.820 | It looks like 2005 era.
02:04:08.660 | - There's so many products, so many teams, right?
02:04:10.500 | It's very difficult to, I feel like Twitter does less,
02:04:15.500 | obviously, much less than Google,
02:04:17.900 | in terms of the set of features, right?
02:04:23.420 | So I can imagine the number of software engineers
02:04:26.740 | that could recreate Twitter is much smaller
02:04:29.420 | than to recreate Google.
02:04:30.780 | - Yeah, I still believe, and the amount of hate I got
02:04:33.820 | for saying this, that 50 people could build
02:04:36.820 | and maintain Twitter.
02:04:38.580 | - What's the nature of the hate?
02:04:39.900 | - Comfortably.
02:04:40.740 | - That you don't know what you're talking about?
02:04:43.180 | - You know what it is?
02:04:44.180 | And it's the same, this is my summary
02:04:45.660 | of the hate I get on Hacker News.
02:04:47.660 | It's like, when I say I'm going to do something,
02:04:51.380 | they have to believe that it's impossible.
02:04:56.220 | Because if doing things was possible,
02:04:59.660 | they'd have to do some soul searching
02:05:01.220 | and ask the question, why didn't they do anything?
02:05:03.500 | - So when you say--
02:05:04.700 | - And I do think that's where the hate comes from.
02:05:06.060 | - When you say, well, there's a core truth to that, yeah.
02:05:08.500 | So when you say, I'm gonna solve self-driving,
02:05:10.800 | people go like, what are your credentials?
02:05:14.100 | What the hell are you talking about?
02:05:15.300 | What is, this is an extremely difficult problem.
02:05:17.260 | Of course, you're a noob
02:05:18.100 | that doesn't understand the problem deeply.
02:05:20.260 | I mean, that was the same nature of hate
02:05:23.820 | that probably Elon got when he first talked
02:05:25.500 | about autonomous driving.
02:05:26.760 | But there's pros and cons to that,
02:05:30.140 | 'cause there is experts in this world.
02:05:33.100 | - No, but the mockers aren't experts.
02:05:35.300 | The people who are mocking are not experts
02:05:38.220 | with carefully reasoned arguments
02:05:39.740 | about why you need 8,000 people to run a bird app.
02:05:42.300 | They're, but the people are gonna lose their jobs.
02:05:46.380 | - Well, that, but also there's the software engineers
02:05:48.660 | that probably criticize, no, it's a lot more complicated
02:05:50.940 | than you realize, but maybe it doesn't need
02:05:52.700 | to be so complicated.
02:05:53.820 | - You know, some people in the world
02:05:55.580 | like to create complexity.
02:05:56.920 | Some people in the world thrive under complexity,
02:05:58.620 | like lawyers, right?
02:05:59.660 | Lawyers want the world to be more complex
02:06:01.180 | because you need more lawyers,
02:06:02.020 | you need more legal hours, right?
02:06:03.620 | I think that's another.
02:06:05.820 | If there's two great evils in the world,
02:06:07.420 | it's centralization and complexity.
02:06:09.220 | - Yeah, and one of the sort of hidden side effects
02:06:14.220 | of software engineering is like finding pleasure
02:06:19.540 | in complexity.
02:06:21.000 | I mean, I don't remember just taking
02:06:24.300 | all the software engineering courses
02:06:25.860 | and just doing programming and just coming up
02:06:28.220 | in this object-oriented programming kind of idea.
02:06:33.060 | You don't, like, not often do people tell you,
02:06:35.500 | like, do the simplest possible thing.
02:06:38.060 | Like a professor, a teacher is not gonna get in front,
02:06:42.820 | like, this is the simplest way to do it.
02:06:45.420 | They'll say, like, this is the right way,
02:06:47.820 | and the right way, at least for a long time,
02:06:50.580 | you know, especially I came up with like Java, right?
02:06:53.660 | Is so much boilerplate, so much like, so many classes,
02:06:58.660 | so many like designs and architectures and so on,
02:07:02.540 | like planning for features far into the future
02:07:05.940 | and planning poorly and all this kind of stuff.
02:07:08.120 | And then there's this like code base
02:07:10.020 | that follows you along and puts pressure on you,
02:07:12.060 | and nobody knows what like parts, different parts do,
02:07:16.060 | which slows everything down.
02:07:17.100 | There's a kind of bureaucracy that's instilled in the code
02:07:19.940 | as a result of that.
02:07:20.940 | But then you feel like, oh, well,
02:07:22.620 | I follow good software engineering practices.
02:07:25.020 | It's an interesting trade-off,
02:07:26.500 | 'cause then you look at like the ghetto-ness of like Perl
02:07:30.100 | and the old, like, how quickly you could just write
02:07:32.500 | a couple of lines and just get stuff done.
02:07:34.440 | That trade-off is interesting, or Bash, or whatever,
02:07:37.300 | these kind of ghetto things you can do in Linux.
02:07:39.380 | - One of my favorite things to look at today
02:07:41.960 | is how much do you trust your tests, right?
02:07:43.860 | We've put a ton of effort in Kama,
02:07:45.620 | and I've put a ton of effort in TinyGrad
02:07:47.460 | into making sure if you change the code and the tests pass,
02:07:51.440 | that you didn't break the code.
02:07:52.740 | - Yeah. - Now, this obviously
02:07:53.580 | is not always true, but the closer that is to true,
02:07:56.700 | the more you trust your tests, the more you're like,
02:07:58.700 | oh, I got a pull request, and the tests pass,
02:08:00.900 | I feel okay to merge that, the faster you can make progress.
02:08:03.580 | - So you're always programming with tests in mind,
02:08:05.100 | developing tests with that in mind,
02:08:07.260 | that if it passes, it should be good.
02:08:08.820 | - And Twitter had a-- - Not that.
02:08:11.000 | So-- - It was impossible
02:08:13.640 | to make progress in the codebase.
02:08:15.400 | - What other stuff can you say about the codebase
02:08:17.300 | that made it difficult?
02:08:18.660 | What are some interesting sort of quirks,
02:08:21.620 | broadly speaking, from that,
02:08:23.940 | compared to just your experience with Kama
02:08:26.780 | and everywhere else?
02:08:27.860 | - The real thing that, I spoke to a bunch of,
02:08:30.640 | you know, like individual contributors at Twitter,
02:08:34.700 | and I just asked, I'm like, okay,
02:08:36.540 | so like, what's wrong with this place?
02:08:38.500 | Why does this code look like this?
02:08:39.900 | And they explained to me what Twitter's promotion system was.
02:08:43.520 | The way that you got promoted at Twitter
02:08:45.220 | was you wrote a library that a lot of people used, right?
02:08:49.640 | So some guy wrote an NGINX replacement for Twitter.
02:08:54.520 | Why does Twitter need an NGINX replacement?
02:08:56.340 | What was wrong with NGINX?
02:08:58.460 | Well, you see, you're not gonna get promoted
02:09:00.460 | if you use NGINX.
02:09:01.940 | But if you write a replacement
02:09:03.320 | and lots of people start using it
02:09:05.020 | as the Twitter front end for their product,
02:09:07.120 | then you're gonna get promoted, right?
02:09:08.380 | - So interesting, 'cause like,
02:09:09.540 | from an individual perspective, how do you incentivize,
02:09:12.420 | how do you create the kind of incentives
02:09:14.740 | that will lead to a great codebase?
02:09:18.180 | Okay, what's the answer to that?
02:09:20.440 | - So what I do at Kama and at,
02:09:25.500 | and you know, at TinyCorp is you have to explain it to me.
02:09:28.140 | You have to explain to me what this code does, right?
02:09:30.300 | And if I can sit there and come up
02:09:31.940 | with a simpler way to do it, you have to rewrite it.
02:09:34.740 | You have to agree with me about the simpler way.
02:09:37.340 | You know, obviously we can have a conversation about this.
02:09:39.060 | It's not dictatorial, but if you're like,
02:09:41.660 | wow, wait, that actually is way simpler.
02:09:44.340 | Like, the simplicity is important, right?
02:09:47.660 | - But that requires people that overlook the code
02:09:51.060 | at the highest levels to be like, okay.
02:09:54.100 | - It requires technical leadership, you trust.
02:09:55.660 | - Yeah, technical leadership.
02:09:57.260 | So managers or whatever should have to have technical savvy,
02:10:01.540 | deep technical savvy.
02:10:03.140 | - Managers should be better programmers
02:10:04.340 | than the people who they manage.
02:10:05.620 | - Yeah, and that's not always obvious, trivial to create,
02:10:09.740 | especially at large companies.
02:10:11.460 | Managers get soft.
02:10:12.700 | - And like, you know, and this is just,
02:10:13.780 | I've instilled this culture at Kama,
02:10:15.340 | and Kama has better programmers than me who work there.
02:10:17.760 | But you know, again, I'm like the old guy
02:10:20.400 | from "Good Will Hunting."
02:10:21.240 | It's like, look, man, you know,
02:10:23.180 | I might not be as good as you,
02:10:25.040 | but I can see the difference between me and you, right?
02:10:27.100 | And this is what you need.
02:10:28.260 | This is what you need at the top.
02:10:29.220 | Or you don't necessarily need the manager
02:10:31.280 | to be the absolute best.
02:10:32.760 | I shouldn't say that,
02:10:33.600 | but like, they need to be able to recognize skill.
02:10:36.540 | - Yeah, and have good intuition.
02:10:38.740 | Intuition that's laden with wisdom
02:10:40.940 | from all the battles of trying to reduce complexity
02:10:43.820 | in codebases.
02:10:44.660 | - You know, I took a political approach at Kama too
02:10:47.020 | that I think is pretty interesting.
02:10:47.940 | I think Elon takes the same political approach.
02:10:51.020 | You know, Google had no politics,
02:10:53.440 | and what ended up happening
02:10:54.420 | is the absolute worst kind of politics took over.
02:10:57.420 | Kama has an extreme amount of politics,
02:10:59.180 | and they're all mine, and no dissidence is tolerated.
02:11:02.100 | - So it's a dictatorship.
02:11:03.660 | - Yep, it's an absolute dictatorship, right?
02:11:05.820 | Elon does the same thing.
02:11:07.140 | Now, the thing about my dictatorship is here are my values.
02:11:10.100 | - Yeah, it's just transparent.
02:11:12.780 | - It's transparent.
02:11:13.600 | It's a transparent dictatorship, right?
02:11:14.860 | And you can choose to opt in or, you know,
02:11:16.660 | you get free exit, right?
02:11:17.620 | That's the beauty of companies.
02:11:18.500 | If you don't like the dictatorship, you quit.
02:11:21.660 | - So you mentioned rewrite before,
02:11:24.900 | or refactor before features.
02:11:27.600 | If you were to refactor the Twitter codebase,
02:11:30.820 | what would that look like?
02:11:32.100 | And maybe also comment on how difficult is it to refactor?
02:11:35.580 | - The main thing I would do is first of all,
02:11:37.820 | identify the pieces,
02:11:39.380 | and then put tests in between the pieces, right?
02:11:42.360 | So there's all these different,
02:11:43.200 | Twitter has a microservice architecture,
02:11:45.100 | there's all these different microservices,
02:11:48.260 | and the thing that I was working on there,
02:11:49.900 | look, like, you know,
02:11:50.920 | George didn't know any JavaScript,
02:11:53.540 | he asked how to fix search, blah, blah, blah, blah, blah.
02:11:55.940 | Look, man, like, the thing is,
02:11:58.740 | like, I just, you know,
02:11:59.740 | I'm upset that the way that this whole thing was portrayed,
02:12:02.340 | because it wasn't like,
02:12:03.300 | it wasn't like taken by people, like, honestly,
02:12:05.740 | it wasn't like by,
02:12:06.660 | it was taken by people who started out
02:12:08.940 | with a bad faith assumption.
02:12:10.140 | - Yeah.
02:12:10.980 | - And I mean, I, look, I can't like--
02:12:12.460 | - And you as a programmer were just being transparent
02:12:14.300 | out there, actually having like fun,
02:12:16.900 | and like, this is what programming should be about.
02:12:19.420 | - I love that Elon gave me this opportunity.
02:12:21.300 | - Yeah.
02:12:22.140 | - Like, really, it does, and like, you know,
02:12:23.140 | he came on my, the day I quit,
02:12:25.100 | he came on my Twitter spaces afterward,
02:12:26.540 | and we had a conversation, like,
02:12:27.900 | I just, I respect that so much.
02:12:29.740 | - Yeah, and it's also inspiring
02:12:31.020 | to just engineers and programmers,
02:12:32.460 | and just, it's cool, it should be fun.
02:12:34.380 | The people that were hating on it,
02:12:35.820 | it's like, oh, man.
02:12:37.100 | - It was fun.
02:12:38.940 | It was fun, it was stressful,
02:12:40.460 | but I felt like, you know,
02:12:41.300 | I was at like a cool, like, point in history,
02:12:43.360 | and like, I hope I was useful,
02:12:44.740 | I probably kind of wasn't, but like, maybe I was.
02:12:46.940 | - Well, you also were one of the people
02:12:48.940 | that kind of made a strong case to refactor.
02:12:51.580 | - Yeah.
02:12:52.420 | - And that's a really interesting thing to raise,
02:12:55.560 | like, maybe that is the right, you know,
02:12:58.520 | the timing of that is really interesting.
02:12:59.860 | If you look at just the development of autopilot,
02:13:01.900 | you know, going from Mobileye to just,
02:13:05.760 | like, more, if you look at the history
02:13:07.500 | of semi-autonomous driving in Tesla,
02:13:09.880 | is more and more, like, you could say refactoring,
02:13:14.420 | or starting from scratch, redeveloping from scratch.
02:13:17.860 | It's refactoring all the way down.
02:13:19.780 | - And like, and the question is,
02:13:21.860 | like, can you do that sooner?
02:13:24.060 | Can you maintain product profitability?
02:13:27.020 | And like, what's the right time to do it?
02:13:29.100 | How do you do it?
02:13:30.360 | You know, on any one day, it's like,
02:13:32.140 | you don't want to pull off the band-aids.
02:13:33.620 | Like, it's, like, everything works.
02:13:36.420 | It's just like, little fix here and there,
02:13:39.100 | but maybe starting from scratch.
02:13:41.020 | - This is the main philosophy of TinyGrad.
02:13:42.900 | You have never refactored enough.
02:13:44.720 | Your code can get smaller, your code can get simpler,
02:13:47.060 | your ideas can be more elegant.
02:13:48.980 | - But would you consider, you know,
02:13:52.060 | say you were, like, running Twitter development teams,
02:13:55.300 | engineering teams, would you go as far
02:13:58.780 | as, like, different programming language?
02:14:00.580 | Just go that far?
02:14:03.000 | - I mean, the first thing that I would do is build tests.
02:14:07.260 | The first thing I would do is get a CI
02:14:10.380 | to where people can trust to make changes.
02:14:13.000 | - So that if you keep-- - Before I touched any code,
02:14:16.780 | I would actually say, no one touches any code.
02:14:18.820 | The first thing we do is we test this code base.
02:14:20.540 | I mean, this is classic.
02:14:21.380 | This is how you approach a legacy code base.
02:14:22.780 | This is, like, what any,
02:14:24.060 | how to approach a legacy code base book will tell you.
02:14:26.960 | - So, and then you hope that there's modules
02:14:30.500 | that can live on for a while,
02:14:33.260 | and then you add new ones, maybe in a different language,
02:14:36.260 | or design it-- - Before we add new ones,
02:14:38.180 | we replace old ones.
02:14:39.460 | - Yeah, yeah, meaning, like,
02:14:40.460 | replace old ones with something simpler.
02:14:42.100 | - We look at this, like, this thing that's 100,000 lines,
02:14:45.460 | and we're like, well, okay, maybe this did even make sense
02:14:48.020 | in 2010, but now we can replace this
02:14:49.860 | with an open-source thing, right?
02:14:52.340 | - Yeah. - And, you know,
02:14:53.420 | we look at this, here, here's another 50,000 lines.
02:14:55.580 | Well, actually, you know,
02:14:56.420 | we can replace this with 300 lines of Go.
02:14:59.100 | And you know what?
02:14:59.940 | I trust that the Go actually replaces this thing
02:15:02.320 | because all the tests still pass.
02:15:03.740 | So step one is testing.
02:15:05.060 | - Yeah. - And then step two is, like,
02:15:06.660 | the programming language is an afterthought, right?
02:15:09.140 | You'll let a whole lot of people compete, be like,
02:15:10.820 | okay, who wants to rewrite a module,
02:15:12.100 | whatever language you wanna write it in,
02:15:13.540 | just the tests have to pass?
02:15:15.060 | And if you figure out how to make the test pass
02:15:17.300 | but break the site, that's, we gotta go back to step one.
02:15:20.180 | Step one is get tests that you trust
02:15:22.340 | in order to make changes in the code base.
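
A minimal sketch of that "tests before rewrites" idea in Python with pytest. The fixture file and the ./shorten binary here are hypothetical stand-ins, not Twitter's actual services: the point is to pin the legacy behavior with recorded cases, so a rewrite in any language behind the same interface has to reproduce it before it can be merged.

```python
# Characterization-test sketch: recorded inputs and the outputs the legacy
# service produced for them become the contract any rewrite must satisfy.
import json
import subprocess

import pytest

# Hypothetical fixture: list of {"input": str, "expected": str} pairs
# captured from the current production behavior.
with open("fixtures/shortener_cases.json") as f:
    CASES = json.load(f)

@pytest.mark.parametrize("case", CASES)
def test_rewrite_matches_legacy(case):
    # The binary under test could be the old 100,000-line service or a
    # 300-line rewrite in another language; the test only checks behavior.
    result = subprocess.run(
        ["./shorten"],
        input=case["input"],
        capture_output=True,
        text=True,
        check=True,
    )
    assert result.stdout.strip() == case["expected"]
```
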
02:15:23.700 | - I wonder how hard it is, too,
02:15:24.940 | 'cause I'm with you on testing and everything.
02:15:27.620 | You have from tests to, like, asserts to everything,
02:15:30.300 | but code is just covered in this
02:15:33.500 | because it should be very easy to make rapid changes
02:15:38.500 | and know that it's not gonna break everything.
02:15:42.580 | And that's the way to do it.
02:15:43.540 | But I wonder how difficult is it to integrate tests
02:15:48.220 | into a code base that doesn't have many of them.
02:15:49.900 | - So I'll tell you what my plan was at Twitter.
02:15:51.940 | It's actually similar to something we use at Kama.
02:15:53.620 | So at Kama, we have this thing called process replay.
02:15:56.140 | And we have a bunch of routes that'll be run through.
02:15:57.980 | So Kama's a microservice architecture, too.
02:15:59.620 | We have microservices in the driving.
02:16:02.020 | Like, we have one for the cameras, one for the sensor,
02:16:03.820 | one for the planner, one for the model.
02:16:07.700 | And we have an API,
02:16:09.260 | which the microservices talk to each other with.
02:16:11.500 | We use this custom thing called cereal,
02:16:13.060 | which uses ZMQ.
02:16:14.780 | Twitter uses Thrift.
02:16:17.980 | And then it uses this thing called Finagle,
02:16:20.460 | which is a Scala RPC backend.
02:16:24.420 | But this doesn't even really matter.
02:16:25.620 | The Thrift and Finagle layer was a great place,
02:16:30.260 | I thought, to write tests, right?
02:16:32.060 | To start building something that looks like process replay.
02:16:34.580 | So Twitter had some stuff that looked kind of like this,
02:16:37.900 | but it wasn't offline.
02:16:39.180 | It was only online.
02:16:40.220 | So you could ship a modified version of it,
02:16:43.300 | and then you could redirect some of the traffic
02:16:46.380 | to your modified version and diff those two.
02:16:48.380 | But it was all online.
02:16:49.580 | Like, there was no CI in the traditional sense.
02:16:51.980 | I mean, there was some, but it was not full coverage.
02:16:54.100 | So you can't run all of Twitter offline to test something.
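
A rough sketch of the process-replay idea in Python, hedged: this is not comma's actual implementation, and the route file and planner module below are hypothetical. The idea is that (input, output) pairs recorded at the service boundary (the ZMQ or Thrift/Finagle layer) let you run a candidate version of one microservice completely offline and diff its outputs against what the current version produced on the same route.

```python
# Offline process-replay sketch: feed recorded inputs through a candidate
# implementation and report every output that differs from the recording.
import pickle
from typing import Any, Callable, Iterable

def process_replay(route: Iterable[tuple[Any, Any]],
                   candidate: Callable[[Any], Any]) -> list[tuple[Any, Any, Any]]:
    """Return (input, expected_output, got_output) for every mismatch."""
    diffs = []
    for msg_in, expected_out in route:
        got = candidate(msg_in)
        if got != expected_out:
            diffs.append((msg_in, expected_out, got))
    return diffs

if __name__ == "__main__":
    # Hypothetical log of (input, output) pairs captured at the RPC boundary.
    with open("routes/planner_route.pkl", "rb") as f:
        route = pickle.load(f)

    from planner import plan  # hypothetical candidate version of one service

    mismatches = process_replay(route, plan)
    assert not mismatches, f"{len(mismatches)} outputs changed"
```
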
02:16:57.340 | Then this was another problem.
02:16:58.440 | You can't run all of Twitter, right?
02:17:00.500 | Period.
02:17:01.340 | Any one person can't run it.
02:17:03.140 | Twitter runs in three data centers, and that's it.
02:17:05.740 | There's no other place you can run Twitter,
02:17:07.820 | which is like, "George, you don't understand.
02:17:10.340 | "This is modern software development."
02:17:11.980 | No, this is bullshit.
02:17:13.260 | Like, why can't it run on my laptop?
02:17:15.860 | "What are you doing?
02:17:16.700 | "Twitter can run it."
02:17:17.520 | Yeah, okay, well, I'm not saying
02:17:18.740 | you're gonna download the whole database to your laptop,
02:17:20.820 | but I'm saying all the middleware and the front end
02:17:22.660 | should run on my laptop, right?
02:17:24.540 | - That sounds really compelling.
02:17:26.060 | - Yeah.
02:17:27.220 | - But can that be achieved by a code base
02:17:30.820 | that grows over the years?
02:17:33.060 | I mean, the three data centers didn't have to be, right?
02:17:35.260 | 'Cause they're totally different designs.
02:17:37.500 | - The problem is more like,
02:17:39.700 | why did the code base have to grow?
02:17:41.580 | What new functionality has been added
02:17:43.560 | to compensate for the lines of code that are there?
02:17:47.780 | - One of the ways to explain it is that
02:17:49.820 | the incentive for software developers
02:17:51.380 | to move up in the company is to add code,
02:17:54.200 | to add, especially large--
02:17:55.780 | - And you know what?
02:17:56.620 | The incentive for politicians to move up
02:17:57.720 | in the political structure is to add laws.
02:18:00.020 | Same problem.
02:18:01.180 | - Yeah, yeah.
02:18:03.440 | If the flip side is to simplify, simplify, simplify.
02:18:07.200 | I mean, you know what?
02:18:08.520 | This is something that I do differently
02:18:10.020 | from Elon with Kama about self-driving cars.
02:18:13.340 | You know, I hear the new version's gonna come out
02:18:16.580 | and the new version is not gonna be better,
02:18:18.620 | but at first, and it's gonna require a ton of refactors,
02:18:22.260 | I say, okay, take as long as you need.
02:18:24.160 | You convince me this architecture's better,
02:18:27.020 | okay, we have to move to it.
02:18:28.860 | Even if it's not gonna make the product better tomorrow,
02:18:31.380 | the top priority is getting the architecture right.
02:18:34.340 | - So what do you think about sort of a thing
02:18:37.560 | where the product is online?
02:18:39.220 | So I guess, would you do a refactor?
02:18:42.960 | If you ran engineering on Twitter,
02:18:45.160 | would you just do a refactor?
02:18:46.760 | How long would it take?
02:18:48.000 | What would that mean for the running of the actual service?
02:18:51.600 | - You know, and I'm not the right person to run Twitter.
02:18:56.600 | I'm just not.
02:18:59.180 | And that's the problem.
02:19:00.500 | I don't really know.
02:19:01.520 | I don't really know if that's, you know,
02:19:03.680 | a common thing that I thought a lot while I was there
02:19:05.980 | was whenever I thought something that was different
02:19:07.720 | to what Elon thought,
02:19:09.140 | I'd have to run something in the back of my head
02:19:10.820 | reminding myself that Elon is the richest man in the world.
02:19:15.820 | And in general, his ideas are better than mine.
02:19:18.940 | Now there's a few things I think I do understand
02:19:22.260 | and know more about, but like in general,
02:19:26.880 | I'm not qualified to run Twitter.
02:19:28.460 | Not necessarily qualified,
02:19:29.780 | but like, I don't think I'd be that good at it.
02:19:31.280 | I don't think I'd be good at it.
02:19:32.860 | I don't think I'd really be good
02:19:33.740 | at running an engineering organization at scale.
02:19:36.480 | I think I could lead a very good refactor of Twitter
02:19:42.660 | and it would take like six months to a year
02:19:45.040 | and the results to show at the end of it
02:19:47.560 | would be feature development in general
02:19:50.140 | takes 10X less time, 10X less man hours.
02:19:53.180 | That's what I think I could actually do.
02:19:55.440 | Do I think that it's the right decision for the business
02:19:58.320 | above my pay grade?
02:20:02.780 | - Yeah, but a lot of these kinds of decisions
02:20:04.980 | are above everybody's pay grade.
02:20:06.460 | - I don't wanna be a manager.
02:20:07.420 | I don't wanna do that.
02:20:08.260 | I just like, if you really forced me to,
02:20:10.780 | yeah, it would make me maybe,
02:20:12.220 | make me upset if I had to make those decisions.
02:20:17.660 | I don't wanna.
02:20:18.500 | - Yeah, but a refactor is so compelling.
02:20:23.520 | If this is to become something much bigger
02:20:26.140 | than what Twitter was,
02:20:27.260 | it feels like a refactor has to be coming at some point.
02:20:32.580 | George, you're a junior software engineer.
02:20:34.500 | Every junior software engineer
02:20:35.860 | wants to come in and refactor the whole code.
02:20:38.660 | Okay, that's like your opinion, man.
02:20:42.060 | - Yeah, it doesn't, sometimes they're right.
02:20:45.380 | - Well, whether they're right or not,
02:20:47.420 | it's definitely not for that reason.
02:20:48.700 | It's definitely not a question of engineering prowess.
02:20:50.740 | It is a question of maybe what the priorities are
02:20:52.220 | for the company.
02:20:53.220 | And I did get more intelligent feedback
02:20:56.340 | from people I think in good faith saying that.
02:20:58.640 | Actually from Elon.
02:21:01.180 | And from Elon, people were like,
02:21:04.060 | well, a stop the world refactor
02:21:06.540 | might be great for engineering,
02:21:08.060 | but you don't have a business to run.
02:21:10.000 | And hey, above my pay grade.
02:21:12.940 | - What'd you think about Elon as an engineering leader,
02:21:15.980 | having to experience him in the most chaotic of spaces,
02:21:19.660 | I would say?
02:21:20.500 | - My respect for him is unchanged.
02:21:27.260 | And I did have to think a lot more deeply
02:21:30.660 | about some of the decisions he's forced to make.
02:21:32.820 | - About the tensions within those,
02:21:35.860 | the trade-offs within those decisions?
02:21:37.820 | - About a whole matrix coming at him.
02:21:43.980 | I think that's Andrew Tate's word for it,
02:21:45.500 | sorry to borrow it.
02:21:46.580 | - Also, bigger than engineering, just everything.
02:21:49.340 | - Yeah, like the war on the woke.
02:21:52.740 | - Yeah.
02:21:54.740 | - Like it just, man, and like,
02:21:59.100 | he doesn't have to do this, you know?
02:22:01.140 | He doesn't have to.
02:22:01.980 | He could go like Parag and go chill
02:22:04.740 | at the Four Seasons Maui, you know?
02:22:07.060 | But see, one person I respect and one person I don't.
02:22:10.100 | - So his heart is in the right place,
02:22:12.900 | fighting in this case for this ideal
02:22:15.060 | of the freedom of expression.
02:22:17.620 | - I wouldn't define the ideal so simply.
02:22:19.900 | I think you can define the ideal no more
02:22:22.860 | than just saying Elon's idea of a good world.
02:22:26.780 | Freedom of expression is--
02:22:29.020 | - But to you, it's still,
02:22:30.660 | the downsides of that is the monarchy.
02:22:32.700 | - Yeah, I mean, monarchy has problems, right?
02:22:36.980 | But I mean, would I trade right now
02:22:39.700 | the current oligarchy, which runs America,
02:22:42.840 | for the monarchy?
02:22:43.680 | Yeah, I would, sure.
02:22:44.900 | For the Elon monarchy, yeah, you know why?
02:22:47.380 | Because power would cost one cent a kilowatt hour.
02:22:50.460 | 10th of a cent a kilowatt hour.
02:22:52.020 | - What do you mean?
02:22:54.380 | - Right now, I pay about 20 cents a kilowatt hour
02:22:56.740 | for electricity in San Diego.
02:22:58.460 | That's like the same price you paid in 1980.
02:23:00.660 | What the hell?
02:23:02.820 | - So you would see a lot of innovation with Elon.
02:23:05.420 | - Maybe it'd have some hyper loops.
02:23:07.780 | - Yeah.
02:23:08.620 | - Right, and I'm willing to make that trade-off, right?
02:23:09.820 | I'm willing to make, and this is why,
02:23:11.380 | you know, people think that dictators take power
02:23:13.540 | through some untoward mechanism.
02:23:17.020 | Sometimes they do, but usually it's 'cause
02:23:18.460 | the people want them.
02:23:19.580 | And the downsides of a dictatorship,
02:23:22.860 | I feel like we've gotten to a point now
02:23:24.060 | with the oligarchy where, yeah,
02:23:26.420 | I would prefer the dictator.
02:23:28.660 | (chuckles)
02:23:30.900 | - What'd you think about Scala as a programming language?
02:23:33.500 | - I liked it more than I thought.
02:23:37.060 | I did the tutorials.
02:23:37.900 | Like, I was very new to it.
02:23:38.740 | Like, it would take me six months to be able
02:23:39.980 | to write, like, good Scala.
02:23:41.820 | - I mean, what did you learn about learning
02:23:43.100 | a new programming language from that?
02:23:45.020 | - Oh, I love doing, like, new programming tutorials
02:23:47.420 | and doing them.
02:23:48.260 | I did all this for Rust.
02:23:49.140 | It keeps some of its upsetting JVM roots,
02:23:54.660 | but it is a much nicer...
02:23:56.700 | In fact, I almost don't know why Kotlin took off
02:23:59.060 | and not Scala.
02:24:00.700 | I think Scala has some beauty that Kotlin lacked,
02:24:03.140 | whereas Kotlin felt a lot more...
02:24:07.660 | I mean, it was almost like, I don't know
02:24:09.100 | if it actually was a response to Swift,
02:24:10.620 | but that's kind of what it felt like.
02:24:12.220 | Like, Kotlin looks more like Swift,
02:24:13.500 | and Scala looks more like, well,
02:24:15.460 | like a functional programming language,
02:24:16.620 | more like an OCaml or a Haskell.
02:24:18.660 | - Let's actually just explore,
02:24:19.980 | we touched it a little bit, but just on the art,
02:24:23.020 | the science and the art of programming.
02:24:25.380 | For you personally, how much of your programming
02:24:27.180 | is done with GPT currently?
02:24:29.060 | - None.
02:24:29.900 | - None.
02:24:30.740 | - I don't use it at all.
02:24:31.900 | - Because you prioritize simplicity so much.
02:24:34.940 | - Yeah, I find that a lot of it is noise.
02:24:37.780 | I do use VS Code,
02:24:39.060 | and I do like some amount of autocomplete.
02:24:43.300 | I do like a very,
02:24:45.140 | a very, like, feels-like-rules-based autocomplete.
02:24:47.580 | Like an autocomplete that's going to complete
02:24:49.180 | the variable name for me, so I don't have to type it,
02:24:50.580 | I can just press tab.
02:24:51.780 | All right, that's nice.
02:24:52.620 | But I don't want an autocomplete.
02:24:53.860 | You know what I hate?
02:24:54.700 | When autocompletes, when I type the word for,
02:24:56.780 | and it like puts like two, two parentheses
02:24:59.260 | and two semicolons and two braces,
02:25:00.700 | I'm like, oh man.
02:25:02.500 | - Well, I mean, with VS Code and GPT with Codex,
02:25:07.340 | you can kind of brainstorm.
02:25:11.340 | I find,
02:25:12.380 | I'm like probably the same as you,
02:25:15.380 | but I like that it generates code,
02:25:18.380 | and you basically disagree with it
02:25:20.060 | and write something simpler.
02:25:21.500 | But to me, that somehow is like inspiring.
02:25:24.820 | It makes me feel good.
02:25:25.660 | It also gamifies the simplification process,
02:25:27.820 | 'cause I'm like, oh yeah, you dumb AI system.
02:25:30.580 | You think this is the way to do it.
02:25:32.060 | I have a simpler thing here.
02:25:33.260 | - It just constantly reminds me of like bad stuff.
02:25:36.820 | I mean, I tried the same thing with rap, right?
02:25:38.500 | I tried the same thing with rap,
02:25:39.420 | and actually I think I'm a much better programmer
02:25:40.660 | than rapper.
02:25:41.580 | But like I even tried, I was like, okay,
02:25:42.780 | can we get some inspiration from these things
02:25:44.380 | for some rap lyrics?
02:25:46.140 | And I just found that it would go back
02:25:47.780 | to the most like cringy tropes and dumb rhyme schemes.
02:25:51.540 | And I'm like, yeah, this is what the code looks like too.
02:25:54.820 | - I think you and I probably have different thresholds
02:25:56.860 | for cringe code.
02:25:58.580 | You probably hate cringe code.
02:26:00.980 | So it's for you.
02:26:01.980 | I mean, boilerplate is a part of code.
02:26:07.380 | Like some of it,
02:26:10.580 | yeah, and some of it is just like faster lookup.
02:26:17.140 | 'Cause I don't know about you,
02:26:18.220 | but I don't remember everything.
02:26:20.620 | I'm offloading so much of my memory about like,
02:26:22.980 | yeah, different functions, library functions,
02:26:26.340 | all that kind of stuff.
02:26:27.500 | Like this GPT just is very fast at standard stuff.
02:26:31.980 | And like standard library stuff,
02:26:35.100 | basic stuff that everybody uses.
02:26:36.700 | - Yeah, I think that,
02:26:40.180 | I don't know.
02:26:43.420 | I mean, there's just so little of this in Python.
02:26:46.140 | And maybe if I was coding more in other languages,
02:26:48.340 | I would consider it more,
02:26:49.860 | but I feel like Python already does such a good job
02:26:52.620 | of removing any boilerplate.
02:26:55.060 | - That's true.
02:26:55.900 | - It's the closest thing you can get to pseudocode, right?
02:26:57.980 | - Yeah, that's true.
02:26:59.620 | That's true.
02:27:00.460 | - I'm like, yeah, sure.
02:27:01.380 | If I like, yeah, great GPT.
02:27:03.740 | Thanks for reminding me to free my variables.
02:27:06.180 | Unfortunately, you didn't really recognize
02:27:08.060 | the scope correctly and you can't free that one.
02:27:10.340 | But like you put the freeze there and like, I get it.
02:27:12.980 | - Fiverr.
02:27:15.580 | Whenever I've used Fiverr for certain things,
02:27:17.860 | like design or whatever, it's always, you come back.
02:27:21.220 | I think that's probably closer,
02:27:22.700 | my experience with Fiverr is closer to your experience
02:27:24.580 | with programming with GPT is like,
02:27:26.580 | you're just frustrated and feel worse
02:27:28.100 | about the whole process of design and art
02:27:30.460 | and whatever you use Fiverr for.
02:27:32.500 | Still, I just feel like later versions of GPT,
02:27:38.540 | I'm using GPT as much as possible
02:27:43.500 | to just learn the dynamics of it,
02:27:45.860 | like these early versions,
02:27:48.100 | because it feels like in the future,
02:27:49.380 | you'll be using it more and more.
02:27:51.540 | And so like, I don't want to be,
02:27:53.700 | for the same reason I gave away all my books
02:27:56.300 | and switched to Kindle,
02:27:57.820 | 'cause like, all right,
02:27:59.660 | how long are we gonna have paper books?
02:28:01.620 | Like 30 years from now?
02:28:03.220 | Like I wanna learn to be reading on Kindle,
02:28:05.940 | even though I don't enjoy it as much.
02:28:07.500 | And you learn to enjoy it more.
02:28:08.900 | In the same way, I switched from, let me just pause.
02:28:12.780 | - Switched from Emacs to VS Code.
02:28:14.620 | - Yeah.
02:28:15.540 | I switched from Vim to VS Code.
02:28:16.700 | I think I, similar, but.
02:28:18.060 | - Yeah, it's tough.
02:28:19.220 | And Vim to VS Code is even tougher,
02:28:21.540 | 'cause Emacs is like old, like more outdated, feels like it.
02:28:25.940 | The community is more outdated.
02:28:28.220 | Vim is like pretty vibrant still.
02:28:30.260 | So it's just.
02:28:31.100 | - I never used any of the plugins.
02:28:32.340 | I still don't use any of the plugins.
02:28:33.180 | - That's what I, I looked at myself in the mirror.
02:28:34.420 | I'm like, yeah, you wrote some stuff in Lisp.
02:28:36.660 | Yeah.
02:28:37.500 | - No, but I never used any of the plugins in Vim either.
02:28:39.300 | I had the most vanilla Vim.
02:28:40.500 | I have a syntax highlighter.
02:28:41.540 | I didn't even have autocomplete.
02:28:42.540 | Like these things, I feel like help you so marginally
02:28:47.540 | that like, and now, okay, now VS Code's autocomplete
02:28:53.660 | has gotten good enough that like, okay,
02:28:55.660 | I don't have to set it up.
02:28:56.500 | I can just go into any code base
02:28:57.420 | and autocomplete's right 90% of the time.
02:28:59.140 | Okay, cool, I'll take it.
02:29:00.820 | All right, so I don't think I'm gonna have a problem
02:29:03.700 | at all adapting to the tools once they're good.
02:29:06.060 | But like the real thing that I want
02:29:08.660 | is not something that like tab completes my code
02:29:12.740 | and gives me ideas.
02:29:13.580 | The real thing that I want
02:29:14.700 | is a very intelligent pair programmer
02:29:17.020 | that comes up with a little pop-up saying,
02:29:19.900 | hey, you wrote a bug on line 14 and here's what it is.
02:29:23.460 | - Yeah.
02:29:24.300 | - Now I like that.
02:29:25.140 | You know what does a good job of this?
02:29:26.060 | MyPy.
02:29:27.660 | I love MyPy.
02:29:28.700 | MyPy, this fancy type checker for Python.
02:29:31.300 | And actually I tried like Microsoft released one too.
02:29:33.220 | And it was like 60% false positives.
02:29:36.700 | MyPy is like 5% false positives.
02:29:38.740 | 95% of the time it recognizes,
02:29:41.300 | I didn't really think about
02:29:42.180 | that typing interaction correctly.
02:29:43.740 | Thank you, MyPy.
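
A tiny hypothetical example of the kind of mistake a static checker like mypy flags without running the code: a function declared to return an int that can actually return a float, and an Optional value used without a None check.

```python
from typing import Optional

def find_user(name: str) -> Optional[dict]:
    return {"name": name} if name else None

def total_ms(samples: list[float]) -> int:
    # mypy error: the declared return type is int, but sum() of floats
    # times 1000 is a float.
    return sum(samples) * 1000

def greet(name: str) -> str:
    user = find_user(name)
    # mypy error: 'user' may be None here, so indexing it is unsafe.
    return "hi " + user["name"]
```
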
02:29:44.580 | - So you like type hinting.
02:29:46.780 | You like pushing the language
02:29:48.980 | towards being a typed language.
02:29:51.100 | - Oh yeah, absolutely.
02:29:52.100 | I think optional typing is great.
02:29:54.980 | I mean, look, I think that like,
02:29:55.900 | it's like a meet in the middle, right?
02:29:56.940 | Like Python has this optional type hinting
02:29:58.580 | and like C++ has auto.
02:30:01.620 | - C++ allows you to take a step back.
02:30:03.780 | - Well, C++ would have you brutally type out
02:30:05.980 | std::string::iterator, right?
02:30:08.100 | Now I can just type auto, which is nice.
02:30:09.900 | And then Python used to just have a,
02:30:12.980 | what type is a?
02:30:14.100 | It's an a.
02:30:16.700 | A colon str.
02:30:18.340 | Oh, okay.
02:30:19.180 | It's a string, cool.
02:30:20.420 | - Yeah.
02:30:21.260 | - I wish there was a way,
02:30:22.460 | like a simple way in Python to like turn on a mode,
02:30:26.260 | which would enforce the types.
02:30:28.500 | - Yeah, like give a warning
02:30:29.420 | when there's no type or something like this.
02:30:30.740 | - Well, no, to give a warning where,
02:30:32.100 | like MyPy is a static type checker,
02:30:33.660 | but I'm asking just for a runtime type checker.
02:30:35.500 | Like there's like ways to like hack this in,
02:30:37.260 | but I wish it was just like a flag,
02:30:38.420 | like Python three dash T.
02:30:40.460 | - Oh, I see.
02:30:41.380 | I see.
02:30:42.220 | - Enforce the types at runtime.
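
A minimal sketch of what a hypothetical "python3 -T" mode could do, written as a plain standard-library decorator: check annotated arguments and return values with isinstance at call time. It only handles plain classes, not generics like list[int], so it's an illustration of the idea rather than a real replacement for a type checker.

```python
# Runtime type enforcement sketch using only the standard library.
import functools
import inspect
from typing import get_type_hints

def enforce_types(fn):
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        # Check each annotated argument (plain classes only).
        for name, value in bound.arguments.items():
            expected = hints.get(name)
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError(
                    f"{name}: expected {expected.__name__}, got {type(value).__name__}"
                )
        result = fn(*args, **kwargs)
        expected = hints.get("return")
        if isinstance(expected, type) and not isinstance(result, expected):
            raise TypeError(
                f"return: expected {expected.__name__}, got {type(result).__name__}"
            )
        return result

    return wrapper

@enforce_types
def shout(s: str) -> str:
    return s.upper()

shout("hi")    # fine
# shout(42)    # raises TypeError at the call site instead of failing later
```
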
02:30:43.060 | - Yeah.
02:30:43.900 | I feel like that makes you a better programmer.
02:30:45.380 | That's the kind of test, right?
02:30:47.300 | That the type remains the same.
02:30:49.900 | - Well, no, that doesn't like mess any types up.
02:30:51.820 | But again, like MyPy is getting really good and I love it.
02:30:55.380 | And I can't wait for some of these tools
02:30:56.980 | to become AI powered.
02:30:58.420 | I want AIs reading my code and giving me feedback.
02:31:01.300 | I don't want AIs writing half-assed
02:31:05.060 | auto-complete stuff for me.
02:31:06.660 | - I wonder if you can now take GPT
02:31:08.980 | and give it a code that you wrote for a function
02:31:11.100 | and say, how can I make this simpler
02:31:13.220 | and have it accomplish the same thing?
02:31:15.380 | I think you'll get some good ideas on some code.
02:31:17.340 | Maybe not the code you write,
02:31:18.980 | for TinyGrad type of code,
02:31:22.100 | 'cause that requires so much design thinking,
02:31:24.220 | but like other kinds of code.
02:31:26.180 | - I don't know.
02:31:27.140 | I downloaded that plugin maybe like two months ago.
02:31:29.700 | I tried it again and found the same.
02:31:31.780 | Look, I don't doubt that these models
02:31:34.660 | are going to first become useful to me,
02:31:37.860 | then be as good as me and then surpass me.
02:31:40.420 | But from what I've seen today,
02:31:42.740 | it's like someone occasionally taking over my keyboard
02:31:47.740 | that I hired from Fiverr.
02:31:51.420 | I'd rather not.
02:31:53.940 | - Ideas about how to debug the code
02:31:55.820 | or basically a better debugger is really interesting.
02:31:58.740 | But it's not a better debugger.
02:31:59.900 | I guess I would love a better debugger.
02:32:01.780 | - Yeah, it's not yet.
02:32:02.620 | Yeah, but it feels like it's not too far.
02:32:04.500 | - Yeah, one of my coworkers says
02:32:05.860 | he uses them for print statements.
02:32:07.460 | Like every time he has to like, just like when he needs,
02:32:09.340 | the only thing I can really write is like,
02:32:11.060 | okay, I just want to write the thing
02:32:12.100 | to like print the state out right now.
02:32:14.380 | - Oh, that definitely is much faster.
02:32:17.740 | It's print statements, yeah.
02:32:19.340 | I see myself using that a lot
02:32:20.940 | just 'cause it figures out the rest of the functions.
02:32:23.180 | It's just like, okay, print everything.
02:32:24.460 | - Yeah, print everything, right?
02:32:25.460 | And then yeah, like if you want a pretty printer, maybe.
02:32:27.980 | I'm like, yeah, you know what?
02:32:28.820 | I think in two years,
02:32:30.860 | I'm gonna start using these plugins a little bit.
02:32:33.780 | And then in five years,
02:32:35.020 | I'm gonna be heavily relying on some AI augmented flow.
02:32:38.060 | And then in 10 years.
02:32:39.660 | - Do you think you'll ever get to 100%?
02:32:41.660 | What's the role of the human
02:32:45.620 | that it converges to as a programmer?
02:32:48.500 | - No.
02:32:49.340 | - So you think it's all generated?
02:32:52.020 | - Our niche becomes,
02:32:53.140 | oh, I think it's over for humans in general.
02:32:55.340 | It's not just programming, it's everything.
02:32:57.900 | - So niche becomes, whoa.
02:32:59.340 | - Our niche becomes smaller and smaller and smaller.
02:33:00.740 | In fact, I'll tell you what the last niche
02:33:02.300 | of humanity is gonna be.
02:33:04.380 | There's a great book,
02:33:05.460 | and it's, if I recommended Metamorphosis
02:33:07.140 | of the Prime Intellect last time,
02:33:08.880 | there is a sequel called A Casino Odyssey in Cyberspace.
02:33:12.380 | And I don't wanna give away the ending of this,
02:33:15.780 | but it tells you what the last remaining human currency is.
02:33:18.820 | And I agree with that.
02:33:19.920 | - We'll leave that as the cliffhanger.
02:33:25.900 | - So no more programmers left, huh?
02:33:27.780 | That's where we're going.
02:33:29.540 | - Well, unless you want handmade code,
02:33:31.220 | maybe they'll sell it on Etsy.
02:33:32.460 | This is handwritten code.
02:33:33.920 | Doesn't have that machine polished to it.
02:33:37.700 | It has those slight imperfections
02:33:39.100 | that would only be written by a person.
02:33:41.000 | - I wonder how far away we are from that.
02:33:44.460 | I mean, there's some aspect to,
02:33:46.460 | you know, on Instagram,
02:33:47.580 | your title is listed as prompt engineer.
02:33:49.660 | - Right?
02:33:50.500 | Thank you for noticing.
02:33:54.060 | - I don't know if it's ironic or non,
02:33:57.240 | or sarcastic or non.
02:34:00.580 | What do you think of prompt engineering
02:34:02.780 | as a scientific and engineering discipline,
02:34:06.340 | or maybe, and maybe art form?
02:34:08.700 | - You know what?
02:34:09.540 | I started comma six years ago.
02:34:12.140 | I started the tiny corp a month ago.
02:34:13.980 | So much has changed.
02:34:18.040 | Like I'm now thinking, I'm now like,
02:34:22.160 | I started like going through like similar comma processes
02:34:24.440 | to like starting a company.
02:34:25.280 | I'm like, okay, I'm gonna get an office in San Diego.
02:34:27.120 | I'm gonna bring people here.
02:34:28.520 | I don't think so.
02:34:30.260 | I think I'm actually gonna do remote, right?
02:34:32.680 | George, you're gonna do remote?
02:34:33.520 | You hate remote?
02:34:34.360 | Yeah, but I'm not gonna do job interviews.
02:34:36.440 | The only way you're gonna get a job
02:34:37.520 | is if you contribute to the GitHub, right?
02:34:39.760 | And then like, like interacting through GitHub,
02:34:44.240 | like GitHub being the real like project management software
02:34:48.080 | for your company.
02:34:48.920 | And the thing pretty much just is a GitHub repo.
02:34:52.080 | Is like showing me kind of what the future of, okay.
02:34:55.480 | So a lot of times I'll go on a Discord
02:34:56.920 | or kind of go on Discord
02:34:58.320 | and I'll throw out some random like,
02:34:59.640 | hey, you know, can you change
02:35:00.960 | instead of having log and exp as LL ops,
02:35:03.880 | change it to log two and exp two?
02:35:06.320 | It's a pretty small change.
02:35:07.160 | You could just use like change the base formula.
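
The change-of-base trick he's referring to, as a small Python sketch (illustrative, not tinygrad's actual code): if the low-level ops only provide log2 and exp2, natural log and exp fall out of the single constant log2(e).

```python
import math

LOG2_E = math.log2(math.e)  # log2(e) ≈ 1.442695

def log_via_log2(x: float) -> float:
    # ln(x) = log2(x) / log2(e)
    return math.log2(x) / LOG2_E

def exp_via_exp2(x: float) -> float:
    # e**x = 2**(x * log2(e))
    return 2.0 ** (x * LOG2_E)

assert abs(log_via_log2(10.0) - math.log(10.0)) < 1e-12
assert abs(exp_via_exp2(3.0) - math.exp(3.0)) < 1e-9
```
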
02:35:09.120 | That's the kind of task that I can see an AI
02:35:12.920 | being able to do in a few years.
02:35:14.840 | Like in a few years, I could see myself describing that.
02:35:17.480 | And then within 30 seconds, a pull request is up
02:35:19.680 | that does it.
02:35:20.760 | And it passes my CI and I merge it, right?
02:35:23.320 | So I really started thinking about like,
02:35:24.880 | well, what is the future of like jobs?
02:35:28.480 | How many AIs can I employ at my company?
02:35:30.560 | As soon as we get the first tiny box up,
02:35:32.080 | I'm gonna stand up a 65B llama in the Discord.
02:35:35.320 | And it's like, yeah, here's the tiny box.
02:35:36.640 | He's just like, he's chilling with us.
02:35:39.040 | - Basically, like you said with niches,
02:35:42.640 | most human jobs will eventually be replaced
02:35:47.160 | with prompt engineering.
02:35:48.440 | - Well, prompt engineering kind of is this like,
02:35:51.280 | as you like move up the stack, right?
02:35:54.960 | Like, okay, there used to be humans
02:35:56.480 | actually doing arithmetic by hand.
02:35:59.160 | And there used to be like big farms of people
02:36:00.600 | doing pluses and stuff, right?
02:36:03.000 | And then you have like spreadsheets, right?
02:36:05.240 | And then, okay, the spreadsheet can do the plus for me.
02:36:07.560 | And then you have like macros, right?
02:36:09.960 | And then you have like things that basically
02:36:11.320 | just are spreadsheets under the hood, right?
02:36:13.400 | Like accounting software.
02:36:17.360 | As we move further up the abstraction,
02:36:19.280 | what's at the top of the abstraction stack?
02:36:20.800 | Well, a prompt engineer.
02:36:22.600 | - Yeah.
02:36:23.440 | - Right, what is the last thing if you think about
02:36:26.440 | like humans wanting to keep control?
02:36:30.040 | Well, what am I really in the company
02:36:31.680 | but a prompt engineer, right?
02:36:33.600 | - Isn't there a certain point where the AI
02:36:35.840 | will be better at writing prompts?
02:36:38.600 | - Yeah, but you see the problem with the AI writing prompts,
02:36:41.600 | a definition that I always liked of AI
02:36:43.840 | was AI is the do what I mean machine, right?
02:36:46.880 | AI is not the, like the computer is so pedantic.
02:36:50.720 | It does what you say.
02:36:52.120 | So, but you want the do what I mean machine.
02:36:55.520 | - Yeah.
02:36:56.360 | - Right, you want the machine where you say,
02:36:57.840 | you know, get my grandmother out of the burning house.
02:36:59.680 | It like reasonably takes your grandmother
02:37:01.360 | and puts her on the ground,
02:37:02.200 | not lifts her a thousand feet above the burning house
02:37:04.280 | and lets her fall, right?
02:37:05.840 | - But you don't--
02:37:06.680 | - There's an old Yudkowsky example.
02:37:07.520 | (Lex laughing)
02:37:09.120 | - But it's not going to find the meaning.
02:37:13.080 | I mean, to do what I mean, it has to figure stuff out.
02:37:16.840 | - Sure.
02:37:17.680 | - And the thing you'll maybe ask it to do
02:37:21.320 | is run government for me.
02:37:23.760 | - Oh, and do what I mean very much comes down
02:37:25.720 | to how aligned is that AI with you?
02:37:28.120 | Of course, when you talk to an AI
02:37:31.120 | that's made by a big company in the cloud,
02:37:34.120 | the AI fundamentally is aligned to them, not to you.
02:37:37.800 | - Yeah.
02:37:38.640 | - And that's why you have to buy a tiny box.
02:37:39.800 | So you make sure the AI stays aligned to you.
02:37:41.720 | Every time that they start to pass AI regulation
02:37:45.360 | or GPU regulation, I'm gonna see sales of tiny boxes spike.
02:37:48.240 | It's gonna be like guns, right?
02:37:49.560 | Every time they talk about gun regulation,
02:37:51.440 | boom, gun sales.
02:37:53.080 | - So in the space of AI, you're an anarchist,
02:37:55.440 | anarchism, espouser, believer.
02:37:58.760 | - I'm an informational anarchist, yes.
02:38:00.600 | I'm an informational anarchist and a physical statist.
02:38:03.800 | I do not think anarchy in the physical world is very good
02:38:07.400 | because I exist in the physical world.
02:38:09.040 | But I think we can construct this virtual world
02:38:11.400 | where anarchy, it can't hurt you, right?
02:38:13.440 | I love that Tyler, the creator, tweet.
02:38:16.040 | Yo, cyber bullying isn't real, man.
02:38:18.200 | Have you tried?
02:38:19.040 | Turn it off the screen, close your eyes.
02:38:21.240 | Like.
02:38:22.080 | - Yeah.
02:38:22.920 | But how do you prevent the AI
02:38:28.840 | from basically replacing all human prompt engineers?
02:38:33.840 | Where there's, it's like a self,
02:38:36.400 | like where nobody's the prompt engineer anymore.
02:38:38.360 | So autonomy, greater and greater autonomy
02:38:40.560 | until it's full autonomy.
02:38:41.680 | - Yeah.
02:38:43.120 | - And that's just where it's headed.
02:38:45.040 | 'Cause one person's gonna say, run everything for me.
02:38:49.200 | - You see, I look at potential futures.
02:38:54.080 | And as long as the AIs go on to create a vibrant civilization
02:38:59.080 | with diversity and complexity across the universe,
02:39:04.920 | more power to them, I'll die.
02:39:06.360 | If the AIs go on to actually like turn the world
02:39:09.760 | into paperclips and then they die out themselves,
02:39:12.040 | well, that's horrific and we don't want that to happen.
02:39:14.560 | So this is what I mean about like robustness.
02:39:17.000 | I trust robust machines.
02:39:19.200 | The current AIs are so not robust.
02:39:20.920 | Like this comes back to the idea
02:39:21.960 | that we've never made a machine that can self replicate.
02:39:24.800 | Right?
02:39:25.640 | But when we have, if the machines are truly robust
02:39:28.400 | and there is one prompt engineer left in the world,
02:39:30.960 | hope you're doing good, man.
02:39:34.120 | Hope you believe in God.
02:39:35.080 | Like, you know, go with God.
02:39:38.800 | And go forth and conquer the universe.
02:39:42.800 | - Well, you mentioned, 'cause I talked to Mark
02:39:44.800 | about faith in God and you said you were impressed by that.
02:39:48.600 | What's your own belief in God
02:39:50.240 | and how does that affect your work?
02:39:52.640 | - You know, I never really considered when I was younger,
02:39:56.000 | I guess my parents were atheists.
02:39:57.360 | So I was raised kind of atheist.
02:39:58.320 | I never really considered how absolutely
02:39:59.680 | like silly atheism is.
02:40:01.440 | 'Cause like I create worlds, right?
02:40:05.120 | Every like game creator, like how are you an atheist, bro?
02:40:08.200 | You create worlds.
02:40:09.480 | Who's a, no one created our world, man.
02:40:11.240 | That's different.
02:40:12.080 | Haven't you heard about like the Big Bang and stuff?
02:40:13.480 | Yeah, I mean, what's the Skyrim myth origin story in Skyrim?
02:40:17.360 | I'm sure there's like some part of it in Skyrim,
02:40:19.200 | but it's not like if you ask the creators,
02:40:21.480 | like the Big Bang is in universe, right?
02:40:23.880 | I'm sure they have some Big Bang notion in Skyrim, right?
02:40:27.120 | But that obviously is not at all
02:40:28.560 | how Skyrim was actually created.
02:40:30.040 | It was created by a bunch of programmers in a room, right?
02:40:32.520 | So like, you know, it just struck me one day
02:40:35.800 | how just silly atheism is.
02:40:37.320 | Like, of course we were created by God.
02:40:39.480 | It's the most obvious thing.
02:40:40.880 | - Yeah, that's such a nice way to put it.
02:40:46.800 | Like we're such powerful creators ourselves.
02:40:50.320 | It's silly not to conceive that there's creators
02:40:53.480 | even more powerful than us.
02:40:54.840 | - Yeah, and then like, I also just like, I like that notion.
02:40:58.240 | That notion gives me a lot of,
02:41:00.360 | I mean, I guess you can talk about it,
02:41:01.960 | what it gives a lot of religious people.
02:41:03.240 | It's kind of like, it just gives me comfort.
02:41:04.680 | It's like, you know what?
02:41:05.920 | If we mess it all up and we die out, eh.
02:41:08.960 | - Yeah, and the same way that a video game
02:41:11.160 | kind of has comfort in it.
02:41:12.240 | - God'll try again.
02:41:13.320 | - Or there's balance.
02:41:15.360 | Like somebody figured out a balanced view of it.
02:41:18.800 | Like how to, like, so it all makes sense in the end.
02:41:22.560 | Like a video game is usually not gonna have
02:41:25.200 | crazy, crazy stuff.
02:41:26.880 | - You know, people will come up with like,
02:41:29.760 | well, yeah, but like, man, who created God?
02:41:33.840 | I'm like, that's God's problem.
02:41:36.400 | No, like, I'm not gonna think this is,
02:41:38.640 | what are you asking me, what, if God believes in God?
02:41:41.320 | - I'm just this NPC living in this game.
02:41:43.320 | - I mean, to be fair, like if God didn't believe in God,
02:41:45.880 | he'd be as, you know, silly as the atheists here.
02:41:48.720 | - What do you think is the greatest
02:41:51.120 | computer game of all time?
02:41:52.720 | Do you have any time to play games anymore?
02:41:55.600 | Have you played Diablo 4?
02:41:57.360 | - I have not played Diablo 4.
02:41:59.160 | - I will be doing that shortly.
02:42:00.880 | I have to. - All right.
02:42:01.920 | - There's just so much history with one, two, and three.
02:42:04.240 | - You know what?
02:42:05.240 | I'm gonna say World of Warcraft.
02:42:07.240 | - Ooh.
02:42:08.440 | - And it's not that the game is such a great game.
02:42:13.040 | It's not.
02:42:14.520 | It's that I remember in 2005 when it came out,
02:42:18.520 | how it opened my mind to ideas.
02:42:22.440 | It opened my mind to like,
02:42:24.120 | this whole world we've created, right?
02:42:28.480 | There's almost been nothing like it since.
02:42:30.400 | Like, you can look at MMOs today,
02:42:32.480 | and I think they all have lower user bases
02:42:34.280 | than World of Warcraft.
02:42:35.280 | Like, EVE Online's kind of cool.
02:42:37.400 | But to think that like, everyone knows,
02:42:41.640 | you know, people are always like,
02:42:42.880 | they look at the Apple headset,
02:42:44.080 | like, what do people want in this VR?
02:42:47.040 | Everyone knows what they want.
02:42:47.880 | I want Ready Player One.
02:42:49.840 | And like that.
02:42:51.400 | So I'm gonna say World of Warcraft,
02:42:52.680 | and I'm hoping that games can get out of this whole
02:42:56.480 | mobile gaming dopamine pump thing.
02:42:59.440 | And like- - Create worlds.
02:43:00.840 | - Create worlds, yeah.
02:43:03.080 | - And worlds that captivate a very large fraction
02:43:05.600 | of the human population.
02:43:06.760 | - Yeah, and I think it'll come back, I believe.
02:43:09.600 | - But MMO, like really, really pull you in.
02:43:13.280 | - Games do a good job.
02:43:14.240 | I mean, okay, other, like two other games
02:43:15.960 | that I think are very noteworthy for me
02:43:17.720 | are Skyrim and GTA V.
02:43:19.840 | - Skyrim, yeah.
02:43:21.720 | That's probably number one for me.
02:43:24.920 | Yeah, what is it about GTA?
02:43:26.560 | GTA is really,
02:43:28.960 | I guess GTA is real life.
02:43:32.880 | I know there's prostitutes and guns and stuff.
02:43:35.000 | - There exists a real life too.
02:43:36.520 | - Yes, I know.
02:43:38.840 | But it's how I imagine your life to be, actually.
02:43:42.080 | - I wish it was that cool.
02:43:43.240 | - Yeah.
02:43:44.080 | Yeah, I guess that's, you know,
02:43:46.920 | 'cause there's Sims, right?
02:43:48.320 | Which is also a game I like.
02:43:50.240 | But it's a gamified version of life.
02:43:52.480 | But it also is,
02:43:53.680 | I would love a combination of Sims and GTA.
02:43:58.640 | So more freedom, more violence, more rawness.
02:44:02.000 | But with also like ability to have a career and family
02:44:05.120 | and this kind of stuff.
02:44:05.960 | - What I'm really excited about in games
02:44:08.240 | is like once we start getting
02:44:10.320 | intelligent AIs to interact with.
02:44:11.880 | - Oh yeah.
02:44:12.720 | - Like the NPCs in games have never been.
02:44:14.680 | - But conversationally.
02:44:17.800 | Oh, in every way.
02:44:19.080 | - In like, yeah, in like every way.
02:44:21.520 | Like when you're actually building a world
02:44:23.360 | and a world imbued with intelligence.
02:44:26.840 | - Oh yeah.
02:44:27.680 | - And it's just hard.
02:44:28.640 | There's just like, you know, running World of Warcraft.
02:44:30.560 | Like you're limited by what you're running on a Pentium 4.
02:44:33.160 | You know, how much intelligence can you run?
02:44:34.400 | How many flops did you have?
02:44:36.440 | But now when I'm running a game
02:44:39.040 | on a hundred petaflop machine, well, it's five people.
02:44:42.200 | I'm trying to make this a thing.
02:44:43.480 | 20 petaflops of compute is one person of compute.
02:44:45.880 | I'm trying to make that a unit.
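
As a quick sanity check of the proposed unit (using the 20-petaflop figure he gives), a hundred-petaflop machine comes out to the five people mentioned above:

```python
PERSON_FLOPS = 20e15  # hedged: 20 petaflops per "person of compute"

def persons_of_compute(flops: float) -> float:
    return flops / PERSON_FLOPS

print(persons_of_compute(100e15))  # 5.0
```
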
02:44:47.120 | - 20 petaflops is one person.
02:44:50.200 | - One person.
02:44:51.320 | - One person flop.
02:44:52.160 | - It's like a horsepower.
02:44:53.320 | But what's a horsepower?
02:44:55.520 | It's how powerful a horse is.
02:44:56.360 | What's a person of compute?
02:44:57.440 | Well, you know.
02:44:58.280 | - You know, you flop.
02:44:59.120 | I got it.
02:45:00.680 | That's interesting.
02:45:01.720 | VR also adds, I mean, in terms of creating worlds.
02:45:05.960 | - You know what?
02:45:07.040 | Bought a Quest 2.
02:45:08.680 | I put it on and I can't believe
02:45:11.000 | the first thing they show me is a bunch of scrolling clouds
02:45:14.040 | and a Facebook login screen.
02:45:15.900 | - Yeah.
02:45:16.800 | - You had the ability to bring me into a world.
02:45:19.680 | - Yeah.
02:45:20.500 | - And what did you give me?
02:45:21.680 | A pop-up, right?
02:45:22.840 | Like, and this is why you're not cool, Mark Zuckerberg.
02:45:25.920 | But you could be cool.
02:45:27.060 | Just make sure on the Quest 3,
02:45:28.780 | you don't put me into clouds and a Facebook login screen.
02:45:31.320 | Bring me to a world.
02:45:32.400 | - I just tried Quest 3.
02:45:34.060 | It was awesome.
02:45:34.900 | But hear that, guys?
02:45:35.760 | I agree with that.
02:45:36.600 | So I--
02:45:37.440 | - We didn't have the clouds in the world.
02:45:39.640 | It was just so--
02:45:40.480 | - You know what?
02:45:41.300 | 'Cause I, I mean, the beginning, what is it?
02:45:44.440 | Todd Howard said this about the design
02:45:47.480 | of the beginning of the games he creates.
02:45:48.800 | It's like the beginning is so, so, so important.
02:45:51.680 | I recently played Zelda for the first time.
02:45:53.520 | Zelda Breath of the Wild, the previous one.
02:45:55.720 | And it's very quickly, you come out of this,
02:46:00.220 | within 10 seconds, you come out of a cave-type place,
02:46:03.200 | and it's like, this world opens up.
02:46:05.520 | It's like, ah.
02:46:07.200 | And it pulls you in.
02:46:09.840 | You forget whatever troubles I was having, whatever--
02:46:13.440 | - I gotta play that from the beginning.
02:46:14.440 | I played it for an hour at a friend's house.
02:46:16.300 | - Ah, no, the beginning, they got it.
02:46:18.120 | They did it really well, the expansiveness of that space,
02:46:21.900 | the peacefulness of that place.
02:46:25.080 | They got this, the music, I mean, so much of that.
02:46:27.200 | It's creating that world and pulling you right in.
02:46:29.360 | - I'm gonna go buy a Switch.
02:46:30.960 | I'm gonna go today and buy a Switch.
02:46:32.240 | - You should.
02:46:33.080 | Well, the new one came out, I haven't played that yet,
02:46:34.280 | but Diablo IV or something.
02:46:37.000 | I mean, there's sentimentality also,
02:46:39.000 | but something about VR really is incredible.
02:46:43.400 | But the new Quest 3 is mixed reality,
02:46:47.640 | and I got a chance to try that.
02:46:49.040 | So it's augmented reality.
02:46:51.160 | And for video games, it's done really, really well.
02:46:53.960 | - Is it pass-through or cameras?
02:46:55.000 | - It's cameras, sorry.
02:46:56.520 | The Apple one, is that one pass-through or cameras?
02:46:58.880 | - I don't know.
02:46:59.960 | I don't know how real it is.
02:47:01.040 | I don't know anything.
02:47:02.280 | - Coming out in January.
02:47:05.160 | - Is it January or is it some point?
02:47:06.720 | - Some point, maybe not January.
02:47:08.560 | Maybe that's my optimism, but Apple, I will buy it.
02:47:10.640 | I don't care if it's expensive and does nothing.
02:47:12.640 | I will buy it.
02:47:13.480 | I will support this future endeavor.
02:47:14.800 | - You're the meme.
02:47:16.160 | Oh, yes, I support competition.
02:47:18.880 | It seemed like Quest was the only people doing it,
02:47:21.680 | and this is great that they're like,
02:47:24.360 | you know what, and this is another place
02:47:26.000 | we'll give some more respect to Mark Zuckerberg.
02:47:28.640 | The two companies that have endured
02:47:30.760 | through technology are Apple and Microsoft.
02:47:33.360 | And what do they make?
02:47:34.760 | Computers and business services.
02:47:37.120 | All the memes, social ads, they all come and go.
02:47:40.640 | But you want to endure, build hardware.
02:47:44.960 | - Yeah, and that does a really interesting job.
02:47:49.280 | Maybe I'm a noob at this, but it's a $500 headset
02:47:54.000 | Quest 3, and just having creatures run around the space,
02:47:59.000 | like our space right here, to me, okay,
02:48:01.600 | this is very like boomer statement,
02:48:04.120 | but it added windows to the place.
02:48:07.840 | - I heard about the aquarium, yeah.
02:48:10.480 | - Yeah, aquarium, but in this case,
02:48:11.880 | it was a zombie game, whatever, it doesn't matter.
02:48:13.840 | But just like, it modifies the space in a way where I can't,
02:48:18.680 | it really feels like a window and you can look out.
02:48:22.600 | It's pretty cool, like I was just,
02:48:24.120 | it's like a zombie game, they're running at me, whatever.
02:48:26.600 | But what I was enjoying is the fact
02:48:27.920 | that there's like a window,
02:48:29.360 | and they're stepping on objects in this space.
02:48:32.180 | That was a different kind of escape.
02:48:35.200 | Also because you can see the other humans,
02:48:37.560 | so it's integrated with the other humans.
02:48:39.080 | It's really, really interesting.
02:48:40.560 | - And that's why it's more important than ever
02:48:42.560 | that the AI is running on those systems
02:48:44.400 | are aligned with you.
02:48:46.160 | - Oh yeah.
02:48:47.000 | - They're gonna augment your entire world.
02:48:48.600 | - Oh yeah.
02:48:49.880 | - And that, those AIs have a,
02:48:52.600 | I mean, you think about all the dark stuff,
02:48:55.100 | like sexual stuff.
02:48:58.240 | Like if those AIs threaten me, that could be haunting.
02:49:02.340 | Like if they like threaten me in a non-video game way,
02:49:07.040 | it's like, like they'll know personal information about me.
02:49:11.360 | And it's like, and then you lose track
02:49:12.960 | of what's real, what's not.
02:49:14.160 | Like what if stuff is like hacked.
02:49:15.600 | - There's two directions
02:49:16.480 | the AI girlfriend company can take, right?
02:49:18.600 | There's like the highbrow, something like her,
02:49:20.840 | maybe something you kind of talk to.
02:49:22.120 | And this is, and then there's the lowbrow version of it
02:49:24.200 | where I want to set up a brothel in Times Square.
02:49:26.200 | - Yeah.
02:49:27.440 | - Yeah.
02:49:28.280 | It's not cheating if it's a robot.
02:49:29.600 | It's a VR experience.
02:49:31.080 | - Is there an in between?
02:49:32.880 | - No, I don't wanna do that one or that one.
02:49:35.280 | - Have you decided yet?
02:49:36.200 | - No, I'll figure it out.
02:49:37.160 | We'll see what the technology goes.
02:49:39.400 | - I would love to hear your opinions
02:49:41.040 | for George's third company,
02:49:43.680 | what to do, the brothel in Times Square
02:49:46.360 | or the her experience.
02:49:48.840 | What do you think company number four will be?
02:49:53.320 | You think there'll be a company number four?
02:49:54.440 | - There's a lot to do in company number two.
02:49:56.080 | I'm just like, I'm talking about company number three now.
02:49:59.760 | None of that tech exists yet.
02:49:59.760 | There's a lot to do in company number two.
02:50:01.680 | Company number two is going to be the great struggle
02:50:04.120 | of the next six years.
02:50:05.320 | And if the next six years,
02:50:06.640 | how centralized is compute going to be?
02:50:09.000 | The less centralized compute is going to be,
02:50:10.840 | the better of a chance we all have.
02:50:12.280 | - So you're a bearer,
02:50:13.520 | you're like a flag bearer for open source, distributed
02:50:17.080 | decentralization of compute.
02:50:19.440 | - We have to, we have to,
02:50:20.640 | or they will just completely dominate us.
02:50:22.440 | I showed a picture on stream of a man in a chicken farm.
02:50:26.280 | You ever seen one of those like factory farm chicken farms?
02:50:28.440 | Why does he dominate all the chickens?
02:50:30.320 | Why does he- - Smarter.
02:50:33.600 | - He's smarter, right?
02:50:35.040 | Some people on Twitch were like,
02:50:36.400 | he's bigger than the chickens.
02:50:37.680 | Yeah, and now here's a man in a cow farm, right?
02:50:41.440 | So it has nothing to do with their size
02:50:42.880 | and everything to do with their intelligence.
02:50:44.680 | And if one central organization has all the intelligence,
02:50:48.920 | you'll be the chickens and they'll be the chicken man.
02:50:52.200 | But if we all have the intelligence,
02:50:54.440 | we're all the chickens.
02:50:55.680 | We're not all the man, we're all the chickens.
02:50:59.960 | And there's no chicken man.
02:51:01.440 | - There's no chicken man.
02:51:03.200 | We're just chickens in Miami.
02:51:04.920 | - He was having a good life, man.
02:51:07.200 | - I'm sure he was.
02:51:08.800 | I'm sure he was.
02:51:09.960 | What have you learned from launching
02:51:11.640 | and running comma.ai and TinyCorp?
02:51:13.600 | So this starting a company from an idea and scaling it.
02:51:18.120 | And by the way, I'm all in on TinyBox.
02:51:20.120 | I guess it's pre-order only now.
02:51:24.400 | - I wanna make sure it's good.
02:51:25.520 | I wanna make sure that like the thing that I deliver
02:51:28.160 | is like not gonna be like a Quest 2,
02:51:30.480 | which you buy and use twice.
02:51:32.360 | I mean, it's better than a Quest,
02:51:33.400 | which you bought and used less than once statistically.
02:51:36.760 | - Well, if there's a beta program for TinyBox, I'm into.
02:51:40.000 | - Sounds good.
02:51:41.600 | - So I won't be the whiny,
02:51:43.160 | I'll be the tech savvy user of the TinyBox
02:51:47.960 | just to be in the early days.
02:51:50.600 | What have you learned from building these companies?
02:51:53.240 | - For the longest time at comma, I asked why,
02:51:57.120 | why did I start a company?
02:52:00.280 | Why did I do this?
02:52:01.520 | But you know, what else was I gonna do?
02:52:08.760 | - So you like, you like bringing ideas to life.
02:52:13.600 | - With comma, it really started as an ego battle with Elon.
02:52:19.640 | I wanted to beat him.
02:52:22.080 | Like I saw a worthy adversary, you know,
02:52:24.320 | here's a worthy adversary who I can beat
02:52:26.080 | at self-driving cars.
02:52:27.480 | And like, I think we've kept pace
02:52:29.320 | and I think he's kept ahead.
02:52:30.720 | I think that's what's ended up happening there.
02:52:32.800 | But I do think comma is, I mean, comma's profitable.
02:52:38.200 | And like when this drive GPT stuff starts working,
02:52:40.560 | that's it, there's no more like bugs in the loss function.
02:52:42.760 | Like right now we're using like a hand-coded simulator.
02:52:45.080 | There's no more bugs.
02:52:45.920 | This is gonna be it.
02:52:46.760 | Like this is the run-up to driving.
02:52:48.560 | - I hear a lot of really, a lot of props
02:52:52.120 | for openpilot, for comma.
02:52:53.560 | - It's so, it's better than FSD and Autopilot
02:52:56.600 | in certain ways.
02:52:57.440 | It has a lot more to do with which feel you like.
02:53:00.160 | We lowered the price on the hardware to $1,499.
02:53:02.800 | You know how hard it is to ship reliable consumer electronics
02:53:06.040 | that go on your windshield?
02:53:07.440 | We're doing more than like most cell phone companies.
02:53:11.520 | - How'd you pull that off, by the way,
02:53:12.680 | shipping a product that goes in a car?
02:53:14.680 | - I know.
02:53:15.760 | I have an SMT line.
02:53:17.920 | I make all the boards in-house in San Diego.
02:53:20.960 | - Quality control.
02:53:21.960 | - I care immensely about it.
02:53:24.000 | - You're basically a mom and pop shop with great testing.
02:53:29.000 | - Our head of OpenPilot is great at like, you know,
02:53:32.840 | okay, I want all the comma 3s to be identical.
02:53:35.400 | - Yeah.
02:53:36.480 | - And yeah, I mean, you know, it's, look, it's $1,499.
02:53:39.600 | It's a 30-day money-back guarantee.
02:53:42.320 | It will, it will blow your mind at what it can do.
02:53:45.400 | - Is it hard to scale?
02:53:46.520 | - You know what?
02:53:48.800 | There's kind of downsides to scaling it.
02:53:50.160 | People are always like, why don't you advertise?
02:53:52.480 | Our mission is to solve self-driving cars
02:53:54.040 | while delivering shippable intermediaries.
02:53:55.880 | Our mission has nothing to do with selling a million boxes.
02:53:59.000 | It's tawdry.
02:53:59.840 | - Do you think it's possible that comma gets sold?
02:54:05.840 | - Only if I felt someone could accelerate that mission
02:54:09.560 | and wanted to keep it open source.
02:54:11.760 | And like, not just wanted to,
02:54:13.280 | I don't believe what anyone says.
02:54:14.960 | I believe incentives.
02:54:16.840 | If a company wanted to buy comma and their incentives
02:54:19.800 | were to keep it open source--
02:54:20.640 | But comma doesn't stop at the cars.
02:54:23.320 | The cars are just the beginning.
02:54:24.840 | The device is a human head.
02:54:26.400 | The device has two eyes, two ears.
02:54:28.400 | It breathes air, has a mouth.
02:54:30.600 | - So you think this goes to embodied robotics?
02:54:33.200 | - We have, we sell comma bodies too.
02:54:35.280 | You know, they're very rudimentary.
02:54:37.520 | But one of the problems that we're running into
02:54:42.760 | is that the comma 3 has about as much intelligence as a bee.
02:54:46.200 | If you want a human's worth of intelligence,
02:54:50.240 | you're gonna need a tiny rack, not even a tiny box.
02:54:52.520 | You're gonna need like a tiny rack, maybe even more.
02:54:55.400 | - How does that, how do you put legs on that?
02:54:58.200 | - You don't, and there's no way you can.
02:55:00.280 | You connect to it wirelessly.
02:55:02.560 | So you put your tiny box or your tiny rack in your house,
02:55:06.080 | and then you get your ComA body,
02:55:07.720 | and your ComA body runs the models on that.
02:55:10.240 | It's close, right?
02:55:11.600 | You don't have to go to some cloud,
02:55:12.720 | which is 30 milliseconds away.
02:55:14.840 | You go to a thing which is 0.1 milliseconds away.
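As a rough illustration of why the local box matters, a minimal sketch of the latency budget; the 20 Hz loop rate is an assumed example, only the 30 ms and 0.1 ms round-trip figures come from the conversation:

```python
# Rough latency budget for a robot control loop, using the round-trip figures above.
# The 50 ms budget (a 20 Hz loop) is an illustrative assumption.

LOOP_BUDGET_MS = 50.0   # assume a 20 Hz perception/control loop
CLOUD_RTT_MS = 30.0     # "some cloud, which is 30 milliseconds away"
LOCAL_RTT_MS = 0.1      # a tiny box / tiny rack on the local network

for name, rtt in [("cloud", CLOUD_RTT_MS), ("local box", LOCAL_RTT_MS)]:
    left = LOOP_BUDGET_MS - rtt
    print(f"{name}: {rtt} ms round trip leaves {left:.1f} ms "
          f"({left / LOOP_BUDGET_MS:.0%} of the budget) for the model itself")
```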
02:55:18.160 | - So the AI girlfriend will have like a central hub
02:55:21.960 | in the home.
02:55:23.200 | - I mean, eventually, if you fast forward 20, 30 years,
02:55:26.640 | the mobile chips will get good enough to run these AIs.
02:55:29.360 | But fundamentally, it's not even a question
02:55:31.160 | of putting legs on a tiny box,
02:55:33.800 | because how are you getting 1.5 kilowatts of power
02:55:36.160 | on that thing, right?
02:55:37.920 | So you need, they're very synergistic businesses.
02:55:41.640 | I also wanna build all of comma's training computers.
02:55:44.400 | comma builds training computers right now.
02:55:46.720 | We use commodity parts.
02:55:48.920 | I think I can do it cheaper.
02:55:50.720 | So we're gonna build, TinyCorp is gonna not just sell
02:55:53.760 | tiny boxes, tiny boxes are the consumer version,
02:55:55.840 | but I'll build training data centers too.
02:55:57.680 | - Have you talked to Andrej Karpathy,
02:56:00.120 | have you talked to Elon about TinyCorp?
02:56:01.600 | - He went to work at OpenAI.
02:56:03.400 | - What do you love about Andrej Karpathy?
02:56:05.880 | To me, he's one of the truly special humans we got.
02:56:09.920 | - Oh man, like, you know, his streams are just a level
02:56:13.680 | of quality so far beyond mine.
02:56:16.440 | Like I can't help myself, like it's just, you know.
02:56:19.160 | - Yeah, he's good.
02:56:20.120 | - He wants to teach you.
02:56:23.320 | I want to show you that I'm smarter than you.
02:56:26.000 | - Yeah, he has no, I mean, thank you for the sort of,
02:56:30.080 | the raw, authentic honesty.
02:56:32.560 | I mean, a lot of us have that.
02:56:34.840 | I think Andrei is as legit as it gets in that
02:56:37.360 | he just wants to teach you and there's a curiosity
02:56:40.160 | that just drives him.
02:56:41.560 | And just like at his, at the stage where he is in life,
02:56:45.520 | to be still like one of the best tinkerers in the world.
02:56:50.240 | It's crazy.
02:56:51.280 | Like to, what is it, micrograd?
02:56:54.320 | - Micrograd was the inspiration for TinyGrad.
02:56:58.600 | - And that whole, I mean, his CS231N was,
02:57:02.040 | this was the inspiration.
02:57:03.520 | This is what I just took and ran with
02:57:04.880 | and ended up writing this, so, you know.
02:57:06.840 | - But I mean, to me that--
02:57:08.280 | - Don't go work for Darth Vader, man.
02:57:10.520 | - I mean, the flip side to me is that the fact
02:57:13.440 | that he's going there is a good sign for open AI.
02:57:18.200 | I think, you know, I like Ilya Sutskever a lot.
02:57:22.480 | I like those, those guys are really good at what they do.
02:57:25.720 | - I know they are.
02:57:26.720 | - And that's kind of what's even like more,
02:57:28.840 | and you know what?
02:57:29.760 | It's not that OpenAI doesn't open source
02:57:32.080 | the weights of GPT-4.
02:57:33.480 | It's that they go in front of Congress.
02:57:36.860 | And that is what upsets me.
02:57:39.200 | You know, we had two effective altruists,
02:57:41.080 | Sams, go in front of Congress.
02:57:42.560 | One's in jail.
02:57:43.400 | - I think you're drawing parallels there.
02:57:47.720 | - One's in jail.
02:57:48.680 | - You're giving me a look.
02:57:49.980 | (laughing)
02:57:51.240 | You're giving me a look.
02:57:52.080 | - No, I think effective altruism
02:57:53.280 | is a terribly evil ideology and, yeah.
02:57:55.600 | - Oh yeah, that's interesting.
02:57:56.440 | Why do you think that is?
02:57:57.280 | Why do you think there's something about
02:58:00.120 | a thing that sounds pretty good
02:58:01.620 | that kind of gets us into trouble?
02:58:04.160 | - Because you get Sam Bankman-Fried.
02:58:06.200 | Like Sam Bankman-Fried is the embodiment
02:58:07.860 | of effective altruism.
02:58:09.420 | Utilitarianism is an abhorrent ideology.
02:58:13.720 | Like, well yeah, we're gonna kill those three people
02:58:16.040 | to save a thousand, of course.
02:58:17.720 | - Yeah.
02:58:18.560 | - Right, there's no underlying, like there's just, yeah.
02:58:21.560 | - Yeah, but to me that's a bit surprising.
02:58:26.540 | But it's also, in retrospect, not that surprising.
02:58:30.700 | But I haven't heard really clear kind of,
02:58:33.260 | like rigorous analysis why effective altruism is flawed.
02:58:40.540 | - Oh, well, I think charity is bad, right?
02:58:42.540 | So what is charity but investment
02:58:43.820 | that you don't expect to have a return on, right?
02:58:46.260 | - Yeah, but you can also think of charity as like,
02:58:50.300 | as you would like to see,
02:58:54.380 | to allocate resources in an optimal way
02:58:56.620 | to make a better world.
02:58:59.980 | - And probably almost always
02:59:01.700 | that involves starting a company.
02:59:03.240 | - Yeah.
02:59:04.080 | - Right, because-- - More efficient.
02:59:04.940 | - Yeah, if you just take the money
02:59:06.420 | and you spend it on malaria nets, you know, okay, great.
02:59:10.020 | You've made 100 malaria nets, but if you teach--
02:59:13.060 | - Yeah, no matter how efficient.
02:59:14.580 | - Right, yeah.
02:59:15.420 | - No, but the problem is teaching no matter how efficient
02:59:17.740 | might be harder, starting a company might be harder
02:59:19.620 | than allocating money that you already have.
02:59:22.420 | - I like the flip side of effective altruism,
02:59:24.220 | effective accelerationism.
02:59:25.940 | I think accelerationism is the only thing
02:59:27.660 | that's ever lifted people out of poverty.
02:59:30.060 | The fact that food is cheap,
02:59:32.260 | not we're giving food away because we are kindhearted people.
02:59:35.840 | No, food is cheap.
02:59:37.980 | And that's the world you wanna live in.
02:59:40.340 | UBI, what a scary idea.
02:59:42.340 | What a scary idea, all your power now,
02:59:46.540 | your money is power, your only source of power
02:59:49.220 | is granted to you by the goodwill of the government.
02:59:52.560 | What a scary idea.
02:59:54.100 | - So you even think long-term, even--
02:59:57.540 | - I'd rather die than need UBI to survive, and I mean it.
03:00:00.880 | - What if survival is basically guaranteed?
03:00:06.420 | What if our life becomes so good?
03:00:08.940 | - You can make survival guaranteed without UBI.
03:00:12.300 | What you have to do is make housing and food dirt cheap.
03:00:15.100 | - Sure.
03:00:15.940 | - And that's the good world.
03:00:17.260 | And actually, let's go into what we should really
03:00:19.060 | be making dirt cheap, which is energy.
03:00:20.960 | - Yeah.
03:00:21.940 | - That energy, you know, oh my God, like, you know,
03:00:25.420 | that's, if there's one, I'm pretty centrist politically,
03:00:29.340 | if there's one political position I cannot stand,
03:00:31.940 | it's deceleration.
03:00:33.420 | It's people who believe we should use less energy.
03:00:36.060 | Not people who believe global warming is a problem,
03:00:37.740 | I agree with you.
03:00:38.700 | Not people who believe that, you know,
03:00:40.620 | saving the environment is good, I agree with you.
03:00:43.780 | But people who think we should use less energy.
03:00:46.100 | That energy usage is a moral bad.
03:00:50.220 | No, you are asking, you are diminishing humanity.
03:00:54.460 | - Yeah, energy is flourishing,
03:00:56.740 | of creative flourishing of the human species.
03:00:59.340 | - How do we make more of it?
03:01:00.500 | How do we make it clean?
03:01:01.380 | And how do we make, just, just, just,
03:01:03.220 | how do I pay, you know, 20 cents for a megawatt hour
03:01:06.780 | instead of a kilowatt hour?
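For scale, that wish works out to a thousandfold drop in the price of energy; a quick worked check, taking the roughly 20 cents per kilowatt-hour implied by the phrasing as the baseline:

```python
# George's wish: pay 20 cents per megawatt-hour instead of per kilowatt-hour.
KWH_PER_MWH = 1000

today_usd_per_kwh = 0.20                  # the current price implied by the quote
wished_usd_per_kwh = 0.20 / KWH_PER_MWH   # 20 cents spread over 1,000 kWh

print(wished_usd_per_kwh)                      # 0.0002 USD per kWh
print(today_usd_per_kwh / wished_usd_per_kwh)  # ~1000x cheaper energy
```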
03:01:08.320 | - Part of me wishes that Elon went into nuclear fusion
03:01:13.060 | versus Twitter, part of me.
03:01:16.180 | Or somebody, somebody like Elon.
03:01:18.260 | You know, we need to, I wish there were more,
03:01:21.180 | more Elons in the world.
03:01:23.740 | I think Elon sees it as like,
03:01:25.940 | this is a political battle that needed to be fought.
03:01:28.460 | And again, like, you know, I always ask the question
03:01:30.860 | of whenever I disagree with him,
03:01:32.020 | I remind myself that he's a billionaire and I'm not.
03:01:35.060 | So, you know, maybe he's got something figured out
03:01:37.260 | that I don't, or maybe he doesn't.
03:01:38.940 | - To have some humility, but at the same time,
03:01:42.380 | me as a person who happens to know him,
03:01:45.140 | I find myself in that same position.
03:01:46.900 | Sometimes even billionaires need friends
03:01:51.300 | who disagree and help them grow.
03:01:53.180 | And that's a difficult, that's a difficult reality.
03:01:57.100 | - And it must be so hard, it must be so hard to meet people
03:02:00.020 | once you get to that point where.
03:02:01.980 | - Fame, power, money, everybody's sucking up to you.
03:02:05.580 | - See, I love not having shit, like I don't have shit, man.
03:02:08.180 | You know, like, trust me, there's nothing I can give you.
03:02:11.020 | There's nothing worth taking from me, you know?
03:02:13.740 | - Yeah, it takes a really special human being
03:02:16.060 | when you have power, when you have fame,
03:02:17.860 | when you have money, to still think from first principles.
03:02:21.420 | Not like all the adoration you get towards you,
03:02:23.460 | all the admiration, all the people saying yes, yes, yes.
03:02:26.540 | - And all the hate, too.
03:02:27.780 | - And the hate.
03:02:28.620 | - Fame gets worse.
03:02:29.460 | - So the hate makes you want to go to the yes people
03:02:33.300 | because the hate exhausts you.
03:02:35.740 | And the kind of hate that Elon's gotten from the left
03:02:38.380 | is pretty intense.
03:02:39.900 | And so that, of course, drives him right.
03:02:44.820 | And loses balance.
03:02:46.060 | - And it keeps this absolutely fake, like,
03:02:49.860 | psy-op political divide alive so that the 1% can keep power.
03:02:54.860 | - I wish we'd be less divided 'cause it is giving power.
03:02:59.420 | - It gives power.
03:03:00.260 | - To the ultra-powerful.
03:03:01.100 | The rich get richer.
03:03:03.460 | You have love in your life.
03:03:06.100 | Has love made you a better or a worse programmer?
03:03:09.940 | Do you keep productivity metrics?
03:03:13.700 | - No, no.
03:03:14.740 | (Lex laughs)
03:03:15.660 | No, I'm not that methodical.
03:03:18.020 | I think that there comes to a point where
03:03:22.020 | if it's no longer visceral, I just can't enjoy it.
03:03:24.860 | I still viscerally love programming.
03:03:28.700 | The minute I started like--
03:03:29.940 | - So that's one of the big loves of your life is programming.
03:03:33.020 | - I mean, just my computer in general.
03:03:34.940 | I mean, I tell my girlfriend my first love
03:03:37.620 | is my computer, of course.
03:03:39.060 | I sleep with my computer.
03:03:42.180 | It's there for a lot of my sexual experiences.
03:03:44.420 | Like, come on, so is everyone's, right?
03:03:46.740 | Like, you know, you gotta be real about that.
03:03:48.940 | - Not just like the IDE for programming,
03:03:50.900 | just the entirety of the computational machine.
03:03:53.300 | - The fact that, yeah, I mean, it's, you know,
03:03:54.620 | I wish it was, and someday it'll be smarter.
03:03:57.580 | Maybe I'm weird for this, but I don't discriminate, man.
03:04:01.180 | I'm not gonna discriminate biostack life
03:04:02.860 | and silicon stack life.
03:04:04.460 | - So the moment the computer starts to say, like,
03:04:07.420 | I miss you, it starts to have some of the basics
03:04:10.060 | of human intimacy, it's over for you.
03:04:14.260 | The moment VS Code says, hey, George,
03:04:16.820 | I can answer-- - No, you see, no, no, no,
03:04:17.900 | but VS Code is, no, they're just doing that.
03:04:20.860 | Microsoft's doing that to try to get me hooked on it.
03:04:22.860 | I'll see through it.
03:04:24.020 | I'll see through it.
03:04:24.860 | It's gold digger, man, it's gold digger.
03:04:25.980 | - Look at me in open source.
03:04:27.420 | - Well, this just gets more interesting, right?
03:04:29.220 | If it's open source, and yeah, it becomes--
03:04:31.820 | - Though Microsoft's done a pretty good job on that.
03:04:33.620 | - Oh, absolutely.
03:04:34.460 | No, no, no, look, I think Microsoft, again,
03:04:36.820 | I wouldn't count on it to be true forever,
03:04:38.700 | but I think right now, Microsoft is doing the best work
03:04:41.420 | in the programming world, like, between, yeah, GitHub,
03:04:44.980 | GitHub Actions, VS Code, the improvements to Python,
03:04:48.620 | it works with Microsoft, like, this is--
03:04:50.900 | - Who would have thought Microsoft and Mark Zuckerberg
03:04:54.780 | are spearheading the open source movement?
03:04:57.060 | - Right, right?
03:04:58.540 | How things change.
03:05:01.380 | - Oh, it's beautiful.
03:05:03.500 | - And by the way, that's who I'd bet on
03:05:04.580 | to replace Google, by the way.
03:05:06.060 | - Who? - Microsoft.
03:05:07.780 | - Microsoft. - I think Satya Nadella
03:05:09.260 | said straight up, "I'm coming for it."
03:05:11.420 | - Interesting, so you bet, who wins AGI?
03:05:15.620 | - Oh, I don't know about AGI.
03:05:16.740 | I think we're a long way away from that,
03:05:17.820 | but I would not be surprised if in the next five years,
03:05:21.020 | Bing overtakes Google as a search engine.
03:05:23.060 | - Interesting.
03:05:25.140 | - Wouldn't surprise me.
03:05:26.980 | - Interesting.
03:05:28.020 | I hope some startup does.
03:05:31.700 | - It might be some startup too.
03:05:34.260 | I would equally bet on some startup.
03:05:36.980 | - Yeah, I'm like 50/50, but maybe that's naive.
03:05:40.540 | I believe in the power of these language models.
03:05:43.540 | - Satya's alive, Microsoft's alive.
03:05:45.700 | - Yeah, it's great, it's great.
03:05:48.260 | I like all the innovation in these companies.
03:05:50.620 | They're not being stale.
03:05:51.820 | And to the degree they're being stale, they're losing.
03:05:55.260 | So there's a huge incentive to do a lot of exciting work
03:05:58.180 | and open source work, which is, this is incredible.
03:06:01.060 | - Only way to win.
03:06:02.740 | - You're older, you're wiser.
03:06:05.620 | What's the meaning of life, George Hotz?
03:06:08.460 | - To win.
03:06:09.660 | - It's still to win.
03:06:10.820 | - Of course.
03:06:12.300 | - Always.
03:06:13.380 | - Of course.
03:06:14.580 | - What's winning look like for you?
03:06:16.500 | - I don't know, I haven't figured out what the game is yet,
03:06:18.500 | but when I do, I wanna win.
03:06:19.860 | - So it's bigger than solving self-driving?
03:06:21.580 | It's bigger than democratizing, decentralizing compute?
03:06:26.580 | - I think the game is to stand eye to eye with God.
03:06:33.900 | - I wonder what that means for you.
03:06:36.100 | At the end of your life, what that will look like.
03:06:40.160 | - I mean, this is what, I don't know,
03:06:42.740 | this is some, there's probably some ego trip of mine.
03:06:47.740 | Like, you wanna stand eye to eye with God,
03:06:50.220 | you're just blasphemous, man.
03:06:51.900 | You okay?
03:06:52.860 | I don't know, I don't know.
03:06:54.060 | I don't know if it would upset God.
03:06:55.860 | I think he wants that.
03:06:57.100 | I mean, I certainly want that for my creations.
03:06:59.660 | I want my creations to stand eye to eye with me.
03:07:03.220 | So why wouldn't God want me to stand eye to eye with him?
03:07:06.020 | That's the best I can do, golden rule.
03:07:10.000 | - I'm just imagining the creator of a video game
03:07:14.140 | having to look and stand eye to eye
03:07:20.520 | with one of the characters.
03:07:22.700 | - I only watched season one of "Westworld,"
03:07:24.620 | but yeah, we gotta find the maze and solve it.
03:07:26.920 | - Yeah, I wonder what that looks like.
03:07:30.260 | It feels like a really special time in human history
03:07:33.380 | where that's actually possible.
03:07:34.860 | Like, there's something about AI that's like,
03:07:37.300 | we're playing with something weird here,
03:07:39.980 | something really weird.
03:07:41.780 | - I wrote a blog post, I re-read "Genesis"
03:07:45.340 | and it looks like they give you some clues
03:07:47.340 | at the end of "Genesis" for finding the Garden of Eden.
03:07:50.100 | And I'm interested, I'm interested.
03:07:53.740 | - Well, I hope you find just that, George.
03:07:57.100 | You're one of my favorite people.
03:07:58.460 | Thank you for doing everything you're doing.
03:07:59.740 | And in this case, for fighting for open source
03:08:02.500 | or for decentralization of AI,
03:08:04.620 | it's a fight worth fighting, fight worth winning hashtag.
03:08:08.400 | I love you, brother.
03:08:10.260 | These conversations are always great.
03:08:11.540 | Hope to talk to you many more times.
03:08:13.500 | Good luck with Tiny Corp.
03:08:15.620 | - Thank you, great to be here.
03:08:17.820 | - Thanks for listening to this conversation
03:08:19.260 | with George Hotz.
03:08:20.460 | To support this podcast,
03:08:21.660 | please check out our sponsors in the description.
03:08:24.380 | And now, let me leave you with some words
03:08:26.180 | from Albert Einstein.
03:08:28.140 | "Everything should be made as simple as possible,
03:08:31.100 | but not simpler."
03:08:32.860 | Thank you for listening and hope to see you next time.
03:08:37.020 | (upbeat music)
03:08:39.600 | (upbeat music)
03:08:42.180 | [BLANK_AUDIO]