
Jim Keller: Elon Musk and Tesla Autopilot | AI Podcast Clips



00:00:00.000 | All the cost is in the equipment to do it.
00:00:04.400 | And the trend on equipment is,
00:00:06.520 | once you figure out how to build the equipment,
00:00:08.320 | the trend of cost is zero.
00:00:09.520 | Elon said, first you figure out
00:00:11.480 | what configuration you want the atoms in,
00:00:15.240 | and then how to put them there.
00:00:17.520 | Right?
00:00:18.360 | - Yeah.
00:00:19.180 | - 'Cause, well, here's the, you know,
00:00:20.420 | his great insight is, people are "how" constrained.
00:00:24.200 | I have this thing, I know how it works,
00:00:26.400 | and then little tweaks to that will generate something,
00:00:30.000 | as opposed to, what do I actually want,
00:00:32.840 | and then figure out how to build it.
00:00:34.760 | It's a very different mindset.
00:00:36.960 | And almost nobody has it, obviously.
00:00:39.040 | - Well, let me ask on that topic,
00:00:43.440 | you were one of the key early people
00:00:45.760 | in the development of autopilot,
00:00:47.840 | at least in the hardware side.
00:00:49.320 | Elon Musk believes that autopilot and vehicle autonomy,
00:00:53.160 | if you just look at that problem,
00:00:54.400 | can follow this kind of exponential improvement.
00:00:57.200 | In terms of the how question that we're talking about,
00:01:00.300 | there's no reason why it can't.
00:01:02.400 | What are your thoughts on this particular space
00:01:05.000 | of vehicle autonomy, and your part of it,
00:01:10.000 | and Elon Musk's and Tesla's vision for--
00:01:12.960 | - Well, the computer you need to build was straightforward.
00:01:16.480 | And you could argue, well, does it need to be
00:01:18.840 | two times faster, or five times, or 10 times?
00:01:21.300 | But that's just a matter of time.
00:01:23.960 | Or price, in the short run.
00:01:26.160 | So that's not a big deal.
00:01:27.960 | You don't have to be especially smart to drive a car.
00:01:31.000 | So it's not like a super hard problem.
00:01:33.440 | I mean, the big problem with safety is attention,
00:01:35.640 | which computers are really good at, not skills.
00:01:38.840 | - Well, let me push back on one.
00:01:42.960 | You see, everything you said is correct,
00:01:44.840 | but we as humans tend to take for granted
00:01:51.960 | how incredible our vision system is.
00:01:55.600 | - You can drive a car with 20/50 vision,
00:01:58.320 | and you can train a neural network to extract
00:02:00.720 | the distance of any object and the shape of any surface
00:02:04.160 | from a video and data.
00:02:06.240 | - Yeah, but-- - It's really simple.
00:02:07.880 | - No, it's not simple.
00:02:09.800 | - That's a simple data problem.
00:02:12.080 | - It's not simple.
00:02:13.800 | 'Cause it's not just detecting objects,
00:02:18.160 | it's understanding the scene,
00:02:20.000 | and it's being able to do it in a way
00:02:22.040 | that doesn't make errors.
00:02:24.280 | So the beautiful thing about the human vision system
00:02:27.720 | and our entire brain around the whole thing
00:02:30.320 | is we're able to fill in the gaps.
00:02:33.240 | It's not just about perfectly detecting cars,
00:02:35.920 | it's inferring the occluded cars.
00:02:37.680 | It's trying to, it's understanding the physics--
00:02:40.120 | - I think that's mostly a data problem.
00:02:42.280 | - So you think it's, what, a data problem, solvable
00:02:45.400 | with improvement of computation,
00:02:46.920 | with improvement in collection?
00:02:48.440 | - Well, there is a, you know, when you're driving a car
00:02:50.360 | and somebody cuts you off, your brain has theories
00:02:52.480 | about why they did it.
00:02:53.840 | You know, they're a bad person, they're distracted,
00:02:56.360 | they're dumb, you know, you can listen to yourself.
00:02:59.520 | - Right.
00:03:00.520 | - So, you know, if you think that narrative is important
00:03:04.720 | to be able to successfully drive a car,
00:03:06.520 | then current autopilot systems can't do it.
00:03:09.320 | But if cars are ballistic things with tracks
00:03:12.040 | and probabilistic changes of speed and direction,
00:03:15.040 | and roads are fixed and given, by the way,
00:03:17.920 | they don't change dynamically, right?
00:03:21.000 | You can map the world really thoroughly.
00:03:24.040 | You can place every object really thoroughly, right?
00:03:29.040 | You can calculate trajectories of things really thoroughly.
00:03:32.480 | Right?
00:03:34.600 | - But everything you said about really thoroughly
00:03:37.560 | has a different degree of difficulty.
00:03:41.040 | - And you could say at some point,
00:03:42.800 | computer autonomous systems will be way better
00:03:45.320 | at things that humans are lousy at.
00:03:47.720 | Like, they'll be better at attention.
00:03:50.160 | They'll always remember there was a pothole in the road
00:03:52.720 | that humans keep forgetting about.
00:03:55.040 | They'll remember that this set of roads
00:03:57.120 | has these weirdo lines on it
00:03:58.880 | that the computers figured out once.
00:04:00.440 | And especially if they get updates,
00:04:02.840 | so if somebody changes a given,
00:04:05.640 | like the key to robots and stuff,
00:04:08.320 | somebody said, is to maximize the givens.
00:04:10.560 | Right?
00:04:12.400 | - Right.
00:04:13.240 | - Right, so having a robot pick up this bottle cap
00:04:15.600 | is way easier if you put a red dot on the top.
00:04:17.760 | 'Cause otherwise you have to figure it all out
00:04:20.320 | if you wanna do a certain thing with it.
00:04:22.480 | Maximizing the givens is the thing.
00:04:24.800 | And autonomous systems are happily maximizing the givens.
00:04:27.880 | Like, humans, when you drive someplace new,
00:04:31.800 | you remember it 'cause you're processing it the whole time.
00:04:34.560 | And after the 50th time you drove to work,
00:04:36.560 | you get to work, you don't know how you got there.
00:04:38.880 | Right?
00:04:39.720 | You're on autopilot.
00:04:41.320 | Right?
00:04:42.440 | Autonomous cars are always on autopilot.
00:04:45.400 | But the cars have no theories about why they got cut off
00:04:48.040 | or why they're in traffic.
00:04:49.800 | So they also never stop paying attention.
00:04:52.400 | - Right.
00:04:53.240 | So I tend to believe you do have to have theories,
00:04:55.680 | meta models of other people,
00:04:57.640 | especially with pedestrian and cyclists,
00:04:59.080 | but also with other cars.
00:05:00.480 | So everything you said is actually essential to driving.
00:05:05.480 | Driving is a lot more complicated than people realize,
00:05:09.400 | I think, so sort of to push back slightly, but--
00:05:12.240 | - So to cut into traffic, right?
00:05:14.200 | - Yep.
00:05:15.040 | - You can't just wait for a gap.
00:05:16.120 | You have to be somewhat aggressive.
00:05:17.800 | You'd be surprised how simple a calculation for that is.
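
As a rough illustration of the "simple calculation" mentioned here, below is a hedged sketch of a basic gap-acceptance check for merging into traffic: compare the time until the approaching car arrives with the time needed to complete the merge plus a margin. The thresholds and function name are made-up assumptions for illustration, not a claim about what Autopilot actually computes.

```python
def accept_gap(gap_distance_m, closing_speed_mps,
               merge_time_s=3.0, safety_margin_s=1.5):
    """Decide whether a gap in traffic is large enough to merge into.

    gap_distance_m: distance to the car approaching the gap, in meters.
    closing_speed_mps: how fast that car is closing on us, in m/s
                       (zero or negative means it is not closing at all).
    Returns True when the time until it arrives exceeds the time we
    need to complete the merge plus a safety margin.
    """
    if closing_speed_mps <= 0:
        return True
    time_to_arrival = gap_distance_m / closing_speed_mps
    return time_to_arrival > merge_time_s + safety_margin_s

print(accept_gap(40.0, 5.0))   # 8 s gap vs. 4.5 s needed -> True
print(accept_gap(15.0, 10.0))  # 1.5 s gap -> False, keep waiting
```
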
00:05:21.480 | - I may be on that particular point, but there's a...
00:05:24.160 | - Yeah.
00:05:28.000 | - Maybe I do have to push back.
00:05:28.000 | I would be surprised.
00:05:29.280 | You know what?
00:05:30.120 | Yeah, I'll just say where I stand.
00:05:30.960 | I would be very surprised, but I think it's,
00:05:33.880 | you might be surprised how complicated it is.
00:05:36.320 | - I tell people, it's like progress disappoints
00:05:39.640 | in the short run and surprises in the long run.
00:05:41.600 | - It's very possible, yeah.
00:05:43.240 | - I suspect in 10 years, it'll be just taken for granted.
00:05:46.680 | - Yeah, probably.
00:05:47.520 | But you're probably right.
00:05:49.200 | Now it looks like--
00:05:50.040 | - It's gonna be a $50 solution that nobody cares about.
00:05:52.720 | It's like GPS is like, wow, GPS,
00:05:54.920 | we have satellites in space
00:05:57.120 | that tell you where your location is.
00:05:58.800 | It was a really big deal.
00:05:59.720 | Now everything has a GPS in it.
00:06:01.160 | - Yeah, it's true, but I do think that systems
00:06:03.720 | that involve human behavior are more complicated
00:06:07.560 | than we give them credit for.
00:06:08.520 | So we can do incredible things with technology
00:06:11.200 | that don't involve humans,
00:06:12.680 | but when you--
00:06:13.520 | - I think humans are less complicated than people,
00:06:16.120 | you know, frequently ascribed.
00:06:18.240 | - Maybe I have--
00:06:19.080 | - We tend to operate out of large numbers of patterns
00:06:21.440 | and just keep doing it over and over.
00:06:23.520 | - But I can't trust you because you're a human.
00:06:25.760 | That's something a human would say.
00:06:28.440 | But my hope is on the point you've made is,
00:06:32.280 | even if, no matter who's right,
00:06:34.960 | I'm hoping that there's a lot of things
00:06:38.320 | that humans aren't good at
00:06:39.540 | that machines are definitely good at,
00:06:41.120 | like you said, attention.
00:06:42.600 | Things like that, well, they'll be so much better
00:06:45.360 | that the overall picture of safety and autonomy
00:06:48.660 | will be obviously cars will be safer,
00:06:50.560 | even if they're not as good at--
00:06:52.360 | - I'm a big believer in safety.
00:06:54.080 | I mean, there are already the current safety systems
00:06:57.280 | like cruise control that doesn't let you run into people
00:06:59.680 | and lane keeping.
00:07:01.020 | There are so many features that you just look
00:07:03.240 | at the Pareto of accidents and knocking off like 80%
00:07:06.920 | of them is, you know, super doable.
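
A small sketch of the "Pareto of accidents" idea: rank accident categories by frequency and see how few of them safety features would have to address to cover roughly 80% of the total. The category names and counts below are invented purely for illustration.

```python
# Hypothetical accident counts per category (illustrative numbers only).
accidents = {
    "rear-end collision": 4200,
    "lane departure": 2600,
    "intersection / red light": 1900,
    "pedestrian strike": 900,
    "backing collision": 400,
}

total = sum(accidents.values())
covered = 0
for category, count in sorted(accidents.items(), key=lambda kv: -kv[1]):
    covered += count
    print(f"{category:26s} cumulative {covered / total:6.1%}")
    if covered / total >= 0.80:
        # A handful of targeted features already covers ~80% of accidents.
        break
```
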
00:07:10.120 | - Just to linger on the autopilot team
00:07:12.320 | and the efforts there,
00:07:13.540 | it seems to be that there's a very intense scrutiny
00:07:19.360 | by the media and the public in terms of safety,
00:07:21.980 | the pressure, the bar put before autonomous vehicles.
00:07:25.680 | What are your, sort of as a person there working
00:07:29.800 | on the hardware and trying to build a system
00:07:31.560 | that builds a safe vehicle and so on,
00:07:34.900 | what was your sense about that pressure?
00:07:36.640 | Is it unfair?
00:07:37.600 | Is it expected of new technology?
00:07:40.000 | - Yeah, it seems reasonable.
00:07:41.200 | I was interested, I talked to both American
00:07:43.120 | and European regulators, and I was worried
00:07:46.040 | that the regulations would write
00:07:49.440 | into the rules technology solutions,
00:07:52.780 | like modern brake systems imply hydraulic brakes.
00:07:57.720 | So if you read the regulations to meet the letter
00:08:00.920 | of the law for brakes, it sort of has to be hydraulic, right?
00:08:05.480 | And the regulator said they're interested
00:08:08.400 | in the use cases, like a head-on crash, an offset crash,
00:08:12.040 | don't hit pedestrians, don't run into people,
00:08:14.760 | don't leave the road, don't run a red light or a stoplight.
00:08:18.060 | They were very much into the scenarios.
00:08:20.820 | And, you know, and they had all the data
00:08:23.280 | about which scenarios injured or killed the most people.
00:08:26.960 | And for the most part, those conversations were like,
00:08:31.680 | what's the right thing to do to take the next step?
00:08:36.440 | Now Elon's very interested also in the benefits
00:08:39.640 | of autonomous driving are freeing people's time
00:08:41.800 | and attention as well as safety.
00:08:44.140 | And I think that's also an interesting thing,
00:08:47.980 | but, you know, building autonomous systems
00:08:52.000 | so they're safe and safer than people, well,
00:08:55.040 | since the goal is to be 10x safer than people,
00:08:57.780 | having the bar to be safer than people
00:08:59.840 | and scrutinizing accidents seems philosophically correct.
00:09:05.840 | So I think that's a good thing.
00:09:08.100 | - What is different from the things you worked on
00:09:13.100 | at Intel, AMD, Apple? With the autopilot chip design
00:09:18.600 | and hardware design, what are the interesting
00:09:21.960 | or challenging aspects of building this specialized
00:09:24.320 | kind of computing system in the automotive space?
00:09:26.980 | - I mean, there's two tricks to building
00:09:29.260 | like an automotive computer.
00:09:30.400 | One is the software team, the machine learning team,
00:09:34.980 | is developing algorithms that are changing fast.
00:09:38.300 | So as you're building the accelerator,
00:09:41.940 | you have this, you know, worry or intuition
00:09:44.580 | that the algorithms will change enough
00:09:46.180 | that the accelerator will be the wrong one, right?
00:09:50.300 | And there's the generic thing, which is,
00:09:52.660 | if you build a really good general purpose computer,
00:09:54.900 | say its performance is one, and then GPU guys
00:09:59.100 | will deliver about 5x the performance
00:10:01.960 | for the same amount of silicon,
00:10:03.380 | because instead of discovering parallelism,
00:10:05.260 | you're given parallelism.
00:10:06.900 | And then special accelerators get another two to 5x
00:10:11.380 | on top of a GPU, because you say,
00:10:13.740 | I know the math is always eight bit integers
00:10:16.700 | into 32 bit accumulators, and the operations
00:10:19.860 | are the subset of mathematical possibilities.
00:10:22.860 | So auto, you know, AI accelerators have a claimed
00:10:27.620 | performance benefit over GPUs,
00:10:29.420 | because in the narrow math space,
00:10:32.740 | you're nailing the algorithm.
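
To ground the "eight bit integers into 32 bit accumulators" point, here is a tiny sketch of the fixed-function math an accelerator can commit to: int8 multiplies feeding an int32 accumulator, which is the narrowing of the mathematical possibilities that buys the extra speedup over a GPU. This is plain NumPy for illustration, not a description of Tesla's hardware.

```python
import numpy as np

def int8_dot(a_int8, b_int8):
    """Dot product of two int8 vectors accumulated in int32.

    Each int8 x int8 product fits in 16 bits, and tens of thousands of
    them can be summed without overflowing a 32-bit accumulator, so the
    hardware only ever needs 8-bit multipliers feeding 32-bit adders.
    """
    acc = np.int32(0)
    for a, b in zip(a_int8, b_int8):
        acc = acc + np.int32(a) * np.int32(b)
    return acc

rng = np.random.default_rng(0)
a = rng.integers(-128, 128, size=1024, dtype=np.int8)
b = rng.integers(-128, 128, size=1024, dtype=np.int8)
print(int8_dot(a, b))
```
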
00:10:34.760 | Now, you still try to make it programmable,
00:10:37.700 | but the AI field is changing really fast.
00:10:40.920 | So there's a, you know, there's a little creative tension
00:10:44.220 | there of, I want the acceleration afforded
00:10:46.700 | by specialization without being over specialized
00:10:49.820 | so that the new algorithm is so much more effective
00:10:53.240 | that you'd have been better off on a GPU.
00:10:55.580 | So there's a tension there.
00:10:57.620 | To build a good computer for an application
00:11:00.620 | like automotive, there's all kinds of sensor inputs
00:11:03.860 | and safety processors and a bunch of stuff.
00:11:06.760 | So one of Elon's goals was to make it super affordable.
00:11:09.860 | So every car gets an autopilot computer.
00:11:12.460 | So some of the recent startups you look at,
00:11:14.140 | and they have a server in the trunk,
00:11:16.140 | because they're saying, I'm going to build
00:11:17.300 | this autopilot computer that replaces the driver.
00:11:20.160 | So their cost budget's 10 or $20,000.
00:11:22.860 | And Elon's constraint was, I'm going to put one
00:11:25.700 | in every car, whether people buy
00:11:27.260 | autonomous driving or not.
00:11:29.380 | So the cost constraint he had in mind was great.
00:11:32.700 | Right, and to hit that, you had to think
00:11:35.140 | about the system design, that's complicated.
00:11:36.980 | It's fun, you know, it's like, it's like,
00:11:38.900 | it's craftsman's work, like a violin maker, right?
00:11:41.900 | You can say Stradivarius is this incredible thing.
00:11:44.460 | The musicians are incredible.
00:11:46.160 | But the guy making the violin, you know,
00:11:48.140 | picked wood and sanded it, and then he cut it,
00:11:51.660 | you know, and he glued it, you know,
00:11:53.620 | and he waited for the right day.
00:11:55.580 | So that when he put the finish on it,
00:11:57.180 | it didn't, you know, do something dumb.
00:11:59.320 | That's craftsman's work, right?
00:12:01.540 | You may be a genius craftsman,
00:12:03.180 | because you have the best techniques
00:12:04.500 | and you discover a new one,
00:12:06.500 | but most engineering is craftsman's work.
00:12:09.620 | And humans really like to do that.
00:12:11.980 | You know, expression-- - Smart humans.
00:12:13.620 | - No, everybody.
00:12:14.460 | - All humans. - I don't know.
00:12:15.580 | I used to, I dug ditches when I was in college.
00:12:18.020 | I got really good at it, satisfying.
00:12:20.300 | - Yeah. - So.
00:12:21.300 | - Digging ditches is also craftsman work.
00:12:23.100 | - Yeah, of course.
00:12:24.620 | So there's an expression called complex mastery behavior.
00:12:28.580 | So when you're learning something,
00:12:29.740 | that's fun, 'cause you're learning something.
00:12:31.740 | When you do something and it's rote and simple,
00:12:33.420 | it's not that satisfying.
00:12:34.360 | But if the steps that you have to do are complicated
00:12:38.040 | and you're good at 'em, it's satisfying to do them.
00:12:41.160 | And then if you're intrigued by it all,
00:12:44.540 | as you're doing them, you sometimes learn new things
00:12:47.180 | that you can raise your game.
00:12:49.260 | But craftsman's work is good.
00:12:51.420 | And engineers, like engineering is complicated,
00:12:54.420 | enough that you have to learn a lot of skills.
00:12:56.460 | And then a lot of what you do is then craftsman's work,
00:13:00.020 | which is fun.
00:13:01.140 | - Autonomous driving, building a very
00:13:03.100 | resource-constrained computer,
00:13:05.500 | so a computer has to be cheap enough
00:13:07.220 | that it's put in every single car.
00:13:08.780 | That essentially boils down to craftsman's work.
00:13:12.740 | It's engineering, it's--
00:13:13.580 | - Yeah, you know, there's thoughtful decisions
00:13:15.380 | and problems to solve and trade-offs to make.
00:13:18.260 | Do you need 10 camera inputs or eight?
00:13:20.180 | You know, are you building for the current car
00:13:22.220 | or the next one?
00:13:23.700 | You know, how do you do the safety stuff?
00:13:25.580 | You know, there's a whole bunch of details.
00:13:28.300 | But it's fun, but it's not like I'm building
00:13:30.900 | a new type of neural network which has a new mathematics
00:13:33.700 | and a new computer to work.
00:13:35.160 | That's, like there's more invention than that.
00:13:39.180 | But the reduction to practice,
00:13:41.780 | once you pick the architecture, you look inside
00:13:43.780 | and what do you see?
00:13:44.740 | Adders and multipliers and memories and the basics.
00:13:48.820 | So computers, there's always this weird set
00:13:52.140 | of abstraction layers of ideas and thinking
00:13:54.820 | that reduction to practice is transistors and wires
00:13:58.820 | and pretty basic stuff.
00:14:01.500 | And that's an interesting phenomena.
00:14:04.780 | By the way, like factory work,
00:14:07.020 | lots of people think factory work is rote assembly stuff.
00:14:09.980 | I've been on the assembly line.
00:14:11.860 | Like the people who work there really like it.
00:14:13.980 | It's a really great job.
00:14:15.580 | It's really complicated.
00:14:16.460 | Putting cars together is hard, right?
00:14:18.620 | And the car is moving and the parts are moving
00:14:21.140 | and sometimes the parts are damaged
00:14:22.660 | and you have to coordinate putting all the stuff together
00:14:25.220 | and people are good at it.
00:14:26.740 | They're good at it.
00:14:28.060 | And I remember one day I went to work
00:14:29.460 | and the line was shut down for some reason
00:14:31.620 | and some of the guys sitting around were really bummed
00:14:34.460 | 'cause they had reorganized a bunch of stuff
00:14:36.900 | and they were gonna hit a new record
00:14:38.420 | for the number of cars built that day.
00:14:40.420 | And they were all gung-ho to do it.
00:14:41.860 | And these were big, tough buggers.
00:14:43.560 | (Lex laughs)
00:14:45.500 | But what they did was complicated and you couldn't do it.
00:14:47.900 | - Yeah, and I mean--
00:14:49.060 | - Well, after a while you could,
00:14:50.440 | but you'd have to work your way up
00:14:51.880 | 'cause putting the bright, what's called the brights,
00:14:56.380 | the trim on a car on a moving assembly line
00:15:00.340 | where it has to be attached 25 places
00:15:02.300 | in a minute and a half is unbelievably complicated.
00:15:05.860 | And human beings can do it.
00:15:09.260 | It's really good.
00:15:10.100 | I think that's harder than driving a car, by the way.
00:15:12.980 | - Putting together, working--
00:15:14.780 | - Working on a factory.
00:15:16.300 | - Two smart people can disagree.
00:15:19.100 | - Yeah.
00:15:19.940 | - It's like driving a car.
00:15:22.180 | - Well, we'll get you in the factory someday
00:15:23.860 | and then we'll see how you do.
00:15:24.700 | - No, not for us humans driving a car is easy.
00:15:27.220 | I'm saying building a machine that drives a car is not easy.
00:15:32.220 | Okay, driving a car is easy for humans
00:15:35.140 | because we've been evolving for billions of years.
00:15:38.540 | - To drive cars, yeah, I noticed that.
00:15:40.180 | - To do--
00:15:41.020 | - The Paleolithic cars are super cool.
00:15:43.340 | - Oh, now you join the rest of the internet in mocking me.
00:15:47.580 | - Okay.
00:15:48.420 | (Lex laughs)
00:15:49.240 | - It wasn't mocking, I was just intrigued
00:15:51.900 | by your anthropology.
00:15:54.540 | - Yeah, it's--
00:15:55.380 | - I'll have to go dig into that.
00:15:56.660 | - There's some inaccuracies there, yes.
00:15:58.820 | Okay, but in general, what have you learned
00:16:03.820 | in terms of thinking about passion, craftsmanship,
00:16:10.660 | tension, chaos, the whole mess of it?
00:16:18.620 | Or what have you learned, have taken away from your time
00:16:21.960 | working with Elon Musk, working at Tesla,
00:16:24.720 | which is known to be a place of chaos, innovation,
00:16:29.720 | craftsmanship, and all those things.
00:16:31.160 | - I really like the way he thought.
00:16:33.040 | Like, you think you have an understanding
00:16:35.400 | about what first principles of something is
00:16:37.720 | and then you talk to Elon about it
00:16:39.360 | and you didn't scratch the surface.
00:16:41.600 | He has a deep belief that no matter what you do,
00:16:46.080 | it's a local maximum.
00:16:48.360 | - Right, and I had a friend,
00:16:49.440 | he invented a better electric motor.
00:16:51.960 | And it was a lot better than what we were using.
00:16:54.660 | And one day he came by, he said,
00:16:55.800 | "You know, I'm a little disappointed
00:16:57.720 | "'cause this is really great
00:16:59.540 | "and you didn't seem that impressed."
00:17:01.000 | And I said, "You know when the super intelligent aliens come,
00:17:04.900 | "are they gonna be looking for you?"
00:17:06.640 | Like, where is he?
00:17:07.480 | The guy who built the motor.
00:17:08.800 | (Lex laughs)
00:17:09.640 | - Yeah.
00:17:10.460 | - Probably not.
00:17:11.300 | But doing interesting work that's both innovative
00:17:17.140 | and let's say craftsman's work on the current thing,
00:17:19.520 | it's really satisfying and it's good.
00:17:21.920 | And that's cool.
00:17:22.840 | And then Elon was good at taking everything apart.
00:17:26.760 | Like, what's the deep first principle?
00:17:29.380 | Oh, no, what's really, no, what's really?
00:17:31.680 | You know, that ability to look at it without assumptions
00:17:36.680 | and without "how" constraints is super wild.
00:17:40.840 | You know, he built a rocket ship and--
00:17:44.280 | - Using the same process. - Electric car,
00:17:45.880 | you know, everything.
00:17:47.740 | And that's super fun and he's into it too.
00:17:49.580 | Like, when they first landed two SpaceX rockets, at Tesla
00:17:53.840 | we had a video projector in the big room
00:17:55.700 | and like 500 people came down
00:17:57.540 | and when they landed, everybody cheered
00:17:59.020 | and some people cried.
00:18:00.360 | It was so cool.
00:18:01.420 | All right, but how did you do that?
00:18:03.940 | Well, it was super hard.
00:18:07.080 | And then people say, "Well, it's chaotic."
00:18:09.880 | Really?
00:18:10.720 | To get out of all your assumptions?
00:18:12.260 | You think that's not gonna be unbelievably painful?
00:18:15.860 | And is Elon tough?
00:18:17.760 | Yeah, probably.
00:18:19.100 | Do people look back on it and say,
00:18:20.900 | "Boy, I'm really happy I had that experience
00:18:25.140 | "to go take apart that many layers of assumptions."
00:18:29.520 | Sometimes super fun, sometimes painful.
00:18:33.060 | - So it could be emotionally and intellectually painful,
00:18:35.600 | that whole process of just stripping away assumptions.
00:18:38.560 | - Yeah, imagine 99% of your thought process
00:18:41.020 | is protecting your self-conception.
00:18:44.240 | And 98% of that's wrong.
00:18:46.360 | Now you got the math right.
00:18:49.200 | How do you think you're feeling
00:18:51.320 | when you get back into that one bit that's useful
00:18:54.500 | and now you're open and you have the ability
00:18:56.240 | to do something different?
00:18:58.340 | I don't know if I got the math right.
00:19:01.360 | It might be 99.9, but it ain't 50.
00:19:05.120 | - Imagining it, the 50% is hard enough.
00:19:10.000 | - Now, for a long time, I've suspected
00:19:13.520 | you could get better.
00:19:14.640 | Like you can think better, you can think more clearly,
00:19:18.400 | you can take things apart.
00:19:19.720 | And there's lots of examples of that, people who do that.
00:19:24.100 | - And Elon is an example of that.
00:19:28.680 | - Apparently. - You are an example.
00:19:29.840 | - I don't know if I am.
00:19:31.200 | I'm fun to talk to.
00:19:33.220 | - Certainly.
00:19:35.080 | - I've learned a lot of stuff.
00:19:36.720 | Well, here's the other thing is, I joke, I read books.
00:19:40.720 | And people think, "Oh, you read books."
00:19:42.280 | Well, no, I've read a couple of books a week for 55 years.
00:19:47.280 | Well, maybe 50, 'cause I didn't learn to read
00:19:50.320 | until I was eight or something.
00:19:51.880 | And it turns out when people write books,
00:19:56.200 | they often take 20 years of their life
00:19:58.960 | where they passionately did something,
00:20:01.000 | reduce it to 200 pages.
00:20:03.760 | That's kind of fun.
00:20:05.160 | And then you go online and you can find out
00:20:07.520 | who wrote the best books and who, you know,
00:20:10.120 | that's kind of wild.
00:20:11.040 | So there's this wild selection process.
00:20:12.880 | And then you can read it and, for the most part,
00:20:15.360 | understand it.
00:20:16.320 | And then you can go apply it.
00:20:19.640 | Like I went to one company, I thought,
00:20:21.080 | and I haven't managed much before,
00:20:22.800 | so I read 20 management books.
00:20:24.960 | And I started talking to them and basically compared
00:20:27.520 | to all the VPs running around,
00:20:29.080 | I'd read 19 more management books than anybody else.
00:20:33.080 | (laughing)
00:20:34.760 | Wasn't even that hard.
00:20:36.280 | And half the stuff worked, like first time.
00:20:38.840 | It wasn't even rocket science.
00:20:41.240 | - But at the core of that is questioning the assumptions
00:20:44.680 | or sort of entering the thinking,
00:20:47.720 | first principles thinking,
00:20:49.480 | sort of looking at the reality of the situation
00:20:52.600 | and using that knowledge, applying that knowledge.
00:20:55.960 | - Yeah, so I would say my brain has this idea
00:20:59.080 | that you can question first assumptions.
00:21:02.000 | But I can go days at a time and forget that.
00:21:06.000 | And you have to kind of like circle back to that observation.
00:21:10.240 | - Because it is emotionally challenging.
00:21:12.880 | - Well, it's hard to just keep it front and center
00:21:15.040 | 'cause you operate on so many levels all the time
00:21:18.120 | and getting this done takes priority
00:21:21.200 | or being happy takes priority
00:21:24.240 | or screwing around takes priority.
00:21:27.120 | Like how you go through life is complicated.
00:21:30.760 | And then you remember, oh yeah,
00:21:32.120 | I could really think first principles.
00:21:34.200 | Oh shit, that's tiring.
00:21:35.980 | But you do for a while and that's kind of cool.
00:21:40.440 | - So just as a last question in your sense
00:21:43.880 | from the big picture, from the first principles,
00:21:47.160 | do you think, you kind of answered it already,
00:21:49.200 | but do you think autonomous driving is something
00:21:52.680 | we can solve on a timeline of years?
00:21:56.400 | So one, two, three, five, 10 years
00:21:59.880 | as opposed to a century?
00:22:01.560 | - Yeah, definitely.
00:22:03.080 | - Just to linger on it a little longer,
00:22:05.120 | where's the confidence coming from?
00:22:07.800 | Is it the fundamentals of the problem,
00:22:10.360 | the fundamentals of building the hardware and the software?
00:22:14.120 | - As a computational problem,
00:22:16.480 | understanding ballistics, roads, topography,
00:22:21.040 | it seems pretty solvable.
00:22:24.240 | I mean, and you can see this,
00:22:25.680 | like speech recognition for a long time,
00:22:28.000 | people were doing frequency-domain analysis
00:22:30.440 | and all kinds of stuff
00:22:32.120 | and that didn't work at all, right?
00:22:35.000 | And then they did deep learning about it
00:22:37.080 | and it worked great.
00:22:38.080 | And it took multiple iterations.
00:22:42.040 | And autonomous driving is way past
00:22:45.880 | the frequency analysis point.
00:22:47.560 | Use radar, don't run into things.
00:22:51.600 | And the data gathering is going up
00:22:53.160 | and the computation is going up
00:22:54.560 | and the algorithm understanding is going up
00:22:56.320 | and there's a whole bunch of problems
00:22:57.720 | getting solved like that.
00:22:59.680 | - The data side is really powerful,
00:23:01.240 | but I disagree with both you and Elon.
00:23:03.440 | I'll tell Elon once again as I did before
00:23:06.280 | that when you add human beings into the picture,
00:23:10.080 | it's no longer a ballistics problem.
00:23:13.400 | It's something more complicated,
00:23:15.200 | but I could be very well proven wrong.
00:23:18.080 | - Cars are highly damped in terms of rate of change.
00:23:20.760 | Like the steering system's really slow
00:23:24.360 | compared to a computer.
00:23:25.400 | The acceleration is really slow.
00:23:28.760 | - Yeah, on a certain time scale,
00:23:30.600 | on a ballistics time scale,
00:23:31.880 | but human behavior, I don't know.
00:23:33.520 | I shouldn't say--
00:23:35.920 | - Brains are really slow too.
00:23:37.480 | Weirdly, we operate half a second behind reality.
00:23:41.640 | Nobody really understands that one either.
00:23:43.000 | It's pretty funny.
00:23:44.160 | - Yeah, yeah.
00:23:45.880 | We very well could be surprised.
00:23:51.320 | And I think with the rate of improvement
00:23:52.880 | in all aspects on both the compute
00:23:54.600 | and the software and the hardware,
00:23:57.400 | there's gonna be pleasant surprises all over the place.
00:24:00.240 | - Mm-hmm.
00:24:01.640 | (upbeat music)