Jim Keller: Elon Musk and Tesla Autopilot | AI Podcast Clips
00:00:06.520 |
once you figure out how to build the equipment, 00:00:20.420 |
his great insight is, people are "how" constrained. 00:00:26.400 |
and then little tweaks to that will generate something, 00:00:49.320 |
Elon Musk believes that autopilot and vehicle autonomy 00:00:54.400 |
can follow this kind of exponential improvement. 00:00:57.200 |
In terms of the how question that we're talking about, 00:01:02.400 |
What are your thoughts on this particular space 00:01:12.960 |
- Well, the computer you need to build was straightforward. 00:01:16.480 |
And you could argue, well, does it need to be 00:01:18.840 |
two times faster, or five times, or 10 times? 00:01:27.960 |
You don't have to be especially smart to drive a car. 00:01:33.440 |
I mean, the big problem with safety is attention, 00:01:35.640 |
which computers are really good at, not skills. 00:01:58.320 |
and you can train a neural network to extract 00:02:00.720 |
the distance of any object in the shape of any surface 00:02:24.280 |
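[Editor's note: as a toy illustration of the geometry such a network has to learn implicitly, here is the pinhole-camera relation between an object's apparent size and its distance. This is not Tesla's method; the focal length and object height are made-up numbers.]

```python
def distance_from_height(focal_px: float, real_height_m: float,
                         pixel_height_px: float) -> float:
    """Pinhole-camera relation: an object of known real height H that
    spans h pixels, imaged with focal length f (in pixels), is at
    distance f * H / h."""
    return focal_px * real_height_m / pixel_height_px

# A ~1.5 m tall car spanning 50 px, through a 1000 px focal length,
# is about 30 m away.
print(distance_from_height(1000.0, 1.5, 50.0))
```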
So the beautiful thing about the human vision system 00:02:33.240 |
It's not just about perfectly detecting cars, 00:02:37.680 |
It's trying to, it's understanding the physics-- 00:02:48.440 |
- Well, there is a, you know, when you're driving a car 00:02:50.360 |
and somebody cuts you off, your brain has theories 00:02:53.840 |
You know, they're a bad person, they're distracted, 00:02:56.360 |
they're dumb, you know, you can listen to yourself. 00:03:00.520 |
- So, you know, if you think that narrative is important 00:03:12.040 |
and probabilistic changes of speed and direction, 00:03:24.040 |
You can place every object really thoroughly, right? 00:03:29.040 |
You can calculate trajectories of things really thoroughly. 00:03:34.600 |
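[Editor's note: the simplest version of "calculating trajectories thoroughly" is a constant-velocity extrapolation, sketched below; real systems layer probabilistic motion models on top of something like this.]

```python
def predict_trajectory(pos, vel, dt, steps):
    """Extrapolate an object's future (x, y) positions under a
    constant-velocity model, sampled every dt seconds."""
    x, y = pos
    vx, vy = vel
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

# A car at the origin doing 15 m/s along x, predicted at 0.5 s steps:
print(predict_trajectory((0.0, 0.0), (15.0, 0.0), 0.5, 3))
# [(7.5, 0.0), (15.0, 0.0), (22.5, 0.0)]
```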
- But everything you said about really thoroughly 00:03:42.800 |
computer autonomous systems will be way better 00:03:50.160 |
They'll always remember there was a pothole in the road 00:04:13.240 |
- Right, so having a robot pick up this bottle cap 00:04:24.800 |
And autonomous systems are happily maximizing the givens. 00:04:31.800 |
you remember it 'cause you're processing it the whole time. 00:04:36.560 |
you get to work, you don't know how you got there. 00:04:45.400 |
But the cars have no theories about why they got cut off 00:04:53.240 |
So I tend to believe you do have to have theories, 00:05:00.480 |
So everything you said is actually essential to driving. 00:05:05.480 |
Driving is a lot more complicated than people realize, 00:05:09.400 |
I think, so sort of to push back slightly, but-- 00:05:17.800 |
You'd be surprised how simple a calculation for that is. 00:05:21.480 |
- I may be on that particular point, but there's a... 00:05:33.880 |
you might be surprised how complicated it is. 00:05:36.320 |
- I tell people, it's like progress disappoints 00:05:39.640 |
in the short run and surprises in the long run. 00:05:43.240 |
- I suspect in 10 years, it'll be just taken for granted. 00:05:50.040 |
- It's gonna be a $50 solution that nobody cares about. 00:06:01.160 |
- Yeah, it's true, but I do think that systems 00:06:03.720 |
that involve human behavior are more complicated 00:06:08.520 |
So we can do incredible things with technology 00:06:13.520 |
- I think humans are less complicated than people, 00:06:19.080 |
- We tend to operate out of large numbers of patterns 00:06:23.520 |
- But I can't trust you because you're a human. 00:06:42.600 |
Things like that, well, they'll be so much better 00:06:45.360 |
that the overall picture of safety and autonomy 00:06:54.080 |
I mean, there are already the current safety systems 00:06:57.280 |
like cruise control that doesn't let you run into people 00:07:01.020 |
There are so many features that you just look 00:07:03.240 |
at the Pareto of accidents and knocking off like 80% 00:07:13.540 |
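[Editor's note: a minimal sketch of the Pareto analysis described here, using invented accident counts, not real crash statistics: rank scenarios by frequency and keep the head of the distribution that covers ~80% of accidents.]

```python
# Hypothetical accident counts per scenario -- illustrative only.
counts = {"rear-end": 400, "lane departure": 250, "intersection": 200,
          "pedestrian": 100, "head-on": 50}

total = sum(counts.values())
covered, top = 0, []
for scenario, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    top.append(scenario)
    covered += n
    if covered / total >= 0.8:  # stop once ~80% of accidents are covered
        break
print(top, covered / total)
```

With these numbers, three of the five scenarios already account for 85% of accidents, which is the shape of distribution the Pareto argument relies on.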
it seems to be that there's a very intense scrutiny 00:07:19.360 |
by the media and the public in terms of safety, 00:07:21.980 |
the pressure, the bar put before autonomous vehicles. 00:07:25.680 |
What are your, sort of as a person there working 00:07:52.780 |
like modern brake systems imply hydraulic brakes. 00:07:57.720 |
So if you read the regulations to meet the letter 00:08:00.920 |
of the law for brakes, it sort of has to be hydraulic, right? 00:08:08.400 |
in the use cases, like a head-on crash, an offset crash, 00:08:12.040 |
don't hit pedestrians, don't run into people, 00:08:14.760 |
don't leave the road, don't run a red light or a stop sign. 00:08:23.280 |
about which scenarios injured or killed the most people. 00:08:26.960 |
And for the most part, those conversations were like, 00:08:31.680 |
what's the right thing to do to take the next step? 00:08:36.440 |
Now Elon's very interested also in the benefits 00:08:39.640 |
of autonomous driving, like freeing people's time 00:08:44.140 |
And I think that's also an interesting thing, 00:08:52.000 |
so they're safe and safer than people seemed, 00:08:55.040 |
since the goal is to be 10x safer than people, 00:08:59.840 |
and scrutinizing accidents seems philosophically correct. 00:09:08.100 |
- What is different, compared to the things you worked on 00:09:13.100 |
at Intel, AMD, and Apple, with the Autopilot chip design, 00:09:21.960 |
what are the interesting or challenging aspects of building this specialized 00:09:24.320 |
kind of computing system in the automotive space? 00:09:30.400 |
One is the software team, the machine learning team, 00:09:34.980 |
is developing algorithms that are changing fast. 00:09:46.180 |
that the accelerator will be the wrong one, right? 00:09:52.660 |
if you build a really good general purpose computer, 00:09:54.900 |
say its performance is one, and then GPU guys 00:10:06.900 |
And then special accelerators get another two to 5x 00:10:19.860 |
are the subset of mathematical possibilities. 00:10:22.860 |
So auto, you know, AI accelerators have a claimed 00:10:40.920 |
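[Editor's note: the speedup chain being described is multiplicative. In the sketch below the GPU factor (5x) is a hypothetical placeholder, since the figure in the conversation is cut off; the "another 2 to 5x" for special accelerators is quoted above.]

```python
# Multiplicative speedup chain: baseline general-purpose computer = 1x.
baseline = 1.0
gpu = baseline * 5.0                      # hypothetical GPU factor
accel_low, accel_high = gpu * 2.0, gpu * 5.0  # "another 2 to 5x"
print(gpu, accel_low, accel_high)  # 5.0 10.0 25.0
```

The point of the "creative tension" that follows: the further right you go in this chain, the narrower the set of workloads the number applies to.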
So there's a, you know, there's a little creative tension 00:10:46.700 |
by specialization without being over specialized 00:10:49.820 |
so that the new algorithm is so much more effective 00:11:00.620 |
like automotive, there's all kinds of sensor inputs 00:11:06.760 |
So one of Elon's goals was to make it super affordable. 00:11:22.860 |
And Elon's constraint was, I'm going to put one 00:11:29.380 |
So the cost constraint he had in mind was great. 00:11:38.900 |
it's craftsman's work, like a violin maker, right? 00:11:41.900 |
You can say Stradivarius is this incredible thing. 00:11:48.140 |
picked wood and sanded it, and then he cut it, 00:12:15.580 |
I used to, I dug ditches when I was in college. 00:12:24.620 |
So there's an expression called complex mastery behavior. 00:12:29.740 |
that's fun, 'cause you're learning something. 00:12:31.740 |
When you do something and it's rote and simple, 00:12:34.360 |
But if the steps that you have to do are complicated 00:12:38.040 |
and you're good at 'em, it's satisfying to do them. 00:12:44.540 |
as you're doing them, you sometimes learn new things 00:12:51.420 |
And engineers, like engineering is complicated, 00:12:54.420 |
enough that you have to learn a lot of skills. 00:12:56.460 |
And then a lot of what you do is then craftsman's work, 00:13:08.780 |
That essentially boils down to craftsman's work. 00:13:13.580 |
- Yeah, you know, there's thoughtful decisions 00:13:15.380 |
and problems to solve and trade-offs to make. 00:13:20.180 |
You know, are you building for the current car 00:13:30.900 |
a new type of neural network which has a new mathematics 00:13:35.160 |
That's, like there's more invention than that. 00:13:41.780 |
once you pick the architecture, you look inside 00:13:44.740 |
Adders and multipliers and memories and the basics. 00:13:54.820 |
that reduction to practice is transistors and wires 00:14:07.020 |
lots of people think factory work is rote assembly stuff. 00:14:11.860 |
Like the people who work there really like it. 00:14:18.620 |
And the car is moving and the parts are moving 00:14:22.660 |
and you have to coordinate putting all the stuff together 00:14:31.620 |
and some of the guys sitting around were really bummed 00:14:45.500 |
But what they did was complicated and you couldn't do it. 00:14:51.880 |
'cause putting the bright, what's called the brights, 00:15:02.300 |
in a minute and a half is unbelievably complicated. 00:15:10.100 |
I think that's harder than driving a car, by the way. 00:15:24.700 |
- No, not for us humans; driving a car is easy. 00:15:27.220 |
I'm saying building a machine that drives a car is not easy. 00:15:35.140 |
because we've been evolving for billions of years. 00:15:43.340 |
- Oh, now you join the rest of the internet in mocking me. 00:16:03.820 |
in terms of thinking about passion, craftsmanship, 00:16:18.620 |
Or what have you learned, have taken away from your time 00:16:24.720 |
which is known to be a place of chaos, innovation, 00:16:41.600 |
He has a deep belief that no matter what you do, 00:16:51.960 |
And it was a lot better than what we were using. 00:17:01.000 |
And I said, "You know when the super intelligent aliens come, 00:17:11.300 |
But doing interesting work that's both innovative 00:17:17.140 |
and let's say craftsman's work on the current thing, 00:17:22.840 |
And then Elon was good at taking everything apart. 00:17:31.680 |
You know, that ability to look at it without assumptions 00:17:49.580 |
Like, when they first landed two SpaceX rockets to Tesla, 00:18:12.260 |
You think that's not gonna be unbelievably painful? 00:18:25.140 |
"to go take apart that many layers of assumptions." 00:18:33.060 |
- So it could be emotionally and intellectually painful, 00:18:35.600 |
that whole process of just stripping away assumptions. 00:18:51.320 |
when you get back into that one bit that's useful 00:19:14.640 |
Like you can think better, you can think more clearly, 00:19:19.720 |
And there's lots of examples of that, people who do that. 00:19:36.720 |
Well, here's the other thing: I joke, I read books. 00:19:36.720 |
Well, no, I've read a couple of books a week for 55 years. 00:19:47.280 |
Well, maybe 50, 'cause I didn't learn to read 00:20:12.880 |
And then you can read it and, for the most part, 00:20:24.960 |
And I started talking to them and basically compared 00:20:29.080 |
I'd read 19 more management books than anybody else. 00:20:41.240 |
- But at the core of that is questioning the assumptions 00:20:49.480 |
sort of looking at the reality of the situation 00:20:52.600 |
and using that knowledge, applying that knowledge. 00:20:55.960 |
- Yeah, so I would say my brain has this idea 00:21:06.000 |
And you have to kind of like circle back to that observation. 00:21:12.880 |
- Well, it's hard to just keep it front and center 00:21:15.040 |
'cause you operate on so many levels all the time 00:21:35.980 |
But you do for a while and that's kind of cool. 00:21:43.880 |
from the big picture, from the first principles, 00:21:47.160 |
do you think, you kind of answered it already, 00:21:49.200 |
but do you think autonomous driving is something 00:22:10.360 |
the fundamentals of building the hardware and the software? 00:22:28.000 |
people are doing frequency-domain analysis 00:23:06.280 |
that when you add human beings into the picture, 00:23:18.080 |
- Cars are highly damped in terms of rate of change. 00:23:37.480 |
Weirdly, we operate half a second behind reality. 00:23:57.400 |
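[Editor's note: the half-second lag is easy to make concrete as distance travelled; the speed below is an arbitrary example, not a figure from the conversation.]

```python
def lag_distance_m(speed_kmh: float, lag_s: float = 0.5) -> float:
    """Metres travelled during the perceptual lag mentioned above."""
    return speed_kmh / 3.6 * lag_s

# At 108 km/h (30 m/s), half a second of lag is about 15 m of travel.
print(lag_distance_m(108.0))
```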
there's gonna be pleasant surprises all over the place.