Emilio Frazzoli, CTO, nuTonomy - MIT Self-Driving Cars
Chapters
0:00 Introduction
1:15 Why self-driving cars
3:57 Why self-driving vehicles
6:31 Cost of self-driving vehicles
11:58 What is a self-driving car
15:48 Value of self-driving cars
22:16 Consumer vs service
29:16 HD Maps
31:01 Advantages
32:23 Mobility paradox
35:14 When will self-driving cars arrive
36:02 State of the art for autonomous technology today
37:28 Driving in traffic
39:24 Technical challenges
40:45 Traffic light example
41:33 Rules of the road
43:48 The industry standard
44:46 Caltech's mistake
45:30 Too many rules of the road
46:41 Learning the wrong thing
48:51 The reality
53:27 Formal methods
53:54 Automatic trajectories
54:20 Automatic parking
55:11 Hierarchy of rules
56:54 Total order
57:31 Other lane
58:05 The problem
58:28 The fundamental norm
one of the most successful autonomous vehicle companies 00:00:17.560 |
the first autonomous vehicles on the road in Singapore. 00:00:57.080 |
What I will talk about today is a little bit about 00:01:04.680 |
some of the guidelines on the technology development, 00:01:14.120 |
a number of stories about why I started doing this 00:01:19.120 |
and why I think this is an important technology, 00:01:24.480 |
I've been a faculty member here for 10 years. 00:01:52.000 |
was mostly, I was working on airplanes and cars 00:02:16.560 |
for a project on future urban mobility in Singapore. 00:02:23.200 |
but essentially I got interested in that project 00:02:37.480 |
but what do you think that you bring to the table? 00:02:40.400 |
And we had just done the DARPA Urban Challenge 00:02:45.400 |
So what, this is a project on future urban mobility 00:02:48.920 |
so what do cars have to do with urban mobility, 00:02:59.080 |
the five minute phone call that changed my life 00:03:11.760 |
So why, well imagine that you have a smartphone 00:03:16.640 |
and then a smartphone app and then you use this app 00:03:25.680 |
goes to pick up somebody else who goes to park 00:03:30.640 |
Uber was Travis Kalanick and a couple of guys 00:03:37.920 |
so I joined the team and we started this activity 00:03:46.280 |
thinking about, you know there was something, 00:03:49.320 |
an excuse that I made up in those five minutes, okay? 00:04:01.000 |
why do we want to have self-driving vehicles, okay? 00:04:04.440 |
So the number one reason that you typically hear 00:04:13.960 |
A very large number of people die on the road, 00:04:23.480 |
most of those people are actually fairly young, 00:04:53.280 |
most of the road accidents are due to human errors, 00:04:55.960 |
you remove the human, you remove the errors, right? 00:05:10.680 |
Essentially, if the car is driving by itself, 00:05:13.880 |
you can do other things, you can sleep, you can read, 00:05:16.880 |
you can text legally to your heart's content, 00:05:21.320 |
you can check your emails, so on and so forth, right? 00:05:35.720 |
or maybe they are too young, they are too old, 00:05:40.000 |
So then, you know, the computer can take them home. 00:05:42.560 |
Another thing is increased efficiency and throughput in a city 00:05:49.040 |
as cars can communicate beyond visual range, for example. 00:05:53.880 |
Another one is reduced environmental impact, okay? 00:06:00.400 |
why we may want to have self-driving vehicles. 00:06:02.800 |
The problem with me is that if you think about this, 00:06:09.600 |
but these are all ways that you take the status quo, 00:06:14.680 |
and you make it a little bit better, maybe a lot better, 00:06:24.380 |
Can we use this technology, leverage this technology 00:06:27.620 |
to change the way that we think of mobility, okay? 00:06:31.540 |
So how do you compare all these different things, okay? 00:06:35.880 |
So this is quick, back-of-the-envelope kind of calculation 00:06:45.040 |
but I think that the orders of magnitude are right, okay? 00:06:49.180 |
So, you know, the first thing is, okay, so fine, 00:06:53.040 |
we heard that a big reason for self-driving cars 00:07:06.680 |
to your friends, your family, is probably priceless. 00:07:09.940 |
To the government, it's worth about $9 million, okay? 00:07:17.480 |
this is what is called the cost of a statistical life. 00:07:21.360 |
There was a report that was released a few years ago, 00:07:26.960 |
The economic cost of road accidents in the United States 00:07:31.120 |
is evaluated to be about $300 billion a year. 00:07:35.080 |
The societal harm, you know, of road accidents 00:07:39.960 |
is another, you know, all the pain and suffering 00:07:42.400 |
is evaluated to be another $600 billion a year, 00:07:45.880 |
so what we are getting to is about almost $1 trillion, okay? 00:07:52.780 |
But let's look at where the other effects are, okay? 00:08:02.960 |
The health cost of congestion, of the extra pollution, 00:08:07.800 |
so you see that these are just a small change, right? 00:08:10.800 |
The next effect is actually important, right? 00:08:26.160 |
Simple calculation, what I did is I multiplied 00:08:29.200 |
one half the median wage of workers in the United States, 00:08:42.840 |
And what you get is about, you know, what was it? 00:08:52.040 |
the value to society of getting the time back 00:08:59.360 |
than the value to society of increased safety, okay? 00:09:08.280 |
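The two society-scale numbers in this comparison can be sketched in a few lines of Python. The accident figures ($300 billion economic cost plus $600 billion societal harm) are the ones quoted in the talk; the driver count, wage, and hours behind the wheel are illustrative assumptions of mine, since the talk's own time-value total is not on the slide:

```python
# Back-of-the-envelope comparison: societal cost of road accidents
# vs. the value of the time people spend driving.

# Accident side: figures quoted in the talk.
economic_cost = 300e9   # USD/year, economic cost of US road accidents
societal_harm = 600e9   # USD/year, pain and suffering
accident_total = economic_cost + societal_harm  # ~ $0.9 trillion

# Time side: half the median wage times time behind the wheel.
# All four inputs below are assumed, for illustration only.
median_hourly_wage = 20.0   # USD/hour (assumed)
time_value_factor = 0.5     # the talk values travel time at half the wage
drivers = 230e6             # assumed number of US drivers
hours_per_day = 1.0         # assumed average daily driving time

time_value = (median_hourly_wage * time_value_factor
              * drivers * hours_per_day * 365)

print(f"accidents: ~${accident_total/1e12:.1f}T/yr, "
      f"time: ~${time_value/1e12:.2f}T/yr")
```

With any plausible inputs, the two totals come out on the same order of magnitude, which is the point being made here.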
but you start seeing how these things compare. 00:09:14.360 |
is that there is still half of that is missing. 00:09:22.680 |
that you provide to society, to all individuals, okay? 00:09:36.880 |
So for me, car sharing, or vehicle sharing in general, 00:09:40.080 |
is a concept that everybody loves, but nobody uses, okay? 00:09:51.760 |
I really like using Hubway, the bicycle sharing. 00:10:02.120 |
sorry, there are no more bikes on campus, right? 00:10:07.800 |
or maybe you cannot find a parking spot for your bike. 00:10:12.060 |
So then you have to park somewhere else and then walk. 00:10:14.200 |
So that defeats the purpose of using that bike. 00:10:28.000 |
Or you have a one-way, but in one-way system, 00:10:30.320 |
then the distribution of cars tends to get skewed, right? 00:10:37.040 |
in some clever way, then you're not guaranteed 00:10:43.680 |
and you're not guaranteed that you will get a spot, 00:10:46.560 |
a parking spot, where you don't need the car anymore, okay? 00:10:49.920 |
If you think of that, these are both friction points 00:10:53.080 |
for using vehicle sharing, and these are both 00:11:17.720 |
So you see that this actually has a big chunk 00:11:22.160 |
And that is using an estimate of what we call 00:11:27.540 |
of the shared vehicles can essentially substitute 00:11:33.080 |
There are some studies that get to this sharing factor 00:11:37.280 |
up to 10, and of course the benefits are even more. 00:11:41.480 |
Now, every time I see a round number like that, 00:11:47.040 |
10 is a little bit too convenient to be true, right? 00:11:51.400 |
But anyway, so that's something that you can find 00:11:54.920 |
So this is really where I think that the major impact 00:12:03.440 |
of autonomous driving or self-driving cars will come from. 00:12:08.440 |
Now, I think that also there is a lot of confusion 00:12:16.200 |
Now, what I'm doing here, I just listed these five levels, 00:12:23.960 |
These are the Society of Automotive Engineers levels, okay? 00:12:34.040 |
Driver assistance, level one, there is, for example, 00:12:38.160 |
cruise control or some simple single-channel automation. 00:12:44.160 |
Partial automation, you have something like, for example, 00:13:03.720 |
but needs to be able to intervene given some notice, okay? 00:13:27.800 |
Now, my first reaction when I started seeing these levels, 00:13:32.360 |
and there is also a similar version by NHTSA, 00:13:47.760 |
So you have level zero, one, two, three, four, five. 00:13:52.880 |
you are led to believe that these are actually sequential, 00:13:56.460 |
right, that you do level zero, then you do level one, 00:14:06.520 |
because I think that level two and level three, 00:14:29.820 |
And, you know, this is especially painful for me 00:14:33.320 |
as a former aeronautics and astronautics professor 00:14:39.200 |
that as soon as autopilots were being introduced 00:14:43.400 |
and everybody thought that accidents would go down, 00:14:55.120 |
pilots lose the ability to react in case of an emergency. 00:14:59.680 |
Okay, so the airline industry had to essentially 00:15:03.200 |
educate itself on how to deal with automation 00:15:16.820 |
So how do you train people who probably, you know, 00:15:21.820 |
the last time they sat with an instructor in a car 00:15:28.440 |
How do you train people to use the automation technology 00:15:34.320 |
So I think that this is something that I find very scary. 00:15:37.080 |
On the other hand, I think that the full automation 00:15:39.640 |
when the car is essentially able to drive itself 00:16:02.200 |
Okay, so the first thing that people say is safety. 00:16:04.720 |
I think it is true that eventually, asymptotically, 00:16:26.680 |
So how do you demonstrate that the self-driving cars 00:16:31.680 |
demonstrate the reliability of these self-driving cars? 00:16:42.400 |
So with a relatively small number of accidents. 00:16:47.160 |
If I remember correctly, only one was their fault, right? 00:16:50.880 |
But actually, humans drive for many times that 00:17:15.200 |
to a change to your system, to your software, 00:17:26.380 |
And we may not be positive that these self-driving cars 00:17:30.020 |
are actually safer than their human counterparts 00:17:35.720 |
So safety for me remains kind of an open question 00:17:42.400 |
How do you get back the time value of driving? 00:17:46.960 |
If you had, at least I'm speaking for myself, 00:18:06.460 |
the harder it is for me to keep paying attention, right? 00:18:10.900 |
And this is where the whole problem is, right? 00:18:12.740 |
So it would be very hard for me not to fall asleep 00:18:31.960 |
really convenient and reliable and so on and so forth, 00:18:35.000 |
you need the car to come to you with nobody inside. 00:18:38.440 |
And for that, you need level four or level five, okay? 00:18:51.240 |
that you show off to your friends or to your girlfriend, 00:18:58.680 |
So my point is that level four or five automation 00:19:08.400 |
And in fact, the one game-changing feature of these cars 00:19:13.280 |
is the fact that these cars now can move around 00:19:17.000 |
You know, that's really the game-changing feature, okay? 00:19:20.560 |
Good, and this is really what we would like to do. 00:19:31.480 |
So on this figure, what I show on the horizontal axis 00:19:34.440 |
is the scale or the scope of the kind of driving 00:19:48.600 |
On the, you know, like a complex environment, right? 00:20:07.560 |
So we have millions of cars driving all over the world 00:20:11.300 |
completely, you know, in a completely automated way, okay? 00:20:32.880 |
So they're used to thinking of production of cars 00:20:40.880 |
And essentially what they do is they make a lot of cars 00:20:50.880 |
And essentially, they're following this level zero, 00:21:00.120 |
even though they claim fully autonomous package 00:21:08.760 |
in the fine print, they say it's level two, right? 00:21:20.160 |
they're coming out with this kind of feature. 00:21:21.840 |
Cadillac, I think, has a similar thing, okay? 00:21:31.840 |
This red band where you're actually requiring 00:21:59.160 |
is they're working on cars that will be fully automated 00:22:02.440 |
from the beginning, and they start with a small, 00:22:08.260 |
and then scale that up, that operations up, right? 00:22:22.200 |
and don't seem to realize the big difference, 00:22:34.000 |
You know, where autonomous vehicles will be common. 00:22:37.280 |
I ask them, "Okay, but what do you mean exactly by that?" 00:22:42.720 |
Because if you ask me, "When is it that you will be able 00:22:49.060 |
"that you just push a button and it takes you home?" 00:22:52.240 |
That's not happening for another 20 years, at least, okay? 00:22:58.200 |
"When you will be able to go to some new city 00:23:02.440 |
"that picks you up and takes you to your destination?" 00:23:04.760 |
That thing is happening within a couple of years, okay? 00:23:11.080 |
There is a big difference between autonomous vehicles, 00:23:17.040 |
versus a service that you provide to passengers. 00:23:26.260 |
Where do these cars need to be able to drive? 00:23:35.280 |
then I want this thing to work everywhere, right? 00:23:38.940 |
So, take me home, take me to this little alley, 00:23:49.600 |
I'm offering this service in this particular location. 00:23:55.800 |
and maybe under these traffic conditions, okay? 00:23:58.860 |
So, just the problem becomes much more, much easier. 00:24:06.360 |
So, if I have to sell you a car with an autonomy package, 00:24:14.240 |
What are my cost constraints on that autonomy package? 00:24:22.440 |
the cost of the autonomy package must be comparable 00:24:31.320 |
with a half a million dollar autonomy package, right? 00:24:34.820 |
Also, you can do, so another back of the envelope 00:24:40.460 |
calculation that I did is, okay, so let's say that 00:24:47.640 |
Let's say that the value to you is the fact that now, 00:24:56.280 |
What is the value of your time as you are not driving, right? 00:25:05.600 |
behind the wheel, median wage, or value of time. 00:25:10.380 |
What you get is, what I get is that the net present value 00:25:21.480 |
So then, a rational buyer will not pay more than that 00:25:33.000 |
Or actually, if you want to make a profit out of it, 00:25:37.140 |
cannot cost more than a few thousand dollars, okay? 00:25:41.340 |
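Both sides of this back-of-the-envelope argument can be sketched in a few lines. The three-drivers-per-car figure appears in the talk; the hourly value, hours recovered, discount rate, car lifetime, and driver salary are assumptions I picked for illustration:

```python
# Consumer side: net present value of the time a private owner gets
# back over the life of the car. All inputs here are assumptions.
hourly_value = 9.0     # USD/hour: half of an assumed median wage
hours_per_year = 100   # assumed annual driving hours recovered
discount_rate = 0.05   # assumed
car_lifetime = 10      # years, assumed

npv = sum(hourly_value * hours_per_year / (1 + discount_rate) ** t
          for t in range(1, car_lifetime + 1))

# Service side: the talk's figure of ~3 drivers per round-the-clock
# vehicle, at an assumed ~$33k/year each.
driver_cost_per_year = 3 * 33_000

print(f"rational buyer pays at most ~${npv:,.0f} for the autonomy package")
print(f"a service operator saves ~${driver_cost_per_year:,}/year per car")
```

The asymmetry is the takeaway: a buyer rationally caps the package at a few thousand dollars, while a fleet operator is comparing it against roughly $100,000 of labor per car per year.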
On the other hand, if you're thinking of this as a service, 00:25:47.140 |
of providing the same service using a carbon-based life form 00:25:57.340 |
you need to hire at least, say, three drivers per car, okay? 00:26:05.740 |
So now I'm comparing the cost of my automation package 00:26:09.560 |
to something that is gonna cost me $100,000 a year 00:26:16.520 |
So now the cost of that LIDAR, or that fancy computer, 00:26:34.840 |
Now, again, if I want to sell it as a product, 00:26:41.720 |
where global could mean all the United States, 00:26:49.180 |
of the whole of Europe, or the continental United States, 00:26:54.940 |
If I'm providing a service, then I only need to map 00:26:59.040 |
the area where I want to provide the service. 00:27:02.040 |
And by the way, how does the complexity of the maps 00:27:06.120 |
scale with the customer base that you're serving? 00:27:11.080 |
If you think of a uniform people density, okay? 00:27:15.480 |
So then actually, you think that the complexity 00:27:43.640 |
of a fleet serving the population of a city, okay? 00:27:51.480 |
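The uniform-density scaling argument can be made concrete: the area to be mapped grows linearly with the customer base, but so does the fleet serving it, so the mapping burden per vehicle stays constant. The density and fleet-sizing numbers below are assumptions, not figures from the talk:

```python
# Sketch of the scaling argument: with uniform population density, the
# service-area map grows linearly with the customer base, but so does
# the fleet, so the map area each vehicle must help maintain is
# constant. Density and fleet-sizing numbers are assumed.

def map_area_per_vehicle(customers,
                         people_per_km2=4_000,       # assumed urban density
                         customers_per_vehicle=40):  # assumed fleet sizing
    area_km2 = customers / people_per_km2      # area that must be mapped
    fleet = customers / customers_per_vehicle  # vehicles driving around
    return area_km2 / fleet

# The per-vehicle burden is independent of how many customers you serve:
print(map_area_per_vehicle(100_000))    # a small city
print(map_area_per_vehicle(5_000_000))  # a large metro
```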
how would you calibrate your cameras and your sensors? 00:28:07.760 |
checking the timing belt or changing the oil. 00:28:18.840 |
you take it to the dealership, that's all you do. 00:28:26.080 |
you had to calibrate the sensors every time you go out 00:28:29.280 |
or you have to upload a new version of the drivers 00:28:33.000 |
and things like that, so you don't want to do that. 00:28:37.080 |
I have the maintenance crew that can take care of it 00:28:43.280 |
So there are a couple of important takeaways, right? 00:28:45.920 |
So one thing is that the cost of the autonomy package 00:28:52.640 |
Clearly, the cheaper I can make it, the better it is, right? 00:29:05.080 |
that is crossing your path, buy the LiDAR sensor, okay? 00:29:13.880 |
Any reference to other things is intentional. 00:29:17.060 |
The other thing is HD maps that people worry about 00:29:31.600 |
within a few years, will be a dime a dozen, okay? 00:29:39.760 |
The mapping company needs to put these sensors on a car 00:29:39.760 |
Now imagine that I have a fleet of 1,000 cars 00:29:51.260 |
and these cars are just driving around the city 00:29:57.320 |
that I can just use to make and maintain my HD maps. 00:30:00.860 |
So I think that, especially from the point of view 00:30:04.320 |
of the operators, the providers of these mobility services, 00:30:11.200 |
to essentially make and maintain their own maps, okay? 00:30:20.480 |
because as soon as you start offering this service, 00:30:22.760 |
you will be able to collect all the data you need 00:30:28.400 |
Oh, by the way, this is showing, an animation showing, 00:30:40.440 |
So that's where I was based until a few days ago. 00:30:42.880 |
And as you see, essentially you have vehicles 00:30:46.480 |
that go through most of the city every few hours, okay? 00:30:55.000 |
goes through 95% of Manhattan every two hours or so, okay? 00:31:15.600 |
Of course, you remove the driver from the picture, 00:31:18.880 |
Of course, the automation costs you a little bit more, 00:31:26.120 |
you can get a really significant increase in the margin, 00:31:31.120 |
meaning that you can pass some of those savings 00:32:10.480 |
they're gonna shut me down because they're afraid 00:32:12.380 |
that I will put all of their taxi drivers on the street, 00:32:15.680 |
on the street in the sense of being unemployed. 00:32:21.880 |
What most people do not realize is that actually 00:32:27.800 |
mobility services worldwide are actually manpower limited. 00:32:32.100 |
Okay, in Singapore, they would like to buy more buses, 00:32:37.200 |
but they don't have enough people who are able 00:32:48.640 |
Now, there's another back of the envelope calculation 00:32:53.320 |
Now, imagine, so as we know, Uber's been widely successful, 00:33:01.200 |
A lot of this valuation is predicated on the fact 00:33:03.600 |
that everybody in the world will eventually use Uber, right? 00:33:15.520 |
for their mobility needs, how many people in the world 00:33:25.680 |
must drive for Uber if Uber is serving the whole world. 00:33:32.520 |
So people still need to be teachers, doctors, 00:33:39.520 |
policemen, firemen, or some people need to be kids. 00:33:50.920 |
So today, what you have is people who drive around, 00:33:54.720 |
but what is happening today is that we are all 00:34:05.000 |
of our productive day behind the wheel, very often, okay? 00:34:14.440 |
on the supply of mobility rather than on job loss. 00:34:19.360 |
I mean, of course, if you increase supply of mobility, 00:34:33.560 |
maybe balanced by like an added value in service 00:34:56.680 |
It's the single most dangerous industry that you can be in. 00:35:00.600 |
So maybe if you can take some of those people 00:35:08.080 |
remotely control a truck sitting in their office 00:35:14.400 |
Back to the question of when will autonomous vehicles 00:35:18.360 |
arrive, and in a sense, this is what our prediction, 00:35:23.760 |
So what we will see is that, what we think is that 00:35:38.940 |
that people can use to go from point to point, right? 00:35:44.440 |
Of course, eventually, people will be able to buy these cars 00:35:50.420 |
but that is something that is much later in time 00:35:53.320 |
for a number of reasons, some of which I discussed, okay? 00:35:58.500 |
So this is what we expect in terms of the timeline for this. 00:36:12.580 |
You do see a lot of demos from a number of companies 00:36:25.520 |
I don't know if any of you recognizes this video, 00:36:51.380 |
for hundreds of miles on the German highways, okay? 00:36:55.720 |
If you're not showing something that goes beyond that, 00:37:01.020 |
you have not made any progress over the past 20 years, okay? 00:37:10.900 |
but you're doing what people were doing 20 years ago. 00:37:15.740 |
So you see, clearly there is a lot of hype in these things, 00:37:24.940 |
People knew how to do that for a very long time. 00:37:32.460 |
but this is something that I find a little bit more exciting 00:37:35.660 |
is actually footage from our daily drives in Singapore, okay? 00:37:44.620 |
But essentially what we are doing in Singapore, 00:37:49.120 |
we are driving in public roads, normal traffic. 00:37:57.980 |
but we have construction zones, intersections, 00:38:04.880 |
We will get to a pretty interesting intersection. 00:38:15.600 |
Keep in mind in Singapore, they drive on the left, right? 00:38:29.340 |
All of this is without any human intervention, right? 00:38:35.500 |
if you're not showing the capability of driving in traffic 00:38:43.900 |
over what people were able to do 20 years ago, okay? 00:39:06.700 |
And this is what we are doing every day in Singapore. 00:39:09.940 |
We are doing every day here in the seaport area. 00:39:18.300 |
to drive our cars autonomously in the seaport area. 00:39:34.020 |
that Amnon Shashua, the founder and CEO of Mobileye, 00:39:51.980 |
mapping, and then what he called driving policy, 00:39:56.180 |
right, which I would call more like decision-making, okay? 00:39:59.140 |
Now, what he said is that sensing, perception, 00:40:17.140 |
What he said is that it's a huge logistical nightmare, 00:40:26.500 |
you know, for me, HD maps, it is a big pain in the neck 00:40:36.300 |
So we'll get all the mapping data that we want and we need. 00:40:48.820 |
that we encounter in any kind of urban driving situation. 00:40:54.580 |
So this is a case where we are at a traffic light, 00:40:59.780 |
the light turns green, we are making the turn, 00:41:18.040 |
so we have to handle all that kind of situation, right? 00:41:20.340 |
So how do you write your software in such a way 00:41:44.740 |
My claim, I have not proved it mathematically yet, 00:41:49.260 |
that actually the rules of the road were introduced 00:41:52.260 |
exactly to avoid the need for negotiation when you drive. 00:42:01.460 |
you know, walking down the infinite corridor, 00:42:03.080 |
and there is a person coming in the other direction, 00:42:06.660 |
So when you're trying to, I go left, I go right, right? 00:42:14.700 |
or in other places, everybody go left, period, 00:42:19.420 |
You get to an intersection, the light is red, you stop. 00:42:24.540 |
You don't say, I'm really in a rush, do you mind if I go? 00:42:31.140 |
So the rules of the road have been invented by humans 00:42:34.560 |
in order to minimize the amount of negotiation, okay? 00:42:37.460 |
And in particular, okay, so this is a slightly, 00:42:45.080 |
So now our car is a little bit more aggressive, 00:42:50.860 |
This is how the car behaved in that particular situation. 00:42:53.940 |
So you see it's raining, red light, turns green. 00:43:02.640 |
You see that there is a, you will see that there is a truck 00:43:05.100 |
that is parked on the left lane, in the middle of the lane, 00:43:09.620 |
but there's a motorcycle that is approaching, 00:43:12.140 |
so we had to be careful in going to the other lane. 00:43:19.400 |
We try to go very slowly next to squishy targets, right? 00:43:26.880 |
the truck driver decides to get moving, okay? 00:43:30.440 |
So then what we do is we wait for the truck to get going, 00:43:44.380 |
and this is a nightmare, so you don't want to do that. 00:43:46.540 |
So how do you handle these kinds of situations? 00:43:49.380 |
Okay, so the industry standard approach to this was to, 00:44:05.100 |
that was encoded by, so some finite-state machine 00:44:15.660 |
to come up with this logic, and it's essentially 00:44:29.060 |
So here, in a rental car, just plain interference 00:44:43.500 |
I'm happy to say that actually we did come up 00:44:47.220 |
And by the way, this is a video from the Caltech team 00:44:53.120 |
As you can see, they're trying to go to an intersection. 00:44:58.060 |
they decide actually to back up out of the intersection. 00:45:01.880 |
So the director of DARPA, Tony Tether at the time, 00:45:15.920 |
Caltech, team of very smart people, very capable, 00:45:22.460 |
They didn't catch this bug, they were out of the race. 00:45:29.800 |
Okay, so as a reaction to that, there is this new 00:45:52.100 |
It's impossible to code all of them correctly. 00:46:00.240 |
Just feed the data, feed the car a lot of data, 00:46:03.380 |
and let the car learn by itself how to behave. 00:46:11.940 |
there are a number of startups and other efforts 00:46:14.420 |
that are trying to use all this deep learning 00:46:17.780 |
or learning approaches to get to end-to-end driving 00:46:30.820 |
I understand this is a course on deep learning for cars. 00:46:48.300 |
So one of our developers, super bright lady from Caltech, 00:47:04.540 |
for the yellow light was, if you see a yellow light, 00:47:23.660 |
in which accelerating when you see a yellow light 00:47:31.740 |
So there are some other features of the situation 00:47:36.900 |
Also, the other thing is, it's a cartoon, right? 00:47:51.700 |
of explaining why the neural network in the car 00:47:57.580 |
so these are the neurons that were activated. 00:48:06.580 |
and I see what neurons, what areas of the brain 00:48:15.380 |
The point is that, yes, you want to trace the reason, 00:48:19.380 |
the cause for why the car behaved in a certain way, 00:48:22.380 |
but you also want to be able to revert the cause, right? 00:48:25.820 |
So you want that information to be actionable 00:48:44.180 |
let me actually skip that in the interest of time, okay? 00:49:16.700 |
of every single licensed driver in the United States, okay? 00:49:21.860 |
We don't say, just drive with your dad or mom 00:49:39.580 |
Actually, I went through an exercise of counting, okay? 00:49:43.320 |
And what I did is I kind of, like, clustered them. 00:49:45.380 |
So essentially, you have rules on who can drive, 00:49:50.400 |
when and where, what can be driven, when and where, 00:50:04.540 |
how do you interpret the signals that you see on the road, 00:50:08.860 |
right, and where you can park and where you can stop. 00:50:15.440 |
So not that many, it's kind of like 12 categories. 00:50:19.820 |
What is true is that the number of possible combinations 00:50:35.980 |
and where other cars are, that is a humongous number. 00:50:43.940 |
you don't want to build essentially any generative model 00:50:56.800 |
That is something that is just combinatorially intractable. 00:51:10.300 |
I claim that it's also hard to learn the good behavior. 00:51:13.980 |
Because now you need to have enough training data 00:51:17.340 |
for every possible combination of rules and instantiations. 00:51:34.540 |
And that's why I was showing these slides on NP-hardness. 00:51:56.700 |
that candidate is actually a solution of your problem. 00:51:58.980 |
And that's something that you do in polynomial time. 00:52:02.940 |
So in a sense, what I claim is that if you have an engine 00:52:07.320 |
that is able to generate a very large number of candidates, 00:52:15.380 |
and then what you do is checking whether or not 00:52:21.300 |
with respect to the rules, then that's all you need. 00:52:30.380 |
were exactly generating that very large number, 00:52:38.220 |
a very large graph exploring all potential trajectories, 00:52:41.740 |
reasonable trajectories that a robotic system can take. 00:52:57.820 |
that satisfies everything, rather than given a candidate, 00:53:02.340 |
check whether or not this candidate satisfies the rules. 00:53:07.780 |
Generating the candidates 00:53:10.520 |
given all the constraints is a combinatorial problem. 00:53:22.820 |
So that's something that you can do very easily. 00:53:25.700 |
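The generate-then-check idea can be illustrated with a toy example. The candidate trajectories and the two rules below are stand-ins I made up for illustration, not nuTonomy's actual planner:

```python
# Toy sketch of "generate many candidates, then check each against the
# rules". Checking any one candidate is cheap (polynomial), even though
# directly generating a rule-satisfying trajectory is combinatorial.

# A candidate trajectory is a sequence of (lane, speed_kmh) states.
candidates = [
    [("left", 30), ("left", 30)],     # stays in the wrong lane
    [("right", 80), ("right", 80)],   # speeds
    [("right", 30), ("right", 35)],   # well behaved
]

def keeps_to_right(traj):
    return all(lane == "right" for lane, _ in traj)

def obeys_speed_limit(traj, limit=50):
    return all(speed <= limit for _, speed in traj)

rules = [keeps_to_right, obeys_speed_limit]

def check(traj):
    # Verification is a simple linear scan over the trajectory.
    return all(rule(traj) for rule in rules)

valid = [t for t in candidates if check(t)]
print(valid)  # only the well-behaved candidate survives
```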
And that's essentially what we have in our cars. 00:53:33.260 |
in a formal language, so very precise, like a syntax. 00:53:42.520 |
whether your trajectory satisfies all these rules 00:53:45.360 |
written in this language, which is automatically translated 00:53:50.280 |
into something that looks like a finite state machine 00:53:53.480 |
But that's not something that you do by hand. 00:54:06.800 |
that now are not only trajectories in the physical space 00:54:12.280 |
in this logical space, telling me whether or not 00:54:15.400 |
and to what extent I am satisfying the rules. 00:54:27.280 |
So, you know, initially what we are doing is work, 00:54:32.680 |
where we're still working on research projects 00:54:53.200 |
that you have lanes and directions of travel, right? 00:54:57.800 |
is what is on the right, where now what the car is doing 00:55:01.480 |
is not only finding the trajectory to go park, 00:55:07.280 |
that are imposed on that particular parking structure. 00:55:13.960 |
and you know this is something that we as humans 00:55:15.920 |
do every day, is to deal with infeasibility, okay? 00:55:28.200 |
and well, sorry, but turns out that there is no trajectory, 00:55:31.640 |
there is no possible behavior that you can do 00:55:38.480 |
The computer returns, sorry, there's no solution, 00:53:46.200 |
So you do need a way of dealing with infeasibility, okay? 00:55:53.720 |
is having this idea of hierarchy of rules, okay? 00:56:03.560 |
generated by humans are actually organized hierarchically. 00:56:06.600 |
Typical example is the three laws of robotics by Asimov. 00:56:20.720 |
orders by a human, unless they violate the first law. 00:56:25.000 |
And the third law is a robot will try to preserve its own existence 00:56:30.840 |
unless it violates the first two laws, right? 00:56:37.040 |
So there are some rules that are more important than others. 00:56:39.720 |
So for example, do not hit people, do not hit other cars. 00:56:43.320 |
And then lower priority level is maybe driving your lane, 00:56:47.880 |
lower priority level is maybe maintaining the speed, 00:56:54.680 |
now we have this product graph of trajectories 00:56:59.840 |
On top of that, you can give them a cost, right? 00:57:05.520 |
What we use is a lexicographic ordering, okay? 00:57:13.360 |
Violating a more important rule by a small amount is worse than violating a less important rule by a large amount, okay? 00:57:16.920 |
So that gives a total order structure to the cost. 00:57:28.040 |
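A minimal sketch of this hierarchy idea, assuming made-up rules and violation magnitudes: represent each candidate behavior by a violation vector ordered from most to least important rule, and compare vectors lexicographically, which Python's tuple comparison already does:

```python
# Minimum-violation sketch: each candidate behavior gets a violation
# vector ordered from most to least important rule. Python compares
# tuples lexicographically, so min() picks the behavior that never
# violates a more important rule, however badly it violates lower ones.
# Rule names and numbers below are invented for illustration.

# Violation vectors: (hit_person, leave_lane_m, speed_over_kmh)
behaviors = {
    "stay_in_lane_hit_obstacle": (1, 0.0, 0.0),
    "swerve_into_other_lane":    (0, 2.5, 0.0),
    "brake_hard_slightly_late":  (0, 0.0, 5.0),
}

best = min(behaviors, key=behaviors.get)
print(best)  # a large speed violation beats any lane violation
```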
when you try to do any kind of motion planning, okay? 00:57:32.360 |
And well, this is a collection of a few interesting things. 00:57:39.240 |
but you see that there is the other vehicle coming, 00:57:41.560 |
so technically we could not go to the other lane, 00:57:44.200 |
but you see that as long as it is safe to do so, 00:57:51.240 |
And again, you have a lot of difficult situations 00:57:57.640 |
without any scripting or without any special instruction 00:58:18.760 |
this minimum violation planning, everything will be okay. 00:58:24.560 |
a lot of uncertainty in the whole thing, okay? 00:58:29.640 |
Now you can think of this as asking the question. 00:58:32.160 |
So when I was young and naive, that is two years ago, 00:58:34.800 |
I thought that I take all the rules of the road 00:58:37.840 |
and you convert them to this formal language, 00:58:40.040 |
you put them in your software, and you're done. 00:58:42.640 |
And then you go and look at these rules of the road 00:58:55.040 |
tell you to do different things in different cases. 00:59:08.140 |
not to pose an obstacle or danger to other road users 00:59:17.800 |
Okay, that doesn't mean that if I see somebody 00:59:21.040 |
who is violating the rule, I can just hit them, right? 00:59:23.940 |
So you can imagine that you have a fleet of vigilantes, 00:59:26.760 |
you know, autonomous cars that just go around 00:59:29.080 |
and if you run the red light, bam, gonna kill you. 00:59:38.320 |
So the other guy will be the one to blame, right? 00:59:49.320 |
that rule continues to say special care must be exerted 00:59:56.160 |
but still doesn't tell you what you're supposed to do 00:59:59.160 |
when somebody else is violating the rule, okay? 01:00:06.600 |
you hear about all these trolley problems to no end, right? 01:00:17.760 |
I think it's extremely unlikely that you will be given 01:00:21.240 |
the choice of killing either Mother Teresa or Hitler, right? 01:00:24.280 |
So, I mean, for sure that will never happen, right? 01:00:27.360 |
But anything remotely similar will never happen to you, okay? 01:00:35.400 |
of the trolley problem which are actually meaningful, okay? 01:00:53.520 |
we will kill the pedestrian, probability one, okay? 01:01:00.200 |
It's his fault that, you know, his or her fault 01:01:02.760 |
that they stepped in the road when they shouldn't have. 01:01:15.760 |
You know, they were just walking around, you know, 01:01:33.560 |
then we clearly kill the guy who was jaywalking, right? 01:01:46.120 |
So I know that the solution exists for P equal to zero. 01:01:46.120 |
I must have some value of P at which the solution changes. 01:02:07.480 |
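The threshold question can be phrased as a toy expected-cost decision. The two maneuvers and the relative costs below are purely illustrative assumptions; the lecture only argues that some crossover probability P must exist:

```python
# Toy expected-cost version of "at what probability P does the
# decision change?". Maneuvers and costs are assumed for illustration.

COST_HIT_PEDESTRIAN = 100.0  # assumed relative cost of hitting someone
COST_SWERVE = 5.0            # assumed cost/risk of swerving away

def best_action(p_pedestrian):
    """Choose the lower expected-cost action, given belief p that the
    detected pedestrian is really there."""
    expected_cost_straight = p_pedestrian * COST_HIT_PEDESTRIAN
    return "swerve" if expected_cost_straight > COST_SWERVE else "brake"

# The answer flips at P = COST_SWERVE / COST_HIT_PEDESTRIAN = 0.05:
print(best_action(0.01))  # prints "brake"
print(best_action(0.20))  # prints "swerve"
```

This mirrors the point made next: perception never reports a pedestrian with certainty, only with some probability, so somewhere there has to be a boundary.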
But you know, these are the kind of questions 01:02:17.780 |
Now what we, and you know, this is what happens 01:02:21.340 |
So when our computer vision system is telling me 01:02:27.040 |
it's not telling me that there is a pedestrian for sure, 01:02:53.540 |
because just a ghost, you know, like a false positive, 01:02:59.640 |
Well, I thought there was somebody in front of me. 01:03:11.080 |
in which you had to, you know, there would be a boundary. 01:03:16.000 |
And this is something that somebody will need to answer. 01:03:22.340 |
You know, of course I can come up with an answer 01:03:30.680 |
in which the community agrees on how the car should behave 01:03:40.600 |
is the biggest challenge in autonomous vehicles? 01:03:43.240 |
And something that I've come to realize only recently 01:03:47.860 |
in the development of autonomous vehicle technology 01:03:50.540 |
is that we do not understand in a very precise way, 01:03:54.760 |
rigorous way, how we want vehicles in general, 01:03:58.440 |
including human-driven vehicles to behave, okay? 01:04:02.560 |
A lot of these rules of the road are just like 01:04:08.540 |
but almost, you know, it's very uncertain language, 01:04:11.680 |
very, you know, no rigorous laws, right, or rules. 01:04:15.840 |
For example, a lot of the rules are predicated 01:04:26.860 |
of what right-of-way means in mathematical terms. 01:04:30.740 |
I know that it has something to do with distance, 01:04:44.980 |
and this car is farther away than this distance 01:05:05.740 |
and tells me, you know, any kind of situation, 01:05:07.940 |
what is the right behavior, what is the wrong behavior, 01:05:13.900 |
if I have two behaviors, which one is better, okay? 01:05:29.460 |
you know, like look at what people actually do 01:05:32.060 |
and at what point will people honk at you, right, 01:05:35.620 |
rather than the feeling that you're cutting them off 01:05:38.480 |
versus the feeling that you yielded to them, okay? 01:05:51.460 |
What you thought that you had seen, that doesn't matter, 01:06:03.240 |
well, then people will start removing sensors, right? 01:06:11.040 |
But I really think that compliance with the rules, 01:06:26.120 |
So from my point of view, the main message today 01:06:34.820 |
how we want human-driven vehicles to behave, okay? 01:06:41.740 |
I think that also designing automated vehicles 01:06:51.740 |
to some of our published work on these topics. 01:06:59.020 |
so this is the company, what we are trying to do. 01:07:11.540 |
We want to double our size in the next couple of years,