Emilio Frazzoli, CTO, nuTonomy - MIT Self-Driving Cars


Chapters

0:00 Introduction
1:15 Why self-driving cars
3:57 Why self-driving vehicles
6:31 Cost of self-driving vehicles
11:58 What is a self-driving car
15:48 Value of self-driving cars
22:16 Consumer vs service
29:16 HD Maps
31:01 Advantages
32:23 Mobility paradox
35:14 When will self-driving cars arrive
36:02 State of the art for autonomous technology today
37:28 Driving in traffic
39:24 Technical challenges
40:45 Traffic light example
41:33 Rules of the road
43:48 The industry standard
44:46 Caltech's mistake
45:30 Too many rules of the road
46:41 Learning the wrong thing
48:51 The reality
53:27 Formal methods
53:54 Automatic trajectories
54:20 Automatic parking
55:11 Hierarchy of rules
56:54 Total order
57:31 Other lane
58:05 The problem
58:28 The fundamental norm

Whisper Transcript

00:00:00.000 | Today we have Emilio Frazzoli.
00:00:02.240 | He's the CTO of nuTonomy,
00:00:05.240 | one of the most successful autonomous vehicle companies
00:00:07.420 | in the world.
00:00:08.560 | He's the inventor of the RRT star algorithm,
00:00:11.880 | formerly a professor at MIT,
00:00:15.000 | directing a research group that put
00:00:17.560 | the first autonomous vehicles on the road in Singapore.
00:00:21.280 | And now he returns to MIT to talk with us.
00:00:24.760 | Give him a warm welcome.
00:00:26.600 | (audience applauding)
00:00:30.760 | - Thank you, Lex.
00:00:32.100 | It's a great opportunity,
00:00:34.120 | it's a great pleasure to be back here.
00:00:36.560 | I spent 15 years of my life here at MIT,
00:00:39.200 | first as a graduate student,
00:00:40.840 | and then as a faculty member.
00:00:44.440 | And this is where nuTonomy, the company,
00:00:47.880 | essentially was born.
00:00:49.080 | And we did a lot of the research that led us
00:00:51.520 | to start this company and eventually
00:00:54.640 | develop all this technology.
00:00:57.080 | What I will talk about today is a little bit about
00:00:59.680 | our vision on autonomous vehicles,
00:01:02.680 | why we want to have autonomous vehicles,
00:01:04.680 | some of the guidelines on the technology development,
00:01:08.160 | why we are doing things in a certain way.
00:01:10.760 | Let's get started.
00:01:11.600 | But I really would like to tell you
00:01:14.120 | a number of stories about why I started doing this
00:01:19.120 | and why I think this is an important technology,
00:01:21.320 | why we ended up starting this company.
00:01:24.480 | I've been a faculty member here for 10 years.
00:01:27.440 | I was happily working with my UAVs
00:01:29.840 | and I was in AeroAstro.
00:01:31.760 | At some point around 2005,
00:01:36.720 | there were these DARPA Grand Challenges
00:01:39.760 | that sounded cool,
00:01:40.960 | so I started working on cars as well.
00:01:43.880 | But the work that I was doing was mostly
00:01:49.680 | about the technology:
00:01:52.000 | I was working on airplanes and cars
00:01:56.960 | to make them fly or drive by themselves
00:01:59.160 | because it was cool, just look,
00:02:00.960 | no hands, it drives.
00:02:02.760 | As a controls guy, as a roboticist,
00:02:04.600 | that's all I needed.
00:02:05.920 | But then in 2009, there was this new project
00:02:11.240 | that was starting, it was a team that was
00:02:14.480 | getting together to write a proposal
00:02:16.560 | for a project on future urban mobility in Singapore.
00:02:20.840 | Okay, I'm not telling you the whole story
00:02:23.200 | but essentially I got interested in that project
00:02:26.520 | just because I wanted to go to Singapore.
00:02:28.680 | And then I called the person who was
00:02:32.960 | putting together the team, and she said,
00:02:36.000 | okay, thank you for your interest
00:02:37.480 | but what do you think that you bring to the table?
00:02:40.400 | And we had just done the DARPA Urban Challenge
00:02:42.920 | so I know how to make autonomous cars.
00:02:45.400 | So what, this is a project on future urban mobility
00:02:48.920 | so what do cars have to do with urban mobility,
00:02:53.920 | autonomous cars, what do they have to do
00:02:55.960 | with urban mobility?
00:02:57.760 | And there was the phone call,
00:02:59.080 | the five minute phone call that changed my life
00:03:01.080 | because she asked me this question,
00:03:03.600 | that actually was Cynthia Barnhart
00:03:05.160 | who is now the chancellor.
00:03:06.640 | And then I had to come up with an excuse.
00:03:11.760 | So why, well imagine that you have a smartphone
00:03:16.640 | and then a smartphone app and then you use this app
00:03:19.160 | to call a car, the car comes to you,
00:03:21.200 | you get in the car, it drives you wherever you
00:03:23.040 | want to go, you step out of the car, and the car
00:03:25.680 | goes to pick up somebody else or goes to park
00:03:27.640 | or something, right?
00:03:28.560 | So this was 2009.
00:03:30.640 | Uber was Travis Kalanick and a couple of guys
00:03:33.160 | and black cars in San Francisco, right?
00:03:34.960 | So, and essentially she bought it
00:03:37.920 | so I joined the team and we started this activity
00:03:42.920 | but the important thing is that I started
00:03:46.280 | thinking about, you know there was something,
00:03:49.320 | an excuse that I made up in those five minutes, okay?
00:03:52.240 | But then I said, you know what,
00:03:53.960 | kind of sounds like a good idea.
00:03:57.040 | And I started thinking more about this
00:03:58.920 | and I started thinking more about
00:04:01.000 | why do we want to have self-driving vehicles, okay?
00:04:04.440 | So the number one reason that you typically hear
00:04:07.960 | is we want to have self-driving vehicles
00:04:10.440 | so that we make roads safer, okay?
00:04:13.960 | A very large number of people die on the road,
00:04:17.880 | road accidents every year.
00:04:19.520 | What people do not realize is that
00:04:23.480 | most of those people are actually fairly young,
00:04:28.080 | like in their 20s and 30s, okay?
00:04:32.320 | And clearly what people usually say is that,
00:04:35.360 | Sebastian Thrun, back in the day,
00:04:37.400 | he gave all these TED Talks talking about
00:04:39.840 | his best friend from when he was young
00:04:42.320 | who died in a road accident, right?
00:04:44.640 | And then he made a mission for his life
00:04:47.880 | to reduce road accidents, right?
00:04:49.800 | But anyway, so the idea is that
00:04:53.280 | most of the road accidents are due to human errors,
00:04:55.960 | you remove the human, you remove the errors, right?
00:04:58.800 | And then you save lives, okay.
00:05:00.620 | So this is typically the number one reason
00:05:03.240 | that people mention when they talk about
00:05:05.800 | why you want to have self-driving vehicles.
00:05:07.960 | Second reason is convenience.
00:05:10.680 | Essentially, if the car is driving by itself,
00:05:13.880 | you can do other things, you can sleep, you can read,
00:05:16.880 | you can text legally to your heart's content,
00:05:21.320 | you can check your emails, so on and so forth, right?
00:05:24.120 | This is also great.
00:05:25.260 | Third thing is improved access to mobility.
00:05:30.280 | People who cannot drive, maybe because
00:05:33.000 | they have some physical impairment,
00:05:35.720 | or maybe they are too young, they are too old,
00:05:37.980 | or maybe too intoxicated to drive, right?
00:05:40.000 | So then, you know, the computer can take them home.
00:05:42.560 | Another thing is increased efficiency and throughput in a city
00:05:49.040 | as cars can communicate beyond visual range, for example.
00:05:53.880 | Another one is reduced environmental impact, okay?
00:05:57.020 | Now, these are all fantastic reasons
00:06:00.400 | why we may want to have self-driving vehicles.
00:06:02.800 | The problem with me is that if you think about this,
00:06:07.440 | these are all good reasons,
00:06:09.600 | but these are all ways that you take the status quo,
00:06:12.920 | you know, how cars are used today,
00:06:14.680 | and you make it a little bit better, maybe a lot better,
00:06:17.640 | but you do not make it different, okay?
00:06:19.980 | And really, that is what I am mostly,
00:06:22.400 | what I was mostly interested in.
00:06:24.380 | Can we use this technology, leverage this technology
00:06:27.620 | to change the way that we think of mobility, okay?
00:06:31.540 | So how do you compare all these different things, okay?
00:06:35.880 | So this is a quick, back-of-the-envelope kind of calculation
00:06:39.360 | that you can do on your own.
00:06:41.420 | You can question the numbers,
00:06:45.040 | but I think that the orders of magnitude are right, okay?
00:06:49.180 | So, you know, the first thing is, okay, so fine,
00:06:53.040 | we heard that a big reason for self-driving cars
00:06:56.280 | is to increase safety, you know, save lives.
00:06:59.240 | Great, now, how much is your life worth?
00:07:03.100 | Well, to yourself, to your loved ones,
00:07:06.680 | to your friends, your family, is probably priceless.
00:07:09.940 | To the government, it's worth about $9 million, okay?
00:07:14.480 | So this is what is called the,
00:07:16.160 | (laughs)
00:07:17.480 | this is what is called the cost of a statistical life.
00:07:21.360 | There was a report that was released a few years ago,
00:07:23.960 | probably, you know, there is an update now,
00:07:25.660 | but I haven't seen it.
00:07:26.960 | The economic cost of road accidents in the United States
00:07:31.120 | is evaluated to be about $300 billion a year.
00:07:35.080 | The societal harm, you know, of road accidents
00:07:39.960 | is another, you know, all the pain and suffering
00:07:42.400 | is evaluated to be another $600 billion a year,
00:07:45.880 | so what we are getting to is about almost $1 trillion, okay?
00:07:50.760 | It's a big number, okay?
00:07:52.780 | But let's look at where the other effects are, okay?
00:07:57.780 | What is the cost of congestion?
00:07:59.920 | It's an estimate, $100 billion a year.
00:08:02.960 | The health cost of congestion, of the extra pollution,
00:08:06.160 | it's another $50 billion a year,
00:08:07.800 | so you see that these are just a small change, right?
00:08:10.800 | The next effect is actually important, right?
00:08:15.840 | So what is the value of the time that we,
00:08:20.160 | as everybody in society, will get back
00:08:23.140 | from not having to drive, okay?
00:08:26.160 | Simple calculation, what I did is I multiplied
00:08:29.200 | one half the median wage of workers in the United States,
00:08:33.920 | which is an embarrassingly low number,
00:08:36.180 | by the number of hours
00:08:40.000 | that Americans spend behind the wheel, okay?
00:08:42.840 | And what you get is about, you know, what was it?
00:08:45.960 | $1.2 trillion a year.
00:08:48.240 | So something that you may notice is that
00:08:52.040 | the value to society of getting the time back
00:08:56.480 | from having to drive is actually more
00:08:59.360 | than the value to society of increased safety, okay?
00:09:04.080 | Of course, it's a little bit cynical, okay?
00:09:05.880 | So take it with a grain of salt,
00:09:08.280 | but you start seeing how these things compare.
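
(As a rough illustration of the back-of-the-envelope estimate described above, here is a small Python sketch. The inputs are illustrative assumptions, not the speaker's exact figures; with the talk's own inputs, the time value comes out to roughly $1.2 trillion a year.)

```python
# Back-of-the-envelope comparison in the spirit of the talk.
# All inputs below are illustrative assumptions, not the speaker's exact figures.

median_hourly_wage = 22.0                              # USD/hour, assumed
value_of_time_per_hour = 0.5 * median_hourly_wage      # half the median wage, as in the talk

total_driving_hours_per_year = 100e9                   # hours Americans spend behind the wheel, assumed

time_value = value_of_time_per_hour * total_driving_hours_per_year
accident_cost = 300e9 + 600e9                          # economic cost + societal harm of road accidents (from the talk)

print(f"Value of time spent driving: ~${time_value / 1e12:.1f} trillion/year")
print(f"Cost of road accidents:      ~${accident_cost / 1e12:.1f} trillion/year")
```
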
00:09:11.940 | And what you may notice from this pie chart
00:09:14.360 | is that there is still half of that is missing.
00:09:17.360 | What is the other half?
00:09:19.520 | The other half is actually the value
00:09:22.680 | that you provide to society, to all individuals, okay?
00:09:26.660 | By essentially making car sharing
00:09:31.400 | finally something that is convenient to use,
00:09:33.760 | affordable, reliable, okay?
00:09:36.880 | So for me, car sharing, or vehicle sharing in general,
00:09:40.080 | is a concept that everybody loves, but nobody uses, okay?
00:09:43.960 | Or not as many people as we would like
00:09:46.000 | to use these kind of services.
00:09:49.040 | Examples, when I was here at MIT,
00:09:51.760 | I really like using Hubway, the bicycle sharing.
00:09:55.980 | But you have to be very careful.
00:10:00.400 | If you wait too long in the afternoon,
00:10:02.120 | sorry, there are no more bikes on campus, right?
00:10:05.560 | Or maybe very often you cannot find a bike,
00:10:07.800 | or maybe you cannot find a parking spot for your bike.
00:10:12.060 | So then you have to park it somewhere else and then walk.
00:10:14.200 | So that defeats the purpose of using that bike.
00:10:17.880 | Same thing with cars, right?
00:10:19.760 | So typically with car sharing systems,
00:10:22.600 | what you have is either you have a two-way,
00:10:25.320 | which is essentially hourly rental, right?
00:10:28.000 | Or you have a one-way, but in one-way system,
00:10:30.320 | then the distribution of cars tend to get skewed, right?
00:10:33.540 | And unless the company repositions cars
00:10:37.040 | in some clever way, then you're not guaranteed
00:10:41.800 | that you will get a car where you need it,
00:10:43.680 | and you're not guaranteed that you will get a spot,
00:10:46.560 | a parking spot, where you don't need the car anymore, okay?
00:10:49.920 | If you think of that, these are both friction points
00:10:53.080 | for using vehicle sharing, and these are both
00:10:55.600 | friction points that are actually addressed
00:10:58.480 | by if the car can drive itself, okay?
00:11:02.000 | So if you bring in all the economic benefits
00:11:06.440 | of a car sharing system that actually works,
00:11:12.240 | that's something that we estimate it to be,
00:11:14.360 | it's about $2 trillion a year.
00:11:17.720 | So you see that this actually has a big chunk
00:11:20.400 | in this pie chart, okay?
00:11:22.160 | And that is using an estimate of what we call
00:11:25.040 | the sharing factor of four, meaning that one
00:11:27.540 | of the shared vehicles can essentially substitute
00:11:29.680 | for four privately-owned vehicles, okay?
00:11:33.080 | There are some studies that get to this sharing factor
00:11:37.280 | up to 10, and of course the benefits are even more.
00:11:41.480 | Now, every time I see a round number like that,
00:11:44.720 | I get suspicious, right?
00:11:47.040 | 10 is a little bit too convenient to be true, right?
00:11:51.400 | But anyway, so that's something that you can find
00:11:53.600 | in the literature, okay?
00:11:54.920 | So this is really where I think that the major impact
00:12:03.440 | of autonomous driving or self-driving cars will come from.
00:12:08.440 | Now, I think that also there is a lot of confusion
00:12:11.360 | in the community, in the world, about what
00:12:13.320 | a self-driving car means.
00:12:16.200 | Now, what I'm doing here, I just listed these five levels,
00:12:20.840 | it's actually six levels of automation.
00:12:23.960 | These are the Society of Automotive Engineers levels, okay?
00:12:28.800 | So level zero is no automation,
00:12:31.160 | that's your great-grandfather's car, right?
00:12:34.040 | Driver assistance, level one, there is, for example,
00:12:38.160 | cruise control or some simple single-channel automation.
00:12:44.160 | Partial automation, you have something like, for example,
00:12:48.320 | lane keeping and cruise control,
00:12:50.660 | but you still require the driver
00:12:54.560 | to pay attention and intervene.
00:12:57.120 | Conditional automation, level three,
00:12:59.240 | a driver is a necessity, but is not required
00:13:02.480 | to pay attention all the time,
00:13:03.720 | but needs to be able to intervene given some notice, okay?
00:13:08.720 | And that some notice, I think,
00:13:11.200 | is a kind of like ill-defined concept.
00:13:15.240 | And then you have level four and level five
00:13:17.640 | that are like a higher automation,
00:13:18.920 | essentially no driver needed,
00:13:21.440 | in some condition that is level four,
00:13:24.020 | and in all conditions that level five, okay?
00:13:27.800 | Now, my first reaction when I started seeing these levels,
00:13:32.360 | and there is also a similar version by NHTSA,
00:13:34.960 | is that they seem to me a horrible idea.
00:13:40.360 | And the horrible idea in the sense
00:13:43.080 | because they are given numeric levels.
00:13:47.760 | So you have level zero, one, two, three, four, five.
00:13:50.680 | Whenever you have a sequence of numbers,
00:13:52.880 | you are led to believe that these are actually sequential,
00:13:56.460 | right, that you do level zero, then you do level one,
00:13:59.880 | then you do level two, three, four, five.
00:14:02.460 | I think this is enormously bad idea
00:14:06.520 | because I think that level two and level three,
00:14:08.760 | that is anything where you require the human
00:14:12.040 | to pay attention and supervise automation
00:14:14.660 | and be ready to intervene with no notice
00:14:17.320 | or with some ambiguously defined,
00:14:21.440 | you know, like sufficiently long notice,
00:14:24.360 | that they just go, you know,
00:14:27.720 | against human nature.
00:14:29.820 | And, you know, this is especially painful for me
00:14:33.320 | as a former aeronautics and astronautics professor
00:14:36.480 | where we saw in the airline industry
00:14:39.200 | that as soon as autopilots were being introduced
00:14:43.400 | and everybody thought that accidents would go down,
00:14:45.560 | actually there were more accidents
00:14:46.860 | because now you have new failure modes
00:14:49.120 | induced by autopilots, okay?
00:14:51.600 | You have mode confusion,
00:14:53.360 | pilots lose situational awareness,
00:14:55.120 | pilots lose the ability to react in case of an emergency.
00:14:59.680 | Okay, so the airline industry had to essentially
00:15:03.200 | educate itself on how to deal with automation
00:15:06.020 | in a good way.
00:15:07.600 | And think of pilots, you know,
00:15:09.040 | pilots are highly trained professionals,
00:15:13.340 | which is not the same that you can say
00:15:14.880 | about your everyday driver, right?
00:15:16.820 | So how do you train people who probably, you know,
00:15:21.820 | the last time they sat with an instructor in a car
00:15:24.700 | was, you know, when they were 16, right?
00:15:28.440 | How do you train people to use the automation technology
00:15:33.080 | and do it safely, right?
00:15:34.320 | So I think that this is something that I find very scary.
00:15:37.080 | On the other hand, I think that the full automation
00:15:39.640 | when the car is essentially able to drive itself
00:15:43.240 | and does not rely on a human to take over
00:15:45.520 | is something that in a sense is easier.
00:15:47.580 | And this is what we are doing,
00:15:50.760 | but the point is that not only it is easier,
00:15:53.300 | but I think that it is essential to capture
00:15:55.740 | the value of the technology.
00:15:57.580 | Now if you think of it,
00:15:59.240 | so how do we realize the value
00:16:00.960 | of these self-driving vehicles?
00:16:02.200 | Okay, so the first thing that people say is safety.
00:16:04.720 | I think it is true that eventually, asymptotically,
00:16:11.120 | self-driving cars will be safer
00:16:12.980 | than their human-driven counterparts.
00:16:15.420 | However, at what point can we be confident
00:16:21.080 | that that is the case?
00:16:22.720 | Are we there yet?
00:16:24.520 | Not sure, okay?
00:16:26.680 | So how do you demonstrate
00:16:31.680 | the reliability of these self-driving cars?
00:16:35.760 | So Waymo, they've driven their cars
00:16:40.340 | for three million miles, right?
00:16:42.400 | So with a relatively small number of accidents.
00:16:47.160 | If I remember correctly, only one was their fault, right?
00:16:50.880 | But actually, humans drive for many times that
00:17:00.520 | without accidents.
00:17:01.520 | So how do you really make sure that,
00:17:05.160 | even though the number sounds impressive,
00:17:07.200 | it really doesn't have that much
00:17:11.480 | of a statistical significance, right?
00:17:13.400 | And then every time you make an update,
00:17:15.200 | to a change to your system, to your software,
00:17:17.680 | you really have to validate again, right?
00:17:19.400 | So I think that making the case for safety
00:17:22.600 | is actually a very challenging issue.
00:17:26.380 | And we may not be positive that these self-driving cars
00:17:30.020 | are actually safer than their human counterparts
00:17:33.200 | until a fairly long time from now, okay?
00:17:35.720 | So safety for me remains kind of an open question
00:17:40.160 | at this point.
00:17:42.400 | How do you get back the time value of driving?
00:17:46.960 | If you had, at least I'm speaking for myself,
00:17:51.280 | if I had to constantly pay attention
00:17:54.120 | to what the car is doing, excuse me,
00:17:56.960 | but I'd rather drive myself, okay?
00:18:00.180 | Because if the car is driving,
00:18:02.940 | and this is the paradox, right?
00:18:04.260 | So the better the car drives,
00:18:06.460 | the harder it is for me to keep paying attention, right?
00:18:10.900 | And this is where the whole problem is, right?
00:18:12.740 | So it would be very hard for me not to fall asleep
00:18:15.620 | or not to get distracted.
00:18:17.740 | So if I want to get the time back,
00:18:19.460 | really the car must be able to drive itself
00:18:22.220 | without requiring me to pay attention.
00:18:24.380 | Car sharing, again, is,
00:18:28.420 | in order to make car sharing
00:18:31.960 | really convenient and reliable and so on and so forth,
00:18:35.000 | you need the car to come to you with nobody inside.
00:18:38.440 | And for that, you need level four or level five, okay?
00:18:43.640 | Anything else just doesn't cut it.
00:18:45.560 | Everything else for me is just a nice gadget
00:18:49.960 | that you have on your car
00:18:51.240 | that you show off to your friends or to your girlfriend,
00:18:54.000 | okay, so that's about it, right?
00:18:55.880 | It's not that useful.
00:18:58.680 | So my point is that level four or five automation
00:19:03.400 | is really essential to capture the value
00:19:05.880 | of this technology, okay?
00:19:08.400 | And in fact, the one game-changing feature of these cars
00:19:13.280 | is the fact that these cars now can move around
00:19:15.640 | with nobody inside.
00:19:17.000 | You know, that's really the game-changing feature, okay?
00:19:20.560 | Good, and this is really what we would like to do.
00:19:23.520 | Now, there are many paths that you can go
00:19:26.540 | after this target, okay?
00:19:28.920 | I usually show this figure, okay?
00:19:31.480 | So on this figure, what I show on the horizontal axis
00:19:34.440 | is the scale or the scope of the kind of driving
00:19:38.720 | that you can do, okay?
00:19:39.560 | So on the left is like a small, you know,
00:19:43.280 | pilot, maybe a closed course.
00:19:45.280 | On the right is driving everywhere, okay?
00:19:48.600 | On the, you know, like a complex environment, right?
00:19:54.960 | Mass deployments and so on and so forth.
00:19:57.100 | And on the vertical axis
00:19:59.500 | is the level of automation, okay?
00:20:02.060 | Now, really what we would like to do
00:20:05.580 | is get to the top right corner, right?
00:20:07.560 | So we have millions of cars driving all over the world
00:20:11.300 | completely, you know, in a completely automated way, okay?
00:20:16.780 | What I see is there are two different paths
00:20:22.020 | that the industry is taking, okay?
00:20:24.420 | What I show here is what I call,
00:20:26.640 | this is the OEM path, okay?
00:20:29.400 | So this is the automakers, right?
00:20:32.880 | So they're used to thinking of production of cars
00:20:37.220 | in the orders of many, many millions, okay?
00:20:40.880 | And essentially what they do is they make a lot of cars
00:20:43.600 | and they are adding features to these cars,
00:20:46.640 | you know, advanced driver assistance systems
00:20:48.600 | and so on and so forth, right?
00:20:50.880 | And essentially, they're following this level zero,
00:20:53.560 | one, two, three, four, five, okay?
00:20:56.280 | And today, you can buy cars which,
00:21:00.120 | even though they claim a fully autonomous package
00:21:04.440 | for $5,000 plus another $4,000 or something,
00:21:08.760 | in the fine print, they say it's level two, right?
00:21:11.280 | So level two or level three, right?
00:21:12.720 | So, you know, Tesla, Mercedes,
00:21:16.520 | I think Audi, with the new A8,
00:21:20.160 | they're coming out with this kind of feature.
00:21:21.840 | Cadillac, I think, has a similar thing, okay?
00:21:25.360 | The problem with that, I think,
00:21:27.320 | that you have to cross this red band, okay?
00:21:31.840 | This red band where you're actually requiring
00:21:35.040 | human supervision of your automation system.
00:21:38.220 | Another path that people are following
00:21:41.840 | is this other path, okay?
00:21:43.440 | So this is what we are doing, and what Waymo is doing,
00:21:46.840 | or at least these are all the indications
00:21:51.400 | of what Waymo is doing, of course,
00:21:53.520 | they're not telling me exactly what they do,
00:21:55.840 | similar thing for Uber, right?
00:21:57.680 | So essentially, what they're doing
00:21:59.160 | is they're working on cars that will be fully automated
00:22:02.440 | from the beginning, and they start with a small,
00:22:05.380 | you know, maybe geofence application,
00:22:08.260 | and then scale that up, that operations up, right?
00:22:10.840 | But always remaining at the full,
00:22:14.600 | you know, high full automation level, okay?
00:22:16.800 | Another thing that is important,
00:22:20.240 | where people have a lot of confusion
00:22:22.200 | and don't seem to realize the big difference,
00:22:24.820 | is the following.
00:22:25.720 | When people ask me, "When do you think
00:22:30.920 | "that we will see autonomous vehicles
00:22:32.760 | "everywhere on the street?"
00:22:34.000 | You know, where autonomous vehicles will be common.
00:22:37.280 | I ask them, "Okay, but what do you mean exactly by that?"
00:22:42.720 | Because if you ask me, "When is it that you will be able
00:22:45.760 | "to walk into a car dealership
00:22:47.300 | "and get out with the keys to a car
00:22:49.060 | "that you just push a button and it takes you home?"
00:22:52.240 | That's not happening for another 20 years, at least, okay?
00:22:55.460 | On the other hand, if you ask me,
00:22:58.200 | "When you will be able to go to some new city
00:23:00.800 | "and summon one of these vehicles
00:23:02.440 | "that picks you up and takes you to your destination?"
00:23:04.760 | That thing is happening within a couple of years, okay?
00:23:08.240 | What is the difference?
00:23:11.080 | There is a big difference between autonomous vehicles,
00:23:14.200 | self-driving cars, as a consumer product,
00:23:17.040 | versus a service that you provide to passengers.
00:23:21.940 | So, for example, what is the scope?
00:23:26.260 | Where do these cars need to be able to drive?
00:23:29.420 | If it's a product and I pay $10,000 for it,
00:23:35.280 | then I want this thing to work everywhere, right?
00:23:38.940 | So, take me home, take me to this little alley,
00:23:42.160 | drive me through the countryside.
00:23:44.780 | On the other hand, if I'm a service provider
00:23:47.180 | and I'm offering a service, I can decide,
00:23:49.600 | I'm offering this service in this particular location.
00:23:52.440 | And by the way, I'm offering this service
00:23:54.400 | under these weather conditions
00:23:55.800 | and maybe under these traffic conditions, okay?
00:23:58.860 | So the problem just becomes much, much easier.
00:24:03.280 | What are the financials, right?
00:24:06.360 | So, if I have to sell you a car with an autonomy package,
00:24:11.360 | how much can it cost?
00:24:14.240 | What are my cost constraints on that autonomy package?
00:24:19.040 | If I sell it to you, first of all,
00:24:22.440 | the cost of the autonomy package must be comparable
00:24:24.760 | to the cost of the vehicle, okay?
00:24:27.000 | You will not buy a $20,000 car
00:24:31.320 | with a half a million dollar autonomy package, right?
00:24:34.820 | Also, there is another back-of-the-envelope
00:24:40.460 | calculation that I did, okay, so let's say,
00:24:43.820 | what is the value to you as the buyer
00:24:45.820 | of this autonomy package?
00:24:47.640 | Let's say that the value to you is the fact that now,
00:24:49.840 | instead of driving for the rest of,
00:24:52.240 | for the next 10 years, you can have
00:24:54.560 | the computer driving for you.
00:24:56.280 | What is the value of your time as you are not driving, right?
00:25:00.080 | So, do a quick calculations again,
00:25:03.000 | total number of hours that Americans spend
00:25:05.600 | behind the wheel, median wage, or value of time.
00:25:10.380 | What you get is, what I get is that the net present value
00:25:15.040 | of the driver's time over the next 10 years
00:25:18.480 | is about $20,000, okay?
00:25:21.480 | So then, a rational buyer will not pay more than that
00:25:25.960 | to buy this autonomy package, right?
00:25:29.400 | So now you are constrained by $20,000, okay?
00:25:33.000 | Or actually, if you want to make a profit out of it,
00:25:35.240 | you are constrained, your autonomy package
00:25:37.140 | cannot cost more than a few thousand dollars, okay?
00:25:41.340 | On the other hand, if you're thinking of this as a service,
00:25:44.220 | then what you are comparing to is the cost
00:25:47.140 | of providing the same service using a carbon-based life form
00:25:52.140 | like a human behind the wheel, okay?
00:25:54.660 | So now you want to provide 24/7 service,
00:25:57.340 | you need to hire at least, say, three drivers per car, okay?
00:26:00.940 | And then the cost is comparable
00:26:02.580 | of the order of 100K a year, okay?
00:26:05.740 | So now I'm comparing the cost of my automation package
00:26:09.560 | to something that is gonna cost me $100,000 a year
00:26:13.080 | over the life of the car, okay?
00:26:16.520 | So now the cost of that LIDAR, or that fancy computer,
00:26:19.960 | or that fancy radar, or something,
00:26:22.160 | doesn't matter that much, okay?
00:26:23.600 | So I have much more freedom
00:26:25.520 | in buying the sensor that I need.
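
(To make the two cost constraints concrete, here is a small Python sketch of both comparisons just described: the net present value of a private buyer's driving time over ten years, versus the annual cost of staffing one vehicle with human drivers around the clock. Every input is an illustrative assumption rather than the speaker's exact figure.)

```python
# Two back-of-the-envelope constraints on what an autonomy package can cost,
# following the reasoning in the talk. All inputs are illustrative assumptions.

# --- Consumer model: value of the buyer's own driving time over 10 years ---
hours_per_year = 300            # hours one driver spends behind the wheel per year, assumed
value_per_hour = 9.0            # ~half the median wage, assumed (USD/hour)
discount_rate = 0.06            # annual discount rate, assumed

npv_time = sum(hours_per_year * value_per_hour / (1 + discount_rate) ** t
               for t in range(1, 11))
print(f"Consumer: NPV of 10 years of driving time ~ ${npv_time:,.0f}")
# roughly $20k: a rational buyer will not pay more than this for the package

# --- Service model: cost of providing the same service with human drivers ---
drivers_per_car = 3             # to cover a 24/7 service, as in the talk
cost_per_driver = 35_000        # fully loaded annual cost per driver, assumed (USD)
vehicle_life_years = 5          # assumed

driver_cost = drivers_per_car * cost_per_driver * vehicle_life_years
print(f"Service: human-driver cost over the vehicle's life ~ ${driver_cost:,.0f}")
# on the order of $100k per year, so an expensive LiDAR or computer matters much less
```
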
00:26:27.740 | Infrastructure, for example.
00:26:30.680 | People talk about maps, HD maps, right?
00:26:34.840 | Now, again, if I want to sell it as a product,
00:26:38.320 | I need to, and I have to sell it,
00:26:39.980 | I want to sell it on a global scale,
00:26:41.720 | where global could mean all the United States,
00:26:44.320 | for example, or all of Europe.
00:26:46.440 | Then I need to have maps, HD maps,
00:26:49.180 | of the whole of Europe, or the continental United States,
00:26:52.360 | or wherever I want to sell the cars.
00:26:54.940 | If I'm providing a service, then I only need to map
00:26:59.040 | the area where I want to provide the service.
00:27:02.040 | And by the way, how does the complexity of the maps
00:27:06.120 | scale with the customer base that you're serving?
00:27:11.080 | If you think of a uniform people density, okay?
00:27:15.480 | So then actually, you think that the complexity
00:27:18.800 | and the cost of generating maps scales
00:27:21.080 | with the length of the road network.
00:27:23.800 | Then the cost of the maps scales
00:27:26.480 | with the square root of my customer base,
00:27:29.520 | meaning that it will become manageable
00:27:31.560 | as I serve more people, okay?
00:27:33.880 | So HD maps, yes, it's a pain in the neck
00:27:36.800 | to collect them and to maintain them,
00:27:39.300 | but it's much less of a pain in the neck
00:27:41.400 | than actually operating the logistics
00:27:43.640 | of a fleet serving the population of a city, okay?
00:27:47.120 | And servicing and maintenance,
00:27:51.480 | how would you calibrate your cameras and your sensors?
00:27:56.320 | That's not something that you would do
00:27:58.120 | as a normal consumer, right?
00:28:00.880 | We are not used to that.
00:28:02.440 | When I was little, I was used to my father,
00:28:05.280 | he was tinkering with the car all the time,
00:28:07.760 | checking the timing belt or changing the oil.
00:28:11.280 | You don't do any of that nowadays, right?
00:28:14.760 | So you just sit in the car, switch it on.
00:28:16.800 | If the yellow light, check engine comes up,
00:28:18.840 | you take it to the dealership, that's all you do.
00:28:21.320 | Now imagine that now you have,
00:28:24.480 | if you want to use your autonomy package,
00:28:26.080 | you had to calibrate the sensors every time you go out
00:28:29.280 | or you have to upload a new version of the drivers
00:28:33.000 | and things like that, so you don't want to do that.
00:28:34.920 | On the other hand, in the service model,
00:28:37.080 | I have the maintenance crew that can take care of it
00:28:39.720 | in a professional way, okay?
00:28:41.320 | So big difference between the two models.
00:28:43.280 | So there are a couple of important takeaways, right?
00:28:45.920 | So one thing is that the cost of the autonomy package
00:28:50.480 | is not really an issue.
00:28:52.640 | Clearly, the cheaper I can make it, the better it is, right?
00:28:57.120 | But that is not really the main driver.
00:28:59.320 | In particular, if you need a LiDAR sensor,
00:29:02.940 | for example, to detect a big truck
00:29:05.080 | that is crossing your path, buy the LiDAR sensor, okay?
00:29:08.840 | So that is not making the difference
00:29:10.520 | and maybe can save some lives, okay?
00:29:13.880 | Any reference to other things is intentional.
00:29:17.060 | The other thing is HD maps that people worry about
00:29:24.680 | very much today.
00:29:27.400 | From my point of view, HD maps,
00:29:30.000 | my expectation is that HD maps,
00:29:31.600 | within a few years, will be a dime a dozen, okay?
00:29:35.060 | What is complicated, what is expensive now
00:29:37.560 | in generating all these HD maps?
00:29:39.760 | The mapping company need to put these sensors on a car
00:29:42.960 | and send these cars around.
00:29:45.640 | Now imagine that I have a fleet of 1,000 cars
00:29:48.720 | with these sensors on board,
00:29:51.260 | and these cars are just driving around the city
00:29:53.480 | all the time.
00:29:54.760 | They're generating a gigantic amount of data
00:29:57.320 | that I can just use to make and maintain my HD maps.
00:30:00.860 | So I think that, especially from the point of view
00:30:04.320 | of the operators, the providers of these mobility services,
00:30:07.880 | it will be very easy to collect data
00:30:11.200 | to essentially make and maintain their own maps, okay?
00:30:16.200 | So if you need HD maps, that's fine,
00:30:20.480 | because as soon as you start offering this service,
00:30:22.760 | you will be able to collect all the data you need
00:30:25.240 | to generate and maintain the HD maps.
00:30:28.400 | Oh, by the way, this is showing, an animation showing,
00:30:31.880 | like a simulation of a fleet of,
00:30:35.520 | I think it's a couple of hundred vehicles
00:30:38.800 | in Zurich, in Switzerland, right?
00:30:40.440 | So that's where I was based until a few days ago.
00:30:42.880 | And as you see, essentially you have vehicles
00:30:46.480 | that go through most of the city every few hours, okay?
00:30:52.480 | I think that, for example, the Uber fleet
00:30:55.000 | goes through 95% of Manhattan every two hours or so, okay?
00:30:59.440 | Cost advantages, of course,
00:31:05.160 | most of the cost of taxi services nowadays
00:31:11.800 | is the driver, it's about half.
00:31:15.600 | Of course, you remove the driver from the picture,
00:31:17.960 | you don't have to pay them.
00:31:18.880 | Of course, the automation costs you a little bit more,
00:31:21.600 | servicing costs you a little bit more,
00:31:23.600 | but you see that you still have,
00:31:26.120 | you can get a really significant increase in the margin,
00:31:31.120 | meaning that you can pass some of those savings
00:31:35.120 | to customers, but also you can make
00:31:37.200 | a very strong business case.
00:31:39.040 | However, this is also misleading.
00:31:43.240 | Now, if you think of it,
00:31:47.280 | okay, so typically the reaction that you get
00:31:50.360 | is the following.
00:31:51.800 | Oh my goodness, now you make this thing
00:31:55.120 | and then all taxi drivers, all truck drivers
00:31:57.960 | will be out of a job, okay?
00:32:01.720 | And in fact, one day I was summoned
00:32:05.400 | by the Singapore Ministry of Manpower, okay?
00:32:08.440 | And I was terrified, oh my goodness,
00:32:10.480 | they're gonna shut me down because they're afraid
00:32:12.380 | that I will put all of their taxi drivers on the street,
00:32:15.680 | on the street in the sense of being unemployed.
00:32:18.600 | Turns out it was the opposite.
00:32:21.880 | What most people do not realize is that actually
00:32:27.800 | mobility services worldwide are actually manpower limited.
00:32:32.100 | Okay, in Singapore, they would like to buy more buses,
00:32:37.200 | but they don't have enough people who are able
00:32:40.880 | and willing to drive the buses, okay?
00:32:44.840 | This is true pretty much, same for trucking,
00:32:47.440 | same for taxis.
00:32:48.640 | Now, there's another back of the envelope calculation
00:32:51.560 | that you can do on your own.
00:32:53.320 | Now, imagine, so as we know, Uber's been widely successful,
00:32:58.320 | very high valuation.
00:33:01.200 | A lot of this valuation is predicated on the fact
00:33:03.600 | that everybody in the world will eventually use Uber, right?
00:33:06.560 | Or something similar.
00:33:07.780 | Now, something that people don't think about
00:33:11.440 | is the following.
00:33:12.360 | Now, if everybody in the world uses Uber
00:33:15.520 | for their mobility needs, how many people in the world
00:33:19.680 | need to be drivers for Uber?
00:33:22.320 | Do the calculation.
00:33:23.240 | What you see is that one person out of seven
00:33:25.680 | must drive for Uber if Uber is serving the whole world.
00:33:29.400 | Do you see that happening?
00:33:31.600 | No way, right?
00:33:32.520 | So people still need to be teachers, doctors,
00:33:39.520 | policemen, firemen, or some people need to be kids.
00:33:43.200 | So that is something that cannot happen.
00:33:46.560 | How are we facing this paradox, in a sense?
00:33:50.920 | So today, what you have is people who drive around,
00:33:54.720 | but what is happening today is that we are all
00:33:57.920 | doubling up as drivers for ourselves,
00:34:00.840 | and in fact, we do spend about 1/7, 1/8
00:34:05.000 | of our productive day behind the wheel, very often, okay?
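
(A rough version of that calculation, as a Python sketch with assumed inputs: if each person consumes about an hour of driving a day and a full-time driver can supply seven to eight hours a day, roughly one person in seven would have to drive.)

```python
# If everyone's mobility were provided by human-driven ride-hailing,
# what fraction of the population would have to be drivers?
# Illustrative assumptions, not the speaker's exact inputs.

driving_demand_per_person = 1.1   # hours of driving each person consumes per day, assumed
driver_shift = 8.0                # productive driving hours a full-time driver supplies per day, assumed

fraction_driving = driving_demand_per_person / driver_shift
print(f"About 1 in {1 / fraction_driving:.0f} people would need to drive")
# ~1 in 7: consistent with the observation that we each spend roughly
# 1/7 to 1/8 of our productive day behind the wheel as our own drivers
```
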
00:34:09.440 | So for me, the big change will be more
00:34:14.440 | on the supply of mobility rather than on job loss.
00:34:19.360 | I mean, of course, if you increase supply of mobility,
00:34:22.520 | the cost of mobility will probably go down,
00:34:26.680 | wages for drivers will go down, right?
00:34:28.560 | So that is an issue, but maybe added,
00:34:33.560 | maybe balanced by like an added value in service
00:34:37.640 | or other things that you can imagine.
00:34:39.580 | Another thing about truck drivers,
00:34:42.720 | and something that I recently learned,
00:34:44.620 | 25% of all job-related deaths in the US
00:34:52.960 | are actually of truck drivers, okay?
00:34:56.680 | It's the single most dangerous industry that you can be in.
00:35:00.600 | So maybe if you can take some of those people
00:35:05.440 | out of those trucks and maybe supervise,
00:35:08.080 | remotely control a truck sitting in their office
00:35:10.840 | instead of sitting in the truck,
00:35:12.560 | that may actually be a benefit to them.
00:35:14.400 | Back to the question of when will autonomous vehicles
00:35:18.360 | arrive, and in a sense, this is what our prediction,
00:35:22.760 | our vision is, right?
00:35:23.760 | So what we will see is that, what we think is that
00:35:26.800 | we will have a fairly rapid adoption
00:35:29.520 | of self-driving vehicles in this mobility
00:35:33.680 | as a service model, okay?
00:35:35.280 | As a fleet of shared autonomous vehicles
00:35:38.940 | that people can use to go from point to point, right?
00:35:42.720 | Rather than own.
00:35:44.440 | Of course, eventually, people will be able to buy these cars
00:35:48.640 | and maybe own them if they really want,
00:35:50.420 | but that is something that is much later in time
00:35:53.320 | for a number of reasons, some of which I discussed, okay?
00:35:58.500 | So this is what we expect in terms of the timeline for this.
00:36:02.860 | Now, what is the state of the art
00:36:07.860 | for autonomous technology today?
00:36:12.580 | You do see a lot of demos from a number of companies
00:36:17.100 | doing a number of things, right?
00:36:18.660 | But a lot of the things that you see
00:36:23.180 | are not too much different from this video.
00:36:25.520 | I don't know if any of you recognizes this video,
00:36:29.340 | but look at the cars.
00:36:32.500 | This was actually done by Ernst Dickmanns
00:36:36.000 | in the late '90s in Germany, okay?
00:36:38.940 | No fancy GPUs, no, it was just cameras
00:36:45.580 | and some basic computer vision algorithms,
00:36:49.580 | but essentially he was able to drive
00:36:51.380 | for hundreds of miles on the German highways, okay?
00:36:55.720 | If you're not showing something that goes beyond that,
00:37:01.020 | you have not made any progress over the past 20 years, okay?
00:37:06.020 | Yeah, you're using fancy deep learning
00:37:09.140 | and GPUs and things nowadays,
00:37:10.900 | but you're doing what people were doing 20 years ago.
00:37:13.500 | You know, come on, okay?
00:37:15.740 | So you see, clearly there is a lot of hype in these things,
00:37:19.540 | but if you see something like that,
00:37:22.060 | I don't think it's very impressive, okay?
00:37:24.940 | People knew how to do that for a very long time.
00:37:29.420 | Something that I find a little bit,
00:37:30.940 | I may be biased, clearly,
00:37:32.460 | but this is something that I find a little bit more exciting
00:37:35.660 | is actually footage from our daily drives in Singapore, okay?
00:37:40.660 | This is four times the real time.
00:37:42.700 | We don't drive that fast, okay?
00:37:44.620 | But essentially what we are doing in Singapore,
00:37:49.120 | we are driving in public roads, normal traffic.
00:37:53.460 | What you will see is not so,
00:37:57.980 | but we have construction zones, intersections,
00:38:01.100 | traffic on both sides.
00:38:04.880 | We will get to a pretty interesting intersection.
00:38:08.240 | Okay, so it's a red light.
00:38:11.300 | We'll turn to green in a second.
00:38:15.600 | Keep in mind in Singapore, they drive on the left, right?
00:38:20.420 | So making the right turn is what is hard
00:38:22.620 | because you have to cross traffic, right?
00:38:24.940 | And here you have a lot of traffic
00:38:27.180 | and the car is making the right decision.
00:38:29.340 | All of this is without any human intervention, right?
00:38:31.940 | So I think that in this day and age,
00:38:35.500 | if you're not showing the capability of driving in traffic
00:38:39.940 | in an urban situation like that,
00:38:41.940 | you're not really showing any advance
00:38:43.900 | over what people were able to do 20 years ago, okay?
00:38:47.300 | And as you can see, right,
00:38:51.800 | so intersections, other cars, pedestrians,
00:38:55.340 | all kind of like crazy interactions,
00:38:57.540 | cars parked in the middle of the street
00:39:00.300 | that you had to avoid, go to the other lane,
00:39:03.060 | things like that, okay?
00:39:03.940 | So this is what you have to do every day.
00:39:06.700 | And this is what we are doing every day in Singapore.
00:39:09.940 | We are doing every day here in the seaport area.
00:39:12.720 | I don't know if you're aware of,
00:39:14.420 | but we are driving cars.
00:39:16.900 | We are allowed by the city of Boston
00:39:18.300 | to drive our cars autonomously in the seaport area.
00:39:24.980 | So what are the technical challenges?
00:39:27.420 | Okay, so actually this is a slide
00:39:29.020 | that I'm fairly reusing from a talk
00:39:34.020 | that Amnon Shashua, the founder and CEO of Mobileye,
00:39:39.940 | gave here at MIT a few months ago, okay?
00:39:42.940 | So this is what he said, okay?
00:39:44.380 | So this is not what I say.
00:39:45.980 | What he says is that the big challenges
00:39:49.400 | are sensing, you know, perception,
00:39:51.980 | mapping, and then what he called driving policy,
00:39:56.180 | right, which I would call more like decision-making, okay?
00:39:59.140 | Now, what he said is that sensing, perception,
00:40:04.460 | is a challenge, but it's a challenge
00:40:07.100 | that we are aware of, and then we are making
00:40:08.860 | rapid progress on getting better and better
00:40:11.340 | sensing, perception algorithms, okay?
00:40:13.660 | Second, it's HD maps.
00:40:17.140 | What he said is that it's a huge logistical nightmare,
00:40:20.480 | so he didn't want to deal with that,
00:40:21.900 | you know, like Mobileye tries to avoid that.
00:40:24.180 | From my point of view, as I said,
00:40:26.500 | you know, for me, HD maps, it is a big pain in the neck
00:40:30.580 | to get those maps, but in a few years,
00:40:34.660 | maps will be a dime a dozen, okay?
00:40:36.300 | So we'll get all the mapping data that we want and we need.
00:40:39.780 | So the big problem is driving policy, okay?
00:40:42.700 | So the remaining problem is driving policy.
00:40:44.500 | So how do you deal with that?
00:40:46.740 | And this is a typical example of things
00:40:48.820 | that we encounter in any kind of urban driving situation.
00:40:52.460 | So you will see a video.
00:40:54.580 | So this is a case where we are at a traffic light,
00:40:57.220 | we are stopping, the traffic, you know,
00:40:59.780 | the light turns green, we are making the turn,
00:41:02.460 | there's a pedestrian crossing the street,
00:41:05.180 | we wait for the pedestrian, go through it,
00:41:07.220 | and then we see that there is a truck
00:41:08.980 | that is parked in the middle of our lane.
00:41:11.220 | So we need to go to the other lane,
00:41:12.820 | which is in the opposite direction.
00:41:14.740 | There is a motorcycle coming,
00:41:18.040 | so we have to handle all that kind of situation, right?
00:41:20.340 | So how do you write your software in such a way
00:41:24.660 | that your car is able to deal with this kind
00:41:27.440 | of complicated situation by itself, okay?
00:41:31.260 | And my point is that, you know,
00:41:37.140 | this is not really about negotiation.
00:41:39.540 | It's not about policy.
00:41:41.700 | Why do you have rules of the road?
00:41:44.740 | My claim, I have not proved it mathematically yet,
00:41:47.340 | but my claim is the following,
00:41:49.260 | that actually the rules of the road were introduced
00:41:52.260 | exactly to avoid the need for negotiation when you drive.
00:41:57.260 | Okay, when you're walking as a person,
00:42:00.100 | you're just walking down the hallway,
00:42:01.460 | you know, walking down the infinite corridor,
00:42:03.080 | and there is a person coming in the other direction,
00:42:04.980 | there's always that awkward moment, right?
00:42:06.660 | So when you're trying to, I go left, I go right, right?
00:42:09.680 | With cars, you don't do that, right?
00:42:11.980 | So in cars, they say, everybody go right,
00:42:14.700 | or in other places, everybody go left, period,
00:42:17.100 | and you don't negotiate that, okay?
00:42:19.420 | You get to an intersection, the light is red, you stop.
00:42:24.540 | You don't say, I'm really in a rush, do you mind if I go?
00:42:27.940 | No, you don't do that, right?
00:42:29.080 | So it's red, and you stop, okay?
00:42:31.140 | So the rules of the road have been invented by humans
00:42:34.560 | in order to minimize the amount of negotiation, okay?
00:42:37.460 | And in particular, okay, so this is a slightly,
00:42:42.420 | I mean, this is actually a very old video,
00:42:43.980 | but I kind of like it.
00:42:45.080 | So now our car is a little bit more aggressive,
00:42:48.620 | but what you see here is this case.
00:42:50.860 | This is how the car behaved in that particular situation.
00:42:53.940 | So you see it's raining, red light, turns green.
00:42:58.120 | There's a pedestrian crossing our path,
00:43:00.680 | so we yield to the pedestrian.
00:43:02.640 | You see that there is a, you will see that there is a truck
00:43:05.100 | that is parked on the left lane, in the middle of the lane,
00:43:08.420 | so we had to go around it,
00:43:09.620 | but there's a motorcycle that is approaching,
00:43:12.140 | so we had to be careful in going to the other lane.
00:43:15.360 | Okay, so we squeeze through the motorcycle.
00:43:19.400 | We try to go very slowly next to squishy targets, right?
00:43:22.860 | But then as soon as we pass the truck,
00:43:26.880 | the truck driver decides to get moving, okay?
00:43:30.440 | So then what we do is we wait for the truck to get going,
00:43:34.220 | and then go back to our lane.
00:43:36.240 | Now, imagine writing a script,
00:43:39.040 | or if then else, if there is a truck,
00:43:42.340 | but the truck is moving, and then do this,
00:43:44.380 | and this is a nightmare, so you don't want to do that.
00:43:46.540 | So how do you handle this kind of situations?
00:43:49.380 | Okay, so the industry standard approach to this was to,
00:43:54.380 | and by the way, this is what we did
00:43:57.160 | at the time of the DARPA Urban Challenge.
00:43:59.260 | So we had a lot of if-then-else statements,
00:44:02.640 | or finite-state machines, or some logic
00:44:05.100 | that was encoded by, so some finite-state machine
00:44:08.500 | kind of things.
00:44:09.700 | The problem with that is it's very hard
00:44:15.660 | to come up with this logic, and it's essentially
00:44:18.600 | impossible to debug it and verify it, right?
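
(A caricature of that kind of hand-coded logic, as a hypothetical Python sketch, not anyone's actual planner: every new scenario adds branches, and the interactions between branches quickly become impossible to test and verify.)

```python
# A caricature of hand-coded driving logic of the if-then-else /
# finite-state-machine flavor described here. Hypothetical sketch only:
# every new scenario adds branches, and the combinations become untestable.

def decide(truck_blocking_lane, truck_moving, oncoming_motorcycle,
           pedestrian_crossing, light):
    if light == "red":
        return "stop"
    if pedestrian_crossing:
        return "yield"
    if truck_blocking_lane:
        if truck_moving:
            return "wait_behind_truck"
        if oncoming_motorcycle:
            return "wait_for_gap"
        return "overtake_in_oncoming_lane"
    # ...and so on for construction zones, cyclists, double-parked cars,
    # trucks that start moving mid-overtake, etc. Each new case multiplies
    # the branches you have to write, tune, and somehow verify.
    return "follow_lane"
```
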
00:44:21.020 | So I spent many miserable months sitting
00:44:26.020 | in the naval air base in Weymouth, right?
00:44:29.060 | So here, in a rental car, just playing interference
00:44:32.960 | with our autonomous car, trying to adjust
00:44:35.700 | all this logic and parameters and things.
00:44:37.940 | So I vowed that I will never do it again.
00:44:40.300 | Okay, it was just a miserable experience.
00:44:43.500 | I'm happy to say that actually we did come up
00:44:45.780 | with a much better way of doing it.
00:44:47.220 | And by the way, this is a video from the Caltech team
00:44:51.180 | at the DARPA Urban Challenge.
00:44:53.120 | As you can see, they're trying to go to an intersection.
00:44:56.160 | They decide to go, then for some reason,
00:44:58.060 | they decide actually to back up out of the intersection.
00:45:01.880 | So the director of DARPA, Tony Tether at the time,
00:45:06.540 | he was there, he went like that.
00:45:08.860 | So they were out of the race, okay?
00:45:10.540 | So as soon as he saw that.
00:45:12.900 | What happened here, there was essentially
00:45:14.180 | a bug in their logic.
00:45:15.920 | Caltech, team of very smart people, very capable,
00:45:20.260 | dedicated people, worked on this for months.
00:45:22.460 | They didn't catch this bug, they were out of the race.
00:45:25.780 | So it's very easy to make mistakes,
00:45:27.700 | and it's very hard to find those bugs.
00:45:29.800 | Okay, so as a reaction to that, there is this new
00:45:33.780 | (mumbles)
00:45:35.940 | Is it possible to cut the sound?
00:45:39.100 | Oh, thank you.
00:45:43.180 | So now what you hear people saying is,
00:45:48.860 | well, there are too many rules of the road.
00:45:52.100 | It's impossible to code all of them correctly.
00:45:55.620 | So let's not do that.
00:46:00.240 | Just feed the data, feed the car a lot of data,
00:46:03.380 | and let the car learn by itself how to behave.
00:46:06.420 | And this is what you see in,
00:46:11.940 | there are a number of startups and other efforts
00:46:14.420 | that are trying to use all this deep learning
00:46:17.780 | or learning approaches to get for end-to-end driving
00:46:22.660 | of cars, so you see a video from NVIDIA.
00:46:30.820 | I understand this is a course on deep learning for cars.
00:46:33.620 | So I don't want to sound too negative.
00:46:37.620 | On the other hand, I will try to be honest
00:46:40.460 | in what I think.
00:46:41.300 | So there are a number of problems.
00:46:44.300 | And actually, this happened to us.
00:46:48.300 | So one of our developers, super bright lady from Caltech,
00:46:53.300 | you know, the first version of the code
00:46:59.220 | for dealing with traffic lights,
00:47:00.980 | essentially the reaction that they had
00:47:04.540 | for the yellow light was, if you see a yellow light,
00:47:07.740 | speed up.
00:47:09.300 | I was looking at this, what the heck?
00:47:11.300 | Oh, this is what my brother does.
00:47:13.020 | So there is always the danger
00:47:17.060 | that you learn the wrong thing, okay,
00:47:20.020 | the wrong behavior in a sense.
00:47:21.580 | Of course, there are some situations
00:47:23.660 | in which accelerating when you see a yellow light
00:47:27.460 | is actually the right response,
00:47:29.420 | but it is not always the case, right?
00:47:31.740 | So there are some other features of the situation
00:47:34.660 | that you need to examine, right?
00:47:36.900 | Also, the other thing is, it's a cartoon, right?
00:47:39.180 | So you want to be able to explain
00:47:42.980 | why the car did something,
00:47:45.220 | and I would say that more than explaining,
00:47:47.100 | because now you also see articles
00:47:48.940 | in which people say, oh, we have found a way
00:47:51.700 | of explaining why the neural network in the car
00:47:54.460 | decided to do something, right?
00:47:56.060 | And what they show you is some, okay,
00:47:57.580 | so these are the neurons that were activated.
00:47:59.820 | Okay, this is just saying that, you know,
00:48:03.460 | if I do a fast MRI of the brain,
00:48:06.580 | and I see what neurons, what areas of the brain
00:48:08.980 | are activated when I watch a movie,
00:48:11.460 | then I know how the brain works.
00:48:13.100 | No, I have no idea, okay?
00:48:15.380 | The point is that, yes, you want to trace the reason,
00:48:19.380 | the cause for why the car behaved in a certain way,
00:48:22.380 | but you also want to be able to revert the cause, right?
00:48:25.820 | So you want that information to be actionable
00:48:28.260 | in some sense, right?
00:48:29.100 | So you want to know that, okay,
00:48:32.100 | this happened because of this reason,
00:48:33.940 | and this is how I fix it, okay?
00:48:36.140 | And the other thing that, you know,
00:48:36.980 | this is something that is hard to do
00:48:38.860 | with purely learning-based algorithms.
00:48:42.340 | On the other hand, you can, well,
00:48:44.180 | let me actually skip that in the interest of time, okay?
00:48:47.220 | The reality is the following,
00:48:53.940 | that it is simply not true
00:48:58.940 | that there are too many rules of the road.
00:49:02.220 | In fact, any 16-year-old in the States
00:49:05.860 | can go to the DMV, get the booklet,
00:49:08.660 | study the booklet, do a written test,
00:49:11.740 | and be given a learner's permit, okay?
00:49:14.820 | And actually, this is what we require
00:49:16.700 | of every single licensed driver in the United States, okay?
00:49:21.860 | We don't say, just drive with your dad or mom
00:49:24.780 | for a few thousand miles,
00:49:26.460 | and they will give you the license.
00:49:28.180 | No, we ask them, you know,
00:49:29.580 | show me that you studied the rules,
00:49:32.380 | and you understand the rules, okay?
00:49:34.820 | So how many are the rules of the road?
00:49:39.580 | Actually, I went through an exercise of counting, okay?
00:49:43.320 | And what I did is I kind of clustered them.
00:49:45.380 | So essentially, you have rules on who can drive,
00:49:50.400 | when and where, what can be driven, when and where,
00:49:53.620 | at what speed, in what direction,
00:49:58.420 | who yields to whom, right?
00:50:01.740 | How you use your signals, active signaling,
00:50:04.540 | how do you interpret the signals that you see on the road,
00:50:08.860 | right, and where you can park and where you can stop.
00:50:11.340 | That's essentially it.
00:50:12.380 | You know, these are all the rules, okay?
00:50:15.440 | So not that many, it's kind of like 12 categories.
00:50:19.820 | What is true is that the number of possible combinations
00:50:24.820 | of rules and the instantiation of the rules,
00:50:30.380 | given the context of the scenario,
00:50:33.380 | where other actors are, or pedestrians are,
00:50:35.980 | and where other cars are, that is a humongous number.
00:50:39.460 | Okay?
00:50:41.900 | So you don't want to code,
00:50:43.940 | you don't want to build essentially any generative model
00:50:48.380 | that gives you what is the right response
00:50:51.440 | to all possible combinations of rules
00:50:53.780 | and instantiations of actors.
00:50:56.800 | That is something that is just combinatorially intractable.
00:51:00.100 | I mean, you just cannot do that.
00:51:02.160 | But the point is that not only is it hard
00:51:05.020 | to code the good behavior,
00:51:06.740 | what to do in every one of these situations,
00:51:10.300 | I claim that it's also hard to learn the good behavior.
00:51:13.980 | Because now you need to have enough training data
00:51:17.340 | for every possible combination of rules and instantiations.
00:51:22.340 | Good luck with that.
00:51:24.700 | Okay?
00:51:27.300 | On the other hand, it is very easy
00:51:32.020 | to assess what is a good behavior.
00:51:34.540 | And that's why I was showing these slides on NP-hardness.
00:51:37.860 | So what is a problem that is NP-hard?
00:51:40.460 | A problem is NP-hard when,
00:51:45.420 | if you have a non-deterministic system
00:51:48.120 | that is generating a candidate solution,
00:51:53.120 | then it is very easy to check whether or not
00:51:56.700 | that candidate is actually a solution of your problem.
00:51:58.980 | And that's something that you do in polynomial time.
00:52:01.880 | Okay?
00:52:02.940 | So in a sense, what I claim is that if you have an engine
00:52:07.320 | that is able to generate a very large number of candidates,
00:52:13.140 | and all you do is check whether or not
00:52:19.040 | each one of those candidates is good
00:52:21.300 | with respect to the rules, then that's all you need.
00:52:24.140 | And it turns out that the algorithms
00:52:27.580 | that I worked on during my academic career
00:52:30.380 | were exactly generating that very large number,
00:52:33.100 | you know, RRT, RRT*.
00:52:34.980 | These are algorithms that work by generating
00:52:38.220 | a very large graph exploring all potential trajectories,
00:52:41.740 | reasonable trajectories that a robotic system can take.
00:52:45.140 | And then what you do is you check them
00:52:48.180 | for whether they satisfy the rules or not.
00:52:52.740 | You see, it's very different:
00:52:55.620 | given the rules, generating something
00:52:57.820 | that satisfies everything, versus, given a candidate,
00:53:02.340 | checking whether or not this candidate satisfies the rules.
00:53:07.780 | Generating candidates
00:53:10.520 | given all the constraints is a combinatorial problem.
00:53:15.420 | Checking a single candidate for compliance
00:53:19.300 | with a number of rules is a linear operation
00:53:21.860 | in the number of rules.
00:53:22.820 | So that's something that you can do very easily.
00:53:25.700 | And that's essentially what we have in our cars.
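
As a minimal sketch of this generate-and-check idea, the loop below uses a placeholder candidate generator standing in for an RRT/RRT*-style planner and two purely illustrative rule predicates; none of the names or values come from nuTonomy's actual system. The point it illustrates is only that checking each candidate costs time linear in the number of rules.

```python
from typing import Callable, List, Optional, Sequence, Tuple

State = Tuple[float, float, float]       # (x, y, t) -- deliberately simplified
Trajectory = Sequence[State]
Rule = Callable[[Trajectory], bool]      # True if the trajectory complies with the rule

def generate_candidates(n: int) -> List[Trajectory]:
    """Stand-in for a sampling-based planner such as RRT/RRT*.

    A real planner would grow a tree or graph of dynamically feasible motions;
    here we just return placeholder trajectories so the checking loop runs.
    """
    return [[(0.0, 0.0, 0.0), (float(i), 1.0, 1.0)] for i in range(n)]

def first_compliant(candidates: List[Trajectory], rules: List[Rule]) -> Optional[Trajectory]:
    # Checking one candidate is O(len(rules)) -- linear in the number of rules,
    # with no combinatorial search over rule combinations.
    for traj in candidates:
        if all(rule(traj) for rule in rules):
            return traj
    return None

# Purely illustrative rules.
stays_on_road: Rule = lambda traj: all(abs(s[1]) < 3.5 for s in traj)
no_collision: Rule = lambda traj: True   # placeholder for a real collision check

plan = first_compliant(generate_candidates(1000), [stays_on_road, no_collision])
```
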
00:53:28.620 | Now we are using these formal methods.
00:53:30.940 | So essentially we write down all the rules
00:53:33.260 | in a formal language, with a very precise syntax.
00:53:39.940 | And then what you can do is you can verify
00:53:42.520 | whether your trajectory satisfies all these rules
00:53:45.360 | written in this language, which can be
00:53:48.080 | automatically translated
00:53:50.280 | into something that looks like a finite state machine
00:53:52.120 | by a computer, okay?
00:53:53.480 | But that's not something that you do by hand.
00:53:55.320 | It's something that is done automatically.
00:53:57.640 | And then what happens is that what we have
00:54:00.160 | is we generate trajectories.
00:54:02.040 | These trajectories are, you know,
00:54:03.880 | you can think of these as trajectories
00:54:06.800 | that now are not only trajectories in the physical space
00:54:09.800 | and time, but are also trajectories evolving
00:54:12.280 | in this logical space, telling me whether or not
00:54:15.400 | and to what extent I am satisfying the rules.
00:54:18.240 | Okay, and that's all there is.
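
To make the idea of a trajectory "evolving in a logical space" concrete, here is a small sketch under stated assumptions: each rule is represented by a monitor that is stepped along the trajectory and accumulates a violation score. In the real system these monitors would be automata compiled from the formal rule language; the observation fields, rule names, and numbers below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Observation:
    speed: float      # m/s at this point of the trajectory
    in_lane: bool

@dataclass
class RuleMonitor:
    name: str
    step: Callable[[Observation], float]   # violation incurred at this step

def speed_limit_monitor(limit: float) -> RuleMonitor:
    # Violation grows with how much, and for how long, the limit is exceeded.
    return RuleMonitor("speed_limit", lambda obs: max(0.0, obs.speed - limit))

def keep_lane_monitor() -> RuleMonitor:
    return RuleMonitor("keep_lane", lambda obs: 0.0 if obs.in_lane else 1.0)

def logical_trace(traj: List[Observation], monitors: List[RuleMonitor]) -> Dict[str, float]:
    """Accumulate, per rule, the total violation along the trajectory."""
    return {m.name: sum(m.step(obs) for obs in traj) for m in monitors}

traj = [Observation(speed=12.0, in_lane=True),
        Observation(speed=16.0, in_lane=False)]
print(logical_trace(traj, [speed_limit_monitor(13.9), keep_lane_monitor()]))
```
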
00:54:20.000 | Okay, so this is, you know, for example,
00:54:24.840 | a little example.
00:54:27.280 | So, you know, initially, the work we were doing,
00:54:29.600 | so this was very early days at nuTonomy,
00:54:32.680 | where we were still working on research projects
00:54:34.940 | with industry, with customers.
00:54:37.300 | So our customer in this case wanted us
00:54:39.120 | to do an automated parking application.
00:54:42.200 | And then what you see on the left
00:54:43.520 | is our plan, our original plan,
00:54:46.400 | that is just trying to park the car, right,
00:54:49.120 | avoiding hitting other cars.
00:54:51.120 | But you see it's kind of ignoring the fact
00:54:53.200 | that you have lanes and directions of travel, right?
00:54:56.160 | So you put in the rules, and what you see
00:54:57.800 | is what is on the right, where now what the car is doing
00:55:01.480 | is not only finding the trajectory to go park,
00:55:04.280 | but it does so obeying all the rules
00:55:07.280 | that are imposed on that particular parking structure.
00:55:10.160 | Okay?
00:55:11.000 | Something else that is very important,
00:55:13.960 | and you know this is something that we as humans
00:55:15.920 | do every day, is to deal with infeasibility, okay?
00:55:20.120 | So very often you're doing your planning,
00:55:25.240 | you're trying to plan your trajectory,
00:55:26.600 | you have a number of constraints,
00:55:28.200 | and well, sorry, but it turns out that there is no trajectory,
00:55:31.640 | there is no possible behavior that you can do
00:55:34.520 | that will satisfy all the rules.
00:55:37.640 | So what do you do?
00:55:38.480 | The computer returns, sorry, there's no solution,
00:55:40.840 | infeasible. But I'm still driving this car,
00:55:44.160 | I need to do something, right?
00:55:46.200 | So you do need a way of dealing with infeasibility, okay?
00:55:50.120 | The way that we approach this problem
00:55:53.720 | is having this idea of hierarchy of rules, okay?
00:55:58.720 | And my claim is that all bodies of rules
00:56:03.560 | generated by humans are actually organized hierarchically.
00:56:06.600 | Typical example is the three laws of robotics by Asimov.
00:56:09.320 | So the first law of robotics is a robot
00:56:11.680 | will not harm a human, right?
00:56:14.600 | Or cause a human to come to harm.
00:56:16.920 | Second law is a robot will obey
00:56:20.720 | orders given by a human, unless they violate the first law.
00:56:25.000 | And the third law is a robot will try to preserve
00:56:27.600 | its own life, or preserve itself,
00:56:30.840 | unless it violates the first two laws, right?
00:56:33.360 | Same thing when you drive, right?
00:56:37.040 | So there are some rules that are more important than others.
00:56:39.720 | So for example, do not hit people, do not hit other cars.
00:56:43.320 | And then at a lower priority level is maybe staying in your lane,
00:56:47.880 | at a lower priority level still is maybe maintaining the speed,
00:56:50.440 | or something like that, okay?
00:56:51.760 | And then what we do is come up with,
00:56:54.680 | now we have this product graph of trajectories
00:56:57.960 | in the physical and logical space.
00:56:59.840 | On top of that, you can give them a cost, right?
00:57:02.960 | What we need is essentially a total order.
00:57:05.520 | What we use is a lexicographic ordering, okay?
00:57:08.480 | Where violating an important rule
00:57:11.320 | even by a tiny amount is much worse
00:57:13.360 | than violating a less important rule by a large amount, okay?
00:57:16.920 | So that gives a total order structure to the cost.
00:57:19.520 | And then essentially what we do is we solve
00:57:21.320 | a shortest path problem on this graph, okay?
00:57:24.680 | Which is exactly what you do in Robotics 101
00:57:28.040 | when you try to do any kind of motion planning, okay?
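
A minimal sketch of this lexicographic, minimum-violation ordering, assuming invented candidate names and violation numbers: each candidate trajectory carries a vector of violations ordered from the most important rule to the least important, and comparing the vectors as tuples gives a total order in which even a tiny violation of a high-priority rule outweighs a large violation of a lower-priority one.

```python
candidates = {
    # (collision, wrong_lane, speeding) -- highest-priority rule first
    "brake_hard":  (0.0, 0.0, 0.0),
    "swerve_left": (0.0, 1.0, 0.2),
    "keep_going":  (0.5, 0.0, 0.0),
}

# Python compares tuples lexicographically, which is exactly the ordering described.
best = min(candidates, key=candidates.get)
print(best)   # -> "brake_hard": no rule violated at all

# In the planner, this comparison plays the role of the path cost in a
# shortest-path search over the product of the trajectory graph and the rule monitors.
```
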
00:57:32.360 | And well, this is a collection of a few interesting things.
00:57:36.600 | So here we need to go to the other lane,
00:57:39.240 | but you see that there is the other vehicle coming,
00:57:41.560 | so technically we could not go to the other lane,
00:57:44.200 | but you see that as long as it is safe to do so,
00:57:46.960 | the car will go into the other lane, okay?
00:57:51.240 | And again, you have a lot of difficult situations
00:57:55.480 | that the car was able to handle by itself
00:57:57.640 | without any scripting or without any special instruction
00:58:01.360 | for that particular case, okay?
00:58:03.960 | So what is the problem here?
00:58:10.120 | The problem here is that, okay,
00:58:14.160 | so you can do all of this, right?
00:58:15.760 | But then assuming that everybody is running
00:58:18.760 | this minimum violation planning, everything will be okay.
00:58:22.480 | The problem is that humans introduce
00:58:24.560 | a lot of uncertainty in the whole thing, okay?
00:58:29.640 | Now you can think of this as asking the question.
00:58:32.160 | So when I was young and naive, that is two years ago,
00:58:34.800 | I thought that I take all the rules of the road
00:58:37.840 | and you convert them to this formal language,
00:58:40.040 | you put them in your software, and you're done.
00:58:42.640 | And then you go and look at these rules of the road
00:58:45.800 | and then you see that they're a mess, okay?
00:58:48.080 | These rules are just not a sound theory
00:58:50.240 | in the sense they're not complete,
00:58:51.480 | do not cover every possible case,
00:58:52.960 | and they're not consistent.
00:58:54.000 | You know, they're kind of like,
00:58:55.040 | tell you to do different things in different cases.
00:58:57.560 | My favorite rule is this one.
00:59:00.240 | It's actually called the fundamental norm
00:59:02.360 | in the Swiss rules of the road.
00:59:04.720 | Look at that.
00:59:06.200 | All road users must behave in such a way
00:59:08.140 | as not to pose an obstacle or danger to other road users
00:59:11.240 | that behave according to the rules.
00:59:13.480 | Do you see a problem there?
00:59:17.800 | Okay, that doesn't mean that if I see somebody
00:59:21.040 | who is violating the rule, I can just hit them, right?
00:59:23.940 | So you can imagine that you have a fleet of vigilantes,
00:59:26.760 | you know, autonomous cars that just go around
00:59:29.080 | and if you run the red light, bam, gonna kill you.
00:59:32.260 | I mean, technically, the autonomous cars
00:59:36.160 | will be right, right?
00:59:38.320 | So the other guy will be the one to blame, right?
00:59:42.960 | But do we really want that?
00:59:44.400 | Probably not, right?
00:59:45.400 | In defense of the Swiss, they actually have a bit more:
00:59:49.320 | that rule continues to say special care must be exercised
00:59:52.720 | in case you have evidence that other people
00:59:54.960 | are not following the rules,
00:59:56.160 | but still doesn't tell you what you're supposed to do
00:59:59.160 | when somebody else is violating the rule, okay?
01:00:01.560 | And you have trolley problems, right?
01:00:05.160 | So probably you've heard, you know,
01:00:06.600 | you hear about all these trolley problems to no end, right?
01:00:09.240 | And most of these I find, you know, I mean,
01:00:11.160 | totally stupid, you know, in the sense
01:00:12.520 | it's like a big waste of time.
01:00:14.880 | In the sense that, yeah, sorry,
01:00:17.760 | I think it's extremely unlikely that you will be given
01:00:21.240 | the choice of killing either Mother Teresa or Hitler, right?
01:00:24.280 | So, I mean, for sure that will never happen, right?
01:00:27.360 | But anything remotely similar will never happen to you, okay?
01:00:30.840 | On the other hand, there are versions
01:00:35.400 | of the trolley problem which are actually meaningful, okay?
01:00:37.560 | So this is one that my collaborator,
01:00:39.840 | Andrea Censi, came up with, okay?
01:00:42.000 | Look at this case.
01:00:42.840 | So you're driving down the road,
01:00:44.800 | and you see a pedestrian that is jaywalking
01:00:47.040 | in front of you, okay?
01:00:48.440 | If we stay on our current course,
01:00:53.520 | we will kill the pedestrian, probability one, okay?
01:00:58.520 | But it's not our fault, okay?
01:01:00.200 | It's his fault that, you know, his or her fault
01:01:02.760 | that they stepped in the road when they shouldn't have.
01:01:05.200 | On the other hand, what we could do is
01:01:08.720 | we can try to swerve, right?
01:01:10.160 | But then with some probability P,
01:01:11.920 | we may kill another person who had nothing
01:01:14.760 | to do with this thing.
01:01:15.760 | You know, they were just walking around, you know,
01:01:17.440 | peacefully, right?
01:01:18.400 | So the reason why I like this is because
01:01:22.760 | this problem actually has clear solutions
01:01:25.640 | in the two extreme cases, right?
01:01:27.160 | So if P is one, okay, in the sense that
01:01:31.560 | if we swerve, we kill somebody else,
01:01:33.560 | then we clearly kill the guy who was jaywalking, right?
01:01:37.960 | If P is zero, that is, I'm sure that
01:01:40.320 | I'm not killing anybody if I swerve,
01:01:42.200 | then clearly I will swerve.
01:01:43.760 | What is the boundary?
01:01:46.120 | So I know that the solution exists for P is equal zero.
01:01:49.080 | I know the solution exists with P equal one.
01:01:52.080 | By some continuity argument, you know,
01:01:54.920 | I must have some value of P at which the solution changes.
01:01:59.960 | What is that value?
01:02:01.000 | Nobody knows.
01:02:04.520 | How do you evaluate that P?
01:02:06.440 | Nobody knows.
01:02:07.480 | But you know, these are the kind of questions
01:02:10.160 | that we actually need to answer somehow.
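
As a sketch only, the decision structure being described reduces to comparing the swerve risk P against some threshold. The threshold value below is a placeholder, not a proposed answer; choosing it is exactly the open question.

```python
SWERVE_RISK_THRESHOLD = 0.5   # placeholder; picking this number is the hard part

def should_swerve(p_kill_bystander: float) -> bool:
    # p == 0: swerving is clearly right; p == 1: swerving is clearly wrong.
    # Somewhere in between the answer flips -- at this (unknown) threshold.
    return p_kill_bystander < SWERVE_RISK_THRESHOLD
```
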
01:02:12.740 | So it's a more, you know,
01:02:15.320 | a little bit more sophisticated case.
01:02:17.780 | Now what we, and you know, this is what happens
01:02:19.880 | every day in our cars, right?
01:02:21.340 | So when our computer vision system is telling me
01:02:24.840 | that there is a pedestrian in front of us,
01:02:27.040 | it's not telling me that there is a pedestrian for sure,
01:02:29.360 | right, so it's telling me that I think
01:02:31.540 | that there is a pedestrian in front of us,
01:02:33.680 | and you know, I'm 80% confident,
01:02:36.680 | you know, some probability Q, okay?
01:02:39.520 | Now, you have a combination of the probability
01:02:43.640 | of the pedestrian actually being there
01:02:46.120 | and my probability of killing somebody else
01:02:48.440 | were I to swerve, right?
01:02:50.140 | Because if I swerve and kill somebody
01:02:53.540 | because of just a ghost, you know, like a false positive,
01:02:56.920 | then I'll be in serious trouble, right?
01:02:58.320 | So how do we explain that?
01:02:59.640 | Well, I thought there was somebody in front of me.
01:03:01.140 | There's nobody there, right?
01:03:02.660 | So again, you know, you do have solutions
01:03:06.020 | for some extreme cases, but then you have
01:03:08.600 | this whole two-dimensional domain now
01:03:11.080 | in which, you know, there will be a boundary.
01:03:13.400 | Where do you put the boundary, okay?
01:03:16.000 | And this is something that somebody will need to answer.
01:03:19.760 | Okay, I don't think it should be me.
01:03:22.340 | You know, of course I can come up with an answer
01:03:24.280 | when I write my code, but I actually think
01:03:27.520 | it should be you, right, in the sense
01:03:28.960 | this should be a community effort
01:03:30.680 | in which the community agrees on how the car should behave
01:03:34.480 | or, you know, in these kind of situations.
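
Extending the earlier sketch, once the perception confidence Q that the pedestrian is really there enters the picture, the decision becomes a boundary in the (Q, P) plane. The comparison below is just one plausible shape (weighing the two expected harms); it is an illustrative assumption, not a settled formula, and drawing the actual boundary is the community-level question being raised.

```python
def should_swerve_with_perception(q_pedestrian_is_real: float,
                                  p_kill_bystander: float) -> bool:
    expected_harm_if_stay = q_pedestrian_is_real   # hit the (possibly real) pedestrian
    expected_harm_if_swerve = p_kill_bystander     # hit an uninvolved bystander
    return expected_harm_if_swerve < expected_harm_if_stay
```
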
01:03:36.720 | So let me conclude by saying, you know,
01:03:39.120 | when people ask me, what do you think
01:03:40.600 | is the biggest challenge in autonomous vehicles?
01:03:43.240 | And something that I've come to realize only recently
01:03:45.840 | is that I think that the biggest challenge
01:03:47.860 | in the development of autonomous vehicle technology
01:03:50.540 | is that we do not understand in a very precise way,
01:03:54.760 | rigorous way, how we want vehicles in general,
01:03:58.440 | including human-driven vehicles to behave, okay?
01:04:02.560 | A lot of these rules of the road are just like
01:04:05.200 | a giant pile of, I wouldn't say garbage,
01:04:08.540 | but almost, you know, it's very uncertain language,
01:04:11.680 | very, you know, no rigorous laws, right, or rules.
01:04:15.840 | For example, a lot of the rules are predicated
01:04:18.400 | on a concept of right-of-way.
01:04:21.200 | You know, I looked everywhere.
01:04:24.940 | There is not a single definition
01:04:26.860 | of what right-of-way means in mathematical terms.
01:04:30.740 | I know that it has something to do with distance,
01:04:33.740 | has something to do with relative speed,
01:04:35.340 | maybe with absolute speed,
01:04:36.840 | but I don't know what the values are.
01:04:39.780 | I don't know what the numbers are.
01:04:40.980 | If I had to write a function,
01:04:43.020 | so if you see this car approaching
01:04:44.980 | and this car is farther away than this distance
01:04:47.300 | and the relative speed is more than this,
01:04:49.640 | then stop, otherwise go.
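
As a purely illustrative sketch, the kind of function being described might look like the one below. The structure (a yield decision in terms of distance and relative speed) follows the description; the threshold values are placeholders invented here, which is precisely the gap being pointed out.

```python
YIELD_DISTANCE_M = 30.0       # placeholder value
YIELD_REL_SPEED_MPS = 5.0     # placeholder value

def must_yield(gap_to_other_car_m: float, closing_speed_mps: float) -> bool:
    # Stop (yield) if the other car is close enough and approaching fast enough.
    return (gap_to_other_car_m < YIELD_DISTANCE_M
            and closing_speed_mps > YIELD_REL_SPEED_MPS)
```
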
01:04:51.700 | There's nobody who's telling me
01:04:53.060 | what that relationship should be,
01:04:55.180 | and I think, again, what we need is
01:04:57.960 | we need to develop a sound theory
01:05:00.420 | for these rules of the road, okay,
01:05:02.460 | that covers precisely any kind of situation
01:05:05.740 | and tells me, in any kind of situation,
01:05:07.940 | what is the right behavior and what is the wrong behavior,
01:05:10.700 | or a little bit more, maybe:
01:05:13.900 | if I have two behaviors, which one is better, okay?
01:05:18.580 | I need to be able to do the comparison.
01:05:21.060 | Now, we can use formal methods.
01:05:25.420 | I think that there is a lot of room here
01:05:26.920 | for statistical or learning-based methods,
01:05:29.460 | you know, like look at what people actually do
01:05:32.060 | and at what point people will honk at you, right,
01:05:35.620 | whether they feel that you're cutting them off
01:05:38.480 | versus they feel that you yielded to them, okay?
01:05:42.780 | So we need to develop this sound theory.
01:05:46.700 | We need to assess the behaviors
01:05:48.560 | on realized space and time trajectories.
01:05:51.460 | What you thought that you had seen, that doesn't matter,
01:05:56.000 | okay, you know, because if you say,
01:05:59.140 | well, if I didn't see the pedestrian,
01:06:01.660 | then it's not my fault that I hit them,
01:06:03.240 | well, then people will start removing sensors, right?
01:06:06.240 | So if you don't see anything,
01:06:07.500 | you can hit anything you want,
01:06:09.060 | and you're not to blame, right?
01:06:11.040 | But I really think that compliance with the rules,
01:06:16.960 | once we have these precise, rigorous rules,
01:06:19.060 | will actually drive a lot of requirements
01:06:21.520 | for the sensing and perception system
01:06:23.620 | and for the planning and control system, okay?
01:06:26.120 | So from my point of view, the main message today
01:06:29.620 | is what I think is the biggest challenge
01:06:32.620 | is that we don't know precisely
01:06:34.820 | how we want human-driven vehicles to behave, okay?
01:06:39.820 | Once we answer that question,
01:06:41.740 | I think that also designing automated vehicles
01:06:45.260 | will be much, much easier, okay?
01:06:47.660 | So let me stop here, okay,
01:06:49.820 | so I'm just giving a few references
01:06:51.740 | to some of our published work on these topics.
01:06:55.740 | And let me just conclude, okay,
01:06:59.020 | so this is the company, what we are trying to do.
01:07:03.740 | Allow me, we are also hiring,
01:07:05.580 | so if anybody's interested,
01:07:07.380 | feel free to send me an email or contact us.
01:07:11.540 | We want to double our size in the next couple of years,
01:07:14.780 | so we are hiring a couple hundred people.
01:07:17.020 | Okay, thank you for your attention.
01:07:19.380 | - Thank you, Emilio.
01:07:20.220 | (audience applauding)