
Sterling Anderson, Co-Founder, Aurora - MIT Self-Driving Cars


Chapters

0:00 Sterling Anderson
3:03 The Intelligent Co-Pilot
26:05 Technological Bottlenecks and Challenges
30:37 Go/No-Go Tests
36:17 What's Your Opinion about Cooperation of Self-Driving Vehicles


00:00:00.000 | Today we have Sterling Anderson.
00:00:02.500 | He's the co-founder of Aurora,
00:00:04.920 | an exciting new self-driving car company.
00:00:07.600 | Previously, he was the head of the Tesla Autopilot team
00:00:10.860 | that brought both the first
00:00:12.200 | and second generation Autopilot to life.
00:00:14.940 | Before that, he did his PhD at MIT
00:00:17.880 | working on shared human machine control of ground vehicles,
00:00:22.000 | the very thing I've been harping on
00:00:23.800 | over and over in this class.
00:00:25.720 | And now he's back at MIT to talk with us.
00:00:29.200 | Please give him a warm welcome.
00:00:31.160 | (audience applauding)
00:00:34.320 | - Thank you.
00:00:37.080 | It's good to be here.
00:00:37.920 | I was telling Lex just before,
00:00:39.600 | I think it's been a little while
00:00:40.980 | since I've been back to the Institute,
00:00:42.960 | and it's great to be here.
00:00:44.620 | I wanna apologize in advance.
00:00:45.840 | I've just landed this afternoon from Korea via Germany,
00:00:50.340 | where I've been spending the last week.
00:00:52.400 | And so I may speak a little slower than normal.
00:00:55.440 | Please bear with me.
00:00:56.800 | If I become incoherent or slur my speech,
00:01:00.080 | somebody flag it to me and we'll try to make corrections.
00:01:03.600 | So tonight I thought I'd chat with you a little bit
00:01:05.720 | about my journey over the last decade.
00:01:08.080 | It's been just over 10 years since I was at MIT.
00:01:11.440 | A lot has changed.
00:01:12.480 | A lot has changed for the better
00:01:13.800 | in the self-driving community.
00:01:15.300 | And I've been privileged to be a part
00:01:18.440 | of many of those changes.
00:01:19.680 | And so I wanted to talk with you a little bit
00:01:20.960 | about some of the things that I've learned,
00:01:22.200 | some of the things that I've experienced.
00:01:24.200 | And then maybe end by talking about sort of
00:01:28.000 | where we go from here and what the next steps are,
00:01:30.560 | both for the industry at large,
00:01:32.660 | but also for the company that we're building,
00:01:35.480 | that as Lex mentioned is called Aurora.
00:01:38.360 | To start out with, there are a few sort of key phases
00:01:43.360 | or transitions in my journey over the last 10 years.
00:01:46.920 | As Lex mentioned, when I started at MIT,
00:01:49.600 | I worked with Karl Iagnemma, Emilio Frazzoli, John Leonard,
00:01:53.900 | a few others on some of these sort of shared,
00:01:58.200 | adaptive automation approaches.
00:02:01.200 | I'll talk a little bit about those.
00:02:03.740 | From there, I spent some time at Tesla
00:02:06.320 | where I first led the Model X program
00:02:08.280 | as we both finished the development
00:02:11.720 | and ultimately launched it.
00:02:13.240 | I took over the autopilot program
00:02:15.800 | where we introduced a number of new,
00:02:19.920 | both active safety, but also sort of
00:02:23.380 | enhanced convenience features from auto steer
00:02:27.340 | to adaptive cruise control that we're able to refine
00:02:30.060 | in a few unique ways.
00:02:31.180 | And we'll talk a little bit about that.
00:02:33.460 | And then from there in December of last year,
00:02:35.700 | of 2016, I guess now,
00:02:38.180 | we started a new company called Aurora.
00:02:40.060 | And I'll tell you a little bit about that.
00:02:42.420 | So to start out with, when I came to MIT, it was 2007.
00:02:47.100 | The DARPA Urban Challenge was well underway at that stage.
00:02:50.320 | And one of the things that we wanted to do
00:02:52.320 | is find a way to address some of the safety issues
00:02:56.480 | in human driving earlier
00:02:59.080 | than potentially full self-driving could do.
00:03:02.240 | And so we developed what became known
00:03:03.600 | as the Intelligent Co-Pilot.
00:03:06.000 | What you see here is a simulation of that operating.
00:03:09.280 | I'll tell you a little bit more about that in just a second.
00:03:12.560 | But to explain a little bit about the methodology,
00:03:15.240 | the innovation, the key approach that we took
00:03:18.680 | that was slightly different from what traditional
00:03:21.760 | planning control theory were doing
00:03:24.680 | was instead of designing in path space for the robot,
00:03:27.920 | we instead found a way to identify, plan, optimize,
00:03:32.920 | and design a controller subject to a set of constraints
00:03:36.580 | rather than paths.
00:03:38.120 | And so what we were doing is looking for homotopies
00:03:40.400 | through an environment.
00:03:41.240 | So imagine for a moment an environment
00:03:42.920 | that's pockmarked by objects, by other vehicles,
00:03:46.640 | by pedestrians, et cetera.
00:03:48.200 | If you were to create the Voronoi diagram
00:03:52.040 | through that environment,
00:03:53.200 | you would have each unique set of paths
00:03:57.700 | or homotopies, continuously deformable paths
00:04:00.180 | that will take you from one location to another through it.
00:04:04.100 | If you then turn that into its dual,
00:04:06.600 | which is the Delaunay triangulation of said environment,
00:04:09.400 | presuming that you've got convex obstacles,
00:04:11.980 | you can then tile those together rather trivially
00:04:14.260 | to create a set of homotopies and transitions
00:04:18.360 | across which those paths can stake out
00:04:22.160 | sort of a given set of options for the human.
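
A minimal sketch of that corridor enumeration, assuming point obstacles (a convex obstacle can be reduced to a representative point for this purpose) and using SciPy; this is an illustration of the idea, not the original code:

```python
# Enumerate corridors (homotopy classes) through an obstacle field via the
# Delaunay triangulation, the dual of the Voronoi diagram mentioned above.
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical obstacle centers in a planar environment.
obstacles = np.array([[2.0, 1.0], [4.0, 3.5], [6.0, 1.5],
                      [3.0, 5.0], [7.0, 4.0]])

tri = Delaunay(obstacles)

# Each shared edge between two triangles is a "gate" between obstacles.
# A homotopy class of paths is identified by the sequence of gates crossed,
# so walks through this adjacency graph enumerate the distinct corridors.
corridor_graph = {i: [n for n in neighbors if n != -1]
                  for i, neighbors in enumerate(tri.neighbors)}
print(corridor_graph)
```
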
00:04:25.280 | It turns out
00:04:27.240 | this tends to be a more intuitive way
00:04:29.760 | of imposing certain constraints on human operation
00:04:32.920 | rather than enforcing that the ego vehicle
00:04:37.600 | stick to some arbitrary position within
00:04:40.860 | some distance of a safe path.
00:04:42.840 | You instead look to enforce only
00:04:46.080 | that the state of the vehicle remain
00:04:48.560 | within a constraint-bounded n-dimensional tube
00:04:51.320 | in state space.
00:04:52.560 | Those constraints being spatial,
00:04:54.040 | you imagine for a moment edges of the roadway
00:04:56.760 | or circumventing various objects in the roadway.
00:05:00.100 | Imagine them also being dynamic, right?
00:05:03.600 | So limits of tire friction
00:05:05.420 | impose limits on side slip angles.
00:05:10.560 | And so using that,
00:05:11.960 | what we did is found a way to create those homotopies,
00:05:14.340 | forward simulate the trajectory of the vehicle
00:05:17.980 | given its current state
00:05:19.220 | and some optimal set of control inputs
00:05:22.700 | that would optimize its stability through that.
00:05:24.980 | We use model predictive control in that work.
00:05:27.700 | And then taking that forward simulated trajectory,
00:05:32.360 | computing some metric of threat.
00:05:34.860 | For instance, if the objective function for that
00:05:38.740 | maneuver is to maximize stability
00:05:41.220 | or minimize some of these parameters like wheel side slip,
00:05:44.860 | then wheel side slip is a fairly good indication
00:05:48.060 | of how threatening that optimal maneuver is becoming.
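
A toy, self-contained stand-in for that computation follows. The original work used model predictive control to find the optimal maneuver; this version just samples candidate steering sequences, keeps those that stay inside the spatial constraint tube, and scores threat by how close the best one comes to a side-slip limit. All numbers are made up.

```python
import numpy as np

DT, HORIZON = 0.1, 20          # 2-second forward simulation
LANE_HALF_WIDTH = 1.8          # spatial constraint on lateral position (m)
SLIP_LIMIT = np.radians(8.0)   # dynamic constraint on side slip (rad)
WHEELBASE, SPEED = 2.7, 15.0   # vehicle parameters (m, m/s)

def rollout(state, steer_seq):
    """Kinematic bicycle rollout; returns lateral positions and slip angles."""
    x, y, yaw = state
    ys, slips = [], []
    for steer in steer_seq:
        slip = np.arctan(0.5 * np.tan(steer))        # kinematic slip proxy
        yaw += SPEED / WHEELBASE * np.tan(steer) * DT
        x += SPEED * np.cos(yaw + slip) * DT
        y += SPEED * np.sin(yaw + slip) * DT
        ys.append(y)
        slips.append(abs(slip))
    return np.array(ys), np.array(slips)

def threat(state, n_samples=200, rng=np.random.default_rng(0)):
    """Threat of the best available maneuver: ~0 when easy, 1 at the limits."""
    best = np.inf
    for _ in range(n_samples):
        seq = rng.uniform(-0.4, 0.4, HORIZON)        # candidate steering
        ys, slips = rollout(state, seq)
        if np.all(np.abs(ys) < LANE_HALF_WIDTH):     # stays inside the tube
            best = min(best, slips.max() / SLIP_LIMIT)
    return 1.0 if not np.isfinite(best) else min(best, 1.0)

print(threat((0.0, 1.2, 0.1)))  # offset from center, heading toward the edge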
00:05:52.820 | And so what we did is then use that
00:05:54.340 | in a modulation of control between the human and the car,
00:05:58.340 | such that should the car ever find itself in a state
00:06:01.460 | where that forward simulated optimal trajectory
00:06:04.380 | is very near the limits of what the vehicle
00:06:06.300 | can actually handle,
00:06:08.000 | we will have transitioned control fully
00:06:09.960 | to the vehicle, to the automated system
00:06:12.580 | so that it can avoid an accident.
00:06:14.420 | And then it transitions back in some manner.
00:06:16.140 | And we played with a number of different methods
00:06:20.100 | of transitioning this control
00:06:22.020 | to ensure that we didn't throw off the human mental model,
00:06:27.020 | which was one of the key concerns.
00:06:29.460 | We also wanted to make sure
00:06:30.820 | that we were able to arrest accidents before they happen.
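
The arbitration itself can be as simple as a threat-weighted blend of the two commands. This is an illustrative scheme, not the published one; the `engage` and `full` thresholds are invented:

```python
def blend_steering(u_human: float, u_auto: float, threat: float,
                   engage: float = 0.5, full: float = 0.9) -> float:
    """Blend human and automation steering commands by threat level.

    Below `engage` the human keeps full authority; by `full` the
    automation has taken over completely, then hands back as threat falls.
    """
    k = min(max((threat - engage) / (full - engage), 0.0), 1.0)
    return (1.0 - k) * u_human + k * u_auto
```
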
00:06:34.380 | What you see here is a simulation
00:06:37.480 | that was fairly faithful to the behavior we saw
00:06:42.060 | in test drivers up in Dearborn, Michigan.
00:06:45.420 | Ford provided us with a Jaguar S-Type to test this on.
00:06:48.840 | And what we did, so what you see here
00:06:51.740 | is there's a blue vehicle and a gray vehicle.
00:06:53.340 | In both cases, we have a poorly tuned driver model,
00:06:57.860 | in this case, a pure pursuit controller
00:07:00.060 | with a fairly short look ahead,
00:07:02.020 | shorter than would be appropriate
00:07:04.120 | given this scenario and these dynamics.
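
For reference, a bare-bones pure pursuit steering law like that driver model; with a lookahead much shorter than the dynamics call for, it oscillates and diverges, which is the failure shown next. A sketch, not the code used in the study:

```python
import numpy as np

def pure_pursuit_steer(pose, path, lookahead, wheelbase):
    """pose = (x, y, yaw); path = Nx2 array of waypoints; returns steer angle."""
    x, y, yaw = pose
    dists = np.hypot(path[:, 0] - x, path[:, 1] - y)
    beyond = np.flatnonzero(dists > lookahead)
    goal = path[beyond[0]] if beyond.size else path[-1]   # first point past L
    alpha = np.arctan2(goal[1] - y, goal[0] - x) - yaw    # heading error
    return np.arctan2(2.0 * wheelbase * np.sin(alpha), lookahead)
```
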
00:07:07.520 | The gray vehicle is without
00:07:09.120 | the intelligent copilot in the loop.
00:07:12.240 | You'll notice that obviously the driver becomes unstable,
00:07:15.480 | loses control and leaves the safe roadway.
00:07:18.380 | The copilot, remember, is interested
00:07:21.920 | not in following any given path.
00:07:23.880 | It doesn't care where the vehicle lands on this roadway,
00:07:27.600 | provided it remains inside the road.
00:07:30.240 | In the blue vehicle's case,
00:07:33.040 | it's the exact same human driver model,
00:07:35.640 | now with the copilot in the loop.
00:07:37.800 | You'll notice that as this scenario continues,
00:07:42.520 | what you see here on the left is in this green bar
00:07:46.040 | is the portion of available control authority
00:07:48.400 | that's being taken by the automated system.
00:07:50.000 | You'll notice that it never exceeds
00:07:51.360 | half of the available control,
00:07:52.500 | which is to say that the steering inputs
00:07:54.800 | received by the vehicle end up being a blend
00:07:58.200 | of what the human and what the automation are providing.
00:08:03.280 | And what results is a path for the blue vehicle
00:08:07.680 | that actually better tracks the human's intended trajectory
00:08:12.320 | than even the copilot understood.
00:08:15.280 | Again, the copilot is keeping the vehicle stable,
00:08:17.520 | is keeping it on the road.
00:08:18.900 | The human is hewing to the center line of that roadway.
00:08:23.040 | So there were some very interesting things
00:08:24.280 | that came out of this.
00:08:25.120 | We did a lot of work
00:08:27.800 | in understanding what kind of feedback
00:08:30.240 | was most natural to provide to a human.
00:08:32.080 | Our biggest concern was if you throw off
00:08:34.560 | a human's mental model by causing the vehicle's behaviors
00:08:38.040 | to deviate from what they expect it to do
00:08:41.200 | in response to various control inputs,
00:08:43.040 | that that could be a problem.
00:08:43.920 | So we tried various things from adjusting,
00:08:46.640 | for instance, one of the key questions
00:08:48.320 | that we had early on was if we couple the computer control
00:08:53.320 | and the human control via a planetary gear
00:08:57.200 | and allow the human to feel actually a backwards torque
00:09:02.200 | to what the vehicle is doing.
00:09:03.240 | So the car starts to turn right,
00:09:05.120 | human will feel the wheel turn left.
00:09:07.280 | They'll see it start to turn left.
00:09:09.100 | Is that more confusing or less confusing to a human?
00:09:11.920 | And it turns out it depends on how experienced
00:09:13.960 | that human is.
00:09:14.800 | Some drivers will modulate their inputs
00:09:17.720 | based on the torque feedback
00:09:19.000 | that they feel through the wheel.
00:09:19.920 | And for instance, a very experienced driver
00:09:22.240 | expects to feel the wheel pull left
00:09:24.880 | when they're turning right.
00:09:26.640 | However, less experienced drivers,
00:09:28.960 | in response to seeing the wheel turning opposite
00:09:31.400 | to what the car is supposed to be doing,
00:09:32.960 | that's a rather confusing experience.
00:09:35.200 | So there were a lot of really interesting
00:09:37.600 | human interface challenges that we were dealing with here.
00:09:41.960 | We ended up working through a lot of that,
00:09:46.360 | developing a number of sort of micro applications for it.
00:09:51.360 | One of those, at the time,
00:09:54.720 | Gil Pratt was leading a DARPA program
00:09:57.160 | focused on what they call at the time
00:09:59.000 | maximum mobility manipulation.
00:10:00.700 | We decided to see what this system could do
00:10:05.920 | in application to unmanned ground vehicles.
00:10:08.880 | So in this case, what you see is a human driver
00:10:12.560 | sitting at a remote console,
00:10:14.040 | as one would when operating an unmanned vehicle,
00:10:18.420 | for instance, in the military.
00:10:19.920 | What you see on the left, top left,
00:10:23.160 | is the top down view of what the vehicle sees.
00:10:27.760 | I should have played this in repeat mode.
00:10:29.800 | With bounding boxes, bounding various cones.
00:10:32.880 | And what we did is we set up about 20 drivers,
00:10:35.120 | 20 test subjects, looking at this control screen
00:10:40.120 | and operating the vehicle through this track.
00:10:44.080 | And we set this up as a race
00:10:46.560 | with prizes for the winners, as one would expect,
00:10:51.000 | and penalized them for every barrel they hit.
00:10:55.040 | If they knocked over the barrel,
00:10:56.140 | I think they got a five second penalty.
00:10:57.720 | If they brushed a barrel, they got a one second penalty.
00:11:00.400 | And they were to cross the field as fast as possible.
00:11:03.200 | And they had no line of sight connection to the vehicle.
00:11:05.240 | And we played with some things on their interface.
00:11:07.140 | We caused it to drop out occasionally.
00:11:10.640 | We delayed it, as one would realistically expect
00:11:13.640 | in the field.
00:11:15.080 | And then we either engaged or didn't engage the co-pilot
00:11:19.140 | to try to understand what effect that had
00:11:20.800 | on their performance and their experience.
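
In other words, each run was scored roughly like this (penalty values as recalled in the talk):

```python
def run_score(elapsed_s: float, knocked: int, brushed: int) -> float:
    """Race score: elapsed time plus 5 s per knocked barrel, 1 s per brush."""
    return elapsed_s + 5.0 * knocked + 1.0 * brushed
```
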
00:11:23.160 | And what we found was not surprisingly,
00:11:24.800 | the incidence of collisions declined.
00:11:27.320 | It declined by about 72% when the co-pilot was engaged
00:11:30.300 | versus when it was not.
00:11:32.120 | We also found that even with that 72% decline
00:11:36.720 | in collisions, the speed increased by,
00:11:39.800 | I'm blanking on the amount, but it was 20 to 30 percent.
00:11:43.500 | Finally, and perhaps most interesting to me,
00:11:47.360 | after every run, I would ask the driver,
00:11:50.420 | and again, these were blind tests.
00:11:52.000 | They didn't know if the co-pilot was active or not.
00:11:53.560 | And I would ask them, how much control did you feel
00:11:55.920 | like you had over the vehicle?
00:11:57.760 | And I found that there was a statistically significant
00:12:01.680 | increase of about 12% when the co-pilot was engaged.
00:12:05.360 | That is to say, drivers reported feeling more control
00:12:08.240 | of the vehicle 12% more of the time
00:12:11.200 | when the co-pilot was engaged than when it wasn't.
00:12:13.840 | And then I looked at the statistics.
00:12:15.400 | It turns out that the average level of control
00:12:17.960 | that the co-pilot was taking was 43%.
00:12:20.540 | So they were reporting that they felt more in control
00:12:23.700 | when in fact they were 43% less in control,
00:12:26.260 | which was interesting and I think bears a little bit
00:12:31.460 | on sort of the human psyche in terms of,
00:12:34.620 | they were reporting the vehicle was doing
00:12:37.040 | what I wanted it to do, maybe not what I told it to do,
00:12:39.780 | which was kind of a fun observation.
00:12:42.540 | And I think the most enjoyable part of this
00:12:46.020 | was getting together with the whole group
00:12:49.100 | at the end of the study and presenting some of this
00:12:51.360 | and seeing some of the reactions.
00:12:53.020 | So from there, we looked at a few other areas.
00:12:59.180 | Karl Iagnemma and I looked at a few different opportunities
00:13:04.540 | to commercialize this.
00:13:05.540 | Again, this was years ago and the industry
00:13:08.240 | was in a very different place than it is today.
00:13:10.620 | We started a company first called Gimlet,
00:13:12.340 | then another called Ride.
00:13:15.340 | This is the logo, it may look familiar to you.
00:13:17.500 | We turned that into, at the time it intended
00:13:21.580 | to roll this out across various automakers
00:13:24.580 | in their operations.
00:13:29.300 | At the time, very few saw self-driving as a technology
00:13:34.300 | that was really gonna impact their business going forward.
00:13:38.340 | In fact, even ride sharing at the time
00:13:42.300 | was a fairly new concept that was, I think,
00:13:45.820 | to a large degree viewed as unproven.
00:13:47.660 | So, as I mentioned, December of last year,
00:13:52.620 | I co-founded Aurora with a couple of folks
00:13:59.860 | who have been making significant progress
00:14:02.540 | in this space for many years.
00:14:03.840 | Chris Urmson, who formerly led
00:14:06.420 | Google's self-driving car group.
00:14:08.380 | Drew Bagnell, a professor at Carnegie Mellon University,
00:14:11.140 | exceptional in machine learning and applied machine learning,
00:14:14.900 | was one of the founding members
00:14:16.020 | of Uber's self-driving car team
00:14:17.440 | and led autonomy and perception there.
00:14:19.340 | We felt like we had a unique opportunity
00:14:22.180 | at the convergence of a few things.
00:14:24.160 | One, the automotive world has really come
00:14:28.980 | into the full-on realization that self-driving,
00:14:31.740 | and particularly self-driving and ride sharing
00:14:33.940 | and vehicle electrification are three vectors
00:14:36.380 | that will change the industry.
00:14:38.780 | That was something that didn't exist 10 years ago.
00:14:41.980 | Two, significant advances have been made
00:14:44.980 | in some of these machine learning techniques,
00:14:48.300 | in particular deep learning
00:14:50.140 | and other neural network approaches
00:14:53.420 | in the computers that run them
00:14:56.820 | and the availability of low-power GPU and TPU options
00:15:01.820 | to really do that well.
00:15:04.460 | In sensing technologies, in high-resolution radar,
00:15:07.780 | and a lot of the lidar development.
00:15:09.700 | So it's really a unique time in the self-driving world.
00:15:12.180 | A lot of these things are really coming together now.
00:15:15.380 | And we felt like by bringing together an experienced team,
00:15:18.640 | we had an interesting opportunity
00:15:20.100 | to build from a clean sheet a new platform,
00:15:25.060 | a new self-driving architecture
00:15:27.900 | that leveraged the latest advances
00:15:29.260 | in applied machine learning together with our experience
00:15:35.660 | of where some of the pitfalls tend to be down the road
00:15:38.460 | as you develop these systems.
00:15:39.500 | Because you don't tend to see them early on.
00:15:41.220 | They tend to express themselves
00:15:42.740 | as you get into the long tail of corner cases
00:15:45.260 | that you end up needing to resolve.
00:15:47.860 | So we've built that team.
00:15:49.580 | We have offices in Palo Alto, California
00:15:52.220 | and Pittsburgh, Pennsylvania.
00:15:54.180 | We've got fleets of vehicles
00:15:55.500 | operating in both Palo Alto and Pennsylvania.
00:15:57.780 | A couple of weeks ago, we announced that Volkswagen Group,
00:16:01.500 | one of the largest automakers in the world,
00:16:03.060 | Hyundai Motor Company,
00:16:04.020 | also one of the largest automakers in the world,
00:16:06.900 | have both partnered with Aurora.
00:16:08.980 | We will be developing and are developing with them
00:16:12.380 | a set of platforms that ultimately will scale
00:16:15.380 | our technology on their vehicles across the world.
00:16:19.100 | And on one of the important elements of building this:
00:16:23.300 | I asked Lex before coming out here
00:16:25.580 | what this group would be most interested in hearing.
00:16:28.500 | One of the things that he mentioned was
00:16:29.660 | what does it take to build a self-driving,
00:16:31.660 | build a new company in a space like this?
00:16:34.140 | One of the things that we found very important
00:16:36.980 | was a business model that was non-threatening to others.
00:16:40.460 | We recognize that our strengths
00:16:44.100 | and our experience over the last,
00:16:45.940 | in my case, a decade, in Chris's case, almost two,
00:16:48.460 | really lie in the development of the self-driving systems.
00:16:53.380 | Not in building vehicles,
00:16:55.740 | though I have had some experience there,
00:16:57.220 | but in developing the self-driving.
00:16:59.100 | So our feeling was if our mission
00:17:02.100 | is to get this technology to market as quickly,
00:17:03.980 | as broadly, and as safely as possible,
00:17:05.860 | that mission is best served by playing our position
00:17:10.380 | and working well with others who can play theirs,
00:17:13.460 | which is why you see the model that we've adopted
00:17:16.260 | and now you'll start to see some of the fruits of that
00:17:19.180 | through these partnerships with some of these automakers.
00:17:21.580 | So at the end of the day, our aspiration and our hope
00:17:24.780 | is for this technology that is so important in the world
00:17:28.740 | in increasing safety, in improving access to transportation,
00:17:32.740 | and in improving efficiency
00:17:33.900 | and the utilization of our roadways in our cities.
00:17:36.300 | This is maybe the first talk I've ever given
00:17:39.660 | where I didn't start by rattling off statistics
00:17:41.700 | about safety and all these other things.
00:17:44.500 | If you haven't heard them yet, you should look them up.
00:17:46.980 | They're stark, right?
00:17:48.900 | The fact that most vehicles in the United States today
00:17:52.780 | have, on average,
00:17:54.820 | three parking spaces allocated to them.
00:17:57.940 | The amount of land that's taken up across the world
00:18:02.180 | in housing vehicles that are used
00:18:04.660 | less than 5% of the time.
00:18:06.540 | The number of people, I think in the United States,
00:18:10.660 | the estimate has been somewhere between
00:18:12.060 | six and 15 million people don't have access
00:18:15.060 | to the transportation they need,
00:18:16.820 | because they're elderly or disabled
00:18:18.820 | or one of many other factors.
00:18:21.500 | And so this technology is potentially
00:18:24.940 | one of the most impactful for our society
00:18:27.140 | in the coming years.
00:18:28.780 | It's a tremendously exciting technological challenge.
00:18:31.820 | And at the confluence of those two things,
00:18:34.660 | I think is a really unique opportunity for engineers
00:18:38.500 | and others who are not engineers
00:18:39.900 | who really wanna get involved to play a role
00:18:42.500 | in changing our world going forward.
00:18:45.060 | So with that, maybe I'll stop with this
00:18:48.460 | and we can go to questions.
00:18:52.420 | Let's give Sterling a warm hand.
00:18:55.020 | (audience applauding)
00:18:57.740 | - Hi, I'm Wayne, hello, thanks for coming.
00:18:59.660 | I have a question.
00:19:00.500 | A lot of self-driving car companies
00:19:02.860 | are making extensive use of LiDAR,
00:19:05.620 | but you don't see a lot of that with Tesla.
00:19:07.540 | I wanted to know if you had any thoughts about that.
00:19:10.420 | - Yeah, I don't wanna talk about Tesla too much
00:19:12.260 | in terms of our specific,
00:19:13.940 | anything that wasn't public information
00:19:16.060 | I'm not gonna get into.
00:19:17.380 | I will say that for Aurora,
00:19:19.940 | we believe that the right approach
00:19:21.580 | is getting to market quickly
00:19:23.500 | and doing so safely.
00:19:26.140 | And you get to market most quickly and safely
00:19:27.860 | if you leverage multiple modalities, including LiDAR.
00:19:30.860 | Just to clarify what's running in the background:
00:19:34.700 | these are all just Aurora videos
00:19:36.300 | of our cars driving on various test routes.
00:19:40.460 | Yeah.
00:19:41.300 | - Hi, I'm Luke from the Sloan School.
00:19:42.940 | A lot of customers have visceral type connections
00:19:45.460 | to their automobile.
00:19:46.660 | I was wondering how you see that market,
00:19:48.460 | the car enthusiast market being affected by AVs
00:19:51.420 | and then vice versa,
00:19:52.300 | how the AVs will be designed around those type of customers.
00:19:56.300 | - Yeah, that's a good question.
00:19:57.140 | Thanks for asking.
00:19:58.180 | I am one of those enthusiasts.
00:20:00.540 | I very much appreciate being able to drive a car
00:20:05.540 | in certain settings.
00:20:08.660 | I very much don't appreciate driving in others.
00:20:11.580 | I remember distinctly several evenings,
00:20:16.140 | I almost literally pounding my steering wheel
00:20:18.900 | sitting in Boston traffic, on my way to somewhere.
00:20:23.900 | I do the same in San Francisco.
00:20:27.340 | I think the opportunity really is to turn that,
00:20:30.780 | turn sort of personal vehicle ownership and driving
00:20:34.820 | into more of a sport and something you do for leisure.
00:20:37.660 | I see it, a gentleman some time ago asked me,
00:20:45.420 | hey, don't you think this is a problem for the country?
00:20:50.420 | I think you meant the world.
00:20:52.500 | If people don't learn how to drive,
00:20:55.060 | that's just something a human should know how to do.
00:20:57.780 | My perspective is it's as much of a problem
00:21:01.620 | as people not intrinsically knowing
00:21:03.540 | how to ride a horse today.
00:21:05.180 | If you wanna know how to ride a horse, go ride a horse.
00:21:07.860 | If you wanna race a car, go to a racetrack
00:21:10.740 | or go out to a mountain road that's been allocated for it.
00:21:14.540 | Ultimately, I think there is an important place for that
00:21:17.620 | because I certainly agree with you.
00:21:19.020 | I'm very much a vehicle enthusiast myself,
00:21:21.300 | but I think there is so much opportunity here
00:21:26.020 | in alleviating some of these other problems,
00:21:28.940 | particularly in places where it's not fun to drive,
00:21:31.460 | that I think there's a place for both.
00:21:33.900 | Yeah.
00:21:34.740 | - Hi, can you hear or do I need to get?
00:21:40.180 | - Yeah. - Yeah.
00:21:41.580 | Congratulations on the partnership
00:21:43.540 | that was announced recently, I think.
00:21:46.100 | So I have a two-part question.
00:21:47.420 | The first one is, so we heard last week from,
00:21:51.300 | I think there was a gentleman from Waymo
00:21:53.220 | talking about how long they've been working
00:21:54.740 | on this autonomous car technology.
00:21:57.380 | And you seem to have ramped up extremely fast.
00:22:00.420 | So is there a licensing model that you've taken?
00:22:04.940 | I mean, how are you able to commercialize
00:22:06.860 | the technology in one year?
00:22:10.700 | - So just to be clear, we're not actually commercializing.
00:22:13.980 | Just to distinguish, we are partnering
00:22:18.260 | and developing vehicles and we'll ultimately
00:22:20.380 | be running pilots, as we announced a week or two ago
00:22:24.180 | with the MOIA shuttles.
00:22:26.100 | We are, however, I will distinguish that
00:22:28.420 | from broad commercialization of the technology.
00:22:31.020 | And I don't wanna get too much into the nuances
00:22:36.500 | of that business model.
00:22:38.420 | I will say that it is one that's done
00:22:41.140 | in very close partnership with our automotive partners.
00:22:44.020 | Because at the end of the day, they understand their cars,
00:22:48.180 | they understand their customers,
00:22:49.740 | they have distribution networks.
00:22:51.420 | Our automotive partners,
00:22:54.460 | provided they have
00:22:57.780 | the right support in developing the self-driving technology,
00:22:59.980 | are fairly well positioned to roll it out at scale.
00:23:04.700 | - So the second part of my question is,
00:23:07.820 | again, looking at this pace of adoption
00:23:09.900 | and the maturity of technology,
00:23:11.540 | do you see an open source model for autonomous cars
00:23:16.580 | as they become more and more?
00:23:18.140 | - Unclear.
00:23:20.380 | I'm not convinced that an open source model
00:23:24.500 | is what gets to market most quickly.
00:23:27.540 | In the long run, it's not clear to me what will happen.
00:23:34.980 | I think there will be a handful of successful
00:23:38.380 | self-driving stacks that will make it.
00:23:41.420 | Nowhere near the number of self-driving companies today,
00:23:45.420 | but a handful, I think.
00:23:47.100 | - Two questions, one is, invariably,
00:23:53.460 | in new product development, there are typically
00:23:55.540 | two types of bottlenecks.
00:23:57.060 | There's a technological bottleneck
00:23:58.820 | and an economic bottleneck.
00:24:00.260 | Technological bottleneck might be,
00:24:03.700 | hey, the sensors aren't good enough
00:24:05.900 | or the machine learning algorithms
00:24:07.380 | aren't good enough and so on.
00:24:08.540 | I'd be interested to hear,
00:24:09.620 | and it'll shift, obviously, over time.
00:24:11.980 | So I'd be interested to know what you would say
00:24:13.580 | is the current thing that if, hey,
00:24:16.140 | if this part of the architecture was 10 times better,
00:24:19.260 | we would, and then on the economic side,
00:24:21.300 | I'd be interested to know, gee,
00:24:22.860 | if sensors were 100 times cheaper,
00:24:25.820 | then, so I'd be interested to hear your perspective
00:24:27.940 | on both of those. - That's a great question.
00:24:30.500 | Let me start with the economic side of it,
00:24:33.060 | and just to get that out of the way
00:24:34.060 | 'cause it's a little bit quicker answer.
00:24:36.140 | The economics of operating a self-driving vehicle
00:24:40.580 | in a shared network today would close,
00:24:45.180 | that business case closes even with a high cost of sensors.
00:24:49.060 | That is not what's stopping us.
00:24:51.140 | And that's part of why the gentleman earlier who asked,
00:24:55.420 | should you use LiDAR or not?
00:24:58.780 | If your target is to initially deploy these in fleets,
00:25:03.060 | you would be wise to start at the top end of the market,
00:25:06.180 | develop and deploy a system that's as capable as possible,
00:25:09.300 | as quickly as possible, and then cost it down over time.
00:25:12.860 | And you can do that as computer vision,
00:25:14.620 | precision recall increase.
00:25:16.420 | Today, they're not good enough, right?
00:25:18.300 | And so economically, depending on your model
00:25:23.300 | of going to market, and we believe that the right model
00:25:25.980 | is through mobility services,
00:25:31.780 | you can cost down, you'll cost down the sensors.
00:25:34.380 | Inevitably, there's no unobtainium in LiDAR units today.
00:25:37.940 | There's no reason fundamentally that a shared cost
00:25:40.780 | of a LiDAR unit will lead you to a $70,000 price point.
00:25:43.740 | However, if you build anything in low enough volumes,
00:25:47.580 | it's gonna be expensive.
00:25:48.980 | Many of these things will work their way
00:25:50.700 | into the standard automotive process.
00:25:53.020 | They'll work their way into tier one suppliers,
00:25:55.100 | and when they do, the automotive community
00:25:58.060 | has shown themselves to be able to do that.
00:26:00.260 | They've shown themselves to be exceptional
00:26:01.860 | at driving those costs down,
00:26:02.900 | and so I expect them to come way down.
00:26:05.540 | To your other question,
00:26:06.940 | technological bottlenecks and challenges.
00:26:08.980 | One of the key challenges of self-driving
00:26:12.300 | is and remains that of forecasting the intent
00:26:17.300 | and future behaviors of other actors,
00:26:20.780 | both in response to one another,
00:26:21.980 | but also in response to your own decisions and motion.
00:26:25.340 | That's a perception problem,
00:26:28.500 | but it's something more than a perception problem.
00:26:30.300 | It's also a prediction problem,
00:26:33.180 | and there are a number of different things
00:26:38.540 | that have to come together to solve this.
00:26:40.580 | We're excited about some of the tools that we're using
00:26:43.980 | in interleaving various modern machine learning techniques
00:26:48.340 | throughout the system to do things like project
00:26:52.140 | our own behaviors that were learned
00:26:54.220 | for the ego vehicle on others,
00:26:55.660 | and assume that they'll behave as we would
00:26:57.740 | had we been in that situation.
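
Schematically, that projection amounts to rolling the ego policy forward from the other actor's state. Everything here (the policy interface, the unicycle update, the numbers) is a hypothetical stand-in, not Aurora's API:

```python
import math
from typing import Callable, List, Tuple

State = Tuple[float, float, float, float]          # x, y, heading, speed
Policy = Callable[[State], Tuple[float, float]]    # state -> (accel, yaw rate)

def predict_other_agent(ego_policy: Policy, other_state: State,
                        horizon_s: float, dt: float = 0.1) -> List[State]:
    """Predict another actor by asking 'what would we do from their state?'"""
    state, traj = other_state, []
    for _ in range(int(horizon_s / dt)):
        accel, yaw_rate = ego_policy(state)        # ego's learned behavior
        x, y, h, v = state
        state = (x + v * math.cos(h) * dt,
                 y + v * math.sin(h) * dt,
                 h + yaw_rate * dt,
                 max(v + accel * dt, 0.0))
        traj.append(state)
    return traj

# Stand-in policy: hold speed, drift gently left.
print(predict_other_agent(lambda s: (0.0, 0.05), (0.0, 0.0, 0.0, 10.0), 1.0)[-1])
```
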
00:26:59.260 | - Like an expert system kind of approach, right?
00:27:01.220 | - Yeah, yeah.
00:27:02.700 | You assume nominal behavior,
00:27:04.700 | and you guard against off-nominal, right?
00:27:07.500 | But it's very much,
00:27:08.740 | it's not a solved problem, I wouldn't say.
00:27:10.980 | It's very much as you get into that
00:27:13.780 | really long tail of development,
00:27:15.500 | when you're no longer putting out demonstration videos,
00:27:20.780 | but you're instead just putting your head down
00:27:22.340 | and eking out those final nines,
00:27:25.540 | that's the kind of problem you tend to deal with.
00:27:27.980 | - Thank you.
00:27:29.700 | - So this question isn't necessarily about
00:27:34.580 | the development of self-driving cars,
00:27:37.140 | but more of like an ethics question.
00:27:39.100 | When you're putting human lives
00:27:41.180 | into the hands of software,
00:27:43.160 | isn't there always the possibility for outside agents
00:27:46.180 | with malicious intent to use it for their own gain?
00:27:49.820 | And how do you guys, if you do have a plan,
00:27:52.500 | how do you intend to protect against attacks like that?
00:27:57.380 | - So security is a very real
00:28:00.100 | aspect of this that has to be solved.
00:28:05.860 | It's a constant game of cat and mouse.
00:28:09.940 | And so I think it just requires a very good team
00:28:13.900 | and a concerted effort over time.
00:28:15.700 | I don't think you solve it once,
00:28:18.620 | and I certainly wouldn't pretend to have a plan
00:28:22.140 | that solves it and is done with it.
00:28:23.940 | We try to leverage best practices where we can
00:28:28.260 | in the fundamental architecture of the system
00:28:29.900 | to make it less exposed,
00:28:32.460 | in particular key parts of the system,
00:28:35.060 | less exposed to nefarious actions of others.
00:28:38.260 | But at the end of the day, it's just a constant,
00:28:40.660 | it's a constant development effort.
00:28:43.460 | - Thank you for being here.
00:28:47.660 | So I had a question about what opportunities
00:28:50.220 | self-driving cars open up.
00:28:51.980 | Since driving has kind of been designed around
00:28:54.020 | like a human being at the center since the beginning,
00:28:56.940 | if you put a computer at the center,
00:28:59.500 | what society-wide differences,
00:29:02.460 | and maybe even within individual car differences
00:29:04.660 | that open up, could cars go 150 miles an hour
00:29:07.980 | on the highway and get places much faster?
00:29:09.980 | Would cars look differently when a human
00:29:12.380 | doesn't need to be paying attention and stuff like that?
00:29:14.300 | - Yeah, I think the answer is yes.
00:29:16.180 | And that's something that's very exciting, right?
00:29:19.980 | So one of the,
00:29:21.300 | I think one of the unique opportunities
00:29:24.860 | that automakers in particular have
00:29:27.260 | when self-driving technology gets incorporated
00:29:29.380 | into their vehicles is they can do things
00:29:31.300 | like differentiate the user experience.
00:29:33.940 | They can provide services,
00:29:35.380 | augmented reality services or location services,
00:29:41.660 | many other sort of, it opens a new window
00:29:44.340 | into an entirely new market that automakers
00:29:46.780 | haven't historically played in.
00:29:49.940 | And it allows them to change the very vehicles themselves.
00:29:54.940 | As you mentioned, the interior can change.
00:29:58.100 | As we validate some of these self-driving systems
00:30:03.020 | and confirm that they do in fact reduce
00:30:06.260 | the rate of collisions as we hope they will,
00:30:08.580 | you can start to pull out a lot of the extra mass
00:30:14.460 | and other things that we've added to vehicles
00:30:17.100 | to make them more passively safe, right?
00:30:19.580 | Roll cages, crumple zones, airbags,
00:30:22.380 | a lot of these things, presumably in a world
00:30:26.820 | where we don't crash, there is much less need
00:30:30.820 | for passive safety systems.
00:30:32.380 | So yes.
00:30:34.220 | - Hi, I have a question about the go or no go test
00:30:39.820 | that you conduct for certain features
00:30:41.460 | like you mentioned the throttle control
00:30:43.020 | where you slow down the throttle,
00:30:45.420 | assuming that the driver has pressed the wrong pedal.
00:30:48.580 | When you test, when do you decide to launch that feature?
00:30:51.060 | How do you know it's definitely gonna work
00:30:53.140 | in all scenarios because your dataset might not be--
00:30:55.300 | - Oh, it's a statistical evaluation in every case, right?
00:30:58.440 | You're right.
00:31:00.420 | This is part of the art of self-driving vehicle development
00:31:05.220 | is you will never have comprehensively captured
00:31:09.500 | every case, every scenario.
00:31:11.660 | That is as...
00:31:12.940 | Some of you may wanna correct me on this.
00:31:18.260 | I think that's an unbounded set.
00:31:20.420 | It may in fact be bounded at some point,
00:31:22.780 | but I think it's unbounded.
00:31:22.780 | And so you'll never actually have characterized everything.
00:31:26.980 | What you will have done, hopefully, if you do it right,
00:31:29.980 | is you will have established with a reasonable degree
00:31:33.060 | of confidence that you can perform at a level of safety
00:31:35.740 | that's better than the average human driver.
00:31:37.660 | And once you've reached that threshold
00:31:39.140 | and you're confident that you've reached that threshold,
00:31:41.620 | I think the opportunity to launch is real
00:31:46.180 | and you should seriously consider it.
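
One common shape for that statistical argument: bound the system's failure rate from above using the exposure accumulated so far, and compare the bound to a human baseline. The numbers below are purely illustrative:

```python
from scipy.stats import beta

def failure_rate_upper_bound(failures: int, n: int, conf: float = 0.95) -> float:
    """One-sided Clopper-Pearson upper bound on a binomial failure rate."""
    if failures >= n:
        return 1.0
    return float(beta.ppf(conf, failures + 1, n - failures))

HUMAN_BASELINE = 1 / 500_000   # hypothetical incidents per mile
ub = failure_rate_upper_bound(failures=0, n=2_000_000)   # 2M clean miles
print(ub, ub < HUMAN_BASELINE)  # launch only once the bound beats the human
```
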
00:31:48.140 | - So thank you for your talk today first.
00:31:52.500 | And my question is,
00:31:54.580 | self-driving seems to be able to ultimately
00:31:56.900 | take over the world to some extent,
00:32:00.140 | but just like other technologies today
00:32:02.860 | that open up new opportunities,
00:32:04.300 | but also bring in adverse effects.
00:32:06.880 | So how do you respond to fear and negative effects
00:32:11.620 | that may come in one day?
00:32:12.740 | And specifically, what do you see as the positive
00:32:15.340 | and negative implications of future day self-driving?
00:32:18.820 | - Positive and negative implications.
00:32:21.660 | So the positive ones I kind of listed
00:32:26.060 | and go find your favorite press article
00:32:28.980 | and they'll list them as well.
00:32:30.500 | The negative ones in the near term,
00:32:35.060 | I do worry a little bit about the displacement of jobs.
00:32:40.060 | Not a little bit, this will happen.
00:32:42.740 | It happens with every technology like this.
00:32:45.580 | I think it's incumbent on us to find a good way
00:32:50.700 | of transitioning those who are employed
00:32:52.740 | in some of the transportation sectors
00:32:54.140 | that will be affected into better work.
00:32:58.220 | There are a few opportunities that are interesting
00:33:03.220 | in that regard, but I think it's an important thing
00:33:06.140 | to start discussing now,
00:33:07.740 | because it's gonna take a few years.
00:33:10.560 | And by the time we've got these self-driving systems
00:33:14.420 | on the roads, really starting to place that labor,
00:33:16.380 | I'd really like to have a new home for it.
00:33:18.480 | - Hi, I'm Kasia from the Sloan School.
00:33:23.020 | My question was more about your business model,
00:33:25.220 | again, with partnering with both VW and Hyundai,
00:33:29.100 | and your just perspective on how you were able
00:33:31.660 | to effectively do that.
00:33:33.700 | Did not one of them wanna go sort of exclusive with you?
00:33:37.540 | And what was your sort of thought process about that?
00:33:39.980 | - Yeah, so our mission, as I mentioned,
00:33:43.820 | is to get the technology to market broadly,
00:33:46.380 | and quickly, and safely.
00:33:47.860 | We have been and remain convinced
00:33:53.020 | that the right way to do that is by providing it
00:33:55.740 | to as much of the industry as possible,
00:33:57.700 | to every automaker who shares our vision and our approach.
00:34:02.340 | We were pleased to see that both Volkswagen Group,
00:34:06.220 | and I'm assuming you all know the scope of Volkswagen, right?
00:34:10.820 | This is a massive automaker.
00:34:13.140 | Hyundai Motor, also very large,
00:34:16.300 | across Hyundai, Kia, and Genesis.
00:34:19.140 | They both shared our vision of how we should do this,
00:34:22.780 | which was important to us.
00:34:24.760 | They both shared a keen interest in
00:34:28.980 | making a difference at scale through their platforms.
00:34:34.780 | Volkswagen has, I think, a very admirable set of initiatives
00:34:38.020 | around vehicle electrification, a few other things.
00:34:40.540 | Hyundai is doing similar things.
00:34:42.460 | And so, for us, it was important that we enable everyone,
00:34:47.220 | and that was kind of what Aurora was started to do.
00:34:49.860 | - Hi, I had a question.
00:34:51.740 | Now that I see a lot of companies are coming up
00:34:54.220 | with self-driving cars, right?
00:34:55.300 | So most of the cars are pretty much,
00:34:57.980 | all the technology is bound only to the car.
00:35:00.500 | So would we see something like an open network
00:35:02.700 | where cars communicate with each other,
00:35:04.460 | regardless of which company they come from?
00:35:06.460 | And would this, in any way, increase the safety
00:35:10.140 | or the performance of vehicles and stuff like that?
00:35:12.060 | - Yeah, I think you're getting it vehicle-to-vehicle,
00:35:15.220 | vehicle-to-infrastructure type communication.
00:35:16.780 | There are efforts ongoing in that,
00:35:19.420 | and it's certainly, it's only positive, right?
00:35:22.500 | Having that information available to you
00:35:25.860 | can only make things better.
00:35:28.140 | The challenge has historically been with vehicle-to-vehicle,
00:35:30.620 | and particularly vehicle-to-infrastructure, or vice versa,
00:35:34.820 | that it doesn't scale well, one, and two, it's been slow.
00:35:39.020 | It's been much slower in coming than our development.
00:35:41.500 | And so when we develop these systems,
00:35:44.420 | we develop them without the expectation
00:35:46.740 | that those communication protocols are available to us.
00:35:50.940 | We'll certainly protect for them,
00:35:52.540 | and it will certainly be a benefit once they're here.
00:35:56.940 | But until then, many of the hard problems
00:35:59.180 | that I would have welcomed 10 years ago,
00:36:01.900 | to have a beacon on every traffic light
00:36:03.700 | that just told me its state,
00:36:04.900 | rather than having to perceive it,
00:36:06.580 | I would have certainly used those 10 years ago.
00:36:09.780 | Now they're less significant,
00:36:11.740 | because we've kind of worked our way through
00:36:13.300 | a lot of the problems that would have solved.
00:36:15.060 | - Thank you for your talk.
00:36:16.300 | My question is, what's your opinion
00:36:19.300 | about the cooperation of self-driving vehicles?
00:36:23.660 | So maybe I think if you can control a group
00:36:26.220 | of self-driving vehicles at the same time,
00:36:28.660 | you can achieve a lot of benefits to the traffic.
00:36:31.460 | - Yes.
00:36:32.300 | That is where a lot of the benefits come from
00:36:34.980 | in infrastructure utilization, right?
00:36:36.500 | Is in ride sharing with autonomous vehicles.
00:36:40.140 | And specifically, the better we understand demand patterns,
00:36:45.140 | people movement, goods movement,
00:36:48.460 | the better we can sort of optimally allocate these vehicles
00:36:53.220 | at locations where they're needed.
00:36:55.060 | So yes, certainly that coordination,
00:36:57.940 | this is where, as I mentioned,
00:36:59.060 | these three vectors of vehicle electrification,
00:37:01.900 | ride sharing and autonomy,
00:37:03.580 | or mobility as a service and autonomy,
00:37:06.460 | really come together with a unique value proposition.
00:37:09.780 | - Okay, thank you.
00:37:12.140 | - Yeah.
00:37:13.140 | - Thank you so much for a great talk and being here.
00:37:15.340 | (audience applauding)