
Chris Urmson: Self-Driving Cars at Aurora, Google, CMU, and DARPA | Lex Fridman Podcast #28


Chapters

0:00 Intro
1:09 What have you learned from these races
2:11 Did you believe it could be done
3:04 Biggest pain points
4:17 The DARPA Challenge
5:12 Leadership Lessons
6:17 Technical Evolution
9:22 Urban Challenge
10:44 Perception
11:54 Where do we go from here
12:53 Hardest part about driving
13:56 LiDAR
14:33 LiDAR is a crutch
16:42 Cost
19:11 Basic Economics
22:33 Safety Critical Concerns
24:26 Level 2 Vehicles
26:18 What's Going to Change
27:09 Safety
29:34 Safety Metrics
31:59 Public Perception
34:38 The Future
36:20 Deployment
38:15 Breakthroughs
39:49 Pedestrian protection
42:05 Competitors

Whisper Transcript

00:00:00.000 | The following is a conversation with Chris Urmson.
00:00:03.120 | He was the CTO of the Google self-driving car team,
00:00:06.040 | a key engineer and leader behind the Carnegie Mellon
00:00:08.880 | University autonomous vehicle entries in the DARPA Grand
00:00:12.000 | Challenges, and the winner of the DARPA Urban Challenge.
00:00:16.160 | Today, he's the CEO of Aurora Innovation,
00:00:19.440 | an autonomous vehicle software company
00:00:21.360 | he started with Sterling Anderson, who
00:00:23.800 | was the former director of Tesla Autopilot,
00:00:25.960 | and Drew Bagnell, Uber's former autonomy and perception lead.
00:00:30.120 | Chris is one of the top roboticists and autonomous
00:00:32.880 | vehicle experts in the world, and a longtime voice of reason
00:00:37.360 | in a space that is shrouded in both mystery and hype.
00:00:41.320 | He both acknowledges the incredible challenges involved
00:00:44.040 | in solving the problem of autonomous driving,
00:00:46.480 | and is working hard to solve it.
00:00:49.680 | This is the Artificial Intelligence Podcast.
00:00:52.400 | If you enjoy it, subscribe on YouTube,
00:00:54.720 | give it five stars on iTunes, support it on Patreon,
00:00:57.920 | or simply connect with me on Twitter
00:00:59.760 | at Lex Fridman, spelled F-R-I-D-M-A-N.
00:01:03.240 | And now, here's my conversation with Chris Urmson.
00:01:09.120 | You were part of both the DARPA Grand Challenge
00:01:11.960 | and the DARPA Urban Challenge teams at CMU with Red Whittaker.
00:01:17.040 | What technical or philosophical things
00:01:19.720 | have you learned from these races?
00:01:22.240 | I think the high order bit was that it could be done.
00:01:26.640 | I think that was the thing that was
00:01:30.200 | incredible about the first, the Grand Challenges,
00:01:34.880 | that I remember I was a grad student at Carnegie Mellon,
00:01:38.160 | and there was kind of this dichotomy of it
00:01:45.360 | seemed really hard, so that would be cool and interesting.
00:01:48.800 | But at the time, we were the only robotics institute around,
00:01:52.800 | and so if we went into it and fell on our faces,
00:01:55.560 | that would be embarrassing.
00:01:58.400 | So I think just having the will to go do it,
00:02:01.120 | to try to do this thing that at the time
00:02:02.880 | was marked as darn near impossible,
00:02:05.000 | and then after a couple of tries,
00:02:07.000 | be able to actually make it happen,
00:02:08.440 | I think that was really exciting.
00:02:12.320 | But at which point did you believe it was possible?
00:02:15.080 | Did you, from the very beginning,
00:02:16.960 | did you personally, because you're
00:02:18.400 | one of the lead engineers, you actually
00:02:20.360 | had to do a lot of the work?
00:02:21.840 | Yeah, I was the technical director there
00:02:23.840 | and did a lot of the work along with a bunch
00:02:26.160 | of other really good people.
00:02:28.400 | Did I believe it could be done?
00:02:29.720 | Yeah, of course.
00:02:31.080 | Why would you go do something you thought
00:02:32.800 | was completely impossible?
00:02:34.880 | We thought it was going to be hard.
00:02:36.320 | We didn't know how we were going to be able to do it.
00:02:38.440 | We didn't know if we'd be able to do it the first time.
00:02:42.920 | Turns out we couldn't.
00:02:45.960 | That, yeah, I guess you have to.
00:02:48.400 | I think there's a certain benefit to naivete, right?
00:02:52.960 | That if you don't know how hard something really is,
00:02:55.440 | you try different things and it gives you an opportunity
00:02:59.600 | that others who are wiser maybe don't have.
00:03:04.120 | What were the biggest pain points?
00:03:05.720 | Mechanical, sensors, hardware, software,
00:03:08.880 | algorithms for mapping, localization,
00:03:11.800 | just general perception and control.
00:03:13.680 | Like hardware, software, first of all.
00:03:15.320 | I think that's the joy of this field, is that it's all hard.
00:03:20.080 | And that you have to be good at each part of it.
00:03:25.320 | So for the Grand Challenges, if I look back at it from today,
00:03:32.320 | it should be easy today.
00:03:36.280 | That it was a static world.
00:03:38.920 | There weren't other actors moving through it,
00:03:40.800 | is what that means.
00:03:42.480 | It was out in the desert, so you get really good GPS.
00:03:44.480 | So that went, and we could map it, roughly.
00:03:51.400 | And so in retrospect, now it's within the realm of things
00:03:55.200 | we could do back then.
00:03:57.840 | Just actually getting the vehicle,
00:03:59.240 | and there's a bunch of engineering work
00:04:00.840 | to get the vehicle so that we could control it and drive it.
00:04:04.800 | That's still a pain today, but it was even more so back then.
00:04:09.560 | And then the uncertainty of exactly what they wanted us
00:04:12.880 | to do was part of the challenge as well.
00:04:17.000 | >> Right, you didn't actually know the track heading in.
00:04:19.400 | You knew approximately, but you didn't actually
00:04:21.480 | know the route, the route that was going to be taken.
00:04:23.520 | >> That's right, we didn't know the route.
00:04:25.280 | We didn't even really, the way the rules had been described,
00:04:28.560 | you had to kind of guess.
00:04:29.760 | So if you think back to that challenge,
00:04:33.320 | the idea was that the government would give us,
00:04:36.960 | DARPA would give us a set of waypoints and kind of the width
00:04:42.360 | that you had to stay within around the line that
00:04:44.320 | went between each of those waypoints.
00:04:46.840 | And so the most devious thing they could have done
00:04:49.280 | is set a kilometer-wide corridor across a field of scrub brush
00:04:54.960 | and rocks and said, go figure it out.
00:04:58.480 | Fortunately, it really turned into basically driving
00:05:01.880 | along a set of trails, which is much more relevant
00:05:04.960 | to the application they were looking for.
00:05:08.720 | But no, it was a hell of a thing back in the day.
00:05:12.040 | >> So the legend, Red, was kind of leading that effort
00:05:16.600 | in terms of just broadly speaking.
00:05:19.080 | So you're a leader now.
00:05:22.000 | What have you learned from Red about leadership?
00:05:24.960 | >> I think there's a couple of things.
00:05:26.280 | One is go and try those really hard things.
00:05:31.040 | That's where there is an incredible opportunity.
00:05:34.760 | I think the other big one, though,
00:05:36.560 | is to see people for who they can be, not who they are.
00:05:41.720 | It's one of the things that I actually--
00:05:43.720 | one of the deepest lessons I learned from Red
00:05:46.080 | was that he would look at undergraduates or graduate
00:05:51.000 | students and empower them to be leaders,
00:05:56.120 | to have responsibility, to do great things that I think
00:06:03.080 | another person might look at them and think,
00:06:04.920 | oh, well, that's just an undergraduate student.
00:06:06.880 | What could they know?
00:06:08.760 | And so I think that kind of trust, but verify,
00:06:12.760 | have confidence in what people can become, I think,
00:06:14.880 | is a really powerful thing.
00:06:16.720 | >> So through that, let's just fast forward
00:06:18.960 | through the history.
00:06:20.480 | Can you maybe talk through the technical evolution
00:06:24.200 | of autonomous vehicle systems from the first two
00:06:27.480 | Grand Challenges to the Urban Challenge to today?
00:06:31.040 | Are there major shifts in your mind,
00:06:33.600 | or is it the same kind of technology,
00:06:35.400 | just made more robust?
00:06:37.320 | >> I think there's been some big, big steps.
00:06:40.920 | So for the Grand Challenge, the real technology
00:06:46.640 | that unlocked that was HD mapping.
00:06:51.440 | Prior to that, a lot of the off-road robotics work
00:06:55.240 | had been done without any real prior model of what
00:06:58.960 | the vehicle was going to encounter.
00:07:01.440 | And so that innovation, the fact that we could get decimeter
00:07:06.640 | resolution models was really a big deal.
00:07:13.560 | And that allowed us to kind of bound
00:07:17.440 | the complexity of the driving problem the vehicle had
00:07:19.680 | and allowed it to operate at speed
00:07:21.080 | because we could assume things about the environment
00:07:23.800 | that it was going to encounter.
00:07:26.400 | So that was the big step there.
00:07:31.280 | For the Urban Challenge, one of the big technological
00:07:38.560 | innovations there was the multi-beam LIDAR
00:07:42.000 | and being able to generate high resolution, mid to long range
00:07:47.480 | 3D models of the world and use that for understanding
00:07:52.880 | the world around the vehicle.
00:07:54.120 | And that was really a game changing technology.
00:07:59.160 | In parallel with that, we saw a bunch
00:08:02.880 | of other technologies that had been kind of converging
00:08:06.640 | have their day in the sun.
00:08:09.000 | So Bayesian estimation had been--
00:08:13.080 | SLAM had been a big field in robotics.
00:08:18.560 | You would go to a conference a couple of years before that
00:08:21.320 | and every paper would effectively
00:08:23.840 | have SLAM somewhere in it.
00:08:25.720 | And so seeing those Bayesian estimation techniques
00:08:31.640 | play out on a very visible stage,
00:08:34.080 | I thought that was pretty exciting to see.
00:08:38.800 | And mostly SLAM was done based on LIDAR at that time.
00:08:41.800 | Well, yeah.
00:08:42.440 | And in fact, we weren't really doing SLAM per se in real time
00:08:46.640 | because we had a model ahead of time.
00:08:48.160 | We had a roadmap.
00:08:49.240 | But we were doing localization.
00:08:51.560 | And we were using the LIDAR or the cameras,
00:08:54.160 | depending on who exactly was doing it,
00:08:55.960 | to localize to a model of the world.
00:08:58.120 | And I thought that was a big step
00:09:00.760 | from kind of naively trusting GPS, INS before that.
00:09:07.160 | And again, lots of work had been going on in this field.
00:09:10.480 | Certainly this was not doing anything particularly
00:09:14.120 | innovative in SLAM or in localization.
00:09:17.440 | But it was seeing that technology necessary
00:09:20.200 | in a real application on a big stage,
00:09:21.840 | I thought was very cool.
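To make the idea of localizing to a prior model concrete, here is a minimal, hypothetical sketch: a 1D histogram filter that weighs candidate positions by how well a stored map explains a range measurement. The map values, noise model, and numbers are invented for illustration and are not how the challenge teams actually implemented it.

```python
import numpy as np

# Expected range reading at each map cell (a made-up prior map).
prior_map = np.array([5.0, 5.0, 2.0, 2.0, 5.0, 8.0, 8.0, 5.0])

# Uniform belief over cells: "GPS says we're somewhere around here."
belief = np.full(len(prior_map), 1.0 / len(prior_map))

def measurement_update(belief, measured_range, noise_sigma=0.5):
    # Weight each candidate cell by how well the stored map explains the measurement.
    likelihood = np.exp(-0.5 * ((prior_map - measured_range) / noise_sigma) ** 2)
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = measurement_update(belief, measured_range=2.1)
print(np.round(belief, 3))  # probability mass concentrates on the cells whose map value is ~2
```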
00:09:23.080 | So for the urban challenge, those were already
00:09:25.000 | maps constructed offline?
00:09:27.600 | In general.
00:09:28.680 | And did people do that--
00:09:30.920 | did individual teams do it individually
00:09:33.600 | so they had their own different approaches there?
00:09:36.480 | Or did everybody kind of share that information, at least
00:09:42.120 | intuitively?
00:09:43.520 | So DARPA gave all the teams a model of the world, a map.
00:09:49.600 | And then one of the things that we had to figure out back then
00:09:53.720 | was--
00:09:54.560 | and it's still one of these things that
00:09:56.160 | trips people up today-- is actually the coordinate system.
00:10:00.240 | So you get a latitude, longitude.
00:10:03.040 | And to so many decimal places, you
00:10:05.160 | don't really care about the ellipsoid of the Earth that's
00:10:08.000 | being used.
00:10:09.520 | But when you want to get to 10 centimeter or centimeter
00:10:12.760 | resolution, you care whether the coordinate system is NAD-83
00:10:18.520 | or WGS-84.
00:10:20.360 | Or these are different ways to describe
00:10:24.200 | both the kind of non-sphericalness of the Earth,
00:10:26.760 | but also kind of the--
00:10:30.640 | I think-- I can't remember which one-- the tectonic shifts
00:10:33.040 | that are happening and how to transform the global datum
00:10:36.160 | as a function of that.
00:10:37.080 | So getting a map and then actually matching it
00:10:40.440 | to reality to centimeter resolution,
00:10:41.920 | that was kind of interesting and fun back then.
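For readers curious what the datum question looks like in practice, here is a small, hypothetical sketch using the pyproj library; the example point is arbitrary, and the EPSG codes (4269 for NAD83, 4326 for WGS84) are standard identifiers, but this is only an illustration of the concept, not the tooling the teams used.

```python
# Hypothetical sketch: transforming a point between the NAD83 and WGS84 datums.
# To a few decimal places they agree; at centimeter-level mapping, the datum
# (and its epoch) starts to matter.
from pyproj import Transformer

nad83_to_wgs84 = Transformer.from_crs("EPSG:4269", "EPSG:4326", always_xy=True)

lon, lat = -117.633, 34.587  # arbitrary example point (longitude, latitude)
wgs_lon, wgs_lat = nad83_to_wgs84.transform(lon, lat)

print(f"NAD83: ({lon:.8f}, {lat:.8f})")
print(f"WGS84: ({wgs_lon:.8f}, {wgs_lat:.8f})")
```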
00:10:44.080 | So how much work was the perception doing there?
00:10:46.840 | So how much were you relying on localization based on maps
00:10:52.480 | without using perception to register to the maps?
00:10:55.760 | And I guess the question is, how advanced
00:10:58.000 | was perception at that point?
00:10:59.800 | It's certainly behind where we are today.
00:11:01.960 | We're more than a decade since the urban challenge.
00:11:05.880 | But the core of it was there.
00:11:09.960 | That we were tracking vehicles.
00:11:13.120 | We had to do that at 100-plus meter range
00:11:15.640 | because we had to merge with other traffic.
00:11:18.360 | We were using, again, Bayesian estimates
00:11:21.240 | for state of these vehicles.
00:11:23.840 | We had to deal with a bunch of the problems
00:11:25.600 | that you think of today of predicting
00:11:28.280 | where that vehicle is going to be
00:11:29.840 | a few seconds into the future.
00:11:31.080 | We had to deal with the fact that there
00:11:33.720 | were multiple hypotheses for that
00:11:35.320 | because a vehicle at an intersection
00:11:36.880 | might be going right or it might be going straight
00:11:39.000 | or it might be making a left turn.
00:11:41.560 | And we had to deal with the challenge of the fact
00:11:44.160 | that our behavior was going to impact the behavior
00:11:47.600 | of that other operator.
00:11:48.960 | And we did a lot of that in relatively naive ways.
00:11:53.520 | But it kind of worked.
00:11:54.840 | Still had to have some kind of solution.
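A minimal, hypothetical sketch of the multiple-hypothesis idea described above: keep a belief over the discrete maneuvers a tracked vehicle might make at an intersection and update it with Bayes' rule as observations arrive. The likelihood model and numbers are invented for illustration.

```python
# Belief over discrete maneuver hypotheses for a vehicle at an intersection.
maneuvers = ["left", "straight", "right"]
belief = {m: 1.0 / 3.0 for m in maneuvers}  # uniform prior

def likelihood(observed_yaw_rate: float, maneuver: str) -> float:
    """Toy likelihood: left turns imply positive yaw rate, right turns negative."""
    expected = {"left": 0.3, "straight": 0.0, "right": -0.3}[maneuver]
    return max(1e-6, 1.0 - min(1.0, abs(observed_yaw_rate - expected)))

def update(belief, observed_yaw_rate):
    # Bayes' rule: posterior proportional to prior times likelihood, then normalize.
    posterior = {m: belief[m] * likelihood(observed_yaw_rate, m) for m in belief}
    total = sum(posterior.values())
    return {m: p / total for m, p in posterior.items()}

belief = update(belief, observed_yaw_rate=0.25)  # the vehicle starts curving left
print(belief)  # probability mass shifts toward "left"
```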
00:11:57.120 | And so where does that, 10 years later,
00:12:00.000 | where does that take us today from that artificial city
00:12:03.600 | construction to real cities to the urban environment?
00:12:07.000 | Yeah, I think the biggest thing is that the actors are truly
00:12:13.640 | unpredictable, that most of the time,
00:12:17.720 | the drivers on the road, the other road users are out there
00:12:23.040 | behaving well, but every once in a while, they're not.
00:12:27.040 | The variety of other vehicles is--
00:12:31.120 | you have all of them.
00:12:33.320 | In terms of behavior or in terms of perception or both?
00:12:35.800 | Both.
00:12:38.720 | Back then, we didn't have to deal with cyclists.
00:12:40.800 | We didn't have to deal with pedestrians.
00:12:42.800 | Didn't have to deal with traffic lights.
00:12:46.280 | The scale over which you have to operate
00:12:48.800 | is now much larger than the airbase
00:12:51.120 | that we were thinking about back then.
00:12:52.760 | So what-- easy question--
00:12:56.320 | what do you think is the hardest part about driving?
00:12:59.760 | Easy question.
00:13:00.520 | Yeah.
00:13:01.360 | No, I'm joking.
00:13:03.160 | I'm sure nothing really jumps out at you as one thing,
00:13:07.400 | but in the jump from the urban challenge to the real world,
00:13:12.960 | is there something that's a particular--
00:13:15.320 | you foresee a very serious, difficult challenge?
00:13:18.520 | I think the most fundamental difference
00:13:21.120 | is that we're doing it for real, that in that environment,
00:13:29.000 | it was both a limited complexity environment,
00:13:31.840 | because certain actors weren't there, because the roads were
00:13:34.760 | maintained.
00:13:35.360 | There were barriers keeping people separate from robots
00:13:38.720 | at the time.
00:13:40.880 | And it only had to work for 60 miles, which,
00:13:43.960 | looking at it from 2006, it had to work for 60 miles.
00:13:48.960 | Looking at it from now, we want things
00:13:52.680 | that will go and drive for half a million miles.
00:13:57.200 | And it's just a different game.
00:14:01.040 | So how important-- you said LiDAR came into the game
00:14:05.400 | early on, and it's really the primary driver
00:14:07.880 | of autonomous vehicles today as a sensor.
00:14:10.240 | So how important is the role of LiDAR in the sensor
00:14:12.400 | suite in the near term?
00:14:14.800 | So I think it's essential.
00:14:18.640 | But I also believe the cameras are essential,
00:14:20.480 | and I believe the radar is essential.
00:14:22.080 | I think that you really need to use
00:14:26.240 | the composition of data from these different sensors
00:14:28.680 | if you want the thing to really be robust.
00:14:32.600 | The question I want to ask--
00:14:34.320 | let's see if we can untangle it--
00:14:35.760 | is what are your thoughts on the Elon Musk provocative statement
00:14:40.200 | that LiDAR is a crutch, that is a kind of, I guess,
00:14:45.840 | growing pains, and that much of the perception task
00:14:49.840 | can be done with cameras?
00:14:52.160 | So I think it is undeniable that people walk around
00:14:56.880 | without lasers in their foreheads.
00:14:59.640 | And they can get into vehicles and drive them.
00:15:01.840 | And so there's an existence proof
00:15:05.520 | that you can drive using passive vision.
00:15:10.840 | No doubt.
00:15:11.440 | Can't argue with that.
00:15:12.680 | In terms of sensors.
00:15:14.000 | Yeah.
00:15:14.520 | So there's proof--
00:15:15.240 | In terms of sensors, right?
00:15:16.400 | There's an example that we all go do it, many of us, every day.
00:15:23.280 | In terms of LiDAR being a crutch, sure.
00:15:28.160 | But in the same way that the combustion engine was
00:15:33.200 | a crutch on the path to an electric vehicle,
00:15:35.200 | in the same way that any technology ultimately
00:15:38.560 | gets replaced by some superior technology in the future.
00:15:44.640 | And really, the way that I look at this
00:15:47.720 | is that the way we get around on the ground, the way that we
00:15:51.960 | use transportation is broken.
00:15:55.280 | And that we have this--
00:15:58.200 | I think the number I saw this morning, 37,000 Americans
00:16:00.920 | killed last year on our roads.
00:16:04.080 | And that's just not acceptable.
00:16:05.480 | And so any technology that we can
00:16:08.040 | bring to bear that accelerates this self-driving technology
00:16:12.840 | coming to market and saving lives
00:16:15.760 | is technology we should be using.
00:16:18.300 | And it feels just arbitrary to say, well, I'm
00:16:23.680 | not OK with using lasers, because that's whatever.
00:16:27.760 | But I am OK with using an 8 megapixel camera or a 16
00:16:31.220 | megapixel camera.
00:16:32.880 | These are just bits of technology.
00:16:34.520 | And we should be taking the best technology
00:16:36.340 | from the tool bin that allows us to go and solve a problem.
00:16:41.600 | The question I often talk to--
00:16:43.640 | well, obviously, you do as well--
00:16:45.120 | to the automotive companies.
00:16:48.300 | And if there's one word that comes up more often
00:16:51.340 | than anything, it's cost, and trying to drive cost down.
00:16:55.300 | So while it's true that it's a tragic number, the 37,000,
00:17:01.420 | the question is--
00:17:03.120 | and I'm not the one asking this question,
00:17:04.820 | because I hate this question.
00:17:06.040 | But we want to find the cheapest sensor suite that
00:17:11.700 | creates a safe vehicle.
00:17:13.260 | So in that uncomfortable trade-off,
00:17:18.220 | do you foresee LiDAR coming down in cost in the future?
00:17:23.700 | Or do you see a day where level 4 autonomy
00:17:26.660 | is possible without LiDAR?
00:17:29.900 | I see both of those.
00:17:30.940 | But it's really a matter of time.
00:17:32.980 | And I think, really, maybe I would
00:17:35.060 | talk to the question you asked about the cheapest sensor.
00:17:38.780 | I don't think that's actually what you want.
00:17:40.580 | What you want is a sensor suite that is economically viable.
00:17:45.780 | And then after that, everything is about margin and driving
00:17:50.220 | cost out of the system.
00:17:52.340 | What you also want is a sensor suite that works.
00:17:55.460 | And so it's great to tell a story about how it would be
00:18:00.940 | better to have a self-driving system with a $50 sensor
00:18:04.340 | instead of a $500 sensor.
00:18:08.780 | But if the $500 sensor makes it work and the $50 sensor
00:18:11.540 | doesn't work, who cares?
00:18:15.700 | So long as you can actually have an economic--
00:18:20.020 | there's an economic opportunity there.
00:18:21.700 | And the economic opportunity is important,
00:18:23.740 | because that's how you actually have a sustainable business.
00:18:27.780 | And that's how you can actually see this come to scale
00:18:31.180 | and be out in the world.
00:18:32.540 | And so when I look at LiDAR, I see a technology
00:18:37.220 | that has no underlying fundamental expense to it.
00:18:42.460 | It's going to be more expensive than an imager,
00:18:46.100 | because CMOS processes or FAB processes
00:18:51.420 | are dramatically more scalable than mechanical processes.
00:18:56.220 | But we still should be able to drive cost out
00:18:58.340 | substantially on that side.
00:19:00.460 | And then I also do think that with the right business model,
00:19:05.940 | you can absorb certainly more cost on the bill of materials.
00:19:09.540 | Yeah, if the sensor suite works, extra value is provided.
00:19:12.620 | Thereby, you don't need to drive cost down to zero.
00:19:15.540 | It's the basic economics.
00:19:17.180 | You've talked about your intuition
00:19:18.860 | that level two autonomy is problematic because
00:19:22.780 | of the human factor of vigilance, decrement,
00:19:26.540 | complacency, overtrust, and so on, just us being human.
00:19:29.660 | We overtrust the system, we start doing even more
00:19:33.060 | so partaking in the secondary activities like smartphones
00:19:36.580 | and so on.
00:19:38.820 | Have your views evolved on this point in either direction?
00:19:43.020 | Can you speak to it?
00:19:44.820 | So I want to be really careful, because sometimes this
00:19:48.300 | gets twisted in a way that I certainly didn't intend.
00:19:53.060 | So active safety systems are a really important technology
00:19:59.420 | that we should be pursuing and integrating into vehicles.
00:20:03.500 | And there's an opportunity in the near term
00:20:05.740 | to reduce accidents, reduce fatalities,
00:20:09.260 | and we should be pushing on that.
00:20:13.460 | Level two systems are systems where the vehicle
00:20:18.180 | is controlling two axes, so braking/throttle and steering.
00:20:24.900 | And I think there are variants of level two systems
00:20:27.140 | that are supporting the driver that absolutely we should
00:20:30.820 | encourage to be out there.
00:20:32.620 | Where I think there's a real challenge
00:20:34.300 | is in the human factors part around this
00:20:39.180 | and the misconception from the public around the capability
00:20:43.660 | set that that enables and the trust
00:20:45.780 | that they should have in it.
00:20:48.060 | And that is where I am actually incrementally more concerned
00:20:54.340 | around level three systems and how exactly a level two
00:20:57.780 | system is marketed and delivered.
00:21:00.020 | And how much effort people have put into those human factors.
00:21:03.260 | So I still believe several things around this.
00:21:07.020 | One is people will overtrust the technology.
00:21:10.780 | We've seen over the last few weeks a spate of people
00:21:13.860 | sleeping in their Tesla.
00:21:16.340 | I watched an episode last night of Trevor Noah
00:21:22.620 | talking about this.
00:21:24.380 | And him, this is a smart guy who has a lot of resources
00:21:28.940 | at his disposal describing a Tesla as a self-driving car.
00:21:32.940 | And that why shouldn't people be sleeping in their Tesla?
00:21:35.700 | It's like, well, because it's not a self-driving car
00:21:38.820 | and it is not intended to be.
00:21:41.100 | And these people will almost certainly die at some point
00:21:48.460 | or hurt other people.
00:21:50.460 | And so we need to really be thoughtful about how
00:21:52.600 | that technology is described and brought to market.
00:21:56.260 | I also think that because of the economic challenges
00:22:01.380 | we were just talking about, that technology path will--
00:22:05.580 | these level two driver assistance systems,
00:22:07.260 | that technology path will diverge from the technology
00:22:10.300 | path that we need to be on to actually deliver truly
00:22:15.020 | self-driving vehicles.
00:22:16.180 | Ones where you can get in it and sleep and have
00:22:19.600 | the equivalent or better safety than a human driver
00:22:23.060 | behind the wheel.
00:22:24.540 | Because again, the economics are very
00:22:26.660 | different in those two worlds.
00:22:29.860 | And so that leads to divergent technology.
00:22:32.700 | So you just don't see the economics of gradually
00:22:36.900 | increasing from level two and doing so quickly enough
00:22:41.500 | to where it doesn't cause safety, critical safety
00:22:43.940 | concerns.
00:22:44.460 | You believe that it needs to diverge at this point
00:22:48.540 | into basically different routes.
00:22:50.520 | And really that comes back to what are those L2 and L1
00:22:55.960 | systems doing.
00:22:56.800 | And they are driver assistance functions
00:22:59.760 | where the people that are marketing that responsibly
00:23:04.320 | are being very clear and putting human factors in place such
00:23:08.320 | that the driver is actually responsible for the vehicle.
00:23:12.360 | And that the technology is there to support the driver.
00:23:15.160 | And the safety cases that are built around those
00:23:19.860 | are dependent on that driver attention and attentiveness.
00:23:24.260 | And at that point, you can kind of give up to some degree
00:23:30.340 | for economic reasons.
00:23:31.220 | You can give up on say false negatives.
00:23:34.260 | And so the way to think about this
00:23:36.180 | is, for a collision mitigation braking system,
00:23:40.740 | if, half the times the driver missed
00:23:43.420 | a vehicle in front of it, it hit the brakes
00:23:46.020 | and brought the vehicle to a stop,
00:23:47.660 | that would be an incredible, incredible advance
00:23:51.620 | in safety on our roads.
00:23:52.980 | That would be equivalent to seat belts.
00:23:54.940 | But it would mean that if that vehicle wasn't being monitored,
00:23:57.560 | it would hit one out of two cars.
00:24:00.580 | And so economically, that's a perfectly good solution
00:24:04.900 | for a driver assistance system.
00:24:06.220 | What you should do at that point if you
00:24:07.380 | can get it to work 50% of the time
00:24:09.220 | is drive the cost out of that so you can get it on as
00:24:11.460 | many vehicles as possible.
00:24:13.300 | But driving the cost out of it doesn't drive up performance
00:24:16.900 | on the false negative case.
00:24:18.820 | And so you'll continue to not have a technology that
00:24:21.580 | could really be available for a self-driven vehicle.
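A back-of-the-envelope version of the point above, with entirely made-up baseline numbers: a detector that catches half of imminent collisions is a large win layered on top of a human driver, but as the sole driver it still misses one threat in two.

```python
# Illustrative numbers only; the baseline crash rate is invented.
baseline_crashes_per_million_miles = 2.0
detector_catch_rate = 0.5  # catches half the crashes the human would otherwise have

# As a driver-assistance layer on top of an attentive human driver:
assisted_rate = baseline_crashes_per_million_miles * (1 - detector_catch_rate)

# As the sole "driver," every missed detection is an unmitigated collision:
miss_rate = 1 - detector_catch_rate

print(f"Human + assistance: {assisted_rate} crashes per million miles")
print(f"Unmonitored, it still misses {miss_rate:.0%} of imminent collisions")
```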
00:24:25.700 | So clearly, the communication-- and this probably
00:24:28.980 | applies to level 4 vehicles as well--
00:24:31.620 | the marketing and the communication
00:24:34.420 | of what the technology is actually capable of,
00:24:37.060 | how hard it is, how easy it is, all that kind of stuff
00:24:39.300 | is highly problematic.
00:24:41.100 | So say everybody in the world was perfectly communicated
00:24:45.660 | and were made to be completely aware
00:24:48.340 | of every single technology out there, what it's able to do.
00:24:52.860 | What's your intuition?
00:24:54.140 | And now we're maybe getting into philosophical ground.
00:24:56.900 | Is it possible to have a level two vehicle
00:25:00.020 | where we don't overtrust it?
00:25:04.700 | I don't think so.
00:25:05.780 | If people truly understood the risks--
00:25:08.900 | They wouldn't--
00:25:09.540 | --and internalized it, then sure, you could do that safely.
00:25:14.260 | But that's a world that doesn't exist.
00:25:16.100 | That people are going to--
00:25:17.820 | they're going to-- if the facts are put in front of them,
00:25:20.700 | they're going to then combine that with their experience.
00:25:24.420 | And let's say they're using an L2 system,
00:25:28.340 | and they go up and down the 101 every day.
00:25:30.980 | And they do that for a month.
00:25:32.740 | And it just worked every day for a month.
00:25:36.340 | That's pretty compelling.
00:25:37.380 | At that point, just even if you know the statistics,
00:25:41.860 | you're like, well, I don't know.
00:25:43.460 | Maybe there's something funny about those.
00:25:45.220 | Maybe they're driving in difficult places.
00:25:47.140 | I've seen it with my own eyes.
00:25:48.540 | It works.
00:25:49.940 | And the problem is that that sample size that they have--
00:25:52.420 | so it's 30 miles up and down.
00:25:53.940 | So 60 miles times 30 days, so 60, 180, 1,800 miles.
00:26:01.700 | That's a drop in the bucket compared to the, what,
00:26:05.420 | 85 million miles between fatalities.
00:26:07.580 | And so they don't really have a true estimate
00:26:11.340 | based on their personal experience of the real risks.
00:26:13.980 | But they're going to trust it anyway, because it's
00:26:16.020 | hard not to.
00:26:16.520 | It worked for a month.
00:26:17.700 | What's going to change?
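The arithmetic in that example, written out; the 85-million-mile figure is the one quoted in the conversation, not an authoritative statistic.

```python
daily_miles = 60            # 30 miles each way on the 101
days = 30
personal_sample = daily_miles * days        # 1,800 miles of personal experience
miles_per_fatality = 85_000_000             # figure quoted in the conversation

fraction = personal_sample / miles_per_fatality
print(f"{personal_sample} miles is roughly {fraction:.4%} "
      "of the mileage between fatalities")   # about 0.002%
```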
00:26:18.700 | -So even if you start with a perfect understanding of the system,
00:26:21.580 | your own experience will make it drift.
00:26:24.140 | I mean, that's a big concern.
00:26:25.900 | Over a year, over two years, even.
00:26:28.140 | It doesn't have to be months.
00:26:29.420 | And I think that as this technology moves
00:26:32.900 | from what I would say is kind of the more technology-savvy
00:26:37.740 | ownership group to the mass market,
00:26:42.620 | you may be able to have some of those folks who are really
00:26:45.260 | familiar with technology.
00:26:46.300 | They may be able to internalize it better.
00:26:48.820 | And your immunization against this kind of false risk
00:26:52.860 | assessment might last longer.
00:26:54.260 | But as folks who aren't as savvy about that
00:26:58.660 | read the material and they compare
00:27:00.700 | that to their personal experience,
00:27:02.180 | I think there it's going to move more quickly.
00:27:08.180 | -So your work, the program that you created at Google
00:27:11.300 | and now at Aurora, is focused more
00:27:15.100 | on the second path of creating full autonomy.
00:27:18.460 | So it's such a fascinating-- I think
00:27:22.180 | it's one of the most interesting AI problems of the century.
00:27:26.860 | I just talked to a lot of people, just regular people,
00:27:29.180 | I don't know, my mom, about autonomous vehicles.
00:27:31.780 | And you begin to grapple with ideas
00:27:34.540 | of giving your life control over to a machine.
00:27:38.100 | It's philosophically interesting.
00:27:40.060 | It's practically interesting.
00:27:41.780 | So let's talk about safety.
00:27:43.740 | How do you think we demonstrate-- you've
00:27:46.380 | spoken about metrics in the past.
00:27:47.900 | How do you think we demonstrate to the world
00:27:51.900 | that an autonomous vehicle, an Aurora system, is safe?
00:27:56.140 | -This is one where it's difficult because there isn't
00:27:58.340 | a soundbite answer.
00:27:59.220 | That we have to show a combination of work
00:28:05.900 | that was done diligently and thoughtfully.
00:28:08.300 | And this is where something like a functional safety process
00:28:10.780 | is part of that.
00:28:11.420 | It's like, here's the way we did the work.
00:28:15.260 | That means that we were very thorough.
00:28:17.140 | So if you believe what we said about this
00:28:20.340 | is the way we did it, then you can have some confidence
00:28:22.680 | that we were thorough in the engineering work
00:28:25.180 | we put into the system.
00:28:27.260 | And then on top of that, to demonstrate
00:28:30.140 | that we weren't just thorough, we
00:28:32.060 | were actually good at what we did,
00:28:35.300 | there'll be a collection of evidence
00:28:38.180 | in terms of demonstrating that the capabilities worked
00:28:40.740 | the way we thought they did statistically.
00:28:43.900 | And to whatever degree, we can demonstrate that,
00:28:48.140 | both in some combination of simulation,
00:28:50.260 | some combination of unit testing and decomposition testing.
00:28:54.780 | And then some part of it will be on-road data.
00:28:58.140 | And I think the way we'll ultimately
00:29:02.660 | convey this to the public is there'll
00:29:04.380 | be clearly some conversation with the public about it.
00:29:08.180 | But we'll invoke the trusted nodes
00:29:11.980 | in that we'll spend more time being
00:29:14.020 | able to go into more depth with folks like NHTSA
00:29:17.500 | and other federal and state regulatory bodies.
00:29:19.700 | And given that they are operating
00:29:22.560 | in the public interest and they're trusted,
00:29:26.220 | that if we can show enough work to them that they're convinced,
00:29:29.980 | then I think we're in a pretty good place.
00:29:33.300 | - That means that you work with people that
00:29:35.100 | are essentially experts at safety to try to discuss.
00:29:38.060 | And so do you think--
00:29:40.660 | the answer is probably no, but just in case--
00:29:42.900 | do you think there exists a metric?
00:29:44.340 | So currently, people have been using
00:29:46.300 | number of disengagements.
00:29:48.180 | And it quickly turns into a marketing scheme
00:29:50.100 | to sort of alter the experiments you run to adjust.
00:29:54.300 | I think you've spoken that you don't like--
00:29:56.260 | - Don't love it.
00:29:56.840 | No, in fact, I was on the record telling DMV
00:29:59.680 | that I thought this was not a great metric.
00:30:01.980 | - Do you think it's possible to create a metric, a number,
00:30:06.620 | that could demonstrate safety outside of fatalities?
00:30:12.700 | - So I do.
00:30:13.500 | And I think that it won't be just one number.
00:30:17.700 | So as we are internally grappling with this--
00:30:21.340 | and at some point, we'll be able to talk more publicly about it--
00:30:25.100 | is how do we think about human performance in different tasks,
00:30:29.820 | say, detecting traffic lights or safely making
00:30:34.580 | a left turn across traffic?
00:30:37.740 | And what do we think the failure rates
00:30:40.020 | are for those different capabilities for people?
00:30:42.540 | And then demonstrating to ourselves,
00:30:44.780 | and then ultimately folks in the regulatory role,
00:30:48.500 | and then ultimately the public, that we have confidence
00:30:51.780 | that our system will work better than that.
00:30:55.140 | And so these individual metrics will tell a compelling story,
00:31:00.060 | ultimately.
00:31:01.820 | I do think at the end of the day,
00:31:03.980 | what we care about in terms of safety
00:31:06.660 | is lives saved and injuries reduced.
00:31:12.180 | And then ultimately, casualty dollars
00:31:16.460 | that people aren't having to pay to get their car fixed.
00:31:19.420 | And I do think that you can--
00:31:21.900 | in aviation, they look at an event pyramid,
00:31:25.900 | where a crash is at the top of that,
00:31:28.580 | and that's the worst event, obviously.
00:31:30.460 | And then there's injuries and near-miss events and whatnot,
00:31:34.260 | and violation of operating procedures.
00:31:37.340 | And you build a statistical model
00:31:40.180 | of the relevance of the low severity things
00:31:43.900 | to the high severity things.
00:31:45.100 | And I think that's something we'll
00:31:46.560 | be able to look at as well.
00:31:48.180 | Because an event per 85 million miles
00:31:51.780 | is statistically a difficult thing,
00:31:54.420 | even at the scale of the US, to compare directly.
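A hypothetical sketch of the event-pyramid idea: estimate the rate of rare, high-severity events from more frequent low-severity ones, given an assumed ratio between layers of the pyramid. Every number here, including the ratio, is invented for illustration.

```python
near_misses_observed = 40
miles_driven = 2_000_000
assumed_near_misses_per_crash = 600   # invented pyramid ratio for illustration

near_miss_rate = near_misses_observed / miles_driven
implied_crash_rate = near_miss_rate / assumed_near_misses_per_crash

print(f"Near-miss rate: {near_miss_rate:.2e} per mile")
print(f"Implied crash rate: {implied_crash_rate:.2e} per mile "
      f"(~1 per {1 / implied_crash_rate:,.0f} miles)")
```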
00:31:59.380 | And that event, the fatality that's
00:32:01.740 | connected to an autonomous vehicle,
00:32:05.020 | is significantly, at least currently, magnified.
00:32:09.140 | And the amount of attention it gets,
00:32:12.300 | so that speaks to public perception.
00:32:15.100 | I think the most popular topic about autonomous vehicles
00:32:18.220 | in the public is the trolley problem formulation, right?
00:32:22.500 | - Sure.
00:32:23.100 | - Which has-- let's not get into that too much,
00:32:26.980 | but is misguided in many ways.
00:32:29.580 | But it speaks to the fact that people
00:32:31.660 | are grappling with this idea of giving control over
00:32:34.940 | to a machine.
00:32:36.260 | So how do you win the hearts and minds of the people
00:32:41.620 | that autonomy is something that could be a part of their lives?
00:32:45.540 | - I think you let them experience it.
00:32:47.660 | I think it's right.
00:32:50.460 | I think people should be skeptical.
00:32:52.780 | I think people should ask questions.
00:32:55.700 | I think they should doubt.
00:32:58.060 | Because this is something new and different.
00:33:00.980 | They haven't touched it yet.
00:33:02.100 | And I think it's perfectly reasonable.
00:33:05.380 | But at the same time, it's clear there's an opportunity
00:33:08.220 | to make the road safer.
00:33:09.180 | It's clear that we can improve access to mobility.
00:33:12.340 | It's clear that we can reduce the cost of mobility.
00:33:16.540 | And that once people try that and understand that it's safe
00:33:22.660 | and are able to use in their daily lives,
00:33:24.340 | I think it's one of these things that will just be obvious.
00:33:27.940 | And I've seen this practically in demonstrations
00:33:32.100 | that I've given where I've had people come in.
00:33:35.540 | And they're very skeptical.
00:33:38.380 | And they get in a vehicle.
00:33:39.740 | My favorite one is taking somebody out on the freeway.
00:33:42.460 | And we're on the 101 driving at 65 miles an hour.
00:33:45.900 | And after 10 minutes, they turn and ask, is that all it does?
00:33:49.380 | And you're like, it's a self-driving car.
00:33:51.940 | I'm not sure exactly what you thought it would do.
00:33:54.700 | But it becomes mundane, which is exactly what you
00:34:00.500 | want a technology like this to be.
00:34:02.540 | We don't really-- when I turn the light switch on in here,
00:34:07.140 | I don't think about the complexity of those electrons
00:34:11.860 | being pushed down a wire from wherever it was
00:34:14.020 | and being generated.
00:34:16.860 | I just get annoyed if it doesn't work.
00:34:19.660 | And what I value is the fact that I can
00:34:21.620 | do other things in this space.
00:34:22.940 | I can see my colleagues.
00:34:24.420 | I can read stuff on a paper.
00:34:26.020 | I can not be afraid of the dark.
00:34:30.180 | And I think that's what we want this technology to be like,
00:34:33.340 | is it's in the background.
00:34:34.660 | And people get to have those life experiences
00:34:37.140 | and do so safely.
00:34:38.460 | - So putting this technology in the hands of people
00:34:42.180 | speaks to scale of deployment.
00:34:46.340 | So what do you think-- the dreaded question
00:34:50.260 | about the future, because nobody can predict the future.
00:34:53.580 | But just maybe speak poetically about when
00:34:57.940 | do you think we'll see a large scale deployment
00:35:00.940 | of autonomous vehicles, 10,000, those kinds of numbers.
00:35:06.700 | - We'll see that within 10 years.
00:35:09.380 | I'm pretty confident.
00:35:14.100 | - What's an impressive scale?
00:35:16.300 | What moment-- so you've done DARPA Challenge
00:35:19.220 | where there's one vehicle.
00:35:20.460 | At which moment does it become, wow, this is serious scale?
00:35:24.860 | - So I think the moment it gets serious
00:35:26.540 | is when we really do have a driverless vehicle operating
00:35:32.980 | on public roads.
00:35:35.020 | And that we can do that kind of continuously.
00:35:37.940 | - Without a safety driver.
00:35:38.940 | - Without a safety driver in the vehicle.
00:35:40.660 | I think at that moment, we've kind of crossed the 0 to 1
00:35:43.620 | threshold.
00:35:45.940 | And then it is about how do we continue to scale that?
00:35:50.220 | How do we build the right business models?
00:35:53.940 | How do we build the right customer experience around it
00:35:56.260 | so that it is actually a useful product out in the world?
00:36:01.060 | And I think that is really--
00:36:03.580 | at that point, it moves from what
00:36:06.100 | is this kind of mixed science engineering project
00:36:09.180 | into engineering and commercialization
00:36:12.340 | and really starting to deliver on the value
00:36:15.820 | that we all see here and actually making
00:36:19.380 | that real in the world.
00:36:20.660 | - What do you think that deployment looks like?
00:36:22.580 | Where do we first see the inkling of no safety driver,
00:36:26.460 | one or two cars here and there?
00:36:28.620 | Is it on the highway?
00:36:29.780 | Is it in specific routes in the urban environment?
00:36:33.260 | - I think it's going to be urban, suburban type
00:36:36.020 | environments.
00:36:37.940 | With Aurora, when we thought about how to tackle this,
00:36:42.420 | it was kind of en vogue to think about trucking as opposed
00:36:46.700 | to urban driving.
00:36:47.780 | And again, the human intuition around this
00:36:51.300 | is that freeways are easier to drive on because everybody's
00:36:57.820 | kind of going in the same direction
00:36:59.260 | and the lanes are a little wider, et cetera.
00:37:01.460 | And I think that that intuition is pretty good,
00:37:03.420 | except we don't really care about most of the time.
00:37:06.020 | We care about all of the time.
00:37:08.620 | And when you're driving on a freeway with a truck,
00:37:10.820 | say, at 70 miles an hour, and you've
00:37:14.740 | got 70,000 pound load with you, that's
00:37:16.340 | just an incredible amount of kinetic energy.
00:37:19.060 | And so when that goes wrong, it goes really wrong.
00:37:22.580 | And those challenges that you see occur more rarely,
00:37:27.740 | so you don't get to learn as quickly.
00:37:30.980 | And they're incrementally more difficult than urban driving,
00:37:34.660 | but they're not easier than urban driving.
00:37:37.420 | And so I think this happens in moderate speed urban
00:37:41.900 | environments because if two vehicles crash
00:37:45.220 | at 25 miles per hour, it's not good,
00:37:48.020 | but probably everybody walks away.
00:37:50.980 | And those events where there's the possibility for that
00:37:53.900 | occurring happen frequently.
00:37:55.700 | So we get to learn more rapidly.
00:37:57.900 | We get to do that with lower risk for everyone.
00:38:02.420 | And then we can deliver value to people
00:38:04.220 | that need to get from one place to another.
00:38:06.020 | And then once we've got that solved,
00:38:08.140 | then the freeway driving part of this just falls out.
00:38:11.260 | But we're able to learn more safely, more quickly
00:38:13.580 | in the urban environment.
00:38:15.180 | - So 10 years and then scale 20, 30 years.
00:38:18.500 | I mean, who knows, if a sufficiently compelling experience
00:38:22.020 | is created, it could be faster or slower.
00:38:24.340 | Do you think there could be breakthroughs
00:38:27.140 | and what kind of breakthroughs might there be
00:38:29.900 | that completely changed that timeline?
00:38:32.380 | Again, not only am I asking you to predict the future,
00:38:35.340 | I'm asking you to predict breakthroughs
00:38:37.300 | that haven't happened yet.
00:38:38.300 | - So what's the, I think another way to ask that
00:38:40.740 | would be if I could wave a magic wand,
00:38:44.260 | what part of the system would I make work today
00:38:46.660 | to accelerate it as quickly as possible?
00:38:49.420 | - Don't say infrastructure, please don't say infrastructure.
00:38:54.140 | - No, it's definitely not infrastructure.
00:38:56.260 | It's really that perception forecasting capability.
00:39:00.540 | So if tomorrow you could give me a perfect model
00:39:04.780 | of what's happened, what is happening
00:39:06.900 | and what will happen for the next five seconds
00:39:09.180 | around a vehicle on the roadway,
00:39:12.980 | that would accelerate things pretty dramatically.
00:39:15.300 | - Are you, in terms of staying up at night,
00:39:17.540 | are you mostly bothered by cars, pedestrians or cyclists?
00:39:21.700 | - So I worry most about the vulnerable road users
00:39:25.900 | about the combination of cyclists and pedestrians, right?
00:39:28.020 | Just because they're not in armor.
00:39:31.880 | The cars, they're bigger,
00:39:35.340 | they've got protection for the people
00:39:37.020 | and so the ultimate risk is lower there.
00:39:41.060 | Whereas a pedestrian or cyclist,
00:39:43.260 | they're out in the road and they don't have any protection
00:39:46.500 | and so we need to pay extra attention to that.
00:39:49.740 | - Do you think about a very difficult technical challenge
00:39:54.100 | of the fact that pedestrians,
00:39:58.540 | if you try to protect pedestrians by being careful and slow,
00:40:03.100 | they'll take advantage of that.
00:40:04.580 | So the game theoretic dance,
00:40:07.460 | does that worry you of how, from a technical perspective,
00:40:10.860 | how we solve that?
00:40:12.500 | 'Cause as humans, the way we solve that
00:40:14.580 | is kind of nudge our way through the pedestrians,
00:40:17.260 | which doesn't feel, from a technical perspective,
00:40:20.020 | as an appropriate algorithm.
00:40:22.320 | But do you think about how we solve that problem?
00:40:25.940 | - Yeah, I think there's,
00:40:27.540 | I think that was actually,
00:40:29.860 | there's two different concepts there.
00:40:31.420 | So one is, am I worried that
00:40:35.020 | because these vehicles are self-driving,
00:40:36.580 | people will kind of step in the road
00:40:37.660 | and take advantage of them?
00:40:38.660 | And I've heard this and I don't really believe it
00:40:43.660 | because if I'm driving down the road
00:40:45.940 | and somebody steps in front of me, I'm going to stop.
00:40:48.900 | Right, like even if I'm annoyed,
00:40:52.740 | I'm not gonna just drive through a person
00:40:54.140 | stood in the road.
00:40:54.980 | - Right.
00:40:56.420 | - And so I think today, people can take advantage of this
00:41:00.380 | and you do see some people do it.
00:41:02.540 | I guess there's an incremental risk
00:41:04.180 | because maybe they have lower confidence
00:41:05.880 | that I'm gonna see them than they might have
00:41:07.740 | for an automated vehicle.
00:41:09.300 | And so maybe that shifts it a little bit.
00:41:12.020 | But I think people don't want to get hit by cars.
00:41:14.340 | And so I think that I'm not that worried
00:41:17.060 | about people walking out of the 101
00:41:18.740 | and creating chaos more than they would today.
00:41:21.780 | Regarding kind of the nudging through
00:41:26.260 | a big stream of pedestrians,
00:41:27.540 | leaving a concert or something,
00:41:29.100 | I think that is further down the technology pipeline.
00:41:33.420 | I think that you're right, that's tricky.
00:41:36.900 | I don't think it's necessarily,
00:41:38.580 | I think the algorithm people use for this is pretty simple.
00:41:43.340 | Right, it's kind of just move forward slowly
00:41:44.740 | and if somebody's really close, then stop.
00:41:47.580 | And I think that that probably
00:41:49.940 | can be replicated pretty easily.
00:41:52.740 | And particularly given that you don't do this
00:41:54.780 | at 30 miles an hour, you do it at one,
00:41:57.220 | that even in those situations,
00:41:59.060 | the risk is relatively minimal.
00:42:01.180 | But it's not something we're thinking about
00:42:03.580 | in any serious way.
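A minimal sketch of the simple creep-and-stop behavior described above; the speeds, distances, and function name are assumptions for illustration, not anyone's production logic.

```python
CREEP_SPEED_MPS = 0.5      # roughly 1 mile per hour
STOP_DISTANCE_M = 1.5      # stop if a pedestrian is within this range

def nudge_through_crowd(nearest_pedestrian_distance_m: float) -> float:
    """Return a commanded speed: creep when clear, stop when someone is close."""
    if nearest_pedestrian_distance_m < STOP_DISTANCE_M:
        return 0.0
    return CREEP_SPEED_MPS

print(nudge_through_crowd(3.0))  # 0.5 -> creep forward
print(nudge_through_crowd(1.0))  # 0.0 -> stop and wait
```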
00:42:04.540 | - And probably that's less an algorithm problem
00:42:07.900 | and more creating a human experience.
00:42:10.140 | So the HCI people that create a visual display
00:42:14.300 | that you're pleasantly as a pedestrian
00:42:16.260 | nudged out of the way.
00:42:17.420 | - Yes.
00:42:18.260 | - That's an experience problem, not an algorithm problem.
00:42:21.980 | Who's the main competitor to Aurora today?
00:42:25.460 | And how do you out-compete them in the long run?
00:42:28.620 | - So we really focus a lot on what we're doing here.
00:42:31.180 | I think that, you know, I've said this a few times,
00:42:34.420 | that this is a huge, difficult problem
00:42:37.940 | and it's great that a bunch of companies are tackling it
00:42:40.260 | because I think it's so important for society
00:42:42.300 | that somebody gets there.
00:42:43.780 | So we, you know, we don't spend a whole lot of time
00:42:48.300 | thinking tactically about who's out there
00:42:51.540 | and how do we beat that person individually.
00:42:55.220 | What are we trying to do to go faster ultimately?
00:42:58.660 | Well, part of it is the leisure team we have
00:43:02.620 | has got pretty tremendous experience.
00:43:04.220 | And so we kind of understand the landscape
00:43:06.460 | and understand where the cul-de-sacs are to some degree.
00:43:09.180 | And, you know, we try and avoid those.
00:43:11.100 | I think there's a part of it,
00:43:14.260 | just this great team we've built.
00:43:16.260 | People, this is a technology and a company
00:43:19.100 | that people believe in the mission of.
00:43:22.340 | And so it allows us to attract
00:43:23.740 | just awesome people to go work.
00:43:25.700 | We've got a culture, I think, that people appreciate
00:43:29.340 | that allows them to focus,
00:43:30.460 | allows them to really spend time solving problems.
00:43:33.100 | And I think that keeps them energized.
00:43:35.900 | And then we've invested hard,
00:43:38.100 | invested heavily in the infrastructure and architectures
00:43:43.940 | that we think will ultimately accelerate us.
00:43:46.540 | So because of the folks we're able to bring in early on,
00:43:50.660 | because of the great investors we have,
00:43:53.300 | you know, we don't spend all of our time doing demos
00:43:56.780 | and kind of leaping from one demo to the next.
00:43:58.660 | We've been given the freedom to invest in infrastructure
00:44:03.660 | to do machine learning,
00:44:05.500 | infrastructure to pull data from our on-road testing,
00:44:08.580 | infrastructure to use that to accelerate engineering.
00:44:11.500 | And I think that early investment
00:44:14.460 | and continuing investment in those kinds of tools
00:44:17.340 | will ultimately allow us to accelerate
00:44:19.780 | and do something pretty incredible.
00:44:21.940 | - Chris, beautifully put.
00:44:23.420 | It's a good place to end.
00:44:24.660 | Thank you so much for talking today.
00:44:26.500 | - Thank you very much.
00:44:27.340 | Really enjoyed it.
00:44:28.340 | (upbeat music)