Chris Urmson: Self-Driving Cars at Aurora, Google, CMU, and DARPA | Lex Fridman Podcast #28
Chapters
0:00 Intro
1:09 What have you learned from these races
2:11 Did you believe it could be done
3:04 Biggest pain points
4:17 The DARPA Challenge
5:12 Leadership Lessons
6:17 Technical Evolution
9:22 Urban Challenge
10:44 Perception
11:54 Where do we go from here
12:53 Hardest part about driving
13:56 LiDAR
14:33 LiDAR is a crutch
16:42 Cost
19:11 Basic Economics
22:33 Safety Critical Concerns
24:26 Level 2 Vehicles
26:18 What's Going to Change
27:09 Safety
29:34 Safety Metrics
31:59 Public Perception
34:38 The Future
36:20 Deployment
38:15 Breakthroughs
39:49 Pedestrian protection
42:05 Competitors
00:00:00.000 |
The following is a conversation with Chris Urmson. 00:00:03.120 |
He was the CTO of the Google self-driving car team, 00:00:06.040 |
a key engineer and leader behind the Carnegie Mellon 00:00:08.880 |
University autonomous vehicle entries in the DARPA Grand 00:00:12.000 |
Challenges, and the winner of the DARPA Urban Challenge. 00:00:25.960 |
and Drew Bagnell, Uber's former autonomy and perception lead. 00:00:30.120 |
Chris is one of the top roboticists and autonomous 00:00:32.880 |
vehicle experts in the world, and a longtime voice of reason 00:00:37.360 |
in a space that is shrouded in both mystery and hype. 00:00:41.320 |
He both acknowledges the incredible challenges involved 00:00:44.040 |
in solving the problem of autonomous driving, 00:00:54.720 |
give it five stars on iTunes, support it on Patreon, 00:01:03.240 |
And now, here's my conversation with Chris Urmson. 00:01:09.120 |
You were part of both the DARPA Grand Challenge 00:01:11.960 |
and the DARPA Urban Challenge teams at CMU with Red Whittaker. 00:01:22.240 |
I think the high order bit was that it could be done. 00:01:30.200 |
incredible about the first, the Grand Challenges, 00:01:34.880 |
that I remember I was a grad student at Carnegie Mellon, 00:01:45.360 |
seemed really hard, so that would be cool and interesting. 00:01:48.800 |
But at the time, we were the only robotics institute around, 00:01:52.800 |
and so if we went into it and fell on our faces, 00:02:12.320 |
But at which point did you believe it was possible? 00:02:36.320 |
We didn't know how we were going to be able to do it. 00:02:38.440 |
We didn't know if we'd be able to do it the first time. 00:02:48.400 |
I think there's a certain benefit to naivete, right? 00:02:52.960 |
That if you don't know how hard something really is, 00:02:55.440 |
you try different things and it gives you an opportunity 00:03:15.320 |
I think that's the joy of this field, is that it's all hard. 00:03:20.080 |
And that you have to be good at each part of it. 00:03:25.320 |
So for the first Grand Challenges, if I look back at it from today, 00:03:38.920 |
There weren't other actors moving through it, 00:03:42.480 |
It was out in the desert, so you get really good GPS. 00:03:51.400 |
And so in retrospect, now it's within the realm of things 00:04:00.840 |
to get the vehicle so that we could control it and drive it. 00:04:04.800 |
That's still a pain today, but it was even more so back then. 00:04:09.560 |
And then the uncertainty of exactly what they wanted us 00:04:17.000 |
>> Right, you didn't actually know the track heading in. 00:04:19.400 |
You knew approximately, but you didn't actually 00:04:21.480 |
know the route, the route that would be taken. 00:04:25.280 |
We didn't even really, the way the rules had been described, 00:04:33.320 |
the idea was that the government would give us, 00:04:36.960 |
DARPA would give us a set of waypoints and kind of the width 00:04:42.360 |
that you had to stay within between the line that 00:04:46.840 |
And so the most devious thing they could have done 00:04:49.280 |
is set a kilometer-wide corridor across a field of scrub brush 00:04:58.480 |
Fortunately, it really turned into basically driving 00:05:01.880 |
along a set of trails, which is much more relevant 00:05:08.720 |
But no, it was a hell of a thing back in the day. 00:05:12.040 |
>> So the legend, Red, was kind of leading that effort 00:05:22.000 |
What have you learned from Red about leadership? 00:05:31.040 |
That's where there is an incredible opportunity. 00:05:36.560 |
is to see people for who they can be, not who they are. 00:05:43.720 |
one of the deepest lessons I learned from Red 00:05:46.080 |
was that he would look at undergraduates or graduate 00:05:56.120 |
to have responsibility, to do great things that I think 00:06:04.920 |
oh, well, that's just an undergraduate student. 00:06:08.760 |
And so I think that kind of trust, but verify, 00:06:12.760 |
have confidence in what people can become, I think, 00:06:20.480 |
Can you maybe talk through the technical evolution 00:06:24.200 |
of autonomous vehicle systems from the first two 00:06:27.480 |
Grand Challenges to the Urban Challenge to today? 00:06:40.920 |
So for the Grand Challenge, the real technology 00:06:51.440 |
Prior to that, a lot of the off-road robotics work 00:06:55.240 |
had been done without any real prior model of what 00:07:01.440 |
And so that innovation, the fact that we could get decimeter 00:07:17.440 |
the complexity of the driving problem the vehicle had 00:07:21.080 |
because we could assume things about the environment 00:07:31.280 |
For the Urban Challenge, one of the big technological 00:07:42.000 |
and being able to generate high resolution, mid to long range 00:07:47.480 |
3D models of the world and use that for understanding 00:07:54.120 |
And that was really a game changing technology. 00:08:02.880 |
of other technologies that had been kind of converging 00:08:18.560 |
You would go to a conference a couple of years before that 00:08:25.720 |
And so seeing those Bayesian estimation techniques 00:08:38.800 |
And mostly SLAM was done based on LIDAR at that time. 00:08:42.440 |
And in fact, we weren't really doing SLAM per se in real time 00:09:00.760 |
from kind of naively trusting GPS, INS before that. 00:09:07.160 |
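To make that contrast concrete, here is a minimal sketch, not the Tartan Racing code, of the kind of Bayesian estimation being described: a one-dimensional Kalman filter that fuses dead-reckoned odometry/INS motion with noisy GPS fixes instead of trusting either source on its own. All noise values and measurements below are invented for illustration.

```python
# Minimal 1D Kalman filter fusing odometry (predict) with GPS fixes (update).
# Illustrative only: noise parameters and measurements are made-up numbers.

def kalman_step(x, p, odom_delta, gps_pos, q=0.05, r=2.0):
    """One predict/update cycle for position along a road (meters).

    x, p       : current position estimate and its variance
    odom_delta : distance traveled since the last step, from wheel odometry / INS
    gps_pos    : noisy GPS position measurement, or None if there is no fix
    q, r       : process and measurement noise variances (illustrative)
    """
    # Predict: dead-reckon forward and grow the uncertainty.
    x, p = x + odom_delta, p + q
    # Update: blend in the GPS fix, weighted by relative uncertainty.
    if gps_pos is not None:
        k = p / (p + r)            # Kalman gain
        x = x + k * (gps_pos - x)
        p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0
for odom, gps in [(1.0, 1.3), (1.1, None), (0.9, 3.4)]:
    x, p = kalman_step(x, p, odom, gps)
    print(f"position ~ {x:.2f} m (variance {p:.2f})")
```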
And again, lots of work had been going on in this field. 00:09:10.480 |
Certainly this was not doing anything particularly 00:09:23.080 |
So for the urban challenge, those were already 00:09:33.600 |
so they had their own different approaches there? 00:09:36.480 |
Or did everybody kind of share that information, at least 00:09:43.520 |
So DARPA gave all the teams a model of the world, a map. 00:09:49.600 |
And then one of the things that we had to figure out back then 00:09:56.160 |
trips people up today-- is actually the coordinate system. 00:10:05.160 |
don't really care about the ellipsoid of the Earth that's 00:10:09.520 |
But when you want to get to 10 centimeter or centimeter 00:10:12.760 |
resolution, you care whether the coordinate system is NAD83 00:10:12.760 |
both the kind of non-sphericalness of the Earth, 00:10:30.640 |
I think-- I can't remember which one-- the tectonic shifts 00:10:33.040 |
that are happening and how to transform the global datum 00:10:37.080 |
So getting a map and then actually matching it 00:10:41.920 |
that was kind of interesting and fun back then. 00:10:44.080 |
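The datum issue he mentions is easy to poke at with off-the-shelf tooling. A hedged sketch using the pyproj library (my own example, not anything the teams used): interpret one latitude/longitude pair first as WGS84 and then as NAD83, project both into a metric UTM frame, and measure the offset. In North America the real difference is on the order of a meter, though the number pyproj prints depends on which datum transformation PROJ selects; with only the default ballpark transform it can even report zero, which is exactly the kind of silent assumption that bites at centimeter accuracy.

```python
# Sketch: the same numeric lat/lon interpreted under two datums (WGS84 vs NAD83)
# does not land on the same spot on the ground. Requires pyproj.
from pyproj import Transformer

lat, lon = 40.4406, -79.9959  # hypothetical test point near Pittsburgh

# Project into UTM zone 17N (meters), once from each geographic datum.
wgs84_to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32617", always_xy=True)
nad83_to_utm = Transformer.from_crs("EPSG:4269", "EPSG:32617", always_xy=True)

x1, y1 = wgs84_to_utm.transform(lon, lat)
x2, y2 = nad83_to_utm.transform(lon, lat)

offset = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
print(f"offset from treating the datums as interchangeable: {offset:.3f} m")
```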
So how much work was the perception doing there? 00:10:46.840 |
So how much were you relying on localization based on maps 00:10:52.480 |
without using perception to register to the maps? 00:11:01.960 |
We're more than a decade since the urban challenge. 00:11:36.880 |
might be going right or it might be going straight 00:11:41.560 |
And we had to deal with the challenge of the fact 00:11:44.160 |
that our behavior was going to impact the behavior 00:11:48.960 |
And we did a lot of that in relatively naive ways. 00:12:00.000 |
where does that take us today from that artificial city 00:12:03.600 |
construction to real cities to the urban environment? 00:12:07.000 |
Yeah, I think the biggest thing is that the actors are truly 00:12:17.720 |
the drivers on the road, the other road users are out there 00:12:23.040 |
behaving well, but every once in a while, they're not. 00:12:33.320 |
In terms of behavior or in terms of perception or both? 00:12:38.720 |
Back then, we didn't have to deal with cyclists. 00:12:46.280 |
The scale over which that you have to operate 00:12:56.320 |
what do you think is the hardest part about driving? 00:13:03.160 |
I'm sure nothing really jumps out at you as one thing, 00:13:07.400 |
but in the jump from the urban challenge to the real world, 00:13:15.320 |
you foresee a very serious, difficult challenge? 00:13:21.120 |
is that we're doing it for real, that in that environment, 00:13:29.000 |
it was both a limited complexity environment, 00:13:31.840 |
because certain actors weren't there, because the roads were 00:13:35.360 |
There were barriers keeping people separate from robots 00:13:43.960 |
looking at it from 2006, it had to work for 60 miles. 00:13:52.680 |
that will go and drive for half a million miles. 00:14:01.040 |
So how important-- you said LiDAR came into the game 00:14:10.240 |
So how important is the role of LiDAR in the sensor 00:14:18.640 |
But I also believe the cameras are essential, 00:14:26.240 |
the composition of data from these different sensors 00:14:35.760 |
is what are your thoughts on the Elon Musk provocative statement 00:14:40.200 |
that LiDAR is a crutch, that is a kind of, I guess, 00:14:45.840 |
growing pains, and that much of the perception task 00:14:52.160 |
So I think it is undeniable that people walk around 00:14:59.640 |
And they can get into vehicles and drive them. 00:15:16.400 |
There's an example that we all go do it, many of us, every day. 00:15:28.160 |
But in the same way that the combustion engine was 00:15:35.200 |
in the same way that any technology ultimately 00:15:38.560 |
gets replaced by some superior technology in the future. 00:15:47.720 |
is that the way we get around on the ground, the way that we 00:15:58.200 |
I think the number I saw this morning, 37,000 Americans 00:16:08.040 |
bring to bear that accelerates this self-driving technology 00:16:18.300 |
And it feels just arbitrary to say, well, I'm 00:16:23.680 |
not OK with using lasers, because that's whatever. 00:16:27.760 |
But I am OK with using an 8 megapixel camera or a 16 00:16:36.340 |
from the tool bin that allows us to go and solve a problem. 00:16:48.300 |
And if there's one word that comes up more often 00:16:51.340 |
than anything, it's cost, and trying to drive cost down. 00:16:55.300 |
So while it's true that it's a tragic number, the 37,000, 00:17:06.040 |
But we want to find the cheapest sensor suite that 00:17:18.220 |
do you foresee LiDAR coming down in cost in the future? 00:17:35.060 |
talk to the question you asked about the cheapest sensor. 00:17:40.580 |
What you want is a sensor suite that is economically viable. 00:17:45.780 |
And then after that, everything is about margin and driving 00:17:52.340 |
What you also want is a sensor suite that works. 00:17:55.460 |
And so it's great to tell a story about how it would be 00:18:00.940 |
better to have a self-driving system with a $50 sensor 00:18:08.780 |
But if the $500 sensor makes it work and the $50 sensor 00:18:15.700 |
So long as you can actually have an economic-- 00:18:23.740 |
because that's how you actually have a sustainable business. 00:18:27.780 |
And that's how you can actually see this come to scale 00:18:32.540 |
And so when I look at LiDAR, I see a technology 00:18:37.220 |
that has no underlying fundamental expense to it. 00:18:42.460 |
It's going to be more expensive than an imager, 00:18:51.420 |
are dramatically more scalable than mechanical processes. 00:18:56.220 |
But we still should be able to drive cost out 00:19:00.460 |
And then I also do think that with the right business model, 00:19:05.940 |
you can absorb certainly more cost on the bill of materials. 00:19:09.540 |
Yeah, if the sensor suite works, extra value is provided. 00:19:12.620 |
Thereby, you don't need to drive cost down to zero. 00:19:18.860 |
that level two autonomy is problematic because 00:19:26.540 |
complacency, overtrust, and so on, just us being human. 00:19:29.660 |
We overtrust the system, we start doing even more 00:19:33.060 |
so partaking in the secondary activities like smartphones 00:19:38.820 |
Have your views evolved on this point in either direction? 00:19:44.820 |
So I want to be really careful, because sometimes this 00:19:48.300 |
gets twisted in a way that I certainly didn't intend. 00:19:53.060 |
So active safety systems are a really important technology 00:19:59.420 |
that we should be pursuing and integrating into vehicles. 00:20:13.460 |
Level two systems are systems where the vehicle 00:20:18.180 |
is controlling two axes, so braking and throttle, and steering. 00:20:24.900 |
And I think there are variants of level two systems 00:20:27.140 |
that are supporting the driver that absolutely we should 00:20:39.180 |
and the misconception from the public around the capability 00:20:48.060 |
And that is where I am actually incrementally more concerned 00:20:54.340 |
around level three systems and how exactly a level two 00:21:00.020 |
And how much effort people have put into those human factors. 00:21:03.260 |
So I still believe several things around this. 00:21:10.780 |
We've seen over the last few weeks a spate of people 00:21:16.340 |
I watched an episode last night of Trevor Noah 00:21:24.380 |
And him, this is a smart guy who has a lot of resources 00:21:28.940 |
at his disposal, describing a Tesla as a self-driving car. 00:21:32.940 |
And asking why shouldn't people be sleeping in their Tesla? 00:21:35.700 |
It's like, well, because it's not a self-driving car 00:21:41.100 |
And these people will almost certainly die at some point 00:21:50.460 |
And so we need to really be thoughtful about how 00:21:52.600 |
that technology is described and brought to market. 00:21:56.260 |
I also think that because of the economic challenges 00:22:01.380 |
we were just talking about, that technology path will-- 00:22:07.260 |
that technology path will diverge from the technology 00:22:10.300 |
path that we need to be on to actually deliver truly 00:22:16.180 |
Ones where you can get in it and sleep and have 00:22:19.600 |
the equivalent or better safety than a human driver 00:22:32.700 |
So you just don't see the economics of gradually 00:22:36.900 |
increasing from level two and doing so quickly enough 00:22:41.500 |
to where it doesn't cause safety, critical safety 00:22:44.460 |
You believe that it needs to diverge at this point 00:22:50.520 |
And really that comes back to what are those L2 and L1 00:22:59.760 |
where the people that are marketing that responsibly 00:23:04.320 |
are being very clear and putting human factors in place such 00:23:08.320 |
that the driver is actually responsible for the vehicle. 00:23:12.360 |
And that the technology is there to support the driver. 00:23:15.160 |
And the safety cases that are built around those 00:23:19.860 |
are dependent on that driver attention and attentiveness. 00:23:24.260 |
And at that point, you can kind of give up to some degree 00:23:36.180 |
is for a collision mitigation braking system, 00:23:47.660 |
that would be an incredible, incredible advance 00:23:54.940 |
But it would mean that if that vehicle wasn't being monitored, 00:24:00.580 |
And so economically, that's a perfectly good solution 00:24:09.220 |
is drive the cost out of that so you can get it on as 00:24:13.300 |
But driving the cost out of it doesn't drive up performance 00:24:18.820 |
And so you'll continue to not have a technology that 00:24:21.580 |
could really be available for a self-driven vehicle. 00:24:25.700 |
So clearly, the communication-- and this probably 00:24:34.420 |
of what the technology is actually capable of, 00:24:37.060 |
how hard it is, how easy it is, all that kind of stuff 00:24:41.100 |
So say everybody in the world was perfectly communicated 00:24:48.340 |
of every single technology out there, what it's able to do. 00:24:54.140 |
And now we're maybe getting into philosophical ground. 00:25:09.540 |
--and internalized it, then sure, you could do that safely. 00:25:17.820 |
they're going to-- if the facts are put in front of them, 00:25:20.700 |
they're going to then combine that with their experience. 00:25:37.380 |
At that point, just even if you know the statistics, 00:25:49.940 |
And the problem is that that sample size that they have-- 00:25:53.940 |
So 60 miles times 30 days, so 60, 180, 1,800 miles. 00:26:01.700 |
That's a drop in the bucket compared to the, what, 00:26:07.580 |
And so they don't really have a true estimate 00:26:11.340 |
based on their personal experience of the real risks. 00:26:13.980 |
But they're going to trust it anyway, because it's 00:26:18.700 |
-So even if you start with a perfect understanding of the system, 00:26:32.900 |
from what I would say is kind of the more technology-savvy 00:26:42.620 |
you may be able to have some of those folks who are really 00:26:48.820 |
And your immunization against this kind of false risk 00:27:02.180 |
I think there it's going to move more quickly. 00:27:08.180 |
-So your work, the program that you created at Google 00:27:15.100 |
on the second path of creating full autonomy. 00:27:22.180 |
it's one of the most interesting AI problems of the century. 00:27:26.860 |
I just talked to a lot of people, just regular people, 00:27:29.180 |
I don't know, my mom, about autonomous vehicles. 00:27:34.540 |
of giving control of your life over to a machine. 00:27:51.900 |
that an autonomous vehicle, an Aurora system, is safe? 00:27:56.140 |
-This is one where it's difficult because there isn't 00:28:08.300 |
And this is where something like a functional safety process 00:28:20.340 |
is the way we did it, then you can have some confidence 00:28:22.680 |
that we were thorough in the engineering work 00:28:38.180 |
in terms of demonstrating that the capabilities worked 00:28:43.900 |
And to whatever degree, we can demonstrate that, 00:28:50.260 |
some combination of unit testing and decomposition testing. 00:28:54.780 |
And then some part of it will be on-road data. 00:29:04.380 |
be clearly some conversation with the public about it. 00:29:14.020 |
able to go into more depth with folks like NHTSA 00:29:17.500 |
and other federal and state regulatory bodies. 00:29:26.220 |
that if we can show enough work to them that they're convinced, 00:29:35.100 |
are essentially experts at safety to try to discuss. 00:29:40.660 |
the answer is probably no, but just in case-- 00:29:50.100 |
to sort of alter the experiments you run to adjust. 00:30:01.980 |
- Do you think it's possible to create a metric, a number, 00:30:06.620 |
that could demonstrate safety outside of fatalities? 00:30:13.500 |
And I think that it won't be just one number. 00:30:17.700 |
So as we are internally grappling with this-- 00:30:21.340 |
and at some point, we'll be able to talk more publicly about it-- 00:30:25.100 |
is how do we think about human performance in different tasks, 00:30:29.820 |
say, detecting traffic lights or safely making 00:30:40.020 |
are for those different capabilities for people? 00:30:44.780 |
and then ultimately folks in the regulatory role, 00:30:48.500 |
and then ultimately the public, that we have confidence 00:30:55.140 |
And so these individual metrics will tell a compelling story, 00:31:16.460 |
that people aren't having to pay to get their car fixed. 00:31:30.460 |
And then there's injuries and near-miss events and whatnot, 00:31:54.420 |
even at the scale of the US, to compare directly. 00:32:05.020 |
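As a purely illustrative sketch of what "not just one number" could look like (invented figures, not Aurora's methodology): per-capability event rates over a fleet's miles, each compared against an assumed human baseline, with a crude Poisson bound showing how little a small mileage sample can actually demonstrate.

```python
import math

# Invented illustrative numbers, not real data: events observed per capability
# over a fleet's driven miles, versus an assumed human baseline rate.
fleet_miles = 1.0e6
observations = {
    # capability: (events observed, assumed human events per million miles)
    "missed traffic light": (2, 5.0),
    "unprotected left incident": (1, 3.0),
}

for capability, (events, human_rate) in observations.items():
    rate = events / (fleet_miles / 1e6)               # events per million miles
    # Crude ~95% upper bound on the rate, treating events as Poisson.
    upper = (events + 1.96 * math.sqrt(events + 1)) / (fleet_miles / 1e6)
    verdict = "below" if upper < human_rate else "not yet distinguishable from"
    print(f"{capability}: {rate:.1f}/M mi (upper ~{upper:.1f}), "
          f"{verdict} assumed human rate {human_rate:.1f}/M mi")
```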
is significantly, at least currently, magnified. 00:32:15.100 |
I think the most popular topic about autonomous vehicles 00:32:18.220 |
in the public is the trolley problem formulation, right? 00:32:23.100 |
- Which has-- let's not get into that too much, 00:32:31.660 |
are grappling with this idea of giving control over 00:32:36.260 |
So how do you win the hearts and minds of the people 00:32:41.620 |
that autonomy is something that could be a part of their lives? 00:33:05.380 |
But at the same time, it's clear there's an opportunity 00:33:09.180 |
It's clear that we can improve access to mobility. 00:33:12.340 |
It's clear that we can reduce the cost of mobility. 00:33:16.540 |
And that once people try that and understand that it's safe 00:33:24.340 |
I think it's one of these things that will just be obvious. 00:33:27.940 |
And I've seen this practically in demonstrations 00:33:32.100 |
that I've given where I've had people come in. 00:33:39.740 |
My favorite one is taking somebody out on the freeway. 00:33:42.460 |
And we're on the 101 driving at 65 miles an hour. 00:33:45.900 |
And after 10 minutes, they turn and ask, is that all it does? 00:33:51.940 |
I'm not sure exactly what you thought it would do. 00:33:54.700 |
But it becomes mundane, which is exactly what you 00:34:02.540 |
We don't really-- when I turn the light switch on in here, 00:34:07.140 |
I don't think about the complexity of those electrons 00:34:11.860 |
being pushed down a wire from wherever it was 00:34:30.180 |
And I think that's what we want this technology to be like, 00:34:34.660 |
And people get to have those life experiences 00:34:38.460 |
- So putting this technology in the hands of people 00:34:50.260 |
about the future, because nobody can predict the future. 00:34:57.940 |
do you think we'll see a large scale deployment 00:35:00.940 |
of autonomous vehicles, 10,000, those kinds of numbers. 00:35:20.460 |
At which moment does it become, wow, this is serious scale? 00:35:26.540 |
is when we really do have a driverless vehicle operating 00:35:35.020 |
And that we can do that kind of continuously. 00:35:40.660 |
I think at that moment, we've kind of crossed the 0 to 1 00:35:45.940 |
And then it is about how do we continue to scale that? 00:35:53.940 |
How do we build the right customer experience around it 00:35:56.260 |
so that it is actually a useful product out in the world? 00:36:06.100 |
is this kind of mixed science engineering project 00:36:20.660 |
- What do you think that deployment looks like? 00:36:22.580 |
Where do we first see the inkling of no safety driver, 00:36:29.780 |
Is it in specific routes in the urban environment? 00:36:33.260 |
- I think it's going to be urban, suburban type 00:36:37.940 |
With Aurora, when we thought about how to tackle this, 00:36:42.420 |
it was kind of en vogue to think about trucking as opposed 00:36:51.300 |
is that freeways are easier to drive on because everybody's 00:37:01.460 |
And I think that that intuition is pretty good, 00:37:03.420 |
except we don't really care about most of the time. 00:37:08.620 |
And when you're driving on a freeway with a truck, 00:37:19.060 |
And so when that goes wrong, it goes really wrong. 00:37:22.580 |
And those challenges that you see occur more rarely, 00:37:30.980 |
And they're incrementally more difficult than urban driving, 00:37:37.420 |
And so I think this happens in moderate speed urban 00:37:50.980 |
And those events where there's the possibility for that 00:37:57.900 |
We get to do that with lower risk for everyone. 00:38:08.140 |
then the freeway driving part of this just falls out. 00:38:11.260 |
But we're able to learn more safely, more quickly 00:38:18.500 |
I mean, who knows if a sufficiently compelling experience 00:38:27.140 |
and what kind of breakthroughs might there be 00:38:32.380 |
Again, not only am I asking you to predict the future, 00:38:38.300 |
- So what's the, I think another way to ask that 00:38:44.260 |
what part of the system would I make work today 00:38:49.420 |
- Don't say infrastructure, please don't say infrastructure. 00:38:56.260 |
It's really that perception forecasting capability. 00:39:00.540 |
So if tomorrow you could give me a perfect model 00:39:06.900 |
and what will happen for the next five seconds 00:39:12.980 |
that would accelerate things pretty dramatically. 00:39:17.540 |
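For a toy picture of the "what will happen for the next five seconds" piece, here is a constant-velocity forecaster, my own minimal sketch rather than anything a production stack would use; real forecasting has to reason about maps, intent, and interaction between actors.

```python
from dataclasses import dataclass

@dataclass
class Actor:
    """A tracked road user in a local metric frame (illustrative fields)."""
    name: str
    x: float   # meters
    y: float
    vx: float  # meters per second
    vy: float

def forecast(actor: Actor, horizon_s: float = 5.0, dt: float = 0.5):
    """Naive constant-velocity forecast of the actor's position over the
    next horizon_s seconds. Only extrapolates the current velocity."""
    steps = int(horizon_s / dt)
    return [(actor.x + actor.vx * dt * k, actor.y + actor.vy * dt * k)
            for k in range(1, steps + 1)]

cyclist = Actor("cyclist", x=12.0, y=-3.0, vx=-4.0, vy=0.0)  # made-up state
for px, py in forecast(cyclist)[:3]:
    print(f"cyclist predicted at ({px:.1f}, {py:.1f}) m")
```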
are you mostly bothered by cars, pedestrians or cyclists? 00:39:21.700 |
- So I worry most about the vulnerable road users 00:39:25.900 |
about the combination of cyclists and cars, right? 00:39:43.260 |
they're out in the road and they don't have any protection 00:39:46.500 |
and so we need to pay extra attention to that. 00:39:49.740 |
- Do you think about a very difficult technical challenge 00:39:58.540 |
if you try to protect pedestrians by being careful and slow, 00:40:07.460 |
does that worry you of how, from a technical perspective, 00:40:14.580 |
is kind of nudge our way through the pedestrians, 00:40:17.260 |
which doesn't feel, from a technical perspective, 00:40:22.320 |
But do you think about how we solve that problem? 00:40:38.660 |
And I've heard this and I don't really believe it 00:40:45.940 |
and somebody steps in front of me, I'm going to stop. 00:40:56.420 |
- And so I think today, people can take advantage of this 00:41:12.020 |
But I think people don't want to get hit by cars. 00:41:18.740 |
and creating chaos more than they would today. 00:41:29.100 |
I think that is further down the technology pipeline. 00:41:38.580 |
I think the algorithm people use for this is pretty simple. 00:41:52.740 |
And particularly given that you don't do this 00:42:04.540 |
- And probably that's less an algorithm problem 00:42:10.140 |
So the HCI people that create a visual display 00:42:18.260 |
- That's an experience problem, not an algorithm problem. 00:42:25.460 |
And how do you out-compete them in the long run? 00:42:28.620 |
- So we really focus a lot on what we're doing here. 00:42:31.180 |
I think that, you know, I've said this a few times, 00:42:37.940 |
and it's great that a bunch of companies are tackling it 00:42:40.260 |
because I think it's so important for society 00:42:43.780 |
So we, you know, we don't spend a whole lot of time 00:42:55.220 |
What are we trying to do to go faster ultimately? 00:43:06.460 |
and understand where the cul-de-sacs are to some degree. 00:43:25.700 |
We've got a culture, I think, that people appreciate 00:43:30.460 |
allows them to really spend time solving problems. 00:43:38.100 |
invested heavily in the infrastructure and architectures 00:43:46.540 |
So because of the folks we're able to bring in early on, 00:43:53.300 |
you know, we don't spend all of our time doing demos 00:43:56.780 |
and kind of leaping from one demo to the next. 00:43:58.660 |
We've been given the freedom to invest in infrastructure 00:44:05.500 |
infrastructure to pull data from our on-road testing, 00:44:08.580 |
infrastructure to use that to accelerate engineering. 00:44:14.460 |
and continuing investment in those kinds of tools