
Kyle Vogt: Cruise Automation | Lex Fridman Podcast #14


Chapters

0:00 Introduction
0:55 High School Robotics
2:35 BattleBots
4:20 Wedges
5:22 Software
7:10 Programming
8:27 Artificial Intelligence
10:40 Deep Learning
11:51 Entrepreneurship
12:52 DARPA Grand Challenge
14:00 AI in Autonomous Vehicles
15:17 DARPA Challenges
16:46 Leaving MIT
18:05 No regrets
18:45 Brave decision
19:15 Failure
22:54 Cruise Automation
25:45 How to solve the problem
27:41 Retrofit
29:49 Detroit vs Silicon Valley
32:37 The culture gap
35:09 The biggest opportunity to make money
37:47 Personality of the car
39:44 Emotional release
45:32 Autonomous Vehicles
47:58 Building a Successful Startup
50:10 Y Combinator vs VC Route
51:54 Philosophical existential
53:48 What does 2019 hold for Cruise


00:00:00.000 | The following is a conversation with Kyle Vogt.
00:00:02.240 | He's the president and the CTO of Cruise Automation,
00:00:05.040 | leading an effort to solve one of the biggest
00:00:07.880 | robotics challenges of our time, vehicle automation.
00:00:10.800 | He's a co-founder of two successful companies,
00:00:13.040 | Twitch and Cruise, that have each sold for a billion dollars.
00:00:16.960 | And he's a great example of the innovative spirit
00:00:19.800 | that flourishes in Silicon Valley,
00:00:22.080 | and now is facing an interesting and exciting challenge
00:00:25.680 | of matching that spirit with the mass production
00:00:29.960 | and the safety-centric culture of a major automaker,
00:00:32.720 | like General Motors.
00:00:34.360 | This conversation is part of the MIT Artificial General
00:00:37.200 | Intelligence series,
00:00:38.480 | and the Artificial Intelligence podcast.
00:00:40.960 | If you enjoy it, please subscribe on YouTube, iTunes,
00:00:44.760 | or simply connect with me on Twitter @lexfridman,
00:00:47.560 | spelled F-R-I-D.
00:00:49.720 | And now, here's my conversation with Kyle Vogt.
00:00:54.520 | - You grew up in Kansas, right?
00:00:55.840 | - Yeah, and I just saw that picture you had
00:00:57.680 | hidden over there, so I'm a little bit worried
00:00:59.720 | about that now. - Nervous.
00:01:00.720 | So in high school in Kansas City,
00:01:02.440 | you joined Shawnee Mission North High School Robotics Team.
00:01:07.440 | Now, that wasn't your high school.
00:01:09.280 | - That's right, that was the only high school in the area
00:01:12.160 | that had a teacher who was willing to sponsor
00:01:14.840 | a FIRST Robotics team.
00:01:16.480 | - I was gonna troll you a little bit.
00:01:18.320 | - Jog your memory a little bit.
00:01:19.600 | - Yep, yep. - That kid.
00:01:20.440 | - I was trying to look super cool and intense,
00:01:22.960 | 'cause this was BattleBots,
00:01:24.240 | this is serious business.
00:01:25.640 | So we're standing there with a welded steel frame
00:01:28.840 | and looking tough.
00:01:30.240 | - So go back there, what is it that drew you to robotics?
00:01:33.800 | - Well, I think, I've been trying to figure this out
00:01:36.440 | for a while, but I've always liked building things
00:01:37.880 | with Legos, and when I was really, really young,
00:01:39.880 | I wanted the Legos that had motors and other things,
00:01:42.320 | and then, you know, Lego Mindstorms came out,
00:01:44.800 | and for the first time, you could program Lego contraptions,
00:01:48.200 | and I think things just sort of snowballed from that.
00:01:52.520 | But I remember seeing the BattleBots TV show
00:01:56.720 | on Comedy Central and thinking,
00:01:58.000 | that is the coolest thing in the world,
00:01:59.760 | I wanna be a part of that,
00:02:01.160 | and not knowing a whole lot about how to build
00:02:03.640 | these 200-pound fighting robots.
00:02:06.840 | So I sort of obsessively poured over the internet forums
00:02:10.960 | where all the creators for BattleBots
00:02:12.520 | would sort of hang out and talk about,
00:02:14.320 | you know, document their build progress and everything,
00:02:17.080 | and I think I read, I must've read like, you know,
00:02:20.480 | tens of thousands of forum posts
00:02:22.640 | from basically everything that was out there
00:02:24.880 | on what these people were doing,
00:02:26.360 | and eventually, like, sort of triangulated
00:02:28.040 | how to put some of these things together,
00:02:29.840 | and ended up doing BattleBots, which was, you know,
00:02:33.720 | I was like 13 or 14, which was pretty awesome.
00:02:35.920 | - I'm not sure if the show's still running,
00:02:37.640 | but so BattleBots is,
00:02:39.440 | there's not an artificial intelligence component,
00:02:42.680 | it's remotely controlled,
00:02:44.160 | and it's almost like a mechanical engineering challenge
00:02:46.680 | of building things that can't be broken.
00:02:49.520 | - They're radio controlled, so,
00:02:51.320 | and I think that they allowed some limited form of autonomy,
00:02:54.760 | but, you know, in a two-minute match,
00:02:56.560 | you're, and the way these things ran,
00:02:58.760 | you're really doing yourself a disservice
00:03:00.320 | by trying to automate it versus just, you know,
00:03:02.280 | do the practical thing, which is drive it yourself.
00:03:04.680 | - And there's an entertainment aspect,
00:03:06.880 | just going on YouTube, there's like,
00:03:08.880 | some of them wield an ax, some of them,
00:03:11.120 | I mean, there's that fun,
00:03:12.120 | so what drew you to that aspect?
00:03:13.720 | Was it the mechanical engineering?
00:03:15.320 | Was it the dream to create, like,
00:03:18.520 | Frankenstein and sentient being,
00:03:21.040 | or was it just like the Lego,
00:03:22.760 | you like tinkering with stuff?
00:03:23.920 | - I mean, that was just building something,
00:03:25.960 | I think the idea of, you know,
00:03:27.880 | this radio controlled machine that can do various things,
00:03:30.840 | if it has like a weapon or something was pretty interesting.
00:03:33.720 | I agree, it doesn't have the same appeal
00:03:35.520 | as, you know, autonomous robots,
00:03:37.240 | which I, you know, sort of gravitated towards later on,
00:03:40.280 | but it was definitely an engineering challenge,
00:03:42.640 | because everything you did in that competition
00:03:45.520 | was pushing components to their limits.
00:03:48.400 | So we would buy like these $40 DC motors
00:03:52.920 | that came out of a winch,
00:03:54.840 | like on the front of a pickup truck or something,
00:03:57.240 | and we'd power the car with those,
00:03:59.200 | and we'd run them at like double or triple
00:04:01.080 | their rated voltage.
00:04:02.440 | So they immediately start overheating,
00:04:04.120 | but for that two minute match,
00:04:05.320 | you can get, you know, a significant increase
00:04:07.920 | in the power output of those motors before they burn out.
00:04:10.520 | And so you're doing the same thing for your battery packs,
00:04:12.720 | all the materials in the system.
00:04:14.320 | And I think there was something,
00:04:15.520 | something intrinsically interesting about
00:04:18.120 | just seeing like where things break.
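The physics behind the overvolting trick is a line of arithmetic: modeled crudely as a resistive load (ignoring back-EMF), a DC motor's power scales with the square of the applied voltage, which is why a two-minute match can tolerate what continuous operation cannot. A toy sketch:

```python
# Back-of-envelope for the overvolting trick: treating a brushed DC
# motor as a roughly resistive load (an approximation that ignores
# back-EMF), electrical power scales with voltage squared.

def power_ratio(overvolt_factor: float) -> float:
    """Power relative to rated, for a given multiple of rated voltage."""
    return overvolt_factor ** 2

print(power_ratio(2.0))  # 4.0 -- double the voltage, ~4x the power
print(power_ratio(3.0))  # 9.0 -- triple it, ~9x the power
```

The same quadratic growth applies to dissipated heat, which is why the motors "immediately start overheating."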
00:04:20.360 | - And did you offline see where they break?
00:04:23.360 | Did you take it to the testing point?
00:04:25.040 | Like, how did you know two minutes?
00:04:26.120 | Or was there a reckless, let's just go with it and see.
00:04:29.680 | - We weren't very good at BattleBots.
00:04:31.320 | We lost all of our matches the first round.
00:04:32.640 | - What did it first look like?
00:04:34.200 | - The one I built first,
00:04:36.240 | both of them were these wedge shaped robots,
00:04:38.120 | 'cause a wedge, even though it's sort of boring to look at,
00:04:40.200 | is extremely effective.
00:04:41.240 | You drive towards another robot
00:04:42.560 | and the front edge of it gets under them,
00:04:44.720 | and then they sort of flip over,
00:04:46.760 | kind of like a door stopper.
00:04:48.240 | And the first one had a pneumatic
00:04:51.000 | polished stainless steel spike on the front
00:04:52.680 | that would shoot out about eight inches.
00:04:54.880 | - The purpose of which is what?
00:04:56.200 | - Pretty ineffective actually, but it looks cool.
00:04:58.720 | And--
00:04:59.560 | - Was it to help with the lift?
00:05:00.840 | - No, it was just to try to poke holes in the other robot.
00:05:04.040 | And then the second time I did it,
00:05:05.960 | which is the following, I think maybe 18 months later,
00:05:08.660 | we had a, well, a titanium ax
00:05:12.200 | with a hardened steel tip on it
00:05:14.380 | that was powered by a hydraulic cylinder,
00:05:17.200 | which we were activating with liquid CO2,
00:05:20.380 | which had its own set of problems.
00:05:23.880 | - So great, so that's kind of on the hardware side.
00:05:26.320 | I mean, at a certain point,
00:05:28.360 | there must have been born a fascination
00:05:31.240 | on the software side.
00:05:32.460 | So what was the first piece of code you've written?
00:05:35.520 | - Oh man, so--
00:05:36.360 | - Go back there, see what language was it?
00:05:38.620 | What was it, was it Emacs, Vim?
00:05:40.620 | Was it a more respectable modern IDE?
00:05:44.300 | Do you remember any of this?
00:05:45.800 | - Yeah, well, I remember,
00:05:47.800 | I think maybe when I was in third or fourth grade,
00:05:51.280 | the school I was at, elementary school,
00:05:52.460 | had a bunch of Apple II computers,
00:05:55.040 | and we'd play games on those.
00:05:56.680 | And I remember every once in a while,
00:05:57.760 | something would crash or wouldn't start up correctly,
00:06:01.320 | and it would dump you out to what I later learned
00:06:03.960 | was like sort of a command prompt.
00:06:05.800 | And my teacher would come over and type,
00:06:07.600 | I actually remember this to this day for some reason,
00:06:09.440 | like PR number six, or PR pound six,
00:06:12.160 | which is peripheral six, which is the disk drive,
00:06:13.840 | which would fire up the disk and load the program.
00:06:15.920 | And I just remember thinking, wow, she's like a hacker,
00:06:17.920 | like teach me these codes, error codes,
00:06:20.920 | is what I called them at the time.
00:06:22.740 | But she had no interest in that.
00:06:23.580 | And so it wasn't until I think about fifth grade
00:06:26.500 | that I had a school where you could actually
00:06:29.120 | go on these Apple IIs and learn to program.
00:06:30.620 | And so it was all in basic,
00:06:31.960 | where every line, the line numbers are all,
00:06:34.240 | or every line is numbered,
00:06:35.640 | and you have to leave enough space between the numbers
00:06:38.720 | so that if you wanna tweak your code,
00:06:40.800 | you go back and if the first line was 10,
00:06:42.600 | and the second line is 20,
00:06:44.040 | now you have to go back and insert 15.
00:06:45.640 | And if you need to add code in front of that,
00:06:47.920 | you know, 11 or 12,
00:06:48.960 | and you hope you don't run out of line numbers
00:06:50.200 | and have to redo the whole thing.
00:06:51.880 | - And there's go-to statements?
00:06:53.240 | - Yeah, go-to and it's very basic,
00:06:56.160 | maybe hence the name, but a lot of fun.
00:06:58.200 | And that was like, that was, you know,
00:07:00.800 | that's when, you know, when you first program,
00:07:02.600 | you see the magic of it.
00:07:03.560 | It's like, it just, just like this world opens up with,
00:07:06.640 | you know, endless possibilities
00:07:07.760 | for the things you could build
00:07:08.720 | or accomplish with that computer.
00:07:10.640 | - So you got the bug then.
00:07:11.800 | So even starting with basic,
00:07:13.360 | and then what, C++ throughout.
00:07:16.040 | What did you, was there a computer programming,
00:07:18.160 | computer science classes in high school?
00:07:19.840 | - Not where I went.
00:07:21.080 | So it was self-taught, but I did a lot of programming.
00:07:24.600 | The thing that, you know,
00:07:27.240 | sort of pushed me in the path of eventually working
00:07:29.360 | on self-driving cars is actually one of these
00:07:32.360 | really long trips driving from my house in Kansas
00:07:35.120 | to I think Las Vegas,
00:07:37.920 | where we did the BattleBots competition.
00:07:39.480 | And I had just gotten my,
00:07:41.480 | I think my learner's permit or early driver's permit.
00:07:45.040 | And so I was driving this, you know,
00:07:47.400 | 10 hour stretch across Western Kansas,
00:07:49.840 | where it's just, you're going straight on a highway
00:07:51.800 | and it is mind numbingly boring.
00:07:53.640 | And I remember thinking even then
00:07:54.960 | with my sort of mediocre programming background
00:07:58.080 | that this is something that a computer can do, right?
00:08:00.040 | Let's take a picture of the road
00:08:01.400 | and let's find the yellow lane markers
00:08:02.880 | and, you know, steer the wheel.
00:08:04.880 | And, you know, later I'd come to realize
00:08:06.600 | this had been done, you know,
00:08:07.880 | since the 80s or the 70s or even earlier,
00:08:11.660 | but I still wanted to do it.
00:08:12.780 | And sort of immediately after that trip
00:08:14.880 | switched from sort of BattleBots,
00:08:16.280 | which is more radio controlled machines
00:08:18.680 | to thinking about building, you know,
00:08:21.840 | autonomous vehicles of some scale.
00:08:23.640 | Start off with really small electric ones
00:08:25.100 | and then, you know, progress to what we're doing now.
00:08:28.320 | - So what was your view of artificial intelligence
00:08:30.040 | at that point?
00:08:30.880 | What did you think?
00:08:31.920 | So this is before,
00:08:34.320 | there's been waves in artificial intelligence, right?
00:08:36.800 | The current wave with deep learning
00:08:39.520 | makes people believe that you can solve
00:08:41.800 | in a really rich, deep way
00:08:43.580 | the computer vision perception problem.
00:08:46.240 | But like in, before the deep learning craze,
00:08:51.240 | you know, how do you think about
00:08:52.840 | how would you even go about building a thing
00:08:55.360 | that perceives itself in the world,
00:08:56.960 | localizes itself in the world, moves around the world?
00:08:59.240 | Like when you were younger, I mean.
00:09:00.080 | - Yeah.
00:09:00.920 | - What was your thinking about it?
00:09:02.160 | - Well, prior to deep neural networks
00:09:04.000 | or convolutional neural nets,
00:09:05.400 | these modern techniques we have,
00:09:06.560 | or at least ones that are in use today,
00:09:09.080 | it was all a heuristic space.
00:09:10.320 | And so like old school image processing,
00:09:12.960 | and I think extracting, you know,
00:09:15.080 | yellow lane markers out of an image of a road
00:09:18.040 | is one of the problems that lends itself
00:09:21.240 | reasonably well to those heuristic based methods.
00:09:23.560 | You know, like just do a threshold on the color yellow
00:09:26.800 | and then try to fit some lines to that
00:09:28.560 | using a Hough transform or something
00:09:30.360 | and then go from there.
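The threshold-then-fit heuristic described here can be sketched in a few lines. This is a toy illustration only: the RGB thresholds are invented, and plain least squares stands in for the Hough transform mentioned, just to keep the sketch short.

```python
import numpy as np

# Toy version of the heuristic: threshold "yellow" in RGB, then fit a
# line through the surviving pixels. A Hough transform would be more
# robust to outliers; least squares keeps this short. Thresholds are
# illustrative guesses, not values from any real system.

def find_lane_line(img: np.ndarray) -> tuple[float, float]:
    """img: H x W x 3 uint8 RGB. Returns (slope, intercept) of a line
    x = slope * y + intercept fit through yellow-ish pixels."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    mask = (r > 150) & (g > 150) & (b < 100)   # crude "yellow"
    ys, xs = np.nonzero(mask)
    if len(xs) < 2:
        raise ValueError("no yellow pixels found")
    slope, intercept = np.polyfit(ys, xs, deg=1)
    return slope, intercept

# Synthetic check: a diagonal yellow stripe on a black frame.
img = np.zeros((100, 100, 3), dtype=np.uint8)
for y in range(100):
    img[y, y] = (255, 255, 0)        # one yellow pixel per row, x == y
slope, intercept = find_lane_line(img)
print(slope, intercept)              # slope ~1.0, intercept ~0.0
```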
00:09:32.280 | - Traffic light detection and stop sign detection,
00:09:34.800 | red, yellow, green.
00:09:35.920 | - And I think you can, you could,
00:09:38.160 | I mean, if you wanted to do a full,
00:09:39.840 | I was just trying to make something that would stay
00:09:41.960 | in between the lanes on a highway.
00:09:43.520 | But if you wanted to do the full,
00:09:45.160 | the full, you know, set of capabilities needed
00:09:49.280 | for a driverless car, I think you could,
00:09:51.360 | and we'd done this at Cruise, you know,
00:09:53.600 | in the very first days,
00:09:54.440 | you can start off with a really simple,
00:09:56.200 | you know, human written heuristic
00:09:58.000 | just to get the scaffolding in place for your system.
00:10:00.720 | Traffic light detection, probably a really simple,
00:10:02.720 | you know, color thresholding on day one
00:10:04.760 | just to get the system up and running
00:10:06.520 | before you migrate to, you know,
00:10:08.640 | a deep learning based technique or something else.
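The day-one color-thresholding scaffolding described might look like this sketch (the thresholds and function name are invented for illustration, not anything from Cruise):

```python
import numpy as np

# Day-one scaffolding in the spirit described: classify a cropped
# traffic-light lamp by crude color thresholding, to be swapped for a
# learned model later. Thresholds are illustrative guesses.

def classify_light(roi: np.ndarray) -> str:
    """roi: H x W x 3 uint8 RGB crop around the lit lamp."""
    r, g, b = roi.reshape(-1, 3).mean(axis=0)
    if r > 150 and g > 150:
        return "yellow"          # red and green channels both hot
    if r > 150:
        return "red"
    if g > 150:
        return "green"
    return "unknown"

red_patch = np.full((8, 8, 3), (220, 40, 30), dtype=np.uint8)
print(classify_light(red_patch))   # red
```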
00:10:11.080 | And, you know, back when I was doing this,
00:10:12.800 | my first one, it was on a Pentium 233 megahertz computer
00:10:17.360 | and I think I wrote the first version in basic,
00:10:19.920 | which is like an interpreted language.
00:10:21.600 | It's extremely slow.
00:10:23.200 | 'Cause that's the thing I knew at the time.
00:10:24.800 | And so there was no chance at all of using,
00:10:27.840 | there was no computational power to do
00:10:30.440 | any sort of reasonable deep nets like you have today.
00:10:33.480 | So I don't know what kids these days are doing.
00:10:35.360 | Are kids these days, you know, at age 13
00:10:37.920 | using neural networks in their garage?
00:10:39.360 | I mean, that would be awesome.
00:10:40.640 | - I get emails all the time from, you know,
00:10:43.040 | like 11, 12 year olds saying, I'm having, you know,
00:10:46.160 | I'm trying to follow this TensorFlow tutorial
00:10:48.760 | and I'm having this problem.
00:10:50.760 | And their general approach in the deep learning community
00:10:56.080 | is of extreme optimism of, as opposed to,
00:11:00.200 | you mentioned like heuristics,
00:11:01.320 | you can separate the autonomous driving problem
00:11:04.800 | into modules and try to solve it sort of rigorously.
00:11:07.520 | Or you can just do it end to end.
00:11:09.080 | And most people just kind of love the idea that,
00:11:11.840 | you know, us humans do it end to end.
00:11:13.360 | We just perceive and act.
00:11:15.360 | We should be able to use that,
00:11:17.040 | do the same kind of thing with neural nets.
00:11:18.720 | And that kind of thinking,
00:11:20.920 | you don't want to criticize that kind of thinking
00:11:22.840 | because eventually they will be right.
00:11:24.640 | - Yeah.
00:11:25.480 | - And so it's exciting.
00:11:26.360 | And especially when they're younger,
00:11:27.600 | to explore that as a really exciting approach.
00:11:30.640 | But yeah, it's changed the language,
00:11:35.480 | the kind of stuff you're tinkering with.
00:11:37.240 | It's kind of exciting to see when these teenagers grow up.
00:11:40.920 | - Yeah, I can only imagine if your starting point is,
00:11:44.240 | you know, Python and TensorFlow at age 13,
00:11:46.720 | where you end up, you know, after 10 or 15 years of that.
00:11:49.880 | That's pretty cool.
00:11:51.040 | - Because of GitHub, because the state of the art tools
00:11:53.760 | for solving most of the major problems
00:11:55.440 | in artificial intelligence are within a few lines of code
00:11:58.920 | for most kids.
00:12:00.240 | And that's incredible to think about,
00:12:02.320 | also on the entrepreneurial side.
00:12:04.280 | And on that point, was there any thought
00:12:08.520 | about entrepreneurship before you came to college?
00:12:11.960 | Is sort of doing, you're building this into a thing
00:12:15.160 | that impacts the world on a large scale?
00:12:17.800 | - Yeah, I've always wanted to start a company.
00:12:19.840 | I think that's, you know, just a cool concept
00:12:22.600 | of creating something and exchanging it for value
00:12:25.840 | or creating value, I guess.
00:12:28.360 | So in high school, I was trying to build like,
00:12:31.120 | you know, servo motor drivers, little circuit boards
00:12:33.600 | and sell them online or other things like that.
00:12:36.960 | And certainly knew at some point I wanted to do a startup,
00:12:40.320 | but it wasn't really, I'd say until college,
00:12:42.400 | until I felt like I had the, I guess,
00:12:46.040 | the right combination of the environment,
00:12:47.520 | the smart people around you and some free time.
00:12:50.800 | I had a lot of free time at MIT.
00:12:52.360 | - So you came to MIT as an undergrad, 2004.
00:12:55.800 | - That's right.
00:12:57.080 | - And that's when the first DARPA Grand Challenge
00:12:59.080 | was happening.
00:12:59.920 | - Yeah.
00:13:00.760 | - The timing of that is beautifully poetic.
00:13:03.360 | So how'd you get yourself involved in that one?
00:13:05.720 | - Originally there wasn't a--
00:13:07.080 | - Official entry?
00:13:07.920 | - Yeah, faculty sponsored thing.
00:13:09.560 | And so a bunch of undergrads, myself included,
00:13:12.800 | started meeting and got together
00:13:14.200 | and tried to haggle together some sponsorships.
00:13:17.820 | We got a vehicle donated, a bunch of sensors
00:13:20.160 | and tried to put something together.
00:13:21.640 | And so we had, our team was probably mostly freshmen
00:13:24.680 | and sophomores, which was not really a fair fight
00:13:27.960 | against maybe the postdoc and faculty-led teams
00:13:32.160 | from other schools.
00:13:33.000 | But we got something up and running.
00:13:35.040 | We had our vehicle drive by wire
00:13:36.680 | and very, very basic control and things.
00:13:39.620 | But on the day of the qualifying,
00:13:43.660 | sort of pre-qualifying round,
00:13:46.800 | the one and only steering motor that we had purchased,
00:13:50.920 | the thing that we had retrofitted
00:13:52.400 | to turn the steering wheel on the truck, died.
00:13:55.840 | And so our vehicle was just dead in the water,
00:13:57.880 | couldn't steer, so we didn't make it very far.
00:13:59.960 | - On the hardware side.
00:14:01.000 | So was there a software component?
00:14:03.080 | Was there, like how did your view of autonomous vehicles
00:14:06.240 | in terms of artificial intelligence evolve in this moment?
00:14:10.760 | I mean, like you said, from the '80s
00:14:13.000 | there's been autonomous vehicles,
00:14:14.120 | but really that was the birth of the modern wave,
00:14:16.800 | the thing that captivated everyone's imagination
00:14:20.120 | that we can actually do this.
00:14:21.600 | So were you captivated in that way?
00:14:26.080 | So how did your view of autonomous vehicles
00:14:27.680 | change at that point?
00:14:29.080 | - I'd say at that point in time it was a curiosity,
00:14:33.840 | as in like, is this really possible?
00:14:35.920 | And I think that was generally the spirit
00:14:38.520 | and the purpose of that original DARPA Grand Challenge,
00:14:43.360 | which was to just get a whole bunch
00:14:45.600 | of really brilliant people exploring the space
00:14:48.760 | and pushing the limits.
00:14:49.960 | And I think like to this day that DARPA Challenge
00:14:53.160 | with its million dollar prize pool
00:14:56.240 | was probably one of the most effective uses of taxpayer
00:15:00.400 | money dollar for dollar that I've seen,
00:15:03.400 | because that small sort of initiative that DARPA put out
00:15:08.400 | sort of, in my view, was the catalyst or the tipping point
00:15:12.600 | for this whole next wave of autonomous vehicle development.
00:15:16.160 | So that was pretty cool.
00:15:17.200 | - So let me jump around a little bit on that point.
00:15:20.320 | They also did the Urban Challenge, where it was in the city,
00:15:23.280 | but it was very artificial and there's no pedestrians,
00:15:26.000 | and there's very little human involvement
00:15:27.720 | except a few professional drivers.
00:15:30.560 | - Yeah.
00:15:31.700 | - Do you think there's room,
00:15:33.120 | and then there was the Robotics Challenge
00:15:34.440 | with humanoid robots.
00:15:35.440 | - Right.
00:15:36.280 | - So in your now role is looking at this,
00:15:38.780 | you're trying to solve one of the,
00:15:40.480 | autonomous driving, one of the harder,
00:15:43.200 | more difficult places in San Francisco.
00:15:45.540 | Is there a role for DARPA to step in
00:15:47.340 | to also kind of help out, like challenge with new ideas,
00:15:51.600 | specifically like pedestrians and so on,
00:15:54.480 | all these kinds of interesting things?
00:15:55.920 | - Well, I haven't thought about it from that perspective.
00:15:57.720 | Is there anything DARPA could do today
00:15:59.320 | to further accelerate things?
00:16:00.720 | And I would say, my instinct is that
00:16:04.440 | that's maybe not the highest and best use
00:16:05.800 | of their resources and time,
00:16:07.080 | because like kickstarting and spinning up the flywheel
00:16:10.660 | is I think what they did in this case
00:16:12.760 | for very little money.
00:16:14.240 | But today this has become,
00:16:15.860 | this has become like commercially interesting
00:16:19.020 | to very large companies,
00:16:19.860 | and the amount of money going into it,
00:16:21.420 | and the amount of people like going through your class
00:16:24.520 | and learning about these things and developing these skills
00:16:27.260 | is just orders of magnitude more than it was back then.
00:16:30.860 | And so there's enough momentum and inertia
00:16:33.140 | and energy and investment dollars
00:16:35.580 | into this space right now that I don't,
00:16:39.140 | I think they're,
00:16:39.980 | I think they can just say mission accomplished
00:16:42.220 | and move on to the next area of technology that needs help.
00:16:45.380 | - So then stepping back to MIT,
00:16:49.100 | you left MIT during your junior year.
00:16:50.860 | What was that decision like?
00:16:53.060 | - As I said, I always wanted to do a company
00:16:55.700 | or start a company,
00:16:56.540 | and this opportunity landed in my lap,
00:16:59.060 | which was a couple of guys from Yale
00:17:01.940 | were starting a new company,
00:17:02.900 | and I Googled them and found that they had
00:17:05.300 | started a company previously and sold it
00:17:07.740 | actually on eBay for about a quarter million bucks,
00:17:10.620 | which was a pretty interesting story.
00:17:12.940 | So I thought to myself,
00:17:14.220 | these guys are rockstar entrepreneurs,
00:17:16.980 | they've done this before,
00:17:19.100 | they must be driving around in Ferraris
00:17:20.700 | 'cause they sold their company,
00:17:22.200 | and I thought I could learn a lot from them.
00:17:25.980 | So I teamed up with those guys
00:17:27.460 | and went out to California during IAP,
00:17:31.980 | which is MIT's month off,
00:17:34.400 | on a one-way ticket and basically never went back.
00:17:38.060 | We were having so much fun,
00:17:39.300 | we felt like we were building something
00:17:40.500 | and creating something,
00:17:42.060 | and it was gonna be interesting that
00:17:44.460 | I was just all in and got completely hooked.
00:17:46.860 | And that business was Justin TV,
00:17:49.660 | which is originally a reality show
00:17:51.340 | about a guy named Justin,
00:17:52.580 | which morphed into a live video streaming platform,
00:17:57.160 | which then morphed into what is Twitch today.
00:18:00.340 | So that was quite an unexpected journey.
00:18:03.760 | - So no regrets?
00:18:07.040 | - No.
00:18:07.880 | - Looking back, it was just an obvious,
00:18:09.140 | I mean, one-way ticket.
00:18:10.700 | I mean, if we just pause on that for a second,
00:18:12.700 | there was no,
00:18:13.860 | how did you know these were the right guys,
00:18:17.660 | this is the right decision?
00:18:19.460 | You didn't think it was just follow the heart kind of thing?
00:18:22.620 | - Well, I didn't know,
00:18:23.620 | but just trying something for a month during IAP
00:18:26.500 | seems pretty low risk, right?
00:18:28.180 | And then, well, maybe I'll take a semester off,
00:18:30.740 | MIT's pretty flexible about that,
00:18:32.220 | you can always go back, right?
00:18:33.820 | And then after two or three cycles of that,
00:18:35.640 | I eventually threw in the towel,
00:18:36.920 | but I think it's,
00:18:38.760 | I guess in that case,
00:18:41.960 | I felt like I could always hit the undo button if I had to.
00:18:44.900 | - Right.
00:18:45.800 | But nevertheless, from when you look in retrospect,
00:18:49.640 | I mean, it seems like a brave decision
00:18:51.360 | that it would be difficult for a lot of people to make.
00:18:54.380 | - It wasn't as popular.
00:18:55.480 | I'd say the general flux of people out of MIT at the time
00:19:00.480 | was mostly into finance or consulting jobs
00:19:04.160 | in Boston or New York.
00:19:05.740 | And very few people were going to California
00:19:07.860 | to start companies.
00:19:09.080 | But today I'd say that's probably inverted,
00:19:12.220 | which is just a sign of the times, I guess.
00:19:15.400 | - Yeah.
00:19:16.240 | So there's a story about midnight of March 18, 2007,
00:19:21.240 | where TechCrunch, I guess,
00:19:23.720 | announced Justin.TV earlier than it was supposed to,
00:19:26.520 | a few hours.
00:19:27.360 | The site didn't work.
00:19:30.340 | I don't know if any of this is true, you can tell me.
00:19:32.500 | And you and one of the folks at Justin.TV,
00:19:36.220 | Emmett Shear, coded through the night.
00:19:39.240 | Can you take me through that experience?
00:19:41.400 | So let me say a few nice things
00:19:43.960 | that the article I read quoted Justin Kan said
00:19:48.120 | that you were known for hero-coding through problems
00:19:50.800 | and being a creative, quote, "creative genius."
00:19:53.480 | So on that night, what was going through your head,
00:19:58.480 | or maybe put another way,
00:20:00.760 | how do you solve these problems?
00:20:02.480 | What's your approach to solving these kinds of problems
00:20:05.480 | where the line between success and failure
00:20:07.080 | seems to be pretty thin?
00:20:10.000 | - That's a good question.
00:20:10.840 | Well, first of all, that's nice of Justin to say that.
00:20:13.360 | I think I would have been maybe 21 years old then
00:20:16.880 | and not very experienced at programming.
00:20:18.760 | But as with everything in a startup,
00:20:22.640 | you're sort of racing against the clock.
00:20:24.680 | And so our plan was the second we had
00:20:28.120 | this live streaming camera backpack up and running
00:20:32.600 | where Justin could wear it.
00:20:33.600 | And no matter where he went in the city,
00:20:35.320 | it would be streaming live video.
00:20:36.400 | And this is even before the iPhones.
00:20:37.960 | This is like hard to do back then.
00:20:39.660 | We would launch.
00:20:41.800 | And so we thought we were there
00:20:43.640 | and the backpack was working.
00:20:45.200 | And then we sent out all the emails
00:20:47.080 | to launch the company and do the press thing.
00:20:49.960 | And then we weren't quite actually there.
00:20:53.040 | And then we thought, oh, well,
00:20:55.780 | they're not gonna announce it until maybe 10 a.m.
00:20:59.440 | the next morning.
00:21:00.280 | And it's, I don't know, it's 5 p.m. now.
00:21:01.880 | So how many hours do we have left?
00:21:03.640 | What is that like?
00:21:04.760 | You know, 17 hours to go.
00:21:06.160 | And that was gonna be fine.
00:21:10.480 | - Was the problem obvious?
00:21:11.460 | Did you understand what could possibly,
00:21:13.280 | like how complicated was the system at that point?
00:21:16.500 | - It was pretty messy.
00:21:18.860 | So to get a live video feed that looked decent
00:21:22.420 | working from anywhere in San Francisco,
00:21:25.080 | I put together this system where we had like
00:21:27.640 | three or four cell phone data modems
00:21:29.840 | and they were like, we take the video stream
00:21:32.160 | and sort of spray it across these three or four modems
00:21:35.600 | and then try to catch all the packets on the other side,
00:21:37.840 | you know, with unreliable cell phone networks.
00:21:39.440 | - It's pretty low level networking.
00:21:40.960 | - Yeah, and putting these like, you know,
00:21:43.560 | sort of protocols on top of all that
00:21:45.520 | to reassemble and reorder the packets
00:21:47.540 | and have time buffers and error correction
00:21:49.680 | and all that kind of stuff.
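The reassembly scheme described, spraying packets across links and reordering them on the far side, can be sketched with a sequence-numbered jitter buffer. This is a minimal toy under the assumption of per-packet sequence numbers; the time buffers and error correction mentioned are omitted.

```python
import heapq

# Toy reassembly buffer in the spirit of the multi-modem streamer:
# packets arrive out of order across several unreliable links, and a
# jitter buffer re-emits them in sequence. Per-packet sequence numbers
# are assumed; timeouts and error correction are left out for brevity.

class ReorderBuffer:
    def __init__(self) -> None:
        self.heap: list[tuple[int, bytes]] = []  # min-heap on seq number
        self.next_seq = 0                        # next packet to emit

    def push(self, seq: int, payload: bytes) -> list[bytes]:
        """Accept one packet; return any payloads now deliverable in order."""
        heapq.heappush(self.heap, (seq, payload))
        out = []
        while self.heap and self.heap[0][0] == self.next_seq:
            _, p = heapq.heappop(self.heap)
            out.append(p)
            self.next_seq += 1
        return out

buf = ReorderBuffer()
print(buf.push(1, b"B"))   # [] -- still waiting on packet 0
print(buf.push(0, b"A"))   # [b'A', b'B'] -- released in order
```

A production version would also evict packets whose predecessors never arrive within a time budget, accepting a glitch rather than stalling the stream, which is roughly where the static described below comes from.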
00:21:50.920 | And the night before it was just staticky.
00:21:54.460 | Every once in a while, the image would go staticky
00:21:56.840 | and there would be this horrible, like screeching audio
00:22:00.040 | noise 'cause the audio was also corrupted.
00:22:02.080 | And this would happen like every five to 10 minutes or so.
00:22:04.600 | And it was really off-putting to the viewers.
00:22:08.040 | - Yeah.
00:22:08.880 | How do you tackle that problem?
00:22:10.200 | What was that like, were you just freaking out behind a computer?
00:22:13.260 | Were there other folks working on this problem?
00:22:16.880 | Like were you behind a whiteboard?
00:22:18.120 | Were you doing a--
00:22:19.440 | - Yeah, it's a little lonely.
00:22:22.040 | Yeah, it's a little lonely 'cause there's four of us
00:22:23.840 | working on the company and only two people really wrote code
00:22:26.840 | and Emmett wrote the website and the chat system
00:22:29.200 | and I wrote the software for this video streaming device
00:22:32.400 | and video server.
00:22:33.340 | And so, you know, it was my sole responsibility
00:22:36.240 | to figure that out.
00:22:37.280 | And I think it's those, you know, setting deadlines,
00:22:41.180 | trying to move quickly and everything
00:22:42.180 | where you're in that moment of intense pressure
00:22:44.180 | that sometimes people do their best
00:22:45.960 | and most interesting work.
00:22:46.920 | And so even though that was a terrible moment,
00:22:48.800 | I look back on it fondly 'cause that's like, you know,
00:22:50.740 | that's one of those character defining moments, I think.
00:22:54.380 | - So in 2013, October, you founded Cruise Automation.
00:22:59.380 | - Yeah.
00:23:00.300 | - So progressing forward, another exceptionally successful
00:23:04.180 | company was acquired by GM in '16 for $1 billion.
00:23:08.700 | But in October of 2013, what was on your mind?
00:23:14.100 | What was the plan?
00:23:15.200 | How does one seriously start to tackle
00:23:19.820 | one of the hardest robotics, most important impact
00:23:22.780 | robotics problems of our age?
00:23:24.900 | - After going through Twitch, Twitch was,
00:23:28.740 | and is today, pretty successful.
00:23:30.480 | But the work was, the result was entertainment, mostly.
00:23:36.420 | Like the better the product was,
00:23:38.660 | the more we would entertain people
00:23:40.360 | and then, you know, make money on the ad revenues
00:23:42.740 | and other things.
00:23:43.580 | And that was a good thing.
00:23:44.980 | It felt good to entertain people,
00:23:46.260 | but I figured like, you know, what is really the point
00:23:49.100 | of becoming a really good engineer
00:23:51.100 | and developing these skills other than, you know,
00:23:53.140 | my own enjoyment?
00:23:54.220 | And I realized I wanted something that scratched
00:23:55.740 | more of an existential itch,
00:23:57.180 | like something that truly matters.
00:23:59.380 | And so I basically made this list of requirements
00:24:03.660 | for a new, if I was gonna do another company,
00:24:06.140 | and the one thing I knew in the back of my head
00:24:07.980 | was that Twitch took like eight years to become successful.
00:24:12.300 | And so whatever I do, I better be willing to commit,
00:24:14.840 | you know, at least 10 years to something.
00:24:16.980 | And when you think about things from that perspective,
00:24:20.380 | you certainly, I think, raise the bar
00:24:21.780 | on what you choose to work on.
00:24:23.180 | So for me, the three things were it had to be something
00:24:25.820 | where the technology itself determines the success
00:24:28.180 | of the product, like hard, really juicy technology problems,
00:24:31.860 | 'cause that's what motivates me.
00:24:33.580 | And then it had to have a direct and positive impact
00:24:36.300 | on society in some way.
00:24:37.660 | So an example would be like, you know, healthcare,
00:24:40.100 | self-driving cars, 'cause they save lives,
00:24:41.580 | other things where there's a clear connection
00:24:43.060 | to somehow improving other people's lives.
00:24:45.180 | And the last one is it had to be a big business
00:24:47.180 | because for the positive impact to matter,
00:24:50.200 | it's gotta be at large scale.
00:24:51.220 | - Scale, yeah.
00:24:52.060 | - And I was thinking about that for a while
00:24:53.820 | and I made like, I tried writing a Gmail clone
00:24:55.940 | and looked at some other ideas.
00:24:57.620 | And then a light bulb just sort of went off,
00:24:59.460 | like self-driving cars, like that was the most fun
00:25:01.660 | I had ever had in college working on that.
00:25:04.020 | And like, well, what's the state of the technology?
00:25:05.980 | It's been 10 years, maybe times have changed
00:25:08.440 | and maybe now is the time to make this work.
00:25:10.780 | And I poked around and looked at,
00:25:12.700 | the only other thing out there really at the time
00:25:14.340 | was the Google self-driving car project.
00:25:16.660 | And I thought, surely there's a way to, you know,
00:25:19.580 | have an entrepreneur mindset and sort of solve
00:25:21.580 | the minimum viable product here.
00:25:23.500 | And so I just took the plunge right then and there
00:25:25.200 | and said, this is something I know I can commit 10 years to.
00:25:27.860 | It's probably the greatest applied AI problem
00:25:30.780 | of our generation.
00:25:31.620 | - That's right.
00:25:32.460 | - And if it works, it's gonna be both a huge business
00:25:34.220 | and therefore like probably the most positive impact
00:25:37.020 | I can possibly have on the world.
00:25:38.260 | So after that light bulb went off,
00:25:40.940 | I went all in on cruise immediately and got to work.
00:25:45.540 | - Did you have an idea how to solve this problem?
00:25:47.340 | Which aspects of the problem to solve?
00:25:49.620 | You know, slow, like we just had Oliver from Voyage here,
00:25:53.740 | slow moving retirement communities,
00:25:56.580 | urban driving, highway driving.
00:25:58.100 | Did you have like, did you have a vision
00:26:00.380 | of the city of the future where, you know,
00:26:03.540 | the transportation is largely automated,
00:26:06.380 | that kind of thing?
00:26:07.220 | Or was it sort of more fuzzy and gray area than that?
00:26:12.220 | - My analysis of the situation is that
00:26:15.820 | Google had been putting a lot of money into that project.
00:26:19.260 | They had a lot more resources.
00:26:20.860 | And so, and they still hadn't cracked
00:26:23.820 | the fully driverless car.
00:26:26.300 | You know, this is 2013, I guess.
00:26:28.620 | So I thought, what can I do to sort of go from zero
00:26:33.460 | to, you know, significant scale
00:26:35.700 | so I can actually solve the real problem,
00:26:37.380 | which is the driverless cars?
00:26:38.780 | And I thought, here's the strategy.
00:26:40.580 | We'll start by doing a really simple problem
00:26:44.160 | or solving a really simple problem
00:26:45.640 | that creates value for people.
00:26:48.140 | So eventually ended up deciding
00:26:50.140 | on automating highway driving,
00:26:51.740 | which is relatively more straightforward
00:26:54.320 | as long as there's a backup driver there.
00:26:56.540 | And I'll, you know, the go-to-market
00:26:58.580 | will be able to retrofit people's cars
00:27:00.320 | and just sell these products directly.
00:27:02.340 | And the idea was, we'll take all the revenue
00:27:04.620 | and profits from that and use it to do the,
00:27:07.580 | so sort of reinvest that in research
00:27:10.120 | for doing fully driverless cars.
00:27:12.700 | And that was the plan.
00:27:14.040 | The only thing that really changed along the way
00:27:15.780 | between then and now is,
00:27:17.420 | we never really launched the first product.
00:27:19.060 | We had enough interest from investors
00:27:21.740 | and enough of a signal that this was something
00:27:24.180 | that we should be working on,
00:27:25.060 | that after about a year of working on the highway autopilot,
00:27:28.460 | we had it working, you know, at a prototype stage,
00:27:31.100 | but we just completely abandoned that
00:27:33.180 | and said, we're gonna go all in on driverless cars
00:27:35.020 | now is the time.
00:27:36.540 | Can't think of anything that's more exciting
00:27:38.220 | and if it works, more impactful,
00:27:39.820 | so we're just gonna go for it.
00:27:41.660 | - The idea of retrofit is kind of interesting.
00:27:43.520 | - Yeah.
00:27:44.360 | - Being able to, it's how you achieve scale.
00:27:46.960 | It's a really interesting idea,
00:27:48.000 | is it something that's still in the back of your mind
00:27:51.200 | as a possibility?
00:27:52.840 | - Not at all, I've come full circle on that one.
00:27:55.320 | After trying to build a retrofit product,
00:27:58.960 | and I'll touch on some of the complexities of that,
00:28:01.360 | and then also having been inside an OEM
00:28:04.320 | and seeing how things work
00:28:05.480 | and how a vehicle is developed and validated,
00:28:08.360 | when it comes to something
00:28:09.400 | that has safety critical implications,
00:28:11.320 | like controlling the steering
00:28:12.560 | and other control inputs on your car,
00:28:15.280 | it's pretty hard to get there with a retrofit,
00:28:17.840 | or if you did, even if you did,
00:28:20.520 | it creates a whole bunch of new complications
00:28:22.320 | around liability or how did you truly validate that,
00:28:25.400 | or something in the base vehicle fails
00:28:27.520 | and causes your system to fail, whose fault is it?
00:28:30.000 | Or if the car's anti-lock brake systems
00:28:34.080 | or other things kick in,
00:28:35.160 | or the software has been,
00:28:36.680 | it's different in one version of the car,
00:28:38.240 | you retrofit versus another,
00:28:39.420 | and you don't know because the manufacturer
00:28:41.280 | has updated it behind the scenes.
00:28:43.000 | There's basically an infinite list of long tail issues
00:28:45.400 | that can get you,
00:28:46.240 | and if you're dealing with a safety critical product,
00:28:47.800 | that's not really acceptable.
00:28:48.960 | - That's a really convincing summary
00:28:51.080 | of why it's really challenging.
00:28:53.160 | - But I didn't know all that at the time,
00:28:54.360 | so we tried it anyway.
00:28:55.480 | - But as a pitch also at the time,
00:28:57.140 | it's a really strong one.
00:28:58.280 | - Yeah.
00:28:59.120 | - 'Cause that's how you achieve scale,
00:28:59.940 | and that's how you beat the current,
00:29:01.320 | the leader at the time of Google,
00:29:03.360 | or the only one in the market.
00:29:04.760 | - The other big problem we ran into,
00:29:06.880 | which is perhaps the biggest problem
00:29:08.280 | from a business model perspective,
00:29:10.300 | is we had kind of assumed that,
00:29:12.780 | we started with an Audi S4 as the vehicle we retrofitted
00:29:16.900 | with this highway driving capability,
00:29:18.800 | and we had kind of assumed that if we just knock out
00:29:21.080 | like three make and models of vehicle,
00:29:23.360 | that'll cover like 80% of the San Francisco market.
00:29:25.920 | Doesn't everyone there drive, I don't know,
00:29:27.440 | a BMW or a Honda Civic, or one of these three cars?
00:29:30.280 | And then we surveyed our users,
00:29:31.400 | and we found out that it's all over the place.
00:29:33.880 | To get even a decent number of units sold,
00:29:36.660 | we'd have to support like 20 or 50 different models,
00:29:39.880 | and each one is a little butterfly
00:29:41.360 | that takes time and effort to maintain
00:29:43.480 | that retrofit integration and custom hardware and all this.
00:29:47.120 | So it was a tough business.
00:29:49.240 | - So GM manufactures and sells
00:29:52.120 | over nine million cars a year.
00:29:54.240 | And what you with Cruise are trying to do,
00:29:58.560 | some of the most cutting edge innovation
00:30:01.160 | in terms of applying AI.
00:30:03.280 | And so how do those,
00:30:04.840 | you've talked about it a little bit before,
00:30:06.440 | but it's also just fascinating to me.
00:30:07.780 | We work a lot with automakers.
00:30:09.360 | The difference between the gap between Detroit
00:30:12.900 | and Silicon Valley, let's say,
00:30:14.680 | just to be sort of poetic about it, I guess.
00:30:17.340 | How do you close that gap?
00:30:18.700 | How do you take GM into the future
00:30:21.460 | where a large part of the fleet
00:30:22.900 | would be autonomous perhaps?
00:30:24.820 | - I wanna start by acknowledging that GM is made up
00:30:27.900 | of tens of thousands of really brilliant,
00:30:30.220 | motivated people who wanna be a part of the future.
00:30:32.720 | And so it's pretty fun to work with. The attitude
00:30:36.220 | inside a car company like that is
00:30:37.900 | embracing this transformation and change
00:30:41.220 | rather than fearing it.
00:30:42.340 | And I think that's a testament to the leadership at GM
00:30:45.420 | and that's flowed all the way through
00:30:46.460 | to everyone you talk to,
00:30:48.060 | even the people in the assembly plants
00:30:49.300 | working on these cars.
00:30:51.180 | So that's really great.
00:30:52.020 | So that starting from that position makes it a lot easier.
00:30:55.180 | So then when the people in San Francisco at Cruise
00:31:00.020 | interact with the people at GM,
00:31:01.340 | at least we have this common set of values,
00:31:02.900 | which is that we really want this stuff to work
00:31:04.940 | 'cause we think it's important
00:31:05.980 | and we think it's the future.
00:31:07.440 | That's not to say those two cultures don't clash.
00:31:11.460 | They absolutely do.
00:31:12.380 | There's different sort of value systems.
00:31:14.720 | Like in a car company,
00:31:16.740 | the thing that gets you promoted
00:31:17.940 | and sort of the reward system is following the processes,
00:31:22.540 | delivering the program on time and on budget.
00:31:26.020 | So any sort of risk-taking is discouraged in many ways
00:31:30.420 | because if a program is late
00:31:33.980 | or if you shut down the plant for a day,
00:31:35.540 | you can count the millions of dollars
00:31:37.500 | burning by pretty quickly.
00:31:39.540 | Whereas I think most Silicon Valley companies
00:31:43.740 | and in Cruise and the methodology we were employing,
00:31:48.180 | especially around the time of the acquisition,
00:31:50.060 | the reward structure is about
00:31:51.740 | trying to solve these complex problems
00:31:54.900 | in any way, shape or form,
00:31:56.060 | or coming up with crazy ideas that 90% of them won't work.
00:31:59.620 | And so meshing that culture
00:32:02.860 | of sort of continuous improvement and experimentation
00:32:05.460 | with one where everything needs to be
00:32:07.340 | rigorously defined upfront
00:32:08.420 | so that you never slip a deadline or miss a budget
00:32:11.860 | was a pretty big challenge.
00:32:13.580 | And that, we're over three years in now
00:32:16.780 | after the acquisition.
00:32:18.300 | And I'd say like, the investment we made in figuring out
00:32:21.740 | how to work together successfully and who should do what
00:32:24.300 | and how we bridge the gaps
00:32:26.340 | between these very different systems
00:32:27.660 | and way of doing engineering work
00:32:29.540 | is now one of our greatest assets
00:32:30.900 | 'cause I think we have this really powerful thing.
00:32:32.340 | But for a while it was,
00:32:33.460 | both GM and Cruise were very steep on the learning curve.
00:32:37.420 | - Yeah, so I'm sure it was very stressful.
00:32:38.900 | It's really important work
00:32:39.980 | 'cause that's how you revolutionize transportation,
00:32:43.700 | really how you revolutionize any system.
00:32:46.620 | You know, you look at the healthcare system
00:32:48.220 | or you look at the legal system.
00:32:49.660 | I have people like lawyers come up to me all the time,
00:32:52.020 | like everything they're working on
00:32:53.940 | can easily be automated, but then--
00:32:56.460 | - That's not a good feeling, yeah.
00:32:57.780 | - Well, it's not a good feeling,
00:32:58.740 | but also there's no way to automate
00:33:01.220 | because the entire infrastructure is really based,
00:33:06.220 | is older and it moves very slowly.
00:33:08.340 | And so how do you close the gap between,
00:33:11.540 | I have an app, how can I replace?
00:33:13.860 | Of course, lawyers don't wanna be replaced with an app,
00:33:15.700 | but you could replace a lot of aspects
00:33:17.900 | when most of the data is still on paper.
00:33:20.180 | And so the same thing with automotive.
00:33:23.420 | I mean, it's fundamentally software.
00:33:26.100 | So it's basically hiring software engineers,
00:33:28.580 | it's thinking in a software world.
00:33:30.340 | - I mean, I'm pretty sure nobody in Silicon Valley
00:33:32.580 | has ever hit a deadline.
00:33:34.620 | So, and then on GM--
00:33:36.020 | - That's probably true, yeah.
00:33:37.420 | - And GM side is probably the opposite.
00:33:39.940 | So that culture gap is really fascinating.
00:33:42.740 | So you're optimistic about the future of that?
00:33:45.180 | - Yeah, I mean, from what I've seen, it's impressive.
00:33:47.460 | And I think like, especially in Silicon Valley,
00:33:49.420 | it's easy to write off building cars
00:33:51.380 | because people have been doing that
00:33:53.100 | for over a hundred years now in this country.
00:33:54.900 | And so it seems like that's a solved problem,
00:33:57.020 | but that doesn't mean it's an easy problem.
00:33:58.820 | And I think it would be easy to sort of overlook that
00:34:02.260 | and think that we're Silicon Valley engineers,
00:34:06.060 | we can solve any problem.
00:34:08.220 | Building a car, it's been done,
00:34:09.540 | therefore it's not a real engineering challenge.
00:34:14.540 | But after having seen just the sheer scale
00:34:17.500 | and magnitude and industrialization
00:34:19.340 | that occurs inside of an automotive assembly plant,
00:34:23.260 | that is a lot of work that I am very glad
00:34:25.820 | that we don't have to reinvent
00:34:28.180 | to make self-driving cars work.
00:34:29.500 | And so to have partners who have done that for a hundred
00:34:31.660 | years now, these great processes and this huge infrastructure
00:34:34.060 | and supply base that we can tap into is just remarkable
00:34:38.780 | because the scope and surface area of the problem
00:34:43.780 | of deploying fleets of self-driving cars is so large
00:34:47.420 | that we're constantly looking for ways to do less
00:34:50.340 | so we can focus on the things that really matter more.
00:34:52.940 | And if we had to figure out how to build and assemble
00:34:55.380 | and build the cars themselves,
00:35:00.140 | I mean, we work closely with GM on that,
00:35:01.620 | but if we had to develop all that capability in-house
00:35:03.620 | as well, that would just make the problem
00:35:08.300 | really intractable, I think.
00:35:10.180 | - So yeah, just like your first entry
00:35:13.940 | at the MIT DARPA Challenge, when it was the motor
00:35:16.820 | that failed, if somebody that knows what they're doing
00:35:18.980 | with a motor did it--
00:35:20.060 | - That would have been nice if we could focus
00:35:21.460 | on the software and not the hardware platform.
00:35:23.900 | Yeah.
00:35:24.820 | - So from your perspective now,
00:35:27.140 | there's so many ways that autonomous vehicles
00:35:29.980 | can impact society in the next year, five years, 10 years.
00:35:34.300 | What do you think is the biggest opportunity
00:35:37.100 | to make money in autonomous driving,
00:35:39.380 | sort of make it a financially viable thing in the near term?
00:35:44.740 | What do you think would be the biggest impact there?
00:35:49.140 | - Well, the things that drive the economics
00:35:52.180 | for fleets of self-driving cars,
00:35:54.060 | there's sort of a handful of variables.
00:35:56.420 | One is the cost to build the vehicle itself.
00:36:00.380 | So the material cost, how many,
00:36:02.460 | what's the cost of all your sensors,
00:36:03.660 | plus the cost of the vehicle
00:36:05.140 | and all the other components on it.
00:36:07.540 | Another one is the lifetime of the vehicle.
00:36:09.460 | It's very different if your vehicle drives 100,000 miles
00:36:12.460 | and then it falls apart versus 2 million.
00:36:14.780 | And then if you have a fleet, it's kind of like an airplane
00:36:21.100 | where, or an airline where once you produce the vehicle,
00:36:26.100 | you want it to be in operation
00:36:27.900 | as many hours a day as possible producing revenue.
00:36:30.780 | And then the other piece of that is,
00:36:32.940 | how are you generating revenue?
00:36:35.340 | I think that's kind of what you're asking.
00:36:36.420 | And I think the obvious things today
00:36:38.460 | are the ride sharing business,
00:36:40.140 | because that's pretty clear that there's demand for that.
00:36:42.820 | There's existing markets you can tap into.
00:36:44.920 | And--
00:36:46.300 | - Large urban areas, that kind of thing.
00:36:48.020 | - Yeah, yeah.
00:36:49.260 | And I think that there are some real benefits
00:36:51.220 | to having cars without drivers
00:36:54.500 | compared to sort of the status quo
00:36:56.020 | for people who use ride share services today.
00:36:58.500 | You know, you get privacy, consistency,
00:37:01.060 | hopefully significantly improve safety,
00:37:02.420 | all these benefits versus the current product.
00:37:05.100 | But it's a crowded market.
00:37:06.500 | And then other opportunities,
00:37:08.020 | which you've seen a lot of activity in the last,
00:37:09.600 | really in the last six or 12 months is, you know, delivery,
00:37:12.540 | whether that's parcels and packages, food or groceries.
00:37:17.780 | Those are all sort of, I think,
00:37:19.700 | opportunities that are pretty ripe for these,
00:37:23.420 | you know, once you have this core technology,
00:37:26.020 | which is the fleet of autonomous vehicles,
00:37:28.100 | there's all sorts of different business opportunities
00:37:30.940 | you can build on top of that.
00:37:32.120 | But I think the important thing, of course,
00:37:34.580 | is that there's zero monetization opportunity
00:37:36.500 | until you actually have that fleet
00:37:37.580 | of very capable driverless cars
00:37:39.180 | that are as good or better than humans.
00:37:41.060 | And that's sort of where the entire industry
00:37:44.160 | is sort of in this holding pattern right now.
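The levers he lists, build cost, vehicle lifetime, and hours per day in revenue service, can be put into a toy per-mile model. Every number below is made up for illustration and is not a Cruise or GM figure:

```python
def cost_per_mile(build_cost: float, lifetime_miles: float,
                  ops_cost_per_day: float = 0.0,
                  hours_per_day: float = 20.0,
                  avg_speed_mph: float = 15.0) -> float:
    """Toy robotaxi unit economics: amortize the vehicle-plus-sensor
    build cost over lifetime miles, plus any daily operating cost
    spread over the miles actually driven that day."""
    amortized_build = build_cost / lifetime_miles
    daily_miles = hours_per_day * avg_speed_mph
    ops = ops_cost_per_day / daily_miles if daily_miles else 0.0
    return amortized_build + ops

# Same hypothetical $150k build cost; a vehicle that falls apart at
# 100k miles versus one engineered to last 1M miles:
short_lived = cost_per_mile(150_000, 100_000)    # $1.50 per mile
long_lived = cost_per_mile(150_000, 1_000_000)   # $0.15 per mile
```

The tenfold gap between the two lifetimes is why he reaches for the airline comparison: the asset only pays off in proportion to how long, and how constantly, it stays in service.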
00:37:45.940 | - Yeah, they're trying to achieve that baseline.
00:37:47.460 | So, but you said sort of reliability,
00:37:49.260 | not reliability, consistency.
00:37:51.580 | It's kind of interesting.
00:37:52.400 | I think I heard you say somewhere,
00:37:54.260 | not sure if that's what you meant,
00:37:55.460 | but, you know, I can imagine a situation
00:37:58.280 | where you would get an autonomous vehicle.
00:38:01.240 | And, you know, when you get into an Uber or Lyft,
00:38:04.620 | you don't get to choose the driver in a sense
00:38:06.540 | that you don't get to choose the personality
00:38:08.020 | of the driving.
00:38:09.140 | Do you think there's room to define the personality
00:38:13.680 | of the car, the way it drives you,
00:38:15.100 | in terms of aggressiveness, for example,
00:38:17.620 | in terms of sort of pushing the,
00:38:20.100 | one of the biggest challenges in autonomous driving
00:38:22.780 | is the trade-off between sort of safety and assertiveness.
00:38:27.780 | And do you think there's any room for the human
00:38:31.700 | to take a role in that decision?
00:38:36.100 | Sort of accept some of the liability, I guess.
00:38:38.140 | - I wouldn't, no, I'd say within reasonable bounds,
00:38:41.020 | as in we're not gonna, I think it'd be highly unlikely
00:38:44.380 | we'd expose any knob that would let you,
00:38:46.260 | you know, significantly increase safety risk.
00:38:50.220 | I think that's just not something we'd be willing to do.
00:38:53.020 | But I think driving style or like, you know,
00:38:56.700 | are you gonna relax the comfort constraints slightly
00:38:59.060 | or things like that?
00:39:00.100 | All of those things make sense and are plausible.
00:39:02.340 | I see all those as, you know, nice optimizations.
00:39:04.440 | Once again, we get the core problem solved
00:39:06.740 | in these fleets out there.
00:39:08.100 | But the other thing we've sort of observed
00:39:10.400 | is that you have this intuition
00:39:12.500 | that if you sort of slam your foot on the gas
00:39:15.340 | right after the light turns green
00:39:16.640 | and aggressively accelerate,
00:39:18.140 | you're gonna get there faster.
00:39:19.700 | But the actual impact of doing that is pretty small.
00:39:22.060 | You feel like you're getting there faster.
00:39:23.660 | But so the same would be true for AVs.
00:39:26.620 | Even if they don't slam their, you know,
00:39:28.720 | the pedal to the floor when the light turns green,
00:39:30.980 | they're gonna get you there within, you know,
00:39:32.500 | if it's a 15 minute trip, within 30 seconds
00:39:34.580 | of what you would have done otherwise
00:39:36.380 | if you were going really aggressively.
00:39:37.780 | So I think there's this sort of self-deception
00:39:40.780 | that my aggressive driving style
00:39:43.300 | is getting me there faster.
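His point about hard launches checks out with simple kinematics. Assuming constant acceleration up to an assumed city cruise speed, the entire time cost of launching from a stop works out to v/(2a), so doubling the acceleration only recovers a couple of seconds per light:

```python
def accel_time_penalty(cruise_speed: float, accel: float) -> float:
    """Extra seconds a constant-rate launch from a stop costs versus
    (hypothetically) already moving at cruise speed.
    Time accelerating: v/a. Distance covered: v^2/(2a), which would
    have taken v/(2a) at cruise speed, so the penalty is v/(2a)."""
    return cruise_speed / (2 * accel)

v = 15.0                              # m/s, ~34 mph assumed city speed
gentle = accel_time_penalty(v, 1.5)   # easy AV-style launch: 5.0 s lost
hard = accel_time_penalty(v, 3.0)     # pedal-down launch: 2.5 s lost
saved_per_light = gentle - hard       # ~2.5 s saved per green light
```

Even ten stoplights of aggressive launches buy back roughly 25 seconds, consistent with his claim that a gentle autonomous vehicle arrives within about 30 seconds of an aggressive driver on a 15-minute trip.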
00:39:44.460 | - Well, so that's, you know,
00:39:45.460 | some of the things I've studied,
00:39:46.700 | some of the things I'm fascinated by,
00:39:47.780 | the psychology of that.
00:39:48.820 | I don't think it matters
00:39:50.660 | that it doesn't get you there faster.
00:39:52.260 | It's the emotional release.
00:39:55.540 | Driving is a place, being inside of a car,
00:39:59.140 | somebody said it's like the real world version
00:40:00.900 | of being a troll.
00:40:02.980 | So you have this protection, this mental protection,
00:40:05.020 | you're able to sort of yell at the world,
00:40:06.660 | like release your anger, whatever it's about.
00:40:08.460 | So there's an element of that
00:40:10.020 | that I think autonomous vehicles
00:40:11.980 | would also have to, you know,
00:40:13.580 | giving an outlet to people,
00:40:15.380 | but it doesn't have to be through driving
00:40:18.540 | or honking or so on.
00:40:19.700 | There might be other outlets.
00:40:21.140 | But I think to sort of even just put that aside,
00:40:24.020 | the baseline is really, you know,
00:40:25.860 | that's the focus, that's the thing you need to solve,
00:40:28.140 | and then the fun human things can be solved after.
00:40:30.940 | But so from the baseline of just solving autonomous driving,
00:40:34.660 | you're working in San Francisco,
00:40:36.000 | one of the more difficult cities to operate in.
00:40:38.960 | What is, in your view, currently the hardest aspect
00:40:43.820 | of autonomous driving?
00:40:45.040 | Is it negotiating with pedestrians?
00:40:49.180 | Is it edge cases of perception?
00:40:51.400 | Is it planning?
00:40:52.760 | Is there mechanical engineering?
00:40:54.500 | Is it data, fleet stuff?
00:40:57.020 | What are your thoughts on the challenge,
00:40:59.180 | the more challenging aspects there?
00:41:01.180 | - That's a good question.
00:41:02.260 | I think before we go to that, though,
00:41:04.240 | I like what you said about the psychology aspect of this,
00:41:07.600 | 'cause I think one observation I've made is,
00:41:09.680 | I think I read somewhere that,
00:41:10.800 | I think it's maybe Americans on average
00:41:12.980 | spend over an hour a day on social media,
00:41:16.560 | like staring at Facebook.
00:41:18.320 | And so that's just, you know,
00:41:20.120 | 60 minutes of your life you're not getting back.
00:41:21.640 | It's probably not super productive.
00:41:23.120 | And so that's 3,600 seconds, right?
00:41:26.200 | And that's a lot of time you're giving up.
00:41:30.600 | And if you compare that to people being on the road,
00:41:34.080 | if another vehicle, whether it's a human driver
00:41:36.320 | or autonomous vehicle, delays them by even three seconds,
00:41:39.860 | they're laying in on the horn, you know,
00:41:41.900 | even though that's 1/1000th of the time they waste
00:41:45.300 | looking at Facebook every day.
00:41:46.380 | So there's definitely some psychology aspects of this,
00:41:50.020 | I think, that are pretty interesting,
00:41:50.900 | road rage in general.
00:41:51.740 | And then the question, of course,
00:41:52.960 | is if everyone is in self-driving cars,
00:41:54.940 | do they even notice these three-second delays anymore?
00:41:57.580 | 'Cause they're doing other things,
00:41:58.940 | or reading, or working, or just talking to each other.
00:42:01.740 | So it'll be interesting to see where that goes.
00:42:03.220 | - In a certain aspect, people need to be distracted
00:42:06.380 | by something entertaining, something useful inside the car,
00:42:09.180 | so they don't pay attention to the external world.
00:42:10.980 | And then they can take whatever psychology
00:42:14.260 | and bring it back to Twitter,
00:42:16.460 | and then focus on that as opposed to sort of interacting,
00:42:19.680 | sort of putting the emotion out there into the world.
00:42:23.220 | So it's an interesting problem, but baseline autonomy.
00:42:26.980 | - I guess you could say self-driving cars at scale
00:42:29.660 | will lower the collective blood pressure of society
00:42:32.260 | probably by a couple points
00:42:33.980 | without all that road rage and stress.
00:42:35.780 | So that's a good externality.
00:42:37.520 | So back to your question about the technology
00:42:41.820 | and I guess the biggest problems.
00:42:43.820 | And I have a hard time answering that question
00:42:45.620 | because we've been at this,
00:42:47.900 | like specifically focusing on driverless cars
00:42:51.500 | and all the technology needed to enable that
00:42:53.560 | for a little over four and a half years now.
00:42:55.220 | And even a year or two in,
00:42:58.180 | I felt like we had completed the functionality needed
00:43:03.040 | to get someone from point A to point B,
00:43:04.900 | as in if we need to do a left turn maneuver
00:43:07.380 | or if we need to drive around a double parked vehicle
00:43:10.340 | into oncoming traffic
00:43:11.940 | or navigate through construction zones,
00:43:13.940 | the scaffolding and the building blocks
00:43:16.340 | was there pretty early on.
00:43:17.900 | And so the challenge is not any one scenario or situation
00:43:22.460 | where we fail 100% of the time.
00:43:25.580 | It's more, we're benchmarking against a pretty good
00:43:29.000 | or pretty high standard, which is human driving.
00:43:31.380 | All things considered,
00:43:32.300 | humans are excellent at handling edge cases
00:43:35.020 | and unexpected scenarios where computers are the opposite.
00:43:38.460 | And so beating that baseline set by humans is the challenge.
00:43:43.140 | And so what we've been doing for quite some time now
00:43:46.580 | is basically it's this continuous improvement process
00:43:50.840 | where we find sort of the most uncomfortable
00:43:55.060 | or the things that could lead to a safety issue,
00:43:59.860 | other things, all these events.
00:44:00.980 | And then we sort of categorize them
00:44:02.520 | and rework parts of our system
00:44:04.580 | to make incremental improvements
00:44:06.200 | and do that over and over and over again.
00:44:08.060 | And we just see sort of the overall performance
00:44:10.180 | of the system actually increasing in a pretty steady clip.
00:44:13.980 | But there's no one thing.
00:44:15.380 | There's actually like thousands of little things
00:44:17.400 | and just like polishing functionality
00:44:19.900 | and making sure that it handles every version
00:44:22.940 | and possible permutation of a situation
00:44:26.120 | by either applying more deep learning systems
00:44:29.220 | or just by adding more test coverage
00:44:32.980 | or new scenarios that we develop against
00:44:35.760 | and just grinding on that.
00:44:37.180 | We're sort of in the unsexy phase of development right now,
00:44:40.140 | which is doing the real engineering work
00:44:41.820 | that it takes to go from prototype to production.
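The continuous-improvement loop Kyle describes, logging uncomfortable or safety-relevant events, categorizing them, and reworking the system against the most frequent failure modes, can be sketched in a few lines. This is a minimal illustration only; the event categories and field names here are hypothetical and do not come from Cruise's actual tooling:

```python
from collections import Counter

def triage(events):
    """Group logged driving events by category so the most frequent
    failure modes can be prioritized for rework first."""
    counts = Counter(e["category"] for e in events)
    # Most common categories first: these are the edge cases to grind on next.
    return counts.most_common()

# Hypothetical event log from simulation or road testing.
events = [
    {"category": "double_parked_vehicle", "severity": "discomfort"},
    {"category": "construction_zone",     "severity": "discomfort"},
    {"category": "double_parked_vehicle", "severity": "safety"},
    {"category": "unprotected_left",      "severity": "discomfort"},
    {"category": "double_parked_vehicle", "severity": "discomfort"},
]

priorities = triage(events)
print(priorities)  # double_parked_vehicle ranks first with 3 occurrences
```

Each iteration of the loop reworks the top-ranked categories, re-measures overall performance, and repeats, which is why the improvement shows up as a steady climb rather than one breakthrough.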
00:44:44.140 | - You're basically scaling the grinding,
00:44:47.900 | and taking seriously the process
00:44:50.580 | of all those edge cases,
00:44:52.940 | both with human experts and machine learning methods
00:44:56.420 | to cover all those situations.
00:44:59.340 | - Yeah, and the exciting thing for me
00:45:00.620 | is I don't think that grinding ever stops
00:45:02.980 | because there's a moment in time
00:45:04.860 | where you've crossed that threshold of human performance
00:45:08.780 | and become superhuman.
00:45:09.980 | But there's no reason,
00:45:12.180 | there's no first principles reason
00:45:13.560 | that AV capability will tap out anywhere near humans.
00:45:17.560 | Like there's no reason it couldn't be 20 times better,
00:45:20.300 | whether that's just better driving or safer driving
00:45:22.860 | or more comfortable driving,
00:45:24.260 | or even a thousand times better given enough time.
00:45:26.820 | And we intend to basically chase that forever
00:45:31.500 | to build the best possible product.
00:45:32.840 | - Better and better and better
00:45:33.940 | and always new edge cases come up and new experiences.
00:45:36.380 | So, and you wanna automate that process
00:45:39.540 | as much as possible.
00:45:40.700 | So what do you think in general in society,
00:45:45.180 | when do you think we may have hundreds of thousands
00:45:48.200 | of fully autonomous vehicles driving around?
00:45:50.200 | So first of all, predictions, nobody knows the future.
00:45:53.520 | You're a part of the leading people
00:45:55.320 | trying to define that future,
00:45:56.520 | but even then you still don't know.
00:45:58.520 | But if you think about hundreds of thousands of vehicles,
00:46:02.200 | so a significant fraction of vehicles
00:46:05.780 | in major cities are autonomous.
00:46:07.560 | Do you think, are you with Rodney Brooks
00:46:10.760 | who is 2050 and beyond?
00:46:13.920 | Or are you more with Elon Musk
00:46:17.140 | who is, we should have had that two years ago?
00:46:19.540 | - Well, I mean, I'd love to have it two years ago,
00:46:23.820 | but we're not there yet.
00:46:26.100 | So I guess the way I would think about that
00:46:28.460 | is let's flip that question around.
00:46:31.220 | So what would prevent you
00:46:33.500 | from reaching hundreds of thousands of vehicles?
00:46:35.940 | - And that's a good, that's a good rephrasing.
00:46:38.140 | - Yeah, so the,
00:46:42.060 | I'd say the, it seems the consensus
00:46:44.760 | among the people developing self-driving cars today
00:46:49.440 | is to sort of start with some form of an easier environment,
00:46:53.440 | whether it means lacking inclement weather
00:46:55.880 | or mostly sunny or whatever it is,
00:46:59.060 | and then add capability
00:47:01.920 | for more complex situations over time.
00:47:05.040 | And so if you're only able to deploy
00:47:07.480 | in areas that meet sort of your criteria
00:47:11.920 | or the current domain,
00:47:13.180 | operating domain of the software you developed,
00:47:15.580 | that may put a cap on how many cities you could deploy in.
00:47:18.980 | But then as those restrictions start to fall away,
00:47:21.540 | like maybe you add capability to drive really well
00:47:23.980 | and safely in heavy rain or snow,
00:47:26.360 | that probably opens up the market by two or three fold
00:47:30.320 | in terms of the cities you can expand into and so on.
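The gating logic Kyle outlines, where the software's current operating domain caps which cities you can deploy in, and each new capability like heavy rain opens up more of the market, can be sketched as a simple domain check. The city names and condition labels below are illustrative assumptions, not Cruise's actual deployment criteria:

```python
def deployable_cities(cities, supported_conditions):
    """Return cities whose driving conditions all fall within the
    software's current operating domain; expanding the domain
    expands the set of deployable markets."""
    return [c["name"] for c in cities
            if all(cond in supported_conditions for cond in c["conditions"])]

# Hypothetical cities and the conditions an AV must handle to serve them.
cities = [
    {"name": "Phoenix",       "conditions": {"sunny"}},
    {"name": "San Francisco", "conditions": {"sunny", "fog"}},
    {"name": "Seattle",       "conditions": {"heavy_rain"}},
    {"name": "Boston",        "conditions": {"heavy_rain", "snow"}},
]

print(deployable_cities(cities, {"sunny"}))
# → ['Phoenix']
print(deployable_cities(cities, {"sunny", "fog", "heavy_rain"}))
# → ['Phoenix', 'San Francisco', 'Seattle']
```

Adding one capability (heavy rain) to the supported set expands the deployable market from one city to three in this toy example, which mirrors the "two or three fold" expansion Kyle mentions.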
00:47:33.180 | And so the real question is,
00:47:34.980 | I know today if we wanted to,
00:47:37.300 | we could produce that many autonomous vehicles,
00:47:40.340 | but we wouldn't be able to make use of all of them yet
00:47:42.240 | 'cause we would sort of saturate the demand in the cities
00:47:45.440 | in which we would want to operate initially.
00:47:47.640 | So if I were to guess like what the timeline is
00:47:50.760 | for those things falling away
00:47:51.920 | and reaching hundreds of thousands of vehicles.
00:47:54.280 | - Maybe a range is better.
00:47:56.400 | - I would say less than five years.
00:47:57.920 | - Less than five years. - Yeah.
00:47:59.680 | - And of course you're working hard to make that happen.
00:48:03.280 | So you started two companies that were eventually acquired
00:48:06.120 | for a billion dollars each.
00:48:09.480 | So you're a pretty good person to ask,
00:48:11.380 | what does it take to build a successful startup?
00:48:13.780 | - I think there's sort of survivor bias here a little bit,
00:48:19.340 | but I can try to find some common threads
00:48:20.620 | for the things that worked for me, which is,
00:48:22.820 | in both of these companies,
00:48:26.820 | I was really passionate about the core technology.
00:48:28.720 | I actually like lay awake at night
00:48:30.660 | thinking about these problems and how to solve them.
00:48:33.060 | And I think that's helpful
00:48:34.100 | because when you start a business,
00:48:35.660 | there are like to this day,
00:48:38.540 | there are these crazy ups and downs.
00:48:40.480 | Like one day you think the business is just on,
00:48:42.200 | you're just on top of the world and unstoppable.
00:48:44.040 | And the next day you think, okay, this is all gonna end.
00:48:46.720 | You know, it's just going south
00:48:48.320 | and it's gonna be over tomorrow.
00:48:49.880 | And so I think like having a true passion
00:48:54.680 | that you can fall back on
00:48:55.560 | and knowing that you would be doing it
00:48:56.840 | even if you weren't getting paid for it
00:48:58.360 | helps you weather those tough times.
00:49:01.040 | So that's one thing.
00:49:01.880 | I think the other one is really good people.
00:49:05.340 | So I've always been surrounded by really good co-founders
00:49:08.000 | that are logical thinkers,
00:49:10.120 | are always pushing their limits
00:49:11.320 | and have very high levels of integrity.
00:49:13.000 | So that's Dan Kan at my current company
00:49:14.880 | and actually his brother and a couple other guys
00:49:17.760 | for Justin.tv and Twitch.
00:49:19.520 | And then I think the last thing is
00:49:21.280 | just I guess persistence or perseverance.
00:49:25.640 | And that can apply to sticking to sort of,
00:49:30.080 | or having conviction around the original premise
00:49:32.520 | of your idea and sticking around to do all the unsexy work
00:49:36.680 | to actually make it come to fruition,
00:49:38.520 | including dealing with whatever it is
00:49:42.200 | that you're not passionate about,
00:49:43.540 | whether that's finance or HR or operations or those things.
00:49:48.080 | As long as you are grinding away
00:49:50.200 | and working towards that North Star for your business,
00:49:52.880 | whatever it is, and you don't give up
00:49:55.440 | and you're making progress every day,
00:49:56.880 | it seems like eventually you'll end up in a good place.
00:49:59.000 | And the only things that can slow you down
00:50:00.200 | are running out of money
00:50:01.680 | or I suppose your competitors destroying you.
00:50:03.400 | But I think most of the time it's people giving up
00:50:06.320 | or somehow destroying things themselves
00:50:08.360 | rather than being beaten by their competition
00:50:09.960 | or running out of money.
00:50:11.400 | - Yeah, if you never quit, eventually you'll arrive.
00:50:15.120 | - That's a much more concise version
00:50:16.520 | of what I was trying to say.
00:50:17.360 | Yeah, it was good.
00:50:18.840 | - So you went the Y Combinator route twice.
00:50:20.920 | - Yeah.
00:50:21.760 | - What do you think, in a quick question,
00:50:23.960 | do you think is the best way to raise funds
00:50:25.680 | in the early days?
00:50:27.080 | Or not just funds, but just community,
00:50:30.240 | develop your idea and so on.
00:50:32.280 | Can you do it solo or maybe with a co-founder
00:50:37.280 | with like self-funded?
00:50:38.960 | Do you think Y Combinator is good?
00:50:40.160 | Is it good to do VC route?
00:50:41.680 | Is there no right answer
00:50:42.880 | or is there from the Y Combinator experience
00:50:45.800 | something that you could take away
00:50:47.240 | that that was the right path to take?
00:50:48.840 | - There's no one size fits all answer,
00:50:50.480 | but if your ambition I think is to,
00:50:52.240 | you know, see how big you can make something
00:50:55.200 | or rapidly expand and capture a market
00:50:58.880 | or solve a problem or whatever it is,
00:51:00.800 | then going the venture-backed route
00:51:03.000 | is probably a good approach
00:51:04.200 | so that capital doesn't become your primary constraint.
00:51:07.880 | Y Combinator, I love because it puts you
00:51:11.000 | in this sort of competitive environment
00:51:13.920 | where you're surrounded by the top maybe 1%
00:51:17.240 | of other really highly motivated peers
00:51:20.440 | who are in the same place.
00:51:22.080 | And that environment I think just breeds success, right?
00:51:27.080 | If you're surrounded by really brilliant,
00:51:28.400 | hardworking people, you're gonna feel, you know,
00:51:31.200 | sort of compelled or inspired
00:51:32.440 | to try to emulate them or beat them.
00:51:34.920 | And so even though I had done it once before
00:51:37.880 | and I felt like, you know, I'm pretty self-motivated,
00:51:41.360 | I thought like, look, this is gonna be a hard problem.
00:51:43.040 | I can use all the help I can get.
00:51:44.680 | So surrounding myself with other entrepreneurs
00:51:46.760 | is gonna make me work a little bit harder
00:51:48.360 | or push a little harder than it's worth it.
00:51:51.000 | And so that's why I did it, you know,
00:51:52.600 | for example, the second time.
00:51:54.160 | - Let's go philosophical existential.
00:51:57.080 | If you go back and do something differently in your life,
00:52:00.680 | starting in high school and MIT, leaving MIT,
00:52:05.680 | you could have gone the PhD route, doing the startup,
00:52:09.320 | going to see about a startup in California,
00:52:12.800 | or maybe some aspects of fundraising.
00:52:15.400 | Is there something you regret,
00:52:17.120 | something you not necessarily regret,
00:52:19.280 | but if you go back, you could do differently?
00:52:22.200 | - I think I've made a lot of mistakes,
00:52:23.680 | like, you know, pretty much everything you can screw up,
00:52:25.820 | I think I've screwed up at least once,
00:52:28.520 | but I, you know, I don't regret those things.
00:52:30.400 | I think it's hard to look back on things,
00:52:32.720 | even if it didn't go well and call it a regret,
00:52:34.560 | because hopefully, you know,
00:52:35.600 | it took away some new knowledge or learning from that.
00:52:38.200 | So I would say there was a period,
00:52:43.200 | yeah, the closest I can come to is this.
00:52:47.400 | There's a period in Justin.tv, I think after seven years,
00:52:51.280 | where, you know, the company was going one direction,
00:52:55.080 | which was towards Twitch and video gaming,
00:52:57.260 | and I'm not a video gamer.
00:52:58.700 | I don't really even use Twitch at all.
00:53:00.880 | And I was still working on the core technology there,
00:53:04.480 | but my heart was no longer in it,
00:53:06.260 | because the business that we were creating
00:53:07.760 | was not something that I was personally passionate about.
00:53:10.020 | - It didn't meet your bar of existential impact.
00:53:12.360 | - Yeah, and I'd say I probably spent an extra year
00:53:15.900 | or two working on that, and I'd say, like,
00:53:19.420 | I would have just tried to do something different sooner,
00:53:23.060 | 'cause those were two years where I felt like,
00:53:25.360 | you know, from this philosophical or existential thing,
00:53:29.560 | I just felt that something was missing.
00:53:31.440 | And so I would have, if I could look back now
00:53:33.640 | and tell myself, it's like,
00:53:34.480 | I would have said exactly that.
00:53:35.600 | Like, you're not getting any meaning out of your work
00:53:37.840 | personally right now.
00:53:39.380 | You should find a way to change that.
00:53:41.680 | And that's part of the pitch I use
00:53:44.300 | to basically everyone who joins Cruise today.
00:53:46.000 | It's like, hey, you've got that now by coming here.
00:53:48.440 | - Well, maybe you needed the two years
00:53:49.880 | of that existential dread to develop the feeling
00:53:52.440 | that ultimately it was the fire that created Cruise.
00:53:55.040 | So you never know.
00:53:55.880 | You can't-- - Good theory, yeah.
00:53:57.240 | - So last question.
00:53:58.820 | What does 2019 hold for Cruise?
00:54:02.000 | - After this, I guess we're gonna go
00:54:03.200 | and I'll talk to your class,
00:54:04.640 | but one of the big things is going from prototype
00:54:06.920 | to production for autonomous cars,
00:54:08.960 | and what does that mean?
00:54:09.800 | What does that look like?
00:54:10.620 | And 2019 for us is the year that we try
00:54:13.880 | to cross over that threshold and reach, you know,
00:54:16.120 | superhuman level of performance to some degree
00:54:18.040 | with the software and have all the other
00:54:20.940 | of the thousands of little building blocks in place
00:54:23.320 | to launch, you know, our first commercial product.
00:54:27.400 | So that's what's in store for us.
00:54:30.320 | And we've got a lot of work to do.
00:54:32.720 | We've got a lot of brilliant people working on it.
00:54:35.680 | So it's all up to us now.
00:54:37.660 | - Yeah, from Charlie Miller and Chris Valasek,
00:54:40.360 | like the people I've crossed paths with.
00:54:42.280 | - Oh, great, yeah.
00:54:43.460 | - It sounds like you have an amazing team.
00:54:45.680 | So like I said, it's one of the most,
00:54:48.300 | I think one of the most important problems
00:54:50.400 | in artificial intelligence of the century.
00:54:52.080 | It'll be one of the most defining.
00:54:53.520 | That's super exciting that you work on it.
00:54:55.360 | And the best of luck in 2019.
00:54:59.040 | I'm really excited to see what Cruise comes up with.
00:55:01.520 | - Thank you.
00:55:02.340 | Thanks for having me today.
00:55:03.180 | - Thanks, Kyle.
00:55:04.020 | (upbeat music)