
Tekedra Mawakana | All-In Summit 2024


Chapters

0:00 The Besties welcome Waymo Co-CEO Tekedra Mawakana
1:43 Joining Waymo, breaking down their approach vs competitors
9:57 Focus on reducing deaths, impact on Uber, creating an AV ecosystem
16:44 Thoughts on Cruise's issues in SF, operating at scale, unit economics
22:09 Selling to carmakers, US focus vs international
27:51 Hardest challenge while leading Waymo

Whisper Transcript | Transcript Only Page

00:00:00.000 | [VIDEO PLAYBACK]
00:00:00.660 | - Hop in a driverless taxi with the Google-owned self-driving
00:00:03.640 | company Waymo opening up its robo-taxis
00:00:05.920 | to all users in that city.
00:00:08.200 | - The technology, of course, is hugely expensive.
00:00:11.520 | - Alphabet's autonomous vehicle unit,
00:00:13.520 | Waymo, completing another major round of funding.
00:00:16.840 | - Tekedra has served as co-CEO at Waymo since 2021.
00:00:21.040 | - We are on a mission to build the world's most experienced
00:00:24.360 | driver and make it safe and easy for people and things
00:00:26.600 | to get where they're going.
00:00:27.840 | - This is not a move fast and break things.
00:00:29.720 | This is a move with great, great iterative pace and focus
00:00:33.740 | on safety.
00:00:34.240 | - Please join me in welcoming Tekedra from Waymo to the stage.
00:00:44.240 | - Hi.
00:00:45.240 | - Thank you for coming.
00:00:46.240 | - Hi, thank you.
00:00:46.920 | - It's a pleasure.
00:00:47.480 | - Thanks for coming.
00:00:48.280 | - Thank you so much.
00:00:49.400 | - Hi.
00:00:49.900 | - How are you?
00:00:50.400 | Good to see you.
00:00:51.360 | - Hi, David.
00:00:51.880 | Thank you.
00:00:53.240 | - Tekedra, thanks so much for being here.
00:00:55.000 | - Yeah, thank you.
00:00:56.000 | - Tekedra's been co-CEO of Waymo since 2021,
00:00:59.560 | where she leads the company's business operations
00:01:01.680 | and expansion in autonomous driving.
00:01:04.040 | Recently, Waymo has been reported to be completing--
00:01:06.720 | I don't know if Waymo put this out or if it was in the news,
00:01:09.220 | but over 100,000 paid trips per week now.
00:01:11.680 | Is that public?
00:01:12.320 | Is that like a true thing?
00:01:13.800 | - It's completely true.
00:01:14.920 | And it's over 100,000.
00:01:16.320 | - Over 100,000.
00:01:17.840 | - Per week, yeah.
00:01:19.160 | - And we can now see Waymo's autonomous ride-hailing
00:01:22.360 | services in Phoenix, LA, San Francisco in the Bay Area,
00:01:25.760 | Austin.
00:01:27.280 | And I think on X, you announced $5 billion
00:01:30.040 | in additional committed funding from Alphabet--
00:01:32.160 | I don't know if that was recent--
00:01:33.540 | to support Waymo's expansion.
00:01:34.960 | - It was recent, yes.
00:01:35.960 | - Congratulations.
00:01:36.760 | - Thank you.
00:01:37.560 | - So thanks for joining us.
00:01:39.680 | The Waymo story started a long time ago as a Google X project.
00:01:43.960 | When did you join, and how did you kind of become co-CEO?
00:01:47.400 | - Yeah, thank you.
00:01:49.080 | So Waymo started as the Google Self-Driving Car Project
00:01:51.840 | back in 2009.
00:01:53.800 | And actually, my co-CEO was one of those founding team members,
00:01:57.920 | Dmitry Dolgov.
00:01:59.400 | And in 2016, at the end of 2016, we spun out
00:02:03.960 | and sort of became Waymo.
00:02:06.240 | And that's when I joined.
00:02:08.480 | At that point, it had been determined
00:02:10.800 | that the technology was on a good enough path
00:02:13.280 | to start to think about commercialization and scale.
00:02:17.360 | And so the fun thing about joining a moonshot
00:02:20.840 | is I came from a big tech company.
00:02:23.160 | So I'm like, oh, product market fit.
00:02:24.640 | Like, it's all there.
00:02:25.520 | I'm going to join, and we're going to scale.
00:02:27.800 | And the reality is it was an opportunity
00:02:30.200 | to learn how much this is a process of discovery,
00:02:34.720 | how much this is a process of figuring out
00:02:37.040 | not only the technology roadmap, but also
00:02:39.560 | the external environment.
00:02:40.940 | And at that time, I had been head of policy
00:02:43.760 | at a lot of companies.
00:02:45.520 | And I joined under the idea that if the technology works,
00:02:49.000 | the market opportunity would come
00:02:50.760 | from being able to open markets.
00:02:52.900 | And so that was super terrifying to me,
00:02:54.840 | because if you're at most large tech companies doing policy
00:02:58.040 | work, you're actually trying to avoid costs,
00:03:02.000 | or you're trying to defeat legislation that's harmful.
00:03:04.880 | But this was really the opportunity of, like,
00:03:07.200 | how do you usher in this amazing technology?
00:03:10.960 | And so that's when I joined.
00:03:12.120 | And then I was asked by my prior CEO
00:03:15.320 | to help launch the first market, which was Phoenix.
00:03:17.480 | Back then, it was Chandler.
00:03:19.400 | And we launched that back in 2017.
00:03:21.840 | And we've been-- since 2018, a lot of people don't realize,
00:03:25.400 | we've had a 24/7 ride-hailing service around the clock.
00:03:30.840 | It was in 2020, during COVID, though,
00:03:33.760 | that we removed the driver from behind the wheel.
00:03:36.920 | So we have the Waymo driver that's
00:03:39.440 | driving the vehicles at all time.
00:03:41.080 | But there's no human behind the wheel.
00:03:42.980 | And we've been offering that service now.
00:03:44.680 | We're almost at four years, 24/7,
00:03:47.840 | not only in Phoenix, which is a 315-square-mile territory,
00:03:52.120 | but also in San Francisco and now here in LA.
00:03:54.640 | So in that sort of ushering in from market one to this role,
00:03:58.720 | I moved through the COO role.
00:04:00.780 | And that's when we actually removed the driver.
00:04:02.740 | And we're the first company that's done that.
00:04:05.760 | And so it was daunting.
00:04:08.320 | And I think that's one thing I love about Waymo.
00:04:11.160 | We call our team Waymonauts, is you just
00:04:13.720 | have to, every time, get to this point of determining what's
00:04:17.560 | good enough, what's safe enough, and then make that leap.
00:04:22.320 | And then learn, because you can't learn until you
00:04:24.360 | get to that next point.
00:04:25.600 | And so, of course, we have rigorous safety culture.
00:04:29.000 | But it was in that time that I was COO.
00:04:30.880 | And then in April of '21, I moved into the co-CEO role
00:04:34.520 | with Dmitry.
00:04:35.440 | So now that it's scaling, 100,000 paid rides a week,
00:04:38.520 | maybe we can talk a little bit.
00:04:39.880 | I mean, Jason's spent a lot of time on this.
00:04:40.680 | Well, I think maybe starting with security
00:04:42.520 | would be an interesting--
00:04:43.880 | safety, rather, would be a really good idea.
00:04:47.160 | You've done, it seems like, millions of rides now.
00:04:50.200 | No fatalities, correct?
00:04:51.800 | That's right.
00:04:52.920 | But there have been accidents.
00:04:54.440 | So when you look at the accidents that have occurred,
00:04:58.560 | how often does an accident occur?
00:05:00.080 | And in what percentage of the time
00:05:01.920 | is it just somebody running into the Waymo versus the Waymo
00:05:05.280 | running into somebody else?
00:05:06.600 | Yeah.
00:05:07.320 | So we've done 2 million paid rides.
00:05:12.480 | So when we think about-- we just released this week
00:05:15.960 | our safety hub, so you're sort of tapping into an issue
00:05:18.760 | that's really important.
00:05:20.280 | What we've been able to determine
00:05:21.920 | as of June of this year, with over 22 million,
00:05:25.400 | what we refer to as rider-only, or RO, miles--
00:05:27.640 | those are miles that are fully autonomous--
00:05:30.840 | we've been able to have 83% fewer airbag deployments, which
00:05:35.600 | of course is important because those are high impact,
00:05:38.880 | and 73% fewer injury-causing crashes.
00:05:45.040 | Now, that's what we're focused on.
00:05:46.480 | Over a human.
00:05:47.160 | Over a traditional human.
00:05:49.040 | Meaning the counterfactual in that area
00:05:51.520 | at that same point in time--
00:05:52.960 | Exactly.
00:05:53.640 | --via insurance claims and other data.
00:05:55.280 | Exactly.
00:05:56.040 | We're comparing in San Francisco, not on highways.
00:05:58.960 | There's an apples-to-apples comparison.
00:06:00.600 | We're comparing in San Francisco or in Phoenix.
00:06:03.360 | OK, great.
00:06:04.200 | Because this is the next piece, which is,
00:06:06.360 | you're not yet on highways.
00:06:08.000 | That's right.
00:06:08.600 | We're testing on highways in Phoenix and San Francisco.
00:06:11.120 | With safety drivers.
00:06:11.960 | With employees and a human behind the wheel.
00:06:14.400 | Right.
00:06:14.920 | At times.
00:06:15.640 | And maybe you could explain to us
00:06:18.680 | why you're taking that approach.
00:06:20.080 | Because in my experience driving a Tesla
00:06:22.320 | since the beginning with full self-driving and autopilot
00:06:25.280 | before that, it seems like it's better on the highway
00:06:29.280 | than it is on local streets, where
00:06:31.280 | you have people jumping out and weird things that
00:06:34.280 | occur on the highway.
00:06:35.160 | You very rarely have to run into a weird thing.
00:06:37.760 | It's pretty ABC.
00:06:39.120 | So I would think it was the opposite.
00:06:41.320 | But is it because speed kills?
00:06:42.680 | What's the thinking there?
00:06:44.080 | So I think, just for everyone to orient,
00:06:48.600 | when you think about the levels of autonomy,
00:06:50.400 | we are only focused on level four and above.
00:06:53.200 | Level four and above means you do not
00:06:55.040 | need a driver's license, nor do you
00:06:56.560 | need to sit behind the wheel.
00:06:58.640 | So when you're focused on tackling the hardest challenge
00:07:02.440 | first, it's really important for the driver, the Waymo driver,
00:07:06.560 | to ingest dense urban environments.
00:07:09.560 | Because that is where the driver is going to learn the most.
00:07:12.040 | Sitting on freeways, seeing the same thing every day
00:07:15.000 | can be predictable.
00:07:16.200 | It's not entirely predictable, but it's not additive
00:07:19.000 | to what the machine learns.
00:07:20.160 | That's a great answer.
00:07:21.080 | So we're trying to learn as much as we can learn.
00:07:23.720 | And we've always, from the beginning,
00:07:26.080 | been focused on freeways.
00:07:27.680 | It's just that freeways isn't actually
00:07:29.480 | how the machine learns the fastest
00:07:31.320 | in the most sophisticated environments.
00:07:33.280 | But because you brought up sort of a level three or two
00:07:36.680 | or one system, I think it's also important that for systems
00:07:40.680 | where you need someone to sit behind the wheel,
00:07:43.520 | you can have a completely different approach
00:07:45.440 | to this technology.
00:07:46.320 | But we're really--
00:07:47.400 | I don't know how many of you here have had the chance
00:07:49.600 | to take a Waymo ride here in LA.
00:07:51.480 | It actually includes this campus where we're sitting right now.
00:07:54.800 | But you can't sit behind the wheel.
00:07:57.240 | So there's no ambiguity.
00:07:58.560 | You sit in the passenger seat, or you sit in the back seat.
00:08:01.200 | So let's talk about the LiDAR versus camera bet.
00:08:03.800 | Elon's been pretty clear, hey, you don't need LiDAR.
00:08:07.200 | It's a waste of money.
00:08:08.080 | The LiDAR on Waymos, I believe, is still in the $10,000
00:08:10.960 | to $20,000 incremental cost.
00:08:12.880 | Am I ballpark correct?
00:08:13.840 | I'm not sure.
00:08:15.480 | Oh, OK.
00:08:16.800 | So with that--
00:08:18.360 | As you scale, costs go down.
00:08:19.720 | Yeah, of course.
00:08:20.600 | So who's right?
00:08:23.440 | I mean, are you using the LiDAR, and is that
00:08:25.480 | a critical piece of this?
00:08:26.560 | Or do you think eventually, hey, the cameras
00:08:29.000 | are seeing in 360 degrees.
00:08:31.040 | They have higher fidelity than human eyes.
00:08:33.360 | And if humans do a great job, and you just
00:08:35.080 | have to beat the humans at driving,
00:08:36.400 | why even have the LiDAR?
00:08:37.520 | What do you think?
00:08:38.840 | So obviously, we have conviction that we need LiDAR, radar,
00:08:42.160 | and cameras.
00:08:43.440 | We have all three.
00:08:44.560 | It's given us the opportunity to scale at this point.
00:08:46.800 | We're the only company that's doing level 4 driving
00:08:49.440 | on public roads 24 hours a day.
00:08:52.240 | It's very different to get to the place of a demo.
00:08:54.680 | We learned that ourselves back in 2015, and '16, and '17.
00:08:58.880 | We had an autopilot highway product.
00:09:03.000 | We allowed employees, at the time Google employees,
00:09:05.320 | to ride in that product.
00:09:06.840 | We told them to pay attention.
00:09:08.600 | We told them, you have to be alert and be
00:09:10.600 | prepared to take over.
00:09:12.240 | What did they do?
00:09:13.200 | We're human.
00:09:14.040 | We want to do other stuff when we're in the car.
00:09:16.480 | Like, I want to be on the phone.
00:09:17.880 | I want to get to that meeting.
00:09:19.120 | I need to send that last email.
00:09:20.400 | I need to check whether or not my son's doctor's appointment
00:09:22.960 | is tomorrow or next Thursday.
00:09:24.160 | We're distracted.
00:09:25.520 | And so what happens, the thing's beeping at you.
00:09:27.560 | It's telling you to re-engage and pay attention.
00:09:30.240 | And what we learned is that that's not
00:09:32.400 | a way to improve road safety.
00:09:34.320 | And so the question is, what's the end goal?
00:09:36.440 | Is it to improve road safety or not?
00:09:38.000 | For us as a company, we're obsessed.
00:09:40.560 | Our culture is about safety.
00:09:42.240 | This mission is about safety.
00:09:43.680 | And we want to expand the number of people who have access
00:09:47.480 | to mobility options.
00:09:48.880 | If you need a driver's license, then it's
00:09:51.360 | just like every other mobility platform.
00:09:54.040 | You have to be behind the wheel.
00:09:56.400 | Sorry, go ahead.
00:09:57.480 | I'm curious about when this flips from a technology
00:10:00.960 | problem and a go-to-market problem to one of public health.
00:10:06.320 | Because that's a staggering fact.
00:10:08.000 | And we all know people that have lost somebody close to them,
00:10:13.160 | whether it's just a road accident or to drunk driving
00:10:15.640 | or to--
00:10:16.440 | I don't know if anybody here follows hockey,
00:10:18.720 | but Johnny Hockey who was killed with his brother
00:10:21.160 | the day before his sister's wedding by a drunk driver.
00:10:23.560 | This is so avoidable.
00:10:26.040 | So what does it take for a politician
00:10:28.800 | or a group of politicians-- and I don't even
00:10:30.760 | know at which level this decision would get made--
00:10:33.200 | where they say, OK, we've seen enough.
00:10:34.800 | That should be the only solution.
00:10:36.720 | And are you pushing for that?
00:10:38.760 | So we are pushing to make sure people
00:10:42.000 | who put autonomous vehicles on the road
00:10:44.120 | have to demonstrate their safety case.
00:10:46.240 | We think the worst thing that could happen
00:10:48.480 | is introducing a new technology that doesn't actually
00:10:51.680 | improve this problem.
00:10:53.040 | Because it'll kill the whole industry.
00:10:54.660 | Because it'll kill the whole industry.
00:10:56.240 | And the state of affairs, 40,000 Americans, 1.35 million globally,
00:11:01.760 | dying annually from road crashes, is avoidable.
00:11:05.760 | And so the challenge that we have
00:11:07.960 | to tackle with a lot of humility is how prepared is the public--
00:11:14.760 | humans can kill other people.
00:11:16.800 | How prepared is the public to accept that this
00:11:19.600 | isn't going to be a panacea, and it's not going to be perfect?
00:11:23.360 | You didn't hear me say 100%, 100%, 100%, right?
00:11:25.800 | It was 70.
00:11:26.320 | There'll be a small error rate.
00:11:27.360 | There's going to be a small-- and so that's
00:11:29.240 | the work that we're doing, which is we have community
00:11:31.560 | partners, Mothers Against Drunk Driving, National Safety
00:11:34.040 | Council, a host of organizations who've
00:11:36.360 | been trying to tackle this issue, who we partnered
00:11:38.960 | with early because it's actually in both of our interests
00:11:42.680 | to figure out how to tell the story that if this technology
00:11:46.360 | can reduce 60% of the fatalities,
00:11:51.440 | then it's worth the exposure to the risks.
00:11:53.600 | Are you on any sort of internal shot clock,
00:11:56.320 | meaning there must have been a moment where
00:11:58.360 | you convened a meeting, or you guys had a meeting,
00:12:00.520 | and you said, all right, we have enough data.
00:12:02.480 | Let's go.
00:12:02.960 | We're going to pull the guy out of the car.
00:12:04.720 | Or, hey, this is it.
00:12:05.680 | We're ready.
00:12:06.160 | And then you show it to regulators, and they say, OK.
00:12:09.120 | Is there a set of these really important milestones
00:12:12.800 | that we're going to feel like goes to other cities?
00:12:16.200 | Is it only grid cities where this works?
00:12:18.280 | So where do you go from here?
00:12:19.960 | Yeah, I hope that what, over time, you start to experience
00:12:24.000 | is, one, because we are super focused on safety,
00:12:27.840 | you'll see us scaling.
00:12:31.040 | And because we're building a driver and not a car,
00:12:34.960 | you'll see us scaling in a number of ways.
00:12:36.840 | So we have partnerships.
00:12:37.880 | Right now, we have a partnership with Uber in Phoenix.
00:12:41.000 | Why are we doing that?
00:12:41.920 | People are like, oh, my gosh.
00:12:43.120 | I got my Uber Eats in a Waymo.
00:12:45.000 | Why are you all doing that?
00:12:46.160 | It's because we're building a driver.
00:12:47.740 | And so we're trying to figure out
00:12:49.120 | all of the ways in which this driver could
00:12:51.080 | be applied to future use cases.
00:12:53.440 | And so over time, you're certainly
00:12:55.360 | going to see us announce new places,
00:12:56.900 | but you're also going to see, like in San Francisco,
00:12:59.560 | we launched a 24-hour service in a limited territory.
00:13:02.800 | And over time, we've expanded that territory.
00:13:04.640 | Is it a step function in complexity
00:13:06.360 | to go to cities that are more--
00:13:08.760 | how would I call it-- more chaotically city planned?
00:13:11.720 | It is definitely an opportunity for the driver to learn more.
00:13:15.360 | And so coming to Los Angeles, for example,
00:13:17.360 | was more like being in Phoenix.
00:13:19.560 | Going to Austin--
00:13:20.280 | Grids.
00:13:20.840 | Exactly.
00:13:21.680 | Exactly.
00:13:22.160 | Going to Manhattan, we will certainly
00:13:23.920 | benefit from our work.
00:13:25.000 | And San Francisco, same thing with DC.
00:13:27.300 | So start with the grid cities and work your way
00:13:29.740 | into more complex cities.
00:13:31.140 | So can I follow up on that last point?
00:13:33.740 | I have a friend who's the third or fourth investor in Uber.
00:13:37.700 | And he's very worried that once this product scales,
00:13:46.740 | there's not really going to be a need for Uber anymore.
00:13:49.180 | Why would you need Uber if you can just
00:13:51.340 | summon the robot car directly, or the robot driver,
00:13:53.940 | whatever you want to call it, right?
00:13:55.440 | Driver, car, whatever.
00:13:56.700 | I mean, doesn't this logically disrupt Uber once it's
00:14:01.660 | at scale?
00:14:03.980 | I think--
00:14:06.140 | [LAUGHTER]
00:14:08.340 | Well, there are current partners.
00:14:09.720 | As I said, we have a partnership with--
00:14:11.740 | right, I mean, we have a partnership.
00:14:13.860 | Your friend hasn't sold enough shares.
00:14:16.620 | Your friend's doing just fine.
00:14:18.140 | We have the partnership with Uber
00:14:20.000 | because we think it's important for us
00:14:22.260 | to create an AV ecosystem around this technology.
00:14:26.180 | This isn't going to be we've created technology.
00:14:29.260 | We're going to figure out everything ourselves.
00:14:31.540 | That would be foolish and very expensive.
00:14:33.420 | And we'd never find ourselves profitable.
00:14:35.500 | But having companies that are experts in integration,
00:14:39.300 | having automotive partner--
00:14:41.700 | we're not building the car--
00:14:43.060 | having network partners, having fleet operating
00:14:46.060 | partners, public officials.
00:14:48.540 | There's a lot that is built around us.
00:14:51.540 | And these are the early days, the early innings.
00:14:53.580 | And so that's part of it is learning.
00:14:55.820 | I'd say five years ago, people would say to me--
00:14:59.700 | network operators would say, oh, this is exactly--
00:15:02.260 | we know how to do this.
00:15:03.820 | Well, this is a constrained supply.
00:15:05.820 | It's not a supply you just surge.
00:15:08.380 | This is a driver that you need to understand actually
00:15:11.220 | how to operate with.
00:15:12.140 | And so I think we're going through those steps
00:15:14.100 | to learn together.
00:15:14.980 | And so what I would say is there's
00:15:16.500 | room for a lot of people--
00:15:18.700 | is the insurance lobby and the insurance industry a proponent?
00:15:23.340 | Or would they try to constrain your progress?
00:15:28.780 | So one of the largest reinsurers in the world, Swiss Re,
00:15:32.460 | has actually been doing our safety--
00:15:35.060 | Yeah, they would want to underwrite this.
00:15:37.380 | But I'm talking about more like the GEICOs of the world.
00:15:39.220 | Same.
00:15:39.740 | I think they're very eager to get the data sets that we're
00:15:42.980 | releasing.
00:15:43.660 | We released 20 safety papers last year.
00:15:46.580 | They're all wanting to understand
00:15:48.340 | what the actual impact of this technology can be.
00:15:50.780 | Because one of the things I think
00:15:53.180 | that's hard for us to grasp as humans-- or at least I'll
00:15:56.620 | say for me.
00:15:57.220 | I don't like speaking for everybody.
00:15:59.780 | We drive in a week more than a human drives in a lifetime.
00:16:05.460 | And you can say that as a company that runs fleets.
00:16:09.340 | But we're the only company where it's one driver.
00:16:13.700 | And that's not the way the world is organized right now.
00:16:18.500 | FedEx has a lot of drivers.
00:16:20.180 | All of you who have Teslas have a lot of drivers.
00:16:22.980 | Everyone has a lot of drivers.
00:16:24.300 | And all of those drivers have different capabilities.
00:16:26.980 | This is one driver learning all the time in unison.
00:16:33.340 | And so what can we demonstrate that's possible?
00:16:35.900 | And it really goes back to your other question
00:16:37.780 | of what are policymakers five years from now really
00:16:40.700 | going to want to know?
00:16:41.660 | And how will they be able to derive
00:16:43.540 | from what we're doing today?
00:16:44.980 | Where were you when you heard the news of Cruise
00:16:47.980 | dragging that woman in San Francisco 20 feet?
00:16:51.180 | And then where were you?
00:16:52.940 | And what was your reaction internally
00:16:55.260 | to when you found out Cruz covered it up?
00:16:59.700 | I don't remember where I was when I learned.
00:17:03.060 | But we're here to improve road safety.
00:17:09.660 | So it was pretty upsetting, I think, to me and the whole org.
00:17:14.700 | I think as we learned about the lack of transparency,
00:17:19.380 | our view has just been, be transparent.
00:17:22.100 | It's not perfection.
00:17:24.340 | It's technology.
00:17:25.460 | Yeah, but you're considered their contemporary.
00:17:27.580 | And you have a different approach.
00:17:28.980 | You would never cover it up.
00:17:30.740 | They obviously had something dysfunctional.
00:17:33.060 | Did that set the industry back significantly?
00:17:35.060 | How are regulators talking to you before and after that,
00:17:38.540 | differently, if at all?
00:17:40.580 | It definitely led to more scrutiny in San Francisco,
00:17:44.700 | because there was already an active,
00:17:47.540 | and continues to be, a fairly active voice against technology
00:17:52.140 | in San Francisco.
00:17:53.500 | And so we needed to continue to engage,
00:17:56.340 | which we were already doing, provide more data, which
00:17:58.900 | we're, of course, willing to do.
00:18:00.540 | And I think over time, what we'll see
00:18:02.460 | is we have to get to a place where the amount of data
00:18:05.740 | that we're providing actually advances the safety concerns.
00:18:09.860 | I think at that point, it was just providing
00:18:11.860 | a lot more engagement.
00:18:13.700 | If the mission is to reduce those deaths
00:18:15.820 | and to really scale this technology,
00:18:17.740 | Uber's doing a quarter billion rides a week.
00:18:20.620 | You just raised $5 billion.
00:18:21.740 | It's a big number.
00:18:23.020 | That gives you another 50,000 cars,
00:18:25.860 | which is a fraction of the rides in but one city.
00:18:29.420 | It seems like you need a lot of partners in the car space
00:18:32.340 | to build this.
00:18:33.500 | So wouldn't the best strategy be to license this technology
00:18:37.180 | to every carmaker and then allow the Ubers, the Lyfts,
00:18:41.820 | the DoorDashes, and fleet managers
00:18:44.260 | to handle cleaning the puke in the back of the cars
00:18:46.460 | and putting the air in the tires?
00:18:48.100 | Because I don't think Waymo wants
00:18:49.580 | to be in the back of that car cleaning up the puke
00:18:51.780 | and doing the air tire, right?
00:18:53.820 | There has to be somebody who manages all that.
00:18:55.740 | So maybe talk a little bit about that possibility of--
00:18:59.340 | and even a more beautiful thing would
00:19:00.940 | be for you to open source it.
00:19:02.260 | So has there been a discussion internally
00:19:03.580 | about open sourcing this technology?
00:19:05.300 | So I'll start at the beginning.
00:19:06.620 | I think what you just described is exactly the AV ecosystem
00:19:09.620 | that I'm talking about.
00:19:10.700 | And yes, out of necessity.
00:19:13.020 | And because we are--
00:19:14.020 | I mean, whoever was first was going to need to create this.
00:19:16.860 | It doesn't exist.
00:19:18.300 | So we're doing that.
00:19:20.220 | I think the second part of what you're talking about,
00:19:22.420 | let everyone do what they're good at.
00:19:24.300 | We're good at building this driver.
00:19:26.060 | There's, though, a period where we
00:19:27.700 | have to become really good at what does
00:19:29.620 | it mean to operate at scale.
00:19:32.100 | And that's where we are right now.
00:19:34.380 | What does it mean to operate at scale?
00:19:37.220 | Technology can work and not be delightful, not be reusable,
00:19:40.300 | not be repeatable.
00:19:41.220 | And we don't just want people having tourism rides.
00:19:44.340 | We want people actually integrating this
00:19:46.100 | into their lives.
00:19:46.860 | And we just did a UX study.
00:19:48.820 | And we found that a third of people
00:19:50.500 | use the service in San Francisco to go to their doctor
00:19:53.380 | appointments.
00:19:53.900 | We're like, that feels like real life.
00:19:56.060 | And 36% were using the vehicles to go to local businesses,
00:20:01.260 | which is also real life.
00:20:02.900 | And so these are the kinds of signals
00:20:05.020 | that we're trying to get.
00:20:06.420 | As far as personally owned cars, we've
00:20:08.580 | always said as part of our strategy from the beginning
00:20:11.380 | that we would start with ride hailing and local delivery
00:20:14.060 | and that eventually we would make our technology available
00:20:16.980 | to automotive companies so that at some point,
00:20:21.900 | you'll find Waymo cars sitting on showroom floors for sale.
00:20:24.860 | Oh, wow.
00:20:25.360 | So we could go buy one and then put it
00:20:27.460 | into whatever fleet we want.
00:20:29.380 | What does it cost to make one of these cars today?
00:20:31.540 | And where do you see it in five years?
00:20:32.740 | And just, I guess, on that question,
00:20:34.260 | where's the tipping point in unit economics?
00:20:36.100 | So if you look on the ride hailing business, based on--
00:20:39.740 | I know a lot of people drive during rush hour.
00:20:41.980 | And then they don't need rides the rest of the time.
00:20:43.580 | So there's a lot of excess inventory sitting around.
00:20:46.260 | So if you look at the utilization of a car
00:20:48.380 | over its lifetime and the cost of the car over the lifetime,
00:20:51.020 | what's the breaking point where the unit economics make sense?
00:20:53.660 | Yeah.
00:20:54.740 | So your question of cost of car, we just
00:20:57.220 | don't break it out since we're not
00:20:59.500 | reported separately for Alphabet.
00:21:01.480 | We're just bundled up.
00:21:02.400 | The rumor is $150,000 right now.
00:21:03.940 | It's close.
00:21:07.140 | $125,000?
00:21:07.640 | Not commenting on rumors.
00:21:09.420 | Not commenting on rumors.
00:21:11.580 | Wait, if that's just the system for self-driving.
00:21:14.580 | The car plus the system.
00:21:16.020 | The car plus.
00:21:16.820 | But what if--
00:21:17.420 | $125,000 to $150,000 is the rumored cost for the car today.
00:21:21.420 | It's actually like--
00:21:22.260 | You're going to comment on the rumor?
00:21:23.780 | I love this.
00:21:24.660 | I mean, that's obviously a lot more than a Tesla, right?
00:21:28.940 | Yeah, Tesla's $40,000, $50,000, yeah.
00:21:31.820 | So do we know ballparks here?
00:21:33.740 | What can we say?
00:21:34.340 | So keep going.
00:21:38.500 | It's so fun.
00:21:39.860 | We're poker players, so we're going to read you and bait you.
00:21:42.540 | And it's not working.
00:21:44.220 | We didn't know you were a Jedi.
00:21:45.740 | So sorry, let me just ask.
00:21:47.220 | Is it about reinventing the system
00:21:49.460 | or getting the next gen of the system
00:21:51.000 | before the unit economics break?
00:21:52.340 | Or is it about volume?
00:21:53.420 | What's the breaking point of where the unit economics become
00:21:55.920 | profitable, become positive?
00:21:58.740 | Yeah, not going to get into that other than to say to you,
00:22:03.100 | it is the thing I spend all of my time focused on.
00:22:06.180 | And we are on--
00:22:07.020 | But there's a path there.
00:22:08.100 | There's completely a path there.
00:22:09.740 | Well, without getting into costs,
00:22:11.340 | so is one of the plans here to sell this to all the car
00:22:15.940 | makers so that they can compete with Tesla?
00:22:18.420 | Because most of the car companies
00:22:20.580 | do not have the technical capabilities
00:22:22.580 | to develop their own self-driving.
00:22:24.260 | It's just impossible.
00:22:25.700 | So are you guys going to be like an OEM to all the car makers
00:22:28.940 | so that they can provide self-driving?
00:22:30.980 | Yeah, we've said sort of publicly,
00:22:33.780 | and we've had partnerships where we've
00:22:35.400 | made the technology available.
00:22:37.820 | It's a question of when will a level four sort of product
00:22:43.180 | be available and interesting to consumers at scale?
00:22:47.060 | That's part of the question.
00:22:48.220 | But absolutely, this is part of our strategy.
00:22:50.100 | A friend of mine--
00:22:50.860 | Can I get that?
00:22:51.540 | Yeah, of course.
00:22:52.180 | I'm sorry.
00:22:52.660 | A friend of mine who couldn't make it here
00:22:55.620 | is an angel investor in Uber.
00:22:57.380 | And he-- no.
00:23:03.300 | If you look at the business outside of the United States--
00:23:06.140 | actually, if you look at Uber's business,
00:23:08.380 | there's a lot of opportunity in other markets.
00:23:10.820 | And sometimes, in other markets, they
00:23:12.620 | will prove probably more able and open-minded
00:23:16.940 | to taking the public health priority around these things.
00:23:20.260 | So why such a heavy reliance on America,
00:23:25.380 | considering the scale and the imprimatur of Google,
00:23:27.920 | could probably pull you into any number of markets
00:23:30.340 | anywhere around the world?
00:23:31.420 | Yeah.
00:23:32.260 | So we really think about it as this sort of learning journey.
00:23:36.380 | But it's also just where we started, right?
00:23:38.620 | I mean, part of the conversation we get to have now
00:23:41.780 | is what would--
00:23:42.700 | if the driver was at this level of maturity,
00:23:45.460 | where would we have started?
00:23:47.860 | That said, we have global aspirations.
00:23:49.860 | We're always thinking about the--
00:23:51.500 | I mean, doesn't it just take one geography to prove this point?
00:23:55.140 | Or do you view it the problem that way?
00:23:56.940 | Like, it's like, we're training, we're training, we're training.
00:23:59.660 | And then at some point, there'll be some city or county--
00:24:03.180 | Which point?
00:24:03.940 | I just want to make sure I understand.
00:24:05.480 | --this idea where you can prove that it's just
00:24:07.780 | so reliable and valuable.
00:24:09.900 | And that you're saving just--
00:24:11.900 | I really care about that idea.
00:24:13.540 | Yeah.
00:24:15.140 | To who?
00:24:15.780 | Who would it prove the point to?
00:24:18.260 | People that would make the rules, I guess.
00:24:19.980 | Yeah.
00:24:21.020 | I think-- I hope.
00:24:23.820 | I don't know.
00:24:24.980 | I think there's a lot of noise in the system.
00:24:28.220 | There's a lot of confusion about full autonomy
00:24:31.140 | versus partial autonomy.
00:24:34.140 | There's not a lot of legislative activity
00:24:37.340 | that's informed by experts.
00:24:40.340 | So I don't know, right?
00:24:41.600 | Like, certainly, it's my hope.
00:24:43.820 | And I think one of the things, when
00:24:45.580 | you think about certain countries around the world,
00:24:47.740 | there is more of a nation-state, top-down approach
00:24:50.420 | to some regulations.
00:24:52.940 | And so that could be easier.
00:24:54.740 | Singapore?
00:24:55.460 | Yeah.
00:24:55.980 | You prove it, it's codified into law, you launch.
00:24:58.660 | That hasn't been the primary way we've
00:25:04.060 | thought about market entry or opportunity,
00:25:05.820 | but it's certainly one of the areas that we focus on.
00:25:08.020 | But as long as some markets adopt,
00:25:09.460 | the rest of the markets will see and they'll fall behind.
00:25:11.860 | I mean, you just have to penetrate some markets.
00:25:14.260 | Yeah.
00:25:15.020 | Google had amazing success with Android and open source.
00:25:18.100 | So back to that open source question,
00:25:20.020 | you must have had this discussion many times internally.
00:25:23.060 | Why not open source the technology
00:25:24.860 | and then provide the premium Android
00:25:27.500 | to people who want to pay?
00:25:28.700 | And is that something on the roadmap?
00:25:30.460 | Because I hear there's a big discussion internally about that.
00:25:33.080 | To OEMs, to car OEMs?
00:25:34.000 | Just open sourcing the maps, the software,
00:25:36.580 | to make everybody contribute to it,
00:25:39.100 | to just get this out quicker.
00:25:40.340 | So where are you at with those internal discussions
00:25:42.700 | at Waymo?
00:25:43.460 | Give away the whole business?
00:25:44.660 | Yeah, exactly.
00:25:45.340 | No, well, I mean, Android, it's worked, right?
00:25:47.340 | It's like $20 billion in.
00:25:48.380 | It's like, let's just give it away.
00:25:49.780 | But Friedberg, if you look at Android,
00:25:51.500 | it's only gotten stronger, right?
00:25:52.900 | It's completely different.
00:25:53.980 | I think that--
00:25:54.780 | That was defensive on search, but--
00:25:56.340 | It's OK, bad note, your answer.
00:25:57.380 | No, I like it when you guys answer each other's questions,
00:26:00.380 | because I get to learn what you actually think.
00:26:02.340 | We want to see your answer.
00:26:03.460 | We fight a lot, it's fine.
00:26:04.580 | No, it's fun.
00:26:05.540 | It's actually fun for me.
00:26:07.420 | So feel free to keep--
00:26:09.780 | So on the question of open sourcing,
00:26:13.900 | that isn't something that we're spending a lot of time
00:26:16.140 | talking about right now.
00:26:17.740 | Let's talk about level six, which I believe
00:26:20.020 | is inclement weather, and really--
00:26:22.020 | Level five.
00:26:22.700 | Oh, level five, rather, sorry.
00:26:25.060 | And I know when I'm in my Tesla, it gives me a little warning.
00:26:27.620 | Hey, inclement weather, and the FSD might be degraded,
00:26:30.980 | extra attention time.
00:26:33.460 | How close are you to feeling comfortable with these things
00:26:37.140 | driving in snow, ice, rain?
00:26:40.180 | Or when that happens, do fleets just pull over and game over?
00:26:44.380 | You can't use this technology.
00:26:46.420 | Yeah, we've been improving weather.
00:26:49.220 | We've tested across 25 cities specifically for that.
00:26:52.300 | So we had massive storms in San Francisco last year,
00:26:56.580 | and we didn't take down the fleet.
00:26:58.460 | Our fleet was able to handle all of the rain.
00:27:00.260 | So rain and flooding, pretty good.
00:27:02.420 | Rain and flooding, pretty good.
00:27:03.740 | Because of LiDAR and radar, maybe.
00:27:07.860 | You said it.
00:27:08.860 | OK, fog, snow, ice.
00:27:11.420 | So we've tested in snow and ice.
00:27:16.260 | We don't have any deployments there,
00:27:17.860 | but that's an area that we're learning.
00:27:20.100 | Fog, we've been able to master in San Francisco.
00:27:22.900 | So that 280 craziness.
00:27:24.100 | No problem.
00:27:24.940 | Same thing with those sandstorms in Phoenix.
00:27:27.540 | That was an early learning.
00:27:29.060 | Those are actually very challenging.
00:27:31.780 | And then there are swarms of birds in Phoenix.
00:27:33.700 | Like, really.
00:27:34.780 | And heat, right?
00:27:36.020 | You have a machine--
00:27:36.900 | Phoenix sounds brutal.
00:27:38.100 | It's like Revelations.
00:27:41.180 | There's a lot.
00:27:41.980 | There's a lot.
00:27:43.100 | And we're the only company that can pick you up autonomously
00:27:46.260 | at the airport, right?
00:27:47.940 | And so like, curbside pickup is the whole thing.
00:27:52.020 | What has been the hardest challenge for you as CEO?
00:27:55.620 | I think--
00:27:59.140 | I mean, we've heard all these rumors
00:28:00.740 | about cultural issues at Google or whatnot.
00:28:02.660 | But I get the sense that you're much more
00:28:05.500 | separated in terms of building a much more
00:28:09.060 | technical and specific culture.
00:28:10.500 | But what have been the hardest challenges for you?
00:28:12.980 | I think one of the hardest challenges was making--
00:28:16.300 | we made such a big decision in going fully autonomous,
00:28:20.900 | which is sort of the reason we're here, during COVID.
00:28:24.980 | You mean getting that person out?
00:28:26.500 | The person from behind the wheel, scaling this,
00:28:29.020 | and so then bringing the culture back together to now scale.
00:28:32.740 | So we just needed-- like, we needed a minute to lock back in.
00:28:35.820 | That's one of the things.
00:28:36.740 | And the other one, I would say, is what you just said,
00:28:38.980 | which is it's a brilliant team.
00:28:43.180 | And the team works really hard.
00:28:45.740 | But like, bad things are going to happen in the real world
00:28:48.860 | with human drivers around our cars.
00:28:51.100 | And the team takes that so seriously.
00:28:53.700 | And so that's-- it's like having worked at--
00:28:56.860 | I mean, I worked at AOL, and you were there, too.
00:28:58.940 | But like, other kinds of tech companies--
00:29:01.580 | I know every kind of company is stressful.
00:29:03.380 | It's a different kind of stress when people's lives
00:29:05.260 | are at stake.
00:29:05.780 | They internalize it.
00:29:06.300 | Completely.
00:29:07.060 | And they personalize it.
00:29:08.100 | It's their work.
00:29:08.940 | And so that's-- it's just a different level
00:29:11.060 | of stress and accountability and a sense of responsibility.
00:29:13.620 | But do you have to manage people differently because of that
00:29:16.140 | pressure, or no?
00:29:16.780 | We do when things happen in industry.
00:29:20.060 | We need people to sort of stay buckled in and believing
00:29:23.900 | that this is something that is worth it.
00:29:26.260 | It's audacious.
00:29:27.260 | It's hard.
00:29:28.420 | But it's worth it.
00:29:29.420 | And it's bigger.
00:29:30.500 | Most people come because they have lost someone.
00:29:32.940 | Like, they're very mission-oriented.
00:29:34.460 | I'm going to be very honest with you.
00:29:35.180 | It's an incredible framing you gave.
00:29:36.980 | And I totally missed it for so many years,
00:29:39.180 | which is when you said, we're building one driver.
00:29:41.380 | I thought for whatever reason you
00:29:42.780 | were building an autonomous driving system, which
00:29:44.860 | I interpreted very technologically
00:29:46.420 | for a very long time.
00:29:47.500 | But you're absolutely right.
00:29:48.620 | You have one goal.
00:29:49.420 | It's very clarifying, actually.
00:29:50.540 | The world's greatest driver.
00:29:51.620 | One-- the world's greatest driver.
00:29:53.080 | The world's most trusted driver.
00:29:54.500 | The world's most trusted driver.
00:29:56.100 | Tekedra, thank you for--
00:29:57.180 | That's an incredible thing.
00:29:58.100 | Yeah, I mean, it was incredible.
00:29:59.380 | Thank you so much.
00:30:01.700 | Thank you.
00:30:02.300 | That was awesome.
00:30:03.500 | Thank you so much.