
Sebastian Thrun: Flying Cars, Autonomous Vehicles, and Education | Lex Fridman Podcast #59


Chapters

0:00 Intro
3:24 Simulation
4:29 Computing
6:12 Machine Learning
7:39 Expert Systems
13:55 Software and Clocks
15:29 Advice for Future Work
17:23 Leadership
20:35 Intentions
22:36 Stephen Schwarzman
26:53 Lessons Learned
28:51 What is Academia
31:54 What makes you upset
33:10 Challenges of autonomous vehicles
36:12 Leadership style
38:42 Autonomous vehicle approaches
42:10 Self-driving car nanodegree
43:46 Impact on society
45:17 Deep learning in autonomous vehicles
47:28 What is machine learning
51:08 Machine learning in the medical field
54:06 Near-term impacts of AI
57:52 Nanodegree

Whisper Transcript

00:00:00.000 | The following is a conversation with Sebastian Thrun.
00:00:03.480 | He's one of the greatest roboticists,
00:00:05.280 | computer scientists, and educators of our time.
00:00:08.120 | He led the development of the autonomous vehicles
00:00:10.600 | at Stanford that won the 2005 DARPA Grand Challenge
00:00:14.280 | and placed second in the 2007 DARPA Urban Challenge.
00:00:18.160 | He then led the Google Self-Driving Car Program
00:00:20.960 | which launched the self-driving car revolution.
00:00:24.640 | He taught the popular Stanford course
00:00:26.480 | on artificial intelligence in 2011,
00:00:29.080 | which was one of the first massive open online courses
00:00:32.440 | or MOOCs as they're commonly called.
00:00:35.040 | That experience led him to co-found Udacity,
00:00:37.560 | an online education platform.
00:00:39.840 | If you haven't taken courses on it yet,
00:00:41.600 | I highly recommend it.
00:00:43.320 | Their self-driving car program, for example, is excellent.
00:00:47.120 | He's also the CEO of Kitty Hawk,
00:00:49.880 | a company working on building flying cars
00:00:53.000 | or more technically eVTOLs,
00:00:54.960 | which stands for electric vertical takeoff
00:00:56.880 | and landing aircraft.
00:00:58.640 | He has launched several revolutions
00:01:00.640 | and inspired millions of people,
00:01:02.720 | but also, as many know, he's just a really nice guy.
00:01:06.840 | It was an honor and a pleasure to talk with him.
00:01:10.560 | This is the Artificial Intelligence Podcast.
00:01:12.800 | If you enjoy it, subscribe on YouTube,
00:01:15.160 | give it five stars on Apple Podcasts,
00:01:17.120 | follow it on Spotify, support it on Patreon,
00:01:19.800 | or simply connect with me on Twitter
00:01:21.880 | at Lex Fridman, spelled F-R-I-D-M-A-N.
00:01:25.840 | If you leave a review on Apple Podcasts
00:01:27.840 | or YouTube or Twitter, consider mentioning ideas,
00:01:30.600 | people, topics you find interesting.
00:01:32.880 | It helps guide the future of this podcast.
00:01:35.840 | But in general, I just love comments
00:01:38.200 | with kindness and thoughtfulness in them.
00:01:40.160 | This podcast is a side project for me, as many people know,
00:01:43.640 | but I still put a lot of effort into it.
00:01:45.880 | So the positive words of support
00:01:47.720 | from an amazing community, from you, really help.
00:01:52.220 | I recently started doing ads at the end of the introduction.
00:01:55.240 | I'll do one or two minutes after introducing the episode
00:01:58.120 | and never any ads in the middle
00:01:59.680 | that can break the flow of the conversation.
00:02:01.840 | I hope that works for you
00:02:03.120 | and doesn't hurt the listening experience.
00:02:05.420 | I provide timestamps for the start of the conversation
00:02:08.000 | that you can skip to, but it helps if you listen to the ad
00:02:11.480 | and support this podcast by trying out the product
00:02:13.880 | or service being advertised.
00:02:15.480 | This show is presented by Cash App,
00:02:18.800 | the number one finance app in the App Store.
00:02:21.460 | I personally use Cash App to send money to friends,
00:02:24.080 | but you can also use it to buy, sell,
00:02:25.920 | and deposit Bitcoin in just seconds.
00:02:28.240 | Cash App also has a new investing feature.
00:02:31.080 | You can buy fractions of a stock, say $1 worth,
00:02:34.480 | no matter what the stock price is.
00:02:36.600 | Brokerage services are provided by Cash App Investing,
00:02:39.520 | a subsidiary of Square and member SIPC.
00:02:42.960 | I'm excited to be working with Cash App
00:02:44.720 | to support one of my favorite organizations called FIRST,
00:02:47.920 | best known for their FIRST robotics and Lego competitions.
00:02:51.340 | They educate and inspire hundreds of thousands of students
00:02:54.680 | in over 110 countries
00:02:56.520 | and have a perfect rating on Charity Navigator,
00:02:59.040 | which means the donated money
00:03:00.720 | is used to maximum effectiveness.
00:03:03.120 | When you get Cash App from the App Store or Google Play
00:03:06.080 | and use code LEXPODCAST, you'll get $10
00:03:09.360 | and Cash App will also donate $10 to FIRST,
00:03:12.120 | which again is an organization
00:03:13.960 | that I've personally seen inspire girls and boys
00:03:16.680 | to dream of engineering a better world.
00:03:19.760 | And now, here's my conversation with Sebastian Thrun.
00:03:24.000 | You've mentioned that "The Matrix"
00:03:26.760 | may be your favorite movie.
00:03:29.020 | So let's start with a crazy philosophical question.
00:03:32.220 | Do you think we're living in a simulation?
00:03:34.860 | And in general, do you find the thought experiment
00:03:37.980 | interesting?
00:03:40.060 | - Define simulation, I would say.
00:03:42.280 | Maybe we are, maybe we are not,
00:03:43.780 | but it's completely irrelevant to the way we should act.
00:03:47.220 | Putting aside for a moment,
00:03:49.900 | the fact that it might not have any impact
00:03:52.400 | on how we should act as human beings,
00:03:54.940 | for people studying theoretical physics,
00:03:57.340 | these kinds of questions might be kind of interesting,
00:03:59.620 | looking at the universe as an information processing system.
00:04:03.820 | - Universe is an information processing system.
00:04:05.980 | It's a huge physical, biological, chemical computer.
00:04:08.500 | There's no question.
00:04:09.600 | But I live here and now.
00:04:12.940 | I care about people, I care about us.
00:04:15.660 | What do you think it's trying to compute?
00:04:17.700 | - I don't think there's an intention.
00:04:18.860 | I think it's just the world evolves the way it evolves,
00:04:22.060 | and it's beautiful, it's unpredictable,
00:04:25.380 | and I'm really, really grateful to be alive.
00:04:28.060 | - Spoken like a true human.
00:04:29.520 | - Which last time I checked I was.
00:04:32.140 | - Well, in fact, this whole conversation
00:04:35.300 | is just a Turing test to see if indeed you are.
00:04:39.360 | You've also said that one of the first programs
00:04:42.780 | or the first few programs you've written
00:04:45.100 | was, wait for it, on a TI-57 calculator.
00:04:49.160 | - Yeah.
00:04:50.040 | - Maybe that's early '80s.
00:04:52.060 | I don't wanna date calculators or anything.
00:04:54.380 | - That's early '80s, correct.
00:04:55.580 | - Yeah.
00:04:56.460 | So if you were to place yourself back into that time,
00:04:59.820 | into the mindset you were in,
00:05:02.140 | could you have predicted the evolution of computing,
00:05:05.620 | AI, the internet, technology, in the decades that followed?
00:05:10.620 | - I was super fascinated by Silicon Valley,
00:05:13.180 | which I'd seen on television once and thought,
00:05:15.260 | my God, this is so cool.
00:05:16.420 | They build like DRAMs there and CPUs.
00:05:19.620 | How cool is that?
00:05:20.460 | And as a college student a few years later,
00:05:23.980 | I decided to really study intelligence
00:05:25.860 | and study human beings and found that,
00:05:28.100 | even back then in the '80s and '90s,
00:05:30.540 | that artificial intelligence
00:05:31.580 | is what fascinated me the most.
00:05:33.340 | What's missing is that back in the day,
00:05:35.820 | the computers were really small.
00:05:37.700 | Like, the brains you could build
00:05:39.100 | were not anywhere bigger than a cockroach,
00:05:41.580 | and cockroaches aren't very smart.
00:05:43.740 | So we weren't at the scale yet where we are today.
00:05:46.300 | - Did you dream at that time
00:05:48.700 | to achieve the kind of scale we have today?
00:05:51.060 | Or did that seem possible?
00:05:52.660 | - I always wanted to make robots smart.
00:05:54.340 | I felt it was super cool to build an artificial human.
00:05:57.940 | And the best way to build an artificial human
00:05:59.660 | would have been a robot,
00:06:00.660 | because that's kind of the closest we could do.
00:06:03.060 | Unfortunately, we aren't there yet.
00:06:04.900 | The robots today are still very brittle.
00:06:07.180 | But it's fascinating to study intelligence
00:06:09.420 | from a constructive perspective
00:06:10.700 | where you build something.
00:06:12.020 | - To understand you build,
00:06:15.140 | what do you think it takes to build an intelligent system
00:06:19.500 | and an intelligent robot?
00:06:20.900 | - I think the biggest innovation
00:06:22.140 | that we've seen is machine learning.
00:06:23.780 | And it's the idea that the computers
00:06:26.140 | can basically teach themselves.
00:06:27.700 | Let's give an example.
00:06:29.740 | I'd say everybody pretty much knows how to walk.
00:06:33.100 | And we learn how to walk in the first year or two
00:06:35.340 | of our lives.
00:06:36.860 | But no scientist has ever been able
00:06:38.820 | to write down the rules of human gait.
00:06:41.180 | We don't understand it.
00:06:42.180 | We have it in our brains somehow.
00:06:44.060 | We can practice it.
00:06:45.220 | We understand it.
00:06:46.620 | But we can't articulate it.
00:06:47.820 | We can't pass it on by language.
00:06:50.300 | And that to me is kind of the deficiency
00:06:52.140 | of today's computer programming.
00:06:53.380 | When you program a computer,
00:06:55.300 | they're so insanely dumb
00:06:56.780 | that you have to give them rules for every contingency.
00:06:59.900 | Very unlike the way people learn,
00:07:01.780 | who learn from data and experience,
00:07:03.500 | computers are being instructed.
00:07:05.500 | And because it's so hard to get this instruction set right,
00:07:07.900 | you pay software engineers $200,000 a year.
00:07:11.500 | Now, the most recent innovation,
00:07:13.140 | which has been in the making for like 30, 40 years,
00:07:15.860 | is an idea that computers can find their own rules.
00:07:18.500 | So they can learn from falling down and getting up
00:07:20.660 | the same way children can learn
00:07:21.980 | from falling down and getting up.
00:07:23.820 | And that revolution has led to a capability
00:07:26.220 | that's completely unmatched.
00:07:28.700 | Today's computers can watch experts do their jobs,
00:07:31.740 | whether you're a doctor or a lawyer,
00:07:33.860 | pick up the regularities, learn those rules,
00:07:36.900 | and then become as good as the best experts.
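As a loose illustration of the contrast being drawn here, hand-written rules versus rules learned from examples of expert behavior, here is a minimal sketch in Python. The feature names, the data, and the choice of scikit-learn's DecisionTreeClassifier are all hypothetical stand-ins for "the machine finds its own rules," not anything from Thrun's actual systems.

```python
# A minimal sketch of a computer "finding its own rules" from labeled examples,
# instead of a programmer writing if/else rules for every contingency.
# The features, data, and labels below are hypothetical, purely for illustration.
from sklearn.tree import DecisionTreeClassifier

# Each row: [speed_mph, distance_to_obstacle_m, road_is_wet]  (hypothetical features)
observations = [
    [30, 50.0, 0],
    [30,  5.0, 0],
    [60, 20.0, 1],
    [10, 40.0, 1],
    [70,  8.0, 0],
]
# What an expert driver actually did in each situation: 0 = keep going, 1 = brake.
expert_actions = [0, 1, 1, 0, 1]

# Instead of hand-coding braking rules, let the model infer them from the examples.
model = DecisionTreeClassifier(max_depth=3).fit(observations, expert_actions)

# The learned rules then generalize to situations nobody explicitly programmed.
print(model.predict([[55, 12.0, 1]]))
```

The point is not this particular model; it is that the rules live in the data rather than in the programmer's head.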
00:07:39.380 | - So the dream of in the '80s of expert systems,
00:07:41.700 | for example, had at its core the idea
00:07:44.820 | that humans could boil down their expertise
00:07:47.820 | on a sheet of paper.
00:07:49.340 | So to sort of reduce, sort of be able to explain to machines
00:07:53.220 | how to do something explicitly.
00:07:55.460 | So do you think, what's the use of human expertise
00:07:59.100 | into this whole picture?
00:08:00.060 | Do you think most of the intelligence
00:08:01.740 | will come from machines learning from experience
00:08:04.260 | without human expertise input?
00:08:06.460 | - So the question for me is much more
00:08:08.380 | how do you express expertise?
00:08:10.700 | You can express expertise by writing a book.
00:08:12.940 | You can express expertise by showing someone
00:08:15.140 | what you're doing.
00:08:16.260 | You can express expertise by applying it
00:08:18.380 | by many different ways.
00:08:20.020 | And I think expert systems were our best attempt in AI
00:08:23.700 | to capture expertise in rules,
00:08:25.980 | where someone sat down and said,
00:08:27.100 | "Here are the rules of human gait.
00:08:28.540 | "Here's when you put your big toe forward
00:08:31.260 | "and your heel backwards and here how you stop stumbling."
00:08:34.700 | And as we now know, the set of rules,
00:08:37.100 | the set of language that we can command
00:08:39.420 | is incredibly limited.
00:08:41.180 | The majority of the human brain doesn't deal with language.
00:08:43.780 | It deals with like subconscious numerical perceptual things
00:08:48.180 | that we aren't even self-aware of.
00:08:50.420 | Now, when an AI system watches an expert do their job
00:08:55.740 | and practice their job,
00:08:57.900 | it can pick up things that people can't even put into
00:09:01.140 | writing into books or rules.
00:09:03.020 | And that's where the real power is.
00:09:04.500 | We now have AI systems that, for example,
00:09:07.140 | look over the shoulders of highly paid human doctors
00:09:10.540 | like dermatologists or radiologists,
00:09:12.820 | and they can somehow pick up those skills
00:09:15.340 | that no one can express in words.
00:09:16.980 | - So you were a key person in launching three revolutions,
00:09:22.140 | online education, autonomous vehicles,
00:09:25.140 | and flying cars or VTOLs.
00:09:28.180 | So high level,
00:09:30.220 | and I apologize for all the philosophical questions.
00:09:35.020 | - No apology necessary.
00:09:36.180 | - How do you choose what problems to try and solve?
00:09:40.540 | What drives you to make those solutions a reality?
00:09:43.340 | - I have two desires in life.
00:09:44.820 | I wanna literally make the lives of others better.
00:09:48.500 | Or as we often say,
00:09:50.460 | maybe jokingly, make the world a better place.
00:09:52.900 | I actually believe in this.
00:09:54.940 | It's as funny as it sounds.
00:09:56.340 | And second, I wanna learn.
00:09:59.100 | I wanna get in the skillset.
00:10:00.420 | I don't wanna be in a job I'm good at,
00:10:01.820 | because if I'm in a job that I'm good at,
00:10:04.260 | the chances for me to learn something interesting
00:10:05.820 | is actually minimized.
00:10:06.660 | So I wanna be in a job I'm bad at.
00:10:08.340 | That's really important to me.
00:10:10.180 | So I build, for example,
00:10:11.540 | what people often call flying cars;
00:10:12.980 | these are electric vertical
00:10:14.860 | takeoff and landing vehicles.
00:10:16.500 | I'm just no expert in any of this.
00:10:19.660 | And it's so much fun to learn on the job
00:10:22.220 | what it actually means to build something like this.
00:10:24.860 | Now I'd say the stuff that I've done lately
00:10:27.500 | after I finished my professorship at Stanford,
00:10:31.060 | they really focused on like what has the maximum impact
00:10:33.740 | on society.
00:10:34.820 | Like transportation is something that has transformed
00:10:37.500 | the 21st or 20th century more than any other invention,
00:10:40.100 | in my opinion, even more than communication.
00:10:42.500 | And cities are different, work is different,
00:10:45.020 | women's rights are different because of transportation.
00:10:47.860 | And yet we still have a very suboptimal
00:10:50.740 | transportation solution where we kill 1.2
00:10:54.660 | or so million people every year in traffic.
00:10:57.620 | It's like the leading cause of death
00:10:58.860 | for young people in many countries,
00:11:01.100 | where we are extremely inefficient resource wise,
00:11:03.580 | just go to your average neighborhood city
00:11:06.780 | and look at the number of parked cars,
00:11:08.300 | that's a travesty in my opinion,
00:11:10.380 | or where we spend endless hours in traffic jams.
00:11:13.820 | And very, very simple innovations like a self-driving car
00:11:16.820 | or what people call a flying car
00:11:18.820 | could completely change this.
00:11:20.220 | And it's there, I mean, the technology is basically there.
00:11:23.260 | You have to close your eyes not to see it.
00:11:25.380 | - So lingering on autonomous vehicles,
00:11:29.380 | a fascinating space, some incredible work
00:11:32.060 | you've done throughout your career there.
00:11:33.580 | So let's start with DARPA, I think,
00:11:38.340 | the DARPA challenge through the desert
00:11:40.340 | and then urban to the streets.
00:11:42.820 | I think that inspired an entire generation of roboticists
00:11:45.700 | and obviously sprung this whole excitement
00:11:49.460 | about this particular kind of four wheeled robots
00:11:52.620 | we called autonomous cars, self-driving cars.
00:11:55.500 | So you led the development of Stanley,
00:11:58.180 | the autonomous car that won the race to the desert,
00:12:01.220 | the DARPA challenge in 2005.
00:12:03.940 | And Junior, the car that finished second
00:12:07.300 | in the DARPA urban challenge,
00:12:09.140 | also did incredibly well in 2007, I think.
00:12:13.340 | What are some painful, inspiring
00:12:16.540 | or enlightening experiences from that time
00:12:19.420 | that stand out to you?
00:12:20.580 | - Oh my God, painful were all these incredibly complicated,
00:12:25.580 | stupid bugs that had to be found.
00:12:30.420 | We had a phase where Stanley,
00:12:33.020 | our car that eventually won the DARPA Grand Challenge
00:12:36.180 | would every 30 miles just commit suicide
00:12:39.300 | and we didn't know why.
00:12:40.860 | And it ended up being that in the syncing
00:12:43.500 | of two computer clocks, occasionally a clock went backwards,
00:12:47.700 | and that negative elapsed time
00:12:50.340 | screwed up the entire internal logic.
00:12:51.820 | But it took ages to find this.
00:12:54.380 | It was bugs like that.
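For readers wondering what a bug like that looks like in code, here is a small hypothetical sketch, not Stanley's actual software: if elapsed time is computed from a wall clock that clock synchronization can step backwards, the delta can come out negative and corrupt everything downstream; a monotonic clock is the usual fix.

```python
# Hypothetical sketch of the failure mode described above (not the actual Stanley code).
# A wall clock (time.time) can be stepped backwards by clock synchronization, making
# elapsed-time deltas negative and poisoning velocity estimates, timeouts, integrators.
# A monotonic clock never goes backwards.
import time

class SpeedEstimator:
    def __init__(self):
        self.last_t = time.monotonic()   # monotonic: immune to clock-sync steps

    def update(self, distance_m: float) -> float:
        now = time.monotonic()
        dt = now - self.last_t           # guaranteed non-negative
        self.last_t = now
        if dt <= 0.0:                    # defensive guard anyway
            return 0.0
        return distance_m / dt           # speed in m/s

# Using time.time() instead of time.monotonic() here is exactly the kind of choice
# that works for 30 miles and then, after a clock-sync step, quietly does not.
```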
00:12:56.300 | I'd say enlightening is the Stanford team
00:12:59.300 | immediately focused on machine learning and on software,
00:13:02.460 | whereas everybody else seemed to focus
00:13:03.740 | on building better hardware.
00:13:05.180 | Our analysis had been a human being
00:13:07.620 | with an existing rental car can perfectly drive the course.
00:13:10.220 | Why do I have to build a better rental car?
00:13:12.100 | I just should replace the human being.
00:13:14.780 | And the human being to me was a conjunction of three steps.
00:13:18.780 | We had sensors, eyes and ears, mostly eyes.
00:13:22.300 | We had brains in the middle
00:13:23.740 | and then we had actuators, our hands and our feet.
00:13:26.260 | Now the actuators are easy to build.
00:13:28.140 | The sensors are actually also easy to build.
00:13:29.660 | What was missing was the brain.
00:13:30.660 | So we had to build a human brain
00:13:32.540 | and nothing was clearer to me than
00:13:35.180 | that the human brain is a learning machine.
00:13:36.940 | So why not just train our robot?
00:13:38.180 | So we would build massive machine learning into our machine.
00:13:42.260 | And with that, we were able to not just learn
00:13:44.780 | from human drivers, where the entire speed control
00:13:47.340 | of the vehicle was copied from human driving,
00:13:49.780 | but also have the robot learn from experience
00:13:51.620 | where it made a mistake, recover from it,
00:13:53.620 | and learn from it.
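The "speed control copied from human driving" idea is essentially behavioral cloning: fit a model that maps what the car senses to what the human driver did in that situation. The sketch below is a hypothetical illustration of that pattern; the features, the data, and the choice of a ridge regressor are assumptions for illustration, not the Stanford team's actual pipeline.

```python
# Minimal sketch of learning a speed controller from human demonstrations
# (behavioral cloning). Features, data, and model choice are hypothetical.
import numpy as np
from sklearn.linear_model import Ridge

# Logged while a human drives: [terrain_roughness, path_curvature, slope]  (hypothetical)
sensor_log = np.array([
    [0.1, 0.02,  0.00],
    [0.6, 0.10,  0.05],
    [0.3, 0.01, -0.02],
    [0.8, 0.20,  0.08],
])
# The speed (m/s) the human driver actually chose in each of those situations.
human_speed = np.array([12.0, 4.5, 10.0, 3.0])

# Fit a regressor: "drive at the speed a human would have chosen here."
controller = Ridge(alpha=1.0).fit(sensor_log, human_speed)

# At run time the robot queries the learned controller instead of a hand-tuned rule.
print(controller.predict(np.array([[0.5, 0.05, 0.03]])))
```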
00:13:55.580 | - You mentioned the pain point of software and clocks.
00:14:00.580 | Synchronization seems to be a problem
00:14:04.580 | that continues with robotics.
00:14:06.020 | It's a tricky one with drones and so on.
00:14:08.020 | What does it take to build a thing,
00:14:13.380 | a system with so many constraints?
00:14:16.620 | You have a deadline, no time.
00:14:20.300 | You're unsure about anything really.
00:14:22.100 | It's the first time that people really even explore.
00:14:24.980 | It's not even sure that anybody can finish
00:14:26.780 | when we're talking about the race
00:14:28.340 | through the desert, the year before nobody finished.
00:14:30.620 | What does it take to scramble and finish a product
00:14:33.420 | that actually, a system that actually works?
00:14:35.780 | - I mean, we were very lucky.
00:14:36.780 | We were a really small team.
00:14:38.300 | The core of the team were four people.
00:14:40.460 | It was four because five couldn't comfortably sit
00:14:43.100 | inside a car, but four could.
00:14:45.300 | And I, as a team leader, my job was to get pizza
00:14:47.700 | for everybody and wash the car and stuff like this
00:14:51.180 | and repair the radiator when it broke
00:14:52.860 | and debug the system.
00:14:55.260 | And we were very kind of open-minded.
00:14:56.900 | We had like no egos involved in this.
00:14:58.420 | We just wanted to see how far we can get.
00:15:00.820 | What we did really, really well was time management.
00:15:03.300 | We were done with everything a month before the race.
00:15:06.220 | And we froze the entire software a month before the race.
00:15:08.740 | And it turned out, looking at other teams,
00:15:11.420 | every other team complained if they had just one more week,
00:15:14.100 | they would have won.
00:15:15.420 | And we decided, we're not gonna fall into that mistake.
00:15:18.740 | We're gonna be early.
00:15:19.860 | And we had an entire month to shake the system.
00:15:22.700 | And we actually found two or three minor bugs
00:15:24.940 | in the last month that we had to fix.
00:15:27.060 | And we were completely prepared when the race occurred.
00:15:30.020 | - Okay, so first of all,
00:15:30.860 | that's such an incredibly rare achievement
00:15:34.620 | in terms of being able to be done on time or ahead of time.
00:15:38.980 | What do you, how do you do that in your future work?
00:15:43.060 | What advice do you have in general?
00:15:44.780 | Because it seems to be so rare,
00:15:46.340 | especially in highly innovative projects like this.
00:15:49.300 | People work till the last second.
00:15:50.860 | - Well, the nice thing about the DARPA Grand Challenge
00:15:52.540 | is that the problem was incredibly well-defined.
00:15:55.300 | We were able for a while
00:15:56.540 | to drive the old DARPA Grand Challenge course,
00:15:58.780 | which had been used the year before.
00:16:00.780 | And then for some reason, we were kicked out of the region.
00:16:04.020 | So we had to go to a different desert, the Sonoran Desert,
00:16:06.300 | and we were able to drive desert trails
00:16:08.860 | just at the same time.
00:16:10.580 | So there was never any debate about,
00:16:12.180 | like, what is actually the problem?
00:16:13.220 | We didn't sit down and say,
00:16:14.340 | "Hey, should we build a car or a plane?"
00:16:16.660 | We had to build a car.
00:16:18.260 | That made it very, very easy.
00:16:20.380 | Then I studied my own life and life of others
00:16:23.780 | and realized that the typical mistake that people make
00:16:26.340 | is that there's this kind of crazy bug left
00:16:29.580 | that they haven't found yet.
00:16:31.220 | And it's just, they regret it.
00:16:34.340 | And the bug would have been trivial to fix.
00:16:36.140 | They just hadn't fixed it yet.
00:16:37.740 | We didn't want to fall into that trap.
00:16:39.580 | So I built a testing team.
00:16:41.060 | We had a testing team that built a testing booklet
00:16:43.740 | of 160 pages of tests we had to go through
00:16:46.740 | just to make sure we shake out the system appropriately.
00:16:49.100 | - Wow.
00:16:49.940 | - And the testing team was with us all the time
00:16:51.820 | and dictated to us,
00:16:53.020 | today we do railroad crossings,
00:16:55.540 | tomorrow we do, we practice the start of the event.
00:16:58.460 | And in all of these, we thought,
00:17:00.660 | "Oh my God, this has long been solved, trivial."
00:17:02.260 | And then we tested it out.
00:17:03.180 | "Oh my God, it doesn't do a railroad crossing, why not?
00:17:05.100 | "Oh my God, it mistakes the rails for metal barriers.
00:17:09.340 | "Shit, we have to fix this."
00:17:10.620 | - Yes.
00:17:11.620 | - So it was really a continuous focus
00:17:14.460 | on improving the weakest part of the system.
00:17:16.380 | And as long as you focus
00:17:18.540 | on improving the weakest part of the system,
00:17:20.540 | you eventually build a really great system.
00:17:23.100 | - Let me just pause in that.
00:17:24.780 | To me as an engineer, it's just super exciting
00:17:27.100 | that you were thinking like that,
00:17:28.300 | especially at that stage as brilliant
00:17:30.460 | that testing was such a core part of it.
00:17:33.380 | It may be to linger on the point of leadership.
00:17:35.740 | I think it's one of the first times you were really a leader
00:17:41.700 | and you've led many very successful teams since then.
00:17:45.420 | What does it take to be a good leader?
00:17:48.460 | - I would say most of all, just take credit
00:17:50.820 | for the work of others.
00:17:54.220 | That's very convenient, turns out,
00:17:57.580 | 'cause I can't do all these things myself.
00:18:00.220 | I'm an engineer at heart, so I care about engineering.
00:18:03.740 | So I don't know what the chicken and the egg is,
00:18:06.140 | but as a kid, I loved computers
00:18:07.860 | because you could tell them to do something
00:18:09.540 | and they actually did it.
00:18:10.700 | It was very cool.
00:18:11.540 | And you could like in the middle of the night,
00:18:12.740 | wake up at one in the morning and switch on your computer.
00:18:15.180 | And what you told it to do yesterday, it would still do.
00:18:18.140 | That was really cool.
00:18:19.380 | Unfortunately, that didn't quite work with people.
00:18:21.300 | So you go to people and tell them what to do
00:18:22.820 | and they don't do it and they hate you for it.
00:18:25.780 | Or you do it today and then they go a day later
00:18:28.060 | and they stop doing it.
00:18:29.140 | So you have to...
00:18:30.220 | So then the question really became,
00:18:31.460 | how can you put yourself in the brain of people
00:18:34.100 | as opposed to computers?
00:18:35.100 | And in terms of computers, that's super dumb.
00:18:37.340 | That's so dumb.
00:18:38.220 | If people were as dumb as computers,
00:18:39.620 | I wouldn't wanna work with them.
00:18:41.260 | But people are smart and people are emotional
00:18:43.580 | and people have pride and people have aspirations.
00:18:45.900 | So how can I connect to that?
00:18:48.860 | And that's the thing where most of our leadership just fails
00:18:52.540 | because many, many engineers turn manager
00:18:56.220 | believe they can treat their team just the same
00:18:58.340 | way they treat their computer
00:18:59.300 | and it just doesn't work this way.
00:19:00.420 | It's just really bad.
00:19:02.300 | So how can I connect to people?
00:19:05.100 | And it turns out as a college professor,
00:19:07.620 | the wonderful thing you do all the time
00:19:09.940 | is to empower other people.
00:19:10.980 | Like your job is to make your students look great.
00:19:14.700 | That's all you do.
00:19:15.540 | You're the best coach.
00:19:16.900 | And it turns out if you do a fantastic job
00:19:18.740 | with making your students look great,
00:19:20.820 | they actually love you and their parents love you.
00:19:22.700 | And they give you all the credit
00:19:23.780 | for stuff you don't deserve.
00:19:24.700 | Turns out all my students were smarter than me.
00:19:27.180 | All the great stuff invented at Stanford
00:19:28.740 | was their stuff, not my stuff.
00:19:30.020 | And they give me credit and say,
00:19:31.140 | oh, Sebastian, when really I was just making them
00:19:33.620 | feel good about themselves.
00:19:35.100 | So the question really is, can you take a team of people
00:19:38.020 | and what does it take to make them,
00:19:40.380 | to connect to what they actually want in life
00:19:43.340 | and turn this into productive action?
00:19:45.740 | It turns out every human being that I know
00:19:48.500 | has incredibly good intentions.
00:19:50.100 | I've really rarely met a person with bad intentions.
00:19:53.220 | I believe every person wants to contribute.
00:19:55.900 | I think every person I've met wants to help others.
00:19:59.420 | It's amazing how much of an urge we have
00:20:01.860 | not to just help ourselves, but to help others.
00:20:04.420 | So how can we empower people
00:20:05.940 | and give them the right framework
00:20:07.700 | that they can accomplish this?
00:20:09.340 | In moments when it works, it's magical
00:20:12.420 | because you'd see the confluence of people
00:20:17.180 | being able to make the world a better place
00:20:19.180 | and just having enormous confidence and pride out of this.
00:20:22.900 | And that's when my environment works the best.
00:20:27.220 | These are moments where I can disappear for a month
00:20:29.460 | and come back and things still work.
00:20:31.620 | It's very hard to accomplish,
00:20:32.780 | but when it works, it's amazing.
00:20:35.100 | - So I agree with you very much.
00:20:37.260 | It's not often heard that most people in the world
00:20:42.020 | have good intentions.
00:20:43.500 | At the core, their intentions are good
00:20:45.900 | and they're good people.
00:20:47.380 | That's a beautiful message.
00:20:48.860 | It's not often heard.
00:20:50.180 | - We make this mistake,
00:20:51.180 | and this is a friend of mine, Alex Voda,
00:20:53.460 | talking about this, that we judge ourselves by our intentions
00:20:57.700 | and others by their actions.
00:20:59.140 | And I think that the biggest skill,
00:21:01.860 | I mean, here in Silicon Valley, we're full of engineers
00:21:03.540 | who have very little empathy
00:21:05.140 | and are kind of befuddled by why it doesn't work for them.
00:21:09.180 | The biggest skill I think that people should acquire
00:21:13.060 | is to put themselves into the position of the other
00:21:16.860 | and listen, and listen to what the other has to say.
00:21:19.980 | And they'd be shocked how similar they are to themselves.
00:21:23.380 | And they might even be shocked
00:21:24.580 | how their own actions don't reflect their intentions.
00:21:28.540 | I often have conversations with engineers
00:21:30.900 | where I say, "Look, hey, I love you.
00:21:32.540 | "You're doing a great job.
00:21:33.380 | "And by the way, what you just did has the following effect.
00:21:37.220 | "Are you aware of that?"
00:21:38.820 | And then people would say, "Oh my God, no, I wasn't,
00:21:41.180 | "because my intention was that."
00:21:43.100 | And I say, "Yeah, I trust your intention.
00:21:44.900 | "You're a good human being.
00:21:46.260 | "But just to help you in the future,
00:21:48.340 | "if you keep expressing it that way,
00:21:51.220 | "then people will just hate you."
00:21:53.380 | And I've had many instances where people say,
00:21:55.140 | "Oh my God, thank you for telling me this
00:21:56.460 | "because it wasn't my intention to look like an idiot.
00:21:59.180 | "It wasn't my intention to help other people.
00:22:00.580 | "I just didn't know how to do it."
00:22:02.460 | - Very simple, by the way.
00:22:03.980 | There's a book, Dale Carnegie, 1936,
00:22:07.420 | "How to Win Friends and Influence People."
00:22:10.460 | Has the entire Bible.
00:22:11.500 | You just read it and you're done
00:22:12.780 | and you apply it every day.
00:22:14.020 | And I wish I was good enough to apply it every day.
00:22:16.820 | But it's just simple things, right?
00:22:18.940 | Like be positive, remember people's names, smile,
00:22:22.620 | and eventually have empathy.
00:22:24.540 | Really think that the person that you hate
00:22:27.460 | and you think is an idiot is actually just like yourself.
00:22:30.500 | It's a person who's struggling, who means well,
00:22:33.260 | and who might need help.
00:22:34.220 | And guess what?
00:22:35.060 | You need help.
00:22:36.620 | I've recently spoken with Stephen Schwarzman.
00:22:39.980 | I'm not sure if you know who that is.
00:22:41.660 | - I do.
00:22:42.500 | It's on my list.
00:22:45.220 | (laughing)
00:22:46.040 | - On the list.
00:22:46.880 | (laughing)
00:22:47.720 | But he said, sort of to expand on what you're saying,
00:22:52.720 | that one of the biggest things you can do
00:22:56.020 | is hear people when they tell you what their problem is
00:23:00.060 | and then help them with that problem.
00:23:02.380 | He says it's surprising how few people
00:23:06.020 | actually listen to what troubles others.
00:23:08.700 | - Yeah.
00:23:09.540 | - And because it's right there in front of you
00:23:12.620 | and you can benefit the world the most.
00:23:15.220 | And in fact, yourself and everybody around you
00:23:18.020 | by just hearing the problems and solving them.
00:23:20.860 | - I mean, that's my little history of engineering.
00:23:23.980 | That is, while I was engineering with computers,
00:23:28.260 | I didn't care at all what the computer's problems were.
00:23:32.380 | I just told them what to do and they do it.
00:23:34.820 | And it just doesn't work this way with people.
00:23:37.580 | - It doesn't work with me.
00:23:38.500 | If you come to me and say, "Do A," I do the opposite.
00:23:41.260 | (laughing)
00:23:43.620 | - But let's return to the comfortable world of engineering.
00:23:47.140 | And can you tell me in broad strokes in how you see it?
00:23:52.140 | Because you're at the core of starting it,
00:23:53.860 | the core of driving it,
00:23:55.100 | the technical evolution of autonomous vehicles
00:23:58.020 | from the first DARPA Grand Challenge
00:24:00.420 | to the incredible success we see with the program
00:24:03.660 | you started with Google Self-Driving Car
00:24:05.420 | and Waymo and the entire industry that sprung up
00:24:08.340 | of different kinds of approaches, debates, and so on.
00:24:11.220 | - Well, the idea of self-driving cars goes back to the '80s.
00:24:14.180 | There was a team in Germany, another team at Carnegie Mellon
00:24:16.140 | that did some very pioneering work.
00:24:18.700 | But back in the day, I'd say the computers were so deficient
00:24:21.780 | that even the best professors and engineers in the world
00:24:25.900 | basically stood no chance.
00:24:27.300 | It then folded into a phase where the US government
00:24:31.220 | spent at least half a billion dollars
00:24:33.340 | that I could count on research projects.
00:24:36.180 | But the way the procurement works,
00:24:37.980 | a successful stack of paper describing lots of stuff
00:24:42.820 | that no one's ever gonna read
00:24:43.900 | was a successful product of a research project.
00:24:47.660 | So we trained our researchers to produce lots of paper.
00:24:50.540 | That all changed with the DARPA Grand Challenge.
00:24:54.340 | And I really gotta credit the ingenious people at DARPA
00:24:58.500 | and the US government and Congress
00:25:00.420 | that took a complete new funding model
00:25:02.220 | where they said, "Let's not fund effort,
00:25:04.100 | "let's fund outcomes."
00:25:05.580 | And it sounds very trivial, but there was no tax code
00:25:08.820 | that allowed the use of congressional tax money for a prize.
00:25:13.700 | It was all effort-based.
00:25:15.100 | So if you put in 100 hours in, you could charge 100 hours.
00:25:17.460 | If you put 1,000 hours in, you could charge 1,000 hours.
00:25:20.660 | By changing the focus and saying, "We're making this a prize.
00:25:22.780 | "We don't pay you for development,
00:25:23.940 | "we pay you for the accomplishment."
00:25:26.340 | They drew in, they automatically drew out
00:25:28.940 | all these contractors who are used to the drug
00:25:31.700 | of getting money per hour.
00:25:33.380 | And they drew in a whole bunch of new people.
00:25:35.700 | And these people are mostly crazy people.
00:25:37.620 | They were people who had a car and a computer
00:25:40.740 | and they wanted to make a million bucks.
00:25:42.420 | The million bucks was the original prize money,
00:25:43.900 | it was then doubled.
00:25:45.460 | And they felt, "If I put my computer in my car
00:25:47.980 | "and program it, I can be rich."
00:25:50.860 | And that was so awesome.
00:25:52.060 | Like half the teams, there was a team that was surfer dudes
00:25:55.500 | and they had like two surfboards on their vehicle
00:25:58.540 | and brought like these fashion girls,
00:26:00.460 | super cute girls, like twin sisters.
00:26:02.820 | And you could tell these guys were not your common
00:26:06.700 | Beltway bandits who get all these big multi-million
00:26:10.860 | and billion dollar contracts from the US government.
00:26:13.540 | And there was a great reset.
00:26:15.260 | Universities moved in.
00:26:18.580 | I was very fortunate at Stanford that I'd just received tenure
00:26:21.820 | so I couldn't get fired no matter what I do,
00:26:23.380 | otherwise I wouldn't have done it.
00:26:25.140 | And I had enough money to finance this thing.
00:26:28.260 | And I was able to attract a lot of money from third parties.
00:26:31.180 | And even car companies moved in.
00:26:32.540 | They kind of moved in very quietly
00:26:34.060 | because they were super scared to be embarrassed
00:26:36.580 | that their car would flip over.
00:26:38.580 | But Ford was there and Volkswagen was there
00:26:40.700 | and a few others and GM was there.
00:26:43.380 | So it kind of reset the entire landscape of people.
00:26:46.340 | And if you look at who's a big name
00:26:48.220 | in self-driving cars today,
00:26:49.500 | these were mostly people who participated
00:26:51.340 | in those challenges.
00:26:52.300 | - Okay, that's incredible.
00:26:54.260 | Can you just comment quickly on your sense of lessons learned
00:26:59.060 | from that kind of funding model
00:27:01.220 | and the research that's going on in academia
00:27:04.380 | in terms of producing papers?
00:27:06.100 | Is there something to be learned and scaled up bigger?
00:27:09.860 | These having these kinds of grand challenges
00:27:11.700 | that could improve outcomes?
00:27:14.540 | - So I'm a big believer in focusing
00:27:16.300 | on kind of an end-to-end system.
00:27:19.620 | I'm a really big believer in systems building.
00:27:21.900 | I've always built systems in my academic career,
00:27:23.660 | even though I do a lot of math and abstract stuff,
00:27:27.020 | but it's all derived from the idea
00:27:28.100 | of let's solve a real problem.
00:27:29.620 | And it's very hard for me to be an academic
00:27:33.780 | and say, let me solve a component of a problem.
00:27:35.740 | Like for some, there are fields like non-monotonic logic
00:27:38.620 | or AI planning systems where people believe
00:27:41.740 | that a certain style of problem solving
00:27:44.260 | is the ultimate end objective.
00:27:47.220 | And I would always turn it around and say,
00:27:49.500 | hey, what problem would my grandmother care about
00:27:52.580 | that doesn't understand computer technology
00:27:54.620 | and doesn't want to understand?
00:27:56.460 | And how could I make her love what I do?
00:27:58.460 | Because only then do I have an impact on the world.
00:28:01.300 | I can easily impress my colleagues.
00:28:02.900 | That is much easier,
00:28:04.700 | but impressing my grandmother is very, very hard.
00:28:07.580 | So I always thought if I can build a self-driving car
00:28:10.700 | and my grandmother can use it
00:28:12.820 | even after she loses her driving privileges
00:28:14.660 | or children can use it,
00:28:16.100 | or we save maybe a million lives a year,
00:28:20.500 | that would be very impressive.
00:28:22.420 | And there's so many problems like these.
00:28:23.900 | Like there's the problem of curing cancer,
00:28:25.300 | or whatever it is, living twice as long.
00:28:27.780 | Once a problem is defined,
00:28:29.580 | of course I can't solve it in its entirety.
00:28:31.420 | Like it takes sometimes tens of thousands of people
00:28:34.180 | to find a solution.
00:28:35.340 | There's no way you can fund an army of 10,000 at Stanford.
00:28:39.340 | So you're gonna build a prototype.
00:28:41.060 | Let's build a meaningful prototype.
00:28:42.500 | And the DARPA Grand Challenge was beautiful
00:28:43.900 | because it told me what this prototype had to do.
00:28:46.340 | I didn't have to think about what it had to do.
00:28:47.660 | I just had to read the rules.
00:28:48.820 | And it was really, really beautiful.
00:28:51.060 | - And the most beautiful thing, you think,
00:28:52.820 | that academia could aspire to is to build a prototype
00:28:56.220 | at the systems level that solves,
00:29:00.260 | or gives you an inkling that this problem
00:29:02.140 | could be solved with this prototype.
00:29:03.540 | - First of all, I wanna emphasize what academia really is.
00:29:06.540 | And I think people misunderstand it.
00:29:08.580 | First and foremost,
00:29:10.340 | academia is a way to educate young people.
00:29:13.340 | First and foremost, a professor is an educator.
00:29:15.420 | No matter where you are at,
00:29:17.060 | a small suburban college,
00:29:18.580 | or whether you are a Harvard or Stanford professor.
00:29:21.980 | That's not the way most people think of themselves
00:29:25.020 | in academia because we have this kind of competition
00:29:28.020 | going on for citations and publication.
00:29:31.460 | That's a measurable thing,
00:29:32.860 | but that is secondary to the primary purpose
00:29:35.500 | of educating people to think.
00:29:37.820 | Now, in terms of research,
00:29:40.020 | most of the great science,
00:29:42.900 | the great research comes out of universities.
00:29:45.540 | You can trace almost everything back,
00:29:46.980 | including Google to universities.
00:29:48.860 | So there's nothing really fundamentally broken here.
00:29:52.180 | It's a good system.
00:29:53.420 | And I think America has the finest university system
00:29:55.980 | on the planet.
00:29:56.820 | We can talk about reach and how to reach people
00:30:00.620 | outside the system.
00:30:01.500 | It's a different topic,
00:30:02.340 | but the system itself is a good system.
00:30:04.820 | If I had one wish, I would say,
00:30:07.020 | it'd be really great if there was more debate
00:30:10.860 | about what the great big problems are in society.
00:30:15.940 | And focus on those.
00:30:18.780 | And most of them are interdisciplinary.
00:30:21.620 | Unfortunately, it's very easy to fall
00:30:24.660 | into an intra-disciplinary viewpoint
00:30:28.180 | where your problem is dictated
00:30:30.460 | by what your closest colleagues believe the problem is.
00:30:33.700 | It's very hard to break out and say,
00:30:35.300 | well, there's an entire new field of problems.
00:30:37.940 | So to give an example,
00:30:39.860 | prior to me working on self-driving cars,
00:30:41.660 | I was a roboticist and a machine learning expert.
00:30:44.660 | And I wrote books on robotics,
00:30:46.860 | something called probabilistic robotics.
00:30:48.540 | It's a very methods driven kind of viewpoint of the world.
00:30:51.540 | I built robots that acted in museums as tour guides,
00:30:54.060 | that like led children around.
00:30:55.660 | It is something that at the time was moderately challenging.
00:31:00.060 | When I started working on cars,
00:31:02.260 | several colleagues told me,
00:31:03.700 | "Sebastian, you're destroying your career
00:31:06.060 | "because in our field of robotics,
00:31:08.140 | "cars are looked like as a gimmick
00:31:10.380 | "and they're not expressive enough.
00:31:11.740 | "They can only push the throttle
00:31:14.140 | "and the brakes, there's no dexterity,
00:31:16.420 | "there's no complexity, it's just too simple."
00:31:19.540 | And no one came to me and said,
00:31:21.180 | "Wow, if you solve that problem,
00:31:22.700 | "you can save a million lives."
00:31:25.020 | Among all robotic problems that I've seen in my life,
00:31:27.260 | I would say the self-driving car, transportation,
00:31:29.780 | is the one that has the most hope for society.
00:31:32.100 | So how come the robotics community
00:31:33.540 | wasn't all over the place?
00:31:35.140 | And it was because we focused on methods and solutions
00:31:37.940 | and not on problems.
00:31:39.900 | Like if you go around today and ask your grandmother,
00:31:42.460 | what bugs you, what really makes you upset?
00:31:45.260 | I challenge any academic to do this
00:31:48.780 | and then realize how far your research
00:31:51.820 | is probably away from that today.
00:31:53.860 | At the very least, that's a good thing
00:31:56.820 | for academics to deliberate on.
00:31:59.300 | The other thing that's really nice in Silicon Valley
00:32:00.900 | is Silicon Valley is full of smart people outside academia.
00:32:04.380 | So there's the Larry Pages and Mark Zuckerbergs
00:32:06.340 | in the world who are anywhere as smart or smarter
00:32:09.020 | than the best academics I've met in my life.
00:32:11.460 | And what they do is they are at a different level.
00:32:15.420 | They build the systems,
00:32:16.740 | they build the customer-facing systems,
00:32:19.340 | they build things that people can use
00:32:21.980 | without technical education.
00:32:23.820 | And they are inspired by research,
00:32:25.860 | they're inspired by scientists.
00:32:27.540 | They hire the best PhDs from the best universities
00:32:30.340 | for a reason.
00:32:31.180 | So I think there's kind of vertical integration
00:32:35.140 | between the real product, the real impact,
00:32:37.780 | and the real thought, the real ideas.
00:32:39.860 | That's actually working surprisingly well
00:32:41.300 | in Silicon Valley.
00:32:42.780 | It did not work as well in other places in this nation.
00:32:44.900 | So when I worked at Carnegie Mellon,
00:32:46.700 | we had the world's finest computer science university,
00:32:49.860 | but there wasn't those people in Pittsburgh
00:32:52.780 | that would be able to take these very fine
00:32:55.140 | computer science ideas and turn them into
00:32:57.140 | massively impactful products.
00:33:00.620 | That symbiosis seemed to exist pretty much only
00:33:03.900 | in Silicon Valley and maybe a bit in Boston and Austin.
00:33:06.620 | - Yeah, with Stanford.
00:33:07.980 | That's really interesting.
00:33:11.100 | So if we look a little bit further on
00:33:14.060 | from the DARPA Grand Challenge
00:33:17.180 | and the launch of the Google self-driving car,
00:33:20.100 | what do you see as the state,
00:33:22.060 | the challenges of autonomous vehicles as they are now?
00:33:25.900 | Is actually achieving that huge scale
00:33:29.220 | and having a huge impact on society?
00:33:31.700 | - I'm extremely proud of what has been accomplished.
00:33:35.300 | And again, I'm taking a lot of credit for the work that others do.
00:33:37.620 | (laughing)
00:33:38.460 | And I'm actually very optimistic
00:33:40.220 | and people have been kind of worrying,
00:33:42.380 | is it too fast?
00:33:43.220 | Is it too slow?
00:33:44.060 | Why is it not there yet?
00:33:44.900 | And so on.
00:33:45.900 | It is actually quite an interesting, hard problem,
00:33:48.860 | in that a self-driving car,
00:33:51.660 | to build one that manages 90% of the problems
00:33:55.340 | encountered in everyday driving, is easy.
00:33:57.260 | We can literally do this over a weekend.
00:33:59.500 | To do 99% might take a month.
00:34:02.100 | Then there's 1% left.
00:34:03.260 | So 1% would mean that you still
00:34:05.660 | have a fatal accident every week.
00:34:08.140 | Very unacceptable.
00:34:09.020 | So now you work on this 1%
00:34:10.940 | and 99% of that
00:34:12.900 | remaining 1% is actually still relatively easy.
00:34:15.780 | But now you're down to like a hundredth of 1%
00:34:18.220 | and it's still completely unacceptable in terms of safety.
00:34:21.580 | So the variety of things you encounter are just enormous.
00:34:24.260 | And that gives me enormous respect for human beings
00:34:26.500 | that we're able to deal with the couch on the highway,
00:34:30.220 | right?
00:34:31.060 | Or the deer in the headlight
00:34:31.900 | or the blown tire that we've never been trained for.
00:34:34.940 | And all of a sudden I have to handle it
00:34:35.980 | in an emergency situation
00:34:37.140 | and often do very, very successfully.
00:34:38.780 | It's amazing from that perspective
00:34:40.700 | how safe driving actually is
00:34:42.220 | given how many millions of miles we drive
00:34:44.860 | every year in this country.
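To make the point about the long tail concrete, here is a back-of-the-envelope calculation. Every number in it is a hypothetical assumption chosen only to show the shape of the argument, not a measured statistic from any self-driving program.

```python
# Back-of-the-envelope illustration of why "handles 99% of situations" is not enough.
# All numbers below are hypothetical assumptions, for illustration only.
fleet_miles_per_week = 1_000_000        # assumed fleet mileage per week
critical_events_per_mile = 1 / 10_000   # assume one genuinely tricky situation per 10k miles
mishandled_fraction = 0.01              # the "remaining 1%" the system still gets wrong

failures_per_week = fleet_miles_per_week * critical_events_per_mile * mishandled_fraction
print(failures_per_week)                # -> 1.0 serious failure per week under these assumptions
```

For comparison, human drivers in the US are on the order of one fatality per hundred million vehicle miles, which is why the tail has to be driven down by several more orders of magnitude before the comparison becomes favorable.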
00:34:46.220 | We are now at a point
00:34:48.500 | where I believe the technology is there.
00:34:50.060 | And I've seen it.
00:34:51.620 | I've seen it in Waymo.
00:34:52.540 | I've seen it in Aptiv.
00:34:53.580 | I've seen it in Cruise
00:34:55.460 | and in a number of companies and Voyage
00:34:58.300 | where vehicles are now driving around
00:35:00.980 | basically flawlessly and are able to drive people around
00:35:04.260 | in limited scenarios.
00:35:06.100 | In fact, you can go to Vegas today
00:35:08.020 | and summon a Lyft.
00:35:09.940 | And if you got the right setting of your app
00:35:13.540 | you'd be picked up by a driverless car.
00:35:15.820 | Now there's still safety drivers in there
00:35:18.100 | but that's a fantastic way to kind of learn
00:35:21.340 | what the limits of the technology are today.
00:35:22.980 | And there's still some glitches
00:35:24.740 | but the glitches have become very, very rare.
00:35:26.580 | I think the next step is gonna be to down-cost it,
00:35:29.700 | to harden it.
00:35:31.300 | The equipment, the sensors,
00:35:33.780 | are not quite at an automotive-grade standard yet.
00:35:36.220 | And then to really build the business models
00:35:37.820 | to really kind of go somewhere and make the business case.
00:35:40.980 | And the business case is hard work.
00:35:42.580 | It's not just, oh my God, we have this capability
00:35:44.580 | people are just gonna buy it.
00:35:45.540 | You have to make it affordable.
00:35:46.740 | You have to
00:35:50.220 | find the social acceptance of people.
00:35:52.300 | None of the teams yet has been able to
00:35:54.460 | or gutsy enough to drive around
00:35:56.180 | without a person inside the car.
00:35:59.260 | And that's the next magical hurdle.
00:36:01.300 | We'll be able to send these vehicles around
00:36:03.820 | completely empty in traffic.
00:36:05.780 | And I think, I mean, I wait every day,
00:36:08.140 | wait for the news that Waymo has just done this.
00:36:10.700 | - So, you know, it's interesting you mentioned gutsy.
00:36:14.700 | Let me ask some maybe unanswerable question,
00:36:19.700 | maybe edgy questions, but in terms of
00:36:24.460 | how much risk is required, some guts
00:36:28.700 | in terms of leadership style,
00:36:30.340 | it would be good to contrast approaches.
00:36:32.580 | And I don't think anyone knows what's right.
00:36:34.660 | But if we compare Tesla and Waymo, for example,
00:36:38.540 | Elon Musk and the Waymo team,
00:36:41.420 | there's slight differences in approach.
00:36:45.660 | So on the Elon side, there's more,
00:36:49.540 | I don't know what the right word to use,
00:36:50.860 | but aggression in terms of innovation.
00:36:53.900 | And on Waymo side, there's more sort of cautious,
00:36:58.900 | safety focused approach to the problem.
00:37:03.460 | What do you think it takes?
00:37:06.180 | What leadership at which moment is right?
00:37:09.140 | Which approach is right?
00:37:10.660 | - Look, I don't sit in either of those teams.
00:37:13.860 | So I'm unable to even verify
00:37:16.220 | whether what somebody says is correct.
00:37:17.980 | In the end of the day, every innovator in that space
00:37:21.660 | will face a fundamental dilemma.
00:37:23.140 | And I would say you could put aerospace titans
00:37:27.140 | into the same bucket, which is you have to balance
00:37:30.220 | public safety with your drive to innovate.
00:37:34.300 | And this country in particular, in the States,
00:37:36.780 | has a hundred plus year history
00:37:38.340 | of doing this very successfully.
00:37:40.620 | Air travel is, what, a hundred times as safe per mile
00:37:43.900 | as ground travel, as cars.
00:37:46.620 | And there's a reason for it,
00:37:48.500 | because people have found ways to be very methodological
00:37:52.900 | about ensuring public safety,
00:37:55.140 | while still being able to make progress
00:37:56.940 | on important aspects, for example,
00:37:59.060 | like airline noise and fuel consumption.
00:38:01.780 | So I think that those practices are proven
00:38:06.180 | and they actually work.
00:38:07.900 | We live in a world safer than ever before.
00:38:09.900 | And yes, there will always be the provision
00:38:11.900 | that something goes wrong.
00:38:12.740 | There's always the possibility that someone makes a mistake
00:38:15.260 | or there's an unexpected failure.
00:38:17.140 | We can never guarantee a hundred percent absolute safety,
00:38:20.980 | other than by just not doing it.
00:38:23.340 | But I think I'm very proud of the history of the United States
00:38:27.100 | I mean, we've dealt with much more dangerous technology
00:38:30.140 | like nuclear energy and kept that safe too.
00:38:32.740 | We have nuclear weapons and we keep those safe.
00:38:36.420 | So we have methods and procedures
00:38:39.460 | that really balance these two things very, very successfully.
00:38:42.980 | - You've mentioned a lot of great autonomous vehicle
00:38:44.900 | companies that are taking sort of the level four, level five
00:38:48.780 | they jump in full autonomy with a safety driver
00:38:51.900 | and take that kind of approach.
00:38:53.140 | And also through simulation and so on.
00:38:55.780 | There's also the approach that Tesla Autopilot is doing
00:38:59.580 | which is kind of incrementally taking a level two vehicle
00:39:03.740 | and using machine learning and learning
00:39:05.860 | from the driving of human beings and trying to creep up
00:39:10.580 | trying to incrementally improve the system
00:39:12.380 | until it's able to achieve level four autonomy.
00:39:15.580 | So perfect autonomy in certain kind of geographical regions.
00:39:19.820 | What are your thoughts on these contrasting approaches?
00:39:23.180 | - Well, so first of all, I'm a very proud Tesla owner
00:39:25.620 | and I literally use the Autopilot every day
00:39:27.900 | and it literally has kept me safe.
00:39:29.620 | It is a beautiful technology specifically
00:39:33.980 | for highway driving when I'm slightly tired
00:39:37.660 | because then it turns me into a much safer driver
00:39:42.260 | and that I'm a hundred percent confident that's the case.
00:39:46.100 | In terms of the right approach,
00:39:47.700 | I think the biggest change I've seen
00:39:49.900 | since I ran the Waymo team is this thing called deep learning.
00:39:54.500 | Deep learning was not a hot topic when I started Waymo
00:39:58.020 | or Google self-driving cars.
00:39:59.420 | It was there, in fact, we started Google Brain
00:40:01.740 | at the same time in Google X.
00:40:02.820 | So I invested in deep learning, but people didn't talk
00:40:05.980 | about it, it wasn't a hot topic.
00:40:07.860 | And now it is.
00:40:08.700 | There's a shift of emphasis
00:40:10.380 | from a more geometric perspective
00:40:12.460 | where you use geometric sensors.
00:40:14.340 | They give you a full 3D view
00:40:15.740 | and you do a geometric reasoning about,
00:40:17.300 | oh, this box over here might be a car,
00:40:19.660 | towards a more human-like, oh, let's just learn about it.
00:40:24.180 | This looks like the thing I've seen 10,000 times before.
00:40:26.540 | So maybe it's the same thing, machine learning perspective.
00:40:30.300 | And that has really put, I think,
00:40:32.180 | all these approaches on steroids.
00:40:34.740 | At Udacity, we teach a course in self-driving cars.
00:40:38.740 | In fact, I think we've graduated over 20,000 or so people
00:40:43.820 | on self-driving car skills.
00:40:45.020 | So every self-driving car team
00:40:47.060 | in the world now uses our engineers.
00:40:49.260 | And in this course, the very first homework assignment
00:40:51.900 | is to do lane finding on images.
00:40:54.900 | And lane finding images for laymen,
00:40:56.940 | what this means is you put a camera into your car
00:40:59.060 | or you open your eyes
00:40:59.900 | and you want to know where the lane is, right?
00:41:01.820 | So you can stay inside the lane with your car.
00:41:04.980 | Humans can do this super easily.
00:41:06.540 | You just look and you know where the lane is,
00:41:08.100 | just intuitively.
00:41:10.180 | For machines, for a long time, it was super hard
00:41:12.180 | because people would write these kind of crazy rules.
00:41:14.660 | If there's like a white lane marker
00:41:16.100 | and here's what white really means,
00:41:17.660 | this is not quite white enough.
00:41:19.100 | So let's, oh, it's not white.
00:41:20.340 | Or maybe the sun is shining.
00:41:21.420 | So when the sun shines and this is white
00:41:23.500 | and this is a straight line,
00:41:24.660 | or maybe it's not quite a straight line
00:41:25.700 | because the road is curved.
00:41:27.260 | And do we know that there's really six feet
00:41:29.260 | between lane markings or not, or 12 feet, whatever it is.
00:41:32.220 | And now, what the students are doing,
00:41:36.260 | they would take machine learning.
00:41:37.380 | So instead of like writing these crazy rules
00:41:39.620 | for the lane marker, you just say,
00:41:40.820 | hey, let's take an hour of driving
00:41:42.700 | and label it and tell the vehicle
00:41:44.420 | this is actually the lane by hand.
00:41:45.780 | And then these are examples
00:41:47.380 | and have the machine find its own rules
00:41:49.420 | what lane markings are.
00:41:51.420 | And within 24 hours, now every student
00:41:53.820 | that's never done any programming before in this space
00:41:56.060 | can write a perfect lane finder
00:41:58.340 | as good as the best commercial lane finders.
00:42:00.900 | And that's completely amazing to me.
00:42:02.780 | We've seen progress using machine learning
00:42:05.540 | that completely dwarfs anything
00:42:08.180 | that I saw 10 years ago.
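As a rough sketch of the learned lane-finding approach described above (not the actual Udacity assignment or any production system), the snippet below trains a tiny convolutional network to label lane-marking pixels from hand-labeled frames. The data here is a random placeholder standing in for "an hour of labeled driving," and all names are illustrative assumptions.

```python
# Minimal sketch: learn lane markings from hand-labeled frames instead of
# hand-written rules. Shapes and names (frames, lane_masks) are hypothetical.
import torch
import torch.nn as nn

class TinyLaneNet(nn.Module):
    """A small fully-convolutional net that predicts, per pixel,
    whether that pixel belongs to a lane marking."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),  # logits for "lane" vs "not lane"
        )

    def forward(self, x):
        return self.net(x)

# Stand-in for "an hour of driving, labeled by hand":
# 64 RGB frames (3x80x160) with binary lane masks.
frames = torch.rand(64, 3, 80, 160)
lane_masks = (torch.rand(64, 1, 80, 160) > 0.95).float()

model = TinyLaneNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):  # the machine "finds its own rules" from the examples
    optimizer.zero_grad()
    loss = loss_fn(model(frames), lane_masks)
    loss.backward()
    optimizer.step()

# At inference time, threshold the sigmoid output to get a lane mask.
lane_probability = torch.sigmoid(model(frames[:1]))
```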
00:42:10.940 | - Yeah, and just as a side note,
00:42:12.820 | the self-driving car nanodegree,
00:42:15.220 | the fact that you launched that many years ago now,
00:42:18.900 | maybe four years ago.
00:42:20.100 | - Three years ago.
00:42:20.940 | - Three years ago is incredible.
00:42:22.060 | That's a great example of system level thinking.
00:42:24.740 | So just taking an entire course that teaches you
00:42:27.580 | how to solve the entire problem,
00:42:29.260 | I definitely recommend people.
00:42:31.220 | - It's become super popular
00:42:32.380 | and it's become actually incredibly high quality.
00:42:34.220 | We've built it with Mercedes
00:42:35.260 | and various other companies in that space.
00:42:38.060 | And we find that engineers from Tesla and Waymo
00:42:40.620 | are taking it today.
00:42:41.980 | The insight was that two things,
00:42:45.500 | one is existing universities will be very slow to move
00:42:49.220 | because they're departmentalized
00:42:50.500 | and there's no department for self-driving cars.
00:42:52.380 | So between mechanical engineering and EE and computer science,
00:42:56.260 | getting those folks together into one room
00:42:57.700 | is really, really hard.
00:42:59.660 | And every professor listening here will know,
00:43:01.260 | they'll probably agree to that.
00:43:02.940 | And secondly, even if all the great universities
00:43:06.460 | did this, which none so far
00:43:08.340 | has developed a curriculum in this field,
00:43:11.060 | it is just a few thousand students that can partake
00:43:13.660 | because all the great universities are super selective.
00:43:16.180 | So how about people in India?
00:43:18.060 | How about people in China or in the Middle East
00:43:20.580 | or Indonesia or Africa?
00:43:23.380 | Why should those be excluded
00:43:25.100 | from the skill of building self-driving cars?
00:43:27.180 | Are they any dumber than we are?
00:43:28.380 | Are they any less privileged?
00:43:30.140 | And the answer is, we should just give everybody the skill
00:43:34.780 | to build a self-driving car.
00:43:35.820 | Because if we do this,
00:43:37.380 | then we have like a thousand self-driving car startups.
00:43:40.300 | And if 10% succeed, that's like a hundred,
00:43:42.900 | that means a hundred countries now
00:43:44.140 | will have self-driving cars and be safer.
00:43:46.740 | - It's kind of interesting to imagine,
00:43:48.940 | impossible to quantify, but, you know,
00:43:52.500 | over a period of several decades,
00:43:54.980 | the impact that a single course has,
00:43:57.900 | like a ripple effect on society.
00:43:59.780 | I just recently talked to Ann Druyan,
00:44:02.900 | who was a creator of Cosmos.
00:44:05.340 | So it's a show, it's interesting to think about
00:44:08.180 | how many scientists that show launched.
00:44:10.700 | And so it's really, in terms of impact,
00:44:15.580 | I can't imagine a better course
00:44:17.180 | than the self-driving car course.
00:44:18.620 | That's, you know, there's other more specific disciplines
00:44:21.820 | like deep learning and so on that Udacity is also teaching,
00:44:24.100 | but self-driving cars, it's really,
00:44:25.900 | really interesting course.
00:44:26.860 | - Yeah, and it came at the right moment.
00:44:28.420 | It came at a time when there were a bunch of acqui-hires.
00:44:31.700 | Acqui-hire is acquisition of a company,
00:44:34.180 | not for its technology or its products or business,
00:44:36.380 | but for its people.
00:44:38.300 | So acqui-hire means maybe there's a company of 70 people,
00:44:40.620 | they have no product yet, but they're super smart people,
00:44:43.180 | and the acquirer pays a certain amount of money.
00:44:44.340 | So I took acqui-hires like GM Cruise and Uber and others
00:44:48.420 | and did the math and said, hey, how many people are there
00:44:52.220 | and how much money was paid?
00:44:53.780 | And as a lower bound, I estimated the value
00:44:56.980 | of a self-driving car engineer in these acquisitions
00:45:00.420 | to be at least $10 million, right?
00:45:02.220 | So think about this, you get yourself a skill
00:45:05.060 | and you team up and build a company
00:45:06.700 | and your worth now is $10 million.
00:45:09.820 | I mean, that's kind of cool.
00:45:10.820 | I mean, what other thing could you do in life
00:45:13.420 | to be worth $10 million within a year?
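The back-of-the-envelope estimate described here is simply deal price divided by headcount. The figures below are hypothetical placeholders, not the real GM Cruise or Uber numbers; they only illustrate the shape of the lower-bound calculation.

```python
# Lower bound on per-engineer value in an acqui-hire, as described above.
# Deal figures are invented placeholders for illustration only.
hypothetical_deals = [
    {"name": "deal_a", "price_usd": 600e6, "engineers": 40},
    {"name": "deal_b", "price_usd": 700e6, "engineers": 70},
]

for deal in hypothetical_deals:
    per_engineer = deal["price_usd"] / deal["engineers"]
    print(f'{deal["name"]}: ~${per_engineer / 1e6:.0f}M per engineer (lower bound)')
```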
00:45:15.940 | - Yeah, amazing.
00:45:17.620 | But to come back for a moment onto deep learning
00:45:21.020 | and its application in autonomous vehicles,
00:45:23.760 | what are your thoughts on Elon Musk's statement,
00:45:28.500 | provocative statement perhaps, that LiDAR is a crutch?
00:45:31.100 | So this geometric way of thinking about the world
00:45:34.060 | may be holding us back if what we should instead be doing
00:45:38.980 | in this robotics, in this particular space
00:45:41.020 | of autonomous vehicles is using camera as a primary sensor
00:45:46.020 | and using computer vision and machine learning
00:45:48.260 | as the primary way to--
00:45:49.780 | - Look, I have two comments.
00:45:50.620 | I think first of all, we all know that people can drive cars
00:45:55.420 | without LiDAR in their heads because we only have eyes
00:45:59.020 | and we mostly just use eyes for driving.
00:46:02.140 | Maybe we use some other perception about our bodies,
00:46:04.620 | accelerations, occasionally our ears,
00:46:06.720 | certainly not our noses. (laughs)
00:46:10.740 | So the existence proof is there
00:46:12.500 | that eyes must be sufficient.
00:46:14.640 | In fact, we could even drive a car
00:46:17.980 | if someone put a camera out
00:46:19.480 | and then gave us the camera image with no latency,
00:46:23.460 | we would be able to drive a car that way, the same way.
00:46:26.380 | So a camera is also sufficient.
00:46:28.780 | Secondly, I really love the idea that in the Western world,
00:46:31.860 | we have many, many different people
00:46:33.620 | trying different hypotheses.
00:46:35.700 | It's almost like an ant hill.
00:46:37.340 | An ant hill tries to forage for food, right?
00:46:39.580 | You can sit there as two ants
00:46:42.580 | and agree on what the perfect path is,
00:46:44.060 | and then every single ant marches
00:46:46.380 | to where the most likely location of food is,
00:46:46.380 | or you can have them just spread out.
00:46:47.980 | And I promise you the spread out solution will be better
00:46:50.500 | because if the discussing, philosophical,
00:46:54.020 | intellectual ants get it wrong
00:46:55.620 | and they're all moving the wrong direction,
00:46:56.980 | they're gonna waste the day
00:46:58.340 | and then they're gonna discuss again for another week.
00:47:00.540 | Whereas if all these ants go in a random direction,
00:47:02.500 | someone's gonna succeed
00:47:03.540 | and they're gonna come back and claim victory
00:47:05.620 | and get the Nobel prize or whatever the ant equivalent is.
00:47:08.700 | And then they all march in the same direction.
00:47:10.580 | And that's great about society.
00:47:11.860 | That's great about the Western society.
00:47:13.220 | We're not plan-based, we're not central-based.
00:47:15.540 | We don't have a Soviet Union style central government
00:47:19.140 | that tells us where to forage.
00:47:21.020 | We just forage.
00:47:21.860 | We start a C-corp.
00:47:23.140 | We get investor money, go out and try it out.
00:47:25.860 | And who knows who's gonna win?
00:47:27.820 | (laughs)
00:47:28.740 | - I like it.
00:47:29.580 | When you look at the long-term vision
00:47:33.500 | of autonomous vehicles,
00:47:35.220 | do you see machine learning
00:47:36.940 | as fundamentally being able to solve most of the problems?
00:47:39.620 | So learning from experience.
00:47:42.300 | - I'd say we should be very clear
00:47:44.260 | about what machine learning is and is not.
00:47:46.100 | And I think there's a lot of confusion.
00:47:48.180 | What it is today is a technology
00:47:50.900 | that can go through large databases
00:47:54.700 | of repetitive patterns and find those patterns.
00:47:59.700 | So an example, we did a study at Stanford two years ago
00:48:03.580 | where we applied machine learning
00:48:05.460 | to detecting skin cancer in images.
00:48:07.900 | And we harvested or built a data set
00:48:10.780 | of 129,000 skin photographs
00:48:15.100 | that all had been biopsied for what the actual situation was.
00:48:19.460 | And those included melanomas and carcinomas,
00:48:22.700 | also included rashes and other skin conditions, lesions.
00:48:26.460 | And then we had a network find those patterns
00:48:30.740 | and it was by and large able to then detect skin cancer
00:48:34.540 | with an iPhone as accurately
00:48:36.700 | as the best board-certified Stanford-level dermatologist.
00:48:41.380 | We proved that.
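A minimal sketch of the transfer-learning recipe this kind of study relies on, not the actual Stanford pipeline: take a network pretrained on generic images, replace its final layer, and fine-tune on labeled skin photographs. The class list, the random stand-in data, and the choice of ResNet-18 are all placeholder assumptions.

```python
# Transfer-learning sketch: reuse pretrained features, retrain only the head
# on biopsy-labeled images. Dataset and class taxonomy are hypothetical.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # e.g. melanoma, carcinoma, benign lesion (placeholder taxonomy)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                      # keep the pretrained features
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new classifier head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch for labeled photographs (8 RGB images, 224x224).
images = torch.rand(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))

model.train()
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```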
00:48:42.780 | Now this thing was great in this one thing,
00:48:45.940 | finding skin cancer, but it couldn't drive a car.
00:48:48.540 | So the difference to human intelligence
00:48:51.580 | is we do all these many, many things
00:48:53.260 | and we can often learn from a very small data set
00:48:56.700 | of experiences, whereas machines still need
00:48:59.580 | very large data sets and things that would be very repetitive.
00:49:03.340 | Now that's still super impactful
00:49:04.660 | because almost everything we do is repetitive.
00:49:06.420 | So that's gonna really transform human labor,
00:49:09.980 | but it's not this almighty general intelligence.
00:49:13.100 | We're really far away from a system
00:49:15.300 | that would exhibit general intelligence.
00:49:17.300 | To that end, I actually lament the naming a little bit
00:49:21.300 | because artificial intelligence, if you believe Hollywood,
00:49:24.420 | is immediately mixed into the idea of human suppression
00:49:27.300 | and machine superiority.
00:49:30.340 | I don't think that we're gonna see this in my lifetime.
00:49:32.940 | I don't think human suppression is a good idea.
00:49:35.460 | I don't see it coming.
00:49:37.420 | I don't see the technology being there.
00:49:39.700 | What I see instead is a very pointed,
00:49:42.300 | focused pattern recognition technology
00:49:44.300 | that's able to extract patterns from large data sets.
00:49:48.420 | And in doing so, it can be super impactful, super impactful.
00:49:53.420 | Let's take the impact of artificial intelligence
00:49:55.900 | on human work.
00:49:57.620 | We all know that it takes something like 10,000 hours
00:50:00.540 | to become an expert.
00:50:01.540 | If you're gonna be a doctor or a lawyer
00:50:04.260 | or even a really good driver,
00:50:06.300 | it takes a certain amount of time to become experts.
00:50:09.500 | Machines now are able and have been shown
00:50:12.220 | to observe people become experts and observe experts
00:50:16.700 | and then extract those rules from experts
00:50:18.460 | in some interesting way that could go from law to sales
00:50:23.100 | to driving cars to diagnosing cancer
00:50:28.100 | and then giving that capability to people
00:50:31.460 | who are completely new in their job.
00:50:33.060 | We now can, and that's been done.
00:50:35.460 | It's been done commercially in many, many instantiations.
00:50:38.460 | So that means we can use machine learning
00:50:40.780 | to make people experts on the very first day of their work.
00:50:45.500 | Like think about the impact.
00:50:46.540 | If your doctor is still in their first 10,000 hours,
00:50:51.020 | you have a doctor who's not quite an expert yet.
00:50:53.780 | Who would not want a doctor who is the world's best expert?
00:50:57.340 | And now we can leverage machines
00:50:59.180 | to really eradicate error in decision-making,
00:51:02.700 | error in lack of expertise for human doctors.
00:51:06.180 | That could save your life.
00:51:08.340 | - If we could linger on that for a little bit,
00:51:10.300 | in which way do you hope machines in the medical field
00:51:14.780 | could help assist doctors?
00:51:16.340 | You mentioned this sort of accelerating the learning curve
00:51:21.300 | of people, if they start a job
00:51:24.500 | or are in their first 10,000 hours, they can be assisted by machines.
00:51:27.340 | How do you envision that assistance looking?
00:51:29.740 | - So we built this app for an iPhone
00:51:32.340 | that can detect and classify and diagnose skin cancer.
00:51:36.300 | And we proved two years ago that it does pretty much
00:51:40.540 | as good or better than the best human doctors.
00:51:42.220 | So let me tell you a story.
00:51:43.580 | So there's a friend of mine, let's call him Ben.
00:51:45.460 | Ben is a very famous venture capitalist.
00:51:47.660 | He goes to his doctor and the doctor looks at a mole
00:51:50.700 | and says, "Hey, that mole is probably harmless."
00:51:55.340 | And for some very funny reason,
00:51:58.660 | he pulls out that phone with our app.
00:52:00.420 | He's a collaborator in our study.
00:52:02.620 | And the app says, "No, no, no, no, this is a melanoma."
00:52:06.260 | And for background, melanoma is a skin cancer,
00:52:08.660 | and skin cancer is the most common cancer in this country.
00:52:12.380 | Melanomas can go from stage zero to stage four
00:52:16.620 | within less than a year.
00:52:18.100 | Stage zero means you can basically cut it out yourself
00:52:20.860 | with a kitchen knife and be safe.
00:52:23.180 | And stage four means your chances of living
00:52:25.300 | for five more years are less than 20%.
00:52:27.980 | So it's a very serious, serious, serious condition.
00:52:31.180 | So this doctor who took out the iPhone
00:52:36.140 | looked at the iPhone and was a little bit puzzled.
00:52:37.700 | He said, "I mean, but just to be safe,
00:52:39.700 | let's cut it out and biopsy it."
00:52:41.580 | That's the technical term for it.
00:52:43.220 | Let's get an in-depth diagnostics
00:52:45.380 | that is more than just looking at it.
00:52:47.660 | And it came back as cancerous, as a melanoma.
00:52:50.740 | And it was then removed.
00:52:52.180 | And my friend Ben, I was hiking with him
00:52:54.900 | and we were talking about AI.
00:52:56.220 | And I said, I told him I do this work on skin cancer.
00:52:58.820 | And he said, "Oh, funny.
00:53:00.660 | My doctor just had an iPhone that found my cancer."
00:53:03.740 | (both laughing)
00:53:05.420 | So I was like completely intrigued.
00:53:06.860 | I didn't even know about this.
00:53:08.140 | So here's a person, I mean,
00:53:09.420 | this is a real human life, right?
00:53:11.260 | - Yes.
00:53:12.100 | - Now who doesn't know somebody
00:53:12.940 | who has been affected by cancer?
00:53:13.980 | Cancer is cause of death number two.
00:53:16.100 | Cancer is this kind of disease that is mean
00:53:19.420 | in the following way.
00:53:21.020 | Most cancers can actually be cured relatively easily
00:53:24.460 | if we catch them early.
00:53:25.820 | And the reason why we don't tend to catch them early
00:53:28.300 | is because they have no symptoms.
00:53:30.540 | Like your very first symptom of a gallbladder cancer
00:53:33.820 | or a pancreas cancer might be a headache.
00:53:37.020 | And when you finally go to your doctor
00:53:38.620 | because of these headaches or your back pain
00:53:41.580 | and you're being imaged, it's usually stage four plus.
00:53:45.860 | And that's the time when the curing chances
00:53:48.180 | might have dropped to a single-digit percentage.
00:53:50.820 | So if we could leverage AI to inspect your body
00:53:54.500 | on a regular basis without even a doctor in the room,
00:53:58.060 | maybe when you take a shower or what have you,
00:54:00.300 | I know this sounds creepy,
00:54:01.420 | but then we might be able to save millions
00:54:03.740 | and millions of lives.
00:54:06.260 | - You've mentioned there's a concern that people have
00:54:09.420 | about near-term impacts of AI in terms of job loss.
00:54:12.780 | So you've mentioned being able to assist doctors,
00:54:15.460 | being able to assist people in their jobs.
00:54:17.860 | Do you have a worry of people losing their jobs
00:54:21.060 | or the economy being affected by the improvements in AI?
00:54:25.420 | - Yeah, anybody concerned about job losses,
00:54:27.660 | please come to Udacity.com.
00:54:30.020 | We teach contemporary tech skills
00:54:32.300 | and we have our kind of implicit job promise.
00:54:36.740 | We often, when we measure,
00:54:38.700 | we see way over 50% of our graduates in new jobs
00:54:41.860 | and they're very satisfied about it.
00:54:43.780 | And it costs almost nothing,
00:54:44.860 | costs like $1,500 max or something like that.
00:54:47.180 | - And I saw there's a cool new program
00:54:48.980 | that you agreed on with the US government,
00:54:51.180 | guaranteeing that you will help give scholarships
00:54:54.940 | that educate people in this kind of situation.
00:54:57.900 | - Yeah, we're working with the US government
00:55:00.020 | on the idea of basically rebuilding the American dream.
00:55:03.900 | So Udacity has just dedicated 100,000 scholarships
00:55:07.460 | for citizens of America for various levels of courses
00:55:12.100 | that eventually will get you a job.
00:55:15.620 | And those courses are all somewhat related
00:55:18.740 | to the tech sector because the tech sector
00:55:20.500 | is kind of the hottest sector right now.
00:55:22.100 | And they range from entry-level digital marketing
00:55:24.980 | to very advanced self-driving car engineering.
00:55:28.100 | And we're doing this with the White House
00:55:29.460 | because we think it's bipartisan.
00:55:30.900 | It's an issue that if you wanna really make America great,
00:55:35.900 | being able to be a part of the solution
00:55:40.060 | and live the American dream requires us
00:55:43.020 | to be proactive about our education and our skillset.
00:55:45.780 | It's just the way it is today.
00:55:47.700 | And it's always been this way.
00:55:48.700 | And we always had this American dream
00:55:49.940 | to send our kids to college.
00:55:51.140 | And now the American dream has to be
00:55:53.260 | to send ourselves to college.
00:55:54.660 | We can do this very, very, very efficiently
00:55:58.220 | and we can squeeze it in during the evenings, thanks to online.
00:56:01.780 | - At all ages.
00:56:03.140 | - All ages.
00:56:03.980 | So our learners go from age 11 to age 80.
00:56:08.980 | I just traveled Germany and the guy
00:56:14.060 | in the train compartment next to me was one of my students.
00:56:17.940 | It's like, wow, that's amazing.
00:56:19.820 | Think about impact.
00:56:21.300 | We've become the educator of choice for now,
00:56:24.020 | I believe officially six countries or five countries.
00:56:26.500 | Mostly in the Middle East, like Saudi Arabia and in Egypt.
00:56:30.100 | In Egypt, we just had a cohort graduate
00:56:33.420 | where we had 1100 high school students
00:56:37.300 | that went through it and became proficient in programming
00:56:40.340 | at the level of a computer science undergrad.
00:56:42.940 | And we had a 95% graduation rate
00:56:45.220 | even though everything's online.
00:56:46.260 | It's kind of tough, but we're kind of trying to figure out
00:56:48.260 | how to make this effective.
00:56:50.140 | The vision is very, very simple.
00:56:52.540 | The vision is education ought to be a basic human right.
00:56:57.540 | It cannot be locked up behind ivory tower walls
00:57:02.300 | only for the rich people,
00:57:03.620 | for the parents who might be able to bribe themselves
00:57:05.620 | into the system and only for young people
00:57:08.380 | and only for people from the right demographics
00:57:10.460 | and the right geography and possibly even the right race.
00:57:14.220 | It has to be opened up to everybody.
00:57:15.820 | If we are truthful to the human mission,
00:57:18.700 | if we are truthful to our values,
00:57:20.620 | we're gonna open up education to everybody in the world.
00:57:23.420 | So Udacity's pledge of 100,000 scholarships,
00:57:27.180 | I think is the biggest pledge of scholarships ever
00:57:29.180 | in terms of numbers.
00:57:30.740 | And we're working, as I said, with the White House
00:57:32.980 | and with very accomplished CEOs
00:57:35.460 | like Tim Cook from Apple and others
00:57:37.460 | to really bring education to everywhere in the world.
00:57:40.100 | - Not to ask you to pick the favorite of your children,
00:57:44.580 | but at this point-- - Oh, that's Jasper.
00:57:46.620 | (laughing)
00:57:47.460 | I only have one that I know of.
00:57:49.700 | - Okay, good.
00:57:50.540 | In this particular moment, what nanodegree,
00:57:55.780 | what set of courses are you most excited about at Udacity?
00:58:00.020 | Or is that too impossible to pick?
00:58:01.980 | - I've been super excited about something
00:58:03.780 | we haven't launched yet and we're building,
00:58:05.460 | which is when we talk to our partner companies,
00:58:09.100 | we have now a very strong footing in the enterprise world.
00:58:12.700 | And also to our students,
00:58:14.580 | we've kind of always focused on these hard skills
00:58:17.260 | like the programming skills or math skills
00:58:19.740 | or building skills or design skills.
00:58:22.180 | And a very common ask is soft skills.
00:58:25.180 | Like how do you behave in your work?
00:58:26.860 | How do you develop empathy?
00:58:28.260 | How do you work in a team?
00:58:29.540 | What are the very basics of management?
00:58:32.380 | How do you do time management?
00:58:33.700 | How do you advance your career
00:58:36.260 | in the context of a broader community?
00:58:39.260 | And that's something that we haven't done very well
00:58:41.740 | at Udacity and I would say most universities
00:58:43.860 | are doing very poorly as well
00:58:45.180 | because we are so obsessed with individual test scores
00:58:47.900 | and pay so little attention to teamwork in education.
00:58:52.620 | So that's something I see us moving into as a company
00:58:55.500 | because I'm excited about this.
00:58:56.940 | And I think, look, we can teach people tech skills
00:59:00.100 | and they're gonna be great.
00:59:00.940 | But if we teach people empathy,
00:59:02.700 | that's gonna have the same impact.
00:59:04.940 | - Maybe harder than self-driving cars, but--
00:59:08.100 | - I don't think so.
00:59:08.940 | I think the rules are really simple.
00:59:11.300 | You just have to want to engage.
00:59:14.380 | It's weird, we literally, in school in K through 12,
00:59:18.180 | we teach kids like, get the highest math score.
00:59:20.460 | And if you are a rational human being,
00:59:22.860 | you might come out of this education saying,
00:59:24.860 | having the best math score
00:59:26.740 | and the best English scores makes me the best leader.
00:59:29.620 | And it turns out not to be the case.
00:59:31.060 | It's actually really wrong
00:59:32.860 | because, first of all, in terms of math scores,
00:59:35.820 | I think it's perfectly fine to hire somebody
00:59:37.620 | with great math skills.
00:59:38.500 | You don't have to do it yourself.
00:59:40.620 | Hiring someone with great empathy for you,
00:59:42.740 | that's much harder, but you can always hire someone
00:59:44.980 | with great math skills.
00:59:46.340 | But we live in an affluent world
00:59:48.980 | where we constantly deal with other people.
00:59:51.020 | And that's a beauty.
00:59:51.900 | It's not a nuisance, it's a beauty.
00:59:53.340 | So if we somewhat develop that muscle
00:59:55.940 | that we can do that well and empower others
00:59:59.700 | in the workplace, I think we're gonna be super successful.
01:00:02.900 | - And I know many fellow roboticists
01:00:05.780 | and computer scientists who I will insist
01:00:08.660 | take this course.
01:00:09.820 | (laughing)
01:00:11.020 | - Not to be named here.
01:00:12.180 | - Not to be named.
01:00:13.780 | Many, many years ago, 1903,
01:00:17.980 | the Wright brothers flew in Kitty Hawk for the first time.
01:00:22.660 | And you've launched a company of the same name, Kitty Hawk,
01:00:26.980 | with the dream of building flying cars, EVtols.
01:00:31.980 | So at the big picture, what are the big challenges
01:00:35.780 | of making this thing that actually have inspired
01:00:38.700 | generations of people about what the future looks like?
01:00:41.780 | What does it take?
01:00:42.620 | What are the biggest challenges?
01:00:43.700 | - So flying cars has always been a dream.
01:00:47.260 | Every boy, every girl wants to fly.
01:00:49.740 | Let's be honest.
01:00:50.580 | - Yes.
01:00:51.420 | - And let's go back in our history
01:00:52.380 | of your dreaming of flying.
01:00:53.820 | I think my, honestly, my single most remembered
01:00:56.540 | childhood dream has been a dream
01:00:58.340 | where I was sitting on a pillow and I could fly.
01:01:00.780 | I was like five years old.
01:01:02.060 | I remember like maybe three dreams of my childhood,
01:01:04.180 | but that's the one I remember most vividly.
01:01:06.420 | And then Peter Thiel famously said,
01:01:09.380 | "They promised us flying cars
01:01:10.660 | and they gave us 140 characters,"
01:01:12.780 | pointing at Twitter at the time,
01:01:15.220 | limited message size to 140 characters.
01:01:18.380 | So we're coming back now to really go
01:01:20.220 | for this super impactful stuff like flying cars.
01:01:23.220 | And to be precise, they're not really cars.
01:01:25.900 | They don't have wheels.
01:01:27.180 | They're actually much closer to a helicopter
01:01:28.580 | than anything else.
01:01:29.620 | They take off vertically and they fly horizontally,
01:01:32.060 | but they have important differences.
01:01:34.380 | One difference is that they are much quieter.
01:01:37.740 | We just released a vehicle called Project Heaviside
01:01:41.620 | that can fly over you as low as a helicopter.
01:01:43.540 | And you basically can't hear it.
01:01:45.300 | It's like 38 decibels.
01:01:46.740 | It's like, if you were inside the library,
01:01:49.300 | you might be able to hear it,
01:01:50.260 | but anywhere outdoors, your ambient noise is higher.
01:01:53.020 | Secondly, they're much more affordable.
01:01:57.060 | They're much more affordable than helicopters.
01:01:59.020 | And the reason is helicopters are expensive
01:02:01.940 | for many reasons.
01:02:03.060 | There are lots of single points of failure in a helicopter.
01:02:07.020 | There's a bolt between the blades
01:02:09.140 | that's called Jesus bolt.
01:02:10.780 | And the reason why it's called Jesus bolt
01:02:12.420 | is that if this bolt breaks, you will die.
01:02:16.380 | There is no second solution in helicopter flight.
01:02:19.500 | Whereas we have these distributed mechanisms.
01:02:21.500 | When you go from gasoline to electric,
01:02:23.740 | you can now have many, many, many small motors
01:02:25.820 | as opposed to one big motor.
01:02:27.260 | And that means if you lose one of those motors,
01:02:28.780 | not a big deal.
01:02:29.620 | Heaviside has eight of those motors,
01:02:32.820 | and if it loses one of those eight motors,
01:02:34.020 | so there's seven left,
01:02:35.180 | it can take off just like before
01:02:37.260 | and land just like before.
01:02:38.820 | We are now also moving into a technology
01:02:42.020 | that doesn't require a commercial pilot
01:02:44.140 | because in some level,
01:02:45.500 | flight is actually easier than ground transportation.
01:02:48.980 | Like in self-driving cars,
01:02:50.700 | the world is full of like children and bicycles
01:02:54.500 | and other cars and mailboxes and curbs
01:02:56.620 | and shrubs and what have you,
01:02:58.420 | all these things you have to avoid.
01:03:00.500 | When you go above the buildings and tree lines,
01:03:03.740 | there's nothing there.
01:03:04.620 | I mean, you can do the test right now,
01:03:06.100 | look outside and count the number of things you see flying.
01:03:09.420 | I'd be shocked if you could see more than two things.
01:03:11.500 | It's probably just zero.
01:03:12.860 | In the Bay Area, the most I've ever seen was six.
01:03:16.940 | And maybe it's 15 or 20, but not 10,000.
01:03:20.420 | So the sky is very ample and very empty and very free.
01:03:23.980 | So the vision is, can we build a socially acceptable
01:03:27.820 | mass transit solution for daily transportation
01:03:32.340 | that is affordable?
01:03:34.260 | And we have an existence proof.
01:03:36.300 | Heaviside can fly 100 miles in range
01:03:39.780 | with still 30% electric reserves.
01:03:43.260 | It can fly up to like 180 miles an hour.
01:03:46.060 | We know that that solution at scale
01:03:48.860 | would make your ground transportation
01:03:51.380 | 10 times as fast as a car
01:03:53.780 | based on US census or statistics data,
01:03:57.340 | which means we would take your 300 hours
01:04:00.860 | of yearly commute down to 30 hours
01:04:02.980 | and give you 270 hours back.
01:04:05.180 | Who wouldn't want, I mean, who doesn't hate traffic?
01:04:07.660 | Like I hate, give me the person who doesn't hate traffic.
01:04:10.780 | I hate traffic.
01:04:11.620 | Every time I'm in traffic, I hate it.
01:04:13.620 | And if we could free the world from traffic,
01:04:17.540 | we have technology, we can free the world from traffic.
01:04:20.020 | We have the technology.
01:04:21.300 | It's there, we have an existence proof.
01:04:23.020 | It's not a technological problem anymore.
01:04:25.420 | - Do you think there is a future where tens of thousands,
01:04:29.300 | maybe hundreds of thousands of both delivery drones
01:04:34.300 | and flying cars of this kind, EVTOLs, fill the sky?
01:04:39.340 | - I absolutely believe this.
01:04:40.900 | And there's obviously the societal acceptance
01:04:43.540 | is a major question.
01:04:45.420 | And of course, safety is.
01:04:46.900 | I believe in safety,
01:04:48.020 | we're gonna exceed ground transportation safety
01:04:50.300 | as has happened for aviation already, commercial aviation.
01:04:54.460 | And in terms of acceptance,
01:04:56.580 | I think one of the key things is noise.
01:04:58.260 | That's why we are focusing relentlessly on noise
01:05:00.900 | and we built perhaps the quietest electric VTOL vehicle
01:05:05.580 | ever built.
01:05:06.420 | The nice thing about the sky is it's three dimensional.
01:05:09.700 | So any mathematician will immediately recognize
01:05:12.460 | the difference between 1D of like a regular highway
01:05:14.900 | to 3D of a sky.
01:05:16.180 | But to make it clear for the layman,
01:05:19.300 | say you wanna make 100 vertical lanes
01:05:22.700 | of highway 101 in San Francisco,
01:05:24.980 | because you believe building 100 vertical lanes
01:05:27.180 | is the right solution.
01:05:28.820 | Imagine how much it would cost
01:05:30.140 | to stack 100 vertical lanes physically onto 101.
01:05:33.380 | That would be prohibitive.
01:05:34.300 | That would be consuming the world's GDP for an entire year
01:05:37.740 | just for one highway.
01:05:39.180 | It's amazingly expensive.
01:05:41.220 | In the sky, it would just be a recompilation
01:05:43.660 | of a piece of software
01:05:44.540 | because all these lanes are virtual.
01:05:46.500 | That means any vehicle that is in conflict
01:05:49.180 | with another vehicle would just go to different altitudes
01:05:51.780 | and then the conflict is gone.
01:05:53.260 | And if you don't believe this,
01:05:55.300 | that's exactly how commercial aviation works.
01:05:58.500 | When you fly from New York to San Francisco,
01:06:01.380 | another plane flies from San Francisco to New York,
01:06:04.140 | they're at different altitudes
01:06:05.180 | so they don't hit each other.
01:06:06.660 | It's a solved problem for the jet space
01:06:10.300 | and it will be a solved problem for the urban space.
01:06:12.660 | There are companies like Google Wing and Amazon
01:06:15.300 | working on very innovative solutions
01:06:16.980 | for how we do airspace management.
01:06:18.500 | They use exactly the same principles as we use today
01:06:21.580 | to route today's jets.
01:06:23.220 | There's nothing hard about this.
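A toy illustration of the "virtual lanes" idea, assuming a made-up set of altitude layers and a deliberately simplified conflict test; it is not any company's actual traffic-management logic, only the principle that conflicts can be resolved in software by assigning different altitudes.

```python
# Toy deconfliction sketch: if two routes share a corridor, push one of them
# to a different altitude layer. All values and field names are invented.
ALTITUDE_LAYERS_FT = [1500, 2000, 2500, 3000]  # hypothetical layer spacing

def routes_conflict(a, b):
    """Simplified conflict test: sharing a corridor counts as a conflict."""
    return a["corridor"] == b["corridor"]

def assign_altitudes(flights):
    """Greedily give each flight the lowest layer free of conflicts."""
    for flight in flights:
        taken = {
            other["altitude_ft"]
            for other in flights
            if "altitude_ft" in other and routes_conflict(flight, other)
        }
        flight["altitude_ft"] = next(
            layer for layer in ALTITUDE_LAYERS_FT if layer not in taken
        )
    return flights

flights = [
    {"id": "EVTOL-1", "corridor": "101-N"},
    {"id": "EVTOL-2", "corridor": "101-N"},   # conflicts with EVTOL-1
    {"id": "EVTOL-3", "corridor": "280-S"},
]
for f in assign_altitudes(flights):
    print(f["id"], f["altitude_ft"])   # EVTOL-2 ends up one layer higher
```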
01:06:25.860 | - Do you envision autonomy being a key part of it
01:06:28.940 | so that the flying vehicles are either semi-autonomous
01:06:33.940 | or fully autonomous?
01:06:36.860 | - 100% autonomous.
01:06:37.820 | You don't want idiots like me flying in the sky.
01:06:40.420 | I promise you.
01:06:41.900 | And if you have 10,000,
01:06:43.180 | watch the movie "The Fifth Element"
01:06:45.980 | to get a feel for what would happen
01:06:48.180 | if it's not autonomous.
01:06:49.420 | - And a centralized, that's a really interesting idea
01:06:51.660 | of a centralized sort of management system
01:06:55.260 | for lanes and so on.
01:06:56.300 | So actually just being able to have
01:06:58.780 | similar as we have in the current commercial aviation
01:07:02.980 | but scale it up to much more vehicles.
01:07:05.540 | That's a really interesting optimization problem.
01:07:07.660 | - It is mathematically very, very straightforward.
01:07:11.060 | Like the gap we leave between jets is gargantuan.
01:07:13.500 | And part of the reason is there isn't that many jets.
01:07:16.380 | So it just feels like a good solution.
01:07:18.820 | Today, when you get vectored by air traffic control,
01:07:22.380 | someone talks to you, right?
01:07:23.900 | So an ATC controller might have up to maybe 20 planes
01:07:26.980 | on the same frequency.
01:07:28.140 | And then they talk to you, you have to talk back.
01:07:30.340 | And that feels right because there isn't more than 20 planes
01:07:32.700 | around anyhow, so you can talk to everybody.
01:07:34.980 | But if there's 20,000 things around,
01:07:36.740 | you can't talk to everybody anymore.
01:07:37.980 | So we have to do something that's called digital,
01:07:40.260 | like text messaging.
01:07:41.540 | Like we do have solutions.
01:07:43.060 | Like we have what, four or five billion smartphones
01:07:45.540 | in the world now, right?
01:07:46.460 | And they're all connected.
01:07:47.740 | And somehow we solve the scale problem for smartphones.
01:07:50.740 | We know where they all are.
01:07:51.940 | They can talk to somebody and they're very reliable.
01:07:54.860 | They're amazingly reliable.
01:07:56.460 | We could use the same system,
01:07:58.620 | the same scale for air traffic control.
01:08:01.060 | So instead of me as a pilot talking to a human being
01:08:04.060 | in the middle of the conversation,
01:08:06.260 | receiving a new frequency, like how ancient is that?
01:08:09.660 | We could digitize this stuff
01:08:11.260 | and digitally transmit the right flight coordinates.
01:08:15.260 | And that solution will automatically scale
01:08:18.060 | to 10,000 vehicles.
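A sketch of what a structured, machine-readable clearance might look like in place of a voice instruction, in the spirit of the "text messaging" analogy above. Every field name and value below is invented for illustration and does not reflect any real air-traffic-control protocol.

```python
# Hypothetical digital clearance message a ground system could push to a
# vehicle, the way a text message reaches a phone. Fields are illustrative.
import json
from dataclasses import dataclass, asdict

@dataclass
class DigitalClearance:
    vehicle_id: str
    waypoints: list          # list of (lat, lon, altitude_ft) tuples
    speed_kts: int
    valid_until_utc: str

clearance = DigitalClearance(
    vehicle_id="HVSD-042",
    waypoints=[(37.79, -122.40, 2000), (37.62, -122.38, 1500)],
    speed_kts=120,
    valid_until_utc="2019-12-01T18:30:00Z",
)

# Serialized payload; thousands of these scale the way smartphone traffic does.
payload = json.dumps(asdict(clearance))
print(payload)
```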
01:08:20.060 | - We talked about empathy a little bit.
01:08:22.180 | Do you think we'll one day build an AI system
01:08:25.780 | that a human being can love and that loves that human back?
01:08:30.140 | Like in the movie "Her."
01:08:31.340 | - Look, I'm a pragmatist.
01:08:33.980 | For me, AI is a tool.
01:08:35.620 | It's like a shovel.
01:08:37.020 | And the ethics of using the shovel are always
01:08:40.800 | with us, the people.
01:08:41.860 | And it has to be this way.
01:08:44.060 | In terms of emotions, I would hate to come into my kitchen
01:08:49.060 | and see that my refrigerator spoiled all my food,
01:08:54.220 | then have it explained to me that it fell in love
01:08:56.500 | with the dishwasher and I wasn't as nice as the dishwasher.
01:08:59.660 | So as a result, it neglected me.
01:09:02.180 | That would just be a bad experience
01:09:05.140 | and it would be a bad product.
01:09:07.060 | I would probably not recommend this refrigerator
01:09:09.540 | to my friends.
01:09:10.420 | And that's where I draw the line.
01:09:12.900 | I think to me, technology has to be reliable.
01:09:16.620 | It has to be predictable.
01:09:17.700 | I want my car to work.
01:09:19.860 | I don't want to fall in love with my car.
01:09:22.860 | I just want it to work.
01:09:24.620 | I want it to complement me, not to replace me.
01:09:27.180 | I have very unique human properties
01:09:30.620 | and I want the machines to make me,
01:09:33.420 | turn me into a superhuman.
01:09:35.700 | Like I'm already a superhuman today,
01:09:37.820 | thanks to the machines that surround me.
01:09:39.260 | And I give you examples.
01:09:40.780 | I can run across the Atlantic near the speed of sound
01:09:45.700 | at 36,000 feet today.
01:09:48.460 | That's kind of amazing.
01:09:49.580 | My voice now carries me all the way to Australia
01:09:53.620 | using a smartphone today.
01:09:56.580 | And it's not the speed of sound, which would take hours.
01:10:00.060 | It's the speed of light.
01:10:01.300 | My voice travels at the speed of light.
01:10:03.820 | How cool is that?
01:10:04.660 | That makes me superhuman.
01:10:06.300 | I would even argue my flushing toilet makes me superhuman.
01:10:10.500 | Just think of the time before flushing toilets.
01:10:13.780 | And maybe you have a very old person in your family
01:10:16.460 | that you can ask about this
01:10:18.460 | or take a trip to rural India to experience it.
01:10:22.100 | It makes me superhuman.
01:10:25.780 | So to me, what technology does is it complements me.
01:10:28.860 | It makes me stronger.
01:10:30.900 | Therefore, words like love and compassion,
01:10:34.900 | I have very little interest in them for machines.
01:10:38.580 | I have interest in people.
01:10:40.700 | - You don't think, first of all, beautifully put,
01:10:44.260 | beautifully argued,
01:10:45.660 | but do you think love has use in our tools, compassion?
01:10:50.420 | - I think love is a beautiful human concept.
01:10:53.260 | And if you think of what love really is,
01:10:55.380 | love is a means to convey safety, to convey trust.
01:11:00.380 | I think trust has a huge need in technology as well,
01:11:07.420 | not just people.
01:11:09.140 | We want to trust our technology
01:11:11.220 | the same way we, or in a similar way we trust people.
01:11:14.260 | In human interaction, standards have emerged
01:11:19.340 | and feelings, emotions have emerged,
01:11:21.740 | maybe genetically, maybe biologically,
01:11:23.900 | that are able to convey sense of trust, sense of safety,
01:11:26.540 | sense of passion, of love, of dedication
01:11:28.860 | that makes the human fabric.
01:11:30.780 | And I'm a big sucker for love.
01:11:33.700 | I want to be loved.
01:11:34.580 | I want to be trusted.
01:11:35.420 | I want to be admired.
01:11:36.820 | All these wonderful things.
01:11:38.820 | And because all of us, we have this beautiful system,
01:11:42.140 | I wouldn't just blindly copy this to the machines.
01:11:44.780 | Here's why.
01:11:46.140 | When you look at, say, transportation,
01:11:49.300 | you could have observed that up to the end
01:11:53.260 | of the 19th century, almost all transportation
01:11:56.180 | used any number of legs,
01:11:58.060 | from one leg to two legs to a thousand legs.
01:12:01.660 | And you could have concluded that is the right way
01:12:03.740 | to move about the environment.
01:12:06.780 | With the exception of birds, who use flapping wings.
01:12:08.940 | In fact, there were many people in aviation
01:12:10.820 | that strapped wings to their arms and jumped from cliffs.
01:12:13.700 | Most of them didn't survive.
01:12:15.100 | Then the interesting thing is that the technology solutions
01:12:19.900 | are very different.
01:12:21.580 | Like, in technology, it's really easy to build a wheel.
01:12:23.900 | In biology, it's super hard to build a wheel.
01:12:25.700 | There are very few perpetually rotating things in biology,
01:12:30.100 | and they're usually inside cells and things.
01:12:34.180 | In engineering, we can build wheels,
01:12:37.180 | and those wheels gave rise to cars.
01:12:39.980 | Similar wheels gave rise to aviation.
01:12:44.380 | Like, there's no thing that flies
01:12:46.700 | that wouldn't have something that rotates,
01:12:48.820 | like a jet engine or helicopter blades.
01:12:52.420 | So the solutions have used very different physical laws
01:12:55.500 | than nature, and that's great.
01:12:58.060 | So for me to be too much focused on,
01:13:00.100 | oh, this is how nature does it, let's just replicate it,
01:13:03.340 | if we really believed that the solution
01:13:05.380 | to the agricultural revolution was a humanoid robot,
01:13:10.940 | we would still be waiting for it today.
01:13:10.940 | - Again, beautifully put.
01:13:12.540 | You said that you don't take yourself too seriously.
01:13:15.860 | - Did I say that?
01:13:16.700 | You want me to say that?
01:13:19.180 | - Maybe I did.
01:13:20.020 | - You're not taking me seriously.
01:13:20.980 | - I'm not, yeah, that's right.
01:13:22.860 | - Good, you're right, I don't wanna.
01:13:24.460 | - I just made that up.
01:13:25.740 | But you have a humor and a lightness about life
01:13:29.100 | that I think is beautiful and inspiring to a lot of people.
01:13:33.500 | Where does that come from?
01:13:35.020 | The smile, the humor, the lightness amidst all the chaos
01:13:40.020 | of the hard work that you're in, where does that come from?
01:13:43.660 | - I just love my life.
01:13:44.540 | I love the people around me.
01:13:46.100 | I'm just so glad to be alive.
01:13:49.740 | Like, I'm, what, 52, hard to believe.
01:13:53.620 | People say 52 is a new 51, so now I feel better.
01:13:56.220 | (Lex laughing)
01:13:58.540 | But in looking around the world,
01:14:02.340 | looking, just go back 200, 300 years.
01:14:05.180 | Humanity is, what, 300,000 years old?
01:14:09.340 | But for the first 300,000 years minus the last 100,
01:14:13.980 | our life expectancy would have been
01:14:17.060 | plus or minus 30 years, roughly, give or take.
01:14:20.260 | So I would be long dead now.
01:14:22.620 | That makes me just enjoy every single day of my life
01:14:26.860 | because I don't deserve this.
01:14:28.260 | Why am I born today when so many of my ancestors
01:14:32.460 | died of horrible deaths, like famines, massive wars
01:14:37.460 | that ravaged Europe for the last 1,000 years,
01:14:41.860 | mystically disappeared after World War II
01:14:44.540 | when the Americans and the Allies did something amazing
01:14:47.580 | to my country that didn't deserve it,
01:14:49.780 | the country of Germany.
01:14:51.460 | This is so amazing.
01:14:52.620 | And then when you're alive and feel this every day,
01:14:56.940 | then it's just so amazing what we can accomplish,
01:15:01.940 | what we can do.
01:15:03.460 | We live in a world that is so incredibly
01:15:06.380 | vastly changing every day.
01:15:08.700 | Almost everything that we cherish,
01:15:10.980 | from your smartphone to your flushing toilet,
01:15:14.540 | to all these basic inventions,
01:15:16.180 | your new clothes you're wearing, your watch, your plane,
01:15:19.580 | penicillin, I don't know, anesthesia for surgery,
01:15:25.780 | have all been invented in the last 150 years.
01:15:30.020 | So in the last 150 years, something magical happened.
01:15:32.380 | And I would trace it back to Gutenberg
01:15:34.340 | and the printing press that has been able
01:15:35.900 | to disseminate information more efficiently than before,
01:15:38.780 | that all of a sudden we're able to invent agriculture
01:15:42.860 | and nitrogen fertilization that made agriculture
01:15:45.940 | so much more potent that we didn't have to work
01:15:48.020 | in the farms anymore and we could start reading and writing
01:15:50.140 | and we could become all these wonderful things
01:15:52.300 | we are today, from airline pilot to massage therapist
01:15:54.740 | to software engineer.
01:15:56.780 | It's just amazing.
01:15:57.620 | Like living in that time is such a blessing.
01:16:00.620 | We should sometimes really think about this.
01:16:03.340 | Steven Pinker, who is a very famous author and philosopher
01:16:07.340 | whom I really adore, wrote a great book called
01:16:09.420 | "Enlightenment Now" and that's maybe the one book
01:16:11.140 | I would recommend.
01:16:11.980 | And he asked the question if there was only
01:16:14.500 | a single article written in the 20th century,
01:16:17.140 | only one article, what would it be?
01:16:19.100 | What's the most important innovation,
01:16:21.140 | the most important thing that happened?
01:16:23.060 | And he would say this article would credit
01:16:24.700 | a guy named Carl Bosch.
01:16:27.020 | And I challenge anybody, have you ever heard
01:16:29.460 | of the name Carl Bosch?
01:16:31.180 | I hadn't, okay.
01:16:32.940 | There's a Bosch Corporation in Germany,
01:16:35.420 | but it's not associated with Carl Bosch.
01:16:37.420 | So I looked it up.
01:16:39.860 | Carl Bosch invented nitrogen fertilization.
01:16:42.620 | And in doing so, together with an older invention
01:16:45.540 | of irrigation, was able to increase the yield
01:16:49.180 | per agricultural land by a factor of 26.
01:16:52.860 | So a 2,500% increase in fertility of land.
01:16:57.700 | And that, so Steve Pinker argues,
01:17:00.540 | saved over two billion lives today.
01:17:03.900 | Two billion people who would be dead
01:17:05.700 | if this man hadn't done what he had done, okay?
01:17:08.420 | Think about that impact and what that means to society.
01:17:12.180 | That's the way I look at the world.
01:17:14.180 | I mean, it's just so amazing to be alive
01:17:16.180 | and to be part of this.
01:17:17.020 | And I'm so glad I lived after Carl Bosch and not before.
01:17:21.340 | - I don't think there's a better way to end this, Sebastian.
01:17:24.020 | It's an honor to talk to you,
01:17:25.500 | to have had the chance to learn from you.
01:17:27.380 | Thank you so much for talking to me.
01:17:28.340 | - Thanks for coming on, it's a real pleasure.
01:17:31.020 | - Thank you for listening to this conversation
01:17:32.820 | with Sebastian Thrun.
01:17:34.420 | And thank you to our presenting sponsor, Cash App.
01:17:37.500 | Download it, use code LEXPODCAST,
01:17:40.260 | you'll get $10 and $10 will go to FIRST,
01:17:43.260 | a STEM education nonprofit that inspires
01:17:45.540 | hundreds of thousands of young minds
01:17:47.500 | to learn and to dream of engineering our future.
01:17:50.580 | If you enjoy this podcast, subscribe on YouTube,
01:17:53.380 | give it five stars on Apple Podcasts, support it on Patreon,
01:17:56.660 | or connect with me on Twitter.
01:17:58.860 | And now let me leave you with some words of wisdom
01:18:01.300 | from Sebastian Thrun.
01:18:03.260 | It's important to celebrate your failures
01:18:05.440 | as much as your successes.
01:18:07.740 | If you celebrate your failures really well,
01:18:09.820 | if you say, wow, I failed, I tried, I was wrong,
01:18:13.940 | but I learned something.
01:18:15.620 | Then you realize you have no fear.
01:18:18.300 | And when your fear goes away, you can move the world.
01:18:21.660 | Thank you for listening and hope to see you next time.
01:18:25.500 | (upbeat music)
01:18:28.080 | (upbeat music)