
Eric Schmidt: Google | Lex Fridman Podcast #8


Chapters

0:00 Introduction
0:36 First moment when you fell in love with technology
1:08 Discovering the beauty of math
3:35 Creating Lex
5:32 Path to generality
6:32 Middle class
7:25 Computers as tools
9:30 Predicting the future
11:04 The champions of the impossible
12:3 Planning for the impossible
15:17 Artificial Intelligence
19:21 Thinking beyond 50 years
22:21 Leadership styles
25:23 Personal AI assistant

Transcript

00:00:00.000 | The following is a conversation with Eric Schmidt.
00:00:03.200 | He was the CEO of Google for 10 years
00:00:05.160 | and its chairman for six more,
00:00:06.780 | guiding the company through an incredible period of growth
00:00:10.100 | and a series of world-changing innovations.
00:00:12.960 | He is one of the most impactful leaders
00:00:15.300 | in the era of the internet and a powerful voice
00:00:19.360 | for the promise of technology in our society.
00:00:22.320 | It was truly an honor to speak with him
00:00:24.800 | as part of the MIT course
00:00:26.920 | on artificial general intelligence
00:00:28.640 | and the Artificial Intelligence Podcast.
00:00:31.920 | And now, here's my conversation with Eric Schmidt.
00:00:36.160 | What was the first moment
00:00:38.040 | when you fell in love with technology?
00:00:40.000 | - I grew up in the 1960s as a boy
00:00:44.440 | where every boy wanted to be an astronaut
00:00:46.840 | and part of the space program.
00:00:48.920 | So like everyone else of my age,
00:00:51.360 | we would go out to the cow pasture behind my house,
00:00:54.360 | which was literally a cow pasture,
00:00:56.320 | and we would shoot model rockets off.
00:00:58.560 | And that, I think, is the beginning.
00:01:00.840 | And of course, generationally,
00:01:02.480 | today it would be video games
00:01:04.800 | and all the amazing things
00:01:05.800 | that you can do online with computers.
00:01:08.220 | - There's a transformative, inspiring aspect
00:01:12.040 | of science and math that maybe rockets
00:01:15.800 | would instill in individuals.
00:01:17.480 | You mentioned yesterday that eighth grade math
00:01:20.160 | is where the journey through the mathematical universe
00:01:22.200 | diverges for many people.
00:01:23.800 | It's this fork in the road.
00:01:26.960 | There's a professor of math at Berkeley, Edward Frenkel.
00:01:30.200 | I'm not sure if you're familiar with him.
00:01:32.760 | - I am.
00:01:33.600 | - He has written this amazing book
00:01:35.480 | I recommend to everybody called "Love and Math,"
00:01:37.760 | two of my favorite words.
00:01:39.920 | (laughs)
00:01:41.600 | He says that if painting was taught like math,
00:01:46.600 | then students would be asked to paint a fence,
00:01:49.660 | which is his analogy of essentially how math is taught.
00:01:52.360 | You never get a chance to discover the beauty
00:01:55.920 | of the art of painting or the beauty of the art of math.
00:01:59.320 | So how, when, and where did you discover that beauty?
00:02:03.900 | - I think what happens with people like myself
00:02:08.080 | is that you're math-enabled pretty early,
00:02:11.400 | and all of a sudden you discover that you can use that
00:02:14.360 | to discover new insights.
00:02:16.600 | The great scientists will all tell a story,
00:02:19.120 | the men and women who are fantastic today,
00:02:22.080 | that somewhere when they were in high school or in college,
00:02:24.640 | they discovered that they could discover
00:02:26.120 | something themselves.
00:02:27.840 | And that sense of building something,
00:02:29.920 | of having an impact that you own,
00:02:32.320 | drives knowledge acquisition and learning.
00:02:35.520 | In my case, it was programming,
00:02:37.080 | and the notion that I could build things
00:02:39.840 | that had not existed that I had built,
00:02:42.360 | that it had my name on it.
00:02:44.440 | And this was before open source,
00:02:46.200 | but you could think of it as open source contributions.
00:02:49.140 | So today, if I were a 16 or 17-year-old boy,
00:02:51.800 | I'm sure that I would aspire as a computer scientist
00:02:54.720 | to make a contribution like the open source heroes
00:02:58.120 | of the world today.
00:02:58.960 | That would be what would be driving me,
00:03:00.400 | and I'd be trying and learning and making mistakes
00:03:03.740 | and so forth in the ways that it works.
00:03:06.680 | The repository that GitHub represents,
00:03:10.000 | and that open source libraries represent,
00:03:12.240 | is an enormous bank of knowledge
00:03:14.960 | of all of the people who are doing that.
00:03:17.160 | And one of the lessons that I learned at Google
00:03:19.600 | was that the world is a very big place,
00:03:21.520 | and there's an awful lot of smart people.
00:03:23.600 | And an awful lot of them are underutilized.
00:03:26.320 | So here's an opportunity, for example,
00:03:28.960 | building parts of programs, building new ideas,
00:03:31.700 | to contribute to the greater good of society.
00:03:33.840 | - So in that moment in the '70s,
00:03:38.360 | the inspiring moment where there was nothing,
00:03:40.680 | and then you created something through programming,
00:03:42.840 | that magical moment.
00:03:44.720 | So in 1975, I think, you created a program called Lex,
00:03:49.180 | which I especially like because my name is Lex.
00:03:51.480 | So thank you, thank you for creating a brand
00:03:54.640 | that established a reputation that's long-lasting,
00:03:57.600 | reliable, and has a big impact on the world,
00:03:59.720 | and still used today.
00:04:01.200 | So thank you for that.
00:04:02.840 | But more seriously, in that time, in the '70s,
00:04:07.840 | as an engineer, personal computers were being born.
00:04:11.200 | Do you think you would be able to predict
00:04:14.160 | the '80s, '90s, and the aughts of where computers would go?
00:04:18.920 | - I'm sure I could not and would not have gotten it right.
00:04:22.100 | I was the beneficiary of the great work
00:04:25.440 | of many, many people who saw it clearer than I did.
00:04:29.040 | With Lex, I worked with a fellow named Michael Lesk,
00:04:32.520 | who was my supervisor, and he essentially helped me
00:04:35.640 | architect and deliver a system that's still in use today.
00:04:39.180 | After that, I worked at Xerox Palo Alto Research Center,
00:04:42.200 | where the Alto was invented.
00:04:43.660 | And the Alto is the predecessor
00:04:46.080 | of the modern personal computer, or Macintosh, and so forth.
00:04:50.200 | And the Altos were very rare, and I had to drive an hour
00:04:53.600 | from Berkeley to go use them, but I made a point
00:04:56.400 | of skipping classes and doing whatever it took
00:04:59.560 | to have access to this extraordinary achievement.
00:05:02.500 | I knew that they were consequential.
00:05:04.880 | What I did not understand was scaling.
00:05:08.260 | I did not understand what would happen
00:05:09.860 | when you had 100 million as opposed to 100.
00:05:12.840 | And so since then, and I have learned the benefit
00:05:15.360 | of scale, I always look for things
00:05:17.480 | which are going to scale to platforms, right?
00:05:19.680 | So mobile phones, Android, all those things.
00:05:23.120 | The world is enormous,
00:05:25.880 | there are many, many people in the world,
00:05:27.440 | they really have needs, they really will use
00:05:29.360 | these platforms, and you can build big businesses
00:05:31.280 | on top of them.
00:05:32.600 | - So it's interesting, so when you see a piece
00:05:34.200 | of technology, now you think, what will this technology
00:05:36.800 | look like when it's in the hands of a billion people?
00:05:39.040 | - That's right.
00:05:39.960 | So an example would be that the market is so competitive now
00:05:45.000 | that if you can't figure out a way for something
00:05:47.480 | to have a million users or a billion users,
00:05:50.840 | it probably is not gonna be successful
00:05:53.160 | because something else will become the general platform
00:05:56.880 | and your idea will become a lost idea
00:06:01.080 | or a specialized service with relatively few users.
00:06:04.320 | So it's a path to generality, it's a path
00:06:06.560 | to general platform use, it's a path to broad applicability.
00:06:10.080 | Now there are plenty of good businesses that are tiny,
00:06:12.680 | so luxury goods, for example.
00:06:14.920 | But if you wanna have an impact at scale,
00:06:18.520 | you have to look for things which are of common value,
00:06:21.360 | common pricing, common distribution,
00:06:23.320 | and solve common problems.
00:06:24.760 | They're problems that everyone has.
00:06:26.160 | And by the way, people have lots of problems.
00:06:28.120 | Information, medicine, health, education, and so forth.
00:06:31.200 | Work on those problems.
00:06:32.960 | - Like you said, you're a big fan of the middle class.
00:06:36.800 | - 'Cause there's so many of them.
00:06:37.880 | - There's so many of them.
00:06:38.800 | - By definition.
00:06:40.200 | - So any product, any thing that has a huge impact
00:06:44.400 | and improves their lives is a great business decision
00:06:47.480 | and it's just good for society.
00:06:48.880 | - And there's nothing wrong with starting off
00:06:50.960 | in the high end, as long as you have a plan
00:06:53.680 | to get to the middle class.
00:06:55.440 | There's nothing wrong with starting with a specialized
00:06:57.600 | market in order to learn and to build and to fund things.
00:07:01.000 | So you start with a luxury market
00:07:02.560 | to build a general purpose market.
00:07:04.480 | But if you define yourself as only a narrow market,
00:07:07.520 | someone else can come along with a general purpose market
00:07:10.920 | that can push you to the corner,
00:07:12.320 | can restrict the scale of operation,
00:07:14.260 | can force you to be a lesser impact than you might be.
00:07:17.840 | So it's very important to think in terms of broad businesses
00:07:21.040 | and broad impact, even if you start
00:07:23.440 | in a little corner somewhere.
00:07:24.980 | - So as you look to the '70s, but also in the decades
00:07:29.920 | to come, and you saw computers, did you see them as tools?
00:07:34.880 | Or was there a little element of another entity?
00:07:40.280 | I remember a quote saying, "AI began with our dream
00:07:44.620 | "to create the gods."
00:07:46.180 | Is there a feeling when you wrote that program
00:07:48.660 | that you were creating another entity,
00:07:51.300 | giving life to something?
00:07:52.860 | - I wish I could say otherwise, but I simply found
00:07:55.940 | the technology platforms so exciting,
00:07:58.780 | that's what I was focused on.
00:08:00.500 | I think the majority of the people that I've worked with,
00:08:03.420 | and there are a few exceptions, Steve Jobs being an example,
00:08:06.740 | really saw this as a great technological play.
00:08:10.020 | I think relatively few of the technical people
00:08:12.500 | understood the scale of its impact.
00:08:15.420 | So I used NCP, which is a predecessor to TCP/IP.
00:08:19.620 | It just made sense to connect things.
00:08:21.200 | We didn't think of it in terms of the internet,
00:08:23.780 | and then companies, and then Facebook, and then Twitter,
00:08:27.020 | and then politics, and so forth.
00:08:29.160 | We never did that build.
00:08:30.720 | We didn't have that vision.
00:08:32.820 | And I think most people, it's a rare person
00:08:35.260 | who can see compounding at scale.
00:08:37.980 | Most people can see, if you ask people to predict the future
00:08:40.540 | they'll say, they'll give you an answer
00:08:42.020 | of six to nine months, or 12 months.
00:08:44.460 | 'Cause that's about as far as people can imagine.
00:08:47.460 | But there's an old saying, which actually was attributed
00:08:49.740 | to a professor at MIT a long time ago,
00:08:52.020 | that we overestimate what can be done in one year,
00:08:56.340 | and we underestimate what can be done in a decade.
00:09:00.060 | And there's a great deal of evidence
00:09:02.420 | that these core platforms at hardware and software
00:09:05.540 | take a decade, right?
00:09:07.700 | So think about self-driving cars.
00:09:09.420 | Self-driving cars were thought about in the '90s.
00:09:12.060 | There were projects around them.
00:09:13.300 | The first DARPA Grand Challenge was roughly 2004.
00:09:17.080 | So that's roughly 15 years ago.
00:09:19.460 | And today we have self-driving cars
00:09:21.580 | operating in a city in Arizona, right?
00:09:23.900 | So 15 years, and we still have a ways to go
00:09:26.580 | before they're more generally available.
00:09:28.580 | - So you've spoken about the importance.
00:09:33.780 | You just talked about predicting into the future.
00:09:37.060 | You've spoken about the importance
00:09:38.220 | of thinking five years ahead
00:09:40.700 | and having a plan for those five years.
00:09:42.820 | - Yeah, the way to say it is that
00:09:44.700 | almost everybody has a one-year plan.
00:09:47.500 | Almost no one has a proper five-year plan.
00:09:50.900 | And the key thing to having a five-year plan
00:09:52.740 | is to having a model for what's going to happen
00:09:55.220 | under the underlying platforms.
00:09:56.860 | So here's an example.
00:09:58.140 | Moore's Law as we know it,
00:10:01.100 | the thing that powered improvements in CPUs,
00:10:04.220 | has largely halted in its traditional shrinking mechanism
00:10:07.540 | because the costs have just gotten so high.
00:10:10.300 | It's getting harder and harder.
00:10:12.120 | But there's plenty of algorithmic improvements
00:10:14.540 | and specialized hardware improvements.
00:10:16.540 | So you need to understand the nature of those improvements
00:10:19.620 | and where they'll go in order to understand
00:10:21.900 | how it will change the platform.
00:10:24.300 | In the area of network connectivity,
00:10:26.060 | what are the gains that are gonna be possible in wireless?
00:10:29.400 | It looks like there's an enormous expansion
00:10:33.380 | of wireless connectivity at many different bands.
00:10:36.900 | Historically,
00:10:38.660 | I've always thought
00:10:39.860 | that we were primarily gonna be using fiber,
00:10:42.080 | but now it looks like we're gonna be using fiber
00:10:43.940 | plus very powerful high bandwidth
00:10:46.540 | sort of short distance connectivity
00:10:49.300 | to bridge the last mile.
00:10:51.460 | That's an amazing achievement.
00:10:53.020 | If you know that,
00:10:54.400 | then you're gonna build your systems differently.
00:10:56.860 | By the way, those networks
00:10:57.740 | have different latency properties.
00:10:59.620 | Because they're more symmetric,
00:11:01.580 | the algorithms feel faster for that reason.
00:11:03.880 | - And so when you think about, whether it's fiber
00:11:07.860 | or just technologies in general,
00:11:09.860 | there's this Barbara Wootton quote
00:11:14.180 | that I really like.
00:11:15.780 | "It's from the champions of the impossible
00:11:18.180 | "rather than the slaves of the possible
00:11:20.260 | "that evolution draws its creative force."
00:11:23.220 | So in predicting the next five years,
00:11:25.980 | I'd like to talk about the impossible and the possible.
00:11:29.220 | - Well, and again, one of the great things
00:11:31.280 | about humanity is that we produce dreamers.
00:11:34.140 | - Right.
00:11:34.980 | - Right, we literally have people
00:11:35.900 | who have a vision and a dream.
00:11:37.780 | They are, if you will, disagreeable
00:11:40.100 | in the sense that they disagree
00:11:42.940 | with what the sort of zeitgeist is.
00:11:46.260 | They say there is another way.
00:11:48.020 | They have a belief, they have a vision.
00:11:50.280 | If you look at science,
00:11:51.820 | science is always marked by such people
00:11:55.180 | who went against some conventional wisdom,
00:11:58.380 | collected the knowledge at the time
00:12:00.220 | and assembled it in a way
00:12:01.340 | that produced a powerful platform.
00:12:03.660 | - And you've been amazingly honest,
00:12:08.320 | in an inspiring way,
00:12:09.880 | about things you've been wrong about predicting
00:12:12.200 | and you've obviously been right about a lot of things,
00:12:14.700 | but in this kind of tension,
00:12:19.060 | how do you balance, as a company,
00:12:21.560 | predicting the next five years,
00:12:23.740 | planning for the impossible,
00:12:26.460 | listening to those crazy dreamers,
00:12:28.900 | letting them run away
00:12:31.740 | and make the impossible real, make it happen,
00:12:35.260 | versus, as programmers often think,
00:12:38.900 | slowing things down and saying,
00:12:41.740 | well, this is the rational, this is the possible,
00:12:44.780 | the pragmatic? The dreamer versus the pragmatist.
00:12:49.100 | - So it's helpful to have a model
00:12:51.380 | which encourages a predictable revenue stream
00:12:56.020 | as well as the ability to do new things.
00:12:58.680 | So in Google's case, we're big enough
00:13:00.540 | and well enough managed and so forth
00:13:02.340 | that we have a pretty good sense
00:13:04.080 | of what our revenue will be for the next year or two,
00:13:06.300 | at least for a while.
00:13:07.900 | And so we have enough cash generation
00:13:11.540 | that we can make bets.
00:13:13.340 | And indeed, Google has become Alphabet,
00:13:16.780 | so the corporation is organized around these bets.
00:13:19.500 | And these bets are in areas of fundamental importance
00:13:22.740 | to the world, whether it's artificial intelligence,
00:13:26.720 | medical technology, self-driving cars,
00:13:29.700 | connectivity through balloons, on and on and on.
00:13:33.300 | And there's more coming and more coming.
00:13:35.980 | So one way you could express this
00:13:38.020 | is that the current business is successful enough
00:13:41.500 | that we have the luxury of making bets.
00:13:44.340 | And another one that you could say
00:13:45.940 | is that we have the wisdom of being able to see
00:13:49.140 | that a corporate structure needs to be created
00:13:51.560 | to enhance the likelihood of the success of those bets.
00:13:55.260 | So we essentially turned ourselves
00:13:56.860 | into a conglomerate of bets
00:13:59.540 | and then this underlying corporation, Google,
00:14:02.100 | which is itself innovative.
00:14:04.280 | So in order to pull this off,
00:14:05.900 | you have to have a bunch of belief systems.
00:14:08.060 | And one of them is that you have to have
00:14:09.580 | bottoms up and tops down.
00:14:11.460 | The bottoms up we call 20% time,
00:14:13.560 | and the idea is that people can spend 20% of their time
00:14:15.780 | on whatever they want.
00:14:16.940 | And the top down is that our founders in particular
00:14:19.780 | have a keen eye on technology
00:14:21.820 | and they're reviewing things constantly.
00:14:23.940 | So an example would be they'll hear about an idea
00:14:26.620 | or I'll hear about something and it sounds interesting,
00:14:28.780 | let's go visit them.
00:14:30.460 | And then let's begin to assemble the pieces
00:14:33.120 | to see if that's possible.
00:14:34.860 | And if you do this long enough,
00:14:36.060 | you get pretty good at predicting what's likely to work.
00:14:39.860 | - So that's a beautiful balance that's struck.
00:14:42.060 | Is this something that applies at all scale?
00:14:44.500 | So in the--
00:14:45.620 | - Seems to be.
00:14:46.720 | Sergey, again, 15 years ago,
00:14:53.100 | came up with the concept that 10% of the budget
00:14:56.900 | should be on things that are unrelated.
00:14:59.060 | It was called 70/20/10.
00:15:00.940 | 70% of our time on core business,
00:15:03.580 | 20% on adjacent business, and 10% on other.
00:15:06.840 | And he proved mathematically,
00:15:08.780 | of course he's a brilliant mathematician,
00:15:10.640 | that you needed that 10% to make the sum
00:15:13.940 | of the growth work.
00:15:14.780 | And it turns out he was right.
00:15:16.180 | - So getting into the world of artificial intelligence,
00:15:20.980 | you've talked quite extensively and effectively
00:15:25.420 | about the impact in the near term,
00:15:28.820 | the positive impact of artificial intelligence,
00:15:32.060 | especially machine learning
00:15:34.180 | in medical applications, in education,
00:15:38.620 | and just making information more accessible.
00:15:41.620 | In the AI community, there is a kind of debate.
00:15:45.900 | There's this shroud of uncertainty
00:15:47.740 | as we face this new world
00:15:49.060 | with artificial intelligence in it.
00:15:50.500 | And there's some people, like Elon Musk,
00:15:54.260 | whom you've disagreed with, at least on the degree of emphasis
00:15:57.680 | he places on the existential threat of AI.
00:16:00.740 | So I've spoken with Stuart Russell, Max Tegmark,
00:16:03.380 | who share Elon Musk's view,
00:16:05.380 | and Yoshua Bengio and Steven Pinker, who do not.
00:16:09.180 | And so there's a lot of very smart people
00:16:11.860 | who are thinking about this stuff,
00:16:13.900 | disagreeing, which is really healthy, of course.
00:16:17.200 | So what do you think is the healthiest way
00:16:19.100 | for the AI community,
00:16:22.020 | and really for the general public,
00:16:23.860 | to think about AI and the concern
00:16:26.780 | of the technology being mismanaged in some kind of way?
00:16:32.700 | - So the source of education for the general public
00:16:35.060 | has been robot killer movies.
00:16:37.420 | - Right.
00:16:38.260 | - And Terminator, et cetera.
00:16:40.860 | And the one thing I can assure you we're not building
00:16:44.500 | are those kinds of solutions.
00:16:46.620 | Furthermore, if they were to show up,
00:16:48.420 | someone would notice and unplug them, right?
00:16:51.140 | So as exciting as those movies are,
00:16:53.140 | and they're great movies,
00:16:54.700 | were the killer robots to start,
00:16:57.500 | we would find a way to stop them, right?
00:17:00.460 | So I'm not concerned about that.
00:17:02.860 | And much of this has to do
00:17:05.980 | with the timeframe of conversation.
00:17:08.540 | So you can imagine a situation 100 years from now
00:17:13.300 | when the human brain is fully understood,
00:17:15.920 | and the next generation and next generation
00:17:18.140 | of brilliant MIT scientists have figured all this out,
00:17:20.940 | we're gonna have a large number of ethics questions, right?
00:17:25.140 | Around science and thinking and robots and computers
00:17:28.060 | and so forth and so on.
00:17:29.700 | So it depends on the question of the timeframe.
00:17:32.260 | In the next five to 10 years,
00:17:34.780 | we're not facing those questions.
00:17:37.220 | What we're facing in the next five to 10 years
00:17:39.100 | is how do we spread this disruptive technology
00:17:42.140 | as broadly as possible to gain the maximum benefit of it?
00:17:46.500 | The primary benefit should be in healthcare
00:17:48.980 | and in education.
00:17:50.040 | Healthcare, because it's obvious,
00:17:52.300 | we're all the same,
00:17:53.980 | even though we somehow believe we're not.
00:17:55.780 | As a medical matter,
00:17:57.340 | the fact that we have big data about our health
00:17:59.160 | will save lives, allow us to
00:18:01.500 | deal with skin cancer and other cancers,
00:18:04.220 | ophthalmological problems,
00:18:05.480 | there's people working on psychological diseases
00:18:08.380 | and so forth using these techniques.
00:18:10.220 | I can go on and on.
00:18:11.660 | The promise of AI in medicine is extraordinary.
00:18:15.780 | There are many, many companies and startups
00:18:17.940 | and funds and solutions
00:18:19.460 | and we will all live much better for that.
00:18:22.100 | The same argument in education.
00:18:25.580 | Can you imagine that for each generation of child
00:18:28.500 | and even adult, you have a tutor educator that's AI based,
00:18:33.060 | that's not a human but is properly trained,
00:18:35.900 | that helps you get smarter,
00:18:37.140 | helps you address your language difficulties
00:18:39.280 | or your math difficulties or what have you.
00:18:41.340 | Why don't we focus on those two?
00:18:43.300 | The gains societally of making humans smarter
00:18:46.600 | and healthier are enormous, right?
00:18:48.940 | And those translate for decades and decades
00:18:51.460 | and we'll all benefit from them.
00:18:53.900 | There are people who are working on AI safety,
00:18:56.300 | which is the issue that you're describing
00:18:58.100 | and there are conversations in the community
00:19:00.680 | that should there be such problems,
00:19:02.540 | what should the rules be like?
00:19:04.400 | Google, for example, has announced its policies
00:19:07.540 | with respect to AI safety, which I certainly support
00:19:10.180 | and I think most everybody would support
00:19:12.260 | and they make sense, right?
00:19:14.180 | So it helps guide the research.
00:19:16.340 | But the killer robots are not arriving this year
00:19:19.580 | and they're not even being built.
00:19:21.220 | - And on that line of thinking about timescales,
00:19:26.720 | in this topic or other topics,
00:19:30.460 | have you found it useful on the business side
00:19:34.600 | or the intellectual side to think beyond five, 10 years,
00:19:37.540 | to think 50 years out?
00:19:39.380 | Has it ever been useful or productive?
00:19:41.980 | - In our industry, there are essentially no examples
00:19:45.180 | of 50 year predictions that have been correct.
00:19:47.480 | Let's review AI, right?
00:19:50.420 | AI, which was largely invented here at MIT
00:19:53.100 | and a couple of other universities in 1956, 1957, 1958,
00:19:57.820 | the original claims were a decade or two.
00:20:01.360 | And when I was a PhD student, I studied AI a bit
00:20:05.220 | and while I was looking at it, it entered
00:20:07.700 | a period which is known as AI winter,
00:20:10.400 | which went on for about 30 years,
00:20:12.780 | which is a whole generation of scientists
00:20:15.360 | and a whole group of people
00:20:16.680 | who didn't make a lot of progress
00:20:18.420 | because the algorithms had not improved
00:20:20.180 | and the computers had not improved.
00:20:22.080 | It took some brilliant mathematicians,
00:20:23.860 | starting with a fellow named Geoff Hinton
00:20:25.420 | at Toronto and Montreal,
00:20:27.300 | who basically invented this deep learning model,
00:20:30.820 | which empowers us today.
00:20:32.580 | The seminal work there was 20 years ago.
00:20:36.100 | And in the last 10 years, it's become popularized.
00:20:39.980 | So think about the timeframes for that level of discovery.
00:20:43.880 | It's very hard to predict.
00:20:45.920 | Many people think that we'll be flying around
00:20:47.740 | in the equivalent of flying cars, who knows?
00:20:51.200 | My own view, if I wanna go out on a limb,
00:20:54.460 | is to say that we know a couple of things
00:20:56.860 | about 50 years from now.
00:20:57.980 | We know that there'll be more people alive.
00:21:00.460 | We know that we'll have to have platforms
00:21:02.180 | that are more sustainable because the earth is limited
00:21:05.700 | in the ways we all know.
00:21:07.380 | And that the kind of platforms that are gonna get built
00:21:10.500 | will be consistent with the principles that I've described.
00:21:13.020 | They will be much more empowering of individuals.
00:21:15.740 | They'll be much more sensitive to the ecology
00:21:17.740 | 'cause they have to be.
00:21:19.020 | They just have to be.
00:21:20.540 | I also think that humans are gonna be a great deal smarter.
00:21:23.780 | And I think they're gonna be a lot smarter
00:21:25.080 | because of the tools that I've discussed with you.
00:21:27.740 | And of course, people will live longer.
00:21:29.200 | Life extension is continuing apace.
00:21:32.180 | A baby born today has a reasonable chance
00:21:34.620 | of living to 100, right?
00:21:36.060 | Which is pretty exciting.
00:21:37.140 | It's well past the 21st century.
00:21:38.580 | So we better take care of them.
00:21:40.620 | - And you mentioned an interesting statistic
00:21:42.580 | that some very large percentage,
00:21:44.660 | 60 or 70%, of people may live in cities.
00:21:47.340 | - Today, more than half the world lives in cities.
00:21:50.460 | And one of the great stories of humanity
00:21:53.740 | in the last 20 years has been the rural to urban migration.
00:21:57.460 | This has occurred in the United States.
00:21:59.180 | It's occurred in Europe.
00:22:01.100 | It's occurring in Asia and it's occurring in Africa.
00:22:04.660 | When people move to cities, the cities get more crowded,
00:22:07.780 | but believe it or not, their health gets better.
00:22:10.500 | Their productivity gets better.
00:22:12.280 | Their IQ and educational capabilities improve.
00:22:15.440 | So it's good news that people are moving to cities,
00:22:18.500 | but we have to make them livable and safe.
00:22:20.820 | - So, first of all, you are one yourself,
00:22:25.860 | but you've also worked with some of the greatest leaders
00:22:28.320 | in the history of tech.
00:22:29.960 | What insights do you draw from the difference
00:22:32.980 | in leadership styles of yourself, Steve Jobs,
00:22:37.020 | Elon Musk, Larry Page, and now the new CEO,
00:22:40.220 | Sundar Pichai, and others, ranging from,
00:22:44.060 | I would say, the calm sages to the mad geniuses?
00:22:49.060 | - One of the things that I learned as a young executive
00:22:52.140 | is that there's no single formula for leadership.
00:22:54.820 | They try to teach one, but that's not how it really works.
00:22:59.960 | There are people who just understand what they need to do
00:23:02.660 | and they need to do it quickly.
00:23:04.260 | Those people are often entrepreneurs.
00:23:06.760 | They just know and they move fast.
00:23:09.020 | There are other people who are systems thinkers and planners.
00:23:11.580 | That's more who I am, somewhat more conservative,
00:23:14.660 | more thorough in execution, a little bit more risk averse.
00:23:18.660 | There's also people who are sort of slightly insane, right?
00:23:22.160 | In the sense that they are emphatic and charismatic
00:23:26.080 | and they feel it and they drive it and so forth.
00:23:28.940 | There's no single formula to success.
00:23:31.380 | There is one thing that unifies all of the people
00:23:33.620 | that you named, which is very high intelligence, right?
00:23:36.920 | At the end of the day,
00:23:39.060 | the thing that characterizes all of them
00:23:40.820 | is that they saw the world quicker, faster,
00:23:43.660 | they processed information faster.
00:23:45.740 | They didn't necessarily make
00:23:46.580 | the right decisions all the time,
00:23:48.240 | but they were on top of it.
00:23:49.980 | And the other thing that's interesting
00:23:51.220 | about all those people is they all started young.
00:23:54.180 | So think about Steve Jobs starting Apple
00:23:56.980 | roughly at 18 or 19.
00:23:58.420 | Think about Bill Gates starting at roughly 20 or 21.
00:24:01.660 | Think about Mark Zuckerberg,
00:24:03.700 | a good example, at 19 or 20.
00:24:06.940 | By the time they were 30,
00:24:10.660 | they had 10 years of experience
00:24:13.740 | of dealing with people and products and shipments
00:24:16.980 | and the press and business and so forth.
00:24:19.780 | It's incredible how much experience they had
00:24:22.780 | compared to the rest of us who were busy getting our PhDs.
00:24:25.260 | - Yes, exactly.
00:24:26.100 | - So we should celebrate these people
00:24:28.500 | because they've just had more life experience, right?
00:24:32.260 | And that helps inform the judgment.
00:24:34.420 | At the end of the day,
00:24:36.160 | when you're at the top of these organizations,
00:24:40.100 | all the easy questions have been dealt with, right?
00:24:43.580 | How should we design the buildings?
00:24:45.700 | Where should we put the colors on our product?
00:24:48.260 | What should the box look like, right?
00:24:51.380 | The problems, that's why it's so interesting
00:24:53.420 | to be in these rooms.
00:24:54.540 | The problems that they face, right,
00:24:56.460 | in terms of the way they operate,
00:24:58.380 | the way they deal with their employees,
00:25:00.100 | their customers, their innovation,
00:25:01.860 | are profoundly challenging.
00:25:03.940 | Each of the companies
00:25:06.460 | is demonstrably different culturally, right?
00:25:09.340 | They are not, in fact, cut from the same cloth.
00:25:11.740 | They behave differently based on input.
00:25:14.220 | Their internal cultures are different.
00:25:15.860 | Their compensation schemes are different.
00:25:17.500 | Their values are different.
00:25:19.380 | So there's proof that diversity works.
00:25:22.580 | (inhales deeply)
00:25:24.740 | - So, when faced with a tough decision,
00:25:28.660 | in need of advice,
00:25:31.900 | it's been said that the best thing one can do
00:25:34.180 | is to find the best person in the world
00:25:36.780 | who can give that advice
00:25:39.020 | and find a way to be in a room with them,
00:25:41.980 | one-on-one and ask.
00:25:43.620 | So here we are,
00:25:45.980 | and let me ask in a long-winded way,
00:25:48.020 | I wrote this down.
00:25:50.740 | In 1998, there were many good search engines,
00:25:53.380 | Lycos, Excite, AltaVista, Infoseek,
00:25:56.740 | Ask Jeeves, maybe,
00:25:58.300 | Yahoo, even.
00:26:00.300 | So Google stepped in and disrupted everything.
00:26:04.660 | They disrupted the nature of search,
00:26:06.540 | the nature of our access to information,
00:26:08.820 | the way we discover new knowledge.
00:26:10.620 | So now, it's 2018, actually 20 years later.
00:26:15.980 | There are many good personal AI assistants,
00:26:18.740 | including, of course, the best from Google.
00:26:21.020 | So you've spoken about the impact in medicine and education
00:26:25.540 | that such an AI assistant could bring.
00:26:28.660 | So we arrive at this question.
00:26:30.380 | So it's a personal one for me,
00:26:32.220 | but I hope my situation represents
00:26:35.020 | that of many other, as we said, dreamers
00:26:39.220 | and the crazy engineers.
00:26:41.100 | So my whole life,
00:26:43.300 | I've dreamed of creating such an AI assistant.
00:26:46.380 | So every step I've taken has been towards that goal.
00:26:49.020 | Now I'm a research scientist
00:26:50.500 | in human-centered AI here at MIT.
00:26:52.900 | So the next step for me, as I sit here,
00:26:55.460 | facing my passion,
00:26:57.020 | is to do what Larry and Sergey did in '98:
00:27:01.100 | the simple startup.
00:27:04.780 | And so here's my simple question.
00:27:06.900 | Given the low odds of success,
00:27:08.620 | the timing and luck required,
00:27:10.700 | the countless other factors
00:27:11.900 | that can't be controlled or predicted,
00:27:13.780 | just all the things that Larry and Sergey faced,
00:27:16.500 | is there some calculation,
00:27:17.980 | some strategy to follow in this step,
00:27:21.620 | or do you simply follow the passion
00:27:23.740 | just because there's no other choice?
00:27:25.580 | - I think the people who are in universities
00:27:29.700 | are always trying to study
00:27:31.900 | the extraordinarily chaotic nature
00:27:34.140 | of innovation and entrepreneurship.
00:27:37.300 | My answer is that they didn't have that conversation.
00:27:41.220 | They just did it.
00:27:42.860 | They sensed a moment when, in the case of Google,
00:27:47.260 | there was all of this data that needed to be organized,
00:27:49.740 | and they had a better algorithm.
00:27:51.340 | They had invented a better way.
00:27:53.820 | So today, with human-centered AI,
00:27:56.340 | which is your area of research,
00:27:58.100 | there must be new approaches.
00:28:00.900 | It's such a big field.
00:28:02.500 | There must be new approaches,
00:28:04.900 | different from what we and others are doing.
00:28:07.260 | There must be startups to fund.
00:28:09.580 | There must be research projects to try.
00:28:11.940 | There must be graduate students to work on new approaches.
00:28:15.020 | Here at MIT, there are people who are looking at learning
00:28:18.180 | from the standpoint of looking at child learning, right?
00:28:20.580 | How do children learn starting at age one?
00:28:22.340 | - Josh Tenenbaum and others.
00:28:23.540 | - And the work is fantastic.
00:28:25.340 | Those approaches are different
00:28:27.180 | from the approach that most people are taking.
00:28:29.780 | Perhaps that's a bet that you should make,
00:28:31.940 | or perhaps there's another one.
00:28:33.820 | But at the end of the day,
00:28:35.860 | the successful entrepreneurs
00:28:37.940 | are not as crazy as they sound.
00:28:40.100 | They see an opportunity based on what's happened.
00:28:43.060 | Let's use Uber as an example.
00:28:45.300 | As Travis tells the story,
00:28:46.700 | he and his co-founder were sitting in Paris,
00:28:48.980 | and they had this idea 'cause they couldn't get a cab.
00:28:52.020 | And they said, "We have smartphones,
00:28:54.300 | "and the rest is history."
00:28:56.660 | So what's the equivalent of that Travis-at-the-Eiffel-Tower,
00:29:00.980 | where-is-a-cab moment
00:29:03.100 | that you could, as an entrepreneur, take advantage of,
00:29:05.940 | whether it's in human-centered AI or something else?
00:29:08.500 | That's the next great startup.
00:29:10.100 | - And the psychology of that moment.
00:29:13.660 | So when Sergey and Larry talk about it,
00:29:16.100 | and I've listened to a few interviews, it's very nonchalant.
00:29:20.180 | Well, here's the very fascinating web data,
00:29:23.860 | and here's an algorithm we have for it.
00:29:27.460 | You know, we just kind of want to play around with that data,
00:29:29.420 | and it seems like that's a really nice way
00:29:31.060 | to organize this data.
00:29:32.340 | - Well, I should say what happened, remember,
00:29:35.580 | is that they were graduate students at Stanford,
00:29:38.140 | and they thought this was interesting,
00:29:39.340 | so they built a search engine,
00:29:40.580 | and they kept it in their room.
00:29:42.080 | And they had to get power from the room next door
00:29:46.340 | 'cause they were using too much power in their room,
00:29:48.040 | so they ran an extension cord over, right?
00:29:51.500 | And then they went and they found a house,
00:29:53.520 | and they had Google World headquarters of five people,
00:29:56.500 | right, to start the company,
00:29:57.540 | and they raised $100,000 from Andy Bechtolsheim,
00:30:00.460 | who was the Sun founder,
00:30:02.220 | and from Dave Cheriton and a few others.
00:30:04.480 | The point is, their beginnings were very simple,
00:30:08.220 | but they were based on a powerful insight.
00:30:10.480 | That is a replicable model for any startup.
00:30:14.880 | It has to be a powerful insight,
00:30:16.500 | the beginnings are simple,
00:30:17.620 | and there has to be an innovation.
00:30:19.840 | In Larry and Sergey's case, it was PageRank,
00:30:22.800 | which was a brilliant idea,
00:30:23.960 | one of the most cited papers in the world today.
00:30:26.700 | What's the next one?
00:30:27.800 | - So, you're one of, if I may say,
00:30:33.520 | the richest people in the world,
00:30:35.000 | and yet it seems that money is simply a side effect
00:30:38.680 | of your passions and not an inherent goal.
00:30:41.940 | But you're a fascinating person to ask.
00:30:47.960 | So much of our society at the individual level
00:30:51.540 | and at the company level and as nations
00:30:55.020 | is driven by the desire for wealth.
00:30:57.380 | What do you think about this drive,
00:31:01.100 | and what have you learned about,
00:31:03.120 | if I may romanticize the notion,
00:31:05.040 | the meaning of life, having achieved success
00:31:08.120 | on so many dimensions?
00:31:10.400 | - There have been many studies of human happiness,
00:31:13.560 | and above some threshold,
00:31:16.320 | which is typically relatively low for this conversation,
00:31:19.500 | there's no difference in happiness based on money.
00:31:23.560 | The happiness is correlated with meaning and purpose,
00:31:27.040 | a sense of family, a sense of impact.
00:31:30.000 | So if you organize your life,
00:31:31.920 | assuming you have enough to get around
00:31:33.640 | and have a nice home and so forth,
00:31:35.880 | you'll be far happier if you figure out
00:31:38.320 | what you care about and work on that.
00:31:41.680 | It's often being in service to others.
00:31:44.560 | There's a great deal of evidence
00:31:45.760 | that people are happiest when they're serving others
00:31:47.960 | and not themselves.
00:31:49.560 | This goes directly against the sort of press-induced
00:31:53.680 | excitement about powerful and wealthy leaders of one kind or another,
00:31:59.240 | and indeed, these are consequential people.
00:32:01.720 | But if you are in a situation
00:32:03.840 | where you've been very fortunate, as I have,
00:32:06.120 | you also have to take that as a responsibility,
00:32:09.040 | and you have to basically work both to educate others
00:32:12.160 | and give them that opportunity,
00:32:13.600 | but also use that wealth to advance human society.
00:32:16.720 | In my case, I'm particularly interested
00:32:18.320 | in using the tools of artificial intelligence
00:32:20.560 | and machine learning to make society better.
00:32:22.880 | I've mentioned education, I've mentioned inequality
00:32:26.040 | and middle class and things like this,
00:32:28.060 | all of which are a passion of mine.
00:32:30.120 | It doesn't matter what you do,
00:32:31.840 | it matters that you believe in it,
00:32:33.700 | that it's important to you,
00:32:35.360 | and that your life will be far more satisfying
00:32:38.080 | if you spend your life doing that.
00:32:40.520 | - I think there's no better place to end
00:32:43.440 | than a discussion of the meaning of life.
00:32:45.240 | Eric, thank you so much. - Thank you very much, Alex.
00:32:47.680 | (audience applauding)
00:32:50.840 | (gentle music)