Eric Schmidt: Google | Lex Fridman Podcast #8
Chapters
0:00 Introduction
0:36 First moment when you fell in love with technology
1:08 Discovering the beauty of math
3:35 Creating Lex
5:32 Path to generality
6:32 Middle class
7:25 Computers as tools
9:30 Predicting the future
11:04 The champions of the impossible
12:03 Planning for the impossible
15:17 Artificial Intelligence
19:21 Thinking beyond 50 years
22:21 Leadership styles
25:23 Personal AI assistant
00:00:00.000 |
The following is a conversation with Eric Schmidt. 00:00:06.780 |
guiding the company through an incredible period of growth 00:00:15.300 |
in the era of the internet and a powerful voice 00:00:19.360 |
for the promise of technology in our society. 00:00:31.920 |
And now, here's my conversation with Eric Schmidt. 00:00:51.360 |
we would go out to the cow pasture behind my house, 00:01:12.040 |
of science and math that maybe rockets would bring, 00:01:17.480 |
You've mentioned yesterday that eighth grade math 00:01:20.160 |
is where the journey through mathematical universe 00:01:26.960 |
There's a professor of math at Berkeley, Edward Frenkel. 00:01:35.480 |
He wrote a book I recommend to everybody called "Love and Math," 00:01:41.600 |
He says that if painting was taught like math, 00:01:46.600 |
then students would be asked to paint a fence, 00:01:49.660 |
which is his analogy of essentially how math is taught. 00:01:52.360 |
You never get a chance to discover the beauty 00:01:55.920 |
of the art of painting or the beauty of the art of math. 00:01:59.320 |
So how, when, and where did you discover that beauty? 00:02:03.900 |
- I think what happens with people like myself 00:02:11.400 |
and all of a sudden you discover that you can use that 00:02:22.080 |
that somewhere when they were in high school or in college, 00:02:46.200 |
but you could think of it as open source contributions. 00:02:51.800 |
I'm sure that I would aspire as a computer scientist 00:02:54.720 |
to make a contribution like the open source heroes 00:03:00.400 |
and I'd be trying and learning and making mistakes 00:03:06.680 |
The repository that GitHub represents, 00:03:17.160 |
And one of the lessons that I learned at Google 00:03:28.960 |
building parts of programs, building new ideas, 00:03:38.360 |
the inspiring moment where there was nothing, 00:03:40.680 |
and then you created something through programming, 00:03:44.720 |
So in 1975, I think, you created a program called Lex, 00:03:49.180 |
which I especially like because my name is Lex. 00:03:54.640 |
that established a reputation that's long-lasting, 00:04:02.840 |
But more seriously, in that time, in the '70s, 00:04:07.840 |
as an engineer, personal computers were being born. 00:04:14.160 |
the '80s, '90s, and the aughts of where computers would go? 00:04:18.920 |
- I'm sure I could not and would not have gotten it right. 00:04:25.440 |
of many, many people who saw it clearer than I did. 00:04:29.040 |
With Lex, I worked with a fellow named Michael Lesk, 00:04:32.520 |
who was my supervisor, and he essentially helped me 00:04:35.640 |
architect and deliver a system that's still in use today. 00:04:39.180 |
After that, I worked at Xerox Palo Alto Research Center, 00:04:46.080 |
of the modern personal computer, or Macintosh, and so forth. 00:04:50.200 |
And the Altos were very rare, and I had to drive an hour 00:04:53.600 |
from Berkeley to go use them, but I made a point 00:04:56.400 |
of skipping classes and doing whatever it took 00:04:59.560 |
to have access to this extraordinary achievement. 00:05:12.840 |
And so since then, and I have learned the benefit 00:05:17.480 |
which are going to scale to platforms, right? 00:05:27.440 |
people really have needs, they really will use 00:05:29.360 |
these platforms, and you can build big businesses 00:05:32.600 |
- So it's interesting, so when you see a piece 00:05:34.200 |
of technology, now you think, what will this technology 00:05:36.800 |
look like when it's in the hands of a billion people? 00:05:39.960 |
So an example would be that the market is so competitive now 00:05:45.000 |
that if you can't figure out a way for something 00:05:53.160 |
because something else will become the general platform 00:06:01.080 |
or a specialized service with relatively few users. 00:06:06.560 |
to general platform use, it's a path to broad applicability. 00:06:10.080 |
Now there are plenty of good businesses that are tiny, 00:06:18.520 |
you have to look for things which are of common value, 00:06:26.160 |
And by the way, people have lots of problems. 00:06:28.120 |
Information, medicine, health, education, and so forth. 00:06:32.960 |
- Like you said, you're a big fan of the middle class. 00:06:40.200 |
- So any product, any thing that has a huge impact 00:06:44.400 |
and improves their lives is a great business decision 00:06:48.880 |
- And there's nothing wrong with starting off 00:06:55.440 |
There's nothing wrong with starting with a specialized 00:06:57.600 |
market in order to learn and to build and to fund things. 00:07:04.480 |
But if you define yourself as only a narrow market, 00:07:07.520 |
someone else can come along with a general purpose market 00:07:14.260 |
can force you to be a lesser impact than you might be. 00:07:17.840 |
So it's very important to think in terms of broad businesses 00:07:24.980 |
- So as you look to the '70s, but also in the decades 00:07:29.920 |
to come, and you saw computers, did you see them as tools? 00:07:34.880 |
Or was there a little element of another entity? 00:07:40.280 |
I remember a quote saying, "AI began with our dream 00:07:46.180 |
Is there a feeling when you wrote that program 00:07:52.860 |
- I wish I could say otherwise, but I simply found 00:08:00.500 |
I think the majority of the people that I've worked with, 00:08:03.420 |
and there are a few exceptions, Steve Jobs being an example, 00:08:06.740 |
really saw this as a great technological play. 00:08:10.020 |
I think relatively few of the technical people 00:08:15.420 |
So I used NCP, which is a predecessor to TCP/IP. 00:08:21.200 |
We didn't think of it in terms of the internet, 00:08:23.780 |
and then companies, and then Facebook, and then Twitter, 00:08:37.980 |
Most people can see, if you ask people to predict the future 00:08:44.460 |
'Cause that's about as far as people can imagine. 00:08:47.460 |
But there's an old saying, which actually was attributed 00:08:52.020 |
that we overestimate what can be done in one year, 00:08:56.340 |
and we underestimate what can be done in a decade. 00:09:02.420 |
that these core platforms at hardware and software 00:09:09.420 |
Self-driving cars were thought about in the '90s. 00:09:13.300 |
The first DARPA Grand Challenge was roughly 2004. 00:09:33.780 |
You just talked about predicting into the future. 00:09:52.740 |
is to having a model for what's going to happen 00:10:04.220 |
has largely halted in its traditional shrinking mechanism 00:10:12.120 |
But there's plenty of algorithmic improvements 00:10:16.540 |
So you need to understand the nature of those improvements 00:10:26.060 |
what are the gains that are gonna be possible in wireless? 00:10:33.380 |
of wireless connectivity at many different bands. 00:10:42.080 |
but now it looks like we're gonna be using fiber 00:10:54.400 |
then you're gonna build your systems differently. 00:11:03.880 |
- And so when you think about whether it's a fiber 00:11:25.980 |
I'd like to talk about the impossible and the possible. 00:11:42.940 |
they disagree with what the sort of zeitgeist is. 00:12:09.880 |
about things you've been wrong about predicting 00:12:12.200 |
and you've obviously been right about a lot of things, 00:12:31.740 |
and make the impossible real, make it happen, 00:12:35.260 |
and slow, that's how programmers often think, 00:12:41.740 |
well, this is the rational, this is the possible, 00:12:44.780 |
the pragmatic, the dreamer versus the pragmatist. 00:12:51.380 |
which encourages a predictable revenue stream 00:13:04.080 |
of what our revenue will be for the next year or two, 00:13:16.780 |
so the corporation is organized around these bets. 00:13:19.500 |
And these bets are in areas of fundamental importance 00:13:22.740 |
to the world, whether it's artificial intelligence, 00:13:29.700 |
connectivity through balloons, on and on and on. 00:13:38.020 |
is that the current business is successful enough 00:13:45.940 |
is that we have the wisdom of being able to see 00:13:49.140 |
that a corporate structure needs to be created 00:13:51.560 |
to enhance the likelihood of the success of those bets. 00:13:59.540 |
and then this underlying corporation, Google, 00:14:13.560 |
and the idea is that people can spend 20% of the time 00:14:16.940 |
And the top down is that our founders in particular 00:14:23.940 |
So an example would be they'll hear about an idea 00:14:26.620 |
or I'll hear about something and it sounds interesting, 00:14:36.060 |
you get pretty good at predicting what's likely to work. 00:14:39.860 |
- So that's a beautiful balance that's struck. 00:14:53.100 |
came up with a concept called 10% of the budget 00:15:16.180 |
- So getting into the world of artificial intelligence, 00:15:20.980 |
you've talked quite extensively and effectively 00:15:28.820 |
the positive impact of artificial intelligence, 00:15:32.060 |
whether it's machine, especially machine learning 00:15:41.620 |
In the AI community, there is a kind of debate. 00:15:54.260 |
you've disagreed on, at least on the degree of emphasis 00:16:00.740 |
So I've spoken with Stuart Russell, Max Tegmark, 00:16:05.380 |
and Yoshua Bengio, Steven Pinker, who do not. 00:16:05.380 |
disagreeing, which is really healthy, of course. 00:16:26.780 |
of the technology being mismanaged in some kind of way? 00:16:32.700 |
- So the source of education for the general public 00:16:40.860 |
And the one thing I can assure you we're not building 00:17:08.540 |
So you can imagine a situation 100 years from now 00:17:18.140 |
of brilliant MIT scientists have figured all this out, 00:17:20.940 |
we're gonna have a large number of ethics questions, right? 00:17:25.140 |
Around science and thinking and robots and computers 00:17:29.700 |
So it depends on the question of the timeframe. 00:17:37.220 |
What we're facing in the next five to 10 years 00:17:39.100 |
is how do we spread this disruptive technology 00:17:42.140 |
as broadly as possible to gain the maximum benefit of it? 00:17:57.340 |
the fact that we have big data about our health 00:18:05.480 |
there's people working on psychological diseases 00:18:11.660 |
The promise of AI in medicine is extraordinary. 00:18:25.580 |
Can you imagine that for each generation of child 00:18:28.500 |
and even adult, you have a tutor educator that's AI based, 00:18:43.300 |
The gains societally of making humans smarter 00:18:53.900 |
There are people who are working on AI safety, 00:19:04.400 |
Google, for example, has announced its policies 00:19:07.540 |
with respect to AI safety, which I certainly support 00:19:16.340 |
But the killer robots are not arriving this year 00:19:21.220 |
- And on that line of thinking, you said the time scale, 00:19:30.460 |
have you found it useful on the business side 00:19:34.600 |
or the intellectual side to think beyond five, 10 years, 00:19:41.980 |
- In our industry, there are essentially no examples 00:19:45.180 |
of 50 year predictions that have been correct. 00:19:53.100 |
and a couple of other universities in 1956, 1957, 1958, 00:19:53.100 |
And when I was a PhD student, I studied AI a bit 00:20:27.300 |
who basically invented this deep learning model, 00:20:32.580 |
Those, the seminal work there was 20 years ago. 00:20:36.100 |
And in the last 10 years, it's become popularized. 00:20:39.980 |
So think about the timeframes for that level of discovery. 00:20:45.920 |
Many people think that we'll be flying around 00:21:02.180 |
that are more sustainable because the earth is limited 00:21:07.380 |
And that the kind of platforms that are gonna get built 00:21:10.500 |
will be consistent with the principles that I've described. 00:21:13.020 |
They will be much more empowering of individuals. 00:21:15.740 |
They'll be much more sensitive to the ecology 00:21:20.540 |
I also think that humans are gonna be a great deal smarter. 00:21:25.080 |
because of the tools that I've discussed with you. 00:21:47.340 |
- Today, more than half the world lives in cities. 00:21:53.740 |
in the last 20 years has been the rural to urban migration. 00:22:01.100 |
It's occurring in Asia and it's occurring in Africa. 00:22:04.660 |
When people move to cities, the cities get more crowded, 00:22:07.780 |
but believe it or not, their health gets better. 00:22:12.280 |
Their IQ and educational capabilities improve. 00:22:15.440 |
So it's good news that people are moving to cities, 00:22:25.860 |
but you've also worked with some of the greatest leaders 00:22:29.960 |
What insights do you draw from the difference 00:22:32.980 |
in leadership styles of yourself, Steve Jobs, 00:22:49.060 |
- One of the things that I learned as a young executive 00:22:52.140 |
is that there's no single formula for leadership. 00:22:54.820 |
They try to teach one, but that's not how it really works. 00:22:59.960 |
There are people who just understand what they need to do 00:23:09.020 |
There are other people who are systems thinkers and planners. 00:23:11.580 |
That's more who I am, somewhat more conservative, 00:23:14.660 |
more thorough in execution, a little bit more risk averse. 00:23:18.660 |
There's also people who are sort of slightly insane, right? 00:23:22.160 |
In the sense that they are emphatic and charismatic 00:23:26.080 |
and they feel it and they drive it and so forth. 00:23:31.380 |
There is one thing that unifies all of the people 00:23:33.620 |
that you named, which is very high intelligence, right? 00:23:51.220 |
about all those people is they all started young. 00:23:58.420 |
Think about Bill Gates starting at roughly 20, 21. 00:24:10.660 |
At 30 years old, they had 10 years of experience 00:24:13.740 |
of dealing with people and products and shipments 00:24:22.780 |
compared to the rest of us who were busy getting our PhDs. 00:24:28.500 |
because they've just had more life experience, right? 00:24:36.160 |
when you're at the top of these organizations, 00:24:40.100 |
all the easy questions have been dealt with, right? 00:24:45.700 |
Where should we put the colors on our product? 00:25:31.900 |
it's been said that the best thing one can do 00:25:50.740 |
In 1998, there were many good search engines, 00:26:00.300 |
So Google stepped in and disrupted everything. 00:26:25.540 |
the impact of such an AI assistant could bring. 00:26:43.300 |
I've dreamed of creating such an AI assistant. 00:26:46.380 |
So every step I've taken has been towards that goal. 00:27:13.780 |
just all the things that Larry and Sergey faced, 00:27:37.300 |
My answer is that they didn't have that conversation. 00:27:42.860 |
They sensed a moment when, in the case of Google, 00:27:47.260 |
there was all of this data that needed to be organized, 00:28:11.940 |
There must be graduate students to work on new approaches. 00:28:15.020 |
Here at MIT, there are people who are looking at learning 00:28:18.180 |
from the standpoint of looking at child learning, right? 00:28:27.180 |
from the approach that most people are taking. 00:28:40.100 |
They see an opportunity based on what's happened. 00:28:48.980 |
and they had this idea 'cause they couldn't get a cab. 00:28:56.660 |
So what's the equivalent of that Travis-at-the-Eiffel-Tower moment, 00:29:03.100 |
that you could, as an entrepreneur, take advantage of, 00:29:05.940 |
whether it's in human-centered AI or something else? 00:29:16.100 |
and listen to a few interviews, it's very nonchalant. 00:29:27.460 |
You know, we just kind of want to play around with that data, 00:29:32.340 |
- Well, I should say what happened, remember, 00:29:35.580 |
is that they were graduate students at Stanford, 00:29:42.080 |
And they had to get power from the room next door 00:29:46.340 |
'cause they were using too much power in their room, 00:29:53.520 |
and they had Google World headquarters of five people, 00:29:57.540 |
and they raised $100,000 from Andy Bechtolsheim, 00:30:04.480 |
The point is, their beginnings were very simple, 00:30:23.960 |
one of the most cited papers in the world today. 00:30:35.000 |
and yet it seems that money is simply a side effect 00:30:47.960 |
So much of our society at the individual level 00:31:10.400 |
- There have been many studies of human happiness, 00:31:16.320 |
which is typically relatively low for this conversation, 00:31:19.500 |
there's no difference in happiness about money. 00:31:23.560 |
The happiness is correlated with meaning and purpose, 00:31:45.760 |
that people are happiest when they're serving others 00:31:49.560 |
This goes directly against the sort of press-induced 00:31:53.680 |
excitement about powerful and wealthy leaders of one kind, 00:32:06.120 |
you also have to take that as a responsibility, 00:32:09.040 |
and you have to basically work both to educate others 00:32:13.600 |
but also use that wealth to advance human society. 00:32:18.320 |
in using the tools of artificial intelligence 00:32:22.880 |
I've mentioned education, I've mentioned inequality 00:32:35.360 |
and that your life will be far more satisfying 00:32:45.240 |
Eric, thank you so much. - Thank you very much, Lex.