Kai-Fu Lee: AI Superpowers - China and Silicon Valley | Lex Fridman Podcast #27
Chapters
0:00 Introduction
1:25 What defines the Chinese soul
4:28 Chinese vs American AI engineers
6:40 Kai-Fu's intuition
8:35 Tesla's approach
20:07 Evil Mantra
24:55 Chinese copycats
27:44 Silicon Valley's baggage
33:36 Guiding Funds
37:34 Infrastructure Augmentation
40:25 Machine Intelligence
42:09 White Collar Jobs
45:40 Universal Basic Income
48:53 Difficulties with AI
55:20 Retraining
57:56 Double-edged sword
59:19 Arms race metaphor
00:00:00.000 |
The following is a conversation with Kai-Fu Lee. 00:00:02.960 |
He's the chairman and CEO of Sinovation Ventures 00:00:06.560 |
that manages a $2 billion dual currency investment fund 00:00:10.600 |
with a focus on developing the next generation 00:00:24.160 |
of the artificial intelligence leaders in China, 00:00:33.900 |
He was named one of the 100 most influential people 00:00:40.640 |
He's the author of seven bestselling books in Chinese, 00:00:43.840 |
and most recently, the New York Times bestseller 00:00:46.680 |
called "AI Superpowers: China, Silicon Valley, 00:01:00.040 |
And so he has a unique perspective on global innovation 00:01:03.480 |
and the future of AI that I think is important 00:01:11.880 |
If you enjoy it, subscribe on YouTube and iTunes, 00:01:16.800 |
or simply connect with me on Twitter @lexfridman. 00:01:20.960 |
And now here's my conversation with Kai-Fu Lee. 00:01:25.800 |
- I immigrated from Russia to the US when I was 13. 00:01:38.280 |
a spirit that permeates throughout the generations. 00:01:42.080 |
So maybe it's a little bit of a poetic question, 00:01:52.040 |
- I think the Chinese soul of people today, right? 00:01:56.120 |
We're talking about people who have had centuries of burden 00:02:01.120 |
because of the poverty that the country has gone through 00:02:15.480 |
And undoubtedly there are two sets of pressures 00:02:33.880 |
So that's a very strong hunger and a strong desire 00:02:38.360 |
and strong work ethic that drives China forward. 00:02:41.160 |
- And are there roots to not just this generation, 00:02:52.560 |
that you could speak to that's in the people? 00:02:58.600 |
is about excellence, dedication, and results. 00:03:02.680 |
And the Chinese exams and study subjects in schools 00:03:09.920 |
10,000 characters, not an easy task to start with, 00:03:15.280 |
its historic philosophers, literature, poetry. 00:03:36.160 |
and also enhances the speed of execution to get results. 00:03:42.400 |
And that I think characterizes the historic basis of China. 00:03:47.480 |
- That's interesting 'cause there's echoes of that 00:03:49.200 |
in Russian education as well as rote memorization. 00:03:53.840 |
I mean, there's just an emphasis on perfection 00:04:00.880 |
to perhaps what you're speaking to, which is creativity. 00:04:03.720 |
But you think that kind of education holds back 00:04:11.000 |
- Well, it holds back the breakthrough innovative spirits 00:04:16.560 |
but it does not hold back the valuable execution-oriented, 00:04:28.040 |
- So is there a difference between a Chinese AI engineer 00:04:34.880 |
perhaps rooted in the culture that we just talked about 00:04:37.120 |
or the education or the very soul of the people or no? 00:04:51.320 |
about using known technologies and trying new things. 00:05:05.400 |
And it is also, as anyone who's built AI products 00:05:15.320 |
And data is generally unstructured, errorful and unclean. 00:05:26.360 |
So I think the better part of American engineering, 00:05:38.000 |
and to use technology to solve most, if not all problems. 00:05:43.400 |
So to make the algorithm work despite not so great data, 00:05:47.160 |
find error tolerant ways to deal with the data. 00:05:50.680 |
The Chinese way would be to basically enumerate 00:06:12.960 |
working with thousands of people who label or correct 00:06:24.040 |
So the Chinese engineer would rely on and ask for 00:06:32.600 |
And probably less time thinking about new algorithms 00:06:55.160 |
cleaning data, organizing data onto the same algorithms? 00:06:58.480 |
What do you think the big impact in the applied world is? 00:07:14.280 |
and likely to generate better and better results. 00:07:17.240 |
And that's why the Chinese approach has done quite well. 00:07:20.560 |
Now, there are a lot of more challenging startups 00:07:34.280 |
And that would leave the Chinese approach more challenged 00:07:38.720 |
and favor a more breakthrough-innovation approach, 00:07:53.600 |
So you brought up autonomous vehicles and medical diagnosis. 00:07:56.440 |
So your intuition is that huge amounts of data 00:08:00.080 |
might not be able to completely help us solve that problem. 00:08:08.060 |
I think huge amounts of data probably will solve 00:08:22.360 |
is likely to require new technologies we don't yet know. 00:08:39.740 |
and the developments with Tesla and Elon Musk. 00:08:48.040 |
into this mysterious complex world of full autonomy, 00:08:55.080 |
And they're trying to solve that purely with data. 00:09:02.100 |
which is where a lot of people share your intuition. 00:09:02.100 |
do you think possible for them to achieve success 00:09:13.560 |
with simply just a huge amount of this training 00:09:17.000 |
on edge cases and difficult cases in urban environments, 00:09:27.000 |
as kind of a Chinese strength approach, right? 00:09:36.080 |
clearly a lot of the decisions aren't merely solved 00:09:40.400 |
by aggregating data and having a feedback loop. 00:09:40.400 |
There are things that are more akin to human thinking 00:09:58.780 |
even though that's a taboo word with the machine learning 00:10:02.920 |
and the integration of the two types of thinking 00:10:12.420 |
And of course, Tesla also has an additional constraint 00:10:18.520 |
I know that they think it's foolish to use LIDARs, 00:10:25.120 |
and reliable source of input that they're foregoing, 00:10:32.440 |
I think the advantage of course is capturing data 00:10:42.180 |
I have seen Chinese companies accumulate data 00:10:44.820 |
that's not seen anywhere in the Western world 00:10:50.240 |
But then speech recognition and object recognition 00:10:53.720 |
are relatively suitable problems for deep learning 00:11:00.760 |
for the human intelligence analytical planning elements. 00:11:04.440 |
- And the same on the speech recognition side, 00:11:09.000 |
and the machine learning approaches to speech recognition 00:11:16.000 |
which is sort of maybe akin to what driving is. 00:11:20.080 |
So it needs to have something more than just 00:11:20.080 |
simple language understanding, simple language generation. 00:11:28.760 |
I would say that based on purely machine learning approaches 00:11:36.720 |
a full conversational experience across arbitrary domains, 00:11:44.600 |
I'm a little hesitant to use the word Turing test 00:11:46.920 |
because the original definition was probably too easy. 00:11:52.320 |
- The spirit of the Turing test is what I was referring to. 00:11:56.520 |
- So you've had major leadership research positions 00:12:01.640 |
So continuing on the discussion of America, Russia, 00:12:18.760 |
of each of these three major companies in your view? 00:12:22.080 |
- I think in aggregate Silicon Valley companies, 00:12:25.160 |
and we could probably include Microsoft in that 00:12:34.000 |
and believe that technology will conquer all. 00:12:37.980 |
And also the self-confidence and the self-entitlement 00:12:59.120 |
he looks in the mirror and asks the person in the mirror, 00:13:11.280 |
but develop something that users will know they want 00:13:16.240 |
but they could never come up with themselves. 00:13:18.960 |
I think that is probably the most exhilarating description 00:13:26.560 |
that this brilliant idea could cause you to build something 00:13:31.560 |
that couldn't come out of focus groups or A/B tests. 00:13:38.020 |
No one in the age of BlackBerry would write down 00:13:38.020 |
No one would say they want that in the days of FTP, 00:13:49.440 |
So I think that is what Silicon Valley is best at. 00:14:01.960 |
and there were basically no competitors anywhere. 00:14:08.440 |
that these are the only things that one should do, 00:14:22.320 |
and an OpenTable and the Grubhub would each feel, 00:14:26.280 |
okay, I'm not gonna do the other company's business 00:14:28.560 |
because that would not be the pride of innovating 00:14:32.360 |
whatever each of these four companies have innovated. 00:14:47.200 |
the market leader will get predominantly all the value 00:14:53.120 |
And the system isn't just defined as one narrow category, 00:15:06.040 |
and domination of increasingly larger product categories 00:15:15.080 |
and the opportunity to extract tremendous value. 00:15:41.920 |
If what it takes is to satisfy a foreign country's need 00:15:47.640 |
and building something that looks really ugly and different, 00:15:56.240 |
And I think the flexibility and the speed and execution 00:16:13.240 |
and the American entrepreneurs only look internally 00:16:23.560 |
- The unique elements of the three companies, perhaps. 00:17:16.360 |
because it's well architected at the bottom level 00:17:21.360 |
and the work is efficiently delegated to individuals. 00:17:33.640 |
So it's probably the most effective high-tech assembly line 00:17:49.560 |
and something competitors can't easily repeat. 00:18:00.880 |
essentially dominating the market for a long time, 00:18:05.680 |
- I think there are elements that are the same. 00:18:13.920 |
and obviously as well as American characteristics, 00:18:22.960 |
that will tenaciously go after adjacent markets, 00:18:40.000 |
And they understand the value of the platforms, 00:18:45.640 |
And then with Google, I think it's a genuinely 00:18:50.640 |
value-oriented company that does have a heart and soul 00:18:57.040 |
and that wants to do great things for the world 00:19:01.960 |
and that has also very strong technology genes 00:19:13.320 |
and has found out-of-the-box ways to use technology 00:19:36.560 |
I mean, they used to have the slogan, "Don't be evil." 00:19:39.000 |
And Facebook, a little bit more has a negative tint to it, 00:19:43.160 |
at least in the perception of privacy and so on. 00:19:46.000 |
Do you have a sense of how these different companies 00:19:59.400 |
give it a heart and soul, gain the trust of the public, 00:20:12.120 |
First, the "don't do evil" mantra is very dangerous 00:20:16.760 |
because every employee's definition of evil is different, 00:20:25.200 |
So I don't necessarily think that's a good value statement, 00:20:32.360 |
or its parent company, Alphabet, does in new areas 00:20:36.480 |
like healthcare, like eradicating mosquitoes, 00:20:42.380 |
of an internet tech company, I think that shows 00:20:42.380 |
that there is a heart and soul and desire to do good 00:20:57.280 |
That doesn't necessarily mean it has all the trust 00:21:02.520 |
I realize while most people would view Facebook 00:21:06.400 |
as the primary target of their recent unhappiness 00:21:14.080 |
and some have named Google's business practices 00:21:19.840 |
So it's kind of difficult to have the two parts of a body, 00:21:24.280 |
the brain wants to do what it's supposed to do 00:21:29.320 |
and then the heart and soul wants to do good things 00:21:32.040 |
that may run against what the brain wants to do. 00:21:36.160 |
- So in this complex balancing that these companies 00:21:39.840 |
have to do, you've mentioned that you're concerned 00:21:45.520 |
like Google, Facebook, Amazon are controlling our data, 00:21:49.680 |
are controlling too much of our digital lives. 00:21:57.560 |
- I think I'm hardly the most vocal complainer of this. 00:22:03.640 |
There are a lot louder complainers out there. 00:22:24.120 |
So the entrepreneurial opportunities still exist 00:22:24.120 |
Probably the best solution is let the entrepreneurial 00:22:56.760 |
VC ecosystem work well, and find all the places 00:23:00.680 |
that can create the next Google, the next Facebook. 00:23:04.240 |
So there will always be an increasing number of challengers. 00:23:08.600 |
In some sense that has happened a little bit. 00:23:33.320 |
More such companies have emerged in China than in the US 00:23:42.200 |
because of the more fearless nature of the entrepreneurs, 00:23:57.080 |
and Ant Financial, while it's Alibaba-affiliated, 00:24:00.160 |
it's nevertheless independent and worth $150 billion. 00:24:12.720 |
So it's probably not the case that in five or 10 years 00:24:19.720 |
with these five companies having such dominance. 00:24:24.800 |
this fascinating world of entrepreneurship in China, 00:24:33.280 |
what it takes to be an entrepreneur in China? 00:24:43.960 |
of the way the government helps companies and so on? 00:24:59.400 |
would brand Chinese entrepreneurs as copycats. 00:25:10.760 |
an entrepreneur probably could not get funding 00:25:19.560 |
The first question is who has proven this business model? 00:25:23.400 |
Which is a nice way of asking, who are you copying? 00:25:29.560 |
because China had a much lower internet penetration 00:25:55.380 |
And the American successes have given a shortcut 00:25:59.540 |
that if you built your minimally viable product 00:26:04.260 |
it's guaranteed to be a decent starting point. 00:26:12.300 |
there hasn't been in the mobile and AI spaces, 00:26:25.180 |
because that's not your own idea to start with. 00:26:38.140 |
because lean startup is intended to try many, many things 00:26:46.740 |
So finding a decent starting point without legal violations, 00:26:51.240 |
there should be nothing morally dishonorable about that. 00:27:12.020 |
"Okay, we'll do exactly what Amazon is doing. 00:27:16.700 |
"and then let's out-innovate them from that starting point, 00:27:45.340 |
It's a non-heroic, acceptable way to start the company 00:27:52.860 |
So that's, I think, baggage for Silicon Valley, 00:28:00.800 |
then it may limit the ultimate ceiling of the company. 00:28:11.540 |
but he's very proud that he wants to build his own features, 00:28:16.820 |
while Facebook was more willing to copy his features. 00:28:23.480 |
So I think putting that handcuff on a company 00:28:27.460 |
would limit its ability to reach the maximum potential. 00:28:33.840 |
copying was merely a way to learn from the American masters. 00:28:38.420 |
Just like when we learn to play piano or to paint, 00:28:48.160 |
So very amazingly, the Chinese entrepreneurs, 00:28:54.120 |
started to branch off with these lean startups 00:28:59.500 |
to build better products than American products. 00:29:18.500 |
And then step three is once these entrepreneurs 00:29:23.700 |
they now look at the Chinese market and the opportunities 00:29:27.380 |
and come up with ideas that didn't exist elsewhere. 00:29:34.180 |
under which includes Alipay, which is mobile payments, 00:29:38.260 |
and also the financial products for loans built on that, 00:30:08.720 |
And an additional interesting observation 00:30:19.400 |
but may work very well in Southeast Asia, Africa, 00:30:27.840 |
And a few of these products maybe are universal 00:30:31.060 |
and are getting traction even in the United States, 00:30:43.520 |
because a large market with innovative entrepreneurs 00:31:05.420 |
So at Sinovation Ventures, our first fund was 15 million. 00:31:12.040 |
So it reflects the valuation of the companies 00:31:16.560 |
and us going multi-stage and things like that. 00:31:22.020 |
but not in the way most Americans would think of it. 00:31:26.120 |
The government actually leaves the entrepreneurial space 00:31:29.520 |
as a private enterprise, so it's self-regulating, 00:31:33.240 |
and the government would build infrastructures 00:31:39.320 |
For example, the Mass Entrepreneur, Mass Innovation Plan 00:31:48.400 |
For autonomous vehicles, the Chinese government 00:31:54.360 |
smart cities that separate pedestrians from cars 00:32:13.840 |
would have these guiding funds acting as LPs, 00:32:22.040 |
part of the money made is given back to the GPs 00:32:25.280 |
and potentially other LPs to increase everybody's return 00:32:36.400 |
that entrusts the task of choosing entrepreneurs to VCs 00:32:43.840 |
by letting some of the profits move that way. 00:32:48.760 |
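As a rough illustration, the guiding-fund mechanism described here can be sketched with hypothetical numbers — the fund sizes, the 2x return, and the 50% share of government profit transferred to the GP and private LPs below are all made up for illustration, not figures from the conversation:

```python
# Hypothetical sketch of a "guiding fund": the government invests as an LP,
# but gives part of its profit back to the GP and private LPs to boost
# their returns. All numbers are illustrative assumptions.

def guiding_fund_returns(gov_commit, private_commit, fund_multiple,
                         gov_profit_share_kept=0.5):
    """Return (profit the government keeps, profit it redistributes)."""
    gov_profit = gov_commit * (fund_multiple - 1.0)   # government's gross profit
    kept = gov_profit * gov_profit_share_kept          # portion it retains
    redistributed = gov_profit - kept                  # transferred to GP/private LPs
    return kept, redistributed

# A $100M fund: government commits $30M, private LPs $70M; fund returns 2x.
kept, bonus = guiding_fund_returns(30.0, 70.0, 2.0)
private_profit = 70.0 * (2.0 - 1.0) + bonus  # private side captures the bonus
print(kept, private_profit)  # → 15.0 85.0
```

The point of the mechanism is visible in the numbers: the private side earns more than its pro-rata share of profit, which is the incentive for VCs to pick entrepreneurs well.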
So I look at the Russian government as a case study 00:33:00.880 |
And probably the same is true in the United States, 00:33:04.060 |
but the entrepreneurs themselves kind of find a way. 00:33:11.720 |
how did the Chinese government arrive to be this way? 00:33:23.160 |
And also perhaps, how can we copy it in other countries? 00:33:31.640 |
to support infrastructure for autonomous vehicles 00:34:01.440 |
And China made a few tweaks and turned it into a, 00:34:06.120 |
because the Chinese cities and government officials 00:34:11.360 |
'cause they all want to make their city more successful 00:34:14.720 |
so they can get the next level in their political career. 00:34:22.320 |
So the central government made it a bit of a competition. 00:34:26.840 |
They can put it on AI, or they can put it on bio, 00:34:39.760 |
it's kind of almost like an entrepreneurial environment 00:34:42.840 |
for local governments to see who can do a better job. 00:34:47.440 |
And also many of them try different experiments. 00:34:52.440 |
Some have given awards to very smart researchers, 00:34:57.440 |
just give them money and hope they'll start a company. 00:35:00.820 |
Some have given money to academic research labs, 00:35:20.960 |
The one that worked the best was the guiding funds. 00:35:35.700 |
The autonomous vehicle and the massive spending 00:35:40.400 |
in highways and smart cities, that's a Chinese way. 00:35:45.400 |
It's about building infrastructure to facilitate. 00:35:49.520 |
It's a clear division of the government's responsibility 00:35:55.400 |
The market should do everything in a private, free way, 00:36:00.400 |
but there are things the market can't afford to do, 00:36:04.500 |
So the government always appropriates large amounts of money 00:36:12.000 |
This happens with not only autonomous vehicle and AI, 00:36:20.840 |
You'll find that the Chinese wireless reception 00:36:25.360 |
is better than the US because of massive spending 00:36:34.360 |
It's government-driven because I think they view 00:36:36.880 |
the coverage of cell access and 3G, 4G access 00:36:49.880 |
So that's, of course, the state-owned enterprises 00:37:01.920 |
that may be very hard to inject into Western countries 00:37:05.440 |
to say starting tomorrow, bandwidth infrastructure 00:37:09.320 |
and highways are gonna be governmental spending 00:37:24.120 |
it's possible to solve the problem of full autonomy 00:37:30.160 |
without significant investment in infrastructure? 00:37:49.040 |
whether it's road, the city, or whole city planning, 00:37:52.360 |
building a new city, I'm sure that will accelerate 00:38:01.320 |
and it's hard to predict even when we're knowledgeable 00:38:06.100 |
But in the US, I don't think people would consider 00:38:16.000 |
There are smaller ones being built, I'm aware of that. 00:38:18.920 |
But is infrastructure spend really impossible 00:38:29.000 |
Was that during President Eisenhower or Kennedy? 00:38:37.280 |
how did President Eisenhower get the resources 00:38:42.840 |
that surely gave US a tremendous amount of prosperity 00:38:53.120 |
it takes us to artificial intelligence a little bit 00:39:00.520 |
So I'll be actually interested if you would say 00:39:03.400 |
that you talk in your book about all kinds of jobs 00:39:08.920 |
I wonder if building infrastructure is one of the jobs 00:39:17.040 |
because I think you've mentioned somewhere in the talk 00:39:19.920 |
or that there might be, as jobs are being automated, 00:39:24.280 |
a role for government to create jobs that can't be automated. 00:39:41.200 |
And a lot of it went into infrastructure building. 00:39:44.600 |
And I think that's a legitimate way at a government level 00:39:57.040 |
As long as the infrastructures are truly needed 00:39:59.840 |
and as long as there isn't an employment problem, 00:40:07.840 |
if you've been a leader and a researcher in AI 00:40:16.200 |
So how has AI changed in the West and the East 00:40:21.000 |
as you've observed, as you've been deep in it 00:40:37.500 |
that worked extremely well, which is machine intelligence. 00:40:52.760 |
but relatively simple kinds of planning tasks 00:40:58.680 |
So we didn't end up building human intelligence. 00:41:04.560 |
that was a lot better than us on some problems, 00:41:11.700 |
So today, I think a lot of people still misunderstand 00:41:27.600 |
to having invented the internet or the spreadsheet 00:41:31.680 |
or the database and getting broader adoption. 00:41:42.080 |
the general intelligence that people in the popular culture 00:41:51.960 |
the kind, the narrow AI that you're talking about 00:42:04.440 |
beginning to be automated by AI systems, algorithms? 00:42:09.240 |
- Yes, this is also maybe a little bit counterintuitive 00:42:31.540 |
is when most people think of AI replacing routine jobs, 00:42:35.740 |
they think of the assembly line, the workers. 00:42:40.980 |
but it's actually the routine white collar workers 00:43:01.900 |
and maybe even unknown environments, very, very difficult. 00:43:17.920 |
and deal with simple computer programs and data 00:43:34.600 |
So you have people dealing with new employee orientation, 00:43:39.380 |
searching for past lawsuits and financial documents 00:43:52.800 |
- In addition to the white collar repetitive work, 00:43:56.440 |
a lot of simple interaction work can also be taken care of, 00:44:00.240 |
such as telesales, telemarketing, customer service, 00:44:04.860 |
as well as many physical jobs that are in the same location 00:44:09.540 |
and don't require a high degree of dexterity. 00:44:12.220 |
So fruit picking, dishwashing, assembly line, 00:44:24.460 |
And the blue collar may be smaller initially, 00:44:32.540 |
And when we start to get to over the next 15, 20 years, 00:44:39.100 |
of doing assembly line, that's a huge chunk of jobs. 00:44:52.020 |
So I see modest numbers in the next five years, 00:44:58.060 |
- On the worry of the jobs that are in danger 00:45:03.080 |
I'm not sure if you're familiar with Andrew Yang. 00:45:07.820 |
- So there's a candidate for president of the United States 00:45:10.580 |
whose platform is based around, 00:45:24.460 |
that are folks who lose their job due to automation 00:45:34.360 |
So what are your thoughts about his concerns, 00:45:39.140 |
- I think his thinking is generally in the right direction, 00:46:00.560 |
The unemployment numbers are not very high yet. 00:46:03.840 |
And I think he and I have the same challenge. 00:46:07.680 |
If I want to theoretically convince people this is an issue 00:46:25.280 |
on the displacement issue, on universal basic income 00:46:33.960 |
because I think the main issue is retraining. 00:47:00.520 |
because historically when technology revolutions happened, 00:47:11.900 |
the whole point is replacing all routine jobs eventually. 00:47:15.380 |
So there will be fewer and fewer routine jobs. 00:47:26.940 |
So therefore the people who are losing the jobs 00:47:32.340 |
The jobs that are becoming available are non-routine jobs. 00:47:35.740 |
So the social stipend needs to be put in place 00:47:39.360 |
is for the routine workers who lost their jobs 00:47:42.100 |
to be retrained maybe in six months, maybe in three years. 00:47:46.180 |
It takes a while to retrain on a non-routine job 00:47:58.240 |
So I'm not disagreeing with what he's trying to do. 00:48:03.240 |
But for simplification, sometimes he just says UBI, 00:48:08.740 |
- And I think you've mentioned elsewhere that, 00:48:13.780 |
to give people enough money to survive or live 00:48:24.580 |
That our employment, at least in the United States, 00:48:36.920 |
So now, what kind of jobs do you think can't be automated? 00:48:52.300 |
- Because an AI system is currently merely optimizing. 00:49:05.980 |
It can't come up with a new problem and solve it. 00:49:32.940 |
that isn't just about optimizing the bottom line, 00:49:38.700 |
corporate brand, and many, many other things. 00:49:48.820 |
doing them creates a high degree of satisfaction, 00:49:52.540 |
and therefore appealing to our desire for working, 00:50:00.980 |
that others maybe can't do or can't do as well. 00:50:15.020 |
AI can't do that because AI is cold, calculating, 00:50:29.700 |
people would want to interact with another person, 00:50:38.660 |
or a concierge, or a masseuse, or a bartender. 00:50:45.020 |
just don't want to interact with a cold robot or software. 00:50:49.460 |
I've had an entrepreneur who built an elderly care robot, 00:51:09.460 |
"Let me show you a picture of her grandkids." 00:51:11.600 |
So people yearn for that people-people interaction. 00:51:15.340 |
So even if robots improved, people just don't want it. 00:51:29.220 |
and that will give people money to enjoy services, 00:51:44.900 |
every dollar of that $16 trillion will be tremendous. 00:51:50.040 |
that are to service the people who did well through AI 00:52:06.140 |
The best example is probably in healthcare services. 00:52:15.360 |
just brand new incremental jobs in the next six years 00:52:20.520 |
That includes nurses, orderlies in the hospital, 00:52:26.740 |
and also at-home care is particularly lacking. 00:52:41.560 |
and that the social status of these jobs is not very good. 00:53:03.480 |
and just think about satisfaction from one's job, 00:53:07.200 |
someone repetitively doing the same manual action 00:53:24.700 |
So if only we could fix the pay for service jobs, 00:53:28.760 |
there are plenty of jobs that require some training 00:53:37.000 |
We can easily imagine someone who was maybe a cashier 00:53:49.220 |
Also do want to point out the blue collar jobs 00:53:58.160 |
AI cannot be told to go clean an arbitrary home. 00:54:10.080 |
because a plumber is almost like a mini detective 00:54:12.840 |
that has to figure out where the leak came from. 00:54:22.840 |
So one has to study which blue collar jobs are going away 00:54:56.960 |
and actually the meaningful nature of those jobs 00:55:02.040 |
So overall, are you optimistic about the future 00:55:07.040 |
where much of the repetitive tasks are automated? 00:55:13.780 |
for the compassionate, for the creative input 00:55:20.040 |
- I am optimistic if we start to take action. 00:55:30.780 |
with the devastating losses that will emerge. 00:55:41.820 |
why they should train more plumbers than auto mechanics, 00:55:49.700 |
for corporations to have more training positions. 00:55:53.620 |
We start to explain to people why retraining is important. 00:55:58.180 |
We start to think about what the future of education, 00:56:00.780 |
how that needs to be tweaked for the era of AI. 00:56:08.960 |
then there's no reason to think we can't deal with this 00:56:12.360 |
because this technological revolution is arguably similar 00:56:22.660 |
for governments to step in to help with policy 00:56:34.020 |
and they don't have to believe automation will be this fast 00:56:39.460 |
Revamping vocational school would be one example. 00:56:47.500 |
and we know that a country's population is growing older, 00:56:54.260 |
because people over 80 require five times as much care 00:56:59.660 |
then it is a good time to incent training programs 00:57:03.460 |
for elderly care, to find ways to improve the pay. 00:57:07.540 |
Maybe one way would be to offer as part of Medicare 00:57:14.540 |
to be entitled to a few hours of elderly care at home. 00:57:35.420 |
controlling the future of AI development in general? 00:57:44.340 |
can better represent the interest of the people 00:57:52.300 |
are better at representing the interest of the people? 00:58:00.140 |
The companies and governments can provide better services 00:58:03.740 |
with more access to data and more access to AI, 00:58:13.560 |
whether it's monopoly or corruption in the government. 00:58:21.380 |
to look at how much data that companies and governments have 00:58:25.020 |
and some kind of checks and balances would be helpful. 00:58:42.180 |
the conflict all over the world is decreasing in general, 00:58:45.460 |
but do you have a sense that having written the book, 00:58:50.180 |
AI Superpowers, do you see a major international conflict 00:58:57.880 |
whatever they are, whether it's Russia, China, 00:59:12.160 |
Is there something, is that something we need to think about 00:59:31.820 |
And when you extrapolate into military kinds of scenarios, 00:59:47.380 |
And autonomous decision-making can lead to not enough time 01:00:13.740 |
I think engagement, interaction, some protocols 01:00:17.780 |
to avoid inadvertent disasters is actually needed. 01:00:23.660 |
So it's natural for each country to want to be the best, 01:00:27.820 |
whether it's in nuclear technologies or AI or bio, 01:00:43.300 |
that probably presents greater challenges to humanity 01:00:53.760 |
but with some degree of protocol for interaction, 01:00:57.080 |
just like when there was a nuclear competition, 01:01:18.460 |
is the level of engagement seems to be coming down. 01:01:26.380 |
especially from the US towards other large countries, 01:01:34.740 |
So that's beautifully put, level of engagement, 01:01:37.260 |
and even just a basic trust and communication, 01:01:43.400 |
making artificial enemies out of particular countries. 01:01:51.140 |
Do you have a sense how we can make it better? 01:01:57.180 |
Actionable items that as a society we can take on? 01:02:04.980 |
but I would say that we look pretty foolish as humankind 01:02:18.180 |
and yet we're not solving fundamental problems 01:02:29.480 |
And for the first time, we have the resources 01:02:35.940 |
but we're fueling competition among superpowers, 01:02:53.740 |
and maybe some AI to figure out how to use it 01:03:13.920 |
And then the geopolitics issue with superpower competition 01:03:19.360 |
There's another side which I worry maybe even more, 01:03:28.200 |
by US and China and a few of the other developed countries, 01:03:36.880 |
and the wealth disparity and inequality will increase. 01:03:52.400 |
who previously had hoped they could do the China model 01:03:56.060 |
and do outsourced manufacturing or the India model 01:03:58.860 |
so they could do outsourced processes or call centers. 01:04:02.660 |
Well, all those jobs are gonna be gone in 10 or 15 years. 01:04:05.820 |
So the individual citizen may be a net liability, 01:04:10.820 |
I mean, financially speaking, to a poorer country 01:04:15.060 |
and not an asset to claw itself out of poverty. 01:04:46.940 |
can find solutions; that's beyond my expertise. 01:04:50.000 |
- So different countries that we've talked about 01:04:59.000 |
there is an absolute desire for freedom of speech. 01:05:15.080 |
to the essence of what it means to be America, right? 01:05:26.540 |
that China and Russia and many other countries undertake. 01:05:31.300 |
Do you see that having effects on innovation, 01:06:02.980 |
which is correlated with entrepreneurial success. 01:06:08.480 |
I think empirically, we have seen that is not true, 01:06:20.820 |
but it's just that that perfect correlation isn't there. 01:06:31.700 |
and I've not been very good at that in my past predictions, 01:06:44.060 |
a lot of fundamental values for the long term. 01:07:10.700 |
and I think the fundamental values are there. 01:07:25.520 |
ultimately people want that kind of protection, 01:07:35.360 |
- On the point of privacy, to me, it's really interesting. 01:07:45.500 |
I'm just speaking generally in terms of products, 01:07:48.540 |
and then we have currently, depending on the age, 01:07:59.060 |
So in your view, how do we get this balance right 01:08:09.960 |
You look at Facebook, the more Facebook knows about you, 01:08:21.220 |
So in your view, how do we get that balance right? 01:08:25.220 |
- Yes, I think a lot of people have a misunderstanding 01:08:30.220 |
that it's okay and possible to just rip all the data out 01:08:48.420 |
We'll no longer be able to use products that function well 01:09:11.320 |
for those who have egregious misuse of the data. 01:09:19.760 |
Actually, China, on this aspect, 01:09:33.680 |
pretty much eradicated the illegal distribution, 01:09:56.320 |
that will let us have our cake and eat it too? 01:09:59.000 |
People are looking into homomorphic encryption, 01:10:04.480 |
have it encrypted and train on encrypted data. 01:10:09.200 |
but that kind of direction may be worth pursuing. 01:10:15.280 |
which would allow one hospital to train fully 01:10:18.800 |
on its own patients' data because they have a license for that. 01:10:22.520 |
And then hospitals would then share their models, 01:10:30.640 |
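The federated setup described here, where each hospital trains on its own patients' data and only the resulting models are shared and combined centrally, can be sketched in a few lines. This is a minimal illustration with hypothetical hospitals and a plain linear model; the names `train_locally` and `federated_average` are illustrative, not any real hospital system or library API.

```python
# Sketch of the federated idea: each hospital trains on its own
# patients' records locally, and only the model weights (never the
# raw records) are shared and averaged into a central model.

def train_locally(weights, records, lr=0.1):
    """One pass of gradient descent on a single hospital's own data.

    Each record is (features, label); the model is a plain weight
    vector for a linear predictor.
    """
    w = list(weights)
    for features, label in records:
        pred = sum(wi * xi for wi, xi in zip(w, features))
        err = pred - label
        w = [wi - lr * err * xi for wi, xi in zip(w, features)]
    return w

def federated_average(models):
    """Combine weight vectors from several hospitals by averaging."""
    n = len(models)
    return [sum(ws) / n for ws in zip(*models)]

# Two hypothetical hospitals; their patient records never leave the site.
hospital_a = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0)]
hospital_b = [([1.0, 1.0], 1.0), ([0.0, 0.0], 0.0)]

global_model = [0.0, 0.0]
for _ in range(10):  # a few federated rounds
    local_models = [train_locally(global_model, data)
                    for data in (hospital_a, hospital_b)]
    global_model = federated_average(local_models)
```

Only `local_models` cross the boundary between sites, which is the point Kai-Fu is making: the central meta-model improves without any raw patient data being pooled.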
So I would want to encourage us to be open-minded 01:10:34.240 |
and think of this not just as a binary policy, yes or no, 01:08:39.240 |
but letting the technologists try to find solutions 01:10:44.960 |
or have most of our cake and eat most of it too. 01:10:47.560 |
Finally, I think giving each end user a choice is important 01:11:04.840 |
GDPR today causes all these pop-ups of yes, no, 01:11:29.200 |
and user choice, it implemented it in such a way 01:11:33.120 |
that really didn't deliver the spirit of GDPR. 01:11:56.720 |
between perfect protection, security of your personal data 01:12:12.140 |
But maybe we should turn the problem on its head 01:12:21.280 |
but we sure cannot understand every pop-up question. 01:12:40.360 |
the best way to make a lot of money in the long-term 01:12:45.520 |
I think getting that slider right in the short-term 01:12:54.240 |
but in the long-term, you'll be getting user trust 01:13:03.400 |
I sure would hope there is a way we can do long-term projects 01:13:24.560 |
but I'm not sure how in the current funding environment 01:14:31.360 |
it's beyond the current cycle of venture capital. 01:15:07.920 |
to my family, friends, and people who loved me, 01:15:10.720 |
and my life revolved around optimizing for work. 01:15:36.760 |
But when I faced mortality and the possible death 01:15:42.000 |
I suddenly realized that this really meant nothing to me, 01:15:45.640 |
that I didn't feel like working for another minute, 01:15:57.480 |
and apologizing to them that I lived my life the wrong way. 01:16:01.960 |
So that moment of reckoning caused me to really rethink 01:16:11.520 |
is something that we might be too much shaped by the society 01:16:22.040 |
But while that can get you periodic successes 01:16:27.040 |
and satisfaction, it's really in the facing death, 01:16:35.840 |
So as a result of going through the challenges with cancer, 01:16:40.840 |
I've resolved to live a more balanced lifestyle. 01:16:52.440 |
My wife travels with me when my kids need me. 01:16:58.000 |
And before, I used to prioritize everything around work. 01:17:06.240 |
Now, when my family needs something, really needs something, 01:17:12.640 |
And then in the time remaining, I allocate to work. 01:17:19.000 |
It's not like they will take 50 hours a week from me. 01:17:23.160 |
So I'm actually able to still work pretty hard, 01:17:29.280 |
So I realized the most important thing in my life 01:17:38.680 |
It isn't the only thing I do, but when that is needed, 01:17:45.860 |
And I feel much better, and I feel much more balanced. 01:17:53.080 |
as to a life of routine work, a life of pursuit of numbers. 01:17:58.080 |
While my job was not routine, it was in pursuit of numbers, 01:18:09.560 |
Can I make sure our VC is ranked higher and higher 01:18:13.960 |
This competitive nature of driving for bigger numbers 01:18:27.280 |
And bigger numbers really didn't make me happier. 01:18:32.940 |
I realized bigger numbers really meant nothing. 01:18:36.360 |
And what was important is that people who have given 01:18:52.000 |
And that's really powerful for you to say that. 01:18:55.420 |
I have to ask sort of a difficult question here. 01:19:05.980 |
Looking historically, I'd like to challenge 01:19:10.980 |
some aspect of that a little bit on the point of hard work. 01:19:18.900 |
that one of the greatest, most beautiful aspects 01:19:22.020 |
of human nature is the ability to become obsessed, 01:19:26.340 |
of becoming extremely passionate to the point where, 01:19:41.540 |
But in another sense, this kind of obsession, 01:19:43.980 |
this pure exhibition of passion and hard work 01:19:53.100 |
Because you've accomplished incredible things. 01:19:57.260 |
But really, there's some incredible work there. 01:20:00.940 |
So how do you think about that when you look back 01:20:27.720 |
you might give a smaller percentage to family, 01:20:40.700 |
you give most of it to them and the highest priority. 01:20:47.420 |
we would work 20 hours less for the whole life 01:21:11.400 |
that it's important for us to put down our phone and PC 01:21:18.340 |
And that priority on the things that really matter 01:21:23.340 |
isn't going to be so taxing that it would eliminate 01:21:28.060 |
or even dramatically reduce our accomplishments. 01:21:35.980 |
because if you have a happier family, maybe you fight less. 01:21:46.100 |
- So it's unclear that it would take more time. 01:21:48.220 |
And if it did, I'd be willing to take that reduction. 01:22:09.940 |
- So given the many successful companies that you've launched 01:22:16.700 |
what advice would you give to young people today looking, 01:22:28.540 |
and to create the next $1 billion tech startup, 01:22:42.740 |
What worked two years ago may not work today. 01:22:49.700 |
I think two years ago, or maybe three years ago, 01:23:29.800 |
VCs may be willing to accept greater uncertainty. 01:23:34.800 |
But now the number of people who have the equivalent 01:23:46.060 |
the cost to become an AI engineer is much lower, 01:23:53.940 |
So I would suggest someone who wants to build an AI company, 01:23:57.460 |
be thinking about the normal business questions. 01:24:01.480 |
What customer cases are you trying to address? 01:24:16.480 |
And how much business value will get created? 01:24:19.920 |
That today needs to be thought about much earlier upfront 01:24:26.720 |
The scarcity question of AI talent has changed. 01:24:54.080 |
But the good news though is that AI technologies 01:25:00.740 |
TensorFlow, PyTorch and such tools are much easier to use. 01:25:10.500 |
and get results iteratively faster than before. 01:25:18.540 |
Think less of this as a laboratory taken into a company, 01:25:26.180 |
The only exception is if you truly have a breakthrough 01:26:03.060 |
Kai-Fu, thank you so much for your time today.