
Kai-Fu Lee: AI Superpowers - China and Silicon Valley | Lex Fridman Podcast #27


Chapters

0:00 Introduction
1:25 What defines the Chinese soul
4:28 Chinese vs American AI engineers
6:40 Kai-Fu's intuition
8:35 Tesla's approach
20:07 Evil Mantra
24:55 Chinese copycats
27:44 Silicon Valley's baggage
33:36 Guiding Funds
37:34 Infrastructure Augmentation
40:25 Machine Intelligence
42:09 White Collar Jobs
45:40 Universal Basic Income
48:53 Difficulties with AI
55:20 Retraining
57:56 Double-edged sword
59:19 Arms race metaphor

Whisper Transcript

00:00:00.000 | The following is a conversation with Kai-Fu Lee.
00:00:02.960 | He's the chairman and CEO of Sinovation Ventures
00:00:06.560 | that manages a $2 billion dual currency investment fund
00:00:10.600 | with a focus on developing the next generation
00:00:13.160 | of Chinese high-tech companies.
00:00:15.440 | He's the former president of Google China
00:00:17.760 | and the founder of what is now called
00:00:19.800 | Microsoft Research Asia,
00:00:21.680 | an institute that trained many
00:00:24.160 | of the artificial intelligence leaders in China,
00:00:26.520 | including CTOs or AI execs at Baidu,
00:00:30.280 | Tencent, Alibaba, Lenovo, and Huawei.
00:00:33.900 | He was named one of the 100 most influential people
00:00:38.520 | in the world by Time Magazine.
00:00:40.640 | He's the author of seven bestselling books in Chinese,
00:00:43.840 | and most recently, the New York Times bestseller
00:00:46.680 | called "AI Superpowers: China, Silicon Valley,
00:00:50.600 | and the New World Order."
00:00:52.600 | He has unparalleled experience
00:00:55.760 | in working across major tech companies
00:00:57.640 | and governments on applications of AI.
00:01:00.040 | And so he has a unique perspective on global innovation
00:01:03.480 | and the future of AI that I think is important
00:01:06.200 | to listen to and think about.
00:01:08.880 | This is the Artificial Intelligence Podcast.
00:01:11.880 | If you enjoy it, subscribe on YouTube and iTunes,
00:01:15.160 | support it on Patreon,
00:01:16.800 | or simply connect with me on Twitter @lexfridman.
00:01:20.960 | And now here's my conversation with Kai-Fu Lee.
00:01:25.800 | - I immigrated from Russia to the US when I was 13.
00:01:29.440 | You immigrated to the US at about the same age.
00:01:32.440 | The Russian people, the American people,
00:01:34.800 | the Chinese people each have a certain soul,
00:01:38.280 | a spirit that permeates throughout the generations.
00:01:42.080 | So maybe it's a little bit of a poetic question,
00:01:45.160 | but could you describe your sense
00:01:48.920 | of what defines the Chinese soul?
00:01:52.040 | - I think the Chinese soul of people today, right?
00:01:56.120 | We're talking about people who have had centuries of burden
00:02:01.120 | because of the poverty that the country has gone through
00:02:05.200 | and suddenly shined with hope of prosperity
00:02:10.200 | in the past 40 years as China opened up
00:02:13.400 | and embraced market economy.
00:02:15.480 | And undoubtedly there are two sets of pressures
00:02:20.160 | on the people, that of the tradition,
00:02:24.160 | that of facing difficult situations,
00:02:28.040 | and that of hope of wanting to be the first
00:02:31.160 | to become successful and wealthy.
00:02:33.880 | So that's a very strong hunger and a strong desire
00:02:38.360 | and strong work ethic that drives China forward.
00:02:41.160 | - And are there roots to that, not just this generation,
00:02:43.960 | but before, that are deeper
00:02:47.320 | than just the new economic developments?
00:02:50.120 | Is there something that's unique to China
00:02:52.560 | that you could speak to that's in the people?
00:02:55.000 | - Yeah, well, the Chinese tradition
00:02:58.600 | is about excellence, dedication, and results.
00:03:02.680 | And the Chinese exams and study subjects in schools
00:03:07.240 | have traditionally started from memorizing
00:03:09.920 | 10,000 characters, not an easy task to start with,
00:03:13.600 | and further by memorizing
00:03:15.280 | the historic philosophers, literature, poetry.
00:03:19.040 | So it really is probably the strongest
00:03:22.160 | rote learning mechanism created
00:03:25.120 | to make sure people had good memory
00:03:26.960 | and remembered things extremely well.
00:03:29.200 | That, I think, at the same time,
00:03:32.760 | suppresses the breakthrough innovation
00:03:36.160 | and also enhances the speed of execution to get results.
00:03:42.400 | And that I think characterizes the historic basis of China.
00:03:47.480 | - That's interesting 'cause there's echoes of that
00:03:49.200 | in Russian education as well, with rote memorization.
00:03:52.120 | So you have to memorize a lot of poetry.
00:03:53.840 | I mean, there's just an emphasis on perfection
00:03:57.560 | in all forms that's not conducive
00:04:00.880 | to perhaps what you're speaking to, which is creativity.
00:04:03.720 | But you think that kind of education holds back
00:04:07.040 | the innovative spirit that you might see
00:04:09.560 | in the United States?
00:04:11.000 | - Well, it holds back the breakthrough innovative spirits
00:04:14.920 | that we see in the United States,
00:04:16.560 | but it does not hold back the valuable execution-oriented,
00:04:21.560 | result-oriented, value-creating engines,
00:04:25.320 | at which we see China being very successful.
00:04:28.040 | - So is there a difference between a Chinese AI engineer
00:04:32.400 | today and an American AI engineer,
00:04:34.880 | perhaps rooted in the culture that we just talked about
00:04:37.120 | or the education or the very soul of the people or no?
00:04:41.240 | And what would your advice be to each
00:04:43.840 | if there's a difference?
00:04:45.600 | - Well, there's a lot that's similar
00:04:47.200 | because AI is about mastering sciences,
00:04:51.320 | about using known technologies and trying new things.
00:04:55.000 | But it's also about picking from the many
00:04:58.360 | possible networks to use
00:05:00.360 | and different types of parameters to tune.
00:05:03.040 | And that part is somewhat rote.
00:05:05.400 | And it is also, as anyone who's built AI products
00:05:09.120 | can tell you, a lot about cleansing the data,
00:05:12.760 | because AI runs better with more data.
00:05:15.320 | And data is generally unstructured, errorful, and unclean.
00:05:20.320 | And the effort to clean the data is immense.
00:05:26.360 | So I think the better part of American engineering,
00:05:31.360 | AI engineering process is to try new things,
00:05:35.560 | to do things people haven't done before
00:05:38.000 | and to use technology to solve most, if not all problems.
00:05:43.400 | So to make the algorithm work despite not so great data,
00:05:47.160 | find error tolerant ways to deal with the data.
00:05:50.680 | The Chinese way would be to basically enumerate
00:05:55.680 | to the fullest extent all the possible ways
00:05:58.600 | by a lot of machines,
00:05:59.720 | try lots of different ways to get it to work
00:06:02.280 | and spend a lot of resources and money
00:06:05.280 | and time cleaning up data.
00:06:07.720 | That means the AI engineer
00:06:09.840 | may be writing data cleansing algorithms,
00:06:12.960 | working with thousands of people who label or correct
00:06:17.240 | or do things with the data.
00:06:19.200 | That is the incredible hard work
00:06:21.720 | that might lead to better results.
00:06:24.040 | So the Chinese engineer would rely on and ask for
00:06:27.720 | more and more and more data
00:06:29.080 | and find ways to cleanse them
00:06:30.680 | and make them work in the system.
00:06:32.600 | And probably less time thinking about new algorithms
00:06:36.360 | that can overcome data or other issues.
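As a purely illustrative aside, the contrast Kai-Fu draws here, exhaustively enumerating known configurations over heavily cleansed data versus inventing new algorithms that tolerate noise, can be sketched in a few lines of hypothetical Python. Every function, architecture name, and number below is made up for illustration, not taken from any system discussed in the conversation.

```python
# Illustrative only: all names and numbers here are hypothetical.
from itertools import product

def clean(records):
    """Toy data-cleansing pass: drop incomplete and duplicated rows."""
    seen, cleaned = set(), []
    for r in records:
        key = tuple(sorted(r.items()))
        if key not in seen and all(v is not None for v in r.values()):
            seen.add(key)
            cleaned.append(r)
    return cleaned

def train_and_score(architecture, learning_rate, data):
    """Stand-in for a real training run; returns a placeholder score."""
    return len(data) * learning_rate

# Enumerate configurations "to the fullest extent" on cleansed data.
raw = [{"x": 1, "y": 2}, {"x": 1, "y": 2}, {"x": None, "y": 3}]
data = clean(raw)
best = max(
    product(["cnn", "rnn", "transformer"], [1e-2, 1e-3, 1e-4]),
    key=lambda cfg: train_and_score(cfg[0], cfg[1], data),
)
print("best configuration:", best)
```

The breakthrough-oriented style Kai-Fu attributes to American engineering would instead change `train_and_score` itself, designing an algorithm that tolerates the raw, unclean data.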
00:06:39.680 | - So where's your intuition?
00:06:40.920 | Where do you think the biggest impact
00:06:42.400 | in the next 10 years lies?
00:06:44.200 | Is it in some breakthrough algorithms
00:06:46.520 | or is it in just this at-scale,
00:06:52.400 | rigorous approach to data,
00:06:55.160 | cleaning data, organizing data, on the same algorithms?
00:06:58.480 | What do you think the big impact in the applied world is?
00:07:02.640 | - Well, if you're really in the company
00:07:04.600 | and you have to deliver results,
00:07:06.920 | using known techniques and enhancing data
00:07:09.760 | seems like the more expedient approach
00:07:12.280 | that's very low risk
00:07:14.280 | and likely to generate better and better results.
00:07:17.240 | And that's why the Chinese approach has done quite well.
00:07:20.560 | Now, there are a lot of more challenging startups
00:07:24.320 | and problems such as autonomous vehicles,
00:07:28.480 | medical diagnosis that existing algorithms
00:07:31.720 | probably won't solve.
00:07:34.280 | And that would leave the Chinese approach more challenged,
00:07:38.720 | and give the breakthrough innovation approach
00:07:42.400 | more of an edge on those kinds of problems.
00:07:45.440 | - So let me talk to that a little more.
00:07:47.080 | So my intuition personally
00:07:49.600 | is that data can take us extremely far.
00:07:53.600 | So you brought up autonomous vehicles and medical diagnosis.
00:07:56.440 | So your intuition is that huge amounts of data
00:08:00.080 | might not be able to completely help us solve that problem.
00:08:04.000 | - Right, so breaking that down further,
00:08:06.680 | in autonomous vehicle,
00:08:08.060 | I think huge amounts of data probably will solve
00:08:11.280 | trucks driving on highways,
00:08:13.000 | which will deliver a significant value
00:08:15.600 | and China will probably lead in that.
00:08:18.100 | And full L5 autonomy
00:08:22.360 | is likely to require new technologies we don't yet know.
00:08:26.320 | And that might require academia
00:08:28.640 | and great industrial research,
00:08:30.360 | both innovating and working together.
00:08:32.480 | And in that case, US has an advantage.
00:08:35.360 | - So the interesting question there is,
00:08:37.200 | I don't know if you're familiar
00:08:38.240 | with the autonomous vehicle space
00:08:39.740 | and the developments with Tesla and Elon Musk.
00:08:42.660 | - I am.
00:08:43.500 | - Where they are, in fact,
00:08:46.600 | going full steam ahead
00:08:48.040 | into this mysterious, complex world of full autonomy,
00:08:52.920 | L4, L5.
00:08:55.080 | And they're trying to solve that purely with data.
00:08:58.800 | So the same kind of thing that you're saying
00:09:00.800 | is just for highway,
00:09:02.100 | which is what a lot of people share your intuition.
00:09:05.360 | They're trying to solve with data.
00:09:07.200 | So just to linger on that moment further,
00:09:09.320 | do you think it's possible for them to achieve success
00:09:13.560 | with simply just a huge amount of this training
00:09:17.000 | on edge cases and difficult cases in urban environments,
00:09:20.400 | not just highway and so on?
00:09:21.760 | - I think that will be very hard.
00:09:24.520 | One could characterize Tesla's approach
00:09:27.000 | as kind of a Chinese strength approach, right?
00:09:29.800 | Gather all the data you can
00:09:31.560 | and hope that will overcome the problems.
00:09:34.000 | But in autonomous driving,
00:09:36.080 | clearly a lot of the decisions aren't merely solved
00:09:40.400 | by aggregating data and having a feedback loop.
00:09:43.480 | There are things that are more akin to human thinking
00:09:48.000 | and how would those be integrated and built?
00:09:51.660 | There has not yet been a lot of success
00:09:53.960 | integrating human intelligence
00:09:56.400 | or call it expert systems if you will,
00:09:58.780 | even though that's a taboo word with the machine learning
00:10:02.920 | and the integration of the two types of thinking
00:10:05.600 | hasn't yet been demonstrated.
00:10:07.840 | And the question is how much can you push
00:10:09.960 | a purely machine learning approach?
00:10:12.420 | And of course, Tesla also has an additional constraint
00:10:15.520 | that they don't have all the sensors.
00:10:18.520 | I know that they think it's foolish to use LIDARs,
00:10:21.140 | but that's clearly one less very valuable
00:10:25.120 | and reliable source of input that they're forgoing,
00:10:28.920 | which may also have consequences.
00:10:32.440 | I think the advantage of course is capturing data
00:10:35.000 | no one has ever seen before.
00:10:37.000 | And in some cases such as computer vision
00:10:40.640 | and speech recognition,
00:10:42.180 | I have seen Chinese companies accumulate data
00:10:44.820 | that's not seen anywhere in the Western world
00:10:47.320 | and they have delivered superior results.
00:10:50.240 | But then speech recognition and object recognition
00:10:53.720 | are relatively suitable problems for deep learning
00:10:57.080 | and don't potentially have the need
00:11:00.760 | for the human-intelligence, analytical planning elements.
00:11:04.440 | - And the same on the speech recognition side,
00:11:06.440 | your intuition is that speech recognition
00:11:09.000 | and the machine learning approaches to speech recognition
00:11:11.480 | won't take us to a conversational system
00:11:14.560 | that can pass the Turing test,
00:11:16.000 | which is sort of maybe akin to what driving is.
00:11:20.080 | So it needs to have something more than just
00:11:22.800 | simple language understanding and simple language generation.
00:11:27.560 | - Roughly right.
00:11:28.760 | I would say that based on purely machine learning approaches
00:11:33.120 | it's hard to imagine it could lead to
00:11:36.720 | a full conversational experience across arbitrary domains,
00:11:41.720 | which is akin to L5.
00:11:44.600 | I'm a little hesitant to use the word Turing test
00:11:46.920 | because the original definition was probably too easy.
00:11:50.300 | We can probably already do that.
00:11:52.320 | - The spirit of the Turing test is what I was referring to.
00:11:55.320 | - Of course.
00:11:56.520 | - So you've had major leadership research positions
00:11:59.440 | at Apple, Microsoft, Google.
00:12:01.640 | So continuing on the discussion of the American, Russian,
00:12:05.120 | and Chinese soul and culture and so on,
00:12:07.620 | what is the culture of Silicon Valley
00:12:11.360 | in contrast to China, maybe US broadly?
00:12:16.320 | And what is the unique culture
00:12:18.760 | of each of these three major companies in your view?
00:12:22.080 | - I think in aggregate Silicon Valley companies,
00:12:25.160 | and we could probably include Microsoft in that
00:12:27.240 | even though they're not in the Valley,
00:12:29.160 | it's really: dream big, have visionary goals,
00:12:34.000 | and believe that technology will conquer all.
00:12:37.980 | And also the self-confidence and the self-entitlement
00:12:42.300 | that whatever they produce
00:12:43.600 | the whole world should use and must use.
00:12:47.280 | And those are historically important.
00:12:52.000 | I think Steve Jobs' famous quote that
00:12:56.280 | he doesn't do focus groups,
00:12:59.120 | he looks in the mirror and asks the person in the mirror,
00:13:02.400 | what do you want?
00:13:03.560 | And that really is an inspirational comment
00:13:07.040 | that says the great companies
00:13:09.000 | shouldn't just ask users what they want,
00:13:11.280 | but develop something that users will know they want
00:13:15.200 | when they see it,
00:13:16.240 | but they could never come up with themselves.
00:13:18.960 | I think that is probably the most exhilarating description
00:13:23.880 | of what the essence of Silicon Valley is,
00:13:26.560 | that this brilliant idea could cause you to build something
00:13:31.560 | that couldn't come out of focus groups or A/B tests.
00:13:35.520 | And iPhone would be an example of that.
00:13:38.020 | No one in the age of Blackberry would write down
00:13:40.560 | they want an iPhone or multi-touch.
00:13:42.800 | A browser might be another example.
00:13:44.800 | No one would say they want that in the days of FTP,
00:13:47.520 | but once they see it, they want it.
00:13:49.440 | So I think that is what Silicon Valley is best at.
00:13:53.600 | But it also came with a lot of success.
00:13:58.920 | These products became global platforms
00:14:01.960 | and there were basically no competitors anywhere.
00:14:05.080 | And that has also led to a belief
00:14:08.440 | that these are the only things that one should do,
00:14:13.240 | that companies should not tread
00:14:15.960 | on other companies' territory,
00:14:17.960 | so that a Groupon and a Yelp
00:14:22.320 | and an OpenTable and a Grubhub would each feel,
00:14:26.280 | okay, I'm not gonna do the other company's business
00:14:28.560 | because that would not be the pride of innovating
00:14:35.920 | whatever each of these four companies has innovated.
00:14:35.920 | But I think the Chinese approach
00:14:40.360 | is do whatever it takes to win.
00:14:42.720 | And it's a winner-take-all market.
00:14:45.000 | And in fact, in the internet space,
00:14:47.200 | the market leader will get predominantly all the value
00:14:50.880 | extracted out of the system.
00:14:53.120 | And the system isn't just defined as one narrow category,
00:14:59.600 | but gets broader and broader.
00:15:01.360 | So it's amazing ambition for success
00:15:06.040 | and domination of increasingly larger product categories
00:15:11.760 | leading to clear market winner status
00:15:15.080 | and the opportunity to extract tremendous value.
00:15:19.080 | And that develops a practical,
00:15:22.480 | result-oriented, ultra-ambitious,
00:15:27.440 | winner-take-all, gladiatorial mentality.
00:15:31.480 | And if what it takes is to build
00:15:35.560 | what the competitors built,
00:15:37.400 | essentially a copycat,
00:15:38.680 | that can be done without infringing laws.
00:15:41.920 | If what it takes is to satisfy a foreign country's need
00:15:46.280 | by forking the code base
00:15:47.640 | and building something that looks really ugly and different,
00:15:50.320 | they'll do it.
00:15:51.400 | So it's contrasted very sharply
00:15:54.360 | with the Silicon Valley approach.
00:15:56.240 | And I think the flexibility and the speed and execution
00:16:00.040 | has helped the Chinese approach.
00:16:01.960 | And I think the Silicon Valley approach
00:16:05.040 | is potentially challenged
00:16:08.440 | if every Chinese entrepreneur is learning
00:16:10.760 | from the whole world,
00:16:11.920 | US and China,
00:16:13.240 | and the American entrepreneurs only look internally
00:16:16.360 | and write off China as a copycat.
00:16:19.640 | And the second part of your question
00:16:21.360 | about the three companies.
00:16:23.560 | - The unique elements of the three companies, perhaps.
00:16:26.080 | - Yeah.
00:16:26.920 | I think Apple represents
00:16:30.440 | "wow the user, please the user,"
00:16:33.160 | and the essence of design and brand.
00:16:38.160 | And it's the one company
00:16:41.800 | and perhaps the only tech company
00:16:44.160 | that draws people with a strong,
00:16:48.800 | serious desire for the product
00:16:51.120 | and the willingness to pay a premium
00:16:53.600 | because of the halo effect of the brand,
00:16:56.920 | which came from the attention to detail
00:17:00.160 | and great respect for user needs.
00:17:02.920 | Microsoft represents a platform approach
00:17:07.920 | that builds giant products
00:17:11.600 | that become very strong moats
00:17:14.400 | that others can't do
00:17:16.360 | because it's well architected at the bottom level
00:17:21.360 | and the work is efficiently delegated to individuals.
00:17:26.600 | And then the whole product is built
00:17:30.480 | by adding small parts that sum together.
00:17:33.640 | So it's probably the most effective high-tech assembly line
00:17:38.440 | that builds a very difficult product
00:17:40.560 | such that the whole process of doing that
00:17:44.880 | is kind of a differentiation
00:17:49.560 | and something competitors can't easily repeat.
00:17:52.600 | - Are there elements of the Chinese approach
00:17:54.880 | and the way Microsoft went about assembling
00:17:58.400 | those little pieces and dominating,
00:18:00.880 | essentially dominating the market for a long time,
00:18:03.960 | or do you see those as distinct?
00:18:05.680 | - I think there are elements that are the same.
00:18:08.280 | I think the three American companies
00:18:10.520 | that had or have Chinese characteristics,
00:18:13.920 | and obviously as well as American characteristics,
00:18:16.120 | are Microsoft, Facebook, and Amazon.
00:18:20.440 | - Yes, that's right, Amazon.
00:18:21.800 | - Because these are companies
00:18:22.960 | that will tenaciously go after adjacent markets,
00:18:27.680 | build up strong product offering,
00:18:31.360 | and find ways to extract greater value
00:18:36.360 | from a sphere that's ever increasing.
00:18:40.000 | And they understand the value of the platforms,
00:18:43.560 | so that's the similarity.
00:18:45.640 | And then with Google, I think it's a genuinely
00:18:50.640 | value-oriented company that does have a heart and soul
00:18:57.040 | and that wants to do great things for the world
00:18:59.840 | by connecting information,
00:19:01.960 | and that has also very strong technology genes
00:19:06.960 | and wants to use technology
00:19:13.320 | and has found out-of-the-box ways to use technology
00:19:18.320 | to deliver incredible value to the end user.
00:19:23.760 | - If we can look at Google, for example,
00:19:25.280 | you mentioned heart and soul.
00:19:26.800 | There seems to be an element where Google
00:19:31.880 | is after making the world better.
00:19:34.880 | There's a more positive view.
00:19:36.560 | I mean, they used to have the slogan, "Don't be evil."
00:19:39.000 | And Facebook, a little bit more has a negative tint to it,
00:19:43.160 | at least in the perception of privacy and so on.
00:19:46.000 | Do you have a sense of how these different companies
00:19:51.000 | can achieve, because you've talked about
00:19:53.080 | how much we can make the world better
00:19:54.560 | in all these kinds of ways with AI.
00:19:56.760 | What is it about a company that can
00:19:59.400 | give it a heart and soul, gain the trust of the public,
00:20:03.240 | and just actually not be evil
00:20:06.160 | and do good for the world?
00:20:08.040 | - It's really hard, and I think Google
00:20:10.280 | has struggled with that.
00:20:12.120 | First, the "don't do evil" mantra is very dangerous
00:20:16.760 | because every employee's definition of evil is different,
00:20:20.860 | and that has led to some difficult
00:20:22.640 | employee situations for them.
00:20:25.200 | So I don't necessarily think that's a good value statement,
00:20:29.600 | but just watching the kinds of things Google
00:20:32.360 | or its parent company, Alphabet, does in new areas
00:20:36.480 | like healthcare, like eradicating mosquitoes,
00:20:40.500 | things that are really not in the business
00:20:42.380 | of an internet tech company, I think that shows
00:20:45.920 | that there is a heart and soul and desire to do good
00:20:49.680 | and willingness to put in the resources
00:20:53.960 | to do something when they see it's good,
00:20:55.920 | they will pursue it.
00:20:57.280 | That doesn't necessarily mean it has all the trust
00:21:01.480 | of the users.
00:21:02.520 | I realize while most people would view Facebook
00:21:06.400 | as the primary target of their recent unhappiness
00:21:09.800 | about Silicon Valley companies,
00:21:11.600 | many would put Google in that category,
00:21:14.080 | and some have named Google's business practices
00:21:16.800 | as predatory also.
00:21:19.840 | So it's kind of difficult to have the two parts of a body,
00:21:24.280 | the brain wants to do what it's supposed to do
00:21:27.480 | for a shareholder, maximize profit,
00:21:29.320 | and then the heart and soul wants to do good things
00:21:32.040 | that may run against what the brain wants to do.
00:21:36.160 | - So in this complex balancing that these companies
00:21:39.840 | have to do, you've mentioned that you're concerned
00:21:42.540 | about a future where too few companies
00:21:45.520 | like Google, Facebook, Amazon are controlling our data,
00:21:49.680 | are controlling too much of our digital lives.
00:21:53.400 | Can you elaborate on this concern?
00:21:55.360 | Perhaps do you have a better way forward?
00:21:57.560 | - I think I'm hardly the most vocal complainer of this.
00:22:03.640 | There are much louder complainers out there.
00:22:07.280 | I do observe that having a lot of data
00:22:11.840 | does perpetuate their strength,
00:22:14.680 | and limits competition in many spaces.
00:22:19.480 | But I also believe AI is much broader
00:22:22.520 | than the internet space.
00:22:24.120 | So the entrepreneurial opportunities still exists
00:22:27.520 | in using AI to empower financial, retail,
00:22:32.520 | manufacturing, education applications.
00:22:35.440 | So I don't think it's quite a case
00:22:37.320 | of full monopolistic dominance
00:22:39.840 | that totally stifles innovation.
00:22:44.000 | But I do believe in their areas of strength,
00:22:46.440 | it's hard to dislodge them.
00:22:48.760 | I don't know if I have a good solution.
00:22:53.360 | Probably the best solution is let the entrepreneurial
00:22:56.760 | VC ecosystem work well, and find all the places
00:23:00.680 | that can create the next Google, the next Facebook.
00:23:04.240 | So there will always be an increasing number of challengers.
00:23:08.600 | In some sense that has happened a little bit.
00:23:11.400 | You see Uber, Airbnb having emerged
00:23:14.760 | despite the strength of the big three.
00:23:17.680 | And I think China as an environment
00:23:22.440 | may be more interesting for the emergence,
00:23:25.320 | because if you look at companies between,
00:23:28.320 | let's say, $50 billion to $300 billion,
00:23:33.320 | China has produced more such companies than the US
00:23:37.600 | in the last three to four years.
00:23:39.960 | Because of the larger marketplace,
00:23:42.200 | because of the more fearless nature of the entrepreneurs,
00:23:45.800 | and the Chinese giants are just as powerful
00:23:49.600 | as American ones.
00:23:50.880 | Tencent, Alibaba are very strong,
00:23:53.000 | but ByteDance has emerged worth 75 billion,
00:23:57.080 | and Ant Financial, while it's Alibaba-affiliated,
00:24:00.160 | is nevertheless independent and worth 150 billion.
00:24:03.960 | And so I do think if we start to extend
00:24:08.360 | to traditional businesses,
00:24:10.000 | we will see very valuable companies.
00:24:12.720 | So it's probably not the case that in five or 10 years
00:24:17.720 | we'll still see the whole world
00:24:19.720 | with these five companies having such dominance.
00:24:22.720 | - So you've mentioned a couple of times
00:24:24.800 | this fascinating world of entrepreneurship in China,
00:24:29.280 | of the fearless nature of the entrepreneurs.
00:24:31.160 | So can you maybe talk a little bit about
00:24:33.280 | what it takes to be an entrepreneur in China?
00:24:35.560 | What are the strategies that are undertaken?
00:24:38.260 | What are the ways to achieve success?
00:24:41.120 | What is the dynamic of VC funding,
00:24:43.960 | of the way the government helps companies and so on?
00:24:46.480 | What are the interesting aspects here
00:24:47.880 | that are distinct from,
00:24:49.520 | that are different from the Silicon Valley
00:24:52.360 | world of entrepreneurship?
00:24:54.220 | - Well, many of the listeners probably still
00:24:59.400 | would brand Chinese entrepreneurs as copycats.
00:25:03.000 | And no doubt 10 years ago,
00:25:05.380 | that would not be an inaccurate description.
00:25:09.100 | Back 10 years ago,
00:25:10.760 | an entrepreneur probably could not get funding
00:25:13.660 | if he or she could not describe what product
00:25:17.540 | he or she is copying from the US.
00:25:19.560 | The first question is who has proven this business model?
00:25:23.400 | Which is a nice way of asking, who are you copying?
00:25:27.180 | And that reason is understandable
00:25:29.560 | because China had a much lower internet penetration
00:25:34.880 | and didn't have enough indigenous experience
00:25:39.880 | to build innovative products.
00:25:43.260 | And secondly, internet was emerging.
00:25:47.660 | Lean startup was the way to do things,
00:25:49.900 | building a first minimally viable product
00:25:52.980 | and then expanding was the right way to go.
00:25:55.380 | And the American successes have given a shortcut
00:25:59.540 | that if you built your minimally viable product
00:26:02.620 | based on an American product,
00:26:04.260 | it's guaranteed to be a decent starting point.
00:26:06.740 | Then you tweak it afterwards.
00:26:08.460 | So as long as there is no IP infringement,
00:26:11.300 | which as far as I know,
00:26:12.300 | there hasn't been in the mobile and AI spaces,
00:26:16.700 | that's a much better shortcut.
00:26:19.420 | And I think Silicon Valley would view that
00:26:21.940 | as still not very honorable
00:26:25.180 | because that's not your own idea to start with.
00:26:29.220 | But you can't really at the same time
00:26:32.620 | believe every idea must be your own
00:26:35.180 | and believe in the lean startup methodology
00:26:38.140 | because lean startup is intended to try many, many things
00:26:41.900 | and then converge on one that works.
00:26:44.220 | And it's meant to be iterated and changed.
00:26:46.740 | So finding a decent starting point without legal violations,
00:26:51.240 | there should be nothing morally dishonorable about that.
00:26:55.540 | - So just a quick pause on that.
00:26:56.980 | It's fascinating that that's,
00:26:58.680 | why is that not honorable, right?
00:27:01.900 | It's exactly as you formulated.
00:27:03.760 | It seems like a perfect start for business
00:27:08.060 | is to take, look at Amazon and say,
00:27:12.020 | "Okay, we'll do exactly what Amazon is doing.
00:27:14.440 | "Let's start there in this particular market
00:27:16.700 | "and then let's out-innovate them from that starting point,
00:27:20.680 | "come up with new ways."
00:27:22.220 | I mean, is it wrong to be,
00:27:23.780 | except the word copycat just sounds bad,
00:27:27.140 | but is it wrong to be a copycat?
00:27:28.820 | It just seems like a smart strategy.
00:27:31.620 | But yes, it doesn't have a heroic nature to it,
00:27:35.820 | like a Steve Jobs or Elon Musk
00:27:40.820 | sort of coming up with
00:27:42.260 | something completely new.
00:27:43.900 | - Yeah, I like the way you describe it.
00:27:45.340 | It's a non-heroic, acceptable way to start the company
00:27:50.340 | and maybe more expedient.
00:27:52.860 | So that's, I think, a baggage for Silicon Valley,
00:27:57.860 | that if it doesn't let go,
00:28:00.800 | then it may limit the ultimate ceiling of the company.
00:28:05.180 | Take Snapchat as an example.
00:28:07.220 | I think, you know, Evan's brilliant.
00:28:09.860 | He built a great product,
00:28:11.540 | but he's very proud that he wants to build his own features,
00:28:15.480 | not copy others,
00:28:16.820 | while Facebook was more willing to copy his features.
00:28:21.020 | And you see what happens in the competition.
00:28:23.480 | So I think putting that handcuff on a company
00:28:27.460 | would limit its ability to reach the maximum potential.
00:28:31.560 | So back to the Chinese environment,
00:28:33.840 | copying was merely a way to learn from the American masters.
00:28:38.420 | Just like if we learn to play piano or paint,
00:28:43.420 | you start by copying.
00:28:44.560 | You don't start by innovating
00:28:45.960 | when you don't have the basic skillsets.
00:28:48.160 | So very amazingly, the Chinese entrepreneurs,
00:28:51.000 | about six years ago,
00:28:54.120 | started to branch off with these lean startups
00:28:57.820 | built on American ideas
00:28:59.500 | to build better products than American products.
00:29:02.260 | But they did start from the American idea.
00:29:04.940 | And today, WeChat is better than WhatsApp.
00:29:08.580 | Weibo is better than Twitter.
00:29:10.500 | Zhihu is better than Quora, and so on.
00:29:12.900 | So that, I think, is Chinese entrepreneurs
00:29:16.980 | going to step two.
00:29:18.500 | And then step three is once these entrepreneurs
00:29:21.540 | have done one or two of these companies,
00:29:23.700 | they now look at the Chinese market and the opportunities
00:29:27.380 | and come up with ideas that didn't exist elsewhere.
00:29:30.580 | So products like Ant Financial,
00:29:34.180 | which includes Alipay, which is mobile payments,
00:29:38.260 | and also the financial products for loans built on that,
00:29:43.260 | and also in education, VIPKID,
00:29:47.520 | and in video social network,
00:29:53.600 | TikTok, and in social e-commerce, Pinduoduo,
00:29:58.600 | and then in bike-sharing, Mobike.
00:30:01.740 | These are all Chinese innovated products
00:30:05.640 | that now are being copied elsewhere.
00:30:08.720 | So, and an additional interesting observation
00:30:13.040 | is some of these products
00:30:14.280 | are built on unique Chinese demographics,
00:30:17.280 | which may not work in the US,
00:30:19.400 | but may work very well in Southeast Asia, Africa,
00:30:23.160 | and other developing worlds
00:30:25.200 | that are a few years behind China.
00:30:27.840 | And a few of these products maybe are universal
00:30:31.060 | and are getting traction even in the United States,
00:30:33.760 | such as TikTok.
00:30:35.360 | So this whole ecosystem is supported by VCs
00:30:40.360 | as a virtuous cycle,
00:30:43.520 | because a large market with innovative entrepreneurs
00:30:47.680 | will draw a lot of money
00:30:49.440 | and then invest in these companies.
00:30:51.560 | As the market gets larger and larger,
00:30:53.800 | (the China market is easily three,
00:30:56.120 | four times larger than the US market),
00:30:58.440 | they will create greater value
00:30:59.800 | and greater returns for the VCs,
00:31:02.280 | thereby raising even more money.
00:31:05.420 | So at Sinovation Ventures, our first fund was 15 million.
00:31:10.000 | Our last fund was 500 million.
00:31:12.040 | So it reflects the valuation of the companies
00:31:16.560 | and us going multi-stage and things like that.
00:31:19.840 | It also has government support,
00:31:22.020 | but not in the way most Americans would think of it.
00:31:26.120 | The government actually leaves the entrepreneurial space
00:31:29.520 | to private enterprise, so it's self-regulating,
00:31:33.240 | and the government would build infrastructure
00:31:36.240 | around it to make it work better.
00:31:39.320 | For example, the Mass Entrepreneur, Mass Innovation Plan
00:31:42.960 | built 8,000 incubators.
00:31:44.920 | So the pipeline is very strong to the VCs.
00:31:48.400 | For autonomous vehicles, the Chinese government
00:31:50.800 | is building smart highways with sensors,
00:31:54.360 | smart cities that separate pedestrians from cars
00:31:57.920 | that may allow an initially inferior
00:32:01.120 | autonomous vehicle company to launch a car
00:32:04.280 | with lower casualties,
00:32:07.680 | because the roads or the city is smart.
00:32:11.560 | And the Chinese government at local levels
00:32:13.840 | would have these guiding funds acting as LPs,
00:32:17.400 | passive LPs to funds.
00:32:19.440 | And when the fund makes money,
00:32:22.040 | part of the money made is given back to the GPs
00:32:25.280 | and potentially other LPs to increase everybody's return
00:32:30.280 | at the expense of the government's return.
00:32:33.720 | So that's an interesting incentive
00:32:36.400 | that entrusts the task of choosing entrepreneurs to VCs
00:32:41.400 | who are better at it than the government
00:32:43.840 | by letting some of the profits move that way.
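As a rough numeric illustration of that guiding-fund incentive, with entirely made-up percentages rather than the actual terms of any fund:

```python
# Hypothetical guiding-fund waterfall; all numbers are assumptions.
gov_share = 0.30    # government guiding fund's stake as a passive LP
profit = 80.0       # profit the fund returns (units arbitrary)

gov_pro_rata = profit * gov_share            # 24.0, before any incentive
private_pro_rata = profit * (1 - gov_share)  # 56.0

# The government forgoes, say, half of its profit, handing it to the
# GPs and other LPs to boost their returns at its own expense.
giveback = 0.5 * gov_pro_rata
gov_return = gov_pro_rata - giveback          # 12.0
private_return = private_pro_rata + giveback  # 68.0

print(f"government profit: {gov_return}, private profit: {private_return}")
```

The design point, as Kai-Fu describes it, is that the government pays for selection quality instead of doing the selecting: the upside it forgoes is what entrusts the choice of entrepreneurs to the VCs.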
00:32:46.720 | - So this is really fascinating, right?
00:32:48.760 | So I look at the Russian government as a case study
00:32:51.840 | where, let me put it this way,
00:32:54.280 | there's no such government-driven,
00:33:00.880 | large-scale support of entrepreneurship.
00:33:00.880 | And probably the same is true in the United States,
00:33:04.060 | but the entrepreneurs themselves kind of find a way.
00:33:07.680 | So maybe in a form of advice or explanation,
00:33:11.720 | how did the Chinese government arrive at being this way,
00:33:15.600 | so supportive of entrepreneurship,
00:33:18.600 | in this particular way,
00:33:20.120 | so forward-thinking at such a large scale?
00:33:23.160 | And also perhaps, how can we copy it in other countries?
00:33:27.080 | How can we encourage other governments,
00:33:29.800 | like even the United States government,
00:33:31.640 | to support infrastructure for autonomous vehicles
00:33:33.840 | in that same kind of way, perhaps?
00:33:36.100 | - Yes, so these techniques are the result
00:33:41.100 | of several key things,
00:33:44.480 | some of which may be learnable,
00:33:46.040 | some of which may be very hard.
00:33:47.640 | One is just trial and error
00:33:50.520 | and watching what everyone else is doing.
00:33:52.920 | I think it's important to be humble
00:33:54.680 | and not feel like you know all the answers.
00:33:56.920 | The guiding funds idea came from Singapore,
00:33:59.480 | which came from Israel.
00:34:01.440 | And China made a few tweaks, because
00:34:06.120 | the Chinese cities and government officials
00:34:09.320 | kind of compete with each other;
00:34:11.360 | they all want to make their city more successful
00:34:14.720 | so they can get the next level in their political career.
00:34:19.720 | And it's somewhat competitive.
00:34:22.320 | So the central government made it a bit of a competition.
00:34:25.200 | Everybody has a budget.
00:34:26.840 | They can put it on AI, or they can put it on bio,
00:34:29.840 | or they can put it on energy.
00:34:32.200 | And then whoever gets the results,
00:34:34.160 | the city shines, the people are better off,
00:34:36.180 | the mayor gets a promotion.
00:34:38.020 | So with these tools,
00:34:39.760 | it's almost like an entrepreneurial environment
00:34:42.840 | for local governments to see who can do a better job.
00:34:47.440 | And also many of them try different experiments.
00:34:52.440 | Some have given awards to very smart researchers,
00:34:57.440 | just give them money and hope they'll start a company.
00:35:00.820 | Some have given money to academic research labs,
00:35:05.820 | maybe government research labs
00:35:08.040 | to see if they can spin off some companies
00:35:10.720 | from the science lab or something like that.
00:35:14.040 | Some have tried to recruit overseas Chinese
00:35:17.080 | to come back and start companies,
00:35:19.000 | and they've had mixed results.
00:35:20.960 | The one that worked the best was the guiding funds.
00:35:23.360 | So it's almost like a lean startup idea
00:35:25.840 | where people try different things
00:35:27.560 | and what works sticks and everybody copies.
00:35:30.560 | So now every city has a guiding fund.
00:35:32.880 | So that's how that came about.
00:35:35.700 | The autonomous vehicle and the massive spending
00:35:40.400 | in highways and smart cities, that's a Chinese way.
00:35:45.400 | It's about building infrastructure to facilitate.
00:35:49.520 | It's a clear division of the government's responsibility
00:35:52.880 | from the market.
00:35:55.400 | The market should do everything in a private, free way,
00:36:00.400 | but there are things the market can't afford to do,
00:36:02.920 | like infrastructure.
00:36:04.500 | So the government always appropriates large amounts of money
00:36:09.500 | for infrastructure building.
00:36:12.000 | This happens with not only autonomous vehicle and AI,
00:36:16.880 | but happened with the 3G and 4G.
00:36:20.840 | You'll find that the Chinese wireless reception
00:36:25.360 | is better than the US because massive spending
00:36:28.600 | that tries to cover the whole country,
00:36:30.640 | whereas in the US it may be a little spotty.
00:36:34.360 | It's government-driven, because I think they view
00:36:36.880 | the coverage of cell access and 3G, 4G access
00:36:41.880 | to be governmental infrastructure spending
00:36:46.400 | as opposed to capitalistic.
00:36:49.880 | And of course, the state-owned enterprises
00:36:52.280 | are also publicly traded, but they also carry
00:36:55.920 | a government responsibility
00:36:57.760 | to deliver infrastructure to all.
00:37:00.240 | So it's a different way of thinking
00:37:01.920 | that may be very hard to inject into Western countries
00:37:05.440 | to say starting tomorrow, bandwidth infrastructure
00:37:09.320 | and highways are gonna be governmental spending
00:37:13.880 | with some characteristics.
00:37:16.120 | - What's your sense, and sorry to interrupt,
00:37:18.280 | but because it's such a fascinating point,
00:37:20.480 | do you think on the autonomous vehicle space
00:37:24.120 | it's possible to solve the problem of full autonomy
00:37:30.160 | without significant investment in infrastructure?
00:37:33.080 | - Well, that's really hard to speculate.
00:37:36.440 | I think it's not a yes, no question,
00:37:39.000 | but how long does it take question?
00:37:40.800 | 15 years, 30 years, 45 years.
00:37:45.200 | Clearly with infrastructure augmentation,
00:37:49.040 | whether it's road, the city, or whole city planning,
00:37:52.360 | building a new city, I'm sure that will accelerate
00:37:56.500 | the day of the L5.
00:37:59.680 | I'm not knowledgeable enough,
00:38:01.320 | and it's hard to predict even when we're knowledgeable
00:38:03.960 | because a lot of it is speculative.
00:38:06.100 | But in the US, I don't think people would consider
00:38:10.480 | building a new city the size of Chicago
00:38:13.280 | to make it the AI/autonomous city.
00:38:16.000 | There are smaller ones being built, I'm aware of that.
00:38:18.920 | But is infrastructure spend really impossible
00:38:22.080 | for US or Western countries?
00:38:23.760 | I don't think so.
00:38:25.800 | The US highway system was built.
00:38:29.000 | Was that during President Eisenhower or Kennedy?
00:38:31.920 | - I think so, Eisenhower, yeah.
00:38:33.200 | - So maybe historians can study
00:38:37.280 | how did President Eisenhower get the resources
00:38:40.320 | to build this massive infrastructure
00:38:42.840 | that surely gave US a tremendous amount of prosperity
00:38:47.620 | over the next decade, if not century.
00:38:50.840 | - If I may comment on that then,
00:38:53.120 | it takes us to artificial intelligence a little bit
00:38:55.480 | because in order to build infrastructure,
00:38:58.120 | it creates a lot of jobs.
00:39:00.520 | So I'd actually be interested in what you would say.
00:39:03.400 | You talk in your book about all kinds of jobs
00:39:06.560 | that could and could not be automated.
00:39:08.920 | I wonder if building infrastructure is one of the jobs
00:39:13.080 | that would not be easily automated.
00:39:15.760 | Something you can think about
00:39:17.040 | because I think you've mentioned somewhere, in a talk,
00:39:19.920 | that there might be, as jobs are being automated,
00:39:24.280 | a role for government to create jobs that can't be automated.
00:39:28.160 | - Yes, I think that's a possibility.
00:39:30.200 | Back in the last financial crisis,
00:39:34.280 | China put a lot of money
00:39:37.840 | to basically give this economy a boost.
00:39:41.200 | And a lot of it went into infrastructure building.
00:39:44.600 | And I think that's a legitimate way at a government level
00:39:49.920 | to deal with the employment issues
00:39:54.360 | as well as build out the infrastructure.
00:39:57.040 | As long as the infrastructures are truly needed
00:39:59.840 | and as long as there is an employment problem,
00:40:02.200 | which, for now, we don't know.
00:40:04.000 | - So maybe taking a little step back,
00:40:07.840 | you've been a leader and a researcher in AI
00:40:12.840 | for several decades, at least 30 years.
00:40:16.200 | So how has AI changed in the West and the East
00:40:21.000 | as you've observed, as you've been deep in it
00:40:23.080 | over the past 30 years?
00:40:25.100 | - Well, AI began as the pursuit
00:40:27.840 | of understanding human intelligence.
00:40:30.280 | And the term itself represents that.
00:40:34.160 | But it kind of drifted into the one sub area
00:40:37.500 | that worked extremely well, which is machine intelligence.
00:40:40.840 | And that's actually more about using
00:40:43.080 | pattern recognition techniques to basically
00:40:47.580 | do incredibly well on a limited domain
00:40:51.280 | with a large amount of data,
00:40:52.760 | but relatively simple kinds of planning tasks
00:40:57.120 | and not very creative.
00:40:58.680 | So we didn't end up building human intelligence.
00:41:02.440 | We built a different machine
00:41:04.560 | that was a lot better than us on some problems,
00:41:08.080 | but nowhere close to us on other problems.
00:41:11.700 | So today, I think a lot of people still misunderstand
00:41:15.200 | when we say artificial intelligence
00:41:18.040 | and what various products can do,
00:41:20.720 | people still think it's about
00:41:22.360 | replicating human intelligence.
00:41:24.200 | But the products out there really are closer
00:41:27.600 | to having invented the internet or the spreadsheet
00:41:31.680 | or the database and getting broader adoption.
00:41:34.460 | - And speaking further to the fears,
00:41:37.620 | near-term fears that people have about AI,
00:41:40.340 | so you're commenting on the sort of
00:41:42.080 | the general intelligence that people in the popular culture
00:41:46.940 | from sci-fi movies have a sense about AI,
00:41:49.560 | but there's practical fears about AI,
00:41:51.960 | the kind, the narrow AI that you're talking about
00:41:54.820 | of automating particular kinds of jobs.
00:41:57.280 | And you talk about them in the book.
00:41:59.400 | So what are the kinds of jobs in your view
00:42:01.520 | that you see in the next five, 10 years
00:42:04.440 | beginning to be automated by AI systems, algorithms?
00:42:09.240 | - Yes, this is also maybe a little bit counterintuitive
00:42:13.020 | because it's the routine jobs
00:42:15.260 | that will be displaced the soonest.
00:42:18.360 | And they may not be displaced entirely,
00:42:20.860 | maybe 50%, 80% of a job,
00:42:24.080 | but when the workload drops by that much,
00:42:26.300 | employment will come down.
00:42:28.740 | And also another part of misunderstanding
00:42:31.540 | is that when most people think of AI replacing routine jobs,
00:42:35.740 | they think of the assembly line, the workers.
00:42:38.760 | Well, that will have some effects,
00:42:40.980 | but it's actually the routine white collar workers
00:42:44.260 | that's easiest to replace
00:42:46.180 | because to replace a white collar worker,
00:42:49.280 | you just need software.
00:42:50.740 | To replace a blue collar worker,
00:42:53.100 | you need robotics, mechanical excellence,
00:42:57.180 | and the ability to deal with dexterity
00:43:01.900 | and maybe even unknown environments, very, very difficult.
00:43:05.620 | So if we were to categorize
00:43:08.360 | the most dangerous white collar jobs,
00:43:12.400 | they would be things like back office,
00:43:15.620 | people who copy and paste
00:43:17.920 | and deal with simple computer programs and data
00:43:22.200 | and maybe paper and OCR.
00:43:25.600 | And they don't make strategic decisions.
00:43:29.040 | They basically facilitate the process
00:43:32.040 | where the software and paper systems don't work together.
00:43:34.600 | So you have people dealing with new employee orientation,
00:43:39.380 | searching for past lawsuits and financial documents
00:43:44.220 | and doing reference checks.
00:43:47.920 | - So basic searching and management of data.
00:43:50.440 | - Of data.
00:43:51.280 | - That's the most in danger of being lost.
00:43:52.800 | - In addition to the white collar repetitive work,
00:43:56.440 | a lot of simple interaction work can also be taken care of,
00:44:00.240 | such as telesales, telemarketing, customer service,
00:44:04.860 | as well as many physical jobs that are in the same location
00:44:09.540 | and don't require a high degree of dexterity.
00:44:12.220 | So fruit picking, dish washing, assembly line,
00:44:16.700 | inspection are jobs in that category.
00:44:20.340 | So altogether, back office is a big part.
00:44:24.460 | And the blue collar may be smaller initially,
00:44:29.840 | but over time, AI will get better.
00:44:32.540 | And when we start to get to over the next 15, 20 years,
00:44:36.860 | the ability to actually have the dexterity
00:44:39.100 | of doing assembly line, that's a huge chunk of jobs.
00:44:42.580 | And when autonomous vehicles start to work,
00:44:45.600 | initially starting with truck drivers,
00:44:47.440 | but eventually to all drivers,
00:44:49.360 | that's another huge group of workers.
00:44:52.020 | So I see modest numbers in the next five years,
00:44:55.580 | but increasing rapidly after that.
00:44:58.060 | - On the worry of the jobs that are in danger
00:45:01.260 | and the gradual loss of jobs,
00:45:03.080 | I'm not sure if you're familiar with Andrew Yang.
00:45:06.660 | - Yes, I am.
00:45:07.820 | - So there's a candidate for president of the United States
00:45:10.580 | whose platform, Andrew Yang, is based around,
00:45:14.380 | in part around job loss due to automation.
00:45:17.700 | And also in addition, the need perhaps
00:45:21.100 | of universal basic income to support
00:45:24.460 | folks who lose their jobs due to automation
00:45:28.100 | and so on,
00:45:28.920 | people who cannot support themselves under a complex,
00:45:32.000 | unstable job market.
00:45:34.360 | So what are your thoughts about his concerns,
00:45:36.760 | him as a candidate, his ideas in general?
00:45:39.140 | - I think his thinking is generally in the right direction,
00:45:43.440 | but his approach as a presidential candidate
00:45:48.480 | may be a little bit ahead of the time.
00:45:51.040 | I think the displacements will happen,
00:45:55.220 | but will they happen soon enough
00:45:57.880 | for people to agree to vote for him?
00:46:00.560 | The unemployment numbers are not very high yet.
00:46:03.840 | And I think he and I have the same challenge.
00:46:07.680 | If I want to theoretically convince people this is an issue
00:46:11.640 | and he wants to become the president,
00:46:13.980 | people have to see how can this be the case
00:46:17.600 | when unemployment numbers are low?
00:46:19.760 | So that is the challenge.
00:46:21.480 | And I do agree with him
00:46:25.280 | on the displacement issue. On universal basic income,
00:46:29.040 | at a very vanilla level,
00:46:32.360 | I don't agree with it,
00:46:33.960 | because I think the main issue is retraining.
00:46:38.380 | So people need to be incented
00:46:41.580 | not by just giving a monthly $2,000 check
00:46:44.280 | or $1,000 check to do whatever they want,
00:46:47.220 | because they don't have the know-how
00:46:51.000 | to know what to retrain
00:46:54.100 | to go into what type of a job.
00:46:56.920 | And guidance is needed.
00:46:58.720 | And retraining is needed
00:47:00.520 | because historically when technology revolutions,
00:47:03.220 | when routine jobs were displaced,
00:47:05.120 | new routine jobs came up.
00:47:06.960 | So there was always room for that.
00:47:09.460 | But with AI and automation,
00:47:11.900 | the whole point is replacing all routine jobs eventually.
00:47:15.380 | So there will be fewer and fewer routine jobs.
00:47:17.880 | And AI will create jobs,
00:47:20.340 | but it won't create routine jobs
00:47:22.700 | because if it creates routine jobs,
00:47:24.900 | why wouldn't AI just do it?
00:47:26.940 | So therefore the people who are losing the jobs
00:47:30.400 | are losing routine jobs.
00:47:32.340 | The jobs that are becoming available are non-routine jobs.
00:47:35.740 | So the social stipend that needs to be put in place
00:47:39.360 | is for the routine workers who lost their jobs
00:47:42.100 | to be retrained maybe in six months, maybe in three years.
00:47:46.180 | It takes a while to retrain on a non-routine job
00:47:48.620 | and then take on a job that will last
00:47:51.420 | for that person's lifetime.
00:47:53.420 | Now, having said that,
00:47:55.300 | if you look deeply into Andrew's document,
00:47:57.140 | he does cater for that.
00:47:58.240 | So I'm not disagreeing with what he's trying to do.
00:48:03.240 | But for simplification, sometimes he just says UBI,
00:48:06.340 | but simple UBI wouldn't work.
00:48:08.740 | - And I think you've mentioned elsewhere that,
00:48:10.900 | I mean, the goal isn't necessarily
00:48:13.780 | to give people enough money to survive or live
00:48:17.300 | or even to prosper.
00:48:19.100 | The point is to give them a job
00:48:21.320 | that gives them meaning.
00:48:22.780 | That meaning is extremely important.
00:48:24.580 | That our employment, at least in the United States,
00:48:28.580 | and perhaps it carries across the world,
00:48:31.180 | provides something that's,
00:48:33.500 | forgive me for saying, greater than money.
00:48:35.620 | It provides meaning.
00:48:36.920 | So now, what kind of jobs do you think can't be automated?
00:48:43.380 | You talk a little bit about creativity
00:48:46.580 | and compassion in your book.
00:48:48.140 | What aspects do you think it's difficult
00:48:50.660 | to automate for an AI system?
00:48:52.300 | - Because an AI system is currently merely optimizing.
00:48:58.040 | It's not able to reason, plan,
00:49:00.780 | or think creatively or strategically.
00:49:03.620 | It's not able to deal with complex problems.
00:49:05.980 | It can't come up with a new problem and solve it.
00:49:10.300 | A human needs to find the problem
00:49:13.040 | and pose it as an optimization problem,
00:49:16.340 | then have the AI work at it.
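As a minimal sketch of that division of labor, with hypothetical names and numbers: the human supplies the objective, and the machine only searches.

```python
# Minimal sketch of "a human poses the problem, the AI optimizes":
# the objective function encodes the human's framing of what "good"
# means; the search itself is the part the machine does well.
def objective(x: float) -> float:
    return (x - 3.0) ** 2  # a human decided this is the thing to minimize

# A crude grid search standing in for the optimizer.
best_x = min((i / 100.0 for i in range(-1000, 1001)), key=objective)
print(best_x)  # ~3.0
```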
00:49:18.260 | So an AI would have a very hard time
00:49:22.100 | discovering a new drug,
00:49:23.940 | or discovering a new style of painting,
00:49:26.980 | or dealing with complex tasks,
00:49:31.300 | such as managing a company,
00:49:32.940 | that isn't just about optimizing the bottom line,
00:49:35.740 | but also about employee satisfaction,
00:49:38.700 | corporate brand, and many, many other things.
00:49:41.220 | So that is one category of things.
00:49:44.740 | And because these things are challenging,
00:49:46.900 | creative, complex,
00:49:48.820 | doing them creates a high degree of satisfaction,
00:49:52.540 | and therefore appealing to our desire for working,
00:49:55.940 | which isn't just to make the money,
00:49:57.900 | make the ends meet,
00:49:58.860 | but also that we've accomplished something
00:50:00.980 | that others maybe can't do or can't do as well.
00:50:03.700 | Another type of job that is much more numerous
00:50:07.680 | would be compassionate jobs,
00:50:09.820 | jobs that require compassion, empathy,
00:50:12.380 | human touch, human trust.
00:50:15.020 | AI can't do that because AI is cold, calculating,
00:50:19.180 | and even if it can fake that to some extent,
00:50:23.260 | it will make errors,
00:50:24.380 | and that will make it look very silly.
00:50:26.540 | And also I think even if AI did okay,
00:50:29.700 | people would want to interact with another person,
00:50:33.380 | whether it's for some kind of a service,
00:50:36.260 | or a teacher, or a doctor,
00:50:38.660 | or a concierge, or a masseuse, or a bartender.
00:50:42.220 | There are so many jobs where people
00:50:45.020 | just don't want to interact with a cold robot or software.
00:50:49.460 | I've had an entrepreneur who built an elderly care robot,
00:50:53.860 | and they found that the elderly
00:50:55.620 | really only use it for customer service,
00:50:57.980 | not to service the product;
00:51:00.820 | they click on customer service,
00:51:03.200 | and the video of a person comes up,
00:51:05.500 | and then the person says,
00:51:06.760 | "How come my daughter didn't call me?
00:51:09.460 | "Let me show you a picture of her grandkids."
00:51:11.600 | So people yearn for that people-people interaction.
00:51:15.340 | So even if robots improved, people just don't want it.
00:51:19.180 | And those jobs are going to be increasing
00:51:21.540 | because AI will create a lot of value,
00:51:24.140 | $16 trillion to the world in the next 11 years,
00:51:27.540 | according to PwC,
00:51:29.220 | and that will give people money to enjoy services,
00:51:34.220 | whether it's eating a gourmet meal,
00:51:37.420 | or tourism and traveling,
00:51:39.660 | or having concierge services.
00:51:41.440 | The services revolving around
00:51:44.900 | every dollar of that $16 trillion will be tremendous.
00:51:48.020 | It will create many more opportunities
00:51:50.040 | to serve the people who did well through AI.
00:51:56.200 | But at the same time,
00:51:58.240 | the entire society is very much short
00:52:01.680 | of people in many service-oriented,
00:52:04.200 | compassion-oriented jobs.
00:52:06.140 | The best example is probably in healthcare services.
00:52:10.320 | There's going to be 2 million new jobs,
00:52:14.280 | not counting replacement,
00:52:15.360 | just brand new incremental jobs in the next six years
00:52:18.940 | in healthcare services.
00:52:20.520 | That includes nurses, orderlies in the hospital,
00:52:24.960 | elderly care,
00:52:26.740 | and at-home care, which is particularly lacking.
00:52:31.440 | And those jobs are not likely to be filled.
00:52:34.800 | So there's likely to be a shortage.
00:52:36.800 | And the reason they're not filled
00:52:38.580 | is simply because they don't pay very well,
00:52:41.560 | and that the social status of these jobs is not very good.
00:52:46.560 | So they pay about half as much
00:52:49.660 | as a heavy equipment operator,
00:52:52.160 | which will be replaced a lot sooner.
00:52:54.720 | And they pay probably comparably
00:52:57.560 | to someone on the assembly line.
00:52:59.880 | And so, if we ignore all the other issues
00:53:03.480 | and just think about satisfaction from one's job,
00:53:07.200 | someone repetitively doing the same manual action
00:53:10.560 | at an assembly line,
00:53:11.680 | that can't create a lot of job satisfaction,
00:53:14.680 | but someone taking care of a sick person
00:53:17.720 | and getting a hug and thank you
00:53:19.640 | from that person and the family,
00:53:22.040 | I think is quite satisfying.
00:53:24.700 | So if only we could fix the pay for service jobs,
00:53:28.760 | there are plenty of jobs that require some training
00:53:31.920 | or a lot of training for the people
00:53:34.400 | coming off the routine jobs to take.
00:53:37.000 | We can easily imagine someone who was maybe a cashier
00:53:42.000 | at the grocery store,
00:53:43.520 | as stores become automated,
00:53:45.680 | learning to become a nurse or an at-home caregiver.
00:53:49.220 | I also do want to point out that blue-collar jobs
00:53:52.800 | are going to stay around a bit longer.
00:53:54.780 | Some of them quite a bit longer.
00:53:58.160 | AI cannot be told, "Go clean an arbitrary home."
00:54:02.200 | That's incredibly hard.
00:54:03.920 | Arguably it's an L5 level of difficulty.
00:54:07.160 | Right?
00:54:08.000 | And then AI cannot be a good plumber,
00:54:10.080 | because a plumber is almost like a mini detective
00:54:12.840 | who has to figure out where the leak came from.
00:54:15.640 | Yet AI probably can work an assembly line
00:54:20.240 | or be an auto mechanic, and so on.
00:54:22.840 | So one has to study which blue collar jobs are going away
00:54:26.800 | and facilitate retraining for the people
00:54:29.280 | to go into the ones that won't go away
00:54:31.160 | or maybe even will increase.
00:54:33.000 | - I mean, it is fascinating that it's easier
00:54:35.040 | to build a world champion chess player
00:54:39.540 | than it is to build a mediocre plumber.
00:54:42.080 | - Yes.
00:54:42.920 | - Right.
00:54:43.740 | - Very true.
00:54:44.580 | And that is counterintuitive
00:54:46.140 | to a lot of people's understanding
00:54:47.960 | of what artificial intelligence is.
00:54:50.080 | So it sounds, I mean,
00:54:51.720 | you're painting a pretty optimistic picture
00:54:53.920 | about retraining, about the number of jobs
00:54:56.960 | and actually the meaningful nature of those jobs
00:54:59.520 | once we automate repetitive tasks.
00:55:02.040 | So overall, are you optimistic about the future
00:55:07.040 | where much of the repetitive tasks are automated?
00:55:11.600 | That there is a lot of room for humans,
00:55:13.780 | for the compassionate, for the creative input
00:55:17.320 | that only humans can provide?
00:55:20.040 | - I am optimistic if we start to take action.
00:55:23.360 | If we have no action in the next five years,
00:55:27.640 | I think it's going to be hard to deal
00:55:30.780 | with the devastating losses that will emerge.
00:55:34.200 | So if we start thinking about retraining,
00:55:37.120 | maybe with the low-hanging fruit,
00:55:39.360 | explaining to vocational schools
00:55:41.820 | why they should train more plumbers than auto mechanics,
00:55:45.760 | maybe starting with some government subsidy
00:55:49.700 | for corporations to have more training positions.
00:55:53.620 | We start to explain to people why retraining is important.
00:55:58.180 | We start to think about the future of education,
00:56:00.780 | how it needs to be tweaked for the era of AI.
00:56:04.580 | If we start to make incremental progress
00:56:06.760 | and a greater number of people understand,
00:56:08.960 | then there's no reason to think we can't deal with this
00:56:12.360 | because this technological revolution is arguably similar
00:56:15.620 | to what electricity, the industrial revolution,
00:56:18.780 | and the internet brought about.
00:56:20.420 | - Do you think there's a role for policy,
00:56:22.660 | for governments to step in to help with policy
00:56:26.020 | to create a better world?
00:56:27.980 | - Oh, absolutely.
00:56:28.980 | And governments don't have to believe
00:56:32.300 | unemployment will go up,
00:56:34.020 | and they don't have to believe automation will be this fast,
00:56:37.540 | to do something.
00:56:39.460 | Revamping vocational school would be one example.
00:56:42.540 | Another is if there's a big gap
00:56:44.760 | in healthcare service employment
00:56:47.500 | and we know that a country's population is growing older,
00:56:51.820 | with more longevity, people living longer,
00:56:54.260 | because people over 80 require five times as much care
00:56:57.580 | as those under 80,
00:56:59.660 | then it is a good time to incent training programs
00:57:03.460 | for elderly care, to find ways to improve the pay.
00:57:07.540 | Maybe one way would be to offer as part of Medicare
00:57:11.680 | or the equivalent program for people over 80
00:57:14.540 | to be entitled to a few hours of elderly care at home.
00:57:18.740 | And then that might be reimbursable
00:57:22.100 | and that will stimulate the service industry
00:57:26.700 | around the policy.
00:57:28.880 | - Do you have concerns about large entities,
00:57:33.340 | whether it's governments or companies
00:57:35.420 | controlling the future of AI development in general?
00:57:39.180 | So we talked about companies,
00:57:40.940 | do you have a better sense that governments
00:57:44.340 | can better represent the interest of the people
00:57:49.340 | than companies, or do you believe companies
00:57:52.300 | are better at representing the interest of the people?
00:57:54.900 | Or is there no easy answer?
00:57:56.780 | - I don't think there's an easy answer
00:57:58.060 | because it's a double-edged sword.
00:58:00.140 | The companies and governments can provide better services
00:58:03.740 | with more access to data and more access to AI,
00:58:06.740 | but that also leads to greater power,
00:58:09.420 | which can lead to uncontrollable problems,
00:58:13.560 | whether it's monopoly or corruption in the government.
00:58:17.740 | So I think one has to be careful
00:58:21.380 | to look at how much data that companies and governments have
00:58:25.020 | and some kind of checks and balances would be helpful.
00:58:29.440 | - So again, I come from Russia.
00:58:34.060 | There's something called the Cold War.
00:58:36.020 | So let me ask a difficult question here,
00:58:39.300 | looking at conflict.
00:58:40.780 | As Steven Pinker has written in a great book,
00:58:42.180 | conflict all over the world is decreasing in general,
00:58:45.460 | but do you have a sense that having written the book,
00:58:50.180 | AI Superpowers, do you see a major international conflict
00:58:54.500 | potentially arising between major nations,
00:58:57.880 | whatever they are, whether it's Russia, China,
00:59:00.380 | European nations, the United States, or others,
00:59:04.180 | in the next 10, 20, 50 years around AI,
00:59:07.740 | around the digital space, cyberspace,
00:59:10.240 | do you worry about that?
00:59:12.160 | Is there something, is that something we need to think about
00:59:15.580 | and try to alleviate or prevent?
00:59:18.680 | - I believe in greater engagement.
00:59:22.740 | A lot of the worries about more powerful AI
00:59:26.820 | are based on an arms race metaphor.
00:59:31.820 | And when you extrapolate into military kinds of scenarios,
00:59:39.300 | AI can power autonomous weapons
00:59:44.300 | that need to be controlled somehow.
00:59:47.380 | And autonomous decision-making can leave not enough time
00:59:52.380 | to fix international crises.
00:59:56.180 | So I actually believe a Cold War mentality
00:59:59.460 | would be very dangerous,
01:00:01.100 | because should two countries rely on AI
01:00:04.220 | to make certain decisions,
01:00:06.060 | and they don't even talk to each other,
01:00:08.620 | they do their own scenario planning,
01:00:10.780 | then something could easily go wrong.
01:00:13.740 | I think engagement, interaction, some protocols
01:00:17.780 | to avoid inadvertent disasters are actually needed.
01:00:23.660 | So it's natural for each country to want to be the best,
01:00:27.820 | whether it's in nuclear technologies or AI or bio,
01:00:33.540 | but I think it's important to realize
01:00:37.740 | if each country has a black-box AI
01:00:41.380 | and they don't talk to each other,
01:00:43.300 | that probably presents greater challenges to humanity
01:00:48.300 | than if they interacted.
01:00:51.460 | I think there can still be competition,
01:00:53.760 | but with some degree of protocol for interaction,
01:00:57.080 | just like when there was a nuclear competition,
01:01:02.180 | there were some protocols for deterrence
01:01:04.900 | among the US, Russia, and China.
01:01:08.020 | And I think that engagement is needed.
01:01:10.900 | So of course, we're still far from AI
01:01:13.540 | presenting that kind of danger.
01:01:16.040 | But what I worry the most about
01:01:18.460 | is the level of engagement seems to be coming down.
01:01:23.020 | The level of distrust seems to be going up,
01:01:26.380 | especially from the US towards other large countries,
01:01:30.260 | such as China, and of course-
01:01:31.860 | - Russia. - And Russia, yes.
01:01:33.460 | - Is there a way to make that better?
01:01:34.740 | So that's beautifully put, level of engagement,
01:01:37.260 | and even just a basic trust and communication,
01:01:40.700 | as opposed to sort of,
01:01:43.400 | making artificial enemies out of particular countries.
01:01:51.140 | Do you have a sense how we can make it better?
01:01:57.180 | Actionable items that as a society we can take on?
01:02:01.740 | - I'm not an expert at geopolitics,
01:02:04.980 | but I would say that we look pretty foolish as humankind
01:02:09.980 | when we are faced with the opportunity
01:02:13.180 | to create $16 trillion for humanity,
01:02:18.180 | and yet we're not solving fundamental problems
01:02:24.500 | with parts of the world still in poverty.
01:02:29.480 | And for the first time, we have the resources
01:02:32.380 | to overcome poverty and hunger.
01:02:34.560 | We're not using it on that,
01:02:35.940 | but we're fueling competition among superpowers,
01:02:38.780 | and that's a very unfortunate thing.
01:02:41.820 | If we become utopian for a moment,
01:02:44.720 | imagine a benevolent world government
01:02:49.720 | that has this $16 trillion,
01:02:53.740 | and maybe some AI to figure out how to use it
01:02:57.640 | to deal with diseases and problems and hate
01:03:01.580 | and things like that.
01:03:02.620 | World would be a lot better off.
01:03:04.820 | So what is wrong with the current world?
01:03:07.600 | I think the people with more skill
01:03:09.780 | than I should think about this.
01:03:13.920 | And then the geopolitics issue with superpower competition
01:03:16.900 | is one side of the issue.
01:03:19.360 | There's another side which worries me maybe even more,
01:03:24.020 | which is as the $16 trillion all gets made
01:03:28.200 | by US and China and a few of the other developed countries,
01:03:32.040 | the poorer country will get nothing
01:03:34.280 | because they don't have technology,
01:03:36.880 | and the wealth disparity and inequality will increase.
01:03:41.880 | So a poorer country with a large population
01:03:45.880 | will not only fail to benefit from the AI boom
01:03:48.560 | or other technology booms,
01:03:50.360 | but its workers,
01:03:52.400 | who previously had hoped they could do the China model
01:03:56.060 | of outsourced manufacturing, or the India model
01:03:58.860 | of outsourced processes and call centers,
01:04:02.660 | will find all those jobs gone in 10 or 15 years.
01:04:05.820 | So the individual citizen may be a net liability,
01:04:10.820 | I mean, financially speaking, to a poorer country
01:04:15.060 | and not an asset to claw itself out of poverty.
01:04:19.840 | So in that kind of situation,
01:04:22.700 | these large countries with not much tech
01:04:26.460 | are going to be facing a downward spiral,
01:04:30.140 | and it's unclear what could be done.
01:04:33.180 | And then when we look back and say
01:04:35.100 | there's $16 trillion being created,
01:04:37.660 | and it's all being kept by US, China,
01:04:39.780 | and other developed countries,
01:04:41.540 | it just doesn't feel right.
01:04:43.420 | So I hope people who know about geopolitics
01:04:46.940 | can find solutions that's beyond my expertise.
01:04:50.000 | - So different countries that we've talked about
01:04:53.200 | have different value systems.
01:04:55.240 | If you look at the United States,
01:04:56.960 | to an almost extreme degree,
01:04:59.000 | there is an absolute desire for freedom of speech.
01:05:03.440 | If you look at the country where I was raised,
01:05:05.200 | that desire just amongst the people
01:05:06.960 | is not as elevated as it is in America,
01:05:11.960 | where it's basically fundamental
01:05:15.080 | to the essence of what it means to be American, right?
01:05:17.580 | And the same is true with China.
01:05:19.180 | There's different value systems.
01:05:21.540 | There's some censorship of internet content
01:05:26.540 | that China and Russia and many other countries undertake.
01:05:31.300 | Do you see that having effects on innovation,
01:05:36.300 | other aspects of some of the tech stuff,
01:05:38.940 | AI development we talked about,
01:05:40.960 | and maybe from another angle,
01:05:42.540 | do you see that changing in different ways
01:05:46.200 | over the next 10 years, 20 years, 50 years,
01:05:49.540 | as China continues to grow as it does now
01:05:52.960 | in its tech innovation?
01:05:54.800 | - There's a common belief
01:05:57.160 | that full freedom of speech and expression
01:06:01.000 | is correlated with creativity,
01:06:02.980 | which is correlated with entrepreneurial success.
01:06:08.480 | I think empirically, we have seen that is not true,
01:06:13.160 | and China has been successful.
01:06:15.560 | That's not to say the fundamental values
01:06:18.180 | are not right or not the best,
01:06:20.820 | but it's just that that perfect correlation isn't there.
01:06:25.820 | It's hard to read the tea leaves
01:06:28.540 | on opening up or not in any country,
01:06:31.700 | and I've not been very good at that in my past predictions,
01:06:36.980 | but I do believe every country
01:06:40.820 | shares some fundamental value,
01:06:44.060 | a lot of fundamental values for the long term.
01:06:46.600 | So China is drafting its privacy policy
01:06:52.140 | for individual citizens,
01:06:57.240 | and they don't look that different
01:06:59.980 | from the American or European ones,
01:07:03.220 | so people do want to protect their privacy
01:07:07.600 | and have the opportunity to express,
01:07:10.700 | and I think the fundamental values are there.
01:07:14.180 | The question is in the execution and timing,
01:07:17.940 | how soon or when will that start to open up?
01:07:21.780 | So as long as each government knows,
01:07:25.520 | ultimately people want that kind of protection,
01:07:28.320 | there should be a plan to move towards that;
01:07:32.240 | as to when or how, again, I'm not an expert.
01:07:35.360 | - On the point of privacy, to me, it's really interesting.
01:07:39.060 | So AI needs data to create a personalized,
01:07:43.580 | awesome experience, right?
01:07:45.500 | I'm just speaking generally in terms of products,
01:07:48.540 | and then we have currently, depending on the age,
01:07:51.340 | and depending on the demographics
01:07:52.740 | of who we're talking about,
01:07:54.020 | some people are more or less concerned
01:07:55.860 | about the amount of data they hand over.
01:07:59.060 | So in your view, how do we get this balance right
01:08:04.060 | that we provide an amazing experience
01:08:07.160 | to people that use products?
01:08:09.960 | You look at Facebook, the more Facebook knows about you,
01:08:13.480 | yes, it's scary to say,
01:08:15.400 | the better the experience
01:08:18.440 | it can probably create.
01:08:21.220 | So in your view, how do we get that balance right?
01:08:25.220 | - Yes, I think a lot of people have a misunderstanding
01:08:30.220 | that it's okay and possible to just rip all the data out
01:08:35.100 | from a provider and give it back to you.
01:08:38.260 | So you can deny them access to further data
01:08:41.220 | and still enjoy the services we have.
01:08:44.100 | If we take back all the data,
01:08:46.260 | all the services will give us nonsense.
01:08:48.420 | We'll no longer be able to use products that function well
01:08:52.380 | in terms of right ranking, right products,
01:08:56.180 | right user experience.
01:08:57.920 | Yet I do understand we don't want
01:09:01.820 | to permit misuse of the data.
01:09:04.680 | From legal policy standpoint,
01:09:07.680 | I think there can be severe punishment
01:09:11.320 | for those who have egregious misuse of the data.
01:09:16.320 | That's, I think, a good first step.
01:09:19.760 | Actually, on this aspect, China
01:09:22.840 | has very strong laws about people who sell
01:09:25.680 | or give data to other companies.
01:09:28.160 | And over the past few years,
01:09:30.240 | since that law came into effect,
01:09:33.680 | it has pretty much eradicated the illegal distribution
01:09:38.680 | and sharing of data.
01:09:40.720 | Additionally,
01:09:45.560 | I think technology is often a very good way
01:09:50.240 | to solve technology misuse.
01:09:52.880 | So can we come up with new technologies
01:09:56.320 | that will let us have our cake and eat it too?
01:09:59.000 | People are looking into homomorphic encryption,
01:10:02.080 | which is letting you keep the data,
01:10:04.480 | have it encrypted and train on encrypted data.
01:10:07.480 | Of course, we haven't solved that one yet,
01:10:09.200 | but that kind of direction may be worth pursuing.
01:10:13.520 | Also federated learning,
01:10:15.280 | which would allow one hospital to train on its own
01:10:18.800 | patients' data fully, because they have a license for that.
01:10:22.520 | And then hospitals would then share their models,
01:10:24.900 | not data, but models to create a super AI.
01:10:28.160 | And that also maybe has some promise.
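As a sketch of the federated learning idea just described: each hospital fits a model on its own patients' data and shares only the weights, which are averaged into a joint model. This is a toy illustration, assuming simple federated averaging over synthetic data with plain NumPy linear models; a real deployment involves far more.

```python
import numpy as np

def local_train(weights, X, y, lr=0.01, steps=100):
    # One hospital's private training: linear regression by gradient
    # descent on data that never leaves the hospital.
    w = weights.copy()
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
# Four hypothetical hospitals, each holding private (X, y).
hospitals = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]

global_w = np.zeros(3)
for _ in range(10):
    # Only trained weights are shared; averaging them is the
    # "share models, not data" step described above.
    local_models = [local_train(global_w, X, y) for X, y in hospitals]
    global_w = np.mean(local_models, axis=0)
print("jointly trained weights:", global_w)
```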
01:10:30.640 | So I would want to encourage us to be open-minded
01:10:34.240 | and think of this as not just the policy binary, yes, no,
01:10:39.240 | but letting the technologists try to find solutions
01:10:42.880 | to let us have our cake and eat it too,
01:10:44.960 | or have most of our cake and eat most of it too.
01:10:47.560 | Finally, I think giving each end user a choice is important
01:10:53.000 | and having transparency is important.
01:10:55.500 | Also, I think that's universal,
01:10:57.520 | but the choice you give to the user
01:11:00.640 | should not be at a granular level
01:11:02.440 | that the user cannot understand.
01:11:04.840 | GDPR today causes all these pop-ups of yes, no,
01:11:09.240 | will you give this site this right
01:11:10.700 | to use this part of your data?
01:11:12.480 | I don't think any user understands
01:11:15.120 | what they're saying yes or no to.
01:11:17.120 | And I suspect most are just saying yes
01:11:19.080 | because they don't understand it.
01:11:20.880 | So while GDPR in its current implementation
01:11:25.720 | promises transparency
01:11:29.200 | and user choice, it implemented them in such a way
01:11:33.120 | that really didn't deliver the spirit of GDPR.
01:11:38.120 | It fit the letter, but not the spirit.
01:11:41.680 | So again, I think we need to think about
01:11:43.880 | is there a way to fit the spirit of GDPR
01:11:48.120 | by using some kind of technology?
01:11:50.720 | Can we have a slider?
01:11:52.440 | That's an AI trying to figure out
01:11:54.760 | how much you want to slide
01:11:56.720 | between perfect protection, security of your personal data
01:12:01.680 | versus a high degree of convenience
01:12:04.240 | with some risks of not having full privacy.
01:12:07.140 | Each user should have some preference
01:12:10.240 | and that gives you the user choice.
01:12:12.140 | But maybe we should turn the problem on its head
01:12:15.000 | and ask, can there be an AI algorithm
01:12:17.400 | that can customize this?
01:12:18.980 | Because we can understand the slider,
01:12:21.280 | but we sure cannot understand every pop-up question.
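A minimal sketch of that single-slider idea: one understandable preference expands into the granular consents that today's pop-ups ask about one by one. A fixed rule stands in here for the learned model he imagines, and the data categories and thresholds are hypothetical.

```python
# Higher threshold = more sensitive category, shared only when the
# user slides further toward convenience. All values are assumptions.
SENSITIVITY = {
    "page_ranking": 0.2,
    "purchase_history": 0.5,
    "location": 0.7,
    "health_records": 0.95,
}

def consents_from_slider(convenience: float) -> dict:
    """Share a category only if the slider exceeds its sensitivity."""
    return {cat: convenience >= t for cat, t in SENSITIVITY.items()}

print(consents_from_slider(0.6))
# {'page_ranking': True, 'purchase_history': True,
#  'location': False, 'health_records': False}
```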
01:12:25.200 | - And I think getting that right
01:12:27.360 | requires getting the balance
01:12:29.140 | between what we talked about earlier,
01:12:30.880 | which is heart and soul
01:12:32.880 | versus profit-driven decisions and strategy.
01:12:37.280 | I think from my perspective,
01:12:40.360 | the best way to make a lot of money in the long-term
01:12:43.240 | is to keep your heart and soul intact.
01:12:45.520 | I think getting that slider right in the short-term
01:12:50.800 | may feel like you'll be sacrificing profit,
01:12:54.240 | but in the long-term, you'll be getting user trust
01:12:57.760 | and providing a great experience.
01:12:59.680 | Do you share that kind of view in general?
01:13:02.180 | - Yes, absolutely.
01:13:03.400 | I sure would hope there is a way we can do long-term projects
01:13:08.400 | that really do the right thing.
01:13:12.120 | I think a lot of people who embrace GDPR,
01:13:15.360 | their heart's in the right place.
01:13:17.080 | I think they just need to figure out
01:13:19.120 | how to build a solution.
01:13:20.760 | I've heard utopians talk about solutions
01:13:23.440 | that get me excited,
01:13:24.560 | but I'm not sure how in the current funding environment
01:13:27.960 | they can get started, right?
01:13:29.400 | People talk about,
01:13:30.720 | imagine a crowdsourced data collection
01:13:35.720 | that we all trust,
01:13:37.960 | and then we have agents
01:13:40.840 | that we ask to query only that trusted agent,
01:13:45.360 | that platform,
01:13:48.880 | a trusted joint platform
01:13:51.280 | that we all believe is trustworthy,
01:13:55.120 | that can give us closed-loop
01:13:58.760 | personal suggestions
01:14:03.080 | through a new social network,
01:14:05.240 | new search engine, new e-commerce engine
01:14:07.720 | that has access to even more of our data,
01:14:10.360 | but indirectly rather than directly.
01:14:12.400 | So I think that general concept
01:14:14.600 | of licensing to some trusted engine
01:14:18.480 | and finding a way to trust that engine
01:14:20.640 | seems like a great idea,
01:14:22.320 | but if you think how long it's going to take
01:14:24.240 | to implement and tweak and develop it right,
01:14:27.720 | as well as to collect all the trusts
01:14:29.920 | and the data from the people,
01:14:31.360 | it's beyond the current cycle of venture capital.
01:14:34.960 | So how do you do that is a big question.
01:14:38.120 | - You've recently had a fight with cancer,
01:14:41.840 | stage four lymphoma,
01:14:43.280 | and on a sort of deep personal level,
01:14:48.240 | what did it feel like in the darker moments
01:14:51.080 | to face your own mortality?
01:14:54.880 | - Well, I've been a workaholic my whole life
01:14:57.920 | and I've basically worked 996,
01:15:01.440 | 9 a.m. to 9 p.m. six days a week, roughly.
01:15:04.680 | And I didn't really pay a lot of attention
01:15:07.920 | to my family, friends, and people who loved me,
01:15:10.720 | and my life revolved around optimizing for work.
01:15:14.520 | While my work was not routine,
01:15:16.960 | my optimization really made my life
01:15:23.280 | basically a very mechanical process,
01:15:25.680 | but I got a lot of highs out of it
01:15:28.400 | because of accomplishments that I thought
01:15:31.680 | were really important and dear
01:15:34.440 | and the highest priority to me.
01:15:36.760 | But when I faced mortality and the possible death
01:15:40.040 | in a matter of months,
01:15:42.000 | I suddenly realized that this really meant nothing to me,
01:15:45.640 | that I didn't feel like working for another minute,
01:15:48.600 | that if I had six months left in my life,
01:15:51.040 | I would spend it all with my loved ones
01:15:54.280 | and thanking them, giving them love back,
01:15:57.480 | and apologizing to them that I lived my life the wrong way.
01:16:01.960 | So that moment of reckoning caused me to really rethink
01:16:06.960 | why we exist in this world.
01:16:11.520 | We might be too much shaped by society
01:16:16.520 | to think that success and accomplishments
01:16:20.380 | are why we live.
01:16:22.040 | But while that can get you periodic successes
01:16:27.040 | and satisfaction, it's really in facing death
01:16:33.080 | that you see what's truly important to you.
01:16:35.840 | So as a result of going through the challenges with cancer,
01:16:40.840 | I've resolved to live a more balanced lifestyle.
01:16:45.720 | I'm now in remission, knock on wood,
01:16:48.920 | and I'm spending more time with my family.
01:16:52.440 | My wife travels with me. When my kids need me,
01:16:56.000 | I spend more time with them.
01:16:58.000 | And before, I used to prioritize everything around work.
01:17:02.680 | When I had a little bit of time,
01:17:03.960 | I would dole it out to my family.
01:17:06.240 | Now, when my family needs something, really needs something,
01:17:09.880 | I drop everything at work and go to them.
01:17:12.640 | And then in the time remaining, I allocate to work.
01:17:15.880 | But one's family is very understanding.
01:17:19.000 | It's not like they will take 50 hours a week from me.
01:17:23.160 | So I'm actually able to still work pretty hard,
01:17:27.100 | maybe 10 hours less per week.
01:17:29.280 | So I realized the most important thing in my life
01:17:32.720 | is really love and the people I love.
01:17:36.280 | And I give that the highest priority.
01:17:38.680 | It isn't the only thing I do, but when that is needed,
01:17:43.520 | I put that at the top priority.
01:17:45.860 | And I feel much better, and I feel much more balanced.
01:17:50.080 | And I think this also gives a hint
01:17:53.080 | as to a life of routine work, a life of pursuit of numbers.
01:17:58.080 | While my job was not routine, it was in pursuit of numbers,
01:18:03.600 | pursuit of, can I make more money?
01:18:05.600 | Can I fund more great companies?
01:18:08.000 | Can I raise more money?
01:18:09.560 | Can I make sure our VC is ranked higher and higher
01:18:12.680 | every year?
01:18:13.960 | This competitive nature of driving for bigger numbers
01:18:18.960 | and better numbers became an endless pursuit
01:18:24.020 | that's mechanical.
01:18:27.280 | And bigger numbers really didn't make me happier.
01:18:31.880 | And faced with death,
01:18:32.940 | I realized bigger numbers really meant nothing.
01:18:36.360 | And what was important is that people who have given
01:18:41.040 | their heart and their love to me
01:18:42.840 | deserve for me to do the same.
01:18:45.640 | - So there's deep, profound truth in that,
01:18:48.420 | that everyone should hear and internalize.
01:18:52.000 | And that's really powerful for you to say that.
01:18:55.420 | I have to ask sort of a difficult question here.
01:19:01.200 | So I've competed in sports my whole life.
01:19:05.980 | Looking at it historically, I'd like to challenge
01:19:10.980 | some aspect of that a little bit on the point of hard work.
01:19:15.220 | It feels that among the greatest,
01:19:18.900 | the most beautiful aspects
01:19:22.020 | of human nature is the ability to become obsessed,
01:19:26.340 | to become extremely passionate to the point where,
01:19:31.020 | yes, flaws are revealed,
01:19:33.740 | and to just give yourself fully to a task.
01:19:36.540 | That is, in one sense,
01:19:39.980 | you mentioned love being important.
01:19:41.540 | But in another sense, this kind of obsession,
01:19:43.980 | this pure exhibition of passion and hard work
01:19:47.940 | is truly what it means to be human.
01:19:50.340 | What lessons should we take as deeper?
01:19:53.100 | Because you've accomplished incredible things.
01:19:54.780 | You call it chasing numbers.
01:19:57.260 | But really, there's some incredible work there.
01:20:00.940 | So how do you think about that when you look back
01:20:04.940 | in your 20s, in your 30s?
01:20:07.180 | What would you do differently?
01:20:09.980 | Would you really take back
01:20:12.540 | some of the incredible hard work?
01:20:14.840 | - I would, but it's in percentages, right?
01:20:19.980 | We're both computer scientists.
01:20:22.380 | So I think when one balances one's life,
01:20:25.620 | when one is younger,
01:20:27.720 | you might give a smaller percentage to family,
01:20:31.240 | but you would still give them high priority.
01:20:33.700 | And when you get older,
01:20:34.860 | you would give a larger percentage to them
01:20:36.700 | and still the high priority.
01:20:38.500 | And when you're near retirement,
01:20:40.700 | you give most of it to them and the highest priority.
01:20:43.820 | So I think the key point is not that
01:20:47.420 | we would work 20 hours less for the whole life
01:20:50.580 | and just spend it aimlessly with the family,
01:20:53.940 | but that when the family has a need,
01:20:56.740 | when your wife is having a baby,
01:21:00.260 | when your daughter has a birthday,
01:21:03.140 | or when they're depressed,
01:21:05.180 | or when they're celebrating something,
01:21:07.540 | or when they have a get together,
01:21:09.340 | or when we have family time,
01:21:11.400 | that it's important for us to put down our phone and PC
01:21:14.860 | and be 100% with them.
01:21:18.340 | And that priority on the things that really matter
01:21:23.340 | isn't going to be so taxing that it would eliminate
01:21:28.060 | or even dramatically reduce our accomplishments.
01:21:32.020 | It might have some impact,
01:21:34.380 | but it might also have other impact
01:21:35.980 | because if you have a happier family, maybe you fight less.
01:21:39.380 | You fight less, you don't spend time
01:21:41.340 | taking care of all the aftermath of a fight.
01:21:45.260 | - That's right.
01:21:46.100 | - So it's unclear that it would take more time.
01:21:48.220 | And if it did, I'd be willing to take that reduction.
01:21:53.220 | And it's not a dramatic number,
01:21:55.380 | but it's a number that I think would give me
01:21:57.620 | a greater degree of happiness
01:22:00.100 | and knowing that I've done the right thing
01:22:03.220 | and still have plenty of hours
01:22:06.100 | to get the success that I want to get.
01:22:09.940 | - So given the many successful companies that you've launched
01:22:14.500 | and much success throughout your career,
01:22:16.700 | what advice would you give to young people today looking,
01:22:22.740 | or it doesn't have to be young,
01:22:26.820 | but people today looking to launch
01:22:28.540 | and to create the next $1 billion tech startup,
01:22:32.400 | or even AI-based startup?
01:22:34.300 | - I would suggest that people understand
01:22:39.780 | technology waves move quickly.
01:22:42.740 | What worked two years ago may not work today.
01:22:45.900 | And that is very much a case in point for AI.
01:22:49.700 | I think two years ago, or maybe three years ago,
01:22:53.220 | you certainly could say,
01:22:54.340 | I have a couple of super smart PhDs,
01:22:57.400 | and we're not sure what we're gonna do,
01:22:59.860 | but here's how we're gonna start
01:23:01.900 | and get funding for a very high valuation.
01:23:05.180 | Those days are over because AI is going
01:23:08.460 | from rocket science towards mainstream,
01:23:11.500 | not yet commodity, but more mainstream.
01:23:14.400 | So first, the creation of any company
01:23:19.380 | to a venture capitalist has to be creation
01:23:22.700 | of business value and monetary value.
01:23:26.180 | And when you have a very scarce commodity,
01:23:29.800 | VCs may be willing to accept greater uncertainty.
01:23:34.800 | But now there are many more people with the equivalent
01:23:38.580 | of a PhD from three years ago,
01:23:40.640 | because that can be learned more quickly:
01:23:43.780 | platforms are emerging,
01:23:46.060 | the cost to become an AI engineer is much lower,
01:23:49.580 | and there are many more AI engineers.
01:23:51.700 | So the market is different.
01:23:53.940 | So I would suggest someone who wants to build an AI company,
01:23:57.460 | be thinking about the normal business questions.
01:24:01.480 | What customer cases are you trying to address?
01:24:06.120 | What kind of pain are you trying to address?
01:24:08.600 | How does that translate to value?
01:24:10.680 | How will you extract value
01:24:12.920 | and get paid through what channel?
01:24:16.480 | And how much business value will get created?
01:24:19.920 | That today needs to be thought about much earlier upfront
01:24:24.520 | than it did three years ago.
01:24:26.720 | The scarcity question of AI talent has changed.
01:24:30.500 | The number of AI talent has changed.
01:24:32.820 | So now you need not just AI,
01:24:35.080 | but also understanding of the business, the customer,
01:24:39.080 | and the marketplace.
01:24:40.980 | So I also think you should have
01:24:45.940 | a more reasonable valuation expectation
01:24:50.020 | and growth expectation.
01:24:52.340 | There's gonna be more competition.
01:24:54.080 | But the good news though is that AI technologies
01:24:57.820 | are now more available in open source.
01:25:00.740 | TensorFlow, PyTorch and such tools are much easier to use.
01:25:05.740 | So you should be able to experiment
01:25:10.500 | and get results iteratively faster than before.
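As a sense of how low that barrier has become, here is a toy PyTorch experiment, start to finish, in roughly a dozen lines; the data and model are placeholders, not a recipe from the conversation.

```python
import torch
import torch.nn as nn

X = torch.randn(256, 4)                       # synthetic features
y = (X.sum(dim=1, keepdim=True) > 0).float()  # synthetic labels

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(200):        # a complete, if tiny, training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print("final loss:", loss.item())
```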
01:25:14.240 | So take more of a business mindset to this.
01:25:18.540 | Think less of this as a laboratory taken into a company,
01:25:23.600 | because we've gone beyond that stage.
01:25:26.180 | The only exception is if you truly have a breakthrough
01:25:29.780 | in some technology that really no one has,
01:25:32.380 | then the old way still works.
01:25:34.680 | But I think that's harder and harder now.
01:25:37.180 | - So I know you believe as many do
01:25:39.620 | that we're far from creating
01:25:41.940 | an artificial general intelligence system.
01:25:44.040 | But say once we do,
01:25:48.580 | and you get to ask her one question,
01:25:51.220 | what would that question be?
01:25:53.000 | (sighs)
01:25:55.000 | What is it that differentiates you and me?
01:26:00.280 | - Beautifully put.
01:26:03.060 | Kai-Fu, thank you so much for your time today.
01:26:05.780 | - Thank you.
01:26:06.620 | (upbeat music)