
Kevin Scott: Microsoft CTO | Lex Fridman Podcast #30


Chapters

0:00
1:50 Product Portfolio
21:02 Mixed Reality
44:32 Microsoft Search
44:48 Microsoft Graph
47:18 Fluid Framework
53:01 The Future of the World


00:00:00.000 | The following is a conversation with Kevin Scott,
00:00:03.440 | the CTO of Microsoft.
00:00:06.080 | Before that, he was the Senior Vice President
00:00:08.560 | of Engineering and Operations at LinkedIn.
00:00:11.080 | And before that, he oversaw mobile ads engineering at Google.
00:00:14.980 | He also has a podcast called "Behind the Tech"
00:00:18.960 | with Kevin Scott, which I'm a fan of.
00:00:21.880 | This was a fun and wide ranging conversation
00:00:24.280 | that covered many aspects of computing.
00:00:26.680 | It happened over a month ago,
00:00:28.840 | before the announcement of Microsoft's investment
00:00:30.960 | in OpenAI that a few people have asked me about.
00:00:34.440 | I'm sure there'll be one or two people in the future
00:00:38.120 | that'll talk with me about the impact of that investment.
00:00:41.380 | This is the Artificial Intelligence Podcast.
00:00:45.400 | If you enjoy it, subscribe on YouTube,
00:00:47.640 | give it five stars on iTunes, support it on Patreon,
00:00:50.920 | or simply connect with me on Twitter @lexfridman,
00:00:54.240 | spelled F-R-I-D-M-A-N.
00:00:57.660 | And I'd like to give a special thank you
00:00:59.240 | to Tom and Nalanti Bighausen
00:01:01.960 | for their support of the podcast on Patreon.
00:01:04.600 | Thanks, Tom and Nalanti.
00:01:06.080 | Hope I didn't mess up your last name too bad.
00:01:08.400 | Your support means a lot
00:01:10.520 | and inspires me to keep this series going.
00:01:13.480 | And now, here's my conversation with Kevin Scott.
00:01:18.160 | You've described yourself as a kid in a candy store
00:01:20.720 | at Microsoft because of all the interesting projects
00:01:23.000 | that are going on.
00:01:24.200 | Can you try to do the impossible task
00:01:27.980 | and give a brief whirlwind view of all the spaces
00:01:32.900 | that Microsoft is working in?
00:01:34.500 | Both research and product.
00:01:37.420 | - If you include research, it becomes even more difficult.
00:01:42.420 | So, I think broadly speaking,
00:01:47.740 | Microsoft's product portfolio includes everything
00:01:53.740 | from big cloud business, like a big set of SaaS services.
00:01:58.740 | We have sort of the original,
00:02:01.740 | or like some of what are among the original
00:02:05.580 | productivity software products that everybody uses.
00:02:09.620 | We have an operating system business.
00:02:11.220 | We have a hardware business where we make everything
00:02:14.580 | from computer mice and headphones
00:02:18.460 | to high-end personal computers and laptops.
00:02:23.540 | We have a fairly broad ranging research group
00:02:27.700 | where we have people doing everything
00:02:29.700 | from economics research.
00:02:31.900 | So, there's this really smart young economist, Glen Weyl,
00:02:36.900 | who my group works with a lot,
00:02:39.760 | who's doing this research
00:02:41.700 | on these things called radical markets.
00:02:45.140 | Like he's written an entire technical book
00:02:48.140 | about this whole notion of a radical market.
00:02:51.100 | So, the research group sort of spans from that
00:02:53.500 | to human-computer interaction to artificial intelligence.
00:02:56.820 | And we have GitHub, we have LinkedIn,
00:03:01.020 | we have a search advertising and news business,
00:03:05.800 | and like probably a bunch of stuff
00:03:07.340 | that I'm embarrassingly not recounting in this list.
00:03:11.300 | - Gaming to Xbox and so on, right?
00:03:12.860 | - Yeah, gaming for sure.
00:03:14.100 | Like I was having a super fun conversation this morning
00:03:17.940 | with Phil Spencer.
00:03:19.520 | So, when I was in college,
00:03:21.260 | there was this game that LucasArts made
00:03:25.580 | called Day of the Tentacle
00:03:27.620 | that my friends and I played forever.
00:03:30.140 | And like we're doing some interesting collaboration now
00:03:33.940 | with the folks who made Day of the Tentacle.
00:03:37.900 | And I was like completely nerding out with Tim Schafer,
00:03:40.860 | like the guy who wrote Day of the Tentacle this morning,
00:03:43.900 | just a complete fan boy,
00:03:45.820 | which, you know, sort of, it like happens a lot.
00:03:49.900 | Like, you know, Microsoft has been doing so much stuff
00:03:53.300 | at such breadth for such a long period of time
00:03:55.960 | that, you know, like being CTO,
00:03:58.840 | like most of the time my job is very, very serious.
00:04:02.140 | And sometimes like I get caught up
00:04:05.580 | in like how amazing it is to be able to have
00:04:10.580 | the conversations that I have with the people
00:04:12.740 | I get to have them with.
00:04:14.620 | - You have to pull yourself back from the sentimental.
00:04:17.020 | So what are radical markets, the economics idea?
00:04:21.620 | - So the idea with radical markets is like,
00:04:24.740 | can you come up with new market-based mechanisms to,
00:04:29.740 | you know, I think we have this,
00:04:33.780 | we're having this debate right now,
00:04:35.220 | like does capitalism work, like free markets work?
00:04:40.020 | Can the incentive structures
00:04:42.980 | that are built into these systems produce outcomes
00:04:46.340 | that are creating sort of equitably distributed benefits
00:04:51.340 | for every member of society?
00:04:53.520 | You know, and I think it's a reasonable set of questions
00:04:58.700 | to be asking.
00:04:59.540 | And so what Glen, and so like, you know,
00:05:02.100 | one mode of thought there,
00:05:03.140 | like if you have doubts that the markets
00:05:05.940 | are actually working, you can sort of like tip towards,
00:05:08.380 | like, okay, let's become more socialist
00:05:10.780 | and, you know, like have central planning
00:05:12.880 | and, you know, governments
00:05:14.220 | or some other central organization
00:05:15.780 | that's like making a bunch of decisions
00:05:18.260 | about how, you know, sort of work gets done
00:05:22.020 | and, you know, like where the, you know,
00:05:24.500 | where the investments and where the outputs
00:05:26.340 | of those investments get distributed.
00:06:28.860 | Glen's notion is like lean more
00:05:32.140 | into like the market-based mechanism.
00:05:35.780 | So like, for instance, you know,
00:05:37.860 | this is one of the more radical ideas,
00:05:39.540 | like suppose that you had a radical pricing mechanism
00:05:45.140 | for assets like real estate,
00:05:47.100 | where you were, you could be bid out of your position
00:05:52.100 | in your home, you know, for instance.
00:05:58.620 | So like if somebody came along and said, you know,
00:06:01.380 | like I can find higher economic utility
00:06:04.420 | for this piece of real estate
00:06:05.780 | that you're running your business in,
00:06:08.740 | like then like you either have to, you know,
00:06:13.060 | sort of bid to sort of stay
00:06:16.500 | or like the thing that's got the higher economic utility,
00:06:19.980 | you know, sort of takes over the asset
00:06:21.540 | which would make it very difficult
00:06:23.740 | to have the same sort of rent-seeking behaviors
00:06:27.620 | that you've got right now
00:06:29.020 | because like if you did speculative bidding,
00:06:34.020 | like you would, you very quickly
00:06:38.820 | like lose a whole lot of money.
00:06:40.460 | And so like the prices of the assets
00:06:42.420 | would be sort of like very closely indexed
00:06:45.660 | to like the value that they could produce.
00:06:49.700 | And like, because like you'd have
00:06:51.780 | this sort of real-time mechanism
00:06:53.220 | that would force you to sort of mark the value
00:06:55.340 | of the asset to the market,
00:06:56.780 | then it could be taxed appropriately.
00:06:58.580 | Like you couldn't sort of sit on this thing and say,
00:07:00.420 | oh, like this house is only worth 10,000 bucks
00:07:03.020 | when like everything around it is worth 10 million.
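Here is a minimal sketch, in Python, of the self-assessed, always-for-sale pricing mechanism described above. The class name, tax rate, and numbers are illustrative assumptions made for this transcript, not Glen Weyl's actual proposal or anything Microsoft has built.

```python
# Toy Harberger-style mechanism: owners declare a price, pay tax on that
# declaration, and must sell to anyone who bids at or above it.
from dataclasses import dataclass

TAX_RATE = 0.07  # hypothetical annual tax rate on the self-declared price


@dataclass
class SelfAssessedAsset:
    owner: str
    declared_price: float  # the owner must sell at this price if anyone bids it

    def annual_tax(self) -> float:
        # Taxing the declared price discourages under-declaring the asset's value.
        return self.declared_price * TAX_RATE

    def bid(self, bidder: str, amount: float) -> bool:
        # Any bid at or above the declared price takes the asset, so owners
        # can't sit on under-priced assets (the rent-seeking behavior above).
        if amount >= self.declared_price:
            self.owner = bidder
            self.declared_price = amount
            return True
        return False


if __name__ == "__main__":
    office = SelfAssessedAsset(owner="alice", declared_price=10_000)
    print(office.annual_tax())  # 700.0 -- cheap to hold at that declaration, but...
    office.bid("bob", 10_000)   # ...anyone can take the asset at exactly that price
    print(office.owner)         # bob
```

Declaring a low price keeps the tax low but invites a takeover bid, which is what keeps declared prices close to real value.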
00:07:06.620 | - That's really interesting.
00:07:07.460 | So it's an incentive structure
00:07:08.740 | where the prices match the value much better.
00:07:13.220 | - Yeah.
00:07:14.060 | And Glen does a much better job than I do at selling it,
00:07:16.860 | and I probably picked the world's worst example,
00:07:18.980 | you know,
00:07:19.980 | but it's intentionally provocative,
00:07:24.340 | you know, so like this whole notion,
00:07:25.820 | like I'm not sure whether I like this notion
00:07:28.980 | that like we could have a set of market mechanisms
00:07:31.180 | where I could get bid out of my property,
00:07:35.060 | you know, but, but, you know,
00:07:36.260 | like if you're thinking about something like,
00:07:38.700 | Elizabeth Warren's wealth tax, for instance,
00:07:42.460 | like you would have, I mean,
00:07:44.340 | it'd be really interesting how you would actually
00:07:47.060 | set the price on the assets.
00:07:50.100 | And like, you might have to have a mechanism like that
00:07:52.020 | if you put a tax like that in place.
00:07:54.180 | - It's really interesting that that kind of research,
00:07:56.420 | at least tangentially is touching Microsoft research.
00:07:59.820 | - Yeah.
00:08:00.660 | - That you're really thinking broadly.
00:08:02.540 | Maybe you can speak to,
00:08:05.300 | this connects to AI.
00:08:08.380 | So we have a candidate, Andrew Yang,
00:08:10.660 | who kind of talks about artificial intelligence
00:08:13.460 | and the concern that people have about,
00:08:15.980 | you know, automation's impact on society.
00:08:18.980 | And arguably Microsoft is at the cutting edge of innovation
00:08:23.340 | in all these kinds of ways.
00:08:25.020 | And so it's pushing AI forward.
00:08:27.060 | How do you think about,
00:08:28.660 | combining all our conversations together here
00:08:30.660 | with radical markets and socialism and innovation in AI
00:08:35.660 | that Microsoft is doing,
00:08:37.500 | and then Andrew Yang's worry
00:08:40.900 | that that will result in job loss
00:08:45.460 | for lower-skilled workers and so on.
00:08:46.860 | How do you think about that?
00:08:47.700 | - I think it's sort of one of the most important questions
00:08:51.140 | in technology, like maybe even in society right now
00:08:55.300 | about how is AI going to develop over the course
00:09:00.300 | of the next several decades
00:09:01.980 | and like, what's it going to be used for?
00:09:03.580 | And like, what benefits will it produce
00:09:06.540 | and what negative impacts will it produce?
00:09:08.500 | And, you know,
00:09:10.940 | who gets to steer this whole thing?
00:09:13.700 | You know, I'll say at a, at the highest level,
00:09:16.260 | one of the real joys of, of getting to do what I do
00:09:22.060 | at Microsoft is Microsoft has this heritage
00:09:25.460 | as a platform company.
00:09:27.540 | And so, you know, like Bill
00:09:29.540 | has this thing that he said, you know,
00:09:32.020 | a bunch of years ago where, you know,
00:09:33.980 | the measure of a successful platform
00:09:36.460 | is that it produces far more economic value
00:09:39.820 | for the people who build on top of the platform
00:09:41.820 | than is created for the platform owner or builder.
00:09:46.820 | And I think we have to think about AI that way.
00:09:50.900 | Like-- - As a platform.
00:09:52.260 | - Yeah, it has to, like, it has to be a platform
00:09:54.660 | that other people can use to build businesses,
00:09:58.540 | to fulfill their creative objectives,
00:10:01.260 | to be entrepreneurs, to solve problems that they have
00:10:04.660 | in their work and in their lives.
00:10:07.700 | It can't be a thing where there are a handful of companies
00:10:11.980 | sitting in a very small handful of cities geographically
00:10:16.460 | who are making all the decisions about what goes into the AI
00:10:21.460 | and then, on top of
00:10:25.780 | all this infrastructure, build all of the
00:10:28.620 | commercially valuable uses for it.
00:10:30.980 | So like, I think like that's bad from a, you know,
00:10:34.380 | sort of, you know, economics and sort of
00:10:37.340 | equitable distribution of value perspective,
00:10:39.700 | like, you know, sort of back to this whole notion of,
00:10:42.060 | you know, like, do the markets work?
00:10:44.540 | But I think it's also bad from an innovation perspective
00:10:47.580 | because like I have infinite amounts of faith
00:10:51.340 | in human beings that if you, you know,
00:10:53.860 | give folks powerful tools,
00:10:55.700 | they will go do interesting things.
00:10:58.260 | And it's more than just a few tens of thousands of people
00:11:02.300 | with the interesting tools.
00:11:03.340 | It should be millions of people with the tools.
00:11:05.380 | So it's sort of like, you know,
00:11:07.180 | you think about the steam engine
00:11:10.180 | in the late 18th century, like it was, you know,
00:11:13.740 | maybe the first large scale substitute for human labor
00:11:16.740 | that we've built like a machine.
00:11:19.100 | And, you know, in the beginning,
00:11:21.660 | when these things are getting deployed,
00:11:23.500 | the folks who got most of the value
00:11:27.020 | from the steam engines were the folks who had capital
00:11:30.140 | so they could afford to build them.
00:11:31.580 | And like, they built factories around them and businesses
00:11:34.660 | and the experts who knew how to build and maintain them.
00:11:38.620 | But access to that technology democratized over time.
00:11:42.820 | Like now, like an engine,
00:11:45.780 | it's not like a differentiated thing.
00:11:48.780 | Like there isn't one engine company
00:11:50.260 | that builds all the engines
00:11:51.500 | and all of the things that use engines
00:11:53.100 | are made by this company.
00:11:54.220 | And like, they get all the economics from all of that.
00:11:57.380 | Like, no, no, like fully democratized.
00:11:59.260 | Like they're probably, you know,
00:12:00.540 | we're sitting here in this room
00:12:02.300 | and like, even though we don't notice them,
00:12:03.660 | there are probably things, you know,
00:12:05.220 | like the MEMS gyroscopes that are in both of our phones.
00:12:09.100 | Like there's like little engines, you know,
00:12:11.460 | sort of everywhere.
00:12:13.220 | They're just a component in how we build the modern world.
00:12:16.260 | Like AI needs to get there.
00:12:17.660 | - Yeah, so that's a really powerful way to think.
00:12:20.220 | If we think of AI as a platform
00:12:22.700 | versus a tool that Microsoft owns
00:12:26.860 | as a platform that enables creation on top of it,
00:12:30.140 | that's the way to democratize it.
00:12:31.540 | That's really interesting actually.
00:12:34.220 | And Microsoft throughout its history
00:12:36.060 | has been positioned well to do that.
00:12:38.260 | - And the, you know, the tie back
00:12:39.580 | to this radical markets thing,
00:12:41.660 | like, so my team has been working with Glen
00:12:46.660 | on this and Jaron Lanier actually.
00:12:51.100 | So Jaron is the, like the sort of father of virtual reality.
00:12:56.100 | Like he's one of the most interesting human beings
00:12:59.300 | on the planet, like a sweet, sweet guy.
00:13:01.580 | And so Jaron and Glen and folks in my team
00:13:06.940 | have been working on this notion of data as labor,
00:13:10.180 | or like they call it data dignity as well.
00:13:12.940 | And so the idea is that if you, you know,
00:13:16.660 | again, going back to this, you know,
00:13:18.420 | sort of industrial analogy,
00:13:20.580 | if you think about data as the raw material
00:13:23.380 | that is consumed by the machine of AI
00:13:27.380 | in order to do useful things,
00:13:29.500 | then like we're not doing a really great job right now
00:13:34.180 | in having transparent marketplaces
00:13:36.620 | for valuing those data contributions.
00:13:39.540 | So like, and we all make them like explicitly,
00:13:42.460 | like you go to LinkedIn,
00:13:43.340 | you sort of set up your profile on LinkedIn,
00:13:45.940 | like that's an explicit contribution.
00:13:47.580 | Like, you know exactly the information
00:13:49.260 | that you're putting into the system.
00:13:50.500 | And like, you put it there because you have
00:13:52.780 | some nominal notion of like what value
00:13:55.300 | you're gonna get in return.
00:13:56.420 | But it's like only nominal,
00:13:57.500 | like you don't know exactly what value
00:13:59.420 | you're getting in return,
00:14:00.260 | like the service is free, you know,
00:14:01.820 | like it's a low amount of perceived value.
00:14:04.420 | And then you've got all this indirect contribution
00:14:06.500 | that you're making just by virtue of interacting
00:14:08.780 | with all of the technology that's in your daily life.
00:14:12.980 | And so like what Glen and Jaron
00:14:15.940 | and this data dignity team are trying to do
00:14:18.500 | is like, can we figure out a set of mechanisms
00:14:22.060 | that let us value those data contributions
00:14:25.820 | so that you could create an economy
00:14:28.060 | and like a set of controls and incentives
00:14:31.300 | that would allow people to like,
00:14:35.140 | maybe even in the limit,
00:14:36.700 | like earn part of their living
00:14:38.740 | through the data that they're creating.
00:14:40.860 | And like, you can sort of see it in explicit ways.
00:14:42.500 | There are these companies like Scale AI
00:14:45.860 | and like there are a whole bunch of them
00:14:47.860 | in China right now that are basically
00:14:50.900 | data labeling companies.
00:14:52.420 | So like you're doing supervised machine learning,
00:14:54.540 | you need lots and lots of labeled training data.
00:14:57.420 | And like those people
00:15:00.900 | who work for those companies
00:15:02.060 | are getting compensated for their data contributions
00:15:05.060 | into the system.
00:15:06.380 | And so-
00:15:07.740 | - That's easier to put a number on their contribution
00:15:10.260 | 'cause they're explicitly labeling data.
00:15:12.460 | But you're saying that we're all contributing data
00:15:14.340 | in different kinds of ways.
00:15:15.740 | And it's fascinating to start to explicitly try
00:15:19.620 | to put a number on it.
00:15:20.900 | Do you think that's possible?
00:15:22.620 | - I don't know, it's hard.
00:15:23.620 | It really is.
00:15:25.460 | Because, you know, we don't have as much transparency
00:15:30.460 | as I think we need in like how the data is getting used.
00:15:36.260 | And it's, you know, super complicated.
00:15:38.700 | Like, you know, we, you know,
00:15:40.980 | I think as technologists sort of appreciate
00:15:42.860 | like some of the subtlety there.
00:15:44.180 | It's like, you know, the data gets created
00:15:47.900 | and then, you know, it's not valuable on its own.
00:15:51.420 | Like the data exhaust that you give off
00:15:56.020 | or the, you know, the explicit data
00:15:58.500 | that I am putting into the system
00:16:01.300 | isn't super valuable atomically.
00:16:05.180 | Like it's only valuable when you sort of aggregate it
00:16:07.900 | together into, you know, sort of large numbers.
00:16:10.500 | It's true even for these like folks
00:16:11.980 | who are getting compensated for like labeling things.
00:16:14.900 | Like for supervised machine learning now,
00:16:16.500 | like you need lots of labels to train, you know,
00:16:20.140 | a model that performs well.
00:16:22.140 | And so, you know, I think that's one of the challenges.
00:16:24.500 | It's like, you know,
00:16:26.140 | how do you sort of figure out,
00:16:28.020 | because this data is getting combined in so many ways,
00:16:31.500 | like through these combinations,
00:16:33.900 | like how the value is flowing.
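One rough way to make "how the value is flowing" concrete is leave-one-out data valuation: score each contributor by how much a model degrades when their data is removed. The sketch below is a generic illustration, not any Microsoft or LinkedIn system; `train_and_score` is a stand-in for whatever training and evaluation pipeline you actually have.

```python
# Leave-one-out data valuation: a contributor's value is the drop in model
# quality when their data is excluded from training.
from typing import Callable, Dict, List, Tuple

Example = Tuple[list, int]  # (features, label)


def leave_one_out_values(
    contributions: Dict[str, List[Example]],
    train_and_score: Callable[[List[Example]], float],
) -> Dict[str, float]:
    """Value each contributor by the drop in model quality without their data."""
    everything = [ex for batch in contributions.values() for ex in batch]
    baseline = train_and_score(everything)
    values = {}
    for contributor in contributions:
        without = [ex for other, batch in contributions.items()
                   if other != contributor for ex in batch]
        values[contributor] = baseline - train_and_score(without)
    return values


if __name__ == "__main__":
    # Dummy scorer: pretend accuracy just grows with dataset size.
    dummy_score = lambda data: min(1.0, 0.5 + 0.01 * len(data))
    print(leave_one_out_values(
        {"ann": [([0], 0)] * 10, "bo": [([1], 1)] * 2}, dummy_score))
    # roughly {'ann': 0.10, 'bo': 0.02}
```

It also makes the "only valuable in aggregate" point visible: any single example's marginal value is tiny, and it only shows up once contributions are pooled.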
00:16:35.940 | - Yeah, that's fascinating.
00:16:37.540 | - Tough.
00:16:38.580 | - Yeah, and it's fascinating that you're thinking
00:16:41.420 | about this and I wasn't even going
00:16:43.540 | into this conversation expecting the breadth
00:16:46.260 | of research really that Microsoft broadly
00:16:50.020 | is thinking about, you're thinking about at Microsoft.
00:16:52.380 | So if we go back to '89 when Microsoft released Office
00:16:57.380 | or 1990 when they released Windows 3.0,
00:17:01.060 | how, in your view, I know you weren't there
00:17:06.180 | through its entire history,
00:17:08.020 | but how has the company changed in the 30 years since
00:17:11.220 | as you look at it now?
00:17:12.940 | - The good thing is it started off as a platform company,
00:17:17.180 | like it's still a platform company,
00:17:20.020 | like the parts of the business that are like thriving
00:17:22.700 | and most successful are those that are building platforms.
00:17:26.660 | Like the mission of the company now is,
00:17:29.100 | the mission's changed.
00:17:30.220 | It's like changed in a very interesting way.
00:17:32.580 | So, you know, back in '89, '90,
00:17:36.380 | like they were still on the original mission,
00:17:39.140 | which was like put a PC on every desk and in every home,
00:17:43.980 | like, and it was basically about democratizing access
00:17:47.620 | to this new personal computing technology,
00:17:50.140 | which when Bill started the company,
00:17:52.820 | integrated circuit microprocessors were a brand new thing.
00:17:57.740 | And like people were building, you know,
00:18:00.180 | homebrew computers, you know, from kits,
00:18:03.900 | like the way people build ham radios right now.
00:18:08.620 | And I think this is sort of the interesting thing
00:18:10.700 | for folks who build platforms in general,
00:18:12.900 | Bill saw the opportunity there
00:18:16.860 | and what personal computers could do.
00:18:18.740 | And it was like a, it was sort of a reach.
00:18:20.460 | Like you just sort of imagine like where things were,
00:18:23.660 | you know, when they started the company
00:18:24.860 | versus where things are now.
00:18:26.100 | Like in success, when you've democratized a platform,
00:18:29.380 | it just sort of vanishes into the platform.
00:18:31.020 | You don't pay attention to it anymore.
00:18:32.500 | Like operating systems aren't a thing anymore.
00:18:35.700 | Like they're super important, like completely critical.
00:18:38.020 | And like, you know, when you see one, you know, fail,
00:18:41.740 | like you just, you sort of understand,
00:18:43.500 | but like, you know, it's not a thing where you're,
00:18:45.300 | you're not like waiting for, you know,
00:18:47.940 | the next operating system thing
00:18:50.500 | in the same way that you were in 1995, right?
00:18:52.980 | Like in 1995, like, you know,
00:18:54.300 | we had Rolling Stones on the stage
00:18:56.020 | with the Windows 95 rollout.
00:18:57.620 | Like it was like the biggest thing in the world.
00:18:59.340 | Everybody was like lined up for it
00:19:01.060 | in the way that people used to line up for iPhone.
00:19:03.420 | But like, you know, eventually,
00:19:05.100 | and like, this isn't necessarily a bad thing.
00:19:07.180 | Like it just sort of, you know,
00:19:08.980 | the success is that it becomes ubiquitous.
00:19:12.860 | It's like everywhere, like human beings,
00:19:14.820 | when their technology becomes ubiquitous,
00:19:16.620 | they just sort of start taking it for granted.
00:19:18.220 | So the mission now that Satya rearticulated
00:19:23.220 | five plus years ago now,
00:19:25.260 | when he took over as CEO of the company,
00:19:27.300 | our mission is to empower every individual
00:19:33.500 | and every organization in the world to be more successful.
00:19:37.740 | And so, you know, again, like that's a platform mission.
00:19:43.180 | And like the way that we do it now is different.
00:19:46.340 | It's like, we have a hyperscale cloud
00:19:48.700 | that people are building their applications on top of.
00:19:51.660 | Like we have a bunch of AI infrastructure
00:19:53.700 | that people are building their AI applications on top of.
00:19:56.260 | We have, you know, we have a productivity suite of software,
00:20:02.260 | like Microsoft Dynamics, which, you know,
00:20:05.780 | some people might not think is the sexiest thing
00:20:07.420 | in the world, but it's like helping people figure out
00:20:10.060 | how to automate all of their business processes
00:20:12.740 | and workflows and to, you know,
00:20:14.740 | like help those businesses using it to like grow
00:20:18.420 | and be more successful.
00:20:19.260 | So it's a much broader vision in a way now
00:20:24.260 | than it was back then.
00:20:25.500 | Like it was sort of a very particular thing.
00:20:27.420 | And like now, like we live in this world
00:20:29.300 | where technology is so powerful
00:20:31.340 | and it's like such a basic fact of life
00:20:36.340 | that it, you know, that it both exists
00:20:39.780 | and is going to get better and better over time
00:20:42.740 | or at least more and more powerful over time.
00:20:45.980 | So like, you know, what you have to do
00:20:47.300 | as a platform player is just much bigger.
00:20:49.900 | - Right.
00:20:50.740 | There's so many directions in which you can transform.
00:20:52.620 | You didn't mention mixed reality too.
00:20:55.180 | You know, that's probably early days
00:20:59.180 | or it depends how you think of it.
00:21:00.700 | But if we think on a scale of centuries,
00:21:02.220 | it's the early days of mixed reality.
00:21:04.060 | - Oh, for sure.
00:21:04.900 | - And so with HoloLens,
00:21:08.260 | Microsoft is doing some really interesting work there.
00:21:10.620 | Do you touch that part of the effort?
00:21:13.580 | What's the thinking?
00:21:14.860 | Do you think of mixed reality as a platform too?
00:21:17.700 | - Oh, sure.
00:21:18.540 | When we look at what the platforms of the future could be,
00:21:21.340 | it's like fairly obvious that like AI is one,
00:21:23.940 | like you don't have to, I mean, like that's,
00:21:26.620 | you know, you sort of say it to like someone
00:21:29.180 | and you know, like they get it.
00:21:31.940 | But like we also think of the like mixed reality
00:21:36.300 | and quantum as like these two interesting,
00:21:39.580 | you know, potentially.
00:21:40.900 | - Quantum computing?
00:21:41.820 | - Yeah.
00:21:42.660 | - Okay, so let's get crazy then.
00:21:44.500 | So you're talking about some futuristic things here.
00:21:48.900 | Well, the mixed reality,
00:21:49.980 | Microsoft is really not even futuristic, it's here.
00:21:52.620 | - It is.
00:21:53.460 | - It's incredible stuff.
00:21:54.300 | - And look, and it's having an impact right now.
00:21:56.660 | Like one of the more interesting things
00:21:58.740 | that's happened with mixed reality
00:22:00.020 | over the past couple of years that I didn't clearly see
00:22:04.180 | is that it's become the computing device
00:22:08.420 | for doing their work, for folks
00:22:13.220 | who haven't used any computing device at all
00:22:16.060 | to do their work before.
00:22:16.980 | So technicians and service folks
00:22:19.860 | and people who are doing like machine maintenance
00:22:24.220 | on factory floors.
00:22:25.340 | So like they, you know, because they're mobile
00:22:28.780 | and like they're out in the world
00:22:30.300 | and they're working with their hands
00:22:32.340 | and, you know, sort of servicing these
00:22:34.100 | like very complicated things,
00:22:36.580 | they don't use their mobile phone
00:22:39.460 | and like they don't carry a laptop with them
00:22:41.460 | and you know, they're not tethered to a desk.
00:22:43.540 | And so mixed reality,
00:22:45.620 | like where it's getting traction right now,
00:22:47.660 | where HoloLens is selling a lot of units
00:22:50.780 | is for these sorts of applications for these workers.
00:22:54.620 | And it's become like, I mean, like the people love it.
00:22:58.060 | They're like, oh my God, like this is like for them,
00:23:01.140 | like the same sort of productivity boosts that,
00:23:03.460 | you know, like an office worker had
00:23:05.540 | when they got their first personal computer.
00:23:08.220 | - Yeah, but you did mention it's certainly obvious AI
00:23:12.100 | as a platform, but can we dig into it a little bit?
00:23:14.980 | - Sure.
00:23:15.980 | - How does AI begin to infuse some of the products
00:23:18.300 | in Microsoft?
00:23:19.500 | So currently providing training
00:23:24.100 | of, for example, neural networks in the cloud
00:23:26.740 | or providing pre-trained models
00:23:30.980 | or just even providing computing resources
00:23:35.380 | and whatever different inference that you want to do
00:23:38.100 | using neural networks.
00:23:39.940 | How do you think of AI infusing as a platform
00:23:44.500 | that Microsoft can provide?
00:23:45.900 | - Yeah, I mean, I think it's super interesting.
00:23:48.340 | It's like everywhere.
00:23:49.580 | And like we run these review meetings now
00:23:54.580 | where it's me and Satya
00:23:59.300 | and like members of Satya's leadership team
00:24:02.540 | and like a cross-functional group of folks
00:24:05.060 | across the entire company who are working
00:24:07.740 | on like either AI infrastructure
00:24:11.980 | or like have some substantial part
00:24:14.580 | of their product work using AI
00:24:19.580 | in some significant way.
00:24:22.740 | Now, the important thing to understand is like
00:24:24.580 | when you think about like how the AI is gonna manifest
00:24:28.180 | in like an experience for something
00:24:30.860 | that's gonna make it better.
00:24:31.980 | Like I think you don't want the AI-ness
00:24:36.980 | to be the first order thing.
00:24:38.940 | It's like whatever the product is
00:24:41.060 | and like the thing that is trying to help you do
00:24:43.900 | like the AI just sort of makes it better.
00:24:45.860 | And this is a gross exaggeration,
00:24:48.100 | but like people get super excited
00:24:51.260 | about like where the AI is showing up in products.
00:24:53.980 | And I'm like, do you get that excited
00:24:55.380 | about like where you're using a hash table
00:24:58.460 | like in your code?
00:24:59.820 | Like it's just another--
00:25:01.260 | - It's just a tool.
00:25:02.100 | - It's a very interesting programming tool,
00:25:03.980 | but it's sort of like it's an engineering tool.
00:25:06.340 | And so like it shows up everywhere.
00:25:09.540 | So like we've got dozens and dozens of features now
00:25:12.860 | in Office that are powered
00:25:15.220 | by like fairly sophisticated machine learning.
00:25:18.220 | Our search engine wouldn't work at all
00:25:22.180 | if you took the machine learning out of it.
00:25:24.820 | And, increasingly, you know,
00:25:28.540 | things like content moderation
00:25:31.020 | on our Xbox and xCloud platform.
00:25:36.020 | - Yeah, when you mean moderation,
00:25:38.060 | do you mean like the recommenders
00:25:39.100 | is like showing what you wanna look at next?
00:25:41.740 | - No, no, no, it's like anti-bullying stuff.
00:25:43.980 | - So the usual social network stuff
00:25:45.980 | that you have to deal with.
00:25:47.020 | - Yeah, correct.
00:25:47.860 | But it's like really, it's targeted,
00:25:50.060 | it's targeted towards a gaming audience.
00:25:52.260 | So it's like a very particular type of thing
00:25:54.740 | where, you know, the line between playful banter
00:25:59.420 | and like legitimate bullying is like a subtle one.
00:26:02.260 | And like you have to, it's sort of tough.
00:26:06.020 | Like I have--
00:26:07.500 | - I'd love to, if we could dig into it,
00:26:09.060 | 'cause you're also, you led the engineering efforts
00:26:11.660 | at LinkedIn.
00:26:13.140 | And if we look at LinkedIn as a social network,
00:26:17.620 | and if we look at the Xbox gaming as the social components,
00:26:21.740 | the very different kinds of, I imagine,
00:26:24.060 | communication going on on the two platforms, right?
00:26:26.860 | And the line in terms of bullying and so on
00:26:29.460 | is different on the two platforms.
00:26:31.460 | So how do you, I mean, it's such a fascinating
00:26:34.740 | philosophical discussion of where that line is.
00:26:37.180 | I don't think anyone knows the right answer.
00:26:39.820 | Twitter folks are under fire now, Jack at Twitter,
00:26:43.340 | for trying to find that line.
00:26:45.140 | Nobody knows what that line is.
00:26:46.940 | But how do you try to find the line for, you know,
00:26:51.940 | trying to prevent abusive behavior,
00:26:58.060 | and at the same time let people be playful
00:27:00.220 | and joke around and that kind of thing?
00:27:02.860 | - I think in a certain way, like, you know,
00:27:04.620 | if you have what I would call vertical social networks,
00:27:10.380 | it gets to be a little bit easier.
00:27:12.180 | So like, if you have a clear notion
00:27:14.420 | of like what your social network should be used for,
00:27:17.980 | or like what you are designing a community around,
00:27:22.260 | then you don't have as many dimensions
00:27:25.780 | to your sort of content safety problem
00:27:28.940 | as you do in a general purpose platform.
00:27:33.740 | I mean, so like on LinkedIn,
00:27:37.500 | like the whole social network is about
00:27:39.900 | connecting people with opportunity,
00:27:41.540 | whether it's helping them find a job
00:27:43.140 | or to, you know, sort of find mentors
00:27:46.260 | or to, you know, sort of help them
00:27:49.300 | like find their next sales lead,
00:27:52.100 | or to just sort of allow them to broadcast
00:27:56.140 | their, you know, sort of professional identity
00:27:59.380 | to their network of peers and collaborators
00:28:06.620 | and, you know, sort of professional community.
00:28:08.180 | Like that is, I mean, like in some ways,
00:28:09.820 | like that's very, very broad,
00:28:11.460 | but in other ways it's sort of, you know, it's narrow.
00:28:15.060 | And so like you can build AIs,
00:28:18.060 | like machine learning systems that are, you know,
00:28:23.340 | capable with those boundaries
00:28:25.500 | of making better automated decisions
00:28:27.980 | about like what is, you know,
00:28:29.820 | sort of inappropriate and offensive comment
00:28:31.700 | or dangerous comment or illegal content
00:28:35.180 | when you have some constraints.
00:28:37.980 | You know, same thing
00:28:40.380 | with like the gaming social network,
00:28:43.940 | so for instance, like it's about playing games,
00:28:45.780 | about having fun.
00:28:47.300 | And like the thing that you don't want to have happen
00:28:49.500 | on the platform is why bullying is such an important thing.
00:28:52.260 | Like bullying is not fun.
00:28:53.780 | So you want to do everything in your power
00:28:56.500 | to encourage that not to happen.
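One way to picture the "vertical network" point is that the same underlying toxicity score can be thresholded differently per community, because a narrower community narrows what counts as out of bounds. The sketch below is a toy illustration: `score_toxicity` is a keyword stand-in for a real trained classifier, and the thresholds are made-up numbers, not actual Xbox or LinkedIn policy.

```python
# Per-community moderation thresholds over a shared toxicity score.
COMMUNITY_THRESHOLDS = {
    "professional": 0.3,  # e.g. a careers network: very low tolerance
    "gaming": 0.7,        # e.g. a gaming chat: playful banter is expected
}


def score_toxicity(message: str) -> float:
    # Placeholder scorer: a real system would use a trained model here.
    insults = {"idiot", "loser", "trash"}
    words = message.lower().split()
    hits = sum(w.strip("!,.") in insults for w in words)
    return min(1.0, hits / max(len(words), 1) * 3)


def should_flag(message: str, community: str) -> bool:
    # Same score, different line depending on what the community is for.
    return score_toxicity(message) >= COMMUNITY_THRESHOLDS[community]


if __name__ == "__main__":
    print(should_flag("you played like trash lol", "gaming"))        # False
    print(should_flag("you played like trash lol", "professional"))  # True
```

Constraining the domain does not solve the line-drawing problem, but it shrinks the number of dimensions the classifier and the policy have to get right.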
00:28:59.420 | And yeah, but I think it's sort of a tough problem
00:29:03.460 | in general, and it's one where I think, you know,
00:29:05.300 | eventually we're going to have to have
00:29:07.220 | some sort of clarification from our policy makers
00:29:13.860 | about what it is that we should be doing,
00:29:17.420 | like where the lines are, because it's tough.
00:29:20.900 | Like you don't, like in democracy, right,
00:29:23.780 | like you don't want,
00:29:25.580 | you want some sort of democratic involvement,
00:29:28.940 | like people should have a say in like where the lines
00:29:33.100 | are drawn.
00:29:34.700 | Like you don't want a bunch of people
00:29:36.940 | making like unilateral decisions.
00:29:39.500 | And like we're in a state right now
00:29:43.180 | for some of these platforms where you actually do have
00:29:45.140 | to make unilateral decisions where the policymaking
00:29:47.220 | isn't going to happen fast enough in order to like
00:29:49.860 | prevent very bad things from happening.
00:29:52.540 | But like we need the policymaking side of that to catch up,
00:29:56.060 | I think as quickly as possible,
00:29:58.500 | because you want that whole process to be a democratic thing,
00:30:02.020 | not a, you know, not some sort of weird thing
00:30:05.780 | where you've got a non-representative group of people
00:30:08.420 | making decisions that have, you know,
00:30:10.460 | like national and global impact.
00:30:12.540 | - It's fascinating because the digital space is different
00:30:15.620 | than the physical space in which nations
00:30:18.380 | and governments were established.
00:30:19.900 | And so what policy looks like globally,
00:30:24.020 | what bullying looks like globally,
00:30:25.780 | what healthy communication looks like globally
00:30:28.260 | is an open question.
00:30:29.380 | And we're all figuring it out together,
00:30:31.940 | which is fascinating. - Yeah, I mean,
00:30:33.220 | with, you know, sort of fake news, for instance, and-
00:30:38.220 | - Deep fakes and fake news generated by humans?
00:30:42.340 | - Yeah, so, I mean, we can talk about deep fakes.
00:30:44.620 | Like I think that is another, like, you know,
00:30:46.140 | sort of very interesting level of complexity.
00:30:48.300 | But like, if you think about just the written word, right?
00:30:51.540 | Like, you know, we invented papyrus,
00:30:54.420 | what, 3,000 years ago, where, you know,
00:30:56.780 | you could sort of put words on paper.
00:31:01.180 | And then 500 years ago, like we get the printing press,
00:31:06.180 | like where the word gets a little bit more ubiquitous.
00:31:11.500 | And then like, you really, really didn't get
00:31:14.060 | ubiquitous printed word until the end of the 19th century
00:31:18.460 | when the offset press was invented.
00:31:20.740 | And then, you know, just sort of explodes
00:31:22.420 | and like, you know, the cross product of that
00:31:25.380 | and the industrial revolution's need for educated citizens
00:31:31.020 | resulted in like this rapid expansion of literacy
00:31:34.740 | and the rapid expansion of the word.
00:31:36.020 | But like we had 3,000 years up to that point
00:31:39.700 | to figure out like how to, you know,
00:31:43.260 | like what's journalism, what's editorial integrity,
00:31:46.900 | like what's, you know, what's scientific peer review.
00:31:50.100 | And so like you built all of this mechanism
00:31:52.820 | to like try to filter through all of the noise
00:31:57.020 | that the technology made possible to like, you know,
00:32:00.540 | sort of getting to something that society could cope with.
00:32:03.940 | And like, if you think about just the pace,
00:32:06.540 | the PC didn't exist 50 years ago.
00:32:09.740 | And so in like this span of, you know, like half a century,
00:32:12.980 | like we've gone from no digital, you know,
00:32:16.380 | no ubiquitous digital technology to like having a device
00:32:19.780 | that sits in your pocket where you can sort of say
00:32:22.460 | whatever is on your mind to like,
00:32:24.300 | what did Mary have in her,
00:32:27.060 | Mary Meeker just released her new slide deck last week.
00:32:32.060 | You know, we've got 50% penetration of the internet
00:32:37.340 | to the global population.
00:32:38.500 | Like there are like three and a half billion people
00:32:40.260 | who are connected now.
00:32:41.740 | So it's like, it's crazy, crazy, like inconceivable,
00:32:44.980 | like how fast all of this happens.
00:32:46.500 | So, you know, it's not surprising
00:32:48.700 | that we haven't figured out what to do yet.
00:32:50.980 | But like we gotta like, we gotta really like lean
00:32:54.340 | into this set of problems because like we basically
00:32:56.820 | have three millennia worth of work to do
00:33:00.740 | about how to deal with all of this.
00:33:02.500 | And like probably what amounts
00:33:04.980 | to the next decade worth of time.
00:33:07.020 | - So since we're on the topic of tough, you know,
00:33:09.980 | tough, challenging problems,
00:33:11.620 | let's look at more on the tooling side in AI
00:33:15.220 | that Microsoft is looking at is face recognition software.
00:33:18.420 | So there's a lot of powerful positive use cases
00:33:21.860 | for face recognition, but there's some negative ones
00:33:24.220 | and we're seeing those in different governments
00:33:27.180 | in the world.
00:33:28.140 | So how do you, how does Microsoft think about the use
00:33:30.900 | of face recognition software as a platform
00:33:35.700 | in governments and companies?
00:33:39.300 | Yeah, how do we strike an ethical balance here?
00:33:42.300 | - Yeah, I think we've articulated a clear point of view.
00:33:47.300 | So Brad Smith wrote a blog post last fall,
00:33:52.180 | I believe, that sort of outlined very specifically
00:33:55.620 | what, you know, our point of view is there.
00:33:59.620 | And, you know, I think we believe that there are certain uses
00:34:02.620 | to which face recognition should not be put.
00:34:05.020 | And we believe again,
00:34:06.340 | that there's a need for regulation there.
00:34:09.500 | Like the government should like really come in and say that,
00:34:13.140 | you know, this is where the lines are.
00:34:16.020 | And like, we very much wanted to like figuring out
00:34:19.860 | where the lines are should be a democratic process.
00:34:21.820 | But in the short term, like we've drawn some lines
00:34:24.220 | where, you know, we push back against uses
00:34:27.700 | of face recognition technology.
00:34:31.140 | You know, like the city of San Francisco, for instance,
00:34:33.740 | I think has completely outlawed any government agency
00:34:38.060 | from using face recognition tech.
00:34:40.780 | And like that may prove to be a little bit overly broad,
00:34:45.780 | but for like certain law enforcement things,
00:34:48.860 | like, I would personally rather be overly
00:34:53.860 | sort of cautious in terms of restricting use of it
00:34:57.380 | until like we have, you know, sort of defined a reasonable,
00:35:01.900 | you know, democratically determined regulatory framework
00:35:04.900 | for like where we could and should use it.
00:35:08.860 | And, you know, the other thing there is,
00:35:10.980 | like we've got a bunch of research that we're doing
00:35:13.980 | and a bunch of progress that we've made on bias there.
00:35:18.380 | And like, there are all sorts of like weird biases
00:35:20.860 | that these models can have,
00:35:22.980 | like all the way from like the most noteworthy one
00:35:25.580 | where, you know, you may have underrepresented minorities
00:35:30.580 | who are like underrepresented in the training data.
00:35:34.660 | And then you start learning like strange things,
00:35:39.180 | but like there are even, you know, other weird things.
00:35:42.100 | Like we've, I think we've seen in the public research,
00:35:46.460 | like models can learn strange things
00:35:49.500 | like all doctors are men, for instance.
00:35:54.500 | Yeah, I mean, and so like, it really is a thing
00:35:58.900 | where it's very important for everybody
00:36:03.580 | who is working on these things, before they push publish,
00:36:08.420 | launch the experiment, you know,
00:36:11.700 | push the code online,
00:36:15.180 | or even publish the paper, that they are
00:36:18.620 | at least starting to think about what some of the potential
00:36:23.620 | negative consequences of this stuff are.
00:36:25.780 | I mean, this is where, you know, like the deep fake stuff,
00:36:28.980 | I find very worrisome just because
00:36:32.300 | there are gonna be some very good beneficial uses
00:36:39.740 | of like GAN generated imagery.
00:36:45.660 | And like, and funny enough, like one of the places
00:36:48.460 | where it's actually useful is we're using the technology
00:36:52.940 | right now to generate synthetic visual data
00:36:57.940 | for training some of the face recognition models
00:37:01.140 | to get rid of the bias.
00:37:02.380 | So like, that's one like super good use of the tech,
00:37:05.740 | but like, you know, it's getting good enough now
00:37:09.620 | where, you know, it's gonna sort of challenge
00:37:12.260 | a normal human being's ability to tell what's real.
00:37:14.260 | Like right now you can sort of say,
00:37:15.940 | it's very expensive for someone
00:37:19.300 | to fabricate a photorealistic fake video.
00:37:23.180 | And like GANs are gonna make it fantastically cheap
00:37:26.860 | to fabricate a photorealistic fake video.
00:37:30.380 | And so like what you assume you can sort of trust is true
00:37:34.420 | versus like be skeptical about is about to change.
00:37:38.340 | And like, we're not ready for it, I don't think.
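Going back to the synthetic-data point above, the bookkeeping step is simple enough to sketch: decide how many synthetic images per demographic group would balance the training set, then ask a generator (a GAN or otherwise, assumed rather than shown here) for that many. The group names and counts below are made up for illustration; this is not an actual Microsoft pipeline.

```python
# Compute per-group synthetic-sample quotas to balance a face dataset.
from collections import Counter
from typing import Dict


def synthetic_quota(group_counts: Dict[str, int]) -> Dict[str, int]:
    """How many synthetic samples each group needs to match the largest group."""
    target = max(group_counts.values())
    return {group: target - count for group, count in group_counts.items()}


if __name__ == "__main__":
    counts = Counter({"group_a": 9000, "group_b": 1200, "group_c": 400})
    print(synthetic_quota(counts))
    # {'group_a': 0, 'group_b': 7800, 'group_c': 8600}
    # A generator would then be asked for that many extra images per group
    # before retraining the face recognition model.
```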
00:37:40.500 | - The nature of truth, right.
00:37:43.340 | It's also exciting because I think both you and I
00:37:46.540 | probably would agree that the way to solve,
00:37:49.540 | to take on that challenge is with technology.
00:37:52.060 | - Yeah.
00:37:52.900 | - Right, there's probably going to be ideas
00:37:54.700 | of ways to verify which kind of video is legitimate,
00:37:59.420 | which kind is not.
00:38:00.820 | So to me, that's an exciting possibility,
00:38:03.860 | most likely for just the comedic genius
00:38:07.180 | that the internet usually creates with these kinds of videos
00:38:10.980 | and hopefully will not result in any serious harm.
00:38:13.980 | - Yeah, and it could be,
00:38:15.900 | like I think we will have technology
00:38:19.260 | that may be able to detect
00:38:23.020 | whether or not something's fake or real.
00:38:24.460 | Although, the fakes are pretty convincing
00:38:29.460 | even like when you subject them to machine scrutiny.
00:38:34.340 | But we also have these increasingly interesting
00:38:38.660 | social networks that are under fire right now
00:38:42.660 | for some of the bad things that they do.
00:38:46.180 | Like one of the things you could choose to do
00:38:47.700 | with a social network is like you could use crypto
00:38:52.700 | and the networks to like have content signed
00:38:57.740 | where you could have a like full chain of custody
00:39:01.420 | that accompanied every piece of content.
00:39:03.900 | So like when you're viewing something
00:39:06.780 | and like you wanna ask yourself,
00:39:08.660 | "How much can I trust this?"
00:39:11.020 | Like you can click something
00:39:12.380 | and like have a verified chain of custody
00:39:14.980 | that shows like, "Oh, this is coming from this source
00:39:18.980 | "and it's like signed by like someone
00:39:21.580 | "whose identity I trust."
00:39:24.100 | Yeah, I think having that chain of custody,
00:39:26.620 | like being able to like say, "Oh, here's this video."
00:39:29.340 | Like it may or may not have been produced
00:39:31.940 | using some of this deep fake technology.
00:39:33.740 | But if you've got a verified chain of custody
00:39:35.660 | where you can sort of trace it all the way back
00:39:37.780 | to an identity and you can decide whether or not
00:39:39.940 | like I trust this identity,
00:39:41.540 | like, "Oh no, this is really from the White House."
00:39:43.340 | Or like, "This is really from the office
00:39:46.060 | "of this particular presidential candidate."
00:39:48.820 | Or it's really from Jeff Weiner, CEO of LinkedIn
00:39:53.580 | or Satya Nadella, CEO of Microsoft.
00:39:55.580 | Like that might be like one way
00:39:58.460 | that you can solve some of the problems.
00:39:59.980 | So like that's not the super high tech,
00:40:01.820 | like we've had all of this technology forever.
00:40:04.540 | But I think you're right.
00:40:06.740 | Like it has to be some sort of technological thing
00:40:11.140 | because the underlying tech that is used to create this
00:40:15.860 | is not gonna do anything but get better over time.
00:40:18.820 | And the genie is sort of out of the bottle.
00:40:21.180 | There's no stuffing it back in.
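A minimal sketch of the signed-content idea, using an Ed25519 signature from the Python `cryptography` package (`pip install cryptography`). It shows only the sign-and-verify step; anchoring the publisher's public key in a certificate chain or a ledger, which is what would give the full chain of custody described above, is assumed rather than shown, and this is a generic illustration rather than any particular network's scheme.

```python
# Sign a piece of content with a publisher's private key and verify it on
# the viewer's side with the corresponding public key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# The publisher (say, a newsroom or an official account) holds the private key.
publisher_key = Ed25519PrivateKey.generate()
public_key: Ed25519PublicKey = publisher_key.public_key()

video_bytes = b"...raw bytes of the video file..."
signature = publisher_key.sign(video_bytes)

# A viewer's client checks the signature against the publisher's public key.
try:
    public_key.verify(signature, video_bytes)
    print("content is from the claimed publisher and unmodified")
except InvalidSignature:
    print("do not trust this content")
```

Any edit to the bytes, or any signature from a different key, makes the verification fail, which is what lets "is this really from that identity" become a mechanical check instead of a judgment call.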
00:40:22.820 | - And there's a social component,
00:40:24.540 | which I think is really healthy for a democracy
00:40:26.620 | where people will be skeptical about the thing they watch.
00:40:30.260 | - Yeah. - In general.
00:40:32.180 | So, which is good, skepticism in general is good
00:40:35.980 | for your personal content. - It is good, I think.
00:40:37.300 | - So deep fakes in that sense are creating
00:40:40.420 | global skepticism about, can they trust what they read?
00:40:44.780 | It encourages further research.
00:40:46.900 | I come from the Soviet Union
00:40:48.820 | where basically nobody trusted the media
00:40:53.300 | because it was propaganda.
00:40:55.180 | And that kind of skepticism encouraged further research
00:40:59.180 | about ideas. - Yeah.
00:41:00.420 | - As opposed to just trusting any one source.
00:41:02.380 | - Well, I think it's one of the reasons why
00:41:04.380 | the scientific method and our apparatus
00:41:09.380 | of modern science is so good.
00:41:11.900 | Because you don't have to trust anything.
00:41:15.460 | Like the whole notion of modern science goes
00:41:20.180 | beyond the fact that this is a hypothesis
00:41:22.500 | and this is an experiment to test the hypothesis,
00:41:24.900 | and this is a peer review process
00:41:27.380 | for scrutinizing published results.
00:41:30.140 | The stuff's also supposed to be reproducible.
00:41:33.300 | So you know it's been vetted by this process,
00:41:35.260 | but you also are expected to publish enough detail
00:41:38.060 | where if you are sufficiently skeptical of the thing,
00:41:42.100 | you can go try to reproduce it yourself.
00:41:44.820 | And I don't know what it is.
00:41:47.620 | I think a lot of engineers are like this
00:41:50.020 | where your brain is sort of wired for skepticism.
00:41:55.020 | You don't just first order trust everything
00:41:58.100 | that you see and encounter.
00:42:00.140 | And you're sort of curious to understand the next thing.
00:42:04.580 | But I think it's an entirely healthy thing.
00:42:09.220 | And we need a little bit more of that right now.
00:42:12.380 | - So I'm not a large business owner.
00:42:16.340 | So I'm just a huge fan of many of Microsoft products.
00:42:21.340 | I mean, actually, I generate
00:42:25.900 | a lot of graphics and images
00:42:27.100 | and I still use PowerPoint to do that.
00:42:28.780 | It beats Illustrator for me,
00:42:30.540 | even for professional use, sort of. It's fascinating.
00:42:34.580 | So I wonder what is the future of let's say
00:42:39.580 | Windows and Office look like?
00:42:42.260 | Do you see it?
00:42:43.980 | I mean I remember looking forward to XP.
00:42:45.980 | Was it exciting when XP was released?
00:42:48.300 | Just like you said, I don't remember when 95 was released.
00:42:51.220 | But XP for me was a big celebration.
00:42:53.940 | And when 10 came out I was like,
00:42:56.060 | "Oh okay, well it's nice, it's a nice improvement."
00:42:59.100 | So what do you see the future of these products?
00:43:02.140 | - I think there's a bunch of exciting,
00:43:04.700 | I mean on the Office front there's gonna be this
00:43:08.620 | increasing productivity wins that are coming out
00:43:14.740 | of some of these AI powered features that are coming.
00:43:17.900 | Like the products sort of get smarter and smarter
00:43:20.020 | in like a very subtle way.
00:43:21.260 | Like there's not gonna be this big bang moment
00:43:24.260 | where like Clippy is gonna reemerge and it's gonna be--
00:43:28.060 | - Wait a minute, okay we'll have to, wait, wait, wait.
00:43:30.660 | Is Clippy coming back?
00:43:32.540 | But quite seriously, so injection of AI,
00:43:37.140 | there's not much, or at least I'm not familiar with,
00:43:39.220 | sort of assistive type of stuff going on
00:43:41.340 | inside the Office products.
00:43:43.700 | Like a Clippy style assistant, personal assistant.
00:43:47.740 | Do you think that there's a possibility of that
00:43:51.020 | in the future?
00:43:52.140 | - So I think there are a bunch of like very small ways
00:43:54.860 | in which like machine learning powered assistive things
00:43:58.580 | are in the product right now.
00:44:00.300 | So there are a bunch of interesting things
00:44:04.980 | like the auto response stuff's getting better and better
00:44:09.500 | and it's like getting to the point where
00:44:12.380 | it can auto respond with like,
00:44:14.740 | "Okay, this person's clearly trying to schedule a meeting."
00:44:19.300 | So it looks at your calendar and it automatically
00:44:21.740 | like tries to find like a time and a space
00:44:24.300 | that's mutually interesting.
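The core of that auto-scheduling step is just interval arithmetic over two busy calendars. Here is a minimal sketch, assuming calendars are available as lists of (start, end) busy intervals; it is an illustration, not the actual Outlook/Office implementation.

```python
# Find the earliest slot of a given length that is free on both calendars.
from datetime import datetime, timedelta
from typing import List, Optional, Tuple

Interval = Tuple[datetime, datetime]  # a (start, end) busy block on someone's calendar


def first_free_slot(
    busy_a: List[Interval],
    busy_b: List[Interval],
    window: Interval,
    duration: timedelta,
) -> Optional[datetime]:
    """Earliest start inside `window` where neither calendar has a busy block."""
    window_start, window_end = window
    cursor = window_start
    for start, end in sorted(busy_a + busy_b):
        if start - cursor >= duration:
            break                      # found a big enough gap before this block
        cursor = max(cursor, end)      # otherwise skip past the busy block
    return cursor if cursor + duration <= window_end else None


if __name__ == "__main__":
    day = datetime(2019, 6, 3)
    mine = [(day.replace(hour=9), day.replace(hour=11))]
    theirs = [(day.replace(hour=10), day.replace(hour=12)),
              (day.replace(hour=13), day.replace(hour=14))]
    print(first_free_slot(mine, theirs,
                          (day.replace(hour=9), day.replace(hour=17)),
                          timedelta(hours=1)))  # 2019-06-03 12:00:00
```

The hard part in the product is everything around this: detecting the intent, reading the calendars, and proposing a room; the slot search itself is small.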
00:44:26.400 | Like we have this notion of Microsoft Search
00:44:32.460 | where it's like not just web search,
00:44:34.980 | but it's like search across like all of your information
00:44:38.220 | that's sitting inside of like your Office 365 tenant
00:44:43.220 | and like potentially in other products.
00:44:46.940 | And like we have this thing called the Microsoft Graph
00:44:49.700 | that is basically an API federator that
00:44:53.420 | sort of like gets you hooked up across the entire breadth
00:44:57.980 | of like all of the,
00:44:59.780 | like what were information silos
00:45:01.640 | before they got woven together with the graph.
00:45:04.740 | Like that is getting,
00:45:07.860 | with increasing effectiveness, sort of plumbed
00:45:09.860 | into some of these auto response things
00:45:13.120 | where you're gonna be able to see the system
00:45:15.860 | like automatically retrieve information for you.
00:45:18.220 | Like if, you know, like I frequently send out,
00:45:21.160 | you know, emails to folks where like,
00:45:22.980 | I can't find a paper or a document or whatnot.
00:45:25.380 | There's no reason why the system won't be able
00:45:26.820 | to do that for you.
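For reference, the Microsoft Graph piece of that scenario is an authenticated REST call. The sketch below assumes you already have an OAuth access token and searches the user's OneDrive for a document; check the Graph documentation for the exact endpoint shapes and required permissions, since this is only an illustration.

```python
# Search a user's files through the Microsoft Graph REST API.
import requests

ACCESS_TOKEN = "<oauth-token-obtained-via-azure-ad>"  # placeholder, not a real token

resp = requests.get(
    "https://graph.microsoft.com/v1.0/me/drive/root/search(q='quarterly report')",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()

for item in resp.json().get("value", []):
    # Each driveItem includes a name and a webUrl an assistant could link to.
    print(item["name"], item.get("webUrl"))
```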
00:45:27.660 | And like, I think it's building towards
00:45:31.940 | having things that look more like
00:45:34.460 | a fully integrated, you know, assistant.
00:45:37.860 | But like, you'll have a bunch of steps
00:45:40.700 | that you will see before you,
00:45:42.780 | like it will not be this like big bang thing
00:45:45.100 | where like Clippy comes back and you've got this like,
00:45:47.380 | you know, manifestation of, you know,
00:45:49.340 | like a fully powered assistant.
00:45:52.020 | So I think that's definitely coming out,
00:45:56.900 | like all of the, you know,
00:45:57.940 | collaboration co-authoring stuff's getting better.
00:46:00.740 | You know, it's like really interesting.
00:46:02.220 | Like if you look at how we use
00:46:05.500 | the Office product portfolio at Microsoft,
00:46:09.020 | like more and more of it is happening inside of like Teams
00:46:13.000 | as a canvas and like, it's this thing where, you know,
00:46:17.180 | you've got collaboration is like at the center
00:46:20.620 | of the product and like we built some like really cool stuff
00:46:25.620 | that's some of which is about to be open source
00:46:29.420 | that are sort of framework level things
00:46:31.980 | for doing co-authoring.
00:46:35.580 | - That's awesome.
00:46:36.420 | So, is there a cloud component to that?
00:46:38.880 | So on the web or is it,
00:46:41.940 | forgive me if I don't already know this,
00:46:43.600 | but with Office 365, we still,
00:46:46.520 | the collaboration we do if we're doing Word,
00:46:48.460 | we still send the file around.
00:46:50.620 | - No, no, no.
00:46:51.460 | - So this is.
00:46:53.360 | - We're already a little bit better than that.
00:46:55.240 | And like, you know, so like the fact that you're unaware of it
00:46:57.400 | means we've got a better job to do,
00:46:59.120 | like helping you discover this stuff.
00:47:01.940 | But yeah, I mean, it's already like got a huge,
00:47:06.320 | huge cloud component.
00:47:07.160 | And like part of, you know, part of this framework stuff,
00:47:09.640 | I think we're calling it,
00:47:11.040 | like we've been working on it for a couple of years.
00:47:14.480 | So like I know the internal code name for it,
00:47:17.180 | but I think when we launched it at Build,
00:47:18.620 | it's called the Fluid Framework.
00:47:20.720 | And, but like what Fluid lets you do is like,
00:47:25.060 | you can go into a conversation that you're having in Teams
00:47:27.900 | and like reference, like part of a spreadsheet
00:47:30.240 | that you're working on,
00:47:32.560 | where somebody is like sitting in the Excel canvas,
00:47:35.580 | like working on the spreadsheet with a, you know,
00:47:37.720 | chart or whatnot.
00:47:39.080 | And like, you can sort of embed like part of the spreadsheet
00:47:41.960 | in the Teams conversation where like you can
00:47:44.720 | dynamically update it.
00:47:45.920 | And like all of the changes that you're making
00:47:48.760 | to this object are like, you know,
00:47:51.240 | coordinated, and everything is sort of updating in real time.
00:47:54.640 | So like you can be in whatever canvas is most convenient
00:47:57.960 | for you to get your work done.
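(The shipped, open-source version of this is the Fluid Framework. The sketch below follows its published getting-started pattern against the local Tinylicious development service; exact package names and call signatures vary between framework versions, and the createSharedGrid helper and cell keys here are hypothetical, not the actual Teams/Excel integration.)

```typescript
// Minimal sketch of shared, real-time state with the open-source Fluid
// Framework: the primitive that can let a spreadsheet fragment embedded in a
// chat stay live for everyone. Based on the published quick-start; client
// API versions differ, so treat names and signatures as assumptions.
import { SharedMap } from "fluid-framework";
import { TinyliciousClient } from "@fluidframework/tinylicious-client";

const client = new TinyliciousClient();

// Every client that loads this container gets the same initial objects.
const containerSchema = {
  initialObjects: { cells: SharedMap },
};

async function createSharedGrid(): Promise<string> {
  const { container } = await client.createContainer(containerSchema);
  const cells = container.initialObjects.cells as SharedMap;

  // Local edits become ops that the service sequences and merges into every
  // connected client's copy of the map.
  cells.set("B2", 1250);

  // React to edits coming from anyone else viewing the same object.
  cells.on("valueChanged", (changed, local) => {
    if (!local) {
      console.log(`cell ${changed.key} is now`, cells.get(changed.key));
    }
  });

  // The returned id is what another canvas (say, a chat message) would use
  // to load and render the very same live object.
  return container.attach();
}
```

The point is that the shared object, not any one application canvas, owns the state; a conversation view and a spreadsheet view can both attach to the same container and see the same edits in real time.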
00:48:00.400 | - So I, out of my own sort of curiosity as an engineer,
00:48:03.400 | I know what it's like to sort of lead a team
00:48:06.240 | of 10, 15 engineers.
00:48:08.200 | Microsoft has, I don't know what the numbers are,
00:48:11.680 | maybe 50, maybe 60,000 engineers, maybe 40.
00:48:14.920 | - I don't know exactly what the number is.
00:48:16.160 | It's a lot.
00:48:17.240 | It's tens of thousands.
00:48:18.520 | - Right, it's more than 10 or 15.
00:48:20.620 | I mean, you've led different sizes,
00:48:28.680 | mostly large size of engineers.
00:48:30.560 | What does it take to lead such a large group
00:48:33.840 | into continued innovation,
00:48:37.480 | continuing to be highly productive
00:48:40.240 | and yet develop all kinds of new ideas
00:48:43.200 | and yet maintain, like what does it take to lead
00:48:46.320 | such a large group of brilliant people?
00:48:49.000 | - I think the thing that you learn
00:48:52.080 | as you manage larger and larger scale
00:48:55.120 | is that there are three things that are like
00:48:58.520 | very, very important for big engineering teams.
00:49:02.360 | Like one is like having some sort of forethought
00:49:06.320 | about what it is that you're gonna be building
00:49:09.840 | over large periods of time.
00:49:11.080 | Like not exactly, like you don't need to know
00:49:13.120 | that like, you know, I'm putting all my chips
00:49:15.320 | on this one product and like, this is gonna be the thing.
00:49:17.840 | But like, it's useful to know like what sort of capabilities
00:49:21.480 | you think you're going to need to have
00:49:23.120 | to build the products of the future.
00:49:24.760 | And then like invest in that infrastructure,
00:49:28.040 | like whether, and I'm not just talking
00:49:30.200 | about storage systems or cloud APIs,
00:49:32.760 | it's also like, what does your development process look like?
00:49:35.400 | What tools do you want?
00:49:36.760 | Like what culture do you want to build around?
00:49:40.040 | Like how you're, you know, sort of collaborating together
00:49:42.800 | to like make complicated technical things.
00:49:45.800 | And so like having an opinion and investing in that
00:49:48.120 | is like, it just gets more and more important.
00:49:50.520 | And like the sooner you can get a concrete set of opinions,
00:49:54.560 | like the better you're going to be.
00:49:57.720 | Like you can wing it for a while, small scales.
00:50:01.640 | Like, you know, when you start a company,
00:50:03.200 | like you don't have to be like super specific about it.
00:50:06.360 | But like the biggest miseries that I've ever seen
00:50:10.040 | as an engineering leader are in places
00:50:12.680 | where you didn't have a clear enough opinion
00:50:14.480 | about those things soon enough.
00:50:16.760 | And then you just sort of go create a bunch of technical debt
00:50:20.280 | and like culture debt that is excruciatingly painful
00:50:24.040 | to clean up.
00:50:25.800 | So like that's one bundle of things.
00:50:28.720 | Like the other, you know, another bundle of things is,
00:50:33.720 | like it's just really, really important
00:50:36.640 | to like have a clear mission.
00:50:41.640 | That's not just some cute crap you say
00:50:46.280 | because like you think you should have a mission,
00:50:48.960 | but like something that clarifies for people
00:50:52.920 | like where it is that you're headed together.
00:50:55.720 | Like I know it's like probably like a little bit
00:50:59.160 | too popular right now, but Yuval Harari's book "Sapiens,"
00:51:04.160 | one of the central ideas in his book is that like
00:51:12.320 | storytelling is like the quintessential thing
00:51:16.880 | for coordinating the activities of large groups of people.
00:51:20.520 | Like once you get past Dunbar's number,
00:51:22.520 | and like I've really, really seen that
00:51:25.880 | just managing engineering teams.
00:51:27.360 | Like you can just brute force things
00:51:32.120 | when you're less than 120, 150 folks
00:51:35.240 | where you can sort of know and trust
00:51:37.560 | and understand what the dynamics are between all the people.
00:51:40.960 | But like past that, like things just sort of start
00:51:43.720 | to catastrophically fail if you don't have
00:51:46.920 | some sort of set of shared goals
00:51:48.780 | that you're marching towards.
00:51:50.480 | And so like, even though it sounds touchy feely
00:51:53.000 | and like a bunch of technical people
00:51:55.640 | will sort of balk at the idea that like you need
00:51:58.280 | to like have a clear mission, like the mission is
00:52:01.760 | like very, very, very important.
00:52:03.640 | - Yuval's right, right?
00:52:04.680 | Stories, that's how our society,
00:52:07.560 | that's the fabric that connects us all of us
00:52:09.440 | is these powerful stories.
00:52:11.200 | And that works for companies too, right?
00:52:13.520 | - It works for everything.
00:52:14.600 | Like, I mean, even down to like,
00:52:16.600 | you sort of really think about it,
00:52:17.580 | like our currency, for instance, is a story.
00:52:20.080 | Our constitution is a story.
00:52:21.800 | Our laws are, I mean, like we believe very, very,
00:52:25.520 | very strongly in them and thank God we do.
00:52:30.000 | But like they are, they're just abstract things.
00:52:33.060 | Like they're just words.
00:52:34.040 | Like if we don't believe in them, they're nothing.
00:52:36.560 | - And in some sense, those stories are platforms
00:52:39.480 | and the kinds, some of which Microsoft is creating, right?
00:52:43.080 | Yeah, platforms on which we define the future.
00:52:46.380 | So last question, what do you,
00:52:48.640 | let's get philosophical maybe,
00:52:50.120 | bigger than even Microsoft,
00:52:51.520 | what do you think the next 20, 30 plus years
00:52:56.320 | looks like for computing, for technology, for devices?
00:53:00.160 | Do you have crazy ideas about the future of the world?
00:53:04.640 | - Yeah, look, I think we're entering this time
00:53:08.080 | where we've got, we have technology that is progressing
00:53:13.080 | at the fastest rate that it ever has
00:53:15.840 | and you've got some really big social problems,
00:53:20.840 | like society scale problems that we have to tackle.
00:53:26.300 | And so, I think we're gonna rise to the challenge
00:53:28.720 | and like figure out how to intersect
00:53:30.560 | like all of the power of this technology
00:53:32.400 | with all of the big challenges that are facing us,
00:53:35.320 | whether it's global warming,
00:53:37.840 | whether it's like the fact that the biggest remainder
00:53:40.320 | of the population boom is in Africa
00:53:44.240 | for the next 50 years or so.
00:53:46.800 | And like global warming is gonna make it increasingly
00:53:49.360 | difficult to feed the global population in particular,
00:53:52.660 | like in this place where you're gonna have
00:53:54.240 | like the biggest population boom.
00:53:56.660 | I think we, like AI is gonna,
00:54:01.580 | like if we push it in the right direction,
00:54:03.600 | like it can do like incredible things to empower all of us
00:54:07.880 | to achieve our full potential and to,
00:54:12.520 | you know, like live better lives.
00:54:15.760 | But like that also means focus
00:54:19.200 | on like some super important things,
00:54:22.080 | like how can you apply it to healthcare
00:54:24.000 | to make sure that, you know,
00:54:26.920 | like our quality and cost
00:54:29.640 | and sort of ubiquity of health coverage
00:54:32.080 | get better and better over time.
00:54:35.080 | Like that's more and more important every day
00:54:37.960 | as, like, in the United States
00:54:40.880 | and like the rest of the industrialized world.
00:54:43.280 | So Western Europe, China, Japan, Korea,
00:54:45.760 | like you've got this population bubble
00:54:48.880 | of like aging, working, you know,
00:54:51.960 | working age folks who are, you know,
00:54:54.360 | at some point over the next 20, 30 years,
00:54:56.200 | they're gonna be largely retired.
00:54:58.040 | And like, you're gonna have more retired people
00:55:00.160 | than working age people.
00:55:01.200 | And then like, you've got, you know,
00:55:02.480 | sort of natural questions about who's gonna take care
00:55:04.840 | of all the old folks and who's gonna do all the work.
00:55:07.140 | And the answers to like all of these sorts of questions,
00:55:11.040 | like where you're sort of running into, you know,
00:55:13.200 | like constraints of the, you know,
00:55:16.080 | the world and of society has always been like,
00:55:20.080 | what tech is gonna like help us get around this?
00:55:23.000 | You know, like when I was a kid in the 70s and 80s,
00:55:26.360 | like we talked all the time about like,
00:55:28.360 | oh, like population boom, population boom,
00:55:30.640 | like we're gonna, like we're not gonna be able
00:55:33.080 | to like feed the planet.
00:55:34.360 | And like, we were like right in the middle
00:55:36.800 | of the green revolution where like this,
00:55:40.280 | this massive technology driven increase
00:55:44.560 | in crop productivity, like worldwide.
00:55:47.520 | And like some of that was like taking some of the things
00:55:49.300 | that we knew in the West and like getting them distributed
00:55:52.560 | to the, you know, to the developing world.
00:55:55.780 | And like part of it were things like, you know,
00:55:59.360 | just smarter biology, like helping us increase.
00:56:03.280 | And like, we don't talk about like overpopulation anymore
00:56:08.040 | because like we can more or less,
00:56:10.320 | we sort of figured out how to feed the world.
00:56:12.000 | Like that's a technology story.
00:56:14.840 | And so like, I'm super, super hopeful about the future
00:56:19.500 | and in the ways where we will be able to apply technology
00:56:24.080 | to solve some of these super challenging problems.
00:56:28.040 | Like I've, like one of the things that I'm trying
00:56:32.880 | to spend my time doing right now is trying
00:56:35.040 | to get everybody else to be hopeful as well,
00:56:37.000 | because, you know, back to Harari,
00:56:38.720 | like we are the stories that we tell.
00:56:41.160 | Like if we, you know, if we get overly pessimistic right now
00:56:44.320 | about like the potential future of technology,
00:56:48.600 | like we, you know, like we may fail
00:56:52.760 | to get all of the things in place that we need
00:56:54.600 | to like have our best possible future.
00:56:56.840 | - And that kind of hopeful optimism.
00:56:59.440 | I'm glad that you have it
00:57:00.640 | because you're leading large groups of engineers
00:57:04.160 | that are actually defining, that are writing that story,
00:57:06.720 | that are helping build that future, which is super exciting.
00:57:10.040 | And I agree with everything you've said,
00:57:12.280 | except I do hope Clippy comes back.
00:57:14.880 | (Kevin laughs)
00:57:16.400 | We miss him.
00:57:17.760 | I speak for the people.
00:57:19.360 | So Kevin, thank you so much for talking to me.
00:57:21.800 | - Thank you so much for having me.
00:57:22.640 | It was a pleasure.
00:57:23.600 | (upbeat music)