Kevin Scott: Microsoft CTO | Lex Fridman Podcast #30
Chapters
0:00
1:50 Product Portfolio
21:02 Mixed Reality
44:32 Microsoft Search
44:48 Microsoft Graph
47:18 Fluid Framework
53:01 The Future of the World
00:00:00.000 |
The following is a conversation with Kevin Scott, the CTO of Microsoft. 00:00:06.080 |
Before that, he was the Senior Vice President of Engineering and Operations at LinkedIn. 00:00:11.080 |
And before that, he oversaw mobile ads engineering at Google. 00:00:14.980 |
He also has a podcast called "Behind the Tech" 00:00:28.840 |
This conversation happened before the announcement of Microsoft's investment 00:00:30.960 |
in OpenAI that a few people have asked me about. 00:00:34.440 |
I'm sure there'll be one or two people in the future 00:00:38.120 |
that'll talk with me about the impact of that investment. 00:00:47.640 |
give it five stars on iTunes, support it on Patreon, 00:00:50.920 |
or simply connect with me on Twitter @lexfridman, 00:01:06.080 |
Hope I didn't mess up your last name too bad. 00:01:13.480 |
And now, here's my conversation with Kevin Scott. 00:01:18.160 |
You've described yourself as a kid in a candy store 00:01:20.720 |
at Microsoft because of all the interesting projects that are going on. 00:01:27.980 |
Can you give a brief whirlwind view of all the spaces that Microsoft is working in? 00:01:37.420 |
- If you include research, it becomes even more difficult. 00:01:47.740 |
Microsoft's product portfolio includes everything 00:01:53.740 |
from big cloud business, like a big set of SaaS services. 00:02:05.580 |
productivity software products that everybody uses. 00:02:11.220 |
We have a hardware business where we make everything 00:02:23.540 |
We have a fairly broad ranging research group 00:02:31.900 |
So, there's this really smart young economist, Glen Weyl, 00:02:51.100 |
So, the research group sort of spans from that 00:02:53.500 |
to human-computer interaction to artificial intelligence. 00:03:01.020 |
we have a search advertising and news business, 00:03:07.340 |
that I'm embarrassingly not recounting in this list. 00:03:14.100 |
Like I was having a super fun conversation this morning 00:03:30.140 |
And like we're doing some interesting collaboration now 00:03:37.900 |
And I was like completely nerding out with Tim Schafer, 00:03:40.860 |
like the guy who wrote Day of the Tentacle this morning, 00:03:45.820 |
which, you know, sort of, it like happens a lot. 00:03:49.900 |
Like, you know, Microsoft has been doing so much stuff 00:03:53.300 |
at such breadth for such a long period of time 00:03:58.840 |
like most of the time my job is very, very serious. 00:04:10.580 |
the conversations that I have with the people 00:04:14.620 |
- You have to reach back into the sentimental 00:04:17.020 |
and what's the radical markets and the economics? 00:04:24.740 |
can you come up with new market-based mechanisms to, 00:04:35.220 |
like does capitalism work, like free markets work? 00:04:42.980 |
that are built into these systems produce outcomes 00:04:46.340 |
that are creating sort of equitably distributed benefits 00:04:53.520 |
You know, and I think it's a reasonable set of questions 00:05:05.940 |
are actually working, you can sort of like tip towards, 00:05:39.540 |
like suppose that you had a radical pricing mechanism 00:05:47.100 |
where you could be bid out of your position 00:05:58.620 |
So like if somebody came along and said, you know, 00:06:16.500 |
or like the thing that's got the higher economic utility, 00:06:23.740 |
to have the same sort of rent-seeking behaviors 00:06:53.220 |
that would force you to sort of mark the value 00:06:58.580 |
Like you couldn't sort of sit on this thing and say, 00:07:00.420 |
oh, like this house is only worth 10,000 bucks 00:07:03.020 |
when like everything around it is worth 10 million. 00:07:08.740 |
that where the prices match the value much better. 00:07:14.060 |
And Glen does a much better job than I do at selling 00:07:16.860 |
and I probably picked the world's worst example, 00:07:19.980 |
but like, and it's intentionally provocative, 00:07:28.980 |
that like we could have a set of market mechanisms 00:07:36.260 |
like if you're thinking about something like, 00:07:44.340 |
it'd be really interesting in like how you would actually 00:07:50.100 |
And like, you might have to have a mechanism like that 00:07:54.180 |
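To make the self-assessed pricing mechanism concrete, here is a minimal sketch, assuming a flat annual tax and a forced-sale rule; the rate, the names, and the numbers are hypothetical illustrations, not Weyl's actual proposal:

```ts
// Self-assessed ("Harberger") pricing sketch: owners declare a value,
// pay tax on it, and must sell to anyone who bids that declared value.
interface Asset {
  owner: string;
  declaredValue: number; // the owner's self-assessed price
}

const TAX_RATE = 0.07; // hypothetical annual rate

// Declaring a low value saves tax...
function annualTax(asset: Asset): number {
  return asset.declaredValue * TAX_RATE;
}

// ...but anyone may take the asset by paying the declared value.
function forceSale(asset: Asset, buyer: string, bid: number): Asset {
  if (bid < asset.declaredValue) {
    throw new Error("Bid below declared value: no sale.");
  }
  return { owner: buyer, declaredValue: bid };
}

let house: Asset = { owner: "alice", declaredValue: 10_000 };
console.log(annualTax(house)); // 700: cheap to hold...
house = forceSale(house, "bob", 10_000); // ...but anyone can take it at 10k
```

The tension is exactly the one described above: under-declaring saves tax but invites a forced sale, so declared prices drift toward honest valuations.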
- It's really interesting that that kind of research, 00:07:56.420 |
at least tangentially is touching Microsoft research. 00:08:10.660 |
who kind of talks about artificial intelligence 00:08:18.980 |
And arguably Microsoft is at the cutting edge of innovation 00:08:28.660 |
combining all our conversations together here 00:08:30.660 |
with radical markets and socialism and innovation in AI 00:08:47.700 |
- I think it's sort of one of the most important questions 00:08:51.140 |
in technology, like maybe even in society right now 00:08:55.300 |
about how is AI going to develop over the course 00:09:03.580 |
And like, what benefits will it produce 00:09:13.700 |
You know, I'll say at the highest level, 00:09:16.260 |
one of the real joys of getting to do what I do 00:09:39.820 |
for the people who build on top of the platform 00:09:41.820 |
than is created for the platform owner or builder. 00:09:46.820 |
And I think we have to think about AI that way. 00:09:52.260 |
- Yeah, it has to, like, it has to be a platform 00:09:54.660 |
that other people can use to build businesses, 00:10:01.260 |
to be entrepreneurs, to solve problems that they have 00:10:07.700 |
It can't be a thing where there are a handful of companies 00:10:11.980 |
sitting in a very small handful of cities geographically 00:10:16.460 |
who are making all the decisions about what goes into the AI 00:10:25.780 |
all this infrastructure, then build all of the 00:10:30.980 |
So like, I think like that's bad from a, you know, 00:10:39.700 |
like, you know, sort of back to this whole notion of, 00:10:44.540 |
But I think it's also bad from an innovation perspective 00:10:47.580 |
because like I have infinite amounts of faith 00:10:58.260 |
And it's more than just a few tens of thousands of people 00:11:03.340 |
It should be millions of people with the tools. 00:11:10.180 |
in the late 18th century, like it was, you know, 00:11:13.740 |
maybe the first large scale substitute for human labor 00:11:27.020 |
from the steam engines were the folks who had capital 00:11:31.580 |
And like, they built factories around them and businesses 00:11:34.660 |
and the experts who knew how to build and maintain them. 00:11:38.620 |
But access to that technology democratized over time. 00:11:54.220 |
And like, they get all the economics from all of that. 00:12:05.220 |
like the MEMS gyroscopes that are in both of our phones. 00:12:13.220 |
They're just a component in how we build the modern world. 00:12:17.660 |
- Yeah, so that's a really powerful way to think. 00:12:26.860 |
as a platform that enables creation on top of it, 00:12:41.660 |
So my team has been working with Glen 00:12:51.100 |
So Jaron is the, like the sort of father of virtual reality. 00:12:56.100 |
Like he's one of the most interesting human beings 00:13:06.940 |
have been working on this notion of data as labor, 00:13:29.500 |
then like we're not doing a really great job right now 00:13:39.540 |
So like, and we all make them like explicitly, 00:14:04.420 |
And then you've got all this indirect contribution 00:14:06.500 |
that you're making just by virtue of interacting 00:14:08.780 |
with all of the technology that's in your daily life. 00:14:18.500 |
is like, can we figure out a set of mechanisms 00:14:40.860 |
And like, you can sort of see it in explicit ways. 00:14:52.420 |
So like you're doing supervised machine learning, 00:14:54.540 |
you need lots and lots of label training data. 00:15:02.060 |
are getting compensated for their data contributions 00:15:07.740 |
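As a toy sketch of what compensating explicit data contributions could look like in the supervised-learning case, assuming a flat, hypothetical payout per label:

```ts
// Toy "data as labor" ledger: track who supplied each label and pay
// contributors per label. The rate and all names are hypothetical.
interface LabeledExample {
  contributor: string; // who supplied the label
  input: string;       // e.g. an image id or a text snippet
  label: string;       // the human-provided annotation
}

const PAYOUT_PER_LABEL = 0.05; // hypothetical dollars per label

function payouts(dataset: LabeledExample[]): Map<string, number> {
  const ledger = new Map<string, number>();
  for (const ex of dataset) {
    ledger.set(ex.contributor, (ledger.get(ex.contributor) ?? 0) + PAYOUT_PER_LABEL);
  }
  return ledger;
}

const dataset: LabeledExample[] = [
  { contributor: "ann", input: "img_001", label: "cat" },
  { contributor: "ann", input: "img_002", label: "dog" },
  { contributor: "raj", input: "img_003", label: "cat" },
];
console.log(payouts(dataset)); // Map { "ann" => 0.1, "raj" => 0.05 }
```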
- That's easier to put a number on their contribution 00:15:12.460 |
But you're saying that we're all contributing data 00:15:15.740 |
And it's fascinating to start to explicitly try 00:15:25.460 |
Because, you know, we don't have as much transparency 00:15:30.460 |
as I think we need in like how the data is getting used. 00:15:47.900 |
and then it gets, you know, it's not valuable. 00:16:05.180 |
Like it's only valuable when you sort of aggregate it 00:16:07.900 |
together into, you know, sort of large numbers. 00:16:11.980 |
who are getting compensated for like labeling things. 00:16:16.500 |
like you need lots of labels to train, you know, 00:16:22.140 |
And so, you know, I think that's one of the challenges. 00:16:28.020 |
because this data is getting combined in so many ways, 00:16:38.580 |
- Yeah, and it's fascinating that you're thinking 00:16:50.020 |
is thinking about, you're thinking about at Microsoft. 00:16:52.380 |
So if we go back to '89 when Microsoft released Office 00:17:01.060 |
in your view, I know you weren't there 00:17:08.020 |
but how has the company changed in the 30 years since 00:17:12.940 |
- The good thing is it's started off as a platform company, 00:17:20.020 |
like the parts of the business that are like thriving 00:17:22.700 |
and most successful are those that are building platforms. 00:17:36.380 |
like they were still on the original mission, 00:17:39.140 |
which was like put a PC on every desk and in every home, 00:17:43.980 |
like, and it was basically about democratizing access 00:17:52.820 |
integrated circuit microprocessors were a brand new thing. 00:18:03.900 |
like the way people build ham radios right now. 00:18:08.620 |
And I think this is sort of the interesting thing 00:18:20.460 |
Like you just sort of imagine like where things were, 00:18:26.100 |
Like in success, when you've democratized a platform, 00:18:32.500 |
Like operating systems aren't a thing anymore. 00:18:35.700 |
Like they're super important, like completely critical. 00:18:38.020 |
And like, you know, when you see one, you know, fail, 00:18:43.500 |
but like, you know, it's not a thing where you're, 00:18:50.500 |
in the same way that you were in 1995, right? 00:18:57.620 |
Like it was like the biggest thing in the world. 00:19:01.060 |
in the way that people used to line up for iPhone. 00:19:05.100 |
and like, this isn't necessarily a bad thing. 00:19:08.980 |
the success is that it's sort of, it becomes ubiquitous. 00:19:16.620 |
they just sort of start taking it for granted. 00:19:33.500 |
and every organization in the world to be more successful. 00:19:37.740 |
And so, you know, again, like that's a platform mission. 00:19:43.180 |
And like the way that we do it now is different. 00:19:48.700 |
that people are building their applications on top of. 00:19:53.700 |
that people are building their AI applications on top of. 00:19:56.260 |
We have, you know, we have a productivity suite of software, 00:20:05.780 |
some people might not think is the sexiest thing 00:20:07.420 |
in the world, but it's like helping people figure out 00:20:10.060 |
how to automate all of their business processes 00:20:14.740 |
like help those businesses using it to like grow 00:20:39.780 |
and is going to get better and better over time 00:20:42.740 |
or at least more and more powerful over time. 00:20:50.740 |
There's so many directions in which you can transform. 00:21:08.260 |
Microsoft is doing some really interesting work there. 00:21:14.860 |
Do you think of mixed reality as a platform too? 00:21:18.540 |
When we look at what the platforms of the future could be, 00:21:21.340 |
it's like fairly obvious that like AI is one, 00:21:31.940 |
But like we also think of the like mixed reality 00:21:44.500 |
So you're talking about some futuristic things here. 00:21:49.980 |
Microsoft is really not even futuristic, it's here. 00:21:54.300 |
- And look, and it's having an impact right now. 00:22:00.020 |
over the past couple of years that I didn't clearly see 00:22:19.860 |
and people who are doing like machine maintenance 00:22:25.340 |
So like they, you know, because they're mobile 00:22:41.460 |
and you know, they're not tethered to a desk. 00:22:50.780 |
is for these sorts of applications for these workers. 00:22:54.620 |
And it's become like, I mean, like the people love it. 00:22:58.060 |
They're like, oh my God, like this is like for them, 00:23:01.140 |
like the same sort of productivity boosts that, 00:23:08.220 |
- Yeah, but you did mention it's certainly obvious AI 00:23:12.100 |
as a platform, but can we dig into it a little bit? 00:23:15.980 |
- How does AI begin to infuse some of the products 00:23:35.380 |
and whatever different inference that you want to do 00:23:39.940 |
How do you think of AI infusing as a platform 00:23:45.900 |
- Yeah, I mean, I think it's super interesting. 00:24:22.740 |
Now, the important thing to understand is like 00:24:24.580 |
when you think about like how the AI is gonna manifest 00:24:41.060 |
and like the thing that is trying to help you do 00:24:51.260 |
about like where the AI is showing up in products. 00:25:03.980 |
but it's sort of like it's an engineering tool. 00:25:09.540 |
So like we've got dozens and dozens of features now 00:25:15.220 |
by like fairly sophisticated machine learning. 00:25:54.740 |
where, you know, the line between playful banter 00:25:59.420 |
and like legitimate bullying is like a subtle one. 00:26:09.060 |
'cause you're also, you led the engineering efforts 00:26:13.140 |
And if we look at LinkedIn as a social network, 00:26:17.620 |
and if we look at the Xbox gaming as the social components, 00:26:24.060 |
communication going on on the two platforms, right? 00:26:31.460 |
So how do you, I mean, it's such a fascinating 00:26:34.740 |
philosophical discussion of where that line is. 00:26:39.820 |
Twitter folks are under fire now, Jack at Twitter, 00:26:46.940 |
But how do you try to find the line for, you know, 00:27:04.620 |
if you have what I would call vertical social networks, 00:27:14.420 |
of like what your social network should be used for, 00:27:17.980 |
or like what you are designing a community around, 00:27:56.140 |
their, you know, sort of professional identity 00:28:06.620 |
and, you know, sort of professional community. 00:28:11.460 |
but in other ways it's sort of, you know, it's narrow. 00:28:18.060 |
like machine learning systems that are, you know, 00:28:40.380 |
same thing with like the gaming social network, 00:28:43.940 |
so for instance, like it's about playing games, 00:28:47.300 |
And like the thing that you don't want to have happen 00:28:47.300 |
on the platform is bullying, which is why it's such an important thing. 00:28:49.500 |
And yeah, but I think it's sort of a tough problem 00:29:03.460 |
in general, and it's one where I think, you know, 00:29:07.220 |
some sort of clarification from our policy makers 00:29:17.420 |
like where the lines are, because it's tough. 00:29:25.580 |
you want some sort of democratic involvement, 00:29:28.940 |
like people should have a say in like where the lines are. 00:29:39.500 |
And like we're in a state right now 00:29:39.500 |
for some of these platforms where you actually do have 00:29:45.140 |
to make unilateral decisions where the policymaking 00:29:47.220 |
isn't going to happen fast enough in order to like 00:29:52.540 |
But like we need the policymaking side of that to catch up, 00:29:58.500 |
because you want that whole process to be a democratic thing, 00:30:02.020 |
not a, you know, not some sort of weird thing 00:30:05.780 |
where you've got a non-representative group of people 00:30:12.540 |
- It's fascinating because the digital space is different 00:30:25.780 |
what healthy communication looks like globally 00:30:33.220 |
with, you know, sort of fake news, for instance, and- 00:30:38.220 |
- Deep fakes and fake news generated by humans? 00:30:42.340 |
- Yeah, so, I mean, we can talk about deep fakes. 00:30:44.620 |
Like I think that is another, like, you know, 00:30:46.140 |
sort of very interesting level of complexity. 00:30:48.300 |
But like, if you think about just the written word, right? 00:31:01.180 |
And then 500 years ago, like we get the printing press, 00:31:06.180 |
like where the word gets a little bit more ubiquitous. 00:31:14.060 |
ubiquitous printed word until the end of the 19th century 00:31:22.420 |
and like, you know, the cross product of that 00:31:25.380 |
and the industrial revolution's need for educated citizens 00:31:31.020 |
resulted in like this rapid expansion of literacy 00:31:43.260 |
like what's journalism, what's editorial integrity, 00:31:46.900 |
like what's, you know, what's scientific peer review. 00:31:52.820 |
to like try to filter through all of the noise 00:31:57.020 |
that the technology made possible to like, you know, 00:32:00.540 |
sort of getting to something that society could cope with. 00:32:09.740 |
And so in like this span of, you know, like half a century, 00:32:16.380 |
no ubiquitous digital technology to like having a device 00:32:19.780 |
that sits in your pocket where you can sort of say 00:32:27.060 |
Mary Meeker just released her new like slide deck last week. 00:32:32.060 |
You know, we've got 50% penetration of the internet 00:32:38.500 |
Like there are like three and a half billion people 00:32:41.740 |
So it's like, it's crazy, crazy, like inconceivable, 00:32:50.980 |
But like we gotta like, we gotta really like lean 00:32:54.340 |
into this set of problems because like we basically 00:33:07.020 |
- So since we're on the topic of tough, you know, 00:33:15.220 |
that Microsoft is looking at is face recognition software. 00:33:18.420 |
So there's a lot of powerful positive use cases 00:33:21.860 |
for face recognition, but there's some negative ones 00:33:24.220 |
and we're seeing those in different governments 00:33:28.140 |
So how does Microsoft think about the use 00:33:28.140 |
Yeah, how do we strike an ethical balance here? 00:33:42.300 |
- Yeah, I think we've articulated a clear point of view. 00:33:52.180 |
I believe that sort of like outline like very specifically 00:33:55.620 |
what, you know, what our point of view is there. 00:33:59.620 |
And, you know, I think we believe that there are certain uses 00:34:09.500 |
Like the government should like really come in and say that, 00:34:16.020 |
And like, we very much want figuring out 00:34:19.860 |
where the lines are to be a democratic process. 00:34:21.820 |
But in the short term, like we've drawn some lines 00:34:31.140 |
You know, like the city of San Francisco, for instance, 00:34:33.740 |
I think has completely outlawed any government agency from using face recognition technology. 00:34:40.780 |
And like that may prove to be a little bit overly broad, 00:34:48.860 |
like you really, I would personally rather be overly 00:34:53.860 |
sort of cautious in terms of restricting use of it 00:34:57.380 |
until like we have, you know, sort of defined a reasonable, 00:35:01.900 |
you know, democratically determined regulatory framework 00:35:10.980 |
like we've got a bunch of research that we're doing 00:35:13.980 |
and a bunch of progress that we've made on bias there. 00:35:18.380 |
And like, there are all sorts of like weird biases 00:35:22.980 |
like all the way from like the most noteworthy one 00:35:25.580 |
where, you know, you may have underrepresented minorities 00:35:30.580 |
who are like underrepresented in the training data. 00:35:34.660 |
And then you start learning like strange things, 00:35:39.180 |
but like there are even, you know, other weird things. 00:35:42.100 |
Like, I think we've seen in the public research, 00:35:54.500 |
Yeah, I mean, and so like, it really is a thing 00:36:03.580 |
who is working on these things before they push publish, 00:36:18.620 |
at least starting to think about what some of the potential 00:36:23.620 |
negative consequences are, some of this stuff. 00:36:25.780 |
I mean, this is where, you know, like the deep fake stuff, 00:36:32.300 |
there are gonna be some very good beneficial uses 00:36:45.660 |
And like, and funny enough, like one of the places 00:36:48.460 |
where it's actually useful is we're using the technology 00:36:57.940 |
for training some of the face recognition models 00:37:02.380 |
So like, that's one like super good use of the tech, 00:37:05.740 |
but like, you know, it's getting good enough now 00:37:09.620 |
where, you know, it's gonna sort of challenge 00:37:23.180 |
And like GANs are gonna make it fantastically cheap 00:37:30.380 |
And so like what you assume you can sort of trust is true 00:37:34.420 |
versus like be skeptical about is about to change. 00:37:38.340 |
And like, we're not ready for it, I don't think. 00:37:43.340 |
It's also exciting because I think both you and I 00:37:49.540 |
to take on that challenge is with technology. 00:37:54.700 |
of ways to verify which kind of video is legitimate, 00:38:07.180 |
that the internet usually creates with these kinds of videos 00:38:10.980 |
and hopefully will not result in any serious harm. 00:38:29.460 |
even like when you subject them to machine scrutiny. 00:38:34.340 |
But we also have these increasingly interesting 00:38:38.660 |
social networks that are under fire right now 00:38:46.180 |
Like one of the things you could choose to do 00:38:47.700 |
with a social network is like you could use crypto 00:38:57.740 |
where you could have a like full chain of custody 00:39:14.980 |
that shows like, "Oh, this is coming from this source 00:39:26.620 |
like being able to like say, "Oh, here's this video." 00:39:33.740 |
But if you've got a verified chain of custody 00:39:35.660 |
where you can sort of trace it all the way back 00:39:37.780 |
to an identity and you can decide whether or not 00:39:41.540 |
like, "Oh no, this is really from the White House." 00:39:48.820 |
Or it's really from Jeff Weiner, CEO of LinkedIn 00:40:01.820 |
like we've had all of this technology forever. 00:40:06.740 |
Like it has to be some sort of technological thing 00:40:11.140 |
because the underlying tech that is used to create this 00:40:15.860 |
is not gonna do anything but get better over time. 00:40:24.540 |
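A minimal sketch of that signed chain-of-custody idea, using Node's built-in crypto module with an Ed25519 keypair; this illustrates the concept only, not any particular product's scheme:

```ts
import { createHash, generateKeyPairSync, sign, verify } from "node:crypto";

// The publisher (say, a newsroom) holds a long-lived keypair; the public
// key is what viewers and platforms use to check provenance.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Sign a fixed-size digest of the media rather than the raw file.
const videoBytes = Buffer.from("raw video bytes would go here");
const digest = createHash("sha256").update(videoBytes).digest();
const signature = sign(null, digest, privateKey); // Ed25519 takes no hash name

// Anyone who receives the video re-hashes it and verifies the signature;
// tampering with the bytes makes verification fail.
const receivedDigest = createHash("sha256").update(videoBytes).digest();
console.log(verify(null, receivedDigest, publicKey, signature)); // true
```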
which I think is really healthy for a democracy 00:40:26.620 |
where people will be skeptical about the thing they watch. 00:40:32.180 |
So, which is good, skepticism in general is good 00:40:35.980 |
for your personal content. - It is good, I think. 00:40:40.420 |
global skepticism about, can they trust what they read? 00:40:55.180 |
And that kind of skepticism encouraged further research 00:41:00.420 |
- As opposed to just trusting any one source. 00:41:22.500 |
and this is an experiment to test the hypothesis. 00:41:30.140 |
But stuff's also supposed to be reproducible. 00:41:33.300 |
So you know it's been vetted by this process, 00:41:35.260 |
but you also are expected to publish enough detail 00:41:38.060 |
where if you are sufficiently skeptical of the thing, 00:41:50.020 |
where your brain is sort of wired for skepticism. 00:42:00.140 |
And you're sort of curious to understand the next thing. 00:42:09.220 |
And we need a little bit more of that right now. 00:42:16.340 |
So I'm just a huge fan of many of Microsoft products. 00:42:21.340 |
I mean I still, actually in terms of I generate 00:42:48.300 |
Just like you said, I don't remember when 95 was released. 00:42:56.060 |
"Oh okay, well it's nice, it's a nice improvement." 00:42:59.100 |
So what do you see the future of these products? 00:43:04.700 |
I mean on the Office front there's gonna be these 00:43:08.620 |
increasing productivity wins that are coming out 00:43:14.740 |
of some of these AI powered features that are coming. 00:43:17.900 |
Like the products sort of get smarter and smarter 00:43:21.260 |
Like there's not gonna be this big bang moment 00:43:24.260 |
where like Clippy is gonna reemerge and it's gonna be-- 00:43:28.060 |
- Wait a minute, okay we'll have to, wait, wait, wait. 00:43:37.140 |
there's not much, or at least I'm not familiar, 00:43:43.700 |
Like a Clippy style assistant, personal assistant. 00:43:47.740 |
Do you think that there's a possibility of that 00:43:52.140 |
- So I think there are a bunch of like very small ways 00:43:54.860 |
in which like machine learning powered assistive things 00:44:04.980 |
like the auto response stuff's getting better and better 00:44:14.740 |
"Okay, this person's clearly trying to schedule a meeting." 00:44:19.300 |
So it looks at your calendar and it automatically 00:44:34.980 |
but it's like search across like all of your information 00:44:38.220 |
that's sitting inside of like your Office 365 tenant 00:44:46.940 |
And like we have this thing called the Microsoft Graph 00:44:53.420 |
sort of like gets you hooked up across the entire breadth 00:45:01.640 |
before they got woven together with the graph. 00:45:07.860 |
with increasing effectiveness sort of plumbed 00:45:15.860 |
like automatically retrieve information for you. 00:45:18.220 |
Like if, you know, like I frequently send out, 00:45:22.980 |
I can't find a paper or a document or whatnot. 00:45:25.380 |
There's no reason why the system won't be able 00:45:34.460 |
like a fully integrated, you know, assistant. 00:45:45.100 |
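As a rough sketch of what retrieving information across a tenant looks like programmatically, here is a call to the public Microsoft Graph search endpoint; acquiring the OAuth access token is omitted, and the request and response shapes are simplified:

```ts
// Query the Microsoft Graph Search API for files in an Office 365 tenant.
// Assumes GRAPH_TOKEN holds a valid OAuth token; paging and error
// handling are omitted for brevity.
const token = process.env.GRAPH_TOKEN;

const res = await fetch("https://graph.microsoft.com/v1.0/search/query", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    requests: [
      {
        entityTypes: ["driveItem"], // files in OneDrive/SharePoint
        query: { queryString: "transfer learning paper" },
      },
    ],
  }),
});

const data = await res.json();
// Hits come back grouped as value[].hitsContainers[].hits.
console.log(data.value?.[0]?.hitsContainers?.[0]?.hits);
```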
where like Clippy comes back and you've got this like, 00:45:57.940 |
collaboration co-authoring stuff's getting better. 00:46:09.020 |
like more and more of it is happening inside of like Teams 00:46:13.000 |
as a canvas and like, it's this thing where, you know, 00:46:17.180 |
you've got collaboration is like at the center 00:46:20.620 |
of the product and like we built some like really cool stuff 00:46:25.620 |
some of which is about to be open sourced 00:46:53.360 |
- We're already a little bit better than that. 00:46:55.240 |
And like, you know, so like the fact that you're unaware of, 00:47:01.940 |
But yeah, I mean, it's already like got a huge, 00:47:07.160 |
And like part of, you know, part of this framework stuff, 00:47:11.040 |
like we've been working on it for a couple of years. 00:47:14.480 |
So like I know the internal code name for it, 00:47:20.720 |
And, but like what Fluid lets you do is like, 00:47:25.060 |
you can go into a conversation that you're having in Teams 00:47:27.900 |
and like reference, like part of a spreadsheet 00:47:32.560 |
where somebody is like sitting in the Excel canvas, 00:47:35.580 |
like working on the spreadsheet with a, you know, 00:47:39.080 |
And like, you can sort of embed like part of the spreadsheet 00:47:45.920 |
And like all of the changes that you're making 00:47:51.240 |
coordinate and everything is sort of updating in real time. 00:47:54.640 |
So like you can be in whatever canvas is most convenient 00:48:00.400 |
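For a sense of the programming model, here is a minimal sketch in the spirit of the public Fluid Framework quick start: a shared map that several clients edit, with changes propagated live. Package names and signatures vary by version, so treat this as illustrative:

```ts
import { SharedMap } from "fluid-framework";
import { TinyliciousClient } from "@fluidframework/tinylicious-client";

// Connect to a local dev Fluid service and create a container holding one
// shared key/value structure ("cells" stands in for spreadsheet cells).
const client = new TinyliciousClient();
const schema = { initialObjects: { cells: SharedMap } };
const { container } = await client.createContainer(schema);
const cells = container.initialObjects.cells as SharedMap;

// Every attached client observes writes from every other client in near
// real time, which is what lets a spreadsheet fragment stay live inside
// a chat, a document, or any other canvas.
cells.on("valueChanged", (changed) => {
  console.log(`${changed.key} is now`, cells.get(changed.key));
});
cells.set("A1", 42);
```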
- So, out of my own sort of curiosity as an engineer, 00:48:08.200 |
Microsoft has, I don't know what the numbers are, 00:48:43.200 |
and yet maintain, like what does it take to lead 00:48:58.520 |
very, very important for big engineering teams. 00:49:02.360 |
Like one is like having some sort of forethought 00:49:06.320 |
about what it is that you're gonna be building 00:49:11.080 |
Like not exactly, like you don't need to know 00:49:13.120 |
that like, you know, I'm putting all my chips 00:49:15.320 |
on this one product and like, this is gonna be the thing. 00:49:17.840 |
But like, it's useful to know like what sort of capabilities 00:49:32.760 |
it's also like, what is your development process look like? 00:49:36.760 |
Like what culture do you want to build around? 00:49:40.040 |
Like how you're, you know, sort of collaborating together 00:49:45.800 |
And so like having an opinion and investing in that 00:49:48.120 |
is like, it just gets more and more important. 00:49:50.520 |
And like the sooner you can get a concrete set of opinions, 00:49:57.720 |
Like you can wing it for a while, at small scales. 00:50:03.200 |
like you don't have to be like super specific about it. 00:50:06.360 |
But like the biggest miseries that I've ever seen 00:50:16.760 |
And then you just sort of go create a bunch of technical debt 00:50:20.280 |
and like culture debt that is excruciatingly painful 00:50:28.720 |
Like the other, you know, another bundle of things is, 00:50:46.280 |
because like you think you should have a mission, 00:50:52.920 |
like where it is that you're headed together. 00:50:55.720 |
Like I know it's like probably like a little bit 00:50:59.160 |
too popular right now, but Yuval Harari's book "Sapiens," 00:51:04.160 |
one of the central ideas in his book is that like 00:51:12.320 |
storytelling is like the quintessential thing 00:51:16.880 |
for coordinating the activities of large groups of people. 00:51:37.560 |
and understand what the dynamics are between all the people. 00:51:40.960 |
But like past that, like things just sort of start 00:51:50.480 |
And so like, even though it sounds touchy feely 00:51:55.640 |
will sort of balk at the idea that like you need 00:52:21.800 |
Our laws are, I mean, like we believe very, very, 00:52:30.000 |
But like they are, they're just abstract things. 00:52:34.040 |
Like if we don't believe in them, they're nothing. 00:52:36.560 |
- And in some sense, those stories are platforms 00:52:39.480 |
and the kinds, some of which Microsoft is creating, right? 00:52:43.080 |
Yeah, platforms on which we define the future. 00:52:56.320 |
looks like for computing, for technology, for devices? 00:53:00.160 |
Do you have crazy ideas about the future of the world? 00:53:04.640 |
- Yeah, look, I think we're entering this time 00:53:08.080 |
where we have technology that is progressing 00:53:15.840 |
and you've got some really big social problems, 00:53:20.840 |
like society-scale problems that we have to tackle. 00:53:26.300 |
And so, I think we're gonna rise to the challenge 00:53:32.400 |
with all of the big challenges that are facing us, 00:53:46.800 |
And like global warming is gonna make it increasingly 00:53:49.360 |
difficult to feed the global population in particular, 00:54:03.600 |
like it can do like incredible things to empower all of us 00:54:35.080 |
Like that's more and more important every day 00:54:40.880 |
and like the rest of the industrialized world. 00:54:58.040 |
And like, you're gonna have more retired people 00:55:02.480 |
sort of natural questions about who's gonna take care 00:55:04.840 |
of all the old folks and who's gonna do all the work. 00:55:07.140 |
And the answers to like all of these sorts of questions, 00:55:11.040 |
like where you're sort of running into, you know, 00:55:16.080 |
the world and of society has always been like, 00:55:20.080 |
what tech is gonna like help us get around this? 00:55:23.000 |
You know, like when I was a kid in the 70s and 80s, 00:55:30.640 |
like we're not gonna be able 00:55:47.520 |
And like some of that was like taking some of the things 00:55:49.300 |
that we knew in the West and like getting them distributed 00:55:55.780 |
And like part of it were things like, you know, 00:55:59.360 |
just smarter biology, like helping us increase yields. 00:56:03.280 |
And like, we don't talk about like overpopulation anymore 00:56:10.320 |
we sort of figured out how to feed the world. 00:56:14.840 |
And so like, I'm super, super hopeful about the future 00:56:19.500 |
and in the ways where we will be able to apply technology 00:56:24.080 |
to solve some of these super challenging problems. 00:56:28.040 |
Like I've, like one of the things that I'm trying 00:56:41.160 |
Like if we, you know, if we get overly pessimistic right now 00:56:44.320 |
about like the potential future of technology, 00:56:52.760 |
to get all of the things in place that we need 00:57:00.640 |
because you're leading large groups of engineers 00:57:04.160 |
that are actually defining, that are writing that story, 00:57:06.720 |
that are helping build that future, which is super exciting. 00:57:19.360 |
So Kevin, thank you so much for talking to me.