
Satya Nadella | BG2 w/ Bill Gurley & Brad Gerstner


Chapters

0:00 Intro
1:31 Becoming Microsoft CEO
6:42 Satya’s Memo to CEO Committee
10:42 Satya’s Advantage as a CEO
11:34 Advice for CEOs
15:01 Microsoft’s Investment in OpenAI
19:42 AI Arms Race
23:55 Legacy Search and Consumer AI
28:07 The Future of AI Agents
38:32 Near-Infinite Memory
39:47 Copilot Approach to AI Adoption
50:26 Leveraging AI within Microsoft
56:03 CapEx
1:00:20 The Cost of Model Scaling and Inference
1:15:15 OpenAI Conversion to For-Profit
1:18:05 Next Steps for OpenAI
1:19:43 Open vs. Closed and Safe AI

Transcript

00:00:00.000 | I think the company of this generation
00:00:02.180 | has already been created, which is OpenAI, in some sense.
00:00:06.600 | It's kind of like the Google or the Microsoft
00:00:08.920 | or the Meta of this era.
00:00:12.260 | [MUSIC PLAYING]
00:00:15.160 | Well, it's great to be with you.
00:00:27.600 | When Bill and I were talking, Satya,
00:00:30.160 | and looking back at your tenure as CEO,
00:00:33.840 | it was really quite astonishing.
00:00:36.040 | You know, you started at Microsoft in 1992,
00:00:38.400 | for those who may not know.
00:00:39.640 | You took over online in 2007.
00:00:42.920 | You launched Bing Search in 2009.
00:00:46.320 | You took over servers and launched Azure in 2011.
00:00:50.640 | And you became CEO in 2014.
00:00:53.000 | And it was just before that that a pretty now well-known essay
00:00:57.320 | entitled "The Irrelevance of Microsoft"
00:00:59.400 | had just been published.
00:01:01.480 | Now, since then, you've taken Azure from $1 billion
00:01:05.680 | to $66 billion run rate.
00:01:08.440 | The total revenues of the business are up 2.5x.
00:01:12.320 | The total earnings are up over 3x.
00:01:15.480 | And the share price is up almost 10x.
00:01:17.640 | You've added almost $3 trillion of value
00:01:21.440 | to Microsoft shareholders.
00:01:23.480 | And as you reflect back on that over the course of last decade,
00:01:28.080 | what's the single greatest change
00:01:30.920 | that you thought you could do then to unlock the value,
00:01:35.120 | to change the course of Microsoft,
00:01:37.720 | which has been just an extraordinary success?
00:01:40.080 | Yeah.
00:01:44.360 | So the way I've always thought, Brad,
00:01:48.800 | about sort of that entire period of time
00:01:51.840 | is some sense from '92 to now,
00:01:55.800 | it's just one continuous sort of period for me.
00:02:01.200 | Although, obviously, 2014 was a big event
00:02:05.320 | with the accountability that goes with it.
00:02:09.680 | But what I felt was essentially pattern match
00:02:14.400 | when we were successful and when we were not,
00:02:18.760 | and do more of the former and less of the latter.
00:02:21.480 | I mean, in some sense, it's as simple as that,
00:02:23.600 | because I've sort of lived through...
00:02:26.000 | When I joined in '92,
00:02:27.680 | that was just after Windows 3.1 was launched.
00:02:32.160 | I think Windows 3.1 was May of '92,
00:02:36.040 | and I joined in November of '92.
00:02:37.840 | In fact, I was working at Sun,
00:02:40.360 | and I was thinking of going to business school,
00:02:42.560 | and I got an offer at Microsoft.
00:02:44.640 | And I said, "Ah, maybe I'll go to business school."
00:02:47.240 | And then somehow or other,
00:02:49.120 | the boss who was hiring me convinced me
00:02:51.320 | to just come to Microsoft.
00:02:52.680 | And it was like the best decision,
00:02:53.960 | because the thing that convinced me
00:02:55.640 | was the PDC of '91 in Moscone Center,
00:02:59.360 | when I went and saw basically Windows NT
00:03:05.000 | (it was not called Windows NT at that time) and x86.
00:03:09.920 | And I said, "God, this, what's happening in the client
00:03:13.240 | will happen on the server."
00:03:16.240 | And this is a platform company and a partner company,
00:03:19.640 | and they're going to ride the wave.
00:03:21.320 | And so that was sort of the calculus then.
00:03:24.720 | Then, of course, the web happened.
00:03:27.240 | We managed that transition.
00:03:28.800 | We got a lot of things right.
00:03:30.520 | Like, for example, I mean, we recognized the browser.
00:03:34.280 | We competed and got that browser thing eventually right.
00:03:40.960 | We missed search, right?
00:03:42.640 | We sort of felt like, wow, the big thing there was the browser
00:03:47.320 | because it felt more like an operating system,
00:03:49.560 | but we didn't understand the new category,
00:03:51.600 | which is the organizing layer of the internet
00:03:54.440 | happened to be search.
00:03:56.760 | Then we kind of were there in mobile,
00:04:01.040 | but we really didn't get it right.
00:04:02.680 | Obviously, the iPhone happened, and we got the cloud right.
00:04:06.880 | So if I look at it, and then we are here,
00:04:09.040 | we are on the fourth one on AI.
00:04:12.600 | In all of those cases, I think it's about not doing things
00:04:16.120 | just because somebody else got it
00:04:19.240 | and we feel we need to do the same.
00:04:21.520 | Sometimes it's okay to fast follow, and it worked out,
00:04:25.000 | but you shouldn't do things out of envy.
00:04:26.560 | That was one of the hardest lessons I think we've learned.
00:04:29.880 | Do it because you have permission to do this,
00:04:34.000 | and you can do it better.
00:04:35.200 | Like, both of those matter to me, the brand permission.
00:04:37.720 | Like, you know, Jeffrey Moore once said this to me,
00:04:40.080 | which I say, "Hey, why don't you go do things
00:04:42.160 | which your customers expect you to do?"
00:04:44.200 | I love that, right, which is cloud was one such thing
00:04:47.400 | that the customers expected of us. You know,
00:04:48.840 | in fact, when I remember first showing up with Azure,
00:04:52.760 | people would tell me, "Oh, it's a winner-take-all,
00:04:54.760 | it's all over, and Amazon's won it all."
00:04:59.280 | I never believed it, because after all,
00:05:01.200 | I'd competed against Oracle and IBM in servers,
00:05:04.440 | and I always felt like, look,
00:05:06.120 | it's just never going to be winner-take-all
00:05:08.000 | when it comes to infrastructure.
00:05:09.880 | And all you need to do is just get into the game
00:05:14.520 | with a value proposition.
00:05:15.760 | So, in some sense, a lot of these transitions for me
00:05:19.840 | have been about making sure
00:05:21.680 | you kind of recognize your structural position,
00:05:25.080 | you really get a good understanding
00:05:27.480 | of where you have permission from those partners
00:05:31.040 | and customers who want you to win,
00:05:33.800 | and go do those obvious things first.
00:05:37.240 | And I think, you know, you could call it,
00:05:40.120 | "Hey, that's the basics of strategy,"
00:05:43.200 | but that's sort of what I feel,
00:05:45.520 | I think, at least has been key.
00:05:49.080 | And, you know, there are things that we cultivated,
00:05:52.440 | to your point, Brad, which is, you know,
00:05:54.320 | there's the sense of purpose and mission,
00:05:57.200 | the culture that you need to have,
00:06:00.160 | all those are the most,
00:06:01.480 | I would say those are the necessary conditions
00:06:05.080 | to even have a real chance for shots on goal.
00:06:09.920 | But I would just say getting that strategy right
00:06:13.280 | by recognizing your structural position and permission
00:06:16.480 | is probably what I have, you know,
00:06:19.040 | hopefully done a reasonable job on.
00:06:20.760 | - Satya, before we move on to AI,
00:06:22.760 | I have a couple of questions about the transition
00:06:25.240 | and just echoing what Brad said.
00:06:27.600 | I mean, there's a, I think that it's definitive
00:06:31.160 | that you may be the best CEO hire of all time.
00:06:34.360 | I mean, 3 trillion is unmatched.
00:06:37.320 | So one, I read an article that suggested,
00:06:41.480 | and maybe this isn't true, so you tell us,
00:06:43.080 | that you wrote a 10-page memo to the committee
00:06:46.400 | that was choosing the CEO.
00:06:49.200 | Is that true?
00:06:50.160 | And what was in the memo?
00:06:51.520 | - Yeah, it is true.
00:06:54.480 | Yeah, because I think our CEO process
00:06:59.040 | was pretty open out there.
00:07:01.560 | And at that time, quite frankly,
00:07:04.760 | it is definitely not obvious to me
00:07:07.720 | that one, in the beginning, remember,
00:07:10.720 | I never thought that first, Bill would leave,
00:07:14.240 | and then second, Steve would leave, right?
00:07:16.160 | It's not like you join Microsoft and think,
00:07:18.160 | oh yeah, you know, founders are going to retire
00:07:20.040 | and there's going to be a job opening
00:07:21.520 | and you can apply for it.
00:07:22.480 | I mean, that was not the mental model
00:07:25.000 | growing up at Microsoft.
00:07:26.280 | So when Steve decided to retire, I forget now,
00:07:29.040 | I think in August of 2013, there was a pretty big shock.
00:07:32.320 | And you know, at that time I was running
00:07:33.840 | our server and tools business, as it was called,
00:07:36.360 | in which Azure was housed and so on,
00:07:38.280 | and I was having a lot of fun.
00:07:39.760 | And I didn't even put up my hand first,
00:07:42.200 | saying, oh, I want to be CEO,
00:07:43.480 | because it was not even like a thing
00:07:46.000 | that I was thinking that'll happen.
00:07:48.360 | And then eventually the board came around and asked,
00:07:51.880 | and there were a lot of other candidates at that time,
00:07:53.680 | even internally at Microsoft.
00:07:56.280 | And so at some point in that process,
00:07:59.720 | they asked us to write.
00:08:00.800 | And quite frankly, it's fascinating.
00:08:02.960 | That memo, everything I said in it, right?
00:08:06.440 | You know, one of the terms I used in that memo,
00:08:09.560 | which I subsequently used even in the first piece of email
00:08:12.880 | I sent out to the company,
00:08:15.080 | was ambient intelligence and ubiquitous computing.
00:08:19.920 | And I dumbed it down to mobile first, cloud first later,
00:08:22.800 | because, you know, my PR folks came and said,
00:08:24.880 | what the heck is this?
00:08:25.760 | Nobody will understand what ambient computing is
00:08:28.360 | and, you know, ubiquitous or other ambient intelligence
00:08:30.640 | and ubiquitous computing.
00:08:31.720 | But that was the mobile first, cloud first.
00:08:34.160 | How do you really go where the secular shift is?
00:08:37.520 | Then understanding our structural position,
00:08:40.240 | thinking about Microsoft Cloud,
00:08:42.160 | what are the assets we have?
00:08:43.720 | Why M365?
00:08:44.880 | In fact, one of the things I've always resisted
00:08:48.160 | is thinking of our cloud,
00:08:50.120 | the way the market segments it, right?
00:08:52.000 | The market segments it, oh, here is IaaS.
00:08:55.360 | Even Brad, the way he described it earlier,
00:08:57.200 | to me, I've never, I don't allocate my capital
00:08:59.720 | thinking here is the Azure capital,
00:09:01.800 | here is the M365 capital, here is, you know, gaming.
00:09:06.600 | I kind of think of, hey, there's a cloud infra.
00:09:09.120 | That's the core theory of the firm for me.
00:09:12.080 | On top of it, I have a set of workloads.
00:09:14.360 | One of those workloads happens to be Azure.
00:09:16.720 | The other one is M365, dynamics, gaming, what have you.
00:09:20.600 | And so in some sense, that was all in that memo
00:09:24.720 | and pretty much has played out.
00:09:28.960 | And one of the assumptions at that time was that this,
00:09:33.240 | you know, we had a 98%, 99% gross margin business
00:09:37.480 | in our servers and clients.
00:09:38.960 | And people said, oh, you know, good news.
00:09:40.560 | You now can move to the cloud
00:09:41.920 | and maybe you'll have some margin.
00:09:43.760 | And so that was the transition.
00:09:46.440 | And my gut was, it is going to be less GM,
00:09:51.440 | but the TAM is bigger.
00:09:53.080 | You know, we'll sell more to small businesses.
00:09:57.560 | We will sell more in aggregate in terms of even upsell,
00:10:02.560 | like the consumption would increase, right?
00:10:05.120 | Because, you know, we had sold a bit of exchange,
00:10:08.240 | but if you think about it, exchange, SharePoint, Teams,
00:10:10.920 | now everything expanded.
00:10:12.560 | So that was the basic arc that I had in that memo.
00:10:16.600 | - Was there any element of cultural shift?
00:10:19.600 | I mean, there are
00:10:22.360 | CEO hires made in the world all the time
00:10:26.080 | and many of them fail.
00:10:27.320 | I mean, Intel's going through a second reboot here
00:10:30.560 | as we speak.
00:10:31.840 | And as Brad pointed out,
00:10:33.440 | there were people arguing,
00:10:34.920 | oh, Microsoft's the next IBM or DEC,
00:10:37.280 | that its better days are over.
00:10:38.880 | So what did you do and what would you advise new CEOs
00:10:42.280 | that come on to kind of reboot the culture
00:10:44.680 | and get it moving in a different direction?
00:10:46.840 | - Yeah, one of the advantages I think I had
00:10:52.080 | was I was a consummate insider, right?
00:10:54.480 | I mean, having grown up pretty much
00:10:57.160 | all my professional career at Microsoft.
00:10:59.400 | And so in some sense,
00:11:02.840 | if I would even criticize our culture,
00:11:05.760 | it was criticizing myself.
00:11:07.120 | So in a sense, the break I got was,
00:11:10.400 | it never felt like somebody from the outside coming
00:11:13.040 | and criticizing these folks who are here,
00:11:16.400 | versus it's about mostly pointing the finger
00:11:18.880 | right back at me,
00:11:19.880 | because I was pretty much part of the culture, right?
00:11:22.800 | I couldn't say anything that I was not part of.
00:11:26.640 | And so I felt like, to your point, Bill,
00:11:30.640 | I distinctly remember,
00:11:31.960 | I think the first time Microsoft
00:11:33.520 | became the largest market cap company,
00:11:36.480 | I remember walking around the campus,
00:11:39.040 | all of us, including me,
00:11:40.400 | we were all strutting around as if we were like,
00:11:43.960 | the best thing to happen to humankind, right?
00:11:48.840 | And it is all our brilliance
00:11:50.400 | that's finally reflected in the market cap.
00:11:52.440 | And somehow it stuck with me that, God,
00:11:56.360 | that is the culture that you want to avoid, right?
00:11:59.480 | Because as I always say,
00:12:01.320 | from sort of ancient Greece
00:12:03.360 | to modern Silicon Valley,
00:12:04.600 | there's only one thing that brings civilizations,
00:12:07.720 | countries, and companies down, which is hubris.
00:12:11.600 | And so one of the greatest breaks is,
00:12:14.520 | my wife had introduced me to a book by Carol Dweck,
00:12:18.120 | a few years before I became CEO,
00:12:19.840 | which I read on growth mindset,
00:12:21.840 | more in the context of my children's education
00:12:25.080 | and parenting and what have you.
00:12:27.000 | And I said, God, this thing is like the best.
00:12:29.480 | You know, all of us are always talking about learning
00:12:31.840 | and learning cultures and so on.
00:12:34.160 | And this was the best cultural meme
00:12:36.880 | we could have ever picked.
00:12:38.040 | So I attribute a lot of our success culturally to that meme,
00:12:41.560 | because, the other thing,
00:12:43.360 | the nice thing about that, Bill,
00:12:45.040 | was it is not trademarked Microsoft,
00:12:48.120 | and it's not some new dogma from a CEO.
00:12:51.640 | It's a thing that speaks to work and life.
00:12:55.240 | You can be a better parent, a better partner,
00:12:57.600 | a better friend, a neighbor, and a manager and a leader.
00:13:00.320 | So we picked that,
00:13:02.120 | and the pithy way I've always characterized it is,
00:13:05.640 | hey, go from being the know-it-alls to learn-it-alls,
00:13:09.360 | and it's a destination you never reach,
00:13:11.680 | because the day you say, I have a growth mindset,
00:13:14.080 | means you don't have a growth mindset by definition.
00:13:16.440 | And so it has been very, very helpful for us.
00:13:21.720 | And it's like all cultural change,
00:13:24.320 | you got to give it time, oxygen, breathing space,
00:13:28.720 | and it's both top-down and bottom-up,
00:13:32.040 | and it middles out, right?
00:13:33.320 | Which is, there's not a single meeting
00:13:35.960 | that I do with the company,
00:13:37.480 | or even my executive staff, or what have you,
00:13:39.800 | where I don't start with mission culture.
00:13:41.600 | Those are the two bookends.
00:13:42.920 | And I've been very, like, the other thing is,
00:13:45.680 | I've been super disciplined on my framework.
00:13:48.360 | To your point about that memo,
00:13:50.440 | pretty much for the last, now close to 11 years,
00:13:55.040 | the arc is the same.
00:13:58.320 | Mission culture, it's the worldview, right?
00:14:01.960 | That ambient intelligence, ubiquitous computing,
00:14:04.640 | and then the specific set of products/strategies.
00:14:08.760 | That frame, I pick and choose every word.
00:14:12.960 | I'm very, very deliberate about it.
00:14:15.120 | I repeat it until I'm bored stiff, but I just stay on it.
00:14:20.000 | Well, speaking of that, you've, you know,
00:14:22.280 | you mentioned the phase shifts that we've been through,
00:14:25.800 | and I've heard you say that as a large platform company,
00:14:30.560 | most of the value capture, right,
00:14:33.480 | is determined in that first three or four years
00:14:36.080 | of the phase shift,
00:14:37.160 | when the market position is established, Satya.
00:14:40.200 | You know, I've heard you say, you basically, you know,
00:14:42.600 | Microsoft was coming off of having missed search,
00:14:45.840 | having largely missed mobile,
00:14:47.560 | and I've heard you say,
00:14:49.000 | caught the last train out of town on cloud, right?
00:14:53.000 | So as you started thinking about the next big phase shift,
00:14:56.360 | it appears that you and others in the team,
00:14:59.720 | Kevin Scott, sniffed out pretty early
00:15:02.320 | that Google was likely ahead in AI with DeepMind.
00:15:10.120 | You make the decision to invest in OpenAI.
00:15:10.120 | What convinced you of this direction, right,
00:15:14.720 | versus the internal AI research efforts
00:15:17.640 | that you had underway?
00:15:18.920 | - Yeah, it's a great point,
00:15:21.840 | because there are a couple of things there, right?
00:15:25.120 | One is we were at it on AI for a long, long time.
00:15:30.120 | Obviously, you know, when Bill started MSR in 1995,
00:15:35.560 | I think, you know, the first group,
00:15:37.680 | I mean, he was always into this natural user interface.
00:15:40.640 | I think the first group was speech.
00:15:43.000 | You know, Rick Rashid came.
00:15:44.640 | There was, you know, in fact, Kaifu worked here,
00:15:47.320 | and, you know, we had a lot of, I would say,
00:15:52.320 | focus on trying to crack natural user interface.
00:15:56.040 | Language was always something that we cared about, right?
00:16:00.680 | In fact, even Hinton worked,
00:16:02.560 | like some of the early work in DNNs
00:16:05.280 | happened when he was in residency in MSR,
00:16:09.520 | and then Google hired him.
00:16:10.960 | So we missed, I would say, even in the early 2010s,
00:16:15.960 | some of what could have been doubling down
00:16:20.240 | at around the same time that Google doubled down
00:16:24.040 | and bought even DeepMind, right?
00:16:26.280 | And so that actually bothered me quite a bit,
00:16:30.040 | but I always wanted to focus.
00:16:32.320 | Like, for example, Skype Translate
00:16:34.360 | was one of the first things I focused on,
00:16:36.800 | because that was pretty cool.
00:16:39.680 | Like, that was the first time
00:16:41.440 | you could see transfer learning work, right,
00:16:44.040 | which is you could train it on one language pair,
00:16:46.320 | and it got better on another language, right?
00:16:48.360 | That was the first place where we could say,
00:16:51.240 | "Wow, machine translation is also with DNNs,
00:16:54.040 | like, it's different."
00:16:56.240 | And so ever since I've been obsessed with language,
00:16:59.080 | along with Kevin, in fact, the first time,
00:17:02.360 | actually, Elon and Sam,
00:17:06.320 | they were looking for, obviously, Azure credits
00:17:08.720 | and what have you, and we gave them some credits.
00:17:12.160 | And that time, they were more into RL and Dota 2
00:17:15.440 | and what have you, and that was interesting.
00:17:18.560 | And then we stopped for, I forget even exactly what happened,
00:17:21.560 | and then they, I think, went to GCP.
00:17:23.640 | And then they came back to talk about
00:17:27.880 | sort of what they wanted to do with language.
00:17:31.080 | That was the moment, right,
00:17:32.560 | which they talked about transformers and natural language.
00:17:35.320 | And because I always felt like, look,
00:17:37.200 | because that's, to me, our core business.
00:17:39.200 | And it goes back a little bit to how I think,
00:17:41.320 | which is what's our structural position?
00:17:43.120 | I knew always that if there was a way
00:17:45.440 | to have a nonlinear breakthrough
00:17:47.440 | in terms of some model architecture
00:17:49.360 | that sort of exhibited, you know,
00:17:51.960 | like one of the things that Bill, you know,
00:17:54.480 | he'd always say throughout our career here
00:17:57.120 | was there's only one category in digital.
00:17:59.200 | It's called information management.
00:18:00.840 | The way he thought about it was you schematize the world,
00:18:03.680 | right, take people, places, things, you know,
00:18:06.600 | just build a schema, right?
00:18:08.040 | We went down many, you know,
00:18:10.200 | there was this very infamous project
00:18:12.400 | called WinFS at Microsoft,
00:18:14.960 | which was all about schematize everything.
00:18:17.240 | And then, you know, you'll make sense of all information.
00:18:20.600 | And this was, it was just, it's just impossible to do.
00:18:24.360 | And so, therefore, you needed some breakthrough.
00:18:26.680 | And we said, maybe the way to do that is how we schematize.
00:18:30.360 | After all, the human brain does it through language
00:18:32.720 | and in a monologue and reasoning.
00:18:35.000 | And so, therefore, anyway, so that's what led me to OpenAI
00:18:39.480 | and quite frankly, the ambition
00:18:41.080 | that Sam and Greg and team had.
00:18:44.320 | And that was the other thing, right?
00:18:45.760 | Scaling laws.
00:18:46.600 | In fact, I think the first memo, weirdly enough,
00:18:49.040 | I read on scaling was written by Dario
00:18:52.840 | when he was at OpenAI and Ilya.
00:18:56.640 | And that's sort of what, like I said,
00:18:59.400 | let's take a bet on this, right?
00:19:00.760 | Which is, hey, wow, if this is going to have
00:19:03.200 | exponential performance, why not go all in
00:19:08.040 | and give it a real shot?
00:19:09.840 | And then, of course, once we started seeing it work
00:19:13.480 | on GitHub, Copilot and so on,
00:19:15.120 | then it was pretty easy to double down.
00:19:17.560 | But that was the intuition.
00:19:18.920 | - One of the things that has happened, I think,
00:19:22.520 | in previous phase shifts is some of the incumbents
00:19:26.120 | don't get on board fast enough.
00:19:28.080 | You even talked about Microsoft perhaps missing mobile
00:19:31.960 | or search or that kind of thing.
00:19:34.000 | I could argue, especially since I'm old
00:19:36.320 | and I've seen these shifts,
00:19:37.920 | that everyone's awake on this one.
00:19:40.720 | Or it's the most awake.
00:19:42.960 | Like it's heavily choreographed.
00:19:45.320 | Everyone's maybe at the starting line
00:19:47.480 | at almost the same time.
00:19:49.000 | I'm curious if you agree with that
00:19:51.440 | and how you think about the key players in the race.
00:19:55.760 | Google, Amazon, Meta with Llama, Elon has entered the game.
00:20:00.600 | (laughs)
00:20:02.440 | - Yeah, it's an interesting one.
00:20:03.840 | To your point about...
00:20:05.480 | I always think about it, right?
00:20:06.760 | If you sort of say take the late '90s,
00:20:09.160 | there was Microsoft, and then there was daylight.
00:20:12.680 | And then there was the rest.
00:20:16.240 | Interestingly enough, now people talk about the Mag 7.
00:20:20.280 | There is probably more than that,
00:20:22.400 | even to your point about everybody's awake to it.
00:20:25.560 | They all have amazing balance sheets.
00:20:28.920 | There are even, I think, I'll call it...
00:20:31.320 | If you think about OpenAI,
00:20:33.000 | in some sense, you could say it's Mag 8
00:20:35.600 | because I think the company of this generation
00:20:38.200 | has already been created,
00:20:39.440 | which is OpenAI in some sense.
00:20:42.640 | It's kind of like the Google or the Microsoft
00:20:44.960 | or the Meta of this era.
00:20:49.200 | And so there are a couple of things.
00:20:52.640 | So therefore, I think it's going to be very competitive.
00:20:56.040 | I also think that I don't think
00:20:58.880 | it's going to be winner-take-all, right?
00:21:00.320 | Because there may be some categories
00:21:02.400 | that may be winner-take-all.
00:21:03.840 | For example, on the hyperscale side, absolutely not, right?
00:21:07.920 | I mean, the world will demand,
00:21:10.760 | even ex-China, multiple providers of frontier models
00:21:15.760 | distributed all over the world.
00:21:18.960 | In fact, one of the best structural positions
00:21:22.520 | that I think Microsoft has is...
00:21:25.680 | Because if you remember,
00:21:27.720 | the Azure structure is slightly different, right?
00:21:30.400 | We built out Azure for enterprise workloads
00:21:34.520 | with a lot of data residency,
00:21:36.760 | with lots...
00:21:37.720 | We have 60-plus regions, more regions than others.
00:21:40.120 | So it was not like we built our cloud for one big app.
00:21:44.680 | We built cloud for a lot of heterogeneous
00:21:48.960 | enterprise workloads,
00:21:50.520 | which I think in the long run
00:21:52.160 | is where all the inference demand will be
00:21:55.240 | with nexus to data and the app server and what have you.
00:21:59.080 | So I think there is going to be multiple winners
00:22:02.440 | at the infrastructure layer.
00:22:04.400 | There is going to be in the models even there,
00:22:08.440 | just the model and the app servers
00:22:10.680 | that each hyperscaler will have a bunch of models
00:22:14.160 | and there will be an app server around...
00:22:16.320 | Like every app today, even including Copilot,
00:22:18.520 | it's just a multi-model app.
00:22:20.520 | And so there's, in fact, a complete new app server.
00:22:23.280 | Like everyone, there was a mobile app server,
00:22:25.360 | there was a web app server, and guess what?
00:22:26.920 | There's an AI app server now.
00:22:28.400 | And for us, that's Foundry and we're building one
00:22:30.720 | and others will build.
00:22:32.040 | There'll be multiple of those.
00:22:34.080 | Then in apps, I think there will be more folk...
00:22:38.160 | I would say network effects are always going to be
00:22:40.400 | at the software layer, right?
00:22:41.600 | So at the app layer.
00:22:43.640 | There'll be different network effects in consumer,
00:22:46.000 | in the enterprise and what have you.
00:22:47.840 | And so to your fundamental point,
00:22:50.680 | I think you have to analyze it structurally, layer by layer.
00:22:54.840 | And there is going to be fierce competition
00:22:57.680 | between the seven, eight, nine, 10 of us
00:23:00.880 | at different layers of the stack.
00:23:02.320 | And as I always say to our team,
00:23:05.040 | which is watch for the one who comes and adds to it, right?
00:23:10.040 | That's the game you're all in,
00:23:12.200 | where you're always looking at
00:23:13.600 | which new entrepreneur will come out of the blue.
00:23:16.280 | And at least I would say OpenAI is one such company,
00:23:19.520 | which at this point has reached escape velocity.
00:23:22.240 | Yeah, if we think about the app layer for a second,
00:23:26.480 | start with consumer AI a little bit here, Satya.
00:23:30.080 | Bing's a very large business.
00:23:32.160 | You and I've discussed 10 blue links
00:23:34.880 | was maybe the best business model
00:23:36.520 | in the history of capitalism,
00:23:38.440 | but it's massively threatened by a new modality
00:23:42.280 | where consumers just want answers, right?
00:23:44.560 | For example, my kids, they're like,
00:23:46.400 | why would I go to a search engine
00:23:47.920 | when I can just get answers?
00:23:49.640 | So do you think, first, can Google and Bing
00:23:53.240 | continue to grow the legacy search businesses
00:23:56.640 | in the age of answers?
00:23:58.480 | And then what does Bing need to do
00:24:02.560 | or your consumer efforts under Mustafa need to do
00:24:05.640 | in order to compete with ChatGPT,
00:24:08.960 | which really looks like it's broken out
00:24:11.640 | from a consumer perspective?
00:24:14.240 | - Yeah, I mean, I think the first thing
00:24:17.000 | is what you said last, which is chat meets answers.
00:24:21.680 | And that's ChatGPT, both the brand, the product,
00:24:25.680 | and it's becoming stateful, right?
00:24:27.560 | I mean, like ChatGPT now is not just,
00:24:30.640 | in fact, search was stateless,
00:24:33.720 | there was search history,
00:24:34.960 | but I think more so these agents
00:24:38.400 | will be a lot more stateful.
00:24:40.200 | So, in fact, so that's why I was so thrilled.
00:24:44.520 | Like I've been trying to get an Apple search deal
00:24:47.480 | for like 10 years.
00:24:49.560 | And so when Tim finally did a deal with Sam,
00:24:52.280 | I was like the most thrilled person,
00:24:53.800 | which is better, it's better to have ChatGPT
00:24:57.800 | get that deal than anybody else,
00:25:00.280 | because we have that commercial
00:25:02.760 | and investor relationship with OpenAI.
00:25:06.560 | So to that point, the way I look at it and say,
00:25:09.680 | is at the same time, distribution matters, right?
00:25:13.560 | I mean, this is where Google
00:25:14.800 | has an enormous advantage, right?
00:25:16.880 | They have the distribution on Apple.
00:25:19.800 | They're the default.
00:25:21.720 | They are obviously the default on Android.
00:25:24.720 | They touch, so therefore I think,
00:25:26.360 | and the habits don't go away, right?
00:25:29.440 | I mean, the number of times
00:25:30.560 | you just go to the browser URL
00:25:33.120 | and just type in your query, right?
00:25:35.160 | I mean, even now, even though I wanna go to Copilot,
00:25:37.480 | I mean, my usage is mostly Copilot.
00:25:40.920 | And like, if I have to think about Bing versus Copilot,
00:25:44.040 | it's kind of interesting, right?
00:25:45.720 | Some of the navigational stuff,
00:25:47.520 | I go to Bing, pretty much everything else,
00:25:50.080 | I go to Copilot, right?
00:25:51.680 | That shift, I think, is what's happening universally.
00:25:55.200 | And we are away, maybe one or two of these agents
00:25:58.360 | for shopping or travel away
00:26:00.760 | from even some of the commercial query.
00:26:02.440 | That's the time when the dam breaks,
00:26:05.160 | I think, on traditional search,
00:26:06.960 | when some of the commercial intent
00:26:09.240 | also migrates into the chat, right?
00:26:11.800 | Now, mostly the business has withstood
00:26:15.040 | because the commercial intent has not migrated.
00:26:18.600 | But once commercial intent migrates,
00:26:20.600 | that's when it suddenly moves.
00:26:22.920 | And so I think, yes, this is a secular shift.
00:26:26.600 | The way we are managing it is,
00:26:28.400 | we have three properties in Mustafa's world, right?
00:26:30.880 | There is Bing, MSN, and Copilot.
00:26:35.200 | So we think, in fact, he's got a crisp vision
00:26:38.000 | of what these three things are.
00:26:39.920 | They're all sort of one ecosystem.
00:26:42.200 | One is a feed, one is search in the traditional way,
00:26:46.240 | and then the other is this new agent interface.
00:26:51.240 | And they all have a social contract with content providers.
00:26:54.680 | We need to drive traffic.
00:26:56.280 | We need to have paywalls, maybe.
00:26:58.440 | We need to have ad-supported models, all of those.
00:27:01.200 | And so that's what we're trying to manage.
00:27:03.160 | We have our own distribution.
00:27:04.600 | The one advantage we do still have is Windows.
00:27:07.240 | We get to relitigate.
00:27:09.080 | We lost the browser, right?
00:27:10.320 | Even Chrome became the dominant browser,
00:27:12.600 | which is a real travesty
00:27:14.200 | because we had won against Netscape only to lose to Google.
00:27:17.960 | And we are getting it back now in an interesting way,
00:27:21.240 | both with Edge and with Copilot.
00:27:24.000 | Guess what?
00:27:24.840 | Now even Gemini has to earn it.
00:27:26.080 | Like the good news about Windows, at least, is
00:27:28.120 | it's an open system, right?
00:27:29.400 | ChatGPT has a shot.
00:27:30.920 | Gemini has a shot.
00:27:32.880 | You don't have to call Microsoft.
00:27:34.440 | You can go do your best work and go over the top.
00:27:38.080 | But that also means we also get to,
00:27:40.120 | having lost it is great sometimes
00:27:42.400 | because you can win it all back.
00:27:45.200 | And so to me, even Windows distribution,
00:27:48.400 | I mean, I always say Google makes more money on Windows
00:27:51.480 | than all of Microsoft.
00:27:52.880 | I mean, literally.
00:27:53.720 | I mean, and I say, wow,
00:27:55.520 | this is the best news for Microsoft shareholders
00:27:58.360 | that we lost so badly
00:28:00.280 | that we can now go contest it and win back some share.
00:28:04.320 | - Hey Satya, one thing that everybody's talking
00:28:06.280 | about is these agents,
00:28:07.320 | and if you just kind of think forward in your mind a bit,
00:28:11.560 | you can imagine all kinds of players wanting to take action
00:28:16.560 | on other apps and other data that may be on a system.
00:28:21.760 | And Microsoft's in an interesting position
00:28:24.240 | 'cause you control the Windows ecosystem,
00:28:28.120 | but you have apps on like the iPhone ecosystem
00:28:31.360 | or the Android ecosystem.
00:28:33.200 | And how do you think about,
00:28:35.600 | and this is partially in terms of service question,
00:28:38.480 | partially a partnership question,
00:28:40.560 | will Apple allow Microsoft to control other apps on iOS?
00:28:45.520 | Will Microsoft let a ChatGPT instantiate apps
00:28:50.520 | and take data from apps on Windows OS?
00:28:54.880 | I mean, you get the question.
00:28:55.880 | It goes all the way to when you start thinking
00:28:58.400 | about search and commerce,
00:29:00.040 | like will booking.com let Gemini run transactions on it
00:29:05.040 | without their permission or knowledge?
00:29:09.240 | - Yeah, I think that this is the most interesting question.
00:29:13.040 | I mean, to some degree,
00:29:14.800 | it's unclear exactly how this will happen.
00:29:19.640 | There is a slightly old-school way
00:29:23.480 | of thinking about some of this,
00:29:24.840 | which is if you remember,
00:29:27.680 | how did business applications of various kinds
00:29:31.080 | manage to do interop, right?
00:29:34.120 | They did manage that interop using connectors
00:29:37.800 | and people had connector licenses.
00:29:39.880 | So there was a business model that emerged, right?
00:29:42.480 | I mean, SAP was one of the most classic ones
00:29:45.400 | where you could say,
00:29:46.640 | hey, you can access SAP data as long as you had connectors.
00:29:50.280 | So there's a part of me which says
00:29:52.360 | something like that will emerge
00:29:54.600 | as when agent to agent interface occurs,
00:29:59.320 | it's unclear exactly what happens in consumer
00:30:02.120 | because consumer, the value exchange
00:30:04.280 | was a lot of advertising and traffic and what have you.
00:30:09.280 | Some of those things go away in an agentic world.
00:30:12.480 | So I think the business model is slightly unclear to me
00:30:16.400 | on the consumer side.
00:30:18.280 | But on the enterprise side,
00:30:20.120 | I think what will happen is everybody will say,
00:30:22.280 | hey, in order for you to either action into my action space
00:30:26.760 | or to get data out of my sort of schema, so to speak,
00:30:31.640 | there is some kind of an interface to my agent
00:30:35.440 | that is licensed, so to speak.
00:30:37.720 | And I think that's the reason,
00:30:38.880 | like today, for example, when I go to Copilot at Microsoft,
00:30:43.240 | I have connectors into Adobe, into my instance of SAP,
00:30:48.240 | obviously our instance of CRM, which is Dynamics.
00:30:51.760 | So it's fascinating.
00:30:53.840 | In fact, when was the last time any of us
00:30:57.720 | really went to a business application, right?
00:30:59.920 | We licensed all these SaaS applications.
00:31:02.040 | We hardly use them.
00:31:03.280 | And somebody in the org is sort of inputting data into it.
00:31:07.040 | But in the AI age, the intensity goes up
00:31:09.840 | because all that data now is easy, right?
00:31:12.120 | You're a query away.
00:31:12.960 | I can literally say, hey, I'm meeting with Bill.
00:31:15.320 | Tell me about all the companies that Benchmark's invested in.
00:31:19.320 | It's both taking the web,
00:31:21.000 | anything that's in my CRM database,
00:31:23.400 | collating it all together, giving me a note, what have you.
00:31:27.080 | So to some degree, all that, I think,
00:31:30.040 | can be monetized by us and by even these connectors.
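A minimal sketch of the connector idea described above, assuming a scoped license object guards each read; the class names, scopes, and records are illustrative assumptions, not an actual Microsoft or SAP connector API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConnectorLicense:
    tenant: str        # who holds the license, e.g. an agent
    scopes: frozenset  # what it may do, e.g. {"crm.read"}

class CrmConnector:
    """Toy connector in the SAP-license sense: the backend stays opaque,
    and an agent reaches the data only through a licensed, scoped interface."""
    def __init__(self, records):
        self._records = records  # stand-in for the real CRM backend

    def query(self, lic: ConnectorLicense, account: str):
        # The business model lives here: no valid scope, no data.
        if "crm.read" not in lic.scopes:
            raise PermissionError(f"{lic.tenant} is not licensed for crm.read")
        return [r for r in self._records if r["account"] == account]

crm = CrmConnector([{"account": "Benchmark", "note": "portfolio review Q3"}])
lic = ConnectorLicense(tenant="copilot-agent", scopes=frozenset({"crm.read"}))
print(crm.query(lic, "Benchmark"))  # allowed; an unscoped license would raise
```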
00:31:33.920 | - But more explicitly,
00:31:35.040 | like the thing that could happen really quickly,
00:31:37.440 | 'cause there's been talk about it,
00:31:38.720 | like would you allow ChatGPT on the Windows OS
00:31:43.720 | to just start opening random apps and take-
00:31:47.320 | - Now that's an interesting one, right?
00:31:48.800 | So that over-the-top computer use,
00:31:52.880 | who is going to permit that, right?
00:31:55.520 | So which is, is it the user or is it the operating system?
00:31:58.960 | Like on Windows, there is quite frankly
00:32:00.960 | not anything I can do to prevent that
00:32:04.520 | other than some security guardrails, right?
00:32:06.600 | So I could sort of definitely,
00:32:07.960 | because I think if they became insecure,
00:32:09.520 | like one of my big fears is the security risk, right?
00:32:13.440 | If some malware got downloaded
00:32:15.840 | and that malware started sort of actioning stuff, right?
00:32:19.360 | That's when it's really dangerous.
00:32:21.280 | So I think those are the ones
00:32:23.240 | that we will build into the OS itself, right?
00:32:25.840 | Which is some elevated access and privilege
00:32:29.160 | that this computer use stuff happens.
00:32:31.400 | But at the end of the day,
00:32:32.600 | the user will be in control on an open platform like Windows
00:32:36.720 | and I'm sure Apple and Google will have a lot more control
00:32:40.120 | so they won't allow it.
00:32:41.320 | And so that's, in some sense,
00:32:43.640 | you could say that's an advantage they have
00:32:46.000 | or, you know, depending on how antitrust rules on all of those,
00:32:51.000 | you know, ultimately it'll be an interesting thing to watch.
00:32:54.720 | - Flip that around and then we can move on.
00:32:56.840 | But like, would you allow the Android OS
00:33:00.400 | or let's just call it the Android AI or the iOS AI
00:33:05.400 | to read email, you know,
00:33:08.880 | through a Microsoft client on that smartphone?
00:33:12.600 | - Yeah, I mean, we kind of like, you know,
00:33:13.840 | for example, today, you know,
00:33:16.040 | one of the things I always think about is,
00:33:18.560 | I don't know whether that was value leaking
00:33:20.480 | or did it actually help us, right?
00:33:22.440 | Which is we licensed the sync for Outlook to Apple
00:33:27.440 | for Apple Mail.
00:33:31.640 | It was kind of a, it was an interesting case.
00:33:34.200 | And I think that there was a lot of value leaked perhaps,
00:33:38.160 | but at the same time,
00:33:39.160 | I think that was one of the reasons
00:33:40.760 | why we were able to hold on to Exchange, right?
00:33:43.520 | It would have been doubly problematic
00:33:46.200 | - Understood.
00:33:47.560 | - If we had not done that.
00:33:48.840 | And so one of the things I think is going to your point,
00:33:53.360 | Bill, if we are building out,
00:33:56.120 | the reason we are going to do this
00:33:57.320 | is we have to have a trust system around Microsoft 365.
00:34:01.800 | We just cannot sort of say,
00:34:02.960 | hey, any agent comes in and does anything
00:34:05.120 | because after all, first it's not our data,
00:34:06.840 | it's our customers' data, right?
00:34:08.480 | It'll be, and so therefore the customer
00:34:10.960 | will have to permit it.
00:34:12.080 | The IT folks in the customer will have to permit it.
00:34:14.960 | It's not like some blanket flag I can set.
00:34:17.760 | And then the second thing is,
00:34:20.800 | it has to have a trust boundary.
00:34:22.840 | So I think what we will do is,
00:34:24.720 | it's kind of, it's an interesting way.
00:34:26.480 | It's kind of like what Apple Intelligence is doing.
00:34:29.520 | Think of it as we will do that around M365.
00:34:32.920 | - I've played with it a lot today.
00:34:34.880 | I'd highly recommend people download it.
00:34:37.200 | It's super interesting.
00:34:38.560 | - Yeah.
00:34:40.760 | So Satya, clicking on this,
00:34:43.880 | Mustafa said that 2025 will be the year of infinite memory.
00:34:49.480 | And Bill and I have talked a lot
00:34:51.000 | dating back to the start of this year,
00:34:52.960 | that we think the next 10X function,
00:34:55.480 | it sounds like you agree on ChatGPT,
00:34:58.520 | is really this persistent memory
00:35:01.640 | combined with being able to take some actions on our behalf.
00:35:05.360 | So we're already seeing the starts of memory
00:35:08.680 | and I'm pretty convinced as well that 2025,
00:35:12.440 | it seems like that one's pretty well solved.
00:35:14.800 | But this question of actions,
00:35:16.200 | when am I gonna be able to say to ChatGPT,
00:35:19.200 | book me the Four Seasons in Seattle next Tuesday
00:35:22.040 | at the lowest price, right?
00:35:24.080 | And Bill and I have gone back and forth on this one
00:35:27.960 | and it would seem that computer use
00:35:30.040 | is the early test case for that.
00:35:33.280 | But do you have any sense,
00:35:34.800 | is it, you know, does that seem like a hard one
00:35:37.320 | from here to you?
00:35:38.400 | - Yeah, I mean, the most open-ended action space
00:35:44.400 | is still hard.
00:35:46.680 | But to your point, there are two things
00:35:49.120 | or maybe three things that are really exciting.
00:35:51.800 | Beyond I'll just say, I'm sure we'll talk about it,
00:35:55.320 | the scaling laws itself
00:35:56.720 | and capabilities of the raw models.
00:35:59.560 | One is memory.
00:36:01.440 | The other is tools use or actions.
00:36:05.360 | And the other one I would say is even entitlements, right?
00:36:09.880 | Which is, you know, what can you like, you know,
00:36:11.720 | one of the most interesting products we have even
00:36:13.920 | is Purview inside of Microsoft
00:36:15.960 | because increasingly, what do you have permissions to?
00:36:19.480 | What can you get?
00:36:20.560 | You know, you have to be able to access things
00:36:22.560 | in a safe way.
00:36:23.400 | Somebody needs to have governance on it and what have you.
00:36:25.800 | So if you put all those three things together
00:36:28.480 | and this agent is going to then be more governable
00:36:33.360 | and when it comes to actions,
00:36:35.080 | it is verifiable and then it has memory,
00:36:39.000 | then I think you're off to a very different place
00:36:42.800 | for doing more autonomous work, so to speak.
00:36:46.360 | I still think, one of the things I always think
00:36:48.560 | is, I like this Copilot as the UI for AI
00:36:51.800 | because even in a fully autonomous world,
00:36:54.960 | from time to time, you'll raise exceptions,
00:36:57.840 | you will ask for permission,
00:36:59.880 | you will ask for invocation, what have you.
00:37:02.000 | And so therefore, this UI layer
00:37:03.520 | will be the organizing layer.
00:37:05.000 | In fact, that's kind of why we think of Copilot
00:37:07.960 | as the organizing layer for work,
00:37:10.200 | work artifacts and workflow.
00:37:12.880 | But to your fundamental point,
00:37:15.120 | I don't think the models, I take even 4o, right?
00:37:18.680 | Not even going to o1.
00:37:20.080 | 4o is pretty good with function calling.
00:37:22.560 | So you can do significantly more in the enterprise setting,
00:37:26.280 | more so than consumer, because consumer web function calling
00:37:30.480 | is just hard, where at least in an open-ended web,
00:37:35.200 | you can do it for a couple of websites.
00:37:37.720 | But once you say, hey, let's go book me a ticket
00:37:40.520 | on anything, and if there's schema changes
00:37:42.960 | on the backend and so on, it'll trip over.
00:37:45.600 | You can teach it that, that's where I think o1
00:37:48.200 | can get better if it's a verifiable, auto-gradable
00:37:52.040 | sort of process on Rails.
00:37:54.360 | But I think we are maybe a year, year to two years away
00:37:57.920 | from doing more and more, but I think at least
00:38:01.160 | from an enterprise perspective, going and doing,
00:38:04.760 | here's my sales agent, here's my marketing agent,
00:38:07.200 | here's my supply chain agent,
00:38:08.720 | which can do more of these autonomous tasks.
00:38:11.520 | We built 10 or 15 of them into Dynamics, right?
00:38:14.440 | Even looking into sort of my supplier communications
00:38:18.640 | and automatically handling my supplier communications,
00:38:22.400 | updating my databases, changing my inventories, my supply.
00:38:26.000 | Those are the kinds of things that you can do today,
00:38:28.160 | I would say.
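A minimal sketch of how the three ingredients above, tool use, entitlements, and verifiable actions, can combine in a function-calling harness; the tool names, agent roles, and JSON shape are assumptions for illustration, not any particular model's or product's API:

```python
import json

# Toy action registry: each tool declares which agent roles are entitled
# to call it, plus a handler that performs the action.
TOOLS = {
    "update_inventory": {
        "allowed_roles": {"supply_chain_agent"},
        "handler": lambda args: f"inventory for {args['sku']} set to {args['qty']}",
    },
    "email_supplier": {
        "allowed_roles": {"supply_chain_agent", "sales_agent"},
        "handler": lambda args: f"email queued to {args['supplier']}",
    },
}

def dispatch(agent_role: str, tool_call_json: str) -> str:
    """Validate and execute one model-emitted tool call."""
    call = json.loads(tool_call_json)
    tool = TOOLS.get(call["name"])
    if tool is None:
        return f"error: unknown tool {call['name']!r}"
    if agent_role not in tool["allowed_roles"]:  # entitlement check
        return f"error: {agent_role} lacks entitlement for {call['name']}"
    result = tool["handler"](call["arguments"])
    # Audit log, so the action is verifiable after the fact.
    print(f"[audit] {agent_role} -> {call['name']}({call['arguments']})")
    return result

# A function-calling model would emit something like this:
print(dispatch("supply_chain_agent",
               '{"name": "update_inventory", "arguments": {"sku": "A-42", "qty": 100}}'))
```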
00:38:29.400 | - Mustafa made this comment about near-infinite memory
00:38:32.480 | and I'm sure you heard it or hear it internally.
00:38:35.800 | Is there any clarification you can offer about that
00:38:38.360 | or is that more to come?
00:38:39.720 | - I think that, I mean, at some level,
00:38:42.640 | the idea that you have essentially a type system
00:38:50.640 | for your memory, right?
00:38:52.880 | That's the thing, right?
00:38:53.720 | Which is, it's not like every time I start,
00:38:56.880 | you have to organize.
00:38:58.040 | - I get the idea.
00:38:58.960 | He made it sound like you guys had an internal
00:39:01.600 | technical breakthrough on this front.
00:39:04.080 | - Yeah, I mean, there's an open source project even.
00:39:08.040 | I think it's, I forget, it's the same set of folks
00:39:11.320 | who did all the TypeScript stuff who are working on this.
00:39:15.960 | So what we're trying to do is essentially take memory
00:39:19.440 | and schematize it and sort of make it available
00:39:22.960 | such that you can go.
00:39:24.760 | Like each time I start, let's just imagine I'm prompting
00:39:27.720 | on some new prompt, I know how to cluster
00:39:30.800 | based on everything else I've done.
00:39:32.760 | And then that type matching and so on,
00:39:35.480 | I think is a good way for us to build up a memory system.
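A minimal sketch of what a "type system for memory" could look like, assuming schematized entries recalled by kind and by overlap with the incoming prompt; keyword overlap stands in for an embedding, and all names here are illustrative rather than the open-source project mentioned above:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MemoryEntry:
    """One schematized memory item: typed, tagged, and timestamped."""
    kind: str   # e.g. "preference", "fact", "task"
    text: str
    tags: set = field(default_factory=set)
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class TypedMemoryStore:
    """Toy store: entries carry a type, and recall clusters by prompt overlap."""
    def __init__(self):
        self.entries = []

    def add(self, kind, text, tags):
        self.entries.append(MemoryEntry(kind, text, set(tags)))

    def recall(self, prompt, kind=None, k=3):
        words = set(prompt.lower().split())  # stand-in for an embedding
        candidates = [e for e in self.entries if kind is None or e.kind == kind]
        # Rank by tag overlap with the prompt; newer entries win ties.
        candidates.sort(key=lambda e: (len(e.tags & words), e.created), reverse=True)
        return candidates[:k]

store = TypedMemoryStore()
store.add("preference", "User prefers bullet-point summaries", {"summary", "format"})
store.add("fact", "User's company standardized on Azure", {"azure", "cloud"})
for entry in store.recall("Draft a summary of our Azure spend"):
    print(entry.kind, "->", entry.text)
```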
00:39:39.000 | - So shifting maybe to enterprise AI, Satya,
00:39:43.200 | you know, the Microsoft AI business is already reported
00:39:46.280 | to be about $10 billion.
00:39:48.280 | You've said that it's all inference
00:39:50.680 | and that you're not actually renting raw GPUs to others
00:39:54.560 | to train on because your inference demand is so high.
00:39:58.400 | So as we think about this, there's a lot of, I think,
00:40:01.040 | skepticism out there in the world as to whether or not
00:40:04.120 | major workloads are moving, you know?
00:40:06.520 | And so if you think about the key revenue products
00:40:10.320 | that people are using today and how it's driving
00:40:13.520 | that inference revenue for you today
00:40:15.720 | and how that may be similar or different
00:40:18.040 | from Amazon or Google, I'd be interested in that.
00:40:21.840 | - Yeah, I think that's a good, so the way for us
00:40:25.040 | this thing has played out is, you've got to remember
00:40:28.040 | most of our training stuff with OpenAI
00:40:30.960 | is sort of more investment logic, right?
00:40:33.480 | So it's sort of not in our quarterly results,
00:40:36.840 | it's more in the other income based on our investment.
00:40:41.320 | So that means the only thing that shows up in revenue-
00:40:45.800 | - Or loss, other income or loss, right?
00:40:48.040 | - That is right, that is right.
00:40:50.440 | Right now, that's how it shows up.
00:40:52.920 | And so most of the revenue or all the revenue
00:40:58.800 | is pretty much our API business, or in fact, to your point,
00:41:03.800 | ChatGPT's inference costs are there, right?
00:41:06.840 | So that's a different piece.
00:41:08.680 | And so the fact is the big hit apps of this era are what?
00:41:13.680 | ChatGPT, Copilot, GitHub Copilot,
00:41:18.080 | and the APIs of OpenAI and Azure OpenAI, right?
00:41:23.880 | So in some sense, if you had to list out
00:41:25.880 | the 10 most sort of hit apps,
00:41:28.920 | these would probably be four or five of them.
00:41:32.880 | And so therefore that's the biggest driver.
00:41:35.880 | The advantage we have had and OpenAI has had,
00:41:39.840 | which is we've had two years of runway, right?
00:41:43.680 | Pretty much uncontested.
00:41:44.960 | To your point, Bill made the point about,
00:41:47.480 | hey, everybody's awake, but, and it might be,
00:41:50.200 | I don't think there will ever again be
00:41:52.200 | maybe a two-year lead like this.
00:41:54.640 | Who knows?
00:41:55.480 | You say that and somebody else drops some sample
00:41:59.040 | and suddenly blows the world away.
00:42:01.120 | But that said, I think it's unlikely
00:42:03.800 | that that type of lead could be established
00:42:06.800 | with some foundation model.
00:42:09.440 | But we have that advantage.
00:42:10.720 | That was the great advantage we've had with OpenAI.
00:42:13.560 | OpenAI was able to really build out this escape velocity
00:42:17.400 | with ChatGPT.
00:42:18.360 | But on the API side, the biggest thing
00:42:21.200 | that we were able to gain was, you know,
00:42:23.400 | take Shopify or Stripe or Spotify.
00:42:28.200 | These were not customers of Azure.
00:42:30.600 | They were all customers of GCP
00:42:32.560 | or they were customers of AWS.
00:42:34.280 | So suddenly we got access to many, many more logos
00:42:39.920 | who are all quote-unquote digital natives
00:42:42.320 | who are using Azure in some shape or fashion and so on.
00:42:46.400 | So that's sort of one.
00:42:47.440 | And when it comes to the traditional enterprise,
00:42:50.160 | I think it's scaling.
00:42:51.200 | Like, I mean, literally it is, you know,
00:42:53.920 | people are playing with Copilot on one end
00:42:55.880 | and then are building agents on the other end using Foundry.
00:42:59.800 | But like these things are design wins and project wins
00:43:02.560 | and they're slow, but they're starting to scale.
00:43:05.240 | And again, the fact that we've had two years of runway on it,
00:43:08.920 | I think I like that business a lot more.
00:43:11.480 | And that's one of the reasons why
00:43:12.840 | the adverse selection problems here would have been
00:43:15.400 | lots of tech startups all looking for their H100 allocation
00:43:20.400 | in small batches, right?
00:43:23.280 | That, you know, having watched what happened
00:43:26.080 | to Sun Microsystems in the dot-com era,
00:43:30.160 | I always worry about that, which is, whoa,
00:43:32.800 | if, you know, you just can't chase
00:43:36.600 | everybody building models.
00:43:37.760 | In fact, even in the, I think the investor side,
00:43:39.880 | I think the sentiment is changing,
00:43:41.520 | which is now people are wanting to be more capital light
00:43:44.200 | and build on top of other people's models
00:43:46.520 | and so on and so forth.
00:43:47.520 | And if that's the case, you know,
00:43:49.560 | everybody who was looking for an H100
00:43:51.440 | will not, you know, be looking for it anymore.
00:43:54.160 | So that's kind of what we've been selective on.
00:43:56.440 | - And your sense is that for the others,
00:43:58.920 | that training of those models and those model clusters
00:44:02.600 | was a much bigger part of their AI revenue versus yours.
00:44:07.360 | - I don't know.
00:44:08.200 | I mean, this is where I'm speaking
00:44:09.400 | for other people's results.
00:44:10.960 | I don't know.
00:44:11.800 | I mean, it's just, I go back and say,
00:44:12.840 | what are the other big hit apps, right?
00:44:17.360 | I don't know what they are.
00:44:19.240 | Like, I mean, where do they, like,
00:44:22.200 | what models do they run?
00:44:23.760 | Where do they run them?
00:44:25.120 | And I would, like, that's kind of, I'm not,
00:44:29.240 | I mean, obviously Google's Gemini.
00:44:30.960 | I don't know, when I look at the DAU numbers
00:44:33.040 | of any of these AI products, there is ChatGPT.
00:44:37.440 | - Right.
00:44:38.280 | - And then there is, you know, like even Gemini,
00:44:41.680 | I'm very surprised at the Gemini numbers.
00:44:44.640 | I mean, obviously I think it will grow, you know,
00:44:46.680 | because of all the inherent distribution,
00:44:48.760 | but it's kind of interesting to say
00:44:51.760 | that they're not that many.
00:44:52.720 | In fact, we talk a lot more about AI scale,
00:44:57.200 | but there are not that many hit apps, right?
00:44:59.880 | There is ChatGPT, GitHub Copilot,
00:45:02.480 | there's Copilot, and there's Gemini.
00:45:04.320 | I think those are the four, I would say, in DAU.
00:45:07.280 | Like, is there anything else that comes to your mind?
00:45:10.280 | - Well, I think there, you know,
00:45:11.480 | there are a lot of these startup use cases
00:45:13.440 | that I think are starting to get some traction,
00:45:15.280 | kind of bottoms up.
00:45:16.160 | A lot of them build on top of Llama, but, you know.
00:45:20.160 | - But if you said, oh, and there's Meta.
00:45:21.960 | - Right, right.
00:45:22.800 | - But if you said there are 10 more,
00:45:23.960 | what are the apps that have more than 5 million DAU, right?
00:45:27.880 | I think it's going to be interesting.
00:45:29.800 | - I think Zuckerberg would argue Meta AI,
00:45:32.160 | certainly, you know, has more, et cetera.
00:45:34.760 | But I think you're right,
00:45:36.600 | in terms of the non-affiliated apps, you named them.
00:45:40.240 | - And Zuck's stuff all runs on his own cloud.
00:45:42.800 | I mean, he's not running a public cloud.
00:45:45.000 | - That's right, yeah.
00:45:46.200 | - Satya, on the enterprise side,
00:45:48.520 | obviously the coding space is off into the races,
00:45:53.160 | and you guys are doing well,
00:45:54.360 | and there's a lot of venture-backed players there.
00:45:56.680 | On some of the productivity apps,
00:45:59.400 | I have a question about the Copilot approach.
00:46:01.960 | And I guess Mark Benioff's been kind of obnoxiously critical
00:46:06.600 | on this front and called it Clippy 2 or whatever.
00:46:10.040 | Do you worry that someone might think
00:46:15.160 | kind of first-principles AI from the ground up,
00:46:18.200 | and that some of the infrastructure,
00:46:21.280 | say in an Excel spreadsheet,
00:46:23.440 | isn't necessary, you know, if you did an AI-first product?
00:46:28.440 | And the same thing, by the way,
00:46:30.400 | could be said about the CRM, right?
00:46:32.320 | There's a bunch of fields and tasks
00:46:35.200 | that may be able to be obfuscated for the user.
00:46:38.800 | - Yeah, I mean, it's a very, very, very important question.
00:46:44.360 | The SaaS applications or biz apps,
00:46:46.440 | so let me just speak of our own Dynamics thing.
00:46:49.200 | The approach at least we're taking is,
00:46:51.760 | I think the notion that business applications exist,
00:46:58.920 | that's probably where they'll all collapse, right?
00:47:01.680 | In the agent era.
00:47:02.680 | Because if you think about it, right,
00:47:04.080 | they are essentially CRUD databases
00:47:08.200 | with a bunch of business logic.
00:47:10.600 | The business logic is all going to these agents.
00:47:16.560 | And these agents are going to be multi-repo CRUD, right?
00:47:20.960 | So they're not going to discriminate
00:47:22.680 | between what the backend is.
00:47:25.600 | They're going to update multiple databases
00:47:27.760 | and all the logic will be in the AI tier, so to speak.
00:47:32.760 | And once the AI tier becomes the place
00:47:38.640 | where all the logic is,
00:47:40.360 | then people will start replacing the backends, right?
00:47:42.520 | We have people, that's what, in fact, it's interesting.
00:47:45.840 | As we speak, I think we are seeing pretty high rates
00:47:49.520 | of wins on Dynamics backends and the agent use.
00:47:54.520 | And we are going to go pretty aggressively
00:47:57.360 | and try and collapse it all, right?
00:47:58.920 | Whether it's in customer service,
00:48:01.000 | whether it is in, by the way,
00:48:03.560 | the other fascinating thing that's increasing
00:48:06.200 | is just not CRM,
00:48:07.120 | but even what we call finance and operations.
00:48:10.480 | Because people want more AI native biz apps, right?
00:48:14.760 | That means the logic tier can be orchestrated
00:48:19.760 | by AI and AI agents.
00:48:21.680 | So in other words, Copilot to agent
00:48:24.520 | to my business application should be very seamless.
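To make the "logic moves to the agent tier" idea concrete, here is a minimal sketch. Every name in it (CrudStore, AgentTier, close_deal) is illustrative, not a Dynamics or Copilot API; the point is just that one business action fans out as CRUD across several interchangeable backends.

```python
# Minimal sketch of "logic in the AI tier, backends as interchangeable CRUD
# stores." All names here are illustrative, not a Microsoft API.
from dataclasses import dataclass, field
from typing import Any, Dict, List


class CrudStore:
    """One backend (CRM, finance, ops, ...) reduced to create/read/update/delete."""

    def __init__(self, name: str):
        self.name = name
        self.rows: Dict[str, Dict[str, Any]] = {}

    def create(self, key: str, record: Dict[str, Any]) -> None:
        self.rows[key] = record

    def update(self, key: str, fields: Dict[str, Any]) -> None:
        self.rows.setdefault(key, {}).update(fields)


@dataclass
class AgentTier:
    """The business logic lives here, not in any single app's backend."""
    stores: Dict[str, CrudStore]
    log: List[str] = field(default_factory=list)

    def close_deal(self, deal_id: str, amount: float) -> None:
        # One business action fans out as multi-repo CRUD across backends.
        self.stores["crm"].update(deal_id, {"stage": "closed-won", "amount": amount})
        self.stores["finance"].create(f"inv-{deal_id}", {"deal": deal_id, "amount": amount})
        self.stores["ops"].create(f"task-{deal_id}", {"action": "provision", "deal": deal_id})
        self.log.append(f"closed {deal_id} for {amount}")


agents = AgentTier(stores={n: CrudStore(n) for n in ("crm", "finance", "ops")})
agents.close_deal("deal-42", 125_000.0)
print(agents.stores["finance"].rows)
```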
00:48:29.520 | Now, in the same way, and you could even say,
00:48:31.600 | "Hey, why do I need Excel?"
00:48:34.120 | Like, interestingly enough,
00:48:35.160 | one of the most exciting things for me is Excel with Python
00:48:38.680 | is like GitHub with Copilot, right?
00:48:40.960 | That's essential.
00:48:41.800 | So what we have done is when you have Excel,
00:48:45.360 | like this, by the way, it would be fun for you guys, right?
00:48:47.480 | Which is, you should just bring up Excel,
00:48:50.080 | bring up Copilot, and start playing with it
00:48:52.560 | because it's no longer like,
00:48:54.000 | oh, you know, it is like having a data analyst.
00:48:57.960 | And so it's no longer just making sense
00:49:01.880 | of the numbers that you have.
00:49:03.760 | It will do the plan for you, right?
00:49:06.720 | It will literally, like how GitHub Copilot Workspace
00:49:09.320 | creates the plan, and then it executes the plan.
00:49:12.600 | This is like a data analyst who is using Excel
00:49:16.560 | as a sort of row-column visualization,
00:49:20.480 | an analysis scratch pad.
00:49:22.760 | So it's kind of tool use.
00:49:24.000 | So the Copilot is using Excel as a tool
00:49:26.960 | with all of its action space, because it can generate,
00:49:29.800 | and it has Python interpreter.
00:49:31.960 | That is, in fact, a great way to reconceptualize Excel.
00:49:36.400 | And at some point you could say,
00:49:37.480 | "Hey, I'll generate all of Excel."
00:49:40.040 | And that is also true.
00:49:40.880 | After all, there's a code interpreter, right?
00:49:42.520 | So therefore you can generate anything.
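As a rough illustration of "the Copilot using Excel as a tool with a Python interpreter," here is a toy sketch; draft_plan stands in for a model call and pandas stands in for the spreadsheet grid, so none of this is the actual Copilot implementation.

```python
# Toy sketch of a copilot treating the spreadsheet as a tool: it drafts a
# plan, then runs the analysis in a Python scratch pad. `draft_plan` stands
# in for a model call; pandas stands in for Excel's grid.
import pandas as pd

def draft_plan(question: str) -> list:
    # A real system would ask the model; we hard-code a plausible plan.
    return ["load the table", "aggregate revenue by region", "flag the weakest region"]

sheet = pd.DataFrame({
    "region": ["East", "West", "East", "South"],
    "revenue": [120.0, 95.0, 80.0, 40.0],
})

plan = draft_plan("Which region is underperforming?")    # step 1: the plan
by_region = sheet.groupby("region")["revenue"].sum()     # step 2: the analysis
weakest = by_region.idxmin()                             # step 3: the finding
print(plan)
print(f"Weakest region: {weakest} ({by_region[weakest]:.0f})")
```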
00:49:45.120 | And so, yes, I think there will be disruption,
00:49:47.600 | but the way we are approaching
00:49:49.080 | at least our M365 stuff is, one, build Copilot
00:49:52.920 | as that organizing layer, UI for AI,
00:49:55.720 | get all agents, including our own agents.
00:50:02.240 | You can say Excel is an agent to my Copilot.
00:50:02.240 | Word is an agent.
00:50:03.920 | They're kind of specialized canvases,
00:50:05.640 | which is, I'm doing a legal document.
00:50:08.440 | Let me take it into Pages and then to Word,
00:50:11.120 | and then have the Copilot go with it.
00:50:15.280 | Go into Excel and have the Copilot go with it.
00:50:18.280 | And so that's sort of a new way
00:50:19.760 | to think about the work and workflow.
00:50:21.680 | You know, one of the questions I hear people
00:50:25.320 | wringing their hands about a lot today, Satya,
00:50:27.400 | is the ROI people are making on these investments.
00:50:31.160 | You know, you have over 225,000 employees.
00:50:34.760 | Are you leveraging AI to increase productivity,
00:50:38.240 | reduce costs, drive revenues in your own business?
00:50:41.440 | If so, kind of what are the biggest examples there?
00:50:44.560 | You know, and maybe to a finer point on that,
00:50:47.520 | you know, when we had Jensen on, I asked him,
00:50:50.320 | you know, when he 2 or 3x'd his top line,
00:50:52.840 | what did he expect his headcount to increase by?
00:50:56.600 | And he said 25%.
00:50:58.600 | And when asked why, he said,
00:50:59.880 | "Well, I'll have 100,000 agents helping us do the work."
00:51:03.440 | So when you 2 or 3x your revenue for Azure,
00:51:07.240 | you know, do you expect to see that similar type
00:51:09.640 | of leverage on headcount?
00:51:14.000 | - Yeah, I mean, it's top of mind,
00:51:18.800 | and top of mind for both us at Microsoft
00:51:21.720 | as well as our customers.
00:51:24.280 | Here's the way I come at it.
00:51:26.120 | I love this thing of, I've been going to school
00:51:28.720 | on learning a lot about what happened
00:51:31.080 | in industrial companies with lean, right?
00:51:34.480 | I mean, it's fascinating, right?
00:51:35.560 | They're all GDP plus growers.
00:51:38.080 | It's unbelievable.
00:51:38.920 | Like, I mean, the discipline they have
00:51:40.840 | in how the good industrials can literally say,
00:51:43.840 | hey, I'll add 200 to 300 basis points
00:51:47.920 | of tailwind just by lean,
00:51:50.760 | which is increased value, reduced waste, right?
00:51:53.600 | That's the practice.
00:51:55.280 | So I think of AI as the lean for knowledge work.
00:52:00.280 | You know, we are really going to school on it,
00:52:03.480 | like, which is how do we really go look at,
00:52:05.800 | that's why I think, you know, the good old,
00:52:07.480 | you know, we remember in the '90s,
00:52:08.920 | we had all this business process re-engineering.
00:52:11.000 | I think it's back in a new way
00:52:13.560 | where people who can think end-to-end process flows
00:52:17.640 | and say, hey, what's the way to think
00:52:20.320 | about the process efficiency?
00:52:22.520 | What can be automated?
00:52:23.920 | What can be made more efficient?
00:52:26.800 | So that's a little bit of, I think.
00:52:28.320 | So customer service is the obvious one.
00:52:30.560 | Like, we are on course.
00:52:31.640 | We spend around $4 billion or so.
00:52:33.760 | This is everything from Xbox support to Azure support.
00:52:38.280 | This is really, I mean, this is serious: one,
00:52:41.840 | because of the deflection rate on the front end.
00:52:44.480 | Then the biggest benefit is the agent efficiency, right?
00:52:48.960 | Where the agent is happier, the customer is happier,
00:52:52.200 | and our costs are going down.
00:52:54.520 | And so that's, I think, the most obvious place
00:52:57.760 | and that we have in our contact center application
00:53:01.560 | that's also doing super well.
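For a feel of why the deflection rate matters economically, here is an illustrative back-of-envelope calculation; only the roughly $4 billion support spend comes from the conversation, and both rates below are invented for the example.

```python
# Illustrative back-of-envelope only: the ~$4B support spend is from the
# conversation; the deflection and volume-driven-cost rates are invented.
support_spend = 4_000_000_000      # roughly $4B/year across Xbox, Azure, etc.
deflection_rate = 0.30             # assumed share of tickets AI resolves up front
volume_driven_share = 0.70         # assumed share of spend that scales with tickets

savings = support_spend * volume_driven_share * deflection_rate
print(f"annual savings under these assumptions: ${savings / 1e9:.2f}B")
```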
00:53:03.480 | The other one is obviously GitHub Copilot.
00:53:05.480 | That's the other, and with GitHub Copilot Workspace,
00:53:08.760 | that's the first place where even this
00:53:11.400 | agentic sort of side comes in, right?
00:53:14.080 | You go from an issue to a spec to a plan
00:53:19.080 | and then multi-file edit, right?
00:53:23.160 | So it just completely changes the workflow
00:53:26.360 | for the eng team.
00:53:27.840 | As I said, and then the O365 is the catch-all, right?
00:53:33.880 | So the M365 Copilot is where,
00:53:36.360 | I mean, just to give you a feel, like even my own, right?
00:53:38.640 | Every time I'm meeting a customer,
00:53:40.400 | I would say the workflow of the prep of the CEO office
00:53:46.160 | has not changed since 1990, right?
00:53:48.280 | Basically, I mean, in fact, one of the ways I look at it is
00:53:51.440 | just imagine how did forecasting happen
00:53:54.760 | pre-PCs and post-PCs, right?
00:53:57.680 | There were faxes, then inter-office memos,
00:54:01.760 | and then PCs became a thing and people said,
00:54:04.720 | "Hey, I'm just gonna put an Excel spreadsheet in email
00:54:07.120 | "and send it around and people will enter numbers
00:54:09.160 | "and we will have a forecast."
00:54:11.760 | The same thing is happening in the AI era right now
00:54:15.720 | all over the place, right?
00:54:16.880 | I prep for a customer meeting where
00:54:20.320 | I literally go into Copilot and I say,
00:54:22.280 | "Tell me everything I need to know about the customer."
00:54:24.440 | It tells me everything from my CRM, my emails,
00:54:27.320 | my team's meetings, and the web, right?
00:54:29.960 | It grounds it.
00:54:31.000 | I put it into pages,
00:54:32.320 | share it with my account team in real time.
00:54:35.600 | So just imagine the hierarchy, this entire thing of,
00:54:38.320 | "Oh, let me prepare a brief for the CEO," goes away.
00:54:41.600 | It's just a query away.
00:54:42.680 | I generate a query, share a page.
00:54:44.400 | If they wanna annotate it, so I'm reasoning with AI
00:54:47.640 | and collaborating with my colleagues, right?
00:54:50.480 | That's the new workflow.
00:54:51.920 | And that's happening all over the place.
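A minimal sketch of that grounded-prep pattern, assuming hypothetical source functions (from_crm, from_email, from_web) and leaving out the model call itself; this is the shape of the workflow, not the Copilot API.

```python
# Sketch of the grounded-prep pattern: pull context from several sources,
# then hand it to a model with the question. Every function here is a
# hypothetical stand-in.
def from_crm(customer: str) -> list:
    return [f"{customer}: renewal due Q3"]

def from_email(customer: str) -> list:
    return [f"open thread with {customer} about pricing"]

def from_web(customer: str) -> list:
    return [f"{customer} announced a new CFO last week"]

def prep_brief(customer: str) -> str:
    context = from_crm(customer) + from_email(customer) + from_web(customer)
    # A real system would send this grounded prompt to a model.
    return "Context:\n" + "\n".join(context) + f"\n\nBrief me on {customer}."

print(prep_brief("Contoso"))
```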
00:54:54.080 | Somebody gave me this example from supply chain.
00:54:56.760 | Like somebody said, "Supply chain is like a trading desk,
00:54:59.960 | "except it doesn't have real-time information," right?
00:55:02.600 | That's kind of what it is.
00:55:04.080 | So it's like you wait for the quarter to end
00:55:06.680 | and then the CFO comes and bangs you on the head
00:55:09.000 | and saying all the mistakes you made.
00:55:10.600 | What if that financial analyst essentially
00:55:14.760 | can be in real time, be available to you
00:55:17.520 | and giving you like, "Oh, you're doing this contract
00:55:19.880 | "for this data center in this region.
00:55:21.880 | "You should think about these terms."
00:55:24.080 | All that intelligence in real time
00:55:26.400 | is changing the workflow and work artifacts.
00:55:29.000 | So lots and lots of use cases all around.
00:55:32.120 | And I think to your fundamental point,
00:55:35.120 | our goal is to kind of create operating leverage through AI.
00:55:38.080 | So I think headcount will, in fact,
00:55:41.480 | one of the ways I look at it and say
00:55:42.800 | is our total people costs will go down.
00:55:46.360 | Our cost per head will go up
00:55:49.720 | and my GPU per researcher will go up.
00:55:53.160 | That's kind of the way I look at it.
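A quick illustration of how those three metrics can move together; the headcount and cost figures below are invented purely to show the arithmetic.

```python
# Illustrative arithmetic only (all numbers invented): total people cost can
# fall even as cost per head rises, if headcount falls faster.
headcount_before, cost_per_head_before = 1000, 200_000
headcount_after, cost_per_head_after = 800, 230_000   # fewer, better-tooled people

total_before = headcount_before * cost_per_head_before   # 200,000,000
total_after = headcount_after * cost_per_head_after      # 184,000,000
print(total_after < total_before, total_before - total_after)   # True 16000000
```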
00:55:55.320 | - That makes sense.
00:55:57.200 | Hey, let's shift ahead to something
00:56:00.480 | that you referenced earlier,
00:56:01.760 | just around what we're seeing out of model scaling
00:56:05.000 | and CapEx generally.
00:56:07.160 | You know, I've heard you talk about, you know,
00:56:09.840 | Microsoft's CapEx.
00:56:11.080 | I imagine in 2014, when you took over,
00:56:13.880 | you had no idea that the CapEx
00:56:16.240 | would look like it does today.
00:56:18.240 | In fact, you've said it looks increasingly
00:56:21.000 | these companies look more like industrial company CapEx
00:56:24.840 | than traditional software companies.
00:56:27.200 | Your CapEx has gone from about $20 billion in 2020
00:56:31.360 | to maybe as high as $70 billion in 2025.
00:56:35.320 | You know, you've earned a pretty consistent return
00:56:38.120 | on that CapEx, right?
00:56:39.760 | So there's actually a very high correlation
00:56:42.800 | when you look at your CapEx to revenue.
00:56:45.640 | Some people are worried that that correlation will break
00:56:48.440 | and even you have said, you know,
00:56:50.280 | maybe at some point there's going to be, you know,
00:56:53.360 | CapEx is going to have to be spent ahead of the revenue.
00:56:56.480 | You know, there may be an air pocket.
00:56:58.800 | We have to build for this resiliency.
00:57:01.920 | So how do you feel about the level of CapEx?
00:57:05.880 | Does it cause you any sleepless nights?
00:57:08.120 | And when does it begin to taper off, you know,
00:57:11.960 | in terms of this rate of growth?
00:57:14.320 | - Yeah, I mean, a couple of different things, right?
00:57:17.760 | One is, this is where being a hyperscaler
00:57:22.560 | is, I think, structurally super helpful
00:57:24.640 | because in some sense,
00:57:26.120 | we've been practicing this for a long time, right?
00:57:29.560 | Which is, you know, hey, data centers
00:57:32.200 | have 20 year life cycles,
00:57:34.880 | power you pay only when you use,
00:57:37.600 | the kits are six years,
00:57:40.760 | you know how to sort of drive utilization up.
00:57:44.760 | And the good news here is it's kind of like capital intensive
00:57:48.880 | but it's also software intensive
00:57:51.000 | and you use software to bring the ROIC
00:57:54.640 | of your capital higher, right?
00:57:56.320 | That's kind of like when people even in the early days said,
00:57:59.560 | hey, how can like a hyperscaler ever make money?
00:58:02.600 | Because what's the difference between the old hosters
00:58:05.640 | and the new hyperscalers?
00:58:07.320 | It is software, right?
00:58:08.720 | And that, I think, is what's going to apply
00:58:13.200 | even to this GPU physics, right?
00:58:16.200 | Which is, hey, you buy leading, you build it out.
00:58:18.640 | In fact, one of the things that's happening right now
00:58:21.520 | is what I'll call catch up, right?
00:58:23.720 | Which is we built after all,
00:58:26.640 | over the last 15 years, the cloud.
00:58:29.520 | Suddenly a new meter showed up in the cloud.
00:58:32.560 | It's called the AI accelerator
00:58:36.120 | because every app now needs a database,
00:58:40.120 | a Kubernetes cluster
00:58:42.160 | and a model that runs on an AI accelerator, right?
00:58:47.280 | So if you sort of say, oh, I need all three,
00:58:49.240 | you suddenly add to build up these AI accelerators
00:58:52.720 | in order to be able to provision
00:58:54.320 | for all of these applications.
00:58:56.480 | So that will normalize.
00:58:57.800 | So the first thing is the build out will happen,
00:59:00.560 | the workloads will normalize and then it will be,
00:59:03.320 | you will just keep growing like the cloud has grown.
00:59:05.760 | So that's sort of the one side of it.
00:59:07.520 | And that's where avoiding some
00:59:09.320 | of these adverse selection issues,
00:59:11.680 | making sure it's not just all supply side,
00:59:15.560 | everybody's sort of building only hoping demand will come,
00:59:18.960 | just making sure that there is real diverse demand
00:59:23.240 | all over the world, all over the segments,
00:59:25.800 | I watch for all of that.
00:59:27.960 | So I think that that's, I think the way to manage the ROIC.
00:59:31.360 | And by the way, the margins will be different, right?
00:59:33.280 | This goes back to the very early dialogue we had on,
00:59:37.040 | when I think about the Microsoft cloud,
00:59:38.760 | the margin profile of a raw GPU
00:59:41.320 | versus the margin profile of Fabric plus GPU
00:59:44.560 | or Foundry plus GPU or a GitHub Copilot add-on to M365.
00:59:49.560 | So they're all gonna be different.
00:59:56.160 | And so having a portfolio matters here, right?
00:59:59.840 | Because if I look at even Microsoft,
01:00:01.360 | why does Microsoft have a premium today in the cloud?
01:00:04.520 | We are bigger than Amazon, growing faster than Amazon
01:00:07.720 | with better margins than Amazon
01:00:09.640 | because we have all these layers.
01:00:13.840 | And that's kind of what we wanna do even in the AI era.
01:00:16.440 | - Satya, there's been a lot of talk about model scaling.
01:00:19.560 | And obviously there was talk historically
01:00:24.560 | about kind of 10Xing the cluster size
01:00:29.680 | that you might do over and over again,
01:00:31.680 | not once and then twice.
01:00:34.080 | And xAI is still making noise
01:00:37.240 | about going in that direction.
01:00:38.880 | There was a podcast recently
01:00:40.800 | where they kind of flipped everything on its head
01:00:43.080 | and they said, well, if we're not doing that anymore,
01:00:45.320 | it's way better because we can just move on to inference,
01:00:48.240 | which is getting cheaper
01:00:49.640 | and you won't have to spend all this capex.
01:00:52.040 | I'm curious, those are two kind of views of the same coin,
01:00:55.680 | but what's your view on large LLM model scaling
01:01:00.560 | and training costs and where we're headed in the future?
01:01:04.040 | - Yeah, I mean, I'm a big believer in scaling laws,
01:01:10.720 | I'll sort of first say.
01:01:12.640 | In fact, if anything, the bet we placed in 2019
01:01:16.280 | was on scaling laws and I stay on that,
01:01:18.720 | which is in other words, don't bet against scaling laws.
01:01:22.080 | But at the same time,
01:01:23.800 | let's also be grounded on a couple of different things.
01:01:26.040 | One is these exponentials on scaling laws
01:01:30.920 | will become harder just because
01:01:33.680 | as the clusters become harder, everything.
01:01:36.920 | I mean, the distributed computing problem
01:01:38.440 | of doing large scale training becomes harder.
01:01:42.000 | And so that's kind of one side of it.
01:01:45.240 | So there is, but I would just still say,
01:01:48.400 | and I'll let the OpenAI folks speak for what they're doing,
01:01:51.280 | but they are continuing to,
01:01:54.000 | pre-training I think is not over, it sort of continues.
01:01:58.280 | But the exciting thing, which again,
01:02:00.520 | OpenAI has talked about and Sam has talked about
01:02:04.160 | is what they've done with O1, right?
01:02:05.840 | So this chain of thought with autograding
01:02:08.200 | is just fantastic.
01:02:11.560 | In fact, basically it is test-time compute,
01:02:14.920 | or inference-time compute, as another scaling law, right?
01:02:17.800 | So you have pre-training
01:02:20.040 | and then you have effectively this test time sampling
01:02:23.560 | that then creates the tokens
01:02:25.320 | that can go back into pre-training,
01:02:27.080 | creating even more powerful models
01:02:28.960 | that then are running on your inference, right?
01:02:31.240 | So therefore that's I think a fantastic way
01:02:34.320 | to increase model capability.
01:02:36.320 | So the good news of test-time or inference-time compute
01:02:40.400 | is, running those O1 models means,
01:02:43.680 | well, there are two separate things.
01:02:47.360 | Sampling is kind of like training
01:02:49.400 | when you're using it to generate tokens for training,
01:02:52.440 | for your pre-training,
01:02:53.840 | but also customers when they are using O1,
01:02:57.240 | they're using more of your meters
01:02:59.360 | and so you are getting paid for it.
01:03:01.040 | And so therefore there is more of an economic model, right?
01:03:04.240 | So therefore I like it.
01:03:05.240 | In fact, that's where I said
01:03:06.360 | I have a good structural position
01:03:07.880 | with 60 plus data centers all over the world.
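A minimal sketch of the test-time-compute idea being described, assuming a stand-in sampler and grader rather than anything OpenAI has published: sample several candidate chains, score them, keep the best.

```python
# Minimal best-of-N sketch of test-time compute: sample several candidate
# chains, score each with a grader, keep the best. `sample_answer` and
# `grade` are stand-ins for a model and a verifier, not OpenAI's method.
import random

def sample_answer(question: str) -> str:
    # Stand-in for sampling one reasoning chain from a model.
    return f"candidate-{random.randint(0, 9)}"

def grade(answer: str) -> float:
    # Stand-in for an autograder / verifier scoring the chain.
    return random.random()

def best_of_n(question: str, n: int = 8):
    candidates = [sample_answer(question) for _ in range(n)]
    score, best = max((grade(c), c) for c in candidates)
    return best, score

answer, score = best_of_n("Prove the sum of two odd numbers is even.")
print(answer, round(score, 3))
# High-scoring samples can also be recycled as synthetic training data,
# which is the loop back into pre-training described above.
```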
01:03:10.160 | - Right, it's a different hardware architecture
01:03:12.520 | for one of those scaling regimes versus the other,
01:03:14.640 | for the pre-training versus the inference.
01:03:15.800 | - Exactly, and I think the best way to think about it
01:03:18.640 | is it's a ratio, right?
01:03:19.880 | So going back to sort of Brad's thing about ROIC,
01:03:24.000 | this is where I think you have to sort of
01:03:26.560 | really establish a stable state.
01:03:28.920 | In fact, whenever I've talked to Jensen,
01:03:30.840 | I think he's got it right,
01:03:32.040 | which is look, you kind of wanna buy some every year,
01:03:35.600 | not buy, like think about it, right?
01:03:37.720 | When you depreciate something over six years,
01:03:39.880 | the best way is what we have always done,
01:03:41.760 | which is you buy a little every year
01:03:44.400 | and you age it, you age it, you age it, right?
01:03:46.640 | You use the leading node for training
01:03:48.360 | and then the next year it makes its way into inference.
01:03:51.720 | And that's sort of the stable state
01:03:53.880 | I think we will get into across the fleet
01:03:56.960 | for both utilization and the ROIC,
01:04:00.560 | and then demand meets supply.
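To see the "buy a little every year" fleet math, here is an illustrative straight-line depreciation sketch; the $10B-per-year cohorts and the six-year life are assumptions for the example, not Microsoft's actual figures.

```python
# Illustrative only: a fleet bought in yearly cohorts, straight-line
# depreciated over six years; the newest cohort trains, older ones infer.
LIFE_YEARS = 6

def book_value(cohorts: dict, year: int) -> float:
    """Remaining book value of all cohorts at `year` (straight-line)."""
    total = 0.0
    for bought, capex in cohorts.items():
        age = year - bought
        if 0 <= age < LIFE_YEARS:
            total += capex * (1 - age / LIFE_YEARS)
    return total

cohorts = {y: 10.0 for y in range(2020, 2026)}   # $10B/year, invented figure
for year in (2023, 2025):
    newest = max(b for b in cohorts if b <= year)
    print(year, f"book value ${book_value(cohorts, year):.1f}B,",
          f"training on the {newest} cohort, older cohorts on inference")
```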
01:04:02.600 | And basically, to your point about everybody saying,
01:04:05.960 | oh, wow, have the exponentials stopped?
01:04:08.080 | One of the other things is the economic realities
01:04:10.440 | will also sort of stop it, right?
01:04:12.600 | I mean, at some point everybody will look and say,
01:04:14.960 | what's the economically rational thing to do?
01:04:17.720 | - I agree.
01:04:19.280 | - Which is, hey, even if I double every year's capability,
01:04:22.480 | but I'm not able to sell that inventory.
01:04:24.560 | And the other problem is the winner's curse, right?
01:04:26.720 | Which is, you don't even have to publish a paper.
01:04:30.800 | The other folks have to just look at your capability
01:04:33.920 | and do either a distillation.
01:04:36.440 | It's just impossible.
01:04:37.720 | It's kind of like piracy, right?
01:04:39.600 | I mean, you can sort of have all kinds of terms of use,
01:04:41.960 | but it's impossible to control distillation.
01:04:44.680 | That's one.
01:04:45.560 | Second thing is, you don't even have to do anything.
01:04:48.400 | You just have to reverse engineer that capability
01:04:51.160 | and you do it in a more compute efficient way.
01:04:53.800 | And so given all this,
01:04:55.560 | I think there will be a governor
01:04:57.120 | on how much people will kind of chase.
01:05:00.120 | Right now, a little bit of everybody wants to be first.
01:05:03.560 | It's great, but at some point,
01:05:05.040 | all the economic reality will set in on everyone.
01:05:07.720 | And the network effects are in the app layer.
01:05:10.720 | So why would I want to spend a lot on some model capability
01:05:14.360 | if the network effects are all on the app layer?
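Since distillation comes up here, a toy sketch of the mechanism with made-up logits: the student is pushed to match the teacher's output distribution, which requires only the teacher's outputs, never its weights. Real systems minimize this loss with gradient descent.

```python
# Toy sketch of distillation: the student is trained to match the teacher's
# output distribution, using only the teacher's outputs. Logits are made up.
import math

def softmax(logits, temp=2.0):
    exps = [math.exp(x / temp) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q): how far the student q is from the teacher p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

teacher = softmax([4.0, 1.0, 0.5])   # observed behavior of the big model
student = softmax([2.5, 1.2, 0.8])   # smaller model's current guess
print(f"distillation loss: {kl_divergence(teacher, student):.4f}")
```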
01:05:17.040 | - What I heard you say, I believe, you know,
01:05:20.360 | so Elon has said that he's going to build
01:05:22.360 | a million GPU cluster.
01:05:24.600 | I think Meta has said the same thing.
01:05:28.120 | I think the pre-training--
01:05:28.960 | - I think he said 200
01:05:30.400 | and then he kind of joked about a million, but I--
01:05:32.800 | - I think he joked about a billion, but you know,
01:05:35.320 | the fact of the matter is,
01:05:38.640 | versus the start of the year, Satya,
01:05:41.280 | based on what you've seen around pre-training and scaling,
01:05:45.200 | have you changed your infrastructure plans around that?
01:05:50.200 | And then I have a separate question with regard to O1.
01:05:54.400 | - I am building to what I would say is a way,
01:05:59.400 | like a little bit of the 10X point, right?
01:06:02.160 | Which is, hey, how do you, we can argue the duration,
01:06:06.160 | like is it every two years?
01:06:09.160 | Is it every three years, every four years?
01:06:10.760 | There is an economic model.
01:06:12.400 | And this is where I think a little bit of a disciplined way
01:06:16.080 | of thinking about how do you clear your inventory
01:06:19.680 | such that it makes sense, right?
01:06:22.640 | Which is, or the other way is the depreciation cycle
01:06:25.520 | of your kit, right?
01:06:26.840 | There is no way you can sort of buy,
01:06:29.200 | you can, unless you find the physics of the GPU works out,
01:06:34.040 | where suddenly it flows through my P&L
01:06:37.480 | and it's actually, you know,
01:06:39.440 | it's in the same or better margin than a hyperscaler.
01:06:42.040 | That's simple.
01:06:42.880 | Like, so that's kind of what I'm gonna do.
01:06:44.120 | I'm gonna keep going and building basically to,
01:06:47.440 | hey, how do I drive inference demand?
01:06:51.320 | And then keep increasing my capability
01:06:54.760 | and be efficient at it.
01:06:57.640 | I absolutely, and Sam may have a different objective
01:07:01.560 | and he's been open to it, right?
01:07:02.720 | He's sort of like, he may say, hey, I wanna build
01:07:05.880 | because I know I'm deeply, have deep conviction
01:07:08.480 | on what AGI looks like or what have you.
01:07:10.560 | And so be it.
01:07:11.400 | So therefore that's where I think a little bit
01:07:13.640 | of our tension is even.
01:07:15.120 | - And to clarify something, I heard Mustafa say
01:07:18.080 | on a podcast that Microsoft is not going to engage
01:07:22.880 | in the biggest model training competition that's going on.
01:07:27.280 | Is that fair?
01:07:28.120 | - Well, what we won't do is do it twice, right?
01:07:32.640 | Because after all, we have the IP from,
01:07:34.880 | like it'd be silly for Microsoft today,
01:07:38.840 | given the partnership with OpenAI to do two unnecessary,
01:07:43.600 | I mean, doing a second training run.
01:07:46.320 | - Correct.
01:07:47.160 | - So we are very, and that's why we have concentrated.
01:07:49.560 | And by the way, that's the strategic discipline
01:07:51.560 | we have had, right?
01:07:52.400 | Which is, that's why I always stress to Sam,
01:07:55.480 | like we bet the farm on OpenAI and said,
01:07:59.360 | hey, we will concentrate our compute.
01:08:01.240 | And we did it because we had all the rights to the IP.
01:08:04.240 | And so that's sort of the give gets on it.
01:08:08.040 | And we feel fantastic about it.
01:08:09.520 | And so then what Mustafa is basically saying is,
01:08:13.360 | hey, we will also do, in fact, a lot of focus
01:08:16.080 | on our end is post-training
01:08:17.760 | and even on the verification or what have you.
01:08:21.120 | So that's a big thing.
01:08:22.160 | So we'll focus a lot of our compute resources
01:08:24.640 | on adding more model adaptations
01:08:27.920 | and capabilities that make sense,
01:08:30.200 | while also having a principled pre-training stuff
01:08:33.400 | that sort of gives us capability internally to do things.
01:08:38.400 | We anyway have different model weights
01:08:40.840 | and model classes for different use cases
01:08:43.200 | that we will continue to go ahead and develop as well.
01:08:45.880 | - Does your answer to Brad's question
01:08:49.520 | about the balancing of GPU ROI,
01:08:53.960 | does that answer the question
01:08:55.440 | as to why you've outsourced some of the infrastructure
01:08:58.960 | to CoreWeave in that partnership that you have?
01:09:01.880 | - That we did because we all got caught
01:09:06.880 | with the hit called ChatGPT and OpenAI API, right?
01:09:11.360 | Yeah, we were completely, I mean like it was impossible.
01:09:15.360 | There's no supply chain planning I could have done.
01:09:20.360 | I mean, none of us knew what was gonna happen,
01:09:25.560 | what happened in November of '22,
01:09:28.480 | like that was just a bolt from the blue, right?
01:09:30.880 | So therefore we had to catch up.
01:09:32.920 | So we said, hey, we're not going to in fact worry
01:09:35.480 | about too much inefficiency.
01:09:37.440 | So that's why, whether it's CoreWeave or many others,
01:09:41.560 | we bought all over the place.
01:09:43.560 | - Fair enough.
01:09:44.400 | - So, and that is a one-time thing
01:09:47.960 | and then now it's all catching up and yeah.
01:09:51.760 | So that was just more about trying to get caught up
01:09:55.000 | with demand.
01:09:55.840 | - Are you still supply constrained Satya?
01:09:58.160 | - I am power, yes.
01:10:02.720 | I am not chip supply constrained.
01:10:05.520 | We were definitely constrained in '24.
01:10:08.360 | What we have told the street is that's why we are optimistic
01:10:11.360 | about sort of the first half of '25,
01:10:14.360 | which is the rest of our fiscal year.
01:10:16.240 | And then after that, I think we'll be in better shape
01:10:20.360 | going into '26 and so on.
01:10:22.320 | We have good line of sight.
01:10:24.120 | - So I'm hearing with respect to this level two thinking,
01:10:29.120 | the O1 test time compute, post-training work
01:10:33.560 | that's being done on that is leading
01:10:36.120 | to really positive outcomes.
01:10:38.240 | And when you think about that,
01:10:39.520 | that's also pretty compute intensive
01:10:42.280 | because you're generating a lot of tokens,
01:10:44.000 | you're recycling those tokens back into the context window
01:10:46.960 | and you're doing that time and time again.
01:10:49.160 | And so that compounds very quickly.
01:10:51.920 | Jensen said he thought, looking at O1,
01:10:55.480 | that inference was going to a million or a billion X,
01:10:58.920 | just that the demand for inference
01:11:01.000 | is going to go up dramatically.
01:11:02.720 | In that regard, do you feel like you have
01:11:05.640 | the right long-term plan to scale inference
01:11:08.920 | to keep up with these new models?
01:11:11.360 | - Yeah, I mean, I think there are two things there, Brad,
01:11:15.520 | which is in some sense, it's very helpful
01:11:19.440 | to think about the full workload there, the full workload.
01:11:22.640 | Like in the agentic world,
01:11:24.480 | you have to have the AI accelerator.
01:11:26.080 | One of the fastest growing things of, in fact,
01:11:28.200 | open AI itself is the container service
01:11:31.200 | because after all these agents need a scratch pad
01:11:35.520 | for doing some of those autograding even
01:11:38.040 | to generate the samples, right?
01:11:39.960 | And so that is like where they run a code interpreter.
01:11:42.760 | And that, by the way, is a regular Azure Kubernetes cluster.
01:11:46.680 | So in an interesting way, there's a ratio
01:11:49.080 | of even what is regular Azure compute
01:11:52.400 | and its nexus to the GPU and then some data service.
01:11:57.400 | So to your point, when we say inference,
01:12:00.640 | that's why I look at it and say there's,
01:12:02.960 | people think about AI as separate from the cloud.
01:12:06.000 | AI is now core part of the cloud.
01:12:09.520 | And I think in a world where every AI application
01:12:13.680 | is a stateful application, it's an agentic application,
01:12:17.640 | that agent performs actions,
01:12:20.560 | then classic app server plus the AI app server
01:12:24.440 | plus the database are all required.
01:12:27.200 | And so I go back to my fundamental thing,
01:12:30.000 | which is, hey, we built this 60 plus AI regions.
01:12:32.600 | I mean, Azure regions, they all will be ready
01:12:36.320 | for full-on AI applications.
01:12:39.880 | And that's, I think, what will be needed.
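A sketch of the workload mix being described here; every function is a hypothetical stand-in (none of these are Azure APIs). One agentic request touches a GPU-backed model call, a container scratch pad, and a stateful database write.

```python
# Sketch of the agentic workload mix: model call (AI accelerator), container
# scratch pad (ordinary compute), and a stateful database write.
# All three functions are stand-ins, not Azure APIs.
def call_model(prompt: str) -> str:
    return "plan: compute 7-day average, then store it"   # the AI accelerator step

def run_in_container(code: str) -> float:
    # Scratch-pad execution on ordinary (Kubernetes-style) compute.
    return sum([3.0, 5.0, 4.0]) / 3

def write_state(db: dict, key: str, value: float) -> None:
    db[key] = value   # the stateful part: the agent persists what it did

db = {}
plan = call_model("Summarize last week's latency and save it.")
result = run_in_container("mean(latencies)")
write_state(db, "weekly_latency_avg", result)
print(plan, "->", db)
```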
01:12:42.560 | - That makes a lot of sense.
01:12:45.080 | So let's talk a little bit,
01:12:47.920 | we've talked around OpenAI a lot during this conversation,
01:12:51.920 | but you're managing this balance
01:12:54.920 | between a huge investment there and your own efforts.
01:12:59.720 | At Ignite, you showed a slide
01:13:02.880 | highlighting the differences between Azure OpenAI
01:13:06.880 | and OpenAI Enterprise.
01:13:09.680 | And a lot of those were about the enterprise grade,
01:13:14.400 | you know, things that you bring to the table.
01:13:16.440 | So when you look at that tension, you know,
01:13:18.840 | the competition that you have with OpenAI,
01:13:21.920 | do you think about them as ChatGPT is likely
01:13:25.760 | to be that winner on the consumer side?
01:13:28.920 | You'll have your own consumer apps as well.
01:13:31.840 | And then you'll divide and conquer
01:13:34.440 | when it comes to enterprise.
01:13:35.560 | How do you think about competing with them?
01:13:38.120 | - The way I think about it, at this point,
01:13:42.120 | given OpenAI is a very at-scale company, right?
01:13:46.600 | So it's no longer, it's a really very successful company
01:13:51.600 | with even multiple lines, if you will,
01:13:55.560 | of business and segments and what have you.
01:13:57.800 | And so I come at it very principally
01:14:00.320 | like I would with any other big partner, right?
01:14:03.760 | Because I don't think of them.
01:14:05.320 | So I think of them as, hey, as an investor,
01:14:07.920 | what are their interests and our interests
01:14:10.360 | and how do we align them?
01:14:12.000 | I think of them as an IP partner.
01:14:15.520 | And because we give them systems IP,
01:14:19.120 | they give us model IP, right?
01:14:22.360 | So that's another side of it
01:14:23.760 | where we are very deeply interested in each other's success.
01:14:28.760 | The third is, I think of them as a big customer.
01:14:33.320 | And so therefore I want to serve them
01:14:37.200 | like I would serve any other big customer.
01:14:39.520 | And so, and then the last one is the cooperation, right?
01:14:43.440 | Which is, whether it's Copilot in the consumer space,
01:14:48.440 | whether it's Copilot with M365 or whatever else,
01:14:53.440 | we sort of say, hey, where is the competition?
01:14:57.280 | Where is, and that's where I kind of look at it and say,
01:15:00.960 | ultimately these things will have some overlap,
01:15:03.240 | but I also in that context,
01:15:04.920 | the fact that they have the Apple deal
01:15:06.640 | is in some sense accretive for the MSFT shareholder, right?
01:15:11.240 | Even, like, the fact that their APIs,
01:15:14.520 | like to your point about the API differences,
01:15:16.480 | hey, you choose, right?
01:15:17.520 | The customers can choose which API front end,
01:15:19.560 | and there's differences, right?
01:15:22.480 | Azure has a particular style.
01:15:24.760 | And if you're an Azure customer
01:15:26.200 | and you want to use other services of Azure,
01:15:28.800 | then it's easiest to have an Azure-to-Azure match.
01:15:31.240 | But if you're on AWS and you want to just use
01:15:34.120 | just the API in a stateless way, great.
01:15:36.920 | Just use even OpenAI.
01:15:38.200 | So I think in an interesting way,
01:15:39.600 | sometimes having these two types of distributions
01:15:43.440 | is also helpful to the MSFT cause.
01:15:46.720 | - Satya, I would say the kind of curious part
01:15:52.200 | of the Silicon Valley community and even writ larger,
01:15:55.480 | I would say the entire business community
01:15:57.320 | is I think infatuated with the relationship
01:16:00.440 | between Microsoft and OpenAI.
01:16:02.440 | I was at DealBook last week
01:16:04.720 | and Andrew Sorkin pushed Sam really hard on this.
01:16:08.960 | I imagine there's a lot you can't say,
01:16:10.520 | but is there anything you can say?
01:16:12.600 | There's supposedly a restructuring,
01:16:15.720 | conversion to profit.
01:16:17.120 | I guess Elon's launched a missive in there as well.
01:16:20.440 | What can you tell us?
01:16:21.960 | - Yeah, I mean, I think those, Bill,
01:16:25.920 | are obviously all for the OpenAI board
01:16:28.000 | and Sam and Sarah and Brad and that team
01:16:31.120 | to decide what they want to do.
01:16:33.320 | And we want to be supportive.
01:16:34.480 | I mean, so this is where we're an investor.
01:16:37.200 | We, let me, I'd say the one thing that we care deeply about
01:16:41.480 | is that OpenAI continues to succeed, right?
01:16:44.200 | I mean, it's in our interest.
01:16:46.160 | And I also think it's a company
01:16:48.880 | that is an iconic company of this platform shift
01:16:53.080 | and the world is better with OpenAI doing well.
01:16:56.400 | And so therefore that's sort of the fundamental position.
01:16:59.920 | Then after that, the tension,
01:17:03.640 | to your point, comes from,
01:17:06.080 | like in all of these partnerships,
01:17:07.320 | some of it is that coopetition tension.
01:17:09.280 | Some of it is, you know, Sam's somebody
01:17:11.680 | who is an unbelievable entrepreneur
01:17:14.440 | with a great amount of sort of vision and ambition
01:17:18.200 | and the pace with which he wants to move, you know,
01:17:22.280 | and so therefore we have to balance that all out, right?
01:17:25.240 | Which is, you know, what he wants to do,
01:17:28.440 | I have to accommodate for so that he can do
01:17:31.080 | what he needs to do and he needs to accommodate
01:17:33.280 | for the discipline that we need on our end,
01:17:36.440 | given, you know, the overall constraints we may have.
01:17:39.000 | And so I think we'll work it out.
01:17:40.760 | But I mean, the good news here,
01:17:43.400 | I think is in this construct,
01:17:47.240 | we have come a long way.
01:17:48.240 | I mean, this five years has been great for them.
01:17:51.160 | It's been great for us.
01:17:53.000 | And at least for my part,
01:17:54.560 | I'm going to keep going back to that.
01:17:56.040 | And I want to prolong it as long as I can.
01:17:58.200 | It will only behoove us
01:17:59.320 | to have a long-term stable partnership.
01:18:01.720 | - When you think about, you know, the separate funding,
01:18:05.800 | you know, and untangling the two businesses,
01:18:08.600 | Satya, you know, are you guys motivated
01:18:11.160 | to do that relatively quickly?
01:18:13.800 | I've talked about thinking that the next step for them,
01:18:17.000 | you know, it'd be great to have them as a public company.
01:18:20.200 | You know, it's such an iconic, you know, business,
01:18:24.120 | early leader in AI.
01:18:26.800 | Is that the path that you see, you know,
01:18:30.480 | for these guys on the way forward?
01:18:33.560 | Or do you think that it stays kind of in the relationship
01:18:36.920 | that we are today?
01:18:38.120 | - And that's the place where, Brad,
01:18:40.680 | I want to be careful not to overstep, right?
01:18:42.760 | Because in some sense, you know,
01:18:44.320 | I'm neither, we're not on the board,
01:18:46.440 | we're investors like you,
01:18:47.800 | and at the end of the day,
01:18:51.160 | it's their board and their management decision.
01:18:53.640 | And so at some level,
01:18:54.760 | I'm going to take whatever their cues are.
01:18:57.320 | Like, in other words,
01:18:58.520 | I'm very clear that I want to support them
01:19:00.440 | with whatever decision they make.
01:19:02.240 | And to me, perhaps even as an investor,
01:19:08.120 | it's that commercial and IP partnership
01:19:10.600 | that matters the most.
01:19:12.920 | We want to make sure we protect our interests
01:19:15.800 | in all of this.
01:19:16.520 | And if anything, bolster them going forward.
01:19:20.200 | But I think, you know, at this point, you know,
01:19:23.320 | people like Sarah and Brad and Sam are, you know,
01:19:26.840 | are very, very smart folks on this
01:19:28.760 | and what makes the most sense for them
01:19:31.160 | to achieve their objectives on the mission
01:19:34.120 | is what we would be supportive of.
01:19:35.560 | - Well, maybe we should wrap
01:19:36.840 | and thank you for so much time today.
01:19:38.840 | But I want to wrap on this topic of open versus closed,
01:19:42.920 | you know, and how we should cooperate
01:19:46.040 | to usher in safe AI.
01:19:48.120 | And so maybe I'll just leave it open-ended to you.
01:19:51.320 | You know, talk to us a little bit
01:19:53.320 | about how you think about
01:19:54.760 | some of these differences and debates
01:19:57.000 | and the importance of doing this.
01:19:59.080 | And one anecdote I would just throw out there
01:20:01.480 | is Reuters recently reported
01:20:03.400 | that Chinese researchers developed an AI model
01:20:06.440 | for potential military use on the back of Meta's Llama, right?
01:20:11.080 | And, you know, there are a lot of supporters
01:20:12.920 | like Bill and I of open source,
01:20:15.080 | but we've also heard critics,
01:20:16.680 | and you said everybody can distill a model,
01:20:20.120 | you know, out there.
01:20:21.720 | So we are going to see some of these put to uses
01:20:24.440 | that we're not going to be happy about.
01:20:26.280 | So how do you think about, you know,
01:20:29.880 | us coming together really as a nation
01:20:32.440 | and as a collection of companies to usher in safe AI?
01:20:36.200 | - Yeah, I think two things.
01:20:39.160 | I think that I always have thought of open source
01:20:42.760 | versus close source as two different tactics
01:20:47.960 | in order to create network effects, right?
01:20:49.800 | I've never thought of them as just religious battles.
01:20:54.200 | I've thought of them as more like,
01:20:56.600 | hey, two different, I mean,
01:20:57.800 | that's why I think what Meta and Mark are doing
01:21:01.160 | is very smart, right?
01:21:02.200 | Which is in some sense,
01:21:04.440 | he's trying to commoditize even his complement, right?
01:21:06.840 | It makes a ton of sense to me.
01:21:09.000 | If I were in his shoes, I would do that, right?
01:21:10.840 | Which is get the entire world converged.
01:21:13.480 | I mean, I think he talks openly and very eloquently
01:21:16.360 | about how he wants to be the Linux of LLMs.
01:21:19.320 | And I think it's a beautiful model.
01:21:21.560 | In fact, there is even a model there,
01:21:23.160 | I think that, you know,
01:21:24.360 | sometimes going back to some of your economics questions,
01:21:28.360 | I think there is, like, game-theoretically,
01:21:31.320 | a consortium could be a superior model,
01:21:36.680 | quite frankly, than any one player trying to do it.
01:21:39.640 | Like this is, unlike the Linux Foundation,
01:21:44.120 | where the contributions were mostly
01:21:47.240 | apex contributions, right?
01:21:48.760 | Which is, I always say Linux wouldn't have happened
01:21:51.640 | but for those, I guess, you know,
01:21:53.560 | in fact, Microsoft's one of the largest committers to Linux.
01:21:57.320 | And so was IBM, so was Oracle and what have you.
01:22:01.080 | And I think that there may be a real place for,
01:22:03.880 | and open source is a beautiful mechanism for that, right?
01:22:07.640 | Which is when you have multiple entities
01:22:09.400 | coming together and so on.
01:22:10.760 | And it's a smart business strategy.
01:22:13.000 | Then closed source may make sense in some cases.
01:22:15.320 | After all, we have had lots of closed source products.
01:22:17.640 | Then safety is an important but orthogonal issue
01:22:21.160 | because after all, regulations will apply
01:22:24.440 | and safety will apply to both sides.
01:22:26.760 | And, you know, one could make arguments that,
01:22:30.520 | "Hey, if everybody's inspecting it,
01:22:32.200 | you know, there will be more safety
01:22:34.920 | on one side or the other."
01:22:36.200 | So I think of these as perhaps best dealt with
01:22:39.240 | in capitalism, at least.
01:22:41.160 | It's better to have multiple models
01:22:43.000 | and let there be competition
01:22:45.400 | and different companies will choose different paths.
01:22:48.200 | And then we should be pretty hardcore
01:22:51.080 | and the governments will demand that.
01:22:52.680 | I think in tech, you know, now there's no chance of saying,
01:22:56.440 | "Hey, we'll see what happens
01:22:58.440 | to the unintended consequences later."
01:23:00.520 | I mean, no government, no community,
01:23:02.680 | no society is going to tolerate that.
01:23:04.520 | So therefore, these AI safety, you know,
01:23:06.920 | institutions all over will hold the same bar.
01:23:10.360 | And also national security, to your point,
01:23:12.760 | if there is sort of national security leakage or challenges,
01:23:16.440 | the people will worry about that too.
01:23:17.880 | So therefore, I think states and state policy
01:23:21.160 | will have a lot to say about which of these models
01:23:25.400 | and what the regulatory regime will look like.
01:23:27.960 | - Well, it's hard to believe that we're only 22 months
01:23:31.320 | into the post-ChatGPT era, you know,
01:23:36.920 | but, you know, it's interesting when I reflect back
01:23:40.680 | on your, you know, framework around phase shifts,
01:23:44.040 | you have put Microsoft in a really good position
01:23:47.800 | as we emerge into the age of AI.
01:23:50.840 | And so congrats on the run over the last 10 years.
01:23:53.800 | It's been really, you know, a sight to behold,
01:23:56.920 | but, you know, it's great.
01:23:58.360 | I think both Bill and I get excited
01:24:00.120 | when we see the leadership, you, Elon, Mark, Sundar, et cetera,
01:24:06.440 | you know, really forging ahead for Team America around AI.
01:24:09.960 | You know, I feel, I think we both feel
01:24:13.240 | pretty incredibly optimistic
01:24:15.560 | about how we're going to be positioned
01:24:17.000 | vis-a-vis the rest of the world.
01:24:18.120 | So thanks for spending some time with us.
01:24:20.440 | - Yeah, I can't thank you enough for the time, Satya.
01:24:22.760 | Really appreciate it.
01:24:23.560 | - Thank you so much. - Bye-bye.
01:24:25.160 | - Thank you, Brad and Bill.
01:24:26.200 | - Take care, Satya.
01:24:27.080 | - Cheers.
01:24:37.080 | As a reminder to everybody,
01:24:38.760 | just our opinions, not investment advice.