All things AI w/ @altcap, @sama & @satyanadella. A Halloween Special. 🎃🔥 BG2 w/ Brad Gerstner

Chapters
0:00 Intro
2:28 Microsoft’s Investment in OpenAI
3:19 The Nonprofit Structure and Its Impact
5:46 Health, AI Security, and Resilience
7:50 Models, Exclusivity, and Distribution
8:58 Revenue Sharing and AGI Milestones
11:38 OpenAI’s Growth and Compute Commitments
15:21 Compute Constraints and Scaling
21:27 The Future of AI Devices and Consumer Use
24:31 Regulation and the Patchwork Problem
28:01 Looking Ahead to 2026 and Beyond
37:10 Microsoft’s Strategic Value from OpenAI
57:15 The Economics of AI and SaaS
64:28 Productivity, Jobs, and the Age of AI
70:43 Reindustrialization of America
00:00:00.240 |
I think this is really an amazing partnership through every phase. 00:00:04.000 |
We had kind of no idea where it was all going to go when we started, as Satya said, 00:00:07.600 |
but I don't think, I think this is one of the great tech partnerships ever. 00:00:13.840 |
And without, certainly without Microsoft and particularly Satya's early conviction, 00:00:19.520 |
What a week, what a week. Great to see you both. Sam, how's the baby? 00:00:36.560 |
Baby is great. That's the best thing ever. Every cliche is true and it is the best thing ever. 00:00:45.120 |
Smile on Sam's face whenever he talks about his baby. It's just so different. 00:00:50.160 |
And compute, I guess, when he talks about compute and his baby. 00:00:55.280 |
Well Satya, have you given any dad tips with all this time you guys have spent together? 00:01:01.680 |
I said, just enjoy it. I mean, it's so awesome that, you know, we had our babies or our children 00:01:08.320 |
so young and I wish I could redo it. So in some sense, it's just the most precious time. And as they 00:01:14.240 |
grow, it's just so wonderful. I'm so glad Sam is- 00:01:17.200 |
I'm happy to be doing it older, but I do think sometimes, man, 00:01:21.040 |
I wish I had the energy of when I was like 25. That part's harder. 00:01:24.320 |
No doubt about it. What's the average age at OpenAI, Sam? Any idea? It's young. 00:01:29.040 |
It's not crazy young. Not like most Silicon Valley startups. I don't know, maybe low 30s average. 00:01:35.600 |
Are babies trending positively or negatively? 00:01:41.440 |
Yeah. Well, you guys, such a big week. You know, I was thinking about I started at NVIDIA's GTC, 00:01:47.280 |
you know, just hit $5 trillion. Google, Meta, Microsoft, Satya, you had your earnings yesterday. 00:01:53.440 |
You know, and we heard consistently not enough compute, not enough compute, not enough compute. 00:01:58.240 |
We got rate cuts on Wednesday. The GDP is tracking near 4%. And then I was just saying to Sam, 00:02:03.840 |
you know, the president's cut these massive deals in Malaysia, South Korea, Japan, sounds like with 00:02:08.880 |
China. You know, deals that really incredibly provide the financial firepower to re-industrialize 00:02:14.640 |
America. 80 billion for new nuclear fission, all the things that you guys need to build more compute. 00:02:20.720 |
But what certainly wasn't lost in all of this was you guys had a big announcement on Tuesday 00:02:26.160 |
that clarified your partnership. Congrats on that. And I thought we'd just start there. I really want 00:02:30.880 |
to just break down the deal in really simple, plain language to make sure I understand it and others. 00:02:37.440 |
But, you know, we'll just start with your investment, Satya. You know, Microsoft started investing in 2019, 00:02:44.720 |
has invested in the ballpark of $13, $14 billion into OpenAI. And for that, you get 27% ownership 00:02:52.640 |
in the business on a fully diluted basis. I think it was about a third. And you took some 00:02:57.280 |
dilution over the course of last year with all the investment. So does that sound about right in terms 00:03:02.800 |
of ownership? Yeah, it does. But I would say before even our stake in it, Brad, I think what's pretty 00:03:09.360 |
unique about OpenAI is the fact that as part of OpenAI's process of restructuring, one of the largest 00:03:18.000 |
nonprofits gets created. I mean, let's not forget that, you know, in some sense, I say at Microsoft, 00:03:23.520 |
like, you know, we are very proud of the fact that we were associated with two of the largest 00:03:27.520 |
nonprofits, the Gates Foundation and now the OpenAI Foundation. So that's, I think, the big news. We 00:03:33.680 |
obviously are thrilled. It's not what we thought. And as I said to somebody, it's not like when we 00:03:39.280 |
first invested our billion dollars that, oh, this is going to be the hundred bagger that I'm going to 00:03:43.440 |
be talking about to VCs about. But here we are. But we are very thrilled to be an investor and an early 00:03:50.160 |
backer. And it's a great, and it's really a testament to what Sam and team have done, quite frankly. I mean, 00:03:56.560 |
they obviously had the vision early about what this technology could do. And they ran with it and just 00:04:02.480 |
executed, you know, in a masterful way. I think this has really been an amazing partnership through 00:04:08.160 |
every phase. We had kind of no idea where it was all going to go when we started, as Satya said. 00:04:12.560 |
But I don't think, I think this is one of the great tech partnerships ever. And without, 00:04:19.440 |
certainly without Microsoft, and particularly Satya's early conviction, we would not have been 00:04:23.920 |
able to do this. I don't think there were a lot of other people that would have been willing to take 00:04:28.240 |
that kind of a bet given what the world looked like at the time. We didn't know exactly how the tech was 00:04:34.400 |
going to go. Well, not exactly. We didn't know at all how the tech was going to go. We just had a lot of 00:04:38.160 |
conviction in this one idea of pushing on deep learning and trusting that if we could do that, 00:04:42.800 |
we'd figure out ways to make wonderful products and create a lot of value. And also, as Satya said, 00:04:47.360 |
create what we believe will be the largest nonprofit ever. And I think it's going to do amazingly great 00:04:53.520 |
things. It was, I really like the structure because it lets the nonprofit grow in value while the PBC is 00:05:01.040 |
able to get the capital that it needs to keep scaling. I don't think the nonprofit would be able to be 00:05:05.440 |
this valuable if we didn't come up with the structure and if we didn't have partners around the table that 00:05:09.440 |
we're excited for it to work this way. But, you know, I think it's been six, more than six years since 00:05:14.640 |
we first started this partnership. And a pretty crazy amount of achievement for six years. And I think much, 00:05:20.720 |
much, much more to come. I hope that Satya makes a trillion dollars on the investment, not a hundred 00:05:24.240 |
billion, you know, whatever it is. Well, as part of the restructuring, 00:05:27.520 |
you guys talked about it. You have this nonprofit on top and a public benefit corp below. It's pretty 00:05:32.720 |
insane. The nonprofit is already capitalized with $130 billion, $130 billion of OpenAI stock. It's one of 00:05:42.000 |
the largest in the world out of the gates. It could end up being much, much larger. The California attorney 00:05:46.960 |
general said they're not going to object to it. You already have this $130 billion dedicated to making 00:05:52.720 |
sure that AGI benefits all of humanity. You announced that you're going to direct the first $25 billion 00:05:58.400 |
to health and AI security and resilience, Sam. First, let me just say, you know, as somebody who 00:06:04.480 |
participates in the ecosystem, kudos to you both. It's incredible, this contribution to the future of AI. 00:06:10.240 |
But Sam, talk to us a bit about the importance of the choice around health and resilience, and then help us 00:06:18.160 |
understand how do we make sure that you get maximal benefit without it getting weighted down, as we've 00:06:24.240 |
seen with so many nonprofits with its own political biases? 00:06:27.760 |
Yeah, first of all, the best way to create a bunch of value for the world is hopefully what we've already been 00:06:35.680 |
doing, which is to make these amazing tools and just let people use them. And I think capitalism is 00:06:40.320 |
great. I think companies are great. I think people are doing amazing work getting advanced AI into the 00:06:45.280 |
hands of a lot of people and companies that are doing incredible things. There are some areas where the, 00:06:51.200 |
I think, market forces don't quite work for what's in the best interest of people, and you do need to do 00:06:58.000 |
things in a different way. There are also some new things with this technology that just haven't existed 00:07:03.280 |
before, like the potential to use AI to do science at a rapid clip, like really truly automated discovery. 00:07:09.040 |
And when we thought about the areas we wanted to first focus on, clearly, if we can cure a lot of 00:07:14.720 |
disease and make the data and information for that broadly available, that would be a wonderful thing 00:07:20.160 |
to do for the world. And then on this point of AI resilience, I do think some things may get 00:07:25.600 |
a little strange and they won't all be addressed by companies doing their thing. So as the world has to 00:07:31.040 |
navigate through this transition, if we can fund some work to help with that, and that could be, 00:07:36.640 |
you know, cyber defense, that could be AI safety research, that could be economic studies, all of 00:07:40.480 |
these things, helping society get through this transition smoothly. We're very confident about how 00:07:46.080 |
great it can be on the other side. But, you know, I'm sure there will be some choppiness along the way. 00:07:49.920 |
Let's keep busting through the deal. So models and exclusivity, Sam, OpenAI can distribute its models, 00:07:58.400 |
its leading models on Azure, but I don't think you can distribute them on any of the other leading big 00:08:03.600 |
clouds for seven years until 2032. But that would end earlier if AGI is verified, we can come back to that. 00:08:10.800 |
But you can distribute your open source models, Sora, agents, Codex, wearables, everything else on other 00:08:16.320 |
platforms. So Sam, I assume this means no ChatGPT or GPT-6 on Amazon or Google. 00:08:23.440 |
No, so we have a... First of all, we want to do lots of things together to help, you know, 00:08:28.080 |
create value for Microsoft. We want them to do lots of things to create value for us. And there are many, 00:08:32.640 |
many things that will happen in that category. We are keeping what Satya termed once, and I think it's a great 00:08:37.360 |
phrase of stateless APIs on Azure exclusively through 2030 and everything else we're going to, you know, 00:08:42.880 |
distribute elsewhere. And that's obviously in Microsoft's interest too. So we'll put lots of 00:08:47.120 |
products, lots of places, and then this thing we'll do on Azure and people can get it there or via us. 00:08:51.920 |
And I think that's great. And then the rev share, there's still a rev share that gets paid by OpenAI 00:08:56.080 |
to Microsoft on all your revenues that also runs until 2032, or until AGI is verified. So let's just 00:09:03.200 |
assume for the sake of argument, I know this is pedestrian, but it's important that the 00:09:07.200 |
rev share is 15%. So that would mean if you had $20 billion in revenue that you're paying $3 billion 00:09:12.720 |
to Microsoft and that counts as revenue to Azure. Satya, does that sound about right? 00:09:17.920 |
Yeah, we have a rev share. And I think as you characterized it, it is either going to AGI 00:09:22.960 |
or till the end of the term. And I actually don't know exactly where we count it, quite honestly, 00:09:28.000 |
whether it goes into Azure or somewhere else. That's a good question. It's a good question for Amy. 00:09:31.920 |
Given that both exclusivity and the rev share end early in the case AGI is verified, 00:09:38.240 |
it seems to make AGI a pretty big deal. And as I understand it, you know, if OpenAI claimed AGI, 00:09:44.560 |
it sounds like it goes to an expert panel and you guys basically select a jury who's got to make a 00:09:49.600 |
relatively quick decision whether or not AGI has been reached. Satya, you said on yesterday's earning call 00:09:54.880 |
that nobody's even close to getting to AGI and you don't expect it to happen anytime soon. You talked about 00:10:00.000 |
this spiky and jagged intelligence. Sam, I've heard you perhaps sound a little bit more bullish on, 00:10:05.760 |
you know, when we might get to AGI. So I guess the question is to you both, do you worry that over the 00:10:11.200 |
next two or three years, we're going to end up having to call in the jury to effectively make a call on 00:10:17.200 |
whether or not we've hit AGI? I realize you've got to try to make some drama between us here. 00:10:21.280 |
I think putting a process in place for this is a good thing to do. I expect that the technology 00:10:32.400 |
will take several surprising twists and turns and we will continue to be good partners to each other and 00:10:36.880 |
figure out what makes sense. That's well said, I think. And that's one of the reasons why I think 00:10:42.000 |
this process we put in place is a good one. And at the end of the day, I'm a big believer in the fact that 00:10:47.760 |
intelligence capability-wise is going to continue to improve. And our real goal, quite frankly, 00:10:53.760 |
is that, which is how do you put that in the hands of people and organizations so that they can get the 00:10:58.400 |
maximum benefits? And that was the original mission of OpenAI that attracted me to OpenAI and Sam and 00:11:04.880 |
team. And that's kind of what we plan to continue on. Brad, to say the obvious, if we had super 00:11:09.920 |
intelligence tomorrow, we would still want Microsoft's help getting this product out into people's hands. 00:11:14.960 |
And we want them, like, yeah. Of course, of course. Yeah, no, again, I'm asking the questions I know that 00:11:21.680 |
are on people's minds and that makes a ton of sense to me. Obviously, Microsoft is one of the largest 00:11:27.520 |
distribution platforms in the world. You guys have been great partners for a long time, but I think it 00:11:31.440 |
dispels some of the myths that are out there. But let's shift gears a little bit. You know, obviously, 00:11:35.520 |
OpenAI is one of the fastest growing companies in history. Satya, you said on the pod a year ago, 00:11:41.120 |
this pod, that every new phase shift creates a new Google and the Google of this phase shift is already 00:11:47.760 |
known and it's OpenAI. And none of this would have been possible had you guys not made these huge bets. 00:11:53.440 |
With all that said, you know, OpenAI's revenues are still a reported $13 billion in 2025. And Sam, 00:12:00.880 |
on your live stream this week, you talked about this massive commitment to compute, right? 1.4 trillion 00:12:07.920 |
over the next four or five years with, you know, big commitments, 500 billion to Nvidia, 300 billion to 00:12:14.560 |
AMD and Oracle, 250 billion to Azure. So I think the single biggest question I've heard all week and, 00:12:21.680 |
and hanging over the market is how, you know, how can the company with 13 billion in revenues make 1.4 00:12:29.120 |
trillion of spend commitments, you know, and, and, and you've heard the criticism. 00:12:33.920 |
First of all, we're doing well more revenue than that. Second of all, Brad, if you want to sell your shares, 00:12:38.560 |
I'll find you a buyer. I just, enough, like, you know, people are, I think there's a lot of people 00:12:46.080 |
who would love to buy OpenAI shares. I don't, I don't think you would. Including myself. Including myself. 00:12:51.200 |
People who talk with a lot of like breathless concern about our compute stuff or whatever, 00:12:56.320 |
that would be thrilled to buy shares. So I think we could sell, you know, your shares or anybody else's to 00:13:00.960 |
some of the people who are making the most noise on Twitter, whatever about this very quickly. We do plan for 00:13:05.760 |
revenue to grow steeply. Revenue is growing steeply. We are taking a forward bet that it's going to 00:13:10.400 |
continue to grow. And that not only will ChatGPT keep growing, but we will be able to become 00:13:16.880 |
one of the important AI clouds. That our consumer device business will be a significant and important 00:13:23.360 |
thing. That AI that can automate science will create huge value. So, you know, there are not many times that I 00:13:31.920 |
want to be a public company, but one of the rare times it's appealing is when those people that are 00:13:36.160 |
writing these ridiculous "OpenAI is about to go out of business" and, you know, whatever. I would love 00:13:40.400 |
to tell them they could just short the stock and I would love to see them get burned on that. But, 00:13:45.120 |
you know, I, we carefully plan, we understand where the technology, where the capability is going to 00:13:51.920 |
grow, go and, and how the products we can build around that and the revenue we can generate. 00:13:57.280 |
We might screw it up. Like this is the bet that we're making and we're taking a risk along with 00:14:01.840 |
that. A certain risk is if we don't have the compute, we will not be able to generate the revenue or make 00:14:06.800 |
the models at these, at this kind of scale. Exactly. And let me just say one thing, Brad, 00:14:12.400 |
as both a partner and an investor, there has not been a single business plan that I've seen from OpenAI 00:14:21.600 |
that they've put in and not beaten it. So in some sense, this is the one place where, you know, in 00:14:28.320 |
terms of their growth and just even the business, it's been unbelievable execution, quite frankly. I mean, 00:14:34.240 |
obviously, OpenAI, everyone talks about all the success and the usage and what have you. But even, I'd say all 00:14:40.240 |
up, the business execution has been just pretty unbelievable. I heard Greg Brockman say on CNBC a couple of weeks ago, 00:14:48.000 |
right, if we could 10x our compute, we might not have 10x more revenue, but we'd certainly have a lot 00:14:54.080 |
more revenue. Simply because of lack of compute power, things like, yeah, it's just, it's really 00:15:00.400 |
wild when I just look at how much we are held back. And in many ways we have, you know, we've scaled our 00:15:05.840 |
compute probably 10x over the past year. But if we had 10x more compute, I don't know if we'd have 10x more 00:15:10.800 |
revenue, but I don't think it'd be that far. And we heard this from you as well last night, Satya, 00:15:16.640 |
that you were compute constrained and growth would have been even higher if you had more compute. 00:15:21.120 |
So help us contextualize Sam, maybe like how compute constrained do you feel today? And do you when 00:15:27.600 |
you look at the build out over the course of the next two to three years, do you think you'll ever get to 00:15:31.760 |
the point where you're not compute constrained? We talked about this question of is there ever enough 00:15:37.200 |
compute a lot? I think the answer is the only, the best way to think about this is like energy or 00:15:46.720 |
something. You can talk about demand for energy at a certain price point, but you can't talk about demand 00:15:51.600 |
for energy without talking about at different, you know, different demand at different price levels. 00:15:58.800 |
If the price of compute per like unit of intelligence or whatever, however you want to think about it, 00:16:04.800 |
fell by a factor of a hundred tomorrow, you would see usage go up by much more than a hundred. 00:16:10.080 |
And there'd be a lot of things that people would love to do with that compute that just make no economic 00:16:13.760 |
sense at the current cost, but there would be new kind of demand. So I think the, now on the other 00:16:20.720 |
hand, as the models get even smarter and you can use these models to cure cancer or discover novel 00:16:25.680 |
physics or drive a bunch of humanoid robots to construct a space station or whatever crazy thing 00:16:29.840 |
you want, then maybe there's huge willingness to pay a much higher cost per unit of intelligence 00:16:37.040 |
for a much higher level of intelligence that we don't know yet, but I would bet there will be. 00:16:41.200 |
So I, I think when you talk about capacity, it's, it's like a, you know, cost per unit and, you know, 00:16:48.080 |
capability per unit. And you have to kind of, without those curves, it's sort of a made-up number. It's 00:16:53.120 |
not a super well-specified problem. Yeah. I mean, I think the one thing that, you know, Sam, you've 00:16:59.040 |
talked about, which I think is the right ways to think about is that if intelligence is whatever log of 00:17:03.360 |
compute, then you try and really make sure you keep getting efficient. And so that means the tokens per dollar 00:17:09.840 |
per watt, uh, and the economic value that the society gets out of it is what we should maximize and reduce 00:17:16.640 |
the costs. And so that's where, if you sort of where like the Jevons paradox point is that, which is 00:17:22.080 |
you keep reducing it, commoditizing in some sense intelligence, uh, so that it becomes the real driver 00:17:28.880 |
of GDP growth all around. Unfortunately, it's something closer to, uh, intelligence as 00:17:34.080 |
log of compute, but we may figure out better scaling laws and we may figure out how to do this. 00:17:37.600 |
Yeah. We heard from both Microsoft and Google yesterday, both said their cloud businesses 00:17:42.400 |
would have been growing faster if they have more GPUs. You know, I asked Jensen on this pod, 00:17:47.200 |
if there was any chance over the course of the next five years, we would have a compute glut. And he said, 00:17:52.880 |
it's virtually non-existent chance in the next two to three years. And I assume you guys would both 00:17:58.960 |
agree with Jensen that while we can't see out five, six, seven years, certainly over the course of the 00:18:04.320 |
next two to three years for the, for the reasons we just discussed, that it's almost a non-existent 00:18:09.120 |
chance that you have excess compute. Well, I mean, I think the, the cycles of demand and supply 00:18:16.880 |
in this particular case, you can't really predict, right? I mean, even the point is what's the secular trend? 00:18:23.360 |
The secular trend is what Sam said, which is at the end of the day, because quite frankly, the biggest issue 00:18:28.880 |
we are now having is not a compute glut, but it's power and it's sort of the ability to get the builds done 00:18:34.960 |
fast enough close to power. So if you can't do that, you may actually have a bunch of chips sitting 00:18:41.040 |
in inventory that I can't plug in. And in fact, that is my problem today, right? It's not a supply 00:18:46.160 |
issue of chips. It's actually the fact that I don't have warm shells to plug into. And so 00:18:51.920 |
how some supply chain constraints emerge, tough to predict because the demand is just going, you know, 00:18:59.040 |
is tough to predict, right? I mean, I wouldn't, it's not like Sam and I would want to be sitting 00:19:03.520 |
here saying, oh my God, we're no longer short on compute. It's because we just were not that good at being able 00:19:08.720 |
to project out what the demand would really look like. So I think that that's, and by the way, the worldwide 00:19:14.480 |
side, right? It's one thing to sort of talk about one segment in one country, but it's about, you know, 00:19:19.520 |
really getting it out to everywhere in the world. And so there will be constraints and how we work 00:19:23.920 |
through them is going to be the most important thing. It won't be a linear path for sure. 00:19:27.760 |
There will come a glut for sure. And whether that's like in two to three years or five to six, Satya 00:19:33.440 |
and I can't tell you, but like, it's going to happen at some point, probably several points along 00:19:37.920 |
the way. Like this is, there is something deep about human psychology here and bubbles. And also, 00:19:44.880 |
as Satya said, like there's, it's such a complex supply chain, weird stuff gets built. The technological 00:19:51.120 |
landscape shifts in big ways. So, you know, if a very cheap form of energy comes online soon at mass 00:19:58.080 |
scale, then a lot of people are going to be extremely burned with existing contracts they've signed, 00:20:01.760 |
if, if we can continue this unbelievable reduction in cost per unit of intelligence, let's say it's been 00:20:09.280 |
averaging like 40 X for a given level per year, you know, that's like a very scary exponent 00:20:17.040 |
I think that's really well said. And you have to hold those two simultaneous truths. We had that 00:20:45.360 |
happen in 2000, 2001, and yet the internet became much bigger and produced much greater outcomes for 00:20:51.440 |
society than anybody estimated in that period of time. Yeah, but I think that the one thing that Sam 00:20:56.320 |
said is not talked about enough, which is the, for example, the optimizations that OpenAI has done on the 00:21:02.560 |
inference stack for a given GPU. I mean, it's kind of like, it's, you know, we talk about the Moore's law 00:21:08.320 |
improvement on one end, but the software improvements are much more exponential than that. 00:21:13.520 |
I mean, someday we will make an incredible consumer device that can run a GPT-5 or GPT-6 00:21:20.800 |
capable model completely locally at a low power draw. And this is like, so hard to wrap my head around. 00:21:27.600 |
That will be incredible. And, you know, that's the type of thing I think that scares some of the people 00:21:32.240 |
who are building, obviously, these large centralized compute stacks. And Satya, you've talked a lot about the 00:21:38.320 |
distribution, both to the edge, as well as having inference capability distributed around the world. 00:21:43.280 |
Yeah, I mean, at least I've thought about it is more about really building a fungible fleet. I mean, when I look at sort of in the 00:21:51.040 |
cloud infrastructure business, one of the key things you have to do is have two things. One is an efficient, like in this context, in a very efficient token 00:21:59.360 |
factory, and then high utilization. That's it. There are two simple things that you need to achieve. 00:22:05.440 |
And in order to have a high utilization, you have to have multiple workloads that can be scheduled, 00:22:09.840 |
even on the training. I mean, if you look at the AI pipelines, there's pre-training, there's mid-training, 00:22:13.920 |
there's post-training, there's RL, you want to be able to do all of those things. So thinking about 00:22:18.160 |
fungibility of the fleet is everything for a cloud provider. 00:22:21.920 |
Okay, so Sam, you referenced, you know, and Reuters was reporting yesterday that OpenAI may be planning 00:22:28.240 |
to go public late '26 or in '27. No, no, no, we don't have anything that specific. I'm a realist. I assume 00:22:34.320 |
it will happen someday. But that was, I don't know why people write these reports. We don't have like a date in mind. 00:22:40.640 |
Great to know. I don't have a decision to do this or anything like that. I just assume it's where 00:22:44.240 |
things will eventually go. But it does seem to me, if you guys were, you know, are doing in excess of 00:22:50.640 |
$100 billion of revenue in '28 or '29, that you at least would be in position. What? How about '27? 00:22:58.160 |
Yeah, '27, even better. You are in position to do an IPO and the rumored trillion dollars, again, just to 00:23:06.400 |
contextualize for listeners. If you guys went public at 10 times $100 billion in revenue, right, which 00:23:13.520 |
would be, I think, a lower multiple than Facebook went public at, a lower multiple than a lot of other 00:23:19.440 |
big consumer companies went public at, that would put you at a trillion dollars. If you floated 10 to 20% 00:23:26.320 |
of the company, that raises $100 to $200 billion, which seems like that would be a good path to fund 00:23:33.440 |
a lot of the growth and a lot of the stuff that we just talked about. So you're, you're not opposed 00:23:38.800 |
to it. You're not, but you guys are making the company with revenue growth, which is what I would 00:23:43.920 |
like us to do. But no doubt about it. Well, I've also said, I think that this is such an important 00:23:50.080 |
company. And, you know, there are so many people, including my kids who like to trade their little 00:23:56.080 |
accounts and they use ChatGPT. And I think having retail investors have an opportunity to buy one of 00:24:02.240 |
the most important and largest companies. That would be nice. Honestly, that is probably 00:24:05.440 |
the single most appealing thing about it to me. That would be really nice. 00:24:10.560 |
So one of the things I've talked to you both about shifting gears again is part of the big, 00:24:16.640 |
beautiful bill, you know, Senator Cruz had included federal preemption so that we wouldn't have this 00:24:24.000 |
state patchwork, 50 different laws that mires the industry down and kind of needless compliance and 00:24:30.160 |
regulation. Unfortunately, got killed at the last second by Senator Blackburn, because frankly, I think 00:24:36.320 |
AI is pretty poorly understood in Washington. And there's a lot of doomerism, I think, that has gained 00:24:41.520 |
traction in Washington. So now we have state laws like the Colorado AI Act that goes into full effect 00:24:47.600 |
in February, I believe, that creates this whole new class of litigants. Anybody who claims any unfair 00:24:52.720 |
impact from algorithmic discrimination in a chatbot. So somebody could claim harm for countless reasons. 00:24:59.840 |
Sam, how worried are you that, you know, having this state patchwork of AI, you know, 00:25:06.000 |
poses real challenges to, you know, our ability to continue to accelerate and compete around the world? 00:25:11.360 |
I don't know how we're supposed to comply with that California, sorry, Colorado law, I would love 00:25:16.400 |
them to tell us, you know, we'd like to be able to do it. But that's just from what I've read of that, 00:25:22.560 |
that's like, I literally don't know what we're supposed to do. I'm very worried about a 50 state 00:25:27.280 |
patchwork. I think it's a big mistake. I think it's, there's a reason we don't usually do that for 00:25:33.360 |
Yeah, I mean, I think the fundamental problem of, you know, this patchwork approach is, 00:25:38.880 |
quite frankly, I mean, between OpenAI and Microsoft, we'll figure out a way to navigate this, right? I 00:25:44.160 |
mean, we can figure this out. The problem is anyone starting a startup and trying to kind of, 00:25:50.720 |
this is sort of, it just goes to the exact opposite, or I think what the intent here is, 00:25:55.440 |
which obviously safety is very important, making sure that the fundamental, you know, concerns people 00:26:02.240 |
have are addressed, but there's a way to do that at the federal level. So, I think if we don't do this, 00:26:07.680 |
again, you know, EU will do it, and then that'll cause its own issues. So, I think if US leads, 00:26:13.680 |
it's better as, you know, as one regulatory framework. For sure. 00:26:18.400 |
And to be clear, it's not that one is advocating for no regulation. It's simply saying, let's have, 00:26:24.400 |
you know, agreed upon regulation at the federal level, as opposed to 50 competing state laws, 00:26:29.840 |
which certainly firebombs the AI startup industry. And I think it makes it super challenging, even for 00:26:36.800 |
companies like yours, who can afford to defend all these cases. 00:26:39.440 |
Yeah. And I would just say, quite frankly, my hope is that this time around, even across EU and the 00:26:45.200 |
United States, like that'll be the dream, right? Quite frankly, for any European startup. 00:26:51.920 |
That would be great. I don't, I wouldn't hold your breath for that one. That would be great. 00:26:54.720 |
No, but I really think that if you think about it, right, if you sort of, if anyone in Europe is 00:27:00.480 |
thinking about their, you know, how can they participate in this AI economy with their companies, 00:27:07.440 |
this should be the main concern there as well. So therefore that's, I hope there is some enlightened 00:27:13.600 |
approach to it, but I agree with you that, you know, today I wouldn't bet on that. 00:27:16.960 |
I do think that with Sacks as the AI czar, you at least have a president that I think might fight for 00:27:24.320 |
that in terms of coordination of AI policy, using trade as a lever to make sure that, you know, 00:27:30.880 |
we don't end up with overly restricted European policy, but we shall see. I think first things 00:27:35.040 |
first, federal preemption in the United States is pretty critical. You know, we've been down in the 00:27:38.960 |
weeds a little bit here, Sam. So I want to telescope out a little bit. You know, I've heard people on your 00:27:45.600 |
team talk about all the great things coming up. And as you start thinking about much more unlimited 00:27:52.480 |
compute, ChatGPT-6 and beyond, robotics, physical devices, scientific research, as you look 00:28:02.080 |
forward to 2026, what do you think surprises us the most? What are you most excited about 00:28:09.600 |
You, I mean, you just hit on a lot of the key points there. I think 00:28:13.520 |
Codex has been a very cool thing to watch this year. And as these go from multi-hour tasks to multi-day 00:28:20.720 |
tasks, which I expect to happen next year, what people will be able to do to create 00:28:24.720 |
software at an unprecedented rate and really in fundamentally new ways. I'm very excited for that. 00:28:31.280 |
I think we'll see that in other industries too. I have like a bias towards coding. I understand that 00:28:35.040 |
one better, but I think we'll see that really start to transform what people are capable of. I, I hope for 00:28:42.000 |
very small scientific discoveries in 2026, but if we can get those very small ones, we'll get bigger ones in 00:28:46.400 |
future years. That's a really crazy thing to say is that like AI is going to make a novel scientific 00:28:50.720 |
discovery in 2026, even a very small one. This is like, this is a wildly important thing to be talking 00:28:57.600 |
about. So I'm excited for that. Certainly robotics and new kinds of computers in future 00:29:03.280 |
years. That'll be, that'll be very important. But yeah, my personal bias is if we can really get AI to do 00:29:11.680 |
science here, that is, I mean, that is super intelligence in some sense. Like if this is expanding 00:29:17.520 |
the total sum of human knowledge, that is a crazy big deal. 00:29:20.000 |
Yeah. I mean, I think one of the things to use your codex example, I think the combination of the model 00:29:27.040 |
capability, I mean, if you think about the magical moment that happened with ChatGPT was the UI that 00:29:33.200 |
met intelligence that just took off, right? It's just, you know, an unbelievable form factor. And some 00:29:39.680 |
of it was also the instruction following piece of model capability was ready for chat. I think that that's 00:29:46.320 |
what the codex and the, you know, these coding agents are about to help us, which is what's that, 00:29:52.080 |
you know, coding agent goes off for a long period of time, comes back, and then I'm dropped into 00:29:59.040 |
what I should steer. Like one of the metaphors I think we're all sort of working towards is I do this 00:30:04.560 |
macro delegation and micro steering. What is that UI meets this new intelligence capability? And you can see 00:30:13.520 |
the beginnings of that with codex, right? The way at least I use it inside a GitHub Copilot is like, 00:30:18.960 |
you know, it's just a different way than the chat interface. And I think 00:30:24.160 |
that I think would be a new way for the human-computer interface, quite frankly, it's probably 00:30:29.280 |
bigger than that, that might be the departure. That's one reason I'm very excited that we're doing new form 00:30:35.840 |
factors of computing devices, because computers were not built for that kind of workflow very well. 00:30:40.240 |
Certainly, a UI like ChatGPT is wrong for it. But this idea that you can have a device that is sort of 00:30:45.520 |
always with you, but able to go off and do things and get micro steer from you when it needs and have 00:30:51.200 |
like really good contextual awareness of your whole life and flow. And I think that'd be cool. 00:30:55.680 |
And what neither of you have talked about is the consumer use case, I think a lot about, 00:31:00.160 |
you know, again, we go onto this device, and we have to hunt and peck through 100 different 00:31:04.080 |
applications and fill out little web forms, things that really haven't changed in 20 years. But to 00:31:08.880 |
just have, you know, a personal assistant that we take for granted, perhaps that we actually have a 00:31:13.360 |
personal assistant, but to give a personal assistant for virtually free to billions of people around the 00:31:19.120 |
world to improve their lives, whether it's, you know, ordering diapers for their kid, or whether it's, 00:31:24.720 |
you know, booking their hotel or making changes in their calendar, I think sometimes it's the 00:31:30.160 |
pedestrian that's the most impactful. And as we move from answers to memory and actions, and then the 00:31:36.880 |
ability to interface with that through an earbud or some other device that doesn't require me to 00:31:41.600 |
constantly be staring at this rectangular piece of glass, I think it's pretty extraordinary. 00:31:52.720 |
Sam, it was great to see you. Thanks for joining us. Congrats again on this big step forward and we'll 00:32:01.040 |
As Sam well knows, we're certainly a buyer, not a seller. But sometimes, you know, I think it's important 00:32:09.520 |
because the world, you know, we're pretty small. We spend all day long thinking about this stuff, 00:32:15.680 |
right? And so conviction, it comes from the 10,000 hours we've spent thinking about it. But the reality 00:32:23.040 |
is we have to bring along the rest of the world and the rest of the world doesn't spend 10,000 hours 00:32:28.080 |
thinking about this. And frankly, they look at some things that appear overly ambitious, right? And get 00:32:34.160 |
worried about whether or not we can pull those things off. So you took this idea to the board in 2019 00:32:40.720 |
to invest a billion dollars into OpenAI. Was it a no-brainer in the boardroom? 00:32:45.040 |
You know, did you have to expend any political capital to get it done? Dish for me a little bit, 00:32:51.280 |
like, what that moment was like. Because I think it was such a pivotal moment, not just for Microsoft, 00:32:56.640 |
not just for the country, but I really do think for the world. 00:32:59.040 |
It's interesting when you look back. The journey, when I look at it, it's been, you know, we were 00:33:05.200 |
involved even in 2016 when initially OpenAI started. In fact, Azure was even the first sponsor, I think. 00:33:13.520 |
And then they were doing a lot more reinforcement learning at that time. I remember the Dota 2 00:33:18.080 |
competition, I think, happened on Azure. And then they moved on to other things. And, you know, I was 00:33:23.520 |
interested in RL, but quite frankly, you know, it speaks a little bit to your 10,000 hours or the 00:33:28.880 |
prepared mind. Microsoft, since 1995, was obsessed. I mean, Bill's obsession for the company was natural 00:33:36.160 |
language, natural language. I mean, after all, we had a coding company, information work company. 00:33:40.640 |
So it's when Sam in 2019 started talking about text and natural language and transformers and scaling 00:33:47.840 |
laws. That's when I said, wow, like, this is an interesting, I mean, he, you know, this was a team 00:33:54.000 |
that was going in the direction or the direction of travel was now clear. It had a lot more overlap 00:34:00.160 |
with our interest. So in that sense, it was a no brainer. Obviously, you go to the board and say, 00:34:06.160 |
hey, I have an idea of taking a billion dollars and giving it to this crazy structure, which we 00:34:11.920 |
don't even kind of understand. What is it? It's a nonprofit, blah, blah, blah. And saying, go for it. 00:34:17.840 |
There was a debate. Bill was kind of rightfully so skeptical because, and then he became like, 00:34:24.800 |
once he saw the GPT-4 demo, like that was like the thing that Bill's talked about publicly, where 00:34:29.520 |
when he saw it, he said, it's the best demo he saw after, you know, what Charles Simonyi showed him at 00:34:35.200 |
Xerox PARC. And, but, you know, quite honestly, none of us could. So the moment for me was that, 00:34:42.160 |
you know, let's go give it a shot. Then seeing the early Codex inside of Copilot, inside of 00:34:50.080 |
GitHub Copilot and seeing just the code completions and seeing it work, that's when I would say we, 00:34:56.320 |
I felt like I can go from one to 10 because that was the big call, quite frankly. One was controversial, 00:35:03.440 |
but the one to 10 was what really made this entire era possible. And then obviously, 00:35:09.920 |
the great execution by the team and the productization on their part, our part. I mean, 00:35:15.280 |
if I think about it, right, the collective monetization reach of GitHub Copilot, ChatGPT, 00:35:22.160 |
Microsoft 365 Copilot and Copilot, you add those four things, that is it, right? That's the biggest sort of AI 00:35:28.160 |
set of products out there on the planet. And that's, you know, what obviously has let us sustain all of this. 00:35:34.640 |
And I think not many people know that your CTO, Kevin Scott, you know, an ex Googler lives down 00:35:40.320 |
here in Silicon Valley. And to contextualize it, right, Microsoft had missed out on search, 00:35:44.880 |
had missed out on mobile, you become CEO, almost had missed out on the cloud, right? You've described it, 00:35:52.480 |
caught the last train out of town to capture the cloud. And I think you were pretty determined to have 00:35:59.200 |
eyes and ears down here. So you didn't miss the next big thing. So I assume that Kevin played a good 00:36:04.720 |
role for you as well. Absolutely. He found DeepSeek and OpenAI. Yeah, I mean, in fact, 00:36:11.200 |
I would say Kevin's conviction, and Kevin was also skeptical, like that was the thing. I always watch 00:36:18.240 |
for people who are skeptical, who change their opinion, because to me, that's a signal. So I'm always 00:36:25.040 |
looking for someone who's a non-believer in something, and then suddenly changes. And then 00:36:29.840 |
they get excited about it, that I have all the time for that, because I'm then curious, why? And so, 00:36:35.680 |
Kevin started with thinking that all of us were kind of skeptical, right? No, I mean, in some sense, 00:36:40.080 |
it defies, you know, all of us having gone to school and said, God, you know, there must be an 00:36:45.680 |
algorithm to crack this versus just scaling laws and throw compute. But quite frankly, Kevin's conviction 00:36:52.720 |
that this is worth going after is one of the big things that drove this. 00:36:57.040 |
Well, we talk about, you know, that investment that it's now worth 130 billion, I suppose, could be 00:37:03.200 |
worth a trillion someday, as Sam says. But it really, in many ways, understates the value of the partnership, 00:37:10.080 |
right? So you have the value in the rev share billions per year going to Microsoft. You have the 00:37:16.880 |
profits you make off the $250 billion of the Azure compute commitment from OpenAI. And of course, 00:37:24.160 |
you get huge sales from the exclusive distribution of the API. So talk to us how you think about the 00:37:31.440 |
value across those domains, especially how this exclusivity has brought a lot of customers who may 00:37:37.760 |
have been on AWS to Azure. Yeah, no, absolutely. I mean, so to us, if I look at it, you know, 00:37:45.360 |
aside from all the equity parts, the real strategic thing that comes together and that remains going 00:37:52.240 |
forward, is that stateless API exclusivity on Azure that helps quite frankly, both OpenAI and us and our 00:37:59.840 |
customers. Because when somebody in the enterprise is trying to build an application, they want an API 00:38:06.640 |
that's stateless, they want to mix it up with compute and storage, put a database underneath it to capture 00:38:14.240 |
your state and build a full workload. And that's where, you know, Azure coming together with this API. 00:38:21.040 |
And so what we're doing with even Azure Foundry, right? Because in some sense, let's say you want 00:38:26.160 |
to build an AI application. But the key thing is, how do you make sure that the evals of what you're doing 00:38:32.880 |
with AI are great. So that's where you need even a full app server in Foundry. That's what we have done. 00:38:39.600 |
And so therefore, I feel that that is the way we will go to market in our infrastructure business. 00:38:45.280 |
The other side of the value capture for us is going to be incorporating all this IP. Not only do we have the 00:38:52.320 |
exclusivity of the model in Azure, but we have access to the IP. I mean, having a royalty free, 00:38:58.880 |
let's even forget all the know-how and the knowledge side of it. But having royalty-free 00:39:04.080 |
access all the way till seven more years gives us a lot of flexibility business model wise. It's kind 00:39:09.920 |
of like having a frontier model for free. In some sense, if you're an MSFT shareholder, that's kind of 00:39:15.440 |
where you should start from is to think about, we have a frontier model that we can then deploy, 00:39:20.480 |
whether it's in GitHub, whether it's in M365, whether it's in our consumer copilot, then add to it our own 00:39:26.720 |
data, post train it. So that means we can have it embedded in the weights there. And so therefore, 00:39:32.640 |
we are excited about the value creation on both the Azure and the infrastructure side, 00:39:38.480 |
as well as in our high value domains, whether it is in health, whether it's in knowledge work, 00:39:45.920 |
You've been consolidating the losses from OpenAI. You know, I think you just reported earnings 00:39:50.800 |
yesterday. I think you consolidated 4 billion of losses in the quarter. Do you think that investors 00:39:56.160 |
are, I mean, they may even be attributing negative value, right? Because of the losses, you know, as they 00:40:01.760 |
apply their multiple of earnings, Satya. Whereas I hear this, and I think about all of those benefits we 00:40:07.440 |
just described, not to mention the look through equity value that you own in a company that could 00:40:12.800 |
be worth a trillion unto itself. You know, do you think that the market is kind of misunderstanding 00:40:19.040 |
the value of OpenAI as a component of Microsoft? 00:40:22.240 |
Yeah, that's a good one. So I think the approach that Amy is going to take 00:40:26.800 |
is full transparency, because at some level, I'm no accounting expert. So therefore, the best thing 00:40:32.320 |
to do is to give all the transparency. I think this time around as well, that's why the non-GAAP 00:40:38.800 |
and GAAP, so that at least people can see the EPS numbers. Because the common sense way I look at it, 00:40:44.960 |
Brad, is simple. If you've invested, let's call it $13.5 billion, you can, of course, lose $13.5 billion. 00:40:51.680 |
But you can't lose more than $13.5 billion. At least the last time I checked, 00:40:56.000 |
that's what you have at risk. You could also say, hey, the $135 billion that is, you know, today, 00:41:02.080 |
our equity stake, you know, is sort of illiquid, what have you. We don't plan to sell it. So therefore, 00:41:08.320 |
it's got risk associated with it. But the real story I think you are pulling on is all the other things 00:41:14.720 |
that are happening. What's happening with Azure growth, right? Would Azure be growing if we had not sort of 00:41:19.840 |
had the OpenAI partnership? To your point, the number of customers who came from other 00:41:24.160 |
clouds for the first time, right? This is the thing that really we benefited from. What's happening with 00:41:31.040 |
Microsoft 365? In fact, one of the things about Microsoft 365 was, what was the next big thing 00:41:35.840 |
after E5? Guess what? We found it in Copilot. It's bigger than any suite. Like, you know, we talk about 00:41:43.280 |
penetration and usage and the pace. It's bigger than anything we have done in our information work, 00:41:50.560 |
which we've been at it for decades. And so we feel very, very good about the opportunity to create 00:41:56.240 |
value for our shareholders. And then at the same time, be fully transparent so that people can look 00:42:01.680 |
through the, what are the losses? I mean, who knows what the accounting rules are, but we will do whatever 00:42:06.240 |
is needed and people will then be able to see what's happening. But a year ago, Satya, there were a bunch 00:42:12.160 |
of headlines that Microsoft was pulling back on AI infrastructure, right? Fair or unfair, they were out 00:42:17.600 |
there. You know, and perhaps you guys were a little more conservative, a little more skeptical 00:42:22.640 |
of what was going on. Amy said on the call last night, though, that you've been short power and 00:42:28.000 |
infrastructure for many quarters, and she thought that you would catch up, but you haven't caught up 00:42:33.200 |
because demand keeps increasing. So I guess the question is, were you too conservative, you know, 00:42:38.000 |
knowing what you know now and what's the roadmap from here? Yeah, it's a great question because, 00:42:42.960 |
see, the thing that we realized, and I'm glad we did, is that the concept of building a fleet 00:42:51.440 |
that truly was fungible, fungible for all the parts of the life cycle of AI, fungible across geographies, 00:42:59.920 |
and fungible across generations, right? So because one of the key things is when you have, let's take even 00:43:06.320 |
what Jensen and team are doing, right? I mean, they're at a pace. In fact, one of the things I like 00:43:11.120 |
is the speed of light, right? We now have GB300s, you know, that we're bringing up. So you 00:43:15.920 |
don't want to have ordered a bunch of GB200s that are getting plugged in only to find the GB300s are in 00:43:24.000 |
full production. So you kind of have to make sure you're continuously modernizing, you're spreading the 00:43:30.320 |
fleet all over, you are really truly fungible by workload, and you're adding to that the software 00:43:37.200 |
optimizations we talked about. So to me, that is the decision we made. And we said, look, sometimes you 00:43:42.880 |
may have to say no to some of the demand, including some of the open AI demand, right? Because sometimes, 00:43:48.560 |
you know, Sam may say, hey, build me a dedicated, you know, big, you know, whatever, multi-gigawatt data 00:43:54.640 |
center in one location for training. Makes sense from an open AI perspective. Doesn't make sense from a long-term 00:44:02.800 |
infrastructure build out for Azure. And that's where I thought we did the right thing to give them 00:44:07.440 |
flexibility to go procure that from others, while maintaining, again, a significant book of business 00:44:13.840 |
from open AI. But more importantly, giving ourselves the flexibility with other customers, our own 1P. 00:44:20.960 |
Remember, like one of the things that we don't want to do is be short. You know, we talk about 00:44:25.920 |
Azure. In fact, some of the times our investors are overly fixated on the Azure number. But remember, 00:44:31.280 |
the high-margin business for me is Copilot. It is Security Copilot. It's GitHub Copilot. It's 00:44:37.200 |
the healthcare Copilot. So we want to make sure we have a balanced way to approach the returns that 00:44:43.280 |
the investors have. And so that's kind of one of the other misunderstood, perhaps, in our investor 00:44:48.560 |
base in particular, which I find pretty strange and funny because I think they want to hold Microsoft 00:44:53.920 |
because of the portfolio we have. But man, are they fixated on the growth number of one little thing 00:44:59.120 |
called Azure. On that point, Azure grew 39% in the quarter on a staggering 93 00:45:06.960 |
billion-dollar run rate. And, you know, I think that compares to GCP that grew at 32% and AWS closer to 00:45:14.080 |
20%. But could Azure, because you did give compute to 1P and because you did give compute to research, 00:45:22.720 |
it sounds like Azure could have grown 41%, 42% had you had more compute to offer. 00:45:28.000 |
Absolutely. Absolutely. There's no question. There's no question. So that's why I think the internal 00:45:33.040 |
thing is to balance out what we think, again, is in the long-term interests of our shareholders 00:45:38.160 |
and also to serve our customers well. And also not to kind of, you know, one of the other things was, 00:45:43.440 |
you know, people talk about concentration risk, right? We obviously want a lot of OpenAI, 00:45:47.920 |
but we also want other customers. And so we're shaping the demand here. You know, we are in a supply, 00:45:53.600 |
you know, we're not demand-constrained, we're supply-constrained. So we are shaping the demand 00:45:59.200 |
such that it matches the supply in the optimal way with a long-term view. 00:46:04.320 |
So to that point, Satya, you talked about $400 billion, it's an incredible number, 00:46:09.520 |
of remaining performance obligations last night. You said that, you know, that's your booked business 00:46:15.440 |
today. It'll surely go up tomorrow as sales continue to come in. And you said you're going to, 00:46:21.040 |
you know, your need to build out capacity just to serve, that backlog is very high. 00:46:25.440 |
You know, how diversified is that backlog to your point? And how confident are you that that $400 00:46:32.720 |
billion does turn into revenue over the course of the next couple of years? 00:46:36.960 |
Yeah, that $400 billion, it has a very short duration as Amy explained. It's the two-year 00:46:42.720 |
duration on average. So that's definitely our intent. That's one of the reasons why we're spending the 00:46:49.760 |
capital outlay with high certainty that we just need to clear the backlog. And to your point, 00:46:54.480 |
it's pretty diversified both on the 1P and the 3P. Our own demand is quite frankly pretty high for our 00:47:00.480 |
own first party. And even amongst third party, one of the things we now are seeing is the rise of all 00:47:08.000 |
the other companies building real workloads that are scaling. And so given that, I think we feel very 00:47:14.080 |
good. I mean, obviously, that's one of the best things about RPO is you can be planful, quite frankly. 00:47:19.680 |
And so therefore, we feel very, very good about building. And then this doesn't include, obviously, 00:47:24.000 |
the additional demand that we're already going to start seeing, including the 250, 00:47:28.480 |
you know, which will have a longer duration and will build accordingly. 00:47:32.080 |
Right. So there are a lot of new entrants, right, in this race to build out compute Oracle, CoreWeave, 00:47:39.840 |
Crusoe, et cetera. And normally we think that will compete away margins, but you've somehow managed to 00:47:46.160 |
build all this out while maintaining healthy operating margins at Azure. So I guess the question is for 00:47:51.920 |
Microsoft, how do you compete in this world that is where people are levering up, taking lower margins 00:47:59.920 |
while balancing that profit and risk? And do you see any of those competitors doing deals that cause 00:48:06.480 |
you to scratch your head and say, oh, we're just setting ourselves up for another boom and bust cycle? 00:48:11.200 |
I mean, at some level, the good news for us has been competing even as a hyperscaler every day. 00:48:18.480 |
You know, there's a lot of competition, right, between us and Amazon and Google on all of these, 00:48:23.440 |
right? I mean, it's sort of one of those interesting things, which is everything is a commodity, right? 00:48:27.840 |
Compute, storage. I remember everybody saying, wow, how can there be a margin? Except at scale, 00:48:33.280 |
nothing is a commodity. And so therefore, yes, so we have to have a cost structure, our supply chain 00:48:39.440 |
efficiency, our software efficiencies all have to kind of continue to compound in order to make sure 00:48:46.800 |
that there's margins, but scale. And to your point, one of the things that I really love about the OpenAI 00:48:53.440 |
partnership is it's gotten us to scale, right? This is a scale game. When you have the biggest workload 00:49:00.560 |
there is running on your cloud, that means not only are we going to learn faster on what it means to 00:49:06.000 |
operate with scale, that means your cost structure is going to come down faster than anything else. 00:49:10.320 |
And guess what? That'll make us price competitive. And so I feel pretty confident about our ability to, 00:49:16.480 |
you know, have margins. And this is where the portfolio helps. I've always said, you know, 00:49:21.840 |
I've been forced into giving the Azure numbers, right? Because at some level, I've never thought of 00:49:28.000 |
allocating compute. I mean, my capital allocation is for the cloud, from whether it is Xbox cloud 00:49:34.480 |
gaming, or Microsoft 365, or for Azure, it's one capital outlay. And then everything is a meter, 00:49:42.160 |
as far as I'm concerned, from an MSFT perspective. It's a question of, hey, the blended average of that 00:49:48.240 |
should match the operating margins we need as a company. Because after all, otherwise, we're not a 00:49:54.400 |
conglomerate. We're one company with one platform logic. It's not running five, six different businesses. 00:50:00.320 |
We're in these five, six different businesses only to compound the returns on the cloud and AI investment. 00:50:06.720 |
Yeah, I love that line. Nothing is a commodity at scale. You know, there's been a lot of ink and time 00:50:13.920 |
spent even on this podcast with my partner, Bill Gurley, talking about circular revenues, 00:50:18.400 |
including Microsoft's Azure credits, right, to OpenAI that were booked as revenue. Do you see anything 00:52:25.360 |
going on like the AMD deal, you know, where they traded 10% of their equity, you know, for a deal 00:52:31.920 |
or the Nvidia deal? Again, I don't want to be overly fixated on concern, but I do want to address head 00:50:37.440 |
on what is being talked about every day on CNBC and Bloomberg. And there are a lot of these overlapping 00:50:43.760 |
deals that are going on out there. Do you, do you, when you think about that in the context of Microsoft, 00:50:49.440 |
does any of that worry you again, as to the sustainability or durability of the AI revenues 00:50:56.880 |
that we see in the world? Yeah. I mean, first of all, our investment of let's say that 13 and a half, 00:51:03.040 |
which was all the training investment that was not booked as revenue. That is the reason 00:51:08.080 |
why we have the equity percentage. That's the reason why we have the 27% or 135 billion. So that was not 00:51:15.680 |
something that somehow made it into Azure revenue. In fact, if anything, the Azure revenue was 00:51:21.440 |
purely the consumption revenue of chat GPT and anything else, and the APIs they put out that they 00:51:28.560 |
monetized and we monetized. To your aspect of others, you know, to some degree, it's always been there in 00:51:36.000 |
terms of vendor financing, right? So it's not like a new concept that when someone's building something and 00:51:41.760 |
they have a customer who's also building something, but they need financing, you know, for whether it is, 00:51:48.240 |
you know, it's sort of, they're taking some exotic forms, which obviously need to be 00:51:53.520 |
scrutinized by the investment community. But that said, you know, vendor financing is not a new concept. 00:52:00.480 |
Interestingly enough, we have not had to do any of that, right? I mean, we may have, you know, really 00:52:06.080 |
either invested in OpenAI and essentially got an equity stake in it in return for compute, 00:52:12.640 |
or essentially sold them great pricing of compute in order to be able to sort of bootstrap them. But, 00:52:19.040 |
you know, others choose to do so differently. And, and I think circularity ultimately will be tested by 00:52:25.120 |
demand because all this will work as long as there is demand for the final output of it. And up to now, 00:52:33.040 |
that has been the case. Certainly, certainly. Well, I want to shift, you know, as you said, 00:52:37.760 |
over half your business is software applications, you know, I want to think about software and agents, 00:52:43.360 |
you know, last year on this pod, you made a bit of a stir by saying that much of the application software, 00:52:49.120 |
you know, was this thin layer that sat on top of a CRUD database. 00:52:52.560 |
So the notion that business applications exist, that's probably where they will all collapse, 00:53:02.080 |
right? In the agent era. Because if you think about it, right, they are essentially CRUD databases 00:53:08.880 |
with a bunch of business logic. The business logic is all going to these agents. 00:53:16.320 |
public software companies are now trading at about 5.2 times forward revenue. So that's below their 00:53:23.280 |
10 year average of seven times, despite the markets being at all time highs. And there's lots of concern 00:53:29.040 |
that SaaS subscriptions and margins may be put at risk by AI. So how today is AI affecting the growth 00:53:37.840 |
rates of your software products of, you know, those core products? And specifically, as you think about 00:53:43.200 |
database, Fabric, security, Office 365. And then second question, I guess, is what are you doing 00:53:50.480 |
to make sure that software is not disrupted, but is instead superpowered by AI? 00:53:56.240 |
Yeah, I think that's right. So the last time we talked about this, my point really there was the 00:54:02.000 |
architecture of SaaS applications is changing because this agent tier is replacing the old business logic 00:54:09.120 |
tier. Because if you think about it, the way we built SaaS applications in the past was 00:54:13.120 |
you had the data, the logic tier and the UI all tightly coupled. And AI, quite frankly, doesn't 00:54:19.120 |
respect that coupling because it requires you to be able to decouple. And yet the context engineering is 00:54:26.240 |
going to be very important. I mean, take, you know, something like Office 365. One of the things I love 00:54:30.800 |
about our Microsoft 365 offering is it's low ARPU, high usage, right? I mean, if you think about it, 00:54:39.120 |
right, Outlook or Teams or SharePoint, you pick Word or Excel, like people are using it all the time, 00:54:45.440 |
creating lots and lots of data, which is going into the graph, and our ARPU is low. So that's sort of 00:54:51.120 |
what gives me real confidence that this AI tier, I can meet it by exposing all my data. In fact, one of 00:54:59.920 |
the fascinating things that's happened, Brad, with both GitHub and Microsoft 365 is thanks to AI, we are 00:55:07.840 |
seeing all time highs in terms of data that's going into the graph or the repo. I mean, think about it, 00:55:14.000 |
the more code that gets generated, whether it is Codex or Claude or wherever, where is it going? 00:55:19.680 |
GitHub. More PowerPoints that get created, Excel models that get created, all these artifacts and chat 00:55:26.160 |
conversations. Chat conversations are new docs. They're all going into the graph and all that is needed, 00:55:33.520 |
again, for grounding. So that's what, you know, you turn it into a forward index, into an embedding, 00:55:40.880 |
and basically that semantics is what you really use to ground any agent request. And so I think for the 00:55:48.240 |
next generation of SaaS applications, if you are high ARPU, low usage, 00:55:53.920 |
then you have a little bit of a problem. But we are the exact opposite. We are low ARPU, 00:55:59.760 |
high usage. And I think that anyone who can structure that and then use this AI as in fact an accelerant, 00:56:06.640 |
because I mean, like, if you look at the M365 Copilot price, I mean, it's higher than any other 00:56:11.920 |
thing that we sell. And yet it's getting deployed faster and with more usage. And so I feel very 00:56:18.080 |
good. Oh, or coding, right? Who would have thought? In fact, take GitHub, right? What GitHub did in the 00:56:24.000 |
first, I don't know, 15 years of its existence or 10 years of its existence, it was basically done in the 00:56:29.120 |
last year, just because coding is no longer a tool, it's more a substitute for wages. And so it's a very 00:56:36.880 |
different type of business model, even. Kind of thinking about the stack and where value gets 00:56:41.360 |
distributed. So until very recently, right, clouds largely ran pre-compiled software, you didn't need a 00:56:48.160 |
lot of GPUs, and most of the value accrued to the software layer, to the database, to the applications like 00:56:53.440 |
CRM and Excel. But it does seem in the future that these interfaces will only be valuable, 00:56:59.120 |
right, if they're intelligent, right? If they're pre-compiled, they're kind of dumb. The software's 00:57:05.040 |
got to be able to think and to act and to advise. And that requires, you know, the production of these 00:57:11.120 |
tokens, you know, dealing with the ever-changing context. And so in that world, it does seem like much 00:57:17.120 |
more of the value will accrue to the AI factory, if you will, to, you know, Jensen, in producing, 00:57:23.040 |
you know, helping to produce these tokens at the lowest cost, and to the models. And maybe that 00:57:30.560 |
the agents or the software will accrue a little bit less of the value in the future than they've accrued 00:57:35.680 |
in the past. Steelman for me why that's wrong. Yeah. So I think there are two things that are necessary 00:57:43.200 |
to drive the value of AI. One is what you described first, which is the token factory. And 00:57:48.720 |
even if you unpack the token factory, it's the hardware silicon system. But then it is about running 00:57:55.600 |
it most efficiently with the system software, with all the fungibility, max utilization. That's where the 00:58:03.520 |
hyperscaler's role is, right? What is a hyperscaler? Like everybody says, if you sort of 00:58:09.040 |
said, hey, I want to run a hyperscaler, yeah, you could say, oh, it's simple. I'll buy a bunch 00:58:12.960 |
of servers and wire them up and run it. It's not that, right? I mean, if it was that simple, then 00:58:17.040 |
there would have been more than three hyperscalers by now. So the hyperscaler is the know-how of running 00:58:23.280 |
that max util and the token factories. And it's not, and by the way, it's going to be heterogeneous. 00:58:28.320 |
Obviously, Jensen's super competitive. Lisa is going to come, you know, Hock's going to produce things from 00:58:34.480 |
Broadcom, we will all do our own. So there's going to be a combination. So you want to run 00:58:39.440 |
ultimately a heterogeneous fleet that is maximized for token throughput and efficiency and so on. 00:58:45.840 |
So that's kind of one job. The next thing is what I call the agent factory. Remember that a SaaS 00:58:52.080 |
application in the modern world is driving a business outcome. It knows how to most efficiently use the 00:58:59.520 |
tokens to create some business value. In fact, GitHub Copilot is a great example of it, right? Which is, 00:59:06.480 |
you know, if you think about it, the auto mode of GitHub Copilot is the smartest thing we've done, 00:59:11.760 |
right? So it chooses based on the prompt, which model to use for a code completion or a task handoff, 00:59:19.600 |
right? And you do that not just by, you know, choosing in some round-robin fashion, 00:59:24.720 |
you do it because of the feedback cycle you have, you have the evals, the data loops and so on. So 00:59:30.080 |
the new SaaS applications, as you rightfully said, are intelligent applications that are optimized for 00:59:36.160 |
a set of evals and a set of outcomes that then know how to use the token factory's output 00:59:42.320 |
most efficiently. Sometimes latency matters, sometimes performance matters. And knowing how to do that trade 00:59:50.960 |
in a smart way is where the SaaS application value is. But overall, it is going to be true that there 00:59:57.120 |
is a real marginal cost to software this time around. It was there in the cloud era too. When 01:00:02.240 |
we were doing, you know, CD-ROMs, there wasn't much of a marginal cost. You know, with the cloud, 01:00:08.240 |
there was, and this time around, it's a lot more. And so therefore, the business models have to adjust, 01:00:13.200 |
and you have to do these optimizations for the agent factory and the token factory separately. 01:00:18.480 |
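The auto-mode behavior Satya describes — classify the request, pick a model, and keep refining the choice through evals and data loops — could look roughly like the following. This is a hypothetical sketch, not GitHub Copilot's actual router; the model names, costs, prompt-length heuristic, and scoring rule are all invented:

```python
from collections import defaultdict

# Invented model catalog: one cheap completion model, one expensive
# task model. Costs are in arbitrary units.
MODELS = {
    "small-fast":  {"cost": 1},
    "large-smart": {"cost": 10},
}

class AutoRouter:
    def __init__(self):
        # Running eval score per (request_kind, model), fed by the data loop.
        # 0.5 is the uninformed prior.
        self.scores = defaultdict(lambda: 0.5)

    def classify(self, prompt: str) -> str:
        # Crude stand-in for prompt classification: short prompts are
        # treated as inline completions, long ones as task handoffs.
        return "completion" if len(prompt) < 80 else "task"

    def route(self, prompt: str) -> str:
        kind = self.classify(prompt)
        # Pick the model with the best observed eval score for this kind
        # of request, breaking ties toward the cheaper model.
        return max(
            MODELS,
            key=lambda m: (self.scores[(kind, m)], -MODELS[m]["cost"]),
        )

    def record_eval(self, prompt: str, model: str, passed: bool):
        # Feedback cycle: exponentially decay the score toward the
        # latest pass/fail signal.
        key = (self.classify(prompt), model)
        self.scores[key] = 0.9 * self.scores[key] + 0.1 * (1.0 if passed else 0.0)

router = AutoRouter()
print(router.route("def add(a, b):"))  # with no eval data, the cheap model wins the tie
```

The design choice the sketch illustrates is that routing is not round-robin: the eval feedback loop shifts traffic away from a model that keeps failing a given kind of request, which is the "optimized for a set of evals and outcomes" behavior described above.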
You have a big search business that most people don't know about, you know, but it turns out 01:00:23.040 |
that that's probably one of the most profitable businesses in the history of the world because 01:00:27.040 |
people are running lots of searches, billions of searches, and the cost of completing a search, 01:00:33.040 |
if you're Microsoft, is many fractions of a penny, right? It doesn't cost very much to complete a search. 01:00:39.040 |
But the comparable query or prompt stack today when you use a chatbot looks different, right? So I guess the 01:00:46.960 |
question is, assume similar levels of revenue in the future for those two businesses, right? Do you 01:00:54.160 |
ever get to a point where kind of that chat interaction has unit economics that are as profitable as search? 01:01:00.880 |
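A rough way to see the asymmetry behind this question: a search index is a fixed cost amortized over enormous query volume, while every chat turn burns GPU time that never amortizes away. All numbers below are hypothetical, chosen only to show the shape of the two cost curves:

```python
def search_cost_per_query(index_cost_fixed: float, serve_cost: float,
                          queries: float) -> float:
    # The index is built once; per-query serving adds a tiny marginal cost.
    return index_cost_fixed / queries + serve_cost

def chat_cost_per_query(gpu_seconds: float, gpu_cost_per_second: float) -> float:
    # Each chat burns GPU cycles for intent, retrieval, and generation,
    # so the marginal cost does not shrink with volume.
    return gpu_seconds * gpu_cost_per_second

queries = 1e12                                          # assumed annual volume
search = search_cost_per_query(1e8, 0.0001, queries)    # fractions of a penny
chat = chat_cost_per_query(2.0, 0.001)                  # 2 GPU-seconds at $0.001/s

print(f"search: ${search:.4f}/query, chat: ${chat:.4f}/query")
```

Under these made-up numbers the chat query costs an order of magnitude more than the search query, and scaling volume only helps the search side, since the index amortizes and the GPU time does not.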
I think that's a great point because, see, search was pretty magical in terms of its ad unit 01:01:09.600 |
and its cost economics, because there was the index, which was a fixed cost that you could then amortize 01:01:16.400 |
in a much more efficient way. Whereas this one, you know, each chat, to your point, you have to burn a 01:01:23.280 |
lot more GPU cycles, both with the intent and the retrieval. So the economics are different. So I think 01:01:29.040 |
you know, that's why I think a lot of the early sort of economics of chat have been the freemium model and 01:01:34.560 |
subscription on the even on the consumer side. So we are yet to discover whether it's agentic commerce or 01:01:40.640 |
whatever is the ad unit, how it's going to be litigated. But at the same time, the fact that at this point, you know, I 01:01:48.240 |
kind of know, in fact, I use search for very, very specific navigational queries. I used to say I use 01:01:55.520 |
it a lot for commerce, but that's also shifting to my, you know, Copilot. Look at the Copilot mode in 01:02:03.120 |
Edge and Bing or Copilot. Now they're blending in. So I think that, yes, I think there is going to be 01:02:10.000 |
a real litigation, just like the SaaS disruption we talked about, where in the beginning the cheese is 01:02:15.440 |
being moved a little in the consumer economics of that category. Right. I mean, and given that it's the 01:02:21.920 |
multi-trillion dollar, this is the thing that's driven all the economics of the internet, right? 01:02:26.960 |
When you move the economics of search for both you and Google, and it converges on something that looks 01:02:33.120 |
more like a personal agent, a personal assistant chat, you know, that could end up being much, much 01:02:39.440 |
bigger in terms of the total value delivered to humanity. But the unit economics, you're not just 01:02:44.080 |
amortizing this one-time fixed index, right? That's right. That's right. I think that the 01:02:49.360 |
consumer- Yeah, the consumer category, because you're pulling on a thread, something that I think a lot 01:02:54.080 |
about, right? Which is what, during these disruptions, you kind of have to have a real sense of where is, 01:03:01.200 |
what is the category economics? Is it winner take all? And both matter, right? The problem on consumer space 01:03:11.680 |
always is that there's a finite amount of time. And so, if I'm not doing one thing, 01:03:19.280 |
I'm doing something else. And if your monetization is predicated on some human interaction in particular, 01:03:26.160 |
if there was truly agentic stuff even on consumer, that could be different. Whereas in the enterprise, 01:03:31.280 |
one is it's not winner take all. And two, it is going to be a lot more friendly for agentic interaction. 01:03:38.000 |
So, it's not like, for example, the per seat versus consumption. The reality is agents are the new seats. 01:03:44.160 |
And so, you can think of it as the enterprise monetization is much clearer. The consumer 01:03:50.720 |
monetization, I think, is a little more murky. You know, we've seen a spate of layoffs recently with 01:03:55.520 |
Amazon announcing some big, big layoffs this week. You know, the mag seven has had little job growth 01:04:00.880 |
over the last three years, despite really robust top lines. You know, you didn't grow your head 01:04:06.000 |
count really from 24 to 25. It's around 225,000. You know, many attribute this to normal getting fit, 01:04:13.200 |
you know, just getting more efficient coming out of COVID. And I think there's a lot of truth to that. 01:04:17.120 |
But do you think part of this is due to AI? Do you think that AI is going to be a net job creator? 01:04:23.760 |
And do you see this being a long-term positive for Microsoft productivity? Like it feels to me 01:04:29.120 |
like the pie grows, but you can do all these things much more efficiently, which either means your margins 01:04:37.120 |
expand or it means you reinvest those margin dollars and you grow faster for longer. I call it the golden 01:04:43.360 |
age of margin expansion. I'm a firm believer that the productivity curve does and will bend, in the 01:04:53.440 |
sense that we will start seeing some of what is the work and the workflow in particular change, right? 01:05:01.840 |
There's going to be more agency for you at a task level to get to job complete because of the power of 01:05:10.640 |
these tools in your hand. And that I think is going to be the case. So that's why I think we are even 01:05:16.800 |
internally, for example, when you talked about even our allocation of tokens, we want to make sure that 01:05:22.560 |
everybody at Microsoft, standard issue, right? All of them have Microsoft 365 to the hilt in the sort of 01:05:29.040 |
most unlimited way and have GitHub Copilot so that they can really be more productive. But here is the other 01:05:35.920 |
interesting thing, Brad, we're learning is there's a new way to even learn, right? Which is, you know, 01:05:42.080 |
how to work with agents, right? So that's kind of like when the first, when Word, Excel, PowerPoint all 01:05:47.840 |
showed up in office, kind of, we learned how to rethink, let's say, how we did a forecast, right? I mean, 01:05:54.400 |
think about it, right? In the 80s, the forecasts were inter-office memos and faxes and what have you. And then 01:06:00.640 |
suddenly, somebody said, Oh, here's an Excel spreadsheet, let's put in an email, send it around, 01:06:04.720 |
people enter numbers, and there was a forecast. Similarly, right now, any planning, any execution 01:06:11.040 |
starts with AI, you research with AI, you think with AI, you share with your colleagues, and what have you. So 01:06:17.440 |
there's a new artifact being created and a new workflow being created. And that is the rate of, 01:06:23.920 |
the pace of change of the business process that matches the capability of AI, that's where the 01:06:31.920 |
productivity efficiencies come. And so organizations that can master that are going to be the biggest 01:06:38.000 |
beneficiaries, whether it's in our industry, or quite frankly, in the real world. 01:06:41.600 |
And so is Microsoft benefiting from that? You know, so let's, let's think about a couple years from now, 01:06:48.000 |
five years from now, at the current growth rate it will be sooner, but let's call it five years from now, 01:06:53.120 |
your top line is twice as big as what it is today, Satya. How many more employees will you have? If you're, 01:07:00.480 |
Like one of the best things right now is these examples that I'm hit with every day from the employees 01:07:05.760 |
of Microsoft. There is this person who leads our network operations, right? I mean, if you think about the 01:07:10.560 |
amount of fiber we have had to put in for this, you know, two gigawatt data center we just 01:07:17.600 |
built out in Fairwater, right? And the amount of fiber there, the AI WAN and what have you, it's just 01:07:23.840 |
crazy, right? So, and it turns out, this is a real world asset. There are, I think 400 different fiber 01:07:30.000 |
operators we're dealing with worldwide. Every time something happens, we're literally going and dealing 01:07:34.720 |
with all these DevOps pipelines. The person who leads it, she basically said to me, you know what, 01:07:39.440 |
there's no way I'll ever get the headcount to go do all this. And forget it, even if I approve 01:07:44.640 |
the budget, I can't hire all these folks. So she, she did the next best thing. She just built herself 01:07:49.840 |
a whole bunch of agents to automate the DevOps pipeline of how to deal with the maintenance. That 01:07:55.760 |
is an example of, to your point, a team with AI tools being able to get more productivity. So 01:08:03.440 |
to your question, I will say we will grow headcount. But the way I look at it is that headcount 01:08:09.040 |
we grow will grow with a lot more leverage than the headcount we had pre AI. And that's the adjustment, 01:08:15.920 |
I think structurally you're seeing first, which is one, you called it getting fit. I think of it as 01:08:22.400 |
more getting to a place where everybody is really now learning how to rethink how they work. And it's the 01:08:30.640 |
how, not even the what. Even if the what remains the constant, how you go about it has to be relearned. 01:08:37.280 |
And it's the unlearning and learning process that I think will take the next year or so. Then the 01:08:42.960 |
headcount growth will come with max leverage. Yeah, no, I think we're on the verge of 01:08:49.920 |
incredible economic productivity growth. It does feel like when I talk to you or Michael Dell that 01:08:55.440 |
most companies aren't even really in the first inning, maybe the first batter in the first inning in 01:09:01.360 |
reworking those workflows to get maximum leverage from these agents. But it sure feels like over the 01:09:06.640 |
course of the next two to three years, that's where a lot of gains are going to start coming from. 01:09:10.400 |
And again, I, you know, I certainly am an optimist. I think we're going to have net job gains from all of 01:09:16.320 |
this. But I think for those companies, they'll just be able to grow their headcount, their number of 01:09:21.360 |
employees slower than their top line. That is the productivity gain to the company. Aggregate all that up. 01:09:27.120 |
That's the productivity gain to the economy. And then we'll just take that consumer surplus and invest it in 01:09:32.560 |
creating a lot of things that didn't exist before. 01:09:34.880 |
A hundred percent. A hundred percent. Even in software development, right? One of the things I look at it 01:09:39.360 |
is no one would say we're going to have a challenge in having, you know, more software engineers contribute 01:09:46.800 |
to our sort of society. Because the reality is you look at the IT backlog in any organization. And so the 01:09:52.640 |
question is all these software agents are hopefully going to help us go and take a whack at all of the IT 01:10:00.240 |
backlog we have. And think of that dream of evergreen software. That's going to be true. And then think 01:10:06.800 |
about the demand for software. So I think that to your point, it's the levels of abstraction at which 01:10:11.840 |
knowledge work happens will change. We will adjust to that, the work and the workflow. That will then 01:10:17.760 |
adjust itself even in terms of the demand for the products of this industry. 01:10:21.520 |
-I'm going to end on this, which is really around the reindustrialization of America. I've said if you 01:10:27.600 |
add up the $4 trillion of CapEx that you and so many of the big US tech companies are 01:10:35.120 |
investing over the course of the next four or five years, it's about 10 times the size of the Manhattan 01:10:39.680 |
Project on an inflation-adjusted or GDP-adjusted basis. So it's a massive undertaking for 01:10:45.840 |
America. But the president has made it a real priority of his administration to recut the trade 01:10:51.680 |
deals. And it looks like we now have trillions of dollars. The South Koreans committed $350 billion of 01:10:57.280 |
investments just today into the United States. And when you think about, you know, what you see going 01:11:04.960 |
on in power in the United States, both production, the grid, et cetera, what you see going on in terms 01:11:11.040 |
of this reindustrialization, how do you think this is all going? And, you know, maybe just reflect on 01:11:19.360 |
where we're landing the plane here and your level of optimism for the few years ahead. 01:11:24.320 |
-Yeah, no, I feel very, very optimistic because in some sense, you know, Brad Smith was telling me about 01:11:30.080 |
sort of the economy around our Wisconsin data center. It's fascinating. Most people think, 01:11:35.520 |
oh, data center, that is sort of like, yeah, it's going to be one big warehouse and it is, you know, 01:11:41.040 |
fully automated. A lot of it is true. But first of all, what went into the construction of that data center 01:11:49.120 |
and the local supply chain of the data center, that is in some sense, the reindustrialization of the United 01:11:55.920 |
States as well. Even before you get to what is happening in Arizona with the TSMC plants or what 01:12:02.640 |
was happening with Micron and their investments in memory or Intel and their fabs and what have you, 01:12:08.400 |
right? There's a lot of stuff that we will want to start building. Doesn't mean we won't have trade 01:12:14.560 |
deals that make sense for the United States with other countries. But to your point, the reindustrialization 01:12:20.400 |
for the new economy and making sure that all the skills and all that capacity from power on down, 01:12:28.320 |
I think, is sort of very important, right, for us. And the other thing that I also say, Brad, 01:12:33.840 |
it's important, and this is something that I've had a chance to talk to President Trump as well as 01:12:38.720 |
Secretary Lutnick and others, is that it's important to recognize that we as hyperscalers of the United States 01:12:45.360 |
are also investing around the world. So in other words, the United States is the biggest investor 01:12:52.400 |
of compute factories or token factories around the world. But not only are we attracting foreign capital 01:12:59.920 |
to invest in our country so that we can reindustrialize, we are helping, whether it's in Europe or in Asia or 01:13:06.320 |
elsewhere, in Latin America and in Africa, with our capital investments, bringing the best American tech 01:13:13.520 |
to the world that they can then innovate on and trust. And so both of those, I think, really 01:13:20.000 |
bode well for the United States long term. I'm grateful for your leadership. 01:13:24.640 |
Sam is really helping lead the charge at OpenAI for America. I think this is a moment where I look ahead, 01:13:31.600 |
you know, you can see 4% GDP growth on the horizon. We'll have our challenges, we'll have our ups and 01:13:36.320 |
downs. These tend to be stairs, you know, stairs up rather than a line straight up and to the right. 01:13:41.360 |
But I, for one, see a level of coordination going on between Washington and Silicon Valley, between big 01:13:47.200 |
tech and the reindustrialization of America that gives me cause for incredible hope. Watching what 01:13:52.320 |
happened this week in Asia, led by the president and his team, and then watching what's happening here 01:13:58.560 |
is super exciting. So thanks for making the time. We're big fans. Thanks, Satya. 01:14:14.320 |
As a reminder to everybody, just our opinions, not investment advice.