Ep6. AI Demand / Supply - Models, Agents, the $2T Compute Build Out, Need for More Nuclear & More
Chapters
0:00 Intro
2:49 Demand for Compute Is Virtually Unlimited
27:53 Everywhere We Look We Are Building More and Bigger Supercomputers
35:49 The AI-Induced Power Gap & The Need for Nuclear
50:21 Valuations / Market Check
If World War II never happened, the atomic bomb never happened, and someone just showed up in 2024 00:00:05.440 |
and said, "I figured out this thing, it's nuclear fission," people would be like, "Oh my god!" 00:00:11.600 |
Like, probably be more excited than you are about AI, right? Like, because it would solve all of 00:00:37.360 |
I mean, is this a little promotion for the state of Texas we got going on? 00:00:41.200 |
Yeah, yeah, I'm proud of the state of Texas, and a little bit, you know, I'm sad that the... 00:00:47.360 |
especially watching the women's tournament quite a bit, and Texas just came this close 00:00:52.000 |
to the Final Four, but the games last night with Iowa and LSU were just amazing. So anyway, 00:00:58.320 |
it's been a fun time. So yeah, reminiscing perhaps a little bit. 00:01:01.520 |
March is a good month. Well, speaking of reminiscing, I've been reminiscing, you know, 00:01:06.960 |
I'm on spring break with my son, and we see these markets around AI getting... there's a lot of 00:01:12.400 |
commentary about how frothy they're getting. And it reminded me of that question, you know, 00:01:17.840 |
it's different this time. You know, people say the four most dangerous words in the investing 00:01:22.960 |
universe, but yet every big breakthrough we've gone through in tech, it actually has been 00:01:28.720 |
different, right? And so as analysts, as anthropologists, as forecasters, right, we have 00:01:35.360 |
to try to sort this out, both in the short run and the long run, right? And it's hard to do. 00:01:40.320 |
I mean, you do this bottoms up, you do this tops down, you study it. Lots of people in 1998 knew 00:01:46.560 |
the internet was going to be massive, right? I mean, Henry Blodget calling Amazon $400, 00:01:53.840 |
right? People thought it was blasphemous, but it didn't stop us from having a boom 00:01:59.120 |
and a bust along the way, right? And today Amazon's at 3000 bucks, almost 10X what Henry, 00:02:05.680 |
you know, got shouted down for saying in 1998. And I think at the time he said it was a 10-year 00:02:10.400 |
call. But the fact of the matter is trying to marry up the short and the long-term, I think, 00:02:15.680 |
is really, really tough. Well, and I think it gets even tougher if enthusiasm builds, because 00:02:23.840 |
that impacts the entry price on the marginal investment that one might make. And I think 00:02:30.000 |
this particular moment in time is very, very difficult for investors that are looking at the 00:02:38.640 |
marginal investment, because the prices imply some amount of optimism already. 00:02:46.080 |
For sure. And so that's really what I think we're going to dig into a little bit today. Tap on this 00:02:52.480 |
and go a little bit deeper. Maybe starting with this idea is there is increasing evidence that 00:02:59.360 |
demand for training and inference is maybe deeper and wider than we thought. You know, 00:03:04.160 |
we're reading a lot of headlines about the world building ever bigger supercomputers, 00:03:07.920 |
and then a lot of conversation about what those bottlenecks become. But why don't we just start 00:03:12.160 |
with this question about why do we need bigger, right? And I guess first principles, you know, 00:03:19.120 |
generative AI produces these tokens. These tokens are a proxy for human intelligence. 00:03:24.560 |
And there's a lot of conversation about there's no limit to how much incremental human intelligence 00:03:30.080 |
we want to buy, right? You and I've talked a fair bit about co-pilots for engineering. We talked 00:03:36.720 |
about that with Dara, co-pilots for call centers. And in a pretty short period of time, these have 00:03:42.400 |
become really ubiquitous development projects for almost every enterprise in just a few short years 00:03:49.520 |
wanting these co-pilots. And now we're seeing a lot of conversation about autonomous agents. I 00:03:54.560 |
mean, I think Benchmark is an investor in LangChain. Harrison Chase gave a great talk last week about 00:04:00.560 |
what's next for AI agents that I encourage folks to watch around planning, UI, and memory. And, 00:04:07.760 |
you know, we hosted an AI dinner last week, and there was an interesting conversation that came 00:04:12.480 |
up just about a search use case within the enterprise. And in this particular company, 00:04:18.320 |
the CTO took their million, well, 100,000 lines of code, their entire code base, about a million 00:04:24.560 |
tokens, dropped it all into the prompt, and then just asked if it spotted any common bugs. And he 00:04:31.680 |
said it found six, like non-trivial bugs in the code base. So the point is, we've gone from co-pilot 00:04:39.360 |
to some of these maybe a little bit more autonomous systems, Magic and Cognition, all of this in less 00:04:45.440 |
than two years. And let me ask you the question, if we compare this to 1998, like looking at just 00:04:52.800 |
demand, you know, do you think this demand is as real as it appears? And what do you think are the 00:05:00.880 |
most interesting use cases you're seeing from a demand perspective today? I think it's super 00:05:06.320 |
difficult to unpack, not opting out of the question, but, you know, you and I went deep 00:05:13.200 |
on the topic of full self-driving at Tesla and V12. And based on what we were told and what 00:05:21.520 |
we learned, that's an application use case of AI that's pretty amazing, you know, coded differently 00:05:31.360 |
than it had been for the past decade, and has results that people are claiming they can just 00:05:39.280 |
feel is demonstrably better. And we've talked about the use case like coding. We've also talked 00:05:47.520 |
about the fact that when you move away from coding, the efficacy drops a bit in terms of the 00:05:54.000 |
productivity gain you might get. And I've watched, you know, certain companies say, oh, we're going 00:06:00.320 |
to use AI, or I think they mean LLMs, but we're going to use LLMs to improve workflows in every 00:06:06.560 |
aspect of our business. And when I've circled back with those people and say, did it change 00:06:12.080 |
every aspect of your business? They come off of it a little bit. So, and one thing that I think 00:06:20.640 |
is really different between the full self-driving data point, and it's not built on LLMs, it's a 00:06:28.800 |
traditional use case of AI, which allows it to, I think, have much more finite input/output. 00:06:35.040 |
And that example use case is fairly finite input/output, rather than this kind of open-ended 00:06:41.200 |
conversation thing. So, I'm insanely curious, as I have always been. And so, in the two weeks 00:06:50.080 |
since we've talked, you know, if someone has a use case that they've uncovered, and they're 00:06:55.680 |
willing to share it with me, and I'll throw this open to the audience, like anyone, feel free to 00:07:01.040 |
reach out. I'm infinitely curious. But one thing that I've seen that is causing me to pause a bit 00:07:08.720 |
is a lot of the incremental work that's being done on AI projects are stitching together 00:07:19.440 |
external databases with the LLM. And the LLM gets relegated to being an interpreter of the human, 00:07:27.840 |
or to being an interpreter of the data back to the human, but is not involved necessarily 00:07:35.600 |
in the data store, and is partially involved but not wholly involved in the decision-making. 00:07:42.800 |
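The pattern described here, an external database stitched to an LLM that interprets the human's question on the way in and the retrieved data on the way out, is commonly called retrieval-augmented generation. A minimal sketch of the flow, using a toy keyword retriever and a stub in place of a real model call (every name and document below is hypothetical, for illustration only):

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# The model call is a stub; in practice it would be an API request.

DOCS = {
    "refunds": "Refunds are processed within 5 business days.",
    "shipping": "Standard shipping takes 3 to 7 days.",
}

def retrieve(query: str) -> list:
    """Toy retriever: return documents whose key appears in the query."""
    return [text for key, text in DOCS.items() if key in query.lower()]

def llm(prompt: str) -> str:
    """Stub standing in for the LLM: it only interprets, never stores data."""
    return "Answer grounded in: " + prompt

def answer(query: str) -> str:
    # The data lives in the external store; the LLM is relegated to
    # interpreting the user's question and the retrieved results.
    context = "\n".join(retrieve(query))
    return llm("Context:\n" + context + "\n\nQuestion: " + query)

print(answer("How long do refunds take?"))
```

The point of the sketch is where the data sits: the store is outside the model, and the model touches it only through retrieval, which is exactly the split being questioned in the conversation.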
And on one hand, you could say, "Well, that's just how this is going to evolve." And maybe that is 00:07:47.760 |
how this is going to evolve. On the other hand, for me, it questions whether we're still on the 00:07:54.400 |
same evolutionary track we were with the LLMs. And look, there's all kind of open questions, like, 00:08:00.960 |
"Will the LLMs that are being worked on at Anthropic or OpenAI start to have a native data 00:08:06.880 |
store inside of them?" Well, on the other side, we're seeing companies that have a ton of corporate 00:08:12.640 |
data bolting their own LLM on top of it, and saying, "You don't need the traditional LLM 00:08:20.720 |
vendors because we'll just allow you to query the data store you already have." So I think there's a 00:08:26.640 |
lot of questions that caused me not to have the overwhelming confidence that it sounds like you're 00:08:34.000 |
building, that it can go even higher from where we are today. And as you and I talk about all the 00:08:41.200 |
time, like, I think about the fact that these entry prices are quite accelerated and bake in quite a 00:08:48.240 |
bit of optimism. So one thing we've seen since you and I talked is a couple of, we'll just call them 00:08:54.960 |
early falls from grace with Inflection and Stability AI. And when you raise money 00:09:03.920 |
for a non-revenue company in the billions, it's pretty easy to lose confidence when you actually 00:09:15.920 |
start having revenue, because all of a sudden, you can see the delta from what it's going to take 00:09:23.040 |
to get to your next round. And in some way, I used to always tell people that valuation is discounted 00:09:32.720 |
future expectations. They think about it as they won an award. What they should really think is, 00:09:39.600 |
"Holy shit, we just set ourselves up." And that gets back to that point we were making before, 00:09:47.200 |
which is Amazon 400 in 1998 was ahead of where the company was. The fact of the matter is it 00:09:55.680 |
ended up being a bargain relative to the ultimate price that Amazon achieved. But we all know there 00:10:02.400 |
were a lot of companies along the way that never achieved it. Yeah. And I mean, I've made this point 00:10:07.600 |
over and over again, but I'll state it again for the record, which is public company structures 00:10:13.760 |
are much more flexible for downward performance than private company structures. Right. And they 00:10:22.400 |
just like, if you raise money, let's say you're a generic AI company, we'll call it company G, and 00:10:31.120 |
company G raises money at $1.2 billion, and they have very little revenue. And 18 months from now, 00:10:38.080 |
they need to be in the market again. And because they raise that much, they're burning, I don't 00:10:42.960 |
know, what, $50 million a year? And the revenue's $2 million, $5 million, $10 million, $15 million. 00:10:50.000 |
Like, all of a sudden, that looks very difficult to raise an incremental round. 00:10:55.120 |
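The squeeze on company G is simple arithmetic. Using the hypothetical figures above (a $1.2 billion round, $50 million of annual burn, 18 months of runway, and revenue assumed here to land mid-range at $10 million), you can back out what a merely flat next round would demand:

```python
# Hypothetical "company G" figures echoing the discussion above.
valuation = 1.2e9        # last post-money valuation, dollars
burn_per_year = 50e6     # annual net burn, dollars
months_to_raise = 18     # time until the next round

revenue = 10e6           # assume revenue lands mid-range at $10M

# Revenue multiple required just to hold the prior price flat.
implied_multiple = valuation / revenue

# Cash consumed before that raise happens.
cash_burned = burn_per_year * (months_to_raise / 12)

print(f"Flat-round multiple needed: {implied_multiple:.0f}x revenue")
print(f"Cash burned before the raise: ${cash_burned / 1e6:.0f}M")
```

A 120x revenue multiple is far outside normal software pricing, which is the delta being described: the valuation was set before there was revenue to discipline it.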
Well, I would say, you know, and again, I think Inflection did some important work. 00:11:00.800 |
I don't think it was a terrible outcome for the investors, for Mustafa, for Microsoft. But it does, 00:11:07.760 |
I think, shine a light on one of the things you said. If you are trying to build your own LLM 00:11:12.640 |
today, as Inflection was, and you want to get your hands on 40,000 H100s, which I think they did, 00:11:19.520 |
that's like, you know, digging a gold mine, right? The capital intensity of that undertaking 00:11:26.080 |
is very different than starting a software company, right? And so the risk reward to 00:11:32.800 |
both the founders and the risk reward to the investors changes a lot. I want to touch on 00:11:38.160 |
something you said, you know, I was getting perhaps a little bit more bullish and excited. 00:11:43.040 |
You know, I think I'm in good company here. 00:11:49.280 |
Remember, it was just not even a year ago that, you know, perhaps those of us who owned NVIDIA 00:11:56.880 |
went from 125 to 250 or 300. And everybody yelled and said, you got to sell your NVIDIA. 00:12:02.320 |
And I said, why would I do that? My numbers have gone up by more than the stock is appreciated. 00:12:07.680 |
And the argument was that we pulled forward all the training demand, models are going to get smaller, 00:12:12.720 |
And that, you know, perhaps some of the production use cases are overhyped. 00:12:17.280 |
And, you know, what we've seen from our bottoms up, both in terms of the training demand, 00:12:23.840 |
okay, so this is training small models, medium sized models, frontier models, 00:12:28.880 |
that is a lot bigger than we thought. And then the production use cases around inference, 00:12:35.760 |
the co-pilots that we talked about, like, I can't think of another technology that became 00:12:40.000 |
almost ubiquitous within enterprises, right? The need for it. Like if you're an enterprise with a 00:12:45.840 |
call center, and you're not leveraging AI today, like what are you doing? If you're not leveraging 00:12:49.920 |
co-pilot today, what are you doing? So in a matter of just a few years, this enterprise technology 00:12:55.200 |
has become ubiquitous. And I want to explore two quotes that, you know, I came across recently, 00:13:02.640 |
both from interviews that I think get to this point. And I want to try to break those down a 00:13:08.800 |
little bit for where we go from here. But the first was from Jensen Huang. And he said, "You 00:13:16.080 |
have to redefine the way you think about the traditional sources of demand for data centers 00:13:21.200 |
and compute. The production at scale of intelligence matters to every single country 00:13:27.920 |
and to every single industry." Now, the immediate reaction to this and to the quote from Sam Altman, 00:13:33.600 |
I'm going to read here in a second, is of course, Jensen's going to say that he's talking his book. 00:13:38.720 |
Right? But given that Jensen has more demand today than he can satisfy, right? There's no 00:13:46.320 |
need to talk his book. And Jensen's been at this for 30 years. And I think he's been a pretty 00:13:51.280 |
straight shooter for a long period of time. And so when you have somebody like him who's saying, 00:13:55.920 |
"Listen, this is going to impact every sovereign in every industry." And then my own two eyes look 00:14:00.800 |
at every single company, and every one of them wants a co-pilot for engineering. Every one of 00:14:05.600 |
them wants a co-pilot for call centers. And the capabilities of these co-pilots is still on a very 00:14:11.760 |
steep curve. It tells me that the use cases are unlike anything else we've seen since perhaps 00:14:18.880 |
the database or the internet itself, where those became ubiquitous in every single enterprise. 00:14:24.800 |
What do you think about Jensen's quote? Will this touch every industry? 00:14:30.320 |
I think... Well, I already made a statement earlier that it is such an interesting question, 00:14:38.320 |
too, because I really think you have to separate LLMs from AI. Now, Jensen's 00:14:42.880 |
exposed to both, so it doesn't... One's a subset of the other. 00:14:48.320 |
Explain that a little bit. Explain what you mean by that a little bit. 00:14:51.280 |
Look, I'm always open to being wrong, and I've been wrong many times before. But I'll tell you, 00:14:58.080 |
my current best thinking was just that LLMs were developed, starting with language translation, 00:15:06.560 |
and then when the attention window part was added in, where context mattered, they became very good at 00:15:12.800 |
structured language. And it turns out coding is a subset of... And an even better subset of 00:15:21.200 |
structured language, because it's more structured than language itself. And so, LLMs are very, 00:15:28.560 |
very, very good at coding. Although I did see one tweet this week where someone said that the... 00:15:33.360 |
They were saying the percentage of time spent on debugging has gone up massively because you don't 00:15:41.040 |
even understand the code that was written, so to debug it takes longer, which would make sense. 00:15:46.800 |
But anyway, as you... So, LLMs work really well against things that are language intensive. So, 00:15:54.320 |
programming is a very particular use case of language that's even better for... And then 00:16:00.480 |
customer support, you know, because the data content stores that have all the answers are 00:16:07.520 |
very textual, and the user representing their problem is in language. And so, that one's a 00:16:15.200 |
really good one too. And then, you know, people have talked about legal. There are others, but 00:16:21.600 |
LLMs are not going to do much for you on a manufacturing floor. The LLM part, AI might, 00:16:29.920 |
but the LLM won't. And so... And clearly, NVIDIA is taking... There are going to be all these 00:16:36.400 |
different models, vision models, omniverse, models of the real world, et cetera, that tackle the 00:16:42.160 |
other problems that you're talking about. But that's also going to lead to massive TAM expansion. 00:16:46.560 |
We met with... I spent time with Fei-Fei Li this past week, the incredible researcher, author, 00:16:52.960 |
thinker at Stanford who's working on her own business, and that's going to require model 00:16:58.800 |
training. So many people are still... And I would argue in the... Not only do I think it's going 00:17:04.880 |
higher, Bill, I think we're in the very, very early phases, and we're going to look back at 00:17:10.480 |
this period and say those were pretty incapable models. I want to talk about this Lex Fridman 00:17:15.280 |
podcast, which I thought was pretty terrific with Sam Altman. But Sam, I think, referenced that, 00:17:21.360 |
you know, the step forward in ChatGPT 5 versus 4 is at least as big as 4 versus 3. I've heard 00:17:27.680 |
this from others who've actually had some exposure to that. I also hear similar things about Anthropic 00:17:34.000 |
where you have these agent capabilities that are, you know, built right in and become intrinsic, 00:17:39.040 |
inherent in the model itself. And so I think it's not just language. I think we're going to have, 00:17:44.960 |
you know, these real world models. Obviously, we've seen the video models, etc. And these 00:17:48.720 |
things are just massively consumptive of data and compute. And so in that regard, I think, 00:17:54.880 |
if anything, we're still probably underestimating where we go from here. But I will stipulate to you 00:18:01.120 |
as venture investors, you're going to have a lot of roadkill along the way, 00:18:07.840 |
right? You know, there were 20 search engines that went to zero, there are going to be 20 LLMs 00:18:14.160 |
that go to zero, or many more. And I'm not even sure, right? If you said to me, where does value 00:18:21.920 |
capture happen at the end of the day? Where is durable value capture in the land of AI? I'm not 00:18:28.800 |
sure that LLMs, like if that's all you have going for you, I'm not sure that's going to be the 00:18:34.160 |
winner. At the end of the day, you get paid by solving problems for consumers or solving problems 00:18:40.400 |
for enterprises in a way that's durable and differentiated such that other people can't 00:18:45.760 |
compete. And you can achieve, you know, some dominant market share type pricing, some monopoly 00:18:52.160 |
like pricing. And when I look at the commoditization of models, it seems to me that 00:18:57.680 |
apart from the people who are on the very frontier, if you're on the frontier, and you 00:19:02.080 |
have something totally different, it seems to me that that's a place where that is defensible. But 00:19:07.040 |
if you're not on the frontier, man, it seems that these are going to be really fast depreciating 00:19:12.720 |
assets that are going to be commoditized by the Llamas of the world, other open source models. 00:19:17.760 |
So I think you can have these two simultaneous truths. One can be that we're at the early stages 00:19:24.320 |
in terms of the demand cycle for AI. But number two, that if you're building an index of LLMs, 00:19:31.040 |
it may not be a very successful investment approach. Yeah, I mean, you raised so many 00:19:36.400 |
different questions. I'll try and unpack a couple, and then I want to go back to 00:19:42.240 |
the Sam thing for a minute. But I really look forward to seeing 5 and seeing if I feel 00:19:51.760 |
that way about it. I think part of what people are doing is extrapolating the feeling they had 00:19:59.600 |
the first time they used ChatGPT or the first time they had it write a letter, you know, and 00:20:05.600 |
it just had this magical appeal to it. And then the next step was multimodal. And I would argue 00:20:13.440 |
that didn't really land as well as people thought. And the first version of I think what you're 00:20:20.080 |
calling agents were these third party attachments you could click and sign up for like that didn't 00:20:25.680 |
really land. And so what I don't have the ability to understand, and this is where I 00:20:34.880 |
could get it way wrong, is: it doesn't appear to me that if agents are connectors to the external 00:20:41.600 |
world, you know, to me, that's a lot like an enterprise when people would build connectors 00:20:46.560 |
to other apps. And I just don't see it on this kind of exponential scale that we got, you know, 00:20:54.240 |
as the LLM was passing the LSAT, you know, you had each of these hurdles that it was driving through. 00:21:01.600 |
And, you know, I don't know that it'll feel that way. But if it does, then, I mean, 00:21:08.080 |
this will happen. So when does five ship? Everybody thinks summer end of the year. 00:21:12.960 |
OK, perfect. So today's April 2nd. So within six months, if it's so magical, then I want you to 00:21:21.120 |
call me out and say, you see. But we're playing a lot with hyperbole. And that's 00:21:27.920 |
where I wanted to ask, which is, like, just saying, oh, it's way better, or, oh, it's, you know, 00:21:34.480 |
it causes my skeptical meter to go up. There are two things Sam did in this 00:21:42.400 |
recent interview that I think put him in the promotion hall of fame. One of them was he 00:21:51.680 |
juxtaposed AI against the smartphone market and framed the smartphone market as being 00:22:01.280 |
small. He said, well, that's limited to five billion purchases every two years. And we're 00:22:08.400 |
going to go way beyond that, which I just thought was genius. Right. Like, bring up something 00:22:14.880 |
really big, then degrade it, and say it's going to be way bigger. It's genius. But the second 00:22:20.720 |
thing he did was he said he's afraid they're going to run out of energy. Now, this is genius also, 00:22:28.960 |
because the minute you leave that conversation and say, holy, how much energy is used by data 00:22:37.120 |
centers, you've succumbed to the game, like because you've now accepted that this thing's 00:22:44.400 |
going to be energy limited and you're going off and trying to figure out whether that's true or 00:22:48.640 |
not. I think both of these things were genius, and it may sound like I'm kidding. I'm not. I think 00:22:55.200 |
they're Jobs-level promotional techniques. So considering that I am on the other 00:23:03.680 |
side of that from you, let's actually look at the quote and then let's take it apart in its parts. 00:23:08.800 |
Right. He starts off by saying, I think compute is going to be the currency of the future. I think 00:23:12.960 |
it'll be the most precious commodity in the world. OK, I'll give you that. That's nice prose 00:23:17.760 |
and a little hyperbolic, perhaps. He says, I think we should be investing heavily to make a lot more 00:23:22.240 |
compute. And then he addressed this question of why are people skeptical of demand? I would say 00:23:27.520 |
you're skeptical of demand. We have some other friends who are skeptical of demand. And he says 00:23:32.160 |
people think they compare it to the market. So they think about the market for chips, for mobile 00:23:37.200 |
phones. And that's something I never did, by the way. I never thought about it that 00:23:41.440 |
way. That's my point. But you know that seven billion people, they upgrade every two years. 00:23:47.440 |
So the market's three billion systems on a chip. And he says, here's the important point, Bill, 00:23:51.600 |
because it gets down to the elasticity of pricing. He says, if you make 30 billion systems on a chip, 00:23:59.600 |
you're not going to sell 10 times as many phones because most people only have one phone. Now, 00:24:04.800 |
then he goes on to say, but what if you produce more compute? This is why it's so important. 00:24:10.400 |
But compute is different. Intelligence is going to be more like energy, where the only thing that 00:24:18.320 |
makes sense to talk about is, at price X, how much of this compute the world will consume, and at price 00:24:25.680 |
Y, how much it will consume. Because if you think about energy, right, and I was trying to come up 00:24:31.920 |
with an analogy on this, Bill, if the application, let's talk about an application that we use of 00:24:39.040 |
energy every day that we like, like a hot water heater. I like to take a hot shower. But if the 00:24:44.240 |
cost of that was $100,000, right, not many people would take hot showers. But he's saying that if 00:24:51.920 |
you drive down the price of the cost of compute, then the reflexivity is people will consume a lot 00:24:58.720 |
more of it. Now, this is also known as the Jevons paradox, right? As price goes down, we demand more 00:25:06.800 |
of it. The aggregate amount of consumed, of the compute consumed actually goes up, right? And 00:25:13.840 |
that's really, you know, just a fancy way of talking about the elasticity of demand. And so to 00:25:19.760 |
me, I think that is the way when people, when the traditional semiconductor chip analysts look at 00:25:27.440 |
this market or data center analysts look at this market, they do, Bill, compare it to things like 00:25:32.880 |
the smartphone market, et cetera. And they say, okay, well, how does this compare? Is it bigger 00:25:37.920 |
or is it smaller? And I think the key point he was making, whether we agree or disagree, 00:25:42.960 |
he was saying, if you produce it in abundance, right, there is going to be dramatic workloads 00:25:50.080 |
demanded of that abundance of compute. And he used the examples, you know, whether it's, 00:25:54.640 |
you know, tracking my email, doing very pedestrian things, running co-pilots for engineering and 00:26:00.000 |
customer care, running these autonomous agents, or whether it's solving in the future, 00:26:05.040 |
much more difficult problems like finding cures to cancer using some of these search and planning 00:26:13.200 |
techniques, inference planning techniques that we're looking to in the future. So I thought that 00:26:17.840 |
was actually interesting. But, so let me stop there, get your reaction because the second half 00:26:23.440 |
of it I also think is important. Don't you agree that if we drive down the cost of compute, if we 00:26:30.320 |
build a lot more of this, today we're clearly constrained that we're going to consume a lot 00:26:35.280 |
more of it in the future. I mean, it's not limited by the number of chips on a smartphone. 00:26:39.680 |
It's just not that interesting, Brad. Like if I drove down the cost of housing, I would produce 00:26:44.720 |
more of it and people would use more of it. If I drove down the cost of gasoline, people would use 00:26:49.440 |
more of it. If I drove down the cost of a robot, there'd be more automation. Like, of course, 00:26:54.720 |
that's true. If an airplane cost one-tenth the price that it does today, we would fly more. 00:27:00.560 |
That's just not that interesting. It's not provocative. It's not provocative. 00:27:06.560 |
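What the two are actually debating is price elasticity of demand: the argument only bites if compute is elastic, meaning a price cut grows total consumption and total spend, rather than just letting existing buyers pay less (the one-phone-per-person case). A toy constant-elasticity model makes the difference concrete; all numbers are illustrative:

```python
# Constant-elasticity demand: quantity = k * price ** (-elasticity).
# Total spend = price * quantity = k * price ** (1 - elasticity),
# so a price cut GROWS total spend only when elasticity > 1.

def quantity(price: float, elasticity: float, k: float = 100.0) -> float:
    return k * price ** (-elasticity)

def spend(price: float, elasticity: float) -> float:
    return price * quantity(price, elasticity)

for e in (0.5, 1.5):  # phone-like (inelastic) vs. energy-like (elastic)
    before = spend(10.0, e)   # spend at the high price
    after = spend(1.0, e)     # spend after a 10x price cut
    print(f"elasticity={e}: spend {before:.0f} -> {after:.0f}")
```

Under these toy numbers, the inelastic good sees total spend fall (from about 316 to 100) after the 10x price cut, while the elastic good sees it rise (from about 32 to 100), which is the distinction between phones and compute being claimed here.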
Okay. So let me tell you why I think it is provocative. We did drop the cost of flying 00:27:10.640 |
on an airplane to one-tenth since 1971. And guess what? We got 10X as much travel. Okay. So all he's 00:27:17.520 |
saying is that there, you know, today we're limited in the amount of these things that we can do. We 00:27:24.560 |
can't build a multi-trillion parameter model today if you don't have the compute to do it. 00:27:30.160 |
And so I do think that you're going to get to intelligence faster. You're going to get to more 00:27:35.280 |
use cases faster, just like you got to more plane tickets and more hot water heaters, but we got to 00:27:39.920 |
actually build out the infrastructure. The truth is, whether we believe it or not, it's very clear 00:27:46.720 |
to me, and maybe we can move to this topic, it's very clear to me that enough people in the world 00:27:52.400 |
believe, because I'm reading headline after headline about massive supercomputers that are 00:27:59.120 |
being built, right, in almost all of the GCC countries. So Oman and Kuwait, MGX is backing a 00:28:06.480 |
lot of efforts in the Emirates. The Saudis are doing the same. The French, a lot of talk about 00:28:12.720 |
what's going on with the Singaporeans. The United States is out talking about their supercomputers. 00:28:17.680 |
And just this week, we saw some headlines, I don't know whether or not they're true, 00:28:22.960 |
about this project that Microsoft and OpenAI execs are drawing up on something, I think 00:28:28.640 |
they're calling, what is it, Stargate? And again, I don't know the specifics whether or not that's 00:28:34.160 |
true in '26 or '27, that they're going to spend $100 billion on a single supercomputer, but it 00:28:39.760 |
does seem to me that there are enough breadcrumbs in the world today that people are making forward 00:28:45.520 |
bets, that the need for more training and the need for more inference, there's sufficient evidence 00:28:52.160 |
to cause them to put real money up against this. And so I think we're, you know, remember, we 00:28:56.800 |
talked about a few weeks ago, that chart where Jensen said the five-year replacement cycle for 00:29:02.320 |
data centers is not going to be $1 trillion, it's going to be $2 trillion. And then we'll reshow it 00:29:07.200 |
here in the pod, and we broke down what that's likely to look like on an annualized buildup. 00:29:12.640 |
In many ways, this is just a lot more commentary about the same, Bill, which is there are people 00:29:20.960 |
placing orders, hyperscalers placing orders, enterprises placing orders, and now sovereigns 00:29:26.320 |
placing orders that get you to a much bigger number than the traditional data center market. 00:29:33.760 |
Well, I think it will be solved. There will be more capabilities brought online to train 00:29:41.360 |
ever bigger models. But I think that he's suggesting, and this is, you know, I think 00:29:48.160 |
as forecasters, the hard thing is, are these things going to be sufficiently demanding of 00:29:55.280 |
compute and power that on current trajectories, we're going to fall short. And I think there are 00:30:01.040 |
a lot of people sounding alarm bells on this, including, I saw a headline about the Biden 00:30:07.440 |
administration that says it wants to accelerate its conversations with big 00:30:12.480 |
tech companies on how to generate more electricity, including with nuclear power, to meet their 00:30:17.520 |
massive demand for AI computing. So on this one, I think we have a forecast. You know, 00:30:24.320 |
if you look at the general forecast for power demand, it's been about GDP growth, right? 00:30:30.720 |
Couple percent a year. And the fact of the matter is with renewables, we got sufficiently more 00:30:35.920 |
efficient every year that you really didn't have to bring on a lot of new power generation to meet 00:30:41.680 |
the demand that was going on in the world. But now if you look at some of these forecasts, 00:30:47.040 |
and here's a forecast, I think coming out of SemiAnalysis, that's similar to a lot 00:30:52.960 |
of the other ones. And it has data centers as a percentage of US power generation going from 00:31:00.000 |
something like four percent today to something like, you know, 18, 19 percent in 2030. Right. 00:31:08.400 |
So these are parabolic moves in terms of consumption that aren't satisfied with the 00:31:13.680 |
current improvements that we have in terms of both power generation and our grid. You've been 00:31:20.080 |
out front, I should say, on calling for the need for regulatory reform when it comes 00:31:26.720 |
to nuclear fission. A lot of people are talking about nuclear fission with respect to 00:31:32.080 |
solving this problem. Let's say that you believe the demand is going to be great, 00:31:37.040 |
you know, and we just talked that there's some debate back and forth. But talk a little bit 00:31:41.440 |
about what would need to happen from a nuclear fission perspective in order to meet some of 00:31:46.240 |
this power demand. Yeah, I think there are better people to talk about this than me. But 00:31:52.320 |
when I look across the number of people whose intellect I respect and this came to my thinking, 00:32:01.680 |
you know, maybe five years ago, and that's everyone from Steve Pinker to John Collison to 00:32:08.000 |
Elon to Toby at Shopify, like they all believe that nuclear is the very best way 00:32:15.920 |
to get us past the climate change problem. And one of them, I can't remember which one, 00:32:21.600 |
tweeted, like, if you're vocal about climate change and anti-nuclear, I don't have time for you. 00:32:28.320 |
And I've been in that mindset. What has happened, and another person that's been 00:32:35.520 |
remarkably outspoken on this is Josh Wolfe at Lux, who I spent time with at Santa Fe. 00:32:40.560 |
And he even went in front of Congress and was talking about how much China's investing in 00:32:46.480 |
fission. I think we have some charts we can show here, but their plan is to build 100 or 200 new 00:32:52.320 |
fission plants, while, you know, if you go back three or four years ago, we were only 00:32:58.320 |
decommissioning plants; we weren't building any. And of course, people talk about the horrific thing that happened 00:33:04.160 |
in Germany where they were a leader in the market and turned them off and now are aggressively 00:33:10.240 |
using coal and buying energy externally. And, you know, I think the best quote I heard, 00:33:17.840 |
and maybe it was from Josh, it may have been from Pinker, he'll love that I confuse those two, 00:33:22.640 |
was that if World War Two never happened, the atomic bomb never happened, and someone just 00:33:29.680 |
showed up in 2024 and said, "I figured out this thing, it's nuclear fission," 00:33:34.640 |
people would be like, "Oh my God!" Like, probably be more excited than you are about AI, right? Like, 00:33:41.760 |
because it would solve all of our problems. And we let there become this negative association. 00:33:47.200 |
Now, in the past three or four years, that's been flipping and there's been a lot. I think 00:33:51.680 |
it's because of thought leaders like the ones I mentioned being outspoken that people have come 00:33:58.000 |
to realize that it is the most energy efficient, the most energy dense, 00:34:04.640 |
the cleanest thing we have available. It's way more durable because it can be put in so many 00:34:10.960 |
different places that don't have geographic limitations, don't have sunshine limitations, 00:34:15.520 |
don't have wind limitations, don't have transport limitations that some of the renewables have. 00:34:21.120 |
And so it's been great, exceptional even, to see this flipping of expectation about 00:34:29.760 |
what's possible with nuclear and, you know, the plant that didn't get shut down in California 00:34:36.320 |
and then this Biden thing this week, which is all new, right? And it's new in a different 00:34:41.600 |
direction. The winds had been blowing one way, and now they're blowing the other. So it's fantastic. Now, 00:34:47.680 |
despite it being fantastic, China is way ahead of us. And we'll put a link in the notes of 00:34:56.240 |
Josh Wolfe telling this to Congress, but they're way ahead of us. They're executing faster. And 00:35:02.240 |
the biggest problem, as I understand it, once again, from talking to experts, not from 00:35:06.960 |
my own direct knowledge, is that our cost of deploying new nuclear fission infrastructure 00:35:14.960 |
is limited by our own regulatory framework, not by the technology. If you look at the 00:35:22.080 |
cost differential between us and, say, China, that is the problem. And I don't know that we know 00:35:30.480 |
how to do regulatory reform. But if I could snap my fingers and say, what would I want? I'd 00:35:38.240 |
love for there to be a zero-based, like you do zero-based costing in an organization, a zero-based 00:35:46.560 |
regulatory rewrite of nuclear fission, and one done without oil executives in the room. Now, 00:35:54.160 |
I don't know how to make that happen. Well, you know, you bring up so many good points. 00:36:00.560 |
One of the points is, you know, we've got this chart from Our World in Data 00:36:06.000 |
that nuclear is both the safest and the cleanest source of energy. Right. And so if you just start 00:36:12.800 |
there, you have to ask the question, like, how did we end up here? Like, we're all of the age 00:36:19.120 |
that we remember Chernobyl and all the fear that got propagated in the world. One of the things 00:36:24.720 |
that worries me about A.I. itself, right? Here's a technology, much like supersonic aviation, 00:36:32.080 |
where scare tactics literally shut down all of the innovation in massive industries that would 00:36:39.760 |
have inured to our national strategic advantage, the environmental advantage, right, geopolitical 00:36:47.440 |
advantage. And we literally shut it down. And worse yet, there wasn't even a robust debate 00:36:53.440 |
about it. It's just like, you know, the anti nuclear lobby declared victory and it was game 00:36:59.840 |
over. And what's interesting to me, if you say, well, why, Bill, have the winds begun 00:37:07.680 |
to change on nuclear? I think the reason the Biden administration came out this week is this. Right. 00:37:13.360 |
If you think that A.I. is on the front lines of the new Cold War about national economic security, 00:37:21.600 |
national defense and offensive security, and if you think the limiting factor to pushing forward 00:37:28.960 |
in A.I. is power, then all of a sudden. Right. You're sitting here worrying about China from 00:37:35.120 |
the perspective of Nvidia chips. You think you're ahead of China, but all of a sudden you 00:37:40.160 |
say if the bottleneck is power, at least on nuclear, we're way behind, way behind on two 00:37:46.640 |
dimensions. They have over 300 plants currently either being built or in development. We have zero, 00:37:53.280 |
zero plants being built. I think 13 plants are proposed, according to this data that we have 00:37:59.840 |
here and the cycle time to put up a plant. Right. If you do it as fast as humanly possible, 00:38:07.280 |
it's something like seven to 10 years. Right. And that's assuming that somebody says go 00:38:13.280 |
tomorrow. On top of that, if you look at some of, you know, just enriched uranium, 00:38:19.440 |
I think a lot of the plants in Canada have been mothballed. We've got to spin those plants back up 00:38:24.320 |
in Saskatchewan, whereas I think, you know, we have another chart on this. Eighty percent of 00:38:29.680 |
the world's uranium is basically coming from Kazakhstan and is being shipped to the border 00:38:35.200 |
between Russia and China. So like the U.S. has got to get serious about this. But if I'm right, 00:38:41.600 |
that the lead time is 10 years, then all the stuff that we just talked about, the demand 00:38:48.000 |
that we've got to satisfy over the next 10 years is not going to come from nuclear. 00:38:52.240 |
It's got to be an all-of-the-above strategy. But, you know, here's an FT article that 00:38:57.520 |
just came out. It said data centers' voracious power needs are set to rocket as cloud storage 00:39:03.120 |
facilities, crypto mining and AI all strain the grids. Microsoft alone's opening a new data center 00:39:10.320 |
globally every three days. These power-hungry operations will together consume more than 480 00:39:16.800 |
terawatt hours of electricity, or almost a tenth of U.S. demand. So when you start to think about 00:39:25.120 |
that, wind and solar are important. Costs are plummeting. They're clean, but they have baseload 00:39:32.000 |
issues and they certainly can't scale at the rate I think that we need this to scale. So it seems to 00:39:37.600 |
me that the power source that we're going to need to scale up is nat gas. And part of the reason I 00:39:44.800 |
think you're seeing a lot of this go down in the Middle East is because there are abundant sources 00:39:50.320 |
of natural gas. The U.S. has abundant sources of natural gas. You can spin up a natural gas facility 00:39:56.320 |
much faster, right, two to three years, very well-known technology, less regulatory headwind. 00:40:02.160 |
But it seems to me that there's a lot of energy in Washington around national AI strategy. We need 00:40:08.480 |
to have an equal commitment. Part of the same conversation needs to be a national energy policy 00:40:13.600 |
that supports the buildout we're going to have to see. Well, and look, you could leave the AI part 00:40:19.680 |
out. I think there are enough people who have such a strong view of climate risk that you could do it 00:40:26.640 |
just for that reason. And if you say both, you know, OK, that's kind of a slam dunk. When I talked to 00:40:35.840 |
our friend, Phil, who would be thrilled to hear your comments on natural gas, you know, he told 00:40:42.560 |
me that the base load in the U.S. has been flat for many decades now. And so obviously the demand 00:40:51.040 |
for energy has gone up over those decades, but it's been offset by efficiency gains. And some 00:40:57.520 |
of that will play out here. Like, the more you make an issue out of this, and the more people like us 00:41:03.120 |
talk about it, that's part of the market adjusting. And so you'll see some of that get corrected. 00:41:09.200 |
One thing that set off a little bit of fear in the markets was Amazon went and did this deal 00:41:16.080 |
where they bought some land right next to a nuclear facility. And once again, according to Phil, 00:41:23.360 |
about 25 to 30 percent of the production in the U.S. is from independent energy producers who 00:41:29.600 |
basically sell to whoever they decide they want to sell to. So it's a plant owned by a corporation 00:41:37.040 |
that sells to whoever it wants. And if you start to see data centers pick off the independents, that could put pressure 00:41:46.960 |
on the grid in a way that might bring the regulators in to voice their opinion. So I 00:41:54.160 |
think that'd be something to watch out for. Back to the nuclear thing, like I just, once again, 00:42:00.960 |
if I could do anything, I would just encourage those in Washington, if they do get on this 00:42:07.600 |
bandwagon, to create a commission of people who have the best interests 00:42:14.960 |
of the country in mind to think about reevaluating nuclear regulatory structure from the ground up 00:42:22.240 |
and starting over from scratch. Because I suspect that the 10-year thing that you're talking about, 00:42:28.480 |
in addition to the cost thing, are heavily impacted by regulation. In fact, if you'll allow 00:42:34.560 |
me, sorry to spin off of this, when I was thinking about this, I had a flash in my brain about this 00:42:43.760 |
bridge that went down on I-95 in Pennsylvania. And I'll put a link in here. And for those of 00:42:49.280 |
you that don't remember, I mean, now everyone's focused on the Baltimore bridge, but this other 00:42:52.960 |
bridge collapsed. And within 12 days, it was replaced. And there are articles that say the 00:42:59.360 |
governor's now a presidential candidate, like he's definitely going to get reelected because of how 00:43:04.080 |
wonderful this is. And when you peel underneath that and read the articles about what happened, 00:43:10.160 |
it's that he went with no-bid contracts. He was able to get people to work 24/7. It basically broke 00:43:17.600 |
through the regulations that were put in place that slow everything down. And if we've gotten 00:43:24.400 |
to a place where we celebrate how quickly things can happen, and you've probably seen these things 00:43:31.120 |
on Twitter about how quick the Golden Gate Bridge was built and how much it cost and all these 00:43:36.480 |
infrastructure projects and how today they're like exponents of that. And I also thought about 00:43:44.720 |
when they cleaned up San Francisco for Xi in like a week. The problem isn't that we don't know how 00:43:52.400 |
to solve these problems. The problem is that we're our own worst enemy. We put in place the things 00:43:58.640 |
that limit what we're capable of. Another article that I want to link to shows that renewable 00:44:05.040 |
projects are way more successful in Texas than California, which is the ultimate irony, right? 00:44:11.760 |
- But what does that tell you? What does that tell you, Bill? 00:44:13.920 |
- It tells you that there's less regulatory mud in Texas than in California. And despite 00:44:23.280 |
the intent of both the people of California and the people that I guess are representing them, 00:44:30.720 |
the intent doesn't matter if the outcome's not possible. In fact, in some ways, you should 00:44:41.520 |
hold representatives only accountable for output. Intent is kind of silly if it never achieves the 00:44:48.720 |
output. And so anyway, that's a long way of saying that. 00:44:52.720 |
- I think it's a super important topic, Bill, because when we look at great advances in 00:44:58.720 |
innovation that have been stymied by regulatory excess, right? I talked about supersonic aircraft, 00:45:06.880 |
right? Think about the productivity gains that would occur in the world if we were able to get 00:45:10.640 |
from point A to point B faster. But literally, it was the environmental lobby that stymied 00:45:17.040 |
supersonic aircraft. You think about calcium CT scans, which I've been promoting from the top of 00:45:23.040 |
the mountain. Calcium CT scans cost $100 and save lives, can bend the healthcare curve around the 00:45:28.880 |
3 million sudden cardiac events we have on an annual basis in this country. But the reason 00:45:32.880 |
it doesn't happen is they lost the lobbying game in Washington. You think about nuclear fission. 00:45:37.920 |
I mean, think about the amount of dollars in energy the government is using to subsidize 00:45:42.400 |
nuclear fusion, right? Nuclear fusion, which we all would agree would be a fantastic thing if and 00:45:51.120 |
when it happens, but it's long dated, right? We have nuclear fission that has all these benefits 00:45:56.960 |
that exist today. But again, for whatever reason, mass hysteria and otherwise, government regulation 00:46:02.320 |
got in the way of that industry. And this is what we kicked off this pod talking about. 00:46:07.440 |
And consumer perception, which you talked about. The hysteria around Chernobyl. Actually, I think 00:46:18.080 |
you have data in here, but the number of deaths that happen per year from traditional fossil fuels 00:46:24.080 |
is exponentially higher than the number of deaths we've ever had from nuclear energy. Yet, there's 00:46:31.360 |
this weightiness to it. It's like fear of flying. The irony is they're not even fearful anymore. I 00:46:42.320 |
mean, the data suggests that a solid majority of people understand that nuclear fission is safe. 00:46:47.440 |
But what happens is it takes a long time for the government regulatory process to revert back 00:46:53.840 |
to that pre-paranoia. So it takes some external force to cause us to change. Well, I think the 00:47:02.240 |
external force that's causing it today is when we start forecasting ahead power demand needs 00:47:08.960 |
to power all these things that we're talking about over the course of the next five to 10 years, 00:47:12.960 |
we're going to have to take an all-of-the-above strategy. And I'm glad Phil's going to be excited. 00:47:17.760 |
I'm not sure the price of nat gas is going to go up that much, but I do think that natural gas 00:47:23.120 |
is super plentiful. The mechanics of it in terms of building production facilities are super easy, 00:47:30.400 |
well known. And I think it can come online in big chunks, like two, three gigawatt chunks, 00:47:37.760 |
which we're going to need to power some of these big data centers. 00:47:40.640 |
Yeah. And some of the graphs that you've included here show that the cost of producing 00:47:48.320 |
new nuclear facilities in France has been going down and in the U.S. it's only been going up. And 00:47:54.800 |
that should be remarkably upsetting to everyone. Just that core reality. And I'll pose a question 00:48:05.280 |
to you. Do you think they'll go up or down in China? They're going to go down. Okay. You know 00:48:10.640 |
that innately, right? Yes. Okay. And even worse, Bill, I mean, I love to visit France as much as 00:48:21.040 |
the next person, but if it's not the height of embarrassment, if it doesn't prove the point 00:48:27.200 |
better than any other point we could possibly make, the French are driving more innovation 00:48:33.280 |
and more efficiency out of nuclear fission than the United States. And we pride ourselves on 00:48:37.840 |
innovation, right? That should be a shock and a national embarrassment. It should. And at risk of 00:48:43.520 |
being redundant, in case anyone who is listening has any authority whatsoever, I would just highly, 00:48:50.880 |
highly recommend that there be a consideration of a zero-based re-regulation of the market, 00:48:57.840 |
because it's clearly the problem. I don't think there's anyone in the energy market that doesn't 00:49:05.440 |
think the bureaucracy and regulation is the problem in the American nuclear fission market. 00:49:10.960 |
Well, I think that before we move to the market check, the one thing that seems to me is that 00:49:19.120 |
we have increasing entanglement. You know, you gave this talk that one of America's greatest 00:49:24.000 |
strengths was, you know, the fact that Silicon Valley is 3000 miles away from Washington DC, 00:49:30.000 |
right? That's been a source of our innovation. And you and I both know 25 years ago, 00:49:34.400 |
Washington wasn't all that interested in what was happening in Silicon Valley. 00:49:37.840 |
Well, today, these are all matters of sovereign importance, right? AI is our national security. 00:49:45.840 |
It is what's driving productivity gains, you know, in our economy. Technology is what's driving 00:49:51.440 |
productivity gains in the economy. And now I look at power being integral to all of this, 00:49:56.880 |
you know, whether, you know, the front lines of our battle with China, the hot war and the 00:50:01.520 |
cold war, if you will, are now being fought on whether or not we're going to give them chips, 00:50:05.760 |
whether or not we're going to let certain engineers work on certain projects, etc. 00:50:11.040 |
And so if Washington wants to get serious, it needs to be an integrated 00:50:15.200 |
national policy, right? You can't fix AI without also taking on our future energy needs. 00:50:21.200 |
How about we talk a little bit, just a little market check. Speaking of energy needs, 00:50:26.480 |
I think I know somebody on this podcast, and it's not me, who was faster to invest in some of these 00:50:32.720 |
companies that would benefit from energy policy. But we have a chart here for CEG. They're one of 00:50:38.240 |
the leading nuclear companies in the United States. And that looks like a pretty good 00:50:43.600 |
chart over the course of the past couple years, in particular this year. But yeah, 00:50:50.320 |
I bought this stock. And by the way, I bought a lot of stocks that went down. So I think 00:50:55.520 |
we got to be careful about cherry picking. But I bought this stock solely based on the bet that 00:51:00.640 |
perceptions would shift on nuclear. This is a company that was carved out of a larger producer. 00:51:07.040 |
They took the nuclear assets and made it a singular company. What I didn't know was that 00:51:13.840 |
the AI thing would happen. And there are clearly people talking about this company. I saw a Morgan 00:51:19.920 |
Stanley chart that listed everything about AI. And I saw their ticker. I was like, holy crap, 00:51:25.040 |
like I didn't know it was an AI play. But once again, I'm happy that the nuclear thing has 00:51:32.880 |
shifted. But certainly Wall Street is focused on this and this energy issue that you've brought up. 00:51:40.000 |
As a reminder to everybody, just our opinions, not investment advice. 00:51:44.160 |
But why don't we talk a little bit just about the Fed met since we last were on, Bill. 00:51:49.760 |
And the Fed every quarter updates their own estimates for things like GDP. 00:51:55.680 |
December of last year, they expected our GDP to grow 1.4% this year. And now they're 00:52:00.880 |
expecting it to grow 2.1%. So expecting our economy to be more robust. In December, 00:52:07.200 |
they expected core PCE, which is their favorite inflation metric to watch, was going to be 2.4%. 00:52:13.600 |
Now they expect it's going to be 2.6%. So coming in a little bit hotter. And now the market is 00:52:19.360 |
wrestling with how many rate cuts, if any, we're going to get. I'm still in the camp that we're 00:52:23.440 |
going to have rate cuts ahead of the election. But the market is now pricing in, I think, 00:52:27.680 |
only three rate cuts between now and the end of the year. And we see the 10-year just today in 00:52:32.640 |
the markets is up to 4.4. I think we had bottomed at 3.5. So that's adding more restrictiveness 00:52:39.840 |
to the economy. The markets, particularly high growth stocks, are starting to sell off again. 00:52:45.440 |
Stocks that are levered are starting to sell off again. And so when we started the year, 00:52:51.680 |
I think one of the big debates, we covered the AI debate, whether or not we had pulled 00:52:56.400 |
forward all the demand for training, whether or not these stocks had gotten ahead of themselves. 00:53:00.320 |
But there was a second question. Are we going to be able to avoid a recession? 00:53:05.360 |
How many rate cuts we'd get? When would we get those rate cuts at the start of the year? People 00:53:09.520 |
thought we'd get rate cuts in March or April. And so I think, as I read the tea leaves, I see 00:53:17.600 |
pockets of weakness in the economy, particularly if you look at housing, autos. Tesla really missed 00:53:24.240 |
their number today in terms of new cars shipped in the quarter. Some of that has to do with China, 00:53:29.520 |
but it also has to do with demand here. So I think that that's worthy of watching. 00:53:34.880 |
And then I've got three charts that we spend time looking at. One is a Morgan Stanley chart. 00:53:40.640 |
So the first chart is the hedge fund net exposure. So this is how many of their dollars 00:53:49.280 |
they have invested on the long side of the market in a particular sector. So this is in technology. 00:53:56.880 |
So you can see that in October of '21, or the end of '21, TMT ex the MAG 7 had become 00:54:05.920 |
31% of its book. So what is that? That's technology if you exclude the largest names. And now that's 00:54:11.200 |
down to something like 19%. And so hedge funds are starting to pull back a little bit on that 00:54:18.720 |
part of the risk curve to your point as to whether things are getting a little too bubbly, Bill. 00:54:23.840 |
The next one is people have seen this a lot. This is our software index. So where is software 00:54:29.440 |
trading relative to interest rates? You can see that in the blue line is it's trading at about 6.1 00:54:36.960 |
times. The 10-year average is 6.9 times. Rates have obviously moved up a lot off the bottom. 00:54:43.200 |
But this is a long-winded way of saying that software is not really participating in this 00:54:48.320 |
run-up. So software came down hard coming out of COVID, and it kind of remains there. It's 00:54:53.920 |
bouncing around a little bit. Everybody's trying to figure out whether or not AI is going to 00:54:58.960 |
catalyze a re-acceleration in software. Within software, kind of the data stack companies, 00:55:05.600 |
so the Databricks and the Snowflakes and maybe even the ServiceNow and Salesforce, 00:55:10.080 |
they've caught a little bit of a bid. But a lot of software is trading well below its pre-COVID 00:55:17.600 |
multiple. And then the final chart, because I knew you would ask me, I just asked the team to 00:55:22.720 |
pull together: take Tesla out of the index, show me the MAG-6 index. So I just want to see the 00:55:28.560 |
multiples for that relative to interest rates. And there again, you can see the 10-year average 00:55:34.160 |
has been about 22 times. And so that is trading up. All of these companies have re-rated a bit. 00:55:40.480 |
And the interesting thing here is, if you look, you had like an inverse correlation 00:55:47.920 |
between these two things, which is what you would expect up until about March of '22, 00:55:54.880 |
where rates kept going up, but the multiple expanded on this group of companies. 00:56:01.760 |
And so some people would say, hey, that's a warning sign, right? Rates have gone up, 00:56:06.640 |
but these MAG-6 has kind of a 10-year high multiple. Shouldn't that be a warning sign? 00:56:12.960 |
Shouldn't these companies be trading down a little bit? But I would argue to the reason 00:56:18.000 |
that they're trading at those multiples is that people's confidence or their forecast for future 00:56:23.600 |
growth is a lot higher than it's been at over the course of the last several years. 00:56:28.720 |
Bill, you pointed out a tweet to me that was, I think, where the generative AI company 00:56:35.600 |
multiples were trading. And it says OpenAI at 18 times, Anthropic at 20 times revenue, 00:56:42.240 |
Glean at 56 times revenue, Perplexity, et cetera. There are a bunch of names on here. It has Hugging Face 00:56:49.280 |
at 150 times. It's just a question whether or not this is the prelude to a bubble. 00:56:55.520 |
So I think this kind of brings us a little bit full circle. We started the pod asking the question, 00:57:00.800 |
you know, you can believe that the long run in AI is going to be massive, but you can kind of have 00:57:05.920 |
booms and busts, winners and losers along the way. When you look at these multiples for these 00:57:11.680 |
companies, what do you sniff out? Well, I mean, the thing that made me forward this to you was 00:57:17.440 |
that it lists them all and then it says, is OpenAI a bargain? So rather than the takeaway being, 00:57:27.600 |
oh my God, these things are really highly priced, there's this relative valuation game, 00:57:33.280 |
which is how bubbles are built, because you adjust up, you re-rate to a new norm, right? 00:57:42.640 |
You know, I would just say, literally, you know, maybe not since we last 00:57:48.000 |
talked, but let's say in the past four weeks, a couple of things have happened. So we've had 00:57:53.520 |
what I called these fast failures. Maybe the Inflection team will get their money 00:58:00.560 |
back, but that's not what venture is about, right? Like, so the 10X is off the table. That seems to 00:58:07.280 |
be true at Stability AI as well. You had a really interesting data point in the past week where 00:58:12.560 |
I think people are abandoning the notion of the $20 premium personal AI tool. Perplexity came out 00:58:22.080 |
and said that they were, you know, considering advertising, and there were statements in 00:58:26.640 |
their previous releases that were very negative on advertising. And then OpenAI said there was 00:58:32.560 |
going to be a non-login version of ChatGPT, which would just inherently be cannibalistic 00:58:39.360 |
to their $20, even though I think most of us believe that's over a billion run rate already, 00:58:44.400 |
right? But if they're abandoning that, that's an interesting data point. Like the leading players 00:58:50.160 |
are saying they don't think that's durable. Or it might be that, you know, the land grab is on, 00:58:56.240 |
I want to sign up as many consumers to use my product as anybody else. And I'm willing to 00:59:00.560 |
forego some of those early revenues, but I would agree with you. There's marginal cost here. 00:59:05.840 |
I think the run rate here that's been rumored is that OpenAI consumer revenues are about a billion 00:59:11.280 |
and a half. So if you look at that, you've got just under maybe a million customers using, 00:59:18.080 |
you know, paying for ChatGPT. I do think I've said this many, many times, listen, 00:59:23.200 |
I don't think a $20 a month fee for my consumer AI is going to be defensible. And the reason I 00:59:31.520 |
don't think that's defensible is because there are going to be way too many people in the pool. 00:59:35.600 |
Apple's going to have one, Google's going to have one, Meta's going to have one, 00:59:40.560 |
ChatGPT is going to have one, go through the list, perplexity, et cetera. It only takes 00:59:44.880 |
one person to give away a frontier quality experience, you know, at a much lower price. 00:59:50.320 |
I'm going to say two things that are going to sound like they're at odds with one another. 00:59:53.840 |
I agree with the current state of LLMs. Exactly what you're saying is true. I think if one of 00:59:59.840 |
them can seamlessly integrate memory in a way where I can become reliant on it, I would gladly 01:00:07.680 |
pay 20 and maybe more. A lot more. And so, you know, I think it's TBD. I think we got to see 01:00:14.400 |
whether or not someone can, you know, everyone seems aware of the memory issue that we started 01:00:18.560 |
talking about a while back, but that doesn't mean it's solved. And the architecture doesn't 01:00:23.280 |
have an elegant solution. So it's the number one thing I'm interested in and looking out for. 01:00:28.800 |
The last data point I'd bring up on this front, I did a tweet where I was curious why, you know, 01:00:36.240 |
there's a 60X price reduction from the top model at an LLM company to their next model. 60 is just a lot. 01:00:44.160 |
Like, you don't see this. Just so we understand what you're talking about. Like, the difference 01:00:50.240 |
between, I don't know, maybe a GPT-4 and a GPT-3.5 is a huge differential in price. 01:00:55.840 |
60X. And, you know, relative to other markets, like, that's not true in energy. That's not 01:01:03.040 |
true in cars. I said to you, if new car production were limited, what would happen to used car 01:01:08.240 |
prices? They would go up. So there's something weird here. Now, I got some good responses. One 01:01:13.600 |
person said that there are throughput limitations on the high-end model. And so they're priced 01:01:19.360 |
artificially high on purpose, which I could see that making sense. But the second thing people 01:01:25.360 |
said was that the runtime models are just highly competitive. And many people think that some of 01:01:31.440 |
the models are already priced under cost, which goes back to the credit investment theme that you 01:01:38.000 |
and I have talked about. But if things are already that kind of hyper-price competitive, it's just a 01:01:45.040 |
data point. Like, it's a data point worth paying attention to. So when I look at all those things 01:01:50.960 |
and the entry prices that a marginal late-stage investor would be asked to pay, which are these 01:01:56.560 |
multiples here, I'd be, like, nervous. I'd be nervous. I'll leave it at that. 01:02:02.160 |
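[The 60x price gap discussed above is an arithmetic claim about API price tiers; here is a minimal sketch of it, using hypothetical per-token prices rather than any vendor's official price sheet:]

```python
# Illustrative (hypothetical) per-million-token API prices -- not official
# figures -- for a "top" model versus the same vendor's next model down.
top_model_price = 30.00   # $ per 1M input tokens (assumed)
next_model_price = 0.50   # $ per 1M input tokens (assumed)

# The differential the speakers are reacting to is simply the ratio of tiers.
ratio = top_model_price / next_model_price
print(f"Price gap between tiers: {ratio:.0f}x")  # → 60x with these assumed prices
```

[With these assumed numbers the ratio is exactly 60x; the point in the conversation is that a gap this wide between adjacent tiers is unusual relative to markets like cars or energy.]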
No, I mean, listen, it brings me back to the conversation about investing in Alta Vista, 01:02:08.000 |
Lycos, AOL, et cetera, versus Google. I mean, the reality is we were in the fog of war, 01:02:13.840 |
and we didn't know it. Every venture firm was paying up to get a search logo in '98 and '99. 01:02:20.000 |
There were early revenue curves that looked promising for all of those companies. But the 01:02:24.240 |
fact of the matter is we hadn't even gotten to the starting line. Like, we didn't figure-- like, 01:02:29.280 |
we hadn't even determined who the market leader was going to be. And ultimately, Google, by 2004, 01:02:34.320 |
2005, it was very clear that they were going to capture a dominant share of that market, 01:02:38.400 |
that the market was going to grow longer, be bigger. And so 98%, 99% of all the profits ever 01:02:44.400 |
created in internet search went to Google and went to them five or six years after, right? 01:02:51.760 |
Search really emerged as a category. And so you at least have to leave open the possibility that 01:02:58.000 |
that's the moment that we're in, that you look at these names on the list. And obviously, OpenAI is 01:03:03.760 |
an early leader, is in pole position to be the non-Google Google, right? But the fact is we're 01:03:09.840 |
super early, and it's not even clear that they're going to have sustainable economics in order to 01:03:16.240 |
do that. But if you said to me, I look-- we passed on a lot of these companies simply because we 01:03:22.960 |
couldn't get comfortable with the entry multiples, given how opaque it still is at this moment. 01:03:29.040 |
But here's what I also believe. I believe the winners, the ultimate winners in AI have the 01:03:34.800 |
potential to be way bigger than the winners of prior generations. And so you can afford to miss 01:03:40.720 |
those early rounds, perhaps, in some of those companies. Not if you're benchmarked, perhaps, 01:03:45.520 |
and your stock and trade is Series A. But if you're Altimeter, and we can-- like, guess what? 01:03:51.680 |
Investing-- we had Bob Mylod on last week. Investing in Priceline at a billion dollars, 01:03:56.240 |
that was 120x in the public markets. Google also produced venture-like returns in the public 01:04:02.880 |
markets. Amazon produced venture-like returns in the public markets. So I think the most important 01:04:08.800 |
thing is-- But anyway, you were able to buy all of those at very reasonable relative prices, 01:04:14.080 |
all three of the examples you used. Yeah, but I would also tell you-- I mean, 01:04:17.760 |
at the beginning of the 100x in the public market. Yeah, but my mentor, Paul Reeder, 01:04:22.160 |
used to walk into my office, and Google would-- query volume would go down a little bit, 01:04:26.320 |
and he would be like, why are you so confident? This is 2005, 2006. And so it never appears clear 01:04:33.440 |
at that moment in time, right? You had to believe in the size, the length, and the width of the 01:04:39.360 |
market. And so all I'm saying here is that you've got to tread with caution, right? 01:04:51.680 |
Like, that already happened. That's liquid today. It clearly has been the AI winner. 01:04:59.920 |
18 months ago, 18 months ago, NVIDIA was at $125 a share, and Meta was at $90 a share, okay? 01:05:08.720 |
Meta's at $500, and NVIDIA's at $900, a huge multiple of where it was then. Markets shift very quickly, 01:05:18.320 |
right? And then they forecast ahead levels of durability that may or may not exist. 01:05:23.760 |
And so when you're at these moments, the beginning of a phase shift, 01:05:27.280 |
we talked about-- you and I on this pod-- ERR, experimental run rate revenues, 01:05:34.320 |
versus something that you believe is monopolistic, annually recurring. 01:05:38.720 |
Those are two radically different things. One, you have no ability to forecast the future, 01:05:44.000 |
and therefore it should have a one or a two X multiple because you don't know if it's going 01:05:48.320 |
to occur again. The other one, if it's monopoly-like recurring, then you can give it a 10 01:05:54.320 |
or a 20 X because you think it's going to last for a very long period of time. 01:05:58.320 |
Anybody who tells me that these revenues deserve monopoly-like recurring multiples today, 01:06:04.800 |
that's a real struggle once you get to the late stage. 01:06:08.240 |
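[The ERR-versus-ARR argument above is ultimately about multiple compression: the same revenue supports very different valuations depending on how durable you believe it is. A toy sketch, with all figures hypothetical:]

```python
# Toy valuation sketch of the ERR vs. ARR framing from the conversation.
# Identical revenue, very different multiples depending on whether you treat it
# as experimental run-rate (ERR) or durable, monopoly-like recurring (ARR).
# All figures are hypothetical.
revenue_run_rate = 1_000_000_000  # $1B annualized revenue (assumed)

err_multiple = 2    # low multiple: no confidence the revenue recurs
arr_multiple = 20   # high multiple: durable, monopoly-like recurrence

err_value = revenue_run_rate * err_multiple
arr_value = revenue_run_rate * arr_multiple

print(f"ERR framing: ${err_value / 1e9:.0f}B")  # → $2B
print(f"ARR framing: ${arr_value / 1e9:.0f}B")  # → $20B
print(f"Same revenue, {arr_value / err_value:.0f}x valuation gap")  # → 10x
```

[The 2x and 20x multiples are the illustrative bounds the speaker himself uses; the sketch just makes explicit that a late-stage entry price effectively embeds a bet on which framing is right.]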
I think it's a good lead-in for maybe a future podcast, which is I have this big question about 01:06:14.960 |
whether the leading LLMs are going to incorporate their own data store, which if developers become 01:06:22.320 |
reliant on, would greatly increase switching costs. Because today, the switching costs based 01:06:28.800 |
on every developer I've talked to are-- we mentioned this before-- but about as low as 01:06:33.600 |
you could possibly imagine. And if you're just using big context windows and not doing fine 01:06:39.360 |
tuning, then you're just not-- you can bounce from one model to the next. 01:06:43.600 |
And this gets back to that point I was making about are you just using the LLM for 01:06:49.360 |
language translation versus are you relying on it for something greater? 01:06:53.600 |
And so anyway, it's something I'm watching out for. Could someone build an LLM with an 01:07:00.880 |
inherent data store that developers start to become reliant on because then they're in a 01:07:05.760 |
different place than the ones are today? And answering the question in that tweet, 01:07:10.320 |
is OpenAI a bargain? I will tell you, the only way I see you getting to switching costs 01:07:14.880 |
is you have to develop something that is fundamentally differentiated, 01:07:18.960 |
right, that gives you monopoly-like capability, whether because you've got memory and nobody 01:07:23.600 |
else could do it, whether it's because it-- Maybe it'll be the agent you're talking about. 01:07:28.400 |
I'm skeptical, but people that have seen them are excited, including you. 01:07:32.480 |
But if I had to make a bet, you bet on the smartest group of people that's come together, 01:07:36.160 |
that's delivering at the fastest cycle time. I mean, an OpenAI would certainly be in that camp. 01:07:41.280 |
Anthropic would be in that camp. They're going to compete against these big hyperscalers. 01:07:45.680 |
I do think the winner here is going to be massive, but you're making an excellent point 01:07:52.800 |
that that could be true, and 90% of these other things could go to zero. 01:07:57.040 |
So for venture investors who are paying up for things with that level of variability, 01:08:02.080 |
that's a very different game than betting on a software startup with low capex, 01:08:07.200 |
high predictability, $10 or $20 million in early indicative revenue in 2016 or 2017.