
Market Predictions, Rates & Inflation, DOGE, CES, AI Compute | BG2 w/ Bill Gurley & Brad Gerstner


Chapters

0:00 Intro
2:58 Frontline Ideas for 2025
7:02 The Bogeyman (Interest Rates and Inflation)
9:21 Mega Cap Valuations & Expectations
14:02 Big Tech CapEx
16:51 Scaling Inference
20:11 Future of AI
39:50 Role of Power in AI Development
42:58 Interest Rates & Economic Growth
45:11 Federal Spending & DOGE
54:15 Regulatory Landscape for AI
61:27 Co-opetition in AI
77:06 Reasoning Models & Chain of Thought

Whisper Transcript

00:00:00.000 | Elon said he thinks that every cognitive task
00:00:03.240 | that can be done by a human will be able to be done
00:00:06.080 | by an AI within three to four years.
00:00:09.000 | If you believe that to be true,
00:00:11.060 | the value of all that human labor that you're replacing
00:00:14.800 | is measured in trillions.
00:00:17.040 | (upbeat music)
00:00:28.660 | Good to see you, Brad, happy new year.
00:00:30.840 | Happy new year, I have on a, you probably can't tell, Bill,
00:00:33.820 | I have on a blue shirt today, not a black shirt.
00:00:36.900 | I have on my green pants, a little Notre Dame spirit
00:00:39.460 | for the big Notre Dame game tonight.
00:00:41.000 | I know you have a big Texas game coming up.
00:00:43.340 | Yeah, that's tomorrow in Dallas, yeah.
00:00:45.860 | So we'll have to see if both those teams
00:00:48.480 | could win and advance, that'd be pretty spectacular.
00:00:51.040 | You and I were talking about some of this insanity
00:00:53.720 | going on in LA, and I know we both have a friend
00:00:57.220 | who's lost a house and there's a lot of this debate
00:01:01.620 | going on online, whether or not any of the policies
00:01:05.240 | that we've had are proximate causes,
00:01:07.260 | at least to the severity of the thing that's going on.
00:01:09.580 | And one of the things that I know you and I
00:01:12.940 | both really appreciate is as we're entering 2025,
00:01:16.940 | just the opportunity, I feel like the conversation
00:01:21.300 | going on online, the debate going on online
00:01:24.380 | is more robust than it's been.
00:01:26.660 | And I think it rubs a lot of people the wrong way,
00:01:28.740 | but I tend to be in the camp that the more open debate
00:01:31.220 | and accountability, it's just the better, right?
00:01:34.140 | And no matter what side of the political aisle
00:01:37.140 | you end up being on, we should all be in favor
00:01:39.620 | of just doing better.
00:01:41.900 | And when you look at how horrific these scenes are in LA,
00:01:46.780 | you have to ask the question, what could we have done better?
00:01:50.300 | Yeah, and I totally agree with you.
00:01:52.980 | I have this framework in my head that I think about
00:01:56.820 | in relation to these types of situations,
00:01:58.900 | which is a lot of people, I think, evaluate politicians
00:02:03.900 | and policies based on what they think the intent
00:02:08.460 | of the decision was, and they fail to follow up
00:02:12.740 | and then see if the output or the outcome is identical
00:02:17.740 | to what they thought the original intent was.
00:02:20.020 | And I think all too often the original intent of something
00:02:24.100 | may have sounded good, or you might be voting for someone
00:02:27.620 | 'cause you agree with their point of view,
00:02:29.820 | but if the policy fails to achieve that,
00:02:33.220 | or in many, many cases, achieves the exact opposite of that,
00:02:38.220 | then you really have to ask yourself, what's the point?
00:02:42.100 | And so, yeah, I hope that this type of accountability
00:02:47.300 | and transparency, shining a light in,
00:02:50.980 | could lead to better outcomes in the future.
00:02:53.460 | And there's always trade-offs, obviously,
00:02:55.300 | like you can't have everything.
00:02:57.220 | For sure, for sure.
00:02:58.820 | Well, I thought we could do something unique and different,
00:03:02.940 | if you're up for it, related to this time of year.
00:03:06.860 | So, obviously, large investment funds like yourself
00:03:11.740 | are typically operating on a calendar cycle.
00:03:16.140 | Sometimes the reporting is certainly looked at annually,
00:03:20.140 | and sometimes even some of the fees and whatnot
00:03:23.420 | are calculated on an annual basis.
00:03:25.340 | And so I'm sure that creates an annual cycle for you.
00:03:29.020 | And as you've been going through that process,
00:03:32.640 | I thought it'd be really cool to expose the listeners
00:03:36.420 | to both the analysis that you're going through now
00:03:40.180 | and a window into that process.
00:03:42.300 | And so, how does someone, a large professional investor
00:03:46.540 | like yourself, think about this time of year,
00:03:49.320 | and specifically, what are you looking at now,
00:03:52.960 | and give everyone a peek inside?
00:03:56.620 | Yeah, no doubt about it.
00:03:58.260 | And we've been, I think,
00:04:00.180 | having this conversation for 20 years.
00:04:01.900 | But half of our business,
00:04:03.420 | the venture capital part of our business
00:04:05.500 | has a much longer cycle time.
00:04:08.180 | What informs how we think about the annual cadence
00:04:11.140 | of our public market positioning,
00:04:13.220 | which is the other half of our business,
00:04:14.820 | is a lot of times these big trends
00:04:17.020 | that we're having debates on.
00:04:18.860 | We ended last year in this great conversation with Dylan
00:04:22.500 | about this tension or debate in the world
00:04:24.620 | about how much compute the world needs,
00:04:27.380 | all the investment going on, et cetera,
00:04:29.140 | that we'll get into today.
00:04:31.100 | But it's all those things,
00:04:33.060 | including a major political change
00:04:37.300 | going on in this country,
00:04:38.500 | that all impact how we think about this year.
00:04:42.020 | And so, at the end of every year,
00:04:44.180 | I not only look at kind of errors and omissions,
00:04:46.420 | like what could we have done better in our public trading
00:04:50.120 | and our public investing for 2024,
00:04:52.420 | and we had a great year and I'm proud of the team,
00:04:54.260 | or whatever, but there's a lot we always can do better.
00:04:58.540 | And then I try to look ahead, not at January,
00:05:02.000 | but I try to put myself in the shoes
00:05:03.780 | of where I think the world will be in December
00:05:06.260 | of the following year, right?
00:05:08.100 | So you really have to get in the habit of being an analyst,
00:05:13.100 | a forecaster, a prognosticator,
00:05:15.300 | thinking about the big trends,
00:05:16.540 | but also thinking about all of the competing things
00:05:18.820 | going on, interest rates, inflation, et cetera, the backdrop.
00:05:23.220 | And so we go through that exercise.
00:05:25.020 | I journal to myself and I make everybody on the team
00:05:27.740 | do this independently, Bill.
00:05:29.540 | And then we get together,
00:05:30.460 | and this started at the beginning of December,
00:05:32.880 | but certainly informs how we're positioned
00:05:35.900 | at the start of the year.
00:05:37.140 | What's top of mind, what's top of mind?
00:05:38.740 | What are the big things you're thinking about,
00:05:40.940 | the big blocks?
00:05:42.600 | There are a lot of exciting things
00:05:44.220 | that can cause you to believe
00:05:45.540 | that this is the golden age, right?
00:05:47.540 | That 2025 could just be a really phenomenal year.
00:05:51.620 | Lower taxes, lower regulations,
00:05:54.740 | GDP growth is strong, employment looks good.
00:05:58.340 | And at the same time, we have these mega trends
00:06:00.620 | that have been percolating for a decade
00:06:02.500 | that really seem to be bearing fruit, right?
00:06:05.220 | Obviously AI being the biggest mega trend,
00:06:08.660 | but ancillary things like robotics and self-driving cars
00:06:12.700 | and all these things that are yielding
00:06:14.380 | productivity improvements to the economy,
00:06:16.700 | which as we know is a huge driver of GDP.
00:06:20.540 | So there's a lot to be positive about.
00:06:23.580 | On the other side, you gotta look around the world
00:06:26.640 | and you say, well, we got a situation in the Middle East,
00:06:30.440 | could that get better or worse?
00:06:31.660 | The situation with China, better or worse?
00:06:34.020 | Situation in Ukraine, better or worse?
00:06:35.980 | Those things can break both ways,
00:06:37.940 | but I could make an argument how all those things get better.
00:06:41.420 | And so I think on the one hand,
00:06:44.700 | there's a lot of enthusiasm.
00:06:46.720 | At the same time, valuations, Bill, are quite high
00:06:50.140 | relative to where we started '23
00:06:52.700 | and relative to where we started '24.
00:06:54.940 | So the world assumes that things are going to be better.
00:06:59.140 | And as we look at it, I kept asking the team,
00:07:02.780 | what is the boogeyman?
00:07:03.900 | Like, what's the thing that we're not thinking about
00:07:06.180 | that could go wrong here?
00:07:07.460 | And if you look at the start of this year,
00:07:09.420 | I would say that the boogeyman that's out there,
00:07:11.580 | you and I have a lot of smart friends
00:07:14.780 | and some of them have been shorting the U.S. tenure,
00:07:18.300 | that is they expect rates to go up.
00:07:21.060 | And I really think that it's this inflation
00:07:24.480 | and higher rates combined with higher valuations
00:07:29.480 | that could put a damper on at least the market
00:07:33.460 | in the first part of this year.
00:07:35.340 | And again, this is just like one of the potential outcomes,
00:07:38.020 | but we've seen the tenure go up almost 100 basis points
00:07:42.340 | and people didn't expect that.
00:07:44.420 | They thought the Fed was gonna be cutting rates
00:07:46.120 | and the tenure would come down
00:07:47.340 | and that would lead to more housing sales,
00:07:49.660 | more sales of a lot of things,
00:07:51.140 | be a tailwind for the economy.
00:07:52.460 | The exact opposite has happened.
00:07:54.020 | And so I think we have to explore why is that happening?
00:07:56.900 | Where do we expect that to go?
00:07:58.800 | So that would, I guess, be the overall framework.
00:08:01.620 | - Let's dive in on the positive side
00:08:04.700 | and specifically the enthusiasm around AI.
00:08:08.340 | I would share with you, like just as an observant
00:08:12.460 | rather than as a participant,
00:08:14.260 | when I listen to you talk,
00:08:16.000 | when I just listened to the short interview
00:08:19.140 | that Elon did at CES,
00:08:22.340 | I'm hearing a level of enthusiasm
00:08:27.000 | that feels somewhat unprecedented to me,
00:08:29.420 | like over the past three decades
00:08:31.980 | in terms of just how much excitement there is
00:08:36.620 | about what's possible
00:08:37.740 | and how much investment's going into it.
00:08:39.900 | Why wouldn't that just be,
00:08:41.600 | well, first of all, do you agree with that?
00:08:43.100 | And then second, how do you interpret it
00:08:45.300 | from an investor point of view?
00:08:47.320 | - Well, certainly there's enthusiasm,
00:08:50.540 | but I would say there's plenty, a wall of worry as well.
00:08:53.300 | For every conversation I have with people in Silicon Valley
00:08:56.480 | who say they're gonna invest more in compute,
00:08:58.320 | this is the year of agentic AI
00:09:00.500 | that we may see AGI in the next 18 months.
00:09:02.820 | I have another call maybe with a big fund in New York
00:09:06.500 | who says, listen, this is gonna lead
00:09:08.620 | to the telecom bust of 2000.
00:09:10.820 | This is Cisco all over again.
00:09:12.340 | There's a bubble, et cetera.
00:09:13.700 | And so there are the words that everybody says, Bill,
00:09:16.780 | and the byproduct of those words
00:09:19.340 | is the valuations for companies.
00:09:21.560 | So maybe the place to start
00:09:23.100 | is just like looking at where the valuations are
00:09:27.060 | for big tech as we start the year.
00:09:29.600 | And when I look at that,
00:09:31.820 | the S&P 500 recently peaked at about 22 times.
00:09:36.820 | So in Q4 '21, it was about 22 times earnings.
00:09:42.260 | It troughed in '22 at 16 times, and now we're at 23 times.
00:09:47.540 | So the S&P 500 as a whole is actually,
00:09:52.040 | like it's near its kind of recent peak valuation.
00:09:56.800 | If you look at companies like Meta,
00:09:58.380 | they're trading at 23 times.
00:10:01.160 | If you look at Google, trading about 21 times.
00:10:04.180 | If you look at NVIDIA, it peaked at 66 times.
00:10:08.020 | It's now at 36 times consensus estimates.
00:10:11.140 | And I would tell you closer to 28 or 30 times our estimates.
00:10:15.580 | And so I look at the valuations for these businesses, Bill.
00:10:18.660 | But growing at a rate that's unprecedented
00:10:21.220 | for a company of this size.
00:10:22.900 | Correct. Unprecedented.
00:10:24.540 | Correct. And so like, and we'll put all these charts in,
00:10:28.560 | but at the highest level, I look at that and I say,
00:10:31.580 | is that a bubble, right?
00:10:33.320 | Does this feel like a bubble?
00:10:34.480 | I say, no, it's not as good a deal
00:10:37.080 | as you got at the start of '23.
00:10:39.280 | And it's not as good a deal as you got at the start of '24.
00:10:42.360 | But if we achieve the growth expectations
00:10:45.040 | we expect out of these businesses,
00:10:46.880 | then I don't think it's overly onerous.
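The multiple math in this exchange is easy to sanity-check. Below is a minimal sketch, using entirely hypothetical price and EPS figures (not anyone's actual estimates), of why the same share price can screen at 36 times consensus EPS but closer to 29 times a more bullish forecast:

```python
# Hypothetical numbers only: the same share price divided by two different
# forward-EPS estimates produces two different forward P/E multiples.
price = 140.00        # assumed share price
consensus_eps = 3.89  # assumed consensus forward EPS
in_house_eps = 4.83   # assumed (more bullish) in-house forward EPS

consensus_pe = price / consensus_eps
in_house_pe = price / in_house_eps

print(f"P/E on consensus estimates: {consensus_pe:.0f}x")  # 36x
print(f"P/E on in-house estimates:  {in_house_pe:.0f}x")   # 29x
```

The multiple compresses purely because the higher EPS estimate sits in the denominator; nothing about the price changes.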
00:10:49.560 | But the question really is, and you brought up,
00:10:51.920 | like, what does the market expect
00:10:53.700 | in terms of earnings growth?
00:10:54.980 | Like first, what was earnings growth last year?
00:10:57.960 | And then what does the market expect
00:10:59.540 | for earnings growth this year?
00:11:01.580 | So here's a chart that shows the mega cap earnings growth
00:11:05.340 | broken out from the rest of the S&P 500.
00:11:08.960 | And in 2024, the big five, so in this case,
00:11:13.960 | Microsoft, Meta, Nvidia, Amazon, and Google
00:11:17.340 | grew earnings at 44% in 2024.
00:11:21.460 | Stocks were up a lot,
00:11:22.620 | but Bill, earnings were up tremendously, 44% is extraordinary.
00:11:26.620 | If you look at the S&P 500 as a whole,
00:11:29.220 | because of the tailwind of the mag five,
00:11:32.860 | the S&P 500 as a whole was up 9%.
00:11:36.400 | If you look at the other 495 companies in the S&P,
00:11:40.300 | their earnings only grew 2%, right?
00:11:43.580 | So it was pretty anemic earnings growth
00:11:46.020 | in the S&P 500 in 2024 relative to the mag five.
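Those three growth figures pin down a fourth number: the Mag Five's implied share of index earnings. A back-of-envelope sketch (the 44%, 2%, and 9% come from the discussion; the algebra is the only addition):

```python
# If w is the Mag Five share of S&P 500 earnings, the blended growth rate
# satisfies: w * 0.44 + (1 - w) * 0.02 = 0.09. Solve for w.
mag5_growth = 0.44   # Mag Five earnings growth in 2024
rest_growth = 0.02   # other 495 companies
index_growth = 0.09  # S&P 500 as a whole

w = (index_growth - rest_growth) / (mag5_growth - rest_growth)
print(f"Implied Mag Five share of index earnings: {w:.1%}")  # 16.7%
```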
00:11:50.100 | If you look at it now, what's expected,
00:11:52.300 | what the consensus forecast is for 2025,
00:11:56.860 | we see the mag five's earnings growth
00:11:59.340 | comes from 44% down to 21%.
00:12:02.380 | How much of that has already started, Brad?
00:12:04.540 | Do you know what Q4 was?
00:12:06.140 | That deceleration, I'm just curious.
00:12:09.580 | There's certainly a deceleration in these businesses.
00:12:11.460 | Remember, they're coming off of easy comps,
00:12:13.700 | and by the time you get to 2025, it's very hard comps,
00:12:17.060 | and still growing 20% on the massive, massive
00:12:21.080 | earnings and revenue bases of these companies.
00:12:23.580 | Pretty extraordinary.
00:12:24.780 | Five years ago, nobody thought
00:12:26.060 | they would be growing that fast.
00:12:27.940 | But the interesting thing here that kind of worries me
00:12:30.740 | a little bit about the market, Bill,
00:12:32.700 | is that we expect non-tech earnings
00:12:36.060 | to grow from 2% to 11%.
00:12:38.300 | So a big acceleration in non-tech earnings.
00:12:42.260 | And so I asked myself, well, where does everybody think
00:12:45.180 | this is gonna come from?
00:12:46.260 | And the next slide is the answer to that.
00:12:49.340 | So the yellow shaded box shows you
00:12:53.020 | that people expect healthcare earnings
00:12:55.220 | to go from plus 4% to plus 20%.
00:12:58.300 | They expect industrials to go from negative 4%
00:13:01.100 | earnings growth to plus 16.
00:13:03.380 | Materials to go from negative 10 to plus 17.
00:13:06.500 | Those are huge turnarounds, right?
00:13:08.900 | Huge accelerations in earnings.
00:13:10.860 | Now, a big component of that is associated
00:13:13.500 | with the tax cuts, but I think that only speaks
00:13:15.700 | for about 30% of that bump in earnings growth.
00:13:19.700 | So if you said to me again, Brad,
00:13:22.380 | where is there a potential boogeyman?
00:13:24.020 | Well, if those earnings aren't delivered,
00:13:26.740 | if you don't see this acceleration in non-tech earnings
00:13:29.660 | in the S&P 500, then it's hard for me to see
00:13:32.980 | how you can have a big year.
00:13:34.580 | I don't think we're gonna see a year
00:13:35.940 | that looks anything like 2023 or 2024 anyway.
00:13:39.620 | But if you just said, can we get to 10 or 15% on the market,
00:13:44.740 | a couple of things have to happen.
00:13:46.140 | You have to see this acceleration in earnings, number one.
00:13:49.660 | And number two, you really have to see interest rates
00:13:52.620 | not go above 5%.
00:13:54.020 | I think if interest rates go up a lot,
00:13:56.420 | that becomes a big albatross on market performance in 2025.
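One stylized way to see why rates above 5% would act as an albatross on multiples is a Gordon-growth approximation, where a fair P/E is roughly 1 / (r - g) for discount rate r and long-run growth g. This is an illustration, not a model anyone in the conversation references, and every input below is an assumption:

```python
# Stylized Gordon-growth sketch: higher risk-free rates raise the discount
# rate r, which shrinks the fair multiple 1 / (r - g). All inputs assumed.
def fair_pe(risk_free: float, equity_premium: float, growth: float) -> float:
    r = risk_free + equity_premium  # total discount rate
    return 1.0 / (r - growth)

# 10-year at 4% vs 5%, holding a 4% equity premium and 4.5% growth constant:
print(f"P/E with rates at 4%: {fair_pe(0.04, 0.04, 0.045):.1f}x")  # ~28.6x
print(f"P/E with rates at 5%: {fair_pe(0.05, 0.04, 0.045):.1f}x")  # ~22.2x
```

A one-point move in the 10-year knocks several turns off the fair multiple in this toy setup, which is the mechanism behind the worry about valuations and rates compounding each other.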
00:14:01.420 | Let's come back to the interest rate thing.
00:14:04.660 | I wanna stick with the large cap companies for a second.
00:14:08.300 | You and I were having a discussion about their CapEx trends.
00:14:12.220 | And this is something that's remarkably different
00:14:16.060 | than any window of past tech investing, right?
00:14:19.540 | This level of CapEx wasn't a part of the equation
00:14:22.660 | other than maybe in a manufacturing company.
00:14:26.140 | So why don't you set this up?
00:14:27.820 | I know you put together a slide,
00:14:29.220 | like what's happening with CapEx in these large companies?
00:14:34.220 | CapEx was growing before the chat GPT moment,
00:14:39.420 | but nothing like we've seen it grow over the last two years
00:14:42.660 | and we're forecasting it to grow for the next couple years.
00:14:45.980 | So this first slide just shows the combined CapEx
00:14:50.020 | of Google, Meta, Amazon, Microsoft, Apple, and Oracle,
00:14:53.260 | both what they did in 2024,
00:14:55.640 | which you can see that huge step up in 2024.
00:14:58.820 | Yeah, from about 160 billion to 260-ish.
00:15:03.820 | Exactly, and Bill, right?
00:15:07.940 | It consumed the vast majority
00:15:10.020 | of the incremental free cashflow of those businesses, right?
00:15:12.900 | So they clearly are all betting and believe
00:15:17.900 | that those are really NPV positive investments
00:15:21.940 | and we can dig into that.
00:15:24.220 | So you heard Satya say, I think they are expected,
00:15:28.900 | in their most recent quarter,
00:15:30.020 | they did about 20 billion in CapEx, Bill.
00:15:32.820 | And you just, you know, he said earlier in the year,
00:15:34.940 | which caused, or a couple of days ago,
00:15:36.440 | which caused a bit of a kerfuffle
00:15:37.980 | that they expected to spend 80 billion in CapEx this year.
00:15:41.300 | Which is the run rate, basically.
00:15:42.540 | Yeah, it's just the run rate that he was on.
00:15:45.100 | But it is, when people hear the number,
00:15:47.660 | it's still pretty shocking.
00:15:48.660 | Yeah, on the CES interview, Elon goes,
00:15:51.780 | he said, yeah, that's a big number in anyone's universe.
00:15:56.780 | Coming from his point of view, he's like, yeah, that's big.
00:16:00.100 | Right, and so if you go company by company,
00:16:05.260 | the one thing, you know, there's debate.
00:16:06.960 | A lot of people were debating,
00:16:08.080 | we talked about it with Dylan in December,
00:16:10.000 | will people actually spend this money, right?
00:16:12.680 | Like, will they actually buy more compute in 2025?
00:16:16.320 | And I think what we can put to bed,
00:16:18.520 | based upon the conversations we've had,
00:16:20.940 | what people are hearing out there,
00:16:22.400 | the conversations at CES, there are going to,
00:16:25.680 | these investments are gonna show up in 2025, right?
00:16:29.980 | And the reason they're showing up, I believe,
00:16:32.880 | is that people are still seeing a lot of returns on training.
00:16:37.300 | We'll get into the scaling laws,
00:16:38.780 | these different scaling laws that they're building for.
00:16:42.180 | So whether that's pre-training, whether that's post-training,
00:16:45.300 | whether that's test time scaling,
00:16:47.100 | they're investing in making the models better.
00:16:48.920 | But the other one, which is coming on really strong,
00:16:51.820 | remember, Jensen in our pod with him.
00:16:54.500 | 40% of your revenue today is inference.
00:16:56.940 | But inference is about ready,
00:16:58.620 | because of chain of reasoning, right?
00:17:02.460 | It's about ready.
00:17:03.300 | It's about to go up by a billion times.
00:17:04.400 | Right, by a million X, by a billion X.
00:17:07.860 | That's right.
00:17:08.700 | That's the part that most people
00:17:10.220 | haven't completely internalized.
00:17:13.620 | This is that industry we were talking about,
00:17:15.500 | but this is the industrial revolution.
00:17:17.840 | That's the production of intelligence.
00:17:21.020 | That's right.
00:17:21.860 | It's gonna go up a billion times.
00:17:24.460 | He expected inference to go up by 100,000 times,
00:17:28.180 | a million times, maybe even a billion times.
00:17:30.460 | So inference is scaling very, very quickly,
00:17:33.820 | and that's all new compute that's gotta get built out
00:17:36.820 | to support that inference.
00:17:39.540 | Looking at this chart that you put together,
00:17:41.980 | you've got, it looks like Meta and Microsoft
00:17:45.020 | popping above 25% of revenue on CapEx.
00:17:49.100 | You've got Google and Amazon kind of in the middle
00:17:53.460 | in this 10 to 15% range.
00:17:55.460 | And then Apple, ironically, is falling below 5%.
00:18:00.340 | Which makes sense, right?
00:18:01.260 | Because they're not,
00:18:02.700 | Apple is not investing in frontier models.
00:18:05.660 | And then the other thing is those same top two
00:18:09.620 | are bouncing up against 100% of free cash flow, right?
00:18:13.340 | Which is certainly a moment of pause, right?
00:18:17.620 | So how do you frame either of those?
00:18:20.340 | How do you think about whether percent of revenue
00:18:23.660 | should matter?
00:18:24.500 | Is there a limit that's too high?
00:18:26.500 | And then we'll do free cash flow.
00:18:28.460 | I will tell you, this is a very, very robust debate.
00:18:33.460 | You know, I was out in Omaha in December,
00:18:36.420 | and I'll tell you, they're pretty skeptical
00:18:38.620 | about the amount of dollars.
00:18:40.540 | Define who they are.
00:18:41.780 | Yeah, you know, they being the biggest investor in Omaha.
00:18:45.300 | Yeah, okay.
00:18:46.140 | I think that they're worried,
00:18:47.140 | like a lot of other investors
00:18:48.700 | who've kind of seen these moments,
00:18:50.860 | that this is like the telecom buildup, right?
00:18:54.780 | I will tell you, though,
00:18:56.620 | I have a lot of respect for Elon, for Satya, for Sundar,
00:19:00.620 | for the people who are making these investment decisions.
00:19:04.420 | And the numbers are there.
00:19:06.140 | You know, as Satya told us on the pod,
00:19:07.900 | they have 10 billion in inference revenue,
00:19:10.380 | and I think he expects that to grow significantly.
00:19:13.700 | And that is the ROI, right,
00:19:16.740 | on the dollars that he's putting into the ground.
00:19:19.580 | But they are making a bet, Bill, right?
00:19:22.020 | And you can make a bet for one of two reasons.
00:19:23.980 | You can make an offensive bet
00:19:25.860 | that this will drive future revenue growth
00:19:28.340 | or profit growth in my business,
00:19:29.620 | or it could be a defensive bet, right?
00:19:31.060 | The prisoner's dilemma.
00:19:32.380 | Like, I can't not do this
00:19:33.780 | because my competitors are doing it.
00:19:35.500 | And I think that's where some of these concerns emanate from.
00:19:39.140 | But it's very clear to me
00:19:41.380 | that part of the reason the multiples on these businesses
00:19:45.780 | have not, you know, kind of gotten even higher
00:19:50.780 | is this wall of worry.
00:19:52.820 | They're spending a lot of money
00:19:54.100 | and they may not see the return.
00:19:55.700 | So are there any rules of thumb?
00:19:57.500 | Like, does 25 or 30,
00:20:00.740 | like, does it matter to you as an investor
00:20:03.220 | what that percentage of revenue is?
00:20:05.180 | Of course, of course.
00:20:07.300 | I mean, like, listen, you and I both know
00:20:09.260 | when we're investing in a startup,
00:20:10.500 | we want, you know, here's the irony, right?
00:20:12.900 | Over the last 10 years,
00:20:13.860 | people celebrated raising these bigger and bigger
00:20:15.860 | and bigger rounds.
00:20:16.940 | And you and I would look at each other quizzically
00:20:18.820 | and we'd say, you know, people lost the script.
00:20:21.700 | The goal is the least amount of money in
00:20:23.740 | for the most amount of money out.
00:20:25.620 | Not the most in for the most out.
00:20:27.940 | And I would say today, the thing that's very clear to me
00:20:32.620 | is that over the last decade,
00:20:34.380 | we've had super high returns,
00:20:37.260 | super high returns on the incremental capital
00:20:40.500 | that got invested.
00:20:41.740 | And there's no doubt in the short run here
00:20:44.860 | that those returns are compressing.
00:20:46.900 | So you have to believe as an investor,
00:20:48.780 | you have to have an imagination to believe, right?
00:20:51.460 | Just like when Google was investing gobs in early 2000s
00:20:55.340 | or when Amazon was investing gobs in '08, '09, 2010 in AWS,
00:21:00.340 | you have to believe that there is a pot of gold
00:21:04.060 | at the end of this rainbow.
00:21:05.300 | I happen to believe it, right?
00:21:06.940 | I see the revenue growth inside some of these businesses,
00:21:10.460 | the inference revenue growth inside these businesses.
00:21:12.980 | And when you replace human labor, you know,
00:21:15.140 | Elon said on this recent interview,
00:21:18.020 | I think it was with Bill Miller at CES,
00:21:20.020 | it was a virtual interview.
00:21:21.940 | I mean, AI really within the next few years
00:21:23.700 | will be able to do any cognitive task.
00:21:26.700 | You're like, it obviously begs the question,
00:21:28.180 | what are we all going to do?
00:21:29.660 | You know, I'd say max three or four years, maximum.
00:21:34.660 | And Elon said he thinks that every cognitive task
00:21:38.340 | that can be done by a human
00:21:40.020 | will be able to be done by an AI within three to four years.
00:21:44.100 | If you believe that to be true,
00:21:46.180 | the value of all that human labor that you're replacing
00:21:49.900 | is measured in trillions.
00:21:52.380 | And I think these are things that, you know,
00:21:55.340 | most of these CEOs have determined,
00:21:57.300 | even though that makes them a little bit uncomfortable.
00:21:59.500 | You said it makes the CFOs talk in a little bit higher pitch
00:22:02.980 | and all that's true, but I think they've all determined
00:22:05.620 | they can't not make this level of investment
00:22:09.100 | in something that has the very high potential
00:22:11.540 | to be this big.
00:22:12.380 | When I look at the free cash flow, you know,
00:22:15.300 | certainly it causes you to step back and say,
00:22:19.220 | the past decade has seen unprecedented
00:22:24.220 | free cash flow from the Mag Seven, right?
00:22:28.140 | And the amount of cash that's been put
00:22:30.980 | on their balance sheet was so unprecedented.
00:22:35.140 | And, you know, everyone would talk about it
00:22:37.820 | and like, what are they going to do with all this money?
00:22:39.980 | And yeah, to move from that to a place
00:22:42.940 | where a hundred percent of incremental free cash flow
00:22:45.340 | is now spoken for is a certainly a change, right?
00:22:49.020 | And my friend, Michael Mauboussin, would get mad at me
00:22:53.420 | for worrying about it because he would say
00:22:55.740 | all that matters is not that they're using up
00:22:58.220 | their free cash flow, but what's the return
00:23:00.220 | on the incremental investment.
00:23:02.220 | Of course, of course.
00:23:03.940 | That's super hard to understand.
00:23:06.380 | That's the great puzzle.
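That "return on the incremental investment" test can be sketched with made-up numbers: incremental ROIC is the change in after-tax operating profit (NOPAT) divided by the change in invested capital between two periods, and it is independent of how much free cash flow the spending consumes:

```python
# Incremental ROIC = delta NOPAT / delta invested capital.
# All dollar figures below are hypothetical, chosen only for illustration.
def incremental_roic(nopat_then, nopat_now, capital_then, capital_now):
    return (nopat_now - nopat_then) / (capital_now - capital_then)

# Hypothetical hyperscaler: +$30B of NOPAT on +$100B of added capital (CapEx).
r = incremental_roic(nopat_then=80e9, nopat_now=110e9,
                     capital_then=400e9, capital_now=500e9)
print(f"Incremental ROIC: {r:.0%}")  # 30%
```

On these assumed inputs, $100B of added capital earning an extra $30B of NOPAT is a 30% incremental return, which would comfortably justify the spend; the hard part, as the conversation notes, is estimating those deltas in advance during a phase shift.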
00:23:07.460 | And that, this is the thing where I think that
00:23:09.900 | in these early parts of a phase shift,
00:23:12.220 | in the first three to four years of a phase shift, right?
00:23:15.260 | It's really hard for the traditional Wall Street analysts
00:23:18.540 | to model this.
00:23:20.220 | Remember, I give this quote, in 2023,
00:23:25.220 | you had 26 analysts on Wall Street covering NVIDIA.
00:23:29.460 | And the consensus forecast of all 26 of them
00:23:33.540 | missed by 80%, okay?
00:23:37.220 | I mean, it's just like, that's how wrong you can be
00:23:39.900 | if you're thinking linearly at a moment of a phase shift.
00:23:44.140 | And so I think there, we're still in that moment.
00:23:47.460 | And that's why I think it's an advantage
00:23:48.820 | to be in Silicon Valley,
00:23:49.860 | because you're really spending all your time
00:23:52.700 | with the people who are actually doing these things.
00:23:54.700 | To get more exposed, yeah.
00:23:55.740 | Right, and seeing what they're actually building.
00:23:58.140 | You know, once it's well-known, think about,
00:24:00.180 | by 2016, the cloud was well-known,
00:24:02.660 | and so everybody could model it reasonably well.
00:24:05.820 | But in 2012, 2013, right, when you're earlier on
00:24:09.580 | in that phase shift, there was a real opportunity early
00:24:12.420 | in those companies, Snowflake, Mongo, Okta, et cetera,
00:24:15.060 | to do things that people thought were not possible
00:24:17.380 | at that point in time.
00:24:18.420 | Let's spend a second on this,
00:24:19.940 | what you call the prisoner's dilemma argument.
00:24:22.380 | And I guess you might include
00:24:25.380 | just the hyperscalers at this point.
00:24:27.580 | And there's, by the way, let me ask a quick aside.
00:24:32.500 | There has been talk that Meta
00:24:35.620 | has been hiring an enterprise group
00:24:37.820 | and is structuring new deals, you know,
00:24:41.740 | around higher-end versions of LLAMA
00:24:45.140 | or unlimited versions of LLAMA.
00:24:46.940 | Do you have a perspective?
00:24:48.020 | Are they entering the enterprise hyperscaler business?
00:24:52.620 | You know, I have no evidence that they are.
00:24:55.660 | I, you know, I take Mark at his words last year
00:24:59.180 | in one of his podcasts where he said
00:25:01.940 | they already charge license fees to the large hyperscalers
00:25:05.700 | to use LLAMA and to provide it to them
00:25:07.700 | in certain ways, et cetera.
00:25:09.180 | I think it's de minimis revenue in the scheme of Meta,
00:25:12.380 | but, you know, he's smart enough in that-
00:25:13.220 | I've heard rumors in the billions
00:25:15.540 | and people have found that there are searches,
00:25:18.780 | you know, on, you know, for executives to come in
00:25:22.860 | that have certain backgrounds.
00:25:24.300 | I don't know.
00:25:25.260 | No, no, no, that is in fact occurring.
00:25:27.740 | But again, you could have revenues
00:25:29.340 | of a couple of billion dollars,
00:25:30.420 | but in the scheme of Meta,
00:25:31.300 | that's still not all that material.
00:25:33.340 | But they're smart enough to maintain the optionality bill.
00:25:36.620 | I mean, you saw NVIDIA release some models,
00:25:38.420 | which is the packaging of Llama models
00:25:40.940 | to make them easier for some customers of NVIDIA.
00:25:43.580 | So it would only make sense to me that, you know,
00:25:47.460 | that they're thinking about that.
00:25:48.860 | Let's include them for the sake of this question
00:25:51.580 | and argument.
00:25:52.420 | So let's say there's four hyperscalers
00:25:54.540 | and this is the prisoner's dilemma thing.
00:25:56.660 | Every one of them is kind of being forced
00:26:01.420 | to announce their CapEx.
00:26:03.420 | And if any of them-
00:26:05.820 | Not being forced,
00:26:06.660 | they do it as a matter of their earnings call.
00:26:08.860 | So they're all gonna give us their CapEx.
00:26:10.820 | Sure.
00:26:11.660 | And so they, but my point is,
00:26:14.500 | it's become this point of focus.
00:26:17.180 | Everyone's paying attention to it.
00:26:18.580 | Oh, for sure.
00:26:19.420 | And so, you know,
00:26:21.380 | if Satya were to say 60 next year instead of 80,
00:26:26.020 | does he have a concern that that's a perception
00:26:28.940 | that they don't believe as much
00:26:30.820 | or that they're losing ground to other people
00:26:33.860 | that are taking share?
00:26:35.500 | And therefore, you get into this kind of reflexive argument.
00:26:39.540 | Like if we want to be perceived as leaders in AI
00:26:44.380 | with confidence that we're gonna take our fair share,
00:26:47.900 | don't we have to be announcing
00:26:51.300 | that we're betting via CapEx?
00:26:55.420 | Yeah, I mean, it would be fairly conspiratorial
00:26:58.580 | to think that they're doing this
00:27:01.020 | just so that the optics are that they're leaders.
00:27:04.540 | What I will tell you is Satya has said
00:27:07.340 | that they have been compute constrained
00:27:10.620 | for all of last year
00:27:12.140 | and their inference revenues are going to accelerate
00:27:14.620 | in the first half of 25
00:27:16.460 | because they will have less constraint.
00:27:19.100 | OpenAI has said publicly, Sam has said many times
00:27:22.500 | that the reason that they can't release models widely,
00:27:25.700 | they can't release some of the voice stuff that you wanted,
00:27:28.220 | the reason they couldn't widely release Sora
00:27:30.340 | is because they were compute constrained.
00:27:32.260 | They didn't have the compute
00:27:33.460 | they needed to support these models.
00:27:35.700 | Part of the reason they charge higher prices for pro models
00:27:39.580 | is because they have to artificially reduce the demand
00:27:42.420 | because they don't have the compute.
00:27:44.300 | And so I think across the board, there's compute constraint.
00:27:47.140 | Now that occurs for two reasons, Bill.
00:27:49.180 | Number one, you have over 300 million people a week
00:27:52.500 | who've decided that I want to use ChatGPT
00:27:54.900 | instead of search or something else to answer my question.
00:27:57.700 | So the usage is bigger than people expected.
00:28:00.100 | On the other hand, these new models,
00:28:02.380 | inference time compute is a compute hog, right?
00:28:06.020 | These things require huge amounts of compute.
00:28:09.180 | And so those two things combined,
00:28:11.260 | we just don't have a compute infrastructure in 2024
00:28:14.380 | that kept up with it.
00:28:15.300 | And so I think the investments we're seeing in 2025,
00:28:18.340 | frankly, I don't even think is going to get us to the point
00:28:21.620 | where we have compute surplus.
00:28:23.060 | I think we're going to end this year, compute constrained.
00:28:26.180 | And I think a lot of the frontier labs feel the same.
00:28:28.900 | By the way, one quick aside on that.
00:28:31.780 | You've seen Google release a high-end pay-for consumer,
00:28:36.780 | you know, by consumer, I mean just an individual user
00:28:43.820 | 'cause it could be a consumer at a company as well,
00:28:46.860 | but one with a price point on it.
00:28:48.900 | And when Elon was talking about their release of Grok 2
00:28:53.620 | and potentially Grok 3,
00:28:54.900 | they said Grok 2 is always going to be free.
00:28:57.220 | I infer from that Grok 3 might be pay-for
00:29:00.620 | and this to me is a positive outcome for OpenAI
00:29:05.620 | 'cause I think when people frame the fight
00:29:08.660 | as a competition with search,
00:29:10.580 | they assume that the frontier is going to be free
00:29:14.020 | and not pay-for.
00:29:15.340 | If multiple people fall in line behind them
00:29:19.420 | with subscription pricing, that could start to get sticky.
00:29:23.500 | Well, it's going to be very, very difficult
00:29:26.780 | for any frontier lab to invest at the level
00:29:29.260 | that's going to be required
00:29:31.060 | and not have a robust business model behind it.
00:29:33.500 | What I would tell you when, you know,
00:29:35.900 | and without, you know, saying anything
00:29:38.780 | other than what's been said publicly
00:29:40.220 | about OpenAI's revenues, right?
00:29:42.740 | You know, they ended the year,
00:29:43.860 | I think it was rumored four to five billion run rate
00:29:46.180 | growing very robustly.
00:29:48.500 | You know, you would expect a company at this stage
00:29:51.820 | to be growing at least triple digits.
00:29:54.020 | And so you start thinking about 10 billion plus in revenue
00:29:57.340 | for this company.
00:29:58.340 | Now, compare that to some of the other companies, Bill,
00:30:00.540 | right, Mistral, a lot of the other model companies
00:30:03.300 | have fully pivoted.
00:30:04.420 | Like they've already raised their hand, white flag,
00:30:06.900 | we don't have the revenues that can support the spend
00:30:09.220 | to keep up, right?
00:30:10.780 | It's going to be interesting to see what Anthropic does,
00:30:13.100 | right?
00:30:13.940 | Their revenues are reported to be less than a billion dollars
00:30:16.580 | so can they afford-
00:30:17.420 | And mostly on the enterprise API side,
00:30:19.500 | not in this consumer-
00:30:20.900 | Correct.
00:30:21.740 | So can they afford to keep up?
00:30:22.980 | And I would tell you that as large as Google is
00:30:25.260 | and as large as X is,
00:30:28.060 | I don't think that you can go out and spend
00:30:30.660 | and not have a business model to support the spend.
00:30:33.060 | So what you're hearing would make sense to me.
00:30:35.820 | One other thing I just want to say about this, Bill,
00:30:38.940 | one of the things we've spent a lot of time on internally
00:30:41.580 | is trying to really model out what we think
00:30:43.980 | of that compute demand relative to the compute supply.
00:30:47.260 | Like, are we really constrained?
00:30:49.420 | And I want to keep coming back to this point
00:30:52.020 | that Jensen made and maybe we'll insert the clip here
00:30:54.460 | on the pod where he talks about his inference revenue
00:30:59.340 | is already 50% of his revenue.
00:31:01.860 | So already upwards of 50% of his revenue.
00:31:03.900 | And I ask him, is the mix going to go up?
00:31:06.060 | And he said, well, Brad, of course,
00:31:07.980 | because I think inference is going to go up
00:31:10.820 | by a million X or a billion X.
00:31:12.740 | Now what drives that, Bill?
00:31:14.820 | I think you've heard a lot of people say
00:31:16.860 | that 2025 is going to be the year of agents, right?
00:31:20.940 | And so you have these O-series models
00:31:24.020 | and now you have deep reasoning out of Google
00:31:26.420 | that launch this whole different vector
00:31:30.300 | of scaling intelligence and reasoning.
00:31:33.180 | But the thing about those, Bill, as you well know,
00:31:35.580 | is they are compute hogs, right?
00:31:37.740 | The number of tokens that you have to produce,
00:31:39.900 | that you have to repopulate back into the prompt,
00:31:42.500 | the number of branches of inference that you go down,
00:31:46.220 | it dwarfs the single-shot kind of ChatGPT query.
00:31:50.420 | And now you add in personalization,
00:31:54.020 | long-term memory about each individual user,
00:31:57.940 | and actions, book my hotel, book my, do things for me.
00:32:02.020 | Again, those things are all compute consumptive.
00:32:04.620 | So that's why I think the frontier labs like ChatGPT
00:32:08.020 | are modeling out that expected compute demand
00:32:11.180 | and then looking at what they have
00:32:13.100 | and saying, we're not even close, right?
00:32:16.060 | We're not even close.
00:32:17.900 | I do want to come back to that,
00:32:19.180 | but I want to finish the CapEx thing real quick.
00:32:22.780 | Obviously, one way to look at,
00:32:26.100 | we're looking at a slide, Google, Meta, Amazon,
00:32:28.460 | Microsoft, Apple, and they're all spending this money.
00:32:31.540 | And you can say, what does it mean for those stocks?
00:32:34.380 | But I'm sure as an investor,
00:32:37.060 | the easier thing to consider is
00:32:39.580 | these people are telling us, forecasting
00:32:42.780 | that they're going to spend this amount of money.
00:32:45.660 | And the processes for spending that amount of money
00:32:49.020 | are sticky and slow.
00:32:50.380 | They're not fast.
00:32:51.380 | Like you can't back out.
00:32:52.740 | Like you pre-commit and you go build.
00:32:55.860 | Who are the recipients of this stuff?
00:32:58.660 | Like that's an easy win.
00:33:01.740 | It's a great question.
00:33:04.620 | So we started the conversation like,
00:33:06.140 | how do we think about the framework for the year?
00:33:08.540 | What I didn't say at the start is,
00:33:11.180 | we started 2023 with 95% net long.
00:33:16.300 | So think of that bill as all your chips on the table.
00:33:18.940 | So that's our longs minus our shorts.
00:33:22.140 | We started last year, I think around 80% net long.
00:33:25.580 | Now that can go up and go down,
00:33:27.260 | but it's less chips on the table.
00:33:28.780 | And you might say, well, Brad,
00:33:30.060 | if you think it's the golden age,
00:33:31.300 | and if you think all this money's being spent,
00:33:32.940 | then why the hell do you have fewer chips on the table?
00:33:34.780 | And I just told you, valuations are a lot higher.
00:33:37.820 | So let's start there.
00:33:39.500 | Warren Buffett has told us the most important thing,
00:33:41.380 | Charlie Munger, the most important thing in investing
00:33:43.220 | is the price of entry.
00:33:44.780 | So the price of entry to play is higher.
00:33:47.260 | The variant perception is lower.
00:33:48.980 | When I look at Q1,
00:33:50.940 | when it comes to the MAG 5 or MAG 6,
00:33:54.660 | what I say is there's gonna be a lot of FX headwind, okay?
00:33:57.660 | So the strength of the dollar has gone up a lot.
00:33:59.820 | A lot of these companies have revenues
00:34:01.580 | denominated in non-US dollars.
00:34:03.740 | So their revenues will have headwinds
00:34:05.500 | from that FX exchange.
00:34:06.820 | I'm not sure that's totally appreciated.
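The FX mechanics here are worth making concrete. A minimal sketch, with hypothetical numbers (the revenue mix and the size of the dollar move below are illustrative assumptions, not figures from the conversation):

```python
# Illustrative FX-headwind math: a company booking part of its revenue
# in non-USD currencies sees a stronger dollar shave a few points off
# reported USD revenue even with zero change in underlying demand.

foreign_share = 0.40      # hypothetical share of revenue in non-USD currencies
usd_strengthening = 0.08  # hypothetical rise in the dollar vs those currencies

# Foreign revenue translates into fewer dollars; domestic revenue is untouched.
reported = (1 - foreign_share) + foreign_share / (1 + usd_strengthening)
headwind = 1 - reported
print(f"reported-revenue headwind: {headwind:.1%}")
```

The point is that translation alone can cost a few points of reported growth, which is why the FX headwind matters for companies with heavy overseas revenue.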
00:34:08.860 | And then secondly, their CapEx is going up a lot.
00:34:12.220 | I think Dylan said on our podcast,
00:34:14.820 | and I agree with him,
00:34:16.380 | that the CapEx is higher than people think, okay?
00:34:19.260 | So both of those things aren't particularly good
00:34:23.500 | for the biggest companies trading at high valuations.
00:34:26.220 | Now, I don't think it's gonna cause some cataclysmic event,
00:34:28.980 | but it's a headwind, okay?
00:34:30.460 | But who are the recipients?
00:34:32.220 | Who are the winners?
00:34:33.060 | Obviously, NVIDIA is the biggest one,
00:34:34.260 | but who else is a recipient here?
00:34:36.740 | Yeah, well, first I wanna show a slide on NVIDIA
00:34:38.900 | because I hear a lot and I get a lot.
00:34:41.140 | We owned it since it was $120 a share.
00:34:46.420 | Today, split-adjusted, it's $1,500 a share.
00:34:46.420 | So this, you know, or $1,450.
00:34:49.140 | It's gone up a lot.
00:34:50.460 | But people say, oh my God,
00:34:51.860 | I remember when it went up two X
00:34:53.300 | and Jim Cramer was telling everybody, sell it,
00:34:55.060 | it's up two X, and then three X, sell it, it's up.
00:34:57.740 | Every time you have to recalibrate
00:34:59.780 | based upon the facts on the field, right?
00:35:02.180 | It went up a lot because its earnings went up a lot.
00:35:04.300 | It went up a lot because its revenues went up a lot.
00:35:06.260 | So here's a slide, Bill,
00:35:07.860 | that just shows the consensus data center revenues,
00:35:12.180 | right, for NVIDIA.
00:35:14.780 | And the key here is to remember that NVIDIA prior,
00:35:19.420 | you know, data center revenues are mostly, you know,
00:35:22.140 | these chips that are driving training and inference.
00:35:24.860 | And this is a business that's got very high margins on it.
00:35:28.220 | And in 2023, it was $61 billion, right?
00:35:31.780 | In 2025, it's estimated to be close to $200 billion.
00:35:36.540 | And so if you notice the jump between '24 and '25,
00:35:39.820 | again, this is consensus,
00:35:41.140 | that is about a $60 billion increase, '24 to '25.
00:35:44.620 | Well, if you just add up the hyperscalers
00:35:47.220 | that we just went through, that's $60 billion, right?
00:35:50.380 | And where else are they spending the money?
00:35:51.900 | Yes, they're spending some on custom ASICs,
00:35:53.900 | but it's tiny on a relative basis.
00:35:56.380 | Most of this stuff is NVIDIA wall-to-wall in 2025.
00:36:00.460 | You know, Jensen told you at CES, he's got 40 AI factories.
00:36:04.340 | So he doesn't have six people in the world
00:36:05.980 | building this stuff.
00:36:07.100 | He's got lots of people building this stuff.
00:36:08.980 | So I actually think if you look at the consensus numbers,
00:36:11.900 | they probably would imply that NVIDIA's market share
00:36:15.420 | is going down '24 to '25, and I don't think that to be true.
00:36:18.780 | But so NVIDIA is one of--
00:36:20.340 | Comparing expectations for NVIDIA
00:36:22.220 | with this pre-committed CapEx spend.
00:36:24.860 | Correct, correct.
00:36:25.700 | And your expectation of what percentage of that--
00:36:27.660 | Correct, and remember, NVIDIA's stock has not gone up
00:36:29.780 | from June of last year until now.
00:36:32.180 | It's basically, you know,
00:36:33.220 | because this debate's been going on.
00:36:35.100 | You know, we had a lot of debates about,
00:36:37.260 | have we hit a wall on pre-training scaling?
00:36:39.180 | Can the models continue to scale?
00:36:40.780 | We had the DeepSeek model come out.
00:36:42.460 | It was a smaller model, very capable.
00:36:44.420 | So people say, maybe you don't need all these chips.
00:36:46.540 | So like, there is real tension in the world
00:36:49.580 | about this name.
00:36:52.260 | Now--
00:36:53.100 | Well, I think there's also, I mean,
00:36:54.380 | another one that comes up is if we're moving
00:36:56.940 | from more spend on inference than pre-training,
00:37:01.780 | do you need this big, large, holistic cluster?
00:37:04.820 | Can you use a more distributed approach?
00:37:07.380 | Would you use products that aren't GPUs,
00:37:11.300 | you know, TPUs from Google's,
00:37:13.180 | some of the startups as well?
00:37:15.940 | Of course, they can't ramp to anywhere near this scale
00:37:19.340 | anytime soon, so--
00:37:20.500 | Right, and Google, I think, has said,
00:37:22.460 | you know, I'm pretty sure it's said publicly
00:37:24.060 | that, you know, they're spending a lot on GPUs,
00:37:26.780 | not just TPUs.
00:37:28.820 | So, you know, but you asked a question,
00:37:31.860 | who are the recipients?
00:37:33.100 | Well, NVIDIA's clearly, you know, high on that list.
00:37:37.020 | Now, remember, NVIDIA was up, I don't know,
00:37:38.660 | 140, 150% last year, but Intel was down over 20%,
00:37:42.900 | AMD, I think, was down on the year,
00:37:45.220 | and so it was not winner, you know,
00:37:47.140 | this was not a situation where everybody
00:37:48.780 | in the semiconductor complex won.
00:37:50.780 | But another group, you know,
00:37:52.180 | I know I share this feeling as well
00:37:53.860 | with my friend Gavin Baker, you know,
00:37:56.220 | we're also investors in the memory space.
00:37:58.180 | So SK Hynix, which is providing high-bandwidth memory,
00:38:02.300 | this becomes very important,
00:38:03.660 | particularly in a world of inference time compute.
00:38:06.460 | And we think that we have a memory shortage,
00:38:09.060 | which is a key part of the NVIDIA
00:38:12.140 | supercomputer ecosystem, as far as we can see.
00:38:16.180 | So Micron and SK Hynix would be in that memory space.
00:38:20.100 | So what, pause on SK, for example.
00:38:23.460 | So when you look at companies like NVIDIA, you know,
00:38:28.460 | or others that are doing well in this environment,
00:38:31.740 | they have multiples that, you know,
00:38:34.020 | would suggest everyone understands that and buys into it.
00:38:37.740 | If I'm reading the numbers correctly,
00:38:39.900 | SK trades at like six times forward earnings,
00:38:44.540 | which is insanely low for most companies.
00:38:48.660 | What's going on here?
00:38:50.340 | Do people not believe the number?
00:38:52.420 | Or is this the commodity, you know,
00:38:55.380 | argument that happens with hard drives,
00:38:58.060 | where you buy it with the highest multiple
00:39:01.340 | and sell it with the lowest multiple?
00:39:02.740 | Yeah, no, I think the reason it has that multiple
00:39:07.100 | is because people do think it's hard drives,
00:39:09.340 | it's commodity memory, that it's a boom and a bust,
00:39:12.060 | and that they don't add a lot of kind of sticky value
00:39:16.100 | that you can just easily put on a lot more supply.
00:39:19.180 | That supply will ultimately drive down the cost
00:39:22.420 | and the margins, and that demand is cyclical.
00:39:26.260 | We believe those things are not true.
00:39:28.660 | We think this has become a way,
00:39:29.980 | way more sophisticated product,
00:39:31.420 | lots of software baked into it,
00:39:33.380 | that this is secular, not cyclical.
00:39:35.500 | And so there are two ways to win there.
00:39:37.380 | Number one is just,
00:39:38.420 | you get the earnings growth of the business,
00:39:40.100 | but the second one is you could have a re-rating higher
00:39:42.620 | in terms of the multiple people are willing to pay.
00:39:44.460 | People come to believe that.
00:39:45.420 | If that SK gets treated more
00:39:47.980 | as a non-commodity company over time.
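Those two ways to win compound multiplicatively, which a quick sketch makes clear. Only the roughly six-times forward multiple comes from the conversation; the earnings growth and re-rated multiple below are hypothetical illustrations, not forecasts:

```python
# Illustrative "two ways to win": total return = earnings growth x
# multiple re-rating. Entry multiple is roughly where the pod says
# SK Hynix trades; the other two inputs are hypothetical.

entry_pe = 6.0     # ~6x forward earnings, per the conversation
eps_growth = 0.30  # hypothetical earnings growth
rerated_pe = 10.0  # hypothetical multiple if the "secular, not cyclical" view wins

price_return = (1 + eps_growth) * (rerated_pe / entry_pe) - 1
print(f"total return: {price_return:.0%}")
```

With these inputs the re-rating contributes more than the earnings growth itself, which is why a multiple this low is interesting if the commodity framing is wrong.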
00:39:50.340 | You know, and then of course, you know,
00:39:52.580 | you and I talked a lot around the Diablo episode last year,
00:39:55.860 | just, you know, we have power issues in this country, right?
00:39:59.180 | And so who, how are we going to power up, you know,
00:40:02.660 | this five, 10, 15 gigs of data center
00:40:06.540 | that we need to bring online?
00:40:07.700 | Because remember, if they're spending this money, Bill,
00:40:10.220 | then they're telling you that they are,
00:40:12.820 | like, you have to have power,
00:40:14.460 | you have to have a shell,
00:40:15.780 | and you have to fill it with chips.
00:40:17.780 | And so, you know, one of the things-
00:40:20.380 | And I continue to have people tell me,
00:40:22.460 | like, I know they'll say like,
00:40:24.300 | I know this is hard to believe,
00:40:25.980 | but we are power limited.
00:40:27.980 | Like, we are power limited.
00:40:29.380 | We have CapEx that we would put online
00:40:32.740 | that we're not putting online 'cause we can't power.
00:40:35.860 | I will tell you this.
00:40:37.460 | I had a call this week with the COO of PG&E
00:40:41.020 | and with the head of Diablo Canyon.
00:40:43.300 | And we, I'm very focused in talking to our friend,
00:40:47.460 | David Sachs and others in this administration.
00:40:49.820 | We have to give regulatory relief.
00:40:52.540 | Otherwise, we're going to have huge hurdles
00:40:56.340 | placed in front of our AI industry.
00:40:58.940 | We know China's building 100X as much as we are
00:41:02.060 | in terms of power,
00:41:03.140 | and power is the single most important primitive to AI.
00:41:06.380 | So I'll just give you one example.
00:41:08.180 | Gavin Newsom still has not signed the extension
00:41:10.900 | for Diablo Canyon beyond 2029.
00:41:13.940 | It's insane.
00:41:15.260 | We have to make that happen this year.
00:41:17.420 | It's 10% of California's green, clean power.
00:41:21.700 | We have to extend that
00:41:23.580 | so that they can begin doing the appropriate planning.
00:41:26.380 | Hell, you and I think they should probably be expanding it.
00:41:28.940 | But at a minimum,
00:41:29.940 | the idea that you would take 10% out of the grid at a time
00:41:33.660 | that we are already capacity constrained is totally insane.
00:41:37.460 | Let me ask you this.
00:41:38.580 | So Altimeter's historically been tech-focused.
00:41:42.780 | Do you, when your information leads you this direction,
00:41:46.620 | do you start moving in and out of energy companies?
00:41:51.540 | Well, I'll tell you, a lot of my tech peers have, right?
00:41:54.340 | And if you look at some of the highest returning companies
00:41:57.340 | in what I would call
00:41:58.340 | the semiconductor-related complex last year,
00:42:01.900 | they were in fact companies in the energy space.
00:42:04.260 | And we looked at all of those.
00:42:05.660 | Ultimately, I think one of the overarching
00:42:09.260 | Altimeter North Stars is essentialism, right?
00:42:12.900 | Like just don't make anything any more complex
00:42:16.580 | than it has to be.
00:42:17.620 | And so I would always ask my analysts
00:42:19.540 | when they would bring me a power company,
00:42:21.300 | I would say, why is that better than NVIDIA, right?
00:42:24.260 | It's a derivative of NVIDIA, it's the exact same bet.
00:42:27.460 | So why not just buy more NVIDIA?
00:42:29.140 | So we tend to just take bigger bets on our best ideas
00:42:33.780 | rather than diversifying out into these other things
00:42:36.420 | that we know less about.
00:42:37.860 | Because the truth of the matter is,
00:42:39.220 | I do know that we need a lot more power, Bill,
00:42:42.100 | but I don't necessarily know
00:42:43.980 | exactly how the regulatory is going to play out,
00:42:47.020 | exactly how, whether these small nuclear reactors,
00:42:49.780 | all this other stuff.
00:42:50.740 | This falls into the Buffett comment,
00:42:52.980 | there's a fool in every market,
00:42:54.460 | and if you don't know who it is, it might be you.
00:42:57.020 | (laughing)
00:42:58.580 | You know, one of the things we ought to come back to,
00:43:00.460 | though, Bill, is because you said,
00:43:02.220 | what could the boogeyman be?
00:43:03.420 | And I said, interest rates.
00:43:05.340 | So maybe just spend a few minutes
00:43:08.140 | double-clicking on rates, what's going on?
00:43:10.500 | Why are rates going up?
00:43:11.940 | And how do I think that may or may not change?
00:43:16.940 | But if you look at this chart,
00:43:20.460 | the black line on here shows the 10-year,
00:43:22.900 | just over the course of the last couple of months,
00:43:24.540 | has been trending up and is now around 4.7%.
00:43:29.540 | This, despite the fact that inflation has largely,
00:43:33.580 | you know, like, remember, just a couple years ago,
00:43:36.260 | we had a nine handle on headline inflation.
00:43:38.900 | And I remember when I said, a couple years from now,
00:43:40.980 | we'll be back to a two handle, people laughed.
00:43:43.380 | They said, no way.
00:43:44.620 | And in fact, that's where we were.
00:43:46.060 | And if you look at the Morgan Stanley consensus forecast
00:43:49.420 | for this year, they expect it to finish the year at 2.2.
00:43:52.500 | Okay?
00:43:53.340 | But you have to ask the question,
00:43:55.340 | why then are people concerned, right?
00:43:58.540 | What's going on?
00:43:59.380 | Why are rates going up?
00:44:00.500 | And I think there are a few things at play here.
00:44:03.220 | Number one, the Trump election got people really excited
00:44:08.220 | about more growth in the economy.
00:44:11.460 | And you're gonna have 400 billion of stimulus
00:44:14.580 | by way of, on an annual basis,
00:44:17.940 | if the Trump tax cuts are passed.
00:44:20.300 | That's a lot of stimulus into the economy.
00:44:23.060 | Regulatory relief will stimulate the economy.
00:44:25.500 | And so what's that stimulus going to do?
00:44:28.420 | A lot of people are concerned it could reignite inflation.
00:44:31.780 | And they're just saying,
00:44:32.780 | there's a chance that inflation goes higher
00:44:34.980 | because we have all this stimulus.
00:44:36.420 | And if it does, rates are not gonna be able to go lower.
00:44:38.860 | So if you look at the next slide, Bill,
00:44:40.540 | we show the expectation in the world now
00:44:43.420 | has gone from a lot of rate cuts to only one rate cut
00:44:47.180 | in the first half of next year, right?
00:44:49.740 | So basically the market's saying that there's too much,
00:44:53.020 | that there's a lot of stimulus
00:44:54.300 | and we're not gonna get rate cuts.
00:44:56.340 | So I think in the first part of this year,
00:44:58.380 | there's gonna be a lot of tension back to that core PCE.
00:45:01.700 | Is inflation continuing to roll over?
00:45:04.380 | We think that a lot of the components
00:45:06.380 | like rent equivalent and shelter, et cetera,
00:45:09.220 | will cause it to continue to go down.
00:45:11.640 | But the other thing going on here,
00:45:13.100 | and this is something I know you and I are close to,
00:45:16.340 | you know, and care a lot about with DOGE,
00:45:18.820 | the market is saying we expect this stimulus to come in,
00:45:24.300 | but I think it's also saying we do not expect Congress
00:45:27.120 | to have the courage to cut an equivalent amount
00:45:30.500 | out of the deficit, right?
00:45:32.460 | So if you think about this,
00:45:33.500 | if we have $400 billion of tailwind from this tax stimulus,
00:45:36.740 | then I think what the market would like to see
00:45:38.820 | is three or $400 billion of cutting
00:45:42.300 | come out of the budget, right?
00:45:44.900 | And when you look at what Congress has done
00:45:47.120 | over the last, I don't know, five, 10 years,
00:45:49.900 | there's no evidence, we've never done it.
00:45:51.960 | So, you know, the market's just saying,
00:45:53.780 | I'll believe it when I see it.
00:45:56.040 | Trump has talked about interest rates being too high.
00:45:58.740 | He talked about it at the press conference the other day.
00:46:00.920 | He wants them to go lower.
00:46:02.100 | If he talks to his, you know,
00:46:03.660 | chief economists in the White House,
00:46:05.260 | I think they'll say, these are the issues that are at play,
00:46:07.660 | so we really have to make these cuts.
00:46:09.580 | Well, we have a reconciliation package.
00:46:11.700 | So all of this is gonna get determined.
00:46:13.520 | They've told us, in a single reconciliation package,
00:46:16.900 | by April or May of this year.
00:46:19.300 | So this is what I'm telling you.
00:46:20.580 | This is the backdrop that is going to impact valuations
00:46:24.700 | that we have to grapple with, we have to try to understand.
00:46:27.700 | So here's my forecast.
00:46:30.740 | My forecast is that DOGE is actually,
00:46:33.900 | and this administration,
00:46:35.660 | are actually going to cut a lot out of the budget.
00:46:39.380 | And, you know, I think that you have the leadership
00:46:41.340 | in the House and the Senate.
00:46:43.760 | And so if they cut hundreds of billions of dollars
00:46:46.640 | out of this budget,
00:46:47.760 | then I think the market will say, great, fiscal discipline,
00:46:52.300 | we don't have as much stimulus,
00:46:53.940 | and I think you'll see the 10-year come in.
00:46:56.220 | And if the 10-year comes in,
00:46:57.560 | then I think it could be off to the races.
00:46:59.760 | But if it doesn't, if we do the opposite,
00:47:02.600 | if we don't make the cuts,
00:47:03.680 | if the 10-year goes to 5.25 or 5.5, Bill,
00:47:06.600 | it's gonna be an anchor on the stock market,
00:47:08.440 | it's gonna be an anchor on the economy.
00:47:10.120 | So I actually did a little analysis
00:47:12.980 | I shared with you on this
00:47:15.020 | that I wanna review with folks on the pod
00:47:20.020 | because given how important I just said
00:47:22.820 | that I think this is,
00:47:24.340 | I think it's important to try to get our arms around
00:47:27.460 | and understanding, is it possible?
00:47:29.500 | Is it even possible to cut three or $400 billion
00:47:32.660 | annually out of spending?
00:47:34.140 | And so here's this federal spending chart.
00:47:36.360 | And what we did here, Bill,
00:47:39.040 | we went back to 2019,
00:47:41.060 | and you can see the total spending
00:47:42.700 | by the federal government in 2019 was $4.4 trillion.
00:47:46.880 | And you can see the subcomponents.
00:47:50.100 | Social Security, we spent a trillion dollars,
00:47:52.620 | Medicare, 644 billion, Medicaid, 409 billion.
00:47:57.620 | You can see the different categories under that,
00:47:59.660 | defense, 676, et cetera.
00:48:01.780 | And you can see what our net interest expense was
00:48:03.840 | at the time because interest rates were relatively low,
00:48:06.660 | right, and our debt was a lot lower.
00:48:09.380 | The next column is what we actually spent in 2024.
00:48:14.380 | So this is our best estimate.
00:48:15.940 | The CBO has given us an estimate.
00:48:17.460 | You can see in the footnotes here.
00:48:19.120 | So total spending was 6.7 trillion, okay?
00:48:23.300 | Well, the question is,
00:48:24.140 | is that what we would have expected?
00:48:26.060 | Not what we had expected.
00:48:27.600 | The next column, what we did, Bill,
00:48:29.420 | is what we call a fiscal year '24 baseline.
00:48:32.880 | How did we determine that baseline?
00:48:34.260 | We went back to 2019 and we grew every category
00:48:38.220 | roughly at two and a half percent.
00:48:40.620 | We said the GDP's growing at two and a half percent,
00:48:43.220 | inflation's growing roughly that rate,
00:48:45.260 | the population's growing at that rate,
00:48:46.900 | so that's what government spending should roughly grow at.
00:48:49.860 | And if you see there, the baseline 2019 budget
00:48:54.320 | adjusted two and a half percent CAGR.
00:48:56.180 | We should have spent five trillion,
00:48:58.120 | but instead we spent 6.7 trillion.
00:49:00.860 | That's a $1.7 trillion differential.
00:49:04.700 | So when you hear DOGE talk about the ability
00:49:07.500 | to find $2 trillion in savings,
00:49:11.060 | this is, I think, really what has people optimistic,
00:49:16.060 | that there's opportunity to make cuts.
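The baseline arithmetic above can be sanity-checked in a few lines, using the rounded figures cited on the pod ($4.4 trillion of spending in 2019, a 2.5% annual growth assumption, $6.7 trillion actual in fiscal 2024):

```python
# Sanity check of the fiscal-baseline arithmetic from the conversation.
# All inputs are the rounded figures cited on the pod.

spend_2019 = 4.4  # trillions of dollars, total federal spending in 2019
growth = 0.025    # assumed annual growth (GDP / inflation / population proxy)
years = 5         # 2019 -> 2024

baseline_2024 = spend_2019 * (1 + growth) ** years
actual_2024 = 6.7
gap = actual_2024 - baseline_2024

print(f"baseline FY24: ${baseline_2024:.1f}T")  # ~$5.0T
print(f"gap vs actual: ${gap:.1f}T")            # ~$1.7T
```

Compounding $4.4 trillion at 2.5% for five years lands at roughly $5 trillion, so the $1.7 trillion differential against actual spending is just the gap to that trend line.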
00:49:18.660 | So you go through there,
00:49:20.340 | and I think there are some interesting ways
00:49:23.340 | in which we can go after those savings.
00:49:26.260 | There are some proposals, and I'll post this proposal,
00:49:29.340 | which goes through and says,
00:49:30.580 | here's $700 billion of easy deficit reduction
00:49:34.100 | that both Democrats and Republicans agree on over 10 years.
00:49:37.420 | So that doesn't get us there, but it is certainly a start.
00:49:40.860 | But I think this is gonna be critically important
00:49:45.580 | that we get our arms around federal spending this year,
00:49:49.260 | because that, to me,
00:49:50.100 | is the biggest potential boogeyman out there.
00:49:52.500 | And there are ways that Trump can potentially
00:49:54.300 | do some of this unilaterally, through rescission,
00:49:57.520 | challenging the Impoundment Control Act,
00:49:59.820 | and just refusing to spend what Congress allocates.
00:50:03.660 | But given that the Republicans
00:50:05.020 | control both houses of Congress,
00:50:07.540 | I think that, you know, I'm expecting, and I hope,
00:50:11.260 | that we see cuts that at least offset the tax cuts.
00:50:14.500 | Yeah.
00:50:15.340 | I mean, one of the challenges, obviously,
00:50:17.380 | is some of this spend, you know,
00:50:19.180 | if you take healthcare, for example,
00:50:21.140 | those categories on a per-population basis
00:50:25.400 | have been growing way above
00:50:27.300 | your 2.5% baseline assumption,
00:50:30.720 | and expanding as a percentage
00:50:32.760 | of the overall spend for the government.
00:50:35.260 | And so it may require you to attack that specific problem
00:50:40.260 | within that industry,
00:50:42.740 | not just, you know, say we're gonna spend less.
00:50:46.620 | Listen, no doubt about it.
00:50:48.140 | There is a global demographic problem
00:50:51.140 | that you and I, you can look at any population chart
00:50:54.220 | of any country on the planet.
00:50:56.100 | The number of people over the age of 60
00:50:58.540 | is increasing at a very fast rate,
00:51:00.780 | and the number of new people coming into the workforce
00:51:04.320 | is slowing down as a total percentage.
00:51:07.680 | That means that your working population
00:51:09.920 | that is paying taxes to pay for all of these things
00:51:12.800 | is a much smaller percentage.
00:51:14.280 | So demographically, that is going to be challenged,
00:51:17.160 | but, you know, so how do you deal with it, right?
00:51:21.000 | Well, you can either just give less things to the people
00:51:24.960 | than what you've promised in the past,
00:51:27.520 | or you can come up with more efficient delivery mechanisms.
00:51:31.020 | Obviously, we know all of the easy and crazy (beep)
00:51:33.860 | the government spends money on
00:51:35.380 | that DOGE has been talking about that we can cut.
00:51:38.140 | My assumption is we're also going to have
00:51:40.700 | to harness technology, harness AI,
00:51:43.800 | defense spending's gotta get a lot smarter,
00:51:47.100 | and you really have to go through every category.
00:51:49.900 | And if you zero base this, I tend to be where,
00:51:53.220 | I think there are huge opportunities here,
00:51:56.340 | but listen, the track record of Congress,
00:51:58.780 | both Republicans and Democrats,
00:52:00.540 | have not been good at cutting these costs.
00:52:03.220 | And what I would say is if we don't do it,
00:52:06.240 | then there's a real risk that the bond vigilantes
00:52:09.540 | come into the bond market, you end up with higher rates,
00:52:12.700 | because they don't believe
00:52:13.900 | that we're on a stable fiscal path.
00:52:15.500 | Look, I mean, I think it's an assumption
00:52:19.800 | in American life that the government's inefficient.
00:52:22.620 | Like, I don't think there's anyone that,
00:52:25.340 | anywhere that would stand up and say,
00:52:27.780 | we think our government is great
00:52:29.420 | at execution and spending dollars.
00:52:31.460 | There's no one that defends it.
00:52:33.540 | They might argue that it has to exist regardless,
00:52:37.100 | but they don't argue that it's efficient.
00:52:39.320 | So there's plenty of opportunity.
00:52:41.340 | Elon said it's like shooting fish in a barrel
00:52:44.380 | if I'm being asked to look at inefficiency
00:52:46.420 | in the government.
00:52:47.240 | And so there will be opportunity.
00:52:49.560 | I think there's an even bigger opportunity,
00:52:51.860 | which he also hinted at, which is,
00:52:53.900 | if you find policy that's restrictive and unnecessary,
00:52:58.900 | you could unlock GDP growth,
00:53:01.300 | which helps in this equation as well.
00:53:03.940 | And I totally agree with you that the inhibitor here
00:53:07.700 | won't be the identification or knowing what to do,
00:53:12.540 | it'll be whether the system can actually reform itself.
00:53:16.960 | And we'll have to find out.
00:53:18.820 | I mean, like, you know, we'll put in here the charts
00:53:20.940 | that we showed last year
00:53:22.760 | on how you get to a balanced budget in 2029.
00:53:25.740 | But it is just not that hard to get to a balanced budget
00:53:29.060 | at $6 trillion, right?
00:53:31.100 | That's $1 trillion above what we should be spending
00:53:35.540 | on the baseline today, right?
00:53:37.420 | But it just gets you back to that 2019 baseline, right?
00:53:41.020 | And it assumes that we have some acceleration,
00:53:43.940 | but not a lot in terms of growth.
00:53:46.000 | And we are gonna get acceleration, right?
00:53:48.280 | Cutting taxes three to $400 billion a year
00:53:50.580 | is going to accelerate it.
00:53:51.980 | We are going to get efficiency
00:53:53.740 | and productivity gains from AI.
00:53:55.460 | Like these things are happening.
00:53:57.020 | So we've got a lot of goodness.
00:53:59.220 | And by the way, relative to Europe,
00:54:01.020 | which is a basket case and a disaster,
00:54:03.460 | like the US is in such an incredible position.
00:54:07.220 | We just have to have the courage of our conviction
00:54:09.420 | and get rid of some of this wastefulness.
00:54:11.220 | That's the only boogeyman I see out there.
00:54:13.340 | But Bill, there's one other thing, you know,
00:54:14.700 | like we've talked a lot about spending.
00:54:16.380 | Let's talk for a second about regulation,
00:54:18.060 | because I know there's an AI bill brewing down in Texas,
00:54:22.260 | akin to California's SB 1047 that we got killed.
00:54:26.940 | So maybe just, you know, as again,
00:54:30.020 | we got a lot of changing politics.
00:54:32.060 | I suspect that Trump is gonna rescind all,
00:54:35.340 | or at least part, the major part,
00:54:37.540 | of the Biden executive order on AI.
00:54:41.500 | But what's going on in Texas around AI?
00:54:44.820 | Well, you and I have talked about this in the past,
00:54:48.020 | and there had been a huge movement underway.
00:54:52.780 | And I would call it an unusual movement
00:54:54.780 | when there's a new market evolving,
00:54:56.580 | where people are literally begging for regulation.
00:54:59.780 | And that got a bunch of different parties up on both sides,
00:55:04.620 | that, you know, part of those,
00:55:06.500 | people believe part of those efforts led to the Biden EO.
00:55:09.940 | And they also led to this big push in California,
00:55:13.860 | which we talked about.
00:55:15.420 | What was it?
00:55:16.260 | 1092, that everyone got on one side or the other.
00:55:20.940 | Everyone in our community had a point of view.
00:55:24.060 | You had Scott Weiner as the person
00:55:27.220 | inside the administration in California pushing for that.
00:55:31.220 | And it ended up with this huge, you know, argument,
00:55:35.020 | and then Gavin Newsom vetoing it.
00:55:36.940 | And I suspect that most people see the administration change,
00:55:42.060 | or our friend David Sacks coming in as the AI czar,
00:55:46.500 | and they would expect that this is kind of,
00:55:49.300 | at least for now, put to bed.
00:55:51.260 | What's happening, and people are probably not aware of,
00:55:54.100 | is that the people pushing for this regulation
00:55:58.180 | have moved underground to a certain extent,
00:56:00.660 | and they're pushing it in a bunch of different states.
00:56:04.460 | I've heard there's as many as 25 now,
00:56:07.020 | state-by-state initiatives.
00:56:08.660 | And I guess this is how policy works in this country.
00:56:12.540 | But there's a bunch of reasons
00:56:14.740 | to be really worried about this.
00:56:17.380 | I can't stand that the one
00:56:19.620 | that's got kind of the most heat right now is in Texas.
00:56:22.700 | I think Governor Abbott, and obviously Elon,
00:56:26.700 | and all his companies have had this massive impact
00:56:29.980 | on the Texas economy by making it the state
00:56:32.880 | that has the least red tape,
00:56:34.580 | that's kind of the most pro-innovation and pro-business.
00:56:39.420 | And for this thing to pop up there
00:56:42.180 | is just so ironic in my mind,
00:56:44.160 | and would be literally brain dead.
00:56:46.220 | When 1092 was being proposed, 1047, sorry,
00:56:49.900 | people, even including myself,
00:56:51.900 | said, "Why don't you just write a sign
00:56:53.980 | that says move your AI company to Texas?"
00:56:56.940 | Well, now those words sound stupid, because Texas,
00:57:00.460 | and the first thing that people should realize
00:57:04.700 | is there's no reason to do this on a state-by-state basis.
00:57:08.300 | At the end of Obama's term,
00:57:11.380 | he had this initiative that he didn't get around to,
00:57:14.020 | I wish he had, where he had identified
00:57:16.620 | like 100 different industries
00:57:18.260 | where there's state-by-state regulation,
00:57:20.500 | including like haircare and stuff.
00:57:22.780 | And it's just wasteful red tape.
00:57:26.460 | And for us, when so many people in Washington
00:57:30.540 | are worried about our competitive position versus China,
00:57:35.100 | to build a gauntlet that our companies
00:57:40.100 | would have to move through to adhere
00:57:42.480 | to state-by-state interpretations and rules
00:57:45.940 | is just mind-numbingly,
00:57:48.060 | you know, I wanna say the word stupid out loud,
00:57:52.380 | and the only reason I fear saying it
00:57:54.540 | is that like someone trying to write that legislation
00:57:57.660 | would use it against me.
00:57:58.680 | But like, I can't imagine,
00:58:00.740 | like we have a changing administration,
00:58:03.260 | we have people looking at this.
00:58:05.000 | If we feel we just have to do something,
00:58:07.660 | God, please, you know,
00:58:08.780 | I prefer it at a national level to a state-by-state.
00:58:10.540 | Yeah, no, federal preemption.
00:58:12.160 | I mean, like, listen,
00:58:13.340 | this is the Interstate Commerce Clause,
00:58:15.220 | like there's nothing,
00:58:17.420 | our great advantage over Europe is, you know,
00:58:19.460 | they got this crazy patchwork of regulation
00:58:21.780 | across all these different countries.
00:58:23.220 | And here, if you're operating an internet company,
00:58:25.420 | you have one rule of the road, right?
00:58:27.500 | And we need to have one rule of the road around AI.
00:58:29.780 | And I think that it should be promulgated
00:58:32.860 | and will be promulgated in the federal level.
00:58:34.540 | And just so people, you know,
00:58:35.660 | I've heard this boogeyman out there
00:58:37.820 | that people are like, oh, everybody in Silicon Valley,
00:58:39.940 | they don't want any regulation.
00:58:41.220 | They're just a bunch of crazy libertarians.
00:58:43.560 | No, what I think you hear is pushback
00:58:48.560 | to uninformed regulation that would slow us down
00:58:53.140 | and cause us to lose an important race to China.
00:58:56.540 | If we care about the race with China,
00:58:59.020 | then the first thing we need to do
00:59:00.580 | is to reduce the impediments
00:59:02.820 | to us running the fastest race we can run, right?
00:59:06.620 | While still caring about AI safety,
00:59:08.860 | while still caring about national security,
00:59:11.500 | while still caring about all of these things.
00:59:13.840 | But if you have this patchwork
00:59:15.900 | where these states are onerously regulating
00:59:18.380 | all of these companies out of the gates,
00:59:20.420 | all you're doing is handing the gold medal to China.
00:59:25.180 | And so I think the unintended consequences are really bad.
00:59:28.160 | I'm quite confident.
00:59:29.000 | Let me tell you about some of the details
00:59:30.980 | and I'll include a link in here.
00:59:33.220 | Dean Ball wrote a really solid analysis
00:59:38.140 | of everything that's wrong with the Texas proposal.
00:59:40.380 | But one of the things that's in there
00:59:42.700 | is that you have to do a risk assessment.
00:59:45.500 | So this is, you know, similar to like creating an audit
00:59:49.940 | and, you know, for those of you who've run companies
00:59:52.720 | who've gone through audits, you know how difficult it is.
00:59:56.540 | But not only would you now have to,
00:59:58.620 | you'd have to publish a risk assessment.
01:00:00.620 | So whatever product I'm working on,
01:00:03.180 | I've now stepped through these hurdles
01:00:05.220 | and wrote these reports.
01:00:07.300 | And then there's liability associated with it
01:00:09.580 | looking backwards.
01:00:10.540 | And so if I do something bad, I'm liable for it.
01:00:14.000 | But I also then get a second look back
01:00:16.500 | and did I run the analysis
01:00:18.560 | where I should have known about the risk that's there.
01:00:21.540 | And it's like, and it applies, you know, very broadly.
01:00:26.340 | I give the people
01:00:28.620 | that are looking for regulatory capture credit
01:00:30.980 | in that I think they saw
01:00:32.860 | that they were losing at the national level
01:00:34.900 | and realized that they could create messy anxiety
01:00:39.420 | at a state level, which, you know,
01:00:43.420 | may actually promote trying to get to a national
01:00:48.440 | output, but man, this would be horrific.
01:00:51.640 | I think they're going to get a lot of pushback.
01:00:53.160 | I'll be very, I would be shocked if this happened in Texas.
01:00:57.600 | I don't think they're, you know,
01:00:58.880 | I think you have a lot of coordination
01:01:00.200 | going on in Washington.
01:01:01.320 | I think it's quite smart.
01:01:03.240 | I think it's on both sides of the aisle.
01:01:05.240 | You had a great report published
01:01:06.760 | by Congressman Jay Obernolte in the House,
01:01:10.200 | who was a co-chair of the AI committee in the House.
01:01:13.680 | The Senate's done some work.
01:01:15.160 | And so hopefully Sacks will get there
01:01:17.120 | and provide a really clear coordinating function
01:01:21.440 | and that we'll have federal preemption
01:01:23.560 | and that we'll get great direction, you know,
01:01:27.040 | out of Congress.
01:01:27.880 | You know, one other thing, Bill,
01:01:28.800 | is I'm just thinking about 2025 predictions, you know,
01:01:31.940 | because there was a lot of, you know,
01:01:33.920 | when we're talking about regulatory capture,
01:01:35.480 | I think there was a lot of fearfulness
01:01:37.560 | and a lot of it directed at open AI that, you know,
01:01:40.880 | these models, these companies with a closed model
01:01:43.760 | were trying to, you know, who had the lead,
01:01:46.200 | lock everything down
01:01:47.440 | before people could catch up with them,
01:01:49.120 | trying to squash what was happening in open source.
01:01:52.560 | I think you're gonna see in 2025,
01:01:54.760 | a lot more companies open source more of their models.
01:01:58.920 | Well, I think we just saw Microsoft
01:02:01.600 | put out some open source models last week,
01:02:03.640 | which is like really surprising.
01:02:06.120 | And so I think that my own sense on this
01:02:09.240 | is that there's a lot more commonality
01:02:14.560 | between all of the major players around AI
01:02:17.440 | than there is division.
01:02:20.000 | And, you know, we'll see.
01:02:23.480 | But that's a, what I'm hearing out of a bunch of them
01:02:26.000 | is that we're gonna see a bunch more open source.
01:02:28.280 | Yeah, but there's two dimensions to this.
01:02:30.600 | So you could be right that the bigger companies
01:02:33.520 | could all agree on a regulatory framework.
01:02:36.480 | There's also the issue of small versus large.
01:02:40.920 | And, you know, there's a lot to pay attention to there.
01:02:45.920 | Before we wrap up, I had like four or five things
01:02:49.640 | that I'd like to bounce off of you
01:02:52.000 | and get your opinion on as I look forward into the year.
01:02:56.160 | The one that I've been thinking about a lot is Google.
01:02:58.320 | And we talked a lot about how the, you know,
01:03:03.280 | how search is at risk.
01:03:04.520 | And clearly for those of us that are doing so many searches
01:03:09.320 | first on ChatGPT or another product like that,
01:03:13.840 | there is a real argument as, you know,
01:03:16.360 | what's the point of Google in the long run
01:03:19.160 | and does it still hold?
01:03:22.080 | On the other side, you know,
01:03:23.880 | when we've looked at them in the past,
01:03:25.760 | we've said, man, they have an incredible number of assets.
01:03:29.120 | And when you look at, you know,
01:03:32.520 | whether it's their email, Gmail product,
01:03:35.400 | and then their docs product,
01:03:37.040 | and then their Slack-like competitive products,
01:03:42.040 | and their Zoom-like competitive products.
01:03:44.520 | And then the fact that they own Android
01:03:47.760 | and this mobile operating system.
01:03:50.920 | I was, you know, surprised to hear Jason on All In
01:03:55.920 | get super excited about Google in this way,
01:03:59.560 | and then announced that he had switched to a Pixel phone.
01:04:02.680 | And I'm thinking to myself, you know,
01:04:05.360 | this could be kind of the perfect AI basket of assets.
01:04:10.320 | And, you know, you look at a company like Glean,
01:04:12.760 | that I know you're in and it's doing well.
01:04:15.520 | You know, you can create Glean for the rest of us
01:04:18.880 | if a company commits to being on the Google stack
01:04:22.560 | and has all these things.
01:04:23.680 | And it becomes a combination that no one else has.
01:04:27.520 | Apple doesn't have it, Microsoft doesn't have it.
01:04:30.360 | It requires insanely great execution to get it all right.
01:04:33.960 | But I find myself, you know,
01:04:35.640 | that if a whole bunch of people said,
01:04:39.240 | man, I want this so bad
01:04:41.960 | that I'm gonna switch from my iPhone and get onto Android,
01:04:45.200 | that would be a real tell.
01:04:47.080 | That'd be something special.
01:04:49.080 | Yeah, well, you know, I'll tell you,
01:04:51.520 | I don't know what it was,
01:04:52.680 | late November, maybe early December.
01:04:54.600 | Before the recent run-up, you know,
01:04:56.080 | I started seeing a bunch of breadcrumbs around Gemini,
01:05:00.720 | around Deep Reasoning, around Notebook LM,
01:05:03.840 | just the cycle time, you know, improving there.
01:05:07.000 | I talked to a lot of companies.
01:05:08.080 | A lot of companies were still seeing growth
01:05:10.760 | in their search volumes, you know, their lead generation.
01:05:14.040 | So online travel companies, et cetera, from Google.
01:05:16.880 | And I started to see a lot more evidence of them
01:05:18.840 | just, you know, getting fit and getting more efficient.
01:05:21.600 | And frankly, I think a lot of credit goes to Sundar.
01:05:24.800 | I think he's starting to get feisty
01:05:26.280 | and see the things that, you know,
01:05:27.800 | that need to get done here.
01:05:29.120 | I would say, you know,
01:05:30.000 | let's keep a few big pieces in perspective.
01:05:32.480 | They are facing the largest innovators dilemma
01:05:34.680 | in the history, and certainly from my perspective,
01:05:37.080 | the history of Silicon Valley.
01:05:38.280 | And I think it's almost impossible
01:05:40.040 | to replace a 99% incremental margin business,
01:05:43.320 | i.e. a monopoly search business with whatever comes next.
01:05:46.360 | Because whatever comes next, they may be important in it,
01:05:49.440 | but they're not going to be 99% monopolies, right?
01:05:52.720 | Like OpenAI is going to be there,
01:05:54.920 | Meta is going to be there, et cetera.
01:05:56.200 | So, and I don't think their margins are going to be the same.
01:05:58.600 | So that's the first thing.
01:06:00.320 | Number two is like, how could I be wrong?
01:06:03.880 | Well, you could envision a world
01:06:05.080 | where the pie grows so damn much, Bill,
01:06:07.600 | that, you know, Gemini and if they were able
01:06:11.640 | to displace Apple on the phone,
01:06:13.120 | then all these things could really replace that.
01:06:15.760 | I think you have plenty of time to wait and see,
01:06:18.840 | you know, to pick up the breadcrumbs
01:06:20.360 | on whether those things are occurring.
01:06:21.800 | But I will tell you this,
01:06:23.120 | I suspected Apple for the first time.
01:06:26.280 | They look at Google's assets and they say,
01:06:28.720 | "Okay, this is orthogonal."
01:06:31.360 | Like they were never worried about Pixel
01:06:33.440 | over the course of the last eight years,
01:06:35.040 | but I imagine to your point,
01:06:37.040 | this is the first time in a long time
01:06:39.560 | that they've worried that Google could appropriately embed
01:06:43.480 | Gemini and deep reasoning and a search, you know,
01:06:47.240 | an assistant on the phone that can book my hotel,
01:06:49.800 | just make it a 10X better product.
01:06:51.400 | I'll tell you another area where they have an advantage,
01:06:54.320 | which is hopefully useful to everyone out there.
01:06:58.280 | Gemini is really good at local
01:07:00.440 | because they have all the local reviews.
01:07:02.680 | So I find myself now if I'm going to a restaurant,
01:07:06.600 | you can say, "What are the three dishes
01:07:09.320 | people have loved the most?
01:07:10.560 | What are the two that you stay away from?"
01:07:13.080 | And get down at a level of detail
01:07:15.160 | that you wouldn't have used before.
01:07:17.800 | And, you know, very valuable.
01:07:20.200 | So that's another asset.
01:07:21.680 | What are they like at Tony Black's?
01:07:24.240 | Well, you have to go do the search.
01:07:26.800 | A second one that I think is super interesting to me
01:07:29.120 | that we're going to see play out first half of this year,
01:07:32.000 | X.AI, Elon's commitment to the largest cluster.
01:07:37.560 | So the concerns about running out of data,
01:07:40.720 | the concerns about maybe there's a parameter limit,
01:07:44.160 | at least against the text data set draws into question,
01:07:48.240 | do you need the largest single cluster?
01:07:53.000 | Grok 3 has been promised, I guess,
01:07:56.720 | in the first half of the year.
01:07:58.360 | It would be super interesting to see
01:07:59.880 | if something comes out of that product
01:08:02.640 | that is competitive with the cutting edge at OpenAI
01:08:06.400 | or maybe above it.
01:08:07.560 | I don't know, but I'm very curious about how that plays out.
01:08:11.000 | I mean, I think there's been a lot of attention
01:08:13.280 | focused on that.
01:08:14.920 | I'm not sure that I would measure X.AI's success
01:08:21.640 | solely on the dimension of, is the Grok 3 model
01:08:24.720 | better than whatever OpenAI's latest is or Gemini?
01:08:28.200 | Because remember, what they're doing really, Bill,
01:08:30.760 | is building larger cluster on pre-training.
01:08:33.720 | So they can do more on pre-training.
01:08:35.080 | But remember, these other models are infused
01:08:36.920 | with post-training and test-time compute, et cetera.
01:08:39.600 | So I suspect that what we should-
01:08:41.760 | Oh, I agree.
01:08:42.600 | I think it's a test of that pre-training argument.
01:08:45.320 | Correct.
01:08:46.160 | Let's say they do achieve something
01:08:48.120 | and come out with something that's better
01:08:49.680 | than everybody else.
01:08:50.600 | So that would be a new data point
01:08:53.960 | relative to where everybody's thinking.
01:08:55.920 | I think, here's the way I think about X and Elon.
01:09:00.200 | He's got three vectors on which he is leveraging AI.
01:09:05.200 | Number one is robotics.
01:09:07.960 | And he recently said,
01:09:09.640 | and I've heard Google and others say this,
01:09:11.920 | that he expects there to be millions of humanoids
01:09:14.680 | live and in the wild by 2028.
01:09:18.240 | Think about that.
01:09:19.080 | That's three years away.
01:09:21.000 | It's really, it's incredible.
01:09:22.520 | And you just, we'll post this video
01:09:24.080 | of this Chinese humanoid that, you know,
01:09:27.320 | somebody said passed the visual Turing test
01:09:29.440 | because humans were confused whether or not it was real.
01:09:32.320 | It was so damn good.
01:09:33.440 | And so he's got that vector.
01:09:35.200 | The second vector he has is autonomy, right?
01:09:37.880 | And I picked up a new Tesla S at the end of last year
01:09:40.160 | with Hardware 4, just so I could be on the latest FSD.
01:09:42.960 | It drove me home from San Francisco last night.
01:09:45.240 | So those are two real world, like not language models.
01:09:48.600 | Now we're talking in the world of bits and, you know,
01:09:51.760 | and atoms.
01:09:53.240 | And then the third one is really around--
01:09:56.000 | By that, you're implying,
01:09:57.520 | just because those are in separate companies,
01:09:59.640 | that X.AI could become the hyperscaler for--
01:10:04.240 | 100%.
01:10:05.120 | I think there's no doubt about it.
01:10:06.360 | It already is, and it will continue to be.
01:10:08.720 | Like he has a keiretsu, as they say in Japan,
01:10:11.760 | of companies that can leverage each other's insights.
01:10:14.400 | And I fully expect that they will.
01:10:16.360 | And X.AI, so his need to build these large clusters
01:10:21.360 | and place these bets is because he's got to support
01:10:26.360 | huge potential businesses, not just building Grok
01:10:30.680 | to compete with ChatGPT,
01:10:32.200 | although I think they'll try to do that too
01:10:33.920 | as part of X, et cetera.
01:10:35.480 | But he has lots of ways that he can leverage it.
01:10:37.360 | And remember, if you build a large coherent cluster
01:10:41.800 | for pre-training and you decide,
01:10:43.320 | okay, we're hitting up against some scaling limits
01:10:45.400 | of pre-training, you can use it
01:10:47.000 | for training reasoning models,
01:10:48.360 | you can use it for post-training,
01:10:49.920 | you can use it for inference.
01:10:52.040 | So it's not like you're wasting this
01:10:54.680 | just by building these larger clusters.
01:10:56.280 | I think what I'm most convicted in
01:10:59.000 | is that these guys are going to spend a lot of money
01:11:02.000 | on GPUs because, as Jensen says,
01:11:04.840 | more of the world is moving to workloads
01:11:08.160 | that demand machine learning and accelerated compute.
01:11:11.200 | Okay, that was number two, Bill.
01:11:13.600 | What's number three?
01:11:15.000 | One last question on number two.
01:11:17.400 | It seems to me that X.AI wants to make the argument
01:11:22.400 | that having their own hardware
01:11:25.520 | is a competitive advantage versus open AI.
01:11:28.320 | Do you buy into that argument?
01:11:30.720 | How do you think that plays out?
01:11:33.160 | Well, I think it's not necessarily owning it or having it,
01:11:37.400 | but it's being more competent.
01:11:39.120 | I do think that the ability
01:11:41.400 | to rapidly stand up compute that works,
01:11:44.800 | that works really well,
01:11:46.040 | that doesn't have as many hiccups in productions, et cetera,
01:11:49.560 | like that's a superpower.
01:11:50.720 | I mean, Jensen's gone into, you know,
01:11:52.680 | talk about this crazy timeline that Elon hit.
01:11:56.560 | I think he's unique in that regard.
01:11:58.800 | I think he has a balance sheet
01:11:59.960 | and an ability to raise global capital, Bill,
01:12:02.640 | that his cost of capital is really low for doing this.
01:12:05.600 | So listen, I'm happy.
01:12:07.600 | I think it's so great for the US ecosystem
01:12:11.280 | that Elon is charging full speed ahead in AI
01:12:15.520 | and keeping Anthropic and Google and OpenAI
01:12:18.480 | and everybody else honest.
01:12:19.520 | It's part of the reason I'm so bullish on Team America
01:12:23.400 | when it comes to AI,
01:12:24.480 | because we've got the smartest people in the world
01:12:26.160 | investing more money than we've ever seen
01:12:28.520 | invested in anything other than perhaps the Apollo project
01:12:31.320 | on an inflation-adjusted basis
01:12:33.640 | to make sure that we're in the lead.
01:12:35.520 | But at the end of the day,
01:12:37.640 | it's a combination of things, right?
01:12:39.400 | You gotta have the best research in the world.
01:12:41.040 | Like you can have the best computers and chips
01:12:43.520 | and biggest cluster in the world,
01:12:44.760 | but if you don't get the architecture right
01:12:46.400 | around your reasoning model and the other guy does,
01:12:48.760 | they're probably gonna win.
01:12:50.360 | All right, number three.
01:12:51.480 | We'll do four and then we'll wrap it up.
01:12:53.960 | Number three, I don't recall in any of the previous waves,
01:12:58.960 | you know, PC wave, client-server wave, mobile wave,
01:13:02.880 | the amount of chatter about co-opetition
01:13:07.880 | that I'm seeing here.
01:13:09.920 | And people took out of our video with Satya
01:13:13.440 | where he said, "Oh, I already have my own models,"
01:13:16.000 | you know, and then they released a few,
01:13:18.600 | you know, implying that they're in some kind of co-opetition
01:13:21.760 | with open AI.
01:13:23.040 | You've heard all of these hyperscalers
01:13:26.240 | talk about doing their own processors of some sort,
01:13:30.440 | you know, both Trainium at Amazon
01:13:32.360 | and then the TPU at Google compete with NVIDIA,
01:13:35.360 | but they're buying NVIDIA's product.
01:13:37.200 | NVIDIA put out models this week,
01:13:39.840 | which competes with some of those customers.
01:13:41.920 | And then they announced this AI-driven PC unit,
01:13:46.920 | which would, you know, presumably compete with Dell,
01:13:51.480 | even though Dell's on stage helping them
01:13:53.600 | with the Elon build.
01:13:54.800 | But I don't recall, you know,
01:13:58.440 | at least I think about the Wintel, you know, world,
01:14:01.760 | which I was on Wall Street covering.
01:14:04.320 | I just felt more like people stayed in their lane
01:14:07.160 | and were thankful to be a part of this broad ecosystem
01:14:10.720 | that was growing.
01:14:11.920 | And I just see so much tension.
01:14:15.320 | Like, it's super interesting to me
01:14:17.480 | as just someone that's watching it,
01:14:20.240 | but I'm curious what your thoughts are.
01:14:22.440 | Yeah, I don't know that I have any strong,
01:14:26.520 | you know, perspectives there.
01:14:29.120 | Maybe a couple of things.
01:14:30.160 | Number one, the amount of capital
01:14:31.640 | that it takes to do these things
01:14:33.640 | is not the domain of venture capitalists.
01:14:35.960 | So if you're a new entrant,
01:14:37.160 | if you're Anthropic or you're OpenAI,
01:14:39.800 | frankly, they were forced into this
01:14:41.320 | because there was no other choice.
01:14:43.520 | When they realized the amount of capital
01:14:46.320 | was going to be, you know, a nuclear level of capital,
01:14:49.960 | you just had to turn to these big companies.
01:14:51.920 | So these big companies ended up owning,
01:14:54.360 | you know, pieces of them,
01:14:55.440 | but they couldn't be totally dependent upon them.
01:14:57.560 | So you had Google who hedged their bets
01:14:59.440 | a little bit with Anthropic,
01:15:00.840 | and you had Amazon that made a bet on Anthropic,
01:15:03.240 | and Microsoft on OpenAI.
01:15:05.080 | And now as these companies become big and successful,
01:15:07.720 | like, listen, if they don't become big and successful,
01:15:10.640 | Inflection just gets subsumed by Microsoft.
01:15:13.200 | Character gets subsumed by Google.
01:15:15.840 | But if they get some scale on their own, like OpenAI,
01:15:19.320 | and now it probably couldn't be subsumed
01:15:22.520 | from an antitrust perspective by Microsoft,
01:15:24.800 | and it has standing on its own.
01:15:26.360 | And so I think you have that dimension.
01:15:29.360 | I think the competition is as aggressive as it's ever been.
01:15:32.800 | I think sometimes we make more out of this competition
01:15:36.520 | than there is.
01:15:37.280 | Like, I don't think Microsoft, you know,
01:15:40.120 | NVIDIA launching a few models,
01:15:41.800 | I don't think they intend to be in the frontier model game
01:15:45.480 | where they're competing heads up with these other folks.
01:15:47.480 | At the same time, there's a long tail of their customers
01:15:50.360 | who are more than satisfied to use a tightly coupled model
01:15:54.400 | rather than maybe trying to kludge it together themselves
01:16:00.160 | by going to Llama or using a lot
01:16:00.160 | of these other plumbing services.
01:16:01.680 | And so, you know, we'll have to wait and see.
01:16:05.320 | Here's one thing that I would just point out in that regard.
01:16:08.600 | We just looked at that CapEx chart.
01:16:11.080 | Almost $300 billion of non-governmental R&D
01:16:16.080 | being spent by the biggest companies in the world
01:16:21.880 | to advance causes that are aligned with America, right?
01:16:25.440 | Like we hearken back to the age of Bell Labs
01:16:27.680 | in Silicon Valley, you know,
01:16:29.280 | and where you had national government spending.
01:16:31.720 | And I just think this is so damn bullish for us
01:16:35.160 | that we have printing presses,
01:16:36.760 | we have companies generating enough cashflow
01:16:39.520 | to invest this much money to put us at the bleeding edge
01:16:43.480 | and it creates incredible national strategic advantage.
01:16:46.400 | And they were getting pushed back
01:16:48.400 | for doing stock repurchases.
01:16:50.440 | Now that people said they don't have any use for this cash,
01:16:54.280 | well, now they do.
01:16:55.320 | So I guess that's a positive.
01:16:57.760 | All right, the last thing that's on my mind
01:17:00.440 | and that I'm really looking forward
01:17:02.760 | to better understanding in 2025,
01:17:06.800 | it does appear like, for the time being,
01:17:10.440 | the majority of the enthusiasm on the advancement in AI
01:17:15.440 | is around this chain of thought thing.
01:17:17.560 | And as you've talked about, you know,
01:17:20.280 | I think it's somewhat ironic that it's less efficient,
01:17:23.440 | but now that's a good thing.
01:17:24.840 | I think if someone had built a model
01:17:26.760 | that was just hyper-efficient
01:17:28.480 | to where the margins expanded, that would be better.
01:17:31.240 | But okay, so now the argument is it's less efficient,
01:17:34.680 | but it's gonna consume more compute.
01:17:36.400 | So that's a good thing.
01:17:37.680 | So seeing how well this applies
01:17:40.720 | to different uses of AI will be super interesting.
01:17:44.440 | So, you know, what are the coding companies seeing,
01:17:49.440 | you know, when they use chain of thought
01:17:52.280 | precisely against that?
01:17:53.560 | 'Cause when this model was first released,
01:17:56.120 | even OpenAI said it's not for every use case,
01:17:59.000 | or they hadn't seen it be successful in every use case.
01:18:01.800 | Now, maybe they will in the future, you know,
01:18:03.440 | and I've signed up and paid for all these pro versions now
01:18:07.600 | in the past few weeks, been throwing problems at,
01:18:10.920 | you know, especially the one that's interesting
01:18:12.520 | is it goes away.
01:18:13.840 | You know, both the Google and the OpenAI products,
01:18:17.080 | it'll go away for 10 minutes.
01:18:18.920 | So, and is, you know, how much better is that output?
01:18:22.680 | And so it's super interesting.
01:18:24.520 | I think that's gonna be something
01:18:25.640 | that's really important to watch.
01:18:28.240 | Obviously, if the compute price keeps falling,
01:18:31.480 | I think you fall into this, why not argument?
01:18:33.760 | Like, why not run 20 passes on something
01:18:37.200 | if the marginal cost gets somewhat irrelevant?
01:18:40.520 | And with the fast followers, you know,
01:18:44.040 | and the way you're using synthetic data
01:18:46.240 | to create even smaller models,
01:18:47.960 | maybe you can get the smaller model to run, you know,
01:18:51.400 | the second, you know, double-checking your work,
01:18:54.520 | if you would.
01:18:55.360 | But anyway, this, I think this will be the fun part
01:18:58.200 | to watch on the edge this year.
01:19:00.880 | I can tell you a couple of things on that.
01:19:02.600 | Number one, I can tell you
01:19:03.840 | that the researchers inside these labs,
01:19:06.520 | they feel like they're looking into AGI,
01:19:11.560 | and they're surprised how skeptical the world is.
01:19:14.680 | Yeah, like there's a lot of dissonance
01:19:17.280 | that I think they have,
01:19:19.600 | because I think they're like,
01:19:21.520 | "Oh my God, we're getting close."
01:19:24.920 | The second thing I would tell you--
01:19:26.360 | Part of the problem on that is,
01:19:29.440 | you know, they're seeing things ahead of time.
01:19:34.400 | Correct, correct.
01:19:35.240 | And they're unable to release 'em
01:19:38.880 | 'cause they're not fully baked yet.
01:19:38.880 | 100%, and they don't have the compute.
01:19:41.440 | Yeah.
01:19:42.280 | They don't have the compute, Bill.
01:19:43.120 | But I'll tell you, I'll tell you that,
01:19:44.840 | I'll tell you this, they also,
01:19:47.200 | part of the reason they're so bullish
01:19:48.840 | on these reasoning models
01:19:50.480 | is that from a scaling perspective, right?
01:19:54.640 | So remember, reasoning models can get better
01:19:57.080 | just because you throw more compute at 'em as well.
01:19:59.640 | So where are we on that scaling curve for reasoning?
01:20:02.640 | Most of 'em tell me that we're around a GPT-2
01:20:07.640 | level from that logarithmic scaling.
01:20:10.160 | So they expect a lot of scaling advantages
01:20:14.120 | in 2025 and 2026 on the reasoning models.
01:20:17.280 | And then finally, when it comes to utility,
01:20:19.720 | so where are these things going to be put into practice?
01:20:23.160 | I think you're going to see
01:20:25.480 | some pretty dramatic breakthroughs this year on coding,
01:20:28.360 | right, so think about all the startups
01:20:30.040 | that we've seen in the coding space.
01:20:32.280 | A lot of them have used prompt injection
01:20:34.360 | and other techniques in order to, you know,
01:20:36.680 | kind of get these agents to do things that they want.
01:20:39.240 | I think you're going to see real coding agents
01:20:42.880 | built by Google and by OpenAI,
01:20:46.160 | driven by their most sophisticated
01:20:48.800 | inference time reasoning models
01:20:50.400 | that are going to be meaningfully better
01:20:52.480 | than what's in the market today, that's one.
01:20:55.120 | And I would say secondly, so if at the enterprise level
01:20:58.920 | you think of coding as kind of the tip of the spear
01:21:01.640 | as to what these agents can do,
01:21:03.880 | remember, like you and I started last year
01:21:06.320 | talking a lot about the consumer, right?
01:21:09.040 | Memory and actions.
01:21:11.160 | And I think that you're going to see
01:21:13.720 | ChatGPT imbued with these capabilities.
01:21:17.560 | And I imagine the same happens with Gemini, et cetera.
01:21:20.600 | And we've seen it, like we in fact did a,
01:21:23.280 | one of my partners did a demo using booking.com
01:21:26.280 | and the anthropic computer use where, you know,
01:21:29.000 | booking a hotel and doing other things.
01:21:31.120 | It's still fairly embryonic, but I think that,
01:21:34.000 | again, that's not built on the back of O3.
01:21:36.240 | Now build that on the back of O3,
01:21:38.080 | it becomes very, very capable.
01:21:39.920 | So if you said to me, like how wide
01:21:43.440 | is the use case going to be?
01:21:45.000 | I think now the use case is very narrow, it's researchers.
01:21:48.120 | Okay, but I think you're going to see the aperture
01:21:51.240 | on the reasoning models expand pretty dramatically.
01:21:54.960 | And I think you're going to see increasingly
01:21:57.360 | a blending of the pre-trained and the O series model,
01:22:01.360 | because ultimately it's one model, you know,
01:22:04.200 | that's going to help you, you know,
01:22:06.680 | do all the things that you want to have done.
01:22:08.440 | Well, here's, I didn't discuss this with you ahead of time.
01:22:12.480 | So we can edit it out if we don't use it,
01:22:15.280 | but I would invite, if there are people
01:22:18.640 | that are doing particularly interesting things
01:22:20.960 | or leveraging this in a particularly interesting way,
01:22:23.880 | I would invite them to reach out to us.
01:22:26.000 | And I'd love to not only learn about it,
01:22:28.440 | but be obviously willing to share with the audience
01:22:32.400 | what these things are as they uncover them.
01:22:35.360 | I'll close with this and you didn't ask for my input on it,
01:22:38.760 | but you know I'm not a fan of macro analysis
01:22:42.600 | and that I think it falls in the too hard bucket
01:22:45.120 | because of all the different variables at play.
01:22:49.080 | Great seeing you.
01:22:49.920 | And I look forward to a fun year of doing these
01:22:51.560 | with some incredible guests.
01:22:53.240 | And, you know, I learned a lot kicking around with you.
01:22:56.000 | It's a lot of fun.
01:22:56.840 | All right, take care.
01:22:57.800 | Take care.
01:22:58.640 | (upbeat music)
01:23:01.200 | (upbeat music)
01:23:03.800 | - As a reminder to everybody,
01:23:09.320 | just our opinions, not investment advice.