China Open-Source, Compute Arms Race, Reordering Global Trade | BG2 w/ Bill Gurley and Brad Gerstner

Chapters
0:00 Intro
3:55 Open-Source Models in China
17:54 Future of American Open-Source Models
26:30 Compute Arms Race
47:05 China, Tariffs, and Reordering of Global Trade
15% on all goods coming from Europe, 0% on US goods going to Europe, so an opening up 00:00:09.120 |
of Europe markets, paying us 15% and on top of that, getting commitments like $750 billion, 00:00:15.480 |
almost a trillion dollars of energy purchases from the US. 00:00:18.580 |
Or look at Japan, which they announced last week, another huge market. 00:00:22.760 |
Again, similar, they're going to pay tariffs to the United States, no tariffs imposed on 00:00:28.380 |
the United States, and they're going to invest $550 billion into the US in a way the President gets to direct. 00:00:36.460 |
So I just think we need to give the President credit where credit is due. 00:00:44.200 |
Everybody said this was going to lead to retaliation, to trade wars, was going to be disastrous for the US. 00:00:50.160 |
And all we've seen so far is deals, deals, deals, deals. 00:00:54.700 |
And I have to say, if this was the CEO of one of our companies, let's say we had a board 00:00:59.420 |
meeting at the start of the year and he outlined these plans. 00:01:01.700 |
And we said, hey, we're really nervous about this. 00:01:07.560 |
It's either going to backfire and we're going to fire you, or it's going to work really well and we give you a bonus. 00:01:13.960 |
If we're measuring them halfway through the year, I would say that he's in line for a bonus. 00:01:35.680 |
We have our good friend, Sunny Madra, the CEO of Groq, joining from, I don't know, Sunny, it 00:01:40.460 |
looks like some fancy hotel in Saudi Arabia in the middle of the night. 00:01:48.960 |
And Bill, you got off your boat catching bluefin tuna. 00:01:54.480 |
You look like you're in some office somewhere, so good to see you. 00:01:57.480 |
I am borrowing Mitch Lasky's incredible podcast setup in our Woodside office with the high-def 00:02:09.960 |
Sunny, of course, our good buddy Sunny is the CEO of Groq. 00:02:14.380 |
I don't know, Sunny, probably a few hundred million of revenues, maybe doubling year over year. 00:02:20.400 |
You're rumored to be raising 600 million at a $6 billion valuation. 00:02:24.420 |
Maybe that has something to do with you being over here in Saudi Arabia. 00:02:28.480 |
And of course, you're hosting all the open source models in your inference clouds around the world. 00:02:38.340 |
We don't comment on speculation, but you touched on some good points. 00:02:46.640 |
Our good friend David Sacks is really on a heater. 00:02:49.200 |
First, it was the Crypto Summit a couple of weeks ago. 00:02:51.760 |
Of course, the GENIUS Act got passed, which really teed up these stablecoins. 00:02:57.620 |
And now the Clarity Act around market structures making its way through Congress. 00:03:01.960 |
And then, of course, last week was the AI Summit, where the president laid out a multi-pronged 00:03:08.260 |
strategic plan for American AI that both extends the government's investment and leadership, 00:03:16.640 |
but also accelerates the distribution of the American AI stack around the world. 00:03:21.820 |
Both of those things, both the crypto and the AI Summit that he put together, I thought 00:03:26.100 |
are key contributions, really, to the next generation of American technology leadership around the world. 00:03:32.640 |
It's amazing to see that much progress in six months. 00:03:35.640 |
And the rest of the team, Kratsios, Dean Ball, Sriram, they really all kind of brought the 00:03:44.820 |
Of course, the AI Action Plan was focused on maintaining global AI leadership, particularly 00:03:52.180 |
And I hate to say it, but we have really been our own worst enemy. 00:03:56.540 |
It seems like excess regulation on everything from energy production to model development 00:04:05.800 |
has really been a bit of an unforced error by the US over the course of the last 24 months, 00:04:11.040 |
and handed a lot of momentum to China, and I think really threatened our leadership. 00:04:15.360 |
So this is kind of a 180 to get America back on track. 00:04:19.460 |
We've underestimated Huawei and the Chinese AI development, model development, really at 00:04:27.380 |
So I want to kick off today talking about recent developments with the base models and reasoning models. 00:04:35.320 |
Because we had this freak out moment earlier in the year with DeepSeek that we all remember. 00:04:40.000 |
Remember, Nvidia stock plummets, everybody in Washington's talking about it. 00:04:45.340 |
But since then, people kind of forgot about DeepSeek, but the reality is China's been on a tear. 00:04:51.520 |
I mean, they're dominating the global landscape for open source models. 00:04:55.180 |
We've seen six to seven high quality open source model providers, many of the fastest growing 00:05:01.460 |
And this at a time when American open source, right, Llama 4 has been sputtering a bit, really 00:05:10.580 |
So Qwen, the open source model out of Alibaba, has passed, I think, 400 million downloads. 00:05:16.820 |
Of course, that's released like these other models under the Apache 2.0 open source license. 00:05:24.940 |
But Sonny, you tweeted, and one of the reasons I wanted to get you on the pod this week is you 00:05:28.840 |
tweeted that all of these open source models are really coming together in China. 00:05:38.220 |
They can distill and generate synthetic data on each other's work. 00:05:42.020 |
And you went so far as to suggest this might allow them to pass the best proprietary models 00:05:47.580 |
coming out of the US yet this year, maybe by Q4 of this year. 00:05:59.840 |
And should US model companies like OpenAI and Anthropic be concerned? 00:06:05.600 |
So let's kind of tie it into, I think, three important things that we see happening. 00:06:11.380 |
The first one being, you know, the Chinese and the president addressed this at the AI summit, 00:06:17.980 |
He addressed the point around using copyrighted work and he says, you know, he used a great analogy. 00:06:24.740 |
If you read a book and you use it, you're not violating the copyright there. 00:06:29.400 |
And that was one of the major things that, you know, a lot of people didn't talk about, 00:06:32.920 |
but I think it's important for the model makers. 00:06:35.040 |
And so the Chinese just have been able to work around that because of, you know, their position 00:06:43.120 |
And I think, you know, Bill teed it up even better off of my tweet, which is, you know, 00:06:48.980 |
So what you're seeing very quickly is both the open source nature, the open weights nature, 00:06:54.340 |
allow them to basically compound on each other. 00:06:57.000 |
So instead of working in silos and instead of having to create giant training clusters 00:07:02.120 |
separately, they can basically take each other's work, build on top of it, almost consider it 00:07:06.860 |
like a remix of someone else's model. K2 is sort of a well-known remix of what DeepSeek had done. 00:07:12.880 |
And now we're starting to see that happen really fast and we're seeing two dimensions of it 00:07:18.100 |
One we're seeing the leading edge models getting quicker and we're then seeing them distill down 00:07:22.300 |
smaller, you know, turbo models really, really fast as well. 00:07:25.560 |
There was a release today of a Qwen, you know, 30 billion parameter model, which is performing at the level of GPT-4o. 00:07:33.540 |
And GPT-4o was, you know, world-class not that long ago. 00:07:36.900 |
So those are the reasons that we're really seeing an acceleration right now. 00:07:41.240 |
You know, I want to dig into this model development in particular, you know, a year ago, we were 00:07:46.440 |
talking about these models being these, you know, stochastic parrots, you know, and we really had to compress everything into them. 00:07:53.580 |
So you go back to GPT-4, you know, and you're compressing, you know, the entire internet, but now we really don't need to do it because we've trained them to use tools like the internet, right? 00:08:06.580 |
When I ask a question today, you know, it doesn't just spit out an answer immediately, it goes and uses the tool and it searches the internet. 00:08:13.380 |
So if you don't have to compress all this Wikipedia information, I don't know, take a subject like World War II, you just need to know how to go out and use the internet to find the information, to summarize it in real time. 00:08:26.140 |
How has that changed the pace of progress and the balance between open and closed? 00:08:32.480 |
Yeah, and so it's spot on, right, Brad, what we see now, and you see it when you use these reasoning models, and what I suggest everyone do is when you're using a reasoning model, you can usually expand out its thought process. 00:08:43.440 |
So when you ask a question, it'll say, oh, the person is asking a question about this, what should I do? 00:08:48.620 |
Let me go and maybe search the internet, let me go do a few different things, it can have a lot of different tools. 00:08:52.820 |
And so the push has been towards, you know, really, really strong reasoning models. 00:08:56.940 |
And, you know, we have to give credit there, OpenAI really started that with o1, that was really the first reasoning model. 00:09:03.240 |
But I think on the back of that research, which, you know, everyone talked about, and, you know, the pod's good friend Noam Brown was the leader on that program. 00:09:11.440 |
Everyone's been able to look at that and say, let's reframe the problem. 00:09:14.600 |
And this allows us to build stronger reasoning models that don't have to compress, like you said, all the internet's information. 00:09:21.000 |
And once they're coupled with strong tools, you start getting these really, really incredible results that don't even just show up in benchmarks, because none of the benchmarks really allow you to use a tool to answer the questions. 00:09:32.340 |
And if they did, we're going to see a whole new set of results there. 00:09:37.000 |
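A minimal sketch, in Python, of the tool-using loop being described here: the model reasons, decides it needs fresh information, calls a search tool, and only then answers. The call_model and search_web helpers are hypothetical stand-ins for illustration, not any specific vendor's API.

def search_web(query: str) -> str:
    # Placeholder tool; a real implementation would call a search API.
    return f"[search results for: {query}]"

def call_model(messages: list) -> dict:
    # Placeholder reasoning model; a real implementation would hit an inference endpoint.
    # Here it asks for a search on the first pass, then answers once it has tool output.
    if not any(m["role"] == "tool" for m in messages):
        return {"type": "tool_call", "input": messages[-1]["content"]}
    return {"type": "answer", "content": "Answer summarized from the tool results above."}

def answer_with_tools(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    while True:
        step = call_model(messages)            # the model "thinks" and picks an action
        if step["type"] == "tool_call":        # it decided it needs to look something up
            messages.append({"role": "tool", "content": search_web(step["input"])})
        else:                                  # it has enough context to answer
            return step["content"]

print(answer_with_tools("Summarize the key turning points of World War II."))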
Hey, Bill, just in many ways, I think this validates what you were arguing over the last six to 12 months. 00:09:45.640 |
Everybody knows you're one of the biggest proponents of open source in the world. 00:09:49.300 |
And you were telling people, make no mistake, you know, the Chinese are going to use open source to their advantage. 00:09:55.580 |
And now we're seeing that in a really profound way. 00:10:07.380 |
But China got excited about open source about 20 years ago. 00:10:12.900 |
And you can imagine when most of the world accuses you of IP theft, that embracing something like Linux and all the other software, open source products, seems very appealing, right? 00:10:27.640 |
And so I think it became kind of a common way of operating within China. 00:10:33.160 |
And, you know, it's a country that hasn't prioritized IP protection the way we have, you know, around patents. 00:10:41.120 |
And I can make an argument that, you know, there's way more prosperity if ideas are shared instead of protected. 00:10:51.140 |
The one thing I don't know is in the current AI situation, you know, was the government promoting open source and encouraging it? 00:10:59.500 |
Or did it just develop through competitive forces? 00:11:03.380 |
But now you have, you know, a scenario where these companies are, first of all, 00:11:12.060 |
And so, you know, I almost feel like an idiot when Kimi comes out from Moonshot and then this week, I guess, I don't even know how to pronounce it, Zhipu, you know, releases a model. 00:11:23.880 |
And I go on PitchBook and I look it up and they've already raised $1.4 billion. 00:11:28.720 |
So it shouldn't have been a secret, but I didn't know about it, you know. 00:11:33.540 |
And all of a sudden they're on the OpenRouter leaderboards and like, oh, my God, you know, they're coming out of everywhere. 00:11:39.520 |
And what it shows you is just that when you have a competitive dynamic where every single player, 00:11:46.940 |
and I think there may be seven or eight deep-pocketed players with open models in China, they all learn from each other extremely fast. 00:11:54.620 |
And in this case, unlike software, you can use one model to distill the other and make it better. 00:12:00.700 |
So it's almost like an accelerated, you know, form of that. 00:12:04.940 |
And you just get massive, quick co-evolution. 00:12:08.280 |
And I came up with a little analogy for people. 00:12:16.800 |
Imagine two farming communities, and let's say there's 10 to 20 farms in each. 00:12:19.840 |
And in one community, they come into the farmer's market once a week, and they just compete by selling their products. 00:12:27.040 |
In the other community, when they come into farmer's market, they also, in addition to competing and selling, they're forced. 00:12:34.840 |
I don't know who would force them, but they're forced to share all their best practices from that week with everybody. 00:12:44.220 |
And then if you ran that exercise over two years or whatever, obviously, the community where the best practices are shared across all farms is going to have a higher global output for the community than the one where you've just got proprietary ideas driving, you know, the individual and just competition without the ideas sharing. 00:13:05.620 |
And that may, like, even me saying that may cause some people to scream, that's socialism or, like, you know, they may not understand open source. 00:13:15.400 |
But I do think you end up with a higher fitness level for a community that's behaving that way. 00:13:22.080 |
Overall, you may end up with a lot less chance of a breakout monopolist like we've had in many of the sectors in American technology. 00:13:32.420 |
Well, I mean, let's just assume the Chinese government is, in fact, encouraging this in whatever ways, right? 00:13:39.060 |
I mean, if you look at the release of the AI action plan last week, the Trump administration, you know, had a section. 00:13:46.180 |
Which is about encouraging open source and open weight models in the US saying that these could become standards in some businesses and academic workloads. 00:13:55.740 |
And it's important they're built on the American AI stack. 00:13:58.900 |
So, you know, as an aside, the Chinese quickly followed, you know, the American AI plan. 00:14:06.320 |
I think they released theirs a couple of days later where they called for the establishment of a global AI cooperation organization, which I thought, again, is interesting. 00:14:16.720 |
So, you know, Bill, how do you feel about, you know, like we haven't seen that much traction in the US labs on open source. 00:14:28.540 |
Obviously, Llama has been probably the market leader there, but this is for both of you, you know, handicap for me, if you will, how you think this plays out. 00:14:43.700 |
Do you see a lot of demand for these Chinese open source models? 00:14:47.620 |
And if so, what would it take for an American open source model to catch up? 00:14:52.820 |
Yeah, one of the things that we should pull up is the chart of intelligence to price. 00:14:58.520 |
And one of the things that you see with the leading open source models now, which are the Chinese, is 90% of the quality in terms of intelligence, but at a 90% price discount. 00:15:09.280 |
And I think whenever you offer that to anybody, you're going to see people want to use that, whether it's individual developers or enterprises. 00:15:20.180 |
So if you're looking at this chart, right, in the top right of that chart, you'll see a cluster of these Chinese open source companies, right? 00:15:29.960 |
And the vertical axis here being really intelligence, the horizontal axis from left to right being the cost per million tokens. 00:15:39.300 |
And so you really want to be in the top right of that chart, high intelligence, low cost. 00:15:45.800 |
And what this chart shows is that, to Sonny's point, you can get 90% of the intelligence, right, for 10% or 20% of the cost. 00:15:57.560 |
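A rough illustration of the trade-off being described, with made-up numbers rather than the actual chart data: if an open model delivers about 90% of the intelligence at 10-20% of the price, its intelligence per dollar is several times higher.

# Hypothetical figures for illustration only; not taken from the chart discussed above.
models = {
    "frontier_closed_model": {"intelligence": 100, "price_per_m_tokens": 10.00},
    "leading_open_model":    {"intelligence": 90,  "price_per_m_tokens": 1.50},
}

for name, m in models.items():
    value = m["intelligence"] / m["price_per_m_tokens"]   # intelligence points per dollar
    print(f"{name}: {m['intelligence']} intelligence at ${m['price_per_m_tokens']:.2f}/M tokens -> {value:.0f} per dollar")

# With these assumed numbers, the open model gives up about 10% of the quality
# but delivers roughly 6x the intelligence per dollar.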
And the result, I assume, Sunny, is that you're seeing huge demand at Groq and in Saudi Arabia, where you are right now, for these Chinese open source models. 00:16:07.640 |
And then now, just taking that forward, what do people want? 00:16:14.360 |
That's what you sort of got out of Linux and, say, Red Hat, right? 00:16:16.900 |
As great as Linux was, and Bill was touching on it, the majority of the enterprise was using a distribution, which they could go and point to someone if they needed something. 00:16:26.440 |
And so I think the world wants models that they can get from companies that they can go to. 00:16:31.700 |
And so to answer your question on what happens, I think if we look at Q4 this year or Q1 next year, I'd be willing to say, you know, a top three worldwide model will be a U.S.-based open source model. 00:16:45.660 |
And, you know, we've got two big efforts happening there. 00:16:47.980 |
We know we have the OpenAI open source model, which, you know, a lot of people have been working on at OpenAI. 00:16:54.120 |
And even, you know, Sam himself has commented on that, and it's releasing later this summer. 00:17:00.420 |
And if you take – you combine both of those things together, I don't think you end up with something that ends up further down the list in terms of intelligence and/or price. 00:17:09.080 |
One thing I wanted to highlight about the China situation that I think might inform the U.S. situation, I was having a conversation with this extremely young AI entrepreneur that I know. 00:17:25.340 |
And he was poring over the Zhipu – I hope I'm pronouncing that right – paper. 00:17:33.400 |
I went on PitchBook and sent him who had funded it. 00:17:36.220 |
And he asked me, he says, why is Alibaba funding all these things when they've got their own model? 00:17:42.380 |
And because they're in several of the other competitive plays. 00:17:46.960 |
And it reminded me of, you know, a lot of the points that I've made about open source is, like, if you're not – 00:17:54.180 |
If you're not confident you're going to win on offense, you want to play defense. 00:17:58.560 |
And so, for any large tech company, commoditizing a potential threat is actually quite valuable. 00:18:08.600 |
You know, you look at what Facebook did with the Open Compute Project inside of their data centers. 00:18:13.800 |
And so, you know, it may just be – very well be that Alibaba just wants to make sure there's no ByteDance equivalent, you know, in the AI space. 00:18:27.080 |
And the reason I think that's an interesting data point when you think about the U.S. 00:18:31.580 |
And, you know, there are several big tech companies that seem to be not on the bleeding edge of AI. 00:18:41.860 |
I mean, they have access to OpenAI right now, but they might lose that or whatever. 00:18:48.960 |
You know, if I'm at any of those companies, I'd be funding an open source competitor, you know, rather than funding, you know, like Amazon with Anthropic. 00:18:59.120 |
I think you're in a much better position to encourage open source. 00:19:05.740 |
But, Bill, are there a bunch of open source startup models in the U.S.? 00:19:12.540 |
I think you're going to see new entrants pop up that try to co-evolve with the Chinese models. 00:19:25.160 |
No one thinks of it as having a domicile, right? 00:19:32.400 |
I don't – you know, I just think you're going to see – just like you saw Kimi and Zhipu pop up. 00:19:38.920 |
I wouldn't be shocked if you see other new entrants pop up that are trying to be, like, sanctioned or, you know, cleaned. 00:19:49.400 |
You know, a Red Hat, to use that example, Sunny, version of these things. 00:19:53.300 |
Because I think if you start with access to those Chinese models, it wouldn't take you long to move into a near place. 00:20:01.360 |
And you wouldn't have to spend the kind of money the foundational model companies have. 00:20:05.780 |
And then you may see a big fight around regulatory capture where someone tries to say that's not allowed or whatnot. 00:20:15.380 |
And if I'm the Mistral team, if you're not distilling on these Chinese models, I don't know what you're doing right now. 00:20:24.720 |
OpenAI is rumored to be launching their open source model any day. 00:20:30.780 |
So to your point, you predicted, go back to the intelligence and pricing chart, right? 00:20:36.060 |
So if OpenAI was in the top right of that chart, i.e. 00:20:39.640 |
if they're able to deliver something at the intelligence of, let's call it Qwen, and they're also able to deliver it to market at, you know, 20% of the cost, 00:20:50.140 |
it would seem to me that actors around the world would adopt it, and certainly it would be, you know, part of the American AI action plan. 00:20:57.820 |
We'd want everybody in the world to use that model. 00:21:00.280 |
And do you believe they have a shot at out competing, being the upper right, you know, by the end of the year? 00:21:06.800 |
And if so, do you think that will be the outcome? 00:21:09.520 |
Like these companies that are using Qwen on Groq, do you think they would prefer to use OpenAI so long as it was equally capable and equal in price performance? 00:21:19.140 |
So two things that we see are brand and, you know, being U.S. domiciled or, you know, someone that they can kind of point at. That wins. 00:21:32.000 |
Because if you're a company and, you know, at some point you have to, you know, get your teams to sign off on what is it that you're using? 00:21:41.240 |
And like, you know, who is liable if something goes wrong? 00:21:44.400 |
And so I sort of feel like with OpenAI's release and, you know, Meta charging back, or even if some of these startups emerge that, you know, we can point at, I think we'll see a huge shift back towards those models versus the Chinese ones. 00:22:00.380 |
Yeah. And we really don't know at this point, unless you guys have some inside knowledge, like when Meta makes their second push with all these hires that they've made, are they going to remain committed to open source or even be more open? 00:22:16.280 |
We don't, we really don't know yet at this point. 00:22:19.240 |
Right. You've certainly seen some of those rumors out there. 00:22:21.540 |
You know, I've seen rumors on Twitter that they were debating whether or not they should back away from open source. 00:22:32.840 |
My hunch is that they're going to stay very committed to open source, but they may complement it with a proprietary model. 00:22:40.160 |
That would be my guess, as opposed to scrapping open source altogether. 00:22:46.800 |
I think part of the pitch to get everyone there is that it's open because everyone that they're pulling over are coming from closed places. 00:22:53.900 |
And so if you're really passionate about the work you're doing and you're passionate about where this is going to go and there's only, there's only one company that can fund that to that scale and do it open is, is those guys. 00:23:06.000 |
Yeah. Some of the researchers, like, have a religious belief in it, which was evident in the interview with the DeepSeek founder. 00:23:14.360 |
Like it's oddly, like, higher on their Maslow, you know, hierarchy than the money. 00:23:27.040 |
This is, I think, a really important point I want to come back to, Sunny, which is that you're seeing massive demand for these Chinese open source models today, precisely because enterprises around the world can utilize them. 00:23:45.520 |
They have 90% of their capabilities at 10 or 20% of the cost, you know, so it turns out if you deliver something really powerful and really cheap, that's more important to these players than being aligned with American values. 00:24:00.200 |
Right. But if you gave them something that was super powerful and super cheap and aligned with Western values, right? You think that would be the winning formula for an American open source model to top the distribution leaderboards around the world. 00:24:18.200 |
I think, yeah, on a place like OpenRouter, you know, where you can see where this is happening, we would see it rise to the top in a few days. 00:24:25.200 |
That's music to David Sacks' ears because, you know, clearly in the strategic plan, they're worried about Chinese open source models dominating globally. 00:24:38.200 |
And if you just watch the pace of releases, the quality of the releases out of China, the cycle time on the innovation in the open source community out of China, it's faster and better at the moment. 00:24:52.200 |
So, and the only real big development, you know, we may see some of these new startups, Bill, we may see a reboot out of Meta, but the one that has, I think, everybody really holding their breath and hoping that we see something really capable and powerful is this open source model that's been promised out of OpenAI. 00:25:13.200 |
Now, of course, Elon says he's committed to open source as well. 00:25:20.200 |
Grok 4 is a great model, is impressive what they put out there. 00:25:23.200 |
Any idea, Sunny, about the open source plans out of xAI? 00:25:28.200 |
So I think, like, Elon has, you know, pretty much said it on Twitter that they'll always open source one generation back. 00:25:35.200 |
And so while they were on three, you know, they should have open-sourced two, and now they're on four. 00:25:39.200 |
So I think the thinking is that they will get there. 00:25:42.200 |
You know, my only guess would be is that, you know, right now if they were to open source two or even three, it's so far behind that, you know, what's the purpose in doing it? 00:25:55.200 |
It may not even be utilized and you'll maybe end up having to deal with just, you know, a bunch of internet or Twitter FUD. 00:26:00.200 |
Just coming back to OpenAI, I will tell you, it's like one of those things that really you rarely see in the enterprise. 00:26:09.200 |
It's almost like the demand for like a, you know, like a Tesla Roadster or something or Model Y before it came out. 00:26:16.200 |
It's like, that's the model that everybody wants to use right now. 00:26:19.200 |
And so, you know, we can't wait till it comes live and, you know, when it's on Groq, when it's all over the world. 00:26:26.200 |
I think it's going to be a real big one for everyone. 00:26:28.200 |
And while we have you, Sunny, you know, there's really just been this explosion in the compute arms race, right? 00:26:41.200 |
We had this debate with Bill, you know: had we topped out on compute demand? 00:26:45.200 |
You know, were we entering an overbuild a la Cisco 2000? 00:26:55.200 |
Just a couple of tweets out of Elon and Sam Altman the last couple of weeks on this compute demand has really caught my attention. 00:27:02.200 |
If you look at this one out of Elon talking about the xAI goal of 50 million units of H100-equivalent compute. 00:27:10.200 |
And Clark Tang on my team tweeted something that broke that down, which showed that that reflected something like, you know, 4 million total GPUs and an energy footprint of like 11 gigawatts. 00:27:24.200 |
And then Sam Altman comes out and talks about their deal down in Abilene for four and a half gigawatts and the fact that they were going to come in well above the $500 billion estimate that they had promised to the government. 00:27:38.200 |
So these compute clusters now that are being talked about being built over the next five years, these are massively bigger than what we were even talking about a year ago. 00:27:50.200 |
And I think they reflect this move toward inference time reasoning, agent to agent interaction, reasoning engines. 00:27:57.200 |
You know, Jensen's comment on the pod last year, the inference was going to 1 billion X and the consequence, you know, what we're going to need in terms of compute power to power all that. 00:28:09.200 |
Maybe just reflect, you're in the middle of all this, you're building out your own inference clouds around the world. 00:28:16.200 |
Is this a lot of hyperbole and chest pounding or do you actually see the dollars going into the ground in places like Saudi Arabia and around the US? 00:28:25.200 |
Yeah, I'm going to just quantify it with Google for a second. 00:28:30.200 |
So, you know, in the Mary Meeker BOND deck, you know, they have a slide there that shows Google went from 5 trillion tokens a month to 480 trillion tokens a month. 00:28:42.200 |
And they had just put some press out that they crossed like, you know, 800 trillion. 00:28:47.200 |
And I saw something today that said they crossed a quadrillion, and I had to look that up. 00:28:52.200 |
And so in, in a course of a year and a few months, they've gone from 5 trillion to a thousand trillion. 00:29:01.200 |
I mean, that just shows it to you without having to look at anything else. 00:29:07.200 |
Every single search query on the planet today is now an inference transaction. 00:29:12.200 |
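As a quick sanity check on those numbers in Python: going from roughly 5 trillion tokens a month to roughly a quadrillion a month is about a 200x increase; assuming a window of about 15 months (an assumption, since the exact dates aren't given here), that implies on the order of 40% compound growth month over month.

start_tokens_per_month = 5e12    # ~5 trillion tokens a month, per the Mary Meeker BOND deck slide
end_tokens_per_month = 1e15      # ~1 quadrillion tokens a month, per the figure cited above
months_elapsed = 15              # assumed window; the exact timeframe is approximate

overall_multiple = end_tokens_per_month / start_tokens_per_month
monthly_growth = overall_multiple ** (1 / months_elapsed) - 1

print(f"{overall_multiple:.0f}x overall, roughly {monthly_growth:.0%} compounded per month")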
And so, you know, and you see it, you know, in Anthropic with, you know, their continued fundraisers going, you know, through the roof. 00:29:21.200 |
That's happening because they're, they're seeing the amount of token consumption. 00:29:25.200 |
And so anywhere we lay infrastructure, we fire up a rack. 00:29:31.200 |
It becomes fully consumed within a few hours. 00:29:37.200 |
So you have demand far outstripping supply, even at GROC. 00:29:42.200 |
So, Bill, you see these fundraising announcements that are being discussed. 00:29:48.200 |
I think that Iconiq is going to lead a $5 billion round into Anthropic at $170 billion. 00:29:54.200 |
In the case of Anthropic, it's, you know, $170 billion on rumored $5 billion in revenue. 00:30:01.200 |
X has been rumored to be raising at $150 to $200 billion. 00:30:06.200 |
So, you know, you've never in the history of venture, you've never seen fundraisers like this. 00:30:12.200 |
One of the topics that is being hotly debated on the Twitter is that there's massive intervening dilution in these rounds, Bill, because of the employee option grants or the employee RSUs that are needing to be granted to keep the employees, you know, in these businesses. 00:30:33.200 |
So, just observing this a little bit from afar, I don't think you guys are direct investors in any of these major labs. 00:30:33.200 |
What are your warning signs, you know, about the size and the demand that you see in these rounds? 00:30:52.200 |
Well, I, yeah, I've never seen anything like it. 00:30:54.200 |
I mean, I saw, you know, through the Uber and Lyft situation, I saw a precursor to this, but these dollars are even bigger. 00:31:02.200 |
And the amount of money that these companies are willing to lose in a year. 00:31:07.200 |
You know, I still think it's particularly interesting that Google has to compete with OpenAI because OpenAI is going to lose $7 billion this year and Google won't. 00:31:16.200 |
Like, they would never allow themselves to do that. 00:31:19.200 |
And this started back in that previous era where, for the first time ever, you saw private companies have a competitive advantage in that they can be more risk-seeking with capital than the public companies are allowed to be. 00:31:35.200 |
But I think in the past 12 months, we've seen OpenAI, Meta, and certainly X, you know, move into this place. 00:31:45.200 |
I call them the cost is no object, the CINO group, where they're just putting out press release after press release and opening data center after data center. 00:31:56.200 |
And, you know, there are other people, I think, you know, you look at Microsoft, you know, choosing not to extend their CapEx budget. 00:32:05.200 |
You look at that Amazon example when we spoke to the Laffont brothers at Coatue where they're not keeping up with the NVIDIA purchases relative to their AWS share. 00:32:18.200 |
And so there are a few companies that are back on their heels and there's a few companies that are really pushing the gas pedal. 00:32:24.200 |
And I can't, there's a few in the middle, I can't tell whether Anthropic has the audacity and the means to raise enough money to start building data centers themselves. 00:32:38.200 |
But it's, you know, it's a sport of kings, you know, it's a sport of kings. 00:32:44.200 |
It's like there's never been this amount of money spent right now. Nvidia and others building in the stack, like Dell that we spoke to a few weeks ago, they're the winners in a pick and shovel game that's got this amount of aggressiveness. 00:33:05.200 |
Sunny, do you see, you know, going back to the well-worn cliche that every shortage ultimately ends in a glut? 00:33:17.200 |
Do you have any evidence on the horizon where you see supply outstripping demand? 00:33:25.200 |
No, I mean, I was going to ask this to Bill as he was just saying it. Bill, like on a daily basis, are you consuming more tokens or are you consuming 00:33:34.200 |
more, you know, just traditional, you know, web lookups, right? 00:33:39.200 |
And I'd be willing to guess you're consuming more tokens. 00:33:46.200 |
Like when you're using those reasoning models, you don't see all the tokens, right? 00:33:53.200 |
Yeah, it's 10 to 100x more than your very first AI search. 00:34:08.200 |
And I just wanted to say, even yesterday, Anthropic had to put this, you know, press release plus, you know, product change out saying, hey, we've got to throttle everybody. 00:34:16.200 |
And this is like, because we have all these people using way too much of our services. 00:34:21.200 |
So, I think, you know, if you have intelligent models, right? 00:34:28.200 |
It's one of those things people are consuming. 00:34:33.200 |
I'm going to offer one caveat to this super exciting line of thought, which is because of the amount of venture capital out there, companies are not pricing to cost. 00:34:48.200 |
Like no one, none of the model companies, I don't think anyone, even in the verticals, like no one's pricing to cost because they're pricing to take market share. 00:34:58.200 |
And, you know, you and I, Sunny, we had a previous discussion about unlimited pricing, and inference has variable costs. 00:35:10.200 |
And even when people say to me, oh, well, we're going to run out of power. 00:35:13.200 |
You wouldn't run out of power if you just took the price up. 00:35:16.200 |
Like the thing that throttles demand is price. 00:35:24.200 |
But they're not willing to do that because they're afraid to lose share. 00:35:33.200 |
There's rumors of even some of the best known brands in AI having negative gross margin. 00:35:38.200 |
And so, I don't know when that settles out, but that will create a bump in the supply demand curve if that ever has to be, you know, fixed. 00:35:57.200 |
Which is going back to the point that we made on open source. 00:36:00.200 |
But if you know in the back of your mind that there's something that's 90% as good, but it's 90% cheaper. 00:36:07.200 |
Because we've also never had that factor as we're going through this growth curve. 00:36:12.200 |
I mean, almost since we started the pod, you know, I've routinely highlighted that the steepness of that price curve on, you know, as it kind of becomes, you know, less cutting edge is something I've never seen before. 00:36:30.200 |
And I'm sure that a lot of people sit around and say, well, it's okay if I'm losing money here because, you know, six months from now, I'll just use the older model. 00:36:42.200 |
And we also talked in the past about how in the internet age, all the startups began with Oracle and Sun, and eventually they all moved to Linux and MySQL. 00:36:54.200 |
And so there was a, there was a, we got to win at all costs phase. 00:36:58.200 |
And then there was a phase where you started worrying about cost and optimization. 00:37:03.200 |
And so one day, one day we'll likely, you know, make that, make that move. 00:37:08.200 |
And, and, and I, a few of the companies I've talked to that are running inference at scale, they are already starting to think that way. 00:37:15.200 |
Like they're looking, they're looking at it, you know, from that, from that lens. 00:37:20.200 |
And I think that's why Groq and Cerebras are doing so well. But Sunny, give us, you know, an example. 00:37:28.200 |
I would imagine that the Windsurfs of the world and the Cursors of the world and all these folks who are building these, you know, these coding agents, they've got massive, you know, demand for their applications, right? 00:37:41.200 |
But they're paying through the nose to Anthropic or to these underlying proprietary model providers to be able to do that. 00:37:50.200 |
Do you see them running to implement Qwen or some of these cheaper models? 00:37:56.200 |
Without kind of getting into specifics on any one of them, but like, you know, multiple folks are building their own models based off open source. 00:38:07.200 |
So they could just go distill any one of these models. 00:38:12.200 |
And, you know, given that these are very lenient Apache licenses, they can eliminate the entire cost of Sonnet that sits under it for 70% of use cases. 00:38:24.200 |
And like Bill said, turn that into a premium offering, right? 00:38:27.200 |
And say that's the, you know, the gold, and, you know, the silver and the bronze are built off, you know, something that's, like I said, one tenth the price. 00:38:35.200 |
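A small sketch of the gold/silver/bronze tiering Sunny is describing: route the bulk of everyday requests to a cheap distilled open-weights model and reserve the expensive frontier model for the hard cases. The model names, prices, and the difficulty heuristic here are illustrative assumptions, not anyone's actual setup.

CHEAP_OPEN_TIER = ("distilled-open-model", 0.30)        # hypothetical $/M tokens
PREMIUM_CLOSED_TIER = ("frontier-closed-model", 3.00)   # hypothetical $/M tokens

def looks_hard(prompt: str) -> bool:
    # Toy heuristic; production routers use trained classifiers or past-quality signals.
    markers = ("refactor", "multi-file", "architecture", "prove")
    return any(m in prompt.lower() for m in markers)

def route(prompt: str):
    # In this sketch, the large majority of everyday requests fall through to the cheap tier.
    return PREMIUM_CLOSED_TIER if looks_hard(prompt) else CHEAP_OPEN_TIER

for prompt in ("Rename this variable", "Refactor the auth architecture across services"):
    model, price = route(prompt)
    print(f"{prompt!r} -> {model} at ${price:.2f}/M tokens")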
That seems to me, Bill, you know, if I had to forecast, you know, if I'm OpenAI, I'm running a consumer business with really high gross margins. 00:38:49.200 |
Because consumers are less sensitive to what they're paying. 00:38:58.200 |
Whereas if somebody's writing code, you know, there's the variable intensity is high. 00:39:05.200 |
And so, you know, for them, it would make sense to launch an open source model back to where we started the pod and price it really low to, you know, drive share. You know, it reminds me a little bit of Amazon back in the day. 00:39:20.200 |
Amazon had this monopoly retail business they could use to subsidize AWS, gain share for a decade, and then begin to take price. 00:39:29.200 |
That would be a rational strategy for open AI to follow. 00:39:33.200 |
So you take the profitable consumer business, you use it to subsidize, you know, the market share in other applications that you hope to build. 00:39:50.200 |
But, you know, I pay the $200 a month and I do every one of my searches on 4.5, and I'm probably negative, I would think. 00:39:59.200 |
And so I do think there'll be some rationalization where these models kind of self-pick which one they're running based on what you need. 00:40:15.200 |
They're already doing that in parts of their enterprise business. 00:40:20.200 |
And, you know, as they've transitioned to more of this consumption logic, I think it's led to some real unlocks, you know, for the business. 00:40:30.200 |
I think that makes it harder if you're in the lab game and you don't have a consumer product and you don't own an application that you can drive high gross margins. 00:40:42.200 |
I think then gets back to this question, how long can you run your business for share, right? 00:40:49.200 |
Hoping that someday, because, you know, they all exist at the beneficence of the capital markets and the capital markets are willing to provide an incredible amount of capital to these businesses today. 00:41:00.200 |
But you and I have, we've lived through these periods where that disappears quickly. 00:41:05.200 |
And can I put something there just to kind of hear your feedback on it, guys, which is: look at Google and the TPU. 00:41:12.200 |
And Google is clearly, you know, by these numbers that we're seeing, right, putting out more tokens than anyone else. 00:41:18.200 |
Do you guys believe they have a strategic advantage because they have their own hardware? 00:41:22.200 |
They're not having to pay, you know, 80 percent margin on something that they can generate tokens with. 00:41:27.200 |
Like, how do you guys look at that business and say clearly they look to be sort of the largest, at least openly saying the largest processor of tokens? 00:41:38.200 |
I think, look, the key data point in that case, which I don't have the data, but I'd be glad to repeat it if someone shared it with us, is how many non Google applications are running on the TPUs? 00:41:50.200 |
Like how many third party customers are using, because what I've heard, or what the, you know, the general perception is, is that most of their proprietary TPU transactions are their own applications. 00:42:05.200 |
But that's probably where most of their tokens are being processed anyways at this point, Bill, right? 00:42:09.200 |
Like transcribing YouTube videos and, you know, Google Meet, where you can turn it on, and all those searches. 00:42:15.200 |
I was just inferring in your question, maybe I shouldn't have been, that they'll have an advantage for Google Cloud. 00:42:22.200 |
And in order for that to be true, they need to have this crossover moment. 00:42:28.200 |
One quick thing, Brad, on OpenAI, you know, I've been writing a book, which I've talked about frequently, and I've been quite, although I guess there's some privacy things now you need to be worried about, but I've been quite open with OpenAI about the book and doing research, you know, along the way. 00:42:48.200 |
It knows a tremendous amount about my book right now, and I can ask follow-up questions without having to put the whole book back in the prompt again because of that. 00:43:01.200 |
And so I would continue to believe that OpenAI's most likely path to long-term success comes from switching costs and lock-in more than it will come from staying on the edge of the model race. 00:43:19.200 |
And the pricing power that comes with that brand dominance, right? 00:43:25.200 |
Because the fact of the matter is, you said you're paying $200, you're getting more than $200 in value. 00:43:29.200 |
I don't know what the price is, but I know that if it was variable, you would pay a hell of a lot more money to use that service. 00:43:35.200 |
I would, but the lock-in once one of these systems starts to truly understand you and have all your historic knowledge, I think the switching costs will be very high at that moment in time. 00:43:50.200 |
Sunny, back to your question about the TPU and the advantage of that vertical integration, you know, to Google. 00:44:01.200 |
What I would tell you is that they've absolutely made some changes, I think, over the course of the last three to four months to accelerate the business. 00:44:12.200 |
You've seen the news about OpenAI leveraging TPUs for some of the inference demand that they have. 00:44:20.200 |
Ultimately, what I've said all along about Google is that it's in many ways the best-positioned company in the world, but a lot of their advantage, right, is no longer much of an advantage, right? 00:44:31.200 |
And namely that, you know, they were the dominant place where the consumer started every single query. 00:44:38.200 |
And, you know, when people are looking for answers, Bill's book is not in Google. 00:44:45.200 |
Actually, it's in Google Docs, but you're right. 00:44:53.200 |
The knowledge of it and the interaction and the token generation. 00:44:56.200 |
So my only point is this: the battle for the consumer is ultimately where the value accrues, Sunny, not who runs, you know, what hardware. 00:45:05.200 |
And so Google's dominance, it's been the greatest business in the history of capitalism for 20 years because they owned the consumer. 00:45:13.200 |
They owned the verb in something that was extraordinarily high margin. 00:45:18.200 |
And all I would say is the first real threat in 20 years came about in the ChatGPT moment. 00:45:25.200 |
I think ChatGPT will cross a billion weeklies like maybe this year, you know, probably this year, I would guess. 00:45:35.200 |
And so that to me has always been the case, right, for OpenAI. 00:45:39.200 |
And when you look at the rest of these frontier labs, the case you've made throughout this pod, about seven of these models in China, being able to open source, distill off one another, drive up intelligence, drive down costs. 00:45:52.200 |
What that tells me is the model layer is being increasingly commoditized and that there's not going to be a lot of intrinsic value in that intelligence layer, that operating layer. 00:46:03.200 |
You're going to have to build applications that guys like Bill Gurley and you and I are using every day. 00:46:09.200 |
And that's where the battle is on the consumer side. 00:46:11.200 |
You're going to have the exact same battle when it comes to coding agents and general enterprise applications. 00:46:17.200 |
And I've said there, I think it's going to be more of a, you know, heterogeneous world. 00:46:23.200 |
I think there are going to be lots of players that compete. 00:46:25.200 |
I think the margins will be lower in that world. 00:46:28.200 |
But it may very well be that the tide is going up so much here, right? 00:46:33.200 |
And the whole world is transforming so much around this that you're still going to have lots of players who do incredibly well. 00:46:39.200 |
I have to say I'm surprised if you would have told any of us a year ago or 18 months ago, right, 00:46:46.200 |
that the combined enterprise value of OpenAI and Anthropic together would be over half a trillion dollars. 00:46:53.200 |
And you throw xAI in there, it'd be a trillion dollars, you know, across the three of them roughly. 00:47:01.200 |
And the compute demand is higher than any of us, I think, anticipated. 00:47:05.200 |
Sunny, hopefully we can keep you on here for a bit. 00:47:07.200 |
We're just going to wrap up with a, you know, a topic that I think has dominated really the conversation in the markets over the course of last year. 00:47:16.200 |
And that's been about tariffs and kind of the reordering of global trade. 00:47:20.200 |
Bill, I know you had some questions, some thoughts about it, and I'm happy to dig in and talk about it as well. 00:47:26.200 |
Well, I mean, I would really just love to hear from you, Brad. 00:47:32.200 |
The markets got very nervous when the, what was it, Liberation Day, when the kind of unpredictability of how big some of the numbers were and what that might mean 00:47:47.200 |
and whether we were walking away from the notion of comparative advantage and, and I think the markets got spooked. 00:47:54.200 |
And, you know, I think you turn around and look at where the markets are today, and we've really gone through an evolution of how Wall Street is interpreting the, 00:48:05.200 |
both the initial launch of the tariffs and the reality of where they're landing. 00:48:11.200 |
So, how would you, how would you describe that and, and why do you think the markets are getting very comfortable with the, where, where they're landing? 00:48:20.200 |
I mean, not only comfortable, we're at all-time highs, you know. And April 2nd, I was going, you know, on CNBC saying the nuclear Navarro scenario was going to be a disaster and I'm out. 00:48:34.200 |
And the amazing thing about this administration is there's really, you know, a team of rivals within the White House. 00:48:43.200 |
And you had Bessent and Lutnick who were basically outlining this plan for, you know, let's call it 10 to 15 to 20% tariffs across the board. 00:48:51.200 |
That would amount to about $300 billion in total tariffs up from 75 billion in 2024. 00:48:58.200 |
But you had Navarro who was basically saying, we're going to replace the Internal Revenue Service. 00:49:02.200 |
We're going to get rid of the income tax and we're going to have 2 trillion of tariffs. 00:49:12.200 |
And we said, until the president tells us whether it's door one or door two, we're out. 00:49:17.200 |
The market shot first and asked questions later. 00:49:20.200 |
And that's where you saw that huge drawdown in the market. 00:49:29.200 |
It's a 30% move in about 60 or 70 days, which is extraordinary, even by the historical patterns that we've seen over the course of the last five years. 00:49:44.200 |
I think the consensus view of all economists, right? 00:49:49.200 |
90% of economists is that tariffs are going to be bad. 00:49:52.200 |
They're going to be a tax that gets paid by the US consumer. 00:49:55.200 |
There was a small group led by, you know, Scott Bessent and Kevin Hassett at the National Economic Council that said, no, it's actually going to be different this time. 00:50:06.200 |
And the reason it's going to be different this time, their theory argued, was that the world had become dependent upon exporting to the United States. 00:50:17.200 |
So the total trade deficit of the United States in goods and services was about $915 billion last year, with a $1.2 trillion goods deficit. 00:50:29.200 |
And that basically meant that China was selling a lot more to the United States than they were buying of US goods. 00:50:36.200 |
And so what Besant and Hassett postulated was that these countries have no choice. 00:50:42.200 |
If we impose a tariff on them, so long as it's not draconian, 70%, 80%, what Navarro was talking about. 00:50:49.200 |
If we impose a 15% tariff on them, they have to eat it. 00:50:53.200 |
The producers have to eat it because otherwise they're going to end up laying off millions of people in Vietnam and China and these countries. 00:51:01.200 |
And politically, they can't afford to lay these folks off. 00:51:07.200 |
The consensus economists said, no way is that true. 00:51:15.200 |
You have not seen the inflation percolate through. 00:51:23.200 |
The consensus economists said it would have already happened. 00:51:27.200 |
And the National Economic Council was out with a paper last week that deconstructed core PCE. 00:51:34.200 |
So that's the best proxy the Fed watches for inflation, measured since the start of the year. 00:51:44.200 |
Import prices have been going up at a slower rate than domestically produced goods. 00:51:51.200 |
So this is the exact opposite of what you would have expected from tariffs. 00:51:56.200 |
Of course, you would have expected the imported prices would have been going up more than domestic goods. 00:52:01.200 |
And so we'll show these charts and we'll put the link to this paper. 00:52:08.200 |
But to me, when I look at the president's-- the deals he's landing, the deal that just got announced yesterday with the European Union. 00:52:25.200 |
So an opening up of Europe markets, paying us 15%. 00:52:29.200 |
And on top of that, getting commitments like $750 billion, almost $1 trillion of energy purchases from the US. 00:52:36.200 |
Or look at Japan, which they announced last week, another huge market. 00:52:42.200 |
They're going to pay tariffs to the United States, no tariffs imposed on the United States. 00:52:47.200 |
And they're going to invest $550 billion into the US in a way the president gets to direct. 00:52:54.200 |
So I would love-- I would-- I just think we need to give the president credit where credit is due. 00:53:01.200 |
Everybody said this was going to lead to retaliation, to trade wars, was going to be disastrous for the US. 00:53:07.200 |
And all we've seen so far is deals, deals, deals, deals. 00:53:12.200 |
And I have to say, if this was the CEO of one of our companies, right? 00:53:16.200 |
Let's say we had a board meeting at the start of the year, and he outlined these plans. 00:53:19.200 |
And we said, hey, we're really nervous about this. 00:53:22.200 |
You know, this is a high-risk, high-reward strategy. 00:53:25.200 |
It's either going to backfire, and we're going to fire you. 00:53:28.200 |
Or it's going to work really well, and we give you a bonus. 00:53:31.200 |
If we're measuring them halfway through the year, I would say that he's in line for a bonus based upon the trillions of dollars 00:53:39.200 |
that are going to be coming into the United States. 00:53:42.200 |
And the fact that we now have a $300 to a $350 billion recurring stream of revenues into Treasury 00:53:51.200 |
in the form of these tariffs which are being paid. 00:53:55.200 |
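Back-of-envelope arithmetic on that revenue figure, in Python: against roughly $3.3 trillion of annual U.S. goods imports (an outside estimate, not a number from this conversation), $300 to $350 billion of tariff revenue implies an average effective rate in the 9 to 11% range, consistent with the 10 to 15% blanket tariffs being discussed.

goods_imports = 3.3e12              # assumed annual U.S. goods imports, outside estimate
for revenue in (300e9, 350e9):      # the range cited above
    rate = revenue / goods_imports
    print(f"${revenue/1e9:.0f}B of tariff revenue -> ~{rate:.1%} average effective tariff rate")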
And I think Bessent said last week, in the month of June, we had our first surplus in the United States, monthly surplus, since 2015, right? 00:54:06.200 |
Because of the tariff revenues that are coming in. 00:54:11.200 |
I knew the nuclear tariffs, the trillion or two trillion. 00:54:16.200 |
I said if we landed the plane where Bessent wanted to come in at $300 billion, I thought there was a decent chance 00:54:22.200 |
that those prices could be passed on in the home countries. 00:54:29.200 |
What's the-- anything you're watching out for? 00:54:32.200 |
Well, I think the number one thing is core PCE did bottom last year, and it started to tick up. 00:54:41.200 |
And of course, I think it's almost impossible to conceive that we would have totally reordered the entire global trading system on the first pass with zero mistakes. 00:54:51.200 |
So we're going to have some goods and some products that consumers or US consumers are going to end up paying the taxes, and we're going to have to go back and fix some of these things. 00:55:02.200 |
But I will say that it's turning out massively better than consensus criticisms. 00:55:07.200 |
I mean, remember Larry Summers at the Coatue event? 00:55:10.200 |
I mean, this was just a few months ago, and he was saying this is the biggest economic disaster of his career. 00:55:17.200 |
And I don't think you can describe it that way. 00:55:20.200 |
The markets are-- the voting machine is telling you-- and these are a lot of sophisticated investors. 00:55:24.200 |
The voting machine is telling you that no retaliations, no trade wars, all these deals getting done works for the US economy. 00:55:32.200 |
And I will tell you that the Atlanta Fed GDPNow tracker, which tracks real-time GDP, has now ticked up back to 3%. 00:55:40.200 |
So after going down a lot in April, it's bounced way back up. 00:55:49.200 |
Remember, one of the key reasons for doing this, right, wasn't just because we were-- 00:55:55.200 |
I mean, the EU conceded in the trade negotiations that our relationship was unfairly balanced in the direction of the EU. 00:56:08.200 |
But on top of that, a key reason for doing this was to support the domestic production of critical national industries 00:56:16.200 |
and to make our supply chains more resilient. 00:56:22.200 |
At Altimeter, we just led the Series A in a company-- I don't even know if we've announced it, but I'll announce it here-- 00:56:27.200 |
which is an all-American producer of rare earth magnets, right? 00:56:30.200 |
So these are now viable investments because of the tariffs and the resolve of the government to re-onshore these critical supply chains. 00:56:38.200 |
So, I mean, that's a huge benefit that you would be willing to pay something for. 00:56:48.200 |
Not on the economic side, but running the supply chain at Groq? 00:56:52.200 |
Look, we're fortunate in that the majority of our supply chain is US-centric, including our chips. 00:56:57.200 |
But we still have small, discrete components. 00:57:00.200 |
And Brad, exactly what you were saying is happening is that the producers and manufacturers of those things are coming and you're negotiating. 00:57:07.200 |
And we've had pretty significant negotiation with those folks. 00:57:11.200 |
And that was, I think, like you said, wasn't anticipated. 00:57:18.200 |
The one thing-- again, I'll go back to the administration. 00:57:21.200 |
Like, we've seen our tariff schedule move around probably six to eight times since this all has started. 00:57:32.200 |
And I think that's also a function of how this administration operates. 00:57:37.200 |
Like, people can go and share with them, hey, these things are not available. 00:57:42.200 |
And I think that that's also helping with that last thing you said with the investment you're making is companies get established onshore to take advantage of what these tariffs are causing. 00:57:51.200 |
Sunny, has your supply chain planning, and that dynamic nature, has that started to settle down? 00:57:57.200 |
Do you see this-- you know, are we reaching the end of the tariff negotiations such that everything can kind of settle down and operate? 00:58:06.200 |
It's gone from week to week or even day to day when it first started and trying to figure out what happened to like now we're looking at it monthly. 00:58:14.200 |
And Brad, we still have a big thing looming with a China discussion, correct? 00:58:24.200 |
We're going to have, you know, the long list is going to be coming out. 00:58:29.200 |
But when you look at our big trading partners, the EU, we do, I think, $900 billion of trade with them a year. 00:58:38.200 |
But we have about a $300 billion goods trade deficit with both Europe and China. 00:58:46.200 |
China is the big enchilada because it's not just trade with China, right? 00:58:51.200 |
It's strategic, it's national security, and it's trade. 00:58:58.200 |
We know that, you know, the rare earth ban on Chinese magnets was devastating to US industry. 00:59:03.200 |
We know the retaliation that, you know, where H20s were cut off in terms of NVIDIA's chips back to China. 00:59:11.200 |
So reading the tea leaves, I'll go out on a limb and I'll say the consensus still believes like that the China thing is going to be a problem or that it will be small. 00:59:22.200 |
I think this president wants to do the biggest deal ever done with China. 00:59:31.200 |
He said at the AI summit last week, I'm a deal junkie. 00:59:36.200 |
You can't be the biggest deal maker in the world without doing a big deal with China. 00:59:43.200 |
And I, you know, if you just look at today, the Chinese reciprocated in a way I think the US government was looking for. 00:59:50.200 |
They said, hey, we'll postpone all of our retaliatory tariffs. 00:59:55.200 |
The president has suggested he's going to go to China the first week of September, sometime between September and November. 01:00:02.200 |
I think there is a very, very big deal that's going to get done with China that's going to reorient the relationship in a big way. 01:00:11.200 |
The president said earlier this year something that caught all of our attention. 01:00:15.200 |
He said, you know, if I could wave a magic wand, I would cut the defense budget in half for the United States, for China and for Russia. 01:00:25.200 |
We've never heard a US president in history utter anything like that. 01:00:31.200 |
That is what I would call an extraordinarily flexible mindset. 01:00:36.200 |
And if you go into this deal negotiation with China with that sort of flexible mindset, I think it could include all of the above. 01:00:44.200 |
Rare earths, chips, maybe even military cooperation, certainly a rebalancing of trade. 01:00:50.200 |
I think China's-- listen, we entered the year with China paying 15% in tariffs. 01:00:56.200 |
They were paying 15% to the United States in tariffs. 01:01:05.200 |
So I think that they will continue to pay at least 15%, but I think it's going to be much more structured, much more nuanced. 01:01:14.200 |
China has to rebalance to domestic consumption and away from an export economy that's really sticking it to the US in terms of the trade deficit. 01:01:27.200 |
I think they want that for their own country. 01:01:30.200 |
I think the United States understands that can't happen overnight. 01:01:36.200 |
I think that there's going to be a big Chinese deal done before the year's out. 01:01:42.200 |
You've often on the pod been willing to speak about your own temperature for the US markets and whether you're net long or net short. 01:01:54.200 |
And you've, on this podcast, been more enthusiastic than I've ever seen you, both about AI, but this China theory that you have, I think, would cause the markets to rip, if you're correct. 01:02:09.200 |
But you also said we're at all-time highs, you know, and so you want to buy low and sell high. 01:02:20.200 |
Well, we did the pod in March, and I said, we're out of the market. 01:02:27.200 |
And you said you're early, and Liberation Day came, and we were happy to be out of the market. 01:02:31.200 |
On May 2nd, I said, we're all back in, because the Bessent consensus has won. 01:02:39.200 |
I said, you can land the plane, no inflation, you get rate cuts, and it's kind of off to the races. 01:02:45.200 |
And we're up 30% off of that bottom in the NASDAQ since then. 01:02:52.200 |
But when you telescope out, we're up 10% for the year, okay? 01:02:56.200 |
And if I had told you guys on day one of this year, here's what's going to happen. 01:03:00.200 |
We're going to rebalance global trade, and we're going to land the plane around $300 billion. 01:03:05.200 |
We're going to have the economy grow at 3% accelerating. 01:03:10.200 |
We're going to have no inflation heading toward rate cuts by the end of the year. 01:03:22.200 |
And we're going to have all of this AI demand and accelerating demand for AI compute. 01:03:28.200 |
I would have said the market can be up at least 15% for the year. 01:03:31.200 |
I think we've captured a lot of the return for the year, Bill. 01:03:38.200 |
And so I would say that we're also bullish on what we see happening in AI. 01:03:41.200 |
Sunny's going to raise a huge new round here. 01:03:44.200 |
We're happy to be investors with Sunny as well. 01:03:46.200 |
And it's extraordinary to watch what Sunny's- 01:03:50.200 |
If he wasn't supposed to disclose that, he can edit it out. 01:03:57.200 |
Thank you, Sunny, for coming on with us today. 01:04:07.200 |
As a reminder to everybody, just our opinions, not investment advice.