
Stargate, Executive Orders, TikTok, DOGE, Public Valuations | BG2 w/ Bill Gurley & Brad Gerstner


Chapters

0:00 Intro
1:21 Stargate
15:07 Dylan Patel and SemiAnalysis' Take on Stargate
17:56 Stargate's Impact on AI Competitive Landscape
29:03 DeepSeek
32:03 Special Guest Rene Haas (CEO of ARM Holdings)
49:46 DeepSeek continued
51:53 Trump's Executive Orders & Regulatory Freeze
54:10 Coordination in AI Legislation
59:38 US National Debt and Balancing the Budget
61:08 TikTok
65:48 Tech Check

Whisper Transcript

00:00:00.000 | Elon is 100% right.
00:00:03.040 | Nobody has $500 billion.
00:00:05.240 | In fact, nobody has $100 billion to contribute to this
00:00:09.120 | on day one.
00:00:09.840 | Hey, man, great to see you.
00:00:24.080 | Good to see you.
00:00:25.160 | I mean, never a dull moment.
00:00:27.720 | Incredible.
00:00:28.960 | This goes well beyond being a dull moment.
00:00:32.560 | I think we have a year's worth of activity in about 48 hours
00:00:37.000 | this week.
00:00:37.960 | I had an amazing weekend in DC.
00:00:40.040 | There was a tremendous amount of optimism.
00:00:42.360 | I've been saying to you and our friend group
00:00:44.760 | that I remember during Trump 1 in 2016.
00:00:49.000 | I mean, every morning you woke up,
00:00:50.440 | you were kind of holding your breath at what had changed
00:00:53.480 | since the night before.
00:00:54.760 | It was really the first administration
00:00:57.160 | that lived on Twitter.
00:00:58.960 | And here we are again.
00:01:00.440 | In some ways, the last year certainly,
00:01:04.520 | I feel like not a lot was going on in the world.
00:01:07.000 | Like, I wasn't stressing out every morning that I woke up.
00:01:09.960 | And man, this administration has kicked off
00:01:12.880 | with a bang and a major, major bang at the end of the day
00:01:16.800 | yesterday with Stargate, which we're going
00:01:18.640 | to talk a lot about today.
00:01:20.440 | And this morning, I have to say, even for those of us
00:01:23.480 | who are fascinated by the back and forth in Silicon Valley
00:01:27.000 | Bill, the back and forth this morning
00:01:29.000 | among perhaps the most seminal figures in Silicon Valley,
00:01:32.960 | Elon, Larry Ellison, Satya Nadella, Sam Altman,
00:01:37.260 | et cetera, over what is and what is not going on with Stargate
00:01:41.440 | is truly a situation where I think in some ways,
00:01:46.000 | facts are more intriguing than fiction.
00:01:47.800 | But why don't we start off by you just level setting
00:01:50.560 | for folks.
00:01:51.920 | There was a huge announcement at the White House
00:01:55.120 | yesterday with Larry Ellison, Sam Altman, Masa Son,
00:01:58.780 | and President Trump.
00:02:01.240 | Talk to us about what was announced.
00:02:04.600 | Yeah, and maybe I'll do it in kind of how it rolled over me.
00:02:09.480 | So this is the day after the inauguration
00:02:13.940 | and the kind of high profile performance
00:02:18.560 | Trump had with the EO signing.
00:02:21.000 | And rumors-- I don't know.
00:02:23.440 | They started flipping out on Twitter or I don't know where.
00:02:27.680 | Maybe I saw it on the TV.
00:02:29.160 | But there was going to be a big press conference.
00:02:31.840 | And it revolved around this thing called Stargate.
00:02:35.720 | And reminding everyone, Sam Altman at OpenAI
00:02:39.120 | had been talking about the concept of Stargate
00:02:42.240 | for a while now.
00:02:43.560 | He had suggested maybe him and Microsoft were going to do it.
00:02:47.280 | I think that's important.
00:02:48.400 | We can dive into that later, that it's the same name that
00:02:51.360 | had been thrown around.
00:02:52.360 | I don't think there was ever any verification from Microsoft
00:02:57.400 | that they were in on the Stargate project.
00:02:59.240 | But clearly, this is something Sam's
00:03:01.360 | been thinking about for some time.
00:03:03.240 | And so it was supposed to happen at 4.
00:03:05.280 | I think they finally started at 5.30 or something.
00:03:08.560 | So it was like 90 minutes late.
00:03:10.960 | And then this door opens and Trump comes in.
00:03:15.440 | And then Masa Son representing SoftBank, Larry Ellison
00:03:19.000 | representing Oracle, and Sam representing OpenAI.
00:03:23.080 | It looked a little haphazard.
00:03:24.640 | It looked like it had been thrown together quickly.
00:03:28.040 | And clearly, my interpretation, one of the reasons
00:03:33.760 | that it might have been thrown together quickly
00:03:36.200 | is Trump was looking to show that he was having
00:03:41.200 | quick momentum right out of the gate in terms
00:03:43.280 | of having an impact.
00:03:44.440 | And the things he said would suggest
00:03:46.720 | that that was part of the reasoning for doing this now
00:03:50.520 | and so quickly.
00:03:52.680 | I think if you just go to the posts that OpenAI put out,
00:03:56.400 | it's probably got the most detail.
00:03:58.760 | This thing's lacking in quite a bit of detail, which
00:04:01.240 | is why there's so much speculation.
00:04:03.480 | It says the Stargate project is a new company.
00:04:07.080 | So even that, I think, is in question.
00:04:09.600 | So that's going to be a new corporate entity which
00:04:12.000 | intends to invest $500 billion over the next four years, which
00:04:18.000 | I guess Microsoft's the highest right now at $80 billion
00:04:21.280 | a year.
00:04:22.240 | So this would be $125 billion a year
00:04:25.440 | if they met that commitment or target.
00:04:29.360 | It says we will begin deploying $100 billion immediately.
00:04:33.120 | There's a lot of talk of American leadership.
00:04:35.720 | There's talk of this being in America.
00:04:38.040 | It says the initial equity funders in Stargate
00:04:41.320 | are SoftBank, OpenAI, Oracle, and MGX.
00:04:45.360 | And maybe you can tell our listeners who MGX is
00:04:49.720 | in a minute.
00:04:50.560 | It says SoftBank and OpenAI are the lead partners.
00:04:53.680 | With SoftBank having financial responsibility
00:04:56.360 | and OpenAI having operational responsibility,
00:04:59.560 | Masa Son will be chairman.
00:05:01.560 | And it lists key initial technology partners
00:05:04.280 | are Microsoft, NVIDIA, Oracle, and OpenAI.
00:05:08.280 | Yeah, so I'd say that's the gist of it.
00:05:11.320 | I mean, it sounds to me like maybe an even better funded
00:05:17.240 | version of CoreWeave.
00:05:19.040 | But I don't-- there's a lot I don't know.
00:05:21.360 | And I'm asking questions like a lot of other people.
00:05:24.200 | How do you see it?
00:05:25.360 | Well, I mean, listen.
00:05:26.520 | I think that for now, well over a year,
00:05:30.080 | OpenAI and many others have been talking about the need
00:05:33.720 | for massively more compute.
00:05:35.400 | A lot of the talk in Washington this weekend
00:05:38.200 | with various cabinet ministers and others
00:05:40.400 | was that this administration was going to dramatically
00:05:43.640 | accelerate power generation in the US,
00:05:46.240 | data center construction in the US,
00:05:48.160 | and compute in order that we win at AI.
00:05:50.160 | So there's no surprises there.
00:05:53.440 | The fact that they were able to announce this the day
00:05:55.880 | after the inauguration, that it was
00:05:57.880 | at this scale and magnitude-- now remember,
00:05:59.920 | this comes on the heels when I think
00:06:02.120 | there was a lot of talk in the back half of last year
00:06:05.240 | that Stargate was dead, that the need for this much compute
00:06:09.480 | was overstated, that maybe models
00:06:11.560 | were hitting some scaling limits.
00:06:13.600 | Smaller models could get the job done.
00:06:16.080 | So I think there was a lot of skepticism, frankly,
00:06:18.480 | about maybe even the demand for compute.
00:06:21.440 | So there was a wall of worry, I think,
00:06:23.720 | in the back half of last year as to what was
00:06:25.560 | going to happen with compute.
00:06:26.720 | So this was very orthogonal and much bigger
00:06:30.440 | even than the original $100 billion
00:06:32.720 | talked about when you and I first
00:06:34.560 | talked about the Stargate discussion in the spring
00:06:38.160 | of 2024.
00:06:40.240 | So when they made this announcement,
00:06:42.560 | the first thing I started asking myself
00:06:44.760 | was, how do they actually roll this out, Bill?
00:06:49.280 | Because you're right.
00:06:50.800 | There was a lot of white space left in this.
00:06:53.720 | And so I immediately, being the analyst that we are,
00:06:57.720 | got the team on the phone.
00:06:59.000 | And I said, we need to build a bottoms-up model.
00:07:01.480 | Everybody's talking about $500 billion.
00:07:04.040 | And now this morning, we have debates
00:07:05.880 | between Elon, who says, funding not secured,
00:07:10.160 | like they don't have the money.
00:07:11.480 | And on the other side, we have folks saying, of course,
00:07:13.920 | we're already doing this in Abilene.
00:07:15.760 | Larry said at the press conference,
00:07:17.600 | they have 10 buildings already built in Abilene.
00:07:20.000 | They have another 10 under construction.
00:07:21.840 | And we know that OpenAI is already running workloads out
00:07:24.960 | of Abilene, Texas, which is where this mega project's
00:07:27.880 | going to be built.
00:07:29.080 | And so--
00:07:29.880 | I bet you that's the first time you've
00:07:31.960 | said Abilene three times in a row in under a minute.
00:07:34.640 | By the way, I talked a lot about Abilene last year on the pod
00:07:38.080 | and with our team.
00:07:38.880 | Do you even know where Abilene is?
00:07:40.760 | I do, West Texas.
00:07:42.880 | I thought one of the things that I hope you and I add
00:07:45.120 | to the conversation writ large, there's
00:07:48.560 | a lot of talking heads and battling going on this morning.
00:07:51.040 | But what might this look like in reality?
00:07:53.520 | And so we'll share some projections
00:07:56.200 | that our team has made, which are obviously
00:07:58.600 | back of the envelope, because we don't have the precise data.
00:08:01.720 | But let's just assume--
00:08:03.280 | Let me ask you a quick question.
00:08:06.080 | If this is already, quote, "underway"
00:08:08.240 | and this is, according to the OpenAI press release,
00:08:11.440 | a new company, are we presuming that NewCo is acquiring
00:08:17.160 | some assets that were owned by somebody else?
00:08:20.960 | On whose balance sheet was the initial project sitting?
00:08:24.800 | Yeah, I honestly have no details on that.
00:08:28.920 | And if I did, I probably couldn't share them, Bill.
00:08:31.280 | Just as a reminder to everybody, I
00:08:33.240 | think from an investment perspective,
00:08:35.040 | Altimeter's invested in a lot of these companies, right?
00:08:38.280 | Our best perspective here is we know
00:08:42.120 | that there was construction and work going on in Abilene
00:08:44.800 | last year with Oracle.
00:08:46.240 | I assume that was kind of a direct relationship
00:08:48.520 | between the company and Oracle.
00:08:50.040 | But I have no insights about whether--
00:08:53.160 | how this thing is particularly going to be structured.
00:08:56.080 | What's more interesting to me is what is the equity check?
00:09:00.040 | Because there's this question.
00:09:02.400 | Based on some of the tweets this morning, Gavin Baker--
00:09:04.680 | and we'll show that as well--
00:09:05.880 | said this $500 billion is totally make-believe.
00:09:10.000 | And so based on those, it might have
00:09:12.120 | you think that somebody has to show up on day one
00:09:14.520 | with $500 billion in order for this to get off the ground.
00:09:19.000 | They're almost saying that it's a ruse,
00:09:20.680 | that the announcement was just a bunch of BS.
00:09:24.240 | But if we break it down, let's assume
00:09:27.080 | they started on this three months ago.
00:09:28.960 | And let's assume that they were able to secure 250,000 GPUs
00:09:33.720 | out of NVIDIA this year.
00:09:37.520 | If you break that down, that's about $13 billion of capex
00:09:40.840 | that they could spend in 2025.
00:09:43.560 | In the press release you quoted, they
00:09:45.240 | said we're going to start to spend $100 billion in 2025.
00:09:50.200 | So with $13 billion, you can stand up maybe 250,000 GPUs
00:09:57.080 | this year.
00:09:58.320 | Out of the $13 billion of capex required,
00:10:02.240 | if you make an assumption of about 25/75 or 30/70 equity to debt,
00:10:08.200 | that probably requires a $3 billion equity check.
00:10:11.680 | And I'm assuming that there are debt providers.
00:10:13.720 | We've read about these deals already in the past.
00:10:16.480 | MGX, which is the Sovereign Wealth Fund, a big, big fund
00:10:20.520 | out of the United Arab Emirates.
00:10:22.080 | BlackRock is also said to be involved in the debt financing.
00:10:26.560 | But we know there are a lot of debt providers
00:10:28.680 | who are competing to provide debt to these data centers.
00:10:32.880 | So the equity check for 2025 would be relatively small.
00:10:37.480 | Let's assume that they want to ramp that up,
00:10:39.760 | put the pedal to the floor, and ramp that up in Abilene
00:10:43.960 | for 2026.
00:10:45.760 | Let's assume that they could secure, I don't know,
00:10:48.560 | 2 million GPUs from NVIDIA.
00:10:52.280 | Now you're talking about $100 billion in total capital
00:10:56.000 | that would be needed.
00:10:57.640 | 70%, 80% of that goes to GPUs.
00:11:00.600 | 20%, 30% of that goes to data center, land, and power.
00:11:05.680 | And so the equity check that would
00:11:08.440 | be required to do that bill is about another $25 billion
00:11:11.640 | in 2026.
00:11:14.000 | Now just to put it in perspective,
00:11:16.080 | to stand up 2 million GPUs in 2026,
00:11:20.120 | you need about 2.7 gigawatts of power.
00:11:25.920 | So this is not inconsequential at all.
00:11:28.440 | This would be the biggest deployment of GPUs
00:11:32.120 | anywhere in the world, even if they just stop there.
00:11:35.080 | And one thing worth noting that I failed to mention,
00:11:39.400 | Trump did say that one of the things
00:11:42.080 | that the Trump administration was bringing to the table
00:11:46.000 | was going to be a push to clear regulation to allow
00:11:50.800 | more power generation faster.
00:11:52.800 | And we've heard from people like Satya
00:11:55.080 | who say they're power limited.
00:11:57.160 | And so one thing that may be new in this situation
00:12:01.760 | is this entity may have an advantage in getting
00:12:06.400 | a hold of power faster, although there
00:12:08.240 | would be an incentive to do that for all of the players,
00:12:10.560 | I would imagine.
00:12:11.600 | And I would tell you, being in Washington this weekend,
00:12:14.040 | speaking with folks coming into the administration like Doug
00:12:18.360 | Burgum, I mean, if there's anything that I believe
00:12:22.760 | to be true, drill, baby, drill, and tapping US energy reserves
00:12:27.200 | and removing obstacles and regulations in order
00:12:29.680 | to light all this up in America rather than other places
00:12:32.720 | is probably priority number one.
00:12:34.960 | And probably the quickest thing you
00:12:36.480 | could do to generate power, to get it up to speed as fast
00:12:40.920 | as possible, or just to put in an LNG plant,
00:12:45.160 | that's very fast compared to nuclear.
00:12:46.880 | That's exactly what's going on, I think, out there in Abilene.
00:12:49.680 | You and I should go for a visit.
00:12:51.640 | Listen, you can't get SMRs or nuclear fission
00:12:54.560 | or anything else that can do this.
00:12:56.800 | But that's why they, I think, started building there.
00:12:59.880 | But just back to the analysis, Bill.
00:13:01.880 | So to get to the end of 2026 with 2 and 1/2 million GPUs,
00:13:07.240 | the equity check required is something like 25, 30 billion.
00:13:11.040 | So that just calibrates, like, how much does somebody really
00:13:14.280 | have to show up with, I think, in order to get this stood up?
00:13:17.360 | And then we could go into 2027, assuming
00:13:20.440 | that you're going to stand up another 2 million GPUs, which
00:13:23.400 | would bring your cluster to 4 and 1/2 million GPUs.
00:13:26.080 | You need another 25 billion of equity.
00:13:28.200 | So at least on that score, just the question of,
00:13:32.160 | how big is the equity check?
00:13:33.640 | You're talking well under $50 billion in years 1 and 2.
00:13:38.560 | You're talking about $78 billion over a period of four years
00:13:42.840 | to the end of 2028.
00:13:44.720 | So we need to-- the $500 billion is a sexy headline.
00:13:48.560 | It's important.
00:13:49.600 | These things are going to be built here.
00:13:51.720 | But the equity check required in order to stand this up--
00:13:54.680 | and if they just achieved what I just outlined
00:13:56.800 | through the end of 2028, that requires 7 and 1/2 gigawatts
00:14:01.400 | of power out of Abilene.
00:14:03.760 | And so to me, you're kind of on the outside limits
00:14:07.040 | of what's possible.
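Brad's back-of-the-envelope model can be reproduced with a few lines of arithmetic. A minimal sketch follows; the ~$50k all-in cost per GPU, the 25/75 equity-to-debt split, the ~1.35 kW of facility power per GPU, and the 2027-2028 deployment schedule are assumptions backed out of the round numbers quoted in the conversation, not disclosed Stargate figures:

```python
# Sketch of the equity math discussed above. All constants are
# assumptions inferred from the conversation, not actual Stargate financials.

COST_PER_GPU = 50_000    # dollars, all-in (chip, networking, facility share)
EQUITY_FRACTION = 0.25   # the 25/75 equity-to-debt split discussed
KW_PER_GPU = 1.35        # assumed total facility power draw per GPU

def year_economics(gpus: int) -> dict:
    """Capex, equity check, and power implied by one year's GPU deployment."""
    capex = gpus * COST_PER_GPU
    return {
        "capex_b": capex / 1e9,                     # total capex, $B
        "equity_b": capex * EQUITY_FRACTION / 1e9,  # equity check, $B
        "power_gw": gpus * KW_PER_GPU / 1e6,        # power needed, GW
    }

# Assumed schedule: 250k GPUs in 2025, then ~2M per year through 2028.
schedule = {2025: 250_000, 2026: 2_000_000, 2027: 2_000_000, 2028: 2_000_000}
by_year = {year: year_economics(gpus) for year, gpus in schedule.items()}

total_equity_b = sum(v["equity_b"] for v in by_year.values())
# 2025: ~$13B capex, ~$3B equity; 2026: ~$100B capex, ~$25B equity, ~2.7 GW.
# Cumulative equity through 2028: ~$78B, matching the figure Brad cites.
```

The point of the sketch is the same as Brad's: with leverage, the day-one equity check is an order of magnitude smaller than the $500 billion headline.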
00:14:08.520 | Well, but they talked about multiple locations, too.
00:14:11.040 | So they would move.
00:14:13.040 | Now, you're assuming a 3-to-1 debt to equity,
00:14:16.320 | which I guess is something similar to what Corwe's done.
00:14:19.240 | Equinix has, over the years, run closer to 1-to-1.
00:14:23.040 | And so that would be assuming a substantial amount of risk.
00:14:28.240 | Now, no one's more prone to risk than Masa Son and SoftBank.
00:14:33.000 | So it's very plausible that would be their assumption.
00:14:37.160 | And remember here, again, over the days and weeks ahead,
00:14:41.360 | we're going to refine this back of the envelope.
00:14:45.320 | But I just wanted to calibrate.
00:14:48.000 | Nobody has to come up with $500 billion right now.
00:14:50.600 | Nobody has to come up with $100 billion right now.
00:14:52.600 | Nobody needs to come up with $50 billion right now.
00:14:55.640 | We know there are lenders willing to lend.
00:14:57.800 | But that's an important question.
00:14:59.280 | And we're going to try to get to the bottom.
00:15:01.120 | What would BlackRock and what would MGX require in terms
00:15:04.840 | of that debt to equity split?
00:15:07.840 | So Bill, as we know, everybody had a huge incentive
00:15:10.560 | to get to this big number.
00:15:12.400 | Masa wanted to get to a bigger number.
00:15:14.680 | Trump, when they were down in Mar-a-Lago,
00:15:16.440 | took it from $100 billion to $200 billion.
00:15:18.600 | And now they got to $500 billion,
00:15:20.120 | which served everybody around the table really well.
00:15:24.760 | We broke it down in our numbers to try to reverse engineer
00:15:28.920 | our way to that $500 billion in CapEx.
00:15:32.200 | And even in our numbers, we've demonstrated
00:15:34.320 | you need a much, much smaller equity check
00:15:36.920 | so that nobody has to show up with $100 billion or $500
00:15:39.840 | billion on day one.
00:15:41.840 | But there's another way to look at this as well,
00:15:44.600 | which is that we're not even talking about $500 billion
00:15:47.880 | in CapEx.
00:15:48.800 | But instead, Dylan, who was on our pod in December,
00:15:52.840 | has done an analysis that says maybe they
00:15:55.000 | were talking about the total cost of operation,
00:15:57.640 | not about CapEx at all.
00:15:59.880 | And he has a piece of analysis, which is quite interesting,
00:16:02.880 | which shows that even if you aggressively build an Abilene,
00:16:09.840 | the fact is you're only going to be
00:16:11.920 | able to spend something like $100 billion in total
00:16:15.680 | between now and the end of 2027.
00:16:18.360 | I think he estimates something like 800,000 total GPUs
00:16:23.400 | that would get purchased over that period of time.
00:16:26.440 | He does note that those GPUs, which
00:16:29.160 | are much more powerful--
00:16:30.240 | Blackwell and eventually Rubin GPUs--
00:16:33.200 | will be the equivalent to millions of H100s.
00:16:37.800 | So just if we want to compare what
00:16:40.320 | he's forecasting in Abilene compared to the 200,000 cluster
00:16:44.920 | that we're talking about in Memphis.
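To make Dylan's equivalence concrete, here is a hypothetical calculation. The ~2.5x per-GPU training speedup of a Blackwell-class chip over an H100 is an illustrative round number I am assuming, not a benchmark cited in the episode:

```python
# Illustrative only: the 2.5x Blackwell-vs-H100 factor is an assumed
# round number, not a measured figure from the conversation.
H100_EQUIV_PER_GPU = 2.5

blackwell_gpus = 800_000   # Dylan's estimated Abilene buildout through 2027
memphis_h100s = 200_000    # the Memphis cluster size mentioned above

h100_equivalents = blackwell_gpus * H100_EQUIV_PER_GPU
ratio_vs_memphis = h100_equivalents / memphis_h100s
# ~800k newer GPUs come out to ~2M H100-equivalents under this assumption,
# roughly 10x the Memphis cluster.
```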
00:16:46.680 | But in that case as well, it's very consistent with us.
00:16:49.360 | The amount of equity that would be required on day one
00:16:53.520 | is a fraction of the amount we're talking about.
00:16:56.080 | So if you're trying to square the circle,
00:16:59.120 | how can both Elon and Sam be right?
00:17:01.760 | Elon is 100% right.
00:17:04.840 | Nobody has $500 billion.
00:17:07.000 | In fact, nobody has $100 billion to contribute to this
00:17:10.840 | on day one.
00:17:11.760 | But both our analysis and Dylan's analysis
00:17:14.560 | show that that's not required in order
00:17:16.880 | to begin scaling this up.
00:17:18.240 | So Sam is also right.
00:17:20.000 | They are, in fact, building in Abilene.
00:17:22.640 | And you can scale up to much, much bigger clusters
00:17:26.560 | without this headline number being required
00:17:30.120 | to be delivered on day one.
00:17:32.200 | But Bill, let me ask you a different question.
00:17:34.360 | Let's assume that we're going to spend a lot more money
00:17:39.320 | than we thought we were going to spend 30 days ago,
00:17:42.120 | or certainly three months ago.
00:17:45.240 | Maybe talk a little bit about what this tells us
00:17:50.000 | about where we're headed and/or how
00:17:52.480 | this impacts the competitive landscape out there.
00:17:55.960 | Well, I think in order to do that,
00:17:57.360 | you kind of got to back up and look
00:17:59.920 | at the motivations of each of these parties.
00:18:02.240 | And I think there are numerous things going on that
00:18:06.320 | all led to this big event.
00:18:09.080 | And that's not to suggest that it's not going to happen.
00:18:13.360 | It sounds like it is going to happen.
00:18:15.040 | It's just there are a lot of different motivations.
00:18:17.920 | So as we talked about, Sam's been
00:18:21.040 | talking about this for a while.
00:18:23.240 | He has a belief that he needs access
00:18:26.280 | to a very large data center.
00:18:28.720 | It appears from this conjecture on my part.
00:18:32.280 | And I'm not an investor like you.
00:18:34.480 | So if you can't respond, don't feel that you need to.
00:18:38.640 | But it appears that two things might have played
00:18:42.080 | a role in this happening.
00:18:43.600 | One is Elon and x.ai.
00:18:46.400 | Elon and Gavin, who's a big backer of x.ai,
00:18:50.440 | have pointed out that they have a competitive advantage
00:18:53.200 | over OpenAI because they have their own infrastructure.
00:18:56.840 | This isn't exactly OpenAI's own infrastructure,
00:18:59.600 | but it probably feels that way.
00:19:01.280 | So that may have provoked this a little bit.
00:19:03.960 | And then the other thing is just the much-discussed relationship
00:19:08.280 | between Microsoft and OpenAI.
00:19:10.880 | This easily could be one of the pieces in the back and forth
00:19:16.920 | between those two parties.
00:19:18.560 | And we got a quick press release out of Microsoft
00:19:22.280 | immediately after this was announced.
00:19:24.360 | And it's hard to dissect what might be going on there.
00:19:28.040 | But I think that played a role in this happening.
00:19:31.480 | I think my own interpretation is Microsoft
00:19:34.600 | didn't want to build this big OpenAI
00:19:38.440 | playground, or one as big as maybe Sam wanted to build.
00:19:42.480 | And so he went somewhere else.
00:19:44.080 | And everybody's-- everyone at the surface
00:19:47.720 | behaving like they're OK with that.
00:19:49.560 | I think that the other parties--
00:19:51.840 | Masa's always looking to be involved in the biggest
00:19:55.040 | movement that's happening, that happened in previous waves
00:19:59.360 | we've been involved with.
00:20:00.360 | So no surprise that he shows up here.
00:20:03.200 | And then for Larry Ellison and Oracle,
00:20:06.400 | if you look at their relative to position as a hyperscaler,
00:20:10.560 | I think people think of Amazon and Google
00:20:14.480 | before they think of Oracle.
00:20:16.120 | And so this gives them a platform
00:20:19.920 | to kind of brag a little bit about that they're
00:20:24.080 | at the front of the line.
00:20:25.160 | So I think those are the things that
00:20:27.080 | are happening in the background that take us to this place.
00:20:30.480 | And I think it's important to note that Oracle was, I think,
00:20:33.680 | at the start helping x.ai in Memphis.
00:20:36.480 | But x.ai decided to go it alone and build out
00:20:39.960 | their own infrastructure.
00:20:41.120 | And as Jensen-- we discussed with Jensen on the pod,
00:20:44.600 | listen, Elon's an N of 1.
00:20:47.040 | He stood it up faster.
00:20:48.280 | It worked better.
00:20:49.200 | He built a bigger cluster.
00:20:50.920 | We're going to see.
00:20:51.720 | He's going to launch Grok 3 here.
00:20:53.440 | I think they're going to have the output, the first training
00:20:56.240 | run on the biggest cluster in the world.
00:20:58.040 | And we're excited to see what that yields.
00:21:01.680 | From my own perspective, I mean, set aside for a second.
00:21:05.240 | I think you nailed it.
00:21:06.240 | Those are largely the motivations
00:21:08.320 | of the individual parties.
00:21:09.600 | I think, again, just as a country,
00:21:11.320 | we just went through this amazing inauguration.
00:21:14.040 | We're going to talk a little bit about Stan Druckenmiller
00:21:16.800 | later.
00:21:17.320 | But I think that this level of competition,
00:21:21.160 | that this scaling is occurring in the US,
00:21:23.880 | that this investment is occurring in the US,
00:21:26.080 | I think it is fantastic for the United States of America.
00:21:29.320 | This increases the probability that we're
00:21:32.000 | going to win in the race to AI.
00:21:33.640 | We needed more power.
00:21:35.200 | We need this 5 to 10 gigs to come online.
00:21:37.640 | We need to have-- now, does that lead us to AGI or ASI
00:21:41.400 | or whatever?
00:21:42.440 | It certainly puts us in a very strong competitive position.
00:21:47.360 | And you're right.
00:21:48.280 | I think it is that competition on the field, Bill.
00:21:51.000 | I think it is the fact that Elon stood up a bigger cluster.
00:21:54.240 | And that is a competitive advantage.
00:21:56.160 | And he did it fast, et cetera.
00:21:57.840 | And so now you have an alternative here,
00:21:59.960 | which brings me back to just what
00:22:01.800 | does this mean for the competitive landscape, Bill?
00:22:04.560 | Think about it.
00:22:05.200 | You and I just went through this slide
00:22:06.960 | that we'll share again on the hyperscaler, CapEx, that's
00:22:11.600 | expected for 2025.
00:22:13.560 | Well, you've added another player on top of it.
00:22:16.400 | I mean, the person that is probably most obviously
00:22:21.880 | impacted by this is Jensen and NVIDIA.
00:22:24.200 | And of course, the stock's up 5% today.
00:22:26.200 | So these aren't secrets.
00:22:27.960 | These are obvious facts, right?
00:22:31.200 | But it's yet another large customer of NVIDIA gear.
00:22:38.120 | And the size and scope of what they
00:22:41.360 | want to do, even if an NVIDIA competitor showed up,
00:22:44.560 | I don't think they could create enough production
00:22:48.640 | in the amount of time you need.
00:22:50.120 | You even hinted that Stargate may
00:22:53.080 | be limited by whatever NVIDIA is willing to give them,
00:22:57.600 | which is an interesting dynamic.
00:22:59.800 | We've been chip constrained for two years.
00:23:02.120 | And now you have a new player on the field that
00:23:04.120 | may be raising their hand and saying
00:23:05.600 | they need 2 million GPUs.
00:23:07.360 | Remember, just a couple of years ago,
00:23:08.920 | we weren't making 2 million GPUs.
00:23:11.200 | I think the forecast this year is for something
00:23:13.480 | like 6 million GPUs, right?
00:23:15.680 | So if I'm sitting in the Googleplex today,
00:23:20.080 | what does it say to me?
00:23:21.160 | I got to spend more on CapEx.
00:23:23.000 | I got to make sure I secure my GPUs.
00:23:25.440 | If I'm at Microsoft, got to secure my GPUs.
00:23:28.040 | I've got to make sure I'm spending enough.
00:23:30.280 | If I'm at Meta today, what do I need?
00:23:32.400 | I need to make sure that I'm not losing out
00:23:35.280 | to Stargate/OpenAI in the order book for GPUs.
00:23:39.720 | So, again, I heard just as recently as a month ago
00:23:45.840 | that 2026, there's not going to be demand for GPUs
00:23:49.720 | and whatnot.
00:23:51.480 | And Dylan came on this podcast and called all of that garbage.
00:23:55.160 | He said the demand is off the charts.
00:23:57.040 | People aren't making 12-month bets.
00:23:59.200 | They're making three-year bets.
00:24:01.520 | And I think, again, this is just further validation of that.
00:24:04.480 | Here's the way I think it changes
00:24:05.800 | the competitive landscape.
00:24:08.040 | I think there are only--
00:24:10.080 | the bigger the stakes get--
00:24:12.480 | and Sam just pushed a big pile into the middle--
00:24:15.560 | there are only certain companies that can be in that game.
00:24:18.440 | X.AI, because of the genius of Elon and the momentum
00:24:23.080 | they have and the operating businesses that
00:24:25.200 | stand behind them, are 100% going to be in that game
00:24:28.280 | and maybe eventually leading that game.
00:24:30.760 | Amazon is like-- this is a real test for them.
00:24:35.000 | Is Anthropic going to be able to show--
00:24:37.680 | they have less than a billion in revenue.
00:24:40.320 | They don't really have--
00:24:41.960 | they have great models, I would argue.
00:24:44.320 | But they don't have the consumer or the enterprise traction
00:24:47.200 | that the other folks do.
00:24:48.520 | Can they come up with the money that's going to be required?
00:24:51.200 | Or is Amazon willing to put up that sort of money?
00:24:54.240 | Amazon has never been that aggressive with CapEx
00:24:56.800 | relative to the other folks.
00:24:58.400 | Meta is going to be in the game.
00:24:59.760 | Google is going to be in the game.
00:25:01.200 | I suspect that this puts more pressure on their CapEx
00:25:04.280 | to come up.
00:25:05.240 | And I think you're right that because of that and the fact
00:25:08.800 | that Jensen said there are 35 other AI factories
00:25:12.080 | around the world that are not hyperscalers,
00:25:14.440 | now you've got Oracle being a major player in the game
00:25:17.880 | because of Stargate.
00:25:20.480 | Again, I think the net-net benefit here
00:25:23.000 | is that we're going to get a lot better AI from all
00:25:25.320 | of these players.
00:25:26.480 | But I think you are going to be chip constrained.
00:25:28.560 | I think you are going to be power constrained
00:25:31.760 | for the next several years.
00:25:33.160 | And I think if you're sub-scale on any level,
00:25:37.200 | I think those players ultimately have to get folded
00:25:39.920 | into a much larger entity.
00:25:42.320 | We saw, reportedly, Anthropic raised a couple billion
00:25:45.640 | from Lightspeed and maybe a billion
00:25:47.160 | I saw today, an announcement out of Google.
00:25:49.520 | So they are raising money.
00:25:51.480 | They do have the capital.
00:25:52.600 | But every time they buy in, Bill, for a little bit more
00:25:55.600 | at the poker table, somebody else goes over the top
00:25:59.160 | and the demands of the pot size just get bigger and bigger.
00:26:01.720 | Yeah, and one of my big takeaways is--
00:26:05.280 | and it was this already, but it's a sport of kings.
00:26:09.600 | It's the amount of money that's being thrown around.
00:26:13.240 | I think with all the announcements,
00:26:15.560 | the billion dollars from Google into Anthropic
00:26:18.360 | almost fell through the cracks, right?
00:26:21.040 | Because it just doesn't-- it's not as big a number.
00:26:24.200 | It's part of why I think there's so much hyperbole,
00:26:27.680 | is there is a game in the background of,
00:26:31.040 | can I bid even more?
00:26:32.160 | Can I bid even bigger?
00:26:33.440 | Can I talk even bigger?
00:26:35.920 | Potentially to try and scare some out of the game.
00:26:39.160 | I also think, though, that--
00:26:40.920 | and we've talked about this in the past-- that causes
00:26:43.720 | an increase of risk as well.
00:26:46.000 | There's some point at which you deploy
00:26:49.280 | CapEx, which is slow to bring on,
00:26:52.080 | where you could overrun demand.
00:26:56.160 | And if you have a 3 to 1 debt to equity ratio,
00:26:59.800 | that could be a real painful downside.
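Bill's 3-to-1 leverage point can be made concrete with toy numbers (the ratio is his; the $100B asset figure and the function below are purely illustrative):

```python
def equity_after_drawdown(assets, debt_to_equity, drawdown):
    """Equity remaining after asset values fall, with the debt fixed."""
    equity = assets / (1 + debt_to_equity)   # e.g. 3:1 -> equity is 1/4 of assets
    debt = assets - equity                   # debt holders are owed this regardless
    return assets * (1 - drawdown) - debt

# $100B of data-center assets funded at 3:1 debt to equity:
print(equity_after_drawdown(100e9, 3, 0.0))   # full value: the $25B equity is intact
print(equity_after_drawdown(100e9, 3, 0.25))  # a 25% drop in value wipes the equity entirely
```

That asymmetry is the "real painful downside": at 3:1, equity absorbs every dollar of the first 25% of any shortfall in demand.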
00:27:02.800 | So clearly, no one right now is thinking that way.
00:27:07.280 | Clearly, all these people are very positively minded.
00:27:11.000 | I tip my hat to Sam Altman's aggressive, ambitious
00:27:17.800 | gamesmanship.
00:27:19.320 | This is clearly something he wanted to do.
00:27:21.080 | He found the right window.
00:27:23.120 | It's certainly interesting to watch the triangle of Sam
00:27:28.480 | and Elon and Trump here.
00:27:29.840 | Because some people said, oh, is Elon
00:27:32.640 | going to tell Trump what to do?
00:27:34.040 | Well, clearly here, you've got some dynamics where not
00:27:38.840 | everybody's on the same page.
00:27:41.080 | This comes back to this question I
00:27:43.360 | ask about, where does this tell us about where we're headed?
00:27:46.680 | And what I mean by that question, Bill,
00:27:49.440 | is really, where are we in the stage of model development?
00:27:53.120 | Because we know that this pre-training is asymptoting.
00:27:56.560 | You've been at the forefront of discussing this.
00:28:00.040 | But we also know, as Jensen showed in this slide,
00:28:03.280 | that we now have three waves of scaling.
00:28:05.960 | We have this post-training and now this inference time compute.
00:28:09.880 | Lost again in the shuffle today, OpenAI announced Operator,
00:28:14.000 | which is, again, built on the back of inference time compute,
00:28:17.360 | which is really more around this planning and actions.
00:28:21.400 | And when I look at the year 2025 and what this will enable,
00:28:28.480 | clearly, there is a lot more enthusiasm
00:28:32.440 | about the progress being made by these models.
00:28:35.760 | Whether that enthusiasm is at Google
00:28:37.640 | or whether it's at x.ai and what we're
00:28:39.600 | going to see out of Grok 3 or what they're seeing out
00:28:41.920 | of the O-series at OpenAI.
00:28:45.360 | Part of the reason I think folks have the confidence
00:28:48.440 | to commit to this level of equity and debt
00:28:51.360 | is because of the use cases and the demand
00:28:54.920 | that they're seeing by both consumers and enterprises
00:28:59.440 | for more of this and the progress that's
00:29:01.120 | being made on these models.
00:29:02.920 | In that regard, maybe just talk for a second
00:29:06.520 | about these breakthroughs that are happening
00:29:09.360 | in China around DeepSeek.
00:29:11.360 | Because I think here's a model development that there's a lot--
00:29:15.160 | I think over the weekend, people were blown away
00:29:18.560 | by the benchmarks that were achieved
00:29:20.560 | by this small, open source, inexpensively trained
00:29:23.920 | Chinese model.
00:29:25.440 | I can assure you of this.
00:29:27.360 | One of the top priorities on the mind of our new AI czar, David
00:29:30.920 | Sacks, and lots of folks in the administration
00:29:34.480 | is national security.
00:29:36.000 | Their number one objective is how do we
00:29:37.720 | speed up the United States?
00:29:39.200 | And you saw the Stargate announcement.
00:29:41.000 | That's all about speeding up the United States.
00:29:43.840 | But the other one is how do we not make it easy on China
00:29:47.680 | to compete?
00:29:48.400 | So here's a situation where even with deprecated chips,
00:29:53.080 | they don't have cutting edge chips out of NVIDIA.
00:29:55.160 | They seemingly train something that's very competitive.
00:29:58.680 | What's your read on that, Bill?
00:30:00.480 | My studying of what other people are saying,
00:30:03.240 | because I try and absorb as much of this as I can,
00:30:07.120 | is that the Chinese DeepSeek in particular
00:30:10.680 | has figured out a way--
00:30:12.000 | the word people are using is distill.
00:30:14.160 | They've figured out a way to basically shrink their model
00:30:21.520 | theoretically, possibly by connecting
00:30:25.920 | to an API of one of these other foundational models
00:30:29.440 | that we've talked to, and using that as a guide,
00:30:33.000 | and even make these things more efficient.
00:30:35.320 | So to pack the same amount of horsepower, if you will,
00:30:39.560 | in a much smaller engine, and get
00:30:44.120 | to the exact same competitive benchmarks on the output.
00:30:48.680 | And so in one way, you could say, oh, my god,
00:30:52.080 | this is this great commoditization
00:30:54.280 | of the foundational models.
00:30:55.880 | I think that's potentially valid.
00:30:57.640 | The other thing you could say is, wow,
00:30:59.920 | we're going to get so much more performance per token price.
00:31:04.080 | And that can increase the market,
00:31:07.120 | because it gives so much more tools to the startups
00:31:12.200 | that they can play with.
00:31:13.400 | And apparently, one of the tests even
00:31:15.680 | had this thing doing the chain of thought reasoning,
00:31:19.880 | and it was successful at that.
00:31:21.760 | And so when you consider that that's a 10 to 100x increase
00:31:27.200 | in token use, if you get down to these lower price points now,
00:31:30.200 | you can do even more for less.
00:31:31.960 | So it could unlock quite a bit.
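The "distill" idea Bill describes — a small student model trained to match a larger teacher's soft outputs — can be sketched with a generic textbook loss (the temperature and logits below are illustrative; this is not anything DeepSeek has disclosed about its method):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened probabilities; higher T exposes more of the teacher's 'dark knowledge'."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student): the student is penalized for diverging from the teacher's distribution."""
    p = softmax(teacher_logits, T)    # teacher's soft targets (e.g. sampled via an API)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

A student trained to drive this loss toward zero reproduces the teacher's behavior with far fewer parameters, which is why the same benchmark scores can come out of a much smaller, cheaper-to-train model.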
00:31:33.720 | I think, for me, the most interesting part of it
00:31:36.600 | is, I actually think--
00:31:38.640 | well, I've said this before, but I
00:31:40.280 | think the policy of the American government
00:31:45.040 | to try and keep China out of the AI game is futile.
00:31:49.440 | I've always felt that way.
00:31:51.880 | I don't think it'll work.
00:31:52.960 | In this case, I actually think it backfired.
00:31:55.520 | There is a phrase that people use
00:31:58.480 | that constraints drive creativity.
00:32:01.760 | And Jobs used to say this quite a bit.
00:32:04.600 | Whoa, whoa.
00:32:05.440 | Bill, bookmark that.
00:32:07.560 | OK, we'll come back.
00:32:08.400 | Rene Haas, CEO of Arm, just jumped into our conversation.
00:32:12.040 | Hey, Rene.
00:32:12.800 | Hello there.
00:32:13.680 | Hello, Bill.
00:32:14.640 | How are you doing, Rene?
00:32:15.800 | Well, how are you?
00:32:17.000 | Great.
00:32:17.720 | What an incredible day, Rene.
00:32:19.720 | Yeah, pretty amazing stuff, right?
00:32:22.400 | I appreciate you jumping in here.
00:32:24.040 | I know we got maybe a 15-minute cameo.
00:32:28.680 | We'll have you back when we can spend a couple hours together,
00:32:31.960 | because you're such a thought leader,
00:32:34.560 | having spent so much time at Nvidia and now running Arm.
00:32:37.880 | As most people know, Masa, I think, owns 90% of Arm.
00:32:42.040 | So you're very close partners with Masa.
00:32:43.920 | Obviously, you've been party to the conversations.
00:32:47.040 | You're named as a technical partner along with OpenAI,
00:32:51.720 | Microsoft, Oracle, and yesterday's really
00:32:55.240 | earthquake-level announcement out of the White House.
00:32:58.800 | So Rene, help us understand a little bit
00:33:02.200 | how you see this play now.
00:33:04.680 | There's a lot of talk about this $500 billion
00:33:07.480 | number, whether this money exists.
00:33:09.400 | Bill and I just went through this analysis.
00:33:11.440 | And I said, from my perspective, this ramps up.
00:33:14.640 | And it's going to be a combination of equity and debt.
00:33:18.440 | Nobody has to show up with $500 billion on day one.
00:33:22.440 | And I'm forecasting maybe 250,000 GPUs this year,
00:33:27.000 | maybe a million or two million next year.
00:33:29.680 | But what's your role in this?
00:33:32.680 | And how are you thinking about how this scales up
00:33:35.080 | over the next few years?
00:33:36.360 | Yeah, sure.
00:33:37.240 | So I have a few roles on this.
00:33:39.760 | And I know you guys know Masa reasonably well.
00:33:42.640 | I know you do, definitely, Bill.
00:33:45.160 | I spend a lot of time with him.
00:33:46.560 | Part of it is he owns 90% of Arm.
00:33:48.920 | So I have a lot of investor meetings with him
00:33:51.880 | as my chief investor and talking about strategy.
00:33:54.680 | He has been, himself personally, pretty large
00:33:58.480 | on this idea of singularity for quite some time.
00:34:01.640 | And it's something that I think he's just
00:34:03.680 | had a vision on in the long, long game.
00:34:07.600 | The ChatGPT moment, I think, for him
00:34:10.000 | was a bit of an accelerant relative to there's
00:34:14.600 | a game to be played here relative to capital,
00:34:17.200 | relative to compute, relative to power.
00:34:20.160 | And he wants to play a big part in it.
00:34:22.640 | And I think there was just a lot of discussions,
00:34:25.680 | whether it was Sam Altman wanting to buy fabs
00:34:29.160 | or different assets that were trying
00:34:31.600 | to look at power in different areas of the planet.
00:34:34.480 | I think something had an opportunity
00:34:35.960 | to come together to solve this giant problem of how do you
00:34:40.840 | get access to so much energy that's needed, we think,
00:34:44.600 | to drive AGI and ASI at numbers that are even well
00:34:48.480 | beyond the balance sheets of the giant companies
00:34:51.560 | like a Microsoft, a Google, a Meta, an AWS, et cetera,
00:34:55.200 | an Amazon.
00:34:55.760 | So I think this all kind of came together
00:34:57.460 | at right time, right place.
00:34:59.120 | But like everything in life with these type of things,
00:35:01.720 | it didn't happen overnight.
00:35:03.880 | There were a lot of discussions and conversations
00:35:05.960 | that had been taking place for weeks and actually many months.
00:35:09.800 | And I think this all came together
00:35:11.560 | with a confluence of ambitious partners like Sam,
00:35:15.840 | ambitious partners like Larry Ellison and Oracle,
00:35:18.800 | and quite frankly, a new administration
00:35:21.080 | that was ready to take action in a very fast way.
00:35:24.440 | And I think that's what you all saw come together yesterday.
00:35:28.080 | And I think one of the most amazing things about it--
00:35:30.520 | I know you and Brad and I, we had talked about this
00:35:32.600 | in the past--
00:35:33.720 | is to imagine that 28 hours after the president took
00:35:37.960 | office, he's announcing a project around AI data centers
00:35:42.680 | and build out with three large players in the tech industry.
00:35:46.560 | Kind of amazing.
00:35:47.880 | That just speaks to the importance
00:35:50.280 | of how the new administration views all this.
00:35:53.160 | It's truly an Apollo-scale project.
00:35:55.600 | I guess as you think about, again, Arm,
00:35:58.840 | why don't you explain to us again,
00:36:00.520 | what is Arm actually delivering into Stargate?
00:36:05.960 | I know you're embedded in the GB200,
00:36:08.600 | but maybe just share with everybody else the role
00:36:11.120 | you play.
00:36:12.360 | Well, maybe at the highest level,
00:36:13.880 | the way to think about it is you've got a giant, as you said,
00:36:17.840 | Apollo Manhattan project, whatever terms
00:36:20.400 | you want to use about--
00:36:21.680 | what I guess is probably the largest infrastructure
00:36:24.080 | build-out in the history of the world.
00:36:26.440 | And every data center, whether it's
00:36:30.360 | running general purpose compute, whether it's running inference
00:36:33.040 | or running training, needs a base CPU to, quote, "run
00:36:37.280 | everything."
00:36:38.280 | And that's our role.
00:36:39.840 | And whether that is going to be what
00:36:41.640 | we are part of today, which is GB200--
00:36:45.120 | and we're super happy to be partnering
00:36:47.320 | with NVIDIA on that product--
00:36:49.600 | or other areas that we haven't talked about yet
00:36:52.280 | in terms of productization, there's
00:36:54.440 | lots of opportunity for Arm, because the base
00:36:56.760 | CPU will be Arm.
00:36:58.560 | And I think therein lies a huge opportunity.
00:37:01.400 | One of the things that people don't always appreciate with--
00:37:04.480 | let's take, again, GB200 running in an AI data center.
00:37:07.920 | All of the other work that needs to take place,
00:37:10.520 | whether it's the hypervisors, virtual machines,
00:37:13.240 | anything that the, quote-unquote, "normal" CPU
00:37:16.280 | has to do in a data center has to be run by something.
00:37:19.960 | And that's what Grace does.
00:37:21.840 | So then when you baseline that relative to, OK,
00:37:24.360 | GB200 is where we are today, the opportunity going forward
00:37:28.200 | in terms of these large data centers
00:37:29.760 | doing some level of mixed inference and training,
00:37:33.440 | reasoning, reinforcement training,
00:37:36.160 | there's a lot of opportunities for Arm
00:37:37.960 | to do even more than what we're talking about today.
00:37:40.680 | So it's super exciting.
00:37:42.120 | Bill, I know you had some questions, just
00:37:44.720 | real tactical questions.
00:37:46.320 | Rene, as you think about just Stargate, it's a new company.
00:37:51.520 | It's a new entity.
00:37:53.000 | I know Masa is the chairman.
00:37:54.520 | But any insights for us, like, who's running this thing?
00:37:59.880 | And how should we think about this entity
00:38:03.120 | relative to the other entities?
00:38:04.680 | I mean, my mental framework is that it's
00:38:06.760 | kind of like this operating shell
00:38:09.480 | that everything runs through, but that the actual compute,
00:38:13.120 | the offload's going to go largely to open AI
00:38:15.720 | or 100% to open AI, that the inputs are
00:38:19.680 | going to be coming from you guys and NVIDIA largely,
00:38:21.960 | and of course, all the other people
00:38:23.520 | that are going to be needed to network and do
00:38:26.440 | the things in the data center.
00:38:28.080 | But is there an idea that you guys
00:38:30.400 | are investing in an entity that unto itself will have power
00:38:33.560 | and maybe grow in and serve other customers?
00:38:36.480 | Or is this really just about coordinating
00:38:38.400 | the activities of the people who are around the table?
00:38:41.480 | Yeah, I think what I can say today, Brad,
00:38:44.000 | it's much more of the latter than the former.
00:38:47.600 | Could there be opportunity for the former
00:38:49.400 | somewhere down the road?
00:38:50.440 | Potentially.
00:38:51.120 | But right now, it's what you just described.
00:38:53.880 | And the operational control will be from OpenAI.
00:38:58.480 | So they'll call the shots relative to all the things
00:39:01.840 | relative to the operation.
00:39:03.440 | Obviously, there's existing relationships with NVIDIA.
00:39:05.760 | There's existing relationships with Oracle,
00:39:08.120 | existing relationships with Microsoft, ourselves.
00:39:11.760 | But going forward, they're going to be in a very, very key role
00:39:16.360 | on the operation, which I think, if you kind of go back again
00:39:19.440 | to Sam and team spending a lot of time and energy
00:39:23.240 | over the past 12 to 18 months of seeking for ways
00:39:27.040 | to get opportunity and access to large resources
00:39:30.440 | to advance the training of these large models,
00:39:33.400 | it's kind of where he's kind of been with this.
00:39:36.000 | So I think it's not inconsistent with some
00:39:39.160 | of the actions and behaviors you've seen
00:39:41.520 | over the last number of months.
00:39:44.720 | Can you think of an entity, like a comparable entity
00:39:48.640 | in the past?
00:39:49.880 | I'm having a hard time imagining what Brad just
00:39:54.120 | described actually is.
00:39:56.160 | I don't think there is a good comparison on this, Bill,
00:39:58.360 | because when you just think about the amount of capital
00:40:01.640 | that's required, it's bigger than anyone, right?
00:40:04.680 | So this required a very, very novel set of partners
00:40:09.840 | to come together, both with a big vision, a large opportunity
00:40:15.680 | to get access to capital, and candidly, probably
00:40:18.720 | a little bit of a willingness of we're
00:40:20.260 | going to figure this out as we try to grow it.
00:40:22.520 | Because this is beyond what we've done before.
00:40:27.440 | The only analogy, and it's not a good one
00:40:29.280 | in terms of how I can think about it,
00:40:31.080 | is maybe with GlobalFoundries and Mubadala
00:40:33.880 | relative to starting to see that traditional fabs needed
00:40:37.960 | to get extra capital.
00:40:38.880 | And that was at a much smaller scale.
00:40:41.640 | Now Satya talking about spending $80 billion of CapEx,
00:40:47.080 | even for Microsoft, that's a giant number.
00:40:50.360 | And at these numbers, no one company can do it.
00:40:54.960 | So Rene, if you think about what's
00:40:58.360 | built into this forecast, and you
00:41:00.760 | don't have the privilege of seeing
00:41:01.920 | what we talked about earlier.
00:41:02.840 | But I had my team do what we do well.
00:41:04.520 | We're a bunch of analysts.
00:41:05.760 | We take what's known.
00:41:06.760 | We try to build a bottoms up model.
00:41:08.640 | And what we got to really was by the end of 2028,
00:41:13.400 | consuming about 7 and 1/2 gigawatts of power in Abilene,
00:41:17.920 | standing up about 2 million GPUs a year in that infrastructure.
00:41:22.720 | So 6 and 1/2 million GPUs in total.
00:41:26.320 | That would spend roughly $300, $350 billion up to $500 billion
00:41:32.960 | that they said they would spend.
00:41:34.760 | When I just look at that level of power,
00:41:38.280 | we think about where the bottlenecks are
00:41:40.640 | or where the risks are here to this build out.
00:41:44.640 | Obviously, this is operationally difficult, as I said,
00:41:48.280 | to build 2 million GPUs.
00:41:49.960 | I think that's a third of what NVIDIA is expected
00:41:52.760 | to make this year.
00:41:54.360 | So not an insignificant amount of the demand out of NVIDIA.
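Brad's back-of-envelope here can be reproduced with his own round numbers (the per-GPU cost is an assumed all-in figure chosen to land in his stated range, not a quoted price):

```python
# Brad's round numbers from the conversation, not disclosed figures
gpus_total = 6.5e6           # GPUs stood up in Abilene by end of 2028
cost_per_gpu = 50_000        # assumed all-in $/GPU (chip, networking, facility)
capex = gpus_total * cost_per_gpu
print(f"${capex / 1e9:.0f}B")  # -> $325B, inside the $300-350B he cites

gpus_per_year = 2e6
nvidia_annual_output = 6e6   # implied by "a third of what NVIDIA is expected to make"
print(gpus_per_year / nvidia_annual_output)  # ~0.33 of NVIDIA's yearly supply
```

Even with generous assumptions on the unit cost, getting to the full $500 billion requires either more GPUs than this build plan or substantial spend beyond the chips themselves.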
00:41:58.240 | But where do you see the bottlenecks
00:42:00.920 | to the extent they arise?
00:42:02.400 | Is it power?
00:42:03.520 | Is it GPUs?
00:42:05.160 | Is it Arm?
00:42:06.840 | As you think about--
00:42:07.640 | Yeah, those are all the right questions and stuff
00:42:11.160 | that we've been talking about, as you can imagine,
00:42:13.240 | for a number of months.
00:42:14.720 | One of the big bottlenecks or larger bottlenecks
00:42:17.520 | was or is hopefully addressed yesterday in the sense
00:42:21.920 | that you have a government that's
00:42:23.560 | going to help you in terms of permitting,
00:42:26.080 | in terms of regulations.
00:42:28.400 | That was not small, by the way.
00:42:29.800 | That was not small.
00:42:31.760 | A lot of the aggressive things that
00:42:33.360 | were happening in terms of build out
00:42:34.920 | was literally buying Bitcoin mining facilities
00:42:39.360 | that were out of business and repurposing them.
00:42:41.960 | Now, when you're talking about having
00:42:43.560 | to build a bunch of new things, regulatory matters.
00:42:47.040 | So that's one big one.
00:42:48.440 | Your numbers that your team looked at, I think,
00:42:51.240 | are probably in the right zip code.
00:42:53.320 | Our numbers might be a little bit bigger,
00:42:55.360 | but your number's about right.
00:42:57.800 | Fab capacity does become an issue, no doubt,
00:43:00.880 | because when you start thinking about 3 nanometer and 2
00:43:03.080 | nanometer, TSMC is the leader, is the only game
00:43:08.560 | in town on some level.
00:43:09.520 | So I think that is a potential limitation or at least
00:43:12.680 | constraint.
00:43:13.960 | HBM memory and DRAM, for sure, that is another potential.
00:43:19.760 | And then you just get into a couple of things.
00:43:23.600 | First off, when you're talking about gigawatts of power,
00:43:29.000 | how physically close can those data centers be?
00:43:32.440 | And if you're trying to do training dispersed
00:43:35.880 | across multiple facilities, what does that look like?
00:43:39.280 | Can you really, really train a large model
00:43:42.240 | relying on the connectivity of the network
00:43:44.480 | across multiple physical sites?
00:43:46.560 | That is a challenge.
00:43:48.800 | Human labor to connect all these cables,
00:43:51.960 | I know that sounds a little trivial in the context of some
00:43:54.640 | of these technological problems we're talking about,
00:43:57.000 | utterly non-trivial at this scale.
00:43:59.440 | So I think automation is a potential.
00:44:02.840 | I think robotics are a big potential on this.
00:44:06.440 | How do you build these?
00:44:07.920 | Do you build them from the ground up,
00:44:10.240 | or do you containerize them and put them
00:44:13.200 | together in a modular fashion?
00:44:15.400 | So I think there are opportunities that
00:44:18.960 | are out there relative to scale this, some known,
00:44:22.280 | some not so known.
00:44:23.800 | And I think all of that is going to be played out here
00:44:26.560 | over the next number of years.
00:44:29.600 | Rene, I think you nailed it.
00:44:31.240 | And I think you nailed it when you said,
00:44:33.480 | echoing Stan Druckenmiller, he said his entire career,
00:44:36.960 | he's never seen this big a reversal
00:44:38.720 | from anti-business administration
00:44:41.320 | to a pro-business administration.
00:44:44.000 | We up-leveled the ambitions dramatically.
00:44:46.840 | Clearly, the people around the table--
00:44:48.640 | Larry Ellison, you, Masa, et cetera--
00:44:51.760 | understood that vibe shift occurring in the White House.
00:44:56.760 | And as I said to Bill, we have certainly
00:45:00.400 | intramural battles between Meta and Google and x.ai and Elon
00:45:05.640 | and OpenAI.
00:45:06.680 | But we're all on Team America.
00:45:08.960 | And unequivocally, this advances us on AI at a rate and a pace
00:45:15.840 | that I think is great for all of us.
00:45:20.480 | Appreciate your leadership on that.
00:45:22.080 | Appreciate you jumping in here for a few minutes today.
00:45:26.320 | We look forward to having you back on and going deeper
00:45:28.480 | with you.
00:45:28.800 | Yeah, one thing I might add to this--
00:45:30.400 | and I don't remember, Bill, whether it
00:45:32.040 | was a competing podcast that you were on that you were talking
00:45:34.800 | about, that innovation thrives because you're
00:45:37.600 | 3,000 miles away from a certain city in the United States.
00:45:41.200 | I was there the last three days.
00:45:44.400 | Frigging cold, by the way.
00:45:46.880 | But to Brad's point--
00:45:49.000 | and you do the rounds of the dinners and the galas,
00:45:52.200 | blah, blah, blah, blah.
00:45:53.200 | But I was struck by just the intent for a pace of change
00:46:01.120 | that was very stark.
00:46:02.560 | And it's felt much, much more business oriented
00:46:06.880 | than anything else.
00:46:09.120 | And certainly, in my career, I've
00:46:11.520 | spent a lot of time in DC working
00:46:14.760 | with different administrations and different pieces
00:46:17.080 | of government, setting policy.
00:46:18.640 | And there are a lot of times you're having to explain,
00:46:20.880 | what is it that you do?
00:46:22.560 | And what are your products?
00:46:24.120 | And what do they go into?
00:46:25.720 | And tell me again how they work.
00:46:27.760 | This time, the conversations are all about,
00:46:30.040 | how do we remove barriers to go really fast and do it here?
00:46:34.200 | And I think the biggest testament I can give to it
00:46:36.640 | is that 28 hours after taking office,
00:46:41.120 | Donald Trump is there with Masa and Sam Altman and Larry
00:46:44.640 | Ellison talking about gigantic investments.
00:46:47.200 | It was his first press conference he did,
00:46:48.960 | for gosh's sake.
00:46:49.720 | So yeah, I think that's something to look at.
00:46:53.840 | I totally-- one party below.
00:46:55.720 | How excited is Larry Ellison?
00:46:57.480 | And is he really all in on this?
00:47:00.400 | Larry doesn't do a lot of public--
00:47:02.280 | [LAUGHTER]
00:47:03.240 | --that you know, right?
00:47:04.840 | And he lives in Florida.
00:47:06.600 | He likes the warm weather.
00:47:08.040 | He flew up to freezing Washington to stand outside
00:47:10.320 | and do an interview with Fox.
00:47:12.280 | And then he was there.
00:47:13.200 | So yeah, I don't want to speak too much for Larry,
00:47:15.320 | but he's very committed to this.
00:47:17.040 | And Larry, he's got a lot of stuff
00:47:18.440 | he's doing with his Oxford Institute
00:47:20.400 | around advanced health research, all enabled by AI.
00:47:24.480 | So he's-- and as I am, you and I have talked about this, Brad.
00:47:27.440 | I think the last mile on this stuff
00:47:30.160 | is all about drug research and cancer research.
00:47:32.560 | Larry is very passionate about it.
00:47:34.000 | So this is a big connection to the stuff
00:47:36.040 | he believes very strongly in.
00:47:37.760 | Well, of course, the podcast that you referenced,
00:47:41.560 | Bill's famous talk that everybody ought to watch,
00:47:44.520 | pinned to his X handle,
00:47:48.800 | 2,851 Miles from Washington by Bill Gurley
00:47:52.200 | at the All In Summit, our good friends from the All In pod.
00:47:55.640 | It was an incredible talk.
00:47:57.000 | And what I took away this weekend
00:47:58.640 | was exactly what you did, Rene, which
00:48:01.200 | is this is the first time I've seen an intersection
00:48:04.160 | and partnership between technology and Washington
00:48:08.520 | the way I did this weekend.
00:48:10.080 | And given that the whole field of battle for national security
00:48:13.640 | has moved to the playing field of AI,
00:48:15.520 | it couldn't come at a better time.
00:48:17.360 | And so thanks for joining us.
00:48:18.720 | A lot more to talk about.
00:48:19.760 | We'll see you soon.
00:48:21.520 | Thanks, guys.
00:48:22.800 | Thanks, Rene.
00:48:24.040 | That was incredible from Rene.
00:48:26.200 | I think it answered a bunch of questions.
00:48:28.520 | What was your takeaway from that?
00:48:29.880 | What was your key takeaway?
00:48:31.760 | I mean, I think it reinforced a lot of what you said earlier
00:48:35.200 | and maybe helped frame the math for the unit volume
00:48:44.280 | as you had structured.
00:48:46.360 | I'm still left with a huge question.
00:48:50.880 | I didn't understand the answer about structure.
00:48:54.080 | You can't invest equity.
00:48:56.920 | And you can't put debt on something at a 3 to 1 ratio
00:49:00.600 | and not expect it to be a standalone entity capable
00:49:04.920 | of creating equity value.
00:49:06.840 | So I captured one company, subsidiary JV.
00:49:11.280 | That doesn't make sense to me.
00:49:14.480 | Some more to unpack there.
00:49:16.040 | Yeah, I've got like three or four questions.
00:49:18.280 | Who's going to run it?
00:49:19.280 | Will they be great at it?
00:49:20.600 | I mean, we gave a lot of credit to Elon
00:49:22.320 | for being operationally tight in Memphis.
00:49:24.840 | We know as other companies are.
00:49:26.160 | Will that happen here?
00:49:27.720 | Is OpenAI the only customer?
00:49:29.480 | Will it serve other people?
00:49:30.960 | Is it a CoreWeave lookalike, even maybe on steroids?
00:49:34.800 | But there's some things I still don't know.
00:49:37.640 | Yeah, no, I think he started to answer
00:49:39.600 | some of those questions.
00:49:40.600 | But we'll go deeper with him in time.
00:49:42.520 | Hey, I had interrupted you.
00:49:44.040 | We had bookmarked DeepSeek.
00:49:45.840 | I want to finish that off.
00:49:47.440 | And then we're going to talk a little bit about just
00:49:49.640 | a couple of the other impactful EOs that came out
00:49:52.240 | of Washington, a quick little market check.
00:49:55.040 | Why don't you wrap on DeepSeek?
00:49:56.720 | Yeah, so the point I was making when Rene popped in
00:49:59.360 | was that constraints can drive innovation.
00:50:02.160 | And so here you have a case where the Biden administration,
00:50:08.040 | I think it was led by this guy, Alan Estevez,
00:50:10.920 | and this select committee for China
00:50:14.520 | that's inside of Congress.
00:50:16.720 | They're constantly looking for ways
00:50:18.320 | to kind of block China with export controls.
00:50:22.920 | And he, even on leaving, Estevez said, hey, we won.
00:50:27.360 | We made it work.
00:50:29.080 | And meanwhile, like Eric Schmidt saying, we're behind.
00:50:33.760 | And I certainly don't think we're slowing anybody down.
00:50:37.960 | But in this case, maybe by giving them constraints,
00:50:41.240 | we actually created a world where they innovated
00:50:46.680 | in a different direction and created these hyper small
00:50:51.360 | models that don't need massive training
00:50:54.800 | infrastructure.
00:50:56.200 | And so I think the reality is we did do that.
00:51:00.280 | You and I have talked about this,
00:51:04.160 | but just some of the most remarkable entrepreneurs
00:51:06.480 | I've ever met are in China, resourceful.
00:51:09.480 | And so you tell them, well, you can't play with those tools.
00:51:12.600 | Well, they go and figure out how to do it with even lesser tools
00:51:15.720 | and maybe put themselves at equal footing, maybe even
00:51:18.840 | better footing because they've learned
00:51:20.800 | to be more innovative in a way we weren't being innovative.
00:51:23.680 | Well, there's a lot more to discuss about DeepSeek.
00:51:27.320 | But I think you're absolutely right.
00:51:29.520 | And listen, there is a real risk that well-intended legislation,
00:51:36.040 | well-intended efforts in Washington
00:51:38.840 | backfire and result in the exact opposite result.
00:51:42.920 | Everybody wants to be tough on China.
00:51:45.040 | But being tough on China may not lead
00:51:47.320 | to China being slowed down, as evidenced by DeepSeek.
00:51:51.040 | So that may be case exhibit number one to you, Bill.
00:51:53.720 | So this brings me maybe to the transition
00:51:56.560 | to talk about what exactly happened.
00:51:59.800 | We had the President of the United States literally
00:52:01.960 | at the Capital One Center.
00:52:03.160 | I've never seen any-- nobody touches this guy.
00:52:05.760 | He's the greatest marketer as president
00:52:07.880 | I think we've ever had.
00:52:09.440 | He does his inauguration.
00:52:10.880 | He seemingly doesn't need sleep.
00:52:12.800 | And at the Capital One Center, he's signing executive orders.
00:52:16.320 | Now, one of those executive orders, which we'll show here,
00:52:20.480 | halts all federal regulations that are currently
00:52:24.600 | being promulgated unless they've been published
00:52:28.120 | in the Federal Register.
00:52:30.000 | And so one of the questions that I had--
00:52:33.280 | there's a very insidious piece of regulation,
00:52:36.720 | again, under the guise of being tough on China that I think
00:52:39.640 | is really going to slow down American AI.
00:52:42.960 | And this is a framework for artificial intelligence
00:52:46.280 | diffusion.
00:52:47.600 | And it was promulgated by the Department of Commerce.
00:52:50.520 | And the idea was-- basically, you saw it last week--
00:52:53.520 | that NVIDIA and all these guys can only send chips
00:52:56.440 | to a few friendly countries.
00:52:58.960 | And then now it creates all of these different acronyms
00:53:01.960 | and levels, who can get what.
00:53:04.720 | People think there's a lot of regulatory capture in there
00:53:07.400 | for a certain company in Seattle.
00:53:10.560 | It hurts a lot of the smaller companies.
00:53:12.480 | But without a doubt, I've heard across the board
00:53:16.280 | from almost every company in the ecosystem
00:53:18.840 | that this is a terrible idea.
00:53:20.240 | So I was hopeful that this rule, which was promulgated,
00:53:26.760 | would be revoked.
00:53:27.680 | But unfortunately, he said, unless it's
00:53:30.600 | been published in the Federal Register.
00:53:33.000 | So we went and we researched.
00:53:34.480 | Has it been published in the Federal Register?
00:53:36.720 | And we'll show here that it has.
00:53:39.000 | So it seems to me that that EO wasn't enough
00:53:42.480 | to roll back this diffusion rule that
00:53:45.440 | was promulgated by Commerce.
00:53:47.040 | And Howard Lutnick or others are going
00:53:48.600 | to have to act specifically, I think,
00:53:50.840 | to roll back this rule, which I don't think
00:53:52.720 | is meant to be adopted for some three or four months anyway.
00:53:58.120 | I don't think it will get ratified.
00:53:59.760 | But it's an example of the new look
00:54:02.320 | I think the administration is taking at all these things.
00:54:05.600 | Were there any EOs, Bill, that caught your eye?
00:54:07.960 | By the way, just following up on that,
00:54:10.040 | we talked about this on the last podcast.
00:54:11.880 | But there's like 20 different state-by-state initiatives
00:54:15.880 | being pushed right now.
00:54:17.160 | And so I think the unfortunate reality
00:54:22.280 | is that there's so much effort and so much push
00:54:27.040 | to write regulation here.
00:54:29.560 | And maybe the intent of pushing it to state level
00:54:32.840 | is to provoke a federal piece of legislation.
00:54:37.560 | Because the dumbest possible thing we could possibly have
00:54:41.200 | is state-by-state.
00:54:43.680 | It's just mud in the gears, like really, really stupid.
00:54:47.800 | So to put a finer point on that, you talked about 1047.
00:54:52.080 | Fortunately, it was vetoed in the state of California.
00:54:54.360 | But now you have the piece of legislation
00:54:56.920 | working its way through Texas that we talked about.
00:54:59.040 | I think it's roughly 25 states within this patchwork that
00:55:03.800 | have a 1047 equivalent that would be very problematic.
00:55:07.560 | I'm happy to report that I spent a lot of time
00:55:10.560 | with an incredible congressman from the state of California,
00:55:13.360 | Jay Obernolte.
00:55:14.840 | And Jay was the chair, or is the chair,
00:55:18.000 | of the Artificial Intelligence Task Force in the House.
00:55:21.000 | So from the House of Representatives perspective,
00:55:23.040 | he's on point developing a plan of attack.
00:55:26.760 | And it was great to hear Jay.
00:55:28.680 | I coordinated some meetings this weekend between him
00:55:31.480 | and a lot of CEOs of leading AI companies.
00:55:34.520 | And there is a plan by Congress, on the one hand,
00:55:39.120 | to have federal preemption.
00:55:40.600 | Clearly, this is a matter of interstate commerce.
00:55:43.080 | So we should have federal preemption.
00:55:44.720 | But because that's going to take some time,
00:55:48.320 | they also have a plan, Bill, that if any of these states
00:55:51.600 | were to pass something like 1047,
00:55:54.240 | they can issue a moratorium that would effectively
00:55:57.760 | freeze the state law for the period of time
00:56:01.400 | it would take them to get the preemption passed.
00:56:04.600 | But that gets back to my point where
00:56:06.120 | that may have been the end goal all the way around.
00:56:09.800 | So they may be happy with that outcome.
00:56:11.880 | Obviously, all this is going to fall on our friend Mr. David
00:56:15.460 | Sacks.
00:56:15.460 | So I'm sure we'll have more to hear from him going forward.
00:56:20.160 | No doubt about it.
00:56:21.080 | And can we just say that, given what's at stake right now,
00:56:27.920 | how fortuitous it is to have people who actually understand
00:56:32.320 | this stuff, like Sacks, helping to coordinate
00:56:36.320 | the needs of industry and the various agencies of government.
00:56:40.120 | Because there's a lot of friction in this process.
00:56:42.600 | And I can tell you, he is working tirelessly
00:56:45.240 | in information-gathering mode, trying
00:56:47.080 | to understand all these pieces and how
00:56:49.360 | to coordinate these priorities.
00:56:51.440 | But one of the things I think he and many others celebrated,
00:56:54.680 | one of the EOs that was revoked by Trump
00:56:57.840 | was the Biden EO on AI.
00:57:01.560 | And one of the things I would say on the Biden executive
00:57:04.080 | order is there were many Republicans who said to me,
00:57:06.480 | you know, there's some good stuff in this.
00:57:08.760 | Like, we need to keep certain things.
00:57:12.040 | And I think even Sacks and others acknowledged,
00:57:14.480 | there's some good things in there.
00:57:15.900 | But the adopted approach was start from a clean slate,
00:57:19.860 | tabula rasa, and rather than try to edit the Biden rule,
00:57:25.180 | just start with a blank sheet of paper.
00:57:26.980 | And if there were good ideas in there,
00:57:28.560 | you can add them to an executive order on AI that could come out
00:57:31.780 | of this administration.
00:57:33.020 | So to me, when I think through the most important executive
00:57:35.740 | orders impacting Silicon Valley, one is on this diffusion bill.
00:57:40.060 | We got to kill this.
00:57:40.980 | This was terrible legislation that
00:57:42.980 | was rushed out of Commerce so that they
00:57:45.420 | could say that they were being tough on China.
00:57:47.380 | We got the AI EO revoked.
00:57:49.460 | The next one for me was DOGE.
00:57:52.260 | I mean, this DOGE EO was frankly further reaching
00:57:56.820 | than I expected, setting up a department or effectively
00:58:00.100 | a task force, a SWAT team within every federal agency,
00:58:04.380 | a four-person SWAT team to go through the DOGE procedure.
00:58:08.540 | We're now in a period of maybe three to four months
00:58:11.640 | where the president has said he wants a reconciliation
00:58:14.580 | bill on his desk.
00:58:15.500 | He wants to put us on a path to balancing the budget.
00:58:19.220 | And so DOGE is going to have to move very fast.
00:58:22.460 | We now see that Vivek's going to run for governor of Ohio,
00:58:25.660 | it sounds like.
00:58:26.460 | And so Elon is singularly at the head of DOGE.
00:58:29.780 | I think that will help make decisions go maybe even faster.
00:58:34.060 | But I was surprised at the level of coordination and reach
00:58:38.860 | that they already had, including around software modernization.
00:58:42.260 | There was an EO around software modernization
00:58:44.700 | across the entire government.
00:58:46.100 | So it was exciting to see that move forward quickly.
00:58:50.620 | Yeah, look, I think that's one of those things where
00:58:53.660 | most people are skeptical.
00:58:55.420 | I think they've been around the American government
00:58:58.420 | a long enough time that they don't
00:59:00.300 | think that type of radical change and efficiency creation
00:59:05.020 | is possible.
00:59:06.260 | I think if you look at what Milei's done in Argentina,
00:59:09.740 | and you say, well, what could go right?
00:59:13.020 | If it were to work, it would be so positive for the US dollar,
00:59:18.940 | the American economy.
00:59:20.340 | It would be remarkably positive.
00:59:23.420 | And so I put it at high impact, still low probability
00:59:28.620 | in my brain, just because it's such a bureaucratic place,
00:59:31.980 | Washington.
00:59:32.980 | But it'd be great if it succeeds.
00:59:35.900 | Well, I would say if you think about American national
00:59:39.340 | security, the two things that are top of mind for me
00:59:42.820 | is, number one, not losing the race on AI.
00:59:45.620 | We've got to win the race on AI.
00:59:47.420 | And the second one to me is our national financial security.
00:59:51.220 | We cannot-- we're on a path to financial ruin
00:59:54.100 | if we don't make these changes.
00:59:55.740 | It's immoral to leave this level of debt to our children.
00:59:59.780 | We outlined a path on this pod to how
01:00:03.020 | we can balance the budget at $6 trillion of revenue
01:00:06.780 | and spend by the end of Trump's term.
01:00:09.500 | And I'm going to predict it here.
01:00:11.100 | I think in the president's State of the Union
01:00:13.220 | address in the first week of March,
01:00:15.100 | he's going to make a commitment to balance the budget
01:00:17.660 | in his first term in office.
01:00:19.020 | You don't have to do it in year one.
00:59:20.820 | And listen, if we get to $2 trillion on DOGE,
01:00:23.780 | then you'll do better than balance the budget.
01:00:25.900 | You'll have a surplus by the end of his first term.
01:00:28.340 | But I think a commitment to balance
01:00:30.380 | the budget in his first term would cause the bond market
01:00:35.140 | to react, which means, Bill, if the bond market gets bought
01:00:39.140 | and rates go down, our cost of borrow
01:00:41.740 | will go down by $100 to $200 billion per year,
01:00:45.700 | positively impacting even more what can be achieved here.
01:00:51.260 | Of course, the bond market hasn't believed
01:00:53.580 | and shouldn't believe because we've never delivered it,
01:00:56.100 | hasn't believed that we actually can make these cuts.
01:00:58.420 | But I'm more and more convinced that the coordination
01:01:02.060 | between Congress, the executive branch, and DOGE,
01:01:05.660 | we're going to get something important done.
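The $100 to $200 billion figure implies roughly this arithmetic; the debt amount and rate moves below are illustrative assumptions for the sketch, not numbers cited in the episode:

```python
# Rough interest-cost sensitivity: annual savings ≈ repricing debt × rate drop.
# All inputs are illustrative assumptions, not figures from the discussion.
def annual_interest_savings(debt_repricing_trillions: float,
                            rate_drop_pct: float) -> float:
    """Return annual interest savings in billions of dollars."""
    return debt_repricing_trillions * 1000 * rate_drop_pct / 100

# If roughly $30T of debt eventually reprices and yields fall ~0.35% to 0.65%:
low = annual_interest_savings(30, 0.35)   # ≈ $105B per year
high = annual_interest_savings(30, 0.65)  # ≈ $195B per year
print(f"${low:.0f}B to ${high:.0f}B per year")
```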
01:01:08.180 | Before we leave Washington, I got
01:01:09.980 | to make a few comments about the TikTok stuff that kind of popped
01:01:14.580 | up in his initial press conference, Trump's
01:01:19.660 | initial press conference.
01:01:21.180 | And what I thought was different this time--
01:01:25.140 | and he then did it again after the Stargate conference--
01:01:31.340 | is that he's talking about putting partial ownership
01:01:35.380 | of TikTok on the United States government balance sheet, which
01:01:39.740 | I think is new.
01:01:41.300 | I think the way this happened is--
01:01:43.940 | I think he's been a negotiator his whole career.
01:01:47.460 | And he recognized that there was an asset that,
01:01:51.500 | if this law is enforced, goes away.
01:01:55.500 | And most of that equity value probably
01:01:57.660 | goes to Meta, maybe a little bit to Snap,
01:02:01.700 | or if someone else pops up.
01:02:03.540 | But it just melts away.
01:02:06.340 | And so he views the enforcement as destruction of value
01:02:12.340 | that could be negotiated and grabbed.
01:02:16.540 | And so I literally think that's what's going on in his head.
01:02:19.980 | What he may be underestimating is the willingness of China
01:02:23.940 | to just let it go away, or the unwillingness of someone
01:02:30.140 | to buy it and then give away half
01:02:31.980 | right away to the government.
01:02:33.580 | So it'll be interesting.
01:02:34.940 | It sounds like he's only got 90 days to play this out.
01:02:38.740 | It's a fascinating one.
01:02:40.180 | I mean, just to replay the weekend, Bill, TikTok went dark.
01:02:44.220 | They literally shut down the app, which nobody
01:02:46.300 | thought was going to happen.
01:02:48.900 | I happened to tweet that I think it was going to be shut down
01:02:51.340 | for less than 36 hours.
01:02:52.740 | Sure enough, he took to the floor of the Capital One Arena.
01:02:56.180 | He's like, I brought TikTok back for the masses.
01:02:59.780 | That makes kids and younger folks really excited.
01:03:02.660 | He's standing on the side of free speech.
01:03:04.500 | But yes, Trump is absolutely a deal maker.
01:03:07.980 | And he thinks if the US is going to turn this on,
01:03:10.380 | that they're entitled to something.
01:03:12.340 | And I think there are a lot of people who agree with him.
01:03:14.780 | I think there are a couple of things.
01:03:16.340 | Number one, he said multiple times, I do well on TikTok.
01:03:19.860 | I like TikTok.
01:03:21.780 | And so I think there is a part of him that just commercially
01:03:25.220 | knows there are 7 million folks who make a living on TikTok.
01:03:28.140 | He's done well on it.
01:03:29.180 | He would like to keep it open.
01:03:30.940 | But nobody on the planet wants a fair playing
01:03:33.700 | field more than Trump.
01:03:35.140 | And he realizes that American internet companies
01:03:38.340 | aren't allowed to be in China.
01:03:41.060 | And he wants a deal.
01:03:43.100 | Whether the deal is as you described or some other way,
01:03:46.660 | I think it's yet to be determined.
01:03:49.380 | But what we know for now is that it's alive and well.
01:03:53.260 | And I think for TikTok shareholders,
01:03:54.900 | and as a reminder to folks, we're
01:03:57.060 | investors since 2016 in ByteDance,
01:04:00.340 | as TikTok shareholders, I just think they want resolution.
01:04:03.860 | TikTok US is a small part of that global business.
01:04:08.780 | Maybe there'll be a commercial deal to be worked out.
01:04:11.100 | But to your point, it's likely to be done or not
01:04:13.900 | done in the next 90 days.
01:04:15.820 | And they've had 180 days to do it.
01:04:19.060 | So the odds that they actually want to do it, I think,
01:04:21.700 | are low because they could have done it already.
01:04:23.740 | And I also, based on what I've read,
01:04:26.540 | I think it's out of Trump's hands.
01:04:28.540 | He found a way to delay it.
01:04:29.980 | But Congress already voted.
01:04:31.900 | And the Supreme Court already backed it up.
01:04:33.700 | He can't veto what they've already done.
01:04:37.460 | So it is going to come to a head pretty quickly.
01:04:40.020 | Yeah, it was funny.
01:04:41.100 | In the press conference yesterday on Stargate,
01:04:43.820 | he was asked a question on TikTok.
01:04:45.820 | And he starts laying out his case like what he wants.
01:04:48.700 | And it could have been like he was running the investment
01:04:51.100 | bank at Goldman Sachs the way he laid it out.
01:04:53.060 | It was pretty brilliant.
01:04:54.580 | And he said, hey, Larry, talking to Larry Ellison,
01:04:56.740 | why don't we just live negotiate the TikTok deal right here?
01:04:59.620 | I'm going to tell you what I think.
01:05:01.020 | You tell me whether or not you want to do the deal.
01:05:03.860 | So at the end of the day, remember, the Oracle Cloud
01:05:06.780 | runs Project Texas, which is where TikTok US is run.
01:05:11.220 | So Larry Ellison has a big stake here.
01:05:14.260 | He was also there yesterday talking about that.
01:05:18.340 | Somebody asked the president, would it be OK
01:05:20.100 | if Elon bought part of this?
01:05:21.580 | He said, yes, I would be fine if Elon bought part of it.
01:05:24.220 | So might Larry and Elon be the two buyers of 50%
01:05:27.540 | or more of US TikTok?
01:05:29.380 | It's going to be just another party.
01:05:31.900 | But there's three parties.
01:05:33.500 | Like China has to want to sell.
01:05:35.180 | For sure.
01:05:35.680 | There's four.
01:05:36.780 | You got ByteDance, the China government, the US government.
01:05:39.500 | There's six parties altogether.
01:05:41.820 | So this will be a tough one to land.
01:05:44.220 | It's going to be fascinating.
01:05:46.660 | Why don't we maybe just want to end up with a little tech
01:05:50.180 | check?
01:05:51.020 | Sure.
01:05:51.620 | What are you seeing?
01:05:52.900 | The first thing I saw, Bill, is I
01:05:55.540 | think it was on the actual day of the inauguration.
01:05:59.180 | One of our heroes, Stan Druckenmiller, was on CNBC.
01:06:02.700 | And he talked about his observation
01:06:05.780 | on this administration.
01:06:07.380 | And he's saying we're going from the most anti-business
01:06:10.220 | administration of his lifetime to the most pro-business
01:06:14.020 | administration of his lifetime.
01:06:15.540 | I mean, think about this guy.
01:06:16.860 | He's seen it all.
01:06:18.700 | And Stan is not a person prone to hyperbole.
01:06:23.220 | When you look at what's happening in the market,
01:06:26.660 | just year to date, I think the Nasdaq's up like 4%
01:06:31.860 | through today.
01:06:33.060 | That's on top of it being up a fair bit
01:06:36.140 | since people started expecting that Trump
01:06:38.300 | was going to win the election.
01:06:39.860 | But the fact that he made those comments,
01:06:42.500 | I think you got to take as very significant.
01:06:44.780 | But there are two other things Stan said in that interview.
01:06:47.940 | He said, that doesn't mean--
01:06:49.660 | just because I believe that, that doesn't mean
01:06:52.340 | I'm all in on the markets.
01:06:54.540 | And he said, things are priced reasonably high.
01:06:57.740 | And so we'll show, if you look at the S&P,
01:07:02.580 | we think that S&P is baking in a big earnings
01:07:05.140 | acceleration this year.
01:07:07.220 | And that's on top of already high multiples.
01:07:09.340 | So the S&P ex-MAG 7 has a big hurdle this year.
01:07:13.860 | If you look at what's expected out of MAG 7,
01:07:16.580 | it's a pretty big earnings deceleration.
01:07:19.340 | And the multiples are pretty consistent with where
01:07:21.380 | they've been the last five years.
01:07:22.980 | We'll post both of those charts in here.
01:07:24.820 | So number one, he said, we're not all in on the market.
01:07:27.180 | And then he talked about the 10-year,
01:07:28.700 | which we said in our first show this year, watch the 10-year.
01:07:32.380 | That's going to determine where the market goes.
01:07:34.700 | It went as high as 4.8, 4.85.
01:07:37.060 | We're now down today closer to 4.6.
01:07:40.900 | And that's really about whether or not
01:07:42.860 | folks think inflation is reengaging here
01:07:47.100 | and whether or not we're going to be
01:07:48.580 | faced with interest rates.
01:07:49.660 | It's really interesting, Bill.
01:07:51.220 | Lennar, the home builder, down a ton yesterday.
01:07:53.820 | Ford, the auto builder, down today.
01:07:57.180 | At the same time, these tech stocks are screaming.
01:08:00.060 | Because interest rates are high.
01:08:01.740 | The economy is restrictive.
01:08:03.140 | Not all parts of this market are benefiting.
01:08:06.180 | And I think that will be concerning to the president.
01:08:08.740 | He said he doesn't like these interest rates.
01:08:10.860 | He thinks the interest rates are too high.
01:08:13.060 | And Stan said, I feel great about the administration,
01:08:16.740 | super pro-business.
01:08:17.740 | But we got these two things, high valuations and maybe
01:08:21.380 | interest rates that are going to go higher
01:08:24.300 | that could ruin the party.
01:08:25.860 | We've gotten off to a strong start
01:08:27.460 | here in the early part of January.
01:08:29.180 | But for managers like us, we're paying close attention
01:08:31.940 | to those things.
01:08:33.860 | I'm kind of on record saying I don't think inflation
01:08:36.540 | is going to reignite, and that I think that rates will--
01:08:41.180 | we've probably seen the top in rates
01:08:43.500 | for the next three, four months.
01:08:44.940 | But hell, I don't know.
01:08:46.660 | I'm not a macro forecaster.
01:08:48.380 | We just have to look at them and take them as they come.
01:08:51.580 | But we will.
01:08:52.540 | If rates were to go to 5% or 5.5%,
01:08:55.420 | and we see that happening, our exposures would come down.
01:08:58.900 | But I would say right now, it's been a pretty incredible start
01:09:02.780 | to the year.
01:09:03.940 | Any comments on the Netflix quarter,
01:09:06.420 | which looks like it's up $40 billion today in market cap?
01:09:11.060 | I mean, just an extraordinary company, incredibly well-run.
01:09:15.060 | And frankly, it's a tailwind, another tailwind today
01:09:18.540 | for all internet companies.
01:09:20.220 | Remember one of the things I said
01:09:21.620 | that we were worried about about these companies?
01:09:23.660 | We said FX was going to be a big headwind,
01:09:26.300 | because the dollar has strengthened tremendously
01:09:29.140 | on a year-over-year basis.
01:09:30.420 | So we thought for most of these companies,
01:09:32.340 | FX was going to be a 2% or a 3% headwind.
01:09:35.380 | And they said-- they took up their guidance bill,
01:09:38.060 | even accounting for the FX headwind.
01:09:41.140 | So it's a much bigger raise in guidance on an FX-adjusted
01:09:45.260 | basis than people expected.
01:09:46.780 | And it just goes to show you, you know,
01:09:48.420 | I was with a well-known CEO of an internet company
01:09:52.020 | this weekend.
01:09:52.900 | And he said, Brad, we've doubled our revenues,
01:09:55.780 | and our costs keep coming down because of AI.
01:09:59.940 | I've said that I think this moment that we're in,
01:10:02.460 | the next five years, is going to be
01:10:04.060 | a golden moment of margin expansion for technology
01:10:08.220 | businesses, and frankly, all businesses.
01:10:10.460 | Human productivity is going to go up,
01:10:12.460 | which means the cost to do the things you're
01:10:14.980 | doing in your business are going to come down.
01:10:17.660 | So for market leaders who can hold on to that revenue, Bill,
01:10:21.220 | where it doesn't get competed away,
01:10:23.180 | their margins are going to expand.
01:10:24.980 | Look at what's happened at Meta and these other companies.
01:10:27.340 | I think that's accelerating in 2025 and 2026.
01:10:31.220 | You know, everything's price dependent,
01:10:33.340 | and things can change really rapidly,
01:10:35.220 | as the first week of this year showed.
01:10:38.100 | But it's fun to be with you.
01:10:39.740 | It's going to be a fun year to do this.
01:10:41.980 | We'll have some more great guests on in the next few weeks
01:10:45.940 | doing cameos and for the long form.
01:10:48.540 | But I appreciate you doing this with me again.
01:10:50.580 | It keeps me sharp.
01:10:52.060 | All right, man.
01:10:52.700 | Take care.
01:10:53.420 | Great seeing you, Bill.
01:10:54.920 | [MUSIC PLAYING]
01:10:58.360 | As a reminder to everybody, just our opinions,
01:11:07.420 | not investment advice.