DOGE kills its first bill, Zuck vs OpenAI, Google's AI comeback with bestie Aaron Levie
00:00:00.000 |
J-Cal, what just happened? Did you just have a moment? An emotional moment? 00:00:02.880 |
I don't know. You know, I've been trying to get over Sax and it's hard to get over your ex. 00:00:09.840 |
We were in a codependent relationship and we were working on it. 00:00:13.520 |
You have no one to fight with anymore. Who are you going to fight with, J-Cal? 00:00:16.160 |
That's the emptiness you feel inside because you're a fighter. You're a street brawler 00:00:20.160 |
and you have no one to brawl with anymore. So you feel... 00:00:22.480 |
Well, you're the first guy to jump under the table in a bar fight. 00:00:26.080 |
I mean, here you go. Here you go. As I was saying, it's always worth a start. 00:00:30.400 |
No, I feel like if this goes down, I feel like Sax and I would have like, 00:00:34.560 |
we would have thrown down. We were bulldogs. We would have fought like tooth and nail. 00:00:38.640 |
I have this image. It's like we're in a bar. Okay. And all of a sudden... 00:00:47.440 |
A fight is about to break out. Four thought bubbles appear. Okay. J-Cal's like, "Let's 00:00:53.600 |
go get him." Sax is like, "This guy's a dipshit. Let's roll." I'm like, "Look at that chick. She's 00:00:58.960 |
so good looking." And then Freeberg's like, "What will happen to my invite to **** duck?" 00:01:04.160 |
Freeberg's like, "Oh, how do I get out of this unscathed?" 00:01:10.320 |
All right, everybody. Welcome to the number one podcast in the world. 00:01:32.720 |
With me again today, for better or worse, your sultan of science, David Freeberg is back. 00:01:39.840 |
How are you doing, Freeberg? I see you're a pawn. Only a pawn in their game. 00:01:48.960 |
Pawn or queen? What is that from? Is that from a movie? What movie is that from? 00:01:56.000 |
It's like some French new wave I'm not aware of. 00:02:00.720 |
I've learned that Freeberg's background is some like weird, secret communication language. There's 00:02:07.600 |
a small but fervent group of people that are really into these backgrounds. They're always 00:02:12.880 |
Absolutely. Of course, with us again, Chamath Palihapitiya, your chairman, 00:02:16.560 |
dictator. How are you doing, Chamath? Nice sweater. 00:02:23.040 |
I know. Look at this. Three or four besties will be hitting the slopes. 00:02:26.480 |
Three or four besties will be skiing together. 00:02:28.320 |
Aaron might be there. Aaron, where are you going for the holidays? 00:02:39.440 |
Okay. This is perfect. We're all going to be there, bro. 00:02:42.320 |
It's $2,000 a night. How many rooms you got with the kids and the fam? You have three? 00:02:49.280 |
What did you do? Did you start buying Bitcoin with your treasury? What do you mean Box is 00:02:53.280 |
at an all-time high? Are you doing a Michael Saylor or something? 00:02:56.320 |
It turns out if you didn't get crushed during the post-COVID period, you can just keep cranking. 00:03:06.320 |
Oh, you've been building a real business, of course, with us now, our new fifth bestie, 00:03:09.840 |
Aaron Levie. He is the CEO of the publicly traded company Box, which means 00:03:14.960 |
he's got the most to lose by coming here. So welcome back to the program. 00:03:17.920 |
Based on what you guys were just talking about before this recording started, 00:03:27.600 |
Listen, we're living in an age of meme stocks and people selling convertible notes to buy 00:03:33.120 |
Bitcoin for their treasury and then becoming a Nasdaq 100. It's that simple. Aaron, when you see 00:03:38.560 |
a company buy billions of dollars' worth of Bitcoin and get added to the Nasdaq 100, 00:03:44.320 |
what method of suicide do you think of taking your own life with? 00:03:47.520 |
There was a brief week in 2021 where the thought kind of crossed my mind. 00:03:56.080 |
You're just going to use the sword, the short blade? 00:03:59.200 |
We have a very sophisticated audit committee that prevented the action. 00:04:07.200 |
Just do this for me. How much cash does Box have on the books, ballpark? 00:04:12.320 |
We just did a convert, so we're probably 600 or 700 million. 00:04:17.520 |
Okay, here's what I want to do. A little experiment for next week. Just put 5% of the 00:04:21.440 |
treasury, 30 million in Bitcoin, and then we'll invite you back in two weeks and we'll see what 00:04:25.680 |
happens. Okay? Just put 5% of the treasury in Bitcoin. Hey, everybody, here's another announcement, 00:04:30.960 |
a little housekeeping. As you know, we successfully got the allin.com domain. That was a big victory 00:04:37.120 |
for us. We now have an email list. Four years into this meshugas, we now have the ability 00:03:45.280 |
to take your email address and spam you with all kinds of great information, like a notification 00:04:50.960 |
when the pod drops. Wow, so compelling. Insights from the besties who you've had enough of, 00:04:59.280 |
and first dibs on overpriced event tickets to come hang out with us. Wow. This is the compelling 00:05:05.680 |
pitch we have for giving us your emails. If you'd like to give us your email and get spammed, 00:05:10.160 |
go to allin.com and let the spam begin. All right, that's out of the way. Sax is out again this week. 00:05:17.440 |
I don't know what's going on. We've been trying to figure out where he is. He's MIA. If anybody 00:05:22.240 |
knows... He's in DC this week, yeah. Oh, is that what's going on? Did you see all the 00:05:26.400 |
meetings? I know, I'm being facetious. He's in meetings. Mr. Sax went to Washington, 00:05:34.640 |
creating waves, but Mr. Sax is in Washington. Have you guys seen the photos? 00:05:40.400 |
Aaron, what do you think of Sax in this role? I think it's a strong pick. 00:05:47.440 |
Let the genuflecting begin. Go ahead and say more, Aaron. 00:05:53.920 |
So crypto I'm a little bit indifferent on. We haven't spent much time there, 00:06:03.120 |
or leaned in much there. But on AI, I think it's a very strong pick. And I think you want somebody 00:06:11.120 |
that has a general deregulation, anti-regulation bent at this stage in the evolution of the 00:06:20.640 |
technology. I think there's risk of slowing down too much progress right now. And I think he'll 00:06:27.920 |
provide some good parameters and principles around how to avoid that. So I think very strong. And 00:06:34.240 |
then crypto, again, I don't know too much about. And then we'll see the rest of the topics. 00:06:38.240 |
As a software CEO of a public company, when the Biden administration was putting forward 00:06:44.720 |
their proposals on how to regulate AI and have definitions on the size of a model and 00:06:50.480 |
what the models could or shouldn't do and the regulatory authority they would have over the 00:06:54.240 |
software that's written, what was your reaction? And you were supporting Harris at the time, 00:06:58.800 |
I believe, or Biden at the time, right? But how did you react to that when you saw those proposals? 00:07:04.320 |
And just to be clear, are you talking about the EO that went out? 00:07:07.360 |
Yeah, there was the EO, but then they were also drafting. They published a lot of detailed 00:07:12.560 |
drafting. And then obviously California had its bill, which you probably saw as well, 00:07:16.240 |
which specified the number of parameters in a model and the size of a model and 00:07:20.400 |
all these kinds of constraints. In reverse order, I was against SB 1047. 00:07:25.200 |
It felt like you had two big issues. One, you probably don't want state-by-state legislation 00:07:31.040 |
on this topic. That's going to be, you're in a world of hurt if every state has a different 00:07:35.680 |
approach to this. And then secondarily, if you just look at how it evolved from the very first 00:07:41.280 |
proposal to the final proposal, and unfortunately, the kind of underlying philosophy that was in the 00:07:46.800 |
bill, it was very clearly a sort of like viewing basically AI progress as inherently risky right 00:07:54.320 |
now. And so it just ratcheted up the different levels of consequence for the AI model developers. 00:08:02.000 |
And the risk is sort of the second or third order effects of that, which is like, does Zuck then 00:08:07.120 |
want to release the next version of Llama if you're taking on that much risk? And even the 00:08:12.720 |
incremental updates, the liability you have in terms of any of the model advancements. 00:08:19.120 |
And so right now we're benefiting from just an incredibly competitive market between five or 00:08:23.680 |
six players, and you want them running as fast as possible, not having to have sort of this, 00:08:28.880 |
you know, a whole council before every model gets released because they're, you know, in fear of 00:08:33.440 |
getting sued by, you know, the government for hundreds of millions of dollars if one person 00:08:37.280 |
does something wrong with the model. So that was the problem with SB 1047. That's been the problem 00:08:41.280 |
with some of the proposals on national legislation. I felt like the first EO, it didn't have a lot 00:08:46.080 |
of teeth in it. So it kind of was more like, let's watch, you know, this space and continue 00:08:50.480 |
to study it. The actual current head of OSTP, Arati Prabhakar, a lot of folks in Silicon Valley 00:08:56.400 |
know her. She's actually very strong, very technical, you know, understands the valley well, 00:09:00.960 |
is not a sort of, does not lean into overregulation. So I actually think OSTP has had a pretty good 00:09:07.520 |
steward, even under Biden. But I think, you know, the efforts that Sachs would, you know, 00:09:13.600 |
clearly be leading, I think would lean even more toward AI progress and sort of not accidentally 00:09:19.680 |
overregulating too early in this journey. >> So let me ask you a question then about 00:09:26.400 |
crypto. You're not into crypto. Crypto is a little bit harder to regulate. So with Sachs being there, 00:09:32.960 |
what do we think the one, two or three things he could do to actually make crypto 00:09:37.920 |
not a scam, not have consumers get left holding the bag? Obviously, sandboxing projects, 00:09:47.440 |
maybe having know-your-customer rules, you know, some basic regulation there, 00:09:52.240 |
the sophisticated investor test comes to mind. Chamath, what do you think Sachs should do 00:09:57.280 |
in terms of crypto regulation in the short term, in the near term? 00:10:01.040 |
>> That's a really good question. I think that today there are a lot of 00:10:06.960 |
really valuable use cases that can sit on top of crypto rails. I think the most obvious one is 00:10:16.160 |
how you can have real-time payments using stable coins. I think the United States government is 00:10:22.320 |
already using some of these stable coins for a bunch of purposes. The number of companies that 00:10:28.400 |
are beginning to adopt and use stable coins is actually growing very quickly as well. 00:10:32.440 |
>> Take a second to define stable coins for the audience just to catch people up. 00:10:36.240 |
>> Let me define what a stable coin is, which is that you put a dollar into a wallet somewhere, 00:10:44.000 |
and in return, you get a digital token of that dollar. There are two big purveyors of 00:10:50.960 |
stable coins. There's Tether and then there's USDC, which is this company called Circle. 00:10:57.280 |
I think the easiest way to disambiguate them is Tether is abroad. I think it's run out of 00:11:04.000 |
somewhere in the Caribbean. USDC is American. It's run by this company called Circle. 00:11:09.840 |
The other difference is that there is some ambiguity around how these assets are 00:11:17.840 |
secured. What a stable coin is supposed to do is when you give them a dollar, 00:11:21.840 |
they're supposed to go and buy some short-term treasury security so that that dollar is always 00:11:27.920 |
guaranteed. Because if you had a billion dollars of stable coins, but only $900 million of assets 00:11:36.080 |
to back them up, there's an insolvency risk there. There's a run on the bank risk. Theoretically, 00:11:41.520 |
a billion dollars of stable coins should have a billion dollars of cash in some short-term 00:11:46.320 |
fungible security. What's incredible is at the scale in which these stable coins operate, 00:11:51.760 |
that has turned out to be an enormous business. Why? When you give them a billion dollars and 00:11:57.600 |
get a billion dollars of stable coins in return, they just go and put it into the bank. When 00:12:02.000 |
interest rates are two, three, four, five percent, they're making billions and billions of dollars. 00:12:09.600 |
They get the float. These businesses have turned out to be incredible, but that's beside the point. 00:12:14.080 |
The point is that a lot of companies that you would never think, so for example, SpaceX uses 00:12:19.120 |
these stable coins. How? How do you think they collect payments from all the Starlink customers 00:12:24.560 |
when they aggregate them in all of these long-tail countries? They don't want to necessarily take the 00:12:29.040 |
FX risk, the foreign exchange risk. They don't want to deal with sending wires. What they do is 00:12:34.160 |
they'll swap into stable coin, send it back to the United States, and then swap back to US dollars. 00:12:38.800 |
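To make the mechanics Chamath just walked through concrete, here is a minimal toy model of a fully reserved stablecoin issuer: mint a token per dollar in, park the dollars in short-term Treasuries, and keep the interest as float. The class name, the 5% rate, and the $1B figure are hypothetical illustration, not how Tether or Circle actually run their books.

```python
# Toy fully-reserved stablecoin issuer (illustrative sketch, not a real design).

class ToyStablecoinIssuer:
    def __init__(self):
        self.tokens_outstanding = 0.0  # stablecoins in circulation
        self.reserves = 0.0            # dollars parked in short-term Treasuries

    def mint(self, dollars_in: float) -> float:
        """Customer wires in dollars; issuer buys T-bills and mints tokens 1:1."""
        self.reserves += dollars_in
        self.tokens_outstanding += dollars_in
        return dollars_in  # tokens issued

    def redeem(self, tokens_in: float) -> float:
        """Customer returns tokens; issuer sells T-bills and wires dollars back."""
        assert tokens_in <= self.tokens_outstanding
        self.tokens_outstanding -= tokens_in
        self.reserves -= tokens_in
        return tokens_in  # dollars returned

    def is_fully_backed(self) -> bool:
        """The solvency check: reserves must cover every token outstanding."""
        return self.reserves >= self.tokens_outstanding

    def annual_float(self, tbill_rate: float) -> float:
        """Interest the issuer keeps on its reserves: the float described above."""
        return self.reserves * tbill_rate


issuer = ToyStablecoinIssuer()
issuer.mint(1_000_000_000)               # $1B in, $1B of tokens out
print(issuer.is_fully_backed())          # True
print(issuer.annual_float(0.05))         # $50M/year at a 5% T-bill rate
```

If reserves ever dip below tokens outstanding ($900 million backing $1 billion of tokens, in the example given above), `is_fully_backed()` goes false, and that is exactly the insolvency and run-on-the-bank risk described a moment ago.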
It's a very useful utility. Number one is I think we need to make those rails the standard rails in 00:12:47.680 |
the United States. What that does is it allows us to chip away at all of this decrepit infrastructure 00:12:54.960 |
that the banks use to slow down and tax a process that should never have been taxed. 00:13:03.840 |
Now, the banks have somebody to challenge them for money transfer and money storage, 00:13:10.080 |
and it could be regulated and stable, but I guess the question that- 00:13:13.760 |
That's the first thing, but then the second thing it allows you to do is you can see a 00:13:18.240 |
world where now you can have real competition against the traditional rails, specifically Visa, 00:13:23.360 |
MasterCard, American Express, because when you look at great companies like Stripe, I use Stripe, 00:13:29.200 |
I pay them 3%. If I use stable coins from Stripe, I don't pay zero, but I don't pay 3%, 00:13:37.440 |
it's kind of somewhere in the middle. If I was technically adept at implementing stable coins 00:13:42.400 |
through the entire stack of my product, so I use it for this learn with me thing where we publish 00:13:48.640 |
research, I would save a lot of money. The idea that you can take that 300 basis points you pay 00:13:55.600 |
to these companies and crush it to zero would be a boon to global GDP, because that's 3% 00:14:06.000 |
Aaron, you're shaking your head. This is something that you're experimenting 00:14:09.600 |
with at Box, or you're aware of, or a problem that you recognize. 00:14:13.760 |
No, no, just the credit card rails is, I mean, the tax on transactions is obviously insane, so 00:14:19.440 |
the stable coin being the intermediary for that in the future makes total sense if you could get 00:14:25.120 |
everybody to kind of coordinate against that. So, yeah. 00:14:27.120 |
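The fee arithmetic Chamath and Aaron are gesturing at is easy to sketch. The volume and the stablecoin rate below are hypothetical round numbers (he says it's "somewhere in the middle" between 0% and 3%), not Stripe's actual pricing:

```python
# Rough fee comparison: card rails vs. stablecoin rails (hypothetical rates).
annual_volume = 10_000_000         # $10M/year processed, illustrative
card_rate = 0.03                   # the ~300 bps card figure from the discussion
stablecoin_rate = 0.01             # assumed "somewhere in the middle" rate

card_fees = annual_volume * card_rate              # $300,000/year
stablecoin_fees = annual_volume * stablecoin_rate  # $100,000/year
print(f"saved per year: ${card_fees - stablecoin_fees:,.0f}")  # $200,000
```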
Well, and regulating it would get people off of Tether, hopefully, which has a checkered, 00:14:32.640 |
colorful, sordid past, you can go look it up, but they've had many, many legal 00:14:37.040 |
actions against them, I'll leave it at that, yeah. 00:14:40.800 |
Yeah, but in fairness to them, I think, again, both of these two companies look, as of today, 00:14:44.880 |
again, you have a jurisdictional difference, but as of today, it looks like they're both 00:14:50.400 |
pegged one for one. But anyways, the point is, if Sachs can really push the adoption of stable 00:14:56.400 |
coins, number one, and then number two is to incentivize much, much cheaper transactional 00:15:04.240 |
rails, then I think he can go to Bitcoin and these other more long-tail crypto projects 00:15:10.960 |
off the back of that momentum. Because these first two things, I think everybody will embrace, 00:15:16.480 |
and he won't get caught in a political quagmire. These other things, you have these opponents 00:15:21.600 |
always coming into the system, and they have, even Bitcoin, like when I said this thing about 00:15:26.960 |
encryption and quantum, and even though I thought I was very clear, the crypto community on the 00:15:35.680 |
internet went absolutely crazy all last week. Yes, they piled on, yeah. 00:15:40.000 |
But the thing is, some of the maxis piled on, then when they actually took time to understand 00:15:43.840 |
the technicalities of what I was saying, other people realized I was right, and they were 00:15:47.280 |
tweeting it. My point is, that world is so- Religious. 00:15:52.400 |
Animated and energized that I think it's hard to use them as the first place where you find 00:16:01.200 |
Go to stable coins, then disrupt the Visa rails, and then go to the other stuff. 00:16:05.760 |
I would go stable coins, and I would go even before that, the accreditation test that I've 00:16:10.080 |
been talking about, because the SEC has that mandate, so people get educated, Aaron. They could 00:16:14.720 |
buy crypto and know they're going to lose their money, or know that something's a fugazi or a fugayzi, 00:16:19.280 |
whatever it is. Yeah, I mean, I'm sure we want to move on, 00:16:25.600 |
So there's a parallel universe where, so no matter what, like obviously, Gensler did not get his arms 00:16:32.800 |
around this whole thing. So that was a big, big mistake. But there's sort of like DeFi/financial 00:16:41.360 |
crypto, where almost everything is deflationary, it improves the rails. If you believe in Bitcoin 00:16:48.960 |
as this store of value, and digital gold, all of those things can actually kind of make sense and 00:16:55.040 |
be a bit rational and improve things. And then, unfortunately or not, depending on your views, 00:17:00.640 |
there was this other sort of fractal event that happened, which was, oh, let's also use these 00:17:06.400 |
things as a means of kind of creating a virtual currency and equities and tokens for anything, 00:17:15.760 |
where that then runs into basically the SEC remit of like, are these things securities or not? And 00:17:22.800 |
is there insider trading or not? And can anybody issue them at any time and promote them on Twitter 00:17:26.960 |
or not? And so I think to some extent, if you could get back to like crypto 1.0, which was like, 00:17:32.800 |
this is a financial infrastructure, I think you would have avoided a lot of the sort of noise and 00:17:38.800 |
challenges with crypto. Now, I don't know, you can't put it back in the bag. But there's like, 00:17:44.800 |
I don't think you could get 10 crypto people to agree on how you regulate that second category, 00:17:48.880 |
because some people believe I should be able to issue an NFT on anything, and I should be 00:17:53.920 |
able to trade that. And at the same time, they would obviously, they would sort of claim, 00:17:58.640 |
well, that's just the same as an aftermarket, you know, seat to a concert. And yet another group 00:18:04.800 |
would be treating this as effectively, you know, a security. And so I don't know how you're ever 00:18:10.240 |
going to reel that in, without some people being upset, you know, within the crypto community. 00:18:14.320 |
All right, well, more to come and Sax will be back, Sax will be back. And we will be 00:18:20.000 |
rotating the fifth seat amongst, you know, friends of the pod, and newsmakers. 00:18:25.200 |
Sorry, did I already, did I already get rotated out? Based on? 00:18:28.320 |
Well, yeah, basically, the energy was a little low. But, you know, I mean, it's just, 00:18:33.200 |
I think, well, we'll have to see what your resting heart rate was, what your resting heart rate 00:18:37.120 |
was versus our boys', and then we'll just make a decision. 00:18:40.560 |
All right, listen, Doge is fully operational. Elon and Vivek have helped kill the last-minute 00:18:48.960 |
omnibus spending bill. Wednesday night, the bill was killed. And we were looking at the 00:18:55.760 |
government shutting down starting Friday, December 20, today, when you're listening to this. For some 00:19:02.000 |
quick background in September, Congress approved a bill that would keep the government funded 00:19:05.440 |
through December 20, the day this episode published. Keep that December date in mind 00:19:10.880 |
for a second. This Tuesday, December 17, three days before the deadline, leaders in Congress 00:19:17.280 |
unveiled what was presented as a bipartisan stopgap bill that would keep the government 00:19:22.160 |
funded through March 14. That kind of bill is called a continuing resolution; it basically gives 00:19:27.600 |
the incoming GOP majority more time to reach an agreement on the funding for the 00:19:33.280 |
government. But there are two major problems with the bill. First, it's a rush job: it had to pass the 00:19:37.840 |
House and the Senate by Friday night after being presented on Tuesday. Second, it's absurdly long: 1,500 00:19:43.760 |
pages with $340 billion in spending, including pay raises and better healthcare benefits for 00:19:50.480 |
the members of Congress. My Lord, read the room, gentlemen and ladies. Funding for a global 00:19:55.440 |
engagement center for another year. That's the disinformation watchdog group that was wrapped up 00:19:59.760 |
in the Twitter files, $130 billion to renew the farm bill for another year, $110 billion in 00:20:05.920 |
hurricane disaster aid, just money being thrown everywhere, a billion three to replace the 00:20:11.440 |
Francis Scott Key Bridge in Baltimore. They had some spicy comments on it. Congress has known 00:20:16.480 |
about the deadline since they created it in late September. He said the urgency is 100% 00:20:22.160 |
manufactured, designed to avoid serious public debate. But serious public debate is exactly 00:20:26.880 |
what's happening on Twitter. People are screenshotting and using ChatGPT and Claude and 00:20:33.360 |
Gemini to work their way through these documents. And it looks like this is not going to happen. 00:20:40.240 |
Freeberg, your thoughts. So the proposed bill made no real change to the current spending level 00:20:46.000 |
of the federal government at roughly $6.2 trillion on an annualized basis, which by the way, is 00:20:52.480 |
roughly call it 23% of GDP. Just to give you some context, in 1860, nearly 100 years after the 00:21:01.440 |
founding of the United States government, federal spending to GDP was less than 1%. 00:21:08.800 |
And it took off during the Civil War for a couple of years. But, you know, we're at these kind of 00:21:15.120 |
like unprecedented levels year after year now, really speaking to how the federal government 00:21:20.960 |
has grown, as we talked about many times so much in our life. And, you know, our roads were really 00:21:25.760 |
bad back then, though. Yeah, yeah, our roads were really bad back then. But remember, the objective 00:21:33.040 |
of the republic was to have the states make local decisions about how to take care of their 00:21:37.760 |
infrastructure. The national highway effort obviously changed that in the mid 20th century. 00:21:44.000 |
But this was kind of the original intention of the republic. It wasn't to have the federal 00:21:48.480 |
government come in and employ people, provide insurance to people, provide energy markets to 00:21:55.680 |
people, own football stadiums, etc, etc, etc. If you go through the list of things in this bill, 00:22:01.600 |
I think the $100 billion of natural disaster relief, everyone says that seems very reasonable, 00:22:07.520 |
that's something the federal government should do when we have a natural disaster, we need help, 00:22:10.960 |
we need support. That's a great thing for the federal government to do. But think about the 00:22:15.040 |
incentive it creates, it distorts markets. So we've talked about this in the past, in areas 00:22:20.160 |
where you have a higher probability of natural disasters, and people have paid a lot of money 00:22:24.240 |
for their homes, pay a lot of money for infrastructure, should the federal government 00:22:28.080 |
come in and rebuild those homes and provide capital to those individuals and those businesses 00:22:32.880 |
to help them rebuild those homes, if there's a high probability of natural disaster events 00:22:36.960 |
happening again in the future, it means that the cost of insurance doesn't matter. And the cost of 00:22:41.600 |
real estate doesn't matter because the federal government effectively can come in and support 00:22:45.600 |
those prices. In the same way, the federal government comes in and supports the prices 00:22:49.520 |
in agricultural products, through the work in the farm bill, and through the biofuels mandates, 00:22:54.320 |
which were also proposed to be extended in this thing. So the federal government's playing both 00:22:58.160 |
a market role, and you know, also kind of this role that I think fills the gap where people want 00:23:04.320 |
to have a customer where there isn't a customer and an employer where there isn't one. So like, 00:23:09.280 |
how did we get to this point? So from a first-principles perspective, we've kind of, I think, lost the 00:23:13.520 |
narrative on what the federal government was meant to do. If you think about the simple rubric in a 00:23:18.000 |
bill, like just go back some period of time and someone says, Hey, I want something I want to have 00:23:22.640 |
this in this bill. And you're the representative that's supposed to vote on that bill and say yes 00:23:26.960 |
or no. It's very hard to just say, No, we are not going to spend that money. What's the incentive 00:23:33.520 |
to say no? The alternative is you say yes, but give me x as well. There is an incentive in that 00:23:41.360 |
response. And the incentive there is to get something for your electorate, the people that 00:23:45.760 |
voted you in as a representative of the house, which is how we ended up at this point, where 00:23:50.720 |
everyone says I want something if you're going to get something, and eventually the government, 00:23:54.800 |
the federal government swells to spending roughly 24% of our GDP. Now, the biggest mistake I think 00:24:01.360 |
the founding fathers made was that they didn't create constitutional limits on spending and 00:24:05.760 |
enrichment. And this was because they had these deeply held philosophical beliefs that relied on 00:24:10.480 |
the House of Representatives to provide a check by the people. If you read the Federalist Papers, 00:24:15.920 |
and I went through a couple of these recently, and I used ChatGPT to help me kind of, 00:24:19.520 |
you know, bring out some of, I think, the key points. But in the Federalist Papers 10 and 51, 00:24:26.160 |
James Madison emphasized that the structure of government was meant to ensure that both state 00:24:30.720 |
and federal governments would limit each other's excesses, including their financial ones. And then 00:24:35.360 |
in the Federalist Paper number 58, he said, the House of Representatives has control over the 00:24:40.160 |
quote, power of the purse, which gives the people's representatives authority over taxation 00:24:45.440 |
and spending. But they also warned along with Alexander Hamilton, of the dangers of unchecked 00:24:51.600 |
government power through burdensome taxation, and excess spending, which would ultimately erode 00:24:57.520 |
individual freedoms. So they recognized that there were going to be limits, but their expectation was 00:25:02.160 |
that the house and the individuals that were electing people to the house that were members 00:25:05.840 |
of this republic would come in and say, we're going to keep that from happening. And clearly, 00:25:10.560 |
something went wrong along the way that we got to this point, where again, spending is 24% of GDP. 00:25:16.720 |
And I think that the biggest thing they got wrong was that they didn't create these constitutional 00:25:21.200 |
limits on federal spending or taxation through either a balanced budget structure, spending as 00:25:26.880 |
a percentage GDP, no federal debt or term limits or all of these other mechanisms that could have 00:25:31.600 |
been introduced at the beginning, that could have created some structural limits. Instead, 00:25:36.640 |
they assume that there was this natural limit that would emerge as a function of the democratic 00:25:40.560 |
process because of how they formed the government. But unfortunately, I think they failed to realize 00:25:45.680 |
that the electorate would eventually not want the freedoms of the people of the time. 00:25:53.680 |
Back then in 1776, this was a pioneering country where everyone wanted to come here to have freedom 00:26:00.400 |
to do anything they wanted, everywhere they wanted to build a business, to homestead, 00:26:04.560 |
to be rugged individuals. It was entrepreneurial. Yeah, it was. And over the last 250 years, 00:26:10.960 |
we've gotten used to an increment in lifestyle every year and discovered that we have a mechanism 00:26:16.560 |
to force the increment in lifestyle through the actions of government. And so the electorate has 00:26:21.600 |
stood up and said, "I want more each year, and I want the government to provide it for me if the 00:26:26.560 |
free markets are not doing it." And that's really where we kind of got to this point where I think 00:26:30.480 |
we elected people to the House who ultimately had this incentive that said, "If I give you this, 00:26:35.120 |
I need to get this." And we ended up swallowing this. So I don't know if it cracks with Doge. I 00:26:40.160 |
really don't know if people step up and recognize the limits of government and what the limits 00:26:45.360 |
should be of the federal government on an electorate basis. It's an amazing moment to see 00:26:49.760 |
that Elon went on Twitter and said, "Hey, guys, this is nuts." And everyone said, "This is nuts. 00:26:53.120 |
We're not going to elect you if you do this." If that momentum and that transparency can keep up, 00:26:58.000 |
I hope that people start to connect the dots, that this isn't a free lunch, 00:27:01.120 |
that the federal government spending is not limitless, and it's not unaccountable. 00:27:05.120 |
Aaron, I think we have enough people who are notable now speaking up about this excess spending 00:27:14.800 |
and the out-of-control debt that it's now in vogue to talk about austerity, to talk about 00:27:21.520 |
inefficiency. And that really all comes back to Doge and, dare I say, you know, the conversations 00:27:27.520 |
we've been having on this podcast for two years, that this is becoming acute. What are your 00:27:30.880 |
thoughts on this vibe shift, this complete pivot where we've gone from, "My Lord, everybody's 00:27:36.880 |
saying, 'I got to get mine. You got yours. I'm getting mine,'" to name and shame. They're naming 00:27:43.120 |
and shaming now very specific pieces of pork in these bills, you know, including stadiums for 00:27:50.640 |
the NFL. And people are like, "Why is the NFL getting this if they're worth $20, $30, $40 00:27:54.640 |
billion?" Two just quick thoughts. One, Patrick Carlson had a tweet yesterday that basically said 00:28:00.160 |
this big piece of misinformation, kind of created by people that want to be slow, is that you have to sort of 00:28:07.440 |
choose two of fast, good, and cheap. And I think basically, you know, Elon's companies have sort of 00:28:14.560 |
always effectively kind of proven the opposite, which is actually, if you just start to ask the 00:28:20.800 |
question, "Why does that thing have to cost as much?" You know, if you're building a rocket or 00:28:26.080 |
designing a car or developing batteries, if you just do ground-up, why does it have to cost as 00:28:30.720 |
much? And so what's interesting is that probably if most people looked at what the government was 00:28:35.840 |
spending on, they wouldn't even feel like, you know, it's not even helping them in like the 00:28:40.080 |
disaster relief sense of, you know, I think like that there are probably actually people that 00:28:44.800 |
actually do experience the benefits of disaster relief. It's actually just all of the overhead 00:28:50.960 |
that we've created to getting anything done in the government that could actually make the 00:28:54.240 |
government better serve all the constituents. I was talking to, you know, sort of a nameless 00:29:00.400 |
individual in the government the other week, where by Congress, they have to hire contractors 00:29:06.400 |
to do work. And the contractors, the contracting firms charge them two and a half times the sort 00:29:12.400 |
of cost of an individual employee that they could otherwise hire. And so now they have to outsource 00:29:18.240 |
the work, and they don't have any accountability mechanism for that contractor. And so I think 00:29:23.760 |
there's not a single American that could look at that and say, "This is like actually working well." 00:29:28.400 |
Like, yeah, we are spending more money to do less. And the ultimate outcome is actually lower 00:29:35.360 |
quality. And so you have to at some point, just kind of do a little bit of a reset moment. And 00:29:40.480 |
that's obviously the upside of Doge is like, it's like, it breaks every rule of us thinking about 00:29:45.200 |
how you would actually go and attack this problem. We thought you'd attack it through meetings. 00:29:49.120 |
And we would do it through congressional, you know, sort of processes and research. And it's 00:29:55.120 |
actually just, it is, you know, obviously a much more sort of founder startup oriented way to 00:30:00.800 |
approach this. There's gonna be lots of things that are broken glass around the edges. There's 00:30:05.120 |
no question. But I think what's interesting about this week's event is, I think that there's been 00:30:09.760 |
this underlying kind of notion of like, you know, Elon and whatnot at all, you know, don't understand 00:30:15.840 |
the government enough to be able to change it. And it might actually be the case that the 00:30:19.600 |
government doesn't understand Elon in the sense of, of like, he will just see this thing through. 00:30:25.440 |
And the tools at his disposal and Doge's disposal is sort of, you know, 00:30:30.320 |
completely unprecedented in terms of the ability to put any, anybody in Congress on notice if, 00:30:35.280 |
you know, if basically they're promoting things that are not making the country better. So, 00:30:40.880 |
so that the, you know, the thing that we saw this week was actually that playing out, 00:30:44.640 |
everybody's been wondering, well, what are the actual, you know, what are the formal mechanisms 00:30:48.320 |
Doge has to accomplish and enact change? And it's like, you just saw it, like, they can just, 00:30:53.600 |
they can just create enough visibility and spotlight on the problem that it causes it 00:30:59.280 |
a level of discomfort in, in supporting moving forward with, with whatever that thing is. 00:31:04.000 |
And so I think it's interesting. I have no opinion on the actual elements of the bill, 00:31:07.920 |
other than from a process standpoint and a, and a new kind of case study of how this is going to 00:31:13.680 |
play out is I think we're seeing some early indication of what Doge will, will be able to do. 00:31:17.520 |
Aaron, how do you feel about this Doge effort? 00:31:20.800 |
As someone who is a public Harris supporter. So you've come out, I think you've, you've been 00:31:26.880 |
public about this, but I want to understand like why people wouldn't be supportive of this effort, 00:31:33.200 |
right? Like, like, what is like, what is the motivation for people saying this isn't a good 00:31:37.440 |
thing? We shouldn't be doing this. Like, cause there's a lot of folks that have gone on these 00:31:41.200 |
shows. It's like, they, they, they blast Elon, they blast Vivek, but like, aren't these principles, 00:31:46.480 |
like, shouldn't they just be universal that we should not be wasting money and stuff? 00:31:50.960 |
Sure. Of course. The, I mean, so to give some credit, you know, you have Ro Khanna supporting 00:31:56.880 |
it. You have a Fetterman supporting it. Bernie Sanders, you know, everybody has their. 00:32:00.960 |
Yeah. Bernie Sanders had a good call out for Elon. 00:32:03.360 |
You have, everybody has their thing in the government budget that they don't like. So 00:32:06.800 |
assuming that they see that as something Doge can contribute to, you could probably get 00:32:10.320 |
actually broad support. You know, there's a classic, you know, sort of reflex within, 00:32:15.360 |
within probably the Democrat party on, on at this point, just because of Elon's support of Trump, 00:32:19.280 |
that, that if something is a, is an Elon project, they're going to, they're sort of going to 00:32:24.240 |
instantly respond no matter what the thing is as a negative without, you know, kind of actually 00:32:29.120 |
saying, is this, does this actually support actually something I do agree with? And so that's. 00:32:34.160 |
It's all partisan, none of it's first principles, right? I mean, for these people, for this group 00:32:39.120 |
of people, which isn't everybody. That's going to be true of both sides. Like Michelle Obama, 00:32:43.200 |
Michelle Obama was like, let's get kids healthy. And all of a sudden now it's in vogue to do that. 00:32:47.920 |
So, so I think, I think we're just in an environment where, where any, anything will 00:32:52.080 |
become partisan. What's interesting is, is that because of the, you know, some of the, 00:32:56.880 |
the cross party elements of, of, of Trump and now his cabinet is it might pull in more, 00:33:02.640 |
more of the Democrat party than, than would usually happen. And, and I think because of 00:33:07.200 |
Elon and the people that are surrounding Trump, you probably have a bit more air cover for the 00:33:11.680 |
Ro Khannas of the world to also step up. Because if it was like Steve Bannon and Trump doing Doge, 00:33:17.520 |
it would be like, ah, okay, you know, maybe this is not the thing to, to, you know, lend credibility 00:33:21.440 |
to for, for pure political reasons. If you have, you know, some of the best entrepreneurs that are, 00:33:25.920 |
that are out there actually like literally in the cabinet driving this, at some point, you know, 00:33:31.280 |
it's, it is an IQ test if you're, if you're on board or not. Chamath, this is a sea change that 00:33:36.720 |
we're seeing. And to Aaron's point, this administration gets to pick their priorities, 00:33:41.440 |
but everything can't be a priority or it's not a priority. Therefore they've picked the 00:33:45.600 |
priority of smaller government, more controlled spending over say mass deportation or removing 00:33:52.320 |
more rights from women to choose when they have an abortion, et cetera. So this seems like a 00:33:57.120 |
distinctly different focus for Trump 2.0, the second term. What are your thoughts on what we 00:34:02.400 |
saw this past 72 hours? It was the most, it was the most incredible change in how politics will 00:34:11.920 |
be done going forward. I think that people are probably underestimating what happened here. This 00:34:19.360 |
was a multi hundred billion dollar grift that was stopped on a dime over 12 hours of tweets. 00:34:30.800 |
You would have never thought that that was possible. The point is to put a dagger in 00:34:35.520 |
something that big that had so much broad support just a few hours earlier. I think 00:34:43.440 |
it's so consequential in how the United States can run going forward. So building on what you 00:34:48.960 |
said, Chamath, also very interesting here is the fact that Trump hasn't taken office yet, 00:34:55.200 |
and they're having this huge impact. This is occurring a month before he's even in office. 00:35:00.560 |
If they can stop this, what will they do when they actually are in power? 00:35:04.560 |
I think right now you're seeing the first order reaction, which is about the bill itself, 00:35:09.840 |
and I think that misses the point. The bigger issue is going forward, you will have the ability 00:35:15.680 |
to, and part of it is because we have a set of tools now that allows us to do this, 00:35:21.120 |
to read 1500 pages in a matter of minutes, to summarize it into the key essential elements, 00:35:27.760 |
to really understand what's being asked and what's being offered, and then to put it in a 00:35:34.240 |
digestible format that normal people can consume. Then all you'll have to do is just connect the 00:35:40.000 |
dots and tell your congressman or congresswoman that you'd like or dislike this thing, and what 00:35:44.960 |
you're going to see is a much more active form of government. I think that's the really big deal, 00:35:49.600 |
the fact that it really becomes the voice of the people. 00:35:54.000 |
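A minimal sketch of the workflow Chamath describes, reading a 1,500-page bill with AI tools: chunk the text to fit a model's context window, summarize each chunk, then summarize the summaries. The `call_llm` function here is a stub standing in for whichever model API you use (ChatGPT, Claude, Gemini); its name and signature are assumptions, not a real library call.

```python
# Map-reduce summarization of a long bill (sketch; call_llm is a placeholder).

def call_llm(prompt: str) -> str:
    """Stub: send `prompt` to your model of choice and return its reply."""
    raise NotImplementedError("wire up a real LLM client here")

def chunk_text(text: str, max_chars: int = 20_000) -> list[str]:
    """Naive fixed-size chunking; a real pipeline would split on sections."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarize_bill(bill_text: str) -> str:
    # Map step: summarize each chunk independently.
    partials = [
        call_llm("List the spending items and riders in this bill excerpt:\n" + chunk)
        for chunk in chunk_text(bill_text)
    ]
    # Reduce step: merge the partial summaries into one digestible brief.
    return call_llm(
        "Combine these partial summaries of a spending bill into a plain-English "
        "brief a voter could read in five minutes:\n\n" + "\n\n".join(partials)
    )
```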
Now, the alternative can also happen. Imagine there is a piece of legislation 00:35:58.240 |
that is very controversial, but it turns out that people actually want it. Then the opposite will 00:36:06.640 |
also happen and can also happen now. I think that it's a very nuanced and interesting way that 00:36:11.920 |
governance can happen. The other thing that I'll say though, which is funny, is we should have a 00:36:16.000 |
segment called Today in Hypocrisy. If I was running the segment, what I would say is, "Today 00:36:21.360 |
in Hypocrisy, you have a group of people, i.e. the Democrats, who are very upset and who now point to 00:36:30.800 |
Elon as some shadow cabinet minister or some shadow president-elect or some shadow first buddy." 00:36:41.120 |
Except then I thought to myself, "Well, hold on a second." There was something untoward happening 00:36:46.960 |
in the shadows and I thought, "Well, actually, this is the exact opposite." The guy is tweeting 00:36:51.440 |
in real time his stream of consciousness. You absolutely know everything that he wants because 00:36:57.200 |
he just lays it all bare. At the same time, I thought it was really interesting. The same people 00:37:02.160 |
who were saying that were the ones that finally admitted that they were hiding Biden for the last 00:37:07.120 |
two years. I thought, "Did we just miss this?" The same people that are like, "Hold on, Elon. I don't 00:37:12.960 |
like the fact you're telling us what you actually want on Twitter transparently while we hide our 00:37:18.640 |
President of the United States in a box." Well, yes. You're referring to a Wall Street 00:37:28.560 |
Yes. As we said on this pod, we knew that they were hiding him. Now, the cover-up is worse than 00:37:36.480 |
the crime and the cover-up is a cover-up. They were not letting him take meetings. They were 00:37:41.360 |
limiting access to him. Dean B. Phillips came on this program. Shout out to him. Congratulations 00:37:46.560 |
on a great run. He just told the truth here. He's not up for it. He's sundowning. Terrible 00:37:54.480 |
situation. People get old and people have cognitive decline. The end. 00:37:59.760 |
Question. Hold on a second. Just to contrast and compare, Trump is not even in office. 00:38:06.560 |
Elon is not a member of the cabinet per se. These are effectively today still private citizens. 00:38:13.520 |
There's all of this noise about what happened over the last two days to stop an absolutely 00:38:17.440 |
ridiculous pork barrel bill. When are we going to double-click into what decisions have been 00:38:22.560 |
made in the last two years that were actually Biden's versus surrogates that just decided, 00:38:27.680 |
and who gave them the authority to make those decisions? 00:38:30.320 |
If they're going to do investigations, Aaron, and you and I are left-leaning moderates, I think that would 00:38:34.640 |
be the most accurate way to describe us. I mean, if they want to do an investigation, 00:38:39.360 |
there should be an investigation, too. Did they know that he was in massive cognitive decline and 00:38:44.240 |
let him stay in charge of the nuclear codes? Do you agree or disagree? 00:38:48.720 |
This is sort of not the part of politics I think about as much, so I'll leave that up to you as 00:38:55.920 |
the other left-leaning moderate. No, I just wonder if this is a crime that they... What if it turns 00:39:02.400 |
out he actually has Parkinson's or Alzheimer's, like a diagnosis? It's not out of the realm of 00:39:08.800 |
possibility that they covered up an Alzheimer's diagnosis, Chamath, and if they did, is that 00:39:14.320 |
criminal? I mean, it's unethical. It's deeply unethical. It's very dangerous. 00:39:24.400 |
The people elected Joe Biden. He won fair and square. He ran on a specific mandate that the 00:39:31.600 |
people endorse. I just think it's devious, though, that certain figures in that White House took a 00:39:38.800 |
level of power and decision-making authority that they were never entitled to. If they wanted that, 00:39:45.200 |
they should have run and they should have been elected. That's what we all sign up for, 00:39:49.520 |
and so I think that idea that we let that happen or that that happened to the American voting public 00:39:55.760 |
is, I think, very unfair. That's why I think you have to realize that, and you've said this before, 00:40:03.120 |
we need these checks and balances going forward. I think the way that you have these checks and 00:40:07.760 |
balances, again, is veer towards transparency. The more transparency there is, and again, 00:40:13.280 |
this is where I'll say you may or may not like Donald Trump, but the one thing you will never 00:40:19.520 |
have to be worried about is whether you will not be able to hear from him in first person. 00:40:26.240 |
You're going to hear from him. That is absolutely true. I mean, 00:40:31.120 |
If you're ever in doubt, you will be able to know very quickly where he stands 00:40:36.320 |
on whatever topic is important to you, and I think that that idea is very important, 00:40:41.280 |
because then if it's filtered through somebody else because of some health issue that's then 00:40:45.760 |
covered up, we're making decisions that impact the entire world. We're making decisions that 00:40:51.440 |
impact the economy. We're making decisions that touch hundreds of millions of American lives. 00:40:58.320 |
Yeah, it's kind of crazy. I have a simple suggestion here with these bills, by the way, 00:41:02.000 |
when they're 1,500 pages. How about for every 100 pages you release, you have one day of review? 00:41:07.440 |
So if you want to release 1,500 pages, 15 business days of review, 00:41:10.800 |
does that sound reasonable? Maybe three weeks, and then just stop doing 1,500 at a time. 00:41:15.920 |
Break these things up into two or 300 pages at a time. I just love the fact that people 00:41:21.280 |
are motivated and they have the will and the desire to focus on this, 00:41:28.400 |
because people have to have the will and people did not have the will to get in the weeds 00:41:33.040 |
and examine the spending. And now it's becoming like in vogue. It's becoming a 00:41:38.400 |
pastime to question the spending. This is a really great moment in time for America. 00:41:42.480 |
The other thing, Jason, that we haven't done is I think that killing the bill was step one. 00:41:47.200 |
The thing that America has not yet seen, and I think Aaron brought this up with the Patrick 00:41:52.800 |
Collison tweet, which is just excellent. Nick, if you just want to put it up there, it really is 00:41:56.960 |
true. America still believes that the more you spend, the more you get. We do at the core. 00:42:05.520 |
That's why there are 1,500 pages of spending in here, because people want things because 00:42:10.000 |
they want things to be better. What we need to train people to understand is actually that it's 00:42:16.560 |
the lies that are told that make you think that with more money comes a better outcome. 00:42:21.520 |
And we see it every day in Silicon Valley. We all start with next to nothing as a startup, 00:42:27.840 |
and we outmaneuver and out-execute companies with way more resources all day long. So it has 00:42:33.040 |
nothing to do with the resources. >>Constraint leads to great art. Aaron, 00:42:36.960 |
your thoughts? >>No, I can't add that much more to this. I think there's probably a little bit of a 00:42:42.400 |
disconnect between, let's say, the voting public and broad constituents, and those 00:42:49.440 |
that have sort of seen this in real life, being inside of a company, having to do a startup and 00:42:54.480 |
scaling up and just the perverse incentives to build bigger teams, spend more. Your project 00:43:02.720 |
then is more important, the more dollars it gets. We have all of these systems in place, 00:43:07.280 |
which is the stuff that gets attention are the things that you spent more on. 00:43:10.880 |
So you have all these weird incentives to actually have your thing literally cost more, 00:43:15.680 |
to have more overhead because you've brought in more contractors into the project that then 00:43:21.520 |
you're going to get some kind of future benefit from in some way. So you have a lot that is sort 00:43:27.520 |
of fully broken in this. And it's hard to imagine any other way to veer off from that path other 00:43:35.520 |
than something that does shake things up as much as Doge is doing. >>All right, listen, we can't 00:43:40.880 |
do any worse than being massively in debt. So just let's have a culture of- >>Yes, we can. Yes, 00:43:45.840 |
we can. You could have runaway debt. >>I think we already have that. I mean, 00:43:50.000 |
for people who maybe are rooting against Trump in the audience or rooting against this because 00:43:56.240 |
they're super partisan, all I'll say is I know these individuals around Trump, root for them 00:44:01.440 |
and root for this process, please, because this is a path to fixing the most acute existential 00:44:07.680 |
problem we have, which is our debt. You don't have to like Trump to like Elon, to like Vivek, 00:44:12.880 |
and to like these other individuals, Sachs included. There are great people who are being 00:44:17.760 |
called to serve. Let's judge them based on their performance. That's all I ask for the people who 00:44:24.080 |
hate Trump, who hate these individuals, judge them on their performance. They've come out with a bold 00:44:29.680 |
plan. Let them cook. Once they've cooked, then judge their results. That is what I'm telling 00:44:36.000 |
everybody who hates Trump and who hates this administration and who's partisan on the other 00:44:40.720 |
side. Let them cook, judge the results. >>I'll just throw out one more thing on this, 00:44:44.480 |
because I think Doge has, the branding of Doge is often the efficiency side, which people always go 00:44:50.960 |
to the spend side. But the corollary to that is the regulations that obviously are expensive to 00:44:57.280 |
maintain. That's what creates layers and layers of overhead on reviewing everything that's coming in 00:45:03.040 |
then to the government. Unfortunately, we have great examples in California where literally, 00:45:10.480 |
we spend more to do less. It's because we've ratcheted up these layers and layers of regulation. 00:45:16.480 |
I have friends literally doing climate tech. In climate tech, you couldn't imagine something more 00:45:23.040 |
left-leaning Democrat, probably, and they can't actually get things done in California, the state 00:45:28.240 |
that you would imagine to be the most climate-first friendly state, because of the amount of 00:45:32.800 |
regulation that prevents them from getting things done. There's actually this total combination of 00:45:40.560 |
actually fewer regulations. You'll spend less money in government. You'll actually grow the 00:45:45.360 |
economy faster, which will create more jobs. All of the things get solved, the more efficient you 00:45:51.280 |
get, kind of writ large on all of the topics. >>Absolutely. So this is spending. Great point, 00:45:56.400 |
Aaron. Regulations next. Let's see what they do there. I was in the room when Antonio Gracias 00:46:02.240 |
was doing zero-based budgeting at Twitter, now X, and Sax and I were looking at roles for people. 00:46:10.960 |
What this team can do in terms of making things more efficient, it's amazing what can happen when 00:46:16.880 |
you do zero-based budgeting and when you just think from first principles, well, do we even 00:46:21.360 |
need to do this? Does this need to exist? Take all these regulations, put a 20-year clock on 00:46:27.520 |
half of them, a 10-year clock on the other half. >>Can I just give one more random example, feel 00:46:31.280 |
free to edit it out. Chamath, you'll like this because it came out of Meta. Have you followed 00:46:35.680 |
the Zstandard (zstd) compression library from Meta? So this open-source library, kind of a next-gen 00:46:42.800 |
compression on data. It took us probably too long, but the team worked insanely hard, 00:46:47.840 |
implemented this compression algorithm. Our uploads and downloads got faster. 00:46:51.840 |
We spend less money on networking and compute. It took some re-engineering of the system, 00:46:58.800 |
but that's just not a concept that people go into problem-solving with, which is like, 00:47:02.960 |
what if the thing actually was cheaper to run and it was better? And so think about all the systems 00:47:08.240 |
of government that could just be upgraded, and then you would spend less money actually 00:47:12.400 |
maintaining them. I mean, we spend hundreds of billions of dollars, way too much, on legacy 00:47:17.360 |
infrastructure, technology. We could automate more. You could spend way less money and then 00:47:20.960 |
get better outcomes. So this is just happening everywhere, and I don't think people realize 00:47:25.520 |
the scale of the opportunity. >>And I'll just give an example. When we went into Twitter, 00:47:30.560 |
nobody was coming to the office. There were people who hadn't come to the office, who hadn't 00:47:36.080 |
committed code in six months. So what were they doing, right? So you start looking at the data. 00:47:40.640 |
Then they were spending an enormous amount of money on SaaS software that nobody had logged into. 00:47:45.120 |
And then they had desk software to route people around that was costing $10 per day, per desk, 00:47:51.920 |
per location, whatever. Nobody was coming to the office, but they were paying for software to route 00:47:58.800 |
people to the right desk, in-office seating. The waste, when I tell you the waste and the grift, 00:48:04.800 |
and I'll just call it what my interpretation is, stealing. They were stealing from those 00:48:09.360 |
shareholders. The government is stealing from taxpayers. It has to be fixed. Let our boys cook. 00:48:15.760 |
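Aaron's Zstandard example from a few minutes ago is concrete enough to sketch. Here is a minimal round trip using the open-source python-zstandard bindings (`pip install zstandard`); the payload and compression level are arbitrary stand-ins:

```python
import zstandard as zstd

# A repetitive payload standing in for logs, API responses, etc.
data = b"GET /api/v1/files HTTP/1.1\n" * 10_000

# Compress with Zstandard; level 3 is the default speed/ratio tradeoff.
compressed = zstd.ZstdCompressor(level=3).compress(data)

# Decompress and verify the round trip is lossless.
assert zstd.ZstdDecompressor().decompress(compressed) == data

# Fewer bytes over the wire and on disk means less networking and compute spend.
print(f"{len(data):,} bytes -> {len(compressed):,} bytes "
      f"({len(compressed) / len(data):.1%} of original)")
```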
Freeberg, you get the final word before we go on to Conspiracy Corner. You got nothing? All right, 00:48:21.200 |
let's go straight to Conspiracy Corner. Everybody put your tinfoil hats on. There were drones over 00:48:25.920 |
New Jersey. Thursday morning, the FAA banned drones (this is breaking news) in parts of New 00:48:32.080 |
Jersey until January 17th due to "special security reasons." They also warned that 00:48:39.760 |
the government may respond with deadly force against drones posing a threat. This thing is 00:48:45.200 |
getting crazier by the day. There have been thousands of reported drone sightings in New 00:48:48.880 |
Jersey and bordering states over the last week. Here's some examples. Play the clip, yada, yada. 00:48:55.200 |
Until Thursday morning, White House officials had been dismissive, saying repeatedly that 00:48:59.600 |
nothing significant is going on. One of the theories was that the drones were looking for 00:49:05.280 |
nuclear material, a.k.a. a dirty bomb or lost radioactive material or, the ultimate, a lost 00:49:13.520 |
nuclear bomb, like an actual nuclear warhead, from Ukraine. On Tuesday morning, 00:49:20.640 |
the mayor of Belleville, New Jersey suggested the drones could be looking for that radioactive 00:49:24.480 |
material that went missing on December 2nd. That was radioactive material, germanium-68. We'll get 00:49:30.640 |
details on that from Freeberg in a moment. And then, of course, conspiracy theories are going 00:49:36.240 |
wild. It's Iran, it's UFOs, and my favorite, Project Blue Beam, which is that NASA is using 00:49:42.320 |
holograms and other technology to create a new world order and religion and projecting Jesus 00:49:47.200 |
into the clouds. You can look that stuff up, or we'll have Alex Jones in Sax's seat one week. 00:49:53.360 |
Okay, Freeberg, you're the genius here. Tell us what's going on. 00:49:56.320 |
I think there are three likely explanations. The first is that the government's got some 00:50:01.440 |
activities that can't be disclosed, so we don't know and they can't talk about it. No one can 00:50:05.280 |
confirm or deny it. So, you know, that's kind of one that's pretty 00:50:09.760 |
possible. The second is, these are just individuals with a bunch of drones messing around, having fun, 00:50:14.960 |
trying to wreak havoc. Could be fun. Bunch of kids. I think we've all been there. The third 00:50:20.160 |
is what I would call a bit more nefarious, like, this is my conspiracy theory. 00:50:24.960 |
I think it could be considered a PSYOP. Okay, so right now, the US has a significant regulatory 00:50:35.840 |
burden on drone utilization in a commercial setting. And it's very hard to use drones, 00:50:41.680 |
you have to have line of sight to the operator, these things aren't supposed to go on their own, 00:50:44.720 |
there's all these rules and restrictions and so on and so forth. Meanwhile, you've got countries 00:50:49.520 |
like China rocketing ahead. So I don't know if you guys know the company Meituan in China, 00:50:53.440 |
you know, the food delivery company? Do you know that they do a large amount of drone 00:50:58.320 |
delivery of their food now? I was not aware of that. We're testing that here in the US. 00:51:02.240 |
The drone delivery business in China is already $30 billion a year. And they're 00:51:07.760 |
also launching a pretty significant fleet of what we would call kind of the eVTOLs or flying cars. 00:51:12.480 |
The expectation is that by 2030, there'll be 100,000 flying cars, 00:51:18.160 |
moving people around in China. And these are huge efficiency gains. In fact, with Meituan, 00:51:22.160 |
you can now order food while you're on the Great Wall at one of the ramparts. And the 00:51:26.080 |
drone will bring the food to you while you're walking the Great Wall as a tourist. And in the 00:51:31.200 |
US, the reason that these things haven't taken off, and the reason we don't have a large 00:51:35.040 |
drone industry, which is clearly emerging and is going to be a huge economic driver for China and 00:51:39.920 |
others around the world, is simply the regulatory restrictions. And so if you were going to try and 00:51:45.280 |
mess with the US's ability to move forward with the drone economy, you would probably try and 00:51:52.320 |
wreak some havoc, stoke some fear, and get people to say, "Hey, this doesn't seem cool. What's going 00:51:58.480 |
on? I don't like that there's all these drones in the sky. I'm freaking out." And try and get 00:52:02.000 |
the regulators to come in and say, "Hey, we're banning drones." And set up everyone, including 00:52:08.400 |
the people in the government to say, "We should take a beat. We should think a little bit before 00:52:12.880 |
we deregulate." We should... Who would do this? Who's the motivation? Uber Eats and Postmates? 00:52:17.600 |
No, no, no. People driving? No, no, no. It could be the other government. That's my psyop point. 00:52:22.960 |
This could be a psyop. Oh, you're saying this could be China doing this to try to slow down 00:52:26.000 |
our economy? Think about it. If you're going to pass a bill in Congress to make drones more 00:52:31.120 |
freely roamable in the skies, you're going to reference back this crazy story in New Jersey. 00:52:35.760 |
Everyone's freaking out about it. And you're going to say, "Hey, wait, what about all that stuff that 00:52:39.440 |
happened in New Jersey? Maybe this doesn't make as much... People are scared about it. We shouldn't 00:52:43.200 |
rush. We shouldn't rush. We shouldn't rush." That would be my psyop theory. That's my conspiracy 00:52:49.360 |
theory tinfoil hat. I don't often do them, but that's what I would think about. I think the 00:52:52.720 |
first two are probably more likely. Alex Jones would be proud. Aaron, why don't you jump the 00:52:56.560 |
fence and tell us your best conspiracy theory in this? Jump the fence. No, no, no. That was 00:53:01.520 |
all crazy pills that I just heard. Yeah, take some. It's entertaining. I think there are 10 00:53:07.280 |
better psyops you would do if you wanted to collapse our economy than going after 00:53:12.320 |
drone deliveries. What would they be? First of all, I think you'd have AI do a robot. I think 00:53:21.920 |
you'd have a robot AI thing that runs amok. That's a good idea. Rogue robot. Robocop. Yeah. 00:53:28.080 |
I think that would be way sooner than you worry about food delivery. No, you have 10 self-driving 00:53:31.840 |
cars hop the curb and crash into a storefront. Self-driving is on ice for two years. That would 00:53:38.480 |
be an example. Chamath, your thoughts? You got some conspiracies here. What do you think's going 00:53:43.440 |
on here? No, but I thought that the most credible thing was that they were looking for radioactive 00:53:52.320 |
material, that somehow some got lost. Why would they only look at night? Actually, it's interesting 00:53:59.440 |
you say that. There was somebody on Twitter, X, who claims to be an expert in this field, 00:54:05.120 |
and there's a startup actually that I just talked about. Which one? Kanekoa the Great? 00:54:08.880 |
I think it might've been Kanekoa the Great. Shout out to Kanekoa the Great in his mom's 00:54:11.600 |
basement eating pizza bites. Why do you hate Kanekoa? He's a nice guy. I don't hate Kanekoa. 00:54:15.520 |
I don't hate Kanekoa. Kanekoa tried to get me bad because he said I was trying to dox him. 00:54:20.160 |
You always can't quit Kanekoa. He's great. No, I just think it's hilarious that people are 00:54:24.000 |
retweeting Kanekoa the Great as if he's this great journalist, and he's in his mom's basement eating 00:54:30.480 |
Hot Pockets, or he's working for Putin. He's a really good- 00:54:33.680 |
I don't know. Do I want my news from Jake? It's J. Cal or Kanekoa. I don't know. 00:54:38.000 |
I don't know. Who do you take? I mean, your conspiracy theory is pretty good. 00:54:41.280 |
I subscribe to Kanekoa. Aaron Levie, you have a favorite? Who are you? You Autism Capital or 00:54:47.760 |
Kanekoa? Who are you? I like Autism Capital, too. I think he's really good. 00:54:53.280 |
Oh, you're into Geiger? Geiger Capital is good. Geiger Capital is good. 00:54:59.040 |
I think the nighttime thing would be, it would be at least typical of this government to do 00:55:02.640 |
a Streisand effect of just like, maybe if we cover it up, nobody will see, and then 00:55:07.920 |
obviously it's the biggest thing, so. Yeah, and they have blinking lights on them. 00:55:13.520 |
But, we really looked it up, and there is a startup 00:55:20.560 |
that makes drones to do this specific task, to look for dirty bombs in ports, and ports 00:55:25.840 |
obviously look for these things. It's a known threat. This is not rocket science. 00:55:30.080 |
I'm sorry, and why only at night, to Freeberg's question? 00:55:33.040 |
Oh, the theory was that they can read the signatures better, that at night you can 00:55:37.680 |
read the signature, which doesn't make sense to me. Science guy, you want to come in here? 00:55:41.600 |
I don't see how at nighttime you'd get a better read on radioactive material. 00:55:49.040 |
That made no sense. That sounds like a crazy thing. But anyway, we're living in crazy times. 00:55:54.960 |
By the way, that was my first conspiracy theory in 208 episodes of the All in Pod, so. 00:55:58.960 |
Very nice. I actually can participate in Conspiracy Corner now. 00:56:06.000 |
Okay, Jake, I'll read us some more news. Go to Google News and read us some more news, 00:56:14.080 |
No, you do it. Go ahead. No, no, you get the docket in front of you. 00:56:16.400 |
Okay, so today the Dow Jones Industrial Average is up 23,000 to 44,000. 00:56:21.680 |
I try to highlight you guys, make you look good. 00:56:25.760 |
I'm going skiing. That's enough. I'm going skiing. 00:56:28.080 |
I thought it was cool how he read the entire congressional bill earlier. 00:56:36.160 |
I'm not filibustering. All right, listen. Here, let's do another story. Let's see if 00:56:39.600 |
you can contribute something. OpenAI update. That is slander. Your contribution was amazing. 00:56:45.840 |
I was actually using ChatGPT to go into the founding fathers' papers. 00:56:51.440 |
I was reading the Federalist Papers with Gemini. 00:57:01.680 |
I was comparing the lyrics from Hamilton the Musical. 00:57:10.800 |
Hey, Nick, can you send J. Cal another Yahoo News article? Let's get going. 00:57:14.960 |
I can't wait. I can't wait to see you in the morning. Let's go. 00:57:28.400 |
No, do it, do it, because I want to talk about OpenAI. 00:57:34.160 |
Give him his red meat. Give Chamath his red meat. 00:57:36.080 |
All right, here's an update on closed AI, flipping, 00:57:39.360 |
and Sam Altman, supervillain Sam Altman, securing the bag. 00:57:43.840 |
Meta wrote a letter to California's Attorney General 00:57:47.840 |
on OpenAI's for-profit conversion. Two months ago, 00:57:52.960 |
OpenAI announced a $6.6 billion funding round, $157 billion valuation. 00:57:59.360 |
They're projecting $3.7 billion in revenue this year, pretty great revenue growth. 00:58:07.440 |
OpenAI must convert to a for-profit within the next two years as a condition of that round. 00:58:15.600 |
And right before they announced this round, you remember CTO Mira 00:58:18.960 |
Murati and two other top researchers resigned. Many people saw this as a protest. 00:58:24.640 |
Elon, who put the first $50 million in and co-founded OpenAI, is currently suing the company 00:58:29.440 |
and seeking a court order that would stop the for-profit conversion. 00:58:37.680 |
Elon and Zuck are in some weird pairing up. They're not in the ring, not in the octagon, no. 00:58:46.320 |
Meta asked California's Attorney General to stop OpenAI's for-profit conversion. 00:58:51.360 |
We've got them now not in the octagon, but they're in the political arena. 00:59:00.880 |
If you put them all together, it paints a really interesting story. 00:59:04.320 |
So you have Elon filing an injunction, which basically says you should not allow this 00:59:10.400 |
conversion to happen until we sort out all of these details, because if you allow it to convert 00:59:16.320 |
and then I win, it'll be very messy to go backwards. 00:59:19.920 |
I think that that's a pretty credible legal argument. 00:59:22.320 |
Then you have Zuck basically say, "Hey, Elon's right. This company should not convert." 00:59:30.080 |
But the more interesting thing that really got me thinking about this was a chart that I saw. 00:59:37.360 |
What this shows is just what's happened in the last year. 00:59:43.680 |
What you notice is the market share of OpenAI has changed pretty meaningfully 00:59:53.360 |
What you see is Anthropic doubled, Meta roughly staying the same, Google picking up steam. 00:59:59.440 |
It started to make me think this is very similar to a chart that I would have looked at in 2006 01:00:09.280 |
and 2007 when we were building Facebook, because we had this huge competitor in MySpace that was 01:00:15.040 |
the de facto winner and we were this upstart. 01:00:18.640 |
It made me think, "Is there a replay of this same thing all over again?" 01:00:23.840 |
Where you have this incumbent that pioneers an idea and they start with 100% share. 01:00:31.440 |
Then all of these upstarts come around from nowhere. 01:00:35.040 |
Then I thought, "Well, what is better today if you are a company that's just starting out?" 01:00:43.120 |
I think that there's a handful that are worth talking about. 01:00:45.600 |
The first is when you look at what xAI has done with respect to NVIDIA GPUs, 01:00:52.960 |
the fact that they were able to get 100,000 to work in one contiguous system and are now 01:00:59.920 |
rapidly scaling up to basically a million over the next year. 01:01:02.880 |
I think what it does, Jason, it puts you to the front of the line for all hardware. 01:01:07.360 |
Now all of a sudden, if you were third or fourth in line, xAI is now first and it pushes everyone else back. 01:01:13.440 |
In doing that, you either have to buy it yourself or work with your partner. 01:01:18.000 |
I think for the folks like Meta, that translates and explains why they're spending so much. 01:01:25.600 |
If you can't spend with my competitors, I'm just going to prefer my competitor to you. 01:01:33.280 |
In a capital war, I think the big companies like Google, Amazon, Microsoft, Meta, and 01:01:40.480 |
brands like Elon will always be able to attract effectively infinite capital. 01:01:45.760 |
Their cost of capital goes to zero, which means they'll be able to win this hardware war. 01:01:52.160 |
Then the second thing is what happens on the actual data and experience side of the training. 01:01:57.840 |
If you listen to Ilya Sutskever, if you listen to Andrej Karpathy, what they effectively 01:02:02.800 |
are saying is there's this terminal asymptote that we're seeing right now in model quality. 01:02:09.440 |
A lot of these folks are now experimenting on the things around the model, the user experience, 01:02:16.640 |
Can I cut and paste these things from here and there? 01:02:18.640 |
Because what it says is the data is kind of static and brittle, but it's actually not. 01:02:24.960 |
That's what we said before, because you have this corpus of data on X that's pretty unique. 01:02:29.360 |
I suppose if Elon fed in all this kinetic data that he controls through Tesla, that's another example. 01:02:36.720 |
Does that all of a sudden create this additive pool of information? 01:02:43.920 |
Then the third thing is when you look at that chart, what that chart says is, "Hold on a second." 01:02:50.480 |
What I can tell you just through the 8090 lens is we are completely promiscuous in how we use these models. 01:02:57.760 |
The reason is because these models offer different cost quality trade-offs at different points in time. 01:03:06.560 |
What we are seeing is a world where instead of having two or three models you rely on, 01:03:12.000 |
you're going to rely on 30 or 40 or 50, and you're going to trade them off, and you're 01:03:15.520 |
going to use effectively like an LLM router to- 01:03:23.440 |
Then there's an intelligence above that that's constantly tasking and figuring out prompt optimization. 01:03:28.800 |
It's this thing where we were very reliant on OpenAI. 01:03:40.400 |
I just see this world where it's all getting commoditized quite quickly. 01:03:49.840 |
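A minimal sketch of the routing idea Chamath is describing: pick among many models per task based on cost-quality trade-offs, rather than relying on one provider. The model names, prices, and quality scores below are hypothetical, invented only to illustrate the mechanism.

```python
from dataclasses import dataclass

# Hypothetical catalog: none of these names or numbers are real vendor pricing.
@dataclass
class Model:
    name: str
    usd_per_1k_tokens: float
    quality: float  # assumed benchmark score, 0..1

CATALOG = [
    Model("cheap-small", 0.0002, 0.55),
    Model("mid-general", 0.0020, 0.75),
    Model("frontier-large", 0.0150, 0.92),
]

def route(prompt: str, min_quality: float) -> Model:
    """Pick the cheapest model that clears the quality bar for this task."""
    eligible = [m for m in CATALOG if m.quality >= min_quality]
    return min(eligible, key=lambda m: m.usd_per_1k_tokens)

# An easy task routes to the cheap model; a hard task to the frontier model.
print(route("summarize this email", min_quality=0.5).name)   # cheap-small
print(route("prove this theorem", min_quality=0.9).name)     # frontier-large
```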
I know you're very promiscuous when it comes to LLMs as well. 01:03:52.240 |
We have a very similar model to what Chamath said, which is we're agnostic, so we work with multiple model providers. 01:03:59.360 |
I think a friend deep in AI land a couple of years ago, right before ChatGPT, said, 01:04:07.200 |
"There's no secrets in AI," and I didn't totally understand at the time. 01:04:10.800 |
It hadn't registered what that meant, but very quickly it became obvious, which is the 01:04:14.880 |
research breakthroughs propagate insanely quickly across the AI community. 01:04:21.600 |
Back to this moats framework, if you just think about it, if the research effectively becomes 01:04:28.080 |
open at some point in time quickly enough because either the researchers move or people 01:04:31.920 |
publish it or whatnot, then it really is a compute game and then maybe a data access 01:04:36.160 |
game, and that means that there's four or five at scale players that can fund this. 01:04:40.080 |
I think as we've seen in other areas where it's an infrastructure play, with enough competition, 01:04:50.080 |
you have the underlying service eventually trend toward the cost of the infrastructure. 01:04:53.440 |
What we should expect is that the price of a token in AI land basically will be whatever 01:05:01.120 |
the price of running the computers are, and maybe with a plus 10%, 20% margin. 01:05:07.040 |
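A back-of-envelope version of Aaron's claim that token prices trend toward compute cost plus a thin margin. Both inputs, the GPU-hour cost and the serving throughput, are assumed numbers for illustration.

```python
# Assumed inputs, not real figures from any provider.
gpu_hour_cost = 2.50              # assumed all-in $/GPU-hour
tokens_per_gpu_hour = 2_000_000   # assumed serving throughput

cost_per_million = gpu_hour_cost / tokens_per_gpu_hour * 1_000_000
for margin in (0.10, 0.20):       # the "plus 10%, 20% margin" from the discussion
    print(f"margin {margin:.0%}: ${cost_per_million * (1 + margin):.3f} per million tokens")
```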
Did you see the same thing happen with storage? 01:05:08.800 |
I remember in the early days with Box and Dropbox and YouTube, you all had this major cost issue. 01:05:21.760 |
What happened was the price of the underlying storage has gone down by hundreds of times 01:05:29.520 |
since we started the company, and then all our value is in the software layer on top of it. 01:05:34.960 |
We've benefited by this incredible, ruthless competition between Western Digital, Seagate, 01:05:40.480 |
other players that are just trying to pack more storage density into these drives. 01:05:48.080 |
Every couple of years, they have a new breakthrough. 01:05:49.600 |
We're heading toward maybe a 50-terabyte hard drive. 01:05:53.680 |
When we started the company, they were 80 gigabytes. 01:05:56.480 |
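The multiple implied by those two numbers Aaron just gave:

```python
# 80 GB drives then, ~50 TB drives ahead (decimal units).
then_gb = 80
soon_gb = 50 * 1000
print(f"density multiple: {soon_gb / then_gb:.0f}x")  # 625x
```

A roughly 625x capacity jump, which is consistent with his point that per-GB storage prices fell by hundreds of times over the same period.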
How much of your time in the early days was spent on dealing with this infrastructure 01:06:01.680 |
issue, and then how much of your time and your leadership team's time is spent on this today? 01:06:06.420 |
Back in the day, if we had 10 people in engineering, 80% of them were doing pure infrastructure work. 01:06:14.240 |
If we have 1,000 people, it would be inverted in terms of the ratio. 01:06:18.960 |
You get more leverage, both as you get the advancements in the technology, but then also as you scale. 01:06:26.320 |
But all of this is to say, you should basically anticipate a world where -- and I think Zuck 01:06:31.200 |
is this interesting counterbalance on all of this because of open source -- if at any 01:06:35.200 |
moment you know that Zuck will basically provide an open-source model that is at best-in-class 01:06:41.520 |
benchmarks and at the frontier, then there is a limit on how much you can charge for 01:06:47.040 |
the tokens of your hosted model because anybody will then be able to go host the open model 01:06:51.200 |
and be able to provide infrastructure around it. 01:06:53.440 |
So if you always have that counterbalance, and the tokens eventually kind of look the 01:06:57.680 |
same, the output tokens kind of look the same... 01:06:59.920 |
Isn't it also the truth that major enterprises -- Fortune 500s, 200s -- 20 years ago weren't 01:07:08.960 |
interested in open source, and now that's kind of their default? 01:07:11.520 |
They want to buy into open source because they don't want to be locked into a vendor? 01:07:15.120 |
It's actually not even necessarily the case that the customer has to pick the open-source model. 01:07:19.200 |
They might buy it through an abstraction layer that is letting them get the benefit 01:07:22.960 |
of open source, but still buy through a proprietary or commercial... 01:07:29.520 |
I believe open source causes pricing to always be extremely low. 01:07:34.400 |
Right, but in this case, do you think open source is going to ultimately win the day here? 01:07:42.000 |
No, because only Meta has the scale to be able to provide... 01:07:46.640 |
I think what Aaron is saying here -- let me maybe try to frame it -- I think what he's 01:07:49.760 |
saying is there'll be open-source models, there'll be closed-source models, but the 01:07:55.520 |
price that Aaron or me or anybody else pays these model-makers will effectively go to zero. 01:08:02.400 |
And it'll go to the cost of the compute, to be clear, with a little bit of margin for them. 01:08:10.080 |
Now, it's important to step back and say, "I still think you could probably have the 01:08:13.280 |
entirety of the model providers make 10 times more revenue than they do today because we're 01:08:17.440 |
just literally in the first percent of the total TAM." 01:08:20.640 |
So, it'd be a mistake to think that that has some kind of downward pressure in terms of 01:08:25.840 |
the long-term economics of these businesses, especially because I think OpenAI's revenue 01:08:29.920 |
stream is increasingly looking like a SaaS revenue business as opposed to just the API business. 01:08:36.160 |
So, none of this is to provide any sort of color on what would you bet on today. 01:08:42.000 |
I agree with Chamath that you're going to have maybe not 30 providers, but let's say 01:08:44.800 |
you at least have five to 10 good choices all competing heavily for the next breakthrough. 01:08:48.960 |
Like literally this morning, Google had a breakthrough in sort of this reasoning-oriented model. 01:08:57.840 |
And so, what's incredibly kind of great is it's sort of the best time ever to be building 01:09:04.240 |
software, assuming that you have a play in the market that lets you remain differentiated. 01:09:10.320 |
And the key there is just do enough on top of the AI model that there's enough value there. 01:09:14.560 |
You know how much the world spends on software and software-related things every year? 01:09:20.800 |
So, there's like, call it like a trillion and a half of software licenses, a trillion 01:09:27.920 |
and a half of consulting, and a trillion and a half of IT folks inside of companies, plus 01:09:32.800 |
or minus a little bit more, you get about $5 trillion. 01:09:38.880 |
I'm pretty sure that the market here shrinks by an order of magnitude. 01:09:44.800 |
And instead of fighting over $5 trillion, I think we'll be fighting over $500 billion. 01:09:53.520 |
I mean, I don't know if you want to make the case more, but... 01:09:57.680 |
Well, the only reason is that I think that as we de-lever the software development process 01:10:04.080 |
from humans, I think the unit cost of creating code effectively becomes so cheap that it 01:10:08.800 |
is going to be very hard to differentially price these products the way that they are. 01:10:12.800 |
So, an example would be that, let's say you use, I don't know, pick your favorite piece of software. 01:10:28.820 |
Let's pick on Excel because maybe that's like... 01:10:36.640 |
It's, what is the marginal cost of creating the Excel equivalent that is good enough that people would switch? 01:10:44.080 |
And you can see that because it's what it costs Google to make Sheets, but that's humans. 01:10:50.800 |
So, the real question is, if you have a legion of bots that works 24/7 incrementally and 01:10:58.560 |
increasingly more accurately every day, the question is, what is the marginal cost? 01:11:04.240 |
And I think the marginal cost of that is going to be very cheap. 01:11:07.120 |
And when you do that, it's very difficult to price it anywhere near the same. 01:11:12.480 |
And the reason is that other companies will then replicate it and say, "Hold on, if Excel 01:11:18.080 |
wants to charge $100, I'll charge 50 and I'll take a lower margin." 01:11:24.740 |
So, I just don't agree with the TAM compression because I think there's another kind of counter 01:11:33.200 |
event that's happening that AI is really going after services. 01:11:36.880 |
And so, that then conversely expands the TAM of software where IT budgets weren't usually spent. 01:11:47.760 |
And it hasn't exactly played out as you're saying. 01:11:50.000 |
So, Zoho is this really interesting business. 01:11:54.240 |
It's probably a couple billion in revenue at this point. 01:11:55.840 |
And it's basically a suite of extremely low-cost, affordable software products by category. 01:12:04.480 |
But that's not been the reason people don't switch, though. 01:12:15.840 |
I do agree with you, Aaron, by the way, that when you bring that whole offline services 01:12:20.880 |
category online and you automate them with AI, I agree with you that that TAM could be 01:12:26.000 |
All I'm saying is the traditional software TAM today, what people spend $5.1 trillion 01:12:30.960 |
on, I think people will spend $500 billion on. 01:12:35.440 |
There may be other things that people spend money on that are wrapped in AI. 01:12:42.240 |
I guess the counter, and maybe you'd look at an ERP system or a CRM system or something like that. 01:12:55.200 |
The last thing you want to touch is the system that is powering your supply chain. 01:13:00.000 |
The companies I talk to, they're consistently like, "Rip it out. 01:13:05.600 |
And the reason is because what they've realized is they'll spend $50 to $100 million a year on it. 01:13:11.520 |
And they're like, "Just give me these five features as workflows. 01:13:14.640 |
Give me a simple CRUD database and just get out of the way." 01:13:17.520 |
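A minimal sketch of that "simple CRUD database" ask: a handful of create/read/update/delete workflows over a plain database, with everything else stripped away. The table and fields below are hypothetical stand-ins for whatever those five workflows actually need.

```python
import sqlite3

# Hypothetical schema: a stand-in for the customer's real domain.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, status TEXT)")

def create(customer):            # C
    cur = db.execute("INSERT INTO orders (customer, status) VALUES (?, 'open')", (customer,))
    return cur.lastrowid

def read(order_id):              # R
    return db.execute("SELECT * FROM orders WHERE id = ?", (order_id,)).fetchone()

def update(order_id, status):    # U
    db.execute("UPDATE orders SET status = ? WHERE id = ?", (status, order_id))

def delete(order_id):            # D
    db.execute("DELETE FROM orders WHERE id = ?", (order_id,))

oid = create("Acme Aerospace")
update(oid, "shipped")
print(read(oid))  # (1, 'Acme Aerospace', 'shipped')
```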
And it's like the trade-off for that makes a lot of sense because, look, let's face it, 01:13:24.000 |
when you have to build one piece of software that has to sell to 50,000 companies, the 01:13:28.560 |
reality is that that piece of software is trying to do everything and then some, and 01:13:33.840 |
it's trying to solve two or three use cases, plus around five or six common use cases that... 01:13:43.120 |
You know, it's really interesting because, Aaron, you kind of alluded to there's a TAM 01:13:48.080 |
expansion moment there, and I'm seeing this on the front lines. 01:13:51.200 |
We run a program, Founder University, I've talked about it here before, where we see 01:13:54.800 |
people pitching us their year zero startups, two or three-person teams, and what they're 01:13:59.440 |
doing is they're not going after existing legacy software. 01:14:04.080 |
They're looking at a position or a job somewhere. 01:14:10.080 |
We're going to make the number one accountant in the world that's an AI agent. 01:14:15.520 |
We're going to make a podcast producer with podcast AI. 01:14:18.480 |
We're going to make a virtual assistant with Athena. 01:14:22.720 |
That's a whole nother category where you study a person's behavior as they work, a social 01:14:28.640 |
media manager, and what they do, and then you replicate it with AI, and that's something new. 01:14:35.680 |
So there could be two things occurring here, Chamath and Aaron, at the same time, which 01:14:40.000 |
is a deflation in legacy systems, and they'll be replaced. 01:14:44.720 |
And then additionally, human capital and jobs that are easy enough for AI to do as an agent 01:14:51.600 |
will also expand the TAM, two things at the same time. 01:14:58.160 |
All I'm debating is when you have to price something, you have to look at the total cost 01:15:04.480 |
of what it took to make it, and then you want to try to build in a reasonable amount of 01:15:09.120 |
margin and some reasonable expansion, and you discount it back, and this is what you charge. 01:15:13.280 |
Even though you may not think you're doing that when you implicitly price something that 01:15:17.520 |
way, that is what's happening underneath the hood to the unemotional buyer of that good. 01:15:22.960 |
And all I'm saying is, if what Aaron says before, which I agree with is true, which 01:15:28.880 |
is the cost of using a model to get to an output effectively goes to zero, somewhat 01:15:35.440 |
what I've been saying before, the marginal cost of compute goes to zero, the marginal cost of software goes to zero. 01:15:42.320 |
The real question is, what does it take to make a good in the digital world? 01:15:52.320 |
You're making these choices as to what SaaS products and how you're going to solve problems. 01:15:56.640 |
Are you looking at it saying, "I'm going to hire developers who can work 10x because 01:16:00.720 |
of all these new tools, and I'm going to build my own internal systems," or are you looking 01:16:04.960 |
and saying, "I'm going to buy off-the-shelf SaaS products"? 01:16:08.480 |
What are you doing when you make your own decisions every day, David Freeberg? 01:16:12.160 |
>> Well, I try and encourage the teams to build stuff that better meets our needs, and 01:16:20.160 |
then it can actually be a better solution, and it can be built in-house. 01:16:25.200 |
I think I mentioned this in the last episode, but we did a hackathon where we brought in 01:16:29.440 |
people who had never done it before to learn how to use Cursor and ChatGPT to build software, 01:16:35.840 |
to try and create this as a capability for people broadly in the organization, and there were some great results. 01:16:44.400 |
And this is kind of like, I'd say, early generation, but as we get to further later generation 01:16:48.480 |
capabilities, you could see instructing a ChatGPT-like interface, "Hey, I'd 01:16:55.360 |
love to do the following things with a piece of software." 01:17:03.120 |
It does QA for you, and ultimately, it puts it into production for you, which is the biggest bottleneck. 01:17:09.040 |
You still need engineers that can load stuff into production and do QA, but if all of that 01:17:12.440 |
gets automated as well, now any user in a company can actually stand up software themselves. 01:17:24.120 |
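A sketch of the loop Freeberg is describing: generate code from a spec, run QA, and only then promote to production. Every function here is a hypothetical placeholder stand-in, not any vendor's actual API.

```python
# Placeholder stand-ins for the three stages; real systems would call a
# code-generation model, a generated test suite, and a deployment pipeline.
def llm_generate(spec: str, feedback: str = "") -> str:
    return f"# generated code for: {spec}\n# revised with: {feedback!r}\n"

def run_tests(code: str) -> tuple[bool, str]:
    return True, "all tests passed"   # placeholder QA verdict

def deploy(code: str) -> None:
    print("deployed:", code.splitlines()[0])  # the production hand-off

def build_from_spec(spec: str, max_attempts: int = 3) -> None:
    feedback = ""
    for _ in range(max_attempts):
        code = llm_generate(spec, feedback)
        ok, report = run_tests(code)
        if ok:
            deploy(code)
            return
        feedback = report  # feed test failures back into the next attempt
    raise RuntimeError("no passing build; needs human review")

build_from_spec("track sample inventory with audit logging")
```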
So Chamath, just to clarify, there's two different ways to approach lowering the cost of software. 01:17:29.160 |
One is that it just creates more competitors in each of these categories, which then lowers 01:17:33.080 |
my price because now there's some downward pressure. 01:17:35.320 |
Dave's bringing up a different example, which is I'm going to build my own software at effectively zero cost. 01:17:42.080 |
The challenge on this, especially the second one, is most organizations don't want to be 01:17:46.600 |
in the business of having to think about building their own software. 01:17:54.960 |
Your business would be very unique relative to the broad economy, which is I want a place to put my files. 01:18:01.280 |
I want a place to just have my HR get managed. 01:18:03.520 |
I think the downward pricing pressure due to software cost lowering makes total sense. 01:18:08.440 |
I don't think it's a 10x factor, but I think that we've always had a long tail of applications that people built themselves. 01:18:18.360 |
You did it in Microsoft Access, now you do it in Retool. 01:18:21.440 |
So the next era of that will be obviously AI built and there'll be 10 times the amount 01:18:25.520 |
of that software, but it's not obvious to me why that would go after the core systems 01:18:30.880 |
of running a business because just most companies are not looking to reinvent the wheel of that. 01:18:34.800 |
We're working with an aerospace partner, won't say who it is, and we sat there and they walked 01:18:40.160 |
us through what they deal with to make the things that they're making. 01:18:45.320 |
It's convoluted and this is not a software problem, it's that there is no piece of software 01:18:50.600 |
that understands how they want to run their company. 01:18:53.760 |
So instead, they have to morph their org chart to the tools that are available. 01:18:58.720 |
So I think what Freeberg is saying is some version of that as well. 01:19:02.400 |
There were probably people there using the tools that were available and as a result, 01:19:06.760 |
at some point, some HR person said, "We need to hire this other person and this other person," 01:19:11.440 |
and instead, what it allows companies to do is just completely reimagine. 01:19:15.680 |
How do you want your company to work and what is the business process you actually need 01:19:20.320 |
to implement and then let's just get that built. 01:19:25.680 |
I don't know if you guys saw this, but Satya Nadella had this clip that's gone pretty viral 01:19:30.720 |
in the last couple of days where he effectively said the same thing and what he's effectively 01:19:34.200 |
saying is all these big software systems are business rules wrapping a database plus an interface. 01:19:45.960 |
This is what Alex Karp points to as sharp knives, steak dinners, and basketball game tickets. 01:19:56.240 |
Yeah, he likes to win based on product, not on sales effort. 01:20:04.920 |
We will put, if you want to actually compete, compete on your product. 01:20:09.360 |
And what's very special, and yes, do I enjoy humiliating people who have better steak dinners 01:20:19.840 |
It makes me very happy and it makes our clients happy. 01:20:22.960 |
But it really points to the heart of how the software industrial complex, that's what I call it. 01:20:28.120 |
How this $5.1 trillion, this software industrial complex, how has it evolved? 01:20:33.560 |
I think it's people that build good point products, but when the market, meaning the 01:20:38.440 |
shareholders and the public investors, demand growth, you have this natural expansion. 01:20:49.520 |
And then in order to sell that, they inflate the set of incentives that they give to the buyer. 01:20:56.880 |
The CIO inside of these large organizations, they control budgets. 01:21:01.840 |
They're equivalent to like state level budgets in some cases. 01:21:05.560 |
You know, what do you think the CIO of a top five bank is spending? 01:21:11.960 |
They are getting wined and dined to a way that you could not even imagine. 01:21:16.640 |
I don't even think the CEO probably understands. 01:21:22.560 |
Hey, don't like totally ruin our whole game here. 01:21:24.360 |
Well, I mean, it's not, it's not, I'm just pointing it out. 01:21:30.440 |
But here's another part of the game in the field. 01:21:32.400 |
If these tools make your team 10, 20, 6%, whatever you can quantify, more effective 01:21:39.440 |
at their jobs, then it's a de minimis amount for most organizations who have created very valuable businesses. 01:21:46.720 |
I see this where, I was mentioning some companies earlier, and their SaaS pricing is broken 01:21:52.400 |
because the number of employees at a company has been going down because people are getting more efficient. 01:21:57.360 |
So now they're looking at saying, well, what value did we provide with this podcast, you 01:22:01.840 |
know, producer in a box and whatever it does? 01:22:04.560 |
They say they were starting with like $99 a month or $49 a seat. 01:22:08.400 |
I said, listen, just charge a minimum of $500 to get the software. 01:22:15.640 |
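The arithmetic behind that advice: as AI shrinks headcount, per-seat revenue collapses while a flat platform minimum holds. The seat counts and prices below are illustrative assumptions, not any company's real pricing.

```python
# Illustrative only: compare per-seat billing with a $500 platform floor.
price_per_seat = 49
flat_minimum = 500

for seats in (20, 10, 3):
    per_seat_rev = seats * price_per_seat
    billed = max(per_seat_rev, flat_minimum)
    print(f"{seats:>2} seats: per-seat ${per_seat_rev:>4}, with floor -> ${billed}")
```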
So there's a value being created here that is so great that I don't think everybody's 01:22:22.120 |
going to roll their own like Freeberg, because they're going to be like, you know what? It's worth it. 01:22:31.560 |
And in five years, just like we're all using chat-like interfaces 01:22:37.200 |
more frequently, you go to your chat-like interface and eventually you realize you can 01:22:42.120 |
ask it to build some tool for you that does something. 01:22:45.480 |
And it renders the tool and it makes it possible and it stands it up in production. 01:22:50.080 |
And suddenly you're using it at your company and then you ask it to do another tool and 01:22:54.080 |
it does it exactly to spec and you define the UX you want, you define what you want 01:22:58.400 |
it to do and it works and it interoperates, really clearly and really cleanly. 01:23:05.200 |
And there's a big aspect of this where you can start to build all of the software infrastructure you need. 01:23:12.480 |
But I think, you and Aaron mentioned this though, the extreme difficulty is not writing the code. 01:23:19.800 |
It's like, I think it's less than three or 4% of the work. 01:23:23.920 |
The 96% of the work is how you actually integrate it in the backend and how do you provision it? 01:23:30.240 |
How do you have controls and how do you do security? 01:23:32.200 |
Because if those things fail and those are implemented in a bot in a highly regulated 01:23:36.600 |
market, as an example, you may not actually be allowed to operate. 01:23:42.280 |
And let's say that you're in a bank and you tell the AI, figure out all the security and 01:23:47.520 |
permissions and authority rules that are necessary for me to operate as a bank, and it can actually do it. 01:23:53.200 |
It sounds far fetched today, but there's no reason that in seven years that is not the 01:23:56.960 |
standard du jour, that I don't have the ability to say, go look at all the software that's out there. 01:24:01.720 |
and help me build a tool that meets compliance standards, that meets all of my requirements. 01:24:06.160 |
And you can actually instruct the AI to operate like a large number of software engineers 01:24:12.520 |
The practical issue where the rubber meets the road there is when there is a penetration 01:24:18.000 |
and a regulator who's a human, because it will not have been a bot, comes and knocks 01:24:21.920 |
on your door and you're like, well, here's this immutable log of the things that I did. 01:24:26.840 |
And they're like, I don't want an immutable log. 01:24:30.600 |
Show me the unit test that you created and signed off on. 01:24:33.680 |
So there is a critical human in the loop problem on that other 95%, which is where I agree 01:24:39.640 |
with Aaron in, like, stage one; like, we're in stage zero. No, no, no, no. 01:24:47.720 |
I think you'll need governments to take an entirely different risk posture for highly regulated industries. 01:24:52.360 |
I don't see that happening any time in the near future. 01:24:55.360 |
In life sciences, if you want anything in a clinical system, I don't need to tell you 01:24:59.120 |
guys this, but you have to, you have to do a QA test on every single change that ever 01:25:03.080 |
happens and be able to prove that you tested every single thing. 01:25:06.440 |
So the idea of a probabilistic AI system generating the kind of code for you, you know, 01:25:11.600 |
in a clinical trial workflow, like I just think, yeah, it's going to take a lot longer. 01:25:17.120 |
Yeah, I mean, we got another decade of evolution here to make these things sustainable and robust. 01:25:23.640 |
I think it'll happen faster than everyone thinks. 01:25:26.600 |
And I think actually there's no reason why that wouldn't 01:25:29.520 |
be a good thing if that did happen, to be clear. 01:25:31.360 |
I think though, back to, you know, Jason's earlier point, like then the counterbalance 01:25:35.920 |
on the TAM compression is just that now the sort of thing that we think of as software 01:25:43.280 |
expands. So if you're selling, yeah, but like that 01:25:47.720 |
means that if you're selling, let's say $50,000 of security software, you know, maybe that 01:25:51.720 |
goes down a little bit to, you know, $25,000, but you might sell much more of it. 01:26:00.840 |
My only comment is the software industrial complex today has to shrink because the stranglehold 01:26:06.240 |
that it has on how companies run is incredibly high for an experience that's incredibly poor. 01:26:15.040 |
Nobody, nobody raises their hand and says, gosh, this piece of enterprise software that 01:26:18.680 |
I use in my day-to-day life is as good as Instagram or TikTok. 01:26:22.800 |
Well, I mean, except Box. I was reading the reviews earlier today and I was on G2 01:26:28.160 |
or one of those sites, and Box just had a tremendous rating, so. 01:26:38.640 |
Well, we try to make it very simple for people, so. 01:26:44.320 |
I'm putting a J trade in here because I may have to J trade this and it sounds like a buy. 01:26:54.280 |
My last J trade is I'm just, I'm loading up on MSTR and Bitcoin. 01:26:59.160 |
I'm doing a, I'm shorting Bitcoin and I'm buying MSTR. 01:27:04.520 |
I'm focused on investing in a hundred startups per year. 01:27:20.560 |
Sorry, before we wrap on AI, can I just tell you guys, are you @levie or @aaron? 01:27:20.560 |
Have you guys talked about how Google has like gotten some religion? 01:27:42.240 |
And Sergey's going to work every day, by the way. 01:27:48.160 |
By the way, did you guys see Genesis yesterday? 01:27:50.160 |
Let me just put this up here, and then I'll tee it up to you, Freeberg. 01:27:58.800 |
And they have one with Deep Research and their 2.0 model. 01:28:01.440 |
I did a side-by-side test, same prompts, with our friend Sunny Madra on This Week in Startups. 01:28:17.400 |
And I think he conceded that with their free model and their $20 model, if we just gave it a blind test, 01:28:22.680 |
I think Google now has reached parity, and they have an app out. 01:28:27.720 |
Because people were counting them out, and here we are. 01:28:30.600 |
I think not only have they caught up, I think they have exceeded. 01:28:35.040 |
They were late, and the compounding effects are playing out, and it's only going to continue 01:28:38.720 |
And I will say the data repository at Google, the engine that they have, the infrastructure, 01:28:42.320 |
the team, everything down to components is advantaged. 01:28:45.160 |
And so everyone's been kind of counting them out, but it's clear they're in it to win it. 01:28:50.000 |
Freeberg, 70% of the usage of OpenAI is consumer. 01:28:55.240 |
What does that mean for them as Gemini and Google get momentum? 01:28:59.440 |
I think you'll end up having a Gemini that has ads, eventually. 01:29:05.200 |
And you could pay and have no ads, or you could not pay and have ads. 01:29:08.640 |
No, meaning do you provide enough value and of enough quality where folks don't need to switch? 01:29:17.000 |
I'm asking you whether it impacts OpenAI, the quality of Gemini as it increases, or 01:29:21.280 |
do you think that these usage cohorts are roughly set? 01:29:26.800 |
My experience is I don't feel like I've got embedded data on OpenAI or on Gemini that locks me in. 01:29:32.920 |
Just like with Google, I'm going to go to the best interface, the best engine. 01:29:36.600 |
So I do think that there's fungibility here and people will move over as they realize the difference. 01:29:42.140 |
But I will say that what we're seeing now with the data advantage at Google, so the 01:29:46.400 |
internet, all these LLMs are language models trained on text from the internet, right? 01:29:50.800 |
There's call it 50 billion words on the internet. 01:29:53.840 |
And I think if you estimate the data repository in these training sets, it's like probably 01:29:58.720 |
a couple terabytes, one to five terabytes or something like that. 01:30:01.760 |
But if you look at the video data that's out there, there's hundreds of billions of hours 01:30:07.080 |
or a hundred billion plus hours of video data, a large amount of that is sitting on YouTube. 01:30:12.040 |
And by some estimates, there's a thousand exabytes of video data on the internet. 01:30:15.660 |
So about a billion times more video data than there is word or text data. 01:30:21.480 |
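Checking that ratio with Freeberg's own rough figures (both inputs are his estimates, not measured values):

```python
# ~1,000 exabytes of video vs. one to five terabytes of training text.
video_bytes = 1_000 * 10**18
for text_tb in (1, 5):
    ratio = video_bytes / (text_tb * 10**12)
    print(f"{text_tb} TB of text -> video is {ratio:.0e}x larger")
# Prints 1e+09 and 2e+08: on the order of a billion times more video data.
```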
And I think we just saw that play out with the Veo model that launched yesterday. 01:30:27.100 |
Google has all of this YouTube data, you know, whether or not they're using it to train, 01:30:31.120 |
I don't think they're allowed to, is what I heard from the insiders. 01:30:33.600 |
They have to redo their terms of service to get explicit permission to use it. 01:30:41.560 |
And it's basically rendering physics or it looks like it's rendering physics. 01:30:44.080 |
Now, Genesis came out yesterday and it's open source. 01:30:48.000 |
So you can actually go play with this Genesis model, which similarly renders the actual physics. 01:30:56.280 |
So rather than rendering a two-dimensional set of pixels to look at a video, with 01:31:02.040 |
Genesis, as you can see here, you can type in a prompt and it renders this extraordinary 01:31:06.400 |
video that also has underlying it the three-dimensional objects that make up the video. 01:31:13.160 |
Which means you could change the angle, right? 01:31:16.920 |
And also with Google and this, you can start to implement three-dimensional models based 01:31:22.520 |
on some prompt that says something like, I want to have the camera angle at this point in the scene. 01:31:30.880 |
And suddenly everything starts to prompt in a way that you can actually render in real 01:31:35.120 |
time a video game, a movie, a visual experience. 01:31:39.180 |
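Purely illustrative pseudocode for the idea being described: a model that produces an underlying 3D scene rather than flat pixels, so the camera becomes a free parameter at render time. Every function and class here is a hypothetical stand-in; this is not Genesis's actual API.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    position: tuple  # (x, y, z)
    look_at: tuple

def generate_scene(prompt: str):
    """Stand-in for a text-to-3D model: returns a scene graph, not pixels."""
    return {"prompt": prompt, "objects": ["terrain", "character", "props"]}

def render(scene, camera: Camera, frame: int) -> str:
    """Stand-in renderer: because the scene is 3D, any viewpoint works."""
    return f"frame {frame} of {scene['prompt']!r} from {camera.position}"

scene = generate_scene("a drone delivering food along the Great Wall")
for t in range(3):
    cam = Camera(position=(t * 2.0, 1.5, -4.0), look_at=(0, 0, 0))  # moving camera
    print(render(scene, cam, t))
```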
And it goes to this point that we're unleashing the capacity for human imagination and creativity 01:31:44.360 |
with these systems because it's no longer just a lookalike two-dimensional image. 01:31:48.520 |
It's now an actual three-dimensional object that then renders the visuals to make this happen. 01:31:53.080 |
So we're starting to see, I think, the next era of these models that goes beyond just language. 01:31:58.920 |
Yeah, there was another release last week of an experimental project for browser use 01:32:04.080 |
by Gemini, which is another advantage because YouTube has obviously an incredible amount 01:32:10.640 |
of screen recordings. So you have, like we think about visual as just like, okay, it's going to be for developing videos. 01:32:19.160 |
But it's actually just like all of the use cases of a computer are also on YouTube. 01:32:27.720 |
Where your Chrome extension AI can control your browser to do things. 01:32:31.120 |
And if you go to the AI studio from Gemini, they have a mode where you can turn on your 01:32:35.560 |
webcam and then basically, it has full visual access to anything. 01:32:45.640 |
And it's super exciting because you can tell that Google's woken up and they are just on a tear. 01:32:53.840 |
So I mean, just in 10 days, the quantum, the AI, open source, Gemini updates. 01:32:59.460 |
It's like every morning, you're waking up to a Sundar tweet that is some new breakthrough. 01:33:03.320 |
Well, yeah, it's amazing when you fire 20,000 people who weren't doing any work and were in the way. 01:33:09.960 |
And then you say, "Hey, that's not the point." 01:33:14.200 |
The point is they've had this compounding and infrastructure advantage, data advantage, 01:33:20.560 |
And now they were just a little late to the game. 01:33:29.600 |
It was making Abraham Lincoln African-American, and Betsy Ross was Asian. 01:33:37.360 |
A big part of Google's orientation historically has been don't mess up the machine, a classic 01:33:43.320 |
kind of big company dilemma, where if I do something that's either disruptive to my core 01:33:47.280 |
business or if I do something wrong where I will invite regulatory and consumer scrutiny, then don't do it. 01:33:55.040 |
And what they've done is they've changed the posture in the last two years. 01:33:57.800 |
And now the posture is launch early, launch aggressively, push hard. 01:34:01.620 |
We have to win this battle or we're going to get eaten alive. 01:34:04.240 |
And it's amazing to see the founders and Sundar lead this organization. 01:34:08.800 |
I think there were a lot of question marks a year ago, but I think that the questions have been answered. 01:34:15.880 |
I think we've hit peak OpenAI in the market. 01:34:19.080 |
I think they're going to be the number three or four player. 01:34:21.280 |
I think Gemini, Meta, and xAI are going to lead them. 01:34:26.800 |
If we're sitting here in three years, I think OpenAI is number three, four, or five, not number one. 01:34:37.880 |
I don't think you want to bet against Sam and Greg. 01:34:42.000 |
You don't want to bet against Sergey now in office with Sundar. 01:34:48.240 |
So I think the rankings are kind of less interesting as much as just the fact that it's like game on. 01:34:54.160 |
You do not have incumbents that are sleeping at this point. 01:34:59.120 |
So it's just an incredible time for everybody. 01:35:04.440 |
For the sultan of science, David Freeberg, the chairman dictator, Chamath Palihapitiya, 01:35:08.720 |
and I don't have a nickname for you, Aaron Levie, but we will have one soon. 01:35:22.800 |
Have a great Christmas break and we'll see you again shortly.