E115: The AI Search Wars: Google vs. Microsoft, Nordstream report, State of the Union
Chapters
0:00 Bestie intro!
1:49 Report about US involvement in the destruction of Nordstream pipelines, breaking away from the military-industrial complex
28:26 Bestie refresh!
33:13 The AI Search Wars: Microsoft pressing hard, Google has a rough week
45:06 Sundar's next move: How should Google counterpunch? Google's troubled business model, will we see successful lawsuits over training data?
72:41 Will generative AI commodify enterprise SaaS? If so, what happens to VC returns?
91:14 Disappointing State of the Union, precarious situation between debt, taxes, and entitlements
00:00:00.000 |
I had this thing with my oldest son where I don't know if it's all kids, but he's 13. 00:00:05.920 |
He doesn't know how to answer the phone, not to save his life. 00:00:13.040 |
And so I said, "Listen, from now on, when you call me, I expect a certain way that you 00:00:17.640 |
pick up the phone, and that's going to be practiced by how you interact with anybody 00:00:23.520 |
If I call you, you have to pick up the phone, and if you don't, I'm hanging up right away." 00:00:27.740 |
So yesterday, he calls me, and I'm like, "Chamath speaking." 00:00:33.200 |
You know, it's like, "Hello, Chamath speaking." 00:00:49.340 |
We did this three more fucking times, and then finally I said, "When will you get it 00:00:55.500 |
If I say, "Hello, Chamath speaking," "Hey, Dad, it's your son," or "Hey, Dad, it's 00:01:00.940 |
And it's unbelievable, and he's like, "Well, none of my friends pick up the phone like 00:01:06.280 |
Don't they think they need to have verbal communication skills?" 00:01:10.700 |
And then when I call him, he picks up the phone, "Hello?" 00:01:26.400 |
Are you kids like this, or is it just my kid? 00:01:28.180 |
- Actually, I don't think I've heard him pick up the phone. 00:01:35.180 |
- "And instead, we open sourced it to the fans, and they've just gone crazy with it." 00:01:39.180 |
- J. Cal was on Twitter calling me out for not denouncing the Chinese balloon as strong 00:01:55.300 |
- Well, I don't think he understood the language I was using. 00:01:58.700 |
Did you understand what the word errant means? 00:02:04.900 |
- Yeah, it generally means straying off course. 00:02:07.900 |
Yeah, traveling in search of adventure, at least according to Merriam-Webster. 00:02:14.420 |
So I just thought maybe you're being a little... 00:02:17.300 |
But I know you're a dove, but I was a little dovish, thinking they were just off course. 00:02:21.260 |
I think you'd think they were off course, or they were doing it deliberately. 00:02:28.940 |
So I knew J. Cal would have to join in this nationwide panic over this balloon. 00:02:37.980 |
- J. Cal, I got some really bad news for you. 00:02:45.340 |
There are these things called spy satellites, and they can see everything. 00:02:51.340 |
- It's obvious people, they've been sending these balloons over here for a while. 00:02:53.580 |
I just thought you were framing as it was errant, as in like, off course. 00:02:59.420 |
I'm sincerely asking, do you think it was errant, it was an accident? 00:03:02.940 |
It's such a harebrained scheme to send a balloon flying over American territory that the Occam's 00:03:10.980 |
Razor explanation here is that they somehow lost control over it, and these things are 00:03:16.900 |
So my guess is that it probably wasn't deliberate just because of how stupid a plan it was. 00:03:24.540 |
- I don't know how stupid it would be, and how obvious it would be, but it could be. 00:03:29.500 |
What I do know is that the whole nation got in a lather and a tizzy and started hyperventilating 00:03:35.020 |
about this balloon, and it just shows how reflexively hawkish the media is. 00:03:39.420 |
It's like, "Hey, can we just wrap up this war in Ukraine before we start another war 00:03:47.380 |
- The balloon got more attention than us blowing up Nord Stream. 00:03:53.380 |
- No, what I would say about that is the media insight is the valid one. 00:03:56.820 |
I was just responding to the "errant," and I was curious if you actually thought it was 00:04:02.540 |
- Well, I think it clearly was off course and problematic, so I don't know about the 00:04:09.300 |
I think it very well could have been intentional, but I tend to think, because it's such a stupid 00:04:13.220 |
harebrained scheme, I almost give them the benefit of the doubt, not because they're 00:04:21.500 |
- I'm sure they're spying on us, just like we're spying on them. 00:04:23.140 |
That's what we both do, but it seems like such a stupid way, because you're going to 00:04:29.980 |
You have to understand the nature of live television and why the media overreacted to 00:04:34.300 |
It's ongoing, so because it's not final, it's like a live event occurring, like when kids 00:04:42.400 |
Those are the best stories for CNN, because you can keep updating people, and people keep 00:04:49.380 |
So this one was just made for CNN, because obviously, we're going to shoot the thing 00:04:54.420 |
down, and obviously, you can interpret into it if it's an accident or not, and it just 00:04:59.100 |
gives them something to talk about on a slow news week. 00:05:02.620 |
I think your point that is interesting is, is the Nord Stream story correct or not? 00:05:10.900 |
It's not being covered because it's not an ongoing story. 00:05:18.660 |
- I think it's not being covered because the mainstream media already took a side on this. 00:05:27.940 |
So when Nord Stream got blown up, the administration came racing out with the line that the Russians 00:05:36.660 |
And by the way, the media repeated this endlessly. 00:05:40.980 |
And it never really made sense to anyone who's paying attention, because first of all, this 00:05:47.940 |
Second, it was their main source of leverage over Europe, was their control over the gas 00:05:54.340 |
So the idea that they would shoot themselves in the foot that way, just to somehow, what, 00:06:00.380 |
show how crazy they are, it never really made sense to anyone who was paying attention. 00:06:04.660 |
And the fact that the administration, the media, so quickly raced to that conclusion 00:06:12.100 |
Because if we had nothing to do with it, you would just be more neutral and say, "Yeah, 00:06:17.340 |
But they had to promote this line that, "Oh, the Russians did it," which just never made 00:06:23.140 |
And now Sy Hersh has come out with this story, laying out in great detail how we did it. 00:06:30.040 |
It's laying out who did it and how and the steps and all that kind of stuff. 00:06:33.700 |
And just so you guys understand who this guy is, he's this legendary Pulitzer Prize winning 00:06:39.140 |
He's like 86 years old or something like that now. 00:06:41.380 |
But he broke, during Vietnam, he broke the story of the My Lai massacre, which the military 00:06:49.260 |
He broke the story during the Iraq War of the Abu Ghraib prison abuses, which again, 00:06:58.900 |
He's Pulitzer winning is all you really need to know. 00:07:01.940 |
Yeah, and here he is, breaking out basically another covert military action. 00:07:06.500 |
And look, we don't know for sure whether it's true or not, but it looks pretty bad. 00:07:12.260 |
Do you think that the Europeans were told that the Americans were going to go and blow 00:07:17.020 |
If that's true, by the way, the administration has said clearly, this is a false story. 00:07:22.020 |
I think the specific quote was this is pure fiction. 00:07:25.340 |
The Ukraine would be the most likely person to do this. 00:07:29.300 |
You got to look for a means, motive and opportunity. 00:07:31.220 |
They don't have the opportunity to go down there and place a bomb? 00:07:34.300 |
I don't think the Ukrainians have the capability to do an undersea mission like that. 00:07:40.900 |
And the story maintained that we did it with them. 00:07:46.980 |
Norway didn't say whether they were involved or not. 00:07:49.100 |
So no, my guess is that if this was a covert US activity, it was kept to a very small group, 00:07:54.900 |
which is why it'll be hard to prove more definitively than that. 00:07:58.580 |
The other side of the argument is this is such a provocation. 00:08:07.980 |
Like, would the US do something so provocative? 00:08:12.660 |
And would Norway do something so provocative? 00:08:15.160 |
It seems an extremely, it seems like an extremely offensive technique, as opposed to just backing 00:08:20.940 |
the Ukraine to defend itself against Russia's invasion. 00:08:25.140 |
Before the invasion, Biden at a press conference said that if the Russians invade, Nord Stream 00:08:32.820 |
And they asked him, well, how can you make sure that that's a deal between Germany and 00:08:39.140 |
They said, we have ways, we have ways, just trust me, it'll happen. 00:08:41.660 |
Separately, Victoria Nuland, who's our deputy secretary of state, said something very similar 00:08:46.320 |
about how we would stop Nord Stream if the Russians invaded. 00:08:50.320 |
And then after Nord Stream got blown up, Blinken at a press conference said that this was a 00:08:55.280 |
wonderful opportunity, and was extolling all the benefits of this. 00:08:58.760 |
And then Victoria Nuland at a congressional hearing said that I'm sure we're all very 00:09:02.480 |
glad that it's a hunk of metal at the bottom of the sea, again, extolling all the benefits. 00:09:06.120 |
And we've discussed the benefits on this program of now we've shifted the European dependence 00:09:10.120 |
on Russian gas to a dependence on American gas. 00:09:13.680 |
So the fact of the matter is the administration kind of telegraphed what they were going to 00:09:29.320 |
It's a single source story that depends on the credibility of Sy Hersh. 00:09:34.300 |
But as it stands right now, whose story do I think makes more sense? 00:09:40.680 |
You believe Sy Hersh or the spokesperson for the CIA who wrote the claim is completely 00:09:46.680 |
Well, these same people denied Abu Ghraib, they denied My Lai. 00:09:52.760 |
They deny that NATO expansion had anything to do with the, you know, breakout of this war. 00:09:58.760 |
Yeah, I mean, this is one of those situations where we can't possibly know. 00:10:02.680 |
Yeah, you make a good point, J Cal, which is this if true, and look, I'm not saying 00:10:10.040 |
I lean towards thinking it's more plausible than not. 00:10:12.880 |
Because also like who else could have done it and who again, who had the motive means 00:10:18.120 |
But you make a great point, which is if we did it, it's an incredibly provocative act. 00:10:23.880 |
It's basically perpetrating an act of war against a country that has thousands of nuclear 00:10:30.000 |
Biden promised us at the beginning of this war that he would keep us out of being directly 00:10:34.680 |
So this would directly contradict what he said at the beginning of this war. 00:10:48.000 |
If the US didn't do it, like, why don't we find the real killer? 00:10:59.120 |
Obviously, the people who are the most pro stopping Nord Stream have been the Republicans. 00:11:02.960 |
They've led this charge even more than the Democrats. 00:11:13.260 |
And just like we equipped the Ukraine to do it, we might have facilitated the Ukraine 00:11:17.960 |
and a collaborator, UK, Norway, some freelancers, you know, we have those freelance, former 00:11:25.880 |
Navy SEALs that operate, perhaps the CIA just said to the Ukraine, here's a, you know, black 00:11:32.700 |
If you wanted to engage them, you could feel free to do so, which would then split the 00:11:37.040 |
difference between what Sy is saying and what the CIA is saying, which is typically where the 00:11:41.680 |
truth lies, probably between what this investigative journalist of note has said and what the CIA 00:11:52.920 |
Well, if you could find a source for that story, J Cal, that lays out in the same level 00:11:56.800 |
of detail that Hersh has laid out in terms of the meetings that occurred, what was approved 00:12:02.040 |
when, how they did it, you know, when they did it, the explosives they used, the 00:12:08.760 |
divers they used, it goes into a fair amount of detail, incredible detail. 00:12:13.080 |
Yeah, I mean, so I mean, you're laying out a theory as to what explosives, right? 00:12:17.280 |
But you're trying to put what you just said, which is basically you inventing a story on 00:12:24.600 |
But he actually has a lot of detail in his story. 00:12:30.080 |
There was a lot of people that were willing to tell the story. 00:12:32.520 |
No, no, I don't think that's actually the case. 00:12:36.760 |
And there's no on the record, there's one main source, there was one main source. 00:12:42.800 |
I don't know exactly how much he was able to directly corroborate with other sources. 00:12:46.680 |
The other thing I wonder, Saks is he with this one source story, he has a collaboration 00:12:54.480 |
And anybody would if this was a really well sourced, and you could back it up kind of 00:12:59.240 |
story, they would love to have the ratings for this story. 00:13:02.560 |
There's a blockbuster story. What happens a lot of the time, which we learned through 00:13:07.360 |
the Valerie Plame affair, which is that these major news publications will sometimes have 00:13:13.040 |
a back channel back to the national security apparatus when they have something like this. 00:13:17.320 |
And the message was, you can't hit print on this. 00:13:19.480 |
Yeah, in which case, possibility two, he goes and just self-publishes himself, which wasn't 00:13:27.760 |
So I mean, this is the Rorschach test of Rorschach tests. 00:13:30.640 |
You have the media, you got the CIA, it's fodder for a great movie. 00:13:34.280 |
What I go back to J. Cal is I think you can lay out some theories about let's say, you 00:13:38.560 |
know, the Poles did it or maybe Ukrainians with the British or something. 00:13:43.480 |
Yes, you could lay out those theories, but the media wasn't willing to entertain any 00:13:52.520 |
And that story made no sense, but they said it so definitively. 00:13:57.600 |
I'm looking for Nick, if you could pull up the source of the administration blaming the 00:14:01.240 |
Russians, I just want to make sure we're accurate. 00:14:04.000 |
Yeah, I just want to make sure there was no sabotage by the Russians. 00:14:08.720 |
What was interesting is the Biden administration set up but they didn't keep coming back to 00:14:15.600 |
And when people on both the left and the right like people like Jeffrey Sachs on the left 00:14:20.120 |
and Tucker Carlson, the right basically started to question whether the US could have done 00:14:23.440 |
it, they were accused of being conspiracy theorists, and you know, Putin stooges and 00:14:28.960 |
And now all of a sudden, here comes Seymour Hersh with a pretty detailed story lending 00:14:37.600 |
It just may indicate we don't know everything that's going on with this war. 00:14:42.080 |
And I think the longer this goes on, the more dangerous it is. 00:14:45.200 |
Freiberg, do you have any thoughts on geopolitical issues and who might have blown up Nord Stream? 00:14:53.120 |
What are your sources in the Science Corner saying? 00:14:59.800 |
I don't know about the whole speculating on what could have been done in the past by someone. 00:15:05.240 |
Those are conspiracy theories and a little dangerous. 00:15:17.080 |
J Cal's theory is just a story with no evidence. 00:15:18.080 |
I think you've got to bring data to the table. 00:15:22.800 |
You got to bring data to the table to make it a... 00:15:24.800 |
By the way, speaking of which, Radek Sikorski, who is a Polish diplomat, I think he was like 00:15:30.600 |
their foreign minister, when Nord Stream blew up, he tweeted a photo of it saying, "Thank 00:15:39.120 |
Which was one of the reasons why people thought that, okay, like, yeah, of course the US did 00:15:52.920 |
Who was conducting NATO exercises in that region? 00:15:59.920 |
And Jeffrey Sachs pointed that out on, I think it was a CNBC interview before they basically 00:16:04.680 |
stopped him because he's not allowed to talk about this on, you know, network TV, apparently. 00:16:09.040 |
But he basically pointed out there were like US radar signatures in the area. 00:16:12.880 |
President Joe Biden declared that, this is from Bloomberg, that a massive leak from the 00:16:16.160 |
Nord Stream gas pipeline system in the Baltic Sea was an intentional act and that the Russian 00:16:20.080 |
statements about the incident shouldn't be trusted. 00:16:21.840 |
It was a deliberate act of sabotage and now the Russians are pumping out disinformation 00:16:25.200 |
lies Biden told reporters Friday at the White House. 00:16:27.600 |
So he didn't exactly say the Russians did it. 00:16:29.200 |
He just said it's an act of, don't believe the Russians. 00:16:34.080 |
You could do one of these montages where it was like Russian sabotage. 00:16:42.600 |
If you actually look at what Biden said, everything he said is absolutely true if the US also 00:16:49.760 |
Every single sentence that Joe Biden said is 100% true, whether we did it or whether 00:17:06.120 |
And then we have to be like careful about suggesting they did it. 00:17:13.720 |
So if we blew it up, that's also an, uh, explain to me your thinking on the chessboard of our 00:17:21.160 |
If we blew it up, would they not also see that as a hostile act? 00:17:24.000 |
This is why I asked if Europe knew because I think you have to tell Germany that it's 00:17:28.840 |
And I think the quid pro quo with Germany is some amount of guaranteed supply that the 00:17:35.040 |
US directs into Europe so that they know that their long term LNG supply is intact so that 00:17:42.720 |
There's a point of indifference where the Germans say, okay, we don't know when this 00:17:47.360 |
thing is going to get turned back on and we don't know what the implications of it are. 00:17:50.880 |
Here's what our demands are, meaning our energy supply, our energy needs are. 00:17:56.100 |
And so as long as the United States can say, look, worst case, we have stuff in the, you 00:18:03.360 |
There's probably a point of indifference where the Germans say, okay, we're just going to 00:18:07.640 |
And I'm curious, did the Germans say anything when this happened? 00:18:13.400 |
I was literally just typing to Nick, what was the Germans' position on this? 00:18:18.440 |
Well, yeah, if you're breaking this down like a poker hand, who are trying to figure out 00:18:22.120 |
if you construct this hand, right pre flop, it's like, you know, where both of these folks 00:18:27.960 |
Okay, there was a really interesting video that a guy named Matt Orfalea, who puts 00:18:33.040 |
together these really funny videos, he put together these montages of media reactions 00:18:39.280 |
And what he shows is that, you know, you can have like 20 different media outlets, they 00:18:45.080 |
And when he clips it together, you can see they're reading from somebody's talking points. 00:18:48.840 |
And it's not really clear whose. And basically, if you look at his video here, on who blew 00:18:54.440 |
up Nord Stream pipeline, you see that there was like a party line from the mainstream 00:19:11.120 |
I mean, we have to conclude that is most likely Russia, Russian sabotage on its own infrastructure. 00:19:22.560 |
I think it's Putin's way of sending a message. 00:19:25.160 |
What Putin is saying to us by blowing up his pipeline is, look, I can blow up a pipeline. 00:19:31.280 |
Who are these talking heads smoking gun without the direct proof? 00:19:34.840 |
Yeah, I think logic and common sense will tell you that without the evidence, Russia 00:19:38.680 |
was behind the incident, you can say it for sure. 00:19:47.640 |
That's a you know, it's these talking heads who have no firsthand experience. 00:19:49.920 |
And I'm more than willing to comment on this. 00:19:54.920 |
All his videos are like that, where he has one on the Hunter Biden laptop as well, where 00:19:58.560 |
again, he's got like 20 different talking heads and media outlets, all portraying it 00:20:06.920 |
It's like a narrative on Sunday morning shows, you hear the same narrative from each side. 00:20:15.960 |
Each side builds those bullet points, emails their what do they call them? 00:20:21.440 |
They email all the surrogates and say, just keep saying these things over and over again 00:20:28.360 |
And they're just like, say this over and over again. 00:20:33.640 |
I think it's partly talking points, memos that go out to chat groups. 00:20:35.840 |
I think it's also just people looking on Twitter. 00:20:38.440 |
And then there's like certain key nodes that they follow. 00:20:41.920 |
And they know, okay, this is the party line, because such and such key person is saying 00:20:47.480 |
It's that memetic, mimetic effect that you guys always talk about, people with that. 00:20:53.840 |
Yeah, the thing to understand is that all the prestige outlets repeat the same party 00:21:02.320 |
You got to do your own search for information. 00:21:04.720 |
Actually, this is why substack is so important is it actually gives you an alternative. 00:21:09.080 |
It gives you an alternative because like you've got 10 different mainstream media networks 00:21:14.480 |
or newspapers or magazines, but they all have exactly the same talking points, except maybe 00:21:20.080 |
Although even Fox on the whole Nord Stream thing, you saw that Fox can be pretty militaristic 00:21:25.440 |
and they had the same generals basically blaming the Russians for this on Fox. 00:21:30.920 |
Their position is just, "Hey, everybody, this is sabotage." 00:21:37.160 |
I think you asked a really good question there about the German interest in this. 00:21:41.000 |
Right now, the German economic interest and the German foreign policy interests are not 00:21:46.880 |
What's clearly best for the German economy is to have cheap natural gas powering its 00:21:51.400 |
industries even if it comes from Russian pipelines. 00:22:00.040 |
They're going to pay a very high price economically maybe forever. 00:22:03.320 |
Remember, their whole economy is based on industry. 00:22:09.360 |
If this war drags on for a long time, I think Scholz might be in some political trouble 00:22:13.240 |
precisely because he's gone along with the Americans on this. 00:22:18.320 |
There is a growing political opposition to this war inside of Germany. 00:22:22.880 |
War fatigue is a real thing and this thing's got to wrap up at some point. 00:22:28.880 |
We didn't get too involved in that conversation. 00:22:42.640 |
You don't want to criticize the establishment? 00:22:57.600 |
I'm just analytical around the fact that I think there's a strong orientation towards 00:23:02.560 |
I don't think that there's much of an incentive or a motivation to back down because this 00:23:08.200 |
conflict creates a significant amount of debt owed back to the U.S. 00:23:18.160 |
Everyone's looking externally as internal economic conflict and wealth disparity issues 00:23:24.680 |
arise and economic growth is challenged and inflation is soaring. 00:23:32.240 |
I think all of this stuff is detailed analytical shenanigans around who's saying what or who's 00:23:38.800 |
I think the underlying thesis and the underlying river that's flowing is one that's looking 00:23:45.160 |
I think the same is true with the U.S. and China. 00:23:48.480 |
You're talking about the military-industrial complex is going to benefit massively from 00:23:53.520 |
The more this drags out, the more the conflict in Taiwan heats up, the more we're going to 00:24:01.000 |
We often talk about these things as if they're top-down, master plan driven. 00:24:06.200 |
As we all know, they're more Ouija board driven. 00:24:09.120 |
It's a bunch of guys that got their hand on the Ouija board and they all just had a little 00:24:15.320 |
In this case, I think it's just more about everyone's a little anxious and the anxieties 00:24:23.880 |
If you're happy at home, you're not looking externally for conflict. 00:24:27.000 |
That's true in nearly every developed nation on earth today. 00:24:37.200 |
I think there are a lot of interests who benefit from war and I think the foreign policy establishment 00:24:42.920 |
is funded by those interests and it's kind of wired for war, at least in terms of the 00:24:48.880 |
If you're going to do something as relatively harmless as a balloon, I saw that becomes 00:24:55.200 |
It's like people are ready to go to war against China over that. 00:24:58.480 |
I saw a couple of military leaders give a talk a few months ago. 00:25:03.400 |
So it wasn't on public record with the establishment. 00:25:06.720 |
No, yeah, I was at the establishment gathering. 00:25:09.120 |
What was striking to me in this particular thing where these guys were being interviewed 00:25:25.160 |
on stage at like a dinner thing and they were so oriented around their next steps in escalation 00:25:35.480 |
and I think it speaks to the point, Sacks, like none of them were thinking about like 00:25:40.400 |
where we are today, how do we deescalate, what is this going to get? 00:25:43.040 |
There was no conversation at all from anyone about resolution or deescalation. 00:25:48.840 |
With every single one of them, it was all about like my orientation for getting bigger, 00:25:53.360 |
going deeper, going harder, going stronger, making this thing bigger and I think that 00:25:57.280 |
was really scary to me because I didn't hear anyone having a conversation around like how 00:26:01.520 |
do we should we even consider whether or not this doesn't get bigger. 00:26:05.920 |
Everyone was thinking, assumptively, it was going to get bigger. 00:26:11.880 |
When the last time we looked at this, right, Leon Panetta and all these other guys who 00:26:15.680 |
were screaming for war, they were getting paid by the military industrial complex. 00:26:19.880 |
I remember when Lloyd Austin was nominated as defense secretary. 00:26:23.880 |
He had some conflict issues because he was just on the board of Northrop Grumman or one 00:26:31.080 |
And so of course, these generals have to push for war because as long as they're girding 00:26:35.040 |
for war, they're guaranteed to have for them a very lucrative job once they leave the military. 00:26:42.640 |
The Ouija board though, Sacks, I'll throw it to you, maybe you can keep this metaphor going. 00:26:49.760 |
You have the energy industrial complex in this German conflict who seeks to benefit 00:26:53.800 |
massively if people invest in renewables or you find other oil off Norway. 00:27:01.160 |
Norway's oil is one of the largest reserves that's untapped. 00:27:04.240 |
So you have this Ouija board, media, energy, and the military industrial complex all moving 00:27:11.080 |
Everyone wants to move it to the side of the Ouija board that says escalate. 00:27:14.160 |
There are very few people that have the energy to move it to the other side that says de-escalate. 00:27:19.280 |
De-escalation means less energy, less investment. 00:27:24.200 |
You know who warned us about the military industrial complex? 00:27:27.800 |
Dwight D. Eisenhower, Supreme Allied Commander in World War II, wins the war, patriot war 00:27:32.720 |
hero, top general, becomes president, Republican president, and his departing address warns 00:27:38.400 |
us that yes, we need a defense industry, but they become a vested interest in favor of 00:27:49.280 |
Where's the interest on the other side of it? 00:27:52.440 |
I can tell you this, the American people don't want to be in a war with Russia. 00:27:56.320 |
I don't even think most of the American people want to send 100 billion over there. 00:28:00.560 |
They want to send 100 billion to their cities to fix crime and all the other problems. 00:28:07.460 |
If you've never seen that farewell address from 1961, it is well worth watching. 00:28:12.520 |
Just search for military industrial complex Eisenhower. 00:28:16.240 |
And this is a person who was part of the military industrial complex saying, "Watch out for 00:28:27.680 |
With us again, David "The Dove" Sacks, Chamath Palihapitiya, and the Sultan of Science, 00:28:39.560 |
What are all these podcasts you're doing, Freeberg? 00:28:42.280 |
I did a podcast with Brian Keating last week, who was really kind enough to reach out. 00:28:47.800 |
He's a cosmologist at UCSD, professor down there. 00:28:54.000 |
And we were supposed to record that day, and then we canceled, I think, last minute, right? 00:28:58.760 |
So I was only supposed to be on with him for an hour, and I'm like, "Oh, well, my next 00:29:05.480 |
I was like exhausted that day, so I look really hungover on the video and probably stumbled 00:29:13.040 |
So Chamath and I have done Lex Fridman, but you have not. 00:29:39.120 |
Well, full stop, we're not doing all in live from Davos. 00:29:47.360 |
I don't want to be part of any club that would have me as a member. 00:29:51.760 |
And they also don't want you there anyway, so it works out for everybody. 00:29:56.120 |
It does fill me with like a rage where I actually might agree to doing the All In Summit again. 00:30:01.880 |
By the way, proposal coming your way this weekend. 00:30:05.520 |
If you want to really, really, really thumb your nose at the establishment. 00:30:09.840 |
Set it during the exact same dates and times as an establishment conference. 00:30:16.640 |
And invite all the best guests so that they come to ours. 00:30:22.560 |
You know how Vanity Fair does their new establishment conference? 00:30:28.800 |
Let's come up with a tagline that just tweaks everybody. 00:30:33.440 |
You know, they have their like establishment list. 00:30:41.520 |
I have a great Vanity Fair establishment thing story. 00:30:53.440 |
And the most incredible thing about it is that when you go to the event, which was kind of 00:30:57.000 |
a cool event, we all had photos taken by Annie Leibovitz. 00:31:02.080 |
And I have a montage of some of the people that took photos that day. 00:31:09.680 |
Me, Aaron Levie, Priscilla Chan, Bezos, bunch of people. 00:31:13.600 |
But I was about to say, just before you had, just when you had the dad bod and no fashion 00:31:38.560 |
He's doing the jeans blazer thing, which is like a really tired look for a Silicon Valley 00:31:52.560 |
Just don't pull up the Chamath pictures when he's wearing like, you know, his Macy's shirt. 00:31:57.200 |
If you do the Google search, you put the images before 2011, you will find Chamath photos 00:32:03.640 |
Dude, at Facebook, I wore the same thing every day for four years, five years. 00:32:16.640 |
See this is when the sweater game was not tight. 00:32:23.600 |
Look, oh, and he's also got the watch subtly peeking out. 00:32:26.320 |
This is back when he was like, "Oh, I got a Rolex." 00:32:32.600 |
This is when the watches were only five figures. 00:32:35.160 |
I don't know what that watch got to six figures. 00:32:41.160 |
I got an Apple watch and I'm going to upgrade. 00:33:04.160 |
It was like pictures of me eating egg sandwiches all over the internet. 00:33:15.760 |
Okay, it's been a rough couple of days for Google. 00:33:18.520 |
Google and Microsoft both did live demos of their new generative AI, yada, yada, yada. 00:33:25.360 |
But now Bing is integrating it into their search engine, getting there before Google. 00:33:29.920 |
And Microsoft CEO Satya Nadella, he is going ham. 00:33:38.000 |
And he is saying he's going to make Google dance. 00:33:48.360 |
On the other hand, Google's AI demo was, frankly, a bit of a disaster. 00:33:54.480 |
Poorly received; the stock dropped 12% since the event. 00:33:58.600 |
And their presentation did not include the chatbot Bard in search, because it 00:34:05.920 |
It seems like there was an error in it when they asked, "What new discoveries from the James 00:34:09.520 |
Webb Space Telescope can I tell my nine-year-old about?" and Bard answered that it took the first 00:34:14.920 |
pictures of a planet outside our solar system, which is false. 00:34:26.760 |
So anyway, there was a screenshot circulating today, which is probably false, but it says 00:34:35.240 |
Me and a bunch of coworkers were just laid off from Google for AI demo going wrong. 00:34:39.200 |
It was a team of 168 people who prepared the slides for the demo. 00:34:44.600 |
But if it was, that would be a hardcore moment for Google to fire a bunch of people for screwing 00:34:49.600 |
Listen, you worked in the belly of the beast, Freiburg. 00:34:52.800 |
What are your thoughts on Bing poking the tiger and telling Google to dance? 00:34:59.080 |
You know, "Sundar, dance." You know, what's interesting is Google's had like an incredible AI competency, 00:35:08.200 |
And it's been predominantly oriented towards kind of, you know, internal problems. 00:35:14.200 |
You know, they're they demonstrated last year that their AI improved data center energy 00:35:20.920 |
They've used it for ad optimization, ad copy optimization, the YouTube follow video algorithm. 00:35:27.400 |
So what video is suggested to you as your next video to watch, which massively increased 00:35:33.020 |
YouTube hours watched per user, which massively increased YouTube revenue. 00:35:38.680 |
You know, what's the right time and place to insert videos in YouTube or insert ads 00:35:45.640 |
So so much of this competency has been oriented specifically to avoid this primary disruption 00:35:52.880 |
Obviously, now, things have come to a bit of a point because, you know, this alternative 00:36:04.080 |
And you know, we've used this term in the past, Larry and Sergey, the textbook that 00:36:08.040 |
they read, you know, one of the original textbooks that's used in internet search engine technology 00:36:13.560 |
is called information retrieval, information retrieval. 00:36:16.600 |
So information retrieval is this idea that you know, how do you pull data from a static 00:36:21.040 |
data set, and it involves scanning that data set or crawling it, and then creating an index 00:36:26.260 |
against it, and then a ranking model for how do you pull stuff out of the index to present 00:36:30.880 |
the results from the data that's available based on what it is you're querying for, you 00:36:37.560 |
know, and doing that all in a 10th of a second. 00:36:39.720 |
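As a toy illustration of the information retrieval loop Friedberg describes, crawling a static data set, building an index against it, and then ranking results for a query, here is a minimal sketch; the documents and the simple term-overlap scoring are invented for illustration and are nothing like Google's production systems.

```python
from collections import defaultdict

# Toy "crawled" corpus: doc_id -> text. Purely illustrative documents.
docs = {
    1: "best gaming tvs for playstation",
    2: "flight times and airline prices",
    3: "gaming laptops and gaming tvs reviewed",
}

# Build an inverted index: term -> set of doc_ids containing that term.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query: str):
    """Rank documents by how many query terms they contain, highest first."""
    scores = defaultdict(int)
    for term in query.split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(search("best gaming tvs"))  # ranked list of (doc_id, score) pairs
```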
So you know, if you think about the information retrieval problem, you type in the data or 00:36:44.840 |
some rough estimation of the data you want to pull up, and then a list is presented to 00:36:49.720 |
And over time, Google realized, hey, we could show that data in smarter, quicker ways. 00:36:53.440 |
Like if we can identify that you're looking for a very specific answer, we can reveal 00:36:57.560 |
that answer in the one box, which is the thing that sits above the search results. 00:37:02.680 |
You know, what, when does this movie show at this theater, so they can pull out the 00:37:05.680 |
structure data and give you a very specific answer rather than a list from the database. 00:37:11.400 |
And then over time, there were other kind of modalities for displaying data that it 00:37:15.280 |
turns out, were even better than the list, like maps, or shopping, where you can kind 00:37:20.640 |
of see a matrix of results, or YouTube, where you can see a, you know, longer form version 00:37:26.440 |
And so these different kind of, you know, information retrieval, you know, media were 00:37:31.920 |
presented to you, and it really kind of changed the game and created much better user satisfaction 00:37:37.640 |
in terms of getting what they were looking for. 00:37:40.800 |
The challenge with this new modality is it's not really fully encompassing. 00:37:46.560 |
So if you can kind of think about the human computer interaction problem, you want to 00:37:51.280 |
see flight times, and airlines, and the price of flights in a matrix, you don't necessarily 00:37:56.920 |
want a text stream written to you to give you the, you know, the answer that you're 00:38:02.920 |
looking for, or you want to see a visual display of shopping results, or you do want to see 00:38:09.480 |
a bunch of different people's commentary, because you're looking for different points 00:38:12.560 |
of view on a topic, rather than just get an answer. 00:38:15.540 |
But there are certainly a bunch of answer solutions for which ChatGPT-type, you know, 00:38:21.160 |
natural language responsiveness becomes a fantastic and better mode to present answers 00:38:27.080 |
to you, than the matrix or the list or the ranking and so on. 00:38:31.000 |
Now, the one thing that I think is worth noting, I did a back of the envelope analysis on the 00:38:39.120 |
So Google makes about three bucks per click; you can back into what the revenue is. 00:38:44.720 |
One way is three bucks per click at about a 3% click-through rate on ads. 00:38:48.200 |
Some people estimate this is about right about 5 cents to 10 cents revenue per search done 00:38:52.560 |
on Google or anywhere from one cent to 10 cents, even if they don't click the ads, because 00:38:59.620 |
So let's let's just call it five cents, right. 00:39:02.480 |
And you can assume a roughly 50% margin on that search, which means a 50% COGS or cost 00:39:07.720 |
of goods, or a cost to run that search and present those ads. 00:39:11.800 |
So you know, right now, Google search costs them about, you know, call it two and a half 00:39:20.480 |
A recent estimate on running the GPT-3 model for ChatGPT is that each result takes 00:39:30.760 |
So it's about an order of magnitude higher cost to run that search result than it is 00:39:37.360 |
to do it through a traditional search query today. 00:39:43.160 |
So that's the point, like it has to come down by about an order of magnitude. 00:39:47.880 |
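To make the back-of-the-envelope arithmetic above concrete, here is a minimal sketch; the per-click price, click-through rate, margin, 10x cost multiplier, and query volume are all rough assumptions taken from the conversation, not measured figures.

```python
# Rough sketch of the per-search economics discussed above.
# Every number is an approximation quoted in the conversation, not real data.

revenue_per_click = 3.00       # ~$3 earned per ad click
click_through_rate = 0.03      # ~3% of searches produce an ad click
implied_revenue = revenue_per_click * click_through_rate  # ~$0.09; estimates range $0.01-$0.10

revenue_per_search = 0.05      # "call it five cents" per search
serving_margin = 0.50          # assume ~50% margin, so roughly half of revenue is serving cost
cost_per_search = revenue_per_search * (1 - serving_margin)  # ~2.5 cents per query

llm_cost_per_query = 10 * cost_per_search  # LLM answer assumed ~an order of magnitude pricier

searches_per_quarter = 3.2e11  # hypothetical Google-scale volume
print(f"Classic search serving cost per quarter: ${searches_per_quarter * cost_per_search / 1e9:.0f}B")
print(f"LLM-style answer cost per quarter:       ${searches_per_quarter * llm_cost_per_query / 1e9:.0f}B")
```

At that hypothetical volume the sketch lands on roughly the same $8 billion versus $80 billion per quarter contrast cited later in this discussion.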
Now, this is a this then becomes a very deep technical discussion that I'm certainly not 00:39:53.760 |
But there are a lot of great experts that there's great blogs and sub stacks on this, 00:39:57.280 |
on what's it going to take to get there to get a 10x reduction in cost on running these 00:40:02.800 |
And there's a lot related to kind of optimization on how you run them on a compute platform, 00:40:07.960 |
the type of compute hardware that's being used all the way down to the chips that are 00:40:12.020 |
So there's still quite a lot of work to go before this becomes truly economically competitive 00:40:20.200 |
Because if you get to the scale of Google, you're talking about spending eight to $20 00:40:23.920 |
billion a quarter just to run search results and display them. 00:40:28.380 |
And so for chat GPT type solutions on Bing or elsewhere to scale, and to use that as 00:40:32.840 |
the modality, you're talking about something that today would cost $80 billion a quarter 00:40:38.000 |
to run from a compute perspective, if you were to do this across all search queries. 00:40:41.760 |
So it's certainly going to be a total game changer for a subset of search queries. 00:40:45.960 |
But to make it economically work for for these businesses, whether it's Bing or Google or 00:40:52.240 |
others, there's a lot of work still to be done. 00:40:54.800 |
The great part about this Chamath is that Bing gave 10 billion to our friend Sam and 00:41:01.320 |
chat GPT to invest in Azure, which now has the infrastructure and will be providing the 00:41:07.680 |
chat GPT infrastructure to startups or corporations, big companies, and small alike. 00:41:12.640 |
So that $10 billion should do enough to grind it down between software optimization, data 00:41:17.360 |
optimization, chip optimization and cloud optimization. 00:41:24.000 |
The ability to run this at scale is going to happen because we're getting better and 00:41:28.800 |
better at creating silicon that specializes in doing things in a massively parallelized 00:41:35.560 |
And the cost of energy at the same time is getting cheaper and cheaper along with it. 00:41:40.080 |
When you multiply these two things together, the effect of it is that you'll be able to 00:41:45.320 |
The same output today will cost one-tenth, as long as you ride the energy and compute 00:41:57.600 |
So the sidebar is, if you guys were sitting on top of something that you thought was as 00:42:03.180 |
foundational as Google search back in 1999, would you have sold 49% of it for $10 billion? 00:42:15.920 |
Not in an environment where you have unlimited ability to raise capital. 00:42:19.280 |
This is something that we've said before, which is that chat GPT is an incredibly important 00:42:25.960 |
But it's an element of a platform that will get quickly commoditized because everybody 00:42:31.980 |
And so I think what Microsoft is doing is the natural thing for somebody on the outside 00:42:37.640 |
looking in at an entity that has 93% share of a very valuable category, which is how 00:42:44.860 |
And so Microsoft effectively, for 10 billion, bought almost 50% of a tool. 00:42:50.200 |
And now we'll make that tool as pervasive as possible so that consumer expectations are 00:42:54.880 |
such that Google is forced to decay the quality of their business model in order to compete. 00:43:01.720 |
So that as Friedberg said, you have to invest in all kinds of compute resources that today 00:43:10.240 |
And what you will see is that the business quality degrades. 00:43:14.160 |
And this is why when Google did the demo of Bard, the first thing that happened was the 00:43:18.520 |
stock went off 500 basis points; they lopped off $100 billion of the market cap, mostly 00:43:23.920 |
in reaction to Oh my god, this is not good for the long term business. 00:43:27.120 |
It's not good for the long term business on a mechanical basis. 00:43:30.780 |
And if you get an answer, you don't have to click the links. 00:43:32.720 |
No, right now, if you look at Google's business, they have the best business model ever invented 00:43:46.320 |
This is a business that this year will do almost $100 billion of free cash flow. 00:43:51.120 |
It's a business that has to find ways and we kind of joke, but they have to find ways 00:43:56.680 |
Because they'd be showing probably 50 or 60% EBITDA margins and people would wonder, hey, 00:44:02.020 |
wait a minute, you can't let something like this go unattended. 00:44:05.420 |
So they try to do a lot more things to make that core treasure look not as incredible 00:44:15.620 |
This is a business that's just an absolute juggernaut. 00:44:17.980 |
And they have 10 times as many employees as they need to run the core business. 00:44:23.060 |
But my point is that it's an incredible business. 00:44:25.180 |
So that business will get worse if Microsoft takes a few hundred basis points of share, 00:44:30.880 |
if Meta takes a few hundred basis points of share, if Tencent does, if a few startups do. 00:44:36.460 |
Quora, by the way, launched something called Poe, which I was experimenting and playing 00:44:42.440 |
If you add it all up, what Satya said is true, which is even if all we do collectively as 00:44:46.700 |
an industry is take 500 or 600 basis points of share away from Google, it doesn't create 00:44:54.080 |
that much incremental cost for us, but it does create enormous headwinds and pressure 00:45:00.120 |
for Google with respect to how they are valued and how they will have to get revalued. 00:45:05.800 |
So the last thing I'll say is the question that I've been thinking about is what is Sundar 00:45:14.000 |
I think the countermeasure here, if I was him, is to go to the board and say, guys, 00:45:22.160 |
So TAC is the traffic acquisition cost that Google pays their publishers. 00:45:26.320 |
It is effectively their way of guaranteeing an exclusivity on search traffic. 00:45:32.360 |
So for example, if you guys have an iPhone, it's Google search. 00:45:40.520 |
This year, this renegotiation for that deal could mean that Apple gets paid $25 billion 00:45:46.820 |
So Google does all these kinds of deals; last year, they spent, I think, 45 billion 00:45:52.800 |
So about 21%. When you think about that, Chamath, Google basically paid Apple, which was working 00:46:02.960 |
So I think the question for Google is the following. 00:46:04.700 |
If you think you're going to lose share, and let's say you go to 75% share, would you rather 00:46:11.080 |
go there and actually still maintain your core stranglehold on search? 00:46:17.040 |
Or do you actually want 75% share where now all of these other competitors have been seeded? 00:46:23.400 |
Well, you can decay business model quality and still remain exclusive if you just double 00:46:29.840 |
And what you do is you put all these other guys on their heels because, as we talked 00:46:33.040 |
about, if you're paying publishers two times more than what anybody else is paying them, 00:46:38.560 |
you'll be able to get publishers to say, "Hey, you know what? 00:46:41.280 |
Don't let those AI agents crawl your website because I'm paying you all this money. 00:46:46.080 |
So, "do not crawl," in a robots.txt equivalent, for these AI agents. 00:46:51.040 |
And I think that that'll put Microsoft and all these other folks on their heels. 00:46:54.840 |
And then as you have to figure out all this derivative work stuff, all these lawsuits, 00:47:01.040 |
Google will look pristine because they can say, "I'm paying these guys double because 00:47:04.040 |
I acknowledge that this is a core part of the service." 00:47:06.440 |
So that's the game theory I think that has to get figured out. 00:47:15.160 |
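A concrete version of the "robots.txt equivalent" Chamath describes might look like the sketch below, using Python's standard urllib.robotparser to show how a per-crawler allow/deny rule would be evaluated; the crawler names and the publisher policy are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical publisher policy: block a generic AI-training crawler outright,
# but keep allowing an exclusive partner's agent. All names here are made up.
robots_txt = """\
User-agent: HypotheticalAICrawler
Disallow: /

User-agent: ExclusivePartnerBot
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

print(rp.can_fetch("HypotheticalAICrawler", "https://publisher.example/reviews"))  # False
print(rp.can_fetch("ExclusivePartnerBot", "https://publisher.example/reviews"))    # True
```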
I'm going to get the saxophone off because in this clip I'm about to show, Nilay Patel 00:47:18.960 |
from The Verge did an awesome interview with Satya and he basically would not answer this 00:47:26.680 |
question at least to my satisfaction, which is, "Hey, what do the publishers get out of 00:47:38.720 |
But if I ask the new Bing, "What are the 10 best gaming TVs?" 00:47:41.640 |
And it just makes me a list, why should I, the user, then click on the link to The Verge, 00:47:48.000 |
which has another list of the 10 best gaming TVs? 00:47:51.760 |
But even there, you will sort of say, "Hey, where did these things come from?" 00:48:04.280 |
So I don't think of this as a complete departure from what is expected of a search engine today, 00:48:09.720 |
which is supposed to really respond to your query, while giving them the links that they 00:48:21.580 |
He punted the answer and just said, "Hey, listen, Search works this way." 00:48:23.920 |
Saks, will the rights to the data, will Google just say to Quora, "Hey, we'll give you a 00:48:30.360 |
billion dollars a year for this data set if you don't give it to anybody else." 00:48:36.800 |
Saks, the strategist, let me hear your strategy here. 00:48:39.800 |
I think there's maybe even a bigger problem before that, which is I think the whole monetization 00:48:46.720 |
So the reason why Google monetizes so well is it's perceived as having the best search, 00:48:52.200 |
and then it gives you a list of links, and a bunch of those links are paid, and then 00:48:58.680 |
Now I think when you search in AI, you're looking for a very different kind of answer. 00:49:02.800 |
You're not looking for a list of 10 or 20 links. 00:49:08.480 |
And so where is the opportunity to advertise against that? 00:49:10.800 |
I mean, maybe you can charge an affiliate commission if the answer contains a link in 00:49:16.600 |
it or something like that, but then you have to ask the question, "Well, does that distort 00:49:22.560 |
Am I really getting the best answer, or am I getting the answer that someone's willing 00:49:28.880 |
The fact is, if Google gives you an answer, you don't click on ads. 00:49:32.600 |
Google has had a very finely tuned balance between, "Hey, these first two or three paid 00:49:38.960 |
ads, these might, the paid links might actually give you a better answer than the content 00:49:45.160 |
But in this case, if the chat GPT tells you, "Hey, this is the top three televisions, these 00:49:49.200 |
are the top three hotels, these are the top three ways to write a better essay," you don't 00:49:55.760 |
You have now been given an answer, and the model is gone. 00:49:59.200 |
The paid link is still a subset in that case. 00:50:03.280 |
At Google, we used to have a key metric, which was the bounce-back rate. 00:50:07.840 |
So when a user clicks on a result on the search results page, we could see whether or not 00:50:19.000 |
That tells you the quality of the result that they were given because if they don't come 00:50:23.000 |
back, it means they ended up getting what they were looking for. 00:50:26.800 |
Ads that performed better than organic search results, which means someone created the ad, 00:50:33.040 |
paid for it, and the user clicked on it and didn't come back and came back with less frequency 00:50:39.040 |
than if they clicked on an organic result, that meant that the ad quality was higher 00:50:44.760 |
So the ad got promoted to kind of sit at the top and it became a really kind of important 00:50:48.800 |
part of the equation for Google's business model, which is how do we source, how do we 00:50:52.800 |
monetize more search results where we can get advertisers to pay for a better result 00:50:59.320 |
than what organic search might otherwise kind of show. 00:51:02.440 |
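For intuition on the bounce-back metric described above, here is a minimal sketch of how such a rate could be computed from a click log; the log format, field names, and 30-second window are invented for illustration and are not how Google actually instruments this.

```python
from collections import defaultdict

# Hypothetical click log: (result_id, seconds until the user returned to the results page).
# None means the user never bounced back, i.e. the result apparently satisfied them.
click_log = [
    ("ad_1", None), ("ad_1", 4), ("ad_1", None),
    ("organic_1", 3), ("organic_1", None), ("organic_1", 2),
]

BOUNCE_WINDOW_SECONDS = 30  # assume a quick return counts as a bounce back

clicks = defaultdict(int)
bounces = defaultdict(int)
for result_id, returned_after in click_log:
    clicks[result_id] += 1
    if returned_after is not None and returned_after <= BOUNCE_WINDOW_SECONDS:
        bounces[result_id] += 1

for result_id, total in clicks.items():
    print(f"{result_id}: bounce-back rate {bounces[result_id] / total:.0%}")  # lower is better
```

In this toy log the ad result bounces back less often than the organic one, which in Friedberg's telling is exactly the signal that would let the ad sit on top.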
So it's actually better for the user in this case than say just getting an answer. 00:51:06.840 |
For example, I'm looking for a PlayStation 5. 00:51:09.720 |
I don't want to just be told, "Hey, go to Best Buy and buy a PlayStation 5." 00:51:12.840 |
I want to be taken to the checkout page to buy a PlayStation 5. 00:51:17.120 |
I am more likely to be happy if I click on a result and it immediately takes me to the 00:51:20.920 |
checkout page and Best Buy is really happy to pay for you to get there because they don't 00:51:24.560 |
want you looking around the internet looking for other places. 00:51:30.600 |
Not all search queries are, "Hey, what's the best dog to get to not pee on the floor?" 00:51:36.120 |
Whatever kind of arbitrary question you might have that you're doing research on. 00:51:39.200 |
Many search queries are commerce intention related. 00:51:49.580 |
That series of queries may have a very different kind of modality in terms of what's the right 00:51:53.680 |
interface versus the chat GPT interface where there's a lot of organic results that people 00:52:01.800 |
The question earlier can be resolved by Google doing a simple analytical exercise which is, 00:52:07.760 |
"What's it going to cost us and what's going to give the user the best result?" 00:52:11.640 |
That's ultimately what will resolve to the better business model. 00:52:15.120 |
I think on Chamath's point, today Google pays Apple $15 billion a year to be the default 00:52:22.540 |
search engine on iPhones on the Safari browser. 00:52:26.940 |
That's only about a quarter of Google's overall TAC. 00:52:29.460 |
The majority of Google's traffic acquisition cost is actually not being paid for search. 00:52:33.460 |
A good chunk of that is being paid to publishers to do AdSense display ads on their sites and 00:52:39.100 |
Google's rev share back to them for putting ads on their sites. 00:52:42.420 |
The TAC number, I think, maybe you want to move the needle, but the majority of Google 00:52:46.800 |
searches don't come through the default search engine that they pay Apple to be on. 00:52:51.260 |
It might move the needle a bit, but I don't think it really changes the equation for them. 00:52:55.020 |
My comment is more that TAC has to become a weapon on the forward foot, number one. 00:53:01.060 |
If you're going to spend 21% of your revenue on TAC, you should be willing to spend 30 00:53:08.660 |
I don't think what you want to see is your profit dollars decay because you lose share. 00:53:13.380 |
It's rather better for you to spend the money and decay your business model than have someone 00:53:19.500 |
At this point, Apple is really the only TAC line item for search. 00:53:28.740 |
You have an entire sales team whose job it is right now to sell AdSense. 00:53:32.900 |
You have an entire group of people who know how to account for TAC and how to think about 00:53:37.740 |
But if you're basically willing to say, "Out of the $100 billion of free cash flow, I'm 00:53:42.500 |
willing to go to 80 or 70 billion of free cash flow combined with the 100 billion of 00:53:48.900 |
short and long-term investments I have and I'm going to use it as a weapon. 00:53:53.540 |
I'm going to go and make sure that all of these publishers have a new kind of agreement 00:53:57.660 |
that they signed up for, which is I'll do my best to help you monetize. 00:54:02.380 |
You do your best by being exclusive to our AI agents." 00:54:06.700 |
You deprive other models of your content on your pages because that will get litigated 00:54:13.020 |
and there is no way, just like again, if you say, "Do not crawl," you're not allowed to 00:54:17.140 |
crawl if you're Google or Microsoft researchers. 00:54:23.220 |
My point is Google should do this and define how it's done before it's defined for them 00:54:28.780 |
because right now people are in this nascent phase where everybody thinks everybody's going 00:54:36.940 |
It's a really important kind of philosophical question. 00:54:39.340 |
First off, Google today, just so people know, on AdSense is typically paying out 70 cents 00:54:47.060 |
It's pretty generous, and it's the way they've kind of kept the competitive moat wide and 00:54:52.300 |
kept folks out of beating them on third-party ad network bids because they bid on everything 00:54:57.500 |
and they always win because they always share the most revenue back. 00:55:01.820 |
When it comes to acquiring content, the internet is open. 00:55:06.380 |
Anyone can go to any website by typing in the IP address and viewing the content that 00:55:10.180 |
a publisher chooses to make available on that server to display to the internet. 00:55:18.380 |
I think 15 or 20 or 30% of the pages on the internet right now are apps that are closed. 00:55:27.340 |
I'm saying the open internet matters less and less. 00:55:32.220 |
Maybe there's the enhancement of the models, but my point being that if the internet is 00:55:35.580 |
open and you and I spent a billion lifetimes reading the whole internet and getting smart, 00:55:41.180 |
and then we were the chatbot and someone came and asked us a question and we could kind 00:55:44.460 |
of answer their question because we've now read the whole internet, do I owe licensing 00:55:50.260 |
royalty revenues to the knowledge that I gained and then the synthesis that I did, which ultimately 00:55:55.300 |
meant excluding some things, including some things, combining certain things, and the 00:55:59.500 |
problem with these LLMs, these large language models, is that you end up with 100 million, 00:56:05.260 |
a billion plus parameters that are in these models that are really impossible to deconvolute. 00:56:12.340 |
We don't really understand deeply how the neural network, the model, is defined and 00:56:17.820 |
run based on the data that it is constantly kind of aggregating and learning from. 00:56:22.280 |
And so to go in and say, hey, it learned a little more from this website and a little 00:56:25.260 |
less from that website is a practical impossibility. 00:56:28.180 |
I'm saying when you look at transformer architecture today, every LLM that you write on the same 00:56:33.900 |
corpus of underlying data for training will get to the same answer. 00:56:37.820 |
So my point is today, if you're a company, the most important thing that you can do, 00:56:43.660 |
especially if you have a $1 trillion plus market cap that could get competed away, is 00:56:51.100 |
And so all I'm saying is from the perspective of a shareholder of Google and also from the 00:56:55.340 |
perspective of the board of director or senior executive or the CEO, this should be the number 00:57:02.380 |
And my framing of how to answer that question is build a competitive moat around two things. 00:57:09.940 |
One is at the end, which is how much money and what kind of relationship do I have with 00:57:16.820 |
my customers, including the publishers, and can I give them more so that number two is 00:57:24.700 |
I can affect who they decide to contribute their content to. 00:57:29.740 |
Let's assume that there are five of these infinite libraries in the world. 00:57:37.700 |
How important is it if Quora says, you know what guys, I've done a deal where my billions 00:57:43.420 |
of page views and all of that really rich content, Quora's incredible content, Google's 00:57:48.060 |
paying me $2 billion a year and so I've decided to only let Google's AI agents crawl it. 00:57:53.180 |
And so maybe when there are questions that Quora is already doing a phenomenal job of 00:57:59.060 |
answering, I think it does make a difference that Google now has access to Quora's content 00:58:06.580 |
For a hot minute, they did have access to the Twitter firehose, and the premise 00:58:09.660 |
was we could get this corpus of data that we can have in a very limited, restricted 00:58:16.460 |
I don't think that those deals exist anymore. 00:58:18.340 |
Twitter, I mean, you guys might know better than I do, but I don't think they exist anymore. 00:58:23.900 |
And maybe, Dave, I think you're being too forgiving. 00:58:27.920 |
These models know where they got the data and they can easily cite the sources and they 00:58:35.300 |
And if you want to say something, hold, go ahead. 00:58:39.460 |
The video in the Wall Street Journal where Satya was interviewed showed a demo and you're 00:58:44.980 |
They actually showed, Jason, in the search results, the five or six, but it made no sense because 00:58:49.860 |
it's like, how do you know that those are the five most cited places that resulted in 00:58:55.020 |
Well, by PageRank technology or the authority of the website or the author. 00:59:06.060 |
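The ranking mechanism being gestured at here is link-based authority scoring of the PageRank variety. Below is a minimal sketch, assuming a toy link graph (the sites and links are hypothetical), of how power iteration surfaces the most-cited sources; it is an illustration of the idea, not Google's production system.

```python
# Minimal PageRank sketch (power iteration) over a hypothetical toy link graph.
import numpy as np

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = sorted(links)
    n = len(pages)
    idx = {p: i for i, p in enumerate(pages)}

    # Column-stochastic transition matrix: column j spreads page j's score
    # evenly across the pages it links to (or across all pages if it has none).
    M = np.zeros((n, n))
    for p, outs in links.items():
        if outs:
            for q in outs:
                M[idx[q], idx[p]] = 1.0 / len(outs)
        else:
            M[:, idx[p]] = 1.0 / n

    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        rank = (1 - damping) / n + damping * M @ rank
    return dict(zip(pages, rank))

# Hypothetical toy web: the review sites are cited by the blogs.
links = {
    "rtings.com": ["wirecutter.com"],
    "wirecutter.com": ["rtings.com"],
    "blogA.com": ["rtings.com", "wirecutter.com"],
    "blogB.com": ["rtings.com"],
}
print(pagerank(links))  # the heavily-cited review sites end up ranked highest
```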
They have 78 employees, I think, according to LinkedIn. 00:59:08.900 |
I just typed in one of the best flat panel TVs. 00:59:12.140 |
And as you see, sentence by sentence, as it rewrites another person's content, it links 00:59:18.780 |
with a citation, just like Wikipedia does. 00:59:21.660 |
And when you scroll to the bottom of it, it tells you, hey, this is from Rolling Stone. 00:59:26.340 |
And if that answer is good for you and you trust those sources, those people should get 00:59:31.100 |
Every time there's a thousand searches and you come up, you should get a dollar every 00:59:36.620 |
And if not, these sites should sue the daylights out of Google. 00:59:40.380 |
And for Google to say they can't do it is hogwash. 00:59:44.980 |
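The payout being proposed works out to a flat rate per thousand citations, roughly a CPM for cited sources. A quick back-of-envelope using the $1-per-1,000 figure from the proposal; the publisher and its citation volume below are hypothetical.

```python
# Back-of-envelope for the proposed payout ($1 per 1,000 answers that cite you).
# All figures are hypothetical illustrations, not actual rates.
payout_per_thousand_citations = 1.00        # dollars, per the proposal
monthly_citations = 5_000_000               # hypothetical publisher, e.g. a TV-review site
monthly_payout = monthly_citations / 1_000 * payout_per_thousand_citations
print(f"${monthly_payout:,.0f} per month")  # -> $5,000 per month at that rate
```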
It's not fair use because in fair use, you have the ability to create derivative works 00:59:52.740 |
And you are taking this person's, the original content owner's, ability to exploit that, and 00:59:57.820 |
you are co-opting it and you're doing it at scale. 01:00:01.580 |
You're not allowed to interfere with my ability to make future products. 01:00:11.420 |
The problem with that idea, just from a product perspective for a second, is that if you limit 01:00:18.060 |
how they can tokenize to just whole sentences, the product will not be that good. 01:00:23.820 |
Like the whole idea of these LLMs is that you're running, you know, so many iterations 01:00:28.100 |
to literally figure out what the next best word is that comes after this other word. 01:00:33.260 |
And if you're all of a sudden stuck with blocks of sentences as inputs that can't be violated 01:00:37.500 |
because of copyright, the product will not be as good. 01:00:43.220 |
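The next-word mechanic being described can be illustrated with a toy bigram counter. Real LLMs use transformer networks over sub-word tokens and billions of parameters, but the point that training operates word by word, rather than on intact protected sentences, is the same; the corpus below is made up.

```python
# Toy illustration of next-word prediction; not a real LLM, just the mechanic.
from collections import Counter, defaultdict

corpus = "the best flat panel tv is the one with the best panel".split()

following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1          # count which word tends to follow which

def next_word(word):
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(next_word("best"))   # -> 'flat' or 'panel', whichever was counted more
```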
These are also not deterministic models and they're not deterministic outputs, meaning 01:00:46.340 |
that it's not a discrete and specific answer that's going to be repeated every time the same query is run. 01:00:54.240 |
So they infer what the right answer could or should be based on a corpus of data and 01:00:58.460 |
a synthesis of that data to generate a response to a query. 01:01:02.380 |
That reference, that inference is going to be, you know, assigned some probability score. 01:01:07.740 |
And so the model will resolve to something that it thinks is high probability, but it 01:01:11.020 |
could also kind of say, there's a chance that this is the better answer, this is the better 01:01:15.300 |
And so when you have like you have in the internet competing data, competing points 01:01:19.140 |
of view, competing opinions, the model is synthesizing all these different opinions 01:01:23.980 |
and doing what Google search engine historically has done well, which is trying to rank them 01:01:27.940 |
and figure out which ones are better than others. 01:01:31.820 |
And so if, as part of that ingest process, one is using some open, openly readable data 01:01:37.260 |
set, that doesn't necessarily mean that that data set is improving the quality of the output 01:01:42.500 |
or is necessarily where the answer in the output came from. 01:01:46.780 |
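A minimal sketch of the non-determinism being described: the model scores candidate continuations and samples from that distribution, so the same question can resolve to different answers on different runs. The candidate answers and probabilities below are invented for illustration.

```python
# Sketch of sampling from a probability distribution over candidate answers.
# The candidates and weights are made up for illustration.
import random

candidates = {
    "The LG C2 OLED is widely recommended.": 0.55,
    "The Samsung S95B is a strong pick.":    0.30,
    "It depends on your budget and room.":   0.15,
}

def sample_answer(rng):
    answers, probs = zip(*candidates.items())
    return rng.choices(answers, weights=probs, k=1)[0]

rng = random.Random()        # unseeded: two runs can disagree
print(sample_answer(rng))
print(sample_answer(rng))    # may differ from the first call
```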
Let me just give everybody a quick four factor education on fair use. 01:01:51.320 |
And here it is from Google's actual website, because they deal with this all the time. 01:01:55.580 |
And when you look at the nature and purpose and character of the use, including whether 01:01:59.740 |
such use is for nonprofit or educational purposes. 01:02:03.220 |
So that first test of fair use is, hey, if you're using it educationally, and you want 01:02:07.260 |
to make a video that is criticism of the Star Wars prequels, or how to shoot a shot like 01:02:14.580 |
Quentin Tarantino does, if it's educational, it's fine. 01:02:17.540 |
And courts typically focus, I'm reading here from Google, on whether the use is transformative. 01:02:21.540 |
That is, whether it adds new expression or meaning to the original, or whether it merely copies from the original. 01:02:26.240 |
It's very obvious that this is not transforming if they're just rewriting it. 01:02:29.720 |
The nature of the copyright is pretty transformative to me. 01:02:34.320 |
They're coming out with entirely new content. 01:02:47.280 |
Listen, Jacob, I think the rights issue is just like the cost issue, which is a problem 01:02:54.040 |
today maybe but it's going to get sorted out. 01:02:57.600 |
New technology waves that are this powerful don't get stymied by either chip costs or rights issues. 01:03:07.800 |
YouTube got stopped dead in their tracks and the only way YouTube and Napster got stopped 01:03:11.400 |
in the tracks, I predict this is going to get stopped dead in its tracks with YouTube 01:03:19.340 |
And Google enabled piracy and then they had to build tools to fight against it. 01:03:25.840 |
You guys both think that cost is going to stay in the way. 01:03:27.840 |
Nothing's going to stay in the way of the AI. 01:03:32.840 |
Big companies using AI to inst- The AI is already happening. 01:03:40.840 |
The effect of the use of harm- Are you going to bill us $500 an hour for 01:03:46.840 |
I've heard this point of view before from you. 01:03:48.840 |
Okay, take it easy, Mr. Sub- Better call J-Cal. 01:03:51.840 |
The effect of the use upon the potential market for or value of the copyright work. 01:03:52.840 |
The use case that I saw this past week in a product demo 01:05:19.840 |
was, they were showing me an Excel spreadsheet, 01:05:24.840 |
like Excel spreadsheet modeling of a financial asset. 01:05:29.840 |
And they had a plug-in to a chat GPT type AI. 01:05:39.840 |
And they were like, "Hey, I'm going to show you 01:05:54.840 |
And they were like, "Okay, so let's do this." 01:06:04.840 |
And they were like, "Okay, so let's do this." 01:06:09.840 |
And the chat GPT spat out a formula that was like perfect Excel logic 01:06:14.840 |
that was something that you or I could never figure out. 01:06:19.840 |
You need a super pro user of Excel to basically know how to do this stuff. 01:06:24.840 |
So it spit it out and boom, it worked instantly. 01:06:29.840 |
And what we're thinking about is that we're going to have these little assistants everywhere. 01:06:34.840 |
You combine that power with, say, speech to text, right? 01:06:39.840 |
Because we could have just talked to it, the speech to text would transcribe the instruction, 01:06:44.840 |
spit it back out, and you're going to have these little personal digital assistants in applications. 01:06:49.840 |
I think it's pretty obvious to see how AI could replace call centers 01:06:54.840 |
with having the frontline call center operator be, instead of being a human, it could be like an AI. 01:07:04.840 |
I think in every single application that we use, there's going to be an AI interface 01:07:09.840 |
and it's probably going to be voice based where you can just say to it, 01:07:14.840 |
"Hey, I'm trying to accomplish this. How do I do it? Can you just make it happen?" 01:07:19.840 |
I have an idea. I was hanging out with Andrej Karpathy and I gave him this following challenge. 01:07:27.840 |
I said, "If you had to build Stripe, how many engineers do you think it would take you 01:07:32.840 |
and how long would it take you to build a competitor?" It was just a thought exercise. 01:07:36.840 |
It would take hundreds of millions of dollars and years. 01:07:40.840 |
Now imagine you were feeling threatened by Stripe. 01:07:45.840 |
Imagine you're a large company. Visa MasterCard, just as an example. 01:07:50.840 |
You can now actually get one or two really smart people like him 01:07:55.840 |
to lead an effort where you would say, "Here's a couple hundred million dollars 01:08:00.840 |
to compete with Stripe, but here are the boundary conditions. Number one is you can only hire five or ten engineers." 01:08:05.840 |
What you would do is you would actually use tools like this to write the code for you. 01:08:10.840 |
The ability to write code is going to be the first thing 01:08:15.840 |
that these guys do incredibly well with absolute precision. 01:08:20.840 |
You can already do unit testing incredibly well, but it's going to go from unit testing to basically end-to-end testing. 01:08:25.840 |
You'll be able to build a version of Stripe extremely quickly and in a very lean way. 01:08:30.840 |
So then the question is, "Well, what would you do with the two or three hundred million you raised?" 01:08:40.840 |
You go to customers and you're like, "Well, listen, if Braintree is going to charge you one basis point 01:08:45.840 |
over Visa MasterCard, or sorry, a hundred basis points and Stripe will 01:08:55.840 |
Totally. And this is going to make everything. 01:08:57.840 |
That's what's so interesting. You can take any business that's a middleman business. 01:09:01.840 |
This is the point. Any middleman business right now that doesn't have its own competitive moat 01:09:06.840 |
can be competed against because now you can take all of those input costs that go into human capital, 01:09:11.840 |
you can defer that, have a much smaller human capital pool, and push all of that extra money 01:09:19.840 |
This goes to the movement in Silicon Valley of being more efficient. 01:09:23.840 |
The net benefit of all of this is economic productivity because the end customer that's using that tool that you just mentioned, 01:09:28.840 |
they now have a lower cost to run their business and their total net profits go up. 01:09:32.840 |
And this is what happens with every technology cycle. 01:09:35.840 |
It always yields greater economic productivity and that's why the economy grows. 01:09:39.840 |
And that's why, I just want to say, this is so important. 01:09:41.840 |
That's why technology is so important to drive economic growth, not debt. 01:09:46.840 |
We've historically used financial engineering to drive economic growth. 01:09:55.840 |
What we're describing here is AI making it easier and easier to make humans productive. 01:10:09.840 |
One human can do the coding that 20 humans would do before. 01:10:13.840 |
Assuming they're skilled enough to use the AI. 01:10:16.840 |
Chat, chat prompts are easier than programming. 01:10:19.840 |
Go back to traditional capitalism for a second. 01:10:21.840 |
Go back because the Stripe example is another good one. 01:10:24.840 |
If you have this business model, how does the ecosystem get efficient? 01:10:30.840 |
How do we create more opportunity to use Friedberg's language? 01:10:33.840 |
The only way that it really happens, how does cost go down, is that certain entities become 01:10:38.840 |
big enough that they can drive the prices down. 01:10:40.840 |
An Uber, a DoorDash, or whomever says, "Yes, I need payments capability, so Braintree, 01:11:00.840 |
But, you've never had an internal form that can create hyper-efficiency and basically 01:11:05.840 |
create customer value like this thing can because this thing can allow a billion, jillion 01:11:12.840 |
10-person companies to get created that can do the work of 10,000 people. 01:11:26.840 |
Here they say, "An onboarding screen of a dog walking app." 01:11:29.840 |
You type that in, and it gives you a welcome screen. 01:11:33.840 |
Then it says, "Oh, a way for people to change their name, phone number, and password." 01:11:36.840 |
You know, that classic screen on any app, "I need to change my thing." 01:11:40.840 |
Well, then at the same time that people are making text to UX, user interface, beautiful 01:11:49.840 |
Then there's GitHub Copilot, which if you haven't seen, we all know it here. 01:11:56.840 |
GitHub Copilot, GitHub was bought by Microsoft, another one of Satya Nadella's incredible acquisitions. 01:12:06.840 |
I mean, what an incredible person to come after Ballmer, who's just so effective at what he's doing. 01:12:13.840 |
As you're writing your code, it fills in your code. 01:12:16.840 |
It knows what you're writing, just like in an email. 01:12:18.840 |
It's a smaller subset of information than email. 01:12:21.840 |
You could be talking to a lover or a business person, whatever. 01:12:25.840 |
Here, when you're doing programming, it's a much finer dataset. 01:12:29.840 |
These two things are going to come together where you're going to be able to build your 01:12:32.840 |
MVP for your startup by typing in text and then publish it. 01:12:36.840 |
You're not going to need a developer for your startup. 01:12:42.840 |
Do you think that this leverage, and I argue this is all about leverage, it's one person 01:12:50.840 |
Do you guys think that this commoditizes and puts at risk, Sacks in particular, like all 01:12:56.840 |
of enterprise SaaS, because it becomes such a commodity to basically build a business 01:13:02.840 |
Or does it create much higher returns for investors because you invest so much less 01:13:05.840 |
capital to get to a point of productivity with that business or revenue of that business 01:13:13.840 |
I tend to think that if it gets easier, then everything becomes more competitive. 01:13:23.840 |
By the way, I would slightly disagree with the characterization. 01:13:29.840 |
For example, in that demo we just saw, what does the AI do? 01:13:33.840 |
It exports the new design assets to Figma format. 01:13:38.840 |
So, if you can create a SaaS product that becomes a standard, everyone's still going to need it. 01:13:48.840 |
I don't think business software is going away. 01:13:50.840 |
I also don't think that you're not going to need to hire engineers because Copilot's 01:13:55.840 |
I think what Copilot will do is make your typical engineer more productive. 01:13:59.840 |
They had some graphs around how Copilot reduced coding time by 50%. 01:14:05.840 |
So, I think you'll be able to get a lot more out of your developers. 01:14:08.840 |
I think that's sort of the key is a lot of the drudgery work gets taken care of. 01:14:13.840 |
The answer, Friedberg, to your question is I think more startups, more niche startups 01:14:17.840 |
will make better products, and then you'll just have many more folks making SaaS for 01:14:32.840 |
If you're investing at $5 million like I do in companies when they're just on napkins 01:14:36.840 |
and, you know, back of envelope, there's plenty of room. 01:14:38.840 |
If you have a $200 million exit, that's a 40x. 01:14:41.840 |
But there could be a lot of new categories too. 01:14:45.840 |
There could be a lot of traditional industries. 01:14:47.840 |
There's a lot of traditional industries that get disrupted. 01:14:49.840 |
Like, we're thinking about just software displacing software. 01:14:52.840 |
It could be software displacing, like, industries that aren't even software yet. 01:14:57.840 |
I think the entire video game industry is going to get completely rewritten with AI 01:15:00.840 |
because you're not going to have a publisher anymore that makes one game that everyone consumes. 01:15:03.840 |
You're going to have tools that everyone creates and consumes their own game. 01:15:35.840 |
David Guetta playing a track with Eminem's voice, right? 01:15:42.840 |
This is something that I made as a joke, and it works so good I could not believe it. 01:15:46.840 |
I discovered those websites that are about AI. 01:15:50.840 |
Basically, you can write lyrics in the style of any artist you like. 01:15:57.840 |
So I typed, "Write a verse in the style of Eminem about future rave." 01:16:03.840 |
And I went to another AI website that can recreate the voice. 01:16:10.840 |
I put the text in that, and I played the record, and people went nuts. 01:16:21.840 |
Whoever makes the best Friedberg Eminem hybrid rap with David Sacks as the hype man is getting 01:16:29.840 |
I think we need an AI performance at All In Summit 2023. 01:16:34.840 |
You and I are so in sync these days, Friedberg. 01:16:36.840 |
There are going to be a lot of interesting mashups that get created. 01:16:39.840 |
For example, you'd be able to create a movie where, let's say you want to make a Western with John Wayne. 01:16:46.840 |
You obviously get the rights from the Wayne estate, but no actor ever goes away. 01:16:50.840 |
There's going to be a database of all of them. 01:16:56.840 |
As you write the script, the AI is going to be showing you that scene in real time. 01:17:05.840 |
Think about what that would do for Star Wars. 01:17:07.840 |
Just like Instagram and TikTok basically democratize everyone's ability to create and publish content. 01:17:15.840 |
This takes it to a whole other level where the monopoly that big production houses have 01:17:21.840 |
is they have the big budget so they can afford to make a big movie. 01:17:24.840 |
If that cost of making a $10 million movie goes to $10,000 or $1,000 of compute time, 01:17:28.840 |
anyone sitting in their studio in a basement can start to make a movie. 01:17:32.840 |
And it really changes the landscape for all media, not just movies, music, video games. 01:17:39.840 |
And ultimately, the consumers themselves can create stuff for their own enjoyment. 01:17:45.840 |
But having each individual create their own content-- 01:17:48.840 |
Maybe it's just become a lot more like the music industry. 01:17:50.840 |
Remember, anyone can really create a song now. 01:17:55.840 |
And they upload it on DistroKid, and it's on Spotify instantly. 01:17:58.840 |
Did you guys see this article in the New York Times that was kind of throwing some shade at the CEO of Goldman Sachs, David Solomon? 01:18:07.840 |
You look great in the picture, but I didn't read it because I figured it's hate. 01:18:10.840 |
They were talking about his side gig as being a DJ, but specifically they called out a potential conflict of interest. 01:18:18.840 |
And it's related to this because I guess he had gotten a license to a Whitney Houston song. 01:18:23.840 |
And he remixed it and released it on Spotify. 01:18:30.840 |
And they thought that there could be this perceived conflict because Goldman works on behalf of the publishing company. 01:18:36.840 |
And my thought was along the lines of what you guys said, like why is this a story? 01:18:40.840 |
Meaning David Solomon should be able to go to any website, license the song, make it, and then submit it back to them for them to approve. 01:18:48.840 |
Because the quote in here that matters is the company that licensed it said, "We are in the business of making sure this body of music stays relevant." 01:18:56.840 |
So obviously you want Whitney Houston songs, Michael Jackson songs. 01:19:01.840 |
You want these people to live on in culture because it's part of our culture. 01:19:10.840 |
The better way to maximize it would be like this. 01:19:15.840 |
So like Guetta should be able to just give that back to Eminem. 01:19:15.840 |
If Eminem's cool with it, he should be able to ship it and just be done. 01:19:22.840 |
Like the New York Times is just so anti-billionaire. 01:19:25.840 |
So David Solomon brushes off DJing as a minor hobby that has little to do with his work at the bank. 01:19:30.840 |
But his activities may pose potential conflicts of interest. 01:19:36.840 |
Is there not something more important than this? 01:19:38.840 |
Do we think that David Solomon or any CEO of any major bank on Wall Street would put their job at risk running one of the 20 or 30 most important institutions in the financial architecture of the world to license a Whitney Houston song that they can play at Coachella? 01:19:57.840 |
And just for the record, iconoclastic David Solomon, you're going to be doing the opening night DJ set for All in Summit 2023. 01:20:09.840 |
If the New York Times hates him, anybody the New York Times hates on, you got a slot. 01:20:24.840 |
It just like offends them that a corporate CEO could DJ? 01:20:41.840 |
You could DJ one night at the All-In 2023 after party and we'll use AI to help you out. 01:20:41.840 |
Sadly, when I should have been learning how to DJ, I was playing the violin. 01:20:51.840 |
You should get up there and Urkel your violin under a DJ set. 01:21:04.840 |
This is all kind of revelations happening here. 01:21:07.840 |
By the way, here was my most, pull up that first chat GPT I gave you. 01:21:12.840 |
Me and Sacks were doing our weekly mastermind group. 01:21:18.840 |
Here was something that blew us away during our mastermind group. 01:21:23.840 |
I did this one because I was trying to figure this out. 01:21:37.840 |
So that's obviously the aggregation and synthesis of lots of different self-help websites. 01:21:42.840 |
How do you describe attribution of that answer to a particular content publisher? 01:21:51.840 |
I think in this case it would be compassionate. 01:21:57.840 |
Yeah, you say that because you hate content providers. 01:21:59.840 |
But here I tell you, this is compassionate publishing. 01:22:02.840 |
I think the GPT, ChatGPT, should be given a pass. 01:22:02.840 |
You have weird blinders on on this one, J. Cal, I'll be honest. 01:22:04.840 |
It's not pulling a result from someone's website. 01:22:15.840 |
It's like read hundreds of websites and it's like averaged them. 01:22:22.840 |
They could say, as we ingest this stuff, tag it. 01:22:27.840 |
When you publish it, you say, hey, where did we start? 01:22:29.840 |
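A minimal sketch of the tag-at-ingest idea being proposed here: keep a source URL on every ingested chunk, then attribute an answer to the closest-matching sources and split any payout among them. This is a retrieval-style approximation with made-up sources; as noted above, it does not recover attribution from a trained model's weights, which is the hard part.

```python
# Sketch: tag chunks with their source at ingest, attribute answers by overlap.
# Sources, chunks, and the scoring rule are hypothetical illustrations.
import re

ingested = [
    ("Practice active listening and ask open questions.", "https://example-selfhelp-a.com"),
    ("Set small, measurable goals each week.",             "https://example-selfhelp-b.com"),
    ("Be compassionate with yourself when you slip.",      "https://example-selfhelp-c.com"),
]

def words(text: str) -> set:
    return set(re.findall(r"[a-z']+", text.lower()))

def attribute(answer: str, top_k: int = 2) -> list:
    # Score each tagged chunk by word overlap with the answer and return the
    # URLs of the closest matches; those publishers would share any payout.
    scored = sorted(ingested, key=lambda item: len(words(answer) & words(item[0])), reverse=True)
    return [url for _, url in scored[:top_k]]

print(attribute("Be compassionate, listen actively, and set small goals."))
```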
You're going with J. Cal? Is that what you're saying? 01:22:32.840 |
That is what you said is exactly how it's being done. 01:22:35.840 |
And J. Cal, let's say that the AI is using a hundred different websites 01:22:41.840 |
What's the incentive for the marginal hundredth website to say, 01:22:48.840 |
Google or OpenAI will just be like, okay, fine. 01:22:53.840 |
And this is why content providers, this is my best piece of advice. 01:22:56.840 |
You ask the question, I'll give you the answer. 01:22:57.840 |
Content providers as a group need to get together and fight for their rights. 01:23:04.840 |
No, fight for the right to get paid and to survive. 01:23:07.840 |
I'll tell you what I think is going to happen. 01:23:08.840 |
They need to go to ChatGPT and say, as a group, 01:23:10.840 |
either give us these terms or don't index us. 01:23:13.840 |
You're trying to unionize all the content creators. 01:23:17.840 |
They should be a united front like the music industry. 01:23:19.840 |
Why do you think the music industry gets paid by Peloton and anybody else? 01:23:22.840 |
Because it was easy for them to get together and they fight. 01:23:26.840 |
but there's millions and millions of publishers. 01:23:28.840 |
I think the point here is that technology is fundamentally deflationary. 01:23:32.840 |
Here's the next great example where the minute you make something incredible, 01:23:39.840 |
revenue and profit dollars go down in the aggregate. 01:23:42.840 |
It doesn't mean that one company can't husband a lot of it 01:23:47.840 |
but it's just going to fundamentally put pressure on all these business models. 01:23:54.840 |
Google should go and they should cannibalize their own business 01:24:02.840 |
I think that if this goes as we all predict and everyone's saying it's going to go, 01:24:07.840 |
it is more likely than not that many of these "content publishers" 01:24:10.840 |
that aren't adding very much marginal value are going to go away. 01:24:13.840 |
That you could see the number of content sites offering self-help advice 01:24:18.840 |
95% of them go away because all of that work gets aggregated 01:24:22.840 |
and synthesized and presented in a really simple, easy user interface 01:24:28.840 |
I'm not discrediting the value that many content publishers provide, 01:24:31.840 |
but the requisite at that point to be valued as a "novel content producer" 01:24:39.840 |
The offset to that though is it's so much easier to create content 01:24:53.840 |
You just give it a title and it spits out a post. 01:24:56.840 |
They'll actually give you 10 different blog posts 01:24:59.840 |
and then you just select the one that is the direction you want to go 01:25:05.840 |
But how does new intelligence get put back into the system? 01:25:08.840 |
That's based on an existing corporate system. 01:25:10.840 |
No, you have a corporate blog, so you publish it to your corporate blog. 01:25:13.840 |
My point is, if there is now some new information in the world, 01:25:18.840 |
if everybody is just stealing content and rewriting it? 01:25:26.840 |
There's a head of the long tail that actually gets compensated. 01:25:29.840 |
The rest of the long tail has traditionally gotten nothing 01:25:34.840 |
And now the creation is going to explode because it's so easy. 01:25:42.840 |
and he built on the experience of listening to Haydn. 01:25:44.840 |
The same is true of how content is going to evolve 01:25:47.840 |
and it's going to evolve in a faster way because of AI. 01:25:50.840 |
This content is not just being retrieved and reproduced, 01:25:53.840 |
it's being synthesized and aggregated and represented in a novel way. 01:26:06.840 |
and they represent it based on the best content that's out there 01:26:11.840 |
because they have that algorithm with PageRank 01:26:18.840 |
That is profoundly unfair and not what we want for society 01:26:36.840 |
Sequoia and the YouTube founders sold it to Google 01:26:39.840 |
because they were so scared of the Viacom lawsuit 01:27:01.840 |
It's because they let content creators watermark 01:27:04.840 |
and find their stolen content and then claim it 01:27:11.840 |
There'll be a settlement where they are going to be able 01:27:16.840 |
I will bet any amount against your premonition here, J. Cal. 01:27:20.840 |
This is like the opposite of Nostradamus. 01:27:24.840 |
You've seen these AIs that generate images, right? 01:27:27.840 |
Like Stable Diffusion and, like, DALL-E or whatever. 01:27:34.840 |
and it would take an artist weeks to produce that 01:27:39.840 |
and you could tell the AI, "Give me 20 of those," 01:27:43.840 |
and in five minutes, you've got something mind-blowing. 01:27:45.840 |
So the fact that it's so much easier to create content, 01:27:48.840 |
you can do the same thing with the written word. 01:27:50.840 |
The people who need to be compensated, J. Cal, 01:27:52.840 |
if they don't get what they want, they may just go away, 01:27:53.840 |
but there'll be 10 times or 100 times more people 01:28:01.840 |
Getty Images is suing Stable Diffusion at the moment. 01:28:04.840 |
Here is what the dipshits at Stable Diffusion did. They trained on Getty's images 01:28:10.840 |
with the watermarks on them and they've been busted 01:28:14.840 |
and they are going to pay $100 million or more 01:28:20.840 |
and allowing it to be republished in a commercial setting. 01:28:28.840 |
Stable Diffusion copied the Getty Images watermark 01:28:34.840 |
Okay, I think Stable Diffusion's bigger problem 01:28:48.840 |
By the way, Mozart influenced by Haydn, not Beethoven. 01:28:52.840 |
By the way, there's a really interesting topic about AI 01:28:57.840 |
but I think we should put it on the docket for next week, 01:29:08.840 |
The last thing I'll say on this, from my perspective, 01:29:17.840 |
because Microsoft taking 500 or 600 basis points of share 01:29:31.840 |
5% or 6% of the market share is a really good thing 01:29:40.840 |
I mean, it's kind of a good news/bad news scenario 01:29:48.840 |
and even a bigger monopoly that's the one that's done it. 01:29:56.840 |
And they may all end up competing with each other. 01:29:58.840 |
What's great is everyone's got a tactical nuclear weapon. 01:30:01.840 |
and we don't know where it's going to get pointed 01:30:23.840 |
You guys need to watch what's happening right now. 01:30:31.840 |
because CoPilot was built off of stolen content. 01:30:44.840 |
It'll change the dynamics of YouTube in the long term, 01:30:49.840 |
We have a portfolio company called Sourcegraph, 01:30:53.840 |
if you're going to go through the whole portfolio? 01:30:59.840 |
You just get all your customers to opt into it. 01:31:36.840 |
on how scared I am about kind of where we're headed 01:31:46.840 |
And the scary moment at the State of the Union, 01:31:52.840 |
which was honestly a really discouraging sight to see, 01:32:31.840 |
and we don't change the tax rates in this country, 01:32:46.840 |
And so, you know, there are two ways this can go. 01:32:54.840 |
expense commitments that naturally balloon over time, 01:32:59.840 |
And the other one is that you just tax a lot more, 01:33:05.840 |
and it makes it really hard to eventually pay off that debt, 01:33:17.840 |
who want to cut Social Security and Medicare," 01:33:23.840 |
and said it's total BS that Biden would say that, 01:33:25.840 |
which I think supports what the polls have shown, 01:33:36.840 |
where they pushed back the retirement age by two years, 01:33:38.840 |
and there were effectively riots across the country. 01:33:40.840 |
I don't know if you guys saw this a few weeks ago. 01:33:45.840 |
And so this is a real cost that's coming to bear. 01:33:56.840 |
because they're not going to let those things go bankrupt, 01:33:58.840 |
and that's another trillion plus of liabilities. 01:34:11.840 |
It is literally, if you keep Social Security and Medicare 01:34:13.840 |
where they are, and you don't pay down the debt, 01:34:19.840 |
across corporate and the individual taxpayer base. 01:34:24.840 |
And so, you know, it really, again, if you zoom out, 01:34:33.840 |
there's less to invest in the economic growth. 01:34:39.840 |
and that means that we can't grow our way out 01:34:45.840 |
And this is what Dalio's book that I mentioned in 2021 01:34:53.840 |
and the last couple decades get really nasty. 01:34:58.840 |
from the actual US Treasury, highlights the problem. 01:35:01.840 |
And the comments made in front of the Congress 01:35:03.840 |
this week by the President of the United States 01:35:05.840 |
indicates how serious of a problem this is going to be 01:35:07.840 |
because no one wants to cut these major cost obligations. 01:35:34.840 |
is that Biden was basically trying to take a page 01:35:50.840 |
It was like school uniforms and things like that 01:35:55.840 |
and that regular middle-class people could get behind. 01:36:06.840 |
He went all the way back to Dole's vote against Medicare. 01:36:29.840 |
over the last two years that we've really ever had 01:36:32.840 |
They're going to try and make everyone forget that 01:36:35.840 |
and just talk about the small, objectionable stuff. 01:36:39.840 |
pose as the stalwart defender of entitlement programs, 01:37:06.840 |
that had some of this entitlement reform in it. 01:37:32.840 |
And I think Trump had the right instincts on this, 01:37:54.840 |
and the rest of the Republican caucus is like, 01:37:58.840 |
And they can't run away fast enough from Rick Scott. 01:38:20.840 |
- Chamath, any thoughts on the State of the Union writ large? 01:38:26.840 |
- Marjorie Taylor Greene yelling liar at the President? 01:38:40.840 |
at various points in his speech. It was quite 01:38:42.840 |
bizarre. And you're right, there were some Republicans 01:38:50.840 |
disappointed. - I think if the Republicans had just calmed down, 01:39:08.840 |
- It's really incredible. Republicans just, they have 01:39:24.840 |
And Mitt Romney telling that other liar guy to get 01:39:30.840 |
That's just the reality. That does not speak for the 01:39:44.840 |
billionaires, totally cool. If it's for centimillionaires 01:39:50.840 |
- We want those people to grow their wealth and invest. 01:39:52.840 |
The billionaires, yeah, sure. - Is that what you think 01:39:58.840 |
supposed to do? - I would love to see a website 01:40:00.840 |
where you have a piece of yarn. You know those 01:40:04.840 |
pin and you kind of push the pin out and the other one 01:40:12.840 |
have this level of spending. It's impossible. 01:40:18.840 |
Or you have to cut the entitlement programs, or you have 01:40:22.840 |
right about that, Friedberg, but let me tell you why the American 01:40:26.840 |
it would be ridiculous to cut their entitlements. 01:40:46.840 |
interests of the Democratic Party. - Did you just say the vaccine 01:40:50.840 |
- We talked about that. Let me bring it back up. 01:40:56.840 |
Hold on. They've watched as trillions of dollars 01:41:22.840 |
about the yearly cycle and the election cycle on this 01:41:24.840 |
stuff and just look at where we're headed over 01:41:28.840 |
And there's just no resolution to this problem 01:41:30.840 |
based on the way we're all oriented right now. 01:41:32.840 |
- If there is a wealth tax, let's just say on 01:41:34.840 |
billionaires or billionaires... - Let me just say one thing. 01:41:36.840 |
Sorry, Jacob. - What would they do? Would they leave the country? 01:41:40.840 |
That did happen in France, by the way. I think 40% 01:41:50.840 |
litigated for sure. - It'll be a Supreme Court, yeah. 01:42:04.840 |
have a fighting chance because you can actually grow the 01:42:10.840 |
about opportunities like Fusion and you guys can make 01:42:12.840 |
fun of me all you want. But if we can get to a 01:42:20.840 |
from that, from all these new industries and these 01:42:22.840 |
production systems. And that's how we can grow 01:42:26.840 |
entitlement, tax problem where one of those three 01:42:30.840 |
- You know what the cheapest source of energy is? 01:42:32.840 |
You know what the cheapest source of energy is today? 01:42:48.840 |
- Yeah, the problem, Chamath, honestly, let me just 01:42:54.840 |
- Yes. - And can you do it fast enough? And that's 01:42:58.840 |
creating jobs. - That's the real techno-economic 01:43:02.840 |
in my reading list today, this week, you guys can go 01:43:36.840 |
I'm just agreeing with you. I'm just building on your point to say 01:43:52.840 |
we have a path... - You can do it in 10 by putting 01:44:02.840 |
those three things is going to give, and it's going to be ugly. 01:44:04.840 |
- The thing that we are lacking right now is not 01:44:18.840 |
your boundary condition met, and we'll have it 01:44:34.840 |
Macron killed it. - They only had the wealth tax for 12 01:44:38.840 |
- They were just losing their tax base so violently 01:44:40.840 |
they had no choice. - That's what's happening in California. 01:44:42.840 |
- That's what's going to happen in California if they move forward with 01:44:44.840 |
this wealth tax. - It's happening in California already 01:44:46.840 |
without the wealth tax. - San Francisco. - Right. 01:44:48.840 |
- It's happening already, and now you look at it... - Well, because they're stupidly 01:44:56.840 |
talking about, "Hey, could this happen in the next 10 years?" 01:45:02.840 |
because if I wait... - They're going to chase you. 01:45:12.840 |
- It is a serious problem. - But let me tell you, I mean, to 01:45:36.840 |
stink. And the job of politicians is to manage 01:45:40.840 |
the politicians have not been doing a good job 01:45:46.840 |
So, yeah, at some point it's going to blow up. 01:45:56.840 |
So, how much the city spends per year divided 01:45:58.840 |
by the number of people that live in the city? 01:46:00.840 |
- Well, we know it's a couple of billion dollar budget, 01:46:02.840 |
we know only a couple of hundred thousand in there. 01:46:04.840 |
- It's a $13 billion budget. - Yeah, and it's 800,000 residents? 01:46:04.840 |
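The per-resident arithmetic implied here, using the figures quoted in the conversation (a roughly $13 billion annual budget and about 800,000 residents):

```python
# Per-resident spending, using the figures quoted in the conversation.
budget = 13_000_000_000      # dollars per year (quoted figure)
residents = 800_000          # quoted figure
print(f"${budget / residents:,.0f} spent per resident per year")   # -> $16,250
```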
except Oregon and North Dakota, which have very weird 01:46:48.840 |
every other state except two, and spends more 01:47:36.840 |
even further in 2020. Yeah, some people think it's down 01:47:42.840 |
a lot of people who owned homes there and maybe owned 01:47:44.840 |
second homes, because let's face it, it was a 01:47:46.840 |
well-heeled group of individuals living there. 01:47:48.840 |
And a lot of them just still have their places, 01:47:50.840 |
but they've left. And then they're in the process 01:48:08.840 |
opinions are. That's just the economic reality 01:48:10.840 |
of what happened in France. It's what's happening in San 01:48:12.840 |
Francisco. There's a lot of great predictors in 01:48:18.840 |
maybe you have to have a miracle, like an energy 01:48:22.840 |
keep investing for that. All right, everybody, 01:48:40.840 |
world's greatest moderator, J. Cal, and we'll 01:48:42.840 |
see you next time. Bye-bye. I love you, besties. 01:49:16.840 |
We should all just get a room and just have one big huge 01:49:20.840 |
It's like sexual tension, but they just need to release