
E115: The AI Search Wars: Google vs. Microsoft, Nord Stream report, State of the Union


Chapters

0:00 Bestie intro!
1:49 Report about US involvement in the destruction of the Nord Stream pipelines, breaking away from the military-industrial complex
28:26 Bestie refresh!
33:13 The AI Search Wars: Microsoft pressing hard, Google has a rough week
45:06 Sundar's next move: How should Google counterpunch? Google's troubled business model, will we see successful lawsuits over training data?
72:41 Will generative AI commodify enterprise SaaS? If so, what happens to VC returns?
91:14 Disappointing State of the Union, precarious situation between debt, taxes, and entitlements


00:00:00.000 | I had this thing with my oldest son where I don't know if it's all kids, but he's 13.
00:00:05.920 | He doesn't know how to answer the phone, not to save his life.
00:00:10.040 | He picks up the phone, "Huh?
00:00:12.040 | Hello?"
00:00:13.040 | And so I said, "Listen, from now on, when you call me, I expect a certain way that you
00:00:17.640 | pick up the phone, and that's going to be practiced by how you interact with anybody
00:00:21.920 | else, and vice versa as well.
00:00:23.520 | If I call you, you have to pick up the phone, and if you don't, I'm hanging up right away."
00:00:27.740 | So yesterday, he calls me, and I'm like, "Chamath speaking."
00:00:33.200 | You know, it's like, "Hello, Chamath speaking."
00:00:37.200 | "Dad."
00:00:38.200 | "Hi, honey."
00:00:41.260 | He calls back, "Hello, Chamath speaking."
00:00:43.520 | "Dad."
00:00:44.520 | "Hug up again."
00:00:49.340 | We did this three more fucking times, and then finally I said, "When will you get it
00:00:52.920 | through your head?
00:00:53.920 | Can you please just answer?"
00:00:55.500 | If I say, "Hello, Jamal speaking," "Hey, Dad, it's your son," or "Hey, Dad, it's
00:00:59.180 | (beep)."
00:01:00.940 | And it's unbelievable, and he's like, "Well, none of my friends pick up the phone like
00:01:04.500 | this," and I'm like, "Oh my God.
00:01:06.280 | Don't they think they need to have verbal communication skills?"
00:01:09.700 | It's unbelievable.
00:01:10.700 | And then when I call him, he picks up the phone, "Hello?"
00:01:15.500 | "What?"
00:01:16.500 | Not even "Hello."
00:01:17.500 | "Huh?"
00:01:18.500 | "Huh?"
00:01:19.500 | What is that?
00:01:20.500 | It's like a grunt.
00:01:21.500 | - Minimal effort.
00:01:22.500 | It's like the minimum number of syllables.
00:01:25.400 | It's embarrassing.
00:01:26.400 | Are you kids like this, or is it just my kid?
00:01:28.180 | - Actually, I don't think I've heard him pick up the phone.
00:01:30.180 | I need to test that.
00:01:31.180 | - "I'm going all in."
00:01:32.180 | - "Don't let your winners ride."
00:01:33.180 | - "Rain Man David Saxon."
00:01:34.180 | - "I'm going all in."
00:01:35.180 | - "And instead, we open sourced it to the fans, and they've just gone crazy with it."
00:01:36.180 | - "Love you, S.I.C.E."
00:01:37.180 | - "Queen of Kinwam."
00:01:38.180 | - "I'm going all in."
00:01:39.180 | - J. Cal was on Twitter calling me out for not denouncing the Chinese balloon in strong
00:01:54.300 | enough language.
00:01:55.300 | - Well, I don't think he understood the language I was using.
00:01:58.700 | Did you understand what the word errant means?
00:02:00.260 | What did you think it meant?
00:02:01.260 | - You know, I do understand the word errant.
00:02:03.900 | - Is that like a teaching moment for you?
00:02:04.900 | - Yeah, it generally means straying off course.
00:02:07.900 | Yeah, traveling in search of adventure, at least according to Merriam-Webster.
00:02:14.420 | So I just thought maybe you were being a little...
00:02:17.300 | I know you're a dove, but I thought you were being a little dovish, thinking they were just off course.
00:02:21.260 | Do you think they were off course, or were they doing it deliberately?
00:02:24.660 | You buy that it was off course?
00:02:25.940 | - Genuflect, go, genuflect.
00:02:26.940 | J. Cal, go, go, go, go.
00:02:27.940 | - Yeah, please.
00:02:28.940 | So I knew J. Cal would have to join in this nationwide panic over this balloon.
00:02:34.980 | - I have no panic over it.
00:02:35.980 | - Is that a Minolta camera attached to it?
00:02:36.980 | - Just the word errant means...
00:02:37.980 | - J. Cal, I got some really bad news for you.
00:02:45.340 | There was these things called spy satellites, and they can see everything.
00:02:51.340 | - It's obvious, people, they've been sending these balloons over here for a while.
00:02:53.580 | I just thought you were framing it as it was errant, as in, like, off course.
00:02:58.420 | Do you think it was...
00:02:59.420 | I'm sincerely asking, do you think it was errant, it was an accident?
00:03:01.540 | - Well, here's the thing.
00:03:02.940 | It's such a harebrained scheme to send a balloon flying over American territory that the Occam's
00:03:10.980 | Razor explanation here is that they somehow lost control over it, and these things are
00:03:15.900 | not steerable.
00:03:16.900 | So my guess is that it probably wasn't deliberate just because of how stupid a plan it was.
00:03:23.540 | It was probably a deliberate plan.
00:03:24.540 | - I don't know how stupid it would be, and how obvious it would be, but it could be.
00:03:28.500 | I don't really know.
00:03:29.500 | What I do know is that the whole nation got in a lather and a tizzy and started hyperventilating
00:03:35.020 | about this balloon, and it just shows how reflexively hawkish the media is.
00:03:39.420 | It's like, "Hey, can we just wrap up this war in Ukraine before we start another war
00:03:44.180 | with an even more formidable superpower?"
00:03:47.380 | - The balloon got more attention than us blowing up Nord Stream.
00:03:50.380 | - Exactly.
00:03:51.380 | - Okay, well, hold on.
00:03:52.380 | We're ready to jump into it.
00:03:53.380 | - No, what I would say about that is the media insight is the valid one.
00:03:56.820 | I was just responding to the errant, and I was curious if you actually thought it was
00:03:59.540 | an accident.
00:04:00.540 | I don't think it's an accident.
00:04:01.540 | I don't think there are accidents.
00:04:02.540 | - Well, I think it clearly was off course and problematic, so I don't know about the
00:04:07.100 | intentionality of it.
00:04:09.300 | I think it very well could have been intentional, but I tend to think, because it's such a stupid
00:04:13.220 | harebrained scheme, I almost give them the benefit of the doubt, not because they're
00:04:17.140 | not capable of spying.
00:04:18.500 | I'm sure they're spying on us.
00:04:19.500 | - Obviously, they are.
00:04:20.500 | Sure.
00:04:21.500 | - I'm sure they're spying on us, just like we're spying on them.
00:04:23.140 | That's what we both do, but it seems like such a stupid way, because you're going to
00:04:27.540 | get caught.
00:04:28.540 | - It's a made-for-TV moment.
00:04:29.980 | You have to understand the nature of live television and why the media overreacted to
00:04:34.300 | it. It's ongoing, so because it's not final, it's like a live event occurring, like when kids
00:04:40.460 | are trapped in a mine.
00:04:42.400 | Those are the best stories for CNN, because you can keep updating people, and people keep
00:04:47.140 | turning the TV on to check on the status.
00:04:49.380 | So this one was just made for CNN, because obviously, we're going to shoot the thing
00:04:54.420 | down, and obviously, you can interpret into it if it's an accident or not, and it just
00:04:59.100 | gives them something to talk about on a slow news week.
00:05:01.620 | But it is interesting.
00:05:02.620 | I think your point that is interesting is, is the Nord Stream story correct or not?
00:05:08.760 | Why is that not being covered?
00:05:10.900 | It's not being covered because it's not an ongoing story.
00:05:16.220 | So that just shows you...
00:05:17.220 | - No, I don't think that's right.
00:05:18.660 | - I think it's not being covered because the mainstream media already took a side on this.
00:05:23.820 | - So even more cynical, yeah.
00:05:26.500 | - Well, it's not just cynical.
00:05:27.940 | So when Nord Stream got blown up, the administration came racing out with the line that the Russians
00:05:34.220 | did it, that this was self-sabotage.
00:05:36.660 | And by the way, the media repeated this endlessly.
00:05:39.100 | This was the media line.
00:05:40.980 | And it never really made sense to anyone who's paying attention, because first of all, this
00:05:44.580 | was an economically vital asset to Russia.
00:05:47.940 | Second, it was their main source of leverage over Europe, was their control over the gas
00:05:53.340 | supply.
00:05:54.340 | So the idea that they would shoot themselves in the foot that way, just to somehow, what,
00:06:00.380 | show how crazy they are, it never really made sense to anyone who was paying attention.
00:06:04.660 | And the fact that the administration, the media, so quickly raced to that conclusion
00:06:09.900 | suggested that it was maybe a cover story.
00:06:12.100 | Because if we had nothing to do with it, you would just be more neutral and say, "Yeah,
00:06:16.100 | we don't know what happened."
00:06:17.340 | But they had to promote this line that, "Oh, the Russians did it," which just never made
00:06:21.900 | any sense.
00:06:23.140 | And now Sy Hersh has come out with this story, laying out in great detail how we did it.
00:06:28.300 | It's not just saying we did it.
00:06:30.040 | It's laying out who did it and how and the steps and all that kind of stuff.
00:06:33.700 | And just so you guys understand who this guy is, he's this legendary Pulitzer Prize winning
00:06:38.140 | journalist.
00:06:39.140 | He's like 86 years old or something like that now.
00:06:41.380 | But he broke, during Vietnam, he broke the story of the My Lai massacre, which the military
00:06:47.220 | denied until he proved it.
00:06:49.260 | He broke the story during the Iraq War of the Abu Ghraib prison abuses, which again,
00:06:55.460 | the military denied until it was undeniable.
00:06:58.900 | He's Pulitzer-winning is all you really need to know.
00:07:00.940 | He's a Pulitzer-winning journalist.
00:07:01.940 | Yeah, and here he is, breaking out basically another covert military action.
00:07:06.500 | And look, we don't know for sure whether it's true or not, but it looks pretty bad.
00:07:10.780 | Do you think that Europe knew?
00:07:12.260 | Do you think that the Europeans were told that the Americans were going to go and blow
00:07:15.780 | this thing up?
00:07:17.020 | If that's true, by the way, the administration has said clearly, this is a false story.
00:07:21.020 | So just so we're clear.
00:07:22.020 | I think the specific quote was this is pure fiction.
00:07:25.340 | The Ukraine would be the most likely person to do this.
00:07:28.300 | You don't have the capability.
00:07:29.300 | You got to look for a means, motive and opportunity.
00:07:31.220 | They don't have the opportunity to go down there and place a bomb?
00:07:34.300 | I don't think the Ukrainians have the capability to do an undersea mission like that.
00:07:38.900 | Oh, interesting.
00:07:39.900 | The Norwegians do.
00:07:40.900 | And the story maintained that we did it with them.
00:07:44.340 | The British do.
00:07:46.980 | Norway didn't say whether they were involved or not.
00:07:49.100 | So no, my guess is that if this was a covert US activity, it was kept to a very small group,
00:07:54.900 | which is why it'll be hard to prove more definitively than that.
00:07:58.580 | The other side of the argument is this is such a provocation.
00:08:03.140 | And the administration saying it's fiction.
00:08:05.300 | So what is your response to that, Sacks?
00:08:07.980 | Like, would the US do something so provocative?
00:08:11.100 | Or would they have somebody else do it?
00:08:12.660 | And would Norway do something so provocative?
00:08:15.160 | It seems an extremely, it seems like an extremely offensive technique, as opposed to just backing
00:08:20.940 | the Ukraine to defend itself against Russia's invasion.
00:08:24.140 | Here's the thing.
00:08:25.140 | Before the invasion, Biden at a press conference said that if the Russians invade, Nord Stream
00:08:31.180 | would be no more.
00:08:32.820 | And they asked him, well, how can you make sure of that? That's a deal between Germany and
00:08:37.140 | Russia;
00:08:38.140 | we're not involved.
00:08:39.140 | He said, we have ways, we have ways, just trust me, it'll happen.
00:08:41.660 | Separately, Victoria Nuland, who's our deputy secretary of state, said something very similar
00:08:46.320 | about how we would stop Nord Stream if the Russians invaded.
00:08:50.320 | And then after Nord Stream got blown up, Blinken at a press conference said that this was a
00:08:55.280 | wonderful opportunity, and was extolling all the benefits of this.
00:08:58.760 | And then Victoria Nuland at a congressional hearing said that I'm sure we're all very
00:09:02.480 | glad that it's a hunk of metal at the bottom of the sea, again, extolling all the benefits.
00:09:06.120 | And we've discussed the benefits on this program of now we've shifted the European dependence
00:09:10.120 | on Russian gas to a dependence on American gas.
00:09:13.680 | So the fact of the matter is the administration kind of telegraphed what they were going to do.
00:09:19.240 | They had the capability to do it.
00:09:21.540 | And they had the motive to do it.
00:09:23.000 | So it's certainly a plausible story.
00:09:25.320 | You're right that we can't know for sure.
00:09:27.320 | It's a single source story.
00:09:28.320 | You don't believe the administration.
00:09:29.320 | It's a single source story that depends on the credibility of Sy Hersh.
00:09:34.300 | But as it stands right now, whose story do I think makes more sense?
00:09:39.560 | Probably Sy Hersh's.
00:09:40.680 | You believe Sy Hersh or the spokesperson for the CIA, who wrote that the claim is completely
00:09:44.980 | and utterly false, just to be clear.
00:09:46.680 | Well, these same people denied Abu Ghraib, they denied My Lai.
00:09:50.760 | You believe
00:09:51.760 | They deny everything.
00:09:52.760 | They deny that NATO expansion had anything to do with the, you know, breakout of this war.
00:09:58.760 | Yeah, I mean, this is one of those situations where we can't possibly know.
00:10:02.680 | Yeah, you make a good point, J Cal, which is this if true, and look, I'm not saying
00:10:07.120 | I don't know if it's
00:10:08.200 | You're leaning towards believing it.
00:10:10.040 | I lean towards thinking it's more plausible than not.
00:10:12.880 | Because also like who else could have done it and who again, who had the motive means
00:10:17.120 | an opportunity.
00:10:18.120 | But you make a great point, which is if we did it, it's an incredibly provocative act.
00:10:23.880 | It's basically perpetrating an act of war against a country that has thousands of nuclear
00:10:29.000 | weapons.
00:10:30.000 | Biden promised us at the beginning of this war that he would keep us out of being directly
00:10:33.680 | involved.
00:10:34.680 | So this would directly contradict what he said at the beginning of this war.
00:10:38.880 | So I think it's a very scary situation here.
00:10:43.800 | Yeah, no.
00:10:44.800 | And if you've been if you
00:10:46.000 | Do it.
00:10:47.000 | Yeah.
00:10:48.000 | If the US didn't do it, like, why don't we find the real killer?
00:10:50.640 | I actually have this.
00:10:51.640 | I have the theory.
00:10:52.640 | Here's my theory.
00:10:53.840 | The CIA knew how to do it.
00:10:56.320 | Biden wants to do it.
00:10:57.840 | The Republicans want to do it.
00:10:59.120 | Obviously, the people who are the most pro-stopping Nord Stream have been the Republicans.
00:11:02.960 | They've led this charge even more than the Democrats.
00:11:05.440 | So it's a bipartisan issue.
00:11:07.160 | Stop Nord Stream: bipartisan issue, hands down.
00:11:10.360 | I think the CIA probably knew how to do it.
00:11:13.260 | And just like we equipped the Ukraine to do it, we might have facilitated the Ukraine
00:11:17.960 | and a collaborator, UK, Norway, some freelancers, you know, we have those freelance, former
00:11:25.880 | Navy SEALs that operate, perhaps the CIA just said to the Ukraine, here's a, you know, black
00:11:31.080 | ops group.
00:11:32.700 | If you wanted to engage them, you could feel free to do so, which would then split the
00:11:37.040 | difference between what Sy is saying and what the CIA is saying, which is typically where the
00:11:41.680 | truth lies: probably between what this investigative journalist of note has said and what the CIA
00:11:48.880 | is denying.
00:11:49.880 | The CIA probably has plausible deniability.
00:11:51.920 | That's my
00:11:52.920 | Well, if you could find a source for that story, J Cal, that lays out in the same level
00:11:56.800 | of detail that Hirsch has laid out in terms of the meetings that occurred, what was approved
00:12:02.040 | when, how they did it, that, you know, when they did it, the explosives they use, the
00:12:08.760 | divers they used, it goes into a fair amount of detail, incredible detail.
00:12:13.080 | Yeah, I mean, so I mean, he's laying out a theory down to what explosives, right?
00:12:17.280 | But you're trying to put what you just said, which is basically you inventing a story, on
00:12:21.760 | the same level as Hersh's story.
00:12:23.320 | Yeah, I understand.
00:12:24.600 | But he actually has a lot of detail in his story.
00:12:27.480 | Yeah, he had he had umpteen sources.
00:12:30.080 | There was a lot of people that were willing to tell the story.
00:12:32.520 | No, no, I don't think that's actually the case.
00:12:36.760 | I don't think he has umpteen sources.
00:12:36.760 | And there's no on the record, there's one main source, there was one main source.
00:12:40.640 | But who provided a ton of detail.
00:12:42.800 | I don't know exactly how much he was able to directly corroborate with other sources.
00:12:45.680 | I don't know that.
00:12:46.680 | The other thing I wonder, Saks is he with this one source story, he has a collaboration
00:12:52.400 | with the New Yorker.
00:12:54.480 | And anybody would if this was a really well sourced, and you could back it up kind of
00:12:59.240 | story, they would love to have the ratings for this story.
00:13:02.560 | There's a blockbuster story for as happens a long time, which we learned was it through
00:13:07.360 | the Valerie Plame affair, which is that these major news publications will sometimes have
00:13:13.040 | a back channel back to the national security apparatus when they have something like this.
00:13:17.320 | And the message was, you can't hit print on this.
00:13:19.480 | Yeah, in which case possibility to he goes and just self publishes himself, which wasn't
00:13:24.000 | really even an option.
00:13:25.360 | A few years ago.
00:13:26.680 | No, yeah.
00:13:27.760 | So I mean, this is the Rorschach test of Rorschach tests.
00:13:30.640 | You have the media, you got the CIA, it's fodder for a great movie.
00:13:34.280 | What I go back to J. Cal is I think you can lay out some theories about let's say, you
00:13:38.560 | know, the Poles did it or maybe Ukrainians with the British or something.
00:13:43.480 | Yes, you could lay out those theories, but the media wasn't willing to entertain any
00:13:48.120 | of those theories.
00:13:49.200 | When this news broke, what do they do?
00:13:51.040 | They blame the Russians.
00:13:52.520 | And that story made no sense, but they said it so definitively.
00:13:56.160 | I'm looking for the source on that.
00:13:57.600 | I'm looking for Nick, if you could pull up the source of the administration blaming the
00:14:01.240 | Russians, I just want to make sure we're accurate.
00:14:02.800 | Biden did a press conference.
00:14:04.000 | Yeah, I just want to make sure there was no sabotage by the Russians.
00:14:07.360 | Now they didn't push it that hard.
00:14:08.720 | What was interesting is the Biden administration set it up, but they didn't keep coming back to it.
00:14:13.560 | But the media really ran with it.
00:14:15.600 | And when people on both the left and the right like people like Jeffrey Sachs on the left
00:14:20.120 | and Tucker Carlson on the right, basically started to question whether the US could have done
00:14:23.440 | it, they were accused of being conspiracy theorists, and you know, Putin stooges and
00:14:27.680 | all the rest of it.
00:14:28.960 | And now all of a sudden, here comes Seymour Hersh with a pretty detailed story lending
00:14:34.120 | credence to that point of view.
00:14:37.600 | It just may indicate we don't know everything that's going on with this war.
00:14:42.080 | And I think the longer this goes on, the more dangerous it is.
00:14:45.200 | Friedberg, do you have any thoughts on geopolitical issues and who might have blown up Nord Stream?
00:14:51.120 | What are your sources saying, Friedberg?
00:14:53.120 | What are your sources in Science Corner saying?
00:14:58.160 | Speculation is best left to the future.
00:14:59.800 | I don't know about the whole speculating on what could have been done in the past by someone.
00:15:05.240 | Those are conspiracy theories and a little dangerous.
00:15:08.280 | They're not actually conspiracy theories.
00:15:10.800 | Nord Stream was blown up.
00:15:12.000 | You understand that, right?
00:15:13.400 | Like there's no doubt that it was blown up.
00:15:15.080 | No, I'm saying J-Cal's theory.
00:15:16.080 | Oh, yeah.
00:15:17.080 | J-Cal's theory is just a story with no evidence.
00:15:18.080 | I think you've got to bring data to the table.
00:15:21.080 | Pure speculation.
00:15:22.800 | You got to bring data to the table to make it a...
00:15:24.800 | By the way, speaking of which, Radek Sikorski, who is a Polish diplomat, I think he was like
00:15:30.600 | their foreign minister, when Nord Stream blew up, he tweeted a photo of it saying, "Thank
00:15:36.320 | you, USA."
00:15:39.120 | Which was one of the reasons why people thought that, okay, like, yeah, of course the US did it.
00:15:44.600 | Who has the capability to do it?
00:15:46.080 | Who has the motive to do it?
00:15:47.300 | Who said they were going to do it?
00:15:49.920 | And who benefits?
00:15:50.920 | Ukraine.
00:15:51.920 | Who benefits?
00:15:52.920 | Who was conducting NATO exercises in that region?
00:15:56.800 | Three months before the thing blew up.
00:15:58.800 | Yeah, exactly.
00:15:59.920 | And Jeffrey Sachs pointed that out on, I think it was a CNBC interview before they basically
00:16:04.680 | stopped him because he's not allowed to talk about this on, you know, network TV, apparently.
00:16:09.040 | But he basically pointed out there were like US radar signatures in the area.
00:16:12.880 | President Joe Biden declared that, this is from Bloomberg, that a massive leak from the
00:16:16.160 | Nord Stream gas pipeline system in the Baltic Sea was an intentional act and that the Russian
00:16:20.080 | statements about the incident shouldn't be trusted.
00:16:21.840 | "It was a deliberate act of sabotage and now the Russians are pumping out disinformation and
00:16:25.200 | lies," Biden told reporters Friday at the White House.
00:16:27.600 | So he didn't exactly say the Russians did it.
00:16:29.200 | He just said it's an act of, don't believe the Russians.
00:16:32.080 | What's that?
00:16:33.080 | The media did.
00:16:34.080 | You could do one of these montages where it was like Russian sabotage.
00:16:36.920 | In fact, go to Matt Taibbi's blog.
00:16:40.120 | Biden didn't say who did the sabotage.
00:16:41.200 | He said it was an act of sabotage.
00:16:42.600 | If you actually look at what Biden said, everything he said is absolutely true if the US also
00:16:47.760 | did it.
00:16:48.760 | Correct.
00:16:49.760 | Every single sentence that Joe Biden said is 100% true, whether we did it or whether
00:16:57.600 | Russia did it.
00:16:58.720 | It was a deliberate act of sabotage.
00:17:03.100 | Think about it.
00:17:04.100 | If we did it, we know they didn't do it.
00:17:06.120 | And then we have to be like careful about suggesting they did it.
00:17:10.000 | Well, what are the Germans think?
00:17:11.320 | Because this is the Germans pipeline.
00:17:13.720 | So if we blew it up, that's also an, uh, explain to me your thinking on the chessboard of our
00:17:19.720 | relationship with Germany.
00:17:21.160 | If we blew it up, would they not also see that as a hostile act?
00:17:24.000 | This is why I asked if Europe knew because I think you have to tell Germany that it's
00:17:27.440 | going to happen.
00:17:28.840 | And I think the quid pro quo with Germany is some amount of guaranteed supply that the
00:17:35.040 | US directs into Europe so that they know that their long term LNG supply is intact so that
00:17:41.280 | they become ambivalent, right?
00:17:42.720 | There's a point of indifference where the Germans say, okay, we don't know when this
00:17:47.360 | thing is going to get turned back on and we don't know what the implications of it are.
00:17:50.880 | Here's what our demands are, meaning our energy supply, our energy needs are.
00:17:56.100 | And so as long as the United States can say, look, worst case, we have stuff in the, you
00:18:01.400 | know, SPR that we can give you.
00:18:03.360 | There's probably a point of indifference where the Germans say, okay, we're just going to
00:18:06.360 | turn around and not say anything.
00:18:07.640 | And I'm curious, did the Germans say anything when this happened?
00:18:10.480 | No, not that I'm aware of. I'm looking for it.
00:18:13.400 | I literally just typed to Nick, what was the Germans' position on this?
00:18:16.440 | That's interesting, too.
00:18:17.440 | That's a tell.
00:18:18.440 | Well, yeah, if you're breaking this down like a poker hand, and you're trying to figure out
00:18:22.120 | how you construct this hand, right, pre-flop, it's like, you know, where both of these folks...
00:18:26.960 | Right?
00:18:27.960 | Okay, there was a really interesting video from a guy named Matt Orfalea, who puts
00:18:33.040 | together these really funny videos, he put together these montages of media reactions
00:18:37.760 | to things.
00:18:39.280 | And what he shows is that, you know, you can have like 20 different media outlets, they
00:18:42.440 | all use the exact same words.
00:18:45.080 | And when he clips it together, you can see they're reading from somebody's talking points.
00:18:48.840 | And it's not really clear who. And basically, if you look at his video here, on who blew
00:18:54.440 | up Nord Stream pipeline, you see that there was like a party line from the mainstream
00:18:59.120 | media stuff.
00:19:00.120 | Nord Stream 2, we will bring an end to it.
00:19:04.120 | How will you how will you do that?
00:19:07.120 | I promise you we'll be able to do it.
00:19:10.120 | Nord Stream pipeline.
00:19:11.120 | I mean, we have to conclude that is most likely Russia, Russian sabotage on its own infrastructure.
00:19:21.560 | As a common sense matter,
00:19:22.560 | I think it's Putin's way of sending a message.
00:19:25.160 | What Putin is saying to us by blowing up his pipeline is, look, I can blow up a pipeline.
00:19:29.760 | Everyone knows that Putin did this himself.
00:19:31.280 | Who are these talking heads smoking gun without the direct proof?
00:19:34.840 | Yeah, I think logic and common sense will tell you that without the evidence, Russia
00:19:38.680 | was behind the incident, you can say it for sure.
00:19:41.200 | Who sabotaged the Nord Stream 2 pipeline?
00:19:43.840 | All right, enough.
00:19:45.640 | This nonsense.
00:19:46.640 | Come on.
00:19:47.640 | That's, you know, these talking heads who have no firsthand experience
00:19:49.920 | and are more than willing to comment on this.
00:19:51.920 | It's hysterical.
00:19:52.920 | Fantastic.
00:19:53.920 | It's fantastic.
00:19:54.920 | All his videos are like that, where he has one on the Hunter Biden laptop as well, where
00:19:58.560 | again, he's got like 20 different talking heads and media outlets, all portraying it
00:20:04.400 | in exactly the same language.
00:20:06.920 | It's like a narrative on Sunday morning shows, you hear the same narrative from each side.
00:20:14.400 | How does that actually get coordinated?
00:20:15.960 | Each side builds those bullet points, emails their what do they call them?
00:20:20.440 | surrogates.
00:20:21.440 | They email all the surrogates and say, just keep saying these things over and over again
00:20:25.360 | to codify.
00:20:26.360 | It's a WhatsApp group.
00:20:27.360 | It's a WhatsApp group.
00:20:28.360 | And they're just like, say this over and over again.
00:20:31.640 | How does it work?
00:20:32.640 | Sacks?
00:20:33.640 | I think it's partly talking points, memos that go out to chat groups.
00:20:35.840 | I think it's also just people looking on Twitter.
00:20:38.440 | And then there's like certain key nodes that they follow.
00:20:41.920 | And they know, okay, this is the party line, because such and such key person is saying
00:20:45.720 | it and they take their cues.
00:20:47.480 | It's that memetic, mememic effect that you guys always talk about people with that.
00:20:52.840 | Yeah, the memetic.
00:20:53.840 | Yeah, the thing to understand is that all the prestige outlets repeat the same party
00:20:58.320 | line and have the same perspective.
00:21:00.320 | Yeah.
00:21:01.320 | You got it.
00:21:02.320 | You got to do your own search for information.
00:21:03.720 | This is the beauty of substack.
00:21:04.720 | Actually, this is why substack is so important is it actually gives you an alternative.
00:21:08.080 | Yeah, it's pretty disruptive.
00:21:09.080 | It gives you an alternative because like you've got 10 different mainstream media networks
00:21:14.480 | or newspapers or magazines, but they all have exactly the same talking points, except maybe
00:21:18.080 | Fox News is kind of the one exception.
00:21:20.080 | Although even Fox on the whole Nord Stream thing, you saw that Fox can be pretty militaristic
00:21:25.440 | and they had the same generals basically blaming the Russians for this on Fox.
00:21:29.920 | Yeah.
00:21:30.920 | Their position is just, "Hey, everybody, this is sabotage."
00:21:35.160 | That's it.
00:21:36.160 | Who did the sabotage?
00:21:37.160 | I think you asked a really good question there about the German interest in this.
00:21:41.000 | Right now, the German economic interest and the German foreign policy interests are not
00:21:45.880 | aligned.
00:21:46.880 | What's clearly best for the German economy is to have cheap natural gas powering its
00:21:51.400 | industries even if it comes from Russian pipelines.
00:21:56.760 | They no longer have that anymore.
00:21:57.920 | In fact, they may never have that again.
00:22:00.040 | They're going to pay a very high price economically maybe forever.
00:22:03.320 | Remember, their whole economy is based on industry.
00:22:06.360 | They're a very industrial power.
00:22:09.360 | If this war drags on for a long time, I think Scholz might be in some political trouble
00:22:13.240 | precisely because he's gone along with the Americans on this.
00:22:18.320 | There is a growing political opposition to this war inside of Germany.
00:22:22.880 | War fatigue is a real thing and this thing's got to wrap up at some point.
00:22:27.020 | Any final thoughts, Friedberg?
00:22:28.880 | We didn't get too involved in that conversation.
00:22:32.080 | Any game theory from you?
00:22:35.280 | Okay.
00:22:36.280 | There you have it, folks.
00:22:40.640 | Sultan of Science checking in.
00:22:41.640 | What's the problem?
00:22:42.640 | You don't want to criticize the establishment?
00:22:49.440 | It's always very personal with Sacks.
00:22:49.440 | Why don't you have my position on this?
00:22:51.440 | I'm trying to understand.
00:22:53.440 | I'm not pro-establishment.
00:22:54.440 | I think you know that.
00:22:57.600 | I'm just analytical around the fact that I think there's a strong orientation towards
00:23:00.920 | conflict and I think there still is.
00:23:02.560 | I don't think that there's much of an incentive or a motivation to back down because this
00:23:08.200 | conflict creates a significant amount of debt owed back to the U.S.
00:23:11.840 | It creates a potential future asset stream.
00:23:16.080 | Creates a realignment of power.
00:23:18.160 | Everyone's looking externally as internal economic conflict and wealth disparity issues
00:23:24.680 | arise and economic growth is challenged and inflation is soaring.
00:23:28.760 | It's a great place to direct one's energy.
00:23:32.240 | I think all of this stuff is detailed analytical shenanigans around who's saying what or who's
00:23:37.800 | doing what.
00:23:38.800 | I think the underlying thesis and the underlying river that's flowing is one that's looking
00:23:44.160 | for external conflict.
00:23:45.160 | I think the same is true with the U.S. and China.
00:23:48.480 | You're talking about the military-industrial complex is going to benefit massively from
00:23:52.520 | this.
00:23:53.520 | The more this drags out, the more the conflict in Taiwan heats up, the more we're going to
00:23:59.160 | invest in our military.
00:24:01.000 | We often talk about these things as if they're top-down, master plan driven.
00:24:06.200 | As we all know, they're more Ouija board driven.
00:24:09.120 | It's a bunch of guys that got their hand on the Ouija board and they all just had a little
00:24:12.720 | too much caffeine.
00:24:15.320 | In this case, I think it's just more about everyone's a little anxious and the anxieties
00:24:20.240 | leading to a desire for more conflict.
00:24:22.880 | We're not happy at home.
00:24:23.880 | If you're happy at home, you're not looking externally for conflict.
00:24:27.000 | That's true in nearly every developed nation on earth today.
00:24:32.200 | That's it.
00:24:33.200 | I don't know.
00:24:34.200 | I like your position.
00:24:35.200 | It's a great take.
00:24:36.200 | I think it's a great take.
00:24:37.200 | I think there are a lot of interests who benefit from war and I think the foreign policy establishment
00:24:42.920 | is funded by those interests and it's kind of wired for war, at least in terms of the
00:24:47.880 | reflex, right?
00:24:48.880 | If you're going to do something as relatively harmless as a balloon, I saw that becomes
00:24:53.960 | like a casus belli.
00:24:55.200 | It's like people are ready to go to war against China over that.
00:24:58.480 | I saw a couple of military leaders give a talk a few months ago.
00:25:01.840 | It's in a private thing.
00:25:03.400 | So it wasn't on public record with the establishment.
00:25:06.720 | No, yeah, I was at the establishment gathering.
00:25:09.120 | What was striking to me in this particular thing where these guys were being interviewed
00:25:25.160 | on stage at like a dinner thing and they were so oriented around their next steps in escalation
00:25:35.480 | and I think it speaks to the point, Zach, like none of them were thinking about like
00:25:40.400 | where we are today, how do we deescalate, what is this going to get?
00:25:43.040 | There was no conversation at all from anyone about resolution or deescalation.
00:25:48.840 | With every single one of them, it was all about like my orientation for getting bigger,
00:25:53.360 | going deeper, going harder, going stronger, making this thing bigger and I think that
00:25:57.280 | was really scary to me because I didn't hear anyone having a conversation around, like, how
00:26:01.520 | do we, should we even consider, whether or not this gets bigger.
00:26:05.920 | Everyone was thinking, assumptively, it was going to get bigger.
00:26:08.880 | I like the Ouija board.
00:26:09.880 | You have to follow the financial incentives.
00:26:11.880 | When the last time we looked at this, right, Leon Panetta and all these other guys who
00:26:15.680 | were screaming for war, they were getting paid by the military industrial complex.
00:26:19.880 | I remember when Lloyd Austin was nominated as defense secretary.
00:26:23.880 | He had some conflict issues because he was just on the board of Northrop Grumman or one
00:26:27.680 | of these big military industrial companies.
00:26:31.080 | And so of course, these generals have to push for war because as long as they're girding
00:26:35.040 | for war, they're guaranteed to have for them a very lucrative job once they leave the military.
00:26:42.640 | The Ouija board though, Sacks, and I'll throw it to you, maybe you can keep this metaphor going.
00:26:47.360 | The media has got their hands on it.
00:26:48.760 | They want ratings.
00:26:49.760 | You have the energy industrial complex in this German conflict, which seeks to benefit
00:26:53.800 | massively if people invest in renewables or you find other oil off Norway.
00:27:01.160 | Norway's oil is one of the largest reserves that's untapped.
00:27:04.240 | So you have this Ouija board, media, energy, and the military industrial complex all moving
00:27:09.080 | it at once.
00:27:10.080 | Maybe you could speak to that analogy.
00:27:11.080 | Everyone wants to move it to the side of the Ouija board that says escalate.
00:27:14.160 | There are very few people that have the energy to move it to the other side that says de-escalate.
00:27:18.280 | It's not profitable.
00:27:19.280 | De-escalation means less energy, less investment.
00:27:22.360 | So yeah, you're going to go escalate.
00:27:24.200 | You know who warned us about the military industrial complex?
00:27:27.800 | Dwight D. Eisenhower, Supreme Allied Commander in World War II, wins the war, patriot war
00:27:32.720 | hero, top general, becomes president, Republican president, and his departing address warns
00:27:38.400 | us that yes, we need a defense industry, but they become a vested interest in favor of
00:27:44.800 | foreign interventions of war.
00:27:47.280 | And where is the-
00:27:48.280 | It was 1961, he did this.
00:27:49.280 | Where's the interest on the other side of it?
00:27:52.440 | I can tell you this, the American people don't want to be in a war with Russia.
00:27:56.320 | I don't even think most of the American people want to send 100 billion over there.
00:28:00.560 | They want to send 100 billion to their cities to fix crime and all the other problems.
00:28:06.040 | Homelessness, everything, yeah.
00:28:07.460 | If you've never seen that farewell address from 1961, it is well worth watching.
00:28:10.960 | You can find it on YouTube.
00:28:12.520 | Just search for military industrial complex Eisenhower.
00:28:16.240 | And this is a person who was part of the military industrial complex saying, "Watch out for
00:28:20.600 | this."
00:28:21.600 | It was a very prescient warning.
00:28:22.600 | He knew of which he spoke.
00:28:24.600 | He was in the machine.
00:28:25.600 | He helped build the machine.
00:28:26.600 | Welcome to the All In Podcast.
00:28:27.680 | With us again, David "The Dove" Sachs, Chamath Palihapitiya, and the Sultan of Science,
00:28:33.320 | who is on his podcast.
00:28:34.320 | Oh my God, so many podcasts you're doing.
00:28:36.040 | The Sultan of Science is in hot demand.
00:28:39.560 | What are all these podcasts you're doing, Friedberg?
00:28:40.880 | All these science podcasts pulling you in.
00:28:42.280 | I did a podcast with Brian Keating last week, who was really kind enough to reach out.
00:28:46.400 | He's had some awesome guests.
00:28:47.800 | He's a cosmologist at UCSD, professor down there.
00:28:54.000 | And we were supposed to record that day, and then we canceled, I think, last minute, right?
00:28:57.760 | Yeah, yeah.
00:28:58.760 | So I was only supposed to be on with him for an hour, and I'm like, "Oh, well, my next
00:29:00.800 | thing just got freed up."
00:29:01.880 | So ended up doing like three hours.
00:29:05.480 | I was like exhausted that day, so I look really hungover on the video and probably stumbled
00:29:10.400 | a lot, yeah.
00:29:11.840 | But no Lex Friedman for you.
00:29:13.040 | So Chamath and I have done Lex Friedman, but you have not.
00:29:16.120 | Has he invited you yet, Lex?
00:29:18.000 | Has Lex invited you?
00:29:19.000 | No, no, he's not.
00:29:20.000 | Where's my invitation?
00:29:21.000 | I don't know what's going on here.
00:29:22.800 | Collect all four, Lex.
00:29:23.800 | What are you doing?
00:29:24.800 | No Davos and no Lex Friedman.
00:29:27.440 | Well, what's going on?
00:29:28.800 | You were never-
00:29:29.800 | Am I too anti-establishment?
00:29:30.800 | What's going on?
00:29:31.800 | No, I'm too anti-establishment, Salek.
00:29:32.800 | That's, you know, you've mythologized me.
00:29:36.560 | Nobody's inviting this quartet to anything.
00:29:39.120 | Well, full stop, we're not doing all in live from Davos.
00:29:43.360 | It's not happening, folks.
00:29:44.360 | Sorry, they don't want that heat.
00:29:45.360 | I agree.
00:29:46.360 | I'm with Groucho Marx.
00:29:47.360 | I don't want to be part of any club that would have me as a member.
00:29:50.760 | Absolutely.
00:29:51.760 | And they also don't want you there anyway, so it works out for everybody.
00:29:56.120 | It does fill me with like a rage where I actually might agree to doing the All In Summit again.
00:30:01.880 | By the way, proposal coming your way this weekend.
00:30:05.520 | If you want to really, really, really thumb your nose at the establishment.
00:30:08.520 | Yes, let's do it.
00:30:09.840 | Set it during the exact same dates and times as an establishment conference.
00:30:14.880 | Oh, the Alt Davos?
00:30:16.640 | And invite all the best guests so that they come to ours.
00:30:19.560 | Just a suggestion for you guys.
00:30:20.560 | Force them to choose Alt TED?
00:30:21.560 | TED's not relevant now.
00:30:22.560 | You know how Vanity Fair does their new establishment conference?
00:30:24.640 | You could call it the anti-establishment.
00:30:25.840 | Oh, that could be a tagline.
00:30:28.800 | Let's come up with a tagline that just tweaks everybody.
00:30:30.840 | Well, we should create a list.
00:30:32.440 | We should create the anti-list.
00:30:33.440 | You know, they have their like establishment list.
00:30:36.280 | We should have the anti-list.
00:30:37.280 | That's a good point.
00:30:38.760 | I like the anti-establishment.
00:30:40.520 | Start with them.
00:30:41.520 | I have a great Vanity Fair establishment thing story.
00:30:45.720 | Oh, go ahead.
00:30:47.320 | I snuck on that list.
00:30:48.600 | They put me on that list a decade ago.
00:30:50.440 | Oh, let's pull it.
00:30:51.440 | Let me find a link.
00:30:52.440 | Go on.
00:30:53.440 | And the most incredible thing about it is that when you go to the event, which was kind of
00:30:57.000 | a cool event, we all had photos taken by Annie Leibovitz.
00:31:02.080 | And I have a montage of some of the people that took photos that day.
00:31:07.760 | That's pretty cool.
00:31:09.680 | Me, Aaron Levy, Priscilla Chan, Bezos, bunch of people.
00:31:13.600 | But I was about to say, just before you had, just when you had the dad bod and no fashion
00:31:19.120 | sense.
00:31:20.120 | Yeah, it was like, yeah.
00:31:21.120 | Oh, look, he looks great.
00:31:22.120 | Here it is.
00:31:23.120 | I just, I'm just with you.
00:31:24.120 | Yeah, basically I had no stylist.
00:31:25.120 | I had a bad haircut.
00:31:26.120 | I was wearing shitty, poor rim glasses.
00:31:27.120 | It was like a year or two post Facebook.
00:31:31.280 | It was not a good look for me.
00:31:32.560 | Yeah, but that's not that bad.
00:31:33.560 | I'll show you the picture.
00:31:34.560 | Is that Tom Ford?
00:31:35.560 | That was your Tom Ford face.
00:31:36.560 | Tom Ford.
00:31:37.560 | That was my Tom Ford face.
00:31:38.560 | He's doing the jeans blazer thing, which is like a really tired look for a Silicon Valley
00:31:43.560 | VC.
00:31:44.560 | Pretty skinny jeans though.
00:31:45.560 | It was like a little loose.
00:31:46.560 | Pretty skinny jeans, yeah.
00:31:47.560 | That was in 2011.
00:31:48.560 | I'll give you a not a bad look.
00:31:49.560 | Not bad.
00:31:50.560 | Not bad.
00:31:51.560 | That's not so bad.
00:31:52.560 | Just don't pull up the Chamath pictures when he's wearing like, you know, his Macy's shirt.
00:31:57.200 | If you do the Google search, you put the images before 2011, you will find Chamath photos
00:32:02.640 | that are regrettable.
00:32:03.640 | Dude, at Facebook, I wore the same thing every day for four years, five years.
00:32:08.120 | It was brutal.
00:32:09.120 | Oh my God.
00:32:10.120 | What is ...
00:32:11.120 | That's also Tom Ford.
00:32:12.120 | We got to break this down.
00:32:16.640 | See this is when the sweater game was not tight.
00:32:18.560 | This is like sweater 1.0 games.
00:32:19.800 | I didn't know what I was doing back then.
00:32:21.600 | Stop.
00:32:22.600 | Take these pictures off, please.
00:32:23.600 | Look, oh, and he's also got the watch subtly peeking out.
00:32:26.320 | This is back when he was like, "Oh, I got a Rolex."
00:32:29.160 | That's a Patek, but anyways, yeah.
00:32:32.600 | This is when the watches were only five figures.
00:32:34.160 | That is zero.
00:32:35.160 | I don't know what that watch got to six figures.
00:32:36.160 | Or two, but yeah, same story.
00:32:37.160 | Same, same Jake, same, same.
00:32:38.160 | Same, same, same.
00:32:39.160 | All right, well, listen.
00:32:40.160 | All right, listen.
00:32:41.160 | I got an Apple watch and I'm going to upgrade.
00:32:45.160 | Oh, jeez.
00:32:46.160 | That's a dad vibe.
00:32:47.160 | Go teach your mom.
00:32:48.160 | There's the dad vibe.
00:32:49.160 | Yeah, yeah, yeah.
00:32:50.160 | Oh, he's going to ... There's the dad vibes.
00:32:52.160 | That must have been from ...
00:32:53.160 | Mayfield?
00:32:54.160 | Yeah, that was 13, 14 years ago.
00:32:56.160 | God, you had no style.
00:32:57.160 | I really didn't.
00:32:58.160 | It was rough.
00:32:59.160 | I really didn't.
00:33:00.160 | It was rough.
00:33:01.160 | I mean, don't pull up pictures of me.
00:33:02.160 | I was fat.
00:33:03.160 | Yeah, yeah, yeah.
00:33:04.160 | It was like pictures of me eating egg sandwiches all over the internet.
00:33:07.160 | Oh, my gosh.
00:33:08.160 | Yeah, chubby J-Cal.
00:33:09.160 | Oh, man.
00:33:10.160 | Look at that face.
00:33:11.160 | There he is.
00:33:12.160 | Ooh, that's plus 20 pounds.
00:33:13.160 | Let's keep this going.
00:33:14.160 | Search worse, Microsoft versus Google.
00:33:15.760 | Okay, it's been a rough couple of days for Google.
00:33:17.520 | We've all seen it.
00:33:18.520 | Google and Microsoft both did live demos of their new generative AI, yada, yada, yada.
00:33:23.480 | You guys all know about chat GPT.
00:33:25.360 | But now Bing is integrating it into their search engine, getting there before Google.
00:33:29.920 | And Microsoft CEO Satya Nadella, he is going ham.
00:33:34.760 | He looks great.
00:33:35.760 | He's fit.
00:33:36.760 | He's wearing a tight t-shirt.
00:33:38.000 | And he is saying he's going to make Google dance.
00:33:40.360 | Dance, Google, dance.
00:33:42.480 | He is getting up in their business.
00:33:44.680 | And listen, he is in a distant second place.
00:33:47.360 | So it makes sense.
00:33:48.360 | On the other hand, Google's AI demo was, frankly, a bit of a disaster.
00:33:54.480 | Poorly received, stock to drop 12% since this event.
00:33:58.600 | And their presentation did not include the chat barred because in search, because it
00:34:04.920 | wasn't working.
00:34:05.920 | It seems like there was an error in it when they said, what new discoveries from the James
00:34:09.520 | Webb Space Telescope can I tell my nine year old and barred answer that it took the first
00:34:14.920 | pictures of a planet outside our solar system, which is false.
00:34:18.920 | Which of course, we all know about chat GPT.
00:34:21.440 | It's only right half the time.
00:34:23.360 | It's a little woke on the margins.
00:34:26.760 | So anyway, there was a screenshot circulating today, which is probably false, but it says
00:34:34.240 | the following.
00:34:35.240 | Me and a bunch of coworkers were just laid off from Google for AI demo going wrong.
00:34:39.200 | It was a team of 168 people who prepared the slides for the demo.
00:34:41.760 | All of us are out of jobs.
00:34:42.760 | I can't imagine that's real.
00:34:44.600 | But if it was, that would be a hardcore moment for Google to fire a bunch of people for screwing
00:34:48.600 | it up.
00:34:49.600 | Listen, you worked in the belly of the beast, Freiburg.
00:34:52.800 | What are your thoughts on being poking the tiger and telling Google dance?
00:34:59.080 | You know, Sundar dance, you know, what's interesting is Google's had like an incredible AI competency,
00:35:06.240 | particularly since they bought DeepMind.
00:35:08.200 | And it's been predominantly oriented towards kind of, you know, internal problems.
00:35:14.200 | You know, they're they demonstrated last year that their AI improved data center energy
00:35:19.200 | efficiency by 40%.
00:35:20.920 | They've used it for ad optimization, ad copy optimization, the YouTube follow video algorithm.
00:35:27.400 | So what video is suggested to you as your next video to watch, which massively increased
00:35:33.020 | YouTube hours watched per user, which massively increased YouTube revenue.
00:35:38.680 | You know, what's the right time and place to insert videos in YouTube or insert ads
00:35:41.920 | and YouTube videos.
00:35:42.920 | So, you know, autofill in Gmail and Doc.
00:35:45.640 | So so much of this competency has been oriented specifically to avoid this primary disruption
00:35:51.360 | in search.
00:35:52.880 | Obviously, now, things have come to a bit of a point because, you know, this alternative
00:35:58.080 | for search has been revealed in chat GPT.
00:36:02.080 | And you guys can kind of think about search.
00:36:04.080 | And you know, we've used this term in the past, Larry and Sergey, the textbook that
00:36:08.040 | they read, you know, one of the original textbooks that's used in internet search engine technology
00:36:13.560 | is called information retrieval, information retrieval.
00:36:16.600 | So information retrieval is this idea that you know, how do you pull data from a static
00:36:21.040 | data set, and it involves scanning that data set or crawling it, and then creating an index
00:36:26.260 | against it, and then a ranking model for how do you pull stuff out of the index to present
00:36:30.880 | the results from the data that's available based on what it is you're querying for, you
00:36:37.560 | know, and doing that all in a 10th of a second.
00:36:39.720 | So you know, if you think about the information retrieval problem, you type in the data or
00:36:44.840 | some rough estimation of the data you want to pull up, and then a list is presented to
00:36:49.720 | And over time, Google realized, hey, we could show that data in smarter, quicker ways.
00:36:53.440 | Like if we can identify that you're looking for a very specific answer, we can reveal
00:36:57.560 | that answer in the one box, which is the thing that sits above the search results.
00:37:01.080 | Like if you said, What time is it?
00:37:02.680 | You know, what, when does this movie show at this theater, so they can pull out the
00:37:05.680 | structure data and give you a very specific answer rather than a list from the database.
00:37:11.400 | And then over time, there were other kind of modalities for displaying data that it
00:37:15.280 | turns out, were even better than the list, like maps, or shopping, where you can kind
00:37:20.640 | of see a matrix of results, or YouTube, where you can see a, you know, longer form version
00:37:25.440 | of content.
00:37:26.440 | And so these different kind of, you know, information retrieval, you know, media were
00:37:31.920 | presented to you, and it really kind of changed the game and created much better user satisfaction
00:37:37.640 | in terms of getting what they were looking for.
00:37:40.800 | The challenge with this new modality is it's not really fully encompassing.
00:37:46.560 | So if you can kind of think about the human computer interaction problem, you want to
00:37:51.280 | see flight times, and airlines, and the price of flights in a matrix, you don't necessarily
00:37:56.920 | want a text stream written to you to give you the, you know, the answer that you're
00:38:02.920 | looking for, or you want to see a visual display of shopping results, or you do want to see
00:38:09.480 | a bunch of different people's commentary, because you're looking for different points
00:38:12.560 | of view on a topic, rather than just get an answer.
00:38:15.540 | But there are certainly a bunch of answer solutions for which chat dbt, type, you know,
00:38:21.160 | natural language responsiveness becomes a fantastic and better mode to present answers
00:38:27.080 | to you, then the matrix or the list or the ranking and so on.
00:38:31.000 | Now, the one thing that I think is worth noting, I did a back of the envelope analysis on the
00:38:35.760 | cost of doing this compared to chat GPT.
00:38:39.120 | So, so Google makes about three bucks per click, you can back into what the revenue
00:38:42.320 | per search is a bunch of different ways.
00:38:44.720 | One way is three bucks per click about a 3% click through rate on ads.
00:38:48.200 | Some people estimate this is about right about 5 cents to 10 cents revenue per search done
00:38:52.560 | on Google or anywhere from one cent to 10 cents, even if they don't click the ads, because
00:38:57.320 | one out of 100 people click an ad.
00:38:58.620 | And that's where the money comes from.
00:38:59.620 | So let's let's just call it five cents, right.
00:39:02.480 | And you can assume a roughly 50% margin on that search, which means a 50% COGS or cost
00:39:07.720 | of goods, or a cost to run that search and present those ads.
00:39:11.800 | So you know, right now, Google search costs them about, you know, call it two and a half
00:39:16.720 | cents per search to present the results.
00:39:20.480 | A recent estimate on running the GPT three model for chat GPT is that each result takes
00:39:28.720 | about 30 cents of compute.
00:39:30.760 | So it's about an order of magnitude higher cost to run that search result than it is
00:39:37.360 | to do it through a traditional search query today makes today today.
00:39:41.600 | That's right.
00:39:43.160 | So that's the point, like it has to come down by about an order of magnitude.
00:39:47.880 | Now, this is a this then becomes a very deep technical discussion that I'm certainly not
00:39:52.760 | the expert.
00:39:53.760 | But there are a lot of great experts that there's great blogs and sub stacks on this,
00:39:57.280 | on what's it going to take to get there to get a 10x reduction in cost on running these
00:40:01.800 | models.
00:40:02.800 | And there's a lot related to kind of optimization on how you run them on a compute platform,
00:40:07.960 | the type of compute hardware that's being used all the way down to the chips that are
00:40:10.440 | being used.
00:40:12.020 | So there's still quite a lot of work to go before this becomes truly economically competitive
00:40:17.800 | with Google.
00:40:18.800 | And that really matters.
00:40:20.200 | Because if you get to the scale of Google, you're talking about spending eight to $20
00:40:23.920 | billion a quarter just to run search results and display them.
00:40:28.380 | And so for chat GPT type solutions on Bing or elsewhere to scale, and to use that as
00:40:32.840 | the modality, you're talking about something that today would cost $80 billion a quarter
00:40:38.000 | to run from a compute perspective, if you were to do this across all search queries.
00:40:41.760 | So it's certainly going to be a total game changer for a subset of search queries.
00:40:45.960 | But to make it economically work for for these businesses, whether it's Bing or Google or
00:40:52.240 | others, there's a lot of work still to be done.
00:40:54.800 | The great part about this Chamath is that Bing gave 10 billion to our friend Sam and
00:41:01.320 | chat GPT to invest in Azure, which now has the infrastructure and will be providing the
00:41:07.680 | chat GPT infrastructure to startups or corporations, big companies, and small alike.
00:41:12.640 | So that $10 billion should do enough to grind it down between software optimization, data
00:41:17.360 | optimization, chip optimization and cloud optimization.
00:41:20.560 | Yes, you would think so, no?
00:41:24.000 | The ability to run this at scale is going to happen because we're getting better and
00:41:28.800 | better at creating silicon that specializes in doing things in a massively parallelized way.
00:41:35.560 | And the cost of energy at the same time is getting cheaper and cheaper along with it.
00:41:40.080 | When you multiply these two things together, the effect of it is that you'll be able to
00:41:43.640 | run these models.
00:41:45.320 | The same output will cost one-tenth as much, as long as you ride the energy and compute
00:41:49.280 | curve for the next few years.
00:41:50.920 | So that's just going to naturally happen.
00:41:52.840 | I have two interesting takeaways.
00:41:56.040 | And one is maybe a little bit of a sidebar.
00:41:57.600 | So the sidebar is, if you guys were sitting on top of something that you thought was as
00:42:03.180 | foundational as Google search back in 1999, would you have sold 49% of it for $10 billion?
00:42:12.000 | Hard no, I think the answer is no.
00:42:13.720 | I think the answer is no.
00:42:15.920 | Not in an environment where you have unlimited ability to raise capital.
00:42:19.280 | This is something that we've said before, which is that ChatGPT is an incredibly important
00:42:23.120 | innovation.
00:42:25.960 | But it's an element of a platform that will get quickly commoditized because everybody
00:42:30.000 | will compete over time.
00:42:31.980 | And so I think what Microsoft is doing is the natural thing for somebody on the outside
00:42:37.640 | looking in at an entity that has 93% share of a very valuable category, which is how
00:42:42.880 | can I scorch the earth.
00:42:44.860 | And so Microsoft effectively, for $10 billion, bought almost 50% of a tool.
00:42:50.200 | And now we'll make that tool as pervasive as possible so that consumer expectations are
00:42:54.880 | such that Google is forced to decay the quality of their business model in order to compete.
00:43:01.720 | So that as Friedberg said, you have to invest in all kinds of compute resources that today
00:43:05.720 | are still somewhat expensive.
00:43:07.840 | And that will flow into the PNL.
00:43:10.240 | And what you will see is that the business quality degrades.
00:43:14.160 | And this is why, when Google did the demo of Bard, the first thing that happened was the
00:43:18.520 | stock went off 500 basis points; they lopped off $100 billion of the market cap, mostly
00:43:23.920 | in reaction to, oh my god, this is not good for the long-term business.
00:43:27.120 | It's not good for the long-term business on a mechanical basis.
00:43:30.780 | And you get an answer, you don't have to click the links.
00:43:32.720 | No, right now, if you look at Google's business, they have the best business model ever invented
00:43:40.080 | on Earth, ever, for a for-profit company.
00:43:44.160 | It just rains money.
00:43:46.320 | This is a business that this year will do almost $100 billion of free cash flow.
00:43:51.120 | It's a business that has to find ways and we kind of joke, but they have to find ways
00:43:55.680 | to spend money.
00:43:56.680 | Because they'd be showing probably 50 or 60% EBITDA margins and people would wonder, hey,
00:44:02.020 | wait a minute, you can't let something like this go unattended.
00:44:05.420 | So they try to do a lot more things to make that core treasure look not as incredible
00:44:11.060 | as it is.
00:44:12.060 | They have 120 billion of cash.
00:44:15.620 | This is a business that's just an absolute juggernaut.
00:44:17.980 | And they have 10 times as many employees as they need to run the core business.
00:44:21.500 | That's, I don't know what that is.
00:44:23.060 | But my point is that it's an incredible business.
00:44:25.180 | So that business will get worse if Microsoft takes a few hundred basis points of share,
00:44:30.880 | if Meta takes a few hundred basis points of share, if Tencent does, if a few startups do.
00:44:36.460 | Quora, by the way, launched something called Poe, which I was experimenting and playing
00:44:40.000 | around with last weekend.
00:44:42.440 | If you add it all up, what Satya said is true, which is even if all we do collectively as
00:44:46.700 | an industry is take 500 or 600 basis points of share away from Google, it doesn't create
00:44:54.080 | that much incremental cost for us, but it does create enormous headwinds and pressure
00:45:00.120 | for Google with respect to how they are valued and how they will have to get revalued.
00:45:04.800 | And that's what happened.
00:45:05.800 | So the last thing I'll say is, the question that I've been thinking about is, what does Sundar do?
00:45:10.560 | Right?
00:45:11.560 | So what's the countermeasure?
00:45:12.560 | Yes, this is what I was going to get to.
00:45:14.000 | I think the countermeasure here, if I was him, is to go to the board and say, guys,
00:45:19.600 | we're going to double TAC.
00:45:21.160 | Right?
00:45:22.160 | So TAC is the traffic acquisition cost that Google pays their publishers.
00:45:26.320 | It is effectively their way of guaranteeing an exclusivity on search traffic.
00:45:32.360 | So for example, if you guys have an iPhone, it's Google search.
00:45:35.520 | That's the default search in the iPhone.
00:45:38.120 | Google pays Apple.
00:45:40.520 | This year, this renegotiation for that deal could mean that Apple gets paid $25 billion
00:45:45.320 | for giving away that right to Google.
00:45:46.820 | Google does all these kinds of deals; last year, they spent, I think, $45 billion
00:45:51.800 | or so.
00:45:52.800 | So, about 21% of revenue. When you think about that, Chamath, Google basically paid Apple, which was working
00:45:57.160 | on search technology.
00:45:58.600 | They were working on a search solution.
00:46:00.320 | They paid them to stay out of the business.
00:46:01.960 | And they're paying everybody.
00:46:02.960 | So I think the question for Google is the following.
00:46:04.700 | If you think you're going to lose share, and let's say you go to 75% share, would you rather
00:46:11.080 | go there and actually still maintain your core stranglehold on search?
00:46:17.040 | Or do you actually want 75% share where now all of these other competitors have been seeded?
00:46:23.400 | Well, you can decay business model quality and still remain exclusive if you just double
00:46:28.000 | the TAC.
00:46:29.840 | And what you do is you put all these other guys on their heels because, as we talked
00:46:33.040 | about, if you're paying publishers two times more than what anybody else is paying them,
00:46:38.560 | you'll be able to get publishers to say, "Hey, you know what?
00:46:41.280 | Don't let those AI agents crawl your website because I'm paying you all this money.
00:46:45.080 | Remember that."
00:46:51.040 | So, a do-not-crawl robots.txt equivalent for these AI agents.
00:46:51.040 | And I think that that'll put Microsoft and all these other folks on their heels.
00:46:54.840 | And then as you have to figure out all this derivative work stuff, all these lawsuits,
00:47:01.040 | Google will look pristine because they can say, "I'm paying these guys double because
00:47:04.040 | I acknowledge that this is a core part of the service."
00:47:06.440 | So that's the game theory I think that has to get figured out.
00:47:08.840 | But if I was Sundar, I'd double the TAC.
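For what it's worth, the opt-out mechanics Chamath is describing already exist for conventional crawlers. A minimal sketch, using Python's standard-library robots.txt parser; the "ExampleAIBot" user-agent and the URLs are made-up placeholders:

```python
# How a well-behaved crawler checks robots.txt before fetching a page.
# An AI-agent equivalent would work the same way if publishers adopted one.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the site's robots.txt

# A publisher granting exclusivity could serve rules like:
#   User-agent: ExampleAIBot
#   Disallow: /
for agent in ("Googlebot", "ExampleAIBot"):
    print(agent, "may crawl:", rp.can_fetch(agent, "https://example.com/some-article"))
```

The open question, as discussed above, is whether AI agents would honor such a directive; that is what the anticipated litigation would settle.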
00:47:11.040 | I love the second part because...
00:47:13.160 | Hold on.
00:47:14.160 | Let me get Sacks in a second.
00:47:15.160 | I'm going to hold Sacks off because in this clip I'm about to show, Nilay Patel
00:47:18.960 | from The Verge did an awesome interview with Satya and he basically would not answer this
00:47:26.680 | question at least to my satisfaction, which is, "Hey, what do the publishers get out of
00:47:31.520 | this?
00:47:32.520 | You've ingested our information.
00:47:33.520 | How do we get paid?"
00:47:35.120 | Watch this clip.
00:47:36.120 | It's very telling.
00:47:37.120 | ...the answer, or even in the chat session.
00:47:38.720 | But if I ask the new Bing, "What are the 10 best gaming TVs?"
00:47:41.640 | And it just makes me a list, why should I, the user, then click on the link to The Verge,
00:47:48.000 | which has another list of the 10 best gaming TVs?
00:47:50.280 | Well, I mean, that's a great question.
00:47:51.760 | But even there, you will sort of say, "Hey, where did these things come from?"
00:47:56.480 | And would you want to go dig in?
00:47:58.600 | Even Search Today has that.
00:47:59.800 | We have answers.
00:48:00.800 | They may not be as high quality answers.
00:48:03.160 | They just are getting better.
00:48:04.280 | So I don't think of this as a complete departure from what is expected of a search engine today,
00:48:09.720 | which is supposed to really respond to your query, while giving them the links that they
00:48:14.080 | can then click on, like ads.
00:48:16.640 | And Search works that way.
00:48:18.220 | To my mind, that's a terrible answer.
00:48:19.680 | He needs to address how they get paid.
00:48:21.580 | He punted the answer and just said, "Hey, listen, Search works this way."
00:48:23.920 | Sacks, will the rights to the data, will Google just say to Quora, "Hey, we'll give you a
00:48:30.360 | billion dollars a year for this data set if you don't give it to anybody else."
00:48:33.800 | They should.
00:48:34.800 | They should.
00:48:35.800 | Maybe.
00:48:36.800 | Sacks, the strategist, let me hear your strategy here.
00:48:37.800 | You're now CEO of Google.
00:48:38.800 | What do you do?
00:48:39.800 | I think there's maybe even a bigger problem before that, which is I think the whole monetization
00:48:45.280 | model might change.
00:48:46.720 | So the reason why Google monetizes so well is it's perceived as having the best search,
00:48:52.200 | and then it gives you a list of links, and a bunch of those links are paid, and then
00:48:57.160 | people click on them.
00:48:58.680 | Now I think when you search in AI, you're looking for a very different kind of answer.
00:49:02.800 | You're not looking for a list of 10 or 20 links.
00:49:05.680 | You're just looking for the answer.
00:49:08.480 | And so where is the opportunity to advertise against that?
00:49:10.800 | I mean, maybe you can charge an affiliate commission if the answer contains a link in
00:49:16.600 | it or something like that, but then you have to ask the question, "Well, does that distort
00:49:21.560 | best answer?"
00:49:22.560 | Am I really getting the best answer, or am I getting the answer that someone's willing
00:49:26.000 | to pay for?
00:49:27.000 | This is your key insight.
00:49:28.880 | The fact is, if Google gives you an answer, you don't click on ads.
00:49:32.600 | Google has had a very finely tuned balance between, "Hey, these first two or three paid
00:49:38.960 | ads, these might, the paid links might actually give you a better answer than the content
00:49:44.160 | below them."
00:49:45.160 | But in this case, if ChatGPT tells you, "Hey, these are the top three televisions, these
00:49:49.200 | are the top three hotels, these are the top three ways to write a better essay," you don't
00:49:54.760 | need to click.
00:49:55.760 | You have now been given an answer, and the model is gone.
00:49:59.200 | The paid link is still a subset in that case.
00:50:03.280 | At Google, we used to have a key metric, which was the bounce-back rate.
00:50:07.840 | So when a user clicks on a result on the search results page, we could see whether or not
00:50:15.720 | they came back and searched again.
00:50:19.000 | That tells you the quality of the result that they were given because if they don't come
00:50:23.000 | back, it means they ended up getting what they were looking for.
00:50:26.800 | If an ad performed better than the organic search results, meaning someone created the ad,
00:50:33.040 | paid for it, and users who clicked on it came back with less frequency
00:50:39.040 | than if they had clicked on an organic result, that meant that the ad quality was higher
00:50:43.240 | than the organic quality.
00:50:44.760 | So the ad got promoted to kind of sit at the top and it became a really kind of important
00:50:48.800 | part of the equation for Google's business model, which is how do we source, how do we
00:50:52.800 | monetize more search results where we can get advertisers to pay for a better result
00:50:59.320 | than what organic search might otherwise kind of show.
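A toy sketch of that bounce-back comparison; the log format and the numbers here are invented for illustration, not Google's actual pipeline:

```python
# Toy bounce-back comparison: a click "bounces" if the user returns and
# searches again; a lower bounce rate means the result answered the query.
clicks = [
    {"result": "ad", "bounced": False},
    {"result": "ad", "bounced": False},
    {"result": "ad", "bounced": True},
    {"result": "organic", "bounced": True},
    {"result": "organic", "bounced": False},
    {"result": "organic", "bounced": True},
]

def bounce_rate(kind):
    subset = [c for c in clicks if c["result"] == kind]
    return sum(c["bounced"] for c in subset) / len(subset)

ad_rate, organic_rate = bounce_rate("ad"), bounce_rate("organic")
print(f"ad: {ad_rate:.2f}, organic: {organic_rate:.2f}")
if ad_rate < organic_rate:
    # Per the logic described: the ad answered the query better than the
    # organic result, so it earns the top slot.
    print("promote ad above organic")
```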
00:51:02.440 | So it's actually better for the user in this case than say just getting an answer.
00:51:06.840 | For example, I'm looking for a PlayStation 5.
00:51:09.720 | I don't want to just be told, "Hey, go to Best Buy and buy a PlayStation 5."
00:51:12.840 | I want to be taken to the checkout page to buy a PlayStation 5.
00:51:17.120 | I am more likely to be happy if I click on a result and it immediately takes me to the
00:51:20.920 | checkout page and Best Buy is really happy to pay for you to get there because they don't
00:51:24.560 | want you looking around the internet looking for other places.
00:51:27.520 | And we can't conflate all search queries.
00:51:30.600 | Not all search queries are, "Hey, what's the best dog to get to not pee on the floor?"
00:51:36.120 | Whatever kind of arbitrary question you might have that you're doing research on.
00:51:39.200 | Many search queries are commerce intention related.
00:51:42.680 | I want to buy a flight to go somewhere.
00:51:44.640 | I want to book a hotel to go somewhere.
00:51:46.480 | I want to buy a video game system, etc.
00:51:49.580 | That series of queries may have a very different kind of modality in terms of what's the right
00:51:53.680 | interface, versus the ChatGPT interface, where there are a lot of organic results that people
00:51:59.120 | sift through on the internet today.
00:52:01.800 | The question earlier can be resolved by Google doing a simple analytical exercise which is,
00:52:07.760 | "What's it going to cost us and what's going to give the user the best result?"
00:52:11.640 | That's ultimately what will resolve to the better business model.
00:52:14.120 | It's really measurable.
00:52:15.120 | I think on Chamath's point, today Google pays Apple $15 billion a year to be the default
00:52:22.540 | search engine on iPhones on the Safari browser.
00:52:26.940 | That's only about a quarter of Google's overall TAC.
00:52:29.460 | The majority of Google's traffic acquisition cost is actually not being paid for search.
00:52:33.460 | A good chunk of that is being paid to publishers to do AdSense display ads on their sites and
00:52:39.100 | Google's rev share back to them for putting ads on their sites.
00:52:42.420 | The TAC number, I think, maybe moves the needle, but the majority of Google
00:52:46.800 | searches don't come through the default search placement that they pay Apple for.
00:52:51.260 | It might move the needle a bit, but I don't think it really changes the equation for them.
00:52:55.020 | My comment is more that TAC has to become a weapon, on the front foot, number one.
00:53:01.060 | If you're going to spend 21% of your revenue on TAC, you should be willing to spend 30
00:53:05.260 | to 40% to maintain the 93% market share.
00:53:08.660 | I don't think what you want to see is your profit dollars decay because you lose share.
00:53:13.380 | It's rather better for you to spend the money and decay your business model than have someone
00:53:17.620 | decay it for you in general.
00:53:19.500 | At this point, Apple is really the only TAC line item for search.
00:53:24.740 | I understand.
00:53:25.740 | I'm not talking about today.
00:53:26.740 | I'm saying take that idea.
00:53:28.740 | You have an entire sales team whose job it is right now to sell AdSense.
00:53:32.900 | You have an entire group of people who know how to account for TAC and how to think about
00:53:36.420 | it as a cost.
00:53:37.740 | But if you're basically willing to say, "Out of the $100 billion of free cash flow, I'm
00:53:42.500 | willing to go to 80 or 70 billion of free cash flow combined with the 100 billion of
00:53:48.900 | short and long-term investments I have and I'm going to use it as a weapon.
00:53:53.540 | I'm going to go and make sure that all of these publishers have a new kind of agreement
00:53:57.660 | that they signed up for, which is I'll do my best to help you monetize.
00:54:02.380 | You do your best by being exclusive to our AI agents."
00:54:06.700 | You deprive other models of your content on your pages, because that will get litigated,
00:54:13.020 | and there is no way around it. Just like today, if you say, "Do not crawl," you're not allowed to
00:54:17.140 | crawl, whether you're Google or Microsoft researchers.
00:54:19.820 | This is going to happen for these agents.
00:54:21.580 | It's unrealistic to expect that it won't.
00:54:23.220 | My point is Google should do this and define how it's done before it's defined for them
00:54:28.780 | because right now people are in this nascent phase where everybody thinks everybody's going
00:54:32.900 | to be open and get along.
00:54:34.580 | I just think that that's unrealistic.
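Rough arithmetic behind "double the TAC," using only the figures cited in this exchange (~$45B of TAC at ~21% of revenue, ~$100B of free cash flow); these are the show's estimates, not Google's reported numbers:

```python
# "Double the TAC" scenarios, using the conversation's estimates.
tac_today = 45e9              # ~$45B traffic acquisition cost
revenue = tac_today / 0.21    # implied ~$214B revenue if TAC is ~21% of it
fcf = 100e9                   # ~$100B free cash flow

for extra in (20e9, 30e9, 45e9):   # incremental TAC spend scenarios
    pct = (tac_today + extra) / revenue
    print(f"extra TAC ${extra / 1e9:.0f}B -> "
          f"TAC at {pct:.0%} of revenue, FCF ${(fcf - extra) / 1e9:.0f}B")
# -> roughly the 30-40% of revenue and $70-80B of FCF discussed above.
```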
00:54:36.940 | It's a really important kind of philosophical question.
00:54:39.340 | First off, Google today, just so people know, on AdSense is typically paying out 70 cents
00:54:43.700 | on every dollar to the publisher.
00:54:47.060 | It's pretty generous, and it's the way they've kind of kept the competitive moat wide and
00:54:52.300 | kept folks out of beating them on third-party ad network bids because they bid on everything
00:54:57.500 | and they always win because they always share the most revenue back.
00:55:00.820 | They own that market.
00:55:01.820 | When it comes to acquiring content, the internet is open.
00:55:05.260 | It's an open protocol.
00:55:06.380 | Anyone can go to any website by typing in the IP address and viewing the content that
00:55:10.180 | a publisher chooses to make available on that server to display to the internet.
00:55:13.940 | There's a fair use policy.
00:55:14.940 | That's not true.
00:55:15.940 | You can type in any IP address.
00:55:18.380 | I think 15 or 20 or 30% of the pages on the internet right now are apps that are closed.
00:55:22.700 | Facebook's closed.
00:55:23.700 | Instagram's closed.
00:55:24.700 | I'm talking about the open internet, right?
00:55:25.700 | So like the content on the open internet.
00:55:27.340 | I'm saying the open internet matters less and less.
00:55:29.820 | Yeah, I don't know.
00:55:31.220 | I mean, look, you're right.
00:55:32.220 | Maybe there's the enhancement of the models, but my point being that if the internet is
00:55:35.580 | open and you and I spent a billion lifetimes reading the whole internet and getting smart,
00:55:41.180 | and then we were the chatbot and someone came and asked us a question and we could kind
00:55:44.460 | of answer their question because we've now read the whole internet, do I owe licensing
00:55:50.260 | royalty revenues for the knowledge that I gained and the synthesis that I did, which ultimately
00:55:55.300 | meant excluding some things, including some things, combining certain things, and the
00:55:59.500 | problem with these LLMs, these large language models, is that you end up with 100 million,
00:56:05.260 | a billion plus parameters that are in these models that are really impossible to deconvolute.
00:56:11.340 | You don't know.
00:56:12.340 | We don't really understand deeply how the neural network, the model, is defined and
00:56:17.820 | run based on the data that it is constantly kind of aggregating and learning from.
00:56:22.280 | And so to go in and say, hey, it learned a little more from this website and a little
00:56:25.260 | less from that website is a practical impossibility.
00:56:28.180 | I'm saying, when you look at transformer architecture today, every LLM that you train on the same
00:56:33.900 | corpus of underlying data will get to the same answer.
00:56:37.820 | So my point is today, if you're a company, the most important thing that you can do,
00:56:43.660 | especially if you have a $1 trillion plus market cap that could get competed away, is
00:56:49.660 | to figure out how to defend it.
00:56:51.100 | And so all I'm saying is from the perspective of a shareholder of Google and also from the
00:56:55.340 | perspective of the board of director or senior executive or the CEO, this should be the number
00:57:00.700 | one thing that I'm thinking about.
00:57:02.380 | And my framing of how to answer that question is: build a competitive moat around two things.
00:57:09.940 | One is at the money end, which is how much money and what kind of relationship do I have with
00:57:16.820 | my customers, including the publishers, and can I give them more, so that, number two,
00:57:24.700 | I can affect whom they decide to contribute their content to.
00:57:28.740 | So you're right.
00:57:29.740 | Let's assume that there are five of these infinite libraries in the world.
00:57:34.460 | You mean non-public content.
00:57:35.940 | How important and also public.
00:57:37.700 | How important is it if Quora says, you know what guys, I've done a deal where my billions
00:57:43.420 | of page views and all of that really rich content, Quora's incredible content, Google's
00:57:48.060 | paying me $2 billion a year and so I've decided to only let Google's AI agents crawl it.
00:57:53.180 | And so maybe when there are questions that Quora is already doing a phenomenal job of
00:57:59.060 | answering, I think it does make a difference that Google now has access to Quora's content
00:58:04.660 | and others don't.
00:58:06.580 | For a hot minute, they did have access to the Twitter firehose and that was the premise
00:58:09.660 | was, we could get this corpus of data that we can have in a very limited, restricted way.
00:58:15.340 | They paid Twitter a lot of money.
00:58:16.460 | I don't think that those deals exist anymore.
00:58:18.340 | Twitter, I mean, you guys might know better than I do, but I don't think they exist anymore.
00:58:21.620 | First of all, Chamath, I think you're right.
00:58:23.900 | And maybe, Dave, I think you're being too forgiving.
00:58:27.920 | These models know where they got the data and they can easily cite the sources and they
00:58:33.660 | could easily pay for it.
00:58:35.300 | And if you want to say something, hold, go ahead.
00:58:38.460 | You're absolutely right.
00:58:39.460 | The video in the Wall Street Journal where Satya was interviewed showed a demo and you're
00:58:43.980 | exactly right.
00:58:44.980 | They actually showed, Jason, in the search results, the five or six sources, but it made no sense because
00:58:49.860 | it's like, how do you know that those are the five most cited places that resulted in
00:58:54.020 | this?
00:58:58.860 | Well, by PageRank technology, or the authority of the website or the author.
00:58:58.860 | But let's pause for a second here.
00:59:02.620 | There is a company called Neeva, neeva.com.
00:59:02.620 | I'm not an investor.
00:59:03.620 | None of us are.
00:59:06.060 | It was founded by a former Googler.
00:59:06.060 | They have 78 employees, I think, according to LinkedIn.
00:59:11.140 | I just typed in, "What are the best flat-panel TVs?"
00:59:11.140 | Here's the result.
00:59:12.140 | And as you see, sentence by sentence, as it rewrites another person's content, it links
00:59:18.780 | with a citation, just like Wikipedia does.
00:59:21.660 | And when you scroll to the bottom of it, it tells you, hey, this is from Rolling Stone.
00:59:24.220 | This is from Best Buy.
00:59:25.220 | This is from RTINGS.
00:59:26.340 | And if that answer is good for you and you trust those sources, those people should get
00:59:29.980 | a commission.
00:59:31.100 | Every time there's a thousand searches and you come up, you should get a dollar every
00:59:34.820 | time your data was used.
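A toy sketch of the per-citation payout Jason is proposing; the rate (a dollar per thousand uses) comes from his example, while the citation log and domains are invented for illustration:

```python
# Toy per-citation payout model for cited sources.
from collections import Counter

# Each answer records which sources it cited, Neeva-style (invented log).
answers = [
    ["rollingstone.com", "bestbuy.com"],
    ["rollingstone.com", "rtings.com"],
    ["rtings.com"],
]
rate_per_citation = 1.00 / 1000   # "a dollar every thousand searches"

citations = Counter(src for cited in answers for src in cited)
for source, count in citations.items():
    print(f"{source}: {count} citations -> ${count * rate_per_citation:.3f}")
```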
00:59:36.620 | And if not, these sites should sue the daylights out of Google.
00:59:40.380 | And for Google to say they can't do it is hogwash.
00:59:42.980 | They can't attribute it.
00:59:43.980 | Why isn't it fair use?
00:59:44.980 | It's not fair use because in fair use, you have the ability to create derivative works
00:59:50.580 | on future platforms.
00:59:52.740 | And you are taking this person, the original content owner's, ability to exploit that, and
00:59:57.820 | you are co-opting it and you're doing it at scale.
01:00:00.060 | And that is against fair use.
01:00:01.580 | You're not allowed to interfere with my ability to make future products.
01:00:04.100 | David, you know this, as an attorney.
01:00:05.660 | The problem with that... You're not an IP lawyer.
01:00:07.900 | Sorry.
01:00:08.900 | All right.
01:00:09.900 | Well, I play one on TV and on this podcast.
01:00:11.420 | The problem with that idea, just from a product perspective for a second, is that if you limit
01:00:18.060 | how they can tokenize, to just using entire sentences, the product will not be that good.
01:00:23.820 | Like the whole idea of these LLMs is that you're running, you know, so many iterations
01:00:28.100 | to literally figure out what is the next best word that comes after this other word.
01:00:33.260 | And if you're all of a sudden stuck with blocks of sentences as inputs that can't be violated
01:00:37.500 | because of copyright, the product will not be as good.
01:00:40.260 | I just don't think it'll be as useful.
01:00:42.220 | Correct.
01:00:43.220 | These are also not deterministic models and they're not deterministic outputs, meaning
01:00:46.340 | that it's not a discrete and specific answer that's going to be repeated every time the
01:00:51.100 | model is run.
01:00:52.380 | These are statistical models.
01:00:54.240 | So they infer what the right answer could or should be based on a corpus of data and
01:00:58.460 | a synthesis of that data to generate a response to a query.
01:01:02.380 | That reference, that inference is going to be, you know, assigned some probability score.
01:01:07.740 | And so the model will resolve to something that it thinks is high probability, but it
01:01:11.020 | could also kind of say, there's a chance that this is the better answer, this is the better
01:01:13.620 | answer, and so on.
01:01:15.300 | And so when you have like you have in the internet competing data, competing points
01:01:19.140 | of view, competing opinions, the model is synthesizing all these different opinions
01:01:23.980 | and doing what Google search engine historically has done well, which is trying to rank them
01:01:27.940 | and figure out which ones are better than others.
01:01:30.060 | And that's a very dynamic process.
01:01:31.820 | And so if, as part of that ingest process, one is using some openly readable data
01:01:37.260 | set, that doesn't necessarily mean that that data set is improving the quality of the output
01:01:42.500 | or is necessarily the source of the answer in the output.
01:01:45.780 | Correct.
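A minimal illustration of the non-determinism Friedberg is describing: the model assigns probability scores to candidates and samples among them, so the same query can yield different answers on different runs. The candidates and weights below are invented (real models score tokens, not whole answers):

```python
# Sampling from a toy "next answer" distribution: same model, same query,
# possibly different output on each run.
import random

candidates = {
    "Answer A": 0.55,  # highest-probability synthesis
    "Answer B": 0.30,  # plausible alternative
    "Answer C": 0.15,  # less likely, still possible
}

for run in range(3):
    answer = random.choices(list(candidates), weights=list(candidates.values()))[0]
    print(f"run {run}: {answer}")
```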
01:01:46.780 | Let me just give everybody a quick four factor education on fair use.
01:01:51.320 | And here it is from Google's actual website, because they deal with this all the time.
01:01:55.580 | And when you look at the first factor, the purpose and character of the use, including whether
01:01:59.740 | such use is of a commercial nature or is for nonprofit educational purposes.
01:02:03.220 | So that first test of fair use is, hey, if you're using it educationally, and you want
01:02:07.260 | to make a video that is criticism of the Star Wars prequels, or how to shoot a shot like
01:02:14.580 | Quentin Tarantino, if it's educational, it's fine.
01:02:17.540 | And courts typically focus, I'm reading here from Google, on whether the use is transformative.
01:02:21.540 | That is whether it adds new expression or meaning to the original or whether it merely
01:02:24.820 | copies the original.
01:02:26.240 | It's very obvious that this is not transforming if they're just rewriting it.
01:02:29.720 | The nature of the output is pretty transformative to me.
01:02:32.320 | I don't think so.
01:02:33.320 | Not at all.
01:02:34.320 | They're coming out with entirely new content.
01:02:36.520 | They're just rewriting it.
01:02:37.920 | They're not actually adding anything to it.
01:02:39.960 | Transforming would be fine.
01:02:41.920 | If a human does it, it's not.
01:02:43.800 | If you transform it, it's pretty cool.
01:02:47.280 | Listen, Jacob, I think the rights issue is just like the cost issue, which is a problem
01:02:54.040 | today maybe but it's going to get sorted out.
01:02:56.600 | But here let me finish.
01:02:57.600 | New technology waves that are this powerful don't get stymied by either chip costs or
01:03:03.480 | legal rights issues.
01:03:04.800 | They do by legal rights.
01:03:05.800 | You're 100% wrong.
01:03:06.800 | It's going to get worked out.
01:03:07.800 | YouTube got stopped dead in its tracks, and the same way YouTube and Napster got stopped
01:03:11.400 | in their tracks, I predict this is going to get stopped dead in its tracks with YouTube-
01:03:15.360 | level, near-death-experience lawsuits.
01:03:16.360 | Napster was pure piracy.
01:03:18.340 | This is different.
01:03:19.340 | And Google was enabled piracy and then they had to build tools to fight against it.
01:03:22.840 | I deeply disagree with Jacob.
01:03:23.840 | I deeply disagree with you.
01:03:24.840 | I disagree with both you guys.
01:03:25.840 | You guys both think that cost is going to stay in the way.
01:03:26.840 | Hold on, let me read you number four.
01:03:27.840 | Nothing's going to stay in the way of the AI.
01:03:28.840 | No, it's going to stay in the way.
01:03:29.840 | I don't think it's going to stay in the way.
01:03:30.840 | The AI wants to happen.
01:03:31.840 | The AI is going to happen.
01:03:32.840 | Big companies using AI to inst- The AI is already happening.
01:03:33.840 | AI is happening.
01:03:34.840 | Let's move the conversation forward.
01:03:35.840 | Hold on, I would like to make my point.
01:03:36.840 | I would like to make my point.
01:03:37.840 | We don't need your amateur lawyer opinions.
01:03:38.840 | I am going to give my point.
01:03:39.840 | I don't give a shit if you want it or not.
01:03:40.840 | The effect of the use of harm- Are you going to bill us $500 an hour for
01:03:41.840 | pretending to be a lawyer on TV?
01:03:42.840 | Sure, dude.
01:03:43.840 | He's been here an hour.
01:03:44.840 | He's been here an hour.
01:03:45.840 | He's better call Saul.
01:03:46.840 | I've heard this point of view before from you.
01:03:47.840 | Here it is.
01:03:48.840 | Okay, take it easy, Mr. Sub- Better call J-Cal.
01:03:49.840 | This is the third time you've done this.
01:03:50.840 | The effect, listen to this.
01:03:52.840 | The effect of the use upon the potential market for, or value of, the copyrighted work.
01:05:14.840 | The use case that I saw this past week in a product demo
01:05:19.840 | was, they were showing me an Excel spreadsheet,
01:05:24.840 | like Excel spreadsheet modeling of a financial asset.
01:05:29.840 | And they had a plug-in to a ChatGPT-type AI.
01:05:34.840 | And so they just asked it.
01:06:09.840 | And ChatGPT spat out a formula that was like perfect Excel logic
01:06:14.840 | that was something that you or I could never figure out.
01:06:19.840 | You need a super pro user of Excel to basically know how to do this stuff.
01:06:24.840 | So it spit it out and boom, it worked instantly.
01:06:29.840 | And what we're thinking about is that we're going to have these little assistants everywhere.
01:06:34.840 | You combine that power with, say, speech to text, right?
01:06:39.840 | Because we could have just talked to it, the speech to text would transcribe the instruction,
01:06:44.840 | spit it back out, and you're going to have these little personal digital assistants in applications.
01:06:49.840 | I think it's pretty obvious to see how AI could replace call centers,
01:06:54.840 | with the frontline call center operator, instead of being a human, being an AI.
01:06:59.840 | But this is actually even before that.
01:07:04.840 | I think in every single application that we use, there's going to be an AI interface
01:07:09.840 | and it's probably going to be voice based where you can just say to it,
01:07:14.840 | "Hey, I'm trying to accomplish this. How do I do it? Can you just make it happen?"
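A minimal sketch of how a plug-in like the one in that demo might work, wired to OpenAI's public completions API. The model name, the prompt, and all the plumbing here are illustrative assumptions, not the demoed product's actual implementation:

```python
# Ask a hosted LLM to write an Excel formula from plain English.
# Hypothetical wiring; requires OPENAI_API_KEY in the environment.
import os
import requests

prompt = (
    "Write a single Excel formula that returns the internal rate of return "
    "for monthly cash flows in cells B2:B61. Reply with only the formula."
)

resp = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={"model": "text-davinci-003", "prompt": prompt, "max_tokens": 64},
    timeout=30,
)
formula = resp.json()["choices"][0]["text"].strip()
print(formula)  # paste the returned formula into the sheet
```

Put speech-to-text in front of this, as described above, and the typed prompt simply becomes a transcribed one.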
01:07:19.840 | I have an idea. I was hanging out with Andrej Karpathy and I gave him this following challenge.
01:07:24.840 | So there I was.
01:07:27.840 | I said, "If you had to build Stripe, how many engineers do you think it would take you
01:07:32.840 | and how long would it take you to build a competitor?" It was just a thought exercise.
01:07:36.840 | It would take hundreds of millions of dollars and years.
01:07:40.840 | Now imagine you were feeling threatened by Stripe.
01:07:45.840 | Imagine you're a large company. Visa MasterCard, just as an example.
01:07:50.840 | You can now actually get one or two really smart people like him
01:07:55.840 | to lead an effort where you would say, "Here's a couple hundred million dollars
01:08:00.840 | to compete with Stripe, but here are the boundary conditions. Number one is you can only hire five or ten engineers."
01:08:05.840 | What you would do is you would actually use tools like this to write the code for you.
01:08:10.840 | The ability to write code is going to be the first thing
01:08:15.840 | that these guys do incredibly well with absolute precision.
01:08:20.840 | You can already do unit testing incredibly well, but it's going to go from unit testing to basically end-to-end testing.
01:08:25.840 | You'll be able to build a version of Stripe extremely quickly and in a very lean way.
01:08:30.840 | So then the question is, "Well, what would you do with the two or three hundred million you raised?"
01:08:35.840 | My thought is you use it, again, as TAC.
01:08:40.840 | You go to customers and you're like, "Well, listen, if Braintree is going to charge you
01:08:45.840 | a hundred basis points over Visa and Mastercard, and Stripe will charge
01:08:50.840 | fifty, you know what? I'll charge ten."
01:08:53.840 | Margin destruction.
01:08:54.840 | Margin destruction.
01:08:55.840 | Totally. And this is going to make everything.
01:08:57.840 | That's what's so interesting. You can take any business that's a middleman business.
01:09:01.840 | This is the point. Any middleman business right now that doesn't have its own competitive moat
01:09:06.840 | can be competed against because now you can take all of those input costs that go into human capital,
01:09:11.840 | you can defer that, have a much smaller human capital pool, and push all of that extra money
01:09:17.840 | into traffic acquisition and subsidization.
01:09:19.840 | This goes to the movement in Silicon Valley of being more efficient.
01:09:21.840 | This is going to lead to that efficiency.
01:09:23.840 | The net benefit of all of this is economic productivity because the end customer that's using that tool that you just mentioned,
01:09:28.840 | they now have a lower cost to run their business and their total net profits go up.
01:09:32.840 | And this is what happens with every technology cycle.
01:09:35.840 | It always yields greater economic productivity and that's why the economy grows.
01:09:39.840 | And that's why, I just want to say, this is so important.
01:09:41.840 | That's why technology is so important to drive economic growth, not debt.
01:09:46.840 | We've historically used financial engineering to drive economic growth.
01:09:49.840 | And this is why we need immigration.
01:09:51.840 | Technology and innovation drive this.
01:09:52.840 | What about China?
01:09:53.840 | What about China?
01:09:54.840 | Wait, what are you talking about?
01:09:55.840 | What we're describing here is AI making it so you need fewer and fewer humans to be productive.
01:10:02.840 | So you want to bring in millions of people?
01:10:05.840 | No, it's going to augment human talent.
01:10:07.840 | It's going to augment human labor.
01:10:08.840 | It's going to make super humans.
01:10:09.840 | One human can do the coding that 20 humans would do before.
01:10:13.840 | Assuming they're skilled enough to use the AI.
01:10:14.840 | Guys, go back to traditional capitalism.
01:10:16.840 | Chat prompts are easier than programming.
01:10:19.840 | Go back to traditional capitalism for a second.
01:10:21.840 | Go back because the Stripe example is another good one.
01:10:24.840 | If you have this business model, how does the ecosystem get efficient?
01:10:30.840 | How do we create more opportunity to use Freeberg's language?
01:10:33.840 | The only way that it really happens, how does cost go down, is that certain entities become
01:10:38.840 | big enough that they can drive the prices down.
01:10:40.840 | An Uber, a DoorDash, or whomever says, "Yes, I need payments capability, so Braintree,
01:10:46.840 | give me your best bid.
01:10:47.840 | Checkout.com, give me your best bid.
01:10:49.840 | Adyen, give me your best bid.
01:10:50.840 | Stripe, give me your best bid."
01:10:51.840 | You compete it down.
01:10:52.840 | Amazon comes in.
01:10:54.840 | Walmart comes in.
01:10:55.840 | They do it in physical CPG goods.
01:10:57.840 | They do it online.
01:10:58.840 | They do it for all kinds of technologies.
01:11:00.840 | But you've never had an internal force that can create hyper-efficiency and basically
01:11:05.840 | create customer value like this thing can, because this thing can allow a bazillion
01:11:12.840 | 10-person companies to get created that can do the work of 10,000 people.
01:11:16.840 | That's so cool.
01:11:17.840 | Hold on.
01:11:18.840 | Let me show you two things.
01:11:19.840 | These are two aha moments I had this week.
01:11:20.840 | This first one is called Galileo AI.
01:11:21.840 | It was just a tweet.
01:11:22.840 | You describe the design of an app.
01:11:24.840 | I saw this.
01:11:25.840 | This is cool.
01:11:26.840 | Here they say, "An onboarding screen of a dog walking app."
01:11:28.840 | Incredible.
01:11:29.840 | You type that in, and it gives you a welcome screen.
01:11:32.840 | That's like a seven out of 10.
01:11:33.840 | Then it says, "Oh, a way for people to change their name, phone number, and password."
01:11:36.840 | You know, that classic screen on any app, "I need to change my thing."
01:11:39.840 | Boom, it gives you that.
01:11:42.840 | Well, then at the same time that people are making text-to-UX user interfaces, beautiful stuff
01:11:47.840 | here, this will get dumped into Figma.
01:11:49.840 | Then there's GitHub Copilot, which if you haven't seen, we all know it here.
01:11:56.840 | GitHub Copilot. GitHub was bought by Microsoft, another one of Satya Nadella's incredible
01:12:03.840 | acquisitions.
01:12:04.840 | This guy's like the new Zuckerberg.
01:12:06.840 | I mean, what an incredible person to come after Ballmer; he's just so effective at what he's
01:12:12.840 | doing.
01:12:13.840 | As you're writing your code, it fills in your code.
01:12:16.840 | It knows what you're writing, just like in an email.
01:12:18.840 | It's a smaller subset of information than email.
01:12:20.840 | Email, you could write anything.
01:12:21.840 | You could be talking to a lover or a business person, whatever.
01:12:25.840 | Here, when you're doing programming, it's a much finer dataset.
01:12:29.840 | These two things are going to come together where you're going to be able to build your
01:12:32.840 | MVP for your startup by typing in text and then publish it.
01:12:36.840 | You're not going to need a developer for your startup.
01:12:38.840 | That is transformative in the world.
01:12:40.840 | Let me ask you guys a question.
01:12:42.840 | Do you think that this leverage, and I argue this is all about leverage, it's that one person
01:12:47.840 | can generate x more units of output.
01:12:50.840 | Do you guys think that this commoditizes and puts at risk, Sacks in particular, like all
01:12:56.840 | of enterprise SaaS, because it becomes such a commodity to basically build a business
01:13:00.840 | that does something now?
01:13:02.840 | Or does it create much higher returns for investors because you invest so much less
01:13:05.840 | capital to get to a point of productivity with that business or revenue of that business
01:13:10.840 | than was needed before?
01:13:12.840 | I'm not sure.
01:13:13.840 | I tend to think that if it gets easier, then everything becomes more competitive.
01:13:19.840 | So, I don't know.
01:13:21.840 | So value gets competed away.
01:13:23.840 | By the way, I would slightly disagree with the characterization.
01:13:26.840 | Does this scare you as an investor?
01:13:28.840 | I don't know.
01:13:29.840 | For example, in that demo we just saw, what does the AI do?
01:13:33.840 | It exports the new design assets to Figma format.
01:13:36.840 | Yeah, it just makes you faster.
01:13:38.840 | So, if you can create a SaaS product that becomes a standard, everyone's still going
01:13:43.840 | to want to use it.
01:13:45.840 | There's really good reasons for that.
01:13:47.840 | I don't know.
01:13:48.840 | I don't think business software is going away.
01:13:50.840 | I also don't think that you're not going to need to hire engineers because Copilot's
01:13:53.840 | just going to do it for you.
01:13:55.840 | I think what Copilot will do is make your typical engineer more productive.
01:13:59.840 | They had some graphs around how Copilot reduced coding time by 50%.
01:14:05.840 | So, I think you'll be able to get a lot more out of your developers.
01:14:08.840 | I think that's sort of the key is a lot of the drudgery work gets taken care of.
01:14:13.840 | The answer, Friedberg, to your question is I think more startups, more niche startups
01:14:17.840 | will make better products, and then you'll just have many more folks making SaaS for
01:14:23.840 | dentists.
01:14:24.840 | Are those ventures?
01:14:25.840 | Sure.
01:14:26.840 | Because they don't get big enough, right?
01:14:27.840 | In that case...
01:14:28.840 | Well, entry price matters.
01:14:29.840 | It'll be poorer returns.
01:14:31.840 | Well, entry price matters too.
01:14:32.840 | If you're investing at $5 million like I do in companies when they're just on napkins
01:14:36.840 | and, you know, back of envelope, there's plenty of room.
01:14:38.840 | If you have a $200 million exit, that's a 40x.
01:14:41.840 | But there could be a lot of new categories too.
01:14:43.840 | Like a lot of new categories.
01:14:44.840 | A lot of new categories.
01:14:45.840 | There could be a lot of traditional industries.
01:14:47.840 | There's a lot of traditional industries that get disrupted.
01:14:49.840 | Like, we're thinking about just software displacing software.
01:14:52.840 | It could be software displacing, like, industries that aren't even software yet.
01:14:56.840 | Video games.
01:14:57.840 | I think the entire video game industry is going to get completely rewritten with AI
01:15:00.840 | because you're not going to have a publisher anymore that makes one game that everyone consumes.
01:15:03.840 | You're going to have tools that everyone creates and consumes their own game.
01:15:07.840 | You guys want to watch this cool clip?
01:15:09.840 | It's 58 seconds long.
01:15:10.840 | Go ahead.
01:15:11.840 | This is a music...
01:15:12.840 | Clip day here on All In.
01:15:14.840 | But I thought it was really cool.
01:15:15.840 | Is this AI turning you into Eminem?
01:15:17.840 | This is what the world's waiting for.
01:15:19.840 | I hope it's you doing an Eminem song.
01:15:21.840 | It's not me.
01:15:22.840 | I'm naturally Eminem-like, but this is a...
01:15:24.840 | Only in your anger.
01:15:25.840 | This is the future rave sound.
01:15:26.840 | I'm getting lost in an underground.
01:15:28.840 | This is the future rave sound.
01:15:30.840 | I'm getting lost in an underground.
01:15:32.840 | Is that Eminem?
01:15:33.840 | It's David Guetta.
01:15:35.840 | David Guetta playing a track with Eminem's voice, right?
01:15:39.840 | Eminem's voice.
01:15:40.840 | And the track just became hugely viral.
01:15:42.840 | This is something that I made as a joke, and it works so good I could not believe it.
01:15:46.840 | I discovered those websites that are about AI.
01:15:50.840 | Basically, you can write lyrics in the style of any artist you like.
01:15:57.840 | So I typed, "Write a verse in the style of Eminem about future rave."
01:16:03.840 | And I went to another AI website that can recreate the voice.
01:16:10.840 | I put the text in that, and I played the record, and people went nuts.
01:16:15.840 | That is nuts.
01:16:17.840 | It's nuts.
01:16:18.840 | And the crowd went wild for it.
01:16:19.840 | So awesome.
01:16:20.840 | So here it is, folks.
01:16:21.840 | Whoever makes the best Friedberg Eminem hybrid rap with David Sacks as the hype man is getting
01:16:26.840 | a free VIP ticket to All In Summit 2023.
01:16:29.840 | I think we need an AI performance at All In Summit 2023.
01:16:33.840 | 1000%.
01:16:34.840 | You and I are so in sync these days, Friedberg.
01:16:36.840 | There are going to be a lot of interesting mashups that get created.
01:16:39.840 | For example, you'd be able to create a movie where, let's say you want to make a Western
01:16:44.840 | and you want John Wayne to star in it.
01:16:46.840 | You obviously get the rights from the Wayne estate, but no actor ever goes away.
01:16:50.840 | There's going to be a database of all of them.
01:16:52.840 | But if you wanted to make it--
01:16:53.840 | Sacks, it's going to be better.
01:16:54.840 | You're going to be writing the script.
01:16:56.840 | As you write the script, the AI is going to be showing you that scene in real time.
01:17:00.840 | And you don't have to publish it.
01:17:01.840 | Cloudy day, rainy day.
01:17:03.840 | It'll just change in real time.
01:17:05.840 | Think about what that would do for Star Wars.
01:17:07.840 | Just like Instagram and TikTok basically democratize everyone's ability to create and publish content.
01:17:15.840 | This takes it to a whole other level where the monopoly that big production houses have
01:17:21.840 | is they have the big budget so they can afford to make a big movie.
01:17:24.840 | If that cost of making a $10 million movie goes to $10,000 or $1,000 of compute time,
01:17:28.840 | anyone sitting in their studio in a basement can start to make a movie.
01:17:32.840 | And it really changes the landscape for all media, not just movies, music, video games.
01:17:39.840 | And ultimately, the consumers themselves can create stuff for their own enjoyment.
01:17:43.840 | And maybe the best of those products win.
01:17:45.840 | But having each individual create their own content--
01:17:48.840 | Maybe it's just become a lot more like the music industry.
01:17:50.840 | Remember, anyone can really create a song now.
01:17:52.840 | And they do.
01:17:53.840 | And people do go viral on TikTok.
01:17:55.840 | And they upload it on DistroKid, and it's on Spotify instantly.
01:17:58.840 | Did you guys see this article in the New York Times that was kind of throwing some shade at the CEO of Goldman Sachs, David Solomon?
01:18:06.840 | I did see the headline.
01:18:07.840 | You look great in the picture, but I didn't read it because I figured it's hate.
01:18:10.840 | They were talking about his side gig as being a DJ, but specifically they called out a potential conflict of interest.
01:18:18.840 | And it's related to this because I guess he had gotten a license to a Whitney Houston song.
01:18:23.840 | And he remixed it and released it on Spotify.
01:18:27.840 | And her biopic is about to come out.
01:18:30.840 | And they thought that there could be this perceived conflict because Goldman works on behalf of the publishing company.
01:18:36.840 | And my thought was along the lines of what you guys said, like why is this a story?
01:18:40.840 | Meaning David Solomon should be able to go to any website, license the song, make it, and then submit it back to them for them to approve.
01:18:48.840 | Because the quote in here that matters is the company that licensed it said, "We are in the business of making sure this body of music stays relevant."
01:18:56.840 | So obviously you want Whitney Houston songs, Michael Jackson songs.
01:19:00.840 | You want John Wayne.
01:19:01.840 | You want these people to live on in culture because it's part of our culture.
01:19:05.840 | Look at this subhead.
01:19:07.840 | David Solomon brushes off DJing.
01:19:10.840 | The better way to maximize it would be like this.
01:19:12.840 | Go and use it.
01:19:13.840 | Create a derivative work.
01:19:14.840 | Let us see it if we like it.
01:19:15.840 | So like Gweta should be able to just give that back to Eminem.
01:19:18.840 | If Eminem's cool with it, he should be able to ship it and just be done.
01:19:21.840 | What a non-story.
01:19:22.840 | Like the New York Times is just so anti-billionaire.
01:19:25.840 | So David Solomon brushes off DJing as a minor hobby that has little to do with his work at the bank.
01:19:30.840 | But his activities may pose potential conflicts of interest.
01:19:34.840 | It's like what are they writing about?
01:19:36.840 | Is there not something more important than this?
01:19:38.840 | Do we think that David Solomon or any CEO of any major bank on Wall Street would put their job at risk running one of the 20 or 30 most important institutions in the financial architecture of the world to license a Whitney Houston song that they can play at Coachella?
01:19:52.840 | I mean does that pass the smell test?
01:19:56.840 | No, no.
01:19:57.840 | And just for the record, iconoclastic David Solomon, you're going to be doing the opening night DJ set for All in Summit 2023.
01:20:05.840 | The grift is on.
01:20:07.840 | Let's get him.
01:20:08.840 | He's booked.
01:20:09.840 | If the New York Times hates him, anybody the New York Times hates on, you got a slot.
01:20:13.840 | You got a slot.
01:20:14.840 | That'll be our lens here.
01:20:16.840 | Dave Chappelle?
01:20:17.840 | Dave Chappelle for sure.
01:20:18.840 | 1000%.
01:20:19.840 | I mean he looks good.
01:20:20.840 | It looks like he's living his best life.
01:20:21.840 | It's a finance news.
01:20:23.840 | Does this just like offend them somehow?
01:20:24.840 | It just like offends them that a corporate CEO could DJ?
01:20:27.840 | Well that anybody's happy.
01:20:29.840 | I think it's kind of cool this guy DJs.
01:20:33.840 | Punk rock.
01:20:34.840 | Yeah, yeah.
01:20:35.840 | Let him do his thing.
01:20:36.840 | I've always wanted to be a DJ.
01:20:37.840 | Yeah?
01:20:38.840 | Oh, DJ Palihapitiya.
01:20:40.840 | Spin the one and twos.
01:20:41.840 | You could be DJ one night at the All in 2020 after party and we'll use AI to help you out.
01:20:47.840 | Sadly, when I should have been learning how to DJ, I was playing the violin.
01:20:51.840 | You should get up there and Urkel your violin under a DJ set.
01:20:55.840 | That would be next level.
01:20:56.840 | I played violin for 10 years.
01:20:57.840 | We can play a duet together.
01:20:58.840 | I played for 14 years in an orchestra.
01:21:01.840 | I played in an orchestra.
01:21:02.840 | Oh my god, I never knew this.
01:21:03.840 | Never again.
01:21:04.840 | This is all kind of revelations happening here.
01:21:07.840 | By the way, here was my most, pull up that first chat GPT I gave you.
01:21:11.840 | This was an aha moment.
01:21:12.840 | Me and Sax were doing our weekly mastermind group.
01:21:16.840 | Sax and I get together.
01:21:17.840 | We kind of like co-mentor each other.
01:21:18.840 | Here was something that blew us away during our mastermind group.
01:21:22.840 | This was a chat GPT.
01:21:23.840 | I did this one because I was trying to figure this out.
01:21:25.840 | How do I make my spouse and kids feel heard?
01:21:28.840 | And in chat GPT gave us a great one.
01:21:29.840 | Give them your full attention.
01:21:31.840 | Number two, empathize.
01:21:32.840 | Number three, validate their feelings.
01:21:33.840 | There you have it, Sax.
01:21:34.840 | Just put that into your subroutine.
01:21:36.840 | Let me use this as an example.
01:21:37.840 | So that's obviously the aggregation and synthesis of lots of different self-help websites.
01:21:42.840 | How do you describe attribution of that answer to a particular content publisher?
01:21:47.840 | Honestly, serious question.
01:21:48.840 | Fair use, derivative work.
01:21:51.840 | I think in this case it would be compassionate.
01:21:54.840 | No monetization opportunity there.
01:21:56.840 | There's no monetization.
01:21:57.840 | Yeah, you say that because you hate content providers.
01:21:59.840 | But here I tell you, this is compassionate publishing.
01:22:02.840 | I think the GPT, the chat GPT should be given a pass.
01:22:06.840 | You have weird blinders on on this one, J-Cal. I'll be honest.
01:22:06.840 | No, they know who they got it from.
01:22:08.840 | It's a synthesis of lots of websites.
01:22:10.840 | It's a very smart synthesis.
01:22:12.840 | The algorithm is not just pulling.
01:22:13.840 | It's not pulling a result from someone's website.
01:22:15.840 | It's like read hundreds of websites and it's like averaged them.
01:22:18.840 | Bulldoggy.
01:22:19.840 | How do you pay for the average?
01:22:20.840 | That's complete bullshit.
01:22:22.840 | They could say, as we ingest this stuff, tag it.
01:22:24.840 | That's not bullshit.
01:22:25.840 | That is how it's being done.
01:22:26.840 | And then you could publish it.
01:22:27.840 | When you publish it, you say, hey, where did we start?
01:22:29.840 | You're agreeing with J-Cal? Is that what you're saying?
01:22:31.840 | No, I'm saying it's not bullshit.
01:22:32.840 | What you said is exactly how it's being done.
01:22:34.840 | Right.
01:22:35.840 | And J-Cal, let's say that the AI is using a hundred different websites
01:22:39.840 | and synthesizing a hundred websites.
01:22:41.840 | What's the incentive for the marginal hundredth website to say,
01:22:44.840 | well, opt me out unless you pay me.
01:22:46.840 | Right.
01:22:47.840 | So that's the shakedown.
01:22:48.840 | Google or OpenAI will just be like, okay, fine.
01:22:51.840 | We'll just work with the other 99.
01:22:53.840 | And this is why content providers, this is my best piece of advice.
01:22:56.840 | You ask the question, I'll give you the answer.
01:22:57.840 | Content providers as a group need to get together and fight for their rights.
01:23:01.840 | You are not.
01:23:02.840 | New York Times, Medicare.
01:23:03.840 | Fight for the right to party?
01:23:04.840 | No, fight for the right to get paid and to survive.
01:23:07.840 | I'll tell you what I think is going to happen.
01:23:08.840 | They need to go to ChatGPT and say, as a group,
01:23:10.840 | either give us these terms or don't index us.
01:23:13.840 | You're trying to unionize all the content creators.
01:23:15.840 | Yes, absolutely unionize all of them.
01:23:17.840 | They should be a united front like the music industry.
01:23:19.840 | Why do you think the music industry gets paid by Peloton and anybody else?
01:23:22.840 | Because it was easy for them to get together and they fight.
01:23:25.840 | There's five of them and you can organize,
01:23:26.840 | but there's millions and millions of publishers.
01:23:28.840 | I think the point here is that technology is fundamentally deflationary.
01:23:32.840 | Here's the next great example where the minute you make something incredible,
01:23:37.840 | costs go down, but also frankly,
01:23:39.840 | revenue and profit dollars go down in the aggregate.
01:23:42.840 | It doesn't mean that one company can't husband a lot of it
01:23:45.840 | and do incredibly well like Google has done,
01:23:47.840 | but it's just going to fundamentally put pressure on all these business models.
01:23:50.840 | I think it's important for Google to take...
01:23:53.840 | Go ahead, go ahead, Freeberg.
01:23:54.840 | Google should go and they should cannibalize their own business
01:23:58.840 | before it is cannibalized for them.
01:24:00.840 | Freeberg, final word.
01:24:01.840 | Here's another way to think about it.
01:24:02.840 | I think that if this goes as we all predict and everyone's saying it's going to go,
01:24:07.840 | it is more likely than not that many of these "content publishers"
01:24:10.840 | that aren't adding very much marginal value are going to go away.
01:24:13.840 | That you could see the number of content sites offering self-help advice
01:24:16.840 | and how to do this and how to do that.
01:24:18.840 | 95% of them go away because all of that work gets aggregated
01:24:22.840 | and synthesized and presented in a really simple, easy user interface
01:24:26.840 | that makes them completely obsolete.
01:24:28.840 | I'm not discrediting the value that many content publishers provide,
01:24:31.840 | but the requisite at that point to be valued as a "novel content producer"
01:24:38.840 | is going to go way up.
01:24:39.840 | The offset to that though is it's so much easier to create content
01:24:43.840 | because of the AI.
01:24:44.840 | Totally true.
01:24:45.840 | We have this company, Copy AI,
01:24:47.840 | where even before this ChatGPT stuff,
01:24:49.840 | you would just go there and say,
01:24:50.840 | "I want to write a blog about X, Y, and Z."
01:24:53.840 | You just give it a title and it spits out a post.
01:24:56.840 | They'll actually give you 10 different blog posts
01:24:59.840 | and then you just select the one that is the direction you want to go
01:25:02.840 | and you keep doing human selection on it.
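That "generate ten, pick one, keep iterating" loop is a simple pattern to sketch. The `draft` function below is a hypothetical stand-in for a real model call rather than Copy AI's actual API; the point is the human-selection step at the end.

```python
import random

def draft(title: str, seed: int) -> str:
    # Hypothetical generator: a real tool would call a language model here.
    rng = random.Random(seed)
    angle = rng.choice(["how-to guide", "listicle", "case study", "opinion piece"])
    return f"({angle}) {title}: ..."

def generate_candidates(title: str, k: int = 10) -> list[str]:
    # Produce k different drafts from k different seeds.
    return [draft(title, seed) for seed in range(k)]

candidates = generate_candidates("Choosing the right CRM", k=10)
for i, text in enumerate(candidates):
    print(i, text)

chosen = candidates[2]  # the human-selection step: pick a direction, then iterate
```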
01:25:04.840 | By the way, that's a--
01:25:05.840 | But how does new intelligence get put back into the system?
01:25:08.840 | That's based on an existing corpus.
01:25:10.840 | No, you have a corporate blog, so you publish it to your corporate blog.
01:25:13.840 | My point is, if there is now some new information in the world,
01:25:16.840 | who is going to add that to the corpus
01:25:18.840 | if everybody is just stealing content and rewriting it?
01:25:21.840 | You have to have people--
01:25:22.840 | No, it is human.
01:25:23.840 | Humans have always had a desire to create.
01:25:25.840 | Most people create for free.
01:25:26.840 | There's a head of the long tail that actually gets compensated.
01:25:29.840 | The rest of the long tail has traditionally gotten nothing
01:25:31.840 | and they do it because they want to create.
01:25:33.840 | It's kind of like saying--
01:25:34.840 | And now the creation is going to explode because it's so easy.
01:25:37.840 | Beethoven listened to Haydn
01:25:38.840 | and then Beethoven wrote novel symphonies
01:25:40.840 | and his symphonies were incredible
01:25:42.840 | and he built on the experience of listening to Haydn.
01:25:44.840 | The same is true of how content is going to evolve
01:25:47.840 | and it's going to evolve in a faster way because of AI.
01:25:50.840 | This content is not just being retrieved and reproduced,
01:25:53.840 | it's being synthesized and aggregated and represented in a novel way.
01:25:57.840 | By the way, I'd like to say--
01:25:58.840 | Hold on, I want to answer to that.
01:25:59.840 | I want to answer that, please.
01:26:00.840 | If ChatGPT takes a Yelp review
01:26:03.840 | and a Condé Nast Traveler review
01:26:06.840 | and they represent it based on the best content that's out there
01:26:10.840 | that they've already ranked
01:26:11.840 | because they have that algorithm with PageRank
01:26:13.840 | or Bing's ranking engine
01:26:14.840 | and then they republish it
01:26:16.840 | and then that jeopardizes those businesses.
01:26:18.840 | That is profoundly unfair and not what we want for society
01:26:21.840 | and they are interfering with their ability
01:26:23.840 | to leverage their own content.
01:26:25.840 | It is profoundly unfair
01:26:26.840 | and those magazines and newspapers need to--
01:26:29.840 | You're going to get steamrolled.
01:26:30.840 | What's that?
01:26:31.840 | You're going to get steamrolled.
01:26:32.840 | It's possible.
01:26:33.840 | YouTube is a great example.
01:26:34.840 | YouTube was going to get shut down.
01:26:36.840 | Sequoia and the YouTube founders sold it to Google
01:26:39.840 | because they were so scared of the Viacom lawsuit
01:26:41.840 | and how well it was working against them.
01:26:44.840 | They thought this business will never fly
01:26:47.840 | if we don't have a big partner like Google
01:26:50.840 | to support the lawsuit.
01:26:52.840 | They won the lawsuit or they settled it
01:26:54.840 | because they were able to do Content ID
01:26:56.840 | and allow content creators.
01:26:58.840 | The only reason YouTube exists is--
01:27:00.840 | Hold on, let me finish.
01:27:01.840 | It's because they let content creators watermark
01:27:04.840 | and find their stolen content and then claim it
01:27:07.840 | and when they claim the stolen content,
01:27:09.840 | they were able to monetize it.
01:27:10.840 | That's what's going to happen here.
01:27:11.840 | There'll be a settlement where they are going to be able
01:27:14.840 | to claim their content.
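The Content ID mechanism being invoked here reduces to a fingerprint-and-claim flow. This toy version uses an exact hash, whereas the real system uses perceptual audio and video fingerprints that survive re-encoding and edits, so treat it purely as a sketch of the claim-then-monetize idea.

```python
import hashlib

claims = {}  # fingerprint -> rights holder

def fingerprint(content: bytes) -> str:
    # Toy fingerprint: an exact hash. Real systems match fuzzily.
    return hashlib.sha256(content).hexdigest()

def register_claim(content: bytes, owner: str):
    claims[fingerprint(content)] = owner

def on_upload(content: bytes) -> str:
    fp = fingerprint(content)
    if fp in claims:
        # Matched a claim: route ad revenue to the rights holder instead of blocking.
        return f"monetize for {claims[fp]}"
    return "publish unclaimed"

register_claim(b"studio show episode", "Viacom")
print(on_upload(b"studio show episode"))    # monetize for Viacom
print(on_upload(b"original creator video"))  # publish unclaimed
```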
01:27:16.840 | I will bet any amount against your premonition here, J. Cal.
01:27:20.840 | This is like the opposite of Nostradamus.
01:27:22.840 | Propose a bet.
01:27:24.840 | You've seen these AIs that generate images, right?
01:27:27.840 | Like Stable Diffusion and DALL-E or whatever.
01:27:29.840 | You literally just tell it,
01:27:30.840 | "I want this image in this style,"
01:27:33.840 | and boom, it's done
01:27:34.840 | and it would take an artist weeks to produce that
01:27:37.840 | and you can do it in five seconds
01:27:39.840 | and you could tell the AI, "Give me 20 of those,"
01:27:41.840 | and then you just keep iterating
01:27:43.840 | and in five minutes, you've got something mind-blowing.
01:27:45.840 | So the fact that it's so much easier to create content,
01:27:48.840 | you can do the same thing with the written word.
01:27:50.840 | The people who need to be compensated, J. Cal,
01:27:52.840 | if they don't get what they want, they may just go away,
01:27:53.840 | but there'll be 10 times or 100 times more people
01:27:56.840 | waiting in line to replace them.
01:27:57.840 | Here's how wrong you are.
01:27:58.840 | Thank you for bringing up this example
01:27:59.840 | so that I can prove how wrong you are.
01:28:01.840 | Getty Images is suing Stable Diffusion at the moment.
01:28:04.840 | Here is what the dipshits at Stable Diffusion did.
01:28:07.840 | They trained their AI on Getty Images
01:28:10.840 | with the watermarks on them and they've been busted
01:28:12.840 | and they are dead to rights now
01:28:14.840 | and they are going to pay $100 million or more
01:28:17.840 | to Getty Images for stealing their content
01:28:20.840 | and allowing it to be republished in a commercial setting.
01:28:23.840 | Those images don't look too good to me.
01:28:25.840 | (laughs)
01:28:26.840 | Over time, this will get resolved, J. Cal.
01:28:28.840 | Stable Diffusion copied the Getty Images watermark
01:28:32.840 | and put it on a reproduced work.
01:28:34.840 | Okay, I think Stable Diffusion's bigger problem
01:28:37.840 | is they can't do noses and ears and eyelids.
01:28:40.840 | (laughs)
01:28:42.840 | That looks like a bigger problem to me.
01:28:43.840 | Anyway, shout out to Stable Diffusion
01:28:45.840 | for stealing Getty Images content.
01:28:47.840 | So funny.
01:28:48.840 | By the way, Mozart influenced by Haydn, not Beethoven.
01:28:51.840 | Sorry.
01:28:52.840 | By the way, there's a really interesting topic about AI
01:28:55.840 | that we don't have time to get to this week,
01:28:57.840 | but I think we should put it on the docket for next week,
01:28:59.840 | which is should AIs be trained to lie?
01:29:02.840 | Super important.
01:29:03.840 | Great question.
01:29:04.840 | Because that's happening right now.
01:29:05.840 | And the last thing is--
01:29:06.840 | Or have opinions.
01:29:08.840 | The last thing I'll say on this, from my perspective,
01:29:10.840 | maybe we can jump on after this,
01:29:12.840 | is this is the best thing that could happen
01:29:14.840 | for all of the monopolists in technology
01:29:17.840 | because Microsoft taking 500 or 600 basis points of share
01:29:21.840 | is the best way to ensure that the FTC
01:29:23.840 | has zero credibility in going after Google
01:29:25.840 | or anybody else in tech.
01:29:26.840 | Right.
01:29:27.840 | All of those things, I think, are DOA.
01:29:29.840 | So in some ways, actually, Google leaking
01:29:31.840 | 5% or 6% of the market share is a really good thing
01:29:34.840 | because the FTC is rendered toothless
01:29:36.840 | in making any claim that this is--
01:29:38.840 | I assume they understand that.
01:29:39.840 | That's such a good point.
01:29:40.840 | I mean, it's kind of a good news/bad news scenario
01:29:43.840 | with this whole thing.
01:29:44.840 | The good news is that the Google monopoly
01:29:45.840 | has finally been cracked.
01:29:46.840 | The bad news is that it's Microsoft
01:29:48.840 | and even a bigger monopoly that's the one that's done it.
01:29:51.840 | But it just shows how vulnerable
01:29:53.840 | all these big tech companies are.
01:29:55.840 | Yeah, TBD.
01:29:56.840 | And they may all end up competing with each other.
01:29:58.840 | What's great is everyone's got a tactical nuclear weapon.
01:30:00.840 | We've got a tactical nuclear weapon now,
01:30:01.840 | and we don't know where it's going to get pointed
01:30:03.840 | and who's going to set it off and where.
01:30:05.840 | The weaponry has completely changed.
01:30:07.840 | Yeah, the weaponry totally changed.
01:30:09.840 | And to prove how wrong you guys are,
01:30:11.840 | here is The Verge on the Copilot--
01:30:13.840 | How many times have you said we're wrong?
01:30:14.840 | Here's the other lawsuit.
01:30:16.840 | Open source writers--
01:30:18.840 | Nick, are you feeding him this nonsense?
01:30:19.840 | Is that what's going on?
01:30:20.840 | No, I've been tracking this.
01:30:21.840 | You people haven't.
01:30:23.840 | You guys need to watch what's happening right now.
01:30:25.840 | Copilot, GitHub, ChatGPT, and Microsoft
01:30:29.840 | are being sued by developers
01:30:31.840 | because Copilot was built off of stolen content.
01:30:34.840 | These lawsuits are just beginning,
01:30:36.840 | and it's going to result in licensing fees.
01:30:40.840 | This will be a transitory effect,
01:30:42.840 | and it won't change the dynamics
01:30:43.840 | of where this is going over the long term.
01:30:44.840 | It changed the dynamics of YouTube in the long term,
01:30:46.840 | so let's keep going.
01:30:47.840 | I don't know. They're doing pretty good.
01:30:49.840 | We have a portfolio company called Sourcegraph,
01:30:50.840 | which is building a code-writing AI--
01:30:52.840 | Why don't you just put your logo page up
01:30:53.840 | if you're going to go through the whole portfolio?
01:30:55.840 | No, they're building something similar.
01:30:57.840 | But it's opt-in.
01:30:59.840 | You just get all your customers to opt into it.
01:31:02.840 | You might just get paid to play.
01:31:04.840 | Let's move on to State of the Union.
01:31:05.840 | Why do you want to move on?
01:31:06.840 | You want to do the crackdown?
01:31:07.840 | I think Zach's point about AI--
01:31:09.840 | State of the Union?
01:31:10.840 | --would be a great chat.
01:31:11.840 | Let's talk about it next week, though.
01:31:12.840 | Yeah, I think it deserves more time.
01:31:13.840 | The State of the Union,
01:31:14.840 | I don't know about you guys,
01:31:15.840 | but I found it one of the more profoundly
01:31:18.840 | disappointing, saddening states of the union
01:31:20.840 | I've ever seen.
01:31:21.840 | Why? Unpack that.
01:31:23.840 | I think it was--
01:31:24.840 | We often kind of focus on the one-year cycle
01:31:26.840 | of what the State of the Union says,
01:31:28.840 | but I think what's more important
01:31:29.840 | is how much the data that's coming through
01:31:31.840 | in the State of the Union
01:31:32.840 | supports the more scary long-term cycle.
01:31:35.840 | I've talked about this a lot,
01:31:36.840 | on how scared I am about kind of where we're headed
01:31:40.840 | with respect to the US's ability
01:31:43.840 | to fund its financial obligations.
01:31:46.840 | And the scary moment at the State of the Union,
01:31:49.840 | besides Biden's inability to kind of
01:31:51.840 | articulate much very well,
01:31:52.840 | which was honestly a really discouraging sight to see,
01:31:57.840 | was, you know, when he talked about
01:31:59.840 | how the Republicans are trying to cut
01:32:01.840 | Social Security and Medicare,
01:32:03.840 | the US Treasury put out a projection,
01:32:05.840 | which I tweeted last week,
01:32:08.840 | originally shared on Twitter by Lynn Alden.
01:32:10.840 | This is the US Treasury's forecast
01:32:13.840 | of debt held by the United States over time.
01:32:15.840 | And the assumptions in this forecast are
01:32:17.840 | we've got a certain amount of debt today,
01:32:19.840 | and we're running Social Security
01:32:21.840 | and Medicare forward without cuts.
01:32:23.840 | And so what happens as we make
01:32:25.840 | these Social Security and Medicare payments,
01:32:27.840 | and we accrue and pay interest
01:32:29.840 | on the debt that we hold today,
01:32:31.840 | and we don't change the tax rates in this country,
01:32:34.840 | and this is what happens.
01:32:35.840 | So it's a runaway kind of debt scenario,
01:32:38.840 | and the US, by definition,
01:32:39.840 | has to default at some point,
01:32:41.840 | because you cannot tax every dollar
01:32:43.840 | of the economy at 100%.
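The shape of the Treasury chart being described falls out of a back-of-the-envelope recursion: debt grows by spending plus interest minus revenue, while GDP grows more slowly. The parameters below are illustrative guesses chosen to show the shape, not the Treasury's actual model inputs.

```python
# Toy debt projection, illustrative numbers only: a fixed tax share, a spending
# share that drifts up as entitlements balloon, and interest compounding on the
# stock of debt. Debt-to-GDP diverges.
debt, gdp = 31.0, 26.0      # trillions, rough 2023 starting points
revenue_share = 0.19        # taxes as a share of GDP, held fixed
spending_share = 0.24       # outlays as a share of GDP
interest_rate = 0.035
gdp_growth = 0.02
entitlement_drift = 0.001   # aging pushes the spending share up each year

for year in range(2024, 2054):
    deficit = spending_share * gdp + interest_rate * debt - revenue_share * gdp
    debt += deficit
    gdp *= 1 + gdp_growth
    spending_share += entitlement_drift
    if year % 10 == 3:
        print(year, f"debt/GDP = {debt / gdp:.0%}")
```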
01:32:46.840 | And so, you know, there are two ways this can go.
01:32:49.840 | The first way is you have to cut back
01:32:52.840 | on these major kind of, you know,
01:32:54.840 | expense commitments that naturally balloon over time,
01:32:57.840 | and that is Social Security and Medicare.
01:32:59.840 | And the other one is that you just tax a lot more,
01:33:02.840 | and when you tax a lot more,
01:33:03.840 | economic growth gets affected,
01:33:05.840 | and it makes it really hard to eventually pay off that debt,
01:33:07.840 | and the debt continues to spiral.
01:33:09.840 | So I think what we saw was, number one,
01:33:14.840 | the announcement by Biden,
01:33:15.840 | "Hey, Republicans are the ones
01:33:17.840 | who want to cut Social Security and Medicare,"
01:33:18.840 | and they all screamed, and they said,
01:33:19.840 | "No way, no way. We'll never do that."
01:33:21.840 | And a lot of them did interviews afterwards
01:33:23.840 | and said it's total BS that Biden would say that,
01:33:25.840 | which I think supports what the polls have shown,
01:33:27.840 | which is on both sides of the aisle,
01:33:30.840 | people do not want to see Social Security
01:33:31.840 | and Medicare cut in any way right now.
01:33:33.840 | That means you can't--
01:33:34.840 | and you guys saw what happened in France,
01:33:36.840 | where they pushed back the retirement age by two years,
01:33:38.840 | and there was effectively riots across the country.
01:33:40.840 | I don't know if you guys saw this a few weeks ago.
01:33:42.840 | We didn't talk about it,
01:33:43.840 | but it was pretty brutal, pretty ugly.
01:33:45.840 | And so this is a real cost that's coming to bear.
01:33:48.840 | It's coming to bear in the United States,
01:33:49.840 | not just with the publicly funded
01:33:51.840 | Social Security and Medicare programs,
01:33:52.840 | but also with a lot of the private pensions
01:33:54.840 | that are going to need to get bailed out
01:33:55.840 | with the same federal money,
01:33:56.840 | because they're not going to let those things go bankrupt,
01:33:58.840 | and that's another trillion plus of liabilities.
01:34:00.840 | So, you know, that cost is going to balloon,
01:34:03.840 | and the only solution at that point
01:34:05.840 | is to introduce massive tax hikes.
01:34:07.840 | And so they propose this billionaire tax,
01:34:09.840 | this tax on unrealized capital gains.
01:34:11.840 | It is literally, if you keep Social Security and Medicare
01:34:13.840 | where they are, and you don't pay down the debt,
01:34:15.840 | and you don't grow the economy fast enough,
01:34:17.840 | you have to introduce significant tax hikes
01:34:19.840 | across corporate and the individual taxpayer base.
01:34:24.840 | And so, you know, it really, again, if you zoom out,
01:34:27.840 | it really indicates this steepening curve
01:34:29.840 | that the US has to climb its way out of.
01:34:31.840 | And as you tax more,
01:34:33.840 | there's less to invest in the economic growth.
01:34:35.840 | The government is a far worse investor
01:34:37.840 | in economic growth than the free market,
01:34:39.840 | and that means that we can't grow our way out
01:34:41.840 | and grow GDP enough to ultimately cover
01:34:43.840 | our debt obligations.
01:34:45.840 | And this is what Dalio's book that I mentioned in 2021
01:34:48.840 | was so kind of importantly sharing.
01:34:51.840 | This is a multi-hundred year cycle,
01:34:53.840 | and the last couple decades get really nasty.
01:34:56.840 | And this chart, which is a forecast
01:34:58.840 | from the actual US Treasury, highlights the problem.
01:35:01.840 | And the comments made in front of the Congress
01:35:03.840 | this week by the President of the United States
01:35:05.840 | indicates how serious of a problem this is going to be
01:35:07.840 | because no one wants to cut these major cost obligations.
01:35:10.840 | that we have coming due.
01:35:12.840 | And so...
01:35:13.840 | What are you talking about?
01:35:14.840 | Sax, aren't you and the Republicans,
01:35:16.840 | you want to cut Medicare and get rid of it?
01:35:18.840 | Is that what Biden said?
01:35:19.840 | You guys want to get rid of it?
01:35:20.840 | Okay, what was that kerfuffle about?
01:35:22.840 | With your, who's the person on your squad
01:35:24.840 | who was yelling and screaming out
01:35:25.840 | at the President of the United States?
01:35:27.840 | Oh, that doesn't matter.
01:35:28.840 | That was just sort of silliness.
01:35:30.840 | But who was screaming at him?
01:35:31.840 | Who's that person?
01:35:32.840 | Let me tell you what happened at the State of the Union,
01:35:34.840 | is that Biden was basically trying to take a page
01:35:37.840 | out of Bill Clinton's playbook.
01:35:39.840 | When Bill Clinton lost the midterms in '94,
01:35:42.840 | he basically triangulated to the center
01:35:44.840 | and he did two things.
01:35:45.840 | He started going for kind of small ball.
01:35:48.840 | He started playing small ball politics.
01:35:50.840 | It was like school uniforms and things like that
01:35:53.840 | that were relatively unobjectionable
01:35:55.840 | and that regular middle-class people could get behind.
01:35:58.840 | And then he basically posed as the defender
01:36:00.840 | of entitlement programs.
01:36:02.840 | Back then, in '96, he ran against Dole
01:36:04.840 | by going all the way back
01:36:06.840 | to Dole's vote against Medicare.
01:36:08.840 | This is what the Biden team is teeing up
01:36:11.840 | for the reelect in '24,
01:36:14.840 | is they're talking about things like
01:36:16.840 | curbing ticketmaster fees
01:36:18.840 | and fixing right-turn red lights.
01:36:21.840 | I mean, seriously, like total small ball.
01:36:24.840 | They're going to try and pretend
01:36:26.840 | like he wasn't the most radical
01:36:28.840 | tax-and-spend progressive
01:36:29.840 | over the last two years that we've really ever had
01:36:31.840 | in American history.
01:36:32.840 | They're going to try and make everyone forget that
01:36:37.840 | and just talk about the small, unobjectionable stuff.
01:36:37.840 | And then he's also going to, again,
01:36:39.840 | pose as the stalwart defender of entitlement programs,
01:36:41.840 | which are very popular.
01:36:43.840 | And partly they're doing this.
01:36:45.840 | They're already, I think, getting ready for
01:36:47.840 | DeSantis on this, because if you read
01:36:50.840 | some of the political analysis on this,
01:36:52.840 | and Josh Berman had a good column,
01:36:54.840 | and Andrew Sullivan had a good column
01:36:56.840 | talking about this, that way back when
01:36:58.840 | DeSantis was in Congress
01:37:00.840 | and he was like a backbencher,
01:37:02.840 | he voted for some Republican budgets,
01:37:04.840 | a Paul Ryan budget,
01:37:06.840 | that had some of this entitlement reform in it.
01:37:08.840 | So they're going to try and portray him
01:37:10.840 | as against entitlement reform.
01:37:11.840 | Now, I don't think it's going to work,
01:37:13.840 | because all you have to say is,
01:37:14.840 | "Listen, that was a long time ago.
01:37:16.840 | I wasn't voting for cutting entitlements.
01:37:18.840 | I just voted for my party's budget."
01:37:20.840 | That's irrelevant.
01:37:22.840 | I can tell you I will not cut entitlements.
01:37:24.840 | So any smart Republican is going to take
01:37:26.840 | entitlement reform off the table,
01:37:28.840 | because it is a total third rail,
01:37:30.840 | and they will lose.
01:37:32.840 | And I think Trump had the right instincts on this,
01:37:34.840 | and I'm sure that any major Republican
01:37:36.840 | will have the right instincts on this.
01:37:38.840 | You see the way they're throwing Rick Scott
01:37:40.840 | under the bus.
01:37:41.840 | Rick Scott had this proposal
01:37:43.840 | about having entitlements
01:37:48.840 | go from being permanent
01:37:50.840 | to being something that
01:37:52.840 | gets voted on every year,
01:37:54.840 | and the rest of the Republican caucus is like,
01:37:56.840 | "Hell no, we're not touching that."
01:37:58.840 | And they can't run away fast enough from Rick Scott.
01:38:00.840 | So, Freeberg is right.
01:38:02.840 | No appetite for entitlement reform,
01:38:04.840 | and I would tell any Republican,
01:38:06.840 | if you want to do entitlement reform,
01:38:08.840 | it's got to be bipartisan.
01:38:10.840 | You've got to do what Ronald Reagan did,
01:38:12.840 | which is join hands with Tip O'Neill
01:38:14.840 | and Moynihan,
01:38:16.840 | and you jump off the cliff together.
01:38:18.840 | Do not stick your neck out on this.
01:38:20.840 | - Chamath, any thoughts on the State of the Union writ large?
01:38:22.840 | - No.
01:38:26.840 | - Marjorie Taylor Greene yelling liar at the President?
01:38:28.840 | - Um, no.
01:38:30.840 | - Okay, well that's the most boring
01:38:32.840 | segment of the show.
01:38:34.840 | - Both sides engaged in a lot of
01:38:36.840 | weirdness.
01:38:38.840 | Biden was bellowing
01:38:40.840 | at various points in his speech. It was quite
01:38:42.840 | bizarre. And you're right, there were some Republicans
01:38:44.840 | in the audience who were braying
01:38:46.840 | like jackasses, and it's unfortunate
01:38:48.840 | because I think-- - Utterly, utterly
01:38:50.840 | disappointed. - I think if the Republicans had just calmed down,
01:38:52.840 | I think Biden's
01:38:54.840 | sort of weird
01:38:56.840 | mannerisms where he was practically yelling,
01:38:58.840 | it was like Abe Simpson,
01:39:00.840 | you know, old man yelling at the cloud.
01:39:02.840 | - Yeah, they do a good job of like,
01:39:04.840 | right at the moment of self-immolation,
01:39:06.840 | they let him off the hook. - Totally.
01:39:08.840 | - It's really incredible. Republicans just, they have
01:39:10.840 | no impulse control.
01:39:12.840 | Right when they could just be quiet, sit
01:39:14.840 | there calm, quiet,
01:39:16.840 | and let Biden do the damage to himself,
01:39:18.840 | they just cannot.
01:39:20.840 | - I like McCarthy telling Marjorie
01:39:22.840 | Taylor Greene, "Shh, shh!"
01:39:24.840 | And Mitt Romney telling that other liar guy to get
01:39:26.840 | the hell out of here. - Listen, it's always
01:39:28.840 | been hard to control backbenchers.
01:39:30.840 | That's just the reality. That does not speak for the
01:39:32.840 | entire party.
01:39:34.840 | - How do you guys feel about
01:39:36.840 | being taxed on
01:39:38.840 | your change in net worth from year to year?
01:39:40.840 | - You mean a wealth tax?
01:39:42.840 | - Wealth tax, yeah. - If it's for
01:39:44.840 | billionaires, totally cool. If it's for centimillionaires
01:39:46.840 | or decamillionaires, absolutely
01:39:48.840 | off the table. - Tier cutoff, huh?
01:39:50.840 | - We want those people to grow their wealth and invest.
01:39:52.840 | The billionaires, yeah, sure. - Is that what you think
01:39:54.840 | is coming, Freeberg? - Yeah.
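As a worked example of what "taxed on your change in net worth" means with a billionaire-only tier cutoff of the kind being described: the 2% rate and $1B threshold below are hypothetical, since actual proposals vary widely.

```python
def wealth_tax(net_worth_start: float, net_worth_end: float,
               threshold: float = 1e9, rate: float = 0.02) -> float:
    # Tax the year-over-year gain in net worth, but only above the threshold.
    # Hypothetical parameters for illustration.
    if net_worth_end < threshold:
        return 0.0
    gain = max(net_worth_end - net_worth_start, 0.0)
    return rate * gain

# A $500M paper gain on an unsold position owes $10M under this scheme,
# which is the "unrealized gains" objection in a nutshell.
print(wealth_tax(2.0e9, 2.5e9))  # 10000000.0
print(wealth_tax(50e6, 80e6))    # 0.0 (decamillionaire, below the line)
```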
01:39:56.840 | - So hard to execute on. What are you
01:39:58.840 | supposed to do? - I would love to see a website
01:40:00.840 | where you have a piece of yarn. You know those
01:40:02.840 | sites where you have a little
01:40:04.840 | pin and you kind of push the pin out and the other one
01:40:06.840 | has to go in. You can't have it all.
01:40:08.840 | You can't have low taxes
01:40:10.840 | and have these entitlement programs and
01:40:12.840 | have this level of spending. It's impossible.
01:40:14.840 | You have to tax.
01:40:16.840 | That's the only way.
01:40:18.840 | Or you have to cut the entitlement programs, or you have
01:40:20.840 | to cut the spending. - Well, you're
01:40:22.840 | right about that, Freeberg, but let me tell you why the American
01:40:24.840 | people think
01:40:26.840 | it would be ridiculous to cut their entitlements.
01:40:28.840 | They've watched as
01:40:30.840 | Washington spent $8 trillion
01:40:32.840 | on forever wars in the Middle
01:40:34.840 | East. They just watched as
01:40:36.840 | Biden spent trillions enriching
01:40:38.840 | the pharma companies on a fugazi
01:40:40.840 | that didn't work.
01:40:42.840 | They've just watched as hundreds of billions
01:40:44.840 | went to climate special
01:40:46.840 | interests of the Democratic Party. - Did you just say the vaccine
01:40:48.840 | was a fugazi?
01:40:50.840 | - We talked about that. Let me bring it back up.
01:40:52.840 | - No, don't say it.
01:40:54.840 | - They've watched as trillions...
01:40:56.840 | Hold on. They've watched as trillions of dollars
01:40:58.840 | have gone to the donor class and
01:41:00.840 | they are going to rise up
01:41:02.840 | and say to hell with you if
01:41:04.840 | you cut off our social security that
01:41:06.840 | we paid into for decades while
01:41:08.840 | enriching all these special interests.
01:41:10.840 | - I don't advocate for any of these
01:41:12.840 | points. - And the bailouts of corporations
01:41:14.840 | in the great financial crisis. You're right.
01:41:16.840 | - I think it's just an analytical
01:41:18.840 | certainty that
01:41:20.840 | you've got to zoom out and stop thinking
01:41:22.840 | about the yearly cycle and the election cycle on this
01:41:24.840 | stuff and just look at where we're headed over
01:41:26.840 | a multi-decade cycle.
01:41:28.840 | And there's just no resolution to this problem
01:41:30.840 | based on the way we're all oriented right now.
01:41:32.840 | - If there is a wealth tax, let's just say on
01:41:34.840 | billionaires or billionaires... - Let me just say one thing.
01:41:36.840 | Sorry, Jacob. - What would they do? Would they leave the country?
01:41:38.840 | Go to Puerto Rico? - No.
01:41:40.840 | That did happen in France, by the way. I think 40%
01:41:42.840 | of the people that were taxed left
01:41:44.840 | and then they came back. - It would violate
01:41:46.840 | our constitution.
01:41:48.840 | - That'll be
01:41:50.840 | litigated for sure. - It'll be a Supreme Court, yeah.
01:41:52.840 | - If you can get the cost of energy
01:41:54.840 | in this country to drop
01:41:56.840 | by 50% to 75%
01:41:58.840 | and you can increase energy
01:42:00.840 | capacity by
01:42:02.840 | 10 to 20 fold, then you
01:42:04.840 | have a fighting chance because you can actually grow the
01:42:06.840 | economy out of the problem. And that's
01:42:08.840 | really where I am optimistic and excited
01:42:10.840 | about opportunities like Fusion and you guys can make
01:42:12.840 | fun of me all you want. But if we can get to a
01:42:14.840 | point where we can increment energy capacity
01:42:16.840 | by an order of magnitude,
01:42:18.840 | there is economic growth that will arise
01:42:20.840 | from that, from all these new industries and these
01:42:22.840 | production systems. And that's how we can grow
01:42:24.840 | our way out of the debt,
01:42:26.840 | entitlement, tax problem where one of those three
01:42:28.840 | things has to give in the absence of that.
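There is standard arithmetic behind "grow our way out": debt-to-GDP stabilizes only when the primary balance (revenue minus non-interest spending) offsets the gap between the interest rate on the debt and the growth rate of the economy. A quick illustration with made-up but plausible numbers:

```python
# Debt/GDP holds flat when primary_balance = (r - g) / (1 + g) * debt_ratio.
# Positive means a surplus is required; negative means you can run a deficit
# and still keep the ratio from climbing. Illustrative numbers only.
debt_ratio = 1.2   # debt at roughly 120% of GDP
r = 0.035          # average interest rate on the debt
for g in (0.02, 0.04, 0.06):
    required = (r - g) / (1 + g) * debt_ratio
    print(f"growth {g:.0%}: primary balance needed = {required:+.2%} of GDP")
```

At 2% growth a primary surplus is required; at the 4% to 6% growth an energy boom might buy, modest deficits are sustainable without the ratio climbing.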
01:42:30.840 | - You know what the cheapest source of energy is?
01:42:32.840 | You know what the cheapest source of energy is today?
01:42:34.840 | Fusion. It's 3 cents a kilowatt hour.
01:42:38.840 | - Fission, you mean?
01:42:40.840 | - No, it's Fusion.
01:42:42.840 | - No, it's Fusion.
01:42:44.840 | It's the sun
01:42:46.840 | using solar panels. - There it is.
01:42:48.840 | - Yeah, the problem, Chamath, honestly, let me just
01:42:50.840 | say this one thing. The problem is, can you
01:42:52.840 | scale energy capacity by 10x?
01:42:54.840 | - Yes. - And can you do it fast enough? And that's
01:42:56.840 | the real technical... - While
01:42:58.840 | creating jobs. - That's the real techno-economic
01:43:00.840 | question, right? - I posted a link
01:43:02.840 | in my reading list today, this week, you guys can go
01:43:04.840 | look at it. The most prolific distribution
01:43:06.840 | of Fusion technology
01:43:08.840 | is China actually deploying
01:43:10.840 | solar on every single rooftop in China.
01:43:12.840 | The United States could do it too,
01:43:14.840 | and you will 10x the power
01:43:16.840 | available. - Well, yeah. I mean, they
01:43:18.840 | are... - And it'll be zero.
01:43:20.840 | - They're increasing their... - It'll happen
01:43:22.840 | in the next few years. It's just sometimes
01:43:24.840 | we want to create intellectual complexity.
01:43:26.840 | I love these different
01:43:28.840 | forms of Fusion. I just think it's a 50
01:43:30.840 | year trudge to get it.
01:43:32.840 | - I'm not betting on it.
01:43:34.840 | I'm just saying there is a... - No, no.
01:43:36.840 | I'm just agreeing with you. I'm just building on your point to say
01:43:38.840 | it's actually happening. Fusion is
01:43:40.840 | what is actually creating abundant
01:43:42.840 | zero-cost energy today.
01:43:44.840 | - Yeah. And so look, if we can increase
01:43:46.840 | energy capacity in this country by 10x,
01:43:48.840 | energy production capacity by 10x,
01:43:50.840 | and we can do it in the next 20 to 30 years,
01:43:52.840 | we have a path... - You can do it in 10 by putting
01:43:54.840 | solar on every rooftop. - If we can, we have
01:43:56.840 | a path out of the
01:43:58.840 | entitlement tax debt
01:44:00.840 | problem. Otherwise, one of
01:44:02.840 | those three things is going to give, and it's going to be ugly.
01:44:04.840 | - The thing that we are lacking right now is not
01:44:06.840 | actually the generation capability,
01:44:08.840 | which is incredibly cheap,
01:44:10.840 | but it's really scalable storage.
01:44:12.840 | And once we figure that out, which is
01:44:14.840 | actually the real technical bottleneck to
01:44:16.840 | abundant zero-cost energy, we'll have
01:44:18.840 | your boundary condition met, and we'll have it
01:44:20.840 | well before different forms of
01:44:22.840 | Fusion are commercializable.
01:44:24.840 | - France had an exodus
01:44:26.840 | of an estimated 42,000 millionaires
01:44:30.840 | between 2000 and 2012.
01:44:32.840 | And really, before
01:44:34.840 | Macron killed it. - They only had the wealth tax for 12
01:44:36.840 | years, and then they reversed it.
01:44:38.840 | - They were just losing their tax base so violently
01:44:40.840 | they had no choice. - That's what's happening in California.
01:44:42.840 | - That's what's going to happen in California if they move forward with
01:44:44.840 | this tax. - It's happening in California already
01:44:46.840 | without the wealth tax. - San Francisco. - Right.
01:44:48.840 | - It's happening already, and now you look at it... - Well, because they're stupidly
01:44:50.840 | telling people that this wealth tax
01:44:52.840 | is going to have a 10-year look forward.
01:44:54.840 | So, everybody I know is at least
01:44:56.840 | talking about, "Hey, could this happen in the next 10 years?"
01:44:58.840 | Because if it might happen
01:45:00.840 | five years from now, we've got to leave now
01:45:02.840 | because if I wait... - They're going to chase you.
01:45:04.840 | - They're going to chase me. - Yeah.
01:45:06.840 | Specifically you.
01:45:08.840 | They hate you. - No, I mean,
01:45:10.840 | it's a problem.
01:45:12.840 | - It is a serious problem. - But let me tell you, I mean, to
01:45:14.840 | Freeberg's point, it's not just Ray Dalio,
01:45:16.840 | the great political satirist
01:45:18.840 | Pedro O'Rourke, who died last year,
01:45:20.840 | he wrote that American
01:45:22.840 | politics is defined by the formula
01:45:24.840 | "X minus Y equals a big
01:45:26.840 | stink," where X
01:45:28.840 | is what people want from
01:45:30.840 | government, and Y equals
01:45:32.840 | what they're willing to pay for government.
01:45:34.840 | And that difference is basically the big
01:45:36.840 | stink. And the job of politicians is to manage
01:45:38.840 | that stink. And the problem is
01:45:40.840 | the politicians have not been doing a good job
01:45:42.840 | managing it, and they increasingly
01:45:44.840 | do a worse job managing it.
01:45:46.840 | So, yeah, at some point it's going to blow up.
01:45:48.840 | - You guys want to hear a crazy statistic?
01:45:50.840 | I was pulling this one up the other day.
01:45:52.840 | You know what the budget per capita
01:45:54.840 | is of the city of San Francisco?
01:45:56.840 | So, how much the city spends per year divided
01:45:58.840 | by the number of people that live in the city?
01:46:00.840 | - Well, we know it's a couple of billion dollar budget,
01:46:02.840 | we know there's only a few hundred thousand people in there.
01:46:04.840 | - It's a $13 billion budget. - Yeah, and it's 800,000 residents?
01:46:06.840 | - It's $18,000
01:46:08.840 | per citizen per year. That's how much
01:46:10.840 | the city of San Francisco spends.
01:46:12.840 | A third of that money, by the way,
01:46:14.840 | 30% of it, or 25%,
01:46:16.840 | goes to public health care. Now,
01:46:18.840 | when you look at that $18,000
01:46:20.840 | budget per capita, it is
01:46:22.840 | more than every single state
01:46:24.840 | of the union on a per capita basis,
01:46:26.840 | except Oregon and North Dakota, which have very weird
01:46:28.840 | budgets. So,
01:46:30.840 | San Francisco spends more
01:46:32.840 | than every other state per capita.
01:46:34.840 | It spends more in aggregate than
01:46:36.840 | 16 US states. And the
01:46:38.840 | federal budget per capita is
01:46:40.840 | $15,000. The federal government's
01:46:42.840 | budget per capita is $15,000.
01:46:44.840 | So, San Francisco spends more than 15
01:46:46.840 | states and spends more per capita than
01:46:48.840 | every other state except two, and spends more
01:46:50.840 | than the federal government per capita.
01:46:52.840 | And I think that really highlights how much
01:46:54.840 | we need to tax the base to do that.
01:46:56.840 | And now we are seeing San Francisco has
01:46:58.840 | the largest population exodus
01:47:02.840 | and business exodus of any city in the
01:47:04.840 | United States.
01:47:06.840 | That is your predictor of where this goes.
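Checking the arithmetic on those figures, using only the numbers quoted in this discussion plus a rough US population estimate: a $13 billion budget over 800,000 residents is about $16,000 a head, and the $18,000 figure back-solves to roughly 720,000 residents, between the 800,000 count mentioned and the 650,000 some people now cite.

```python
sf_budget = 13e9                 # dollars per year, as quoted
print(sf_budget / 800_000)       # 16250.0 at the 800K resident count
print(sf_budget / 720_000)       # ~18055.6, matching the $18K/citizen figure

fed_per_capita = 15_000          # as quoted
us_population = 333e6            # rough estimate (assumption, not from the discussion)
print(fed_per_capita * us_population / 1e12)  # ~5.0, implied federal budget in $T
```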
01:47:08.840 | Right, and Miami, Florida,
01:47:10.840 | by contrast, has
01:47:12.840 | no state income tax. They just rely on
01:47:14.840 | property tax and sales tax, which
01:47:16.840 | California has as well. The income tax
01:47:18.840 | and cap gains tax
01:47:20.840 | are zero in Florida,
01:47:22.840 | and they seem to make it work.
01:47:24.840 | 880,000 people
01:47:26.840 | was the peak in San Francisco
01:47:28.840 | 2018, 2019, 2020.
01:47:30.840 | And then in 2021, it was down to about 800,000.
01:47:34.840 | So, 10%. And it's gone
01:47:36.840 | even further in 2022. Yeah, some people think it's down
01:47:38.840 | to 650 now. I think it is.
01:47:40.840 | Hard to track, but yeah. Well, you have
01:47:42.840 | a lot of people who owned homes there and maybe owned
01:47:44.840 | second homes, because let's face it, it was a
01:47:46.840 | well-heeled group of individuals living there.
01:47:48.840 | And a lot of them just still have their places,
01:47:50.840 | but they've left. And then they're in the process
01:47:52.840 | of selling their places. The point is
01:47:54.840 | increasing the tax rate is a great
01:47:56.840 | short-term solution, but over
01:47:58.840 | the long term, if that budget
01:48:00.840 | per citizen isn't brought in line,
01:48:02.840 | there is no way to tax the base enough
01:48:04.840 | without causing the tax base to leave.
01:48:06.840 | Forget about what anyone's personal
01:48:08.840 | opinions are. That's just the economic reality
01:48:10.840 | of what happened in France. It's what's happening in San
01:48:12.840 | Francisco. There's a lot of great predictors in
01:48:14.840 | history of where this has happened.
01:48:16.840 | And so something has to give. Or
01:48:18.840 | maybe you have to have a miracle, like an energy
01:48:20.840 | miracle, but we'll all
01:48:22.840 | keep investing for that. All right, everybody,
01:48:24.840 | this has been an amazing, amazing
01:48:26.840 | episode of the All In
01:48:28.840 | Podcast. 115 episodes
01:48:30.840 | for the dictator,
01:48:32.840 | sultan of science,
01:48:34.840 | and for the
01:48:36.840 | pacifist, David "The Dove"
01:48:38.840 | Sacks. I am the
01:48:40.840 | world's greatest moderator, J. Cal, and we'll
01:48:42.840 | see you next time. Bye-bye. I love you, besties.
01:48:44.840 | We'll let your winners ride.
01:48:46.840 | Rain Man,
01:48:48.840 | David Sacks.
01:48:50.840 | And instead,
01:48:52.840 | we open-sourced it to the fans,
01:48:54.840 | and they've just gone crazy with it.
01:48:56.840 | I'm the queen of quinoa.
01:48:58.840 | We'll let your winners ride.
01:49:00.840 | We'll let your winners ride.
01:49:02.840 | We'll let your winners ride.
01:49:04.840 | Besties are gone.
01:49:06.840 | That's my dog
01:49:08.840 | taking a notice in your driveway.
01:49:10.840 | Winner to me.
01:49:12.840 | Oh, man.
01:49:14.840 | My avatar will meet me at place.
01:49:16.840 | We should all just get a room and just have one big huge
01:49:18.840 | orgy because they're all just useless.
01:49:20.840 | It's like sexual tension, but they just need to release
01:49:22.840 | somehow.
01:49:24.840 | What a beat.
01:49:26.840 | What a beat.
01:49:28.840 | We need to get merch.
01:49:30.840 | Besties are back.
01:49:32.840 | [Music]