
E135: Wagner rebels, SCOTUS ends AA, AI M&A, startups gone bad, spacetime warps & more


Chapters

0:00 Bestie intros: Friedberg fills in as moderator!
2:45 Wagner Group rebellion
23:15 SCOTUS strikes down Affirmative Action
51:03 Databricks acquires MosaicML for $1.3B, Inflection raises $1.3B
69:35 IRL shuts down after faking 95% of users, Byju's seeks to raise emergency $1B as founder control in jeopardy
86:38 Science Corner: Understanding the NANOGrav findings


00:00:00.000 | This is going to be a feisty episode.
00:00:02.720 | Is it?
00:00:03.720 | Two of us are on Greenwich Mean Time, two of us are on Pacific, J. Cal still asleep in
00:00:10.000 | his bed.
00:00:11.000 | I'm good, actually.
00:00:12.000 | You good?
00:00:13.000 | I'm good.
00:00:14.000 | All right.
00:00:15.000 | Well, great to be back.
00:00:16.000 | Welcome to the All Conspiracy podcast, where we repeat false statements and help spin them
00:00:19.000 | into tales of struggling against the establishment, the elite and the mainstream media.
00:00:24.120 | We will deliver to you the people the revolution against the powers that be.
00:00:29.120 | Blessed offense.
00:00:30.120 | I'm still trying to get my invite for next week.
00:00:36.120 | All right.
00:00:37.120 | We also enjoy Friedberg Mums the Word.
00:00:39.800 | We'll also enjoy sharing with you fantastic stories of opulence.
00:00:43.720 | Let's be neutral.
00:00:44.720 | Opulence, leisure and benevolent greed.
00:00:47.520 | Here we go.
00:00:48.520 | Joining me today are my co-hosts, General David Sachs, commander of the 4th Battalion
00:00:52.280 | of the Internet Tweet Brigade.
00:00:54.560 | General welcome.
00:00:56.520 | Joining us from a remote location in Moscow.
00:00:59.040 | Has there been like an establishment takeover of the pod?
00:01:01.320 | I mean, what's up with this intro?
00:01:04.240 | Every argument they make against us, you're basically just conceding it's true.
00:01:07.440 | I know, exactly.
00:01:08.600 | From his 12th century Mediterranean castle, Il Duce, Chamath Palihapitiya.
00:01:13.260 | Welcome Chamath.
00:01:14.260 | How is the Mediterranean diet treating you?
00:01:16.720 | Blue.
00:01:18.080 | Blue?
00:01:19.240 | Blue.
00:01:20.240 | And Emperor Nero Calacanis, ruler over podcasts, paid events, entrepreneurial universities,
00:01:26.560 | and dental SPVs.
00:01:28.120 | Emperor, thank you for letting me sit in your throne today.
00:01:31.280 | It's good to be here.
00:01:32.280 | Thank you.
00:01:33.280 | Dental SPVs.
00:01:34.280 | Shout out to my dentists.
00:01:35.280 | Wait, did you say king of STDs?
00:01:36.280 | SPVs.
00:01:37.280 | Oh, SPVs.
00:01:38.280 | Dental SPVs, yeah.
00:01:39.280 | That's the thing with certain STDs, where once you are king, you're going to be king
00:01:46.040 | for life.
00:01:47.040 | Absolutely.
00:01:48.040 | The Syndicate.com.
00:02:07.080 | Yeah, thank you.
00:02:08.080 | You've been doing, you said, five shows a week lately?
00:02:11.440 | Your voice is shot.
00:02:12.840 | I, my voice is starting to go, so I asked you if you'd moderate, and you thankfully
00:02:17.320 | said yes.
00:02:18.320 | I've got the energy I missed last week.
00:02:19.840 | I enjoyed listening to you guys with BG.
00:02:22.600 | Great episode.
00:02:23.600 | Sorry I couldn't join, but let's kick it off.
00:02:26.560 | That's funny.
00:02:27.560 | I still haven't watched the episode that I missed.
00:02:28.560 | You're not interested, huh?
00:02:29.560 | He's like, I'm not on the show.
00:02:31.920 | I'm not interested.
00:02:32.920 | If I didn't have time to participate, I don't really have time to watch it.
00:02:36.400 | The truth is, Sax, why don't you admit to everyone that when you're in a show, you probably
00:02:39.760 | watch it four or five times.
00:02:42.160 | All right, so let's kick this off.
00:02:45.600 | Obviously, you guys recorded right before the Wagner Group attempted coup or potential
00:02:52.160 | coup or theorized coup began last week.
00:02:55.360 | So Sax, you kind of sent out a text saying the show is already stale because you kind
00:03:00.160 | of missed that news cycle by publishing right after it started, but you recorded right before.
00:03:07.120 | So let's do just a quick recap of what happened with Russia, Ukraine, and particularly its
00:03:14.640 | Wagner Group rebellion.
00:03:17.880 | Last Friday, the Wagner Group, which is a Russian paramilitary organization led by Yevgeny
00:03:22.160 | Prigozhin, launched what seemed like an armed insurrection against Russia.
00:03:27.520 | Wagner had occupied portions of Rostov-on-Don, a city of over a million people, a regional
00:03:31.520 | capital and headquarters of Russia's southern military district, before setting off towards
00:03:36.840 | Moscow and then abruptly stopping, I think, about 200 kilometers before reaching Moscow
00:03:42.200 | City.
00:03:43.200 | At this point, there was supposedly a negotiation.
00:03:47.400 | The president of Belarus got involved and Prigozhin decided to stand down.
00:03:53.840 | Putin said, "I'm not going to prosecute you for these crimes."
00:03:57.880 | He was given immunity and it was announced that all the members of the Wagner Group were
00:04:03.600 | given the option of returning home or joining the Russian military and the Wagner Group
00:04:07.800 | was going to be dissolved.
00:04:08.800 | Sax, maybe you can kind of give us your summary of the events that took place and then we'll
00:04:15.560 | talk a little bit about the interpretation of what we think this means for the conflict
00:04:19.800 | in Ukraine.
00:04:20.800 | Well, you're right that this rebellion took place just after we dropped our last episode
00:04:26.680 | and so everybody, both on Twitter and in the comments, was dunking on me for my take on
00:04:32.160 | last week's episode that the much-hyped Ukrainian counter-offensive was not succeeding, that
00:04:38.400 | it was in fact failing.
00:04:40.160 | I think there's abundant evidence for that, which hasn't changed.
00:04:43.000 | Even CNN had written an article basically supporting the idea that the counter-offensive
00:04:47.240 | was not living up to what had been promised.
00:04:50.100 | And so everyone was in the comments saying that my take had basically aged like milk
00:04:54.600 | and this armed rebellion or mutiny by Prigozhin was evidence that the Russian regime was about
00:04:59.920 | to collapse, that Russia was in fact on the verge of civil war.
00:05:06.760 | And you saw the exact same people who had oversold the counter-offensive now overselling
00:05:11.160 | this mutiny as something that would bring down the Russian regime and the war.
00:05:16.960 | And of course, that did not happen.
00:05:19.360 | It was certainly a highly unusual event.
00:05:21.600 | And I've read takes now from every different corner of the internet about what it was and
00:05:25.600 | what took place.
00:05:26.600 | You have people speculating that this was all staged.
00:05:28.680 | I do not believe that.
00:05:29.680 | I do believe that it was an insurrection or mutiny by Prigozhin.
00:05:33.520 | I think the trigger for it was the fact that his Wagner organization was being merged in
00:05:40.160 | with the Ministry of Defense and the regular Russian army and his men were all being made
00:05:44.080 | to sign contracts with the Ministry of Defense.
00:05:47.720 | That would have resulted in a giant loss of income and status for Prigozhin.
00:05:52.400 | Simultaneously, for months now, he's been criticizing the Ministry of Defense, specifically
00:05:57.680 | the Minister of Defense Shoigu and the Chief of the General Staff, Gerasimov.
00:06:04.840 | So he's been vocally critical of them.
00:06:06.760 | And I think that this basically erupted into a mutiny by him where he basically tried to
00:06:14.520 | leverage his position.
00:06:15.680 | Like you said, he marched what they now think are about 8,000 men, which is about a quarter
00:06:21.960 | of Wagner into Rostov-on-Don.
00:06:24.240 | He then took the ministry headquarters and sent about 3,000 of his men on a convoy to
00:06:29.280 | Moscow.
00:06:30.320 | I think that although this is probably best described as a mutiny, I think that it did
00:06:35.480 | have coup optionality to it.
00:06:37.820 | I think that Prigozhin was seeking to find out how much support Putin had and who might
00:06:44.400 | join him.
00:06:45.400 | And he had put out a number of statements that I think from the Russian regime's point
00:06:49.000 | of view could be described as seditious that morning.
00:06:52.920 | And there's a lot of evidence that he staged this attack on his base.
00:06:57.080 | He claimed that there was a missile attack by the Ministry of Defense, and that's what
00:07:00.240 | launched this march for justice.
00:07:02.680 | And in his comments, he was careful not to criticize Putin directly, but he had a lot
00:07:07.960 | to say about the Ministry of Defense and the overall conduct of the war.
00:07:11.440 | And it was, I think, harshly critical indirectly of Putin.
00:07:15.400 | And I think he was looking to see who might support him.
00:07:18.560 | And what happened is that on the way to Moscow during this convoy, nobody supported him.
00:07:25.040 | In fact, all the statements came out from the other generals, including Sergei Surovikin,
00:07:30.720 | including all the regional governors, members of the Duma, and other important figures in
00:07:36.680 | Russian society.
00:07:37.680 | And there wasn't a single person willing to publicly support Prigozhin.
00:07:41.480 | And that's when Putin went on TV, called this basically an act of treason and a stab in
00:07:45.600 | the back.
00:07:46.600 | And at that point, I think Prigozhin's options were pretty limited.
00:07:50.040 | And he basically took a deal that was brokered by Lukashenko, in which he would go into exile
00:07:56.120 | in Belarus in exchange for basically being allowed to live.
00:08:01.660 | That was basically the deal that was ultimately cut.
00:08:04.080 | And so I think where things stand today is that although I think this was an embarrassment
00:08:08.920 | and a black eye for the Russian regime, it never looks good for a regime to have any
00:08:14.880 | kind of mutiny or insurrection.
00:08:18.200 | And I think that it does raise questions that Putin's now gonna have to answer to his various
00:08:23.080 | allies and supporters about how stable his regime is.
00:08:25.960 | I think ultimately, Putin has ended up in a place of consolidating Russian society behind him.
00:08:33.480 | Like I said, there were no power centers that supported this mutiny.
00:08:37.320 | They all rallied to Putin's defense and the people of Russia, even though Prigozhin is
00:08:41.560 | a popular figure being kind of a war hero from the Battle of Bakhmut, the Russian society
00:08:46.720 | supported Putin, I think he's at something like 80% poll numbers.
00:08:50.920 | So I think where things stand today...
00:08:52.520 | Whose poll?
00:08:53.520 | Well, like Levada Center, which is an independent polling agency, for example.
00:08:58.360 | These are not the Russian regime's own numbers.
00:09:00.680 | And when the poll numbers...
00:09:01.680 | No, no, wait a minute.
00:09:02.680 | Would you say in a poll that you don't support Putin?
00:09:04.320 | Well, I don't know how Levada Center, what their methodology is, but these numbers, when
00:09:10.000 | the numbers are bad, are cited by Western sources.
00:09:12.760 | But there are other forms of evidence too.
00:09:14.560 | You may have seen this video that went viral over the past few days of this song that's
00:09:19.360 | now the number one chart buster in Russia, where it's this very patriotic Russian song
00:09:26.280 | where they're basically singing "I am Russian."
00:09:28.440 | That's sort of the main chorus.
00:09:29.440 | Nick, can you pull up the song?
00:09:30.920 | I'd like to see this, please.
00:09:32.920 | Coming at you at 95.5.
00:09:34.680 | I love Russia.
00:09:35.680 | Oh my God, this is some serious propaganda.
00:09:36.680 | Wow, yeah.
00:09:45.920 | I mean, that looks like a pretty good party.
00:09:50.820 | If you look up the lyrics to the song, the English translation of the lyrics, the gist
00:09:55.960 | of it is I'm basically proud to be Russian and I don't care who doesn't like it.
00:10:01.040 | That's basically the lyrics of it.
00:10:03.120 | How do you get invited to that party?
00:10:05.120 | I got so many jokes I'm not going to make them.
00:10:08.760 | I know, you got to be careful.
00:10:09.760 | Be careful here, folks.
00:10:10.760 | You got to be careful here.
00:10:11.760 | Well, you have to be careful.
00:10:12.760 | It looks like a fun party.
00:10:14.200 | It's a catchy song.
00:10:16.080 | I mean, I love to dress up and show up.
00:10:18.360 | Tell me where to show up, Putin.
00:10:19.680 | I think the point here is that Russian society is united behind the state in wanting to fight
00:10:26.020 | this war.
00:10:27.080 | And I think that part of the reason why Prigozhin's mutiny was so oversold as an imminent coup
00:10:33.120 | that would bring about the collapse of the Putin regime and of the Russian war effort
00:10:38.000 | and of their front line is because, since the beginning of this war, we've had
00:10:42.040 | this narrative that if we applied enough pressure to Russia, that there would be a palace intrigue
00:10:48.360 | and a palace coup and that liberal forces inside of Russia would rise up and topple
00:10:53.400 | the dictator Putin and basically get them out of this war.
00:10:56.720 | And I think what's happened is fairly predictable, but it's the opposite of that, which is the
00:11:01.000 | Russian people are rallying around the flag and rallying around Putin, the war leader,
00:11:05.840 | and they are a patriotic people just like the Ukrainians.
00:11:09.560 | And I think both these countries, that both the Russians and the Ukrainians are a proud
00:11:13.240 | people and I think they're in a fight to the death.
00:11:16.340 | And I think that both countries regard this as existential.
00:11:20.280 | And we have basically stuck ourselves in the middle of this fight to the death between
00:11:23.820 | these two countries.
00:11:24.880 | And I don't see this working out very well.
00:11:26.600 | Okay, so look, I will make an admission.
00:11:29.400 | I consider myself a modestly well read person, modestly well informed.
00:11:34.960 | I had never heard of the Wagner Group or Prigozhin prior to this coup attempt last week.
00:11:40.600 | Chamath, J. Cal, had you guys heard of this person before?
00:11:45.200 | And the Wagner group?
00:11:46.200 | Yeah.
00:11:47.200 | Yeah.
00:11:48.200 | Oh, well, maybe I'm an idiot or just not interested.
00:11:50.960 | I think what was so surprising was like how out of the blue the story seemed to be last
00:11:55.240 | week that there was this disagreement between this person that commanded this paramilitary
00:11:59.920 | organization who then turned around against Putin and stood up against him and marched
00:12:04.840 | back towards Moscow.
00:12:06.640 | And it felt to me like it came a little bit out of the blue and was such like a weird
00:12:11.160 | kind of shocking event.
00:12:12.280 | Did it feel kind of like that to you guys that there was surprising instability and
00:12:17.800 | the surprising potential revolution happening locally?
00:12:21.320 | I think stepping back here and just looking at the news cycle, obviously, I don't think
00:12:26.000 | many people expected this.
00:12:27.360 | This is a wild card.
00:12:28.800 | And so people could be humble in, you know, their belief of like how much they actually
00:12:33.560 | understand about what's going on there.
00:12:36.040 | The Russian soldiers are not in favor of this war.
00:12:38.560 | This is a war that's very unpopular in Russia, actually.
00:12:41.600 | And for the Ukraine, having been invaded by Russia, they're fighting for their land and
00:12:45.000 | they're gonna, they are much more motivated.
00:12:47.440 | I wouldn't believe any of this propaganda, but this is a bit of a Rorschach test.
00:12:50.960 | Somebody on the left got on Twitter and said, this is the end of Putin.
00:12:54.640 | Everybody's gonna rise up in the streets.
00:12:55.840 | And they overplayed that, you know, angle of the story.
00:12:58.480 | And then, of course, the right or people who are pro-Russian or anti the West backing this
00:13:04.540 | war are gonna take the other side, Friedberg.
00:13:05.960 | They're gonna take the side of, you know, oh, everybody loves Russia.
00:13:10.640 | 80% of people are voting for this.
00:13:12.720 | It's ridiculous to think that anybody in Russia is gonna answer, do you like Putin?
00:13:16.560 | Do you support Putin on a survey?
00:13:18.320 | Can you imagine a Putin's a murderous dictator who kills all of his enemies and he controls
00:13:23.800 | through violence?
00:13:25.280 | Nobody's answering a survey correctly.
00:13:27.240 | This, you know, top song is complete propaganda.
00:13:30.880 | Putin has control of the entire media apparatus there.
00:13:34.200 | What this showed actually, if you step back and you look at-
00:13:36.960 | Would you go to the party though, if you were invited?
00:13:38.680 | Well, I mean, are there gonna be potential LPs there?
00:13:42.580 | Last week, but I think-
00:13:44.320 | It's just a joke, folks.
00:13:46.400 | But stepping back, if you look at modern day dictators, they tend to stay in power for
00:13:50.280 | about three decades.
00:13:51.960 | Putin's in his third.
00:13:53.960 | And I think we're gonna see in the next 10 years, Putin lose power and he's going to
00:14:02.000 | be out of power.
00:14:03.840 | And when we look back on it, it's gonna be one of two causes.
00:14:06.520 | It's gonna be either cancer, which, you know, the speculation is he's had cancer and that's
00:14:10.240 | why he disappears from view, because he might be getting treatment.
00:14:15.240 | And when we'll look back on that, the end of his power, it will be because his control of Russia,
00:14:22.720 | controlling through violence and fear of violence and threat of
00:14:28.000 | violence, is exhausting.
00:14:29.240 | You have to be paranoid.
00:14:30.240 | And that's why it generally doesn't last that long, especially compared to the West, where
00:14:33.400 | we have a democracy and people last about a half decade.
00:14:37.480 | We will look back on his end, which will be in the next 10 years, either through cancer
00:14:43.000 | or through his invasion of Ukraine.
00:14:45.540 | This is the biggest blunder he's ever made.
00:14:47.360 | And this is a really crazy sign. That somebody would actually attempt or even float a coup
00:14:55.360 | is insane.
00:14:56.600 | He's murdered every single person who has ever even challenged his authority in a minor way.
00:15:03.640 | The fact that one of his right-hand men, one of his tight inner circle,
00:15:08.840 | would actually start heading
00:15:14.000 | towards Moscow is insane.
00:15:16.680 | So to say this wasn't a big deal and Putin's, you know, now consolidated power and everybody's
00:15:21.280 | in the street dancing.
00:15:22.280 | It was almost as bad as January 6.
00:15:24.800 | This is just simply not true.
00:15:26.760 | I never said it wasn't a big deal.
00:15:28.120 | I wasn't talking about you.
00:15:29.120 | I was talking about the mids on Twitter.
00:15:30.360 | I never mentioned your name.
00:15:31.480 | I think that your point, J. Cal, at the beginning, that this doesn't really seem to
00:15:36.920 | change anyone's point of view on the outlook, your point of view sounds like it's the same
00:15:40.800 | as it was a week ago.
00:15:42.240 | Saks, your point of view is probably the same as it was a week ago.
00:15:45.720 | I think there are a couple of takeaways here.
00:15:47.720 | First of all, they've had polling of opinion in Russia for a long time.
00:15:52.520 | And like I said, when the polls go the way that the Western sources want, no one questions
00:15:57.760 | their accuracy.
00:15:58.760 | Again, I don't know exactly the methodology, but Levada Center is an independent pollster
00:16:02.840 | that Western publications do trust.
00:16:05.640 | I hear them repeat it over and over again.
00:16:09.440 | And by their methodology, which I assume hasn't changed, I think Putin's popularity before
00:16:13.760 | the war is around 65%.
00:16:15.840 | Now they're showing it at about 83%.
00:16:17.440 | Jason, you may not like the war.
00:16:20.480 | And I certainly don't like the war.
00:16:21.960 | Nobody likes the war, yeah.
00:16:22.960 | Nobody likes the war.
00:16:23.960 | But I think it is simply a fact that the Russian people have rallied around the flag, and they
00:16:29.200 | do support this war and Putin as the leader.
00:16:33.800 | Now I do think he has egg on his face here from this Prigozhin uprising in terms of,
00:16:38.720 | did I see this coming?
00:16:39.720 | No, I didn't have this on my bingo card.
00:16:41.000 | I don't think anybody else did either.
00:16:42.480 | However, did I know who Prigozhin is?
00:16:44.200 | Certainly.
00:16:45.200 | I mean, I've been tracking Prigozhin's statements since around February.
00:16:48.800 | He's been vocally criticizing the Ministry of Defense, specifically Shoigu and Gerasimov,
00:16:54.360 | in increasingly insubordinate and you could argue even seditious ways.
00:16:58.400 | I'm really kind of surprised in a way that he wasn't dealt with before this.
00:17:02.880 | And I'm sure that the Kremlin is kicking itself for probably not dealing with it sooner.
00:17:07.640 | But in terms of why he's still alive, I think that Putin had a really tough decision to
00:17:12.320 | make about whether to quash this rebellion completely, which would have led to horrific images of
00:17:19.560 | violence, potentially Moscow or Russians killing Russians.
00:17:25.400 | That might have actually led the Russian front to question itself or collapse.
00:17:30.240 | So I think he did the expedient thing, which is he cut a deal.
00:17:33.200 | He got Lukashenko to help broker it and he cut a deal.
00:17:35.920 | And I think at the end of the day, I think that he made the cool headed decision that
00:17:41.000 | was in his and in Russia's interest, which was to avoid this getting to the point
00:17:46.520 | of a bloody violent insurrection.
00:17:48.120 | Okay, Chamath, any point of view shift for you coming out of the Prigozhin event of the
00:17:55.080 | last week on Russia, Ukraine?
00:17:57.520 | I mean, I want to know how much he got paid to stop marching towards Moscow.
00:18:02.480 | Yeah, I mean, it is like a mob.
00:18:05.680 | It must have been a lot.
00:18:09.040 | Not bad for a guy that was, what, Putin's caterer a few years ago, right?
00:18:13.040 | He started a catering business.
00:18:14.480 | He spent nine years in jail.
00:18:16.160 | This guy's like been nine years in jail.
00:18:18.200 | It's like the Sopranos.
00:18:19.200 | He went to jail for nine years for selling illegal hot dogs or something.
00:18:21.920 | And all of a sudden, 30 years later, the guy got just paid billions of dollars to basically
00:18:27.200 | stop his paramilitary group from taking over one of the largest countries in the world.
00:18:30.880 | It's not the largest country, I mean, the largest nuclear arsenal in the world.
00:18:32.960 | Yeah.
00:18:33.960 | It's just so you know who this guy is.
00:18:35.800 | I mean, he really is the street thug that Putin is always accused of being.
00:18:41.800 | He was a street thug.
00:18:42.800 | He did go to jail.
00:18:44.600 | He was one of these guys who came up in Russia as a businessman when to be a businessman,
00:18:50.480 | you had to be so tough.
00:18:52.120 | Businessmen were getting murdered left and right by gangsters.
00:18:54.360 | You almost had to be a gangster yourself.
00:18:56.480 | Apparently, he made some money in the supermarket chain business and that led him to create
00:19:01.480 | a catering business, which brought him to Putin's attention and he started catering
00:19:05.440 | for the Kremlin.
00:19:07.040 | He's sometimes called Putin's chef.
00:19:08.520 | I don't think he was a chef himself.
00:19:10.360 | He was a guy who owned the business.
00:19:13.320 | And then from there, he was given the license to create this PMC, this private military
00:19:17.760 | corporation, Wagner Group.
00:19:20.360 | He wasn't the only founder of it.
00:19:22.200 | He had a co-founder who was actually the military man behind it.
00:19:25.640 | But Wagner became this group of mercenaries who do all sorts of business in Africa mainly,
00:19:32.640 | where they are working on behalf of governments there to protect mineral resources or oil
00:19:38.760 | wells, all sorts of things.
00:19:39.760 | In fact, if he was a Sopranos captain, who would he be?
00:19:41.840 | Like Phil Leotardo?
00:19:42.840 | Does like 20 years in jail, comes out?
00:19:44.760 | I think it's sort of like John Gotti going against Michael Corleone.
00:19:48.400 | I think that Putin is sort of the very cold, rational guy with everything in his head who's
00:19:54.320 | very calculated and doesn't reveal much.
00:19:57.760 | More like a Michael Corleone.
00:19:59.040 | Whereas I think that Purgosian is emotional, erratic.
00:20:03.280 | He's been saying these statements for months here, which I don't see how they could possibly...
00:20:06.840 | A loose cannon.
00:20:07.840 | And the crazy thing though is, is that what you saw on Twitter and social media was unrestrained
00:20:13.440 | glee, really delirium, over the idea that Prigozhin might topple Putin and become the custodian
00:20:20.640 | of Russia's thousands of nuclear weapons.
00:20:23.040 | And so my comment on this whole thing is be careful what you wish for.
00:20:27.360 | Why in the world would Americans want that?
00:20:31.240 | We'd be jumping out of the frying pan and into the fire.
00:20:33.320 | I've been saying since the beginning of the war that this fantasy that Putin's gonna be
00:20:36.560 | toppled by a palace coup and you're gonna replace him with Navalny or something like
00:20:40.440 | that, or we're gonna get Gorbachev 2.0.
00:20:42.640 | I said that was always unrealistic.
00:20:44.040 | And what you're much more likely to end up with is an even worse dictator or a hardliner.
00:20:51.120 | And I think that is what would have happened if Prigozhin had taken over.
00:20:53.680 | I think it would have been much worse for the West.
00:20:55.520 | The final point is, what's the takeaway from here is I think this is gonna put more pressure
00:20:59.840 | on Putin to conduct the war in a more violent way.
00:21:04.200 | I know that people already think that the war is horrible and violent, but Putin has
00:21:08.480 | been criticized by hardliners on his right for basically making the war a special military
00:21:14.200 | operation instead of an all-out war.
00:21:16.760 | And Prigozhin, I think, expected to find more support among the sort of ultra-nationalists
00:21:21.960 | in Russia and among the military who have been critical of Putin for waging the war
00:21:27.120 | in what they consider to be too half-hearted or an incomplete way.
00:21:31.200 | They would like to see this declared to be a war.
00:21:33.280 | They would like to see the full mobilization of Russian society.
00:21:37.280 | And this is the problem that I see now is that I think Putin already knew, but this
00:21:41.320 | has to underscore for him that this war is existential for him personally.
00:21:45.680 | If he loses, it's the end of not only his regime, but probably his life in Russia.
00:21:50.200 | And I think he's gonna do whatever it takes to win this war.
00:21:53.520 | And I think you could see now over the next few months a full mobilization in Russia.
00:21:57.500 | And I think that this could lead us to the next point of escalation in this war.
00:22:01.560 | That is if this Ukrainian counteroffensive actually is successful on some level.
00:22:05.360 | Right now it is not succeeding.
00:22:07.260 | So there's no reason for Putin to do that.
00:22:09.640 | But if this counteroffensive succeeds, you will see the next level of escalation.
00:22:13.760 | So Sax, you did a great job stringing six points together.
00:22:16.280 | I think my key takeaway is, which is now 18 points you've made, so you can retire for
00:22:21.960 | the rest of the show.
00:22:23.160 | My key takeaway from your series of statements, however, is an important one, which is to
00:22:27.520 | watch the potential escalation driven by Putin here.
00:22:32.160 | Any wrap up?
00:22:33.160 | Otherwise, I'm going to move forward with the Supreme Court's actions.
00:22:34.160 | I just have one statement.
00:22:35.160 | Go ahead.
00:22:36.160 | I was thinking about this, like, there's a famous Sun Tzu quote, "The supreme art of
00:22:40.400 | war is to subdue the enemy without fighting."
00:22:43.760 | This is a big mistake, and we need to make sure that we don't get into a war with China
00:22:49.520 | over Taiwan.
00:22:51.600 | So we have to avoid these things.
00:22:53.560 | And that's where Sax and I are in alignment.
00:22:55.560 | Sun Tzu Calacanis, we heard it here first.
00:22:58.440 | Let's move forward with peace.
00:23:00.480 | I can only hope that the conflict ends soon, as I've always said.
00:23:03.960 | I realized over the last week how little I know about the Russian military conflict with
00:23:09.280 | Ukraine, and I appreciate Sax's contributions.
00:23:14.240 | Super helpful.
00:23:15.240 | I went to Cal, UC Berkeley, 1997, fall of '97.
00:23:20.480 | And it was the last year that Cal had affirmative action admissions.
00:23:25.960 | And I remember at that time, there was a big case, a guy named Bakke was rejected by the
00:23:31.000 | University of California Davis Medical School.
00:23:33.400 | And he alleged reverse discrimination in 1974, and sued the University of California.
00:23:39.480 | And eventually it became a landmark US Supreme Court case, Regents of the University of California
00:23:43.720 | versus Bakke.
00:23:44.720 | And in 1995, the UC regents voted to eliminate affirmative action.
00:23:50.200 | So the year that I was at Cal, I think was the last year of affirmative action admissions.
00:23:55.920 | And it's obviously been a pretty hot topic here in California for the past, you know,
00:24:05.000 | 25, 30 years.
00:24:06.480 | This morning, the Supreme Court ruled on two separate cases regarding using race as an
00:24:13.360 | admissions criteria in college admissions.
00:24:16.440 | And the votes were six to three against affirmative action in the University of North Carolina
00:24:20.360 | case and six to two against affirmative action in the Harvard case.
00:24:24.320 | Ketanji Brown Jackson recused herself because she previously served on Harvard's board of
00:24:29.160 | overseers.
00:24:30.160 | All the, as they're, you know, kind of characterized, conservative judges voted to strike down
00:24:34.400 | affirmative action and all the, as they're characterized, liberal judges voted to keep it.
00:24:39.120 | Both of these cases were filed in 2014 by a group called Students for Fair Admissions.
00:24:44.200 | And effectively, the court said that at Harvard and at UNC, the schools were systematically discriminating
00:24:51.080 | against Asian Americans in violation of civil rights laws by using their race as a system
00:24:57.000 | for profiling and excluding them while trying to be more inclusive of a more racially
00:25:02.820 | diverse set of applicants.
00:25:05.780 | So Chamath, I'd love your read, I guess, on whether this was a surprise or an expected
00:25:12.840 | case.
00:25:13.840 | I think Saks and I mentioned this before, but I think we both expected this to happen.
00:25:19.040 | I think it's probably important to maybe set up a more practical explainer.
00:25:25.400 | Friedberg.
00:25:26.400 | So, Nick, if you want to just throw up that image that I just sent you, we can sort of
00:25:32.280 | explain the genesis of the lawsuit.
00:25:36.200 | So what you can see here is admit rates into Harvard by race, ethnicity, but also by academic
00:25:46.280 | decile.
00:25:47.280 | Yeah.
00:25:48.280 | And so what it basically shows in a nutshell is an African American student in the 40th
00:25:53.320 | percentile of the academic index is actually more likely to get in than an Asian student
00:25:59.880 | at the 100th percentile.
00:26:02.320 | And so that at the core is sort of... sorry, that means that the Asian American
00:26:07.400 | student had better scores than 99% of other applicants and still didn't get in.
00:26:15.920 | Right.
00:26:16.920 | So you have to go back, I think, to 2003, when essentially what the Supreme Court said
00:26:26.080 | is like, Look, we're going to allow this affirmative action stuff to last roughly for another 25
00:26:31.520 | years.
00:26:32.520 | But by that point, we expect that the work that needed to be done will have been done.
00:26:37.160 | Again, this is them saying this, not me.
00:26:41.200 | And so I think what today does is actually quite important, not just for what it means
00:26:47.760 | for universities, but also what it means for private enterprises.
00:26:51.960 | So just to take a second on this, I think what happens today is the pretty obvious stuff,
00:26:58.600 | which is that you have to change university applications, you have to change all of the
00:27:03.680 | admissions profiling, all of the stuff that you would normally do.
00:27:08.240 | I'm not even sure if you can even have a box where you can declare race,
00:27:12.400 | maybe you can or cannot, I don't even know.
00:27:14.860 | But all of that changes today.
00:27:16.880 | So then the question is, well, what's the first order derivative?
00:27:19.280 | What changes next?
00:27:21.520 | And I called someone who's a pretty well known constitutional Supreme Court lawyer on this.
00:27:27.480 | And the next step is probably going to be around athletics based and legacy based admissions.
00:27:35.980 | So athletics based admissions are pretty obvious, which is, you don't really have great grades.
00:27:41.740 | But you're really stupendous at a sport that's important to that school.
00:27:46.600 | So then they let you in, because they want to compete in said sport for whatever reason.
00:27:51.640 | The legacy one is even more prickly, which is net, you're kind of a dummy, but your parents
00:27:56.020 | are rich and or went to the school before.
00:27:59.100 | And so then they let you in as well.
00:28:02.060 | And his thought on this is that those things will go away.
00:28:09.260 | Because if you can't use race based admissions to kind of balance the scales, then it'll
00:28:15.480 | become pretty quick where somebody launches a legacy based lawsuit, or an athletic based
00:28:20.840 | bias lawsuit and wins that as well.
00:28:24.140 | So that's the first order derivative.
00:28:25.760 | So, you know, the thought is, are those constitutionally protected the way
00:28:33.080 | equality based on race is? Either way, it becomes a huge headache for these schools.
00:28:37.680 | Right.
00:28:38.680 | And so you're going to be fighting these admission standards constantly changing.
00:28:43.380 | And so if you're not going to let you know, a bunch of poor minority black and brown kids
00:28:48.560 | in, but you're letting in the sons and daughters of rich, important people, I think that that's
00:28:55.420 | going to paint that school in a very bad light.
00:28:58.240 | So I think that's so important.
00:28:59.240 | White, typically white, although one would say the great thing over
00:29:04.800 | the last couple of decades is there have been a lot of minorities that have gotten into
00:29:08.960 | these very elite schools, which means their kids would be the first generation that's
00:29:14.300 | eligible for legacy, but you're going to wipe that away.
00:29:17.540 | So I think from a just a social stigma perspective, and I have a solution for this, which I'll
00:29:22.320 | get to at the end for those people.
00:29:24.320 | But so I think that's the first order derivative.
00:29:26.080 | The second order derivative is now what lawsuits get launched and what are the implications
00:29:30.040 | for private companies, right?
00:29:32.020 | So right now, this affects any institution that receives federal funding.
00:29:36.540 | And that includes all the universities.
00:29:38.180 | So there's no private or public university really, except for a handful that don't take
00:29:41.680 | this money.
00:29:42.680 | So they'll all have to do this.
00:29:44.440 | But the really important question after that will be what happens to companies like Apple,
00:29:49.400 | or Facebook or Exxon who have race based programs to try to attract African American engineers
00:29:57.840 | or Hispanic chemists, whatever the program is that you want to come up with?
00:30:05.480 | Will those get challenged?
00:30:07.040 | And will those companies have to change?
00:30:08.860 | And my friend's thoughts on that were that yes, that those would also change.
00:30:13.800 | And that's going to have a really important impact on private enterprise and how they
00:30:17.260 | approach this stuff and how DEI stuff works.
00:30:20.280 | And frankly, downstream, how ESG works, because all these ESG checkboxes, some of them
00:30:25.120 | will actually become illegal, right?
00:30:28.320 | So I think the importance of this decision can't really be overstated. The
00:30:33.760 | changes will be slow.
00:30:35.760 | And then there'll be fast, they'll first touch higher ed, but then I think they'll touch
00:30:39.460 | private enterprise.
00:30:40.520 | And so I think it was a very important decision in America that just happened.
00:30:46.520 | Chamath, what are the right ethics and values?
00:30:51.680 | I mean, what do you guys... I guess we could just do this around the table, J Cal, maybe
00:30:54.920 | you kick it off.
00:30:57.240 | Should we I mean, from your point of view, do you think that values should include racial
00:31:03.680 | diversity in admissions and universities?
00:31:06.800 | Yeah, this is like the ultimate. Or are the values about equality of opportunity for everyone,
00:31:12.240 | regardless of race, right?
00:31:13.640 | You're asking the exact right question, I think.
00:31:16.200 | That's the world's greatest moderator.
00:31:18.560 | But yeah, doing a solid job so far.
00:31:20.520 | And this creates a lot of cognitive dissonance for people, right?
00:31:23.000 | Because you really want to believe that the world is a meritocracy.
00:31:28.720 | And you know, if you were to take other pursuits in the world, you'd never say like, we should
00:31:33.840 | let race, gender, age affect people's performance in the hundred yard dash or their compensation
00:31:42.640 | at a company, right?
00:31:43.640 | All of that should be based on achievement.
00:31:46.560 | And so there is a question on what achievements should be taken into account when you apply
00:31:51.320 | to a school.
00:31:53.280 | And it's pretty obvious, the legacy thing, you know, is, you know, a backdoor into these
00:31:58.280 | schools.
00:31:59.760 | But we want to feel like we're also making progress because listen, the world has been
00:32:05.000 | unfair.
00:32:06.280 | The world was built on slavery.
00:32:09.560 | And our country has, you know, it's only 150 years past that, and the Civil Rights Act was
00:32:15.800 | what, 1964 or '65?
00:32:17.160 | Like, we really want to see everybody achieve here.
00:32:20.280 | So I think you have to pause for a second and say, well, if the goal is you want to
00:32:23.640 | see you know, black Americans perform better, and I think that's the, the underlying concern
00:32:32.600 | here.
00:32:33.800 | And it is based on the legacy of America.
00:32:36.120 | Well, how do you do that?
00:32:37.840 | And I think we're looking way too far down in the educational pipeline.
00:32:43.560 | The solution here is really childcare.
00:32:45.320 | The solution here is nursery schools, pre K, elementary school education, and those
00:32:49.800 | things need competition.
00:32:51.080 | And that's where people fall behind.
00:32:52.800 | To be looking at this at the end of the academic journey is I think, crazy.
00:32:57.160 | So you know, when I'm president, I'm going to have 365 day a year, you know, childcare
00:33:03.800 | and pre K. And that's where we should if we really want to try to make up for some wrongs
00:33:09.760 | in the history of this country, and try to have better outcomes, we need competition
00:33:14.000 | in schools, which means probably breaking some of these unions and giving people vouchers
00:33:18.360 | and choice.
00:33:19.840 | And then we have to invest more in the earliest stages of education.
00:33:24.080 | And I think everybody wants to see a better system here.
00:33:27.640 | DEI, to Chamath's point: it is illegal to hire people based on race, gender, any
00:33:34.360 | of those criteria, obviously, and the DEI programs are trying to find more applicants.
00:33:41.300 | So their goal, typically, in the way they don't break the law, is to just try to, in
00:33:46.640 | their best cases, find more applicants.
00:33:51.200 | But even that does feel like there's many times in life when people will say things
00:33:57.880 | in corporate America, like, we have too many white guys in these positions,
00:34:02.360 | we cannot hire another white guy.
00:34:04.460 | So the reality of DEI that I've seen up close and personal when I was at AOL, I've
00:34:08.000 | told the story before.
00:34:10.120 | Somebody said to me, there's no way for us to make you an EVP, you have to stay at SVP.
00:34:14.800 | And I said, Why is that like, I'm doing all this EVP level work, and they said, because
00:34:17.680 | you're a white guy.
00:34:19.520 | And the entire company is white guys at EVP.
00:34:22.720 | And we cannot add another white guy there.
00:34:24.880 | But we'll just give you the same bonus compensation.
00:34:26.360 | So don't worry about it.
00:34:28.640 | And so there's all kinds of games being played here.
00:34:30.420 | But I think it's great that we're having this conversation, right?
00:34:32.340 | It's a hard conversation for America to have.
00:34:34.200 | For me, I've talked about this in the past, I've always had concern when we make
00:34:42.400 | the shift away from equality of opportunity to equality of outcome.
00:34:47.960 | Because we all have this objective that we want to see everyone have equal rights to
00:34:52.680 | success in some way in the United States.
00:34:56.080 | The question is, at what point do you move beyond opportunity where everyone is given
00:35:01.560 | an equal opportunity in this country to invest themselves in transforming their own lives,
00:35:09.200 | versus equality of outcome, where regardless of how much you do, how much effort, or your
00:35:17.440 | trials, you are given the same as everyone else.
00:35:20.200 | And that ends up looking a lot like socialism.
00:35:22.040 | And it's very concerning, because I think it limits progress and opportunity for everyone.
00:35:26.400 | The real challenge with this particular topic: is college admissions about outcome, or is it
00:35:33.400 | about opportunity?
00:35:34.800 | It's outcome in the sense that you spend 12 years going to elementary school and high
00:35:39.720 | school and working hard to get yourself into college.
00:35:43.160 | So it's the outcome of all of that effort.
00:35:45.360 | And some people aren't given the opportunity to have success during those 12 years.
00:35:50.920 | And you know, and it is an outcome, what do you think we should do?
00:35:55.240 | And it's an opportunity because it's about going to college, because without having a
00:35:57.920 | great college, you may have a tougher time getting into the workforce.
00:36:02.140 | So that is why it's a hard value question for me.
00:36:04.560 | I don't have a great answer on this.
00:36:07.240 | But I'm just pointing out it's a lot like the abortion argument, where both sides have
00:36:12.440 | some value oriented point of view, that feels like it's negating the other person's point
00:36:18.440 | of view.
00:36:19.760 | But at the end of the day, they're both coming from either this is an opportunity or it's
00:36:23.120 | an outcome decision.
00:36:24.120 | And that's what makes it so challenging.
00:36:25.440 | The National Bureau of Economic Research did a study in 2019 that they published.
00:36:31.880 | And what they found was that 43%, four-three, of white students admitted to Harvard
00:36:38.600 | were athletes, or legacy students, or children of faculty and staff, or had relatives
00:36:48.360 | that were donors to the school.
00:36:50.400 | 43%? Wait, it's rigged.
00:36:52.880 | And then they found on top of that, that 75% of those white students admitted from those
00:36:58.280 | four categories would have been rejected if they had been treated as a normal applicant.
00:37:03.960 | So I think for all the people that are looking at all the black and brown kids that may not
00:37:08.200 | get into a place like Harvard.
00:37:11.080 | If you don't look at these other categories, it is a bit of a gross injustice, quite
00:37:16.560 | honestly.
00:37:17.560 | So I think that these institutions have to evolve.
00:37:19.360 | And if you're going to be forced to be meritocratic, then actually be meritocratic.
00:37:23.960 | And by the way, I actually am fine with legacies and donors.
00:37:28.280 | But I think what should happen is you should just publish a rate card, and you should make
00:37:31.880 | it hyper transparent.
00:37:33.780 | And so I love it for the rich guy who's got an idiot son or daughter.
00:37:37.800 | Let's just be upfront and honest with everybody.
00:37:39.800 | It costs $50 million to get into Stanford, it costs $80 million to get into Harvard,
00:37:44.940 | we all know these numbers, so we should just publish them.
00:37:47.860 | You should pay the price and be done with it.
00:37:49.760 | And for Harvard and Stanford and Yale and all these schools, having an extra 10 or 20
00:37:55.120 | dummies, but an extra $2 or $3 billion, may be a reasonable trade off, but at least it
00:37:58.640 | would be transparent and fair, right?
00:38:00.880 | This is an important free market question as well, because these are private institutions,
00:38:05.040 | they're privately funded. Not if they take federal dollars.
00:38:07.960 | They're not? Agreed.
00:38:09.960 | But if they take federal dollars, they're not.
00:38:10.960 | And then it becomes a government process.
00:38:13.160 | It's government influence.
00:38:14.160 | It's a state school, all those.
00:38:16.080 | Is there a separate category here, just like country clubs, or any private membership club,
00:38:21.480 | where the members of the club get to decide who they want to admit to the club?
00:38:25.940 | And is that un-American?
00:38:27.600 | And should the Supreme Court and should our constitution have a role in defining how private
00:38:32.240 | institutions make decisions about who gets it?
00:38:34.440 | No, Harvard could absolutely return all the federal funding, the billions of dollars a
00:38:37.860 | year they get, that's totally reasonable, then they can decide to just focus on legacy
00:38:41.680 | admits.
00:38:43.360 | That's totally reasonable.
00:38:44.360 | It's within their rights.
00:38:45.360 | Saks, we got to hear you.
00:38:47.400 | I know that you used up your speaking quota already.
00:38:50.400 | Well, yeah, I didn't want to butt in.
00:38:53.760 | Give him four of my points.
00:38:54.760 | I want to hear.
00:38:55.760 | I talked a lot in the last segment.
00:38:56.760 | You get four of J. Cal's minutes.
00:38:57.760 | Go ahead.
00:38:58.760 | I give you four of my minutes.
00:38:59.760 | So, yeah, two points, I guess.
00:39:00.760 | So, on the legacy thing, I agree with Chamath that we should get rid of it.
00:39:04.080 | It's not meritocratic.
00:39:05.600 | I think that if they did publish a rate card, that would be more honest, but they'd be too
00:39:09.700 | embarrassed and ashamed to do that.
00:39:11.720 | But I think making that argument exposes the hypocrisy of it.
00:39:15.520 | I've already told my kids I'm not helping them get into college, so they're going to
00:39:18.520 | do it on their own.
00:39:19.840 | And so, look, I think the legacy thing, by the way, that's the best gift you can give
00:39:23.120 | kids.
00:39:24.120 | Yeah, that perspective.
00:39:25.120 | Sorry, go ahead, Saks.
00:39:26.120 | I didn't mean to interrupt.
00:39:27.120 | So that's that's point number one, fully agreed on the legacy thing.
00:39:29.640 | With respect to the decision itself... I'm sorry, can I just clarify that?
00:39:33.360 | Do you believe that the legacy thing should be like in federal law?
00:39:38.000 | I mean, is that a government thing?
00:39:39.960 | Or do you think that that's how those institutions should behave?
00:39:42.240 | Because those are different.
00:39:43.600 | I mean, I'm asking, are you suggesting that the law should be involved, that the government
00:39:46.680 | should be involved?
00:39:47.680 | I don't know if it's a legal thing, because I don't know how to implement that law.
00:39:50.240 | But I think it's something they should stop doing one way or another.
00:39:53.100 | Maybe it should be a law, but I think it should stop.
00:39:55.020 | So I think that's point number one.
00:39:56.020 | The legacy thing.
00:39:57.020 | For private membership.
00:39:58.020 | So just let me just double click on that.
00:40:01.240 | Do you think that should extend to private membership clubs like country clubs as well,
00:40:04.480 | that they shouldn't be allowed to decide who they let in and don't let in?
00:40:08.240 | What makes it different that it's Harvard?
00:40:09.640 | Is it because it's education versus any other private?
00:40:12.120 | No, because it takes federal funding.
00:40:14.160 | It takes tax money.
00:40:15.160 | Well, you and I should not pay for some person to be able to get into a school they don't
00:40:20.760 | deserve to get into just because their parent went there, or just because their parent wrote
00:40:25.760 | a check.
00:40:26.760 | That's unreasonable.
00:40:27.960 | And these schools take so much federal funding that
00:40:31.240 | they're quasi-public institutions, even the private ones.
00:40:34.680 | So that's the distinction.
00:40:35.720 | That's the distinction for you, just to be clear.
00:40:37.280 | And also, there is a strong meritocracy opportunity argument on this.
00:40:43.920 | And I think it's why that whole parents college admission scandal was such a big deal is that
00:40:49.680 | for a lot of people in this country, the ability to have your kids advance themselves by being
00:40:55.760 | the first to get into college or go into college or going to a better college, that is a big
00:40:59.920 | part of creating opportunity in this country.
00:41:02.240 | So for people to try and defraud that, I think created a huge backlash.
00:41:07.100 | So look, I think that the legacy thing just needs to end one way or another.
00:41:11.000 | I don't know exactly what the right legal implementation is.
00:41:16.160 | I have two questions for the panel.
00:41:18.120 | Number one, should you be able to say by geography, hey, listen, we're Harvard, or we're Stanford,
00:41:23.200 | we want to have a representation of people from around the world.
00:41:25.920 | So we're gonna, you know, have the top three students from each country, or
00:41:31.120 | by population, however you do it mathematically, come in.
00:41:35.720 | And so a little bit of geography, because I did hear from one of these coaches that
00:41:43.960 | cost like six figures to get your kids into college. They said the best thing you can
00:41:43.960 | do is like move to Kentucky.
00:41:46.680 | You know, and then Harvard and Stanford are looking to get a certain number of students
00:41:51.780 | from each state.
00:41:52.780 | I don't know how true that is.
00:41:53.780 | But they said, that's like one of the top ways to do it.
00:41:55.480 | And then, well, do you remember Jason, just to build on your point, I don't know if you
00:41:57.920 | guys remember, but a few years ago, the in-fashion thing to do was to learn to play
00:42:03.160 | squash.
00:42:04.160 | And I remember all these parents telling me that, and they had kids that were older than
00:42:08.360 | my kids.
00:42:09.360 | And they were hiring full-time squash coaches.
00:42:13.360 | Because apparently squash was like, yes, angles, angle shots.
00:42:16.720 | Yeah, I think like, stop with the angle shooting, guys.
00:42:20.320 | Yes, the gun should go off, you should run the race.
00:42:23.600 | And your time is your time.
00:42:25.280 | And you should go to whatever the best school is that you deserve to get into based on your
00:42:28.960 | academic ability.
00:42:29.960 | Now, going back, the big problem, and I think Jason, you really nailed it on the head.
00:42:36.320 | Trying to fix it with affirmative action at the university is still quite unfair in the
00:42:41.720 | sense that there are so many black and brown kids, I think with tons of potential that
00:42:45.560 | don't even get there.
00:42:47.560 | And so the real question is, what are you doing at the grade school and at the high
00:42:50.840 | school and at the preschool, so that you actually get more of these kids to the starting line?
00:42:56.400 | Because fixing it when they're 18, I think is a little too late.
00:43:00.760 | Yeah, right.
00:43:02.080 | Fixing it for three, four and five years old, that's when they deserve and need all the
00:43:07.080 | help in the world.
00:43:08.720 | That's why we need school choice.
00:43:10.000 | We need charter schools, we need to break the monopoly that the unions have over the
00:43:13.920 | schools, running them for their own benefit and not for the kids.
00:43:17.560 | If you define institutional racism as conditions that trap people in conditions of poverty
00:43:22.520 | across generations, I'd say the abysmal quality of our public schools is number one, and
00:43:29.240 | one, two, and three.
00:43:30.240 | And the reason is because there's no competition and the unions run it for their own benefit.
00:43:33.080 | How long did they shut down these schools for in California because they didn't want to
00:43:36.920 | work, because they were afraid of... don't say it, we just got a label.
00:43:40.520 | God, bleep it out.
00:43:42.000 | We're gonna be... that was not for the benefit of kids.
00:43:46.280 | And it wasn't even medically necessary.
00:43:47.840 | That was a benefit they saw for themselves.
00:43:49.520 | Yeah, J Cal, you come from a family
00:43:51.960 | Yep, that were members of unions, because they worked in fire, in police.
00:43:58.040 | Is that right?
00:43:59.480 | Yeah.
00:44:00.480 | We speak negatively about the effects of the teachers unions on our public education system.
00:44:05.520 | I think it's absolutely correct.
00:44:09.000 | But how do you share the point of view from the other side, if you're a teacher, and you're
00:44:13.640 | a member of the union, and the union takes care of you?
00:44:16.880 | What's the argument?
00:44:17.880 | Sure.
00:44:18.880 | To say, this union is damaging public education, and the teacher that's working in the union
00:44:23.200 | and a member of the union says this is necessary for my livelihood to protect me for my benefit.
00:44:29.520 | Help us share the point of view because we all have the strongly held point of view that
00:44:33.560 | the unions are destroying and eroding public education.
00:44:36.800 | What's the other side?
00:44:37.800 | Right, people have the right to form unions.
00:44:41.480 | But what we all do is we are forced to be consumers of one educational product because
00:44:46.320 | of how we pay taxes. I think in California, we each pay $16,000 into
00:44:52.280 | the educational system.
00:44:53.600 | And so if you're a parent, you should get that 16k back and be able to choose what you
00:44:57.840 | do with it.
00:44:58.840 | So there's competition.
00:44:59.840 | So the unions can have protection, but there still should be competition for these services.
00:45:05.440 | And I think there are two separate issues.
00:45:07.800 | Yeah, there's one other thing, which is, I just want to give a shout out to a nonprofit
00:45:14.320 | that I support called SMASH Academy, smash.org.
00:45:17.480 | It's done by Mitch and Freada Kapor.
00:45:20.320 | Mitch Kapor founded Lotus 1-2-3.
00:45:23.720 | If you're under the age of 40, you might not know him.
00:45:27.560 | And what they do is they realize that a lot of the students who do get into good colleges,
00:45:32.480 | it turns out a lot of the black and brown students, they get accepted and they're behind
00:45:35.920 | in math.
00:45:37.160 | And so what smash has done is they have a three year program.
00:45:40.640 | And I go speak at it sometimes and I donate money to it.
00:45:43.480 | And I encourage you to do the same.
00:45:44.560 | They have this intensive summer program.
00:45:46.760 | So before you go to college, Chamath, if you were one of those students, you know, you
00:45:51.680 | might get into college and then they drop out or even worse.
00:45:54.880 | They switch from a STEM degree to a liberal arts degree, a non-STEM degree, because they're two
00:46:00.320 | years behind on STEM, or a year behind on STEM.
00:46:03.200 | And so the Kapors have found this little opportunity to kind of catch people up.
00:46:07.760 | And I think that's what we have to do.
00:46:09.140 | We have to address this much earlier and not put a bandaid on it.
00:46:13.240 | Yeah, and the system, the Ivy League system, you know, needs to
00:46:17.720 | Well, it's not just Ivies, right?
00:46:19.720 | Well, sure, sure.
00:46:20.720 | Let's not just pick on Harvard; it's all the state schools as well.
00:46:22.720 | The elite institutions, which we talk about
00:46:24.720 | No, all institutions.
00:46:25.720 | Okay.
00:46:26.720 | All institutions that receive federal funding.
00:46:28.240 | They need to take a deep look in the mirror and say, are we doing the best thing for society?
00:46:31.720 | The second question I had for the panel was,
00:46:33.720 | Well, I'm not getting the union point of view, by the way, but yeah.
00:46:36.720 | But are pure academics the best way to accept people into a college?
00:46:40.760 | Or should there be some blend of it like putting sports aside, because that's an obvious one.
00:46:45.240 | But you know, there's academics, but then there's also creativity. You know, you
00:46:48.600 | might be terrible on your SAT standardized test, and you might be an incredible virtuoso
00:46:53.240 | pianist.
00:46:54.480 | So I think what is the criteria and making that criteria fair is what we all want.
00:46:59.440 | And it feels tremendously unfair.
00:47:01.440 | My point of view is, if the government is funding these schools, then the government
00:47:06.600 | certainly has to have a point of view on what's the reasonable model for admissions.
00:47:12.640 | The government's not funding the schools.
00:47:14.680 | I love a diversity in a marketplace.
00:47:18.080 | I love having different schools having different admissions criteria that allow different people
00:47:23.680 | to find their path through different institutions.
00:47:27.440 | To your point, Juilliard does not care, perhaps as much what you did on, you know, your SAT
00:47:34.920 | in chemistry.
00:47:37.440 | And art schools do not care as much how well you did in math.
00:47:42.120 | And STEM schools don't care whether or not you won an art competition.
00:47:46.080 | And I think that that's the important thing that we need to preserve.
00:47:49.760 | We need to preserve optionality for institutions to define what sorts of individuals they want
00:47:55.360 | to try and recruit, and progress and train and get ready for the workforce and the path
00:48:00.720 | in life that they then choose, versus trying to create a cookie cutter model for what the
00:48:05.400 | government says it's fair for everyone.
00:48:07.920 | And as much as we can take government funding out of these institutions and out of these
00:48:11.160 | systems and give them the freedom to set their own admissions criteria and create differential
00:48:16.720 | educational systems, I think that's going to create the best diversity of a workforce.
00:48:22.720 | And I would kind of be more excited about that sort of an institutional system than
00:48:28.760 | one that is standardized by the government.
00:48:30.520 | You know why that'll never happen?
00:48:33.720 | Because the profit motive of these universities is really to be shadow organizations for their
00:48:38.800 | endowments.
00:48:40.960 | And the thing with endowments is that the people that work there very much want to get
00:48:46.840 | paid and behave like profit generating organizations.
00:48:50.520 | And I think the issue is that if their sole job was to really fund the operational expenses
00:49:01.600 | of the university, then the endowments would be run very differently, right?
00:49:05.400 | Like, take Harvard. Again, I just looked this up on the internet, but Harvard's operating expenses
00:49:11.120 | are roughly five and a half billion dollars a year, and the revenues are about five and
00:49:15.880 | a half billion dollars a year, right?
00:49:17.320 | So if instead you had to basically fund, you know, there was essentially no revenue per
00:49:22.480 | se, right?
00:49:23.480 | If there was very little tuition and you didn't take any federal funding, you'd
00:49:27.080 | have to come up with $5 billion a year.
00:49:28.800 | So you just basically take that as a draw from your endowment.
00:49:34.120 | The endowment would be run very differently.
00:49:35.840 | It would be a don't lose money endowment that would generate very low vol returns.
00:49:42.840 | I think the problem with that is that that's not how the endowment at Harvard works.
00:49:47.480 | They wouldn't necessarily make risk seeking investments in things like private equity
00:49:51.960 | and hedge funds and venture capital.
00:49:53.240 | And to the extent they did, they would make far fewer, far smaller ones,
00:49:56.560 | or both.
00:49:58.200 | So I think what you're saying could be possible, but the problem is probably the endowments
00:50:03.560 | at these universities.
00:50:04.560 | Okay, well,
00:50:05.560 | $53 billion, by the way, is Harvard's endowment now.
00:50:08.200 | So yeah, it would be 10%.
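The back-of-the-envelope draw math quoted in this exchange can be sketched as follows; the dollar figures are just the rough numbers cited in the conversation, not audited data:

```python
# Sketch of the endowment-draw arithmetic discussed in the conversation.
# Figures are the approximate numbers quoted on the show, not audited data.
operating_expenses = 5.5e9  # roughly $5.5B/year in operating expenses cited for Harvard
endowment = 53e9            # roughly $53B endowment size cited

# If the endowment alone had to cover expenses (no tuition, no federal money),
# this is the required annual draw rate.
annual_draw_rate = operating_expenses / endowment
print(f"Required annual draw: {annual_draw_rate:.1%}")  # prints "Required annual draw: 10.4%"
```

A sustained draw of roughly 10% is well above the 4 to 5% payout that university endowments typically target, which is the point being made here about how differently such an endowment would have to be run.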
00:50:10.160 | Can anyone tell me who the largest real estate owner is in San Francisco?
00:50:15.080 | Let's see, we're looking it up on the internet.
00:50:16.080 | Oh, no, I know.
00:50:17.080 | I know.
00:50:18.080 | It's that art institute, the Academy of Art University.
00:50:20.520 | I knew this because they kept buying things and using them.
00:50:24.200 | I think she used all the profit over the years to buy more real estate and she accumulated
00:50:29.760 | the largest real estate portfolio.
00:50:31.160 | Yeah.
00:50:32.160 | In San Francisco.
00:50:33.160 | Sorry, who is she?
00:50:34.160 | The founder, I forgot her name.
00:50:36.000 | Is this a for-profit university or private?
00:50:38.280 | Or, I mean, a nonprofit university?
00:50:40.040 | Yeah, there's a lot of artists that come out of Academy Art and they work in a lot of different
00:50:43.400 | industries including industrial design, including animation.
00:50:46.920 | So that's what I'm saying.
00:50:48.960 | Is it like RISD or is it like actually an art school?
00:50:51.640 | Oh, no, it's a great art school.
00:50:53.120 | Yeah.
00:50:54.120 | Academy of Art.
00:50:55.120 | Anyway, let's keep going.
00:50:56.240 | So look, speaking of STEM, making a big pivot away from art to AI, a couple big news items,
00:51:05.760 | you know, the AI frenzy continues here in Silicon Valley, all the way from early to
00:51:11.840 | growth stage funding, through to M&A events.
00:51:15.360 | We saw this week, Databricks, which is a privately held data infrastructure company announced
00:51:21.140 | that they were acquiring Mosaic ML for $1.3 billion.
00:51:25.020 | That headline number is based on a cash and stock purchase price, where the value of the
00:51:31.160 | stock that was being used to acquire Mosaic ML was based off of the last round valuation
00:51:37.360 | for Databricks, which was $38 billion from a fundraising that they did in 2021.
00:51:42.600 | So arguably, the valuation should be lower and the overall purchase price could be considered
00:51:46.200 | lower.
00:51:47.200 | But that's besides the point.
00:51:48.440 | Mosaic ML, as you guys may remember, is a company I mentioned a number of episodes ago,
00:51:54.280 | led by the founder of Nervana, which was an early AI business that was acquired by Intel.
00:52:01.080 | And then he started Mosaic ML and he offers open source models.
00:52:05.840 | And I shared the performance data of their most recent announcement on the show a few
00:52:10.520 | weeks ago.
00:52:11.520 | You know, there's rumors, I don't have any confirmed reports, but there's rumors that
00:52:16.160 | Mosaic ML saw their ARR grow from $1 million to $20 million since January.
00:52:23.100 | There were other rumors that said they were only at $6 million of revenue.
00:52:26.240 | Regardless, Databricks is paying a pretty hefty premium.
00:52:29.160 | And I think it begs the question, what do data infrastructure database companies end
00:52:34.200 | up looking like in the future, if AI has to become part of the core infrastructure of
00:52:38.360 | every enterprise?
00:52:39.760 | And this is creating a big shift.
00:52:40.960 | So Sacks, as our enterprise software investing expert, maybe you can share with us what this
00:52:49.320 | means for the sector.
00:52:50.560 | Does this buoy excitement for AI infrastructure startups?
00:52:55.360 | Does this change the investing landscape?
00:52:58.760 | Is it just reinforcing what folks are already doing?
00:53:02.040 | I think it's a reinforcement.
00:53:03.040 | I mean, this space is probably the hottest space.
00:53:05.560 | We're talking about like AI infrastructure for enterprises.
00:53:08.520 | I think it's probably the hottest space right now in venture land.
00:53:12.060 | We actually looked at this deal.
00:53:13.520 | We had a small allocation in our next round.
00:53:15.720 | They had a term sheet for a Series B. Emergence was actually going to lead it.
00:53:21.840 | This is Mosaic ML?
00:53:22.840 | Yeah.
00:53:23.840 | Yeah.
00:53:24.840 | I don't know if I'm supposed to be telling you all this, but...
00:53:26.840 | This is great.
00:53:28.840 | This is why people tune in.
00:53:31.480 | Breaking news.
00:53:32.480 | They had a term sheet from Emergence to raise $50 million at a $400 million post.
00:53:39.800 | Yeah.
00:53:40.800 | And we were going to have a small allocation in that.
00:53:42.680 | And as I recall, the valuation was somewhere around 30 to 35 times ARR, which actually
00:53:50.280 | is not that insane for a very fast growing company in a hot space.
00:53:53.880 | So that implies about 10 million of ARR.
00:53:56.240 | I don't remember the exact figure, but I think that's sort of the ballpark.
00:53:59.560 | But growing very, very quickly.
00:54:01.280 | I mean, up from like one or two at the beginning of the year.
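Working backwards from Sacks's recalled numbers (a roughly $400 million post-money and a 30-35x ARR multiple, both figures as remembered in the conversation, not confirmed), here is the implied-ARR arithmetic:

```python
# Implied ARR from a valuation and a revenue multiple: ARR = valuation / multiple.
post_money = 400e6  # ~$400M post-money from the Emergence term sheet (as recalled)

for multiple in (30, 35):
    implied_arr = post_money / multiple
    print(f"{multiple}x multiple -> ~${implied_arr / 1e6:.1f}M ARR")
```

That lands in the $11-13M range, consistent with Sacks's "about 10 million of ARR" ballpark.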
00:54:06.040 | So I actually understand why someone would want to acquire or invest in this company.
00:54:12.080 | Like I said, I think we wanted to invest.
00:54:14.160 | And while we were sort of trying our best to get...
00:54:16.800 | You didn't say anything when I talked about him on the show a few weeks ago.
00:54:19.280 | You were just sitting there, "mum's the word"?
00:54:21.320 | Actually, I knew that this deal was basically in the works because the founder called us
00:54:25.840 | up and he had already promised us a small allocation in the round.
00:54:29.720 | Naveen did.
00:54:30.800 | And he called up and said, "Actually, I've got this deal.
00:54:33.080 | So we're putting the round on hold."
00:54:35.640 | And so I didn't think I should say anything because obviously it was still ongoing.
00:54:39.920 | But yeah, we knew about this deal that was kind of coming down the pipe.
00:54:43.600 | I didn't know for sure that it would happen.
00:54:45.560 | But yeah, we heard it was in the ballpark of this like $1.2, $1.3 billion number, which
00:54:51.200 | like you said, because Databricks is a private stock, maybe it's only half of that or 750
00:54:56.880 | or something like that, who knows.
00:54:58.120 | It's still a great deal.
00:54:59.120 | But a lot of people are saying it's a crazy deal.
00:55:00.840 | I don't think it's a crazy deal because before this happened, after Naveen signed the term
00:55:05.880 | sheet for the Series B, an investor came over the top to invest at a $700 million valuation.
00:55:13.680 | So people were kind of going crazy.
00:55:15.320 | Now I don't think that's necessarily a rational behavior.
00:55:18.400 | I think that's more of evidence of a mania going on.
00:55:21.600 | But I think that what he got offered is obviously a fantastic deal.
00:55:24.840 | And I think what it's evidence of is that these big enterprise infra companies are going
00:55:30.240 | to try and build an end-to-end tool chain here.
00:55:33.920 | And I think Mosaic ML had a very, very important part of the tool set, which is training up
00:55:40.120 | these models, basically maximizing GPU efficiency, because GPUs are basically the scarce item
00:55:46.480 | right now.
00:55:47.480 | There's a GPU shortage.
00:55:48.480 | And it's probably not going to get better for a year or two, if that.
00:55:52.160 | So this is a very important part of the stack.
00:55:55.160 | And I think it's probably a smart acquisition for both Databricks and Mosaic.
00:55:59.440 | Yeah, it seems to me, I mean, remember Snowflake, which competes with Databricks, also acquired
00:56:03.480 | Neeva, which was founded by one of my colleagues, a guy I work with, and a really great guy,
00:56:08.880 | Sridhar.
00:56:09.880 | That deal was announced last month for $150 million.
00:56:13.360 | And the pattern recognition that seems to emerge here is that if you're in the data
00:56:18.360 | infrastructure business, it seems like it's becoming critical to level up, that it's not
00:56:23.080 | just about storing, moving, and manipulating data.
00:56:27.220 | But the interpretation of data through models, and the tooling to build those models becomes
00:56:32.760 | a critical component of all of these toolkits that these software companies have to provide
00:56:38.000 | to their enterprise customers.
00:56:39.320 | And it's a big leveling up that's necessary.
00:56:41.660 | Which seems to me there's other companies out there like them, that are also going to
00:56:45.520 | need to strap on tools like this, to make themselves competitive in this market scape,
00:56:51.640 | which means that there are more acquisitions still to come.
00:56:54.120 | Yeah, Chamath, J Cal, you guys agree or have a different point of view?
00:56:58.760 | Looking at it, it's a big number, the headline number, but I agree with Sacks that
00:57:02.800 | the actual number is half the headline number.
00:57:04.360 | So if you look at the number of engineers they had, based on LinkedIn
00:57:08.660 | data and PitchBook data, probably 60, 70, 80 employees, and then 40 or 50 of them are engineers.
00:57:15.120 | So that puts it at $30 million per engineer.
00:57:18.120 | And that's one way to look at these acquisitions.
00:57:21.620 | And I think, you know, probably three to $10 million per engineer for like really high
00:57:27.880 | end engineers is more of the going price.
00:57:30.360 | But if this is half that amount, because they bought it with monopoly money, in other words,
00:57:35.160 | their 2021 price for their company, it's great.
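Jason's per-engineer math, written out under his stated assumptions (roughly 45 engineers, and a headline price that may be worth half in real terms because it was paid in 2021-priced stock; both are his estimates, not disclosed figures):

```python
# Acquisition price per engineer, a common yardstick for talent-driven AI deals.
headline_price = 1.3e9    # announced purchase price
engineers = 45            # midpoint of Jason's "40 or 50 engineers" estimate

per_engineer_headline = headline_price / engineers          # ~$29M each
per_engineer_discounted = headline_price / 2 / engineers    # ~$14M if the stock is worth half
print(f"Headline: ${per_engineer_headline / 1e6:.0f}M per engineer")
print(f"At half (stock discount): ${per_engineer_discounted / 1e6:.0f}M per engineer")
```

The discounted figure is still above Jason's quoted going rate of $3-10M per high-end engineer, which is why he calls the premium defensible only because it was paid in "monopoly money."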
00:57:37.800 | I think you nailed it.
00:57:39.240 | Freeberg, what's happening is this layer of natural language on top of any service, whether
00:57:48.000 | it's something as simple as Yelp, or something as complicated as a giant financial company
00:57:53.660 | with tons of transaction data, being able to talk to it and understand it and then have
00:57:59.780 | your machine learning team build tools.
00:58:03.740 | So business owners don't have to hire data scientists, the actual business leaders can
00:58:07.780 | talk to the data and get back answers or just say, Hey, tell me about our customers, how
00:58:13.000 | have they changed over the year?
00:58:14.620 | And then, hey, that's pretty interesting.
00:58:16.460 | Tell me more about our customers.
00:58:18.780 | And you know, how are they reacting to these three new products?
00:58:23.740 | And you will get back intelligence that previously was inaccessible.
00:58:31.800 | And so I was just at Sequoia two days ago with the latest seven
00:58:38.080 | graduates from our accelerator. We bring them to meet with Sacks and his team, and we bring
00:58:41.760 | them to meet with Sequoia.
00:58:43.540 | And when we were at Sequoia, I realized that of the seven companies, four of them would
00:58:49.040 | not have been possible before these machine learning APIs were available, and OpenAI
00:58:54.640 | is but one.
00:58:55.800 | The companies I'm talking to, Freeberg, are trialing on average
00:59:00.680 | six, seven, eight language models before they pick one, and they're not picking OpenAI
00:59:04.600 | every time.
00:59:05.600 | Putting that aside, these businesses were not possible before this technology was introduced
00:59:10.920 | and available via API in the last six to 12 months.
00:59:14.600 | And I think there's a bunch of businesses that economically would not work.
00:59:18.580 | That now work.
00:59:19.580 | I can give one example.
00:59:22.920 | There are countless meetings that are recorded over Zoom, right?
00:59:28.240 | Think like a local school board.
00:59:30.000 | Well, nobody could ever make a database of all the discussions going on at local school
00:59:34.920 | boards, and then analyze them.
00:59:37.640 | But now, because all of those are saved on Zoom, and they occur on Zoom, and they're
00:59:42.140 | available for public record, you can ingest every single one of those, and then build
00:59:46.440 | a Bloomberg terminal of every discussion happening at every school board everywhere in the United
00:59:51.480 | States.
00:59:52.760 | And do that with ChatGPT or any of the language models and then get really great
00:59:58.200 | insights from it.
00:59:59.400 | That would be too costly to transcribe at, you know, $100 every hour or two and normalize,
01:00:05.080 | let alone to analyze.
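To make Jason's cost point concrete, here is a hedged sketch of the unit economics. The $100/hour number is his human-transcription figure from above; the automated rate and corpus size are placeholder assumptions for illustration, not quoted prices:

```python
# Rough cost model: transcribing a large corpus of recorded meetings, human vs automated.
hours_of_meetings = 100_000    # hypothetical corpus size (assumption)
human_rate_per_hour = 100      # Jason's "$100 every hour" figure
api_rate_per_hour = 0.50       # assumed automated speech-to-text rate (placeholder)

human_cost = hours_of_meetings * human_rate_per_hour
api_cost = hours_of_meetings * api_rate_per_hour
print(f"Human: ${human_cost:,.0f}  Automated: ${api_cost:,.0f}  ({human_cost / api_cost:.0f}x cheaper)")
```

Under these assumptions the human approach costs $10 million versus $50,000 automated, which is the order-of-magnitude shift that makes the "Bloomberg terminal of school boards" idea newly viable.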
01:00:07.280 | And so I'm looking at businesses as an investor, what I'm looking at right now is businesses
01:00:13.360 | that were previously not economically viable before this technology, and then that are
01:00:19.440 | now economically viable, if that makes sense.
01:00:22.120 | And I'm just looking at each company under that lens right now.
01:00:26.200 | And I'm finding a lot of interesting startups.
01:00:27.640 | They seem to have that in common.
01:00:29.080 | Similar news supporting this very quick evolution, further up the value stack.
01:00:35.280 | So we were just talking about these companies that are providing effectively tooling as
01:00:40.440 | infrastructure.
01:00:42.360 | A little bit more up the value stack is Inflection AI, which was started by Mustafa Suleyman
01:00:47.840 | and Reid Hoffman, who was a co-founder while Mustafa was working with him as a venture
01:00:55.640 | partner at Greylock.
01:00:57.160 | But Mustafa, as you guys know, was the co-founder of DeepMind, which Google bought for
01:01:01.680 | $400 million, really created the core of Google's AI capability, and is considered one of the
01:01:07.400 | preeminent thought leaders and builders in this space.
01:01:12.000 | He started inflection.
01:01:14.560 | And the business just announced today that they've just closed a $1.3 billion funding
01:01:18.280 | round led by Microsoft, Reid Hoffman, Bill Gates, Eric Schmidt, Nvidia, with an intention
01:01:25.780 | of building the largest AI cluster in the world, 22,000 Nvidia H100s as part of the
01:01:31.960 | build out.
01:01:32.960 | But I think you've shared, Chamath, historically that these big funding rounds for these AI
01:01:37.440 | businesses that haven't necessarily launched product yet don't make sense.
01:01:42.080 | Does this still, you know, kind of reinforce your point?
01:01:45.880 | And, you know, what's your read on the Inflection funding?
01:01:48.480 | Well, the list price of an H100 is about $30,000, but there's a street price, because it's very hard to
01:01:55.200 | get it.
01:01:56.200 | So if you go to, like, eBay and try to buy an H100, it's like $40,000 or $45,000.
01:02:01.480 | So if you have a cluster of 22,000 H100s, that's about $900 million of capex, just that.
01:02:10.800 | And then you know, all the sundry stuff around it, call it roughly plus or minus $1 billion.
01:02:17.680 | And so of the one and a half billion they've raised, let's say a billion goes into building
01:02:21.400 | this 22,000 node cluster, you have 500 million for SG&A.
01:02:27.200 | And so what that leaves behind is basically two and a half billion of enterprise value
01:02:31.600 | for their chatbot.
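Chamath's back-of-envelope, written out. The $41k per-GPU street price and the $4 billion implied post-money are assumptions chosen to reconcile his round numbers; none of these figures come from Inflection directly:

```python
# Reconstructing the capex math: 22,000 H100s at street prices.
gpus = 22_000
street_price = 41_000             # assumed, within the quoted $40k-$45k eBay range

gpu_capex = gpus * street_price   # ~$0.9B, matching "$900 million of capex, just that"
total_capex = 1.0e9               # rounded up for the "sundry stuff" (networking, power, etc.)
total_raised = 1.5e9              # ~$1.5B raised across rounds, per the discussion
sga_budget = total_raised - total_capex    # ~$500M left for SG&A

implied_post_money = 4.0e9        # assumption: reconciles the "$2.5B for the chatbot" line
chatbot_ev = implied_post_money - total_capex - sga_budget
print(f"GPU capex ~${gpu_capex / 1e9:.1f}B, SG&A ~${sga_budget / 1e9:.1f}B, "
      f"implied chatbot EV ~${chatbot_ev / 1e9:.1f}B")
```

The point of the exercise is the ratio: on these numbers, roughly two-thirds of the capital raised is hardware capex, which is what Chamath flags as atypical for a startup.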
01:02:34.180 | So I don't know.
01:02:35.180 | I mean, I've never used Pi. Have any of you guys used it?
01:02:37.880 | Do you guys know if it's good?
01:02:39.200 | Jason, I'm sure you've experimented with it.
01:02:41.200 | Have you experimented with it or not really?
01:02:43.000 | Which one?
01:02:45.000 | That's what their chatbot is called: Pi.
01:02:48.440 | Yeah, I think it's like a pi.com or something.
01:02:51.320 | I think I did try it once.
01:02:54.320 | And it was not memorable.
01:02:57.000 | Oh, you mean Poe?
01:02:58.640 | No, not Poe.
01:02:59.640 | No, no, no, that's Quora's.
01:03:00.640 | That's Quora's.
01:03:01.640 | Pi is like your personal one where you talk to it and ask, Hey, how are you doing?
01:03:05.680 | Their concept is like you have this one relationship.
01:03:08.440 | So it's like one chat thread.
01:03:10.320 | That's not really how I like to work with AI. I use threads,
01:03:14.260 | and I share threads with my team.
01:03:15.520 | So I'm not a fan of this, like you have one relationship with one assistant.
01:03:18.600 | I think the thing is, it's interesting to note that very rarely when you invest money
01:03:24.040 | in the billions of dollars, does the capex or purchase of one specific form of equipment
01:03:31.000 | take up literally 25% of the enterprise value. That's atypical, at least for a startup.
01:03:40.680 | If you're buying a fixed plant of a slow cash flow generating business, then maybe, you
01:03:44.920 | know, a bunch of that has some value.
01:03:46.960 | So that's what stood out to me and all of these things.
01:03:49.600 | Freeberg, again, increasingly, this is all just a pass-through to Nvidia.
01:03:55.680 | It's probably in some ways a pass through to the big cloud providers.
01:03:58.800 | So whenever I see a chip maker and a cloud provider come together to put in a lot of
01:04:02.760 | money, it's essentially round tripping cash.
01:04:05.340 | They're giving them money, which then they use to buy their services.
01:04:07.880 | And then, you know, you're just pumping revenue.
01:04:10.380 | So I hope it works.
01:04:11.380 | I wish him the best.
01:04:13.280 | But you know, that's the...
01:04:14.600 | It goes back to what we just talked about, where the infrastructure companies are increasingly
01:04:21.320 | looking like more commodity service providers if they don't level up with AI tooling,
01:04:27.960 | hence acquiring Mosaic ML and acquiring Neeva.
01:04:32.640 | Do you think that that's an indication of more M&A to come?
01:04:36.900 | And if so, doesn't that justify the increased funding, the increased valuation and the activity
01:04:43.080 | that we're seeing in the early stage with some of these businesses?
01:04:45.360 | I think for sure there's going to be more M&A.
01:04:47.620 | And I think the valuations will be high, not because these companies have a lot of revenue
01:04:50.840 | yet, but because it's very strategic for these big infra companies to assemble the end to
01:04:55.800 | end tool chain.
01:04:57.600 | I think we should explain to folks what that means.
01:05:00.720 | Enterprise software companies provide software to businesses that are not traditionally technology
01:05:05.560 | companies.
01:05:06.560 | They also provide software to other businesses to help them build new tools to help them
01:05:10.240 | build out their business.
01:05:11.760 | So an enterprise software company can sell to a United Airlines or a Visa or
01:05:15.440 | a Ford.
01:05:16.560 | And that software can then be used by that company to build tools that are powered by
01:05:22.040 | the database or powered by the data analytics or increasingly powered by AI tools.
01:05:27.760 | And so they can build AI applications and AI capabilities into their business, whether
01:05:32.160 | it's United Airlines or Expedia or Visa or whomever.
01:05:35.740 | And that's why these companies are so critical in terms of enabling the transformation
01:05:41.320 | of industries with AI tooling and why, you know, getting AI tooling into their capability
01:05:46.440 | set is so critical right now.
01:05:47.960 | It's important to note, though, guys, that whenever you have the emergence of a new sector,
01:05:52.000 | Sacks, I think you are right that M&A goes up, but it tends to be that the valuations
01:05:57.160 | go down.
01:05:58.160 | Peak M&A froth happens at the beginning of the cycle when hype is at maximum and facts
01:06:04.880 | are at the minimum.
01:06:07.160 | And that's okay.
01:06:08.160 | You know, that's good for the startup.
01:06:09.760 | It's marginally negative for the existing shareholder of a large company.
01:06:14.080 | And then over time, it gets itself sorted out when the facts are more obvious.
01:06:18.760 | So I just think it's kind of like, do you guys remember when the optical networking
01:06:21.920 | craze hit?
01:06:22.920 | Oh my God, you had these multi-billion dollar acquisitions.
01:06:26.280 | And where did they go?
01:06:27.280 | They went, they went nowhere.
01:06:28.280 | They just disappeared.
01:06:30.080 | We actually have like a market map that we did that I think can explain this concept
01:06:33.400 | of an end to end tool chain.
01:06:35.560 | So this is a slide that our growth team, shout out to Mike Robinson and Kevin Cabrera, they
01:06:41.600 | put this together in preparation and part of the investment memo for Mosaic.
01:06:47.340 | And I think to explain the point you were kind of making Freeberg, like why do enterprises
01:06:52.640 | need the services?
01:06:55.000 | One really simple way of thinking about it right now is that every enterprise would like
01:06:58.640 | to roll out its own ChatGPT.
01:07:01.000 | They would all like to have their own internal version of ChatGPT where the employees, for
01:07:05.160 | example, could ask questions and get answers.
01:07:08.480 | That's where all the action is right now.
01:07:10.160 | Every enterprise would buy that tomorrow if it existed in the way that they want.
01:07:15.160 | Give an example of what that is, Sacks.
01:07:16.840 | Give an example.
01:07:17.840 | The idea would be that, you know, any employee in the company could ask questions to the
01:07:21.120 | AI model, the way you can ask ChatGPT questions, and it would have all the enterprise's data
01:07:25.360 | and it would also understand their permissions and have all the security settings so that
01:07:29.880 | only the right people could get the right information.
01:07:32.080 | That's the kind of intelligence they want to unlock.
01:07:34.040 | I mean, there are lots of other use cases as well, but that's a really simple one.
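A minimal sketch of the permission-aware retrieval idea Sacks is describing. Everything here is illustrative: the document store, the role model, and the `retrieve` function are invented for the example; a real system would sit in front of an actual LLM and vector store rather than keyword matching:

```python
# Toy permission-aware retrieval: only documents the asking employee is cleared
# for ever reach the model, which is the security property being described.
from dataclasses import dataclass

@dataclass
class Doc:
    text: str
    allowed_roles: set

CORPUS = [
    Doc("Q3 revenue grew 40% year over year.", {"finance", "exec"}),
    Doc("Compensation bands for engineers are L3-L7.", {"hr", "exec"}),
    Doc("The cafeteria menu changes on Mondays.", {"all"}),
]

def retrieve(query: str, role: str) -> list[str]:
    """Return only documents this role may see that mention a query term."""
    visible = [d for d in CORPUS if role in d.allowed_roles or "all" in d.allowed_roles]
    terms = query.lower().split()
    return [d.text for d in visible if any(t in d.text.lower() for t in terms)]

# An HR employee can see comp data; a sales employee cannot.
print(retrieve("compensation bands", "hr"))     # comp doc is returned
print(retrieve("compensation bands", "sales"))  # nothing leaks
```

The design point: permissions are enforced at retrieval time, before anything is handed to the model, so the model never sees data the asker isn't cleared for.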
01:07:37.440 | A corporate oracle.
01:07:38.720 | So if I'm in HR, I can ask, "Hey, tell me about our compensation."
01:07:42.560 | A copilot.
01:07:43.560 | Yeah.
01:07:44.560 | Totally.
01:07:45.560 | A copilot for the CEO.
01:07:46.560 | I think there's gonna be lots of these.
01:07:47.560 | I think the sales team is gonna have their own copilot.
01:07:48.880 | I think the marketing team is gonna have their own copilot.
01:07:51.760 | Customer support has it already.
01:07:53.560 | Customer support will have it.
01:07:54.560 | Engineers will have their own copilot.
01:07:55.560 | So there's gonna be a lot of these.
01:07:57.880 | But I think enterprises want one at the level of the, call it the company intranet, where
01:08:02.640 | employees could just ask it questions.
01:08:05.280 | But they do not want to share their data with OpenAI.
01:08:08.000 | That's very clear.
01:08:09.000 | They want to roll out their own models.
01:08:11.280 | So the question is, well, how do you roll out your own model?
01:08:13.040 | And what this shows here is the different pieces of the stack that you have to have.
01:08:16.000 | So first, you capture all of your data.
01:08:17.960 | You got to label it to be classified right for the model.
01:08:21.200 | You got to store it somewhere.
01:08:23.200 | Then you need to get one of these open source AI models off the shelf.
01:08:26.720 | And the most prominent site for this is called Hugging Face, which already
01:08:31.120 | has something like, I think, a $2 billion valuation.
01:08:34.360 | That's another really crazy valuation to ARR multiple.
01:08:39.040 | But Hugging Face has all the open source models.
01:08:41.760 | It's very active.
01:08:43.360 | And so people grab the latest open source model that's the best fit for them.
01:08:47.620 | And then they need to train that model.
01:08:49.360 | And that's really where Mosaic played.
01:08:51.040 | And there's a ton of activity right now in this last mile problem of how do you customize
01:08:55.680 | a model to make it suitable for your use, whether you're an enterprise, whether you're
01:09:00.360 | a customer service team, whether you're a SaaS app that wants to incorporate AI capabilities
01:09:06.120 | into your app.
01:09:07.760 | That's where all the action is right now is customizing these open source models.
01:09:11.900 | That then leads to basically being able to get the right inferences.
01:09:16.080 | And there's sort of a separate category around hardware that is, you know, we don't play
01:09:20.000 | there.
01:09:21.000 | But this is kind of the end to end tool chain.
01:09:22.240 | I think these big tech companies are gonna be racing to put this whole thing together.
01:09:26.120 | To fill it out.
01:09:27.120 | Yeah.
01:09:28.120 | Yeah.
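The end-to-end tool chain Sacks walks through (capture, label, store, grab a base model, train, infer) can be sketched as a sequence of stages. This is purely illustrative plumbing: the function names and toy data are invented, and a real stack would use actual labeling, storage, and training systems (the Hugging Face and Mosaic-style layers discussed above):

```python
# Illustrative pipeline mirroring the stack described in the market map:
# capture -> label -> store -> pick a base model -> fine-tune -> inference.
def capture() -> list[str]:
    # Stand-in for data capture from enterprise systems.
    return ["support ticket: login fails", "support ticket: billing question"]

def label(records: list[str]) -> list[dict]:
    # In practice a labeling tool classifies data for training; here, a toy rule.
    return [{"text": r, "topic": "auth" if "login" in r else "billing"} for r in records]

def store(labeled: list[dict]) -> dict:
    # Stand-in for a data warehouse / lakehouse layer.
    return {"dataset": labeled}

def fine_tune(base_model: str, dataset: dict) -> str:
    # Stand-in for the Mosaic-style training layer (GPU-efficient fine-tuning).
    return f"{base_model}-tuned-on-{len(dataset['dataset'])}-records"

def infer(model: str, prompt: str) -> str:
    # Stand-in for serving inferences from the customized model.
    return f"[{model}] answer to: {prompt}"

dataset = store(label(capture()))
model = fine_tune("open-source-base", dataset)   # base model pulled off the shelf
print(infer(model, "Why do logins fail?"))
```

Each function here corresponds to a layer a vendor like Databricks is trying to own, which is why assembling the whole chain through M&A is strategically valuable even at high multiples.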
01:09:29.120 | So I'm gonna go back to the discussion, which means more startup valuations booming,
01:09:31.800 | more capital deploying.
01:09:33.080 | There were a couple of articles this week, and Chamath, this is your red meat, as much
01:09:39.480 | as Ukraine is Sacks's, because you've talked about this at length: social messaging startup
01:09:45.360 | IRL is shutting down after a board investigation found 95% of its claimed 20 million users
01:09:51.100 | were actually fake.
01:09:52.100 | This is a company that in June of 2021, raised $170 million Series C at a valuation of over
01:09:57.920 | a billion dollars, making it one of the many proclaimed unicorns of Silicon Valley led
01:10:02.060 | by SoftBank's vision fund.
01:10:03.660 | The investor who was sitting on the board said that they didn't know if anyone had ever given
01:10:08.060 | an investment term sheet to a startup faster than SoftBank gave one to IRL at the time.
01:10:13.480 | At the same time, different story.
01:10:15.400 | But in the same week, a company called Byju's, which Chamath you have talked about in the
01:10:19.560 | past, was once valued at $22 billion and claimed to be India's most valuable startup is in
01:10:25.040 | turmoil as shareholders and creditors are seeking to dilute the founder, and he's rushing
01:10:29.800 | to find capital and raise a billion dollars to try and buoy the company. In the process, one of
01:10:35.200 | the investors marked the company's valuation down to $5.1 billion, down about 75%.
01:10:40.960 | I guess the question is, overall, are we still seeing this kind of turmoil in Silicon Valley
01:10:47.440 | from the ZIRP-era funding of startups, in stark juxtaposition to the excitement and the frenzy
01:10:53.020 | around AI?
01:10:54.020 | It's a characteristic of the exact same thing.
01:10:56.520 | Meaning, if you replaced AI with crypto, it's the exact same thing.
01:11:02.320 | If you replaced AI with co-working, if you replaced AI with, I don't know, synthetic
01:11:12.080 | biology, if you replaced AI with SaaS, this has all happened before.
01:11:17.840 | I think it's important to identify what this is.
01:11:21.420 | What this is, is that there aren't enough checks and balances and there are fundamentally
01:11:26.360 | people who are deeply inexperienced, who are in the wrong job.
01:11:32.840 | In the few key moments where the venture capitalist is supposed to add value, that person is ill-equipped
01:11:40.360 | and unprepared.
01:11:42.360 | Because they were the VP of X, Y, and Z at some startup and they got hired through no
01:11:46.560 | fault of their own into a dynamic because these venture funds wanted to raise larger
01:11:52.040 | and larger amounts of money.
01:11:55.080 | What happens?
01:11:56.080 | You don't even know how to ask the basic questions or even more insidiously, you don't have the
01:12:01.360 | courage to say the hard thing.
01:12:06.740 | These things happen that are frankly inexcusable.
01:12:11.880 | In the case of one of these companies, and I've mentioned this before, they approached
01:12:16.200 | us for financing and when we asked for a data room, we got a Google doc link to a spreadsheet.
01:12:25.160 | There's no reasonable world in which a company is that unsophisticated when it comes to understanding
01:12:30.120 | their business.
01:12:32.680 | A data room should include an enormous amount of operational and financial metrics that
01:12:37.360 | you can use to come to your own conclusion so that you can present it transparently to
01:12:42.000 | the investor.
01:12:43.880 | The idea that boards wouldn't even hold these companies accountable is just a sign that
01:12:48.840 | the board members themselves are pretty fundamentally inexperienced people.
01:12:53.560 | I think the thing that we do, which is a mistake, is we say, "Oh, well, X, Y, and Z firm led
01:12:59.080 | this deal."
01:13:00.080 | Yeah, that may be true, but really what it means is that firm in a grab to get the money
01:13:08.400 | hired some person that checks some boxes, put them in the job, that person led a round,
01:13:14.360 | and there just wasn't any infrastructure to either teach that person or for that person
01:13:19.760 | to have the courage to hold the founder responsible.
01:13:22.720 | That will play out in AI as well.
01:13:24.440 | It's just that we're at the beginning of the hype cycle because we replace AI, again, as
01:13:29.760 | I said, with any of these other things.
01:13:31.160 | We sat here a little bit hand-wringing when we saw these crazy valuations for these NFT
01:13:36.360 | projects.
01:13:37.360 | We're at the beginning now.
01:13:38.360 | You name it.
01:13:39.360 | Zero.
01:13:40.360 | This is about fundamentally inexperienced people doing a job that seems pretty easy
01:13:45.720 | from the outside, but in practical reality, there are only a few legends in our business.
01:13:51.560 | Most people, and I think it was Shai Goldman that did the math on this, most people do
01:13:55.400 | not know how to run these businesses well.
01:13:57.600 | Nick will find that tweet you can show.
01:14:00.840 | I think it's like 2.5% of all of the funds that are in PitchBook, so over 800 of them,
01:14:06.720 | have ever generated more than 3X in two funds.
01:14:10.760 | This is a hard business, it turns out.
01:14:12.560 | You can't just wake up and be an investor, it turns out, and that's what we're finding.
01:14:17.840 | I don't know.
01:14:18.840 | It's not very surprising in the end.
01:14:19.840 | None of this is surprising.
01:14:20.840 | Good to know.
01:14:21.840 | I think Chamath had a really good point in the middle there, which is that there is a generation
01:14:24.920 | of venture capitalists who were added during the boom who were operators, but they've never
01:14:29.020 | been taught to have the discipline of capital allocators.
01:14:32.800 | One of those key pieces of discipline is asking uncomfortable questions and doing uncomfortable
01:14:38.120 | diligence.
01:14:41.040 | You can trust people, but you need to verify.
01:14:43.720 | That is a key part of the job.
01:14:46.000 | You can trust the founders, but you have to verify that the data you have is correct.
01:14:50.620 | The fact that SoftBank did this at an incredible valuation and the person who did this deal
01:14:55.660 | never checked that the customers were real makes you unfit to serve in the job.
01:15:04.340 | I will do diligence.
01:15:06.140 | During that time period, Friedberg, I had many founders say to me, "You're asking for
01:15:10.720 | more diligence than the lead.
01:15:12.960 | This deal is closing and we are oversubscribed."
01:15:16.480 | I said, "Okay."
01:15:17.480 | They said, "Okay, so you're not going to require this?"
01:15:20.400 | I said, "Oh, no.
01:15:21.400 | We require this diligence.
01:15:22.400 | We want to see your very basic stuff.
01:15:24.880 | Your bank statements, your P&L.
01:15:26.640 | We want to talk to you.
01:15:27.640 | Why don't you give us a list of your first 1,000 customers.
01:15:29.880 | Give us a list of 500 customers from last month.
01:15:32.480 | We'll give you five numbers.
01:15:33.480 | We're going to talk to five random customers."
01:15:35.480 | People did not want to do this stuff.
01:15:36.680 | We do this work at our firm when we start to own 5%, 10% of this and we train our founders
01:15:41.160 | to be ready for proper diligence.
01:15:43.660 | All that diligence is happening now.
01:15:45.160 | Now, in the early stages, there's not much to go on, but you can check stuff.
01:15:49.760 | During this period, founders used the hot market to not participate in the due diligence
01:15:57.120 | process.
01:15:58.120 | When you look at companies, a lot of times people will suspend disbelief.
01:16:03.040 | This company, Byju's, I don't know a ton about it, but it seems to have an educational app,
01:16:08.920 | like a company, Brilliant.org, that Chamath and I are early investors in and Chamath incubated.
01:16:15.200 | Great, great business.
01:16:17.720 | But then their business and their revenue seem to be based on a series of Kumon-like
01:16:22.960 | in-person instruction centers.
01:16:23.960 | "Oh, that's not a high gross margin business.
01:16:26.080 | Sorry.
01:16:27.080 | If you have to have a storefront, you're not a software business anymore."
01:16:31.800 | People started giving valuations, and this is the second part and I'll just wrap on this.
01:16:35.020 | People started giving valuations to these companies that were real world businesses,
01:16:38.360 | that were low margin businesses, direct to consumer, whatever it was.
01:16:42.000 | They suspended disbelief and they gave them valuations for high growth, high gross margin
01:16:49.000 | businesses.
01:16:50.000 | That was another mistake.
01:16:51.000 | You put those two things together, not doing diligence and then just misvaluing actual
01:16:56.400 | assets, and that's the cleanup work that's going on right now.
01:17:00.560 | It takes years.
01:17:01.560 | It took decades for them to pinch Bernie Madoff.
01:17:04.600 | It can take 10 years for these frauds to come out.
01:17:07.880 | There was a guy who kept telling the SEC about Bernie Madoff.
01:17:10.360 | I think it was like nine years from the first time he let them know that the perfect returns
01:17:16.920 | were just not possible statistically.
01:17:20.140 | It takes time, but they're picking these folks up everywhere.
01:17:24.200 | Do Kwon got picked up in Montenegro, the guy from Luna.
01:17:28.460 | It's going to take a decade to clean up all of the fraud in our space.
01:17:32.320 | I think this was sort of mentioned by Chamath, but I think it needs to be a bigger point,
01:17:35.680 | which is the influence of fund size on these decisions.
01:17:39.880 | I mean, Craft funds are in the $600-700 million range.
01:17:44.280 | When we write a check, it's usually $10, $15, $20 million check in a Series A company.
01:17:51.200 | That's like a big check for us.
01:17:52.440 | We're really going to sweat that decision.
01:17:54.320 | For SoftBank, a $10-20 million check in a $100 billion fund, which is what they had,
01:17:59.640 | it doesn't even make sense.
01:18:01.280 | It's a waste of their time.
01:18:02.280 | It's not even a rounding error.
01:18:04.000 | For them to basically make a decision-
01:18:06.280 | It's like a $2,000 check for you.
01:18:08.280 | Yeah, exactly.
01:18:09.280 | For them, they had to write $200 million checks to justify their time managing $100 billion.
01:18:16.600 | The mistake, when they make a mistake, is 20 times bigger than it should be.
01:18:20.440 | That should have been maybe a $10 million mistake, not a $200 million mistake.
01:18:25.480 | Their fund size forced them to basically write these gigantic checks, and they were writing
01:18:29.440 | them into companies that were effectively seed stage or Series A companies.
01:18:33.800 | If it was into a growth stage company, I think that's fine.
01:18:37.080 | There's a lot more data, and there's a lot more customer references that you can check
01:18:40.560 | at a later stage company.
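Sacks's fund-size arithmetic can be sanity-checked with a quick toy calculation. The figures are the illustrative numbers from the conversation ($600-700M fund, $10-20M checks, $100B mega-fund, $200M checks), not actual fund data:

```python
# A $10-20M check is material in a ~$650M fund but a rounding error at $100B scale.
craft_fund = 650e6     # midpoint of the $600-700M range mentioned
mega_fund = 100e9      # the $100B fund discussed
check = 15e6           # a typical Series A check

print(f"share of a Craft-sized fund: {check / craft_fund:.2%}")  # ~2.3% -- worth sweating
print(f"share of a $100B fund: {check / mega_fund:.4%}")         # ~0.015% -- a rounding error

# To make one check "worth the time" at $100B scale, it has to be ~$200M,
# so a single mistake lands ~20x bigger than it needed to be.
big_check = 200e6
print(f"mistake multiplier: {big_check / 10e6:.0f}x")  # $200M vs a $10M mistake
```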
01:18:41.920 | By the way, the number one part of diligence, I'd say for us, other than looking at metrics,
01:18:46.360 | which anyone can do, is the off-sheet references.
01:18:50.080 | Talking to customers from a list that you figured out yourself, not from the company
01:18:55.160 | itself, is probably the single most important qualitative part of diligence.
01:19:00.920 | I don't know what happened here.
01:19:02.440 | It's not stated explicitly, but I think it's important.
01:19:05.120 | David, you have credibility.
01:19:07.040 | When you say something, Saks, people listen, because you have bona fides that are undeniable.
01:19:12.960 | Same thing with J. Cal, same thing with you, Freeberg.
01:19:15.880 | This may sound mean, but it's like most of these people are just XYZ mid-level VPs from
01:19:21.840 | a startup.
01:19:22.840 | That's a great thing, but it's not necessarily going to give you the gravitas, especially
01:19:28.600 | if that's not what you are forced to do to help build that company.
01:19:33.140 | When push comes to shove inside of a boardroom or in the middle of diligence, there has to
01:19:39.200 | be conflict.
01:19:40.200 | I think it's a necessary feature of good decisions.
01:19:43.120 | That conflict arises internally within your investment team, but it also has to come externally
01:19:48.040 | with the executives of the startup and with the CEO themselves, because when you're prosecuting
01:19:52.600 | a good decision, it's unbelievable if you agree on 100% of things.
01:19:56.080 | There has to be certain things that are controversial.
01:19:58.080 | Otherwise, by definition, that company isn't really even pushing the boundary.
01:20:03.060 | I just think that these are all skills that are poorly taught while you are building a
01:20:08.920 | business.
01:20:09.920 | It is not the reason why you should have been in charge of allocating $50 and $100 million
01:20:14.120 | checks into companies.
01:20:15.120 | That is just crazy town.
01:20:16.960 | I love your point, Sax, though, about fund size dynamics.
01:20:20.000 | Fund size dynamics are destiny.
01:20:21.920 | It really is.
01:20:24.000 | The optimal fund size for venture is somewhere between 250 and 600 million, according to
01:20:31.040 | everybody who's been doing this for more than a decade or two and who's successful.
01:20:35.800 | As the input costs come down, whether it's for engineers
01:20:39.640 | and co-pilots and hardware and abstraction layers, then theoretically, greater outcomes
01:20:45.400 | should be generated with fewer dollars in, which would, again, tell you that fund sizes
01:20:50.140 | should actually go down, not up.
01:20:51.760 | The reason they go up is because you get paid an annual management fee.
01:20:55.520 | Obviously, the way to make more money is to get 2% on a larger fixed number every year
01:21:01.640 | versus 2% on a smaller number.
01:21:04.880 | For example, what we did was we were like, "We're going to go and hit grand slams."
01:21:09.160 | I traded off management fee in return for 30% carry.
01:21:13.920 | That turned out to be literally a multi-billion dollar smart decision because I gave up tens
01:21:19.600 | of millions of dollars up front for back end.
01:21:22.320 | The back end could have gone to zero.
01:21:24.420 | Maybe it still can, so who knows?
01:21:27.500 | Most folks wouldn't do that.
01:21:29.120 | Most folks take the risk-adjusted bet and say, "You know what?
01:21:32.360 | I'll just take the 2% and I'll raise a $200 million, then a $500 million, then a billion,
01:21:36.680 | then a $2 billion fund."
01:21:37.680 | They overlap.
01:21:38.680 | Yeah, and they stack them all and they get the 2%.
01:21:41.360 | All of a sudden, the profits don't matter, which means the outcomes don't matter, which
01:21:44.240 | means the diligence is perfunctory and it becomes a theatrical exercise, this sort of
01:21:51.960 | thin fig leaf you can point to with LPs and say, "We did our work here.
01:21:57.100 | Give us more money for this next fund."
01:21:58.960 | That's the rat race that the venture community is in.
01:22:01.960 | It's going to get played out in companies like IRL and Byju's and a lot of these AI
01:22:06.280 | companies, quite honestly.
01:22:07.280 | Right, and the chickens have come home to roost for all the crypto investments.
01:22:09.680 | This time is not different, is I think what I'm trying to say.
01:22:12.480 | This time is not different.
01:22:13.600 | Did you guys see there was some article that reported that fundraising for late-stage funds
01:22:18.500 | is just cratered?
01:22:19.840 | Dead.
01:22:20.840 | Dead.
01:22:21.840 | So, I think it was trying to raise a $10 billion fund and they've only been able to
01:22:24.240 | raise two, according to this article.
01:22:26.240 | No, no, no.
01:22:27.240 | It was 22 down to 10, and of which they've raised two.
01:22:29.920 | Okay, so 10%.
01:22:31.400 | And then Tiger was trying to raise 12.
01:22:34.200 | They cut it to six, and then they can only raise two.
01:22:37.120 | Makes sense.
01:22:38.120 | Yeah.
01:22:39.120 | So, basically, that's like a, whatever, 80% to 90% reduction in the size of these funds.
01:22:44.680 | Yeah, meanwhile, the sovereigns where they were going for this money are buying sports
01:22:48.240 | teams.
01:22:49.240 | They're like, "You know what?
01:22:50.240 | Instead of tech, let's just buy sports teams."
01:22:51.520 | They're buying Serie A.
01:22:53.560 | And they're buying Manchester United, and they're buying distressed portfolios.
01:22:56.360 | Yeah.
01:22:57.360 | And the sovereigns?
01:22:58.360 | But there's a huge crunch.
01:22:59.360 | We've talked about it before, but there's a huge crunch in late-stage financing.
01:23:02.080 | It's only going to get worse over the next 18 months.
01:23:05.160 | I asked Brad last week, "How many of these zombie corns do you think there are out of
01:23:09.560 | the 1,400?"
01:23:10.560 | He said 30% to 40%.
01:23:12.480 | I think he might be, you know, it could be, look, out of 1,400, I think it could be 700
01:23:16.640 | zombie corns.
01:23:17.640 | I think it's at least 700.
01:23:18.760 | I think it's probably 60%.
01:23:21.080 | Yeah.
01:23:22.080 | And then the other 40%, let's say, how many of them have a down round coming?
01:23:26.320 | I think 60% go to zero.
01:23:30.080 | Of the remaining 40%, half of them probably return money.
01:23:35.200 | And then of the remaining half, half of those maybe get one and a half X.
01:23:40.200 | And then you get a geometric distribution from there, which means the blended return
01:23:43.720 | of that entire stream of unicorns will be about 1.1X, but it'll be very massively distributed.
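Chamath's back-of-the-envelope can be checked with a toy cohort model. The first three rows follow the percentages he gives (60% to zero, half the remainder returning capital, half of those at ~1.5x); the tail rows are hypothetical numbers chosen only to illustrate the "geometric distribution" of outsized winners that gets the blend to roughly 1.1x:

```python
# (share of the ~1,400 unicorns, return multiple) -- illustrative cohorts only.
cohorts = [
    (0.60, 0.0),   # zombie-corns that go to zero
    (0.20, 1.0),   # return capital
    (0.10, 1.5),   # modest winners
    (0.05, 3.0),   # hypothetical power-law tail...
    (0.03, 6.0),
    (0.02, 21.0),  # ...a few grand slams carry the whole cohort
]

blended = sum(weight * multiple for weight, multiple in cohorts)
print(f"blended multiple: {blended:.2f}x")  # ~1.10x, but massively unevenly distributed
```

The design point survives any reasonable choice of tail: the average can sit near 1x while most individual funds without diversification get far less.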
01:23:49.480 | I think that is exactly right, Chamath.
01:23:51.280 | I would...
01:23:52.280 | Yeah, I would...
01:23:53.280 | Everybody's getting their money back, except if you don't have diversification.
01:23:57.360 | I think the market is sending a very clear message, which is these large...
01:24:01.880 | Wait, so you said everybody's getting...
01:24:03.280 | No, no, no.
01:24:04.280 | Most people won't get their money back.
01:24:05.280 | Most people won't get their money back, but on average, it's going to be 1X return.
01:24:08.120 | Yeah, so it's not going to be evenly distributed, correct.
01:24:10.560 | Well, I'm glad that the term zombie corn has held.
01:24:13.760 | Guys, we are coming up on our time.
01:24:15.520 | Do you guys want to do Science Corner or do you want to...
01:24:18.360 | Yeah, the three of us need to use the bathroom break, so go ahead and just...
01:24:22.120 | If you could just go do it, we're going to go and take a leak and we'll come back and
01:24:27.280 | make a Uranus joke.
01:24:28.280 | You know, I'm okay.
01:24:29.280 | I'm going to wrap.
01:24:30.280 | So look, it's been great.
01:24:31.280 | It's been great being your host.
01:24:32.280 | We crushed it.
01:24:33.280 | So give us the Science Corner.
01:24:34.280 | Give us a quick...
01:24:35.280 | You don't hug me.
01:24:36.280 | You don't embrace me.
01:24:37.280 | You don't love my contributions.
01:24:38.280 | We love you.
01:24:39.280 | Freeberg, I actually, I posted a clip of RFK talking about vaccines.
01:24:45.240 | I'd love for you to listen to it and actually give us the critique.
01:24:48.400 | Yes, I will.
01:24:49.640 | Can we do it next week?
01:24:50.960 | That's going viral right now.
01:24:52.200 | He did such a good job explaining his position on that.
01:24:54.320 | He did such a good job.
01:24:55.760 | Yeah.
01:24:56.760 | Look, did you guys read the Offit piece?
01:24:58.480 | I forwarded it to you and I said, please read this.
01:25:01.840 | If you guys read that piece, I'll watch his clip and let's talk about it next week.
01:25:05.080 | Is that cool?
01:25:06.080 | Can you resend that?
01:25:07.080 | Yeah, he is a vaccine scientist who RFK Jr. references often as someone that he met with
01:25:13.720 | and spoke with and says, I caught him in a lie.
01:25:16.080 | And Offit basically said, here's exactly what happened.
01:25:18.520 | Here's the conversation.
01:25:19.920 | Here's the data.
01:25:20.920 | Here's the facts.
01:25:21.920 | Here's the science.
01:25:23.000 | And I would really, really, really encourage you guys to read that, please.
01:25:27.320 | And then I'll watch his clip and like, let's have a real kind of analytical conversation
01:25:31.880 | about statements that are good questions to ask and good things to interrogate and things
01:25:36.880 | that are being said that maybe aren't factually correct.
01:25:40.240 | And I think that we need to kind of really, as a service to ourselves and to people that
01:25:43.920 | listen to us really do that work.
01:25:46.320 | So let's do that and come back and talk about it next week if that's cool.
01:25:50.000 | But I encourage everyone to read Offit.
01:25:51.320 | He put it on Substack.
01:25:52.320 | Nick, we'll put the link in the notes here.
01:25:54.960 | Let's definitely continue.
01:25:55.960 | Let's continue.
01:25:56.960 | Exactly.
01:25:57.960 | I think that's the most important. You want to talk about that certain senator that all
01:26:00.880 | of a sudden just basically gave us the Heisman?
01:26:03.520 | No, we had a lot of those, by the way.
01:26:05.520 | So let me just be clear.
01:26:06.680 | That's not the only Heisman we've received on the all in summit.
01:26:09.040 | I will say that the speaker list for the all in summit is looking fantastic.
01:26:14.200 | We're gonna have a great time.
01:26:19.560 | And I'm really excited for the conversation.
01:26:20.960 | You're saying some folks Heisman'd us because of our support of RFK.
01:26:26.760 | That did happen.
01:26:28.120 | And specifically the fact that that's really open minded.
01:26:31.400 | Yeah.
01:26:32.400 | And then there were other folks who were insulted by things said about them by people on the
01:26:36.960 | show.
01:26:37.960 | Do you guys want to hear about the NANOGrav findings?
01:26:40.400 | Yes, I came out yesterday.
01:26:42.040 | Okay, I'll cover this one real quick.
01:26:43.560 | Wait, what about this like slow Hummer that we're all getting?
01:26:46.200 | What is that?
01:26:47.200 | That's it.
01:26:48.200 | It's a slow Hummer.
01:26:49.200 | Exactly.
01:26:50.200 | So tell me, are you talking about like the new h3 EV, but it only goes up to 50 miles
01:26:53.760 | per hour?
01:26:54.760 | What is that?
01:26:56.360 | The new Hummer, the new EV. They're coming out with an EV of that.
01:27:00.960 | Oh, you're kidding.
01:27:01.960 | Hilarious.
01:27:02.960 | Yeah.
01:27:03.960 | It's like a great troll.
01:27:04.960 | I remember when Schwarzenegger got the Hummer back in the 90s.
01:27:06.720 | And I was like, Oh my God, he's amazing.
01:27:08.520 | I can drive on your car.
01:27:10.120 | And then the Hummer was the cool car to drive.
01:27:12.120 | Yeah.
01:27:13.120 | And then it became like, ah, it's a lot of manly car.
01:27:16.760 | Okay.
01:27:17.840 | So yesterday, a paper was published by an international scientific consortium.
01:27:24.080 | This group is called NANOGrav.
01:27:26.640 | And they've been using a series of instruments to measure pulsars, including a 500 meter
01:27:32.060 | radio telescope array, which allows them to see... what's so funny?
01:27:37.960 | No, I just had like three Uranus jokes in one sentence and I had to stop myself.
01:27:42.680 | I get them out now.
01:27:43.680 | Go ahead.
01:27:44.680 | I'm muting.
01:27:45.680 | I'm muting.
01:27:46.680 | The NANOGrav data that was released is 15 years of data from pulsars.
01:27:49.880 | And pulsars are neutron stars, which are stars that have collapsed on themselves and are
01:27:54.120 | basically super dense and start spinning.
01:27:56.800 | And then these pulsars, you know, are basically like a lighthouse: you can see the light.
01:28:00.360 | So it looks almost like a strobe light.
01:28:02.960 | And we can see thousands of these across the universe and we can observe them.
01:28:07.080 | And the rate at which the pulsing is coming out of these pulsars tells us a lot about
01:28:12.200 | what is happening in the space between Earth and those pulsars.
01:28:17.420 | And when you collect enough data over a long enough period of time, which is what these
01:28:21.040 | folks have just released is 15 years worth of this data, you can start to see really
01:28:26.760 | interesting patterns in the data that support the theory that space time itself is slowly
01:28:36.440 | vibrating: being stretched, being compressed, being pulled apart, being pushed back together,
01:28:42.100 | because of very large gravitational events happening around the universe.
01:28:46.600 | And what that means is you guys have all seen, you know, that kind of two dimensional image
01:28:51.000 | of a black hole.
01:28:52.120 | And Nick, if you could find one online and pull it up. Where it looks like the middle
01:28:55.200 | of a black hole, space itself collapses in and collapses down.
01:28:58.880 | And what happens is space and time gets significantly elongated.
01:29:03.600 | When they're really close to gravity, gravity actually pulls space, sucks it in, sucks in
01:29:07.800 | time, and it becomes distorted.
01:29:11.160 | And so when you have large black holes around the universe spinning and running past each
01:29:15.560 | other, they're actually pulling and stretching space time itself.
01:29:19.460 | And that sends out ripples throughout the universe, ripples that are slowly undulating,
01:29:25.080 | space and time itself.
01:29:27.320 | So by observing all of these pulsars around the universe, and the rate at which these
01:29:31.960 | pulsars are pulsing, and seeing slight variations, we can start to measure and actually see those
01:29:39.080 | waves, those very slow waves of space time itself undulating, and being pushed and compressed.
01:29:46.840 | And so it supports Einstein's general theory of relativity, which indicated that space
01:29:51.160 | time itself can be warped by gravity.
01:29:54.000 | And it provides a really interesting picture of the universe itself, that all around us,
01:29:59.200 | we have large masses that are many, many millions or billions of light years away, that are
01:30:04.320 | creating waves in space and time itself, that we as humans will never kind of observe, realize
01:30:10.400 | or feel ourselves, but as part of the fabric and the underlying nature of our universe,
01:30:15.760 | with space and time being slowly warped and slowly elongated, slowly compressed.
01:30:20.760 | But it's a really fascinating picture of the universe.
01:30:23.520 | Over time, as scientists gather more and more of this data, it will provide insights into
01:30:28.760 | where in the universe these massive black hole events may be occurring, and also provide
01:30:33.480 | insights into the early picture and the large scale structure of the universe, which helps
01:30:37.640 | us better understand how everything started and where we're all coming from.
01:30:41.840 | So it was a really fascinating data release.
01:30:43.720 | I think it's a really kind of profound thing.
01:30:46.360 | If you take a little time and think about it, it's super exciting.
01:30:49.920 | It's getting a lot of press coverage today and encourage us all to pull our heads out
01:30:54.400 | of the Ukraine war and Silicon Valley and money and all this stuff and realize that
01:30:58.640 | there are things of extraordinary scale and structure that are happening around us.
01:31:01.960 | Let me ask you two questions.
01:31:04.080 | Number one, why does it matter?
01:31:06.960 | And number two, any theories here of what we might discover, you know, if this, you
01:31:12.680 | know, goes 10x or 100x in terms of the information we're getting?
01:31:16.840 | Many years ago, it was theorized that there's what's called a cosmic microwave background
01:31:21.400 | radiation, the CMB.
01:31:23.840 | And what that is, it's the leftover heat from the formation of the universe, from
01:31:29.000 | when the Big Bang happened.
01:31:30.880 | And the scientists figured out how to create really sensitive radio telescopes and put
01:31:35.600 | them in orbit, and they started to observe the background radiation.
01:31:40.800 | And what that showed us was the fingerprint of the universe, the original structure of
01:31:44.920 | the universe that created ultimately all the galaxies, super galaxies, and then ultimately
01:31:49.720 | all the stars and then the planets and everything that came from that.
01:31:53.440 | This could be the beginning of seeing a gravitational background of the universe, where we could
01:31:59.120 | actually start to see perhaps the fingerprint of the space time of the whole universe of
01:32:05.400 | what the actual structure of space itself and time itself looks like throughout the
01:32:09.460 | universe with the perturbations being driven by some very large, massive, supermassive
01:32:14.240 | black holes.
01:32:15.240 | There was a black hole discovered this week, that's 30 billion times the mass of our sun.
01:32:21.160 | There are these massive objects out there that are actively distorting space time.
01:32:25.600 | And we're going to start to get a fingerprint of that with this sort of data.
01:32:29.240 | And over time, that just gives us a better sense of what the overall structure of the
01:32:32.760 | universe looks like, not just from the heat energy that we're collecting, but also the
01:32:36.240 | gravitational waves that we're now able to observe through the inference of this data
01:32:40.680 | collection.
01:32:41.680 | So it deepens our understanding of the universe, but there's not, like...
01:32:45.080 | Yeah, which is amazing and interesting. And it validates and proves the general theory
01:32:50.600 | of relativity, which if you think about the application of that down the road, that may
01:32:54.480 | lead to things like close to or as fast as light travel or things related to time travel.
01:33:00.960 | You know, there's a lot of things that people have theorized for decades about, you know,
01:33:06.040 | black holes and the warping of space time itself.
01:33:08.920 | I'm not saying that any of this stuff is... Did you see the Three Body Problem trailer?
01:33:12.600 | Yeah, it looks amazing.
01:33:14.680 | It looks amazing.
01:33:15.680 | Amazing.
01:33:16.680 | What a great... I can't believe we have to wait so long.
01:33:19.240 | I hate it when they put out trailers.
01:33:20.760 | So how long? When is it coming out? Next year?
01:33:23.840 | Oh, wow.
01:33:24.840 | Yeah.
01:33:25.840 | Oh, I still haven't seen your movie.
01:33:27.480 | The movie, guys, that you wanted me to see.
01:33:30.560 | Which one? You'd better try Everything Everywhere All at Once.
01:33:33.000 | I got a better movie for you.
01:33:34.160 | I got a great pull for you.
01:33:36.520 | It's on pay-per-view right now, the movie about BlackBerry.
01:33:41.020 | It tells the story of BlackBerry.
01:33:42.020 | Oh, I heard it's like an independent film.
01:33:43.680 | It is awesome.
01:33:46.040 | I just reviewed it on This Week in Startups.
01:33:47.840 | It's just called BlackBerry. It's called BlackBerry.
01:33:50.840 | Yeah, I think this has been an amazing episode
01:33:55.320 | 135 of the All-In pod.
01:33:56.920 | I really appreciate our time together. Just enough time to get you back to your Nirvana
01:34:03.360 | concert.
01:34:04.360 | Oh, what's the background on this one?
01:34:07.760 | Is that a rival?
01:34:08.760 | What is that?
01:34:09.760 | What's that background?
01:34:10.920 | What movie?
01:34:11.920 | That's a black hole.
01:34:12.920 | That's a black hole.
01:34:13.920 | Just a black hole.
01:34:14.920 | Okay.
01:34:15.920 | Now I think that that's from, might be, might be Sacks's hair, because he looks like, did
01:34:21.480 | you get a cut, Sacks?
01:34:22.960 | He got a cut.
01:34:23.960 | Did you cut it?
01:34:25.440 | No, you broke our sex.
01:34:27.040 | I got a mild sloughing.
01:34:29.040 | Show us the flow.
01:34:30.240 | Let's go.
01:34:31.320 | Let's see.
01:34:33.120 | Oh, boo.
01:34:34.280 | I mean, it's still crazy.
01:34:35.680 | Sure.
01:34:36.680 | What's good?
01:34:37.680 | You gotta just keep growing it out, man.
01:34:38.680 | Jake, how you want to take us out?
01:34:40.040 | You do a better outro.
01:34:41.040 | All right, everybody, for David Sacks, the Architect, coming at you.
01:34:45.560 | Z100 Morning Zoo.
01:34:47.000 | Chamath Palihapitiya, two-for-Tuesday's Tears for Fears coming up, and Freeberg with the
01:34:51.600 | science project.
01:34:52.840 | All right, traveling back in time with David Freeberg, two-for-Tuesday, Jackson Browne.
01:34:58.720 | Here we go.
01:35:04.200 | Gotcha.
01:35:05.200 | That's the world's greatest moderator.
01:35:06.200 | I think I'm the world's greatest moderator.
01:35:09.240 | I could do my NPR voice if you like.
01:35:11.600 | Do it.
01:35:12.600 | All right, closing us out here.
01:35:13.600 | Episode 135. KCRW 92.3, the sound of Santa Monica. This Sunday at the Venice Farmers Market,
01:35:22.760 | two for one on the organic milk.
01:35:25.000 | Go check it out.
01:35:26.000 | And we'll see you all later on the politics of culture.
01:35:31.060 | David Sacks chiming in on the Republican GOP position, which we did consider. Freeberg
01:35:36.880 | deeply going into science, and Chamath Palihapitiya on wealth disparity for everybody.
01:35:43.480 | I am your host here at KCRW,
01:35:45.440 | Jason Calacanis, the world's most moderate moderator.
01:35:49.440 | We'll see you next time.
01:35:51.440 | I love you guys.
01:35:58.440 | Let your winners ride.
01:36:02.040 | Rain Man David Sacks.
01:36:03.040 | We open source it to the fans and they've just gone crazy with it.
01:36:16.040 | Let your winners ride.
01:36:23.040 | Besties are gone.
01:36:28.040 | We need to get merch.