E166: Mind-blowing AI Video: OpenAI launches Sora + Is Biden too old? Tucker/Putin interview & more
Chapters
0:00 Sacks Tequila Tasting!
1:19 Nvidia's market cap surpasses Google, ARM's stock rips, Masa's big comeback
15:36 Mind-blowing Sora demos: breaking down OpenAI's new text-to-video model
51:24 The risks of companies issuing stock option loans
64:57 Biden's cognitive decline, should the Democrats find a replacement? Plus, third party viability
91:46 Takeaways from Tucker's Putin interview
00:00:00.000 |
So how's tequila? I tasted 10 samples the other day. One was head and shoulders above the rest, 00:00:07.440 |
which was a good sign. Oh, so now we're taking that and making it the new baseline and I'm going 00:00:12.480 |
to try 10 more samples. And we're going to go from there. I think we're very close to having 00:00:16.400 |
something excellent. Great. And then I did a side by side taste test of the sample I like 00:00:21.840 |
the best compared to Clase Azul Ultra. And it was already better than Clase Azul Ultra, which 00:00:28.240 |
really retails for like 3000 a bottle. Really? Yeah. So it's going to be better. V1's better 00:00:34.400 |
than their final product is what you're saying? Yes. And that was not just your taste test, 00:00:38.880 |
but like you and several people. It was me and two other guys who I trust. So we did it together. 00:00:43.280 |
Now, when you say trust, do you mean underlings or do you mean actually like. No, he means 00:00:47.360 |
alcoholics. Where in your pay hierarchy are they? He recruited them at Alcoholics Anonymous in West 00:00:54.320 |
Hollywood. Are they level two employees or level four employees? No, one works for me and one's a 00:00:58.960 |
consultant. Okay. Are we over or underestimating the impact of the AI boom? Nvidia 00:01:24.240 |
has been on an all time heater. If you don't know, its market cap is now 1.8 trillion, Friedberg. 00:01:29.440 |
This week, it surpassed Google and Amazon as the fourth most valuable company in the world. There 00:01:37.440 |
it is, folks, right behind Saudi Aramco. 16 months ago, stock was trading about $112 a share. Since 00:01:46.400 |
then, share price has gone up 6.5x on a big number. Shares went higher this week after it 00:01:50.880 |
was reported that Nvidia plans to build custom AI chips with Amazon, Meta, Google and OpenAI. 00:01:57.200 |
In other news, Financial Times reported OpenAI hit a $2 billion run rate for their 00:02:04.240 |
recurring revenue. It's about 165 million a month. And they think they can double that number 00:02:09.360 |
in 2025. Tons of competition coming out. If you didn't see Google rebranded their generative AI 00:02:15.680 |
suite to Gemini, and they're charging 20 bucks a month for it. And then there was this bizarre 00:02:21.840 |
Wall Street Journal headline that Sam was going to raise $7 trillion. That makes no sense. Maybe 00:02:26.000 |
that was the TAM. We'll get into that. Friedberg, your thoughts on this ascension? Are we underestimating 00:02:33.280 |
or over hyping AI? I mean, it seems like the compute build out is underway. So this is going 00:02:42.480 |
to continue for some time. I think the real question that keeps getting asked is, you know, 00:02:47.120 |
when do you really see the tides recede? And what's what's there? The compute build out is 00:02:51.440 |
underway. But are we seeing application and productivity gains associated with the capability 00:02:56.800 |
of that compute? Some would say yes, some would say no. But you've got to build out the infrastructure 00:03:02.960 |
if you're going to assume that there's going to be these productivity gains in the application layer. 00:03:07.440 |
And everyone's assuming they'll be there. So we need to meet the demand, which is almost like 00:03:12.320 |
building out an entirely new internet. And so the core of AI is this matrix multiplication problem. 00:03:20.640 |
You need these chips that do that really efficiently. They're not traditional CPUs, 00:03:25.760 |
which are general purpose chips, but they're chips that are very special purpose for a particular 00:03:29.840 |
set of applications that render the output of whatever model is being run. It's sort of like 00:03:35.440 |
the internet, like you could pretty much create a bogey of any number on how big this can get. 00:03:40.320 |
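(To make the matrix-multiplication point concrete, here is a minimal, illustrative Python sketch. The layer sizes are assumed, typical-looking numbers, not figures from the episode; the point is only that the bulk of a model's forward pass is dense matrix multiplies, which is exactly what these specialized chips are built for.)

```python
# Illustrative sketch only: the core compute in a transformer-style layer is
# dense matrix multiplication. Sizes below are assumed for illustration.
import numpy as np

batch, d_model, d_ff = 32, 1024, 4096

x = np.random.randn(batch, d_model)     # activations coming into the layer
W1 = np.random.randn(d_model, d_ff)     # feed-forward weights
W2 = np.random.randn(d_ff, d_model)

h = np.maximum(x @ W1, 0.0)             # matmul + ReLU
y = h @ W2                              # second matmul

# Each matmul is roughly 2 * batch * d_model * d_ff floating point operations,
# and models stack dozens of layers like this, which is why hardware that does
# matrix multiplication efficiently (GPUs, TPUs, etc.) matters so much.
print(y.shape)  # (32, 1024)
```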
At the same time, you're seeing some of the other players with different architectures and 00:03:46.800 |
potentially that are going to be beneficiaries of this build out because they're participants 00:03:50.320 |
in other parts of the data center, or alternative solutions like Arm. And I don't know if you saw 00:03:55.040 |
this, but in just the last couple of weeks, or last week or two, Arm's valuation doubled, 00:04:01.280 |
it's come down a little bit since that big doubling. Today Arm is sitting at a 140 billion 00:04:06.480 |
valuation. Last week, it got as high as 156. And remember, it was only last August. So if you guys 00:04:14.400 |
remember, Arm was bought by SoftBank and the Vision Fund. And then I think 25% of Arm was held by the 00:04:22.320 |
Vision Fund and 75% was owned by SoftBank. Last August, SoftBank bought from the Vision Fund 00:04:29.680 |
the 25% that the Vision Fund owned at a valuation of 64 billion. So now SoftBank owns 90% of Arm, 00:04:37.520 |
the last 10% trades publicly. So at a $140 billion valuation, SoftBank's position in Arm is 125 00:04:44.640 |
billion. And I think it really begs the question, did Masa have a masterful move here yet again, 00:04:50.160 |
because everyone said for years, with all the bets this guy was making in the vision fund, 00:04:55.680 |
and some of the misguided bets he was making with SoftBank, and the leverage he was applying 00:05:00.240 |
on himself and SoftBank to deploy all this capital during the ZIRP era. Everyone was saying, 00:05:05.920 |
you know, this guy had a one hit, it was Alibaba, and that was it. And clearly he doesn't know what 00:05:10.480 |
he's doing. But now all of a sudden, his bet on AI was fairly prescient. And it's paying off 00:05:17.360 |
pretty powerfully. He bought the whole business for 32 billion. And now he has a stake that's 00:05:23.520 |
worth 125 billion. Seven years later, yeah, I mean, chip and a chair, right? I mean, it's really 00:05:28.960 |
begging the question. I mean, it's the power law, right? When you're making a long term investment, 00:05:33.760 |
you can very easily look like a moron over the short term. And to your point, the power law, 00:05:39.120 |
all it takes is one of them to work out and it can wipe out the losses on the rest of the 00:05:42.800 |
portfolio. And we may be seeing Masa, with his whole AI vision story that he's been telling for 00:05:47.600 |
seven years, we may see Masa clawing his way out of the hole. You only look like a moron to losers. 00:05:54.400 |
Meaning when people were throwing shade at Masa, it wasn't anybody that's any good. I didn't see 00:06:01.680 |
headlines that said John Malone thinks Masa is a fraud. Warren Buffett thinks Masa is a fraud. 00:06:07.120 |
Nobody was saying that. Because those guys know what the game is. And the game is exactly what 00:06:12.320 |
you said, J. Cal, you must always survive. And you need to keep your wits about you. And over time, 00:06:20.160 |
if you're prepared and do enough good work, you'll get waves of good luck that amplify 00:06:24.240 |
your hard work. And Masa is an example of an incredibly talented person that has found a way 00:06:30.000 |
to survive more than anything else. And so waves of success will hit him. The people that complain 00:06:37.760 |
are the ones that are sort of a little bit stuck on the sidelines, and are more frustrated with 00:06:42.800 |
their own impotence and inability to do anything. Now, on arm specifically, the thing that I would 00:06:49.120 |
keep in mind is that those guys floated, I think about 15 million shares. So there's about 140 550 00:06:57.840 |
million shares outstanding 90% owned by them. So there's about 10 or 15 million shares outstanding. 00:07:03.600 |
And there was about 12 and a half million shares short going into the earnings report. And so the 00:07:08.560 |
thing to remember is that there was almost an entire turn of the float that was short. So I 00:07:14.960 |
don't know how much of this reaction was just an enormous short squeeze of people that were largely 00:07:23.360 |
long other chip companies. And so they thought they could short arm essentially as a spread 00:07:28.400 |
trade to minimize volatility. But this is an example of where if you don't do this analysis 00:07:33.680 |
properly, GameStop was another where you misjudge or you sub analytically look at the actual share 00:07:41.760 |
count and what's actually free floating, you can get caught on the wrong side. And these things can 00:07:46.320 |
be very violent. And to explain the float to folks, you can make a certain number of shares available 00:07:52.640 |
to the public to trade. When you have a small number of those available to the 00:07:57.680 |
public, weird things can happen, because there's not enough available to short. That's exactly right. 00:08:03.520 |
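(Back-of-the-envelope math on what "an entire turn of the float" means, using the rough share counts cited in this conversation; these are the figures as spoken, not verified filings.)

```python
# Rough arithmetic using the approximate numbers mentioned above.
free_float = 15_000_000       # shares said to be actually available to trade
short_interest = 12_500_000   # shares said to be sold short going into earnings

short_vs_float = short_interest / free_float
print(f"Short interest is about {short_vs_float:.0%} of the float")  # ~83%

# When most of a tiny float is sold short, any rally forces shorts to buy back
# shares that barely exist in the market, which is what makes the move so violent.
```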
And so that could have been the situation here with this run up. But I think I think the overall 00:08:08.400 |
point of placing bets and being a capital allocator is the important one to discuss here. 00:08:13.600 |
Sacks, what are your thoughts on all the crazy bets that Masa made? We watched a lot of our 00:08:20.000 |
companies or companies adjacent to our companies get massively overfunded. Some of them literally 00:08:24.640 |
got drowned in capital. But then you only need to hit one and maybe win it all back. What are 00:08:29.520 |
your thoughts on this high alpha, you know, betting style of his? Look, I mean, he's a gambler. 00:08:37.600 |
And he is playing a power law. So he only needs one really big return to basically make up for 00:08:43.680 |
all the mistakes. And it seems like he's hit that once again, he did it with Alibaba, too, right? 00:08:48.400 |
I mean, he's a major gambler. Most people would be a little bit more risk averse. I don't think 00:08:53.600 |
it was a great strategy to be pumping $500 million checks into what were effectively seed companies, 00:08:58.400 |
like they were doing with the vision fund. However, obviously, he made enough good bets 00:09:04.240 |
that they're going to pay off that fund. Well, it's a good point, Sacks. I mean, none of the... 00:09:08.080 |
I don't, I probably need to be careful, because I'm not super familiar with the portfolio. 00:09:14.640 |
But it doesn't seem like any of the big winners were those seed bets. Like he doesn't have any 00:09:20.000 |
returns coming out of that kind of investment strategy. DoorDash? Yeah, DoorDash was not like... 00:09:26.720 |
At what point did they invest? A billion or $2 billion round, was it? Yeah, pretty important moment. We've 00:09:30.880 |
heard at dinner, Jason and I at dinner, we've heard the waterfall from the SoftBank side, 00:09:37.360 |
and it sounds like it was an enormous number. That was definitely there. It was a de-risked 00:09:43.200 |
business. It was a scaling business. Seems a little different than some of these companies that 00:09:48.320 |
were fairly early on that got monster valuations and large chunks of capital and then struggled 00:09:54.640 |
to find ways to deploy that capital which led to unhealthy behavior. I think the thing you have to 00:10:00.080 |
give a lot of credit for is it's so easy to give up when the peanut gallery is chirping. And so 00:10:08.240 |
the thing that you have to give him a lot of credit for is he has the ability to tune those 00:10:14.240 |
people out. Now he probably isn't even living in a world where any of those people's chirping 00:10:19.440 |
even gets to him. And I think that that's a very important thing. But kudos to him. I mean, 00:10:26.240 |
to be able to tag something for 100 billion here 70 billion there. I mean, keep swinging. And by 00:10:32.000 |
the way, at the end of it all, what is it going to change for him? Absolutely nothing except he'll 00:10:36.640 |
have some incredible stories to tell about the bets that he took and the roller coaster ride he 00:10:41.440 |
was on. And I think that's going to be the most valuable thing he's going to be able to tell his 00:10:45.120 |
grandkids these stories and their eyes will just be like what and his friends you know, I mean, 00:10:51.840 |
I would love to have a dinner with him. I don't know. Come on the pod. And let's talk about I've 00:10:57.440 |
only met him once, but I have a great story to tell about Masa when I was at Facebook. I met him 00:11:00.800 |
once. But anyways... Tell it. I mean, can you tell it? All I'm saying is I would love to have 00:11:05.040 |
dinner with him because I just want to hear these stories of all like up and down and up and down 00:11:09.600 |
and two lunches with him. We had lunch one day, we had such a good time. He said, Can we have lunch 00:11:15.520 |
again tomorrow? How long are you in Tokyo? I said, Yeah. And I came back the next day and I had lunch 00:11:19.680 |
with him. That's amazing. And it was great. Like talking to three hour lunches, where we're just 00:11:24.400 |
talking about every web 2.0 company. This is in, you know, the 2009, Uber, kind of Robinhood period. 00:11:31.040 |
And he was just fascinated by what was going on and all the disruption happening. DoorDash series 00:11:36.000 |
D $1.4 billion post 2018. He did. Well, the question I was asking is like DoorDash was a 00:11:42.880 |
lot like an arm. There's a point of view that the business is proven that there's a market that it 00:11:47.680 |
can grow into that there's a management team that knows how to deploy capital versus a lot of these 00:11:53.120 |
early stage bets, where when they got too much money, they ambled about. What they didn't amble 00:11:59.520 |
about, they kind of incinerated it. Incinerated because: what do I do with this money? I don't 00:12:05.520 |
yet have a machine or an engine that can grow with this money. And until the engine can handle 00:12:10.160 |
the capital thing, like the pizza thing, Zume Pizza, you're talking about? Yeah, I mean, if 00:12:14.640 |
you get dozens of them in that portfolio, right? Yes. If you don't have a business that can handle 00:12:20.480 |
the gas you're putting in the tank, the engine is going to blow up, the engine has to grow, 00:12:26.000 |
it has to build the volume has to build that it can actually handle the fuel. And I think that's 00:12:30.720 |
been part of the challenge. And I will just say I've said this in the past, but there was an era 00:12:34.640 |
in Silicon Valley that started around 2010, 2009, kind of right after, right as web 2.0 was 00:12:42.880 |
starting to venture into more traditional markets, meaning tech companies were becoming other 00:12:48.080 |
companies, kind of the mobile-local era was kicking off. And there was a strategic concept 00:12:54.160 |
that got popular in Silicon Valley of "take the market," which is when you feel like you could be 00:12:59.440 |
the winner, give the person all the chips and say now go over bet and take the whole market because 00:13:05.840 |
these are going to be winner take all or these could be winner take most markets. So get first 00:13:10.000 |
mover advantage, here's more capital than you need. And I remember being pitched this when I 00:13:13.920 |
was raising my series B, that I should raise more than I was trying to raise, because I should use 00:13:19.200 |
the extra capital to take the market. And then that mantra just became the de facto standard, 00:13:24.080 |
which is Oh, you have something that has something, it has some edge, it has some advantage, 00:13:28.000 |
it started to move in the market. Here's a ton of capital you don't really need. Now go take the 00:13:33.280 |
market. And the reality is most of those businesses had absolutely no way of using that capital to 00:13:38.960 |
quote take any market to know what their market was yet. Yeah, they when you're in the product 00:13:43.680 |
market fit product market fit phase and you don't have the engine as you called it. How do you make 00:13:48.960 |
the engine go faster? You're literally trying to take a bunch of car parts and then run a Formula 00:13:54.640 |
One race. The car is not even track worthy yet. And Katerra, you're... not to pick on any companies, 00:14:00.480 |
but I remember Katerra, which was doing construction, and they just thought they're going to take over 00:14:04.640 |
the whole construction, you know, industry with this company, raised like $900 million. And I 00:14:10.240 |
believe it's a zero the whole way. You cannot, you can't put the money to use. Capital as a weapon 00:14:15.840 |
maybe works if you're Uber, Lyft and DoorDash and you have customers and you can reduce the price. 00:14:20.240 |
But well, you know, what's interesting is when you now look at all a bunch of these AI companies, 00:14:24.240 |
every day, what you see is every round is a minimum of $100 million. But the dirty little 00:14:31.760 |
secret is not that these companies have found product market fit or are scaling revenue 00:14:36.080 |
massively. I'm not talking about open AI, I'm talking about these other long tail businesses. 00:14:39.920 |
The $100 million is largely going to compute. But as we know, and I think we'll talk about this 00:14:48.160 |
next week when all these announcements are done, but you're about to see a one-tenthing of the compute 00:14:53.920 |
cost. And so what's interesting for those companies is they're going to be left with a lot more money. 00:14:58.400 |
But back to Freeberg's point, they may be ambling about with too much money at that point. So today 00:15:03.760 |
you raise $100 million because you need to spend 70 or 80 on compute tomorrow when you only need 00:15:08.960 |
to spend seven or eight for the same amount of throughput. On one hand, you could say, well, 00:15:13.920 |
this is great, I have so much more money I can spend. But again, we've seen example after example, 00:15:19.360 |
where if you raise too much money, when you don't need it, you will just run yourself into a brick 00:15:23.520 |
wall, you'll over hire, you'll miss hire, you'll misallocate. And it just goes back to the same 00:15:28.960 |
thing that we've said before, which is there's just not enough people that actually know how 00:15:31.680 |
to run companies. Yeah, and it just becomes a huge distraction. Did you see Saks today, 00:15:36.320 |
Sora? Did you see Sora? This is breaking news right now. 00:15:40.720 |
Yeah, this is unbelievable. If you haven't seen it, Sora is the text to video model from OpenAI. 00:15:46.160 |
They've been talking about this. I guess this would compete with Stable Diffusion, which does video as 00:15:53.760 |
well. Anyway, you give a prompt. I'll just read this one out loud. Beautiful snowy Tokyo City is 00:15:58.720 |
bustling. The camera moves through the bustling city street, following several people enjoying 00:16:02.400 |
the beautiful snowy weather and shopping at nearby stalls. Gorgeous sakura petals are 00:16:08.400 |
flying through the wind along with snowflakes and this if you weren't studying it would look like 00:16:17.600 |
a major feature film or TV show. Yeah, Saks. Yeah, I mean, it's unbelievable. I mean, 00:16:24.160 |
it looks to me like 4k video. I mean, I'm not sure it actually is 4k, but it looks pretty darn close. 00:16:29.520 |
Yeah, the rendering resolution can be whatever you want it to be. 00:16:33.840 |
Right. The fact that you can describe the way that you want the camera to move. 00:16:38.000 |
So I'm going to speculate on this. You guys know, I talked about this quite a while ago on how I 00:16:44.640 |
think we're going to be a prompt to video meaning people are going to have personalized video games 00:16:48.640 |
and personalized movie experiences and music eventually, where rather than everyone watch 00:16:53.840 |
the same product made centrally by a producer, we'll all make our own product locally. I spent 00:16:59.520 |
some time digging into this over the last year. And the big challenge was that the traditional 00:17:04.320 |
approach for rendering video is you create three dimensional objects. And then you have 00:17:08.240 |
a rendering engine that renders those objects. And then you have a system that defines where 00:17:12.160 |
the camera goes. And that's how you get the visual that you use to generate a 2d movie like this. 00:17:17.600 |
And the best tool out there is Unreal Engine five, which is made by Epic Games. I don't know 00:17:23.280 |
if you guys have ever seen the demos of it, but I've seen it. Yeah, Star Wars uses with their like 00:17:28.400 |
virtual sets. Now when they did the Mandalorian, I think they're using the Unreal Engine or Unity, 00:17:32.080 |
they probably are. And then there's other tools. If you're in movie production, 00:17:35.280 |
you'll use different stuff. But Unreal Engine is just so incredible. But it requires the placement 00:17:40.400 |
and the rendering of 3d objects. So where do those objects go? And then the physics of the light 00:17:44.960 |
moving off those 3d objects to generate a 2d video is pretty straightforward. That's been done for 00:17:50.480 |
20, 30 years. And so the density, the rate of improvement, the color contrast, etc., has all 00:17:57.680 |
improved, made it look incredible. Now it's super photorealistic, you create photorealistic video, 00:18:02.080 |
but it still requires this concept of where are the 3d objects in 3d space. 00:18:05.920 |
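(A rough sketch of the contrast being described here. The function names and stubs below are invented purely for illustration; neither Unreal Engine nor Sora exposes an interface like this.)

```python
import numpy as np

H, W = 64, 64  # tiny frame size, just for illustration

def rasterize(scene_objects, camera):
    """Stand-in for a real renderer: real engines compute how light bounces
    off explicitly placed 3D objects to produce each 2D frame."""
    return np.zeros((H, W, 3))

def classic_pipeline(scene_objects, camera_path, n_frames):
    """Deterministic rendering: you must know where every 3D object is and
    where the camera goes at every timestep."""
    return [rasterize(scene_objects, camera_path(t)) for t in range(n_frames)]

def learned_pipeline(prompt, n_frames):
    """Text-to-video, conceptually: one trained model maps a prompt straight
    to pixels, with no explicit object list or camera model to inspect."""
    return [np.zeros((H, W, 3)) for _ in range(n_frames)]  # stand-in output

frames = classic_pipeline([], camera_path=lambda t: (0.0, 0.0, float(t)), n_frames=10)
print(len(frames), frames[0].shape)
```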
This doesn't do that. This was a trained model. So we don't know what's going on in the model. So 00:18:12.880 |
how would you train a model to do this without having a 3d space? Because clearly, there's 00:18:19.040 |
thousands of objects in that rendering, if you look at that video, thousands of little objects, 00:18:23.040 |
the compute necessary to define each of those objects, place them in 3d space is practically 00:18:28.400 |
impossible today. So this is being done through some model that starts to render all of these 00:18:35.360 |
things together. So it is very likely my guess is that open AI used a tool like Unreal Engine five, 00:18:43.280 |
and generated tons and tons of video content, tagged it, labeled it. And were then able 00:18:50.720 |
to use that to train this model that can, for whatever reason that we don't understand, 00:18:57.200 |
do this, the output. But what's interesting is that this output, underneath it... You're referring 00:19:03.120 |
to synthetic training data. And yeah, exactly, exactly. And but what's amazing about this model 00:19:09.600 |
is that underneath this, this model on its own has rendered physics. So all of the rules of physics, 00:19:15.760 |
all of the rules of motion, all of the rules of perspective, all of the rules of how objects sit 00:19:20.560 |
upon one another next to each other, how fluids move, if you look at some of the other demos, 00:19:24.960 |
on this tool, you can actually see fluid moving, it's rendering the movement of fluid or rendering 00:19:30.320 |
the movement of hair. Those are like physics models that would typically be deterministically 00:19:34.800 |
written, meaning that a programmer and engineer writes the physics. And then the Unreal Engine 00:19:39.840 |
renders the physics. In this case, the model figured out the physics. And the model probably 00:19:45.200 |
did it in such a reduced way that you could use a reduced amount of compute to generate something 00:19:50.320 |
that would otherwise take a ton of compute time. I mean, look at this. It's truly amazing. 00:19:54.880 |
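(For contrast, this is what "deterministically written" physics looks like: an engineer writes the update rule by hand and the engine runs it every frame. A toy sketch with made-up constants; the point is that Sora appears to have absorbed behavior like this from data rather than from an explicit formula.)

```python
# Hand-written physics of a falling ball with crude air resistance.
dt, g, drag = 1 / 60, 9.81, 0.02   # timestep (s), gravity (m/s^2), drag factor

height, velocity = 10.0, 0.0       # start 10 m up, at rest

for frame in range(120):           # simulate two seconds at 60 fps
    velocity -= g * dt             # the formula a programmer wrote down
    velocity *= (1 - drag)         # crude drag, also hand-written
    height = max(0.0, height + velocity * dt)

print(f"height after 2 seconds: {height:.2f} m")
```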
Nick find the one of golden retrievers playing in the snow. This is way better than this. 00:19:58.800 |
But what's amazing is that the and by the way, if you're listening to this, 00:20:01.360 |
I really recommend going on Twitter and typing in OpenAI, Sora, S O R A, and looking at some 00:20:06.720 |
of these demo videos. It's bonkers. And I think it really speaks to what's so incredible right now, 00:20:11.920 |
which is that software is writing itself. And it's replacing a lot of human determinism, 00:20:16.800 |
the way that humans think about constructing software is being completely redone with these 00:20:22.400 |
learned models. This model learned physics on its own, renders this thing that looks like the only 00:20:28.400 |
way we know how to do this today is to create 3d object models in 3d space. And you need tons of 00:20:32.800 |
compute to render them. And just to translate that from geek to... Yeah, you know, for people listening. 00:20:37.760 |
This is incredible, because Pixar spent years, maybe a decade trying to figure out hair. I 00:20:44.240 |
remember talking to Ed Catmull. And he told me hair was just so hard to get right with wind and 00:20:49.120 |
lighting. They literally built the physics engine for years, like maybe even a decade to make hair 00:20:55.520 |
cross the uncanny valley. What you're saying here, Friedberg, is somehow this language model, 00:21:01.120 |
this image model, just outputs it. And it doesn't actually have a physics model. It just outputs it, 00:21:07.120 |
correct? Yeah, well, we don't know. But the model, there may be some determinism built into 00:21:13.760 |
it. So I don't want to assume anything. But the general approach... Determinism means you write 00:21:19.600 |
the physics and you write the formula of the physics. Yep. So that the software can execute 00:21:24.480 |
the physics and render the thing. But in this case, it may very well be the case. Like a lot 00:21:30.560 |
of the other models that are being trained today, when you create a model, they're trained, that 00:21:35.680 |
it's being trained and on its own, it resolves to figure out how to do the physics. Is there a limit 00:21:42.000 |
on the length of these Sora videos that you can generate one minute right now, 60 seconds today, 00:21:48.320 |
obviously, that'll come off at some point, by the way, if you can do 60 seconds, you can do two 00:21:51.840 |
hours, right? The problem that you have is, I mean, if we're talking about creating a movie, 00:21:56.160 |
is making edits. So obviously, I haven't used Sora, because it just came out today. But I've 00:22:01.600 |
used DALL-E, which is just their prompt-to-image service. And I was designing the label for, or 00:22:08.960 |
the bottle for, the All-In tequila for you guys. Yeah, what I noticed is that it would spit out a 00:22:14.640 |
bunch of ideas. And then I would take one and then want to just make little changes. And I would just 00:22:21.920 |
say, okay, keep this, but just change this one detail, it would give me a completely new image. 00:22:26.400 |
So the AI doesn't realize the layers inherent in what it's producing. Somehow, it takes this 00:22:34.720 |
prompt and generates a response. And if you give it that same prompt 100 times, it'll give you 100 00:22:40.960 |
different answers. And if you'd say just modify this, this or this, it can't do it, it can't reason, 00:22:46.560 |
it can't take criticism. Well, it seems like on some level, it's unaware of the output that it's 00:22:51.200 |
created. Again, yeah, a human creating this in Photoshop would have a lot of layers. And every 00:22:57.200 |
object in that image would have its own layer. And you could have a layer for text and just 00:23:02.320 |
quickly modify the text. When I want to do something as simple as edit the text or change 00:23:08.000 |
the font, again, it would give me a completely different image. And I just wonder about these 00:23:12.880 |
videos, if you want to create a movie, I think you're probably going to be stuck with like the 00:23:17.680 |
what they give you. And then maybe you can import that at some point into an Unreal Engine and then 00:23:23.440 |
do work from there. The editing part, it doesn't exist yet. When you do text, it's easy because 00:23:28.800 |
you just cut and paste it into your Grammarly or your Notion or whatever you use. 00:23:32.960 |
And you start editing here, there's no exporting, like you're saying to Final Cut Pro or something. 00:23:38.720 |
I bet that when they turn this into an API, that when you're working in Unreal Engine, 00:23:43.520 |
you know, if you're like a VR designer, or something like that, or you're like a 00:23:47.520 |
filmmaker, you could probably just get started with a prompt. And that could be 00:23:52.480 |
hugely time saving if they can make that integration work. I think that's pretty 00:23:59.120 |
interesting. I looked at this a lot over the last year, this was one of the biggest challenges is in 00:24:03.440 |
order to have an interactive approach to video production, or video game production, you have 00:24:10.640 |
to be able to say, we'll take that out, put that in. But like, like you said, the model doesn't 00:24:15.120 |
distinguish the objects, it doesn't distinguish each of those little flowers, or each of those 00:24:20.240 |
little cherry blossoms, or doesn't even distinguish if there's two people walking down the road, 00:24:24.400 |
the model doesn't have this concept of objects when it renders. And so the traditional model, 00:24:30.400 |
where you have objects in a 3d model that then get rendered into a video, allows you to do those 00:24:36.960 |
sorts of things. So what may end up happening is a fusion. And this is something we talked about 00:24:42.320 |
with Stephen Wolfram and others, that there are these models that are called compute models that 00:24:46.400 |
do deterministic compute tasks, then there are these AI trained models where we don't really 00:24:54.240 |
have a sense of how they work. But the fusion of the two will unlock a lot of opportunities. And 00:24:59.280 |
that's very likely the next generation of models is not just purely non human intervene trained 00:25:05.360 |
models, but the integration of those models with these deterministic models with these compute 00:25:10.000 |
models. And then you can start to do things like instead of saying, hey, render this whole scene, 00:25:15.280 |
you're saying render all the objects that would go in the scene, they get placed in an Unreal 00:25:19.040 |
Engine. And then the Unreal Engine outputs the video. So that integration work, I think, 00:25:24.240 |
is the next big phase in AI model development. And that'll unlock like insane. Yeah, 00:25:29.840 |
because right now, you can't use this to create dialogue, like actors in a movie scene talking 00:25:36.080 |
to each other and acting, especially like close ups of those actors with their facial expressions. 00:25:41.520 |
I don't at least, I don't see any examples like that. The way humans want to have a creative 00:25:45.840 |
process. It doesn't work. That's right. Yeah, it'll get there though. And if you put something 00:25:50.400 |
like this out there to the public and said, it's 20 bucks a month to do this, or 50 bucks a month, 00:25:55.120 |
you can have a lot of consumers who want to buy this product. So when I see this, I think people 00:26:00.400 |
are going to pay for this. And then going back to Nvidia or arm, okay, more infrastructure to your 00:26:07.040 |
point, freeberg is like, what infrastructure do we need for everybody to have Pixar on their laptop, 00:26:13.040 |
or on their, you know, phone, and people, people are willing to pay for these. I don't know if 00:26:17.600 |
you guys saw this, but there was another big breakthrough announced today with the new Gemini 00:26:21.840 |
Pro 1.5, can you just pull up Sundar's announcement on it? Yes, 00:26:25.040 |
Sundar now has found the password to his Twitter handle. He's tweeting. 00:26:29.760 |
Yeah, the reason I want to talk about this, if you just scroll down, so when you prompt an AI model, 00:26:35.360 |
you put in a certain number of words into like an LLM model today, and it spits words back out, 00:26:41.120 |
the more words you put into the model, the more compute intensive or challenging it is to do the 00:26:46.720 |
compute work needed to render the output. So OpenAI, their GPT-4 Turbo model has 128,000 tokens, 00:26:56.560 |
which means about 128,000 words can be fed into the model to get an answer back out. So you could 00:27:02.320 |
put like a whole essay or 100 pages of text, and it will read a document for you. You give it a whole 00:27:08.000 |
book, and it could give you a summary in a couple of seconds. What Google just announced with Gemini 00:27:12.320 |
1.5 Pro is a 1 million token context window up to 10 million token context window in total. 00:27:19.440 |
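(A quick, back-of-the-envelope way to think about those context window sizes. The 4-characters-per-token figure below is a common rule of thumb, not an exact tokenizer, and the window sizes are the ones mentioned above. Part of why bigger windows are hard is that vanilla attention compares every token with every other token, so its cost grows roughly with the square of the window size.)

```python
# Rough estimate of whether a document fits in a model's context window.
def rough_token_count(text: str) -> int:
    return len(text) // 4          # ~4 characters per token is a rule of thumb

def fits(text: str, window_tokens: int) -> bool:
    return rough_token_count(text) <= window_tokens

doc = "word " * 100_000            # ~100k words of filler text (~500k characters)
print(rough_token_count(doc))      # ~125,000 tokens
print(fits(doc, 128_000))          # True: just squeezes into a 128k-token window
print(fits(doc, 1_000_000))        # True, with huge headroom in a 1M-token window
```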
I don't know what's going on here. Because the problem is the way these models are 00:27:24.000 |
kind of traditionally built, is that that creates a nonlinear scaling in compute needs 00:27:30.400 |
to be able to increase that context window like that. So there is some architectural work 00:27:35.280 |
that's gone on to allow these models to scale up in a way that we did not think possible when 00:27:40.320 |
these models first came out, which is crazy to think was only like 24 months ago, that they 00:27:44.400 |
started to get kind of looked at like this, maybe 18. So yeah. And so there's two kinds of things 00:27:49.840 |
that are happening. One is that these models are getting chopped up in a way that you can run small 00:27:53.600 |
models in a highly trained way locally, and you don't need a massive general purpose model. And 00:27:58.880 |
so that's going to make compute needs very small for very specific tasks. And the other one is that 00:28:03.120 |
there's other architecture that's happening in deploying the models that's allowing massive 00:28:07.120 |
context windows like this. So it totally changes how scalable they are with the same number of 00:28:11.680 |
chips or the same amount of compute or the same amount of power. So there are all these amazing 00:28:16.400 |
Moore's law effects happening right now, in AI technology development on the chip level on the 00:28:21.440 |
architectural level on the how they're being built and how they're being deployed. That completely 00:28:25.680 |
changes the concept that I think most folks had when these first came out two years ago and the 00:28:29.520 |
potential. So I think sky's still the limit. We don't really know how big this is going to get. 00:28:33.360 |
But all of these efficiency gains mean that we can do so much more with so much less. 00:28:37.680 |
And by the way, when you can do more with less, it doesn't mean you need less chips, 00:28:42.480 |
it means you need a lot more chips. Because the fact that you can do a lot more with much fewer 00:28:47.280 |
chips means everyone's going to want to do it now and more people can afford to do it. So 00:28:51.440 |
it unlocks the opportunity for compute. And so it really starts to rationalize the Nvidia 00:28:57.040 |
thesis that it could be worth $10 trillion, potentially, 00:29:01.360 |
for people in the audience to understand what this context window means. 00:29:05.760 |
No, no, no, no. I'm just I always like to just give an example or two after you explain it. 00:29:09.840 |
The context window. For example, you could upload the transcript of this podcast, right? 00:29:15.520 |
And there were only certain services like Claude, maybe we could put a PDF attachment, 00:29:19.760 |
boom, you put it in there and say, hey, summarize this using your language model and using this 00:29:23.440 |
content. Now you could be able to say, hey, take me every Mr. Beast video, take me, you know, take 00:29:30.000 |
the top 10,000 videos on YouTube, take the transcripts, summarize them for me give some 00:29:34.400 |
analysis on it. And that was only available to developers prior to these large context windows 00:29:41.040 |
being open. And that's going to just open up a massive amount of applications, right? 00:29:47.200 |
tomorrow, the results are actually not that much better. There's these third party service 00:29:50.640 |
benchmarks, Lombard is probably the most often referred to. But what you see, if you just look at, 00:29:55.200 |
for example, all the GPTs, even as they increase the number of parameters in a model, the quality 00:30:01.840 |
of that model doesn't necessarily increase with the rate of parameter increase. So you have this 00:30:06.800 |
thing of a little bit of diminishing return, which is more a problem of what we said before, which is 00:30:11.360 |
eventually when you're scouring the open internet doesn't matter how fast or how, 00:30:16.080 |
how big you scour the open internet, the open internet is a bounded space. And so 00:30:19.600 |
you get the same answer. So all of these things are, they're interesting marketing where, 00:30:26.080 |
but I would just encourage us all to keep in mind that the shift is not necessarily 00:30:33.440 |
these kinds of things. It's kind of like saying, you know, in a chip, like, Oh, wow, 00:30:38.720 |
we built this chip on three nanometer. Okay, what does it mean? 00:30:41.840 |
Well, no, I think the piece respectfully, you might be missing here is like, your proprietary 00:30:46.720 |
data might not be in that language model. So by having this context window, and you can 00:30:52.560 |
add something that you have on your hard drive, that's not how it works. So my point is, 00:30:56.320 |
the context window is not correlated to the quality of the model outputs. 00:31:00.560 |
Correct. But consumers and business users haven't been able to put their data into the models, 00:31:06.320 |
because that has nothing to do with the context window that existed when the context window was 00:31:09.920 |
smaller. That's what you think you do things like RAG with, right? 00:31:13.120 |
But most people don't have access to that. This is like consumers can now just take something 00:31:18.880 |
RAG is different than what this is. Okay. Like, this is... what you guys just spent all this 00:31:25.280 |
time talking about, is equivalent to saying I built this chip with on a seven nanometer process 00:31:30.560 |
or a three nanometer process. And what I'm saying is, it is technically more burdensome, 00:31:35.840 |
as Friedberg said, it's technically much more sophisticated and complicated. 00:31:40.320 |
But the, the translation to quality in the consumer is not what that means. So a three 00:31:49.680 |
nanometer chip is not two times better than a seven nanometer chip for the consumer. Similarly, 00:31:54.880 |
a context window that's five times bigger than the one before it doesn't make a model five times 00:32:00.960 |
better to the consumer. These are orthogonal ideas. It's not to say that it isn't an important 00:32:05.680 |
thing. But I would say that it's a reminder for me that we're in the early phases, 00:32:10.400 |
where there's a lot of hay that people can make. But it's important to look at these third party 00:32:18.560 |
benchmarks that try to sort through and sift through what matters and what doesn't matter. 00:32:23.600 |
And one of the interesting things that third party benchmark services have figured out 00:32:27.920 |
is that irrespective of the exponential growth of these things, like parameter size, 00:32:34.480 |
quality is only growing linearly and now sublinear. 00:32:37.440 |
So just an interesting point, I understand your point. Now, my point is, man, when you 00:32:44.720 |
are a consumer of this, and I've been talking to people trying to get their data into, you know, 00:32:50.720 |
the context window to be able to ask a question, or it's like, how do I put more in there? How do 00:32:54.160 |
I put more in there? And then these language models, because of the copyright issues, 00:33:00.080 |
they're not able to upload Blade Runner and then say, Hey, make me something. But, just 00:33:06.080 |
to give a very real example, I did a podcast around the Ray Kroc movie, The Founder, 00:33:11.840 |
that we were referencing earlier, I uploaded the screenplay, I found a PDF of the book, 00:33:16.880 |
and I found two other like essays about Ray Kroc, like HBS studies. And I said, give me 00:33:21.600 |
like a timeline of it. And I use something called Google Notebook, which has a big context window, 00:33:25.440 |
and you can upload all this stuff. And that's where I'm seeing consumers. And what these things 00:33:30.400 |
are capable of vis a vis the context window, it just unleashes in people's minds so many 00:33:36.400 |
applications that you can do. There are two examples today that caught my eye, that I think 00:33:41.440 |
are worth bringing up. The first one is from Meta. Meta published this paper today, which was like, 00:33:46.160 |
pretty crazy. And essentially, what they said is that, for the realm of the software worker, 00:33:52.320 |
the software engineer, what they basically built is this LLM tool that runs over Llama called Test 00:33:59.440 |
Gen. And it essentially does automated unit testing to such a high degree of accuracy. So 00:34:05.840 |
you know, in the software development, if you want to spit out a product, you have to go through this 00:34:09.760 |
thing where you have to functionally run unit tests and make sure the thing works as advertised. 00:34:14.960 |
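(For anyone who hasn't written one, this is the kind of small, mechanical test being talked about; the function and cases below are a generic illustration, not output from Meta's TestGen work.)

```python
import unittest

def clamp(value, low, high):
    """Function under test: keep a value within [low, high]."""
    return max(low, min(high, value))

class TestClamp(unittest.TestCase):
    def test_value_inside_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_value_below_range_is_raised_to_low(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_value_above_range_is_lowered_to_high(self):
        self.assertEqual(clamp(99, 0, 10), 10)

if __name__ == "__main__":
    unittest.main()
```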
And this paper essentially now allows you to traverse a code base, understand it, 00:34:19.520 |
generate these unit tests with high accuracy. So where is this valuable? Think of remember the 00:34:25.200 |
software bugs in like the Boeing 737 MAX issue, J-Cal, where it turned out that it was 00:34:30.720 |
just bad software. Tools like this now will be able to run unit tests where you will not 00:34:37.280 |
be able to get that plane out the door, unless all of these functional unit tests not 00:34:43.200 |
only have been written, but run and have been passed successfully. And it'll all be done by 00:34:48.080 |
bots, which means that you can't game that system. So that's, that's something today that 00:34:53.200 |
that is the last thing developers get to is writing the unit test, right? It's always like, which is 00:34:56.880 |
why there's all kinds of bugs and security holes in code. Now, I thought that that was a pretty 00:35:02.400 |
large breakthrough. And I think we're going to go through a bunch of fine tunes on this that make 00:35:05.920 |
it interesting. And then I saw this other thing, which caught my eye even more, which was this 00:35:11.440 |
thing that Nat Friedman tweeted out, which is this thing called magic dot dev. And what essentially 00:35:16.480 |
it is, is like a co worker. And what they are claiming, and I don't know if it's true, is that 00:35:23.120 |
now you can basically have this thing, understand your style of code, and actually be able to write 00:35:30.720 |
code that is just as good as the way you write code. So now it's not a co pilot, but effectively 00:35:35.360 |
a co worker. And this is the first real example of what I've seen of a claim that basically says, 00:35:41.360 |
hey, you know, we can now be an add on to your workforce. And this to me is just so gargantuan, 00:35:50.960 |
because like, if you look at the amount of opex inside of any company that sits inside of R&D, 00:35:57.440 |
if you look at the amount of stock based compensation, right, if you replace humans 00:36:02.800 |
with bots, which is what this is, you're going to crush those things. SBC at the limit goes to zero 00:36:08.960 |
because you don't have to give stock to a bot, right? Do you want to give stock to people? 00:36:12.880 |
And opex gets shrunk, because now you just pay a license to magic dot dev, or I don't know how 00:36:19.360 |
they're going to make money on this. But really, really, really fascinating stuff. It is scary. 00:36:25.120 |
There was another open source project. I think it's called MGIE. I don't know if you saw 00:36:30.800 |
that, Freeberg, but Apple has been working on an open source image project. And so you're kind of 00:36:37.680 |
going to see I think in iOS 18, or the next iPhone 16, they're going to have an image model on that 00:36:44.320 |
phone. Where is Apple in all of this? It's unbelievable. No, they're... where are they? 00:36:49.280 |
they are building an image model, and you're going to be able to talk to your phone, 00:36:54.000 |
anything, I think you're going to see a chip on the next set of mobile devices, 100% that will 00:37:01.760 |
run models locally. Yep. And so this goes back to the point about how the hell is the government 00:37:06.640 |
going to regulate AI models, which is the stupidest thing I've ever heard? Well, not the 00:37:10.640 |
stupidest, it's probably, you know, probably top 20. But the fact that they're going to like be able to 00:37:16.880 |
be fully vertically designed, that they can have the hardware, and the software and integrate all 00:37:23.520 |
the way through to the application layer, they're going to have, I would imagine incredible, 00:37:27.280 |
intuitive AI features built into all of their products that can run locally, rather than all 00:37:32.960 |
the other stuff that's going on where you've got to make a request on the internet and get the 00:37:35.760 |
results back over the internet. It'll be pretty powerful. But obviously, no data, right? We don't 00:37:41.360 |
know what they're doing. We're just speculating here. Well, no, I mean, you have breadcrumbs, 00:37:45.040 |
one of the breadcrumbs is, they have an image model, and I just dropped in the Hugging Face of 00:37:49.200 |
it, you can basically upload an image and just give it a text prompt of how to change the image. 00:37:52.720 |
So, you know, if Nick were to use this, he could upload a picture of you, Chamath, and say, you 00:37:57.920 |
know, give him five o'clock shadow, give him a mustache, whatever. And Nick, we do that real 00:38:02.880 |
quick. And while we're talking, while you're doing that, can you bring up the video of the 00:38:07.680 |
robot's life in a cyberpunk setting? It's beautiful. Yeah, this is Sora. So beautiful. So it's really 00:38:15.200 |
cool. So but here's the limitation. And this goes back to what me and Freeberg were talking about 00:38:20.000 |
earlier. Okay, so you've got these people in the background, let's say the feet are backwards, 00:38:25.680 |
by the way, you can see the human feet aren't correct, sort of like a DALL-E problem. What's 00:38:30.080 |
cool is that, okay, so those cars aren't moving, they're static, you got some movement of these 00:38:34.800 |
background people, right, at least so they're kind of they're the extras in a movie. But if 00:38:42.560 |
you wanted to tell Sora, this is my guess that, okay, I want the people in the background to be 00:38:48.480 |
doing this, it would render a whole different image, like your robot might change too. Right? 00:38:53.440 |
Well, Saks, this is where your context window matters. So if you repeat the statement, 00:38:58.160 |
instead of writing, and then iterating, if you just re prompted, and your prompt got longer and 00:39:04.080 |
longer and longer with the degrees of specificity you needed, that's the way you use these models 00:39:09.040 |
today. And to your point, what humans are looking for, from a UX perspective, is an iterative 00:39:13.920 |
approach to rendering, which isn't feasible in a world where this, 00:39:21.120 |
this is all a matrix, it's a three dimensional matrix, there's two dimensions of pixels. And 00:39:25.840 |
then the third dimension is the next image, the next image, the next image. And so that matrix 00:39:30.080 |
is re rendered from scratch, it doesn't have a concept of what are the objects that I'm rendering 00:39:36.240 |
it has a concept of a matrix of pixels. And so it will re render a matrix of pixels based on a 00:39:40.960 |
redefined prompt. But what you're describing would require a very different model architecture 00:39:45.600 |
than the way all of these trained models operate today. This is going to revolutionize pornography, 00:39:50.960 |
I think it's the first place. No, no, I'm not saying it as a joke. I think that's where you're 00:39:54.160 |
going to see this first, because then all these issues, potential issues of underage people or 00:39:59.200 |
exploited folks, it goes completely away, you're gonna cut this. No, and you know, it'll kill Only 00:40:07.440 |
Fans. Yeah, it's where every revolution starts. That's where the web started. That's where the 00:40:14.880 |
web started. That's where payments got refined. But I think that in pornography, you're going to 00:40:20.320 |
see this basically destroy pornography and OnlyFans. I don't want to bring up something that's 00:40:26.240 |
super offensive. But there was a certain pop music star who had a bunch of images just a couple of 00:40:31.440 |
weeks ago. That were generated. Yeah, deep fakes, right? Yeah, there were a bunch of deep fakes of 00:40:37.760 |
a pop star. I don't like actually highlight the person's name, because then it just, I think, 00:40:41.600 |
consents more of it. But X did a really good job of blocking all of them watermarking them and 00:40:47.440 |
saying you can't do this. And so this is like a whole new world. You can take any person on the 00:40:51.840 |
planet and make porn out of them. That is realistic. I don't mean that. I just mean that 00:40:56.560 |
you'll be able to make pornography in a way that is totally Yeah, non human involved, which is 00:41:03.840 |
generally better, I would say for all humans involved. What's great about the deep fakes is 00:41:09.200 |
that if you have a sex tape come out, you can just say it's a deep fake. Totally. I was about 00:41:12.560 |
to ask you, what is the prompt you're going to use to make love to yourself tonight? With the 00:41:17.680 |
new Sora model? Chamath embraces Chamath. All I hope is I may just need to publish some data so 00:41:24.560 |
that whatever deep fake gets some parts accurate. This is Apple. Oh, short Apple stock. This is 00:41:30.400 |
it's not very good. Yeah, what is this model? This is an Apple model. It's called MGIE. 00:41:36.880 |
This is Apple's open source image model. It's way behind, just like GPT 0.1. They've got work 00:41:42.400 |
to do but it's open source. It's gonna go faster. It's very hard to make fine tune changes in these 00:41:46.480 |
models or to get the result that you want. But that's fine. This is just a this is v 0.9. So 00:41:52.320 |
if you guys want to see a great fine tune, the Juggernaut XL fine tune, 00:41:56.240 |
oh, I would just say try it. It is a fine tune that does not allow bad looking people. 00:42:04.000 |
The whole concept is only beautiful people. So it doesn't matter what your prompt is. 00:42:09.840 |
But these people come out as the most incredibly good looking people I've ever seen, 00:42:14.480 |
as if the standard wasn't already high enough with filters and everything else. But 00:42:19.040 |
this is gonna be revolutionary for independent film. I mean, this is obviously just tip of the 00:42:22.240 |
iceberg. But you know, the problem with independent film right now is that the economic model literally 00:42:26.240 |
doesn't work. It's too expensive to make an independent movie relative to the return. And so 00:42:33.280 |
the independent movie business is almost dried up almost completely. This is going to revolutionize 00:42:39.360 |
it because it'll bring the cost down close to zero. Now I don't I don't agree with Friberg 00:42:44.000 |
that everyone's going to design their own entertainment because most people are lazy. 00:42:48.160 |
No, I'm not saying design, Sacks. The way you think about creative production processes is what's 00:42:52.720 |
going to change, it's all going to reduce down to sitting down. And it's like when you ask your 00:42:56.960 |
spouse or partner or whatever, what do you want to watch tonight, and they answer something, 00:43:01.200 |
that something will then be delivered to you instantly. That's what's going to be so powerful. 00:43:05.440 |
Well, you mean the AI is going to create it on the spot for you? 00:43:08.640 |
It certainly will. But I think that there's a certain element of culture 00:43:12.240 |
that gets integrated into personal. I mean, I don't think we have time for this. But I'd like 00:43:17.840 |
a whole philosophy. People still want curation. I mean, if you look at something like YouTube, 00:43:21.920 |
or really any user generated content site, there are roughly 100 times more people who are passive 00:43:29.120 |
than people who are active and actually create stuff, whatever the ratio is, there's always a 00:43:33.280 |
lot more passive viewers than there are creators. So there's, I'd say most of the content will be 00:43:39.920 |
created by people who actually care enough to do it. Again, let's call it one or 2% of 00:43:43.920 |
the population. And then of those 1% might actually be really good at it. 00:43:48.720 |
Yeah, but here's where freeberg I think has a really good point, because I have been looking 00:43:52.880 |
into this. Imagine if you took The Sopranos, something you and I love, Sacks. And you just 00:43:58.880 |
said, hey, for each season, give me 10 more minutes per episode and give me like, two characters or 00:44:04.480 |
whatever. Now when you watch it the next time, you could say just fill it in. I want every episode 00:44:08.720 |
to be 90 minutes instead of 50 minutes. Or I've been doing this thing with my daughters, we're into 00:44:14.720 |
Star Wars and the Clone Wars, etc. People have been cutting shows like Andor, or Rogue One, 00:44:22.480 |
Clone Wars, the prequels, etc. And the Bad Batch, and they're putting in together like timelines, 00:44:28.560 |
and they're making their new versions of it by super cutting the different series that 00:44:33.920 |
are filling things in. So you're going to be able to say, hey, tell me the whole Star Wars saga, 00:44:38.720 |
with a little more perspective from Obi Wan, or give me a little more perspective from 00:44:43.760 |
this person. And that's going to just make these things like these franchises so much more 00:44:48.160 |
watchable and deep and rich. Yeah, so this is my point sex is I don't think it's about 00:44:53.120 |
ground up personal media creation. I think that it's about the curation model changes, 00:44:59.200 |
the creator or the central creative force comes up with a genre with a bunch of characters like 00:45:05.840 |
a Star Wars, and general like arcs of stories, and then you can go in and you can view it from 00:45:11.360 |
any character's point of view. Imagine watching the Star Wars movie from Darth Vader's point of 00:45:16.000 |
view. Imagine watching the Star Wars movie, where you just stayed on Endor the whole time, 00:45:22.560 |
and you watch the news on the TV, while you're hanging out, your fantasy, you know, with 00:45:27.520 |
the Ewoks. So you're right that there's a great opportunity for some of these cinematic universes 00:45:33.200 |
to kind of open up the creative assets and let the fan base do whatever they want with them 00:45:37.680 |
or create you know, these alternatives. I still think that it's that 1% of the viewership that 00:45:44.640 |
actually cares enough to do that. And the reason is because that I think what you see with all these 00:45:51.200 |
demos is that it's really easy to get the first 90%, it's really hard to get the next 9%, and then 00:45:57.680 |
the final 1%, you know, to get to the level of... Of course. Yeah, to get it dialed in to the point 00:46:04.080 |
where... Again, the first thing it gives you... But the benefits could outweigh those costs. And this is 00:46:08.800 |
true in 3d video game playing. I don't know if you play Fortnite, but there's all sorts of stuff 00:46:13.120 |
that you feel could be better. But the overall value of that experience far outweighs the missing 00:46:18.560 |
elements of it not being truly photorealistic. And I think that's where this stuff goes. 00:46:23.360 |
It's ski week next week. And I've done this for both of my older boys. And I'm doing it now 00:46:31.280 |
for my youngest boy, who is four and a half, almost five. We are going to watch this entire week, 00:46:37.600 |
all of the Star Wars. Oh, fantastic. And then I got a downloaded history feature of all the movies. 00:46:44.720 |
Yeah, from, and I got an order. The order was not what I thought. It's not as released, obviously. 00:46:49.680 |
You want to do timeline. So you want to have the prequels. 00:46:53.840 |
I did that once. And I will tell you, it was super disappointing. 00:46:56.800 |
Rogue One, the Star Wars, the original sequel. 00:47:00.320 |
Well, you're saying this order is not good. Here's what I was told. This is what was the 00:47:03.600 |
internet was the most popular. You start with the Phantom Menace. Yeah, then Attack of the Clones, 00:47:10.320 |
then Revenge of the Sith. Then Solo is optional. Rogue One, optional. 00:47:18.480 |
Then Four, Five, and Six, obviously, A New Hope. 00:47:20.720 |
That's when your kids will lose their mind. They'll be like, what is this? 00:47:23.120 |
It's like Empire Strikes Back, Return of the Jedi. 00:47:27.120 |
Yeah, that's the problem. Seven, The Force Awakens. And eight, The Last Jedi. Then nine, 00:47:35.760 |
They fall asleep during Four, Five, Six. God, Four, Five, Six are the best. 00:47:40.880 |
In retrospect, they're great, but there's also The Clone Wars, 00:47:44.880 |
which is absolutely fantastic. I would sprinkle in some Clone Wars episodes. 00:47:48.880 |
Nat and my daughters, I'm forcing them to sit with us for this entire- 00:47:54.480 |
No, she's ready to shoot herself in the face. 00:47:58.720 |
I put the over-under on Nat sticking through maybe 100 minutes of this, and she's done. 00:48:04.320 |
Amore, why do you poison the kids with this nerd stuff? They're never going to get a girlfriend. 00:48:09.440 |
You told me this is Federico Fellini. This is not Fellini. This is George Lucas. 00:48:14.000 |
How about that they watch something where the people make a love instead of the lightsaber 00:48:20.400 |
It's a, no, there's no love and joy. There's people with light swords. It's no fun. 00:48:26.480 |
But to bridge the gap between your two visions, Sacks and Freeberg. 00:48:34.320 |
Chamath and I will talk, when we talk to each other, for literally 20 minutes as Nat. 00:48:41.360 |
We'll just have a conversation, but it's a conversation we're having. 00:48:44.400 |
I don't know why, Chamath, this deal has such a high interest rate on the convertible note. 00:48:51.200 |
Why would you do that to me? I'm your friend. 00:48:53.280 |
Who's going to be on the board? This person looks like a dumb-dumb. 00:48:59.360 |
Well, it's an independent seat, Chamath. You don't need to have a super smart person. 00:49:04.320 |
This product is the opposite of a white truffle. The white truffle is delicious. 00:49:10.880 |
But anyway, between Sacks and Freeberg, we're literally just talking. 00:49:17.040 |
Hey, when are you coming tonight? Are you making it for dinner or not? 00:49:20.960 |
I'm going to try to make it by dinner. I just have to give a keynote at a podcasting conference. 00:49:28.880 |
No, I'm not kidding. You guys banned me from having advertising here. I have to make cheddar. 00:49:34.160 |
I got bills to pay. Just because there's an eight in front of the Uber stock price doesn't 00:49:38.480 |
mean I can retire. Will you be there by 7pm or not? 00:49:42.720 |
I'm going to make it by 7pm. Yes, keep me in for whatever. 00:49:45.440 |
So we are going to start dinner when you get there. 00:49:49.040 |
Perfect. I will not do David Sacks time. Okay. 00:49:52.400 |
But between Freeberg and Dave, you know, Sacks, you said it's like one or two percent of the 00:49:56.800 |
audience for, you know, show with a million viewers or 10 million viewers, that could 00:50:01.840 |
be 10s of 1000s of lunatics. And I don't know if you knew this, but you remember when Luke 00:50:06.320 |
Skywalker showed up, spoiler alert in the Mandalorian, everybody was like, oh, his young 00:50:10.880 |
face does not escape the uncanny valley. This is why you got to watch the Mandalorian. 00:50:14.960 |
So then some kid did his own AI re rendering of Luke Skywalker, you know, after Return of the 00:50:21.680 |
Jedi. But you know, after he defeats the Emperor, and they then hired that kid to work at Disney 00:50:29.680 |
and LucasArts. So what's happening is all these people who are making content for the 00:50:33.920 |
Star Wars universe, they are going to be the next generation of creators. So between a 00:50:38.240 |
Mr. Beast way over here, and LucasArts, there's going to be a new fill-in. And it's going 00:50:44.080 |
to be all these great one percenters making interesting content with IP that's owned by other 00:50:49.600 |
organizations. I think it's a whole new paradigm for media and content. It's going to be 00:50:54.080 |
automated stuff that's customized for you. It's almost like what YouTube and Instagram 00:50:58.800 |
did. For user generated content, there's going to be this whole new model of AI generated 00:51:05.280 |
content, the real winners are going to be the technology platforms that bring these 00:51:10.080 |
tools in a simple, intuitive way that, to Sacks's point, doesn't challenge the user 00:51:14.400 |
to do work, but makes their life feel easier and better, and brings them a more rich experience. 00:51:19.440 |
than they're getting today from any other media source. And that's going to be a game changer. 00:51:22.480 |
All right, lots going on in AI. I thought I'd bring up a really interesting 00:51:26.080 |
story that happened just around stock options and employees and advice you're going to get, 00:51:33.840 |
generally speaking, just tons of chaos at the startup Bolt. If you don't remember, 00:51:38.320 |
that's a one click checkout startup. It was founded by a guy named Ryan Breslow. 00:51:42.240 |
He's been on my other podcast this week in startups. He went viral in 2022. After he 00:51:47.200 |
called YC and Stripe like mob bosses and had this whole conspiracy theory of how they were 00:51:51.440 |
trying to kill this company. Don't know if it's true or not. But at its peak, 00:51:56.000 |
Bolt was valued at $11 billion, Chamath, and they're buying back shares at a $300 million valuation, 00:52:02.640 |
down 97%, yada yada. What makes this interesting was that in 2022, Breslow announced a new stock 00:52:09.120 |
option program for his employees on Twitter. And he did this in a very bombastic, lecturing 00:52:14.320 |
kind of way to the entire industry saying that Bolt would offer loans to any employee to exercise 00:52:19.920 |
their options early. We all know about these types of loans. From their balance sheet? 00:52:24.000 |
That is the idea here. Yes. And he called it the most employee friendly plan possible 00:52:29.200 |
and said that half of Bolt's 600 plus employees took the deal. 00:52:34.160 |
He since deleted the entire thread, but at the time, a lot of VCs who'd been through this before, 00:52:41.280 |
you know, explained to Ryan it's a terrible idea. Here's GGV's Jeff Richards: for what it's worth, 00:52:46.400 |
almost every private company in Silicon Valley did this in the 90s. It was an absolute disaster. Employees 00:52:50.080 |
spent years paying back loans for worthless stock, tax bills for merely exercising, etc., yada yada. 00:52:55.520 |
And his now deleted tweet was over half our employees chose this. Plus, I would strongly 00:53:01.120 |
encourage my family and friends to choose this. But VCs say it didn't work in the 90s. So it's 00:53:05.120 |
a disaster. VC Twitter pumps the tweet. This is why VC-run companies are never able to make strides 00:53:10.720 |
for employees. Obviously, all those options at 11 billion are worthless now. 00:53:14.960 |
But sorry, they did this when they were worth $300 million? 00:53:17.760 |
11 billion. It was reported that at the same time in a new industry publication that Breslow 00:53:25.040 |
was selling his shares in secondary. Breslow confirmed with that newsletter that he 00:53:31.360 |
had sold $10 million. And sources, anonymous sources, yada yada, take that for what it's 00:53:39.200 |
worth, said he didn't tell the team he was doing that. Two investors also accused the company of 00:53:44.400 |
misleading shareholders and raising capital on inflated metrics. SEC investigated but did not 00:53:50.000 |
recommend any action. Sacks, Chamath, I don't know if either of you want to take this. This is really 00:53:54.480 |
mechanical. But it does speak to what happens when things implode. Should employees be taking 00:53:59.520 |
loans to buy these kinds of stocks? At a minimum, hopefully these loans are non-recourse. 00:54:06.880 |
And at the maximum, if they were recourse, but it looked like these guys were doing 00:54:11.200 |
these kinds of shady things, they have to kind of forgive the loans and eat it. 00:54:14.320 |
There's a reason why almost no one does these types of exercise loans anymore. I mean, 00:54:18.080 |
whoever the VC was who pushed back was right. I mean, these things were common 20 something years 00:54:22.640 |
ago. People stopped doing it. And the reason why is that well, there are a couple of reasons. 00:54:27.680 |
So the mechanics of the loan are that the company loans you the money to pay down the exercise price, 00:54:35.280 |
the strike price for the options. And then you're supposed to pay the company back at some future 00:54:39.600 |
date, basically when you can sell the shares. And in effect, from the point of view of the 00:54:45.120 |
balance sheet of the company, this was cashless, because it would loan the exercise price to the 00:54:50.320 |
employee who would then exercise and pay that money back to the company. So everyone's like, 00:54:54.160 |
well, what's wrong with this? This just seems like simple mechanics where no one loses. 00:54:59.360 |
The employee gets to start the clock on their shares for long term capital gains. That's why 00:55:04.480 |
they would do it. The problem is that what people discovered is that if the company didn't work out, 00:55:11.280 |
and those shares became worthless, then the employee is still on the hook for the loan. 00:55:16.720 |
And if they don't pay the loan back, then there's a loan forgiveness issue. In other words, 00:55:22.640 |
the forgiven amount of the loan, or the defaulted amount of the loan becomes ordinary income. And 00:55:28.320 |
so what happens is you end up having to recognize as income, the loan that you never pay back. 00:55:33.120 |
So you can lose. Furthermore, if there's spread at the time that you exercise between the strike 00:55:40.240 |
price and the fair market value of the shares, then although if it's the right kind of option, 00:55:45.600 |
if it's an ISO, you can defer paying tax on that spread. However, people got whammied by the AMT: 00:55:51.520 |
that spread still counted for alternative minimum tax. So there are a lot of people during 00:55:56.480 |
the first dot com bubble who exercised thinking they can't lose because the market was always up 00:56:01.600 |
and to the right. And then the company would go bust, and they would get hit with a big AMT and 00:56:07.040 |
then they get hit with a loan forgiveness issue. So that's why everyone stopped doing these things. 00:56:11.120 |
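To put rough numbers on the trap Sacks is describing, here is a minimal sketch in Python. Every figure in it (the 100,000 shares, the $0.50 strike, the $4.00 fair market value, and the flat 28% AMT and 37% ordinary income rates) is a made-up assumption for illustration, not a number from the episode, and real AMT and loan-forgiveness treatment is more involved than two flat rates:

```python
# Hypothetical sketch of the exercise-loan downside described above.
# All numbers (shares, strike, FMV, tax rates) are made-up assumptions.

shares = 100_000
strike = 0.50            # exercise (strike) price per share
fmv_at_exercise = 4.00   # fair market value per share at exercise
amt_rate = 0.28          # simplified flat AMT rate on the ISO spread
ordinary_rate = 0.37     # simplified ordinary income rate on forgiven debt

loan = shares * strike                        # company lends the exercise cost
spread = shares * (fmv_at_exercise - strike)  # paper gain at exercise
amt_bill = spread * amt_rate                  # AMT can hit the spread with no sale

# If the company later fails, the shares go to zero, but:
tax_on_forgiven_loan = loan * ordinary_rate   # defaulted loan treated as income

print(f"Loan to exercise:            ${loan:,.0f}")
print(f"Spread at exercise:          ${spread:,.0f}")
print(f"Potential AMT bill:          ${amt_bill:,.0f}")
print(f"Tax if the loan is forgiven: ${tax_on_forgiven_loan:,.0f}")
# Worthless shares, yet the employee can owe AMT on the spread plus
# ordinary income tax on a loan they never really received in cash.
```

The shape of the outcome is the point: under these made-up numbers the shares go to zero, yet the employee could still owe roughly six figures between the AMT on the exercise spread and the ordinary income recognized on the forgiven loan.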
Yeah, I mean, really importantly, getting taxed on that spread, you owe that money, 00:56:15.840 |
no matter what, no matter what. The IRS demands that money and the 00:56:20.320 |
California Franchise Tax Board demands that money. So I just want to be very specific. 00:56:25.200 |
So people understand this, if the value of your company's gone up to $2 a share, 00:56:31.280 |
as a private company, and your options are 10 cents a share, you pay 10 cents a share to get 00:56:36.400 |
those shares, you owe tax on the $1.90 difference. And ultimately, if the company's 00:56:43.280 |
now worth less than $1 when you're finally able to sell, or it shuts down, you actually have money 00:56:49.200 |
you have to pay to the IRS, the Franchise Tax Board, and you did not make enough money to cover 00:56:52.560 |
the tax bill. The way a lot of these companies do this is they do a cashless early exercise or 00:56:56.400 |
cashless exercise loan. You don't even have to put money up to pay the 10 cents a share, 00:57:00.880 |
the company is just covering it for you. To Sacks's point, you end up getting taxed for that 00:57:07.040 |
later, if you don't pay it back. So the whole thing creates a lot of tax obligations. And it's 00:57:12.480 |
a very complicated structure and system. And the reason options exist is so that people could buy 00:57:18.400 |
their options and not have to pay taxes at the time that they receive them. Because if you just 00:57:22.880 |
get shares in a company rather than options, the value of those shares is your tax bill, 00:57:28.640 |
even if you can't sell the shares. Think about that. It's income. And so options were created as 00:57:33.360 |
a way that you would have a strike and the strike was the fair market value. So you're actually not 00:57:38.240 |
getting any value today. And so you can avoid the tax bill. But now it leads to all these other tax 00:57:42.960 |
hoops for rank-and-file employees is what it comes down to. Like, I saw this with, like, senior 00:57:48.640 |
executives trying to get like the top five positions, you know, a CFO or something, 00:57:52.480 |
the whole thing is to reduce your tax bill from 50% to 20%. And at the end of the day, 00:57:58.480 |
you open up all of these other problems. If you try and do that completely, 00:58:01.920 |
one of the stupidest things about the way this was used, is that they would give the exercise 00:58:07.600 |
loan to employees who are leaving the company. Because they said that, you know, normally, 00:58:12.240 |
when you're a departing employee from a company, you have three months to exercise your options, 00:58:17.920 |
or you lose them. So they were saying that this would be a solution to that, right, 00:58:22.080 |
we'll loan them the money to exercise. All they're doing is creating a huge tax problem for that 00:58:26.320 |
former employee. And you're right, the average employee is not sophisticated enough to navigate 00:58:32.000 |
all these issues. It's not clear also why you'd want to create this complicated benefit for 00:58:37.920 |
departing employees. Seems like if you're going to do this, you would want to do it for a key exec 00:58:43.360 |
who's joining the company. That's the only context in which I've seen it, by the way. 00:58:47.920 |
So they were trying to solve a problem using this complicated device that really backfired. 00:58:55.280 |
But it's not even a problem that you should try and solve in that way. I mean, 00:58:57.680 |
there's a much simpler solution, which is extend the period of time that employees have to exercise. 00:59:02.240 |
If you don't want them to lose their option, if they don't exercise within three months, 00:59:05.680 |
give them 10 years. And there are a bunch of companies that have 00:59:08.640 |
done that where they give you a lot longer to exercise the option. 00:59:12.480 |
The trend I'm seeing with startups, like early stage startups, seed stage startups, 00:59:16.160 |
which are really indicative of what startups will do later on, hiring people, cash basis, 00:59:21.760 |
paying them hourly outside the United States, not having to worry about stock options, 00:59:26.560 |
not having to worry about healthcare, just finding a global workforce and using these 00:59:32.320 |
third party tools. There's a lot of third party tools to manage employees in Portugal or Manila 00:59:37.200 |
or Canada or whatever, South America. And they're just saying, you know what, 00:59:41.520 |
stock options are going to be for like an elite group, the top 10% of the company. 00:59:45.680 |
And then we're going to automate the rest of the jobs with AI. And we're going to outsource and 00:59:49.840 |
have hourly workers or full time contractors outside the United States. It's the undeniable 00:59:57.520 |
That's throwing the baby out with the bathwater. One of the most magical things about startups in 01:00:01.360 |
Silicon Valley is that when it worked, everyone got rich. I mean, you have the story of the Google 01:00:05.360 |
chef who made $100 million or the secretary at Microsoft who got to retire. It, you know, 01:00:11.680 |
those are extreme examples, graffiti artists at Facebook. 01:00:17.440 |
But the fact of the matter is that Silicon Valley had a share the wealth mentality 01:00:21.360 |
where the rank and file employees did participate in the ownership of the company and they got 01:00:25.440 |
equity. And it was highly motivating or it is highly motivating for not just like lower level 01:00:32.000 |
employees, but mid level employees and senior employees to basically get equity in the company. 01:00:36.560 |
That's one of the things that makes Silicon Valley companies special or companies created 01:00:41.520 |
outside Silicon Valley, but using this methodology template. 01:00:45.600 |
Yeah. So I think getting rid of options altogether is a terrible idea. 01:00:49.040 |
No, what they're doing is more like for the core team that's in Silicon Valley 01:00:53.600 |
options, but then like for work that is not considered as essential, 01:00:57.360 |
you know, just hiring people offshore who don't. 01:01:00.160 |
Well, try to get those people then to work nights and weekends the way that you might 01:01:05.280 |
if you're an owner of the company; they're not going to have an ownership mentality. 01:01:07.760 |
Yeah, I agree. I mean, I guess you get nine-to-five workers that way. 01:01:12.000 |
And I think what people are starting to realize is like, maybe there's, 01:01:14.720 |
but there's remote workers. A lot of people are doing nine to five work. 01:01:18.240 |
And maybe that like hardworking culture just doesn't exist in Silicon Valley like it used to. 01:01:23.760 |
You may be able to find hardworking entrepreneurial like folks 01:01:27.760 |
that don't necessarily speak English that now with these AI tools, basically become some of 01:01:35.520 |
your best folks. But there are many countries that don't value equity, don't have an understanding of 01:01:40.480 |
it. And so to Jason's point, it's almost more of a complication to try to give it. 01:01:44.640 |
So that may be the balancing thing sacks that happens, which is that all these tools actually 01:01:50.400 |
allow somebody in some country that speaks some language that's not English, but who can be 01:01:57.200 |
completely committed, who becomes this linchpin of a business and gets paid an extremely great salary 01:02:03.200 |
in American terms that makes them rich in their country, get to the same place, maybe, 01:02:08.160 |
maybe, I don't know. The infrastructure needs to be built for stock options in other countries, 01:02:12.000 |
because I don't know that that exists. Or just pay people more cash, because a lot of folks 01:02:15.440 |
have different... Again, stock options make sense when you have a differential in taxation between ordinary income and capital gains. 01:02:24.560 |
But in other places where you have a flat tax or no income tax whatsoever, what's the difference? 01:02:31.920 |
Yeah. If you're in Dubai paying 1%, like, you can just skip it, right? 01:02:36.080 |
Yeah, skip it. You can always go back to a model where people vest shares rather than 01:02:42.960 |
vest options. And those shares vest on a trigger of time and a liquidity event. And that way, 01:02:49.280 |
they can't exercise, or rather they won't get taxed, prior to the liquidity event happening. 01:02:53.760 |
But then they will have to pay ordinary income tax rates on their shares. 01:02:58.080 |
Hold on, we're making it sound like exercising options is a bad thing. 01:03:01.280 |
It can be a very good thing. I mean, if you're an early employee of a startup, 01:03:05.760 |
and the options are worth pennies, and you can exercise for, I don't know, a few hundred dollars, 01:03:12.640 |
then that's a great tax benefit, because you start the clock ticking on long-term capital 01:03:17.120 |
gains treatment. And when you sell those shares, eventually, if there's a liquidity event, 01:03:20.960 |
you get taxed at a capital gains rate, not an ordinary income rate. That's the tax benefit. 01:03:26.880 |
That's why people want to be able to exercise their options in the first place. 01:03:31.200 |
The problem is that as you join the company later and later, the strike price of the options 01:03:36.720 |
goes up. And so therefore, if you want to exercise, you're going to have to write a 01:03:41.760 |
pretty big check. And so you're going to take a meaningful risk. Some people do that, and that's 01:03:45.360 |
fine. The trouble comes in when you try to get cute and go around taking that risk by saying, 01:03:51.040 |
"Okay, well, the employee doesn't have to go out of pocket for their exercise price. We're going 01:03:55.840 |
to loan them the money." Well, they thought that they had figured out a way to avoid taking that 01:03:59.840 |
risk. But as it turns out, all you end up doing is potentially incurring AMT and a loan forgiveness 01:04:05.920 |
issue. So there's no way around the fact that if you want to exercise your options to get 01:04:10.560 |
the potential tax benefit of a lower tax rate, you have to also take the risk of losing the 01:04:18.640 |
strike price, that check that you write into the company. That's a fair trade, right? If you want 01:04:24.400 |
the potential benefit, you take the potential risk. But that doesn't mean that you should 01:04:28.400 |
never exercise. There's a lot of cases. Again, I think if you're early in a company, you should 01:04:33.040 |
exercise. Yeah, low thousands of dollars, low hundreds of dollars, go for it. 01:04:36.160 |
There's all kinds of ways where when the basis is extremely low, 01:04:40.640 |
it's nominally not a lot of money. So you can take the risk. But when you start talking about 01:04:46.960 |
tens of thousands or hundreds of thousands of dollars or millions of dollars, and then using 01:04:51.120 |
loans and leverage and... You got the risk of ruin coming out. 01:04:53.760 |
Bad news bears all the way around. Yeah, no bueno. 01:04:57.040 |
All right, Sacks. Here's your red meat. Did you see the poll about Biden's age and 01:05:02.480 |
the unfavorable percentage of people who think he is too old? 01:05:09.040 |
I think the most important thing that we should talk about is 01:05:11.920 |
that President Biden has decided to not go through a cognitive test 01:05:16.720 |
as part of his annual physical. It's the first time it's ever happened. 01:05:20.640 |
Joe Biden now 81 years old, 82 at the start of his second term, if he wins, he'll be 86 at the end, 01:05:26.240 |
making him by far the oldest president ever. Trump is 77. He would be 78 and a half when he 01:05:32.720 |
gets his second term. If he won, 82.5 at the end of his term. Here's a histogram of age of presidents. 01:05:39.680 |
These are the oldest. Just to give you a... Everybody obviously refers to Reagan as the 01:05:44.320 |
oldest president prior to this. He was a spry 69 at the start of his presidency. And by the end, 01:05:50.960 |
he was 77, which is younger, obviously, than Biden and Trump today. So, Saks, thoughts? 01:06:01.360 |
Let's just say that we've had a bunch of these issues over the last few years. 01:06:05.280 |
President Trump didn't disclose his tax returns. Is that a huge issue? I think people made it to 01:06:14.720 |
be a huge issue, but it's more of an issue of judgment. But I don't think it fundamentally 01:06:19.600 |
compromises his ability to lead the country. You had a different issue, not with the president, 01:06:26.160 |
but with the current defense secretary, who had prostate cancer, the disclosures were, 01:06:32.800 |
let's just say, somewhat suboptimal. He had a UTI, he got emergency admitted to the hospital, 01:06:42.960 |
he just had another acute issue. And those disclosures weren't good. But you're talking 01:06:50.240 |
about an acute kind of a thing where if you're incapacitated, there's a chain of command that 01:06:54.400 |
immediately kicks in. Those are a class of issue that I think people can find things that are wrong 01:07:03.760 |
with, probably correlated to one's political affiliation. The other side probably cares more 01:07:10.320 |
than the side that the leader is on. But I think this issue is a really big one. And I posted 01:07:16.560 |
this clip of KJP basically saying nothing to see here, move on. And the first comment was Elon's, 01:07:22.720 |
which I think nailed it. He said a basic cognitive test should not be something that someone who 01:07:28.880 |
controls nuclear strikes is allowed to skip. And I think that that is part of the fundamental issue 01:07:35.520 |
that I have with what they decided to do here, which is that it just massively increases the 01:07:41.440 |
risk that you could have somebody that was fundamentally unelected, running and making 01:07:48.000 |
decisions. And I think that is a huge issue. And it's an enormous reason why for me, this was sort 01:07:54.720 |
of the straw that broke the camel's back and why I just absolutely cannot vote for Biden anymore. 01:08:00.960 |
I was trying to find there's okay, I was trying to find a lot of good reasons to support him. 01:08:07.600 |
Stability, normalcy, all of this stuff. But this is not what stability looks like. 01:08:14.800 |
And I go again, back to this very simple thing, a cognitive test for somebody that can control 01:08:21.520 |
the ability for us to go to war, or for us to completely eviscerate entire populations 01:08:27.920 |
through weaponry that he and he alone controls, should be something that is just mandatory. 01:08:34.320 |
You can't skip it. This is not a thing where it's a nice to have, I think it's a must have. 01:08:39.760 |
And in the absence of it, I think it compromises his ability to be a legitimate choice. 01:08:44.160 |
And we do have a minimum age, Sacks. And what you're saying is what the majority of Americans 01:08:50.320 |
are saying. ABC News ran a poll: 86% of Americans across both parties think Biden is too old to be 01:08:57.840 |
president. 59% said both Biden and Trump were too old. You said on a previous episode, 01:09:03.360 |
Sacks, that you thought the public should decide and that there doesn't need to be a cognitive 01:09:07.360 |
test. Are you sticking with that? Or do you think you're gonna maybe move to Chamath's position 01:09:11.040 |
and mine, cognitive test should be mandatory to be president? 01:09:14.400 |
Well, I think it's a tell that they won't take the cognitive test because they're worried that 01:09:19.840 |
he might fail. Just like it was a tell that they refused to do the Super Bowl interview, 01:09:25.040 |
which the president always does. That's the most softball interview that Biden could possibly have. 01:09:31.120 |
And they refused to do it, obviously, because his aides and advisors are nervous about 01:09:35.280 |
how that might go. We had the Hur report come out, where the special counsel who investigated the 01:09:41.520 |
classified documents case for Biden said that we can't prosecute this guy. He'll be too sympathetic 01:09:47.040 |
a defendant because he's manifestly senile and he'll come across as an old man who can't remember 01:09:52.000 |
what happened. So that's why they decided not to prosecute him. The Babylon Bee had a hilarious 01:09:56.320 |
headline or take on this, which was man deemed too senile to stand for trial, still fit to run 01:10:03.040 |
country. That's the situation we're in. He can't stand for trial, but he's still running the 01:10:06.720 |
country. Yeah. And the thing that was most damning about the Hur report was not just what the 01:10:14.800 |
special counsel wrote, but it was Biden's reaction to it. Apparently, he demanded that he wanted to 01:10:20.160 |
go out and do a press conference that very day to refute the idea that his memory was bad or that 01:10:27.040 |
he was senile. And he then did this press conference and it completely backfired. He came 01:10:32.480 |
across as kind of an old man shaking his fists, yelling, "Get off my lawn." I mean, that was 01:10:38.160 |
basically the vibe that he gave to the press. And then the thing that he got wrong was he confused 01:10:44.240 |
Egypt for Mexico in regards to the whole crisis in Gaza. So it would have been one thing if it 01:10:51.680 |
was just her saying that Biden was senile, but then Biden himself came out and seemed to confirm 01:10:58.960 |
it by making these mistakes and gaffes. So they know they have a problem. 01:11:03.840 |
So to my original question, would you be in favor, and will you change your position here, 01:11:08.720 |
that we should have cognitive tests for the presidency of the United States? 01:11:12.720 |
I think that the real issue is that Biden would fail such a test. I mean, that's the problem, 01:11:16.560 |
is that he is cognitively impaired. And the thing to realize about these sorts of conditions is 01:11:21.360 |
they're not static. There's a trajectory to them. So in the same way that Biden is 01:11:26.720 |
manifestly worse than he was a year ago, or three years ago, or 10 years ago. 01:11:34.960 |
Whatever this condition is, it's going to be worse. 01:11:37.600 |
Okay. I won't ask you a third time to answer the question, 01:11:40.320 |
but you're choosing not to answer the question. 01:11:42.320 |
So when you say, should the president take a cognitive test? Sure, he should take a cognitive test. 01:11:48.640 |
Should it be the law of the land? Just like we have a 35 year minimum age? I'm just asking you 01:11:54.640 |
You could make him take a cognitive test. But the point is that let's say the doctor says, 01:11:58.880 |
yeah, he's cognitively impaired. Is that gonna bar someone from running for office constitutionally? 01:12:04.560 |
No, that's not going to be part of the constitutional requirements. You have to 01:12:08.800 |
pass a constitutional amendment. Yes, would it be a good idea for someone to take a cognitive test? 01:12:13.840 |
Part of their physical to be president? Sure. But is that going to be a constitutional 01:12:17.360 |
requirement? No, you can't have some doctor deciding that. That's my whole point. 01:12:21.440 |
Like, you could take a physical. Bill Clinton did it, and people used to joke about his cholesterol 01:12:26.480 |
levels, and he would go for a jog and stop at a McDonald's. So I'm not saying by any means that 01:12:31.600 |
the results should bar you from the ability to run, it should still be up to the voters. 01:12:36.320 |
But just like we would make sure that acute or chronic health conditions were well understood, 01:12:41.840 |
right, the idea that this entire body of judgment, which is around your brain around the judgment of 01:12:48.560 |
the President of the United States, to basically say, No, we're good here, we can skip it. I think 01:12:54.080 |
that that's really dangerous. And it's part and parcel of a systematic set of behaviors now by the 01:13:01.920 |
Democratic Party, which I think is really not where they want to be. So the only person that's 01:13:07.280 |
really stood up to all of this is Dean Phillips. Yep. But he is being prevented and railroaded 01:13:12.640 |
from even being able to put that issue on the table so that the voters can decide my fundamental 01:13:17.440 |
view is the voter should decide, but all of these tests should be done and skipping them is a form 01:13:23.200 |
of a tell as David said, and I think that this is the most important tell that we've been given 01:13:29.920 |
about the amount of cognitive decline that he suffered. And Jon Stewart, by the way, 01:13:34.480 |
had the funniest version of this Jon Stewart's, essentially, monologue was like, hold on, 01:13:39.520 |
everybody is out here giving soundbites that say he is a chess grandmaster in private, 01:13:45.440 |
but then he shows up in public and it looks like he can't even play checkers. 01:13:48.400 |
Yeah, Freeberg. Let me jump to you there. You shared the clip with us. What were your 01:13:51.680 |
thoughts? What are your thoughts on this whole issue? Cognitive tests? I don't... I mean, 01:13:54.960 |
I don't think a cognitive test should be the law. Because you're starting to get into edge cases 01:13:59.200 |
where you'll create lots of different requirements, like what's the strategic 01:14:02.240 |
test? And, you know, what's the under pressure test. And there's obviously a lot of ways that 01:14:08.240 |
folks can game and challenge and create new systems for new criteria. The system of democracy, 01:14:15.040 |
however, is meant to solve these problems. The system of democracy is meant to allow individual 01:14:19.920 |
voters to say, we judge that someone is not equipped, or prove us wrong, which is what 01:14:26.400 |
the voters are saying and doing right now based on the polling data. The majority of voters in 01:14:31.200 |
the United States are saying we do not think that this individual is equipped to be president. 01:14:35.840 |
Rather than respond to that with a cognitive test, we are stuck with a single candidate. 01:14:42.960 |
And I think it speaks more to the problem of the Democratic Caucus and how the electoral system 01:14:47.920 |
works in the United States today, where the insiders get to decide who the nominee is for 01:14:53.600 |
the party. And those insiders are self-dealing; they are getting benefits by putting someone 01:14:59.360 |
in the seat for next term, because that person is in the seat this term, and is approving things 01:15:04.720 |
that they want this term. And so it feels very corrupt. And it feels very unfair and very 01:15:11.360 |
undemocratic. I think it leads to either a sense that the individual voters rise up and say, we 01:15:17.760 |
want a third party, because the way that this two party system works in the way that the caucus 01:15:21.840 |
system works, isn't fair. Or we end up seeing a really unfortunate outcome, where you have a 01:15:28.560 |
similar sort of like January six type event where folks are just fed up with the whole system and 01:15:32.480 |
don't trust it. And if the citizens lose faith in democracy, we're in peril. But as Jon 01:15:38.880 |
Stewart said in his monologue, it is the responsibility of the citizens of this country 01:15:42.800 |
to step up and not lose faith, but to take action and to resolve to a solution. 01:15:47.280 |
We have the power, we have the authority as voters. And if enough of us get together and say, 01:15:51.840 |
there should be a third party, or we do not want this individual to be the candidate, 01:15:56.560 |
something will change rather than just resolve to letting the insiders set the terms of the caucus. 01:16:01.440 |
A third party is seeming more and more viable, as we've talked about here. 01:16:05.920 |
People forget the case of Ross Perot getting 19%. 01:16:11.120 |
RFK is looking pretty favorable. And a third party candidate is something I think the two parties 01:16:17.760 |
dismiss all the time, but it is completely possible. So maybe this is the time that this 01:16:23.600 |
is the breakout season for it. As Sacks alluded to, the Hur report also came out. It was released. 01:16:30.480 |
Special Counsel Robert Hur, a Republican and Trump-appointed US attorney, declined to prosecute Biden 01:16:38.640 |
because he feared they couldn't reach the standard of beyond a reasonable doubt, as Sacks pointed out. 01:16:42.560 |
And I'll just quote from the report, we conclude that the evidence does not establish Mr. Biden's 01:16:47.840 |
guilt beyond a reasonable doubt. They also said he was an elderly man with a poor memory, 01:16:55.440 |
or that's how a jury would view Biden. Democrats felt this was partisan, obviously, 01:17:00.560 |
and then pointed out a bunch of times that Trump has misstated people's names or been confused. Is 01:17:09.200 |
there any correlation here between the two candidates in your mind? Chamath is Trump right 01:17:14.720 |
behind Biden in terms of this? And is he too old to run as well? And do you think the gaffes that 01:17:18.640 |
he's made are comparable? I think that he looks much younger and sprightlier than Biden does, 01:17:29.200 |
to be totally honest. Yeah, and look at the schedule he maintains. It's nothing like Biden's. 01:17:34.320 |
I mean, look, Democrats are trying to both sides the issue because it's their only choice. They're 01:17:38.880 |
stuck with a candidate who's in obvious decline. And the problem for Democrats wasn't just the 01:17:43.920 |
Hur report, which, like you said, J-Cal, they can dismiss as partisan because they control the 01:17:47.760 |
media. The problem is that Biden himself demanded to go out in front of a press conference to refute 01:17:53.760 |
the Hur report, and he ended up confirming it. And the people of the United States can just look 01:17:58.400 |
at Joe Biden, and every time he talks, no matter how much they try to hide him, when he does speak, 01:18:03.840 |
the American people are left with the impression that this is somebody who's in visible decline. 01:18:08.480 |
Trump, although he does misstate a word or two here and there, does not give that impression. 01:18:13.600 |
Trump maintains a vigorous schedule. Not only is he running a presidential campaign, he's in court 01:18:19.120 |
every other day fighting all this lawfare. He is willing to give interviews. In fact, 01:18:24.000 |
he barely turns down a press interview. And most importantly, he's willing to do a hostile 01:18:28.400 |
interview. He's willing to step into the lion's den. He's willing to go on CNN to those town halls 01:18:33.760 |
and basically battle. I think that's your best point. With these reporters. I think that's your 01:18:37.600 |
best point. Whereas Biden won't even sit down for the Super Bowl interview, which is as much of a 01:18:42.800 |
softball interview as you're ever going to get. Yeah. Yeah, no, no. I mean, that's your that's 01:18:46.400 |
your best point. I think who does a two or three, if you can't do a two or three hour interview, 01:18:52.720 |
I don't think you can be. Who do you think is going to run the country in the second term, 01:18:55.920 |
if Biden were to win? You know, that is the question. I think it's Biden. 01:18:59.680 |
No, the staff runs it. Listen, this will be this will be a regent president where the presidency 01:19:06.480 |
becomes an office instead of a person. And it's run by powerful ministers, like in some sort of 01:19:12.400 |
European system. That's what we'll get. Or what happened under Reagan. And the fact 01:19:16.320 |
of the matter is... Or, Sacks, as happened under Reagan. 01:19:19.440 |
Only in the last couple of years of Reagan. Exactly, when the senility 01:19:23.840 |
started to... Reagan really had Alzheimer's. And it really, I think, started to become a 01:19:28.640 |
factor in the last couple of years. It was the last two years. Yeah. So the first six 01:19:32.640 |
years were very strong. 25% of his presidency. Who was running the country is the question. Yeah. 01:19:37.440 |
It's a valid question. Do you think there's a difference Jake out between 25% and 50%? 01:19:41.840 |
No, no, I think it's like we only have I'm just looking for other corollaries where this happened 01:19:47.280 |
and what happened, and who was... I think that this isn't a corollary exactly, because when 01:19:52.960 |
Reagan ran for a second term, he was cognitively sharp. It was true that he was diagnosed with 01:19:59.200 |
early onset Alzheimer's in the second year of his second term. And then that was not properly 01:20:04.320 |
disclosed to Americans. That was just it was just the beginning in the last two years. If you look 01:20:08.560 |
at if you look at the 1984 campaign, the morning in America campaign, Reagan still did debates. 01:20:14.320 |
He was still sharp on his feet. He was debating Mondale. Remember? Yeah. He gave speeches all 01:20:20.640 |
the time. He maintained a vigorous schedule. There was no evidence of there being a problem. 01:20:24.800 |
My only point here, Chamath, is, like, does the public have a right to know when these things 01:20:24.800 |
happen? And it seems to have happened a couple of times in history. 01:20:31.520 |
Well, so I would take Friedberg's side. I'm not sure that the public... 01:20:34.160 |
you could say, irrespective of whether the public has a right to know, the public has a right to 01:20:39.680 |
judge. And I think in order to judge, you have to make yourself available so that you can actually 01:20:45.520 |
see it in front of your own eyes, or hear it, or listen to it, or watch it, or what have you. 01:20:49.920 |
And that's the second problem that I think we're starting to see, which is, it is so controlled, 01:20:55.440 |
how we get to interact with our leader. And I think that should give people some pause 01:21:03.840 |
so that they ask themselves, why would this happen? And I think that that's worth internalizing. 01:21:11.280 |
Friedberg, would you vote for a presidential candidate who does not debate, or who does not 01:21:18.320 |
subject himself to, you know, serious, strenuous, confrontational, adversarial interviews, 01:21:24.400 |
those two things, are those two things a requirement for you to vote for somebody? 01:21:27.440 |
I don't know how much that matters anymore. The problem with the two party system is you 01:21:32.560 |
vote for your party. No, I know. But to you, if like a Trump, Biden, they just refuse to do it, 01:21:38.720 |
because we're getting to the point where maybe they're just going to refuse to actually Biden 01:21:42.240 |
or Trump, Trump has refused to do any debates. So maybe they're just both going to refuse to 01:21:46.720 |
debate. Would you vote for a candidate? If they refuse to debate is my question to you, Friedberg. 01:21:51.280 |
I mean, I'm just in a very different mindset on this stuff. I'm not excited to vote for 01:21:58.560 |
a candidate that's not making their priority to fix the fiscal problem of the US. All right, 01:22:04.320 |
so you're back to that topic. Yeah, it's not that topic. It's like, for me, it's like, 01:22:08.000 |
there's a car driving towards a wall. And you've got your foot on the gas pedal. It's like, which 01:22:12.800 |
guy do you want to keep the foot on the gas pedal? Like, I don't care. Like, we got to take the foot 01:22:17.680 |
off the gas pedal. That's the point. You perceive there's a difference between the two on spending, 01:22:21.360 |
you're saying. Okay, so yeah, you're voting for a third party candidate that would 01:22:25.440 |
lean towards Chamath. Would you vote for somebody? Yeah, they can't win in the system. 01:22:29.040 |
Well, I mean, they can't win until they run, until they do. So we'll see. Chamath, would you vote for 01:22:36.240 |
a candidate who didn't debate or subject themselves to a rigorous adversarial interview? 01:22:44.960 |
Okay, Sacks, same question to you. Jason, you're acting as if we need to have all of these tests, 01:22:52.000 |
because we don't know whether Biden has a cognitive problem or not. 01:22:55.600 |
We are including Trump in this, I'm including Trump in this, like the debate issue. 01:22:58.640 |
I think we also know about Trump, Trump maintains a vigorous enough schedule, he takes enough 01:23:03.040 |
difficult interviews, he does enough speeches. 01:23:05.360 |
What about the debate part? That's what I'm curious about. If he refuses to debate, 01:23:09.360 |
would you still be willing to vote for him? You know, or any candidate? 01:23:13.040 |
You're acting like we need to judge this question based on proxies for the real issue when the real 01:23:18.320 |
issue is that we know one of these candidates has basically dementia. He's in visible decline. 01:23:25.200 |
I'm not asking you this about Biden. I'm asking you in general, 01:23:28.400 |
because we're talking about norms, one of the norms of, you know, releasing your taxes. I mean, 01:23:32.480 |
I'm not asking you about, but I think you're changing the subject from the core substantive 01:23:36.720 |
issue. We have already all agreed that Biden is in cognitive decline. I'm asking you another 01:23:41.840 |
question. I'm opening up the aperture as the moderator here. And I'm asking you to address 01:23:45.440 |
another question. I think candidates should debate, period. Okay. That's what I'm asking. 01:23:49.680 |
I don't know why you're going back and telling me what I think, because I think that the tactic of 01:23:54.000 |
Democrats, the mainstream media right now is that they can't, there's no way for them to paper over 01:23:59.280 |
or deny the visible reality of Biden's decline. So they're trying to muddy up the waters. They're 01:24:05.040 |
trying to claim well, Trump is old too, or that Trump refused to debate too. They're trying to 01:24:10.960 |
both-sides the issue. I think we all know that Biden has a problem that Trump does not have. Now, 01:24:17.920 |
I think that Chamath is also right, that there's a lot of Democrats who are comfortable voting for 01:24:22.720 |
a shadow government. They're fine with the presidency being a figurehead or a rubber stamp. 01:24:27.840 |
And I think the reason why they're fine with it is because Democrats basically control 01:24:32.160 |
most of our ruling institutions, really all of them. I don't think you can name one 01:24:35.920 |
major institution run by Republicans, and they don't need a president to do anything other than 01:24:41.920 |
be a rubber stamp. Republicans are looking for something different. Republicans need a president 01:24:46.160 |
who's willing to push back on the permanent bureaucracy in Washington, push back on the 01:24:52.560 |
deep state or the state department creating new wars. They want someone to push back on Wall 01:24:57.520 |
Street. They want someone to push back on all these institutions. Republicans are looking for 01:25:03.040 |
a Tribune of the Plebs, basically, to push back on this senatorial class that runs Washington for 01:25:09.280 |
their own benefit. Democrats are different. Democrats are the ruling elite, and they're fine 01:25:14.160 |
with the president basically just being a figurehead who will sign their legislation. 01:25:22.960 |
Does he drop out Chamath at this point? Is this the moment we're going to look back on and say, 01:25:26.880 |
this is the peak when that 86% poll comes out, and all this negative press? Do you think Biden is, 01:25:32.160 |
there's an exit ramp here? And do you think there'll be a different candidate fielded by 01:25:37.360 |
No, this is, this was their way of saying, we're going for it. 01:25:42.480 |
Okay. Freeberg, you have any thoughts on that? You think they'll make a switch? 01:25:46.240 |
On Biden? Yeah, I think there's a chance. There's a chance. Last week, I had a few 01:25:50.160 |
conversations that led me to think that there are not super influential people in the party, 01:25:54.480 |
but people that are in the party that, you know, have connections to people of influence in the 01:25:59.680 |
party. I'd love to hear Chamath's like, conversations you may have had around this stuff, 01:26:04.560 |
too. But people are not happy in the party. Yeah, very, like, very much unhappy. And everyone would 01:26:12.160 |
love to see someone else. Now the assumption is Gavin is going to be shoved in that seat. 01:26:16.880 |
And that's also a losing proposition. So if you shove Gavin in the seat, you're in a worse place 01:26:20.960 |
than you are with Biden in the seat with respect to your shot at winning. 01:26:23.600 |
And it's a little late now. But I still think that there's a chance. 01:26:28.240 |
Sachs, what do you think? Biden drops out? First of all, the only reason these people 01:26:31.520 |
are unhappy is because they think that Biden's problems have become so manifest that it might 01:26:35.440 |
cost them the election. That's what they're worried about. They don't care about the fact 01:26:39.680 |
that the Biden presidency will be run by staffers. As long as those staffers will basically do what 01:26:45.600 |
they want, sign their legislation, or veto the— But will he drop out is the question we're asking 01:26:49.360 |
you. Will he drop out? No, he's already said he won't. Look, the problem that people in the 01:26:53.840 |
Democratic Party have who have this concern about Biden losing the election because of this cognitive 01:27:00.880 |
issue is that there is no mechanism to replace a candidate who has won all the necessary primary 01:27:07.280 |
votes against his will. Biden has said he's going to be the nominee. He has no desire to drop out. 01:27:13.440 |
So unless he changes his mind, or there's some act of God, he's going to be the nominee. 01:27:18.400 |
The time for Democrats to make this change was a year ago. It required mounting a primary challenge 01:27:24.720 |
to Biden the way that Dean Phillips has done. The problem is that the Democratic Party elders and 01:27:30.320 |
the powerful people in the party said, "We're going with Biden." And so no one wanted to run 01:27:34.000 |
against him. And they basically set up the primary calendar so that Biden would win. And he is going 01:27:40.400 |
to win. He'll have all of the primary votes he needs locked up by the end of March. And 01:27:45.680 |
Dean Phillips is not gonna be able to basically stop him. So there is no mechanism to remove 01:27:50.320 |
Biden. He's going to be the candidate, you know, barring act of God, unless he can be persuaded 01:27:55.840 |
to step down. So it's not going to happen. What do you think about this new third party effort 01:28:00.960 |
in DC, No Labels? Have you heard about it? Have you been pitched it? 01:28:08.080 |
I hosted a dinner at my house with some folks about it. 01:28:12.320 |
Yeah, I mean, I think like, it could be interesting. I think that the 01:28:16.240 |
Is there room for a third party in the country, you think? 01:28:18.560 |
Yes. And I and I think that there is a very reasonable chance. And I would put reasonable 01:28:23.760 |
somewhere between 30 and 50% that this third party gets created after the presidential election. 01:28:36.000 |
And that is the one of the very likely outcomes of RFK candidacy. Because it is an infrastructure 01:28:44.320 |
that will have gotten on presidential ballots in 50 states. And it's an infrastructure that 01:28:52.000 |
if people are willing to fund, specifically No Labels and other people, I think the artifact 01:28:56.720 |
could be a third party. And if RFK can get 30 plus percent support, which in some states, 01:29:04.720 |
I think he will. I think that there's a decent chance that would be an enormous outcome. 01:29:10.160 |
Obviously, the most disruptive outcome for RFK would be that he won the presidency. But I'm 01:29:14.000 |
just saying that a very powerful outcome that has huge longitudinal benefits to America would be that 01:29:21.280 |
his candidacy essentially is the founding story of a third party. Yeah. And that was the 01:29:21.280 |
shame of what happened with Perot. I think that there was a chance that Perot could have done that. And 01:29:29.520 |
he must have been under just an enormous amount of pressure. I can't even imagine what he must 01:29:37.760 |
have dealt with. But that could have been the artifact back then, but it definitely can be 01:29:42.080 |
the artifact today. So we'll see. They haven't announced either. And there's no timeline for 01:29:49.520 |
them announcing they've asked to come on this program many times to just talk about the concept. 01:29:53.440 |
I think, like, they want to have, like, the executive director of No Labels on. Yeah, 01:29:59.040 |
they've asked. I don't love it. Yeah. And I was like, well, maybe when you pick a candidate, 01:30:03.040 |
that would be a better time than just talking about it in the abstract. I think it's a great 01:30:07.920 |
idea. I think that's what Americans want. So maybe elections in nine months, we're running out of 01:30:13.040 |
time. I mean, it's entirely too late to be talking about new candidates who aren't even in the race. 01:30:18.160 |
Not gonna happen. I mean, this whole Gavin Newsom thing is a fantasy. Even if Biden were to 01:30:23.440 |
see the light, and step down, be pressured to do that. Kamala Harris just said, I'm ready to serve. 01:30:28.960 |
She wants to be president. And even though she's not senile, she's actually even more unpopular 01:30:35.440 |
than Biden. So you have that huge problem, the Democratic Party, that even if somehow they could 01:30:40.160 |
prevail on Biden to step down, now you have Kamala Harris ready to serve. So how do you pass over 01:30:45.840 |
her? He's not stepping down. He didn't go through all of this craziness this past week to step down. 01:30:51.840 |
They didn't evade the cognitive test for him to step down. They're not doing all of that. They're 01:30:56.000 |
not keeping away from the media so that he can step down. None of that is happening. They're 01:31:00.640 |
not having people come out with the soundbites about how he's got mastery of everything in 01:31:05.520 |
private, so that he can step down. He's running. Donald Trump is running, and RFK is running. 01:31:12.400 |
Those will be the three people on the ballot in November of 2024. 01:31:17.680 |
Biden's staff are the absolute last people who would ever pressure him to step down. 01:31:21.760 |
Why? Because they'd be out of a job. I mean, from the staff's point of view, 01:31:26.640 |
having a weak president who basically they administer, 01:31:30.560 |
it's not just a job. It's like, you allocate the future as the President of the United States, 01:31:35.760 |
It's the best. It's the best situation possible for a staffer is to have a president who 01:31:39.680 |
is borderline incapacitated, and you get to just do whatever you want. You are the president. 01:31:45.200 |
Yeah, J-Cal, give us a rundown of the Tucker Putin thing. What do you think? 01:31:48.800 |
Just as a programming note, it came out, it dropped right after we taped, 01:31:51.920 |
we would have covered it. Obviously, we're not covering it for a reason. 01:31:54.320 |
I give him a B on it. Could have been better, but it wasn't bad, all things considered. 01:31:59.920 |
I don't think any journalist is ever going to be able to trap Putin or get good information 01:32:06.400 |
out of him or any of those things. And if you have somebody like Tucker, who is obviously very 01:32:11.200 |
sympathetic to Russia, and who believes that, you know, the West forced Putin to invade Ukraine, 01:32:17.280 |
you know, you're not going to get like a really great interview. I did think he had a great 01:32:21.360 |
moment where he asked for the Wall Street Journal reporter back, at that moment, to take him back with 01:32:26.640 |
them. He did a lot of softball questions. He asked 43 questions during the interview. 01:32:31.520 |
About 15 of them were kind of softballs, and he had about six fastballs, 01:32:36.640 |
but he lost control over the interview. Very early, obviously, if you watched it. 01:32:41.200 |
And that's just kind of like, you're like, Chris Collinsworth here. I'm not asking for 01:32:47.520 |
a technical analysis of the play. What did you think of the interview? What did you take away 01:32:52.480 |
from it? What did you think going in? Nothing? And was there a delta and a change going out? 01:32:57.040 |
Just a normal? No, because I've watched the other interviews. I watched Megyn Kelly's, 01:33:00.400 |
I watched NBC's. I did research; he's done about nine interviews in the last 10 01:33:05.760 |
years. So it's not like this was anything new. And when you watch the other ones, it's just Putin 01:33:11.600 |
spinning and pontificating and nothing gets accomplished. He doesn't answer hard questions. 01:33:15.920 |
So I had six questions, I said he should ask him, he did ask two of them. But he didn't ask the 01:33:21.920 |
other really hard questions I had. And so, you know, he's not, I wouldn't say Tucker is like 01:33:30.480 |
the useful idiot, like Hillary Clinton said, but he is biased. And he didn't ask enough hard 01:33:35.200 |
questions, I would have asked three or four harder questions. But you're not going to get 01:33:38.640 |
a lot out of Putin. This guy's like a KGB Stasi trained manipulator of media. It's his speciality. 01:33:44.800 |
I had two really interesting takeaways. The first was, if you had wanted to see some crazy madman, 01:33:52.960 |
you actually left that interview thinking, okay, this is a person that's leading a nation that has 01:33:59.280 |
a deep historical context, that goes back to basically the dawn of Christianity, of which he 01:34:07.120 |
has a pretty deep grasp. And it seems that his decisions are rooted in a multi-thousand-year history. 01:34:14.320 |
I didn't know that. And that was an interesting takeaway for me. So that was a broad strategic 01:34:19.200 |
takeaway. So he was a little professorial, he was a little historical. And all of that, to me, 01:34:26.480 |
made it harder for me to leave that interview thinking that he was a crazy person. In fact, 01:34:30.640 |
it's more like, he's a bit of a methodical thinker, I guess. The second is much more tactical. And I 01:34:38.640 |
thought super salacious. And I wish Tucker would have dug into it, which is that in 1999, when he 01:34:44.480 |
was meeting Bill Clinton, he asked Bill Clinton, can we join NATO? And Bill Clinton said to him, 01:34:52.000 |
yes, and let's go hash out the details over dinner. And then that night, they have dinner. 01:34:58.320 |
And Clinton says, my team says no. And to me, when I looked at the arc of that interview, 01:35:05.520 |
that was one of these really unique moments that explains so much to me, which is a country 01:35:13.040 |
that's birthing a democracy, where Putin, transitioning from Yeltsin, had to make a 01:35:20.480 |
lot of very important difficult decisions. The idea that he would have made an entreaty to 01:35:25.440 |
Clinton to say, can we join NATO? And then to be told yes, and then rebuffed, I suspect has an 01:35:32.720 |
underlying amount of emotion that explains a lot of what's happened since. And I think that that 01:35:38.400 |
should have been explored much, much more. I think they glossed over it, and they went on. But it was 01:35:42.560 |
a really important moment. Those are my two big takeaways from that interview. From that interview, 01:35:47.680 |
did you watch it, Friedberg? I don't understand. I think when he made that point, 01:35:52.800 |
he was talking in the theoretical, not in the sense of "I'd like to join NATO," because the intention of NATO 01:35:58.720 |
was to create a unified defence against the Russian armament, which was the most significant 01:36:05.360 |
besides the US's and, at that point, before China's rise, ahead of China's. So I think it doesn't make 01:36:13.120 |
sense to be like, hey, there's two sides of this defence system. Let's unify the two sides of the 01:36:19.040 |
defence system. It literally is a nonsensical process. Not true in 1999. That's totally not 01:36:25.360 |
true. What was the other side? By 1999, we'd torn this whole thing down. It's no longer the 01:36:31.760 |
Soviet Union, right? So what is Russia? We're going to democracy. It's probably just an 01:36:36.160 |
organisation that looks like any other organisation, like ASEAN. What's ASEAN? Well, 01:36:41.120 |
ASEAN is an organisation. Could they have military defences? Supposedly, but it's a grab bag 01:36:46.880 |
of countries. What's BRICS? It's another grab bag. This is a defence alliance. 01:36:52.000 |
This is an economic alliance. But in that moment, the justification for NATO was the least powerful 01:36:57.440 |
it had ever been. So in some ways, joining NATO would have been akin to dismantling NATO 01:37:02.480 |
and saying, we don't need this anymore. That's the point. Like, why would he say, 01:37:05.600 |
why would the US and the Western European allies say, let's dismantle NATO? That's effectively 01:37:10.640 |
what it would have meant, and the historical context of that moment is important. He did it at the moment when they 01:37:14.320 |
were transitioning to a democracy. I don't know. It's like saying, let's get rid of the defence 01:37:18.320 |
when Russia was transforming into a democracy. Well, when the USSR was breaking down and 01:37:22.240 |
transitioning to a democracy, how did that democracy go? What if there was massive 01:37:26.400 |
disarmament in Russia? The question is, what would have happened had they said yes, 01:37:30.640 |
and Clinton took a different path? All I'm saying is, guys, without prejudging it, that that was, 01:37:35.600 |
I think, a very important context that is not understood in history. I wouldn't dismiss it. 01:37:42.240 |
No, no, certainly, I think everybody raised their eyebrows, like, really, that happened? 01:37:45.360 |
What did you think of that moment? And also, Clinton, who is so steeped in history 01:37:50.160 |
and policy, is not somebody that would say something like that in an uncalculated way, 01:37:56.960 |
only to pull it back. That does not seem... and I know Bill Clinton, that is not who he is. 01:38:02.880 |
So just in that context, he is a very calculating person. So he would have said that, thinking, 01:38:07.760 |
wow, this makes a lot of sense on many dimensions. And again, the concept was, 01:38:12.480 |
it was kind of like, let's sit down for dinner and figure out the details. And then it's like, 01:38:16.800 |
my team says no. And of course, the team would say no, because it's like, well, 01:38:21.280 |
how are we gonna... So we haven't heard from Clinton, you know, confirming this or how 01:38:25.920 |
it was structured. This doesn't make sense. It sounds like it was theoretical. Like, 01:38:29.280 |
yeah, that was my interpretation. I just heard it and was kind of shocked; he's like, 01:38:33.520 |
let me look into that. That's crazy. What did you think of that moment, Sacks? And then overall? 01:38:39.200 |
Well, Jason, the fact that Putin sought to join NATO, or at least made that offer, 01:38:44.080 |
is a well-known, well-documented historical fact; this is definitely not the first time this has 01:38:48.320 |
been raised. More generally, I think what you heard in that interview, once you got past the 01:38:53.680 |
sort of the ancient history lecture, is that Putin, when he first came to power, sought cordial 01:39:00.240 |
relations with the US and with American presidents, and he was rebuffed. His attempt to join NATO was 01:39:07.200 |
rebuffed. More generally, the US reneged on its promise to Gorbachev not to expand NATO one inch 01:39:14.640 |
eastward. And instead, we began multiple waves of NATO expansion. And I think it's very important 01:39:20.800 |
to understand that you don't need to believe Putin on this or trust Putin. American diplomats were 01:39:27.440 |
saying at the time what the ramifications of this would be. And I just want to put two data points 01:39:33.200 |
in evidence. One is George Kennan, who was the father of the containment doctrine during the 01:39:38.960 |
Cold War, he was our most respected Cold War diplomat, said in the late 90s, when they were 01:39:44.240 |
first debating NATO expansion, he said that, quote, expanding NATO would be the most fateful 01:39:50.720 |
error of American policy in the entire post-Cold War era. So think about it. This is somebody who's 01:39:55.680 |
been around since the 1940s. He was our chief diplomat and defined our containment policy 01:40:01.760 |
towards the Soviet Union. And he said that expanding NATO would be the most fateful 01:40:06.480 |
error of American foreign policy. And he said that a decision like that would inflame the nationalist, 01:40:12.160 |
anti-Western and militaristic tendencies in Russian opinion, it would have an adverse effect 01:40:17.280 |
on the development of Russian democracy, and it would restore the atmosphere of the Cold War. 01:40:21.680 |
So this is somebody who really knew what he was talking about and warned us not to go down this 01:40:25.760 |
path. And then in 2008, the second data point is that Bill Burns, who was our ambassador to Moscow, 01:40:31.280 |
who is now Joe Biden's own CIA director, wrote a memo to Condoleezza Rice, who was then Secretary 01:40:38.560 |
of State, called "Nyet Means Nyet." And what he said in that memo is that NATO expansion to 01:40:44.800 |
Ukraine specifically is the brightest of all red lines for the entire Russian elite, not just Putin. 01:40:50.640 |
In other words, the idea of bringing Ukraine into NATO was an idea that was rejected not just by 01:40:57.040 |
Putin or by hardliners, but even by liberal reformers inside of Russia. And nevertheless, 01:41:03.120 |
we persisted in this policy. And so I think what you hear in that interview are a lot of 01:41:09.920 |
points by Putin around how he sought good relations with the West. But what was our 01:41:15.920 |
reaction to him? Pressure, pressure, pressure. We supported the Maidan coup in 2014. We tried 01:41:22.320 |
to expand NATO into his backyard on his most vulnerable border. And then finally, we supported 01:41:27.360 |
these hard-right, ultra-nationalist, neo-Nazi elements in Ukraine. Now, whether or not you 01:41:35.120 |
believe everything that Putin said, my takeaway is that this was a rational actor. This was not 01:41:41.520 |
a madman. This is somebody who we could have negotiated with. And in fact, there was still 01:41:46.400 |
an opportunity before this war to prevent the war by taking NATO expansion off the table. We refused 01:41:52.560 |
to do that. And even today, there was this new press story that in the wake of this interview, 01:41:58.320 |
the Biden administration still rejects entering negotiations with Russia, even though Putin 01:42:03.920 |
held out that olive branch in that interview. So, you know, again, it's just very sad how 01:42:10.800 |
we have conducted American-Russian relations over the last 25 years. We turned what could have been 01:42:18.640 |
an ally into an enemy, and proved George Kennan correct. I think this has been the most fateful 01:42:24.400 |
error of American policy in the last 30 years. All right. What an amazing episode this has 01:42:30.560 |
been. I think we talked about Tucker and Putin enough. Love you boys. 01:42:32.880 |
For the Sultan of Science, the Dictator, and the Rain Man, I am the world's greatest moderator. 01:43:06.160 |
That's my dog taking a notice in your driveway. 01:43:08.160 |
We should all just get a room and just have one big huge orgy because they're all just useless. 01:43:17.680 |
It's like this like sexual tension that they just need to release somehow.