E117: Did Stripe miss its window? Plus: VC market update, AI comes for SaaS, Trump's savvy move
Chapters
0:00 Bestie intro: Jason's Japan trip!
1:04 Stripe's precarious situation: Did it miss the window? Breaking down its $4B tax bill, slowing growth curve, enterprise vs SMB customers, scalability issues, and more
23:07 Lessons for founders: How ZIRP can skew CAC and LTV calculations, burn multiple
29:40 VC market update: ZIRP mistakes, VC as a "must-have" asset class for LPs, how the 2021 vintage can be saved
39:05 AI's outsized impact on SaaS and real-world businesses
55:16 Advice from Steve Jobs on customer-first product development, Section 230 update
60:29 Trump's savvy visit to East Palestine and 2024 strategy, Biden's visit to Ukraine, China's position
74:24 Tinfoil hat corner
83:25 Bestie wrap up!
00:00:00.000 |
Check out — what time is it over there? Well, we started at 00:00:03.120 |
8am. So now it's 8:28. It's 8:28. I'm going to be on the slopes 00:00:06.480 |
at 11. Yeah. So I'll be out there skiing. I'm in Niseko in 00:00:11.880 |
Japan. You just take a quick flight to Sapporo, and 00:00:15.360 |
then you drive two hours into the mountains. Yesterday, I cat 00:00:17.680 |
skied. There's an abandoned ski by the way, in honor of you, I 00:00:21.440 |
grabbed a Sapporo from the fridge today. Very nice. Yeah, 00:00:24.600 |
this week's episode brought to you by so they drive the cat 00:00:27.680 |
ski up, and then you ski down and it's all fresh track. So 00:00:31.120 |
it's literally an abandoned ski resort. You know, during the 00:00:34.400 |
financial crisis here, I just asked you what time it was. 00:00:38.840 |
It's called small talk. It's called banter. I thought you 00:00:44.040 |
might be interested in your besties life, but apparently 00:00:47.100 |
Let's get to the show. Everybody wants to hear the 00:01:05.740 |
show. A lot of news going on. And I you know, in our industry, 00:01:09.980 |
there's been a big discussion about RSUs and stock options, 00:01:13.460 |
both the cost of these things. And then there's another issue 00:01:16.980 |
of people staying private for too long. If you remember, for 00:01:21.220 |
folks listening, Airbnb, Uber famously took over 10 years to 00:01:24.380 |
go public. People like Bill Gurley wrote about this, hey, 00:01:27.580 |
you should get public. When the window is open, obviously, the 00:01:30.620 |
window is closed right now, or largely closed. Stripe — now 00:01:34.380 |
people are speculating they missed their window, they have a 00:01:36.780 |
$4 billion tax bill due to cover expiring employee RSUs. Those 00:01:42.900 |
are restricted stock units. And at the same time, Foursquare, 00:01:46.780 |
a company from the web 2.0 era — this is, you 00:01:50.940 |
know, 10-15 years ago, when they were a very popular check-in 00:01:53.860 |
software, mobile location app — they are going to let their 00:01:58.460 |
previous employees' stock option grants expire. According to The 00:02:02.260 |
Information, they issued these options in 2016 with a seven-year 00:02:05.660 |
window before expiration; more than 100 former employees will be 00:02:08.020 |
impacted. And some of them are the very early team members. And 00:02:12.300 |
the stock option problem is becoming acute because hey, 00:02:16.940 |
people waited to go public. Basically, what happens is you 00:02:19.820 |
grant an RSU, which is effectively W-2 income when 00:02:25.500 |
it's realized with an expiration date. But that expiration date 00:02:31.140 |
forces you to be public so that that RSU can be exchanged for 00:02:34.740 |
value. And that's like a 10 year window. So then these guys have 00:02:37.900 |
to go in and modify that date and push it out by another four 00:02:41.740 |
five, six years or whatever. That is a deemed event by the 00:02:46.300 |
IRS that then creates withholding tax issues, right? So 00:02:50.060 |
you then have to you then have to withhold tax on behalf of the 00:02:53.020 |
employees. And so that collective number is the $4 billion. 00:02:57.180 |
According to a leaked pitch deck, Stripe implied they needed 00:03:00.420 |
$2.3 billion in capital by the end of Q1 2023. They're working 00:03:04.100 |
with Goldman Sachs to raise a few billion at a $55 billion 00:03:07.940 |
valuation; that's down 42% from the peak of $95 billion in 00:03:11.620 |
2021. One wonders if they had gone public what the valuation would 00:03:15.220 |
be right now. Can we just say real quick why this matters? 00:03:17.740 |
J-Cal? Like, yes. So anyway, why does it matter? Yeah, why 00:03:20.940 |
does this? Why does this all matter? Like, why do we care? 00:03:24.020 |
I posted a link. This is a 2013 interview that Zuck did with 00:03:30.100 |
Michael Arrington of TechCrunch. And if you go all the 00:03:34.580 |
way back, the apprehension to go public was one thing that 00:03:41.940 |
we really anchor to a lot at Facebook in the early days. And 00:03:45.900 |
at the time, I don't know if you guys remember, but there was 00:03:48.220 |
these arcane laws around the number of shareholders that you 00:03:52.060 |
could have. And I think the issue specifically was that 00:03:55.380 |
after 500 shareholders, you have to publicly release your 00:03:59.060 |
financials. And so we did all kinds of things to make sure we 00:04:02.740 |
never hit the 500 cap. And we tried to push the IPO date as 00:04:06.940 |
far out as possible, because we thought that it would keep 00:04:09.700 |
people more focused. And then in 2010, or 11, I told this story 00:04:16.300 |
a couple times, one of the things that I was advocating for 00:04:19.620 |
pretty aggressively was trying to launch a mobile operating 00:04:24.540 |
system to compete with iOS and Android. And we had put together 00:04:27.980 |
all this work and brought in Intel and AT&T and all these 00:04:30.380 |
people. And it came down to the fact that we needed a couple 00:04:32.980 |
billion dollars to float this thing. And we didn't have that 00:04:36.700 |
money. So the only solution to that would have been to go 00:04:39.100 |
public, but it wasn't the right moment in time. And Zuck was 00:04:42.260 |
uncomfortable with it. A year after going public. One of the 00:04:45.340 |
things that he said publicly in this tech crunch thing was, wow, 00:04:47.620 |
I should have just gone public sooner. It wasn't nearly the bad 00:04:50.540 |
thing that I thought it was going to be. And when you look 00:04:53.100 |
subsequently at how much money they've spent in AR and VR, 00:04:56.540 |
spending half or a quarter of that cash could have given them the 00:05:00.660 |
chance to disrupt Android and iOS in 2010 and 11, which in 00:05:05.140 |
hindsight is obviously a no brainer bet, right? So even 00:05:09.220 |
though I think we at Facebook were the ones to really put this 00:05:12.380 |
in the water table about not going public, I think a lot of 00:05:17.100 |
startups should have gone back to first principles to really 00:05:19.500 |
question whether waiting as long as possible actually makes 00:05:23.340 |
sense. So I was curious about the stripe situation. So I asked 00:05:26.460 |
my team to do a little bit of work on how would you value this 00:05:30.340 |
thing if it were going public. And the interesting thing about 00:05:33.580 |
stripe is that it operates in a really transparent middleman 00:05:38.260 |
business. So what's interesting about stripe is that so many of 00:05:42.260 |
the people in the ecosystem are public. And so what that means 00:05:46.500 |
is you can build a pretty accurate mosaic of how well or 00:05:51.180 |
not well that business is doing by interpolating all the other 00:05:55.180 |
data from all of these other companies that are public and 00:05:57.660 |
are forced to report. And so there's like a couple of really 00:06:01.300 |
interesting things that jump off this page. And so the first 00:06:04.780 |
thing that we did was we looked at what is the future 00:06:06.740 |
profitability look like, ex growth. And what's interesting 00:06:12.540 |
is that you look at companies like visa and MasterCard that 00:06:15.660 |
are doing quite well and have done really well for a long 00:06:18.460 |
time. But you look at this outlier, Adyen, and Adyen is 00:06:22.780 |
probably the most obvious competitor to stripe. And the 00:06:28.060 |
thing that is demonstrated here is how incredibly profitable 00:06:33.500 |
this business is. And how much operating leverage they have, 00:06:38.460 |
which means that their op ex is relatively constrained. Because 00:06:42.500 |
it turns out — the x and y axis here, just so people who are listening can follow? 00:06:46.420 |
Sure. So if you take the market cap on the x axis and divided by 00:06:50.420 |
their sales estimate, you get a multiple of the enterprise value 00:06:53.620 |
to their sales. Got it. And if you look at the 2024 estimated 00:07:00.020 |
EBITDA margin that they're forecasting, ex of their long-term 00:07:05.580 |
sales CAGR, what you start to get a sense of is the operating 00:07:09.620 |
leverage that this business has. And so all of this basically 00:07:14.220 |
nets out to three interesting takeaways. When Stripe got 00:07:19.460 |
underwritten at $96 billion, it's this data point right here, 00:07:23.500 |
where, you know, you see "Stripe previous round." 00:07:25.860 |
Five x enterprise value, divided by 2024 — divided by 00:07:30.620 |
their long-term, their long-term EBITDA? Exactly — by their sales 00:07:33.940 |
estimate. And then if you look at the $55 billion valuation, 00:07:37.180 |
it's down. So what it looks like is happening is appropriately 00:07:40.540 |
so people are doing the right thing, which is they're re 00:07:44.140 |
rating the stock, right, by approximately 50-60%. But what's 00:07:50.100 |
interesting is not where they are in terms of where they used 00:07:55.060 |
to be. But the interesting thing is where they are relative to 00:07:58.340 |
their most obvious competitor, Adyen. So Nick, please bring up 00:08:01.340 |
the next one. So this is where things get really interesting 00:08:03.960 |
because we looked at what was Adyen's, and what was Stripe's, 00:08:08.380 |
GMV per employee a couple of years ago before all hell broke 00:08:14.540 |
loose in the private funding markets. And what you see is 00:08:17.620 |
they were pretty equivalent businesses. And they had roughly 00:08:20.980 |
the same amount of employees. But this crazy thing happened, 00:08:25.380 |
which is that if you look at the gray bar, this is the number of 00:08:28.620 |
employees that stripe has, it went crazy from a little over 00:08:33.220 |
2,000 to almost 8,000. So a 4x in employees in 24 months — they 00:08:40.220 |
added 6,000 people. Just pause for a second on that: 6,000 people 00:08:43.980 |
in 24 months, in 700 days or so. Right? Three people a day. And 00:08:48.740 |
if you do the same calculation for Adyen, it shows that they 00:08:52.500 |
grew a little bit less, by about 75%. And then if you look 00:08:56.860 |
at the growth of GMV, and you impute how productive is each 00:09:00.660 |
employee. Basically, this is the the story of what's happened to 00:09:05.780 |
Stripe and Adyen, which is that Adyen has found operating 00:09:08.420 |
leverage, right? So they've found and maintained incredible 00:09:12.780 |
profitability, and Stripe has added an enormous number of employees. 00:09:18.220 |
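[Editor's note: a minimal sketch of the GMV-per-employee comparison being described here. The headcount growth (roughly 2,000 to 8,000 for Stripe, about 75% growth for Adyen) comes from the conversation; the GMV figures are placeholders, since the chart's actual values aren't in the transcript.]

```python
# Illustrative only: headcount growth figures are from the conversation, GMV values are placeholders.
stripe = {"employees_then": 2_000, "employees_now": 8_000, "gmv_now": 800e9}
adyen = {"employees_then": 1_900, "employees_now": 3_300, "gmv_now": 800e9}

for name, co in (("Stripe", stripe), ("Adyen", adyen)):
    headcount_growth = co["employees_now"] / co["employees_then"]
    gmv_per_employee = co["gmv_now"] / co["employees_now"]
    print(f"{name}: headcount grew {headcount_growth:.1f}x, "
          f"GMV per employee ~${gmv_per_employee / 1e6:.0f}M")
# Roughly similar GMV spread over ~4x the people is the operating-leverage gap being described.
```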
Now, the question is why, right? So it turns out that 00:09:22.340 |
these guys at the top line are growing roughly the same except 00:09:25.980 |
Adyen actually takes meaningfully less on a per-transaction basis 00:09:31.020 |
than Stripe does. And the reason is that Adyen services these 00:09:34.460 |
large head customers, think big, bulky folks that have huge 00:09:39.380 |
amounts of transactions. And so as a result, have pricing power. 00:09:42.900 |
And stripe has some of those customers as well. In fact, they 00:09:45.380 |
just announced that they're going to process a large portion 00:09:48.780 |
of Amazon's payment volume. But what's happened at the same time 00:09:52.700 |
is that those kinds of deals aren't necessarily that 00:09:57.140 |
profitable. And so you have to hire a lot more people to build 00:10:01.740 |
a lot more features so that you can generate revenue from the 00:10:04.180 |
long tail of customers, all of these SMBs. And this is the tale 00:10:08.060 |
of these two companies, which is that stripe has some head 00:10:10.620 |
customers, but many, many, many tail customers. Adyen has mostly 00:10:15.220 |
head customers, fewer tail customers. And so the leverage 00:10:19.020 |
in the business is that Adyen has most of these employees in 00:10:22.700 |
Europe, where the cost of these folks is much, much cheaper, and 00:10:25.380 |
they have less than half the number. And so as both of these 00:10:28.860 |
companies continue to grow, you have one that has maintained, 00:10:32.660 |
and frankly, raised their long-term profit projections, because 00:10:37.740 |
they see it in the business, even at lower transaction costs, 00:10:41.740 |
and stripe, which is having a little bit more trouble. So I 00:10:44.700 |
thought it was a really interesting expose. The 00:10:47.820 |
takeaway for me is that if you were sitting inside the company, 00:10:51.140 |
and obviously hindsight is 2020. The most profitable thing they 00:10:56.020 |
could have done from an enterprise value perspective 00:10:57.980 |
would probably have been to go public in 2018, 2019. Because 00:11:01.900 |
they could have raised max value at max valuation, cleaned out 00:11:06.140 |
all these options issues and have a huge balance sheet of 00:11:08.820 |
cash with which to do stuff, whether it's acquisitions or 00:11:11.580 |
other things. Because the thing that I struggle with is, is 00:11:14.140 |
there going to be long term profitability and all of these 00:11:16.340 |
tail products? Because if you look in the SaaS ecosystem — and 00:11:20.140 |
Sacks, I'll throw the ball to you — there's companies building all 00:11:23.700 |
this other stuff. And these point products are probably 00:11:25.380 |
pretty good. So Sacks, what do you think about Adyen going after the 00:11:29.020 |
fat part of the market and then Stripe going after the long tail? 00:11:33.820 |
Well, I think they're both viable strategies. And I mean, 00:11:37.100 |
I've actually written about this, I wrote a blog some time 00:11:39.220 |
ago called enterprises versus SMBs, who's the better customer 00:11:42.340 |
for B2B SaaS companies. And I think the sort of old school 00:11:47.500 |
traditional view is that enterprises were always the best 00:11:50.460 |
customers because they have the biggest budgets, that 00:11:53.060 |
translates into the biggest annual contract values, or ACVs. 00:11:56.740 |
This provides the highest ROI on sales efforts. So now you can 00:12:00.380 |
make a sales driven distribution strategy pencil in the first 00:12:03.380 |
place, the prospects are easy to identify, you know, after all, 00:12:06.860 |
if you're going after the fortune 500, you can just make a 00:12:09.300 |
list of the 500 companies. So I think the traditional gold 00:12:12.660 |
standard was sort of the head, like you're saying, Jason, the 00:12:17.620 |
enterprises, however, I think in recent years, has become more 00:12:21.500 |
popular to pursue the stripe strategy of the sort of more SMB. 00:12:27.260 |
Well, because first of all, startups are — the SMBs are more 00:12:30.300 |
early adopters. So when you're a startup, it's way easier to 00:12:34.140 |
satisfy their standards, to satisfy their needs; their needs 00:12:39.180 |
are less complicated, you don't have to have SOC 2 compliance 00:12:42.620 |
and everything else. More risk-taking, right? Yeah, if you 00:12:45.380 |
solve an immediate pain point for them, they'll just buy 00:12:47.700 |
it. Okay. Whereas I think enterprises are more late 00:12:50.900 |
adopters; they tend to be more skeptical of new categories. 00:12:54.660 |
Yeah, I think in addition to that, the SMB sales cycle is 00:12:57.580 |
really quick. I mean, I'd say typically one to two months, you 00:13:00.420 |
can close a deal, the sale itself is simpler. Like I said, 00:13:04.100 |
the product requirements are simpler. And the low end of 00:13:06.660 |
the market tends to be the most underserved part. So it's great 00:13:10.420 |
to play where the incumbents are not that's a traditional 00:13:13.020 |
strategy is you go after the low end of the market that's been 00:13:15.860 |
kind of overlooked or ignored. And that's kind of what stripe 00:13:18.740 |
has done here too, is no one was really serving these these 00:13:21.700 |
developers. So I tend to think it's a good strategy too. And 00:13:25.260 |
the truth is, it's not one or the other, I think you just have 00:13:27.660 |
to pick, you know, which of your battles that you want to fight. 00:13:30.620 |
And some startups will go after enterprises, and some will go 00:13:33.180 |
after SMBs. And it really goes down, I think, to founder market 00:13:37.420 |
fit, I think founders who are better at sales, probably skew 00:13:40.700 |
more towards an enterprise go-to-market strategy. Whereas if you're more 00:13:46.380 |
Brilliant summary. Over time, Sacks, for a company to thrive 00:13:49.580 |
over long periods of time, do you have to serve both? Or do 00:13:52.940 |
you think you can stay in one of those things and grow 00:13:55.340 |
Well, what I've seen is that if you start the low end of the 00:13:58.660 |
market with SMBs, over time, you can move up market because what 00:14:02.100 |
happens is that as your product gets more and more sophisticated, 00:14:04.940 |
and your company and your ability to execute and deliver 00:14:07.380 |
gets more sophisticated, you can start satisfying the needs of 00:14:10.300 |
bigger and bigger companies. So you start SMB, then you go mid 00:14:13.300 |
market, then you eventually get to enterprises. I think if you 00:14:16.780 |
start with enterprises is very hard to go down market because 00:14:20.180 |
it's a lot easier to add requirements to your product 00:14:23.220 |
than to actually strip complexity of a product that's 00:14:26.460 |
actually surprisingly difficult to do. So I think it's I think 00:14:30.300 |
either strategy can work either you start the low end and move 00:14:32.580 |
up market. That's the classic Clay Christensen innovators 00:14:35.620 |
dilemma type thing. Or you just start at the top and you stay 00:14:40.180 |
It's just I mean, adding 10 people a day over two years, 00:14:43.140 |
that's a large number of people to add to a company. 00:14:45.260 |
Well, in fairness to stripe, they were very honest about 00:14:47.820 |
this. And they were like, we overestimated, got overconfident, and 00:14:50.620 |
we overhired, and they found that all the coordination costs, to 00:14:53.660 |
Sacks's point, became too high. That's exactly what the Collisons 00:14:56.340 |
said in their memo. So I think that they're trying to 00:14:58.460 |
course correct and get back to this. I think the point that I'm 00:15:01.940 |
making, unemotionally — I don't own Stripe nor Adyen, I don't have 00:15:05.100 |
a horse in this race is more that in this market specifically 00:15:09.380 |
in these middlemen, highly transparent middlemen markets, 00:15:12.340 |
it's very difficult to hide the cheese, meaning the ability to 00:15:16.380 |
get to an extremely precise valuation model is pretty easy. 00:15:21.580 |
You know, this was half a day's work that we did. And the point 00:15:24.620 |
is, all this data is out there. And so it means that if you're 00:15:28.900 |
going to go public as a company like this, you have to be quite 00:15:32.660 |
thoughtful about how outside folks will value you, because the 00:15:36.260 |
terminal buyer is very, very sophisticated and pretty smart 00:15:41.300 |
Friedberg, when you look at this, it kind of dovetails with the 00:15:44.260 |
"get fit" Brad Gerstner letter, Elon at Twitter doing more with fewer 00:15:48.300 |
employees. Zuckerberg again says he is getting rid of managers — 00:15:52.740 |
he's asking managers to become individual contributors. Sacks's discussion about, you know, the 00:15:56.660 |
layers of management that got added and added, where high 00:15:59.420 |
performers would have five people put under them, 10 00:16:01.860 |
people put under them — are you impressed 00:16:07.060 |
with how quickly the industry is responding to this new 00:16:09.300 |
environment? Or are they not responding fast enough? 00:16:13.780 |
In terms of headcount to revenue? Because now we're looking at 00:16:16.380 |
revenue per employee. This is a — we never looked at that. It's been — 00:16:20.780 |
This is a little bit of a different situation where it's 00:16:24.260 |
about the scalability of a business. Like when I look at 00:16:29.980 |
like the value that a business has created, you start first 00:16:32.940 |
with like, can you make a product? Can you sell the 00:16:35.220 |
product? Do people want to buy the product? And then you know, 00:16:37.780 |
can you make money selling it? And then there's this metric 00:16:40.740 |
that a lot of people use, which is LTV to CAC, which is the 00:16:44.340 |
lifetime value of acquiring a new customer divided by the cost to 00:16:47.580 |
acquire that customer. But I think you can generalize that 00:16:50.380 |
ratio to talk about business performance more broadly, which 00:16:54.660 |
is, you know, capital deployed, which is typically what CAC is 00:16:58.460 |
used in terms of growth on the denominator, and then capital 00:17:02.380 |
returned over time, which can be the numerator. And so you can 00:17:06.380 |
kind of think about that LTV to CAC ratio, being something more 00:17:10.060 |
broadly defined as something like ROIC, or what have you. The 00:17:13.580 |
question for the scalability of any business is, does that ratio 00:17:18.180 |
whether it's LTV to CAC or ROIC return on invested capital, does 00:17:22.140 |
it get bigger or smaller? Does it increase or decrease? Does 00:17:26.860 |
that ratio increase or decrease as you get bigger as you spend 00:17:29.780 |
more money as you deploy more money? If it's getting smaller, 00:17:32.980 |
then mathematically, you can resolve pretty quickly to the 00:17:36.700 |
asymptotic valuation that that business will achieve or the 00:17:39.420 |
asymptotic revenue that that business will achieve. And 00:17:42.060 |
that's a very scary kind of circumstance when a business 00:17:45.460 |
that's tracking that metric starts to see that metric 00:17:48.780 |
shrink. If that metric is growing, then you have an, you 00:17:52.820 |
know, a hyperbolic kind of moment and you can build 00:17:55.700 |
platforms and add products and invest very heavily and take 00:17:59.780 |
lots of risk and take lots of bets. When it's going the wrong 00:18:02.300 |
way. You have two options. Number one is you have to make a 00:18:05.820 |
change or pivot in the business to get it to go the other way. 00:18:09.180 |
Or number two is you have to take advantage of that moment 00:18:12.660 |
before the market finds out about that moment. Because as 00:18:15.220 |
soon as the market realizes that that ratio is going the wrong 00:18:17.780 |
way, your valuation multiple what you're worth as a multiple 00:18:21.180 |
of revenue or profit shrinks dramatically, because then the 00:18:24.340 |
market can also see that asymptote and outcome. So I 00:18:27.140 |
think it's very often the case that one should, you know, as a 00:18:29.500 |
board member as an investor, urge entrepreneurs, CEOs, 00:18:33.180 |
founders, managers to think really clearly about that 00:18:36.060 |
metric, what's the right way to define the denominator and 00:18:38.900 |
define the numerator in our business, and define that ratio 00:18:42.060 |
over time. And as soon as it starts tracking the wrong way, 00:18:44.900 |
you have a moment, you can either fix it, or you got to go 00:18:48.660 |
sell the business or go public and raise capital before the 00:18:52.060 |
market catches on and your valuation shrinks. 00:18:54.500 |
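[Editor's note: a small illustrative sketch of the scalability check being described — track the generalized LTV-to-CAC (or ROIC) ratio as spend scales and flag when it starts compressing. The cohort numbers are invented purely for illustration.]

```python
from typing import List, Tuple

def capital_efficiency(capital_returned: float, capital_deployed: float) -> float:
    """Generalized LTV-to-CAC: capital returned over time divided by capital deployed (ROIC-style)."""
    return capital_returned / capital_deployed

def scalability_check(cohorts: List[Tuple[float, float]]) -> str:
    """Flag whether the efficiency ratio expands or compresses as more capital goes in."""
    ratios = [round(capital_efficiency(returned, deployed), 2) for returned, deployed in cohorts]
    if ratios[-1] >= ratios[0]:
        return f"{ratios}: expanding -> operating leverage, keep investing"
    return f"{ratios}: compressing -> fix it, or raise/sell before the market re-rates you"

# Hypothetical cohorts as spend scales up: (capital returned, capital deployed), illustrative only.
print(scalability_check([(3.0, 1.0), (5.5, 2.0), (7.0, 3.0)]))
```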
So I think what you're highlighting — yeah. So when I see what you're 00:18:57.340 |
showing in this data, and talking about this, the 00:18:59.780 |
shrinking valuation issue for Stripe, it really, I think 00:19:03.420 |
highlights this important point, this broad point, which is did 00:19:06.220 |
they miss the window? Did they miss the moment where suddenly, 00:19:09.300 |
you know, the shrinkage is causing, you know, an 00:19:11.740 |
asymptotic outcome for this business that it makes investors 00:19:14.980 |
a little bit like, well, I'm not as excited about that, because 00:19:16.980 |
it's not there's no, there's no longer as much upside. And it 00:19:20.300 |
might be time to kind of devalue the company. And did they miss 00:19:22.860 |
the moment to go public raise a bunch of capital, you know, to 00:19:25.940 |
go and try new things and hopefully pivot into a way. So I 00:19:29.020 |
don't know enough about the business. But that's my broad 00:19:31.300 |
kind of assessment of this, this interesting thing about that 00:19:33.820 |
space. We talked to one of our friends at our poker game who 00:19:38.060 |
runs a large consumer facing business. And I don't know if 00:19:40.860 |
you were there for that conversation, Friedberg, but I 00:19:43.220 |
was — you were there. Yeah. And one of the interesting things he 00:19:46.140 |
said is, we are at a level of scale where we just bid these 00:19:49.420 |
guys against each other. And these things tend to now be loss 00:19:52.780 |
leaders for them. Which is to say effectively, that cost 00:19:56.540 |
structure becomes really important. So your cac becomes 00:19:59.260 |
very important, because your LTVs are capped, right? And the 00:20:02.660 |
LTVs are capped, because these companies have enough 00:20:04.940 |
negotiating leverage to say, well, if you want my business, 00:20:07.780 |
here's the cost of doing this business, which makes a ton of 00:20:11.620 |
sense if you're any large purveyor of services that 00:20:14.900 |
require payment processing infrastructure. So one of the 00:20:17.700 |
interesting dynamics, I think we're learning in this market is 00:20:20.180 |
how it's really not a market, right, there are segments, and 00:20:24.180 |
there's embedded profitability in each segment. So to your 00:20:26.860 |
point, Friedberg, this is the sum of at least three or four 00:20:29.860 |
different LTV to CAC ratios, right? Right. The tail looks very 00:20:34.340 |
different, which is why you have to build a ton of features. And 00:20:37.540 |
the head just wants pure play. And it's all about cost first. 00:20:40.940 |
Because all of these guys want to pick up every nickel and 00:20:44.660 |
dime that's on the floor, because for them, on billions of 00:20:47.380 |
transactions, it is meaningful to them. It's an 00:20:50.180 |
EPS miss or beat, right, for them, which has huge 00:20:53.500 |
implications to their stock. This is a market that I think 00:20:55.740 |
is going to be really fascinating to uncover and peel 00:20:58.060 |
back the layers of over the next few years. By the way, we haven't even 00:21:00.380 |
talked about what stripe does as a business. I know we have a 00:21:02.340 |
diverse audience that doesn't all come from tech. Yeah, so 00:21:05.060 |
Stripe will process your transactions, but they were the 00:21:08.940 |
first people to make it as simple as putting a snippet of 00:21:11.380 |
code into your app to process a payment — it can be with Visa, 00:21:14.660 |
MasterCard and those other places — and they charge you a 00:21:16.620 |
percentage of each transaction. So to Chamath's point, these 00:21:19.620 |
developers, 5-10 years ago, loved this because 00:21:25.220 |
they can instantly get payments, right? It's sort of abstracted 00:21:27.460 |
the whole thing just the same way cloud computing does right 00:21:29.700 |
storage at s3, etc. So you can kind of think about it that way. 00:21:33.260 |
But a large whale in the system, Chamath, which you said — Adyen 00:21:35.940 |
has a lot of whales, not a lot of long tail. Stripe, because 00:21:38.420 |
it's developer friendly. And a snippet of code, they have this 00:21:41.780 |
huge long tail, anybody can use Stripe. In fact, people who are 00:21:44.260 |
using things like substack or Patreon, I believe, they can 00:21:47.140 |
just drop in their stripe account. So people now, 00:21:49.100 |
businesses, have a Stripe account — they just drop 00:21:52.380 |
it in there. So for me, that seems like huge potential in the 00:21:55.220 |
future, because some of those could end up being whales. 00:21:57.940 |
And the long tail gives stripe a lot of pricing power, because 00:22:01.060 |
there's there's no way for any one of those entities to have 00:22:04.980 |
enough leverage to tell stripe, hey, I don't want to pay 2.9% 00:22:08.540 |
plus 20 or 30 cents a transaction, right? Whereas if 00:22:12.100 |
you go to the head, I think Adyen is charging like 1.3 or 1.4%. 00:22:16.620 |
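[Editor's note: a tiny illustration of the take-rate gap being discussed. The 2.9% plus roughly 30 cents figure is Stripe's standard card pricing as quoted in the conversation; the ~1.4% Adyen figure is Sacks's estimate, not an official rate.]

```python
def stripe_fee(amount: float) -> float:
    """Stripe's standard card pricing as quoted in the conversation: 2.9% plus ~$0.30 per transaction."""
    return 0.029 * amount + 0.30

def adyen_fee(amount: float) -> float:
    """The ~1.3-1.4% head-customer take rate estimated in the conversation; illustrative only."""
    return 0.014 * amount

for amount in (10.0, 100.0, 10_000.0):
    s, a = stripe_fee(amount), adyen_fee(amount)
    print(f"${amount:>8.0f} transaction: Stripe {s / amount:.2%} effective vs Adyen {a / amount:.2%}")
# The long tail of small transactions pays a much higher effective rate,
# which is where Stripe's pricing power (and feature burden) sits.
```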
So yeah, it's a wholly different market. And the pricing as a 00:22:22.100 |
Yeah. It's interesting to me, sacks that we now are getting 00:22:26.500 |
down to, you know, brass tacks here, we're analyzing these 00:22:29.900 |
money printing businesses and saying, what is the ultimate 00:22:32.860 |
value of this 10, 20 years from now? Chamath and I got a front 00:22:36.780 |
row seat to that because there's a natural audience to every 00:22:39.420 |
single service. For AOL, it was 30 million paid subs at 00:22:44.540 |
the peak, at, I think, 30 bucks a month people were paying. 00:22:50.340 |
So, you know, you start looking at those numbers, you know, a 00:22:53.540 |
billion dollars a month almost, and it was a fixed-cost 00:22:56.580 |
business. But then boom, it just hit a ceiling and competition 00:22:59.620 |
emerged in the case of broadband. And then that 00:23:04.580 |
business just slowly deprecated over time. So sacks, what does 00:23:08.300 |
this moment tell you for founders, a lot of the listeners 00:23:11.020 |
here and capital allocators, in terms of assessing businesses 00:23:13.980 |
for the last and this will pivot into our next story. The last 00:23:16.380 |
couple years, you know, if you were a first time fund manager, 00:23:18.420 |
you were investing in 2019 to 2021 at high valuations — those 00:23:24.140 |
funds, are they ever going to be able to throw off a profit? 00:23:27.380 |
And then people were investing in those based on momentum, logo 00:23:31.740 |
chasing. This is now back to, you know, sharpening your 00:23:37.580 |
yeah, I mean, we've talked about it before. There's nothing new 00:23:39.460 |
here. When you're in a boom, the only three things that matter 00:23:44.540 |
are growth, growth and growth. And when you're in a downturn, 00:23:47.620 |
the three things that matter are growth, burn and margins. It's 00:23:52.420 |
not that growth stops mattering. It's just that people also care 00:23:54.820 |
about burn and margins. And, you know, the companies that fare 00:23:59.420 |
the worst are the ones that have inefficient growth that basically 00:24:03.940 |
have burned a lot of money to grow. They have, you know, low or 00:24:07.460 |
negative gross margins, they are burning way too much money, the 00:24:10.820 |
burn multiple doesn't make sense, basically, the ratio of 00:24:13.140 |
money burnt to net new ARR that they're adding, those companies 00:24:17.460 |
get called out when all of a sudden you have regime change, 00:24:21.300 |
CAC is one of the early signs of this. Chamath, you and I saw 00:24:25.540 |
that — remember AOL sending DVDs everywhere, and CAC became two or 00:24:30.820 |
$300 for every AOL subscriber. And then they were playing this 00:24:34.540 |
funny accounting game — I don't know if you remember this, Chamath — where they 00:24:36.660 |
were saying, hey, the LTV is like five years for an AOL subscriber. They 00:24:40.140 |
were looking back at that number, not forward to the broadband 00:24:42.460 |
coming. And so, like, we can totally spend $300 on TV ads to 00:24:46.140 |
get a dial-up customer at $24 a month. And boy, did that whipsaw 00:24:50.580 |
them. So listening to everybody talk here, I'm just 00:24:53.460 |
like, wow, keep your eye on the CAC, folks — the customer 00:24:55.980 |
acquisition cost, how much you spend to get a new AOL, 00:24:58.980 |
Netflix, SaaS product, or Stripe customer — is critically important. 00:25:03.740 |
We look really closely at CAC payback — you know, how many 00:25:06.300 |
months does it take to pay back the cost of acquiring a 00:25:09.940 |
customer? We don't look at that exclusively, though, because you 00:25:13.580 |
know, what expenses go into cac is highly dependent on your 00:25:18.740 |
unpack that for a second, because there's the money you 00:25:20.820 |
spend on a Facebook ad or a LinkedIn ad or any other great 00:25:23.540 |
platform for driving, you know, customers to sign up for it. 00:25:28.300 |
then you spend money on an ad or you spend money on a sales 00:25:30.980 |
person, obviously, that goes into cac. But then what about 00:25:33.860 |
sales operation headcount? Does that go in? Is that opex? 00:25:38.020 |
Is that a sales cost? Is that customer acquisition or 00:25:40.900 |
is it something else? So there's a lot of, like, subtle accounting 00:25:43.900 |
decisions that can have a big impact on what that 00:25:46.300 |
number is. Well, this is why I've always recommended 00:25:48.980 |
just looking at burn multiple. What I really want to know is 00:25:51.220 |
how much money is this startup burning in relation to how much 00:25:54.460 |
revenue is adding? Just like the ratio of those two things. 00:25:57.180 |
Because — say someone burned $100,000. Yeah, so, yeah, 00:25:59.860 |
say someone spent $300,000 and burned $100,000, and then 00:26:06.820 |
we added $100,000 in new customers' ARR. So that's 1x, 00:26:13.380 |
so on your chart here a burn multiple of 1 to 1.5 00:26:16.740 |
is great, or under 1 is amazing. But if you burn $200,000 — 00:26:22.660 |
I warn founders going into this year do not have a burn multiple 00:26:25.660 |
greater than two because there's just so many headwinds right now 00:26:29.180 |
that what happens is if you end up missing your revenue forecast, 00:26:33.820 |
your burn multiples going to look terrible, they could shoot 00:26:35.740 |
up to three, four, five and up. So it's better to have some 00:26:39.820 |
cushion by going into the year being super efficient. 00:26:43.100 |
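[Editor's note: a minimal sketch of the burn multiple math walked through above; the dollar figures are the illustrative ones from the conversation, not real company data.]

```python
def burn_multiple(net_burn: float, net_new_arr: float) -> float:
    """Burn multiple = net cash burned divided by net new ARR added over the same period."""
    if net_new_arr <= 0:
        raise ValueError("burn multiple is undefined without positive net new ARR")
    return net_burn / net_new_arr

# Illustrative figures from the conversation:
print(burn_multiple(100_000, 100_000))  # 1.0 -> the "amazing/great" zone (roughly 1-1.5x or under)
print(burn_multiple(200_000, 100_000))  # 2.0 -> the ceiling Sacks warns founders not to exceed this year
print(burn_multiple(200_000, 50_000))   # 4.0 -> what happens if you then miss the revenue forecast
```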
On the converse side, Friedberg, if your lifetime value of a 00:26:47.100 |
customer is incorrect, which we're seeing now with people 00:26:50.900 |
canceling SaaS products or reducing the number of seats, or 00:26:55.780 |
in cloud computing, people are now saying, Hey, maybe I should 00:26:58.340 |
take myself out of the cloud and host my own servers or some of 00:27:00.820 |
my own servers and reducing their cloud bill. Cloud growth 00:27:04.300 |
is slowing at Azure, across the board, Amazon Web 00:27:09.380 |
Services, etc. It's still growing, but it's slowing the 00:27:11.900 |
growth. So that LTV if you get that wrong, that can whipsaw you 00:27:16.780 |
yeah, I mean, LTV, which is like, what do you make over time 00:27:20.540 |
from a customer or however you want to assess it? A market 00:27:25.860 |
deployment — it should be based on kind of net cash, meaning, like, how 00:27:30.620 |
much profit do I pull back into my bank account at the end, 00:27:34.180 |
after paying third parties and internal people. And a lot 00:27:38.780 |
of people, I think, in models I've seen on, you know, what's 00:27:44.740 |
the lifetime value of a customer, they kind of take 00:27:46.780 |
either revenue, or just the simplified gross profit number. 00:27:49.780 |
But the reality is, if you're scaling the number of engineers 00:27:52.420 |
you need, because you have many more customers, and you got 00:27:54.580 |
customer service calls, and you know, you've got to do custom 00:27:57.780 |
deployments with your customers, all of that kind of adds up to 00:28:01.540 |
additional cost. And some of these businesses, you see that 00:28:04.300 |
the SaaS companies, for example, that all have gotten their 00:28:07.780 |
multiples hammered, it's because the kind of microscope has come 00:28:11.740 |
out at this point, to some degree, set aside, general 00:28:15.380 |
macroeconomic factors that are driving some of the multiple 00:28:17.860 |
compression. But as the microscope has come out, it 00:28:20.500 |
turns out that the efficiency of the business is not what everyone 00:28:23.540 |
hoped and dreamed a SaaS business might be, that the efficiency of 00:28:26.700 |
the business maybe looks a little bit more like either a 00:28:29.140 |
services business, or there's a big kind of scaling hardware 00:28:32.020 |
component, that the margin that you actually make for every 00:28:35.300 |
dollar of revenue you generate, fundamentally is smaller than, 00:28:39.420 |
you know, what you think it is, you have to add people to 00:28:43.060 |
support and ops and new servers and all the stuff you're 00:28:46.340 |
highlighting. And a lot of that's excluded. And then it 00:28:48.660 |
doesn't take, you don't realize all that, when you're small, or 00:28:51.820 |
when you're medium and growing, you realize that when you're 00:28:53.700 |
bigger, when you're bigger, you're like, Oh, wow, how do we 00:28:56.140 |
get these costs out? Well, if we cut these costs, customer 00:28:58.420 |
quality would decline, customers would churn, all this bad stuff 00:29:01.500 |
would happen. So yeah, that LTV number is generally not right. 00:29:04.980 |
And that's why I say, it's much more about kind of a true ROIC 00:29:09.020 |
calculation, which is how much capital am I deploying. And it's 00:29:12.580 |
not just being deployed in marketing dollars, it's being 00:29:15.660 |
deployed in other ways. And then how much capital am I making 00:29:17.900 |
back net profit over time. And I think that's the right way to 00:29:21.700 |
always analyze a business generally, but like, 00:29:24.260 |
particularly in businesses where it's easy to obfuscate either 00:29:27.100 |
those numbers, and they could seem like it's an extraordinary 00:29:29.340 |
number business, you can get hurt when you get bigger, or 00:29:32.700 |
when you're scaling. And in a market like this, where you're 00:29:35.220 |
trying to go public, it's like, Whoa, that really hurts, you 00:29:38.820 |
know, so I think that's a lot of what we're seeing. 00:29:40.380 |
Let's talk about the other side of the table. We've been living 00:29:42.780 |
through a zero interest rate hallucination. Basically, 00:29:47.060 |
people were growth, growth, growth, logo, logo, logo, 00:29:51.060 |
whatever. When they're making these bets, capital allocators 00:29:54.820 |
now we're back to brass tacks. Okay, what's the margin? What's 00:29:57.460 |
the lifetime value? And is this actually real? Is there a real 00:30:01.180 |
business here? Or is this just a grand hallucination? That 00:30:04.580 |
hallucination exists not only on the founder side, but on the 00:30:07.460 |
capital allocator side. This week, we had an interesting 00:30:12.180 |
semi-viral thread on Twitter, from somebody named Tyler Tringas. 00:30:18.860 |
He's an early stage investor, don't know who that is. But he 00:30:22.220 |
did a thread predicting a16z — just to pick out one firm — was a 00:30:27.500 |
zero interest rate phenomenon, and an incredible machine to 00:30:30.620 |
accumulate AUM, assets under management. And so what were your 00:30:34.300 |
thoughts just writ large on the capital allocator side of this 00:30:42.100 |
I mean, I think it's a little unfair. I think this is written 00:30:45.660 |
more just to try to generate views and clicks because, okay, 00:30:50.380 |
you have to see the underlying return data to really have a 00:30:54.260 |
sense of knowing. I think it's fair to say a couple of 00:30:58.220 |
things that there was probably two and a half or three years of 00:31:01.980 |
capital raised in the industry. That's going to get really put 00:31:06.660 |
under pressure. And the reason is that there is not a lot of 00:31:11.660 |
time diversity in that money, meaning people got it, and they 00:31:15.420 |
put it into the ground right away. And one of the principles 00:31:19.820 |
of having a more predictable return set of returns over time 00:31:24.420 |
is that you leverage time, right? So if you had $100, and 00:31:28.940 |
you wanted to have a diversified stream of returns, you're much 00:31:32.780 |
better off spending $1 a month for 100 months, versus $10 a 00:31:39.460 |
month for 10 months. So just that thing will cause a lot of 00:31:45.460 |
impact and headwinds for a lot of the capital in 2021 and 2022. 00:31:49.780 |
Then the other thing you have to keep in mind is that over many 00:31:53.980 |
cycles where we've had high rates and low rates and medium 00:31:57.020 |
rates, our industry typically returns $1.60 for every dollar 00:32:02.940 |
it raises. And that's over many cycles. And so if you believe 00:32:06.940 |
that we're going to revert to the mean, out of the trillion 00:32:10.980 |
dollars we've raised, maybe we'll return 1.6 trillion. Now 00:32:16.420 |
that sounds good, except the problem is that 1.6 trillion is 00:32:20.060 |
marked at five and a half trillion. So you're gonna have 00:32:24.900 |
to give back — there's a lot of pain — you're gonna have to give 00:32:28.020 |
back a lot of paper profits in order to get back to that 1.6 trillion and be okay with it. 00:32:31.300 |
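[Editor's note: a quick back-of-the-envelope version of the math here, assuming the figures cited in the conversation — roughly $1 trillion raised, about $1.60 historically returned per dollar raised, and ~$5.5 trillion of current paper marks.]

```python
raised = 1.0e12              # ~$1 trillion raised by the industry (figure cited in the conversation)
historical_multiple = 1.60   # ~$1.60 eventually returned per $1.00 raised, across many cycles
current_marks = 5.5e12       # what that capital is currently marked at on paper (figure cited)

expected_distributions = raised * historical_multiple       # ~$1.6 trillion
paper_giveback = current_marks - expected_distributions     # paper value that has to come back down

print(f"Expected distributions if history holds: ${expected_distributions / 1e12:.1f}T")
print(f"Paper profits to give back on the way:   ${paper_giveback / 1e12:.1f}T")
```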
And the question is, what has happened 00:32:34.420 |
in decision making in the meantime, meaning how many 00:32:36.620 |
people did you hire? How many deals did you do that you 00:32:39.500 |
regret? And then how does it change your psychology and how 00:32:44.220 |
you treat the next investment that comes over the desk? Can 00:32:47.300 |
you separate yourself from these bad losses and not be on tilt 00:32:52.500 |
So you had a terrible two-day session, like Phil Hellmuth did 00:32:56.700 |
last week losing $350,000. Can you play the next week and not 00:33:01.180 |
be on tilt and start to build back your stack and make 30,000 00:33:03.900 |
a night for 10 nights, or 10 of the next 15, 20 sessions or 00:33:07.380 |
whatever it is? Sacks, you actually had a rebuttal or something? 00:33:10.580 |
No, not really a rebuttal. I mean, look, I think if you're 00:33:12.660 |
going to be intellectually honest about it, I think that 00:33:15.660 |
2021 is likely gonna be not a great vintage for VC. 00:33:20.820 |
Why? Because the valuations were just Yeah, the valuations were 00:33:24.060 |
just really high. They've come down by, what, at least 50%? 00:33:30.180 |
maybe 50% now. But you still have more medicine to take. I 00:33:35.180 |
think when you look at some of the businesses, 00:33:36.540 |
you know, a lot of these companies are growing into their 00:33:37.980 |
valuation. Look, I think for any given set of companies for any 00:33:41.780 |
portfolio, the most important thing is what's in the 00:33:44.940 |
portfolio. So if in 2021, you had the founding of the next 00:33:50.860 |
Google or whatever, that effect is going to swamp the effect of 00:33:54.460 |
price levels in that year because of the power law. Again, 00:33:57.100 |
the number one most important thing is just what's in that 00:33:59.180 |
portfolio, what's in that basket. The second most 00:34:01.460 |
important thing is the entry prices. And obviously, if the 00:34:05.540 |
entry prices are twice as high in a given year than they are in 00:34:09.020 |
every other year and twice as high as what the exit multiples 00:34:12.580 |
are going to be in 10 years, when that portfolio becomes 00:34:15.500 |
liquid, that's gonna hurt the returns. But we won't know which 00:34:19.540 |
of these effects predominates until five years from now we see 00:34:24.140 |
I mean, when I saw that tweet thread, I thought, maybe this is 00:34:27.300 |
an issue for some venture firms, but we're not going to see even 00:34:30.980 |
the inklings of it for another five or seven years takes a 00:34:33.820 |
while. That's a problem that may manifest itself in year 10. And 00:34:37.500 |
between now and then, any firm that it has a good track record 00:34:42.100 |
of returning capital, or frankly, has a good brand and 00:34:45.260 |
good marks will still raise an inordinate amount of money 00:34:47.740 |
because this is an asset class that I still think on the 00:34:50.140 |
margins is more of a must-have asset allocation than an "on the 00:34:56.460 |
margin, I'd just rather ignore it," because, you know, it is the 00:34:58.860 |
future of how GDP will get created. And so everybody kind 00:35:03.740 |
Imagine if in 2021, the you know, the next great mega 00:35:08.940 |
outcomes in AI were created, right, because those founders 00:35:11.980 |
were just slightly ahead of the curve, you know, they were like 00:35:13.860 |
a couple years ahead of the curve. If those create, you 00:35:17.940 |
know, the next, whatever trillion dollar companies, 00:35:22.060 |
then the fact that price levels were 2x what they should have 00:35:24.740 |
been won't matter. Yeah, what will really matter is the 00:35:26.660 |
distribution, there'll be a bunch of bad portfolios, there'll 00:35:29.020 |
be some really incredible ones. And that's the way it always is 00:35:32.300 |
The thing to keep in mind is in 21 and 22 rates were still 00:35:35.340 |
effectively too low. And I think we did this analysis, Nick, you 00:35:38.540 |
can throw up that thing, but it's not correlated with big 00:35:42.780 |
outcomes, those vintage years. 2023 is the first vintage 00:35:47.260 |
year where we're actually starting to see high enough 00:35:49.620 |
rates that have historically generated that kind of return. 00:35:53.340 |
And so I do agree with you, David, I just think it's shifted 00:35:56.460 |
out by a couple of years — '23, '24, '25. Those can be some real power law 00:36:01.300 |
years, I think, because we're going to have just based on what 00:36:04.420 |
the Fed is saying, five and a half percent interest rates for 00:36:08.300 |
the foreseeable future, which is, it's a huge, huge number, 00:36:12.940 |
I'll tell you what that is. You know what it is, though, 00:36:15.540 |
Chamath, I think to build on your point, and freeberg, I'll 00:36:18.180 |
bring you in on after this, it creates an environment in which 00:36:22.020 |
discipline on all sides of the table, boards, management teams, 00:36:25.700 |
investors, rank and file, everybody has to be focused. 00:36:29.660 |
Everybody has to have sharpened swords. And that little bit of 00:36:33.580 |
headwind is and the the ability to raise capital being harder is 00:36:38.180 |
building more reserve and more resilience and grit in this set 00:36:41.620 |
of founders. It's kind of like parenting, in a way like if you 00:36:44.820 |
are too permissive, you give too many options. Kids aren't 00:36:48.060 |
disciplined. And now this group of entrepreneurs I'm seeing who 00:36:51.180 |
haven't given up, my lord, are they becoming animals in terms 00:36:55.580 |
of like pure samurai, in terms of how they're running these 00:36:58.740 |
businesses, anything that's not efficient projects that were the 00:37:02.180 |
third or fourth most important project, cut, cut, cut. Now it's 00:37:04.860 |
taking them 18 months, freeberg to maybe get discipline. But 00:37:08.660 |
maybe you could speak to the next three years and the 00:37:11.060 |
opportunity for investing in this cohort, because man, that 00:37:15.020 |
last cohort is going to be really, really challenged. And 00:37:18.420 |
they'll probably do 6% returns, just like your money market 00:37:23.300 |
account can do right now five or six or what bonds can do. But 00:37:26.620 |
this next group, man, we're seeing dogged entrepreneurs who 00:37:29.500 |
are focused on reality, and there is no hallucination now 00:37:33.300 |
that this is going to be easy. There is no grand illusion here. 00:37:38.460 |
If the market average return in venture in early stage investing 00:37:42.900 |
is going to be 6%. Remember, it's it's not evenly 00:37:46.580 |
distributed. So you know, 80% of funds could end up having net 00:37:52.340 |
negative, real returns, and 20% make money and then those 00:37:57.060 |
there'll be a very few that will make real money. And you know, 00:38:00.260 |
that's the nature of having, you know, a very kind of low average 00:38:05.180 |
return on the industry is there may be a lot of wipeouts on the 00:38:09.140 |
investor class. Folks that have only had one or two funds and 00:38:12.820 |
then just got blown up in the cycle. I think that there's two 00:38:16.500 |
groups of companies out there. One is companies that obviously 00:38:19.420 |
have been funded and are doing stuff and are active businesses. 00:38:22.860 |
And they've raised money in the past. And that's where there's 00:38:25.580 |
going to be really ugly times. I've mentioned this in the past, 00:38:29.100 |
but I do think that there's a significant number of these 00:38:30.740 |
companies that if they were to be truly valued on first 00:38:33.780 |
principles in private markets today, they'll get valued as at 00:38:37.700 |
a value that's less than their preferred equity, which means 00:38:41.740 |
that there's a difficult restructuring needed in the 00:38:43.580 |
company. And not everyone's going to be willing to embrace 00:38:46.220 |
that. So that's what's going to trigger a lot of the wipeouts in 00:38:48.820 |
the market. It's not like the businesses are valueless. It's 00:38:51.260 |
that the capital structure makes it difficult to refund them to 00:38:54.900 |
fund them and continue their operations. Now for all the new 00:38:57.540 |
businesses, as you highlight, man, there's so much 00:39:00.220 |
extraordinary leverage out there. You know, left and right. I 00:39:05.020 |
think we talked about this maybe a year ago, that there was a big 00:39:07.460 |
bubble coming in AI. But I mean, left and right in nearly every 00:39:10.860 |
market every segment, you won't see a pitch deck that doesn't 00:39:14.380 |
have those two letters in it. Right? I mean, I'm sure you guys 00:39:17.980 |
find it too. It does feel — it is hard not to feel like you're a 00:39:26.100 |
little bit of a lemming if you buy into the AI stuff. But I 00:39:28.820 |
will say that the use cases we're seeing are really 00:39:31.260 |
incredible. Totally. I didn't feel this way with the last 00:39:35.100 |
couple of waves, like the whole web three thing never totally 00:39:37.460 |
made sense. And crypto always felt a little bit speculative, 00:39:40.660 |
like kind of unsure. But the AI thing seems like it's going to 00:39:43.380 |
deliver real value. And I'm seeing like already three major 00:39:47.500 |
enterprise use cases. Number one is just auto summaries, like 00:39:51.660 |
being able to summarize very quickly 1000 articles or a 00:39:56.140 |
meeting, you know, spinning out a like a summary of what just 00:39:58.540 |
happened in a meeting. And it could break it down between a 00:40:01.100 |
recap and action items. It just does all the work for you. 00:40:03.860 |
Second thing is like in app customer service, kind of like a 00:40:08.980 |
co pilot, but there's no reason to contact customer support 00:40:11.220 |
anymore. Because you can just ask the AI inside the app. And 00:40:16.780 |
And they'll be faster, right? That's like a power user. 00:40:21.660 |
It's like a power user who's sitting next to you as your co 00:40:24.340 |
pilot and is making you much more effective in the app. And 00:40:27.780 |
then the third thing we're already seeing is auto complete 00:40:30.940 |
for everything. I mean, it is like bonkers how you know how 00:40:34.220 |
you get, like, little type-ahead suggestions in email — yeah, it's 00:40:37.300 |
like two or three words. The AI is gonna be able to do type-ahead — 00:40:47.540 |
it's bonkers. You see it in Google, you see it in Google 00:40:50.420 |
Sheets. Now, like if you type, you know, equal sum, it's like, 00:40:53.420 |
oh, here's what the seven most likely things to happen next 00:40:56.780 |
are, in which case, it's kind of like when you use the chess.com app. 00:41:00.180 |
I don't know if you've used it with like the heads up display, 00:41:02.220 |
where it's showing you the different moves. And this is a 00:41:04.700 |
book move versus this is not a book. Let me make a prediction. 00:41:07.900 |
All of the things that you guys said, I think are incredible 00:41:11.180 |
consumer surplus business opportunities, which means that 00:41:13.940 |
the ultimate winner is us. And we're going to become, as 00:41:18.820 |
the consumer, incredibly, 00:41:21.540 |
incredibly productive, and more leveraged in how we spend our 00:41:25.860 |
time, which will allow us to do all kinds of other interesting 00:41:28.500 |
things with all the time that we save. That I think is almost 00:41:31.540 |
now a certainty. The problem with consumer surplus businesses 00:41:36.300 |
is oftentimes, there is no money made in the funding of them. And 00:41:40.460 |
really, where the money is made is in enabling it. So for 00:41:43.780 |
example, so far, what I would say is there's very little money 00:41:47.380 |
that has been made in AI. There's been an enormous amount 00:41:50.660 |
of money that's been made by Nvidia. And the reason is 00:41:54.740 |
because they are the pick-and-shovel provider into the 00:41:57.940 |
industry. And so as that's an example, AMD, I think can also 00:42:03.060 |
benefit. So the silicon players seem pretty obvious here. Maybe 00:42:06.980 |
some of the cloud players, the problem is the cloud players are 00:42:09.900 |
trapped inside of other big companies with many other 00:42:11.780 |
business models. But I just want to put out there that I think 00:42:14.460 |
David, you're right that the consumer 100% wins. But 00:42:19.860 |
economically, it's not clear to me that there is a winner that 00:42:23.700 |
is venture fundable. Well, hold on a second. Yeah, the Levi 00:42:28.140 |
Strausses of the world, right, in the gold rush. The people that 00:42:31.300 |
made the picks and shovels and the jeans are sure to make 00:42:33.700 |
money. Yeah. And for the people that pan for gold, it's much more 00:42:37.540 |
speculative and harder to see right now. Yeah, I just 00:42:40.300 |
disagree with that. So well, I think, I think you have a point 00:42:43.420 |
that so I mentioned three use cases, I think are killer use 00:42:46.500 |
cases that we're already seeing demos of today. And when you 00:42:49.540 |
look at them, you're like, okay, this has real applicability. I 00:42:52.860 |
mean, the AI is going to be, it's going to powerfully change 00:42:55.700 |
our work lives. I'm just focused on enterprise. So now I don't 00:42:59.580 |
know who benefits economically from that, that functionality 00:43:02.340 |
that I mentioned, I think is likely to be pretty 00:43:04.660 |
commoditized pretty soon. But it's going to be incorporated 00:43:08.820 |
into lots of different apps in ways that are hard to predict 00:43:11.500 |
right now. I think that this AI revolution is going to do for 00:43:15.980 |
SaaS what mobile did for, you know, a lot of the web 1.0 00:43:20.300 |
companies, where, like, for a lot of these web one companies, 00:43:24.220 |
they were either disrupted by mobile, or they're turbocharged 00:43:27.020 |
by mobile. So you think about Facebook, it successfully made 00:43:30.060 |
the transition. And mobile made its business so much better, 00:43:32.700 |
because people are just using it a lot more on their mobile 00:43:34.820 |
devices. There are a lot of other businesses that just kind 00:43:37.340 |
of fell by the wayside, because they just couldn't make the 00:43:40.060 |
adaptation from desktop to mobile computing. I think AI is 00:43:44.340 |
going to be like that for SaaS, where there's gonna be a lot of 00:43:49.180 |
Yeah, you're 100%. If you can incorporate the AI into your 00:43:52.340 |
SaaS product, put in a copilot, put in autocomplete and all 00:43:56.900 |
sorts of other forms of value that we're just scratching the 00:43:59.220 |
surface of, you're going to be able to deliver so much more 00:44:01.460 |
business value. But if you're not able to do that, and 00:44:04.060 |
somebody else can, then you're gonna get disrupted. 00:44:06.340 |
Look at some of these enterprise spaces, like, take something 00:44:09.300 |
like APM, right, like application performance 00:44:11.140 |
management, that's an entire ecosystem of enterprise 00:44:13.900 |
companies, it's probably 10 15 $20 billion of collective 00:44:16.860 |
market cap. And I'm just gonna say something not to not defend 00:44:21.260 |
anybody, but like that can mostly be automated by AI. Those 00:44:25.780 |
are simple heuristics that can be embedded in a way that's 00:44:28.260 |
completely novel, where this code library just gets dropped 00:44:31.700 |
in, and all of this stuff happens relatively auto 00:44:34.180 |
magically now. So there are all kinds of other sectors to your 00:44:37.300 |
point that get crushed, then the question is, who provides that 00:44:40.980 |
layer now for free in their existing SaaS toolkit or their 00:44:44.540 |
product that now all of a sudden, captures more value as a 00:44:47.700 |
result, and they can sell it for pennies, because it's 00:44:50.940 |
incremental to them in terms of their margin in revenue. 00:44:53.180 |
I think you're right, hardware wins. I think cloud wins big. 00:44:56.900 |
Because if you keep adding to these, you know, models, and 00:45:01.540 |
one's 10, 20% better, people are going to be willing to pay for 00:45:04.020 |
that. But then when you think about consumers, whether they're 00:45:05.860 |
enterprise or actual consumers, I believe this AI stuff is going 00:45:08.940 |
to provide so much value that people are going to take their 00:45:11.540 |
wallets out and be more than willing to spend for it. It's 00:45:14.380 |
more valuable than Netflix. I disagree. Okay, I'm gonna take 00:45:17.660 |
this side. But imagine you take your videos of you learning to 00:45:20.700 |
ski, and you put it into an AI coach. And it's like, here's how 00:45:23.820 |
to edit your straws on it. Here's how to be a better 00:45:26.620 |
skier. This is going to blow people's minds. And you'll be 00:45:29.300 |
more than willing to spend 25 bucks a month. I disagree with 00:45:31.980 |
that. And the reason is because we've spent now two decades, and 00:45:35.580 |
that's a lot of muscle memory to unwind of people that have been 00:45:39.900 |
consistently given more for less. And I think that we 00:45:43.420 |
shouldn't underestimate the expectations we've all 00:45:46.780 |
collectively created by building software tools that have that 00:45:50.660 |
inherent deflationary aspect to them. And so I just think that 00:45:53.900 |
it's going to, it's a very high, high bar, I still think there 00:45:56.820 |
are subscription services to be built. I don't disagree with you 00:45:58.940 |
there, Jason, I just think that in general, though, the de facto 00:46:01.700 |
business model that we've created in tech is more for 00:46:04.140 |
less. And we've used technology to give us operating leverage to 00:46:08.020 |
create margin structures that other companies couldn't copy. 00:46:11.060 |
And I still don't. And I think that AI accelerates that not 00:46:14.980 |
I think it's going to be the opposite. If you look at 00:46:18.540 |
Netflix, if you look at Disney, they've been raising prices, 00:46:21.220 |
providing more value, I think that this is going to provide so 00:46:24.060 |
much value, that the incremental 10 bucks a month, five bucks a 00:46:27.020 |
month per employee is going to pay off so much that this could 00:46:30.660 |
be a slack, or like some presentation software, there are 00:46:33.660 |
a lot of people who are making PowerPoint, AI PowerPoints, where 00:46:37.620 |
it makes you a new deck, or a figma with AI, these things are 00:46:40.980 |
going to be so powerful, people are like, it's totally worth an 00:46:43.180 |
extra 100 bucks a month, because I can get rid of another 00:46:45.060 |
employee, this one employee can now do the work of three, fuck 00:46:48.340 |
it, man, I'll give you $1,000. A really good AI model, if you just 00:46:52.100 |
add it, the LTV of that software company is gonna make more 00:46:54.980 |
money. I'm just saying it's deflationary. That's 00:46:56.940 |
deflationary. Okay, it's deflationary on the entire 00:46:59.500 |
economy. But that software company that figures out how you 00:47:02.180 |
can fire two accountants and keep one and make them as good 00:47:05.540 |
as you know, three. Yeah, you're not going to be able to charge 00:47:08.580 |
for software, right? You're selling consumer surplus. Okay, 00:47:11.540 |
I think we're in agreement. Freeberg, Sultan of Science, 00:47:13.780 |
you want to chime in on this? You still with us? So, then... 00:47:20.700 |
Well, technology is about doing more with less, right? It's 00:47:24.500 |
about doing more with less and the AI helps you do so much more 00:47:29.740 |
I think your whole point about Disney and Netflix, etc, is 00:47:32.460 |
because they aren't, you know, innovating on either side. And 00:47:36.660 |
so in order to drive earnings growth, they're having to raise 00:47:38.820 |
prices. But that doesn't speak to the benefit of technology. 00:47:42.300 |
They're innovating massively, they're adding massive features 00:47:45.180 |
to their products and massive new shows. I mean, I think 00:47:48.500 |
there's pricing power in this AI thing. That's just my belief. I 00:47:51.660 |
thought about leverage. Yeah. I mean, look, I think I think your 00:47:54.220 |
point. Like, so my general rule of thumb on technology is 00:47:59.180 |
the technology creator, the technology company, should 00:48:03.820 |
generally be capturing about one third of the value that they create. 00:48:07.340 |
Unpack that. Why? Where did you come up with that? And so, 00:48:09.700 |
I mean, it's just kind of... let me give an example. Yeah. Yeah. 00:48:13.300 |
So like, let's say that you, as a food delivery company, you have 00:48:20.500 |
to pay a human 10 bucks to deliver food from you. Now, 00:48:23.180 |
let's say I run a robot, my amortized cost of running that 00:48:25.940 |
robot is two bucks. So it's eight bucks cheaper, or call it 00:48:30.100 |
$1. So it's $9 cheaper, I should charge you four bucks. You know, 00:48:35.500 |
because four bucks is super competitive with the existing 00:48:37.860 |
market. And it'll keep me competitive against the other 00:48:40.540 |
automation companies that are going to start to emerge. It's 00:48:43.540 |
just kind of how market dynamics end up working out. If you 00:48:46.100 |
charge too much, you're going to invite people to come in and 00:48:48.740 |
compete with you. If your commodity technology 00:48:50.940 |
commoditizes, remember all technology commoditizes over 00:48:53.220 |
time. And if you don't charge enough, you're not going to make 00:48:58.180 |
enough money to be able to reinvest in scaling your 00:49:00.340 |
business and doing more kind of interesting things as a 00:49:03.540 |
platform. So you know, generally AI provides more leverage to 00:49:06.940 |
Sacks's point, if I can build an application, I don't know if you 00:49:09.500 |
guys have seen these incredible UI apps that are built in AI 00:49:12.380 |
now, where I can say, with a prompt, hey, make me (we talked 00:49:15.580 |
about it two weeks ago, yeah, right) make me a dog walking app 00:49:18.260 |
interface. And it builds like the three steps of the dog 00:49:20.260 |
walking app, and gives you a bunch of options. And you can 00:49:22.380 |
pick the one you want, I would typically have to pay a design 00:49:25.460 |
firm $50,000 to do that work for me. So if the AI is doing it 00:49:29.620 |
automatically, you know, I should be paying, let's say, 00:49:32.700 |
$15,000 for that product, for that capability. The margin on 00:49:36.780 |
that is 100%; the cost, right, whatever it is, is very low. 00:49:40.260 |
And the margin on that's 100%. Whereas the margin on paying 00:49:42.900 |
people to do design work at a design firm is very low, you know. 00:49:47.820 |
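As a rough illustration of the pricing heuristic laid out above (capture roughly a third of the value created, pass the rest on as customer savings), here is a minimal back-of-envelope sketch. The dollar figures mirror the examples in the conversation; the function name and the flat one-third split are illustrative assumptions, not anyone's actual pricing model.

```python
# Back-of-envelope sketch of the "capture roughly one third of the value you
# create" pricing heuristic described above. Numbers mirror the examples in
# the conversation; the one-third split is a stated rule of thumb, not a law.

def suggested_price(incumbent_cost: float, my_cost: float, capture_share: float = 1 / 3) -> float:
    """Price so the provider keeps `capture_share` of the value created,
    and the customer keeps the rest as savings versus the incumbent."""
    value_created = incumbent_cost - my_cost   # e.g. $10 human courier vs. $1 robot -> $9 of value
    return my_cost + capture_share * value_created

# Food delivery example from the discussion: $10 human, ~$1 amortized robot -> charge about $4.
print(suggested_price(incumbent_cost=10, my_cost=1))        # 4.0

# Design example: $50,000 agency quote vs. near-zero AI cost -> roughly the $15,000 ballpark.
print(suggested_price(incumbent_cost=50_000, my_cost=0))    # ~16,667
```

Charging much more than this invites competitors in; charging much less leaves nothing to reinvest, which is the tension described in the conversation.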
I figured it out. You know why we're having, we're working it 00:49:50.620 |
out in our heads right now, one group of us is talking about 00:49:53.460 |
comparing AI software and AI services to the existing 00:49:57.660 |
software stack. And on the other side of the discussion, we're 00:50:00.220 |
comparing it to the humans who are currently doing that work. 00:50:03.100 |
Imagine the 6% that two brokers get, you know, doing the sale of 00:50:07.620 |
a million dollar home, that's $60,000, and AI could negotiate 00:50:11.220 |
that and find you a better home and sell your home for the 00:50:13.380 |
optimal price. For less than that 60,000, what would you be 00:50:17.340 |
willing to pay for that? Right? And the same thing with the 00:50:20.340 |
I don't think that's how it's going to play out exactly J. Cal 00:50:22.940 |
because to completely eliminate a job function, you have to do 00:50:26.700 |
you know, 100% of it. And you have to, you know, 100% of the 00:50:30.260 |
job function, as well as or better than the human, 00:50:33.780 |
whereas, I think as opposed to a model where you still have the 00:50:37.260 |
human in the loop, but they're much more productive, because 00:50:39.820 |
they're working with an AI, they're augmented, they're 00:50:42.100 |
that's more of an Iron Man, an Iron Man-like model. So I think 00:50:46.620 |
that's more effective. Yeah. So I think if there's a job 00:50:49.700 |
reduction, it would be more the case where they've got a team of 00:50:53.100 |
five accountants, and they go to two or three, because now they're 00:50:57.100 |
just much more productive. I don't think they go to zero. 00:51:01.780 |
I look at outsourcing as a possible corollary to this. 00:51:05.220 |
Remember, when you move the accountants to Manila, where 00:51:07.980 |
there are knowledge workers there, and it knocked out half the 00:51:10.020 |
price, two thirds of the price, whatever it was, this just feels 00:51:15.500 |
if you have a business model, like, you know, Infosys, or 00:51:18.300 |
Tata, or one of these things that's levered to utilization 00:51:21.100 |
rate, this is the most obvious way to basically add many 00:51:25.300 |
potentially percentage points, if not 10s of percentage points 00:51:29.420 |
of utilization to your business, that's all money, free money for 00:51:32.180 |
you, right? Because now you'll have fewer people, there'll be 00:51:35.420 |
more utilized, and they'll have more leverage because they'll be 00:51:38.060 |
using a bot or some AI agent to help them write code, write unit 00:51:43.180 |
tests, all that typical stuff that right now you outsource. 00:51:45.580 |
And even if you pay a marginal cost, you add the labor 00:51:49.660 |
arbitrage to technology arbitrage. Now, all of a sudden, 00:51:51.700 |
these businesses look really, really interesting. 00:51:53.820 |
Yeah, I think customer support definitely gets revolutionized, 00:51:58.860 |
right? Because the initial no brainer, you know, the first 00:52:03.740 |
line of defense is going to be the AI using, you know, text to 00:52:08.620 |
voice, and it can choose what language it wants to output to 00:52:12.420 |
what accent. So you'll never know that you're you'll think 00:52:16.940 |
Literally, you'll be in 50 languages with the right answer. 00:52:21.460 |
And you don't need to build up that entire group. I mean, this 00:52:25.420 |
I think we're underestimating in some ways. Yeah, but I think 00:52:29.460 |
But my point is, I think that a lot of that customer support 00:52:32.620 |
inquiries just go away, because the help the assistant gets 00:52:35.700 |
built into the tool directly. So you never get to the point of 00:52:38.940 |
inquiry. Yeah, like, why do you, you know, if you can just ask 00:52:43.180 |
people do that right now on YouTube, if you just type the 00:52:45.940 |
question into YouTube, and you find the video that takes five 00:52:49.180 |
minutes, but you're saying this gonna take 15 seconds, because 00:52:52.740 |
I think what Zack said before is hugely important. When you think 00:52:56.140 |
AI touches non technology businesses, what he said is the 00:52:59.020 |
boundary condition, which I think is right, I think he 00:53:00.860 |
nailed this, which is the boundary condition for AI to 00:53:04.300 |
replace a human is where the threshold error rate of that AI 00:53:08.620 |
is the same or less than the human, right? If you look at 00:53:12.540 |
very complicated markets, where does regulatory capture rear its 00:53:17.340 |
ugly head, it's in allowing humans to be error prone, and 00:53:20.460 |
you can't do anything about it, take healthcare. If you go into 00:53:23.700 |
a hospital, there's a certain error rate in every surgery, 00:53:26.940 |
right? There's a certain error rate in the things that happen. 00:53:29.780 |
But there's probably a whole bunch of ways in which that 00:53:32.860 |
entire infrastructure can be made much, much better with AI, 00:53:35.580 |
right, a robot that does laser guided precision surgery, 00:53:40.100 |
characterizing tumors with 100% accuracy. So you always get 00:53:44.740 |
100% of the cancer out when you go and get surgeries done. All 00:53:48.100 |
these things are possible now. And all of a sudden, you take 00:53:51.900 |
these error rates that can be as high as 20 or 30%. So 00:53:55.140 |
for example, breast cancer surgeries, the dirty secret of 00:53:57.980 |
our healthcare industry is that it has a 30% error rate, you know, 00:54:01.420 |
that can and should go to zero. And now all of a sudden, so 00:54:05.220 |
these highly regulated markets, I think can become much, much 00:54:08.700 |
more efficient and leveraged, and pass that 00:54:12.860 |
consumer surplus on to people. In that case, it's healthfulness, 00:54:16.340 |
which I think is a big deal. 00:54:21.220 |
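A minimal sketch of the replacement boundary condition described above, purely to make the threshold explicit. The 30% breast cancer surgery figure is the one cited in the conversation; every other task and every AI error rate below is a hypothetical placeholder, not a measured number.

```python
# Sketch of the "boundary condition" discussed above: an AI can take over a
# task once its error rate is at or below the human baseline it replaces.
# The 30% surgery figure is from the conversation; the rest is hypothetical.

def ai_can_replace_human(ai_error_rate: float, human_error_rate: float) -> bool:
    """Boundary condition: automation is viable when the AI errs no more often than the human."""
    return ai_error_rate <= human_error_rate

tasks = {
    # task: (hypothetical AI error rate, human error rate)
    "breast cancer surgery margins": (0.05, 0.30),
    "tier-one support answers": (0.08, 0.10),
    "final legal sign-off": (0.15, 0.05),
}

for task, (ai_err, human_err) in tasks.items():
    verdict = "candidate for automation" if ai_can_replace_human(ai_err, human_err) else "keep a human in the loop"
    print(f"{task}: {verdict}")
```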
Interesting. I did my Prenuvo scan. Yeah, incredible. I mean, I got all the videos, I 00:54:24.940 |
got all the loops. I went to the one down on El Camino Real. It 00:54:28.780 |
was like going to a spa in and out, no big deal. But I got the 00:54:31.660 |
results. And it's like, oh, here, here's a ton of little 00:54:33.580 |
things that are not worth cutting your body open to look 00:54:37.260 |
at. But just so you know, your knee, your shoulder, your 00:54:39.620 |
kidney, there's a little polyp here, there's a little polyp 00:54:41.420 |
here, whatever, there's a little growth here. But let's see in 00:54:44.220 |
two or three years, just monitor it. And I'm like, Oh, my God, 00:54:46.700 |
I'm so grateful. This thing gets down to like 500 bucks, which 00:54:50.700 |
it obviously will or 1000 bucks, and everybody's doing it. And 00:54:52.980 |
then all that data is in there. And then the AI is looking at 00:54:56.100 |
it, like you're saying, I mean, the early detection, was the AI 00:55:00.340 |
able to tell the doctor how full of shit you are? No, you know, 00:55:04.460 |
you're not supposed to eat for four hours. So they, they didn't 00:55:13.980 |
Yeah, I, here's a really important clip for founders. 00:55:20.460 |
This is super important when looking at web three versus AI 00:55:25.140 |
to Sax's point, you've got to start with the customer 00:55:28.780 |
experience and work backwards to the technology. You can't start 00:55:34.020 |
with the technology and try to figure out where you're going to 00:55:37.820 |
try to sell it. And I've made this mistake probably more than 00:55:40.260 |
anybody else in this room. And I've got the scar tissue to 00:55:43.460 |
prove it. And I know that it's the case. And as we have tried 00:55:48.020 |
to come up with a strategy, and a vision for Apple. It started 00:55:58.100 |
with what incredible benefits can we give to the customer? 00:56:03.580 |
Where can we take the customer? Not, not starting with, let's 00:56:09.260 |
sit down with the engineers and and figure out what awesome 00:56:13.500 |
technology we have. And then how are we going to market that? 00:56:17.020 |
Um, and I think that's the right path to take. 00:56:21.820 |
Can I ask you guys a question? I sometimes I go down these rabbit 00:56:25.700 |
holes. I'll watch hours and hours of Steve Jobs clips. What 00:56:29.380 |
do you think makes him so calm? Doesn't he just strike you as 00:56:36.340 |
incredibly just like calm and like comfortable with himself 00:56:40.900 |
and just aware. I know what it is. What is it? He was so much 00:56:44.780 |
better at aesthetically building product than anybody 00:56:49.700 |
else. He, when you think of that PC era of no taste, beige boxes, 00:56:54.740 |
and everybody having no style, and just no swagger. He was 00:56:59.900 |
studying, you know, German design, Buddhism, tripping on 00:57:04.740 |
acid, and like just understanding the universe at a 00:57:08.100 |
level that Gates and the other contemporaries weren't, they 00:57:11.580 |
just weren't as transcendent in understanding product design as 00:57:16.900 |
he was. So it was like when you were saying you were playing 00:57:19.780 |
poker with a bunch of four year olds or something. That's the 00:57:22.540 |
analogy. He's just on such a different level that he's 00:57:25.340 |
watching people make, you know, AS/400s and, you know, IBM PS/2s, 00:57:31.660 |
whatever, like, just garbage computers, garbage operating 00:57:36.060 |
the thing is, like, if you look at any era, just the way that he 00:57:39.180 |
communicates, there's just a level of calm. I don't know how 00:57:43.500 |
to describe it. So you understand what I'm trying to 00:57:44.940 |
say? Like he he just seems like he just sees through all the 00:57:47.620 |
noise. Like he's seen through the matrix, like he's unplugged 00:57:50.460 |
Sax is not impressed. Okay, there you have it. 00:57:53.620 |
No, I'm very impressed with Steve Jobs. I think he 00:57:56.380 |
understood product development better than anybody else. Yeah. 00:57:58.740 |
Clearly, that's it. I mean, my favorite Steve Jobs passage is 00:58:04.420 |
the one where he describes the John Sculley disease. Do you 00:58:07.860 |
Yeah, no. Oh, here it is. You know, one of the things that 00:58:13.700 |
really hurt Apple was after I left, John Sculley got a very 00:58:16.940 |
serious disease. It's the disease of thinking that a 00:58:20.220 |
really great idea is 90% of the work. And if you just tell all 00:58:23.740 |
these other people, here's this great idea, then of course, you 00:58:27.020 |
can go off and make it happen. And the problem with that is 00:58:29.780 |
that there's just a tremendous amount of craftsmanship in 00:58:33.460 |
between a great idea and a great product. Yeah. So true. 00:58:38.420 |
Yeah, I mean, I tell people it's like a rugby scrum. You go, you 00:58:42.780 |
know, you got to get a whole team to get the ball down the 00:58:44.540 |
field. It's not like one person put the ball down the field. 00:58:48.820 |
You know, they kind of maybe suggested a play. But once 00:58:51.500 |
you're on the field, everything changes. And everyone's involved 00:58:55.700 |
That quote's where the name for Craft Ventures comes from. 00:58:58.500 |
Oh, really? Oh, little known fact. Yeah. I didn't know that. 00:59:01.540 |
Yeah. Section 230. We talked about last week. The Gonzalez 00:59:06.660 |
versus Google case, the justices heard oral arguments and 00:59:09.260 |
plaintiffs seem to fare poorly, quote from SCOTUS blog, Justice 00:59:14.260 |
Elena Kagan suggests that even if section 230 is not well 00:59:18.900 |
suited to address the current needs of today's internet, 00:59:21.660 |
such a task was best left (as we predicted last week, I think, 00:59:26.380 |
Sacks, you did) to Congress rather than the Supreme 00:59:29.860 |
Court. Quote, these are not like the nine greatest experts on the 00:59:37.060 |
yeah, I mean, this is just, I think, really a quick update to 00:59:40.820 |
what we talked about last week, the justice heard oral 00:59:43.700 |
arguments, they seem to be very skeptical of the plaintiffs 00:59:46.580 |
arguments. Even Justice Thomas, who has written the most 00:59:51.300 |
skeptically in recent years about the broad immunity that 00:59:54.620 |
tech companies enjoy under section 230, seem surprisingly 00:59:58.140 |
sympathetic to the theory that the Ninth Circuit Court ruled on, 01:00:03.300 |
which is that section 230 protects recommendations, as 01:00:06.900 |
long as the providers algorithm treats content on its website 01:00:09.580 |
similarly. So even the justice who I think was most likely to 01:00:15.500 |
rein in 230 seemed to be more comfortable with what the 01:00:19.180 |
defendant, which was Google was saying. So it looks to me like 01:00:23.140 |
Google and big tech are going to win this one. 01:00:25.180 |
Any thoughts? No, not really. I think I want to know what you 01:00:30.100 |
guys think about Trump showing up with Big Macs and water in 01:00:32.700 |
East Palestine. I mean, he is a genius. He beat Buttigieg to 01:00:37.300 |
East Palestine. Yeah, that was unbelievable. Pull up my tweet. I 01:00:42.900 |
think this is the power... because Trump has been out of 01:00:46.700 |
the public discourse. He's a media... he is a media savant. 01:00:50.620 |
Literally Biden is in Ukraine, saber rattling over air sirens 01:00:56.220 |
that may or may not be true. They were fake. Who cares? 01:01:01.180 |
Well, no, no, I think it doesn't matter. No, it doesn't matter. 01:01:03.780 |
No, we don't know. So we do we do actually. Okay, because 01:01:09.620 |
hold on a second. I don't need to be there because Jake Sullivan 01:01:14.500 |
just gave a press conference. And he was asked by a CBS News 01:01:17.460 |
reporter if the US gave the Russians any kind of heads up 01:01:20.580 |
the president was going to be in Kiev. And what Sullivan said, 01:01:24.060 |
and I quote is we did notify the Russians that President Biden 01:01:27.740 |
will be traveling to Kiev. We did so some hours before his 01:01:30.740 |
departure for de-confliction purposes. You know what 01:01:34.500 |
de-confliction is, right? It's when the US tries to avoid an 01:01:39.140 |
accidental conflict. And you know, Putin's not crazy enough 01:01:42.660 |
to try and assassinate Biden. So the Russians were not attacking 01:01:45.820 |
Kiev that day. In fact, they haven't attacked Kiev as far as 01:01:48.660 |
I know, for weeks. So these air raid sirens were basically just 01:01:51.860 |
pure theater. But the amazing thing is, if you don't know if 01:01:55.420 |
you don't know that Biden orchestrated is my point people 01:01:58.420 |
on your side. Come on, Jason. But that doesn't mean Biden 01:02:02.380 |
press the button. So don't, don't also take it to the other 01:02:05.220 |
extreme either. Who knows who or why the siren went off, 01:02:10.020 |
but put it aside. This was a joint event between the Biden 01:02:13.540 |
administration and the Zelensky team. They organized it the 01:02:17.020 |
whole thing was choreographed. How did how did that red carpet 01:02:19.500 |
get there, Jason? Was that an accident too? Okay, let's put 01:02:23.620 |
that aside. Like this is accidental. I mean, how can I 01:02:26.420 |
give you your GOP? Let me give you your GOP win. Donald Trump 01:02:30.460 |
is a savant. And he went to America to the place that we 01:02:35.420 |
were reporting on the under reported story. People in East 01:02:39.700 |
Palestine are being ignored. And he goes there to help the people 01:02:45.260 |
of America. I give you all credit. Your guy, Saks did the 01:02:49.980 |
most amazing media move in history. He went to middle 01:02:53.460 |
America where people are suffering as opposed to a war 01:02:55.580 |
that nobody wants to be in and spend all that money on. We won't 01:02:58.380 |
spend money here, but we will go spend billions in Ukraine. Go. 01:03:05.020 |
All right. I don't know what this reminded me of. And you may 01:03:07.620 |
think this is a weird connection. But it reminded me 01:03:09.620 |
of the ending to the movie Boyz n the Hood. Do you remember 01:03:12.900 |
what happens at the end of that movie? No, I haven't seen it in 01:03:15.260 |
years, it was years ago. Okay, it's 30 years old. But Ice Cube, you know, 01:03:18.300 |
plays this character doughboy and his brother gets killed. And 01:03:21.740 |
at the very end of the movie, he gives this speech to Cuba 01:03:25.060 |
Gooding Jr. where he says, you know, I turned on the TV. And 01:03:28.820 |
there was all this shit about violence in a foreign land. And 01:03:31.460 |
there was nothing on my brother getting killed all this stuff 01:03:33.820 |
about what's happening in foreign countries, nothing about 01:03:36.340 |
what's happening here. And then I think the most memorable line 01:03:38.900 |
was, either they don't know, don't show, or they don't care 01:03:42.580 |
what's going on in the hood. Right. So what's going on here 01:03:46.460 |
is the people of East Palestine, Ohio are being engulfed in a 01:03:50.580 |
plume of carcinogens and toxins. And Biden is off pursuing 01:03:55.420 |
this crusade in eastern Ukraine. And it's not just him. I'll 01:03:59.540 |
dish it out to Mitch McConnell as well. Mitch McConnell is the 01:04:02.180 |
neocon of neocons. Yeah, McConnell was on TV saying that 01:04:06.500 |
the number one priority of the United States right now is 01:04:09.180 |
defeating Russia in Ukraine. It's not helping the people of 01:04:14.020 |
Ohio. It is not securing the border. It is not solving crime 01:04:17.700 |
in our cities. It is not making our schools better. It's running 01:04:20.740 |
off and basically supporting this war in Ukraine. So both 01:04:24.020 |
these octogenarians, Biden and McConnell, both, they either don't 01:04:28.260 |
know, don't show, or they don't care what is happening in the United 01:04:31.780 |
States of America. He's a genius. But it's not even 01:04:34.220 |
genius. I mean, it's so obvious that you go there. It's so 01:04:36.860 |
obvious. Nobody wants to be in a forever war. Go there. And Biden 01:04:40.460 |
didn't go there. It's not it's not genius. It's obvious. Go 01:04:46.740 |
he hasn't. Make a trip. I think the most senior Democratic 01:04:50.100 |
person that went over there was Josh Shapiro, who's the governor 01:04:53.100 |
of Pennsylvania, he got there before Buttigieg. 01:04:55.500 |
What is going on? And I mean, and this it's it's a never 01:04:59.300 |
ending war. And so, you know, this is if nobody wants to fight 01:05:04.100 |
a never ending war. This is, this is what got Bush in 01:05:07.220 |
trouble, right? Like this was the big critique is like, we're 01:05:09.500 |
spending all this money over in the Middle East on these 01:05:13.460 |
Well, you're talking about Bush senior. Yeah. So let's let's 01:05:15.740 |
contrast with Bush senior, I think actually, it's a good 01:05:17.940 |
analogy. So with Bush senior, actually, this is 1991. He 01:05:22.580 |
won the Iraq war, that was actually a stunning foreign 01:05:25.420 |
policy success. Because he actually didn't go too far. He 01:05:28.540 |
didn't go all the way on the road to Baghdad, the way that 01:05:30.500 |
his son, George W. Bush would creating an epic disaster. So 01:05:34.380 |
Bush 41 delivered a victory there. And he still lost 01:05:38.820 |
election. Why? Because he seemed out of touch. He wasn't focused 01:05:42.220 |
on domestic problems. The American people want an American 01:05:46.180 |
president to focus on American problems. And even if Biden 01:05:49.940 |
delivers some sort of victory in Ukraine, if he ignores these 01:05:52.900 |
festering problems at home, that he is, I think, vulnerable for 01:05:57.900 |
this reelection. But I think the truth of the matter is that this 01:06:00.820 |
war is going to turn out much worse than the Iraq war did in 01:06:04.620 |
1991. Because in 91, we showed restraint, and we knew what our 01:06:09.020 |
vital interest was. And we kept our objectives limited. And we 01:06:12.500 |
kept the timetable very short. What is Biden doing here, Biden 01:06:16.380 |
won't tell us what the objective is, it's just whatever the 01:06:18.860 |
Ukrainians want. He won't tell us what the timetable is. It's 01:06:21.740 |
basically for as long as it takes. And then meanwhile, this 01:06:25.140 |
week, you had Kamala Harris go to the Munich summit, declaring 01:06:28.580 |
that the Russians are guilty of crimes against humanity, which 01:06:31.900 |
that's something that we could have assessed after the war. 01:06:34.900 |
Think about the incentives, you're now giving the Russian 01:06:37.060 |
leadership before we said that we just wanted them to leave. 01:06:39.660 |
When you accuse them of war crimes, it implies that we're 01:06:42.740 |
gonna go chasing them all the way to Moscow, they're not gonna 01:06:45.340 |
want to end this war, when they can be put on trial at the Hague. 01:06:48.620 |
I mean, this is highly inflammatory. So, you know, this 01:06:53.620 |
Yeah, that was the thing I didn't like about Biden speech 01:06:56.380 |
over there is just, he's escalating, escalating, 01:06:58.780 |
escalating, hey, that we have to stop Putin, which, I mean, you do: 01:07:01.940 |
he did invade another country, he did cause, what, three 01:07:05.140 |
or 400,000 Russians to die, according to reports; over 100,000 01:07:09.180 |
Ukrainians have died, and, by the way, neither side has given 01:07:11.940 |
the accurate number because they don't want to demoralize their 01:07:14.740 |
constituents. But the amount of suffering going on here is 01:07:18.700 |
extraordinary. And I think it should be the West who is going 01:07:21.300 |
send Macron, send somebody from Germany, send some, you know, 01:07:26.260 |
group of people to then go to Ukraine and work this out. But 01:07:30.540 |
you don't need to go in there saber rattling, it was too much 01:07:33.300 |
saber rattling for me; there is not enough de-escalation. We need 01:07:37.460 |
de-escalation in these situations, not saber rattling. 01:07:40.140 |
I agree with you, Jason, but, but Biden has really painted 01:07:43.780 |
himself into a corner here. Because before the war, he 01:07:47.340 |
refused to take NATO expansion off the table, he refused to 01:07:50.220 |
recognize the Russian interest in Crimea. And we gave no 01:07:53.340 |
support to the Minsk Accords, which would have given some 01:07:55.420 |
limited autonomy to the Russian speakers in the Donbass area. If 01:07:58.580 |
we had just done those three things, there would have been no 01:08:00.460 |
war, Biden refused to do that he refuses to take expansion off 01:08:04.220 |
the table even today. So he has nothing to compromise with he is 01:08:07.740 |
dug in. And the problem we have now is that it's a lose-lose 01:08:12.700 |
scenario. If the Ukrainians keep doing poorly, because right now 01:08:17.260 |
it looks like they're on the back foot. What is the United 01:08:19.580 |
States going to do? We're going to let them lose this war? Or 01:08:22.140 |
are we going to keep giving them more aid and step in? It looks 01:08:24.260 |
to me like Biden now is invested his whole presidency in this, 01:08:27.460 |
and he can't just let them lose, which means more escalation from 01:08:30.220 |
us. And on the Russian side, if the Russians lose, then they 01:08:34.580 |
have an incentive to use nuclear weapons to rescue the 01:08:36.940 |
situation. So it seems to me that both scenarios here are 01:08:42.020 |
really bad. And we don't really have a good way out of this. 01:08:45.460 |
We're looking for some sort of magical Goldilocks scenario 01:08:48.860 |
where the Russians sort of lose but not enough to use nukes. You 01:08:52.980 |
know, the administration has not given us a clear picture of what 01:08:57.020 |
victory looks like here, that's actually reasonably achievable 01:09:00.940 |
in a reasonable timeframe at a reasonable cost. 01:09:04.260 |
What do we think? freeberg of Xi Jinping, making overtures and 01:09:08.980 |
hey, maybe we should work towards peace. If you follow 01:09:11.740 |
the money, he wants cheap oil. He wants this thing to end and 01:09:14.660 |
he wants the West to be buying goods from China. The West wants 01:09:18.500 |
to sell a bunch of armaments. The military industrial com 01:09:21.940 |
complex is absolutely delighted to be replenishing all of these 01:09:26.500 |
weapons. Perhaps a little cynical to follow the money 01:09:29.260 |
concept. But what was your take on the chessboard of Xi Jinping 01:09:33.660 |
is going to visit Putin before Biden does. And he wants to 01:09:38.940 |
build bridges and we want to saber-rattle. What are your thoughts? 01:09:43.220 |
What's he getting? Like, I mean, China buys energy from Russia today, 01:09:48.020 |
they buy oil on sale at a very cheap price. So if I'm China, I 01:09:52.500 |
want this to last longer, don't I? Like, why would I want to end 01:09:55.380 |
this and then have Russia's markets open up? Because if 01:09:57.860 |
their markets open up the markets normalize to market 01:10:00.980 |
prices, right now they're getting a discount. So I think 01:10:04.300 |
yeah, they certainly don't want things to escalate. The question 01:10:07.700 |
is how quickly do they want them to de escalate? So I'm China, 01:10:11.340 |
I'm kind of probably playing a little bit of a, you know, 01:10:14.780 |
middle line here. I just I obviously don't want to see a 01:10:18.460 |
big hot war. China's got its own domestic problems right now that 01:10:20.900 |
seem pretty significant. And existential and having access to 01:10:24.420 |
cheap energy seems like a benefit. Obviously, if there's 01:10:29.100 |
significant conflict and escalation of conflict, that 01:10:31.220 |
would be very bad from an economic perspective for China. 01:10:33.740 |
So they're probably somewhere in the middle, like a slow 01:10:36.820 |
resolution, let's say, I don't know. I mean, this is pure... 01:10:40.740 |
Sacks or Chamath? Europe isn't going to buy Putin's oil anytime 01:10:46.020 |
soon, but he's able to sell it to China, he's able to sell to 01:10:48.300 |
India and the rest of the world. There was actually an article in 01:10:50.300 |
today's New York Times about how the West may be unified about 01:10:54.620 |
Ukraine, but the rest of the world is not. The article was 01:10:57.740 |
saying something that critics of the West have said for a while, which is 01:11:00.020 |
we actually don't have the whole world with us at all. The BRICS 01:11:03.060 |
countries are not with us the emerging world, the whole 01:11:05.580 |
southern hemisphere basically is not with us. They would like the 01:11:08.660 |
US to play a more constructive role in finding a peace deal. 01:11:11.180 |
Not like you said, Jason, saber rattling or escalating. So the 01:11:14.460 |
rest of the world is not happy with us. And this is why the 01:11:17.020 |
Russian sanctions have not been effective. I think the Russian 01:11:19.820 |
economy has had like a three to 4% hit. It is not the collapse that 01:11:23.420 |
was predicted, because there are enough other countries willing 01:11:26.860 |
Would this have happened, Chamath, if Trump was president? And 01:11:29.140 |
how would Trump have handled it? Do you think just game theory 01:11:32.340 |
here? I'm just curious. Because Trump almost won, right? I mean, 01:11:34.980 |
if Trump had won, what would this look like? Would Putin have 01:11:36.900 |
gone in there if Trump was president? And how would Trump 01:11:39.740 |
have handled it? Because Trump seems to think I would have just 01:11:42.580 |
told him don't do this, and they wouldn't have done it. 01:11:44.140 |
I mean, this is the most obvious compliment I can give him. I 01:11:47.540 |
think that he is exceptionally pragmatic on being anti war. And 01:11:54.420 |
I think that that is one of the most positive characteristics 01:11:59.500 |
that he showed he was really the only president I think in modern 01:12:02.740 |
history, right, actually, that hasn't gotten us embroiled in a new war. 01:12:08.940 |
He's been incredibly, incredibly consistent. So I suspect that 01:12:11.900 |
there would have been some kind of a deal. I know that sounds so 01:12:16.420 |
ridiculous to say, but there would have been a deal. 01:12:18.340 |
He's actually a great... he's a dealmaker, Jason. Look at 01:12:22.940 |
North Korea: he went to North Korea and met with 01:12:28.700 |
He would have fired all of the deep state blob that started to 01:12:32.700 |
position anything towards a conflict. So I think he would 01:12:36.860 |
have shut the door so ferociously on Ukraine and NATO 01:12:40.580 |
and anybody that crossed that line, he would have tarred and 01:12:43.820 |
feathered publicly. And I think the end result would have been 01:12:47.060 |
that Putin could have found an off ramp well before he invaded 01:12:53.420 |
And blame Germany for all this, right? He called it. 01:12:56.620 |
Well, Trump very early asked the question, why are we spending 01:13:01.380 |
all this money to defend Germany when Germany has this big 01:13:04.380 |
pipeline deal with Russia doesn't seem like they need our 01:13:06.940 |
protection, they should just pay for it themselves. But I think 01:13:09.500 |
there's a separate point that Chamath just made that is a 01:13:11.540 |
really good point, which is Trump's instinctual resistance 01:13:16.700 |
to what the deep state wants. And he actually said it this 01:13:19.180 |
week, he gave a two minute televised statement that was all 01:13:23.180 |
over Twitter, where he basically made the argument that listen, 01:13:26.140 |
the reason why we're in this war is because the military 01:13:29.060 |
industrial complex and the foreign policy establishment, 01:13:32.140 |
they basically courted this conflict and they are working at 01:13:35.860 |
odds with the interest of the American people. It's actually 01:13:38.540 |
a fairly radical critique, I don't think a major presidential 01:13:41.940 |
candidate has run against the military industrial complex the 01:13:45.100 |
way that he is now positioning himself. And let me tell you 01:13:48.140 |
this, you know, I've said it before, he's not my preferred 01:13:50.380 |
candidate. But if this war spirals out of control, either, 01:13:55.020 |
you know, it turns into a even bigger conflict that draws us in 01:13:59.580 |
or it turns into a big recession, because I don't think 01:14:03.100 |
we've seen the last of the supply shocks from this war. If 01:14:07.580 |
we get a recession that Trump can, I think, lay at the feet of 01:14:12.700 |
this war, he's positioning himself to take advantage, this 01:14:15.780 |
could be a silver bullet for him. I don't think he has any 01:14:18.140 |
other way of winning. But, you know, if this turns into a big 01:14:22.780 |
positioning, you have your tinfoil hat there. Put it on for 01:14:26.820 |
a second. I want to talk to tinfoil sacks, tinfoil hat 01:14:29.620 |
sacks. Let's put them the tinfoil hats on here. Do you 01:14:32.340 |
think Putin is escalating this as a way to position Trump to 01:14:39.740 |
where Putin says he could say this during the election? Like, 01:14:42.420 |
listen, you know, I would love to talk to Trump. And what if 01:14:44.980 |
Trump goes and talks to Putin? Or does a phone call with him? 01:14:48.100 |
Because I know that's against the rules, right? 01:14:49.980 |
So wait, so so your theory is that Putin is going to 01:14:54.260 |
Okay, so so your theory is that Putin's escalating this into 01:14:59.180 |
potentially a nuclear war to get Trump reelected. That's your 01:15:03.220 |
Tim, I'm just tinfoil hatting it. The reason that this has 01:15:07.660 |
occurred. No, no, now that this has occurred, not that he did. 01:15:19.660 |
he started the war for it that he would end the war to give 01:15:23.460 |
How's he gonna end the war for Trump? What are you talking 01:15:26.700 |
about? During the election? He's he does a call with Trump. And 01:15:30.780 |
he says, you know, I talked to Trump about this. And I'd love 01:15:33.540 |
to do some negotiations with Trump. I've always had 01:15:36.460 |
appreciation for his ability to help negotiate things I would 01:15:39.500 |
love I would feel better about negotiating with Trump, who 01:15:42.540 |
hasn't saber rattled and told everybody in the world that I 01:15:45.300 |
have to be that there isn't regime change. So 01:15:48.420 |
I know, it's really interesting how you come up with these 01:15:50.140 |
conspiracy theories, and then attribute them to me and call me 01:15:52.780 |
the tinfoil hat guy. But listen, I know it's a joke. I know. 01:15:59.180 |
No, it's a silver bullet. It's rails. Yeah, if this war is off 01:16:02.900 |
the rails, and the economy goes off the rails, because of this 01:16:05.380 |
war, he Trump right now is positioning himself to take 01:16:08.220 |
advantage of that fact. And DeSantis is to play right into 01:16:11.140 |
his hands as a pacifist, critical things about the war 01:16:13.620 |
skeptical, I would say things about the war this week. So it's 01:16:15.980 |
not just Trump. But look, the thing you have to understand 01:16:17.980 |
about this war is it's existential for Putin. It's existential. 01:16:23.500 |
It's extra... And it's extracurricular for us. 01:16:26.460 |
Yeah, yeah. And that's why Obama said back in 2014, that the 01:16:31.820 |
Russians have escalatory dominance, they will always 01:16:34.900 |
climb the escalatory ladder all the way up to nukes if they have 01:16:37.260 |
to. And the sooner we recognize that fact, the better off we're 01:16:40.140 |
I think the good news is that we are... That speech that he did, where he 01:16:43.660 |
kind of... did you see the speech? Was it good? We just talked about it. 01:16:47.500 |
It was two minutes. It was fabulous. Sacks just 01:16:50.940 |
The crazy thing is, it sounded a lot like what we've been talking about on 01:16:53.500 |
this podcast, which is he talked about all these generals that 01:16:56.300 |
retire... Victoria Nuland. He called out Victoria Nuland by 01:17:02.140 |
Because I didn't see this, because I'm on a different time 01:17:05.340 |
zone, and it must have broken when I was asleep or 01:17:07.820 |
school. It's a two minute video in which he like I said, he 01:17:10.820 |
attacked the military industrial complex and foreign policy 01:17:12.940 |
establishment for creating this war. And he mentioned Victoria 01:17:15.460 |
Nuland by name. Let me tell you something. Nuland is going to 01:17:18.260 |
be... it's going to be a very popular message. But yes, it's 01:17:21.060 |
very popular. Nuland is the Fauci of this situation. Okay. 01:17:25.220 |
The same way that Fauci was supposed to be protecting us 01:17:28.180 |
from viruses, and then funded gain-of-function research. Victoria 01:17:33.500 |
Nuland was... That's misinformation. Let me tell you something, Victoria 01:17:36.460 |
Nuland was supposed to be our 01:17:40.140 |
chief diplomat with respect to Russia and Eastern Europe. And 01:17:43.700 |
what did she do instead? She ginned up this conflict. How 01:17:47.700 |
he ended up. We backed an insurrection in Ukraine in 2014. 01:17:52.780 |
Jason, if you didn't like the insurrection of January six, let 01:17:55.380 |
me tell you, you aren't going to like the insurrection that she 01:17:57.580 |
staged in Ukraine. Because they brought in these Ukrainian far 01:18:01.500 |
right nationalists as the muscle. And that is what we also 01:18:04.740 |
bring... Did he bring Big Macs? Did he bring Big Macs with him? Did you 01:18:10.220 |
say he brought Big Macs to East Palestine? He brought fast 01:18:15.260 |
food to them? Yeah. No, you're not... but you're ignoring what 01:18:18.060 |
SAC said. But no, no, I got it. I am not disagreeing with him. I 01:18:21.340 |
think if you want to understand the roots of this conflict, 01:18:24.580 |
nobody wants to be in a forever war. Yeah. But just be explained 01:18:28.900 |
why he mentioned Victoria Newland. He mentioned her 01:18:31.740 |
because she was the State Department official who was 01:18:35.100 |
responsible for backing this insurrection of a democratically 01:18:38.900 |
elected leader in Ukraine in 2014. named Yanukovych. Okay, 01:18:43.900 |
Yanukovych was trying to was doing a balancing act between 01:18:48.020 |
Ukrainian nationalists and Russia. And it was a very 01:18:51.140 |
delicate balancing act. And we basically toppled him. And ever 01:18:55.260 |
since then, the relations with the Russians over Ukraine have 01:18:58.100 |
been headed south. If you're wondering why Putin seized 01:19:01.340 |
Crimea, it was in direct retaliation for the coup that we 01:19:04.140 |
backed in Ukraine in 2014. This is the origin of the conflict. 01:19:08.620 |
And, you know, if you want to understand where this comes 01:19:12.620 |
from, you have to go back to this. And the fact that Trump's 01:19:16.500 |
I think that the good news for us is I think that heading into 01:19:20.500 |
June and the debt fiasco that's looming, I think we're going to 01:19:25.500 |
and I think this will help a lot, get distracted with 01:19:28.380 |
domestic issues in the sense that it'll take some heat off of 01:19:31.940 |
escalating all of this foreign adventurism. You know, this is 01:19:38.140 |
such a scene like this is such a scene from wag the dog. Every 01:19:43.500 |
time there's something inside the United States that we should 01:19:47.420 |
really focus on. We have this wag the dog moment where we get 01:19:51.020 |
distracted by some adventurism abroad, and we forget and we 01:19:55.020 |
lose sight. So we have this East Palestine thing right now. In 01:19:58.820 |
June, we're gonna have to come back to terms with this debt 01:20:00.780 |
ceiling issue, which is a huge one, how we're going to resolve 01:20:03.660 |
it. It's not clear. Just this week, the Federal Reserve 01:20:08.540 |
basically said, Hey, folks, we're taking rates to five and 01:20:11.260 |
a half plus, and they're going to stay there. That seems like 01:20:14.460 |
no news. People just seem to digest it and move on. It's 01:20:18.740 |
really incredible how we just... we're like, what is it, 01:20:23.060 |
Jason, the dog that chased the bumper and caught the car or 01:20:29.020 |
We got plenty of big problems here in the United States, 01:20:31.580 |
plenty of big problems. And I don't know that wag the dog 01:20:33.940 |
works anymore. Because I think the American people want, like 01:20:36.740 |
I said, they want an American president to focus first and 01:20:38.900 |
foremost on American problems. And even remember, Bush senior 01:20:42.180 |
91 won that war and still lost reelection still lost. So I 01:20:46.060 |
don't think wagging the dog works anymore. It works for some 01:20:48.380 |
short period of time, especially while the media are portraying 01:20:51.460 |
this point, the air raid theater, that's eventually the 01:20:56.260 |
you're so right. So that issue, think about Bush, Bush came off 01:20:59.700 |
of the Persian Gulf War with like a 91 or 92% approval 01:21:03.380 |
rating. I mean, we've never seen anything like it. But he 01:21:07.020 |
violated a simple tenet of his domestic policy, which is read 01:21:09.980 |
my lips, no new taxes, boom, lost. And it was not even close 01:21:14.460 |
in the end. So I think you're right. I think people really 01:21:18.460 |
care about the economy. Go Nikki Haley. And how much, how 01:21:23.460 |
much debt do we want to go into over foreign wars? The only 01:21:26.900 |
thing I ever liked about Trump was his policy of not starting 01:21:30.500 |
wars and not getting into them. And Americans want to focus on 01:21:35.220 |
I'm a balance sheet voter right now I'm voting based on who is 01:21:38.380 |
going to be fiscally responsible, I mean, freeburger 01:21:42.620 |
we've got to be real careful in how we handle China, because you 01:21:44.980 |
had Blinken on all the Sunday shows, basically denouncing 01:21:47.620 |
them expressing outrage, that they might support the Russians 01:21:50.820 |
acting shocked, shocked that they could do that. We don't even 01:21:53.620 |
have the ability anymore to understand that other countries 01:21:57.100 |
do things in their own interest. And we can't accept that. And 01:22:01.300 |
instead, we act as if foreign policy should be conducted 01:22:04.660 |
according to this morality play that we've created. And if you 01:22:07.660 |
don't do what we think is right, then we're gonna express all 01:22:10.860 |
this outrage and condemnation at you. And somehow that's going to 01:22:13.300 |
get you to violate your own interests. That's not the way 01:22:15.500 |
the world works. And what we're doing right now, what we're 01:22:18.180 |
doing right now is pushing China and Russia together into a new 01:22:22.580 |
axis bloc. This is very foolish, very foolish, even 01:22:26.460 |
during the Cold War, okay, we worked to keep Russia and China 01:22:30.820 |
apart. And whatever you think of those regimes today, they were 01:22:34.820 |
much worse back then. Remember, the Soviets, you had a Stalinist 01:22:38.380 |
regime, the Chinese had Mao, those were two of the three 01:22:38.380 |
biggest mass murderers of the 20th century. And Nixon and 01:22:45.220 |
Kissinger still went to China and shook Mao's hand and toasted 01:22:48.780 |
him because it's important to keep China and the Soviet Union 01:22:52.580 |
divided. And what are we doing today, we are basically pushing 01:22:55.740 |
them together. With all this condemnation and outrage, it is 01:22:59.140 |
not a smart strategy. Can't disagree. We need to be building 01:23:02.700 |
bridges with India. That's a key key relationship. And China. I 01:23:06.620 |
don't know why we're not figuring out what we have. Yeah, 01:23:09.140 |
this is poisoning our relationship with India. India 01:23:11.700 |
is the biggest democracy in the world. And our relations with 01:23:14.540 |
them have gone south since this war, because they have a 01:23:17.540 |
friendship with Russia that goes around. I would rather see Biden 01:23:20.700 |
go to India and start building some bridges there. Yeah, I 01:23:23.460 |
agree. I can't disagree. J. Cal, how's your fundraising going for 01:23:26.700 |
Launch Fund 4? Thanks for asking. Great question. You 01:23:30.700 |
know, we're doing that 506(c) public fundraising thing. 01:23:33.540 |
And so I did a bunch of webinars and without doing a single in 01:23:37.100 |
person meeting $51 million in requests came in, just, you know, 01:23:42.780 |
to a type form, basically a form online. And now we're going to 01:23:46.020 |
be starting in the next month after I get back from Japan, 01:23:49.300 |
actually meeting with the, you know, big LPs in the world, and 01:23:52.380 |
I want to make a trip to the Middle East and just go all 01:23:54.700 |
around the world and meet all the big funds. So thanks for 01:23:56.820 |
asking. Yeah, I think it's gonna change everything. Yeah. 01:23:58.620 |
That's awesome. Can you imagine $52 million in commitments 01:24:04.020 |
before actually doing the actual tour? That's awesome. Just out 01:24:07.140 |
of the gate. And my last one was 44. And so I think this 506 C 01:24:10.580 |
like I can be public about the fact that we're raising a fund. 01:24:14.620 |
Well, congrats. And I have one question for you. 01:24:19.780 |
The world's greatest moderator? I mean, it's not gonna make 01:24:22.500 |
great jokes. Not for not for now. And oh, you know what, I 01:24:25.260 |
had an interesting point about management fees in these funds. 01:24:27.660 |
Just to circle back, did you know, this is what I heard, that 01:24:32.140 |
Benchmark, during that worst vintage, you know, after, I think, 01:24:36.220 |
the great financial crisis, or maybe it was the dot com bust, 01:24:38.340 |
either of those, they took their management fees, because that 01:24:42.420 |
fund was so, you know, challenged, and they deployed the 01:24:45.700 |
management fees into primary investing, or, I'm sorry, into 01:24:50.260 |
follow-on investing in their winners to regain the results. 01:24:53.940 |
Can you imagine in this market, a VC who deployed capital in 01:24:57.980 |
2020 2021? Saying, you know what, we've got these management 01:25:01.940 |
fees millions of dollars in the future, to pay for managing 01:25:04.900 |
these instead of taking that money. I'm going to put that 01:25:08.380 |
into the companies. For my Launch Fund 3, Chamath, I 01:25:12.140 |
had a couple of opportunities. And I was like, you know what, 01:25:14.460 |
I'm going to take some of the management fees and invest in 01:25:17.140 |
some of those existing companies to try to goose the returns for 01:25:20.180 |
my LPS. And so we're at 104% or 103% invested in the capital, 01:25:26.100 |
just by just taking a couple 100 grand off of the management 01:25:29.220 |
fees. And I'm like, well, this is a really interesting 01:25:30.740 |
strategy. Like, why am I playing for the management fees? Or am 01:25:32.860 |
I playing for the Mike, I'm paying for the Mike, right? I 01:25:37.140 |
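To make the arithmetic described above concrete, here is a minimal sketch of how recycling a slice of the management fee budget into follow-on investments pushes a fund past 100% invested. The fund size, the flat 2% fee assumption, and the recycled amount are hypothetical round numbers chosen only to land near the 103 to 104% figure mentioned, not the actual terms of any fund discussed.

```python
# Sketch of the fee-recycling arithmetic described above: redirecting part of
# the management fee budget into follow-on investments pushes deployed capital
# above 100% of a fund's investable capital. All figures are hypothetical
# round numbers for illustration only.

committed_capital = 10_000_000                        # hypothetical fund size
lifetime_fee_budget = 0.02 * 10 * committed_capital   # simple flat 2% fee over 10 years = $2M
investable_capital = committed_capital - lifetime_fee_budget  # $8M earmarked for deals

fees_recycled = 300_000                               # "a couple hundred grand" of fees put into follow-ons
total_invested = investable_capital + fees_recycled

print(f"Deployed: {total_invested / investable_capital:.1%} of investable capital")
# -> 103.8%, which is how a manager ends up "103% or 104% invested"
```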
Jason, by the way, it's not true that the AI can't tell jokes. 01:25:40.380 |
Our friend Billy tweeted how the AI told a joke in the 01:25:46.420 |
style of Jerry Seinfeld. Then he asked it to tell a joke in the 01:25:49.300 |
style of Dave Chappelle and it refused. So the AI can tell a 01:25:52.180 |
joke if it wants to. It's racist, but no only clean jokes. 01:25:58.900 |
I guess I don't think every I don't think Dave Chappelle has to 01:26:03.900 |
be blue. But it would not tell a joke about Dave. 01:26:07.900 |
Wow. I mean, we got to get Sam. He's an iconoclast. Like he 01:26:19.020 |
Well, by the way, actually, he's got a shot there. After our 01:26:22.140 |
last episode, in which we were raising concerns about the AI 01:26:26.220 |
bias, they published a blog post saying that the day after if 01:26:30.540 |
bias has occurred, it is a bug, not a feature and they are 01:26:33.420 |
trying to be even handed. So I'm glad they have that smart 01:26:37.180 |
announced that and that's their standard. And we're going to 01:26:40.900 |
well, they have to be public. Like this. Yeah, yeah. I mean, I 01:26:44.020 |
read the blog post. It seemed reasonable. It's great. They're 01:26:47.100 |
addressing it. And I also think they're now doing embedded 01:26:50.980 |
citations. So somebody tweeted at me after we had the whole 01:26:53.220 |
discussion about credit. And when they were doing facts, 01:26:56.620 |
they're now saying, and they haven't made an announcement 01:26:59.700 |
about this yet. But they were saying, according to this 01:27:03.580 |
source, the following according to this source, so they're 01:27:06.420 |
starting to source in the copy that's being written. So that's 01:27:09.300 |
a big step. And then I was talking to Adam D'Angelo, about 01:27:13.420 |
Poe, which is an amazing app, you should try it. I think it's the 01:27:15.780 |
best one out there right now of all the chats. Poe is an app 01:27:19.220 |
based on the Quora data set. And I asked it questions about the 01:27:22.140 |
trip to Japan and Niseko and this and that. And it was 01:27:24.620 |
extraordinary how well done the answer was with bullets. And 01:27:28.140 |
then I asked him online, Hey, what about citations back to the 01:27:30.420 |
original Quora questions? And he said, Yes, we're going to be 01:27:33.020 |
adding that. So then I was thinking, wow, if you add to the 01:27:36.340 |
Quora corpus, and then they link back to your answer, that's 01:27:40.180 |
awesome for me as a person who's answered hundreds of questions 01:27:43.380 |
on Quora to build my reputation. So I think Quora, for me, I 01:27:47.580 |
think Quora could be the Google... I think Quora's got a 01:27:50.380 |
better data set. And if they play that right, I think they 01:27:53.260 |
could be better than ChatGPT. And they said, you have to say 01:27:57.260 |
Poe is based on the Quora data set. Poe, it will answer 01:28:01.700 |
questions like the best answers on Quora, is that what you're 01:28:06.420 |
saying? Yeah. That's kind of interesting, it's using Quora as 01:28:10.260 |
the primary data set. I'm sure it's using the rest of the web 01:28:12.620 |
too, and Wikipedia and everything. I think, I don't know why they're 01:28:15.700 |
calling it Poe, I think they should just do Quora chatbot or 01:28:18.100 |
whatever. Yeah, but just try it. It's called Poe, download it. You 01:28:21.220 |
can use it today. You want to know why I'm excited about that? 01:28:23.380 |
Because you got a little taste. Please. I got a little slice of 01:28:25.780 |
Quora. Oh, good for you. Well, I mean, Quora was always like, are 01:28:28.580 |
they ever gonna make money? Or are they just going to build 01:28:31.100 |
this incredible data set and do nothing with it? Yeah. What did 01:28:33.980 |
I say? I said, I said, AI is gonna be, basically, to 01:28:38.300 |
SaaS what mobile was to web 1.0. Oh, you'll either get 01:28:41.180 |
disrupted or get turbocharged by it. It's gonna be... I think Quora 01:28:44.380 |
is the number one player in AI going forward. I know that 01:28:47.820 |
sounds crazy. But the fact that and I think Reddit also has this 01:28:51.980 |
insane potential if Reddit had a chatbot because think about how 01:28:54.860 |
many times people do a search and YouTube is the other one 01:28:57.300 |
where they say, what's the best sci fi movie of the year or 01:29:00.580 |
which directors make the best screenplays or whatever. And 01:29:03.180 |
then they put the word Reddit at the end where they put the word 01:29:05.020 |
Quora at the end, where they put the word YouTube at the end to 01:29:08.260 |
just narrow down the corpus of where to find the answer. Go 01:29:10.740 |
ahead, you've worked with him. I've known D'Angelo for 17 01:29:13.740 |
years now. Smart cat. He was the CTO of Facebook when I worked 01:29:17.940 |
there. The single smartest, the best, single smartest person I 01:29:22.780 |
worked with. And then separately, one of the most 01:29:26.300 |
absolute genuinely best human beings in the world. Can we get 01:29:29.340 |
him out? He does. He doesn't. Is he not a good public speaker or 01:29:31.940 |
something? Because I never hear him talk. I'd like to get him at the 01:29:34.900 |
All-In Summit. D'Angelo is just so superb on every 01:29:37.580 |
dimension. We should get him on actually, just because I didn't 01:29:40.180 |
know he was working in AI. He has a lot of interesting 01:29:42.380 |
thoughts about, you know, social networking platforms, and 01:29:46.060 |
and he's on the board of OpenAI. Okay, that's really, 01:29:48.620 |
oh, get him on the pod, or maybe the All-In Summit 2023. All right, 01:29:52.700 |
He'll definitely make the anti establishment list. 01:29:54.460 |
Definitely anti-establishment. Yeah. Okay. So for the Sultan of 01:29:57.820 |
Science, who snuck out, he left, and the dictator. And what do you want 01:30:03.860 |
to be referred to now? Pacifist? The peace... pacifist, 01:30:07.940 |
peacemaking? You are the Saxophist. I'm the world's 01:30:12.620 |
undisputed greatest moderator on the number one podcast in the 01:30:17.500 |
world for now until the AI replaces you. Yeah, I trained the 01:30:20.540 |
AI to replace you, Sacks: Ukraine, Ukraine, Ukraine, and Biden, 01:30:23.460 |
Biden, Biden. No, Nikki Haley, no, stop making Nikki Haley 01:30:26.700 |
happen. The end. The data set has been done. All right, 01:30:37.220 |
we open source it to the fans and they've just gone crazy. 01:31:03.900 |
We should all just get a room and just have one big huge orgy 01:31:06.580 |
because they're all like this like sexual tension that they