E133: Market melt-up, IPO update, AI startups overheat, Reddit revolts & more with Brad Gerstner
Chapters
0:00 Bestie intros: Chamath flies public + Poker recap
8:12 Fed pauses hikes momentarily, IPO window status, state of the market
25:10 Film, cold plunge, and sauna talk
33:25 AI's impact on tech and growth stocks surging, Google's position, AI's "$20T question"
50:27 Jay Trading beating the market, how Brad formulates and sizes public bets, how GPs should handle distributions, CalPERS mistakes
64:22 Reddit moderators in revolt and how this issue might reshape the future of social apps
75:11 Funding landscape for AI startups is overheating: Mistral's $100M+ Seed round, importance of constraints, and more
104:17 Science corner: Understanding the Bill Gates-funded mosquito project in Colombia
00:00:00.000 |
I'm so fucking tired. I've slept six hours in three and a half 00:00:04.080 |
days. Six hours. Wait, seven hours. You went to the World Series of 00:00:07.680 |
Poker with Hellmuth? So you spent... No, no, no. First of all, I flew 00:00:12.160 |
public. Took Southwest. What? Yeah. Cost me... yeah, 'cause I got a 00:00:17.600 |
ticket for 49 bucks. It's like so fucking incredible. Did you 00:00:20.080 |
sit in seat A1, also known as J-Cal's reserved seat that has my 00:00:24.080 |
name on it? I was in the front row. Yeah, it's great. Like, 00:00:27.600 |
Southwest has these numbers... First class on Southwest? No, no, 00:00:32.160 |
no. Premium Plus? He was in the front row. No, there's like 00:00:35.600 |
these signs that that like have a number that attaches to your 00:00:38.880 |
ticket and then you stand in that line and then you go on in 00:00:41.360 |
this orderly way. And so I had like a five or something. So 00:00:45.600 |
I was like the fifth person on and then I sat in the front and 00:00:49.360 |
I put my bags up top. Yeah, you carried your bags. Yeah. An hour 00:00:53.120 |
later, I was in Vegas. It was so easy. Southwest is phenomenal. 00:00:56.640 |
And then I flew back. Anyways, after losing as much 00:00:59.600 |
money as I did, it felt good to fly for $49. I gotta be honest 00:01:02.800 |
with you. Well, austerity measures... You can sleep 00:01:06.240 |
better at night with austerity measures. And did you go to the 00:01:08.560 |
all-you-can-eat buffet and take it to go as well? No, I'd never 00:01:26.320 |
Why did you fly Southwest? Why did I fly Southwest? Well, the 00:01:33.040 |
plane's in Europe. That's number one. And then two, I just 00:01:36.800 |
wanted some flexibility to get in and out depending on when I 00:01:39.520 |
busted these tournaments. So let me tell you about these 00:01:41.360 |
tournaments. The 100k is literally like a murderer's row 00:01:46.960 |
of like every great poker pro. So it was like 96 of us or 00:01:51.520 |
something. I got to be honest with you. It was so much fun. 00:01:56.720 |
So much fun. Why? How so? You know that we play 40 minute 00:02:01.680 |
levels, and you have to really get the chips moving, which 00:02:04.960 |
means that there's only so much like, you know, Game Theory 00:02:08.080 |
optimal poker, you can play and at some point, you just got to 00:02:10.720 |
gamble it up and you got to take your shots and you have to be 00:02:13.120 |
willing to bluff and, you know, you have to bet for thin equity. 00:02:16.000 |
It's so much fun. And then you see some of these guys that just 00:02:18.960 |
lose their minds. Anyways, it was it was really, really a lot 00:02:22.720 |
of fun. I finished in the middle of the pack, like 42nd or 00:02:26.400 |
something. So that was brutal. But then the thing that I'm the 00:02:28.560 |
most proud of is and I went from that and I hopped in the 3k 00:02:31.920 |
six-handed right afterwards, and there was like 00:02:34.880 |
1250 people in that and I got to 81st and whoa, the problem was 00:02:39.520 |
like just couldn't get anything going. I couldn't get any real 00:02:42.000 |
cards. Like, you need some card distribution. Like, you have to, at 00:02:44.960 |
some point, yes, make some hands. You can't just, you know... I 00:02:47.920 |
could bluff my way to 81st but I needed hands to really, to 00:02:51.840 |
really, really have a chance to chip up and run it. Then I 00:02:55.040 |
played in the Razz, which is seven card stud low. There were 126 00:03:01.280 |
people, and I finished in the middle of the pack there, and 00:03:03.520 |
then this 10k bounty there was about 600 people and I finished 00:03:06.640 |
like 150. Those are all respectable. I mean, I don't know how to play 00:03:10.400 |
tournaments. I have no idea how to play tournaments, to be honest. 00:03:12.560 |
I know how to play poker. I know how to play cash games, but 00:03:14.400 |
tournaments are very different. So I feel like I was very ill 00:03:17.440 |
prepared, but I had so much fun and you know the crazy thing is 00:03:21.760 |
when you're playing for 12 hours a day, you are focused for 00:03:24.800 |
12 hours. The thing that I forgot is that you end up I 00:03:28.960 |
ended up losing like two and a half pounds. Oh wow. If your 00:03:32.640 |
brain is going going going for eight hours a day, biggest 00:03:36.320 |
**** thing and I was I was mentally devastated at the end 00:03:40.320 |
of each night. You're just lying in bed eyes awake. Yep. And 00:03:43.200 |
then you can't sleep and so then you know you go and you 00:03:44.960 |
play craps and **** break the casino for a couple hundred a 00:03:48.960 |
night there and then go back to bed. It's great. Sounds like a 00:03:52.000 |
great week. The good news is Phil was live blogging for you, 00:03:56.800 |
on your behalf, from Vegas, over your shoulder. Oh, by the 00:03:59.840 |
way, the other thing was I brought my phone, but I don't 00:04:02.240 |
have anything on my phone like I don't have Twitter. I don't 00:04:04.480 |
have any of that stuff. So I was just totally in the zone. It 00:04:08.160 |
felt so great to not be connected to social. A flow 00:04:10.960 |
experience. It's a flow experience when you can turn 00:04:12.960 |
everything off and just be in the moment. Congratulations on 00:04:15.280 |
the, on the, you know, coming in the top 10% of the field in the 00:04:18.400 |
3K, or 8% of the field. That's incredible, dude. If I really 00:04:22.000 |
practice at these tournaments, I think I would bank a couple, 00:04:24.160 |
but I just don't have the time. The paradox there is that you 00:04:26.640 |
did best in the largest field. If you can do this for three or 00:04:29.920 |
four years and you go into those Razz fields when it's 100, 00:04:32.880 |
or those 25Ks, 100Ks, when it's 100-200 people. In the 100K 00:04:37.840 |
there's 95, 96 people, like you really do have about a 1% shot 00:04:42.480 |
of winning the thing, which is pretty incredible. Yeah. Razz 00:04:45.280 |
too. Razz is always 100. I love tournament poker. That was my 00:04:49.360 |
that was my game back in the day. It's because it's because 00:04:51.600 |
you're afraid to lose money. So it's perfect for you: put up a 00:04:54.160 |
small amount of money and you can play for many hours. You're 00:04:56.240 |
probably in your mind. You're probably dividing the number of 00:04:58.720 |
hours by your buy in so that you can think about what your 00:05:00.880 |
hourly cost is. I actually like tournament poker because when 00:05:03.920 |
you lose, you lose. Like, you can't just, yeah, rebuy, 00:05:07.440 |
basically. And then there's a broader macro strategy. You 00:05:10.160 |
start to play the chips against the other players because you 00:05:12.960 |
know how everyone else is playing based on how many chips 00:05:15.520 |
they have in front of them. It's a great point. This theory 00:05:17.680 |
of ICM, which is the independent chip model, is 00:05:20.000 |
exactly what you just said. 00:05:22.880 |
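A quick aside for anyone who wants the mechanics: the ICM being referenced here is conventionally computed with the Malmuth-Harville method, where the chance a player takes the next open finishing place is proportional to their share of the remaining chips. A minimal Python sketch, with made-up stacks and payouts purely for illustration:

```python
def icm_equities(stacks, payouts):
    """Malmuth-Harville ICM: P(a player takes the next open finishing
    place) is proportional to their share of the chips still in play;
    later places are filled by the same rule recursively. Returns each
    player's expected prize money. Fine for short-handed final tables;
    the recursion blows up factorially for big fields."""
    equities = [0.0] * len(stacks)

    def recurse(remaining, place, prob):
        if place >= len(payouts) or not remaining:
            return
        chips_left = sum(stacks[i] for i in remaining)
        for i in remaining:
            p = prob * stacks[i] / chips_left  # P(i finishes in this place)
            equities[i] += p * payouts[place]
            recurse(tuple(j for j in remaining if j != i), place + 1, p)

    recurse(tuple(range(len(stacks))), 0, 1.0)
    return equities

# Three players, $50/$30/$20 payouts: the chip leader's 50% of the chips
# is worth well under 50% of the prize pool, which is why hands that are
# marginally +EV in chips can be -EV in dollars near the bubble.
print(icm_equities([5000, 3000, 2000], [50, 30, 20]))
```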
The craziest thing, Friedberg, happens right at the money bubble. These guys start... I've never 00:05:26.240 |
seen this before. They ultra-tank. Ultra-tank. Oh yeah, they 00:05:29.920 |
go in the tank for two minutes and then they fold. So 00:05:32.640 |
obnoxious. When I used to play tournament poker, I'd sit, I'd 00:05:35.360 |
fold Ace King suited. You'd fold pocket Queens. You'd fold 00:05:39.360 |
everything if you're on the bubble of getting into the 00:05:41.920 |
money or you're on the bubble of getting to the final table. 00:05:43.840 |
It just doesn't matter because if there's three guys with you 00:05:46.480 |
that have smaller chip stacks, let them get blinded off and 00:05:49.840 |
now you're in the money or now you're on the final table and 00:05:51.840 |
then you'll play poker. It's like go get up go have some 00:05:53.920 |
dinner. Come back in an hour. Don't even look at your cards. 00:05:57.040 |
If you can cruise your way there, it's a totally different 00:05:59.840 |
kind of game. Totally different. There's a guy that 00:06:01.920 |
was tanking. He's a really lovely guy. British guy. I 00:06:05.280 |
offered him the min cash just to not tank. I was like, please 00:06:09.760 |
stop tanking for four minutes and folding your hand. I'll just 00:06:13.360 |
treat it as action. I had a guy do this in a tournament and I 00:06:16.640 |
timed it with the rest of the table. I took out my phone. I 00:06:18.880 |
timed it and I said, whatever number of seconds you do, I'm 00:06:22.160 |
doubling and I just did that. It was so obnoxious that the 00:06:27.520 |
entire table like the guy finally stopped. Well, in the 00:06:30.160 |
in the 100k, we actually had a better setup which is that we 00:06:32.960 |
had these iPads on the table and everybody gets a 30 second 00:06:36.080 |
shot clock. So, you can't really **** around but then in 00:06:39.520 |
these big field events, they're not going to have iPads on 100 00:06:43.920 |
tables obviously. Yes. And so instead, people just tank 00:06:47.040 |
forever and then you have to call the floor and you have to 00:06:49.280 |
call time on these folks. The funniest thing though is like 00:06:51.920 |
the last tournament I bought in for was a 10k secret bounty 00:06:54.560 |
which means that on day two, when you knock somebody out, 00:06:57.840 |
you actually, everybody stops the action. You go up on stage 00:07:00.480 |
and you pull like out of, it's like treasure hunting. Yeah. 00:07:03.360 |
And some of these bounties can be like a million bucks. And 00:07:07.680 |
obviously, there's a lot of people there that kind of 00:07:09.920 |
recognize me and when they saw me in the 10k secret bounty, 00:07:13.360 |
there was like a movement. It was so funny. They were like, 00:07:15.920 |
if you win the secret bounty and you win the million, we're 00:07:19.280 |
going to revolt and burn the Paris Paris to the ground. It 00:07:23.840 |
would have been so and I and I thought, you know, it's true. 00:07:26.480 |
It'd be patently unfair if I was the one that won that million 00:07:29.200 |
dollar secret bounty. How many selfies did you take on 00:07:31.520 |
Southwest? That's the question everybody wants to know how 00:07:33.920 |
many people recognized you on Southwest, and what was the shame 00:07:37.360 |
that you felt? It was great. I'm gonna fly, I'm gonna fly 00:07:40.480 |
Southwest to Vegas all the time now. It's fucking great. It 00:07:43.280 |
literally was honestly to get from my house to the gate where 00:07:46.560 |
you stand to get into the plane, it was like, plus maybe 15 00:07:50.320 |
minutes more than just driving onto the tarmac. Like, it was a 00:07:52.400 |
nothingburger. J-Cal, it's amazing what a down market can 00:07:56.400 |
do to people. This is what is it Brad? This is the victory of 00:08:01.680 |
the age of fitness. Absolutely. Southwest you were in the first 00:08:08.880 |
row. You were in the front row on Southwest. All right, listen, 00:08:12.160 |
there's a big docket today. We've got a lot on the plate. 00:08:14.240 |
David Sacks is busy in a star chamber right now selecting the 00:08:18.160 |
next president of the United States. He's picking a VP and 00:08:20.960 |
the cabinet for whoever's going to win. So he couldn't make it 00:08:23.600 |
today. So we brought the fifth bestie Brad Gerstner back and 00:08:26.240 |
great timing because the Fed decided to pause rate hikes 00:08:29.920 |
just yesterday, I guess, and this would have been the 11th 00:08:38.400 |
And they anticipated two more 25-basis-point hikes before the end of 00:08:43.440 |
the year. So here's just the Fed funds effective rate. So you 00:08:48.800 |
can all can see it here over time. And there was also some 00:08:51.440 |
big news this week. A couple of 2023 IPO candidates started to 00:08:57.840 |
be floated. Obviously, we've been hearing about Reddit and 00:08:59.920 |
Stripe for a while, but Arm confidentially filed for an IPO 00:09:03.520 |
back in April. And they're seeking to raise between eight 00:09:06.000 |
and 10 billion later this year. And that would be a pretty big 00:09:09.840 |
deal for SoftBank, which owns the firm. And that could, I guess, 00:09:15.520 |
save SoftBank, which has had just a brutal run with the two... 00:09:20.560 |
To save SoftBank, the Arm deal that's being done, what's the 00:09:26.720 |
valuation that it's being done at? SoftBank acquired Arm for 32 00:09:29.600 |
billion in 2016. We don't know the valuation right now. They 00:09:33.120 |
haven't put that out there yet. I mean, they were originally 00:09:35.040 |
looking for anywhere from 30 to 70. But I mean, Brad, where do 00:09:38.640 |
you... I have no idea. I think that, you know, they're shopping to get 00:09:43.200 |
these anchor tenants into Arm. If the IPO market was super hot, 00:09:48.480 |
and this was really easy to get done, this would have been done, 00:09:52.160 |
you know, so I think, you know, they're tiptoeing out because 00:09:55.040 |
they need to get it into the public market. But, you know, I 00:09:59.920 |
think that anybody who does an IPO today, okay, is going to 00:10:05.120 |
demand a, you know, a rate of return into that offering. That 00:10:11.520 |
is a significant margin of safety relative to all deals 00:10:15.280 |
that were done in 2021. And what that means is, on the roadshow 00:10:20.720 |
investment banks go and they try to drum up demand, they want to 00:10:23.520 |
have anchors in the IPO book, they call firms like ours, they 00:10:27.920 |
call strategic investors, they want to size up what that is. 00:10:31.120 |
And, you know, they give you their expectation of fair value 00:10:35.520 |
or where they think the IPO will trade to, of course, you meet 00:10:38.640 |
with management, you ascertain and develop your own 00:10:42.400 |
expectations of fair value. What I'm saying is, maybe in the 00:10:45.840 |
peak, you know, in 2021, people were getting thinner and thinner 00:10:51.280 |
in terms of the margin of safety they demand. My expectation is, 00:10:54.960 |
whether it's Arm, whether it's Databricks, or other companies 00:10:58.480 |
that are starting to queue up. And these are fantastic 00:11:00.720 |
companies, right? Make no mistake about it. Nvidia tried 00:11:03.440 |
to buy Arm and, you know, Databricks is doing over a 00:11:06.480 |
billion in revenue growing extraordinarily fast, led by an 00:11:09.360 |
incredible team. But in order to get into the public markets, 00:11:13.200 |
they have to, you know, step in with a bigger discount than they 00:11:19.280 |
And just to put some numbers on it, Arm had agreed to be 00:11:23.040 |
acquired by Nvidia for 40 billion before the talks fell 00:11:25.840 |
through. That was due to regulatory scrutiny. Bloomberg 00:11:28.400 |
says Arm is aiming for a valuation of 70 billion. 00:11:31.280 |
Obviously, there's no confirmation about that. That's 00:11:32.960 |
just the news, hopefully intelligently speculating about 00:11:35.600 |
it. What exactly would you be buying for $70 billion? I 00:11:39.040 |
thought they missed it. Yeah. Arm had a window where they had 00:11:43.200 |
to make some really important product decisions to be able to 00:11:46.560 |
actually have reference designs that actually made sense beyond 00:11:49.920 |
just simple mobile processors and CPUs. And they don't have 00:11:53.360 |
any of that. So they have a good cash cow business. It's 00:11:56.080 |
probably worth 25 billion. They paid 30-some-odd... 32. Yeah, 00:12:00.800 |
they could have gotten it out for 40-something, but it's a 00:12:02.880 |
mid-20s-billion-dollar company. No more. And if 00:12:07.200 |
you buy it at the 20s, you're buying a cash flow yielding 00:12:10.480 |
asset, and the good luck to all the players, I guess is what I 00:12:13.840 |
would say. Clearly, it's a function of SoftBank seeking 00:12:17.520 |
liquidity, right? I mean, they've had this position for how many 00:12:22.640 |
years now? Four years? No, more. I think this was six, maybe six 00:12:27.840 |
years. Yeah. When did they buy it? Was it 2016? I think it was 00:12:31.120 |
longer. Yeah, 2016. They bought it. So this has been... In the 00:12:34.640 |
fiscal year ending March, the Vision Fund alone took a $32 00:12:38.560 |
billion write-down. Obviously, SoftBank has a big chunk of 00:12:43.520 |
the Vision Fund, but this is not in the Vision Fund. It's not a Vision 00:12:46.720 |
Fund asset. Exactly. So this is on the balance sheet. Yeah. But it's 00:12:51.360 |
clearly a liquidity driver for them. I don't think that this 00:12:54.240 |
is, you know, first of all, Jason, I don't think the point 00:12:56.960 |
about SoftBank needing saving is necessarily true. It's not a 00:13:00.560 |
company that needs saving, but they've certainly taken some 00:13:02.560 |
hits. Well, they could use a win, is my point. Yeah. Yeah, 00:13:04.960 |
they could use a win. But the last two quarterly calls, you 00:13:10.000 |
know, Masayoshi-san was pretty repentant and humble. Right. In 00:13:16.480 |
fact, he used that exact line, which we talked about here. 00:13:18.480 |
So there's this big liquidity need at some point for the business, so 00:13:22.080 |
they can get this thing out and get liquid on it. It would 00:13:25.520 |
certainly support the balance sheet more than anything. You 00:13:27.840 |
know, the market is on a tear this year, Brad. Now 00:13:32.080 |
that everybody's buying back into tech stocks, I guess, two 00:13:35.840 |
questions. Why is everybody suddenly buying into tech stocks 00:13:39.920 |
when people think there's going to be a recession, and 00:13:44.160 |
consumers have run out of money? And yeah, what's the thinking 00:13:49.520 |
here of why so much money has gone to tech and this 00:13:51.840 |
incredible rally? It's great to watch, but people seem to be 00:13:56.320 |
perplexed by the rally. And then the Fed, even though they 00:13:58.880 |
didn't do a rate hike, seemed to pour cold water on it and say, hey, 00:14:02.640 |
listen, don't get ahead of yourselves. We're gonna probably 00:14:04.560 |
do rate hikes later in the year. We all have short memories. 00:14:06.960 |
You know, tech had a devastating year in 2022. So this is a bit 00:14:12.080 |
of a reversion to the mean. Right? If you remember the chart, 00:14:15.040 |
we, you know, we showed a lot on this pod last year, 00:14:18.480 |
starting at the end of 2021, we were trading at, 00:14:23.760 |
you know, multiples that were, you know, 50 to 70% above their 00:14:27.120 |
10 year average. And at the trough, we were trading about 00:14:31.360 |
30 to 40% below the 10 year average, right? So the market 00:14:34.720 |
corrected, it overshoots on the upside, because rates went to 00:14:37.680 |
zero. And by the end of 2022, when you had Larry Summers and 00:14:41.680 |
others who are going out and saying, we don't know what the 00:14:45.280 |
upper bound of inflation and rates is, that's really scary 00:14:48.720 |
to the markets. And so you had this overshoot to the downside. 00:14:52.320 |
we're still trading below the 10 year average for internet and 00:14:56.480 |
software companies, based upon our numbers. 00:14:59.760 |
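To make that reversion arithmetic concrete, here is the same claim as a two-line calculation; the 6x base multiple is invented for illustration, and only the percentage swings come from Brad's description:

```python
ten_year_avg = 6.0                       # hypothetical average multiple, e.g. 6x EV/revenue
peak_2021 = ten_year_avg * (1 + 0.60)    # "50 to 70% above" the average -> ~9.6x
trough_2022 = ten_year_avg * (1 - 0.35)  # "30 to 40% below" the average -> ~3.9x
print(f"peak {peak_2021:.1f}x, trough {trough_2022:.1f}x, average {ten_year_avg:.1f}x")
```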
And so when you look at this, as we said at the end of last year, the 00:15:02.320 |
framework changed early in this year. All of '22 was about two 00:15:07.440 |
things: is inflation going higher, and are rates going 00:15:11.120 |
higher. And those things are debilitating for growth assets 00:15:14.400 |
and growth multiples. By the beginning of this year, we 00:15:18.000 |
started to have confidence that inflation had peaked, which 00:15:22.160 |
meant that rates were largely in kind of, you know, spitting 00:15:26.000 |
distance of their final destination. And the framework 00:15:29.760 |
shifted to the economy: are we going to have a hard landing? Now, 00:15:32.880 |
remember, at the start of the year, Mike Wilson from Morgan 00:15:36.160 |
Stanley, who, to give him credit, made the call in 2022. He said 00:15:40.640 |
'22 was going to be awful. He was right. But he stayed in that 00:15:44.400 |
bearish position at the beginning of 2023, said the 00:15:47.840 |
economy is going to hit the skids in q1. And that we're 00:15:51.120 |
going to revisit 3000 or 3200 on the S&P. Instead, we're 00:15:55.920 |
sitting at 4300. So he was tragically wrong at the 00:16:00.080 |
beginning of this year, because again, he was fighting, I think 00:16:03.200 |
the last battle that was won, as opposed to looking 00:16:07.520 |
ahead and saying, this is no longer about rates and 00:16:10.240 |
inflation. So the NASDAQ has moved 30% to start this year. 00:16:14.320 |
you know, the Fed said yesterday, we're hitting pause, 00:16:18.080 |
you know, the quote from Chairman Powell that has 00:16:20.960 |
everybody a little bit perplexed, is on the one hand, he 00:16:24.000 |
says, we're data dependent. On the other hand, he said, we're 00:16:27.920 |
likely to have more rate hikes, right? And that rates are going 00:16:32.880 |
to stay high for a couple years. Well, if we're data... you can't 00:16:35.920 |
have it both ways. If we're data dependent, and we have a hard 00:16:39.760 |
landing, like Stan Druckenmiller thinks we're going to have in Q4 00:16:42.800 |
this year, then rates are going to get cut. But the market 00:16:46.960 |
seems to like this. And let's talk about where we are. The 10 00:16:50.880 |
year's at about 3.7. The S&P is basically flat in terms 00:16:56.000 |
of where it was before and where it was after 00:16:59.120 |
the rate hike announcement. The economy appears to be 00:17:04.080 |
reasonably good 80% of earnings in q1 beat, the guides were okay. 00:17:09.360 |
Stan Druckenmiller said I shifted, you know, he pulled 00:17:13.040 |
forward when he thought we're going to have a hard landing 00:17:15.280 |
based upon the bank crisis, which now feels like it's a 00:17:18.560 |
distant memory for many. And he's now calling for potentially 00:17:22.320 |
a hard landing in q4. So the Fed just came out and said, No, 00:17:26.320 |
we're not going to have a hard landing in q4, we think we're 00:17:28.640 |
going to have 1% growth. They think that inflation to end the 00:17:33.120 |
year is going to be at 3.2% versus the 3.7% Goldman 00:17:37.680 |
consensus. What I would say is, you know, we've moved, but we're 00:17:42.000 |
still below the 10 year average. But you have to start paying 00:17:45.360 |
attention now to individual stocks that have likely gotten 00:17:48.960 |
ahead of themselves or where they've taken... An example of 00:17:51.520 |
that? A lot of the return... Facebook? Apple? No, you know, I 00:17:57.360 |
think a lot of the AI related stocks are now at or within 10% 00:18:02.320 |
of our end of year price target. And when you when you see that 00:18:07.360 |
take something like Nvidia, if I'm playing at home, or I'm a, 00:18:10.960 |
you know, a professional investor, I may say, Oh, I'm 00:18:13.360 |
going to go sell some long dated calls to buy a little 00:18:16.720 |
protection to the downside. I don't think there's any 00:18:18.800 |
problems with Nvidia, I think they're going to continue to 00:18:21.840 |
perform, I think they'll beat their numbers for the balance 00:18:24.400 |
of the year. But you know, remember, at the start of the 00:18:27.520 |
year, people thought they were going to miss their numbers for 00:18:29.920 |
data center. People thought data center revenues this year 00:18:33.440 |
were going to be negative. And they just revised their guide 00:18:37.120 |
from 7 billion to 11 billion in Q2. So they've 00:18:41.760 |
absolutely blistered their numbers, and the stocks gone 00:18:45.120 |
from $125 you know, to over $400. So when you have those 00:18:51.040 |
parabolic moves, just like we did with rates last year, I 00:18:54.480 |
think it's wise to say it's a fantastic company, we want to be 00:18:57.760 |
involved, it's going to continue to be one of our larger 00:19:00.320 |
positions. But if I can lock in 20% yield, by selling calls six 00:19:05.520 |
months out, I think that's a reasonable risk reward. 00:19:09.520 |
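As a rough sketch of the covered-call math Brad is describing — every number below is hypothetical, not his actual position or a real option quote:

```python
# Selling a six-month covered call (illustrative numbers only).
spot = 400.0     # stock price today
strike = 440.0   # call strike ~10% out of the money
premium = 40.0   # per-share premium collected for selling the call

premium_yield = premium / spot                      # cash banked up front: 10%
annualized = premium_yield * 2                      # two six-month cycles: ~20%/yr
max_if_assigned = (strike - spot + premium) / spot  # 20% if called away at 440
cushion = premium / spot                            # first 10% of downside offset

print(f"{premium_yield:.0%} for six months (~{annualized:.0%} annualized), "
      f"max {max_if_assigned:.0%} if assigned, {cushion:.0%} downside cushion")
```

The trade-off implicit in what Brad says is that upside is capped at the strike: if the stock rips another 50%, the call seller keeps only the first ~20%.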
So, you know, I think that's where we are. And now the question 00:19:12.960 |
really becomes and it's unknown and unknowable. Right? Are we 00:19:16.560 |
going to have a soft landing, a medium landing, a hard landing 00:19:19.600 |
in q4 and rolling into next year. And I would say as an 00:19:22.480 |
investor, I'm very data dependent. You know, we're 00:19:25.040 |
talking to companies, we're looking at... What data point should we 00:19:27.280 |
all be looking at at this point, Chamath? You think jobs, 00:19:30.240 |
unemployment, just the stocks and their multiples? And how are 00:19:34.480 |
you looking at it as a capital allocator, Chamath? Still on the 00:19:38.080 |
sidelines? What have I said, like a broken record? Rates are 00:19:40.960 |
going to be higher than you want, and they're going to be 00:19:42.800 |
around for longer than you like. And now, Powell is basically 00:19:47.120 |
telling you the same thing. So we're almost at the end of, I 00:19:53.760 |
think, the bottoming, though. I don't agree with Druckenmiller. 00:19:58.560 |
I think he's wrong. On which part? That there'll be a hard 00:20:01.360 |
landing in Q4. Yeah, and the reason there's not going to be 00:20:04.320 |
a hard landing is you just saw China today basically say we're 00:20:06.880 |
going to start to rip in trillions of dollars, they're 00:20:08.800 |
going to stimulate the economy. You can't have a hard landing 00:20:11.600 |
when China's printing trillions of dollars, not possible. So I 00:20:14.880 |
think that what Powell was forecasting is that if China 00:20:18.640 |
starts to basically turn on the money printer and go through a 00:20:23.600 |
huge spate of quantitative easing, it's going to just 00:20:27.040 |
inflate everything. Because they're just such a critical 00:20:30.320 |
artery to the world economy. And so you just have to get 00:20:33.200 |
prepared for rates just being sticky, and inflation being 00:20:37.360 |
sticky. And I think that that's probably the most reasonable 00:20:41.040 |
base case for the rest of the decade. Rest of the decade? 00:20:43.840 |
Seven more years? Yeah, yeah. And so in that environment, 00:20:47.040 |
the problem is that now you have the seven or eight most 00:20:52.480 |
valuable tech stocks priced to perfection yet again. If you 00:20:56.880 |
look at their enterprise value over their net income, these 00:20:58.960 |
things are trading at astronomical multiples, with earnings yields 00:21:03.360 |
less than half of the two-year note, and now approaching 00:21:07.280 |
sometimes less than half of the 10-year note, government bonds. 00:21:10.720 |
That makes no sense. 00:21:16.880 |
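The comparison Chamath is making is just the inverse of the multiple. With hypothetical round numbers (not any specific company's actual figures):

```python
net_income = 100e9          # $100B of trailing net income
enterprise_value = 5.0e12   # $5T enterprise value -> a 50x multiple

earnings_yield = net_income / enterprise_value  # inverse of the multiple: 2.0%
two_year_note = 0.047                           # ~4.7% on the two-year Treasury

print(f"earnings yield {earnings_yield:.1%} vs. risk-free {two_year_note:.1%}")
# The riskless bond pays more than twice the equity's earnings yield.
```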
And if you subtract out those seven or eight biggest companies in the S&P 500, the S&P has not been a 00:21:21.520 |
great asset. So the... I think the equal-weighted index, Brad, you 00:21:25.760 |
probably know the exact number, the equal weighted index is 00:21:27.840 |
just shit. So what does all this mean? I think it means that 00:21:32.240 |
people are psychologically exhausted with having lost 00:21:35.280 |
money. They are psychologically wanting to will the market up, 00:21:39.040 |
they want to psychologically believe whatever will allow 00:21:42.800 |
them to influence rates getting cut. But I think the most 00:21:49.280 |
reasonable bear case is rates aren't getting cut. And whatever 00:21:51.600 |
hope you had of rates getting cut just went out the window 00:21:53.520 |
today, because there is no world in which you cut rates when 00:21:56.080 |
China is going to print a trillion dollars. Friedberg, 00:21:58.080 |
a lot of people seem to think that this big run up in the 00:22:01.600 |
markets is AI related. Everybody's got an angle for 00:22:05.520 |
their company, some of them super valid, like Microsoft and 00:22:08.320 |
Google. Other ones feel kind of speculative. We talked about 00:22:12.960 |
BuzzFeed, which talked about having AI journalists, and their stock, 00:22:16.000 |
which looks like it's going to be delisted, got a quick pop. Do 00:22:20.080 |
you think the AI... AI actually coming to market, driving 00:22:25.760 |
efficiency, driving revenue for companies is going to happen in 00:22:30.400 |
the short to midterm? Or is it more like midterm to long term? 00:22:33.360 |
In other words, is the AI rally overheated and fake? 00:22:36.960 |
I don't know if I would assume that the pricing of equities is 00:22:43.040 |
entirely driven by AI. I think there's a lot of factors and 00:22:48.080 |
including a lot of folks maybe being on the sidelines for too 00:22:51.120 |
long and needing to come and buy in at the same time. But 00:22:56.000 |
I do think that there's a radical realization underway 00:22:59.520 |
that a lot of businesses that are dependent on services that 00:23:06.400 |
might be replaced by AI don't necessarily have the longevity 00:23:09.920 |
today. Those are businesses that have been severely hurt by 00:23:14.240 |
the point of view of what AI could bring to market and bring 00:23:18.240 |
to industry. So there's a lot of stuff happening on the short 00:23:20.720 |
side. On the long side, I think it's a distribution curve on 00:23:25.360 |
like how quickly different industries will be affected in 00:23:28.400 |
a positive way by AI. Certainly, the most obvious is demand for 00:23:34.000 |
chips, because all this infrastructure is being built 00:23:36.320 |
out. So that's the first. And so, you know, with a high 00:23:39.600 |
discount rate, which is the environment we're in today, you 00:23:42.480 |
can more quickly bet on those opportunities. And so you see a 00:23:45.840 |
disproportionate bubbling and valuation and multiples on 00:23:49.040 |
businesses like that. There are other businesses that are 00:23:52.800 |
further down like services businesses, how long until 00:23:56.880 |
lawyers are able to leverage AI how long until banks, investment 00:24:00.960 |
banks are going to be leveraging AI to create new products and 00:24:04.400 |
improve their margins. There's a lot of speculation, a lot of 00:24:07.200 |
ideas, but the discount rate is quite high right now. And that's 00:24:10.160 |
probably a few years out, to the point that the fuzziness 00:24:14.320 |
What do you mean by the discount rate in this context? 00:24:16.880 |
So let's say that I think that investment banks are going to be 00:24:19.600 |
severely changed in seven to 10 years, that's the time horizon, 00:24:23.920 |
I think that their business is going to improve because of AI, 00:24:26.800 |
as an example, or, you know, factory machinery automation, 00:24:31.040 |
right manufacturing businesses, probably 10 to 15 years. So I 00:24:34.800 |
can't really give them that much credit today. And so I apply a 00:24:38.480 |
higher discount rate there. But there are other industries that 00:24:41.120 |
are certainly being affected much more quickly. And you know, 00:24:44.240 |
I'm sure Brad and his, his team of finance wizards probably have 00:24:48.720 |
a board, you know, where they have this list of all the 00:24:51.600 |
industries are going to be affected and what time horizon, 00:24:53.680 |
I mean, that's the way you could kind of think about this. And 00:24:56.080 |
over different time horizons, you could kind of accrue some 00:24:58.800 |
net improvement to earnings and say, okay, you apply a discount 00:25:02.240 |
rate to that, here's how much the stock should trade up and, 00:25:04.480 |
and, you know, that's one way to kind of think about this. I 00:25:07.120 |
don't think it's everything everywhere all at once. 00:25:12.560 |
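Friedberg's framing reduces to ordinary discounting: the further out the AI impact, the less credit the stock gets today. A minimal sketch, where the uplift, horizons, and the 10% rate are all made up for illustration:

```python
def present_value(annual_uplift, years_until_impact, discount_rate=0.10):
    """Value today of an earnings uplift that arrives `years_until_impact`
    from now and then persists as a perpetuity."""
    value_when_it_arrives = annual_uplift / discount_rate
    return value_when_it_arrives / (1 + discount_rate) ** years_until_impact

for industry, years in [("chips", 1), ("investment banks", 8), ("manufacturing", 12)]:
    pv = present_value(annual_uplift=1.0, years_until_impact=years)
    print(f"{industry:>16}: ${pv:.2f} today per $1/yr of future uplift")
# chips ~ $9.09, banks ~ $4.67, manufacturing ~ $3.19: the same dollar of
# eventual benefit gets very different credit today.
```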
Great. By the way, I gotta tell you, I just saw that movie. I 00:25:15.280 |
have to reference it because I saw it over the weekend. Have 00:25:17.440 |
you seen that movie? Of course, of course. Yeah. What was your 00:25:20.080 |
what an unbelievable film. I had not seen that film. I had 00:25:23.680 |
kept kind of postponing it. Incredible. Anyway, what is it? 00:25:27.040 |
What movie? It's Everything Everywhere All at Once. It won 00:25:30.560 |
Best Picture at the Oscars last year. I don't trust the Best 00:25:35.280 |
Picture Awards at the Oscars for a few years. I've just been 00:25:38.080 |
duped by that award. Like, you get some really crappy, like... 00:25:42.480 |
badass movies that are like, watch this movie and then report 00:25:45.440 |
back. Let's... I won't even say any more. Politically correct or 00:25:48.240 |
whatever. And then there's a little virtue signaling they 00:25:50.720 |
were doing, a little catch-up because of Oscars So White, the 00:25:53.440 |
movement to try to balance things out. In fact, people 00:25:57.040 |
well, that doesn't exactly reinforce trust, that actually 00:25:59.840 |
destroys complete trust and reputational credibility. Yes, 00:26:03.520 |
they're virtue signaling. And actually, that was something people said 00:26:06.720 |
that everything everywhere all at once was possibly because 00:26:10.720 |
it's an Asian guy. I don't even know what is it? What is it? 00:26:12.720 |
It's a sci fi adventure, surrealist kind of romp. 00:26:17.520 |
Michelle Yeoh is in it. She's very famous from Crouching Tiger. 00:26:22.480 |
Quan is in it, who played Short Round in Indiana Jones. 00:26:30.560 |
And I don't do my impersonation, but I would get 00:26:33.840 |
canceled for doing the impersonation of him. In 00:26:36.560 |
fact, he must be like fucking 70 years old. Yeah, yeah. No, he's 00:26:40.240 |
like... she's like... something. But it's pretty great because he 00:26:43.840 |
also won Best Actor... Best Supporting Actor. And there are 00:26:48.400 |
all these pictures of him with Harrison Ford and Steven 00:26:51.280 |
Spielberg. The guys that made it are like music video directors. 00:26:56.560 |
You got to go see this film. It's really good. It's a theater... 00:27:02.240 |
Watch it on Apple TV in one of your theaters tonight. Go to 00:27:06.640 |
your theaters. Go to the theater on your plane and watch 00:27:09.600 |
it. The theater is literally... I actually will do that. I'll do 00:27:13.760 |
that. When I'm on the plane... best plane I ever had... I go to the bathroom. 00:27:17.920 |
And it's a long flight back from Italy. So I'm taking my time. 00:27:21.680 |
And I see that... a soap that I can't afford. It's like 00:27:27.040 |
100. Yeah, that one. So I see that and I'm like, wait a second. 00:27:30.880 |
So I look in the cabinet underneath. I'm like, if there's 00:27:32.880 |
two up here, there's got to be six down below. So I check in 00:27:36.240 |
the galley. Of course, there's six down there. I put two... I 00:27:38.720 |
boosted two into my bag, and they're in my bathroom here now. 00:27:43.520 |
I check out the TV setup he's got. So I'm like, oh, I wonder what 00:27:50.080 |
movies are on here. And it's Netflix. He's got so much 00:27:52.800 |
bandwidth on that frickin plane. That is on the plane. You don't 00:27:57.680 |
have to download your movies before you take off on 00:27:59.600 |
Chamath's plane. All right, let's get back to work here. 00:28:02.800 |
By the way, have you seen that movie? Yeah, it's incredible. 00:28:07.120 |
He's got to watch it. It's not the best film of the year, that was 00:28:10.240 |
Tár, but I don't want to get into it with Chamath. 00:28:11.920 |
Oh my god, I tried to watch Tár. That movie sucks. It was so 00:28:18.720 |
boring and fucking self absorbed and contrived. 00:28:22.960 |
But here's what I like about it: for anybody that has sleeping issues 00:28:26.640 |
and otherwise needs to use a sleeping aid. I would just put 00:28:29.280 |
that movie on because, you know, once you see Cate Blanchett, 00:28:32.640 |
blathering on and walking back and forth from her kids fucking 00:28:35.520 |
school and then some crappy piano play you'll fall asleep. 00:28:38.960 |
And then, what puts you to bed faster, Chamath: Ambien, Tár, or 00:28:46.000 |
Science Corner? Science Corner, which I love. I actually 00:28:49.200 |
love Science Corner. I don't take... I don't take sleep aids. 00:28:52.320 |
I've taken Ambien three times in my life. Yeah, but that was 00:28:55.600 |
recreationally during the day. That's... No, no, it was 00:28:58.240 |
prescribed to me because I wanted to try it. I also tried 00:29:00.400 |
Lunesta once. They were... they're both like... Careful with that 00:29:04.640 |
stuff, man. You'll start tweeting weird shit. Be 00:29:06.720 |
careful. I tried those sleeping aids four times in my life, and 00:29:09.760 |
you wake up... I wake up... it has a really bad tail effect. 00:29:13.440 |
So like, I woke up super groggy. I only get five or six hours of 00:29:17.520 |
sleep with it. So anyways, I do take melatonin and that's really 00:29:21.440 |
good. That works. That's natural. It does the trick for 00:29:24.480 |
me. And then I try to just... Everybody's on melatonin all of 00:29:26.960 |
a sudden. Like, everybody I know takes melatonin. Are you sure? 00:29:29.840 |
It's a natural way to ease into sleep. Yeah, for me, what I do 00:29:33.120 |
is like I use it as part of a routine where like, typically 00:29:36.320 |
around 10 like all the, you know, devices, things get 00:29:40.160 |
silenced and do not disturb mode and I'm in a really restful 00:29:43.840 |
place by 10 o'clock. I mean, usually by 10:15 to 10:30 I'm out 00:29:47.040 |
like a light. I wish. I just got this infrared sauna. You guys 00:29:51.120 |
have infrared saunas? This is like the new hotness, and man, 00:29:54.240 |
that is incredible. Your heart rate goes way up, and then 00:29:57.680 |
you sleep like a baby. Okay, let's get back to AI. Brad. Are 00:30:01.440 |
people getting too much credit in their stock prices, and is the... 00:30:04.560 |
Are you doing contrast therapy after the infrared sauna? I don't know 00:30:07.920 |
what contrast therapy is. What is that? Meaning when you do the 00:30:10.480 |
hot then you jump into like an ice bath? No, I'm getting an ice 00:30:13.600 |
bath next for my little outdoor sauna area, and I'm going 00:30:17.120 |
to start doing that. I did that with my friend Antonio, 00:30:19.840 |
friend of the pod, Antonio Gracias. He's got one of 00:30:22.480 |
those cold plunges. We jump in the cold plunge together. He's 00:30:24.640 |
got this thing at 45 freaking degrees. He says don't try to do 00:30:28.400 |
too much. You'll have a heart attack. So I go in, I get 45 00:30:30.800 |
seconds in, I start breathing like, really heavily. And I'm 00:30:35.840 |
like, I gotta get the hell out of here. He does six minutes. 00:30:38.400 |
Yeah, I do four minutes, three times a week. What temperature? 00:30:44.080 |
42 to 44. The research paper that everybody 00:30:47.920 |
references now, I think her name is Susanna Søberg or 00:30:51.040 |
something like that. She's the Swedish researcher. Huberman did 00:30:55.200 |
it on his podcast, then Rogan did it. Anyways, the data is 00:30:58.800 |
like, a little less than an hour of heat and 11 00:31:04.640 |
minutes of cold per week was shown to be 00:31:07.600 |
physiologically optimal. So I do three times a week, 20 minutes 00:31:10.880 |
of heat, followed by four minutes of cold plunge three 00:31:14.640 |
times a week. Do you have like a cold plunge where you set the 00:31:16.800 |
dial on it? Or is it... No, no, I just... No, I do old 00:31:20.320 |
school. I, I use the Mr. Steam in my bathroom. Because I after 00:31:24.960 |
I'm done working out, the last thing I want to do is schlep 00:31:26.960 |
outside, find my slippers, put on a robe, I'm not going to do 00:31:29.840 |
any of that. So I walk upstairs, I do the Mr. Steam at like 110 00:31:33.200 |
degrees. And then while I'm working out, somebody just makes 00:31:38.080 |
an ice bath for me: puts a bunch of ice in a tub, fills it 00:31:42.320 |
with water and just jump in it. So what do they do? Do they 00:31:45.200 |
carve like a bunch of baby seals in ice? And then they just... when 00:31:48.320 |
you get... No, no, no, all the ice cubes were made to look like my 00:31:51.440 |
face. So I have these shapes of myself. So I'm like, look at 00:31:55.520 |
myself. Yeah. No, infrared gets up to like 130, 140 degrees, and 00:32:01.120 |
your heart rate will go up to 140, 150 in this. It's 00:32:04.160 |
a whole different experience than the Mr. Steam. I was recently on a ski 00:32:08.000 |
trip. Yeah. And there's this extraordinary sauna. It's wood- 00:32:13.040 |
fired. Okay, tell me about this. So that so the lead guide on 00:32:17.760 |
this trip, we go down there, he gets the thing heated up before 00:32:21.120 |
we show up right and I show up with this great athlete down 00:32:23.920 |
there. And we go in the hotbox because you know, this is the 00:32:27.280 |
thing to do. Yeah. And I look at the thermostat. I think it's 00:32:30.880 |
broken. It shows it's hard-pegged at like 225 degrees 00:32:36.320 |
Fahrenheit. That's a lot. You walk in, your... I mean, you slow- 00:32:40.320 |
cook meat at 225. My lips are burning, you know. And 00:32:46.640 |
there's this person sitting in there. He's like, now this is 00:32:49.120 |
the way you do it right? Tough guys, you know, do it at this 00:32:52.000 |
temperature. Like, alright, so you know, we sit in there for 00:32:54.240 |
like 10 minutes, and then we go and we jump in, you know, in 00:32:57.760 |
a frozen lake. And about 10 minutes later, we're taking off, 00:33:01.520 |
we had to leave this particular place. And, you know, people are 00:33:04.640 |
waving goodbye. And they're looking, you know, they have to 00:33:07.920 |
look over the sauna as they're waving goodbye. And they notice 00:33:20.480 |
People are taking this a bit to excess. It's a little crazy. 00:33:23.280 |
Alright, so back to the docket here. AI. Yeah, are people 00:33:27.040 |
getting too much credit in public markets, kind of running 00:33:29.600 |
with this before it's actually ready for primetime? Just to add 00:33:32.560 |
to Friedberg's point here. There are a lot of people talking 00:33:36.320 |
about doing things in startups. I see people actually doing 00:33:39.120 |
things I would say four out of five startups that we've 00:33:41.920 |
invested in, and the majority of them that we're being pitched on 00:33:44.800 |
right now are not only talking about AI, which they were all 00:33:47.440 |
talking about it three years ago, they're actually 00:33:49.520 |
implementing it now because the tool sets are available. I just 00:33:52.160 |
had a call. Thank you, Friedberg, for putting me in touch with the 00:33:54.560 |
Google people. I just did a call this past week with Google and 00:33:58.400 |
they gave me some previews of what they're working on. I can't 00:34:00.560 |
say anything about it. But it's it's mind blowing. I think 00:34:05.200 |
Brad, what's your read on Google now? We had a little 00:34:07.440 |
kerfuffle when I talked about Sundar's reaction to, you 00:34:12.240 |
know, AI and the concerns that others might have on folks 00:34:16.880 |
talking about the company. I mean, what's your point of view 00:34:18.480 |
on Google today? I don't know if you talked about this. 00:34:20.320 |
Yeah, yeah, no. Well, I mean, I think, you know, I had this, I 00:34:25.040 |
had this dinner, this incredible dinner, actually, Thursday at 00:34:27.920 |
the pub. And, you know, it's how I know, Silicon Valley is 00:34:31.360 |
back, because it was just like serendipitous. We threw it 00:34:33.440 |
together in the afternoon, you have these, like, extraordinary, 00:34:36.400 |
you know, CEOs there. And, you know, you had Mustafa, one of 00:34:41.680 |
the co-founders of DeepMind, there, and Rich Barton from 00:34:44.560 |
Expedia, just an incredible group. And the $20 00:34:47.440 |
trillion question is, how does the entire open architecture of 00:34:53.040 |
the web get re-architected in the age of AI? Another way of 00:34:57.040 |
asking the question, because we're all kind of 00:34:59.120 |
playing small ball, right? We're all talking about, oh, you 00:35:01.920 |
know, our TPUs, GPUs, you know, and what clouds going to get 00:35:05.360 |
accelerated, that's kind of the small ball, the big ball here, 00:35:08.800 |
the $20 trillion question that has defined the entire internet 00:35:13.200 |
for our entire careers, has been web search. And I will tell 00:35:18.800 |
you that the conversation around that table, there's increasing 00:35:23.040 |
confidence that whatever comes next, right, the top of the 00:35:26.400 |
funnel is up for grabs. And all of those dollars over a period, 00:35:30.800 |
again, a five years, seven years is going to get redistributed. 00:35:34.640 |
And make no mistake, Google is well positioned to make its 00:35:39.040 |
claim for that. But its claim is going to come at the expense 00:35:43.920 |
of search. And so Google may create the best AI in the world. 00:35:47.920 |
But it's going to have to fight off the cannibalization of core 00:35:52.240 |
search. And the question is, in this new thing... And so Rich 00:35:55.840 |
Barton, you know, an incredible product founder, right? Think 00:36:00.080 |
about Expedia, Zillow. Yes, and Zillow. So he's founder of 00:36:03.600 |
Expedia and founder of Zillow, two vertical search engines that 00:36:07.200 |
dominated their particular verticals. And the topic of the 00:36:10.960 |
conversation was that the new UI, right, the new way you 00:36:15.680 |
interact with your customers, right, is not optimizing a web 00:36:19.840 |
page. It's not optimizing an application. He calls it the 00:36:24.320 |
race to intimacy. The race to intimacy that you have to build 00:36:29.280 |
a conversational UI. I've run that race a few times. 00:36:40.400 |
And the penultimate question is, who ends up on top? Are you 00:36:46.240 |
I run really good middle distances. And then if I need to 00:36:50.960 |
just make it, you know, run a Boston Marathon, I can if I 00:36:57.920 |
Does he think, sorry, does he think Zillow is dead? I mean, 00:37:05.280 |
Sorry, not Zillow. Sorry, not Zillow. Expedia. Pardon me. 00:37:09.360 |
Well, you know, you and I had this conversation, Gurley 00:37:11.760 |
chimed in, the Kesh chimed in on Twitter. Well, you're not on 00:37:14.000 |
Twitter, but they chimed in. You know, about the conversation 00:37:18.640 |
from last week's pod, every vertical search engine and 00:37:22.720 |
Google itself, that architecture is suspect. Right? The 00:37:27.360 |
question is just over what timeline is it suspect, but the 00:37:31.280 |
world is moving past 10 blue links, information retrieval 00:37:35.360 |
that looked like a card catalog is not where we're going. 00:37:38.560 |
You no longer need an index, you need someone to do the 00:37:41.200 |
retrieval from the index for you. That's the new service. 00:37:45.600 |
And yeah, that's what I mean, it's personalized to you. But I 00:37:48.080 |
think one of the questions that's still outstanding from an 00:37:50.480 |
architectural point of view, in terms of your interaction with 00:37:53.840 |
all the data in the world, is, are you as a user going to have 00:37:58.560 |
a number of independent relationships with your travel 00:38:01.600 |
search engine, your travel agent, or you're going to have a 00:38:04.720 |
relationship with your doctor's medical office, and you're going 00:38:07.040 |
to have a separate relationship with your, you know, financial 00:38:11.360 |
advisory type service? Or are you going to end up seeing 00:38:15.200 |
similar to the challenge we have in search versus vertical search 00:38:17.840 |
today, an aggregator of services, because there's value 00:38:21.920 |
in cross populating the data and the knowledge about you across 00:38:28.160 |
it's gonna be the no, it's gonna be the former, because like, 00:38:30.400 |
obviously, there's value to the company, because every company 00:38:34.320 |
wants to profiteer from you and monetize you in nine different 00:38:38.400 |
ways and maximize the equity for their employees and their 00:38:41.840 |
founders, and their board and their shareholders. But that's 00:38:44.640 |
not what people want, right? So like, for example, when you go 00:38:47.200 |
to a doctor, would your doctor be better off if, you know, she 00:38:51.040 |
also had all of your data from, I don't know, pick your other 00:38:54.640 |
service providers, of course, maybe you could make a claim 00:38:57.840 |
that if they had all of your personal information, and they 00:39:01.200 |
understood your risk factors, they could do a much better job. 00:39:04.160 |
But the reason you don't do that is you want some segregated 00:39:07.280 |
relationships, it gives you some level of privacy, but mostly it 00:39:10.320 |
gives you control in a world of AI, where you can be more 00:39:13.120 |
conversational. The idea that humans will want to get reduced 00:39:16.800 |
to talking to one thing for everything, I think is wrong. I 00:39:21.200 |
think instead, it's actually going to be hyper fragmented, 00:39:24.400 |
because the best in class use cases are going to be the things 00:39:28.240 |
that people want, and then be, it gives the user control. The 00:39:32.400 |
one thing I would tell people is do not give up all of your data 00:39:36.800 |
to one thing that's professing to do it all. They are going to lie 00:39:40.320 |
to you and trick you. And you're going to have a social media 00:39:44.960 |
There's another way to think about this, which is the reverse 00:39:47.360 |
of the client-server model, where today, you are the 00:39:51.760 |
client, you are the node on the network, and you're communicating 00:39:54.960 |
with the center of the network, which is the server, which is 00:39:57.760 |
the service provider, and all the data sits there, and you're 00:40:00.560 |
getting data in and out of their product to suit your particular 00:40:03.840 |
objectives. But the future may be that more data is aggregating 00:40:07.440 |
and accruing about you, you end up becoming a server, you then 00:40:11.600 |
have the option just this speaks to kind of the architecture on 00:40:14.160 |
the internet. Imagine if every individual had their own IP 00:40:16.960 |
address, and that individual's IP address had behind it 00:40:21.440 |
your ability to control lots of information and lots of output 00:40:24.720 |
about all your interactions across the different network of 00:40:27.680 |
service providers over the years. That's a very smart framework, I 00:40:30.800 |
think. And ultimately, you'd have to agree, 00:40:33.920 |
you want to be able to rent that data to services. I don't even 00:40:37.280 |
think it's about rent... Yeah, as much as it is about... you should 00:40:40.240 |
have the... The reason I say rent: well, then you have to rent 00:40:43.120 |
it, you can't give it to them, because that'd be stupid. That's 00:40:45.280 |
right. Your goal, your usage... like, let's say the doctor says, 00:40:49.040 |
Hey, can I get access to your health data, or sorry, your 00:40:52.640 |
Apple Watch activity data for the last year, you then have the 00:40:55.680 |
option to provide that data to them through some permission 00:40:58.960 |
type service that you then make available. So there will be 00:41:01.840 |
interconnectivity amongst the service providers, but entirely 00:41:04.560 |
gated and controlled by the user. 00:41:06.800 |
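As a toy illustration of that user-gated architecture — every class, name, and scope string below is hypothetical, a sketch of the idea rather than any real product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Grant:
    provider: str     # e.g. "dr-smith-clinic" (hypothetical)
    scope: str        # e.g. "watch-activity:last-365d"
    expires: datetime

@dataclass
class PersonalDataServer:
    records: dict                          # the user's own data, keyed by scope
    grants: list = field(default_factory=list)

    def grant(self, provider, scope, days=30):
        """User-initiated: permit one provider to read one scope, temporarily."""
        self.grants.append(Grant(provider, scope, datetime.now() + timedelta(days=days)))

    def read(self, provider, scope):
        """Provider-initiated: succeeds only through a live, matching grant."""
        if not any(g.provider == provider and g.scope == scope
                   and g.expires > datetime.now() for g in self.grants):
            raise PermissionError("no active grant for this provider/scope")
        return self.records[scope]

# The doctor asks for a year of watch data; the user, not a platform, decides.
me = PersonalDataServer(records={"watch-activity:last-365d": ["..."]})
me.grant("dr-smith-clinic", "watch-activity:last-365d", days=7)
data = me.read("dr-smith-clinic", "watch-activity:last-365d")
```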
Look, you're seeing this up front now with Reddit. In this world, the people that 00:41:11.920 |
create the content are actually in control of the data. And if 00:41:15.520 |
you try to like monetize an API, without paying the people that 00:41:19.360 |
created it or giving them control, they'll revolt. You're 00:41:22.480 |
seeing it happen now. Reddit lost, what, 90-some-odd percent 00:41:25.920 |
of all of their content because the mods were like, fuck you, to 00:41:29.440 |
Reddit. Well, think of that writ large, where everybody is asked, 00:41:32.880 |
Hey, great service. Hey, I'm this great service. I can solve 00:41:36.240 |
x, y and z problem for you. Just give me all your data. And 00:41:39.120 |
then if you're not asking, well, what do I get? And they're 00:41:42.400 |
like, well, you're going to get a whiz-bang service. And if that's 00:41:44.480 |
all they say, you're being tricked. It's already 00:41:47.440 |
happening. I mean, there's an open source lawsuit against 00:41:49.920 |
Microsoft's GitHub Copilot, and you have the Getty lawsuit. And 00:41:54.240 |
you know, there's also a corollary for this: Google tried 00:41:58.000 |
to kill Expedia already, they tried to kill Yelp. And it 00:42:01.200 |
didn't work. Folks use both services. And you know, if 00:42:04.320 |
you are trying to boil the ocean, like ChatGPT might, I 00:42:08.400 |
think it's not going to work because there's going to be so 00:42:10.480 |
much innovation, so much community brand and all that 00:42:13.040 |
kind of stuff. And these LLMs seem to be trending towards 00:42:16.560 |
good enough, or some kind of parity, that I think you're 00:42:19.680 |
right. Some people are just not going to give their data over 00:42:21.840 |
wholesale. I've been using the Zillow plugin to get... Exactly. And 00:42:25.920 |
the Zillow plugin sucks on ChatGPT. It's getting slightly better. 00:42:30.880 |
This will exist on Zillow's site. Zillow will not let ChatGPT 00:42:36.560 |
take their business. You're going to go to Zillow's site and 00:42:38.320 |
you're gonna say, Hey, show me homes that have a sauna and an 00:42:41.680 |
ice plunge. And that are four bedrooms and that are you know, 00:42:44.560 |
fixer uppers or not fixer uppers and have in suite bathrooms and 00:42:48.720 |
whatever. And it's just going to give you that list. Why would 00:42:53.520 |
Can you take a crack just at summarizing what we all just 00:42:56.400 |
said? Because your initial question was, Friedberg was 00:42:59.760 |
like, what does this mean for Google? Well, we all just said 00:43:01.840 |
that what we're certain of is there's not going to be 10 blue 00:43:06.240 |
links and a model of search that they have monetized in a way 00:43:10.160 |
that has been the most prolific business model in the history of 00:43:13.200 |
capitalism. Right? So that's a problem for them, you know, from 00:43:17.040 |
a business model perspective. So they're going to have to 00:43:19.520 |
reinvent their business, that's going to be hard to do. It's 00:43:21.520 |
like rebuilding the plane on the fly. But then if you just go 00:43:24.880 |
to the two models that we outlined here, on the one hand, 00:43:27.760 |
Chema said, you know, you're going to work with all of these 00:43:30.320 |
agents. That's exactly what Mark Zuckerberg said on Lex. He's 00:43:33.520 |
like, you're not going to have one super agent, you're going to 00:43:35.440 |
have, you know, hundreds of agents. And he thinks of 00:43:38.160 |
WhatsApp and all of his platforms as the new open web, 00:43:41.360 |
where all of these agents will live and you'll interact with 00:43:43.680 |
them. You'll interact with your Zillow agent there, you'll 00:43:46.080 |
interact with your booking agent, your Expedia agent there, 00:43:49.360 |
etc. On the other hand, ChatGPT, Pi at Inflection, Reid 00:43:54.160 |
Hoffman, and Mustafa, and many others think that the benefits 00:43:59.440 |
of general intelligence, this will be the first time that we 00:44:02.720 |
solve vertical problems horizontally, right? And their 00:44:06.720 |
metaphor, the comparison there is: we all don't have 100 personal 00:44:12.240 |
assistants in our office, you have one that gets to know you 00:44:15.680 |
really well, they get to know what you like to wear, how you 00:44:18.000 |
like to travel, the things you like to eat the things you like 00:44:20.960 |
in your house, how many kids you have. So there's real leverage 00:44:24.560 |
in that, in that horizontal knowledge that can then be 00:44:27.920 |
applied to these different verticals. I think the answer 00:44:30.640 |
lies in between, your personal assistant will do a lot of the 00:44:34.480 |
general stuff, okay, whether it's Pi, ChatGPT, whatever 00:44:38.480 |
Google comes up with, etc. But I think they will subcontract the 00:44:42.400 |
work, whether they chain it out via, you know, behind the scene, 00:44:46.720 |
you know, mechanism to other agents. So for example, if 00:44:50.960 |
you're interested in traveling to this undisclosed location 00:44:54.720 |
that I'm at in Europe right now, you know, my, my assistant 00:44:59.440 |
doesn't know anything about this place. So she may interact 00:45:02.960 |
with a specialist for this particular place to get you that 00:45:06.960 |
magical experience. So, you know, I don't think you have to 00:45:10.720 |
pick one or the other, but architecturally, $20 trillion 00:45:14.320 |
of value on the web is built around web pages, advertising, 00:45:21.840 |
and e commerce, right and sending traffic to those other 00:45:26.080 |
places. And what we're all saying, unequivocally, is it's 00:45:29.760 |
moving to knowledge extraction, and intelligent agents. And I 00:45:33.760 |
think that's tectonic, but I still think there'll be clicks. 00:45:36.240 |
I, you know, I have been using Bard a whole bunch, and they have 00:45:40.960 |
made massive rapid progress just to give you but one example 00:45:44.160 |
here, I'll share my screen, it is pretty extraordinary how 00:45:47.280 |
quickly they're figuring this out. And I did this search just 00:45:49.920 |
while we're talking: the five best Greek 00:45:51.440 |
restaurants in the Bay Area. It obviously came up with Kokkari and 00:45:55.040 |
you know, other ones. And then I asked it like, hey, what are 00:45:59.360 |
the top items on each of these menus, they started putting 00:46:01.760 |
images in and linking to Yelp. And then I said, Hey, tell me 00:46:04.880 |
the most expensive wines, and it actually got that from Kokkari: 00:46:09.600 |
they have a Chateau Margaux 2009 for $1,500. Chamath, I don't 00:46:12.640 |
think it's good enough for you. But what year is it? It looks 00:46:15.840 |
like 2009. Yeah, they have Screaming Eagle 2013. I don't 00:46:20.080 |
know if that's a no go Harlan 2014. No, no, no, no. Yeah. But 00:46:26.240 |
Harlan's good. Harlan's good. Okay, good. So we got to save. 00:46:29.280 |
Anyway, this I did the same exact search like four weeks 00:46:32.400 |
ago, it didn't have images, and it couldn't get me the items on 00:46:35.360 |
the menus. And these could all be links. And these could all be 00:46:38.480 |
paid links. So in these results, there's nothing to stop Google 00:46:42.560 |
from saying, Hey, Yelp, if you want, we'll put your 00:46:45.120 |
information here or not, up to you. You choose. Do robots.txt, 00:46:48.880 |
except we'll call it ai.txt; you tell us what you want 00:46:52.480 |
to be included. And if you want to be included — you 00:46:55.280 |
know, it's a marketplace for clicks — we'll include three of 00:46:57.440 |
your images with clicks, and we'll call it sponsored. They 00:46:59.840 |
are going to be able to insert ads into here that are better 00:47:03.200 |
than the current ads, and that will perform at a higher level. 00:47:05.520 |
And they're going to know us, this could actually be the 00:47:07.520 |
reverse of what everybody's thinking. This could lead to 00:47:10.080 |
higher CPMs and higher cost per click because of intent. 00:47:12.960 |
By the way, I mean, the 2014 is actually, I think, 00:47:17.600 |
underrated, and it's very highly rated, but I think it is an 00:47:23.920 |
Jason, as to your point, I think all of that. I mean, that's 00:47:27.280 |
impressive. It's true. You can also do it on ChatGPT. I'll 00:47:30.560 |
just say the number of people I interact with on a global basis 00:47:34.400 |
who talk about ChatGPT versus Bard is like 10 to one today. 00:47:39.120 |
Now Google is not that's going to change real fast. So it's got 00:47:41.680 |
massive distribution power. Let me tell you what I what I also 00:47:44.800 |
think. But to your point, Brad, about this, I just want to, I 00:47:47.440 |
want to respond to your Bard versus GPT-4 point. That's... people are 00:47:52.000 |
not paying attention. And when Google puts this on the homepage, 00:47:56.240 |
yeah, and starts sending 5% of users to it — and obviously, it's 00:47:59.440 |
expensive to do it — people are going to have their eyes 00:48:01.840 |
jump out of their heads. I think Google's going to beat ChatGPT- 00:48:05.040 |
4. I'm saying it right here, right now. I think they're going 00:48:07.520 |
to beat them because I think that they're better at indexing 00:48:10.160 |
all this information and understanding it than anybody on 00:48:12.160 |
the planet. And they have the largest ad network. If they get 00:48:16.000 |
this done in the next six months, I think it's going to 00:48:18.880 |
increase the cost per click, because they're going to know so 00:48:21.520 |
much about each user. Continue, Brad. I know I'm in the minority 00:48:25.200 |
here. No, no, I mean, listen, I think a lot of Google's revenue 00:48:28.560 |
comes from de facto navigation of the web. They turned 00:48:31.920 |
the URL into a navigation box. There are a bunch of ads there 00:48:35.600 |
that, you know, people click on, they don't even know they're 00:48:37.920 |
clicking on ads. And, you know, they generate a lot of revenue 00:48:41.920 |
off of that. But let's be clear, Google is firing at this. I just 00:48:47.280 |
don't think it's going to be as monopolistic as they are in 00:48:49.920 |
search. I think there are going to be other competitors who are 00:48:52.880 |
going to be well financed, you're going to have access to 00:48:54.880 |
the data. And today, you have well north of 100 million people 00:49:00.000 |
using — and, you know, a huge percentage of those paying to use — 00:49:04.720 |
ChatGPT. And I think this, you know, it's the first 00:49:08.560 |
competition, but, you know, Friedberg knows he and I had the 00:49:12.000 |
back and forth, they reported earnings, stock was flat, I said 00:49:15.440 |
on CNBC that we had sold our shares, we bought back some of 00:49:19.120 |
the shares around Google IO, because we do think they're 00:49:22.240 |
going to be a player, we think they're going to be a 00:49:23.760 |
beneficiary. But I would say as I sit here today, the 00:49:27.520 |
distribution of potential for them is less than it was before 00:49:33.040 |
ChatGPT — you know, the distribution of upside. They 00:49:38.320 |
have some competitors now, who are going to be vying for this 00:49:43.040 |
next new thing. And I wonder whether or not, you know, as 00:49:46.880 |
they try to navigate, you know, you're saying they're going to 00:49:49.360 |
take some of this traffic from search and feed it into this 00:49:52.240 |
other thing, this other thing better monetize as well as 00:49:54.640 |
search, or by definition, that means revenue goes down. 00:49:57.920 |
It feels like Google is so close to figuring this out. I 00:50:02.880 |
Jake, how much Google do you own? How much Google? 00:50:05.120 |
Not enough, not enough. I've just been playing with the J Trading thing. 00:50:13.280 |
Like a buy-in or two. But let's pivot over to this Reddit thing 00:50:18.560 |
because it's super important that people understand this. 00:50:20.480 |
I'm gonna buy 17 more shares. I'm gonna buy, like, a Harlan 00:50:25.200 |
2014 equivalent in shares. No, my J Trading is up right now. 00:50:29.040 |
JTrading.com. Shout out to my trades. I did it for fun. 00:50:34.640 |
Well, Jake, you were kind of quiet for a while. So when you're 00:50:39.120 |
onto a thing... I liked them and I didn't want to change them. 00:50:43.040 |
Do you track them and exit, or what do you do with them? 00:50:45.440 |
I'm going to track the exits, but I haven't exited any. 00:50:48.000 |
If you go to J trading.com, you can play along. 00:50:49.840 |
But how can you only be up 20% when Facebook has doubled? 00:50:58.320 |
Yeah, but Facebook doubled off the bottom. The whole goal of 00:51:00.720 |
trading, J Cal, is to maximize your highest conviction. 00:51:04.960 |
Maximize your highest conviction. What do you do? 00:51:10.160 |
I don't know. I just I just bought companies. 00:51:12.160 |
I know. When you talked about it on the group chat, 00:51:15.520 |
I made the trades. You got to put a lot of chips on 00:51:18.320 |
the table. I mean... What do you think is the 00:51:20.720 |
optimal number? As a day trader, Brad, what's the optimal number 00:51:23.920 |
of names I should have? You're not a day trader. You don't 00:51:26.320 |
trade. As a long trader, what's the number of names I should be in? 00:51:32.560 |
You have too many. I think with your level of insight, you know, 00:51:38.160 |
on the things that you really believe in, there was such 00:51:41.040 |
asymmetry in that position at the time that you bought it, it 00:51:44.080 |
was your best idea, you should have put at least 30% of 00:51:49.520 |
That's what I'll do. Yeah. I mean, it's done. Okay. I mean, 00:51:53.280 |
listen, I'm up. What did I buy? I bought 500 shares and my basis is 00:52:00.400 |
Now imagine if you could, you know, 10 or 20 x. Yeah, that's 00:52:04.880 |
Jason, how wait, hold on. How much have you put into all those 00:52:07.360 |
stocks? 1.5 million. I basically took — I had some money in an 00:52:11.360 |
index fund lying around and I was like, let me do this. 5 00:52:13.600 |
million. I did J Trading as a blatant attempt to get a sponsor 00:52:17.440 |
for This Week in Startups. So I was like, Robinhood or E-Trade 00:52:20.800 |
will sponsor this and I'll make it like a segment. 00:52:23.120 |
So I was like, let me do like a segment where I trade and see if 00:52:26.960 |
I can secure the bag. And, you know, the markets are so crushed 00:52:30.400 |
that nobody's spending on advertising — not Robinhood or 00:52:32.560 |
E-Trade or Interactive Brokers or Bloomberg. But I also did 00:52:36.160 |
it because I wanted to become better at public market trading 00:52:38.960 |
to your point from last week's conversation, 00:52:41.440 |
Chamath, about understanding when to sell and when to go long when 00:52:45.600 |
you get distributions as a VC. That was really the reason I 00:52:48.160 |
tried to do it was just to try to have skin in the game. You 00:52:50.880 |
know, just like learning to play PLO or you playing the Razz 00:52:53.680 |
tournament. I just wanted to— It's very easy. Once 00:52:57.120 |
you get shares distributed to you, immediately sell them. And 00:53:00.480 |
then — and then, hold on — re-underwrite from scratch from 00:53:04.880 |
there. Okay? Once the money is sitting in your account — even if 00:53:10.000 |
it takes a day, a week, a month; it's not like the stock is going 00:53:12.640 |
to rip and triple on you over the next 30 days — re-underwrite 00:53:16.560 |
it from scratch. And then see if you have that much conviction. 00:53:20.960 |
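A minimal sketch of that rule — sell the distribution, then re-underwrite from zero — in Python. The conviction threshold and the example inputs are illustrative assumptions, not anything stated on the show:

# Sketch of the "sell, then re-underwrite from scratch" rule above.
def handle_distribution(ticker, shares, price, conviction):
    """On a VC distribution: liquidate first, then decide fresh."""
    cash = shares * price               # step 1: sell immediately
    # step 2: with cash in hand, would you buy this stock today?
    if conviction >= 0.8:               # 0.8 is a made-up bar
        return ("re-buy", ticker, cash)
    return ("keep cash", None, cash)

print(handle_distribution("SNOW", 1_000, 140.0, conviction=0.6))
# -> ('keep cash', None, 140000.0)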
I like it. I would just tell you guys the distributions, you 00:53:25.200 |
know, there were a lot of firms that rode, you know, a lot of 00:53:30.720 |
these big names, right, all the way down. You know, they 00:53:36.000 |
got public, but they didn't distribute much, if any. And 00:53:40.240 |
it's really— Don't say the names, but what do they rhyme 00:53:42.640 |
with? I'll start, I'll start. But what's my boy— Yeah, go ahead. 00:53:48.320 |
You go next. You pick one. Not doing it. Bliger 00:53:53.280 |
Bliger. Dreeson Borrow. It's Malander's blonde. Friedberg just 00:53:59.760 |
dropped off. With this conversation. All right, listen, 00:54:04.320 |
we all got our asses kicked. Except for me. Uber's up 67% 00:54:08.800 |
this year. Oh, come on. Come on. Let me— What I was 00:54:11.520 |
doing. Andresen Blore. It's all of it. What tends to happen is 00:54:17.200 |
things hit the bottom, bounce a little bit. And people are so 00:54:21.520 |
relieved that they bounced a little bit that they immediately 00:54:25.200 |
start distributing because LPs are hammering them. LPs are 00:54:29.200 |
like, why did you not send us this, you know, Snowflake when 00:54:32.960 |
it was at $400 a share? Why did you not send us this DoorDash? 00:54:37.120 |
Why did you— Go through any name, you know, that you can think of. 00:54:40.880 |
but then people compound the mistake, I think that as soon 00:54:45.760 |
as they get a little relief, boom, it's out the door. And 00:54:48.800 |
then the thing doubles on them. And I just think, you know, part 00:54:53.280 |
of the benefit of me having to get up every day at 5am and deal with 00:54:57.680 |
public markets, and think about long or short, it's what we're 00:55:00.640 |
talking about in the fall of 2021, around the table before 00:55:04.160 |
the poker games, which is the public market is tilty, it 00:55:08.480 |
doesn't feel good. And that, you know, we stopped making 00:55:11.600 |
venture investments in October of 21. Principally, because of 00:55:15.680 |
how we felt about public markets at that point in time. And so I 00:55:18.560 |
think a discipline, in fact, I was meeting with a managing 00:55:21.440 |
partner of a very big venture firm in Silicon Valley this 00:55:24.960 |
week, they just started about a year ago, or six months ago, 00:55:28.720 |
doing public market investing. And they said it was incredibly 00:55:32.560 |
valuable to how they think about distributions and what they're 00:55:36.000 |
doing in the venture. But that's what it's been for me, I have 00:55:38.320 |
now started to learn how these things are priced. And I just 00:55:40.640 |
think it's great as a learning thing. For me, I have to make 00:55:43.440 |
two decisions when to distribute to my LPs, which I just give 00:55:46.800 |
them the shares the second I have them and let them make the 00:55:48.800 |
decision. But that is also personally for me, do I want to 00:55:51.440 |
hold it or do I not? I like your suggestion — nice punch-up, 00:55:54.000 |
Chamath — of looking at the cash and then deciding if you want to 00:55:56.320 |
re-buy. Did you guys see this crazy thing from CalPERS, where 00:55:58.880 |
they were like, yeah, they were down, like, four— I 00:56:02.960 |
mean, horrible returns on the venture side. And so their 00:56:05.360 |
decision was to double down. Well, they took 00:56:08.560 |
like a decade off from venture; the returns of what they did 00:56:12.080 |
invest in venture were atrocious. To be fair, I don't 00:56:14.720 |
know how the board has changed at CalPERS, but CalPERS brought 00:56:17.600 |
in a new CIO, an entire new team. And so they can't be held 00:56:22.640 |
liable for, you know, that track record. I did kind of see 00:56:26.160 |
something on Twitter about it. I was shocked — it would be very 00:56:28.640 |
difficult to manufacture that bad a track record, if what I 00:56:32.240 |
saw was accurate. However, I would say the team that is there 00:56:36.560 |
now the portfolio that they're putting together, and doubling 00:56:39.680 |
down where I think venture is kind of at the bottom or bottom 00:56:43.600 |
third, and this is the bottom, I think, bottom third, I don't 00:56:47.360 |
know, bottom half, you know, listen, in venture, you got to 00:56:50.640 |
find three things. You got to find, you know, you don't want 00:56:53.680 |
to invest all your money at the top. So you got to get the bottom — 00:56:56.640 |
you know, bottom third; none of us can call the bottom. But you 00:56:59.200 |
know, last year, at the end of '22, we were certainly in the 00:57:02.000 |
bottom third of valuations. Then you want to be early in a major 00:57:06.080 |
platform disruption — I think we all agree we're early in a major 00:57:10.000 |
platform disruption, probably the third of our careers. And 00:57:13.920 |
then you want to back one of the best firms, best brands in 00:57:16.160 |
Silicon Valley — and people know who those are. If you do all three 00:57:21.200 |
simultaneously, you got a good shot at producing really— 00:57:25.120 |
Just to put some numbers on the CalPERS thing, so you can 00:57:26.800 |
respond to it: America's largest public pension, CalPERS — 00:57:30.400 |
that's California's — manages some $444 billion in capital on 00:57:35.040 |
behalf of California's 1.5 million state, school, and public 00:57:38.160 |
agency employees, and is leaning into venture. After years of 00:57:42.560 |
bringing its VC exposure down to a 1% target, the 00:57:46.880 |
institutional investor is now looking to increase its 00:57:49.120 |
allocation more than sixfold, to 6%: from $800 million to 00:57:53.280 |
$5 billion, the Financial Times reported. So, thoughts on that. 00:57:57.440 |
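Quick arithmetic on the figures just cited, for reference — the inputs are the reported numbers, and the closing comment is an inference from them, not from the FT piece:

# Sanity check on the CalPERS numbers above.
aum = 444e9        # total capital managed, as reported
old_vc = 800e6     # prior venture allocation
new_vc = 5e9       # proposed venture allocation

print(new_vc / old_vc)   # 6.25    -> the "more than sixfold" increase
print(old_vc / aum)      # ~0.0018 -> ~0.2% of total assets
print(new_vc / aum)      # ~0.0113 -> ~1.1% of total assets
# Note: $5B is only ~1.1% of the full $444B, so the 1% and 6% targets
# presumably apply to a narrower base (e.g., the private-markets
# program) rather than to total AUM.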
Well, I mean, this is the moment there's a lot of venture 00:57:59.600 |
firms that are hard up for money. And so they may have a 00:58:04.000 |
shot, they can get allocations. Now, in order to get into these 00:58:06.800 |
funds, the problem that they have to realize is they may be 00:58:10.240 |
buying a bunch of toxic assets or a bunch of toxic partners in 00:58:14.320 |
the sense that a lot of these people have been getting run 00:58:16.720 |
over for the last three or four years. We don't know again, 00:58:20.800 |
we've said before, most venture investors are probably 00:58:23.520 |
unprepared for the shock because they've never lived through 00:58:27.520 |
one. There have been a lot of junior muckety-mucks that were 00:58:30.640 |
hired because they were some XYZ middle-level exec at a rando 00:58:34.800 |
company — means nothing. So I don't know, I think it's smart 00:58:39.760 |
that they have— I mean, they have to be thoughtful. First of all, 00:58:42.400 |
how is it that CalPERS — like, literally, it's like, I know, 00:58:48.320 |
trillions of dollars are being made in your backyard, and some 00:58:51.360 |
genius was like, well, the thing that's creating all the wealth 00:58:54.560 |
in the world is in our backyard, where we probably have 00:58:57.280 |
the best pitch in order to get into these funds. 00:58:59.120 |
And you just come down here and say, hey, oh my god, give me 00:59:03.680 |
the list of 10 and put money on those 10 and call it a day. 00:59:06.720 |
Oh my god, that's unbelievable. That's unbelievably derelict. 00:59:13.120 |
sorry, that's it. Sorry, let me just finish. That's actually 00:59:17.120 |
that's the best word to use. When you look back on a decision 00:59:20.480 |
like that, there's no numerical justification that one could have 00:59:23.360 |
made in the 2010s to have made that decision, except that 00:59:26.640 |
that was an emotional decision. And when you're running a half a 00:59:30.800 |
trillion dollar fund, there is no room for emotion, you should 00:59:34.080 |
not be allowed, there should be some. So I think that that shows 00:59:37.280 |
a very hollowed out and at a minimum, imbecilic risk 00:59:43.200 |
management infrastructure at CalPERS that needs to get fixed. 00:59:45.840 |
So whether the allocation goes up or down, if I was a teacher, 00:59:49.760 |
and my money was being managed by them, I say, man, these people 00:59:53.440 |
I would want to understand how they're making decisions. Because 00:59:56.800 |
I don't, I would want to see the investment memo that got them to 01:00:01.200 |
decide as an investment committee that 1% in the most 01:00:05.040 |
important asset class that's in your backyard, that's making all 01:00:07.600 |
the money in the world made any sense. I just want to read that 01:00:10.480 |
memo and see, could I agree with that? Because then I would want 01:00:15.200 |
to do so. Hold on, then I would want to read the investment 01:00:18.240 |
memo that says we're going to change course and get back to 01:00:21.040 |
five or 6%. Because if that process isn't fixed, these 01:00:24.320 |
decisions are just going to be equally bad, you're just going 01:00:27.440 |
to compound bad on top of bad because, again, they clearly did 01:00:31.040 |
a terrible job. And then on top of that, they had horrible 01:00:35.840 |
partner selection, because the people that they did invest in, 01:00:38.800 |
because the data is public, have performed horribly. So what 01:00:43.280 |
changes now, all of a sudden, right? Because if the best firms 01:00:46.560 |
still don't want you in, increasing it from $800 million 01:00:49.040 |
to $5 billion just means you're going to lose $4.2 billion more 01:00:51.600 |
than you would have otherwise at $800 million. 01:00:54.400 |
It says here they made two bets in the last couple of years — looks like 01:00:59.120 |
Lightspeed and TPG. So 01:01:04.400 |
I do have some data on this, because we've spent a lot of 01:01:10.320 |
time talking with them. And as you know, Jason, we were just 01:01:12.640 |
talking, you know, like talking to folks like Mubadala. 01:01:15.120 |
California is one of the largest sovereign wealth funds 01:01:18.880 |
in the world, right? This is not just a state. This is bigger 01:01:22.240 |
than most countries, right? This is one of the biggest 01:01:24.560 |
economies in the world. I would say that what my interactions 01:01:29.120 |
with the new team have been very impressive. And I will tell 01:01:32.880 |
you, they're not just looking at funds and building a new 01:01:36.000 |
portfolio. I think they are thinking about doubling down at 01:01:39.120 |
the right time. And they're thinking about thematic bets 01:01:42.640 |
against super cycles like AI to say, let's allocate this much 01:01:47.040 |
money, who are the three or four deep strategic partnerships 01:01:50.560 |
that we can have to drive, you know, return on that. So I think 01:01:54.320 |
it's I think they're on their way. But I agree with you. 01:01:56.880 |
They're the sovereign wealth fund of the most prolific state 01:02:00.800 |
in the world. And they should have outsized returns, not 01:02:03.440 |
trailing. I just want to point out one thing here, which is the power 01:02:06.560 |
of writing. Chamath, you decided to write your annual letter a 01:02:09.280 |
couple years ago. Brad, you also write letters — famously 01:02:13.520 |
the Facebook-getting-fit one, or the Meta-getting-fit one. The 01:02:16.560 |
power of writing — and I just did this for Launch fund four. 01:02:20.000 |
And when you write a deal memo compared to doing a deck, 01:02:23.360 |
they're the questions you get are so qualitatively different. 01:02:28.400 |
And the people you attract are so different. It is 01:02:30.960 |
extraordinary. I am advising startups across the board to 01:02:35.840 |
write really tight deal memos, because we make 01:02:38.000 |
these deal memos internally when we make an investment. But man, 01:02:40.960 |
is it great for clarity of thought and for the person on 01:02:43.280 |
the other side to just stop and read 1000 words or 2000 words 01:02:47.120 |
as opposed to going through some stupid performative deck. It's 01:02:51.040 |
just such a better process. Maybe you both could 01:02:53.680 |
speak to that — or Friedberg, I don't know if you've been 01:02:55.440 |
writing deal memos. But maybe for people who are listening — 01:02:58.240 |
capital allocators and founders. I've been writing for years. 01:03:00.720 |
Why? And what does it do for you? 01:03:04.240 |
Yeah, well, it allows you to actually find people who will 01:03:08.400 |
critique things in a thoughtful, intelligent way. It's 01:03:11.280 |
hard to critique decks because you use broken English, you use 01:03:14.480 |
fancy graphics, all of a sudden, somebody that's very good at 01:03:17.840 |
like graphical layout can dupe somebody else. And so you don't 01:03:21.760 |
get to good outcomes because this weird groupthink effect 01:03:25.360 |
sets in when you look at decks. So I'm not a fan of decks, I use 01:03:29.440 |
them, but they need to be a companion to some sort of long 01:03:32.800 |
form narrative document. And I just think it's more useful, you 01:03:37.360 |
get people who can really think about what they agree about what 01:03:40.800 |
they don't agree about, it shows the intellect of the person 01:03:43.600 |
writing it, quite honestly. I just think it's like a 01:03:47.760 |
basic skill that people should have. It's a useful skill to 01:03:50.800 |
teach people as well. Decks are very dangerous. I think if 01:03:54.000 |
you're going to make decisions, I would encourage you the bigger 01:03:56.720 |
the decision, the deck is insufficient, it can be a 01:04:00.000 |
companion piece, but it needs to be attached to long form 01:04:02.320 |
documents. But the long form documents don't need to be long 01:04:05.360 |
either — two, three, four, five pages — but without it, I think you're going to 01:04:09.520 |
allow some really bad decisions to creep into some good ones. 01:04:18.400 |
he ran, he doesn't talk about writing. All right, as everybody 01:04:23.280 |
knows, Reddit is on strike right now, I should say the mods who 01:04:26.800 |
run Reddit are on strike. 95% of subreddits went dark; they basically 01:04:31.520 |
turned off new posts, or they just went private. Basically, 01:04:34.000 |
nobody could join, nobody could see the content, I believe is 01:04:37.120 |
what that means between Monday and Wednesday. And the reason 01:04:41.440 |
they're doing this is because Reddit decided it would start 01:04:44.000 |
charging for its API. So who gets impacted by using the API? 01:04:48.480 |
It turns out apps, we saw this at Twitter as well, when Twitter 01:04:51.440 |
started changing its pricing for its API. And this means that 01:04:56.240 |
some of the really high-end Reddit apps would have to pay 01:05:00.160 |
$20 million a year for access to Reddit's data. Now, they 01:05:05.280 |
originally had said they were changing this pricing because 01:05:07.760 |
people were going to train AI models on it. And they wanted anybody 01:05:11.600 |
using that data — i.e., Google Bard or ChatGPT, for closed 01:05:16.400 |
AI — they wanted them to pay for it. And Steve Huffman 01:05:19.440 |
explained all this in a New York Times profile in April; you can 01:05:22.400 |
go search for that Reddit wants to get paid for helping to teach 01:05:24.800 |
big AI systems. The largest app is called Apollo. Just so you 01:05:29.120 |
know, if you're not into this: Reddit came out with 01:05:31.520 |
their own app really late. This was a function of the 2005 to 2015 01:05:37.040 |
timeframe. Back then, Web 2.0 startups didn't have a lot of 01:05:41.280 |
capital, so they let other people build on their APIs and 01:05:43.920 |
build apps. Some of those apps caught steam and were actually 01:05:46.240 |
better than the first-party apps — at least for a while, they were 01:05:49.920 |
better than the Twitter app and the Reddit app. So thoughts on 01:05:53.760 |
this, Friedberg? There's a history here that's mimicked at both 01:05:59.520 |
Facebook and Twitter, both of whom had open APIs that 01:06:03.520 |
provided access to third party app developers to build tools 01:06:07.760 |
on top of the platform by accessing either user data or 01:06:10.640 |
content off of the network, and then making that available via 01:06:15.440 |
some different product function, some different UI than the 01:06:19.920 |
native tools allowed. If you'll remember, Facebook started to 01:06:23.040 |
kill off its API and in the process killed a number of 01:06:25.600 |
these third party app developers, the most prominent 01:06:27.920 |
of which was Zynga. I think this was around the 2012 01:06:31.280 |
timeframe. You guys know if I'm right on that, I think it's 01:06:33.520 |
right. Sounds right. And you know, we saw the same thing 01:06:36.400 |
happen in Twitter, where if you guys remember, in the early 01:06:40.160 |
days, a lot of users accessed tweets from people that they 01:06:44.080 |
followed through third party apps and third party apps all 01:06:46.880 |
competed for the user. And ultimately, Twitter's 01:06:50.880 |
management team realized that having direct access to the 01:06:53.760 |
user — being able to control the UI, the UX, and not just become 01:06:57.440 |
a data network, but to actually become a service for users — mattered, and they made 01:07:03.520 |
a similar sort of change. So, you know, Reddit's motivation 01:07:07.120 |
around AI training is an interesting one. But it does 01:07:09.600 |
speak to this idea that the social network companies social 01:07:12.880 |
in the sense that the users themselves are creating the 01:07:15.600 |
value they're creating the content at both Facebook, at 01:07:19.120 |
Twitter, and at Reddit, ultimately, the company loses 01:07:23.680 |
the value if that data that content gets extracted, and 01:07:28.880 |
they can't monetize it or capture the value somehow. And 01:07:32.320 |
it's a lot different, you know, every company ultimately wants 01:07:34.560 |
to become a platform company, meaning that they can offer 01:07:37.920 |
multiple products or services to users that sit on top of some, 01:07:42.800 |
you know, network they've created, and in the process, 01:07:45.360 |
create a network effect, because more products, more apps 01:07:48.080 |
creates more users, and more users, you know, beget more and 01:07:51.600 |
more apps, and so on. That works well in some contexts, 01:07:55.120 |
like an Apple App Store context, but in the context 01:07:59.120 |
where there is a network unto itself, like Facebook, Twitter 01:08:03.280 |
and Reddit, meaning that there is already a user network that 01:08:06.720 |
is generating value in the form of the content that's being 01:08:09.040 |
produced and consumed, you don't necessarily gain anything by 01:08:12.560 |
then building an app network on top of it. And I think that's 01:08:15.600 |
been one of the kind of key learnings that's repeated 01:08:17.760 |
itself over and over with Twitter and Facebook. The thing 01:08:21.040 |
about Reddit, it's always been a community service. If you 01:08:23.840 |
guys remember, like in 2014, 2015, Ellen Pao stepped down 01:08:27.840 |
after there was a Reddit revolt; she was the CEO at the time. 01:08:30.240 |
And she fired an employee at Reddit that ran the Q&A site for 01:08:35.840 |
Reddit mods, and their users and the network, the community 01:08:38.960 |
was super pissed off when this happened, and they all revolted, 01:08:41.440 |
and they were going to shut down the service. And ultimately, 01:08:44.320 |
Ellen, you know, got removed from her role as CEO when this 01:08:48.320 |
happened. So you know, because so much of the value of Reddit 01:08:51.840 |
isn't in the management team. It's not in the work that the 01:08:54.960 |
software engineers do that run the company. It's not the VCs or 01:08:58.320 |
the shareholders, the value of Reddit is inherent in the 01:09:01.040 |
community. It's inherent in the individuals that build the 01:09:03.520 |
content on that platform. And that community has convened many 01:09:06.960 |
times in the past at Reddit to change the rules to say this is 01:09:10.240 |
what we want this community to become. And this is how we want 01:09:13.120 |
this management team to operate. And so it's a really uniquely 01:09:16.720 |
positioned business. It says a lot about how social networks in 01:09:20.240 |
this kind of modern era are operating; it really speaks to 01:09:22.880 |
how much of the value ultimately accrues to the shareholders in a 01:09:25.200 |
business like this, when the users themselves can step in and 01:09:28.080 |
unionize and say, Hey, you know what, we're not going to allow 01:09:30.960 |
this much value to be pulled out of the network in this way, we 01:09:33.760 |
want this to change. So I think it'll it'll have a lasting 01:09:36.800 |
effect in terms of investing in social network or social media 01:09:40.480 |
type businesses where the users are generating so much of the 01:09:43.120 |
value and have the ability to kind of communicate with one 01:09:45.760 |
another and control where value ultimately falls. 01:09:48.400 |
Chamath, you were at Facebook, I think, when Zuckerberg realized 01:09:52.400 |
enabling a bunch of folks to use the API wasn't as good as 01:09:56.480 |
controlling the user experience, having a uniform user 01:09:58.960 |
experience. And it got deprecated. And I remember Zynga 01:10:04.080 |
and a bunch of other people had games and we're sucking users 01:10:07.360 |
off the platform. There was a LinkedIn competitor at one point 01:10:09.760 |
that was growing at an incredibly violent pace. And I guess y'all 01:10:13.840 |
made a decision. We don't want you sucking our user base off 01:10:17.120 |
the platform. Maybe you could expand on what the decisions 01:10:20.800 |
That was one of the things I ran, I think one of my teams was 01:10:23.600 |
responsible for Facebook platform. Yeah, it was just a 01:10:27.280 |
very clinical decision around enterprise value. Look, the 01:10:30.320 |
thing with Reddit is that it's a hot mess. And in order to try 01:10:37.040 |
to create enterprise value, they decided to really leverage 01:10:43.040 |
the mods so that there was some level of control. And that 01:10:49.280 |
control was necessary so that they could basically sell ads. 01:10:52.480 |
That's how this thing moved in lockstep. Because the minute 01:10:54.880 |
that there was corporate venture investors and other investors 01:10:58.480 |
and a need to generate enterprise value, and is it worth 01:11:02.240 |
2 billion or 5 billion or 10 billion, whatever the number was 01:11:05.120 |
that they thought they were worth, they had to make money, 01:11:07.600 |
right. And Huffman was very straightforward about that. And 01:11:10.400 |
wanting to go public and the whole nine yards. But because 01:11:13.760 |
it was such a hot mess, the mods became this integral part of 01:11:18.320 |
the ecosystem, so that they could actually drive reasonable 01:11:21.680 |
revenue. But then what happened was, it also allowed them to 01:11:26.240 |
basically take over. And I think that it was a pretty 01:11:32.160 |
significant miscalculation. Because I think that what they 01:11:36.160 |
needed to do was really redefine the economics of how revenue 01:11:40.320 |
generation splits would work before they could do all of this 01:11:43.280 |
stuff and try to monetize the API. So I think like, they got 01:11:46.160 |
the order of operations wrong. But I also think it's very 01:11:49.280 |
fixable. And I think that they have some very smart people 01:11:51.840 |
around that table. So as long as they're, again, willing to be 01:11:55.360 |
clinical and unemotional, like we were, they'll get to the 01:11:58.320 |
right answer, which is, give them a healthy rev share. That's 01:12:01.760 |
the future. You know, a version of what Friedberg 01:12:04.800 |
said is true. The content creators need to get paid, you 01:12:07.600 |
know, why, you know, you see content creators now on YouTube 01:12:10.960 |
making millions, tens of millions, hundreds of millions, 01:12:14.080 |
and in a few unique cases, billions. And then we are all 01:12:18.160 |
content creators, yet most of us on these old legacy platforms 01:12:21.280 |
make nothing. So that that exchange has to change. Brad, 01:12:27.600 |
Not on that. But I mean, I think, you know, you bring up 01:12:30.160 |
this thing about meta, you know, did anybody pay attention to, 01:12:34.720 |
you know, Zuck's launching Project 92. Right. And Project 01:12:39.360 |
92 is going to take on Twitter. It's a text based social 01:12:43.360 |
network that's going to pay creators and they're courting 01:12:46.720 |
apparently Oprah Winfrey, the Dalai Lama, and all of their 01:12:50.400 |
creators that are already on their current sites are saying 01:12:54.160 |
we will use this thing to interact and we will compensate 01:12:58.000 |
you. And then on Lex Fridman's podcast, he mentioned 01:13:00.720 |
something about, you know, having been inspired a little 01:13:05.680 |
bit by what was going on with Bluesky. So I'm super 01:13:09.920 |
intrigued. You know, he talked about it at this all hands 01:13:12.480 |
meeting, I think Chris Cox talked about it. So it looks 01:13:15.120 |
like meta may be revisiting some of these things that they 01:13:17.840 |
shelved a while back. You know, it doesn't have any direct 01:13:21.280 |
implications on the Reddit front. But I think there's a 01:13:26.160 |
suggestion here that it may be more about putting the 01:13:30.320 |
control back in the hands of the user. From a data 01:13:34.240 |
perspective and a monetization perspective, that would be a 01:13:37.360 |
pretty gangster move. You know, and an interesting way to 01:13:41.840 |
leverage the platform. And I don't really hear anybody 01:13:44.640 |
There's a really easy solution here for Reddit. Those mods, 01:13:49.280 |
most of them do it for fame, glory, and community affiliation. 01:13:53.840 |
But if some number of them wanted to monetize their 01:13:57.840 |
activity there, why not allow people to subscribe to sub- 01:14:00.960 |
reddits and pay a membership fee and split it with the mods? 01:14:04.560 |
That seems like it would be a high-scale move — getting 01:14:08.160 |
Patreon-like, you know, subscription services to let them make a 01:14:12.080 |
little bit of money. And then — it's only three apps 01:14:14.240 |
that are being affected — they should just either buy those 01:14:16.320 |
apps and bring those teams internal, or 01:14:19.760 |
split revenue with those apps; they could tell 01:14:21.680 |
those apps that. That's the key issue. It's like, if 01:14:24.320 |
you're going to try to monetize APIs on user-generated 01:14:26.880 |
content, I think what's happening here is the internet 01:14:29.840 |
is saying, Okay, that's enough, because we're going to leak our 01:14:32.960 |
activity someplace else where we can directly monetize it. So 01:14:36.000 |
that's the whole point. I think in this current version of the 01:14:38.400 |
internet, the value is going to, again, further and 01:14:42.640 |
further erode away from the centralized apps, and more 01:14:47.520 |
towards the individual people — or in this case, these hub-and- 01:14:50.880 |
spokes, these mods — and not to the centralized organization 01:14:54.400 |
that does the housing: the Reddits, the Facebooks of the 01:14:57.760 |
world. So that's just the trend that's happening, the 01:14:59.520 |
Instagrams of the world. And there's nothing wrong with that. 01:15:01.920 |
YouTube as well. It's just where the pendulum is swinging. 01:15:06.320 |
So I think that Reddit just has to cut them a deal. 01:15:09.840 |
Pretty easy to do. Speaking of AI funding, there was a breaking 01:15:16.320 |
news story just in the last 48 hours: a startup named Mistral AI 01:15:21.600 |
has raised a $105 million seed round, they're calling it, said to 01:15:26.480 |
be at about a $240 million valuation, according to 01:15:29.440 |
reports. One of the co-founders is a DeepMind researcher, I 01:15:34.400 |
guess people were saying this is insane, because they haven't 01:15:36.480 |
written any code yet. And they've been working on the 01:15:39.920 |
Why these rounds are so big, I guess, is that you have to buy all 01:15:43.600 |
these H100s. And they're expensive. And these rigs are 01:15:46.080 |
very expensive. I mean, you guys saw this that Nat Friedman 01:15:49.920 |
basically published that he spent $75 million on a bunch of 01:15:55.680 |
hardware. And if you were one of his portfolio companies, you 01:15:58.560 |
could use it. So getting these H100s and A100s to train on 01:16:02.720 |
seems to be a non trivial task. And so and they're very 01:16:05.520 |
expensive, even if you can get ahold of them. So I think what 01:16:09.280 |
we're talking about is basically that these rounds have to go up. 01:16:11.840 |
I mean, because if you notice, like, the post — you had to raise 01:16:16.880 |
the post-money, so that it wasn't so utterly dilutive as to 01:16:20.320 |
completely disincentivize the employees. But let's be honest 01:16:23.840 |
here. Nobody's writing a $105 million seed check; that 105 01:16:27.280 |
million is chopped up, probably 10 ways to Sunday. So there's a 01:16:30.240 |
bunch of people putting in fives and 10s and 15s. 01:16:32.400 |
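A back-of-envelope on why those small checks are hard to make real money on, echoing the sizing point that follows — every number is an illustrative assumption (e.g., treating the reported ~$240 million as post-money and guessing 60% cumulative dilution from later rounds):

# Illustrative math on a small check into a huge seed round.
check = 5e6
post_money = 240e6
ownership = check / post_money            # ~2.1% at entry
after_dilution = ownership * (1 - 0.60)   # ~0.83% at exit (assumed)

for exit_value in (1e9, 5e9, 10e9):
    proceeds = after_dilution * exit_value
    print(f"${exit_value/1e9:.0f}B exit -> ${proceeds/1e6:.1f}M "
          f"({proceeds/check:.1f}x the check)")
# Even a $10B outcome returns ~$83M on the $5M check -- a fine
# multiple, but too small to move a large fund, which is the
# sizing argument being made here.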
I'll just be honest in the round, and they said it was 01:16:37.280 |
Yeah, so it's everybody taking a little teaser bet. The 01:16:41.360 |
problem with these teaser bets is they never hit in the way 01:16:43.760 |
that you think. Maybe you'll get your money back, with all the 01:16:46.560 |
dilution that's going to happen, etc, etc. This is not the way to 01:16:50.160 |
make money, guys. I'm just going to be honest with you. So 01:16:52.160 |
whoever's putting money in thinking they know what the fuck 01:16:54.080 |
they're doing, you might as well just light it on fire, go to 01:16:56.640 |
Vegas and have some fun with it, because you'll 01:16:59.200 |
get more enjoyment from that than you will from making these 01:17:01.360 |
kinds of investments. You think that's dumb? Stupid, stupid bet. 01:17:05.440 |
Brilliant bet. Well, by the way, this has nothing to do with 01:17:09.040 |
When you're betting $4 million in a $105 million round at $240 million-plus, 01:17:15.280 |
you do not know what the fuck you're doing. That is not the 01:17:18.560 |
job. So again, mister— There are gonna be a couple monster 01:17:23.120 |
rounds, I think announced next week, like that are gonna make 01:17:25.920 |
this one look like kid's play. So a lot more of this is coming. 01:17:30.560 |
You're exactly right. This is about buying H100s and compute — 01:17:35.440 |
everything we're talking about, right? An essential ingredient. 01:17:40.560 |
My God, these people that put the money in the seed round 01:17:42.800 |
should have just bought Nvidia call options — like, you'd make more. 01:17:49.040 |
Nvidia has a floor. And by the way, because, just in case 01:17:52.320 |
Mistral doesn't work, Nvidia will sell those H100s to— 01:17:55.360 |
The resale value. Yeah, they probably have the option to. 01:17:58.560 |
I do think you're right on these teaser bets. This is a power 01:18:01.840 |
law business. There's massive pressure on young junior 01:18:07.600 |
partners, principals within these firms to do something. 01:18:12.560 |
It's not just teaser bets. What happens in these firms is they 01:18:16.880 |
want to get the logos, because they need to explain to the— 01:18:21.520 |
No, actually, no, that's even worse — the GP that can't manage... 01:18:27.840 |
And so those young people should shut the fuck up. 01:18:30.880 |
Okay, be lucky you have a job. Learn the craft. It'll take you 01:18:36.080 |
a decade. And if you are proposing stupid bets like this, 01:18:40.160 |
again, sizing matters. Again, I'm not talking about the 01:18:43.200 |
company here. I'm just saying, when you make a $5 million or $3 01:18:46.640 |
million deal memo for a $105 million round, that is stupid. 01:18:51.440 |
Okay. And so if you're the partnership that allows those 01:18:55.280 |
kinds of things to leak through, you don't know what you're 01:18:58.160 |
doing. So at some point, somebody should be held 01:19:01.520 |
accountable for this. And I guess what's going to happen is 01:19:03.920 |
the returns of CalPERS are going to cascade through everybody 01:19:06.480 |
else. Well, at least I'll own some, you know, H100s for a— 01:19:12.080 |
Do you want to take the other side? No, this is just an 01:19:14.480 |
incredible team that's going to— You know, this may turn 01:19:17.520 |
out to be a fantastic bet for, I don't know, Lightspeed or 01:19:20.720 |
whoever led this round. Let me let me say something different. 01:19:24.000 |
In 1997-98, we had a similar phenomenon. Everybody thought 01:19:28.960 |
internet search was going to be huge. There was massive FOMO 01:19:32.240 |
and chasing and everybody scrambled to get a search logo. 01:19:36.880 |
AltaVista, Infoseek, Lycos, Go, PlanetAll, you know, 01:19:43.120 |
GeoCities, just go through the list, you know that people were 01:19:46.400 |
scrambling after. And the truth of the matter is almost all 01:19:49.440 |
those companies went to zero — even though you got a couple 01:19:52.800 |
bets right. You got the internet right. You got search 01:19:56.000 |
right. But you didn't have to invest $1 in search until 2003. 01:20:03.120 |
Do it again now for social networking — same thing. Do it 01:20:05.920 |
again for social networking, Brad; say all the names, say all 01:20:08.560 |
the names. It's the same thing. Same thing. And so, you know, 01:20:13.920 |
when you when I look at it today, we have a huge anti 01:20:17.600 |
portfolio for AI. It's painful. We've said no to over 60 01:20:22.720 |
companies, right? We, you know, but when we look around, I see a 01:20:28.960 |
lot of our competitors doing a lot of these deals. Maybe their 01:20:33.520 |
teaser bets, I don't really know the size they're putting into 01:20:36.240 |
those companies. But I suspect that if we believe this is as 01:20:40.080 |
big as it's going to be and going to play out over decades, 01:20:44.080 |
then putting a bunch of really small bets in order to buy a 01:20:47.040 |
network or buy relationships or buy logos, etc. I don't think 01:20:50.800 |
it's going to work any better this time than it worked then 01:20:53.920 |
around social networking. But to be clear, there are gonna be 01:20:56.640 |
some people who lead these deals, who help these companies 01:20:59.920 |
build incredible businesses. And those will, you know, they're 01:21:03.120 |
going to be some winners here. I think there's time to 01:21:06.080 |
participate in the winners. Right now, a lot of this is 01:21:09.360 |
unknown and unknowable; it will become more clear in the one, two, three 01:21:13.840 |
years ahead. The problem is for the LPs. Like, if you are the LP — 01:21:17.440 |
I am an LP in a bunch of venture funds — this stuff really 01:21:20.080 |
turns my stomach because I'm like, wow, I am losing money 01:21:23.440 |
every day. When I see these things. That's why I get so 01:21:25.840 |
emotional about this. I think to myself, if the GPs that I've 01:21:29.680 |
given money to are doing these kinds of deals, I'm screwed. At 01:21:34.640 |
best, maybe I'll get back 50 or 60 cents on the dollar. And I 01:21:38.080 |
immediately start thinking to myself, I really need to write 01:21:40.320 |
this down. And I'm re underwriting that person and 01:21:42.800 |
that organization, because I'm wondering, how can you let these 01:21:45.760 |
things happen? Because if you just look back in history, you 01:21:49.680 |
have to be really, really negligent to not learn that 01:21:53.520 |
these things never work out the way that you think they are. And 01:21:56.080 |
especially these kinds of rounds and this nominal ownership, the 01:21:59.440 |
inability to defend ownership, it's just not a path to success. 01:22:03.360 |
I mean, also, if you think about this, Chamath — you know, what 01:22:06.640 |
company in recent history that got overfunded actually used that 01:22:10.080 |
money logically, the magic of Silicon Valley is the milestone 01:22:13.040 |
based funding system. And whenever you short circuit that 01:22:16.480 |
this money becomes a huge distraction to founders, if they 01:22:19.920 |
were to receive 10 million 25 million work for a year or two, 01:22:23.600 |
then raise another 25 or 50 million, they don't need to 01:22:26.480 |
raise all this money at once. This is like taking your A, B, C 01:22:29.360 |
rounds and putting it all in at once. 100%, it distracts founders. And then 01:22:33.040 |
everybody coming for a salary says, well, you got 105 million 01:22:35.760 |
in the bank, I want $3 million, I want $5 million. 01:22:38.640 |
What's the best example guys of a huge financial winner that 01:22:42.240 |
raised these ginormous amounts of money? What's the 01:22:44.480 |
pre launch of a product? There's no product here. 01:22:48.080 |
What's the best example? What is it? 01:22:54.800 |
What is the best example of a great company — I'm putting 01:22:58.400 |
air quotes — that raised these kinds of amounts so early on? 01:23:03.140 |
No, Brad, do you have an example? What's made? 01:23:08.160 |
There are people who raised a lot of money over the life 01:23:15.680 |
What was your first seed check into Uber? What was the total 01:23:17.840 |
seed? Right? I can tell you what Facebook's was it was $500,000. 01:23:25.040 |
Okay, Facebook's was $500,000. Yeah, we needed a lot of money 01:23:29.280 |
later. We needed a lot of money later. But no, it's 5 million — 01:23:35.520 |
a $5 million valuation for Uber, 1.25. And they went in. 01:23:39.200 |
Here's another thing, everybody walks in, they say, well, I have 01:23:41.520 |
to raise this much money. And there's a circular 01:23:47.200 |
logic. I have to raise this money because I have to have all 01:23:50.800 |
this compute. And I say, well, okay, you got to train it, you 01:23:54.480 |
know, we want to have a vertically integrated model, we 01:23:56.160 |
want to train them on it. Okay, so there's a huge upfront cost. 01:24:00.000 |
And they're like, but I don't want to take a lot of dilution. 01:24:02.560 |
So I have to raise it at a really high price. And so you 01:24:06.400 |
say, okay, well, that's, that's a challenge for you, not 01:24:10.160 |
necessarily a good thing for us. And then they say, Oh, and I if 01:24:13.920 |
you ask the question, is this an ongoing expense? They're like, 01:24:16.400 |
Oh, yeah, we're gonna have to retrain, like, we're gonna have 01:24:18.400 |
to continue to spend this money. And I'm like, well, so if you're 01:24:22.720 |
a software company, for example, and you say, well, what's my 01:24:26.160 |
COGS? If all of a sudden you have an embedded COGS that's 01:24:30.160 |
massive and recurring, right — for compute costs that we 01:24:34.480 |
haven't had in the past — then the revenue on the other side of 01:24:37.840 |
that has got to be a multiple of what a traditional software 01:24:41.280 |
company might have, in order to get back to that set of 01:24:44.320 |
economics. So I think there's a scramble, and 01:24:48.800 |
understandably so: if you think the big win is AGI, or, you know, 01:24:53.120 |
this autonomous agent that's going to, you know, be the top 01:24:57.600 |
of the next funnel, that thing is going to be worth a lot. But 01:25:01.600 |
to me, those are lottery tickets at this stage. 01:25:05.440 |
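To put rough numbers on that embedded-COGS point — the margins below are assumptions for illustration, not data from any company:

# Toy comparison: traditional software vs. an AI product whose COGS
# includes heavy, recurring compute (inference plus retraining).
saas_gross_margin = 0.80    # assumed
ai_gross_margin = 0.40      # assumed

target_gross_profit = 80e6  # say the business needs $80M gross profit
print(target_gross_profit / saas_gross_margin)  # $100M revenue (SaaS)
print(target_gross_profit / ai_gross_margin)    # $200M revenue (AI)
# The same gross profit takes 2x the revenue once compute sits in
# COGS -- the "revenue has to be a multiple" argument in rough numbers.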
Can I tell you a little secret? Here's a little secret. When you 01:25:10.320 |
put $100 million into a startup to buy compute, you are 01:25:16.240 |
not buying whiz-bang next-generation IP; you are 01:25:20.880 |
subsidizing capex. And that is a job that many, many other 01:25:26.800 |
people do at a very low hurdle rate. And so it is a law of 01:25:33.600 |
capitalism, that it could be the most incredibly innovative 01:25:38.000 |
company in the world. But if you are offering money to them, 01:25:41.760 |
to fulfill a low yield thing, you are just not going to make a 01:25:48.400 |
lot of money. When you put money into a startup that is 01:25:54.800 |
writing code to build groundbreaking IP, you own 01:25:58.560 |
something that's really real. But that's because 80 cents on 01:26:03.200 |
the dollar is going to core, critical R&D at that point in 01:26:06.000 |
time. And then they raise a lot more money at a lot less dilution, 01:26:10.080 |
when 80 cents on the dollar goes to sales and marketing; then 01:26:12.800 |
they raise even more money at an even smaller, 01:26:15.520 |
smaller amount of dilution, as you start to get to scale 01:26:17.840 |
internationally. So you start to see, right, more dollars, less 01:26:22.000 |
return — you're owning less of the critical 01:26:24.880 |
differentiation. So if everybody is like, oh, it's so expensive 01:26:29.760 |
for compute, I need to raise 100 billion. Well, buddy, you 01:26:33.120 |
know, that's like a leasing function. You know, 01:26:36.880 |
you're like — all of a sudden, the best VCs in the world 01:26:39.840 |
have become like Comerica Bank. Yeah. I mean, they could have 01:26:43.440 |
done it with the same structure, right? Comerica could have 01:26:45.680 |
given them 50 million to buy these machines — and maybe they 01:26:48.320 |
should. And Comerica and JP Morgan and somebody else should 01:26:50.880 |
basically say, hey, you know what, here's a lease line for 01:26:52.880 |
your H100s, because I know they're worth so much, and I'll 01:26:55.520 |
just— Yes, write it at 10%. And my point is that the fact 01:26:59.920 |
that people don't understand this is why the money will get 01:27:02.880 |
torched. I would love a critique that says actually, 01:27:05.040 |
Chamath, you're an idiot. I'm right. I know that that's 01:27:07.600 |
happening. But here's why I still see it happening. I don't 01:27:10.240 |
hear any of that. Here's the problem with that critique. 01:27:12.640 |
Okay, so you asked like, what are the biggest projects in 01:27:15.040 |
history, you know, around startups? Think about AWS — I 01:27:18.640 |
don't know, they probably spent $400 million in order to 01:27:21.600 |
get AWS off the ground — but it wasn't done by a startup, right? 01:27:25.520 |
You think about what Zuck's spending on the metaverse, it's 01:27:28.160 |
not being done, right, by a startup. The truth of the 01:27:31.280 |
matter is, the hyperscalers— You can go back to AWS: AWS was dogfooded 01:27:35.600 |
on Amazon retail. Of course, of course. And Oculus was done on 01:27:39.600 |
Kickstarter. And the cash flow of Amazon retail fed the 01:27:42.640 |
development of AWS. Correct. My point is that when you think 01:27:47.280 |
about what these hyper scalers are going to do, they're not 01:27:50.000 |
going to spend a billion, they're not gonna spend 10 01:27:51.760 |
billion, they'll spend $100 billion, right in order to be in 01:27:55.600 |
this race. And so if you're backing a startup that says I'm 01:28:00.640 |
going to build a better ChatGPT, right — just like OpenAI 01:28:05.040 |
discovered themselves, they sold 51% or 50% of the company to 01:28:09.040 |
Microsoft for a reason. They had to, they had to have the data 01:28:12.480 |
and they had to have the compute. This is a nuclear arms 01:28:15.520 |
race around compute. And so I think it's 01:28:19.600 |
financially illiterate for someone to think that they are 01:28:23.280 |
actually doing anything other than subsidizing capex when 01:28:26.720 |
you're giving $100 million to a startup to literally buy chips 01:28:31.440 |
and servers and rack and stack — for 40% of their equity. That's 01:28:34.960 |
the other thing: this equity is so valuable. Why would you want 01:28:36.880 |
to give 40% of it when you could get an equipment lease and keep 01:28:39.920 |
it? You can't get those— I mean, if the VCs 01:28:44.960 |
put in 50 million instead of 105, you don't think Comerica 01:28:47.840 |
would come in on the back end with 25 million? Of course they would. 01:28:50.400 |
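A rough sketch of the equity-versus-lease math being argued here, using the conversation's illustrative figures — the 40% stake, 10% rate, and three-year term are assumptions, and the interest is simple rather than amortized:

# Equity vs. lease for $100M of hardware, in rough numbers.
hardware = 100e6      # capex funded
equity_sold = 0.40    # stake given up in the equity route
lease_rate = 0.10     # the 10% floated above
years = 3

lease_total = hardware + hardware * lease_rate * years  # ~$130M repaid
breakeven_exit = lease_total / equity_sold              # ~$325M

print(f"lease route: ~${lease_total/1e6:.0f}M total payments, zero dilution")
print(f"equity route is cheaper only below a ${breakeven_exit/1e6:.0f}M exit")
# Under these assumptions, selling 40% beats the lease only if the
# company ends up worth less than ~$325M -- far below any outcome a
# venture bet is underwritten to. Hence "subsidizing capex."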
Brad is right — this happened in '98, '99, 2000, where all of this 01:28:55.040 |
money was getting flushed down the drain going into buying 01:28:57.440 |
data center capacity. I remember even at Facebook, like, 01:29:00.240 |
we were racking and stacking our own servers. And then 01:29:04.160 |
we ultimately got big enough that we actually built out our 01:29:06.240 |
east coast and west coast data centers, and data centers all 01:29:08.320 |
around the world, but it was very expensive. And in that moment, 01:29:12.400 |
again, all of those companies just lit all that money on fire. 01:29:16.160 |
They torched it; there was no remnant equity value for that 01:29:20.160 |
capital. And so I guess I'm just 01:29:23.840 |
questioning, like, what does a GP think they're actually buying? 01:29:27.360 |
Well, 80% of it's going to hardware. I mean, they're buying— 01:29:32.640 |
The other 20% is buying chips. I mean, it doesn't make sense, and 01:29:36.480 |
that's going to get— I want to make one point here, and, 01:29:39.600 |
Friedberg, I want to get your input on this as well: 01:29:42.160 |
constraint is important for founders. And the thing that I 01:29:45.600 |
find really troubling about this — putting this startup 01:29:48.960 |
aside — is that crypto people also went through this for the last 01:29:51.520 |
three or four years, where they were overfunded. How did that 01:29:53.680 |
turn out? It was tens of billions of dollars of LP monies burnt — 01:29:58.240 |
people's retirements and college endowments — and it's going to be 01:30:01.440 |
quite a postmortem. But look — since you invoked Meta, it's 01:30:05.600 |
important for people who don't know this to know that Oculus was 01:30:08.560 |
a Kickstarter. Shout out to Palmer Luckey: he raised a couple 01:30:13.440 |
of million dollars, in 2011 I think it was, on a Kickstarter, 01:30:18.240 |
pre-selling the devices. Right? Constraint makes for great art; 01:30:21.920 |
constraint makes for great startups. You need to have 01:30:25.040 |
pressure on a startup for them to deliver. You cannot give 01:30:28.240 |
startups five years of runway and expect it's going to work — 01:30:31.680 |
it just doesn't work, and I've seen this movie so many times. 01:30:35.280 |
But now we've gone through 18 months of nobody doing anything, 01:30:38.240 |
I guess, in VC land, so folks on Sand Hill Road are itchy. They want 01:30:42.640 |
to justify why they should still be drawing 2% on the full fund, 01:30:47.360 |
they want to try to show activity so that they can raise 01:30:49.520 |
the next fund and continue to stack fees, and all of this 01:30:53.120 |
sort of leads to this suspension of financial logic, 01:30:56.560 |
but it gets replaced with financial illiteracy. Which is 01:30:59.440 |
why there's an optimal fund size, right? And this is why the 01:31:02.080 |
people that pay the price are ultimately the LPs. And it may 01:31:05.360 |
be the case that CalPERS maybe actually avoids a lot of these 01:31:07.920 |
pitfalls, because, yeah, they missed all the returns, 01:31:10.800 |
but then they missed writing the mega follow-on checks for all 01:31:14.720 |
of these folks that they then would have torched. 01:31:16.880 |
Okay, I'm going to take the other side of that. Go ahead, 01:31:19.280 |
Brad. I think it's nearly impossible to conceive that all 01:31:22.880 |
of these bets that are currently being made are bad 01:31:25.120 |
bets. I think this is major platform disruption. But, you know, 01:31:30.480 |
instead, I think the right way to think about it is, it's about 01:31:33.440 |
pacing. And if you're trying to, if everybody thinks they're 01:31:37.360 |
going to build the next Google, they're going to build the next 01:31:40.160 |
autonomous agent that's going to sit on top of the funnel, 01:31:42.880 |
that's going to be worth a trillion, and therefore they can 01:31:45.120 |
burn a billion dollars training models, we're not 01:31:48.720 |
going to have 10 of those winners. Okay, but at the same 01:31:52.640 |
time there's a lot of stuff getting funded that is in the 01:31:55.840 |
application layer that is in the tool layer and these are 01:32:00.000 |
not the big headlines that you're reading about but these 01:32:03.120 |
are really interesting businesses that are solving 01:32:05.920 |
real problems in the software and tooling layer here you know 01:32:10.400 |
in the smaller-model layer, vertically oriented things 01:32:13.680 |
around life sciences, or, you know, targeting 01:32:17.920 |
financial services, or things in the application layer, you know, 01:32:22.400 |
like Character.AI, etc. So I do think there are a lot of good 01:32:26.160 |
things getting funded that will deliver real value but I agree 01:32:30.880 |
with you. There's a second 01:32:33.440 |
problem to this, Chamath. If you drop a billion or 2 billion or 01:32:36.880 |
3 billion into something, you have not only a product 01:32:42.480 |
challenge, you have a distribution challenge, right? We 01:32:46.320 |
know all the hyperscalers are going to play. Google is going 01:32:49.120 |
to play, Meta is going to play. And so you got to compete and 01:32:53.120 |
then beat them. And it used to be that you would say, well, if I 01:32:56.480 |
get a lot of traction, they'll buy me. If I'm Instagram or 01:32:59.040 |
WhatsApp, like, they'll buy me. Well, we have such a regulatory 01:33:03.280 |
nightmare in Washington, DC today, no hyperscaler can spend 01:33:08.800 |
over a billion dollars to buy any AI company, not even that 01:33:12.320 |
$400 million Giphy deal. What is a Giphy even? In the United 01:33:17.120 |
States today, we've replaced mergers and acquisitions with copy 01:33:21.760 |
and compete, because we've said to hyperscalers, you're not 01:33:26.720 |
allowed to acquire any of these companies so the unintended 01:33:31.280 |
consequence of the regulation in Washington is that 01:33:34.640 |
entrepreneurs and founders and venture capitalists who might 01:33:38.400 |
otherwise have had a good idea, built something with some 01:33:41.040 |
traction, they can't find a home for it in the way, you know, that 01:33:45.760 |
WhatsApp found a home. Or isn't that a good thing? Instagram, is 01:33:48.880 |
that a good thing? I don't think, no. Going public and being 01:33:52.240 |
independent is better presupposes that everything can 01:33:55.600 |
become a big and profitable business. There are a lot of... net 01:33:58.720 |
net, would it ever have become a big and profitable business on 01:34:03.600 |
its own? No chance. Well, it's still not a big and profitable 01:34:06.720 |
business. Okay, so that's what I'm saying, so maybe it should 01:34:09.680 |
have died. I mean, maybe it's being kept alive, but I mean, 01:34:12.240 |
actually, don't you think it's great for the market? I think 01:34:15.120 |
it's great, and it was innovative technology. It 01:34:18.240 |
was, yeah. You know, you were able to back it with some 01:34:21.840 |
good funding, and I think what's coming is going to be really 01:34:24.240 |
exciting. But it took a really long runway, a lot of capital, a 01:34:28.160 |
lot of intelligence in order to build. Unfortunately, we've killed 01:34:31.360 |
that. So you have a two-sided problem: we're spending more than 01:34:34.320 |
ever to fund and start these companies, and we, you know, have 01:34:38.720 |
undermined a lot of the downside protection. Freeberg, 01:34:41.520 |
your thoughts? So if you look at how the capital is being 01:34:44.560 |
deployed if it's mostly being deployed to train models then 01:34:47.680 |
the question has to be is there really a sustainable advantage 01:34:52.880 |
that arises by being the first to train the models and then 01:34:56.640 |
being able to persist an advantage by training new models 01:34:59.760 |
from that foundational model going forward and the reason 01:35:02.880 |
that that matters so much is because you have to really have 01:35:06.880 |
a deep understanding if you're going to invest a lot of 01:35:08.880 |
capital here you have to have a deep understanding for how 01:35:12.080 |
quickly model development and training is accelerating and 01:35:15.600 |
how quickly the costs are reducing. So, something that costs, 01:35:19.200 |
like we said, OpenAI spent $400 million training models for 01:35:23.040 |
GPT-4. If they spent $400 million in the last couple of 01:35:26.560 |
years, you could probably assume that doing the same training 01:35:30.080 |
exercise could be done for five to $10 million 18 months from 01:35:34.160 |
now to generate the same model. That's, you know, a 100x cost 01:35:38.400 |
reduction, and I'm just ballparking it here. But if 01:35:40.960 |
that's really where things are headed, then does the $100 01:35:45.520 |
million to train models today really make sense, if training 01:35:48.720 |
those same models can be done for $5 million in 18 to 24 months? 01:35:52.000 |
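As a back-of-the-envelope check on that ballpark (purely illustrative; the 20x figure just restates the $100 million-to-$5 million numbers quoted here):

```python
import math

# If a $100M training run can be reproduced for ~$5M in 18 to 24 months,
# what cost-halving time does that imply? (Numbers from the discussion.)
reduction = 100e6 / 5e6  # a 20x cost reduction
for months in (18, 24):
    halving = months / math.log2(reduction)
    print(f"20x drop in {months} months implies halving every {halving:.1f} months")
# Roughly every 4 to 6 months, versus the ~18-month halving the
# hosts attribute to Moore's law a little later in the discussion.
```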
And that's where it becomes a really difficult 01:35:55.040 |
investment exercise, and one where you have to really 01:35:57.680 |
critically understand how cost curves are moving in AI. The same 01:36:00.880 |
thing was true in DNA sequencing in the early days, and it's 01:36:03.200 |
following, by the way, a similar cost curve as DNA sequencing, 01:36:06.720 |
which is actually greater than Moore's law, greater than a 2x 01:36:09.840 |
cost reduction every 18 months. We're seeing something much 01:36:13.040 |
greater than that in machine learning right now in terms of 01:36:15.760 |
cost reduction and model training. So ultimately the 01:36:18.640 |
business model has to have some advantage that by being the 01:36:22.320 |
first to market you can then generate new data that gives 01:36:25.280 |
you a persisting advantage and no one can catch up with you and 01:36:27.920 |
my guess is if you get under the hood of the business models 01:36:31.040 |
it's unlikely going to be the case. And it's very likely going 01:36:34.000 |
to be the case that you don't know where the market advantage 01:36:37.760 |
will lie, when you will be able to kind of create a persisting 01:36:40.480 |
moat, a moat that expands as you get more data and can train 01:36:43.200 |
more. And this is why it's so hard to invest generally in 01:36:46.800 |
technology is because you don't know the point at which the 01:36:49.840 |
market tips relative to the point at which the technology 01:36:53.120 |
tips so there's a moment where the technology gets so cheap and 01:36:57.040 |
then the market maybe adopts after the technology gets cheap 01:37:00.240 |
and at that point it's a totally different game. Remember 01:37:02.560 |
in the mid-2000s, when we had memory shortages and we used to 01:37:05.280 |
have to buy RAM? Yeah. I mean, it's all this 01:37:08.720 |
stuff, and if VCs are funding this stuff, you're just 01:37:11.840 |
lighting the equity on fire, guys; it's not going to be worth 01:37:15.040 |
anything. Brad's point is right, which is, the question is what's 01:37:18.400 |
possible now where can you build a sustaining advantage now 01:37:22.240 |
rather than go after big model development cycles where the 01:37:25.040 |
cost curve is going to come down by 100 fold in some period 01:37:28.080 |
of time in the near future is there a business model 01:37:31.280 |
advantage you can build by being in the market first building a 01:37:34.320 |
customer base accelerating your features getting user feedback 01:37:38.160 |
and that certainly exists in the application and the tools 01:37:40.640 |
layer as brad is talking about that seems like such a no 01:37:43.520 |
brainer for disruption across many different segments many 01:37:46.960 |
different verticals many different markets right now 01:37:48.800 |
versus trying to compete further down the stack where it 01:37:53.280 |
takes hundreds of millions of dollars of capital, and in a 01:37:56.000 |
couple of months that hundred million dollars of capital 01:37:58.320 |
can be replaced with 5 million bucks of training exercise and 01:38:01.120 |
compute. Can I take the other side of that? Yeah. All that 01:38:03.280 |
coordination makes no money today, so to your point, when you 01:38:06.400 |
cut the actual input cost by 100-fold, the coordination cost 01:38:10.720 |
goes from being zero to being worth less than zero, right? I 01:38:13.920 |
don't see any money being made there either. And all the people 01:38:16.400 |
that say, I'm sure there's going to be some genius in the 01:38:18.560 |
comments, but what about open source? It's like, what about it? 01:38:21.200 |
OpenAI also just gave an update on their cost structure 01:38:25.920 |
for their API, and they just dropped it 25 to 75%. Again, this 01:38:30.720 |
is after the tenfold drop they did last year. Play this out: 100 01:38:33.840 |
million dollars of capital spent training today is a million 01:38:36.960 |
dollars spent doing training in three years. Yeah, three years, 01:38:40.560 |
18 to 36 months, somewhere in that time frame is likely the 01:38:43.680 |
time frame. So why would you spend all this money today when 01:38:46.560 |
in 18 to 24 months things are gonna get so much cheaper? 01:38:49.120 |
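Compounding the two price cuts mentioned here (the tenfold drop and the 25-to-75% cut are from the discussion; the multiplication is the only thing added):

```python
# Cumulative effect of successive API price cuts: a 10x drop last year,
# then another 25% to 75% cut on top of it.
after_tenfold = 1.0 / 10
low, high = after_tenfold * (1 - 0.75), after_tenfold * (1 - 0.25)

print(f"Price vs. a year ago: {low:.3f}x to {high:.3f}x of the original")
print(f"i.e. roughly {1 / high:.0f}x to {1 / low:.0f}x cheaper")
```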
You know, I think the further up the value stack you go, the more of 01:38:52.400 |
an opportunity to truly, kind of, innovate, disrupt, and make 01:38:54.880 |
capital is possible. What happens in 24 months? So when you've made 01:38:58.880 |
a bunch of hundred-million-dollar investments and they're... 01:39:05.440 |
Maybe, unfortunately, as an investor, Chamath, is that what 01:39:09.840 |
you're asking? At some point you don't get invited to join the 01:39:14.160 |
next fund. Think about the alternative investment strategy 01:39:17.280 |
with lots of capital where the cost curve is not coming down 01:39:20.640 |
as quickly in terms of where that capital is being deployed 01:39:23.200 |
for example building rockets to go to space or building 01:39:27.520 |
infrastructure to transport power or building roads you 01:39:31.840 |
could raise a billion dollars to build a tollbooth system or a 01:39:34.800 |
port. Let's use a port. A shipping port is a good example. You 01:39:38.480 |
could spend a billion dollars to build a shipping port. It's not 01:39:41.600 |
that the cost of building a shipping port is going to come 01:39:43.760 |
down by 10x in 18 months, so it makes sense to raise a billion 01:39:47.680 |
dollars and build a friggin' shipping port and charge. My 01:39:50.880 |
point, coming out of this: VCs thought they were underwriting IP; 01:39:54.640 |
instead, they're just actually subsidizing capex. 01:39:57.040 |
It's the craziest thing. Brad, is this a 01:40:00.400 |
problem of the optimal venture fund size that Fred Wilson 01:40:03.920 |
talks about, that Bill Gurley talks about? A lot of the OGs 01:40:06.880 |
say, hey, 250, 400, 600 million, there's an optimal size here 01:40:11.760 |
for four or five partners in a venture fund to put money to 01:40:14.160 |
work. Is this part of the problem right now, which also 01:40:16.720 |
happened in crypto, is you had billion-dollar, $2 billion funds 01:40:20.080 |
sitting around, and different venture brand names having four 01:40:24.160 |
or five of these multi-billion-dollar funds burning a hole in 01:40:26.960 |
their pocket and getting frisky over this, you know, 18-month 01:40:31.040 |
pause? And is this about optimal fund size making bad... you know, 01:40:37.120 |
We have over a billion-dollar fund, so if I take the 01:40:40.320 |
other side of that, people are going to say you're talking 01:40:42.160 |
your book, but I don't think this is about large funds. I 01:40:46.480 |
think this is about good and bad decisions. At the end of the 01:40:49.280 |
day, some of these decisions will pay off. Like, for example, 01:40:52.880 |
what's the largest bet size from a billion-dollar fund that 01:40:55.520 |
you've made or will make? I think out of that fund it would be 01:40:59.600 |
$100 million, most likely, but we may cross it over other funds 01:41:04.480 |
and have, you know, bigger bets; certainly that happened two or 01:41:08.720 |
three times. And would you do it all at once like this, or might 01:41:11.440 |
it happen over, you know, a Series A, B, C kind of situation where 01:41:14.480 |
you build a position? I would say, you know, we're not writing... 01:41:18.400 |
you know, we haven't written a $100 million check into a Series 01:41:21.200 |
A, and I don't think most of the people, to Chamath's point, that 01:41:23.760 |
you're talking about are writing $100 million checks into those 01:41:26.400 |
Series As. The Mistral round that you reference, I imagine the 01:41:29.600 |
lead check into that was maybe 50. Maybe 50. So listen, larger 01:41:34.960 |
fund sizes enable you to participate in companies that 01:41:40.800 |
require more capital, and these companies do require more 01:41:44.480 |
capital. So you may take the position that all these 01:41:47.520 |
companies are going to zero. That's not my position. My 01:41:50.880 |
position simply is that I do believe there is a bit of 01:41:55.360 |
overexuberance, that too many things are getting funded, right? And 01:42:00.480 |
that, you know, like, the margin of safety, the margin of return 01:42:06.000 |
being required, is probably lower than it should be. But 01:42:10.160 |
there's no doubt out of this vintage in my mind that you're 01:42:12.960 |
going to have some epic companies. Now, I don't know if 01:42:15.440 |
what Mustafa and Reid Hoffman are doing at Inflection, you 01:42:20.320 |
know, building Pi to take on ChatGPT and to take on Bard, to 01:42:24.160 |
be the intelligent assistant. The ambition is extraordinary. 01:42:28.320 |
The cost of compute is high, but the first-mover advantage is 01:42:32.720 |
also high, right? Because whoever secures this position... you know, 01:42:37.600 |
Bill Gates said he's been playing around with it, it's one 01:42:39.840 |
of his favorite agents or whatever. That's an interesting 01:42:42.720 |
comment. I think it's pretty good, but I think ChatGPT is out in 01:42:46.160 |
front in this regard. I think a lot of people are going to try 01:42:49.120 |
to compete for that space, but, you know, I can't imagine that 01:42:52.800 |
all these researchers leaving DeepMind, right, are going to be 01:42:56.640 |
able to compete for the most sophisticated model to answer 01:43:00.720 |
general-purpose questions. So I don't think it's the large fund 01:43:03.280 |
size. I think it's just a lot of exuberance to participate in 01:43:07.280 |
what everybody perceives to be a massive platform disruption. I 01:43:11.440 |
think that can be true just like the internet was in 1997 and a 01:43:16.320 |
lot of these bets can and will likely go to zero. How 01:43:19.760 |
great is America? You just rent other people's money, they pay 01:43:22.720 |
you 2%, and then you're allowed to get exuberant yet keep your 01:43:26.160 |
job when you lose it. God bless America. 01:43:31.760 |
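For a sense of the "2% rent" being mocked here (a minimal sketch of standard fund economics; the flat fee over the whole fund life is a simplification, since real funds usually step fees down after the investment period):

```python
# Management fees on a hypothetical $1B fund charging 2% per year on
# committed capital over a 10-year fund life, independent of returns.
fund_size = 1e9
fee_rate = 0.02
fund_life_years = 10

total_fees = fee_rate * fund_size * fund_life_years
print(f"Fees collected: ${total_fees / 1e6:,.0f}M, win or lose")
```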
Well, I mean, the issue, Chamath, is it takes 10 years to prove you're bad at 01:43:34.960 |
this job. No, no, no, it takes 10 years to prove you're 01:43:40.080 |
good, and it takes 20 to prove that you can be 01:43:43.520 |
consistently good and didn't get lucky. But in a few years 01:43:47.680 |
you can tell that somebody sucks. Can LPs tell? Well I'm 01:43:51.920 |
not sure that they get the visibility, because when LPs 01:43:54.560 |
interact with GPs, they're grin-fucked for the most 01:43:57.120 |
part. So, I don't know, probably not. But when I interact with 01:44:01.280 |
them just at a peer-to-peer level and I see the deals that 01:44:05.360 |
get done, it's pretty easy to understand that some people just 01:44:07.760 |
suck. They don't know how to make money, I guess, is the point, 01:44:10.640 |
and in this job, that's the job: to make money, to 01:44:14.400 |
consistently make money. Okay, Brad, take care, and let's go to 01:44:18.080 |
science corner. Bill Gates wants to genetically modify 01:44:21.600 |
mosquitoes. Is this fake news or real? Fake news. 01:44:21.600 |
It's fake news. Okay. Explain. The reference is from RFK 01:44:25.840 |
Jr's tweet that he sent out where he retweeted someone 01:44:34.960 |
talking about this mosquito factory in Colombia and this 01:44:39.680 |
guy basically put out a tweet saying oh look Bill Gates has a 01:44:44.320 |
mosquito factory in Colombia it's the largest in the world 01:44:46.720 |
30 million genetically modified mosquitoes are released every 01:44:49.520 |
week into 11 countries, because Bill knows better than nature, 01:44:52.960 |
what could possibly go wrong. RFK Jr. then took it upon himself 01:44:56.720 |
to retweet and say, should Bill Gates be releasing 30 million 01:45:00.160 |
genetically modified mosquitoes into the wild, part of the 01:45:02.800 |
mentality of earth as engineering object, what could 01:45:05.600 |
possibly go wrong. So I really wanted to take issue with this 01:45:09.520 |
because I do think that this is the sort of misinformation 01:45:12.560 |
that both creates scientific illiteracy and damages and 01:45:18.480 |
negatively impacts some of the significant progress that can be 01:45:22.000 |
made in medicine and in science. So I want to speak very clearly 01:45:25.760 |
as to what is going on what the science is behind it why this is 01:45:29.360 |
super important and then we can speak philosophically if you 01:45:32.800 |
guys are interested on kind of should we be doing this sort of 01:45:35.520 |
stuff, and why. So what's real here? Like, what are actual 01:45:38.640 |
facts versus fake news? Maybe that's a good place to start. The 01:45:41.200 |
most common disease-carrying mosquitoes are called Aedes 01:45:45.920 |
aegypti. It's a species of mosquito that carries yellow 01:45:50.960 |
fever, dengue, Zika, a number of viruses that obviously are 01:45:56.880 |
pretty adverse to human health. Each year, about 400 million 01:46:01.200 |
people are infected with dengue virus via this mosquito vector, 01:46:05.440 |
100 million become ill, and 21,000 deaths are attributed to 01:46:09.040 |
dengue globally. 200,000 cases of yellow fever each year cause 01:46:12.960 |
30,000 deaths. These are pretty significant health concerns, and 01:46:17.360 |
it turns out that in mosquito populations, not in this 01:46:20.080 |
particular species that carries these viruses, but in other 01:46:23.120 |
species of mosquito, there's a bacteria, a natural bacteria, 01:46:27.040 |
called Wolbachia. And this bacteria exists in nature, and 01:46:30.320 |
sometimes these mosquitoes get infected with this bacteria, so 01:46:33.200 |
they carry this bacterial bug. And this bacteria is really 01:46:37.440 |
interesting because it causes what's called cytoplasmic 01:46:40.080 |
incompatibility in the mosquito cells, which actually makes the 01:46:43.440 |
mosquito largely resistant to a lot of viruses and there's a 01:46:47.280 |
bunch of theories for this mechanism and why this is the 01:46:49.680 |
case but it causes the mosquito to not be able to carry and 01:46:54.080 |
spread these viruses that are super adverse to human health. 01:46:56.960 |
So number one there's a natural bacteria that's found in nature 01:47:00.400 |
and up to 40% of mosquitoes are already infected with it. Number 01:47:04.640 |
two is, it's not common in the mosquito species that is common 01:47:08.800 |
in these areas, the ones that are spreading these awful viruses 01:47:10.800 |
to humans. And so there's been a project that's been going on 01:47:14.160 |
now for 12 plus years where they're taking large amounts of 01:47:18.480 |
mosquitoes and breeding them specifically to have this 01:47:21.840 |
bacteria in the mosquitoes and then they release those 01:47:25.440 |
mosquitoes into the wild, and over time the 01:47:28.720 |
bacteria-infected mosquitoes start to become a larger percentage of 01:47:31.120 |
the population and as a result the vector of carrying these 01:47:34.720 |
awful viruses into humans goes way down. There was a study 01:47:38.560 |
done in Indonesia where they took these Wolbachia-infected 01:47:41.760 |
mosquitoes and they released them into the wild, and they 01:47:44.560 |
looked at a population that was in a region where they released 01:47:46.880 |
them and a population where they didn't. In the region where 01:47:50.080 |
they didn't release them, 9.4% of people ended up getting 01:47:53.760 |
infected with dengue fever, and where they did release them, only 01:47:58.160 |
2.3% were infected. So it was an amazing 75% reduction in the 01:48:03.280 |
infection of people by these mosquitoes. 01:48:07.520 |
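A quick sanity check on that figure (the 9.4% and 2.3% are the rates quoted above; the rest is arithmetic):

```python
# Relative risk reduction implied by the quoted Indonesia trial rates.
control_rate = 0.094  # dengue infection rate without Wolbachia releases
treated_rate = 0.023  # rate in the release areas

rrr = (control_rate - treated_rate) / control_rate
print(f"Relative risk reduction: {rrr:.0%}")  # ~76%, the ~75% cited
```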
And so the whole point is just to kind of, you know, move the mosquito 01:48:09.920 |
populations in a way, without doing any sort of genetic 01:48:12.880 |
modification, but by exposing them to this bacteria so that 01:48:15.760 |
they don't carry these viruses into people. But, you know, 01:48:19.120 |
because that's a nuanced point there, Friedberg: it's not 01:48:22.560 |
genetic modification? Or it is? It's not. I just explained, it's 01:48:26.000 |
a bacteria, and they just expose the mosquitoes, they're exposing 01:48:29.360 |
them so that they end up getting infected with this bacteria. And 01:48:32.080 |
then, as they breed, the mosquitoes breed in this 01:48:34.240 |
facility, you have to have an infected male and an 01:48:37.920 |
infected female for the offspring to have the bacteria. 01:48:40.960 |
If you have an infected male, they're actually infertile; 01:48:44.480 |
they can only fertilize an infected female. So unless you 01:48:47.760 |
do this breeding work, you don't end up seeing this happen 01:48:50.320 |
naturally in the environment, where the Wolbachia starts to spread. 01:48:52.560 |
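As a toy model of the inheritance rule just described (a deliberate simplification; real Wolbachia population dynamics are more complicated than this):

```python
# Toy model of the Wolbachia breeding rule described above:
# an infected male can only produce offspring with an infected female
# (cytoplasmic incompatibility), and offspring inherit the bacteria
# from an infected mother.
def offspring_infected(male_infected: bool, female_infected: bool):
    if male_infected and not female_infected:
        return None  # incompatible cross: no viable offspring
    return female_infected  # maternal transmission

for male in (False, True):
    for female in (False, True):
        result = offspring_infected(male, female)
        status = "none viable" if result is None else f"infected={result}"
        print(f"male infected={male}, female infected={female}: offspring {status}")
```

On this rule, releases of infected mosquitoes can only ratchet the infected share of the population upward over time, which is the dynamic described above.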
Where's the genetic modification misinformation 01:48:56.160 |
coming from? From the presidential 01:49:00.080 |
candidate, RFK Jr., who just propagated it. And so this is 01:49:03.040 |
why I want to make this point because this whole idea that oh 01:49:05.920 |
should we be engineering the earth? Let me just say something 01:49:08.000 |
about engineering the earth. Humans used to wander around the 01:49:11.760 |
earth or proto humans did without access to food. And 01:49:15.600 |
until we realized that we could plant a seed in the ground and 01:49:18.240 |
grow crops and started to engineer the earth in the form 01:49:21.360 |
of farming, we did not have access to a reliable source of 01:49:24.480 |
calories. Human ingenuity, human engineering gave us the ability 01:49:28.000 |
to do this gave us the ability to feed ourselves. Similarly, 01:49:31.120 |
humans got infected by viruses by bacteria by fungus and died 01:49:35.360 |
at a young age over and over again. And when humans began to 01:49:39.200 |
engineer medicine and engineer unnatural substances, because 01:49:43.600 |
he makes this point, oh, should we be interfering in the natural 01:49:46.080 |
world? What is natural is for people to get infected with 01:49:49.040 |
viruses or bacteria and die. And if not for the advent of our 01:49:52.800 |
engineering and our ingenuity and our ability as a species to 01:49:56.000 |
create solutions through science, which allows us to do 01:49:58.960 |
discovery, and then through engineering, which allows us to 01:50:01.600 |
make solutions that solve problems that humans face, we 01:50:05.360 |
would all be dead at a young age, and we would not have 01:50:07.280 |
realized the progress that we've had as a species. So I really 01:50:10.720 |
get ticked off when I see guys like RFK Jr. and others not 01:50:15.280 |
just propagate this BS misinformation spiel about, oh, 01:50:19.440 |
genetically modified this and that, science is bad, but to then 01:50:22.960 |
say, should we be messing with the natural world? Because I 01:50:24.960 |
would say to him, what about when your kid got infected, and 01:50:27.360 |
you gave your kid antibiotics, maybe you shouldn't have done 01:50:29.600 |
that. And this is a group of people who are saying that you 01:50:33.040 |
shouldn't do that, right? There is a movement to stop taking 01:50:35.280 |
antibiotics because it's having an... 01:50:39.360 |
I'll tell you a couple things. There are bad pesticides that 01:50:43.440 |
impact human health and cause damage to our DNA. There is bad 01:50:47.200 |
sunscreen that contains endocrine disruptors and can damage human 01:50:50.480 |
health. There are plenty of chemical products that are made 01:50:53.200 |
that we use in everyday products that cause cancer. There is an 01:50:57.040 |
endless string of things that are wrong with the 01:51:00.320 |
systems of engineering that we do use. That 01:51:03.360 |
doesn't mean that they're all bad. That doesn't mean that we 01:51:06.560 |
then say, hey, you know what, let's not do anything. Let's not 01:51:09.600 |
do any engineering. Let's not use any antibiotics. Let's not 01:51:12.480 |
use any technology in food. And that's the challenge: you 01:51:16.080 |
know, getting into the details. Like, I'll tell you, I don't 01:51:18.720 |
use any sunscreen products with myself or my kids. I only use 01:51:22.400 |
zinc. And I have a similar sort of nuanced approach to 01:51:24.560 |
understanding what things we should or shouldn't use in our 01:51:28.880 |
can you please explain that? Because I didn't know this. What 01:51:35.680 |
Yeah. So there's a number of substances, which are known 01:51:39.440 |
endocrine disruptors that are found in sunscreen. I think it's 01:51:42.960 |
like one of the craziest things that we haven't made these 01:51:45.040 |
products illegal at this point. But the only sunscreen that you 01:51:49.680 |
should use is natural mineral sunscreen. Sorry, I 01:51:52.800 |
shouldn't say that you should. I should say, this is what I chose 01:51:54.880 |
to do based on the data that I've seen. What's a good brand? 01:51:58.720 |
What brand? Any brand that's just zinc sunscreen. Just look 01:52:01.280 |
at the back; if there's anything but zinc in it, don't use it. 01:52:03.680 |
Zinc oxide, is that it? Yes, I'm referring to zinc oxide 01:52:09.120 |
sunscreen. Yeah. That's the ingredient in it. So I don't 01:52:15.600 |
know what you guys. Wow. But here, let me just send you this. 01:52:20.240 |
But are you saying that you're not allowed to have any other 01:52:22.560 |
ingredient like there's no stabilizers? There's nothing 01:52:24.880 |
else? No, no, it's that the principal ingredient has to 01:52:29.440 |
be zinc oxide versus some other chemical, you're saying? Yeah, 01:52:32.960 |
so oxybenzone, I don't use any products with oxybenzone. That's 01:52:36.240 |
like the most common sunscreen ingredient. Octinoxate is the 01:52:39.680 |
other one, homosalate, and the parabens. All of those product 01:52:43.200 |
categories, which are the most common products used in 01:52:46.320 |
sunscreen. They're absorbed by your skin, they go into your 01:52:49.120 |
bloodstream, and they're endocrine disruptors. Now the problem with 01:52:51.440 |
the zinc sunscreen and the mineral sunscreen is it 01:52:53.600 |
actually stays on your skin, so you look like you're wearing it. 01:52:56.160 |
It's really hard to rub it in. You gotta really, really rub it 01:52:58.480 |
in. So they're actually not popular from a cosmetic point 01:53:01.120 |
of view. People don't like wearing them because they look 01:53:03.440 |
like idiots and it's really hard to. Yeah, that's the stuff I 01:53:06.800 |
use in the summer. It drives me crazy because I look 01:53:09.760 |
like this weird, shiny ghost thing. I know, but I do 01:53:13.600 |
use that. I'm, like, really hardcore about that. So that's 01:53:17.040 |
an example of understanding nuance, right? It doesn't mean 01:53:19.760 |
all sunscreens are bad. And it doesn't mean that we shouldn't 01:53:23.200 |
use sunscreen. But understanding what the risk factors are that 01:53:27.040 |
are associated with different ingredients or different 01:53:30.000 |
engineering that's been done to make sunscreens available is 01:53:32.480 |
important. But that's so many levels deep. It's a really 01:53:34.960 |
difficult thing. So then people end up being scientifically 01:53:37.120 |
illiterate and being wrong. Because someone like RFK Jr. 01:53:40.400 |
comes along and says, Hey, should we really be engineering 01:53:42.720 |
the earth with genetically modified mosquitoes? And then 01:53:45.040 |
people have this call to action, shut those things down, shut 01:53:47.600 |
those things down. And they're incredibly beneficial and 01:53:50.000 |
effective. They're not taking any sort of genetic editing to 01:53:52.480 |
market. They're not doing anything that people might 01:53:54.640 |
consider risky. And we can have a separate conversation about 01:53:57.200 |
the risks and benefits of genome editing. That's another 01:54:00.560 |
topic. But how much of this, Freeberg or Chamath, as we get ready 01:54:04.640 |
to wrap here, is a reaction to what happened with COVID-19, and 01:54:09.280 |
people's fear now of, and getting sort of educated on, gain of 01:54:12.560 |
function research. And, you know, this sort of recency 01:54:15.920 |
effect of my Lord, doing some of this science feels like it's 01:54:20.160 |
too dangerous, certainly too dangerous to do inside a city. 01:54:22.800 |
And what's the point of taking bats out of caves and doing 01:54:26.400 |
gain of function research? How much of it has to do with that 01:54:29.040 |
right now? And it's, sort of... the downside of 01:54:32.240 |
technology can be asymmetric. You can have, like... 01:54:36.720 |
nuclear weapons can wipe out the world, they can wipe out the 01:54:39.920 |
whole population. Yeah, you know, Taleb makes this point in 01:54:44.000 |
his argument against GMOs, which I would argue against him on 01:54:47.680 |
this point, but we can do that another time. If he's willing 01:54:50.080 |
to come on. I'd be super happy to debate him on this point. But 01:54:53.600 |
the idea being that there's super asymmetric downside. And 01:54:57.760 |
so you know, what happens is people see incremental 01:55:01.840 |
improvements from technology, and they don't really praise 01:55:05.120 |
those incremental improvements. They assume that to be the case, 01:55:08.000 |
it's a linear step function. But when something goes wrong, 01:55:11.200 |
it's a big step down. And then people are like, Oh, wow. And 01:55:15.120 |
then people get scared of technology. And then people 01:55:17.520 |
want to step away from it. And this is true in anything that 01:55:19.520 |
relates to your health, to food systems, to the environment now. 01:55:23.440 |
Now, it doesn't mean that all technology is bad, or all 01:55:26.560 |
engineering is bad. But you know, as mistakes are made in 01:55:29.200 |
the system, as new things are discovered, we have to retrench 01:55:32.400 |
and change what we're doing. But it doesn't mean that we 01:55:34.480 |
should stop progress. And it doesn't mean that the whole 01:55:36.560 |
system of humans figuring out how to engineer ourselves and 01:55:39.520 |
the world around us, to benefit health, to benefit the 01:55:42.640 |
planet, to benefit other species on the planet, isn't a critical 01:55:46.080 |
mission and effort that we should be undertaking. 01:55:48.080 |
Well, the sunscreen thing is really tilting. I mean, I just, 01:55:50.720 |
I need to make sure. I'm pretty sure we have a good one, but I 01:55:53.200 |
don't know. Yeah, I'm on this right now. This has got me a 01:55:56.480 |
little nervous with my kids. I like these, like, I have all my 01:56:01.680 |
kids in long-sleeve sunshirts, and I try to do that, I think that 01:56:07.120 |
is, like, the key thing. And because my family has skin 01:56:10.400 |
cancer, I have to do that, since I'm dark-skinned, Freeberg. I 01:56:14.000 |
mean, do you want to avoid skin cancer? I mean, I don't know. 01:56:16.800 |
Yeah, I wouldn't wear, like, a rash guard thing, you know? 01:56:20.240 |
Yeah, you know, it's summertime, you're at the beach, 01:56:22.160 |
you know, you want to show off your revenge 01:56:25.280 |
body. I get it. I get it. All right, everybody, on behalf of 01:56:29.920 |
Sacks, let me just say Ukraine, Ukraine, Ukraine, Biden, Biden, 01:56:35.920 |
Biden, and Francis. Mayor Francis is now in the race. So I 01:56:41.040 |
guess we'll have him on the pod. We already talked to him. Oh, 01:56:44.640 |
okay. So yeah, at the summit last year. And also, we're doing 01:56:48.400 |
a survey for the podcast: allinpodcast.co/survey, 01:56:53.760 |
allinpodcast.co/s-u-r-v-e-y. If you got to this point in the 01:56:58.160 |
podcast, please fill out our survey, our listener survey. And 01:57:01.680 |
we will see you all next time. On the all in. Bye bye. 01:57:12.720 |
And they said, we open sourced it to the fans, and they've just gone crazy with it. 01:57:29.680 |
That is my dog taking a notice in your driveway. 01:57:39.040 |
We should all just get a room and just have one big huge orgy 01:57:41.760 |
because they're all just useless. It's like this like 01:57:43.600 |
sexual tension that they just need to release somehow.