Ep13. Silicon Valley’s Political 180, META AI, COVID Postmortem | BG2 w/ Bill Gurley & Brad Gerstner
Chapters
0:00 Intro
1:38 The Perceived Political Realignment in Silicon Valley
18:02 Engagement Between Silicon Valley and Washington
27:21 Meta's Open Source Strategy (405B)
47:37 Post-Mortem on the COVID-19 Pandemic
58:37 Wiz IPO and Acquisitions
1:03:37 The Impact of AI on Self-Driving Cars
1:10:33 Lessons from the CrowdStrike Incident
1:16:11 Spotify's Success under Daniel Ek
00:00:05.220 |
going to get [BLEEP] bulldozers out and drop fiber lines 00:00:32.660 |
You've got a near assassination of the president. 00:00:45.180 |
She raises $80 million the day Biden drops out of the race. 00:00:49.340 |
It really seems to overshadow everything else. 00:00:52.340 |
Well, how is that inconsistent with sell and go away? 00:01:03.260 |
But it's just trying to stay on top of the events. 00:01:08.220 |
I find myself 50 to 100 chats behind every time 00:01:12.100 |
I open the app, just because of all the political activity 00:01:15.360 |
And of course, we don't talk a lot about politics on this pod. 00:01:20.540 |
But we also look at kind of this debate going on 00:01:26.380 |
Like The Washington Post reported Silicon Valley 00:01:30.180 |
realignment leading tech titans to back Trump. 00:01:39.660 |
It's been perceived as pro-business and pro-tech 00:01:42.980 |
It's been perceived as fairly moderate on social policies. 00:01:47.500 |
But just in the last few weeks, leading traditionally 00:01:51.540 |
Democratic supporters, Ben Horowitz, Marc Andreessen, 00:01:55.340 |
Elon, who supported a lot of Democrats in the past, 00:02:01.500 |
Why all of a sudden have people in Silicon Valley 00:02:08.020 |
And I might even say, like if I go back to when I joined-- 00:02:11.740 |
and I mentioned this a bit in the talk I gave at All In 00:02:22.420 |
sure, some people would say, yeah, I vote Democrat. 00:02:30.620 |
while you're at a meeting that relates to a startup. 00:02:39.860 |
which is the distance Washington is from here. 00:02:44.300 |
as you could possibly be in the United States, which I think 00:03:06.860 |
but it's provocative that they stepped out in this way. 00:03:11.260 |
Now, Ben and Marc foreshadowed this quite a bit, 00:03:18.500 |
Obviously, their firms have been getting bigger and bigger 00:03:32.220 |
you're getting into heavily regulated industries. 00:03:35.500 |
So you're going to bump up against this more, 00:03:39.740 |
And if you listen to what they do on their podcast, 00:03:43.300 |
and if you listen to what they write in their blog posts, 00:03:54.180 |
And so they went through in their podcast, which 00:04:05.460 |
They threw down and talked about it for 90 minutes. 00:04:08.380 |
And it's very clear-- and this was in their foreshadow piece 00:04:13.100 |
that they're looking after the interests of their constituents. 00:04:17.140 |
And I don't think there's anything wrong with that. 00:04:21.260 |
They have people they represent that are counting on them. 00:04:33.620 |
said we're going to get more involved in politics, 00:04:36.300 |
and we're going to look after-- they called it little tech, 00:04:38.740 |
which you might describe as entrepreneurs who want 00:04:49.660 |
No, I think, to be fair, they believe like both you 00:05:05.220 |
It drives our economy, which is the strongest in the world, 00:05:08.220 |
which is a great source of advantage, which allows us 00:05:31.260 |
I think they go through a list of things, right? 00:05:34.220 |
So one was an assault on business technology, 00:05:41.060 |
very important to have an open and permissionless network 00:05:48.700 |
They talk about being on the wrong side of AI. 00:05:57.940 |
to limit the number of FLOPs and model sizes and everything 00:06:03.860 |
And then this crazy idea to tax unrealized capital gains 00:06:08.540 |
and how that, in and of itself, is not only almost impossible 00:06:12.140 |
to implement, but also would undermine the very incentive 00:06:16.180 |
that entrepreneurs have to go create businesses. 00:06:18.340 |
And let me just press pause on that one point, 00:06:23.620 |
especially if anyone's listening to the podcast who's 00:06:26.180 |
not deeply involved in a startup and knows what a cap table 00:06:29.980 |
looks like, it'd be easy to miss this point, or to look over it, 00:06:33.340 |
or to think maybe they're just talking about a nuance that 00:06:37.700 |
It would be catastrophic to the venture capital industry 00:06:42.060 |
and to the startup industry to tax unrealized capital gains. 00:06:55.700 |
how would you go about saying what it's worth? 00:07:05.700 |
we have to get a third party to pay them to create 00:07:09.660 |
this analysis to say what the option price has to be, 00:07:12.900 |
this is the most voodoo magic that I've ever seen in my life. 00:07:17.180 |
Like, if-- and it's so crazy, they run these models 00:07:22.220 |
and they produce a number that's like to two decimal points. 00:07:24.900 |
And I've always said they should be forced to-- 00:07:29.340 |
they should be forced to give a confidence interval. 00:07:31.900 |
Because it would probably be like from 5 to 20, 00:07:41.140 |
of work that was written in finance literature, which 00:07:45.500 |
was touted, if you're so smart, why aren't you rich? 00:07:50.380 |
perfect stock prices to the second decimal point, 00:07:58.580 |
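To make the confidence-interval point concrete, here is a minimal sketch in Python (all inputs hypothetical): a textbook Black-Scholes valuation of the same option under a range of plausible volatility assumptions. For a private company nobody actually knows the volatility input, so the "precise" two-decimal output swings by roughly 2x.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(spot, strike, t, rate, vol):
    """Textbook Black-Scholes price for a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    N = NormalDist().cdf
    return spot * N(d1) - strike * exp(-rate * t) * N(d2)

# Same option, four defensible volatility guesses for a private startup:
for vol in (0.4, 0.6, 0.8, 1.0):
    price = black_scholes_call(spot=10.0, strike=10.0, t=4.0, rate=0.04, vol=vol)
    print(f"vol = {vol:.0%}: option value = ${price:.2f}")
# Output spans roughly $3.67 to $7.08: reporting a single number to two
# decimal points hides an interval about twice as wide as its low end.
```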
I mean, we just went through this period of 20 and 21. 00:08:01.860 |
Where companies were valued at $10 billion, $15 billion, 00:08:04.660 |
$20 billion, you would have actually had to pay a tax. 00:08:13.060 |
the founders of the company have to come up with the money. 00:08:22.660 |
You would literally have to facilitate liquidity for them 00:08:27.300 |
And then the third point, which you just mentioned, 00:08:32.020 |
you'd have people pay a tax, and then the valuations 00:08:36.820 |
And then they would just have this massive tax loss 00:08:39.540 |
carry forward with nothing they could do with it. 00:08:56.300 |
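To put rough numbers on that whole sequence, a back-of-envelope sketch in which every figure is hypothetical:

```python
# Hypothetical founder: 20% stake, company marked at $15B in 2021.
stake = 0.20
peak_valuation = 15e9
basis = 0.0                                   # founder's cost basis is ~zero
paper_gain = stake * peak_valuation - basis   # $3.0B, entirely unrealized

tax_rate = 0.25                               # illustrative rate only
tax_due = paper_gain * tax_rate               # $750M in cash, with no liquidity

# A year later the company is marked down 90%:
later_valuation = 1.5e9
loss_vs_taxed_mark = stake * (peak_valuation - later_valuation)  # $2.7B

print(f"Paper gain at the peak: ${paper_gain / 1e9:.1f}B")
print(f"Cash tax owed on it:    ${tax_due / 1e6:.0f}M")
print(f"Loss carried forward:   ${loss_vs_taxed_mark / 1e9:.1f}B")
```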
And so I think that that was to explain the 180 00:09:00.420 |
or the realignment as the Washington Post describes it. 00:09:04.540 |
I think you have to understand that the Biden administration, 00:09:07.700 |
at least perceived by these people in Silicon Valley, 00:09:16.580 |
were out of alignment with what has historically happened. 00:09:25.900 |
You have Reid Hoffman, who has come out and said, listen, 00:09:31.340 |
made this really a personality test about Trump. 00:09:40.820 |
On the other side, you really hear a lot of policy arguments 00:09:44.620 |
around whether or not they're pro-business, pro-tech, 00:09:52.420 |
I do think that Biden now stepping down and Kamala 00:09:58.500 |
I think the straw that broke the camel's back, frankly, 00:10:12.140 |
of being on the wrong side of a lot of these issues. 00:10:15.780 |
It makes me want to just back up to a super high level. 00:10:21.020 |
about how if you take out the MAG 7, the S&P was down. 00:10:48.420 |
would be Exxon or Delta or JPMorgan or whatever. 00:10:53.660 |
And those aren't companies that grow very fast. 00:10:55.940 |
Yeah, I mean, the vast majority of the economic wealth that's 00:11:06.220 |
And so this is-- and remember, in a prior generation, 00:11:16.020 |
it was the Henry Fords, it was the Thomas Edisons 00:11:21.260 |
And I think the one thing that does pull people together 00:11:23.740 |
in Silicon Valley is an attack or an assault on innovation 00:11:30.580 |
And what's interesting is I was watching somebody 00:11:39.500 |
And he said, nobody should be surprised by what Silicon 00:11:43.660 |
a bunch of rich white guys who are advocating for what's 00:11:48.260 |
And what's interesting-- I think Pete got it wrong, right? 00:11:53.480 |
is because I don't hear in any of our conversations 00:12:06.940 |
that impact entrepreneurship that people care deeply about. 00:12:10.940 |
I hope that the conversation around the realignment-- 00:12:21.580 |
is a Democratic administration, back to engagement. 00:12:24.500 |
I thought it was really odd that Ben and Marc said-- 00:12:41.860 |
But they talked about Jared, and Ivanka, and Trump. 00:12:45.700 |
And they asked for meetings, and they got them right away. 00:12:48.220 |
And they're feeling responded to, to your point. 00:12:53.460 |
And I think one of the things that happens any time-- 00:12:58.180 |
that Silicon Valley has been a longtime supporter 00:13:02.620 |
Any time you unequivocally give someone your support 00:13:14.940 |
who has lost Suave, and Stripe, and all these fintech companies 00:13:25.620 |
And it's just like, hey, you have a consequence 00:13:32.500 |
And you've benefited from this constituent being successful. 00:13:55.180 |
but I want to support the other candidate for this reason, 00:14:02.780 |
the people who are expressing their point of view. 00:14:12.460 |
talked in the past about how the single human that's 00:14:17.780 |
is Deng Xiaoping, by bringing entrepreneurism and capitalism 00:14:21.820 |
to China, he brought 500 million people out of poverty. 00:14:36.940 |
between the countries that exist on this planet 00:14:42.100 |
And so I don't know a better way to help the populace. 00:14:49.020 |
we should all be able to agree on, on both sides of the aisle. 00:14:51.620 |
And by the way, I've spent a lot of time on Capitol Hill 00:15:01.420 |
I think they're incredibly smart people on both sides. 00:15:04.340 |
I think that they both see the right issues on AI, 00:15:08.500 |
on national defense, on issues related to China, 00:15:13.060 |
on issues related to technology and innovation. 00:15:15.980 |
I think that there is strong consensus in Washington 00:15:19.940 |
that the greatest source of our national advantage 00:15:30.460 |
over the course of the last six or nine months, 00:15:32.260 |
but it kind of lost its way with Silicon Valley, 00:15:37.420 |
was a natural constituency of the Democratic Party. 00:15:43.740 |
they said they started to see some things 10 years ago, 00:15:47.300 |
in the 2010s, that started to cause some concerns. 00:15:53.140 |
that Zuckerberg pledged to give 99% of his wealth away, 00:15:58.580 |
And like, starting to see some cracks in the wall. 00:16:02.740 |
This should be something that everybody gets excited about, 00:16:12.140 |
and I think, you know, the pendulum had just swung too far. 00:16:15.820 |
I do think that you're gonna see a Democratic Party 00:16:26.380 |
"Listen, you think you ought to get a green card 00:16:28.460 |
"for people who've taken a four-year college degree? 00:16:35.060 |
I think that most everyone in Silicon Valley agrees 00:16:39.180 |
that, like, tripling, quadrupling, maybe even 5X-ing 00:16:53.420 |
And so it's ironic that no one's been able to get there. 00:16:56.900 |
And for, I think Trump said that on All In, right? 00:17:02.340 |
Like, you're gonna get a lot of support for saying that. 00:17:07.580 |
it's not about taking political positions here, 00:17:12.740 |
is that if you attack innovation, entrepreneurship, 00:17:21.900 |
both parties should support. - I agree with that. 00:17:23.500 |
And that's why, regardless of who you're gonna vote for, 00:17:32.260 |
that Ben and Marc brought the issues to the forefront, 00:17:36.100 |
you know, by doing this, like taking a stand. 00:17:39.260 |
Because otherwise, you just sit back and say, 00:17:52.740 |
- Well, and let's end this section with this. 00:18:09.980 |
You mentioned American dynamism or national defense. 00:18:16.260 |
Whether or not we can export chips to certain countries 00:18:20.980 |
I find myself after a 10-year hiatus in Washington, 00:18:28.140 |
How do you feel, like, do you think that engagement 00:18:38.060 |
You and I had a conversation about, you know, 00:18:53.260 |
- I'm going to just admit, like, that I'm very skeptical. 00:19:00.060 |
- Like, the way that legislation gets written 00:19:05.660 |
We're already seeing, you know, this massive fight in AI, 00:19:13.140 |
but the leaders with their closed source AI models 00:19:17.820 |
are spending more money than any startups ever spent, 00:19:32.260 |
I don't know why they wouldn't look after themselves. 00:19:34.620 |
The thing that it would take, and I owe it to my, 00:20:03.260 |
that engagement by Silicon Valley to fight back 00:20:06.180 |
against the incumbents who are trying to do the capture. 00:20:11.540 |
on all of these issues, sitting on your hands, you know, 00:20:16.940 |
that there are now large incumbents in Silicon Valley. 00:20:19.980 |
You know, you referenced some of the leading AI companies 00:20:30.300 |
- Let's go a little bit down the Elon case, 00:20:34.100 |
because I do think, I don't know who to blame it on. 00:20:40.660 |
I'll just pitch this as someone that's on neither side. 00:20:46.620 |
maybe you see a way that Elon picked the fight with Biden. 00:20:50.940 |
I think if you're on the other, if you're on Elon's side, 00:20:53.420 |
he kind of feels like Biden picked the fight. 00:21:01.380 |
- And holding an EV summit without inviting Tesla, 00:21:07.700 |
is, I don't know how you even do it with a straight face. 00:21:13.860 |
- And I don't even think they're hiding it, right? 00:21:15.460 |
Like, when Biden was signing some bills in the Rose Garden, 00:21:19.060 |
he's surrounded by six people in union jackets, right? 00:21:24.260 |
that would only be eligible to a car built by a union worker. 00:21:47.500 |
- And there's immense regulatory capture in this. 00:21:51.980 |
I happen to have some rural property in Texas, 00:21:59.220 |
because they get to bill the government cost plus, 00:22:06.300 |
and they would, like, dig a two-mile trench in rock, 00:22:16.380 |
So, anyway, they decided they were gonna spend 42 billion 00:22:20.060 |
on rural broadband, and between 10 years ago and now, 00:22:25.460 |
And there's just, I don't know a better way to say this, 00:22:44.540 |
if you're by yourself and isolated in an area, 00:23:01.460 |
Now, the powers that be at the landline companies 00:23:08.020 |
"Oh, well, will you stay up during a rainstorm?" 00:23:15.540 |
'cause it might not work if it's the heaviest rain, 00:23:22.540 |
In this recent hurricane that went through Houston, 00:23:30.820 |
but, like, some of the landline broadband things went out. 00:23:38.300 |
- 'Cause someone had, there was, like, a flooded cabinet. 00:24:00.940 |
he says zero, zero people have been connected. 00:24:07.540 |
everyone knows government's not very successful 00:24:11.020 |
but they disqualified the very best new solution 00:24:21.780 |
- The cost of the antenna, the cost of the antenna. 00:24:28.460 |
or do it at cost, like the government could have, 00:24:31.700 |
you know, gone in and done like a BOM analysis 00:24:54.900 |
I can only surmise that that happened for political reasons. 00:25:13.740 |
I wonder if there's something else going on as well. 00:25:16.940 |
You know, forever we've had kind of this fourth estate 00:25:24.860 |
and you add the nightly news and the newspapers 00:25:27.980 |
that worked very closely with those branches of government 00:25:34.260 |
But really over the course of the last five years, 10 years, 00:25:37.420 |
we've had this explosion in the democratization of media, 00:25:53.460 |
you have this libertarian streak in Silicon Valley, 00:25:57.060 |
which I would agree is already slightly distrustful 00:26:02.100 |
And now they feel like based on this reporting, 00:26:07.700 |
They were lied to about censorship in big online media. 00:26:18.220 |
And so I think there's something deeper going on here 00:26:35.580 |
So I think there's been a real breach of trust 00:26:40.240 |
And so again, I think there are people of goodwill 00:26:47.640 |
when I hear people say, well, there's a realignment 00:26:50.060 |
because you just have a bunch of rich white billionaires 00:26:56.460 |
I think they need to really look in the mirror 00:26:58.700 |
and take a deeper analysis as to the situation, right? 00:27:28.760 |
It's the first open source really frontier quality model. 00:27:35.580 |
Importantly, they updated their community policies, right? 00:27:40.440 |
So now this can be open and permissively licensed, 00:27:45.920 |
synthetic data generation, distillation, fine tuning. 00:27:50.440 |
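For non-developers, the workflow the updated license now permits looks roughly like this hedged sketch: a Llama 3.1 model used as a teacher to generate synthetic training data for distilling a smaller student model. The model id, prompts, and sampling settings here are illustrative (check Hugging Face for exact names), and the actual 405B teacher would need serious multi-GPU hardware.

```python
from transformers import pipeline

# Illustrative: a smaller Llama 3.1 checkpoint standing in for the 405B teacher.
teacher = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
)

prompts = [
    "Explain what a cap table is in two sentences.",
    "Summarize the tradeoffs of open versus closed AI models.",
]

# Each (prompt, teacher answer) pair becomes one synthetic training example
# for fine-tuning a smaller student model, the use the license now allows.
synthetic_examples = []
for p in prompts:
    out = teacher(p, max_new_tokens=128, do_sample=True)[0]["generated_text"]
    synthetic_examples.append({"prompt": p, "response": out})

print(f"Collected {len(synthetic_examples)} distillation examples")
```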
You know, Zuck wrote a letter in defense of open source. 00:28:00.040 |
He said Zuck deserved a lot of credit for open sourcing it. 00:28:05.760 |
Because there was some debate as to whether 405 00:28:09.800 |
What did we learn today and why is this so important? 00:28:12.240 |
Well, I think there's two things we could talk about. 00:28:14.200 |
So you tell me which one you want to talk about first. 00:28:16.200 |
There's the strategy behind why would a company 00:28:25.960 |
And then there's the question of what's happening 00:28:33.120 |
Let's talk about just Meta's decision to do this 00:28:36.280 |
and the commitment to open source and why it's so important. 00:28:53.920 |
it's created a lot of frustration and limitation 00:29:04.120 |
Something happened over the past, I'd say 15 years, 00:29:12.640 |
in Silicon Valley have developed a new strategic play 00:29:16.280 |
where they use open source as a defensive weapon 00:29:23.480 |
One of the first to do it was Google with Android. 00:29:27.120 |
And it's really hard for people to put themselves 00:29:48.080 |
So I think they backed up on that a little bit, 00:29:57.080 |
the openness that Verizon and Samsung thought 00:30:04.680 |
they were getting into with Android did play out there. 00:30:08.080 |
Like there's no Google or Apple in the China market. 00:30:21.000 |
move your workloads very easily between clouds. 00:30:24.200 |
They were worried Amazon was running away with AWS. 00:30:32.960 |
And they got the Linux Foundation to help organize, 00:30:40.280 |
They got all these hundreds of players behind Kubernetes. 00:30:49.040 |
And that has allowed more fluid transfer of workloads, 00:30:54.440 |
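As a concrete illustration of that portability (the cluster context names here are hypothetical), the same Deployment object can be submitted unchanged to clusters running on different clouds via the official Kubernetes Python client:

```python
from kubernetes import client, config

# One workload definition, written once:
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.27")]
            ),
        ),
    ),
)

# Same spec, two clouds: only the kubeconfig context changes.
for ctx in ("aws-cluster", "gcp-cluster"):
    config.load_kube_config(context=ctx)
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )
```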
Facebook created this thing called Open Compute Project. 00:31:02.000 |
if you wanna sell computers into our data center, 00:31:08.840 |
or we're gonna write a list of specifications 00:31:10.800 |
that those things have to meet in order to go in there. 00:31:14.440 |
Well, one of them is basically no proprietary technology. 00:31:21.200 |
that says we're only gonna support open source things here. 00:31:25.720 |
And by the very definition, if it qualifies to come in, 00:31:59.360 |
'cause they're worried about Google having too much power. 00:32:02.160 |
So this has become a kind of a new go-to move by, 00:32:07.600 |
You can't, like, obviously there are startups 00:32:11.840 |
but if you're gonna play this defensive move, 00:32:24.960 |
but I think some of the players in the AI world 00:32:29.200 |
by claiming that they were gonna change the world. 00:32:31.720 |
And so in such dramatic ways, you awaken the giants, right? 00:32:35.280 |
If you cause everybody's investor call to be, 00:33:01.680 |
on a third party for an AI tool that they needed 00:33:06.640 |
- I thought one of the most interesting parts 00:33:23.840 |
- You know I'm a huge, huge fan of this idea. 00:33:35.280 |
where we saw ecosystem control by a single company 00:33:42.840 |
He said it was bad enough that we had to pay a big tax, 00:33:45.640 |
right, because that tax hurts all the developers 00:33:51.520 |
He said, "But it was absolutely soul-crushing." 00:33:54.280 |
He used that phrase, which I thought was interesting. 00:33:56.720 |
When you develop, work really hard on a product improvement, 00:34:01.240 |
"No, you can't release that into your product 00:34:08.320 |
that this has become extraordinarily personal, right? 00:34:12.000 |
Here you have a company that produces tens of billions 00:34:17.480 |
It can fund all of, you know, all of these models 00:34:33.080 |
the question first is, okay, I'm going to have to compete 00:34:35.960 |
and keep up with this frontier-level competition. 00:34:43.600 |
you know, Meta's gonna give it away for free. 00:34:55.400 |
but years ago I wrote a blog post about Android, 00:35:04.160 |
"It doesn't matter if they can make money on Android." 00:35:08.920 |
But it creates a moat that's so wide around the castle. 00:35:13.680 |
I think I drew a picture, I talked about like, 00:35:20.300 |
for five square miles outside, around the water. 00:35:24.520 |
But you're not getting to our goddamn castle. 00:35:36.280 |
but it makes a lot of sense for these companies to do it. 00:35:43.920 |
to Mark's podcast, playing with the new models. 00:35:46.540 |
By the way, on Groq, they're like nutty fast. 00:35:57.860 |
He also said today that they've already laid out 00:36:03.620 |
So this was Llama 3.1, it was a large 405B model, 00:36:17.860 |
they've already laid out the data architecture for it, 00:36:22.040 |
So it seems to me, and our team's best guess, 00:36:34.540 |
so they probably think you're gonna get a new release 00:36:41.280 |
that they can then try to quickly leapfrog with Llama 4. 00:36:47.440 |
but he said it's kind of fun and fascinating, 00:36:53.160 |
And he said the models are all gonna be bigger, 00:36:55.740 |
they're all gonna cost a lot more money, billions of dollars, 00:37:04.880 |
which he said, this isn't gonna be a straight line. 00:37:11.340 |
and said you may have to go through a bubble here. 00:37:20.240 |
Second, you're gonna have to spend billions for a long time, 00:37:26.200 |
Third, that we may in fact have to cross this chasm, 00:37:32.720 |
I think what it does is it's also a shot across the bow 00:37:37.720 |
of those folks who might wanna start closed-model companies 00:37:43.640 |
- No doubt, and one other thing I would highlight 00:37:49.640 |
like every startup, whether you are an AI startup 00:37:54.640 |
or whether you're a startup that's been around a while 00:38:02.840 |
Like you should be so happy that he made these decisions. 00:38:10.440 |
in terms of openness today than the model was before. 00:38:14.320 |
More permissions, and the Hugging Face team was clapping 00:38:17.920 |
and celebrating on Twitter as well for this reason. 00:38:21.080 |
And so for, as Marc and Ben like to call it, little tech, 00:38:27.920 |
- Like phenomenal, it may not be for OpenAI, 00:38:37.120 |
is OpenAI the new Netscape, which is somewhat provocative, 00:38:43.880 |
And I would say this, I'm gonna stop and let you go, 00:38:47.920 |
but like last thing, when, I just don't think 00:38:52.800 |
that Sam Altman realizes when he talks about things 00:39:14.680 |
if you're OpenAI, you got to go raise your next 00:39:19.680 |
Remember, Meta had allocated $20 billion a year 00:39:26.000 |
And there was a report last week that they may, you know, 00:39:29.200 |
tighten their belt on Reality Labs to the tune of 20%. 00:39:33.120 |
Okay, so that's four or $5 billion of savings 00:39:36.800 |
on a research project that they have in Reality Labs 00:39:45.040 |
I actually think that they're doing extraordinarily well. 00:39:48.820 |
And here's the interesting thing about OpenAI, 00:39:59.700 |
But if you look at ChatGPT-4o, since it was released, 00:40:08.700 |
DAUs have gone from 60 million to 100 million. 00:40:15.980 |
there's only one way, you know, to build a business. 00:40:23.560 |
And the only other company that's really shown 00:40:32.700 |
So the question is, can they build a consumer product 00:40:35.940 |
that is durably better than the alternative, right? 00:40:41.340 |
And Mark mentioned that there are hundreds of millions 00:40:53.800 |
you probably don't even know you're using it yet. 00:41:10.420 |
And I mean, like those numbers you're referring to, 00:41:18.840 |
They're not necessarily the paid constituency. 00:41:23.540 |
- Yeah, and I, playing around with credit card data, 00:41:44.800 |
So I doubt that anyone that is looking at that, 00:42:06.900 |
I mean, ultimately, the COGS in these businesses, right, 00:42:12.220 |
because these are super quickly depreciating assets, right? 00:42:15.740 |
You have to keep reinventing that model every single year. 00:42:24.160 |
I would say, I think one of the retention problems 00:42:26.760 |
is simply all the substitutes that are available. 00:42:30.000 |
And if Meta's gonna, and then if speed's gonna matter. 00:42:35.680 |
of Google made itself like 30 milliseconds faster. 00:42:41.160 |
Like, and that, play with the Llama Groq demo. 00:42:51.320 |
- So anyway, I just think there's a lot of alternatives. 00:42:53.480 |
If Mark, if we're right about Mark's intention 00:42:57.560 |
to, he took it personally, he's playing defensively, 00:43:06.680 |
- And he might run it for four years with no ads. 00:43:11.700 |
and, you know, it's Groq, it's Cerebras has fast inference, 00:43:16.400 |
you know, Fireworks, another Benchmark investment, 00:43:25.480 |
You're now getting to the point with these inference engines 00:43:43.840 |
You may have a hundred different interactions, 00:43:52.480 |
because computers can talk to each other at, you know, 00:43:55.240 |
the fastest speed you can possibly, you know, 00:43:58.760 |
And so reducing that latency unlocks a lot of innovation, 00:44:27.840 |
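Back-of-envelope arithmetic (illustrative numbers only) on why per-call latency dominates once agents chain many model calls per user request:

```python
calls_per_request = 100            # "a hundred different interactions"

for per_call_ms in (500, 50, 5):   # slow hosted API vs. fast inference engine
    total_seconds = calls_per_request * per_call_ms / 1000
    print(f"{per_call_ms:>3} ms/call -> {total_seconds:5.1f} s end to end")
# At 500 ms/call a 100-call chain takes 50 seconds and is unusable;
# at 5 ms/call the same chain finishes in half a second.
```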
everybody, you know, launched with optimized versions 00:44:33.040 |
So, you know, in the case of Databricks or Snowflake, 00:44:36.480 |
they can basically take these optimized models 00:44:57.720 |
I went back to the Kubernetes Wikipedia page. 00:45:00.440 |
So on launch, the principal competitors were VMware, 00:45:08.240 |
And then, you know, a few months later, AWS came in, 00:45:12.180 |
but yes, if you're a participant in the market 00:45:16.760 |
who is looking for your own solution to be advantaged, 00:45:21.760 |
and there's someone who wants to make a piece of it 00:45:30.680 |
And so you're right, it becomes this attractor 00:45:33.640 |
that brings the other parties to the table to play. 00:45:37.360 |
And yeah, that's part of why it's so powerful 00:45:41.140 |
in this defensive way, what I call defensive strategic play. 00:45:50.280 |
And once again, I think it's good for entrepreneurs, 00:45:56.320 |
- Like 15, 20 year patents on big pharma drugs 00:46:02.880 |
This is technology getting freer, cheaper, more available. 00:46:05.880 |
- You know, and I'll give you credit for this. 00:46:08.120 |
In a lot of conversations I've had in Washington, 00:46:11.120 |
your work on regulatory capture has actually made its way 00:46:15.040 |
into a lot of Congress people's minds and offices, right? 00:46:18.280 |
They're on the lookout for regulatory capture. 00:46:39.920 |
So thank you for triggering this in my brain, 00:46:47.720 |
So you look like, you know, you're the good guy 00:46:54.000 |
- By the way, I do think it's worth mentioning 00:46:58.040 |
Just from talking to entrepreneurs in the AI space, 00:47:07.400 |
or Azure is advantaged by being the hosting system 00:47:13.320 |
- And Amazon doesn't, so not surprised that-- 00:47:22.320 |
- It makes sense, they need a response to that. 00:47:32.400 |
We have a few more topics we want to talk about. 00:47:34.520 |
We'll make these a little bit more lightning round. 00:47:37.200 |
But you had me listen to a podcast, you know, this week. 00:47:41.120 |
Excellent podcast with Dr. Jay Bhattacharya and Rick Rubin. 00:47:46.120 |
You know, Jay's a Stanford kind of economic epidemiologist, 00:47:52.560 |
on the Great Barrington Declaration in October of 2020. 00:48:07.400 |
into the population by April of 2020 than people thought, 00:48:19.600 |
you were telling all your friends to listen to this pod. 00:48:23.400 |
You clearly are still agitated about the fact 00:48:29.920 |
Why is this so important for society that, you know, 00:48:34.040 |
that we listen to this, that we come to terms with it, 00:48:47.040 |
Rick, obviously, is one of the most interesting Americans 00:48:54.320 |
on the planet, an incredible record producer. 00:49:01.200 |
and how he thinks about creation that came out last year. 00:49:14.840 |
even though I can't pronounce the name of his podcast. 00:49:22.880 |
and they covered, I think, maybe like a good two, three, 00:49:26.680 |
four-year period of COVID and everything Jay went through. 00:49:42.760 |
for whatever reason, we were in this weird place, 00:49:45.520 |
and some of this came out in the Twitter files, 00:49:48.280 |
where if you said anything that wasn't 100% consistent 00:49:55.960 |
it was labeled misinformation, conspiracy theory. 00:50:02.400 |
but I looked and found this incredible New York Times piece 00:50:17.800 |
probably thousands of hours of investigative journalism 00:50:24.160 |
And they had these infographics, it was like amazing. 00:50:30.960 |
we have, I don't know, 20 million people dead, 00:50:33.680 |
probably the worst catastrophe since World War II, 00:50:38.920 |
Like no one was looking, no one was doing the work 00:50:46.600 |
Yet the consequences were 1,000x, like 10,000x bigger. 00:50:51.760 |
And I don't know if it was Trump derangement syndrome, 00:50:58.280 |
had a lot of this where people tried to talk up. 00:51:01.640 |
And so to me, one thing should be obvious to everyone, 00:51:12.200 |
and if you hadn't listened to Jay, I would say, 00:51:24.920 |
- Right, and so I would argue that the reason 00:51:31.480 |
you had three prominent senior authors on this paper, 00:51:34.360 |
60,000 signatories, and it basically was early 00:51:38.720 |
in the pandemic and at odds with the conventional wisdom. 00:51:42.160 |
Okay, the conventional wisdom was that you needed 00:51:48.120 |
that it wasn't that deeply penetrated, you know, 00:51:51.480 |
of disease, and we could prevent it from spreading. 00:51:54.440 |
He was basically making the case, because they had-- 00:52:02.640 |
would have massive consequences, which they did. 00:52:06.720 |
- But he wasn't given a voice, and Francis Collins 00:52:21.040 |
and so he's actually thinking about how these things 00:52:33.560 |
and I might include this in how some of the people 00:52:38.360 |
responded to Ben and Marc, and I remember I triggered 00:52:42.200 |
on Kevin Scott when people were questioning LLM scaling 00:52:47.400 |
Like any time your reaction is not to react to the argument, 00:53:06.840 |
They wanted to be, you know, it was like excommunicating 00:53:11.040 |
them from science, you know, because you had the head 00:53:13.600 |
of the NIH and you had Fauci who were calling them 00:53:17.920 |
They didn't deal with any of the facts, right? 00:53:19.960 |
The science was that they had the sewage data 00:53:26.560 |
They knew the penetration, and they, of course, 00:53:30.240 |
knew the death rate, you know, that they could calculate. 00:53:32.960 |
They knew among younger people that there was almost 00:53:35.360 |
zero deaths, and among older people, you needed to take 00:53:41.240 |
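The underlying arithmetic, with purely illustrative numbers rather than Jay's actual data: if wastewater or serology implies many more infections than confirmed cases, the fatality rate per infection drops proportionally.

```python
confirmed_cases = 100_000
deaths = 1_000
cfr = deaths / confirmed_cases     # case fatality rate: 1.0% looks scary

undercount = 10                    # hypothetical: 10x more infections than cases
true_infections = confirmed_cases * undercount
ifr = deaths / true_infections     # infection fatality rate: 0.1%

print(f"CFR on confirmed cases:      {cfr:.1%}")
print(f"IFR on estimated infections: {ifr:.2%}")
```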
Now, Jay, what makes him interesting is, you know, 00:53:45.280 |
he actually had two relatives in India who died from COVID. 00:53:49.240 |
It wasn't like this guy didn't care about COVID. 00:53:55.680 |
who actually looked at data, did the research, 00:53:58.440 |
put together, you know, a very important paper. 00:54:02.400 |
Now, this is a person who was celebrated by the NIH. 00:54:09.240 |
Right, until he said something they didn't like. 00:54:16.280 |
"I need to listen to this," I'm thinking to myself, 00:54:20.120 |
But he says something at the end of the podcast 00:54:21.800 |
where he said, "The problem is we're no better prepared today 00:54:25.760 |
"than we were then, and the exact same rush to judgment, 00:54:30.760 |
"right, lock everything down, we're prone to do it again 00:54:38.240 |
"And the proper post-mortem is that we need to do 00:54:43.720 |
And one thing, obviously, I hope everyone goes 00:54:45.680 |
and listens to it, but one thing that he uncovers, 00:54:48.520 |
they've gone back through and looked at excess deaths 00:54:59.680 |
I think most people believe there were consequences 00:55:11.640 |
and it was horrific for them not to have the experience 00:55:14.640 |
that so many of us cherish from those moments in time. 00:55:17.800 |
It actually, I get a little upset thinking about it. 00:55:23.400 |
because we need to go back through and look at everything. 00:55:30.120 |
If you look at what's being discussed in Congress 00:55:34.320 |
and what they're uncovering, it really, really needs more, 00:55:44.120 |
there's people that you should follow on Twitter, 00:55:48.520 |
There were these incentive systems pushed through hospitals 00:55:51.120 |
where you could charge 30% more if a patient had COVID. 00:56:06.640 |
that I brought up during my regulatory capture speech 00:56:14.440 |
And all that stuff, while we don't have a pandemic, 00:56:20.480 |
And I hope that whoever is our next president 00:56:27.840 |
I have no idea if Jay has that kind of time in his life. 00:56:39.040 |
This deal, Google was rumored to be buying Wiz 00:56:46.920 |
The first was HubSpot that seems to have kind of blown up 00:56:53.440 |
Assaf, the CEO of Wiz, came out and said not to worry. 00:56:57.000 |
We're gonna get to a billion dollars in revenues 00:57:03.960 |
I mean, my read on it, just at a quick level is, 00:57:08.360 |
it would probably trade at 13 or 14 times forward. 00:57:10.840 |
That would be like 13 or 14 billion, not 23 billion. 00:57:24.360 |
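That read is one line of arithmetic, using the figures as stated above:

```python
forward_revenue = 1e9       # the $1B revenue target Wiz cited
public_multiple = 13.5      # roughly 13-14x forward revenue for comparables
implied_value = forward_revenue * public_multiple
print(f"Implied public value: ${implied_value / 1e9:.1f}B, "
      f"versus the rumored $23B offer")
```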
Well, I mean, one thing I would say is that these are, 00:57:29.840 |
if you're lucky enough to be involved in a situation 00:57:42.320 |
but I remember when Instagram, you know, got their offer, 00:57:52.600 |
And we had funded the 14th photo sharing site. 00:57:57.000 |
And Matt comes in and says, "Should we do this?" 00:58:00.040 |
And like, it's easy to want to be on the yes side 00:58:04.520 |
in these situations, because it's like, wow, this might, 00:58:23.560 |
And so, and there's also the famous, you know, 00:58:30.000 |
and then became one of the most powerful companies 00:58:34.560 |
- Turning down 23 or whatever the number was, 00:58:37.880 |
it's a higher number, it makes me talk with a higher voice. 00:58:47.920 |
The number of people that make it to a billion 00:58:50.360 |
is a fraction of the number that make it to a hundred million 00:58:53.240 |
and the number that make it to 10 billion is a fraction. 00:58:56.000 |
Like it's, the air's thin as you climb the mountains. 00:59:01.840 |
if they're just that confident in the business. 00:59:05.800 |
that the CrowdStrike situation makes them feel 00:59:18.880 |
because Google doesn't have a huge security business 00:59:21.800 |
and there's actually concern about consolidation 00:59:37.080 |
I think there'd be less reason even to worry about it. 00:59:41.560 |
- Yeah, that's probably the one thing I would press on is, 00:59:45.220 |
you know, the betting markets I think are now 60/40 00:59:49.040 |
with Kamala in the race, in favor of Trump winning the election. 00:59:54.040 |
You know, we've been in this period in Silicon Valley 01:00:10.880 |
It seems to me they did three things instead. 01:00:19.020 |
They, you know, they tighten their belts on people. 01:00:23.920 |
All the companies became a lot more profitable 01:00:34.720 |
usually the partner that's involved in the company, 01:00:37.640 |
the one that's on the board or that led the investment, 01:00:40.400 |
they're usually pretty, they tend to be overly confident. 01:00:59.520 |
the odds that they were sitting there pushing, 01:01:04.300 |
maybe we should do this," I would think would be high. 01:01:14.460 |
Which speaks to the fact that the LPs feel a need 01:01:25.860 |
- But it could have been. - It could have been. 01:01:27.580 |
- As another problem that we face in Silicon Valley, 01:01:32.060 |
part of the vibrancy of this ecosystem relies on the, 01:01:36.740 |
not just the IPO market, but also the M&A market. 01:01:44.420 |
a peer pressure to appear particularly confident 01:01:48.580 |
and to play the long ball game amongst venture capitalists, 01:02:05.060 |
I'm just looking at the list, Andreessen, Thrive, 01:02:25.900 |
but they've clearly built a terrific business. 01:02:27.940 |
- No doubt, I've heard nothing but good things. 01:02:29.580 |
I don't, and by the way, one thing that could be 01:02:34.980 |
is if they turn and run at the IPO markets very quickly. 01:02:44.460 |
- And we talked about this at the Code Two event. 01:02:47.260 |
I think people have gotten overly conservative 01:02:57.780 |
We've had talk about these potential AI IPOs coming. 01:03:07.940 |
- Yeah, you know, I just got off a board call 01:03:14.380 |
that's gonna come public in September, October. 01:03:20.020 |
about companies coming to the public market again. 01:03:25.900 |
and its maximum innovation is still in front of it, right? 01:03:32.020 |
like all of that can occur post going public, right? 01:03:41.940 |
they miss their numbers a little bit on margins. 01:03:47.380 |
But interestingly enough, Elon said on the call, 01:03:50.420 |
if you don't believe that Tesla's gonna solve autonomy, 01:03:56.180 |
So he's like, this is not just about the number of cars 01:04:01.180 |
Like the reason you're in this stock is because of autonomy. 01:04:04.980 |
And so I want to talk a little bit, you know, 01:04:07.140 |
you and I spent a whole pod talking basically about 12.3. 01:04:16.700 |
He mentioned 12.5 and 12.6 on the call today. 01:04:24.940 |
They're running like 5X larger models on the edge, 01:04:27.540 |
on the car, so rather than a billion parameter model, 01:04:29.860 |
maybe something like four or five billion parameter model. 01:04:32.980 |
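Rough arithmetic on why that roughly 5x jump matters on in-car hardware (the precision formats here are assumptions, not Tesla's disclosed stack):

```python
for params_billion in (1, 5):
    for bytes_per_param, fmt in ((2, "fp16"), (1, "int8")):
        weight_gb = params_billion * bytes_per_param  # 1e9 params * bytes / 1e9
        print(f"{params_billion}B params @ {fmt}: ~{weight_gb} GB of weights")
# A 5B-parameter model is ~10 GB at fp16 (~5 GB at int8), and it has to fit
# in, and run at camera frame rate on, the car's onboard inference computer.
```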
The rates of improvement continue to accelerate. 01:04:39.280 |
You know, we went and did some testing of our own on this. 01:04:44.220 |
In 12.3, you know, you could take your hands off the wheel 01:04:48.460 |
for like 40 seconds before it told you to re-engage. 01:04:51.840 |
By 12.4, you could take your hands off the wheel 01:05:00.500 |
And it has eye tracking, and so you could look away 01:05:03.700 |
a little bit before, you know, it was telling you 01:05:08.100 |
On 12.5 and 12.6, we think it'd get to the point 01:05:13.900 |
so it doesn't constantly ding you to grab the wheel. 01:05:17.300 |
And you can really start looking away from the road 01:05:26.900 |
because they're confident in the efficacy of the model. 01:05:38.340 |
they're going to look for approvals in China and Europe 01:05:57.180 |
Interestingly, one of our analysts who covers China-- 01:06:08.820 |
But not without a driver, they have to apply-- 01:06:22.740 |
deploying 10,000 cars in Wuhan overnight, et cetera. 01:06:26.300 |
And here's what she, you know, came back and-- 01:06:32.460 |
She said, you know, they're still catching up to FSD-11. 01:06:39.100 |
So they're, like, three years behind on full self-driving. 01:06:56.620 |
to being hands off the wheel, eyes off the road. 01:07:01.180 |
Huawei, XPeng, NIO, all these different companies. 01:07:07.100 |
And she said she had some really lousy experiences 01:07:23.500 |
So in Africa, you know, there are no landlines. 01:07:26.900 |
And so everything, you know, changes in a different way. 01:07:34.860 |
So most people are either bicyclists or commuters. 01:07:42.140 |
why your expectations would be in a very different place 01:07:45.700 |
than if you were an American who loves your car. 01:07:52.100 |
but we spent, I don't know, 100 years, 70 to 100 years, 01:08:05.860 |
One of the things you and I've debated back and forth 01:08:08.460 |
on this pod is just the impact that AI is going to have 01:08:12.060 |
and the timeline against which it's going to have the impact. 01:08:18.660 |
the magnitude of the impact, perhaps from the LLM. 01:08:27.860 |
on full self-driving autonomy and likely on robo-taxi. 01:08:32.860 |
And so it does feel to me like this is one of those places 01:08:39.660 |
it's not surprising to see two or three Waymos around you. 01:08:57.820 |
to pull the steering wheel out of the car altogether. 01:09:07.980 |
which was Misha Laskin, who is an entrepreneur 01:09:12.980 |
that runs an AI company here in Silicon Valley. 01:09:18.020 |
But he was at Google DeepMind and he talks about AlphaGo. 01:09:23.820 |
And he has some different perspectives on AlphaGo. 01:09:32.260 |
and there's this great book about infinite and finite games, 01:09:46.220 |
I think you define it in a way where it's a finite game, 01:09:52.900 |
And so these models are able to go a lot farther 01:09:57.900 |
when there's a finite game that can be played. 01:10:19.140 |
- Yeah, I think there are a bunch of discrete things 01:10:32.020 |
So what the heck happened with CrowdStrike this week? 01:10:38.900 |
all my friends were complaining that they couldn't fly, 01:10:42.020 |
get from point A to point B, airlines ground to a halt. 01:10:54.200 |
And what are the things we ought to be looking out for 01:10:58.020 |
- A whole bunch of things went through my mind. 01:11:04.600 |
but that is just so like such a strong argument 01:11:09.600 |
is why didn't you stage gate how this is rolled out? 01:11:13.160 |
Like, how could you let it have this big an impact? 01:11:16.480 |
Like if you did increasingly large groups every two hours, 01:11:21.480 |
you would never have had this impact on the world. 01:11:23.760 |
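What "stage gating" means in practice, as a minimal sketch; fetch of the fleet, deploy_to, and healthy are hypothetical stand-ins, not CrowdStrike's actual tooling:

```python
import time

WAVES = [0.001, 0.01, 0.10, 1.00]   # 0.1% -> 1% -> 10% -> everyone
SOAK_SECONDS = 2 * 60 * 60          # "increasingly large groups every two hours"

def staged_rollout(fleet, deploy_to, healthy):
    """Push an update in growing waves, halting on the first unhealthy wave."""
    done = 0
    for fraction in WAVES:
        target = max(done + 1, int(len(fleet) * fraction))
        wave = fleet[done:target]
        deploy_to(wave)
        time.sleep(SOAK_SECONDS)    # let the wave soak before widening it
        if not all(healthy(host) for host in wave):
            raise RuntimeError(
                f"wave of {len(wave)} hosts unhealthy; rollout halted"
            )
        done = target
# A bad update caught at the 0.1% wave breaks thousands of machines,
# not 8.5 million.
```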
- Deployment engineer that needed to go on vacation. 01:11:36.720 |
at kind of analyzing crisis PR who went through his stuff. 01:11:40.480 |
I think they could have talked about things like that, 01:11:46.400 |
they gave a technical description of what happened 01:11:52.360 |
They didn't say why this isn't gonna happen again, 01:11:59.120 |
The other thing that if you read Matthew Prince 01:12:02.960 |
who wrote a large analysis of this and keep in mind, 01:12:11.280 |
And he's very worried about a very specific thing here. 01:12:19.800 |
but apparently Microsoft will argue the reason 01:12:22.680 |
that these security companies have access to the kernel 01:12:25.360 |
is because they were forced to let them have access 01:12:34.800 |
Now, there were no non-Microsoft boxes that failed 01:12:42.560 |
"Apple doesn't allow you to have access to the kernel." 01:12:56.760 |
but Microsoft's now one of the top four security companies 01:13:02.840 |
And so, that's a new kind of dimension of competition 01:13:06.760 |
that I think people need to pay attention to. 01:13:21.840 |
They were on really, really old Windows machines. 01:13:29.960 |
that has a terminal that matters is running Windows at all. 01:13:40.240 |
of the network effect of Windows and all this stuff, 01:13:42.840 |
but you would think there'd be a hardened Linux thing 01:13:52.680 |
You know, we have these esoteric conversations all the time. 01:13:58.040 |
But, you know, the world has become software. 01:14:04.420 |
the amount of software that's running around us. 01:14:09.100 |
soon to be driving our cars for us, et cetera. 01:14:25.160 |
That really makes people in Washington unhappy. 01:14:28.600 |
Right, and so, part of the reason I think that, you know, 01:14:38.220 |
and Silicon Valley is changing, is forever changing, 01:14:50.660 |
whether it's the phone we carry in our pocket 01:14:53.020 |
or the software that's getting us from point A to point B, 01:14:56.620 |
to the AI we're going to be using in the future. 01:14:59.260 |
And so, I think that this symbiotic relationship, 01:15:02.500 |
this, you know, figuring out how to make sure 01:15:11.380 |
in the most productive way possible for this country, 01:15:29.660 |
And, in that regard, I think policies are gonna matter. 01:15:33.620 |
Like, I don't think any party can take Silicon Valley-- 01:15:55.700 |
One, I'm gonna try and pronounce Tetragrammaton, 01:16:02.500 |
- And, apparently, it's an important Hebrew symbol 01:16:29.540 |
A lot of people thought he couldn't make positive cashflow 01:16:32.140 |
because of the way that the music label deals 01:17:04.740 |
- Yeah, it's up like six, seven billion today. 01:17:19.740 |
- All right, man. - All right, great to see you.