Ep13. Silicon Valley’s Political 180, META AI, COVID Postmortem | BG2 w/ Bill Gurley & Brad Gerstner


Chapters

0:00 Intro
1:38 The Perceived Political Realignment in Silicon Valley
18:02 Engagement Between Silicon Valley and Washington
27:21 Meta's Open Source Strategy (405B)
47:37 Post-Mortem on the COVID-19 Pandemic
58:37 Wiz IPO and Acquisitions
1:03:37 The Impact of AI on Self-Driving Cars
1:10:33 Lessons from the CrowdStrike Incident
1:16:11 Spotify's Success under Daniel Ek

Whisper Transcript

00:00:00.000 | And instead of a $250 marginal cost
00:00:03.260 | to implement a rural broadband, we're
00:00:05.220 | going to get [BLEEP] bulldozers out and drop fiber lines
00:00:09.940 | to a ranch in the middle of nowhere.
00:00:12.500 | It's stupid.
00:00:13.680 | [MUSIC PLAYING]
00:00:16.660 | Hey, man, good to see you.
00:00:28.060 | Good to see you.
00:00:28.940 | I mean, so much for sell in May and go away.
00:00:31.040 | This summer's been nuts.
00:00:32.660 | You've got a near assassination of the president.
00:00:36.060 | The head of the Secret Service resigns.
00:00:38.380 | Biden withdraws under pressure from Pelosi.
00:00:40.820 | Kamala has all the delegates.
00:00:42.940 | Now she's the presumptive nominee.
00:00:45.180 | She raises $80 million the day Biden drops out of the race.
00:00:49.340 | It really seems to overshadow everything else.
00:00:51.540 | I mean, I don't know that--
00:00:52.340 | Well, how is that inconsistent with sell in May and go away?
00:00:54.340 | Wouldn't that have been a good strategy?
00:00:56.060 | Or wasn't it a good strategy?
00:00:57.380 | I don't know.
00:00:58.180 | The markets are actually doing pretty fine,
00:01:00.700 | given the background and the context here.
00:01:03.260 | But it's just trying to stay on top of the events.
00:01:05.980 | I mean, our group chat has been exploding.
00:01:08.220 | I find myself 50 to 100 chats behind every time
00:01:12.100 | I open the app, just because of all the political activity
00:01:14.860 | going on.
00:01:15.360 | And of course, we don't talk a lot about politics on this pod.
00:01:18.780 | We want to talk about business.
00:01:20.540 | But we also look at kind of this debate going on
00:01:25.220 | about Silicon Valley.
00:01:26.380 | Like, The Washington Post reported "Silicon Valley
00:01:30.180 | realignment leading tech titans to back Trump."
00:01:34.260 | You've been here a long time.
00:01:36.340 | Silicon Valley's really been dominated
00:01:37.900 | by the Democratic Party.
00:01:39.660 | It's been perceived as pro-business and pro-tech.
00:01:42.980 | It's been perceived as fairly moderate on social policies.
00:01:47.500 | But just in the last few weeks, leading traditionally
00:01:51.540 | Democratic supporters, Ben Horowitz, Marc Andreessen,
00:01:55.340 | Elon, who supported a lot of Democrats in the past,
00:01:58.020 | have come out in support of Trump.
00:02:00.060 | What's changed?
00:02:01.500 | Why all of a sudden have people in Silicon Valley
00:02:04.980 | that have traditionally supported Democrats,
00:02:06.820 | why are they moving to Trump?
00:02:08.020 | And I might even say, like if I go back to when I joined--
00:02:11.740 | and I mentioned this a bit in the talk I gave at All In
00:02:14.900 | on regulatory capture.
00:02:16.900 | I think I might even describe Silicon Valley
00:02:19.380 | as apolitical back then.
00:02:20.900 | Like there just wasn't--
00:02:22.420 | sure, some people would say, yeah, I vote Democrat.
00:02:24.700 | But there wasn't any discussion of it
00:02:28.180 | within the ecosystem, within the industry,
00:02:30.620 | while you're at a meeting that relates to a startup.
00:02:34.060 | Like it just didn't come up.
00:02:35.300 | And the title of my speech was 2,851 miles,
00:02:39.860 | which is the distance Washington is from here.
00:02:42.420 | And ironically, it's about as far away
00:02:44.300 | as you could possibly be in the United States, which I think
00:02:47.420 | was helpful, as I said.
00:02:50.220 | All of a sudden now, though, you've
00:02:53.100 | had those people you mentioned come out
00:02:57.020 | in a way that's very provocative.
00:03:02.620 | I think it's provocative not only
00:03:04.820 | that they're coming out in favor of Trump,
00:03:06.860 | but it's provocative that they stepped out in this way.
00:03:11.260 | Now, Ben and Mark foreshadowed this quite a bit,
00:03:15.540 | at least by 30 days or something.
00:03:18.500 | Obviously, their firms have been getting bigger and bigger
00:03:20.420 | and bigger with multi-stage, multi-industry,
00:03:23.420 | like specialized funds.
00:03:25.340 | And as you move into things that they've
00:03:27.780 | been moving into-- crypto, defense tech--
00:03:32.220 | you're getting into heavily regulated industries.
00:03:35.500 | So you're going to bump up against this more,
00:03:37.860 | and you're going to think about policy more.
00:03:39.740 | And if you listen to what they do on their podcast,
00:03:43.300 | and if you listen to what they write in their blog posts,
00:03:46.180 | that trend has been going on for a while.
00:03:48.660 | It's not like this just came out of nowhere.
00:03:51.180 | They've been talking about policy.
00:03:54.180 | And so they went through in their podcast, which
00:03:58.700 | I think is a must listen--
00:04:00.620 | It was great on this issue.
00:04:02.060 | --on this Trump-- but they didn't just
00:04:04.060 | throw down and walk away.
00:04:05.460 | They threw down and talked about it for 90 minutes.
00:04:07.740 | Exactly.
00:04:08.380 | And it's very clear-- and this was in their foreshadow piece
00:04:12.260 | as well--
00:04:13.100 | that they're looking after the interests of their constituents.
00:04:15.860 | And their business.
00:04:17.140 | And I don't think there's anything wrong with that.
00:04:21.260 | They have people they represent that are counting on them.
00:04:26.740 | Their LP base, their employee base,
00:04:29.220 | the entrepreneurs they've backed.
00:04:30.860 | And the piece that foreshadowed it
00:04:33.620 | said we're going to get more involved in politics,
00:04:36.300 | and we're going to look after-- they called it little tech,
00:04:38.740 | which you might describe as entrepreneurs who want
00:04:41.300 | to break through and be disruptive.
00:04:43.180 | They're looking after that constituency
00:04:45.300 | because that's endemic to everything
00:04:48.140 | that they do and work on.
00:04:49.660 | No, I think, to be fair, they believe like both you
00:04:53.940 | and I believe.
00:04:54.460 | And I think most people in Silicon Valley
00:04:56.180 | believe that-- in fact, they talk
00:04:58.220 | about this triangle, which is technology
00:05:02.420 | is the great source of American advantage.
00:05:05.220 | It drives our economy, which is the strongest in the world,
00:05:08.220 | which is a great source of advantage, which allows us
00:05:10.860 | to have a superior military.
00:05:12.320 | Because both technology that we invent
00:05:14.700 | is leveraged by our military, and we
00:05:16.260 | have the financial resources to invest.
00:05:18.820 | So that leads to American global leadership.
00:05:22.580 | In their podcast, one of the things
00:05:24.540 | they lay out is what they believe
00:05:26.580 | was, by the Biden administration,
00:05:28.380 | an assault on business or technology.
00:05:31.260 | I think they go through a list of things, right?
00:05:34.220 | So one was an assault on business and technology,
00:05:36.940 | being on the wrong side of crypto,
00:05:39.420 | which they believe the blockchain is
00:05:41.060 | very important to have an open and permissionless network
00:05:45.940 | that people can build on.
00:05:48.700 | They talk about being on the wrong side of AI.
00:05:51.300 | Obviously, you and I have commented
00:05:53.540 | on the executive order that wanted
00:05:57.940 | to limit the number of FLOPs and model sizes and everything
00:06:01.340 | before the industry even gets started.
00:06:03.860 | And then this crazy idea to tax unrealized capital gains
00:06:08.540 | and how that, in and of itself, is not only almost impossible
00:06:12.140 | to implement, but also would undermine the very incentive
00:06:16.180 | that entrepreneurs have to go create businesses.
00:06:18.340 | And let me just press pause on that one point,
00:06:21.220 | because I think when you hear--
00:06:23.620 | especially if anyone's listening to the podcast who's
00:06:26.180 | not deeply involved in a startup and knows what a cap table
00:06:29.980 | looks like, it'd be easy to miss this point, or to look over it,
00:06:33.340 | or to think maybe they're just talking about a nuance that
00:06:36.540 | doesn't matter.
00:06:37.700 | It would be catastrophic to the venture capital industry
00:06:42.060 | and to the startup industry to tax unrealized capital gains.
00:06:45.780 | It would be--
00:06:46.380 | Tell us why.
00:06:47.860 | Well, first of all, there's no way
00:06:49.540 | to value a private company.
00:06:51.660 | So you start with this problem that--
00:06:55.700 | how would you go about saying what it's worth?
00:06:58.060 | And for those of you that have been exposed
00:07:00.660 | to 409A, which is this god-awful policy where
00:07:05.700 | we have to pay a third party to create
00:07:09.660 | this analysis to say what the option price has to be,
00:07:12.900 | this is the most voodoo magic that I've ever seen in my life.
00:07:17.180 | And it's so crazy--they run these models
00:07:22.220 | and they produce a number to, like, two decimal points.
00:07:24.900 | And I've always said they should be forced to--
00:07:29.340 | they should be forced to give a confidence interval.
00:07:31.900 | Because it would probably be like from 5 to 20,
00:07:37.140 | but they say 11.49.
00:07:39.180 | And there's this other great piece
00:07:41.140 | of work that was written in finance literature, which
00:07:45.500 | was titled, "If You're So Smart, Why Aren't You Rich?"
00:07:47.980 | If these people are capable of calculating
00:07:50.380 | perfect stock prices to the second decimal point,
00:07:54.660 | they should be in your business.
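(A toy illustration of Bill's point, with made-up inputs -- this is not how any real 409A appraisal is built. If you treat the appraisal's judgment calls as uncertain inputs, the "price" comes out as a wide range rather than a two-decimal number.)

```python
# Toy sketch with hypothetical inputs -- for illustration only.
import random

def appraise(base_price: float) -> float:
    # Stand-ins for the judgment calls in an appraisal:
    # discount for lack of marketability, comparable-company multiple, etc.
    dlom = random.uniform(0.6, 0.9)
    comp_multiple = random.uniform(0.5, 2.0)
    return base_price * dlom * comp_multiple

random.seed(0)
runs = sorted(appraise(11.49) for _ in range(10_000))
low, high = runs[500], runs[9500]  # roughly a 90% interval
print("Point estimate quoted: $11.49")
print(f"Honest range: ${low:.2f} to ${high:.2f}")
```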
00:07:56.740 | And there's just something even more basic.
00:07:58.580 | I mean, we just went through this period of '20 and '21,
00:08:01.860 | where companies were valued at $10 billion, $15 billion,
00:08:04.660 | $20 billion, you would have actually had to pay a tax.
00:08:08.740 | Somehow you have to come up with the money.
00:08:10.860 | The company, the employees of the company,
00:08:13.060 | the founders of the company have to come up with the money.
00:08:15.060 | There's no secondary market.
00:08:16.420 | There's no ability to get liquid on this.
00:08:18.300 | I don't even know how they would pay.
00:08:20.260 | But then the idea--
00:08:21.220 | And that's the second part.
00:08:22.660 | You would literally have to facilitate liquidity for them
00:08:26.380 | to pay a tax.
00:08:27.300 | And then the third point, which you just mentioned,
00:08:29.820 | there would be numerous situations where
00:08:32.020 | you'd have people pay a tax, and then the valuations
00:08:35.500 | would collapse.
00:08:36.820 | And then they would just have this massive tax loss
00:08:39.540 | carry forward with nothing they could do with it.
00:08:42.220 | And they'd be wiped out by a tax.
00:08:44.860 | And putting taxes on unrealized gains
00:08:49.900 | is just a nutty, nutty, nutty concept.
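(To put rough numbers on that sequence -- hypothetical figures and a flat rate, purely for illustration, not from the episode:)

```python
# Illustrative arithmetic only: hypothetical numbers, flat 25% rate.
basis = 1_000_000                # founder's cost basis in her shares
peak_paper_value = 101_000_000   # marked-up valuation in a hot round
tax_rate = 0.25

unrealized_gain = peak_paper_value - basis  # $100M gain on paper only
tax_due = unrealized_gain * tax_rate        # $25M in cash, with no liquidity

value_after_collapse = 5_000_000  # the valuation resets the next year
# $25M of real cash paid on a position now worth $5M, leaving a
# tax-loss carryforward the founder may never be able to use.
print(f"Cash tax paid on paper gains: ${tax_due:,.0f}")
print(f"Position value after collapse: ${value_after_collapse:,.0f}")
```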
00:08:51.660 | Right.
00:08:52.140 | And by the way, Democrats historically
00:08:54.340 | have not supported that.
00:08:56.300 | And so I think that, to explain the 180,
00:09:00.420 | or the realignment as The Washington Post describes it,
00:09:04.540 | you have to understand that the Biden administration,
00:09:07.700 | at least as perceived by these people in Silicon Valley,
00:09:10.540 | did a 180.
00:09:12.660 | They were introducing policies that
00:09:16.580 | were out of alignment with what has historically happened.
00:09:19.460 | Now, it's interesting, right?
00:09:21.300 | On the one hand, you have Vinod Khosla.
00:09:25.900 | You have Reid Hoffman. They've come out and
00:09:31.340 | made this really a personality test about Trump.
00:09:34.260 | He's a liar.
00:09:35.460 | He's a cheat.
00:09:36.580 | He's not good for America.
00:09:37.940 | He's an existential risk to democracy.
00:09:40.820 | On the other side, you really hear a lot of policy arguments
00:09:44.620 | around whether or not they're pro-business, pro-tech,
00:09:47.780 | pro-AI, pro-crypto, et cetera.
00:09:50.700 | And that's really where you see the clash.
00:09:52.420 | I do think that Biden now stepping down and Kamala
00:09:56.820 | taking the position--
00:09:58.500 | I think the straw that broke the camel's back, frankly,
00:10:01.540 | for a lot of people--
00:10:02.300 | Elon came out and endorsed after this--
00:10:05.660 | was just the recognition that Biden was not
00:10:08.740 | in the mental state to lead on top
00:10:12.140 | of being on the wrong side of a lot of these issues.
00:10:15.780 | It makes me want to just back up to a super high level.
00:10:19.180 | Last pod we did, you were talking
00:10:21.020 | about how if you take out the MAG 7, the S&P was down.
00:10:26.140 | And you could take that to an extreme case
00:10:28.580 | and say, imagine if the MAG 7 didn't exist.
00:10:33.060 | What would our economy be like?
00:10:35.500 | What would America be like?
00:10:37.140 | And what you realize is that--
00:10:40.460 | and not to maybe pick on Europe too much,
00:10:43.100 | but it'd be a little bit like that,
00:10:45.100 | like where the stewards of our industries
00:10:48.420 | would be Exxon or Delta or JPMorgan or whatever.
00:10:53.660 | And those aren't companies that grow very fast.
00:10:55.940 | Yeah, I mean, the vast majority of the economic wealth that's
00:10:59.220 | been created in the last 20 years
00:11:00.740 | has been on the back of technology.
00:11:02.700 | And I think fundamentally--
00:11:03.860 | Well, in venture-backed companies.
00:11:05.220 | Right, right.
00:11:06.220 | And so this is-- and remember, in a prior generation,
00:11:13.220 | it was the Lockheed Martins,
00:11:16.020 | it was the Henry Fords, it was the Thomas Edisons
00:11:20.140 | that were those innovators.
00:11:21.260 | And I think the one thing that does pull people together
00:11:23.740 | in Silicon Valley is an attack or an assault on innovation
00:11:27.660 | or on technology or on entrepreneurship.
00:11:30.580 | And what's interesting is I was watching somebody
00:11:35.820 | who I actually quite like, Pete Buttigieg.
00:11:39.500 | And he said, nobody should be surprised by what Silicon
00:11:42.180 | Valley is doing because it's just
00:11:43.660 | a bunch of rich white guys who are advocating for what's
00:11:46.780 | in their best interest.
00:11:48.260 | And what's interesting-- I think Pete got it wrong, right?
00:11:51.940 | And the reason I think he got it wrong
00:11:53.480 | is because I don't hear in any of our conversations
00:11:56.700 | that we're supporting Trump because he's
00:12:00.980 | going to lower our tax rate, right?
00:12:03.240 | I do hear things around key policy areas
00:12:06.940 | that impact entrepreneurship that people care deeply about.
00:12:10.940 | I hope that the conversation around the realignment--
00:12:14.820 | and I see some movement around this--
00:12:17.300 | forces the next administration, if it
00:12:21.580 | is a Democratic administration, back to engagement.
00:12:24.500 | I thought it was really odd that Ben and Mark said--
00:12:28.660 | you know, they were stonewalled by Gensler.
00:12:31.900 | Stonewalled, by the way.
00:12:33.060 | It's like there was not even a willingness
00:12:35.300 | to talk about the issues openly.
00:12:37.100 | Right, and admittedly, this other candidate
00:12:39.700 | is just starting to run.
00:12:41.860 | But they talked about Jared, and Ivanka, and Trump.
00:12:45.700 | And they asked for meetings, and they got them right away.
00:12:48.220 | And they're feeling responded to, to your point.
00:12:53.460 | And I think one of the things that happens any time--
00:12:56.480 | and this may go back to where you started,
00:12:58.180 | that Silicon Valley has been a longtime supporter
00:13:00.460 | of the Democratic Party, kind of writ large.
00:13:02.620 | Any time you unequivocally give someone your support
00:13:06.620 | for a long period of time, they may kind of
00:13:08.500 | forget that you're a constituent.
00:13:10.860 | They might take it for granted.
00:13:12.540 | And I look at the city of San Francisco,
00:13:14.940 | who has lost Suave, and Stripe, and all these fintech companies
00:13:19.780 | because of the way they structured
00:13:22.980 | their homeless tax and gross receipts thing.
00:13:25.620 | And it's just like, hey, you have a consequence
00:13:29.940 | if you don't keep this constituent happy.
00:13:32.500 | And you've benefited from this constituent being successful.
00:13:36.140 | And so look, part of being a democracy
00:13:39.900 | is that everyone's allowed to have a voice.
00:13:42.100 | And I do think that it's worth--
00:13:46.180 | I think it would be helpful to the industry
00:13:47.980 | if the people that push back would say--
00:13:51.380 | and maybe they're saying this, but they
00:13:53.000 | say, I agree with you on all those issues,
00:13:55.180 | but I want to support the other candidate for this reason,
00:13:58.060 | rather than trying to buffoon or bully
00:14:02.780 | the people who are expressing their point of view.
00:14:06.020 | And look, at the high level--
00:14:08.180 | and I know that not everyone in the world
00:14:10.780 | will agree with me on this, but I've
00:14:12.460 | talked in the past about how the single human that's
00:14:16.100 | brought the most people out of poverty
00:14:17.780 | is Deng Xiaoping, by bringing entrepreneurism and capitalism
00:14:21.820 | to China, he brought 500 million people out of poverty.
00:14:24.740 | And I go back to those Matt Ridley books.
00:14:27.700 | The way you create an increase in prosperity
00:14:31.620 | is through growth.
00:14:33.020 | And the way you can get outpaced growth
00:14:36.940 | between the countries that exist on this planet
00:14:39.380 | is by being really good at innovation.
00:14:42.100 | And so I don't know a better way to help the populace.
00:14:46.060 | Right, and I think it's something
00:14:49.020 | we should all be able to agree on, on both sides of the aisle.
00:14:51.620 | And by the way, I've spent a lot of time on Capitol Hill
00:14:54.700 | recently with Democrat, Republican senators,
00:14:57.660 | Democrat, Republican House members.
00:14:59.340 | And I will say, I'm really optimistic.
00:15:01.420 | I think they're incredibly smart people on both sides.
00:15:04.340 | I think that they both see the right issues on AI,
00:15:08.500 | on national defense, on issues related to China,
00:15:13.060 | on issues related to technology and innovation.
00:15:15.980 | I think that there is strong consensus in Washington
00:15:19.940 | that the greatest source of our national advantage
00:15:22.660 | is innovation.
00:15:23.740 | And I happen to agree with that.
00:15:25.460 | It just seems that this administration,
00:15:27.420 | and we don't really know what was going on
00:15:30.460 | over the course of the last six or nine months,
00:15:32.260 | but it kind of lost its way with Silicon Valley,
00:15:35.500 | which was odd because Silicon Valley
00:15:37.420 | was a natural constituency of the Democratic Party.
00:15:41.340 | You know, Ben and Mark, on their pod,
00:15:43.740 | they said they started to see some things 10 years ago,
00:15:47.300 | in the 2010s, that started to cause some concerns.
00:15:51.180 | And he pointed out, you know, this idea
00:15:53.140 | that Zuckerberg pledged to give 99% of his wealth away,
00:15:56.940 | and he was attacked for it.
00:15:58.580 | And like, starting to see some cracks in the wall.
00:16:01.300 | And you're like, hold on a second here.
00:16:02.740 | This should be something that everybody gets excited about,
00:16:06.460 | not something that everybody criticizes.
00:16:09.180 | But here we find ourselves in 2024,
00:16:12.140 | and I think, you know, the pendulum had just swung too far.
00:16:15.820 | I do think that you're gonna see a Democratic Party
00:16:18.980 | that, you know, looks in the mirror,
00:16:20.980 | comes back to, you know, Silicon Valley
00:16:23.500 | on a lot of these issues.
00:16:25.100 | Trump was listening and said,
00:16:26.380 | "Listen, you think you ought to get a green card
00:16:28.460 | "for people who've taken a four-year college degree?
00:16:31.100 | "We're gonna staple a green card
00:16:32.460 | "to everybody who graduates from college."
00:16:34.060 | It's a huge deal. - That was amazing.
00:16:35.060 | I think that most everyone in Silicon Valley agrees
00:16:39.180 | that, like, tripling, quadrupling, maybe even 5X-ing
00:16:43.780 | the size of legal, skilled immigrants
00:16:47.060 | is huge for innovation,
00:16:48.780 | huge for everything that we wanna do.
00:16:51.300 | And everyone's on the same page on that.
00:16:53.420 | And so it's ironic that no one's been able to get there.
00:16:56.900 | And for, I think Trump said that on All In, right?
00:17:00.180 | Like, that's amazing.
00:17:02.340 | Like, you're gonna get a lot of support for saying that.
00:17:04.860 | - Right, and I, you know, again,
00:17:07.580 | it's not about taking political positions here,
00:17:10.100 | but the case we wanna make
00:17:12.740 | is that if you attack innovation, entrepreneurship,
00:17:17.380 | and capitalism, it will be to the detriment
00:17:19.820 | of the United States, right?
00:17:21.060 | - I agree with that. - And that's something
00:17:21.900 | both parties should support. - I agree with that.
00:17:23.500 | And that's why, regardless of who you're gonna vote for,
00:17:27.260 | I am, I was, I think I was pleased
00:17:32.260 | that Ben and Mark brought the issues to the forefront,
00:17:36.100 | you know, by doing this, like taking a stand.
00:17:39.260 | Because otherwise, you just sit back and say,
00:17:42.420 | "Oh, I have to vote this way
00:17:43.660 | because that's what everyone does here."
00:17:45.660 | And it, like I said, I think it leads
00:17:47.860 | to being taken advantage of a little bit
00:17:51.020 | and not being paid attention to.
00:17:52.740 | - Well, and let's end this section with this.
00:17:55.820 | You know, over the 20 years, you know,
00:17:59.340 | 25 years that you've been here, you know,
00:18:02.380 | yes, we're a long way away from Washington,
00:18:04.740 | but today it looks very different, right?
00:18:07.380 | AI policy runs through Washington, right?
00:18:09.980 | You mentioned American dynamism or national defense.
00:18:12.940 | All of that is running through Washington.
00:18:14.860 | The CHIPS Act, right?
00:18:16.260 | Whether or not we can export chips to certain countries
00:18:18.860 | runs through Washington, right?
00:18:20.980 | I find myself after a 10-year hiatus in Washington,
00:18:24.420 | spending a lot of time over the course
00:18:26.340 | of the last five years.
00:18:28.140 | How do you feel, like, do you think that engagement
00:18:31.500 | with Washington is a good thing?
00:18:33.820 | I know that, you know, top of mind for you
00:18:36.660 | is regulatory capture.
00:18:38.060 | You and I had a conversation about, you know,
00:18:40.660 | this rural broadband initiative.
00:18:42.780 | - Oh, gosh.
00:18:43.620 | - You know, tell us a little bit about that.
00:18:45.540 | - Yeah, I mean-
00:18:46.380 | - And how to balance engagement
00:18:48.180 | without falling into, you know,
00:18:51.260 | the abyss of regulatory capture.
00:18:53.260 | - I'm going to just admit, like, that I'm very skeptical.
00:18:58.260 | - Yeah.
00:19:00.060 | - Like, the way that legislation gets written
00:19:04.020 | is it gets written by the incumbents.
00:19:05.660 | We're already seeing, you know, this massive fight in AI,
00:19:09.500 | and we've pointed to these articles before,
00:19:11.980 | like the Politico article,
00:19:13.140 | but the leaders with their closed source AI models
00:19:17.820 | are spending more money than any startups ever spent,
00:19:22.540 | you know, in lobbying efforts.
00:19:24.620 | And they're drafting the language
00:19:27.700 | that ends up in the EU doctrine, you know?
00:19:30.100 | And so they're going to look at,
00:19:32.260 | I don't know why they wouldn't look after themselves.
00:19:34.620 | The thing that it would take--and I've told
00:19:37.980 | some people that follow me that I'm going
00:19:40.060 | to fund some research
00:19:42.020 | to study best practices in other countries--
00:19:44.740 | is putting people in agencies
00:19:48.380 | that are non-conflicted
00:19:50.660 | and smart enough about the subject matter.
00:19:53.900 | Very hard to do.
00:19:55.020 | - Right, right, right.
00:19:56.660 | - That actually write the policy
00:19:59.060 | on behalf of the consumer or the citizen.
00:20:01.860 | - It also means, I think,
00:20:03.260 | that engagement by Silicon Valley to fight back
00:20:06.180 | against the incumbents who are trying to do the capture.
00:20:09.140 | Right?
00:20:09.980 | So, I mean, if you're sitting here
00:20:11.540 | on all of these issues, sitting on your hands, you know,
00:20:14.220 | then that's a problem.
00:20:15.380 | But I will also stipulate
00:20:16.940 | that there are now large incumbents in Silicon Valley.
00:20:19.980 | You know, you referenced some of the leading AI companies
00:20:22.420 | that have closed models that may, in fact,
00:20:24.660 | want to engage in the capture themselves,
00:20:27.740 | you know, and prevent open source.
00:20:30.300 | - Let's go a little bit down the Elon case,
00:20:34.100 | because I do think, I don't know who to blame it on.
00:20:37.780 | Maybe if you're a Democrat,
00:20:40.660 | I'll just pitch this as someone that's on neither side.
00:20:45.340 | But like, if you're a Democrat,
00:20:46.620 | maybe you see a way that Elon picked the fight with Biden.
00:20:50.940 | I think if you're on the other, if you're on Elon's side,
00:20:53.420 | he kind of feels like Biden picked the fight.
00:20:55.860 | - Right.
00:20:56.700 | - We do know there was an EV summit
00:20:59.260 | where he was not invited.
00:21:00.540 | - Correct.
00:21:01.380 | - And holding an EV summit without inviting Tesla,
00:21:04.140 | who has like 95% of the shipments in the US,
00:21:07.700 | is, I don't know how you even do it with a straight face.
00:21:11.340 | - Right.
00:21:12.180 | - Like, it's theater.
00:21:13.020 | Like, that's flat out.
00:21:13.860 | - And I don't even think they're hiding it, right?
00:21:15.460 | Like, when Biden was signing some bills in the Rose Garden,
00:21:19.060 | he's surrounded by six people in union jackets, right?
00:21:21.660 | - And he tried to pass an EV credit
00:21:24.260 | that would only be eligible to a car built by a union worker.
00:21:28.580 | - Right, right.
00:21:29.540 | - And, well, one, I just,
00:21:31.980 | I mean, it's certainly not pro-innovation.
00:21:34.460 | - Correct, correct.
00:21:35.300 | - And so, anyway, then there's this other one
00:21:37.940 | that's really bizarre.
00:21:39.340 | The FCC, for a long time,
00:21:42.540 | has overseen rural broadband initiatives.
00:21:46.660 | - Right.
00:21:47.500 | - And there's immense regulatory capture in this.
00:21:49.540 | Most of these programs are funded cost plus.
00:21:51.980 | I happen to have some rural property in Texas,
00:21:55.300 | and if you ask for anything, they'll do it,
00:21:59.220 | because they get to bill the government cost plus,
00:22:02.500 | 'cause you're in a rural location.
00:22:04.300 | So, I could get a fourth phone line,
00:22:06.300 | and they would, like, dig a two-mile trench in rock,
00:22:11.300 | just 'cause they get to make 30% profit
00:22:14.380 | on whatever the activity is.
00:22:16.380 | So, anyway, they decided they were gonna spend 42 billion
00:22:20.060 | on rural broadband, and between 10 years ago and now,
00:22:24.060 | Starlink's been built.
00:22:25.460 | And there's just, I don't know a better way to say this,
00:22:29.940 | but, like, if you want broadband
00:22:32.340 | in the middle of nowhere in America,
00:22:34.660 | Starlink's perfection.
00:22:37.860 | - Yeah, right, right.
00:22:39.780 | - That's what this was designed for.
00:22:41.700 | In fact, because of the way Starlink works,
00:22:44.540 | if you're by yourself and isolated in an area,
00:22:48.980 | you'll get much better Starlink performance
00:22:51.580 | than if you were in a city.
00:22:52.900 | It couldn't be better.
00:22:55.300 | And you're gonna get, like, you'd be shocked
00:22:58.780 | you could get 200 megabits or 300 megabits.
00:23:01.460 | Now, the powers that be at the landline companies
00:23:05.860 | have written the laws that say,
00:23:08.020 | "Oh, well, will you stay up during a rainstorm?"
00:23:12.180 | - Right, right, right.
00:23:13.180 | - And so they disqualify Starlink,
00:23:15.540 | 'cause it might not work if it's the heaviest rain,
00:23:18.660 | like, for five minutes a year.
00:23:21.180 | And it's just so stupid.
00:23:22.540 | In this recent hurricane that went through Houston,
00:23:27.020 | I paid attention, but, like,
00:23:29.340 | I was looking for this for this reason,
00:23:30.820 | but, like, some of the landline broadband things went out.
00:23:35.140 | And guess what?
00:23:35.980 | They were out three days.
00:23:37.460 | - Right.
00:23:38.300 | - 'Cause someone had, there was, like, a flooded cabinet.
00:23:40.500 | Someone had to go in and take that apart.
00:23:43.020 | So Starlink may not have worked
00:23:44.700 | during the worst hour of the storm,
00:23:47.820 | but it worked the next day.
00:23:49.100 | - Right.
00:23:49.940 | - As soon as you could get power
00:23:50.780 | and plug in your antenna.
00:23:51.620 | - So for 42 billion, what did we get?
00:23:54.580 | - Well, that's the other thing.
00:23:55.940 | So according to Brendan Carr,
00:23:58.140 | who is one of the FCC commissioners,
00:24:00.940 | he says zero, zero people have been connected.
00:24:03.900 | So not only, but, yeah, I mean, look,
00:24:07.540 | everyone knows government's not very successful
00:24:09.860 | in implementing things,
00:24:11.020 | but they disqualified the very best new solution
00:24:16.020 | that's perfect, perfect.
00:24:18.580 | - Where the marginal cost of implementing it
00:24:20.940 | is a Starlink antenna.
00:24:21.780 | - The cost of the antenna, the cost of the antenna.
00:24:23.860 | - Right, right.
00:24:24.700 | - And you could strike a deal with Starlink
00:24:26.740 | and probably get that cost reduced
00:24:28.460 | or do it at cost, like the government could have,
00:24:31.700 | you know, gone in and done like a bomb analysis
00:24:34.060 | and say, let's build one for the...
00:24:35.540 | - And I think it's...
00:24:36.380 | - You could have done any of those things.
00:24:37.220 | You could have probably got it down to $250.
00:24:40.020 | And instead of a $250 marginal cost
00:24:43.300 | to implement a rural broadband,
00:24:44.980 | we're going to get bulldozers out
00:24:47.820 | and drop fiber lines to a ranch
00:24:51.260 | in the middle of nowhere.
00:24:52.340 | - Right.
00:24:53.180 | - It's stupid.
00:24:54.020 | - Right, right.
00:24:54.900 | I can only surmise that that happened for political reasons.
00:24:59.900 | - Well...
00:25:00.980 | - Or regulatory capture, one of the two.
00:25:03.860 | - This is related to another topic
00:25:05.420 | you and I have talked about.
00:25:07.100 | And this all goes to like,
00:25:08.580 | why has there been this realignment
00:25:11.220 | or perceived realignment in Silicon Valley?
00:25:13.740 | I wonder if there's something else going on as well.
00:25:16.940 | You know, forever we've had kind of this fourth estate
00:25:20.340 | in the media.
00:25:21.180 | It's called the fourth estate
00:25:22.260 | because we have three branches of government
00:25:24.860 | and you add the nightly news and the newspapers
00:25:27.980 | that worked very closely with those branches of government
00:25:32.180 | to report on what was going on.
00:25:34.260 | But really over the course of the last five years, 10 years,
00:25:37.420 | we've had this explosion in the democratization of media,
00:25:42.260 | if you will, right?
00:25:43.740 | And you have all these citizen journalists.
00:25:46.380 | And just over the last four years,
00:25:48.020 | let's think since COVID really, right?
00:25:50.660 | You have people who feel like,
00:25:53.460 | you have this libertarian streak in Silicon Valley,
00:25:57.060 | which I would agree is already slightly distrustful
00:26:00.100 | of government, right?
00:26:02.100 | And now they feel like based on this reporting,
00:26:04.940 | they were lied to about COVID.
00:26:07.700 | They were lied to about censorship in big online media.
00:26:12.700 | They were lied to about Biden's health
00:26:15.420 | being the latest thing.
00:26:18.220 | And so I think there's something deeper going on here
00:26:22.420 | that you've kind of lit this fuse
00:26:25.020 | where people are looking around and saying,
00:26:27.340 | this just doesn't feel like government
00:26:30.220 | is shooting us straight
00:26:31.620 | and that there's a real dialogue going on
00:26:33.900 | about what's best for America.
00:26:35.580 | So I think there's been a real breach of trust
00:26:38.240 | that also needs to be mended.
00:26:40.240 | And so again, I think there are people of goodwill
00:26:44.860 | on both sides, but when I have people,
00:26:47.640 | when I hear people say, well, there's a realignment
00:26:50.060 | because you just have a bunch of rich white billionaires
00:26:53.140 | in Silicon Valley that are doing
00:26:54.580 | what's in their best interest,
00:26:56.460 | I think they need to really look in the mirror
00:26:58.700 | and take a deeper analysis as to the situation, right?
00:27:02.780 | And certainly there are problems
00:27:05.820 | that have been well-documented about Trump.
00:27:08.340 | There've been problems on the other side,
00:27:11.180 | but I think if Democrats want to win back
00:27:13.540 | this natural constituency in Silicon Valley,
00:27:16.140 | there needs to be a real realignment
00:27:17.920 | on some of these issues.
00:27:19.440 | Let's shift gears.
00:27:21.040 | We have open source going frontier today.
00:27:23.640 | Meta dropped a new model.
00:27:25.160 | It's a 405-billion-parameter model.
00:27:28.760 | It's the first really frontier-quality open source model.
00:27:32.360 | It now stands on top of the leaderboard
00:27:34.360 | for a lot of charts.
00:27:35.580 | Importantly, they updated their community license, right?
00:27:40.440 | So now this can be open and permissively licensed,
00:27:43.640 | including for commercial use,
00:27:45.920 | synthetic data generation, distillation, fine tuning.
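(For listeners who want to see what that license change enables in practice, a minimal sketch follows. It assumes the weights are published through Hugging Face's transformers library and that you've accepted Meta's license for the checkpoint; the repo ID below is an assumption, not something stated in the episode.)

```python
# Minimal sketch, not an official example. Assumes access to the checkpoint
# on Hugging Face and an accepted Meta community license.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# Generate a synthetic training example -- the kind of downstream use
# (synthetic data, distillation, fine-tuning) the updated license permits,
# including commercially.
prompt = "Write a customer-support question about a late shipment."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```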
00:27:50.440 | You know, Zuck wrote a letter in defense of open source.
00:27:53.240 | I saw it. Beautiful, beautiful letter.
00:27:54.920 | You tweeted about it.
00:27:56.240 | Elon then responded to your tweet
00:27:58.360 | and said it was an impressive model.
00:28:00.040 | He said Zuck deserved a lot of credit for open sourcing it.
00:28:03.640 | So what did we learn today?
00:28:05.760 | Because there was some debate as to whether 405
00:28:07.840 | was going to be open sourced or not, Bill.
00:28:09.800 | What did we learn today and why is this so important?
00:28:12.240 | Well, I think there's two things we could talk about.
00:28:14.200 | So you tell me which one you want to talk about first.
00:28:16.200 | There's the strategy behind why a company
00:28:21.200 | like Facebook/Meta would choose this methodology.
00:28:25.960 | And then there's the question of what's happening
00:28:28.200 | with the large language model competition.
00:28:31.080 | Which one did you want to cover?
00:28:33.120 | Let's talk about just Meta's decision to do this
00:28:36.280 | and the commitment to open source and why it's so important.
00:28:39.280 | So Mark, first of all,
00:28:41.040 | everyone should go read what Mark wrote.
00:28:42.480 | It's not ultra long.
00:28:43.960 | It's about four or five pages,
00:28:45.320 | but he lays out a lot of it, you know?
00:28:47.360 | And he talks about when Meta has run
00:28:51.080 | into other closed system companies,
00:28:53.920 | it's created a lot of frustration and limitation
00:28:56.600 | to what he perceives as innovation.
00:28:59.040 | And he mentioned the Apple world
00:29:01.440 | that he's kind of forced to live in.
00:29:03.080 | Right.
00:29:04.120 | Something happened over the past, I'd say 15 years,
00:29:09.200 | where some of the smartest companies
00:29:12.640 | in Silicon Valley have developed a new strategic play
00:29:16.280 | where they use open source as a defensive weapon
00:29:19.640 | rather than as an offensive weapon.
00:29:21.480 | That's my terminology.
00:29:23.480 | One of the first to do it was Google with Android.
00:29:27.120 | And it's really hard for people to put themselves
00:29:30.520 | in the mindset when that iPhone shipped
00:29:33.040 | and when Apple had only let AT&T have it
00:29:36.200 | and it was on AT&T's term.
00:29:38.480 | And Google convinced the rest of the world,
00:29:42.400 | all the handset manufacturers,
00:29:44.160 | the carriers to support this model
00:29:46.080 | by telling them it was gonna be open.
00:29:48.080 | So I think they backed up on that a little bit,
00:29:50.440 | but that worked.
00:29:51.440 | Like that got them to get behind this thing.
00:29:54.240 | And I guess if you look at the China market,
00:29:57.080 | the openness that Verizon and Samsung thought
00:30:04.680 | they were getting into with Android did play out there.
00:30:08.080 | Like there's no Google or Apple in the China market.
00:30:13.080 | But then others started to do this too.
00:30:16.960 | Google did it again with Kubernetes,
00:30:18.960 | which is a piece of technology that lets you
00:30:21.000 | move your workloads very easily between clouds.
00:30:24.200 | They were worried Amazon was running away with AWS.
00:30:27.360 | So they took a piece of technology
00:30:29.000 | that was inside their company, Kubernetes,
00:30:31.800 | and made it free.
00:30:32.960 | And they got the Linux Foundation to help organize,
00:30:35.520 | which the Linux Foundation is very good at.
00:30:38.520 | And they got IBM.
00:30:40.280 | They got all these hundreds of players behind Kubernetes.
00:30:44.040 | And eventually it became so successful
00:30:46.600 | that Amazon had to give up and support it.
00:30:49.040 | And that has allowed more fluid transfer of workloads,
00:30:52.400 | and it's leveled the playing field a bit.
00:30:54.440 | Facebook created this thing called Open Compute Project.
00:30:59.800 | And they basically said,
00:31:02.000 | if you wanna sell computers into our data center,
00:31:05.480 | we're gonna write a list of specifications
00:31:10.800 | that those things have to meet in order to go in there.
00:31:14.440 | Well, one of them is basically no proprietary technology.
00:31:18.800 | So it's almost like a patent deflection move
00:31:21.200 | that says we're only gonna support open source things here.
00:31:25.720 | And by the very definition, if it qualifies to come in,
00:31:28.920 | it doesn't have any proprietary technology,
00:31:31.800 | which has been huge.
00:31:33.880 | And by the way, people say,
00:31:35.320 | oh, well, who would agree to that?
00:31:36.480 | Well, Dell sells into there, Cisco.
00:31:39.280 | Like all these people end up
00:31:40.600 | kind of just agreeing and selling in.
00:31:42.960 | And so that is a remarkable move by Facebook
00:31:47.000 | to lower their cost of their infrastructure
00:31:48.920 | and to keep it lower and to keep themselves
00:31:51.200 | out of position where they can be held up.
00:31:54.080 | There's been a recent move in the map world
00:31:56.200 | to create an open source map project
00:31:59.360 | 'cause they're worried about Google having too much power.
00:32:02.160 | So this has become a kind of a new go-to move by,
00:32:05.360 | you need a lot of resources to do this.
00:32:07.600 | You can't, like, obviously there are startups
00:32:10.200 | that get behind open sources,
00:32:11.840 | but if you're gonna play this defensive move,
00:32:13.720 | you need to be a big player.
00:32:15.640 | You usually need to get the help of others.
00:32:18.480 | Here, Meta, I think, just became concerned.
00:32:23.120 | And you and I have talked about this,
00:32:24.960 | but I think some of the players in the AI world
00:32:27.960 | brought this upon themselves
00:32:29.200 | by claiming that they were gonna change the world.
00:32:31.720 | And so in such dramatic ways, you awaken the giants, right?
00:32:35.280 | If you cause everybody's investor call to be,
00:32:40.280 | "What are you gonna do about AI?"
00:32:42.200 | You're gonna wake up all the giants, right?
00:32:46.040 | And so in this case, Meta didn't want anyone
00:32:50.200 | to end up in an advantaged position to them
00:32:54.240 | because they were so much better in AI,
00:32:56.360 | or they didn't want to become dependent
00:33:01.680 | on a third party for an AI tool that they needed
00:33:05.320 | and couldn't have.
00:33:06.640 | - I thought one of the most interesting parts
00:33:08.560 | of the letter, and Mark's pod on it today,
00:33:13.040 | is, you know, he said it's become personal.
00:33:16.400 | First, he points to Linux and Unix,
00:33:18.480 | and it's just been a great unlock
00:33:21.240 | by unleashing the power of the long tail-
00:33:23.840 | - You know I'm a huge, huge fan of this idea.
00:33:26.360 | - The long tail of developers.
00:33:28.000 | But one of the things he said, he's like,
00:33:30.200 | "This is personal to me," right?
00:33:32.960 | Because he said with Apple, you know,
00:33:35.280 | where we saw ecosystem control by a single company
00:33:38.840 | that extracted most of the rents
00:33:40.560 | out of the mobile ecosystem.
00:33:42.840 | He said it was bad enough that we had to pay a big tax,
00:33:45.640 | right, because that tax hurts all the developers
00:33:48.400 | in the ecosystem.
00:33:49.320 | He's like, "But you could tolerate the tax."
00:33:51.520 | He said, "But it was absolutely soul-crushing."
00:33:54.280 | He used that phrase, which I thought was interesting.
00:33:56.720 | When you develop, work really hard on a product improvement,
00:34:00.240 | and then you have a company that says,
00:34:01.240 | "No, you can't release that into your product
00:34:03.880 | for whatever reason."
00:34:04.880 | - Right, right, right.
00:34:06.080 | - And so it seemed to me
00:34:08.320 | that this has become extraordinarily personal, right?
00:34:12.000 | Here you have a company that produces tens of billions
00:34:15.480 | of dollars of profit in the back room.
00:34:17.480 | It can fund all of, you know, all of these models
00:34:21.120 | for as far as the eye can see,
00:34:22.920 | and they've made a decision to make it open.
00:34:25.280 | I think that has broad economic implications
00:34:28.280 | for the entire ecosystem, right?
00:34:30.600 | If you're a closed model company,
00:34:33.080 | the question first is, okay, I'm going to have to compete
00:34:35.960 | and keep up with this frontier-level competition.
00:34:38.760 | That's hard enough.
00:34:39.600 | It takes a lot of resources.
00:34:41.280 | But number two, Mark's gonna give it away,
00:34:43.600 | you know, Meta's gonna give it away for free.
00:34:46.160 | Now you have to develop a business model
00:34:49.640 | that can compete with free.
00:34:51.120 | - Yep, and so this reminds me,
00:34:53.120 | and we'll put a link to it in the notes,
00:34:55.400 | but years ago I wrote a blog post about Android,
00:34:58.880 | and I said, "It doesn't matter,"
00:35:01.480 | this is so similar to what you just said,
00:35:04.160 | "It doesn't matter if they can make money on Android."
00:35:06.480 | It turns out they figured out how to.
00:35:08.920 | But it creates a moat that's so wide around the castle.
00:35:13.680 | I think I drew a picture, I talked about like,
00:35:16.800 | okay, yes, you have the water moat,
00:35:18.600 | but now they're gonna char the earth
00:35:20.300 | for five square miles outside, around the water.
00:35:24.520 | But you're not getting to our goddamn castle.
00:35:27.440 | And so it has, and you know,
00:35:30.440 | not every company can afford
00:35:31.960 | a multi-billion dollar moat extender,
00:35:36.280 | but it makes a lot of sense for these companies to do it.
00:35:39.400 | And there's no one today that is listening
00:35:43.920 | to Mark's podcast, playing with the new models.
00:35:46.540 | By the way, on Groq, they're like nutty fast.
00:35:50.260 | And saying, "Oh, like that guy's stupid."
00:35:54.420 | Like, no one's saying that.
00:35:56.180 | He looks like a genius today.
00:35:57.860 | He also said today that they've already laid out
00:36:01.740 | the compute cluster for Llama 4.
00:36:03.620 | So this was Llama 3.1, a large 405B model,
00:36:08.620 | also distilled into
00:36:11.780 | 70B and 8B versions.
00:36:14.180 | But they said they've already laid out
00:36:15.660 | the compute cluster for Llama 4,
00:36:17.860 | they've already laid out the data architecture for it,
00:36:20.680 | they've run the research on it.
00:36:22.040 | So it seems to me, and our team's best guess,
00:36:25.560 | is that Llama 4 is gonna come
00:36:27.200 | in the back half of the year, right?
00:36:28.920 | Maybe in November, December timeframe.
00:36:31.120 | They probably don't wanna be more
00:36:32.440 | than a couple months behind GPT-5,
00:36:34.540 | so they probably think you're gonna get a new release
00:36:37.560 | right after GPT-4o from OpenAI,
00:36:41.280 | that they can then try to quickly leapfrog with Llama 4.
00:36:44.560 | But what was interesting, he said,
00:36:45.880 | not only have we laid out Llama 4,
00:36:47.440 | but he said it's kind of fun and fascinating,
00:36:49.740 | we've laid out 5, 6, and 7, okay?
00:36:53.160 | And he said the models are all gonna be bigger,
00:36:55.740 | they're all gonna cost a lot more money, billions of dollars,
00:36:59.200 | and he kind of alluded to what Satya said
00:37:03.200 | a few weeks ago that you and I heard,
00:37:04.880 | which he said, this isn't gonna be a straight line.
00:37:07.960 | He reminded folks of the internet bubble,
00:37:11.340 | and said you may have to go through a bubble here.
00:37:13.800 | So if you're a venture capitalist, Bill,
00:37:16.040 | and you're hearing this, okay?
00:37:18.000 | First, you gotta go compete with free.
00:37:20.240 | Second, you're gonna have to spend billions for a long time,
00:37:23.640 | and the incumbents are going to compete.
00:37:26.200 | Third, that we may in fact have to cross this chasm,
00:37:30.920 | some bubble-like chasm.
00:37:32.720 | I think what it does is it's also a shot across the bow
00:37:37.720 | of those folks who might wanna start closed-model companies
00:37:41.920 | to go compete with that.
00:37:43.640 | - No doubt, and one other thing I would highlight
00:37:46.720 | like for the listening audience,
00:37:49.640 | like every startup, whether you are an AI startup
00:37:54.640 | or whether you're a startup that's been around a while
00:37:56.920 | that wants to enhance their product with AI,
00:38:00.360 | this should make you tickled pink.
00:38:02.840 | Like you should be so happy that he made these decisions.
00:38:07.200 | 'Cause as you said, he went further
00:38:10.440 | in terms of openness today than the model was before.
00:38:14.320 | More permissions, and the Hugging Face team was clapping
00:38:17.920 | and celebrating on Twitter as well for this reason.
00:38:21.080 | And so for, as Mark and Ben like to call it, little tech,
00:38:26.000 | this is phenomenal.
00:38:27.080 | - Correct.
00:38:27.920 | - Like phenomenal. It may not be for OpenAI,
00:38:30.400 | and I don't know Brad Gailbreath,
00:38:34.280 | but I saw a funny tweet today where he said,
00:38:37.120 | is OpenAI the new Netscape, which is somewhat provocative,
00:38:41.400 | but Mark created that question.
00:38:43.880 | And I would say this, I'm gonna stop and let you go,
00:38:47.920 | but like last thing, I just don't think
00:38:52.800 | that Sam Altman realizes when he talks about things
00:38:57.800 | in such a grand fashion,
00:39:00.400 | how it makes people like Zuckerberg feel,
00:39:02.880 | or if Zuckerberg has to take a question,
00:39:05.800 | is AI gonna make you irrelevant?
00:39:08.320 | Like it gets under his skin a little bit.
00:39:11.100 | - I do think about it this way, you know,
00:39:14.680 | if you're OpenAI, you got to go raise your next
00:39:17.080 | $4 or $5 billion, right?
00:39:19.680 | Remember, Meta had allocated $20 billion a year
00:39:24.400 | to reality labs.
00:39:26.000 | And there was a report last week that they may, you know,
00:39:29.200 | tighten their belt on reality labs to the tune of 20%.
00:39:33.120 | Okay, so that's $4 or $5 billion of savings
00:39:36.800 | on a research project that they have in reality labs
00:39:39.900 | that they can reallocate to AI.
00:39:42.080 | Now, what I would say on the OpenAI front,
00:39:45.040 | I actually think that they're doing extraordinarily well.
00:39:48.820 | And here's the interesting thing about OpenAI
00:39:51.360 | in the face of all this. You know,
00:39:54.160 | Zuckerberg said, I wanna have the number one
00:39:56.360 | consumer application in AI by the fall.
00:39:59.700 | But if you look at ChatGPT since GPT-4o was released,
00:40:03.460 | web visits have gone from 1.8 billion
00:40:06.180 | to 2.6 billion visits per month.
00:40:08.700 | DAUs have gone from 60 million to 100 million.
00:40:12.900 | And so, you know, as I say to my team,
00:40:15.980 | there's only one way, you know, to build a business.
00:40:19.140 | Either you got to get a consumer to pay you,
00:40:21.340 | or you got to get an enterprise to pay you.
00:40:23.560 | And the only other company that's really shown
00:40:27.580 | traction with consumers paying, you know,
00:40:30.300 | at that scale is OpenAI.
00:40:32.700 | So the question is, can they build a consumer product
00:40:35.940 | that is durably better than the alternative, right?
00:40:40.020 | That Meta is gonna put out there.
00:40:41.340 | And Mark mentioned that there are hundreds of millions
00:40:43.780 | of people using Meta AI,
00:40:45.100 | but they're not really paying a subscription
00:40:47.860 | and using a standalone product, right?
00:40:50.380 | It's embedded in all the other products.
00:40:52.540 | A lot of the time you're using it,
00:40:53.800 | you probably don't even know you're using it yet.
00:40:56.280 | But I do think that Meta has a history
00:40:58.700 | of grinding out good products,
00:41:00.740 | and I suspect it will get there.
00:41:02.620 | But no doubt about it,
00:41:03.860 | this is gonna put a lot of pressure
00:41:06.140 | on the closed source model companies.
00:41:10.420 | And I mean, like those numbers you're referring to,
00:41:16.660 | I think, are just at a high level.
00:41:18.840 | They're not necessarily the paid constituency.
00:41:21.620 | No, no, those are total users.
00:41:23.540 | - Yeah, and I, playing around with credit card data,
00:41:27.000 | I think OpenAI is about $2 billion,
00:41:29.480 | Anthropic's about $100 million,
00:41:32.040 | and Perplexity's at like $25 million.
00:41:34.240 | Just, I'm guessing based on credit cards.
00:41:36.340 | - I think that's directionally right.
00:41:37.280 | - On the $20 thing, but at 12 months,
00:41:40.920 | the retention, even at OpenAI, is like 35%.
00:41:44.800 | So I doubt that anyone that is looking at that,
00:41:49.920 | even at $2 billion, feels great
00:41:53.060 | about that as a durable business model.
00:41:55.180 | And you add to it, Mark saying,
00:41:58.980 | Mark's not gonna charge.
00:42:00.420 | - It would feel really great
00:42:01.700 | if it didn't cost you $3 billion every year
00:42:04.840 | to build a new model in order to compete.
00:42:06.900 | I mean, ultimately, the COGS in these businesses, right--
00:42:12.220 | because these are super quickly depreciating assets, right?
00:42:15.740 | You have to keep reinventing that model every single year.
00:42:18.860 | - By the way, I pay for all of 'em.
00:42:20.120 | I play, 'cause I love playing with 'em,
00:42:21.800 | and they're so much fun and so fantastic.
00:42:24.160 | I would say, I think one of the retention problems
00:42:26.760 | is simply all the substitutes that are available.
00:42:30.000 | And then there's whether speed's gonna matter.
00:42:33.300 | Like, there were these old stories
00:42:35.680 | of Google making itself like 30 milliseconds faster,
00:42:39.560 | and there were more searches.
00:42:41.160 | And play with the Llama Groq demo.
00:42:45.860 | Like, it's a different experience.
00:42:48.280 | I don't know how much that will drive.
00:42:50.240 | - Yes.
00:42:51.320 | - So anyway, I just think there's a lot of alternatives.
00:42:53.480 | If we're right about Mark's intention--
00:42:57.560 | he took it personally, he's playing defensively--
00:43:00.760 | he's gonna make his version free.
00:43:04.400 | - Yeah, there's no doubt about that.
00:43:06.680 | - And he might run it for four years with no ads.
00:43:08.920 | - Right.
00:43:09.760 | And I think, you know, this fast inference,
00:43:11.700 | and, you know, it's Groq, it's Cerebras with fast inference,
00:43:16.400 | you know, Fireworks, another Benchmark investment,
00:43:19.320 | I think, you know, is inference optimized
00:43:22.040 | on top of NVIDIA, right?
00:43:25.480 | You're now getting to the point with these inference engines
00:43:29.160 | that they're going to be, you know,
00:43:31.240 | it's faster than you can read, certainly.
00:43:33.600 | You and I had a conversation about this.
00:43:35.600 | That doesn't really matter
00:43:36.760 | because all of these interactions, you know,
00:43:39.200 | particularly when we think about
00:43:40.680 | multi-chained agent interactions, right?
00:43:43.840 | You may have a hundred different interactions,
00:43:46.120 | computer to computer interactions
00:43:48.120 | before you ever see an answer.
00:43:50.680 | And so speed really matters
00:43:52.480 | because computers can talk to each other at, you know,
00:43:55.240 | the fastest speed you can possibly, you know,
00:43:57.520 | kind of contemplate.
00:43:58.760 | And so reducing that latency unlocks a lot of innovation,
00:44:03.760 | you know, that will be good.
00:44:05.020 | And I think there's a lot of that coming.
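(A back-of-the-envelope sketch of why per-call latency compounds in agent chains -- the numbers here are made up for illustration:)

```python
# Toy model: sequential agent chains multiply per-call latency.

def total_wait_seconds(chained_calls: int, seconds_per_call: float) -> float:
    """Each step waits on the previous one, so latency adds up linearly."""
    return chained_calls * seconds_per_call

print(total_wait_seconds(1, 2.0))    # one 2 s reply feels fine to a reader
print(total_wait_seconds(100, 2.0))  # 100 chained calls at 2 s -> 200 s
print(total_wait_seconds(100, 0.1))  # a fast inference engine -> 10 s
```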
00:44:07.560 | You know, I was really impressed today
00:44:10.480 | with the alliances that Meta had built
00:44:16.320 | you know, and Mark pointed that out,
00:44:18.440 | but really from AWS to Databricks,
00:44:21.960 | to Snowflake, to Accenture, you know,
00:44:25.760 | the Groqs of the world, Fireworks, et cetera,
00:44:27.840 | everybody, you know, launched with optimized versions
00:44:31.560 | of this, they've been playing with it.
00:44:33.040 | So, you know, in the case of Databricks or Snowflake,
00:44:36.480 | they can basically take these optimized models
00:44:39.600 | immediately to their companies.
00:44:41.200 | They can do fine tuning,
00:44:42.400 | they can do synthetic data generation,
00:44:44.720 | whatever their companies might want to do.
00:44:46.560 | So part of the way that open source works
00:44:50.760 | is by running the field, right?
00:44:53.280 | Getting everybody into the pool.
00:44:56.080 | No doubt, while you were saying that,
00:44:57.720 | I went back to the Kubernetes Wikipedia page.
00:45:00.440 | So on launch, the principal competitors were VMware,
00:45:05.440 | Mesosphere, Docker, Azure.
00:45:08.240 | And then, you know, a few months later, AWS came in,
00:45:12.180 | but yes, if you're a participant in the market
00:45:16.760 | who is looking for your own solution to be advantaged,
00:45:21.760 | and there's someone who wants to make a piece of it
00:45:24.880 | free and open rather than closed and paid,
00:45:29.080 | that's awesome for you.
00:45:30.680 | And so you're right, it becomes this attractor
00:45:33.640 | that brings the other parties to the table to play.
00:45:37.360 | And yeah, that's part of why it's so powerful
00:45:41.140 | in this defensive way, what I call defensive strategic play.
00:45:45.680 | You are able to bring the other people in.
00:45:47.800 | - Right, right.
00:45:49.000 | - It's really fun.
00:45:50.280 | And once again, I think it's good for entrepreneurs,
00:45:53.980 | I think it's good for society.
00:45:55.400 | - Right.
00:45:56.320 | - Like 15, 20 year patents on big pharma drugs
00:46:00.400 | are not good for society.
00:46:02.000 | This is the opposite.
00:46:02.880 | This is technology getting freer, cheaper, more available.
00:46:05.880 | - You know, and I'll give you credit for this.
00:46:08.120 | In a lot of conversations I've had in Washington,
00:46:11.120 | your work on regulatory capture has actually made its way
00:46:15.040 | into a lot of Congress people's minds and offices, right?
00:46:18.280 | They're on the lookout for regulatory capture.
00:46:20.640 | - Awesome.
00:46:21.480 | - Right, and so I think it's pretty wild.
00:46:24.520 | Zuck went from having a bullseye on him
00:46:27.240 | in Washington, DC a few years ago
00:46:30.420 | to now being somebody that people point to
00:46:33.200 | as a real asset against closed models.
00:46:36.760 | - Yeah, and I hadn't thought about this
00:46:38.640 | until you just said that.
00:46:39.920 | So thank you for triggering this in my brain,
00:46:42.420 | but maybe there's a bit of a halo you get.
00:46:45.120 | - From open source, for sure.
00:46:46.880 | - Yeah, for sure.
00:46:47.720 | So you look like, you know, you're the good guy
00:46:50.600 | and not the bad guy.
00:46:51.840 | - So, you know.
00:46:54.000 | - By the way, I do think it's worth mentioning
00:46:56.280 | on the AWS thing.
00:46:58.040 | Just from talking to entrepreneurs in the AI space,
00:47:02.920 | Microsoft certainly is advantaged by being,
00:47:07.400 | or Azure is advantaged by being the hosting system
00:47:12.480 | that has access to OpenAI.
00:47:12.480 | - Oh, for sure.
00:47:13.320 | - And Amazon doesn't, so not surprised that--
00:47:18.080 | - AWS was in the alliance, you know,
00:47:19.960 | kind of featured on launch day.
00:47:22.320 | - It makes sense, they need a response to that.
00:47:26.160 | - Yeah, and, you know,
00:47:29.160 | we could talk about this for a long time.
00:47:31.480 | Let's continue on.
00:47:32.400 | We have a few more topics we want to talk about.
00:47:34.520 | We'll make these a little bit more lightning round.
00:47:37.200 | But you had me listen to a podcast, you know, this week.
00:47:41.120 | Excellent podcast with Dr. Jay Bhattacharya and Rick Rubin.
00:47:46.120 | You know, Jay's a Stanford kind of economic epidemiologist,
00:47:50.920 | was one of the senior authors
00:47:57.040 | on the Great Barrington Declaration in October of 2020.
00:47:57.040 | And that was really the seminal paper
00:47:59.600 | that showed that the COVID death rate
00:48:01.600 | was far lower than people thought,
00:48:03.640 | and that the virus had penetrated far deeper
00:48:07.400 | into the population by April of 2020 than people thought,
00:48:12.160 | which then had broad implications
00:48:13.840 | on whether or not we should lock down.
00:48:15.440 | You know, we're now four years later.
00:48:17.760 | Tell us why, you know,
00:48:19.600 | you were telling all your friends to listen to this pod.
00:48:23.400 | You clearly are still agitated about the fact
00:48:27.240 | that there's been no post-mortem here.
00:48:29.920 | Why is this so important for society that, you know,
00:48:34.040 | that we listen to this, that we come to terms with it,
00:48:36.480 | that we think about this?
00:48:37.440 | Yeah, I would encourage anybody
00:48:38.960 | and everybody to listen to this.
00:48:40.400 | It's on Rick's podcast called "Techna,"
00:48:43.480 | I can't pronounce it, "Techna," I give up.
00:48:47.040 | Rick, obviously, is one of the most interesting Americans
00:48:54.320 | on the planet, an incredible record producer.
00:48:58.120 | He wrote this great book on kind of design
00:49:01.200 | and how he thinks about creation that came out last year.
00:49:04.960 | And his podcast has this breadth of guests
00:49:09.800 | that's just phenomenal.
00:49:11.760 | So I listened to it quite a bit,
00:49:13.040 | and I'm a huge Rick Rubin fan,
00:49:14.840 | even though I can't pronounce the name of his podcast.
00:49:17.120 | And yeah, so he had Jay on,
00:49:20.080 | and they talked for two hours,
00:49:22.880 | and they covered, I think, maybe like a good two, three,
00:49:26.680 | four-year period of COVID and everything Jay went through.
00:49:30.520 | And, you know, you were talking earlier
00:49:32.680 | about like the media just kind of blocking
00:49:35.880 | and not doing research.
00:49:37.160 | I think everything from the origin of COVID
00:49:40.440 | to the responsiveness to COVID,
00:49:42.760 | for whatever reason, we were in this weird place,
00:49:45.520 | and some of this came out in the Twitter files,
00:49:48.280 | where if you said anything that wasn't 100% consistent
00:49:53.280 | with what Fauci and Collins were saying,
00:49:55.960 | it was labeled misinformation, conspiracy theory.
00:49:59.480 | And I did a bunch of research at the time,
00:50:02.400 | but I looked and found this incredible New York Times piece
00:50:07.400 | about the tower collapse in Miami.
00:50:10.720 | And maybe we'll put it in the show notes,
00:50:12.920 | 'cause it's just so amazing.
00:50:14.200 | So someone spent, you know, six months,
00:50:17.800 | probably thousands of hours of investigative journalism
00:50:21.760 | to figure out exactly what happened.
00:50:24.160 | And they had these infographics, it was like amazing.
00:50:28.280 | And you look at the origin of COVID,
00:50:30.960 | we have, I don't know, 20 million people dead,
00:50:33.680 | probably the worst catastrophe since World War II,
00:50:37.160 | and no one was looking.
00:50:38.920 | Like no one was looking, no one was doing the work
00:50:41.760 | that these New York Times journalists did
00:50:44.480 | for this collapse of this building.
00:50:46.600 | Yet the consequences were 1,000x, like 10,000x bigger.
00:50:51.760 | And I don't know if it was Trump derangement syndrome,
00:50:54.200 | I don't know, you know, I don't know why.
00:50:56.400 | I mean, and the Twitter files, once again,
00:50:58.280 | had a lot of this where people tried to speak up.
00:51:01.640 | And so to me, one thing should be obvious to everyone,
00:51:05.800 | we don't wanna go through this again.
00:51:08.240 | - Right.
00:51:09.080 | - Yet you look at how people talk,
00:51:12.200 | and even if you hadn't listened to Jay, I would ask,
00:51:17.200 | if this happens again, are we any smarter?
00:51:20.520 | Are we gonna make any better decisions?
00:51:22.560 | - His concern is that we're not.
00:51:24.080 | - Yeah.
00:51:24.920 | - Right, and so I would argue that the reason
00:51:26.480 | to do the postmortem is that, you know,
00:51:31.480 | you had three prominent senior authors on this paper,
00:51:34.360 | 60,000 signatories, and it basically was early
00:51:38.720 | in the pandemic and at odds with the conventional wisdom.
00:51:42.160 | Okay, the conventional wisdom was that you needed
00:51:45.700 | to lock down, keep people separated,
00:51:48.120 | that the disease wasn't that deeply penetrated, you know,
00:51:51.480 | and we could prevent it from spreading.
00:51:54.440 | He was basically making the case, because they had--
00:51:57.840 | - And by the way, he had instincts.
00:51:59.400 | - He had the data.
00:52:00.240 | - He also had instincts that lockdowns
00:52:02.640 | would have massive consequences, which they did.
00:52:05.880 | - Right, right.
00:52:06.720 | - But he wasn't given a voice, and Francis Collins
00:52:09.840 | called him a fringe epidemiologist.
00:52:11.880 | - Epidemiologist, right.
00:52:13.080 | - And this is a celebrated, in fact,
00:52:16.040 | his area of study is economic epidemiology,
00:52:21.040 | and so he's actually thinking about how these things
00:52:25.400 | interact with our global economic system.
00:52:28.160 | He's like the perfect person to listen to.
00:52:30.040 | - Exactly.
00:52:30.880 | - And he got attacked, and I think any time,
00:52:33.560 | and I might include this in how some of the people
00:52:38.360 | responded to Ben and Mark, and I remember I triggered
00:52:42.200 | on Kevin Scott when he called people questioning
00:52:46.280 | LLM scaling trolls.
00:52:47.400 | Like any time your reaction is not to react to the argument,
00:52:51.760 | but to throw this label, this negative label
00:52:54.800 | out at the other side, my brain says,
00:52:57.680 | you don't have the goods.
00:52:58.720 | - Right, right, and that's what he said.
00:53:00.680 | What was interesting here is, you know,
00:53:04.480 | it was all ad hominem attacks.
00:53:06.840 | It was like excommunicating
00:53:11.040 | them from science, you know, because you had the head
00:53:13.600 | of the NIH and you had Fauci calling them
00:53:16.560 | fringe players.
00:53:17.920 | They didn't deal with any of the facts, right?
00:53:19.960 | The science was that they had the sewage data
00:53:23.600 | from Santa Clara County and from LA County.
00:53:26.560 | They knew the penetration, and they, of course,
00:53:30.240 | knew the death rate, you know, that they could calculate.
00:53:32.960 | They knew among younger people that there was almost
00:53:35.360 | zero deaths, and among older people, you needed to take
00:53:38.560 | a different sort of precaution.
00:53:41.240 | Now, Jay, what makes him interesting is, you know,
00:53:45.280 | he actually had two relatives in India who died from COVID.
00:53:49.240 | It wasn't like this guy didn't care about COVID.
00:53:51.920 | It wasn't like he was, you know, some kook.
00:53:53.920 | He was one of the most celebrated scientists
00:53:55.680 | who actually looked at data, did the research,
00:53:58.440 | put together, you know, a very important paper.
00:54:02.400 | Now, this is a person who was celebrated by the NIH.
00:54:05.760 | He was on NIH review panels.
00:54:08.000 | He had written over a hundred--
00:54:09.240 | Right, until he said something they didn't like.
00:54:10.840 | A hundred different papers, et cetera.
00:54:13.520 | And the reason, you know, so when you said,
00:54:16.280 | "I need to listen to this," I'm thinking to myself,
00:54:17.880 | "That's been four years ago."
00:54:19.280 | Yeah.
00:54:20.120 | But he says something at the end of the podcast
00:54:21.800 | where he said, "The problem is we're no better prepared today
00:54:25.760 | "than we were then, and the exact same rush to judgment,
00:54:30.760 | "right, lock everything down, we're prone to do it again
00:54:34.760 | "the next time because we haven't done
00:54:37.000 | "the proper post-mortem here.
00:54:38.240 | "And the proper post-mortem is that we need to do
00:54:42.280 | "the science and the research."
00:54:43.720 | And one thing, obviously, I hope everyone goes
00:54:45.680 | and listens to it, but one thing that he uncovers,
00:54:48.520 | they've gone back through and looked at excess deaths
00:54:51.800 | at a number of different countries,
00:54:53.600 | and lockdowns achieved nothing, right?
00:54:56.080 | And I think he makes a solid argument.
00:54:59.680 | I think most people believe there were consequences
00:55:01.960 | to lockdowns, definitely with kids,
00:55:04.280 | lost generations of kids from school.
00:55:06.280 | Increased suicide rates, lost education.
00:55:09.520 | And all of my kids were in high school,
00:55:11.640 | and it was horrific for them not to have the experience
00:55:14.640 | that so many of us cherish from those moments in time.
00:55:17.800 | It actually, I get a little upset thinking about it.
00:55:21.680 | But there's more than that, really,
00:55:23.400 | because we need to go back through and look at everything.
00:55:25.560 | There are a lot of questions
00:55:28.880 | about the origin of COVID.
00:55:30.120 | If you look at what's being discussed in Congress
00:55:34.320 | and what they're uncovering, it really, really needs
00:55:39.160 | more work to be done.
00:55:40.400 | And I think Catherine Bach and Alina Chan,
00:55:44.120 | they're people that you should follow on Twitter,
00:55:45.960 | they're uncovering that, finally.
00:55:48.520 | There were these incentive systems pushed through hospitals
00:55:51.120 | where you could charge 30% more if a patient had COVID.
00:55:54.880 | So has anyone gone back through and seen
00:55:57.680 | if that was really a good idea?
00:55:59.840 | Or did people take advantage of it?
00:56:01.600 | And did they over-select?
00:56:03.400 | They probably did.
00:56:04.440 | There were questions,
00:56:06.640 | which I brought up during my regulatory capture speech,
00:56:09.160 | about the different types of tests
00:56:11.080 | that could be done, their cost,
00:56:12.560 | and which was more accurate.
00:56:14.440 | And all that stuff, while we don't have a pandemic,
00:56:18.680 | this would be a great time to go in depth.
00:56:20.480 | And I hope that whoever is our next president
00:56:24.520 | will put together a panel.
00:56:26.360 | And I hope Jay's in charge of it.
00:56:27.840 | I have no idea if Jay has that kind of time in his life
00:56:31.440 | to go review everything that happened,
00:56:34.280 | so that we can be better prepared next time.
00:56:36.040 | Well said, well said.
00:56:37.320 | Let's talk about Wiz.
00:56:39.040 | This deal, Google was rumored to be buying Wiz
00:56:42.160 | for 23 billion bucks.
00:56:43.800 | There's a second high profile deal.
00:56:46.920 | The first was HubSpot that seems to have kind of blown up
00:56:50.720 | that Google was supposedly going to do.
00:56:53.440 | Assaf, the CEO of Wiz, came out and said not to worry.
00:56:57.000 | We're gonna get to a billion dollars in revenues
00:56:58.960 | and then we'll go public.
00:57:01.120 | You know, anything to take away here?
00:57:03.960 | I mean, my read on it, just at a quick level is,
00:57:06.920 | you know, if this were to go public,
00:57:08.360 | it would probably trade at 13 or 14 times forward revenue.
00:57:10.840 | That would be like 13 or 14 billion, not 23 billion.
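As a quick back-of-envelope on that read (the ~$1 billion revenue figure is the CEO's stated target, and the 13-14x forward multiple is the estimate from this conversation, not a market quote):

```python
# Implied public-market value at an assumed forward revenue multiple,
# versus the multiple implied by the rumored offer. Inputs are the
# figures mentioned on the pod, not audited numbers.
forward_revenue_b = 1.0    # Wiz's stated path: ~$1B in revenue
assumed_multiple = 13.5    # the ~13-14x forward estimate above
rumored_offer_b = 23.0     # the reported Google offer

print(f"Implied public value: ~${forward_revenue_b * assumed_multiple:.1f}B")
print(f"Offer's implied multiple: ~{rumored_offer_b / forward_revenue_b:.0f}x forward")
```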
00:57:14.360 | Are these things blowing up
00:57:15.520 | because people all of a sudden get cold feet
00:57:17.280 | about antitrust?
00:57:18.680 | Do we think that a Trump administration
00:57:21.440 | is gonna loosen up M&A?
00:57:23.160 | Anything to see here?
00:57:24.360 | Well, I mean, one thing I would say is that these are,
00:57:29.840 | if you're lucky enough to be involved in a situation
00:57:33.480 | where one of these deals comes down,
00:57:35.320 | it's hard to know what to do, you know?
00:57:40.040 | And I mean, it's not nearly this number,
00:57:42.320 | but I remember when Instagram, you know, got their offer,
00:57:47.320 | I think it was 1.1, 1.2 billion,
00:57:50.000 | and they had like 20 employees.
00:57:52.600 | And we had funded the 14th photo sharing site.
00:57:57.000 | And Matt comes in and says, "Should we do this?"
00:58:00.040 | And like, it's easy to want to be on the yes side
00:58:04.520 | in these situations, because it's like, wow, this might,
00:58:09.840 | and I can remember very famous situations
00:58:12.640 | where people walked away from these deals
00:58:14.760 | and never got close to that number again.
00:58:16.720 | So the consequences are high.
00:58:18.800 | It's like a really intense poker moment,
00:58:21.360 | like where you're making a gut call.
00:58:23.560 | And there's also the famous one, you know,
00:58:27.760 | Google didn't sell to Yahoo for a billion
00:58:30.000 | and then became one of the most powerful companies
00:58:31.920 | of all time.
00:58:32.880 | - And Facebook turning down Microsoft at 15 billion.
00:58:34.560 | - Turning down 23 or whatever the number was,
00:58:37.880 | it's a higher number, it makes me talk with a higher voice.
00:58:41.140 | - Right, right, right.
00:58:42.440 | - Because as we've often discussed,
00:58:45.440 | like it gets exponentially thin at the top.
00:58:47.920 | The number of people that make it to a billion
00:58:50.360 | is a fraction of the number that make it to a hundred million
00:58:53.240 | and the number that make it to 10 billion is a fraction.
00:58:56.000 | Like it's, the air's thin as you climb the mountains.
00:58:59.640 | - Yes.
00:59:01.000 | - Kudos to them,
00:59:01.840 | if they're just that confident in the business.
00:59:04.020 | There's some rumors today that it may be
00:59:05.800 | that the CrowdStrike situation makes them feel
00:59:08.920 | that they may have more running room.
00:59:11.000 | I have a hard time believing
00:59:15.000 | this one was a concern about the antitrust
00:59:18.880 | because Google doesn't have a huge security business
00:59:21.800 | and there's actually concern about consolidation
00:59:25.560 | around Palo Alto and Microsoft and Cisco,
00:59:30.060 | sorry, and CrowdStrike, in the industry.
00:59:32.560 | So I would think they'd be supportive of it.
00:59:35.560 | And if you thought Trump were coming in,
00:59:37.080 | I think there'd be less reason even to worry about it.
00:59:40.160 | So I don't see that.
00:59:41.560 | - Yeah, that's probably the one thing I would press on is,
00:59:45.220 | you know, the betting markets I think are now 60/40
00:59:49.040 | with Kamala in the race, of Trump winning the election.
00:59:54.040 | You know, we've been in this period in Silicon Valley
00:59:57.120 | for four years where companies just stopped,
01:00:00.960 | they may have sent their M&A teams
01:00:02.680 | to the beach for all I know.
01:00:04.160 | There just has not been a lot of activity,
01:00:06.060 | certainly among the hyperscalers
01:00:08.840 | relative to what had existed before.
01:00:10.880 | It seems to me they did three things instead.
01:00:13.400 | They bought back their own stock,
01:00:14.760 | they all started issuing dividends
01:00:16.920 | and they bought Nvidia chips, right?
01:00:19.020 | They, you know, they tightened their belts on people.
01:00:22.040 | They weren't spending more money on people.
01:00:23.920 | All the companies became a lot more profitable
01:00:26.080 | and there just was a real lack of M&A,
01:00:28.400 | which is a problem for Silicon Valley.
01:00:30.040 | - Right, and I would say,
01:00:32.320 | like having been in these situations before,
01:00:34.720 | usually the partner that's involved in the company,
01:00:37.640 | the one that's on the board or that led the investment,
01:00:40.400 | they tend to be overly confident.
01:00:44.960 | And that's often balanced
01:00:46.780 | by the other partners back at home.
01:00:50.060 | So when you circle around the table
01:00:51.900 | and have these discussions, they're like,
01:00:53.620 | "Really? Are you sure you don't?"
01:00:55.460 | And if we've been in an environment
01:00:57.420 | where liquidity's been scarce,
01:00:59.520 | the odds that they were sitting there pushing,
01:01:02.940 | "Hey, hey, maybe we should do this,
01:01:04.300 | maybe we should do this," I would think would be high.
01:01:06.480 | We saw this unusual Sequoia event
01:01:10.020 | where they actually recycled Stripe
01:01:12.540 | out of their own dollars, right?
01:01:14.460 | Which speaks to the fact that the LPs feel a need
01:01:18.220 | for capital. - Having to come up
01:01:19.060 | with alternative liquidity in a world
01:01:20.540 | where M&A can't get done.
01:01:21.580 | I mean, M&A was not on Market Ben's list,
01:01:24.500 | but it could have been. - Yes, yeah.
01:01:25.860 | - But it could have been. - It could have been.
01:01:27.580 | - As another problem that we face in Silicon Valley,
01:01:32.060 | part of the vibrancy of this ecosystem relies on
01:01:36.740 | not just the IPO market, but also the M&A market.
01:01:39.580 | - Yeah, and so I think there's always been
01:01:44.420 | a peer pressure to appear particularly confident
01:01:48.580 | and to play the long ball game amongst venture capitalists,
01:01:51.700 | and they almost fall over each other
01:01:54.520 | trying to express this bravado.
01:01:56.820 | So kudos to Index and Sequoia
01:02:00.060 | and the others that turned it down. Greylock,
01:02:05.060 | I'm just looking at the list, Andreessen, Thrive,
01:02:07.500 | they said no, and the team, obviously.
01:02:10.940 | But boy, you know, we talk about valuations
01:02:14.780 | being discounted future expectations.
01:02:17.860 | Nothing screams that like walking away
01:02:20.260 | from 23 billion. (laughs)
01:02:22.340 | - You know, I think it would be less
01:02:24.700 | in the public markets today,
01:02:25.900 | but they've clearly built a terrific business.
01:02:27.940 | - No doubt, I've heard nothing but good things.
01:02:29.580 | And by the way, one thing that could be
01:02:32.780 | very positive that comes out of this
01:02:34.980 | is if they turn and run at the IPO markets very quickly.
01:02:38.020 | - Yes. - Because I would love
01:02:39.940 | for that list to grow. I think we have a bit
01:02:43.020 | of constipation here. - Yeah.
01:02:44.460 | - And we talked about this at the Code Two event.
01:02:47.260 | I think people have gotten overly conservative
01:02:52.180 | about what they need to be a public company.
01:02:54.700 | - Yeah. - So I hope they run
01:02:55.820 | as fast as humanly possible.
01:02:57.780 | We've heard talk about these potential AI IPOs coming.
01:03:02.220 | I'd love for you and me on this call
01:03:05.080 | to be talking about IPOs more than we are.
01:03:07.940 | - Yeah, you know, I just got off a board call
01:03:11.860 | with a company we're mutual investors in
01:03:14.380 | that's gonna come public in September, October.
01:03:17.020 | And, you know, it's exciting to be talking
01:03:20.020 | about companies coming to the public market again.
01:03:21.460 | - No doubt, no doubt.
01:03:23.260 | - And I think its best days
01:03:25.900 | and its maximum innovation are still in front of it, right?
01:03:30.380 | And I was just reminding them,
01:03:32.020 | like all of that can occur post going public, right?
01:03:35.260 | And so I think the same for Wiz as well.
01:03:37.500 | So today, you know, Tesla reported tonight,
01:03:41.940 | they missed their numbers a little bit on margins.
01:03:45.220 | The stock's down 7 or 8%.
01:03:47.380 | But interestingly enough, Elon said on the call,
01:03:50.420 | if you don't believe that Tesla's gonna solve autonomy,
01:03:53.920 | sell our stock, okay?
01:03:56.180 | So he's like, this is not just about the number of cars
01:03:58.780 | we sold in the quarter or gross margin.
01:04:01.180 | Like the reason you're in this stock is because of autonomy.
01:04:04.980 | And so I want to talk a little bit, you know,
01:04:07.140 | you and I spent a whole pod talking basically about 12.3.
01:04:12.140 | But, you know, now they are onto 12.5.
01:04:16.700 | He mentioned 12.5 and 12.6 on the call today.
01:04:20.580 | You know, what we've been able to discern
01:04:22.660 | is these models are much bigger, right?
01:04:24.940 | They're running like 5X larger models on the edge,
01:04:27.540 | on the car, so rather than a billion parameter model,
01:04:29.860 | maybe something like a four or five billion parameter model.
01:04:32.980 | The rates of improvement continue to accelerate.
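For a sense of scale, here's a rough sketch of what that jump means just for storing the weights on in-car hardware. The parameter counts echo the estimates above, and the precisions are generic assumptions, not anything Tesla has confirmed:

```python
# Back-of-envelope weight storage for a dense model: params x bytes per param.
# Ignores activations and runtime buffers; all numbers are illustrative.

def weight_memory_gb(params_billions: float, bytes_per_param: int) -> float:
    # 1e9 params per billion and 1e9 bytes per GB cancel out.
    return params_billions * bytes_per_param

for params in (1.0, 5.0):  # a ~1B-parameter model vs. a rumored ~5x larger one
    for precision, nbytes in (("fp16", 2), ("int8", 1)):
        print(f"{params:.0f}B params @ {precision}: ~{weight_memory_gb(params, nbytes):.0f} GB")
```

So a five-billion-parameter model is on the order of 5 to 10 GB of weights depending on quantization, which is why running it at interactive rates on the car is a real engineering feat.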
01:04:36.620 | And so here's how it manifests itself, Bill.
01:04:39.280 | You know, we went and did some testing of our own on this.
01:04:44.220 | In 12.3, you know, you could take your hands off the wheel
01:04:48.460 | for like 40 seconds before it told you to re-engage.
01:04:51.840 | By 12.4, you could take your hands off the wheel
01:04:56.260 | for up to four minutes, right?
01:04:58.660 | A lot longer period of time.
01:05:00.500 | And it has eye tracking, and so you could look away
01:05:03.700 | a little bit before, you know, it was telling you
01:05:05.900 | to take control of the wheel.
01:05:08.100 | On 12.5 and 12.6, we think it'll get to the point
01:05:11.300 | where it's 100% hands-free,
01:05:13.900 | so it doesn't constantly ding you to grab the wheel.
01:05:17.300 | And you can really start looking away from the road
01:05:19.940 | for longer periods of time.
01:05:21.740 | Now, of course, you shouldn't do that,
01:05:23.260 | but I'm just saying it will do it
01:05:24.660 | before kicking in its warning systems
01:05:26.900 | because they're confident in the efficacy of the model.
01:05:31.300 | And then, you know, Elon said,
01:05:34.380 | because the 12.5 and 12.6 are so good,
01:05:38.340 | they're going to look for approvals in China and Europe
01:05:41.860 | around, you know, around what time--
01:05:42.700 | I was wondering when that was coming
01:05:44.660 | 'cause the Waymos of the world,
01:05:47.980 | they have to apply for licenses
01:05:50.140 | for the car to run independently.
01:05:53.260 | And so I always knew that would be a step
01:05:55.580 | before they were live.
01:05:57.180 | Interestingly, one of our analysts who covers China--
01:05:59.540 | But they said Europe and they didn't--
01:06:01.420 | In China. And not U.S.
01:06:03.900 | Well, in the U.S. they already can run
01:06:06.580 | full self-driving in states like Texas.
01:06:08.820 | But not without a driver, they have to apply--
01:06:10.700 | No, no, no, the driver's in the car.
01:06:12.060 | Yeah, yeah. The driver's in the car.
01:06:14.380 | So one of our analysts went to China
01:06:17.500 | to test all the L2 and L4s.
01:06:19.100 | You've read some of these--
01:06:20.340 | I've watched all the videos online.
01:06:21.180 | You've read the headlines of, like,
01:06:22.740 | deploying 10,000 cars in Wuhan overnight, et cetera.
01:06:26.300 | And here's what she, you know, came back and--
01:06:29.260 | This will be good.
01:06:30.100 | She came back and reported.
01:06:32.460 | She said, you know, they're still catching up to FSD-11.
01:06:37.060 | Now remember, FSD-11 was--
01:06:39.100 | So they're, like, three years behind on full self-driving.
01:06:42.860 | Those were still the deterministic models
01:06:45.220 | before the big breakthrough of 12.3.
01:06:48.700 | She said, but here's what's crazy.
01:06:50.620 | Although they're far inferior models,
01:06:53.380 | the Chinese consumers seem very receptive
01:06:56.620 | to being hands off the wheel, eyes off the road.
01:06:59.780 | And everybody's at this.
01:07:01.180 | Huawei, Xpeng, NIO, all these different companies.
01:07:05.500 | And so she was in them.
01:07:07.100 | And she said she had some really lousy experiences
01:07:10.580 | in the cars, but she was blown away
01:07:13.340 | by the Chinese consumers' willingness
01:07:16.220 | to adopt the technology.
01:07:17.380 | Well, keep in mind, I mean,
01:07:19.420 | you've heard a similar story about, like,
01:07:22.060 | leapfrogging technology.
01:07:23.500 | So in Africa, you know, there were no landlines.
01:07:25.820 | It went straight to mobile.
01:07:26.900 | And so everything, you know, changes in a different way.
01:07:30.100 | Car ownership in China was relatively low.
01:07:32.940 | Exactly.
01:07:33.780 | Like 5% or 10%.
01:07:34.860 | So most people are either bicyclists or commuters.
01:07:39.620 | And so that would be one reason
01:07:42.140 | why your expectations would be in a very different place
01:07:45.700 | than if you were an American who loves your car.
01:07:48.140 | Right, right.
01:07:48.980 | I mean, this is for another pod,
01:07:52.100 | but we spent, I don't know, 100 years, 70 to 100 years,
01:07:56.820 | maximizing America's structure and layout
01:08:00.500 | and topology for that damn car.
01:08:03.340 | Yeah, yeah, yeah.
01:08:05.860 | One of the things you and I've debated back and forth
01:08:08.460 | on this pod is just the impact that AI is going to have
01:08:12.060 | and the timeline against which it's going to have the impact.
01:08:16.260 | And you've called into question, right,
01:08:18.660 | the magnitude of the impact, perhaps from the LLM.
01:08:22.020 | LLM, yeah.
01:08:22.860 | But we've been in lockstep
01:08:25.220 | that this is having a major impact
01:08:27.860 | on full self-driving autonomy and likely on robo-taxi.
01:08:32.860 | And so it does feel to me like this is one of those places
01:08:37.260 | where now if you're in San Francisco,
01:08:39.660 | it's not surprising to see two or three Waymos around you.
01:08:42.780 | I think you're gonna see a lot of people
01:08:44.140 | driving down the streets in their Teslas
01:08:46.340 | in the not-too-distant future, right?
01:08:48.420 | They're gonna be reading a book.
01:08:49.820 | They're gonna be looking away.
01:08:51.060 | Their hands are not gonna be on the wheels.
01:08:53.820 | And obviously the robo-taxi is reported
01:08:57.820 | to pull the steering wheel out of the car altogether.
01:09:00.940 | By the way, on that last step,
01:09:02.620 | I'm gonna put one more podcast in the list
01:09:05.940 | that I found super interesting,
01:09:07.980 | which was Misha Laskin, who is an entrepreneur
01:09:12.980 | that runs an AI company here in Silicon Valley.
01:09:16.220 | He went on the Sequoia podcast.
01:09:18.020 | But he was at Google DeepMind and he talks about AlphaGo.
01:09:23.820 | And he has some different perspectives on AlphaGo.
01:09:27.260 | And when problems are finite,
01:09:32.260 | and there's this great book about infinite and finite games,
01:09:35.540 | like chess, the ability for these models
01:09:38.580 | to compete and innovate is so much higher
01:09:41.700 | than when you have open-ended problems.
01:09:43.580 | And one of the things about self-driving is,
01:09:46.220 | I think you define it in a way where it's a finite game,
01:09:48.780 | which is no wreck.
01:09:50.380 | Like no wreck becomes a finite game.
01:09:52.900 | And so these models are able to go a lot farther
01:09:57.900 | when there's a finite game that can be played.
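A minimal sketch of that framing, with an episode structure and reward values that are purely illustrative assumptions, not anyone's actual training setup: reducing the objective to a checkable terminal condition gives the model a crisp win/lose signal to optimize, the way a finished chess game does.

```python
from dataclasses import dataclass

@dataclass
class DrivingEpisode:
    crashed: bool        # did the car wreck?
    interventions: int   # times a human had to take over

def finite_game_reward(ep: DrivingEpisode) -> float:
    """Score an episode like a finished game: win, lose, or draw.
    'No wreck' is objectively checkable, unlike open-ended quality."""
    if ep.crashed:
        return -1.0                                   # loss
    return 1.0 if ep.interventions == 0 else 0.0      # clean win vs. draw

print(finite_game_reward(DrivingEpisode(crashed=False, interventions=0)))  # 1.0
```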
01:10:01.180 | And anyway, his talk is super interesting
01:10:03.780 | and it makes me think about the areas
01:10:06.860 | where AI is going to accelerate faster
01:10:10.340 | because you can do those kind of things.
01:10:13.020 | The open-ended AGI thing is actually,
01:10:16.140 | in my mind, is gonna take the longest
01:10:17.940 | 'cause it's the hardest problem.
01:10:19.140 | - Yeah, I think there are a bunch of discrete things
01:10:20.920 | you can define as finite games.
01:10:22.780 | Customer service being an example of that.
01:10:25.980 | - Conversion on a website
01:10:29.380 | could potentially be done that way.
01:10:31.180 | - For sure.
01:10:32.020 | So what the heck happened with CrowdStrike this week?
01:10:36.140 | Not only did it lose 30% of its value,
01:10:38.900 | all my friends were complaining that they couldn't fly,
01:10:42.020 | get from point A to point B, airlines ground to a halt.
01:10:45.820 | What do you think this reveals to us
01:10:48.300 | about kind of the nature of systems
01:10:52.280 | that are being built in the cloud today?
01:10:54.200 | And what are the things we ought to be looking out for
01:10:56.680 | or concerned about as a result?
01:10:58.020 | - A whole bunch of things went through my mind.
01:11:00.820 | One, the one that actually,
01:11:02.840 | I think J-Cal brought it up first,
01:11:04.600 | and it's just such a strong argument:
01:11:09.600 | why didn't you stage-gate how this was rolled out?
01:11:13.160 | Like, how could you let it have this big an impact?
01:11:16.480 | Like if you did increasingly large groups every two hours,
01:11:21.480 | you would never have had this impact on the world.
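For what that stage-gating could look like in practice, here's a minimal sketch of a cohort-based rollout with a health gate between waves. The cohort sizes, the two-hour soak, and the health check are all hypothetical, not CrowdStrike's actual pipeline:

```python
import time

COHORTS = [0.001, 0.01, 0.10, 0.50, 1.00]  # fraction of the fleet per wave
SOAK_SECONDS = 2 * 60 * 60                 # watch each wave before expanding

def fleet_is_healthy() -> bool:
    """Hypothetical gate: a real pipeline would query telemetry here
    (boot loops, crash reports) from machines that took the update."""
    return True  # placeholder for illustration

def staged_rollout(deploy_to_fraction) -> None:
    """Push the update in waves, halting at the first sign of trouble."""
    for fraction in COHORTS:
        deploy_to_fraction(fraction)  # ship to this slice of the fleet only
        time.sleep(SOAK_SECONDS)      # let problems surface before expanding
        if not fleet_is_healthy():
            print(f"Halting at {fraction:.1%} of the fleet; rolling back.")
            return
    print("Update live across the full fleet.")
```

Under a scheme like this, a bad content file bricks a tenth of a percent of machines instead of grounding airlines worldwide.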
01:11:23.760 | - Deployment engineer that needed to go on vacation.
01:11:26.060 | - I don't know.
01:11:26.900 | Like, someone let that through,
01:11:28.480 | and I think that's on CrowdStrike.
01:11:30.360 | And there were other people,
01:11:33.860 | there's a Twitter account that's really good
01:11:36.720 | at kind of analyzing crisis PR who went through their stuff.
01:11:40.480 | I think they could have talked about things like that,
01:11:44.120 | why that failed. Instead,
01:11:46.400 | they gave a technical description of what happened
01:11:48.760 | and said, "Oh, we uncovered it."
01:11:50.360 | But that doesn't really help.
01:11:52.360 | They didn't say why this isn't gonna happen again,
01:11:54.920 | 'cause there was a policy and process failure,
01:11:58.040 | you know, on top of this.
01:11:59.120 | The other thing: if you read Matthew Prince,
01:12:02.960 | who wrote a long analysis of this, and keep in mind,
01:12:05.560 | Matthew runs a different company, Cloudflare,
01:12:08.800 | in the security space.
01:12:11.280 | And he's very worried about a very specific thing here.
01:12:14.740 | I just went and read a whole bunch,
01:12:18.560 | so I didn't know this ahead of time,
01:12:19.800 | but apparently Microsoft will argue the reason
01:12:22.680 | that these security companies have access to the kernel
01:12:25.360 | is because they were forced to let them have access
01:12:28.200 | because of an EU ruling that related
01:12:31.800 | to how big and powerful Microsoft is.
01:12:34.800 | Now, there were no non-Microsoft boxes that failed
01:12:39.920 | and people will say, "Oh, well, you know,
01:12:42.560 | "Apple doesn't allow you to have access to the kernel."
01:12:45.240 | And so, what Matthew's worried about
01:12:48.200 | is that Microsoft's able to use this
01:12:50.120 | to reverse the EU initiative
01:12:52.360 | and then push these players out.
01:12:54.740 | And a lot of people don't know this,
01:12:56.760 | but Microsoft's now one of the top four security companies
01:12:59.980 | in the world and may want to do that.
01:13:02.840 | And so, that's a new kind of dimension of competition
01:13:06.760 | that I think people need to pay attention to.
01:13:09.860 | I'm also struck, there was a funny article
01:13:15.680 | that said Southwest was spared by this
01:13:17.480 | because they weren't on Windows 95 yet
01:13:19.880 | or they weren't on modern Windows.
01:13:21.840 | They were on really, really old Windows machines.
01:13:24.960 | And I just don't understand why anyone
01:13:29.960 | that has a terminal that matters is running Windows at all.
01:13:35.860 | Right, right.
01:13:37.280 | Maybe it's a testament to just the power
01:13:40.240 | of the network effect of Windows and all this stuff,
01:13:42.840 | but you would think there'd be a hardened Linux thing
01:13:46.760 | that would be so much less susceptible
01:13:49.440 | to this kind of risk.
01:13:52.680 | You know, we have these esoteric conversations all the time.
01:13:55.360 | We're investing in these startup companies.
01:13:58.040 | But, you know, the world has become software.
01:14:00.920 | And, you know, we take for granted every day
01:14:04.420 | the amount of software that's running around us.
01:14:06.880 | That's true.
01:14:07.720 | That's getting us from point A to point B,
01:14:09.100 | soon to be driving our cars for us, et cetera.
01:14:11.920 | And, you know, with it, I think, you know,
01:14:15.960 | comes significant responsibility.
01:14:18.920 | And again, circling back
01:14:20.760 | to where we started the conversation,
01:14:22.840 | you start taking down airlines.
01:14:25.160 | That really makes people in Washington unhappy.
01:14:27.760 | Oh, I believe that.
01:14:28.600 | Right, and so, part of the reason I think that, you know,
01:14:34.540 | the nature of the relationship between DC
01:14:38.220 | and Silicon Valley is changing, is forever changing,
01:14:41.220 | is also as simple as this.
01:14:43.520 | Our lives today are so, you know,
01:14:47.100 | twisted around all of these technologies,
01:14:50.660 | whether it's the phone we carry in our pocket
01:14:53.020 | or the software that's getting us from point A to point B,
01:14:56.620 | to the AI we're going to be using in the future.
01:14:59.260 | And so, I think that this symbiotic relationship,
01:15:02.500 | this, you know, figuring out how to make sure
01:15:06.560 | that we're managing the relationship
01:15:09.420 | between Washington and Silicon Valley,
01:15:11.380 | in the most productive way possible for this country,
01:15:14.340 | is gonna be super important.
01:15:17.900 | No longer acceptable for us
01:15:19.860 | just to stick our head in the sand
01:15:21.260 | and say we're 2,851 miles away.
01:15:23.880 | You know, engagement, I think,
01:15:27.380 | is going to be the new course.
01:15:29.660 | And, in that regard, I think policies are gonna matter.
01:15:33.620 | Like, I don't think any party can take Silicon Valley--
01:15:36.760 | - One of the articles said that people
01:15:39.740 | were forced to implement CrowdStrike
01:15:42.260 | because of regulatory reasons.
01:15:44.820 | So, they had it implemented
01:15:46.140 | because someone had told them they had to.
01:15:49.180 | So, that's another interesting dimension
01:15:51.800 | to this whole thing.
01:15:52.640 | Hey, before we wrap, I have two things
01:15:54.500 | I need to do super quick.
01:15:55.700 | One, I'm gonna try and pronounce Tetragrammaton,
01:15:59.940 | which is the name of Rick Rubin's podcast.
01:16:01.660 | - Perfect.
01:16:02.500 | - And, apparently, it's an important Hebrew symbol
01:16:05.700 | from way back before Christ.
01:16:07.620 | So, I wanna do him justice there.
01:16:10.100 | And then, second, I would like to call out,
01:16:13.180 | with respect, Mr. Daniel Eck at Spotify,
01:16:16.860 | who just printed amazing numbers
01:16:18.760 | and took his stock to an all-time high.
01:16:21.340 | He's been grinding away at Spotify
01:16:23.260 | for a very long time.
01:16:25.460 | It's a great product that I love to use.
01:16:28.260 | Super innovative.
01:16:29.540 | A lot of people thought he couldn't make positive cashflow
01:16:32.140 | because of the way that the music label deals
01:16:35.900 | are structured, but he's after it.
01:16:38.140 | He had a fun social media post
01:16:40.780 | before he did his earnings call,
01:16:42.420 | which is-- - I saw that.
01:16:43.260 | - I thought was clever.
01:16:44.940 | - I love how he does this.
01:16:45.780 | - And he's just, he's so thoughtful.
01:16:48.100 | He's so wonderful.
01:16:49.140 | He's so available.
01:16:50.540 | He gives back to other entrepreneurs.
01:16:52.380 | He cares, especially about Europe,
01:16:55.220 | where he came from, and about the company.
01:16:57.220 | And anyway, I wanted to call him out.
01:16:58.820 | - Well, the stock's up 12% today.
01:17:00.420 | It's up 100% over the course of last year,
01:17:02.700 | 120% over the course of the last five years.
01:17:04.740 | - Yeah, it's up like six, seven billion today.
01:17:06.220 | - Yeah, so he's built an amazing business,
01:17:08.700 | but more importantly, again,
01:17:10.160 | getting back to what we just talked about,
01:17:11.980 | he makes our lives better every day.
01:17:13.780 | For all those who use the product,
01:17:15.020 | our lives are just happier every day.
01:17:17.340 | So, kudos to him on a great quarter.
01:17:19.740 | - All right, man. - All right, great to see you.
01:17:21.060 | Let's wrap.
01:17:21.900 | (upbeat music)
01:17:24.480 | - As a reminder to everybody,
01:17:34.820 | just our opinions, not investment advice.