
New SEC Chair, Bitcoin, xAI Supercomputer, UnitedHealth CEO murder, with Gavin Baker & Joe Lonsdale


Chapters

0:00 Bestie announcement!
2:53 Gavin Baker and Joe Lonsdale join the show
4:14 State of the Trump Bump: Debt focus, Deregulation, America's lucky position
20:07 Trump nominates Paul Atkins as SEC Chair, replacing Gary Gensler: What this means for crypto and other markets
40:52 Thoughts on Michael Saylor's Bitcoin play, state of defense tech, and the US/China AI competition
49:07 xAI's massive GPU cluster, expanding to 1M GPUs, how Grok 3 will test AI scaling laws, and what's next
67:56 UnitedHealth CEO murdered, reactions

Whisper Transcript

00:00:00.000 | Hey, everybody. Hey, everybody.
00:00:01.080 | Before we drop this week's amazing episode,
00:00:04.160 | just some quick information for you.
00:00:06.720 | Sacks and Chamath both had the week off.
00:00:09.360 | So we had two amazing friends of the pod on Gavin Baker
00:00:13.000 | from Atreides and Joe Lonsdale from 8VC.
00:00:15.800 | What a treat to have them on the program.
00:00:17.640 | But there is some big news, huh, Freeberg?
00:00:19.680 | Yeah, well, the news dropped after we recorded, so it's not going to be referenced.
00:00:23.880 | And the show might feel a little stale, but here we go.
00:00:27.680 | Our friend David Sacks is the White House A.I.
00:00:31.280 | and crypto czar of the United States of America.
00:00:34.360 | Congratulations to our boy.
00:00:35.960 | Amazing. Super thrilled.
00:00:37.240 | That is a big reason why he wasn't able to join us for the show.
00:00:39.720 | He was in the middle of getting this news ramped up.
00:00:42.120 | Yes, it came out. We had already recorded.
00:00:44.200 | Yeah, take that into account as we go into the show.
00:00:46.280 | Yes. And according to Trump's Truth Social post,
00:00:49.640 | Sacks will, quote, guide policy for the administration
00:00:53.640 | in artificial intelligence and cryptocurrency.
00:00:56.320 | He will also lead the Presidential Council of Advisors for Science and Technology.
00:01:01.200 | How great is that, Freeberg?
00:01:02.760 | Wow. Science corner in the White House.
00:01:04.800 | Can't wait. It's going to be amazing and pretty awesome.
00:01:08.120 | I have no announcements.
00:01:09.640 | The rumors about me becoming press secretary are obviously premature.
00:01:12.880 | But if asked to serve, I will serve my country.
00:01:16.000 | OK, let's get to it.
00:01:17.320 | Don't worry, besties.
00:01:18.360 | All in isn't going anywhere.
00:01:20.200 | We'll be here every week for you, except maybe Thanksgiving
00:01:23.120 | or a holiday now and again.
00:01:25.000 | All right. Let's start the show, huh, Freeberg? Great episode.
00:01:27.200 | Let's go. All right, everybody.
00:01:29.080 | Welcome to the All-In podcast.
00:01:31.000 | I am your host, Jason Calacanis.
00:01:33.880 | How are you doing, Freeberg?
00:01:35.760 | I'm hanging in there. I'm waiting for the tsunami to hit.
00:01:38.040 | We're about 40 minutes away.
00:01:40.080 | I'm a little anxious right now.
00:01:42.240 | More anxious than normal is what you're saying.
00:01:44.280 | Like on a scale of one to Freeberg,
00:01:47.840 | this is like as anxious as you get with the tsunami.
00:01:50.400 | I don't know what's going to happen.
00:01:51.920 | This current warning shows a five foot water rise, or five meter.
00:01:57.240 | I can't tell.
00:01:58.560 | God, I really hope we don't publish this episode.
00:02:00.520 | And there's like a total disaster that we're kind of making light of it.
00:02:04.360 | We're not going to make light of it.
00:02:07.320 | We get warnings once in a while.
00:02:08.600 | Just to give everybody a little perspective here.
00:02:10.240 | I'm at David Sacks's house.
00:02:13.320 | I've taken over, as you can see, I got my Moncler hat on, Freeberg.
00:02:16.480 | I found Sacks's robe and the shirt.
00:02:20.800 | It was Sacks's robe, but I got a red sharpie and I just crossed it out and put J-Cal on it.
00:02:25.160 | And then I went down and I was talking to Chef.
00:02:28.240 | Does Sacks know that you're staying at his house?
00:02:30.440 | He knows I'm staying here, but does he, though?
00:02:32.800 | He doesn't know that I found the actual caviar.
00:02:35.640 | We open source it to the fans and they've just gone crazy with it.
00:02:48.840 | Love you guys.
00:02:49.840 | I'm the queen of quinoa.
00:02:51.400 | With us today, Chamath couldn't make it this week.
00:02:55.840 | And Chamath had surgery and now he looks like Gavin.
00:02:58.480 | OK, yes.
00:02:59.720 | And Sacks is taking a day off.
00:03:05.640 | He's got the day off today.
00:03:07.600 | I think he's just winning too much with us.
00:03:09.560 | Two substitutes jump in here for the team.
00:03:12.480 | Cackling. You can hear the cackling.
00:03:15.360 | Joe Lonsdale.
00:03:16.280 | There's Joe Lonsdale.
00:03:17.000 | I'll take my Moncler hat off here.
00:03:18.320 | And J-Cal.
00:03:19.000 | I am also winning too much, but I'm happy to be here last minute as a sub for David, J-Cal.
00:03:25.600 | Joe Lonsdale, for people who don't know, is to the right of David Sacks.
00:03:29.440 | He is a venture capitalist and he started a couple of companies.
00:03:33.080 | You might have heard of them, Palantir.
00:03:35.200 | And what are the other greatest hits?
00:03:36.480 | Addepar.
00:03:37.120 | Addepar just crossed seven trillion dollars, reported on it this month.
00:03:39.960 | It's a real company, too.
00:03:41.040 | Yeah, OpenGov sold this year.
00:03:42.840 | We started a bunch of great ones.
00:03:44.840 | Yeah. Joe, you're a little bit lower profile than all these accomplishments.
00:03:51.360 | It's pretty incredible.
00:03:52.160 | Co-founded how many billion dollar companies have you co-founded now?
00:03:55.160 | I don't know, maybe six or so.
00:03:57.280 | Six or so.
00:03:58.440 | Also with us, Gavin Baker from Atreides.
00:04:01.280 | He is a hedge fund manager.
00:04:04.560 | He does privates. He does publics.
00:04:06.360 | One of the smartest guys I know.
00:04:07.680 | Great analyst.
00:04:08.440 | Welcome to the program, Gavin, friend of the pod.
00:04:10.600 | Thank you, J-Cal.
00:04:11.640 | Happy to be here.
00:04:12.720 | Great to see everybody.
00:04:13.880 | Yeah, we are experiencing a huge Trump bump.
00:04:17.400 | The election is over.
00:04:19.800 | The cabinet is being assembled and we're seeing DOGE,
00:04:22.920 | the Department of Government Efficiency,
00:04:25.760 | as well as maybe regulations being pulled down.
00:04:28.480 | And we'll we'll get to our first topic about our new SEC chair.
00:04:33.040 | But just generally speaking, Gavin, as a market participant for a living.
00:04:37.640 | What's your take on what we've seen over the last three weeks
00:04:41.720 | since the election results came in?
00:04:43.120 | So if they execute on stated plans, and
00:04:48.360 | there are some of the world's
00:04:52.000 | greatest execution machines involved.
00:04:54.120 | You know, Elon generally does what he says he is going to do.
00:05:02.560 | Like this is going to be awesome for America, for markets, for the world.
00:05:06.600 | And the analogy I keep coming back to is Satya Nadella
00:05:10.880 | taking over as CEO of Microsoft.
00:05:13.160 | Microsoft was a monopoly, incredibly advantaged.
00:05:16.280 | It had just been horribly mismanaged for years.
00:05:19.200 | All he had to do to start winning was stop doing really, really dumb things.
00:05:23.480 | And that's an incredible place to be.
00:05:28.160 | You know, America, like we're the greatest country.
00:05:29.960 | You know, we've got, you know, oceans on two sides, peaceful neighbors,
00:05:34.120 | incredible natural resources, you know, completely, you know,
00:05:37.240 | can produce our own food and energy, like in many ways,
00:05:40.640 | most privileged country on Earth.
00:05:43.160 | But sometimes with great privilege comes great like stupidity.
00:05:46.480 | And California, to me, would be a leading example of that.
00:05:49.040 | In many ways, the most privileged state in America
00:05:52.280 | and has printed it away with bad policies.
00:05:55.880 | And I do think one thing that everyone of all political stripes agrees on
00:06:01.800 | is there are too many regulations that result in far too many administrators,
00:06:06.800 | far too much complexity and an inability to build things in America.
00:06:11.800 | So, you know, it was used very effectively
00:06:16.120 | and as it should have been against the Harris administration
00:06:19.000 | that they'd approved $42 billion for rural broadband, hadn't built anything,
00:06:23.520 | had approved $40 billion for EV chargers, whatever it was, hadn't done anything.
00:06:27.360 | And it's not that they didn't want to,
00:06:31.120 | you know, dig trenches and do broadband,
00:06:34.480 | or didn't want to make EV chargers.
00:06:36.480 | Their own regulations prevented them.
00:06:39.440 | So I just think deregulation and simplifying
00:06:44.000 | regulations and the tax code
00:06:47.600 | is going to lead to an immense amount of growth,
00:06:51.160 | which is something that all Americans should be happy about.
00:06:53.800 | Joe, you want to maybe give us your take?
00:06:56.440 | I know, obviously, you're very vocal.
00:06:58.960 | You're obviously a conservative run think tank
00:07:02.160 | and have strong feelings on all this.
00:07:03.560 | But there is consensus, I think, amongst all Americans
00:07:06.600 | that we don't like fraud, waste, abuse and those items.
00:07:09.480 | Those seem to be consensus building efficiency and paying lower taxes.
00:07:13.840 | All of that seems like something almost all Americans,
00:07:17.640 | most Americans can get behind.
00:07:18.800 | So I guess maybe a little bit of your take
00:07:22.040 | on what we've seen in the markets and the plan.
00:07:24.360 | And then also, are you going to be involved in this at all?
00:07:27.080 | Well, listen, Jake, I think I do agree.
00:07:30.800 | I think almost all reasonable Americans
00:07:33.160 | can see that our growth is ridiculously constrained.
00:07:35.680 | And I think Jeff Bezos was saying yesterday,
00:07:37.960 | like we need a growth oriented mindset
00:07:40.120 | if we're going to get out of our debt problems, our deficit problems.
00:07:42.560 | So it's nice to see everyone kind of coming on the team and saying,
00:07:45.240 | yeah, we need to fix these things.
00:07:46.520 | I think what people don't understand is like just how broken
00:07:50.320 | the government bureaucracy and regulations are, right?
00:07:53.440 | It's not like they're kind of sort of bad.
00:07:55.400 | Like it's almost like they're companies that went bankrupt.
00:07:58.000 | Like think of the worst company, you know, in Silicon Valley.
00:08:00.160 | They went bankrupt like 30 years ago.
00:08:02.360 | And imagine if someone just like kept pumping money into that worst company,
00:08:05.760 | you know, over like 30 years to keep hiring people.
00:08:08.360 | So like Yahoo or AOL, like some legacy company.
00:08:12.800 | It was failing. Take the worst department there.
00:08:15.720 | And then like the worst department gets the most money.
00:08:17.760 | So it's, like, more, I guess you can use the word retarded now.
00:08:22.680 | It's more retarded than anything you've seen in a long time.
00:08:25.160 | And I think one of the important things is America
00:08:27.920 | used to have these really hard tests for 100 years.
00:08:30.600 | You know, this in 1883, we said, you know what?
00:08:32.880 | We shouldn't just hire our friends.
00:08:34.720 | If someone's going to run something in government,
00:08:36.520 | let's have these tough tests, kind of like China did for thousands of years.
00:08:39.600 | And for 100 years, we had really hard tests.
00:08:42.040 | When we went to the moon, when we fought the world wars,
00:08:44.720 | you know, we did the Manhattan Project, the people making those decisions
00:08:47.960 | in hiring people had to pass things that really only like five,
00:08:51.320 | 10 percent people could pass who took them.
00:08:53.040 | And then in the late 70s, we said, not only are these tests racist,
00:08:56.920 | so we can't test people anymore of any background.
00:08:59.080 | We also can't fire people anymore because we're going to give them tons of
00:09:01.720 | protections. And so the last 40 years, it just got dumber.
00:09:05.400 | And then 10 years ago, you started doing all the virtue signaling
00:09:08.080 | and, you know, hiring based on your identity versus based on anything else.
00:09:11.800 | So it's got even worse.
00:09:13.160 | And so now it's so broken that, yes, we're lighting tens of billions of dollars
00:09:16.240 | on fire. And so there's really two things here.
00:09:18.320 | One, you've got to take a chainsaw.
00:09:20.200 | Like Elon said, you've got to get a chainsaw and just, like, cut
00:09:22.880 | a ton of broken stuff.
00:09:23.760 | But then day two is you got to say, how do we make this not stupid in the future?
00:09:28.280 | And there's things like maybe you bring back tests, maybe you bring accountability.
00:09:31.160 | The thing I think I'm most passionate about is, you know, right now
00:09:34.680 | there's over a million rules to the federal level.
00:09:36.600 | It's stuff that everyone disagrees on.
00:09:38.640 | You can be the most left person trying to build like solar or wind or whatever.
00:09:42.280 | And you're like, what? I have to do what study over how many years?
00:09:44.800 | This makes no sense.
00:09:45.760 | So we've got to take these regulations and not only cut them,
00:09:48.920 | but we got to make a data driven system
00:09:50.680 | that forces regulations to defend themselves.
00:09:52.720 | And that way, instead of having a cancer that just grows forever out of control,
00:09:56.400 | you could actually have a process that like naturally trims things,
00:09:59.640 | naturally makes things fight for itself.
00:10:01.360 | And that way it doesn't get as dumb ever again.
00:10:03.480 | I would. Why not just make them
00:10:05.480 | automatically sunsetting after five years if they're not renewed?
00:10:08.280 | Exactly. But make the process to renew them difficult and data driven.
00:10:11.640 | Exactly. So basically, yeah, yeah.
00:10:14.120 | So just if you the process of renewing them
00:10:16.800 | will take so much time that it will slow it down.
00:10:19.480 | I wonder how we got to this place.
00:10:21.240 | Friedberg, I will bring you in on this, since really,
00:10:23.880 | I think it was maybe two or three years ago
00:10:25.880 | you started to point out exactly the debt spiral we could get in.
00:10:29.800 | I want to give you your flowers here on the pod, because from this pod
00:10:33.520 | and your obsession and your harping on and on
00:10:37.240 | about this national debt problem, you saw it early.
00:10:41.480 | You talked about it constantly.
00:10:43.360 | You brought a lot of consensus on board.
00:10:44.840 | And now we're seeing that the issue of the transition is the debt.
00:10:48.600 | And we were sitting here for the past year or two saying,
00:10:51.400 | when are politicians going to even pay attention to this?
00:10:54.240 | And now they are paying attention to it.
00:10:56.400 | So first, here's your virtual flowers.
00:10:58.560 | Second, how are you feeling about the transition today?
00:11:02.840 | Well, I mean, I think first of all, there's like three layers of the problem,
00:11:06.520 | which I've been trying to harp on for almost four years.
00:11:08.600 | Number one is the inefficiency and lack of accountability, which
00:11:11.480 | Joe and Gavin are obviously talking about.
00:11:13.600 | And that leads to excessive spending.
00:11:15.920 | And that leads to the debt problem.
00:11:18.240 | And the debt problem creates this kind of arithmetic debt death spiral,
00:11:22.040 | which is something you don't want to find yourself in.
00:11:24.720 | So that's the inevitability that I've been kind of really worried about.
00:11:27.880 | And I think that, you know, look, there's
00:11:30.560 | the first derivative and second derivative can kind of get addressed.
00:11:33.080 | And then hopefully you don't get to the absolute point
00:11:35.400 | where you have this breaking point.
00:11:37.400 | I think right now everyone's banking number one on can we deregulate
00:11:41.200 | in a way that can unlock growth and by unlocking growth?
00:11:44.680 | Right. The the the kind of arithmetic argument is grow GDP
00:11:49.520 | because you're not going to be able to shrink debt super fast
00:11:52.280 | in order to grow GDP.
00:11:54.440 | There needs to be some unleashing happening.
00:11:56.640 | And so if you can get GDP to grow four, five percent.
00:11:59.760 | And you can minimize the excess spending,
00:12:03.600 | meaning you can minimize the deficit, the federal deficit,
00:12:06.520 | and therefore minimize the increment in the debt level,
00:12:09.880 | the debt to GDP ratio becomes more manageable
00:12:12.080 | because ultimately you can only tax so much of GDP.
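To make the arithmetic being described here concrete, below is a minimal, purely illustrative sketch in Python. The starting debt ratio, deficit levels, and growth rates are hypothetical assumptions chosen for the example, not figures from the conversation.

```python
# Illustrative sketch of the debt-to-GDP arithmetic discussed above.
# All numbers are hypothetical assumptions, not figures from the episode.

def project_debt_to_gdp(debt, gdp, deficit_pct_gdp, gdp_growth, years):
    """Project debt/GDP given a steady deficit (as % of GDP) and nominal GDP growth."""
    ratios = []
    for _ in range(years):
        debt += deficit_pct_gdp * gdp   # new borrowing adds to the debt stock
        gdp *= (1 + gdp_growth)         # the denominator compounds with growth
        ratios.append(debt / gdp)
    return ratios

# Hypothetical starting point: debt at ~120% of GDP (arbitrary units).
slow = project_debt_to_gdp(debt=120, gdp=100, deficit_pct_gdp=0.06, gdp_growth=0.03, years=10)
fast = project_debt_to_gdp(debt=120, gdp=100, deficit_pct_gdp=0.03, gdp_growth=0.05, years=10)

print(f"3% growth, 6% deficits -> debt/GDP after 10 yrs: {slow[-1]:.2f}")
print(f"5% growth, 3% deficits -> debt/GDP after 10 yrs: {fast[-1]:.2f}")
```

Under these toy assumptions, the higher-growth, lower-deficit path bends the ratio down instead of up, which is the "grow the denominator while minimizing the deficit" point being made here.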
00:12:14.080 | So, yeah, I feel like this is important in both senses.
00:12:18.600 | One is just cut the inefficient, wasteful spending,
00:12:20.800 | get rid of the regulations, and that'll unleash
00:12:23.160 | the necessary growth. One of the proxies I look to, and I think that this is
00:12:28.400 | going to be kind of the critical motivating factor for the United States,
00:12:36.080 | whether it's this administration or the next, or on a continuous basis,
00:12:36.080 | there's going to be this kind of moment where we're going to look across the water
00:12:40.200 | at what China has.
00:12:42.360 | And you can see what China is getting, what China is making, what China is doing.
00:12:46.280 | I've talked about this a lot as well.
00:12:48.560 | And I do think this is the most critical metric
00:12:50.920 | that no one talks about as much as I think we should,
00:12:53.440 | which is the increment in electricity production capacity in China
00:12:58.320 | compared to the United States.
00:13:00.120 | And so we are going from one terawatt to two terawatts.
00:13:02.800 | They're going from, I think, two to eight over the same time period.
00:13:06.520 | And why is that so important
00:13:07.920 | for the people who are listening?
00:13:10.400 | I'll let you answer that.
00:13:13.280 | Go ahead. Just unpack it for the audience.
00:13:14.920 | I mean, I obviously understand why this is important, but I want you to.
00:13:17.680 | The number one thing I'd say is it's usually correlated to how well
00:13:20.800 | the working class is doing in your country, to per capita GDP, to cost of goods.
00:13:25.600 | I mean, obviously, I care about it because we want to scale
00:13:28.200 | AI for all the things we're doing and we want to be able to do it effectively.
00:13:31.200 | But if you just look over time, it's really clear the relationship
00:13:34.360 | between how well is your average working middle class person
00:13:37.320 | doing and their quality of life, their cost of life
00:13:39.360 | and the cost of electricity and cost of power.
00:13:41.560 | And it's crazy.
00:13:42.680 | We're not just like ramping up and cutting more to do this.
00:13:45.080 | More electricity means more automation, means more AI,
00:13:48.880 | means more things are being done for the for every person
00:13:52.040 | that are being done in factories or being done by machines.
00:13:55.320 | And that unlocks a new kind of level of living.
00:13:59.960 | And that's been kind of a continuous process for humans.
00:14:02.560 | There's this guy that writes these books that Bill Gates always talks about.
00:14:06.000 | Vaclav Smil.
00:14:11.000 | He's got all these books on the history of the relationship
00:14:13.520 | between energy and kind of prosperity.
00:14:15.760 | There's definitely correlation there.
00:14:18.120 | And there's definitely causation there, Gavin.
00:14:20.280 | And there's even like really simple ones.
00:14:22.520 | I don't know if you've ever seen Lee Kuan Yew,
00:14:24.920 | you know, the founder of Singapore, essentially talk about just how air
00:14:28.880 | conditioning changed the country's fate.
00:14:32.360 | It raises average IQ by two or three points.
00:14:35.800 | It really matters.
00:14:36.920 | But look, I think at the end of the day, at the end of the day,
00:14:40.280 | my key point was we have to make sure that this administration
00:14:44.160 | and Joe can hear me on this.
00:14:45.360 | And I know he he believes this.
00:14:46.920 | But for me, it has to be such a priority that we
00:14:49.840 | accelerate a nuclear energy rollout in the United States,
00:14:54.400 | because that's what China is doing.
00:14:55.960 | They have dozens of Gen four reactors that are going to get built out,
00:14:59.440 | each of which has a gigawatt of production capacity.
00:15:02.440 | And they're doing it at a cost that we can't compete with today.
00:15:05.960 | But fundamentally, they can do it in the US.
00:15:08.920 | We can't even do it.
00:15:09.920 | And regulations, it goes back to Joe.
00:15:11.760 | Yeah. And so the regulatory structure prohibits our ability
00:15:15.280 | to actually expand energy capacity or electricity production capacity,
00:15:18.360 | which is a critical difference.
00:15:19.680 | And that ultimately leads to a situation where in 10 years
00:15:23.320 | we're going to be looking across the water at, you know,
00:15:26.720 | a competitive country that has three X,
00:15:30.520 | four X, the electricity production capacity of our country.
00:15:33.320 | And everything is cheaper. Everything is faster.
00:15:36.560 | It gives them superiority in a lot of functions that we can't match.
00:15:39.400 | This is a security thing, too, because we need manufacturing here
00:15:42.680 | to be affordable and competitive in order for our national security.
00:15:45.400 | You know, David, when you texted me last minute to join this,
00:15:47.880 | I was actually in a meeting with our friend from Founders Fund
00:15:49.840 | who's building nuclear fuel again for the first time in the US.
00:15:52.920 | That's actually gone away, too.
00:15:54.360 | And so so that needs to be approved.
00:15:55.720 | But, you know, I know Chris Wright, he's a nominee for the new energy
00:15:59.600 | secretary of energy.
00:16:00.880 | And he you know, he's super pro Liberty guy, but super pro cheap,
00:16:03.680 | cheap energy guy and pro nuclear.
00:16:05.280 | So I think we have some really great people who care about this.
00:16:07.560 | We're going to be fighting hard for it.
00:16:08.680 | Gavin, what's your read on it?
00:16:10.320 | Yeah. What's your read on energy and the blockers?
00:16:12.440 | I mean, more nuclear, more better.
00:16:15.480 | Like, I mean, I agree with everything Joe said.
00:16:18.960 | And the only thing I would just add is it is the most environmentally friendly
00:16:23.120 | kind of energy source.
00:16:25.600 | It's the most, it's even more. Like, I think it is highly likely
00:16:29.880 | that in my lifetime, the world just runs on solar.
00:16:32.840 | Like if you just, you know, we all know compound interest
00:16:36.920 | is the greatest force on Earth.
00:16:38.840 | But if you just look at the rate at which photovoltaic cell
00:16:41.680 | efficiency is compounding.
00:16:44.040 | Battery efficiency is compounding, and people make these balance
00:16:47.200 | of system arguments, but it will never be as cheap as nuclear,
00:16:51.720 | but it will likely approach coal.
00:16:54.400 | And I think a lot of the world will run on solar.
00:16:56.800 | But that's going to take 50 years.
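As a rough illustration of the compounding argument being made here, the sketch below shows how a steady annual cost decline compounds over multi-decade horizons. The 10% per year rate is a hypothetical assumption for illustration, not a figure quoted by anyone on the show.

```python
# Hypothetical illustration of a steady annual cost decline compounding over time.
# The 10%/year rate is an assumption for illustration, not a number from the episode.

def compounded_cost(initial_cost: float, annual_decline: float, years: int) -> float:
    """Cost remaining after `years` of a constant fractional annual decline."""
    return initial_cost * (1 - annual_decline) ** years

if __name__ == "__main__":
    for years in (10, 25, 50):
        remaining = compounded_cost(initial_cost=1.0, annual_decline=0.10, years=years)
        print(f"After {years:>2} years at a 10%/yr decline: {remaining:.3f}x of today's cost")
```

The same compounding logic applies to the efficiency gains mentioned for photovoltaics and batteries; the actual trajectory depends on whether the observed rates hold.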
00:17:00.200 | Nuclear is arguably just as environmentally friendly,
00:17:02.880 | done right and carefully.
00:17:04.480 | And it is here now.
00:17:05.840 | And so I just.
00:17:07.720 | Yeah, I mean, it is unbelievable to watch.
00:17:13.880 | Not exactly Moore's law, but this precipitous drop
00:17:16.960 | in the cost of just solar panels.
00:17:19.880 | Solar is very impressive.
00:17:22.440 | But in some ways, to me, it's a little dystopian
00:17:24.280 | to think about all these forests just covered with this stuff.
00:17:26.400 | And I think it's great. It's rate limiting.
00:17:28.240 | It's rate limiting.
00:17:28.920 | You have to have many, many acres rolled out.
00:17:32.440 | And with nuclear, you can have a building that can,
00:17:35.840 | you know, power the equivalent of many acres.
00:17:38.040 | So the answer is both. The answer is both.
00:17:39.800 | But it is like Elon has tweeted many times about the tiny fraction
00:17:44.280 | of, you know, deserted desert areas of America
00:17:49.120 | that need to be covered with panels.
00:17:50.760 | And then you put in batteries.
00:17:52.680 | There's still a space limit.
00:17:54.280 | Like, let's just fast forward 100 years on planet Earth.
00:17:57.240 | And if you look at the past 100 years and 100 years before that,
00:18:00.680 | like energy demand, even on a on a per capita basis,
00:18:05.800 | has this nonlinear kind of scaling problem.
00:18:09.040 | And that means that land consumption will scale nonlinearly.
00:18:12.160 | Now, we have it.
00:18:12.960 | If we're going to say 100 years, though, I mean, sorry to interrupt.
00:18:15.080 | If you're going to say 100 years, technology changes.
00:18:16.920 | It's very clear.
00:18:17.440 | You could probably do it from space if you really wanted to.
00:18:19.480 | I mean, this sounds crazy, but you probably could
00:18:21.560 | have like 10 of these in space. I mean, 100 years is far.
00:18:23.920 | I mean, we have enough uranium in just the crust of the,
00:18:27.200 | you know, like, northern hemisphere to, in other words,
00:18:30.840 | to power everything. We are on the verge of unlimited energy for all time.
00:18:36.400 | Just to wrap this up and move us on.
00:18:38.000 | If we can get out of our own way,
00:18:40.240 | with a bunch of midwits who have put so much regulation in place
00:18:44.800 | and who are working from home for four hours a week.
00:18:47.880 | If we can just get them out of the way,
00:18:49.840 | then maybe we could approve some nuclear reactors and a little more.
00:18:53.320 | Did you just call did you just call government workers midwits?
00:18:56.600 | Just the ones who are blocking.
00:18:59.480 | If you're not blocking and you're working 50 hours a week.
00:19:02.200 | God bless a lot of people working hard in the government.
00:19:04.920 | That's exactly my point.
00:19:06.080 | But there's some number of them
00:19:07.360 | who are just obviously do not see the forest through the trees.
00:19:10.280 | By the way, on that point, it is interesting.
00:19:12.240 | Texas is the number one solar producer in the country.
00:19:15.640 | And it's not because everybody there is more environmentally conscious.
00:19:19.760 | No, we're not that liberal here.
00:19:22.000 | It's just it's easy to build stuff in Texas.
00:19:25.360 | Yeah. Solar power plants.
00:19:27.720 | This is, I mean, for the libs in cities who are so pro
00:19:31.320 | solar and anti natural gas or whatever.
00:19:34.400 | Yeah, look at Texas.
00:19:35.200 | They're just build as much as you want.
00:19:36.880 | And where Joe and I live, in Austin,
00:19:39.200 | home prices have gone down two years in a row.
00:19:42.040 | Rent has gone down two years in a row.
00:19:44.400 | They are building like lunatics because you don't need
00:19:48.520 | to beg and bribe people to build stuff.
00:19:52.360 | Joe, maybe some thoughts on what we've seen in regulation.
00:19:55.960 | Austin City Council is not perfect.
00:19:57.760 | Austin is called the blueberry in the tomato soup.
00:19:59.680 | So there may be a tiny bit of begging going on.
00:20:01.600 | But despite that, you're able to build there
00:20:03.080 | and everywhere else around your building like that.
00:20:04.920 | So it works. There's there's there's plenty of supply.
00:20:07.200 | I agree. Speaking of regulation, Gary Gensler is out.
00:20:10.440 | Paul Atkins is in and Bitcoin just cracked 100.
00:20:14.240 | This is all obviously related.
00:20:15.840 | Atkins was previously S.E.
00:20:16.960 | Commissioner under Bush, too, in the early 2000s.
00:20:19.960 | In the 90s, he worked for both Bush one and Clinton at the SEC.
00:20:23.400 | According to The New York Times, Atkins is admired
00:20:26.560 | among D.C. legal circles and regulators.
00:20:29.840 | He's pro crypto and he's been helping draft some best practices
00:20:34.240 | for the crypto trading platforms.
00:20:36.640 | As you know, Gary Gensler's approach to crypto was
00:20:40.240 | there's a rule set, follow the rule set.
00:20:42.880 | We're not here to change the rules.
00:20:44.400 | We're here to enforce them. Good luck.
00:20:46.720 | And in some cases, I think maybe he was right
00:20:50.680 | with ICOs and a bunch of scams, but he also gave good actors
00:20:54.920 | no path to go forward.
00:20:56.840 | Anybody have strong feelings on this?
00:20:58.680 | I love Paul, J-Cal.
00:20:59.800 | I think he's going to, you know, be our guy.
00:21:01.920 | Yeah, I've met him several times at conferences and groups,
00:21:04.040 | and he's a really smart guy, cares about the rules, cares about helping innovators.
00:21:07.800 | The thing that really pissed everyone off with Gary.
00:21:10.080 | I mean, you've probably seen, like, I think
00:21:12.280 | I think both Coinbase with Brian and the Winklevoss twins have put something out.
00:21:17.000 | They won't even hire anyone who worked with Gary on this stuff they were doing.
00:21:20.680 | And the reason they're that angry at them is they were purposely
00:21:23.480 | not defining the rules in certain cases and then going after people
00:21:27.720 | after not having defined it, in order to kind of, like, play a gotcha game.
00:21:30.920 | It was very dishonorable.
00:21:31.920 | And Paul sees all that.
00:21:33.320 | There's no way he's going to allow that.
00:21:34.520 | Gavin, any thoughts here on how this might change regulations in our business?
00:21:39.360 | Also, the fund business, Joe, is a venture capitalist, a VC.
00:21:42.960 | You're in public markets and funds.
00:21:45.680 | I have I do precede.
00:21:47.120 | So what do you think in terms of funds and regulations there?
00:21:50.240 | And then crypto and, you know, the SEC may be becoming innovative
00:22:01.560 | as opposed to punitive and, you know, what's the word, Freeberg?
00:22:01.560 | What's the word I'm looking for here?
00:22:03.600 | Adversarial?
00:22:05.120 | Well, they are a regulator.
00:22:06.520 | They are a regulator, but they don't also seem
00:22:08.680 | like they seem to have gone beyond just regulating.
00:22:10.920 | They seem to have been the aggressively adversarial.
00:22:14.320 | Yeah, I mean, they're a regulatory enforcement agency.
00:22:16.720 | That's their job.
00:22:17.400 | So I don't know, like, you know, I mean, but not giving a path
00:22:21.320 | or not even meeting with people.
00:22:23.160 | I think that was the thing that I felt was kind of weird.
00:22:27.280 | Well, they may meet with you if you're a big donor to the left.
00:22:27.280 | They wouldn't meet with you otherwise.
00:22:28.240 | That was part of it, too.
00:22:29.080 | That was sketchy.
00:22:29.720 | That was why SBF got tons of meetings.
00:22:31.360 | But Brian got none. Right.
00:22:32.440 | I mean, it's nonsense like that.
00:22:34.000 | Oh, good.
00:22:35.080 | I do think I don't know.
00:22:36.480 | Yeah, go ahead.
00:22:36.960 | Cryptocurrencies, at some level, fundamentally reduce the power of nation states.
00:22:40.840 | And that is something that is
00:22:44.120 | oft professed by the true believers, but it is true.
00:22:48.440 | And so if you are on the left and, you know, you are a devout believer
00:22:54.880 | in the power of the state to do good things,
00:22:57.480 | I get why you would not like crypto currencies.
00:22:59.640 | You know, at the same time, we're very early in crypto.
00:23:02.760 | I do think I read with interest all of the posts that David Marcus
00:23:08.080 | and his peers at Libra made made about what happened to them.
00:23:12.880 | Libra was a Facebook project to embrace cryptocurrency
00:23:17.360 | at a very intrinsic level onto the sort of identity level
00:23:21.920 | of every one of their billions of users.
00:23:24.160 | And I think you can argue it would have been really, really good
00:23:27.040 | for not just America, but the entire world.
00:23:29.120 | You know, there are a lot of immigrants in America
00:23:33.280 | who send remittances back to their home countries at extremely high,
00:23:38.000 | predatory rates.
00:23:41.120 | It would have made that free.
00:23:43.440 | And that would have been amazing for a lot of really hardworking
00:23:47.400 | people all over the world.
00:23:49.120 | It would have, you know, Visa, MasterCard, they do charge big fees.
00:23:52.440 | I mean, Visa, MasterCard do not charge big fees,
00:23:55.560 | but the credit card complex in aggregate is a reasonably big fee.
00:24:00.560 | I mean, everybody, oh, it's just, you know, two, two and a half percent
00:24:03.000 | to make it perfect and safe.
00:24:05.640 | I think Facebook had a sound argument that they could have done that cheaper.
00:24:08.760 | That would have been an efficiency gain for America.
00:24:11.520 | And it really did bum me out, you know, to read some of the letters
00:24:17.160 | that they sent the way they killed Libra.
00:24:20.360 | If anyone doesn't know, maybe you could pull up David Marcus's post.
00:24:23.640 | They just all these politicians sent letters to participants
00:24:28.280 | in financial markets saying, we don't know if they are doing anything wrong,
00:24:32.800 | but we think they probably are.
00:24:35.600 | And we're going to look at this very closely.
00:24:37.800 | And we want to discourage you from participating.
00:24:40.280 | Well, it was worse. It was worse than that.
00:24:42.920 | It was like a mafia letter.
00:24:44.040 | It was like, if you are to support this and help with this,
00:24:46.640 | we are going to look into everything else you are doing
00:24:48.920 | and we may then find some issues.
00:24:50.440 | It was it was like a mafia threat.
00:24:51.880 | We can't stop you legally, but watch out.
00:24:54.120 | Like it was really sketchy.
00:24:55.560 | It was Sherrod Brown, who now is out of office,
00:24:57.720 | who led this, the senator from Ohio.
00:24:59.640 | But it was it was really bad.
00:25:00.680 | It was really bad what they were doing.
00:25:02.320 | It was really bad. I found it upsetting as an American.
00:25:04.520 | And it's you kind of hinted at this, Gavin.
00:25:08.480 | The fear of governments is that they will lose control of money supply.
00:25:13.960 | And monetary policy, maybe unpack that a bit.
00:25:16.640 | And then, Freeberg, we'll go to you on the same sort of thread.
00:25:19.640 | Yeah, I mean, look, it's a it is a rational fear.
00:25:22.760 | I mean, controlling monetary supply, like at some level,
00:25:26.320 | the you know, the greatest, you know, powers
00:25:29.040 | we give the state are a monopoly on violence to keep us safe.
00:25:32.240 | And, you know, control over the money supply has,
00:25:36.760 | you know, a means of exchange, a unit of account.
00:25:38.960 | And those are great powers,
00:25:41.680 | particularly if you're America, the reserve currency.
00:25:43.840 | The only thing I would just say to balance this out.
00:25:46.760 | And I do think people are very positive on Paul Atkins.
00:25:49.880 | I've never met him.
00:25:50.720 | But the reaction from people who know him like Joe.
00:25:54.960 | And who I respect has been very positive.
00:25:58.280 | It is important to remember we have the best capital markets in the world.
00:26:02.480 | You know, the U.S.
00:26:04.240 | equity and fixed income markets are the most trusted places on Earth.
00:26:09.000 | And we can always make them better.
00:26:10.800 | But just, it is very, you know, you want to be very vigilant
00:26:15.320 | about keeping them fair and keeping out things like inside information,
00:26:19.800 | which makes people feel comfortable doing business here.
00:26:22.480 | You know, making, you know, having investors have confidence
00:26:25.960 | in a company's financial statements.
00:26:27.800 | And those financial markets are one reason America is such a great country.
00:26:31.440 | And Gavin, to put this in context, this was during a time period,
00:26:36.040 | this David Marcus Libra process, when they felt certain people in government.
00:26:41.680 | And I'm not saying I endorse this, but this is their position.
00:26:45.240 | Zuckerberg had too much power.
00:26:47.720 | Zuckerberg was censoring people.
00:26:49.360 | The right felt they were being censored.
00:26:51.760 | The left felt that Cambridge Analytica and people were using targeting
00:26:57.040 | adversely. The general vibe,
00:26:59.680 | and even J.D. Vance kind of was in on this as well,
00:27:04.360 | was too much power, specifically at Meta, too much influence.
00:27:08.560 | Now they've got this many people, this penetration and the algorithm.
00:27:13.120 | Plus, we're going to give them money
00:27:15.200 | and they're going to have power over people's wallet.
00:27:17.320 | And then what does the government have?
00:27:18.760 | They can't censor people.
00:27:20.720 | They can't control the message and they can't control the purse strings.
00:27:23.840 | That is the time period we're talking about here.
00:27:26.080 | Oh, no, I understand 100% why they did it.
00:27:29.440 | Yeah, I just think as an American, at a minimum, the way that they did it.
00:27:33.680 | Yeah. Was, to use one of Joe's phrases, dishonorable.
00:27:37.640 | You know, if you if you want to say that we're going to kill this,
00:27:40.960 | then just like let's have a debate as a nation.
00:27:44.080 | Absolutely. Don't do it.
00:27:46.840 | Yes. Rule of law.
00:27:48.280 | Rule of law is also another reason America is a great country.
00:27:51.720 | Yes. Freeberg, your thoughts on Dave Marcus's comments.
00:27:55.800 | And this SEC pick and just generally a more
00:28:01.320 | I don't know, less regulation kind of situation.
00:28:04.040 | I think the SEC is like one of the most important agencies we have,
00:28:09.640 | and I think they're one of the best federal agencies.
00:28:12.840 | So I've worked with them and I've worked with any other agencies
00:28:16.000 | and there's all these issues at the SEC, but they play a very important job.
00:28:20.080 | And if you've worked in and foreign markets
00:28:23.120 | and you've dealt with foreign securities regulators,
00:28:25.840 | you're going to be like, thank God for the SEC.
00:28:27.800 | You can't wait to go back.
00:28:29.600 | This whole crypto thing, I there's there's a distinction,
00:28:33.040 | I think, between Bitcoin and crypto currencies as speculative
00:28:37.600 | kind of asset speculative trade. What do you see those differences as?
00:28:40.720 | Well, I definitely concur with Gavin.
00:28:43.520 | I think Bitcoin fundamentally is meant to be supposed to be
00:28:46.480 | ultimately will become a real threat to the US dollar.
00:28:50.320 | And it's kind of ironic that Trump had this declaration this week.
00:28:54.080 | Yeah, that he's going to put 100 percent tariff on all these BRICS
00:28:57.600 | nations that try to participate in an alternative currency
00:29:00.800 | to the US dollar, the greatest currency on Earth,
00:29:03.200 | when he literally turns around and then says we're going to support Bitcoin.
00:29:06.080 | It felt like the biggest irony of the week to me,
00:29:08.520 | because I do think Bitcoin is the big threat to the US dollar.
00:29:12.240 | And I do think that at some point, whether it's this administration
00:29:16.040 | or the next, they're going to wake up to that fact.
00:29:18.200 | And maybe the Bitcoin does, you know, the network state concept does emerge.
00:29:22.280 | And that's where we end up.
00:29:23.800 | But I do think we want to have
00:29:25.040 | and are going to have a strong federal government in the United States
00:29:27.320 | for quite some time.
00:29:28.640 | That's going to play an important role in everyone's lives here.
00:29:30.880 | And I don't know if you can really just say, let the dollar,
00:29:33.640 | you know, be supplanted by Bitcoin.
00:29:35.920 | Bitcoin seems to be a more of a safe haven asset.
00:29:38.360 | And that seems to be the trade that it's stored.
00:29:40.640 | It should be kind of sort of value and alternative.
00:29:43.000 | It's just going to take over gold.
00:29:44.680 | Well, you think the state has reasonable concerns
00:29:50.080 | about crypto competing with it?
00:29:52.080 | And then maybe specifically, when you saw Trump talking about,
00:29:55.120 | hey, congratulations on your Bitcoin, I did this for you on 100K
00:29:59.040 | and taking credit for it at the state, which he should take credit for it.
00:30:02.040 | He did it. He did that last $40,000 per coin.
00:30:04.560 | And then you see him talking about the BRICS.
00:30:08.080 | And then we have the US currency.
00:30:09.640 | So which is a bigger threat to American
00:30:13.040 | exceptionalism and supremacy on planet Earth?
00:30:17.160 | Bitcoin or BRICS?
00:30:19.160 | So one represents a move towards liberty and one represents
00:30:22.080 | a move towards authoritarianism, if BRICS were to become dominant,
00:30:25.000 | if China and Russia were able to control global currency together
00:30:28.840 | and other players, that is terrible.
00:30:31.080 | It's bad for the US for so many reasons.
00:30:33.280 | Bad for US consumers for so many reasons.
00:30:34.600 | He's right to fight it.
00:30:35.920 | At the same time, having, you know, to channel biology,
00:30:39.160 | having this like pro-liberty network state, like forces in that direction.
00:30:43.000 | Like, for example, I think rather than just, you know, Hong Kong and Singapore,
00:30:46.040 | Hong Kong has been lost.
00:30:46.880 | We should have like more Hong Kong and Singapore in the West.
00:30:48.880 | They compete with the US.
00:30:49.720 | That'd be good for all of us.
00:30:51.200 | We're on the pro-freedom side of the US and make the US wealthier.
00:30:53.720 | We show examples of new experiments that create great wealth.
00:30:57.040 | And that's kind of the side of Bitcoin is you want more experiments
00:30:59.760 | and competition from the liberty distributed side.
00:31:02.160 | You don't want competition from the authoritarian side.
00:31:04.200 | So to me, it was very consistent.
00:31:05.960 | Yeah, well, I would just say Texas is the Singapore of America,
00:31:08.680 | you know, and it is putting pressure on the rest of America.
00:31:11.280 | And I don't say that just because I grew up in Texas.
00:31:13.560 | It's just a fact.
00:31:15.640 | I do think long term Bitcoin.
00:31:17.880 | I do not think the BRICS will ever be able to replace the dollar.
00:31:22.200 | The rule of law,
00:31:25.600 | even if it has occasionally been corrupted in America, is very, very powerful.
00:31:29.920 | Yes. As opposed to rule by law.
00:31:32.840 | And but I do think Bitcoin
00:31:34.920 | will at some point be a serious threat to the US dollar.
00:31:39.600 | And that just is what it is.
00:31:43.160 | And we will see how different administrations react.
00:31:45.720 | And it's a check on the most aggressive mistakes.
00:31:48.960 | First of all, if the debt doesn't get under control, or the deficit is ridiculous,
00:31:52.240 | Bitcoin is a wonderful check on that.
00:31:54.080 | You actually want healthy competition from something good
00:31:56.400 | because you want to check the excesses and the craziness.
00:31:59.000 | You need someone coming from the outside to say, no, don't do that.
00:32:02.360 | Let's just say at some point an AOC like person gets in charge.
00:32:05.360 | It could happen in the next 20 years.
00:32:06.880 | You need some kind of check on the dollar.
00:32:08.600 | And it's much better to come from the liberty side than from the other side.
00:32:11.280 | I agree with all that.
00:32:12.560 | I will just say on AOC, I thought it was very interesting.
00:32:15.680 | I think if you're a Democrat, I think you're probably largely heartened
00:32:18.920 | by the kind of intellectual leaders of that party, their reaction to losing.
00:32:24.160 | You know, it's focused around, hey, we do need to deregulate.
00:32:26.440 | It is too hard to build.
00:32:28.200 | Josh Shapiro has been tweeting every three days about how
00:32:30.480 | they're making it easier, you know, to do business in Pennsylvania.
00:32:33.600 | It used to take 20 days to get a hairdresser license.
00:32:37.360 | That takes an hour.
00:32:38.680 | All that is good.
00:32:40.360 | But I actually thought AOC, who is a very talented politician.
00:32:44.480 | She's an extraordinary communicator, is extraordinary communicator.
00:32:47.200 | Her reaction was like she posted this on Instagram.
00:32:50.760 | If you voted for Trump.
00:32:53.800 | I want to hear why I want to hear the things you listen to that convinced you.
00:32:57.880 | Like, I don't say this out of anything other than genuine curiosity.
00:33:01.760 | Basically, clearly, I am missing something.
00:33:04.560 | And I, you know, I want to listen.
00:33:07.200 | And I just thought that was a very interesting reaction.
00:33:09.320 | That is the proper reaction.
00:33:10.800 | Yeah, absolutely. Yeah.
00:33:12.640 | I want to point out one thing I did.
00:33:13.960 | I was doing some research before the show, and I found this SEC speech.
00:33:18.360 | And this is a really fascinating.
00:33:21.200 | This is from 2007 from Paul.
00:33:23.600 | And he was kind of giving his State of the Union here.
00:33:26.880 | And he's talking about doing some self-reflection.
00:33:30.320 | And he's talking about accreditation rules in 2007
00:33:34.600 | before the great financial crisis.
00:33:37.320 | The concept of economic risk and return
00:33:39.280 | also affects a different proposal of the commission, the SEC.
00:33:43.720 | Relating to private investment funds,
00:33:45.200 | specifically part of the commission's proposal
00:33:47.240 | would add an additional requirement for any natural accredited person
00:33:51.160 | to have at least 2.5 million in investments before he or she
00:33:54.520 | could invest in private investment in a private investment fund like a hedge fund
00:33:58.120 | or private equity fund other than a venture capital fund.
00:34:01.040 | The underlying premise for the commission's proposal
00:34:03.880 | is that these types of investments are too risky for individuals
00:34:06.120 | other than the very rich.
00:34:07.480 | Therefore, we would have to presume that the non-rich are either unsophisticated
00:34:11.080 | or lack access to sophistication.
00:34:13.320 | And it is simply not tolerable to have these types of people
00:34:16.840 | at risk of losing their money on a hedge fund.
00:34:19.120 | Assuming that these premises are true, however,
00:34:21.120 | what evidence does the commission have to support the conclusion
00:34:23.960 | that private investment funds are the most risky?
00:34:26.520 | What makes a hedge fund or a private equity fund
00:34:29.120 | more risky than a venture capital fund?
00:34:31.040 | Great question.
00:34:31.960 | And how does the risk profile of a pooled investment compare
00:34:35.440 | with the risks of investing in securities of a single issuer
00:34:38.160 | for which this new 2.5 million standard is?
00:34:41.000 | This is where it gets super interesting.
00:34:42.760 | Many public comment letters express indignation at the commission's proposal.
00:34:46.400 | One commentator wrote, "Stay out of my wallet.
00:34:48.480 | Stop trying to protect me from myself.
00:34:50.680 | Stop presuming to know more than I do about my own life,
00:34:54.640 | risk tolerance and financial sophistication."
00:34:56.840 | The commission's proposal may very well prevent the non-rich
00:34:59.800 | from losing their money in private investment funds,
00:35:01.680 | but it also certainly will prevent the non-rich
00:35:04.240 | from participating in any upside profits and gains on these funds.
00:35:08.360 | Does this mean the rich get richer while the non-rich should be content
00:35:11.160 | to just hold their place on the economic ladder?
00:35:13.600 | This, to me, when I saw this, I was like,
00:35:17.360 | "You know what? I am feeling absolutely fantastic about Trump.
00:35:21.400 | If he stops these wars, or stops one out of two,
00:35:24.560 | and keeps us out of, you know, participating,
00:35:26.520 | even though we're not on the ground with any new ones,
00:35:29.800 | and he removes regulations,
00:35:31.800 | and we have people in power who understand
00:35:34.760 | that moving up the socioeconomic ladder
00:35:36.960 | is as important as protecting the downside risk,
00:35:39.920 | especially when we see wealth polarization,
00:35:43.720 | this is a great pick."
00:35:45.520 | Now, the other picks, there are some whack-pack picks in there.
00:35:48.080 | I'll be totally honest.
00:35:49.200 | I don't know what the strategy is, Joe.
00:35:50.680 | I'll ask you that in a minute.
00:35:52.440 | But what do we think, Gavin, of this sort of approach here,
00:35:56.640 | which is really thinking thoughtfully about
00:35:59.680 | what is sophisticated, and what are we doing here?
00:36:02.800 | What is the outcome we're looking for?
00:36:05.560 | - Yeah, no, I think it's great, and you could just...
00:36:07.880 | I mean, the reality is a hedge fund
00:36:09.000 | that runs with a lot of leverage probably is more risky.
00:36:12.440 | I don't know that it's more risky than a single security,
00:36:15.360 | but, like, at the end of the day,
00:36:19.280 | since he wrote that letter,
00:36:20.760 | it was an incredible 15-year run for private equity.
00:36:24.240 | And, you know, now all the big private equity firms
00:36:27.520 | are making a huge effort now,
00:36:31.800 | when probably the business is more mature
00:36:33.320 | and maybe the return opportunities aren't what they were,
00:36:35.960 | to appeal to Main Street America.
00:36:38.360 | Like, it would have been cool if, you know,
00:36:40.280 | Blackstone or KKR or whoever in '07, '08, '09, 2010,
00:36:46.720 | you know, could have had their fund up
00:36:48.240 | on, like, the Fidelity marketplace for subscription.
00:36:51.160 | That probably would have been good for America.
00:36:53.520 | So I think his comments are well taken.
00:36:55.560 | Imagine if Elon could raise for SpaceX,
00:36:57.600 | like, you know, from regular Americans.
00:36:59.280 | I'm sure he would have loved to do that
00:37:00.440 | if it wasn't crazy risky with the SEC, right?
00:37:02.440 | So there's all these people who are blocking us
00:37:04.320 | from letting the average person be part of it.
00:37:05.840 | It does keep them from climbing the wealth ladder.
00:37:07.440 | I think it's crazy.
00:37:09.040 | - Yeah.
00:37:09.880 | Freeberg, any thoughts here?
00:37:10.720 | I mean, this has been my pet peeve for a long time,
00:37:13.000 | is letting people do what they want with their money.
00:37:15.600 | If you don't allow people to take risk with their money
00:37:18.760 | and move up from poor to middle class,
00:37:20.680 | from middle class to upper middle class,
00:37:22.080 | and maybe eventually becoming affluent.
00:37:23.720 | - Jason, there's a lot of places
00:37:25.040 | people can invest their money.
00:37:26.080 | They can buy public stocks.
00:37:27.400 | There's $20 trillion of public stocks they can buy.
00:37:29.960 | I don't think that you're prohibiting people
00:37:31.320 | from transitioning their wealth like bands
00:37:33.960 | by not being able to buy private stocks.
00:37:35.480 | In fact, I think it's more likely than not
00:37:37.560 | that people are gonna go market bullshit securities
00:37:40.360 | in private markets and rip poor people off even worse.
00:37:43.440 | And that's why there are these regulatory kind of barriers.
00:37:46.360 | And I don't think that it's necessary.
00:37:47.880 | Look, everyone says let's make it everything free
00:37:50.120 | and everything libertarian seems like a good idea.
00:37:52.520 | And so someone gets punched in the face,
00:37:54.280 | gets ripped off.
00:37:55.120 | Someone dies from a drug that's not properly tested.
00:37:58.600 | And then we're all like, where are the regulators?
00:38:00.440 | Where are the agencies to protect the individuals?
00:38:03.000 | And I think that's the role that these agencies
00:38:04.880 | are kind of providing.
00:38:06.200 | And that's the reason these rules are in place.
00:38:08.160 | I don't think that this is like one of these things
00:38:09.600 | where, oh, liberty is being denied.
00:38:11.280 | I think there's a careful line to walk.
00:38:14.920 | - Yeah, I believe you.
00:38:16.240 | - I'm generally pretty centrist in most things.
00:38:19.360 | So I agree with what a lot of David said.
00:38:21.800 | And I do think if you were to allow
00:38:24.160 | ordinary Americans to buy private companies
00:38:27.280 | that are held to a lower standard of disclosure
00:38:29.720 | and reporting than public companies,
00:38:31.480 | like something would have to change.
00:38:33.560 | Like just, hey, if a private company wants to,
00:38:36.720 | public companies are held to certain standards for a reason.
00:38:45.800 | - Tweak the numbers and have non-GAAP financials and yeah.
00:38:45.800 | - And there is a lot of fraud in venture.
00:38:48.720 | There are unethical people, there are-
00:38:52.080 | - And smart, diligent people can't even find it all.
00:38:54.920 | You know, that's, it's like the smartest,
00:38:56.560 | most diligent people are still getting ripped off.
00:38:58.920 | - Yeah, I mean, look at FTX.
00:39:00.360 | I mean-
00:39:01.200 | - I mean, FTX ripped off some very,
00:39:03.560 | very intelligent friends of ours.
00:39:05.000 | - Not that anyone did any diligence at FTX.
00:39:07.600 | - Yeah.
00:39:08.440 | - But yeah.
00:39:09.280 | - Which is an important part of this.
00:39:10.280 | But I mean, if you had a sophistication,
00:39:11.960 | I mean, there's such an easy solution to this.
00:39:13.640 | You just do a sophistication test.
00:39:15.280 | People take a five hour course
00:39:17.280 | and they answer 50 questions at the end,
00:39:19.000 | like a driver's license.
00:39:20.040 | - Well, I do think it is a little,
00:39:22.360 | that's why I talked about private equity.
00:39:24.440 | These are extremely sophisticated institutions
00:39:26.720 | who are always investing alongside their clients.
00:39:29.120 | - Yeah.
00:39:29.960 | - And they're buying established businesses.
00:39:31.840 | They're putting leverage on them.
00:39:33.480 | But like, I think private equity is kind of a middle ground
00:39:37.240 | and on a pooled basis, I think you could argue.
00:39:40.960 | And, you know, maybe those funds would need
00:39:42.600 | to change their reporting and their disclosures
00:39:45.680 | to deal with kind of the average American.
00:39:47.800 | But by that, I mean, strengthen it.
00:39:50.480 | But I think private equity,
00:39:55.000 | his comments were well taken.
00:39:56.560 | That's what I would say.
00:39:57.400 | - Joe, what are your thoughts?
00:39:58.240 | - Yeah.
00:39:59.080 | I mean, I think the elephant in the room--
00:39:59.920 | - Going around the horn here.
00:40:00.760 | - The elephant in the room that's really funny here
00:40:02.080 | is that you have the people on the left
00:40:04.120 | saying that only people with lots of money
00:40:05.920 | are smart enough to be allowed to do this,
00:40:07.480 | which I think is just a very funny position to take.
00:40:09.600 | It's like, you have to have a few million dollars
00:40:11.920 | to be sophisticated enough to be allowed to do this.
00:40:14.440 | I like your idea, Jake, of a test.
00:40:16.040 | Like there should be some other way of accessing this
00:40:18.800 | if you really want to.
00:40:19.920 | I mean, I agree.
00:40:20.760 | Listen, I don't wanna live in a world
00:40:22.320 | where we're all constantly being spammed
00:40:24.120 | to the average person by like stupid financial stuff.
00:40:26.960 | That's just like, we would be scammed all the time.
00:40:28.520 | There probably should be some rules.
00:40:30.360 | Like, I agree.
00:40:31.200 | I'm not like a total libertarian on this.
00:40:32.720 | I think it'd be really annoying.
00:40:33.840 | But yeah, but just to totally block people
00:40:35.560 | from participating, to me, that sounds crazy.
00:40:38.320 | - All right.
00:40:39.160 | - I mean, a finance college professor,
00:40:41.240 | you know, might, unless they had another job,
00:40:45.520 | might not, you know, be able to do this.
00:40:47.640 | - They would have to make $250,000.
00:40:50.960 | - Let's talk about a very important public market
00:40:55.440 | set of transactions.
00:40:57.080 | Gavin, I'd like your point of view
00:40:58.480 | on Michael Saylor's convertible note issuances
00:41:01.600 | being used by Bitcoin.
00:41:03.440 | - Going over the top.
00:41:04.800 | - Which this morning Bloomberg reported
00:41:08.160 | is the hottest trade in hedge funds right now.
00:41:10.920 | Nick, if you could pull up the article.
00:41:12.680 | And then J-Cal, I know you've been an outspoken
00:41:15.720 | opinion setter on Michael Saylor's promotion
00:41:18.680 | of his securities actions.
00:41:20.560 | We'd love your point of view.
00:41:21.960 | Gavin.
00:41:23.320 | - Just at some point, this does get too big.
00:41:25.800 | For a while, when it was smaller,
00:41:27.880 | you know, you could support the debt.
00:41:29.480 | - I'm sorry, could you explain it?
00:41:30.520 | Could you explain it to everyone, Gavin?
00:41:31.800 | - Yeah.
00:41:32.640 | So what he is doing is issuing debt and buying Bitcoin
00:41:35.800 | with the premise that Bitcoin is always going to go up.
00:41:38.280 | And he, you know, has made eloquent arguments
00:41:40.880 | why that is the case.
00:41:42.480 | No trees grow to the sky.
00:41:48.320 | And I think the interest expense on his convertible notes
00:41:53.320 | is 75 million off the top of my head.
00:41:56.640 | And by the way, I could care less about MicroStrategy.
00:41:59.880 | Like I'm not close to it.
00:42:01.520 | I'm not involved.
00:42:03.160 | - Yeah.
00:42:04.000 | - I don't know any hedge funds who own it.
00:42:05.880 | - No horse in the race.
00:42:07.160 | - Yeah, yeah.
00:42:08.800 | I think a lot of hedge funds are short MicroStrategy,
00:42:11.360 | but I have no horse in the race.
00:42:13.240 | But his, the underlying business that, you know,
00:42:18.240 | pays the interest expense on the debt
00:42:21.240 | only does $400 million a year in revenue.
00:42:24.920 | And it's, you know, high gross margin revenue.
00:42:27.200 | But I just, unless debt investors have absolute confidence
00:42:35.560 | in Bitcoin as collateral.
00:42:37.760 | And I don't think that's where
00:42:39.160 | fixed income markets are yet.
00:42:40.800 | He, it will get to a point where it is too big
00:42:47.360 | for the size of his company.
00:42:49.080 | And then yeah, maybe he can over collateralize it
00:42:51.360 | and, you know, have $10 in Bitcoin for every dollar of debt.
00:42:54.880 | But then like the magic money creation machine that,
00:43:00.080 | you know, I see discussed on X breaks down
00:43:03.800 | because that's like, you know, that's,
00:43:06.000 | that's very, very different
00:43:10.080 | than what is being discussed today.
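
To make the arithmetic Gavin is describing concrete, here is a minimal sketch. The $75 million of interest and $400 million of software revenue are the figures he cites from memory; the gross margin, the illustrative Bitcoin holding, and the collateral ratios are assumptions added purely for illustration.

```python
# Back-of-the-envelope sketch of the dynamic described above.
# $75M interest and $400M revenue are the figures cited from memory in the
# conversation; the 80% gross margin, the Bitcoin holding, and the collateral
# ratios are illustrative assumptions, not reported numbers.

annual_revenue = 400e6          # software business revenue
gross_margin = 0.80             # assumed "high gross margin"
interest_expense = 75e6         # convertible note interest

coverage = (annual_revenue * gross_margin) / interest_expense
print(f"Interest coverage from the operating business: ~{coverage:.1f}x")

# If fixed income markets demanded over-collateralization, the amount of new
# debt a given Bitcoin stack can support shrinks quickly.
bitcoin_holdings = 40e9         # purely illustrative holding
for collateral_ratio in (1, 2, 10):     # dollars of Bitcoin per dollar of debt
    max_debt = bitcoin_holdings / collateral_ratio
    print(f"{collateral_ratio}:1 collateral -> max debt supported: ${max_debt/1e9:.0f}B")
```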
00:43:12.440 | - Joe, do you have a point of view?
00:43:13.520 | 'Cause Joe's got to run by the way.
00:43:14.840 | So Joe, do you want to close this out?
00:43:16.320 | - Most of the defense people down here in LA.
00:43:18.280 | Listen, I am very bullish Bitcoin.
00:43:20.480 | I love all the energy of our society around it.
00:43:23.000 | I agree with what you said, Dave,
00:43:25.480 | that there's like, there needs to be some barriers
00:43:27.480 | for the public taking crazy risks.
00:43:28.920 | And I think the risks around how he's accessing this
00:43:31.320 | is actually very unusual and does scare me a bit.
00:43:34.160 | And people need to do their research
00:43:35.400 | and they shouldn't just like throw money without studying it.
00:43:37.440 | It does scare me a little bit
00:43:38.600 | how likely people are throwing money at this
00:43:40.400 | without knowing the kind of leverage he's taking, you know?
00:43:43.720 | - Yeah. Joe, anything else you want to share
00:43:45.640 | that you're up to, that you're excited about
00:43:47.000 | before you head out?
00:43:48.600 | - Well, you know, this weekend is the annual defense forum
00:43:52.280 | at the Reagan Library, I'm on the board.
00:43:53.720 | It's the biggest defense event of the year.
00:43:55.720 | Everyone's coming in.
00:43:57.080 | I guess I'm really excited that America has like woken up
00:44:00.600 | and assuming we can bring back
00:44:01.800 | more advanced manufacturing here,
00:44:03.440 | I think there's enough top companies now with Anduril,
00:44:06.040 | with Saronic, we're going to be building thousands
00:44:07.600 | of these vessels for the Navy, with AI.
00:44:09.920 | With Epirus, we're like turning things off, you know,
00:44:12.040 | 10 miles away, whatever,
00:44:13.120 | fairly far away with microwave radiation.
00:44:14.360 | There's some really cool technology coming
00:44:16.000 | that actually is going to make us be able to deter enemies.
00:44:18.240 | So if you asked me six, seven years ago, I was panicked.
00:44:20.280 | We're going to like get way behind China.
00:44:22.400 | I'm feeling really good about it now.
00:44:23.360 | I think, and I think, you know,
00:44:25.320 | just the Pete Hegseth pick, you know,
00:44:28.000 | is actually someone I'm quite bullish on
00:44:29.520 | for everything I'm hearing.
00:44:30.360 | So I think things are going the right direction.
00:44:31.920 | - Is there a wholesale upgrade happening
00:44:34.160 | in defense in the United States?
00:44:35.720 | Are all systems and all strategies
00:44:38.840 | being rethought right now using technology and innovation
00:44:42.880 | and kind of a performance-based product mindset
00:44:47.000 | leading kind of a reinvention of everything?
00:44:49.400 | Is that what's going to happen in the next four years?
00:44:51.240 | - Warfare is fundamentally totally shifted.
00:44:53.400 | We're seeing some of this in Ukraine.
00:44:54.800 | There's all sorts of new ways you want to swarm things
00:44:56.760 | on the land, swarm things in the water,
00:44:58.320 | swarm things in the air.
00:44:59.640 | How do they coordinate and how do they work together?
00:45:01.600 | How do you manufacture enough of these things
00:45:03.680 | and how do you use electronic warfare in new ways to,
00:45:06.040 | you know, it's basically created by this.
00:45:07.680 | And so it's just a whole new way of doing things.
00:45:10.280 | And we are going that direction.
00:45:11.960 | Too much of the money, David,
00:45:13.240 | like 95% of the money is still going towards like,
00:45:16.040 | frankly, like wasteful legacy,
00:45:17.720 | like mostly things we don't need.
00:45:18.800 | - Cost plus maintenance and gold systems, yeah.
00:45:21.720 | - But enough is shifting
00:45:22.800 | and there's enough good people fighting.
00:45:24.360 | And as long as we keep just allowing open competition,
00:45:26.800 | allow it to say, okay, which is better
00:45:28.520 | and just let the best things win.
00:45:29.800 | As long as we keep doing that,
00:45:30.760 | I think it's going the right way.
00:45:31.800 | I'm feeling very good about it.
00:45:33.400 | - Is the Chinese position shift
00:45:36.520 | in their technology strategy and their system strategy
00:45:39.200 | going to motivate a shift here, do you think?
00:45:41.160 | - It already has. - Is there a shift underway?
00:45:42.440 | - It already has.
00:45:43.280 | I mean, it definitely already has.
00:45:44.280 | - So we don't need big aircraft carriers.
00:45:46.120 | We don't need F-35s, we need drones,
00:45:48.280 | we need lasers, we need weapons and stuff.
00:45:50.360 | - I think there's a role for carriers and force projection.
00:45:52.680 | I don't think we need like incrementally a ton more of them.
00:45:55.240 | I'm not like this crazy radical
00:45:57.520 | where you get rid of all this,
00:45:58.400 | but yeah, on the margin,
00:45:59.640 | I'd much rather have 10,000 more smart drones
00:46:02.760 | above and below the water
00:46:03.920 | than like an incremental carrier, right?
00:46:05.840 | So there's, on a margin,
00:46:07.120 | there's all these things that are better uses of money
00:46:08.760 | and I think we're pushing that way.
00:46:10.400 | - And is there a huge tidal wave of venture money?
00:46:12.920 | I was at a dinner last night
00:46:14.240 | where there was this conversation
00:46:15.560 | about defense tech used to be off limits
00:46:17.280 | and a lot of LPAs,
00:46:18.120 | so you can't invest in defense companies.
00:46:20.200 | That's now changed or is changing
00:46:21.840 | and everyone's kind of coming up
00:46:23.120 | with their own defense tech strategy, do you think?
00:46:25.920 | I mean, you were obviously early in this, Gavin.
00:46:27.560 | I don't know if you're an investor in this space,
00:46:29.400 | but are you guys seeing a big shift?
00:46:30.720 | Yeah.
00:46:31.560 | - I obviously, you know, listen,
00:46:32.920 | there's only been nine unicorns, I think,
00:46:34.320 | still at this point,
00:46:35.160 | and I started three of them
00:46:36.000 | and invested in three of them in the first round,
00:46:37.360 | so I'm obviously pretty involved in the space.
00:46:39.960 | And I think, listen, it helps me
00:46:41.760 | if there's more money for my companies,
00:46:43.600 | which is really great.
00:46:45.360 | - Right.
00:46:46.200 | - I think there's probably the right answer for the US.
00:46:47.800 | It's not going to be like 1,000 businesses
00:46:49.240 | or a hundred businesses.
00:46:50.080 | It's going to be like seven to 10 new primes.
00:46:52.920 | One of them's obviously Anduril.
00:46:54.200 | I think one is probably Saronic and Epirus, we'll see.
00:46:56.320 | But like, there's going to be seven to 10 new primes,
00:46:58.200 | and that's what it's going to be.
00:46:59.240 | So there's going to be a lot of zeros
00:47:00.560 | and a lot of bad investments,
00:47:01.840 | but yes, these seven to 10 new primes are going to be huge.
00:47:03.880 | And if you can get access to them,
00:47:05.320 | you're going to do really well.
00:47:06.720 | - Gavin, how do you look at that market?
00:47:08.400 | You agree?
00:47:09.240 | - I agree with everything Joe said.
00:47:11.240 | The only thing I would just add on China,
00:47:13.520 | what we are doing by restricting their access
00:47:17.280 | to advanced compute and advanced networking,
00:47:21.120 | if you have read or watched "The Three-Body Problem,"
00:47:24.000 | America is unfolding a sophon over China.
00:47:27.240 | - Yeah, that's a great way to say it.
00:47:29.960 | - I have been really impressed
00:47:31.040 | with some of the Chinese models that have come out.
00:47:34.840 | And I think the risk to this strategy
00:47:36.440 | is necessity is the mother of invention.
00:47:38.840 | And despite this handicap,
00:47:40.920 | they're managing to stay just behind
00:47:43.480 | the leading edge of America, which is amazing.
00:47:46.440 | But, you know, NVIDIA's Blackwell chip comes out next year.
00:47:49.640 | You're going to have new chips from AMD,
00:47:51.040 | new ASICs from Broadcom.
00:47:53.560 | And I think at that point,
00:47:56.280 | it is not going to be possible for them to keep up anymore.
00:47:59.440 | - So that's actually positive regulation
00:48:03.360 | and great, in your mind, foreign policy.
00:48:07.520 | - It is very aggressive foreign policy.
00:48:13.600 | - Yeah, clearly.
00:48:15.440 | - You know, that could have lots of unforeseen consequences.
00:48:18.400 | - Yeah, what do you think about
00:48:19.880 | the rare earth trade restrictions coming to us?
00:48:23.520 | And is that going to actually affect the supply?
00:48:25.720 | - We have lots of rare earth here in America.
00:48:28.160 | Like in America, we have everything in America.
00:48:30.680 | Like we, you know, and I think there's a project underway
00:48:33.000 | to restart rare earth production.
00:48:34.960 | If there were ever to be a conflict,
00:48:37.120 | all this stuff would go away.
00:48:38.720 | - It's a cost and it's allowing us to do the refining here.
00:48:41.360 | So refining some of this stuff,
00:48:42.560 | like we desperately need gallium, gallium nitride
00:48:44.600 | for things that I'm doing, right?
00:48:45.840 | With like shooting the microwave radiation.
00:48:48.120 | The problem is refining is really messy.
00:48:50.840 | If you let it happen in the US, it will still be messy,
00:48:52.800 | but it will be cleaner.
00:48:53.880 | But we're not letting it happen.
00:48:55.160 | So there's things like this.
00:48:56.000 | - So we're back to regulation.
00:48:57.440 | And obviously the cost here is different.
00:48:59.160 | We have a different cost structure.
00:49:00.160 | - Joe, you got to go meet 20 senators.
00:49:01.920 | So thank you.
00:49:02.760 | We appreciate it. - Thank you, guys.
00:49:03.600 | This was fun.
00:49:04.440 | - Stay tuned. - Good times.
00:49:06.400 | - J-Cal, do you want to talk about AI with Gavin?
00:49:09.240 | - Yeah, I think that would be like a great next place to go
00:49:13.000 | would be to talk about the supercomputer being built
00:49:16.240 | by a friend of the pod, Elon.
00:49:19.120 | He's now got the world's largest supercomputer
00:49:21.680 | and he's going to 10x it, according to reports.
00:49:25.760 | - Yeah.
00:49:26.600 | And I would just say this is, I think,
00:49:27.640 | a very important moment for AI,
00:49:31.920 | you know, for this entire AI trade
00:49:35.000 | in the public and private markets.
00:49:36.720 | You know, everybody I'm sure who watches your podcast
00:49:41.040 | is very aware of scaling laws.
00:49:42.920 | And we have a scaling law for training,
00:49:46.000 | where if you 10X the amount of compute
00:49:47.560 | you use to train a model,
00:49:48.800 | you significantly improve the intelligence
00:49:50.720 | and capability of that model.
00:49:53.000 | And often they're these, you know,
00:49:54.960 | kind of emergent properties that emerge
00:49:57.640 | alongside that higher IQ.
00:50:00.200 | No one thought it was possible to make more than 25,000,
00:50:05.200 | maybe 30,000, 32,000, pick a number,
00:50:09.240 | NVIDIA hoppers coherent.
00:50:11.760 | And what coherent means is in a training cluster
00:50:15.360 | that each GPU, to kind of simplify it,
00:50:18.800 | knows what every other GPU is thinking.
00:50:21.880 | So every GPU in that 30,000 cluster
00:50:25.040 | knows what the other 29,999 are thinking.
00:50:28.600 | And you need a lot of networking to make that happen.
00:50:31.160 | - Enabled by InfiniBand, right?
00:50:32.760 | - InfiniBand, and I think even more importantly, NVLink.
00:50:37.320 | Although a lot of...
00:50:38.680 | Ethernet is, you know, never bet against the internet,
00:50:42.480 | never bet against Ethernet.
00:50:44.720 | Like if you read the Llama 3.1 technical paper,
00:50:47.600 | you know, got a lot of people excited
00:50:48.800 | about SkinnyLink Ethernet.
00:50:50.600 | But...
00:50:51.880 | - Just to slow down for the audience here, Gavin,
00:50:54.000 | maybe explain why transporting information
00:50:57.680 | between the GPUs is important.
00:51:00.240 | And we're talking, we're in the weeds here a little bit.
00:51:02.360 | Everybody's heard of Ethernet,
00:51:03.560 | but some of the other protocols
00:51:05.400 | and ways of moving stuff around, large amounts of data.
00:51:08.360 | That's what these H200s do particularly well.
00:51:11.600 | They'll move a couple of terabytes a second
00:51:14.120 | from one processor to the next processor.
00:51:17.520 | - Yeah.
00:51:18.360 | So...
00:51:19.200 | You know, picture a server,
00:51:23.400 | in the case of a GPU,
00:51:25.720 | it looks like maybe three pizza boxes
00:51:28.640 | stacked on top of each other,
00:51:30.080 | and it has eight GPUs together.
00:51:32.120 | And those eight GPUs are connected today
00:51:36.640 | with something called NVLink.
00:51:38.920 | Probably, you can think of the speed of communication.
00:51:42.440 | On-chip is the fastest.
00:51:45.920 | Chip-to-memory, next fastest.
00:51:48.080 | You know, chip-to-chip within a server, next fastest.
00:51:52.000 | And so you take those units of servers,
00:51:56.000 | which are connected, the GPUs are connected on the server
00:51:59.280 | with a technology called NVSwitch,
00:52:01.160 | and you stitch them together with either InfiniBand
00:52:04.120 | or Ethernet into a giant cluster.
00:52:09.680 | And each GPU has to be connected to every other GPU
00:52:14.600 | and know what they're thinking.
00:52:15.600 | They need to be coherent.
00:52:16.680 | They need to kind of share memory.
00:52:19.040 | For the compute to work,
00:52:21.160 | the GPUs need to work together for AI.
00:52:25.640 | And no one thought it was possible
00:52:29.920 | to connect more than 30,000 of these
00:52:32.080 | with today's technology.
00:52:35.400 | From public reports, Elon, as he so often does,
00:52:39.240 | focused deeply on this,
00:52:42.400 | thought about it from first principles,
00:52:44.360 | how long it should take, the way it should be done.
00:52:47.880 | And he came up with a very, very different way
00:52:50.320 | of designing a data center.
00:52:53.640 | And he was able to make over 100,000 GPUs coherent.
00:52:58.640 | No one thought it was possible.
00:53:02.960 | I was a last-minute add for this,
00:53:05.960 | but I would have said there were all these articles
00:53:08.920 | that were being published in the summer
00:53:11.560 | saying that no one believed he was gonna be able to do it.
00:53:15.000 | It was hype.
00:53:16.280 | It was, you know, ridiculousness.
00:53:18.880 | And that was coming.
00:53:20.120 | The reason the reporters felt comfortable
00:53:22.120 | writing those silly stories
00:53:24.080 | is because engineers at Meta and Google and other firms
00:53:28.280 | were saying, "We can't do it.
00:53:29.880 | There's no way he can do it."
00:53:31.440 | - Hmm.
00:53:33.040 | - He did it.
00:53:34.400 | And I think the world really only believed it
00:53:36.840 | when, you know, Jensen did that podcast,
00:53:39.840 | I think with, wasn't it with Gerstner?
00:53:42.800 | - It might've been with Gerstner.
00:53:44.200 | - Yeah, I think it was with Gerstner,
00:53:45.280 | and said, "What Elon did was superhuman.
00:53:47.760 | No one else could have done it."
00:53:49.800 | And I actually think you can argue
00:53:52.160 | that Elon doing that, in a lot of ways,
00:53:54.760 | kind of saved NVIDIA from a tough six-month period
00:53:58.280 | when Blackwell was delayed,
00:54:00.080 | because everyone who was waiting for Blackwell
00:54:01.960 | and thought it was impossible
00:54:02.880 | to make 100,000 hoppers coherent,
00:54:05.400 | rushed out and bought a lot of hoppers
00:54:07.680 | to try and do it themselves.
00:54:09.080 | Now, we will see if someone else is able to do it.
00:54:12.120 | It was really, really hard.
00:54:13.680 | No one else thought it was possible.
00:54:17.440 | And as a result of that,
00:54:21.440 | Grok 3 is in training now
00:54:23.600 | on this giant Colossus supercomputer,
00:54:26.400 | the biggest in the world, 100,000 GPUs.
00:54:27.920 | - In Memphis.
00:54:29.040 | - In Memphis.
00:54:30.120 | - In Memphis.
00:54:30.960 | - At the old Electrolux factory,
00:54:32.240 | and they're putting a lot of energy in there,
00:54:34.200 | a lot of natural gas, a lot of-
00:54:36.080 | - Yeah, a dingy Electrolux factory.
00:54:39.360 | - Yeah.
00:54:40.720 | - With a lot of megapacks around it,
00:54:42.240 | and the city of Memphis is all in on supporting this.
00:54:45.200 | - Yeah.
00:54:46.040 | - Which is obviously smart for them.
00:54:47.960 | But you have not had a real test
00:54:49.480 | of scaling laws for training,
00:54:51.320 | arguably since GPT-4.
00:54:53.640 | And this will be the first test.
00:54:55.440 | And if scaling laws for training hold,
00:54:57.440 | Grok 3 should be a significant advance
00:55:00.040 | in the state of the art.
00:55:01.240 | That is an immensely,
00:55:06.000 | from a Bayesian way to look at the world,
00:55:08.400 | that is an immensely important data point.
00:55:10.920 | But if that card doesn't work,
00:55:15.800 | and I think it is gonna work,
00:55:17.160 | I think Grok 3 is gonna be really good,
00:55:19.480 | I should note that I am-
00:55:21.360 | - Well, full disclosure,
00:55:22.440 | yeah, you're involved in xAI.
00:55:23.280 | - My firm is an investor in xAI.
00:55:25.440 | - Got it.
00:55:26.320 | - Yeah, they've raised a tremendous amount of capital,
00:55:28.200 | a lot of it from the Middle East,
00:55:29.480 | and they're supposedly gonna build Colossus
00:55:31.480 | to a million GPUs.
00:55:33.400 | Is this data center going to be 10 times bigger than it is currently?
00:55:37.120 | There's been some debate back and forth, Freeberg,
00:55:39.080 | about, "Hey, are we hitting a wall here?"
00:55:42.440 | Maybe you could explain the wall,
00:55:43.760 | either of you, to the audience, yeah.
00:55:45.920 | - David.
00:55:46.760 | - Well, I'll let Gavin speak to the wall.
00:55:49.040 | I mean, Gavin, I think one of the questions also is,
00:55:52.120 | you know, do we see
00:55:54.880 | an evolution if the kind of increment in performance
00:55:59.120 | relative to the investment in
00:56:01.080 | net kind of training, compute resources, declines,
00:56:08.080 | do we start to see a shift in how the architecture
00:56:11.920 | of the systems are run?
00:56:13.760 | Meaning, like, do we start to build models of models,
00:56:16.120 | and that starts to resolve a higher-level architecture
00:56:18.800 | that unlocks new performative capabilities?
00:56:23.720 | - I would just say we're already building models of models.
00:56:25.720 | You know, almost every application startup I'm aware of
00:56:28.720 | is chaining models.
00:56:29.720 | You know, you start with a cheap model,
00:56:31.200 | you check the cheap model's work
00:56:32.440 | with a more expensive model.
00:56:34.360 | You know, lots of very clever things are being done.
00:56:36.240 | You know, every AI application company
00:56:38.880 | has what's called a router,
00:56:40.280 | so they can, you know, swap out the underlying model
00:56:42.480 | if another one is better for the task at hand.
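
The "router plus chained models" pattern Gavin mentions can be sketched in a few lines. The model calls and the routing heuristic below are placeholders, not any particular vendor's API.

```python
# Toy sketch of chaining a cheap model with a stronger checker behind a router.

def cheap_model(prompt: str) -> str:
    return f"[cheap draft] {prompt[:40]}"      # placeholder for a low-cost LLM call

def strong_model(prompt: str) -> str:
    return f"[strong answer] {prompt[:40]}"    # placeholder for a frontier LLM call

def looks_hard(prompt: str) -> bool:
    # Toy routing rule; real routers score task type, length, past quality, and cost.
    return len(prompt.split()) > 50 or "prove" in prompt.lower()

def answer(prompt: str) -> str:
    if looks_hard(prompt):
        return strong_model(prompt)
    draft = cheap_model(prompt)
    # "Check the cheap model's work with a more expensive model."
    verdict = strong_model(f"Is this answer acceptable? {draft}")
    return draft if "acceptable" in verdict.lower() else strong_model(prompt)

print(answer("What's two plus two?"))
```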
00:56:44.760 | As far as what the wall is,
00:56:49.640 | there's been a big debate that we were hitting a wall
00:56:52.320 | on these scaling laws,
00:56:54.120 | and that scaling laws were breaking down.
00:56:57.360 | And I just thought that was deeply silly,
00:56:59.400 | 'cause no one had built a cluster bigger than,
00:57:02.320 | you know, 32,000 H100s.
00:57:04.480 | And nobody knew.
00:57:07.120 | It was a ridiculous debate.
00:57:10.200 | And there were, you know, really smart people on both sides,
00:57:13.200 | but there's no new data.
00:57:15.680 | Grok 3 is the first new data point
00:57:18.880 | to support whether or not scaling laws
00:57:20.480 | are breaking or holding.
00:57:21.920 | Because no one else thought
00:57:23.520 | you could make 100,000 hoppers coherent.
00:57:25.680 | And I think based on public reports,
00:57:28.360 | they're going to 200,000 hoppers.
00:57:31.480 | And then the next tick is a million.
00:57:33.760 | It was reported they're gonna be first in line
00:57:35.400 | for Blackwell.
00:57:36.600 | But Grok 3 is a big card and will resolve
00:57:39.600 | this question of whether or not we're hitting a wall.
00:57:44.000 | The other question you raised, David, is very interesting.
00:57:46.840 | And by the way, we should note,
00:57:48.040 | there is now a new axis of scaling.
00:57:51.440 | Some people call it test time compute.
00:57:52.960 | Some people call it inference scaling.
00:57:54.640 | And basically the way this works,
00:57:55.840 | you just think of these models as human.
00:57:58.000 | The more you speak to one of these models,
00:57:59.720 | the way you'd speak to your like 17 year old
00:58:01.680 | going off to take the SAT, the better it will do for you.
00:58:04.960 | As a human, you know, if I asked you, David,
00:58:06.440 | what's two plus two, four flashes in your mind right away.
00:58:10.000 | If I ask you to, you know, unify a grand unified theory
00:58:12.920 | of physics that accounts for both quantum mechanics
00:58:15.280 | and relativistic physics, you will think for a lot longer.
00:58:18.800 | - 147.
00:58:19.960 | - Yeah, nobody knows.
00:58:21.520 | We have been giving these models the same amount of time
00:58:25.720 | to think no matter how complicated the question was.
00:58:28.840 | What we've now learned is if you let them think for longer
00:58:31.120 | about more complex questions, test time compute,
00:58:34.480 | you can dramatically improve their IQ.
00:58:36.240 | So we're just at the beginning of this new scaling law.
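
One simple form of test-time compute is self-consistency: sample the same question several times and keep the majority answer, so harder questions can be given a larger sampling budget. The solver below is a stub standing in for a real model call with temperature above zero.

```python
# Toy sketch of a test-time-compute strategy (self-consistency / majority vote).
import random
from collections import Counter

def sample_answer(question: str) -> str:
    # Stand-in: a noisy solver that is right 60% of the time.
    return "4" if random.random() < 0.6 else str(random.choice([3, 5, 22]))

def answer_with_budget(question: str, samples: int) -> str:
    votes = Counter(sample_answer(question) for _ in range(samples))
    return votes.most_common(1)[0][0]

random.seed(0)
for budget in (1, 5, 25):   # more samples = more "thinking" spent on the question
    print(budget, "samples ->", answer_with_budget("what is 2 + 2?", budget))
```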
00:58:40.080 | But I think the question you raised on ROI is very good.
00:58:42.880 | And I'm happy to address it.
00:58:46.760 | And there's a context window shift underway as well,
00:58:50.600 | which also creates a new kind of scaling axis arguably
00:58:55.160 | in terms of the potential set of applications.
00:58:57.680 | So networks of models, think time, context window,
00:59:02.680 | there are multiple dimensions upon which these tools
00:59:06.400 | ultimately kind of resolve to better performance.
00:59:09.840 | - Oh yeah, even if scaling laws for training break,
00:59:13.320 | we have another decade of innovation ahead of us.
00:59:15.200 | - Exactly, and as my understanding from speaking to folks,
00:59:18.760 | I'm certainly not as deep and well versed as you,
00:59:20.880 | but there's a lot of effort and research going on
00:59:23.560 | in re-engineering various parts of the stack
00:59:28.120 | to reduce energy, to reduce every resource
00:59:31.160 | that effectively drives model performance,
00:59:33.440 | to basically re-engineer architecture.
00:59:35.120 | It was all like very brute force for a period of time.
00:59:37.840 | And it was like push, push, push.
00:59:39.040 | But now as we go back and we start to re-engineer
00:59:40.880 | and architect things in perhaps a more designed way,
00:59:43.360 | we get better performance
00:59:44.840 | and there's a lot of work to do there still.
00:59:46.560 | - Absolutely.
00:59:47.880 | - This is one of the great things about capitalism
00:59:50.320 | and a functioning capital market
00:59:52.400 | is you've got people working just on the context window.
00:59:54.560 | For people who don't know what that is,
00:59:55.720 | is that's the number of tokens.
00:59:57.760 | A token is essentially a word you can think of it,
00:59:59.960 | a piece of information.
01:00:01.360 | The number of tokens you can put into a conversation
01:00:06.000 | within a large language model.
01:00:07.800 | Some people have really large context windows,
01:00:09.640 | some people have smaller ones,
01:00:10.480 | but you can basically put an entire book
01:00:11.960 | in the context window and start asking questions
01:00:14.120 | against the model.
01:00:15.440 | And the speed of those is critically important as well,
01:00:18.280 | because if you put the book in there
01:00:19.240 | and it takes you 10 minutes to get an answer,
01:00:20.880 | that's not functional, right?
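
The context-window arithmetic here is easy to sketch. The four-characters-per-token figure is a common rule of thumb for English text, and the window sizes are illustrative rather than tied to any particular model.

```python
# Rough token math for "putting an entire book in the context window."

def estimated_tokens(text_chars: int) -> int:
    return text_chars // 4          # ~4 characters per token is a common rule of thumb

book_chars = 120_000 * 6            # ~120k words at ~6 chars per word incl. spaces
book_tokens = estimated_tokens(book_chars)

for window in (8_000, 128_000, 1_000_000):   # illustrative window sizes
    fits = "fits" if book_tokens <= window else "does NOT fit"
    print(f"~{book_tokens:,} token book vs {window:,} token window: {fits}")
```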
01:00:22.840 | - Gavin, are you an investor in OpenAI?
01:00:25.240 | - Oh, absolutely not.
01:00:26.680 | - Yeah.
01:00:27.520 | Can you kind of theorize on what the build out
01:00:30.200 | that's being done with Colossus does
01:00:32.600 | to the advantage that open AI has today?
01:00:35.400 | How long till we kind of catch up there with xAI
01:00:38.440 | and how much is going to be disrupted
01:00:41.160 | and how quickly here?
01:00:42.160 | - Well, if scaling laws hold,
01:00:44.440 | the best information I have is the largest cluster
01:00:50.920 | Microsoft has after panicking.
01:00:53.360 | It's still smaller than XAI's cluster in Memphis.
01:00:57.160 | If you didn't believe that it was possible,
01:00:59.320 | you weren't even working on it.
01:01:00.880 | Grok 3 should take the lead if scaling laws hold
01:01:06.120 | in January or February.
01:01:11.080 | I do think a lot of talent has left OpenAI.
01:01:13.600 | I thought it was a really shocking statement
01:01:20.200 | from Mira Murati that she resigned during a fundraise.
01:01:25.200 | That's the only way she can express disapproval
01:01:28.040 | of what is going on there
01:01:29.720 | and still probably get her money.
01:01:32.040 | - Right.
01:01:33.560 | - So I think there's a lot of reasons
01:01:38.560 | if scaling laws hold, to be optimistic about Grok 3.
01:01:42.280 | But I think, and then by the way, on the power question,
01:01:47.440 | and in '23 and '24, it was just a panic to get GPUs
01:01:52.440 | and get them plugged in.
01:01:54.600 | Now we're trying to make them efficient and thoughtful
01:01:57.600 | and to your point, re-architecting them.
01:01:59.160 | - And the H200s now are 50% less power
01:02:02.040 | and either 50% more or twice as much compute,
01:02:06.280 | depending on the task.
01:02:07.480 | - They have a little more compute and a lot more memory,
01:02:09.280 | which really matters.
01:02:10.880 | So per kind of a unit of effective compute,
01:02:13.560 | they're a lot more power efficient.
01:02:15.320 | - Two or three times, you think, or?
01:02:18.480 | - No, the H200s, probably not 2X,
01:02:22.320 | but a good increment and the H100 was a great chip.
01:02:25.160 | - So 50%, yeah.
01:02:26.600 | - Yeah, Blackwell's just around the corner
01:02:28.040 | and that's an entirely new architecture
01:02:30.120 | with an entirely new set of networking technologies.
01:02:33.040 | - What would consumers, if we had to sort of speculate here,
01:02:35.600 | what would consumers,
01:02:37.320 | how would consumers' experiences change
01:02:40.000 | in using forward-facing language models?
01:02:42.920 | And then maybe what are developers going to see
01:02:44.760 | on the back end, you know,
01:02:45.760 | in terms of what they're going to be able to build?
01:02:47.920 | If this pans out in the next short term, two years.
01:02:52.360 | - Right now, you have like a friend in your pocket
01:02:55.240 | who has an IQ of 115, 110 maybe,
01:03:00.240 | but has all of the world's knowledge accessible to it.
01:03:04.720 | And that's what makes it amazing.
01:03:06.480 | I think this will be like,
01:03:07.400 | you have a friend in your pocket,
01:03:08.440 | but, and they sometimes make things up.
01:03:11.320 | Again, they're very human.
01:03:13.000 | And a lot of humans,
01:03:13.840 | when they don't know the answer, they dissemble.
01:03:16.880 | These AIs do it too.
01:03:18.480 | So you will have a friend in your pocket
01:03:21.440 | with an IQ of maybe 130 that knows everything,
01:03:25.680 | has more up-to-date knowledge of the world,
01:03:28.040 | and is more grounded in factual accuracy.
01:03:33.880 | And it is interesting,
01:03:34.720 | for any question involving real-time information,
01:03:37.920 | mostly sports and finance,
01:03:39.920 | you know, I always, you know,
01:03:40.920 | if there's a stock down 25%,
01:03:42.520 | ask every AI, "Why is the stock down 25%?"
01:03:45.680 | Generally, Grok is the one that knows.
01:03:48.160 | - Yeah, Grok actually-- - But I'm obviously biased.
01:03:49.880 | - Yeah, no, no. - Grok knows.
01:03:51.400 | - Grok, because of the Twitter data set,
01:03:53.920 | - Exactly. - knows what is happening
01:03:56.120 | at the moment.
01:03:57.160 | - In the world today.
01:03:58.640 | - All right, and then, you know,
01:03:59.640 | as we sort of wrap up here on the AI,
01:04:02.240 | what about the ROI here that Deva was mentioning?
01:04:06.240 | - Yeah, so I find these debates also very funny, you know.
01:04:10.640 | There have been articles written
01:04:12.080 | about multi-hundred-billion-dollar ROI questions.
01:04:15.840 | Those are very strange to me,
01:04:17.440 | because the biggest spenders on GPUs are public companies,
01:04:22.440 | and they report financial results every quarter.
01:04:26.120 | And you can calculate a metric
01:04:27.600 | called return on invested capital,
01:04:29.760 | - Yes. - and ROIC, ROI,
01:04:32.520 | has gone vertical since they ramped their CapEx on GPUs,
01:04:37.200 | and actually just started to level out
01:04:38.600 | in this latest quarter.
01:04:39.840 | So the ROI on AI has been very positive thus far,
01:04:43.640 | just a fact.
01:04:45.080 | It's a really good question.
01:04:47.320 | Will it continue, particularly if, you know,
01:04:49.320 | it's gonna cost a hundred billion dollars
01:04:50.760 | to train a model in two or three years,
01:04:53.400 | which I think is a realistic estimate.
01:04:56.920 | - I guess the counter to that is
01:04:59.440 | that maybe there's a little bit of hype,
01:05:01.720 | that, you know, maybe people
01:05:04.160 | are trying to determine the ROI
01:05:06.520 | and correlate it more precisely.
01:05:08.320 | And I guess that's the challenge, you know,
01:05:10.640 | meta doing AI across its entire enterprise,
01:05:13.840 | you might see, and Google,
01:05:14.960 | you might see it directly making ads
01:05:18.880 | more effective, as an example.
01:05:20.040 | - 100%, yeah.
01:05:20.880 | - So that's happening. - Meta and Google
01:05:21.720 | have shown the best ROIs on AI.
01:05:23.240 | - Yeah, and I think that, you know,
01:05:24.800 | that's happening.
01:05:26.760 | - Yeah, but then for other folks, like, you know,
01:05:30.240 | is it actually happening, or is it a toy, I guess,
01:05:32.880 | is the criticism I hear.
01:05:34.000 | I'm not saying that's my position,
01:05:35.000 | but that's the criticism I hear is like,
01:05:36.720 | are people actually getting money from the copilot,
01:05:39.400 | or maybe this is just product market fit discovery process,
01:05:42.520 | 'cause the AI laptops, AI intelligence on Apple,
01:05:47.480 | and let's say some general LLMs people feel
01:05:51.400 | maybe aren't worth the money,
01:05:52.400 | or copilot for Microsoft, maybe not worth the money.
01:05:55.680 | - Yeah, I mean, I personally have not had good experiences
01:05:57.800 | with copilot, but I would say
01:06:00.000 | that, and I'm sure both of you have come across these,
01:06:06.200 | there are lots of companies that are just these thin wrappers
01:06:10.720 | over a foundation model,
01:06:13.120 | and they go from zero to 40 million instantaneously,
01:06:16.920 | and they're profitable, and for their customers,
01:06:20.640 | they're replacing labor budgets.
01:06:22.880 | - Yes. - And I think,
01:06:23.760 | I'm sure you guys are noticing this too,
01:06:25.800 | but startups today at a given size
01:06:28.000 | are employing fewer people
01:06:29.360 | than they would have three years ago.
01:06:31.120 | And just like, you know, it's funny,
01:06:32.240 | people were very skeptical. - I would say 50% less.
01:06:34.840 | - Yeah, and that's the ROI on AI.
01:06:36.840 | And like in, you know,
01:06:37.680 | I went to the first AWS re:Invent conference,
01:06:39.800 | and no big companies were using cloud computing.
01:06:42.240 | It was all startups.
01:06:43.400 | Startups always adopt technologies first.
01:06:46.480 | So outside of the ROI on AI that you're seeing
01:06:49.320 | in Google and Meta from, you know,
01:06:51.800 | using this across their businesses,
01:06:54.000 | you're seeing real ROI on AI from startups
01:06:56.600 | the same way they saw real ROI
01:06:58.240 | from cloud computing before anyone else.
01:07:00.120 | - It's crazy.
01:07:01.400 | - But I do think these companies
01:07:03.840 | are in a classic prisoner's dilemma.
01:07:05.400 | They all believe to varying degrees
01:07:08.760 | that whoever gets there first
01:07:10.240 | to artificial superintelligence is gonna create tens
01:07:13.480 | or hundreds of trillions of dollars of value,
01:07:15.600 | and I think they may be right if they get there,
01:07:19.120 | and they think that if they lose the race,
01:07:21.280 | their company is at mortal risk.
01:07:24.000 | So as long as one person is spending,
01:07:27.320 | I think they will all spend, even if the ROI decelerates.
01:07:30.760 | It is a classic prisoner's dilemma.
01:07:32.640 | - Competition is amazing.
01:07:34.680 | It really is when you have a free market with competition.
01:07:37.120 | We were sitting here two years ago,
01:07:38.280 | three years ago on this pod, Gavin,
01:07:40.200 | just lamenting like, oh my God, what could happen?
01:07:43.440 | And China could just roll over us.
01:07:44.880 | They've got everything dialed in.
01:07:46.120 | We're a disaster, and now here we are.
01:07:48.360 | China's a disaster.
01:07:49.520 | They've got all kinds of challenges,
01:07:50.840 | and the free market is, even with weapons systems,
01:07:53.600 | we're seeing capitalism applied and competition applied there.
01:07:56.200 | All right, I think we should just wrap up
01:07:57.920 | on this Brian Thompson story.
01:07:59.520 | The CEO of UnitedHealthcare was shot and killed
01:08:01.960 | outside a Manhattan hotel on Wednesday morning.
01:08:04.960 | UnitedHealthcare is an insurance subsidiary
01:08:07.040 | of UnitedHealth Group,
01:08:08.440 | and they employ over 140,000 people.
01:08:11.800 | They provide coverage to millions.
01:08:13.960 | And this was, I don't know if you've seen them,
01:08:18.240 | I'm assuming you guys have seen the video
01:08:19.480 | on social media or go by on X.
01:08:21.320 | There's been a big debate.
01:08:24.040 | Was this like a really well-trained killer,
01:08:26.920 | like an assassin, like a hired gun?
01:08:29.000 | Was it something personal maybe?
01:08:31.080 | And this person's a hack,
01:08:32.120 | and they're not really good at doing a hit like this,
01:08:34.960 | and most people are sort of coming somewhere in between.
01:08:37.760 | ABC has reported, and this is where
01:08:41.320 | this thing has taken like a crazy turn,
01:08:43.800 | that the words deny, defend, and depose
01:08:47.200 | were discovered on the bullet casings at the scene.
01:08:50.560 | And those are the terms, two of the three are in a book
01:08:54.720 | about how health insurers reject
01:08:57.920 | many of the claims every year.
01:08:58.920 | They deny it, they defend it,
01:09:00.400 | and then they will depose people
01:09:01.760 | to harass them essentially.
01:09:03.200 | This is a crazy, crazy story, and it's breaking here.
01:09:09.560 | I don't know if anybody has any insights on it,
01:09:12.000 | but I thought I would bring it up here
01:09:14.320 | to just maybe intelligently speculate about what we've seen.
01:09:18.600 | - I mean, it seems like the most likely case
01:09:20.360 | is someone's loved one died
01:09:22.760 | 'cause they were denied coverage.
01:09:24.640 | And whether this person was hired,
01:09:26.560 | or they're a victim, or related to a victim,
01:09:29.400 | I don't know if it matters.
01:09:31.640 | I think there's,
01:09:32.720 | the observation I'll make is a question,
01:09:38.400 | which is, should CEOs be personally responsible
01:09:43.560 | for corporate actions, generally speaking?
01:09:46.800 | So there's a difference between a CEO committing fraud
01:09:51.520 | or being negligent.
01:09:53.400 | But if you don't get a good service,
01:09:58.800 | or a good quality of service, or the product you expect,
01:10:01.240 | even if it is something you depend on
01:10:04.560 | for healthcare, for example.
01:10:07.960 | Let's say you take a drug,
01:10:09.680 | and the drug causes a side effect
01:10:11.400 | that causes some permanent damage.
01:10:13.040 | Should the CEO be individually held accountable?
01:10:16.040 | And if that were the case,
01:10:17.600 | would anyone wanna be a CEO of a company
01:10:20.240 | that sets out to provide services
01:10:22.280 | that are critical like this?
01:10:23.880 | It's a very, I think, challenging question to think about,
01:10:26.800 | because certainly you could feel
01:10:29.600 | like you wanna hold someone responsible
01:10:31.240 | because a loved one died,
01:10:32.600 | because that CEO's job was to make more money
01:10:36.480 | for their shareholders, and therefore deny claims.
01:10:39.560 | And therefore, the way that they run that business is wrong.
01:10:42.560 | I think ultimately, we've gotta kind of
01:10:45.640 | have this distinction between negligence, fraud,
01:10:50.760 | and acting on the corporate behalf.
01:10:53.680 | There's a, for a period of time,
01:10:55.720 | all of these documentaries and this movement
01:10:59.240 | against the idea of the corporation, generally speaking.
01:11:02.120 | You guys remember this was like a decade ago,
01:11:03.840 | or 15 years ago. - Anti-capitalism
01:11:05.760 | movements, yeah. - It is.
01:11:07.200 | Yeah, and it's like the corporation shields individuals,
01:11:11.640 | and the corporation creates a shield
01:11:13.760 | for individuals to do harm,
01:11:15.920 | is kind of the argument that's made.
01:11:18.200 | And so there's a lot of people in this camp
01:11:20.000 | that think that these old CEOs of companies
01:11:22.440 | that let people down are evil,
01:11:24.200 | should be killed, should be put in jail,
01:11:25.760 | whatever the awful kind of contextualization of that is.
01:11:29.200 | And I do think it's really important to think about,
01:11:32.200 | well, if no one were to be the CEO
01:11:34.000 | because they faced that threat,
01:11:35.680 | and those companies can't make money as a business,
01:11:37.800 | then those services go away entirely.
01:11:40.000 | That's kind of the end state of where this goes.
01:11:41.960 | There's gonna be these difficult situations.
01:11:44.480 | If a CEO does something negligent, fraudulent, wrong,
01:11:47.960 | there's a court and there's a system,
01:11:50.600 | and there should be kind of laws that protect people.
01:11:53.240 | I don't know, man, the whole thing is pretty depressing
01:11:56.320 | to think that a guy who's the CEO,
01:11:57.920 | he's, from what I heard-- - This guy's got a family,
01:12:00.360 | yeah, I mean-- - He's got a family,
01:12:01.560 | he's got a wife, people that have known him
01:12:03.840 | said he was like a nice guy and a good guy,
01:12:05.880 | and he runs a tough business.
01:12:08.080 | Insurance is a very tough business.
01:12:09.800 | - Well, and to your point about this seems
01:12:11.680 | like it's very personalized with the casings,
01:12:14.320 | if that is in fact true, again, this is breaking news,
01:12:16.480 | so we're speculating here, hopefully informed.
01:12:19.200 | This chart has been the one that's been circulating
01:12:21.520 | on social media now.
01:12:22.800 | This has now become, Gavin, like a Rorschach test
01:12:25.480 | about how you feel about corporate America,
01:12:28.320 | healthcare, et cetera.
01:12:30.120 | But UnitedHealthcare, at least according to these charts
01:12:32.960 | that are speculating, deny claims at a rate,
01:12:37.000 | at a multiple, two or three x other people in the industry
01:12:41.240 | and that they are the most hated.
01:12:43.040 | Not that any of this obviously results
01:12:45.440 | in somebody deserving to die.
01:12:48.920 | I mean, I can't believe I'm even saying this,
01:12:50.560 | but Taylor Lorenz went viral with a series of tweets,
01:12:55.560 | not tweets, I think she's on whatever the blue thing
01:13:01.960 | is, Sky Blue, Blue Sky, where she wrote some quotes here,
01:13:06.960 | "And people wonder why we want these executives dead,"
01:13:12.080 | Lorenz wrote on Blue Sky, a microblogging social media site
01:13:14.440 | alongside an article about how
01:13:17.320 | Blue Cross Blue Shield no longer covers anesthesia
01:13:20.520 | for the following surgeries.
01:13:23.480 | That's according to the New York Post.
01:13:26.040 | Gavin, your thoughts on this insanity and tragedy?
01:13:29.000 | - Yeah, well, first of all, it's a tragedy.
01:13:30.560 | And actually I'm in New York as we record this
01:13:33.040 | and it happened one block from where I am.
01:13:35.920 | And I mean, it is absolutely, it's a human tragedy.
01:13:43.520 | Taylor Lorenz was not the only person on social media
01:13:47.320 | who reacted that way, which I thought was deeply troubling.
01:13:50.320 | - It was a large group of people
01:13:51.840 | who were writing very morbid comments.
01:13:54.160 | There was a lot of anger around this issue
01:13:56.960 | that I was unaware of.
01:13:59.440 | I was completely unaware of it.
01:14:01.040 | - Well, I think we all probably are very privileged
01:14:04.080 | to have great healthcare and able to pay our premiums.
01:14:07.040 | So we're not affected by this at this stage in our lives.
01:14:09.800 | I was very affected by this 30 years ago.
01:14:11.480 | - Yeah, we're all very lucky on this podcast.
01:14:14.520 | - Yeah.
01:14:15.360 | - And like, I can't imagine how I would feel
01:14:19.040 | if someone I loved had been denied medical care and died.
01:14:24.040 | And I felt like it was unnecessary
01:14:28.000 | and due to some corporation trying to make more money.
01:14:33.000 | But I just, I cannot believe, I'm sure I'd be outraged.
01:14:41.200 | I just can't believe I was deeply disturbed as a person
01:14:46.200 | by the number of people online celebrating this.
01:14:50.560 | - It's really crazy.
01:14:52.240 | - Yeah, I thought it was awful.
01:14:56.600 | - And I guess the last thing I would just say
01:14:57.840 | about that chart is I think it is a little misleading.
01:15:00.240 | I think that is initial denial.
01:15:02.040 | You know, maybe the, you know, in other words,
01:15:06.440 | maybe the company that, you know, only denies 7%,
01:15:09.000 | that's a final deny.
01:15:10.120 | You know, that's a company that--
01:15:11.960 | - And who knows if the chart's even correct.
01:15:14.120 | I bring it up only that it is the trending item
01:15:17.800 | and people are, I find in these tragedies,
01:15:20.160 | they become like Rorschach tests, right?
01:15:22.440 | - Yeah, UnitedHealthcare's medical loss ratio
01:15:25.840 | is about 85%.
01:15:27.960 | So 85 cents-- - Explain what that means, yeah.
01:15:29.880 | - So 85 cents of every dollar
01:15:31.800 | they collect in insurance premium,
01:15:33.280 | they're paying out in claims.
01:15:35.800 | If you guys wanna look at what the most egregious
01:15:37.800 | insurance industry in the world is, it's title insurance.
01:15:40.840 | And I'll give you the list of the rest.
01:15:42.200 | Travel insurance is pretty bad.
01:15:44.040 | They pay out like nothing.
01:15:44.880 | - Right, you were in the insurance business
01:15:46.040 | for a bit there, yeah.
01:15:46.880 | - Yeah, like, I mean, you know,
01:15:48.160 | health insurance is the hardest, one of the hardest,
01:15:50.320 | besides auto insurance businesses to be in.
01:15:53.120 | You're paying out constantly
01:15:55.040 | and there is a very difficult kind of process
01:15:57.920 | of managing losses 'cause the number of claims
01:16:00.560 | that comes in, it's very easy to suddenly pay everything out
01:16:03.280 | and then your premium goes up
01:16:04.520 | and then people can't afford the health insurance.
01:16:06.040 | So you're striking this balance
01:16:07.880 | of making health insurance affordable
01:16:09.880 | against the cost of medical claims.
01:16:11.720 | So it's a very kind of difficult business to be in.
01:16:14.240 | And I think it's very complicated to walk people
01:16:16.680 | through how they and their individual circumstances
01:16:20.360 | aren't necessarily motivated by some corporate,
01:16:24.120 | you know, malfeasance.
01:16:25.640 | It's just the way the thing has to operate, unfortunately.
01:16:28.800 | Let me just say, I think that they're,
01:16:30.680 | much like we saw with Hamas
01:16:32.400 | and the attacks in Israel a year ago,
01:16:35.760 | there were people celebrating that behavior.
01:16:39.280 | And we've seen that several times since,
01:16:41.200 | and people have become very vocal about their celebration
01:16:44.400 | of what they view to be the death and harm done
01:16:48.720 | to those who they view to be oppressors.
01:16:51.400 | In whatever context you wanna kind of fit this to
01:16:54.400 | and put that oppressor label on an individual.
01:16:57.120 | And that mindset seems to hold true
01:16:59.920 | through any social, financial, economic, political context.
01:17:03.760 | There is an oppressor group and there's an oppressed group.
01:17:06.200 | And if you're in the oppressor group, you deserve harm,
01:17:08.840 | you deserve death, you deserve jail.
01:17:11.480 | And this is another manifestation
01:17:13.280 | of that mindset playing out.
01:17:15.560 | This is an individual with a family who ran a business,
01:17:18.280 | who worked very hard for many years
01:17:19.840 | and wasn't trying to hurt people.
01:17:21.760 | And for him to kind of have his life taken like this
01:17:25.040 | and for people to say, that person is an oppressor,
01:17:27.800 | I think really speaks to how deeply people's minds
01:17:31.120 | have been contorted by this concept
01:17:33.240 | that there's oppressors and oppressed.
01:17:34.640 | - It's almost like a mind virus.
01:17:36.160 | - And I'm not gonna call it the woke mind virus,
01:17:37.840 | but 'cause I just think you immediately shut down
01:17:40.320 | and won't hear that 'cause it sounds cliche.
01:17:42.480 | But there is this concept of like,
01:17:44.200 | everyone is in one of two groups.
01:17:45.880 | You're either being oppressed or you're an oppressor.
01:17:47.720 | And if you're an oppressor,
01:17:49.120 | there is no limit to what I should do
01:17:50.520 | or what should be done to you.
01:17:51.520 | - I think it's well said.
01:17:52.360 | - And it's a very stark and sad
01:17:55.240 | kind of commentary on what's going on right now.
01:17:57.680 | Sorry.
01:17:58.520 | - If you're oppressed, the corollary to that is,
01:17:59.560 | if you are considered oppressed, you can do no wrong.
01:18:02.880 | - You can do no wrong.
01:18:03.880 | - Yeah.
01:18:04.720 | - And that is, obviously there are people
01:18:07.120 | who are oppressed on this planet,
01:18:09.240 | but they can still do wrong.
01:18:11.360 | They should still be held to a moral standard.
01:18:14.240 | - Yeah, and there is a moral standard
01:18:16.160 | and there is a standard in murder, obviously,
01:18:18.280 | and assassination like this is not acceptable.
01:18:21.880 | And it's just tragic.
01:18:23.640 | I just feel so terrible for these kids.
01:18:25.560 | - Let's end on a happy note.
01:18:26.600 | So we'll see you on Saturday, big party.
01:18:28.920 | - Yeah. - Excited.
01:18:29.760 | - Hard turn here, but looking forward
01:18:31.680 | to seeing everybody on Saturday for the all-in holiday.
01:18:34.080 | - Zoom is sponsoring the--
01:18:35.400 | - All-in holiday spectacular.
01:18:37.080 | So if you want to go to allin.com,
01:18:39.120 | you can go sign up there, right, Freebird?
01:18:41.400 | - Yeah, and then you can sign up for the Zoom.
01:18:44.200 | Gavin, we expect you to sign up.
01:18:45.880 | - Yes, we have a caviar budget
01:24:48.960 | we have to replenish over here at Sacks'.
01:18:50.720 | - We do appreciate Zoom helping us out for the event.
01:18:52.960 | Zoom AI companion helped set up the live stream.
01:18:56.760 | So thanks to Zoom for doing that.
01:18:58.040 | It's gonna be awesome. - Look at that,
01:18:58.880 | AI everywhere.
01:18:59.720 | I have to say, you know what, one of the great,
01:19:01.120 | I have a couple of experience with AI that are really great.
01:19:04.440 | Notion has a phenomenal AI built into it,
01:19:07.280 | and so does Zoom with the AI summaries.
01:19:11.320 | I love getting these AI summaries of calls.
01:19:13.360 | I ask people permission to turn it on
01:19:14.880 | and we get a summary and the bullet points.
01:19:16.280 | Really well done.
01:19:17.480 | - I went to, we did a hackathon in the office
01:19:20.200 | a couple of weeks ago, and we used,
01:19:21.560 | have you guys actually built applications with Cursor?
01:19:24.720 | You guys ever done it? - I have not.
01:19:25.960 | Was it fun to build an app?
01:19:27.920 | - Well, it was great because we had so many people
01:19:30.120 | that have never built software applications
01:19:32.640 | in the hackathon, and they built tools from scratch,
01:19:35.840 | deployed them in production, and are now using them.
01:19:39.240 | And I think it really showcases
01:19:41.000 | the kind of the impressive impact
01:19:43.880 | that AI is having on workplace productivity.
01:19:46.040 | You don't need to buy SaaS tools.
01:19:47.480 | You don't need to have service providers
01:19:48.920 | do stuff for you.
01:19:49.760 | Individuals, and by the way, it's only getting better,
01:19:52.400 | and you can kind of think through a point.
01:19:53.240 | - I mean, the ability to just wake up in the morning
01:19:55.480 | and say, I want this app and have it built for you.
01:19:58.000 | - It is, and by the way, I'll just say,
01:19:59.360 | it's like 70, 80% there.
01:20:01.240 | You still have to debug.
01:20:02.240 | You still have to have someone come in and help
01:20:03.840 | kind of get things into production.
01:20:05.040 | So there's still a little bit of work.
01:20:06.080 | - Well, it's just like web pages, right?
01:20:07.640 | - Going back to the architecture question,
01:20:11.120 | two months ago, and then go back to the architecture.
01:20:12.720 | - Two months ago, it was 60% there.
01:20:13.960 | Now it's 70 to 80.
01:20:15.080 | - And you fast forward 12 months,
01:20:16.560 | and now you've got the architecture
01:20:18.200 | where the AI can run its own QA testing and debugging,
01:20:21.440 | and the AI can run its own kind of UX.
01:20:23.320 | - Sales and marketing.
01:20:25.040 | Customer support.
01:20:25.880 | - Its own UX of the application,
01:20:27.440 | and it can run everything.
01:20:28.640 | So you're basically going to say,
01:20:29.760 | I want this app to do this.
01:20:31.360 | It builds it.
01:20:32.180 | It tests it.
01:20:33.200 | It builds the UX.
01:20:34.240 | It tests the UX.
01:20:35.080 | It iterates the UX.
01:20:35.920 | It does everything streamlined for you,
01:20:37.840 | and then you show up a couple hours later,
01:20:39.560 | and you're using a new product
01:20:40.600 | that was built on the fly for you.
01:20:42.840 | It is, and we're literally like climbing this ladder,
01:20:46.400 | Gavin, like very quickly,
01:20:48.280 | that it's going to totally change the entire
01:20:51.280 | software industry is like getting re-architected.
01:20:53.440 | - I could guess Freeberg's app at his company hackathon.
01:20:56.920 | - What do you think I made?
01:20:57.760 | - You made a vegan version of Yelp.
01:21:00.600 | Only profiles all the vegan restaurants.
01:21:04.040 | - I actually tried to make a CRM tool
01:21:05.640 | so that I wouldn't have to pay for CRM licenses.
01:21:08.840 | - I tried nine months ago to make a multiplayer.
01:21:10.520 | - Shout out Mark Benioff, oh!
01:21:12.920 | Sorry, go ahead, Gavin.
01:21:14.040 | - I tried nine months ago to make a multiplayer app
01:21:18.440 | for Skyrim, which I was very excited with.
01:21:20.720 | Skyrim's a big game. - That's cool.
01:21:22.040 | - But I do think I failed.
01:21:23.920 | Maybe I should give it another go, but the--
01:21:26.480 | - You learned.
01:21:27.320 | You didn't fail.
01:21:28.140 | You learned. - I learned.
01:21:28.980 | I learned.
01:21:29.800 | I didn't fail.
01:21:30.640 | I learned.
01:21:31.480 | Yeah, I have a growth mindset.
01:21:32.300 | - We learned the limits.
01:21:33.140 | We all tested the limits.
01:21:33.980 | - Yes, we stress tested the app.
01:21:35.400 | - But I think next year,
01:21:36.840 | human language will be the dominant programming language.
01:21:40.320 | - Totally.
01:21:41.360 | Totally. - It's awesome.
01:21:42.960 | - Totally.
01:21:43.800 | - You made a really great point earlier, Gavin,
01:21:47.840 | that startups are where you see these innovations happen.
01:21:52.760 | And I always say internally,
01:21:55.100 | resource constraints really do drive innovation.
01:22:02.000 | And when you only have a nickel,
01:22:04.400 | you gotta try to get a dollar out of it.
01:22:06.040 | And when you've got a lot of dollars,
01:22:08.320 | you're like, "Eh, it's okay if I get a nickel
01:22:09.700 | "out of a dollar.
01:22:10.540 | "I got more dollars laying right over here."
01:22:12.320 | And Dylan always talked about this as well.
01:22:15.240 | They asked him, "Why Blood on the Tracks?
01:22:19.000 | "Why did you do that?
01:22:19.840 | "It was incredible."
01:22:21.240 | This Rolling Stone interviewer was talking to Dylan about it.
01:22:24.080 | And he said, "This was my favorite album,
01:22:26.080 | "and this is incredible," whatever.
01:22:27.220 | And they're like, "What was the inspiration?
01:22:28.200 | "Take me behind it."
01:22:29.040 | And he said, "Well, I owed Columbia Records an album
01:22:31.280 | "and they had given me an advance
01:22:32.520 | "and they were gonna sue me and I had to give the money back
01:22:35.120 | "and I had just gone through a divorce and I needed the money
01:22:37.200 | "and I couldn't do it, so I wrote the album."
01:22:39.760 | I was crushed that one of Dylan's
01:22:44.880 | best albums and pieces of art in his life
01:22:48.420 | was strictly a function of the pressure of being-
01:22:52.320 | - Necessity is the mother of both-
01:22:53.860 | - There it is!
01:22:54.700 | - Technical and creative invention.
01:22:56.400 | Yeah. - Absolutely.
01:22:57.240 | All right, everybody, for
01:22:58.480 | - Thanks, guys.
01:22:59.800 | - Gavin Baker.
01:23:00.840 | - Joe Lonsdale.
01:23:01.880 | - David Friedberg.
01:23:03.360 | And we miss you, Sacks, taking a victory lap today.
01:23:09.720 | And Chamath, both out of the office today.
01:23:12.760 | I am the world's greatest moderator,
01:23:15.440 | and we'll see you next time on the "All In" podcast.
01:23:17.680 | Bye-bye.
01:23:18.520 | ♪ I'm going all in ♪
01:23:19.340 | - Bye-bye.
01:23:20.180 | (upbeat music)
01:23:21.260 | ♪ We'll let your winners ride ♪
01:23:23.960 | ♪ Rain Man David Sacks ♪
01:23:26.480 | ♪ I'm going all in ♪
01:23:28.420 | ♪ And instead ♪
01:23:29.260 | ♪ We open-sourced it to the fans ♪
01:23:30.480 | ♪ And they've just gone crazy with it ♪
01:23:32.320 | - Love you, besties.
01:23:33.160 | ♪ I'm the queen of quinoa ♪
01:23:34.360 | ♪ I'm going all in ♪
01:23:35.560 | ♪ We'll let your winners ride ♪
01:23:37.240 | ♪ We'll let your winners ride ♪
01:23:39.480 | ♪ We'll let your winners ride ♪
01:23:41.120 | - Besties are gone.
01:23:42.480 | (laughing)
01:23:43.920 | - That is my dog taking a notice in your driveway, Sacks.
01:23:46.500 | (laughing)
01:23:48.760 | - Oh, man.
01:23:49.600 | - My avatars will meet me at-
01:23:51.560 | - We should all just get a room
01:23:52.640 | and just have one big huge orgy,
01:23:54.240 | 'cause they're all just useless.
01:23:55.440 | It's like this sexual tension
01:23:56.960 | that they just need to release somehow.
01:23:59.560 | ♪ Let your beat be ♪
01:24:01.360 | ♪ Let your beat be ♪
01:24:04.860 | - We need to get merch.
01:24:05.680 | - Besties are back.
01:24:06.520 | ♪ I'm going all in ♪
01:24:11.520 | ♪ I'm going all in ♪