E129: Sam Altman plays chess with regulators, AI's "nuclear" potential, big pharma bundling & more
Chapters
0:00 Bestie intros!: Reddit Performance Reviews
5:14 Quick AIS 2023 update
6:27 AI Senate hearing: Sam Altman playing chess with regulators, open-source viability
21:25 Regulatory capture angle, "preemptive regulation," how AI relates to nuclear, insider insights from the White House's AI summit
43:33 Elon hires new Twitter CEO
48:46 Lina Khan moves to block Amgen's $27.8B acquisition of Horizon Therapeutics
62:30 Apple's AR goggles are reportedly being unveiled at WWDC in early June, platform shift potential, how it deviates from Apple's prior launch strategies
67:45 Residential and commercial real estate headwinds
75:29 Unrest in SF and NYC after two recent tragic incidents ended in death
84:20 Soros' political motivations for backing progressive DAs
00:00:00.000 |
I have a little surprise for my besties. Everybody's been 00:00:03.240 |
getting incredible adulation. People have been getting 00:00:06.840 |
incredible feedback on the podcast. It is a phenomenon as 00:00:10.080 |
we know. And I thought it was time for us to do performance 00:00:14.640 |
reviews of each bestie. Now I, as executive producer, I'm not 00:00:19.700 |
in a position to do your performance reviews. I too need 00:00:23.100 |
to have a performance review. So I thought I would let the 00:00:25.560 |
audience do their performance reviews. So we went and we're 00:00:28.600 |
debuting a new feature today. This very moment. It's called 00:00:31.480 |
Reddit performance review. God cue some music here, some 00:00:35.720 |
graphics, Reddit performance reviews. And so we'll start off 00:00:39.600 |
with you, David Friedberg. This proves why you haven't been 00:00:46.600 |
I'm trying to build a successful enterprise instead of keeping 00:00:49.960 |
the judgment to yourself, which is what all elite performers do. 00:00:53.120 |
Yes. You turned it over to a bunch of mids on Reddit. Yes. 00:00:57.160 |
Tell you, they were already doing it. They were already 00:01:00.560 |
What elucidation were you going to get from Reddit? Yeah, really? 00:01:03.240 |
Let's not ruin the bit. Do you think Elon does 360 performance 00:01:07.200 |
On Reddit? On Reddit of all things? You know how many 360 00:01:11.160 |
performance reviews I've done in my life? Zero, of course. And 00:01:16.480 |
Okay, go go start off with me. Let's hear it. 00:01:18.520 |
Here's how it's gonna work. You are going to be presented with 00:01:21.840 |
the candid feedback that you've gotten in the last 30 days on 00:01:24.600 |
Reddit. But for the first time, and you have to read it out loud 00:01:28.600 |
to the class, Friedberg, you'll go first. Here's your first 00:01:32.400 |
piece of candid feedback in your 360 from the Reddit incels. 00:01:38.280 |
David Friedberg deserves more hate. He and the others have 00:01:42.080 |
made it their mission to convince us that reforming social 00:01:45.000 |
security is the only way forward to survive. He hides behind this 00:01:48.920 |
nerdy apolitical persona, and then goes hard right out of 00:01:53.240 |
nowhere. If Sacks fear mongered about the deficit as an excuse 00:01:57.480 |
to restructure entitlement programs, we would see that as 00:02:00.640 |
the partisan right wing take it is. When Friedberg does it, we're 00:02:05.080 |
supposed to act like he has no skin in this game. He's just the 00:02:08.000 |
science guy. No, he's a rich guy who would rather work your 00:02:12.440 |
grandparents to death than pay an extra 5% tax. 00:02:15.320 |
All right, there's your review. Very good. I think you took it 00:02:17.880 |
well. And you don't have to respond now. Don't just take it 00:02:21.600 |
just to counter briefly, I have highlighted multiple times. I 00:02:25.160 |
think we're going to 70% tax rates. But hey, you know, right 00:02:29.040 |
there on everyone's got an opinion that won't generate more 00:02:33.480 |
Well, the audience has been waiting for this. David Sachs 00:02:36.120 |
has never taken a piece of feedback and the feedback he has 00:02:38.880 |
gotten he hasn't taken well. So here we go, David. Here's your 00:02:41.640 |
performance. Go ahead. I just read this go with the bit. Come 00:02:45.600 |
on. It's a little feedback for you. Come on. All right. And I'm 00:02:50.920 |
Oh, Nick pulled these. Nick, please. This is not 00:03:02.280 |
The thing about David Sachs, if he wasn't rich, everyone would 00:03:05.040 |
dismiss him as being both stupid and boring. The secret to his 00:03:09.000 |
wealth is just follow Peter Thiel around since college. It's 00:03:11.360 |
like if Turtle from Entourage pretended to be a public 00:03:20.880 |
has never had a 360 review. He informs us and I think his staff 00:03:24.240 |
you know, for all the people at Social Capital, you can get in 00:03:27.360 |
on this by just posting to Reddit since he won't do 360 00:03:29.800 |
reviews at Social Capital. Okay, let's pull up Chamath's. 00:03:33.320 |
Is the biggest self serving leech. As long as he can make a 00:03:38.680 |
dollar trade on it. He will burn anything down to the ground. 00:03:42.360 |
Fuck the consequences to society or anyone else. 00:03:52.880 |
I got to read mine. I haven't seen this. I'm bracing for 00:03:59.240 |
What's the critique exactly? What is the critique from all 00:04:04.600 |
pretty accurate. Okay. All right, here we go. I got to read 00:04:08.160 |
mine. Okay. I can't wait for AI to replace Jake. Jake is the 00:04:12.440 |
least skilled knowledge worker on the show. I think he has 00:04:15.840 |
about three shows left before AI replaces his hosting skills. 00:04:19.040 |
And ability to trick dentists into investing in high 00:04:27.680 |
I do have a lot of dentist friends in the funds. Okay, I'm 00:04:37.360 |
one group one, this is one for the whole group. This is a group 00:04:41.120 |
survey. This is our group 360 vote to rename the podcast. The 00:04:48.600 |
three billionaires and J Cal. Three kind of smart guys and J 00:04:57.720 |
All right, everybody, welcome to the all in podcast. We're still 00:05:17.480 |
here episode 129. All in summit 2023. General Admission sold 00:05:22.600 |
out. Too many people applied for scholarships, that's on pause. 00:05:26.400 |
And there's a couple of VIP tickets left get them while 00:05:28.960 |
they're hot. Just search for All-In Summit. Freeberg, anything to 00:05:31.680 |
add? We'll just get through the grift real quick here. 00:05:33.400 |
No, it's gonna get it all right. That was it. 00:05:35.840 |
Yeah, I mean, just more demand than we predicted. We look for a 00:05:39.360 |
bigger venue couldn't find one. We're I think we're excited 00:05:42.040 |
about Royce Hall. It's still as you pointed out two and a half 00:05:45.240 |
times the size of last year. So we want to make sure it's a 00:05:47.280 |
great quality event. But unfortunately, way too many 00:05:49.920 |
folks want to go so we have to kind of pause ticket sales. 00:05:53.360 |
What's my wine budget? $300 per person per night $1,000 per VIP 00:05:58.960 |
per event. Thank you. Okay, I will handle it from here. So 00:06:03.600 |
there's 750 of them. So I think you have $750,000 in wine 00:06:06.920 |
budget. I just can't believe I just gave him 750 by wide by 00:06:23.320 |
some way Josh. All right, let's get to work. Let's get to work. 00:06:26.600 |
Okay, the Senate had a hearing this week for AI. Sam Altman was 00:06:31.880 |
there as well as Gary Marcus, a professor from NYU. That's an 00:06:36.960 |
overpriced college in New York City. And Christina Montgomery, 00:06:41.320 |
the chief privacy and trust officer from IBM, which had 00:06:44.120 |
Watson before anybody else was in the AI business. And I think 00:06:47.520 |
they deprecated it or they stopped working on it, which is 00:06:49.800 |
quite paradoxical. There were a couple of very interesting 00:06:53.640 |
moments. Sam claimed the US should create a separate agency 00:06:58.040 |
to oversee AI, I guess he's in the Chamath camp. He wants the 00:07:01.640 |
agency to issue licenses to train and use AI models, a 00:07:06.480 |
little regulatory capture there, as we say in the biz. He also 00:07:09.600 |
claims, and this was interesting dovetailing with Elon's CNBC 00:07:13.960 |
interview with, I think, David Faber, which is very good, that 00:07:17.440 |
he owns no equity in open AI whatsoever, and was quote, doing 00:07:24.040 |
it because he loves it. Any thoughts, Chamath, you did say 00:07:29.440 |
that this would happen two months ago. And here we are two 00:07:32.800 |
months later. And exactly what you said would happen is in the 00:07:36.080 |
process of happening: regulation, licensing, and 00:07:39.200 |
regulatory capture. Sam went a little further than I sketched 00:07:46.120 |
out a few months ago, which is that he also said that it may 00:07:49.440 |
make sense for us to issue licenses for these models to 00:07:53.800 |
even be compiled. And for these models to actually do the 00:07:59.000 |
learning. And I thought that that was really interesting, 00:08:03.200 |
because what it speaks to is a form of KYC, right? Know your 00:08:06.400 |
customer. And again, when you look at markets that can be 00:08:10.800 |
subject to things like fraud and manipulation, right, where you 00:08:14.040 |
can have a lot of bad actors, banking is the most obvious one. 00:08:16.840 |
We use things like KYC to make sure that money flows are 00:08:20.400 |
happening appropriately and between parties where the 00:08:23.640 |
intention is legal. And so I think that that's actually 00:08:29.440 |
probably the most important new bit of perspective that he is 00:08:33.600 |
adding as somebody right in the middle of it, which is that you 00:08:38.040 |
should apply to this agency to get a license to then allow you 00:08:41.040 |
to compile a model. And I think that that was a really 00:08:44.920 |
interesting thing. The other thing that I said, and I said 00:08:47.680 |
this in my in a tweet just a couple days ago is I'm really 00:08:50.840 |
surprised actually, where this is the first time in modern 00:08:54.360 |
history that I can remember, where we've invented something 00:08:58.080 |
we being Silicon Valley, and the people in Silicon Valley are the 00:09:02.160 |
ones that are more circumspect than the folks on Wall Street or 00:09:05.960 |
other areas. And if you see if you gauge the sentiment, the 00:09:10.800 |
hedge funds and family offices right now are just giddy about 00:09:14.920 |
AI. And it turns out if you look at the 13 Fs, they're all long 00:09:17.760 |
NVIDIA and AMD. But if you actually look at the other side 00:09:22.640 |
of the coin, which is the folks in Silicon Valley that are 00:09:24.480 |
actually making it, the rest of us are like, hey, let's crawl 00:09:29.360 |
Yeah, let's think about guardrails. Let's be thoughtful 00:09:31.720 |
here. And so the big money people are saying, let's place 00:09:34.680 |
bets. And the people building it are saying, hey, let's be 00:09:38.120 |
thoughtful, Sacks, which is opposite to what it's always 00:09:40.640 |
been, I think, right? We're like, hey, let's let's run with 00:09:43.160 |
this. And Wall Street's like, prove it to me. Sacks, you are a 00:09:47.960 |
less regulation guy. You are a free market monster, I've heard 00:09:51.040 |
you've been called. You don't believe that we should license 00:09:54.680 |
this. What do you think about what you're seeing here? And 00:09:57.720 |
there is some cynical, cynical thoughts about what we just saw 00:10:02.320 |
happen in terms of people in the lead, wanting to maintain their 00:10:06.160 |
lead by creating red tape. What are your thoughts? 00:10:09.880 |
Yeah, of course. I think, you know, Sam just went straight for 00:10:13.360 |
the endgame here, which is regulatory capture. Normally, 00:10:16.560 |
when a tech executive goes and testifies at these hearings, 00:10:19.440 |
they're in the hot seat, and they get grilled. And that 00:10:21.240 |
didn't happen here. Because, you know, Sam Altman basically bought 00:10:24.840 |
into the narrative of these senators. And he basically 00:10:28.720 |
conceded all these risks associated with AI, talked 00:10:32.200 |
about how chat GPT style models, if unregulated, could increase 00:10:38.280 |
online misinformation, bolster cyber criminals, even threaten 00:10:41.640 |
confidence in election systems. So he basically bought into the 00:10:47.600 |
senators' narrative. And like you said, agreed to create a new 00:10:50.840 |
agency that would license models and can take licenses away. He 00:10:56.000 |
said that he would create safety standards, specific tests that 00:10:58.840 |
the model has to pass before it can be deployed. He said he 00:11:02.640 |
would require independent auditors who can say the model is or 00:11:05.720 |
isn't in compliance. And by basically buying into their 00:11:10.400 |
narrative and agreeing to everything they want, which is 00:11:13.880 |
to create all these new regulations and a new agency, I 00:11:16.920 |
think that Sam is pretty much guaranteeing that he'll be one 00:11:19.760 |
of the people who gets to help shape the new agency and the 00:11:24.080 |
rules they're going to operate under and what these independent 00:11:26.480 |
audits are going to how they're going to determine what's in 00:11:28.680 |
compliance. So he is basically putting a big moat around his 00:11:32.600 |
own incumbency here. And so yes, it is a smart strategy for him. 00:11:39.600 |
But the question is, do we really need any of this stuff? 00:11:42.120 |
And you know, what you heard at the hearing is that just like 00:11:45.720 |
with just about every other tech issue, the senators on the 00:11:48.720 |
Judiciary Committee didn't exhibit any real understanding 00:11:52.240 |
of the technology. And so they all generally talked about their 00:11:55.400 |
own hobby horses. So you know, you heard from Senator 00:11:58.400 |
Blackburn, she wants to protect songwriters. Hawley wants to stop 00:12:02.000 |
anti-conservative bias. Klobuchar was touting a couple 00:12:05.120 |
of bills that have her name on them, one's called the JCPA, the 00:12:09.200 |
Journalism Competition and Preservation Act. Bernie Sanders wants 00:12:11.480 |
to protect the 99% from the 1%. Durbin hates Section 00:12:15.240 |
230, that was the hobby horse he was riding. And then Senator 00:12:17.560 |
Blumenthal was obsessed that someone had published deep fakes 00:12:20.600 |
of himself. So you know, all of these different senators had 00:12:24.680 |
different theories of harm that they were promoting. And they 00:12:28.160 |
were all basically hammers looking for a nail. You know, 00:12:30.920 |
they all wanted to regulate this thing. And they didn't really 00:12:34.560 |
pay much of any attention to the ways that existing laws could 00:12:37.880 |
already be used. Absolutely. To stop any of these things. If you 00:12:41.920 |
commit a crime with AI, there are plenty of criminal laws, 00:12:44.920 |
every single thing they talked about could be handled through 00:12:47.000 |
existing law, if there are any harms at all. Yeah, but 00:12:50.440 |
they want to jump right to creating a new agency and 00:12:52.760 |
regulations. And Sam, I think did the, you know, expedient 00:12:58.000 |
thing here, which is basically buy into it, because it was a 00:13:01.640 |
chance to be an insider. If this was a chess game, Sam got to the 00:13:04.680 |
mid game, he traded all the pieces and went right to the end 00:13:07.080 |
game. Let's just try to checkmate here. I've got the 00:13:09.600 |
lead. I got the 10 billion from Microsoft. Everybody else get a 00:13:12.960 |
license and try to catch up. Friedberg, we have Chamath pro 00:13:16.760 |
regulation licensing, I think, or just being pretty thoughtful 00:13:19.760 |
about it. There you got sacks being typically a free market 00:13:24.240 |
monster, let the laws be what they are. But these senators are 00:13:26.800 |
going to do regulatory capture. Where do you as a sultan of 00:13:32.240 |
I think there's a more important kind of broader set of trends 00:13:39.640 |
that are worth noting and that the folks doing these hearings 00:13:42.600 |
and having these conversations are aware of, which implies why 00:13:46.640 |
they might be saying the things that they're saying. That's not 00:13:49.280 |
necessarily about regulatory capture. And that is that a lot 00:13:51.960 |
of these models can be developed and generated to be much smaller. 00:13:56.680 |
We're seeing, you know, models that can effectively run on an 00:13:59.560 |
iPhone, we're seeing a number of open source models that are 00:14:02.160 |
being published. Now there's a group called MosaicML. Last 00:14:05.440 |
week, they published what looks like a pretty good quality model 00:14:08.920 |
that has, you know, a very large kind of token input, which means 00:14:13.240 |
you can do a lot with it. And that model can be downloaded and 00:14:17.440 |
used by anyone for free, you know, really good open source 00:14:20.600 |
license that they've provided on that model. And that's really 00:14:22.840 |
just the tip of the iceberg on what's going on, which is that 00:14:25.760 |
these models are very quickly becoming ubiquitous, 00:14:28.000 |
commoditized, small, and effectively are able to move to 00:14:31.760 |
and be run on the edge of the network. As a result, that means 00:14:35.400 |
that it's very hard to see who's using what models how behind the 00:14:39.520 |
products and tools that they're building. And so if that's the 00:14:43.400 |
trend, then it becomes very hard for a regulatory agency to go in 00:14:48.600 |
and audit every server, or every computer on a 00:14:52.680 |
network and say, What model are you running? Is that an approved 00:14:55.080 |
model? Is that not an approved model? It's almost like having a 00:14:58.360 |
regulatory agency that has to go in and audit and assess whether 00:15:04.120 |
a Linux upgrade or some sort of, you know, open source platform 00:15:07.920 |
that's that's being run on some server is appropriately vetted 00:15:10.920 |
and checked. And so it's almost like a fool's errand. And so if 00:15:16.200 |
I'm running one of these companies, and I'm trying to, 00:15:18.560 |
you know, get Congress off my butt and get all these 00:15:21.200 |
regulators off my butt, I'm going to say, go ahead and 00:15:22.880 |
regulate us. Because the truth is, there really isn't a great 00:15:26.480 |
or easy path or ability to do that. And there certainly won't 00:15:29.080 |
be in five or 10 years. Once these models all move on to the 00:15:33.040 |
edge of the network, and they're all being turned around all the 00:15:35.600 |
time every day. And there's a great evolution underway. So I 00:15:40.000 |
actually take a point of view that it's not just that this is 00:15:44.000 |
necessarily bad. And there's cronyism going on. I think that 00:15:47.200 |
the point of view is just that this is going to be a near 00:15:50.520 |
impossible task to try and track and approve LLMs and audit servers 00:15:56.760 |
that are running LLMs and audit apps and audit what's behind the 00:16:00.720 |
tools that everyday people are using. And I wish everyone the 00:16:04.400 |
best of luck in trying to do so. But that's kind of the joke of 00:16:07.440 |
the whole thing is like, let's go ahead and pat these 00:16:09.440 |
Congress people on the shoulder and say, you got it. Right? 00:16:12.360 |
There you have it, folks. Wrong answer: Chamath, Sacks. Right 00:16:16.000 |
answer: Friedberg. If you were to look at Hugging Face, if you 00:16:18.880 |
don't know what that is, it's basically an open source 00:16:21.480 |
repository of all of the LLMs. The cat is out of the bag, the 00:16:25.720 |
horses have left the barn. If you look at what I'm showing on 00:16:29.840 |
the screen here, this is the open LLM leaderboard, kind of 00:16:33.000 |
buried on Hugging Face. If you haven't been to Hugging Face, 00:16:34.920 |
this is where developers show their work, they share their 00:16:37.600 |
work. And they kind of compete with each other in a social 00:16:40.480 |
network showing all their contributions. And what they do 00:16:43.680 |
here is and this is super fascinating. They have a series 00:16:46.760 |
of tests that will take an LLM language model and they will 00:16:51.600 |
have it do science questions that would be for grade school. 00:16:55.320 |
They'll do a test of mathematics, US history, 00:16:59.720 |
There's a Jeopardy test too, I don't know if it's on here, but 00:17:02.520 |
the Jeopardy test is really good. It's like straight up 00:17:04.240 |
Jeopardy trivia, and see if it can answer the questions. Yeah, 00:17:06.480 |
which actually, Friedberg actually won his high school 00:17:09.920 |
Jeopardy championship three years in a row. But anyway, on 00:17:13.440 |
this leaderboard, you can see the language models are out 00:17:18.960 |
pacing what OpenAI did. I'm sorry, ClosedAI is what I call 00:17:23.480 |
it now because they're closed source. ClosedAI and Bard have 00:17:27.520 |
admitted, an internal person at Bard said, that the language models 00:17:30.360 |
here are now outpacing what they're able to do with much 00:17:33.280 |
more resources. Many hands make light work; the open source 00:17:36.760 |
models are going to fit on your phone or the latest, you know, 00:17:40.120 |
Apple silicon. So I think the cat's out of the bag. 00:17:44.080 |
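To make the "cat's out of the bag" point concrete, here is a minimal sketch, assuming the Hugging Face transformers library (with a backend like PyTorch) is installed, of how anyone can pull an open model off the public hub and run it locally with no license or approval step. The model name below is just a small illustrative choice, not one of the leaderboard models referenced on the show.

from transformers import pipeline

# The first call downloads the open weights straight from the public hub;
# after that, the model runs entirely on your own machine.
generator = pipeline("text-generation", model="gpt2")

result = generator("Open source language models are", max_new_tokens=20)
print(result[0]["generated_text"])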
I don't find anything incompatible about that, what Friedberg just 00:17:46.520 |
said, with what I said. In fact, Friedberg's point bolsters my 00:17:49.200 |
point: it's highly impractical to regulate open source software 00:17:54.200 |
in this way. Also, when you look at that list of things that 00:17:56.720 |
people are doing on hugging face, there's nothing nefarious 00:18:00.160 |
Yeah, and all the harms that were described are already 00:18:02.280 |
illegal and can be prosecuted. Exactly. Some special agency, 00:18:05.800 |
you know, giving its seal of approval. Again, this is going 00:18:08.800 |
to replace permissionless innovation, which is what has 00:18:11.960 |
defined the software industry and especially open source with 00:18:15.240 |
the need to develop some connection or relationship and 00:18:18.280 |
lobbying in Washington to go get your project approved. And 00:18:22.480 |
there's no really good reason for this, except for the fact 00:18:24.600 |
that the senators on the judiciary committee and all of 00:18:28.080 |
Washington really wants more control. So they can get more 00:18:32.000 |
Sacks, I have a question. Do you think that creating the DMV and 00:18:36.320 |
requiring a driver's license limits the ability for people to 00:18:40.720 |
the DMV is like the classic example of how government 00:18:44.040 |
doesn't work. I don't know why you'd want to make that your 00:18:45.960 |
example. I mean, people have to spend all day waiting in line. 00:18:49.440 |
He sends people to it. He's got a VIP person waiting in line to 00:18:53.240 |
get your photo taken. It's insane. I mean, everyone has a 00:18:55.920 |
miserable experience with it. No, but it's highly relevant 00:18:58.360 |
because you're right. If you create an agency where people 00:19:01.920 |
have to go get their permission, it's a licensing scheme. You're 00:19:05.200 |
gonna be waiting in some line of untold length. It won't be like 00:19:09.000 |
a physical line at the DMV building, it's gonna be a 00:19:11.560 |
virtual line where you're in some queue, where there's 00:19:13.720 |
probably gonna be some overworked regulator who doesn't even know 00:19:16.640 |
how they're supposed to approve your project. They're just gonna 00:19:19.080 |
be trying to cover their ass because if the project ends up 00:19:21.600 |
being something nefarious, then they get blamed for it. So 00:19:27.600 |
Let me also highlight something that I think is maybe, I think, 00:19:30.560 |
getting a little bit, maybe a little bit misunderstood. But, 00:19:34.160 |
you know, an AI model is an algorithm. So it's a it's a 00:19:38.760 |
piece of software that takes data in and spits data out. And 00:19:43.240 |
you know, we have algorithms that are written by humans, we 00:19:45.440 |
have algorithms that have been, you know, written by machines, 00:19:48.840 |
these are machine learn models, which is what a lot of what 00:19:51.640 |
people are calling AI today is effectively an extension of an 00:19:54.720 |
algorithm. And so the idea that a particular algorithm is 00:20:00.400 |
differentiated from another algorithm is also what makes 00:20:04.040 |
this very difficult, because these are algorithms that are 00:20:06.240 |
embedded and sit within products and applications that an end 00:20:10.320 |
user and end customer ultimately uses. And I just sent you guys a 00:20:14.240 |
link to the you know, the EU has been working towards passing 00:20:18.200 |
this AI Act. Here we go. They're a couple of weeks ahead of these 00:20:22.120 |
conversations in the US. But I mean, as you read through this 00:20:26.080 |
AI act, and the proposal that it's that it's put forth, it 00:20:29.960 |
almost becomes the kind of thing that you say, I just don't know 00:20:34.280 |
if these folks really understand how the technology works, 00:20:37.640 |
because it's almost as if they're going to audit and have, 00:20:40.720 |
you know, an assessment of the risk level of every software 00:20:45.080 |
application out there. And that the tooling and the necessary 00:20:48.960 |
infrastructure to be able to do that just makes no sense. In the 00:20:52.200 |
context of open source software in the context of an open 00:20:54.680 |
internet, in the context of how quickly software and applications 00:20:59.240 |
and tools evolve, and you make tweaks to an algorithm, and you 00:21:03.720 |
or you can be sure their number one job Friedberg is going to 00:21:07.680 |
be to protect jobs. So anything there that in any way infringes 00:21:12.640 |
on somebody's ability to be employed in a position, whether 00:21:15.760 |
it's an artist or a writer, or a developer, they're going to say, 00:21:19.640 |
you can't use these tools, or they're going to try to throttle 00:21:21.880 |
them to try to protect jobs, because that's their number one 00:21:24.320 |
job. All three of you. Do you guys think that this was Sam's 00:21:27.480 |
way of pulling up the ladder behind him? Of course, 100% just 00:21:31.160 |
like, Oh, absolutely. And it's because, no, you can, you 00:23:34.240 |
can prove it. He made OpenAI ClosedAI by making it not open 00:23:38.640 |
source. If you're Sam, you're smart enough to know how quickly 00:21:42.760 |
the models are commoditizing and how many different models there 00:21:47.120 |
are, that, you know, can provide similar degrees of 00:21:50.040 |
functionality, as you just pointed out, J. Cal. So I don't 00:21:52.240 |
think it's about trying to lock in your model. I think it's 00:21:54.960 |
about recognizing the impracticality of creating some 00:21:57.920 |
regulatory regime around model auditing. And so, 00:22:02.840 |
in that world, in that scenario, where you have 00:22:05.280 |
that vision, you have that foresight, do you go to Congress 00:22:07.880 |
and tell them that they're dumb to regulate AI? Or do you go to 00:22:10.400 |
Congress and you say, Great, you should regulate AI? Knowing that 00:22:13.520 |
it's like, hey, yeah, you should go ahead and stop the sun from 00:22:15.760 |
shining. You know, like, it's just Yeah, so basically, he's 00:22:18.640 |
telling them to do that. Because he knows they can't. Therefore, 00:22:22.520 |
he gets all the points, all the joy points, all the social 00:22:26.200 |
credit, I don't wanna say virtue signaling, but he gets 00:22:28.480 |
all the credit relationship credit with Washington for 00:22:31.720 |
saying what they want to hear reflecting back to them. Even 00:22:34.840 |
though he knows they can't compete with Facebook's open 00:22:38.720 |
Yeah, there is historical precedent interesting for 00:22:41.680 |
companies that are facing congressional scrutiny, to go to 00:22:45.000 |
Congress and say, go ahead and regulate us as a way of 00:22:47.800 |
preemptively. Yeah. And I think that it doesn't necessarily 00:22:50.960 |
mean you're going to get regulated. But it's a way of 00:22:53.440 |
kind of creating some relief and getting everyone to take a 00:22:55.680 |
breather and a sigh of relief and be like, okay, the 00:22:58.560 |
gardez. What do you think of the strategy, Sacks? The gardez, as 00:23:04.400 |
I think that's in chess when you are going to take the queen. 00:23:07.280 |
What's like, anyway, what do you think of his chess? 00:23:10.200 |
That's not a strategy in chess. So I think it is a chess move. 00:23:19.640 |
I don't think that's his number one goal. But I think it is the 00:23:23.720 |
result. And so I think the goal here is, I think he's got 00:23:28.680 |
two paths in front of him: when you go to testify like this, you can 00:23:31.920 |
either resist, and they will put you in the hot seat and just 00:23:35.600 |
grill you for a few hours, or you can sort of concede and you 00:23:39.720 |
buy into their narrative. And then you kind of get through the 00:23:43.520 |
hearing without being grilled. And so I think on that level, 00:23:46.680 |
it's preferable just to kind of play ball. And then the other 00:23:50.720 |
thing is that by playing ball, you get to be part of the 00:23:53.360 |
insiders club that's going to shape these regulations. And 00:23:56.040 |
that, well, I wouldn't say it's a ladder coming up, I think it's 00:23:58.320 |
more of a moat, where because it's not that the ladder comes 00:24:02.280 |
up and nobody else can get in. But the regulations are gonna be 00:24:05.320 |
a pretty big moat around major incumbents who know they qualify 00:24:10.120 |
for this because we're gonna write these standards. So at the 00:24:13.320 |
end of the day, if you're someone in Sam's shoes, you're 00:24:15.940 |
like, why resist and make myself a target, or I'll just buy into 00:24:21.280 |
the narrative and help shape the regulations. And it's good for 00:24:24.200 |
I like the analysis, gentlemen, this is a perfect analysis. Let 00:24:27.240 |
me ask you a question, Chamath. What is the commercial incentive 00:24:31.240 |
from your point of view, to ask for regulation and to be 00:24:35.080 |
pro regulation? You're pro regulation. Can you just 00:24:37.240 |
highlight for me at least what you think? You know, the 00:24:40.840 |
commercial reason is to do that? You know, how do you benefit 00:24:43.920 |
from that? Like, not you personally, but generally, like, 00:24:48.120 |
I think that certain people in a sphere of influence, and I would 00:24:52.560 |
put us in that category, have to have the intellectual capacity 00:24:56.840 |
to see beyond ourselves and ask what's for the greater good. I 00:25:02.720 |
think Buffett is right. Two weeks ago, he equated AI to 00:25:08.320 |
nuclear weapons, which is an incredibly powerful technology 00:25:11.960 |
whose genie you can't put back in the bottle, whose 99.9% of use 00:25:17.600 |
cases are generally quite societally positive, but the 00:25:22.480 |
0.1% of use cases destroys humanity. And so I think you 00:25:27.400 |
guys are unbelievably naive on this topic, and you're letting 00:25:30.880 |
your ideology fight your common sense. The reality is that there 00:25:37.480 |
are probably 95 billion trillion use cases that are incredibly 00:25:42.720 |
positive. But the 1000 negative use cases are so destructive. 00:25:47.200 |
And they're equally possible. And the reason they're equally 00:25:50.120 |
possible. And this is where I think there's a lot of 00:25:51.960 |
intellectual dishonesty here is, we don't even know how 00:25:54.600 |
transformers work. The best thing that happened when 00:25:57.840 |
Facebook open sourced LLaMA was also that somebody stealthily 00:26:01.680 |
released all the model weights. Yeah. Okay, so 00:26:05.480 |
explain that a little bit for a neophyte, what we're 00:26:08.080 |
talking about. So there's the model and there's the weights, 00:26:11.120 |
think about it as it's a solution to a problem. The 00:26:13.600 |
solution looks like a polynomial equation. Okay, let's take a 00:26:17.480 |
very simple one. Let's take Pythagorean theorem, you know, x 00:26:20.440 |
squared plus y squared equals z squared. Okay. So if you want to 00:26:23.920 |
solve an answer to a problem, you have these weights, you have 00:26:27.280 |
these variables, and you have these weights associated with 00:26:30.040 |
it, the slope of a line, y equals mx plus b. Okay, what a computer 00:26:34.320 |
does with AI is it figures out what the variables are, and it 00:26:38.280 |
figures out what the weights are. The answer to identifying images 00:26:43.080 |
flawlessly turns out to be 2x plus 7, where x equals this 00:26:48.400 |
thing. Now take that example and multiply it by 500 billion 00:26:53.560 |
parameters, and 500 billion weights. And that is what an AI 00:26:58.640 |
model essentially gives us as an answer to a question. 00:27:03.120 |
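To make the weights idea concrete, here is a minimal sketch in Python of what "the computer figures out the weights" means, using the y = mx + b example from the discussion. The data, learning rate, and step count are made up purely for illustration; real models do the same thing with billions of weights instead of two.

# Toy data generated from a made-up "true" relationship, y = 2x + 7
data = [(x, 2 * x + 7) for x in range(10)]

m, b = 0.0, 0.0        # the weights the machine has to figure out
learning_rate = 0.01

for step in range(5000):
    # Gradient of the mean squared error with respect to each weight
    grad_m = sum(2 * (m * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (m * x + b - y) for x, y in data) / len(data)
    # Nudge each weight in the direction that shrinks the error
    m -= learning_rate * grad_m
    b -= learning_rate * grad_b

print(round(m, 2), round(b, 2))  # converges to roughly 2.0 and 7.0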
So even when Facebook released LLaMA, what they essentially 00:27:05.880 |
gave us was the equation, but not the weights. And then what 00:27:10.320 |
this guy did, I think it was an intern, apparently, or somebody, 00:27:12.880 |
he just leaked the weights, so that we immediately knew what 00:27:16.880 |
the structure of the equation looked like. So that's what we're 00:27:20.920 |
basically solving against. But we don't know how these things 00:27:23.760 |
work. We don't really know how transformers work. And so this 00:27:26.400 |
is my point: while I think you guys are right about the 00:27:30.440 |
overwhelming majority of the use cases, there will be people 00:27:34.280 |
who can nefariously create havoc and chaos. And I think you got 00:27:38.800 |
to slow the whole ship down to prevent those few folks from 00:27:44.280 |
Let me just talk with you. I haven't had a chance to chime in 00:27:48.400 |
on my position. So I'd like to just state mine. Nobody cares. 00:27:51.760 |
Okay, well, I do. I think actually, I split the 00:27:55.200 |
difference here a little bit. I don't think it needs to be an 00:27:56.800 |
agency and licensing. I do think we have to have a commission. And 00:27:59.800 |
we do need to have people being thoughtful about those 1000 use 00:28:02.560 |
cases chamath because they are going to cause societal harm or 00:28:07.320 |
things that we cannot anticipate. And then number two, 00:28:10.160 |
for the neophyte with the 1600 rating on chess.com, Sacks: 00:28:14.440 |
Gardez, an announcement to the opponent that their queen is 00:28:17.840 |
under direct attack, similar to the announcement of check. The 00:28:19.940 |
warning was customary until the early 20th century. So since you 00:28:23.000 |
do not know the history of check, now you've learned 00:28:25.200 |
something early 20th century. Okay, well, since I've only 00:28:27.720 |
played chess in the 20th and 21st centuries, I'm unaware of 00:28:30.640 |
that. Jekyll and Francis Brown's got the girl they play man if he 00:28:36.040 |
got the go-ahead. Freeberg, in the context of what we're 00:28:39.200 |
talking about that models are becoming smaller and can be run 00:28:43.320 |
on the edge. And there's obviously hundreds and thousands 00:28:45.560 |
of variants of these open source models that have, you know, good 00:28:49.320 |
effect, and perhaps compete with some of these, these models that 00:28:53.880 |
you're mentioning that are closed source. How do you 00:28:56.760 |
regulate that? How do you and then they sit behind an 00:29:01.520 |
I think in order for you to be able to compile that model to 00:29:05.240 |
generate that initial instantiation, you're still 00:29:13.080 |
You can't be past that. We're not past that yet. Okay, we 00:29:16.040 |
don't have 5 million models, we don't have all kinds of things 00:29:20.280 |
that solve all kinds of problems. We don't have an open 00:29:22.680 |
source available simulation of every single molecule in the 00:29:25.880 |
world, including all the toxic materials that could destroy 00:29:28.200 |
humans. We don't have that yet. So before that is created, and 00:29:32.080 |
shrunk down to an iPhone, I think we need to put some stage 00:29:38.080 |
I think you need some form of KYC. I think before you're 00:29:41.040 |
allowed to run a massive cluster to generate the model that then 00:29:45.440 |
you try to shrink, you need to be able to show people that 00:29:48.600 |
you're not trying to do something absolutely chaotic, or 00:29:53.360 |
that that could be as simple as putting your driver's license 00:29:55.920 |
and your social security number that you're working on an 00:29:58.560 |
instance in a cloud, right? It could be you're putting your 00:30:02.280 |
it becomes slightly more nuanced than that. It's like, I think 00:30:05.520 |
that, J Cal, that's probably the simplest thing for AWS, GCP, and 00:30:08.880 |
Azure to do, which is that if you want to run over a 00:30:12.720 |
certain number of GPU clusters, you need to put in that 00:30:15.200 |
information. I think you also need to put in your tax ID 00:30:17.680 |
number. So I think if you want to run a real high scale model, 00:30:20.280 |
that's still going to run you 10s or hundreds of millions of 00:30:23.440 |
dollars. I do think there aren't that many people running those 00:30:26.720 |
things. And I do think it's easy to police those and say, what 00:30:30.480 |
So let me just push back on that. Because MosaicML published 00:30:33.920 |
this model that is, let me I can pull up the performance chart 00:30:38.040 |
or Nick, maybe you can just find it on their website real quick 00:30:40.080 |
at the new model they published. They trained this model 00:30:43.560 |
on open source data that's publicly available. And they 00:30:49.240 |
spent $200,000 on a cluster run to build this model and look at 00:30:53.160 |
how it performs compared to some of the top models that are 00:30:56.640 |
Just say it for the people who are listening. 00:30:58.360 |
Yeah. So for people that are listening, basically, this 00:31:00.360 |
model is called MPT-7B. That's the name of the AI model, 00:31:05.880 |
the LLM that was generated by this group called MosaicML. 00:31:09.600 |
And they spent $200,000 creating this model from scratch. And the 00:31:15.000 |
data that they trained it on is all listed here. It's all 00:31:17.520 |
publicly available data that you can just download off the 00:31:19.720 |
internet, then they score how well it performs on its results 00:31:23.160 |
against other big models out there like LLaMA-7B. 00:31:26.600 |
Yeah, but he I know, but I don't exactly know what the actual 00:31:31.080 |
problems they're trying to ask it to compare. 00:31:33.640 |
Right. So the but the point is that this model theoretically 00:31:36.760 |
could then be applied to a different data set once it's 00:31:39.720 |
been, you know, built. I don't think so. I just want to use 00:31:43.320 |
your point earlier about toxic chemistry, because models were 00:31:46.600 |
generated, and then other data was then used to fine tune those 00:31:52.120 |
Hold on a second, like those answers were to specific kinds 00:31:55.240 |
of questions. If you wanted to all of a sudden ask totally 00:31:58.640 |
orthogonal thing of that model, that model would fail, you would 00:32:01.960 |
have to go back and you'd have to retrain it. That training does 00:32:05.280 |
cost some amount of money. So if you said to me, Chamath, I could 00:32:09.200 |
build you a model trained on the universe of every single 00:32:12.760 |
molecule in the world. And I could actually give you 00:32:15.400 |
something that could generate the toxic list of all the 00:32:17.600 |
molecules and how to make it for $200,000. I would be really 00:32:21.600 |
scared. I don't think that that's possible today. So I 00:32:24.680 |
don't understand these actual tests. But I don't think it's 00:32:27.360 |
true that you could take this model and these model weights, 00:32:30.520 |
apply it to a different set of data and get useful answers. 00:32:33.240 |
But let's let's assume for a minute that you can, in fact, 00:32:38.280 |
I don't want to assume it. Here's my point. I want to tell 00:32:41.560 |
you what's happening right now, which is that's not possible. So 00:32:44.600 |
we should stop so that then I don't have to have this argument 00:32:47.800 |
with you in a year from now, which is like, hey, some jack 00:32:50.880 |
jerk off just created this model. Now the cat's out of the 00:32:53.560 |
bag. So let's not do it. Yeah. And, and then what's going to 00:32:56.520 |
happen is like some chaotic seeking organization is going to 00:32:59.960 |
print one of these materials and release it into the wild to 00:33:02.640 |
prove it. But here's the point for the audience. We are at a 00:33:04.960 |
moment in time where this is moving very quickly. And you 00:33:07.480 |
have very intelligent people here who are very knowledgeable. 00:33:10.160 |
Talking about the degree to which this is going to manifest 00:33:14.480 |
itself, not if it will manifest, you are absolutely 100% certain 00:33:18.440 |
Freeberg, that somebody will do something very bad in terms of 00:33:22.200 |
the chemical example as but one. We're only determining here 00:33:26.120 |
what level of hardware and what year that will happen. Chamath is 00:33:29.080 |
saying, we know it's going to happen, whether it's two or 10, 00:33:32.400 |
or, you know, five years, let's be thoughtful about it. And I 00:33:36.400 |
think, you know, this discussion we're having here, I think is 00:33:39.040 |
on a spectrum. It's this is a unique moment where the most 00:33:45.080 |
knowledgeable people across every single political spectrum 00:33:49.680 |
persuasion for profit, nonprofit, Democrat, Republican, 00:33:53.280 |
right, Elon and Sam, I'll just use those as the two canonical 00:33:56.320 |
examples to demonstrate, are pro regulation, and then the further 00:34:00.600 |
and further you get away, the less technically astute you are, 00:34:03.880 |
the more anti regulation and like pro market you are. And all 00:34:07.360 |
I'm saying is, I think that should also be noted that that's 00:34:10.600 |
a unique moment that the only other time that that's happened 00:34:13.080 |
was around nuclear weapons. And you know, that's when I actually 00:34:18.120 |
think, I think it's politically incorrect. Right now, I think 00:34:23.520 |
because of what you're saying, just give me a second, I think 00:34:25.320 |
because of what you're saying, everyone on the left and the 00:34:27.640 |
right, it's it's become popular to be pro regulation on AI and 00:34:31.560 |
say that AI is going to doom the world. And it's unpopular. And 00:34:36.960 |
I've explained my point of view on Sam, Elon's different, but I 00:34:40.040 |
think like it, I think it's become politically incorrect to 00:34:43.080 |
stand up and say, you know what, this is a transformative 00:34:45.480 |
technology for humanity. I don't think that there's a real path 00:34:48.400 |
to regulation. I think that there are totally there are laws 00:34:51.160 |
that are in place that can protect us in other ways, with 00:34:53.560 |
respect to privacy, with respect to fraud, with respect to 00:34:56.640 |
biological warfare, and all the other things that we should 00:34:59.360 |
Elon has said pretty clearly, he doesn't give a shit about what 00:35:02.120 |
it does to make money or not. He cares about what he thinks. So 00:35:05.200 |
all I'm saying is that's a guy that's not trying to be 00:35:07.880 |
Elon has a very specific concern, which is AGI, he's 00:35:11.800 |
concerned that we're on a path to a digital super intelligence 00:35:14.400 |
singularity. And if we create the wrong kind of artificial 00:35:19.040 |
general intelligence that decides that it doesn't like 00:35:21.720 |
humans, that is a real risk to the human species. That's the 00:35:26.200 |
concern he's expressed. But that's not what the hearing was 00:35:30.800 |
really about. And it's not what any of these regulatory 00:35:33.400 |
proposals are about. The reality is none of these senators know 00:35:37.880 |
what to do about that even the industry doesn't know what to 00:35:40.440 |
do about the long term risk of creating an AGI. 00:35:44.760 |
Nobody really. And so and so I actually I disagree with this 00:35:47.960 |
idea that, Chamath, earlier you said that there's 1000 00:35:51.840 |
use cases here that could destroy the human species. I 00:35:54.680 |
think there's only one, there's only one species level risk, 00:35:58.560 |
which is AGI. But that's a long term risk. We don't know what to 00:36:01.480 |
do about it yet. I agree, we should have conversations. What 00:36:04.680 |
we're talking about today is whether we create some new 00:36:07.360 |
licensure regime in Washington, so that politically connected 00:36:11.400 |
insiders get to control and shape the software industry. And 00:36:15.360 |
that's a disaster. Let me give you another detail on this. So 00:36:18.080 |
in one of the chat groups, I'm in, there was somebody who just 00:36:23.360 |
got back from Washington, I won't say who they are. It's not 00:36:26.480 |
someone who's famous outside the industry, but they're kind of 00:36:28.280 |
like a tech leader. And what they said is they just got back 00:36:32.560 |
from Capitol Hill and the White House. And I guess there's like 00:36:36.040 |
a White House summit on AI. You guys know about that? Yeah. So 00:36:41.840 |
what this person said is that the White House meeting was 00:36:46.000 |
super depressing. Some smart people were there to be sure, 00:36:50.520 |
but the White House and VP's teams were rabidly negative, no 00:36:54.520 |
real concern for the opportunity, or economic impact 00:36:57.600 |
just super negative. Of course, basically, the mentality was 00:37:00.640 |
that tech is bad. We hate social media. This is the hot new 00:37:04.120 |
thing. We have to stop it. Of course, that basically is their 00:37:07.000 |
attitude. They don't understand the technology White House. 00:37:09.480 |
Yeah, the White House and the VP specifically, because she's 00:37:11.960 |
now the AI czar. To put Kamala Harris in charge of this makes 00:37:15.280 |
no sense. I mean, does she have any background in this? Like it 00:37:18.400 |
just shows like a complete utter lack of awareness. Where's the 00:37:21.960 |
Megan Smith, or somebody like a CTO to be put in charge of 00:37:25.520 |
this? Remember, Megan Smith was CTO under, I guess Obama, like, 00:37:29.160 |
you need somebody with a little more depth of experience here, 00:37:32.280 |
Then you guys are saying you're pro regulation? 00:37:36.920 |
Well, I'm pro thoughtfulness. I'm pro thoughtfulness. I'm 00:37:39.800 |
illustrating that really, this whole new agency that's being 00:37:43.320 |
discussed is just based on vibes. You're not down with the 00:37:46.840 |
vibes? What are the vibes? The vibe is that a bunch of people in Washington 00:37:51.200 |
don't understand technology, and they're afraid of it. So 00:37:54.040 |
anything you're afraid of, you're gonna want to control 00:37:56.040 |
these are socialists, David, they're socialists. They hate 00:37:59.240 |
progress. They are scared to death that jobs are going to 00:38:02.040 |
collapse. They're socialists, they, they're union leaders. 00:38:05.520 |
This is their worst nightmare. Because the actual truth of 00:38:08.560 |
this technology is 30% more efficiency, and it's very 00:38:11.960 |
mundane. This is the truth here. I think that representatives 00:38:15.680 |
have 30% more efficiency means Google, Facebook, and many other 00:38:20.360 |
companies, finance, education, they do not add staff every 00:38:24.360 |
year, they just get 30% more efficient every year. And then 00:38:27.320 |
we see unemployment go way up, and Americans are going to have 00:38:30.120 |
to take service jobs. And white collar jobs are going to be 00:38:33.080 |
confined to, like, a very elite few people who actually do work in 00:38:37.720 |
the world. There is absolutely a lot of new companies. If humans 00:38:41.960 |
can become if knowledge workers can become 30% more productive, 00:38:44.800 |
there'll be a lot of new companies. And the biggest 00:38:47.120 |
shortage our economy is coders, right? And we're going to have 00:38:49.800 |
an unlimited number of them. Now. They're all going to go. 00:38:51.560 |
Yeah, I don't know if it's unlimited. But yes, it's a good 00:38:53.560 |
thing. If you give them superpowers. We've talked about 00:38:55.280 |
this before. Yeah. So I think it's too soon to be concluding 00:38:59.000 |
that we need to stop job displacement that hasn't even 00:39:01.880 |
occurred yet. I'm not saying it's actually going to happen. I 00:39:04.360 |
do agree, there'll be more startups. I'm seeing it 00:39:06.000 |
already. I just think that's what they fear. That's their 00:39:09.320 |
fear is and that's the fear of the EU, the EU is going to be 00:39:12.560 |
protectionist unionist, protect pro workers, which is fine, 00:39:15.760 |
they're gonna be affected because these are not blue 00:39:17.640 |
collar jobs we're talking about. These are knowledge. There's 00:39:19.680 |
white collar unions. All the media companies created unions, 00:39:22.600 |
and look at them, they're all going, circling, all these media 00:39:24.800 |
companies are circling the drain. But that's on 00:39:27.200 |
the margins. I mean, that's not just trying to start tech 00:39:29.360 |
unions. Sure, they're trying to start them. But when we think of 00:39:31.600 |
unionized workers, you're thinking about factory workers, 00:39:36.200 |
Okay, listen, this has been an incredible debate. This is why 00:39:39.120 |
you tune into the pod. A lot of things can be true at the same 00:39:42.080 |
time. I really think the analogy of the atom bomb is really 00:39:44.800 |
interesting, because what Elon is scared about with 00:39:47.880 |
general artificial intelligence is nuclear holocaust, the whole 00:39:51.000 |
planet blows up. Between those two things are things like 00:39:54.600 |
Nagasaki and Hiroshima or a dirty bomb and many other 00:39:58.560 |
possibilities with nuclear power, you know, Fukushima, you 00:40:05.320 |
Not yet. Right. And you know, the question is, is a Three Mile 00:40:09.880 |
Island, is a Fukushima, is a Nagasaki, are those things 00:40:13.920 |
probable. And I think we are all looking at this saying there 00:40:17.080 |
will be something bad that will happen. There will be the 00:40:20.760 |
It strings together and these large language models string 00:40:24.480 |
together words in really interesting ways. And they give 00:40:26.560 |
computers the ability to have a natural language interface that 00:40:29.160 |
is so far from AGI. I think it's a component. Hold on, I think 00:40:34.640 |
it's a obviously the ability to understand language and 00:40:37.400 |
communicate in a natural way is a component of a future AGI. But 00:40:41.840 |
by itself, these are models for stringing together 00:40:44.640 |
auto GPT, where these things go out and pursue things without 00:40:49.080 |
any interference, I would be the first one to say that if you 00:40:51.840 |
wanted to scope models to be able to just do human language 00:40:56.920 |
back and forth on the broad open internet, you know, there's 00:41:02.320 |
probably a form David where these chat GPT products can 00:41:06.280 |
exist. I don't I think that those are quite benign. I agree 00:41:09.080 |
with you. But I think what Jason is saying is that every week 00:41:12.400 |
you're you're taking a leap forward. And already with auto 00:41:15.760 |
GPT is you're talking about code that runs in the background 00:41:18.240 |
without supervision. It's not a human interface that's like, 00:41:21.240 |
hey, show me how to color my cookies green for St. Patty's 00:41:25.440 |
Day. It's a plan my trip to Italy. Yeah, yeah, it's not 00:41:28.320 |
doing that. So I just think that there's there's a place well 00:41:31.240 |
beyond what you're talking about. And I think you're 00:41:33.000 |
minimizing the problem a little bit by just kind of saying the 00:41:35.920 |
whole class of AI is just chat GPT and asking, kid asking it to 00:41:41.040 |
This example, I hate to say it out loud. But somebody could say 00:41:44.240 |
here is the history of financial crimes that were committed and 00:41:48.280 |
other hacks, please with their own model on their own server, 00:41:51.840 |
say, please come up with other ideas for hacks, be as creative 00:41:55.360 |
as possible and steal as much money as possible and put that 00:41:57.960 |
in an auto GPT, David, and study all hacks that occur in the 00:42:01.800 |
history of hacking. And it could just create super chaos around 00:42:05.640 |
And you can be sure the industry is going to regret buying into this 00:42:08.320 |
narrative because the members of the Judiciary Committee are 00:42:11.080 |
doing the same playbook they ran back in 2016. After that 00:42:14.320 |
election, they ran all these hearings on disinformation, 00:42:16.920 |
claiming that social networks been used to hack the election. 00:42:19.400 |
It was all a phony narrative. Hold on that. What happened? 00:42:22.400 |
They got tech companies to buy into... Biden, the Russians hacked 00:42:25.800 |
Hunter right after that. Stop. They got, they got all these tech 00:42:28.920 |
CEOs to buy into that phony narrative. Why? Because it's a 00:42:32.240 |
lot easier for the tech CEOs to agree and tell the senators what 00:42:35.560 |
they want to hear to get them off their backs. And then what 00:42:38.200 |
does that lead to a whole censorship industrial complex. 00:42:41.320 |
So we're going to do the same thing here. We're going to buy 00:42:44.240 |
into these phony narratives to get the senators off our backs. 00:42:47.960 |
And that's going to create this giant AI industrial complex 00:42:52.080 |
that's going to slow down real innovation and be a burden on 00:42:55.440 |
Okay, lightning round lightning round. We got to move on three 00:42:59.200 |
If I wanted to become an evil, more evil, yeah, what is it 00:43:02.920 |
called, an evil comic book character? A supervillain, a 00:43:07.080 |
supervillain. You want to be an even more loathsome 00:43:10.520 |
supervillain? Continue. I would take every single virus patch 00:43:15.360 |
that's been developed and publicized, learn on them and 00:43:18.800 |
then find the next zero day exploit on a whole bunch of 00:43:21.160 |
stuff. I mean, it's like even publish that idea. Please don't 00:43:23.400 |
I mean, I'm worried about publishing that idea. That's 00:43:25.880 |
not an intellectual leap. I mean, you have to be an hour. 00:43:29.280 |
Obvious. Okay, let's move on. Another great debate. Elon hired 00:43:33.680 |
a CEO for Twitter, Linda Yaccarino, I'm hoping I'm pronouncing 00:43:37.840 |
that correct. She was the head of ad sales at NBC Universal. She's a 00:43:40.440 |
legend in the advertising business. She worked at Turner 00:43:43.760 |
for 15 years before that she is a workaholic is what she says 00:43:47.840 |
she's going to take over everything but product and CTO 00:43:51.240 |
Elon's gonna stick with that. She seems to be very moderate 00:43:57.040 |
and she follows people on the left or right people are 00:43:59.400 |
starting the character assassination and trying to 00:44:01.640 |
figure out her politics. She was involved with the WEF, the World 00:44:06.800 |
Economic Forum, which anybody in business basically does. But 00:44:10.400 |
your take sacks on this choice for CEO and what this means just 00:44:16.360 |
broadly for the next six months because we're sitting here at 00:44:20.160 |
six months almost exactly since you on took over obviously you 00:44:23.040 |
and I were involved in month one but but not much after that. 00:44:25.480 |
What do you think the next six months holds and what do you 00:44:27.480 |
think her role is going to be obviously some precedent this 00:44:33.120 |
Listen, I think this choice makes sense on this level 00:44:37.400 |
Twitter's business model is advertising. Elon does not like 00:44:40.960 |
selling advertising. She's really good at selling 00:44:43.200 |
advertising. So he's chosen a CEO to work with who's highly 00:44:46.640 |
complementary to him in their skill sets and interests. And I 00:44:52.000 |
think that makes sense. I think there's a lot of logic in that 00:44:54.640 |
what Elon likes doing is the technology and product side of 00:44:58.240 |
the business. He actually doesn't really like the, let's 00:45:03.880 |
call it, standard business chores, especially related 00:45:07.600 |
to, like we said, advertising, and he loathes all of that stuff. 00:45:10.720 |
Dealing with advertisers is his personal nightmare. Right? So I think the 00:45:15.440 |
choice makes sense on that level. Now. Instantly, you're 00:45:18.040 |
right, her hiring led to attacks from both the left and the right 00:45:22.360 |
the right, you know, pointed out her views on COVID and vaccines 00:45:27.160 |
and her work with the WEF. And on the left, I mean, the attack is 00:45:31.560 |
that she's following Libs of TikTok, which, you know, you're just not 00:45:36.920 |
Well, if you're just following Libs of TikTok, they want to 00:45:41.000 |
Well, she also follows David Sacks. So that does mean that 00:45:44.280 |
she's pretty... that is a signal. But the truth is, 00:45:47.280 |
Sacks, correct me if I'm wrong here, or Chamath, maybe I'll 00:45:49.360 |
send it to you: if you pick somebody that both sides 00:45:51.800 |
dislike or are trying to take apart, you probably picked the right 00:45:54.920 |
Here's what I think. Okay, go ahead. We're not going to know 00:45:59.320 |
how good she is for six to nine months. But here's what I took a 00:46:03.200 |
lot of joy out of. Here's a guy who gets attacked for all kinds 00:46:10.440 |
of things now, right? He's an anti Semite, apparently. And then 00:46:13.800 |
he had to be like, I'm very pro Semite. He's a guy that all of 00:46:16.720 |
a sudden people think is a conspiracy theorist. He's a guy 00:46:19.200 |
that people think is now on the raging right, all these things 00:46:23.080 |
that are just like inaccuracies, basically firebombs thrown by 00:46:27.280 |
the left. But here's what I think is the most interesting 00:46:30.440 |
thing for a guy that theoretically is supposed to be a 00:46:33.560 |
troll and everything else. He has a female chairman at Tesla, 00:46:37.480 |
a female CEO at Twitter, and a female president at SpaceX. Of 00:46:42.560 |
course, it's a great insight. It's the same insight. I think a 00:46:46.240 |
lot of these virtue signaling lunatics on the left, virtue 00:46:49.160 |
signaling mids. They're all nice. And you know, like, you 00:46:53.120 |
have Elizabeth Warren and Bernie Sanders giving, you know, the 00:46:58.920 |
CEO of Starbucks a hard time when he doubled the pay of the 00:47:04.480 |
minimum wage and gave them free college tuition. You love 'mid,' right? That's 00:47:08.440 |
a great term, isn't it? These fucking mids: 'He paid for their 00:47:13.240 |
college tuition? What gives you the right, Starbucks, to pay 00:47:17.280 |
for college tuition and double the minimum wage?' That's so, so... 00:47:23.000 |
Isn't it so great? Like, you can picture them when I say that, 00:47:26.520 |
these mids, feverishly typing on their keyboards, their 00:47:30.080 |
virtue signaling nonsense. Sacks, wrap it up. 00:47:32.480 |
Yeah, look, like you said, Elon has worked extremely well with 00:47:36.480 |
Gwynne Shotwell, who's been the president of SpaceX for a long 00:47:38.680 |
time. And I think that relationship shows the way to 00:47:44.240 |
make it work here at Twitter, which is they have a very 00:47:46.280 |
complementary skill set. I think my understanding is that Gwynne 00:47:48.640 |
focuses on the business side and the sales side of the operation, 00:47:52.280 |
Elon focuses on product and technology. She lets Elon be 00:47:55.280 |
Elon. I think if Linda tries to rein Elon in, tell him not to 00:48:00.200 |
tweet, or tries to meddle in the free speech aspects of the 00:48:04.040 |
business, which is the whole reason he bought Twitter, which 00:48:06.720 |
is the beginning and end of it. Yeah, that's right. That's when 00:48:09.400 |
it will fall apart. So my advice would be let Elon be Elon. You 00:48:13.760 |
know, he bought this company to make it a free speech platform. 00:48:16.640 |
Don't mess with that. And I think it could work great. And 00:48:20.680 |
a free speech platform? It is, until you are saying anything 00:48:23.600 |
about COVID. And I really don't even want to say it here, 00:48:26.000 |
because I don't want us to even say the word COVID or vaccine; it 00:48:29.240 |
means that this could get tagged by YouTube and be, you know, 00:48:33.400 |
deprecated. The algorithm could, I don't know what 00:48:37.840 |
they call it, deprecate this, and then we don't show up and people 00:48:40.360 |
don't see us because we just said the word COVID. I mean, the 00:48:42.640 |
censorship built into these algorithms is absurd. But 00:48:45.480 |
speaking of absurd, Lina Khan, who has been the least effective 00:48:50.160 |
FTC chair, I think, started out pretty promising with some 00:48:54.660 |
interesting ideas. She's now moved to block a major pharma 00:48:57.640 |
deal. In December, Amgen agreed to acquire Dublin-based Horizon 00:49:02.980 |
Therapeutics for $27.8 billion. This is the largest pharma deal 00:49:06.040 |
announced in 2022. FTC has filed a lawsuit in federal court 00:49:08.920 |
seeking an injunction that would prevent the deal from closing. 00:49:11.600 |
The reasoning is the deal would allow Amgen to entrench the 00:49:14.860 |
monopoly positions of Horizon's eye and gout drugs. The agency said 00:49:19.680 |
that those treatments don't face any competition today and that 00:49:22.840 |
Amgen would have a strong incentive to prevent any 00:49:25.840 |
potential rivals from introducing similar drugs. 00:49:28.880 |
Chamath, the pharmaceutical industry is a little bit... 00:49:34.080 |
Explain why, and then Sacks, we'll go to you on the gout stuff. 00:49:37.680 |
Well, I know that personally impacts you. Go ahead. 00:49:39.520 |
I think that this is a little like scientifically illiterate, 00:49:44.040 |
to be honest. Unpack the thing is that you want drugs that can 00:49:48.600 |
get to market quickly. But at the same time, you want drugs to 00:49:51.600 |
be safe, and you want drugs to be effective. And I think that 00:49:55.480 |
the FDA has a pretty reasonable process. And one of the direct 00:50:00.360 |
byproducts of that process is that if you have a large 00:50:03.040 |
indication that you're going after, say diabetes, you have to 00:50:06.540 |
do an enormous amount of work, it has to be run on effectively 00:50:10.440 |
1000s of people, you have to stratify it by age, you have to 00:50:13.700 |
stratify it by gender, you have to stratify by race, you have to 00:50:16.960 |
do it across different geographies, right? The bar is 00:50:19.720 |
high. But the reason the bar is high is that if you do get 00:50:22.580 |
approval, these all of a sudden become these blockbuster $10, $20, 00:50:26.660 |
$30 billion drugs, okay. And they improve people's lives, and 00:50:31.040 |
they allow people to live, etc, etc. What has happened in the 00:50:35.200 |
last 10 or 15 years, because of Wall Street's influence inside 00:50:39.480 |
of the pharma companies, is that what pharma has done a very good 00:50:45.440 |
job of doing is actually pushing off a lot of this very risky R&D 00:50:51.320 |
to young early stage biotech companies. And they typically do 00:50:56.080 |
the first part of the work, they get through a phase one, they 00:50:59.720 |
even may be able to sometimes go and start a phase two 00:51:03.640 |
trial. And then they typically can get sold to pharma. And 00:51:09.080 |
these are like multi billion dollar transactions. And the 00:51:12.920 |
reason is that the private markets just don't have the 00:51:15.160 |
money to support the risk for these companies to be able to do 00:51:19.040 |
all the way through a phase three clinical trial because it 00:51:21.260 |
would cost in some cases $5, $6, $7, $8 billion. You've never 00:51:26.080 |
heard of a tech company raising that much money except in a few 00:51:28.520 |
rare cases. In pharma, in biotech, it just doesn't happen. So you 00:51:33.080 |
need the M&A machine to be able to incentivize these young 00:51:37.200 |
companies to even get started in the first place. Otherwise, what 00:51:40.800 |
literally happens is you have a whole host of diseases that just 00:51:44.520 |
stagnate. Okay. And instead, what happens is a younger 00:51:48.920 |
company can only raise money to go after smaller diseases, which 00:51:53.000 |
have smaller populations, smaller revenue potential, 00:51:56.040 |
smaller costs, because the trial infrastructure is just less. So 00:52:00.200 |
if you don't want industry to be in this negative loop, where you 00:52:04.080 |
only work on the small diseases, and you actually go and tackle 00:52:06.720 |
the big ones, you need to allow these kinds of transactions 00:52:10.160 |
happen. The last thing I'll say is that even when these big 00:52:12.720 |
transactions happen, half the time they turn out to still not 00:52:16.480 |
work, there is still huge risk. So don't get caught up in the 00:52:20.120 |
dollar size. You have to understand the phase it's in. 00:52:22.800 |
And the best example of this is, the biggest outcome in 00:52:26.120 |
biotech private investing in Silicon Valley was this thing 00:52:29.160 |
called Stemcentrx. And that thing was a $10 billion dud, 00:52:32.200 |
right. But it allowed all these other companies to get started 00:52:35.800 |
after Stemcentrx got bought for $10 billion. 00:52:37.880 |
Friedberg, I want to get your take on this, especially in 00:52:41.640 |
light of maybe something people don't understand, which is that 00:52:44.520 |
the amount of time you get to actually exclusively monetize a 00:52:48.680 |
drug, because my understanding you correct me if I'm wrong, you 00:52:52.520 |
get a 20 year patent, it's from the date you file it, but then 00:52:55.320 |
you're working towards getting this drug approved by the FDA. 00:52:57.840 |
So by the time the FDA approves the drug, this 20 year patent 00:53:00.920 |
window, how many years do you actually have exclusively to 00:53:03.880 |
monetize that drug? And then your wider thoughts on this? 00:53:07.000 |
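A rough, illustrative sketch of the exclusivity arithmetic behind that question. The patent-term and development-time figures below are assumptions for illustration, not numbers from the episode, and it ignores patent-term extensions and other regulatory exclusivities:

```python
# Rough sketch only: a US utility patent runs ~20 years from the filing date,
# so every year spent in trials and FDA review before approval eats into the
# window of exclusive post-approval sales. Development times are hypothetical.
PATENT_TERM_YEARS = 20

def effective_exclusivity(years_filing_to_approval: float) -> float:
    """Years of patent life left to sell the drug after approval."""
    return max(0.0, PATENT_TERM_YEARS - years_filing_to_approval)

for dev_years in (8, 10, 12, 14):   # assumed filing-to-approval timelines
    print(f"{dev_years} yrs of development -> "
          f"~{effective_exclusivity(dev_years):.0f} yrs of exclusive sales")
```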
The FTC? Yeah, I'm not going to answer that question right now. 00:53:10.000 |
Because I do want to kind of push back on the point I'm 00:53:12.960 |
generally pretty negative on a lot of the comments Lina 00:53:16.240 |
Khan has made and her positioning. And obviously, as you guys know, 00:53:18.760 |
we've talked about it on the show. But I read the FTC filing 00:53:22.480 |
in federal court. And if you read the filing, let me just 00:53:28.040 |
start: the company that Amgen's trying to buy is called 00:53:29.800 |
horizon therapeutics, which is a company that's doing about 4 00:53:33.000 |
billion in revenue a year about a billion to a billion and a 00:53:35.360 |
half in EBITDA. So it's a it's a business that's got a portfolio 00:53:39.200 |
of orphan drugs, meaning drugs that treat orphan conditions 00:53:42.240 |
that aren't very big blockbusters in the in the 00:53:44.320 |
pharmaceutical drug context. And so it's a it's a nice portfolio 00:53:48.480 |
of cash generating drugs. Amgen, buying the business, gives 00:53:53.320 |
them real revenue, real EBITDA, and helps bolster a portfolio 00:53:57.920 |
that you know, is aging. And I think that's a big part of the 00:54:01.240 |
strategic driver for Amgen to make this massive $28 billion 00:54:04.920 |
acquisition. The FTC's claim in the filing, which I actually 00:54:10.200 |
read and I was like, this is actually a pretty good claim is 00:54:13.840 |
that the way that Amgen sets the prices for their 00:54:16.760 |
pharmaceutical drugs is they go to the insurance companies, the 00:54:19.720 |
payers and the health systems, and they negotiate drug pricing. 00:54:23.160 |
And they often do bulk multi product deals. So they'll say, 00:54:27.760 |
hey, we'll give you access to the product at this price point, 00:54:31.320 |
but we need you to pay this price point for this product. And 00:54:33.800 |
over time, that drives price inflation, it drives costs up. 00:54:37.160 |
And it also makes it difficult for new competitors to emerge. 00:54:39.680 |
Because they tell the insurance company, you have to pick our 00:54:42.400 |
drug over other drugs in order to get this discounted price. And 00:54:46.520 |
so it's a big part of their negotiating strategy that they 00:54:48.520 |
do with insurance companies. So the FTC's claim is that by 00:54:52.080 |
giving Amgen this large portfolio of drugs that they're 00:54:55.240 |
buying from horizon, it's going to give them more negotiating 00:54:58.040 |
leverage and the ability to do more of this drug blocking that 00:55:00.880 |
they do with insurance companies and other payers in the drug 00:55:03.920 |
system. So they're trying to prevent pharmaceutical drug 00:55:06.760 |
price inflation, and they're trying to increase competition 00:55:09.280 |
in their lawsuit. So I felt like it was a fairly kind of 00:55:12.040 |
compelling case. I'm no lawyer on antitrust and monopolistic 00:55:16.600 |
practices and the Sherman Act. But this was not sorry, let me 00:55:18.680 |
just say this was not an early stage biotech risky deal that 00:55:22.160 |
they're trying to block. No, it is a mature company with 00:55:25.080 |
4 billion in revenue and a billion and a half in EBITDA. I 00:55:27.280 |
understand. And I read it too. But two comments. It is because 00:55:31.440 |
the people that traffic in these stocks are the same ones that 00:55:34.200 |
fund these early stage biotech companies. And I talked to a 00:55:37.160 |
bunch of them. And they're like, if these guys block this kind of 00:55:39.680 |
deal, we're going to get out of this game entirely. So just from 00:55:43.560 |
the horse's mouth, what I'm telling you is you're going to 00:55:45.320 |
see a pall come over the early stage venture financing 00:55:48.640 |
landscape, because a lot of these guys that are crossover 00:55:51.280 |
investors that own a lot of these public biotech stocks that 00:55:53.880 |
also fund the private stocks will change their risk posture 00:55:57.480 |
if they can't make money. That's just the nature of capitalism. 00:56:00.360 |
The second thing is Lina Khan did something really good about 00:56:03.680 |
what you're talking about this week, actually, which is she 00:56:06.280 |
actually went after the PBMs. And if you really care about 00:56:09.600 |
drug inflation, and you follow the dollars, the real culprits 00:56:12.800 |
around this are the pharmacy benefit managers. And she 00:56:16.160 |
actually launched a big investigation into them. But 00:56:18.960 |
this is what speaks to the two different approaches. It seems 00:56:21.720 |
that unfortunately, for the FTC, every merger just gets contested 00:56:26.080 |
for the sake of it being contested. Because I think that 00:56:29.280 |
if you wanted to actually stop price inflation, there are 00:56:31.920 |
totally different mechanisms. Because why didn't you just sue 00:56:35.280 |
all the PBMs? Well, there's no merger to be done. But you can 00:56:39.000 |
investigate, and then you could regulate. And I think that 00:56:42.040 |
that's probably a more effective way. And the fact that she 00:56:44.640 |
targeted the PBM says that somebody in there actually 00:56:47.240 |
understands where the price inflation is coming from. But I 00:56:50.520 |
don't think something like an Amgen horizon, because what I 00:56:53.560 |
think will happen is all the folks will then just basically 00:56:56.640 |
say, Well, man, if if, if these kinds of things can't get 00:56:59.720 |
bought, then why am I funding these other younger things? Yeah, 00:57:02.440 |
yeah, we're just not seeing a lot of the younger stuff get 00:57:05.440 |
blocked. I don't think we've seen any attempts at blocking 00:57:09.920 |
speculative portfolio acquisitions or speculative 00:57:13.560 |
I think these guys are getting caught up in the dollar number. 00:57:15.800 |
You know, yeah. So I think the problem is they see 28 billion, 00:57:19.160 |
they're like, we need to stop it. You know, it's amazing. I'll 00:57:21.760 |
just wrap on this, because it's a good discussion. But I think 00:57:23.840 |
we got to keep moving here, is I took the PDF that you shared 00:57:27.320 |
and I put it into ChatGPT. Oh, and you don't need to upload 00:57:32.960 |
the PDF anymore. You could just say 'summarize this' and put in the link. 00:57:41.240 |
I just used it. No, it's not the browser plugin, I just did this 00:57:44.400 |
in the GPT-3.5 model. I just gave it a link and it 00:57:47.240 |
pulled the link in. The GPT-3.5 model does that now? We could do 00:57:51.360 |
that? That's new. They must have added browsing in the background. 00:57:54.960 |
Yeah, or just pulling it. They did today, by the way. Whoa. 00:57:58.560 |
They did it today. Yeah. Well, remember last week, we said that 00:58:02.000 |
they had to build browsing into the actual product like bard, 00:58:05.080 |
right? Otherwise, it can't move. And I got to say, ClosedAI, 00:58:09.080 |
ClosedAI is on the top of their game. The app is available in 00:58:11.760 |
the App Store now, right, as of today? No, they had a test 00:58:15.080 |
app. I was on the test. No, no, no, they just launched the app 00:58:17.560 |
today. Oh, that's game over, man. If this thing is in app 00:58:20.520 |
form, that's going to 10 x the number of users and it's going 00:58:22.880 |
to 10 x the amount of usage. The same thing for bard, we should 00:58:25.880 |
compare the two, but Bard is pretty good as well. Yeah. Yeah. 00:58:29.680 |
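For anyone who wants to reproduce the exercise, here is a minimal sketch of the workflow described above, with assumptions: the filing URL is a hypothetical placeholder, and since the plain API will not fetch a link the way the app reportedly did, the script downloads the PDF and extracts the text itself before asking for a summary (requires the requests, pypdf, and openai packages and an OPENAI_API_KEY):

```python
# Minimal sketch of "paste a link, get a summary" -- assumptions noted above.
from io import BytesIO

import requests
from openai import OpenAI      # pip install openai
from pypdf import PdfReader    # pip install pypdf

PDF_URL = "https://example.com/ftc-amgen-horizon-complaint.pdf"  # hypothetical URL

def summarize_pdf(url: str, model: str = "gpt-3.5-turbo") -> str:
    raw = requests.get(url, timeout=30).content
    pages = PdfReader(BytesIO(raw)).pages
    text = "\n".join(page.extract_text() or "" for page in pages)
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model=model,
        messages=[{
            "role": "user",
            "content": "Summarize this FTC filing in five bullet points:\n\n"
                       + text[:12000],  # crude truncation to stay inside the context window
        }],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(summarize_pdf(PDF_URL))
```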
Well, we'll bring you back to actually compare its summary with the Chat 00:58:33.320 |
GPT summary. Tell us which one's better. Right? This is some 00:58:37.720 |
interesting news here. You know, we speaking of platform shifts, 00:58:42.160 |
do I get to give my view on the Lina Khan thing? 00:58:44.200 |
Oh, well, yes. But first, I didn't want to read this, it was 00:58:49.160 |
getting a little, it was getting a little personal here, David. 00:58:51.920 |
And I didn't want to trigger you. I know you've been 00:58:53.720 |
struggling with the gout because of your lifestyle 00:58:56.920 |
choices, the alcohol, the foie gras, everything, but now you've 00:59:00.840 |
lost a lot of weight, to give you a lot of credit. Tell us, what do 00:59:03.360 |
you think about the bundling we're seeing here? Because it 00:59:05.200 |
does seem Microsoft-esque, with the operating system. Yeah, it's 00:59:07.400 |
very similar. And what I said in the context of tech is that we 00:59:10.480 |
should focus on the anti competitive tactics and stop 00:59:12.880 |
those rather than blocking all mergers. And I think the same 00:59:16.360 |
thing is happening now in the pharma space. If bundling is 00:59:19.120 |
the problem focus on bundling. The problem when you just block 00:59:22.360 |
M&A is that you deny early investors, one of the biggest 00:59:27.520 |
ways that they can make a positive outcome. And what's the 00:59:30.200 |
downstream effect of that? Yeah, exactly. Look, it is hard enough 00:59:33.800 |
to make money as either a pharma investor or as a VC, that 00:59:38.000 |
there's only two good outcomes, right? There's IPOs, there's 00:59:40.240 |
M&A, everything else basically goes, everything else is a zero 00:59:43.400 |
goes bankrupt. So if you take M&A off the table, you really 00:59:48.800 |
suppress the already challenged returns of venture capital. 00:59:53.840 |
And you're right, you mentioned earlier that we were willing to 00:59:56.520 |
give Lena Khan a chance. We thought that some of our ideas 00:59:59.520 |
were really interesting, because I think there are these huge 01:00:02.000 |
tech companies that do need to be regulated these big tech 01:00:05.880 |
monopolies, basically that you have the mobile operating system 01:00:08.720 |
duopoly with Apple and Google, you've got Amazon, you've got 01:00:11.600 |
Microsoft, and there is a huge risk of those companies 01:00:14.760 |
preferring their own applications over downstream 01:00:17.800 |
applications or using these bundling tactics. Yes. If you 01:00:22.520 |
don't put some limits around that, that creates I think, an 01:00:26.960 |
This is the insight. And I think it's exactly correct sex. For 01:00:30.800 |
Lina Khan, and if she listens to the pod: hey, Lina, you want 01:00:33.560 |
to go after tactics, not acquisitions. So if somebody 01:00:37.920 |
buys something, and they lower prices and increase consumer 01:00:41.520 |
choice, that's great. If it encourages more people to 01:00:44.840 |
invest more money into innovation, that's great. But if 01:00:48.600 |
the tactics are we're going to bundle these drugs together to 01:00:51.160 |
keep some number of them artificially high or reduced 01:00:54.000 |
choice, or if we're going to bundle features into the you 01:00:56.880 |
know, suite of products, and we do anti competitive stuff, you 01:00:59.920 |
have to look at the tactics on the field are people cheating? 01:01:03.280 |
And are they using the monopoly power to force you to use their 01:01:06.560 |
app store? Just make Apple have a second app store. That's all 01:01:10.880 |
we're asking you to do. There should be an app store on iOS 01:01:15.320 |
that doesn't charge any fees or charges 1% fees. Break the 01:01:21.800 |
Sacks is so right. Perfectly said. She actually did issue 01:01:25.040 |
compulsory orders to the PBMs. So to your point, Sacks, the FTC 01:01:30.400 |
has been worried that what freeberg said has been 01:01:32.840 |
happening. But the real sort of middleman manipulator in this 01:01:36.240 |
market are the pharmacy benefit managers. And so this week, she 01:01:39.400 |
actually issued compulsory orders to the PBMs and said, 01:01:41.880 |
Turn over all your business records to me, I'm going to look 01:01:44.240 |
into them. That makes a ton of sense. But then on the same end, 01:01:48.320 |
it's like you see a merger and you're like, no, it can't 01:01:50.280 |
happen. It just doesn't speak to a knowledge of the market. 01:01:53.200 |
We should have Lina on the pod. Lina, I know you listen to the 01:01:56.040 |
pod, I've heard through the back channel. Just come on the pod. She'd 01:01:58.200 |
be a good guest, right? We would have a good conversation, I 01:02:00.120 |
think. Yeah. Open invite. Nikki Haley is coming on the pod. By 01:02:03.640 |
the way, you have homework to do for the summit, which is to 01:02:06.120 |
see if you can get Donald Trump to come on to the summit. Okay, 01:02:13.400 |
cute. Okay, not as good as The Apprentice, but close. Okay. 01:02:19.480 |
Your mannerisms are unbelievable. How did you 01:02:23.240 |
practice it? I did a little bit, only because I like 01:02:25.560 |
to show people and trigger them. I'm going to really dial in my 01:02:28.680 |
Trump in the coming weeks. All right, here we go. Apple's long 01:02:31.520 |
anticipated AR headset. AR stands for augmented reality; 01:02:35.680 |
VR means you can't see the real world, you're just in a 01:02:40.760 |
virtual world, while AR lets you put digital assets on the real 01:02:44.600 |
world, so you can see what's happening in the real world, but 01:02:46.600 |
you can put graphics all around. That's expected to be revealed 01:02:49.800 |
As early as June, the projected cost is going to be around 01:02:55.360 |
$3,000. It won't ship until the fall. This is a break from 01:02:59.080 |
Apple's typical way of releasing products, which is to wait till 01:03:03.040 |
it's perfect and to wait until all consumers can afford it. 01:03:05.800 |
This is a different approach. They're going to give this out 01:03:07.640 |
to developers early. And Tim Cook is supposedly pushing this; 01:03:11.760 |
there was another group of people inside of Apple who did 01:03:14.240 |
not want to release it, Chamath. But there is some sort of external 01:03:18.600 |
battery pack. It seems like a bit of a Franken-product, 01:03:21.440 |
Frankenstein kind of project here, that, you know, perhaps 01:03:26.120 |
Steve Jobs wouldn't have wanted to release but he needs to get 01:03:28.480 |
it out, I think, because Oculus is making so much progress. The 01:03:31.840 |
killer app supposedly is a FaceTime-like live chat 01:03:35.320 |
experience that seems interesting, but they look like 01:03:38.000 |
ski goggles. Your thoughts Chamath on this as the next 01:03:42.200 |
compute platform if they can get it, you know to work would you 01:03:46.200 |
wear these? Would they have to be Prada? What's the story? No, 01:03:50.320 |
this seemed like a weird conversation because none of us 01:03:52.240 |
fucking know, none of us have seen this product and none of us 01:03:54.280 |
have used it. So like, this is just... Palmer Luckey, friend of the pod... 01:03:58.640 |
So what, that's just like commenting on one guy's view. 01:04:02.400 |
Look, Palmer knows. I mean, Palmer invented Oculus. 01:04:05.920 |
Great. What are we talking about? We have nothing to say 01:04:08.040 |
Let me reframe it as a really good question here. Do you believe 01:04:11.800 |
this is going to be a meaningful compute platform in the coming 01:04:14.720 |
years, because Apple is so good at product? How do we know till 01:04:17.680 |
we see it? We got to see it. I think, as with Facebook, of course 01:04:20.800 |
they're good at product. Let's see the 01:04:23.760 |
all right, fine. Saks, what are your thoughts? 01:04:25.480 |
I think it's a good thing that they're launching this. Like you 01:04:27.880 |
said, it is a deviation for what they've normally done. They 01:04:30.200 |
normally don't release a product unless they believe the entire 01:04:33.520 |
world can use it. So their approach has been only to 01:04:37.080 |
release mass-market products and have a very small 01:04:40.200 |
portfolio of those products. But when those products work, 01:04:43.240 |
they're, you know, billion-user home runs. This obviously can't 01:04:46.920 |
be that at a $3,000 price point. And it also seems like it's a little 01:04:49.960 |
bit of an early prototype where the batteries are like in a 01:04:53.000 |
fanny pack around your waist. And there's a ways to go around 01:04:56.440 |
this, but I give them credit for launching what is probably going 01:05:00.800 |
to be more of an early prototype so they can start iterating on 01:05:03.440 |
it. I mean, the reality is the Apple watch the first version 01:05:05.960 |
kind of sucked, and the first five versions did. Yeah, now they're on 01:05:09.320 |
one that's pretty good, I think. So look, I think this is a cool 01:05:11.840 |
new platform. They get knocked on for not innovating enough. I 01:05:15.160 |
think good, let them try something new. I think this will 01:05:18.120 |
be good for meta to have some competition. Yeah, it's great. 01:05:21.720 |
Having two major players in the race, maybe it actually speeds 01:05:27.240 |
She wants any of that here. I mean, I think they should have 01:05:30.680 |
done something in cars. I agree with David, they should have done 01:05:35.800 |
cars. Well, if you were going to talk about the car, what would 01:05:38.120 |
it be? Tell me what you think would be the right approach you 01:05:40.760 |
were going to do the Facebook phone that could have changed 01:05:42.880 |
the entire destiny of Facebook, they should have bought Tesla 01:05:45.000 |
when they had the chance for four or $5 billion. They 01:05:47.200 |
could have bought it for 10 billion, 20 billion. And when it 01:05:49.920 |
got to 50, 60, that's when it got out of reach. What do you think the car 01:05:53.880 |
should... No, they could have bought it at 100 billion. They 01:05:55.400 |
could have bought it at 100 billion. Tim Cook famously wouldn't take 01:05:57.600 |
the meeting. Elon said it, he wouldn't meet. 01:06:01.000 |
Maybe they missed an opportunity there. But I do think the end 01:06:03.680 |
game with the AR headset is glasses, right? Yes. Where you 01:06:07.640 |
get the screens and you get the Terminator mode. And when it's 01:06:14.120 |
like fancy technology, glasses this size is what you're 01:06:19.040 |
talking about. Yeah, with a little camera built in, and in 01:06:22.480 |
conjunction with AI, then it gets really interesting. So 01:06:27.600 |
give the audience an example of what this combination of AI plus AR could do. 01:06:31.560 |
when you're walking around, it could layer on intelligence 01:06:33.960 |
about the world or you meet with somebody and it can remind you 01:06:36.880 |
of their name and the last time you met with them and give you a 01:06:39.240 |
summary of what you talked about. What action items there 01:06:42.960 |
You can be walking in a city and, because it knows you 01:06:46.600 |
like Peking duck, it could show you, hey, there's a Peking duck 01:06:48.800 |
place over here, here are some reviews. It just knows you and it's 01:06:51.960 |
What about for people that do the same routine 99% of the time? 01:06:56.880 |
It could tell you your steps every day, could tell you 01:07:00.200 |
incoming messages so you don't have to take your phone out. Would you spend 01:07:01.880 |
$3,000 on that? No, but you would spend... people spend 01:07:06.400 |
$400-plus on notifications on your wrist. Why 01:07:10.120 |
do you want it on your eyes for 3,000? I would love this. 01:07:12.600 |
Maybe I just do like a lot of meetings or I'm at events where 01:07:16.000 |
people are coming up to me and I've met them like once a year 01:07:18.280 |
before. Like it would be really helpful to kind of have the 01:07:21.440 |
Terminator mode. Let's be honest, though, the Terminator mode, for you to 01:07:24.360 |
be able to be present with your family and friends, but be 01:07:28.320 |
playing chess with Peter Thiel on those glasses. That's your 01:07:31.000 |
dream come true. You and Peter in AR playing chess all day 01:07:35.520 |
long. Throw up the picture of sacks beating Peter Thiel. I 01:07:39.480 |
watched the clip from the early all in episodes when we 01:07:42.400 |
discussed you beating Peter Thiel. What a great moment it 01:07:44.760 |
was for you. All right, listen, let's wrap up with this Gallup 01:07:47.600 |
survey: the number of Americans who say it's a good time to 01:07:51.360 |
buy a house has never been lower. 21% say it's a good time to buy 01:07:55.480 |
a house, down 9% from the prior low of a year ago. Prior to 01:08:00.520 |
2022, 50% or more consistently thought it was a good time to 01:08:03.320 |
buy. Significantly fewer expect local housing 01:08:07.120 |
prices to increase in the year. Hey, Sacks, is this like 01:08:11.360 |
predictive of a bottom and pure capitulation and then that means 01:08:15.720 |
maybe it is in fact a good time. How would you read this data? 01:08:17.880 |
I don't see it as a bottom necessarily the way I read the 01:08:21.120 |
data is that the spike in interest rates has made it very 01:08:24.680 |
unaffordable to buy a house right now you've got you know, 01:08:27.040 |
the mortgages are what like 7% interest rate or even slightly 01:08:31.520 |
higher. So people just can't afford the same level of house 01:08:35.360 |
that they did before. I mean, mortgages were at three, three 01:08:38.720 |
and a half percent like a year and a half ago. Now I think 01:08:41.800 |
what's kind of interesting is that even in the 1980s, the 01:08:44.840 |
early 1980s, when interest rates were at like 15%, you still had 01:08:50.360 |
50% who thought it was an okay time to buy a house or an attractive 01:08:53.120 |
time to buy a house. So for the number to be this low tells me 01:08:58.000 |
that is not just about interest rates, I think consumer 01:09:00.600 |
confidence is also plummeting, and people are feeling more 01:09:05.480 |
insecure. So I think it's just another economic indicator that 01:09:11.160 |
things are looking really shaky right now. And I'll tell you one 01:09:13.960 |
of the the knock on effects of this is going to be that people 01:09:16.720 |
can't move. Because in order to move, you have to sell your 01:09:20.160 |
current house, and then buy a new one. And you're not going to 01:09:23.360 |
want to sell your current house when prices are going down. And 01:09:26.800 |
then for the new one, you're going to lose your 3% mortgage 01:09:29.760 |
and have to get a new one at 7%. So you're not gonna buy 01:09:32.360 |
anything like the house. So it freezes the market, it freezes 01:09:36.160 |
mobility. I think over the last few years, during COVID, you saw 01:09:39.720 |
tremendous movement between states, I think that's gonna 01:09:42.800 |
slow down a lot now, because people just can't afford to 01:09:45.520 |
trade houses. So as a result of that, I think discontent is 01:09:49.640 |
going to rise. Because I think one of the ways that you create 01:09:52.320 |
a pressure valve is when people are unhappy in a state, they 01:09:55.600 |
just move somewhere else. Well, now they're not going to do 01:09:57.320 |
that. Well, you can also move to a better opportunity for you and 01:09:59.960 |
your family, whether that schools, taxes, a job, 01:10:02.680 |
lifestyle. So yeah, you're going to reduce joy 01:10:06.080 |
in the country. And it also, it screws with price discovery, 01:10:09.100 |
doesn't it, Chamath? If you don't have a fluid market 01:10:11.800 |
here, then how does anybody know what their house is worth? And 01:10:14.840 |
this just, again, creates more of that frost, I think. 01:10:18.360 |
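A back-of-the-envelope sketch of the rate math Sacks is describing. The $800,000 loan size is an assumed illustrative number, not one from the episode; the function is just the standard fixed-rate amortization formula:

```python
# Why trading a ~3% mortgage for a ~7% one freezes people in place.
# Loan amount is an assumption for illustration only.

def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Standard fixed-rate mortgage payment: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of payments
    return principal * r / (1 - (1 + r) ** -n)

loan = 800_000
for rate in (0.03, 0.07):
    print(f"{rate:.0%} 30-yr loan on ${loan:,}: ${monthly_payment(loan, rate):,.0f}/month")

# Roughly $3,400/month at 3% versus about $5,300/month at 7% -- the same house
# costs well over 50% more per month, which is why nobody wants to trade.
```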
I think Friedberg has said this a couple times, Friedberg, you 01:10:20.840 |
can correct me if I'm wrong. But like, the home is like the 01:10:24.280 |
disproportionate majority of most Americans' wealth, right? 01:10:30.600 |
All their wealth. Yeah. So I mean, there's that factoid. 01:10:36.960 |
Yeah. And then what does that do for their retirement savings or 01:10:40.200 |
whatever? It's okay. You got incoming, what's going on? 01:10:43.160 |
No, I was looking at a mansion that's for sale, 01:10:48.200 |
like $175 million, but they just cut the price to 140. So 01:10:56.160 |
there's gonna be a lot of distress in the market soon. I'm 01:10:58.880 |
predicting a lot of distress. Actually, can we shift to the 01:11:01.040 |
commercial side for a second? He just passed away. Yeah, Sam 01:11:04.840 |
Zell passed away today. Oh, wow. Rest in peace. Yeah. Rest in 01:11:08.280 |
peace. Chicago. Yeah, crazy. Fantastic, interesting 01:11:13.560 |
guy. Yeah. But speaking of the real estate market, so I want to 01:11:16.080 |
give an update on San Francisco CRE. I was talking to a broker 01:11:20.640 |
the other day. And so here are the stats that they gave me. So 01:11:25.720 |
it was a local broker and someone from Blackstone. And 01:11:28.840 |
they're fans of the pod and just came up to me, we started 01:11:30.760 |
talking about what's happening in San Francisco. Shout out to 01:11:33.760 |
them. Didn't take a photo. But in any event, 01:11:37.160 |
they're they're fans of the pod. So we started talking about 01:11:39.080 |
what's happening in San Francisco, real estate. So the 01:11:43.000 |
SF office market, just to level-set, is 90 million square feet; 01:11:46.440 |
they said the vacancy rate is now 35%. So that's over 30 01:11:50.000 |
million square feet vacant, and vacancy still growing as leases 01:11:53.880 |
end, and companies shed space because some of that space that 01:11:57.280 |
they're not using is now up for sublease. Everyone says, what 01:12:00.200 |
about AI is AI going to be the savior? The problem is that AI 01:12:03.640 |
companies are only that's only about a million square feet of 01:12:07.320 |
demand. So 1 million out of 30 million is going to be absorbed 01:12:11.480 |
by AI. And you know, maybe that number grows over time over the 01:12:14.440 |
next 5-10 years as we create some really big AI companies, but it's 01:12:18.440 |
just not going to bail out San Francisco right now. The other 01:12:21.040 |
thing is that, you know, VC backed startups are very 01:12:24.360 |
demanding in terms of their tenant improvements, and 01:12:27.120 |
landlords don't really have the capital right now to put that 01:12:29.440 |
into the buildings. And startups just are not the kind of credit 01:12:33.320 |
worthy tenants that landlords really want. So this is not 01:12:37.720 |
going to bail anybody out there. They said there are a ton of 01:12:40.600 |
zombie office towers, especially in SoMa. And all these office 01:12:44.800 |
towers are eventually gonna be owned by the banks, which are 01:12:46.600 |
gonna have to liquidate them. And then we're gonna find out 01:12:49.280 |
that these loans that they made are gonna have to be written off. 01:12:52.080 |
Because the collateral that they thought was blue chip, that was 01:12:56.280 |
backing up those loans is not so blue chip anymore. So I think 01:12:59.800 |
we've got not just a huge commercial real estate problem, 01:13:02.680 |
but it's gonna be a big banking problem as basically people 01:13:06.120 |
stop pretending, you know, right now, they're trying to 01:13:08.240 |
restructure loans, it's called pretend and extend, you reduce 01:13:11.280 |
the rate on the loan, but add term to it. But that only works 01:13:15.320 |
for so long. If this keeps going, if the market keeps 01:13:18.160 |
looking like this, I think you're gonna have a real 01:13:20.160 |
problem. And that will be a problem in the banking system. 01:13:22.440 |
Now, San Francisco is the worst of the worst. But they said that 01:13:25.000 |
New York is similar, and all these other big cities with empty office towers. 01:13:29.920 |
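The quick arithmetic behind the numbers Sacks cites above; a sketch only, since the figures are the brokers' round numbers from the conversation, not audited data:

```python
# SF office market back-of-the-envelope: 90M sq ft, ~35% vacancy, ~1M sq ft of AI demand.
market_sqft = 90_000_000
vacancy_rate = 0.35
ai_demand_sqft = 1_000_000

vacant_sqft = market_sqft * vacancy_rate
print(f"Vacant space: ~{vacant_sqft / 1e6:.1f}M sq ft")                  # ~31.5M sq ft
print(f"AI demand absorbs: {ai_demand_sqft / vacant_sqft:.1%} of it")    # ~3%
```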
I'm in New York right now for the side connections conference. 01:13:32.880 |
And it is packed, the city is packed, getting anywhere, 01:13:38.880 |
there's gridlock, you can't walk down the street, you got to walk 01:13:41.840 |
around people every restaurant, it is dynamic. And then I talked 01:13:45.600 |
to people about offices. And they said, people are staying in 01:13:48.640 |
their houses and their tiny little New York apartments, 01:13:51.080 |
instead of going three train stops to their office, they go 01:13:53.640 |
to the office one or two days a week, unless you're like JP 01:13:55.920 |
Morgan, or some other places that have lowered the boom. But 01:14:00.280 |
there's a lot of people still working from home, the finance 01:14:02.720 |
people have all gone back, media people are starting to go back. 01:14:04.880 |
So they're there three to five days here. And the city is booming. 01:14:09.240 |
Contrast that I spent the last two weeks in San Francisco, 01:14:12.920 |
walking from Soma to the Embarcadero back dead, nobody in 01:14:17.760 |
the city. It's like literally a ghost town. It's a real shame. 01:14:21.760 |
It's a real, real shame. I wonder... this is the 01:14:26.840 |
question I have for you, Sacks. Can they cut a deal? Can they go 01:14:30.480 |
to like month-to-month rent, sublets? You know, loosey-goosey, just 01:14:34.720 |
give people any dollar amount to convince them to come back? Is 01:14:38.200 |
there any dollar amount? Because I'm looking for a space for the 01:14:40.520 |
incubator in San Mateo, I've been getting a ton of inbound, 01:14:43.640 |
but the prices are still really high. And I'm like, how do I cut 01:14:47.240 |
a deal here? Because shouldn't people be lowering the prices 01:14:51.160 |
dramatically? Or are they just pretending, or will I get a lower 01:14:55.280 |
rates are definitely coming down big time, especially for space 01:14:58.520 |
that's sort of commodity and not that desirable. But what's 01:15:00.640 |
happening is, according to the people I talked to is that the 01:15:03.520 |
demand, the people who actually are looking for new space, they 01:15:06.880 |
only want to be in the best areas. And they want to be in 01:15:09.400 |
the newest buildings that have the best amenities. And so that 01:15:13.240 |
sort of commodity office tower where there's barely anybody 01:15:16.000 |
ever there, like no one wants that. So I think people would 01:15:19.360 |
rather pay a higher rent. I mean, the rent will still be 01:15:22.520 |
much lower, probably half the price of what it used to be, but 01:15:24.600 |
they'd rather pay a little bit more for that than get like a 01:15:27.800 |
zombie office tower. We can't talk about all this without 01:15:30.000 |
talking about two cases. Tragically, a shoplifter, a 01:15:36.320 |
criminal who was stealing from a drugstore in San Francisco, 01:15:40.880 |
got shot. And the video was released, I'm sure you've seen 01:15:44.200 |
it sacks. And then here in New York, everybody's talking about 01:15:47.280 |
this one instance of a marine trying to subdue a violent 01:15:53.360 |
homeless person with two other people. And it's on everybody's 01:15:57.040 |
minds here. And Brooke Jenkins is not prosecuting in San 01:16:01.720 |
Francisco, the shooter. It looked like, you know, a clean 01:16:05.640 |
shoot, as they would say in the police business, and 01:16:08.360 |
appropriate. And it's tragic to say, but the person did 01:16:11.720 |
charge the security guard, the security guard did fear for 01:16:15.200 |
their life and shot him. So Brooke Jenkins is not going to 01:16:18.040 |
pursue anything. But in New York City, they're pursuing 01:16:21.040 |
manslaughter for the person, which did seem a bit excessive from 01:16:23.720 |
the video, it's hard to tell what the reality is in these 01:16:25.600 |
situations. Any thoughts on it, David, these two cases in two cities? 01:16:29.600 |
Yeah, look, I mean, the only time you can get a Soros DA 01:16:34.160 |
excited about prosecuting someone is when they act in 01:16:37.080 |
self-defense or defense of others. I mean, this Marine, I 01:16:40.440 |
guess Daniel penny is his name, he was acting in defense of 01:16:44.200 |
others, the person who he stopped was someone with an 01:16:48.120 |
extensive criminal record, who had just recently engaged in an 01:16:54.000 |
attempted kidnapping, who had punched elderly people, had 01:16:58.080 |
dozens of arrests. In fact, people on Reddit were talking 01:17:02.680 |
about how dangerous this person was, apparently, a dozen years 01:17:06.760 |
ago or so he was seen as more of like a quirky like Michael 01:17:11.720 |
Jackson impersonator, street performer, street performer, but 01:17:15.560 |
something happened. This is according to a Reddit post that 01:17:18.360 |
I saw where something happened. And there was some sort of 01:17:21.040 |
psychological break. And then since then, he's had dozens and 01:17:24.200 |
dozens of crimes. And they just keep letting him loose through 01:17:29.320 |
this revolving door of a justice system we have. And now look, no 01:17:32.640 |
one likes to see him basically dying. And yeah, it's too bad. 01:17:37.640 |
It's horrible that that happened. Tragic. I don't know, 01:17:40.640 |
though, that if you're trying to stop someone, I don't know how 01:17:44.640 |
easy it is to precisely control whether you use too much force 01:17:48.640 |
or not. So I think Daniel Penny has a strong case that he was 01:17:52.960 |
acting in self-defense and defense of others. And there 01:17:55.560 |
were two other people, by the way, who are holding this 01:17:57.320 |
person down. There are three of them restraining him. And what 01:18:00.680 |
New Yorkers universally said to me, of all different backgrounds, 01:18:05.080 |
was, this is not a race issue. I think one or two of 01:18:07.880 |
the other people were people of color. It was not a race issue. 01:18:10.560 |
And they're trying to make it into a race issue in both these 01:18:12.640 |
cases. And this is literally what happens, just 01:18:16.320 |
having been through this in New York in the 70s and 80s. Who's 01:18:18.880 |
'they' when you say 'trying to make'? A bunch of 01:18:22.240 |
protests on the street, both in San Francisco and New York, 01:18:26.240 |
people protesting these as, you know, justice issues. The fact 01:18:29.720 |
is, if you allow lawlessness for too long a 01:18:33.680 |
period of time, you get a Bernie Goetz situation. And Bernie Goetz, 01:18:38.080 |
people can look it up, in the 80s. I was a kid when it 01:18:40.840 |
happened. But they tried to mug somebody, he had a gun, he shot 01:18:44.680 |
him. And like, this is what happens if you allow lawlessness 01:18:47.400 |
for extended periods of time. It's just, you're basically 01:18:51.860 |
gambling. And what happened to Bernie Goetz? He got found not guilty in 01:18:55.080 |
the case, but I think he had an illegal gun. 01:18:58.480 |
So he was guilty of that. The Bernie Goetz thing was really 01:19:01.680 |
crazy. Because at the time, the climate in New York, and this 01:19:06.480 |
1984 shooting. There was a portion of people who I don't 01:19:12.560 |
want to say they made him a hero, but they made it a 'see, 01:19:17.080 |
this is what happens if you allow us to be assaulted 01:19:19.840 |
forever, we're going to fight back at some point' thing. That was the 01:19:22.520 |
vibe in New York when I was a child. I was 14, 15 years old when 01:19:25.080 |
this happened. He was charged with attempted murder and assault. 01:19:27.640 |
Jason, what was the name of that vigilante group that used to 01:19:30.400 |
walk the streets? The something angels? That was the Guardian 01:19:33.640 |
Angels. Guardian Angels. So it was so bad in the 80s. And I 01:19:36.880 |
actually almost signed up for the guardian angels. I went to 01:19:39.560 |
their headquarters because I was practicing martial arts. And I 01:19:41.360 |
thought I would check it out. And they had their office in 01:19:44.800 |
Hell's Kitchen, I didn't wind up joining. But what they would do 01:19:46.880 |
is they would just ride the subway, they would wear a 01:19:49.920 |
certain type of hat, a red beret, and a Guardian Angels shirt. And all 01:19:53.560 |
they did was ride the subways, and they would just 01:19:56.600 |
ride the subways. And you felt... What martial arts were you 01:20:00.080 |
taking? Taekwondo? I was in Taekwondo. Yeah, this is before 01:20:03.680 |
mixed martial arts, but they just rode the subways. And 01:20:06.680 |
honestly, I've been on the subways with the many times you 01:20:08.800 |
felt safe. And it wasn't vigilantes. They were guardian 01:20:12.640 |
angels, they use that term. And many times they would do exactly 01:20:16.160 |
what this Marine did, which is try to subdue somebody who was 01:20:18.440 |
committing crime. I was, I had two distinct instances where 01:20:23.120 |
people tried to mug me, you know, riding the subways in New 01:20:25.800 |
York in the 80s. Two distinct times. And one was a group of 01:20:29.520 |
people and one was one person like it was pretty scary. Both 01:20:33.360 |
times I navigated it, but it was, yeah, not pleasant in the 80s in 01:20:38.360 |
New York. Let me say one more thing about this Daniel Penny, 01:20:41.040 |
Jordan Neely case. So look, at the end of the day, this is 01:20:43.240 |
gonna be litigated. I don't know all the details, they're gonna 01:20:46.080 |
have to litigate whether Daniel Penny's use of force was 01:20:49.160 |
excessive or not. But But here's the thing is that the media has 01:20:52.760 |
been falsely representing Jordan Neely by only posting 10 year 01:20:56.800 |
old photos of him and leaving out crucial information. This 01:20:59.320 |
was a press report. So again, this is why I mentioned the 01:21:02.560 |
whole Michael Jackson impersonator thing is that the 01:21:04.640 |
media keeps portraying Neely as this innocent, harmless guy 01:21:07.760 |
who is this like delightful Michael Jackson impersonator. In 01:21:10.840 |
truth, he hasn't done that in more than a decade, because 01:21:14.520 |
again, he had some sort of mental break. And since then, 01:21:17.840 |
he's been arrested over 40 times, including for attempting 01:21:20.680 |
to kidnap a seven year old child. And so the media is not 01:21:26.120 |
portraying this case, I think in an accurate way. And I think as 01:21:30.280 |
a result of that, it leads to pressure on the DA to prosecute 01:21:35.280 |
someone who has I think, a strong self defense claim. Or 01:21:38.280 |
you know, maybe the DA just wants to do this anyway. And it 01:21:41.280 |
gives the DA cover to do this. Soros... I mean, I know that we 01:21:45.420 |
had this back and forth. Why is CNN being inaccurate, do 01:21:47.880 |
you think, Sacks? They're basically cooperating with Alvin Bragg's 01:21:51.800 |
interpretation of the case. They're trying to make the case 01:21:57.120 |
Why don't they just take it straight down the middle? It's a 01:21:59.200 |
tragedy. We have a screwed up situation here. We got a mental 01:22:02.360 |
health crisis. And it's a tragedy for everybody involved 01:22:04.920 |
On the Bernie Goetz stuff: he served eight of a 12-month 01:22:08.200 |
sentence for the firearm charge. And he had a massive $43 01:22:11.800 |
million civil judgment against him in 1996, a decade later. 01:22:17.880 |
This is a little different than the Goetz thing, because pulling 01:22:21.360 |
out a gun and shooting somebody. Yeah, that's deadly intent. 01:22:24.040 |
Yeah, that's a huge escalation. Whereas Penny, he's 01:22:27.840 |
a trained Marine, right? He's trying to immobilize him. 01:22:30.520 |
You have to believe that he's just trying to subdue. Yes, Neely. 01:22:34.800 |
And so the chokehold ending up killing him, that's an 01:22:37.920 |
unfortunate consequence of what happened. But he was trying to 01:22:41.360 |
restrain the guy. As far as we know, right, as far as we know. 01:22:45.160 |
Yeah, I mean, tragedies all around. We got to have law and 01:22:48.480 |
order. I tweeted like, I don't know why we still have the post 01:22:51.400 |
office, maybe we can make that like once a week, and redo all 01:22:54.680 |
of that space and allow every American who's suffering from 01:22:57.640 |
mental illness to check in to what used to be the post 01:23:00.280 |
office. You know, maybe like once a week, and obviously give 01:23:04.120 |
those people very gentle landings, but I don't think we 01:23:06.240 |
need postal service more than once or twice a week. And then 01:23:08.800 |
let's reallocate some money towards mental health in 01:23:13.320 |
this country where anybody who's sick, who feels like they're 01:23:16.280 |
violent or feels like they're suicidal can just go into a 01:23:19.540 |
publicly provided facility and say, I'm a sick person, please 01:23:22.440 |
help me this would solve a lot of problems in society. We've 01:23:25.680 |
got a mental health crisis, we should provide mental health 01:23:28.160 |
services to all Americans. And it's obviously an easy thing for us 01:23:32.520 |
And if we had done that, then this never would have happened. 01:23:34.880 |
Exactly. I mean, literally, you have Sacks, who wants to balance 01:23:38.240 |
the budget, saying, hey, this is something worth spending on. When 01:23:41.800 |
compared to the impact on society, I don't think it'd be 01:23:44.960 |
We would save money. We'd save money because a city like San 01:23:48.120 |
Francisco could become quite livable, or New York. And then 01:23:51.640 |
you've got, God forbid, these terrible school shootings, you know, if 01:23:53.880 |
you avoid even one of them, it's 30 people's lives or 10 people's lives. 01:23:58.440 |
Rather than convert post offices, what we need to do is stand up scale 01:24:02.400 |
shelters, and it doesn't need to be done on the most expensive 01:24:05.040 |
land in a given city; do it outside of cities. Why does this? There is 01:24:09.720 |
no expectation in Europe for like Paris or London to be 01:24:13.360 |
affordable or Hong Kong to be affordable. There are affordable 01:24:15.920 |
places 30 minutes outside of those places where you could 01:24:18.760 |
put these facilities. I just want to ask one question to 01:24:21.320 |
Sacks, because I don't know, and I know Sacks is a little bit 01:24:24.720 |
deeper into this. What is George Soros's motivation for 01:24:28.680 |
putting in these lawless, insane DAs? Like, I understand that 01:24:34.120 |
he was able to buy them. They're low cost. There's not a lot of 01:24:37.080 |
money in them. Okay, I understand that, that's 01:24:38.480 |
table stakes. But what is his actual motivation for causing this? 01:24:42.760 |
Listen, we can't know exactly what his motivation is. But what 01:24:46.000 |
he did is he went into cities where he doesn't live and 01:24:50.680 |
flooded the zone with money to get his preferred candidates 01:24:53.440 |
elected as DAs. Now, the reason he did that was to change the law. 01:24:56.360 |
And the way that he changed the law was not through the legislature, 01:24:58.640 |
which is the way you're supposed to operate, but rather by abusing 01:25:01.560 |
prosecutorial discretion. So in other words, once he gets his 01:25:04.880 |
Soros DA elected, they can change the law by deciding what 01:25:08.320 |
to prosecute and what not to prosecute. Right. And that's why 01:25:10.880 |
there is so much lawlessness in these cities. 01:25:14.400 |
Yeah, but this is not the only way that Soros has, I'd say, 01:25:19.480 |
imposed his values on cities that he doesn't even live in. 01:25:24.320 |
Where does he live? I think he's a New York guy, but I'm not 01:25:26.640 |
sure. But but he's gone far beyond that, obviously, in these 01:25:31.120 |
elections, but also, he's done this across the world. Soros has 01:25:35.400 |
this thing called the Open Society Foundation, which sounds 01:25:37.680 |
like it's spreading democracy and liberal values, but in fact, 01:25:40.640 |
is fomenting regime change all over the world. And he's been 01:25:45.160 |
sponsoring and funding color revolutions all over the world. 01:25:47.800 |
Now, if you like some of the values he's spreading, then 01:25:50.880 |
maybe you think that's a good thing. But I can tell you that 01:25:54.040 |
the way this is perceived by all these countries all over the 01:25:57.480 |
world is it creates tremendous dissension and conflict. And 01:26:02.520 |
then they look at America and they basically say, you know, 01:26:05.680 |
this American billionaire is coming into our country, and 01:26:07.800 |
he's funding regime change. And it makes America look bad. Now 01:26:12.400 |
he's doing this, I think, with the cooperation of our State 01:26:16.000 |
Department, a lot of cases, and maybe the CIA, I don't know. But 01:26:19.640 |
this is why America, frankly, is hated all over the world as we 01:26:22.540 |
go running around meddling in the internal affairs of all 01:26:27.080 |
How old is this guy? Well, they're like... that was the other thing I 01:26:29.400 |
heard is that he's not all there. And the people around him 01:26:31.520 |
are doing these kind of things in his organization. 01:26:33.440 |
I heard something similar, that it's the idiot son, 01:26:35.760 |
Alexander, who's really now pulling the strings. Would you, 01:26:39.480 |
would you allow Soros to speak at the All-In Summit? Would you 01:26:41.560 |
interview him? Sure. Yeah, let's have Soros or his son. And Nick and 01:26:45.640 |
Well, apparently, there was an article that Alexander Soros has 01:26:48.640 |
visited the White House like two dozen times during the Biden 01:26:51.240 |
presidency. This is an extremely powerful and connected person. 01:26:55.020 |
I mean, I'm sure he listens to the pod. Okay, we'll see you all 01:26:57.900 |
next time. This is Episode 129 of All-In. We'll see you in 01:27:10.580 |
we open sourced it to the fans and they've just gone crazy. 01:27:31.560 |
We should all just get a room and just have one big huge orgy 01:27:39.960 |
because they're all just like this like sexual tension that