E94: NFT volume plummets, California's overreach, FBI meddling, climate change & national security
Chapters
0:00 Bestie intros + Burning Man debate
4:16 NFT trading volume plummets
10:20 California's new fast food wages bill, government creating bad incentives, how the free market response will further the labor issue
40:15 Zuckerberg on Rogan, FBI interference with the Hunter Biden laptop story
52:00 Balancing fighting climate change without harming the economy or national security, EU energy policy failure
66:04 Besties reflect on Mikhail Gorbachev's passing
All right, everybody, welcome to Episode 94 of All-In. This is 00:00:04.440 |
the summer unprepared episode. There's not much news. We're 00:00:07.140 |
gonna wing it. With me again, in his golfing cap, the Rain Man 00:00:14.520 |
himself, David Sacks. You doing okay, buddy? Yeah, that's good. 00:00:18.520 |
Yeah. All right. I'm kind of annoyed that we're taping on a 00:00:21.520 |
Wednesday. You know, this is a slow news week. It's a slow 00:00:24.660 |
news. Well, that's not why. We're taping on a Wednesday because 00:00:26.960 |
you want to go to Burning Man. Well, whatever. I 00:00:28.920 |
mean, listen, everybody has to get their burn on. Have you 00:00:31.160 |
ever been to Burning Man? Yeah, I've been before. Burning Man. 00:00:34.980 |
Really? He was there on threesome Thursday. I've been a 00:00:39.120 |
couple of times. I think it's a cool experience 00:00:42.960 |
worth doing, you know, once or twice in your life. It's not 00:00:46.740 |
something I want to do every year. But I know a lot of people 00:00:49.600 |
do like doing it every year. They're like really into it. 00:00:51.960 |
I'm not really into it that way. But I think it was a worthwhile 00:00:54.880 |
experience to go at least once. Chamath, have you been? 00:00:58.920 |
Friedberg? Have you been? I've not been, no. I have no desire. 00:01:02.160 |
No, I don't like driving in the car for a long period of time. 00:01:04.620 |
And well, by the way, I'll tell you why I don't 00:01:07.680 |
really have a huge desire to go. I don't really have a huge 00:01:12.780 |
desire at this point in my life to do a ton of drugs. 00:01:16.620 |
That's not, it's not just about that. Really? No, really. What 00:01:21.300 |
it's really about is art. Are we joking right now? 00:01:29.940 |
I really like art. I think I actually collect 00:01:32.640 |
phenomenal art. In fact, I like to go to, like, Frieze, the 00:01:35.700 |
Biennale. It would blow your mind. You have no idea. The scale of the 00:01:40.780 |
art there is tremendous. It is extraordinary. And if you're 00:01:43.420 |
into music, we just can we just call it what it is. It started 00:01:46.260 |
out as a festival that did celebrate art and creativity. 00:01:50.020 |
Yeah. And over time became mass market. And in order to become 00:01:54.100 |
mass market, there is still an element of that. But it's a 00:01:57.400 |
huge party. And I think people should just be more honest that 00:01:59.680 |
it's super fun. You can really rage for a weekend or for an 00:02:03.640 |
entire week. And people should enjoy themselves. But let's not 00:02:07.240 |
like, have to put all these labels on it about how, like, 00:02:09.700 |
intellectually superior it is and how it makes you feel. Like, bullshit. 00:02:12.520 |
I don't think it's really... It's okay. It's okay to go and do 00:02:16.240 |
drugs. The thing that's interesting about it is it's 00:02:18.420 |
really about community. A large number of people get together 00:02:20.740 |
they build... You can do drugs anywhere. No, it's community. No, you've never 00:02:24.080 |
been. You would... Literally, that's not the vibe there. 00:02:27.340 |
The vibe is really dancing and creativity and art and community. 00:02:31.180 |
So it's like a rave, where everybody's on drugs. 00:02:34.660 |
I mean, sure. I mean, with neon, with a lot of neon. No, 00:02:41.080 |
those experiences are the best when you're sober. 00:02:43.940 |
You have to understand the grace of going to EDC sober. I've 00:02:47.500 |
never been to EDC. But there's a great filter here, 00:02:50.160 |
which is you've got to drive six, seven hours into the desert. 00:02:53.740 |
And you have to survive. You have to bring everything with 00:02:57.100 |
you, whatever it is: a bathroom, water, food. And it is harsh. I mean, 00:03:01.240 |
it's 100 plus degrees in the day and it's 40 degrees at night. 00:03:06.920 |
It used to be that way. When I went, I went with 00:03:10.360 |
a bunch of guys who had the whole thing like wired in and 00:03:13.360 |
they had like these yurts and it was like a whole community and 00:03:16.940 |
they had like running water and a kitchen and it was basically 00:03:20.500 |
glamping. It was glamping. There are people there that it's hard to survive. 00:03:24.140 |
No, I mean, it's hard to do that. There are people who... 00:03:29.080 |
See, you have to go with people. You have to go with 00:03:31.480 |
people who know what they're doing and are highly organized. 00:03:33.980 |
So yes, it takes months to set up. So yes, they spent the 00:03:37.300 |
whole year, whatever, setting it up, and it was really elaborate. 00:03:39.580 |
And then there was an article like in the New York Times 00:03:42.260 |
basically saying that you know that they were ruining Burning 00:03:45.920 |
Man because they're making it too nice. That was part of the 00:03:49.700 |
problem. You remember this? That's when I went. I went to the 00:03:52.520 |
too-nice experience. They wrote that about you. Yeah, basically. 00:03:56.840 |
Thanks, Sacks. Thanks for ruining Burning Man. 00:04:05.540 |
We open sourced it to the fans and they've just gone crazy. 00:04:14.720 |
I don't know if you've been following the NFT story. We'll 00:04:19.220 |
queue that up as our next story. OpenSea, the marketplace, the 00:04:23.160 |
eBay of NFTs, if you will. The volume has dropped 00:04:26.000 |
99% since its May 1st peak of $406 million in a single day. On August 00:04:35.000 |
28th volume dropped to around $5 million. Down 99% according to 00:04:40.280 |
decentralized app tracker DappRadar. On a monthly basis, 00:04:44.600 |
OpenSea's volume has dropped 90% according to Dune Analytics. The 00:04:48.480 |
floor price of a Bored Ape has now dropped by 53%. If you 00:04:55.520 |
recall, OpenSea raised at a $13.3 billion valuation in December 2021, in a round led by 00:05:05.060 |
Coatue and Paradigm. To put that in perspective, that was nine 00:05:09.320 |
whole months ago. My, how the world has changed. Friedberg, 00:05:13.280 |
what's your take on NFTs and this whole boondoggle, and 00:05:17.120 |
how we'll look back on it? Yeah, I don't know. I mean, we've had a 00:05:19.880 |
lot of bubbles as a species. This is just another one. 00:05:27.440 |
Well, this is different. This is a different market. What do you 00:05:32.420 |
mean? We're in the 8th month of this story, which is, fill in the 00:05:37.100 |
blank. Yeah. The question I have here, Sax, is do 00:05:41.300 |
you think that this is the end of the category though? Do you 00:05:44.180 |
think that's a category ending? Do you think there was ever 00:05:46.360 |
anything interesting here? Because you took you heard a lot 00:05:49.040 |
of pitches like I did, for how NFTs were going to change 00:05:52.400 |
You know what, I'll say something, just to tie it 00:05:55.040 |
back to what we just talked about. At the end of the day, 00:06:01.460 |
you know, I think I've said this in the past, but like, what 00:06:05.300 |
differentiates humans from all other species on Earth is our 00:06:08.600 |
ability to communicate and tell stories. A story is 00:06:12.380 |
like a narrative about something that doesn't exist. And by 00:06:15.260 |
telling that narrative, you can create collective belief in 00:06:17.720 |
something. And then that collective belief drives 00:06:20.300 |
behavioral change and action in the world, right? I mean, 00:06:24.140 |
everything from government, to money, to art is all kind of driven; all 00:06:28.040 |
markets are driven by this notion of narrative and 00:06:31.400 |
collective belief that arises from that narrative. So when you 00:06:34.160 |
go to an art dealer, and you have a sit down with an art 00:06:36.800 |
dealer, you might appreciate the art. But then the narrative 00:06:39.860 |
begins. And the narrative is this artist did this and they 00:06:42.800 |
came from this and the artist that looks like this traded at 00:06:45.140 |
6.8 million. And this artist is only trading for 3.2 million. 00:06:48.800 |
And the whole narrative takes you away into Okay, I should pay 00:06:53.240 |
for this piece of art. And ultimately, it'll be worth more. 00:06:56.180 |
And that's the fundamental premise of markets: you're 00:06:59.180 |
buying into something. Not necessarily always, and this is 00:07:02.420 |
such a minority now, it's scary, but it used to be that you bought a 00:07:05.300 |
piece of a business or productive asset, you pull cash 00:07:08.180 |
out of it. Hopefully you get more cash out than you put in 00:07:10.820 |
when you buy it. Nowadays, markets allow you to trade. 00:07:14.060 |
Therefore, the majority of market based decisions are 00:07:16.760 |
driven by if I buy something for x, I'm going to be able to sell 00:07:20.180 |
it to someone else for y. And so the narrative is driven around, 00:07:22.340 |
if I'm paying x, someone else will pay y down the road, I'll 00:07:26.480 |
make money from that. And this has been repeated 1000 times 00:07:29.600 |
over, it's been repeated in the ICO tokens, it's been repeated in 00:07:34.580 |
the tulip bubble. It's been repeated in every art market and 00:07:37.520 |
subsidiary of every art market since the dawn of time, since 00:07:40.660 |
the dawn of markets. And I think the NFTs are really just one 00:07:44.340 |
more kind of example where this beautiful narrative is formed, 00:07:48.080 |
the digitization of art, but it is no different than any other 00:07:52.320 |
market. 00:08:14.340 |
And ultimately, you can maybe trade out of it early. But still, so much of it became about, 00:08:18.900 |
well, the stock is going up. Therefore, if I spend x, someone else will spend y, 00:08:22.860 |
and I'll make money on the stock with no underlying assertion of what's the productivity 00:08:27.060 |
of that business, what's the return on cash going to be based on the cash flows coming 00:08:31.020 |
out of that business over time, or any of the fundamentals around it. 00:08:33.960 |
And so so much of our discussion and distortion has really been driven by this narrative fueling. And I 00:08:39.420 |
think the NFTs are an example, but there are many others. And we're going to see many more, 00:08:43.860 |
as long as humans have the ability to communicate with each other. 00:08:48.540 |
I think Friedberg's mostly right. Like, I do think that there's this thing... 00:08:52.380 |
The Burning Man-Coachella example is the best way to describe this. A lot of 00:08:58.140 |
these things are the same. But when a few people approach something early, 00:09:02.940 |
they're too insecure to admit that it's the same as something else. And so they 00:09:08.940 |
spend a lot of time trying to tell you a narrative about why it's totally different. 00:09:12.420 |
The Buffett example, you know, would be the quote, you know, when somebody tells you that 00:09:17.820 |
this time is different, it's probably not that different. Or the other quote that's well worn 00:09:22.620 |
in history is like, you know, things don't necessarily repeat in history, but they rhyme. 00:09:27.480 |
All of this is trying to say that other than like, fundamental leaps of science, 00:09:32.940 |
there's not a lot of stuff that's new in the world. You know, we are repeating things over and over. 00:09:38.100 |
And one of the things we repeat is the social capital that you get from having certain choices, 00:09:44.040 |
and then getting other people to validate those choices, because you want to feel like you're, 00:09:48.840 |
you know, worthwhile. And this happened in NFTs. And I'm sure in the first phase of 00:09:56.100 |
different movements in art, that also happened. It's probably happened in a bunch of other markets 00:10:02.100 |
as well. So these things are more similar than they are different. Coachella and Burning Man, 00:10:07.620 |
the same. NFTs and part of the art market, the same. Everybody that runs to you with why it's 00:10:14.460 |
so different, I would just take with a grain of salt and say, it doesn't need to be different. Just enjoy 00:10:18.600 |
it because you think it's cool. All right, we can go either to this California fast food wages story, 00:10:23.100 |
or we can go to Goldman Sachs workers are quitting en masse because Goldman Sachs is saying you got 00:10:29.460 |
to come back to the office five days a week. Where do you want to go, boys? 00:10:31.440 |
I think the California thing is really interesting. Just the three things that California did in the 00:10:33.720 |
last week. I don't know if you guys saw, but number one 00:10:39.420 |
is they basically said that, you know, they're going to ban internal combustion engines, 00:10:45.780 |
I think by 2035. So you have to be battery EV. And that's new sales; you can still have used 00:10:52.200 |
sales, sorry. Yeah, new sales, which is the first in the country to do it. And just to go back for a 00:10:57.660 |
second, California is the largest auto market in the United States. And effectively, you know, 00:11:04.440 |
because California had certain emission standards that were tougher than the rest of the United 00:11:09.600 |
States. And, you know, the federal government tried to sue California and back and forth all 00:11:13.500 |
of this rigmarole. So California is always sort of led on climate. That was one thing. Then the 00:11:19.080 |
second thing is they said, we're going to create, you know, a state sponsored organization to set 00:11:25.320 |
the minimum wage rates for fast food workers. Pause what we think about that. And then the 00:11:30.660 |
third thing is that this congresswoman, I think, or this assembly person, 00:11:34.380 |
Buffy Wicks, passed a law through the Senate and through the house, that essentially holds social 00:11:41.880 |
media companies liable for the mental well being and the mental health of kids. And depending on 00:11:50.220 |
where you lie on all these issues, it's an interesting kind of lens: 00:11:55.860 |
California is really becoming extremely, extremely legislatively active, and basically imposing 00:12:03.300 |
their will on the free market. And so I think that's a really interesting thing to look at. 00:12:03.720 |
Yeah, and for people who don't know, the fast food issue was, they want to move to a more 00:12:11.940 |
European style where instead of unions, debating with individual corporations, McDonald's Burger 00:12:18.600 |
King, whoever, what they're going to pay their employees, or employees having that discussion one 00:12:24.060 |
on one, they want the state, some state authority to pick the amount per hour, fast food companies 00:12:32.340 |
pay. And then I guess 00:12:33.060 |
disintermediate the unions. It's kind of confusing. But interestingly, our friend 00:12:41.220 |
Lorena Gonzalez, if you remember, she's the one who told Elon to, you know, F off, and was a 00:12:48.600 |
former Democratic legislator. She introduced the bill when she was in the assembly. And so it's kind 00:12:55.380 |
of a weird approach. I'm not sure why the government... No, it's not. It's not 00:13:00.900 |
disintermediating the unions. Lorena 00:13:02.400 |
Gonzalez Fletcher, I guess now, she's the former Democratic legislator who drove Elon out of the 00:13:08.880 |
state. Remember when she said, you know, F Elon? And now she works... Message received? Yes, and he 00:13:14.280 |
left. Great, great move for the state. So she is now running one of the biggest unions. And what 00:13:20.280 |
she said is that this bill will move California closer to the labor model used in Europe, where 00:13:27.300 |
unions negotiate, the unions are still negotiating for wages and work conditions, 00:13:31.740 |
but in an entire sector, with some sort of government board, rather than company by company. 00:13:37.140 |
So in other words, it's too slow to unionize, you know, company by company, they would just want to 00:13:42.420 |
unionize entire sectors of the economy. Now, the thing that's really crazy about this minimum wage 00:13:48.780 |
proposal is that, like you said, Jason, it isn't just that they set a minimum wage, they created a 00:13:56.400 |
10 person panel. So there's a new government agency that's going to regulate fast food, 00:14:01.080 |
the fast food industry. You know, they're not going to stop at a minimum wage, it's going to be work conditions too. So now you're turning fast food restaurants into a highly regulated part of the economy. And the weird thing is, it's not even all fast food, it's basically chains that have more than 100 fast food restaurants nationally. So in other words, if you're a local operator of two McDonald's, you'd be subject to this 10-person board. But if you own 20 00:14:30.420 |
restaurants that aren't part of a national chain in California, then you're not. And so the weird thing is that some workers in the fast food industry could get this new minimum wage of $22, whereas other workers who work for, you know, a place that's not part of a major chain would get the statewide minimum wage of $15. So it's kind of unfair in that some parts of the restaurant industry are being regulated and others aren't. 00:14:55.920 |
Is there any justification here that you can think 00:14:59.760 |
of to not let the free market do its thing, Friedberg? 00:15:02.760 |
Well, if you represent the workers, you're saying, hey, pay them $22. And here's what that points out. It actually points out that there's probably a pretty big business flaw in a business that's so reliant on commodity labor. And that business in and of itself is not as valuable as you might have thought it was before. Think about it this way. There's a company called McDonald's. And then there's another company called 00:15:29.100 |
McDonald's Labor. And what happened before is McDonald's 00:15:34.100 |
employed people to do work at McDonald's. And maybe what happens in the future is all those 00:15:39.320 |
people are looking left and looking right. And they're like, you know what, McDonald's has 00:15:42.780 |
nothing without us. We are the business, we deserve the value. So in terms of how much of the value of 00:15:49.260 |
the business flows to the shareholders of McDonald's versus the employees that work at 00:15:53.600 |
McDonald's, the employees that work at McDonald's are saying, let's go do a startup. And our startup 00:15:58.500 |
is called Union. And our business that's called Union is now going to provide a bunch of services 00:16:03.360 |
to McDonald's. And those services, we're going to start to value capture what they're doing. 00:16:07.580 |
And it indicates that maybe there's something inherently disadvantaged in that model. And it 00:16:13.400 |
could be that the argument could be made that that business in the late 19th century, 00:16:16.820 |
early 20th century and beyond, 00:16:23.720 |
that you could build a good advantaged business because there was such an eager, hungry workforce, 00:16:29.120 |
people looking for work, people would come and work for a nickel an hour or whatever. 00:16:32.820 |
And so you as a business owner could make a lot of money doing that: you could sell a product for $1, 00:16:37.860 |
pay people 5 cents to make it for you. But nowadays, those people are looking left, 00:16:41.700 |
they're looking right, they're like, wait a second, we are the business, 00:16:44.180 |
or we're 90% of the value of this business. And so a couple things will happen. Number one is the 00:16:49.860 |
inherent business model of a company that's so dependent on commodity service 00:16:54.100 |
is going to be flawed and challenged in the 21st century. And fast food restaurants aren't going to 00:16:58.340 |
be as valuable. Businesses that rely on commodity service are not going to be as valuable. Number two, 00:17:03.460 |
businesses are going to automate. So new businesses will emerge that actually do the fast 00:17:07.700 |
food work, or do the car building work, or do the dock loading and unloading work that are automated, 00:17:13.220 |
and they'll have an inherent advantage in the economy. And they'll win. And I think number three 00:17:17.700 |
is that in the near term, consumers will suffer because prices will go up, because someone has to 00:17:22.020 |
pay for the incremental cost of running this unionized business. And that will ultimately 00:17:26.020 |
be the customer of that business. 00:17:27.380 |
I'll say the fourth point is, I am concerned about this sort of behavior in regulated spaces, 00:17:33.220 |
where the government has actually control over the market itself. So this is a good example of this 00:17:38.340 |
is, you know, we had our friend Ryan from Flexport on a few times, but you can't just start up a 00:17:45.220 |
shipping company and have a dock, the docks are run by the cities, and they're run by the state. 00:17:50.020 |
And so access to those docks, access to that market 00:17:53.380 |
is regulated by the government. So then what happens is the union, right, the labor company 00:17:58.660 |
can actually have regulatory capture through the government of a segment of the economy. 00:18:03.060 |
And that's where things start to get dangerous. So look, I'm all for unions, if they want to show 00:18:07.060 |
that a business is reliant on a commodity labor force and commodity service, and they all band 00:18:11.860 |
together and start a startup. I mean, that's what we do. We all start a company. Yeah, go and compete. 00:18:16.100 |
But the issue is then when the government creates regulatory capture over a segment of the economy, 00:18:20.740 |
and make it difficult for the free market to ultimately do its job 00:18:21.460 |
of either automating or creating a newly advantaged business model, or all those things that allow us to progress. 00:18:28.580 |
Automating is coming. A couple of us made a bet on a crazy idea of like automated, robotic coffee 00:18:36.180 |
called Cafe X. Not to talk our own book here, but the company really struggled during the pandemic. 00:18:40.820 |
But they have two units at SFO, Chamath; they did $73,000 in two units last month. 00:18:48.100 |
And I'm like, wow, the company figured it out. And there's another 00:18:51.380 |
company doing franchises. This is the sad thing that California doesn't realize, is like, 00:18:56.260 |
I think that the folks writing these laws have just an extremely poor understanding of 00:19:01.060 |
economics and capitalism. Because the first thing that it does, 00:19:05.460 |
effectively, and everybody can understand this is it caps the profitability of a company, right? 00:19:13.540 |
Because it effectively says, if you look at an industry that is over earning, then this council will 00:19:21.300 |
essentially see that as money that should be reappropriated to employees. Now, that idea is not necessarily a 00:19:26.900 |
bad thing. For example, you see that all the time in technology, right? If you look at the EBITDA 00:19:31.700 |
margins of like big tech, and how they've changed, they've actually eroded over time, as the companies 00:19:38.420 |
have had to generate more revenue, because employees demand more and more of those gains. 00:19:43.700 |
That's an implicit part of how the free market economy 00:19:46.100 |
works in technology. So you know, if you don't feel like you're getting paid well at, you know, Google, 00:19:51.220 |
you go to Meta, and you get paid more and vice versa. So there's a natural effect of people being 00:19:57.780 |
able to do that in certain markets. But when you impose a margin and essentially say you can only 00:20:02.420 |
make 10%, because the rest of these profits I'm giving to these folks, because I've imputed 00:20:06.580 |
their new wage to be something that's much greater than what they were paid before. 00:20:10.260 |
The unfortunate thing is what Friedberg said, which is that you will rebuild that 00:20:15.060 |
business without those people. Because it is the rational thing to do. And so unfortunately, 00:20:21.140 |
what you'll do is you'll take a bunch of people that should be in the economy, right? Think about 00:20:25.380 |
like, new immigrants or young people that are first getting on their feet, they'll take these 00:20:29.700 |
jobs. You know, I mean, I worked at Burger King when I was 14. That's how I got my first 00:20:35.380 |
job. And I'm telling you, those jobs won't be there to be had. And that has ripple effects as these folks get older or try 00:20:42.900 |
to get more ingrained in the economy. And this is what I wish California would understand is that 00:20:51.060 |
this is not free. There are these 00:20:54.500 |
obvious consequences that you're ignoring, by not really understanding how the economy works. 00:21:03.860 |
I'll just say, like, look... I mean, you know, the people that are 00:21:08.740 |
elected, are elected by the people to represent the people. They're not elected to govern over 00:21:14.900 |
the long term plan, necessarily for the state. And while you think that those two may be the same, 00:21:20.260 |
the near term incentive and the near term motivation of someone who's voting is I want 00:21:26.580 |
to get a benefit for myself sooner rather than later, not necessarily, I want to make an 00:21:31.300 |
investment for 30 or 40 years down the road. And if you're an individual that's struggling 00:21:36.020 |
in some way, you're going to prioritize the former over the latter. And you're going to 00:21:40.420 |
elect someone who's going to go put in place public policy that's going to benefit you. 00:21:44.820 |
So I don't necessarily raise my hand and say, I blame the people that have been elected. I 00:21:49.380 |
raise my hand and I say, Look, this is the natural evolution of what happens in a democracy where 00:21:54.180 |
there is a struggling class, and where there is some sort of perceived 00:21:59.940 |
disparity, that this is what's going to happen in a democracy under this condition is that you're 00:22:04.260 |
going to elect individuals who are going to go and govern and they're going to make decisions 00:22:07.860 |
that are going to benefit those individuals that got them elected over the short term. 00:22:11.540 |
And there may be some real long term damage that results. 00:22:14.740 |
Sacks, is part of this problem that people are now looking 00:22:18.820 |
to the government to solve their problems in life versus maybe 00:22:22.280 |
looking internally to solve their problems and negotiate 00:22:25.960 |
better salaries. And I think you have to give people a little bit 00:22:29.060 |
more credit. Nobody was asking for them to step in like this. 00:22:32.080 |
Look, this is as much for the politicians as it is for any of 00:22:35.180 |
the workers. What's happening right now, the National 00:22:38.120 |
Restaurant Association and all these restaurant lobbying groups 00:22:40.700 |
are flooding the state, the legislators, the governor with 00:23:44.360 |
lobbying money, because they want to get this bill modified, 00:22:47.600 |
tempered or thrown out. And so this is basically a racket like 00:22:52.180 |
look, as a public policy matter, it makes zero sense for the same 00:22:57.480 |
worker if they go work at say, Jiffy Lube or something outside 00:23:01.460 |
the restaurant industry, they get the California minimum wage 00:23:03.840 |
of $15. If they go to a mom and pop restaurant, they get the 00:23:07.100 |
minimum wage of $15. But if they go to McDonald's, they now get a 00:23:10.160 |
new minimum wage specifically for chain restaurants of $22. 00:23:13.980 |
That simply makes no sense. There's no public policy 00:23:17.160 |
rationale. But the real question you have to ask, and 00:23:19.620 |
Chamath is kind of getting at this, is: why shouldn't it be $50? 00:23:22.200 |
Why shouldn't it be $100? Well, the reason is because if you 00:23:25.600 |
raise the minimum wage too much, then these employers have a huge 00:23:30.340 |
incentive to replace that labor with automation. And so the 00:23:34.080 |
unintended consequence Chamath is talking about is 00:23:36.500 |
that these big chain restaurants are going to rely even more 00:23:40.560 |
heavily on automation now. And they're going to basically 00:23:43.600 |
employ less of this labor, where the price has been artificially 00:23:46.960 |
raised for that subsector of the economy from $15 to $22 for 00:23:51.580 |
reasons that no one can really explain. So again, this is not 00:23:54.400 |
for the benefit of workers, it's for the benefit of the 00:23:56.320 |
politicians. Look what's happening right now, is all 00:23:59.200 |
these articles talking about the lobbyists coming in, they're 00:24:01.960 |
going to Gavin Newsom, begging him to be their savior 00:24:05.080 |
right now. And this is the game that the California Democratic 00:24:08.360 |
Party plays. It's a one-party state, where 00:24:11.440 |
basically business loves Gavin Newsom, because he's their 00:24:13.220 |
protector. He's the protection end of the extortion racket. 00:24:16.520 |
Lorena Gonzalez gets this bill passed, which is basically 00:24:19.400 |
incredibly damaging to these chain restaurants, and they have 00:24:23.840 |
to go running to Gavin Newsom and beg him to either veto it 00:24:27.080 |
or modify it, temper the bill, so it's not as destructive to 00:24:30.080 |
their business. He's the protection end of the extortion 00:24:32.420 |
racket, but it's the same racket. This is a one-party 00:24:35.240 |
state. And what they're doing with chain restaurants they 00:24:37.700 |
would do with every sector of the economy, if they could, it's 00:24:42.840 |
And what is the purpose of the California government? Like, are 00:24:45.600 |
there things they could be focusing on other than this, 00:24:47.640 |
like maybe the free market settles this issue, and they 00:24:49.920 |
could extract benefits from the private sector from basically 00:24:54.000 |
the wealth producing part of the economy, and to line their 00:24:57.360 |
pockets and the pockets of their supporters. You know, the 00:25:00.080 |
typical government worker in California makes 53% more than 00:25:04.440 |
their private sector counterpart. And if you factor 00:25:07.200 |
in all the pensions that have been promised, it's 100% more. Okay, 00:25:12.460 |
there's a huge wealth transfer that's taking place, where 00:25:16.600 |
basically the legislators, the party, are transferring 00:25:20.920 |
wealth from the private sector to their supporters, their 00:25:23.800 |
backers. And Freeberg, you talk about democracy. Listen, I mean, 00:25:26.920 |
California is a one party state and ballot harvesting is legal 00:25:31.120 |
here. I'm not saying the votes are fake or anything like that. 00:25:32.980 |
But there's a giant party machinery and apparatus that 00:25:36.640 |
literally just goes door to door and collects the ballots from 00:25:42.080 |
their supporters on a door-by-door, household-by-household 00:25:45.380 |
level. It has gotten to the point now where the party 00:25:49.160 |
machine is so powerful, they can pretty much run anybody for any 00:25:53.780 |
post and get them elected. It's very hard for an outsider 00:25:57.560 |
candidate. It does feel like we're getting tired of it here. 00:26:00.860 |
No, but guys, that's exactly how it works. I mean, like, if you 00:26:03.680 |
look at Gavin Newsom's run to the governorship, he paid his 00:26:08.420 |
dues, he did the right jobs, he waited in line, he didn't, you 00:26:12.200 |
know, rock the boat, independent of any of the things he did 00:26:15.140 |
right or wrong. And he had some pretty big moral transgressions. 00:26:18.320 |
It didn't matter, because it is about paying your dues and 00:26:21.800 |
waiting in line, because there aren't any meaningful challengers. 00:26:26.120 |
In LA right now, there's a mayoral race going on. 00:26:32.480 |
There's a runoff between Rick Caruso and Karen Bass. Karen 00:26:36.920 |
Bass is basically a product of the political machine. She's 00:26:39.260 |
basically worked her way up in the Democratic Party forever, 00:26:41.240 |
sort of uninspiring candidate. She's just basically a puppet 00:26:45.080 |
of the machine. Caruso is actually a real estate 00:26:48.320 |
developer who's built some of the most landmark, you know, 00:26:52.100 |
spots in LA like the Grove, like the Pacific Palisades 00:26:55.940 |
development. He's created 00:26:59.720 |
incredible amenities for the LA area. He's also been on a bunch 00:27:03.740 |
of civic boards. You know, I had lunch with him, 00:27:06.740 |
at the Grove and people were coming up to him. He's like a 00:27:09.020 |
celebrity. They were taking photos with him. He's the best 00:27:11.660 |
possible candidate that you could have in a place like LA. 00:27:15.320 |
He actually understands business. I think he would 00:27:18.020 |
restore law and order. I mean, all this kind of stuff, right? 00:27:21.080 |
He's spoken out about the homeless problem being out of 00:27:23.360 |
control. And on election night, he had the most votes. But five 00:27:27.920 |
days later, when they counted basically all the ballots 00:27:30.800 |
collected through ballot harvesting, Karen Bass was up 00:27:33.620 |
by like five points. That is the power of the 00:27:36.560 |
Democratic Party machine and ballot harvesting. And I still hope that 00:27:39.380 |
he wins. But the party machine is so powerful, they can 00:27:43.040 |
basically get almost anyone elected in the state. 00:27:46.160 |
What's that? Just to play devil's advocate here, the 00:27:51.080 |
ballot harvesting can work both ways. Like you could go collect 00:27:53.540 |
Republican votes? No, but you have to have the party 00:27:56.180 |
infrastructure to literally go door to door, which the Republicans haven't built. 00:28:00.680 |
Think about it this way. Okay. So, you know, like your phone, 00:28:06.380 |
Think about the party as the platform and the candidate as 00:28:10.280 |
the app. Okay. The platform can drive a ton of traffic to 00:28:13.700 |
whatever app they want. A guy like Caruso doesn't have the 00:28:16.880 |
party behind him. So he has to build his own operating system. 00:28:20.720 |
Right. Exactly. So he has to build his own platform. Then he 00:28:24.320 |
has to build his own candidacy. So when you think about the 00:28:27.020 |
amount of money that's required to do something like that about 00:28:29.300 |
how hard it is to do it. It's very, very hard. So he 00:28:33.080 |
spent something like $100 million of his own money. Hopefully he 00:28:36.200 |
wins. But if he doesn't, all the infrastructure, all the 00:28:39.680 |
platform that he created just goes away. And then the next 00:28:42.920 |
candidate has to start from scratch. This is why outsiders 00:28:47.180 |
Why hasn't the GOP invested in California? They just don't 00:28:49.340 |
think they have any shot because it seems like the Democrats are 00:28:51.740 |
investing heavily in trying to flip, you know, 00:28:54.140 |
look, there's a two-to-one party 00:28:57.200 |
affiliation in California towards Democrats. So 00:29:00.500 |
look, I mean, the Republican Party here is a mess. And that's 00:29:03.020 |
part of the problem is the Republican Party in California 00:29:06.020 |
needs to move to the center. It's a plus 30 blue state, 00:29:09.320 |
unless they're willing to do that, they're not going to make 00:29:11.420 |
progress. But that could take, you know, generations to fix. 00:29:14.960 |
And with Roe v. Wade, and that kind of stuff, it's gonna be 00:29:17.060 |
really hard. Look, there are pro-choice Republicans in 00:29:20.660 |
California. But all Newsom has to do is point to what's 00:29:24.200 |
happening in Texas. And, you know, all of a sudden, people 00:29:28.100 |
kind of revert to their tribal political affiliations. I'll 00:29:32.000 |
I'll make one more simple point. The average voter in 00:29:35.840 |
California probably knows more people that work at McDonald's than are 00:29:38.840 |
shareholders of McDonald's. And I think any time that there's 00:29:42.560 |
policy that gets voted on or suggested, or ultimately gets 00:29:46.700 |
passed, there's a side that benefits and there's a side that 00:29:49.880 |
doesn't. And if you know more people that benefit on one side 00:29:53.540 |
than are hurt, you're probably going to be supportive of that 00:29:56.120 |
policy. And I think that's probably a pretty simple rubric 00:29:58.880 |
for thinking about how these things play out over time. 00:30:02.300 |
If that's true nationally, why is it only true for California? 00:30:05.660 |
I mean, like in Florida, people know more people who work at 00:30:09.620 |
McDonald's than are shareholders of McDonald's, and they're not 00:30:12.380 |
engaged in a systematic takeover of the private sector 00:30:14.900 |
because, philosophically, they believe in independence and 00:30:17.900 |
businesses not being interfered with by the government, right? That's 00:30:22.580 |
Also, the sad reality is that within a few years, unfortunately, 00:30:26.900 |
you'll know a lot fewer people that work at McDonald's because 00:30:29.900 |
the number of jobs for humans will be much lower than the 00:30:35.480 |
And so the number of jobs for humans will be dramatically 00:30:38.720 |
They got rid of the cashiers that's gone already, 00:30:40.880 |
or what people will be more acutely aware of in a number of 00:30:43.220 |
years is how expensive McDonald's has gotten. Because 00:30:45.740 |
if the prices... No, that won't be, because, like, 00:30:48.140 |
again, McDonald's has a very simple... No, not even the 00:30:51.140 |
market. McDonald's is not stupid. They understand how to 00:30:53.300 |
use Excel. And so they'll think, well, I can amortize a robot 00:30:56.720 |
versus actually jacking up prices, because what McDonald's 00:31:01.880 |
understands is the sensitivity of pricing to demand for their 00:31:05.300 |
customers. Right? I mean, they are the most intelligent about 00:31:08.240 |
how they price anything at the lower end, where 00:31:11.180 |
you're capturing, you know, nickels and dimes, they are the 00:31:14.240 |
most sophisticated at understanding supply, demand, 00:31:17.780 |
and the curves. And so to think that they're not going to 00:31:21.080 |
just invest heavily now at the corporate level, the next 00:31:23.780 |
franchisee of McDonald's will still pay a million dollars for 00:31:26.960 |
the franchise fee, but will be given a bevy of robots 00:31:29.600 |
that they rent from McDonald's, and they'll have to hire half 00:31:32.300 |
or a third fewer people. That's the real shame of this, which is 00:31:35.120 |
the number of people that should actually be gainfully employed in 00:31:38.720 |
the economy will then shrink again. But in this case, it was 00:31:42.320 |
because legislators just completely don't understand how 00:31:48.620 |
And people are not clued into exactly how well narrow AI is doing. 00:31:55.040 |
By the way, I'll say, like, generally, what's really moving quickly: 00:31:57.800 |
One general way to think about this is technology is introduced 00:32:01.760 |
and the technology creator captures roughly a portion 00:32:04.940 |
of the value increment, because of the delivery of that 00:32:07.940 |
technology, the productivity increment. So let's say that it 00:32:11.000 |
costs $12.50 an hour to employ McDonald's labor today. And 00:32:15.920 |
let's say that you can build technology that the amortized 00:32:18.260 |
robotic cost is $10 an hour, there will be a technology 00:32:21.920 |
company that will make $2.50 an hour off of that. So net net, 00:32:25.280 |
what ends up happening is productivity has to be gained, 00:32:28.220 |
and prices will reduce. And so you know, there are moments like 00:32:32.000 |
this where there could be an acceleration of technology, and 00:32:34.760 |
also an opportunity for new industries to emerge. And when 00:32:38.180 |
that happens, new jobs emerge too. So it's, you know, maybe 00:32:42.200 |
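The amortization argument above can be sketched as a quick back-of-the-envelope calculation. The $22 wage is the bill's chain-restaurant minimum discussed earlier and the $10-an-hour amortized robot echoes Freeberg's hypothetical, but the $250,000 machine cost is an illustrative assumption for this sketch, not a figure from the episode:

```python
# Back-of-the-envelope sketch of the robot-amortization argument.
# The $250,000 upfront cost is an assumed, illustrative number.

def hourly_savings(labor_cost_per_hr: float, robot_cost_per_hr: float) -> float:
    """Value increment per hour if a robot replaces one worker-hour."""
    return labor_cost_per_hr - robot_cost_per_hr

def breakeven_hours(robot_upfront_cost: float, savings_per_hr: float) -> float:
    """Hours of operation needed for the robot to pay for itself."""
    return robot_upfront_cost / savings_per_hr

# Labor at the new $22/hr chain-restaurant minimum vs. a $10/hr amortized robot,
# for a hypothetical $250,000 robot running 24 hours a day, 365 days a year.
savings = hourly_savings(22.0, 10.0)                      # dollars saved per hour
years = breakeven_hours(250_000, savings) / (24 * 365)    # years to break even
print(round(savings, 2), round(years, 2))
```

The point of the sketch is the sensitivity: every dollar the mandated wage rises widens `savings` and shortens the payback period, which is exactly the incentive to automate that the besties are describing.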
look, I mean, at the end of the day, it seems Luddite ish. But, 00:32:45.260 |
you know, there's going to be a market opportunity that 00:32:48.200 |
I think we need to, like, just step back and ask what is 00:32:52.100 |
the type of economy that California is creating here with 00:32:55.040 |
this fast food law with these other laws we're talking about. 00:32:57.440 |
This is a state in which you have a disappearing middle 00:33:01.340 |
class; the middle class has been moving out of the state 00:33:04.580 |
by the hundreds of thousands. In fact, last year, we had, for the 00:33:07.520 |
first time, net migration out of the state. This is basically 00:33:11.360 |
become a state where the only middle class are basically the 00:33:14.300 |
government workers who like we talked about, are making twice 00:33:17.480 |
as much as their private sector counterparts. It's a state where 00:33:20.240 |
you have the very rich and very poor, and the only middle class 00:33:22.940 |
are government workers. That is what we're creating. The rich in 00:33:25.520 |
the state have done really well, over the last few decades. Why? 00:33:29.300 |
Because of globalization. So you take the two big industries in 00:33:34.400 |
California, tech and Hollywood entertainment: they have benefited enormously through 00:33:38.060 |
globalization, because now you can sell software, or you can 00:33:41.060 |
sell movies or music. Exactly. So now these industries 00:33:44.720 |
have become, you know, massive global exports. And so if you're 00:33:49.460 |
in those industries in California, you've done really 00:33:51.860 |
well, because now your market is the whole world. So the super 00:33:57.740 |
rich have done really well; that's funded the state and 00:34:00.440 |
allowed the state, it's almost like a resource curse, to have all of these 00:34:04.220 |
bad policies that really hurt small business owners, they've 00:34:07.880 |
been moving out of the state. And so again, you've got the very 00:34:10.320 |
rich and very poor. This is not an economic model for the entire 00:34:13.380 |
nation. This is what you know, Gavin Newsom says he wants to 00:34:16.220 |
take to all of America. This is not a model for all of America. 00:34:20.060 |
In order for democracy to function, we need a thriving 00:34:22.720 |
middle class. And this is not a recipe for delivering a thriving 00:34:28.400 |
And on that, I just want to show you guys a quick video, which I'll play. 00:34:33.200 |
So we had invested in this company that got sold, I'm not 00:34:36.880 |
going to say the name of it. But if you watch this quick robot video, 00:34:40.700 |
this is a computer vision company out of Boston. And like, 00:34:44.700 |
they're doing high speed picking of tomatoes and berries. Like 00:34:49.800 |
this is a two-year-old video from when we invested; it wound up 00:34:52.280 |
getting bought by another company, and they're doing 00:34:54.020 |
vertical farming. But there's companies doing this now 00:34:58.580 |
for French fries. It's over. Like, we are within, you know, 00:35:03.020 |
years of a lot of these jobs, and we're talking tens of millions 00:35:06.740 |
of manual labor jobs being gone. And we're going to look at this 00:35:10.260 |
moment in time, where we try to squeeze an extra 10 or 20% out 00:35:14.120 |
of these, you know, employers. And then you're gonna see these 00:35:19.540 |
employers say, you know what, a 24-hour-a-day robot? Yeah, it's a 00:35:23.420 |
little bit of upfront cost, I'll put it on a lease, and they're 00:35:25.940 |
just going to move to these robots. It's really very close 00:35:28.780 |
to being game over for manual labor. And you can see it also 00:35:32.360 |
with Cybertruck. And you know, what Elon's doing with AI and 00:35:36.200 |
these things are getting so close. And it's been, I don't 00:35:39.620 |
know, when did you invest in your first robotics or AI 00:35:43.040 |
company, Sacks or Chamath? Do you remember the first robotic or AI 00:35:47.000 |
company you invested in? And how long ago was 00:35:49.080 |
2014. It's a company called Relativity Space, which is 3D 00:35:56.240 |
printing rockets and engines. And basically, they figured out 00:36:02.180 |
how to actually fix the morphology of all the materials 00:36:05.360 |
that they use to print better and cheaper and more useful 00:36:08.300 |
rocketry. That's an example. But then that led me down the garden 00:36:12.560 |
path, I was just going to say something more generic, instead 00:36:15.740 |
of talking our book, which is, I think people misunderstand that 00:36:21.860 |
Moore's law has actually not ended. And this is something 00:36:32.000 |
a beautiful human being I worked with characterized to me. 00:36:36.500 |
He was the CTO of Facebook when I worked there. So this is, you 00:36:39.440 |
know, we've known each other for 15, 16 years. And now he's the CEO 00:36:42.980 |
and founder of Quora. But you know, the way that he described 00:36:46.780 |
it to me, which is so true, the minute he said it, I was like, 00:36:49.240 |
my gosh, it's like Moore's law never ended, it just shifted to 00:36:51.860 |
GPUs. You know, because the inherent lack of parallelization 00:36:55.640 |
that CPUs have, you solve with GPUs. And so that's why the 00:37:01.820 |
surface area of compute innovation has actually shifted, 00:37:05.360 |
Jason, to what you're saying, which is, you know, all of these 00:37:08.080 |
new kinds of machine learned models, because you can just now 00:37:11.720 |
brute force and create such a tonnage of compute capabilities 00:37:15.380 |
and resources to solve these problems that weren't possible 00:37:19.160 |
before. So, you know, you see this thing, you know, fun things 00:37:22.820 |
like Wally, or, you know, DALL-E, sorry, and more sophisticated 00:37:27.320 |
things like GPT-3. But he's right, which is that, you know, 00:37:31.640 |
we are at a point now where all kinds of expert systems, which is 00:37:35.360 |
sort of like v1, simple forms of AI, are going to be completely 00:37:39.160 |
next level. Well, you think about the data sets adding to 00:37:42.160 |
that. Exactly. And so when governments 00:37:44.680 |
don't understand this, and they try to, you know, step in, 00:37:48.020 |
they're going to be shocked at how much faster they're pulling 00:37:50.900 |
forward the actual future they're trying to prevent. 00:37:53.420 |
Yeah, the data sets are the other issue, Freeberg. You look 00:37:57.260 |
at, you know, the data sets that are available to train on. So 00:38:01.460 |
you have the GPUs still escalating massively, the amount 00:38:05.540 |
of compute power they can deliver; you have cloud computing at the same time, 00:38:07.520 |
and then you have these data sets. One of the most 00:38:09.620 |
interesting things about, you know, DALL-E, and a lot of these 00:38:12.180 |
is the corpus of data they're able to absorb. And a second 00:38:15.960 |
order issue that is starting to come up is who owns the 00:38:19.040 |
derivative work in copyright. So if you train your data set on a 00:38:24.220 |
bunch of images, and those images are owned by Disney, or 00:38:27.020 |
whoever, you know, photographer Getty, and then the AI makes you 00:38:31.280 |
so here's another, here's another example of this. Here's 00:38:34.820 |
another, here's another example of this in a law that just 00:38:37.100 |
passed. So in the IRA, the Inflation Reduction Act, you 00:38:40.580 |
know, we granted CMS and Medicare the ability to 00:38:44.180 |
negotiate drug prices. And we actually also capped the total 00:38:49.640 |
prescription costs that any American can pay at $2,000. Now, 00:38:54.260 |
it turns out that overwhelmingly, most Americans 00:38:57.360 |
will never spend $2,000 a year on prescription medications. 00:39:01.100 |
But there will be a few, and they bump up against drugs that 00:39:04.580 |
cost millions of dollars. So there was a, you know, there's a 00:39:06.980 |
drug for, you know, SMA. That's like, you know, 2.4 million, 00:39:12.200 |
and one for, you know, beta thalassemia that's like 2.8 million. 00:39:16.820 |
And we will all bear the cost of that. Now, if CMS 00:39:22.340 |
and Medicare, go and really crunch those costs, what is pharma 00:39:27.320 |
going to do? Well, pharma is going to do what 00:39:30.920 |
is the obvious thing, which is like, if you're going to cap 00:39:34.220 |
what I can make, I have to completely change my R&D model. 00:39:37.880 |
And they're just going to use AlphaFold, you know, and what 00:39:41.300 |
it's really going to do is potentially put a bunch of 00:39:43.820 |
research scientists out of work. And those are folks that should 00:39:47.960 |
probably in the, you know, for our general future be gainfully 00:39:51.500 |
employed. But the nature of their job is going to change 00:39:54.000 |
completely, because the economic capitalist incentive of big 00:39:57.380 |
pharma is going to move towards technology, which is, you know, 00:40:00.740 |
AlphaFold, and their protein library is the equivalent 00:40:04.340 |
of AI and automation for McDonald's. So this is where 00:40:08.480 |
again, governments have to be super careful that they 00:40:10.760 |
understand what's actually happening, because they're 00:40:12.320 |
creating incentives, I think that they don't completely 00:40:14.420 |
understand. All right, we can go to red meat for Sacks: 00:40:18.740 |
Zuckerberg's admission of FBI meddling, Truth Social. 00:40:23.180 |
Well, after you guys disputed me and like, how dare you imply 00:40:27.320 |
that J. Edgar Hoover and Jim Comey's FBI is politicized. 00:40:34.580 |
investigation. You didn't, but some of you were saying I can't 00:40:37.760 |
believe, Sacks, that you're part of the 35% of Americans who 00:40:40.940 |
don't believe fully in the rectitude and integrity of the 00:40:44.600 |
FBI. And now comes this bombshell. I mean, 00:40:50.540 |
you gotta admit this was a bombshell. What was the 00:40:52.700 |
bombshell? Okay, so Joe Rogan had Zuck on. And he asked 00:40:58.160 |
him about this very specific thing. Remember, the 00:41:00.380 |
New York Post story on Hunter Biden's laptop in like the week 00:41:04.040 |
or two before the election was censored, right? And it was a 00:41:07.160 |
very controversial thing, because the FBI was dealing with 00:41:09.980 |
known Russian interference and hacks and all this stuff. Part 00:41:15.560 |
of which Trump was like asking people to hack other candidates. 00:41:18.200 |
Okay, you're not even setting this up correctly. Well, okay, 00:41:20.480 |
I'm gonna keep going. And then so this is what Zuckerberg said, 00:41:23.360 |
Hey, look, let's let Sacks moderate this segment. Well, I 00:41:26.480 |
mean, I'm just gonna read the Federalist, which is, you know, 00:41:28.520 |
super right wing, but he says, I'll just read this. I'm gonna 00:41:30.200 |
just read you what Zuck said, because that says it all. He 00:41:33.200 |
says, Hey, look, if the FBI, which I still view as a 00:41:36.140 |
legitimate institution in this country, I'm quoting from the 00:41:38.780 |
Federalist, which is quoting him from Joe Rogan. It's a very 00:41:41.520 |
professional law enforcement. If they come to us and tell us we 00:41:43.880 |
need to be on guard about something I want to take it 00:41:46.620 |
seriously. So when the New York Post broke the Hunter Biden 00:41:49.760 |
laptop story on October 14 2020, Facebook treated the story as 00:41:53.300 |
potentially misinformation, important misinformation, for five 00:41:56.640 |
to seven days while the tech giant's team could determine whether it was 00:42:00.020 |
false or not. During that time, this is the Federalist speaking, 00:42:03.140 |
Facebook decreased its distribution 00:42:06.640 |
of the story by making the story rank lower in the news feed. And 00:42:11.660 |
this is his quote, Zuckerberg: you could still share it, you 00:42:14.660 |
could still consume it, Zuckerberg explained, but, quote, 00:42:17.840 |
fewer people saw it than would have otherwise. And while he 00:42:23.760 |
would not quantify the impact, the Facebook founder said the 00:42:26.420 |
decreased distribution was, quote unquote, meaningful. In a follow- 00:42:29.840 |
up, Rogan asked if the FBI had specifically said, quote, to be 00:42:34.700 |
on guard about that story, meaning the laptop story. After 00:42:37.700 |
originally responding no, Zuckerberg corrected himself: I 00:42:40.340 |
don't remember if it was that specifically, but it was 00:42:42.440 |
basically the pattern. So basically, I guess you would 00:42:45.020 |
agree, Sacks, Zuck didn't know exactly what to do here with 00:42:47.960 |
this information. And if it was real or not, which, you know, 00:42:51.740 |
it turned out to be very real, and it probably could have swayed 00:42:55.080 |
the election. I mean, I do think if people thought there was a 00:42:59.660 |
between Hunter Biden, maybe it could have, I don't know if we 00:43:04.160 |
want people being hacked to do that. So what's your take on it, 00:43:06.540 |
Sacks? You know, just hacking in general, and this specific thing, 00:43:09.240 |
what should the FBI do? If these kind of hacks are getting 00:43:12.560 |
released of people's families? Okay, so yeah, here's a 00:43:16.200 |
complicated issue, right? There's many vectors. Yeah. 00:43:18.320 |
What Zuck basically says is the FBI came to them and encouraged them 00:43:22.280 |
to suppress the story or to suppress a story that they 00:43:25.940 |
described that would be just like this. Okay. So 00:43:29.480 |
now look, a lot of conservatives are dragging Zuckerberg and 00:43:32.300 |
Facebook for doing for doing the FBI's bidding. But I think a 00:43:35.540 |
lot of people, I think most people in Zuckerberg's position 00:43:38.480 |
would have believed the FBI when the FBI comes to you and says, 00:43:41.300 |
you're about to be targeted by Russian disinformation, you need 00:43:44.240 |
to do something about it. You would have listened to the FBI, 00:43:47.060 |
he believed the FBI. My point is, so I don't blame Facebook 00:43:51.440 |
too much for that. I think it's understandable that he would 00:43:53.720 |
have believed them. The issue here is the politicization of 00:43:56.600 |
the FBI. Look, let's back up what was happening. So, you know, 00:43:59.300 |
so the New York Post gets this story about the leak contents of 00:44:03.260 |
Hunter Biden's hard drive. In response to that, you had 50 00:44:07.640 |
former security state officials, many of whom were Democratic 00:44:12.800 |
partisans, like Clapper, like Brennan, they signed a letter 00:44:17.000 |
saying that this story has all the hallmarks of Russian 00:44:20.900 |
disinformation. Now, the truth is they had not inspected the 00:44:24.440 |
hard drive. They simply said this has the hallmarks of it. And 00:44:29.120 |
on that basis, the social networks and so forth censored 00:44:32.780 |
the story. Okay. Now it turns out that the FBI, by the way, 00:44:36.860 |
the FBI had the hard drive. They had the hard drive for over a 00:44:40.460 |
They had an image of the hard drive or the actual hard drive? 00:44:44.060 |
The leak of the story was likely prompted because the FBI didn't 00:44:49.100 |
appear to be doing anything with the hard drive. So, somebody 00:44:51.800 |
leaked the story to, you know, to Rudy Giuliani, and then he 00:44:56.600 |
leaked it and so forth to the New York Post. But the point is, 00:44:58.940 |
that the FBI had the hard drive. So, they knew it was authentic. 00:45:02.060 |
Okay. So, they knew that this hallmarks of Russian 00:45:05.360 |
disinformation claim was not the truth. Because again, they had 00:45:08.480 |
the authenticated hard drive in their possession. And yet, they 00:45:11.720 |
still went to Zuckerberg and basically played into this 00:45:14.780 |
narrative, this phony narrative that, you know, Democratic 00:45:18.800 |
partisans like Clapper and like Brennan had created. 00:45:22.160 |
Was there anything meaningful, by the way, about what was on 00:45:25.040 |
the laptop in your mind? Like, do you think it's relevant? Do 00:45:28.760 |
Are there any laptops being hacked like this? Who are in 00:45:30.560 |
people's political families? I don't think the laptop was 00:45:34.640 |
hacked. I mean, there's a weird convoluted story about how the 00:45:40.700 |
Should people's personal pictures, you know, and stuff 00:45:43.220 |
like that be released? I never liked the part where they were 00:45:46.520 |
showing Hunter Biden's, you know, drug use and the other personal 00:45:50.840 |
stuff. I thought that was scurrilous. And I didn't like it. 00:45:54.560 |
And I didn't think that was particularly germane. So, I think, and I think it was an important part of the story. 00:45:58.580 |
I think it was an invasion of his privacy. However, there were materials on there related 00:46:02.720 |
to his business dealings in Ukraine. And I don't see how you can say those weren't relevant. 00:46:07.160 |
Oh, I think they're highly relevant. He's being investigated for them too now. I mean, 00:46:10.400 |
a lot of this has to do with timing too. You know, it's like, how do you deal with something 00:46:14.660 |
like this seven days before an election when you know the Russians are hacking it, if it has been 00:46:18.800 |
or not? It's a really difficult situation for everybody, I think. 00:46:21.200 |
So, look, my point is this, that I can agree with you about the personal 00:46:25.460 |
stuff about Hunter Biden shouldn't have come out. But with respect 00:46:28.400 |
to the Ukraine stuff, that was legitimate. It was fair game. And the most importantly, 00:46:33.440 |
I don't know whether it would have swung the election or not. Probably not. Probably by 00:46:36.680 |
itself it's not. But the real point here is that the FBI went to Facebook to censor a story that 00:46:42.860 |
they must have known was true because they had the laptop in their possession. They should not 00:46:48.380 |
be intervening in elections that way. That is the bombshell here. That's unbelievable. 00:46:52.520 |
I think they didn't know the provenance of this, but I guess we'll find out. 00:46:58.220 |
They had the hard drive that's in their possession. 00:47:06.920 |
The audience wants to know, Sacks, the audience really wants to know, 00:47:09.500 |
you were going to reserve judgment on the Mar-a-Lago search. Where do you and or Republicans 00:47:16.280 |
come out on all this now when you see, fuck, he did in fact have a ton of really sensitive stuff? 00:47:21.860 |
Well, define ton. Define ton of really sensitive stuff. 00:47:25.940 |
Yeah, I mean, listen, and he wouldn't give it back. 00:47:28.040 |
He had 300 pages of documents marked classified. If you were to go to the 30,000 boxes of documents 00:47:35.600 |
that Obama took, are you telling me you really couldn't find 300 pages that have classified 00:47:40.340 |
Well, the fact that he wouldn't give it back. 00:47:41.420 |
I never disputed, by the way. I never disputed. Hold on a second. I never disputed that they would 00:47:46.460 |
find documents with classified markings in Trump's basement. I still think that the approach was 00:47:51.080 |
heavy-handed, and we don't know enough because we don't know what those documents are. Listen, 00:47:55.700 |
every document they stick in front of the president, or not every, but many of them are 00:47:59.960 |
marked classified. Okay, so let me ask you this. Aside from these kind of details, what is Trump's 00:48:05.300 |
deal that he just wouldn't give them back? What do you think his motivation was? I mean, I have my own 00:48:09.200 |
theory, which he likes memorabilia. He's always liked to show off memorabilia. 00:48:12.260 |
I think that's it. I think you got it. Honestly, if I had to guess what this whole thing was about, 00:48:16.160 |
it's probably that the archivist at the National Archives wants the original copy of those letters 00:48:21.920 |
with Little Rocket Man, and he doesn't want to give them up. 00:48:24.560 |
Right. And the assortment. And I'm not saying that's not true. I'm not saying that's not true. 00:48:25.520 |
I literally think that's what this is about. I think it's about something as silly as that. 00:48:29.060 |
Memorable. Memorabilia. I think it's about memorabilia. I do not think these documents 00:48:32.900 |
pose a threat to national security. Why wouldn't Trump just give it all back? It's so dumb. 00:48:36.200 |
But why would the FBI feel the need to do a raid? Well, because there's so much classified stuff, 00:48:39.200 |
I guess. By the way, remember when we first discussed this? Hold on a second. When we 00:48:41.960 |
first discussed this issue, we were told it was nuclear. It was nuclear secrets. 00:48:46.460 |
Could be, yeah. No, they've backed off that completely. It was not in the affidavit. It's 00:48:51.500 |
not about nuclear. That was something they leaked to get them through a tough press cycle. 00:48:55.340 |
So, but look, my point about all this stuff, just to wrap it up so people are clear, 00:48:58.940 |
was never to defend Trump per se. My real concern is the politicization of the FBI. Our law 00:49:04.880 |
enforcement agency should not be weaponized by either party to pursue their political agenda. 00:49:09.200 |
But you believe they are at this point? We just saw with the Zuckerberg thing, 00:49:12.860 |
what business did the FBI have going to Zuckerberg to get them to censor a New York Post story that 00:49:19.160 |
turned out to be completely true? Hold on a second. That is election interference. 00:49:25.160 |
That is inexcusable. It's a tough job if they don't know if it's been stolen or not. Yeah, it's a tough 00:49:29.240 |
job. By the way, the government of the United States has no business censoring a free press. 00:49:35.000 |
That is a violation of the First Amendment. When the government instructs a private company to 00:49:41.600 |
censor a press story that the government itself does not have the authority to censor, 00:49:46.640 |
that is a violation of the First Amendment. We thought, remember when Jack Dorsey said 00:49:51.260 |
that he regretted the censorship that Twitter did? And he came out 00:49:54.980 |
with that apology? We thought it was just Twitter's decision. Now we find out from 00:49:58.340 |
Zuck that he was leaned on by the FBI. And I think that these big tech companies like Twitter, 00:50:04.400 |
like Facebook, they're going to have to have a new policy, which is when the government says, 00:50:08.540 |
we want you to censor something, their response needs to be, show us a court order. They cannot 00:50:15.140 |
just censor now based on the say-so of some operative from the government. 00:50:19.400 |
I would agree with you on that. I think it was a bad decision, obviously. The one big tech company 00:50:24.800 |
that really nailed this, I think, was Apple. Very early on, they drew a really hard line in the sand 00:50:31.100 |
with the San Bernardino shooter because they said, if we allow the FBI to get into this person's phone, 00:50:36.980 |
even though what he did was completely heinous, we are going to create a backdoor that will become 00:50:43.580 |
an exploit for everybody. And we are drawing the line on privacy and security. Now, different set 00:50:48.920 |
of issues, but it just goes to show you, there have been a few examples where some of these 00:50:53.600 |
companies have drawn a hard line. 00:50:54.620 |
And then in that example, and I think, Jason, you mentioned this, they went to 00:50:58.100 |
Israel and had that company hack the phone. San Bernardino. 00:51:01.280 |
Okay, whatever. But the point is, they were able to stand up to the pressure. So I think there 00:51:07.340 |
are some examples, Sachs, where you can say no. I mean, the idea that Twitter would block a New 00:51:13.760 |
York Post URL, it's like, ah, such a jump ball. If it was the sex material, I could understand them 00:51:19.700 |
saying, it's hacked material. That's against our terms of service, and we agree on that, Sachs. But 00:51:24.440 |
in this case, like, I don't know. Let's go to something more interesting. Well, heatwaves, I think, 00:51:28.100 |
and the droughts. I know this is interesting to Sachs, but I think it's interesting to everybody 00:51:30.920 |
else in the audience. Heatwaves and droughts are just- Newsflash: it's hot in summer. 00:51:35.720 |
Yeah, I think this might have to do with global warming and be a little unique. 00:51:40.640 |
I'm kidding. I'm kidding. I know there's more to it. 00:51:42.500 |
You're kidding. Before all the ESG people get into it with Sachs and say you're a global warming denier- 00:51:47.000 |
I love free- I do think the death of Gorbachev is a big story. 00:51:52.280 |
Okay, we'll get to that after this. Oh, yeah, I think that's a great- 00:51:55.640 |
Let's give Freeberg his science corner, and then we'll come back around. I gave you your red meat. 00:51:59.240 |
Sachs, what's your point of view on climate change, the impact it will have on the planet, 00:52:08.960 |
Listen, I'm not an expert in that area. I'm not going to pretend to be. I do think that 00:52:16.880 |
we can't save the planet by destroying the economy. And it seems to me that too many of 00:52:24.080 |
the Save the Planet people, like, want to take reckless, extreme actions that would wreck our 00:52:29.360 |
economy. You just saw, actually, Elon just gave a talk and made news this past week from Norway, 00:52:35.960 |
where he said that we still need oil and gas. He's the leading innovator in basically moving 00:52:41.780 |
to solar and renewables. I don't disagree with you, but- 00:52:43.880 |
And he said, listen, unfortunately, we got to rely on oil and gas because it's too important 00:52:48.380 |
for civilization. If we cut this stuff off too quickly, we end civilization. 00:52:53.900 |
I mean, I think this is a long-term problem, and I think we need to have long-term solutions. 00:52:58.940 |
You believe in global warming. Just full stop. The planet is warming. You agree with the science. 00:53:04.760 |
It's not just about warming. Just to use a different term, that there are more frequent 00:53:09.980 |
extreme events that can severely hurt people, hurt the economy, hurt the food supply, hurt the energy 00:53:17.000 |
supply, all the things that, you know, I think we're, like, all the way on the bottom of Maslow's 00:53:22.340 |
hierarchy. Like, these are the things that are going to be the most important. 00:55:23.720 |
These are the things that are most critical, and 00:55:25.340 |
they're starting to get disrupted in a pretty severe way. You know, I think that's the big 00:53:32.180 |
question. So, Sax, I think it's becoming, and this is where the transition starts to happen, 00:53:36.860 |
that a lot of people kind of say, hey, over the long run, the temperature is going to go up by 00:53:41.060 |
one degree Celsius over a century. You know, who cares? But there are kind of- 00:53:49.160 |
And by the way, I don't disagree with your comment- 00:53:50.420 |
Whenever somebody like me points out how insane some of these policies are, 00:53:53.540 |
the topic gets shifted to, are you a denier of some science or other? 00:53:58.760 |
Listen, my point is not about the science. My point is, look at the collapse of the Sri 00:54:02.900 |
Lankan economy because they implemented these ESG rules on fertilizer. Look at what's happening to 00:54:08.360 |
the Dutch farmers who are being put out of work because of ESG rules. 00:54:11.720 |
California, California today, Governor Newsom today said, please don't use your electric power 00:54:18.200 |
to charge your electric cars. A week after they said gas 00:54:23.360 |
cars are now going to be banned in the state. And so there's a deep irony underway today in 00:54:29.300 |
California because, and Fox News has obviously latched onto this story that- 00:54:36.020 |
Look, I've become suspicious whenever politicians invoke some crisis as the reason for some 00:54:43.580 |
authoritarian measure, I've learned to distrust it. That's the point. And so, 00:54:49.940 |
You know what's happening in Europe right now is obviously gas is through 00:54:53.180 |
the roof because of the Ukraine war. The price has basically gone through the roof. And so, 00:54:58.400 |
because of that, they're not able to produce fertilizer. And one of the byproducts of the 00:55:03.020 |
production of fertilizer is CO2, which gets added to the production of beer to make it fizzy. Well, 00:55:09.740 |
guess what? The Germans are not only about to run out of gas, they're about to run out of beer. 00:55:15.020 |
You think they're going to keep supporting the Ukraine war when they find out that there's 00:55:19.280 |
no beer for Oktoberfest? Forget about not being able to heat their- 00:55:23.000 |
Heat their homes. They can't drink beer in October. 00:55:27.380 |
The Western way of life is going to fracture very quickly. 00:55:31.220 |
Sachs, just taking a beat for a minute. Like, taking the political response off the table, 00:55:38.840 |
are you as an individual concerned about the impact of a changing climate on people and on 00:55:47.240 |
the economy? And are you interested as an investor in the private market solutions to resolve some of 00:55:52.820 |
those things that are obviously going to become market-driven problems? 00:55:58.160 |
I think there may be a long-term problem here. I'm not sure, like, how long it takes. I definitely 00:56:04.400 |
think- I'm skeptical of this claim that, you know, the world's going to end in the next 10 years, 00:56:09.680 |
because frankly, we've been hearing that since the 1990s. 00:56:12.620 |
1988. There's a great headline article, by the way, in New York Times, where it was like, 00:56:16.520 |
scientists in 1988 said that all the oceans, all the ice caps will melt, the oceans will flood the 00:56:22.640 |
land by the year 2000. So there is a credibility challenge associated with predicting, you know, 00:56:27.920 |
kind of long-term outcomes like this that continues to kind of foment, unfortunately, right? 00:56:32.660 |
Yeah, all my beachfront property would be underwater right now if all that came true. 00:56:36.020 |
No, I'm just kidding. But no, but look, it hasn't escaped my attention that many of the political 00:56:42.080 |
leaders who are claiming that we're facing imminent threat of being underwater in 10 years own 00:56:48.740 |
beachfront property and fly on private jets. So, I'm not sure if that's a good thing. 00:56:52.460 |
So obviously, you know, I'm not saying this isn't a long term problem. But I do think they try to 00:56:58.040 |
create an imminent crisis so they can shove through a bunch of bad policies that are very 00:57:03.080 |
destructive to the economy. Freeberg, would you define what's happening with temperature and 00:57:07.820 |
extreme weather as a crisis or not? Yeah, I mean, I would 00:57:15.440 |
say there's a critical impact, and will continue to be a critical impact, in the food supply chain 00:57:22.280 |
in the quarters and years ahead, because of what we're seeing. There's a severe drought and heat wave in 00:57:29.780 |
China right now. And by the way, food is not the only one there's also the manufacturing supply 00:57:35.360 |
chain. So in the province of Sichuan in China, they actually lost power because so much of the 00:57:41.000 |
power is driven by hydroelectric plants. So streams and water flow has slowed down and stopped. As a 00:57:47.180 |
result, there's less power as a result, the factories are being shut down. As a result, key components 00:57:52.100 |
for manufacturing in the computing industry and in kind of mechanical goods are not being 00:57:59.780 |
produced at the rate that they were being produced before that has a ripple effect in the supply chain 00:58:04.100 |
similar to what we saw in COVID. And then in the food supply chain, we're seeing a drought and a 00:58:08.960 |
heat wave. Right now, we're coming out of it in the Midwest in the United States where we grow 00:58:13.400 |
corn and soybeans throughout Europe, and also in China. And in China, they had 70 days in a row of 00:58:19.280 |
record setting temperatures. 70 days in a row. 00:58:24.080 |
No water, high temperatures. They're just starting to assess the crop damage, and 00:58:29.840 |
it looks pretty severe. There was massive crop damage in the Midwest in 00:58:34.820 |
the Corn Belt this year. And then obviously, European farmers are having issues. And combine 00:58:39.560 |
that with the geopolitical issues of the Ukraine crisis and the natural gas pricing being so high, 00:58:44.360 |
one third of fertilizer of ammonia plants have been shut down in Europe. And they think that ammonia 00:58:51.740 |
and fertilizer production may drop by as much as one half in Europe, because of the crisis. So energy 00:58:56.780 |
prices are so high, you can't make fertilizer, so that energy is also being redirected into other 00:59:01.040 |
industries and to support homes. So you believe it's a crisis. So I guess the question there, 00:59:04.520 |
yeah, people are turning on their ACs in California this week, you know, you see that 00:59:08.300 |
Governor Newsom just said, our grid in California cannot support excess electricity consumption. 00:59:14.600 |
Therefore, one of the biggest variables is electric cars. He's saying don't plug in your 00:59:18.800 |
electric cars this week. And that's because of record high temperatures that are about to hit 00:59:21.560 |
the state of California in two days. So look, everyone says, hey, these are kind of anecdotal stories. 00:59:27.800 |
But no, it's a trend. Statistically, the frequency of extreme things happening is continuing to climb. 00:59:34.460 |
And then the impact is in the food supply chain, the energy supply chain, and the manufacturing 00:59:39.200 |
supply chain. And there are follow on effects. Now, I'm optimistic that private market solutions 00:59:44.360 |
will resolve these issues. And that was my next question is, can we solve this or not? 00:59:48.260 |
Yeah, of course. I mean, look, we've had crises since the 00:59:51.380 |
dawn of humanity. I mean, we were starving, you know. We need a crisis to solve some of these 00:59:56.480 |
problems. These are pretty economically obvious. Yeah. And that's, that's my point that the private 01:00:01.160 |
market will resolve because people want to have electricity, they want to have food, they want to 01:00:06.440 |
have what they need. And so the market will pay for those things. And therefore, producers 01:00:12.440 |
and technologists will develop solutions to make that happen. So we just have to have the 01:00:16.460 |
problem and suffering and see it firsthand for the market to kick in. Chamath? 01:00:21.200 |
Your friend in the group chat that shall not be named posted this meme. You guys saw it right, 01:00:28.460 |
which said, like, an autistic schoolgirl takes- what was it? I have it here. 01:00:35.060 |
You can show the dominoes. But basically, it said, an autistic Swedish girl decides to skip school is 01:00:42.500 |
the little domino, and seven dominoes later, the huge domino says, the collapse of Europe's- Setting aside 01:00:51.020 |
the language used in the meme for a second, what is the point? The point is, these conversations 01:00:56.840 |
can so quickly go off the rails and are not about national security and economics and quickly become 01:01:03.260 |
about moral virtue signaling that you missed the point. The point in the United States just to be 01:01:08.240 |
very blunt, is that the cost of electricity has gone up by 46% in the last decade, it will go up 01:01:20.840 |
So between 2010 and 2030, the cost of electricity for every single American will have effectively 01:01:28.340 |
doubled, even though the ability to generate particularly from wind and solar have been 01:01:35.900 |
reduced by 90%. Now, why is that? Well, it turns out that if you look inside the PNLs of these 01:01:41.900 |
utilities, it actually gives you the answer. So over the next 10 years, we have to spend as a nation, just 01:01:50.660 |
on upgrading power lines, $2 trillion. This is all, 01:01:57.560 |
like, crazy, crazy CapEx, right? CapEx improvements for 01:02:01.400 |
sunk-cost investments. Our power line infrastructure in 01:02:04.120 |
America is 20 plus years old, 30 years old, our generation 01:02:10.360 |
infrastructure is pretty poor. We have all of these failures, 01:02:14.220 |
we don't have enough peakers, we don't have the ability to store 01:02:17.000 |
energy when we need it. So if you add this all up, I do think 01:02:20.900 |
that there's a huge economic incentive to solve it. And there 01:02:23.700 |
are practical ways to solve it. And that's what we have to stay 01:02:27.260 |
focused on. Because if we allow the moral virtue signaling to 01:02:30.260 |
get in the way, we're going to make some really stupid 01:02:32.500 |
decisions. We're gonna go out trying to turn off nuclear. 01:02:34.820 |
Another thing that Elon said, you know, we're not going to 01:02:37.980 |
greenlight shale and nat gas, we don't have time for this. If you 01:02:42.440 |
actually believe this is a cataclysmic issue, you need to 01:02:46.980 |
be okay with hydrocarbons, because it is the only credible 01:02:50.340 |
bridge fuel we have to keep the world working properly. Because 01:02:53.940 |
otherwise, what David said, is right, we are going to 01:02:56.820 |
economically destroy parts of the world by trying to race 01:03:00.540 |
towards this net zero goal. By the way, guys, I just want to 01:03:03.540 |
take the, you know, rip the band-aid off. Net zero by 2050? 01:03:08.520 |
It is not possible. There is zero credible plans that the 01:03:12.640 |
world has to do it. So we have to take small incremental steps 01:03:16.580 |
and many of them. Yeah, this is and we should stay focused on 01:03:20.820 |
that. Yeah, the interesting thing. And you cannot let people 01:03:24.140 |
guilt trip you because this is when you make these stupid 01:03:26.600 |
mistakes like the entire continent of Europe made really 01:03:30.320 |
which is now going to cripple many economies of 01:03:35.280 |
that entire continent unnecessarily. Here's the good 01:03:38.000 |
news, all this virtue signaling and people wanting to be green 01:03:41.060 |
and for whatever reason, all of that is now aligned with the economics and geopolitical 01:03:45.940 |
stuff. Right? I mean, that's what's going on. The other thing 01:03:48.500 |
that I think is really interesting is that the government 01:03:50.620 |
is forcing people to rethink nuclear, the Diablo Canyon, 01:03:54.400 |
nuclear power plant, which is the last one in California that's 01:03:58.160 |
operational. It's being voted on today as we're taping this and 01:04:02.020 |
it's expected that they're going to keep it online. The governor 01:04:04.760 |
with the great hair wants to keep it going. And the 01:04:07.440 |
fascinating part about this is, what choice does he have? 01:04:09.820 |
Exactly. And so now the awareness is so high about power 01:04:15.540 |
people don't understand how nuclear energy even works. And 01:04:19.820 |
I think they went to a No Nukes concert in 1978, and they 01:04:22.260 |
haven't changed their position since. But this is interesting. 01:04:24.180 |
This guy, Gene Nielsen, who's been a champion of this stuff. 01:04:27.240 |
He said, I thought our chances were zero of keeping this thing 01:04:32.100 |
open, you know. And he said, he told this conference of attendees, this 01:04:36.360 |
nuclear conference, that the effort to maintain it, he said, 01:04:40.560 |
what's happened since you know, the last six months has been like 01:04:45.660 |
That everybody has changed their position just in the last 01:04:48.240 |
year, on shutting these things down. And we saw obviously 01:04:51.120 |
Germany is, you know, rethinking the three remaining of their 01:04:54.660 |
six that they were going to turn off, they're putting back on. 01:04:57.180 |
And I think we're going to see new ground broken. And Japan 01:05:00.420 |
came out just in the last week and said they're going to build 01:05:03.600 |
more nuclear power and nuclear power plants. Now think about 01:05:06.720 |
that. That's awesome. That's unbelievable. They had 01:05:09.700 |
Fukushima, the Germans shut off their nuclear because of 01:05:15.660 |
Fukushima. Guys, Fukushima didn't happen by accident. There was a 01:05:19.520 |
like, an incredible tsunami, which was triggered by a 01:05:23.800 |
once-in-100-years earthquake. You're talking about these extremely 01:05:27.560 |
long tail events. Yes, the retaining walls could have been 01:05:30.700 |
built better. But these are things we find we iterate and 01:05:34.300 |
resolve. It was not the reason to shut down an entire mechanism. They made 01:05:39.880 |
stupid mistakes where they put it just too low, in the wrong place. If it had been moved 01:05:44.460 |
a couple of miles, it would have been fine. But also, if you 01:05:47.280 |
look inside of what happened, there was enormous pressure 01:05:50.280 |
inside of these companies to basically, you know, take 01:05:54.120 |
actual direct blame and responsibility. I get all of 01:05:57.240 |
that. But these organizations were shamed into turning these 01:06:00.640 |
things off. That is not the way to make good smart decisions. 01:06:04.100 |
Okay, so Gorbachev, the last ruler of the USSR passed away 01:06:10.980 |
Yeah, I mean, I think this was a real milestone. 01:06:14.260 |
You go back to the 1980s. And Ronald Reagan, who had spent 01:06:20.740 |
his entire career being a cold warrior, saw the opportunity to 01:06:27.400 |
basically do business with Gorbachev. Margaret Thatcher had 01:06:30.920 |
told him that this is a man we can do business with. Gorbachev had 01:06:34.000 |
come to power in 1985. He had initiated reforms of the Soviet 01:06:37.500 |
system. He was a communist to be sure. But he introduced 01:06:40.760 |
political reforms called glasnost and economic reforms called perestroika. 01:06:44.560 |
And Reagan seized the opportunity to go meet with him 01:06:46.900 |
and they signed arms control treaty after arms control 01:06:50.180 |
treaty and ended the threat of mutually assured destruction 01:06:55.220 |
that the world had been living with since the beginning of the 01:06:57.080 |
Cold War. You got to remember that, you know, the Cold War 01:06:59.480 |
began shortly after World War Two. And we had this doctrine of 01:07:04.000 |
mutually assured destruction, or MAD, and the whole world was 01:07:07.120 |
living under the shadow of nuclear annihilation. This was 01:07:10.300 |
shown repeatedly. When I was a kid, there was a TV 01:07:15.000 |
movie, The Day After. The Day After, it was scary. Lord, it terrorized us, 01:07:21.500 |
that in America we're all gonna die. Did you ever have 01:07:24.500 |
nuclear bomb drills where we had to get under our desks? 01:07:27.080 |
Yeah, we did. Yeah. So if you're like our age, you 01:07:31.580 |
remember this that that movie, by the way, that that was a TV 01:07:34.280 |
event movie that was one of the most widely watched movies. But 01:07:37.700 |
there were others, you know, Jim Cameron used this concept in the 01:07:40.460 |
Terminator movies. You know, it was something that people were 01:07:43.860 |
really afraid of. And Reagan saw the opportunity. He thought 01:07:47.580 |
fundamentally that nuclear deterrence was immoral; that, yes, 01:07:52.380 |
better to have deterrence than a nuclear war, but that if he 01:07:56.280 |
could, he'd seize the opportunity to negotiate the end of the Cold 01:07:59.880 |
War. And by the way, there were hardliners in his administration 01:08:02.940 |
who did not want him to negotiate with Gorbachev. But 01:08:05.220 |
they ended up doing a series of meetings. It culminated in 1986 01:08:13.660 |
in a deal to remove, you know, these INF systems, 01:08:18.040 |
and that ended the Cold War. And then basically what happened is 01:08:20.680 |
in 1989, the Berlin Wall came down. Gorbachev allowed the 01:08:25.360 |
Warsaw Pact countries to leave. And then in 01:08:28.300 |
1991, the Soviet Union collapsed. So, you know, he gets 01:08:32.900 |
a lot of credit for being willing to reform that system. 01:08:35.600 |
Now, the sad thing is, if we fast forward 30 years 01:08:39.620 |
later, where are we today, we're back in a new Cold War, with 01:08:43.460 |
Russia, I mean, that we've been spending a good part of this 01:08:46.760 |
year, talking about the threat of a nuclear use. And, you know, 01:08:51.320 |
this was a problem that we thought was solved 30 years ago. 01:08:54.440 |
And now we're back with it today. And you've got to ask, 01:08:58.520 |
have the successors of, you know, Reagan and George 01:09:05.180 |
Herbert Walker Bush, the people who inherited our foreign policy 01:09:08.660 |
over the last 30 years, have they done as good a job as 01:09:13.260 |
Reagan did? Reagan ended the Cold War, we are back in a new 01:09:16.620 |
Cold War. Why? What is the reason for this? There's been a 01:09:21.060 |
series of stupid policies that have now put us back at risk of nuclear war. 01:09:28.020 |
My interest in Gorbachev is slightly different. He is an 01:09:32.220 |
incredibly important character of the second half of the 20th 01:09:37.200 |
century, undeniably. You know, he won the Nobel Prize, as you know. 01:09:43.060 |
But the most important thing, in my opinion, was the precursor to 01:09:47.200 |
perestroika. And why he did it. And as you said, like, you know, 01:09:52.120 |
this is a fairly ardent communist, although he had 01:09:54.340 |
really interesting views, like he ran a very kind of like, 01:09:57.300 |
open, kind of Politburo where folks could debate and he, you 01:10:02.440 |
know, promoted a lot of young people from within, all of these 01:10:05.500 |
interesting things. But the most important thing, and he's written 01:10:09.200 |
about this a lot is the reason that he embarked on perestroika 01:10:12.560 |
was because the USSR at the time had an incredibly poor work ethic, 01:10:16.880 |
terrible productivity, and horrible quality goods. And I 01:10:21.920 |
think that there's something to be learned by that. Because at 01:10:24.740 |
the tail end of communism, essentially, where you had 01:10:28.040 |
central planning, centrally imposed economic principles, 01:10:33.260 |
what happened, people did not feel that they had any ownership 01:10:37.740 |
in the outcome. No agency, no agency whatsoever. And I think 01:10:42.540 |
there's a really important lesson to observe there, which is that if 01:10:45.300 |
governments get too actively involved, this doesn't just 01:10:49.720 |
happen in Russia, it happens everywhere. It's happening in 01:10:52.380 |
California, we just talked about it, it's happening where we live 01:10:55.200 |
right now. And if you look at then what happened afterwards, 01:10:58.980 |
it became the aristocracy that had basically ruled the USSR before 01:11:04.440 |
fighting against all these folks that wanted reforms. 01:11:08.360 |
And that created the schism, which then perverted 01:11:12.360 |
capitalism for, you know, seven or eight years through Yeltsin until Putin got there. And that's 01:11:18.240 |
what created the oligarch class and, you know, really exacerbated a bunch of wealth capture 01:11:22.660 |
by a handful of folks that may or may not have deserved it. And I'm not going to judge that, 01:11:26.440 |
but I just think it's really important to understand that he was forced to embark on this 01:11:31.760 |
because of all of these state central planning policies. And so it's just an important lesson 01:11:38.240 |
for Americans in a democracy, which is: careful if you want more government, you might get more 01:11:43.400 |
government. Guys, you're exactly right. The Soviet Union, their economy used to run on what they 01:11:48.380 |
called five-year plans. It was incredibly centralized. It was all run by the government, 01:11:52.740 |
and, you know, this was communism. And the 20th century, the second half especially, was a 01:11:58.800 |
giant battle of systems, not just of countries, not just the Western bloc, um, you know, led by the U.S., 01:12:05.140 |
the free world, versus the Soviet Union, the Warsaw Pact. 01:12:08.180 |
It was also a battle of philosophies and systems. It was the philosophy of state control versus freedom 01:12:14.120 |
and, uh, a free economy. And freedom won, you know, the free economy won in that battle. And the crazy 01:12:21.240 |
thing is, 30 years later, we're talking about socialism being a viable doctrine. You have 01:12:26.920 |
politicians basically saying that they are socialists, but 30 years ago you would have been 01:12:32.000 |
like, that was- There's an election, and the point is, example after example, 01:12:38.200 |
evidence well documented of just how it doesn't work. And I guess 01:12:42.880 |
the thing is, you know, we all say like, you just have to learn 01:12:45.480 |
it for yourself. You know, like, you'll tell your kids 01:12:48.360 |
to not do something ad infinitum. The stove. Yeah, they got to touch 01:12:52.320 |
the stove themselves and get burned. We're about to go do 01:12:56.320 |
Can you imagine if, like, people who want to be socialists here, 01:12:59.600 |
if they had to be in a food line or have rations? 01:13:02.920 |
Literally, Russia had food lines, and they were rationing food in 01:13:06.280 |
the 1980s. Like, well, yeah, just central planning. God, yeah, a large 01:13:10.560 |
driver of Gorbachev, basically negotiating these peace 01:13:13.480 |
settlements with, you know, with Reagan and this nuclear, you 01:13:16.600 |
know, demilitarization, was in part because he knew he couldn't- 01:13:22.720 |
It was collapsing. The economy was collapsing. Yeah, he could not 01:13:27.920 |
keep up. Reagan began an arms build up. And there was an arms 01:13:31.720 |
race and they were losing. They were losing. They were losing. 01:13:36.240 |
people should not misunderstand what Gorbachev was. He was not 01:13:39.680 |
necessarily some democratic freedom fighter. He was a person 01:13:42.720 |
who was observing the on-the-ground conditions of a 01:13:45.480 |
socialist system decaying in its ability to compete. And so he 01:13:49.120 |
had to capitulate before it was forced upon him. Right? Right. 01:13:53.880 |
This is why I don't think he deserves as much credit as 01:13:56.120 |
Ronald Reagan is because at the end of the day, Gorbachev would 01:13:58.880 |
have kept communism going if he could have. 01:14:02.200 |
He wanted the Soviet Union to stay together. 01:14:06.080 |
But I think to his credit, when things started to collapse, he 01:14:09.120 |
didn't use violence and repression to try and hold it 01:14:12.640 |
together. So, you know, he was a reformer. He was a 01:14:16.200 |
liberalizer, a pragmatist maybe, and didn't use 01:14:19.160 |
violence. But and ultimately, he was a partner for Ronald Reagan. 01:14:22.320 |
Remember, Ronald Reagan began this gigantic arms build up, he 01:14:25.280 |
was denounced as a cowboy who would get us in World War Three. 01:14:28.200 |
But when he had the chance to negotiate a deal with Gorbachev, 01:14:31.800 |
he took it and they basically got arms control. And it was 01:14:35.280 |
because Reagan was able to sit on top of an extremely productive 01:14:40.120 |
capitalist system that allowed him to make those investments 01:14:43.120 |
that made that capitulation basically a fait accompli. 01:14:46.480 |
And I think that that's another thing for a lot of us to 01:14:49.160 |
understand, which is free market capitalism, removing these 01:14:53.840 |
degrees of decision making, gives us degrees of freedom. And I've 01:14:57.000 |
said this before, on the last pod: the great thing about the 01:14:59.680 |
IRA and what Chuck Schumer did is that it will actually be written about in history. 01:15:05.120 |
Because when we get to energy independence, when every home is 01:15:09.160 |
resilient, the national security calculus in America changes 01:15:13.120 |
wholeheartedly overnight. Yeah, I mean, think about our relationship 01:15:16.760 |
years ago. Right, but I didn't cancel that. Okay, well, listen, 01:15:21.920 |
no, it's true. If you're willing to use natural gas and fracking, 01:15:25.120 |
America is one of the richest energy countries in the world. 01:15:28.680 |
We produce a lot of oil too. Yeah. I mean, listen, it will 01:15:31.640 |
change everything if we can really go all in on all of the above. 01:15:36.080 |
It's not just the fossil fuels. If you 01:15:40.040 |
let the capital markets and the free markets do what 01:15:43.040 |
they're meant to do, they will empower the government to do 01:15:46.560 |
great things for all citizens. I agree 100%. 01:15:50.200 |
And the reason some want to stop it is because we have a couple 01:15:53.000 |
of examples of, you know, extreme wealth. And I think that's 01:15:56.080 |
something we have to think about: is the 01:15:58.680 |
extreme wealth created for a certain class enough of a reason to stop the 01:16:02.280 |
prosperity train that we actually have in America? Listen, Jason, 01:16:04.840 |
the politicians in California think they can run fast food 01:16:07.800 |
businesses better than the people who own those 01:16:11.520 |
restaurants. People who have never worked in their lives. No, 01:16:15.160 |
they wouldn't. Yeah. The great thing 01:16:18.440 |
is we have a running A/B test, which will show whether this 01:16:22.240 |
state central planning can work or not. And again, if we refuse 01:16:27.200 |
to want to listen to the examples of Russia, or all of 01:16:30.240 |
these other countries that have tried this, then so be it. We 01:16:34.800 |
will see in the next three to five years that these policies 01:16:37.560 |
don't work, and that they actually accelerate the exact 01:16:41.520 |
hellscape that they think they're trying to avoid. 01:16:43.360 |
But you know, not only to my point about foreign policy, not 01:16:46.480 |
only are we forgetting the lessons of this Cold War, the 01:16:50.640 |
economic lessons, that a free enterprise system generates more 01:16:54.040 |
wealth and prosperity, and more national greatness, more ability 01:16:57.920 |
to fund a defense budget, create a better economy. Not only 01:17:02.120 |
does it do all those things, not only have we forgotten 01:17:04.760 |
that, but also we have abrogated, we have ended, every 01:17:09.920 |
single defense and arms control treaty with the Russians 01:17:14.080 |
that Reagan and Gorbachev signed. Why? Why did we do that? 01:17:18.520 |
Now our nuclear missiles are pointing at each other again. 01:17:21.560 |
And you know what? Russia's a much poorer country than the US. 01:17:24.720 |
We're actually 15 times richer than them. During the Cold War 01:17:28.080 |
at their peak, we were only three times richer than them, 01:17:30.880 |
but we're 15 times richer, but they still got over 6,000 nukes. 01:17:34.720 |
Why did we get rid of all those arms control treaties? Why have 01:17:39.160 |
Yeah, we said it before, like you're dealing with 01:17:41.720 |
a pragmatist there, and Putin, a KGB agent, is slightly different. 01:17:45.960 |
Maybe you feel like Putin isn't someone we can deal with now, 01:17:48.240 |
but he was someone we could have dealt with 10 years ago, 20 01:17:50.800 |
years ago. We missed the opportunity to deal with him, 01:17:54.080 |
If he was a rational actor, we would be able to deal with him. 01:17:57.400 |
I think the only 01:18:00.280 |
way we'll be able to deal with him is if he doesn't have the oil leverage. 01:18:04.680 |
He just has too much power from that oil. And over time, 01:18:08.160 |
that's the solution. We have to be energy independent. You know, 01:18:10.640 |
I think Europe has learned that lesson harder than we have. 01:18:12.920 |
Gorbachev was a communist. He was an heir to Stalin. And yet 01:18:16.640 |
Reagan could still do business with him and sign an arms 01:18:19.680 |
control treaty so that we could end the risk of mutually assured destruction. 01:18:25.280 |
You don't have to believe, you don't have to believe that 01:18:27.160 |
Putin is a good guy in order for us to avoid getting back into a 01:18:31.840 |
Cold War, which is the situation we find ourselves in, 01:18:34.640 |
now. How does it benefit us to have Russian nukes once again pointed at us? 01:18:41.880 |
And the reason why, the reason why we alienated him, Jason, 01:18:44.800 |
has nothing to do with him being a madman or whatever. It is 01:18:47.800 |
because we brought NATO, which he views as a hostile military 01:18:51.160 |
alliance, right up to his border. And by the way, listen 01:18:55.200 |
to this video from Gorbachev. Gorbachev was asked about NATO 01:18:58.480 |
expansion in front of the US Congress. He was giving a talk 01:19:01.640 |
to them and they asked him, Gorbachev, what do you think 01:19:03.800 |
about NATO expansion? What did he say? 01:19:04.600 |
He said that you cannot 01:19:06.840 |
humiliate a country this way and expect there to be no 01:19:09.600 |
consequences. Gorbachev was against NATO expansion. When 01:19:13.960 |
basically George Herbert Walker Bush and the Secretary of State 01:19:16.360 |
James Baker went to Gorbachev to argue on behalf of German 01:19:19.440 |
reunification. This is basically in 1990. Baker made the promise 01:19:24.360 |
to Gorbachev not one inch eastward. Gorbachev said, yes, 01:19:27.760 |
okay, they can reunify, but we do not want NATO moving up to 01:19:31.040 |
our border. Baker made that promise. We have brought NATO up to their 01:19:34.560 |
border. That is why Putin regards us with hostility. 01:19:39.120 |
Yeah, and I think Europe regards him with hostility and they're 01:19:43.480 |
scared of him because he keeps invading countries. So yeah, 01:19:46.000 |
it takes two to tango. But hey, this has been episode 94. 01:19:49.040 |
It does. It does take two to tango. But you have to ask: has 01:19:52.280 |
the US foreign policy over the last 30 years, which has gotten us 01:19:55.800 |
into a new Cold War with Russia, been successful? Has that been a successful 01:19:58.920 |
foreign policy? They have undermined all the good work 01:20:01.720 |
that Ronald Reagan and Gorbachev did together to 01:20:04.520 |
end the Cold War. We are back in a new Cold War. And you need 01:20:07.760 |
to find a new Reagan. You just need to find a new Reagan. Putin has 01:20:09.920 |
some blame. Putin has some blame, but so does the US State 01:20:12.720 |
Department. So does the US State Department. And Europe. Europe 01:20:15.600 |
has a big part of the blame too here because he's their 01:20:17.960 |
neighbor. Yeah. For the Sultan of Science, the Rain Man, and the 01:20:22.160 |
Reagan Library Chairman. Are you the chairman of the Reagan 01:20:25.440 |
Library? Let me ask you this. When you were a kid, did you 01:20:27.560 |
have a Reagan poster in your room? Have you ever owned a 01:20:30.280 |
Reagan poster? Maybe at Stanford? I'm sure I've definitely owned a 01:20:34.480 |
picture of Ronald Reagan. Listen, Jason, 01:20:38.720 |
Ronald Reagan poster or a picture in your dorm at 01:20:42.560 |
Stanford? Yes or no? Listen, Ronald Reagan. That's a yes. 01:20:46.120 |
Hold on a second. Ronald Reagan won the Cold War without firing 01:20:49.840 |
a shot. He gave us the greatest economy the US has ever had. He 01:20:53.160 |
ended the inflation and stagflation of the 1970s. And 01:20:57.480 |
again, he avoided getting us into any of these major wars. 01:21:01.640 |
Tell me the politician, pick any of the 01:21:04.440 |
politicians who you revere, who has a track record like that. 01:21:08.200 |
Um, yeah, I mean, I guess one would say Bill Clinton, you know, 01:21:13.600 |
would be the closest, you know, great president of our lifetime. 01:21:17.400 |
I think he was a good, good president, especially term number two. But 01:21:21.400 |
remember, he benefited from the economy that Reagan built. Whatever 01:21:24.160 |
the reason, you would agree with me: Clinton is right behind 01:21:27.400 |
Reagan. Reagan and Clinton, since like 1950-something, are easily top two. 01:21:34.320 |
I mean, it's hard for Sachs to say that. Eisenhower was very good. 01:21:37.680 |
Listen, I mean, head and shoulders, I think, for 01:21:40.080 |
practically what they did at their point in time? Amazing. 01:21:43.640 |
All right, listen, Sachs. I know you love Reagan. And Reagan and 01:21:49.080 |
Clinton. You should put up Reagan and Nixon, or 01:21:51.720 |
Reagan and Bill Clinton, one on each side of you, for next week. 01:21:54.400 |
Will you do that for us? I gotta go, guys. I gotta go. I love 01:21:57.520 |
you. Sassy. I love you. Love you too. Happy birthday. Thanks, 01:22:07.520 |
And instead we open source it to the fans and they've just gone crazy. 01:22:32.880 |
They should all just get a room and just have one big huge orgy, because they're all just like this, like, sexual tension, but they just need to release that. 01:22:58.920 |