LA's Wildfire Disaster, Zuck Flips on Free Speech, Why Trump Wants Greenland
Chapters
0:00 The Besties welcome Cyan Banister!
9:16 Reacting to the devastating wildfires in LA: broken incentives, leadership failures, lessons learned
36:51 Insurance issues, rebuilding headwinds, reclaiming the government
59:44 Zuck goes full free speech, fires third-party fact-checkers, opts for Community Notes model
80:19 Nvidia goes consumer at CES: market cap impact, most interesting vertical
94:49 Why Trump wants to purchase Greenland from Denmark
100:05 Conspiracy Corner: Who built the pyramids?
00:00:00.000 |
I just got a haircut with a new person. She was like, I'm like, do what you want. This is what 00:00:04.880 |
she did. Okay. Well, let me know who she is. Chamath and I will go beat her up and get them, 00:00:09.360 |
get your money back. Did she feather your bangs and blow your hair out? She did. She gave you 00:00:13.360 |
a blow, didn't she? It's starting already. Okay. No, but that's a blow dryer. Just, 00:00:17.440 |
yes. Right. She blow dry your hair? At the end, she gave me a little. 00:00:20.560 |
Yeah. That's not sustainable. So you can't tell what the quality of the haircut's like, 00:00:23.600 |
because you're never going to do that again. You don't have the skill. 00:00:25.920 |
I've never blow dried my hair in my life. No, I understand that. Then this is why, 00:00:28.960 |
because if you get the blow and it looks good in the blow. 00:00:31.200 |
Just say blow out, please. Just say blow out, the full word. 00:00:34.720 |
Why? What are we, six? Just grow up, you f***ing dingo. 00:00:37.200 |
The way you're saying it, you're saying it to provoke a reaction. Come on. 00:00:44.560 |
Tell us about what your rules for blows are. What I'm saying is if you get a haircut 00:00:48.720 |
and you get a blow, it's very hard for you to know. No, but I'm serious. It's very hard for 00:00:54.720 |
you to know what it's going to look like the next day when you take a shower and when you don't, 00:01:03.040 |
Oh, you're saying the self-blow can't match the stylist blow. 00:01:06.080 |
It's just important when you get a haircut with a new stylist or a hairdresser or a barber, 00:01:11.120 |
you cannot let them blow you. He's not happy with the ending. 00:01:14.800 |
Got it. It was an unhappy ending. Because when you blow yourself, Chamath, 00:01:20.560 |
which people have accused you of blowing yourself on this very program, when you blow yourself, 00:01:25.040 |
it's not going to come out the way it did. It won't be as fabulous. 00:01:29.040 |
Every time I've blown myself, it's been perfect. 00:01:31.360 |
All right, everybody, welcome back to the All In podcast. I'm your host, J-Pow from 00:01:56.640 |
Japan here, cutting turns in Niseko and at Iwanai. And we have an incredible lineup today. 00:02:06.480 |
As always, the Chairman Dictator Chamath is here to reign supreme. How are you doing, brother? 00:02:12.560 |
Good. How are you? What are you wearing exactly? 00:02:15.040 |
I'm just wearing my kimono as I want to do here on the All In podcast. 00:02:23.360 |
I just decided in 2025, I'm going to live my best life and I'm going to do everything. 00:02:29.440 |
Anybody asks me to do something, I'm saying yes. 00:02:31.520 |
Oh, I'm going to ask you a bunch of stuff next weekend then. I got so many ideas. I got a list. 00:02:36.240 |
I'm going to ask you to do all sorts of things. 00:02:39.120 |
So yeah, I'm over the moon right now. And then I'll be going to the inauguration to see all my 00:02:47.360 |
friends and celebrate the big Trump victory and tape an episode there. With us, of course, 00:02:53.680 |
David Friedberg, your resident sultan of science. And not a moment too soon. We have so much to 00:02:58.480 |
talk about. What's the background here? Those are some dead trees with Mount Fuji in the background. 00:03:05.040 |
Yeah, that's basically, Jason, a Kurosawa landscape. 00:03:08.960 |
While you've been gallivanting and being a dilettante, 00:03:12.960 |
your original adopted home state is burning to the ground. 00:03:16.800 |
Yes, I know about this. I got off the ski lift and I saw this after I had posted like, 00:03:22.400 |
"Oh, my life is amazing." And I was like, "Oh my God." Everybody replied like, 00:03:28.400 |
"Are you in the room?" And I'm like, "Oh my Lord." 00:03:31.600 |
This is unbelievable. And we'll obviously talk about that a whole bunch. 00:03:37.040 |
Yes, I put $500 behind it to boost it to try to get my ratings up. No, I actually 00:03:41.680 |
literally deleted it. And I never do that. But I posted a video where I was like, "Oh my God, 00:03:45.760 |
it's incredible." And I was like, "You know what? This is the wrong time for it." 00:03:49.440 |
So a little grace there, folks. And I am so happy to have here on All In Idol, 00:03:57.920 |
the one, the only, my good friend, Cyan Banister. 00:04:02.320 |
She's my good friend before you guys met her. So she's ours, but I've been friends with her longer. 00:04:07.920 |
So my good friend, Cyan Banister and our good friend and our bestie. Let's just leave it at 00:04:11.680 |
that. It doesn't have to be a competition for who Cyan is best. We'll ask her to rate at the end of 00:04:17.680 |
Thanks for having me. I appreciate it. It's nice to see everyone. 00:04:22.000 |
Jason, you want to tell people about Cyan's background? 00:04:25.040 |
Yeah, do an epic rant on Cyan's epic history. 00:04:27.360 |
I mean, Cyan and I fought in the Clone Wars together. It was a long time ago. But 00:04:31.600 |
she's a technologist, self-made individual who then decided she would start writing small angel 00:04:38.640 |
checks about 14 years ago, literally the same year I did. And myself, Cyan... 00:04:46.800 |
She's done incredible, yes, of course. But we'll get into it. And Cyan and I would... 00:04:53.840 |
14 years ago, I guess, we would meet startup companies together and host little events 00:05:01.840 |
where we get together and take pitches. And we invested in a couple of companies together, 00:05:06.320 |
and it worked out very nicely for everyone involved. 00:05:10.000 |
Yeah, we're in a couple of companies together. 00:05:13.280 |
Density. We were on the board together for a little bit, so that was fun. 00:05:18.400 |
Thumbtack. Actually, Thumbtack, Density, and Uber, I all discovered through you. 00:05:26.240 |
What's... Which one was Uber? I got to check my... 00:05:28.160 |
Yeah, and I don't think you're allowed to say them. 00:05:30.480 |
Let me check my Google sheet here. I didn't know I invested in Uber. Let me check. I have 00:05:33.520 |
to confirm that. Oh, yeah, I did. But at one of these events, I introduced Cyan to Uber. It's 00:05:40.080 |
Found all three of those deals at your event, so that was really great. 00:05:45.920 |
Let me try. Cyan is a prolific angel investor. 00:05:55.360 |
She runs a seed fund called Long Journey Ventures. 00:05:58.320 |
Some of her hits include SpaceX, Anduril, Density, Postmates, Niantic, which is the 00:06:08.080 |
makers of Pokemon Go, and Jason's favorite startup, Uber. Yeah, it's been a good run. 00:06:15.520 |
It's been a good run. And also, I'll just add, a wonderful human being. And if you ever 00:06:19.600 |
had the chance to hang out and talk for a couple of hours, Cyan would be one of those people that 00:06:26.320 |
I will promote Cyan's interview with Tim Ferriss a couple of weeks ago. 00:06:30.880 |
I randomly turned it on. I was in the car driving home, and then I stopped in my driveway and kept 00:06:36.000 |
listening. I was just telling Cyan it was a fantastic... What was it? About two hours, 00:06:44.400 |
And then I had to drive again. I listened to it, and I was excited to get back to it, 00:06:48.480 |
which never happens for me listening to long-form interviews like that. It was 00:06:56.000 |
A couple of things. One, Cyan is an incredible storyteller. The way she describes her 00:07:01.520 |
experiences, her history, her life, beautiful. She talks in kind of, I think, a deep persuasive 00:07:08.400 |
way about some of the things that have shaped her, her business investing, as well as kind 00:07:14.160 |
of spirituality, which she mentioned earlier, which is not something that you'll typically... 00:07:17.120 |
And you're like, "Wait, where did this conversation just pivot to?" 00:07:19.440 |
And then you go down this whole other path with her, and you go on the journey with her. 00:07:22.480 |
I just thought it was great. So all over the place, it was great. Beautiful. 00:07:26.080 |
Recommend it to everyone to get to know Cyan. 00:07:28.240 |
Oh, thanks. Can I have you all as my professional cheerleading squad from now on? 00:07:32.080 |
This is pretty awesome. I don't like talking about myself, and this is great. I love it. 00:07:36.240 |
Well, it's true. Cyan was voted most humble in our angel investing group, 00:07:41.440 |
and I was a close second, so I almost won most humble. 00:07:47.280 |
I'm going to get you a t-shirt called "The Humblest." 00:07:49.120 |
No, you have to borrow it from Chamath. He's got his hat on for the last 10 years. 00:07:54.400 |
Seven or eight years ago, Jason approached Cyan and I and said, "Hey, guys, 00:07:58.560 |
Harvey Weinstein has asked me to make a show." 00:08:03.920 |
Here's how he asked me to do it in his room. It was really- 00:08:09.520 |
Cyan, myself, and Jason went to someplace in the city, and we taped a- 00:08:17.360 |
We taped the NBC pilot for "The Accelerator," or "The Incubator." So I had been approached 00:08:25.680 |
and did a pilot for NBC called "The Accelerator," and they spent like a half million dollars on 00:08:31.680 |
this. And you guys came on, and it came out great. And it was just going to follow me around- 00:08:35.520 |
Well, it was very awkward, because afterwards, they approached Cyan and I to do the show- 00:08:49.520 |
We decided our friendship was more important. 00:08:52.640 |
I don't know if I ever told you guys the story, but literally, they were figuring out where to 00:08:56.880 |
put this and what time slot. They were like, "We're going to do it in the summer, because 00:08:59.600 |
we're trying to get some summer programming going. That's where we're going to test stuff." 00:09:02.720 |
And then Harvey Weinstein turns out to be a horrible monster, and the whole thing gets 00:09:06.240 |
canceled. And anything that was anywhere within 100 miles of Harvey Weinstein got canceled, 00:09:11.200 |
including my failed or forgotten reality TV show. 00:09:16.400 |
All right, let's get to more important things. 00:09:19.440 |
There is an unbelievable tragedy occurring in Los Angeles, as we're speaking. Devastating 00:09:26.080 |
wildfires. Basically, they've formed a ring around LA, the most destructive of which has been 00:09:31.760 |
the Palisades Fire, which has stretched into Malibu, obviously. 00:09:36.960 |
And 15,000 acres or so have been burned in that area. Thousands of homes, maybe 2,000 homes. 00:09:45.360 |
Here's some images. They're just devastating. And we have a lot of friends in this area. 00:09:51.040 |
And the area you're seeing on fire, if you don't know the topography of Los Angeles, 00:09:55.920 |
is north of Santa Monica. You have Palisades and then Malibu. And obviously, east of the 405, 00:10:00.960 |
you have things that you've heard of, like Bel Air and Brentwood. This area is part of a mountain 00:10:07.600 |
area called the Santa Monica Mountains, and they get very dry. And there's a phenomenon, 00:10:14.240 |
which we'll get into, called the Santa Anas, winds that blow really, really strongly, 00:10:19.520 |
and a perfect storm has happened, where thousands of homes have burned down, and tragically 00:10:25.040 |
five lives, and I'm sure there will be more, unfortunately, have been lost. 00:10:29.520 |
This video of driving down PCH, if you've ever driven PCH, the Pacific Coast Highway, 00:10:37.040 |
these are 10, 20, $50 million homes that are literally on the Pacific Ocean. 00:10:43.280 |
The most coveted homes in Los Angeles are not Bel Air and Brentwood. You might think that because 00:10:48.160 |
you hear them on TV. But really, if you were an incredibly successful person, you would aspire 00:10:53.600 |
to live in the Pacific Palisades, just west of Brentwood, and just south of Malibu or Malibu. 00:10:58.560 |
Many celebrities live there, many executives, etc. And these homes are gone. Thousands and 00:11:06.480 |
thousands of homes. This has turned into the ultimate Rorschach test on social media, where 00:11:13.680 |
people are projecting into this tragedy, which tragically occurs every year to varying degrees, 00:11:20.400 |
and maybe every 20, 30 years, it's an acute situation. We'll get into that in a moment. 00:11:24.640 |
But looking at this absolute, just devastating loss of property and lives, 00:11:32.160 |
the loss of lives could have been a lot worse. Friedberg, from a scientific perspective, maybe we'll start 00:11:37.360 |
there. When you look at these wildfires, extreme weather, global warming, and you look at this 00:11:44.320 |
situation, is that where your mind goes? Or in this Rorschach test of how you feel about 00:11:49.760 |
these kind of tragedies and how you interpret it? Do you go somewhere else? The incompetence 00:11:54.480 |
of California's government, DEI, Ukraine, I mean, everybody is superimposing on this 00:12:01.520 |
natural disaster, whatever their pet issues are. Where do you come to when you look at this? 00:12:08.160 |
I don't think that those are exclusive. I think that you can have had both incompetent planning 00:12:17.920 |
and execution by leadership, as well as have kind of uncontrollable circumstances that 00:12:27.680 |
management and planning weren't necessarily going to solve. I'll kind of talk about a couple of 00:12:31.600 |
these points real quick. First of all, we talked about when the hurricane hit a couple of months 00:12:38.080 |
ago, remember? As you guys know, I have an office or facility out in Asheville, so we were exposed 00:12:42.480 |
to the flooding circumstances. We talked about the frequency of that sort of an event having 00:12:48.640 |
been such a rare occurrence becoming more common. Similarly, we're seeing more frequent high wind 00:12:55.120 |
events in California, flooding events in California, and extremely hot events in California. 00:13:00.560 |
If you look at this link I sent out, Nick, in terms of the total precipitation over this current 00:13:05.280 |
what's called rain season, the Southern California region is basically at a, you know, call it zero 00:13:12.560 |
percent of normal. So this is Southern California. You can see that third column. That's the percent 00:13:17.360 |
of normal rainfall that has been experienced. There's been zero rain in these regions. So 00:13:22.640 |
everything is primed to be very dry, and then you get these Santa Ana winds, 100-mile-an-hour winds. 00:13:28.720 |
No matter how much underbrush you clear out, no matter how many trees you remove, if there's some 00:13:33.600 |
embers in the air, there's a 100-mile-an-hour wind. That is going to create a fire hurricane, 00:13:38.160 |
and a lot of homes are going to get caught on fire. So it's very hard to kind of just pin the 00:13:42.560 |
blame solely on not doing underbrush clearing, not doing removal of trees. Those should have 00:13:48.240 |
happened. They didn't happen. That was wrong. That was bad policy. But it doesn't excuse the 00:13:53.440 |
fact that there's a natural event that happened here that seems to be occurring with greater 00:13:57.920 |
frequency. The thing I'll kind of pivot to, if we want to get there now, maybe we'll talk about that 00:14:01.840 |
in a minute, it's kind of the economic and the policy issues with respect to the Department of 00:14:06.160 |
Insurance. Okay, let's get to that after we go through maybe a little bit of the quick reactions 00:14:10.960 |
here. I think that's where that's where there's going to be real pain and devastation, and that's 00:14:14.480 |
the biggest economic consequence is the role that insurance has played in all this stuff, 00:14:18.080 |
which we'll get to in a minute. Okay, so Chamath, I think, table stakes, we all agree, 00:14:22.800 |
global warming, extreme weather, depending on what degree you believe in it, they play some 00:15:28.480 |
factor here. And this is something that has reoccurred over and over again in this specific 00:14:33.600 |
region. But on social media, we're seeing a lot of other interpretations of this event, 00:14:39.680 |
maybe your thoughts on some of the other interpretations. And then where, when you 00:14:43.840 |
look at it, what do you start to think about preventing this in the future? Or maybe who's 00:14:48.800 |
responsible? What's your general take on what we've seen in the last week? 00:14:52.160 |
I mean, I'm not very sympathetic to the "there were 100-mile-an-hour winds" excuse, not because it's not true. 00:15:00.480 |
But there's been enough modeling that we know that these kinds of outlier weather events are 00:15:08.480 |
happening in greater and greater frequency. Nick, maybe you can find this and just put it up here. 00:15:14.240 |
But remember that crazy apocalyptic video of that exact same part of Southern California in 2018, 00:15:23.600 |
burning to the ground? Can we just look at that all of us collectively, because that was 00:15:28.640 |
six years ago. This is not like it was a distant memory from 100 years ago. 00:15:36.240 |
We knew in 2018. That's the Sepulveda Pass. So this idea that we were just lollygagging 00:15:43.760 |
around and got caught off guard by 100 mile an hour winds to me is completely not an acceptable 00:15:48.960 |
answer. We knew in 2018, that these things could happen. We knew across the rest of the United 00:15:55.360 |
States that these outlier weather events were happening in greater and greater frequency. 00:15:59.120 |
If you weren't sure, you saw most of the insurance companies try to dump Southern California homes 00:16:06.160 |
fire coverage three months before this event happened. So all this data was in the realm of 00:16:12.400 |
the knowable. And then when you double click, and you get into a little bit more of the details, 00:16:19.040 |
there's a level of incompetence bordering on criminal negligence here that we need to get to 00:16:24.800 |
the bottom of. So I'll just give you a couple of facts. In the 1950s, the average amount of timber, 00:16:32.240 |
so wood that was harvested in California, was around 6 billion board feet per year. 00:16:38.000 |
In the intervening 70 years, that shrank to about 1.5 billion board feet. And so you'd say, okay, 00:16:46.560 |
well, that's a 75% reduction. We must be making a very explicit stance on conservation. It turns out 00:16:54.480 |
that that's not entirely true, because what it left behind was nearly 163 million dead trees, 00:17:02.400 |
dead, like gone. And so you would say, well, those things should have been removed. And 00:17:09.520 |
the problem is that then there's this California Environmental Quality Act, 00:17:14.240 |
CEQA, hopefully I'm pronouncing this right. And a whole bunch of these other regulatory policies 00:17:20.080 |
that limited the ability of local governments and fire management to clear these dead trees 00:17:25.040 |
and vegetation. And I think that that's a really big deal. And when you double click on that, 00:17:30.880 |
here's where you find the real head scratcher. Okay. Multiple bills, AB 2330, AB 1951, 00:17:40.480 |
AB 2639, all rejected by the Democrat-controlled legislature, or worse, vetoed by Governor Newsom. 00:17:49.520 |
That would have exempted these wildfire prevention projects from CEQA and other permitting issues. 00:17:56.480 |
Then there were other bills to try to minimize the risk of fires by burying power lines underground. 00:18:01.840 |
SB 103, as an example, went nowhere, didn't even get to the governor's desk. So I'm just a little 00:18:10.160 |
bit at a loss to explain these two bodies of data. One is everybody can see that these events are 00:18:18.080 |
happening. Southern California lived through this exact type of moment just six years ago. 00:18:24.320 |
All the bills that are meant to prevent this are blocked or vetoed. This is the 00:18:32.640 |
ultimate expression of negligence and incompetence. 00:18:35.920 |
Okay, Cyan, you've heard Chamath and Friedberg's take here. Some amount of incompetence, 00:18:44.480 |
some amount of, hey, this keeps occurring, and there might be some global warming that 00:18:48.320 |
is contributing to it. What do you take away from this situation? 00:18:52.160 |
I agree with Friedberg and Chamath. It's a lot of everything. But I also think that 00:18:59.280 |
to add to the prevention part, other than clearing out underbrush and trees and things like that, 00:19:04.800 |
we don't build things in the state of California in a way that houses should be built when you 00:19:12.880 |
know that there are fires like this. So for example, we have more wooden roofs than we 00:19:17.120 |
really should have. We should really evaluate our materials that we're building things out of. 00:19:22.080 |
But we also have, down in El Segundo, this is a company that I invested in, Rainmaker. 00:19:27.920 |
We have the ability now to cloud seed and do preventative measures to actually make a region 00:19:33.040 |
have more water. And I don't understand why we're not looking into things like this that 00:19:39.440 |
could have prevented it. We knew that this storm was coming. We knew that these winds were coming. 00:19:44.880 |
Southern California shut power down. I have a farm down there. We still don't have power 00:19:50.800 |
because they knew that most of these fires were started by PG&E or downed power lines. 00:19:55.680 |
And so they proactively shut everybody down, and we're still running on generators. And if 00:20:00.320 |
you notice, there's no fires down there. But they also have 100-mile-per-hour winds. 00:20:06.320 |
And you're not seeing it. And there's plenty of mountain ranges and dryness there. You know, 00:20:09.680 |
avocado farms are basically just sitting fuel. So I do think it's a combination of all of those 00:20:17.040 |
things. And incompetence is definitely one of them. Yeah. And I actually lived right next door to this 00:20:25.040 |
area for a long time in Brentwood. And to your point about roofs, it seems silly. And a lot of 00:20:29.600 |
these fire prevention things can seem silly when you first mention them, which Trump looked, 00:20:35.200 |
let's face it, the way he says things sometimes is very colorful. And when he said, listen, 00:20:39.440 |
you're not raking like people in, wherever he said it, Scandinavia, Finland, are raking 00:20:43.280 |
the forest. And he was absolutely 100% correct on that. Maybe it sounded bombastic or silly, 00:20:48.960 |
the way he said it. But the truth is, in Tahoe, where we just were over the holidays, 00:20:54.480 |
people are clearing underbrush. When I lived in Los Angeles, people who lived in the Hollywood 00:20:58.480 |
Hills would get a fine if they didn't clear it. But there are mountain ranges that nobody owns. 00:21:04.160 |
And when you showed that Sepulveda pass, that's the 405 going past the Getty Center. 00:21:07.920 |
That area has got to be cleaned by the city and the government. And maybe they weren't doing it 00:21:13.120 |
as much. Look at this. This is apocalyptic. Yeah. So I know this pass very well, because I would 00:21:18.480 |
drive it. Jason, what did California learn from this? What did Gavin Newsom implement? 00:21:24.160 |
Based on what happened here? What did the city of Los Angeles implement? Based on what 00:21:29.840 |
happened here? I want to just specifically know the answer to those two questions. 00:21:33.440 |
Yeah, and I think that's going to be a big part of this breakdown after this happens. Because 00:21:38.000 |
in a lot of these cases, you might lose a home or two, but you haven't had this kind of wholesale 00:21:43.280 |
destruction in a while. And when I lived in Brentwood, I had a shake roof. That's a fancy 00:21:48.240 |
way of saying shingles, wood shingles, and they would bake in the sun. And I love this roof. But 00:21:53.760 |
my neighbors who in Brentwood were all 70, 80 years old, and I was right on Sunset Boulevard, 00:21:57.680 |
and I could look up from my house and see the place you just showed, which is the Getty Center 00:22:02.080 |
and the Sepulveda Pass on the 405 and Sunset Boulevard. I was only allowed, Cyan, to 00:22:08.720 |
replace 30% of my roof at a time. You couldn't replace it and put shake roofs on, you could only 00:22:13.200 |
like maintain it. Because in 1961, there was the Bel Air and Brentwood fires. And these fires, 00:22:20.320 |
you want to talk about living memory, Chamath? In this one, Zsa Zsa Gabor and tons of celebrities 00:22:25.680 |
lost their homes as well. This one was spread by the Santa Ana winds, and somebody was 00:22:31.200 |
just burning a rubbish pile. I think it was some construction workers who were burning that. 00:22:34.160 |
The neighbors said to me, do you know about the Bel Air fire? You know about the Brentwood 00:22:38.640 |
fire? You got to get rid of that shake roof, you got to get rid of that shake roof. When my daughter 00:22:41.760 |
was born, the roofer said to me, let's put composites on. I put composites on, and he said, 00:22:47.360 |
what do you want to do with the sprinkler system? And I said, there's a sprinkler system in my 00:22:50.320 |
little one-story ranch house? He said, yeah. I said, I've never seen it. And he showed it to me; 00:22:54.560 |
it was on the roof. People were so scared after that '61 fire, they were putting these on the roof. 00:22:59.920 |
And now wood roofs have been banned. You were grandfathered in; I was part of that. 00:23:04.400 |
But there was a lot of PTSD from that. And now, I do think there's gonna have to be some lessons 00:23:10.400 |
learned. And let's get to where some folks online are pointing to maybe not having great priorities, 00:23:18.320 |
and maybe focusing on things that are not as important as the taxpaying citizens. A lot of 00:23:25.360 |
tweets, I don't know how people feel about them, about DEI, about who's running the fire department, 00:23:30.800 |
etc. Did you have any thoughts on that? Friedberg? I mean, look, one of the things I wanted 00:23:36.400 |
to talk about was the role of the Department of Insurance in what I think will ultimately 00:23:42.480 |
be creating a pretty significant economic consequence here from this sort of an event. 00:23:49.440 |
I don't but I'll answer your question. Okay. I don't think that the mission of any public 00:23:57.440 |
service organization should be to meet DEI metrics. I think the mission of that public 00:24:04.960 |
service organization should be to serve the public. And I think that those DEI metrics 00:24:10.720 |
should not be a priority when serving the public is the objective. The best ability to serve the 00:24:18.400 |
public should be the objective. And that's it. And I'll state that really clearly. So obviously, 00:24:25.200 |
the fire chief in LA is getting a lot of attention, whether or not that prioritization of DEI metrics 00:24:33.520 |
took away from the interest and the focus in preparing for major disasters. I don't know, 00:24:38.640 |
there have been some interviews over the last day or two, just to be fair, where she has claimed that 00:24:43.200 |
they asked for more money, that they would not be able to be prepared for major disasters if the 00:24:48.240 |
budget cut took place that was proposed by Bass. That budget cut did take place. And so the fire 00:24:54.480 |
chief has said that she asked for budget to make the preparation for this sort of an event, 00:24:58.560 |
and she lost it. And so I don't want to just say, hey, she's to blame, she's to blame because she 00:25:02.880 |
was focused on DEI. But I will separately say that I think that creating DEI as a mission 00:25:07.200 |
for an organization that's supposed to serve the public interest makes no sense. 00:25:10.480 |
This is an important one. James Woods, obviously the famous actor who lost his home in Pacific 00:25:16.800 |
Palisades, has been going on a bit of a rant about Kristin Crowley. She is LA's fire chief. She 00:25:23.520 |
also happens to be a lesbian and has made it a priority, and done a number of talks on trying 00:25:30.320 |
to increase diversity inside of the fire department. She also, with just a bit of research, 00:25:36.400 |
is one of the top-performing firefighters: a paramedic, an engineer, a fire inspector, a captain, 00:25:43.360 |
a battalion chief, an assistant chief, fire marshal, deputy chief. And when she took the firefighter 00:25:48.000 |
exam in the late 90s, she scored in the top 50 out of 16,000. She seems eminently qualified. 00:25:52.400 |
There has been a massive pile on attack on her. And you know how it is on x and other social 00:25:59.040 |
networks where people are really tweaked about DEI that they're kind of putting the blame on her. 00:26:05.280 |
What are your thoughts on this DEI angle, Chamath? I don't think this is to blame. Okay. If all of a 00:26:11.040 |
sudden, because of DEI, 70% were physically incapable of carrying out the task, and that's 00:26:18.720 |
why these fires grew, maybe you could make the claim that it is a DEI problem. I do agree with 00:26:26.000 |
Friedberg, but the thing that these public institutions need to do a better job of is 00:26:33.920 |
being very clear about what their North Star is. I think the North Star for the fire department 00:26:39.360 |
is to save people's lives and put out fires. I think the North Star for the police service 00:26:45.120 |
should be to save people's lives and to hold criminals responsible and get them off the 00:26:49.360 |
streets. And you should hire the people that allow you to do that job the best. The thing 00:26:55.200 |
to keep in mind is that there were probably 20 or 30 people interviewed to be fire chief. 00:27:01.520 |
It's not her fault that she was selected. The real question is, what was she mandated to focus 00:27:08.640 |
on once she got the job? And I think what you see in all of these interviews is I don't think 00:27:13.760 |
that she, all of a sudden after growing up through the fire service, had this DEI bent. I think 00:27:18.400 |
typically what happens is it becomes an institutional directive. It guides your compensation, 00:27:24.960 |
it guides your recognition. And so you do it. It's sort of what Charlie Munger says, 00:27:29.520 |
show me the incentive and I'll show you the outcome. The entire public service is riddled 00:27:34.560 |
with this. The entire private service is riddled with this, which is that we've lost the script 00:27:40.320 |
about what is important. So it's yet another example. She's probably quite a capable person 00:27:46.320 |
who, if just allowed to focus on fighting fires and saving people's lives, would probably 00:27:51.760 |
do a good job. But if you had to add all these other things that are not germane to that task, 00:27:58.160 |
then people will get frustrated and project onto her. 00:28:00.960 |
It seems like a lot of projecting going on here, Chamath. I agree. 00:28:04.160 |
All of that said, though, I think you got to go back to how did these fires start? 00:28:08.720 |
How did they grow out of control? And again, I think that these winds didn't come out of nowhere 00:28:14.960 |
in the sense that they caught everybody off guard. This has happened before. That area has gone 00:28:20.400 |
through this exact moment. Yes, there were laws that were proposed; they were vetoed. Okay. So 00:28:28.000 |
even if you couldn't have controlled it, you see certain developers like Rick Caruso who 00:28:34.160 |
were able to protect the buildings that he was responsible for because he took proactive and 00:28:40.320 |
protective measures. Could those proactive and protective measures not be taken more broadly 00:28:45.360 |
through L.A. County? Of course they could have. Why were they not? 00:28:50.000 |
And here what we're seeing on the screen is Rick Caruso's village. 00:28:55.760 |
How much money, and we know the answer to this, how much money did the government of California 00:29:01.440 |
spend poorly, as it turns out, on homelessness? It was about $21 billion. And on illegal immigrants? 00:29:09.440 |
I don't know what the final number is there, but I suspect in the tens of billions. 00:29:12.320 |
If you reappropriated those dollars to these kinds of protective mechanisms in these areas, 00:29:18.640 |
what would the outcome have been? Maybe there still would have been a fire. Maybe there would 00:29:22.080 |
have been damage. But it's hard for me to believe it would have been as bad as it is right now. 00:29:26.400 |
Yeah, I think what you're getting to here is we can confirm lesbians didn't cause the Santa Ana 00:29:32.960 |
winds that caused these fires, obviously. But there is an issue that I think many people in the public, 00:29:39.440 |
especially in California, who voted for this very leftist liberal ideology are now starting to 00:29:45.360 |
realize is, "Hey, wait a second. What are the priorities here, Cyan? What are we focused on, 00:29:51.040 |
and what should we be focused on?" And it's very easy to be focused on DEI and maybe things that 00:29:57.680 |
aren't as important, homelessness, and move budget there. But at the same time, they wouldn't give 00:30:02.880 |
her $17 million. They cut the fire budget. She tried to fight it. >> Well, that's not clear. Now, 00:30:08.000 |
the counter-narrative is that she actually got an extra 50, Jason. 00:30:12.400 |
Okay, so we're in a breaking news environment, so we'll see what the truth winds up being here. 00:30:18.880 |
But Cyan, I think the point remains the same here, which is, 00:30:21.200 |
is prioritization and what we focus on out of whack in California? 00:30:27.040 |
Oh, without a doubt. I think diversity is good unless that's all you have. And I'll just simplify 00:30:33.200 |
it like that. And I think it's very sad that somebody could be very qualified and be in a 00:30:38.320 |
position, and we now have to question whether or not they were hired because of DEI. And then it 00:30:44.080 |
comes down to prioritization. When you're dealing with an organization like a fire department whose 00:30:49.840 |
main job is to protect the public and put out fires and save people, any amount of time, as we 00:30:54.880 |
know, is a valuable, precious resource that's being spent trying to roll out these programs. 00:31:00.160 |
It goes beyond just who you hire. It's even the thought police of how you think. 00:31:08.640 |
It's so pervasive within an organization that you die from the bureaucracy of it. 00:31:14.000 |
And if anything went wrong with DEI, it was that they didn't have their eye on the prize of fighting 00:31:21.280 |
fires. And instead, they're focusing on something that truly doesn't matter. So you can be as 00:31:25.840 |
diverse as you want to be and not be able to put out a fire, and then it just really doesn't matter, 00:31:29.680 |
right? Because you're not training people. You're not spending money on things that matter. You're 00:31:33.760 |
not having the discussions that matter. And that's where I think it falls apart, even though it 00:31:40.960 |
has a place. But I go back to what Chamath said, though. It really comes down to prevention 00:31:46.320 |
and learning from our past. We seem to have a very short-term memory, and we forget very quickly 00:31:51.120 |
because we rebuild and it looks pretty again, and everybody forgets. And we just don't have the 00:31:56.720 |
ability as a society really to think long anymore. And that's a real problem. And I think we should 00:32:03.440 |
learn from this fire. I really hope that what comes out of this is a shift in political leanings 00:32:09.840 |
in this state. I think more moderates are going to come to their senses, as we've seen with the 00:32:16.640 |
election and the outcome. And I think the state might shift some, and we might actually get some 00:32:20.960 |
policies that work. >> You're so right. You're so right. I mean, when are we going to get tired 00:32:25.680 |
of all this late-stage progressivism? It's this litany of excuses. The people that are in 00:32:31.760 |
charge have failed us yet again. >> Exactly. >> We have wasted so much money on so many things 00:32:38.640 |
that don't move the needle. And then the things that they needed to do, they didn't do. And then 00:32:45.440 |
they point the finger at climate change. It's a joke. >> At a certain point, you have to wonder, 00:32:50.160 |
are we using politics for its purpose, which is to make people's lives better and to have a 00:32:57.280 |
high-functioning society? Or is it a way to virtue signal or to share your opinions on things? 00:33:02.480 |
>> Oh, it's absolutely a virtue signal. >> Yeah. And I think what people are starting 00:33:04.720 |
to realize is, you know, in an acute situation, whether it's our budget deficit, whether it's 00:33:09.360 |
schools, whether it's safety from climate, you know, or non-climate-induced disasters, 00:33:14.880 |
you do need to have competence. And this is the thing: Rick Caruso is such a competent executive 00:33:19.840 |
that when he ran for office there, the fact that he didn't get that job is just absolutely crazy. 00:33:25.520 |
And you saw the mayor come in, and she wouldn't even address, she wouldn't answer any questions 00:33:31.680 |
from the press, not even thoughts and prayers or, you know, "We're thinking of this," or "We're 00:33:35.760 |
going to get it done." It just seems like we're hiring non-executives to work in 00:33:39.840 |
functions that should be filled by high-performing executives. This is an operational role. 00:35:45.200 |
>> Let me maybe bring something that ties these three things together, but it builds on 00:33:50.080 |
critically on what Cyan said. There are so many people here that are good, hardworking people 00:33:56.400 |
that lost their homes. For many of these folks, it could be the single most 00:34:03.760 |
financially important asset that they have. For other people, those that are of family age, 00:34:12.480 |
they have kids who, beyond the financial damage, are now totally displaced. Where will these folks 00:34:17.760 |
go? There was a commentary by Adam Carolla where he said, "The real test," 00:34:25.040 |
to Cyan's point, "will be how they internalize and metabolize this, because it now affects them 00:34:32.480 |
personally, and they have to go and wait three years for building permits to rebuild." 00:34:36.640 |
Now, that's assuming that they can even get a reasonable amount of insurance coverage, 00:34:42.000 |
which touches Freeberg's point. This is the real tragedy. That is the actual tragedy multiplied 00:34:47.920 |
by 120,000 or 200,000 families. The real question is how much of that was completely avoidable. 00:34:56.000 |
I think there is a reasonable amount of it that could have been. That's what really sucks, 00:35:00.960 |
and that's where you cannot take your finger off the scale and forget. 00:35:04.960 |
>> Yeah. When it lands on your doorstep, quite literally here, they are not going to be able, 00:35:10.880 |
having been in this exact area, I can tell you, when you try to pull a permit to do anything, 00:35:16.080 |
as I was explaining with my roof, the regulations are deep and expensive and time consuming. 00:35:23.840 |
I don't believe... We talked about the California Coastal Commission on a recent episode, 00:35:28.160 |
Freeberg. What are the chances that the California Coastal Commission even allows 00:35:32.320 |
these people to build those homes in those locations on PCH, Freeberg? 00:35:37.280 |
>> I was talking to Chamath about this earlier today because the California Coastal Commission 00:35:41.280 |
was created by the voters directly in 1976, and that commission has authority that exceeds 00:35:49.760 |
legislative action. So you would have to basically go back... My understanding is you'd have to go 00:35:55.840 |
back to a state vote to rescind the powers of the California Coastal Commission. So they have 00:36:00.160 |
effectively complete authority over deciding what does or doesn't get built on the coast because 00:36:06.160 |
their objective is to preserve the coast for the use of the community and restore it to its natural 00:36:10.320 |
habitat. So anytime there's a request or a permit request, it can take two decades, three decades 00:36:15.120 |
sometimes to get anything approved if they ever approve it at all. And so the California Coastal 00:36:20.320 |
Commission, any property that touches the beachfront in California, they have this kind of 00:36:26.720 |
God-level authority over, and they're basically all political appointees that sit on the commission. 00:36:31.600 |
>> To my question, Freeberg, what are the chances they allow the millionaires on Pacific Coast 00:36:38.000 |
Highway in Malibu to rebuild those homes, or do you think they slow roll it and those people are 00:36:42.880 |
all 50, 60, 70 years old? They'll never be able to rebuild their homes. The California Coastal 00:36:47.120 |
Commission could just slow roll this and say, "You know what? Nature returned it to its natural 00:36:51.120 |
state." >> I think we should talk about insurance. This is a great segue. 00:36:53.760 |
>> Yeah, yeah, the perfect segue. This is the key point I wanted to say about insurance. So 00:36:57.840 |
going forward, yeah. >> All of this property that sits in climate-sensitive zones or weather-sensitive 00:37:04.560 |
zones, whatever you want to call it, like we've talked about on the coast of California, of 00:37:08.400 |
Florida, in hurricane centers, in tornado centers, where the frequency of loss is going up, they're 00:37:15.280 |
priced as if the frequency of loss is what it used to be, which is like, let's say you buy a home 00:37:21.440 |
for a million dollars, and the probability of your home getting wiped out by a natural disaster 00:37:26.320 |
is a one-in-a-thousand-year kind of situation. So you have a one-in-a-thousand chance of your 00:37:31.440 |
home getting wiped out each year. So your price for insurance on that million-dollar home should 00:37:34.800 |
be about $1,000 a year, one-tenth of 1%. So $1,000 a year for a million-dollar home sounds 00:37:40.480 |
expensive, but it is what you have to pay for homeowners insurance. But now let's say that the 00:37:44.720 |
probability shifts to one-in-20 years. So now you've got a one-in-20-year probability of your 00:37:49.520 |
home getting wiped out. Are you going to pay 5% of your home value? No. And if you have a $10 00:37:57.840 |
million home, are you going to pay $500,000 a year for property insurance? No. Now what's happened 00:38:04.320 |
is the insurance companies have these models. They're called CAT models, or catastrophe models. 00:38:09.120 |
It used to be two companies. One was called RMS. The other one was called EQECAT. And I used to 00:38:12.880 |
work in this business, so I know it pretty well. And then all the companies started building in-house 00:38:16.640 |
models, and now there's startups that make models. And these models have shown that there are 00:38:20.160 |
increased probability of complete loss in a region because of the increased probability of these 00:38:24.560 |
crazy weather events happening. And so the price of insurance should go up. Here's the problem. 00:38:29.040 |
There are 50 state insurance commissioners in the U.S. In order to sell insurance in a state, 00:38:34.160 |
you have to have the insurance carrier and the policy approved in that state. And the states 00:38:39.280 |
determine what rate or what price you can charge for insurance. So the state insurance commissions 00:38:45.120 |
have a couple of goals. Number one is to keep all the insurance companies solvent. So they want to 00:38:49.840 |
check the financials of all the insurance companies, make sure they're not writing too 00:38:52.400 |
many policies that they won't be able to pay out. The second thing is they want to make sure that 00:38:56.640 |
the insurance companies aren't ripping consumers off. So they have control over the rates, 00:39:01.840 |
and they don't want the rates to go up too much in any given year. So they're controlling rates 00:39:06.720 |
and keeping them down. And then the third is they're supposed to make sure that consumers 00:39:10.240 |
have access to insurance. And the third is a very hard thing to do if you're trying to keep 00:39:14.880 |
companies solvent, so you can't write too many policies, and you're saying, "Hey, you can't raise 00:39:18.800 |
prices." And meanwhile, the probability of loss has gone up. So the insurance carriers are like, 00:39:22.960 |
"What choice do I have?" So earlier this year, a state farm pulled out of Palisades. They stopped 00:39:27.760 |
writing fire insurance at Palisades. They canceled 1,600 policies in the exact neighborhood that just 00:39:33.360 |
burnt down. - What about the timing of that freeburn? That was three months before? Six months 00:39:36.880 |
before this happened? - By the way, it's not just that. - I think it was like six months before. 00:39:39.360 |
- Yeah, but it seems crazy. But as you know, in Tahoe, a lot of the policies have been canceled. 00:39:45.120 |
- Yes, so it's just crazy timing. It's a crazy coincidence. - And remember, in wine country, 00:39:50.400 |
we had a lot of wipeouts. All of Santa Rosa was burnt out a few years ago. You guys remember that? 00:39:54.960 |
And so they started pulling out of there. So a lot of the carriers are generally pulling 00:39:58.880 |
out of California because when they go up to the DOI and they're like, "Hey, we need to raise rates 00:40:02.320 |
by... We need to double the price of insurance. We need to triple the price of insurance." This 00:40:05.760 |
is now a one-in-20-year event. The Department of Insurance says, "No, no, no. We're not going 00:40:09.760 |
to let you charge that much to consumers." And then the carrier's like, "Okay, we got no choice," 00:40:13.840 |
and they exit the market. Here you can see right here, 1,600 policies canceled. This has been a 00:40:18.720 |
big driver is the Department of Insurance has made it very difficult to find this free market outcome. 00:40:23.680 |
But at the end of the day, one of three parties is going to end up eating the cost of the change 00:40:30.320 |
in probability of loss that has occurred. It's either the homeowner because they're going to end 00:40:34.160 |
up losing the value of their home in a loss, or they're going to end up needing to write down 00:40:37.200 |
the value of their home when they sell it to someone who will take on that risk, which means 00:40:40.480 |
the price has to come down. Or number two is the insurers, and there's not enough insurance capital 00:40:45.280 |
out there to cover all these losses, so all these insurers would go bankrupt. Or the third is the 00:40:49.280 |
taxpayer. One of those three is going to end up eating the loss that's about to happen. 00:40:52.720 |
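Freeberg's expected-loss pricing reduces to a one-line calculation. Here's a minimal sketch using the illustrative figures from the discussion; the function and its names are ours for illustration, not a real catastrophe (CAT) model:

```python
# Expected-loss view of a homeowners premium: a "fair" annual price is
# roughly home value x annual probability of total loss, before expenses,
# reinsurance, and profit margin are layered on.

def fair_premium(home_value: float, annual_loss_probability: float) -> float:
    """Expected annual loss assuming a total-loss event."""
    return home_value * annual_loss_probability

# One-in-a-thousand-year risk on a $1M home: 0.1% of value, about $1,000/year.
print(fair_premium(1_000_000, 1 / 1_000))   # 1000.0

# If the risk shifts to one-in-20 years, the same math gives 5% of value:
# $50,000/year on a $1M home, $500,000/year on a $10M home. That is the
# rate level state regulators won't approve, so carriers exit instead.
print(fair_premium(10_000_000, 1 / 20))     # 500000.0
```

The point of the sketch is that the premium scales linearly with loss probability, so a 50x jump in frequency implies a 50x jump in the actuarially fair rate.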
No, you know the answer. You know the answer. 00:40:54.800 |
Taxpayer. Yeah, somebody's going to lobby somebody. But hey, we're sitting here, 00:40:58.880 |
Chamath, in the age of Doge and saying, "Hey, let's make the government smaller." 00:41:03.840 |
In fact, Dave, you and I were talking about, at some point, Gangs of New York and the fire 00:41:08.080 |
departments being, you know, crazy timing that we were talking about that two or three weeks 00:41:13.280 |
before this happened. But, you know, when we look at making government smaller, well, that means 00:41:18.160 |
that these kind of situations would put citizens more on their own. So let's counterbalance what 00:41:23.600 |
you think, Chamath, about who should be responsible. We all espouse, I think, free market ideology on 00:41:29.520 |
this program and as executives and in what we do every day. Should the people who own these homes, 00:41:35.600 |
going forward, who decide to rebuild them here, have to pay, you know, 5-10% of the value of their 00:41:41.520 |
home every year? Should their home prices collapse because it's too hard to build there? And should 00:41:45.600 |
the free market take over this risk? Or should it constantly be put on the other 329 million 00:41:50.720 |
Americans who are going to have to bear the brunt of what happens to the million people affected in 00:41:54.960 |
Well, I mean, 'should' is a very strong word. The cap on the insurance reimbursement is about 00:42:01.680 |
$3 million, is my understanding. David, you can tell me if I'm wrong, but I think that's right. 00:42:05.680 |
The houses in the Palisades are anywhere from, call it, 1 million on the low end to maybe 40 00:42:14.240 |
Yeah, that's what I was about to say. There's nothing for a million these days. Yeah, 00:42:18.400 |
Right. But the median is probably more instructive, which is probably seven or eight million. So 00:42:22.880 |
my point is that folks will get less than half their home value back. They're going to have to 00:42:28.240 |
come up with some amount of money to then rebuild. But the cost of rebuilding a 7,000-square-foot 00:42:33.840 |
house in the Pacific Palisades is probably at least $1,000 a square foot. So that's $7 million of rebuilding cost. 00:42:39.680 |
So now all of a sudden, these people have to come up with a lot of money. 00:42:44.960 |
And that's post-tax money. So you might as well double it because California is just so 00:42:49.920 |
egregiously burdensome in terms of taxes. So the individual homeowner is not going to be in a 00:42:55.600 |
position to rebuild. I think that the liabilities of the insurance claims are going to be so massive 00:43:05.120 |
that the state's going to look to the federal government to bail them out. 00:43:08.480 |
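Chamath's rebuild math can be roughed out the same way. A back-of-envelope sketch with the figures cited in the discussion; the 50% effective tax rate is an assumption standing in for his "might as well double it" point:

```python
# Back-of-envelope rebuild gap for a Palisades home, using figures from
# the discussion. The tax rate is an assumed stand-in, not a quoted number.

REBUILD_COST_PER_SQFT = 1_000   # "at least $1,000 a square foot"
INSURANCE_CAP = 3_000_000       # rough cap on insurance reimbursement cited
EFFECTIVE_TAX_RATE = 0.50       # assumption: "might as well double it"

def out_of_pocket(square_feet: int) -> int:
    """Post-tax dollars a homeowner must find beyond the insurance payout."""
    rebuild_cost = square_feet * REBUILD_COST_PER_SQFT
    return max(rebuild_cost - INSURANCE_CAP, 0)

def pretax_income_needed(square_feet: int) -> float:
    """Gross income needed to net the out-of-pocket amount after taxes."""
    return out_of_pocket(square_feet) / (1 - EFFECTIVE_TAX_RATE)

# A 7,000 sq ft house: $7M rebuild, $3M insured -> $4M out of pocket,
# which takes roughly $8M of pre-tax income at the assumed rate.
print(out_of_pocket(7_000))          # 4000000
print(pretax_income_needed(7_000))   # 8000000.0
```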
My parents just got evacuated. I got to call them and just there's a new fire. 00:43:14.960 |
Literally, right. There's a new fire called Kenneth Fire. It just took off. 00:43:18.880 |
And it's at their house. So just give me a minute, I'll be back. 00:43:24.880 |
We're talking about, hey, maybe less government. Hey, maybe spending less. 00:43:30.080 |
Now the same group of people, maybe who were saying, hey, we need to spend less and 00:43:33.840 |
reduce the size of government are saying, hey, well, why isn't California more prepared? Well, 00:43:37.760 |
being prepared obviously means more money and more taxes. So you have now these two competing 00:43:44.400 |
ideologies here. But to the question of who's responsible, it is economically going to make 00:43:49.600 |
no sense to rebuild unless you can get that insurance. It is a coveted place to live. 00:43:54.640 |
But because of the construction costs have gone absolutely parabolic in California, 00:43:59.760 |
because of regulations, you're talking about $14 million in income to build a $7 million house. 00:44:04.880 |
And maybe you're just better off selling the lot for $1 million and letting it be somebody else's 00:44:10.080 |
problem going forward and just taking the two or three or $4 million loss. Who should pay for 00:44:15.840 |
on a go forward basis, underwriting these homes? 00:44:20.000 |
Yeah, I mean, a lot of these people paid, I was reading stories, for 30-some-odd years 00:44:24.720 |
into insurance thinking that, you know, their house wouldn't burn down. And then of course, 00:44:28.560 |
it gets canceled two weeks before their house burns down. And then the one time they need it, 00:44:32.000 |
they don't have it. And part of this is, I mean, a huge part of it is what Friedberg was talking 00:44:38.160 |
about are the regulators. And so the free market solution is the only solution. If you look at, 00:44:42.800 |
I have an investment in a company called Kin Insurance, and they specialize in direct to 00:44:47.200 |
consumer insurance for areas that are plagued with natural disasters. So their number one 00:44:53.600 |
state out of the 11 that they serve is Florida, followed by Texas, which, you know, has tornadoes 00:44:59.040 |
and things like that. And how they're able to get into these places and do insurance is the pricing 00:45:04.480 |
is according to, you know, the construction of your home and all of these various things. And 00:45:08.560 |
also weather models and using data science, things that are not allowed in California, 00:45:13.040 |
believe it or not. So you're not allowed to use a weather model to inform, 00:45:18.000 |
you know, your pricing decisions for insurance in the state. And that just doesn't make a lot of 00:45:22.480 |
sense. You know, you should be rewarded if you put the resources and time into your home to make it 00:45:29.200 |
weatherproof, fireproof, I mean, you know, even earthquake-resistant, right? 00:45:34.880 |
This is more regulations that were layered on here to try to create equality, you know, 00:45:40.800 |
and the fact is, it's now working against the system. In Tahoe, to your point, they gave us 00:45:48.640 |
explicit instructions around homes: put stone and pebbles around your home, cut the trees and bushes 00:45:54.320 |
down around your homes, do this over here, you know, for your premium. And if you 00:45:59.760 |
do that, it might cost $10,000 a home, but it would keep these fires from jumping 00:46:05.600 |
from one house to the other in most fire situations. Freeberg, you're back, is everything okay? 00:46:09.840 |
It's not okay. Listening to your, you know, 70-something-year-old parents evacuate their 00:46:16.800 |
home and try and pack their cars with all their stuff in a matter of minutes while a fire creeps 00:46:22.240 |
on their home is a pretty devastating thing to listen to. Yeah, what are they saying? 00:46:26.800 |
They're trying to get out of the house, they're throwing everything in the car, 00:46:29.200 |
there's a vacuum, it's like, if I'm looking at the video right now, the fire is like right by 00:46:33.120 |
their house, it's insane, it's literally like blocks away from their house. God. 00:46:37.920 |
This is nuts, this is the house I grew up in, in LA. I'm so sorry, man. Gosh. It's blocks away, 00:46:44.240 |
and I'm like, you know, what do you say to them? Like, throw all the photo albums in the car is 00:46:48.240 |
what I said, throw the photos, like just grab the framed photos, my mom's trying to grab all her 00:46:51.760 |
life. That's the number one thing that everybody misses. And it's mentioned in every interview 00:46:57.280 |
that I've seen is photographs. I'm like, grab all the photos, grab all the albums. And she's like, 00:47:02.800 |
you know, she's grabbing her jewelry and stuff. And I'm like, grab the photos, like, 00:47:05.920 |
we're the last generation that will be thinking about this issue of grabbing the photos. Yeah, 00:47:10.560 |
it's fascinating. I just want to say like, you know, as we wrap up this segment, you know, 00:47:14.160 |
obviously, we're thinking about everybody there. This is complex. This is not the fault of a 00:47:19.120 |
lesbian firefighter or the Ukraine or any of these other issues. This is about leadership, and nature, and 00:47:25.280 |
preparedness. So there are big issues around climate change; whether you want to believe or don't 00:47:29.680 |
want to believe, fine, put that aside. But I can tell you that when I saw Karen Bass 00:47:33.840 |
get off that flight... Play the clip, Nick, of Karen Bass here, because it's a short enough 00:47:39.520 |
clip that we can play it here for the audience. I'm assuming you all three of you saw this clip 00:47:44.480 |
of her being absolutely unwilling to answer a single goddamn question about what's going on. 00:47:51.520 |
This is the opposite of leadership. Just 10 seconds of this being absent while their homes 00:47:59.120 |
were burning. Do you regret cutting the fire department budget by millions of dollars? Madam 00:48:04.640 |
Mayor? Have you nothing to say today? Have you absolutely nothing to say to the citizens today? 00:48:14.160 |
Disgraceful. Shock? I mean, I have zero sympathy. You took the leadership job. I don't 00:48:21.040 |
give a f*** if you're in shock. You're a leader. You sold yourself as the leader that was 00:48:26.480 |
going to serve these people, and you don't have the dignity or the honor to just answer 00:48:31.920 |
the questions. That is absolutely the worst leadership I've ever seen under fire. 00:48:39.760 |
Let me ask you guys a question. Disgraceful. What do we do? You fire them all and you vote 00:48:45.360 |
for Rick Caruso. You vote for executives who know what they're doing and know what to do in a crisis 00:48:51.040 |
because they've been under fire before because they've run a business before because they've 00:48:54.880 |
seen s*** hit the fan. This person, I don't know her. I don't know her history, but I'll be totally 00:49:00.320 |
honest, like I wouldn't trust her literally to pick up my lunch if she can't answer one or two 00:49:06.240 |
goddamn questions and give a placating answer to a reporter. Hey, it's an intense situation. 00:49:11.040 |
We're working as hard as we can. She can't even say two goddamn words to the people who voted her 00:49:16.960 |
in. And for anybody who voted for this level of incompetence, this reminds me of exactly what we went 00:49:21.040 |
through in San Francisco. And I was living there in the Bay Area when you put someone like Chesa 00:49:24.720 |
Boudin in or London Breed or this entire clown car, Aaron Peskin, all these disgraceful, 00:49:31.680 |
disgraceful Marxist lunatics who would rather virtue signal dopey Dean Preston, the whole lot 00:49:39.120 |
of them, you vote them out and you vote in executives. And it doesn't mean a Republican 00:49:43.200 |
executive. It doesn't mean a Democrat executive. It means an executive who's run something in their 00:49:47.600 |
life before, whether it's Bloomberg, whether it's Trump, whether it's Rick Caruso, it doesn't matter 00:49:52.240 |
their ideology. It matters their effectiveness. And if you vote for ineffective people, you're 00:49:56.560 |
going to get situations like this over and over and over again. So use your brains and vote for 00:50:01.520 |
executives who've done something in the world. This is why I've changed my position on rooting 00:50:06.240 |
for Trump. Now, I was a Never Trumper, everybody knows that, but he put executives around him this 00:50:11.200 |
time round. And I am rooting for those executives to do what's right for the American people and 00:50:15.760 |
solve big problems, not make them worse. It's infuriating. Chamath, what do you think? 00:50:21.680 |
I think we need to have a wholesale replacement of the people that govern the state of California. 00:50:28.640 |
It's just not working. And I think that the citizens that live in California need to do 00:50:35.680 |
some real soul searching. It is beyond party politics. So I think what has happened in 00:50:43.280 |
California is people vote for whatever vessel has the name Democrat beside their name or Republican 00:50:50.320 |
beside their name. And I think that you have to go back to first principles and do a better job 00:50:56.880 |
of picking the people to represent us because the people that are in positions of power 00:51:01.040 |
just don't fundamentally know what they're doing. They're not capable. 00:51:05.520 |
And the fact that then what we have to deal with are sort of lies and distractions 00:51:13.120 |
to excuse incompetence, I think is unacceptable. I think we pay way too high of a price. 00:51:20.560 |
And like I said, you are now dealing with hundreds of thousands of families whose 00:51:26.000 |
entire lives have been totally disrupted and ripped away. And I hope that we learn something 00:51:33.200 |
from this because we didn't learn from it eight years ago. And we clearly didn't learn from it 00:51:39.360 |
when a different natural disaster hit North Carolina. Will we find out that folks said, 00:51:44.160 |
"Hey, guys, is there an outlier natural disaster event possible? Obviously, it's not going to be the same 00:51:48.960 |
thing as in North Carolina, but could a different form of something happen here? What could it be? 00:51:53.520 |
Are we prepared? I'm sure we'll find out that they didn't do that. Maybe they had different 00:51:58.320 |
meetings and they were all about other total distractions or things that just didn't matter. 00:52:05.440 |
So this is what we need to do. We as a populace in this state need a reset. Otherwise, we deserve 00:52:12.400 |
what we get. Bingo. Cyan, you agree? Yeah, I think Democrats need to reclaim their party. 00:52:19.440 |
I think there's a lot more strength in the middle. And, you know, they've let this woke ideology, 00:52:26.960 |
I call it woke imperialism, like a religion, take over in place of actually doing things that matter 00:52:36.960 |
to the people that elected them, that pay taxes, that pay their, you know, their paychecks and 00:52:42.080 |
everything in between. And it's time that people really look in the mirror. I've got so many 00:52:45.680 |
moderates coming to me saying, you know, people call me a Republican and I'm far right and I'm 00:52:50.880 |
a Nazi. And I'm like, yeah, welcome to the club. You know, it's at some point you've got to stop 00:52:55.520 |
letting them run the board and stand up and say, you know, enough's enough. You know, we're not 00:53:01.040 |
building some railway that's never getting built. We're not solving homelessness with billions and 00:53:05.680 |
billions of dollars. We're not doing this stuff anymore. You know, we do need real executives, 00:53:10.960 |
to your point, Jason, you know, to run things, people who understand how things work 00:53:16.320 |
and, you know, the best use of funds. Because right now it's misappropriated. 00:53:22.480 |
- It's a crisis of competence. I mean, I think we all see it. These are incompetent people. 00:53:27.200 |
- By the way, it's not just the leadership, it's also legislative action that's gonna be needed 00:53:33.680 |
to fix a lot of the policies, the regulations, the way infrastructure operates in the state. 00:53:39.760 |
And that requires three things to change. Number one is the California State Assembly. Number two 00:53:46.000 |
is the California State Senate. And number three is to put things in front of the voters that they 00:53:51.360 |
can vote on to make the wholesale change needed to rescind some of the bad decisions that were 00:53:56.000 |
made over the last three decades in the state that has led us to this point. And I think that 00:54:00.160 |
it's gonna require, just like what happened recently in the national politics, a state 00:54:05.680 |
politics organizational effort to say, let's take a look at the composition of the state assembly, 00:54:11.280 |
the state senate, and what are some of the votes that need to be done by the citizens to make the 00:54:16.640 |
necessary changes in the state to try and get the world's fifth largest economy to start acting and 00:54:22.320 |
looking like it. Because right now it's sort of like a weirdly disabled third world country type 00:54:27.840 |
operation with the wealthiest resources on planet earth. And it seems pretty f***ed up. It's almost 00:54:34.000 |
like once people have it all, that's when they want to give it all up. That seems to be the 00:54:38.000 |
moment that this state has just passed. Now maybe it's time to go reclaim it and build it back. 00:54:42.480 |
Well said. I mean, and as we said, in this segment, there are so many common sense, 00:54:48.320 |
tactical, strategic things that these people could be doing that they should be doing that they're 00:54:55.280 |
not. And there needs to be a full blown investigation. You kind of alluded to this 00:54:59.520 |
earlier Chamath. But if there is, if this is dereliction of duty, then we need to look into 00:55:05.280 |
this in a very deep fashion. And to the people of California, you have more power than you know. 00:55:09.280 |
My friend who used to be on this podcast once in a while, he and I collaborated on Chesa Boudin 00:55:15.120 |
being taken out as the DA in the Bay Area. I know some other people here were involved in it as well. 00:55:21.120 |
And you can recall somebody. So recall these incompetent lunatics, 00:55:27.120 |
recall them and replace them. It's scary, but you can. You know, they send all their people after 00:55:33.120 |
you. They threaten you. It's personal. They went after you. I was signature number one. And I had 00:55:38.240 |
to deal with the deluge of that stuff. But to be honest with you, I've never been happier to do 00:55:42.880 |
something and get civically engaged. I think it's so important that everybody starts getting 00:55:46.720 |
involved in their local government and their state government and the national government, 00:55:51.280 |
because you can't just expect people to do the work for you and expect it to turn out well. 00:55:56.560 |
And I think that's kind of the mistake we all made. We want to take some responsibility. The 00:56:01.920 |
tech industry as a whole did not get as involved as we ought to have in the past. And I think we 00:56:07.200 |
should get more involved. Why was it? Why? Why did for 20 years while we were all in the Bay Area or 00:56:12.640 |
other people, you know, we just were too busy building companies, and it didn't seem important. 00:56:17.040 |
And I remember if I remember correctly, the only person I remember getting involved in local stuff 00:56:23.280 |
was Ron Conway. Yes. And he would try to get everybody involved. And we were all just like, 00:56:28.640 |
you know, there's people who are smart that do that sort of thing. And they're going to do their 00:56:32.560 |
thing and they're running stuff. And we're just not going to get involved. And a lot of people 00:56:36.240 |
would say, I'm not political. I don't, I don't do politics. I don't, you know, they didn't get 00:56:41.920 |
involved until it affected them. Kind of like the houses burning down, it affects them. And, 00:56:46.160 |
you know, like they're saying that first they came for so-and-so and I didn't speak up. You 00:56:52.000 |
know, that's what's happening here. And, you know, I just really think that people need to 00:56:57.280 |
realize it's now affecting them and it's now time to make a change and elect better leaders. 00:57:01.440 |
Here's a framing. If you're paying 50% tax in California, you're a shareholder of an organization 00:57:08.560 |
known as California Inc. You're on the board of that company. You're paying the salaries of the 00:57:12.720 |
people there. You have a say. Recall these people, start a recall of Newsom, start a recall of Karen 00:57:18.800 |
Bass. Just do it. I'm not doing it. I don't have time for this. I'm in Austin. But y'all in Cal, 00:57:23.280 |
who is still in California, start a page, recall Newsom, recall Bass, and you have the power to do 00:57:28.880 |
it and you will succeed. I guarantee it. Now is the moment to strike. There's other news we should 00:57:33.200 |
get to. You know, I hate to say thoughts and prayers. But literally, I've been thinking about 00:57:38.000 |
this, you know, all day long. And I have a lot of friends, my friend Mark Suster lost his home. 00:57:44.160 |
I used to play cards with Jimmy Woods. And you know, I just feel terrible for everybody who's 00:57:49.360 |
lost their homes. And then their kids and their schools are burned down as well. All those great 00:57:53.120 |
schools in Pacific Palisades are gone. I could see developers coming in and being like, dude, 00:57:56.640 |
if I could buy all these lots for 80% off, I will. That's what's going to happen. 00:58:00.960 |
They're going to sit on them. Yeah, they're gonna just sit on them and wait for people 00:58:03.600 |
to forget like they did in 1962. Rick Carusos of the world will do that. Yeah. 00:58:07.120 |
Anyway, yeah. He should be running the place and probably will. I'll give you another 00:58:12.320 |
California Department of Insurance stats. So after the California Department of Insurance 00:58:16.800 |
wouldn't allow the rates to rise like they should from a free market perspective, 00:58:21.280 |
they had to set up their own insurance program called the fair plan for homeowners. 00:58:25.120 |
It has about $220 million of capital in it. And then they bought about $5 billion of reinsurance. 00:58:31.120 |
They have about 6 billion of exposure in Pacific Palisades alone. This is bankrupt, just like I 00:58:36.400 |
told you guys about in Florida. The State Insurance Commission tries to step in and 00:58:40.080 |
fill the market gap that they create by regulating rates. And then they don't have enough capital to 00:58:44.480 |
actually fill the gap because the reason the rates want to go up is because the thing costs more than 00:58:49.680 |
the state is willing to allow. So they're distorting it, they're putting their thumbs on the scale, 00:58:53.280 |
and they're distorting it even more. They're driving real estate value up, 00:58:58.160 |
because they're not allowing the cost of insurance of that real estate 00:59:01.680 |
to naturally float. And so by driving real estate values up, the economy looks good, 00:59:06.400 |
property tax income comes in. But at the end of the day, the bill is going to come due. 00:59:11.520 |
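The gap Freeberg describes is simple arithmetic. A quick back-of-the-envelope sketch, taking the figures he quotes at face value (they are numbers from the conversation, not audited balances):

```python
# Back-of-the-envelope check of the FAIR Plan figures quoted above.
# All inputs are as stated in the conversation, not audited balances.
capital = 220_000_000          # FAIR Plan's own capital (~$220M)
reinsurance = 5_000_000_000    # reinsurance purchased (~$5B)
exposure = 6_000_000_000       # estimated Pacific Palisades exposure alone (~$6B)

resources = capital + reinsurance
shortfall = exposure - resources

print(f"Total resources: ${resources:,}")   # $5,220,000,000
print(f"Shortfall:       ${shortfall:,}")   # $780,000,000
```

On these numbers, Palisades losses alone could exhaust the plan's capital and reinsurance with roughly $780M left uncovered, which is the sense in which the speakers call the plan bankrupt.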
And in the case of Florida, and in the case of California, either the state government 00:59:15.840 |
or the federal government is going to step in and pay the difference. And at some point, 00:59:19.840 |
taxpayers are going to look at the fact that they're paying some percentage of their income 00:59:24.400 |
to support someone else's home value. And they're going to say enough is enough. 00:59:28.240 |
And enough of these sorts of events start to happen. And then the legislative change, 00:59:32.240 |
I think will happen that says, this, it doesn't make sense, we have to make a change. And I think 00:59:37.680 |
we're getting pretty close after the series of events. All right, this has been an absolutely 00:59:41.840 |
fantastic discussion. Let's move on to our next topic here. Zuck just fired Meta's third-party 00:59:47.360 |
fact checkers, and he is going to embrace the Community Notes model from Twitter/X, 00:59:54.160 |
which predates Elon's ownership of the platform, and is an open source project for those folks who 01:00:00.000 |
don't know. On Tuesday he made the announcement in an Instagram video, and he published 01:00:05.280 |
a blog with a bunch of details. And he signaled that he was going to move the trust and 01:00:13.680 |
safety team out of California, which he feels maybe was too far to the left, as we were just 01:00:20.000 |
discussing in the previous story, and move it to the great state of Texas. And here's a quote from 01:00:29.360 |
his comments. In recent years, we've developed increasingly complex systems to manage content 01:00:33.760 |
across our platforms, partly in response to societal and political pressures to moderate 01:00:38.720 |
content. This approach has gone too far. Remember back in August, Zuck sent a letter to the House 01:00:44.560 |
Judiciary Committee explaining how the FBI and Biden administration have pressured Facebook into 01:00:49.040 |
censoring posts about COVID and Hunter Biden. You'll also remember that Zuckerberg has over 3 01:00:49.040 |
billion members on his platform, and had no problem banning Trump from the platform after January 6, 01:00:55.680 |
a lot to talk about in this topic. Cyan, what's your general take on Zuck going MAGA? How do you 01:01:03.680 |
interpret his pivot? I actually think deep down inside, he always has been, you know, I go back 01:01:12.800 |
to the beginning days of Facebook. Back when there were other social networks competing, which 01:01:19.840 |
at the time was MySpace, the only political party you could pick was Republican or Democrat. And then 01:01:25.280 |
along came Facebook, and he added this third option called libertarian. And I would like to 01:01:35.760 |
go to the Wayback Machine at some point and find his profile because his profile said he was a 01:01:39.840 |
libertarian. So when he started Facebook, you know, that that's where he leaned. So I think 01:01:45.280 |
he's always been a free speech person. I think he's always, this has been deep in his heart, 01:01:49.520 |
I think what happened was he had enormous success, they grew very large, and he had to become neutral. 01:01:55.120 |
Or he thought he did. And so I think what we're seeing with Zuck right now with his change in his, 01:02:00.560 |
you know, even how he appears with a gold chain and how he's dressing and everything that he's 01:02:06.320 |
doing, is him going back to his roots to be more authentic, because I think he hasn't been authentic 01:02:11.040 |
for a long time. And, and that was a big critique that people had of him, you know, they were just 01:02:14.960 |
like when he talks, he's like a robot. And I think what we're seeing is him coming out of his shell, 01:02:20.160 |
and I don't know if fighting helped it or what helped it. But, you know, I do think it's the 01:02:25.040 |
best thing to do. And all the platforms need to do it and should embrace it. It can be gamed, 01:02:25.040 |
though. Community Notes can be gamed. We saw it; I saw a report that, you know, Kamala's 01:02:30.640 |
campaign, or I don't know if they directly worked for her or what happened, but they did take over community 01:02:36.240 |
notes on X and started manipulating them. So you have to be really careful, you know, how you run 01:02:47.600 |
a community. But in general, I'm all for it. I think it's the right move. It's but one signal, 01:02:52.320 |
it's one system for trying to get to the truth. It's not the only one; fact-checking is another one. 01:02:52.320 |
And having no system is another one. Chamath, you're obviously an alumni, you worked 01:03:00.960 |
side by side with Zuckerberg in the pivotal years of building the Facebook platform, 01:03:06.160 |
what's your take on what Cyan said? And what do you attribute Zuckerberg's massive 180 to here? 01:03:06.160 |
I would start by saying I think he's a phenomenal businessman. I mean, I think the 01:03:17.440 |
results speak for themselves. But I also think that that is exactly what explains the shift. 01:03:17.440 |
In many ways, he had to make that shift. I think it's fair to say that in the Obama and Biden 01:03:31.200 |
administrations, when the winds were blowing towards censorship, they were part of that 01:03:38.160 |
machinery. And that was the value maximizing function for Facebook shareholders in that time. 01:03:44.800 |
Because if you push back against that, it's not clear what would have happened to Facebook in 01:03:49.200 |
other ways. And so I think the decision, whether he morally agreed with it or not, almost didn't 01:03:56.960 |
matter. It's the leadership of the country in which I operate is telling me it's going to go 01:04:01.040 |
this way, I go that way. Once the Biden and Obama administration sort of went to the wayside, 01:04:06.800 |
there's a very interesting picture that Donald Trump put in his book. And I just sent it to 01:04:06.800 |
Nick. And I think it sort of explains the last week's events relatively well. So I'll just read 01:04:18.080 |
it. This is a picture of him sitting in the Oval and it says, Mark Zuckerberg would come to the 01:04:23.680 |
Oval Office to see me, he would bring his very nice wife to dinners, be as nice as anyone could 01:04:29.120 |
be, while always plotting to install shameful lockboxes in a true plot against the president 01:04:34.880 |
in J. Cal all caps. Okay, shout out to the president. He told me that there was nobody 01:04:41.200 |
like Trump on Facebook, but at the same time, and for whatever reason, steered it against me. 01:04:45.760 |
We are watching him closely. And if he does anything illegal this time, he will spend the 01:04:50.400 |
rest of his life in prison, as will others who cheat in the 2024 presidential election. 01:04:55.920 |
Now that's what he put in the book. And then he was asked about this quote. 01:04:55.920 |
At a recent press conference, Nick, do you have the link to that? He's colorful, 01:05:04.800 |
very bright. Did you notice Donald Trump is a little bit colorful? Essentially, Trump was asked 01:05:04.800 |
about Zuckerberg's move to free speech. And he was asked, you know, do you think 01:05:09.680 |
it was because of your threat? And he goes, Yeah, probably. Yeah, I watched their news conference. 01:05:24.640 |
And I thought it was a very good news conference. I think they've honestly I think they've come a 01:05:28.880 |
long way meta. I think he's directly responding to the threats that you have made to him in the 01:05:33.120 |
past. Probably, probably. Wow, there it is. But again, the lens that I would put on this 01:05:33.120 |
is now the winds are blowing in a different direction. And I do think it's the value 01:05:45.440 |
maximizing function. I think Elon didn't make a value maximizing function, he made a moral decision. 01:05:50.960 |
He did it when it was unpopular and where the winds were clearly blowing in the opposite direction. 01:05:55.760 |
Now that those winds have changed, and it's clear Trump won in early November. The decisions you 01:06:01.200 |
make in January are more reflective of the new conditions on the field coming into the inauguration. 01:06:05.920 |
But I do think it's the smart value maximizing decision yet again for Facebook shareholders. 01:06:11.520 |
And I think it begets a broader point. I think the thing is, when you see Elon operate, 01:06:18.480 |
he's a complete outlier in many dimensions. But I think the one dimension where it matters the most 01:06:24.240 |
is that he acts morally. And in the best interests of what he believes humanity benefits from, 01:06:31.920 |
he's always done it, he was willing to torch $44 billion when he bought Twitter in order to do it. 01:06:36.640 |
And so he does these things from his own perspective. I don't think there's any other 01:06:41.520 |
CEO that leads this way. And I don't think they should necessarily I do think that, you know, 01:06:47.200 |
marks a good person. But his intimate feelings should be known by his wife, his children, 01:06:53.600 |
his friends, his family. I don't think we as shareholders have any right to know necessarily, 01:06:58.640 |
Elon is different. And I think it creates an expectation that maybe we'll get that from 01:07:03.120 |
everybody else. But I wouldn't conflate everybody else with him. So I think that this is a smart 01:07:09.200 |
business decision. It makes a ton of sense. And as you can see, he was basically told to do this. 01:07:15.920 |
So he complied. Freeberg, your thoughts on Zuckerberg making this decision? If Kamala 01:07:15.920 |
Harris had won, would he have released a statement or added Dana White to the board of Facebook? 01:07:24.480 |
Probably not. Okay, there you have it, folks. Pretty straightforward here. Kamala wins, 01:07:36.800 |
he would not have done this. He is jumping in front of a marching band. And he is the 01:07:42.800 |
band leader. Now he's got his baton and he's a front runner. And if you open the dictionary, 01:07:48.240 |
you look it up. But I mean, it's a smart business move. I think if you're a meta shareholder, 01:07:54.080 |
Is there anything wrong with it, Jekyll? Or you're just saying, 01:07:56.800 |
Oh, yeah, there's a tremendous amount wrong with it. It's called moral integrity, having 01:08:00.320 |
an ethical compass, having chutzpah, having an own sense of what's right and wrong in the world, 01:08:07.440 |
which he does not have, in my estimation, based on his behavior. 01:08:10.960 |
That's not fair. You don't know, because, again, what I'm saying is, 01:08:15.920 |
no, but Jason, what I'm trying to say is, Elon shares who he is in a 360 degree way with the 01:08:22.720 |
world. So we know where he stands. And all I'm saying is, what Mark does or doesn't believe 01:08:27.920 |
really isn't known to us. It's probably known to his wife, and his family. 01:08:34.080 |
Some of his close confidants, some of his confidants. 01:08:36.960 |
Let me be clear. I'll even, I'm happy you're challenging me on it. 01:08:41.280 |
I base people on their actions. His action was to be the greatest censor 01:08:45.840 |
in the history of humanity. There's no human being who censored more humans than him. 01:08:49.600 |
That was his decision when it was a popular decision, whether it was COVID or... 01:08:53.760 |
Not popular. Hold on, but not popular. Not popular, Jason. Necessary for maximizing 01:08:59.440 |
Well, he doesn't need, no, no, I disagree. His business would have been just as vibrant 01:09:04.560 |
if he had a spine and he just said, "This is what I believe." And I think he's over-optimizing 01:09:10.640 |
based on what he thinks everybody else around him wants. And I don't know, I've never worked 01:09:15.120 |
with him. I don't know him personally. You're right on that front. But he banned Trump for 01:09:20.080 |
two years. The President of the United States. I said at the time, "I don't know that you can 01:09:23.360 |
give a permanent ban to the President of the United States." When he had the opportunity 01:09:27.200 |
to reevaluate that decision, you know what he did? He punted. He created a third party 01:09:31.360 |
organization to make the decision for him and deflect it. Zuck created the Oversight 01:09:35.600 |
Board. He's so spineless, he decided, "I'll create and give $150 million to this board 01:09:41.200 |
to make these hard decisions for me." Instead of me making the decision, he has super-voting 01:09:41.200 |
shares of that company, Chamath. He controls it with an iron fist. And not only does he 01:09:50.720 |
control it with an iron fist, he has put protective provisions in that so that his children could 01:09:50.720 |
take that 3.3-billion-person platform and own it forever. And he punted to them and said, 01:09:55.280 |
"I don't want to make these decisions." What I saw when he did that was, "I don't want to 01:10:04.240 |
be blamed for these decisions." And that is a lack of courage and morality in my estimation. 01:10:10.320 |
And then the second he is threatened by Trump, he makes the opposite decision. And if he's 01:10:15.680 |
making his decisions strictly on maximizing money, I don't respect that. I think he should 01:10:20.960 |
make the decisions based on what he thinks is the moral. What is the point of being a billionaire 01:10:25.440 |
or worth $100 billion or $200 billion if you don't get to say, "I have [expletive] you money, 01:10:30.480 |
[expletive] you. I'm going to do what I want." And that's what I think is his moral failure. 01:10:34.480 |
And anybody giving him his flowers or championing him for this, I think it's just political expediency 01:10:34.480 |
and I think it's disgraceful. That's my feeling. Sorry. I have my own opinion. 01:10:45.120 |
-What about the fact that he was dragged in front of Congress many times over and people that could 01:10:51.440 |
put him behind bars told him to his face many times, and this has all kind of been coming 01:10:51.440 |
out over the last couple of months that government officials were directing him in a way that feels 01:11:02.000 |
like do this or you will be prosecuted to do the following things, to act the following way and to 01:11:06.720 |
moderate your platform in a way that we are telling you to moderate it or you will find yourself 01:11:11.360 |
behind bars. Do you not think that there's some degree of inherent complicit kind of role that 01:11:16.480 |
certain government officials and folks in power had in driving some of those actions that maybe 01:11:21.120 |
he had to do it to survive and to keep the company alive? -Not to mention a violation of our 01:11:25.440 |
constitution. -No, not at all. He could have just hired lawyers and fought it. He didn't put up any 01:11:30.480 |
fight. The second they told him to roll over and ban Trump, he did it. Zero fight from him. He 01:11:36.960 |
has no... -Do you know that for sure? Because I just want to make sure I ask you... -I'm just 01:11:40.720 |
basing it on his actions. Like I told you at the beginning of this, I'm basing it on his actions. 01:11:45.040 |
-Right, but I just want to make sure... -He was not going to jail for banning Trump. 01:11:48.000 |
If he didn't ban Trump or he gave him a six-month suspension, he would have been just fine. 01:11:52.400 |
-I'm just trying to get you to take a fair point of view, which means like let's make sure you're 01:11:56.080 |
thoughtful about the fact that this is not a dumb person or a person... Let's give him the benefit 01:12:00.320 |
for a minute. He's not a dumb person... -I do give him the benefit of being a great business 01:12:03.440 |
executive. -I'm just saying let's just assume he's not dumb and let's say that as Cyan points out and 01:12:07.360 |
as he's kind of highlighted points in his history, he actually does have certain beliefs and certain 01:12:12.080 |
systems that he would love to kind of embrace. I've said this many times before. All of the 01:12:16.400 |
founders of the big tech companies were all big free speech advocates. That was a big part of 01:12:20.560 |
the open internet and the movement of the open internet when a lot of people got involved and 01:12:24.000 |
that was a big part for him and I don't know like you know if you really think at some point he 01:12:28.640 |
flipped his switch and said I don't care about the open internet. I now want to have a closed 01:12:31.680 |
controlled internet or if he recognized or was coerced into controlling moderation on the 01:12:36.400 |
platform because of the reach that he had and he said the only way I can have any degree of openness 01:12:40.560 |
is to do the following and I will say that my experience is similar in Google. When Google had 01:12:45.120 |
to exit China, they initially went to China with a closed internet with a closed censored model of 01:12:50.400 |
search because that was the way they had to survive to offer a business in China. They didn't 01:12:55.120 |
morally agree with it. They didn't think it was ethically correct. -Did they launch that or did 01:12:58.720 |
Sergey kill the deal when Eric Schmidt proposed it? -Well that deal went live. There was a... let 01:13:03.280 |
me just make sure I get this all correct. -No, they didn't go live. Sergey Brin, because of 01:13:03.280 |
his upbringing in Russia, he went to the mat and said on a moral basis we're not going into China 01:13:14.720 |
and I've talked to Sergey about it. He did not want to go in there and compromise his 01:13:19.200 |
own ethics. -That's right. -You're at full stop. So I don't think that is the only outlier here. 01:13:25.200 |
-You're not right and I just want to make sure that... -Okay, tell us, when did it... when did... 01:13:28.480 |
because the Dragon... it was called Project Dragonfly. -There was a... there's a long history 01:13:28.480 |
to this. -Okay, let's let Cyan come in here. -I want to make sure I get this right but go ahead. 01:13:36.640 |
-Obviously he's a brilliant businessman but I do think underneath it all he is a human being 01:13:40.960 |
and I think his fighting in the arena and this fighting stuff that he does actually did change 01:13:45.360 |
him and this happened long before the first amendment stuff started to appear. You know, 01:13:50.480 |
I think, or free speech, I shouldn't call it first amendment, but I do think that the government did 01:13:55.200 |
interfere and after January 20th we're going to find out some interesting stuff and we'll get to 01:14:00.320 |
the bottom of, you know, how did the government pressure him to censor things and I think he's 01:14:07.920 |
getting in front of that because it is going to come out and I think that is a huge part of why 01:14:13.520 |
he is getting more involved is because it's going to be revealed just how much the government 01:14:18.640 |
coerced him and... -And how much he acquiesced? Is that sort of what you're insinuating? -Yeah, 01:14:22.800 |
I mean this is why I think the fighting actually helped. I think he learned to stop acquiescing. 01:14:26.400 |
-Wow. -I actually think that... -Interesting. He put up a fight. -That is where I started seeing 01:14:30.000 |
the change in him and started noticing and so did the, you know, he's, there are so many more fans 01:14:35.120 |
and people who are looking to him as a leader in a different way now because he's actually starting 01:14:39.280 |
to express who he is and like what kind of music he likes. Nobody ever knew that. They thought he 01:14:44.720 |
was just a robot. He doesn't like music. -He hired a whole PR team to craft this, is my understanding, 01:14:44.720 |
but anyway. -Yeah, again, I don't know that much detail. I don't, I'm not involved in his personal 01:14:54.320 |
life like that, but I just, I always love to give people the benefit of doubt. I guess that's just 01:14:57.920 |
me and I do think that people can change and I'm hoping that he is actually going to stay on this 01:15:05.280 |
side. We want more leaders like him to believe in free speech. -Of course, of course. I mean, 01:15:10.240 |
listen, Reddit had... -By the way, they all do and I've never met, you know, an internet business 01:15:16.800 |
executive who didn't come from kind of the open internet philosophical doctrine by background, 01:15:22.320 |
that that was a big motivator for all of us because the internet took away the controls, 01:15:27.280 |
took away the power, took away the censorship, took away all these things that other kind of 01:15:32.000 |
communication systems had vested in them and the internet through an open protocol allowed anyone 01:15:36.880 |
to share anything with anyone else and obviously laws and all this other stuff that's happened 01:15:41.760 |
since then has made that far more difficult and I will revisit our conversation, Jason. Google's 01:15:46.880 |
China search engine, with censored search results, was live for four years before they cancelled it. So they 01:15:46.880 |
launched in 2006, they censored results, they complied with the Chinese government's request 01:15:56.320 |
and eventually in 2010, they killed it and you could argue it was because of philosophical reasons 01:16:00.640 |
but fundamentally, it never actually got a lot of users in China. There were more users on Baidu 01:16:05.120 |
and Google had separately made an investment. -I think Gmail was the moment, I think it 01:16:05.120 |
became, if I remember correctly, it's 20 years ago but I think... -I think it was YouTube. -Oh, 01:16:14.320 |
is it YouTube? Because one of the other services, they started saying, "Hey, we need to know 01:16:18.800 |
these people's names who posted this, who sent this email, we want full access into it," and 01:16:23.120 |
that's where they drew the line because it wasn't just a passive search engine, right? It was 01:16:26.640 |
actually like round up dissidents, like Yahoo famously did. Yahoo... -Yeah, Google claimed 01:16:26.640 |
there was a hack that happened on their servers in China, and so they were just no longer 01:16:33.360 |
comfortable operating... -How about this, guys? However we got here, we're here and we should 01:16:38.080 |
all be happy that we're here. -Yeah, exactly. -Yeah, I'll take the win. -And we just kind of 01:16:50.240 |
move forward. -I mean, taking the win is a good way to do it. -The world is a better place because 01:16:53.920 |
of his decision. -Yeah, exactly. -I think we all agree on that. I mean, what's the point of having 01:16:59.040 |
an open platform and you can say things... -But why do you call him spineless? Like, why go after 01:17:03.680 |
the guy? -Why? Because you can judge a person when they're put under pressure to make the right 01:17:07.920 |
decision. -I think what Jason is expressing is sort of what I was trying to say and I probably 01:17:11.760 |
said it poorly. I think there are some of us who look at the way that Elon runs himself and his 01:17:20.320 |
companies, okay, as a sort of world-beating technology CEO and then that sort of sets the bar 01:17:27.680 |
but I think that that bar is impossible to meet and I think part of it is because of Elon's genius, 01:17:35.440 |
the other part of it is his success, the other part of it is his influence but there's an element, 01:17:40.080 |
Jason, a fundamental moral risk-taking that he takes that has been rewarded over and over again 01:17:46.800 |
that no other CEO has had to make and when they have, they've largely failed and so I 01:17:52.160 |
understand where you're coming from but I would give a lot of folks the benefit of the doubt here 01:17:57.600 |
and say it's not clear what they believe then versus what they believe now but the destination 01:18:03.120 |
is very good and we're in a better place for society and hopefully we can maintain these norms 01:18:09.520 |
independent of who's in charge after Trump. I am super happy he's making these decisions. 01:18:15.120 |
I believe in freedom of speech. I think he's going to have to deal with advertisers next 01:18:18.800 |
though. I mean that's one thing that X doesn't have to deal with as much and that's going to be 01:18:22.320 |
the second problem he's going to have is not just the government but do advertisers want to be next 01:18:27.680 |
to some of the content that's about to appear. And when he loses tens of billions of dollars 01:18:32.320 |
in personal net worth, will he make the same decisions? We'll see but I can tell you if 01:18:37.600 |
Kamala Harris had been voted in, he would have doubled down on censorship instead of taking this position. 01:18:37.600 |
I think he is terrified of Trump and having his company broken up and he's doing this 01:18:47.680 |
strictly to appease Trump. I think putting Dana White on the board is another signal; that's 01:18:47.680 |
one of Trump's good friends. He's just trying to get close to the party. He's trying to make up 01:18:56.160 |
for lost time for when he supported the censorship of Trump and other folks. I think he would make 01:19:02.720 |
the opposite decision but to your point, we're here. I'm glad he's here. 01:19:06.080 |
Would you meet Zuck in the octagon? That's the most important question of the day. 01:19:11.280 |
No, definitely not. He's 10-15 years younger than me. He'd kill me. Not a chance would I meet him 01:19:18.320 |
in the octagon but I wish him well. Would you meet Palmer Luckey in the octagon? 01:19:23.120 |
Let's not start that up again. I'm just wondering. 01:19:27.200 |
I actually literally challenged him. He wanted to send the mountain. He wanted to 01:19:31.600 |
pick somebody to fight for him, Trey, from Founders Fund and I said no unless Trey was 01:19:37.440 |
willing to do it. Do you guys ever watch the old TV show American Gladiators? 01:19:41.120 |
I would like you and Palmer to have an American Gladiators-style tournament, 01:19:46.240 |
like maybe four or five events. That would be incredible. 01:19:50.400 |
Put up a million dollars for charity. I'll totally do it. 01:19:52.960 |
We'll put up a million dollars each for charity. I'll do it. 01:19:58.160 |
Let's get the word out there. I think that this could be the show of the season. 01:20:03.840 |
This would be more exciting than the Accelerator, I will tell you. 01:20:06.800 |
Absolutely, it might get more ratings than it, yeah. 01:20:09.680 |
You could actually call it American Gladiators. It would be a great show. 01:20:12.640 |
Well, there you go. American Gladiators, the CEO edition. Business to business edition. 01:20:16.880 |
All right, listen. NVIDIA going consumer. Let's talk about it. NVIDIA made a big announcement 01:20:23.360 |
at CES this week. They made a lot of them. One of them that was particularly interesting was 01:20:26.880 |
this $3,000 personal AI computer for researchers. It's called Project Digits. 01:20:32.080 |
It's essentially like, maybe Arduino would be a way to look at this, like a personal device, 01:20:40.720 |
but it's powerful enough to run LLMs on. They're also going after physical AI, 01:20:45.120 |
like robotics and self-driving. As we said here on the award show, a lot of people on the panel 01:20:50.720 |
were predicting this year would be the year of robotics. They announced that they're going to 01:20:54.720 |
have driver-assist chips and maybe build worlds for people to simulate in, which, net-net, 01:21:02.160 |
at the end of the day, I think Freeberg would say puts their autonomy partners on second 01:21:06.960 |
or third base in terms of creating technology by incorporating it into the chips and into their 01:21:12.160 |
stack. So, Cyan, what do you think of these announcements and some of the other ones he made? 01:21:19.360 |
Yeah, I'm really excited to talk about it because I think I've been trying to figure out how they 01:21:22.960 |
justify their valuation over the long run. I'm not a public market person, but I am fascinated 01:21:30.160 |
with NVIDIA and their cloud GPU business is definitely a majority of their revenue. So, 01:21:37.680 |
I think a lot of what we're seeing is them trying to grow into that and trying to expand 01:21:42.800 |
in case the music stops. Now, I don't actually think the music's going to stop. 01:21:48.800 |
It's insane to me that we've barely touched what AI is going to do and change and 01:21:55.200 |
all of the various things that are going to come from it. And the early adopters cannot use Cloud 01:21:59.920 |
without getting shut down because of scaling issues. And I don't think those are artificially 01:22:04.560 |
created based on the type of investing I'm doing. And so, I'm very bullish on NVIDIA. It is 01:22:11.200 |
interesting. It's an interesting thing to go consumer. And the thing that really hit me 01:22:18.000 |
was the fact that he declared Tesla one of the most valuable companies in the world in the long 01:22:23.760 |
run. It's interesting that he got behind Toyota. But at the same time, there's only one car 01:22:33.200 |
company out there that has the kind of data that Tesla has from Full Self-Driving. So, 01:22:38.720 |
if they enter the robo-taxi market, I actually think they should buy Uber. 01:22:45.840 |
Well, that would be about 10% of Tesla's market cap at this point. If they paid a premium, 01:22:49.600 |
that might be 15%. So, it would be very similar to the WhatsApp acquisition. 01:22:55.040 |
And then you launch that robo-taxi service and maybe there's some sort of secondary 01:23:00.560 |
aftermarket solution, kind of like comma AI or something like that that you can do for people's 01:23:04.960 |
cars where you can actually get anybody's car into the fleet and start self-driving. 01:23:10.080 |
But it is true. This is going to be the largest breakout in robotics we've ever seen 01:23:15.040 |
if Waymo is any indicator. And I read somewhere, I think that Amazon or somebody was looking at, 01:23:22.560 |
I don't know what was going on with Waymo, but... Oh, Lyft. Amazon was going to buy Lyft. 01:23:30.720 |
Well, it's a dying brand and would the point be... 01:23:35.920 |
I think maybe delivery or something like that. I can't figure out what their play is there. 01:23:39.520 |
Well, and it's also not global, but looking at the Amazon and Waymo, Tesla, and Uber, 01:23:45.840 |
I think Waymo plus Uber, Amazon plus Uber, or Tesla plus Uber defines who number one is, right? 01:23:52.880 |
Because you'd have a global footprint, and for the five or ten years it takes to roll 01:23:56.720 |
out taxis globally, you could have people drive... I mean, it's a really interesting thought process 01:24:01.920 |
you have there, Cyan. Imagine if there was an intermediary step where they sold slightly fewer Teslas 01:24:07.120 |
this year than last year, but you just keep producing lots of Model Ys and give them to the 01:24:13.040 |
Uber drivers, keep reinforcement learning going while the taxis and regulations get set. And then 01:24:20.480 |
you would be able to put another, instead of selling 1.8 million Teslas, you could sell 3 01:24:25.440 |
million Teslas, 4 million Teslas to Uber drivers, get all that data and have the safety driver in 01:24:30.720 |
while each region decides if they want robo-taxis, where, how, et cetera. Your thoughts, Chamath, 01:24:36.640 |
on NVIDIA's dipping their toe into maybe taking the bottom 30% of the stack of self-driving. 01:24:43.360 |
I don't have much of an opinion on that, to be honest. I think that sort of along the lines of 01:24:47.520 |
what I said on the prediction show, I think that Waymo and Tesla are going to run away with this 01:24:52.160 |
market and I think it's going to force a bunch of consolidation in the traditional auto OEMs. 01:24:58.160 |
I think the interesting thing is that they really doubled down and created a pretty decent test 01:25:04.720 |
bench for robotics. I thought that was pretty interesting. So I think that reinforces what a 01:25:09.760 |
lot of smart people, including, you know, what Freeberg and Gavin also spoke about just in terms 01:25:14.480 |
of the long-term future for robots. I think that that was cool. I was a little confused by the low 01:25:22.000 |
end PC. I don't understand what the point of that is. Maybe it creates some crazy DePIN market 01:25:29.440 |
where you can buy a GPU and then contribute it to some distributed network and allow some distributed 01:25:35.440 |
workload to run on that, I guess. I don't know. I think it's a toy, a hobbyist kind of device 01:25:41.840 |
that becomes like a bridge. And we see this often in technology where somebody creates, 01:25:47.200 |
like even the original PCs, let's face it, they were kind of like toys and hobbyist devices, 01:25:51.440 |
Arduinos and the original drones were kind of hobbyist. 01:25:54.880 |
Yeah. I guess the point is a toy to do what, because if you're trying to do inference, 01:25:59.440 |
like everything is telling us that we are reaching the limits of training. 01:26:05.520 |
And that's LLMs though, right? So the point is, it's not, yeah. 01:26:11.280 |
Let me get to it. So in this world of AI as we know it today, there's training and there's 01:26:16.480 |
inference. And right now we think that there's training that's at a limit. And so now the market 01:26:21.280 |
shifts to inference. So if you're going to buy this jacked up personal computer, what are you 01:26:27.760 |
going to use it for? My suspicion is some sort of test time compute use case, which is an inference 01:26:34.000 |
use case. But it's not clear to me why that's a better solution than all of the AI accelerators 01:26:42.640 |
plus tensors that are now just prolifically being exposed to the market, whether it's 01:26:48.080 |
Amazon exposing what they've done, whether it's Google exposing what they've done, 01:26:51.840 |
a whole litany of startups exposing what they've done. So I was just confused. I don't really know 01:26:58.080 |
what the whole point is. What do you think about this? 01:27:03.120 |
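The training-versus-inference distinction above can be made concrete. As a hedged aside, here is a toy best-of-N sketch of test-time compute, spending more inference samples per query rather than more training, where `toy_model` and its scores are stand-ins, not any real model or API:

```python
import random

def toy_model(prompt: str) -> float:
    """Stand-in for one generation: returns the candidate answer's score."""
    return random.gauss(0.5, 0.15)

def best_of_n(prompt: str, n: int) -> float:
    """Test-time compute: sample n candidates, keep the best-scoring one."""
    return max(toy_model(prompt) for _ in range(n))

random.seed(0)
print(toy_model("q"))      # one cheap sample
print(best_of_n("q", 64))  # 64x the inference compute; higher expected score
```

The point of the sketch: a box like Project Digits would be selling exactly these inference-side cycles, which is why it competes with the cloud accelerators Chamath lists.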
The robotics thing was interesting if the market develops in the way that they think. 01:27:07.360 |
So we're talking about maybe two or three different pieces here, Freeberg. Which one 01:27:11.440 |
do you think is super interesting? This $3,000 sort of GPU for your desktop that you 01:27:18.720 |
attach to your computer, you get to play with things locally? Do you think that's promising? 01:27:21.840 |
Where would that go? If you have to guess? So I think the bet he's making is it's not just 01:27:28.080 |
LLMs, which is predicting text. But you know, we've talked a lot about machine vision models, 01:27:34.560 |
graph neural nets that are being used for weather forecasting. There are now these kinds of 01:27:40.000 |
genome language models that are trying to predict genomic output for biotech applications. 01:27:46.320 |
There's also going to be kind of real time machine vision and robotic response. 01:27:52.800 |
Like we're working on this at Ohalo. And we're trying to figure out what's the right kind of 01:27:57.840 |
runtime environment for these sorts of systems that are going to be using machine vision and 01:28:01.840 |
a robotic kind of response type system. And there's a lot of these industrial applications 01:28:05.520 |
that are emerging. Let's say you're running a robot in a warehouse, do you really want that 01:28:09.360 |
robot in the warehouse to be sending data to the cloud and waiting for a model to run in the cloud 01:28:14.800 |
and getting a response? The probability is you want to have that at the edge of the network, 01:28:19.280 |
you want to have something local. And I don't think he necessarily has a strong point of view 01:28:24.000 |
on what the types of models and industrial applications will be. But the bet he's making 01:28:28.560 |
is that the models are good enough. And now the chips are good enough that they can actually 01:28:32.400 |
realize real time responses, using machine vision, using real time input, and then respond quickly 01:28:39.680 |
with a local model running whatever that model is, to drive some output in the industrial setting. 01:28:45.120 |
And that there'll be a lot of these sorts of applications, whether that's making predictions 01:28:48.800 |
for biotech research, or whether that's for running robots in warehouses, or building 01:28:54.000 |
new research models. Or maybe you could strap this PC on the back of something like a car, 01:28:59.120 |
a tractor, a lawnmower, a humanoid robot, or any other set of applications. 01:29:04.320 |
Explain to the audience, Freeberg, why having the computer at the edge is beneficial for 01:29:09.520 |
those folks who might not know. If you're taking in a lot of data, 01:29:12.560 |
and then you have to run a lot of data in a model, it's a lot faster to run that model locally. Like 01:29:18.160 |
when Tesla runs self driving, it's not sending the video images from your car to a server 1000 01:29:24.880 |
miles away, and then letting the server decide how to drive your car. The car is running its 01:29:28.960 |
model on what to do with respect to the video imagery in the car. It's local, because the 01:29:34.640 |
ability for all that data to get processed in the car means that you don't have to wait for 01:29:38.000 |
the internet to transmit data back and forth. You don't have lag time, you don't have the 60 01:29:42.000 |
millisecond or 100 millisecond response time. You don't have it losing your phone connection, 01:29:45.920 |
and then not knowing what to do. Exactly. Or the connection drops, 01:29:49.200 |
or waiting for a server to come online, or server breaks in the data center, everything is local. 01:29:54.080 |
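Freeberg's latency argument can be put in rough numbers. A minimal sketch: the round-trip range comes from the 60 to 100 millisecond figure above, but the compute times and the 20 Hz control budget are illustrative assumptions, not measurements:

```python
import random

NETWORK_RTT_MS = (60, 100)  # cloud round trip, per the range mentioned above
CLOUD_INFER_MS = 10         # assumed server-side inference time (big GPU)
LOCAL_INFER_MS = 25         # assumed on-device inference time (smaller chip)
BUDGET_MS = 50              # a robot reacting at 20 Hz has 50 ms per step

def cloud_latency_ms() -> float:
    """One cloud inference = network round trip + server compute."""
    return random.uniform(*NETWORK_RTT_MS) + CLOUD_INFER_MS

def local_latency_ms() -> float:
    """One edge inference = on-device compute only, no network."""
    return LOCAL_INFER_MS

cloud_ok = sum(cloud_latency_ms() <= BUDGET_MS for _ in range(1000))
local_ok = sum(local_latency_ms() <= BUDGET_MS for _ in range(1000))
print(f"cloud within 50 ms budget: {cloud_ok}/1000")  # 0: the RTT alone blows the budget
print(f"local within 50 ms budget: {local_ok}/1000")  # 1000: compute is the only cost
```

Even with a faster server-side chip, the network round trip dominates, which is the case for running the model at the edge.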
So if you strap this, like, you know, NVIDIA computer, which is basically plug and play, 01:29:54.080 |
you don't have to have, like, hardware expertise, you could strap it onto the back of a humanoid 01:29:58.960 |
robot or run research applications locally. So I think that there's going to be some really 01:30:07.600 |
interesting use cases, whether it becomes a replacement for the Apple, you know, Mac 01:30:12.960 |
Pro, Mac Studio device, whatever, maybe we'll see. Mac Mini M4, yeah. 01:30:17.520 |
The Mac Mini M4. But a lot of people have pointed out that actually the compute on this thing for 01:30:21.680 |
$3,000 knocks a lot of Macs out of the field. So it is pretty bonkers. 01:30:25.840 |
It just can't run an operating system in the traditional sense. Cyan, when we look at startups, 01:30:30.000 |
I remember when you and I started investing, some of the driving forces were free storage, 01:30:34.160 |
free bandwidth, and cloud computing, which drove a lot of ability to get a product to market very quickly, 01:30:41.280 |
effectively, et cetera. What impact will AI have on all these startups that are 01:30:47.360 |
originating now in 2024 and 2025? Look into your crystal ball: how do you think they'll grow 01:30:55.040 |
the footprint of them? How is this going to accelerate the startup scene? 01:30:58.800 |
I actually think we're going to see a Cambrian explosion of creativity and development of 01:31:04.000 |
different things. And some of them are going to be stupid ideas, and some of them are going to 01:31:08.000 |
be great. But I think it's going to make our job, especially at the seed stage of investing, harder 01:31:12.320 |
and harder. There's going to be so many, there's just going to be a lot of people that have similar 01:31:16.960 |
ideas at the same time that can execute quickly and do things at breakneck speeds that they've 01:31:22.160 |
never been able to do before. Picking the winner is going to be hard to figure out. 01:31:27.840 |
It's going to be harder and harder. I've been thinking about this. Do you invest in 01:31:33.520 |
competitors, which is something I never used to do? Do you take a bet and index an entire category 01:31:38.960 |
that you're interested in? What is the approach at seed and pre-seed? I think of an idea and I'm 01:31:44.960 |
like, wow, that's really neat. And then I go and look out there and there's 30 people working on 01:31:48.960 |
it. And that didn't used to be the case. And I think part of it is we've really unlocked a tool 01:31:54.800 |
that allows people to do things that would have been cost prohibitive or gives them the ability 01:32:02.080 |
to think, gosh, I could be an entrepreneur and I can try this and I could do this. 01:32:05.040 |
So I'm seeing people experiment and do all sorts of things. As far as the startups, 01:32:11.440 |
some of the AI stuff is just a feature. It's just table stakes at this point. It's like a 01:32:17.600 |
chat or whatever. And that doesn't really matter. But then you're seeing people re-imagine games and 01:32:22.000 |
re-imagine even things down to your kitchen appliance, et cetera. So I do think it's going 01:32:30.640 |
to be very, very difficult. And I tend to sit out a lot of hype cycles. So I invested in power 01:32:37.040 |
and compute, lithography, kind of all of the things that are going to be underneath all of this. 01:32:43.840 |
And so I'm not sure how much of it I'm going to participate in until it starts to get to a steady 01:32:48.880 |
state and you kind of can understand what's next. Because the rate of acceleration is just so great 01:32:54.640 |
that it's just kind of unclear to me sometimes, especially when it comes to these consumer 01:33:01.520 |
applications, consumer facing things. It's just really hard. 01:33:05.040 |
Well, when we were famously picking Uber, you had to pick between Sidecar, Lyft, and Uber. There 01:33:11.040 |
were three people doing it and it was pretty clear who was the most qualified amongst those three. 01:33:15.600 |
Now, to your point, if you want to be involved in tax plus AI or legal plus AI, you might be 01:33:22.400 |
looking at 50 companies, 100. And it was tradition in Silicon Valley to not bet on competitors. 01:33:28.080 |
There were some notable exceptions. When you run an accelerator like I do, Techstars or Y Combinator, 01:33:32.560 |
you aren't bound by that because 50% of the companies pivot almost by design. 01:33:37.360 |
So I think you just have to, I think at pre-seed, because people pivot, you just have to tell people 01:33:43.120 |
like, listen, we have a lot of pivoting going on. People are going to run into each other. I can't 01:33:46.800 |
just bet on one thing, right? In a space. But I think that's a reasonable compromise. If all 01:33:51.840 |
the founders are going to keep pivoting to each other's businesses, how can the investors even 01:33:55.680 |
keep track of that? It's like being air traffic control of 10 airports at once. It's just not 01:34:00.480 |
feasible. You wouldn't think it, but there's still a lot of spreadsheet companies out there. 01:34:05.280 |
You think you'd run out of them, but they're still out there. You look at, 01:34:09.040 |
and I think this is where AI is really going to make a difference, like RFP proposals for 01:34:13.760 |
governments. Something that takes like 30 days and it's manual and you have to submit these 01:34:18.560 |
horrible documents. You can ingest your entire corpus of all of your previous bids and submit 01:34:24.400 |
them at a breakneck speed now and win more contracts. That becomes like a national defense 01:34:29.600 |
company at that point. I think we're going to see a lot of really interesting things where 01:34:34.320 |
a lot of cruft is going to disappear. That'll be a really interesting wave that I'm looking forward 01:34:42.560 |
to. Yeah. In fact, Chamath has made a big bet there with his time, with his software startup 01:34:48.000 |
that he's created. All right, let's end on the United States of America growing from 50 to 60 01:34:55.520 |
or 70 states. Trump has been rattling off some ideas around this. Chamath, what's your take on 01:35:03.600 |
it? I know we got to get wrapped up here. So we'll just do a quick lightning round on it. 01:35:07.440 |
I mean, I thought it was really interesting. And I was just caught off guard at how the media tried 01:35:14.320 |
to portray it as Trump being Trump. Goofy, whatever, colorful. But I think like what 01:35:20.640 |
I've realized, even with the California fire thing, the guy has this prescient way of, 01:35:25.600 |
he may not say it in the way that it works for some people, but he's just really on top of this 01:35:31.200 |
stuff. So I just had to dig into this. I started to learn a little bit more about why he wants to 01:35:36.800 |
take over Greenland. And it really comes down to one very basic idea here. Because of climate change 01:35:44.240 |
and other things, the Arctic ice shelf is melting. And the more and more it melts, it opens up 01:35:50.400 |
a shipping lane in the Northern Passage for a lot of critical goods. And so if you had some 01:35:58.080 |
sort of strategic agreement with Canada and Greenland, you effectively have this monopoly 01:36:04.720 |
control over something that could become as important as the Panama Canal. And so I think 01:36:10.240 |
if you look across the world, the control of maritime shipping lanes becomes this really 01:36:17.760 |
critical, strategic, military and economic asset. And so the reason why he's trying to find a way 01:36:25.600 |
to initiate some sort of a discussion between Greenland and Canada is exactly this reason. 01:36:32.480 |
And I think it's sort of like a bargaining gambit the way that he started. But it's really smart 01:36:37.280 |
that he's trying to get this done for the United States of America. Because meanwhile, what you 01:36:41.760 |
have is China militarizing very aggressively, Russia militarizing very aggressively. And what 01:36:47.600 |
you don't want to have happen is those two countries take control of that Northern Passage 01:36:52.000 |
as the ice sheet melts. So I just thought that was important. Having a capable business executive 01:36:57.520 |
thinking about the future of business and shipping and logistics. Pretty, pretty big win. And I just 01:37:03.200 |
love the idea, Cyan. You know what's smart? I mean, let's give Trump credit. What's so smart 01:37:07.040 |
is like, somebody was doing this work. Yes. Got it, got it in front of him. Yeah. And he was 01:37:12.640 |
smart enough to say, hold on a second, this is really important. Let me tweet it. And then the 01:37:17.280 |
way that he initiates it, though, gets even more attention. Because if he basically tweeted, 01:37:23.040 |
hey, guys, I have this really interesting idea to gain more leverage in the Northern Maritime 01:37:27.680 |
shipping lane, nobody would have paid attention. Absolutely. Nobody would have. And now we're all 01:37:31.840 |
talking about it. And now there's an opportunity for millions of people to understand why and be 01:37:36.560 |
supportive of it. It's pretty smart. Cyan, any thoughts on expanding the United States to a 01:37:40.880 |
couple more territories and states? I love it. I would love to have 60 states in our lifetime. 01:37:45.760 |
I mean, let's pick one in the Caribbean. Let's pick one in Europe. I think we should have an 01:37:49.440 |
open invitation. Jason, that's not what he's doing. I think he's... I know I'm being a bit 01:37:53.440 |
facetious here. This is very strategic, this one. But I'm just thinking the next time, you know, 01:37:58.160 |
I would like to get Cuba, maybe Portugal. I don't know. Cyan, what do you think? Where would 80% of people 01:38:03.200 |
in the country want to join? It's very strategic. If you look at the Panama Canal, 01:38:07.600 |
I believe either end is operated and controlled by China. We are at war with China, whether we 01:38:13.520 |
like to admit it or not, in my opinion. And so this is very strategic. He has a very strange way 01:38:19.840 |
of communicating, as you pointed out, but I think it's brilliant. And I actually think 01:38:22.960 |
we should add to that. I've always thought that we should open up and add more states 01:38:28.320 |
and extend that invitation, you know, to Taiwan. It might be controversial to even say India. 01:38:35.120 |
But I do think that there's a lot of countries out there and people who really, 01:38:41.840 |
really resonate with what it means to be an American and the freedoms that come with our 01:38:48.000 |
subscription fees of this country. And so I do think that it would be great for us to expand. 01:38:54.000 |
And, you know, I don't know what he's thinking or who he's got behind the scenes who motivated him 01:38:59.440 |
to do it, but I really think it's a great idea. Freeberg, what do you think about opt-in 01:39:04.880 |
imperialism and this incredible concept of expanding our territories in the 21st century? 01:39:12.560 |
Again, I don't know how to read it. I have no inside information. There's clearly some 01:39:17.920 |
posturing, as we've heard many times when Trump makes a declaration, like I'm going to put on 01:39:21.920 |
100% tariff on every car that's imported, or I'm going to charge you 2000 bucks Mexico for every 01:39:27.440 |
time you ship something here, or I want to do X or Y or Z. It's not the literal statement that 01:39:34.560 |
matters as much as kind of the vector and the magnitude of the vector. He's clearly trying to 01:39:39.520 |
begin negotiating for some change. I don't know what the ultimate kind of strategic endpoint is 01:39:45.760 |
meant to be here, but clearly there's something. I think Chamath might have a good read on this, 01:39:51.200 |
and it seems to make a lot of sense. Well, we have a military base there, 01:39:54.800 |
and we also protect it, and we occupy it already, which is interesting. 01:39:58.880 |
Right. Yeah, we somewhat abandoned all that in Greenland, but there is a lot of that 01:40:03.920 |
infrastructure still sitting around. Can I ask you guys a question? I listened to 01:40:07.440 |
Lex Fridman's interview. This is totally off topic, but I listened to Lex Fridman's interview 01:40:11.840 |
with Graham Hancock. You guys ever heard of this guy? Yes. Have you read any of his stuff or watched 01:40:17.920 |
any of his shows? No, I have not. No. Okay, so he's got this belief that there was this ancient 01:40:23.920 |
civilization on Earth, not sci-fi futuristic, but an advanced human civilization, and that's where 01:40:30.640 |
the Great Pyramid of Giza is now. There was a smaller pyramid that was built there first, and a lot of these 01:40:35.440 |
other historical places were built, and then they were built on top of later, but that a lot of this 01:40:44.000 |
advanced civilization was wiped out during the last Ice Age. There was a very rapid freezing 01:40:51.920 |
event that happened over a period of about 1200 years, and that's when this great Ice Age era 01:40:56.640 |
civilization was wiped out, but what I didn't realize, and so I went down this really crazy 01:41:00.880 |
rabbit hole in the last week on how different planet Earth was just 12,000 01:41:07.600 |
years ago during the Ice Age. Have you guys spent any time on this? I just went down a similar 01:41:12.320 |
rabbit hole with the Grand Canyon. First of all, how the planet Earth has changed in such a short 01:41:17.840 |
period of time blows my mind, but the sea level was 400 feet lower than it is today just 12,000 01:41:26.080 |
years ago, and there were humans on Earth at the time, and so all of this area that we look at as 01:41:31.680 |
like Malta, the island of Malta, was the southern tip of a continental stretch that went into Italy, 01:41:38.880 |
so it was all part of one great landmass, and there's all this area that was actually part 01:41:44.000 |
of that landmass that now sits under that ocean there, and there's these ruts in the ground for 01:41:49.360 |
moving stuff and buildings and all this other crazy stuff, and we have no idea what's actually 01:41:54.960 |
under the ice in Greenland, what's under the ice in Antarctica. There's all these parts of Earth 01:41:59.520 |
where humans very likely had some... This is so off topic, we could cut this from the show. 01:42:04.160 |
No, I think it's incredible. Chamath and I... 01:42:06.320 |
It's so crazy that there's all these parts of Earth, and especially in the oceans, 01:42:11.120 |
as we start to explore, there were actually large human, potentially advanced, civilizations that 01:42:15.920 |
lived in these areas, not like sci-fi flying around. The Atlantis stuff, that it was actually 01:42:20.160 |
an advanced civilization, and then humans lost a lot of this ability when this period of freezing 01:42:26.240 |
happened over 1,200 years, and then a lot of it was preserved in legends and myths that showed up 01:42:30.800 |
in later archeology and later museums. How do you explain the pyramids? 01:42:35.280 |
I think he has a really interesting explanation. We don't want to ask you, 01:42:37.760 |
Cyan, I think, because we had Gavin explain it last time. Cyan, welcome to Conspiracy Corner. 01:42:42.160 |
So somebody sent me an email, and he said what they did was they flooded the area, 01:42:47.280 |
and then they floated the rocks into place. I think I mentioned that last time. They floated 01:42:49.280 |
the rocks up, yes. It's brilliant. Yeah, I've heard this, yeah. 01:42:51.760 |
But you know what, Chamath and I were talking about this, too, because when you remove all 01:42:54.640 |
that crust, because we actually were talking about it, we just didn't know the guy Graham's name. 01:42:57.600 |
It's just like Uranus. When you break away all that crust, 01:43:00.560 |
what did they find in Uranus, Freeberg? Mine was better, mine was better. 01:43:06.480 |
You got it. You landed the joke. It's great, it's great. 01:43:11.040 |
This has been another amazing episode of the All In Podcast. It's different, yeah, I can't say 01:43:17.680 |
anything other than, Cyan, you were great for a first time out. You got to the conspiracies, 01:43:23.280 |
you rocked it. You got to interject more, because it's a vibrant panel, but for a first time out, 01:43:29.920 |
do you have an alternative explanation for the pyramids, Cyan? 01:43:32.800 |
Yeah, Cyan, what is your take, and what about UFOs? 01:43:34.880 |
I've looked into, I mean, UFOs is the only one that I usually come back to, because, 01:43:39.760 |
you know, if you look at putting logs underneath and trying to roll them, or you look at 01:43:44.160 |
flooding an area, all of this just doesn't make a whole lot of sense. 01:43:48.000 |
And so, and then, you know, the fact that there are other civilizations that also have pyramids 01:43:55.040 |
that are stunning and feats of engineering as well, things like Stonehenge, et cetera. I mean, 01:44:01.440 |
there's just things that defy explanation. I don't know if you ever tried to make a catapult. 01:44:08.240 |
It's really hard. And so, like, we just did not have the technology, 01:44:13.840 |
or at least we can't find any definitive way that it happened. And so, I do think 01:44:19.520 |
there is a possibility that there was a more advanced civilization here, or we were visited, 01:44:26.880 |
I think it's mutants. I'm going with the X-Men theory. I think there were mutant human beings 01:44:30.400 |
who had the ability with superpowers to build them. 01:44:32.240 |
It could be that. It could be that. It could be we had control of matter and alchemy, 01:44:37.520 |
This is what we've come to now. We get conspiracy corner at the end of every program. We try to 01:44:42.080 |
figure out unsolved mysteries. Welcome to Unsolved Mysteries. And, oh, just a little 01:44:47.680 |
housekeeping here as we wrap. Our friends, our partners, dare I say, at Polymarket have 01:44:54.400 |
done us a solid, Freeberg. Check this out. We talked a little bit about our long debates here 01:45:02.000 |
on the program. So, Chamath, we created a market here. The Magnificent Seven shrinks below 30% 01:45:08.560 |
of the S&P 500 in 2025? A 44% chance is what people in the real world are pricing it at. 01:45:16.640 |
I see $11,000 already in volume. And then, Freeberg, you came up with one. 01:45:22.800 |
I guess we did this one together, but I think it should really be under your name: Will U.S. 01:45:27.520 |
national debt surpass $38 trillion in 2025? And then, third, talking about immigration, 01:45:33.120 |
we got a lot of passion around this topic. Trump's team and Trump himself said they're 01:45:37.520 |
going to deport 15 million immigrants from America. I said, "Hey, let's create a market for 01:45:45.680 |
Will Trump deport 750,000 or more people in 2025?" It's at a 38% chance. For those of you who don't know, 01:45:52.160 |
Obama, I think, did 2 million people in eight years. So, this is not like a partisan thing. 01:45:56.560 |
This is just a practical thing. So, anyway, go to Polymarket, look at the creators. You'll see 01:46:00.640 |
under that tab, that all in has a bunch of markets. We're doing this in partnership with 01:46:04.560 |
our partners who've partnered with us in a partnership at Polymarket, #FTCPartnership. 01:46:24.400 |
And it said, "We open-sourced it to the fans and they've just gone crazy with it." 01:46:35.600 |
Let your winners ride. Let your winners ride. 01:46:41.600 |
That's my dog taking a notice in your driveway, Sachs. 01:46:49.600 |
We should all just get a room and just have one big huge orgy because they're all just useless. 01:46:54.560 |
It's like this sexual tension that they just need to release somehow.