Marc Andreessen: Trump, Power, Tech, AI, Immigration & Future of America | Lex Fridman Podcast #458

Chapters
0:00 Introduction
1:09 Best possible future
10:32 History of Western Civilization
19:51 Trump in 2025
27:32 TDS in tech
40:19 Preference falsification
56:15 Self-censorship
71:18 Censorship
79:57 Jon Stewart
82:43 Mark Zuckerberg on Joe Rogan
91:32 Government pressure
102:19 Nature of power
115:08 Journalism
120:43 Bill Ackman
125:40 Trump administration
133:19 DOGE
147:11 H1B and immigration
185:05 Little tech
197:25 AI race
206:15 X
209:47 Yann LeCun
213:21 Andrew Huberman
214:53 Success
217:49 God and humanity
I mean, look, we're adding a trillion dollars to the national debt every 100 days right now. 00:00:04.240 |
And it's now passing the size of the Defense Department budget. 00:00:06.800 |
And it's compounding and it's pretty soon it's going to be adding a trillion dollars every 90 days. 00:00:10.000 |
And then it's going to be adding a trillion dollars every 80 days. 00:00:11.880 |
And then it's going to be a trillion dollars every 70 days. 00:00:14.140 |
And then if this doesn't get fixed at some point, we enter a hyperinflationary spiral 00:00:17.480 |
and we become Argentina or Brazil. The following is a conversation with Marc Andreessen. 00:00:26.100 |
His second time on the podcast, Marc is a visionary tech leader and investor who fundamentally 00:00:31.860 |
shaped the development of the Internet and the tech industry in general over the past 30 years. 00:00:37.680 |
He is the co-creator of Mosaic, the first widely used web browser, co-founder of Netscape, 00:00:45.280 |
co-founder of the legendary Silicon Valley venture capital firm Andreessen Horowitz, 00:00:50.600 |
and is one of the most influential voices in the tech world, including at the intersection of tech and politics. 00:01:00.880 |
This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. 00:01:04.680 |
And now, dear friends, here's Marc Andreessen. 00:01:11.920 |
If you were to imagine the best possible one to two years, 2025, 26, for tech, for big 00:01:23.280 |
Lay out your vision for the best possible scenario trajectory for America. 00:01:32.580 |
It is remarkable over the last several years with all of the issues, including not just 00:01:37.520 |
everything in politics, but also COVID and every other thing that's happened. 00:01:42.280 |
If you just look at economic growth charts, the US just kept growing and very significantly 00:01:46.740 |
So Canada stopped growing, the UK stopped growing, Germany stopped growing, and some 00:01:50.920 |
of those countries may be actually going backwards at this point. 00:01:54.000 |
And there's a very long discussion to be had about what's wrong with those countries. 00:01:57.800 |
And there's, of course, plenty of things that are wrong with our country, but the US is 00:02:04.440 |
And I think that's a consequence of many factors, some of which are lucky and some of which 00:02:11.200 |
I mean, the lucky part is just, number one, we just have incredible physical security 00:02:20.240 |
There's this running joke now that whenever it looks like the US is going to run out of 00:02:22.920 |
some rare earth material, some farmer in North Dakota kicks over a hay bale and finds a $2 00:02:29.760 |
I mean, we're just blessed with geography and with natural resources. 00:02:35.040 |
We can be energy independent anytime we want. 00:02:37.680 |
This last administration decided they didn't want to be. 00:02:41.120 |
This new administration has declared that they have a goal of turning it on in a dramatic way. 00:02:46.120 |
There's no question we can be energy independent. 00:02:51.600 |
And I think the new administration is going to do that. 00:02:56.400 |
One is we are the beneficiaries, and you're an example of this, we're the beneficiary 00:03:02.000 |
of 50, 100, 200 years of like the basically most aggressive driven, smartest people in 00:03:06.480 |
the world, most capable people moving to the US and raising their kids here. 00:03:10.440 |
And so we just have by far the most dynamic, we're by far the most dynamic population, 00:03:15.060 |
most aggressive, we're the most aggressive set of characters certainly in any Western 00:03:19.440 |
country and have been for a long time and certainly are today. 00:03:23.340 |
And then finally, I would just say, look, we are overwhelmingly the advanced technology leader. 00:03:28.320 |
We have our issues and we have a particular issue with manufacturing, which we could talk 00:03:31.880 |
about, but for anything in software, anything in AI, anything in all these advanced biotech, 00:03:38.180 |
all these advanced areas of technology, we're by far the leader, again, in part because 00:03:41.720 |
many of the best scientists and engineers in those fields come to the US. 00:03:46.320 |
And so we have all of the preconditions for just a monster boom. 00:03:53.800 |
I could see productivity growth going way up, rate of technology adoption going way up. 00:03:58.080 |
I mean, we can do a global tour if you like, but like basically all of our competitors 00:04:02.120 |
have like profound issues and we could kind of go through them one by one, but the competitive 00:04:06.680 |
landscape just is, it's like it's remarkable how much better position we are for growth. 00:04:13.460 |
What about the humans themselves, almost a philosophical question? 00:04:16.720 |
I travel across the world and there's something about the American spirit, the entrepreneurial spirit. 00:04:26.040 |
I've talked to Saagar, who claims it might be the Scots-Irish blood that runs through our veins. 00:04:36.360 |
You're at the heart of Silicon Valley. Is there something in the water? 00:04:43.400 |
So is this a family show or am I allowed to swear? 00:04:47.960 |
So the great TV show Succession, the show of course, which you were intended to root for 00:04:53.720 |
The best line for Succession was in the final episode of the first season when the whole 00:04:57.280 |
family's over in Logan Roy's ancestral homeland of Scotland and they're at this castle for 00:05:02.400 |
some wedding and Logan is just completely miserable because he's been in New York for 00:05:07.960 |
He's totally miserable being back in Scotland and he gets in some argument with somebody 00:05:11.760 |
and he says finally, he just says, "My God, I cannot wait to get out of here and go back 00:05:15.500 |
to America where we can fuck without condoms." 00:05:25.760 |
And then everybody instantly knows what the, like everybody watching that instantly starts 00:05:28.760 |
laughing because you know what it means, which is, it's exactly this. 00:05:31.000 |
I think there's like an ethnographic view of it. 00:05:33.040 |
There's a bunch of books on like all, like you said, the Scots-Irish, like all the different 00:05:35.740 |
derivations of all the different ethnic groups that have come to the US over the course of its history. 00:05:40.960 |
But it's, and what we have is this sort of amalgamation of like the Northeast Yankees 00:05:50.080 |
We've got the Southerners and the Texans, and the sort of whole kind of blended kind 00:05:55.320 |
of Anglo-Hispanic thing with super incredibly tough, strong, driven, capable characters. 00:06:00.680 |
The Texas Rangers, we've got the California, we've got the wild, we've got the incredibly 00:06:07.960 |
inventive hippies, but we also have the hardcore engineers. 00:06:10.400 |
We've got the best rocket scientists in the world. 00:06:12.520 |
We've got the best artists in the world, creative professionals, the best movies. 00:06:17.800 |
And so, yeah, there is, you know, all of our problems I think are basically, in my view 00:06:24.320 |
to some extent, attempts to basically sand all that off and make everything basically 00:06:29.800 |
But there is something in the national spirit that basically keeps bouncing back. 00:06:33.600 |
And basically what we discover over time is we basically just need people to stand up 00:06:37.080 |
at a certain point and say, you know, it's time to, you know, it's time to build, it's 00:06:40.860 |
time to grow, you know, it's time to do things. 00:06:43.640 |
And so, and there's something in the American spirit that just like reversed right back 00:06:47.120 |
I mean, before I actually saw, you know, I saw it as a kid here in the early eighties, 00:06:51.080 |
you know, cause the seventies were like horribly depressing, right? 00:06:54.680 |
In the U.S., like, they were a nightmare on many fronts. 00:06:58.420 |
And in a lot of ways, the last decade to me has felt a lot like the seventies, just being 00:07:02.960 |
mired in misery and just this self-defeating, you know, negative attitude and everybody's 00:07:09.120 |
And, you know, and then by the way, like energy crisis and hostage crisis and foreign wars 00:07:16.760 |
You know, the low point in the seventies was, you know, Jimmy Carter, who just passed 00:07:20.120 |
away, he went on TV and he gave this speech known as the malaise speech. 00:07:22.800 |
And it was like the weakest possible trying to like rouse people back to a sense of like 00:07:28.800 |
And, you know, we had the, you know, the hostages in, you know, Iran for 440 days and every 00:07:33.840 |
night on the nightly news, it was, you know, lines around the block, energy crisis, depression, 00:07:39.440 |
And then, you know, Reagan came in and, you know, Reagan was a very controversial character at the time. 00:07:43.280 |
And, you know, he came in and he's like, yep, nope, it's morning in America. 00:07:46.360 |
And we're the shiny city on the hill and we're going to do it. 00:07:50.440 |
And the national spirit came roaring back and, you know, worked really hard for a full 00:07:53.480 |
And I think that's exactly what, I think, you know, we'll see, but I think that's what 00:07:57.280 |
And I just did a super long podcast on Milton Friedman with Jennifer Burns, who's this incredible historian. 00:08:05.680 |
So there's a bunch of components to that, one of which is economic and one of which 00:08:11.160 |
maybe you can put a word on it of, not to be romantic or anything, but freedom, uh, 00:08:16.880 |
individual freedom, economic freedom, political freedom, and just in general individualism. 00:08:24.000 |
And as you know, America has this incredible streak of individualism, you know, and individualism 00:08:28.120 |
in America probably peaked, I think, between roughly call it the end of the Civil War, 00:08:32.040 |
1865, through to probably call it 1931 or something, um, you know, and there was this 00:08:36.900 |
like incredible, I mean, that period, you know, we now know that period is the second 00:08:39.860 |
industrial revolution and it's when the United States basically assumed global leadership 00:08:43.360 |
and basically took, took over technological and economic leadership from, from England. 00:08:46.840 |
Um, and then, you know, that, that led to, you know, ultimately then therefore being 00:08:50.200 |
able to, you know, not only industrialize the world, but also win world war two and 00:08:54.240 |
Um, and yeah, you know, there's a massive industrial, you know, massive, uh, individualistic 00:08:59.540 |
Um, by the way, you know, Milton Friedman, Milton Friedman's old videos are all on YouTube. 00:09:02.580 |
They are every bit as compelling and inspiring, um, as they, uh, as they were then, um, you 00:09:08.360 |
know, he's this, he's a singular figure and many of us, you know, have, you know, I never 00:09:11.440 |
knew him, but, um, uh, he was at, uh, actually at Stanford for many years at the Hoover institution, 00:09:15.680 |
but, uh, I never met him, but I know a lot of people who worked with them and you know, 00:09:18.840 |
that, that, you know, he was, he was a singular figure, but his, his, all, all of his lessons, 00:09:24.360 |
Um, but I would also say it's not just individualism and this is, you know, one of, this is one 00:09:28.120 |
of the big things that's like playing out in a lot of our culture and kind of political 00:09:31.360 |
fights right now, which is, you know, basically this feeling, you know, certainly that I have 00:09:35.360 |
and I share with a lot of people, which is it's not enough for America to just be an 00:09:40.240 |
Um, and it's not enough for us to just be individuals and it's not enough to just have 00:09:43.680 |
line go up and it's not enough to just have economic success. 00:09:46.600 |
There are deeper questions, uh, at play and, and, and also, you know, there, there, there's 00:09:50.280 |
more to a country, uh, than just that and, and, you know, quite, quite frankly, a lot 00:09:55.400 |
Um, a lot of it is, you know, involves spirit, um, and, and passion. 00:09:58.840 |
And you know, like I said, we, we have more of it than anybody else. 00:10:01.280 |
Um, but, um, you know, we, we have to choose to want it. 00:10:04.720 |
The way I look at it is like all of our problems are self-inflicted, like they're, you know, 00:10:09.160 |
You know, all of our problems are basically demoralization campaigns, you know, basically 00:10:13.200 |
people telling us people in positions of authority telling us that we should, you know, we shouldn't, 00:10:21.200 |
You know, we shouldn't, you know, this, that, and the other thing, and we should feel bad 00:10:25.360 |
And I think we've lived through a decade where that's been the prevailing theme. 00:10:28.560 |
And I, I think quite honestly, as of November, I think people are done with it. 00:10:33.080 |
If we could go on a tangent of a tangent, since we're talking about individualism and 00:10:37.760 |
that's not all that it takes, you've mentioned in the past the book The Ancient City by, 00:10:42.700 |
if I can only pronounce the name, French historian Numa Denis Fustel de Coulanges. 00:10:52.600 |
Anyway, you said this is an important book to understand who we are and where we come 00:10:54.920 |
So what that book does, it's actually quite a striking book. 00:10:57.120 |
Um, so that book is written by this guy, um, a professor, I'm going to let Lex do the 00:11:02.200 |
pronunciations, the foreign language pronunciations for the day. 00:11:05.160 |
Um, he was a professor of classics, um, at, uh, the Sorbonne in, um, in Paris, you know, 00:11:11.400 |
the top university, uh, at, um, in the, in the, actually in the 1860s. 00:11:14.800 |
So actually right around after the US civil war. 00:11:18.000 |
And he was a savant of a particular kind, which is he, and you can see this in the book 00:11:21.520 |
as he had apparently read and sort of absorbed and memorized every possible scrap of Greek and Roman writing. 00:11:27.360 |
Um, and so it was like a walking like index on basically Greek and Roman, everything we 00:11:34.000 |
The reason this matters is because basically none of that has changed, right? 00:11:36.720 |
And so he, he, he had access to the exact same written materials that we have, we have 00:11:40.600 |
And so there, you know, we, we've learned nothing. 00:11:42.040 |
And then specifically what he did is he talked about the Greeks and the Romans, but specifically 00:11:46.480 |
He reconstructed the people who came before the Greeks and the Romans and what their life 00:11:50.400 |
And these were the people who were now known as the, as the Indo Europeans. 00:11:53.360 |
And these were, or you may have heard of these, these are the people who came down from the steppes. 00:11:56.880 |
And so they, they, they came out of what's now like Eastern Europe, like around sort 00:12:03.120 |
They ultimately took over all of Europe, by the way, you know, also many of the ethnicities 00:12:07.280 |
in the Americas, in the hundreds of years to follow, you know, are Indo-European. 00:12:10.680 |
And so like, you know, they were this, basically this warrior, basically class that like came 00:12:14.360 |
down and swept through and, and, and, and, you know, essentially you know, populated 00:12:19.080 |
And there's a whole interesting saga there, but what he does, and then they basically 00:12:22.600 |
that they, from there came basically what we know as the Greeks and the Romans were 00:12:27.840 |
And so what he reconstructs is sort of what life was like, what life was like, at least 00:12:31.260 |
in the West for people in their kind of original social state. 00:12:34.720 |
And the significance of that is, is the original social state is this, is living in the state 00:12:38.240 |
of the absolute imperative for survival with absolutely no technology, right? 00:12:46.560 |
You've got your, you know, you've, you've got whatever you can build with your bare hands. 00:12:50.480 |
This, you know, predates basically all concepts of, of, of technology as we understand it. 00:12:54.800 |
And you put people under like maximum levels of physical survival pressure. 00:12:58.080 |
And so what, what social patterns did they evolve to be able to do that? 00:13:00.840 |
And then the social pattern basically was as follows, um, is a, is a three-part social 00:13:04.960 |
structure, uh, family, um, uh, tribe, and city, um, and, um, zero concept of individual 00:13:11.160 |
rights, um, and essentially no concept of individualism. 00:13:14.160 |
And so you were not an individual, you were a member of your family, and then a set of families would aggregate into a tribe. 00:13:20.160 |
And then a set of tribes would aggregate into a, um, uh, into a city. 00:13:23.720 |
Um, and then the morality was completely, it was actually what Nietzsche talks, Nietzsche 00:13:28.080 |
talks about it, the, the morality was entirely master morality, not slave morality. 00:13:31.320 |
And so in their morality, anything that was strong was good and anything that was weak was bad. 00:13:37.600 |
It's because strong equals good equals survive. 00:13:41.460 |
And that led to what became known later as the master-slave dialectic, which is, is it 00:13:45.040 |
more important for you to live on your feet as a master, even at the risk of dying? 00:13:48.200 |
Or are you willing to, um, you know, live as a slave on your knees in order to not die? 00:13:51.840 |
And this is sort of the, the derivation of that moral framework. 00:13:54.760 |
Christianity later inverted that moral framework, but it, you know, the, the original, uh, framework 00:13:58.320 |
lasted for, you know, many, many thousands of years. 00:14:01.040 |
No concept of individualism, the head of the family had total life and death control over 00:14:03.880 |
the, over, over the family, the head of the tribe, same thing, head of the city, same thing. 00:14:07.800 |
And then you were morally obligated to kill members of the, of the other cities on, on sight. 00:14:14.040 |
You were morally required to, like, if you didn't do it, you were a bad person. 00:14:15.760 |
Um, and then the form of the society was basically maximum fascism combined with maximum communism. 00:14:22.800 |
And so it was maximum fascism in the form of this, like, absolute top-down control where 00:14:27.000 |
the head of the family, tribe, or city could kill other members of the community at any time. 00:14:32.640 |
So maximum hierarchy, but combined with maximum communism, which is no market economy. 00:14:39.240 |
And the, and sort of the point of being in one of these collectives is that it's a collective 00:14:42.400 |
and, and, and, you know, and, and people are sharing. 00:14:44.560 |
And of course that limited how big they could get, because, you know, the problem with communism 00:14:50.920 |
is, it doesn't work. To make it work at the level of a country, impossible. 00:14:55.600 |
And then, and then it was all intricately tied into their religion. 00:14:59.720 |
And their, their religion was, uh, it was in two parts. 00:15:02.160 |
It was a veneration of ancestors, um, and it was veneration of nature and the veneration 00:15:06.920 |
of ancestors is extremely important because it was basically like, basically the ancestors 00:15:11.240 |
were the people who got you to where you were, the ancestors were the people who had everything 00:15:16.480 |
And then it was veneration of nature because of course nature is the thing that's trying to kill you. 00:15:20.280 |
Um, and then you had your ancestor, every family, tribe, or city had their ancestor 00:15:22.620 |
gods and then they had their, um, they had their nature gods. 00:15:25.960 |
So fast forward to today, like we live in a world that is like radically different, 00:15:29.280 |
but in the, in the book takes you through kind of what happened from that through the 00:15:34.280 |
And so that, but it's very helpful to kind of think in these terms because the conventional 00:15:37.600 |
view of the progress through time is that we are, you know, the, the cliche is the arc 00:15:42.280 |
of the moral universe, you know, bends towards justice, right? 00:15:45.400 |
Or so-called Whig history, which is, you know, that the arc of progress is positive. 00:15:49.520 |
And so we, we, you know, what you hear all the time, what you're taught in school and 00:15:52.080 |
everything is, you know, every year that goes by, we get better and better and more and 00:15:54.960 |
more moral and more and more pure and a better version of our, of ourselves. 00:15:58.360 |
Our Indo-European ancestors would say, Oh no, like you people have like fallen to shit. 00:16:02.920 |
Like you people took all of the principles of basically your civilization and you have 00:16:06.800 |
diluted them down to the point where they barely even matter, you know, and you're having, 00:16:10.840 |
you know, children out of wedlock and you're, you know, you regularly encounter people of 00:16:13.880 |
other cities and you don't try to kill them and like, how crazy is that? 00:16:17.000 |
And they would basically consider us to be living like an incredibly diluted version 00:16:20.960 |
of this sort of highly religious, highly cult-like, right. 00:16:24.800 |
Highly organized, highly fascist, fascist communist society. 00:16:27.800 |
I can't resist noting that as a consequence of basically going through all the transitions 00:16:33.720 |
we've been through, going all the way through Christianity, coming out the other end of 00:16:36.880 |
Christianity, Nietzsche declares God is dead. 00:16:38.760 |
We're in a secular society, you know, that still has, you know, tinges of Christianity, 00:16:42.520 |
but you know, largely prides itself on no longer being religious in that way. 00:16:46.840 |
You know, we being the sort of most fully evolved, modern, secular, you know, expert 00:16:51.320 |
scientists and so forth have basically re-evolved or fallen back on the exact same religious 00:16:55.720 |
structure that the Indo-Europeans had, specifically ancestor worship, which is identity politics 00:17:02.120 |
and nature worship, which is environmentalism. 00:17:05.040 |
And so we have actually like worked our way all the way back to their cult religions without 00:17:09.040 |
And, and, and it just goes to show that like, you know, in some ways we have fallen far 00:17:11.800 |
from the, far from the family tree, but in, in, in some cases we're exactly the same. 00:17:16.240 |
You kind of described this progressive idea of wokeism and so on as worshiping ancestors. 00:17:23.680 |
Identity politics is worshiping ancestors, right? 00:17:25.560 |
It's, it's, it's tagging newborn infants with either, you know, benefits or responsibilities 00:17:29.880 |
or, you know, levels of condemnation based on who their ancestors were. 00:17:33.660 |
The Indo-Europeans would have recognized it on sight. 00:17:36.960 |
We somehow think it's like super socially progressive. 00:17:41.000 |
I mean, I would say obviously not, you know, get, get nuanced, which is where I think you're 00:17:44.680 |
headed, which is look like is the idea that you can like completely reinvent society every 00:17:48.780 |
generation and have no regard whatsoever for what came before you. 00:17:53.400 |
That's like the Cambodians with year zero under Pol Pot and you know, death follows. 00:17:57.400 |
It's obviously the Soviets tried that, um, you know, the, the, you know, the, the utopian 00:18:02.640 |
fantasists who think that they can just rip up everything that came before and create 00:18:05.400 |
something new in the human condition and human society have a very bad history of, of causing enormous suffering. 00:18:10.200 |
So on the one hand, it's like, okay, there, there is like a deeply important role for tradition. 00:18:14.820 |
And, and, and the way I think about that is it's, it's the process of evolutionary learning, 00:18:19.560 |
Which is what, what tradition ought to be is the distilled wisdom of all. 00:18:22.400 |
And, and, you know, this is how the Indo-Europeans thought about it. 00:18:24.040 |
It should be the distilled wisdom of everybody who came before you, right? 00:18:27.400 |
All those important and powerful lessons learned. 00:18:29.600 |
Um, and that's, that's why I think it's fascinating to go back and study how these people lived 00:18:32.740 |
is because that's, that's part of the history and part of the learning that got us to where we are today. 00:18:37.020 |
Having said that, there are many cultures around the world that are, you know, mired 00:18:40.260 |
in tradition to the point of not being able to progress. 00:18:42.420 |
Um, and in fact, you might even say globally, that's the default human condition, which 00:18:46.460 |
is, you know, a lot of people are in societies in which, you know, there's like absolute 00:18:49.500 |
seniority by age, you know, kids are completely, you know, like in the U S like for some reason 00:18:53.780 |
we decided kids are in charge of everything, right? 00:18:55.620 |
And like, you know, they're the trendsetters and they're allowed to like set all the agendas 00:18:58.380 |
and like set, set all the politics and set all the culture. 00:19:00.380 |
And maybe that's a little bit crazy, but like in a lot of other cultures, kids have no voice 00:19:03.940 |
at all, no role at all because it's the old people who were in charge of everything, you 00:19:07.300 |
know, they're gerontocracies and it's all a bunch of 80-year-olds running everything, 00:19:11.460 |
which by the way, we have a little bit of that too. 00:19:15.140 |
Um, and so what I would say is, like, there's no doubt there's a, there's a real downside, 00:19:18.380 |
you know, full traditionalism is communitarianism, you know, it's ethnic particularism, um, you 00:19:23.700 |
know, it's ethnic chauvinism, it's, um, you know, this incredible level of, of resistance to change. 00:19:28.380 |
Um, you know, that's, I mean, it just doesn't get you anywhere. 00:19:31.340 |
Like it, it may be good and fine at the level of an individual tribe, but at the level of a society, 00:19:35.780 |
You, you can't evolve, you can't, you can't advance, you can't participate in all the 00:19:39.380 |
good things that, you know, that, that have happened. 00:19:41.420 |
And so I, you know, I think probably this is one of those things where extremeness on either side is bad. 00:19:46.260 |
Um, and, you know, but, but this needs to be approached in a sophisticated and nuanced way. 00:19:51.500 |
So the beautiful picture you painted of the roaring twenties, how can the Trump administration help make that happen? 00:20:01.220 |
So a big part of this is getting the government boot off the neck of the American economy, 00:20:06.100 |
the American technology industry, the American people, um, you know, and then again, this 00:20:10.180 |
is a replay of what happened in the sixties and seventies, which is, you know, for what 00:20:13.260 |
started out looking like, you know, I'm sure good and virtuous purposes, you know, we, 00:20:17.100 |
we ended up both then and now with this, you know, what I, what I described as sort of 00:20:20.180 |
a form of soft authoritarianism, you know, the, the good news is it's not like a military 00:20:24.780 |
It's not like, you know, you get thrown into the Lubyanka, you know, for the most part, and 00:20:27.820 |
that's not coming at four in the morning, you're not getting dragged off to a cell. 00:20:30.720 |
So it's not hard authoritarianism, but it is soft authoritarianism. 00:20:33.780 |
And so it's this, you know, incredible suppressive blanket of regulation rules, you know, this 00:20:40.860 |
What's required to get anything done, you know, you need to get 40 people to sign off on anything. 00:20:45.860 |
You know, that's a lot of how our political system now works. 00:20:48.580 |
Um, and then, you know, just this general idea of, you know, progress is bad and technology 00:20:53.220 |
is bad and capitalism is bad and building businesses is bad and success is bad. 00:20:57.300 |
Um, you know, tall poppy syndrome, you know, basically anybody who sticks their head up, 00:21:02.360 |
you know, deserves to get it, you know, chopped off. 00:21:04.240 |
Anybody who's wrong about anything deserves to get condemned forever, you know, just this, 00:21:07.960 |
this very kind of, you know, grinding, you know, repression and then coupled with specific 00:21:12.920 |
government actions such as censorship regimes, right? 00:21:18.000 |
Um, and, you know, draconian, you know, deliberately kneecapping, you know, critical American industries. 00:21:23.240 |
Um, and then, you know, patting yourselves on the back for doing it or, you know, having 00:21:26.400 |
these horrible social policies like, let's let all the criminals out of jail and see what happens. 00:21:31.640 |
Um, and so like we, we've just been through this period, I, you know, I call it a demoralization 00:21:35.600 |
Like we've just been through this period where, you know, whether it started that way or not, 00:21:38.120 |
it ended up basically being this comprehensive message that says you're terrible and if you 00:21:41.400 |
try to do anything, you're terrible and fuck you. 00:21:44.080 |
Um, and the Biden administration reached kind of the full pinnacle of that. 00:21:49.160 |
They, they got really bad on, on many fronts at the same time. 00:21:51.160 |
Um, and so just like relieving that, um, and getting kind of back to a reasonably, you 00:21:57.560 |
know, kind of optimistic, constructive, um, you know, pro-growth frame of mind. 00:22:01.280 |
Um, there's just, there's so much pent up energy and potential in the American system 00:22:04.840 |
that that alone is going to, I think, cause, you know, growth and, and, and, and, and spirit 00:22:09.300 |
And then there's a lot of things proactively, but yeah, and then there's a lot of things 00:22:15.360 |
To what degree has the thing you described ideologically permeated government and permeated 00:22:23.920 |
Disclaimer at first, which is, I don't want to predict anything on any of this stuff cause 00:22:26.800 |
I've learned the hard way that I can't predict politics or Washington at all. 00:22:30.480 |
Um, but I would just say that the, the, the plans and intentions are clear and the staffing 00:22:34.960 |
Um, and all the conversations are consistent, um, with the new administration and that they 00:22:38.320 |
plan to take, you know, very rapid action on a lot of these fronts very quickly. 00:22:42.240 |
They're going to do as much as they can through executive orders and then they're going to 00:22:44.360 |
do legislation and regulatory changes for the rest. 00:22:47.320 |
And so they're, they're going to move, I think quickly on a whole bunch of stuff. 00:22:49.480 |
You can already feel, I think, a shift in the national spirit, or at least let's put it this way. 00:22:54.520 |
And in Silicon Valley, like it, you know, I mean, we, we, you know, we just saw a great 00:22:56.880 |
example of this with what, you know, with what Mark Zuckerberg is doing. 00:22:59.440 |
Um, you know, and obviously I'm, I'm involved with, with his company, but you know, we, 00:23:02.440 |
we just saw it kind of in public, the scope of it and speed of the changes, you know, 00:23:05.720 |
are reflective of, of sort of this, of a lot of these shifts. 00:23:08.800 |
But I would say that that same conversation, those same kinds of things are happening throughout 00:23:13.560 |
You know, the, the, the tech industry itself, whether people were pro Trump or anti-Trump, 00:23:16.800 |
like there's just like a giant vibe shift, mood shift that's like kicked in already. 00:23:20.320 |
And then I was with a group of Hollywood people about two weeks ago. 00:23:23.560 |
Um, and they were still, you know, people who at least, at least vocally were still 00:23:28.040 |
But I said, you know, has anything changed since, since November 6th? 00:23:31.040 |
And they, they immediately said, Oh, it's completely different. 00:23:33.080 |
Um, it feels like the ice has thawed, um, you know, woke is over, um, you know, they 00:23:37.600 |
said that all kinds of projects are going to be able to get made now that couldn't 00:23:39.960 |
before, that, you know, they're probably going to start making comedies again, you know, 00:23:43.560 |
like it's, it's, they were just like, it's, it's like a, it's like a, just like an incredible 00:23:49.560 |
And I'm, as I talk to people kind of throughout, you know, certainly throughout the economy, 00:23:53.120 |
people who run businesses, I hear that all the time, which is just this, this last 10 00:23:57.600 |
I mean, the one that I'm watching, that's really funny. 00:24:00.000 |
I mean, Facebook's getting a lot, that is getting a lot of attention, but the other funny one 00:24:02.880 |
is BlackRock, which I'm not, which I, you know, and I, I don't know him, but I've watched 00:24:07.280 |
You know, Larry Fink, who's the CEO of BlackRock, was like first in as a major investment 00:24:12.200 |
CEO on like every dumb social trend and rule set, like every, all right, I'm going for it. 00:24:21.040 |
Every retard, every retarded thing you can imagine, every ESG and every, like every possible way of 00:24:27.800 |
saddling companies with every aspect of, just, these crazed ideological positions. 00:24:31.800 |
And, you know, he was coming in, he literally was like, had aggregated together 00:24:35.440 |
trillions of dollars of, of shareholdings that he did not own, that were, you know, that 00:24:40.040 |
were his, his customers', right, and he, you know, seized their voting control of their 00:24:43.920 |
shares and was using it to force all these companies to do all of this like crazy ideological 00:24:48.480 |
And he was like the typhoid Mary of all this stuff in corporate America. 00:24:50.840 |
And if he, in the last year has been like backpedaling from that stuff, like as fast 00:24:55.280 |
And I saw just an example last week, he pulled out of that, whatever the corporate net zero 00:24:58.960 |
Alliance, you know, he pulled out of the crazy energy, energy, energy stuff. 00:25:02.400 |
And so like, you know, he's backing away as fast as he can. 00:25:05.960 |
Remember that Richard Pryor backwards walk, Richard Pryor had this way where he could, 00:25:09.520 |
he could back out of a room while looking like he was walking forward. 00:25:17.840 |
And just the whole thing, I mean, I don't know if you saw the court recently ruled that 00:25:21.040 |
the NASDAQ had these crazy board of directors, composition rules. 00:25:24.760 |
One of the funniest moments of my life is when my friend Peter Thiel and I were on the 00:25:27.920 |
Meta board and these NASDAQ rules came down that mandated diversity on corporate boards. 00:25:33.000 |
And so we sat around the table and had to figure out, you know, which of us counted as diverse. 00:25:36.400 |
And the very professional attorneys at Meta explained with a 100% complete straight 00:25:42.080 |
face that Peter Thiel counts as diverse by virtue of being LGBT. 00:25:47.000 |
And this is a guy who literally wrote a book called The Diversity Myth. 00:25:50.640 |
And he literally looked like he'd swallowed a live goldfish. 00:25:56.760 |
I mean, this was like so incredibly offensive to him that like, it just like, it was just 00:25:59.560 |
absolutely appalling, and I felt terrible for him, but the look on his face was very 00:26:03.560 |
It was imposed by NASDAQ, you know, your stock exchange imposing this stuff on you. 00:26:07.560 |
And then the court, whatever the court of appeals just nuked that, you know, so like 00:26:11.720 |
these things basically are being like ripped down one by one. 00:26:14.760 |
And what's on the other side of it is basically, you know, finally being able to get back to, 00:26:18.420 |
you know, everything that, you know, everybody always wanted to do, which is like run their 00:26:21.320 |
companies, have great products, have happy customers, you know, like succeed, like succeed, 00:26:26.080 |
achieve, outperform, and, you know, work with the best and the brightest and not be made to feel bad about it. 00:26:31.360 |
And I think that's happening in many areas of American society. 00:26:33.640 |
It's great to hear that Peter Thiel is fundamentally a diversity hire. 00:26:38.320 |
Well, so it was very, you know, there was a moment. 00:26:40.960 |
So Peter, you know, Peter, of course, you know, is, you know, is publicly gay, has been 00:26:45.120 |
for a long time, you know, but, you know, there are other men on the board, right? 00:26:48.960 |
And you know, we're sitting there and we're all looking at it and we're like, all right, 00:26:52.440 |
And we just, we keep coming back to the B, right? 00:26:55.920 |
And it's like, you know, it's like, all right, you know, I'm willing to do a lot for this 00:27:02.680 |
company, but it's all about sacrifice for diversity. 00:27:07.840 |
And then it's like, okay, like, is there a test, you know, so, oh yeah, exactly. 00:27:15.560 |
The questions that got asked, you know, what are you willing to do? 00:27:21.160 |
I think I'm very good at asking lawyers completely absurd questions with a totally straight face. 00:27:30.880 |
I think in fairness, they have trouble telling when I'm joking. 00:27:32.760 |
So you mentioned the Hollywood folks, maybe people in Silicon Valley and vibe shift. 00:27:38.900 |
Maybe you can speak to preference falsification. 00:27:46.440 |
What, like, what percent of them are feeling this vibe shift and are interested in creating 00:27:54.400 |
the Roaring Twenties in the way David described? 00:27:59.880 |
So there's like all of Silicon Valley and the way to just measure that is just to look at the voting. 00:28:05.240 |
And what that shows consistently is Silicon Valley is just a, you know, at least historically 00:28:07.600 |
my entire time there has been overwhelmingly majority, just straight up Democrat. 00:28:12.080 |
The other way to look at that is political donation records. 00:28:14.120 |
And again, you know, the political donations in the Valley range from 90 to 99%, you know, Democrat. 00:28:20.080 |
And so, you know, we'll, we'll, I just bring it up because like, we'll see what happens 00:28:22.680 |
with the voting and with donations going forward. 00:28:25.980 |
We maybe talk about the fire later, but I can tell you there is a very big question 00:28:28.920 |
of what's happening in Los Angeles right now. 00:28:30.520 |
I don't want to get into the fire, but like it's catastrophic and, you know, there was 00:28:34.680 |
already a rightward shift in the big cities in California. 00:28:37.480 |
And I think a lot of people in LA are really thinking about things right now as they're 00:28:41.360 |
trying to, you know, literally save their houses and save their families. 00:28:44.720 |
But you know, even in San Francisco, there was a big right, there was a big shift to the right. 00:28:49.840 |
So we'll see where, we'll see where that goes. 00:28:51.360 |
But you know, you observe that by just looking at, looking at the numbers over time. 00:28:54.940 |
The part that I'm more focused on is, you know, and I don't know how to exactly describe 00:28:58.400 |
this, but it's like the top thousand or the top 10,000 people. 00:29:02.480 |
And you know, I don't have a list, but like it's the, you know, it's all the top founders, 00:29:06.080 |
top CEOs, top executives, top engineers, top VCs, you know, and then kind of into the ranks, 00:29:10.960 |
you know, the people who kind of built and run the companies and they're, you know, I 00:29:15.000 |
don't have numbers, but I have a much more tactile feel, you know, for what's happening. 00:29:21.040 |
So the big thing I have now come to believe is that the idea that people have beliefs 00:29:34.080 |
And I think even most high status people just go along. 00:29:36.340 |
And I think maybe the most high status people are the most prone to just go along because 00:29:42.920 |
And the way I would describe that is, you know, one of the great forbidden philosophers 00:29:47.200 |
of our time is the Unabomber, Ted Kaczynski, and amidst his madness, he had this extremely 00:29:53.480 |
You know, he was a, he was a, he was an insane lunatic murderer, but he was also a, you know, 00:30:02.120 |
Shots fired, man, but, uh, he was a very bright guy and he, he did this whole thing, um, where 00:30:08.920 |
he talked about, basically he was very right wing and talked about leftism. 00:30:13.320 |
Um, and he had this great concept that's just stuck in my mind ever since I read it, which 00:30:16.120 |
is he had this concept he just called over-socialization. 00:30:19.200 |
Um, and so, you know, most people are socialized, most people are socialized. 00:30:23.520 |
Like most people are, you know, we live in a society. 00:30:25.480 |
Most people learn how to be part of a society. 00:30:29.680 |
There's something about modern Western elites where they're over socialized, um, and they're 00:30:33.560 |
just like overly oriented towards what other people like themselves, you know, think, um, 00:30:41.560 |
If you have a little bit of an outside perspective, which I just do, I think as a consequence 00:30:45.280 |
of where I grew up, um, like even before I had the views that I have today, there was 00:30:49.520 |
always just this weird thing where it's like, why does every dinner party have the exact same conversation? 00:30:54.800 |
Why does everybody agree on every single issue? 00:30:57.880 |
Why is that agreement precisely what was in the New York times today? 00:31:01.480 |
Um, why are these positions not the same as they were five years ago? 00:31:06.440 |
Uh, but why does everybody like snap into agreement every step of the way? 00:31:10.280 |
And that was true when I came to Silicon Valley and it's just, just true today, 30 years later. 00:31:14.120 |
And so I think most people are just literally, I think they're taking their cues from 00:31:18.160 |
some combination of the press, the universities, the big foundations. 00:31:22.120 |
So it's like basically it's like the New York times, Harvard, the Ford foundation, and you 00:31:25.920 |
know, I don't know, um, you know, a few CEOs, um, and a few public figures and, you know, 00:31:30.680 |
maybe, you know, maybe the president if your party's in power, and like whatever that is, 00:31:34.620 |
everybody just, everybody who's sort of good and proper and elite and good standing and 00:31:39.040 |
in charge of things and a sort of correct member of, you know, let's call it coastal 00:31:45.340 |
And then, you know, the two interesting things about that is number one, there's no divergence between these institutions. 00:31:51.340 |
So Harvard and Yale believe the exact same thing. 00:31:53.880 |
The New York times, the Washington post believe the exact same thing. 00:31:55.920 |
The Ford foundation and the Rockefeller foundation believe the exact same thing. 00:31:58.480 |
Google and you know, whatever, you know, Microsoft believe the exact same thing. 00:32:02.480 |
Um, but those things change over time, but there's never conflict in the moment. 00:32:09.120 |
Um, and so, you know, the New York Times, the Wall Street Journal, and the Washington Post agreed on 00:32:12.460 |
exactly everything in 1970, 1980, 1990, 2000, 2010 and 2020, despite the fact that the specifics 00:32:18.800 |
changed radically, the lockstep was what mattered. 00:32:22.200 |
Um, and so I, I think basically we, we in the valley, we're, we're on the tail end 00:32:27.280 |
And the same way New York's on the tail end of that, the same way the media is on the 00:32:30.720 |
It's, it's like some sort of collective hive mind thing. 00:32:33.280 |
And I just go through that to say like, I don't think most people in my orbit or, you 00:32:36.600 |
know, let's say the top 10,000 people in the valley or the top 10,000 people in LA, I don't 00:32:40.960 |
think they're sitting there thinking, basically, I have rock-solid beliefs. 00:32:43.640 |
I mean, they probably think they have rock-solid beliefs, but they don't actually have them. 00:32:48.600 |
And then they kind of watch reality change around them and try to figure out how to 00:32:53.720 |
I think what happens is they conform to the belief system around them. 00:32:57.080 |
And I think most of the time they're not even aware that they're basically part of a herd. 00:33:01.680 |
Is it possible that the surface chatter of dinner parties underneath that there is a 00:33:08.120 |
turmoil of ideas and thoughts and beliefs that's going on, but you're just talking to 00:33:13.440 |
people really close to you or in your own mind and the socialization happens at the 00:33:19.200 |
So, uh, when you go outside the inner circle of one, two, three, four people who you really 00:33:26.180 |
But inside there, inside the mind, there is an actual belief or a struggle, a tension 00:33:31.960 |
within the New York Times or with the, uh, for the listener, there's a, there's a slow smile appearing on Marc's face. 00:33:40.800 |
So look, I'll just tell you what I think, which is at, at, at the dinner parties and 00:33:46.440 |
Um, I, it's what, what there is, is that all of the heretical conversations, anything that 00:33:50.280 |
challenges the status quo, um, any heretical ideas and any new idea, you know, as a heretical 00:33:54.600 |
idea, um, any deviation, it's either discussed a one-on-one face-to-face it's, it's like 00:33:59.960 |
a whisper network or it's like a realized social network. 00:34:03.240 |
There's a secret handshake, which is like, okay, you meet somebody and you like know 00:34:08.460 |
And like, you're both trying to figure out if you can like talk to the other person openly 00:34:11.200 |
or whether you have to like be fully conformist. And so the test is you make a little joke. 00:34:19.360 |
If the other person laughs, the conversation is on. 00:34:23.800 |
If the other person doesn't laugh, back slowly away from the scene: I didn't mean anything by it. 00:34:30.760 |
And then by the way, it doesn't have to be like a super offensive joke. 00:34:32.800 |
That's just up against the edge of one of the, to use the Sam Bankman-Fried term, one of 00:34:36.560 |
the shibboleths, you know, it has to be up against one of the things, um, of, um, you 00:34:41.120 |
know, one of the things that you're absolutely required to believe to be the dinner parties. 00:34:44.160 |
And then, and then at that point what happens is you have a peer to peer network, right? 00:34:47.760 |
You have, you have, you have a, a one-to-one connection with somebody and then you, you 00:34:50.840 |
have your, you have your own, your own, your little conspiracy of thought, thought criminality. 00:34:54.600 |
Um, and then you have your net, you've probably been through this. 00:34:57.200 |
You have your network of thought criminals and then they have their network of thought 00:35:00.800 |
And then you have this like delicate mating dance as to whether you should bring the thought criminal networks together. 00:35:04.960 |
Um, and the dance, the fundamental, uh, mechanism of the dance is humor. 00:35:17.040 |
Everyone, humor is a way to have deniability. 00:35:19.040 |
Humor is a way to discuss serious things while having deniability. 00:35:23.040 |
So, so that's part of it, which is one of the reasons why comedians can get away with saying things 00:35:25.040 |
the rest of us can't, you know, they can always fall back on, oh yeah, I was just going for the laugh. 00:35:28.040 |
But, but the other key thing about humor, right. 00:35:29.040 |
Is that, is that laughter is involuntary, right? 00:35:32.520 |
And, and it's not like a conscious decision whether you're going to laugh and everybody 00:35:37.880 |
Every professional comedian knows this, right. 00:35:38.880 |
The laughter is the clue that you're onto something truthful. 00:35:41.580 |
Like people don't laugh at like made up bullshit stories. 00:35:44.040 |
They laugh because like you're revealing something that they either have not been allowed to 00:35:47.480 |
think about or have not been allowed to talk about. 00:35:51.520 |
And all of a sudden it's like the ice breaks and it's like, oh yeah, that's the thing. 00:35:56.420 |
And then, and then of course, this is why of course live comedy is so powerful, is because of 00:36:01.000 |
the safety of, you know, the safety of numbers. 00:36:02.880 |
And so, so the comedians have like the all, it's no, no surprise to me. 00:36:06.120 |
Like for example, Joe has been as successful as he has, because they have, they have this 00:36:08.960 |
hack that the, you know, the rest of us who are not professional comedians don't have, 00:36:11.960 |
but, but you have your in-person version of it. 00:36:14.760 |
And then you've got the question of whether the, whether you can sort of join the networks 00:36:17.840 |
And then you've probably been to this as you know, then at some point there's like a different, 00:36:20.320 |
there's like the alt dinner party, uh, the thought criminal dinner party, and you get six 00:36:23.680 |
or eight people together and you join the networks. 00:36:25.640 |
And those are like the happiest moment, at least in the last decade, those are like the 00:36:28.320 |
happiest moments of everybody's lives because they're just like, everybody's just ecstatic 00:36:31.840 |
because they're like, I don't have to worry about getting yelled at and shamed. 00:36:34.920 |
Like for every third sentence that comes out of my mouth and we can actually talk about 00:36:40.400 |
And then the, and then of course the other side of it is the, you know, the group chat, 00:36:45.040 |
And then, and then this, and then basically the same thing played out, you know, until, 00:36:47.840 |
until Elon bought X and until Substack took off, um, you know, which were really the two 00:36:51.320 |
big breakthroughs in free speech online, the same dynamic played out online, which is you 00:36:55.400 |
had absolute conformity on the social networks, like literally enforced by the social networks 00:36:59.000 |
themselves through censorship and, and then also through cancellation campaigns and mobbing 00:37:04.960 |
And, and, but then you had, but, but then group chats grew up to be the equivalent of 00:37:09.320 |
Anybody who grew up in the Soviet Union under, you know, communism knows, you know, they had this. 00:37:14.000 |
It's like, how do you know who you could talk to? 00:37:16.880 |
And you know, like, you know, again, that was the hard authoritarian version of this. 00:37:21.240 |
And then we've been living through this weird mutant, you know, soft authoritarian version, 00:37:24.800 |
but with, you know, with some of the same patterns. 00:37:26.760 |
And WhatsApp allows you to scale it, make it more efficient to, uh, to build on these, 00:37:31.360 |
uh, groups of heretical ideas bonded by humor. 00:37:37.920 |
And then, well, this is kind of the running joke about groups, the running, running kind 00:37:43.960 |
If you've noticed this, like every principle of group chats, every group chat ends up being 00:37:49.520 |
And the goal of the game, the game of the group chat is to get as close to the line 00:37:52.880 |
of being actually objectionable as you can get without actually tripping it. 00:37:58.520 |
And so literally every group chat that I have been in for the last decade, even if it starts 00:38:01.880 |
some other direction, what ends up happening is it becomes the absolute comedy fest where, 00:38:06.800 |
but it's walking, they walk right up to the line and they're constantly testing it, and every once in a while, 00:38:11.320 |
Somebody will trip the line and people will freak out and it's like, Oh, too soon. 00:38:14.280 |
You know, we've got to wait until next year to talk about that. 00:38:18.080 |
And yeah, and then group chats is a technological phenomenon. 00:38:20.440 |
It was amazing to see because basically it was number one, it was, you know, obviously 00:38:23.680 |
the rise of smartphones, then it was the rise of the, of the new messaging services. 00:38:28.640 |
Then it was the rise specifically of, I would say, combination of WhatsApp and signal. 00:38:32.040 |
And the reason for that is those were the two, the two big systems that did the full end-to-end encryption. 00:38:38.480 |
And then the real breakthrough I think was disappearing messages, which hit signal probably 00:38:43.360 |
four or five years ago and hit WhatsApp three or four years ago. 00:38:48.040 |
And then the combination of encryption and disappearing messages, 00:38:54.240 |
Well, and then there's the fight, then there's the fight over the, the, the length of the timer. 00:38:58.960 |
And so it's like, you know, I often get behind my, my thing. 00:39:01.640 |
So I, I set a seven day, you know, disappearing messages and my friends who are like, no, 00:39:09.120 |
And then every once in a while, somebody will set it to five minutes before they send something 00:39:14.400 |
Well, what, I mean, one of the things that bothers me about WhatsApp, the choice is between 00:39:17.600 |
24 hours and, you know, seven days, one day or seven days, and I have to have an existential 00:39:23.000 |
crisis deciding whether I can last for seven days with what I'm about to say. 00:39:30.040 |
Now, of course, what's happening right now is the big thaw, right? 00:39:33.520 |
So what's happening on the other, on the other side of the election is, you know, Elon on 00:39:37.000 |
Twitter two years ago and now Mark with Facebook and Instagram. 00:39:40.160 |
And by the way, with the continued growth of Substack and with other, you know, new 00:39:42.680 |
platforms that are emerging, you know, like I think it may be, you know, I don't know 00:39:47.160 |
that everything just shifts back into public, but like a tremendous amount of the, a tremendous 00:39:51.600 |
amount of the verboten conversations, you know, can now shift back into public view. 00:39:56.400 |
And I mean, quite frankly, this is one of those things, you know, quite frankly, even 00:39:59.280 |
if I was opposed to what those people are saying, and I'm sure I am in some cases, you 00:40:03.400 |
know, I would argue it's still like net better for society that those things happen in public 00:40:07.480 |
You know, do you, do you really want like, yeah, like, don't you want to know? 00:40:13.120 |
And so, and then it's just, look, it's just, I think clearly much healthier to live in 00:40:16.360 |
a society in which people are not literally scared of what they're saying. 00:40:19.720 |
- I mean, to push back and to come back to this idea that we're talking about, I do believe 00:40:24.080 |
that people have beliefs and thoughts that are heretical, like a lot of people. 00:40:32.080 |
To me, this is, the preference falsification is really interesting. 00:40:35.520 |
What is the landscape of ideas that human civilization has in private as compared to in public? 00:40:43.520 |
'Cause like that, the dynamical system that is the difference between those two is fascinating. 00:40:49.880 |
Like there's, throughout history, the fall of communism and multiple regimes throughout 00:40:55.000 |
Europe is really interesting because everybody was following, you know, the line until they weren't. 00:41:02.360 |
But you can bet, for sure, privately, there were a huge number of boiling conversations 00:41:08.120 |
happening where like, this is this, the bureaucracy of communism, the corruption of communism, 00:41:13.840 |
all of that was really bothering people more and more and more and more. 00:41:17.360 |
And all of a sudden, there's a trigger that allows the vibe shift to happen. 00:41:22.280 |
To me, like the interesting question here is, what is the landscape of private thoughts 00:41:28.080 |
and ideas and conversations that are happening under the surface of Americans? 00:41:35.040 |
Really, my question is, how much dormant energy is there for this roaring 20s where people 00:41:43.880 |
So let's go through the, we'll go through the theory of preference falsification. 00:41:51.080 |
So this is exactly, this is one of the all-time great books. 00:41:52.080 |
It's, incredibly, about a 20- or 30-year-old book, but it's very, it's completely modern and current 00:41:55.760 |
in what it talks about as well as very deeply historically informed. 00:41:59.440 |
So it's called Private Truths, Public Lies, and it's written by a social science professor, Timur Kuran. 00:42:10.640 |
And so he has this concept he calls preference falsification. 00:42:13.400 |
And so preference falsification is two things. 00:42:15.840 |
Preference falsification, and you get it from the title of the book, Private Truths, Public 00:42:19.720 |
So preference falsification is when you believe something and you can't say it, or, and this 00:42:23.840 |
is very important, you don't believe something and you must say it, right? 00:42:29.400 |
And the commonality there is, in both cases, you're lying. 00:42:33.280 |
You believe something internally and then you're lying about it in public. 00:42:36.640 |
And so the thing, and there's sort of the two classic forms of it. 00:42:40.200 |
There's the, for example, there's the I believe communism is rotten, but I can't say it version 00:42:44.960 |
But then there's also the famous parable, the real life example, but the thing that 00:42:50.600 |
Václav Havel talks about in the other good book on this topic, which is The Power of 00:42:53.720 |
the Powerless, who is an anti-communist resistance fighter who ultimately became the president 00:42:59.120 |
of Czechoslovakia after the fall of the wall. 00:43:01.600 |
But he wrote this book and he describes the other side of this, which is workers of the 00:43:07.120 |
And so he describes what he calls the parable of the green grocer, which is you're a green 00:43:14.040 |
And for the last 70 years, it has been, or 50 years, it's been absolutely mandatory to 00:43:17.760 |
have a sign in the window of your store that says workers of the world unite, right? 00:43:23.520 |
It is like crystal clear that the workers of the world are not going to unite. 00:43:27.240 |
Like of all the things that could happen in the world, that is not going to happen. 00:43:34.100 |
But that slogan had better be in your window every morning, because if it's not in your 00:43:36.840 |
window every morning, you are not a good communist. 00:43:38.960 |
The secret police are going to come by and they're going to get you. 00:43:42.080 |
And so the first thing you do when you get to the store is you put that slogan in the 00:43:44.560 |
window and you make sure that it stays in the window all day long. 00:43:46.800 |
But he says the thing is, every single person, the green grocer knows the slogan is fake. 00:43:52.380 |
Every single person walking past the slogan knows that it's a lie. 00:43:55.180 |
Every single person walking past the store knows that the green grocer is only putting 00:44:01.740 |
And the green grocer has to go through the humiliation of knowing that everybody knows 00:44:05.500 |
that he's caving into the system and lying in public. 00:44:07.580 |
And so it turns into a demoralization campaign. 00:44:13.460 |
In fact, it's not ideological enforcement anymore because everybody knows it's fake. 00:44:19.300 |
It's not that they're enforcing the actual ideology of the world's workers of the world 00:44:22.620 |
uniting, it's that they are enforcing compliance and compliance with the regime and fuck you, 00:44:29.460 |
And so anyway, that's the other side of that. 00:44:32.260 |
And of course, we have lived in the last decade through a lot of both of those. 00:44:36.580 |
I think anybody listening to this could name a series of slogans that we've all been forced 00:44:40.140 |
to chant for the last decade that everybody knows at this point are just like simply not 00:44:44.940 |
I'll let the audience speculate on their own group chats. 00:44:54.620 |
So anyway, so it's the two sides of that, right? 00:44:59.060 |
So then what preference falsification does is it talks about extending that from the 00:45:02.100 |
idea of the individual experiencing that to the idea of the entire society experiencing 00:45:06.380 |
And this gets to your percentages question, which is like, okay, what happens in a society 00:45:10.100 |
in which people are forced to lie in public about what they truly believe? 00:45:12.640 |
What happens number one is that individually, they're lying in public, and that's bad. 00:45:15.420 |
But the other thing that happens is they no longer have an accurate gauge at all or any 00:45:19.420 |
way to estimate how many people agree with them. 00:45:22.240 |
And this, again, this literally is like how you get something 00:45:25.060 |
like the communist system, which is like, okay, you end up in a situation in which 00:45:29.140 |
80 or 90 or 99% of society can actually all be thinking individually, I really don't buy 00:45:34.340 |
And if anybody would just stand up and say it, I would be willing to go along with it, 00:45:37.580 |
but I'm not going to be the first one to put my head on the chopping block. 00:45:40.160 |
But because of the suppression and censorship, you have no way of knowing how many agree with you. 00:45:45.580 |
And if the people who agree with you are 10% of the population, and you become 00:45:49.420 |
part of a movement, you're going to get killed. 00:45:52.260 |
If 90% of the people agree with you, you're going to win the revolution. 00:45:56.540 |
And so the question of like, what the percentage actually is, is like a really critical question. 00:46:00.580 |
And then basically, in any sort of authoritarian system, you can't like 00:46:03.620 |
run a survey, right, to get an accurate result. 00:46:05.660 |
And so you actually can't know until you put it to the test. 00:46:08.180 |
And then what he describes in the book is it's always put to the test in the same way. 00:46:10.820 |
And this is exactly what's happened for the last two years, like 100% exactly what's happened. 00:46:14.940 |
And this is what he describes in the book, which is somebody, Elon, sticks his hand up 00:46:20.940 |
and says, the workers of the world are not going to unite, right, or the Emperor is actually 00:46:25.340 |
wearing no clothes, right, you know, that famous parable, right? 00:46:29.780 |
And literally, that person is standing there by themselves. 00:46:31.820 |
And everybody else in the audience is like, Ooh, I wonder what's gonna happen to that guy. 00:46:38.740 |
Elon doesn't know, the first guy doesn't know, other people don't know, like, which way it's going to go. 00:46:42.340 |
And it may be that that's a minority position. 00:46:45.140 |
Or it may be that that's the majority position in that and you are now the leader of a revolution. 00:46:49.340 |
And then basically, of course, what happens is, okay, the first guy does that, and he either does or doesn't get killed. 00:46:53.420 |
Well, a lot of the time that guy does get killed. 00:46:56.100 |
But when the guy doesn't get killed, then a second guy pops his head up and says the same thing. 00:47:00.220 |
All right, now you've got two, two leads to four, four leads to eight, eight leads to sixteen. 00:47:04.140 |
And then as we saw with the fall of the Berlin Wall, this is what happened in Russia and 00:47:07.340 |
Eastern Europe in '89. When it goes, it can go, right? And then it rips. 00:47:12.180 |
And then what happens is very, very quickly, if it turns out that you had a large percentage 00:47:16.700 |
of the population that actually believe the different thing, it turns out all of a sudden, 00:47:19.740 |
everybody has this giant epiphany that says, Oh, I'm actually part of the majority. 00:47:23.420 |
And at that point, like you were on the freight train to revolution, right? 00:47:28.920 |
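The cascade dynamic being described here maps onto the classic threshold model of collective behavior that Kuran's book formalizes (closely related to Granovetter's threshold model). A minimal sketch, assuming a toy population where each person speaks up only once the visible share of dissenters crosses their private threshold; the populations and numbers below are illustrative inventions, not anything from the conversation:

```python
# Toy Kuran/Granovetter-style threshold cascade.
# Each person privately dissents but only speaks up once the visible
# share of dissenters meets their personal threshold.

def cascade(thresholds):
    """Return the final fraction of people speaking up, given each
    person's threshold (the visible dissent share they need to see first)."""
    n = len(thresholds)
    speaking = [t <= 0.0 for t in thresholds]  # the brave first movers
    changed = True
    while changed:
        changed = False
        visible = sum(speaking) / n
        for i, t in enumerate(thresholds):
            if not speaking[i] and visible >= t:
                speaking[i] = True
                changed = True
    return sum(speaking) / n

# Society A: one first mover, then a smooth ladder of thresholds.
society_a = [0.0] + [i / 100 for i in range(1, 100)]
# Society B: identical except one rung of the ladder is missing.
society_b = [0.0] + [i / 100 for i in range(2, 101)]

print(cascade(society_a))  # 1.0  -- the whole society tips
print(cascade(society_b))  # 0.01 -- the cascade stalls immediately
```

The point of the two nearly identical societies is the one made above: whether you get a freight train to revolution or a lone dissident standing by himself can hinge on details of the threshold distribution that nobody can observe in advance, because public censorship hides everyone's true preferences.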
The other part of this is the distinction between the role of the elites and the masses. 00:47:33.660 |
And here in here, the best book is called the true believer, which is the Eric Hoffer 00:47:37.700 |
And so the nuance you have to put on this is that the elites play a giant role in this 00:47:42.260 |
because the elites do idea formation and communication, but the elites by definition are a small minority. 00:47:47.920 |
And so there's also this giant role played by the masses and the masses are not necessarily 00:47:51.860 |
thinking these things through in the same intellectualized formal way that the elites 00:47:55.300 |
are, but they are for sure experiencing these things in their daily lives. 00:47:58.340 |
And they for sure have at least very strong emotional views on them. 00:48:01.460 |
And so when you really get the revolution, it's when you get the elites lined up with the masses, 00:48:04.980 |
or either the current elites change or a new set 00:48:08.860 |
of counter-elites, um, basically comes along and says, no, there's actually a different way, 00:48:13.140 |
and then the people basically decide to follow the counter-elites. 00:48:17.140 |
So that, that's the other dimension to it. 00:48:18.140 |
And of course that part is also happening right now. 00:48:20.060 |
Um, and again, case study number one of that would be Elon. 00:48:26.180 |
And he has done that over and over in different industries, not just saying crazy shit online, 00:48:30.140 |
but saying crazy shit in the, in the realm of space, in the realm of autonomous driving, 00:48:36.180 |
in the realm of AI, just over and over and over again. 00:48:38.980 |
Turns out saying crazy shit is one of the ways to do a revolution and to actually make 00:48:44.740 |
And it's like, well, but then there's the test. 00:48:48.740 |
And, and, and, you know, and this is where, you know, many, there are many specific things 00:48:50.740 |
about Elon's genius, but one of the, one of the really core ones is an absolute dedication to the truth as he sees it. 00:48:55.100 |
Um, and so when Elon says something, it sounds like crazy shit, but in his mind it's true. 00:49:06.540 |
But at least my, my through line with him, both in what he says in public and what he 00:49:09.900 |
says in private, which by the way are the exact same things. 00:49:13.540 |
He doesn't lie in public about what he believes in private, or at least he doesn't do that 00:49:16.900 |
Like he's a hundred percent consistent in my, in my experience, by the way, there's 00:49:20.380 |
two guys who are a hundred percent consistent like that, that I know, um, Elon and Trump. 00:49:26.420 |
So whatever you think of them, what they say in private is a hundred percent identical 00:49:34.260 |
Which is like, and again, it's not like they're perfect people, but they're honest in that 00:49:37.500 |
And it makes them potentially, as they have been, very powerful leaders of movements. 00:49:41.460 |
Cause they're both willing to stand up and say the thing that, it turns out, 00:49:45.760 |
in many cases, you know, many or most or almost everyone else actually believes, 00:49:49.140 |
but nobody was actually willing to say out loud, and so they can actually catalyze change. 00:49:53.300 |
And I, and I mean, I think this framework is exactly why Trump took over the Republican 00:49:55.660 |
party is I think Trump stood up there on stage with all these other kinds of conventional 00:49:58.540 |
Republicans and he started saying things out loud that it turned out the base really was, 00:50:02.520 |
they were either already believing or they were prone to believe. 00:50:04.660 |
Um, and he was the only one who was saying them. 00:50:06.500 |
And so the, again, elite masses: he was the elite, the voters were the masses, and the voters responded. 00:50:16.580 |
Like what we just described is like the actual mechanism of social change. 00:50:19.260 |
It is fascinating to me that we have been living through exactly this. 00:50:22.940 |
We've been living through everything, exactly what Timur Kuran describes, everything that 00:50:26.860 |
Vaclav Havel described, you know, black squares on Instagram, like the whole thing, right? 00:50:32.860 |
Um, uh, and we've been living through the, um, you know, the, the true believer elites 00:50:36.060 |
masses, you know, thing with, you know, with a set of like basically incredibly corrupt 00:50:39.700 |
elites wondering why they don't have the loyalty of the masses anymore, and a set of new elites emerging. 00:50:43.980 |
And so like we're, we're living through this like incredible applied case study, um, of these ideas. 00:50:48.860 |
And you know, if there's a moral of the story, it is, you know, I think fairly obvious, 00:50:51.860 |
which is, it is a really bad idea for a society to wedge itself into a position in which most 00:50:56.180 |
people don't believe the fundamental precepts of what they're told they have to do, you 00:51:03.040 |
So one of the ways to avoid that in the future maybe is to keep the Delta between what's 00:51:07.700 |
said in private and what's said in public small. 00:51:11.220 |
It's like, well, this is sort of the, the siren song of censorship is we can keep people 00:51:13.900 |
from saying things, which means we can keep people from thinking things. 00:51:17.660 |
And you know, by the way, that may work for a while, right? 00:51:20.540 |
Like, you know, I mean, again, the hard form: in the Soviet Union, uh, owning 00:51:24.900 |
a mimeograph, pre-photocopiers, there were mimeograph machines that were used to make 00:51:28.340 |
samizdat and underground newspapers, which was the mechanism of written communication 00:51:31.860 |
of radical ideas, um, and ownership of a mimeograph machine was punishable by death. 00:51:41.060 |
You know, the soft version is somebody clicks a button in Washington and you are erased 00:51:46.540 |
Like which, you know, good news, you're still alive. 00:51:48.580 |
Bad news is, you know, shame about not being able to get a job, you know, too bad your 00:51:51.860 |
family now, you know, they hate you and won't talk to you, you know, whatever the, you know, 00:51:55.060 |
whatever the version of cancellation has been. 00:52:01.100 |
It could work, like it did for the Soviet Union for a while, you know, in its way, especially when it was 00:52:04.780 |
coupled with, you know, official state power, but when it unwinds, it can unwind with like 00:52:08.940 |
incredible speed and ferocity because to your point, there's all this bottled up energy. 00:52:12.660 |
Now your question was like, what are the percentages? 00:52:16.140 |
And so my rough guess, just based on what I've seen in my world, is it's something 00:52:21.460 |
like 20, 60, 20. Um, it's like you've got 20% like true believers in whatever is, you 00:52:27.060 |
know, the current thing. You know, you've got 20% of people who are just like true believers 00:52:31.140 |
of whatever, you know, like I said, whatever's in 00:52:33.660 |
the New York Times, Harvard professors and the Ford Foundation, like they're just fully on board. 00:52:38.060 |
Maybe it's 10, maybe it's five, but let's say generously it's, it's 20. 00:52:41.460 |
So it's a, you know, 20% kind of full on revolutionaries. 00:52:44.500 |
Um, um, and then you've got, let's call it 20% on the other side that are like, no, I'm 00:52:50.900 |
I'm not, I'm not signing up for this, but you know, you know, they, their view of themselves 00:52:54.100 |
is they're in a small minority and in fact they start out in a small minority because 00:52:57.140 |
what happens is the 60% go with the first 20%, not the second 20%. 00:53:00.700 |
So you've got this large middle of people and it's not that there's anything like, it's 00:53:04.860 |
not the people in the middle are not smart or anything like that. 00:53:07.220 |
It's that they just have like normal lives and they're just trying to get by and they're 00:53:10.780 |
just trying to go to work each day and do a good job and be a good person and raise 00:53:14.060 |
their kids and you know, have a little bit of time to watch the game. 00:53:17.140 |
Um, and they're just not engaged in the cut and thrust of, you know, political activism 00:53:24.380 |
But then, but that's where the over socialization comes in. 00:53:26.740 |
It's just like, okay, by default, the 60% will go along with the 20% of the radical 00:53:34.060 |
And then the counter elite is in this other 20% and over time they build up a theory and 00:53:39.660 |
network and ability to resist, um, and a new set of representatives and a new set of ideas. 00:53:45.020 |
And then at some point there's a contest, right? 00:53:48.980 |
And then the question is what happens in the middle, what happens in the 60%. And 00:53:51.980 |
it's kind of my point, it's not even really, does the 60% change their beliefs, as 00:53:55.820 |
much as it's like, okay, what is the thing that that 60% now decides to basically go along with? 00:54:02.300 |
And I think that 60% in the Valley, that 60% for the last decade decided to be woke. 00:54:06.420 |
Um, and you know, extremely, I would say on edge, uh, on a lot of things. 00:54:12.620 |
And I, you know, that 60% is pivoting in real time. 00:54:18.660 |
And I would love to see where that pivot goes because there's internal battles happening 00:54:26.540 |
So there's two forms of this, two forms of things that Timur Kuran describes. 00:54:30.060 |
So one is, he said, this is the kind of unwind where what you're going 00:54:33.180 |
to have is, you're now going to have people going in the other direction. 00:54:34.900 |
You're going to have people who claim that they supported Trump all along who actually didn't. 00:54:41.940 |
And by the way, Trump's not the only part of this, but you know, he's just a convenient 00:54:44.740 |
shorthand for, you know, for, for a lot of this. 00:54:46.460 |
Um, but you know, whatever it is, you'll, you'll have people who will say, well, I never 00:54:50.260 |
Or I never supported the ESG stuff, or I never thought we should have canceled that person. 00:54:54.500 |
Where of course they were full on a part of the mob, like, you know, kind of at that moment. 00:54:57.460 |
And so anyway, so you'll have preference falsification happening in the other direction. 00:55:01.300 |
And his prediction, I think, basically is you'll end up with the same quote-unquote problem on the other side. 00:55:08.740 |
You know, well, how far is American society willing to go in any of these things? 00:55:14.300 |
Um, and then, and then the other part of it is, okay, now you have this, you know, elite 00:55:18.140 |
that is used to being in power for the last decade. 00:55:21.020 |
And it, and by the way, many of those people are still in power and they're in very important 00:55:24.100 |
positions and the New York times is still the New York times and Harvard is still Harvard. 00:55:26.980 |
And like those people haven't changed like at all. 00:55:30.420 |
And they still, you know, they've been bureaucrats in the government and, you know, senior Democratic officials. 00:55:35.460 |
And, and they're sitting there, you know, right now feeling like reality has just smacked 00:55:38.440 |
them hard in the face because they lost the election so badly. 00:55:41.540 |
But they're now going into a, and specifically the democratic party is going into a civil 00:55:47.720 |
And that form of the civil war is completely predictable. 00:55:50.060 |
And that's exactly what's happening, which is half of them are saying we need to go back 00:55:54.540 |
We need to de-radicalize because we've lost the people, we've lost the people in the middle. 00:55:59.300 |
We need to go back to the middle in order to be able to get 50% plus one in an election. 00:56:04.060 |
And then the other half of them are saying, no, we weren't true to our principles. 00:56:08.920 |
We must double down and we must, you know, celebrate, you know, murders in the street 00:56:13.380 |
And that's, and that, that right now is like a real fight. 00:56:15.780 |
If I could tell you a little personal story that breaks my heart a little bit, there's 00:56:19.500 |
a, there's a professor, historian, I won't say who, who I admire deeply love his work. 00:56:28.580 |
And we were talking about having a podcast or doing a podcast. 00:56:33.180 |
And he eventually said that, you know what, at this time, given your guest list, I just 00:56:40.020 |
don't want the headache of being in the faculty meetings in my particular institution. 00:56:47.700 |
And I asked who are the particular figures in this guest list? 00:56:54.500 |
And the second one, he said that you announced your interest to talk to Vladimir Putin. 00:57:01.980 |
Now, I fully believe he, uh, it would surprise a lot of people if I said who it is, but you 00:57:08.260 |
know, this is a person who's not bothered by the, uh, the guest list. 00:57:13.280 |
And I should also say that 80 plus percent of the guest list is left wing. 00:57:19.740 |
But nevertheless, he just doesn't want the headache. 00:57:22.540 |
And that speaks to the, the thing that you've kind of mentioned that you just don't, don't 00:57:27.980 |
You just want to just have a pleasant morning with some coffee and talk to your fellow professors. 00:57:33.620 |
And I think a lot of people are feeling that in universities and in other contexts and 00:57:39.060 |
And I wonder if that shifts, how quickly that shifts. 00:57:43.580 |
And there, the percentages you mentioned, 20, 60, 20, matter, and the contents 00:57:49.260 |
of the private groups matter, and the dynamics of how that shifts matter, because it's very possible 00:57:54.740 |
nothing really changes in universities and in major tech companies. Or there's just a 00:57:59.580 |
kind of, um, excitement right now for potential revolution, and these new ideas, these new vibes, 00:58:07.540 |
reverberate through these companies, universities. But it's possible the wall will hold. 00:58:19.700 |
So I would like to beat on him on your behalf. 00:58:30.220 |
This is the ultimate indictment of the corruption and the rot at the heart of our education 00:58:33.660 |
system, uh, at the heart of these universities. 00:58:35.460 |
And it's, by the way, it's like across the board, it's like all the, all the top universities. 00:58:39.560 |
It's like, cause the siren song, right, what it's been for 70 years, is the 00:58:43.500 |
tenure system, the peer review system, um, which is like, yeah, you work 00:58:47.700 |
your butt off as an academic to get a professorship and then to get tenure, because then you can say what you actually think. 00:58:55.700 |
Then you can do your work and your research and your speaking and your teaching without fear. 00:59:06.940 |
I mean, think of the term academic freedom and then think of what these people have done with it. 00:59:15.340 |
Like that entire thing was fake and is completely rotten, and these people are completely 00:59:25.300 |
giving up the entire moral foundation of the system that has been built for them, which by the 00:59:29.180 |
way is paid for virtually 100% by taxpayer money. 00:59:34.980 |
What's the, what's the inkling of hope in this, like what, uh, this particular person 00:59:39.020 |
and others who hear this, what can give them strength, inspiration, and courage, um, that 00:59:44.620 |
the population at large is going to realize the corruption in their industry and it's going to get fixed? 00:59:54.660 |
So the universities are funded by four primary sources. 00:59:59.540 |
The big one is the federal student loan program, which is, you know, in the many trillions 01:00:02.340 |
of dollars at this point, and then only spiraling, you know, way faster than the, than inflation. 01:00:07.620 |
Number two is federal research funding, which is also very large. 01:00:10.460 |
And you probably know that, um, um, when a scientist at university gets a research grant, 01:00:15.180 |
the university rakes as much as 70% of the money, uh, for central uses. 01:00:19.620 |
Uh, number three is tax exemption at the operating level, which is based on the idea that these 01:00:24.260 |
are nonprofit institutions as opposed to let's say political institutions. 01:00:27.900 |
Um, and then number four is tax exemptions at the endowment level, um, you know, which 01:00:33.220 |
is the financial buffer that these places have. 01:00:35.900 |
Um, run the hypothetical: anybody who's been close to a university budget will basically see 01:00:40.500 |
what would happen if you withdrew those sources of federal taxpayer money, 01:00:43.780 |
and then for the state schools, the state money: they all instantly go bankrupt. 01:00:52.180 |
Cause the problem right now, and you know, like the folks at the University of Austin are 01:00:54.060 |
like mounting a very valiant effort, and I hope that they succeed, and I'm sure I'm cheering them on. 01:00:57.820 |
But here's the problem. Suppose you and I want 01:01:00.380 |
to start a new university and we want to hire all the free thinking professors and we want 01:01:05.620 |
Basically speaking, we can't do it because we can't get access to that money. 01:01:08.940 |
I'll give you the most direct reason we can't get access to that money. 01:01:11.300 |
Uh, we can't get access to federal student funding. 01:01:13.900 |
Do you know how universities are accredited for the purpose of getting access to federal 01:01:18.340 |
student loans? They're accredited by the government, but not directly; indirectly. 01:01:23.940 |
They're not accredited by the department of education. 01:01:26.020 |
Instead what happens is the department of education accredits accreditation bureaus 01:01:30.060 |
that are nonprofits that do the accreditation. 01:01:32.660 |
Guess what the composition of the accreditation bureaus is? 01:01:35.820 |
The existing universities, they're in complete control. 01:01:39.740 |
The incumbents are in complete control as to who gets access to federal student loan money. 01:01:44.540 |
Guess how enthusiastic they are about accrediting a new university, right? 01:01:49.740 |
And so we have a government funded and supported cartel, um, that has gone, I mean, it's just 01:01:54.820 |
obvious now it's just gone like sideways and basically any possible way it could go sideways, 01:01:58.820 |
including, I mean, literally, as you know, students getting beaten up on campus 01:02:01.740 |
for being, you know, the wrong religion. And it's just, they're just wrong in every possible way. 01:02:05.900 |
Um, and it's all on the federal taxpayer's back. 01:02:08.340 |
Um, and there is no way, I mean, in my opinion, there is no way to fix these things without replacing them. 01:02:14.100 |
Um, and, and there's no way to replace them without letting them fail. 01:02:17.060 |
And by the way, it's like everything else in life. 01:02:19.500 |
I mean, in a sense, this is like the most obvious conclusion of all time, which is what 01:02:22.820 |
happens in the business world when a company does a bad job is they go bankrupt and another one takes its place. 01:02:31.260 |
And of course, below that is what happens is this is the process of evolution, right? 01:02:35.060 |
It's because things are tested and tried, and then, you know, the things that work survive. 01:02:41.220 |
They've been allowed to cut themselves off from both from evolution at the institutional 01:02:43.980 |
level and evolution at the individual level. 01:02:52.900 |
We, we built it, we built an ossified system, an ossified centralized corrupt system. 01:03:02.360 |
I have, maybe it's grounded in hope, but I believe you can revolutionize the system 01:03:07.060 |
from within because I do believe Stanford and MIT are important. 01:03:19.020 |
So I just started watching a key touchstone of American culture with my nine-year-old, South Park, 01:03:26.460 |
which, by the way, is a little aggressive for a nine-year-old. 01:03:31.140 |
So he's learning all kinds of new words and all kinds of new ideas. 01:03:33.900 |
But yeah, I told him, I said, you're going to hear words on here that you are not allowed 01:03:40.820 |
And I said, you know how we have an agreement that we never lie to mommy. 01:03:45.580 |
I said, not using a word that you learn in here, uh, does not count as lying, so keep them to yourself. 01:03:56.220 |
In the very opening episode, in the first 30 seconds, one of 01:03:58.280 |
the kids calls the other kid a dildo, and now we're off. 01:04:10.380 |
Um, so, um, uh, so famous episode of South Park, the underpants gnomes. 01:04:15.980 |
And so the underpants gnomes: all the kids basically realize 01:04:19.380 |
that their underpants are going missing from their dresser drawers. 01:04:22.780 |
Somebody stealing the underpants and it's just like, well, who on earth would steal 01:04:24.940 |
the underpants and it turns out it's the underpants gnomes. 01:04:28.620 |
And it turns out the underpants gnomes have come to town and they've got this little underground 01:04:31.300 |
warren of tunnels and storage places for all the underpants. 01:04:33.380 |
And so they go out at night, they steal the underpants and the kids discover that, you 01:04:36.620 |
know, the underpants gnomes and they're, you know, what, what are you doing? 01:04:40.340 |
And so the underpants gnomes present their master plan, which is a three-part plan: phase one, collect underpants; phase two, question mark; phase three, profit. 01:04:54.580 |
So you just, you just proposed the underpants gnomes' plan. 01:04:57.580 |
So, so the form of this in politics is we must do something. 01:05:00.940 |
This is something, therefore we must do this, but there's no causal logic chain in there 01:05:06.580 |
at all to expect that that's actually going to succeed, because there's no reason to believe step two will actually happen. 01:05:14.060 |
I will let you talk as the host of the show in a moment, but, uh, but the, but I hear 01:05:23.020 |
And I hear this all the time, which is like, oh, these are very important. 01:05:32.660 |
If there's that pressure that you described in terms of cutting funding, then you have 01:05:37.300 |
the leverage to fire a lot of the administration and have new leadership that steps up, that, 01:05:43.420 |
that, uh, aligns with this vision that things really need to change at the heads of the 01:05:50.940 |
Put students and faculty first, fire a lot of the administration, and realign and 01:05:57.420 |
reinvigorate this idea of freedom of thought and intellectual freedom. 01:06:01.180 |
I mean, I don't, because there is already a framework of great institutions that's there 01:06:07.780 |
and the way they talk about what it means to be a great institution is aligned with these ideals, 01:06:14.380 |
meaning, like, intellectual freedom, the idea of tenure, right? 01:06:19.020 |
On the surface it's aligned; underneath, it's become corrupted. 01:06:23.420 |
If we say free speech and academic freedom often enough, sooner or later, these tenured professors will start acting on it. 01:06:27.260 |
Wait, do you think the universities are fundamentally broken? 01:06:32.340 |
How do you have institutions for educating 20 year olds and institutions that host, uh, 01:06:41.700 |
researchers that have the freedom to do epic shit, like research type shit that's outside 01:06:47.420 |
the scope of the R&D departments inside companies? 01:06:50.020 |
So how do you create an institution like that? 01:06:51.960 |
How do you create a good restaurant when the one down the street sucks? 01:07:00.460 |
Like how often in your life have you experienced a restaurant that's just absolutely horrible 01:07:03.980 |
and it's poisoning all of its customers and the food tastes terrible, and then three years later it's great? 01:07:09.580 |
Charlie Munger actually had the best comment on this, the great investor Charlie Munger. 01:07:12.860 |
He was once asked, you know, General Electric was going 01:07:15.540 |
through all these challenges, and he was asked at a Q&A, how would you fix the culture at GE? 01:07:19.580 |
And he said, fix the culture at General Electric? 01:07:21.140 |
He said, I couldn't even fix the culture at a restaurant. 01:07:27.900 |
I mean, nobody in business thinks you can do that. 01:07:32.620 |
Like it's not. Now look, having said all that, I should also express this, 01:07:36.140 |
because I have a lot of friends who work at these places and, um, are involved in them. 01:07:43.260 |
I would love for the, I would love for the underpants gnome step two to be something 01:07:46.740 |
clear and straightforward that they can figure out how to do. 01:07:50.340 |
I'd love to see them come back to their spoken principles. 01:07:53.460 |
I'd love to see the professors with tenure get bravery. 01:07:56.180 |
I would love to see, I mean, it would be fantastic. 01:07:58.820 |
You know, my partner and I've done like a lot of public speaking on this topic. 01:08:01.800 |
It's been intended to not just be harsh, but also be like, okay, like these, these challenges 01:08:07.580 |
By the way, let me also say something positive, you know, especially post October 7th, there 01:08:11.320 |
are a bunch of very smart people who are major donors and board members of these institutions 01:08:15.480 |
like Marc Rowan, you know, who are really coming in trying to, you know, I think legitimately fix things. 01:08:20.540 |
I have a friend on the executive committee at one of the top technical universities. 01:08:23.540 |
He's working overtime to try to do this, man. 01:08:28.340 |
Um, but I, but the counter question would just be like, do you see it actually happening 01:08:33.500 |
Uh, I'm a person that believes in leadership. 01:08:36.180 |
If you have the right leadership, the whole system can be changed. 01:08:41.560 |
So here's a question for your friend who has tenure at one of these places, which is, who runs the place? 01:08:46.040 |
I think, you know, you know who I think runs it. 01:08:49.300 |
Whoever the fuck says they run it. That's what great leadership is. Like a president. 01:08:54.320 |
The president of the university has the leverage because they can mouth off like Elon can. 01:08:59.600 |
They can fire them through being vocal publicly, yes. 01:09:11.720 |
The professors and the students, the professors and the feral students. 01:09:13.340 |
And they're of course in a radicalization feedback cycle driving each other crazy. 01:09:21.640 |
What happens when you're put in charge of a bureaucracy where 01:09:22.640 |
the thing that the bureaucracy knows is that they can outlast you. 01:09:25.840 |
The thing that the tenured professors at all these places know is it doesn't matter who 01:09:28.680 |
the president is because they can outlast them because they cannot get fired. 01:09:32.640 |
By the way, it's the same thing that bureaucrats in the government know. 01:09:35.000 |
It's the same thing that the bureaucrats in the Department of Education know. 01:09:40.680 |
It's, I mean, it's the whole thing that, it's the resistance. 01:09:47.860 |
That's definitely a crisis that needs to be solved. 01:09:50.120 |
And I also don't like that I'm defending academia here. 01:09:53.080 |
I agree with you that the situation is dire, but I just think that these institutions are important. 01:10:01.120 |
And I should also add context since you've been grilling me a little bit. 01:10:07.100 |
And earlier offline in this conversation, you said the Dairy Queen is a great restaurant. 01:10:17.320 |
So everything Marc Andreessen is saying today, put that into context. 01:10:21.680 |
Just one day, one day you should walk down there and order a Blizzard. 01:10:30.240 |
And they'll put, they'll put anything in there you want. 01:10:33.240 |
I think I'll close by saying, look, to my friends in the university system, I would 01:10:35.760 |
just say, look, like I, this is the challenge. 01:10:37.600 |
Like I would just, I would just pose this as the challenge. 01:10:39.640 |
Like to me, like this is having had a lot of these conversations, like this is the bar 01:10:42.920 |
that in my view, this is the conversation that actually has to happen. 01:10:47.160 |
These problems need to be confronted directly. 01:10:51.480 |
I mean, I'm actually worried kind of on the other side, there's too much happy talk in the system. 01:10:55.320 |
I think the taxpayers do not understand this level of crisis. 01:10:58.320 |
And I think if the taxpayers come to understand it, I think the funding evaporates. 01:11:02.040 |
And so I, I think the, the fuse is going through, you know, no fault of any of ours, but like 01:11:06.480 |
the fuse is going and there's some window of time here to fix this and address it and 01:11:09.920 |
justify the money because it just normal taxpayers sitting in normal towns in normal jobs are 01:11:15.840 |
not going to tolerate this for, for that much longer. 01:11:18.960 |
You've mentioned censorship a few times, let us, if we can go deep into the darkness of 01:11:23.760 |
the past and how the censorship mechanism was used. 01:11:27.280 |
So you are a good person to speak about the history of this because you were there on 01:11:32.720 |
the ground floor in 2013 ish, Facebook, I heard that you were there when they invented 01:11:41.480 |
or maybe developed the term hate speech in the context of censorship on social media. 01:11:51.380 |
So take me through that history, if you can, the use of censorship. 01:11:55.960 |
So I was there on the ground floor in 1993, there's multiple floors to this building. 01:12:04.320 |
So I got the first ask to implement censorship on the internet, um, which was in the web browser. 01:12:12.080 |
In 1993, actually in 1992, uh, I was asked to implement a nudity filter. 01:12:14.240 |
Did you have the courage to speak up back then? 01:12:16.680 |
I did not have any problems speaking up back then. 01:12:24.560 |
And look, you know, it was, in some sense, a legitimate request, 01:12:27.480 |
which is, we were working on a research project actually funded by the federal government at a public university. 01:12:32.640 |
So, you know, I don't think my boss was like in any way out of line, but it was like, yeah, 01:12:35.320 |
like, this web browser thing is great, but like, could it just make sure to not have nudity in it. 01:12:40.720 |
But if you think about this for a second, as a technologist, I had an issue, which is, how would you even do that? 01:12:46.440 |
And so I had a brief period where I tried to imagine an algorithm, um, that I referred 01:12:49.520 |
to as the breast detection algorithm, um, that I was going to have to design. 01:12:55.200 |
And then apparently a variety of other body parts people are also sensitive about. 01:12:59.800 |
And, um, and then I politely declined to do this, citing just the technical difficulties. 01:13:03.800 |
Well, number one, I did, I actually didn't know how to do it, but number two is just 01:13:06.160 |
like, no, I'm not, I'm not building, I'm just not building a censorship engine. 01:13:10.920 |
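For a sense of why a 1992 "breast detection algorithm" was a non-starter, the crude approach that circulated for years afterward was skin-tone thresholding: count the pixels whose colors fall inside a rough skin-color range and flag the image if that share is high. A hypothetical sketch, not anything Andreessen built or proposed, with made-up thresholds purely for illustration:

```python
# Crude, hypothetical "nudity filter" sketch: skin-pixel ratio heuristic.
# The color thresholds are illustrative guesses, not a real or reliable detector.

def looks_like_skin(r, g, b):
    """Very rough skin-tone test in RGB space (illustrative only)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def naive_nudity_score(pixels):
    """Fraction of pixels passing the crude skin test.

    `pixels` is an iterable of (r, g, b) tuples with 0-255 channels.
    """
    pixels = list(pixels)
    if not pixels:
        return 0.0
    skin = sum(looks_like_skin(r, g, b) for r, g, b in pixels)
    return skin / len(pixels)

# A face close-up, a beach photo, and a desert landscape can all score
# "high" under this heuristic, which is roughly why the approach never
# worked well, and why the request was technically unreasonable in 1992.
```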
Um, and, and in those days it was, you know, in those days the internet generally was, 01:13:15.200 |
It was actually interesting, as sort of pre-'93 the internet was such a specific niche community. 01:13:20.960 |
Like it was like the million kind of highest IQ nerds in the world. 01:13:25.160 |
Um, and so it actually didn't really have a lot of, you know, issues. People 01:13:29.080 |
were, like, super interested in talking about, like, astrophysics and not very interested 01:13:32.040 |
in, you know, even politics at that time. 01:13:34.560 |
So there really was not an issue there, but, um, yeah, I didn't want to start down that path. 01:13:38.560 |
Um, so I think the way to think about this, so first of all, um, you know, yeah, so I 01:13:42.320 |
was involved in this with Facebook, by the way, I've been involved in this with 01:13:45.180 |
Facebook every step of the way. I joined the board there in 2007. 01:13:47.840 |
So I saw, I've seen everything in the last, you know, almost 20 years, um, every step 01:13:52.320 |
But also I've been involved in most of the other companies over time. 01:13:54.040 |
So I was an angel investor in Twitter and knew them really well. 01:13:56.400 |
Um, we were the founding investor in Substack. 01:13:58.920 |
I'm part of the Elon takeover of Twitter with X. 01:14:03.280 |
So I, I, I've been in these, we were the funder of Pinterest. 01:14:05.880 |
We were one of the main investors there, and Reddit, um, as well. 01:14:09.440 |
And I was having these conversations with all these guys all the way through. 01:14:11.800 |
So I can't talk as much specifically about Facebook, but I can just tell you, like, the general story. 01:14:16.000 |
And for quite a while, it was kind of all the same across these companies. 01:14:20.000 |
So, so basically the way to think about this, the, the, the true kind of nuanced view of 01:14:23.400 |
this is that there is practically speaking, no internet service that can have zero censorship. 01:14:28.120 |
Um, and by the way, that also mirrors the fact that there is no country that actually has unlimited speech. 01:14:34.500 |
The U.S. First Amendment actually has 12 or 13 formal carve-outs from the Supreme Court 01:14:38.880 |
over time, you know, so incitement to violence and terrorist recruitment and child abuse. 01:14:44.320 |
And so, you know, child pornography and so forth are like, they're not covered by the First Amendment. 01:14:48.040 |
Um, and just practically speaking, if you and I are going to start an internet company 01:14:51.440 |
and have a service, we can't have that stuff either. 01:14:54.320 |
Cause it's illegal or it will just clearly, you know, destroy the whole thing. 01:14:56.520 |
So you, you, you're always gonna have a censorship engine. 01:14:59.480 |
I mean, hopefully it's not actually in the browser, but like, you're going to have it 01:15:02.440 |
for sure at the level of an, of an internet service. 01:15:05.640 |
But then what happens is now you have a machine, right? Now you have 01:15:09.040 |
a system where you can put in rules saying we allow this, we don't allow that. 01:15:17.480 |
Um, and once that system is in place, like it becomes the ring of power, right? 01:15:22.000 |
Which is like, okay, now anybody in that company or anybody associated with a company or anybody 01:15:26.180 |
who wants to pressure that company will just start to say, okay, you should use that machine 01:15:29.480 |
for more than just terrorist recruitment and child pornography. 01:15:34.240 |
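The "ring of power" point is mechanical: once a service has a rules engine for the genuinely illegal categories, expanding its scope is a one-line change. A hypothetical sketch of such a policy layer, with placeholder category names and a stubbed classifier, not any real platform's system:

```python
# Illustrative moderation policy layer (hypothetical, simplified).
# Once this machinery exists, "policy" is just a set you can append to.

BLOCKED_CATEGORIES = {
    "csam",                    # illegal everywhere; every service must block it
    "terrorist_recruitment",
}

def classify(post_text):
    """Stand-in for a real classifier; returns a set of category labels."""
    labels = set()
    if "recruit for attack" in post_text.lower():
        labels.add("terrorist_recruitment")
    return labels

def allowed(post_text):
    """A post is allowed if none of its labels are in the blocked set."""
    return not (classify(post_text) & BLOCKED_CATEGORIES)

# The scope expansion described in the conversation is trivial to express:
# BLOCKED_CATEGORIES.add("hate_speech")
# BLOCKED_CATEGORIES.add("misinformation")
```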
Um, and basically that transition happened, call it 2012, 2013, is when that started. 01:15:42.900 |
I think the kickoff to it for some reason was this, it was the beginning of the second Obama term. 01:15:46.900 |
Um, I think it also coincided with the sort of arrival of the first kind of super woke 01:15:52.080 |
kids into these, into these schools, um, you know, that kind of, you know, it's, it's the 01:15:56.200 |
kids that were in school between, like, you know, the Iraq war and then the global financial crisis. 01:16:00.580 |
And like, they came out like super radicalized, they came into these companies and they immediately 01:16:03.280 |
started mounting these social crusades to ban and censor, uh, lots of things. 01:16:08.120 |
And then, you know, quite frankly, the democratic party figured this out, um, and they figured 01:16:11.020 |
out that these companies were, you know, very subject to being controlled and they, and 01:16:14.220 |
the, you know, the executive teams and boards of directors are almost all Democrats. 01:16:17.200 |
And, you know, there's tremendous circulation, a lot of Obama people from the first term 01:16:20.720 |
actually came and worked in these companies and a lot of FBI people and other, you know, 01:16:25.000 |
law enforcement intelligence people came in and worked and they're all, you know, they 01:16:29.600 |
Um, and so they just, you know, the, the, the ring of power was lying on the table, 01:16:33.160 |
it had been built, and they, you know, picked it up and put it on, and then they just ran with it. 01:16:38.160 |
And the original discussions were basically always on two topics. 01:16:41.360 |
It was hate speech and misinformation, uh, hate speech was the original one. 01:16:45.520 |
Um, and the hate speech conversation started exactly like you'd expect, which is we can't 01:16:49.620 |
have the N-word, to which the answer is, fair enough. 01:16:57.100 |
And then, and Jordan Peterson has talked a lot about this, 01:16:59.760 |
The definition of hate speech ended up being things that make people uncomfortable, right? 01:17:03.940 |
So we can't have things that make people uncomfortable. 01:17:05.740 |
I, of course, you know, people like me that are disagreeable, raise their hands and say, 01:17:09.380 |
well, that idea right there makes me uncomfortable, but of course that doesn't count as hate speech, 01:17:15.100 |
So, you know, the ring of power works on one hand and not on the other. 01:17:19.060 |
Uh, and then basically that began this slide where it ended up being that, you know, completely 01:17:24.560 |
anodyne, this is the point that Mark has been making recently, like completely anodyne comments 01:17:28.740 |
that are completely legitimate on television or on the Senate floor 01:17:31.820 |
all of a sudden are hate speech and can't be said online. 01:17:33.520 |
So that, you know, the, the ring of power was wielded in grossly irresponsible ways. 01:17:37.400 |
We can talk about all the stuff that happened there. 01:17:40.440 |
And there was a little bit of that early on, but of course that really accelerated later. 01:17:45.680 |
Um, so, so the hate speech stuff, the hate speech stuff predated Trump by like three years. 01:17:50.720 |
Um, the misinformation stuff was basically, uh, it was a little bit later. 01:17:56.360 |
Um, and then that was, you know, a ring of power that was even more powerful, right? 01:18:01.400 |
Because, you know, hate speech is like, okay, at some point, is something offensive 01:18:05.000 |
or not, like at least you can have a question as to whether that's the case. 01:18:07.760 |
But the problem with misinformation is like, is it the truth or not? 01:18:11.200 |
You know, what do we know from 800 years or whatever of Western civilization? It's 01:18:15.400 |
that, you know, there's only a few entities that can determine the truth on every topic. 01:18:18.960 |
You know, there's God, you know, there's the King. 01:18:21.880 |
We don't have those anymore and the rest of us are all imperfect and flawed. 01:18:25.360 |
And so the idea that any group of experts is going to sit around the table and decide 01:18:28.640 |
on the truth is, you know, deeply anti-Western and deeply authoritarian and somehow the misinformation 01:18:33.400 |
kind of crusade went from the Russiagate hoax into just full-blown, 01:18:37.480 |
we're going to use that weapon for whatever we want. 01:18:40.340 |
And then, and then of course then, then the culminating moment on that, that really was 01:18:43.080 |
the straw that broke the camel's back was, um, we're going to censor all theories that 01:18:47.160 |
the COVID virus might've been manufactured in a lab as misinformation. 01:18:50.960 |
And then, inside these companies, like, that was the point where people, for the 01:18:54.200 |
first time, this is like what, three years ago, for the first time they were like, 01:18:57.920 |
okay, that was when it sunk in, where it's just like, okay, this has gone completely off the rails. 01:19:02.320 |
So anyway, that, that, that's how we got to where we are. 01:19:05.040 |
And then basically that spell lasted, that complex existed and got expanded, until basically Substack came along. 01:19:18.440 |
And so, and I'm super proud of those guys cause they started from scratch and declared 01:19:23.200 |
right up front that they were going to be a, a, a free speech platform. 01:19:26.480 |
Um, and they came under intense pressure, um, including from the press, um, and you 01:19:31.720 |
know, they tried to just beat them to the ground and kill them, and intense pressure, by the 01:19:34.800 |
way, from, you know, let's say certain of the platform companies as well. 01:19:39.160 |
Um, and they stood up to it and, you know, sitting here today, they have the widest spectrum 01:19:42.600 |
of speech and conversation I've seen, you know, anywhere on planet Earth. 01:19:49.000 |
Uh, and then obviously Elon, uh, you know, with X was the hammer. 01:19:53.320 |
I believe the third one now is what Mark is doing at Facebook. And there's also, like, 01:19:59.200 |
I think you've spoken about this, which, uh, like Jon Stewart going on Stephen Colbert 01:20:06.120 |
and talking about the lab leak theory, I just, there's certain moments to just kind of shake 01:20:12.320 |
everybody up the right person, the right time, just it's a wake up call. 01:20:17.880 |
So that there, and I will tell you like, I know I should say Jon Stewart attacked me 01:20:21.200 |
recently, so I'm not that thrilled about him. 01:20:23.200 |
But I would say I was a long run fan of Jon Stewart. 01:20:26.280 |
I watched probably every episode of the daily show when he was on it, um, you know, for 01:20:29.880 |
probably 20 years, but he did a very important public service and it was that appearance 01:20:35.360 |
And I don't know how broadly this is, you know, at the time it was in the news briefly, 01:20:38.160 |
but I don't know if people remember this, but I will tell you, in the rooms where 01:20:41.400 |
people discuss what is misinformation and these policies, that was a very big moment. 01:20:45.040 |
That was probably actually the key catalyzing moment. 01:20:47.200 |
And I think he exhibited, I would say conspicuous bravery and had a big impact with that. 01:20:50.880 |
Um, and yeah, for people who don't recall what he did, and this 01:20:54.000 |
was in the full blown, like, you absolutely, you know, you absolutely must lock down, 01:20:58.440 |
You absolutely must keep all the schools closed. 01:20:59.480 |
You absolutely must have everybody work from home. 01:21:01.480 |
You absolutely must wear a mask, like the whole thing. 01:21:04.160 |
Um, and then one of those was you absolutely must believe that COVID was completely natural. 01:21:09.940 |
You must believe that, and not believing that means you're a fascist, Nazi, Trump supporter, whatever. 01:21:19.520 |
Um, and like I said, that was the peak, and Jon Stewart went 01:21:23.120 |
on the Colbert show and I don't know if they planned it or not because Colbert looked shocked. 01:21:26.560 |
I don't know how much it was a bit, but he went on there and he, he just had one of these 01:21:29.880 |
like the Emperor's wearing no clothes things where he said, it's just not plausible that 01:21:34.600 |
you had the COVID super virus appear 300 yards down the street from the Wuhan Institute of Virology. 01:21:42.440 |
Um, like it's just not plausible, certainly, that you could just rule it out. 01:21:47.440 |
Another key moment, actually the more serious version was, I think the author Nicholson 01:21:50.100 |
Baker wrote a big piece for New York magazine. 01:21:52.000 |
Uh, and Nicholson Baker is like one of our great novelist writers, um, of our time. 01:21:56.040 |
And he wrote the piece and he did the complete addressing of it. 01:21:58.760 |
Um, and that was the first, I think that was the first legit one. There had been, like, alt, 01:22:02.840 |
you know, renegade, there had been, you know, people running around saying this, but getting dismissed. 01:22:06.280 |
That was the first one that was like in the mainstream press where he, and he talked to 01:22:09.760 |
all the heretics and he just like laid the whole thing out. 01:22:13.600 |
And I remember, let's say, a board meeting at one of these companies after that, where 01:22:16.720 |
basically, you know, everybody looked around the table and it was like, all right, like 01:22:19.760 |
I guess we're not, we don't need to censor that anymore. 01:22:24.000 |
And you know, and then of course what immediately follows from that is, well, wait a minute, 01:22:26.840 |
why were we censoring that in the first place? 01:22:30.320 |
Like, and then, you know, the downstream, not that day, but the downstream conversations 01:22:32.920 |
were like, okay, if we made such a giant, in retrospect, if we all made such 01:22:36.480 |
a giant collective mistake censoring that, then what does that say about the rest of what we've been censoring? 01:22:41.240 |
And I think that was the thread in the sweater that started to unravel it. 01:22:44.960 |
I do think that the Jon Stewart appearance and the statement he made was a courageous 01:22:50.960 |
I think we need to have more of that, of that in the world. 01:22:53.400 |
And like you said, Elon, everything he did with X is, is, is a series of courageous acts. 01:23:01.080 |
And I think what, um, Zuck, what Mark Zuckerberg did on Rogan a few days ago is a courageous act. 01:23:11.960 |
He has become, I think, an outstanding communicator, right. 01:23:14.440 |
Um, and he's, you know, somebody who came in for a lot of criticism earlier in his career 01:23:18.400 |
And I think he's, you know, he's one of these guys who can sit down and talk for three hours. 01:23:23.880 |
And, and, you know, as, as you do with, with all of, all of your episodes, like when somebody 01:23:27.840 |
sits and talks for three hours, like, you really get a sense of somebody, cause it's really hard to fake it for that long. 01:23:33.240 |
And, you know, he's now done that repeatedly. 01:23:36.000 |
And then look, I, again, I would maybe put him in the third category now with, um, certainly 01:23:39.280 |
after that appearance, uh, I would say, um, I would put him up there now with, you know, 01:23:43.000 |
kind of Elon and Trump, in the sense that the public and the private are the same. 01:23:46.800 |
I guess I'd say that, like he, he, he said on that show, what he really believes. 01:23:50.240 |
He said all the same things that he says in private. 01:23:51.920 |
Like I don't think there's really any, any discrepancy anymore. 01:23:54.560 |
Um, I would say he has always taken upon himself a level of obligation responsibility to running 01:24:01.040 |
a company, the size of meta and to running services that are that large. 01:24:05.480 |
Um, and I think, you know, his conception of what he's doing, which I think is correct 01:24:08.840 |
he's running services that are bigger than any country, right? He's running, you know, 01:24:11.720 |
over 3 billion people use those services. Um, and so, and then, you know, the company has, 01:24:16.760 |
you know, many tens of thousands of employees and many investors, and it's a public company, 01:24:20.360 |
and he thinks very deeply and seriously about his responsibilities. Um, and so, you know, 01:24:25.080 |
he has not felt like he has had, let's just say the complete flexibility that Elon has had. Um, 01:24:29.880 |
and, you know, people could argue that one way or the other, but, you know, he's, he's, you know, 01:24:33.800 |
yeah, he's, he's, you know, he talked about a lot, he's, he's evolved a lot. A lot of it was, 01:24:37.480 |
he learned a lot. And by the way, I'm going to put myself right back up there. Like, 01:24:40.760 |
I'm not claiming any huge foresight or heroism on any of this. Like, I've also learned a lot, 01:24:45.080 |
like, like I, my views on things are very different than they were 10 years ago 01:24:48.680 |
on lots of topics. And so, um, you know, I would, I've been on a learning journey. He's been on a 01:24:52.600 |
learning journey. Um, he is a really, really good learner. Um, he assimilates information, 01:24:58.040 |
you know, as good as, or better than anybody else I know. Um, the other thing I guess I would just 01:25:03.000 |
say is he talked on that show about something very important, which is when you're in a role where 01:25:07.800 |
you're running a company like that, there are a set of decisions that you get to make and, 01:25:11.720 |
and you deserve to be criticized for those decisions and so forth. And it's valid. 01:25:15.000 |
Um, but you are under tremendous external pressure, um, as well. Uh, and, and by the 01:25:20.440 |
way, you're under tremendous internal pressure. You've got your employees coming at you. You've 01:25:24.440 |
got your executives in some cases coming at you. You've got your board in some cases coming at you. 01:25:28.680 |
Um, you've got your shareholders coming at you. So you, you've got your internal pressures, 01:25:32.360 |
but you also have the press coming at you. You've got academia coming at you. You've got the entire 01:25:37.320 |
nonprofit complex coming, activist complex coming at you. And then really critically, you know, 01:25:42.200 |
he talked about it, Rogan, um, and these companies all went through this in this last, especially 01:25:47.000 |
five years, you had the government coming at you. Um, and you know, that's the really, you know, 01:25:51.720 |
stinky end of the pool, um, where, you know, the government was in my view, you know, illegally 01:25:56.280 |
exerting, um, you know, just in flagrant violation of the first amendment and federal laws on speech 01:26:00.600 |
and coercion and, uh, and conspiracy, uh, forcing these companies to, uh, engage in activities. Um, 01:26:05.800 |
you know, again, in some cases they may have wanted to do, but in other cases they clearly 01:26:08.840 |
didn't want to do and felt like they had to do. And the level of pressure, like I say, like I've 01:26:14.600 |
known every CEO at Twitter, um, they, they've all had the exact same experience, which when they were 01:26:18.760 |
in the job, it was just daily beatings. Like it's just getting punched in the face every single day 01:26:23.960 |
constantly. And, you know, Mark is very good at getting physically punched in the face. 01:26:30.840 |
And he is, and, and he, you know, and he's very good at, you know, taking a punch and he has taken 01:26:35.000 |
many, many punches. So I would encourage people to have a level of sympathy, for these are not kings. 01:26:39.720 |
These, these are people who operate with like, I would say, extraordinary levels of external 01:26:43.160 |
pressure. I think if I had been in his job for the last decade, I would be a little puddle on the 01:26:47.320 |
floor. Um, and so it, it, it says, I think a lot about him that he has, you know, risen to this 01:26:52.520 |
occasion the way that he has. And by the way, I should also say, you know, the, the cynicism of 01:26:55.720 |
course is immediately out. And, you know, it's a legitimate thing for people to say, but, you know, 01:26:59.720 |
it's like, oh, you're only doing this because of Trump or, you know, whatever. And it's just like, 01:27:02.840 |
no, like he has been thinking about and working on these things and trying to figure them out for a 01:27:06.840 |
very long time. And so I, I think what you saw are legitimate, deeply held beliefs, not some, 01:27:11.720 |
you know, you know, sort of just in the moment thing that could change at any time. 01:27:15.000 |
So what do you think it's like to be him and, uh, other leaders of companies to be you 01:27:20.120 |
and withstand internal pressure and external pressure? What's that life like? Is it deeply 01:27:26.680 |
lonely? That's a great question. So leaders are lonely to start with. And this is one of those 01:27:30.680 |
things where almost nobody has sympathy, right? Nobody feels sorry for a CEO, right? Like it's 01:27:35.240 |
not a thing. Right. Um, and, you know, and again, legitimately, so like CEOs get paid a lot, like 01:27:39.880 |
the whole thing, there's a lot of great things about it. So it's not like they should be out 01:27:42.520 |
there asking for a lot of sympathy, but it is the case that they are human beings. Um, and it is the 01:27:46.360 |
case that it is a lonely job. And the reason it's a lonely job, um, is because your words carry 01:27:50.920 |
tremendous weight. Um, and you are dealing with extremely complicated issues and you're under a 01:27:56.120 |
tremendous amount of emotional, you know, personal, emotional stress. Um, and you know, you often end 01:28:00.920 |
up not being able to sleep well and you end up not being able to like keep up an exercise routine and 01:28:04.520 |
all those things. And, you know, you come under family stress because you're working all the time. 01:28:07.800 |
My partner, Ben, you know, was, he was CEO of our last company before we started the venture 01:28:12.040 |
firm. He, he said, you know, the problem he had like with, with his family life was he would, 01:28:15.480 |
even when he was home at night, he wasn't home because he was in his head trying to solve all 01:28:20.120 |
the business problems. And so he was like supposed to be like having dinner with his kids and he was 01:28:23.080 |
physically there, but he wasn't mentally there. So, you know, you kind of get, you get that a lot, 01:28:26.920 |
but the key thing is like, you can't talk to people, right? So you can't, I mean, you can 01:28:29.880 |
talk to your spouse and your kids, but like, they don't understand that they're not working in your 01:28:32.920 |
company. They don't understand, they don't have the context to really help you. If you talk to 01:28:36.840 |
your executives, they all have agendas, right? And so they're all, they're all, and they can't 01:28:41.160 |
resist, like it's just human nature. And, and, and so you, you can't necessarily rely on what 01:28:45.480 |
they say. It's very hard in most companies to talk to your board because they can fire you. 01:28:50.520 |
Right now, now Mark has the situation because he has control. It actually turns out he can 01:28:54.840 |
talk to his board and Mark talks to us about many things that he does that most CEOs won't talk to 01:28:58.520 |
the boards about because we literally, because we can't fire him. Um, but in the general case, 01:29:03.240 |
including all the CEOs of Twitter, none of them had control. And so they could all get 01:29:06.680 |
fired. So, um, you can't talk to the board members. They're going to fire you. You can't talk to the 01:29:10.920 |
shareholders because they'll just like dump your stock, right? Like, okay, so who's the, so, so the, 01:29:15.960 |
so every once in a while, what you find is basically the best case scenario they have is 01:29:19.160 |
they can talk to other CEOs. Um, and there's these little organizations where they kind of pair up 01:29:23.080 |
and do that. And so they maybe get a little bit out of that, but, but even that's fraught with 01:29:26.200 |
peril because can you really talk about confidential information with another CEO 01:29:29.960 |
insider trading risk? Um, and so it's just a very, it's just a very lonely and isolating 01:29:34.360 |
thing to start with. Um, and then you, you, and then on top of that, you apply pressure, 01:29:39.000 |
um, right. And that's where it gets painful. And then maybe I'll just spend a moment on 01:29:42.280 |
this internal external pressure thing. Um, my general experience with companies is that they 01:29:48.840 |
can withstand most forms of external pressure as long as they retain internal coherence, 01:29:53.960 |
right? So as long as the internal team, um, is really bonded together and supporting each other, 01:30:00.520 |
um, most forms of external pressure you can withstand. And by that, I mean, 01:30:04.840 |
investors dump your stock, you lose your biggest customers, you know, whatever negative 01:30:10.040 |
article, you know, negative headline, you know, um, you can, you can withstand all that. And 01:30:14.600 |
basically, and in fact, many of those forms of pressure can be bonding experiences for the team 01:30:17.960 |
where they, where they, where they come out stronger. Um, what you 100% cannot withstand 01:30:22.120 |
is the internal crack. And what I always look for in high pressure corporate situations now 01:30:27.160 |
is the moment when the internal team cracks, because I know the minute that happens, 01:30:32.840 |
we're in a different regime. Like it's like the, you know, the solid has turned into a liquid, 01:30:37.400 |
like we're in a different regime. And like the whole thing can unravel in the next week. Cause 01:30:40.840 |
then people turn on each other. I mean, this is, I guess this is what's happening in Los Angeles right now. The, 01:30:44.280 |
the, the, the mayor and the fire chief turned on each other and that's it. That government is 01:30:50.040 |
dysfunctional. It is never going to get put back together again. It is over. It is not going to 01:30:53.880 |
work ever again. And that's what happens inside companies. Um, and so, so, so somebody like Mark 01:30:59.960 |
is under like profound internal pressure and external pressure at the same time. Now he's 01:31:04.520 |
been very good at maintaining the coherence of his executive team, but he has had over the years, 01:31:09.720 |
a lot of activist employees, um, as a lot of these companies have had. And so that's 01:31:13.560 |
been continuous pressure. And then the final thing I'd say, as I said, that companies can 01:31:17.240 |
withstand most forms of external pressure, but not all. The special one, the one they cannot withstand, is 01:31:21.400 |
government pressure. Um, it's that when your government comes for you, like, yeah, any CEO 01:31:28.040 |
who thinks that they're bigger than the government, um, has that notion beaten out of them 01:31:31.640 |
in short order. Can you just linger on that? Cause it is, uh, maybe educating and deeply 01:31:38.840 |
disturbing. You've spoken about it before, but we're speaking about again, this government 01:31:43.480 |
pressure. So you think they've crossed the line into essentially criminal levels of pressure, 01:31:50.120 |
flagrant criminality, felonies, like obvious felonies. And I can, I can actually cite the 01:31:55.880 |
laws, but yes, uh, absolute criminality. Can you explain how it was possible for that to happen? 01:32:03.320 |
Maybe on a hopeful note, how we can avoid that happening again. 01:32:07.720 |
So as to start with is a lot of this now is in the public record, um, which is good because it 01:32:12.360 |
needs to be in the public record. And so there's, there's three forms of things that are in the 01:32:15.000 |
public record that people can look at. So one is the Twitter files, um, right. We, which Elon put 01:32:19.240 |
out with a set of journalists when he took over. And I will just tell you, the Twitter files are 01:32:22.680 |
a hundred percent representative of what I've seen at every other one of these companies. 01:32:25.960 |
And so you can just see what happened in Twitter and you can just assume that that 01:32:29.480 |
happened in these other companies. Um, you know, for the most part, certainly in terms of the kind 01:32:33.000 |
of pressure that they got. So that's, that's number one, that stuff, you can just read it 01:32:36.440 |
and you should, if you haven't. Um, the second is, um, Mark references in the Rogan podcast, 01:32:41.960 |
there's a Congressman Jim Jordan who has a committee congressional committee called the 01:32:45.240 |
weaponization committee. And they, in the last, you know, whatever, three years have done a full 01:32:49.640 |
scale investigation of this and Facebook produced a lot of documents into that investigation. And 01:32:54.600 |
those have many of those have now been made public and you can download those reports. 01:32:57.880 |
And there's like, I'd like 2000 pages worth of material on that. And that's essentially 01:33:01.560 |
the Facebook version of the Twitter files just arrived at with a different mechanism. Um, and 01:33:06.040 |
then third is Mark himself talking about this on Rogan. So I'll just defer to his 01:33:10.040 |
comments there, but yeah, basically what, what those three forms of information show you is 01:33:14.920 |
basically the government, you know, over time, um, and then culminating in 2020, 2021, you know, 01:33:20.600 |
in the last four years, just decided that the first amendment didn't apply to them. Um, and 01:33:24.600 |
they just decided that, um, federal laws around free speech and around, uh, conspiracies to take 01:33:29.720 |
away the rights of citizens just don't apply. Um, and they just decided that they can just 01:33:34.200 |
arbitrarily pressure, um, uh, just like literally arbitrarily call up companies and threaten and 01:33:38.680 |
bully, um, and yell and scream and, and, you know, threaten repercussions and force people to 01:33:43.160 |
force them to censor. Um, and you know, there's this whole thing of like, well, the first amendment 01:33:47.800 |
only applies to, you know, the government, it doesn't apply to companies. It's like, well, 01:33:50.760 |
there's actually a little bit of nuance to that. First of all, it definitely applies to the 01:33:54.680 |
government. Like 100%, the first amendment applies to the government by the way. So does the fourth 01:34:00.600 |
amendment and the fifth amendment, including the right to due process also applies to the 01:34:03.720 |
government. There was no due process at all to any of the censorship regime that was put in place. 01:34:08.120 |
There was no due process put in place by the way for debanking either. Those are just as serious 01:34:12.840 |
violations as the, as the free speech violations. And so this is just like flagrant, flagrant 01:34:17.160 |
unconstitutional behavior. And then there are specific federal statutes. Um, there's, it's 18 U.S.C. 01:34:21.240 |
241 and 18 U.S.C. 242. And one of them applies to federal employees, government employees. 01:34:26.200 |
And the other one applies to private actors, um, uh, around what's called deprivation of rights, 01:34:31.400 |
um, and conspiracy to deprive rights. And it is not legal according to the United States 01:34:35.400 |
criminal code for government employees or in a conspiracy private entities to take away 01:34:40.120 |
constitutional rights. And interestingly, some of those constitutional rights are enumerated, 01:34:45.160 |
for example, in the first amendment freedom of speech. And then some of those rights actually 01:34:48.680 |
do not need to be enumerated. It is, if the government takes away rights that you have, 01:34:54.120 |
they don't need to be specifically enumerated rights in the constitution in order to still be 01:34:57.560 |
a felony. The constitution very specifically does not say you only have the 01:35:02.760 |
rights that it gives you. It says you have all the rights that have not been previously defined 01:35:06.520 |
as being taken away from you. Right. And so debanking qualifies: the right, you know, the right 01:35:11.160 |
to access the financial system is every bit as much subject to these laws as, as, 01:35:14.920 |
as free speech. Um, and so, yeah, this has happened. And then I'll just add, add one final 01:35:19.000 |
thing, which is, we've talked about two parties so far, start with the government employees. Um, 01:35:22.600 |
and then we've talked about the companies, the government employees for sure have misbehaved. Um, 01:35:27.160 |
the companies, there's a very interesting question there as to whether they are victims or 01:35:30.600 |
perpetrators or both. Um, you know, they will defend and they will argue, and I believe they 01:35:35.560 |
have a good case that they are victims, not perpetrators, right? They are the downstream 01:35:38.760 |
subjects of pressure, not the cause, you know, not the cause of pressure, but there's a big 01:35:43.240 |
swath of people who are in the middle. And specifically the ones that are funded by the 01:35:46.520 |
government that I think are in possibly pretty big trouble. And that's all of these third-party, 01:35:51.160 |
um, censorship bureaus. Um, I mean, the one that sort of is most obvious is the so-called, 01:35:55.880 |
uh, Stanford internet observatory, um, that got booted up there over the last several years. 01:36:00.440 |
And they basically were funded by the federal government to be third-party censorship 01:36:04.520 |
operations. And they're private sector actors, but acting with federal funding. And so it puts 01:36:10.840 |
them in this very interesting spot where there, there could be, you know, a very obvious theory 01:36:15.400 |
under which they're basically acting as agents of the government. Um, and so I think they're 01:36:19.080 |
also very exposed on this and have behaved in just flagrantly illegal ways. So fundamentally, 01:36:23.240 |
government should not do any kind of pressure, even soft pressure on companies to censor. 01:36:29.960 |
Can't, not allowed. It really is disturbing. I mean, it probably started soft, lightly, slowly, 01:36:40.040 |
and then it escalates, as the, uh, the old will to power instructs them to do. Cause you get, 01:36:46.760 |
you get, I mean, yeah, I mean, that's why, that's why there's protection because you can't put a 01:36:52.280 |
check on power for government, right? There are so many ways that they can get you, 01:36:56.200 |
like there are so many ways they can come at you and get you. And, you know, the thing here to 01:36:59.880 |
think about is a lot of times when people think about government action, they think about 01:37:03.240 |
legislation, right? Cause you, so when I was a kid, we got trained at how does government work? 01:37:07.960 |
There was this famous animated short, uh, the thing we got shown was just a cartoon of how 01:37:12.040 |
a bill becomes a law. And it's like this, you know, fancy little bill skipped along and goes 01:37:15.080 |
this: I'm just a bill. Yeah, exactly. Like, it's like, all right, number one, that's not how it 01:37:19.560 |
works at all. Like that, that doesn't actually happen. We could talk about that, but, but, 01:37:22.520 |
but even beyond that, mostly what we're dealing with is not legislation. When we talk about 01:37:26.360 |
government power these days, mostly it's not legislation. Um, mostly it's either regulation, 01:37:30.440 |
which is basically the equivalent of legislation, but having not gone through the legislative 01:37:34.520 |
process, which is a very big open legal issue, and one of the things that DOGE is very focused 01:37:38.200 |
on. Most, most government rules are not legislated. They're regulated. Um, and there's tons 01:37:43.400 |
and tons of regulations that these companies are subject to. So this is another cliche you'll hear a lot, 01:37:46.760 |
which is all private companies can do whatever they want. And it's like, oh no, they can't. 01:37:49.960 |
They're subject to tens of thousands of regulations that they have to comply with. 01:37:53.320 |
Um, and the, the hammer that comes down when you don't comply with regulations is profound. Like 01:37:57.400 |
they can completely wreck your company with no ability, no ability for you to do anything about 01:38:00.760 |
it. So, so, so regulation is a big part of the way the power gets exercised. And then there's 01:38:05.000 |
what's called just flat out administrative power, the term that you'll hear. And administrative 01:38:08.840 |
power is just literally the government telling you, calling you and telling you what to do. 01:38:11.880 |
Here's an example of how this works. So Facebook had this whole program a few years back to do a 01:38:16.040 |
global cryptocurrency for payments called Libra. And they, they built the entire system and it was 01:38:20.520 |
this high scale, you know, sort of new cryptocurrency and they were going to build into every product. 01:38:23.640 |
And there were going to be 3 billion people who could transact with Libra. And they went to the 01:38:26.680 |
government and they went to the, all these different, try to figure out how to make it. 01:38:29.400 |
So it was like fully compliant with anti-money laundering and all these, you know, controls and 01:38:32.360 |
everything. And they had the whole thing ready to go. Two senators wrote letters to the big banks, 01:38:36.360 |
um, saying, we're not telling you that you can't work with Facebook on this, but if you do, 01:38:42.840 |
you should know that every aspect of your business is going to come under greatly increased level of 01:38:47.080 |
regulatory scrutiny, which is of course the exact equivalent of, it sure is a nice corner restaurant 01:38:52.520 |
you have here. It would be a shame if, you know, somebody tossed a Molotov cocktail through the 01:38:55.640 |
window and burned it down tonight. Right. And so that like, what is that letter? Like, it's not a 01:39:00.600 |
law. It's not even a regulation. It's just like straight direct state power. And then, and then, 01:39:06.200 |
and then it culminates in literally calls from the white house where they're just like flat out 01:39:10.520 |
telling you what to do, which is of course what a King gets to do, but not what a president gets to 01:39:14.040 |
do. Um, and, and so anyway, so this, so, so what these companies experienced was they experienced 01:39:18.920 |
the full panoply of this, but the level of intensity was in that order. It was 01:39:23.560 |
actually legislation was the least important part. Regulation was more important. Administrative 01:39:27.560 |
power was more important. And then just like flat out demands, um, and flat out threats were 01:39:31.400 |
ultimately the most important. How do you fix it? Well, first of all, like you have to elect people 01:39:36.520 |
who don't do it. So like, as with all these things, ultimately the fault lies with the voters. 01:39:40.760 |
Um, and so, you know, you have to decide you don't want to live in that regime. I have no idea what 01:39:45.320 |
part of this recent election mapped to the censorship regime. I do know a lot of people on the right 01:39:48.920 |
got very angry about the censorship, but I, you know, I think it probably at least helped with 01:39:52.360 |
enthusiasm on that side. Um, you know, maybe some people on the left will now not want their, 01:39:57.240 |
you know, democratic nominees to be so pro censorship. Um, so the, the voters definitely, 01:40:01.960 |
you know, get a vote. Um, number one, number two, I think you need transparency. You need to know 01:40:06.840 |
what happened. We know some of what happened. Um, Peter Thiel has written in the FT just now 01:40:11.720 |
saying that, after what we've been through in the last decade, we just need broad 01:40:15.000 |
based truth and reconciliation, uh, you know, efforts to really get to the root of things. 01:40:19.000 |
Um, so maybe that's part of it. We need investigations for sure. Ultimately we need 01:40:25.240 |
prosecutions. Like we, we need, ultimately we need people to go to jail, um, cause we need to set 01:40:29.160 |
object lessons that say that you don't get to do this. Uh, and on those last two, I would say that, 01:40:33.640 |
that those are both up to the new administration and I don't want to speak for them and I don't 01:40:37.240 |
want to predict what they're going to do, but they have, they for sure have the ability to 01:40:40.280 |
do both of those things. And you know, we'll, we'll see where they take it. 01:40:43.320 |
Yeah. It's really disturbing. I don't think anybody wants this kind of overreach of power 01:40:46.680 |
for government, including perhaps people that were participating in it. It's like this 01:40:51.400 |
dark momentum of power that you just get caught up in it. And that's the reason there's that kind 01:40:58.920 |
of protection. Nobody wants that. So I use the metaphor of the ring of power and for people who 01:41:03.480 |
don't catch the reference as Lord of the rings. And the thing with the ring of power and Lord of 01:41:07.000 |
the rings, it's the ring that Gollum has in the beginning and it turns you invisible and it turns 01:41:10.600 |
out it like unlocks all this like fearsome power. It's the most powerful thing in the world. 01:41:13.640 |
It's the key to everything. And basically the moral lesson of Lord of the rings, which was, 01:41:18.040 |
you know, written by a guy who thought very deeply about these things is yeah, 01:41:21.160 |
the ring of power is inherently corrupting. The characters at one point they're like, 01:41:24.440 |
Gandalf, just put on the ring and like fix this. Right. And he's like, he'd like, he will not put 01:41:29.000 |
the ring on even to like end of the war, um, because he knows that it will corrupt him. 01:41:34.360 |
And then, you know, there's the character of Gollum, which is the result of, 01:41:37.800 |
you know, it's like a normal character who ultimately becomes, you know, this incredibly 01:41:41.720 |
corrupt and deranged version of himself. And so, I mean, I think you, I think you said something 01:41:45.560 |
actually quite profound there, which is the ring of power is infinitely tempting. You know, 01:41:49.960 |
the censorship machine is infinitely tempting. If you, if you have it, like you are going to use it, 01:41:54.920 |
it's overwhelmingly tempting because it's so powerful and that it will corrupt you. 01:41:59.240 |
And yeah, I, I don't know whether any of these people feel any of this today. 01:42:03.160 |
They should, I don't know if they do. Um, but yeah, you go out five or 10 years later, 01:42:08.600 |
you know, you would hope that you would realize that your soul has been corroded 01:42:11.640 |
and you probably started out thinking that you were a patriot and you were trying to 01:42:15.000 |
defend democracy and you ended up being, you know, extremely authoritarian and anti-democratic 01:42:18.760 |
and anti-Western. Can I ask you a tough question here? Staying on the ring of power, 01:42:25.560 |
Elon is quickly becoming the most powerful human on earth. 01:42:29.880 |
Uh, I'm not sure about that. You don't, you don't think he is? Well, he doesn't have the nukes. So 01:42:38.040 |
nukes. Yeah. There's different definitions and perspectives on power, right? How can he 01:42:47.160 |
and or Donald Trump, uh, avoid the corrupting aspects of this power? I mean, I think the 01:42:53.400 |
danger is there with power. It's just, it's flat out there. I, I would say with Elon, 01:42:57.240 |
I mean, we'll, you know, we'll see. I would say with Elon and I would say by the way, 01:42:59.960 |
overwhelmingly, I would say so far so good. I'm extremely, extremely thrilled by what he's done 01:43:03.560 |
on almost every front, um, for like, you know, the last 30 years, but including all this stuff 01:43:08.840 |
recently, like I think he's, he's been a real hero on a lot of topics where we needed to see 01:43:11.880 |
heroism. But look, I would say, I guess the sort of case that he has this level of power is some 01:43:16.440 |
combination of the money and the, the, and the proximity to the president. Um, and obviously 01:43:20.280 |
both of those are, are, are instruments of power. The counter argument to that is, I do think a lot 01:43:25.640 |
of how Elon is causing change in the world right now. I mean, there's, there's the companies he's 01:43:30.040 |
running directly where I think he's doing very well. Um, and we're investors in multiple of them 01:43:33.480 |
and doing very well. Um, uh, but I think like a lot of the stuff that gets people mad at him is 01:43:39.320 |
like, it's the social and political stuff and it's, you know, it's his statements and then 01:43:42.440 |
it's the downstream downstream effects of his statements. So like, for example, it's, you know, 01:43:45.800 |
for the last couple of weeks it's been him, you know, kind of weighing in on this rape gang scandal, 01:43:49.800 |
you know, this rape organized child rape thing in the UK. Um, and you know, it's, it's, you know, 01:43:55.080 |
it's, it's actually a, it's a preference cascade. It's one of these things where people knew there 01:43:58.520 |
was a problem. They weren't willing to talk about it. It kind of got suppressed. Um, and then, um, 01:44:03.400 |
Elon brought it up and then all of a sudden there's now in the UK, this like massive explosion of 01:44:07.320 |
basically open conversation about it for the first time. And you know, it's like this catalyzing all 01:44:11.240 |
of a sudden everybody's kind of woken up and being like, oh my God, you know, this is really 01:44:14.680 |
bad. And then there will now be, you know, pretty, pretty clearly, big changes as a 01:44:18.520 |
result. So, and Elon was, you know, he played the role of the boy who said the emperor has no clothes, 01:44:23.000 |
right. But, but, but here's the thing, here's my point. Like he said it about something that was 01:44:27.320 |
true, right. And so had he said it about something that was false, you know, he would get no credit 01:44:32.120 |
for it. He wouldn't deserve any credit for it, but he said something that was true. And by the way, 01:44:36.600 |
everybody over there instantly, they were like, oh yeah, he's right. Right. Like nobody, nobody 01:44:40.280 |
seriously said otherwise. They're just arguing the details now. So, so number one, it's like, okay, he says 01:44:44.680 |
true things. And so it's like, okay, how far, I'll put it this way, like how worried are we about 01:44:49.800 |
somebody becoming corrupt by virtue of their power being that they get to speak the truth. 01:44:53.720 |
And I guess I would say, especially in the last decade of what we've been through, where everybody's 01:44:57.320 |
been lying all the time about everything, I'd say, I think we should run this experiment as hard as 01:45:01.000 |
we can to get people to tell the truth. And so, uh, I don't feel that bad about that. And then 01:45:05.160 |
the money side, you know, this rapidly gets into the money and politics question and the money and 01:45:10.760 |
politics question is this very interesting question because, um, it seems like if there's 01:45:15.400 |
a clear cut case that the more money in politics, the worst things are, and the more corrupted the 01:45:19.160 |
system is. Um, that was a very popular topic of public conversation up until 2016. Um, when Hillary 01:45:25.160 |
outspent Trump three to one and lost, you'll notice that money in politics has almost vanished as a 01:45:30.840 |
topic, um, in the last eight years. And once again, Trump spent far less, you know, Kamala raised and 01:45:35.480 |
spent $1.5 billion on top of what Biden had spent. So they were, they were at, I don't know, something 01:45:40.120 |
like $3 billion total. And Trump, I think, spent again, like a third or a fourth of that. 01:45:44.040 |
Um, and so the money in politics kind of topic has kind of vanished from the popular conversation 01:45:49.320 |
in the last eight years. It has come back a little bit now that Elon is spending, 01:45:52.360 |
you know, but, but again, like, it's like, okay, he's spending, but the data would seem to indicate 01:45:59.480 |
in the last, at least the last eight years that money doesn't win the political battles. It's 01:46:02.920 |
actually like the voters actually have a voice and they actually exercise it and they don't just 01:46:06.360 |
listen to ads. And so again, there, I would say like, yeah, clearly there's some power there, 01:46:09.720 |
but I don't know if it's like, I don't know if it's some, like, I don't know if it's some weapon 01:46:13.720 |
that he can just like turn on and, and, and use in a definitive way. And I don't know if there's 01:46:18.520 |
parallels there, but I could also say just on a human level, he has a good heart and I interact 01:46:24.600 |
with a lot of powerful people and that's not always the case. So that's a good thing there. 01:46:30.200 |
If we, if we can draw parallels to the Hobbit or whatever, who gets to put on the ring. 01:46:37.400 |
Yeah. I mean, maybe one of the lessons of Lord of the Rings, right, is even, even Frodo would 01:46:40.360 |
have been, you know, even Frodo would have been corrupted. Right. But, you know, nevertheless, 01:46:43.640 |
you had somebody who could do what it took at the, at the time. The thing that I find just so 01:46:47.880 |
amazing about the Elon phenomenon and all the critiques is, um, you know, the one thing that 01:46:51.560 |
everybody in our societies universally agrees on, because of our, it's sort of our, our post- 01:46:56.360 |
Christian egalitarianism. So, you know, we live in sort of this post secularized Christian context 01:47:01.400 |
in the West now. And it's, you know, we, you know, we've, we consider Christianity kind of, 01:47:05.320 |
you know, backwards, but we still believe essentially all the same things. We just 01:47:08.440 |
dress them up in, in sort of fake science. Um, um, so the, the one thing that we're all told, 01:47:14.200 |
we're all taught from, from early is that the best people in the world are the people who care 01:47:17.000 |
about all of humanity. Right. Uh, and we venerate, you know, all of our figures are people who care 01:47:21.320 |
about all of, you know, Jesus cared about all of humanity. Gandhi cared about all of humanity. 01:47:24.920 |
Martin Luther King cared about all of humanity. Like we, it's, it's the person who cares the most 01:47:28.840 |
about everybody. Uh, and, and with Elon, you have a guy who literally like is, he's literally, he, 01:47:34.040 |
he talks about this constantly. And he talks about exactly the same in private is literally, 01:47:37.400 |
he is operating on behalf of all of humanity to try to get us to, you know, to 01:47:41.480 |
get us to multi-planetary civilization so that we can survive a strike on any one planet, 01:47:44.760 |
so that we can extend the light of human consciousness into the world and, you know, 01:47:47.400 |
into the universe and have it persist, you know, and the good of the whole thing. And like literally 01:47:51.400 |
the critique is, yeah, we want you to care about all of humanity, but not like that. 01:47:54.600 |
Yeah. All the critics, all the, all the surface turmoil, the critics will be forgotten. 01:48:02.680 |
Yeah. I think that's, yeah, that's clear. You said that, uh, we always end up being 01:48:08.120 |
ruled by the elite of some kind. Can you explain this law, this idea? 01:48:13.400 |
So this comes from an Italian political philosopher from about a hundred years ago named Robert 01:48:18.760 |
Michels. I'm going to mangle it, I'll let you pronounce the Italian, uh, Michels, um, uh, or Michaels. 01:48:24.760 |
Um, and then it was, I learned about it through a, a famous book on, on politics, probably the 01:48:29.000 |
best book on politics written in the 20th century called the Machiavellians by this guy, James 01:48:32.760 |
Burnham, um, who has had a big impact on me, but, um, in the Machiavellians, he resurrects what he 01:48:37.640 |
calls is a sort of Italian realist school of political philosophy from the, from the tens 01:48:41.240 |
and twenties. Um, and these were people to be clear, this was not like a Mussolini thing. 01:48:44.520 |
These were people who were trying to understand the actual mechanics of how politics actually 01:48:48.840 |
works. So to get to the actual sort of mechanical substance of like how the political machine 01:48:54.520 |
operates. And, um, this guy, Michels, had this concept he ended up with called the iron law of 01:48:59.640 |
oligarchy. Um, and so what the iron law of oligarchy, and I mean, take a step back to say 01:49:05.080 |
what he meant by oligarchy, because it has multiple meanings. So basically in classic 01:49:08.040 |
political theory, there's basically three forms of government at core. There's democracy, which 01:49:13.160 |
is rule of the many. There's oligarchy, which is rule of the few. Um, and there's monarchy, 01:49:17.080 |
which is rule of the one. And you can just use that as a general framework of any government 01:49:20.920 |
you're going to be under is going to be one of those just mechanical observation without even 01:49:24.280 |
saying which one's good or bad, just a structural observation. Um, and so the question that 01:49:28.760 |
Michels asked was like, is there such a thing as democracy? Like, is there actually such a thing 01:49:32.360 |
as democracy? Is, is there ever actually like direct, direct government? And what he did was 01:49:37.080 |
he mounted this sort of incredible historical, uh, exploration of whether democracies had ever 01:49:40.840 |
existed in the world. And the answer basically is almost never. And we could talk about that, 01:49:44.920 |
but the other thing he did was he, he sought out the most democratic, uh, private organization in 01:49:50.520 |
the world that he could find at that point, which he concluded was some basically communist German 01:49:54.360 |
auto workers union that was like wholly devoted to the workers of the world uniting, you know, 01:49:58.360 |
back when that was like the hot thing. And he went in there and he's like, okay, this is the 01:50:01.960 |
organization out of all organizations on planet earth that must be operating as a, as a, as a 01:50:05.400 |
direct democracy. And he went in there and he's like, Oh, no, there's a leadership class. You 01:50:09.000 |
know, there's like six guys at the top and they control everything and they lead the rest of the 01:50:13.160 |
membership along, you know, by the nose, which is of course the story of every union, the story of 01:50:18.040 |
every union is always the story of, you know, there's, there's a Jimmy Hoffa in there, you 01:50:21.080 |
know, kind of running the thing. Uh, you know, we just saw that with the dock workers union, right? 01:50:24.120 |
Like, you know, there's a guy, um, and he's in charge. And by the way, the number two is his son, 01:50:30.280 |
right? Like that's not like a, you know, an accident, right? So the iron law of oligarchy 01:50:35.080 |
basically says democracy is fake. Um, there's always a ruling class. There's always a ruling 01:50:39.000 |
elite structurally. And he said, the reason for that is because the masses can't organize, 01:50:43.240 |
right? What, what's the fundamental problem? It's that whether the mass is 25,000 people in a union or 01:50:47.320 |
250 million people in a country, the masses can't organize. The majority cannot organize. Only a 01:50:51.960 |
minority can organize. Um, and to be effective in politics, you must organize. Um, and therefore 01:50:57.400 |
every political structure in human history has been some form of a small organized elite ruling, 01:51:02.840 |
a large and dispersed majority, every single one. Um, the Greeks and the Florentines had 01:51:09.880 |
brief experiments in direct democracy and they were total disasters. Uh, in Florence, 01:51:14.680 |
I forget the name of it. It was called like the workers revolt or something like that. There was 01:51:18.040 |
like a two year period, um, where they basically experimented with direct democracy during the 01:51:22.920 |
Renaissance and it was a complete disaster and they never tried it again in the state of 01:51:27.480 |
California. We have our own experiment on this, which is the proposition system, which is an 01:51:32.440 |
overlay on top of the legislature. And it, you know, anybody who looks at it for two seconds 01:51:36.440 |
concludes it's been a complete disaster. It's just a catastrophe. Um, and it's caused enormous 01:51:40.520 |
damage to the state. And so basically the, the, the, basically the, the presumption that we are 01:51:44.600 |
in a democracy is just sort of by definition fake. Now, good news for the U.S.: it turns out the 01:51:48.600 |
founders understood this. And so of course they didn't give us a direct democracy. They gave us 01:51:51.880 |
a representative democracy, right? And so they, they, they built the oligarchy into the system 01:51:55.800 |
in the form of Congress and the, and the executive branch, the judicial branch. Um, but, but so 01:52:00.360 |
anyway, so as a consequence, democracy is always and everywhere fake, there is always a ruling elite. 01:52:04.600 |
Um, and, and basically the, the, the, the lesson of the Machiavellians is you can deny that if 01:52:09.560 |
you want, but you're fooling yourself. Um, the way to actually think about how to make a system work 01:52:13.800 |
and maintain any sort of shred of freedom, um, is to actually understand that that is actually 01:52:17.960 |
what's happening. And, uh, lucky for us, the founders saw this and figured out a way to, 01:52:24.680 |
given that there's going to be a ruling elite, how to create a balance of power 01:52:28.920 |
among that elite. Yes. So it doesn't get out of hand. And it was very clever, right? And, 01:52:34.440 |
you know, some of this was based on earlier experiments. Some of this, by the way, you know, 01:52:37.080 |
they, these, these were very, very smart people, right? And so they, they knew tremendous amounts 01:52:40.600 |
of like Greek and Roman history. They knew the Renaissance history, you know, they, the 01:52:43.880 |
Federalist Papers, they argued this a great length. You can read it all. Um, you know, 01:52:47.160 |
they, they, they, they ran like a, one of the best seminars in world history trying to figure this 01:52:50.840 |
out. Um, and they, they went through all this and yeah. And so they, they thought through it very 01:52:54.360 |
carefully, but just I'll give you an example, which continues to be a hot topic. So, you know, 01:52:58.520 |
one way they did it is through the three branches of government, right? Executive, legislative, 01:53:01.880 |
and judicial, um, uh, sort of balance of powers. But the other way they did it was they sort of 01:53:07.080 |
echoing what had been done earlier. I think in the UK parliament, um, uh, they created the two 01:53:11.480 |
different bodies of the legislature, right? And so the, you know, the house and the Senate, 01:53:15.000 |
and as you know, the, the house is apportioned on the basis of population and the Senate is not, 01:53:19.000 |
right? Uh, the small states have just as many senators as the big States. Um, and then they 01:53:23.480 |
made the deliberate decision to have the house get reelected every two years to make it very 01:53:26.920 |
responsive to the will of the people. And they made the decision to have the Senate get reelected 01:53:31.080 |
every six years so that it had more buffer from the passions of the moment. Um, but what's 01:53:35.560 |
interesting is they didn't choose one or the other, right? They did them both. And then to, 01:53:39.480 |
to get legislation passed, you have to get through both of them. And so they, they built in like a 01:53:42.840 |
second layer of checks and balances. And then there's a thousand observations we could make 01:53:47.880 |
about like how well the system is working today and like, how much does it live up to the ideal 01:53:51.960 |
and how much are we actually complying with the constitution? And there's lots of, you know, 01:53:55.560 |
there's lots of open questions there, but you know, this system has survived for coming on 250 01:54:00.440 |
years with a country that has been spectacularly successful that I don't think at least, you know, 01:54:05.080 |
I don't think any of us would trade the system for any other one. And so it's one of the great 01:54:08.600 |
all-time achievements. Yeah, it's incredible. And we should say they were all pretty young 01:54:12.440 |
relative to our current set of leaders. Many in their twenties at the time. And like supergeniuses, 01:54:17.720 |
this is one of those things where it's just like, all right, something happened where there was a 01:54:20.760 |
group of people where, you know, nobody ever tested their IQs, but like these are Einstein's 01:54:24.520 |
of politics. Yeah. An amazing thing. But anyway, I just, I go through all that, which is they were 01:54:28.760 |
very keen students of the actual mechanical practice of democracy, not fixated on what 01:54:34.600 |
was desirable. They were incredibly focused on what would actually work, which is, you know, 01:54:38.600 |
I think the way to think about these things. They were engineers of sort, not the fuzzy 01:54:43.080 |
humanity students. They were shape rotators, not wordcels. 01:54:46.920 |
I remember that. Wow. That meme came and went. I think you were central to them. You're central 01:54:55.240 |
You're the meme dealer and the meme popularizer. That meme I get some credit for. And then the 01:55:00.360 |
current thing is the other one I get some credit for. I don't know that I invented either one, 01:55:03.400 |
but I popularized them. Take credit and run with it. 01:55:06.680 |
If we can just linger on the Machiavellians, it's a study of power and power dynamics. Like 01:55:15.320 |
you mentioned, looking at the actual reality of the machinery of power from everything you've seen 01:55:22.680 |
now in government, but also in companies, what are some interesting things you can sort of 01:55:27.960 |
continue to say about the dynamics of power, the jostling for power that happens inside these organizations? 01:55:34.200 |
Yeah. So a lot of it, we already talked about this a bit with the universities, which is you can 01:55:38.520 |
apply a Machiavellian style lens. It's why I posed the question to you that I did, which is, okay, 01:55:43.000 |
who runs the university, the trustees, the administration, the students or the faculty? 01:55:48.200 |
And the answer, the true answer is some combination of the three or of the four, 01:55:51.640 |
plus the donors, by the way, plus the government, plus the press, et cetera. Right. And so there's 01:55:57.240 |
a mechanical interpretation of that. I mean, companies operate under the exact same set of 01:56:01.800 |
questions: who runs a company? The CEO, but the CEO runs the company basically up to the day that 01:56:07.480 |
either the shareholders or the management team revolt. If the shareholders revolt, it's very 01:56:11.560 |
hard for the CEO to stay in the seat. If the management team revolts, it's very hard for the 01:56:14.760 |
CEO to stay in the seat. By the way, if the employees revolt, it's also hard to stay in the 01:56:18.200 |
seat. By the way, if the New York Times comes at you, it's also very hard to stay in the seat. If 01:56:22.440 |
the Senate comes at you, it's very hard to stay in the seat. So a reductionist version of this 01:56:28.920 |
that is a good shorthand is who can get who fired. So who has more power? The newspaper columnist who 01:56:36.040 |
makes $200,000 a year or the CEO who makes $200 million a year. And it's like, well, I know for 01:56:41.560 |
sure that the columnist can get the CEO fired. I've seen that happen before. I have yet to see 01:56:45.640 |
a CEO get a columnist fired. Did anyone ever get fired from the Bill Ackman assault on journalism? 01:56:54.840 |
So Bill really showed the bullshit that happens in journalism. 01:56:59.160 |
No, because what happens is they wear it with... And I would say to their credit, 01:57:02.920 |
they wear it as a badge of honor. And then to their shame, they wear it as a badge of honor, 01:57:06.040 |
right? Which is if they're doing the right thing, then they are justifiably proud of 01:57:11.080 |
themselves for standing up under pressure. But it also means that they can't respond 01:57:14.840 |
to legitimate criticism. And they're obviously terrible at that now. As I recall, he went 01:57:20.360 |
straight to the CEO of, I think, Axel Springer that owns Insider. And I happen to know the CEO, 01:57:26.360 |
and I think he's quite a good CEO. But it was a good example. Can the CEO of Axel Springer run 01:57:30.600 |
his own company, right? Well, there's a fascinating... Okay, so there's a fascinating 01:57:34.920 |
thing playing out right now. Not to dwell on these fires, but you see the pressure reveals things, 01:57:40.760 |
right? And so I've just been watching what's happened with the LA Times recently. So this 01:57:44.760 |
guy, biotech entrepreneur, buys the LA Times, whatever, eight years ago. It is just the most 01:57:49.720 |
radical social revolutionary thing you can possibly imagine. It endorses every crazy left-wing 01:57:54.600 |
radical you can imagine. It endorses Karen Bass. It endorses Gavin Newsom. It's just like a litany 01:57:59.160 |
of all the people who are currently burning the city to the ground. It's just like endorsed every 01:58:03.000 |
single bad person every step of the way. He's owned it the entire time. He put his foot down 01:58:08.360 |
right before... For the first time, I think, put his foot down right before the November election 01:58:11.640 |
and said, "We're not... " He said, "We're gonna get out of this thing where we just always endorse 01:58:15.240 |
the Democrat." And we said, "We're not endorsing... " I think he said, "We're not endorsing for the 01:58:18.200 |
presidency." And the paper flipped out, right? It's like our billionaire backer who's... I don't 01:58:22.840 |
know what he spends, but he must be burning $50 or $100 million a year out of his pocket to keep 01:58:27.320 |
this thing running. He paid $500 million for it, which is amazing, back when people still thought 01:58:32.920 |
these things were businesses. And then he's probably burned another $500 million over the 01:58:39.080 |
last decade keeping it running. And he burns probably another $50, $100 million a year to 01:58:42.920 |
do this. And the journalists at the LA Times hate him with the fury of a thousand suns. 01:58:47.320 |
They just absolutely freaking despise him. And they have been attacking him. And the ones that 01:58:51.960 |
can get jobs elsewhere quit and do it, and the rest just stay and say the worst, most horrible 01:58:55.400 |
things about him. And they wanna constantly run these stories attacking him. And so he has had 01:59:00.760 |
this reaction that a lot of people in LA are having right now to this fire and to this just 01:59:05.000 |
incredibly vivid collapse of leadership and all these people that his paper had endorsed are just 01:59:09.480 |
disastrous. And he's on this tour. He's basically just... He's decided to be the boy who says the 01:59:17.000 |
emperor has no clothes, but he's doing it to his own newspaper. Very smart guy. He's on a press 01:59:21.320 |
tour, and he's basically saying, "Yeah, we... Yes, we did all that, and we endorsed all these people, 01:59:25.000 |
and it was a huge mistake, and we're gonna completely change." And his paper is in a 01:59:28.680 |
complete internal revolt. But I go through it, which is, okay, now we have a very interesting 01:59:32.840 |
question, which is who runs the LA Times. Because for the last eight years, it hasn't been him. 01:59:38.360 |
It's been the reporters. Now, for the first time, the owner is showing up saying, "Oh, 01:59:44.840 |
no, I'm actually in charge." And the reporters are saying, "No, you're not." And it is freaking 01:59:50.840 |
on. And so again, the Machiavellian's mindset on this is like, okay, how is power actually 01:59:55.160 |
exercised here? Can a guy who's even super rich and super powerful, who even owns his own newspaper, 02:00:00.440 |
can he stand up to a full-scale assault, not only by his own reporters, but by every other 02:00:04.840 |
journalism outlet who also now thinks he's the Antichrist? And he is trying to exercise power 02:00:10.440 |
by speaking out publicly, and so that's the game of power there? And firing people. And he has 02:00:15.160 |
removed people, and he has set new rules. I mean, he is now, I think, at long... I think he's saying 02:00:19.480 |
that he's now at long last actually exercising prerogatives of an owner of a business, which is 02:00:23.720 |
deciding the policies and staffing of the business. There are certain other owners of these 02:00:27.240 |
publications that are doing similar things right now. He's the one I don't know, so he's the one 02:00:31.400 |
I can talk about. But there are others that are going through the same thing right now. 02:00:34.680 |
And I think it's a really interesting open question. In a fight between the employees 02:00:40.440 |
and the employer, it's not crystal clear that the employer wins that one. 02:00:43.720 |
And just to stay on journalism for a second, we mentioned Bill Ackman. I just want to say, 02:00:47.080 |
put him in the category we mentioned before of a really courageous person. I don't think 02:00:52.840 |
I've ever seen anybody so fearless in going after, in following what he believes in publicly. 02:01:02.760 |
That's courage. Several things he's done publicly has been really inspiring, just being courageous. 02:01:10.200 |
What do you think is the most impressive example? 02:01:12.040 |
Where he went after journalists, whose whole incentive is to... I mean, it's like sticking 02:01:20.200 |
your, like kicking the beehive or whatever. You know what's going to follow. And to do that, 02:01:26.520 |
I mean, that's why it's difficult to challenge journalistic organizations. Because they're going 02:01:31.800 |
to, you know, there's just so many mechanisms they use, including like writing articles and 02:01:36.120 |
get cited by Wikipedia, and then drive the narrative, and then they can get you fired, 02:01:39.960 |
all this kind of stuff. Bill Ackman, like a bad MF-er, just tweets these essays and just goes 02:01:48.680 |
after them legally and also in the public eye. And just, I don't know, that was truly inspiring. 02:01:55.000 |
There's not many people like that in public. And hopefully that inspires not just me, 02:02:01.400 |
but many others to be like, to be courageous themselves. 02:02:05.160 |
Did you know of him before he started doing this in public? 02:02:08.120 |
I knew of Neri, his wife, who's this brilliant researcher and scientist. And so I admire her, 02:02:15.000 |
Well, the reason I ask if you knew about Bill is because a lot of people had not heard of 02:02:18.280 |
him before, especially like before October 7th and before some of the campaigns he's been running 02:02:21.960 |
since in public, and with Harvard and so forth. But he was very well known in the investment world 02:02:27.240 |
before that. So he was a famous, he's a so-called activist investor for, you know, very, very 02:02:33.320 |
successful and very widely respected for probably 30 years before now. And I bring that up because 02:02:39.240 |
it turns out they weren't, for the most part, battles that happened in kind of full public 02:02:42.920 |
view. They weren't national stories. But in the business and investing world, the activist investor 02:02:47.160 |
is a very, it's like in the movie Taken, it's a very specific set of skills on how to like, 02:02:54.200 |
really take control of situations and how to wreck the people who you're going up against. 02:02:57.880 |
And just, and there's lots, there's been controversy over the years on this topic, 02:03:03.960 |
and there's too much detail to go into. But the defense of activist investing, which I think is 02:03:07.640 |
valid is, you know, these are the guys who basically go in and take stakes in companies 02:03:11.960 |
that are being poorly managed or under-optimized. And then generally what that means is, at least 02:03:16.440 |
the theory is, that means the existing management has become entrenched and lazy, mediocre, you 02:03:21.880 |
know, whatever, not responding to the needs of the shareholders, often not responding 02:03:25.960 |
to the customers. And the activists basically go in with a minority position, and then they rally 02:03:31.640 |
support among other investors who are not activists. And then they basically show up and 02:03:35.480 |
they force change. But they are the aggressive version of this. And I've been on the, I've been 02:03:40.600 |
involved in companies that have been on the receiving end of these, where it is amazing how 02:03:44.840 |
much somebody like that can exert pressure on situations, even when they don't have formal 02:03:48.680 |
control. So it's another, it would be another chess piece on the mechanical board of kind of 02:03:53.480 |
how power gets exercised. And basically what happens is the effective activists, a large amount 02:03:56.680 |
of the time they end up taking, they end up taking over control of companies, even though they never 02:04:00.280 |
own more than like 5% of the stock. And so anyway, so it turns out with Bill, it's such a fascinating 02:04:04.520 |
case because he has that like complete skillset. And he has now decided to bring it to bear in 02:04:11.080 |
areas that are not just companies. And two interesting things for that. One is, some of 02:04:15.720 |
these places, and some of these battles are still ongoing, but number one, like a lot of people who 02:04:19.480 |
run universities or newspapers are not used to being up against somebody like this. And by the 02:04:24.920 |
way, also now with infinitely deep pockets and lots of experience in courtrooms and all the 02:04:28.120 |
things that kind of go with that. But the other is through example, he is teaching a lot of the rest 02:04:33.000 |
of us, the activist playbook, like in real time. And so the Liam Neeson skillset is getting more 02:04:38.600 |
broadly diffused just by being able to watch and learn from him. So I think he's having a, 02:04:44.200 |
I would put him up there with Elon in terms of somebody who's really affecting 02:04:46.680 |
how all this is playing out. But even skillset aside, just courage. 02:04:50.440 |
Yes. Including by the way, courage to go outside of his own zone. 02:04:56.280 |
I'll give you an example. Like my firm, venture capital firm, we have LPs. There are things that 02:05:00.040 |
I feel like I can't do or say, because I feel like I would be bringing embarrassment or other 02:05:05.400 |
consequences to our LPs. He has investors also where he worries about that. And so a couple 02:05:11.080 |
things. One is his willingness to go out a bit and risk his relationship with his own investors. 02:05:15.480 |
But I will tell you the other thing, which is his investors, I know this for a fact, his investors 02:05:19.000 |
have been remarkably supportive of him doing that. Because as it turns out, a lot of them actually 02:05:23.320 |
agree with him. And so it's the same thing he does in his activism campaigns. He is able to 02:05:29.320 |
be the tip of the spear on something that actually a lot more people agree with. 02:05:32.840 |
Yeah. It turns out if you have truth behind you, it helps. 02:05:36.120 |
And just again, how I started is a lot of people are just fed up. 02:05:40.280 |
You've been spending a bunch of time in Mar-a-Lago and Palm Beach, 02:05:44.520 |
helping the new administration in many ways, including interviewing people who might join. 02:05:49.320 |
So what's your general sense about the talent, about the people who are coming in? 02:05:55.560 |
So I should start by saying I'm not a member of the new administration. I'm not in the room when the decisions are made. 02:06:05.560 |
I'm an unpaid intern. So I'm a volunteer when helpful, but I'm not making the decisions nor 02:06:11.160 |
am I in a position to speak for the administration. So I don't want to say anything that would cause 02:06:15.000 |
people to think I'm doing that. This is a very unusual situation where you had an incumbent 02:06:18.120 |
president and then you had a four-year gap where he's out of office and then you have him coming 02:06:21.240 |
back. And as you'll recall, there was a fair amount of controversy over the end of the first term. 02:06:28.120 |
The fear, the specific concern was the first Trump administration, and they will all say this, 02:06:33.320 |
is they didn't come in with a team. And most of the 02:06:37.240 |
sort of institutional base of the Republican Party were Bush Republicans, and many of them 02:06:41.800 |
had become never Trumpers. And so they had a hard time putting the team together. 02:06:44.760 |
And then by the way, they had a hard time getting people confirmed. And so if you talk to the people 02:06:48.520 |
who were there in the first term, it took them two to three years to kind of even get the government 02:06:51.880 |
in place. And then they basically only had the government in place for basically like 18 months 02:06:57.160 |
and then COVID hit. And then sort of the aftermath and everything and all the drama and headlines 02:07:01.640 |
and everything. And so the concern, including from some very smart people in the last two years has 02:07:05.960 |
been, boy, if Trump gets a second term, is he going to be able to get a team that is as good 02:07:10.440 |
as the team he had last time or a team that is actually not as good? Because maybe people got 02:07:14.120 |
burned out. Maybe they're more cynical now. Maybe they're not willing to go through the drama. 02:07:17.960 |
By the way, a lot of people in the first term came under their own withering legal assaults, 02:07:22.040 |
and some of them went to prison, and a lot of stuff happened. Lots of investigations, 02:07:27.080 |
lots of legal fees, lots of bad press, lots of debanking, by the way. A lot of the officials 02:07:33.240 |
in the first Trump term got debanked, including the president's wife and son. 02:07:37.960 |
>> Yeah, I heard you tell that story. That's insane. That's just insane. 02:07:41.640 |
>> In the wake of the first term, yes. We now take out spouses and children with our ring of power. 02:07:46.520 |
And so there was this legitimate question as to whether, okay, what will the team for this 02:07:51.960 |
second term look like? And at least what I've seen and what you're seeing in the appointments is it 02:07:55.560 |
looks much, much better. First of all, it just looks better than the first term, and not because 02:07:59.640 |
the people in the first term were not necessarily good, but you just have this influx of incredibly 02:08:04.360 |
capable people that have shown up that want to be part of this. And you just didn't have that the 02:08:08.600 |
first time. And so they're just drawing on a much deeper, richer talent pool than they had the first 02:08:12.840 |
time. And they're drawing on people who know what the game is. They're drawing on people now who 02:08:17.240 |
know what is going to happen, and they're still willing to do it. And so they're going to get, 02:08:21.400 |
I think, some of the best people from the first term, but they're bringing in a lot of people who 02:08:24.440 |
they couldn't get the first time around. And then second is there's a bunch of people, including 02:08:29.960 |
people in the first term, where they're just 10 years older. And so they went through the first 02:08:33.720 |
term, and they just learned how everything works. Or there are young people who just had a different 02:08:37.800 |
point of view, and now they're 10 years older, and they're ready to go serve in government. 02:08:41.400 |
And so there's a generational shift happening. And actually, one of the interesting things about 02:08:45.400 |
the team that's forming up is it's remarkably young. Some of the cabinet members and then many 02:08:50.120 |
of the second and third level people are in their 30s and 40s, which is a big change from the 02:08:55.080 |
gerontocracy that we've been under for the last 30 years. And so I think the caliber has been 02:09:00.120 |
outstanding. And we could sit here and list tons and tons of people, but the people who are running, 02:09:05.080 |
it's everything from the people running all the different departments at HHS, to the 02:09:08.360 |
number two at the Pentagon, Steve Feinberg, who's just an incredible legend 02:09:13.320 |
of private equity, an incredibly capable guy. We've actually got two of my partners going in, 02:09:18.520 |
who I both think are amazing. Yeah, many, many parts of the government, 02:09:22.840 |
the people are really impressive. Well, I think one of the concerns is actually that 02:09:28.200 |
given the human being that Donald Trump is, that there would be more tendency towards, let's say, 02:09:37.480 |
favoritism versus meritocracy, that there's circles of sycophancy that form. And if you're 02:09:45.560 |
able to be loyal and never oppose and just basically suck up to the president, 02:09:53.960 |
that you will get a position. So that's one of the concerns. And I think you're in a good position to 02:09:57.880 |
speak to the degree that's happening versus hiring based on merit and just getting great teams. 02:10:06.360 |
Yeah. So look, just start by saying any leader at that level, by the way, any CEO, there's always 02:10:11.400 |
some risk of that, right? So it's like a natural reality warps around powerful leaders. And so 02:10:17.240 |
there's always some risk to that. Of course, the good and powerful leaders are very aware of that. 02:10:20.600 |
And Trump at this point in his life, I think, is highly aware of that. At least my interactions 02:10:23.880 |
with him, he definitely seems very aware of that. So that's one thing. I would just say, 02:10:29.720 |
I think the way to look at that, I mean, look, like I said, I don't want to predict what's 02:10:32.440 |
going to happen once this whole thing starts unfolding. But I would just say, again, the 02:10:36.280 |
caliber of the people who are showing up and getting the jobs, and then the fact that these 02:10:39.320 |
are some of the most accomplished people in the business world and in the medical field. 02:10:43.960 |
Jay Bhattacharya coming in to run NIH. So I was part of the interview team for a lot of the HHS folks. 02:10:52.120 |
Nice. Jay's amazing. I was so happy to see that. 02:10:55.480 |
So I literally got, this is a story, I got to the transition office for one of the days of the HHS 02:10:59.480 |
interviews. And I was on one of the interviewing teams and they gave us, I didn't know who the 02:11:02.040 |
candidates were, and they gave us the sheet in the beginning. And I go down the sheet and I 02:11:04.760 |
saw Jay's name, and I almost physically fell out of my chair. And I was just like... And I happen 02:11:12.680 |
to know Jay, and I respect him enormously. And then he proved himself under this, talk about a 02:11:17.080 |
guy who proved himself under extraordinary pressure over the last five years. 02:11:20.680 |
And he didn't go radical under the pressure. He maintained balance and thoughtfulness and depth. 02:11:27.000 |
I mean, incredible. Very serious, very analytical, 02:11:29.960 |
very applied. And yes, 100%, tested under pressure, came out. The more people look back 02:11:36.440 |
at what he said and did, and none of us are perfect, but overwhelmingly insightful throughout 02:11:43.480 |
that whole period. And we would all be much better off today had he been in charge of the response. 02:11:47.720 |
And so just an incredibly capable guy. And look, and then he learned from all that. He learned a 02:11:53.240 |
lot in the last five years. And so the idea that somebody like that could be head of NIH as 02:11:57.320 |
compared to the people we've had is just breathtakingly... It's just a gigantic upgrade. 02:12:01.400 |
And then Marty Makary coming in to run FDA, exact same thing. The guy coming in to run the CDC, 02:12:07.240 |
exact same thing. I mean, I've been spending time with Dr. Oz. I'm not on these teams, 02:12:15.720 |
I'm not in the room, but I've been spending enough time trying to help that his level of insight into 02:12:20.280 |
the healthcare system is astounding. And it comes from being a guy who's been in the middle of the 02:12:25.000 |
whole thing and been talking to people about this stuff and working on it and serving as a doctor 02:12:28.600 |
himself and in medical systems for his entire life. And he's like a walking encyclopedia on 02:12:33.960 |
these things. And very dynamic, very charismatic, very smart, organized, effective. So to have 02:12:41.160 |
somebody like that in there... And so anyway, I have like 30 of these stories now across all 02:12:46.120 |
these different positions. And then I just, I'd be quite honest, I do the compare and contrast 02:12:51.800 |
of the last four years and it's not even... These people are not in the same ballpark. 02:12:55.640 |
They're just like wildly better. And so pound for pound it's maybe the best team in the White House 02:13:02.920 |
since, I don't even know, maybe the '90s, maybe the '30s, maybe the '50s. Maybe Eisenhower had a 02:13:11.960 |
team like this or something, but there's a lot of really good people in there now. 02:13:16.200 |
Yeah. The potential for change is certainly extremely high. Can you speak to Doge? 02:13:20.360 |
What's the most wildly successful next two years for Doge? Can you imagine, 02:13:27.720 |
maybe also, can you think about the trajectory that's the most likely 02:13:32.360 |
and what kind of challenges would it be facing? 02:13:35.800 |
Yeah. So I'll start by saying again, I'm not, disclaimer, I have to disclaim, I'm not on Doge. 02:13:43.000 |
We should say there's about 10 lawyers in the room. 02:13:47.320 |
Both the angels and the devils on my shoulder are lawyers. 02:13:51.080 |
Yeah. So I'm not speaking for Doge. I'm not in charge of Doge. Those guys are doing it. I'm not 02:13:56.200 |
doing it, but again, I'm volunteering to help as much as I can and I'm 100% supportive. Yeah. So 02:14:02.600 |
look, I think the way to think of... I mean, the basic outlines are in public, right? Which is it's 02:14:07.240 |
basically a time-limited commission. It's not a formal government agency. It's time-limited to 18 02:14:13.000 |
months. In terms of implementation, it will advise the executive branch, right? And so the implementation 02:14:20.040 |
will happen through the White House. And the president has total latitude on what he wants to 02:14:25.080 |
implement. And then basically what I think about it is three kind of streams, kind of target sets. 02:14:31.080 |
And they're related, but different. So money, people, and regulations. And so the headline 02:14:37.640 |
number, they put the $2 trillion number and there's already disputes over that and whatever. 02:14:41.880 |
And there's a whole question there, but then there's the people thing. And the people thing 02:14:45.320 |
is interesting because you get into these very kind of fascinating questions. And I've been doing 02:14:50.600 |
this. I won't do this for you as a pop quiz, but I do this for people in government as a pop quiz. 02:14:54.600 |
I can stump them every time, which is A, how many federal agencies are there? And the answer is 02:15:00.040 |
somewhere between 450 and 520, and nobody's quite sure. And then the other is how many people work 02:15:05.480 |
for the federal government? And the answer is something on the order, I forget, but like 4 02:15:10.040 |
million full-time employees and maybe up to 20 million contractors and nobody's quite sure. 02:15:13.960 |
And so there's a large people component to this. And then by the way, there's a related component 02:15:19.720 |
to that, which is how many of them are actually in the office? And the answer is not many. Most 02:15:24.440 |
of the federal buildings are still empty. And then there's questions of are people working from home 02:15:29.560 |
or are we actually working from home? So there's the people dimension. And of course, the money and 02:15:33.400 |
the people are connected. And then there's the third, which is the regulation thing. And I 02:15:37.080 |
described earlier how basically our system of government is much more now based on regulations 02:15:41.080 |
than legislation. Most of the rules that we all live under are not from a bill that went through 02:15:46.440 |
Congress. They're from an agency that created a regulation. That turns out to be very, very 02:15:51.080 |
important. So one is, Elon already described, we want to do, the Doge wants to do broad-based 02:15:55.000 |
regulatory relief. And Trump has talked about this and basically get the government off people's 02:15:58.760 |
backs and liberate the American people to be able to do things again. So that's part of it. 02:16:03.240 |
But there's also something else that's happened, which is very interesting, which was there were 02:16:06.360 |
a set of Supreme Court decisions about two years ago that went directly after the idea that the 02:16:11.880 |
executive branch can create regulatory agencies and issue regulations and enforce those regulations 02:16:17.240 |
without corresponding congressional legislation. And most of the federal government that exists 02:16:22.920 |
today, including most of the departments and most of the rules and most of the money and most of the 02:16:28.600 |
people, most of it is not enforcing laws that Congress passed. Most of it is regulation. And 02:16:34.280 |
the Supreme Court basically said large parts, maybe up to all, of that regulation that did 02:16:40.760 |
not directly result from a bill that went through Congress the way that the cartoon said that it 02:16:44.920 |
should, that may not actually be legal. Now, the previous White House, of course, 02:16:51.560 |
was super in favor of big government. They did nothing based on this. They didn't pull anything 02:16:56.360 |
back in. But the new regime, if they choose to, could say, "Look, the thing that we're doing here 02:17:01.960 |
is not challenging the laws. We're actually complying with the Supreme Court decision that 02:17:06.760 |
basically says we have to unwind a lot of this. And we have to unwind the regulations, which are 02:17:11.480 |
no longer legal, constitutional. We have to unwind the spend, and we have to unwind the people." 02:17:15.240 |
And that's how you basically connect the thread from the regulation part back to the money part, 02:17:20.440 |
back to the people part. They have work going on all three of these threads. They have, I would 02:17:24.760 |
say, incredibly creative ideas on how to deal with this. I know lots of former government people who, 02:17:30.760 |
100% of them, are super cynical on this topic. And they're like, "This is impossible. This 02:17:33.960 |
could never possibly work." And I'm like, "Well, I can't tell you what the secret plans are, but 02:17:38.280 |
blow my mind." And all three of those, they have ideas that are really quite amazing, 02:17:45.080 |
as you'd expect from the people involved. And so over the course of the next few months, 02:17:49.720 |
that'll start to become visible. And then the final thing I would say is 02:17:52.840 |
this is going to be very different than past attempts. There have been other programs like 02:17:58.120 |
this in the past. The Clinton-Gore administration had one, and then there were others before that. 02:18:03.160 |
Reagan had one. The difference is, this time, they're social media. And so there has never been... 02:18:11.400 |
It's interesting. One of the reasons people in Washington are so cynical is because they know 02:18:16.840 |
all the bullshit. They know all the bad spending, and all the bad rules, and all the... 02:18:21.320 |
I mean, look, we're adding a trillion dollars to the national debt every 100 days right now. 02:18:26.920 |
And that's compounding, and it's now passing the size of the Defense Department budget. 02:18:30.280 |
And pretty soon, it's going to be adding a trillion dollars every 90 days, 02:18:33.960 |
and then it's going to be adding a trillion dollars every 80 days, and then it's going to 02:18:36.440 |
be a trillion dollars every 70 days. And then if this doesn't get fixed at some point, we enter a 02:18:40.520 |
hyperinflationary spiral, and we become Argentina or Brazil and kablooey. And so everybody in DC 02:18:47.960 |
knows that something has to be done. And then everybody in DC knows for a fact that it's 02:18:51.320 |
impossible to do anything. They know all the problems, and they also know the sheer impossibility 02:18:56.200 |
of fixing it. But I think what they're not taking into account, what the critics are not taking into 02:19:00.280 |
account is these guys can do this in the full light of day, and they can do it on social media. 02:19:05.480 |
They can completely bypass the press. They can completely bypass the cynicism. They can expose 02:19:09.960 |
any element of unconstitutional or silly government spending. They can run victory 02:19:14.920 |
laps every single day on what they're doing. They can bring the people into the process. 02:19:18.120 |
And again, if you think about it, this goes back to our Machiavellian structure, which is, 02:19:23.640 |
if you think about, again, you've got democracy, oligarchy, monarchy, rule of the many, 02:19:27.880 |
rule of the few, rule of the one. You could think about what's happening here as a little bit of a 02:19:31.800 |
sandwich, which is we don't have a monarch, but we have a president, rule of the one with some power. 02:19:37.720 |
And then we have the people who can't organize, but they can be informed, and they can be aware, 02:19:42.040 |
and they can express themselves through voting and polling. And so there's a sandwich happening 02:19:47.240 |
right now is the way to think about it, which is you've got basically monarchy, rule 02:19:51.400 |
of one, combining with rule of many, and rule of many is that you get to vote. The people do get 02:19:55.720 |
to vote, basically. And then essentially Congress and this permanent bureaucratic class in Washington 02:20:01.640 |
as the oligarchy in the middle. And so the White House plus the people, I think, have the power to 02:20:07.640 |
do all kinds of things here. And I think that would be the way I would watch it. The transparency, 02:20:12.360 |
I mean, Elon just, by who he is, is incentivized to be transparent and show the bullshit in the 02:20:20.760 |
system and to celebrate the victories. So it's going to be so exciting. I mean, honestly, 02:20:25.720 |
just makes government more exciting, which is a win for everybody. 02:20:30.520 |
These people are spending our money. These people have enormous contempt for the taxpayer. 02:20:37.080 |
Okay, here's the thing you hear in Washington. Here's one of the things. So the first thing you 02:20:40.600 |
hear is this is impossible. They'll be able to do nothing. And then, yeah, I walk them through this, 02:20:43.400 |
and they're like, it starts to dawn on them that this is a new kind of thing. 02:22:46.760 |
And then they're like, well, it doesn't matter because all the money is in entitlements and the 02:20:49.960 |
debt and the military. And so, yeah, you've got this silly, fake, whatever, NPR funding or whatever, 02:20:57.080 |
and it's a rounding error, and it doesn't matter. And you look it up in the budget, and it's like, 02:21:00.360 |
whatever, $500 million or $5 billion. Or it's the charging stations that don't exist. It's the $40 02:21:07.400 |
billion of charging stations, and they build eight charging stations. Or it's the broadband 02:21:11.960 |
internet plan that delivered broadband to nobody and cost you $30 billion. So these boondoggles. 02:21:17.720 |
And what everybody in Washington says is $30 billion is a rounding error on the federal 02:21:20.920 |
budget. It doesn't matter. Who cares if they make it go away? And of course, any taxpayer is like, 02:21:25.560 |
what the? What do you mean? It's $30 billion, right? And the experts are like, and the press 02:21:34.040 |
is in on this too. Then the experts are like, well, it doesn't matter because it's a rounding 02:21:36.760 |
error. No, it's $30 billion. And if you're this cavalier about $30 billion, imagine how cavalier 02:21:41.880 |
you are about the $3 trillion. Okay. Then there's the, okay, is $30 billion a lot of the 02:21:47.400 |
federal budget in percentage terms? No, it's not. But $30 billion divided by, do the math, $30 billion 02:21:51.800 |
divided by let's say 300 million taxpayers, right? What's that? Math expert? $100. 02:21:57.880 |
$100 per taxpayer per year. Okay. So $100 to an ordinary person working hard every day to make 02:22:04.840 |
money and provide for their kids. $100 is a meal out. It's a trip to the amusement park. It's the 02:22:11.080 |
ability to buy additional educational materials. It's the ability to have a babysitter, to be able 02:22:16.360 |
to have a romantic relationship with your wife. There's like a hundred things that that person can 02:22:20.520 |
do with $100 that they're not doing because it's going to some bullshit program 02:22:25.240 |
where the money is basically being looted out in the form of just like ridiculousness and graft. 02:22:30.200 |
And so the idea that that $30 billion program is not something that is like a very important thing 02:22:34.440 |
to go after is just like the level of contempt for the taxpayer is just off the charts. And then 02:22:40.200 |
that's just one of those programs. There's like a hundred of those programs and they're all just 02:22:45.080 |
like that. It's not like any of this stuff is running well. The one thing we know is that none 02:22:48.600 |
of this stuff is running well. We know that for sure. And we know these people aren't showing up 02:22:52.680 |
to work and we know that all this crazy stuff is happening. Do you remember Elon's story of what 02:23:00.600 |
got the Amish to turn out to vote in Pennsylvania? Okay. So Pennsylvania is like a wonderful state, 02:23:05.560 |
great history. It has these cities like Philadelphia that have descended like other 02:23:09.080 |
cities into just like complete chaos, violence, madness, and death. And the federal government 02:23:12.920 |
has just like let it happen. They're incredibly violent places. And so the Biden administration 02:23:17.800 |
decided that the big pressing law enforcement thing that they needed to do in Pennsylvania 02:23:20.920 |
was that they needed to start raiding Amish farms to prevent them from selling raw milk 02:23:24.600 |
with armed raids. And it turns out it really pissed off the Amish. And it turns out they 02:23:33.080 |
weren't willing to drive to the polling places because they don't have cars. But if you came 02:23:36.520 |
and got them, they would go and they would vote. And that's one of the reasons why Trump won. 02:23:39.960 |
Anyway, so like the law enforcement agencies are off working on like crazy things. Like 02:23:44.520 |
the system's not working. And so you add up, pick 100 of those $30 billion programs. All right, now you're, 02:23:50.920 |
okay, math major, 100 times $100. $10,000. $10,000. Okay. $10,000 per taxpayer per year. 02:23:56.440 |
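A minimal sketch of the back-of-the-envelope arithmetic above, using the round numbers from the conversation (one $30 billion program, roughly 300 million taxpayers, and 100 such programs); these are the speakers' illustrative figures, not official budget data.

```python
# Back-of-the-envelope sketch of the per-taxpayer cost discussed above.
# These are the round illustrative numbers used in the conversation,
# not official budget figures.

program_cost = 30e9      # one $30 billion program
taxpayers = 300e6        # ~300 million taxpayers (the figure used in the conversation)
num_programs = 100       # "pick 100 of those programs"

per_taxpayer_one = program_cost / taxpayers          # 100.0
per_taxpayer_all = per_taxpayer_one * num_programs   # 10000.0

print(f"One $30B program:  ${per_taxpayer_one:,.0f} per taxpayer per year")
print(f"100 such programs: ${per_taxpayer_all:,.0f} per taxpayer per year")
```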
But it's also not just about money. Obviously money is a hugely important 02:24:01.560 |
thing, but it's the cavalier attitude, and then the ripple effect of that: 02:24:08.520 |
it makes it so nobody wants to work in government and be productive. It 02:24:14.200 |
breeds corruption. It breeds laziness. It breeds secrecy because you don't want to be 02:24:20.440 |
transparent about having done nothing all year, all this kind of stuff. And you want to 02:24:24.680 |
reverse that, so it will be exciting in the future to work in government, because the amazing 02:24:31.480 |
thing, if you steelman government, is you can do shit at scale. You have money and you can 02:24:37.960 |
directly impact people's lives in a positive sense at scale. It's super exciting. As long as 02:24:45.480 |
there's no bureaucracy that slows you down or not huge amounts of bureaucracy that slows you down 02:24:52.280 |
significantly. Yeah. So here's the trick. This blew my mind because I was, once you look at it, 02:24:58.360 |
once you open the hell mouth of looking into the federal budget, you learn all kinds of things. 02:25:03.000 |
So there is a term of art in government called impoundment. And so you, if you're like me, 02:25:08.760 |
you've learned this the hard way when your car has been impounded. The government meaning of 02:25:12.200 |
impoundment, the federal budget meaning is a different meaning. Impoundment is as follows. 02:25:16.920 |
The constitution requires Congress to authorize money to be spent by the executive branch, 02:25:21.480 |
right? So the executive branch goes to Congress, says, "We need money X." Congress does their 02:25:25.640 |
thing. They come back and they say, "You can have money Y." The money is appropriated from Congress. 02:25:29.240 |
The executive branch spends it on their military or whatever they spend it on or on roads to nowhere 02:25:33.000 |
or charging stations to nowhere or whatever. And what's in the constitution is the Congress 02:25:39.240 |
appropriates the money. Over the last 60 years, there has been an additional interpretation of 02:25:46.040 |
appropriations applied by the courts and by the system, which is the executive branch not only 02:25:51.400 |
needs Congress to appropriate X amount of money. The executive branch is not allowed to underspend. 02:25:56.520 |
Yeah. I'm aware of this. I'm aware of this. And so there's this thing that happens in Washington 02:26:02.040 |
at the end of every fiscal year, which is September 30th, and it's the great budget flush. 02:26:06.040 |
And any remaining money that's in the system that they don't know how to productively spend, 02:26:09.800 |
they deliberately spend it unproductively to the tune of hundreds and hundreds of billions of 02:26:15.480 |
dollars. A president that doesn't want to spend the money can't not spend it. Like, okay, A, 02:26:21.480 |
that's not what's in the constitution. And there's actually quite a good Wikipedia page that goes 02:26:25.880 |
through the great debate on this has played out in the legal world over the last 60 years. 02:26:29.160 |
And basically, if you look at this with anything resembling, I think, an open mind, you're like, 02:26:33.160 |
all right, this is not what the founders meant. And then number two, again, we go back to this 02:26:37.400 |
thing of contempt. Can you imagine showing up and running the government like that and thinking that 02:26:44.200 |
you're doing the right thing and not going home at night and thinking that you've sold your soul? 02:26:47.560 |
Right. I actually think you had a really good point, which is it's even unfair to the people 02:26:52.600 |
who have to execute this. Because it makes them bad people. And they didn't start out 02:26:57.160 |
wanting to be bad people. And so there is stuff like this everywhere. And so we'll see how far 02:27:04.200 |
these guys get. I am extremely encouraged by what I've seen so far. It seems like a lot of people 02:27:08.360 |
will try to slow them down. But yeah, hopefully they get far. Another difficult topic, immigration. What's 02:27:13.720 |
your take on the, let's say, heated H-1B visa debate that's going on online and legal immigration 02:27:21.080 |
in general? Yeah. So I should start by saying I am not involved in any aspect of government 02:27:25.400 |
policy on this. I am not planning to be. This is not an issue that I'm working on or that I'm going 02:27:30.120 |
to work on. This is not part of the agenda of what my firm is doing. So I'm not in the new 02:27:37.400 |
administration or the government. I'm not planning to be. So purely just personal opinion. So I would 02:27:43.640 |
describe what I have as a complex or nuanced, hopefully nuanced, view on this issue that's 02:27:47.880 |
maybe a little bit different than what a lot of my peers have. And I think, and I kind of thought 02:27:53.480 |
about this, I didn't say anything about it all the way through the big kind of debate over Christmas, 02:27:56.680 |
but I thought about it a lot and read everything. I think what I realized is that I just have a very 02:28:01.320 |
different perspective on some of these things. And the reason is because of the combination of 02:28:05.080 |
where I came from and then where I ended up. And so I'll start with this, where I ended up, 02:28:11.080 |
Silicon Valley. And I have made the pro-skilled, high-skilled immigration argument many, many times, 02:28:16.840 |
the H-1B argument many times. In past lives, I've been in DC many times arguing with prior 02:28:22.280 |
administrations about this, always on the side of trying to get more H-1Bs and trying to get 02:28:26.200 |
more high-skilled immigration. And I think that argument is very strong and very solid and 02:28:32.200 |
has paid off for the US in many, many ways. And we can go through it, but I think it's the argument 02:28:37.720 |
everybody already knows. It's like the stock argument. You take any Silicon Valley person, you press the 02:28:41.480 |
button and they tell you why we need to brain drain the world to get more H-1Bs. So everybody 02:28:45.160 |
kind of gets that argument. So just to basically summarize, it's a mechanism by which you 02:28:49.640 |
can get super smart people from the rest of the world, import them in, keep them here to increase 02:28:56.200 |
the productivity of the US companies. Yeah. And then it's not just good for them and it's not 02:29:01.720 |
just good for Silicon Valley or the tech industry. It's good for the country because they then create 02:29:05.480 |
new companies and create new technologies and create new industries that then create many more 02:29:09.720 |
jobs for Americans, native-born Americans than would have previously existed. And so you've got 02:29:13.720 |
a, it's a positive-sum flywheel thing where everybody wins. Everybody wins. There are no 02:29:18.600 |
trade-offs. It's all absolutely glorious in all directions. You cannot possibly, there cannot 02:29:24.120 |
possibly be a moral argument against it under any circumstances. Anybody who argues against it is 02:29:28.360 |
obviously doing so from a position of racism, is probably a fascist and a Nazi, right? I mean, 02:29:33.960 |
that's the thing. And like I said, I've made that argument many times. I'm very comfortable 02:29:37.560 |
with that argument. And then I'd also say, look, I would say, number one, I believe a lot of it. 02:29:41.160 |
I'll talk about the parts I don't believe, but I believe a lot of it. And then the other part is, 02:29:44.520 |
look, I benefit every day. I always describe it as I work in the United Nations. My own firm 02:29:50.920 |
and our founders and our companies and the industry and my friends are just this amazing 02:29:57.960 |
panoply cornucopia of people from all over the world. And I just, I don't know at this point 02:30:05.320 |
where people from, it's gotta be, I don't know, 80 countries or something. And hopefully over time, 02:30:08.920 |
it'll be the rest as well. And it's just, it's been amazing. And they've done many of the most 02:30:13.080 |
important things in my industry and it's been really remarkable. So that's all good. And then 02:30:18.920 |
there's just the practical version of the argument, which is we are the main place these people get 02:30:22.520 |
educated anyway, right? The best and the brightest tend to come here to get educated. And so this is 02:30:26.840 |
the old kind of Mitt Romney idea, staple a green card to every, at least, maybe not every university 02:30:31.800 |
degree, but every technical degree, maybe the sociologists we could quibble about, but the 02:30:37.000 |
roboticists for sure, for sure, for sure. We can all agree that at least I won you over on something 02:30:41.800 |
today. Well, no, I'm, I'm exaggerating for effect. So, and I lost you. I had you for a second. I 02:30:47.880 |
haven't gotten to the other side of the argument yet. Okay. So surely we can all agree that, uh, 02:30:53.240 |
we need to staple a green card. The rollercoaster is going up. The rollercoaster is ratcheting slowly 02:30:57.640 |
up. So, um, yeah, so surely we can all agree that the roboticist should all get green cards. And 02:31:01.400 |
again, like there's a lot of merit to that, obviously, like, look, we want the U S to be 02:31:04.360 |
the world leader in robotics. What step one to being the world leader in robotics is have all 02:31:08.200 |
the great robotics people, right? Like, you know, very unlike the underpants gnomes, it's like a very 02:31:12.520 |
straightforward formula, right? All right. That's all well and good. All right. But it gets a little 02:31:16.360 |
bit more complicated, um, because, um, there is a kind of argument that's sort of right underneath 02:31:21.080 |
that, that you also hear from, you know, these same people. And I have made this argument myself 02:31:24.440 |
many times, which is, we need to do this because we don't have enough people in the U S who can do 02:31:28.280 |
it otherwise. Right. We have all these unfilled jobs. We've got all these, you know, all these 02:31:32.360 |
companies that wouldn't exist. We don't have enough good founders. We don't have enough 02:31:34.920 |
engineers. We don't have enough scientists. Or then the next version of the argument below that 02:31:38.840 |
is our education system is not good enough to generate those people. Um, and which is a weird 02:31:44.440 |
argument by the way, because like our education system is good enough for foreigners to be able 02:31:47.880 |
to come here preferentially in like a very large number of cases, but somehow not good enough to 02:31:52.280 |
educate our own native-born people. So there's like a weird, there's little cracks in the matrix 02:31:56.680 |
that you can kind of stick your fingernail into and kind of wonder about. No, we'll come back to 02:32:00.280 |
that one. But like, at least, yes, our education system has its flaws. And then, and then underneath 02:32:04.600 |
that is the, is the argument that, you know, Vivek made, um, you know, which is, you know, 02:32:08.280 |
we have a cultural rot in the country and, you know, native born people in the country aren't, 02:32:12.280 |
you know, don't work hard enough and spend too much time watching TV and tick tock and don't 02:32:15.640 |
spend enough time studying differential equations. Um, and again, it's like, all right, like, you 02:32:20.520 |
know, yeah, there's a fair amount to that. Like there's a lot of American culture that, um, is, 02:32:24.200 |
um, you know, there's a lot of frivolity. There's a lot of, you know, look, I mean, 02:32:28.360 |
we have well-documented social issues on many fronts, many things that cut against having a 02:32:32.280 |
culture of just like straightforward high achievement and effort and striving. 02:32:35.880 |
But anyway, like, you know, those are the basic arguments. Um, but then I have this kind of other 02:32:40.440 |
side of my, you know, kind of personality and thought process, which is, well, I grew up in 02:32:44.200 |
a small farming town in rural Wisconsin, the rural Midwest. And you know, it's interesting. There's 02:32:48.360 |
not a lot of people who make it from rural Wisconsin to, you know, high tech. Um, and so 02:32:54.440 |
it's like, all right, why is that exactly? Right. And then I noticed I'm an aberration. Like I was 02:32:58.200 |
the only one from anybody I ever knew who ever did this, right. I know what an aberration I am, 02:33:02.120 |
and I know exactly how that aberration happened. And it's a very unusual, you know, set of steps, 02:33:05.960 |
um, including, you know, many that were just luck. Um, but like it, there is in no sense, 02:33:11.800 |
a talent flow from rural Wisconsin into high tech, like not at all. Um, there is also like in no 02:33:19.480 |
sense of talent flow from the rest of the Midwest into high tech. There is no talent flow from the 02:33:23.320 |
South into high tech. There is no flow from the Sunbelt into high tech. There's no flow from, 02:33:27.320 |
you know, the deep South into high tech. Like just literally, it's like a blank. There's this 02:33:32.840 |
whole section of the country that just where the people just like for some reason don't end up in 02:33:37.480 |
tech. Now that's a little bit strange because these are the people who put a man on the moon. 02:33:42.760 |
These are the people who built the World War II war machine. These are the people, at least their 02:33:48.120 |
ancestors are the people who built the second industrial revolution and built the railroads 02:33:52.040 |
and built the telephone network and built, you know, logistics and 02:33:55.480 |
transportation and the auto industry. And I mean, the auto industry was built in Cleveland and Detroit. 02:33:59.240 |
And so at least these people's parents and grandparents and great grandparents somehow 02:34:03.240 |
had the wherewithal to like build all of this, like amazing things, invent all these things. 02:34:06.760 |
And then there's many, many, many, many stories in the history of American invention and innovation 02:34:11.640 |
and capitalism where you had people who grew up in the middle of nowhere. 02:34:14.600 |
Philo Farnsworth invented the television and just like, you know, tons and tons of others, 02:34:18.200 |
endless stories like this. Now you have like a puzzle, right? And the conundrum, which is like, 02:34:22.920 |
okay, like what is happening on the blank spot of the map? And then of course, you also can't 02:34:27.400 |
help noticing that the blank spot on the map, the Midwest, the South, you've also just defined 02:34:31.960 |
Trump country, the Trump voter base, right? And it's like, oh, well, that's interesting. 02:34:36.680 |
Like how, how did that happen? Right. And so either you really, really, really have to believe 02:34:40.920 |
the very, very strong version of like the Vivek thesis or something where you have to believe 02:34:44.360 |
that like that basically culture, the whole sort of civilization in the middle of the country and 02:34:49.080 |
the South of the country is so like deeply flawed, either inherently flawed or culturally flawed, 02:34:53.720 |
such that for whatever reason, they are not able to do the things that their, you know, 02:34:57.560 |
their parents and grandparents were able to do and that their peers are able to do 02:35:00.200 |
or something else is happening. Would you care to guess on what else is happening? 02:35:03.640 |
I mean, what? Affirmative action? Affirmative action. 02:35:05.880 |
Okay. This is very, think about it. This is very entertaining, right? What are the 02:35:11.880 |
three things that we know about affirmative action? It is absolutely 100% necessary, 02:35:16.040 |
and yet, it cannot explain the success of any one individual. 02:36:24.360 |
It could maybe explain disproportionate outcomes, but like it surely doesn't explain why 02:36:29.880 |
you're probably the only person in Silicon Valley from Wisconsin. 02:35:33.960 |
What educational institution in the last 60 years has wanted farm boys from Wisconsin? 02:35:37.560 |
But what institution rejected farm boys from Wisconsin? 02:35:42.840 |
Of course. Okay. So we know this. We know this. The reason we know this is because of the Harvard 02:35:46.920 |
and UNC Supreme Court cases. So this was like three years ago. These were big court cases, 02:35:52.120 |
you know, because the idea of affirmative action has been litigated for many, many, many years 02:35:55.720 |
and through many court cases and the Supreme Court repeatedly in the past had upheld that it was a 02:35:59.320 |
completely legitimate thing to do. And a lot of these, and there's basically two categories of 02:36:03.240 |
affirmative action that like really matter, right? The one is admissions into educational 02:36:08.120 |
institutions. And then the other is jobs, right? Getting hired. Like those are the two biggest 02:36:11.800 |
areas. The education one is like super potent, has been a super potent political issue for a 02:36:16.040 |
very long time for all, you know, people have written and talked about this for many decades. 02:36:19.240 |
I don't, I don't need to go through it. There's many arguments for why it's important. There's 02:36:22.040 |
many arguments as to how it could backfire. It's been this thing, but the Supreme Court upheld it 02:36:26.520 |
for a very long time. Um, the most, the most recent ruling, I don't, I'm not a lawyer. I 02:36:30.520 |
don't have the exact reference in my head, but there was a case in 2003 that said that, um, 02:36:36.040 |
Sandra Day O'Connor famously wrote that, um, you know, it's, although it had been 30 years of 02:36:40.200 |
affirmative action, and although it was not working remotely as it had been intended, 02:36:44.200 |
um, uh, she said that, you know, well, basically we need to try it for another 25 years. 02:36:48.600 |
But she said, basically as a message to future Supreme Court justices, if it hasn't resolved 02:36:52.520 |
basic, basically the issues it's intended to resolve within 25 years, then we should probably 02:36:56.200 |
call it off. By the way, we're coming up on the 25 years. It's a couple of years away. 02:37:01.640 |
Uh, the Supreme Court just, uh, had these cases, a Harvard case, and I think a University 02:37:06.680 |
of North Carolina case. And what's interesting about those cases is the, the lawyers in those 02:37:10.360 |
cases put a tremendous amount of evidence into the record of how the admissions decisions actually 02:37:15.640 |
happen, um, at Harvard and happen at UNC. And it is like every bit as cartoonishly garish and 02:37:21.880 |
racist as you could possibly imagine, because it's a ring of power. And if you're an admissions 02:37:28.120 |
officer at a private university or an administrator, you have unlimited power to do what you want, 02:37:32.440 |
and you can justify any of it under any of these rules or systems. Um, and up until these cases, 02:37:38.360 |
it had been a black box where you didn't have to explain yourself and show your work. And what the 02:37:42.280 |
Harvard and UNC cases did is they basically required showing the work. And so they, and 02:37:46.280 |
there was like all kinds of like phenomenal detail. I mean, number one is there were text 02:37:49.880 |
messages in there that will just curl your hair, with the students being spoken of in just 02:37:53.560 |
like crude racial stereotypes that would just make you want to jump out the window. It's horrible 02:37:57.480 |
stuff. But also, um, there was statistical information. And of course, the big statistical 02:38:02.120 |
kicker to the whole thing is that at top institutions, it's common for different ethnic 02:38:06.040 |
groups to have different cutoffs for SAT, uh, that are as wide as 400 points. Um, right. So 02:38:12.520 |
different groups. So specifically Asians need to perform at 400 SAT points higher than 02:38:19.480 |
other ethnicities in order to actually get admitted into these. I mean, this is not even 02:38:22.360 |
about, I mean, white people are a part of this, but like Asians are like a very big part of this. 02:38:25.400 |
And actually the Harvard case was brought by an activist on behalf of 02:38:29.160 |
the Asian students that were being turned away. Um, and it's basically, I mean, it's the cliche 02:38:33.560 |
now in the Valley and in the medical community, which is like, if you want a super 02:38:36.520 |
genius, you hire an Asian from Harvard because they are guaranteed to be freaking Einstein. 02:38:41.160 |
Cause if they weren't, they were never getting admitted. Right. Almost all the qualified Asians 02:38:44.840 |
get turned away. Um, so they've been running this. It's very, very explicit, very, very clear 02:38:50.840 |
program. Um, this of course has been a third rail of things that people are not supposed to discuss 02:38:55.000 |
or under any circumstances. Um, the thing that has really changed the tenor on this is I think 02:38:59.560 |
two things. Number one, those Supreme court cases, the Supreme court ruled that they can 02:39:03.240 |
no longer do that. I will tell you, I don't believe there's a single education institution 02:39:07.880 |
in America that is conforming with the Supreme court ruling. Um, I think they are all flagrantly 02:39:12.440 |
ignoring it. And we could talk about that. Mostly because of momentum probably or what? 02:39:16.360 |
They are trying to make the world a better place. They are trying to solve all these 02:39:19.320 |
social problems. They are trying to have diverse student populations. They are trying to live up 02:39:23.800 |
to the expectations of their donors. They are trying to make their faculty happy. They are 02:39:27.480 |
trying to, um, have their friends and family think that they're good people. Right. They're 02:39:32.840 |
trying to have the press write nice things about them. Like it's nearly impossible for them. And, 02:39:38.120 |
and, you know, to be clear, like nobody has been fired from an admissions office for, you know, 02:39:42.440 |
25 years of what the Supreme Court has now ruled to be illegality. Um, and so they're 02:39:47.160 |
all the same people under the exact same pressures. Um, and so, um, like I, you know, the numbers are 02:39:52.120 |
moving a little bit, but like, I don't think, I don't know anybody in the system who thinks that 02:39:55.720 |
they're complying with the Supreme Court. Like, who's in charge? In the rank ordering of who rules 02:40:00.040 |
whom, the universities rule the Supreme Court way more than the Supreme Court rules the universities. 02:40:03.960 |
Right. Well, another example of that is, I think, that every sitting member of the Supreme 02:40:07.880 |
Court right now went to either Harvard or Yale, right? Like the level of incestuousness 02:40:12.440 |
here is, like... anyway. So there's that. And so this has been running for a very long 02:40:17.160 |
time. So one is the Harvard and UNC cases kind of gave up the game, number one, or at least 02:40:20.840 |
showed what the mechanism was. Uh, and the number two, the other thing is obviously the aftermath 02:40:24.920 |
of October 7th. Right. Um, and what we discovered was happening with Jewish applicants. Um, and what 02:40:29.960 |
was happening at all the top institutions for Jewish applicants was they were being managed down 02:40:33.640 |
as they were being actively managed down as a percentage of the base. Um, and, 02:40:39.320 |
let's see, I've heard reports of like extremely explicit plans to 02:40:44.520 |
manage the Jewish admissions down to their representative percentage of the US 02:40:48.440 |
population, which is 2%. And, and, you know, there's a whole backstory here, which is a hundred 02:40:52.360 |
years ago, Jews were not admitted into a lot of these institutions. And then there was a big 02:40:55.560 |
campaign to get them in. Once they could get in, they immediately became 30% of these institutions 02:40:59.960 |
because there's so many smart, talented Jews. So it went from 0% to 30%. And then the most recent 02:41:05.320 |
generation of leadership has been trying to get it done to 2%. And a lot of Jewish people, at least 02:41:09.480 |
a lot of Jewish people I know sort of, they kind of knew this was happening, but they discovered 02:41:13.640 |
it the hard way, uh, after October 7th. Right. And so all of a sudden, so basically the Supreme 02:41:19.320 |
court case meant that you could address this in terms of the Asian victims, the October 7th meant 02:41:24.280 |
that you could address it in terms of the Jewish victims. And for sure, both of those groups are 02:41:28.200 |
being systematically excluded. Right. And then of course, um, there's the thing that you basically 02:41:32.760 |
can't talk about, which is all the white people are being excluded. And then it turns out it's 02:41:37.560 |
also happening to black people. And this is the thing that like blew my freaking mind when I found 02:41:43.240 |
out about it. So, um, I just assumed that like, this was great news for like American blacks, 02:41:48.200 |
because like, you know, obviously if, you know, whites, Asians, and Jews are being excluded, 02:41:52.760 |
then, you know, the whole point of this in the beginning was to get the black population up. 02:41:56.200 |
And so this must be great for American blacks. So then I discovered this New York Times article 02:42:00.200 |
from 2004, um, called, um, blacks are being admitted into top schools at greater numbers, 02:42:05.640 |
but which ones, uh, and again, and by the way, this is in the New York Times, this is not in like, 02:42:12.200 |
you know, whatever, National Review. This is the New York Times, uh, 2004. And the two authorities 02:42:17.480 |
that were quoted in the story are Henry Louis Gates, who's the Dean of the African-American 02:42:21.400 |
studies, you know, community in the United States, super brilliant guy. And then Lani Guinier, 02:42:25.880 |
who was a potential Supreme Court appointee and, I think, a close friend of 02:42:31.720 |
Hillary Clinton. And for a long time, she was on the shortlist for the Supreme Court. So 02:42:35.000 |
one of the top, you know, jurists, uh, lawyers in the country. Both black, both sort of 02:42:40.520 |
legendarily successful in the academic and legal worlds. 02:42:44.600 |
Um, and they are quoted as the authorities in this story and the story that they tell 02:42:48.520 |
is actually very, it's amazing. Um, and by the way, it's happening today in, uh, education 02:42:55.080 |
institutions and it's happening in companies and you can see it all over the place and the 02:42:58.200 |
government, um, which is, um, at least at that time, the number was half of the black admits 02:43:04.920 |
into a place like Harvard were not American born blacks. They were foreign born blacks, um, 02:43:09.960 |
specifically Northern African, generally Nigerian, or West Indian. 02:43:16.360 |
Um, right. Um, and by the way, many Nigerians and Northern Africans have come to the US and 02:43:21.880 |
have been very successful. Nigerian Americans as a group, like, way outperform. You know, 02:43:25.480 |
this is a super smart cohort of people. And then West Indian blacks in the US are incredibly 02:43:29.160 |
successful. Um, most recently, by the way, Kamala Harris, as well as Colin Powell, like, just two 02:43:34.760 |
sort of examples of that. And so basically what Henry Louis Gates and Lani Guinier said in the 02:43:39.080 |
story is Harvard is basically struggling to either, whatever it was, identify, recruit, 02:43:44.680 |
make successful, whatever it was, American born native blacks. And so therefore they were using 02:43:49.400 |
high-skill immigration as an escape hatch to go get blacks from other 02:43:53.160 |
countries. Um, and then, and then this was 2004 when you could discuss such things. Um, 02:43:58.840 |
obviously that is a topic that nobody has discussed since. It has sailed on, and all of the 02:44:04.040 |
DEI programs of the last 20 years have had this exact characteristic. Um, there's large numbers 02:44:09.000 |
of black people in America who are fully aware of this and are like, it's obviously not us 02:44:13.000 |
that are getting any slots; we're literally competing with people who are 02:44:15.960 |
being imported. And, you know, if you believe in the basis of affirmative action, 02:44:19.640 |
you were trying to make up for the historical injustice of American black slavery. And so 02:44:23.240 |
the idea that you'd import somebody from, you know, Nigeria who never experienced that, um, 02:44:28.440 |
you know, is like tremendously insulting to black Americans. Anyway, so you can see where 02:44:33.160 |
I'm heading with this. We have been in a 60-year social engineering experiment to 02:44:37.720 |
exclude native born people from the educational slots and jobs that high-skill immigration has 02:44:42.600 |
been funneling foreigners into. Right. And so it turns out it's not a victim-free thing. There's, 02:44:47.400 |
there's like 100% there are victims. Because why? There's only so many, for sure, there's only so 02:44:51.240 |
many education slots. And then for sure, there's only so many of these jobs, right. You know, 02:44:55.480 |
Google only hires so many, you know, whatever, level seven engineers. Right. And so, 02:44:59.880 |
so that's the other side of it. Right. And so you're a farm boy in Wisconsin, right. Or a, 02:45:05.320 |
you know, black American whose ancestors arrived here, you know, on a slave ship 300 years ago 02:45:09.800 |
in Louisiana or a, you know, Cambodian immigrants in, you know, the Bronx, um, and your kid or a 02:45:17.000 |
Jewish immigrant, uh, or a, you know, or from a very successful Jewish family. Um, and you know, 02:45:21.320 |
your entire, you know, for three generations, you and your parents and grandparents went to Harvard 02:45:25.640 |
and what all of those groups know is the system that has been created is not for them, right. 02:45:31.560 |
It's designed specifically to exclude them. And then what happens is all of these tech people 02:45:35.960 |
show up in public and say, yeah, let's bring in more foreigners. Right. And so, so anyway, 02:45:40.200 |
so the short version of it is you can't anymore, I don't think, just have the 02:45:44.600 |
quote high-skilled immigration conversation for either education or 02:45:49.000 |
for employment without also having the DEI conversation. Um, and then underneath, 02:45:55.480 |
and then DEI is just another word for affirmative action. So it's, it's the affirmative action 02:45:58.440 |
conversation. And you need to actually deal with this in substance and see what's actually 02:46:01.960 |
happening to people. You need to join these topics. And I think it is much harder to make 02:46:06.840 |
the moral claim for high-skilled immigration, given the extent to which 02:46:10.920 |
DEI took over both the education process and the hiring process. 02:46:15.080 |
Okay. So first of all, that was brilliantly laid out the nuance of it. So just to understand, 02:46:21.080 |
it's not so much a criticism of H1B high-skilled immigration. It's that there needs to be more 02:46:26.280 |
people saying, yay, we need more American born hires. So I spent the entire Christmas holiday 02:46:33.160 |
reading every message on this and not saying anything. And what I was, which you know me 02:46:38.120 |
well enough to know that's a serious level of, yeah, that was very Zen. Yes. Thank you. Thank 02:46:42.520 |
you. No, it wasn't. There was tremendous rage on the other side of it, but I suppressed it. 02:46:46.760 |
So, um, I was waiting for the dog that didn't bark, uh, right. And the dog that didn't bark was, 02:46:53.640 |
I did not. And you tell me if you saw one, I did not see a single example of somebody 02:46:57.240 |
pounding the table for more high-skilled immigration who was also pounding the table 02:47:00.280 |
to go get more smart kids who are already here, uh, into these educational institutions and into 02:47:04.920 |
these jobs. I didn't see, I didn't see a single one. That's true. I, I think I agree with that. 02:47:09.720 |
There really was a divide, but it was like, literally, the proponents 02:47:14.040 |
of high-skilled immigration. And again, this was me for a very long time. I mean, I kind of took 02:47:17.160 |
myself by surprise on this because, you know, I had the much simpler version 02:47:21.640 |
of this story for a very long time. Like I said, I've been in Washington many times under past 02:47:25.560 |
presidents lobbying for this, by the way, never made any progress, which we could talk about, 02:47:28.600 |
like it never actually worked. Um, but, um, you know, I, I've been on the other side of this one, 02:47:32.920 |
but I was literally sitting there being like, all right, which of these like super geniuses, 02:47:36.680 |
um, who, you know, many of whom by the way are very, you know, successful high-skilled immigrants 02:47:40.200 |
or children of high-skilled immigrants, you know, which of these super geniuses are going to like, 02:47:44.680 |
say, actually we have this like incredible talent source here in the country, which again, 02:47:48.040 |
to be clear, I'm not talking about white people. I'm talking about native born Americans, whites, 02:47:51.240 |
Asians, Jews, blacks, for sure, for sure, for sure. Those four groups, but also white people. 02:47:57.400 |
Yeah. And also white people. People that are making the case for American-born hires 02:48:04.200 |
are usually not also supporting H-1B. It's an extreme divide. And those people that are 02:48:10.280 |
making that case are often making it in quite a radical 02:48:18.440 |
way. Let's put it this way. Yeah. But you have this interesting thing that you have a split 02:48:21.400 |
between the sides that I've noticed, which is one side has all of the experts. Right. 02:48:25.560 |
Right. And I'm using scare quotes; for people listening to audio, I'm making quotes in the air 02:48:29.560 |
with my fingers as vigorously as I can. One side has all the certified experts. 02:48:33.880 |
The other side just has a bunch of people who were like, they know that something is wrong 02:48:36.680 |
and they don't quite know how to explain it. And what was so unusual about the Harvard UNC cases, 02:48:41.160 |
by the way, in front of Supreme court is they actually had sophisticated lawyers for the first 02:48:44.440 |
time in a long time, actually put all this evidence together and actually put it in the 02:48:47.000 |
public record. They actually, they actually had experts, which is just, which is just really rare. 02:48:51.480 |
Generally what you get is you get, cause if you don't have experts, what do you have? You know, 02:48:54.600 |
something is wrong and you have, but you have primarily an emotional response. You feel it, 02:48:59.240 |
but can you put it, you know, can you put it in the words and tables and charts, 02:49:04.040 |
you know, that, that a certified expert can and no, you can't like, that's not, you know, 02:49:07.400 |
that's not who you are. That doesn't mean that you're wrong. And it also doesn't mean that you 02:49:10.120 |
have less of a moral stance. Um, yeah. And so it's just like, all right, now, by the way, look, 02:49:14.760 |
I think there's there, I think there are ways to square the circle. I think there's a way to have 02:49:17.640 |
our cake and eat it too. Like, I think there'd be many ways to resolve this. Again, 02:49:22.280 |
I think the way to do it is to look at these issues combined, look at DEI 02:49:26.520 |
combined with high-skilled immigration. It so happens that DEI is under much more 02:49:31.880 |
scrutiny today than it has been for probably 20 years. Affirmative action, the Supreme 02:49:37.000 |
Court did just rule, is not legal for universities to do. They are still doing it, 02:49:42.600 |
but they should stop. Um, and then there are more and more, you've seen more companies now also 02:49:50.040 |
ditching their DEI programs. Um, in part that's happening for a bunch of reasons, 02:49:54.360 |
but it's happening in part because a lot of corporate lawyers will tell you that the Supreme 02:49:58.440 |
Court rulings in education either already apply to businesses, or it's a clear foreshadowing that 02:50:04.520 |
the Supreme Court will rule on new cases that will ban it in businesses. And so 02:50:09.480 |
there is a moment here to be able to look at this, um, on both sides. Um, let me add one 02:50:14.760 |
more nuance to it that makes it even more complicated. Yeah. So the cliche is we're 02:50:18.760 |
going to brain drain the world, right? You've heard that we're going to, we're going to take 02:50:21.480 |
all the smart people from all over the world. We're going to bring them here. We're going to 02:50:23.960 |
educate them and then we're going to keep them. And then they're going to raise their families 02:50:26.120 |
here, create businesses here, create jobs here. Right? In the cliche, that's a super positive 02:50:30.200 |
thing. Yeah. Okay. So what happens to the rest of the world? They lose. Well, how fungible are people? 02:50:38.920 |
How many highly ambitious, highly conscientious, highly energetic, 02:50:45.720 |
high achieving, high IQ, super geniuses are there in the world? And if there's a lot, 02:50:52.280 |
that's great. But if there just aren't that many and they all come here and they all aren't where 02:50:57.480 |
they would be otherwise, what happens to all those other places? So it's almost impossible 02:51:03.480 |
for us here to have that conversation in part, because we become incredibly uncomfortable as a 02:51:07.480 |
society talking about the fact that people aren't just simply all the same. Um, which is a whole 02:51:12.120 |
thing we could talk about. But also, we are purely the beneficiary of this effect, 02:51:16.600 |
right? We are brain draining the world, not the other way around. There's only four. So if you 02:51:21.320 |
look at the flow of high-skill immigration over time, there's only four permanent sinks of 02:51:25.480 |
high-skill immigration, places people go: it's the US, Canada, the UK, and Australia. It's 02:51:30.680 |
four of the Five Eyes. It's the major Anglosphere countries. And so for those 02:51:35.960 |
countries, this seems like a no-lose proposition. It's all the other countries, because 02:51:40.840 |
basically what we four countries have been doing is draining all the smart people 02:51:43.960 |
out. It's actually much easier for people in Europe to talk about this. I've discovered, 02:51:47.480 |
um, because the Eurozone is whatever, you know, 28 countries and within the Eurozone, 02:51:51.560 |
the high-skill people over time have been migrating to originally the UK, but also 02:51:56.040 |
specifically, I think it's the Netherlands, um, uh, Germany and France, but specifically they've 02:52:00.200 |
been migrating out of the peripheral Eurozone countries. And the one where this 02:52:04.520 |
really hit the fan was in Greece, right? So, you know, Greece falls into chaos disaster. And then, 02:52:09.240 |
you know, you're running the government in Greece and you're trying to figure out how to put an 02:52:11.560 |
economic development plan together. All of your smart young kids have left, like, what are you 02:52:17.720 |
going to do? Right. Um, by the way, this is a potential, I know you care a lot about Ukraine. 02:52:23.480 |
This is a potential crisis for Ukraine, in part because of this, because we 02:52:27.400 |
enthusiastically recruit Ukrainians, of course. And so we've been brain draining Ukraine 02:52:31.240 |
for a long time, but also of course, you know, war does tend to cause people to migrate out. 02:52:35.400 |
And so, you know, when it comes time for Ukraine to rebuild as a peaceful country, 02:52:40.120 |
is it going to have the talent base even that it had five years ago? It's like a very big and 02:52:43.880 |
important question. By the way, Russia: we have brain drained a lot of really smart people 02:52:47.560 |
out of Russia. A lot of them are here, right, over the last, you know, 30 years. And so there's 02:52:52.840 |
this thing, it's actually really funny if you think about it, like the one thing that we know 02:52:57.160 |
to be the height of absolute evil that the West ever did was colonization and resource extraction, 02:53:02.680 |
right? So we know the height of absolute evil was when the Portuguese and the English and, 02:53:06.600 |
you know, everybody else went and had these colonies and then went in and we, you know, 02:53:09.480 |
took all the oil and we took all the diamonds or we took all the whatever lithium or whatever it 02:53:13.080 |
is, right? Well, for some reason, we realized that that's a deeply evil thing to do when it's a 02:53:17.400 |
physical resource, when it's a non-conscious physical matter. Um, for some reason, we think 02:53:22.120 |
it's completely morally acceptable to do it with human capital. In fact, we think it's glorious 02:53:27.000 |
and beautiful and wonderful and, you know, the great flowering of peace and 02:53:31.080 |
harmony and moral justice of our time to do it. And we don't think for one second what we're doing 02:53:35.480 |
to the countries that we're pulling all these people out of. And I, I, this is one of these 02:53:39.800 |
things, like, I don't know, like maybe we're just going to live in this delusional state forever 02:53:43.320 |
and we'll just keep doing it and it'll keep benefiting us and we just won't care what 02:53:46.200 |
happens. But like, I think there may come a point. This is one of these 02:53:49.480 |
submarines 10 feet under the waterline. Like, I think it's just a matter of time until 02:53:53.800 |
people suddenly realize, oh my God, what are we doing? Cause like we need the rest of the world 02:53:59.160 |
to succeed too, right? Like we need these other countries to like flourish. Like we don't want 02:54:03.240 |
to be the only successful country in the middle of just like complete chaos and disaster. 02:54:06.440 |
And we just extract and we extract and we extract and we don't think twice about it. 02:54:10.280 |
Well, this is so deeply profound actually. So what is the cost of winning? 02:54:15.960 |
If these countries are drained in terms of human capital on the, on the level of geopolitics, 02:54:24.280 |
what does that lead to? Even if we talk about wars and conflict and all of this, 02:54:29.080 |
we actually want them to be strong in the way we understand strong, not just 02:54:34.840 |
in every way so that cooperation and competition can build a better world for all of humanity. 02:54:41.160 |
It's interesting. This is one of those truths where you just speak and it 02:54:48.600 |
resonates, and I didn't even think about it. Yeah, exactly. So this is what you were 02:54:53.240 |
sitting in over the holiday season, just boiling over. So all that said, there's still, 02:54:59.800 |
to you, some good to the H-1B. Okay. So then there's this. 02:55:05.400 |
Oh, to come all the way around, there's 02:55:08.200 |
another nuance, which is that mostly in the Valley we don't use H-1Bs anymore. Mostly we use O-1s. 02:55:12.520 |
So there's a, you mean, a separate class of visa? And the O-1 is like this: 02:55:17.800 |
it turns out the O-1 is the super genius visa. So the O-1 is basically our founder visa, 02:55:22.520 |
like when we have somebody from anywhere in the world and they've 02:55:25.640 |
invented a breakthrough, a new technology, and they want to come to the US to start a company, 02:55:28.520 |
they come in through an O-1 visa. And that actually is a fairly high 02:55:33.400 |
bar. It's a high acceptance rate, but it's a pretty high bar, and they do a lot of work. 02:55:37.000 |
You have to put real work into it, really prove your case. 02:55:40.200 |
Um, mostly what's happened with the H-1B visa program, um, is that it has gone to basically 02:55:46.440 |
two categories of employers. One is basically a small set of big tech companies 02:55:50.120 |
that hire in volume, which is exactly the companies that you would think. And then 02:55:54.120 |
the other is it goes to these, what they call kind of the mills, the consulting mills, 02:55:57.800 |
right. And so there's this set of companies with names, I don't want to pick on companies, but you 02:56:01.080 |
know, names like Cognizant, whose business model is basically 02:56:06.200 |
bringing in primarily Indians in large numbers. And you know, 02:56:10.680 |
they often have, you know, offices next to company-owned housing, and they'll have 02:56:14.360 |
organizations that are literally 02:56:17.320 |
thousands of Indians, you know, living and working in the US, and they do basically, 02:56:20.680 |
call it mid-tier IT consulting. So, you know, these folks, they're making 02:56:26.360 |
good wages, but they're making sixty, eighty, a hundred thousand dollars a 02:56:29.240 |
year, not the, you know, 300,000 that you'd make in the Valley. And so in practice, 02:56:35.480 |
the startup world basically, little tech as we call it, mainly doesn't use H-1B 02:56:40.760 |
at this point and mainly can't because the system is kind of rigged in a way that we really can't. 02:56:45.720 |
And then again, you get to the sort of underlying morality here, 02:56:48.920 |
which is, it's like, well, you know, Amazon. I love Amazon, but like, 02:56:53.320 |
they're a big, powerful company. You know, they've got, you know, more money than God. They've got 02:56:57.400 |
resources. They've got long-term planning horizon. They do big, you know, profound things over, 02:57:01.080 |
you know, decades at a time. Um, you know, they could, you know, or any of these other companies 02:57:05.720 |
could launch massively effective programs to go recruit the best and brightest from all throughout 02:57:10.520 |
the country. And, you know, you'll notice they don't do that. You know, they bring in, you know, 02:57:14.920 |
10,000, 20,000 H-1Bs a year. Um, and so you've got a question there. Um, and then these mills, 02:57:20.840 |
like there's lots of questions around them and whether they should, you know, whether that's 02:57:23.880 |
even an ethical way to do it. You know, I don't want to say they're unethical, but there's questions 02:57:27.080 |
around exactly what the trade-offs are there. And so, you know, this, 02:57:32.520 |
yeah. And this is like a Pandora's box that really, you know, nobody really wanted to be opened. 02:57:36.040 |
Um, you know, to, to, to play devil's advocate on all this in terms of like national immigration 02:57:40.600 |
issues, you know, none of this is like a top end issue just because the numbers are small. 02:57:44.040 |
Um, right. And so, you know, I don't think, you know, the administration has said like, 02:57:46.840 |
this is not like a priority of theirs for right now. Um, but I guess what I would say is like, 02:57:51.640 |
there is actually a lot of complexity and nuance here. Um, I have a lot of friends, like I said, 02:57:56.680 |
I have a lot of friends and colleagues who are, you know, came over on H-1Bs or ones green cards, 02:58:00.840 |
many are now citizens. And, you know, not every single one, but a lot 02:58:05.160 |
of them were enthusiastic to, you know, defend the honor of immigrants throughout this whole period. 02:58:09.160 |
And they said to me, it's like, well, Marc, how can we more 02:58:11.880 |
clearly express, you know, the importance of high-skilled immigration to the U.S.? And I was like, 02:58:15.160 |
I think you can do it by also advocating for developing our native-born 02:58:18.600 |
talent. And like, do you want to inflame the issue or do you want to defuse the issue? 02:58:23.240 |
Right. And I think the answer is to defuse the issue. Oh, let me give you one more 02:58:28.360 |
positive scenario, which, and then I'll also beat up on the university some more. 02:58:33.000 |
Um, do you, do you know about the national merit scholarship system? Have you heard about this? 02:58:38.680 |
Um, not really. So there's a system that was created during the Cold War called National 02:58:43.800 |
Merit Scholars. And it was created, I forget, in the late fifties or 02:58:48.840 |
sixties, when people in government actually wanted to identify the best and the 02:58:52.600 |
brightest, as heretical an idea as that sounds today. And so it's basically a national 02:58:58.280 |
talent search for basically IQ. Its goal is to identify basically the top 0.5% of 02:59:04.440 |
IQ in the country, by the way, completely regardless of other characteristics. 02:59:09.160 |
So there's no race, gender, or any other aspect to it. It's just going for straight intelligence. 02:59:13.320 |
It uses first the PSAT, which is the preparatory SAT that you take, and then 02:59:19.080 |
the SAT. So it uses those scores; that is the scoring. It's a straight PSAT-SAT 02:59:24.360 |
scoring system. So they use the SAT as a proxy for IQ, which it is. They run this every 02:59:32.440 |
year. They get down to like the 1% of the population 02:59:36.280 |
of kids, 18 year olds, in a given year who scored highest on the PSAT. And then they 02:59:40.520 |
further qualify down to the 0.5% that also replicate on the SAT. And then it's like 02:59:45.720 |
the scholarship amount is like $2,500, right? So it's like, it was a lot of money 50 years ago, 02:59:50.760 |
not as much today, but it's a national system being run literally to find the best and the 02:59:55.240 |
brightest. How many of our great and powerful universities use this as a scouting system? 03:00:00.440 |
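For readers who want the mechanics of the two-stage PSAT-then-SAT filter described above, here is a minimal sketch in Python. The cohort size, noise levels, and cutoff fractions are invented for illustration; they are not the National Merit program's actual parameters.

```python
import numpy as np

def national_merit_style_filter(psat, sat, stage1_frac=0.01, stage2_frac=0.005):
    """Two-stage talent filter (illustrative only, not the real program's rules)."""
    # Stage 1: keep the top ~1% of the cohort by PSAT score.
    stage1_cut = np.quantile(psat, 1 - stage1_frac)
    stage1_idx = np.where(psat >= stage1_cut)[0]

    # Stage 2: of those, keep the ones whose SAT score also lands in the
    # top ~0.5% of the full cohort (the "replicate on the SAT" step).
    stage2_cut = np.quantile(sat, 1 - stage2_frac)
    finalists = stage1_idx[sat[stage1_idx] >= stage2_cut]
    return finalists

# Hypothetical cohort: two noisy test scores driven by one latent ability factor.
rng = np.random.default_rng(0)
ability = rng.normal(size=1_000_000)
psat_scores = ability + rng.normal(scale=0.3, size=ability.shape)
sat_scores = ability + rng.normal(scale=0.3, size=ability.shape)

finalists = national_merit_style_filter(psat_scores, sat_scores)
print(f"{len(finalists)} finalists out of {len(ability)} students")
```

The reason for the second stage is just measurement noise: a single test over-selects lucky outliers, and requiring the score to replicate trims them.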
Like our universities all have sports teams. They all have national scouting, uh, full-time scouts 03:00:06.280 |
who go out and they go to every high school and they try to find all the great basketball players 03:00:09.480 |
and bring them into the NCAA, into all these leagues. Um, how many of our great and powerful 03:00:14.120 |
and enlightened universities use the national merit system to go do a talent, uh, search for 03:00:18.520 |
the smartest kids and just bring them in? Let me guess. Very few. Zero. As you say it, that's 03:00:25.800 |
brilliant. There should be that same level of scouting for talent internally. Go get the smartest 03:00:31.320 |
ones. I'll give you one more kicker on this topic, if I haven't beaten it to death. 03:00:35.240 |
um, you know, the SAT has changed. Um, so the SAT used to be a highly accurate proxy for IQ, 03:00:41.800 |
um, that caused a bunch of problems. People really don't like the whole idea of IQ. Um, 03:00:48.040 |
and so the SAT has been actively managed over the last 50 years by the college board that runs it. 03:00:53.000 |
And it has been, essentially, like everything else, dumbed down. And so, 03:00:58.200 |
in two ways: number one, it's been dumbed down where an 800 from 40 years 03:01:03.480 |
ago does not mean what an 800 means today. 40 years ago, it was almost impossible 03:01:08.520 |
to get an 800. Today there are so many eight hundreds that you could stock the 03:01:12.920 |
entire Ivy League with eight hundreds. Right. And so it's been deliberately dumbed 03:01:17.240 |
down. And then two is they have tried to pull out a lot of what's called the g 03:01:21.400 |
loading. And so they've tried to detach it from being an IQ proxy because IQ is such an 03:01:25.800 |
inflammatory concept. And the consequence of that is, and this is sort of perverse, they've 03:01:29.640 |
made it more coachable, right? So for the SAT 40 years ago, coaching didn't really work. 03:01:35.480 |
And more recently it has really started to work. And one of the things you see is that the Asian 03:01:38.840 |
spike, you see this like giant leap upward in Asian performance over the last decade. And I, 03:01:42.600 |
I think looking at the data, I think a lot of that is because it's more coachable now 03:01:45.960 |
and the, and the Asians do the most coaching. Um, so there's a bunch of issues with this. 03:01:50.600 |
And so the coaching thing is really difficult because the coaching thing is a subsidy then 03:01:54.440 |
to the kids whose parents can afford coaching. Right. And I don't know about you, but where I 03:01:58.200 |
grew up, there was no SAT coaching. So there's like an issue there. I didn't even know what the 03:02:02.280 |
SAT was until the day I took it, much less that there was coaching, much less that it could work. 03:02:05.880 |
So much less we could afford it. So number one, there's issues there. But the other issue 03:02:10.200 |
there is, think about what's happened with the dumbing down: 800 no longer captures all of this. 03:02:15.160 |
800 is too crude of a test. It's like the AI benchmarking problem. It's the same problem 03:02:20.760 |
they have in AI benchmarking right now. 800 is too low of a threshold. There are too many kids 03:02:26.120 |
scoring 800, because what you want is, whatever, if it's going to be a hundred thousand 03:02:30.520 |
kids, I don't know what it is, but it's going to be 50,000 kids a year scoring 800. You also then 03:02:33.720 |
want kids to be able to score 900 and a thousand and 1100 and 1200. And you want to ultimately 03:02:38.200 |
identify the top hundred kids and make sure that 03:02:42.440 |
you get them into MIT. And the resolution of the test has been reduced so that it actually is not 03:02:47.640 |
useful for doing that. And again, I would say this is like part of the generalized corruption 03:02:52.200 |
that's taken place throughout this entire system where we have been heading in the reverse direction 03:02:56.440 |
from wanting to actually go get the best and brightest and actually put them in the places 03:02:59.560 |
where they should be. And then just the final comment would be the great thing about standardized 03:03:03.960 |
testing and the national merit system is it's completely, like I said, it's completely race 03:03:07.000 |
blind. It's gender blind. It's blind on every other characteristic. It's only done on test scores. 03:03:11.880 |
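To make the resolution point above concrete, here is a small simulation with made-up numbers showing how a score ceiling erases information at the top: once many students hit the maximum score, the test can no longer rank them at all, even though their underlying abilities differ widely.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical cohort of test takers with a latent ability score.
n_students = 2_000_000
ability = rng.normal(loc=500, scale=100, size=n_students)

# A test whose scale is capped at 800: everything above the cap collapses.
raw_score = ability + rng.normal(scale=30, size=n_students)  # measurement noise
capped_score = np.clip(raw_score, 200, 800)

n_at_ceiling = int((capped_score >= 800).sum())
print(f"students pinned at the 800 ceiling: {n_at_ceiling}")

# Everyone at the ceiling looks identical, so ranking within that group is
# impossible even though their underlying abilities differ a lot.
at_ceiling = ability[capped_score >= 800]
print(f"ability spread hidden by the ceiling: "
      f"{at_ceiling.min():.0f} to {at_ceiling.max():.0f}")
```

An uncapped scale, or a harder top section, would keep separating those students.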
You know, and you can make an argument about whether that's good or bad, but it is, 03:03:15.000 |
you know, for sure, um, you know, it's the closest thing that we had to get to merit. 03:03:19.000 |
It was the thing that they did when they thought they needed merit to win the Cold War. And of 03:03:23.560 |
course we could, we could choose to do that anytime we want. And I just say, I find it like 03:03:27.640 |
incredibly striking, um, and an enormous moral indictment of the current system that there are 03:03:31.560 |
no universities that do this today. So back to the immigration thing, just real quick, it's like, 03:03:35.480 |
okay, we aren't even trying to go get the smart kids out of the center and south. And even if 03:03:39.480 |
they think that they can get into these places, they get turned down. And the same thing for the 03:03:42.920 |
smart Asians and the same thing for the smart Jews and the same thing for the smart black people. 03:03:45.880 |
And like, I just don't know how that's 03:03:51.880 |
moral. Like I don't get it at all. As you said about the 800, so I took the SAT and ACT many 03:03:58.200 |
times and I've always gotten a perfect 800 on math. And I'm not special. Like 03:04:08.200 |
it doesn't identify genius. I think you want to search for genius and you want to create 03:04:14.200 |
measures that find genius of all different kinds speaking of diversity. And I guess we should 03:04:21.960 |
reiterate and say over and over and over, defend immigrants, yes, but say we should hire 03:04:30.440 |
more and more native born. Well, you asked me in the beginning 03:04:34.520 |
what the most optimistic forecast is, right, that we could have. The most optimistic forecast 03:04:37.800 |
would be, my God, what if we did both? So that's the reasonable, the rational, the smart thing to 03:04:47.000 |
say here. In fact, we don't have to have a war. Well, it would defuse the 03:04:51.560 |
entire issue. If everybody in the center and the south of the country, and every Jewish family, 03:04:55.320 |
Asian family, black family knew they were getting a fair shake, like it would defuse the issue. 03:04:58.760 |
Like how about defusing the issue? Like what a crazy, radical idea. Sorry, I don't mean to get 03:05:04.200 |
out over my skis here. But I think your profile on X states it's time to build. It feels like 03:05:11.880 |
2025 is a good year to build. So I wanted to ask your advice, maybe, for 03:05:23.160 |
anybody who's trying to build, so who's trying to build something useful in the world, maybe launch 03:05:29.560 |
a startup or maybe just launch apps, services, whatever ship software products. So maybe by way 03:05:39.320 |
of advice, how do you actually get to shipping? So, I mean, a big part of the answer, I think, 03:05:45.480 |
is we're in the middle of a legit revolution. And I know you've been talking about this on 03:05:48.920 |
your show, but like AI coding, I mean, this is the biggest earthquake to hit software in 03:05:54.200 |
certainly my life, um, maybe since the invention of software. Um, and I'm sure, you know, we're 03:05:59.240 |
involved in various of these companies, but, you know, these, these tools, um, you know, 03:06:02.280 |
from a variety of companies are, um, like absolutely revolutionary and they're getting 03:06:08.360 |
better at leaps and bounds right every day. And you, you, you know, all this, but like 03:06:11.640 |
the thing with coding, like there's like open questions of whether AI can get better at like, 03:06:15.880 |
I don't know, understanding philosophy or whatever, creative writing or whatever, but like 03:06:19.240 |
for sure we can make it much better at coding, right. Because you can validate the results of 03:06:23.480 |
coding. Um, and so, you know, there's all these methods of, you know, synthetic data and self 03:06:27.400 |
training and reinforcement learning that for sure you can do with, with coding. And so everybody I 03:06:31.880 |
know who works in the field says AI coding is going to get to be phenomenally good. And it's, 03:06:36.280 |
it's already great. And you can, I mean, anybody wants to see this, just go on YouTube and look at 03:06:39.960 |
AI coding demos, you know, little, little kids making apps in 10 minutes working with an AI 03:06:43.480 |
coding system. And so I think it's the golden, I mean, I think this is an area where it's clearly 03:06:47.400 |
the golden age. The tool set is extraordinary, you know, in a day as a coder for sure in a day, 03:06:52.360 |
you can retrain yourself. Um, you know, start using these things, get a huge boost in productivity 03:06:56.760 |
as a non-coder, you can learn much more quickly than you could before. 03:06:59.640 |
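One hedged sketch of what "you can validate the results of coding" can mean in practice: run candidate code against tests and accept, or in a training loop reward, only what passes. The candidate function and the tests here are invented for illustration; a real pipeline would sandbox the execution far more carefully.

```python
import subprocess
import sys
import tempfile
import textwrap

# A candidate solution, e.g. produced by a code model, plus tests that define
# "correct". The reward is simply whether the tests pass.
CANDIDATE = textwrap.dedent("""
    def fib(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a
""")

TESTS = textwrap.dedent("""
    assert fib(0) == 0
    assert fib(1) == 1
    assert fib(10) == 55
    print("ok")
""")

def passes_tests(candidate: str, tests: str) -> bool:
    """Run the candidate and its tests in a subprocess; True if they pass."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(candidate + "\n" + tests)
        path = f.name
    result = subprocess.run([sys.executable, path], capture_output=True, timeout=10)
    return result.returncode == 0

print("reward:", 1.0 if passes_tests(CANDIDATE, TESTS) else 0.0)
```

That pass/fail signal is what makes coding, and math, different from, say, philosophy: it gives you an automatic check, so synthetic data and reinforcement learning have something to optimize against.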
That's actually a tricky one in terms of learning as a non-coder to build stuff. 03:07:04.760 |
I feel like you still need to learn how to code. It becomes a superpower. It helps 03:07:10.840 |
you be much more productive. Like you could legitimately be a one-person company 03:07:17.240 |
and get quite far. I agree with that up to a point. So the, um, I think for sure for quite 03:07:24.120 |
a long time, the people who are good at coding are going to be the best at actually having AIs 03:07:27.480 |
code things. Um, cause they're going to understand what, I mean, very basic, they're going to 03:07:30.600 |
understand what's happening, right. And they're going to be able to evaluate the work and they're 03:07:33.880 |
going to be able to, you know, literally like manage AIs better. Um, like even if they're not 03:07:37.560 |
literally handwriting the code, they're just going to have a much better sense of what's going on. 03:07:40.200 |
So I definitely think like a hundred percent, my nine-year-old is like doing all kinds of coding 03:07:43.720 |
classes and he'll keep doing that for certainly through 18. We'll see after that. Um, and so 03:07:50.280 |
like for sure that's the case. Um, but, but look, having said that, one of the things you can do 03:07:53.720 |
with an AI is say, teach me how to code. Right. And so, you know, there's a whole 03:07:59.880 |
bunch of, I'll name names, you know, Khan Academy, there's a whole 03:08:03.880 |
bunch of work that they're doing at Khan Academy for free. And then 03:08:06.760 |
we have this company Replit, which was originally built specifically for kids for 03:08:10.760 |
coding, that has AI built in. That's just absolutely extraordinary now. And then, 03:08:15.640 |
you know, there's a variety of other, of other systems like this. Um, and, uh, yeah, I mean, 03:08:20.200 |
the AI is going to be able to teach you to code. AI, by the way, is, as you know, spectacularly 03:08:24.120 |
good at explaining code. Right. And so, you know, the tools have these features now where you 03:08:30.040 |
can talk to the code base, and so you can literally ask the code base questions about 03:08:33.720 |
itself. And you can also just do the simple form, which is you can copy and paste code into 03:08:38.680 |
ChatGPT and just ask it to explain it, what's going on, rewrite it, improve it, 03:08:41.800 |
make recommendations. There are dozens of ways to do this. 03:08:46.520 |
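As a concrete, hedged example of the "paste code in and ask it to explain itself" workflow, here is a minimal sketch using the OpenAI Python client. The model name is a placeholder and the snippet is invented; any comparable chat API would work the same way.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

snippet = """
def mystery(xs):
    seen = set()
    out = []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
"""

# Ask the model to explain the snippet and suggest improvements.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a patient programming tutor."},
        {"role": "user", "content": f"Explain what this code does, then suggest "
                                    f"improvements:\n```python\n{snippet}\n```"},
    ],
)
print(response.choices[0].message.content)
```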
By the way, even more broadly than code: like, okay, you want to make 03:08:50.120 |
a video game. Okay. Now you can do AI art generation, sound generation, dialogue generation, 03:08:55.480 |
voice generation. Right. And so all of a sudden, like you don't need designers, you know, you don't 03:08:59.320 |
need, um, you know, voice, uh, actors, you know, so yeah. So there's just like unlimited and then, 03:09:05.080 |
you know, a big part of coding is so-called glue code, you know, it's 03:09:08.200 |
interfacing into other systems. So it's interfacing into, you know, Stripe to take payments or 03:09:12.280 |
something like that. And, you know, AI is fantastic at writing glue code. Um, so, you know, really, 03:09:17.640 |
really good at making sure that you can plug everything together. Really good at helping 03:09:20.760 |
you figure out how to deploy. Um, you know, it'll even write a business plan for you. Um, so it's 03:09:26.920 |
just this, it's like everything happening with AI right now. It's just, it's like this latent 03:09:29.960 |
superpower. And there's this incredible spectrum of people who have really figured out massive 03:09:34.440 |
performance increases, productivity increases with it already. There's other people who aren't even 03:09:37.960 |
aware it's happening. And there's some gearing to whether you're a coder or not, but I think 03:09:43.800 |
there are lots of non-coders that are off to the races. And I think there are lots of professional 03:09:47.320 |
coders who are still like, eh, you know, the blacksmiths were not necessarily in favor of, 03:09:52.040 |
you know, the car business. Um, so, uh, yeah, there's the old William Gibson quote, 03:09:57.640 |
the future is here. It's just not evenly distributed yet. And this is maybe the 03:10:01.160 |
most potent version of that that I've ever seen. Yeah. There's, you know, the old meme with 03:10:07.080 |
the bell curve, where the people on both extremes say AI coding is the future. 03:10:13.640 |
It's very common for programmers to say, you know, if you're any good of a programmer, 03:10:18.840 |
you're not going to be using it. That's just not true. Now, I consider myself a 03:10:23.480 |
reasonably good programmer, and my productivity has just skyrocketed, and the joy of programming 03:10:28.920 |
has skyrocketed. Every aspect of programming is more efficient, more productive, more fun, 03:10:36.760 |
all of that kind of stuff. I would also say, you know, of anything in 03:10:41.320 |
industrial society, code has the highest elasticity, which is to say the easier it is to 03:10:46.120 |
make it, the more of it gets made. Like I think effectively there's unlimited demand for 03:10:50.520 |
code. Like never, it's like, there's always some other idea for a thing that you can do, 03:10:55.400 |
a feature that you can add or a thing that you can optimize. Um, and so, and so like overwhelmingly, 03:11:01.560 |
you know, the amount of code that exists in the world is a fraction of even the ideas we have 03:11:04.680 |
today. And then we come up with new ideas all the time. Um, and so I, I think that like, 03:11:08.600 |
you know, I was, I was in the late eighties, early nineties when sort of automated coding 03:11:13.320 |
systems started to come out, expert systems, big deal in those days. And there were all these, 03:11:16.760 |
there was a famous book called The Decline and Fall of the American Programmer, which 03:11:20.280 |
predicted that these new coding systems were going to mean we wouldn't have programmers in 03:11:23.080 |
the future. And of course the number of programming jobs exploded by like a factor of a hundred. 03:11:26.760 |
Like my guess is we'll have more coding jobs, probably by like 03:11:30.920 |
an order of magnitude, 10 years from now. They will be different. There'll be different jobs. 03:11:35.640 |
They'll involve orchestrating AI. But we will be creating so 03:11:40.360 |
much more software that the whole industry will just explode in size. Are you seeing the size of 03:11:45.320 |
companies decrease in terms of startups? What's the landscapes of little tech? All we're seeing 03:11:51.720 |
right now is the AI hiring boom of all time for the big tech people and little tech. Everybody's 03:11:58.360 |
trying to hire as many engineers as they can to build AI systems. It's a hundred 03:12:02.840 |
percent. I mean, there's a handful of companies, a little bit in 03:12:06.680 |
customer service, you know, we have some companies and others, I think it's 03:12:10.360 |
Klarna that's publicizing a lot of this in Europe, where, you know, 03:12:15.800 |
there are jobs that can be optimized and jobs that can be automated. 03:12:19.320 |
But for engineering jobs, it's just an explosion of hiring. But at least so far, 03:12:25.560 |
there's no trace of any sort of diminishing effect. Um, now having said that, I am looking 03:12:29.800 |
forward to the day. I am waiting for the first company to walk in with, like, the more 03:12:34.760 |
radical form of it. So basically the companies that we see are one of two kinds. 03:12:38.840 |
I sometimes use the terms weak form and strong form. The weak form 03:12:44.840 |
companies, I sometimes use the term, it's called the sixth bullet point: AI is 03:12:49.240 |
the sixth bullet point on whatever they're doing. Sure. Right. And it's on the slide, right? So 03:12:53.560 |
they've got the, you know, whatever, dot, dot, dot, dot, and then AI is the sixth thing. And 03:12:56.200 |
the reason it has the sixth thing is because they had already previously written the slide before 03:12:59.080 |
the AI revolution started. And so they just added the sixth bullet point on the slide, 03:13:01.960 |
which is how you're getting all these products that have like the AI button up in the corner, 03:13:05.960 |
right? The little sparkly button. Yep. Right. Um, and all of a sudden Gmail is offering to 03:13:09.800 |
summarize your email, which I'm like, I don't need that. Like I need, I need you to answer my 03:13:14.120 |
email, not summarize it. Like what the hell? Okay. So we see those and that's fine. That's like, 03:13:18.920 |
I don't know, putting sugar on the cake or something. Um, but then we see the strong 03:13:23.000 |
form, which is the companies that are building from scratch for AI, right? And they're, they're 03:13:27.080 |
building it. I actually just met with a company that is building literally an AI email system as 03:13:30.760 |
an example. So just good. Oh, nice. I can't wait. Yeah. They're going to completely right. So the 03:13:35.800 |
very obvious idea of very smart team, um, you know, it's going to be great. You know, and then, 03:13:40.520 |
you know, notion just, uh, you know, another, not one of our companies, but just came out with a 03:13:43.640 |
product. So now companies are going to basically come through a sweep through and they're going to 03:13:47.080 |
do basically AI first versions of basically everything. And those are like companies built, 03:13:51.800 |
you know, AI is the first bullet point. It's the strong form of the argument. Yeah. Cursor is an 03:13:55.800 |
example of that. They basically said, okay, we're going to rebuild the thing with AI as the first 03:14:00.840 |
citizen. What if we knew from scratch that we could build on this? And again, 03:14:04.120 |
this is part of the full employment act for startups and VCs. It's just like, 03:14:09.880 |
if a technology transformation is sufficiently powerful, then you actually need to start the 03:14:13.960 |
product development process over from scratch because you need to reconceptualize the product. 03:14:17.400 |
And then usually what that means is you need a new company because most incumbents just, 03:14:21.320 |
just won't do that. Um, and so, yeah, so that's underway across many categories. Um, what I'm 03:14:26.040 |
waiting for is the company where it's like, no, our org chart is redesigned as a result of AI. 03:14:31.000 |
Right. And so I'm waiting for the company where it's like, no, we're going to have, 03:14:34.680 |
like, you know, and the cliche, here's a thought experiment, right? The cliche would be, 03:14:39.080 |
we're going to have like the human executive team, and then we're going to have the AIs be the 03:14:42.120 |
workers, right? So we'll have a VP of engineering supervising a hundred instances of 03:14:46.520 |
coding agents, right? Okay. Maybe, right. Or by the way, maybe the VP of engineering 03:14:52.600 |
should be the AI, maybe supervising human coders who are supervising AIs, right? Because one of the 03:14:58.200 |
things that AI should be pretty good at is managing, because it's, you know, 03:15:02.680 |
process driven. It's the kind of thing that AI is actually pretty good at, 03:15:05.960 |
right? The performance evaluation coaching. Um, and so should it be an AI executive team? 03:15:11.240 |
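A rough sketch of the "AI manager supervising coding agents" thought experiment. The call_model function is a stand-in for whatever model or agent API you would actually use; the point is the manager/worker split and the review loop, not the specific calls.

```python
from dataclasses import dataclass

@dataclass
class Task:
    description: str
    result: str = ""
    approved: bool = False

def call_model(role: str, prompt: str) -> str:
    """Placeholder for a real model or agent API call."""
    return f"[{role} output for: {prompt[:40]}...]"

def manager_loop(project_goal: str, num_workers: int = 3, max_rounds: int = 2):
    # The "manager" model breaks the goal into tasks for worker agents.
    plan = call_model("manager", f"Break '{project_goal}' into {num_workers} coding tasks")
    tasks = [Task(f"{plan} / task {i}") for i in range(num_workers)]

    for _ in range(max_rounds):
        for task in tasks:
            if task.approved:
                continue
            # A worker agent attempts the task.
            task.result = call_model("worker", task.description)
            # The manager reviews the result: the "performance evaluation" step.
            review = call_model("manager", f"Review this result: {task.result}")
            task.approved = "output" in review  # stand-in acceptance check
    return tasks

for t in manager_loop("build an AI-first email client"):
    print(t.approved, t.description[:60])
```

Flipping which role is human and which is a model is just a matter of where call_model gets swapped for a person.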
Um, and then, you know, and then of course the ultimate question, which is AI CEO, 03:15:14.920 |
right? Um, and then, you know, and then there's, and then maybe the most futuristic version of it 03:15:20.200 |
would be an actual AI agent that actually goes fully autonomous. Yeah. What if you really set 03:15:24.600 |
one of these things loose and let it, let it, uh, basically build itself a business. Um, and so I 03:15:29.080 |
will say like, we're, we're not yet seeing those. And I think there's a little bit of the systems 03:15:33.960 |
aren't quite ready for that yet. Um, and then I think it's a little bit of, you really do need 03:15:38.360 |
at that point, like a founder who's really willing to break all the rules, um, and really willing to 03:15:43.160 |
take the swing. And I, and I know people exist. And so I'm sure we'll see that. And some of it is, 03:15:47.800 |
as you know, with all the startups, it's the execution. The idea that you have an AI 03:15:52.680 |
first email client seems like an obvious idea, but actually creating one, executing, 03:15:58.600 |
and then taking on Gmail is really difficult. I mean, Gmail, it's fascinating 03:16:03.800 |
to see that Google can't do it. Because why? Because of the momentum, because it's hard to 03:16:09.560 |
re-engineer the entirety of the system. It feels like Google's perfectly positioned to do it. 03:16:15.320 |
Same with like, you have perplexity, uh, which I love, like Google could technically take on 03:16:21.080 |
perplexity and do it much better, but they haven't, not yet. So it's fascinating why that is 03:16:27.880 |
for large companies. I mean, that, that is an advantage for little tech. They can be agile. 03:16:34.680 |
Yeah. Little companies can break glass in a way big companies can't. 03:16:37.960 |
This is sort of the big breakthrough that Clay Christensen had in the innovators dilemma, 03:16:41.160 |
which is sometimes when big companies don't do things, it's because they're screwing up. And 03:16:44.680 |
that certainly happens. But a lot of times they don't do things because it would break too much 03:16:48.520 |
glass. Specifically, it would interfere with their existing customers 03:16:52.760 |
and their existing businesses. And they just simply won't do that. And by the way, 03:16:55.880 |
responsibly, they shouldn't do that. Right. But Clay Christensen's 03:17:02.440 |
big thing is they often don't adapt because they are well run, not because they're poorly run. 03:17:06.600 |
They're optimizing machines. They're optimizing against the existing business. 03:17:10.840 |
And as you kind of just said, this is like a permanent state of affairs for 03:17:15.160 |
large organizations. Like every once in a while, one breaks the pattern and actually does it. But 03:17:19.480 |
for the most part, like this is a very predictable form of human behavior. And this fundamentally is 03:17:23.240 |
why startups exist. It feels like 2025 is when the race for dominance in AI will see some winners. 03:17:31.160 |
Like it's a big year. So who do you think wins the race? Open AI, Meta, Google, XAI, 03:17:36.680 |
who do you think wins the AI race? I would say, I'm not going to predict. I'm going to say there's 03:17:41.000 |
questions all over the place. Um, and then we have the, we have this category of question we call the 03:17:44.680 |
trillion dollar question, which is like literally depending on how it's answered, people make or 03:17:48.600 |
lose a trillion dollars. And I think there's like, I don't know, five or $6 trillion questions right 03:17:52.600 |
now that are hanging out there, which is an unusually large number. And I just, you know, 03:17:57.080 |
I'll just hit a few of them and we can talk about them. So one is big models versus small models. 03:18:00.440 |
Another is open models versus closed models. Another is whether you can use synthetic data or 03:18:05.960 |
not. Another is chain of thought. How far can you push that and reinforcement learning. And then 03:18:11.400 |
another one is political trillion dollar questions, um, policy questions, which, you know, 03:18:16.120 |
the U S and the EU have both been flunking dramatically and the U S hopefully is about to 03:18:21.000 |
really succeed at, um, yeah. And then there's probably another, you know, half dozen big, 03:18:25.400 |
important questions after that. And so these are all just like, say this is an industry that's in 03:18:30.280 |
flux in a way that I think is even more dramatic than the ones I've seen before. And look, 03:18:30.280 |
the most obvious example of the flux: 03:18:35.560 |
sitting here less than three years ago, sitting here in December of '22, we would have said 03:18:39.320 |
that OpenAI is just running away with everything. And sitting here today, it's like, you know, 03:18:43.400 |
there's at least six, you know, world-class God model companies and teams that are by the way, 03:18:53.800 |
generating remarkably similar results. That's actually been one of the most shocking things 03:18:57.320 |
to me is like, it turns out that once, you know, that it's possible to build one incredibly smart 03:19:01.560 |
Turing test passing large language model, which was a complete shock and surprise, uh, to the 03:19:06.520 |
world. Um, it turns out within a year, you can have five more. Um, there's also a money component 03:19:11.720 |
thing to it, which is, um, to get the money, to scale one of these things into the billions of 03:19:15.800 |
dollars. There's basically right now only two sources of money that will do that for you. 03:19:19.160 |
One is, um, the hyperscalers giving you the money, which you turn around and round trip back to them. 03:19:23.960 |
Or, you know, foreign sovereigns, other countries' sovereign 03:19:27.560 |
wealth funds, which can be, you know, difficult in some cases for companies to 03:19:31.400 |
access. So there's maybe another trillion dollar question, which is the 03:19:36.280 |
financing question. Here's one. Uh, so Sam Altman has been public about the fact that he wants to 03:19:40.760 |
transition open AI from being a nonprofit to being a for-profit. Um, the way that that is legally 03:19:46.040 |
done is that there is a way to do it, there is a way in US law to do it. The IRS and other 03:19:51.560 |
government entities scrutinize this very carefully because the US takes foundation 03:19:56.120 |
nonprofit law very seriously because of the tax exemption. And so, historically, 03:20:01.080 |
the way that you do it is you start a for-profit and then you, you raise money with the for-profit 03:20:05.480 |
to buy the assets of the nonprofit at fair market value. Um, and you know, the last financing round 03:20:11.480 |
at OpenAI was, you know, 150-some billion dollars. And so logically, if the flip is going 03:20:17.480 |
to happen, the for-profit has to go raise $150 billion out of the chute to buy the assets, 03:20:22.040 |
you know, raising 150 billion is a challenge. Um, so, you know, is that even possible? If that is 03:20:29.160 |
possible, then OpenAI maybe is off to the races as a for-profit company. If not, you know, 03:20:34.040 |
you know, I don't know. And then, you know, obviously the Elon lawsuit. So, 03:20:36.840 |
so just because they're the market leader today, you know, there's big important questions there, 03:20:40.040 |
you know, Microsoft has this kind of love hate relationship with them. Where does that go? 03:20:44.600 |
Apple's, you know, lagging badly behind, but you know, they're very good at catching up. 03:20:48.280 |
Amazon, you know, is primarily hyperscaler, but they now have their own models. 03:20:51.800 |
And then there's the other questions like you laid out brilliantly briefly and brilliantly 03:20:56.280 |
open versus closed big versus little models, synthetic data. That's a huge, huge question. 03:21:02.120 |
And then test-time compute with chain of thought, the role of that is fascinating. 03:21:08.120 |
These are, I think it's fair to say, trillion dollar questions. 03:21:11.160 |
Yeah. These are big, like, look, you know, it's like, okay, here's a trillion dollar question, 03:21:14.120 |
which is kind of embedded in that, which is just hallucinations, right? Like, so 03:21:17.240 |
if you are trying to use these tools creatively, you're thrilled because they can draw new images 03:21:22.520 |
and they can make new music and they can do all this incredible stuff, right? They're creative. 03:21:26.840 |
The flip side of that is if you need them to be correct, they can't be creative. And that's, 03:21:30.600 |
you know, the term hallucination and these things do hallucinate. And, um, you know, 03:21:36.120 |
there have been, you know, court cases already where lawyers have submitted legal briefs that 03:21:39.800 |
contain made up court citations and case citations. The judge is like, wait a minute, 03:21:43.640 |
this doesn't exist. And the very next question is, did you write this yourself? And the lawyer goes, 03:21:47.800 |
I mean, that's why Elon with Grock, uh, looking for truth. I mean, that's an 03:21:54.280 |
open technical question. How close can you get to truth with LLMs? 03:21:58.600 |
Yeah, that's right. And, and I, I, my, my sense, um, is this very contentious topic 03:22:02.840 |
at the industry. My sense is if to the extent that there is a domain in which there is a 03:22:07.720 |
definitive and checkable and provable answer, and you might say math satisfies that coding 03:22:12.040 |
satisfies that, and maybe some other fields, then you should be able to generate synthetic data. 03:22:16.760 |
You should be able to do chain of thought reasoning. You should be able to do reinforcement 03:22:19.320 |
learning and you should be able to ultimately, you know, eliminate hallucinations. Um, for, 03:22:24.760 |
but by the way, that's a trillion dollar question right there as to whether that's true. 03:22:27.720 |
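A minimal sketch of the "definitive and checkable answer" idea: sample candidate answers, keep only the ones an automatic checker verifies, and the survivors become synthetic training data or a reward signal. The toy arithmetic generator stands in for a model; the checker is the part that matters.

```python
import random

def generate_candidates(question: dict, n: int = 8) -> list[int]:
    """Stand-in for sampling n answers from a model (some right, some wrong)."""
    true = question["a"] + question["b"]
    return [true + random.choice([-2, -1, 0, 0, 0, 1, 2]) for _ in range(n)]

def checker(question: dict, answer: int) -> bool:
    """The verifiable part: in arithmetic, correctness is simply checkable."""
    return answer == question["a"] + question["b"]

def build_synthetic_data(questions: list[dict]) -> list[tuple[dict, int]]:
    dataset = []
    for q in questions:
        # Rejection sampling: only verified answers are kept for training.
        verified = [a for a in generate_candidates(q) if checker(q, a)]
        dataset.extend((q, a) for a in verified)
    return dataset

questions = [{"a": random.randint(0, 99), "b": random.randint(0, 99)} for _ in range(5)]
data = build_synthetic_data(questions)
print(f"kept {len(data)} verified (question, answer) pairs")
```

In domains without such a checker, philosophy, say, there is nothing to reject against, which is exactly the asymmetry being described.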
Um, but then, but then there's questions like, okay, is that going to work in the more general 03:22:30.920 |
domain? Like, so for example, one possibility is these things are going to get truly superhuman 03:22:35.320 |
at like math and coding, but at like discussing philosophy, they're going to just, they're 03:22:40.280 |
basically as smart as they're ever going to be. Um, and they're going to be kind of, you know, 03:22:44.120 |
say midwit grad student level. Um, and, and the theory there would just be, 03:22:47.560 |
they're already out of training data. Like, literally, if you 03:22:50.680 |
talk to these people, literally the big models 03:22:53.560 |
are like within a factor of 2x of consuming all the human-generated training data, to the 03:22:57.000 |
point that some of these big companies are literally hiring people like doctors and lawyers 03:23:00.200 |
to sit and write your training data by hand. And so does this mean that like, you have to, 03:23:04.120 |
if you want your model to get better at philosophy, you have to go hire like a thousand 03:23:06.680 |
philosophers and have them write new content. And is anybody going to do that? And so, you know, 03:23:11.240 |
maybe, maybe these things are topping out in certain ways and they're going to leap way ahead 03:23:14.120 |
in other ways. And so anyway, we just don't know. I guess, I'll tell you, 03:23:19.160 |
maybe my main conclusion is: anybody 03:23:22.920 |
telling you these big sweeping conclusions, you know, all of this 03:23:26.920 |
abstract, generalized superintelligence, AGI stuff, you know, maybe it's the engineer in me, 03:23:31.400 |
but like, no, that's too abstract. Like it's got to actually 03:23:36.920 |
work. Um, and then by the way, it has to actually be able to pay for it. Um, I mean, this is a 03:23:42.920 |
problem right now with the, you know, the big models, the big models that are like really good 03:23:45.800 |
at coding and math, they're actually very expensive to run. You know, they're quite slow. 03:23:49.880 |
Another trillion dollar question: future chips, which I know you've talked a lot about. 03:23:55.000 |
Another trillion dollar question: yeah, I mean, all the global issues. Oh, 03:23:59.720 |
another trillion dollar question: censorship, right? And all the, 03:24:05.720 |
as they say, all the human feedback training process: exactly what are you training 03:24:11.400 |
these things to do? What are they allowed to talk about? 03:24:15.800 |
How often do they give you these incredibly preachy moral lectures? Here's a 03:24:20.840 |
good trillion dollar question: how many other countries want their country to 03:24:25.160 |
run its education system, healthcare system, news system, political system on the basis of an AI 03:24:29.960 |
that's been trained according to the most extreme left-wing California politics? Right. Because 03:24:35.960 |
that's kind of what they have on offer right now. And I think the answer to that is not very many. 03:24:39.400 |
So there's like massive open questions there about like what, you know, and by the way, 03:24:45.560 |
like what morality are these things going to get trained on. 03:24:47.880 |
And that one we're cracking wide open with, uh, what's been happening over the past few months, 03:24:54.360 |
censorship on every level of these companies. And just the very idea of what truth means and what it 03:25:01.640 |
means to expand the Overton window of LLMs or the Overton window of human discourse. 03:25:07.000 |
So what I experienced, you know, going back to how we started, was, 03:25:10.760 |
all right: social media censorship regime from hell, debanking, right, at like a large scale, 03:25:16.920 |
and then the war on the crypto industry trying to kill it, and then basically a declared intent to 03:25:22.200 |
do the same thing to AI, to put AI under the same kind of censorship and control regime as 03:25:27.240 |
social media and the banks. And in America, 03:25:32.120 |
I think this election tips us from a timeline in which things were going to get really bad on that 03:25:36.520 |
front, uh, to a timeline in which I think things are going to be quite good. But look, those same 03:25:40.600 |
questions also apply outside the U S and you know, the EU is doing their thing. They're being 03:25:45.080 |
extremely draconian and they're trying to lock in a political censorship regime on AI right now 03:25:49.640 |
that's so harsh that even American AI companies are not even willing to launch new products in 03:25:52.840 |
the EU right now. Like that's not going to last, but like what, what happens there, 03:25:57.720 |
right. And what, what are the trade-offs, you know, what levels of censorship are American 03:26:01.320 |
companies going to have to sign up for if they want to operate in the EU or is the EU still 03:26:05.160 |
capable of generating its own AI companies or have we brain drained them so that they can't. 03:26:12.040 |
So big questions. Uh, quick questions. So you're very active on X, a very unique character. Um, 03:26:21.080 |
flamboyant, exciting, bold. Uh, you post a lot. I think there's a meme. I don't remember it 03:26:29.960 |
exactly, but Elon posted something like, uh, inside Elon, there are two wolves. One is, uh, 03:26:36.280 |
please be kind or more positive. And the other one is, I think, uh, you know, doing the, uh, 03:26:43.400 |
take a big step back and fuck yourself in the face guy. How many wolves are inside your mind 03:26:49.800 |
when you're tweeting? To be clear, a reference from the comedy classic, Tropic Thunder, 03:26:53.960 |
Tropic Thunder. Yeah. Legendary movie. Yes. Um, any zoomers listening to this, who haven't seen 03:27:01.160 |
that movie, go watch it immediately. Yeah. There's nothing offensive about it 03:27:05.240 |
at all. Um, Tom Cruise's greatest performance. Um, so, yeah, no, look, I'll just start by 03:27:16.520 |
saying like, I'm not supposed to be tweeting at all. So, uh, yeah, yes, yes, yes. And so, 03:27:20.600 |
you know, how do you approach that? Like how do you approach what to tweet? I mean, 03:27:26.440 |
I don't do it well enough. Um, it's mostly an 03:27:31.720 |
exercise in frustration. Um, look, there's a glory to it and there's an 03:27:35.560 |
issue with it. And the glory of it is, you know, instantaneous global communication, and 03:27:39.160 |
X in particular is like the town square on all these, you know, 03:27:43.240 |
social issues, political issues, everything else, current events. Um, but I mean, look, 03:27:47.240 |
there's no question the format of at least the original tweet is, you know, 03:27:50.760 |
prone to be inflammatory. You know, I'm the guy who at one point the entire nation 03:27:54.840 |
of India hated. Um, because I once tweeted something that, it turned out, is still politically 03:27:59.080 |
sensitive. Um, and the entire continent, um, I stayed up all night that night as I became the 03:28:03.960 |
front page headline and leading television news in each time zone in India for a single tweet. 03:28:09.000 |
So like the single tweet out of context is a very dangerous thing. Um, obviously 03:28:15.640 |
X now has the middle ground where they, you know, now have the longer form essays. Um, 03:28:19.880 |
and so, um, you know, probably the most productive thing I can do is 03:28:23.160 |
longer form things. Um, you're not going to do it though. I do from time to 03:28:28.600 |
time. I should do more of them. And then, yeah, I mean, look, 03:28:32.360 |
obviously X is doing great. And then, uh, like I said, Substack, you know, has become 03:28:36.200 |
the center for a lot of, I think, the best kind of, you know, deeply thought 03:28:40.200 |
through, you know, certainly intellectual content, um, you know, tons of current events 03:28:44.840 |
stuff there as well. Um, and then there's a bunch of other, 03:28:48.600 |
you know, a bunch of new systems that are very exciting. So I think one of the things we can 03:28:51.480 |
look forward to in the next four years is, number one, just like a massive reinvigoration of social 03:28:54.920 |
media as a consequence of the changes that are happening right now. Uh, I'm very excited to see 03:28:59.160 |
what's going to happen with that. And then, um, I mean, it's happened on X, but 03:29:02.840 |
it's now going to happen on other platforms. And then, um, the other is, um, crypto is going to 03:29:08.680 |
come right back to life. Um, and actually that's very exciting. 03:29:12.360 |
Actually, it's worth noting that's another trillion-dollar question on AI, which is, um, 03:29:15.960 |
in a world of pervasive AI, and especially in a world of AI agents, and imagine a world of billions 03:29:21.400 |
or trillions of AI agents running around, they need an economy. Um, and crypto, in our view, 03:29:27.800 |
happens to be the ideal economic system for that, right? Cause it's programmable money. It's a 03:29:31.320 |
very easy way to plug in and do that. And there's this transaction processing system 03:29:35.160 |
that can do that. And so I think the crypto AI intersection, you know, is potentially a very, 03:29:39.080 |
very big deal. Um, and so that was going to be impossible under the prior regime, and I think under the new regime, hopefully it'll be something we can do. 03:29:44.840 |
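To make the "AI agents need an economy" point concrete, here is a minimal sketch in Python. Everything in it is hypothetical (a toy in-memory ledger, made-up agent names, a stubbed-out piece of work), not any real blockchain, token, or SDK; the point is only that "programmable money" means a payment is just a function call an agent can make before or after doing work for another agent.

```python
# Hypothetical toy sketch, not any real blockchain, token, or SDK: a minimal in-memory
# "programmable money" ledger, plus two made-up agents where one pays the other for work.

from dataclasses import dataclass, field

@dataclass
class ToyLedger:
    """Stand-in for a programmable transaction-processing system."""
    balances: dict = field(default_factory=dict)

    def fund(self, account: str, amount: int) -> None:
        self.balances[account] = self.balances.get(account, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> bool:
        # "Programmable money" here just means a payment is a function call an agent can make.
        if self.balances.get(sender, 0) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True

def summarize(text: str) -> str:
    """Stub for the work one agent sells to another (imagine a model call here)."""
    return text[:40] + "..."

if __name__ == "__main__":
    ledger = ToyLedger()
    ledger.fund("buyer_agent", 100)

    price = 5
    # The buyer agent pays the worker agent, then receives the service.
    if ledger.transfer("buyer_agent", "worker_agent", price):
        print(summarize("A long document the buyer agent wants condensed into a short note."))
        print(ledger.balances)  # {'buyer_agent': 95, 'worker_agent': 5}
```

In a real deployment the ledger would be an actual chain or payments API and the stub would be a model call, but the control flow an agent needs is roughly this simple, which is the sense in which crypto is described above as easy to plug in.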
Almost for fun, let me ask about a friend of yours, Yann LeCun. 03:29:49.160 |
What are your top 10 favorite things about Yann LeCun? 03:29:54.520 |
He's, uh, I think he's a brilliant guy. I think he's important to the world. I think you 03:30:02.040 |
guys disagree on a lot of things. Uh, but I personally like vigorous disagreement. I, 03:30:07.240 |
as a person in the stands, like to watch the gladiators go at it. 03:30:10.600 |
And no, he's a super genius. I mean, look, I wouldn't say we're super close, but you know, 03:30:15.240 |
casual friends. I worked with him at Meta, you know, he's been the chief scientist at 03:30:18.920 |
Meta for a long time and he still, you know, works with us. And, um, you know, he's 03:30:23.960 |
obviously a legendary figure in the field and one of the main people responsible for what's 03:30:27.080 |
happening. Um, my serious observation would be, it's the thing 03:30:32.520 |
I've talked to him about for a long time, and I keep trying to read and follow everything he does, 03:30:35.720 |
he's probably, I think, see if you agree with this, the smartest and most 03:30:42.280 |
credible critic of LLMs as the path for AI. Um, and he's not, you know, there's certain, 03:30:48.360 |
I would say, troll-like characters who are just like crapping on everything. But like Yann has 03:30:51.880 |
very deeply thought through, basically, theories as to why LLMs are an evolutionary dead end. Um, 03:30:58.440 |
and I actually try to do this thing where I try to model, you know, 03:31:03.240 |
I try to have a mental model of like the two different sides of a serious argument. And so 03:31:05.960 |
I've tried to like internalize that argument as much as I can, which is difficult 03:31:09.400 |
cause like we're investing behind LLMs as aggressively as we can. And so if he's right, 03:31:12.760 |
like that can be a big problem, but like, we should also know that. Um, and then I sort of 03:31:18.040 |
use his ideas to challenge all the bullish people, you know, to really kind of test their level of 03:31:23.480 |
knowledge. So I like to kind of grill people. Like, you know, 03:31:27.800 |
I got my CS degree 35 years ago, so I'm not like deep in the technology, 03:31:33.320 |
but to the extent I can understand Yann's points, I can use them to, um, you know, 03:31:37.960 |
to really surface a lot of the questions for the people who are more bullish. Um, and that's been, 03:31:41.560 |
I think very, very productive. Um, yeah. So, yeah, it's just very striking that you have 03:31:45.880 |
somebody who is like that central in the space who is actually like a full-on skeptic. 03:31:51.480 |
And, you know, again, this could go different ways. He could end up 03:31:54.840 |
being very wrong. He could end up being totally right. Or it could be that he will provoke 03:31:59.400 |
the evolution of these systems to be much better than they would have been. 03:32:02.600 |
Yeah. He could be both right and wrong. First of all, I do agree with that. 03:32:06.120 |
He's one of the most legit and, uh, rigorous and deep critics of the LLM path to AGI, 03:32:13.480 |
you know, his basic notion is that AI needs to have some 03:32:17.480 |
understanding of the physical world, and that's very difficult to achieve with LLMs. 03:32:22.680 |
And that is a really good way to challenge the limitations of LLMs and so on. He's also been 03:32:29.000 |
a vocal and huge proponent of open source, which is a whole other topic, which you have been as well. 03:32:42.920 |
He embodies, he also has many wolves inside his mind. 03:32:46.120 |
Yes, he does. Yes, he does. Yes, he does. Yes, he does. 03:32:50.120 |
The other two, okay, here's my other wolf coming out. 03:32:52.680 |
The other two of the three godfathers of AI are like radicals, like full-on left, 03:32:58.920 |
you know, far left, you know, I would say either Marxists or borderline Marxists. 03:33:03.960 |
And they're like, I think quite extreme in their social and political views. 03:33:10.040 |
And I think, you know, they are lobbying for, like, draconian, 03:33:13.240 |
what I think would be ruinously destructive, government legislation and regulation. 03:33:18.760 |
It's super, super helpful to have you on as a counterpoint to those two. 03:33:21.560 |
Another fun question. Our mutual friend, Andrew Huberman. 03:33:25.880 |
First, maybe what do you love most about Andrew? 03:33:28.280 |
And second, what score on a scale of one to 10, 03:33:31.320 |
do you think he would give you on your approach to health? 03:33:35.800 |
Physical, three. You think you'd score that high, huh? 03:33:40.920 |
Exactly. Well, so he did, he convinced me to stop drinking alcohol, um, which was a big deal. 03:33:46.840 |
Well, it was like my favorite, other than my family, it was my favorite thing in the world. 03:33:50.360 |
And so it was a major, major reduction. Like having a glass of scotch at night was, like, 03:33:54.680 |
the thing I would do to relax. And so he has profoundly negatively 03:33:57.960 |
impacted my emotional health. Um, I blame him for making me much less happy as a person, 03:34:03.640 |
but much, much, much healthier, uh, physically healthier. So that I credit him with 03:34:09.320 |
that. I'm glad I did that. Um, but then his sleep stuff, like, yeah, I'm not doing any of that. 03:34:14.200 |
I have no interest in his sleep shit. Like, no, this whole light, natural light thing. No, 03:34:19.400 |
we're not doing that. Too hardcore. I don't see any natural 03:34:23.320 |
light in here. It's all covered. It's all horrible. And I'm very happy. I would be very 03:34:29.320 |
happy living and working here because I'm totally happy without natural light. 03:34:32.360 |
In darkness. That must be a metaphor for something. 03:34:35.480 |
Yes. It's a test. Look, it's a test of manhood as to whether you can have a blue screen in your 03:34:38.600 |
face for three hours and then go right to sleep. Like, I don't understand why you wouldn't want that. 03:34:43.160 |
Um, I now understand what they mean by toxic masculinity. All right. So, uh, let's see. 03:34:54.520 |
You're exceptionally successful by most measures, but what to you is the definition of success? 03:35:02.280 |
I would probably say it is a combination of two things. I think it is, um, contribution. 03:35:09.720 |
Um, so, you know, have you done something that mattered ultimately? Um, and, 03:35:16.840 |
you know, specifically mattered to people. Um, and then the other thing is, I think happiness is 03:35:21.160 |
either overrated or almost a complete myth. Um, and in fact, interestingly, Thomas Jefferson did 03:35:26.840 |
not mean happiness the way that we understand it when he said pursuit of happiness in the 03:35:30.120 |
Declaration of Independence. He meant it more in the Greek sense, um, which is closer to 03:35:35.240 |
satisfaction, uh, or fulfillment. Um, and so I think about happiness as: 03:35:41.800 |
the first ice cream cone makes you super happy. The first mile of the walk in 03:35:47.160 |
the park during sunset makes you super happy. The first kiss makes you super happy. The thousandth 03:35:52.680 |
ice cream cone, not so much. Um, the thousandth mile of the walk through the park. 03:35:59.240 |
Um, the thousandth kiss can still be good, but maybe just not right in a row. Um, right. And so 03:36:05.560 |
happiness is this very fleeting concept. Um, and the people who anchor on happiness seem to 03:36:09.720 |
go off the rails pretty often. Sort of the deep sense of having been, I don't know how to put it, 03:36:16.040 |
useful. So that's a good place to arrive at in life. Yeah, I think so. Yeah. I mean, 03:36:25.080 |
can you sit, yeah, you know, who was it who said the source of all the ills in the 03:36:29.480 |
world is man's inability to sit in a room by himself doing nothing? Um, but like if you're 03:36:34.520 |
sitting in a room by yourself, you know, at four in the morning, 03:36:37.480 |
and it's like, all right, have I, you know, lived up to my expectation of myself? 03:36:40.840 |
Like if you have, you know, the people I know who feel that way are pretty centered. Um, and, 03:36:46.200 |
um, you know, generally seem very, um, I don't know how to put it, pleased, you know, proud, 03:36:52.520 |
um, calm, at peace. Um, the people who are, um, you know, sensation seekers, um, 03:36:58.840 |
you know, some of the sensation seeking, by the way, you know, there's certain 03:37:02.680 |
entrepreneurs, for example, who are like into every form of extreme sport and they get, you 03:37:06.040 |
know, huge satisfaction out of that. Um, or, you know, there's sensation seeking in sort of useful 03:37:10.360 |
and productive ways. You know, Larry Ellison was always like that. Zuckerberg is like that. 03:37:13.560 |
And then, you know, there's a lot of entrepreneurs who end up in, you know, drugs, 03:37:19.960 |
you know, sexual escapades that seem like they'll be fun at first and then 03:37:24.920 |
backfire. Yeah. But at the end of the day, if you're able to be at peace by yourself in a room 03:37:30.600 |
at 4 a.m. Yeah. And I would even say happy, but, I know, I understand Thomas Jefferson didn't mean it 03:37:36.600 |
the way maybe I mean it, but I can be happy by myself at 4 a.m. with a blue screen. 03:37:43.160 |
That's good. Exactly. Staring at cursor. Exactly. 03:37:46.440 |
As a small tangent, a quick shout out to an amazing interview you did with Bari Weiss, 03:37:54.040 |
and just to her in general, Bari Weiss of the Free Press. She has a podcast called Honestly 03:37:59.560 |
with Bari Weiss. She's great. People should go listen. You were asked if you believe in God. 03:38:05.640 |
One of the joys, see, we talked about happiness. One of the things that makes me happy 03:38:11.640 |
is making you uncomfortable. Thank you. So this question is designed for that; many of the questions 03:38:16.600 |
today have been designed for that. You were asked if you believe in God, and you said, after a pause, 03:38:21.560 |
you're not sure. So it felt like the pause, the uncertainty there was 03:38:27.720 |
some kind of ongoing search for wisdom and meaning. 03:38:32.760 |
Are you in fact searching for wisdom and meaning? I guess I put it this way. There's a lot 03:38:40.440 |
to just understand about people that I feel like I'm only starting to understand. 03:38:45.960 |
And that's certainly a simpler concept than God. So that's what I've spent a lot of the last 15 03:38:55.640 |
years trying to figure out. I feel like I spent my first, whatever, 30 years figuring out machines. 03:39:00.040 |
And then now I'm spending 30 years figuring out people, which turns out to be quite a bit more 03:39:03.000 |
complicated. And then I don't know, maybe God's the last 30 years or something. And then look, 03:39:10.760 |
I mean, just like Elon is just like, okay, the known universe is very complicated and mystifying. 03:39:17.160 |
I mean, every time I get super into astronomy, it's like, "Daddy, 03:39:21.320 |
how many galaxies are there in the universe?" And how many galaxies are there in the universe? 03:39:32.280 |
Like how is that freaking possible? It's such a staggering concept that I... 03:39:38.840 |
I actually wanted to show you a tweet that blew my mind from Elon from a while back. 03:39:42.920 |
Elon said, "As a friend called it, this is the ultimate skill tree. This is a wall of galaxies 03:39:50.840 |
a billion light years across." So these are all galaxies. 03:39:55.560 |
Yeah. How is it that big? How the hell? I can read the textbook and the this and the that and 03:40:02.920 |
the whatever, 13.8 billion years and the Big Bang and the whole thing. And then it's just like, 03:40:06.280 |
all right, wow. And then it's like, all right, the Big Bang. All right. What was before the Big Bang? 03:40:11.640 |
You think we humans will ever colonize a galaxy and maybe even go beyond? 03:40:19.240 |
Sure. I mean, yeah. I mean, in the fullness of time. Yeah. 03:40:21.800 |
So you have that kind of optimism, you have that kind of hope that extends across thousands of 03:40:25.480 |
years. In the fullness of time. I mean, yeah. I mean, yeah. You know all the problems, all the 03:40:27.960 |
challenges with it that I do, but like, yeah, why not? I mean, again, in the fullness of time, 03:40:32.280 |
it'll take a long time. You don't think we'll destroy ourselves? 03:40:34.360 |
No, I doubt it. I doubt it. And fortunately, we have Elon giving us the backup plan. So I don't 03:40:41.640 |
know. I grew up, you know, rural Midwest sort of just like conventionally kind of Protestant 03:40:44.840 |
Christian. It never made that much sense to me. Got trained as an engineer and a scientist. I'm 03:40:49.240 |
like, "Oh, that definitely doesn't make sense." I'm like, "I know. I'll spend my life as an 03:40:52.520 |
empirical rationalist and I'll figure everything out." And then again, you walk up against these 03:40:57.800 |
things, you bump up against these things and you're just like, "All right. Okay. I guess 03:41:02.840 |
there's a scientific explanation for this, but like, wow." And then there's like, "All right, 03:41:08.360 |
where did that come from?" Right. And then how far back can you go on the causality chain? 03:41:12.680 |
Yeah. And then, yeah. I mean, then even just experiences that we all have on earth, 03:41:17.880 |
it's hard to rationally explain it all. And then, so yeah, I guess I just say I'm 03:41:21.960 |
kind of radically open-minded, at peace with the fact that I'll probably never know. 03:41:26.200 |
The other thing though that's happened, and maybe the more practical answer to the question is, 03:41:30.680 |
I think I have a much better understanding now of the role that religion plays in society than 03:41:35.320 |
I had when I was younger. And my partner, Ben, has a great line, I think he quotes his father on 03:41:40.920 |
this. He's like, "If a man does not have a real religion, he makes up a fake one." And the fake 03:41:46.040 |
ones go very, very badly. And so, it's actually really funny, there's this 03:41:52.520 |
class of intellectual that has what appears to be a very 03:41:55.240 |
patronizing point of view, which is, "Yes, I'm an atheist, but it's very important that the people 03:41:59.880 |
believe in something." Right. And Marx had the negative view on that, which was religion is the 03:42:05.400 |
opiate of the masses. But there's a lot of right-wing intellectuals who are themselves, 03:42:08.520 |
I think, pretty atheist or agnostic that are like, "It's deeply important that the people 03:42:11.320 |
be Christian," or something like that. And on the one hand, it's like, "Wow, that's arrogant 03:42:15.960 |
and presumptuous." But on the other hand, maybe it's right, because what we have learned in the 03:42:21.640 |
last hundred years is that in the absence of a real religion, people will make up fake ones. 03:42:25.560 |
There's this writer, this political philosopher, who's super interesting on this, 03:42:30.360 |
named Eric Voegelin. And he wrote in the mid-to-late part of the 20th century. 03:42:37.000 |
He was born in, I think, 1900 and died in '85. So he saw the complete run of communism and 03:42:41.720 |
Nazism and himself fled. I think he fled Europe and the whole thing. 03:42:47.640 |
And his big conclusion was basically that both communism and Nazism and fascism were basically 03:42:55.240 |
religions, but in the deep way of religions. He called them political religions, but they 03:43:01.400 |
were actual religions. And they were what Nietzsche forecasted when he said, "God is 03:43:06.600 |
dead. We've killed him and we won't wash the blood off our hands for a thousand years," 03:43:10.440 |
is we will come up with new religions that will just cause mass murder and death. 03:43:14.760 |
And you read his stuff now and you're like, "Yep, that happened." And then of course, 03:43:20.680 |
as fully evolved moderns, of course, we couldn't possibly be doing that ourselves right now, 03:43:24.840 |
but of course we are. And I would argue, Eric Voegelin for sure would argue, that the last 10 03:43:30.680 |
years we have been in a religious frenzy, that woke has been a full-scale religious frenzy and 03:43:36.680 |
has had all of the characteristics of a religion, including everything from patron saints to holy 03:43:40.840 |
texts to sin. Wokeness has every single aspect of an actual religion other than redemption, 03:43:51.800 |
which is maybe the most dangerous religion you could ever come up with, the one where there's 03:43:57.560 |
no forgiveness. And so I think if Voegelin were alive, I think he would have zeroed right in on 03:44:02.200 |
that, would have said that. And we just sailed right off. As I mentioned earlier, we somehow 03:44:07.800 |
rediscovered the religions of the Indo-Europeans; we're all into identity politics and environmentalism. 03:44:12.040 |
I don't think that's an accident. So anyway, there is something very deep going on in the 03:44:19.640 |
human psyche on religion that is not dismissible and needs to be taken seriously, even if one 03:44:29.720 |
struggles with the specifics of it. I think I speak for a lot of people when I say it's been a real 03:44:35.640 |
joy, and for me an honor, to get to watch you seek to understand the human psyche now that, as you described, 03:44:43.240 |
you're in that 30-year part of your life. And it's been an honor to talk with you today. 03:44:49.720 |
Thank you, Lex. Is that it? How long is that? 03:44:54.440 |
Four hours with Marc Andreessen. It's like 40 hours of actual content. 03:45:01.400 |
For the listener, Marc looks like he's ready to go for 20 more hours and I need a nap. Thank you, 03:45:11.560 |
Thanks for listening to this conversation with Marc Andreessen. To support this podcast, 03:45:17.000 |
please check out our sponsors in the description. And now let me leave you with some words from 03:45:22.440 |
Thomas Sowell. "It takes considerable knowledge just to realize the extent of your own ignorance." 03:45:30.440 |
Thank you for listening and hope to see you next time.