E48: The role of decentralization, China/US breakdown & more with Bestie Guestie Balaji Srinivasan
Chapters
0:00 Bestie Guestie Balaji Srinivasan is introduced
6:01 Balaji's day in the life, Coinbase background, comparing crypto with early 2000s P2P music industry, China's lawful evil
16:09 China declares crypto transactions illegal, comparing and contrasting the US and China's future outlook
39:43 Chances of a revolution in China, predictions for the next 1-2 decades
48:37 Decentralized citizen journalism, the end of corporate journalism
58:11 Facebook's tough stretch continues: leaker might come forward with SEC, lawsuit over FTC payouts, public sentiment turning on big tech, decentralized social media
78:43 How decentralized media platforms would work mechanically
00:00:00.000 |
So we have Balaji here. And obviously Balaji is an expert 00:00:04.640 |
in everything. But Balaji, you're used to being interviewed 00:00:09.120 |
one on one. I've had you on my podcast a couple years ago, we 00:00:13.180 |
had many great discussions, you've been on a bunch of other 00:00:15.220 |
people's, but today we'll see how you do on the squad here with 00:00:20.000 |
five people passing the ball. You've listened to the pod 00:00:24.020 |
Yeah, I'll pass. So 20% each or whatever. It's fine. 00:00:27.320 |
Okay, perfect. Well, we will have all in stats. We'll get 00:00:31.160 |
Good luck trying to get 20% with this moderator. Says the guy 00:00:39.360 |
Look, you're supposed to be a non-shooting point guard, 00:00:42.080 |
Jason. Nobody wants to see you brick three-pointer after three- 00:00:47.720 |
pointer. Just bring the ball down the court and pass it, just 00:00:54.560 |
That's coming from the guy who has 24% of the ball. 00:00:58.880 |
Yeah, exactly. I fit exactly what I'm supposed to be doing, 00:01:03.360 |
which is one quarter is statistically proven. 00:01:13.080 |
Hey, everybody, welcome to another episode of 00:01:26.580 |
the All-In Podcast. With us again this week, Chamath 00:01:30.420 |
Palihapitiya, the dictator, in just a great sweater. And David 00:01:36.480 |
you should touch this, you have no idea the material this is 00:01:39.300 |
made from. It's made from like the chin hair of like a 00:01:43.980 |
baby goat that's been plucked by a Tibetan Sherpa 00:01:49.620 |
who literally is forced to use lotion and camphor to keep their hands 00:01:56.280 |
soft so that they don't disturb the innate properties of the 00:02:01.500 |
It's amazing and available right now, two for one at Kmart, $14.95. 00:02:06.300 |
I mean, I'm just going based on the aesthetics of the look. David 00:02:09.960 |
Friedberg is back, the Queen of Quinoa, recently having his office 00:02:15.600 |
hit by a skunk. Are you okay, David? Are you gonna be okay for 00:02:18.780 |
Be all right, just getting used to the condition. 00:02:26.040 |
You're on bandage. Really? Did you get hit by the skunk or just 00:02:29.800 |
Well, my windows were open. Apparently it's baby skunk 00:02:32.260 |
season here in Northern California. So there were some 00:02:46.820 |
From... definitely too soon. And from a nondescript Motel 8 00:02:55.800 |
in Texas. David Sacks, a coastal elite in Texas. 00:03:03.900 |
On mute. You're on mute, David. This is episode 48. You 00:03:10.380 |
I happen to be in Texas. That's correct. It happens to be a 00:03:15.720 |
state where you can carry a gun now. Yeah, 00:03:25.560 |
Are no longer allowed to make decisions for their own body. 00:03:34.020 |
David, are you considering moving to the great state of 00:03:40.080 |
Um, well, I'm open to the possibility. Let's just say I'm 00:03:46.780 |
But, you know, I recently ran a Twitter survey where I asked 00:03:50.040 |
Miami or Austin and Miami won by about 52%. Austin got 00:03:55.320 |
about 48%. With over 10,000 votes. So it's pretty 00:03:59.480 |
And do you have a preference yourself between the two? 00:04:01.860 |
I personally like Miami. I already have a place in Miami, 00:04:05.580 |
but I'm checking out Austin to serve. Yeah, just for 00:04:10.260 |
due diligence, just for complete due diligence. So I think at 00:04:16.260 |
this point, we should just go through the cities that Sacks 00:04:18.780 |
doesn't have a home in. Can we just... that might be easier for us to 00:04:25.080 |
All right, listen, people started haranguing us on the 00:04:29.400 |
Twitter to have some bestie guesties on the program. And so 00:04:34.240 |
we decided we were bringing Balaji Srinivasan. Balaji 00:04:40.640 |
Srinivasan. Did I get it correct? I mean, I've been pronouncing your 00:04:44.280 |
I mean, you're such a fucking ignoramus race. You are such a 00:04:49.220 |
Listen, Greek names are hard to pronounce. I mean, it's fine. 00:04:54.840 |
Balaji, Balaji, Balaji, Balaji, Balaji, Balaji, Balaji, Balaji, 00:05:10.080 |
just figured out how to say Palihapitiya. So 00:05:12.660 |
I have been training the world how to say Palihapitiya for years. 00:05:16.960 |
Sri Lankan names are the only names that are harder than South 00:05:21.000 |
Although Tamil names are pretty brutal. I mean, yeah, Tamil 00:05:24.260 |
Basically, you guys add like maybe one or two more syllables. 00:05:28.020 |
And ours end in a vowel. The Tamils always end in a consonant 00:05:31.540 |
which always just then you can just they take liberal license 00:05:35.780 |
Can you guys just drop a few syllables from your names? 00:05:39.320 |
We could. We could. Yeah. But we don't. It's an intimidation 00:05:48.920 |
All right, there's a big tech backlash. 00:05:53.380 |
And it's not just because of David Sacks' comments. A new poll shows that 80% of registered voters... 00:05:58.880 |
Wait a minute, wait a minute, wait a minute, wait, before we go there. I have a question for Balaji. 00:06:02.140 |
You are, in my opinion, but I think a lot of people would say, one of the most incredibly 00:06:10.300 |
well read thoughtful, you know, commentators that we have, frankly, like, you know, not just in tech, 00:06:19.660 |
but I just think generally, in our society, I would give you that I think you're incredible. I'm 00:06:23.980 |
curious, how did you become such a polymath? Was this just one of these things where you've always 00:06:28.960 |
been curious about everything? Like, just tell us about I'm curious about like, how you grew up, 00:06:32.920 |
first of all, and then second, how do you spend a normal day? I just want to know those. And then 00:06:38.080 |
then we can just jump in. I'm just curious, because you sure are extremely well read and 00:06:41.560 |
extremely knowledgeable. Well, thank you. I appreciate that. You know, I try to learn from 00:06:47.200 |
from everybody. What do I do? I think normal people go to the club or they, they go out and 00:06:56.260 |
have fun. And I read math books and history books and science books and stuff like that. 00:07:02.980 |
That's how I have fun. I find that more interesting. Nothing wrong with going 00:07:07.720 |
out people do that I go for walks and stuff, I work out. But so I do a lot of that. And I have, 00:07:16.780 |
I mean, other than that, I just, I just read a lot. And I think I remember a moderate amount. And 00:07:23.500 |
so I cite that. And it's nothing really much more complicated than that. But that's it. 00:07:29.680 |
The only difference would be you, if he could stop going to the club. And he lost that last 40%. 00:07:40.180 |
I will make an observation, which I was actually talking to Sacks about this, which is, there's 00:07:46.720 |
some people who are fundamentally business people, and they learn the tech because that part is necessary to put it together. And there's some people who 00:07:52.480 |
are fundamentally like scientists or academics, and then get into business in order to advance 00:07:57.940 |
technology. And there's nothing wrong with the first group. And I think that's totally, 00:08:01.480 |
you know, fine and legitimate, like business people first, and then they get tech. I'm really 00:08:06.280 |
part of the second group, I think, where, you know, I'm fundamentally an academic at heart, 00:08:09.880 |
you know, I spent almost the first three decades of my life meditating on mathematics. And 00:08:15.580 |
just kind of got into tech relatively later in life than many others. 00:08:19.660 |
All right. And you spent your career building companies, everybody remembers, earn.com. 00:08:24.880 |
By the way, that's actually doing extremely well at Coinbase. Coinbase earn is on track 00:08:30.400 |
for 100 million revenue. Actually, it's much more than that in terms of GMV. It's broken out 00:08:34.780 |
separately, not in their S-1, but their quarterly filing. So you can go and check me 00:08:39.760 |
on that. And the other assets that we added after I joined Coinbase are more than 50% 00:08:45.520 |
of Coinbase revenue, again, in their filings, and Coinbase custody in all of its assets come 00:08:52.060 |
from Rosetta, which is something that we did there. And we launched USDC when I was there. 00:08:56.380 |
And that is whatever $100 million a day. So they got a lot of value out of that acquisition. 00:09:02.920 |
You were the CTO of Coinbase for a while. Obviously, what was your take on Coinbase's lend product, 00:09:09.520 |
the SEC coming in and saying, Hey, pump the brakes, we want a bunch of 00:09:15.460 |
information. And then Brian Armstrong and Coinbase just announced that they're 00:09:20.560 |
not going to do the lend product. What's your take on that? 00:09:22.660 |
So I don't speak for Coinbase anymore. Okay, I'm, you know, in my personal capacity. 00:09:27.880 |
But I do think that this is sort of the beginning of an era where we're going to 00:09:37.300 |
be rolling back the alphabet soup that FDR put in place. That is to say, you know, the SEC is not set up to 00:09:45.400 |
go after millions of crypto holders and developers; it's set up to go after a Goldman and a Morgan, 00:09:52.720 |
you know, just like the FAA is not set up to go after millions of drone developers, 00:09:55.900 |
it just has to go after Boeing and Airbus. And the FDA is not set up to go after millions of 00:10:00.580 |
biohackers; it's set up to go after, you know, Pfizer and Merck. So the entire 20th century 00:10:05.080 |
alphabet soup, that regulatory apparatus, is meeting something that it's not really set up for. And 00:10:11.440 |
the problem is not 00:10:15.340 |
any one actor like Coinbase or Kraken or what have you. Their problem is that technology has shifted. 00:10:21.280 |
And they've got many more people to deal with. And maybe those are individuals who are more risk 00:10:25.900 |
tolerant than companies. So I don't think they're going to be able to maintain the 00:10:30.280 |
status quo of 100 years ago. The statutes that they're citing, I think, are going to either 00:10:34.000 |
get knocked down in court or be technologically invalidated; there's going 00:10:37.120 |
to be sanctuary cities and states for crypto, or there's going to be international entities, 00:10:40.720 |
like what El Salvador is doing, or Switzerland. So I don't believe that that era 00:10:45.220 |
is going to be maintained, but I think there's going to be a big conflict over it. So we'll see. 00:10:49.000 |
How do you think it's different this time though, Balaji, the crackdown? Because folks said 00:10:56.620 |
the same thing, going into kind of the Napster era, when you know, everything was basically 00:11:02.200 |
peer to peer sharing of media files that were technically copyright. And there was, you know, 00:11:08.560 |
some regulatory regime that you know, had oversight over that copyright. But then the DOJ 00:11:14.680 |
got wrangled in and they went in and they made an example out of arresting a couple 100 kids, 00:11:18.100 |
I think, and it basically caused everyone to back away entirely from the market, similar to kind of 00:11:22.240 |
what we may be seeing happening in China right now. But is it not possible that we see a similar 00:11:27.040 |
sort of response this time around, where they kind of take this targeted, make-an-example approach to 00:11:33.460 |
kind of scare people out of the frenzy that's, that's, I think, driving a 00:11:39.100 |
lot of people onto these platforms rather than going after the platforms themselves. So kind of 00:11:43.060 |
saying, look, this is this is a violation of security laws, you guys are getting prosecuted. 00:11:46.600 |
Here's the 100 examples, and suddenly 80% of the attention kind of gets vacuumed out. 00:11:52.600 |
Yeah, so I'd say a few things about that first is, I think 2021 is different from the year 2000, 00:11:58.120 |
in the sense that China is an absolutely ruthless and very competent state. Whereas the US government 00:12:06.400 |
I would consider, you know, it's a difference between lawful evil and chaotic evil. You guys 00:12:09.760 |
know that from Dungeons and Dragons? Okay, so Chinese government is lawful evil. They are very 00:12:15.280 |
organized. And they plan ahead. And when they push a button, they just execute, like, like this whole 00:12:22.420 |
society, and they don't leave, you know, round corners or things left out. The US government is 00:12:28.180 |
today in 2021, not the 1950 US government, the 2020 US government is like this shambolic, chaotic 00:12:33.580 |
mess that can't really do anything, and is optimized for PR and yelling online. 00:12:38.260 |
And that's a whole important topic. But so that's one thing where I think there's a 00:12:44.860 |
difference in just state capacity. But that's not to say that they're not going to try with what 00:12:49.720 |
they've got. The other thing is that, so to your Napster point, there was thesis and antithesis, 00:12:56.500 |
but there was synthesis, that's to say, you know, Napster led to Kazaa and LimeWire. And actually, 00:13:02.260 |
the Kazaa guys went into Skype. So something legit came out of that. But the more important, 00:13:06.100 |
or more on-point, thing is, it led to Spotify and iTunes. And that's a whole different topic. 00:13:07.840 |
The record companies were forced to the negotiating table. 00:13:12.580 |
There was a waypoint there that's important, which is that when that happened in 00:13:16.900 |
Napster, we ran a company called Winamp, and it was part of AOL. The biggest architectural flaw 00:13:24.340 |
of Napster, well, the biggest problem with Napster was that it was 00:13:28.840 |
not fundamentally peer to peer. And so there were these servers. And so, you know, the 00:13:33.340 |
simple software architecture decision that somebody had to make was okay, well, let's just make an 00:13:37.420 |
entirely headless product that basically is fundamentally peer to peer. And that's what 00:13:43.600 |
the Gnutella source code was. We actually released that on AOL servers, without them knowing. 00:13:49.840 |
it took them a few hours to figure this out. They called us, we shut the whole thing down. 00:13:54.340 |
But in those few hours, it was downloaded about five or six thousand times, all this open source code that 00:13:59.680 |
we put out. That was the basis of LimeWire, BearShare. And that is what basically just decapitated 00:14:05.920 |
the music industry. And it was there that then the music industry realized we needed a contractual 00:14:12.820 |
framework to operate with these folks, because they'll just keep inventing technology that makes 00:14:16.480 |
this impossible. And then that's what sort of that's what iTunes was able to do with a 99 cent 00:14:21.700 |
store. That's what Spotify was able to do after that. But a lot of it started because of what 00:14:28.720 |
Balaji said earlier, which is, technologically, people just will continue to push the boundaries. 00:14:32.740 |
And we did that in music. And I think that's a lot of why, 00:14:35.500 |
the industry looks the way it does today. But ultimately, the premise kind of resolved 00:14:40.360 |
back to centralized systems, right? I mean, like, think about, you know, the Napster concept, 00:14:47.500 |
it was really supposed to be true peer to peer file sharing, and it ended up becoming the iTunes 00:14:51.940 |
store where everything sat on Apple servers. And they ultimately served everything out. 00:14:55.180 |
That's the reflexivity, because like, what happens is all of these folks, you know, you're sitting at 00:14:59.560 |
Warner Brothers, or Warner Music, or Universal Music Group, you're seeing an entire industry that 00:15:04.240 |
was worth probably 20 or 30 billion dollars just getting completely decimated, and then you're like, well, I'm gonna go to Apple. 00:15:05.080 |
Then there's this psychological thing that happens 00:15:10.440 |
where first, you want to shut everything down, but then because there's this other thing that 00:15:14.120 |
is even more evil and even worse and uncontrollable than the thing that you hated, 00:15:19.160 |
then you actually say, "Well, listen. My enemy's enemy is my friend." Then you say, "Well, 00:15:23.160 |
fine. Let's find a few partners to work with and we can try to find a way of living with these 00:15:28.120 |
technologies." That's going to repeat to your point, Balaji, crypto is probably now where we're 00:15:33.480 |
going to see it play out first because that's probably the most important intersection of 00:15:40.040 |
individuals' desire and a regulatory framework that's outdated where the SEC has some extremely 00:15:45.960 |
complicated questions it needs to answer, especially even after something today. 00:15:49.880 |
If you're sitting inside the SEC and all of a sudden you see that China, an entire country that 00:15:56.360 |
basically was where an enormous amount of this crypto activity started and originated and was 00:16:01.880 |
happening, can completely turn something off, it can completely change the way that you think about it. 00:16:01.880 |
What is our position as a government and as a society? 00:16:08.520 |
Well, let me just fill everybody in for a second, for those who don't know what happened. On the day of taping, 00:16:13.240 |
September 24th, the Chinese government announced that doing any transactions in cryptocurrency is 00:16:19.720 |
now illegal. Holding it... apparently you can still hold it, and this comes after 00:16:23.640 |
Bitcoin servers being kicked out of the country. Yes, they've pushed the button. Go ahead, Balaji. 00:16:31.720 |
kept the record industry honest and that forced them to the table of iTunes, to Chamath's point 00:16:37.480 |
also. I wouldn't call it more evil, I'd call it more good. BitTorrent also lurks out there as this 00:16:46.200 |
It's a check, exactly. It's not gone. In fact, it's a basis for new technologies and I think 00:16:50.920 |
this is another flare-up. The other thing is that with crypto, the upside, even though the state 00:16:54.760 |
wants to crack down on it harder, the upside is also greater and the global internet is bigger. 00:17:00.600 |
I think the rest of world, meaning all the world besides the US and China, 00:17:04.520 |
is a huge player in what is to come. That's India, but it's also Brazil, 00:17:08.920 |
it's every other country that's not the US or China. That's a new player on the stage. 00:17:12.760 |
Third, to Chamath, to your point about yes, China banned crypto, it can do this to it. 00:17:19.160 |
Well, what's interesting is people talk about China copying the US. Nowadays, 00:17:24.520 |
actually in many ways, especially on a policy front, the US is copying China without admitting 00:17:28.440 |
it, but it does so poorly. For example, lockdown. The Chinese lockdown was something where it wasn't 00:17:37.400 |
just sit in your rooms. It was something with drones with thermometers and central quarantine 00:17:43.080 |
where people were taken from their families and centrally quarantined and a thousand ultra 00:17:47.160 |
intrusive measures that the population by and large complied with. Forget about a vaccine 00:17:52.440 |
mandate. We're talking about, you can see all the videos and stuff from out of China. The government 00:17:56.840 |
itself has published them. Yeah, they didn't seem real. 00:17:57.320 |
Yeah, they didn't seem real at first. They were pretty crazy. 00:17:59.720 |
They didn't seem real at first, right. The thing is that for the US to follow that, 00:18:04.600 |
it's a little bit like as a mental image, imagine a lithe Chinese gymnast going and doing this whole 00:18:09.720 |
gymnastics routine and then a big lumpy American following and not being able to do those same 00:18:15.160 |
moves. The Chinese government is, as I said, lawful evil, but they are set up to just snap 00:18:21.240 |
this, snap that, do this, do that. The US government is absolutely not and the US people 00:18:27.320 |
we also, our system is a democracy, is it not? You cannot just do that. Even if the government 00:18:32.200 |
wanted to weld people into their homes, we have a democratic state here where we have to discuss 00:18:36.440 |
these things and there's a legal framework to it, so it's not even possible. 00:18:39.080 |
Yes, but even in the 1950s, the 1940s, 1930s, the US was still a democracy, 00:18:43.720 |
but it basically managed to exert a very strong top-down control on things. 00:18:48.760 |
Then, you had something where, like under FDR or in the 50s, there was a very conformist society, 00:18:56.440 |
which was able to drive things through. Even though it's a democracy, it was something where 00:19:01.480 |
mass media was so centralized that a relatively small group of people could get consensus among 00:19:07.800 |
themselves and then what they printed in the headlines, I mean, who's going to go and figure 00:19:11.320 |
out the facts on their own? This gets into the media topic later. It was de facto centralized 00:19:15.560 |
at the media level, the information production and dissemination level, and then you manufactured 00:19:20.280 |
consent from there. Today, you're combining that democratic aspect, the legal aspect, with a 00:19:25.560 |
new information dissemination thing, which decentralizes, I think, a lot of things. 00:19:30.520 |
Yeah, well, I'd love to get Balaji's take on an article that came out this week in 00:19:34.360 |
the Wall Street Journal. I think it was a really important article; it came out about four days ago, 00:19:38.040 |
and it was entitled, "Xi Jinping Aims to Rein In Chinese Capitalism, Hew to Mao's Socialist 00:19:44.920 |
Vision." And the article describes how, we've all seen and talked about on this pod, how 00:19:50.760 |
they've been cracking down on tech giants, they've been cutting down their tech oligarchs, 00:19:55.400 |
down to size, like Jack Ma, how he disappeared under house arrest. And we've seen that they 00:20:02.520 |
stopped the Ant IPO. But this article went beyond those steps and really talked about Xi's ideological 00:20:11.080 |
vision. And what it basically says is that what's going on here isn't just the CCP consolidating 00:20:17.800 |
power. They actually want to bring the country back to socialism. And what they say is that within the CCP 00:20:25.240 |
or at least Xi's view of it, capitalism was just something they did as an interim measure 00:20:31.160 |
to catch up to the West. But ultimately, they are very serious about socialism. 00:20:35.960 |
And reading this article, I had to wonder, well, gee, did we just catch a lucky break here 00:20:41.480 |
in the sense that, look, they're abandoning or if they abandon the thing they copied from us, 00:20:47.480 |
which is market-based capitalism, they use that to catch up. Maybe this is the way 00:20:52.280 |
that actually this is the thing that slows them down. And 00:20:55.080 |
I think that's the thing that historically that it brought to mind is that if you go back into the 00:21:01.320 |
1500s and you compare Europe to China, the Chinese civilization was way ahead. I mean, 00:21:08.920 |
the standard of living was way ahead. Technologically, it was way ahead. 00:21:12.760 |
Europe was a bunch of squalid, warring tribal nation states. And then what happened is a 00:21:19.080 |
single Chinese emperor banned the shipbuilding industry. Any ship with more than two masts 00:21:24.920 |
was banned. And so exploration just stopped in China. They became very inward facing. 00:21:33.000 |
Whereas the European states explored and discovered and conquered the new world, 00:21:39.160 |
colonized the world, and that led to an explosion of wealth and innovation. And as a result, 00:21:45.640 |
Western Europe and then its sort of descendant, the United States, ended up essentially 00:21:50.360 |
conquering and dominating the modern world. And so I guess, to bring it back to my question to 00:21:54.760 |
Balaji, is there a chance that what Xi is doing, returning to socialism, could this be like the 00:22:00.680 |
Chinese emperor who banned shipbuilding? Or am I reading way too much into the single Wall Street 00:22:06.200 |
Journal story? So, my short answer on that is I think it is, I think the socialist thing is real, 00:22:14.120 |
but I think it's better to call it nationalist socialism with the implications that has. 00:22:19.720 |
Whereas I think the US is kind of going in the direction of what I call maybe socialist nationalism. 00:22:25.400 |
You know, where it's like different emphases in terms of what is prioritized, but you know, 00:22:30.760 |
and I think in many ways, China is like the new Nazi Germany, woke America is like the new Soviet 00:22:37.480 |
Russia, and the decentralized center is going to be the new America, I can elaborate on that. 00:22:42.840 |
But basically, with respect to this, one thing I try to do is I try to triangulate lots of stories. 00:22:49.960 |
So rather than, for example, and nothing wrong with the Wall Street Journal, you know, the 00:22:54.440 |
piece, like, looking at it, but for every WSJ piece you read, it's useful to get, like, you know, 00:22:59.960 |
what CGTN or Global Times is saying, even if you discount it, just to see what they're saying. 00:23:05.000 |
And then you also triangulate with, let's say, the Indian or Russian point of view. And by doing 00:23:10.520 |
this, you I feel that it's better than just reading 10 American articles. And, you know, 00:23:16.520 |
especially reading primary sources, like there's this good site called reading the Chinese dream, 00:23:20.200 |
which actually translates primary sources, and then you can kind of form your opinion, 00:23:24.280 |
you know, your opinions from that versus like, like a quote, hot take, right? I'm not saying you 00:23:27.720 |
are, I'm just saying that that's what I tried to do. So I think we've really gotten China right 00:23:31.640 |
here. I mean, I think that if you look at what's happening, I think we've basically 00:23:36.120 |
forecasted this orchestration of essentially the re-vertical integration of China. You know, 00:23:43.000 |
we have China Inc, where the CEO is Xi Jinping. And 00:23:48.360 |
it's almost like they've changed the game, where what they are playing 00:23:54.120 |
essentially is like Settlers of Catan or something, where the goal is just to hoard 00:23:58.200 |
resources. And I think that they have enough critical resources for the world. And the clever 00:24:02.680 |
part of what they did was the last 20 or 30 years, they leveraged the world to essentially finance 00:24:08.840 |
their ability to then have a stranglehold on these critical resources, whether it's ships, 00:24:16.680 |
or whether it's rare earths or other materials, they leveled up. And I think that was the genius. 00:24:21.160 |
Yeah, well, they leveled up, they leveled up on our 00:24:23.960 |
capital. Yeah. And now that all that infrastructure is there, and now that we are addicted to the drug, 00:24:28.680 |
they can then change the rules. They leveled up with our operating system. It was a brilliant move. 00:24:32.680 |
But they allowed entrepreneurs to believe that they could be entrepreneurs, they allowed an 00:24:36.440 |
entire society to basically level up. Why did nobody see this coming Chamath? 00:24:40.680 |
Nobody saw this coming. People were starting venture firms, 00:24:44.760 |
look what they were doing, taking companies public. 00:24:47.560 |
They did Belt and Road while we did nation building in Iraq and Afghanistan. Right? I mean, 00:24:52.440 |
what is Belt and Road? They're building highways, 00:24:53.800 |
Superhighways to extract the resources that Chamath is talking about, 00:25:02.040 |
and bringing them into the Chinese system. And what did we spend our trillions on? Nation 00:25:06.280 |
building in Afghanistan. I mean, why didn't we see this coming? Everybody was looking at China, 00:25:10.760 |
starting venture firms there, you know, embracing it, taking companies public, Wall Street, politicians, 00:25:17.080 |
nobody saw this coming. So the rising tiger story has been around since the late 90s. Right? 00:25:23.640 |
why didn't we see them cutting the heads off? 00:25:25.880 |
The answer is because it was not about white people. 00:25:30.120 |
China was in Sri Lanka. China was in Africa. These are not sexy, interesting places to 00:25:39.000 |
Western audiences. They couldn't... you guys couldn't give a fuck. Let's just be honest. Okay. And so 00:25:45.240 |
that's why it didn't matter. Because we all thought that these are countries 00:25:50.280 |
that are sort of, you know, squalid, third world, 00:25:54.120 |
Developing nation states that don't really matter. They don't necessarily have 00:25:57.880 |
the resources that matter. But what did China do they realized that those folks are the future GDP, 00:26:04.200 |
those folks are the future population pools, the young that can actually do the hard work, 00:26:09.080 |
and then they went and they secured again. So not only resources now, 00:26:12.520 |
so that's one thing, hold on. So that's one thing. I think we entirely missed it. Because 00:26:16.920 |
as David said, the military industrial complex doesn't look at, you know, 00:26:23.320 |
a developing nation and say, we want to be there. It looks at Afghanistan and says we want to 00:26:27.480 |
dominate it, because we can actually feed off of that domination. So that's one thing. The second 00:26:33.960 |
thing is that I think that we misunderstood Xi Jinping's ambition. And I think that that's a 00:26:39.400 |
reasonable mistake to have made. The first one is an error of just complete stupidity. The second 00:26:45.240 |
one is something that I think it was legitimate to have missed. 00:26:47.480 |
And to your point, Chamath, they not only stole the entrepreneurial playbook, but also the playbook for colonizing other places and 00:26:53.160 |
driving the economic development of the country. 00:26:55.880 |
We all thought it was dumb. Let's all be honest, you know, 10 years ago, 00:26:59.720 |
I disagree. No, I would argue in a lot of cases we benefited. So China set up and bought like the 00:27:08.120 |
largest pork production company in Australia. And what do you feed those pigs, you feed them 00:27:14.200 |
soybeans, and where do those soybeans come from? Largest soybean exporting market is the United 00:27:19.320 |
States. You know, we had a tremendous customer in China, 00:27:23.000 |
as they expanded their consumption pattern through all of this investing they were doing worldwide. 00:27:27.380 |
We were exporting John Deere equipment and Caterpillar construction equipment and 00:27:33.100 |
soybean products and our knowledge industry and there was a tremendous service model. 00:27:38.000 |
And globalization really created, call it a catch-22 for the United States, 00:27:42.800 |
where we were watching the rising tiger fueled in part by this distributed entrepreneurism. 00:27:50.280 |
But as that distributed entrepreneurism creates, you know, obviously the social... 00:27:54.000 |
Do you think there is an equal amount of dollars that went into Western developed 00:27:57.000 |
nations from China as it went into third world countries? 00:28:00.240 |
Most Western nations couldn't give a shit about... 00:28:01.800 |
Yeah, but that money is too hard to turn down. 00:28:03.460 |
There's a mistake in our thinking with respect to the rest of the world that we've been making, 00:28:07.040 |
we make over and over again. And it was all kind of predicted by a historian named Samuel Huntington 00:28:13.680 |
at Harvard in the 1990s. He wrote a book called The Clash of Civilizations. 00:28:20.220 |
Then another famous book came out called The End of History and The Last Man. 00:28:23.080 |
And what the common belief was in the 1990s in the US was that we had reached the end of history 00:28:29.220 |
where every country would become democratic and capitalist, right? 00:28:32.780 |
That was the end of history: democratic capitalism. 00:28:34.680 |
And we believe that the more we went all over the world spreading our ideals, 00:28:38.980 |
it would hasten this day where they'd all become democratic capitalists. 00:28:42.600 |
What Huntington said is no, you know, cultures and civilizations, 00:28:46.520 |
these go back thousands of years, these are stubborn things. 00:28:50.200 |
What these other cultures really want is not westernization, but modernization. 00:28:54.540 |
And what they're going to do is they're going to modernize, 00:28:57.100 |
they're going to learn from us as much as they can about technology, 00:29:00.440 |
they're going to assimilate and adapt and take all of our technology, 00:29:04.340 |
but they are not going to become like us, they're not going to westernize. 00:29:07.260 |
And that is basically what's happened over the last 25 years. 00:29:12.280 |
And suddenly we realize, whoa, wait a second, they have not westernized. 00:29:16.760 |
They are still their own unique civilization. 00:29:19.920 |
They have basically the equivalent of a modern day emperor. 00:29:28.660 |
Because now we've allowed them to catch up from a technological standpoint. 00:29:33.140 |
Why did everybody in the West get this so wrong? 00:29:36.940 |
One is political differences aren't public in China, but they are real. 00:29:44.540 |
And there was a real leadership shift from Deng and Jiang, 00:29:51.720 |
You know, the Mao era was revolutionary communist. 00:29:54.720 |
The Deng, Jiang, Hu era was internationalist capitalist. 00:30:02.800 |
Like, Hu did not fully control the military the way that Xi does. 00:30:06.300 |
There's this ad you guys should watch, the Chinese military ad. 00:30:10.440 |
It's called like, "We will all be here," or something like that. 00:30:14.540 |
And the thing about it, it's not just that it's like extremely well coordinated military program. 00:30:19.520 |
It's not just that it's like a military parade and set to be intimidating and so on. 00:30:22.520 |
It's that the whole thing clearly just folds up to one guy, Xi. 00:30:26.200 |
It's not him riding in a car with the rest of the communist party, 00:30:30.900 |
It's just like one guy, this 2 million person army folds up to him. 00:30:34.040 |
That is a true consolidation and roll up of power that he was able to accomplish 00:30:37.640 |
among other things with tigers and flies, going and throwing Bo Xilai in prison, 00:30:42.740 |
you know, having generals, you know, thrown in prison for whether they were actually corrupt 00:30:50.620 |
Yeah, it's hard to know from the outside, you know, in one sense, 00:30:54.120 |
because so many folks in the Chinese government were taking bribes 00:30:56.820 |
as almost like an equity stake in their province, you know, like coming up, 00:31:00.340 |
they're like, "Hey, you know, hey, Gru, so, you know, give me a cut." 00:31:06.500 |
So, it's not something, you know, people will say, "Oh, you know, the Chinese plan for 100 years." 00:31:12.380 |
They had this huge war in the 20th century between the nationalists and communists, 00:31:15.680 |
the Deng transition, you know, his triumph over the Gang of Four. 00:31:22.260 |
And they had a real leadership transition after three eras of continuity. 00:31:25.920 |
Then this is what I think, this is the key point you're making, which is, 00:31:29.160 |
I think the reason that this actually flipped and we didn't see it coming is because Xi Jinping 00:31:33.760 |
decided, "I want to have complete control over the country. 00:31:36.400 |
These other people are getting too powerful." 00:31:38.300 |
And we're actually reading into this that there's some crazy plan. 00:31:41.200 |
It just could be a crazy man, not a crazy plan. 00:31:48.920 |
basically take away every entrepreneur's company. 00:31:52.360 |
They were not trying to hide it; they were doing all of this stuff in plain sight. 00:31:56.520 |
As an example, you know, we view Sierra Leone as a place where we make commercials about, 00:32:01.800 |
or we try to fundraise for, or we send USAID. 00:32:04.600 |
They view Sierra Leone as a place where there are critical resources that they are dependent on. 00:32:11.800 |
And so they will go, they will modernize, they will invest, and they will then own those critical 00:32:18.720 |
resources. We view Chile as a random country in South America that abuts, you know, Peru and Argentina. 00:32:23.040 |
They view Chile as a place where there are the largest sources of lithium that we need for 00:32:28.960 |
They were doing this for decades right in front of us. 00:32:34.560 |
The cutting of capitalism off though is the question. 00:32:36.720 |
The reason, the reason we didn't pay attention is because none of those countries mean anything to us. 00:32:43.600 |
And the thing is though, I'd argue that the blindness actually comes from both ends of the political spectrum. 00:32:48.520 |
On the conservative end, it's, oh, these are shithole countries basically, you know, you can bleep that if you want. But, you know, 00:32:54.860 |
You know, it's something where, for example, COVID was only taken seriously once it was hitting Italy and France. 00:33:01.000 |
Like China was still considered like a third world country, but it actually also comes from the liberal side in a different, 00:33:07.800 |
like it's a slightly masked, but it's a condescension of not the military industrial complex, but the nonprofit NGO, you know, 00:33:15.480 |
complex like, oh, the white savior with the NGOs. 00:33:18.320 |
you know, coming in and, you know, patting them on the head kind of thing. 00:33:23.160 |
And the thing about this is, like, the one thing I think is a huge thing for our diplomatic corps today is making any generalization about another culture which says that they're not completely good or that they have some aspect to them that doesn't match exactly to the US. 00:33:45.960 |
I mean, if you criticize Saudi Arabia for throwing gay 00:33:48.120 |
people off of buildings, you're not respecting their religion. 00:33:51.200 |
Can I just say that Balaji just gave the most precise delineation that I've heard. 00:33:54.440 |
So let me just repackage what you said, because it resonated with me. 00:33:57.560 |
Western philosophy tends to view countries in three buckets. 00:34:01.480 |
Bucket number one are countries like us, right? 00:34:05.160 |
Those are the other Western countries, the G8 countries. 00:34:08.520 |
And then what happens is you get this weird thing where then you move into, like, countries where you basically either deal with them with wokeism: 00:34:20.280 |
Let me go try to save them because it makes me feel better. 00:34:22.520 |
Or you deal with them as neocons, where it's just like, let me go dominate them and take them to war. 00:34:27.280 |
And literally, you can take all 180 odd countries in the world and effectively sort them into those three things. 00:34:32.640 |
And that, I think, is the problem with America's view of what we're doing. 00:34:37.440 |
And so what David said is really true, which is that while we were doing this, we had neocon war, wokeism, or you're our buddy because you're like us. 00:34:47.720 |
The entire world reordered itself with completely different incentives, and they did it right in front of us. 00:34:54.100 |
And now we wake up to realize, oh my gosh, that was an enormous miscalculation that we just did. 00:34:59.200 |
So, you know, if you look at kind of the history of the West's interaction with the rest of the world, and let's talk about colonialism, and whether you're talking about colonialism over the last few hundred years, or even you talk about a microcosm like Afghanistan over the last 20 years, I would argue that there's three phases to the West's interaction with these other countries. 00:35:22.200 |
Phase two is assimilation, where the culture that's been dominated realizes that they're behind and they want to catch up. 00:35:31.320 |
And so there's a process of Americanization, Romanization. 00:35:35.560 |
And what they're doing is they are learning from us and taking our technology, and it lulls us into a false sense of security that we think they're becoming Americanized or becoming Westernized. 00:35:49.200 |
Then the third phase is reassertion, where the dominated country, culture, you know, what have you, reasserts its traditional culture and its traditional views, because they've modernized, but without truly becoming Americanized or Westernized. 00:36:07.920 |
And we don't really realize that they never really wanted our culture. 00:36:11.920 |
They just wanted to throw off, you know, American domination or Western domination. 00:36:17.120 |
And so what they've actually done is use this period to basically assimilate and catch up. 00:36:24.000 |
And the reality is, like in Afghanistan, they don't have to fully assimilate all of our technology. 00:36:29.880 |
They don't have to become as strong as us because we are in their land. 00:36:34.520 |
They just have to become strong enough to basically achieve a sense of... 00:36:38.800 |
They just kind of upgrade their systems, right? 00:36:40.440 |
They just have to achieve a defensive superiority. 00:36:44.720 |
So it's much easier for them to catch up than we think. 00:36:46.920 |
And we are always caught off guard by this dynamic, and it repeats itself over and over again. 00:36:51.920 |
And what you've seen in China is, you know, 30, 40 years ago, you had the great economic reformer, Deng Xiaoping. 00:37:00.920 |
He said, "Abide your time and hide your strength. 00:37:11.920 |
They embraced basically perestroika without glasnost. 00:37:18.720 |
They copied us, but not making Gorbachev's mistake of giving up any political control whatsoever. 00:37:26.720 |
And so I would say that Xi was not an aberration. 00:37:31.720 |
They had gotten to a point of strength where they were ready for that strong leader who was ready to reassert their... 00:37:42.720 |
And once again, we've been played for fools and caught off guard. 00:37:46.520 |
I think one of the signs was his corruption crackdown a few years ago, right? 00:37:50.520 |
I mean, that was kind of step one where he took all these provincial managers and kicked them out and put them in jail and started to clean things up internally, where there was clearly kind of corrupt behavior underway. 00:38:06.520 |
I do think it's worth highlighting that to some degree, you can kind of try and diagnose his motivation, or diagnose the motivation, as being rather not too surprising and maybe not 00:38:16.320 |
too nefarious in the power-grabbing sense, which I think we all kind of want to bucket it as. 00:38:21.760 |
But if there's a population like we're seeing in the United States, where 00:38:26.640 |
when you loosen the screws on liberalism, and you allow for more freedom to operate and more 00:38:34.720 |
kind of free market behaviour, you see tremendous progress. And as I think we've talked about in the 00:38:40.640 |
past, tremendous progress always yields a distribution of outcomes amongst the population, 00:38:46.800 |
but everyone moves forward. Some people just move forward 10 times faster and farther. 00:38:50.960 |
And it causes populist unrest. And we've seen it in the United States. I mean, if AOC, 00:38:57.440 |
Elizabeth Warren and Bernie Sanders were elected to kind of be the triumvirate that ran the United 00:39:03.200 |
States today, they would probably say, "Let's end all this capitalism that's creating all this 00:39:07.760 |
wealth in the United States." And progress generally would 00:39:10.480 |
slow down. And I think that there's been inklings of that. Clearly, there's data to support the 00:39:15.600 |
inklings of this in China, that indicates, you know what, the loosening of the screws 00:39:20.720 |
has allowed tremendous progress, but it's time to tighten the screws. Because populism and unrest is 00:39:26.240 |
going to arise from kind of perceived inequality, just like we're seeing in the United States. 00:39:31.520 |
And, you know, I guarantee or I can't guarantee, but I would assume that if, 00:39:36.240 |
you know, certain populist leaders in the United States have the same level of authority, 00:39:40.320 |
that Chinese leaders do, they would probably act in the same way. 00:39:43.040 |
I think what we're going to see next is, and I think we should talk about what we think will 00:39:47.040 |
happen next with China. I think China is on the brink of having a revolution. If you look 00:39:52.160 |
at what happened to the Uyghurs, obviously, you can't practice religion there. 00:39:55.760 |
Students in Hong Kong can't protest, you can't publish what you want. Founders can't start 00:40:00.640 |
companies. Now you're not going to be able to play video games as much as you want. You can't 00:40:04.720 |
use social media. And today, you can't have any control over your finance. If you squeeze people 00:40:10.160 |
across this many vectors, this hard, this quickly, I think Friedberg's right, it could result in 00:40:17.440 |
China is not like a uniform people and a uniform culture. 00:40:20.720 |
Of course, yeah. But I mean, I'm just listed like seven or eight things they're doing to squeeze... 00:40:25.200 |
But there are many provinces, many cultures, many differences, 00:40:28.960 |
many differences of experience, by the way. I mean, you know, the rural population in China 00:40:32.880 |
doesn't experience much of what I think is driving industry and driving this inequality 00:40:36.880 |
and perceived inequality and the changes that are underway. 00:40:40.000 |
I don't think that a revolution is generally supported unless you have, you know, enough 00:40:44.720 |
kind of concentrated swell across the population. I don't know how you could see something like that 00:40:48.720 |
happening in as diverse a population as China. I support China's limits on social media use 00:40:54.000 |
by children. I could use that here for my kids. 00:40:57.040 |
Clearly, Sacks is letting his kids use whatever they want. I mean, we definitely need to have 00:41:01.440 |
some of those over here. Balaji, what do you think is going to happen? Worst case, best case for China 00:41:05.840 |
in the next two decades? Well, so one thought I want to give is basically that in some ways, 00:41:09.840 |
this is inevitable because China and India are 35% of the world. Asia was the center of the world. 00:41:16.240 |
And one way of thinking about it is that America and the West executed extremely well over the last 00:41:21.760 |
couple of centuries and Asia didn't with socialism and communism. And now that they've actually got a 00:41:27.920 |
better OS, it's not like the US really could restrain them, you know? So in that sense, 00:41:33.920 |
that's also part of their internal narrative in a way. 00:41:36.480 |
I mean, of course, Mao killed millions of people there. They screwed up their own 00:41:39.680 |
stuff. But their narrative is they were colonized by the West in the Opium Wars, patronized as 00:41:44.640 |
copycats, emasculated on film. For decades, they were heads down in sweatshops. They built plastic 00:41:50.160 |
stuff. They took orders from all these overseas guys, and now it's their time to stand up and take 00:41:55.760 |
back their rightful position in the world. And that's their narrative. And so it's important to 00:42:01.760 |
not agree with it, but at least understand where it's coming from because they want it more, I 00:42:09.520 |
And so I disagree, Jason, with your view that they're going to have a revolution. I think 00:42:13.840 |
that's like, that's the kind of, that's a Western mindset where, you know, Australia, for example, 00:42:19.360 |
is having these COVID protests. In China, the harder the crackdown, 00:42:25.200 |
the easier it makes the next crackdown. It's something where the difference- 00:42:29.360 |
It's not going to be an easy revolution, that's for sure. But we saw protests, 00:42:32.240 |
you saw Tiananmen Square, you saw Hong Kong, and you're going to see Taiwan. 00:42:35.360 |
Well, you've not seen anything since. That was 30 years ago. 00:42:39.360 |
If you squeeze people, they will take to the streets. Jason, the quality of life in China has 00:42:44.800 |
accelerated at such a pace over the past 30 years. The average person in China is so much better off 00:42:50.480 |
than they were five years ago, 10 years ago, 20 years ago, 30 years ago. Revolutions don't come 00:42:54.880 |
out of that amount of progress, right? When you go from $4,000 a year in average income 00:43:00.320 |
to $20,000 a year in average income- What happens if they don't have jobs in 00:43:03.680 |
the recession? The only time you revolt is because of economics. 00:43:07.440 |
That's what I just said. So what happens if there is- 00:43:09.200 |
They're growing at 8% a year. The GDP is growing 8% a year. The population is seeing an incredible 00:43:15.920 |
benefit and the cost is- What if that reverses? 00:43:18.720 |
Would you see a revolution then? Well, J. Cal, I mean, look, I think 00:43:21.840 |
we- To Balaji's point about this is like a Western mindset, I mean, think about the Arab Spring. 00:43:27.280 |
We saw all those revolutions with the Arab Spring, and we thought, "Oh, look, 00:43:31.520 |
they're finally throwing off the yoke of oppression, and now they're going to set 00:43:34.720 |
up democratic states." And what did we actually get? We actually got religious fundamentalism 00:43:39.040 |
and religious fundamentalism, right? We didn't get what we thought. And I think this happens over 00:43:44.800 |
and over again is that we're trying to superimpose our mindset on them. We're thinking like, frankly, 00:43:55.440 |
People want security, by the way. And security can come in a lot of different forms. And religious 00:44:00.240 |
fundamentalism is one of them. And the way that we see government operate in China may seem foreign 00:44:06.080 |
and uncomfortable to us, but it provides enough security to people to know they're going 00:44:08.880 |
to be able to do what they want to do, which is to protect themselves, to protect their family, 00:44:11.920 |
to protect their friends. 00:44:13.200 |
You know, it's not necessarily an equation that says all humans that don't live like 00:44:19.280 |
Americans are going to be unhappy. Yeah. Last thing I'll just say is 00:44:21.920 |
basically I think the idea that China will collapse from internal revolution or the US 00:44:27.520 |
military is like really strong relative to China, I think are both actually forms of wishful thinking. 00:44:31.600 |
With that said, I do believe that we need a strategy for China on a global scale. 00:44:35.280 |
I think the future is a centralized East and decentralized West. But I don't think it's going 00:44:40.560 |
to be just, hey, deus ex machina is going to solve this problem. 00:44:44.640 |
No, I don't think the revolution is going to necessarily overturn China. I think you're 00:44:48.640 |
going to see revolutionary moments. Well, so just to say one thing in your 00:44:52.960 |
defense, J. Cal, because everyone's kind of beating up on you is I- 00:44:55.920 |
I didn't mean it as beating up, by the way, just- 00:45:01.440 |
I'm not saying it's a guarantee of revolution. But I think this is going to be- there's going to 00:45:05.520 |
Yeah, well, look, the one way in which I agree with J. Cal is I do think that freedom 00:45:10.800 |
ultimately is the birthright of every human, regardless of where you're born, you know, 00:45:16.560 |
who you are, what culture you are. But I think the thing that the United States has learned over the 00:45:20.400 |
last 20 years is the road from here to there is going to be much more complicated than we think. 00:45:26.800 |
And longer. And cultures are very stubborn things, and they're not going away anytime soon. 00:45:31.280 |
And the transition is going to have to happen within those cultures. It's not 00:45:37.920 |
By the way, I would also point out freedom is the birthright and the want of a people 00:45:43.280 |
that at some point have enough security to feel like they have that luxury. 00:45:47.280 |
Up until that point, I think that you have to make the decision of does security give 00:45:51.760 |
me more than freedom? And in a lot of cases, security coming with all the costs it comes with 00:45:57.120 |
may give people more than absolute freedom. And that's a transitionary phase. 00:46:01.120 |
And I think that's a transitionary phase. I think, you know, a lot of peoples go through, 00:46:03.520 |
peoples being civilizations and states and whatnot. 00:46:05.520 |
Anybody have a prediction of what's going to happen in the next 20 years, 00:46:09.840 |
I just can't believe Sacks was empathetic to your point, J. Cal. That was a great moment. 00:46:15.120 |
I have a few predictions. I think actually, if you read, you know, The Kill Chain or 00:46:22.560 |
similar books, that's by, I think, Christian Brose. It's a good book, where basically he's like, 00:46:27.680 |
you know, the US military has a perfect record in its war games with China. 00:46:30.960 |
China has won every round. And, you know, if you look at just the fact that with COVID, 00:46:39.040 |
US military basically suffered a military defeat in the sense that it had this whole 00:46:42.400 |
biodefense program, it's supposed to protect against biological weapons. That didn't work. 00:46:46.320 |
Afghanistan, huge defeat, $2 trillion. You have this AUKUS thing where it looks like France is now 00:46:52.880 |
pulling off and you know, from NATO or the EU is doing their own thing. I think that China is, and 00:47:00.800 |
then China is really already predominant in many ways in Asia. And the US just doesn't care about 00:47:05.600 |
the area as much. I mean, who wants to start another gigantic war over this? Certainly, 00:47:10.080 |
I don't think the people of America do after 20 years of forever war. And China really cares about 00:47:16.400 |
Taiwan. They really care about their backyard. So whether that's a war or whether that's a 00:47:21.200 |
referendum on Taiwan or whether it's some unpredictable event like COVID, I don't know. 00:47:26.000 |
But I do think that China does have some Monroe Doctrine-like thing that 00:47:30.640 |
it gets to within Asia where basically it says, just like the US said, "Hey, we're running this 00:47:36.320 |
hemisphere." They're saying, "Hey, we're running this sector of things." Whether that's a military 00:47:40.720 |
conflict where the US decides to just withdraw, I don't know. And then I think what has to happen is 00:47:45.680 |
we have to figure out what the decentralized West looks like. An asymmetric response to China 00:47:51.200 |
because it's going to basically be the number one centralized state in the world. You're not 00:47:55.040 |
going to be able to combat it head on militarily. It's got like 10x growth in front of it and it's 00:48:00.480 |
already a monster. Unless there's some assassination or revolution or something 00:48:04.960 |
crazy like that that's hard to predict, if it manages what it's got, it's like Google 2010. 00:48:11.280 |
It's got 10 years of that in front of it. Whether it's a Chinese decade or a Chinese century, 00:48:16.240 |
I think depends on whether we can build the technologies to defend freedom, 00:48:19.920 |
meaning like encryption, meaning decentralized social networks, 00:48:24.880 |
meaning these kinds of things, because those are the only kinds of tools, I think, that are going to help 00:48:30.320 |
us. Whether it's drones, whether it's other kinds of things, asymmetric defense versus what they're 00:48:35.120 |
going to be. It's not going to just be a deus ex machina. 00:48:37.440 |
Look, I think that's a great point to end on, Jason, because there's other stuff 00:48:40.400 |
we want to talk about. So, one of the things that, you know, Balaji's commented on that I give him, 00:48:46.000 |
you know, a tremendous amount of credit for is the idea of corporate journalism. 00:48:51.040 |
In fact, Balaji, you're the first person I heard that term, corporate journalism, from, 00:48:54.960 |
which is a recognition that all of these reporters actually are employees 00:49:00.160 |
of companies and they have a company culture and they often have, they have owners, 00:49:05.440 |
the companies do. Those owners often have an agenda. There's often a dogma inside these 00:49:11.200 |
corporations. And it really got me to see journalism in a new light because 00:49:18.160 |
these journalists portray themselves as the high priests of the truth who are there to speak truth 00:49:24.720 |
to power. And actually, they're really just kind of the lowest paid functionaries on the 00:49:30.000 |
corporate totem pole. And, you know, in contradistinction to what you've called 00:49:36.400 |
citizen journalists, who are people who are writing what they see as the truth in blogs or, 00:49:43.040 |
you know, formats like this, where they are not getting paid for it. You know, we're doing it 00:49:48.240 |
because we want to put forward what we know to be the truth. And actually, I'd love to hear you 00:49:53.920 |
speak to that because I think this is like one of the most powerful ideas that I've heard you put 00:49:59.840 |
say about this. The first thing I would do is I'd recommend a book that recently came out by 00:50:03.680 |
Ashley Rindsberg called The Gray Lady Winked. And the reason it's very important, and I put it up 00:50:08.800 |
there frankly with the top five books I'd recommend, I know I've recommended other books, but 00:50:11.840 |
top five books I recommend, The Gray Lady Winked: it goes back through the archives. You know, 00:50:16.560 |
The New York Times calls themselves the paper of record. They call themselves, 00:50:20.480 |
you know, the first draft of history. They've literally run billboards calling themselves 00:50:24.320 |
the truth. But it's just owned by some random family in New York. 00:50:29.680 |
And, you know, this guy, Arthur Sulzberger, who inherited it from his father's father's father. 00:50:34.080 |
And so you have this like, random rich white guy in New York who literally tries to determine what 00:50:40.000 |
is true for the entire world. His employees write something down. We're supposed to believe that this 00:50:43.520 |
is true. And I simply just don't believe that that model is operative anymore because I think truth 00:50:48.640 |
is mathematical truth. It's cryptographic truth. It's truth that one can check for oneself rather 00:50:54.000 |
than, you know, argument from authority. It's argument from cryptography. And one of the things Bitcoin 00:50:59.520 |
and cryptocurrencies have given us is a decentralized way of checking on that. Now, to the point about 00:51:03.680 |
corporate media, it's they're literally corporations. These are publicly traded 00:51:08.560 |
companies with financial statements and quarterly, you know, reports and goals and revenue targets. 00:51:15.360 |
And so once you, you know, kind of are on the inside of one of these operations, you realize 00:51:19.840 |
that that hit piece or what have you is being graded in the spreadsheet 00:51:24.640 |
for how many likes and RTs it gets on Twitter. And if it gets more, 00:51:29.360 |
they're going to do more pieces like that, flood the zone with that. And if it doesn't, 00:51:33.200 |
then they're going to do less. And they're all conscious of this. 00:51:35.280 |
You know, for example, Nicholas Kristof wrote an article, I think it's like, 00:51:39.040 |
the articles no one reads, articles someone will read where he noticed that 00:51:43.040 |
his Trump columns get something like 5x more views than his other columns. It's like this huge ratio. 00:51:48.240 |
And so at least some folks there are privy to their page views. 00:51:51.600 |
So, and of course, they're looking at their Twitter likes and RTs even if they, 00:51:55.040 |
those are not directly page views. They're certainly correlated with page views on the article. 00:51:59.200 |
So, all these folks are literally employees of for-profit corporations that are trying to 00:52:03.840 |
maximize their profits. But we believe them when they mark themselves as the truth like the NYT, 00:52:09.280 |
or as democracy itself, like the Washington Post, or fair and balanced like Fox. These people equate 00:52:14.320 |
themselves with like truth, democracy, and fairness. They weren't exactly around at the 00:52:17.920 |
founding, okay? They weren't part of the Constitution in '87, '76, the post offices. But 00:52:22.960 |
these media corporations were started later on and have kind of glommed themselves on and declared 00:52:27.680 |
themselves like part of the status quo, as if they are truth, democracy, and fairness itself. 00:52:29.040 |
And they are not. And the last point, and I'll just give you guys the floor, 00:52:36.480 |
but we didn't go and say, "Oh, Blackberry, do better. Blockbuster, do better. Barnes & Noble, 00:52:44.720 |
do better." We just replaced them, disrupted them with better technological versions. 00:52:50.160 |
And so, the idea that, "Oh, you know, New York Times, do better." 00:52:53.760 |
That's completely outmoded way of thinking about it. We need to disrupt, we need to replace, we need 00:52:58.880 |
to build better things. We need to have things that have like on-chain fact-checking that have 00:53:02.800 |
voices from overseas, that have voices that are actually experts in their own fields, that have 00:53:07.200 |
voices that are not necessarily corrupted by these four-pronged modes. 00:53:08.960 |
Explain what you mean by on-chain fact-checking. Give an example. 00:53:13.680 |
So, this is a very deep area. But fundamentally, the breakthrough of Bitcoin was that an Israeli 00:53:23.280 |
and a Palestinian or a Democrat and a Republican or a Japanese person and a Chinese person all agree 00:53:28.720 |
on who has what Bitcoin on the Bitcoin blockchain. It is essentially a way of, in a low-trust 00:51:34.640 |
but high-computation environment, using computation to establish mutually 00:51:40.160 |
agreed upon facts. And these facts are who owns what million or billion of the roughly trillion 00:53:47.360 |
dollars on the Bitcoin blockchain. And that's the kind of thing that's, I mean, people will fight 00:53:50.880 |
over a $10,000 shit. When you can establish global truth over a byte, which says, "Do you have 00:53:58.560 |
20 or 5 or 10 BTC?" You can actually generalize that to establish global truth on any other 00:54:05.760 |
financial instrument. And that's tokens, and that's loans and derivatives, and that's a huge 00:54:12.240 |
part of the economy. And then you can generalize it further to establish other kinds of assertions. 00:54:17.680 |
And that gets a little bit further afield, but not just proof of work and proof of stake, but things 00:54:22.400 |
like proof of location or proof of identity. There's various other facts you can put on there. 00:54:28.400 |
And so you get what I think of as not the paper of record, but the ledger of record, a ledger of all these facts 00:54:33.200 |
that some of them are written by so-called crypto oracles, some of them arise out of smart contracts, 00:54:37.360 |
but this is what I'm referring to. And as a today model of what that looks like, it's sort of like 00:54:41.840 |
how when someone links a tweet to prove that something happened, people link an on-chain 00:54:45.920 |
record to prove that something happened. Concrete example, do you guys remember when Vitalik Buterin 00:54:49.920 |
made that large donation earlier this year? Yes. 00:54:52.640 |
Okay. So, you want to talk about that? Explain how it happened? 00:54:58.240 |
something that was essentially trying to raise money to secure necessary equipment and 00:55:05.200 |
pharmaceutical supplies to India during a pretty bad COVID outbreak. And then I think you decided 00:55:10.960 |
that you were going to donate some money and then Vitalik basically stepped up and actually 00:55:14.480 |
gave quite a large sum. I'm not exactly sure the exact number, Balaji. 00:55:17.920 |
Yeah. So, it was an enormous amount, in illiquid terms, of this meme coin, these Shiba Inu coins. 00:55:25.280 |
But the thing is that everybody, when they want to 00:55:28.080 |
prove that this has happened because it's so unbelievable, such a large amount of money, 00:55:31.680 |
it was marked on the order of a billion when he gave it. Do you know how they proved it? They 00:55:35.440 |
didn't link a tweet. They linked a block explorer, okay, like an on-chain record that showed that 00:55:41.600 |
this debit and this credit had happened. And the big thing about this is, you know, 00:55:45.440 |
we take for granted when you link a tweet, you're taking for granted that Twitter hasn't 00:55:49.600 |
monkeyed with it. But guess what? Twitter monkeys with a lot of tweets these days. 00:55:53.040 |
A lot of people get disappeared. And so, it's not actually that good a record of what happened anymore. 00:55:57.920 |
This is deleted. This guy got suspended. You know, like, even the Trump stuff, forget about, 00:56:03.120 |
like, Trump himself, but that historical archive of what happened, you can't link it anymore. 00:56:08.000 |
It's link-rotted. Just from that, like, I know it's only one of a thousand important things about it, but it's an important thing. 00:56:11.680 |
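[Editor's note: to make the idea of linking an on-chain record concrete, here is a minimal, hypothetical sketch in Python of a hash-chained ledger. Anyone holding a copy can recompute the hashes and confirm that a recorded debit and credit actually happened, rather than trusting a platform not to have monkeyed with the record. The structure, names, and amounts are illustrative, not Bitcoin's real data format.]

```python
# Minimal sketch (not Bitcoin's actual data structures): a hash-chained ledger
# anyone can re-verify locally, instead of trusting a platform's copy of events.
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents, including the previous block's hash.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, entry: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "entry": entry}
    block["hash"] = block_hash({"prev": prev, "entry": entry})
    chain.append(block)

def verify_chain(chain: list) -> bool:
    # Recompute every hash and every back-link; any tampering breaks the chain.
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block_hash({"prev": block["prev"], "entry": block["entry"]}) != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger: list = []
append_block(ledger, {"debit": "donor", "credit": "relief_fund", "amount": 1_000_000})  # illustrative amount
append_block(ledger, {"debit": "alice", "credit": "bob", "amount": 10})

print(verify_chain(ledger))          # True
ledger[0]["entry"]["amount"] = 1     # "monkey with" the record...
print(verify_chain(ledger))          # False: anyone holding a copy can detect it
```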
In the example of, say, Taiwan, 00:56:14.480 |
and China believing Taiwan is part of the one China concept, and Taiwan believes it is a 00:56:22.880 |
country, and everybody in the world has a different vote on that, how does the block 00:56:27.760 |
chain clarify what is the fact about whether Taiwan is a country or is it a province? 00:56:33.600 |
Well, what it does is it clarifies the facts about the metadata. Who asserted that it was a country, 00:56:39.600 |
and who asserted that it was a province, and what time did they do so, and, you know, what money, 00:56:45.200 |
how much BTC did they put behind that or what have you. So, it doesn't give you everything, 00:56:49.600 |
but it does start to give you unambiguous, like, proof of who and proof of when and proof of what. 00:56:57.600 |
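[Editor's note: a rough sketch of what "proof of who, when, and what" could look like, assuming the third-party Python `cryptography` package. Each party signs a timestamped assertion, and anyone can verify the metadata, who said it, when, and with what stake, even though the chain cannot settle the underlying political question. All names and fields here are hypothetical.]

```python
# Illustrative sketch: signed assertions give you proof of who said what, and
# when, not proof that the claim itself is true. Requires: pip install cryptography
import json
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def make_assertion(private_key: Ed25519PrivateKey, claim: str, stake_btc: float) -> dict:
    body = {"claim": claim, "stake_btc": stake_btc, "timestamp": int(time.time())}
    signature = private_key.sign(json.dumps(body, sort_keys=True).encode())
    return {"body": body, "signature": signature, "signer": private_key.public_key()}

def verify_assertion(record: dict) -> bool:
    try:
        record["signer"].verify(
            record["signature"],
            json.dumps(record["body"], sort_keys=True).encode(),
        )
        return True
    except InvalidSignature:
        return False

party_a = Ed25519PrivateKey.generate()
party_b = Ed25519PrivateKey.generate()

records = [
    make_assertion(party_a, "Taiwan is a province", stake_btc=5.0),
    make_assertion(party_b, "Taiwan is a country", stake_btc=2.0),
]

# Anyone can check the metadata: who asserted what, when, and with what stake.
for r in records:
    print(r["body"], "valid signature:", verify_assertion(r))
```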
Yeah. I mean, like, the way I've kind of explained it to family members who've asked is, 00:57:01.760 |
you know, the concept of a chain is difficult, I think, for people that aren't, you know, 00:57:06.320 |
don't have a background in computer science to really grok. But, like, everyone 00:57:09.520 |
understands the concept of a database, you know, there's a bunch of data in there, except in this 00:57:12.960 |
case, the data that makes up the database is what's being verified by everyone, and it's 00:57:19.600 |
distributed, so everyone has a copy of it. I want to know what you guys think about 00:57:23.680 |
this week in the Facebook dumpster fire. Let's move to something splashy and - 00:57:27.440 |
Can I just say one thing? There's a book, Friedberg, on what you just mentioned, 00:57:30.640 |
which is called The Truth Machine by Casey and Vigna, and it gives a pop culture explanation of 00:57:35.600 |
the ledger of record type concepts I just mentioned. 00:57:37.600 |
Which I don't think a lot of people grok, Balaji, to your point, and I think, 00:57:41.520 |
you know, we're skipping past it, but it's a really important point, which is historically, 00:57:45.280 |
databases have sat on someone's servers, and whoever has those servers decides what 00:57:49.600 |
data goes in the database and how they're edited and what's allowed to come out of the database. 00:57:53.200 |
So, in this notion, generally, a database with information in it can be held, 00:57:57.280 |
and it can be held by lots of people who generally, as a group, kind of vote and decide 00:58:01.840 |
what's going to go into it. And that's the power of decentralization and how it changes, you know, 00:58:06.560 |
the information economy, which drives the world. And it's going to have a lot of implications. 00:58:11.760 |
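[Editor's note: a toy sketch of the idea Friedberg describes, under the assumption of a simple majority rule: every participant holds a copy of the database, and a proposed row is committed only if most of them accept it, instead of one server operator deciding. The validation rule is a stand-in.]

```python
# Toy sketch of a database held by many parties: a row is committed only if a
# majority of node copies accept it, rather than one server operator deciding.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    rows: list = field(default_factory=list)

    def validate(self, row: dict) -> bool:
        # Stand-in validation rule: e.g. reject obviously malformed rows.
        return "author" in row and "content" in row

def propose_row(nodes: list, row: dict) -> bool:
    votes = sum(node.validate(row) for node in nodes)
    accepted = votes > len(nodes) / 2           # simple majority
    if accepted:
        for node in nodes:                      # every copy stays in sync
            node.rows.append(row)
    return accepted

network = [Node("n1"), Node("n2"), Node("n3"), Node("n4"), Node("n5")]
print(propose_row(network, {"author": "alice", "content": "hello"}))  # True
print(propose_row(network, {"content": "missing author"}))            # False
print(len(network[0].rows), len(network[4].rows))                     # 1 1
```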
Facebook's worst month ever continues. We talked last week about Facebook 00:58:16.640 |
having an internal leak called the Facebook Papers. 00:58:21.520 |
This is a continuous leak to not only the Wall Street Journal, but apparently members of Congress 00:58:27.120 |
getting it, and the SEC. And the leaker apparently works in the safety 00:58:33.760 |
group, according to a congressperson who has been getting it and they are going to 00:58:39.360 |
uncloak themselves, and that they were leaking this out of frustration, 00:58:44.000 |
that there is human trafficking, democracy issues, and obviously self harm in girls using Instagram 00:58:51.520 |
and, you know, this research, but that's not all. Facebook is admitting for the first time this 00:58:56.960 |
week that Apple's privacy updates are hurting their ad business. And I think the story you're 00:59:01.920 |
referring to is that two groups of Facebook shareholders are claiming that the company paid 00:59:06.080 |
billions of extra dollars to the FTC to spare Mark Zuckerberg and Sheryl Sandberg from depositions 00:59:12.640 |
and personal liability in the Cambridge Analytica saga. From the Politico article, quote, 00:59:19.760 |
Zuckerberg, Sandberg and other Facebook directors agreed to authorize a multi billion dollar 00:59:23.840 |
settlement with the FTC as an express quid pro quo 00:59:26.800 |
to protect Zuckerberg from being named in the FTC's complaint, 00:59:30.160 |
made subject to personal liability or even required to sit for a deposition. According 00:59:35.200 |
to the article, the initial penalty was 106 million, but the company agreed to pay 50 times 00:59:40.640 |
more, 5 billion, to have Zuck and Sandberg spared from depositions and liability. Here is the money 00:59:47.760 |
quote, this is from the group of shareholders suing: the board has 00:59:51.680 |
never provided a serious check on Zuckerberg's unfettered authority. Instead, it has enabled him, 00:59:56.640 |
defended him, and paid billions of dollars from Facebook's corporate coffers to make his 01:00:01.280 |
problems go away. Chamath. I have one prediction. The Facebook whistleblower. You know, when you are 01:00:11.120 |
a federal whistleblower, number one is you get legal protection. But number two, which people 01:00:14.880 |
don't talk about much as you actually get a large share of the fines that are paid by the act of 01:00:19.920 |
your whistleblowing. You know, there's a couple of SEC claims that I think were settled last year, 01:00:26.480 |
whistleblower got paid, I think like 115 odd million dollars or something and just an enormous 01:00:31.040 |
amount of money and the SEC has done a fabulous job and, you know, using whistleblowers as a 01:00:35.200 |
mechanism of getting after folks and, you know, I think the SEC said they've collected almost a 01:00:39.200 |
billion dollars since this whistleblower program started that they paid out or something just an 01:00:43.200 |
enormous amount. And I had this interesting observation, which is this person leaked a 01:00:48.880 |
bunch of stuff or whistle blue to the Senate to Congress to the SEC, there probably will be an 01:00:56.320 |
enormous fine. This person may actually make billions of dollars, which will then make every 01:01:02.880 |
other employee at Facebook really angry about why they didn't leak it first, because, I guess, 01:01:08.000 |
all this stuff was sitting around and apparently now they've shut it down, right? So that 01:01:12.720 |
entire data repository around this whole topic is no longer freely available for employees to peruse. 01:01:19.920 |
Well, I think it was more like, I guess, all of this data was sitting inside of some 01:01:24.960 |
internal Facebook server, or something like that. I mean, this leaker makes money under, like, qui 01:01:31.040 |
tam. So the SEC will pay for information that results in a fine. And so they just recently 01:01:40.800 |
announced that they paid out a $114 million whistleblower payment, that was the highest 01:01:47.280 |
ever. And that this whistleblower's extraordinary actions and high-quality information 01:01:56.000 |
led to successful enforcement actions. I don't think they announced all of these whistleblower payouts. 01:02:01.840 |
They just pay them. So that's my one observation: I actually think this 01:02:06.000 |
whistleblower may make billions of dollars. So more than any of us made at Facebook, which I 01:02:09.920 |
think is hilarious. But the second thing, which is more important is that there was an article 01:02:15.680 |
in the Wall Street Journal about how sentiment amongst Americans has now really meaningfully 01:02:20.880 |
changed. And, Jason, I don't know if you have those stats, but this is a plurality of Democrats 01:02:25.840 |
where it's like 80% of anybody now basically says the government needs to check big tech. 01:02:31.840 |
The Wall Street Journal published an article yesterday highlighting a new poll conducted 01:02:36.480 |
for the Future of Tech Commission. It found that 80% of registered voters, 83% of Dems, 78% of 01:02:41.280 |
Republicans, agreed that the federal government, quote, needs to do everything it can to curb the influence 01:02:45.600 |
of big tech companies that have grown too powerful, and now use our data to reach far 01:02:49.840 |
into our lives findings are based on a survey of 2000 or so registered voters. I think 01:02:55.680 |
it's a really, really, really tough road that these guys will have to navigate these next few 01:03:05.760 |
So, um, you know, the whistleblower thing, you know, real whistleblowers, 01:03:10.960 |
in my view, are like Snowden or Assange, who are, you know, basically overseas and, 01:03:17.280 |
or in prison for telling what the US government is doing. And the difference is, I'd say, 01:03:25.520 |
their whistleblowing, if accepted and acted upon, would reduce the power of the US government. 01:03:30.960 |
Whereas these, you know, kind of awards and so on, I think they do distort incentives. It's not 01:03:38.240 |
like they're giving a billion dollars to Snowden for blowing the whistle on the NSA, the military 01:03:41.600 |
industrial complex is not happy with that. But this money is being given because government 01:03:46.880 |
is currently mad at Facebook and wants to do something that is like a quasi nationalization 01:03:51.600 |
of Facebook. Now, very similar to what happened in China, where 01:03:55.360 |
basically all the tech CEOs, they just do it much more explicitly, they're just basically 01:04:00.240 |
decapitating all of them, saying, okay, you're going to, you know, spend time with your family. 01:04:03.920 |
In the US, it's done in this sort of denied way, and so on. But the US government getting more 01:04:09.760 |
control over Facebook is not a solution to Facebook's real problems. It's just going to mean 01:04:15.600 |
backdoor surveillance of everything. Every single thing would get pushed back on, every 01:04:20.720 |
end-to-end encryption thing that they implemented. Now, the Keystone Cops in the US government, 01:04:25.200 |
they don't just surveil everything, then their database gets leaked, and it's all on the 01:04:28.960 |
internet, just like what happened with the SolarWinds hack. So I'm not denying that there 01:04:35.120 |
are, you know, like bad things about Facebook, I actually think on net, it's probably been 01:04:39.680 |
more beneficial than many people say. But I don't believe that the federal government is a solution 01:04:45.360 |
to those problems. I think the solution looks more like decentralized social networking, where people 01:04:49.920 |
have control over their own data, not simply the US government quasi nationalizing the thing. 01:04:55.040 |
you know, people bring up this decentralized social network thing, and as if it is a better 01:04:59.680 |
solution, I think you believe it's a better solution. But I rarely hear anybody talk about 01:05:04.880 |
well, if there's slander on a decentralized network, if there's child pornography, 01:05:08.480 |
if your personal banking information or your, you know, you were personally hacked, 01:05:13.280 |
and that information was put on a decentralized social network, 01:05:16.480 |
that cannot be reversed and stopped because it's decentralized, correct? 01:05:20.720 |
It depends. You know, the thing is, it's basically about 01:05:24.880 |
what does it depend? You just said that the blockchain couldn't be changed, 01:05:27.200 |
and that all the facts were permanent. So why does it depend now? 01:05:30.160 |
Well, for something like child porn, for example, if it's actually being used for that, you're not going 01:05:34.800 |
to find lots of people who are willing to run those nodes. It's something where edge cases 01:05:41.520 |
are always used to attack something. There's a famous cartoon which says, 01:05:45.440 |
how do you want this wrapped, and it's called control of the internet. And 01:05:49.200 |
it's either protect children or stop terrorists, right? And so when we talk about an edge case, 01:05:54.720 |
like that, I mean, the CSAM stuff, child porn, you know, that's used by Apple to justify 01:06:00.000 |
intrusive devices that are scanning everybody's stuff. I think the answer to a lot of those 01:06:05.440 |
things is if you're doing something that's bad, there's usually ways of going after it, 01:06:12.800 |
that don't involve this gigantic surveillance state that was after all only built 01:06:16.640 |
in the last 10 or 20 years, just normal police work that you can do. 01:06:20.640 |
If they're actually like, you know, a bad guy, there's other forms of police work, 01:06:24.560 |
you can get search warrants, you don't have this completely lawless thing, 01:06:28.880 |
where you just, you know, some guy in San Francisco hits a button and you're digitally executed. 01:06:32.640 |
And so, so, you know, it's not that there isn't any possibility for rule of law, 01:06:38.880 |
it's just that it has to actually be exercised in this. 01:06:41.440 |
Listen, I think we're a long way away from decentralized social networking actually being 01:06:46.240 |
the norm or being a solution, Jason. 01:06:54.400 |
And I think the more important thing 01:07:21.600 |
that I take away from all of this is that we've all kind of let it probably get a little bit too 01:07:28.240 |
far. And I think now that there's a plurality, something's going to happen. I don't think it's 01:07:34.560 |
going to be right. I don't think it's going to be just. It's kind of like trying to perform surgery 01:07:38.480 |
with a rusty knife. There's going to be all kinds of collateral damage. 01:07:41.120 |
You're speaking specifically to how to police Facebook, Twitter, social networks. 01:07:45.280 |
I think it's just like social media. I think we've jumped the shark at this point. And so, 01:07:49.200 |
and I think folks will want to- Would you see decentralization as the solution, 01:07:53.200 |
I do think that that's the ultimate solution for two key things. One is the most important thing 01:07:59.360 |
that we all want is to know what the actual economic relationship we're having with folks 01:08:04.160 |
that we spend time with is. So when we spend time with friends, that's friendship. There's no 01:08:09.040 |
economic relationship there necessarily. When we spend time with a lot of these applications, there 01:08:15.120 |
is a subtle economic relationship that is actually hidden from us, which is that we believe we're 01:08:20.160 |
getting value for free, but really what's happening is we're giving back a bunch of 01:08:23.680 |
information that we don't know about. When you move to a world of decentralization, you shine a light on 01:08:28.960 |
how people make money and you allow us to vote. Do I want it? Do I not? That single feature will 01:08:36.160 |
provide more clarity for people than any of this other stuff will because it'll force people to 01:08:42.400 |
then step into an economic relationship with the people that they're going to vote for. And that's 01:08:44.960 |
the ultimate solution. 01:08:45.920 |
And I think that that's just fair because those folks should be allowed to make money, 01:08:50.560 |
but we should also be allowed to know what the consequences are and then decide. 01:08:53.680 |
David, you are a big proponent of freedom of speech. We saw massive election interference, 01:09:00.320 |
the Russians trying to use social media to create division, other countries doing it to each other. 01:09:05.600 |
It's not just the US and Russia, it's China and Russia and everybody doing it to each other. 01:09:08.880 |
Do you believe that something like election interference and those bots would be solved? 01:09:14.800 |
Or it would get worse because of decentralization? Are you a fan of decentralization? Or would you 01:09:19.200 |
rather have a centralized Facebook, Twitter and somebody responsible like Zuckerberg or Jack 01:09:24.240 |
to mitigate this for democracies around the world? 01:09:27.120 |
Well, the problem that we have is we do have a problem of social networks spreading 01:09:33.920 |
lies and misinformation. However, the people who are in charge of 01:09:39.680 |
censoring those social networks keep getting it wrong. So they 01:09:44.640 |
allow disinformation to be spread by official channels, whether it's, you know, whether it's 01:09:56.400 |
There's so many official channels that get things wrong. We talked last week about 01:10:02.160 |
the Rolling Stone ivermectin hoax. There's been absolutely no censorship of that manifestly 01:10:08.560 |
wrong story. There's no labeling of it. But then a subjective opinion, like what Dave 01:10:14.480 |
Portnoy posted about AOC attending the Met Gala, which can't be factually wrong, because it's just 01:10:20.560 |
him, an opinion that gets fact checked and labeled. It's bizarre. So the situation we have today is 01:10:27.120 |
we're not preventing misinformation. We're just enforcing the cultural and political biases of the 01:10:32.800 |
people who have the power. And that is always a problem with censorship. And this is why I agree 01:10:37.280 |
with Justice Brandeis when he said that, you know, the sunlight's the best disinfectant. The answer 01:10:42.880 |
to bad speech is more speech. 01:10:44.320 |
We need to have a more free and open marketplace of ideas. 01:10:49.280 |
And that ultimately is how you prevent disinformation. 01:10:52.720 |
So decentralized Twitter, decentralized social networks, do you think that is too much sunlight 01:10:57.520 |
and too unruly? The fact that things could be spread on there and not stopped? 01:11:00.800 |
Well, I'd like to see what those things look like when we actually have them. 01:11:03.760 |
I agree with Chamath, we're still some ways off from that. 01:11:06.240 |
Are we? I mean, but yeah, can I say a few things? Yeah, go ahead. Because I think we have these 01:11:10.800 |
out there. Isn't Mastodon out there and other services out there, and they're contending- 01:11:14.160 |
Yeah. I think they're contending with these very issues. 01:11:17.040 |
This philosophy was not... Sorry, sorry, Balaji. I'll just say one thing before you go. But like 01:11:21.600 |
this, this general philosophy is not novel. You know, the internet and the web, what are being called 01:11:28.480 |
the tech platforms, were meant to be the response to the undue influence that Americans 01:11:35.840 |
thought existed already in the media when they emerged in the late 90s. 01:11:39.040 |
And you know, you can go back hundreds of years, like the state was meant to be 01:11:44.000 |
the response to the church. And you know, the media was meant to be the response to the state 01:11:49.520 |
and propaganda. And then the tech companies were meant to be the response to media. 01:11:53.120 |
And you know, now we're talking about decentralization kind of being the response 01:11:57.440 |
to tech. And at some point, you know, information accrues in this kind of asymmetric way. And it 01:12:02.160 |
becomes what's called that undue influencer. And that, I think, ends up becoming the recurring battle that 01:12:08.240 |
we'll continue to see whether or not this notion of decentralized systems actually is the endpoint or 01:12:13.840 |
is just the next stepping stone in the evolution. That is this constant kind of evolving cat and 01:12:18.800 |
mouse game of where does the information lie, who has control over it, and who's influencing people 01:12:24.000 |
ends up kind of being I think, the big narrative that we'll kind of realize over the next couple 01:12:28.400 |
decades. But I don't know, Balaji, if it becomes the endpoint, right? I mean, this feels 01:12:32.960 |
to me part of a longer form narrative. Yeah, so I think lots of things look cyclic if you look 01:12:38.400 |
at them head-on, but if you look from the z-axis, it's more like a helix where you do make progress, 01:12:42.880 |
even if from one angle it seems like you're just going in a loop. And so I think, you know, it's centralized, then you decentralize, 01:12:50.080 |
and you re centralize. It's like the concept of unbundling and bundling, you unbundle the 01:12:54.240 |
CD into individual MP3s, you re bundle into playlists, right? And so with decentralized 01:12:59.840 |
media, it's not purely every single node on their own, I think it's more like a million hubs and a 01:13:05.280 |
billion spokes. And Jason, to your point, basically, most of those hubs are not going to allow 01:13:13.520 |
things that 99.99% of people think are bad, like CSAM, you know, as for the other things, like, 01:13:19.360 |
you know, slander, hacked documents, that you mentioned, the thing is current central arbiters will falsely 01:13:23.920 |
accuse people of these things or enforce them in political ways. So the centralization is 01:13:28.880 |
actually also not a solution; it's being abused, as Sacks, you know, points out. In fact, official 01:13:33.120 |
disinformation early in COVID, which, you know, I had to like, basically beat back with a stick, 01:13:38.400 |
fortunately, you know, got some of it out in time. But, you know, people said the flu is more 01:13:41.920 |
serious, that the travel bans were 01:13:43.360 |
overreactions, that only Wuhan visitors were at risk, that avoiding handshakes is paranoid, that 01:13:48.080 |
the virus is contained, tests are available, masks don't help, you know, all that stuff, 01:13:52.160 |
the Surgeon General himself, you know, told people not to wear masks, right? BuzzFeed, 01:13:57.440 |
you know, NYT, all these guys got the story wrong, and then they rewrote history to pretend that they 01:14:01.200 |
didn't. So that, to me is a much greater danger when you have a single source of truth that's 01:14:07.360 |
false. We're picking the least bad solution. It's such a good point, because I'm old enough to 01:14:13.200 |
remember when Balaji was right about everything related to the beginning of COVID. And I'm old 01:14:18.960 |
enough to remember when in April of last year, I wrote a piece in favor of masks, when the WHO 01:14:25.360 |
and the Surgeon General and all these official channels were saying don't wear masks. So the 01:14:29.520 |
problem is with official censorship is that they keep getting it wrong. They keep getting it wrong. 01:14:37.120 |
And I want to bring up one more quick point. Okay, Jason, you mentioned foreign interference on 01:14:43.040 |
Facebook, I would really encourage anybody who's concerned about that issue, to look up you can go 01:14:49.440 |
look, you just Google the actual ads that were run by agents of the FSB on Facebook. During the 2016 01:14:58.240 |
election, you can actually see the ads they ran. I want to make two points about that. Number one, 01:15:03.600 |
the ads are ridiculous. They are. They are sort of like an absurd, you know, foreigner's perspective. 01:15:12.880 |
They're memes, they're memes with bad English. Yes. 01:15:14.960 |
Bad English. And it's somebody who doesn't understand American culture's attempt 01:15:20.240 |
to propagandize an American and you look at it, it's so ham handed. Let me give an example. It's 01:15:25.840 |
like, in one of them, they've got Jesus arm wrestling with the devil, saying, and it's 01:15:32.000 |
Jesus saying that I support Trump, and the devil saying I support Hillary Clinton. I mean, 01:15:37.600 |
literally stuff like that. Okay, it's the kind of stuff you'd see at a Trump rally. It's utterly 01:15:42.720 |
absurd. And nobody would ever be convinced by it. The second thing about it is 01:15:49.920 |
that when you actually look at the number of impressions that were created by the sum total 01:15:55.520 |
of all of the so-called disinformation of all these ads, it is a fractionally small, 01:16:01.600 |
infinitesimal drop in the ocean compared to the total number of impressions on Facebook. And so I'm 01:16:07.040 |
not disputing the fact that somebody in the basement somewhere in Moscow, perhaps was running 01:16:12.560 |
some sort of disinformation operation that was running ads on Facebook. What I am saying is that 01:16:20.720 |
when you actually look at the effect, quantitatively and qualitatively, you realize that that whole 01:16:28.000 |
story was massively blown out of proportion in order to create a hysteria that then justifies 01:16:34.560 |
censorship, that then justifies the empowerment of centralized authorities to be able to regulate 01:16:42.400 |
networks with the effect that the people in power end up censoring in ways that do not support the 01:16:50.080 |
truth, but actually just reinforce their own power. That is what the disinformation story is really about. 01:16:56.080 |
No, it's not. What it's really about, Sacks, is that Russia wants to pit people like you and 01:17:01.360 |
me against each other, you're right-leaning, I'm left-leaning. And what they want to do is create 01:17:04.960 |
this moment where you and I are fighting over this instead of fighting Russia. Russia has this as a 01:17:12.240 |
And this is a classic KGB technique, sowing battles back and forth between Americans so we don't 01:17:19.680 |
Well, I have no interest in fighting Russia per se. I'm not interested in picking fights with 01:17:25.520 |
foreign countries. And you should want to fight against Russian interference. Yeah, 01:17:30.160 |
I'm also not engaging in a fight with fellow Americans. I'm attempting to de-propagandize 01:17:36.880 |
fellow Americans who've been led to believe that Russian interference in the election was, I'm not 01:17:42.080 |
saying it didn't happen, but they've been led to 01:17:46.480 |
believe that it was a greater threat than it actually was in order to empower centralized 01:17:53.840 |
authorities to engage in censorship over social network. So I'm trying to essentially 01:17:58.080 |
deprogram an enormous amount of programming that's taken place. I do not consider that- 01:18:11.920 |
You have people on the right not trusting the counting of votes even to this day, saying the election was stolen. And then you have on the 01:18:16.160 |
left, you have the Democrats saying Russia won the election for Trump. In both cases, 01:18:21.840 |
In our last podcast, I cited the piece by Rich Lowry, which he is the editor of National 01:18:26.720 |
Review, speaking to conservatives, saying that this whole stolen election myth, yes, 01:18:31.440 |
he called it a myth, is an albatross; it will do nothing but backfire on 01:18:36.880 |
conservatives and Republicans. I think there are plenty of people who recognize that story to be 01:18:41.760 |
We're talking about something very different here. 01:18:43.440 |
I kind of just want to pivot forward a little bit. And I think it's a good question with 01:18:46.800 |
Balaji on the line. But yeah, let's just use Balaji. 01:18:49.520 |
Let's assume we move forward to this decentralized model where there isn't a central 01:18:53.920 |
media platform that adjudicates content and makes it available to the users of the platform. 01:19:00.720 |
So now in a decentralized world, take YouTube, for example. 01:19:04.160 |
YouTube has an application layer, which is the website we all use to access the videos. 01:19:08.320 |
And there's a recommendation list that pops up. And then all the media 01:19:11.600 |
that exists on the YouTube platform sits on some servers. 01:19:15.760 |
And so the application is how I'm selecting what media to watch. But that's being 01:19:19.920 |
projected to me because of the recommendations being made by Google and the search function. 01:19:24.720 |
If you guys remember, I don't know if you guys are old enough. 01:19:27.360 |
But in the early 90s, we were all on gopher boards and we were trying to find information. 01:19:32.800 |
And it was a total cluster. There's just no way to find what you're looking for. 01:19:38.000 |
And so search became 01:19:41.440 |
the great unlock for accessing content on this distributed network that we call the internet. 01:19:47.040 |
Now in the future, if all of the YouTube media is decentralized and sits on distributed servers 01:19:53.120 |
and sits on a chain or whatever, what does the application layer look like, Balaji? 01:19:58.320 |
Because how do you end up giving... Users are ultimately going to have to 01:20:02.000 |
pick an application or pick a tool to help them access media, to help them access information. 01:20:08.320 |
And there has to be to some degree a search function or an algorithmic 01:20:11.280 |
function that creates the list of what content to read. Or are they simply subscribing to nodes 01:20:16.880 |
on the network? And that's the future of decentralization where there's no longer 01:20:20.640 |
a search or recommendation function, but there is simply a subscription function. 01:20:24.240 |
And I think that to me is the big philosophical question. Because from a user experience 01:20:27.760 |
perspective, people want things easy and simple. They want to have things recommended to them. They 01:20:32.240 |
want to have a bunch of a list of things and they click down the list and they're done. 01:20:35.440 |
I'm not sure most people as we saw in web 2.0 when there were all these like, 01:20:38.720 |
make your own website stuff and you had RSS feeds. 01:20:41.120 |
And that all died because it was complicated and it was difficult and it wasn't great content for 01:20:45.520 |
most people. They preferred having stuff presented to them. So do we just end up with like 10, 20, 01:20:49.840 |
30 subscription applications that create different algorithms and different kind of access points for 01:20:55.040 |
the content? Or are people just living in a subscription universe in a decentralized world 01:20:59.440 |
when it comes to media? It's a great question. So I think first, 01:21:04.480 |
we have some vision of what that world looks like already. Because if you think about block explorers 01:21:10.960 |
and exchanges and wallets, and much of the rest of the crypto ecosystem, they are all clients to the 01:21:19.760 |
Bitcoin and Ethereum and other blockchains, right? So I wrote this post called Yes, You May Need a 01:21:27.680 |
Blockchain, where people have said, oh, blockchains are just slow databases. And that's like saying 01:21:33.360 |
the early iPhone camera is just a poor camera. Yes, it was worse on one dimension, image quality, 01:21:39.440 |
but it was far better on other dimensions. 01:21:40.800 |
It was free, it was bundled, it was programmable, it was connected to a network and so on. 01:21:47.120 |
And so blockchains, yes, they have lower transaction throughput, but they are 1000x 01:21:53.120 |
or more on another dimension, which is the number of simultaneous root users. 01:21:55.920 |
That is to say, a blockchain is a massively multiplayer database, where every user is a root 01:22:01.360 |
user. That means everybody can read the rows in the Bitcoin blockchain. And anybody 01:22:06.320 |
who has some Bitcoin can write a row that's a debit or a credit to someone else. 01:22:10.640 |
And anybody with compute can mine blocks. Okay, so it's open and permissionless. 01:22:15.760 |
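[Editor's note: a compressed, hypothetical sketch of "every user is a root user": reads are open, writes are debits and credits gated only by owning a balance (real chains use signatures, which are omitted here), and appending a block requires nothing but compute via a toy proof-of-work. Names and the difficulty setting are illustrative.]

```python
# Toy "massively multiplayer database": open reads, permissionless writes
# (anyone with a balance can debit themselves), and block production gated
# only by compute via a toy proof-of-work. Real chains use signatures here.
import hashlib
import json

balances = {"alice": 50, "bob": 10}      # open state: anyone can read this
pending, chain = [], []

def submit_transfer(sender: str, receiver: str, amount: int) -> bool:
    if balances.get(sender, 0) < amount: # the only gate: you must own the coins
        return False
    pending.append({"from": sender, "to": receiver, "amount": amount})
    return True

def mine_block(difficulty: int = 4) -> dict:
    # Anyone with compute can do this: find a nonce whose hash has leading zeros.
    prev = chain[-1]["hash"] if chain else "0" * 64
    nonce = 0
    while True:
        header = json.dumps({"prev": prev, "txs": pending, "nonce": nonce}, sort_keys=True)
        digest = hashlib.sha256(header.encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            break
        nonce += 1
    for tx in pending:                   # apply the rows to the shared state
        balances[tx["from"]] -= tx["amount"]
        balances[tx["to"]] = balances.get(tx["to"], 0) + tx["amount"]
    block = {"prev": prev, "txs": list(pending), "nonce": nonce, "hash": digest}
    pending.clear()
    chain.append(block)
    return block

submit_transfer("alice", "bob", 5)
mine_block()
print(balances)                          # {'alice': 45, 'bob': 15}
```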
Similarly, a decentralized social network could operate in the same way. 01:22:19.680 |
Whereas, let's say something like Twitter, or PayPal, the root password is not public, 01:22:25.200 |
nobody can access it. And so what this leads to, to kind of continue your point, is basically, 01:22:30.000 |
I think there's basically been three eras of the internet. The first was P2P, which was, 01:22:35.680 |
you know, peer to peer, right. And so individual nodes are point to point communicating. And that's 01:22:46.480 |
Yeah, exactly. Right. And so the great thing about this is open source is peer to peer is 01:22:51.680 |
fully programmable, you can see everything right, then, because of search, because of social, 01:22:57.200 |
because of the rise of the second era, MVC, model-view-controller. 01:23:01.520 |
Essentially, many protocols like social networking or search are not efficient if every node is 01:23:06.720 |
pinging every other node, you can't ping every other node with the index of the web, you can't 01:23:10.320 |
ping every other node with Facebook's entire social graph, when you know, you go and message 01:23:15.280 |
somebody, you want a central hub with their photos and stuff. So you can just send a few packets back 01:23:20.080 |
and forth, right. And this led to that, you know, the last 15 years, these gigantic hubs last 20 01:23:26.160 |
years, these giant hubs, search and social and messaging, two sided marketplaces, hub and spoke 01:23:31.200 |
architecture. And these hubs have global state and they're highly monetizable, you can make 01:23:35.200 |
billions of dollars off them as many of our friends have, right. By the way, the early peer to 01:23:40.160 |
peer versions failed, right? Because, I mean, you know, the alternative didn't really work 01:23:44.880 |
out. But right, the hub and spoke. Yeah, exactly. Right. So I mean, BitTorrent did exist during this 01:23:50.000 |
time, it's not like it was zero. But generally speaking, this was the era of hub and spoke MVC 01:23:54.720 |
dominance. And now we're into a third architecture, which I call CBC: client, blockchain, client. 01:24:01.360 |
And so desktop clients have a blockchain that they communicate with, for example, 01:24:06.800 |
I have a wallet, you have a wallet on your client, and then the Bitcoin 01:24:10.000 |
blockchain intermediates us, I debit and credit you. Okay, that's a different architecture than 01:24:14.960 |
both P2P and MVC; it actually combines the positive qualities of both. It's decentralized 01:24:20.160 |
and open source and programmable, like peer to peer. But it also has global state and it's 01:24:25.200 |
highly monetizable, like MVC. So it combines the best of both worlds. And it has 01:24:29.440 |
something that's very new, which is not just open source, where it's open source code; it's open 01:24:33.840 |
state, where it's an open state database. Like, open state means the back end is open also. So 01:24:39.840 |
all the applications that get built out are essentially clients to that same back end, 01:24:43.600 |
that same Bitcoin or Ethereum or what have you back end, and then you kind of exchange between 01:24:47.360 |
them. And the true scarcity now comes not from locking in your users, but from holding a currency; 01:24:52.400 |
the IP kind of gets reduced to its minimal viable thing, which is holding that 01:24:56.320 |
cryptocurrency and what you can do with it. But so give me the media, give me the 01:24:58.800 |
media analogy, right? So where does the media sit? And how do I as a user have an experience 01:25:05.360 |
on what media to view and to access? Like think about because again, like just speaking in a 01:25:09.680 |
layman's term for folks, maybe to understand, you know, most people, I don't think understand the 01:25:15.120 |
dynamics of what underlies a social network as much as they understand the user experience of 01:25:18.720 |
browsing YouTube, right? So yeah, what is what is my option in browsing the decentralized version 01:25:24.240 |
of YouTube? What does that experience end up looking like in this decentralized world? 01:25:28.080 |
Right? Who ultimately adjudicates the algorithm that defines what my recommended videos are that 01:25:34.720 |
I'm going to be accessing from these kind of, you know, different nodes? 01:25:39.520 |
You can write your own, you can pick one, there'll be a library. 01:25:41.840 |
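[Editor's note: one way to picture that answer. In the sketch below, the media catalog is open shared state that any client can read, and the "recommendation algorithm" is just a client-side function the user picks, swaps, or writes. Everything here, the catalog, the fields, the ranking rules, is illustrative, not a real protocol.]

```python
# Sketch: the media catalog is open shared state; each client app is just a
# different ranking function over that same state, chosen by the user.
from typing import Callable

# Stand-in for open, on-chain metadata that any client can read.
catalog = [
    {"id": "v1", "title": "Macro update", "likes": 120, "posted_days_ago": 1},
    {"id": "v2", "title": "Crypto explainer", "likes": 900, "posted_days_ago": 30},
    {"id": "v3", "title": "Local news", "likes": 40, "posted_days_ago": 2},
]

def rank_by_popularity(items: list) -> list:
    return sorted(items, key=lambda v: v["likes"], reverse=True)

def rank_by_recency(items: list) -> list:
    return sorted(items, key=lambda v: v["posted_days_ago"])

def build_feed(algorithm: Callable[[list], list], limit: int = 2) -> list:
    # The "application layer" is just: read open state, apply the chosen algorithm.
    return [v["title"] for v in algorithm(catalog)[:limit]]

# Two different clients, same underlying data, different editorial choices.
print(build_feed(rank_by_popularity))   # ['Crypto explainer', 'Macro update']
print(build_feed(rank_by_recency))      # ['Macro update', 'Local news']
```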
Well, that's my point, is like the picking function is what didn't work in web 1.0 or web 0.9. 01:25:47.280 |
So let's take a look at it. So first of all, today, 01:25:53.360 |
blockchains are not very scalable. But they're actually already improving very 01:25:59.440 |
rapidly. If you're following Matic, if you're following Polygon, and if you're looking 01:26:04.560 |
at what Solana is doing, like, you can do a lot more on chain in 2021 than you could in 01:26:09.360 |
2020. It's kind of like bandwidth, it just improves every year. So more applications will 01:26:15.280 |
become feasible, like search indexes. Moreover, to your specific 01:26:19.280 |
point about search indexing, blockchains actually radically simplify search engines, to such a point that 01:26:24.000 |
they are a dark horse competitor to Google. What I mean by that is Google indexed the open web because the 01:26:29.360 |
open web is open. And then social networks were like a dark web, you know, to them. That's 01:26:35.600 |
why they were so mad, Google was so mad it couldn't acquire Facebook, right? Because it couldn't 01:26:39.200 |
index it. It couldn't index Twitter, it had to do deals with them. That's all on their, you know, on their 01:26:43.760 |
server. So the social web is even harder to index than the open web. But blockchains, 01:26:50.000 |
and, you know, block explorers, which are like search engines on top of them, 01:26:53.120 |
blockchains are easier to index than either the open web or the social web. And the reason is 01:26:56.880 |
many of the problems associated with indexing just go away. For example, just one small problem, 01:27:01.680 |
as you're probably aware, when you're indexing the open web, let's say you've got a million 01:27:05.200 |
websites you're crawling, each one of them, you only have a certain bandwidth amount 01:27:09.040 |
total. So you have to figure out, okay, do I recrawl this site or this one, which one is going 01:27:13.040 |
to freshen? Okay, does this update every day or not? So you have to run all these statistical 01:27:16.960 |
estimators just to figure out when to crawl a site. All that goes away, for example, in the 01:27:20.480 |
context of a blockchain where you just subscribe, and you get a block of transactions, but it's got 01:27:25.920 |
a real-time index, and there's no advantage to being Google and having to serve it from- 01:27:30.160 |
Freeberg, we got to wrap. Besties Chamath, David Sachs, Friedberg, and our bestie guestie Balaji. 01:27:39.280 |
Thanks for coming on the pod and we will see you all next time on the All-In Podcast. Bye bye. 01:27:53.600 |
We open source it to the fans and they've just gone crazy. 01:28:16.400 |
We should all just get a room and just have one big huge orgy because they're all just useless. 01:28:20.160 |
It's like this like sexual tension that they just need to release somehow.