
Balaji Srinivasan: How to Fix Government, Twitter, Science, and the FDA | Lex Fridman Podcast #331


Chapters

0:00 Introduction
2:21 Prime number maze
28:33 Government
44:50 The Network State
54:54 Pseudonymous economy
78:40 Exit
93:21 Building a network state
141:08 Wikipedia
178:38 Fixing science
205:06 Fixing the FDA
296:14 Longevity
315:12 Donald Trump's ban from social media
344:31 War
351:39 Censorship
365:39 Social media
381:54 Wokeism and communism
400:42 Cryptocurrency
418:15 AI, AR, and VR
430:14 Advice for young people
463:08 Regulating logic

Whisper Transcript

00:00:00.000 | "Donald Trump was probably the biggest person ever
00:00:02.600 | "to be removed from social media."
00:00:04.840 | Do you understand why that was done?
00:00:06.820 | Can you steelman the case for it and against it?
00:00:09.680 | - Everybody who's watching this around the world
00:00:12.220 | basically saw, let's say, US establishment
00:00:15.600 | or Democrat-aligned folks just decapitate
00:00:19.600 | the head of state digitally, right?
00:00:23.320 | Like just boom, gone, okay?
00:00:25.440 | And they're like, "Well, if they can do that in public
00:00:27.800 | "to the US president, who's ostensibly
00:00:29.520 | "the most powerful man in the world."
00:00:31.480 | What chance does the Mexican president stand against that?
00:00:34.360 | Nothing.
00:00:35.760 | Regardless of whether it was justified on this guy,
00:00:39.400 | that means they will do it to anybody.
00:00:40.960 | Now the seal is broken.
00:00:42.040 | Just like the bailouts, as exceptional as they were
00:00:44.280 | in the first year, everybody was shocked by them,
00:00:46.440 | then they became a policy instrument.
00:00:48.720 | And now there's bailouts happening,
00:00:51.000 | every single bill is printing another whatever,
00:00:53.040 | billion dollars or something like that.
00:00:55.000 | - The following is a conversation with Balaji
00:00:59.240 | Srinivasan, an angel investor, tech founder,
00:01:02.480 | philosopher, and author of "The Network State,
00:01:06.400 | "How to Start a New Country."
00:01:08.880 | He was formerly the CTO of Coinbase
00:01:11.800 | and general partner at Andreessen Horowitz.
00:01:15.920 | This conversation is over seven hours.
00:01:19.480 | For some folks, that's too long.
00:01:21.840 | For some, too short.
00:01:23.800 | For some, just right.
00:01:26.120 | There are chapter timestamps, there are clips,
00:01:29.040 | so you can jump around or, like I prefer to do
00:01:32.120 | with podcasts and audio books I enjoy,
00:01:34.840 | you can sit down, relax with a loved human,
00:01:38.360 | animal, or consumable substance, or all three if you like,
00:01:43.360 | and enjoy the ride from start to finish.
00:01:46.160 | Balaji is a fascinating mind
00:01:48.000 | who thinks deeply about this world
00:01:50.000 | and how we might be able to engineer it
00:01:52.840 | in order to maximize the possibility
00:01:55.080 | that humanity flourishes on this fun little planet of ours.
00:01:59.840 | Also, you may notice that in this conversation,
00:02:03.080 | my eye is red.
00:02:04.720 | That's from jiu-jitsu.
00:02:06.400 | And also, if I may say so, from a life well lived.
00:02:10.800 | This is the Lex Fridman Podcast.
00:02:13.720 | To support it, please check out our sponsors
00:02:15.820 | in the description.
00:02:17.040 | And now, dear friends, here's Balaji Srinivasan.
00:02:20.520 | At the core of your belief system
00:02:23.080 | is something you call the prime number maze.
00:02:25.800 | I'm curious, I'm curious.
00:02:27.160 | We gotta start there.
00:02:28.360 | - Sure. - If we can start anywhere,
00:02:29.480 | it's with mathematics.
00:02:30.720 | Let's go. - All right, great.
00:02:32.120 | A rat can be trained to turn at every even number
00:02:37.120 | or every third number in a maze to get some cheese.
00:02:40.680 | But evidently, it can't be trained to turn at prime numbers.
00:02:44.480 | Two, three, five, seven, and then 11,
00:02:47.520 | and so on and so forth.
00:02:48.480 | That's just too abstract.
00:02:49.520 | And frankly, if most humans were dropped
00:02:51.360 | into a prime number maze,
00:02:53.440 | they probably wouldn't be able to figure it out either.
00:02:55.640 | They'd have to start counting and so on.
00:02:57.200 | Actually, it'd be pretty difficult to figure out
00:02:58.360 | what the turning rule was.
00:03:00.820 | Yet, the rule is actually very simple.
00:03:02.880 | And so, the thing I think about a lot
00:03:04.600 | is just how many patterns in life
00:03:07.520 | are we just like these rats
00:03:09.120 | and we're trapped in a prime number maze?
00:03:10.720 | And if we had just a little bit more cogitation,
00:03:14.640 | if we had a little bit more cognitive ability,
00:03:17.060 | a little bit more, whether it's brain-machine interface
00:03:20.220 | or just better physics,
00:03:21.480 | we could just figure out the next step
00:03:23.120 | in that prime number maze.
00:03:23.960 | We could just see it.
00:03:24.800 | We could just see the grid.
00:03:26.900 | And that's what I think about.
00:03:28.040 | Like, that's a big thing that drives me,
00:03:30.880 | is figuring out how do we actually conceive,
00:03:32.820 | understand that prime number maze that we're living in.
00:03:35.240 | - So, understand which patterns are just complex enough
00:03:39.680 | that they're beyond the limit of human cognition.
00:03:41.680 | - Yes.
00:03:42.800 | - And what do you make of that?
00:03:45.160 | Are the limits of human cognition a feature or a bug?
00:03:49.500 | - I think mostly a bug.
00:03:51.200 | I admire Ramanujan.
00:03:52.320 | I admire Feynman.
00:03:55.160 | I admire these great mathematicians and physicists
00:03:57.880 | who were just able to see things that others couldn't.
00:04:02.120 | And just by writing it down, that's a leap forward.
00:04:05.800 | People talk about it's not the idea, it's execution,
00:04:09.200 | but that's for trivial ideas, for great ideas,
00:04:11.640 | for Maxwell's equations or Newton's laws
00:04:14.080 | or quantum electrodynamics
00:04:16.120 | or some of Ramanujan's identities.
00:04:17.880 | That really does bring us forward,
00:04:19.940 | especially when you can check them
00:04:21.140 | and you don't know how they work, right?
00:04:22.900 | You have the phenomenological,
00:04:24.480 | but you don't have the theory underneath it.
00:04:26.220 | And then that stimulates the advancement of theory
00:04:28.020 | to figure out why is this thing actually working?
00:04:29.660 | That's actually, StatMech arose in part
00:04:33.440 | from the kind of phenomenological studies
00:04:36.180 | that were basically being done
00:04:37.080 | where people are just getting steam engines
00:04:38.500 | and so on to work.
00:04:39.540 | And then they kind of abstracted out thermodynamics
00:04:41.780 | and so on from that, right?
00:04:42.660 | So the practice led the theory rather than vice versa,
00:04:45.700 | to some extent that's happening in neural networks now,
00:04:47.420 | as you're aware, right?
00:04:48.620 | And I think that's,
00:04:50.660 | so just something that's true and that works,
00:04:53.760 | you know, if we don't know yet, that's amazing.
00:04:55.760 | And it pulls us forward.
00:04:56.600 | So I do think that the limits
00:04:58.020 | are more of a bug than a feature.
00:04:59.520 | - Is there something that humans will never be able
00:05:01.160 | to figure out about our universe,
00:05:03.140 | about the theory, about the practice of our universe?
00:05:06.460 | - Yeah, people will typically quote
00:05:08.060 | Gödel's incompleteness for such a question.
00:05:10.480 | And yeah, there are things that are provably unknowable
00:05:14.120 | or provably unprovable.
00:05:16.360 | But I think you can often get an approximate solution.
00:05:18.520 | You know, the Hilbert, you know, Hilbert's problems like,
00:05:21.900 | we will know, we must know.
00:05:23.300 | At least we should know that we can't know.
00:05:26.360 | Push to get at least an approximate solution.
00:05:28.760 | Push to know that we can't know.
00:05:29.980 | At least we push back that darkness enough
00:05:32.440 | so that we have lit up that corner
00:05:34.000 | of the intellectual universe.
00:05:35.600 | - Okay, let's actually take a bit of a tangent
00:05:38.080 | and explore a bit in a way that I did not expect we would.
00:05:42.040 | But let's talk about the nature of reality briefly.
00:05:44.320 | I don't know if you're familiar
00:05:45.480 | with the work of Don Hoffman.
00:05:47.400 | - No, I don't.
00:05:48.240 | I know Roger Penrose has like his road to reality series
00:05:51.280 | for like basic physics getting up to everything we know.
00:05:53.600 | But go ahead, tell me.
00:05:54.440 | - It's even wilder.
00:05:55.760 | In modern physics, we start to question
00:05:57.400 | of what is fundamental and what is emergent
00:05:59.960 | in this beautiful universe of ours.
00:06:01.760 | And there's a bunch of folks who think that space-time,
00:06:05.840 | as we know it, the four-dimensional space, is emergent.
00:06:08.680 | It's not fundamental to the physics of the universe.
00:06:11.840 | And the same, many argue, I think Sean Carroll
00:06:14.480 | is one of them, is that time itself,
00:06:17.120 | the way we experience it, is also emergent.
00:06:20.260 | It's not fundamental to the way our universe works.
00:06:23.400 | Anyway, those are the technical term,
00:06:26.480 | I apologize for swearing,
00:06:27.840 | those are the mind fucks of modern physics.
00:06:31.640 | But if we stroll along that road further,
00:06:35.200 | we get somebody like Donald Hoffman,
00:06:37.360 | who makes the evolutionary case
00:06:39.280 | that the reality we perceive with our eyes
00:06:41.960 | is not only an abstraction of objective reality,
00:06:45.760 | but it's actually completely detached.
00:06:48.120 | We're in a video game, essentially,
00:06:50.280 | that's consistent for all humans,
00:06:55.280 | but it's not at all connected to physical reality.
00:06:59.160 | It's an illusion.
00:07:00.000 | - It's like a version of the simulation hypothesis,
00:07:01.400 | is that his--
00:07:02.360 | - In a very distant way, but the simulation says
00:07:06.920 | that there's a sort of computational nature to reality,
00:07:09.760 | and then there's a kind of a programmer
00:07:11.520 | that creates this reality, and so on.
00:07:13.680 | No, he says that we humans have a brain
00:07:16.120 | that is able to perceive the environment,
00:07:18.680 | and evolution is produced from primitive life
00:07:22.200 | to complex life on Earth,
00:07:24.520 | produce the kind of brain that doesn't at all
00:07:26.800 | need to sense the reality directly.
00:07:29.960 | So like this table, according to Donald Hoffman,
00:07:33.120 | is not there.
00:07:34.120 | - Well, so--
00:07:36.120 | - Like, not just as an abstraction,
00:07:38.720 | like we don't sense the molecules that make up the table,
00:07:41.400 | but all of this is fake.
00:07:43.360 | - Interesting.
00:07:44.200 | So I tend to be more of a hard science person, right?
00:07:49.200 | And so just on that, people talk about qualia,
00:07:54.280 | like is your perception of green
00:07:58.200 | different from my perception of green?
00:08:00.800 | And my counterargument on that is,
00:08:03.460 | well, we know something about spectrum of light,
00:08:05.960 | and we can build artificial eyes.
00:08:07.960 | And if we can build artificial eyes, which we can,
00:08:10.240 | like they're not amazing, but you can actually,
00:08:12.120 | you can do that, you can build artificial ears, and so on.
00:08:14.200 | Obviously, we can build recording devices,
00:08:16.040 | and different cameras, and things like that.
00:08:18.480 | Well, operationally, the whole concept
00:08:21.280 | of your perception of green,
00:08:22.240 | you see green as purple, I see green as green,
00:08:23.920 | or what I call green, doesn't seem to add up,
00:08:25.880 | because it does seem like we can do engineering around it.
00:08:29.080 | So the Hoffman thing, I get why people more broadly
00:08:33.240 | will talk about a simulation hypothesis,
00:08:34.760 | 'cause it's like Feynman, many others have talked about
00:08:37.840 | how math is surprisingly useful to describe the world.
00:08:42.840 | Like very simple equations give rise
00:08:44.520 | to these complex phenomena.
00:08:45.360 | Wolfram is also on this, from a different angle
00:08:48.920 | with the cellular automata stuff.
00:08:50.560 | - It's almost suspicious how well it works.
00:08:54.600 | - Yeah, but on the other hand, it's like,
00:08:56.440 | yet we're still also in a prime number maze.
00:09:02.040 | There's things we just don't understand.
00:09:03.640 | And so--
00:09:05.480 | - Also, within the constraints of the non-prime numbers,
00:09:09.200 | we find math to be extremely effective,
00:09:12.320 | surprisingly effective.
00:09:13.680 | - Yeah, exactly.
00:09:14.560 | So maybe the math we have gets us through
00:09:17.160 | the equivalent of the even turns and the odd turns,
00:09:19.120 | but there's math we don't yet have
00:09:20.480 | that is more complex, or more complex rules for other parts.
00:09:22.760 | - We're probably all just rats in a cage.
00:09:25.680 | - I know that gets very abstract,
00:09:27.320 | but there are unsolved problems in physics.
00:09:30.840 | Like the condensed matter space,
00:09:31.920 | there's a lot of interesting stuff happening.
00:09:33.280 | My recollection, I may be out of date on this,
00:09:35.760 | things like sonoluminescence,
00:09:36.960 | we don't know exactly how they work.
00:09:38.560 | And sometimes those things that are at the edges of physics,
00:09:42.120 | in the late 1800s, I think Rutherford,
00:09:45.280 | somebody, I think it was Rutherford said,
00:09:48.120 | basically all physics is being discovered, et cetera.
00:09:50.280 | And that was obviously before quantum mechanics.
00:09:53.280 | That sort of edge case, people were looking at the Balmer
00:09:55.360 | and Paschen series and seeing this weird thing
00:09:59.400 | with the hydrogen spectrum, and it was quantized.
00:10:02.240 | And that led to the sort of phenomenological
00:10:06.400 | set of observations that led to quantum mechanics
00:10:08.400 | and everything.
00:10:10.000 | And sometimes I think the UAP stuff might be like that.
00:10:13.400 | People immediately go to aliens for UAP,
00:10:15.960 | like the unidentified aerial phenomena.
00:10:17.840 | People have been, there's surprising amount of stuff
00:10:20.520 | out there on this.
00:10:21.360 | The UK has declassified a bunch of material.
00:10:24.360 | Harry Reid, who's a senator, has talked about this.
00:10:26.280 | It's not an obviously political thing, which is good.
00:10:30.600 | It's something that, is there something happening there?
00:10:34.280 | And people had thought for a long time
00:10:35.520 | that the UAP thing was American kind of counter propaganda
00:10:40.520 | to cover up their new spy planes that were spying
00:10:43.440 | over the Soviet Union to make anybody who talked about them
00:10:45.800 | seem crazy and hysterical or whatever.
00:10:49.840 | But if the UAP thing is real,
00:10:52.960 | it could be atmospheric phenomena,
00:10:55.120 | like the aurora borealis or the Northern Lights,
00:10:58.000 | but some things we don't understand.
00:10:59.680 | It could be something like the Bomber and Passion series,
00:11:03.800 | which were the observations of emission spectra
00:11:06.440 | before quantum mechanics.
00:11:07.840 | So that's another option as opposed to doesn't exist
00:11:10.540 | or little green men.
00:11:12.380 | It could be physics we don't understand yet.
00:11:14.600 | That's one possibility.
00:11:15.440 | - Do you think there's alien civilizations out there?
00:11:18.080 | - So there's a lot of folks who have kind of written
00:11:20.320 | and talked about this.
00:11:21.160 | There's the Drake equation, which is like multiplying
00:11:24.100 | all the probabilities together.
00:11:25.420 | There's perhaps more sophisticated takes,
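The Drake equation he mentions really is just "multiplying all the probabilities together." A sketch, where every numeric value below is a made-up placeholder for illustration, not a measured quantity:

```python
# Drake equation: expected number of detectable civilizations in the galaxy.
# All parameter values here are hypothetical placeholders.
R_star = 1.0      # star-formation rate (stars/year)
f_p    = 0.5      # fraction of stars with planets
n_e    = 2.0      # habitable planets per star with planets
f_l    = 0.1      # fraction of those developing life
f_i    = 0.01     # fraction of those developing intelligence
f_c    = 0.1      # fraction emitting detectable signals
L      = 10_000   # years a civilization keeps signaling

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(N)  # ≈ 1 detectable civilization with these placeholder values
```

The structure is the interesting part: with several factors each plausibly ranging over orders of magnitude, the product is dominated by whichever terms are most uncertain.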
00:11:27.620 | like the Dark Forest, which says that if the universe
00:11:32.240 | is like a dark forest, we're the dumb ones
00:11:34.040 | that aren't hiding our presence.
00:11:36.640 | There's one calculation I saw,
00:11:37.920 | and I haven't reproduced it myself,
00:11:39.140 | but basically says that the assumption
00:11:42.240 | that other civilizations have seen ours is wrong
00:11:44.240 | because when you have like a spherical radius
00:11:46.560 | for like the electromagnetic radiation
00:11:49.280 | that's leaving our planet,
00:11:50.280 | as that sphere gets larger and larger,
00:11:51.880 | it gets like smaller and smaller amounts of energy.
00:11:55.200 | So you get farther out,
00:11:57.000 | you're not getting enough photons or what have you
00:12:01.880 | to actually detect it.
00:12:04.240 | I don't know, I actually haven't looked into the math
00:12:06.800 | behind it, but I remember seeing that argument.
00:12:08.560 | So actually, it is possible that it's so diffuse
00:12:11.260 | when you go past a certain number of light years out
00:12:14.240 | that people, that an alien civilization
00:12:16.560 | wouldn't be able to detect it, right?
00:12:17.640 | That's another argument.
00:12:18.640 | - That's more basically about signals from them,
00:12:21.200 | from us to be able to, signals colliding enough
00:12:25.160 | to find the signal from the noise.
00:12:28.080 | - Right, exactly.
00:12:28.920 | - Intelligent signal.
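The dilution argument he recalls is just the inverse-square law: an isotropic signal spreads over a sphere of area 4πr², so the flux at a receiver falls as 1/r². A quick check (the leakage power here is a made-up placeholder):

```python
import math

P_WATTS = 1e9          # hypothetical isotropic broadcast leakage power
M_PER_LY = 9.4607e15   # meters in one light-year

def flux(power_w, r_m):
    """Power per unit area at distance r: the signal spread over a sphere."""
    return power_w / (4 * math.pi * r_m ** 2)

for d in (1, 10, 100):  # distance in light-years
    print(f"{d:>4} ly: {flux(P_WATTS, d * M_PER_LY):.3e} W/m^2")
```

Doubling the distance quarters the flux, so past some radius an alien receiver simply isn't collecting enough photons to pull the signal out of the noise, which is the argument that our leakage may be undetectable.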
00:12:29.920 | - Yeah, Hanson has an article called "Grabby Aliens."
00:12:34.320 | Have you seen his thing?
00:12:35.160 | - Yes, I've seen it.
00:12:35.980 | - And so there's--
00:12:36.820 | - He's been on this podcast.
00:12:38.040 | - Oh, great.
00:12:38.880 | - He's brilliant.
00:12:39.700 | - I like him, he pushes boundaries in interesting ways.
00:12:42.720 | - In every ways, in all of the ways.
00:12:44.480 | - In all the ways, that's right.
00:12:45.560 | I like him overall.
00:12:46.720 | He's an asset to humanity.
00:12:49.440 | - Grabby Aliens, so he has this interesting idea
00:12:52.840 | that the civilizations quickly learn how to travel
00:12:57.840 | close to the speed of light,
00:12:59.720 | so we're not gonna see them until they're here.
00:13:02.160 | - Yeah, that's possible.
00:13:04.520 | I mean, one of the things is, so here's, for example,
00:13:06.680 | a mystery that we haven't yet done, right?
00:13:08.720 | Which, or we haven't really figured out yet,
00:13:10.520 | which is abiogenesis in the lab, right?
00:13:15.200 | We've done lots of things where you've got,
00:13:18.280 | you can show macromolecules binding to each other,
00:13:20.880 | you can show evidence for the so-called RNA world,
00:13:24.360 | abiogenesis is to go from non-life to life, right,
00:13:29.360 | in the lab, you can show microevolution,
00:13:31.960 | obviously with bacteria,
00:13:33.320 | you can do artificial selection on them,
00:13:35.140 | lots of other aspects of fundamental biochemistry origins
00:13:40.140 | of life stuff have been established.
00:13:41.760 | There's a lot of plausibility arguments
00:13:43.340 | about the primitive environment
00:13:45.060 | and nitrogens and carbons snapping together
00:13:47.440 | to get the RNA world is the initial hypothesis,
00:13:51.900 | but to my knowledge at least,
00:13:52.980 | we haven't actually seen abiogenesis demonstrated.
00:13:55.440 | Now, one argument is you need just like this massive world
00:13:59.360 | with so many different reps before that actually happens.
00:14:03.520 | And one possibility is if we could do atomic level
00:14:08.520 | simulations of molecules bouncing against each other,
00:14:11.960 | it's possible that in some simulation,
00:14:13.560 | we could find a path, a reproducible path to abiogenesis,
00:14:16.120 | and then just replicate that in the lab, right?
00:14:19.200 | I don't know, okay?
00:14:21.960 | But that seems to me to be like a mystery
00:14:24.180 | that we still don't fully understand,
00:14:25.720 | like an example of the prime number maze, right?
00:14:27.480 | - One of the most fascinating mysteries.
00:14:29.400 | - One of the most important, yep.
00:14:30.440 | - Yeah. - Yeah, and again,
00:14:31.960 | there may be some biochemist who's like,
00:14:33.760 | oh, well, you didn't know about X, Y, and Z
00:14:35.600 | that happened in the abiogenesis field.
00:14:37.520 | I freely confess, I'm not like, you know, au courant on it.
00:14:41.600 | The last thing I remember looking at is--
00:14:43.360 | - What's au courant mean?
00:14:44.480 | - Like up to the moment.
00:14:45.520 | - Oh, nice, that's a nice word.
00:14:47.240 | - That's a-- - Au courant.
00:14:48.560 | - I'm probably mispronouncing it, but--
00:14:49.760 | - Yeah, we'll edit it in post
00:14:52.280 | to pronounce it correctly with AI.
00:14:54.080 | - Yeah, yeah. (laughs)
00:14:55.760 | - We'll copy your voice,
00:14:56.960 | and it will pronounce it perfectly correctly in post.
00:15:00.120 | - One thing that I do think was interesting is
00:15:03.160 | Craig Venter a while back tried to make
00:15:05.520 | a minimum viable cell where he just tried to delete
00:15:09.960 | all of the genes that were not considered essential.
00:15:13.600 | And so it's like a new life form,
00:15:15.120 | and this was like almost 20 years ago and so on.
00:15:17.000 | And that thing was viable in the lab, right?
00:15:20.200 | And so it's possible that you could,
00:15:22.080 | you kind of reverse engineer.
00:15:23.520 | So you're coming at the problem from different directions,
00:15:25.300 | like RNA molecules can do quite a lot.
00:15:27.720 | You've got some reasonable assumptions
00:15:30.640 | as to how that could come together.
00:15:32.800 | You've got sort of stripped down minimum viable life forms.
00:15:36.440 | And so it's not there isn't stuff here.
00:15:38.760 | You can see microevolution,
00:15:40.000 | you can see at the sequence level.
00:15:41.520 | You know, if you do molecular phylogenetics,
00:15:42.800 | you can actually track back the basis.
00:15:44.320 | There's actually, so it's not like there's no evidence here.
00:15:46.120 | There's a lot of tools to work with.
00:15:47.600 | But this, in my view, is a fascinating area.
00:15:49.400 | And actually also relevant to AI,
00:15:51.240 | because another form of abiogenesis would be
00:15:53.600 | if we were able to give rise to a different
00:15:56.040 | branch of life form, the silicon-based
00:15:57.760 | as opposed to carbon-based, you know, to stretch a point.
00:16:00.700 | You give rise to something that actually does meet
00:16:03.040 | the definition of life or some definition of life.
00:16:05.400 | - What do you think that definition is
00:16:07.400 | for an artificial life form?
00:16:09.400 | 'Cause you mentioned consciousness.
00:16:11.400 | - Yeah. - When will it give us pause
00:16:13.200 | that we created something that feels by some definition
00:16:18.200 | or by some spiritual, poetic, romantic,
00:16:24.240 | philosophical, mathematical definition
00:16:26.600 | that it is alive. - Right.
00:16:28.400 | - And we wouldn't want to kill it.
00:16:30.120 | - So, couple of remarks on that.
00:16:31.660 | One is Francis Crick, of Watson and Crick,
00:16:36.040 | before he died, I think his last paper
00:16:37.840 | was published on something called the claustrum, okay?
00:16:40.600 | And the thing is that, you know,
00:16:41.880 | sometimes in biology or in any domain,
00:16:44.600 | people are sort of discouraged from going after
00:16:46.440 | the big questions, right?
00:16:48.480 | But he proposed the claustrum is actually the organ
00:16:51.740 | that is the seat of consciousness.
00:16:53.040 | It's like this sheath that covers the brain.
00:16:56.880 | And for mice, if you, and again,
00:17:00.120 | I may be recollecting this wrong, so you can look,
00:17:02.120 | but my recollection is in mice, if you disrupt this,
00:17:06.400 | the mouse is very disoriented, right?
00:17:08.760 | It's like, it's the kind of thing which, you know,
00:17:11.600 | Watson and Crick were all about structure implies function,
00:17:14.200 | right, they found the structure of DNA, this amazing thing.
00:17:16.560 | And, you know, they remarked in this very understated way
00:17:19.840 | at the end of the paper that, well, obviously this gives
00:17:23.240 | a basis for how the genetic material might be replicated
00:17:25.600 | and error corrected because, you know,
00:17:27.520 | helix unwinds and you copy paste it, right?
00:17:29.200 | So he was a big structure function person,
00:17:31.440 | and that applies not just at the protein level,
00:17:33.280 | not just at the level of DNA,
00:17:35.320 | but potentially also at the level of organs.
00:17:37.540 | Like the claustrum is kind of this system integrated level,
00:17:40.440 | right, it's like the last layer in the neural network
00:17:42.840 | or something, you know?
00:17:44.640 | And so that's the kind of thing
00:17:47.160 | that I think is worth studying.
00:17:48.720 | So consciousness is another kind of big question,
00:17:52.760 | abiogenesis is a big question, the prime number
00:17:54.560 | maze is a big question, consciousness is a big question.
00:17:56.520 | And, you know, then definition of life, right?
00:18:01.280 | There's folks, gosh, there's, I think,
00:18:03.240 | so this one is something I'd have to Google around,
00:18:06.080 | but there was a guy, I think at Santa Fe Institute
00:18:08.040 | or something, who had some definition of life
00:18:10.040 | and like some thermodynamic definition.
00:18:12.960 | But you're right that it's gonna be
00:18:15.080 | a multi-feature definition.
00:18:17.800 | We might have a Turing test like definition, frankly,
00:18:20.000 | which is just if enough humans agree it's alive,
00:18:22.600 | it's alive, right?
00:18:24.020 | And that might frankly be the operational definition.
00:18:26.800 | 'Cause, you know, viruses are like this boundary case,
00:18:28.840 | you know, are they alive or not?
00:18:30.160 | Most people don't think they're alive,
00:18:31.640 | but they're on, they're kind of,
00:18:34.120 | they're more alive than a rock in a sense.
00:18:35.960 | - Well, I think in a world that we'll talk about today
00:18:39.920 | quite a bit, which is the digital world,
00:18:42.360 | I think the most fascinating, philosophically
00:18:45.280 | and technically definition of life
00:18:47.800 | is life in the digital world.
00:18:50.480 | So chatbots, essentially creatures,
00:18:54.600 | whether they're replicas of humans
00:18:57.200 | or totally independent creations,
00:19:00.640 | perhaps in an automatic way,
00:19:02.660 | I think there's going to be chatbots
00:19:05.840 | that we would ethically be troubled by
00:19:08.720 | if we wanted to kill them.
00:19:10.680 | They would have the capacity to suffer.
00:19:12.860 | They would be very unhappy with you trying to turn them off.
00:19:17.860 | And then there'll be large groups of activists
00:19:21.080 | that will protest and they'll go to the Supreme Court
00:19:24.680 | of whatever the Supreme Court looks like
00:19:27.320 | in 10, 20, 30, 40 years.
00:19:29.640 | And they will demand that these chatbots
00:19:33.400 | would have the same rights as us humans.
00:19:35.680 | Do you think that's possible?
00:19:37.120 | - I saw that Google engineer who was basically saying
00:19:39.680 | this had already happened.
00:19:40.880 | And I was surprised by it because it just,
00:19:45.880 | when I looked at the chat logs of it,
00:19:48.080 | it didn't seem particularly interesting.
00:19:50.960 | On the other hand, I can definitely see it.
00:19:53.160 | I mean, GPT-3 for people who haven't paid attention
00:19:57.040 | shows that serious step-ups are possible.
00:19:59.600 | And obviously, you've talked about AI
00:20:02.320 | in your podcast a ton.
00:20:05.040 | Is it possible that GPT-9 or something is kind of like that?
00:20:09.120 | Or GPT-15 or GPT-4, maybe?
00:20:12.100 | But--
00:20:14.800 | - Yeah, for people just listening,
00:20:16.900 | there's a deep skepticism in your face.
00:20:19.260 | - Yeah, the reason being because,
00:20:22.160 | you know what's possible?
00:20:25.220 | It's possible that you have a partition of society
00:20:27.920 | on literally this basis.
00:20:29.300 | That's one model where there's some people,
00:20:32.420 | just like there's vegetarians and non-vegetarians, right?
00:20:35.340 | There may be machines have life
00:20:40.340 | and machines are machines, you know?
00:20:42.580 | Like, or something like that, right?
00:20:44.740 | You could definitely imagine some kind of partition
00:20:48.900 | like that in the future where your fundamental
00:20:51.260 | political social system, that's a foundational assumption.
00:20:55.620 | And, you know, is AI, does it deserve the same rights
00:21:01.380 | as like a human or, for example,
00:21:02.860 | a corporation is an intermediate?
00:21:05.580 | Did you see the thing about how human
00:21:08.380 | different corporations are?
00:21:09.500 | Have you seen that infographic?
00:21:10.980 | It's actually funny.
00:21:11.820 | So it's like-- - There's a spectrum.
00:21:12.940 | - There's a spectrum.
00:21:13.760 | So for example, Disney is considered about as human
00:21:15.980 | as like a dog, but like Exxon,
00:21:18.340 | I may be remembering this wrong,
00:21:19.620 | but they had like a level with like human at one end
00:21:21.620 | and like rock at the other.
00:21:22.460 | - Does it have to do with corporate structure?
00:21:23.860 | What's the--
00:21:24.700 | - I think it's about people's empathy for that corporation,
00:21:26.660 | their brand identity.
00:21:28.160 | But it's interesting to see that, first of all,
00:21:31.360 | people sort of do think of corporations as being more,
00:21:35.440 | like the branding is really what they're responding to.
00:21:37.360 | - Well, that's what, I mean, they're also responding.
00:21:39.840 | You know, I have a brand of human that I'm trying to sell
00:21:43.920 | and it seems to be effective for the most part.
00:21:46.120 | - Sure.
00:21:46.940 | - Although it has become like a running joke
00:21:49.000 | that I might be a robot.
00:21:50.520 | - Right.
00:21:51.360 | - Which means the brand is cracking.
00:21:53.440 | - Could be.
00:21:55.400 | - It's seeping through.
00:21:56.500 | But I mean, in that sense, I just, I think,
00:22:00.360 | I don't see a reason why chatbots can't manufacture
00:22:05.360 | the brand of being human, of being sentient.
00:22:09.760 | - I mean, that is the Turing test,
00:22:11.320 | but it's like the multiplayer Turing test.
00:22:12.760 | Now that actually a fair number of chatbots
00:22:14.240 | have passed the Turing test,
00:22:15.360 | I'd say there's at least two steps up, right?
00:22:18.240 | One is a multiplayer Turing test
00:22:21.280 | where you have chatbots talking to each other.
00:22:25.000 | And then you ask, can you determine the difference
00:22:27.980 | between n chatbots talking to each other
00:22:30.820 | and clicking buttons and stuff in apps
00:22:33.020 | and humans doing that?
00:22:34.340 | And I think we're very far off,
00:22:35.620 | or I shouldn't say very far off.
00:22:36.740 | At least, I don't know how far we are in terms of time,
00:22:40.220 | but we're still far off in terms of a group of n chatbots
00:22:44.260 | looking like their digital output
00:22:46.020 | is like the group of n humans,
00:22:47.180 | like go from the Turing test
00:22:49.140 | to the multiplayer Turing test.
00:22:50.060 | That's one definition.
00:22:51.240 | Another definition is to be able to kind of swap in
00:22:56.240 | and you're not just convincing one human
00:22:59.600 | that this is a human for a small session.
00:23:03.000 | You're convincing all humans
00:23:05.120 | that this is a human for n sessions.
00:23:06.800 | Remote work actually makes this possible, right?
00:23:09.080 | That's another definition of a multiplayer Turing test
00:23:11.280 | where basically you have a chatbot that's fully automated
00:23:16.280 | that is earning money for you
00:23:18.040 | as an intelligent agent on a computer
00:23:19.860 | that's able to go and get remote work jobs and so on.
00:23:22.040 | I would consider that next level, right?
00:23:23.920 | If you could have something that was like that,
00:23:25.980 | that was competent enough to,
00:23:28.480 | I mean, 'cause everything on a computer can be automated.
00:23:30.960 | Literally, you could be totally hands-free,
00:23:32.200 | just like autonomous driving.
00:23:33.600 | You could have autonomous earning.
00:23:35.240 | As a challenge problem,
00:23:37.320 | if you were Microsoft or Apple
00:23:39.200 | and you had legitimate access to the operating system,
00:23:42.760 | just like Apple says,
00:23:43.720 | "Can you send me details of this event?"
00:23:45.920 | A decentralized thing could, in theory,
00:23:48.480 | log the actions of 10,000 or 100,000 or a million people.
00:23:53.480 | And with cryptocurrency,
00:23:55.960 | you could even monitor a wallet that was on that computer.
00:23:59.220 | And you could see what long run series of actions
00:24:02.760 | were increasing or decreasing this digital balance.
00:24:05.340 | You see what I'm saying?
00:24:06.540 | So you start to get, at least conceptually,
00:24:09.300 | it'd be invasive and there'd be a privacy issue and so on.
00:24:12.740 | Conceptually, you could imagine an agent
00:24:15.180 | that could learn what actions humans were doing
00:24:18.100 | that resulted in the increase
00:24:19.660 | of their local cryptocurrency balance.
00:24:21.940 | There may be better ways to formulate it,
00:24:23.380 | but that I consider a challenge problem
00:24:25.440 | is to go from the Turing test to a genuine intelligent agent
00:24:28.580 | that can actually go and make money for you.
00:24:30.220 | If you can do that, that's a big deal.
00:24:31.260 | People obviously have trading bots and stuff,
00:24:33.100 | but that would be the next level.
00:24:34.340 | It's typing out emails, it's creating documents.
00:24:36.220 | - So mimic human behavior in its entirety.
00:24:39.060 | - Yeah, that's right.
00:24:39.900 | And it'll schedule Zooms, it'll send emails.
00:24:42.620 | It'll essentially, 'cause if you think about it,
00:24:45.260 | a human is hitting the keys and clicking the mouse,
00:24:47.180 | but just like a self-driving car,
00:24:48.940 | the wheel rotates by itself, right?
00:24:51.340 | Those keys are effectively just,
00:24:53.100 | it's like the Automator app in Apple, right?
00:24:56.500 | Everything's just moving on the screen.
00:24:58.980 | You're seeing it there and it's just an AI.
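[The "autonomous earning" challenge problem described above — log a user's actions alongside a local wallet balance, then learn which action sequences raise it — can be sketched minimally as follows. Everything here is illustrative: the action names, session data, and scoring scheme are invented for this sketch, and no real wallet or OS instrumentation API is assumed.]

```python
from collections import defaultdict

def score_action_sequences(sessions):
    """sessions: list of (actions, balance_before, balance_after) tuples.
    Returns the average balance delta observed for each action sequence."""
    totals = defaultdict(lambda: [0.0, 0])  # sequence -> [sum of deltas, count]
    for actions, before, after in sessions:
        key = tuple(actions)
        totals[key][0] += after - before
        totals[key][1] += 1
    return {seq: s / n for seq, (s, n) in totals.items()}

# Toy usage: two observed workflows, one that correlates with earning, one not.
sessions = [
    (["open_email", "send_invoice"], 100.0, 150.0),
    (["open_email", "send_invoice"], 150.0, 190.0),
    (["browse_feed"], 190.0, 190.0),
]
scores = score_action_sequences(sessions)
best = max(scores, key=scores.get)
print(best)  # ('open_email', 'send_invoice')
```

[A real agent would need far richer state than whole-sequence averages, but the scoring loop above is the conceptual core: correlate long runs of actions with changes in a digital balance.]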
00:25:00.740 | - It's kind of hilarious that the I'm not a robot
00:25:03.460 | click thing actually works.
00:25:05.980 | 'Cause I actually don't know how it works,
00:25:09.500 | but I think it has to do with the movement of the mouse,
00:25:13.460 | the timing, and they know that it's very difficult
00:25:17.220 | for currently for a bot to mimic human behavior
00:25:19.940 | in the way they would click that little checkbox.
00:25:22.260 | - Yeah, exactly.
00:25:23.100 | I think it's something, I mean, again,
00:25:25.140 | my recollection on that is it's like a pile
00:25:27.780 | of highly obfuscated JavaScript with all kinds,
00:25:29.900 | it looks like a very simple box,
00:25:31.620 | but it's doing a lot of stuff
00:25:32.780 | and it's collecting all kinds of instrumentation.
00:25:35.140 | And yeah, exactly.
00:25:36.180 | Like a robot is just a little too deterministic
00:25:39.420 | or if it's got noise, it's like Gaussian noise.
00:25:42.020 | And the way humans do it is just not something
00:25:45.140 | that you'd used to be able to do
00:25:46.900 | without collecting thousands and thousands
00:25:48.620 | and thousands of human traces doing it.
00:25:50.660 | But it is a predator-prey dynamic on that.
00:25:52.140 | - Well, and then the computer--
00:25:53.780 | - Or millions of human traces, I don't know.
00:25:55.100 | - The computer just sees the JavaScript.
00:25:56.780 | It needs to be able to look outside the simulation
00:26:01.140 | for the computer, the world is like,
00:26:03.660 | it doesn't, the computer doesn't know
00:26:04.940 | about the physical world.
00:26:05.820 | So it has to look outside of its world
00:26:08.060 | and introspect back on this simple box,
00:26:11.700 | which is kind of, I think that's exactly what mushrooms do
00:26:15.340 | or like psychedelics is you get to go outside
00:26:18.220 | and look back in and that's what a computer needs to do.
00:26:21.060 | - I do wonder whether they actually give people insight
00:26:23.380 | or whether they give people the illusion of insight.
00:26:26.060 | - Is there a difference?
00:26:27.300 | - Yeah, because, well, actual insight,
00:26:29.700 | actual insight is, again, Maxwell's equations.
00:26:33.860 | You're able to shift the world with that.
00:26:35.660 | There's a lot of practical devices that work.
00:26:37.980 | The illusion of insight is I'm Jesus Christ
00:26:41.460 | and nothing happens, right?
00:26:43.660 | So I don't know.
00:26:45.140 | I think those are quite different.
00:26:46.860 | - I don't know.
00:26:47.700 | I think you can fake it till you make it on that one,
00:26:50.420 | which is insight in some sense is revealing a truth
00:26:55.220 | that was there all along.
00:26:57.100 | - Yeah, so I mean, I guess like I'm talking
00:26:59.180 | about technical insight where you have,
00:27:02.700 | this is the thing we were talking about actually
00:27:04.900 | before the podcast, like technical truths
00:27:06.340 | versus political truths, right?
00:27:07.800 | Some truths, they're on a spectrum
00:27:10.100 | and there's some truths that are actually entirely political
00:27:12.940 | in the sense that if you can change the software
00:27:15.340 | in enough people's heads,
00:27:16.500 | you change the value of the truth.
00:27:18.540 | For example, the location of a border
00:27:20.860 | is effectively consensus between large enough groups
00:27:23.700 | of people.
00:27:25.100 | Who is the CEO?
00:27:26.700 | That's consensus among a certain group of people.
00:27:29.640 | What is the value of a currency or any stock, right?
00:27:32.020 | That market price is just the psychology
00:27:34.620 | of a bunch of people.
00:27:35.460 | Like literally, if you can change enough people's minds,
00:27:37.220 | you can change the value of the border
00:27:39.060 | or the position of the hierarchy
00:27:40.540 | or the value of the currency.
00:27:41.380 | Those are purely political truths.
00:27:43.220 | Then all the way on the other end are technical truths
00:27:45.660 | that exist independent of whatever any one human
00:27:48.900 | or all humans think, like the gravitational constant, right?
00:27:52.660 | Or the diameter of a virus.
00:27:54.500 | Those exist independent of the human mind.
00:27:57.860 | Change in a few minds doesn't matter.
00:27:59.100 | Those remain constant.
00:28:00.860 | And then you have things that are interestingly
00:28:03.740 | in the middle where cryptocurrency has tried
00:28:05.780 | to pull more and more things from the domain
00:28:07.580 | of political truths into technical truths,
00:28:10.140 | where they say, okay, the one social convention we have
00:28:13.860 | is that if you hold this amount of Bitcoin,
00:28:17.780 | or that if you hold this private key,
00:28:19.140 | you hold this Bitcoin.
00:28:19.980 | And then we make that very hard to change
00:28:21.340 | 'cause you have to change a lot of technical truths.
00:28:23.100 | So you can push things to this interesting
00:28:27.020 | intermediate zone.
00:28:27.860 | - Yeah, the question is how much of our world
00:28:31.020 | can we push into that?
00:28:33.340 | - Right.
00:28:34.180 | - And that takes us in a nonlinear fascinating journey
00:28:39.180 | to the question I wanted to ask you in the beginning,
00:28:43.140 | which is this political world that you mentioned
00:28:46.860 | in the world of political truth.
00:28:49.100 | As we know it, in the 20th century,
00:28:51.300 | in the early 21st century, what do you think works well
00:28:55.140 | and what is broken about government?
00:28:57.700 | - The fundamental thing is that we can't easily
00:29:01.900 | and peacefully start new opt-in governments.
00:29:05.940 | And--
00:29:06.780 | - Like startup governments.
00:29:07.860 | - Yeah, and what I mean by that is basically,
00:29:10.440 | you can start a new company, you can start a new community,
00:29:15.460 | you can start a new currency even these days.
00:29:17.420 | You don't have to beat the former CEO in a duel
00:29:20.580 | to start a new company.
00:29:22.700 | You don't have to become head of the World Bank
00:29:25.780 | to start a new currency, okay?
00:29:28.120 | Because of this, yes, if you want to,
00:29:32.960 | you can join, I don't know, Microsoft
00:29:37.240 | or name some company that's a GameStop
00:29:40.200 | and you can try to reform it, okay?
00:29:42.360 | Or you can start your own.
00:29:44.160 | And the fact that both options exist mean that
00:29:47.200 | you can actually just start from scratch.
00:29:50.560 | And that's just, I mean,
00:29:51.400 | the same reason we have a clean piece of paper, right?
00:29:53.080 | I've mentioned this actually in the "Network State" book.
00:29:55.800 | I'll just quote this bit,
00:29:56.720 | but we want to be able to start a new state peacefully
00:29:59.240 | for the same reason we want a bare plot of earth,
00:30:01.440 | a blank sheet of paper, an empty text buffer,
00:30:03.660 | a fresh job or a clean slate,
00:30:05.400 | 'cause we want to build something new
00:30:06.280 | without historical constraint, right?
00:30:08.120 | For the same reason you hit plus and do docs.new,
00:30:10.440 | you know, like create a new doc.
00:30:11.920 | It's for the same reason, right?
00:30:13.280 | Because you don't have to backspace,
00:30:14.680 | you don't have just, like, 128 kilobytes of space
00:30:17.560 | where you have to backspace the old document
00:30:19.800 | to create the new one.
00:30:21.200 | So that's a fundamental thing that's wrong
00:30:23.040 | with today's governments.
00:30:24.320 | And it's a meta point, right?
00:30:25.780 | Because it's not any one specific reform,
00:30:27.400 | it's a meta reform of being able to start new countries.
00:30:30.200 | - Okay, so that's one problem,
00:30:32.280 | but you know, you could push back and say,
00:30:34.240 | that's a feature,
00:30:36.240 | because a lot of people argue that tradition is power.
00:30:39.560 | Through generation, if you try a thing long enough,
00:30:43.360 | which is the way I see marriage,
00:30:44.880 | there's value to the struggle
00:30:47.040 | and the journey you take through the struggle
00:30:49.120 | and you grow and you develop ideas together,
00:30:52.280 | you grow intellectually, philosophically together.
00:30:54.520 | And that's the idea of a nation that spans generations,
00:30:57.700 | that you have a tradition that becomes more,
00:31:02.440 | that strives towards the truth
00:31:06.740 | and is able to arrive there, or no, not arrive,
00:31:09.680 | but take steps towards there through the generations.
00:31:12.640 | So you may not want to keep starting new governments.
00:31:15.940 | You may want to stick to the old one
00:31:19.620 | and improve it one step at a time.
00:31:23.240 | So just because you're having a fight inside a marriage
00:31:25.760 | doesn't mean you should get a divorce
00:31:27.000 | and go on Tinder and start dating around.
00:31:29.840 | That's the pushback.
00:31:31.480 | So it's not obvious that this is a strong feature to have
00:31:35.320 | to launch new governments.
00:31:36.840 | - There's several different kind of lines of attack
00:31:39.600 | or debate or whatever on this, right?
00:31:41.800 | First is, yes, there's obviously value to tradition.
00:31:46.040 | And people say, this is Lindy and that's Lindy.
00:31:49.120 | It's been proven for a long time and so on.
00:31:52.320 | But of course there's a tension
00:31:54.240 | between tradition and innovation.
00:31:56.200 | Like going to the moon wasn't Lindy, it was just awesome.
00:32:01.200 | And artificial intelligence is something that's very new.
00:32:05.800 | New is good, right?
00:32:07.360 | And this is a tension within humanity actually itself,
00:32:10.080 | 'cause it's way older than all of these nations.
00:32:12.800 | I mean, humans are tens of thousands of years old.
00:32:15.360 | The ancestors of humans are millions of years old, right?
00:32:17.800 | And you go back far enough,
00:32:19.440 | and the time that we know today
00:32:20.880 | of the sessile farmer and soldier is,
00:32:24.800 | if you go back far enough,
00:32:25.640 | you wanna be truly traditional,
00:32:27.640 | well, we're actually descended from hunter gatherers
00:32:29.640 | who were mobile and wandered the world
00:32:32.080 | and there weren't borders and so on.
00:32:33.480 | They kind of went where they want, right?
00:32:35.320 | And people have done historical reconstructions
00:32:39.200 | of like skeletons and stuff like that.
00:32:41.160 | And many folks report that the transition to agriculture
00:32:45.160 | and being sessile resulted in diminution of height.
00:32:49.960 | People had like tooth decay and stuff like that.
00:32:52.040 | The skeletons show that people had traded off upside for stability.
00:32:56.200 | Right, that's what the state was.
00:32:57.320 | That was what these sessile kinds of things were.
00:32:58.960 | Now, of course, they had more likelihood
00:33:02.760 | of living consistently.
00:33:04.760 | You could support larger population sizes,
00:33:06.440 | but it had lower quality of life, right?
00:33:08.520 | And so the hunter gatherer,
00:33:10.320 | maybe that's actually our collective recollection
00:33:12.400 | of a Garden of Eden where people,
00:33:14.120 | just like a spider kind of knows innately how to build webs
00:33:18.160 | or a beaver knows how to build dams.
00:33:20.400 | Some people theorize that the entire Garden of Eden
00:33:23.240 | is like a sort of built-in neural network recollection
00:33:27.600 | of this pre-sessile era where we're able to roam around
00:33:31.120 | and just pick off fruits and so on,
00:33:32.320 | low population density.
00:33:33.880 | So the point is that I think what we're seeing is a V3.
00:33:37.760 | You go from the hunter gatherer to the farmer and soldier,
00:33:40.640 | the sessile nations are here
00:33:42.480 | and they've got borders and so on,
00:33:44.160 | to kind of the V3, which is the digital nomad,
00:33:46.960 | the new hunter gatherer.
00:33:48.000 | We're going back to the future
00:33:49.200 | because what's even older than nations is no nations, right?
00:33:53.520 | Even more traditional than tradition
00:33:55.160 | is being international, right?
00:33:58.160 | And so we're actually tapping
00:33:59.360 | into that other huge thread in humanity,
00:34:01.800 | which is the desire to explore, pioneer, wander, innovate.
00:34:06.800 | And I think that's important.
00:34:08.000 | - So the way to make America great again
00:34:09.560 | is to dissolve it completely into oblivion.
00:34:12.880 | No, that's a joke.
00:34:13.720 | - Yeah, yeah, I know it's a joke.
00:34:15.240 | - Humor, I'm learning this new thing.
00:34:18.200 | - Yes, a new thing for the road.
00:34:19.880 | The chatbot emulation isn't fully working there.
00:34:21.920 | - Yeah, yeah, glitch.
00:34:23.360 | That's why we're in the beta.
00:34:24.880 | - And let me say one other thing about this,
00:34:26.440 | which is, you know, there are,
00:34:29.360 | I mean, everybody in the world,
00:34:32.520 | okay, let's say, I don't know what percentage,
00:34:34.080 | let's say 99.99%, or it rounds to that number,
00:34:39.160 | of political discourse in the US
00:34:42.120 | focuses on trying to fix the system.
00:34:44.720 | If those folks, I mean, 0.01% of the energy
00:34:47.520 | is going towards building a new system,
00:34:49.320 | that seems like a pretty good portfolio strategy, right?
00:34:52.040 | Or is 100% supposed to go and edit this code base
00:34:54.760 | from 200-something years ago?
00:34:56.400 | I mean, the most American thing in the world
00:34:58.280 | is going and leaving your country
00:35:01.120 | in search of a better life.
00:35:02.400 | America was founded 200 years ago by the founding fathers.
00:35:05.120 | It's not just a nation of immigrants,
00:35:06.800 | it's a nation of emigrants, right?
00:35:08.640 | Emigration from other countries to the US
00:35:11.440 | and actually also emigration within the US.
00:35:13.640 | There's this amazing YouTube video called,
00:35:15.960 | it's like 50 states, US population, I think 1792.
00:35:19.940 | It says 2050, so they've got a simulation.
00:35:21.740 | So you just stop it at 2019 or 2020.
00:35:23.920 | But it shows that like Virginia was like number one
00:35:26.400 | early on and then it lost ground and like New York gained.
00:35:29.640 | And then like Ohio was a big deal in the early 1800s.
00:35:32.920 | And it was like the father of presidents, it generated
00:35:34.720 | these presidents, and later Illinois and Indiana.
00:35:37.320 | And then California only really came up
00:35:38.960 | in the 20th century, like during the great depression.
00:35:41.720 | And now we're entering the modern era
00:35:43.080 | where like Florida and Texas have risen
00:35:44.600 | and New York and California have dropped.
00:35:46.560 | And so interstate competition,
00:35:48.120 | it's actually just like inter-currency competition.
00:35:50.380 | You've got trading pairs, right?
00:35:52.120 | You sell BTC, buy ETH, you sell, you know,
00:35:57.120 | Solana, or, you know, sell Monero, buy Zcash, right?
00:36:00.680 | Each of those trading pairs gives you signal for today
00:36:04.200 | on this currency is down or up
00:36:06.360 | relative to this other currency.
00:36:07.920 | In the same way, each of those migration pairs,
00:36:10.800 | someone goes from New York to Ohio, Ohio to California,
00:36:13.360 | gives you information on the desirability
00:36:16.000 | of different states.
00:36:16.840 | You can literally form a pairs matrix like this over time,
00:36:19.800 | very much like the link matrix.
00:36:21.200 | That's shaped America in a huge way.
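[The "pairs matrix" analogy above — each migration pair gives a signal, "very much like the link matrix" — can be sketched by treating each move from state A to state B as a weighted link and ranking destinations with a PageRank-style power iteration. The flow numbers below are invented purely for illustration; they are not real migration data.]

```python
flows = {  # (from_state, to_state) -> number of movers (toy data)
    ("NY", "FL"): 60, ("NY", "TX"): 30, ("CA", "TX"): 50,
    ("CA", "FL"): 20, ("IL", "TX"): 25, ("FL", "TX"): 10,
}
states = sorted({s for pair in flows for s in pair})

def rank(flows, states, damping=0.85, iters=50):
    """PageRank-style ranking: each state's outflow is normalized to 1,
    and score flows along migration links each iteration."""
    out_totals = {s: sum(v for (a, _), v in flows.items() if a == s) for s in states}
    score = {s: 1.0 / len(states) for s in states}
    for _ in range(iters):
        new = {s: (1 - damping) / len(states) for s in states}
        for (a, b), v in flows.items():
            new[b] += damping * score[a] * v / out_totals[a]
        # States with no outflow spread their score evenly (dangling nodes).
        dangling = sum(score[s] for s in states if out_totals[s] == 0)
        for s in states:
            new[s] += damping * dangling / len(states)
        score = new
    return score

scores = rank(flows, states)
print(max(scores, key=scores.get))  # TX: the state attracting the most weighted inflow here
```

[With real year-by-year flow data, the same matrix over time would show exactly the Virginia → New York → Ohio → California → Texas/Florida dynamic described in the conversation.]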
00:36:22.800 | And so, you know, you ask if this nation of immigrants
00:36:26.520 | that was founded by men younger than us, by the way,
00:36:28.640 | the founding fathers were often in their 20s, right?
00:36:31.220 | Who, you know, endorsed the concept of a proposition nation
00:36:35.160 | who've given rise to a country of founders and pioneers
00:36:40.160 | who've literally gone to the moon, right?
00:36:42.560 | Those folks would think that this is the end of history,
00:36:46.200 | that that's it, we're done.
00:36:47.840 | Like, we've done everything else.
00:36:49.760 | I mean, there's people in technology who believe,
00:36:51.840 | and I agree with them, that we can go to Mars,
00:36:53.680 | that we might be able to end death,
00:36:55.360 | but we can't innovate on something that was 230 years old.
00:36:58.720 | You know?
00:36:59.880 | - So there is a balance, certainly, to strike.
00:37:03.120 | The American experiment is fascinating, nevertheless.
00:37:07.160 | So one argument you can make is actually
00:37:10.120 | that we're in the very early days of this V2.
00:37:12.800 | If this is what you describe as V2,
00:37:15.520 | you could make the case that we're not ready for V3,
00:37:18.400 | that we're just actually trying to figure out the V2 thing.
00:37:22.600 | You're trying to like skip-
00:37:24.480 | - When are we ever ready?
00:37:25.720 | - Now again, we'll go back to marriage, I think,
00:37:28.960 | and having kids kind of thing.
00:37:30.780 | I think everyone who has kids
00:37:31.920 | is never really ready to have kids,
00:37:33.440 | that's the whole point, you dive in.
00:37:35.080 | Okay, but the...
00:37:37.240 | I mean, you mentioned the U.K. and Washington.
00:37:40.040 | Is there other criticisms of government
00:37:41.880 | that you can provide as we know it today?
00:37:43.680 | Before we kind of outline the ideas of V3,
00:37:47.520 | let's stick to V2.
00:37:48.760 | - I'll give a few, right?
00:37:49.600 | And so a lot of this stuff will go into the version.
00:37:52.360 | So I've got this book, "The Network State,"
00:37:54.680 | which covers some of these topics.
00:37:56.440 | - Does "Network State" have a subtitle?
00:37:58.920 | - It is "The Network State, How to Start a New Country."
00:38:02.280 | - How to start a new country.
00:38:03.520 | - But I just have it at thenetworkstate.com.
00:38:05.560 | - I should say, it's an excellent book that you should get.
00:38:09.080 | I read it on Kindle, but there's also a website.
00:38:12.360 | And Balaji said that he's constantly working
00:38:15.880 | on improving it, changing it.
00:38:18.660 | But by the time the whole project is over,
00:38:21.840 | it'll be a different book than it was in the beginning.
00:38:25.800 | - I think so.
00:38:26.640 | - It's shedding its old skin.
00:38:29.200 | - Well, I wanted to get something out there
00:38:31.040 | and get feedback and whatnot, just like an app, right?
00:38:34.120 | Again, you have these two poles of an app is highly dynamic
00:38:37.200 | and you're accustomed to having updates all the time
00:38:39.480 | and a book is supposed to be static.
00:38:40.680 | And there's a value in something static,
00:38:42.240 | something unchanging and so on.
00:38:44.200 | But in this case, I'm glad I kind of shipped a version 1.0
00:38:48.720 | and the next version, I'm gonna split it
00:38:53.320 | into like tentatively motivation, theory and practice.
00:38:57.360 | Like motivation, like what is the sort of political
00:38:59.920 | philosophy and so on that motivates me at least to do this,
00:39:02.900 | which you can take or leave, right?
00:39:04.720 | And then theory as to why network state is now possible
00:39:08.600 | and I can define it in a second.
00:39:10.120 | And then the practice is zillions of practical details
00:39:13.080 | and everything from roads to diplomatic recognition
00:39:15.160 | and so on, funding, founding, all that stuff.
00:39:18.760 | A lot of the stuff actually I left out of V1
00:39:20.360 | simply because I wanted to kind of get the desirability
00:39:23.240 | of it on the table and then talk about the feasibility.
00:39:26.200 | - I should actually linger on that briefly
00:39:28.640 | in terms of things we can revolutionize.
00:39:30.920 | Like one of the biggest innovations I think that Tesla does
00:39:35.320 | with the way they think about the car,
00:39:37.200 | with the way they deploy the car, is not the automation
00:39:39.880 | or the electric to me, it's the over-the-air updates.
00:39:44.880 | Being able to instantaneously send updates to the software
00:39:50.800 | that completely changes the behavior, the UX,
00:39:54.080 | everything about the car.
00:39:55.320 | And so I do think it would be interesting
00:39:58.000 | 'cause books are a representation of human knowledge,
00:40:01.820 | a snapshot of human knowledge.
00:40:06.400 | And it would be interesting if we can somehow figure
00:40:10.080 | out a system that allows you to do sort of like a GitHub
00:40:13.840 | for books, like if I buy a book on Amazon
00:40:16.640 | without having to pay again, can I get updates
00:40:20.000 | like V1.1, V1.2 and there's like release notes.
00:40:25.000 | That would be incredible.
00:40:27.560 | It's not enough to do like a second edition
00:40:30.040 | or a third edition, but like minor updates
00:40:33.400 | that's not just on your website,
00:40:34.840 | but actually go into the model that we use to buy books.
00:40:39.840 | So I spend my money, maybe I'll do a subscription service
00:40:44.120 | for five bucks a month where I get regular updates
00:40:46.960 | to the books.
00:40:47.960 | And then there's an incentive for authors
00:40:50.200 | to actually update their books such that it makes sense
00:40:53.280 | for the subscription.
00:40:54.440 | And that means your book isn't just a snapshot,
00:40:57.760 | but it's a lifelong project.
00:40:59.640 | - Right.
00:41:00.480 | - If you care enough about the book.
00:41:01.400 | - So I think there's a lot that can be done there
00:41:03.720 | because actually in going through this process,
00:41:05.360 | in many ways, the most traditional thing I did
00:41:07.960 | was to self-publish an ebook on Kindle, right?
00:41:12.540 | Because basically like, if you actually ink a deal
00:41:14.680 | with a book publisher, first they'll give you some advance.
00:41:17.360 | I didn't need the advance or anything.
00:41:19.120 | But second is all these constraints.
00:41:21.560 | Oh, you wanna translate into this,
00:41:23.440 | or you wanna do this other format, or you wanna update it,
00:41:25.480 | you have to go and now talk to this other party, right?
00:41:28.240 | And also the narrowing window
00:41:32.560 | of what they'll actually publish, it gets narrower and narrower
00:41:34.760 | you see all these meltdowns over young adult novels
00:41:38.320 | and stuff on Twitter, but it's more than that.
00:41:40.880 | So actually having an Amazon page,
00:41:44.080 | it's just like a marker that a book exists.
00:41:46.680 | - Okay, and now I've got an entry point
00:41:50.440 | where if someone says, okay, I like this tweet,
00:41:53.480 | but how do I kind of get the,
00:41:55.400 | that might be a concept
00:41:57.000 | from like the middle of chapter three, right?
00:41:58.800 | How do I get the thing from front to back?
00:42:00.320 | I can just point them at thenetworkstate.com
00:42:01.920 | that is import this, right?
00:42:03.520 | This one entry point, okay?
00:42:05.400 | And you mentioned like subscription and money
00:42:10.000 | and so on and so forth.
00:42:10.840 | And I think people are paying for content online now
00:42:12.960 | with newsletters and so on, but I've chosen to,
00:42:15.520 | and I will always have the thing free.
00:42:17.600 | And I want it on, you can get the Kindle version on Amazon
00:42:20.880 | simply because you have to kind of set a price for that.
00:42:23.480 | But then networkstate.com, what I wanna do
00:42:25.840 | is have that optimized for every Android phone.
00:42:28.760 | So people in India or Latin America or Nigeria
00:42:32.280 | can just tap and open it.
00:42:33.920 | Gonna do translations and stuff like that.
00:42:36.240 | Greg Fodor, founder of AltspaceVR,
00:42:39.560 | he sold that and he coded the website
00:42:42.280 | and I worked with him on it.
00:42:44.360 | And there's another designer who, Elijah,
00:42:47.800 | and it was basically just a three-person group.
00:42:49.880 | And we thought we had something pretty nice,
00:42:51.800 | but one thing I was really pleasantly surprised by
00:42:55.240 | is how many people got in touch with us afterwards
00:42:57.120 | and asked us if we could open source the software
00:42:59.840 | to create this website, right?
00:43:02.240 | Because it's actually, you can try it on mobile.
00:43:03.840 | I think it's actually in some ways
00:43:05.760 | a better experience than Kindle.
00:43:07.480 | And so that was interesting because I do think
00:43:11.360 | of the website as like a V1 version
00:43:13.160 | of this concept of a book app, right?
00:43:15.440 | For example, imagine if you have the Bible
00:43:18.360 | and the 10 commandments aren't just text,
00:43:20.600 | but there's like a checklist and there's a gateway
00:43:23.320 | to a Christian community there.
00:43:25.000 | And the practice is embedded into the thing.
00:43:29.020 | Like, did you know brilliant.org?
00:43:31.760 | Amazing site, I love this site.
00:43:33.240 | Brilliant is basically mobile-friendly tutorials
00:43:36.600 | and you can kind of just swipe through,
00:43:38.280 | you're in line at Starbucks or getting on a plane
00:43:41.160 | or something, you just swipe through
00:43:42.400 | and you just get really nice micro lessons on things.
00:43:46.760 | And it's just interactive enough that your brain is working
00:43:49.520 | and you're problem solving.
00:43:50.920 | And sometimes you'll need a little pen and paper,
00:43:53.400 | but that format of sort of very mobile-friendly,
00:43:57.920 | just continuous learning,
00:43:59.280 | I'd like to do a lot more with that.
00:44:02.520 | And so that's kind of where we're gonna go with the book app.
00:44:05.120 | - So there's a lot of fun stuff
00:44:07.640 | about the way you did at least V1 of the book,
00:44:10.240 | which is you have like a one sentence summary,
00:44:13.760 | one paragraph summary, TLDR, and like one image summary,
00:44:18.760 | which is, I think honestly,
00:44:23.880 | it's not even about a short attention span,
00:44:25.640 | it's a really good exercise
00:44:27.400 | about summarization, condensation,
00:44:29.600 | and like helping you think through what is the key insight.
00:44:34.360 | Like we mentioned the prime number maze
00:44:36.920 | that reveals something central to the human condition,
00:44:41.280 | which is struggling against the limitation of our minds.
00:44:46.040 | And in that same way,
00:44:46.880 | you've summarized the network state in the book.
00:44:49.720 | So let's actually jump right there.
00:44:51.840 | And let me ask you, what is the network state?
00:44:55.160 | - What is the network state?
00:44:56.080 | So I'll give it a sentence
00:44:57.120 | and also give it an image, right?
00:44:58.760 | So the informal sentence,
00:45:00.600 | a network state is a highly aligned online community
00:45:03.400 | with a capacity for collective action
00:45:05.400 | that crowdfunds territory around the world
00:45:07.560 | and eventually gains diplomatic recognition
00:45:09.440 | from preexisting states, okay?
00:45:11.620 | So just taking those pieces,
00:45:13.560 | highly aligned online community,
00:45:15.160 | that is not Facebook, that is not Twitter.
00:45:17.400 | People don't think of themselves
00:45:18.360 | as Facebookers or Twitterians, right?
00:45:21.240 | That's just a collection of hundreds of millions of people
00:45:23.280 | who just fight each other all day, right?
00:45:25.160 | It's a fight club.
00:45:26.280 | A company is highly aligned where, you know,
00:45:28.600 | you'll put a task into the company Slack
00:45:30.520 | and if you do it in all hands,
00:45:33.000 | about 100% of the people in a company Slack will do it.
00:45:35.560 | So they're highly aligned in that way.
00:45:37.320 | But online communities don't tend to be highly aligned.
00:45:39.520 | Online communities tend to be like a Game of Thrones fan
00:45:41.600 | club or something like that.
00:45:42.920 | Or, you know, on a Twitter account,
00:45:44.320 | you might get 0.1% of people engaging with something.
00:45:46.240 | It's not the 100%.
00:45:47.520 | If you combine the degree of alignment of a company
00:45:50.960 | with the scale of a community,
00:45:52.680 | that's like what a highly aligned, you know,
00:45:55.080 | online community is, right?
00:45:56.480 | Start to get a thousand or 10,000 people
00:45:57.960 | who can collectively do something as simple
00:45:59.640 | as just all liking something on Twitter.
00:46:01.700 | For example, why would they do that?
00:46:02.880 | They're a guild of electrical engineers.
00:46:05.000 | They're a guild of graphic designers.
00:46:06.800 | And you've got a thousand people in this guild
00:46:09.320 | and every day somebody is asking a favor from the guild
00:46:12.520 | and the other 999 people are helping them out.
00:46:15.320 | For example, I've just launched a new project
00:46:17.720 | or I'd like to get a new job.
00:46:19.520 | Can somebody help me? And so on.
00:46:20.680 | And so you kind of give to get.
00:46:22.040 | You're, you know, you're helping other people
00:46:24.120 | in the community and you're kind of building up karma
00:46:25.760 | this way and then sometimes you spend it down.
00:46:27.620 | Like Stack Overflow has this karma economy.
00:46:29.720 | It's not meant to be an internal economy
00:46:31.620 | that is like making tons and tons of money off of,
00:46:35.000 | it's sort of to keep score, right?
00:46:36.600 | That's a highly aligned online community part.
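The give-to-get karma economy described above can be sketched as a toy ledger. This is purely illustrative: the `Guild` class, point values, and member names are invented here, not how Stack Overflow or any real community actually implements reputation.

```python
# Toy give-to-get karma ledger: members earn karma by helping
# others, and spend it down when they ask favors themselves.
class Guild:
    def __init__(self, members):
        # Everyone starts at zero; karma keeps score, it isn't money.
        self.karma = {m: 0 for m in members}

    def help(self, helper, points=1):
        """Helper earns karma for assisting another member."""
        self.karma[helper] += points

    def ask_favor(self, asker, cost=3):
        """Asking spends karma; you have to give to get."""
        if self.karma[asker] < cost:
            return False  # hasn't contributed enough yet
        self.karma[asker] -= cost
        return True

guild = Guild(["alice", "bob", "carol"])
for _ in range(3):
    guild.help("alice")              # alice helps three times
assert guild.ask_favor("alice")      # so she can spend karma down
assert not guild.ask_favor("bob")    # bob hasn't given, can't get
```

The alignment point is the same one made in the conversation: the ledger only works if nearly everyone in the group actually participates.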
00:46:38.440 | Then capacity for collective action.
00:46:39.760 | I just kind of described that,
00:46:41.000 | which is at a minimum,
00:46:43.600 | you don't have a highly aligned online community
00:46:46.440 | unless you have a thousand people
00:46:48.200 | and you paste in a tweet and a thousand of them RT it
00:46:50.920 | or like it, okay?
00:46:52.520 | If you can't even get that, you don't have something.
00:46:54.960 | If you do have that, you have the basis
00:46:56.820 | for at least collective digital action on something, okay?
00:47:00.520 | And you can think of this as a group of activists.
00:47:02.620 | You can think of it as, for example, let's say,
00:47:04.860 | I mentioned a guild, but let's say they're a group
00:47:06.600 | that wants to raise awareness of the fact
00:47:09.280 | that life extension is possible, right?
00:47:11.440 | Every day there's a new tweet on, I don't know,
00:45:15.200 | whether it's metformin research
00:45:18.760 | or David Sinclair's work, right?
00:47:20.160 | Andrew Huberman has good stuff here, you know,
00:47:22.080 | or there's a Longevity VC.
00:47:24.440 | There's a bunch of folks working in this area.
00:47:26.080 | Every day there's something there.
00:47:27.120 | And literally the purpose of this online community
00:47:29.160 | is raise awareness of longevity.
00:47:31.600 | And of the thousand people, 970 go and like that.
00:47:34.880 | That's pretty good, right?
00:47:36.140 | That's solid.
00:47:37.160 | You've got something there.
00:47:38.000 | You've got a laser, right?
00:47:39.720 | You've got something which you can focus on something
00:47:41.580 | because most of the Web2 internet is entropic.
00:47:44.280 | You go to Hacker News, you go to Reddit, you go to Twitter,
00:47:47.100 | and you're immediately struck by the fact
00:47:48.480 | that it's like 30 random things, random.
00:47:51.080 | It's just a box of chocolates.
00:47:53.160 | It's meant to be, you know, we're-
00:47:55.920 | - Some of them look delicious.
00:47:57.120 | - Some of them look delicious.
00:47:58.240 | Novelty, we can over consume novelty, right?
00:48:01.160 | So, you know, what we were talking about earlier,
00:48:02.640 | the balance between tradition and innovation, right?
00:48:04.880 | Here is a different version of that,
00:48:06.360 | which is entropy going in a ton of different directions
00:48:10.760 | due to novelty versus like focus, you know?
00:48:14.520 | It's like heat versus work, you know?
00:48:16.760 | Heat is entropic and work is force along a distance.
00:48:19.560 | You're going in a direction, right?
00:48:21.640 | And so if those 30 links on, you know,
00:48:25.160 | the next version of Hacker News or Reddit
00:48:26.800 | or something were, like, so brilliant,
00:48:28.200 | that's leveling you up.
00:48:30.200 | The 30 things you click,
00:48:31.520 | you've just gained a skill as a function of that, right?
00:48:33.860 | So these kinds of online communities,
00:48:35.260 | I don't know what they look like.
00:48:36.100 | They probably don't look like the current social media.
00:48:38.840 | They, just like, for example, I know this is a meta analogy,
00:48:41.520 | but in the 2000s, people thought Facebook for Work
00:48:44.080 | would look like Facebook.
00:48:45.360 | And, you know, David Sacks, you know,
00:48:47.160 | founded and sold a company, Yammer,
00:48:49.600 | that was partially on that basis.
00:48:49.600 | It was fine, it was a billion dollar company.
00:48:51.520 | But Facebook for Work was actually Slack, right?
00:48:55.600 | It looked different.
00:48:56.440 | It was more chat focused,
00:48:57.560 | it was less image focused and whatnot.
00:48:59.800 | What does the platform
00:49:00.800 | for a highly aligned online community look like?
00:49:02.920 | I think Discord is the transitional state,
00:49:04.960 | but it's not the end state.
00:49:06.400 | Discord is sort of chatty.
00:49:08.200 | The work isn't done in Discord itself, right?
00:49:11.720 | The cryptocurrency for tracking or the crypto karma
00:49:14.760 | for sort of tracking people's contributions
00:49:16.760 | is not really done in Discord itself.
00:49:18.000 | Discord was not built for that.
00:49:19.540 | And I don't know what that UX looks like.
00:49:20.880 | Maybe it looks like tasks, you know,
00:49:23.240 | like maybe it looks something different.
00:49:25.520 | Okay.
00:49:26.360 | - Wait, wait, wait, let me linger on this.
00:49:28.480 | So you were actually,
00:49:29.480 | there's some people might not be even familiar
00:49:32.360 | with Discord or Slack or so on.
00:49:34.760 | Even these platforms have like communities
00:49:38.480 | associated with them.
00:49:39.680 | - Yes.
00:49:40.920 | - Meaning the big, like the meta community
00:49:43.560 | of people who are aware of the feature set
00:49:46.400 | and that you can do a thing,
00:49:47.600 | that this is a thing and then you could do a thing with it.
00:49:50.400 | Discord, like when I first realized it,
00:49:52.520 | I think it was born out of the gaming world.
00:49:54.880 | - Yes.
00:49:55.840 | - Is like, holy shit, this is like a thing.
00:49:59.640 | There's a lot of people that use this.
00:50:01.520 | - Right.
00:50:02.360 | - There's also a culture that's very difficult to escape
00:50:04.840 | that's associated with Discord
00:50:06.440 | that spans all the different communities within Discord.
00:50:09.500 | Reddit is the same,
00:50:11.120 | even though there's different subreddits,
00:50:14.000 | there's still, because of the migration phenomenon maybe,
00:50:17.520 | there's still a culture to Reddit and so on.
00:50:19.560 | - Yes.
00:50:20.400 | - So I'd like to sort of try to dig in and understand
00:50:23.720 | what's the difference between the online communities
00:50:27.720 | that are formed and the platforms
00:50:30.080 | on which those communities are formed?
00:50:32.200 | - Sure. - Very important.
00:50:33.160 | - Yes, it is, it is.
00:50:34.120 | So for example, an office,
00:50:36.040 | a good design for an office is frequently you have,
00:50:38.600 | the commons, which is like the lunchroom
00:50:42.880 | or the gathering area,
00:50:44.080 | then everybody else has a cave on the border
00:50:46.240 | that they can kind of retreat to.
00:50:47.320 | - Caves and commons, I love it.
00:50:48.640 | By the way, I was laughing internally
00:50:50.280 | about the heat versus work.
00:50:51.800 | I think that's gonna stick with me.
00:50:53.800 | That's such an interesting way to see Twitter.
00:50:55.920 | - Yeah. - Like is this heat
00:50:57.600 | or is this thread,
00:51:00.120 | 'cause there's a lot of stuff going on.
00:51:02.920 | - Right.
00:51:04.000 | - Is it just heat or are we doing some,
00:51:06.040 | is there a directed thing
00:51:07.580 | that's gonna be productive at the end of the day?
00:51:09.320 | - That's right. - I love this.
00:51:10.160 | I've never seen, I mean, anyway,
00:51:12.000 | the caves and commons idea is really nice.
00:51:15.440 | So that has to do with the layout of an office
00:51:18.200 | that's effective. - That's right.
00:51:19.280 | And so you can think of many kinds of social networks
00:51:23.960 | as being on the caves-and-commons continuum.
00:51:27.440 | For example, Twitter is just all commons.
00:51:30.360 | The caves are just like individual DMs
00:51:32.480 | or DM threads or whatever,
00:51:33.560 | but it's really basically just one gigantic,
00:51:35.360 | global, public fight club for the most part, right?
00:51:38.640 | Then you have-- - Or love club.
00:51:40.480 | - Well, some love, but mostly fight.
00:51:42.640 | Or actually it's-- - I love aggressively,
00:51:44.360 | that's all. - Yeah, I mean,
00:51:45.400 | the way I think, I mean, Twitter is like a cross
00:51:47.680 | between a library and a civil war.
00:51:52.680 | It's something where you can learn,
00:51:55.560 | but you can also fight if you choose to fight, right?
00:51:59.160 | - Yeah, well, I mean, it's because of the commons structure
00:52:04.160 | of it, it's a mechanism for virality of anything.
00:52:09.200 | - Yeah, so-- - You just describe
00:52:11.320 | the kind of things that become viral.
00:52:13.440 | - Yeah, meaning no offense to Liberians.
00:52:15.360 | It's like a library in Liberia.
00:52:17.080 | Liberia was wracked by civil war for many years, right?
00:52:20.960 | - Libraries are one of my favorite sets for porn.
00:52:25.360 | Just kidding, jokes.
00:52:26.680 | I'm learning as that's probably crossing the line
00:52:29.440 | for the engineers working on this humor module.
00:52:31.640 | Maybe take that down a notch. - Yeah, gosh.
00:52:34.640 | We're just talking about--
00:52:35.480 | Oh yeah, so continue, go ahead, continue.
00:52:36.320 | - Twitter is the commons. - Yeah, so Twitter
00:52:37.960 | is the commons, then Facebook is like,
00:52:39.760 | it's got all these warrens and stuff.
00:52:41.880 | Facebook, it's very difficult to reason about
00:52:45.040 | like privacy on that.
00:52:46.920 | And the reason is I think it's easy to understand
00:52:48.760 | when something is completely public like Twitter
00:52:51.280 | or completely private like Signal.
00:52:53.600 | And those are the only two modes I think
00:52:54.960 | in which one can really operate.
00:52:56.720 | When something is quasi private like Facebook,
00:53:00.040 | you have to just kind of assume it's public
00:53:02.000 | because if it's interesting enough,
00:53:03.880 | it'll go outside your friend network
00:53:06.120 | and it'll get screenshotted or whatever and posted.
00:53:08.400 | And so, Facebook is sort of forced into default public
00:53:13.520 | despite its privacy settings.
00:53:15.320 | For anybody who says something interesting,
00:53:17.400 | if it's like, you can figure out all their dials
00:53:20.440 | and stuff like that, but just hard to understand
00:53:22.360 | unless it's totally private or totally public.
00:53:24.560 | You have to basically treat it if it's totally public,
00:53:26.440 | if it's not totally private.
00:53:27.440 | Okay, at least under a real name.
00:53:29.040 | I'll come back to pseudonyms.
00:53:30.080 | So you've got Twitter, that's total commons.
00:53:32.320 | Facebook, which is like a warrens,
00:53:34.560 | it's like rabbit warrens or like a ant colony
00:53:36.600 | where you don't know where information is traveling.
00:53:39.240 | Then you've got Reddit, which has sort of your global Reddit
00:53:42.360 | and then all the subreddits.
00:53:43.280 | That's a different model of caves and commons.
00:53:45.880 | I think one of the reasons it works
00:53:46.960 | is that you have individual moderators
00:53:48.400 | where something is totally off topic and unacceptable
00:53:52.720 | in this subreddit and totally on topic and acceptable
00:53:55.320 | in another, that's like kind of a precursor
00:53:59.720 | of the digital societies I think that we're gonna see
00:54:01.960 | that actually have become physical societies,
00:54:03.520 | like lots and lots of subreddit like things
00:54:06.160 | have become physical societies.
00:54:07.880 | Then you start going further into like Discord
00:54:12.100 | where it's more full featured than,
00:54:15.720 | as you go Twitter, Facebook, Reddit,
00:54:19.840 | now you jump into Discord and Discord
00:54:21.680 | is a bunch of individual communities that are connected
00:54:25.240 | and you can easily sort of jump between them.
00:54:27.640 | And then you have Slack and yes, you can use Slack
00:54:32.400 | to go between different company Slacks,
00:54:33.960 | but Slack historically at least,
00:54:35.520 | I'm not sure what their current policy is,
00:54:36.640 | historically they discourage public Slacks.
00:54:38.720 | So it's mostly like you have your main Slack
00:54:41.640 | for your company and then you sometimes may jump into like,
00:54:44.400 | let's say you've got a design consultant
00:54:45.740 | or somebody like that, you'll jump into their Slack.
00:54:47.480 | But Discord is, you've got way more Discords usually
00:54:49.880 | that you jump into than Slacks, right?
00:54:51.800 | - Well, and let me ask you then on that point
00:54:53.920 | because there is a culture, one of the things I discovered
00:54:57.040 | on Reddit and Discord of anonymity or pseudonyms
00:55:01.520 | or usernames that don't represent the actual name.
00:55:03.800 | Now Slack is an example of one.
00:55:05.720 | So I think I did a, I used to have a Slack
00:55:08.480 | for like deep learning course that I was teaching
00:55:11.080 | and that was like very large, like 20,000 people, whatever.
00:55:14.960 | But so you could grow quite large,
00:55:17.040 | but there was a culture of like,
00:55:18.320 | I'm going to represent my actual identity, my actual name.
00:55:21.400 | And then the same stuff in Discord,
00:55:23.640 | I think I was the only asshole
00:55:24.960 | using my actual name on there.
00:55:26.640 | It's like everybody was using pseudonyms.
00:55:29.400 | So what's the role of that in the online community?
00:55:33.120 | - Well, so I actually gave a talk on this a few years ago
00:55:35.460 | called "The Pseudonymous Economy," okay?
00:55:38.400 | And it's come about faster than I expected,
00:55:41.680 | but I did think it was going to come about fairly fast.
00:55:44.000 | And essentially the concept is obviously we've had,
00:55:45.920 | so first, anonym, pseudonym, real name, right?
00:55:49.440 | - Can you describe the difference in that?
00:55:51.240 | - Anonymous is like 4chan where there's no tracking of a name.
00:55:56.000 | You know, there's zero reputation
00:55:58.520 | associated with an identity, right?
00:56:00.280 | Pseudonymous is like much of Reddit
00:56:02.180 | where there's a persistent username
00:56:04.240 | and it has karma over time,
00:56:06.960 | but it's not linked to the global identifier
00:56:10.000 | that is your state name, all right?
00:56:11.760 | So your quote real name,
00:56:13.020 | even the term real name, by the way, is a misnomer
00:56:14.960 | because it's like your social security name,
00:56:17.400 | like social security number.
00:56:18.660 | It's your official government name.
00:56:20.240 | It's your state name.
00:56:22.100 | It is the tracking device.
00:56:23.820 | It's the air tag that's put on you, right?
00:56:26.040 | Why do I say that, right?
00:56:27.160 | Another word for a name is a handle.
00:56:29.200 | And so just visualize like a giant file cabinet.
00:56:31.600 | There's a handle with Lex Fridman on it
00:56:33.480 | that anybody, the billions of people around the world
00:56:35.660 | can go up to and they can pull this file on you out.
00:56:40.120 | Images of you, things you said,
00:56:42.320 | like billions of people can stalk
00:56:44.360 | billions of other people now.
00:56:45.400 | That's a very new thing.
00:56:46.480 | And I actually think this will be a transitional era
00:56:49.200 | in like human history.
00:56:51.120 | We're actually gonna go back
00:56:52.120 | into a much more encrypted world.
00:56:53.760 | - Okay, okay, okay, let me linger on that
00:56:56.160 | because another way to see real names
00:57:00.720 | is the label on a thing that can be canceled.
00:57:04.480 | - Yes, that's right.
00:57:05.320 | In fact, there's a book called "Seeing Like a State"
00:57:07.720 | which actually talks about the origins
00:57:09.800 | of surnames and whatnot.
00:57:11.800 | Like if you have a guy who is,
00:57:15.120 | that guy with brown hair,
00:57:16.240 | that's like an analog identifier.
00:57:17.560 | It could mean 10 different people in a village.
00:57:19.880 | But if you have a first name, last name,
00:57:21.880 | okay, that guy can now be conscripted.
00:57:23.640 | You can go down with a list,
00:57:24.960 | a list of digital identifiers,
00:57:26.680 | pull that guy out, pull him into the military
00:57:28.800 | for conscription, right?
00:57:30.080 | So that was like one of the purposes of names
00:57:32.560 | was to make masses of humans legible to a state, right?
00:57:37.440 | Hence seeing like a state, you can see them now, right?
00:57:39.920 | See, digital identifiers,
00:57:41.080 | one thing that people don't usually think about
00:57:43.080 | is pseudonymity is itself a form of decentralization.
00:57:47.020 | So, you know, people know Satoshi Nakamoto was pseudonymous.
00:57:50.200 | They also know he's into decentralization.
00:57:52.120 | But one way of thinking about it is,
00:57:54.920 | let's say his real name, okay,
00:57:56.880 | or his state name is a node, okay?
00:58:00.080 | Attached to that is every database,
00:58:03.840 | you know, his Gmail, his, you know, Facebook,
00:58:07.800 | if he had one, every government record on him, right?
00:58:11.440 | All of these databases have that state name
00:58:16.360 | as the foreign key, right?
00:58:19.900 | And so it can go and look things up
00:58:21.960 | in all of those databases, right?
00:58:24.800 | And so it's like, think of it as being the center
00:58:27.200 | of a giant network of all of these things.
00:58:29.360 | When you go and create a pseudonym,
00:58:31.000 | you're budding off a totally new node
00:58:32.620 | that's far away from all the rest.
00:58:34.600 | And now he's choosing to attach BitcoinTalk and Bitcoin.org
00:58:38.720 | and the PGP signatures of the code,
00:58:41.280 | if he should choose to do that.
00:58:42.800 | All those things, the digital signatures
00:58:45.120 | are all attached to this new decentralized name
00:58:48.560 | because he's instantiating it, not the government, right?
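The foreign-key picture here can be made concrete with a small sketch. All the data below is hypothetical; the point is only that one shared state name joins every database, while a freshly created pseudonym starts with no links at all until its owner chooses to attach things to it.

```python
# Each database independently stores records keyed by the same
# state name, so anyone holding that key can join them all.
email_db  = {"John Doe": "john@example.com"}
social_db = {"John Doe": "facebook.com/johndoe"}
gov_db    = {"John Doe": "passport record #12345"}

def profile(name, *databases):
    """Join every database on the shared name key."""
    return [db[name] for db in databases if name in db]

# The state name acts as a foreign key into all three stores,
# so the full file on "John Doe" can be pulled with one lookup.
assert len(profile("John Doe", email_db, social_db, gov_db)) == 3

# A new pseudonym is a node far from all the rest: nothing
# joins to it until signatures or accounts are attached to it.
assert profile("satoshi", email_db, social_db, gov_db) == []
```

The decentralization point is that the pseudonym's owner, not the state, decides which records ever get attached to the new key.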
00:58:52.080 | One way of thinking about it is the root administrator
00:58:54.180 | of the quote real name system is the state
00:58:57.520 | because you cannot simply edit your name there, right?
00:59:01.000 | You can't just go, you can't log into USA.gov
00:59:03.200 | and backspace your name and change it.
00:59:05.000 | Moreover, your birth certificate,
00:59:06.920 | all this stuff that's fixed and immutable, right?
00:59:09.160 | Whereas you would take for granted
00:59:10.640 | that on every site you go to, you can backspace,
00:59:12.740 | you can be like, call me Ishmael, you know,
00:59:14.500 | walk into a site, use whatever name you want.
00:59:16.740 | You just have to use the same name across multiple sites,
00:59:18.460 | you can do that.
00:59:19.320 | And if not, you don't have to.
00:59:20.800 | One thing that we're seeing now actually
00:59:22.400 | is at the level of kids, you know, the younger generation,
00:59:26.400 | Eric Schmidt several years ago mentioned that, you know,
00:59:29.320 | people would like change their names when they became adults
00:59:31.560 | so that they could do that.
00:59:33.280 | This is kind of already happening.
00:59:34.440 | People are using, I remarked on this many years ago,
00:59:37.520 | search resistant identities, okay?
00:59:41.120 | They have their Finsta,
00:59:41.940 | which is their quote fake name Instagram
00:59:43.840 | and Rinsta, which is their real name Instagram.
00:59:46.080 | - Oh, this is cool.
00:59:46.900 | - Okay, and what's interesting is on their Rinsta,
00:59:50.000 | they're their fake self
00:59:51.120 | because they're in their Sunday best and, you know, smiling.
00:59:53.920 | And this is the one that's meant to be search indexed, right?
00:59:56.600 | On the Finsta, with their fake name,
00:59:58.920 | this is just shared with their closest friends.
01:00:01.080 | They're their real self and they're, you know,
01:00:02.720 | hanging out at parties or whatever, you know?
01:00:05.060 | And so this way they've got something
01:00:07.720 | which is the public persona and the private persona, right?
01:00:10.880 | The public persona that's search indexed
01:00:12.920 | and the private persona that is private for friends, right?
01:00:16.440 | And so organically people are, you know, like Jane Jacobs,
01:00:19.600 | she talks about like cities and how, you know,
01:00:21.640 | they're organic and what I like.
01:00:23.840 | Some of the mid 20th century guys,
01:00:25.920 | their architecture had shade removed from it, you know,
01:00:30.120 | like awnings and stuff like that got removed.
01:00:32.120 | So this is like the restoration of like awnings and shade
01:00:35.320 | and structure so that you're not always exposed
01:00:37.820 | to the all-seeing web crawler, the Eye of Sauron,
01:00:40.920 | which is like Googlebot just indexing everything.
01:00:43.280 | These are search resistant identities,
01:00:44.720 | and the eye just sort of passes over you,
01:00:47.520 | like, you know, in the Terminator,
01:00:48.480 | it just kind of passes over you, right?
01:00:51.000 | So search resistant identity is not pulled up,
01:00:52.680 | it's not indexed, right?
01:00:54.440 | And now you can be your real self.
01:00:56.160 | And so we've had this kind of thing for a while
01:00:58.800 | with communication.
01:01:00.200 | The new thing is that cryptocurrency
01:01:01.400 | has allowed us to do it for transactions,
01:01:03.220 | hence the pseudonymous economy, right?
01:01:05.680 | And so you go from anonymous, pseudonymous, real name.
01:01:09.920 | These each have their different purposes,
01:01:11.760 | but the new concept is that pseudonym,
01:01:16.080 | you can have multiple of them, by the way,
01:01:17.360 | your ENS name, you could have it under your quote,
01:01:19.400 | real name or state name, like lexfridman.eth,
01:01:22.080 | but you could also be punk6529.eth, okay?
01:01:26.320 | And now you can earn, you can sign documents,
01:01:28.920 | you can boot up stuff,
01:01:30.700 | you can have a persistent identity here, okay?
01:01:33.720 | Which has a level of indirection to your real name.
01:01:36.840 | Why is that very helpful?
01:01:38.280 | Because now it's harder to both discriminate against you
01:01:41.480 | and cancel you.
01:01:42.980 | Concerns of various factions are actually obviated
01:01:46.520 | or at least partially addressed
01:01:48.000 | by going pseudonymous as default, right?
01:01:50.240 | It is the opposite of bring your whole self to work,
01:01:51.920 | it's bring only your necessary self to work, right?
01:01:54.400 | Only show those credentials that you need, right?
01:01:56.120 | Now, of course, anybody who's in cryptocurrency
01:01:59.200 | understands Satoshi Nakamoto and so on is for this,
01:02:01.280 | but actually many progressives are for this as well.
01:02:05.860 | You know, "ban the box."
01:02:05.860 | It's like, you're not supposed to ask about
01:02:07.840 | felony convictions when somebody is being hired
01:02:10.520 | because they've served their time, right?
01:02:12.520 | Or you're not supposed to ask about immigration status
01:02:15.800 | or marital status in an interview,
01:02:18.420 | and people have this concept of blind auditions
01:02:21.420 | where if a woman is auditioning for a violin seat,
01:02:26.420 | they put it behind a curtain so they can't downgrade her
01:02:30.900 | for playing, so her performance is judged
01:02:35.060 | on the merits of its audible quality,
01:02:37.220 | not in terms of who this person is.
01:02:40.380 | So this way they don't discriminate
01:02:41.540 | versus male or female for who's getting a violin position.
01:02:46.660 | So you combine those concepts like ban the box,
01:02:48.620 | not asking these various questions, blind auditions,
01:02:51.140 | and then also the concept of implicit bias.
01:02:52.900 | Like if you believe this research,
01:02:55.540 | people are unconsciously biased towards other folks, right?
01:02:58.740 | Okay, so you take all that, you take Satoshi,
01:03:01.220 | and you put it together, and you say,
01:03:02.740 | "Okay, let's use pseudonyms."
01:03:04.420 | That actually takes unconscious bias even off the table,
01:03:07.060 | right, because now you have genuine global equality
01:03:11.260 | of opportunity.
01:03:12.140 | Moreover, you have all these people,
01:03:13.340 | billions of people around the world
01:03:14.820 | that might speak with accents,
01:03:16.440 | but they type without them.
01:03:17.940 | And now if they're pseudonymous,
01:03:19.260 | you aren't discriminating against them, right?
01:03:21.800 | Moreover, with AI, very soon, the AI version of Zoom,
01:03:25.500 | you'll be able to be whoever you wanna be
01:03:28.820 | and speak in whatever voice you wanna speak in, right?
01:03:31.780 | And you'll be, and that'll happen in real time.
01:03:35.060 | - So I mean, this is really interesting,
01:03:36.380 | but for Finsta and Rinsta,
01:03:41.380 | there's some sense in which the fake Instagram
01:03:46.740 | you're saying is where you could be your real self.
01:03:49.920 | Well, my question is under a pseudonym
01:03:54.040 | or when you're completely anonymous,
01:03:56.500 | is there some sense where you're not actually
01:03:59.080 | being your real self, that as a social entity,
01:04:04.080 | human beings are fundamentally social creatures,
01:04:10.560 | and for us to be social creatures,
01:04:12.400 | there is some sense in which we have to have
01:04:14.480 | a consistent identity that can be canceled,
01:04:18.320 | that can be criticized or applauded in society,
01:04:23.320 | and that identity persists through time.
01:04:28.080 | So is there some sense in which we would not be
01:04:31.760 | our full, beautiful human selves
01:04:34.440 | unless we have a lifelong, consistent, real name
01:04:38.280 | attached to us in a digital world?
01:04:40.080 | - So this is a complicated topic,
01:04:42.140 | but let me make a few remarks.
01:04:43.960 | First is real names, quote-unquote state names,
01:04:46.680 | were not built for the internet.
01:04:47.680 | They're actually state names, right?
01:04:49.920 | It's actually a great way of thinking about it,
01:04:51.120 | a social security name, right?
01:04:53.400 | So your state name, your official name,
01:04:55.960 | was not built for the internet.
01:04:58.400 | They give both too much information and too little, okay?
01:05:02.080 | So too much information because someone with your name
01:05:05.700 | can find out all kinds of stuff about you.
01:05:07.960 | Like for example, if someone doesn't wanna be stalked,
01:05:11.160 | right, their real name is out there,
01:05:13.320 | their stalker knows it, they can find address information,
01:05:16.000 | all this other kind of stuff, right?
01:05:18.400 | And with all these hacks that are happening,
01:05:20.120 | just every day we see another hack, massive hack, et cetera,
01:05:23.920 | that real name can be indexed into data
01:05:25.720 | that was supposed to be private, right?
01:05:26.840 | Like for example, the Office of Personnel Management,
01:05:28.920 | like the government, the US government,
01:05:31.120 | many governments actually, are like a combination
01:05:33.720 | of the surveillance state and the Keystone cops, right?
01:05:37.880 | They slurp up all the information
01:05:39.160 | and then they can't secure it.
01:05:40.680 | So it leaks out the back door, okay?
01:05:42.960 | They basically have 300 million records
01:05:47.080 | of all this very sensitive data,
01:05:48.280 | they just get owned, hacked over and over again, right?
01:05:51.440 | And so really there should be something
01:05:52.720 | which just totally inverts the entire concept of KYC
01:05:55.160 | and what have you.
01:05:56.200 | And of course, comply with the regulations
01:05:58.200 | as they are currently written.
01:05:59.880 | But also you should argue privacy over KYC,
01:06:03.360 | the government should not be able to collect
01:06:04.720 | what it can't secure.
01:06:05.880 | It's slurping up all this information,
01:06:07.800 | it's completely unable to secure it,
01:06:09.800 | it's hacked over and over again.
01:06:11.640 | Like China probably has the entire OPM file.
01:06:14.040 | And it's not just that, like Texas is hacked.
01:06:16.600 | And some of these hacks are not even detected yet, right?
01:06:19.600 | And these are just the ones that have been admitted.
01:06:21.640 | And so what happens is criminals can just run this stuff
01:06:26.320 | and find, oh, okay, so that guy
01:06:28.440 | who's got that net worth online,
01:06:30.320 | and he merges various databases,
01:06:31.520 | they've got a bunch of addresses to go and hit, okay?
01:06:33.960 | So in that sense, real names were not,
01:06:36.240 | state names were not built for the internet,
01:06:37.560 | they just give up too much information.
01:06:38.720 | In our actually existing internet environment,
01:06:41.360 | they give up too much information.
01:06:42.640 | On the other hand, they also give too little, why?
01:06:44.720 | If instead you give out lexfridman.eth, okay?
01:06:49.280 | Or a similar crypto domain name or urban name
01:06:51.640 | or something like that.
01:06:53.160 | Now that's actually more like a DNS, okay?
01:06:56.280 | First, if you've got a lexfridman.eth,
01:06:58.960 | what can you do with that?
01:07:00.680 | Some you can do today, some you'll soon be able to do.
01:07:03.920 | You can pay lexfridman.eth,
01:07:05.680 | you can message lexfridman.eth,
01:07:07.320 | you can look it up like a social profile,
01:07:10.800 | you can send files to it, you can upload and download.
01:07:14.600 | Basically, it combines aspects of an email address,
01:07:17.880 | a website, a username, et cetera, et cetera.
01:07:23.840 | Eventually, I think you'll go from email to phone number
01:07:26.160 | to ENS address or something like that
01:07:28.320 | as the primary online identifier,
01:07:30.520 | because this is actually a programmable name, right?
01:07:34.120 | Whereas a state name is not.
01:07:36.080 | Think about it, like a state name will have apostrophes
01:07:38.080 | perhaps in it, or is that your middle name
01:07:40.320 | or this and that?
01:07:41.160 | That was a format that was developed for the paper world,
01:07:43.360 | right?
01:07:44.200 | Whereas the ENS name is developed for the online world.
01:07:46.520 | Now, the reason I say ENS or something like it,
01:07:49.400 | you know, somebody in a village,
01:07:52.480 | their name might be Smith because they were a blacksmith
01:07:54.840 | or Potter because they were a Potter, right?
01:07:57.000 | And same, I think your surname,
01:07:59.040 | right now for many people, it's .eth
01:08:01.560 | and that reflects the Ethereum community.
01:08:03.320 | Your surname online will carry information about you.
01:08:06.520 | Like .sol says something different about you.
01:08:09.040 | .btc says yet something different.
01:08:11.400 | I think we're gonna have a massive fractionation
01:08:13.240 | of this over time.
01:08:14.240 | We're still in the very earliest days
01:08:15.640 | of our internet civilization, right?
01:08:18.040 | 100, 200 years from now, those surnames may be
01:08:20.040 | as informative as say Chen or Friedman or Srinivasan
01:08:23.360 | in terms of what information they carry,
01:08:25.480 | 'cause the protocol, it's the civilization fundamentally
01:08:27.560 | that you're associated with, right?
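The "programmable name" idea above — one identifier that acts at once as a payment target, a website, and a profile — can be sketched as a toy resolver. This is illustrative only: the name, the record fields, and the placeholder values are made up for this sketch, and this is not the real ENS resolver API.

```python
# A toy sketch of a "programmable name": one identifier that resolves
# to many kinds of records (payment address, website, profile image).
# Names, fields, and values are hypothetical placeholders, not real
# ENS data or the real ENS resolver interface.

REGISTRY = {
    "lexfridman.eth": {
        "address": "0x1234...abcd",        # placeholder payment address
        "url": "https://lexfridman.com",   # website record
        "avatar": "ipfs://.../avatar.png", # placeholder profile-image record
    }
}

def resolve(name, record):
    """Look up one record type for a name, ENS-text-record style."""
    return REGISTRY.get(name, {}).get(record)

# One name serves as payment target, website, and profile at once.
print(resolve("lexfridman.eth", "url"))      # https://lexfridman.com
print(resolve("lexfridman.eth", "address"))  # 0x1234...abcd
```

The point of the sketch is just the shape of the idea: unlike a "state name," every record behind the identifier is machine-readable and extensible.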
01:08:28.880 | - Right, so there's some improvements to the real name
01:08:32.800 | that you could do in the digital world.
01:08:34.360 | But do you think there's value in having a name
01:08:36.600 | that persists throughout your whole life
01:08:39.240 | that is shared between all the different digital
01:08:44.040 | and physical communities?
01:08:44.880 | - I think you should be able to opt into that, right?
01:08:47.560 | - At which stage?
01:08:48.560 | - At which level, in terms of the society
01:08:50.320 | that you're joining.
01:08:51.520 | - Wait a minute, so can I murder a bunch of people
01:08:54.560 | in society one and then go to society two
01:08:59.000 | and be like, I'm murder free.
01:09:01.560 | My name is-- - No, no, I don't mean it
01:09:02.880 | like that, no, no, yeah.
01:09:03.800 | So here's why that wouldn't work.
01:09:05.360 | - That's the application I'm interested in.
01:09:07.160 | - Okay, well, I'm not interested in the murder application,
01:09:09.760 | but what do you think?
01:09:10.600 | I don't know, I'm just kidding.
01:09:11.440 | - I would like you to prevent me,
01:09:14.600 | a person who's clearly bad for society, from doing that.
01:09:17.440 | - Sure, sure, sure.
01:09:18.280 | Murder is gonna be against the rules
01:09:20.000 | in almost every society.
01:09:21.720 | And I mean, people will argue--
01:09:23.760 | - Most likely, yeah.
01:09:24.600 | - Yeah, most likely, right?
01:09:25.680 | And the reason I'm thinking-- - Except animals.
01:09:28.200 | - Well, I'm thinking of like the Aztecs or the Mayas
01:09:30.560 | or something like that, there's various,
01:09:32.520 | you know, Soviet Union, there's weird edge case,
01:09:35.680 | unfortunately. - Broad words.
01:09:36.520 | - Yeah, there's societies, unfortunately,
01:09:37.800 | that have actually, that's why I asterisked it.
01:09:40.280 | But let's say murder is something that society one
01:09:45.280 | probably has effectively a social smart contract
01:09:48.200 | or a social contract that says, that's illegal,
01:09:51.000 | therefore, you're in jail,
01:09:52.320 | therefore, you're deprived of the right to exit.
01:09:54.120 | But upon entry into that society, in theory,
01:09:57.640 | you would have said, okay, I accept this, quote,
01:10:00.160 | social contract, right?
01:10:01.680 | Obviously, if I kill somebody, I can't leave, okay?
01:10:04.000 | So you've accepted upon crossing the border into there,
01:10:07.440 | right?
01:10:08.280 | Now, as I mentioned, you know, like, what is murder?
01:10:12.240 | Like people will, I mean, there's an obvious answer,
01:10:15.240 | but as I said, there's been human sacrifice
01:10:17.160 | in some societies, communism, they kill lots of people,
01:10:19.160 | Nazism, they kill lots of people.
01:10:20.720 | Unfortunately, there's quite a lot of societies,
01:10:23.440 | you know, I wanted to say it's an edge case,
01:10:25.600 | but maybe many of the 20th century societies
01:10:28.840 | around the world have institutionalized
01:10:29.960 | some kind of murder, whether it was the Red Terror,
01:10:31.480 | you know, in the Soviet Union,
01:10:32.760 | or obviously the Holocaust, or, you know,
01:10:34.560 | the Cultural Revolution, or Year Zero,
01:10:37.200 | and so on and so forth, right?
01:10:38.680 | So my point there is that who's committing all those murders?
01:10:43.040 | It was the state, it was the organization
01:10:45.400 | that one is implicitly trusting to track you, right?
01:10:48.760 | And how did they commit those murders?
01:10:49.920 | Well, how did Lenin, you know,
01:10:52.560 | do you know the hanging order?
01:10:53.600 | You know what I'm talking about,
01:10:54.440 | the hanging order for the Kulaks?
01:10:56.080 | - Yes.
01:10:56.920 | - Okay, the famous hanging order,
01:10:57.760 | which showed they were actually bloodthirsty,
01:10:59.680 | the key thing was he said,
01:11:00.680 | "Here's a list of all the quote rich men,
01:11:02.840 | "the Kulaks, go and kill them."
01:11:04.360 | The real names, the state names
01:11:06.240 | were what facilitated the murder.
01:11:07.960 | They didn't prevent the murderers there, right?
01:11:11.000 | So my point is, just in the ethical weighting of it,
01:11:14.600 | it's a two-sided thing, right?
01:11:16.560 | You're right that the tracking can, you know,
01:11:19.400 | prevent disorganized murders,
01:11:21.120 | but the tracking facilitates,
01:11:22.440 | unfortunately, organized murders.
01:11:24.200 | Lists of undesirables were the primary tool
01:11:27.400 | of all of these oppressive states in the 20th century.
01:11:29.800 | You see my point?
01:11:31.520 | - I see your point, and it's a very strong point.
01:11:34.520 | In part, it's a cynical point,
01:11:38.320 | which is that the rule of a centralized state
01:11:44.240 | is more negative than positive.
01:11:50.320 | - I think it is like nuclear energy, okay?
01:11:54.560 | It's like fire.
01:11:57.320 | It is something which you're gonna keep having it reform
01:12:01.640 | because there's good reasons where you have
01:12:03.560 | centralization, decentralization, re-centralization,
01:12:06.280 | but power corrupts, absolute power corrupts absolutely,
01:12:09.600 | and you just have to be very suspicious
01:12:11.800 | of this kind of centralized power.
01:12:13.760 | The more trust you give it,
01:12:14.880 | often the less trust it deserves.
01:12:17.560 | It's like a weird feedback loop, right?
01:12:18.880 | The more trust, the more it can do.
01:12:20.400 | The more it can do, the more bad things it will do.
01:12:24.000 | So, okay, there is a lot of downside to the state
01:12:29.000 | being able to track you.
01:12:31.360 | And history teaches us lessons, one at a large scale,
01:12:36.280 | especially in the 20th century, at the largest of scale,
01:12:38.960 | a state can commit a large amount of murder
01:12:41.800 | and suffering.
01:12:42.640 | - And by the way, history isn't over.
01:12:43.920 | If you think about what the Chinese are building on this,
01:12:47.520 | that surveillance state, it's not just tracking your name,
01:12:51.360 | it's tracking everything on you.
01:12:53.840 | Like WeChat is essentially like,
01:12:55.680 | it is all the convenience and none of the freedom.
01:12:59.920 | - So that's the downside, but don't you,
01:13:03.320 | the question is, I think probably fundamentally
01:13:05.920 | about the human nature of an individual,
01:13:09.680 | of how much murder there would be
01:13:12.920 | if we can just disappear every time we murder.
01:13:16.680 | - Well, I mean-- - At the individual level.
01:13:18.760 | - So the issue is basically like,
01:13:20.920 | once one realizes that the moral trade-off
01:13:23.280 | has two poles to it, right?
01:13:25.040 | And moreover that basically centralized organized murder
01:13:27.840 | has, I mean, if we add up all the disorganized murder
01:13:31.000 | of the 20th century, it's probably significantly less
01:13:33.920 | than the organized murder that these states facilitated.
01:13:38.320 | And probably by a lot. R.J. Rummel has this thing
01:13:40.400 | called democide, right?
01:13:41.760 | And the thing is, it's so grim, right?
01:13:43.360 | Because it's saying like, one death is a tragedy,
01:13:46.680 | a million is a statistic, right?
01:13:48.360 | These are just like, just incalculable tragedies
01:13:51.200 | that we can't even understand.
01:13:54.600 | But nevertheless engaging with it,
01:13:57.800 | like, I don't know, is the ratio 10X?
01:14:00.080 | Is it 100X?
01:14:00.920 | I wouldn't be surprised if it's 100X, right?
01:14:02.120 | - Yeah, but have you seen the viciousness,
01:14:04.720 | the negativity, the division within online communities
01:14:09.240 | that have anonymity?
01:14:11.120 | - So that's the thing, is basically,
01:14:13.180 | there's also a Scylla and a Charybdis.
01:14:14.960 | I'm not, when you see what centralization can do,
01:14:19.840 | and you correct in the direction of decentralization,
01:14:23.120 | you can overcorrect with decentralization
01:14:25.440 | and you get anarchy.
01:14:26.260 | And this is basically, then you want to re-centralize, right?
01:14:28.760 | And this is the, I think it's the Romance
01:14:31.560 | of the Three Kingdoms: the empire long united must divide,
01:14:34.600 | the empire long divided must unite.
01:14:35.920 | That's always the way of it, right?
01:14:37.160 | So what's gonna happen is,
01:14:38.920 | we will state certain verbal principles, right?
01:14:42.120 | And then the question is, where in state space you are?
01:14:45.000 | Are you too centralized?
01:14:46.000 | Well, then, okay, you want to decentralize.
01:14:47.840 | And are you too decentralized?
01:14:49.320 | Then we want to centralize and maybe track more, right?
01:14:51.800 | And people opt into more tracking
01:14:53.440 | because they will get something from that tracking,
01:14:55.360 | which is a greater societal stability.
01:14:57.160 | So it's kind of like saying, are we going north or south?
01:15:01.040 | And the answer is like, what's our destination?
01:15:02.800 | Where's our current position
01:15:04.220 | in the civilizational state space?
01:15:06.320 | - Well, my main question, I guess,
01:15:08.360 | is does creating a network state escape
01:15:13.360 | from some of the flaws of human nature?
01:15:15.800 | The reason you got Nazi Germany is a large scale resentment
01:15:20.760 | with different explanations for that resentment
01:15:22.920 | that's ultimately lives in the heart of each individual
01:15:26.160 | that made up the entirety of Nazi Germany
01:15:28.560 | and had a charismatic leader
01:15:30.320 | that was able to channel that resentment into action,
01:15:34.200 | into actual policies,
01:15:35.920 | into actual political and military movements.
01:15:40.000 | Can't you not have the same kind of thing
01:15:41.920 | in digital communities as well?
01:15:44.200 | Have you heard the term argumentum ad Hitlerum
01:15:46.400 | or like Godwin's law or something?
01:15:47.720 | Like, it's something where if the reference point is Hitler,
01:15:51.000 | it's this thing where a lot of things break down.
01:15:54.520 | But I do think, I mean, look, is there any,
01:15:56.940 | did Bitcoin manage to get where it was
01:15:59.540 | without a single shot being fired to my knowledge?
01:16:01.840 | Yes, right?
01:16:03.260 | Did Google manage to get to where it is
01:16:04.680 | without shots being fired?
01:16:05.920 | Absolutely.
01:16:08.760 | - While a lot of shots were being fired
01:16:11.760 | elsewhere in the world.
01:16:13.120 | - Sure.
01:16:13.960 | - And-
01:16:14.800 | - But who's firing those shots?
01:16:16.080 | - The states.
01:16:16.920 | - The states, right, yeah.
01:16:17.740 | - But that's because Bitcoin and Google
01:16:20.200 | are a tiny minority of communities.
01:16:22.360 | It's like the icing on the cake of human civilization.
01:16:25.440 | - Sure.
01:16:26.280 | Basically, any technology, I mean, like you can use a hammer
01:16:30.480 | to go and hit somebody with it, right?
01:16:31.880 | I'm not saying every technology is equally destructive
01:16:36.560 | or what have you, but you can conceive of,
01:16:38.920 | it's kind of like rule 34, but for technology, right?
01:16:43.240 | Okay, right, you can probably figure out some-
01:16:45.480 | - Your ability to reference brilliant things throughout
01:16:48.160 | is quite admirable, yes.
01:16:50.280 | But anyway, sorry, rule 34 for technology.
01:16:52.440 | - Rule 34, but for abusive technology,
01:16:54.560 | you can always come up with a black mirror version
01:16:56.720 | of something.
01:16:57.560 | And in fact, there is this kind of funny tweet,
01:16:59.240 | which is like a sci-fi author.
01:17:02.080 | My book, "Don't Invent the Torment Nexus,"
01:17:05.420 | was meant to be a cautionary tale on what would happen
01:17:09.900 | if society invented the torment nexus.
01:17:11.420 | And then it's like, "Tech guys, at long last,
01:17:15.040 | we have created the torment nexus."
01:17:16.680 | (laughing)
01:17:17.520 | Or whatever, right?
01:17:18.360 | And so the thing is that simply describing something,
01:17:22.460 | some abuse, unfortunately,
01:17:24.620 | after the initial shock wears off,
01:17:28.080 | people will unconsciously think of it
01:17:29.960 | as sort of an attractor in the space, right?
01:17:32.160 | It's like, I'll give you some examples,
01:17:33.680 | like "Minority Report" had the gesture thing, right?
01:17:37.400 | And the Kinect was based on that.
01:17:38.840 | So it's a dystopian movie, but had this cool kind of thing,
01:17:40.900 | and people kind of keyed off it, right?
01:17:43.240 | Or people have said that movies like "Full Metal Jacket,"
01:17:47.440 | that was meant to be, in my understanding,
01:17:49.200 | is meant to be like an anti-war movie.
01:17:50.960 | But lots of soldiers just love it,
01:17:53.680 | despite the fact that the drill sergeant
01:17:55.320 | is actually depicted as a bad guy, right?
01:17:57.400 | For the sort of portrayal of that kind of environment, right?
01:18:01.280 | So I'm just saying, it's like giving the vision
01:18:02.720 | of the digital Hitler or whatever.
01:18:04.640 | It's not actually a vision I want to paint.
01:18:06.760 | I do think everything is possible.
01:18:09.320 | Obviously, ISIS uses the internet, right?
01:18:11.520 | - Yeah, we're not bringing up Hitler in a shallow argument.
01:18:17.480 | We're bringing up Hitler
01:18:18.680 | in a long, empathetic, relaxed discussion,
01:18:22.200 | which is a different-- - Sure, sure, I understand.
01:18:23.640 | - Which is where Hitler can live, in a healthy way.
01:18:27.160 | There's deep lessons in Hitler and Nazi Germany,
01:18:32.560 | as there is with Stalin, yes.
01:18:33.960 | - Okay, so in many ways,
01:18:35.960 | and this is a very superficial way of talking about it,
01:18:39.080 | but this is, exit is the anti-genocide technology, right?
01:18:43.400 | Because exit is the route of the politically powerless.
01:18:46.480 | Exit is not, people always say,
01:18:48.120 | oh, exit is for the rich, or whatever.
01:18:49.760 | That's actually not true.
01:18:50.600 | Most immigrants, i.e. most emigrants, are not rich.
01:18:54.640 | They're politically powerless.
01:18:55.920 | - Can you describe exit?
01:18:57.340 | - What is exit?
01:18:58.180 | So there's this book, which I reference a lot,
01:19:00.840 | I like it, called "Exit, Voice, and Loyalty"
01:19:03.720 | by Albert Hirschman, okay?
01:19:05.600 | And he essentially says,
01:19:07.680 | and I gave this talk in 2013
01:19:10.240 | that goes through this at YC Startup School,
01:19:14.480 | but just to describe these,
01:19:16.520 | voice is reform, exit is alternatives.
01:19:18.520 | For example, in the context of an open source project,
01:19:21.080 | voice is submitting a bug, and exit is forking.
01:19:25.360 | In a company, voice is, you're saying,
01:19:30.200 | hey, here's a ticket, okay, that I'd like to get solved,
01:19:34.400 | and exit is taking your business elsewhere, okay?
01:19:37.640 | You know, at the level of corporate governance,
01:19:39.320 | voice is, you know, a board of directors vote,
01:19:41.080 | and exit is selling your shares, right?
01:19:42.920 | In a country, voice is a vote, and exit is migration, okay?
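The voice/exit pairs just listed — across open source, companies, corporate governance, and countries — can be summarized in a small lookup table. The wording below paraphrases the examples in this conversation, not Hirschman's book.

```python
# Hirschman's voice/exit framework, domain by domain, as paraphrased
# in the conversation above. Labels and wording are this summary's own.

VOICE_EXIT = {
    "open source": {"voice": "file a bug report", "exit": "fork the project"},
    "company":     {"voice": "file a ticket",     "exit": "take your business elsewhere"},
    "governance":  {"voice": "board vote",        "exit": "sell your shares"},
    "country":     {"voice": "cast a ballot",     "exit": "emigrate"},
}

# Print the framework as a two-column summary.
for domain, options in VOICE_EXIT.items():
    print(f"{domain}: voice = {options['voice']}; exit = {options['exit']}")
```

The table makes the structural point explicit: in every domain, voice tries to reform the existing system from within, while exit seeks or builds an alternative.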
01:19:47.080 | And I do think that the two forces we talk about a lot,
01:19:51.920 | democracy and capitalism, are useful forces,
01:19:54.680 | but there's a third, which is migration, right?
01:19:59.160 | So you can vote with your ballot,
01:20:00.600 | you can vote with your wallet, you can vote with your feet.
01:20:02.560 | Wallet has some aspects of exit built into it,
01:20:06.320 | but voting with your feet actually has some aspects
01:20:08.000 | of voice built into it, because when you leave,
01:20:10.300 | it's like an amplifier on your vote.
01:20:11.760 | You might say 10 things, but when you actually leave,
01:20:14.800 | then people take what you said seriously,
01:20:16.520 | you're not just like complaining or whatever,
01:20:18.000 | you actually left San Francisco
01:20:19.800 | because it was so bad on this and this issue,
01:20:22.240 | and you've actually voted with your feet.
01:20:23.680 | It is manifest preference as opposed to stated preference.
01:20:27.240 | So voice versus exit is this interesting dichotomy.
01:20:29.480 | Do you try to reform the system, or do you exit it,
01:20:31.440 | and build a new one, or seek an alternative?
01:20:33.640 | And then loyalty modulates this,
01:20:35.560 | where if you are a patriot,
01:20:38.120 | as part of the initial part of your conversation,
01:20:40.560 | like, "Are you a traitor?
01:20:44.040 | "You're giving up on our great thing," or whatever,
01:20:47.080 | and people will push those buttons to get people to stick.
01:20:49.880 | I shouldn't say the bad version, let's say a common version.
01:20:53.120 | Sometimes good, sometimes bad.
01:20:54.580 | But then there's the good version, which is,
01:20:58.600 | "Oh, maybe the price is down right now,
01:21:00.940 | "but you believe in the cause."
01:21:02.500 | So even if they're on paper, you would rationally exit,
01:21:06.360 | you believe in this thing,
01:21:08.240 | and you're gonna stick with it, okay?
01:21:10.100 | So loyalty can be, again, good and bad,
01:21:12.160 | but it kind of modulates the trade-off
01:21:13.800 | between voice and exit, okay?
01:21:15.860 | So given that framework, we can think of a lot of problems
01:21:20.200 | in terms of, "Am I gonna use voice or exit
01:21:23.120 | "or some combination thereof?"
01:21:24.200 | 'Cause they're not mutually exclusive.
01:21:25.160 | It's kind of like left and right,
01:21:26.440 | sometimes you use both together.
01:21:28.540 | I think that one of the biggest things the internet does
01:21:31.460 | is it increases microeconomic leverage
01:21:34.400 | and therefore increases exit in every respect of life.
01:21:37.200 | For example, on every phone,
01:21:40.800 | you can pick between Lyft and Uber, right?
01:21:44.280 | When you're at the store, you see a price on the shelf
01:21:47.120 | and you can comparison shop, right?
01:21:49.440 | If it's Tinder, you can swipe, right?
01:21:51.440 | If it's Twitter, you can click over to the next account.
01:21:54.020 | The back button is exit.
01:21:55.360 | The microeconomic leverage,
01:21:58.320 | leverage in the sense of alternatives, right?
01:22:01.280 | This is like one of the fundamental things
01:22:03.740 | that the internet does.
01:22:04.580 | It puts this tool on your desktop
01:22:06.020 | and now you can go and talk to an illustrator
01:22:08.740 | or you can kind of build it yourself, right?
01:22:10.580 | By typing in some characters into DALL-E.
01:22:13.460 | - And that makes the positive forces of capitalism
01:22:15.740 | more efficient, increase in microeconomic leverage.
01:22:19.060 | - And it's individual empowerment, right?
01:22:21.020 | And so our sort of industrial-age systems
01:22:24.100 | were not set up for that level of individual empowerment.
01:22:26.140 | Just to give you like one example that I think about.
01:22:28.380 | We take for granted every single website
01:22:30.240 | you go and log into.
01:22:31.260 | You can configure your Twitter profile
01:22:33.780 | and you can make it dark mode or light mode
01:22:36.100 | and your name, all this stuff is editable, right?
01:22:38.600 | How do you configure your USA experience?
01:22:40.700 | Is there a USA.gov that you edit?
01:22:44.380 | Can you even edit your name there?
01:22:45.980 | - Dark mode for USA.
01:22:48.140 | - But I mean, just your profile.
01:22:49.300 | Is there like a national profile?
01:22:51.180 | I mean, there's like driver's license.
01:22:52.620 | Point is that it's assumed
01:22:54.460 | that it's not like individually customizable
01:22:56.860 | quite in that way, right?
01:22:58.220 | Of course you can move around your house
01:23:00.060 | and stuff like that.
01:23:00.900 | But it's not like your experience of the US
01:23:04.040 | is like configurable, you know?
01:23:07.120 | - Let me think about that.
01:23:08.840 | Let me think about sort of the analogy of it.
01:23:11.580 | So the microeconomic leverage, you can switch apps.
01:23:15.880 | Can you switch your experience in small ways,
01:23:21.120 | efficiently, multiple times a day
01:23:23.420 | in inside the United States?
01:23:25.320 | - Well, the physical world--
01:23:28.120 | - You do, yeah, under the constraints of the physical world,
01:23:31.440 | you do like micro migrations.
01:23:33.560 | - So this is coming back to the hunter-gatherer,
01:23:35.360 | farmer-soldier, digital nomad kind of thing, right?
01:23:38.440 | The digital nomad combines aspects of the V1 and the V2
01:23:43.680 | for a V3, right?
01:23:44.800 | Because digital nomad has the mobility and freedom
01:23:47.520 | with the hunter-gatherer,
01:23:48.600 | but some of the consistency of the civilization
01:23:52.380 | of the farmer and soldier, right?
01:23:54.320 | But coming back to this,
01:23:55.560 | one other thing about it is in the 1950s,
01:23:58.280 | a guy on an assembly line
01:23:59.440 | might literally push the same button for 30 years, okay?
01:24:02.960 | Whereas today, you're pushing a different key every second.
01:24:05.960 | That's like one version of like microeconomic leverage.
01:24:10.480 | Another version is, in the 1980s,
01:24:13.480 | I mean, they didn't have Google Maps, right?
01:24:15.420 | So you couldn't just like discover things off the path.
01:24:17.880 | People would just essentially do home to work
01:24:20.960 | and work to home and home to work
01:24:22.520 | and a trip had to be planned, right?
01:24:25.200 | They were contained within a region of space
01:24:27.680 | or you do home to school, school to home, home to school.
01:24:30.840 | It wasn't like you went and explored the map.
01:24:33.040 | Most people didn't, right?
01:24:34.260 | They were highly canalized, okay?
01:24:36.460 | Meaning, it was just back and forth, back and forth,
01:24:39.520 | very routine, just like the push the button,
01:24:41.360 | push the button, trapped within this very small piece
01:24:46.280 | and also trapped within this large country
01:24:48.140 | 'cause it was hard to travel between countries and so on.
01:24:50.020 | Again, of course there were vacations,
01:24:52.620 | of course there were some degree of news and so on.
01:24:54.480 | Your mobility wasn't completely crushed
01:24:56.640 | but it was actually quite low, okay, relatively speaking.
01:24:59.680 | Just you were trapped in a way
01:25:02.440 | that you weren't even really thinking about it, okay?
01:25:04.640 | And now that map has opened up.
01:25:06.640 | Now you can see the whole map.
01:25:08.040 | You can go all over the place.
01:25:10.160 | You know, I don't have the data to show it
01:25:11.560 | but I'd be shocked if people,
01:25:14.480 | the average person didn't go to more places,
01:25:16.400 | wasn't, you know, going to more restaurants
01:25:19.600 | and things like that today than they were in the '80s
01:25:21.760 | simply because the map is open, okay?
01:25:23.640 | - And the map is made more open through the digital world.
01:25:27.080 | - Through the digital world, exactly.
01:25:28.200 | So we're reopening the map like the hunter-gatherer, okay?
01:25:31.020 | Because you can now, think about every site
01:25:33.360 | for very low cost that you can visit, right?
01:25:36.320 | The digital world, you can, I mean,
01:25:37.520 | how many websites have you visited?
01:25:39.480 | I don't know, hundreds of thousands probably
01:25:41.240 | at this point over your life, right?
01:25:42.800 | How many places on the surface have you visited?
01:25:44.520 | You're actually unusual.
01:25:45.360 | You might be like a world traveler or what have you, right?
01:25:47.400 | But still, even your physical mobility
01:25:50.320 | is less than your digital mobility, right?
01:25:52.560 | You can just essentially, I mean,
01:25:54.240 | the entire concept of like nations and borders and whatnot
01:25:57.640 | didn't exist in the hunter-gatherer era, right?
01:25:59.560 | Because you couldn't build permanent fortifications
01:26:03.440 | and whatnot, even nations as we currently think of them
01:26:06.640 | with like demarcated borders, you needed cartography,
01:26:10.420 | you needed maps, right?
01:26:11.880 | That stuff didn't exist for a long time.
01:26:13.400 | You just had sort of a fuzzy area
01:26:14.960 | of we kind of control this territory
01:26:16.960 | and these guys are on the other side of the river, okay?
01:26:19.160 | - I think just to--
01:26:20.600 | - I don't wanna digress too much, but yeah.
01:26:22.240 | - No, digress away.
01:26:23.840 | I think the entirety of life on Earth is a kind of a digression
01:26:28.040 | which creates beauty and complexity
01:26:30.480 | as part of the digression.
01:26:31.700 | I think your vision of the network state
01:26:35.440 | is really powerful and beautiful.
01:26:37.040 | I just wanna linger on this real name issue.
01:26:40.160 | - Yes, really.
01:26:41.120 | - Let me just give you some data.
01:26:42.480 | - Go ahead.
01:26:43.560 | - Personal, anecdotal experience data.
01:26:45.640 | There's a reason I only do this podcast in person.
01:26:49.120 | There is something lost in the digital space.
01:26:52.640 | - Oh, sure.
01:26:53.640 | - And I find, now I personally believe
01:26:57.360 | to play devil's advocate against the devil's advocate
01:27:00.360 | that I'm playing, I personally believe
01:27:02.480 | that this is a temporary thing.
01:27:03.720 | We will figure out technological solutions to this,
01:27:06.000 | but I do find that currently people are much more willing
01:27:09.760 | to be on scale cruel to each other online
01:27:13.000 | than they are in person.
01:27:14.900 | The only, the way to do that,
01:27:16.640 | I just visited Ukraine, went to the front.
01:27:18.900 | The way you can have people be cruel to each other
01:27:21.880 | in the physical space is through the machinery of propaganda
01:27:25.260 | that dehumanizes the other side, all that kind of stuff.
01:27:27.960 | That's really hard work to do.
01:27:31.480 | Online, I find just naturally at the individual scale,
01:27:34.700 | people somehow start to easily engage in the drug
01:27:39.700 | of mockery, derision, and cruelty
01:27:45.600 | when they can hide behind anonymity.
01:27:47.940 | I don't know what that says about human nature.
01:27:50.920 | I ultimately believe most of us want to be good
01:27:53.880 | and have the capacity to do a lot of good,
01:27:56.220 | but sometimes it's fun to be shitty,
01:27:58.540 | to shit on people, to be cruel.
01:28:00.440 | I don't know what that is.
01:28:01.400 | - It's weird because I think, you know,
01:28:03.780 | one of my sayings is just like the internet
01:28:06.080 | increases microeconomic leverage,
01:28:07.720 | the internet increases variance.
01:28:09.400 | For anything that exists before,
01:28:11.060 | you have, you know, the zero and 100 versions of it.
01:28:15.000 | I'll give some examples, then I'll come to this.
01:28:16.480 | For example, you go from the 30 minute sitcom
01:28:19.320 | to the 30 second clip or the 30 episode Netflix binge.
01:28:23.080 | You go from the guy working 9-to-5 to the guy who's 40 years old
01:28:28.080 | and has failed to launch, doesn't have a job or anything,
01:28:33.880 | and the 20 year old tech billionaire.
01:28:36.120 | You go from all kinds of things that were sort of Gaussian
01:28:40.000 | or kind of constrained in one location
01:28:41.360 | to kind of extreme outcomes on both sides.
01:28:44.200 | And applying that here,
01:28:47.120 | you are talking about the bad outcome,
01:28:49.000 | which I agree does happen where the internet,
01:28:51.340 | in some sense, makes people have very low empathy
01:28:54.160 | between others.
01:28:55.040 | But it also is the other extent
01:28:56.720 | where people find their mental soulmates across the world.
01:29:01.440 | Someone who's living in Thailand or in, you know,
01:29:04.440 | like Latin America who thinks all the same stuff,
01:29:07.500 | just like them.
01:29:08.420 | Wow, you'd never met this person before, right?
01:29:11.560 | You get to know them online, you meet a person
01:29:13.640 | it's like, you know, the brains have been communicating
01:29:16.040 | for two years, three years, you've been friends
01:29:18.000 | and you see them in person and it's just great, right?
01:29:20.300 | So it's actually, it's not just the total lack of empathy.
01:29:24.000 | It is frankly, far more empathy than you would be able
01:29:28.720 | to build usually with an in-person conversation
01:29:31.580 | in the 80s or the 90s with someone on their side of the world
01:29:34.360 | 'cause you might not even be able to get a visa
01:29:35.520 | to go to their country or not even know they existed.
01:29:37.720 | How would you be able to find each other
01:29:39.240 | and so on and so forth, right?
01:29:40.480 | So it is kind of both.
01:29:41.840 | It is tearing society apart
01:29:43.600 | and it's putting it back together, both at the same time.
01:29:46.040 | - My main concern is this.
01:29:47.880 | What I see is that young people are for some reason
01:29:52.800 | more willing to engage in the drug of cruelty online
01:29:56.640 | under the veil of anonymity.
01:29:59.000 | - That's what you're seeing publicly
01:30:00.160 | but you're not seeing the private chats.
01:30:02.160 | Like there's, it's kind of, it's a, you know,
01:30:05.560 | the censored distribution. - Well, I work
01:30:06.600 | for the intelligence agency so I'm seeing the private chats.
01:30:09.280 | I mean, I'm collecting all of your data.
01:30:11.920 | Yeah, yes, but you can intuit stuff
01:30:15.600 | and I don't think I'm being very selective.
01:30:17.440 | I mean, if you just look at the young folks,
01:30:21.480 | I mean, I am very concerned about
01:30:24.760 | the intellectual psychological growth
01:30:30.400 | of young men and women. - I agree.
01:30:31.880 | So I'm not disagreeing with you on this.
01:30:34.120 | I am saying, however, there is a positive there
01:30:36.800 | that once we see it, we can try to amplify that.
01:30:39.280 | - Yes, with technology. - Yes, that's right.
01:30:41.000 | And I'm just saying the very, very basic technology.
01:30:44.200 | I give stuff I caught up over the weekend kind of thing.
01:30:47.480 | I think if I throw an anonymity on top of that,
01:30:51.360 | it will lead to many bad outcomes for young people.
01:30:55.360 | - Anonymity, yes.
01:30:56.400 | Pseudonymity, maybe not.
01:30:57.560 | 'Cause Reddit is actually fairly polite, right?
01:30:59.860 | - The entirety of Reddit just chuckled as you said that.
01:31:03.720 | - Well, within a subreddit, it's actually fairly polite.
01:31:06.880 | Like let's say you're not usually seeing,
01:31:09.320 | it depends on which subreddit, of course.
01:31:10.920 | - There's a consistency.
01:31:12.120 | I think definition of politeness is interesting here
01:31:17.000 | because it's polite within the culture of that subreddit.
01:31:21.000 | - Yes, they abide by, let me put it a different way.
01:31:22.680 | They abide by the social norms of that subreddit.
01:31:24.880 | - And that's the definition of politeness.
01:31:27.000 | - Yeah, or civility, is that right?
01:31:29.120 | - So there is an interesting difference
01:31:30.760 | between pseudonymous and anonymous, you're saying.
01:31:33.680 | It's possible that pseudonymity,
01:31:38.320 | you can actually avoid some of the negative aspects.
01:31:41.640 | - Absolutely, we're re-Dunbarizing the world in some ways.
01:31:45.000 | With China being the big exception or outlier.
01:31:47.240 | You know, the Dunbar number, 150 people.
01:31:49.640 | If you know, that's like roughly
01:31:51.200 | the scale of your society, right?
01:31:52.880 | Or that's the number of people that a human
01:31:56.000 | can kind of keep in their brain.
01:31:58.600 | Whether apocryphal or not,
01:31:59.560 | I think it's probably roughly true.
01:32:01.640 | And we're re-Dunbarizing the world because,
01:32:07.460 | A, we're making small groups much more productive,
01:32:10.200 | and B, we're making large groups much more fractious.
01:32:13.480 | Right?
01:32:14.320 | So you have an individual like Notch,
01:32:16.880 | who can program Minecraft by himself.
01:32:18.600 | Or Satoshi, who could do V1 of Bitcoin by himself.
01:32:20.720 | Or Instagram, which is just like 10 people,
01:32:23.240 | or WhatsApp is just like 50 people when they sold.
01:32:25.960 | But on the other hand, you have huge, quote,
01:32:27.880 | countries of hundreds of millions of people
01:32:30.240 | that are just finding the first and second principal
01:32:32.800 | components, they're just splitting on principal components.
01:32:36.440 | Scott Alexander thinks of them as scissor statements.
01:32:38.620 | You know, statements that one group thinks is obviously true,
01:32:41.540 | one group thinks is obviously false.
01:32:43.380 | You can think of them as political polarization.
01:32:45.340 | You think of it in terms of game theory.
01:32:46.500 | There's lots of different reasons you can give
01:32:47.780 | for why this happens.
01:32:49.140 | But those large groups now are getting split.
01:32:52.520 | And so you have both the unsustainability
01:32:56.100 | of these large sort of artificial groups,
01:32:58.220 | and the productivity of these small organic ones.
01:33:01.140 | And so that is kind of, it's like sort of obvious,
01:33:03.200 | that's the direction of civilizational rebirth.
01:33:05.380 | We just need to kind of lean into that.
01:33:07.540 | - Scissor statements.
01:33:08.380 | There's so many beautiful, just like,
01:33:11.580 | you know, we mentioned chocolates,
01:33:13.700 | right, advertising themselves.
01:33:15.360 | Your entirety of speech is an intellectual
01:33:18.860 | like box of chocolates.
01:33:20.600 | But okay, so I don't think we finished
01:33:23.900 | defining the network state.
01:33:25.440 | Let's like linger on the definition.
01:33:27.780 | You gave the one sentence statement,
01:33:29.840 | which I think essentially encapsulated
01:33:32.220 | the online nature of it.
01:33:33.420 | I forget what else.
01:33:35.520 | Can we just try to bring more richness
01:33:40.520 | to this definition of how you think about the network state?
01:33:43.000 | - Absolutely.
01:33:43.840 | So that informal sentence is,
01:33:45.800 | "A network state is a highly aligned online community
01:33:48.320 | with a capacity for collective action
01:33:50.500 | that crowdfunds territory around the world
01:33:52.320 | and eventually gains diplomatic recognition
01:33:54.020 | from preexisting states."
01:33:55.440 | So what we talked about was the alignment of online communities
01:33:58.680 | and the capacity for collective action.
01:34:00.960 | Well, one collective action,
01:34:02.240 | it could be a thousand people liking a tweet, right?
01:34:04.720 | If you can get a thousand out of a thousand people doing it.
01:34:06.680 | But a much higher level, much higher bar
01:34:09.100 | is a thousand people crowdfunding territory
01:34:11.220 | and actually living together.
01:34:12.460 | Just like people currently--
01:34:13.300 | - In physical space.
01:34:14.120 | - In physical space.
01:34:15.160 | And not all in one place.
01:34:16.760 | That's critical.
01:34:17.600 | Just like Bitcoin is a decentralized currency,
01:34:20.020 | the network state is a recipe
01:34:21.040 | for a decentralized state-like entity, okay?
01:34:23.920 | Where it starts with, you know, for example,
01:34:27.320 | two people just get, you know, they become roommates.
01:34:29.640 | They meet in this community, they become roommates.
01:34:31.840 | Okay, they get a place together.
01:34:33.360 | Or 10 people get a group house.
01:34:35.000 | Or eventually a hundred people
01:34:36.480 | just buy a small apartment building together.
01:34:38.440 | And guess what?
01:34:39.280 | They start getting equity and not just paying rent, okay?
01:34:41.280 | These are all people who share their values.
01:34:43.340 | And now they can crowdfund territory together.
01:34:46.200 | Now, of course, they don't just jump straight
01:34:47.880 | from a thousand people liking something
01:34:50.080 | to a thousand people crowdfunding something.
01:34:51.920 | What I described in the middle is you do a lot of meetups.
01:34:54.600 | You get to know these other people
01:34:55.800 | before you decide to live, you know,
01:34:57.640 | collectively with them.
01:34:58.760 | But once you live with them,
01:34:59.840 | you start to get a network effect.
01:35:01.120 | For example, if those hundred people want to learn Spanish
01:35:06.120 | or Turkish or Vietnamese,
01:35:09.260 | they could all have a building
01:35:10.880 | where they're doing Vietnamese immersion, right?
01:35:13.900 | And that's something which they get a benefit
01:35:16.220 | from being physically around the other people
01:35:17.880 | that the pure digital wouldn't give them
01:35:19.260 | to quite the same extent, right?
01:35:20.860 | And so crowdfunds territory around the world,
01:35:24.000 | crucially, not just one place,
01:35:25.560 | they're all connected by the internet.
01:35:27.560 | Just like Hawaii is 2000 miles away
01:35:29.600 | from the continental US,
01:35:30.720 | but both sides think of them as American,
01:35:33.080 | both the people in Hawaii
01:35:33.920 | and the people in the continental US.
01:35:35.120 | - What's the role of having to have territory?
01:35:38.600 | - Why? - If most of the exchange,
01:35:40.600 | so presumably as technology gets better and better,
01:35:42.880 | the communication, the intimacy,
01:35:44.520 | the exchange of ideas all happens in a digital world.
01:35:48.040 | What's the importance of being able to crowdfund territory?
01:35:50.840 | - Well, because we're still physical creatures.
01:35:52.480 | You can't reproduce yet digitally, right?
01:35:55.140 | There's still lots of things.
01:35:55.980 | - So it's all about sex.
01:35:57.160 | - Well, that's gotta be part of it.
01:35:58.120 | You're gonna wanna, you know, reproduce, right?
01:35:59.440 | - Are we talking about a cult?
01:36:01.000 | - Well, it's not a cult.
01:36:01.840 | (laughs)
01:36:02.660 | It's not a cult.
01:36:03.500 | - Why can't you just like take a trip?
01:36:04.720 | Why is it not a cult?
01:36:05.600 | - It's not a cult because a cult is very internally focused
01:36:09.680 | and it tries to close its members off from the outside world.
01:36:13.080 | This is much more how America itself was populated,
01:36:15.240 | where there were lots of towns,
01:36:16.600 | like Penn is named after William Penn,
01:36:18.240 | or the founder of Texas, like Sam Houston, right?
01:36:20.640 | Lots of towns like the Oneida Commune
01:36:22.480 | in Northern New York, they recruited
01:36:25.420 | and they became a town
01:36:26.320 | and they became actually the Oneida silverware company,
01:36:29.840 | kind of, you know, makes silverware out of it.
01:36:32.000 | All of these communities that were opt-in,
01:36:33.680 | voluntary communities were not simply like cults
01:36:37.160 | that were closed off from the world.
01:36:38.000 | They were meant to set an example to the world
01:36:39.320 | of what virtuous living looked like
01:36:40.520 | and they were trying to recruit from the rest of the world
01:36:42.160 | and they were exporting goods to the rest of the world, right?
01:36:44.240 | So it is, yes, reproduction, it's, you know,
01:36:48.200 | marriage and kids and so on,
01:36:49.640 | but it's also just hanging out
01:36:51.680 | and it is just, the physical world is very high bandwidth.
01:36:54.760 | There's lots of stuff, you know,
01:36:55.720 | it's fun to just go and have a dinner in person,
01:36:57.360 | just to hang out, to build things.
01:36:58.980 | Moreover, there's also lots of innovation
01:37:00.560 | that can only take place in the physical world.
01:37:03.200 | You know, look, I'm, you know,
01:37:04.520 | one of my sayings in the book is,
01:37:05.960 | "Cloud first, land last, but not land never."
01:37:09.120 | Okay? - Right.
01:37:09.960 | - In many ways, one of the problems the book solves
01:37:13.000 | is Thiel's problem of, you know,
01:37:15.480 | we have innovation in bits, but not in atoms, right?
01:37:17.900 | We can build a billion dollar company online,
01:37:20.600 | but we need a billion permits
01:37:21.680 | to build a shed in San Francisco, right?
01:37:23.920 | How do you reconcile that?
01:37:25.040 | Well, what is stopping the innovation in atoms?
01:37:28.280 | It is a thicket of regulations.
01:37:30.480 | What are those regulations?
01:37:31.660 | Ultimately, a social construction.
01:37:33.520 | If you lean into the, you know,
01:37:35.960 | whole deconstructionist, you know, school of thinking,
01:37:39.200 | you can deconstruct and then reconstruct the state itself,
01:37:42.120 | given sufficient social consensus online, okay?
01:37:45.080 | If the population of Nevada had 100% consensus,
01:37:48.060 | you could just dissolve every law in Nevada,
01:37:50.480 | in theory, and then build new ones, okay?
01:37:53.000 | So the online consensus of getting people
01:37:55.580 | to agree on something is upstream of what happens offline.
01:37:58.680 | So once you have consensus in bits,
01:38:01.800 | the human consensus, also, you know,
01:38:03.560 | cryptographic consensus, cryptocurrency consensus,
01:38:06.320 | then you can reshape the world of atoms.
01:38:08.720 | The reason we can't reshape the world of atoms right now
01:38:10.920 | is because you don't have that consensus of minds, okay?
01:38:13.960 | For example, in SF, anything you do,
01:38:16.200 | there's gonna be 50% of people who are against you.
01:38:18.920 | Like, so that's just a recipe for gridlock.
01:38:21.580 | Whereas if you have a bare piece of land
01:38:24.120 | that everybody agrees on, you can get, you know,
01:38:26.000 | 70,000 units get set up in Burning Man in just a few days.
01:38:28.960 | Okay?
01:38:29.800 | That's the power of what,
01:38:31.920 | when you actually have human consensus.
01:38:33.360 | And one way I talk about this,
01:38:34.720 | also in the book a little bit,
01:38:35.880 | and this I'm gonna go much more into detail in the V2,
01:38:38.440 | I think of this as 100% democracy,
01:38:41.400 | as opposed to 51% democracy.
01:38:43.040 | 51% democracy, which is the current form of government,
01:38:46.200 | is 49% dictatorship.
01:38:48.800 | Because the entire premise of democracy
01:38:50.400 | is about the consent of the governed, right?
01:38:52.100 | That's the actual legitimating underpinning principle.
01:38:54.740 | And insofar as 49% did not consent
01:38:59.740 | to the current, you know,
01:39:01.740 | president or prime minister or whatever,
01:39:04.120 | let's say presidential system first past the post, okay?
01:39:08.780 | Insofar as 49% did not consent,
01:39:10.700 | or in a prime minister system,
01:39:12.660 | it could be like 60% or more didn't consent
01:39:14.740 | to the current leader,
01:39:16.200 | those folks are having something imposed on them
01:39:18.660 | that they literally did not vote for.
01:39:20.800 | Moreover, campaign promises are non-binding.
01:39:22.880 | So whatever they voted for,
01:39:24.760 | they can effectively be defrauded.
01:39:26.080 | You know, the actual voter fraud
01:39:27.600 | is when a politician promises X, but does not do it.
01:39:31.120 | It's as if you bought a can of orange juice
01:39:33.120 | and it's actually milk, or it's nothing, right?
01:39:36.200 | So all of that is routinized, all of that is accepted.
01:39:38.680 | We have this thing, which is just the minimum possible
01:39:41.760 | amount of democracy, a 51%, okay?
01:39:44.440 | And what happens is then that 51% tries to ram something
01:39:47.360 | down the 49% throat, and then the next election,
01:39:50.460 | it's now 51, 49 the other way, and then they ram it back.
01:39:53.600 | And that's how you get the seesaw
01:39:55.500 | that is just splitting countries apart, right?
01:39:58.280 | The alternative to that is you build a consensus online,
01:40:01.360 | you go and get some God-forsaken patch of territory.
01:40:03.940 | Actually, the worse the territory, the better.
01:40:06.900 | Because it's like Burning Man, nobody cares, right?
01:40:09.060 | The nicer the piece of land,
01:40:10.620 | the more the people are gonna argue about it.
01:40:12.300 | But Starlink has repriced the world.
01:40:14.860 | Basically all kinds of pieces of territory
01:40:16.860 | that were previously, they're far away from natural ports,
01:40:21.180 | they're far away from natural resources,
01:40:23.300 | all kinds of pieces of territory around the world
01:40:24.980 | now have satellite internet.
01:40:27.020 | And so what you can do is, again,
01:40:28.500 | the map has been reopened, right?
01:40:30.020 | What we were talking about earlier,
01:40:30.860 | the map has been reopened,
01:40:31.860 | you can gather your community online,
01:40:33.640 | they're now capable of collective action,
01:40:35.400 | you can point here, this place has great Starlink coverage,
01:40:38.980 | you go there like the Verizon guy,
01:40:41.180 | can you hear me now?
01:40:42.020 | Good, right?
01:40:42.980 | You see that the coverage is good there,
01:40:44.140 | you drive out there, you test it out.
01:40:45.820 | Maybe you do it with mobile homes first, right?
01:40:48.500 | This by the way is its own thing.
01:40:50.580 | There's Nimby and there's Yimby,
01:40:52.820 | but I actually also like Himby, okay?
01:40:55.060 | Do you want that?
01:40:55.900 | - Let's go, Nimby, Yimby, Himby, what are those?
01:40:58.340 | - So Nimby is not in my backyard, don't build in cities.
01:41:01.540 | Yimby is let's build high density buildings,
01:41:04.460 | really tall buildings and so on in cities.
01:41:06.300 | There's a third version, which is Himby,
01:41:09.340 | it's my little coinage, which is horizontal sprawl is good.
01:41:12.120 | Why horizontal sprawl?
01:41:13.140 | Because to build a skyscraper,
01:41:14.940 | to build a tall building in a city,
01:41:16.820 | you have this enormous permitting process,
01:41:19.260 | all of this stuff, which has to get done,
01:41:21.020 | it's expensive, it's time consuming.
01:41:22.860 | The way that cities were built,
01:41:23.980 | if you go back to the V1,
01:41:24.880 | what does the startup city look like?
01:41:26.620 | It looks like something like Burning Man,
01:41:28.260 | it looks like the cities of the Wild West.
01:41:30.340 | They were not multi-story buildings, right?
01:41:33.100 | They were basically things that were just like one story
01:41:35.880 | and someone could have it there in the dust
01:41:38.060 | and then you build roads and stuff between them
01:41:39.500 | and they can move them around.
01:41:40.700 | It was a much more dynamic geography.
01:41:42.720 | And so when you have that as a vision
01:41:44.300 | of what a startup city looks like, right?
01:41:46.980 | Now you've got something,
01:41:47.820 | there's a company I found called Kift,
01:41:49.300 | which is like van life.
01:41:50.580 | There's a lot of stuff in construction
01:41:52.900 | that makes this feasible.
01:41:54.260 | There's so-called man camps for fracking,
01:41:57.560 | where people can just do like,
01:41:59.180 | companies like Aggreko, they have private power,
01:42:01.340 | you can bring water, all this stuff on site.
01:42:03.020 | So it's easy to actually snap this stuff to grid,
01:42:05.520 | relatively speaking, if you've got horizontal space,
01:42:08.020 | you pick this space, you crowdfund the territory,
01:42:09.820 | now you've got a city.
01:42:10.660 | Okay, and the last bit is,
01:42:12.180 | eventually gains diplomatic recognition
01:42:13.900 | from pre-existing states.
01:42:15.220 | And this is the part that people,
01:42:17.420 | different people will be with me up to this point
01:42:19.940 | and then they'll say, okay,
01:42:20.780 | that's a part I disagree with
01:42:21.980 | or how are you gonna ever do that, right?
01:42:23.500 | They'll say, yeah, you can build an online community.
01:42:25.580 | I believe you can get them to do collective action.
01:42:27.340 | Of course, people have crowdfunded land
01:42:29.880 | and moving together and doing it a larger scale.
01:42:31.820 | All that I believe,
01:42:32.720 | how are you possibly ever gonna gain
01:42:34.120 | diplomatic recognition from pre-existing states?
01:42:35.820 | You dumb delusional tech bro, right?
01:42:37.920 | That's a common thing.
01:42:38.920 | Okay, that's about the tone of it as well, right?
01:42:42.160 | And so first I would say,
01:42:44.060 | sovereigns are already out for business.
01:42:46.700 | They're inking deals, okay?
01:42:48.900 | Nevada inked a deal with Tesla to build the Gigafactory.
01:42:51.720 | El Salvador has Bitcoin as its national currency.
01:42:54.320 | Wyoming has done the DAO law,
01:42:56.100 | where Ethereum DAOs are now recognized,
01:42:58.060 | where you can have on-chain incorporations
01:43:00.240 | that are recognized by Wyoming law.
01:43:01.980 | Virginia and New York negotiated with Amazon for HQ2.
01:43:06.660 | Tuvalu signed a deal with GoDaddy for the .tv domain.
01:43:09.940 | Colombia signed a deal for the .co domain
01:43:12.400 | and on and on and on.
01:43:13.240 | Sovereigns are open for business.
01:43:15.800 | Sovereigns are doing deals with companies
01:43:18.280 | and with currencies.
01:43:19.920 | Sovereigns at the level of cities like Miami or New York
01:43:23.640 | where the mayors are accepting their salary in Bitcoin.
01:43:26.400 | States like Wyoming, or Nevada,
01:43:29.960 | which has its new private cities legislation,
01:43:31.920 | or entire countries like El Salvador.
01:43:35.160 | When you say sovereigns, by the way,
01:43:36.160 | you mean the old school-
01:43:37.280 | Governments.
01:43:38.480 | Physical nation states, governments.
01:43:40.800 | Fiat states.
01:43:42.040 | Fiat states.
01:43:43.120 | Okay.
01:43:44.520 | But the fiat isn't the thing that makes a state.
01:43:47.920 | What makes a state is geographical location.
01:43:50.280 | It is something where, they're both, right?
01:43:53.760 | So basically it's a play on words.
01:43:55.120 | So just like fiat currency and cryptocurrency,
01:43:57.280 | we will have fiat country and crypto country, right?
01:44:01.640 | And in fact, you can think of the fiat
01:44:03.680 | and crypto version of almost anything.
01:44:05.920 | One thing I'll come to later is a big thing.
01:44:08.640 | The big thing I think comes
01:44:09.480 | after digital currency is digital passports.
01:44:12.120 | Mm-hmm.
01:44:12.960 | Okay.
01:44:13.780 | So, and that's a big part of this whole network thing
01:44:15.720 | which we'll come back to.
01:44:16.880 | But, so that last bit,
01:44:19.560 | the reason I just mentioned all those deals
01:44:20.960 | between sovereigns, whether at the city,
01:44:24.240 | US state or UN listed country level, okay.
01:44:28.440 | And on the other hand, so that's on one side of the market.
01:44:30.760 | On the other side are the companies and the currencies.
01:44:35.200 | Why could we not have online communities, right?
01:44:37.600 | So let me--
01:44:38.560 | Making those deals, signing those deals.
01:44:39.520 | So diplomatic recognition,
01:44:42.440 | but aren't you still attached to the responsibilities
01:44:47.440 | that come from being a member of a sovereign,
01:44:54.800 | old school nation state?
01:44:58.640 | Can you possibly escape that?
01:45:00.040 | So yes, and let me give you a concrete example.
01:45:02.700 | Israel, okay, why?
01:45:05.120 | You know, people talk about,
01:45:07.040 | a lot of people are like, "Oh, Balaji,
01:45:08.160 | he just, he took this from Snow Crash
01:45:09.960 | or some sci-fi book they'll reference."
01:45:11.880 | Right, actually,
01:45:13.720 | there's many different references to the book.
01:45:15.160 | This is not the only reference.
01:45:16.640 | But a very important reference
01:45:18.300 | that I think is much more important to me than Snow Crash,
01:45:20.480 | which is good, a good book, whatever, but it's fictional,
01:45:24.040 | is "Der Judenstaat" by Theodor Herzl,
01:45:26.340 | which translates as "The Jewish State."
01:45:28.080 | And that led to the foundation of Israel.
01:45:29.480 | And that's very real.
01:45:30.720 | It's worth reading because it's amazing.
01:45:32.800 | Theodor Herzl was like a tech founder, okay?
01:45:35.640 | In the book, he was writing
01:45:36.480 | about the death of distance in 1897.
01:45:39.660 | Because steamships could take you across countries, okay?
01:45:43.280 | And he like, he's just, you know,
01:45:46.480 | amazingly smart and practical guy,
01:45:49.240 | where he just handled all these various objections.
01:45:51.480 | And he said, "Look, you know, the Jewish people,
01:45:54.320 | you know, our choices are either A,
01:45:56.360 | assimilate and give up the culture,
01:45:57.820 | or B, some people are thinking communism's a good idea.
01:46:00.000 | I disagree with that.
01:46:00.960 | We should do C, build our own country, right?"
01:46:03.640 | And that was considered totally crazy.
01:46:05.640 | But what he did was he A, wrote a book,
01:46:07.800 | B, started a fund,
01:46:09.260 | C, organized a semiannual conference,
01:46:11.400 | the, you know, World Zionist Congress,
01:46:13.200 | and the fund and the Congress are still going today.
01:46:15.360 | Crucially, there were a bunch of intermediate stages
01:46:18.560 | between the book and the idea,
01:46:20.760 | and then the actual state of Israel in 1947.
01:46:24.100 | For example, the, you know,
01:46:26.760 | the folks who were committed Zionists got together
01:46:28.960 | and started crowdfunding territory
01:46:30.640 | in what is now Palestine.
01:46:31.800 | And in fact, though, Palestine was only one choice.
01:46:34.040 | In the book, they also had Argentina as a choice.
01:46:36.840 | So this is my concept, cloud first, land last,
01:46:39.360 | and the land's a parameter, you can choose, right?
01:46:41.400 | Other places that were considered at various points,
01:46:43.040 | like Madagascar, Birobidzhan in the former Soviet Union,
01:46:46.000 | right?
01:46:46.840 | So the land was a parameter.
01:46:47.880 | Palestine won out because of its, you know,
01:46:49.920 | historical and religious importance.
01:46:51.420 | Now, by the way, one thing,
01:46:52.840 | I'm sure there's some, like, some fraction of viewers
01:46:54.720 | are gonna be like, "Oh my God,
01:46:55.600 | like all the bad stuff that happened."
01:46:57.440 | I'm obviously not denying
01:46:58.960 | that there's enormous amounts of controversy and so on
01:47:00.800 | that attends Israel.
01:47:01.800 | I consider myself generally pro-Israeli.
01:47:03.880 | I'd also consider myself pro-Palestinian.
01:47:05.320 | I fund lots of Palestinians and so forth.
01:47:07.120 | So I'm leaving that part out, that huge conflict,
01:47:09.880 | or, you know, for now, okay?
01:47:11.480 | And you might say that's airbrushing it.
01:47:13.600 | I don't mean it to do that.
01:47:14.620 | I'm saying, here is the positive things they did.
01:47:17.680 | Can we take the positive and not have the negative?
01:47:20.480 | And I'll come back to how we might swap those parts out.
01:47:22.960 | But let me just talk about this a little bit more.
01:47:24.800 | So one of the things that happened was
01:47:26.880 | committed Zionists went and crowdfunded territory
01:47:29.480 | in what is now Israel, and they knit it together, right?
01:47:33.760 | Because when you're physically present on territory,
01:47:35.780 | yes, in theory, like the British Empire was in control.
01:47:39.760 | They were the sovereign, okay?
01:47:41.320 | In practice, who were the boots on the ground,
01:47:43.560 | the facts on the ground, right?
01:47:45.000 | These are the people who are actually tilling the land
01:47:47.060 | and building the buildings and so on and so forth.
01:47:49.480 | Like, who had the claim there
01:47:51.120 | is like the people who are present, okay?
01:47:52.800 | Now, this territory, this network of territories
01:47:57.080 | eventually became the basis for,
01:48:00.360 | or part of the basis for what became Israel.
01:48:02.080 | Now, I'm fully aware that the exact configuration
01:48:06.540 | of what territory belongs to Israel,
01:48:07.960 | what territory belongs to Palestinians,
01:48:09.440 | this is an enormous topic of dispute, okay?
01:48:11.660 | But I just point this out to say
01:48:13.760 | the process of going from book to crowdfunding territory
01:48:16.720 | to a sovereign state where people
01:48:18.400 | were now citizens of Israel,
01:48:19.700 | as opposed to the British Empire,
01:48:21.320 | is not some fictional thing, but did happen,
01:48:24.200 | and within the lifetimes of some of the older,
01:48:26.640 | you know, they're in their 80s now,
01:48:28.200 | but in the lifetimes of some older people, okay?
01:48:31.040 | So it's not impossible.
01:48:32.880 | In fact, it has happened, right?
01:48:34.900 | - But for that step, perhaps, hopefully,
01:48:37.840 | is a better example, because in this particular,
01:48:41.480 | like you said, land last, if I were to say,
01:48:45.400 | if I was an alien and arrived at Earth
01:48:48.520 | and say, choice of land, maybe if you were interested
01:48:53.080 | in choosing a land that represents a network state
01:48:58.080 | where ideas that unites a people based on ideas,
01:49:03.480 | maybe pick a land that doesn't lead
01:49:06.600 | to generational conflict and war.
01:49:09.920 | - Yes, so I'll get to that point.
01:49:10.760 | - And destruction and suffering and all that.
01:49:13.000 | - All the stuff, that's right.
01:49:14.120 | So now that I've said what are the positive things
01:49:18.320 | about Israel, and I think there's a lot to admire
01:49:19.880 | in Israel, as I said, I think there's also a lot
01:49:21.400 | to admire in the Palestinians and so on.
01:49:23.120 | I'm not taking any position on that.
01:49:25.040 | There's other inspirations for the network state.
01:49:27.800 | The second major inspiration is India,
01:49:30.240 | which managed to achieve independence nonviolently, right?
01:49:34.560 | That's very important, right?
01:49:35.740 | So can you fuse these things, right?
01:49:37.840 | A state started with a book
01:49:39.320 | that achieved independence nonviolently, okay?
01:49:42.300 | And that managed to build this polyglot,
01:49:44.860 | you know, multicultural democracy, right?
01:49:47.880 | That does, you know, like, you know,
01:49:50.200 | India has its flaws, but it does manage to have,
01:49:53.160 | you know, human rights of lots of people respected
01:49:54.980 | and what have you, right?
01:49:56.880 | And has managed to, you know,
01:49:59.040 | there were times, like the Emergency in the 1970s,
01:50:01.520 | when Indira Gandhi declared emergency.
01:50:02.960 | There were times when it seemed touch and go,
01:50:05.040 | but overall with fits and starts,
01:50:06.700 | this flawed thing has kind of made its way through.
01:50:09.320 | And, you know, the third inspiration is Singapore
01:50:12.600 | with Lee Kuan Yew, who built a city state from nothing.
01:50:17.040 | You know, I shouldn't say from nothing.
01:50:18.040 | Okay, there was something there,
01:50:19.000 | but let's say built one of the richest countries
01:50:21.340 | in the world without like huge amounts
01:50:23.760 | of natural resources in the middle of a zone
01:50:26.560 | where there was lots of communist revolution going on.
01:50:29.920 | And so he was the CEO founder essentially
01:50:32.260 | of this amazing startup country, right?
01:50:35.280 | And, you know, finally, of course, America,
01:50:37.900 | which has too many influences to name
01:50:40.680 | things we talked about, the nation of immigrants,
01:50:42.740 | obviously the constitution and so on.
01:50:44.840 | And you think, okay, can we go,
01:50:46.280 | you think of these inspirations.
01:50:48.240 | What's interesting about these four countries, by the way,
01:50:50.760 | Israel, India, Singapore, and the US,
01:50:53.080 | they have something in common.
01:50:53.920 | You know what that is?
01:50:54.760 | - What is that?
01:50:55.580 | - They're all forks of the UK code base.
01:50:57.900 | We think obviously, you know,
01:51:00.160 | the UK was sort of the ancestor of America,
01:51:02.360 | but Israel was a former British colony, right?
01:51:06.120 | India was a British colony and so was Singapore, right?
01:51:08.560 | - For people who don't know what fork and code base means,
01:51:12.400 | it's a language from versioning systems,
01:51:15.520 | particularly Git, represented online
01:51:18.240 | on a website called GitHub.
01:51:20.020 | And a fork means you copy the code
01:51:23.800 | and all the changes you make to the code
01:51:26.040 | now live in their own little world.
01:51:28.200 | So America took the ideas that define the United Kingdom
01:51:31.580 | and then forked it by evolving those ideas
01:51:36.000 | in a way that didn't affect the original country.
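
A minimal command-line sketch of that forking mechanic. The repository and file names here are hypothetical; on a site like GitHub the fork lives on the hosting service, but mechanically it is just a full copy of the code and its history that then diverges:

```shell
# Create the "original" repository (the UK code base, so to speak).
git init -q original
cd original
echo "common law precedent" > constitution.txt
git add constitution.txt
git -c user.name=editor -c user.email=editor@example.com commit -qm "V1: UK code base"
cd ..

# "Fork" it: copy the code and its entire history.
git clone -q original fork
cd fork
echo "written constitution" >> constitution.txt
git -c user.name=editor -c user.email=editor@example.com commit -qam "V2: American modifications"
cd ..

# The fork has diverged; the original repository is untouched.
wc -l < original/constitution.txt
wc -l < fork/constitution.txt
```

The last two commands print 1 and 2: the fork carries the new change, while the original does not.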
01:51:39.480 | - That's right.
01:51:40.320 | And what's interesting about this is,
01:51:41.680 | and of course I'm saying that
01:51:42.520 | in a somewhat playful way, right?
01:51:44.080 | But I think it's a useful analogy, interesting analogy.
01:51:46.200 | So you have the Americans who forked the UK code base,
01:51:50.760 | and then you have the Indians, Israelis,
01:51:53.760 | and the Singaporeans who also made their own modifications.
01:51:56.640 | And in some ways, each society has pieces
01:51:59.020 | that you can take from them and learn from them
01:52:01.800 | and try to combine them, right?
01:52:03.320 | So you have a state that is started by a book
01:52:07.460 | that non-violently assembles,
01:52:11.280 | that crowdfunds territory around the world,
01:52:13.480 | that is led by a CEO founder,
01:52:16.400 | and that is also governed by something
01:52:20.200 | that's like a constitution.
01:52:21.900 | But just like you went from,
01:52:23.240 | you know, I talk about the V1, V2, and V3 a lot, right?
01:52:25.900 | Like V1 is gold and V2 is fiat and V3 is Bitcoin, right?
01:52:29.740 | Or V1 is hunter-gatherer and V2 is farmer-soldier,
01:52:32.440 | V3 is digital nomad or sovereign collective, okay?
01:52:35.280 | Which is not just an individual, but a group.
01:52:37.540 | Here, V1 is UK common law.
01:52:40.280 | They don't have a constitution.
01:52:41.160 | It's all precedent going for many years, right?
01:52:43.840 | V2 is the US constitution and V3 is the smart contract,
01:52:47.960 | the social smart contract, which is a fusion, obviously,
01:52:50.440 | of Rousseau's concept of the social contract
01:52:52.760 | and the smart contract.
01:52:53.600 | The social smart contract is like written in code, okay?
01:52:56.840 | So it's like even more rigorous than the constitution.
01:52:59.160 | And in many ways, you can think of going
01:53:01.360 | from the United Kingdom of England, Wales,
01:53:06.240 | you know, Scotland, Northern Ireland,
01:53:07.640 | to the United States of America,
01:53:08.920 | to the network states of the internet, okay?
01:53:11.080 | Where you go from the rights of Englishmen
01:53:13.760 | with the Magna Carta to Europeans, African-Americans,
01:53:17.760 | all the immigrants to the Americans or the North America,
01:53:22.440 | then you go to all the people of the world.
01:53:24.400 | And so you basically are more democratic
01:53:28.080 | and you're more capitalist
01:53:30.140 | because you're talking about internet capitalism,
01:53:31.940 | not just nation-state-locked capitalism.
01:53:33.600 | In a sense, it's the V3, right?
01:53:35.800 | In other ways, only about 2% of the world,
01:53:39.440 | those over 35 and native-born American,
01:53:43.640 | can qualify to be president of the United States.
01:53:46.160 | But 100% of the world,
01:53:47.440 | you could become the president of a network state.
01:53:49.360 | There might be a, you know, a Palestinian Washington
01:53:52.920 | or a, you know, Brazilian Hamilton, right?
01:53:56.420 | And now, rather than say, okay,
01:53:58.600 | maybe you have a small percentage chance
01:54:01.200 | of immigrating to the US
01:54:02.040 | and a small percentage chance of your descendant,
01:54:03.880 | you know, becoming like, you know, president,
01:54:06.520 | now we can just say, you can start online.
01:54:08.840 | And you know what?
01:54:09.680 | Maybe this person is so exceptional,
01:54:10.720 | they have Americans coming to their,
01:54:12.880 | you know, network state, right?
01:54:14.160 | - You don't think that kind of thing is possible
01:54:15.800 | with like the rich get richer in a digital space too,
01:54:20.040 | that people with more followers
01:54:21.880 | have friends that have followers and they like-
01:54:25.640 | - I don't think it's the rich get richer.
01:54:26.640 | I think what happens is,
01:54:27.840 | so this is an important concept.
01:54:30.760 | It is, it's multi-axis, right?
01:54:32.940 | That is to say, for example,
01:54:35.360 | just the introduction of the Bitcoin axis, right?
01:54:38.820 | And those, 'cause it didn't exist pre-2009, now it exists.
01:54:43.460 | Those people who are rich in BTC terms
01:54:47.160 | are only partially correlated
01:54:48.440 | with those who are rich in USD terms.
01:54:50.280 | There's all these folks, essentially-
01:54:52.480 | - BTC is Bitcoin and USD is US dollar.
01:54:55.240 | - Yes. (laughs)
01:54:56.720 | So that's a new axis.
01:54:58.200 | And ETH is yet another axis, right?
01:54:59.960 | You- - Ethereum.
01:55:00.800 | ETH is Ethereum.
01:55:01.920 | - Right.
01:55:02.760 | So you are essentially getting new social systems
01:55:07.200 | which are actually net inequality decreasing
01:55:10.160 | because before you only had USD millionaires
01:55:12.760 | and now you have a new track
01:55:14.680 | and then another track and another track, right?
01:55:16.960 | You have different hierarchies, different ladders, right?
01:55:19.840 | And so on net, you have more ladders to climb.
01:55:24.020 | And so it's not the rich getting richer.
01:55:25.560 | In fact, old money in some ways
01:55:27.160 | is the last to cryptocurrency.
01:55:28.840 | Old money and old states, I think,
01:55:31.480 | those people who are the most focused on,
01:55:33.640 | you might call it reform, I would call it control, okay?
01:55:36.660 | The most focused on control of the old world
01:55:38.280 | who have the least incentive to switch,
01:55:40.700 | the rich will get poorer
01:55:44.320 | because it will be the poor
01:55:46.040 | or those who are politically powerless,
01:55:48.120 | politically poor, who go and seek out these new states.
01:55:50.880 | - Yeah, I didn't mean in the actual money, but yes, okay.
01:55:53.800 | There's other ladders.
01:55:54.760 | I meant in terms of influence,
01:55:56.440 | political and social influence
01:55:58.160 | in these new network states.
01:56:01.760 | You, I think, said that basically anybody
01:56:04.480 | can become president of a network state.
01:56:06.360 | - Just like anybody can become CEO of a startup company.
01:56:08.240 | Of course, whether people follow you is another matter,
01:56:10.880 | but anybody can go and found one.
01:56:12.080 | Go ahead, sorry.
01:56:13.000 | - Oh, from the perspective that anyone can found one.
01:56:16.160 | Anyone could found, I see.
01:56:18.960 | - We don't think it's implausible
01:56:20.440 | that somebody from Brazil or Nigeria,
01:56:22.640 | I mean, most quote billionaires in the world
01:56:25.480 | are not American.
01:56:26.320 | And in fact, actually, here's another important point.
01:56:28.800 | It's far easier to become a tech billionaire
01:56:31.160 | than become, or a billionaire period,
01:56:32.760 | than become president of the United States.
01:56:34.240 | There's less than 50 US presidents ever, all time, okay?
01:56:38.000 | It is a much more realistic ambition
01:56:40.640 | to become a billionaire than become president.
01:56:43.000 | There's like thousands of billionaires worldwide.
01:56:44.800 | In fact, 75% of them are outside of the US.
01:56:47.160 | And many of those have been,
01:56:48.880 | some of them are like energy and oil,
01:56:51.120 | which is often based on political connections,
01:56:53.200 | but a very large chunk of the rest are tech, okay?
01:56:56.280 | And that's something where you're mining,
01:56:58.680 | but you're mining online by hitting keys
01:57:00.680 | as opposed to with a pickaxe, you know, in granite, right?
01:57:03.540 | So the point is that we think it's
01:57:05.680 | totally understandable today for there to be a, you know,
01:57:10.120 | huge founder who comes out of Vietnam or, you know,
01:57:14.560 | South America, like that,
01:57:15.920 | like you can name founders from all over the world, right?
01:57:18.360 | Exceptional people can rise from all the world
01:57:20.200 | to run giant companies.
01:57:21.840 | Why can they not rise to run giant new countries?
01:57:24.880 | And the answer is we didn't develop the mechanism yet,
01:57:26.640 | right?
01:57:27.900 | And just as another example, I talk about this in the book,
01:57:30.480 | Vitalik Buterin is far more qualified than Jerome Powell,
01:57:35.480 | right, or anybody at the Federal Reserve.
01:57:36.760 | He actually built and managed a monetary policy
01:57:40.460 | and a currency from scratch, okay, as a 20-something, right?
01:57:45.400 | Obviously that's a more accomplished person
01:57:48.060 | than somebody who just inherited an economy.
01:57:50.120 | This is a- - A lot of people
01:57:51.360 | can push back at that and say that the people
01:57:54.000 | that initially build a thing aren't necessarily
01:57:57.560 | the best ones to manage a thing once it scales
01:57:59.920 | and actually has impact.
01:58:00.920 | - Sometimes, sometimes, but Zuck has done a good job
01:58:03.360 | of both, I think Vitalik has done a good job of both, right?
01:58:05.600 | - But that's not an inherent truth.
01:58:07.800 | - Well, so actually I have- - If you built the thing,
01:58:09.800 | you will be the best person to run it.
01:58:11.560 | - I will agree with you on that,
01:58:12.720 | and actually I talk about this in the book,
01:58:14.860 | or I've got an essay on this called
01:58:16.000 | Founding vs. Inheriting, okay?
01:58:18.040 | And the premise is actually that, the classic example,
01:58:21.360 | you know the saying, "Shirt sleeves to shirt sleeves
01:58:23.640 | "in three generations."
01:58:24.960 | It means the guy who starts out poor and builds a fortune,
01:58:28.920 | his son maintains it,
01:58:30.120 | and his grandson dissipates it, right?
01:58:34.520 | - Why is shirt sleeves a symbol of poverty?
01:58:37.960 | - Back in the past, it was kind of like,
01:58:39.200 | you know, you're just working with your,
01:58:40.600 | you're not white collar,
01:58:42.040 | you're back to working with your hands,
01:58:43.360 | you're just- - Oh, so it's a blue collar
01:58:44.800 | to blue collar in three generations.
01:58:45.880 | - Yeah, yeah, or working class or something like that, right?
01:58:48.440 | So essentially that the grandson squanders it, right?
01:58:52.040 | And, you know, in sense, by the way,
01:58:54.720 | just to talk about that for a second,
01:58:56.360 | if you have two children and four grandchildren
01:58:59.320 | and eight great-grandchildren and 16 and so on,
01:59:01.760 | and in older families, you know, they were much bigger,
01:59:04.120 | right, six, you know, children is not uncommon.
01:59:06.800 | Whatever fortune you have is now split six ways,
01:59:09.000 | and then six ways and six ways again.
01:59:10.800 | So with the exception of primogeniture,
01:59:13.000 | where the oldest son inherits all the way down,
01:59:15.360 | the majority of descendants just a few generations out
01:59:17.960 | have probably inherited none of that fortune,
01:59:19.680 | unless it has compounded to such an extent
01:59:21.880 | that it's like up six X over 20 years, right?
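The split-versus-compounding arithmetic can be made concrete with a short sketch. The six-heirs-per-generation figure is the speaker's own example; the starting fortune and growth rates are illustrative assumptions.

```python
def share_per_descendant(fortune, growth_per_gen, children_per_gen, generations):
    """Each generation the fortune compounds by growth_per_gen,
    then is split equally among children_per_gen heirs per line."""
    return fortune * (growth_per_gen / children_per_gen) ** generations

base = 1_000_000

# No growth: after three six-way splits, each great-grandchild's
# share collapses to base / 6**3.
flat = share_per_descendant(base, growth_per_gen=1.0, children_per_gen=6, generations=3)

# The fortune must compound 6x per generation just to keep each
# descendant's share constant.
matched = share_per_descendant(base, growth_per_gen=6.0, children_per_gen=6, generations=3)

print(round(flat))   # ~4,630 per great-grandchild
print(round(matched))  # back to 1,000,000 per great-grandchild
```

The same exponent drives the DNA point: 1/2 per generation gives 1/16 after four.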
01:59:25.800 | So it's actually hard to maintain a quote ruling class
01:59:28.680 | in the sense that this person
01:59:30.360 | who's like four generations down has, you know,
01:59:34.200 | like 1/16 of the DNA, you know,
01:59:37.360 | one over two to the fourth, right,
01:59:39.560 | of the founder who built the fortune.
01:59:42.240 | So it's not even like the same,
01:59:43.400 | is it the same family even, right?
01:59:45.480 | Is the fortune actually in the family?
01:59:46.640 | So most people don't think a few generations out,
01:59:49.080 | they just kind of think, oh, Marx is right,
01:59:50.800 | there's always been a rich and a poor.
01:59:51.920 | It's actually much more dynamic than that,
01:59:53.600 | 'cause you literally, like, what is even the family
01:59:55.480 | when it's diluted out, you know, 1/16, right?
01:59:58.760 | If you're 1/16 of Rockefeller, are you a Rockefeller?
02:00:01.400 | You're really 15/16 something else;
02:00:03.140 | would you have the Rockefeller fortune?
02:00:04.400 | Probably not, right?
02:00:05.580 | Now, there is, again, primogeniture,
02:00:07.880 | where the guy who inherits the name all the way through,
02:00:10.520 | that would be one way to pass it down.
02:00:13.400 | But even that person doesn't necessarily have the qualities
02:00:15.840 | of the guy who, you know, the cultural qualities,
02:00:17.640 | other qualities of the guy who's like four generations past,
02:00:19.600 | so they tend to squander it, right?
02:00:21.120 | So this actually brings us to,
02:00:23.160 | you know, coming back up to governance,
02:00:25.200 | the system, the guys who built the United States,
02:00:27.960 | you know, like Washington and Hamilton,
02:00:30.240 | these are giants, right?
02:00:31.240 | These are founders.
02:00:32.700 | And the folks today are like, not the grandson,
02:00:37.000 | but like the 40th generation heir of a factory
02:00:41.000 | that somebody else built.
02:00:42.400 | Like, think about a factory and you have, you know,
02:00:44.400 | this grandchild or great-grandchild inherits a factory.
02:00:47.480 | Most of the time, it's just cranking out widgets
02:00:49.320 | and the great-grandson is cashing checks.
02:00:51.560 | They have been selected as legitimate heir
02:00:54.080 | because it's the, you know, the founder,
02:00:57.280 | passes it down to his son, passes it down to his grandson,
02:00:59.440 | to his great-grandson.
02:01:00.280 | So legitimacy is there.
02:01:01.280 | They've got title, they can show, I own this factory, okay?
02:01:04.520 | They can cash the checks.
02:01:05.640 | There's professional managers there.
02:01:06.800 | Everything seems fine.
02:01:08.040 | Until one day, that factory has to go from making,
02:01:12.080 | you know, widgets to making masks for COVID
02:01:14.240 | or something else.
02:01:15.080 | It has to change direction.
02:01:15.920 | It has to do something it hasn't done before.
02:01:17.760 | None of that capability for invention and reinvention
02:01:21.120 | is present anymore.
02:01:22.560 | These people have inherited something
02:01:24.520 | that they could not build from scratch.
02:01:26.040 | Because they could not build from scratch,
02:01:27.200 | they can't even maintain it.
02:01:28.160 | This is an important point.
02:01:29.520 | The ability to build from scratch is so important
02:01:31.240 | because if some part breaks
02:01:34.240 | and you don't know why it was there,
02:01:35.960 | can you even maintain it?
02:01:37.000 | No, you can't, okay?
02:01:38.960 | Unless all the replacement parts
02:01:41.040 | and the know-how to fit them together is there,
02:01:42.560 | you can't repair this.
02:01:43.880 | So in 2009, Mother Jones had a story
02:01:45.560 | that said that the US military had forgotten
02:01:47.680 | how to make some kinds of nuclear weapons
02:01:49.880 | 'cause there was a part where all the guys
02:01:51.920 | who knew how to make it had aged out or left.
02:01:54.480 | Okay?
02:01:55.320 | And this was some aerogel or something like that.
02:01:57.000 | It was rumored, okay?
02:01:58.320 | Thing is, you're seeing increasingly, for example,
02:02:02.760 | you've got wildfires in California.
02:02:05.240 | You've got water that's not potable in Jackson.
02:02:08.960 | You've got power outages in Texas.
02:02:12.040 | You're seeing a lot of the infrastructure of the US
02:02:15.200 | is just less functional.
02:02:17.080 | I think probably part of that is due to civil engineering
02:02:19.720 | not being that sexy a field, people aging out,
02:02:23.480 | and just domain knowledge being lost,
02:02:26.680 | and the heirs who win the role of mayor
02:02:31.520 | or whatever of this town
02:02:33.080 | don't have the ability to build it from scratch.
02:02:34.800 | You're just selected for legitimacy, not competence, okay?
02:02:37.520 | So once you think about this concept
02:02:38.880 | of founding versus inheriting,
02:02:40.120 | and I've got the whole essay which talks about this,
02:02:42.400 | of course, the alternative to somebody
02:02:44.800 | who's legitimate but not competent
02:02:47.000 | where people will say is,
02:02:47.840 | "Oh, we need an authoritarian
02:02:49.960 | "to be in control of everything."
02:02:51.200 | And then their hope is that that person is competent,
02:02:55.040 | but they don't have legitimacy
02:02:56.560 | because if they're just installed
02:02:57.800 | as just like an authoritarian ruler,
02:03:00.000 | 50% of the population is really mad at them.
02:03:01.720 | They don't have title.
02:03:02.560 | They just grab the title.
02:03:04.040 | Maybe they can exert enough force,
02:03:05.880 | but that's the problem with the authoritarian dictator
02:03:08.600 | takeover, right?
02:03:09.720 | So the alternative, the third version,
02:03:11.360 | is the founder who combines both legitimacy and competence
02:03:14.160 | 'cause they start from scratch,
02:03:15.320 | and they attract people to their vision,
02:03:16.960 | they build it from scratch,
02:03:18.240 | and so what you need is the ability
02:03:19.480 | to constantly do refoundings, rebirths.
02:03:21.960 | - So if you imagine a world
02:03:24.080 | that is primarily network states,
02:03:28.760 | can you help me imagine what that looks like?
02:03:33.360 | Now, there's several ways to imagine things,
02:03:36.200 | which is how many of them are there,
02:03:38.840 | and how often do the new ones pop up?
02:03:42.680 | - There could be thousands.
02:03:43.760 | - Given seven billion people,
02:03:45.040 | eight billion people on Earth.
02:03:46.240 | - Yeah, yeah, so there's network state
02:03:48.280 | in the precise definition I have in the book,
02:03:51.000 | which is a diplomatically recognized entity,
02:03:53.360 | and there's network state in sort of the loose definition
02:03:55.040 | where, you know, one thing that's interesting is
02:03:57.200 | this term has become a lowercase term really fast, okay?
02:04:00.840 | - Network state.
02:04:01.680 | - Yeah, like in the sense of Google
02:04:03.880 | became lowercase Google for like Googling,
02:04:05.680 | or like Uber became lowercase Uber.
02:04:07.360 | Like if you go to the networkstate.com/reviews,
02:04:10.440 | or you go to search.twitter.com and put in network state,
02:04:13.680 | you'll see it's just become like a word or a phrase, okay?
02:04:16.280 | So that means it's sort of, whatever I intend it to mean,
02:04:19.260 | people will use it to mean what they want it to mean, right?
02:04:21.720 | Okay. - Internet.
02:04:23.000 | - It's internet, right?
02:04:23.840 | - I think internet.
02:04:24.660 | You've become a meme.
02:04:25.500 | Well, first of all, you're a meme, and this book is a meme.
02:04:27.240 | - Am I a meme?
02:04:28.080 | Okay, maybe I'm a meme.
02:04:28.900 | But the book is, the book I think is a good meme.
02:04:30.560 | That's actually why I wanted to make it free.
02:04:31.920 | I wanted people to take it out there and make it their own.
02:04:34.440 | And one of the things I say at the beginning,
02:04:35.880 | and I'll come back to this thing,
02:04:36.840 | is it's a toolbox, not a manifesto.
02:04:39.320 | Even if you dislike 70% of it, 80% of it, 90% of it,
02:04:43.120 | if there's something that's useful to you,
02:04:44.360 | you can take that and use it, just like a library,
02:04:47.200 | you know, a software library.
02:04:48.240 | You might just use one function there.
02:04:49.400 | Great, I'm glad I've delivered you some value, right?
02:04:51.400 | That's my purpose in this, right?
02:04:52.240 | - So you're not Ayn Rand?
02:04:53.700 | - No, I'm not Ayn Rand.
02:04:54.680 | Basically, the whole point of this actually is
02:04:57.320 | it's polytheistic, polystatistic, polymystic,
02:05:02.040 | is genuinely--
02:05:03.000 | - Is it polyamorous?
02:05:04.440 | - It's not polyamorous.
02:05:05.280 | - Okay.
02:05:06.400 | - Though somebody might want--
02:05:07.240 | - Do you have love advice in the book?
02:05:08.640 | I didn't see it.
02:05:09.640 | So do you talk about love in the book?
02:05:11.160 | - I do not talk about love.
02:05:13.000 | Rather--
02:05:13.840 | - Maybe in V2 you will.
02:05:14.660 | - Not that I don't believe in love.
02:05:15.500 | Love is great.
02:05:16.320 | - All right, I will accept your offer
02:05:17.160 | to write a guest chapter in your V2 book about love.
02:05:20.400 | - All right, great.
02:05:21.340 | - Because there is some aspect that's very interesting,
02:05:25.800 | which parts of human civilization require physical contact,
02:05:30.320 | physical, 'cause it seems like more and more
02:05:33.440 | can be done in a digital space.
02:05:35.120 | - Yeah, but as I said--
02:05:36.520 | - Work, for example.
02:05:37.520 | - But you're not gonna build a self-driving car city
02:05:39.880 | in digital space.
02:05:40.720 | You're not gonna be able to do--
02:05:43.960 | - Why do cars at all?
02:05:43.960 | - Well, sure, but let's say you're not gonna be able
02:05:46.440 | to get to Mars in a purely digital thing.
02:05:49.440 | You need to build, you have to have a little rocket launchpad.
02:05:51.560 | You're not gonna be able to do all the innovative
02:05:54.160 | biomedicine, whether it's all the,
02:05:58.000 | have you seen bioelectricity?
02:05:59.320 | Or there's stuff on regenerative medicine,
02:06:02.040 | stem cells, all this stuff.
02:06:03.880 | You just can't do that digitally, right?
02:06:06.080 | We're still physical beings, so you need physical space,
02:06:09.680 | but how do we get that, right?
02:06:10.640 | So this is meant to wind its way
02:06:13.320 | through various roadblocks in the so-called,
02:06:15.800 | actually my term from many years ago,
02:06:17.440 | the idea maze, it's meant to wind its way
02:06:18.920 | through the idea maze to find how to use bits
02:06:21.000 | to unlock innovation in atoms.
02:06:22.800 | - The idea maze within the bigger prime number maze,
02:06:25.200 | or go back to visualizing the number of states
02:06:28.240 | and how often are they born.
02:06:29.400 | - So let me first anchor this,
02:06:31.520 | because people, just to give some numbers, right?
02:06:34.500 | How many UN listed countries are there?
02:06:36.960 | Like 196, 193, okay?
02:06:38.800 | And there's some that are on the border,
02:06:40.120 | like Taiwan or Israel, right?
02:06:42.080 | Where they're not, I mean, Israel is a country,
02:06:43.560 | but it's not recognized by every country
02:06:45.040 | or what have you, right?
02:06:45.880 | - Is Texas a country?
02:06:47.640 | - No, but it may eventually become, right, okay.
02:06:50.380 | So within that list of about 200 countries,
02:06:53.400 | okay, I've got a graph in the book that shows
02:06:54.920 | that most countries are actually small countries.
02:06:59.400 | There's about 12 countries that have less
02:06:59.400 | than 100,000 people by the UN definition of a country.
02:07:02.100 | There's another 20 something that have
02:07:03.720 | between 100,000 and one million.
02:07:06.560 | There's another 50 or 60 something
02:07:08.040 | that have between a million and 10 million.
02:07:10.480 | So most countries in the UN are less than 10 million people.
02:07:15.200 | There's only 14 countries that are over 100 million people.
02:07:17.540 | Okay, so most countries are small countries
02:07:19.880 | is kind of surprising to us
02:07:21.520 | because most people live in big countries, okay?
02:07:24.320 | And so now you're like, okay,
02:07:26.680 | well, I've built social networks that are bigger than that.
02:07:29.400 | You have a following that's bigger than 100,000 people.
02:07:31.600 | You have a following that's bigger than, you know,
02:07:33.200 | a small country like Kiribati or what have you, right?
02:07:35.880 | And, okay, so that first changes feasibility.
02:07:38.920 | You think of a country as this huge, huge, huge thing,
02:07:41.120 | but it's actually smaller than many,
02:07:42.360 | many countries are smaller than social networks
02:07:44.120 | that you've built, okay, number one.
02:07:46.260 | Number two is the number of UN listed countries,
02:07:50.480 | even though it's been flat-ish for the last 30 years
02:07:53.280 | with like a few things like South Sudan
02:07:55.280 | and East Timor that have come online,
02:07:58.080 | there's a graph that I posted which shows
02:07:59.480 | that it's increased by about,
02:08:00.800 | from about 40 or 50 something at the end of World War II
02:08:05.040 | when the UN was set up to 190 something today.
02:08:07.720 | There's been like kind of a steady increase in particular
02:08:09.920 | with all the decolonization,
02:08:11.600 | all the countries that got their independence
02:08:12.880 | first from the British empire
02:08:14.720 | and then from the Soviet empire, right?
02:08:16.720 | That imperial breakup led to new countries, okay?
02:08:20.480 | And so then the question is, is that flat forever?
02:08:23.980 | Well, the number of new currencies
02:08:25.920 | similarly increased for a while,
02:08:27.640 | roughly one per country or thereabouts,
02:08:30.320 | and then it was flat for a while
02:08:31.400 | and then suddenly it's gone completely vertical.
02:08:33.800 | That's an interesting graph, right?
02:08:35.800 | Where it's like linear-ish, then it's flat,
02:08:38.800 | and then it just goes voof like this.
02:08:40.840 | Now you can define, you can argue where the boundary is
02:08:44.440 | for quote a new currency, okay?
02:08:47.460 | But I think Bitcoin certainly counts,
02:08:48.760 | I think Ethereum certainly counts
02:08:50.040 | in terms of just its scale and adoption worldwide.
02:08:52.440 | So at least you have two.
02:08:54.000 | If you take the broad church view,
02:08:56.360 | you have a thousand or something like that, right?
02:08:59.560 | Somewhere in between, you might say,
02:09:01.520 | how many currencies are above the market cap
02:09:04.520 | of an existing previously recognized fiat currency?
02:09:07.240 | Like which got onto the leaderboard, right?
02:09:09.720 | There's a website just like coinmarketcap.com.
02:09:12.160 | That's like a site for like cryptocurrency tracking,
02:09:14.200 | it's very popular, okay?
02:09:15.480 | There's a fun site called fiatmarketcap.com
02:09:18.160 | which shows where Bitcoin is relative
02:09:20.160 | to the fiat currencies of the world.
02:09:21.520 | And it's like, last I checked, like number 27,
02:09:23.320 | somewhere in between the Chilean peso
02:09:26.640 | and the Turkish lira or something, okay?
02:09:28.920 | And it previously been close to cracking the top 10, okay?
02:09:31.440 | And I think it will again at some point.
02:09:33.600 | So we know that you can have a currency out of nowhere
02:09:37.400 | that ranks with the fiat currencies of the world.
02:09:39.680 | Could you have a country out of nowhere
02:09:40.800 | that ranks with the countries of the world?
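The leaderboard idea described above (a fiatmarketcap.com-style ranking) boils down to a simple insertion-rank computation. The market caps below are placeholder round numbers for illustration only, not real figures.

```python
import bisect

# Placeholder market caps in billions of USD. Made-up round numbers
# for illustration only, not real data.
fiat = {
    "CNY": 30000, "USD": 21000, "EUR": 16000, "JPY": 10000,
    "CHF": 1100, "CLP": 300, "TRY": 250,
}

def leaderboard_rank(market_cap, table):
    """1-indexed rank a new entrant would take among existing currencies,
    largest market cap first."""
    caps = sorted(table.values())  # ascending
    bigger = len(caps) - bisect.bisect_right(caps, market_cap)
    return bigger + 1

# A 280B entrant would slot in between TRY and CLP on this toy table,
# echoing the "between the Chilean peso and the Turkish lira" example.
print(leaderboard_rank(280, fiat))
```

The same computation works for ranking a network state's population or real estate footprint against the UN-listed countries: only the table changes.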
02:09:42.560 | So this is maybe the fastest way,
02:09:45.240 | you probably should have said this at the very beginning.
02:09:46.800 | If you go to the network state in one image, okay?
02:09:50.320 | That kind of summarizes what a network state looks like
02:09:52.920 | in a visual, just one single visual.
02:09:54.840 | And the visual is of a dashboard.
02:09:56.680 | And the dashboard shows something
02:09:58.000 | that looks like a social network,
02:09:59.600 | except you're visualizing it on the map of the world.
02:10:02.480 | And it's got network nodes all over the place.
02:10:05.000 | A hundred people here, a thousand people there,
02:10:06.840 | they're all connected together.
02:10:08.200 | The total population of the people in this social network
02:10:11.200 | is about a million people,
02:10:13.480 | 1.7 million in this example.
02:10:15.520 | And some of the people are just singletons.
02:10:19.320 | They're just folks in their apartment
02:10:21.120 | who can conceptualize themselves
02:10:22.760 | as citizens of this network state.
02:10:24.480 | And they've got the flag on their wall, right?
02:10:26.600 | And the digital passport on their phone
02:10:30.520 | along with the digital currency.
02:10:31.840 | Others are groups of hundreds or thousands
02:10:35.080 | or even tens of thousands of people
02:10:36.840 | that have all taken over a neighborhood,
02:10:38.480 | just like Chinatowns exist, right?
02:10:40.640 | Just like, you know, intentional communities existed.
02:10:45.160 | They just basically, you know,
02:10:47.720 | go and crowdfund land together, right?
02:10:50.160 | And these are all networked together, you know,
02:10:52.160 | just like the islands of Indonesia are separated by ocean.
02:10:56.000 | These are islands of this network state
02:10:57.640 | that are separated by the internet, okay?
02:10:59.280 | So they conceptualize themselves as something.
02:11:00.800 | And at the very top of the dashboard,
02:11:03.040 | there's something very important,
02:11:03.880 | which is the population annual income
02:11:06.200 | and real estate footprint of this network state.
02:11:08.720 | So the population we already discussed,
02:11:10.240 | you can build an online social network.
02:11:11.720 | We know you can build something
02:11:13.360 | which has a population that's bigger
02:11:14.600 | than these hundred thousand or million person countries.
02:11:18.040 | One of the new contributions the network state makes
02:11:21.480 | is to say that you can not just exceed it in population,
02:11:23.640 | you can exceed it in real estate footprint.
02:11:26.080 | Because one way of thinking about it is,
02:11:28.120 | I don't exactly know the numbers
02:11:29.760 | on foreign ownership in Estonia,
02:11:31.720 | but let's say to first order,
02:11:33.560 | the million something Estonians own
02:11:35.800 | and could afford Estonia, okay?
02:11:38.080 | A million people could buy a territory
02:11:40.840 | that is the size of Estonia, right?
02:11:42.480 | That's probably true to first order.
02:11:43.600 | There might be some overseas ownership,
02:11:44.880 | but it's probably true, okay?
02:11:45.720 | You probably find a country for which that is true.
02:11:47.680 | But that means is a million people digitally
02:11:50.920 | could buy distributed territory
02:11:53.120 | that is probably greater than or equal
02:11:54.360 | to the size of Estonia.
02:11:55.800 | Especially if they're buying like desert territory
02:11:57.560 | or stuff like that.
02:11:58.680 | Which means now you have a digital country
02:12:00.600 | that is ranking not just in people
02:12:03.760 | but also in real estate footprint
02:12:08.000 | with the countries of the world.
02:12:09.880 | So you start ranking and you're bigger
02:12:11.880 | than these UN listed countries in your population
02:12:14.440 | and your real estate footprint.
02:12:15.280 | And the third is income, okay?
02:12:17.040 | You can prove on chain that you have an income
02:12:21.600 | for the digital population
02:12:23.200 | that is above a certain amount, right?
02:12:25.280 | This is what I call the census of the network state.
02:12:27.440 | And it's actually such a crucial component
02:12:29.000 | that I have it in the essay,
02:12:31.600 | "The Network State in a Thousand Words."
02:12:33.360 | The post office and census were actually important enough
02:12:35.320 | to be written into the US constitution, okay?
02:12:38.240 | Partly because it was like
02:12:39.320 | for apportionment of representatives,
02:12:41.000 | partly because it was a feedback mechanism.
02:12:43.040 | And so that census was done every 10 years
02:12:45.320 | and it's provided a crucial snapshot of the US
02:12:47.280 | for the last several hundred years, okay?
02:12:49.400 | Now here, this census of a digital state
02:12:52.520 | could be done every 10 seconds, okay?
02:12:56.120 | Conducting it is actually not the hard part.
02:12:57.920 | You know what the hard part is?
02:12:59.720 | Proving it.
02:13:00.920 | Because how will the world believe
02:13:03.600 | that you actually have 100,000 people spread
02:13:06.080 | across countries?
02:13:07.120 | Couldn't they all be bots?
02:13:08.120 | Couldn't they be AIs?
02:13:10.120 | Proof of human, proof of income,
02:13:12.640 | and also proof of real estate
02:13:14.120 | start to actually rise dramatically in importance
02:13:17.080 | because you're saying we're gonna rank this digital state
02:13:20.480 | on the leaderboard of the fiat states, okay?
02:13:24.440 | And so that means that people will start to,
02:13:26.080 | at first they'll just laugh at it.
02:13:27.640 | Once you start claiming you have 10,000 citizens,
02:13:30.720 | people are gonna start poking and be like,
02:13:32.120 | "Is that real?
02:13:32.960 | Prove that it's real."
02:13:33.960 | So I have a whole talk on this,
02:13:36.320 | actually I'm giving at this Chainlink conference,
02:13:38.680 | but essentially how do you prove this, right?
02:13:41.320 | The short answer is crypto oracles plus auditing.
02:13:45.000 | The somewhat longer answer is
02:13:47.440 | you put these assertions on chain,
02:13:50.360 | these proof of human,
02:13:52.520 | these proof of real estate, et cetera,
02:13:54.400 | assertions on chain, okay?
02:13:55.600 | And there's people who are writing to the blockchain
02:13:58.720 | and they are digitally signing their assertions.
02:14:01.200 | Now, of course, simply just putting something on chain
02:14:03.120 | doesn't make it true.
02:14:03.960 | It just says you can prove not that
02:14:06.400 | what is written on chain is true,
02:14:07.400 | but that the metadata is true.
02:14:08.800 | You can show who wrote it via their digital signature,
02:14:12.720 | what they wrote, their hash,
02:14:13.840 | and when they wrote it, their timestamp.
02:14:15.080 | So you can establish those things in metadata
02:14:17.080 | of who, what, and when was written.
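The who/what/when metadata guarantee described here can be sketched in a few lines of Python. This is a toy illustration, not a real blockchain: an HMAC stands in for a proper public-key signature (such as ECDSA), the "chain" is a plain append-only list, and the keys are invented for the example.

```python
import hashlib
import hmac
import json
import time

chain = []  # toy stand-in for a public blockchain: an append-only list

def sign_assertion(signer_key: bytes, signer_id: str, claim: dict) -> dict:
    """Record who made a claim, a hash of what was claimed, and when."""
    record = {
        "who": signer_id,  # identity of the signer
        "what": hashlib.sha256(
            json.dumps(claim, sort_keys=True).encode()).hexdigest(),  # hash of the claim
        "when": int(time.time()),  # timestamp
        "claim": claim,
    }
    payload = json.dumps(
        {k: record[k] for k in ("who", "what", "when")}, sort_keys=True).encode()
    # HMAC is a placeholder here for a real digital signature scheme
    record["sig"] = hmac.new(signer_key, payload, hashlib.sha256).hexdigest()
    chain.append(record)
    return record

def verify(record: dict, signer_key: bytes) -> bool:
    """Check the metadata (who/what/when), not the truth of the claim itself."""
    payload = json.dumps(
        {k: record[k] for k in ("who", "what", "when")}, sort_keys=True).encode()
    sig_ok = hmac.compare_digest(
        record["sig"], hmac.new(signer_key, payload, hashlib.sha256).hexdigest())
    hash_ok = record["what"] == hashlib.sha256(
        json.dumps(record["claim"], sort_keys=True).encode()).hexdigest()
    return sig_ok and hash_ok
```

A passing `verify` only establishes who signed what and when; whether the underlying claim is true is a separate question, which is what the auditing discussed in this section is for.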
02:14:18.840 | - Who's the who in that picture?
02:14:20.160 | - So for example-
02:14:21.360 | - How do you know it's one human?
02:14:23.640 | - Great question.
02:14:24.480 | So let's say you've bought a bunch of your piece of territory
02:14:26.200 | from Blackstone, okay?
02:14:27.800 | As a function of that,
02:14:29.600 | blackstone.eth signs an on-chain receipt
02:14:34.600 | that says this,
02:14:38.080 | lexfriedman.eth bought this piece of property from us
02:14:42.760 | and it has, you know, like, it's a thousand square meters
02:14:45.760 | and this is put on chain, they sign it, okay?
02:14:48.960 | That's their digital receipt.
02:14:50.440 | Just like you might get an email receipt
02:14:51.840 | when you buy a piece of property or something, okay?
02:14:53.760 | It's just put not online, but on chain.
02:14:56.120 | And it's signed by Blackstone
02:14:57.600 | or whatever real estate vendor you buy it from.
02:15:00.960 | It could be a company,
02:15:01.840 | it could obviously be an individual, right?
02:15:03.840 | And so you have a bunch of these assertions.
02:15:05.520 | Let's say there's 47 different real estate vendors.
02:15:08.760 | I know vendor's an atypical term there,
02:15:10.240 | but just bear with me, right?
02:15:11.480 | 47 different real estate sellers
02:15:14.920 | that you've bought all of your territory from.
02:15:17.120 | Each of them put digital signatures that are asserting
02:15:19.320 | that a certain amount of real estate was bought
02:15:21.040 | and its square meters, its location,
02:15:22.840 | or whatever else they wanna prove.
02:15:24.520 | The sum of all that is now your real estate footprint, okay?
02:15:28.000 | And now the question is, was that real?
02:15:29.880 | Well, because they signed what they put on chain,
02:15:33.640 | you can do things like you can audit.
02:15:35.360 | Let's say Blackstone has signed 500,000 properties
02:15:38.880 | and they've sold them and put them on chain.
02:15:40.640 | And I'm not talking about 2022 or 2023, but 2030, right?
02:15:44.320 | It'll be a few years out,
02:15:45.160 | but people are doing this type of stuff.
02:15:46.160 | They're putting this stuff on chain.
02:15:47.320 | So you get that on-chain receipt.
02:15:48.680 | They've got 500,000 of these.
02:15:50.080 | What you can do is just sampling, okay?
02:15:53.200 | You pick a subset N of them,
02:15:55.600 | let's say 500 properties around the world.
02:15:57.560 | You go there, you actually go and independently look
02:16:00.480 | at what the square footprint is.
02:16:03.120 | And then from that,
02:16:04.120 | you can see what was the actual, your measurement
02:16:07.400 | versus their reported.
02:16:08.880 | And then you can, via statistical inference,
02:16:10.840 | extrapolate that if they were randomly selected
02:16:12.920 | to the rest of the properties
02:16:14.000 | and get a reliability score for Blackstone's reporting
02:16:16.800 | of its real estate square footage.
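The sampling audit just described — pick a random subset, measure independently, extrapolate to a reliability score — can be sketched in Python. The vendor data and the 10% overstatement below are invented purely for illustration:

```python
import random
import statistics

def audit_reliability(reported, measure, n_samples=500, seed=0):
    """Estimate how reliable a vendor's claims are by random sampling.

    reported: dict of property id -> claimed square meters (on-chain assertions)
    measure:  function property id -> independently measured square meters
    Returns the mean ratio of measured to claimed area over the sample.
    """
    rng = random.Random(seed)
    sample = rng.sample(list(reported), min(n_samples, len(reported)))
    ratios = [measure(pid) / reported[pid] for pid in sample]
    return statistics.mean(ratios)

# Hypothetical vendor with 500,000 properties, each claimed at 1,000 sqm
reported = {pid: 1000.0 for pid in range(500_000)}
measure = lambda pid: 900.0  # field measurement finds only 900 sqm each
score = audit_reliability(reported, measure)  # 0.9: claims overstate reality
```

If the sample is drawn randomly, the score generalizes to the unsampled properties, which is the extrapolation step being described here.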
02:16:18.200 | - Who does the, so that's the auditing step.
02:16:19.920 | - That's the auditing step.
02:16:20.760 | - So the crypto oracle is the-
02:16:24.360 | - Auditable oracle.
02:16:25.200 | - On-chain, what did you say, assertions?
02:16:28.480 | - That's right.
02:16:29.320 | - Yeah, about like who bought stuff with who.
02:16:30.800 | I still have to get to the proof of human,
02:16:32.320 | but auditing, there's a bunch of people randomly checking
02:16:36.520 | that you're not full of shit.
02:16:37.640 | - That's right.
02:16:38.480 | - Who is in charge of the auditing though?
02:16:40.600 | - So it could be a Big Four firm, like a PwC,
02:16:43.440 | and basically the accountants that do corporate balance sheets
02:16:48.160 | and cash flow and-
02:16:50.040 | - Who keeps them in check from corruption?
02:16:52.560 | I'm just imagining a world full of network states.
02:16:55.040 | - Yeah, it's a good question.
02:16:55.880 | So at a certain point you get to who watches the watchers.
02:16:58.800 | Right?
02:16:59.640 | And, oh, well, the government is meant
02:17:02.320 | to keep the accountants accountable.
02:17:04.280 | And Arthur Andersen actually did have a whole flameout
02:17:06.720 | in the, around the time of the Enron thing.
02:17:09.560 | So it is possible that there's corrupt accountants
02:17:12.720 | or bad accountants or what have you.
02:17:14.120 | But of course the government itself is corrupt in many ways
02:17:17.960 | and prints all this money and seizes all of these assets
02:17:20.440 | and surveils everybody and so on and so forth.
02:17:22.720 | So, the answer to your question is going to be probably exit
02:17:27.720 | in the sense that if those accountants,
02:17:31.800 | they are themselves gonna digitally sign a report
02:17:34.000 | and put it on-chain.
02:17:35.080 | Okay?
02:17:35.920 | So they're gonna say, we believe that X, Y, and Z's,
02:17:40.920 | you know, reports are on-chain, we're this reliable,
02:17:44.240 | and here's our study.
02:17:45.240 | If they falsify that, well, if somebody finds that,
02:17:48.600 | eventually, then that person is downweighted
02:17:50.520 | and then you have to go to another accountant.
02:17:52.040 | Right?
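The "exit" dynamic described here — downweight an auditor whose signed report is later contradicted, then route future work to someone more trusted — can be sketched as follows. The auditor names and the halving penalty are invented for the example:

```python
# Reputation weights for hypothetical auditors (names are invented)
reputations = {"big4-auditor.eth": 1.0, "indie-auditor.eth": 1.0}

def record_recheck(auditor, signed_claim, recheck_result, penalty=0.5):
    """Downweight an auditor whenever a signed claim fails an independent re-check."""
    if signed_claim != recheck_result:
        reputations[auditor] *= penalty
    return reputations[auditor]

def choose_auditor():
    """'Exit': future audits go to whichever auditor is currently most trusted."""
    return max(reputations, key=reputations.get)
```

Because the reports are signed and public, any third party can run the re-checks that feed this downweighting.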
02:17:52.880 | - Is there ways to mess with this?
02:17:53.720 | I mean, I just, let me breathe in and out.
02:17:57.880 | As I mentioned, some of the heaviest shit I've ever read.
02:18:01.480 | So because I visited Ukraine,
02:18:03.240 | I read "Red Famine" by Ann Applebaum, "Bloodlands."
02:18:08.240 | - Yep.
02:18:09.760 | - And it's just a lot of coverage of the census.
02:18:12.200 | I mean, there's a lot of coverage of a lot of things,
02:18:14.600 | but in Ukraine in the 1930s,
02:18:18.040 | Stalin messed a lot with the census to hide the fact
02:18:21.040 | that sort of a lot of people died from starvation.
02:18:25.440 | - And did that with the cooperation
02:18:26.760 | of Arthur G. Sulzberger's New York Times company.
02:18:29.360 | Like Walter Duranty, he falsified all those reports.
02:18:31.840 | - Are there several parties involved?
02:18:33.760 | Can there be several parties involved in this case
02:18:37.160 | that manipulate the truth as it is represented
02:18:42.160 | by the crypto oracle
02:18:44.520 | and as it is checked by the auditing mechanism?
02:18:47.200 | - It is possible, but the more parties that are involved
02:18:49.600 | in falsifying something, the more defections there are.
02:18:52.120 | So that's why you basically have another level of auditing
02:18:57.120 | is fundamentally the answer, right?
02:18:58.640 | And really, I think what it comes back to is
02:19:01.240 | if you're showing your work, right?
02:19:03.240 | This is the difference between crypto economics
02:19:05.200 | and fiat economics.
02:19:06.080 | You know, the Bitcoin blockchain,
02:19:07.400 | anybody can download it and run verification on it, okay?
02:19:10.800 | This is different than government inflation stats,
02:19:13.480 | which people don't believe, right?
02:19:14.840 | Because the process is just, you know,
02:19:17.400 | it is true that CPI methodology is published and so on,
02:19:20.760 | but it is not something which people feel
02:19:23.240 | reflects their actual basket of goods, right?
02:19:26.260 | And so the independent verifiability is really the core
02:19:30.360 | of what true auditability is.
02:19:32.560 | And so then to your question,
02:19:34.240 | it's hard for some group to be able to collude
02:19:36.160 | because the blockchain is public
02:19:37.360 | and everything they've written to it is public.
02:19:39.140 | And so if there's an error,
02:19:40.880 | it's easier in some ways to tell the truth than to lie
02:19:43.080 | because the truth is just naturally consistent
02:19:45.880 | across the world, whereas lies can be found out,
02:19:48.520 | even, you know, statistical tests, you know, Benford's law?
02:19:51.080 | - Yes.
02:19:52.840 | - Right, it's something where the digits in like a real,
02:19:56.460 | if you take the last digit or the first,
02:19:58.280 | I forget if it was the last digit or the first digit,
02:19:59.920 | I think it's first digit, right?
02:20:01.320 | So you take the first digit in an actual financial statement,
02:20:05.240 | you look at the distribution of like how many ones
02:20:08.320 | and how many twos, how many threes, the percentages.
02:20:11.480 | It has actually, you'd guess it might be,
02:20:14.480 | oh, each one will be equally random, it'd be 10%.
02:20:17.240 | It's not like that, actually.
02:20:19.160 | There's a certain distribution that it has
02:20:21.120 | and fake data doesn't look like that, but real data does.
02:20:25.600 | - That's weird.
02:20:26.520 | - It's interesting, right?
02:20:27.360 | - Benford's law, also called the first digit law,
02:20:29.600 | states that the leading digits in a collection of data sets
02:20:33.120 | are probably going to be small.
02:20:36.120 | For example, most numbers in a set,
02:20:39.440 | about 30%, will have a leading digit of one.
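Benford's law, as read out above, can be checked numerically. In this Python sketch both data sets are invented for illustration: log-uniform "natural" data spanning many orders of magnitude tracks the predicted distribution, while data whose leading mantissa is drawn uniformly does not:

```python
import math
import random

def benford_expected(d):
    # Benford's law: P(first digit = d) = log10(1 + 1/d); about 30.1% for d = 1
    return math.log10(1 + 1 / d)

def first_digit_freqs(values):
    """Fraction of positive values whose leading digit is 1..9."""
    counts = [0] * 9
    for v in values:
        counts[int(f"{v:e}"[0]) - 1] += 1  # scientific notation exposes the first digit
    return [c / len(values) for c in counts]

rng = random.Random(42)
# "Natural" multiplicative data spanning roughly 11 orders of magnitude
natural = [math.exp(rng.uniform(0, 25)) for _ in range(50_000)]
# "Fabricated" data: uniformly chosen leading mantissa, so digits come out near 1/9 each
faked = [rng.uniform(1, 10) * 10 ** rng.randrange(6) for _ in range(50_000)]
```

Here `first_digit_freqs(natural)[0]` lands near `benford_expected(1)` (about 0.30), while the faked series sits near 1/9 (about 0.11); that gap is the kind of signal fraud auditors look for.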
02:20:42.280 | - Yeah, so that's a great example
02:20:43.400 | of what we were talking about earlier,
02:20:44.480 | the observational leading to the theory.
02:20:46.840 | - Ooh, there's a Benford's law of controversy.
02:20:48.960 | I'm looking that up.
02:20:50.200 | Benford's law of controversy.
02:20:53.080 | Benford's law of controversy is an adage
02:20:55.080 | from the 1980 novel "Timescape" stating,
02:20:58.120 | "Passion is inversely proportional
02:21:00.040 | "to the amount of real information available."
02:21:03.080 | The adage was quoted in an international drug policy article
02:21:06.480 | in peer-reviewed social science.
02:21:08.080 | Can I just say how much I love Wikipedia?
02:21:10.280 | I have the founder of Wikipedia
02:21:12.320 | coming on this very podcast very soon,
02:21:14.120 | and I think the world is a better place
02:21:17.200 | because Wikipedia exists.
02:21:18.840 | One of the things he wanted to come on and talk about
02:21:21.600 | is the ways that he believes that Wikipedia is going wrong.
02:21:26.360 | - So on technical truths, it's great.
02:21:28.240 | Remember I think earlier on technical truths
02:21:30.720 | versus political truths?
02:21:31.560 | On technical truths, it's great.
02:21:32.880 | On political truths, it's like a defamation engine.
02:21:35.880 | Just as one example, okay?
02:21:37.360 | This is something that I was gonna write up,
02:21:39.280 | but there was a scam called HPZ Token
02:21:43.560 | that managed to edit Wikipedia.
02:21:45.680 | Nobody detected it.
02:21:47.280 | It said that I was the founder of HPZ Token.
02:21:50.240 | - That you were the founder of HPZ Token.
02:21:51.280 | - Yeah, I had nothing to do with this,
02:21:53.960 | and people were scammed out of it
02:21:55.880 | because Google just pushes Wikipedia links
02:21:59.280 | to high on Google, and people are like,
02:22:02.840 | "Well, it's Wikipedia, therefore it's real," right?
02:22:06.240 | Wikipedia has the bio of living persons thing.
02:22:09.200 | They should just allow people to delete their profile
02:22:10.960 | 'cause they have zero quality control on it.
02:22:12.760 | It's literally facilitating fraud, right?
02:22:15.840 | Where people will maliciously edit
02:22:18.120 | and then do things with them,
02:22:19.740 | and nobody cares or is looking at it
02:22:22.520 | beyond the fraudsters, and this is happening.
02:22:24.600 | If that was happening, that was undetected.
02:22:26.120 | I wasn't paying attention to this.
02:22:27.080 | This was there for weeks or months, totally undetected,
02:22:30.860 | that literally facilitated fraud, right?
02:22:32.960 | And fundamentally, the issue is that
02:22:35.040 | Wikipedia doesn't have any concept of who's editing
02:22:38.400 | or property rights or anything like that, right?
02:22:40.800 | It is also something which is,
02:22:42.480 | it used to be something in the early 2000s, mid-2000s,
02:22:45.000 | people said, "Oh, it's Wikipedia,
02:22:46.440 | "how trustworthy it can be, Britannica's reviewed,"
02:22:48.720 | and that's been forgotten,
02:22:49.880 | and now it's become over-trusted, right?
02:22:51.960 | Remember the thing, like, the more trust something gets,
02:22:54.080 | the less trustworthy it often becomes.
02:22:56.160 | It kind of abuses the power, right?
02:22:58.120 | So what I'm interested in,
02:23:01.040 | Google actually had a model a while back called Knol.
02:23:03.840 | Knol, K-N-O-L, was something where
02:23:07.640 | when there were different versions of a Wikipedia-style page,
02:23:12.480 | you had Google Docs-like permissions on them.
02:23:14.800 | For example, you might have 10 different versions
02:23:17.560 | of the Israeli-Palestinian conflict, okay?
02:23:21.560 | And each one had an editor
02:23:23.840 | and folks that they could grant edit rights and so on,
02:23:26.800 | but this way, you would actually be able to see
02:23:29.560 | different versions of a page,
02:23:31.400 | and they might have different versions of popularity,
02:23:33.680 | but this way, you wouldn't have edit wars,
02:23:35.560 | you'd have forks, right?
02:23:37.480 | And they would all kind of coexist,
02:23:39.840 | and then people could review them,
02:23:41.840 | and now you could see different versions of something
02:23:44.860 | versus the thing that just kind of rewards
02:23:48.000 | dogged persistence or being an editor
02:23:50.000 | or something like that.
02:23:51.120 | The other thing is, a lot of the folks
02:23:52.920 | who have editorial privileges of Wikipedia
02:23:55.480 | are there from the early 2000s,
02:23:57.440 | and most of India wasn't online then.
02:24:00.240 | Most of Africa wasn't online then, right?
02:24:02.320 | So there's this inherited power that exists,
02:24:06.480 | which again, was fresh and innovative 10 or 20 years ago,
02:24:10.220 | but it's now kind of outdated.
02:24:13.000 | - Yeah, I wanna see some data, though.
02:24:14.560 | - You wanna see what? - I wanna see some data,
02:24:16.160 | because we can always, I mean, this is,
02:24:20.800 | we often highlight small anecdotal--
02:24:23.480 | - Okay, I'll give you an example.
02:24:24.360 | - Cases, hold on a second.
02:24:26.320 | We often highlight issues in society, in the world,
02:24:30.840 | in anything by taking a specific example,
02:24:35.040 | taking anecdotal data and saying, "There's a problem here."
02:24:37.920 | I wanna know on net how much positive
02:24:40.500 | is being added to the world because of it.
02:24:42.520 | My experience that I try to be empathetic and open-minded,
02:24:46.960 | my exploration of Wikipedia has been such
02:24:49.800 | that it is a breath of fresh air
02:24:53.000 | in terms of the breadth and depth of knowledge that is there.
02:24:58.000 | Now, you can say there's bias built in,
02:25:02.080 | there's wars that are incentivized not to produce truth,
02:25:06.360 | but to produce a consensus around a particular narrative,
02:25:11.000 | but that is how the entirety of human civilization operates,
02:25:15.480 | and we have to see where's it better and where's it worse
02:25:19.120 | in terms of platforms.
02:25:20.240 | - I think Wikipedia was an improvement over what came before
02:25:24.000 | but has a lot of flaws.
02:25:25.680 | You're right that absolutely,
02:25:27.620 | sometimes people can over-fixate on the anecdotal,
02:25:29.960 | but sometimes the anecdotal illustrates a general pattern.
02:25:34.120 | For example, one thing that happens frequently in Wikipedia
02:25:36.960 | is there are editors who will plant a story
02:25:41.960 | and then they will then go and use that story
02:25:44.420 | as like a neutral third party to win an edit war.
02:25:47.840 | So here's a phenomenon that happens in Wikipedia.
02:25:51.520 | You have an editor who's privileged
02:25:54.800 | above just random users, okay,
02:25:56.920 | who will plant a story and then cite that story
02:25:59.040 | as if it was a neutral third party.
02:26:00.520 | So there's a site called Wikipediocracy, okay,
02:26:03.760 | and it discusses the case of a person named Peppermint
02:26:08.280 | who had a name that they didn't want included,
02:26:11.520 | their so-called dead name on their Wikipedia profile.
02:26:14.120 | And there's a Wikipedia editor named Tenebrae
02:26:16.720 | who people allege was a Newsday reporter or writer
02:26:21.720 | that put a piece into Newsday that dead named Peppermint
02:26:27.800 | and then was able to cite it on the Wikipedia article
02:26:30.580 | as if it was like a neutral third party
02:26:32.080 | when it actually wasn't,
02:26:33.100 | when people allege it was the same guy, okay?
02:26:35.500 | Now, that is not an uncommon thing.
02:26:38.320 | That actually--
02:26:39.160 | That's what I want data on.
02:26:40.000 | Okay, I know--
02:26:40.820 | How many articles, I'm not--
02:26:42.640 | Who's auditing--
02:26:43.920 | I'm dancing with you, not against you.
02:26:45.600 | Sure, sure.
02:26:46.440 | I'm saying how many articles have that kind of war
02:26:50.560 | where douchebags are manipulating each other?
02:26:53.520 | So that's the question, what's the audit?
02:26:55.640 | Has Wikipedia actually been audited, right?
02:26:57.640 | Who are the editors?
02:26:58.680 | Like, who's actually writing this stuff?
02:27:00.320 | It is actually something where, again, on technical topics,
02:27:03.440 | I think it's pretty good.
02:27:04.720 | On non-technical topics, there's something called
02:27:07.200 | the Wikipedia Reliable Sources Policy.
02:27:09.360 | It's a fascinating page, okay?
02:27:11.160 | So it actually takes a lot of the stuff
02:27:13.600 | that we have been, you know, the world has been talking about
02:27:16.400 | in terms of what's a reliable source of information
02:27:19.120 | and so on and so forth.
02:27:20.280 | It's called the Wikipedia Reliable Sources,
02:27:22.160 | Perennial Sources, okay?
02:27:23.800 | And if you go to this page, okay,
02:27:25.320 | which I'm just gonna send to you now, all right,
02:27:27.360 | you will literally see every media outlet in the world
02:27:31.320 | and they're colored gray, green, yellow, or red, okay?
02:27:36.320 | And so red is like untrustworthy, green is trustworthy,
02:27:40.320 | yellow is like neutral, okay?
02:27:42.680 | Now, this actually makes Wikipedia's epistemology explicit.
02:27:47.040 | They are marking a source as trustworthy or untrustworthy.
02:27:50.320 | For example, you are not allowed to cite social media
02:27:52.800 | on Wikipedia, which is actually an enormous part
02:27:55.440 | of what people are posting.
02:27:56.840 | Instead, you have to cite a mainstream media outlet
02:28:00.760 | that puts the tweets in the mainstream article
02:28:04.560 | and only then can it be cited in Wikipedia.
02:28:06.640 | - By the way, to push back, this is a dance.
02:28:09.320 | We're dancing. - Sure, sure, sure, sure.
02:28:10.960 | - That those are rules written on a sheet of paper.
02:28:14.280 | I have seen Wikipedia in general play in the gray area
02:28:18.600 | that these rules create.
02:28:20.160 | - Oh, well, if you are an editor, then you can get--
02:28:23.440 | - So you can use the rules and you can,
02:28:26.000 | because there's a lot of contradictions within the rules,
02:28:28.960 | you can use them to, in the ways you said,
02:28:32.000 | to achieve the ends you want.
02:28:33.320 | It really boils down to the incentives,
02:28:37.800 | the motivations of the editors.
02:28:40.160 | And one of the magical things about Wikipedia,
02:28:42.960 | the positive versus the negative,
02:28:44.720 | is that it seems like a very small number of people,
02:28:47.560 | same with Stack Overflow, can do an incredible amount
02:28:51.880 | of good editing and aggregation of good knowledge.
02:28:56.880 | Now, as you said, that seems to work much better
02:29:03.000 | for technical things over which there's not
02:29:05.400 | a significant division.
02:29:08.960 | Some of that has to do less with the rules
02:29:11.400 | and more with the human beings involved.
02:29:15.160 | - Well, but here's the thing is,
02:29:17.040 | so first, let me take this,
02:29:17.960 | I should finish off this point
02:29:19.000 | with reliable source, perennial sources, right?
02:29:20.800 | So if you go to this, you'll see that Al Jazeera
02:29:24.920 | is marked green, but let's say the Cato Institute
02:29:29.560 | is marked yellow, right?
02:29:32.200 | The Nation is marked green. - Oh, shit.
02:29:34.680 | Oh, snap.
02:29:35.840 | Right, okay, sure, yes.
02:29:37.680 | - The Nation is marked green,
02:29:38.720 | but National Review is marked yellow, okay?
02:29:41.160 | You could probably go and do,
02:29:42.600 | so what's good about this is it makes
02:29:43.920 | the epistemology explicit, right?
02:29:46.920 | You could actually take this table,
02:29:48.960 | and you could also look at all the past edit wars
02:29:51.240 | and so on over it, and take a look at what things
02:29:53.560 | are starting to get marked as red or yellow
02:29:55.760 | and what things are starting to get marked as green,
02:29:57.760 | and I'm pretty sure you're gonna find
02:29:59.080 | some kind of partisan polarization
02:30:00.520 | that comes out of it, right, number one.
02:30:02.960 | Number two is once something gets marked
02:30:06.400 | as being yellow or red, then all links
02:30:10.720 | and all references to it are pulled out.
02:30:12.160 | For example, Coindesk, okay, was marked
02:30:16.040 | as being like, gosh, what was it?
02:30:17.880 | - Yellow?
02:30:18.720 | - I think it's marked as red.
02:30:22.040 | Coindesk, which is actually like--
02:30:23.360 | - I get a lot of useful information from Coindesk.
02:30:25.280 | - That's right, but it's marked as red, why?
02:30:26.960 | Because there's some Wikipedia editors
02:30:29.040 | who hate cryptocurrency, and so cryptocurrency
02:30:31.760 | on Wikipedia has been a huge topic
02:30:34.000 | where they've just edited out all the positive stuff,
02:30:35.960 | and these are senior editors of Wikipedia
02:30:37.840 | who can control what sources are considered reliable.
02:30:41.120 | So they've now knocked out Coindesk,
02:30:42.960 | they've knocked out social media.
02:30:45.060 | They only allow mainstream media coverage,
02:30:47.440 | and not even all mainstream media,
02:30:48.840 | only those they've marked as green.
02:30:50.960 | This is the manipulation of consensus.
02:30:52.520 | - I wanna know how many articles are affected by it,
02:30:55.760 | and on that-- - Hundreds of thousands.
02:30:57.080 | Hundreds of thousands.
02:30:57.960 | - You could just say that randomly.
02:31:00.600 | - I can, I can. - No, no, no, no, no.
02:31:01.680 | - I can, because all the-- - They're affected,
02:31:02.840 | there's different levels of effect
02:31:04.440 | in terms of it actually having a significant impact
02:31:07.400 | on the quality of the article.
02:31:08.520 | - Let me give you an example.
02:31:09.440 | Let me give you an example, right?
02:31:12.040 | The fact that people cannot cite direct quotes
02:31:16.760 | on social media, but can only cite the rehash
02:31:20.440 | of those quotes in a mainstream media outlet,
02:31:22.320 | and not just any mainstream media outlet,
02:31:24.180 | but those that are colored green
02:31:25.800 | on the Wikipedia reliable perennial sources policy,
02:31:29.080 | is a structural shift on every single article
02:31:32.320 | to make Wikipedia align with
02:31:34.280 | US mainstream media corporations, right?
02:31:36.920 | - I am, as often, playing devil's advocate,
02:31:41.320 | to counter a point so that the disagreement
02:31:44.920 | reveals some profound wisdom.
02:31:48.760 | That's what I'm doing here.
02:31:50.800 | But also in that task here, I'm trying to understand
02:31:55.800 | exactly how much harm is created by the bias
02:32:02.120 | within the team of editors that we're discussing,
02:32:06.400 | and how much of Wikipedia is technical knowledge.
02:32:10.360 | For example, the Russian invasion of Ukraine.
02:32:16.400 | The Wikipedia article I've seen there,
02:32:21.640 | now that changes very aggressively a lot,
02:32:24.500 | and I hear from every side on this,
02:32:29.440 | but it did not seem biased to me.
02:32:32.140 | As compared to mainstream media in the United States.
02:32:39.160 | - So now I'm gonna sound extremely woke, okay?
02:32:42.100 | If you go and look at this, all right?
02:32:45.840 | Times of India is yellow, but Mother Jones,
02:32:49.680 | Jacobin, okay, they are green, right?
02:32:53.180 | So a niche, mostly white, Western, partisan left outlet,
02:32:59.140 | is marked green, but a billion people,
02:33:02.620 | like the Times of India is marked yellow, right?
02:33:06.660 | That's a structural bias towards Western media outlets
02:33:11.540 | and Western editors when much of the rest of the world
02:33:15.060 | hadn't gotten online or whatever.
02:33:16.220 | - I would just love to see, in terms of the actual article,
02:33:20.700 | what ideas are being censored, altered, shifted.
02:33:27.340 | I would love, I just think it's an open, I'm not sort of--
02:33:31.180 | - So the edit logs are there, edit logs are public.
02:33:32.900 | - Yeah, I would be fascinated, yeah.
02:33:34.940 | Is there a way to explore the way
02:33:37.180 | that narratives are shifted because of--
02:33:39.100 | - Sure, so a very simple one is,
02:33:41.480 | if you were to pull all the edit logs of Wikipedia,
02:33:44.560 | you could see how many times
02:33:46.780 | are social media links disallowed, okay?
02:33:51.020 | Like, first of all, think about it like this.
02:33:52.780 | How many, I mean, just the fact that social media
02:33:55.540 | is not allowed to be cited on Wikipedia
02:33:57.940 | or inconsistently allowed.
02:33:58.780 | - You think that's a problem?
02:33:59.900 | - It's a huge problem.
02:34:00.740 | You can't cite, let's say Jeff Bezos' own tweet.
02:34:04.060 | You have to cite some random media corporation.
02:34:06.900 | - Here's the thing, and sorry if I'm interrupting.
02:34:09.820 | - Please.
02:34:10.660 | - Hopefully I'm adding to it.
02:34:11.500 | I think they're trying to create friction
02:34:15.260 | as to the sources used because if you can use social media,
02:34:20.260 | then you can use basically bots
02:34:23.220 | to create a bunch of sources, right?
02:34:25.700 | And then you can almost automate the editor war, right?
02:34:30.340 | - Here's the thing, is basically Wikipedia initially,
02:34:33.500 | like said, oh, we'll only cite mainstream media
02:34:37.220 | as a way of boosting its credibility in the early 2000s,
02:34:42.100 | okay, when its credibility was low.
02:34:44.020 | Now it's sort of become merged with the US establishment
02:34:49.020 | and it only cites these things whose trust,
02:34:51.700 | I mean, have you seen the graphs on trust in mainstream media?
02:34:54.900 | Like it's plummeted.
02:34:55.780 | It's down to like 10% or something like that, right?
02:34:58.020 | So the most trusted sources for Wikipedia
02:35:00.700 | are untrusted by the population.
02:35:02.660 | - Yeah. - True?
02:35:03.500 | - That feels like it's a fixable technological problem.
02:35:06.700 | I think I'm under-informed and my gut says
02:35:10.220 | we're both together under-informed.
02:35:12.300 | I do a rigorous three to four hour discussion
02:35:14.620 | about Wikipedia.
02:35:15.460 | But hold on a second.
02:35:16.300 | I think I have a gut sort of developed
02:35:22.180 | feeling about which articles not to trust on Wikipedia.
02:35:25.540 | I think I need to make that explicit also.
02:35:28.380 | I have a kind of an understanding
02:35:30.420 | that you don't go to Wikipedia for this particular topic.
02:35:34.740 | Like don't go to Wikipedia for an article
02:35:37.540 | on Donald Trump or Joe Biden.
02:35:38.820 | There's going to be, if I did,
02:35:41.060 | I would go to maybe sections that don't have room
02:35:45.460 | for insertion of bias or like the section on controversy
02:35:49.220 | or accusations of racism or so on or sexual assault.
02:35:53.580 | I usually don't trust Wikipedia on those sections.
02:35:57.260 | - Like math, that'll be great, right?
02:35:59.420 | Wikipedia's great for that.
02:36:00.820 | On many topics that do not have a single consensus truth,
02:36:05.820 | it's structurally shifted towards
02:36:08.780 | basically white Western liberals, woke whites, right?
02:36:13.980 | Fundamentally, that's the demographic of the Wikipedia.
02:36:16.100 | - What kind of articles do you think are affected by this?
02:36:18.860 | Let's think about it.
02:36:20.100 | - Everything that's not math and technology.
02:36:22.660 | - I think that's too strong a statement.
02:36:24.020 | So we can, like I said, war in Ukraine.
02:36:28.180 | - Sure.
02:36:30.500 | - I think that's too strong a statement.
02:36:32.380 | There's so much, I guess I'm saying affected
02:36:37.260 | to a large degree.
02:36:39.220 | Even major battles in history, Battle of Stalingrad.
02:36:43.820 | - Sure.
02:36:45.260 | - That's not math.
02:36:46.660 | So you think all of that is affected to a point
02:36:50.460 | where it's not a trusted source?
02:36:53.100 | - Absolutely.
02:36:53.940 | If you look at the edit wars, for example,
02:36:55.460 | on Stalin versus Hitler,
02:36:56.820 | the tone on Hitler starts out legitimately and justifiably
02:37:01.460 | as basically genocidal, maniacal dictator.
02:37:04.540 | With Stalin, there's a fair number of Stalin apologists
02:37:07.380 | that edit out mention of genocide
02:37:09.260 | from the first few paragraphs.
02:37:10.820 | - I am playing devil's advocate in part,
02:37:14.020 | but I also am too under-informed to do the level of defense
02:37:18.500 | I would like to provide for the wisdom that is there,
02:37:22.260 | for the knowledge that is there.
02:37:23.620 | I don't wanna use the word truth,
02:37:25.020 | but for some level of knowledge that is there in Wikipedia,
02:37:30.020 | I think I really worry about,
02:37:33.940 | I know you don't mean this,
02:37:35.020 | but a cynical interpretation of what you're saying,
02:37:38.100 | which is don't trust anything written on Wikipedia.
02:37:42.020 | I think you're being very consistent and eloquent
02:37:44.660 | in the way you're describing the issues of Wikipedia,
02:37:46.940 | and I don't have enough actual specific examples
02:37:51.940 | to give where there is some still battle for truth
02:37:58.420 | that's happening that's outside of the bias of society.
02:38:02.220 | I just, I think if we naturally distrust
02:38:05.100 | every source of information,
02:38:07.100 | there is a general distrust of institutions
02:38:12.100 | and a distrust of social knowledge
02:38:16.900 | that leads to an apathy and a cynicism
02:38:20.500 | about the world in general.
02:38:22.420 | If you believe a lot of conspiracy theories,
02:38:26.820 | you basically tune out from this collective journey
02:38:30.020 | that we're on towards the truth,
02:38:32.060 | and that's, it's not even just Wikipedia.
02:38:35.660 | I just think Wikipedia was, at least for a time,
02:38:37.900 | and maybe I tuned out,
02:38:39.580 | maybe because I am too focused on computer science
02:38:42.140 | and engineering and mathematics,
02:38:45.340 | but to me, Wikipedia for a long time
02:38:47.500 | was a source of calm escape
02:38:51.980 | from the political battles of ideology.
02:38:55.780 | And as you're quite eloquently describing,
02:38:59.060 | it has become part of the battleground
02:39:03.480 | of political ideology.
02:39:05.100 | I just would love to know where the boundaries of that are.
02:39:08.420 | - Glenn Greenwald has observed this.
02:39:10.260 | Lots of other folks, for example,
02:39:13.540 | I'm definitely not the only person who's observed
02:39:15.260 | that Wikipedia-- - A lot of,
02:39:16.380 | let me just state, because I'm sensing this,
02:39:18.780 | and because of your eloquence and clear brilliance here,
02:39:21.860 | that a lot of people are going to immediately agree with you.
02:39:25.340 | - Okay.
02:39:26.180 | - And this is what I am also troubled by.
02:39:29.020 | This is not you, but I often see that people
02:39:33.780 | will detect cynicism,
02:51:36.300 | especially when it is phrased as eloquently
02:51:38.820 | as yours is, and will look at a natural dumbass like me
02:39:43.820 | and think that Lex is just being naive.
02:39:48.180 | Look at him trusting Wikipedia--
02:39:49.980 | - Let me argue your side. - For his mainstream narrative.
02:39:51.420 | - Let me argue your side, okay?
02:39:52.900 | - Can you please do that,
02:39:53.740 | 'cause you could do that better than me?
02:39:54.940 | - No, no, no, no, Lex, I enjoy talking to you.
02:39:57.060 | - And I'm doing devil's advocate a little bit,
02:39:58.900 | 'cause I do really want to be,
02:40:01.180 | I am afraid about the forces that are basically editors
02:40:06.180 | of authority of talking down to people
02:40:08.340 | and censoring information.
02:40:09.860 | - Yeah, so let me first argue your side,
02:40:11.580 | and then let me say something, okay?
02:40:12.700 | Which is, what you are reacting to is,
02:40:17.700 | oh, even those things I thought of as constants
02:40:21.620 | are becoming variables.
02:40:22.500 | Where is the terra firma?
02:40:24.020 | If we cannot trust anything,
02:40:26.500 | then everybody's just, it's anarchy and it's chaos,
02:40:30.100 | like there's literally no consensus reality,
02:40:31.940 | and anybody can say anything, and so on and so forth, right?
02:40:34.900 | And I think that there's two possible deviations from,
02:40:38.980 | let's say that the mainstream,
02:40:42.180 | obviously people talk about QAnon, for example,
02:40:44.340 | as this kind of thing, where people just make things up.
02:40:48.740 | They just go totally,
02:40:50.140 | supply chain independent from mainstream media.
02:40:52.940 | And if mainstream media is a distorted gossamer
02:40:57.220 | of quasi-truth, these guys go to just total fiction,
02:41:00.460 | as opposed to like, right?
02:41:01.900 | The alternative to QAnon is not BlueAnon, mainstream media,
02:41:06.900 | but Satoshianon, okay?
02:41:09.780 | Which is an upward deviation, okay?
02:41:11.740 | Not a downward deviation to say
02:41:12.940 | there is no such thing as truth,
02:41:14.300 | but rather the upward deviation
02:41:15.660 | is decentralized cryptographic truth,
02:41:18.360 | not centralized corporate or government truth, okay?
02:41:22.340 | - So how does the decentralization of Wikipedia look like?
02:41:25.540 | - Great question.
02:41:26.380 | It's this concept of the ledger of record.
02:41:27.980 | First, whether you're Israeli or Palestinian,
02:41:30.340 | Japanese or Chinese, Democrat or Republican,
02:41:32.920 | those people agree on the state of the Bitcoin blockchain.
02:41:37.260 | Hundreds of billions of dollars is managed without weapons,
02:41:40.780 | okay, across tribes with wildly varying ideologies, right?
02:41:45.780 | And what that means is that is a mechanism
02:41:50.360 | for getting literally consensus.
02:41:52.340 | It's called consensus, cryptographic consensus,
02:41:54.380 | proof of work.
02:41:56.020 | And when people can get consensus on this,
02:41:58.180 | what they're getting consensus on are basically bytes
02:42:00.860 | that determine who holds what Bitcoin.
02:42:02.860 | This is exactly the kind of thing
02:42:03.900 | people would fight wars over.
02:42:05.500 | You know, for hundreds of billions of dollars,
02:42:06.980 | let alone millions of dollars,
02:42:08.180 | people will kill each other over that in the past, right?
02:42:11.220 | So for hundreds of billions of dollars,
02:42:12.320 | people can get consensus truth on this
02:42:14.420 | in this highly adversarial environment, right?
02:42:16.940 | So the first generalization of that is it says,
02:42:20.340 | you can go from bytes that reflect
02:42:22.000 | what Bitcoin somebody has,
02:42:24.740 | to bytes that reflect what stocks, bonds,
02:42:26.980 | other kinds of assets people have.
02:42:28.460 | That's the entire DeFi, Ethereum, that whole space, okay?
02:42:32.620 | Basically the premise is if you go from consensus
02:42:34.660 | on one byte by induction,
02:42:36.900 | you can go to consensus on N bytes,
02:42:38.360 | depending on the cost of getting that consensus, right?
02:42:40.540 | And almost anything digital can be represented,
02:42:43.180 | you know, everything digital can be represented as bytes,
02:42:45.340 | right?
02:42:46.180 | So now you can get consensus
02:42:47.320 | on certain kinds of digital information, Bitcoin,
02:42:50.360 | but then also any kind of financial instrument.
02:42:52.620 | And then the next generalization is
02:42:54.460 | what I call the ledger of record.
02:42:57.940 | Many kinds of facts can be put partially
02:43:02.940 | or completely on chain.
02:43:04.680 | It's not just proof of work and proof of stake.
02:43:07.260 | There's things like proof of location,
02:43:09.220 | proof of human, proof of this, proof of that.
02:43:11.420 | The auditable oracles I talked about extended further.
02:43:14.140 | Lots and lots of people are working on this, right?
02:43:16.040 | Proof of solvency,
02:43:17.220 | seeing that some actor has enough of a bank balance
02:43:21.540 | to accommodate what they say they accommodate.
02:43:23.380 | You can imagine many kinds of digital assertions
02:43:26.140 | can be turned into proof of X and proof of Y.
02:43:28.500 | You start putting those on chain,
02:43:29.580 | you now have a library of partially
02:43:32.380 | or completely provable facts, okay?
02:43:35.060 | This is how you get consensus.
02:43:37.500 | As opposed to having a white Western Wikipedia editor
02:43:42.500 | or mostly white Western US media corporation
02:43:47.840 | or the US government simply say what is true
02:43:51.960 | in a centralized fashion.
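One minimal building block behind such "proof of X" schemes is a hash commitment: publish only a digest of an assertion, then later reveal the assertion for anyone to check against it. A sketch with made-up field names, not any particular protocol:

```python
import hashlib
import json

def commit(fact: dict) -> str:
    """Digest of a canonicalized assertion; only this digest would go on chain."""
    blob = json.dumps(fact, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def verify_reveal(fact: dict, published_digest: str) -> bool:
    """Later, anyone can check a revealed assertion against the commitment."""
    return commit(fact) == published_digest
```

Real proof-of-solvency designs add signatures and Merkle trees on top, but the commit-then-verify shape is the same.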
02:43:53.800 | - So do you think truth is such an easy thing
02:43:58.520 | as you get to higher and higher questions of politics?
02:44:03.060 | Is the problem that the consensus mechanism is being hacked
02:44:07.760 | or is the problem that truth is a difficult thing
02:44:09.960 | to figure out?
02:44:11.000 | Was the 2020 election rigged or not?
02:44:13.880 | Is the earth flat or not?
02:44:15.840 | That's a scientific one.
02:44:17.000 | That's how this is--
02:44:17.840 | - My technical versus political truth spectrum, yeah.
02:44:19.960 | - But even the earth, like, well, that one is,
02:44:23.520 | yeah, nevermind, that's a bad example
02:44:25.160 | because that is very,
02:44:27.160 | you can rigorously show that the earth is not flat.
02:44:29.680 | But what, there's some social phenomena,
02:44:34.400 | political phenomena, philosophical one,
02:44:36.320 | that will have a lot of debates, historical stuff,
02:44:39.440 | about the different forces operating within Nazi Germany
02:44:46.700 | and Stalinist Soviet Union.
02:44:50.080 | I think there's probably a lot of,
02:44:52.720 | yeah, like, historians debate about a lot of stuff,
02:44:56.180 | like Blitz, the book that talks about the influence
02:45:01.180 | of drugs in the Third Reich.
02:45:03.200 | - Were they on meth or something?
02:45:04.240 | - Yeah, there's a lot of debates about how truth,
02:45:07.280 | what is the significance of meth on the actual behavior
02:45:11.240 | and decisions of Hitler and so on.
02:45:13.520 | So there's still a lot of debates.
02:45:15.340 | Is it so easy to fix with decentralization,
02:45:20.040 | I guess is the question.
02:45:21.300 | - So I actually have, like, basically chapter two
02:45:23.760 | of the "Network State" book is on essentially this topic.
02:45:26.400 | And so it's like 70 pages or something like that.
02:45:29.080 | So let me try to summarize what I think about on this.
02:45:32.960 | The first is that there was an Onion article that came out,
02:45:36.160 | I can't find it now anymore,
02:45:37.920 | but it was about historians in the year 3000
02:45:41.480 | writing about the late '90s and early 2000s.
02:45:43.840 | And they're like, clearly Queen Brittany
02:45:47.920 | was a very powerful monarch.
02:45:51.520 | We can see how many girls around the world
02:45:55.680 | worshipped her like a god.
02:45:57.160 | And so, and it was very funny
02:45:58.700 | because it was a plausible distortion
02:46:02.740 | of the current society by a human civilization
02:46:07.740 | picking through the rubble a thousand years later,
02:46:11.340 | having no context on anything, right?
02:46:13.600 | And it was a very thought provoking article
02:46:15.240 | because it says, well, to what extent is that us
02:46:19.640 | picking over Pompeii or the pyramids,
02:46:22.800 | or even like the 1600s or the 1700s,
02:46:25.440 | like a few hundred years ago,
02:46:26.960 | we're basically sifting through artifacts.
02:46:29.240 | And Samo Burja actually has this concept,
02:46:32.680 | which is obvious, but it's also useful to have a name for it.
02:46:36.240 | It's like, I think he calls it like dark history,
02:46:38.120 | which is, and again, I might be getting this wrong,
02:46:40.640 | but it's like only a small percentage
02:46:42.800 | of what the Greeks wrote down,
02:46:45.400 | has come to us to the present day, right?
02:46:47.360 | So perhaps it's not just the winners who write history,
02:46:51.040 | it's like the surviving records.
02:46:52.280 | We have this extremely partial,
02:46:55.400 | fragmentary record of history.
02:46:57.840 | And sometimes there's some discovery
02:46:59.240 | that rewrites the whole thing.
02:47:00.080 | Do you know what like Gobekli Tepe is?
02:47:02.300 | - Everything I know about that is from Rogan
02:47:04.320 | 'cause he's a huge fan of that kind of stuff.
02:47:06.160 | - Yeah, so that like rewrites.
02:47:07.680 | - And then there's a lot of debates there.
02:47:09.360 | - There's a lot of debates.
02:47:10.200 | And I think it's like the discovery of this site
02:47:11.560 | in Northern Turkey that totally shifts our estimate
02:47:15.000 | of like when civilization started,
02:47:17.360 | maybe pushing it back many thousands of years further
02:47:19.520 | in the past, right?
02:47:20.840 | The past, it's like an inverse problem in physics, right?
02:47:24.300 | We're trying to reconstruct this from limited information,
02:47:27.920 | right?
02:47:28.760 | It's like X-ray crystallography,
02:47:29.580 | it's an inverse problem, right?
02:47:31.240 | It's Plato's cave.
02:47:32.440 | We're trying to reconstruct what the world looks like
02:47:34.480 | outside from the shadows,
02:47:35.600 | these fragments that have been given to us, right?
02:47:39.580 | Or that we've found.
02:47:41.000 | And so in that sense, as you find more information,
02:47:44.280 | your estimate of the past changes, right?
02:47:46.560 | Oh, wow, okay, that pushes back civilization
02:47:48.480 | farther than we thought.
02:47:49.320 | That one discovery just changes it.
02:47:50.160 | - So you want to try to,
02:47:51.840 | given all the gaps in the data we have,
02:47:55.080 | you want to try to remove bias
02:47:57.560 | from the process of trying to fill the gaps.
02:48:01.160 | - Well, so here's the thing.
02:48:02.600 | I think we're very close to the moment of it.
02:48:04.520 | And so that's why it'll sound crazy when I say it now.
02:48:06.540 | But our descendants,
02:48:09.300 | I really do think of what the blockchain is
02:48:13.500 | and cryptographically verifiable history
02:48:15.540 | as being the next step after written history.
02:48:18.340 | It's like on par with that.
02:48:20.080 | Because anybody who has the record,
02:48:22.140 | the math is not gonna change, right?
02:48:24.060 | Math is constant across human time and space, right?
02:48:27.220 | So, you know, the value of pi is constant.
02:48:29.420 | That's one of the few constants
02:48:30.560 | across all these different human civilizations, okay?
02:48:33.740 | So somebody in the future,
02:48:36.840 | assuming of course the digital record
02:48:38.360 | is actually intact to that point,
02:48:39.880 | because in theory digital stuff will persist.
02:48:43.500 | In practice, you have lost data and floppy drives
02:48:46.880 | and stuff like that.
02:48:47.720 | In a sense, in some ways digital is more persistent,
02:48:49.440 | in some ways physical is more persistent, okay?
02:48:51.240 | But assuming we can figure out
02:48:52.200 | the archival problem somehow,
02:48:54.880 | then this future record,
02:48:57.480 | at least it's internally consistent, right?
02:48:59.760 | You can run a bunch of the equivalents of checksums, right?
02:49:02.760 | The Bitcoin verification process,
02:49:04.560 | just sum it all up and see that,
02:49:06.320 | okay, it's F of G of H of X,
02:49:08.440 | and boom, that at least is internally consistent, okay?
02:49:11.680 | Again, it doesn't say that
02:49:13.320 | all the people who reported it were,
02:49:16.200 | you know, they could have put something on chain
02:49:17.480 | that's false, but at least you know the metadata
02:49:19.200 | is likely to be very difficult to falsify.
02:49:21.160 | And this is a new tool.
02:49:22.560 | It's really a new tool in terms of a robust history
02:49:26.240 | that is expensive and technically challenging
02:49:28.900 | to edit and alter.
02:49:30.200 | And that is the alternative to the Stalin-esque
02:49:32.640 | rewriting of history by centralized power.
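The "internally consistent record" idea here can be sketched as a hash chain, a simplification of what Bitcoin verification does. Illustrative code, not the actual Bitcoin data structures:

```python
import hashlib

def make_block(prev_hash: str, payload: str) -> dict:
    """Each block commits to its payload and to the previous block's hash."""
    digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    return {"prev": prev_hash, "payload": payload, "hash": digest}

def verify_chain(chain: list) -> bool:
    """Recompute every hash and link; editing any record breaks the check."""
    for i, block in enumerate(chain):
        recomputed = hashlib.sha256(
            (block["prev"] + block["payload"]).encode()
        ).hexdigest()
        if block["hash"] != recomputed:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

A falsified payload could still have been recorded honestly at the time, as noted above, but rewriting it after the fact is detectable.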
02:49:34.720 | - Yeah, I'm gonna have to do a lot of actually reading
02:49:37.560 | and thinking about,
02:49:39.120 | I'm actually, as you're talking,
02:49:40.680 | I'm also thinking about the fact that I think
02:49:43.000 | 99% of my access to Wikipedia is on technical topics,
02:49:48.000 | 'cause I basically use it very similarly to Stack Overflow.
02:49:53.200 | - And even there, it doesn't have unit tests.
02:49:54.800 | For example, one thing-
02:49:56.320 | - That's a good way to put it.
02:49:57.400 | - Right, so one thing I remember,
02:49:58.920 | again, I might be wrong on this,
02:49:59.960 | but I recall that the Kelly criterion
02:50:02.120 | it's actually quite a useful thing to know.
02:50:04.360 | It's like how to optimally size your bets.
02:50:06.480 | And you can have, given your kind of probability
02:50:10.840 | that some investment pays off or assumed probability,
02:50:13.120 | you can have bets that are too large,
02:50:15.040 | bets that are too small.
02:50:16.200 | Sometimes the Kelly criterion, it goes negative
02:50:18.520 | and actually it says you should actually take leverage.
02:50:20.800 | You're so sure this is a good outcome
02:50:22.600 | that you should actually spend more
02:50:23.840 | than your current bankroll
02:50:24.760 | because you're gonna get a good result, right?
02:50:26.360 | So it's a very sophisticated thing.
02:50:28.080 | And as I recall, many sites on the internet
02:50:30.320 | have the wrong equation.
02:50:33.120 | And I believe that was reprinted on Wikipedia.
02:50:34.920 | The wrong equation was put on Wikipedia
02:50:36.520 | as the Kelly criterion for a while.
02:50:38.080 | - That's funny.
02:50:38.920 | - Okay, and so without unit tests,
02:50:41.880 | see math is actually the kind of thing
02:50:43.040 | that you could unit test, right?
02:50:44.200 | You could literally have the assert
02:50:45.720 | on the right-hand side today, right?
02:50:47.480 | The modern version, we've got Jupyter,
02:50:50.000 | we've got Replit, we've got all these things.
02:50:52.240 | The modern version of Wikipedia,
02:50:54.360 | there's sites like golden.com, for example,
02:50:58.680 | there's a bunch of things.
02:50:59.520 | I'm funding lots of stuff across the board on this.
02:51:02.480 | And whether I'm capitalizing these companies
02:51:05.400 | or they're capitalized independently,
02:51:06.360 | but I'm trying to see if,
02:51:07.720 | not just talk about a better version.
02:51:09.560 | It's hard to build something better.
02:51:10.640 | So actually go and build it.
02:51:12.160 | And where you want is assertions
02:51:14.040 | that are actually reproduced.
02:51:15.280 | You don't just have the equation there.
02:51:17.280 | You have it written down in code.
02:51:18.200 | You can hit enter, you can download the page,
02:51:19.720 | you can rerun it.
02:51:20.680 | It's reproducible.
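As a sketch of what such a unit-tested encyclopedia entry could look like, here is the standard Kelly formula with an executable assertion attached; the numeric values are illustrative, not from the conversation:

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly criterion: optimal fraction of bankroll to bet.

    p: probability of winning; b: net odds (profit per unit staked).
    f* = (b * p - q) / b, where q = 1 - p.
    """
    q = 1.0 - p
    return (b * p - q) / b

# The page's own "unit test": a reader can download the page and rerun this.
assert abs(kelly_fraction(0.6, 1.0) - 0.2) < 1e-9  # 60% win, even odds -> bet 20%
assert kelly_fraction(0.5, 1.0) == 0.0             # no edge -> bet nothing
```

A page carrying a wrong equation would fail its own assertions the moment anyone hit enter.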
02:51:21.720 | - So the problem with that kind of reproducibility
02:51:24.240 | is that it adds friction.
02:51:25.280 | It's harder to put together articles
02:51:26.640 | that do that kind of stuff,
02:51:28.080 | unless you do an incredible job with UX and so on.
02:51:31.360 | The thing that I think is interesting about Wikipedia
02:51:34.120 | on the technical side is that without the unit tests,
02:51:38.160 | without the assertions,
02:51:40.080 | it still often does an incredible job
02:51:42.720 | because the reason it's,
02:51:44.520 | the people that write those articles,
02:51:46.400 | and I've seen this also in Stack Overflow,
02:51:48.720 | is are the people that care about this most.
02:51:51.240 | And there's a pride to getting it right.
02:51:54.440 | - Okay, so let me agree and disagree with that.
02:51:58.120 | So absolutely, there's some good there.
02:52:00.840 | There's, I mean, again,
02:52:02.600 | do I think Wikipedia is a huge step up
02:52:04.000 | from what preceded it in some ways on the technical topics?
02:52:08.200 | However, you're talking about the editing environment.
02:52:10.600 | Like the markup for Wikipedia,
02:52:12.200 | it's very mid 2000s.
02:52:14.960 | It is not--
02:52:15.800 | - It's a Craigslist.
02:52:16.720 | - Yeah, exactly.
02:52:17.560 | At a minimum, for example, it's not WYSIWYG.
02:52:20.240 | So like Medium or something like that,
02:52:23.720 | you know, or Ghost,
02:52:25.560 | you can just go in and type
02:52:26.600 | and it looks exactly like it looks on the page.
02:52:28.440 | Here, you have to go to a markup language
02:52:31.760 | where there can be editor conflicts
02:52:33.800 | and you hit enter and someone has written over your edit
02:52:36.080 | or something like that.
02:52:37.200 | And you don't know how it looks on the page.
02:52:38.480 | You might have to do a few, you know,
02:52:39.640 | previews or what have you.
02:52:40.600 | So number one, so editing,
02:52:42.480 | you talk about barriers to editing, that's the thing.
02:52:45.000 | Number two is, given that it might be read a thousand times
02:52:49.240 | for every one time it's written,
02:52:51.280 | it is important to actually have
02:52:52.880 | the mathematical things unit tested, if they can be,
02:52:55.280 | given that we've got modern technology.
02:52:57.040 | And that's something that's hard to like retrofit into this
02:52:59.200 | because it's so kind of ossified, right?
02:53:01.760 | - Right, there's the interface on every side for the editor,
02:53:05.000 | even just for the editor to check that they're,
02:53:08.160 | say the editor wants to get it right,
02:53:09.960 | we make it, we wanna make it really,
02:53:12.640 | or not really easy, but easier to check their work.
02:53:15.680 | - That's right. - Like debugging,
02:53:16.520 | like a nice ID for the--
02:53:19.840 | - That's exactly right.
02:53:20.680 | - Editing experience.
02:53:21.520 | - That's right, and the thing about this is,
02:53:24.160 | as I said, because the truth is a global constant,
02:53:27.360 | but like incorrectness, you know, right, go ahead.
02:53:30.200 | Every happy family.
02:53:31.640 | - I love to think that like truth will have a nice debugger.
02:53:36.120 | - Well, so here's, right?
02:53:37.800 | So the thing is that what you can do is,
02:53:42.120 | let's say you did have like a unit tested page
02:53:44.040 | for everything that's on Wikipedia.
02:53:45.240 | First of all, it makes a page more useful
02:53:47.200 | because you can download it, you can run it,
02:53:48.600 | you can import it and so on.
02:53:50.000 | Second is it leads into,
02:53:51.560 | one of the things that we can talk about,
02:53:53.480 | I've sort of like a roadmap for building alternatives
02:53:56.160 | to not just existing companies,
02:53:58.680 | but to many existing US institutions
02:54:01.600 | from media and tech companies to courts and government
02:54:05.800 | and, you know, academia and nonprofits.
02:54:08.840 | The Wikipedia discussion actually relates
02:54:10.920 | to how you improve on academia, right?
02:54:14.800 | And so academia right now, one of the big problems,
02:54:17.000 | this is kind of related to the, oh boy,
02:54:19.320 | okay, the current institutions,
02:54:20.560 | we don't have trust in them.
02:54:21.400 | Is that the answer is,
02:54:22.240 | is that the answer to trust no one, right?
02:54:24.200 | And I think the alternative is decentralized cryptographic
02:54:26.800 | trust or verification.
02:54:28.120 | How does that apply to academia?
02:54:30.520 | First observation is we are seeing science
02:54:35.440 | being abused in the name of quote unquote science, okay?
02:54:40.760 | Capital S science is Maxwell's equations.
02:54:44.120 | That's- - That's the good one.
02:54:45.360 | - That's a good one, right?
02:54:46.800 | Quote unquote science is a paper that came out last week.
02:54:50.520 | And the key thing is that capital S science, real science,
02:54:54.720 | is about independent replication, not prestigious citation.
02:54:58.460 | That's the definition, like all the journal stuff,
02:55:02.760 | the professors, all that stuff is just a superstructure
02:55:06.440 | that was set on top to make experiments more reproducible.
02:55:11.440 | And that superstructure is now like dominating
02:55:16.120 | the underlying thing because people are just fixating
02:55:18.200 | on the prestige and the citation
02:55:20.040 | and not the replication, right?
02:55:21.800 | So how does that apply here?
02:55:23.240 | Once you start thinking about how many replications
02:55:26.240 | does this thing have, Maxwell's equation,
02:55:28.680 | I mean, there's trillions of replications.
02:55:30.560 | Every time, us speaking into this microphone right now,
02:55:33.920 | you know, we're testing, you know,
02:55:36.160 | our theory of the electromagnetic field, right?
02:55:38.480 | Or electromagnetic fields.
02:55:40.400 | Every single time you pick up a cell phone
02:55:41.860 | or use a computer, you're putting our knowledge
02:55:44.880 | to the test, right?
02:55:46.480 | Whereas some paper that came out last week
02:55:49.480 | in Science or Nature may have zero independent replications,
02:55:52.800 | yet it is being cited publicly as prestigious scientists
02:55:57.220 | from Stanford and, you know, Harvard and MIT
02:56:00.240 | all came up with X, right?
02:56:02.880 | And so the prestige is a substitute
02:56:05.280 | for the actual replication.
02:56:08.720 | So there's a concept called Goodhart's Law, okay?
02:56:11.320 | I'm just gonna quote it.
02:56:12.240 | "When a measure becomes a target,
02:56:13.640 | it ceases to be a good measure," okay?
02:56:15.560 | So for example, backlinks on the web
02:56:20.560 | were a good signal for Google to use
02:56:23.080 | when people didn't know they were being used as a signal.
02:56:25.040 | - Yeah, you talked about quantity versus quality
02:56:28.080 | and PageRank was a pretty good approximation for quality.
02:56:30.960 | - Yes.
02:56:31.800 | - It's a fascinating thing, by the way, but yeah.
02:56:33.360 | - It is a fascinating thing, we can talk about that.
02:56:34.960 | But basically, once people know
02:56:38.040 | that you're using this as a measure,
02:56:39.520 | they will start to game it.
02:56:41.080 | And so then you have this cycle
02:56:43.360 | where sometimes you have a fixed point,
02:56:45.800 | like Satoshi with "Proof of Work"
02:56:47.640 | was miraculously able to come up with a game
02:56:50.120 | where the gaming of it was difficult
02:56:51.600 | without just buying more compute, right?
02:56:53.240 | So it's actually, it's a rare kind of game
02:56:55.360 | where knowledge of the game's rules
02:56:57.740 | didn't allow people to game the game.
02:56:59.040 | - Yeah.
02:56:59.880 | - But--
02:57:00.700 | - A brilliant way to put it, yeah.
02:57:01.540 | Which is one of the reasons it's brilliant,
02:57:03.400 | is that you can describe the game
02:57:06.040 | and you can't mess with it.
02:57:07.320 | - Exactly, it's very hard to come up with something
02:57:09.320 | that's stable in this way.
02:57:10.160 | There's actually, on the meta point,
02:57:12.880 | gosh, there's a game where the rule of the game
02:57:16.560 | is to change the rules, okay?
02:57:19.140 | It is--
02:57:21.600 | - You mean "Human Civilization" or what?
02:57:23.440 | - Yeah, gosh, it is called something, Nomic, okay?
02:57:28.440 | - N-O-M-I-C?
02:57:30.400 | - Nomic is a game where the rule of the game
02:57:33.040 | is to change the rules of the game.
02:57:34.480 | At first, that seems insane.
02:57:36.480 | Then you realize that's Congress.
02:57:38.180 | - Yeah.
02:57:41.160 | - Right?
02:57:41.980 | It is so meta because there are laws for elections
02:57:46.980 | that elect the editors of those laws
02:57:52.880 | who then change the laws that get them elected
02:57:54.840 | with gerrymandering and other stuff, right?
02:57:56.780 | That's a bad way of thinking about it.
02:57:58.560 | The other way of thinking about it is
02:57:59.840 | this is what every software engineer is doing.
02:58:02.040 | You are constantly, quote, "changing the rules"
02:58:04.460 | by editing software and pushing code updates and so on, right?
02:58:08.560 | So many games devolve into the metagame
02:58:12.500 | of who writes the rules of the game, right?
02:58:14.400 | They become essentially games of Nomic.
02:58:16.620 | Proof of Work is so amazing
02:58:18.780 | because it didn't devolve in such a way, right?
02:58:21.460 | It became very hard to rewrite the rules
02:58:22.880 | once they got set up.
02:58:23.760 | Very financially and technically expensive.
02:58:25.740 | That's not to say it will always be like that,
02:58:28.220 | but it's very hard to change.
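A toy illustration of why knowing proof-of-work's rules doesn't let you game it: the only way to find a valid nonce is brute-force hashing, while checking someone else's work takes one hash. This is a sketch; actual Bitcoin hashes block headers with double SHA-256 against a numeric target:

```python
import hashlib

def mine(data: str, difficulty: int) -> int:
    """Brute-force a nonce whose SHA-256 digest starts with `difficulty` zero hex digits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(data: str, nonce: int, difficulty: int) -> bool:
    """Anyone can check the work in a single hash, even though finding it took many."""
    digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

Raising `difficulty` by one hex digit multiplies the expected search by sixteen, which is why rewriting an established chain gets financially and technically expensive.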
02:58:29.500 | - If we could take a small tangent,
02:58:31.100 | we'll return to academia.
02:58:32.180 | I'd love to ask you about how to fix the media as well
02:58:35.060 | after we fix academia.
02:58:36.580 | - Yeah, these are all actually related.
02:58:37.820 | - Yeah, Wikipedia, media, and academia
02:58:40.760 | are all related to the question of
02:58:42.700 | independent replication versus prestigious citation.
02:58:46.580 | - Sure, so the problem is authority and prestige
02:58:50.820 | as you see it from academia and the media
02:58:54.780 | and Wikipedia with the editors.
02:58:57.340 | We have to have a mechanism where sort of
02:59:04.780 | the data and the reproducibility
02:59:07.460 | is what dominates the discourse.
02:59:09.780 | - That's right, and so one way of thinking about this is,
02:59:11.740 | I've said this in, I think I tweeted this,
02:59:14.940 | Western civilization actually has
02:59:17.260 | a break-glass-in-case-of-emergency button.
02:59:19.860 | It's called decentralization, right?
02:59:21.060 | Martin Luther had it.
02:59:22.380 | When the Catholic Church was too ossified and centralized,
02:59:25.620 | decentralized with the Protestant Reformation, okay?
02:59:28.140 | He said, at the time, people were able to
02:59:32.760 | pay for indulgences, like that is to say they could sin.
02:59:36.240 | They could say, okay, I sinned five times yesterday.
02:59:38.640 | Here's the equivalent of 50 bucks.
02:59:40.560 | Okay, I'm done with my sin.
02:59:41.400 | I can go and sin some more.
02:59:42.440 | They could literally buy their way out of sin, okay?
02:59:44.520 | Now people debate as to how frequent
02:59:46.280 | those indulgences were, but this was one of the things
02:59:47.660 | he inveighed against in the 95 Theses.
02:59:50.000 | So decentralization, boom, break away
02:59:52.080 | from this ossified church, start something new, right?
02:59:55.600 | And in theory, the "religious wars" of the 1600s
02:59:59.160 | that ensued were about things like
03:00:01.560 | whether the wafer was the body of Christ or what have you,
03:00:04.920 | but in part, they were also about power
03:00:07.280 | and whether the centralized entity would write all the rules
03:00:09.720 | or the decentralized one would.
03:00:11.280 | And so what happened was, obviously Catholicism still exists
03:00:13.340 | but Protestantism also exists, okay?
03:00:15.920 | And similarly, here you've got this ossified
03:00:19.120 | central institution where, forget about,
03:00:22.120 | I mean, there's complicated studies
03:00:23.600 | that are difficult to summarize,
03:00:24.440 | but when you have the science saying
03:00:26.440 | masks don't work and then they do, okay?
03:00:29.280 | Which everybody saw.
03:00:30.680 | And this is not like, everybody knew
03:00:32.680 | that there was not like some massive study
03:00:34.880 | that came out that changed our perspective on mask wearing.
03:00:37.860 | It was something that was just insistently asserted
03:00:39.920 | as this is what the science says.
03:00:41.400 | And then without any acknowledgement,
03:00:43.160 | the science said something different the next day, right?
03:00:45.700 | I remember 'cause I was in the middle of this debate.
03:00:48.260 | And I think you could justify masks early in the pandemic
03:00:51.320 | as a useful precaution and then later,
03:00:55.040 | post-vaccination, perhaps not necessary.
03:00:56.600 | I think that's like the rational way of thinking about it.
03:00:58.880 | But the point was that such levels of uncertainty
03:01:01.320 | were not acknowledged.
03:01:02.480 | Instead, people were basically lying in the name of science
03:01:07.480 | and public policy, it wasn't public health,
03:01:11.560 | it was political health, okay?
03:01:13.040 | So something like that, you're just spending
03:01:14.400 | down all the credibility of an institution
03:01:15.560 | for basically nothing, okay?
03:01:17.440 | And so in such a circumstance, what do you do?
03:01:20.140 | Break glass, decentralize.
03:01:21.920 | What does that look like?
03:01:22.880 | Okay, so let me describe what I call cryptoscience
03:01:28.320 | by analogy to crypto: just like there's fiat economics
03:01:31.400 | and crypto economics, there's fiat science and cryptoscience, okay?
03:01:33.640 | So in any experiment, any paper when it comes out, right,
03:01:38.400 | you can sort of divide it into the analog to digital
03:01:43.720 | and the purely digital, okay?
03:01:44.560 | So the analog to digital is you're running some instruments,
03:01:47.580 | you're getting some data, okay?
03:01:49.260 | And then once you've got the data,
03:01:51.400 | you're generating figures and tables and text
03:01:54.240 | and a PDF from that data, right?
03:01:56.880 | Leave aside the data collection step for now,
03:01:58.440 | I'll come back to that, right?
03:01:59.840 | Just the purely digital part,
03:02:01.540 | what does the ideal quote academic paper look like
03:02:04.480 | in 2022, 2023?
03:02:06.560 | First, there's this concept called reproducible research,
03:02:10.640 | okay?
03:02:12.240 | Reproducible research is the idea that the PDF
03:02:15.920 | should be regenerated from the data and code, okay?
03:02:20.840 | So you should be able to hit enter and regenerate it.
03:02:22.680 | Why is this really important as a concept?
03:02:24.240 | Jon Claerbout and Dave Donoho at Stanford 20 years ago
03:02:27.640 | pioneered this in stats because the text alone
03:02:32.120 | often doesn't describe every parameter
03:02:33.680 | that goes into a figure or something, right?
03:02:35.520 | You kind of sometimes just need to look at the code
03:02:37.000 | and then it's easy and without that, it's hard, okay?
03:02:39.740 | So reproducible research means you regenerate the PDF
03:02:43.260 | from the code and the data, you hit enter, okay?
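A minimal Python sketch of the reproducible-research idea just described (the data and function names are invented for illustration): every table or figure in the "paper" is a deterministic function of the raw data, so anyone holding the data and code can hit enter and regenerate, and verify, the output.

```python
# Reproducible-research sketch: every artifact of the "paper" is a
# pure function of the raw data, so rerunning the script regenerates
# it exactly, and a content hash proves two reruns agree.

import hashlib
import json

raw_data = [2.1, 2.3, 1.9, 2.2, 2.0]  # stand-in for instrument output

def make_table(data):
    """Regenerate the paper's summary table from the raw data."""
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    return {"n": n, "mean": round(mean, 3), "variance": round(var, 3)}

def artifact_hash(artifact):
    """Canonical content hash: independent reruns yield the same digest."""
    blob = json.dumps(artifact, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

table = make_table(raw_data)
digest = artifact_hash(table)

# A second, independent "rerun" produces bit-identical output.
assert artifact_hash(make_table(raw_data)) == digest
```

Real pipelines do the same thing at larger scale with tools like Make or notebooks; the essential property is only that the published artifact is recomputable from (code, data).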
03:02:47.640 | Now, one issue is that many papers out there
03:02:51.000 | in Science, Nature, et cetera,
03:02:52.200 | are not reproducible research.
03:02:53.520 | Moreover, the data isn't even public.
03:02:55.640 | Moreover, sometimes the paper isn't even public.
03:02:57.720 | The open access movement has been fighting this
03:02:59.520 | for the last 20 something years.
03:03:00.720 | There's various levels of this like green and gold,
03:03:02.960 | open access, okay?
03:03:04.440 | So the first step is the code, the data,
03:03:07.640 | and the PDF go on chain, step number one, okay?
03:03:11.340 | The second thing is once you've got,
03:03:13.000 | so you can, anybody who is,
03:03:14.960 | and that could be the Ethereum chain,
03:03:16.120 | it could be its own dedicated chain, whatever, okay?
03:03:18.440 | It could be something where there's just the URLs
03:03:21.120 | are on the Ethereum chain and stored on Filecoin,
03:03:22.720 | many different implementations,
03:03:23.720 | but let's call that on chain broadly, okay?
03:03:26.400 | Not just online, on chain.
03:03:27.540 | When it's on chain, it's public and anybody can get it.
03:03:30.600 | So that's first.
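One minimal way to sketch the "on chain, not just online" step (the filenames are invented, and the actual chain write is out of scope here): content-address each artifact of the paper, keep the bulky files off-chain on something like Filecoin, and record only the small digests on-chain so any downloaded copy can be verified.

```python
# Content addressing sketch: the identifier *is* the hash of the
# bytes, so a reader can verify any copy they download, no matter
# which server or storage network actually hosts it.

import hashlib

def content_address(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

# Stand-ins for the three artifacts of a paper: PDF, code, data.
artifacts = {
    "paper.pdf": b"%PDF... rendered output",
    "analysis.py": b"print('regenerate figures')",
    "data.csv": b"x,y\n1,2\n3,4\n",
}

# The bulky files live off-chain; only this small manifest of
# digests needs to be written on-chain.
onchain_manifest = {name: content_address(blob) for name, blob in artifacts.items()}

# Verification: a downloaded copy is authentic iff its hash matches.
downloaded = b"x,y\n1,2\n3,4\n"
assert content_address(downloaded) == onchain_manifest["data.csv"]
```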
03:03:31.860 | Second is once you've got something
03:03:33.640 | where you can regenerate the code,
03:03:35.640 | or the PDF from the code and the data on chain, guess what?
03:03:40.120 | You can have citations between two papers
03:03:43.920 | turn into import statements.
03:03:45.560 | - Yeah, that's funny.
03:03:47.120 | - That's cool, right?
03:03:48.400 | So now you're not just getting composable finance,
03:03:51.320 | like DeFi, where you have like one interest rate calculator
03:03:54.000 | calling another, you have composable science.
03:03:57.600 | And now you can say this paper on this,
03:03:59.960 | especially in ML, right?
03:04:01.400 | You'll often cite a previous paper in its benchmark
03:04:04.600 | or its method, right?
03:04:05.600 | You're gonna wanna scatter plot sometimes your paper,
03:04:08.800 | your algorithm versus theirs on the same dataset.
03:04:11.560 | That is facilitated if their entire paper
03:04:15.640 | is reproducible research that is generated.
03:04:18.240 | You can just literally import that Python
03:04:20.140 | and then you can generate your figure off of it, right?
03:04:24.040 | Moreover, think about how that aids reproducibility
03:04:26.720 | because you don't have to reproduce in the literal sense,
03:04:29.500 | every single snippet of code that they did,
03:04:31.560 | you can literally use their code, import it, okay?
03:04:34.000 | People start compounding on each other.
03:04:35.360 | It's better science, okay?
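A toy sketch of "citations as import statements" (the paper names and methods are invented; on-chain, the cited paper would be a real importable package rather than an in-file stand-in): instead of re-implementing a cited method from its prose, you import its code and run both methods on the same benchmark data.

```python
# "Citation as import": the cited paper's code is a library you call,
# not a PDF you re-type. The cited paper is simulated here as an
# in-file namespace standing in for `import cited_paper_2021`.

class cited_paper_2021:
    """Stand-in for an importable on-chain package."""
    @staticmethod
    def method(xs):
        # The prior paper's baseline: a plain mean.
        return sum(xs) / len(xs)

def our_method(xs):
    """The new paper's method: a trimmed mean (drop min and max)."""
    trimmed = sorted(xs)[1:-1]
    return sum(trimmed) / len(trimmed)

dataset = [1.0, 2.0, 2.0, 3.0, 100.0]  # shared benchmark data

# Same dataset, both methods: the comparison figure is regenerated
# from the cited code itself, with no error-prone re-implementation.
comparison = {
    "baseline": cited_paper_2021.method(dataset),
    "ours": our_method(dataset),
}
```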
03:04:37.080 | Now I talked about this, but actually there's a few folks
03:04:39.840 | who have been actually building this.
03:04:41.680 | So there's usescholar.org,
03:04:45.360 | which actually has a demo of this,
03:04:46.800 | like just a V1 kind of prototype
03:04:48.680 | where it shows two stats papers on chain
03:04:52.000 | and one of them is citing the other
03:04:55.360 | with an import statement.
03:04:56.280 | There's also a thing called, I think, dsci.com,
03:04:58.560 | which is trying to do this, right?
03:04:59.760 | Decentralized science.
03:05:01.040 | So this itself changes how we think about papers.
03:05:04.880 | And actually, by the way,
03:05:06.480 | the inspiration for PageRank was actually citations.
03:05:10.820 | It was like the impact factor out of academia.
03:05:12.240 | That's where Larry Page and Sergey Brin
03:05:13.720 | got the concept from, right?
03:05:15.060 | So now you've got a web of citations
03:05:18.600 | that are import statements on chain.
03:05:20.440 | In theory, you could track back a paper
03:05:22.360 | all the way back to its antecedents, okay?
03:05:25.600 | So if it's citing something,
03:05:26.960 | you can now look it up and look it up and look it up.
03:05:29.160 | And a surprising number of papers actually,
03:05:33.440 | you know, their antecedents don't terminate
03:05:35.920 | or the original source says something different
03:05:38.220 | or it just kind of got garbled like a telephone game.
03:05:40.800 | And, you know, there's this famous thing on like the spinach,
03:05:45.680 | like it does actually have iron in it
03:05:47.640 | or something like that.
03:05:48.760 | I forget the details on this story,
03:05:50.560 | but it was something where you track back the citations
03:05:52.280 | and people are contradicting each other, okay?
03:05:54.600 | But it's just something that just gets copy pasted
03:05:56.280 | and it's a fact that's not actually a fact
03:05:57.920 | 'cause it's not audited properly.
03:05:59.720 | This allows you to cheaply audit, in theory,
03:06:02.540 | all the way back to Maxwell or Newton
03:06:04.080 | or something like that, okay?
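With citations as machine-readable links, the audit-back-to-Newton idea becomes an ordinary graph traversal. A small sketch (the paper names and toy graph are invented): walk a citation graph back to its antecedents and see whether every claim terminates at an original source.

```python
# Walking a citation graph back to its antecedents: "where does this
# claim actually come from?" becomes a traversal instead of archival
# detective work.

citations = {  # paper -> papers it cites (toy data)
    "survey_2022": ["method_2015", "method_2018"],
    "method_2018": ["method_2015"],
    "method_2015": ["newton_1687"],
    "newton_1687": [],  # terminates at an original source
}

def antecedents(paper, graph):
    """All transitive sources of a paper (iterative depth-first walk)."""
    seen = set()
    stack = [paper]
    while stack:
        current = stack.pop()
        for cited in graph.get(current, []):
            if cited not in seen:
                seen.add(cited)
                stack.append(cited)
    return seen

# Every claim in the survey can be chased back to 1687.
assert "newton_1687" in antecedents("survey_2022", citations)
```

A chain that doesn't terminate, or terminates at a source saying something different, is exactly the telephone-game failure described above, surfaced automatically.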
03:06:05.720 | Now, what I'm describing is a big problem,
03:06:07.680 | but it's a finite problem.
03:06:09.240 | It's essentially taking all the important papers
03:06:12.120 | and putting them on chain.
03:06:13.080 | It's about the scale of, let's say, Wikipedia, okay?
03:06:15.920 | So it's like, I don't know, a few hundred thousand,
03:06:17.600 | a few million papers.
03:06:18.440 | I don't know the exact number,
03:06:19.480 | but it'll be out of that level, okay?
03:06:20.840 | So now you've got, number one,
03:06:22.920 | these things that are on chain, okay?
03:06:24.840 | Number two, you've turned citations into import statements.
03:06:28.200 | Number three, anybody can now, at a minimum,
03:06:31.720 | download that code.
03:06:33.600 | And while they may not have the instruments,
03:06:35.320 | and I'll come back to that point,
03:06:36.480 | while they may not have the instruments,
03:06:37.540 | they can do internal checks,
03:06:38.580 | the Benford's Law stuff we were just talking about.
03:06:41.200 | You can internally check the consistency
03:06:43.160 | of these tables and graphs,
03:06:44.480 | and often you'll find fraud
03:06:46.080 | or things that don't add up that way,
03:06:47.440 | 'cause all the code and the data is there, right?
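The Benford's-law style internal check mentioned here can be sketched in a few lines (the datasets are synthetic stand-ins, and a real audit would use a proper statistical test rather than this simple deviation score): in many naturally occurring datasets the leading digit d appears with probability log10(1 + 1/d), so tables whose leading digits look uniform are a red flag worth auditing.

```python
# Benford's-law internal consistency check: compare a dataset's
# leading-digit frequencies to the Benford distribution
# P(d) = log10(1 + 1/d) for d in 1..9.

import math

def leading_digit(x):
    return int(str(abs(x)).lstrip("0.")[0])

def benford_deviation(values):
    """Mean absolute gap between observed and Benford leading-digit
    frequencies; larger values suggest the data merits a closer look."""
    counts = {d: 0 for d in range(1, 10)}
    for v in values:
        counts[leading_digit(v)] += 1
    n = len(values)
    return sum(
        abs(counts[d] / n - math.log10(1 + 1 / d)) for d in range(1, 10)
    ) / 9

# Multiplicative-growth data tends to follow Benford's law...
natural = [1.02 ** k for k in range(1, 300)]
# ...while uniformly fabricated numbers usually don't.
fabricated = [float(d) for d in range(1, 10)] * 30

assert benford_deviation(natural) < benford_deviation(fabricated)
```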
03:06:50.520 | And now you've made it so that anybody in Brazil,
03:06:54.080 | in India, in Nigeria, they may not have an academic,
03:06:56.080 | you know, like a library access,
03:06:58.880 | but they can get into this, all right?
03:07:00.520 | Now, how do you fund all of this?
03:07:02.360 | Well, good thing is crypto actually allows tools
03:07:05.160 | for that as well.
03:07:06.000 | Andrew Huberman and others have started doing things
03:07:07.480 | like with NFTs to fund their lab.
03:07:10.960 | I can talk about the funding aspect.
03:07:12.240 | There's things like researchhub.com,
03:07:14.160 | which are trying to issue tokens for labs,
03:07:16.220 | but a lab isn't that expensive to fund.
03:07:18.280 | Maybe it's a few hundred thousand, a few million a year,
03:07:20.000 | depending on where you are.
03:07:21.520 | Crypto does generate money.
03:07:23.280 | And so you can probably imagine various tools,
03:07:25.560 | whether it's tokens or NFTs
03:07:26.800 | or something like that to fund.
03:07:28.680 | Finally, what this does is it is not QAnon, right?
03:07:33.680 | It is not saying don't trust anybody.
03:07:36.080 | Neither is it just trust
03:07:37.800 | the centralized academic establishment.
03:07:39.980 | Instead it's saying trust because you can verify,
03:07:43.000 | because we can download things and run them.
03:07:45.000 | The crucial thing that I'm assuming here
03:07:47.040 | is the billions of supercomputers around the world
03:07:50.120 | that we have, all the MacBooks and iPhones
03:07:52.480 | that can crank through lots and lots of computation.
03:07:54.880 | So everything digital, we can verify it locally, okay?
03:07:58.640 | Now, there's one last step,
03:08:00.080 | which is I mentioned the instruments, right?
03:08:02.200 | Whether it's your sequencing machine
03:08:04.000 | or your accelerometer or something like that
03:08:06.880 | is generating the data that you are reporting in your paper
03:08:09.880 | when you put it on chain, okay?
03:08:11.760 | Basically you think of that as the analog digital interface.
03:08:14.580 | We can cryptify that too, why?
03:08:16.580 | For example, an Illumina sequencing machine
03:08:19.800 | has an experiment manifest.
03:08:21.960 | And when that's written to,
03:08:24.500 | there's a website called NCBI,
03:08:26.220 | National Center for Biotechnology Information.
03:08:28.640 | You can see the experiment metadata
03:08:31.040 | on various sequencing runs.
03:08:32.760 | It'll tell you what instrument and what time it was run
03:08:35.640 | and who ran it and so on and so forth, okay?
03:08:39.080 | What that does is it allows you to correct
03:08:40.560 | for things like batch effects.
03:08:41.860 | Sometimes you will sequence on this day and the next day
03:08:44.560 | and maybe the humidity or something like that
03:08:46.560 | makes it look like there's a statistically significant
03:08:48.760 | difference between your two results,
03:08:50.040 | but it was just actually batch effects, okay?
03:08:52.280 | What's my point?
03:08:54.220 | Point is, if you have a crypto instrument,
03:08:57.400 | you can have various hashes and stuff of the data
03:09:00.860 | as a chain of custody for the data itself
03:09:02.860 | that are streamed and written on chain
03:09:05.120 | that the manufacturer can program into this.
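The chain-of-custody idea for instrument data can be sketched as a simple hash chain (the instrument readings and the final on-chain write are simulated; a real device would sign and stream these digests as the manufacturer programs it to): each reading is hashed together with the previous digest, so tampering with any earlier record changes every later one.

```python
# Hash-chain custody sketch: chain[-1] is the single small digest
# that would be written on-chain; replaying the raw records must
# reproduce it exactly.

import hashlib

def link(prev_hash: str, record: str) -> str:
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

def build_chain(records, genesis="instrument-serial-XYZ"):
    chain = [genesis]
    for r in records:
        chain.append(link(chain[-1], r))
    return chain

readings = [
    "run=42 t=0 signal=0.91",
    "run=42 t=1 signal=0.93",
    "run=42 t=2 signal=0.90",
]

chain = build_chain(readings)

# Replaying the untampered records reproduces the on-chain head...
assert build_chain(readings)[-1] == chain[-1]
# ...while editing any historical reading breaks the whole chain.
tampered = readings[:]
tampered[0] = "run=42 t=0 signal=0.99"
assert build_chain(tampered)[-1] != chain[-1]
```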
03:09:07.480 | For anything that's really, and you might say,
03:09:08.640 | "Oh boy, boy, that's overkill," right?
03:09:10.400 | I'm saying actually not.
03:09:11.360 | You know why?
03:09:12.240 | If you're doing a study whose results
03:09:14.640 | are going to be used to influence a policy
03:09:16.580 | that's gonna control the lives of millions of people,
03:09:19.000 | every single step has to be totally auditable.
03:09:21.320 | You need the glass box model.
03:09:23.280 | You need to be able to go back to the raw data.
03:09:25.760 | You need to be able to interrogate that.
03:09:27.920 | And again, anybody who's a good scientist
03:09:31.600 | will embrace this, right?
03:09:33.200 | - Yeah, so first of all, that was a brilliant exposition
03:09:36.480 | of a future of science that I would love to see.
03:09:39.280 | The pushback I'll provide, which is not really a pushback,
03:09:42.800 | is like what you describe is so much better
03:09:46.840 | than what we currently have that I think
03:09:50.960 | a lot of people would say any of the sub-steps
03:09:54.560 | you suggest are already going to be a huge improvement.
03:09:57.840 | So even just sharing the code.
03:10:01.000 | - Yes.
03:10:01.820 | - Or sharing the data.
03:10:03.020 | You said, I think it would surprise people how often--
03:10:08.020 | - It's hard to get data.
03:10:09.500 | - It is, like the actual data or specifics
03:10:12.780 | or a large number of the parameters,
03:10:14.500 | not you'll share like one or two parameters
03:10:17.640 | that were involved with running the experiment.
03:10:20.820 | You won't mention the machines involved,
03:10:23.800 | except maybe at a high level, but the versions and so on.
03:10:27.500 | The dates when the experiments were run,
03:10:29.100 | you don't mention any of this kind of stuff.
03:10:31.500 | So there's several ways to fix this.
03:10:35.660 | And one of them, I think, implied in what you're describing
03:10:40.660 | is a culture that says it's not okay.
03:10:45.620 | - Exactly.
03:10:46.460 | - To like, so first of all, there should be,
03:10:48.980 | even if it's not perfectly unchained
03:10:51.020 | to where you can automatically import all the way to Newton,
03:10:54.120 | just even the act of sharing the code, sharing the data,
03:11:01.300 | maybe in a way that's not perfectly integrated
03:11:04.900 | into a larger structure is already a very big positive step.
03:11:09.220 | - Yeah.
03:11:10.060 | - I'm saying like, if you don't do this,
03:11:12.620 | then this doesn't count.
03:11:15.980 | And because in general, I think my worry,
03:11:20.620 | as somebody who's a programmer, who's OCD,
03:11:25.500 | I love the picture you paint
03:11:27.560 | that you can just import everything
03:11:28.780 | and it automatically checks everything.
03:11:30.960 | My problem is that makes incremental science easier
03:11:35.500 | and revolutionary science harder.
03:11:37.300 | - Oh, I actually very much disagree with that.
03:11:39.020 | - I would love to hear your argument.
03:11:40.380 | Let me just kind of elaborate.
03:11:42.100 | - Sure, sure.
03:11:42.940 | - Why, sometimes you have to think
03:11:46.580 | in this gray area of fuzziness
03:11:49.500 | when you're thinking in totally novel ideas
03:11:51.940 | and when you have to concretize in data.
03:11:54.460 | Like some of the greatest papers ever written
03:11:56.180 | don't have data.
03:11:57.740 | They're in the space of ideas almost.
03:11:59.980 | Like you're kind of sketching stuff
03:12:01.640 | and there could be errors,
03:12:02.560 | but like Einstein himself with the famous five papers,
03:12:05.300 | I mean, they're really strong, but they're fuzzy.
03:12:08.920 | They're a little bit fuzzy.
03:12:10.500 | And so I think, even like the GAN paper,
03:12:15.120 | you're often thinking of like new data sets, new ideas.
03:12:19.200 | And I think maybe as a step after the paper is written,
03:12:24.120 | you could probably concretize it,
03:12:25.600 | integrate it into the rest of science.
03:12:27.200 | - Sure.
03:12:28.040 | - Like you shouldn't feel that pressure, I guess, early on.
03:12:30.880 | - Well, I mean, there's different,
03:12:33.120 | each of the steps that I'm talking about, right?
03:12:35.340 | There's like the data being public and everything.
03:12:38.000 | Just having the paper being public, that's like V1, right?
03:12:41.580 | Then you have the thing being regenerated from code and data,
03:12:44.900 | like the PDF being regenerated from code and data.
03:12:46.880 | Then you have the citations as import statements.
03:12:49.360 | Then you have the full citation graph
03:12:50.960 | as an import statement.
03:12:51.800 | So you just follow it all the way back, right?
03:12:53.940 | And now you have, that gives you auditability.
03:12:57.680 | Then you have the off-chain,
03:13:00.000 | you know, the analog digital crypto custody, right?
03:13:02.640 | Like where you're hashing things and streaming things.
03:13:04.500 | So you have the chain of custody.
03:13:06.600 | Each of those is kind of like a level up
03:13:08.000 | and adds to complexity,
03:13:09.800 | but it also adds to the auditability and the verifiability
03:13:12.240 | and the reproducibility.
03:13:13.600 | But, you know, one thing I'd say,
03:13:15.200 | I wanted to respond to that you said was
03:13:18.080 | that you think this would be good for incremental,
03:13:20.100 | but not innovative.
03:13:20.940 | Actually, I think it's quite the opposite.
03:13:22.000 | I think academia is institutional and it's not innovative.
03:13:26.440 | For example, NIH has this graph,
03:13:28.680 | which is like, I think it's age of recipients
03:13:31.320 | of R01 grants, okay?
03:13:33.520 | And what it shows is basically it's like a hump
03:13:36.600 | that moves over time,
03:13:37.760 | roughly plus one year forward for the average age
03:13:41.280 | as the year moves on, okay?
03:13:43.040 | I'll see if I can find the GIF.
03:13:44.980 | What this, why is this?
03:13:46.440 | Let me see if I can find it actually.
03:13:48.040 | Look at this movie just for a second.
03:13:49.440 | It's a ridiculously powerful movie and it's 30 seconds.
03:13:52.320 | I just sent it in, WhatsApp.
03:13:53.740 | The name of the video is "Age Distribution
03:13:56.660 | of NIH Principal Investigators and Medical School Faculty."
03:14:00.660 | And it starts out on the X-axis is age
03:14:03.380 | with the distribution and percent of PIs.
03:14:06.980 | And from early 1980s, moving one year at a time.
03:14:11.980 | And the mean of the distribution is moving slowly,
03:14:15.540 | approximately as Balaji said, about one year.
03:14:19.900 | - Per year.
03:14:20.740 | - Per year.
03:14:21.580 | - And this is 10 years ago.
03:14:22.620 | One year in age per year of time.
03:14:25.780 | - And notice how, first of all,
03:14:27.420 | the average age is moving way upward
03:14:29.980 | before you become an NIH PI.
03:14:32.940 | Second is, it's a cohort of guys, people,
03:14:36.100 | who are just awarding grants to each other.
03:14:38.760 | That's clearly what's happening.
03:14:40.380 | That's the underlying dynamic.
03:14:42.140 | They're not awarding grants to folks who are much younger,
03:14:45.060 | because those folks haven't proven themselves yet.
03:14:47.580 | So this is what happens when you get prestigious citation
03:14:50.660 | rather than independent replication.
03:14:52.300 | The age just keeps creeping up.
03:14:54.100 | And this was 10 years ago, and it's gotten even worse.
03:14:56.020 | It's become even more gerontocratic,
03:14:57.820 | even more hidebound, right?
03:14:59.740 | And so the thing is, the structures
03:15:01.620 | that Vannevar Bush and others set up,
03:15:03.100 | the entire post-war science establishment,
03:15:05.940 | one thing I'll often find is people will say,
03:15:08.900 | "Balaji, the government hath granted us the internet
03:15:13.540 | and self-driving cars and space flight and so on.
03:15:18.540 | How can you possibly be against the US government,
03:15:21.580 | kneel and repent for its bounty?"
03:15:24.780 | And really, the reason they kind of,
03:15:26.740 | they don't say it quite in that way,
03:15:27.980 | but that's really the underpinning kind of thing,
03:15:29.580 | because they've replaced GOD with GOV.
03:15:32.780 | They really think of the US government as God.
03:15:34.860 | The conservative will think of the US government
03:15:36.260 | as the all-powerful military abroad,
03:15:39.140 | and the progressive will think of it
03:15:40.780 | as the benign, all-powerful, nurturing parent at home.
03:15:45.780 | But in this context, they're like,
03:15:48.500 | "How come you as some tech bro could possibly think
03:15:52.460 | you could ever do basic science
03:15:55.020 | without the funding of the US government?
03:15:56.420 | Has it not developed all basic science?"
03:15:59.380 | And the answer to this is actually to say,
03:16:01.460 | "Well, what if we go further back than 1950?
03:16:03.860 | Did science happen before 1950?"
03:16:05.220 | Well, I think it did.
03:16:06.420 | Bernoulli and Maxwell and Newton,
03:16:09.100 | were they funded by NSF?
03:16:11.220 | No, they weren't.
03:16:12.420 | Were aviation, railroads, automobiles,
03:16:14.660 | gigantic industries that arose,
03:16:17.820 | and both were stimulated by
03:16:19.780 | and stimulated development of pure science?
03:16:21.740 | Did they, were they funded by NSF?
03:16:23.180 | No, they were not.
03:16:24.260 | Therefore, NSF is not a necessary condition
03:16:28.580 | for the presence of science,
03:16:29.420 | neither is even the United States.
03:16:30.540 | Obviously, a lot of these discoveries,
03:16:32.100 | Newton was before, I believe he was before the American,
03:16:36.540 | hold on, let me just find the exact,
03:16:38.540 | it's actually less old than people think.
03:16:41.460 | Okay, so Newton died 1727.
03:16:43.820 | So I knew that it was in the 1700s.
03:16:47.580 | So Newton was before the American Revolution, right?
03:16:51.020 | Obviously, that meant huge innovations could happen
03:16:54.100 | before the US government, before NIH, before NSF, right?
03:16:58.420 | Which means they are not a necessary condition, number one.
03:17:02.460 | That itself is crucial because a lot of people say,
03:17:04.780 | "The government is necessary for basic science."
03:17:07.340 | It is not necessary for basic science.
03:17:09.540 | It is one possible catalyst.
03:17:12.140 | And I would argue that mid-century, it was okay,
03:17:15.180 | because mid-century was the time when,
03:17:17.420 | the middle of the centralized century.
03:17:20.020 | 1933, 1945, 1969, you have Hoover Dam,
03:17:23.740 | you have the Manhattan Project, you have Apollo.
03:17:25.700 | That generation was acclimatized
03:17:28.700 | to a centralized US government
03:17:30.380 | that could accomplish great things,
03:17:31.860 | probably because technology favored centralization
03:17:34.340 | going into 1950 and then started favoring decentralization
03:17:36.900 | going out of it.
03:17:37.860 | I've talked about this in the book,
03:17:38.980 | Sovereign Individual has talked about this,
03:17:40.260 | but very roughly, you go up into 1950
03:17:44.140 | and you have mass media and mass production
03:17:47.220 | and just centralization of all kinds,
03:17:49.100 | giant nation states slugging it out on the world stage.
03:17:51.580 | And you go out in 1950 and you get cable news
03:17:54.660 | and personal computers and the internet
03:17:57.380 | and mobile phones and cryptocurrency,
03:17:58.940 | and you have the decentralization.
03:18:00.700 | And so this entire centralized scientific establishment
03:18:04.060 | was set up at the peak of the centralized century.
03:18:07.140 | And it might've been the right thing to do at that time,
03:18:09.540 | but is now showing its age.
03:18:11.260 | And it's no longer actually geared up for what we have.
03:18:13.620 | Where are the huge innovations coming out of?
03:18:14.820 | Well, Satoshi Nakamoto was not, to our knowledge,
03:18:18.660 | a professor, right?
03:18:20.580 | That's this revolutionary thing that came outside of it.
03:18:23.540 | Early in the pandemic,
03:18:24.900 | there was something called project-evidence.github.io,
03:18:28.140 | which accumulated all of the evidence
03:18:30.540 | for the coronavirus possibly having been a lab leak,
03:18:34.060 | when that was a very controversial thing to discuss.
03:18:37.140 | Alina Chan, to her credit,
03:18:38.860 | Matt Ridley and Alina Chan have written this book
03:18:41.700 | on whether the coronavirus was a lab leak or not.
03:18:45.420 | I think it's plausible that it was.
03:18:47.180 | I can't say I'm 100% sure,
03:18:50.140 | but I think it's at least,
03:18:50.980 | it certainly, it is a hypothesis worthy of discussion.
03:18:53.620 | Okay?
03:18:54.740 | Though, of course, it's got political overtones.
03:18:56.740 | Point being that the pseudonymous online publication
03:18:59.220 | at project-evidence.github.io
03:19:01.740 | happened when it was taboo to do so.
03:19:03.300 | So we're back to the age of pseudonymous publication
03:19:06.780 | where only the arguments can be argued with.
03:19:10.700 | The person can't be attacked.
03:19:12.380 | Okay?
03:19:13.220 | This is actually something that used to happen in the past.
03:19:14.780 | Like, you know, someone,
03:19:17.260 | there's a famous story where Newton solved a problem
03:19:19.140 | and someone said, "I know the lion by his claw,"
03:19:21.180 | or something like that, right?
03:19:22.460 | People used to do pseudonymous publication in the past
03:19:24.380 | so that they would be judged on part
03:19:25.980 | by their scientific ideas
03:19:27.180 | and not the person themselves, right?
03:19:29.620 | And so, I do disagree that this is the incremental stuff.
03:19:34.020 | This is actually the innovative stuff.
03:19:35.020 | The incremental stuff is gonna be
03:19:36.100 | the institutional gerontocracy that's academia,
03:19:39.860 | where it's like, you know, do you know who I am?
03:19:41.940 | I'm a Harvard professor.
03:19:43.220 | - Yeah, I don't, I think I agree with everything you said,
03:19:46.860 | but I'm not gonna get stuck on technicalities
03:19:51.860 | because I think I was referring to your vision
03:19:56.980 | of data sets and importing code.
03:19:59.860 | - Sure.
03:20:00.700 | - And so that forces just knowing how code works,
03:20:03.380 | it forces a structure,
03:20:04.780 | and structure usually favors incremental progress.
03:20:08.900 | Like, if you fork code,
03:20:11.900 | you're not going to, it disincentivizes revolution.
03:20:16.900 | You want to go from scratch.
03:20:20.900 | - Okay, so I understand your point there, okay?
03:20:23.500 | And I also agree that some papers,
03:20:25.300 | like Francis Crick on the claustrum,
03:20:27.140 | or others are theoretical.
03:20:28.660 | They're more about like where to dig than the data itself,
03:20:31.420 | and so on and so forth, right?
03:20:32.700 | So, I agree with that.
03:20:33.900 | Still, I don't, the counterargument is,
03:20:37.380 | rather than a thousand people reading this paper
03:20:39.420 | to try to rebuild the whole thing and do it with errors,
03:20:44.100 | when they can just import,
03:20:45.260 | they can more easily build upon what others have done, right?
03:20:48.180 | - Oh yeah, yeah, yeah.
03:20:49.020 | So the paper should be forkable.
03:20:51.860 | - Well, yeah, yeah.
03:20:52.700 | So here's why, you know, like,
03:20:53.980 | Python has this concept of batteries included
03:20:56.020 | for the standard library, right?
03:20:57.660 | Because it lets you just import, import, import,
03:21:00.020 | and just get to work, right?
03:21:01.820 | That means you can fly.
03:21:03.620 | Whereas if you couldn't do all those things,
03:21:05.260 | and you had to rewrite string handling,
03:21:06.940 | you would only be able to do incremental things.
03:21:08.940 | Libraries actually allow for greater innovation.
03:21:11.020 | That's my counter.
03:21:11.860 | - I think you create, I think that paints a picture.
03:21:15.300 | I hope that's a picture that fits with science.
03:21:18.900 | It certainly does.
03:21:19.780 | It fits with code very well.
03:21:21.140 | I just wonder how much of science can be that,
03:21:24.220 | which is you import, how much of it is possible to do that?
03:21:28.660 | Certainly for the things I work on, you can,
03:21:31.140 | which is the machine learning world,
03:21:32.940 | all the computer science world.
03:21:35.500 | But whether you can do that for, all right,
03:21:37.820 | you can think biology, it seems to, yes, I think so.
03:21:42.260 | Chemistry, I think so.
03:21:44.460 | And then you start getting into weird stuff like psychology,
03:21:47.780 | which some people don't even think is a science.
03:21:50.140 | No, just love for my psychology friends.
03:21:53.860 | I think as you get farther and farther away
03:21:57.700 | from things that are like hard technical fields,
03:22:02.500 | it starts getting tougher and tougher and tougher
03:22:04.460 | to have like importable code.
03:22:07.140 | - Okay, so let me give the strong form version, right?
03:22:11.660 | So there's a guy who I think is a great machine learning guy,
03:21:15.300 | actually the creator of Keras,
03:21:17.340 | who disagrees with me: François Chollet.
03:22:19.420 | - Yeah, he's been on this podcast twice.
03:22:20.900 | - Okay, great.
03:22:21.740 | So he disagrees-
03:22:22.580 | - I disagree with him on a lot of stuff.
03:22:23.860 | - Yes, me too.
03:22:24.860 | I think we have mutual respect.
03:22:26.300 | You know, follow each other on Twitter, whatever.
03:22:28.460 | - I think, yes, I think he does respect and like you.
03:22:31.580 | - Here's something which I totally agree with him on.
03:22:33.900 | And he actually got like trolled or attacked for this,
03:22:35.860 | but I completely agree.
03:22:36.940 | Within 10, 20 years,
03:22:37.860 | nearly every branch of science will be,
03:22:39.540 | for all intents and purposes, a branch of computer science.
03:22:41.780 | Computational physics, computational chemistry,
03:22:43.500 | computational biology, computational medicine,
03:22:45.180 | even computational archeology,
03:22:46.740 | realistic simulations, big data analysis,
03:22:48.380 | and ML everywhere.
03:22:49.340 | That to me is incredibly obvious why.
03:22:51.620 | First of all, all we're doing every day
03:22:54.220 | is PDFs and data analysis on a computer, right?
03:22:58.140 | And so every single one of those areas
03:22:59.980 | can be reduced to the analog to digital step,
03:23:02.500 | and then it's all digital.
03:23:03.620 | Then you're flying, you're in the cloud, right?
03:23:04.900 | - Did he put a date?
03:23:05.740 | Did he say how long or--
03:23:07.180 | - 10 to 20 years, he was saying.
03:23:08.020 | - 10, 20 years.
03:23:08.860 | - I mean, arguably it's already there, right?
03:23:10.860 | And here's the thing.
03:23:12.260 | You were saying, well, you know,
03:23:14.180 | you might drop off when you hit psychology or history.
03:23:16.860 | Actually, I think it's the softer sciences
03:23:21.140 | that are gonna harden up.
03:23:24.060 | One of the things I talk about a lot in the book is,
03:23:26.380 | for example, with history,
03:23:27.860 | the concept of crypto history makes history computable.
03:23:31.180 | One way of thinking about it is,
03:23:32.780 | remember my Britney Spears example, right?
03:23:34.460 | Where Queen Britney, right?
03:23:36.340 | Okay, so at first it's kind of a funny thing to say
03:23:39.340 | a computer scientist's term for history is the log files,
03:23:42.540 | until we realized that what would a future historian,
03:23:45.340 | how would they write about the history of the 2010s?
03:23:48.700 | Well, a huge part of that history occurred
03:23:52.220 | on the servers of Twitter and Facebook.
03:23:55.300 | So now you go from like a log file,
03:23:57.420 | which is just the individual record
03:23:58.580 | of like one server's action,
03:24:00.460 | to a decade worth of data on literally billions of people.
03:24:05.460 | All of their online lives, like arguably,
03:24:09.420 | that's why I say that's like actually
03:24:11.020 | what the written history was of the 2010s
03:24:13.820 | was this giant digital history.
03:24:15.940 | As you go to the 2020s and the 2030s,
03:24:17.980 | more of that is gonna move from merely online to on chain
03:24:22.060 | and then cryptographically verifiable.
03:24:23.860 | So that soft subject of history becomes something
03:24:26.300 | that you can calculate things like Google trends
03:24:28.380 | and Ngrams and stuff like that.
03:24:30.140 | - Yes, beautifully put.
03:24:31.820 | Then I would venture to say that Donald Trump
03:24:35.940 | was erased from history when he was removed from Twitter
03:24:40.940 | and many social platforms and all his tweets were gone.
03:24:44.980 | - I think it's someone who has an archive of it,
03:24:47.220 | but yeah, I understand your point.
03:24:49.020 | - Yeah, well, as the flood of data
03:24:51.820 | about each individual increases censorship,
03:24:54.580 | it becomes much more difficult
03:24:56.540 | to actually have an archive of stuff.
03:24:58.020 | But yes, for important people
03:24:59.260 | like a president of the United States, yes.
03:25:02.140 | Let me on that topic ask you about Trump.
03:25:06.540 | You were considered for a position as FDA commissioner
03:25:09.780 | in the Trump administration.
03:25:11.420 | And I think in terms of the network state,
03:25:17.740 | in terms of the digital world,
03:25:20.140 | one of the seminal acts in the history of that
03:25:24.660 | was the banning of Trump from Twitter.
03:25:27.420 | Can you make the case for it and against it?
03:25:30.340 | - Sure, so first let me talk about the FDA thing.
03:25:32.980 | So I was considered for a senior role at FDA,
03:25:36.100 | but I do believe that, and this is a whole topic,
03:25:38.780 | we can talk about the FDA.
03:25:40.100 | I do believe that just as it was easier to create Bitcoin
03:25:44.220 | than to reform the Fed,
03:25:46.180 | reforming the Fed basically still hasn't happened.
03:25:48.660 | So just as it was easier to create Bitcoin
03:25:51.500 | than to reform the Fed,
03:25:52.980 | it will literally be easier to start a new country
03:25:55.460 | than to reform the FDA.
03:25:57.660 | It may take 10 or 20 years.
03:25:59.460 | I mean, think about Bitcoin,
03:26:00.300 | it's only about 13 years old, right?
03:26:02.420 | It may take 10 or 20 years to start a new network state
03:26:05.540 | with a different biomedical policy.
03:26:07.460 | But that is how we get out from under
03:26:09.580 | perhaps the single worst thing in the world,
03:26:12.980 | which is harmonization, regulatory harmonization.
03:26:15.820 | - Can you describe regulatory harmonization?
03:26:18.740 | - Regulatory harmonization is the mechanism
03:27:21.340 | by which US regulators impose
03:27:26.620 | their regulations on the entire world.
03:26:28.820 | So basically you have a monopoly by US regulators.
03:26:32.060 | This is not just the FDA, it is SEC and FAA
03:26:36.180 | and so on and so forth.
03:26:37.340 | And for the same reason that a small company
03:26:40.300 | will use Facebook login,
03:26:42.140 | they will outsource their login to Facebook.
03:26:45.100 | A small country will outsource their regulation to the USA.
03:26:48.860 | Okay, with all the attendant issues,
03:26:50.940 | because I mean, you know the names of some politicians.
03:26:55.100 | Can you name a single regulator at the FDA?
03:26:57.900 | No, right?
03:26:58.740 | Yet they will brag on their website that they regulate,
03:27:01.060 | I forget the exact numbers,
03:27:01.940 | I think it's like 25 cents out of every dollar,
03:27:03.940 | something along those lines, okay?
03:27:05.060 | It's like double digits, okay?
03:27:06.940 | That's a pretty big deal.
03:27:08.620 | And the thing about this is,
03:27:10.300 | people will talk about quote, our democracy and so on.
03:27:12.980 | But many of the positions in quote, our democracy
03:27:15.180 | are actually not subject to democratic accountability.
03:27:18.180 | You have tenured professors
03:27:20.140 | and you have tax exempt colleges.
03:27:22.700 | You have the Sulzbergers of the New York Times,
03:27:24.380 | who have dual-class stock.
03:27:25.900 | You have a bunch of positions
03:27:29.500 | that are out of the reach of the electorate.
03:27:30.900 | And that includes regulators who have career tenure
03:27:33.820 | after just a few years of
03:27:35.340 | not necessarily even continuous service.
03:27:37.300 | So they're not accountable to the electorate.
03:27:39.380 | They're not named by the press.
03:27:41.980 | And they also aren't accountable to the market
03:27:45.020 | because you've got essentially uniform global regulations.
03:27:48.740 | Now, the thing about this is,
03:27:50.060 | it's not just a government thing.
03:27:51.500 | It's a regulatory capture thing.
03:27:53.180 | Big pharma companies like this as well.
03:27:56.100 | Because they can just get their approval in the US
03:27:58.060 | and then they can export to the rest of the world.
03:28:00.540 | I understand where that comes from as a corporate executive.
03:28:02.580 | It's such a pain to get access in one place.
03:28:04.980 | So there's a team up though,
03:28:06.020 | between the giant company and the giant government
03:28:09.580 | to box out all the small startups
03:28:11.700 | and all the small countries and lots of small innovation.
03:28:14.900 | There are cracks in this now.
03:28:16.700 | The FDA did not acquit itself well during the pandemic.
03:28:19.140 | For example, it denied, I mean, there's so many issues,
03:28:21.660 | but one of the things that even actually
03:28:23.820 | New York Times reported,
03:28:25.660 | the reason that people thought there were no COVID cases
03:28:27.900 | in the US early in the pandemic
03:28:29.740 | was because the FDA was denying people
03:28:31.540 | the ability to run COVID tests.
03:28:33.540 | And the emergency use authorization was,
03:28:36.260 | emergency should mean like right now, right?
03:28:38.740 | But it was not, it was just taking forever.
03:28:41.940 | And so some labs did civil disobedience
03:28:44.420 | and they just disobeyed the FDA
03:28:45.900 | and just went and tested, academic labs,
03:28:48.700 | under threat of federal penalties.
03:28:50.140 | 'Cause that's what they are.
03:28:50.980 | They're like the police, okay?
03:28:53.340 | And so they were sort of retroactively granted immunity
03:28:57.700 | because NYT went and ran a positive story on them.
03:29:00.340 | So NYT's authority is usually greater than that of FDA.
03:29:02.980 | If they come into a conflict, NYT runs stories,
03:29:04.940 | then FDA kind of gets spanked, right?
03:29:07.700 | And it's not, probably neither party
03:29:09.820 | would normally think of themselves that way.
03:29:11.620 | But if you look at it,
03:29:12.660 | when NYT goes and runs stories on a company,
03:29:14.940 | it names all the executives and they get all hit.
03:29:17.980 | When it runs stories on a regulator,
03:29:19.300 | it just treats the regulator usually
03:29:21.180 | as if it was just some abstract entity.
03:29:23.000 | It's Zuckerberg's Facebook, but you can't name
03:29:25.820 | the career bureaucrats at the FDA.
03:29:28.060 | Interesting, right?
03:29:28.900 | - It's very interesting.
03:29:29.740 | - It's a very important point.
03:29:30.620 | Like that person who's like named
03:29:32.540 | and their face is known.
03:29:34.220 | Like you, just as an example,
03:29:35.520 | you know Zuckerberg's face and name.
03:29:37.400 | Most people don't know Arthur G. Sulzberger.
03:29:39.100 | They couldn't recognize him, right?
03:29:40.580 | Yet he's a guy who's inherited the New York Times company
03:29:42.540 | from his father's father's father.
03:29:43.700 | That is unaccountable power.
03:29:44.900 | It's not that they get great coverage,
03:29:46.340 | it's that they get no coverage.
03:29:47.660 | You don't even think about them, right?
03:29:49.780 | And so it's invisibility, right?
03:29:52.420 | - There's some aspect why Fauci was very interesting.
03:29:55.700 | - 'Cause he was a public face.
03:29:56.540 | - In my recent memory, there's not been many faces
03:30:01.140 | of scientific policy, of science policy.
03:30:03.340 | - Yeah.
03:30:04.180 | - And he became the face of that.
03:30:05.100 | And, you know, there's that meme of his,
03:30:09.500 | which is basically saying that he is science
03:30:12.840 | or to some people represents science.
03:30:14.380 | But in the--
03:30:15.220 | - Or quote unquote science or whatever, yeah.
03:30:17.100 | The positive aspect of that is that there is accountability
03:30:19.980 | when there's a face like that.
03:30:20.980 | - Right, but you can also see the Fauci example shows you
03:30:23.100 | why a lot of these folks do not want to be public
03:30:25.900 | because they enter a political and media minefield.
03:30:28.980 | I'm actually sympathetic to that aspect of it.
03:30:30.940 | What I'm not sympathetic to is the concept that in 2022,
03:30:35.200 | that the unelected, unfireable, anonymous American regulator
03:30:40.100 | should be able to impose regulatory policy
03:30:42.300 | for the entire world.
03:30:43.860 | We are not the world of 1945.
03:30:46.640 | It is not something where these other countries
03:30:48.280 | are even consciously consenting to that world.
03:30:50.580 | Just to give an example,
03:30:52.220 | there's a concept called challenge trials, okay?
03:30:54.900 | The Moderna vaccine was available very, very early
03:30:59.340 | in the pandemic.
03:31:00.180 | You can just synthesize it from the sequence.
03:31:02.060 | And challenge trials would have meant that people
03:31:05.260 | who are healthy volunteers, okay?
03:31:08.340 | They could have been soldiers, for example, of varying ages
03:31:10.460 | who are there to risk their lives
03:31:13.060 | for their country potentially, okay?
03:31:14.460 | It could have been just healthy volunteers,
03:31:15.720 | not necessarily soldiers, just patriots of whatever kind
03:31:18.060 | in any country, not just the US.
03:31:19.980 | But those healthy volunteers could have gone.
03:31:22.860 | And at the early stages of the pandemic,
03:31:24.500 | we didn't know exactly how lethal it was gonna be
03:31:26.500 | because Li Wenliang and 30-somethings in China
03:31:30.900 | were dying from this.
03:31:31.720 | It seemed like it could be far worse.
03:31:33.060 | - How lethal the virus would be.
03:31:35.620 | - Yeah, it may be, by the way,
03:31:37.060 | that those who are the most susceptible to the virus
03:31:40.340 | died faster earlier.
03:31:42.780 | It's as if you can imagine a model
03:31:44.460 | where those who were exposed
03:31:47.600 | and had the highest susceptibility
03:31:49.080 | also had the highest severity
03:31:50.480 | and died in greater numbers early on.
03:31:52.240 | If you look at the graph,
03:31:53.600 | like deaths from COVID were exponential
03:31:55.640 | going into about April 2020
03:31:57.160 | and then leveled off to about 7,500, 10,000 a day
03:32:00.320 | and then kind of fell, right?
03:32:02.300 | But it could have gone to 75,000 at the beginning.
03:32:04.080 | So we didn't know how serious it was.
03:32:05.100 | So this would have been a real risk
03:32:06.520 | that these people would have been taking,
03:32:07.700 | but here's what they would have gotten for that.
03:32:09.560 | Basically, in a challenge trial,
03:32:11.440 | somebody would have been given the vaccine
03:32:14.660 | and then exposed to the virus
03:32:16.340 | and then put under observation.
03:32:19.200 | And then that would have given you all the data
03:32:22.060 | 'cause ultimately the synthesis of the thing,
03:32:24.340 | I mean, yes, you do need to scale up synthesis
03:32:26.100 | and manufacturing and what have you,
03:32:28.100 | but the information of whether it worked or not
03:32:31.100 | and was safe and effective,
03:32:32.900 | like that could have been gathered expeditiously
03:32:35.420 | with volunteers for challenge trials.
03:32:36.260 | - And you think there'd be a large number of volunteers?
03:32:38.540 | - Absolutely.
03:32:39.380 | - What's the concern there?
03:32:40.540 | Is there an ethical concern of taking on volunteers?
03:32:44.200 | - Well, so let me put it like this.
03:32:46.480 | Had we done that, we could have had vaccines early enough
03:32:51.040 | to save the lives of like a million Americans,
03:32:53.520 | especially seniors and so on, okay?
03:32:55.560 | Soldiers and more generally first responders and others,
03:32:59.120 | I do believe there's folks who would have stepped up
03:33:03.000 | to take that risk.
03:33:04.800 | - The heroes walk among us.
03:33:06.320 | - Yeah, that's right.
03:33:07.160 | Like if military service is something
03:33:09.040 | which is a ritualized thing, people are paid for it,
03:33:12.660 | but they're not paid that much.
03:33:14.140 | They're really paid in honor and in duty and patriotism.
03:33:18.060 | That is actually the kind of thing
03:33:19.460 | where I do believe some fraction of those folks
03:33:22.360 | would have raised their hand for this important task.
03:33:25.100 | I don't know how many of them,
03:33:26.260 | but I do think that volunteers would have been there.
03:33:28.820 | There's probably some empirical test of that,
03:33:30.460 | which is there's a challenge trials website.
03:33:32.460 | There's a Harvard prof who put out this proposal
03:33:34.280 | early in the pandemic
03:33:35.120 | and he could tell you how many volunteers he got.
03:33:38.900 | But something like that could have just shortened
03:33:42.440 | the time from pandemic to functional vaccine, right?
03:33:47.440 | To days even, if you'd actually really act on it.
03:33:51.680 | The fact that that didn't happen
03:33:53.100 | and that the Chinese solution of lockdown,
03:33:55.680 | that actually, at the beginning,
03:33:56.840 | people thought the state could potentially stop the virus,
03:33:59.980 | stop people in place.
03:34:00.820 | It turned out to be more contagious than that.
03:34:02.340 | Basically no NPI, no non-pharmaceutical intervention
03:34:05.960 | really turned out to work that much, right?
03:34:08.320 | And actually at the very beginning of the pandemic,
03:34:09.880 | I said something like, look,
03:34:11.300 | it's actually February 3rd, about a month before people,
03:34:14.240 | I was just watching what was going on in China.
03:34:15.600 | I saw that they were doing digital quarantine,
03:34:18.520 | like using WeChat codes to block people off and so on.
03:34:21.560 | I didn't know what was gonna happen, but I said, look,
03:34:23.760 | if the coronavirus goes pandemic and it seems it may,
03:34:26.120 | the extreme edge case becomes the new normal.
03:34:28.200 | It's every debate we've had on surveillance,
03:34:30.040 | deplatforming and centralization accelerated.
03:34:32.680 | Pandemic means emergency powers for the state,
03:34:35.380 | even more than terrorism or crime.
03:34:38.080 | And sometimes a solution creates the next problem.
03:34:40.560 | My rough forecast of the future,
03:34:41.760 | the coronavirus results in quarantines,
03:34:43.600 | nationalism, centralization.
03:34:45.360 | And this may actually work to stop the spread,
03:34:48.880 | but once under control,
03:34:49.760 | states will not cede their powers, so we decentralize.
03:34:52.760 | And I didn't know whether it was gonna stop the spread,
03:34:55.040 | but I knew that they were gonna try to do it, right?
03:34:57.640 | And look, it's hard to call every single thing right,
03:35:01.080 | and I'm sure someone will find some errors,
03:35:03.000 | but in general, I think that was actually pretty good
03:35:05.720 | for like early February of 2020, right?
03:35:08.200 | So it's my point though.
03:35:09.240 | The point is, rather than copying Chinese lockdown,
03:35:13.040 | what we should have had were different regimes around the world.
03:35:16.700 | To some extent, Sweden defected from this, right?
03:35:19.140 | They had like no lockdowns or what have you.
03:35:21.360 | But really the axis that people were talking about
03:35:23.800 | was lockdown versus no lockdown.
03:35:25.400 | The real axis should have been challenge trials
03:35:28.400 | versus no challenge trials.
03:35:29.320 | We could have had that in days, okay?
03:35:31.640 | And those are two examples on both vaccines and testing.
03:35:35.480 | There's so many more that I can point to.
03:35:37.720 | - So those are kind of decentralized innovations,
03:35:40.120 | and that's what FDA should stand for.
03:35:42.720 | - FDA can stand for it.
03:35:44.280 | - Or something like FDA, right?
03:35:45.960 | - Ah, so let's talk about that, right?
03:35:46.960 | Something like FDA.
03:35:47.800 | So this is very important.
03:35:49.240 | In general, the way I try to think about things is V1, V2,
03:35:53.880 | V3, as we've talked about a few times.
03:35:55.160 | - Right, so FDA, V.
03:35:57.120 | - Well, right, so what was before FDA, right?
03:35:59.320 | So there was both good and bad before FDA,
03:36:01.040 | 'cause people don't necessarily have
03:36:02.040 | the right model of the past, okay?
03:36:04.080 | So if you ask people what was there before the FDA,
03:36:06.880 | they'll say... oh, and by the way,
03:36:08.240 | the FDA itself omits the "the," right?
03:36:11.680 | Their pronouns are just FDA, FDA.
03:36:14.200 | Okay, so, but basically-
03:36:17.360 | - Why is that important?
03:36:18.880 | - It's just something where-
03:36:20.680 | - Well, why is that either humorous or interesting to you?
03:36:24.320 | - They have a sort of in-group lingo
03:36:27.320 | where when you are kind of talking about them
03:36:30.640 | the way that they talk about themselves,
03:36:33.080 | it is something that kind of piques interest.
03:36:35.040 | It's kind of like, you know, in LA,
03:36:37.680 | people say the 101 or the, you know, right?
03:36:40.360 | Whereas in Northern California, they'll say 101.
03:36:42.920 | Or people from Nevada will say Nevada, right?
03:36:46.120 | It just instantly marks you as like insider or outsider,
03:36:48.840 | okay, in terms of how the language works, right?
03:36:51.240 | And that's, go ahead.
03:36:52.880 | - I mean, it just makes me sad,
03:36:54.760 | because that lingo is part of the mechanism
03:36:56.960 | which creates the silo, the bubble of particular thoughts,
03:37:01.880 | and that ultimately deviates from the truth
03:37:03.880 | because you're not open to new ideas.
03:37:05.960 | - I think it's actually like, you know,
03:37:07.280 | in "Inglourious Basterds" there's a scene in the bar.
03:37:09.600 | Do you wanna talk about it?
03:37:10.440 | - No, but it's good.
03:37:12.680 | You can't, just to censor you,
03:37:15.080 | this is like a Wikipedia podcast, it's like Wikipedia.
03:37:17.520 | You can't cite Quentin Tarantino films, no.
03:37:21.440 | - Okay, okay, okay.
03:37:22.280 | - Sorry, take me back there.
03:37:23.440 | - Basically, like, the English start counting like
03:37:25.200 | one, two, three, four, five.
03:37:26.920 | And I believe the Germans start with, like, the thumb.
03:37:30.760 | - Something that you'd never know, right?
03:37:33.080 | I may be misremembering it, but I think that's right.
03:37:35.000 | Okay, so that's like--
03:37:35.840 | - FDA's got the lingo, all right.
03:37:38.160 | - Right, so FDA's got the lingo.
03:37:39.080 | So coming back up, basically, just talk about FDA
03:37:42.960 | and then come back to your question on the platform.
03:37:45.080 | - So what was V0, FDA, what's V1?
03:37:47.760 | What does the future look like?
03:37:49.360 | - V1 was quote, patent medicines, okay?
03:37:53.200 | That's what some people say.
03:37:53.200 | But V1 was also Banting and Best, okay?
03:37:55.600 | Banting and Best, they won the Nobel Prize
03:37:58.360 | in the early 1920s, right?
03:38:01.760 | They came up with the idea for insulin supplementation
03:38:06.400 | to treat diabetes.
03:38:08.040 | And they came up with a concept,
03:38:09.800 | they experimented on dogs, they did self-experimentation,
03:38:13.320 | they had healthy volunteers,
03:38:14.520 | they experimented with the formulation as well, right?
03:38:16.920 | Because just like you'd have like a web app and a mobile app,
03:38:20.320 | maybe a command line app, you could have, you know,
03:38:22.640 | a drug that's administered orally or via injection
03:38:25.240 | or cream or, you know, there's different formulations,
03:38:27.280 | right, dosage, all that stuff,
03:38:28.880 | they could just like iterate on, okay,
03:38:30.720 | with willing doctor, willing patient.
03:38:33.480 | These, you know, these folks who were affected
03:38:37.040 | just sprang out of bed,
03:38:38.400 | the insulin supplementation was working for them.
03:38:40.440 | And within a couple of years,
03:38:41.920 | they had won the Nobel Prize and Eli Lilly
03:38:43.880 | had scaled production
03:38:44.720 | for the entire North American continent, okay?
03:38:47.080 | So that was a time when pharma moved at the speed of software
03:38:51.840 | when it was willing buyer, willing seller, okay?
03:38:54.760 | Because the past is demonized
03:38:56.640 | as something that our glorious regulatory agency
03:38:58.880 | is protecting us from, okay?
03:39:01.080 | But there's so many ways
03:39:02.960 | in which what it's really protecting you from
03:39:04.440 | is being healthy, okay?
03:39:06.720 | As, you know, I mean, there's a zillion examples of this,
03:39:10.000 | I won't be able to recapitulate all of them
03:39:11.720 | just in this podcast,
03:39:12.560 | but if you look at a post that I've got,
03:39:15.320 | it's called "Regulation, Disruption, and the
03:39:19.640 | Technologies of 2013," a Coursera PDF, okay?
03:39:25.680 | This lecture, which I'll kind of link it here
03:39:29.600 | so you can maybe put in the show notes if you want,
03:39:32.360 | this goes through like a dozen different examples
03:39:35.760 | of crazy things the FDA did
03:39:38.360 | from the kind of stuff that was dramatized
03:39:41.000 | in Dallas Buyers Club,
03:39:42.120 | where they were preventing people from getting AIDS drugs
03:39:45.360 | to, you know, their, you know, various attacks on,
03:39:49.760 | you know, quote, "raw milk,"
03:39:52.160 | where they were basically saying,
03:39:53.760 | here's a quote from an FDA filing in 2010,
03:39:58.760 | "There's no generalized right to bodily and physical health.
03:40:02.360 | There's no right to consume or feed children
03:40:04.400 | any particular food.
03:40:05.520 | There's no fundamental right to freedom of contract."
03:40:08.920 | They basically feel like they own you.
03:40:10.880 | You're not allowed to make your own decisions
03:40:12.640 | about your food.
03:40:13.720 | There's no generalized right to bodily and physical health,
03:40:15.720 | direct quote from their like written kind of thing, okay?
03:40:19.400 | The general frame is usually that FDA says
03:40:22.360 | it's protecting you from the big bad company,
03:40:25.280 | but really what it's doing is it's preventing you
03:40:27.600 | from opting out, okay?
03:40:29.200 | Now, with that said,
03:40:30.600 | and this is where I'm talking about V3,
03:40:33.360 | as critical as I am of FDA or the Fed for that matter,
03:40:37.080 | I also actually recognize that like the Ron Paul type thing
03:40:41.520 | of end the Fed is actually not practical.
03:40:44.000 | End the Fed will just be laughed at.
03:40:45.680 | What Bitcoin did was a much, much, much more difficult task
03:40:49.120 | of building something better than the Fed.
03:40:52.080 | That's really difficult to do because the Fed and the FDA,
03:40:55.960 | they're like the hub of the current system.
03:40:57.960 | People rely on them for lots of different things, okay?
03:41:00.760 | And you're gonna need a better version of them
03:41:05.560 | and how would you actually build something like that?
03:41:07.040 | So with the Fed and with SEC and the entire,
03:41:12.040 | the banks and whatnot,
03:41:14.000 | crypto has a pretty good set of answers for these things.
03:41:17.120 | And over time, all the countries that are not,
03:41:21.400 | or all the groups that are not the US establishment
03:41:23.520 | or the CCP will find more and more to their liking
03:41:26.560 | in the crypto economy.
03:41:27.680 | So that part I think is going, okay?
03:41:29.520 | We can talk about that.
03:41:31.400 | What does that look like for biomedicine?
03:41:33.800 | Well, first, what does exit the FDA look like, right?
03:41:36.360 | So there actually are a bunch of exits from the FDA already,
03:41:40.960 | which is things like right to try laws, okay?
03:41:45.320 | CLIA labs and laboratory developed tests,
03:41:47.480 | compounding pharmacies, off-label prescription by doctors,
03:41:51.000 | and countries that aren't fully harmonized with FDA.
03:41:53.280 | For example, Kobe Bryant, before he passed away,
03:41:56.680 | went and did stem cell treatments in Germany, okay?
03:41:59.920 | Stem cells have been pushed out,
03:42:01.520 | I think in part by the Bush administration,
03:42:03.000 | but also by other things.
03:42:04.320 | So those are different kinds of exits.
03:42:07.160 | Right to try basically means at the state level,
03:42:10.040 | you can just try the drug, okay?
03:42:12.520 | CLIA labs and LDTs, that means that's a path
03:42:17.120 | where you don't have to go through FDA
03:42:18.400 | to get a new device approved.
03:42:19.480 | You can just run it in a lab, okay?
03:42:21.760 | Compounding pharmacies, these were under attack.
03:42:23.600 | I'm not sure actually where the current statute is on this,
03:42:25.960 | but this is the idea that a pharmacist has some discretion
03:42:28.520 | in how they prepare mixtures of drugs.
03:42:31.640 | Off-label prescription by MDs.
03:42:34.560 | So MDs have enough weight in the system
03:42:36.840 | that they can kind of push back on FDA.
03:42:38.200 | And off-label prescription is the concept
03:42:40.440 | that a drug that's approved for purpose A
03:42:41.800 | can be prescribed for purpose B or C or D
03:42:44.160 | without going through another whole new drug approval process
03:42:47.800 | and then countries that aren't harmonized, right?
03:42:49.400 | So those are like five different kinds of exits
03:42:51.800 | from the FDA on different directions.
03:42:54.160 | So first those exits exist.
03:42:55.360 | So for those people who are like,
03:42:56.720 | "Oh my God, we're all gonna die
03:42:58.320 | or he's gonna poison us with non-FDA approved things
03:43:00.920 | or whatever," right?
03:43:02.120 | Like those exits exist.
03:43:04.520 | You've probably actually used tests or treatments from those.
03:43:07.720 | You don't even realize that you have, right?
03:43:09.480 | So it hasn't killed you, number one.
03:43:11.480 | Number two is actually testing for safety.
03:43:14.040 | You know, there's safety, efficacy, and like comparative.
03:43:16.960 | Safety is actually relatively easy to test for.
03:43:19.960 | There's very few drugs that are like,
03:43:22.280 | there's TGN-1412, that's a famous example
03:43:24.560 | of something that was actually really dangerous to people,
03:43:26.880 | right, with an early test.
03:43:28.240 | So those do exist, just acknowledge they do exist.
03:43:30.600 | But in general, testing for safety
03:43:32.360 | is actually not that hard to do, okay?
03:43:34.840 | And if something is safe,
03:43:36.200 | then you should be able to try it usually, okay?
03:43:38.840 | Now, what does that decentralized FDA look like?
03:43:40.600 | Well, basically you take individual pieces of it
03:43:43.000 | and you can often turn them into vehicles.
03:43:46.160 | And this is like 50 different startups,
03:43:49.360 | but let me describe some of them.
03:43:50.640 | First, have you gotten any drugs
03:43:52.640 | or something like that recently?
03:43:53.600 | I mean, like prescribed drugs, prescription drugs,
03:43:55.280 | and it was like--
03:43:56.120 | - Now that you clarify, the answer's no.
03:43:57.760 | - Yeah. (laughs)
03:43:59.280 | - Prescribed drugs, no.
03:44:00.840 | - Okay, so--
03:44:01.680 | - Not long, maybe antibiotics a long time ago, maybe.
03:44:04.680 | But no. - All right.
03:44:05.520 | So you know how you have like a,
03:44:07.120 | sort of like a wadded up chemistry textbook,
03:44:09.680 | the package insert that goes into the, right?
03:44:11.640 | - Yes, yes, yes. - Okay.
03:44:12.960 | (laughs)
03:44:13.800 | - It's a wadded up chemistry textbook, I love it.
03:44:15.560 | That's what it is, right?
03:44:16.400 | That's what a terrible user interface,
03:44:18.120 | we don't usually think of it that way.
03:44:19.520 | Why is your user interface so terrible?
03:44:20.800 | That's a web of regulation that makes it so terrible.
03:44:23.520 | And there's actually guys who tried to innovate
03:44:26.360 | just on user interface called, like, Help, I Need Help.
03:44:29.000 | That was like the name of the company a while back.
03:44:31.280 | And it was trying to explain the stuff in plain language.
03:44:33.760 | Okay.
03:44:35.120 | Just on user interface, you can innovate.
03:44:36.880 | And why is it important?
03:44:37.840 | Well, there's a company called PillPack,
03:44:41.360 | which innovated on, quote, the user interface for drugs
03:44:43.800 | by giving people a thing which had like a daily blister pack.
03:44:47.840 | So it's like, here's your prescription
03:44:50.720 | and you're supposed to take all these pills
03:44:52.160 | on the first and second.
03:44:53.480 | And basically, whether you had taken them on a given day
03:44:57.520 | was manifest by whether you had opened it
03:44:59.480 | for that specific day, okay?
03:45:01.600 | This is way better than other kinds
03:45:03.120 | of so-called compliance methodologies.
03:45:04.720 | Like, there are guys who tried to do like an IoT pill
03:45:07.080 | where when you swallow it, it like gives you measurements.
03:45:09.400 | This was just a simple innovation on user interface
03:45:11.720 | that boosted compliance in the sense of compliance
03:45:14.280 | with the drug regimen dramatically, right?
03:45:16.240 | And I think they got acquired or what have you
03:45:18.720 | for a lot of money.
03:45:19.640 | - And hopefully utilized effectively.
03:45:22.240 | - Utilized effectively, right?
03:45:23.080 | - Although sometimes these companies
03:45:24.880 | that do incredible innovation,
03:45:26.040 | it really makes you sad when they get acquired
03:45:27.800 | that that leads to their death, not their scaling.
03:45:30.520 | - Sure.
03:45:31.360 | I mean, they did a lot of other good things,
03:45:32.200 | but this was one thing that they did well, right?
03:45:34.600 | So PillPack just shows what you can get
03:45:36.360 | with improving on user interface.
03:45:38.720 | Why can't, I mean, we get reviews for everything, right?
03:45:42.160 | One thing that, you know, like people have sort of,
03:45:44.400 | in my view, somewhat quoted out of context.
03:45:45.640 | They're like, "Oh, Balaji thinks you should replace
03:45:47.400 | the FDA with a Yelp for drugs."
03:45:48.840 | Actually, there's something called phase four, okay,
03:45:51.800 | of the FDA, which is so-called post-market surveillance.
03:45:54.480 | Do you know that that's actually something where,
03:45:58.240 | in theory, you can go and fill out a form
03:46:01.160 | on the FDA website, which basically says,
03:46:03.960 | I've had a bad experience with a drug.
03:46:07.240 | - Not like VAERS, but for drugs.
03:46:09.600 | - Yeah, so it's called MedWatch, right?
03:46:12.160 | And so you can do voluntary reporting,
03:46:15.320 | and you can get a PDF and just upload it, right?
03:46:20.320 | - Is this a government, like is this the .gov?
03:46:24.640 | - Yeah, it's form 3500B.
03:46:26.240 | - I love it.
03:46:27.360 | It's HTML.
03:46:28.600 | It's gonna be from the '90s.
03:46:31.720 | It's gonna have an interface designed by somebody
03:46:36.200 | who's a COBOL/Fortran programmer.
03:46:39.360 | - Right, here we go.
03:46:40.200 | So here we go.
03:46:41.040 | So basically, the 3500B--
03:46:43.760 | - I hope to be proven wrong on that, by the way.
03:46:46.080 | - So 3500B, consumer voluntary reporting.
03:46:49.400 | When do I use this form?
03:46:50.440 | You were hurt or had a bad side effect.
03:46:52.040 | Use a drug which led to unsafe use, et cetera.
03:46:54.240 | The point is, FDA already has a terrible Yelp for drugs.
03:46:58.200 | It has a terrible version of it.
03:47:00.080 | What would the good version look like?
03:47:01.120 | The fact that you've never,
03:47:01.960 | I mean, the fact you have to fill out a PDF
03:47:03.600 | to go and submit a report.
03:47:04.440 | How do you submit a report at Yelp or Uber or Airbnb
03:47:07.720 | or Amazon, you tap, and there's star ratings, right?
03:47:10.120 | So just modernizing FDA 3500B and modernizing phase four,
03:47:15.120 | okay, is a huge thing.
03:47:18.520 | - Is it, can you comment on that?
03:47:19.640 | Is there, what incentive mechanism forces
03:47:23.320 | the modernization of that kind of thing?
03:47:26.360 | - Here's how it would work, or one possible.
03:47:27.840 | - To create an actual Yelp.
03:47:29.160 | - Yeah, here's how that would work, right?
03:47:31.800 | You go to the pharmacy or wherever,
03:47:36.440 | and you hold up your phone
03:47:37.280 | and you scan the barcode of the drug, okay?
03:47:39.080 | What do you see?
03:47:39.920 | Instantly, you see global reports, right?
03:47:42.480 | By the way, because your biology,
03:47:44.120 | your physiology, that's global, right?
03:47:46.080 | Information from Brazil or from Germany or Japan
03:47:51.080 | on their physiological reaction
03:47:54.160 | to the same drug you're taking is useful to you.
03:47:56.800 | It's not like a national boundaries thing.
03:47:58.480 | So the whole nation-state model
03:47:59.800 | of only collecting information from other Americans,
03:48:02.440 | really you want a global kind of thing,
03:48:03.680 | just like Amazon book reviews, that's a global thing.
03:48:07.560 | Other things aggregate at the global level, okay?
03:48:09.960 | So what you want is to see every report from every patient
03:48:14.960 | and every doctor around the world on this drug.
03:48:17.920 | That might be really important
03:48:19.360 | to your rare or semi-rare condition.
03:48:22.080 | Just that alone would be a valuable site.
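The global, star-rated drug-report aggregation described here can be sketched in miniature. Everything below is hypothetical, purely to illustrate pooling reports worldwide rather than per nation-state; no real drug data or API is assumed:

```python
from collections import Counter
from statistics import mean

# Hypothetical patient reports: (country, star_rating, note).
# In the idea described above, these would come from patients and
# doctors worldwide after scanning a drug's barcode.
reports = [
    ("US", 4, "mild headache"),
    ("BR", 2, "nausea"),
    ("DE", 5, "no side effects"),
    ("JP", 3, "drowsiness"),
    ("BR", 1, "severe nausea"),
]

def summarize(reports):
    """Aggregate star ratings globally, plus a per-country report count."""
    ratings = [stars for _, stars, _ in reports]
    return {
        "global_avg": round(mean(ratings), 2),
        "n_reports": len(ratings),
        "by_country": dict(Counter(country for country, _, _ in reports)),
    }

summary = summarize(reports)
print(summary)
```

The point of the sketch is that aggregation ignores borders: a report from Brazil counts the same as one from the US, exactly like an Amazon review.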
03:48:24.520 | - Who builds that site?
03:48:25.520 | It sounds like something created by capitalism.
03:48:28.840 | It sounds like it would have to be a company.
03:48:31.520 | - Yeah, you can definitely do it.
03:48:32.360 | - But we don't have a world where a company
03:48:34.920 | is allowed to be in charge of that kind of thing.
03:48:36.440 | - Well, I don't know.
03:48:37.280 | - Google Health went down.
03:48:38.600 | It just seems like a lot of the...
03:48:41.040 | - So it depends, right?
03:48:42.160 | Basically, this is why you have
03:48:43.480 | to pick off individual elements, right?
03:48:45.680 | There's essentially a combination of first recognizing
03:48:48.280 | that the FDA is actually bad.
03:48:50.720 | You need to be able to say that.
03:48:52.240 | Let me put it like this.
03:48:53.200 | It does a lot of bad things.
03:48:54.520 | It is something which you need to be able to criticize.
03:48:57.560 | You might be like, "Well, that's obvious," right?
03:48:59.800 | Well, in 2010, for example, there's a book that came out,
03:49:02.400 | if anybody wants to understand FDA,
03:49:04.040 | it's called "Reputation and Power."
03:49:05.680 | - Yeah, a lot of people don't wanna criticize FDA.
03:49:08.400 | - Yeah, because they will retaliate
03:49:09.840 | against your biotech or pharma company.
03:49:11.560 | - Yeah, and that retaliation can be initiated
03:49:13.760 | by a single human being.
03:49:15.120 | - Absolutely.
03:49:15.960 | The best analogy is, you think about the TSA, okay?
03:49:19.480 | Have you flown recently?
03:49:20.440 | - Yeah. - Okay.
03:49:21.740 | Do you make any jokes about the TSA
03:49:23.200 | when you're in the TSA line?
03:49:24.600 | - Usually you don't want to,
03:49:26.520 | but they're a little more flexible.
03:49:28.400 | You know what, can I tell a story?
03:49:29.840 | - Sure. - Which is,
03:49:31.000 | it was similar to this.
03:49:33.480 | I was in Vegas at a club.
03:49:36.380 | I don't go to clubs.
03:49:37.920 | I got kicked out for the,
03:49:39.520 | I think the first time in my life,
03:49:41.560 | for making a joke with a bouncer.
03:49:43.600 | 'Cause I had a camera with me,
03:49:46.720 | and you're not allowed to have a camera,
03:49:48.680 | and I said, "Okay, cool, I'll take it out."
03:49:52.120 | But I made a funny joke that I don't care to retell.
03:49:55.760 | But he was just a little offended.
03:49:57.560 | He was like, "I don't care who you are.
03:49:59.560 | "I don't care who you're with."
03:50:01.480 | And then he proceeded to list for me
03:50:04.840 | the famous people he has kicked out of that club.
03:50:07.760 | But there is, I mean, all of those,
03:50:10.240 | the reason I made the joke is I sensed
03:50:13.080 | that there was an entitlement in this particular individual,
03:50:17.080 | like where the authority has gone to his head.
03:50:19.840 | - Respect my authority. - Yeah.
03:50:21.320 | I almost wanted to poke at that.
03:50:23.360 | And I think the poking, the authority,
03:50:25.240 | I quickly learned the lesson.
03:50:26.840 | I have now been rewarded with the pride I feel
03:50:31.320 | for having poked authority,
03:50:32.960 | but now I'm kicked out of the club
03:50:35.080 | that would have resulted in a fun night
03:50:37.560 | with friends and so on.
03:50:38.480 | Instead, I'm standing alone crying in Vegas,
03:50:41.120 | which is not a unique Vegas experience.
03:50:42.960 | It's actually a fundamental Vegas experience.
03:50:45.320 | But that, I'm sure, that basic human nature
03:50:48.080 | happens in the FDA as well.
03:50:49.720 | - That's exactly right.
03:50:50.640 | So just like with the TSA, just to extend the analogy,
03:50:54.720 | when you're in line at the TSA, right?
03:50:57.360 | You don't wanna miss your flight.
03:50:58.400 | That could cost you hundreds of dollars.
03:51:00.480 | And so you comply with absolutely ludicrous regulations
03:51:03.520 | like, oh, three ounce bottles.
03:51:05.720 | Well, you know what?
03:51:06.560 | You can take an unlimited number of three ounce bottles
03:51:08.400 | and you can combine them into a six ounce bottle
03:51:11.680 | through the terrorist technology called mixing.
03:51:13.960 | Okay, advanced, right?
03:51:15.840 | And the thing about this is everybody in line,
03:51:19.280 | actually some fairly high, let's say,
03:51:22.800 | call it influence or net worth or whatever, people fly.
03:51:25.840 | Millions and millions and millions of people are subject
03:51:28.720 | to these absolutely moronic regulations.
03:51:30.920 | It's all security theater; I think that's Schneier's term.
03:51:35.680 | A lot of people know this term.
03:51:36.800 | So millions of people are subject to it.
03:51:38.560 | It costs untold billions of dollars in terms of delays
03:51:42.080 | and what have you, right?
03:51:43.960 | It irradiates people.
03:51:46.000 | And this is another FDA thing, by the way.
03:51:47.480 | This is an FDA-TSA team up, okay?
03:51:49.880 | In 2010, the TSA body scanners,
03:51:52.720 | there were concerns expressed,
03:51:53.920 | but when it's a government to government thing,
03:51:56.040 | see a dot com is treated with extreme scrutiny by FDA,
03:51:58.560 | but it's another dot gov,
03:51:59.520 | well, they're not trying to make a profit.
03:52:00.680 | So they kind of just wave them on through, okay?
03:52:03.000 | So these body scanners were basically like applied
03:52:06.440 | to millions and millions of people
03:52:08.520 | in this huge kind of opt-in experiment.
03:52:10.480 | Almost, I think it's quite likely by the way,
03:52:12.760 | that if there was even a slightly increased cancer risk,
03:52:16.360 | that the net morbidity and mortality
03:52:19.160 | from those would have outweighed the deaths from terrorism
03:52:21.440 | or whatever that were prevented, right?
03:52:23.000 | You can work out the numbers,
03:52:24.280 | but if you just do the math under reasonable assumptions,
03:52:26.640 | it's probably true.
03:52:27.880 | If it had any increased morbidity and mortality.
03:52:29.760 | I've not seen the recent things,
03:52:30.880 | but I've seen that concern expressed 12 years ago.
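The back-of-envelope comparison being gestured at here can be made concrete. Every number below is an illustrative assumption, not a measured figure; the point is only the shape of the calculation:

```python
# Hypothetical back-of-envelope: could X-ray body scanners cost more lives
# than the terrorism deaths they prevent? All inputs are assumptions.
passengers_per_year = 700_000_000        # assumed annual US screenings
years = 10                               # assumed program duration
screenings_per_excess_death = 10_000_000 # assumed: 1 fatal cancer per 10M scans
deaths_prevented_per_year = 10           # assumed terrorism deaths averted

scanner_deaths = passengers_per_year * years / screenings_per_excess_death
prevented = deaths_prevented_per_year * years

print(f"Estimated scanner-induced deaths: {scanner_deaths:.0f}")
print(f"Estimated deaths prevented:       {prevented}")
print("Net harmful under these assumptions:", scanner_deaths > prevented)
```

Under these assumed inputs the induced deaths dominate; with a different risk estimate the inequality could flip, which is exactly why the speaker hedges it as "under reasonable assumptions."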
03:52:33.820 | Point being that despite the cost,
03:52:37.320 | despite how many people are exposed to it,
03:52:39.000 | despite how obviously patently ludicrous it is,
03:52:42.000 | you don't make any trouble,
03:52:43.200 | nor do people organize protests or whatever about this,
03:52:46.160 | because it's something where people,
03:52:49.760 | the security theater of the whole thing is part of it.
03:52:53.080 | Oh, well, if we took them away,
03:52:54.040 | there'd be more terrorism or something like that.
03:52:55.640 | People think, right?
03:52:57.040 | - But it is fascinating to see that the populace
03:53:00.760 | puts up with it, 'cause it doesn't,
03:53:03.120 | one of my favorite things is to listen to Jordan Peterson,
03:53:05.480 | who I think offline, but I think also on the podcast,
03:53:10.040 | you know, is somebody who resists authority in every way.
03:53:12.720 | And even he goes to TSA with a kind of suppressed,
03:53:17.720 | like all the instructions, everything down to,
03:53:20.920 | whenever you have like the yellow thing for your feet,
03:53:24.500 | they force you to adjust it even slightly if you're off.
03:53:27.880 | Just even, I mean, it's like, it's a Kafka novel.
03:53:32.320 | We're living, like TSA, it makes me smile.
03:53:34.960 | It brings joy to my heart, because I imagine
03:53:37.080 | Franz Kafka and I just walking through there,
03:53:40.860 | because it really is just deeply absurd.
03:53:45.120 | But, and then the whole motivation of the mechanism
03:53:49.840 | becomes distorted by the individuals involved.
03:53:52.120 | The initial one was to reduce the number
03:53:53.880 | of terrorist attacks, I suppose.
03:53:55.360 | - Right, now it's guns and drugs.
03:53:56.600 | Basically, it's like, essentially what they've done
03:53:58.560 | is they've repealed the Fourth Amendment, right?
03:54:00.620 | Search and seizure, they can do it without probable cause.
03:54:03.840 | Everybody is being searched.
03:54:04.840 | Everybody's a potential terrorist.
03:54:06.000 | So they've got probable cause for everybody, in theory.
03:54:08.320 | And so what they do, they'll post on their website
03:54:10.860 | the guns and drugs or whatever
03:54:12.020 | that they seized in these scanners.
03:54:13.220 | Well, of course, if you search everybody,
03:54:15.020 | you're gonna find some criminals or whatever.
03:54:17.500 | But the cost of doing that is dramatic.
03:54:20.020 | Moreover, the fact that people have sort of been trained
03:54:22.340 | to have compliance, it's like the Soviet Union, right?
03:54:25.460 | Where, just grudgingly, all right,
03:54:27.980 | go along with this extremely stupid thing.
03:54:29.700 | What's my point?
03:54:30.680 | The point is, this is a really stupid regulation
03:54:34.540 | that has existed in plain sight of everybody for 20 years.
03:54:36.860 | We're still taking off our shoes, okay,
03:54:39.260 | because some shoe bomber, whatever number of years ago,
03:54:41.740 | okay, all of this stuff is there,
03:54:44.460 | as opposed to, there's a zillion other things
03:54:46.680 | you could potentially do, different paradigms
03:54:48.100 | for, quote, airport security.
03:54:49.200 | But now apply that to FDA.
03:54:51.500 | Just like a lot of what TSA does is security theater,
03:54:54.220 | arguably all of it, a lot of what FDA does is safety theater.
03:54:57.500 | The difference is, there's far fewer people
03:54:59.580 | who go through the aperture.
03:55:00.780 | They're the biotech and pharma CEOs, okay?
03:55:03.780 | So you don't have an understanding
03:55:05.540 | of what it is to deal with them, number one.
03:55:07.380 | Number two is, the penalty is not a few hundred dollars
03:55:10.100 | of missing your flight.
03:55:11.580 | It is a few million dollars, or tens,
03:55:14.060 | or hundreds of millions of dollars
03:55:15.060 | for getting your company subject to the equivalent
03:55:18.060 | of a retaliatory wait time,
03:55:19.180 | just like that bouncer threw you out,
03:55:20.780 | just like the TSA officer, if you make a joke,
03:55:22.900 | or they can just sit you down
03:55:24.140 | and make you lose your flight, right?
03:55:25.780 | So too, can the FDA just silently impede
03:55:30.780 | the approval of something and choke you out financially
03:55:34.060 | because you don't have enough runway to get funded, right?
03:55:36.660 | So just impose more wait time.
03:55:38.140 | Guess what, we want another six months.
03:55:40.020 | Data's gonna take another six months,
03:55:41.300 | your company doesn't have the time, you die, right?
03:55:44.140 | If you live, you have to raise a round
03:55:46.500 | at some dilutive valuation,
03:55:48.500 | and now the price gets jacked up on the other side.
03:55:50.700 | That's the one thing that can give, by the way,
03:55:52.220 | in this whole process.
03:55:53.820 | When you push out timelines from days
03:55:57.500 | to get a vaccine approved, or a vaccine evaluated, rather,
03:56:01.180 | via challenge trials, to months or years,
03:56:03.980 | the cost during that time, when you,
03:56:06.620 | it just increases non-linearly, right?
03:56:08.780 | Because you can't iterate on the product.
03:56:11.260 | All the normal observations,
03:56:12.620 | if it takes you 10 years to launch a product
03:56:14.100 | versus 10 days, what's the difference
03:56:15.900 | in terms of your speed of variation,
03:56:16.820 | your cost, et cetera, right?
03:56:18.100 | So this is part of what, it's not the only thing,
03:56:19.820 | there's other things, there's AMA and CPT,
03:56:21.900 | there's other things, but this is one of the things
03:56:24.820 | that jacks up prices in the US medical system, okay?
03:56:28.140 | So now you have something where these CEOs,
03:56:31.340 | they're going through this aperture,
03:56:33.020 | they can't tell anybody about it,
03:56:36.140 | because if you read "Reputation and Power,"
03:56:38.500 | okay, I'm gonna just quote this,
03:56:40.500 | because it's an amazing, amazing book, right?
03:56:42.660 | It's written by a guy, Daniel Carpenter,
03:56:44.860 | a smart guy, but he's an FDA sympathizer.
03:56:47.300 | He fundamentally thinks it's a good thing, or what have you.
03:56:49.940 | Nevertheless, I respect Carpenter's intellectual honesty,
03:56:53.180 | because he quotes the CEOs in the book verbatim,
03:56:57.900 | and he gives some paragraphs.
03:56:59.260 | And essentially, from their descriptions,
03:57:00.900 | it's like, think about like a Vietnam War thing,
03:57:03.820 | where you've got a POW, and they're like blinking
03:57:06.900 | through their eyes, being tortured, okay?
03:57:09.620 | That is the style, when you read Carpenter's book,
03:57:12.240 | you read the quotes from these CEOs,
03:57:16.980 | hold on, let me see if I can find it.
03:57:18.660 | - Do you recommend the book?
03:57:19.780 | - It's a good book, yeah.
03:57:20.940 | Though it's now a little bit outdated, okay?
03:57:23.740 | Because it's like, you know, almost 10 years old.
03:57:26.020 | Still, as a history of the FDA, it is well worth reading.
03:57:30.100 | And by the way, the reason I say it,
03:57:31.020 | like the FDA is so insanely important.
03:57:34.620 | It's so much more important than many other things
03:57:36.220 | that people talk about, but they don't talk about it, right?
03:57:39.280 | I just wanna read his little blurb for it, right?
03:57:41.580 | This is 2010.
03:57:42.420 | "The U.S. Food and Drug Administration
03:57:43.700 | "is the most powerful regulatory agency in the world.
03:57:45.640 | "How did the FDA become so influential,
03:57:47.280 | "and how does it wield its extraordinary power?
03:57:50.640 | "Reputation and power traces the history
03:57:52.180 | "of FDA regulation of pharmaceuticals,
03:57:54.460 | "revealing how the agency's organizational reputation
03:57:56.700 | "has been the primary source of its power,
03:57:58.140 | "yet it's also one of its ultimate constraints.
03:58:00.680 | "Carpenter describes how the FDA cultivated a reputation
03:58:03.300 | "for competence and vigilance throughout the last century,
03:58:05.480 | "and how this organizational image
03:58:07.260 | "has enabled the agency to regulate
03:58:09.500 | "while resisting efforts to curb its own authority."
03:58:12.020 | First of all, just that description alone,
03:58:14.460 | you're like, wait a second,
03:58:16.580 | he is describing this as an active player.
03:58:19.460 | It's not like a DMV kind of thing, which is passed through.
03:58:23.660 | It's talking about cultivating a reputation,
03:58:26.580 | its power, resisting efforts to curb its own authority.
03:58:30.780 | The thing is, now you're kind of through the looking glass.
03:58:33.860 | You're like, wait a second, this is kind of language
03:58:35.940 | I don't usually hear for regulatory agencies.
03:58:38.620 | The thing is, the kind of person who becomes the CEO
03:58:41.580 | of a giant company, what do they wanna do?
03:58:43.180 | They wanna expand that company.
03:58:45.480 | They wanna make more profit.
03:58:48.560 | Similarly, the kind of person who comes
03:58:49.940 | to run a regulatory agency or one of the subunits,
03:58:53.420 | that person wants to expand its ambit.
03:58:55.640 | Okay?
03:58:58.260 | - By the way, is that always obvious,
03:58:59.460 | and sorry to interrupt, but for the CEO of the company,
03:59:01.780 | I know that the philosophical ideal of capitalism
03:59:05.220 | is you want to make the thing more profitable,
03:59:07.340 | but we're also human beings.
03:59:10.340 | Do you think there's some fundamental aspect
03:59:12.260 | to which we wanna do a lot of good in the world?
03:59:14.300 | - Sure, but the fiduciary duty will push people that way.
03:59:17.000 | The ambitious, the profit-maximizing,
03:59:20.060 | expansionist CEO is selected for, right?
03:59:23.800 | Basically, they believe, crucially,
03:59:25.200 | they're not just, this is important,
03:59:26.620 | they're not just, I mean, some of them are
03:59:28.760 | Gordon Gekko, make as much money as possible,
03:59:30.560 | but they believe in the mission, okay?
03:59:32.640 | They've come to believe in the mission,
03:59:33.680 | and that is the person who's selected.
03:59:35.080 | Chomsky actually had this good thing,
03:59:36.160 | which is like, I believe that you believe what you believe,
03:59:39.440 | but if you didn't believe what you believe,
03:59:40.560 | you wouldn't be sitting here.
03:59:41.600 | - Right, so they select for the kind of people
03:59:43.520 | that are able to make a lot of money,
03:59:45.700 | and in that process, those people are able
03:59:48.800 | to construct a narrative that they're doing good,
03:59:52.400 | even though what they were selected for
03:59:54.160 | is the fact that they can make a lot of money.
03:59:55.480 | - Yeah, and they may actually be doing good,
03:59:57.080 | but the thing is, with CEOs, we have a zillion images
04:00:00.600 | in television and media movies of the evil corporation
04:00:04.040 | and the greedy CEO.
04:00:04.960 | We have some concept of what CEO failure modes are like,
04:00:08.320 | okay?
04:00:09.140 | Now, when have you ever seen an evil regulator?
04:00:11.340 | Have you, can you name a fictional portrayal
04:00:14.160 | of an evil regulator?
04:00:15.160 | Can you name an evil CEO?
04:00:16.000 | There's tons. - Yeah, a lot, a lot,
04:00:17.640 | a lot, a lot.
04:00:19.040 | But that's so interesting.
04:00:20.200 | I'm trying, I'm searching for a deeper lesson here.
04:00:22.680 | You're right.
04:00:23.760 | You're right.
04:00:26.200 | I mean, there is portrayals,
04:00:30.080 | especially in sort of authoritarian regimes
04:00:33.280 | or the Soviet Union, where there's bureaucracy,
04:00:36.400 | you know, Chernobyl, you can kind of see
04:00:39.080 | within that there's the story of the regulator,
04:00:41.880 | but yeah, it's not as plentiful,
04:00:45.200 | and it also often doesn't have a face to it.
04:00:47.400 | It's almost like bureaucracy is this amorphous thing
04:00:51.480 | that results, any one individual you see,
04:00:53.920 | they're just obeying somebody else.
04:00:55.640 | There's not a face to it of evil.
04:00:58.480 | - That's right.
04:00:59.320 | - The evil is the entire machine.
04:01:01.040 | - That's right.
04:01:01.880 | That's what I call the school of fish strategy, by the way.
04:01:04.080 | It's something where you are an individual
04:01:06.240 | and you can be singled out,
04:01:08.040 | but there's more accountability for one person's bad tweets
04:01:11.420 | than all the wars in the Middle East, right?
04:01:13.400 | Because it's a school of fish.
04:01:15.200 | - Yeah. - Right?
04:01:16.300 | So if the establishment is wrong,
04:01:18.640 | if the bureaucracy is wrong,
04:01:20.200 | they're all wrong at the same time.
04:01:21.240 | Who could have known?
04:01:22.560 | Whereas if you deviate,
04:01:24.960 | then you are a deviation who can be hammered down, okay?
04:01:28.760 | Now, the school of fish strategy is,
04:01:30.920 | unfortunately, very successful because,
04:01:33.840 | you know, truth is whatever.
04:01:35.120 | If you just always ride with the school of fish
04:01:37.040 | and turn when they turn and so on,
04:01:38.880 | unless there's a bigger school of fish that comes in,
04:01:41.920 | you basically can never be proven wrong, right?
04:01:44.200 | And this is actually, you know, of course,
04:01:46.160 | someone who believes in truth
04:01:47.420 | and believes in, you know, innovation and so on,
04:01:50.040 | just physiologically can't ride with the school of fish.
04:01:52.420 | You just have to say what is true or do what is true, right?
04:01:55.060 | Still, you've described correctly, you know,
04:01:57.420 | how it's faceless, right?
04:01:58.740 | So I will give two examples of fictional portrayals
04:02:00.860 | of evil regulators.
04:02:02.620 | One is actually the original Ghostbusters.
04:02:04.660 | - Okay. - Did not expect that one,
04:02:06.780 | but yes. - Yeah.
04:02:07.620 | So the EPA is actually the villain in that,
04:02:09.620 | where they flip a switch
04:02:11.620 | that lets out all the ghosts in the city.
04:02:13.620 | And essentially, the guy is coming in
04:02:15.380 | with a head of steam as this evil regulator
04:02:17.840 | that's just totally arrogant,
04:02:19.540 | doesn't actually understand the private sector
04:02:21.680 | or the consequences of their actions.
04:02:24.080 | And they force the,
04:02:26.020 | and crucially, they bring a cop with them with a gun.
04:02:28.520 | So it shows that a regulator is not simply,
04:02:31.960 | you know, some piece of paper,
04:02:33.640 | but it is the police, right?
04:02:35.880 | And that cop with the gun forces the Ghostbusters
04:02:38.480 | to like release the containment and the whole thing spreads.
04:02:40.680 | The second example is Dallas Buyers Club,
04:02:42.160 | which is more recent.
04:02:43.280 | And that actually shows the FDA blocking a guy
04:02:46.100 | who, with a life-threatening illness,
04:02:48.740 | you know, with AIDS,
04:02:49.760 | from getting the drugs to treat his condition
04:02:52.340 | and from getting it to other people, right?
04:02:54.780 | Those are just two portrayals,
04:02:55.860 | but in general, what you find
04:02:57.860 | is when you talk about FDA with people,
04:03:00.460 | one thing I'll often hear from folks is like,
04:03:03.900 | why would they do that, right?
04:03:06.500 | They have no mental model of this.
04:03:07.820 | They kind of think of it as,
04:03:09.220 | why would the,
04:03:11.460 | why would this thing,
04:03:12.760 | which they think of as sort of the DMV,
04:03:14.120 | they don't think of the DMV as like this active thing, okay?
04:03:17.320 | Why would the FDA do that?
04:03:18.760 | Well, it is because it's filled with some ambitious people
04:03:22.720 | that want to keep increasing the power of the agency,
04:03:24.560 | just like the CEO wants to increase
04:03:25.800 | the profit of the company, right?
04:03:27.040 | I use that word ambit, right?
04:03:28.980 | Why ambit?
04:03:29.820 | Because these folks are,
04:03:32.280 | we know the term greedy, right?
04:03:34.380 | These folks are power hungry.
04:03:36.380 | They want to have the maximum scope.
04:03:38.880 | And sometimes regulatory agencies
04:03:40.040 | collide with each other, right?
04:03:40.980 | Even though FDA is under HHS,
04:03:42.600 | sometimes it collides with HHS
04:03:44.040 | and they've got regulations that conflict.
04:03:46.800 | For example, HHS says everybody's supposed to be able
04:03:48.920 | to have access to their own medical record.
04:03:50.520 | FDA didn't want people to have access
04:03:51.960 | to their own personal genomes.
04:03:53.440 | That conflicts, okay?
04:03:54.640 | And both of those are kind of anti-corporate statutes
04:03:59.120 | that were put out,
04:03:59.960 | with HHS's thing being targeted at the hospitals
04:04:02.480 | and FDA being targeted at the personal genomic companies,
04:04:04.640 | but those conflicted, right?
04:04:06.840 | It's a little bit like CFTC and SEC have a logjam
04:04:09.480 | over who will regulate cryptocurrency, right?
04:04:11.400 | Sometimes regulators fight each other,
04:04:14.100 | but they fight each other.
04:04:14.980 | They fight companies.
04:04:16.360 | They are active players.
04:04:18.120 | This "Reputation and Power" book,
04:04:19.360 | the reason I mention it is,
04:04:21.320 | I'm gonna see if I can find this quote.
04:04:22.520 | So let me see if I can find this quote.
04:04:24.400 | "Reputation and Power, Organizational Image
04:04:27.320 | and Pharmaceutical Regulation at the FDA."
04:04:30.760 | So Genentech's executive, G. Kirk Raab, right?
04:04:35.440 | Raab would describe regulatory approval for his products
04:04:37.360 | as a fundamental challenge facing his company.
04:04:39.840 | And he would depict the administration
04:04:41.160 | in a particularly vivid metaphor.
04:04:42.920 | "I've told the story hundreds of times
04:04:44.440 | to help people understand the FDA.
04:04:46.520 | When I was in Brazil,
04:04:47.360 | I worked on the Amazon River for many months
04:04:48.920 | selling Terramycin for Pfizer.
04:04:50.720 | I hadn't seen my family for eight or nine months.
04:04:52.800 | They were flying into Sao Paulo
04:04:54.280 | and I was flying down from some little village
04:04:56.200 | on the Amazon to Manaus and then to Sao Paulo.
04:04:58.080 | I was a young guy in his twenties.
04:04:59.000 | I couldn't wait to see the kids.
04:05:00.440 | One of them was a year old baby.
04:05:01.560 | The other was three.
04:05:02.440 | I missed my wife.
04:05:04.000 | There was a Quonset hut in front of
04:05:06.160 | just a little dirt strip with a single engine plane
04:05:08.240 | to fly me to Manaus.
04:05:09.080 | I roll up and there's a Brazilian soldier there.
04:05:11.440 | The military revolution happened literally the week before.
04:05:13.560 | So this soldier is standing there with his machine gun
04:05:15.680 | and he said to me, "You can't come in."
04:05:18.120 | I was speaking pretty good Portuguese by that time.
04:05:20.240 | I said, "My God, my plane, my family, I gotta come in."
04:05:23.280 | He said again, "You can't come in."
04:05:25.160 | I said, "I gotta come in."
04:05:26.880 | And he took his machine gun, took the safety off
04:05:29.640 | and pointed at me and said, "You can't come in."
04:05:31.720 | And I said, "Oh, now I got it.
04:05:33.960 | I can't go in there."
04:05:35.400 | And that's the way I always describe the FDA.
04:05:37.520 | The FDA is standing there with a machine gun
04:05:39.240 | against the pharmaceutical industry.
04:05:40.480 | So you better be their friend rather than their enemy.
04:05:42.520 | They are the boss.
04:05:43.600 | If you're a pharmaceutical firm,
04:05:44.800 | they own you body and soul.
04:05:46.640 | Okay, that's the CEO of a successful company, Genentech.
04:05:50.320 | He said he's told the story hundreds of times
04:05:52.640 | and regulatory approval is a fundamental challenge
04:05:54.400 | facing his company because if you are regulated by FDA,
04:05:59.240 | they are your primary customer.
04:06:01.120 | If they cut the cord on you, you have no other customers.
04:06:05.160 | And in fact, until very recently
04:06:07.280 | with the advent of social media,
04:06:09.000 | no one would even tell your story.
04:06:10.800 | It was assumed that you were some sort of
04:06:13.680 | corporate criminal that they were protecting the public from,
04:06:17.720 | that you were gonna put poison in milk,
04:06:20.240 | like the melamine scandal in China.
04:06:21.680 | I'm not saying those things don't exist, by the way.
04:06:23.120 | They do exist.
04:06:24.000 | That's why people are like,
04:06:25.200 | they can immediately summon to mind
04:06:26.680 | all the examples of corporate criminals.
04:06:28.800 | That's why I mentioned those fictional stories,
04:06:29.960 | those templates.
04:06:30.960 | Even though "Star Wars" doesn't exist,
04:06:32.360 | how many times have you heard a "Star Wars" metaphor
04:06:34.040 | or whatever for something, right?
04:06:35.360 | Breaking bad, you know?
04:06:36.200 | Go ahead.
04:06:37.040 | - Yeah, but the pharmaceutical companies
04:06:39.160 | are stuck between a rock and a hard place
04:06:41.200 | because the reputation, if they go to Twitter,
04:06:43.560 | they go to social media, they have a horrible reputation.
04:06:46.760 | So it's like they don't know.
04:06:47.600 | - Yes, but why is that?
04:06:49.160 | Because reputation and power.
04:06:51.320 | FDA beat down the reputation of pharma companies,
04:06:54.480 | just like EPA helped beat down
04:06:56.040 | the reputation of oil companies.
04:06:57.720 | And as it says over here, right?
04:06:59.760 | "In practice, dealing with the fact of FDA power
04:07:01.880 | meant a fundamental change in corporate structure
04:07:03.960 | and culture.
04:07:04.920 | At Abbott and at Genentech,
04:07:06.520 | Raab's most central transformation
04:07:08.080 | was in creating a culture of acquiescence
04:07:10.840 | towards a government agency.
04:07:12.480 | As was done at other drug companies
04:07:14.160 | in the late 20th century,
04:07:15.480 | Raab essentially fired officials at Abbott
04:07:17.360 | who were insufficiently compliant with the FDA."
04:07:19.920 | What that means is de facto nationalization of the industry
04:07:24.240 | via regulation.
04:07:25.320 | Just to hover on that.
04:07:27.400 | That's a really big deal
04:07:29.160 | because if their primary customer
04:07:30.640 | is this government agency,
04:07:31.960 | then it has nationalized it just indirectly, right?
04:07:36.720 | This is partially what's just happened
04:07:38.160 | with Microsoft, Apple, Google, Amazon, the other MAGA.
04:07:42.960 | Okay?
04:07:43.800 | They have been, that's funny.
04:07:46.360 | - Well done.
04:07:47.200 | (laughs)
04:07:48.600 | Yeah, I didn't even think about that.
04:07:50.360 | It's well done.
04:07:51.360 | - Right, so it's, you know,
04:07:52.880 | like I have this tweet,
04:07:53.720 | it's like MAGA Republicans and MAGA Democrats, right?
04:07:56.400 | Okay.
04:07:57.720 | - Oh, damn it.
04:07:58.600 | So many things you've said today
04:08:00.000 | will just get stuck in my head.
04:08:01.760 | It changes the way you think.
04:08:03.400 | Catchy.
04:08:04.840 | Something about catchy phrasing of ideas
04:08:07.600 | makes them even more powerful.
04:08:09.920 | So yeah, okay.
04:08:11.120 | So that's happening in the tech sector.
04:08:12.720 | - It's happening in tech.
04:08:13.560 | So Facebook is the outlier
04:08:15.120 | 'cause Zuck still controls the company.
04:08:16.960 | But just like, I mean,
04:08:19.360 | why had tech had a good reputation for a while?
04:08:22.360 | Because there wasn't a regulatory agency
04:08:24.440 | whose justification was regulating
04:08:26.640 | these corporate criminals, right?
04:08:28.720 | Once that is the case,
04:08:30.520 | the regulatory agency basically
04:08:32.440 | comes back to Congress each year.
04:08:33.800 | If you look at its budget approvals,
04:08:34.920 | it's saying, "We fined this many guys.
04:08:37.480 | "We found this many violations," right?
04:08:39.520 | They have an incentive to exaggerate the threat
04:08:43.840 | in the same way that a prosecutor
04:08:45.680 | or a policeman has a quota, right?
04:08:48.680 | Like these are the police.
04:08:50.600 | One way I describe it also is like,
04:08:52.360 | you know, like step-down transformers.
04:08:55.640 | You have high voltage electricity
04:08:57.240 | that's generated at the power plant,
04:08:59.160 | and it comes over the wires,
04:09:00.200 | and then there's step-down transformers
04:09:01.960 | that turn it into a lower voltage
04:09:03.800 | that you can use with your appliances, right?
04:09:07.240 | Similarly, you have something where the high voltage
04:09:11.520 | of like the US military or the police,
04:09:14.400 | and that is transmitted down into a little letter
04:09:17.680 | that comes in your mailbox saying,
04:09:18.640 | "Pay your $50 parking ticket,"
04:09:20.440 | where it's a piece of paper,
04:09:22.760 | so you don't see the gun attached to it.
04:09:25.160 | But if you were to defy that,
04:09:27.400 | it's like "Grand Theft Auto,"
04:09:28.360 | where you get one star, two star, three stars,
04:09:30.280 | four stars, five stars,
04:09:31.720 | and eventually, you know,
04:09:33.280 | you have some serious stuff on your hands, okay?
04:09:36.760 | So once you understand that, you know,
04:09:38.680 | every law is backed by force,
04:09:41.080 | like that Brazilian guy with the machine gun
04:09:42.840 | that Rob mentioned,
04:09:44.840 | these guys are the regulatory police, okay?
04:09:46.720 | Now, see, for a time, what happened was
04:09:49.720 | you had the captured industry
04:09:52.160 | because all of the folks who were in pharmaceuticals
04:09:55.840 | were, as Carpenter said,
04:09:57.960 | a culture of acquiescence towards the FDA.
04:10:01.880 | The FDA was their primary customer.
04:10:03.480 | So just like in a sense, it's rational,
04:10:05.680 | you know, Amazon talks about being customer obsessed, right?
04:10:08.600 | What Raab did was rational for that time, right?
04:10:12.080 | What G. Kirk Raab did was say,
04:10:14.600 | "Our customer is the FDA, that's our primary customer.
04:10:16.720 | "Nobody else matters.
04:10:17.920 | "They are satisfied first.
04:10:18.960 | "Every single trade-off that has to be made is FDA,"
04:10:21.880 | right?
04:10:22.720 | And, you know, really that's why
04:10:24.240 | the two most important departments
04:10:25.480 | at many pharmaceutical companies, arguably all,
04:10:27.440 | are regulatory affairs and IP, not R&D, right?
04:10:30.520 | Because one is the artificial scarcity of regulation,
04:10:33.280 | which jacks up the price,
04:10:34.560 | and the other is artificial scarcity of the patent,
04:10:36.800 | which allows people to maintain the high price, right?
04:10:39.400 | So this entire thing is just like, you know,
04:10:41.000 | college education.
04:10:41.840 | These things may at some point have been a good concept,
04:10:44.480 | but the price has just risen and risen and risen
04:10:46.280 | until it's at the limit price and beyond, okay?
04:10:49.080 | So what has changed?
04:10:51.600 | What's changed is in the 2010s,
04:10:55.800 | late 2000s and 2010s and so on,
04:10:58.040 | with the advent of social media,
04:10:59.320 | with the advent of a bunch of millionaires,
04:11:02.240 | like who are independent,
04:11:04.440 | with the advent of Uber and Airbnb, right?
04:11:08.200 | With the advent of cryptocurrency,
04:11:09.880 | with the diminution of trust in institutions,
04:11:12.960 | it used to be really taboo to even talk about the FDA
04:11:15.520 | as potentially bad in like, you know, 2010, 2009, okay?
04:11:19.880 | But now people have just seen face plant after face plant
04:11:22.440 | by the institutions,
04:11:23.720 | and people are much more open to the concept
04:11:26.400 | that they may actually not have it all together.
04:11:29.080 | And I think it's, you know,
04:11:30.360 | you could probably see some tracking poll
04:11:32.200 | or something like that,
04:11:33.080 | but I wouldn't be surprised
04:11:33.920 | if it's like a 20 or 30 point drop
04:11:35.320 | after the CDC failed to control disease and the FDA failed,
04:11:39.240 | and the entire biomedical regulatory establishment
04:11:41.800 | and scientific establishment
04:11:43.120 | was saying masks don't work before saying they do.
04:11:45.160 | This was just a train crash
04:11:46.360 | of all the things that you're paying for
04:11:47.560 | that you supposedly think are good.
04:11:49.600 | As I mentioned, one response is to go QAnon,
04:11:52.200 | and people will say, "Oh, don't trust anything."
04:11:53.840 | But the better response is decentralizing FDA, okay.
04:11:56.520 | So I will say one other thing,
04:11:59.040 | which is I mentioned, you know,
04:12:00.880 | this concept of improving the form 3500B,
04:12:03.640 | where you like scan, go ahead.
04:12:05.520 | - No, yeah, yeah, right.
04:12:06.840 | That just makes me laugh that I could just tell
04:12:09.480 | the form sucks by the fact that it has that code name.
04:12:12.240 | Sorry. - Yeah, yeah, yeah, yeah, yeah.
04:12:13.560 | Exactly, right?
04:12:14.480 | - UX is broken at every layer.
04:12:16.720 | - Yeah, so they have a bad Yelp for drugs,
04:12:19.320 | could we make a better one?
04:12:20.160 | We could make a better one, just modern UX.
04:12:22.120 | The key insight here, by the way,
04:12:23.400 | which is a non-obvious point,
04:12:25.000 | and I've got a whole talk on this actually
04:12:27.120 | that I should probably release.
04:12:28.080 | I actually did it almost eight, nine years ago.
04:12:30.040 | It's called "Regulation is Information."
04:12:32.280 | Product quality is a digital signal.
04:12:35.760 | Okay, what do I mean by that?
04:12:37.160 | Basically, when I talk about exit, you know,
04:12:40.560 | exit the Fed, that's the crypto economy, right?
04:12:43.080 | What does exit the FDA look like?
04:12:44.440 | Well, one key insight is that many
04:12:47.000 | of the big scale tech companies
04:12:49.000 | can be thought of as cloud regulators
04:12:50.720 | rather than land regulators.
04:12:52.080 | What do I mean by that?
04:12:52.920 | Well, first, what is regulation?
04:12:54.240 | People do want a regulated marketplace.
04:12:57.000 | They want A, quality ratings,
04:12:59.920 | like on a one to five star scale,
04:13:02.160 | and B, bans of bad actors,
04:13:04.800 | like the zero star frauds and scammers and so on,
04:13:07.440 | and these are distinct, right?
04:13:08.560 | Somebody who's like a low quality
04:13:10.400 | but well-intentioned person is different
04:13:12.480 | than a smart and evil person.
04:13:13.920 | Those are two different kinds of failure modes
04:13:15.280 | you could have in a marketplace, right?
04:13:17.280 | Why is it rational for people to want
04:13:19.000 | a regulated marketplace, especially for health?
04:13:21.160 | Because they wanna pay essentially one entry cost,
04:13:24.040 | and then they don't have to evaluate everything separately
04:13:26.800 | where they may not have the technical information
04:13:28.360 | to do that, right?
04:13:29.800 | You don't wanna go to Starbucks
04:13:31.000 | and put a dipstick into every coffee
04:13:34.640 | to see if it's poisoned or something like that.
04:13:36.120 | You sort of wanna enter a zone
04:13:38.000 | where you know things are basically good,
04:13:40.440 | and you pay that one diligence cost on the zone itself,
04:13:43.320 | right, whether it's a digital or physical zone,
04:13:45.040 | and then the regulator's taking care of it,
04:13:46.680 | and they've baked in the regulatory cost
04:13:49.280 | into some subscription fee of some kind, right?
04:13:52.560 | So the thing is, the model we've talked about
04:13:54.720 | is the land regulator of a nation state
04:13:56.560 | and a territorially bounded thing,
04:13:58.640 | but the cloud regulator, what's a cloud regulator?
04:14:00.800 | That is Amazon Star Ratings, that's Yelp,
04:14:05.600 | that is eBay, that is Airbnb, that is Uber and Lyft,
04:14:09.760 | and so on and so forth.
04:14:11.200 | It's also actually Gmail and Google, why?
04:14:14.040 | Because you're doing spam filtering,
04:14:16.640 | and you are doing ranking of emails
04:14:19.800 | with a priority inbox, right?
04:14:21.440 | With Google itself, they ban malware links, right?
04:14:24.440 | So the bad actors are out, and they're ranking them, right?
04:14:26.920 | How about Apple, the App Store, right?
04:14:29.200 | They ban bad actors, and they do star ratings.
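The two functions named here, quality ratings and bans of bad actors, are the core of every cloud regulator described in this passage. As a rough sketch (the actor names, thresholds, and review counts below are invented for illustration, not any real platform's policy):

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class CloudRegulator:
    """Toy marketplace regulator: aggregate star ratings, ban bad actors."""
    ban_threshold: float = 2.0   # deplatform below this average rating
    min_reviews: int = 5         # don't ban on thin evidence
    ratings: dict = field(default_factory=dict)
    banned: set = field(default_factory=set)

    def rate(self, actor: str, stars: int) -> None:
        assert 1 <= stars <= 5
        self.ratings.setdefault(actor, []).append(stars)
        history = self.ratings[actor]
        # the "zero-star frauds and scammers" path: ban persistent bad actors
        if len(history) >= self.min_reviews and mean(history) < self.ban_threshold:
            self.banned.add(actor)

    def score(self, actor: str):
        """The one-to-five-star quality signal buyers rely on."""
        history = self.ratings.get(actor, [])
        return round(mean(history), 2) if history else None

reg = CloudRegulator()
for stars in (1, 1, 2, 1, 1):
    reg.rate("scammer", stars)       # crosses the ban threshold
for stars in (5, 4, 5, 5, 4):
    reg.rate("good_actor", stars)    # stays listed with a 4.6 average
```

The same two levers, a continuous quality score plus a discrete ban, are what the Amazon ratings, App Store, and spam-filter examples in this section all implement at scale.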
04:14:31.520 | When you start actually applying this lens,
04:14:32.760 | PayPal, they've got a reputation,
04:14:35.160 | every single web service that's at the scale
04:14:38.840 | of like tens of millions or hundreds of millions of people
04:14:41.600 | has had to build a cloud regulator,
04:14:44.240 | and the crucial thing is it scales across borders.
04:14:46.480 | So you can use the data from Mexico
04:14:48.880 | to help somebody in Moldova or vice versa, right?
04:14:52.400 | Because it's fundamentally international, right?
04:14:54.560 | Those ratings, you have a network effect.
04:14:56.560 | And there's another aspect to it,
04:14:59.080 | which is these are better regulators
04:15:01.120 | than the land regulators.
04:15:02.120 | For example, Uber is a better regulator
04:15:04.880 | than the taxi medallions, why?
04:15:07.400 | Every ride is GPS tracked,
04:15:09.080 | there's ratings on both the driver and the passenger side.
04:15:12.000 | Both parties know that payment can be rendered
04:15:15.000 | in a standard currency, right?
04:15:17.320 | If you have below a certain star rating on either side,
04:15:19.320 | you get deplatformed and so on
04:15:21.000 | to protect either rider or driver, and on and on, right?
04:15:24.680 | What does that do?
04:15:25.520 | Think about how much better that is than taxi medallions:
04:15:26.840 | rather than a six-month or annual inspection,
04:15:29.960 | You have reports from every single rider, okay?
04:15:32.840 | Before Uber, it was the taxi drivers and taxi regulators
04:15:37.840 | were in a little monopoly locally, okay?
04:15:40.960 | Because they were the persistent actors in the ecosystem.
04:15:42.960 | Taxi riders had nothing in common,
04:15:45.080 | didn't even know each other.
04:15:45.920 | In New York, some guy gets in a taxi,
04:15:47.600 | another guy, they had no way to communicate with each other.
04:15:49.600 | So the persistent actors in the ecosystem
04:15:51.760 | were the regulators and the drivers.
04:15:53.760 | And they had this cozy kind of thing,
04:15:55.160 | and medallion prices just kept going up.
04:15:56.800 | And this was a sort of collaboration
04:15:58.520 | on artificial scarcity.
04:15:59.960 | Afterwards, with Uber and Lyft and other entrants,
04:16:03.120 | you had something interesting,
04:16:03.960 | a different kind of regulator-driver fusion.
04:16:06.680 | If you assume regulatory capture exists and lean into it,
04:16:11.080 | Uber is the new regulator,
04:16:13.440 | and Uber drivers are the drivers.
04:16:15.720 | Lyft is the competing regulator,
04:16:17.720 | and Lyft drivers are the new drivers, okay?
04:16:19.920 | So you have a regulator-driver fusion
04:16:22.200 | versus another regulator-driver fusion.
04:16:23.880 | You no longer have a monopoly, you have multiple parties.
04:16:27.640 | Okay, you have a competitive market.
04:16:29.440 | This is the concept of polycentric law, right?
04:16:32.160 | Where you have multiple different legal regimes
04:16:34.920 | in the same jurisdiction overlapping
04:16:36.800 | that you can choose between with a tap of a button, right?
04:16:39.160 | All these concepts from libertarian theory,
04:16:41.040 | like polycentric law or catallaxy,
04:16:43.600 | all these things are becoming more possible
04:16:45.560 | now that the internet
04:16:46.400 | has increased microeconomic leverage,
04:16:48.520 | because that exit is now possible.
04:16:52.640 | Now, you may argue, "Oh, well, Lyft and Uber,
04:16:55.720 | they're not profitable anymore."
04:16:57.000 | And there's two different criticisms of them.
04:16:58.720 | One is, "Oh, they're not profitable,"
04:17:00.640 | or, "Oh, they're charging too much," or whatever.
04:17:02.760 | And I think part of this is because of certain kinds of...
04:17:06.240 | The regulatory state has caught up
04:17:07.760 | to try to make them uncompetitive.
04:17:09.040 | For example, they don't allow people in some states
04:17:11.800 | to identify themselves as independent contractors,
04:17:13.640 | even if they are part-time, okay?
04:17:15.720 | There's various other kinds of rules and regulations.
04:17:18.760 | You know, in Austin for a while,
04:17:19.920 | Uber was even banned, what have you, right?
04:17:22.240 | Net-net though, like Uber, Grab, Gojek, Lyft, Didi,
04:17:27.240 | like ride-sharing as a concept is now out there.
04:17:30.600 | And whatever the next version is, whether it's self-driving,
04:17:32.800 | like, while it's like a very hard-fought battle
04:17:34.920 | and the regulatory state keeps trying
04:17:35.960 | to push things back into the garage,
04:17:38.400 | this is a fundamentally better way
04:17:40.120 | of just doing regulation of taxis.
04:17:42.240 | Similarly, Airbnb for hotels.
04:17:45.040 | I mean, it's basically the same thing, okay?
04:17:47.640 | And Airbnb could use competition.
04:17:50.880 | I think that it would be good to have, you know,
04:17:52.720 | like competition for them,
04:17:53.800 | and there are other kinds of sites opening up.
04:17:56.120 | But the fundamental concept of the cloud regulator now,
04:17:58.080 | let's apply it here.
04:17:59.320 | Once you realize regulation is information,
04:18:02.040 | the way you'd set up a competitor to FDA or SEC or FAA
04:18:08.080 | or something like that is you just do better reviews.
04:18:11.340 | Like you just start with that.
04:18:13.160 | That's pure information.
04:18:13.980 | You're under free speech.
04:18:14.820 | That's like still, you know, the most defended thing,
04:18:17.200 | literally just publishing reviews.
04:18:18.760 | And not just reviews by any old person.
04:18:20.800 | It turns out that FDA typically will use expert panels.
04:18:24.640 | It's like professors from Harvard or, you know,
04:18:26.520 | things like that.
04:18:27.360 | So what that is, is this concept of a reputational bridge.
04:18:31.840 | What you wanna do is you wanna have folks who are,
04:18:33.880 | let's say, biotech entrepreneurs,
04:18:36.160 | or they're, you know, profs like Sinclair or what have you.
04:18:40.280 | You do wanna have the reviews of the crowd, okay?
04:18:44.040 | But you also wanna have, especially in medicine,
04:18:45.800 | the reviews of experts
04:18:47.720 | of some kind.
04:18:48.880 | So there's gonna be defectors from the current establishment.
04:18:51.560 | Okay, just like, you know,
04:18:52.760 | there are profs who defected from computer science academia
04:18:55.320 | to become Larry and Sergey and whatever, you know,
04:18:57.120 | or they weren't profs, they were grad students, right?
04:19:00.080 | In the same way, you'll have defectors
04:19:02.280 | who have the credentials from the old world,
04:19:04.960 | but can build up the new.
04:19:05.920 | Just like there's folks from Wall Street
04:19:07.080 | who have come into cryptocurrency
04:19:08.080 | and helped legitimate it, right?
04:19:09.320 | Just like there's folks who left Sulzberger
04:19:11.600 | to come to Substack, okay?
04:19:13.160 | You know, we have these folks who, by defecting,
04:19:17.080 | they help, and then they're also supplemented
04:19:18.960 | by all this new talent coming in, right?
04:19:20.240 | That combination of things is how you build a new system.
04:19:22.560 | It's not completely by itself,
04:19:24.200 | nor is it trying to reform the old, it's some fusion, okay?
04:19:27.200 | So in this new system, who do you have?
04:19:28.320 | You have, like, the most entrepreneurial
04:19:30.480 | and innovative MDs.
04:19:31.800 | You have the most entrepreneurial and innovative professors.
04:19:34.480 | And you have the founders of actual new products and stuff.
04:19:37.680 | And they are giving open-source reviews of these products.
04:19:42.240 | And they're also building a community that will say,
04:19:45.840 | "Look, we want this new drug, or we want this new treatment,
04:19:49.680 | or we want this new device,
04:19:51.440 | and we're willing to crowdfund 10,000 units.
04:19:54.040 | So please give us the thing,
04:19:56.200 | and we'll write a very fair review of it,
04:19:58.680 | and we'll also all evaluate it as a community," and so on.
04:20:01.280 | So you turn these people from just passive patients
04:20:03.600 | into active participants in their health.
04:20:05.160 | That's a community part,
04:20:06.080 | and they've got the kind of biomedical,
04:20:08.080 | technical leadership there.
04:20:09.360 | Now, what is the kind of prototype of something like this?
04:20:12.080 | Something like VitaDAO is very interesting.
04:20:14.400 | Things like Molecule are very interesting.
04:20:16.000 | It'll start with things like longevity, right?
04:20:18.760 | And why is that?
04:20:19.800 | Because the entire model of FDA, this 20th century model,
04:20:23.960 | is wait for somebody to have a disease,
04:20:27.760 | and then try to cure them, okay?
04:20:30.200 | Versus, you know, saying an ounce of prevention
04:20:32.440 | is worth a pound of cure, right?
04:20:34.440 | Why are we not actually tracking folks
04:20:37.160 | and getting a constant dashboard on yourself
04:20:39.960 | so you can see whether things are breaking?
04:20:42.080 | And then you deal with it
04:20:43.560 | just like you've got server uptime things.
04:20:44.920 | You don't wait necessarily for the site to go down.
04:20:47.960 | You start seeing, "Oh, response rates are spiking.
04:20:50.040 | We need to add more servers," right?
04:20:51.240 | You have some warning, okay?
04:20:52.800 | Even 10 years ago, there was this article
04:20:55.280 | called "The Measured Man" in "The Atlantic,"
04:20:57.960 | where this guy, physicist Larry Smarr, okay?
04:21:01.200 | What he was doing is he was essentially
04:21:02.920 | doing a bunch of measurements on himself.
04:21:05.600 | And he was finding that there were predictors of inflammation
04:21:09.960 | that were spiking, and he went to the doctor,
04:21:11.400 | showed the charts, and the doctor was like,
04:21:13.320 | "I can't do anything with this."
04:21:15.760 | Then it turned out to be an early warning
04:21:17.800 | of a serious condition,
04:21:19.720 | and he had to, I think, go for surgery or something.
04:21:22.920 | And he was starting to think, "Well, look,
04:21:25.040 | the way that we're doing medicine right now
04:21:26.640 | is it's not quite like pre-germ theory of disease,
04:21:31.640 | but it is pre-continuous diagnostics, okay?
04:21:37.000 | Continuous diagnostics,
04:21:38.720 | just to talk about this for a second,
04:21:40.000 | this is, I mentioned one angle on which you go after FDA,
04:21:43.040 | which is like the better phase four, right?
04:21:45.840 | I've mentioned the concept of better reviews in general,
04:21:48.800 | okay?
04:21:49.960 | I mentioned VitaDAO, which is like a community
04:21:52.360 | that is going after longevity.
04:21:54.000 | Let's talk about continuous diagnostics.
04:21:55.960 | So basically, we know better what is going on
04:22:00.840 | in Bangalore or Budapest than in our own body.
04:22:03.140 | That's actually kind of insane to think about.
04:22:05.880 | This stuff, you know,
04:22:06.960 | it's all on the other side of the world, 10,000 miles away,
04:22:08.720 | but a few millimeters away,
04:22:10.320 | you don't really know what's going on, right?
04:22:12.360 | And that's starting to change
04:22:13.720 | with all the quantified self-devices,
04:22:15.360 | the hundreds of millions of Apple Watches
04:22:17.640 | and Fitbits and stuff, right?
04:22:19.320 | You're also starting to see continuous glucose meters,
04:22:21.280 | which are very important.
04:22:22.200 | They're starting to give you readouts.
04:22:23.200 | People are seeing, "Wow, this is spiking my insulin.
04:22:25.840 | "Or rather, this is spiking my blood sugar."
04:22:28.840 | And it might be something you didn't predict.
04:22:30.800 | It varies for different people.
04:22:32.000 | For some people, you know,
04:22:33.480 | a banana isn't a big deal.
04:22:34.420 | For others, it's actually quite bad for the blood sugar.
04:22:36.960 | What happens when you extend that?
04:22:38.040 | Well, about 10 years ago, a guy, Mike Snyder,
04:22:41.960 | professor at Stanford, did something called the integrome,
04:22:44.280 | where he just threw the kitchen sink
04:22:45.800 | of all the diagnostics he could at himself
04:22:47.640 | over the period of, I think, a few weeks or a few months.
04:22:50.240 | I forget the exact duration.
04:22:51.640 | And he was able to do things where he could see,
04:22:53.560 | during that period, he got a cold or something.
04:22:55.920 | And he could see in the expression data,
04:22:58.320 | the gene expression data,
04:22:59.260 | that he was getting sick before he felt sick.
04:23:02.000 | He could also see that something about that viral infection
04:23:04.840 | made him develop diabetes-like symptoms,
04:23:08.000 | if I'm remembering it accurately.
04:23:10.080 | So you could see, "Oh, wait a second.
04:23:11.100 | "These are things that I can see in my readouts
04:23:15.600 | "that I would only have the vaguest interpretation of
04:23:19.720 | "as a human being."
04:23:22.680 | And moreover, he could take,
04:23:25.400 | I don't think he did this,
04:23:26.240 | but if you took treatments, if you took drugs,
04:23:29.040 | you could actually show what your steady state was,
04:23:34.080 | if you tracked over time,
04:23:35.160 | show what your disease state or sick state was.
04:23:38.560 | And then this drug pushes you back into non-disease state.
04:23:42.600 | You can actually get a quantitative readout
04:23:44.840 | of what steady state was.
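The quantitative readout described here, whether a new measurement sits closer to your measured steady state or your measured disease state, can be sketched as a nearest-reference-state check. The marker names and values are invented for illustration, and a real version would normalize each marker's scale before comparing:

```python
import math

def nearest_state(readout, reference_states):
    """Return the name of the reference state closest to a new readout,
    where each state is a dict of marker -> measured level."""
    def distance(a, b):
        shared = a.keys() & b.keys()
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in shared))
    return min(reference_states,
               key=lambda name: distance(readout, reference_states[name]))

# hypothetical personal baselines, measured over time
states = {
    "steady":  {"CRP": 1.0, "glucose": 90.0, "IL-6": 2.0},
    "disease": {"CRP": 8.0, "glucose": 140.0, "IL-6": 12.0},
}
after_treatment = {"CRP": 1.5, "glucose": 95.0, "IL-6": 3.0}
print(nearest_state(after_treatment, states))  # drug pushed us back toward "steady"
```

Because the reference states are *your* measurements, the same check automatically respects the point made next: a healthy baseline for one person can differ from another's.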
04:23:47.260 | And that steady state,
04:23:50.400 | your expression levels across all these genes,
04:23:53.000 | your small molecules,
04:23:53.960 | basically everything you can measure,
04:23:56.960 | that's gonna vary from person to person.
04:23:59.080 | What's healthy and natural for you
04:24:01.160 | may be a different baseline than for me.
04:24:02.880 | For example, people who are,
04:24:04.240 | small example, people who are South Asian
04:24:06.520 | or have dark skin tend to have vitamin D deficiency.
04:24:09.760 | Because we need a lot of sunlight,
04:24:11.040 | and we're so often inside, tapping on our screens.
04:24:14.080 | So what do we do?
04:24:15.000 | Take actually significant vitamin D infusions.
04:24:17.520 | That's a small example
04:24:19.560 | of where baselines differ between people.
04:24:22.080 | So continuous diagnostics, what could that mean?
04:24:25.440 | That could mean things like the continuous glucose meter,
04:24:27.800 | it's quantified self,
04:24:29.680 | it's like continuous blood testing.
04:24:31.680 | So you have a so-called mobile phlebotomist.
04:24:33.860 | This is something where a
04:24:35.480 | phlebotomist takes blood, and a
04:24:36.600 | mobile phlebotomist would come to your office,
04:24:38.600 | come to your remote office.
04:24:39.880 | This is a great business for people.
04:24:41.240 | I think you can revisit this in 2022.
04:24:44.440 | People tried this in the 2010s,
04:24:45.600 | but I think it's worth revisiting.
04:24:47.200 | Mobile phlebotomist comes every week or every month,
04:24:50.520 | takes blood, runs every test.
04:24:52.560 | Maybe that's a few thousand dollars a year,
04:24:55.400 | maybe eventually gets to a few hundred dollars a year.
04:24:58.300 | And that's expensive in some ways,
04:24:59.800 | but boy, that's better health insurance in other ways.
04:25:01.840 | - Yeah, I mean, it's amazing.
04:25:03.040 | So one, there's a bunch of companies that do this,
04:25:04.880 | and I actually would love to learn more about them.
04:25:06.640 | One of them is a company called InsideTracker
04:25:09.660 | that sponsors this podcast.
04:25:10.960 | They do that, but the reason I really appreciate them,
04:25:13.200 | they're the first ones that introduced me
04:25:14.720 | to how easy it is.
04:25:17.160 | But it's also depressing how little information,
04:25:20.140 | exactly as you beautifully put once again,
04:25:22.520 | how little information we have about our own body
04:25:25.440 | in a continuous sense.
04:25:27.080 | And actually also sadly, even with InsideTracker,
04:25:31.520 | as I collect that data,
04:25:33.280 | how not integrated that data is with everything else.
04:25:37.960 | If I wanted to opt in, I would like,
04:25:40.920 | I can't, it's just like riffing off the top of my head,
04:25:44.140 | but I would like Google Maps to know
04:25:47.000 | what's going on inside my body.
04:25:48.840 | Maybe I can't intuit at first
04:25:50.920 | why that application is useful,
04:25:52.440 | but there could be incredible,
04:25:54.000 | like that's where the entrepreneur spirit builds
04:25:57.280 | is like, what can I do with that data?
04:25:58.880 | Can I make the trip less stressful for you
04:26:01.400 | and adjust to Google Maps, that kind of thing.
04:26:03.480 | - That's right.
04:26:04.320 | So, I mean, one of the things about this, by the way,
04:26:06.280 | is because there are so many movies made about Theranos,
04:26:09.920 | okay, that's one of the reasons why people
04:26:12.560 | have sort of been scared off from doing diagnostics
04:26:16.360 | to some extent, okay, why?
04:26:17.960 | 'Cause VCs are like, oh, is this another Theranos?
04:26:20.080 | Like the diligence and everything,
04:26:21.840 | everyone's looking at it, oh, blood testing,
04:26:23.600 | one drop of blood, huh?
04:26:24.920 | Hurts the recruiting.
04:26:25.920 | Essentially, a lot of the media and stuff around that
04:26:30.520 | basically has pathologized the thing
04:26:32.080 | that we wanna have a lot more entrants in, right?
04:26:34.520 | Now, one way of thinking about it is
04:26:36.560 | FDA has killed way more people than Theranos has,
04:26:40.720 | all right, way more.
04:26:41.540 | Just take drug lag alone, okay?
04:26:43.900 | Whenever you have a drug that works
04:26:47.460 | and reduced morbidity and mortality
04:26:49.340 | after it was actually generally available,
04:26:51.120 | but was delayed for months or years,
04:26:53.880 | the integral under that curve
04:26:55.100 | is the excess morbidity and mortality
04:26:56.580 | attributable to FDA's drug lag.
04:26:58.260 | You could go back and do that study
04:27:00.580 | across lots and lots of different drugs,
04:27:02.340 | and you'd probably find quite a lot.
04:27:03.580 | Alex Tabarrok and others have written on this, right?
04:27:05.820 | Daniel Henninger has written on this, okay?
04:27:07.580 | That's just like one example.
04:27:09.100 | I mean, I gave the pandemic example,
04:27:10.620 | the fact that they held up the EUAs for the tests
04:27:13.460 | and didn't do challenge trials.
04:27:15.100 | That's like a million American dead
04:27:18.340 | that could have been orders of magnitude less
04:27:19.860 | if we had gotten the vaccine out
04:27:21.120 | to the vulnerable population sooner, okay?
04:27:23.540 | So you're talking about something
04:27:24.700 | that has a total monopoly on global health,
04:27:27.160 | and you can't know what the alternative to it is
04:27:32.160 | unless you have zones that are FDA-free,
04:27:35.140 | but that have some form of regulation.
04:27:37.640 | As I mentioned, it's a V3.
04:27:38.900 | It's not going back to zero regulation,
04:27:41.180 | every man for himself,
04:27:42.900 | but it's a more reputable regulator,
04:27:45.340 | just like Uber is a better regulator
04:27:48.140 | than the taxi medallions, right?
04:27:49.740 | - Yeah, I mean, you're painting such an incredible picture.
04:27:52.180 | You're making me wish you were FDA commissioner.
04:27:55.220 | But I-
04:27:56.060 | - There are a bunch of people who tweeted something like that
04:27:57.620 | after the, you know, with the pandemic.
04:28:00.180 | Whatever, go ahead, yeah.
04:28:01.300 | - Is that possible?
04:28:02.340 | Like if you were just given,
04:28:04.140 | if you became FDA commissioner,
04:28:05.540 | could you push for those kinds of changes,
04:28:09.300 | or is that really something
04:28:10.500 | that has to come from the outside?
04:28:11.820 | - Short answer is no.
04:28:12.860 | And the longer answer, meaning-
04:28:15.180 | - The long, that'd be funny if you're like,
04:28:16.500 | the short answer is no, the long answer is yes.
04:28:18.540 | (both laughing)
04:28:19.500 | - So basically, see, a CEO of a company,
04:28:23.820 | while it's very difficult,
04:28:25.420 | they can hire and fire, right?
04:28:27.780 | So in theory, they can do surgery on the organism.
04:28:31.340 | And like, you know, Steve Jobs took over Apple
04:28:33.900 | and was able to hire and fire, raise money, do this, that.
04:28:37.220 | He basically had root over Apple.
04:28:39.420 | Like he was a system administrator, right?
04:28:41.220 | He had full permissions, okay?
04:28:43.780 | As FDA commissioner,
04:28:44.780 | you do not have full permissions over FDA,
04:28:47.260 | let alone like the whole structure around it, right?
04:28:50.380 | If you're FDA commissioner,
04:28:51.420 | you are not the CEO of the agency, okay?
04:28:54.060 | Lots of these folks there have career tenure.
04:28:57.460 | They can't be fired.
04:29:00.340 | They can't even really be disciplined.
04:29:01.460 | There's something called the Douglas factors.
04:29:02.940 | You ever heard of the Douglas factors?
04:29:04.540 | It's like the Miranda rights for federal employees, okay?
04:29:07.780 | You know, like the right to remain silent.
04:29:08.940 | so basically, if you've heard
04:29:10.540 | that federal employees can't be fired,
04:29:12.260 | the Douglas factors are how that's actually operationalized.
04:29:15.420 | When you try to fire somebody,
04:29:16.940 | it's this whole process where they get to appeal it
04:29:20.420 | and so on and so forth.
04:29:21.340 | And they're sitting in the office
04:29:22.660 | while you're trying to fire them.
04:29:24.300 | And they're complaining to everybody around them
04:29:26.900 | that this guy's trying to fire me,
04:29:28.020 | he's such a bad guy, blah, blah, right?
04:29:30.260 | And everybody around, even if, you know,
04:29:33.220 | they may think that guy is doing a bad job,
04:29:35.060 | they're like, wait a second, he's trying to fire you,
04:29:36.340 | he might try to fire me too.
04:29:38.180 | And so anybody who tries to fire somebody at FDA
04:29:41.060 | just gets a face full of lead for their troubles.
04:29:44.620 | What they instead will do is sometimes
04:29:45.860 | they'll just transfer somebody to the basement or something
04:29:48.700 | so they don't have to deal with them
04:29:49.540 | if they're truly bad, okay?
04:29:51.300 | But the thing about this is there is only one caveat,
04:29:55.420 | Douglas factor number eight,
04:29:57.140 | the notoriety of the offense or its impact
04:29:59.060 | upon the reputation of the agency.
04:30:02.980 | There's that word again, reputation,
04:30:04.300 | of reputation and power.
04:30:05.780 | So the one way you can truly screw up
04:30:08.180 | within a regulatory bureaucracy
04:30:09.980 | is if you sort of endanger the like annual budget renewal.
04:30:14.140 | Think of it as like this mini Death Star
04:30:16.780 | that's coming to dock against the max Death Star
04:30:19.820 | for its like annual refuel.
04:30:22.380 | And it's talking about all the corporate criminals
04:30:24.380 | that it's prosecuted, the quotas,
04:30:26.100 | like the police quotas, the ticketing, you know,
04:30:28.180 | and if they don't have a crisis, they will like invent one.
04:30:32.420 | Just again, just like TSA,
04:30:33.780 | just like other agencies you're more familiar with,
04:30:35.420 | you can kind of map it back.
04:30:36.460 | Look at the guns and drugs we've seized.
04:30:38.740 | And so there's an incentive for, you know,
04:30:41.700 | creating these crises or manufacturing them
04:30:43.580 | or exaggerating them.
04:30:44.580 | And if you endanger that refueling,
04:30:48.100 | that annual budget renewal or, you know, what have you,
04:30:51.380 | then the whole agency will basically be like,
04:30:53.260 | okay, you're bad and you can be disciplined
04:30:55.380 | or sometimes, you know, with rare exceptions,
04:30:56.980 | you know, you can be booted.
04:30:58.460 | But what that means is that FDA commissioner
04:31:00.980 | is actually a white elephant.
04:31:02.260 | It's a ceremonial role, really, right?
04:31:04.220 | You know, the term white elephant,
04:31:05.340 | it's like basically, you know,
04:31:07.300 | the Maharaja gives you a white elephant as a gift.
04:31:11.300 | Seems great.
04:31:12.260 | Next day, it's eaten all of your grass.
04:31:15.340 | It's pooped on your lawn.
04:31:17.180 | It has like, just put a foot on your car and smashed it.
04:31:20.820 | But you can't give it away.
04:31:22.300 | It's a white elephant.
04:31:23.140 | The Maharaja gave it to you, right?
04:31:25.260 | That's what being like FDA commissioner is.
04:31:27.540 | It's the kind of thing where if,
04:31:31.020 | and a lot of people are drawn in like moths to the flame
04:31:33.300 | for these titles of the establishment.
04:31:37.300 | I wanna be head of this.
04:31:38.660 | I wanna be head of that, right?
04:31:40.620 | And really what it is, it's like,
04:31:42.780 | I don't know, becoming head of Kazakhstan in the mid 1980s
04:31:49.220 | in the Soviet Union, the Kazakhstan SSR, right?
04:31:51.340 | Soviet Socialist Republic,
04:31:52.580 | before the thing was gonna like crumble potentially, right?
04:31:55.260 | In many ways, it's becoming, you know,
04:31:56.980 | folks who are just totally status obsessed
04:31:58.580 | getting these positions,
04:32:00.100 | but like a lot of the merit,
04:32:02.940 | all the folks with merit are kind of leaving the government
04:32:05.300 | and going into, you know, tech or crypto or what have you.
04:32:09.100 | So even if these agencies were hollow before,
04:32:12.180 | in some ways they're becoming hollower
04:32:13.620 | because they have less talent there, right?
04:32:15.900 | So A, you can't hire and fire very easily.
04:32:18.300 | You can hire a little bit, but you can't really fire.
04:32:20.660 | B, a lot of the talent has left the building,
04:32:24.380 | but was there.
04:32:25.260 | C, we're entering the decentralizing era.
04:32:27.500 | And D, you know, like, be like Satoshi.
04:32:30.780 | Satoshi founded Bitcoin
04:32:32.220 | 'cause he knew you could not reform the Fed.
04:32:34.180 | There's everybody's trying to go and reform, reform, reform.
04:32:36.620 | The reason they're trying to reform
04:32:37.460 | is we haven't figured out the mechanism
04:32:39.060 | to build something new.
04:32:39.900 | And now perhaps we have that.
04:32:40.780 | So I've named a few of them, right?
04:32:42.060 | I'll name one more.
04:32:43.020 | Related to the last one.
04:32:45.340 | Fitness is actually the backdoor to a lot of medicine.
04:32:48.260 | Okay, why is that?
04:32:50.020 | You go to any, you know, conference,
04:32:51.940 | it could be neurology, it could be cardiology.
04:32:54.260 | You'll find somebody who's giving a talk
04:32:55.580 | that says something along the lines of,
04:32:57.220 | fitness is the ultimate drug.
04:32:58.620 | Maybe not today when people are saying,
04:32:59.860 | oh, fat phobic or whatever, but not too many years ago.
04:33:01.980 | You'd see somebody, people saying,
04:33:04.340 | fitness is the ultimate drug.
04:33:05.180 | If we could just prescribe fitness in a pill,
04:33:08.000 | that would improve your cardiovascular function,
04:33:09.620 | your neurological function, it deals with depression.
04:33:11.060 | - By the way, in that case,
04:33:12.140 | the use of the word drug means medicine, so.
04:33:14.740 | - Medicine, yeah, sure, sure, sure.
04:33:16.740 | - Fitness is the ultimate medicine, yeah.
04:33:18.380 | - Yeah, the ultimate medicine, right?
04:33:19.540 | So if they could just prescribe the effects of it,
04:33:22.220 | it's just like, boom, just massive effect, right?
04:33:24.560 | Like you're fit enough, you do the resistance training,
04:33:26.520 | it helps with, you know, preventive diabetes.
04:33:28.740 | Every kind of thing in the world,
04:33:30.060 | you see a significant treatment effect.
04:33:32.660 | Yet your fitness is your own responsibility.
04:33:34.860 | You go to some gym, 24 Hour Fitness, what do they have?
04:33:37.100 | They have on the wall exhortations like,
04:33:39.700 | your body is your responsibility, right?
04:33:42.460 | Am I right?
04:33:43.300 | - Yeah, yeah, yeah, yeah.
04:33:44.140 | - And you know, go ahead, but.
04:33:46.580 | - No, it's just, it's hilarious, yes, yes, yes.
04:33:49.940 | - It's funny, but it's true, right?
04:33:51.180 | - Yeah, it's funny 'cause it's true.
04:33:52.180 | - And so your fitness, your diet, that's your responsibility.
04:33:55.020 | But when you go into a doctor's office,
04:33:58.100 | suddenly it becomes lie back and think of England, okay?
04:34:00.900 | Suddenly you become passive.
04:34:02.400 | Suddenly, oh, your doctor is Dr. Google?
04:34:05.100 | Well, your doctor must be a moron.
04:34:06.740 | You're going and trying to take care of yourself,
04:34:08.900 | you're Googling symptoms.
04:34:10.460 | Oh, how stupid you are.
04:34:12.140 | I have a medical degree.
04:34:13.080 | And that doctor, see, the thing is,
04:34:15.580 | if you come in and you've self-diagnosed
04:34:17.540 | or you've done some of your own research,
04:34:19.360 | if you're right, and if they've got an ego about it,
04:34:22.480 | they're undermined.
04:34:23.780 | And if you're wrong, they're like, you know,
04:34:25.300 | ha ha, you know, arrogant.
04:34:26.460 | But either way, if they've got this kind of mindset,
04:34:29.020 | they have an incentive to resist
04:34:30.540 | the patient taking care of themselves.
04:34:31.660 | Isn't that the doctor's job, right?
04:34:33.260 | And they're kind of taught to behave like this, many of them.
04:34:36.740 | So what that means then is that intervention
04:34:39.660 | of that 15 or 30 minute appointment with the doctor,
04:34:42.460 | whatever drug they prescribe better hit you
04:34:44.260 | like Thor's hammer to put you back
04:34:46.340 | on the straight and narrow.
04:34:47.920 | Because that's only with you for like a few seconds,
04:34:51.300 | you know, a few minutes or whatever.
04:34:52.380 | The doctor's only with you for a few minutes.
04:34:53.620 | The drug is only, you know, some drugs are very powerful.
04:34:55.620 | So they actually do work like this, okay.
04:34:57.540 | But your fitness is your own responsibility.
04:35:01.380 | And that's a continuing forcing function every day.
04:35:04.260 | And again, we get back to decentralization, right?
04:35:07.220 | The decentralization of responsibility
04:35:09.700 | from somebody thinking of themselves merely as a patient
04:35:12.060 | to an active participant in their own health,
04:35:14.180 | who's doing their own monitoring of their own health, right?
04:35:16.540 | And logging all their stuff, who's eating, you know,
04:35:20.360 | properly and looking at the effect of their diet
04:35:22.620 | on things like their, you know,
04:35:23.620 | continuous glucose monitor is a V1, but other things, right?
04:35:27.140 | Who is, you know, as fit as they can possibly be.
04:35:30.020 | Like these are kind of obvious things,
04:35:32.100 | but why is this the back door to medicine?
04:35:33.500 | Because since FDA only regulates those things
04:35:36.740 | that are meant to diagnose and treat a disease,
04:35:39.100 | all the stuff that is meant to improve
04:35:42.020 | an otherwise healthy person
04:35:43.700 | is potentially out of their purview.
04:35:45.140 | Supplements are one interesting aspect
04:35:46.900 | that they were carved out in the mid 90s.
04:35:50.020 | And that's why the supplement industry is big
04:35:51.460 | 'cause FDA doesn't have as tight a rein on that.
04:35:54.420 | But all of the Fitbit, CGM,
04:35:58.780 | continuous glucose meter type stuff,
04:36:00.540 | you can crank out all kinds of things
04:36:02.540 | that help people get fitter,
04:36:05.060 | that will also actually have just general health value,
04:36:09.260 | but you're not quite marketing them to diagnose
04:36:11.460 | or treat, you know, a disease.
04:36:13.340 | See what I'm saying?
04:36:14.180 | You're marketing them for the purpose of fitness.
04:36:15.980 | This is a market, why?
04:36:17.740 | Because psychologically people,
04:36:20.180 | they don't like paying to get back to normal,
04:36:23.380 | but they will absolutely pay tons of money
04:36:25.100 | to get better than normal.
04:36:25.940 | They'll pay for fitness, they'll pay for makeup,
04:36:27.660 | they'll pay for hair, they'll pay for this and that.
04:36:30.980 | So that's actually the back door
04:36:33.140 | and you can do tons of things there
04:36:34.580 | where obviously being healthier is also protective.
04:36:38.540 | You can actually show the studies on this.
04:36:39.740 | So this way you build out all the tooling to get healthier
04:36:43.500 | and that actually helps on this axis.
04:46:45.900 | A few other things are kind of broken in the US medical system,
04:46:47.620 | but I'll add them, 'cause you got me on this topic.
04:36:48.980 | - I love this.
04:36:49.820 | - Okay, so--
04:36:50.660 | - This is the most eloquent exploration
04:36:54.060 | of the US medical system and how to improve it,
04:36:57.580 | how to fix it and what the future looks like.
04:36:59.900 | - Yeah, so-- - I love it.
04:37:01.180 | - So basically, so part of it is decentralizing control
04:37:03.700 | back to the individual, right?
04:37:05.160 | Now, I've talked about FDA at length,
04:37:07.000 | but let me talk about some of the other broken parts
04:37:08.820 | of the US system, right?
04:37:10.800 | There's like AMA, there's CPT, there's CPOM,
04:37:13.780 | there's this, you know, like all these regulations
04:37:17.340 | which see normally in capitalism,
04:37:21.060 | you have a buyer and a seller, right?
04:37:25.900 | In medicine, you have third-party regulation
04:37:28.320 | and fourth-party pricing and fifth-party payment, okay?
04:37:33.020 | So third-party regulation, FDA is regulating it,
04:37:36.280 | fourth-party pricing, it is, you know, the CPT codes, right?
04:37:41.280 | Fifth-party payment, it's the insurance companies, right?
04:37:44.980 | And just to discuss these bits of the system,
04:37:48.100 | first, why are some people against capitalism in medicine?
04:37:52.460 | I actually understand why they're against it
04:37:53.980 | because they are visualizing themselves on a gurney
04:37:57.900 | when they're being wheeled in
04:38:00.000 | and now somebody at their moment of vulnerability
04:38:02.380 | is charging this insane price for their care
04:38:04.940 | and many people in the US have had this horrible experience
04:38:07.380 | where they're bankrupted or scared of being bankrupted
04:38:10.220 | by medical bills.
04:38:11.160 | Therefore, the concept of adding more capitalism to medicine
04:38:14.420 | scares them and they think it's horrible
04:38:16.420 | and you're some like awful, greedy tech bro kind of thing.
04:38:19.860 | All right?
04:38:21.020 | Let me say I understand that concern
04:38:23.020 | and let me kind of, let me pull,
04:38:25.140 | tease that apart a little bit, right?
04:38:27.460 | Basically, the most capitalistic areas of medicine
04:38:30.820 | are the most functional areas of medicine.
04:38:32.320 | So that's say the places where you can walk in
04:38:34.540 | and walk out, okay?
04:38:36.380 | Whether that's dentistry, dermatology, plastic surgery,
04:38:40.860 | even veterinary medicine, which is not human, okay?
04:38:43.580 | Where you can make a conscious decision, say,
04:38:45.500 | okay, I want this care or I don't want this.
04:38:48.300 | I see a price list, I can pay cash, right?
04:38:51.420 | If I don't like it, I go to another dermatologist.
04:38:53.820 | There's few dermatological emergencies.
04:38:55.740 | That's why dermatologists have a great quality of life, okay?
04:38:58.280 | By contrast, when you're being wheeled in on a gurney,
04:39:00.380 | you need it right now, okay?
04:39:02.220 | And you're unconscious or what have you
04:39:04.260 | or you're not in a capacity to deal with it, right?
04:39:07.100 | And so these are the two extremes.
04:39:08.820 | It's like ambulatory medicine, you can walk around and pick
04:39:12.260 | and like ambulance medicine, okay?
04:39:14.820 | And what that means is the more ambulatory the medicine,
04:39:19.060 | the more legitimate capitalism is in that situation.
04:39:22.060 | People are okay with a dermatologist
04:39:25.620 | basically turning you down
04:39:26.780 | because you don't have enough money
04:39:28.020 | and you go to another dermatologist
04:39:29.500 | because you can comparison shop there.
04:39:31.000 | It's not usually an emergency, right?
04:39:33.600 | Whereas if you're coming in with an ambulance,
04:39:36.180 | then people don't wanna be turned down
04:39:37.660 | and I understand why, okay?
04:39:39.700 | What this suggests by the way
04:39:40.860 | is that you should only have insurance for the edge cases.
04:39:45.860 | Insurance should only cover the ambulance,
04:39:48.260 | not the ambulatory.
04:39:49.620 | And most people are losing money on insurance, right?
04:39:52.260 | Because most people are paying more in premiums
04:39:54.780 | than they are getting out.
04:39:56.040 | It's just that this huge flight of dollars through the air
04:39:59.220 | that no one can make heads or tails out of.
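The point that most people pay in more than they get out is just expected value; here's a toy Python sketch with entirely made-up numbers:

```python
# Toy expected-value model of health insurance (all numbers hypothetical).
# An insurer must collect more in premiums than it pays out on average,
# so the typical customer pays in more than they ever get back.

annual_premium = 6_000          # what a customer pays per year
claim_probability = 0.05        # chance of a catastrophic claim in a year
catastrophic_claim = 80_000     # size of that rare claim

expected_payout = claim_probability * catastrophic_claim
customer_expected_loss = annual_premium - expected_payout

print(f"expected payout per customer: ${expected_payout:,.0f}")
print(f"expected loss per customer:   ${customer_expected_loss:,.0f}")
```

The takeaway isn't that insurance is a scam; it's that it only makes economic sense for the rare, ruinous "ambulance" case, not for routine "ambulatory" care.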
04:40:01.180 | Oh, the other aspect that's obviously broken
04:40:03.700 | is employer provided health insurance
04:40:05.260 | which just started after World War II.
04:40:06.980 | So auto insurance is in a much more competitive market.
04:40:09.700 | You don't whip out your auto insurance card
04:40:12.020 | at the gas station to pay for your gas, right?
04:40:14.100 | You only whip it out when there's a crash, right?
04:40:16.140 | That's what quote health insurance should be.
04:40:18.460 | And the Singapore model is actually a pretty good one
04:40:20.480 | for this where they have sort of a mandatory HSA.
04:40:23.020 | You have to like put some money in that
04:40:25.220 | and that pays for your healthcare bills
04:40:27.580 | but then it's cash out of that.
04:40:28.720 | It's like a separate pocket
04:40:29.580 | sort of for savings to pay for-
04:40:31.140 | - Health savings account.
04:40:32.060 | - Health savings account, right?
04:40:33.260 | The thing about this is once you realize,
04:40:36.140 | well first, ambulatory medicine is capitalistic medicine.
04:40:39.220 | Ambulance medicine is socialist medicine, okay?
04:40:42.660 | You wanna shift people more towards ambulatory.
04:40:44.940 | Guess what?
04:40:45.760 | That's in their interest as well.
04:40:46.940 | Now that brings us back to the monitoring, right?
04:40:49.600 | The continuous monitoring where
04:40:51.860 | whether eventually it's Mike Snyder's Integrome,
04:40:54.220 | the V1 is the quantified self and the Apple watch
04:40:57.220 | and the continuous glucose meters
04:40:59.340 | and the VN is the Mike Snyder Integrome.
04:41:01.500 | There's a site called Q.bio which is doing this also, right?
04:41:05.020 | Eventually this stuff will hopefully just be in a device
04:41:07.260 | that just measures tons and tons of variables on you, right?
04:41:09.780 | There's ways of measuring some of these metabolites
04:41:13.820 | without breaking the skin.
04:41:15.140 | So it's not, you don't have to keep breaking the skin
04:41:18.480 | over and over, there's various ways of doing that.
04:41:20.360 | So now you've got something where you've got the monitoring,
04:41:23.860 | you've got the dashboards, you've got the alerts
04:41:26.860 | and just like this Larry Smarr guy
04:41:28.820 | that I mentioned, the measured man,
04:41:30.020 | you might be able to shift more and more things
04:41:32.140 | to ambulatory.
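The monitoring-dashboard-alerts loop described above can be sketched in a few lines; the glucose thresholds here are assumptions for illustration, not medical advice:

```python
# Minimal sketch of a quantified-self alert loop (hypothetical thresholds).
# A stream of glucose readings (mg/dL) is checked against a target range;
# out-of-range readings raise an alert instead of waiting for a crisis.

LOW, HIGH = 70, 180  # assumed target range in mg/dL

def check_readings(readings):
    """Return (timestamp, value, kind) for every out-of-range reading."""
    alerts = []
    for t, value in readings:
        if value < LOW:
            alerts.append((t, value, "low"))
        elif value > HIGH:
            alerts.append((t, value, "high"))
    return alerts

readings = [("08:00", 95), ("12:30", 210), ("15:00", 64), ("19:00", 120)]
for t, v, kind in check_readings(readings):
    print(f"{t}: {v} mg/dL ({kind})")
```

The design point is that the check runs continuously on the patient's side, which is what shifts a problem from the "ambulance" bucket to the "ambulatory" one.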
04:41:33.780 | And one of the things about this also is
04:41:35.740 | the medical system is set up in a bad way
04:41:37.900 | where the primary care physician is the one who is
04:41:42.900 | like not the top of their class,
04:41:44.980 | but the guys who are at the bottom of the pinball machine,
04:41:47.420 | the surgeons and the radiologists,
04:41:49.100 | once your stuff is already broken, okay?
04:41:51.620 | They're the ones who are paid the most.
04:41:52.740 | So a lot of the skill collects at the post break stage,
04:41:57.740 | right, where you actually want Doogie Howser, M.D.,
04:41:59.940 | at the upstream stage, okay?
04:42:03.020 | So you want this amazing, amazing doctor there, right?
04:42:05.660 | How could we get that?
04:42:06.860 | I mentioned the app that doesn't exist,
04:42:08.980 | which is like a better version of the 3500B, right?
04:42:11.740 | Here's another app that doesn't exist.
04:42:12.860 | And this is one that FDA is actively quashed.
04:42:15.020 | Why can't you just take an image of a mole
04:42:17.980 | or something like that?
04:42:18.940 | You know, with the incredible cameras we have,
04:42:21.580 | a huge amount of medical imaging should be able to be done
04:42:24.620 | at home and it goes to doctors,
04:42:27.340 | whether it's in the US or the Philippines or India.
04:42:29.980 | I mean, teleradiology exists, right?
04:42:32.460 | Why can you not do that for dermatology,
04:42:35.300 | for everything else?
04:42:36.140 | You should be able to literally just hold the thing up
04:42:37.940 | and with a combination of both AI and MDs, just diagnose.
04:42:42.660 | That should exist, right?
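The "combination of both AI and MDs" idea is essentially a triage pipeline; here is a minimal sketch, where the function name, cutoffs, and outputs are all hypothetical:

```python
# Sketch of an "AI plus MD" triage step (all cutoffs hypothetical).
# A model scores an at-home image; confident-benign cases get routine
# advice, everything uncertain or suspicious is routed to a human doctor.

def triage(model_score, benign_cutoff=0.10, urgent_cutoff=0.80):
    """model_score: assumed model probability of malignancy, in [0, 1]."""
    if model_score < benign_cutoff:
        return "routine: recheck in 3 months"
    if model_score >= urgent_cutoff:
        return "urgent: dermatologist review now"
    return "refer: queue for teledermatology read"

print(triage(0.03))
print(triage(0.45))
print(triage(0.92))
```

Note the human stays in the loop for anything ambiguous, so "better than 99% of doctors, 99% of the time" doesn't require the model to be right 100% of the time.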
04:42:45.140 | Answer is there's a combination of both American doctors
04:42:49.180 | and the FDA that team up to prevent this or slow this.
04:42:52.460 | And, you know, one argument is,
04:42:55.580 | oh, the AI is not better than a human 100% of the time
04:42:59.100 | because it's not deployed yet.
04:43:00.620 | Therefore it could make an error, therefore it's bad.
04:43:02.100 | Even if it's better than 99% of doctors, 99% of the time.
04:43:05.220 | Another argument is the software has to go
04:43:07.620 | through design control, okay?
04:43:09.540 | Now, basically, once you understand how FDA works,
04:43:13.620 | basically imagine the most bureaucratic frozen process
04:43:18.340 | for code deployment at any company ever.
04:43:21.220 | And that is the most nimble thing ever
04:43:24.900 | relative to FDA's design review.
04:43:27.220 | So just to review, A, talked about how FDA
04:43:31.060 | was blocking all this stuff.
04:43:32.580 | B, talked about why ambulatory medicine is capitalist
04:43:35.580 | medicine, ambulance medicine is socialist medicine.
04:43:38.700 | C, talked about how with the diagnostic stuff
04:43:41.620 | we can shift it over to ambulatory.
04:43:43.440 | D, talked about how there's lots of things
04:43:45.120 | where you could have a combination of doctor and AI
04:43:47.020 | in an app that you kind of quickly self-diagnose.
04:43:49.740 | Some of this is happening now.
04:43:51.900 | Some of the telemedicine laws were relaxed during COVID
04:43:55.980 | where now people, you know, a doctor from Wyoming
04:43:58.740 | can prescribe for somebody in Minnesota.
04:44:01.300 | Like some of that stuff was relaxed during COVID, okay?
04:44:04.100 | There's other broken things in medical system.
04:44:05.740 | I'll just name two more and then kind of move on, okay?
04:44:08.500 | I mentioned like AMA and CPT, okay?
04:44:11.540 | - Those are two regulatory bodies?
04:44:13.460 | - AMA, American Medical Association,
04:44:14.780 | CPT, Current Procedural Terminology, okay?
04:44:17.180 | Basically, you know Marx's labor theory of value
04:44:20.380 | where people are supposed to be paid on their effort, right?
04:44:24.100 | Of course, the issue with this is that you'd be paying
04:44:27.900 | a physicist to try to dunk and, you know,
04:44:32.220 | they'd be trying but they wouldn't actually
04:44:34.460 | probably be able to do it.
04:44:35.580 | They'd be trying real hard, whereas you actually
04:44:38.220 | want to pay people on the basis of results, right?
04:44:40.520 | Cheaply attained results are actually better
04:44:42.540 | than expensively attained non-results, perhaps obvious, okay?
04:44:45.720 | Nevertheless, the way that the US medical system
04:44:48.700 | has payouts, it's based on so-called RVUs,
04:44:50.940 | relative value units.
04:44:52.260 | And this is something where there's a government body
04:44:56.580 | that sets these prices and it is in theory only for Medicare
04:45:01.580 | but all the private insurers key off of that.
04:45:05.080 | And AMA basically publishes a list of these so-called
04:45:10.080 | the CPT codes, which is like the coding,
04:45:13.760 | the biomedical coding of this,
04:45:15.260 | and what each medical process is worth and whatnot.
04:45:18.820 | So it's like, I don't remember all the numbers,
04:45:22.480 | but it's like a five-digit code and it's like,
04:45:24.780 | okay, I got a test for cystic fibrosis
04:45:27.500 | or a test for this or a test for that.
04:45:29.420 | God help you if your medical billing is erroneous.
04:45:35.280 | The insurance company will reject it
04:45:37.220 | 'cause it doesn't pay for that.
04:45:38.060 | This is this giant process of trying to encode
04:45:41.180 | every possible ailment and condition into the CPT codes
04:45:45.900 | and you can literally get degrees in medical billing
04:45:49.620 | just for this, okay?
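The pricing mechanism being described boils down to roughly payment = total RVUs for a CPT code times a conversion factor (the real Medicare formula also applies geographic adjustments); a sketch with made-up codes and values:

```python
# Simplified sketch of RVU-based pricing. The CPT codes, RVU values, and
# conversion factor below are hypothetical; the actual Medicare formula
# additionally weights work/practice-expense/malpractice RVUs by
# geographic adjustment factors.

CONVERSION_FACTOR = 33.0  # assumed dollars per RVU

rvu_table = {            # hypothetical CPT code -> total RVUs
    "99213": 2.0,        # e.g. a routine office visit
    "81220": 5.5,        # e.g. a genetic test
}

def payment(cpt_code):
    return rvu_table[cpt_code] * CONVERSION_FACTOR

print(f"99213 pays ${payment('99213'):.2f}")
print(f"81220 pays ${payment('81220'):.2f}")
```

Private insurers then key their own rates off these administered prices, which is the "fourth-party pricing" point: neither buyer nor seller is actually setting the number.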
04:45:50.900 | This enormous inefficient industry, okay?
04:45:54.220 | Like literally medical billing is a whole field, okay?
04:45:56.460 | - Yeah.
04:45:57.300 | What do you wanna do when you grow up?
04:45:59.900 | I wanna work in medical billing.
04:46:01.300 | - In medical billing, okay?
04:46:02.740 | Where everybody's mad at you at all times.
04:46:04.660 | Part of what happens is when you give a treatment,
04:46:07.620 | when a doctor gives a treatment to a patient,
04:46:10.060 | they can't like repo the treatment, okay?
04:46:12.460 | Like a car, you sell a car, you can in theory repo the car.
04:46:14.920 | So the patient has a treatment.
04:46:16.180 | Now, what happens?
04:46:18.060 | Well, the insurance company,
04:46:19.900 | that treatment is perhaps provided,
04:46:23.020 | look, it's a lab test provided by a company, right?
04:46:24.780 | The company bills the patient.
04:46:26.940 | The company is supposed to charge a high price, why?
04:46:29.700 | The insurance company wants it to try to collect
04:46:31.900 | from the patient.
04:46:32.740 | The patient is scared, oh my God, they see this huge price.
04:46:34.980 | They sometimes don't pay.
04:46:37.020 | Sometimes the insurance company doesn't pay either.
04:46:40.140 | And when a company is stiffed by an insurance company,
04:46:44.220 | when a diagnostic company is stiffed by an insurance company
04:46:46.740 | it has to jack up the price on everybody else, right?
04:46:48.960 | Everything boils down to the fact that you don't have
04:46:52.460 | you know, a buyer and a seller.
04:46:54.300 | The doctor doesn't know the price of what is being sold.
04:46:57.100 | The buyer doesn't know the price of what is being bought
04:47:01.100 | at the time it's being bought.
04:47:02.740 | Neither party can really even set a free price
04:47:04.660 | because there's this RVU system that hovers above.
04:47:07.460 | The buyer feels they've already bought it
04:47:09.420 | 'cause they bought insurance.
04:47:10.820 | The insurance company doesn't wanna pay for it.
04:47:12.820 | Everybody is trying to like push the price
04:47:14.900 | onto somebody else and you know,
04:47:17.500 | not actually show the sticker price of anything
04:47:19.860 | and hide everything and so on and so forth.
04:47:22.300 | Oh, the other thing about it is obviously lawsuits
04:47:24.420 | are over everything.
04:47:25.460 | Everybody's mad about everything.
04:47:27.940 | It's health, people are dying, okay?
04:47:30.260 | So everything is just optimized for optics
04:47:32.700 | as opposed to results, right?
04:47:34.340 | Similarly, actually many drugs are optimized
04:47:36.260 | for minimizing side effects and optics
04:47:39.260 | rather than maximizing effects
04:47:40.580 | which are totally different criteria, right?
04:47:42.540 | You might have, for example, a drug that only cures
04:47:46.460 | a thousand people but doesn't have any side effects
04:47:49.020 | versus one that cures a million people
04:47:51.940 | but that has 10 really serious side effects a year, right?
04:47:55.900 | And the second one would probably not happen
04:47:57.780 | because those side effects would be so big, okay.
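The optics-versus-results contrast is just arithmetic; a back-of-the-envelope sketch with the made-up numbers from the example above:

```python
# Crude net-benefit comparison (numbers hypothetical, weights crude).
# Drug A: cures 1,000 people, no serious side effects.
# Drug B: cures 1,000,000 people, 10 serious side effects a year.

drug_a = {"cured": 1_000, "serious_side_effects": 0}
drug_b = {"cured": 1_000_000, "serious_side_effects": 10}

def net_benefit(drug):
    # Each cure counts +1, each serious side effect counts -1.
    return drug["cured"] - drug["serious_side_effects"]

print(net_benefit(drug_a))
print(net_benefit(drug_b))
```

A regulator graded on visible harms rather than invisible foregone cures prefers A; graded on net benefit, B wins by almost three orders of magnitude.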
04:48:00.220 | How do you attack this?
04:48:02.340 | I name a few examples but I actually think the reform
04:48:04.660 | is gonna come in part from outside the system.
04:48:06.820 | In particular, India is coming online, okay?
04:48:10.480 | Why is that important?
04:48:11.820 | Well, you may have encountered an Indian doctor or two, okay?
04:48:16.380 | Maybe an Indian programmer, one or two, all right?
04:48:18.940 | And I do think telemedicine could explode, right?
04:48:23.940 | Where you could have an Indian doctor in India
04:48:28.060 | and there's a US doctor, okay, who's like a dispatcher.
04:48:33.340 | You see what I'm saying?
04:48:35.140 | They've got all these other Indian doctors behind them,
04:48:37.020 | they've got a telemedical app
04:48:38.540 | and you are now doing something
04:48:40.600 | where these relatively inexpensive Indian doctors
04:48:44.820 | who are vetted by the American doctor
04:48:46.460 | or the doctor in the jurisdictions of license
04:48:48.560 | become the backend doctors of the world.
04:48:50.100 | To some extent, that's already there with teleradiology
04:48:52.380 | and other kinds of things, right?
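The dispatcher model being sketched is basically a routing queue: one locally licensed doctor fronting a vetted backend pool. A toy version (doctor names and fields hypothetical):

```python
# Toy round-robin dispatcher for the "backend doctors" model.
# All names are hypothetical; the point is one licensed front-end doctor
# routing consults to a vetted pool and signing off on the result.

from collections import deque

backend_pool = deque(["dr_rao", "dr_iyer", "dr_menon"])  # vetted backend MDs

def dispatch(consult):
    """Assign the next available backend doctor, round-robin."""
    doctor = backend_pool[0]
    backend_pool.rotate(-1)  # move them to the back of the queue
    return {"consult": consult,
            "assigned_to": doctor,
            "signed_off_by": "us_dispatcher_md"}

print(dispatch("rash photo #1"))
print(dispatch("rash photo #2"))
```

Teleradiology already works roughly this way; the speculation here is extending the same routing pattern to other specialties.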
04:48:53.940 | But now that you've got literally like a billion Indians
04:48:56.380 | who've just come online, okay,
04:48:58.260 | you have this huge pool of folks
04:48:59.620 | who have a different attitude towards medicine.
04:49:01.380 | For example, it's a lot more cash payment over there.
04:49:03.700 | For example, India is big on generic drugs.
04:49:06.140 | For example, during COVID, it had something called,
04:49:08.860 | has something called Arrogya Setu
04:49:10.100 | which is a national telemedicine app, okay?
04:49:12.380 | The US wasn't able to ship that.
04:49:13.580 | In some ways, India's digital infrastructure,
04:49:15.740 | again, you'll have to read a post called
04:49:18.720 | "The Internet Country" by tigerfeathers.substack.com
04:49:21.760 | and you'll see that actually India's
04:49:23.840 | national software infrastructure is surprisingly good.
04:49:27.120 | It's not as good as China's in some ways,
04:49:29.100 | but it's like better than the US's
04:49:30.640 | which is like health.gov and like non-existent.
04:49:32.780 | It's like kind of impressive
04:49:33.800 | how good some of India's software is.
04:49:35.760 | The fact that it exists is good.
04:49:37.660 | So you have all these new doctors coming online,
04:49:40.040 | India cranks out generics, right?
04:49:42.540 | Telemedicine is now more legal in the US
04:49:46.000 | and you have a cash payment in India, right?
04:49:49.480 | And in a lot of other places,
04:49:50.560 | you don't have the whole insurance, employer health thing.
04:49:53.720 | And this market is growing.
04:49:55.520 | So you could have a sort of parallel market
04:49:59.680 | that starts evolving, right?
04:50:01.440 | Which is, and people are already doing some medical tourism
04:50:03.720 | and I think that's another exit from the FDA.
04:50:05.560 | You have a parallel market that starts evolving
04:50:07.760 | that just starts from fundamentally different premises.
04:50:09.840 | It's just cash, cash for everything, right?
04:50:13.260 | There's downsides with cash for everything.
04:50:15.260 | There's a huge upside with cash for everything.
04:50:17.020 | Cash for everything means
04:50:17.860 | you get customer service from the doctor.
04:50:20.140 | It means the prices are actually visible.
04:50:23.140 | It ideally pushes you again,
04:50:25.180 | towards more ambulatory medicine
04:50:26.620 | rather than ambulance medicine.
04:50:28.480 | It is monitoring, constant monitoring
04:50:31.340 | with the quantified self and whatnot,
04:50:33.300 | as opposed to just let your system fail
04:50:35.700 | and then wheel you in, right?
04:50:38.600 | There's a reputational bridge
04:50:40.040 | because now we've had a couple of generations almost
04:50:42.400 | of Indian doctors in the US.
04:50:43.880 | So people know that there's some very competent
04:50:47.200 | Indian doctors.
04:50:48.040 | They are a good chunk of the AMA.
04:50:49.200 | And so they can sort of lobby for this.
04:50:52.260 | And you have plenty of Indian engineers.
04:50:54.960 | Now I'm not saying India alone is a panacea,
04:50:57.320 | but I do think that this is a large enough parallel market
04:51:00.240 | to start doing interesting things.
04:51:02.000 | - And you could see a sort of medical tourism,
04:51:04.280 | medical migration to where it gives India an opportunity
04:51:08.480 | to basically let go of the constraints of the FDA.
04:51:13.480 | - Yeah, because--
04:51:15.920 | - And innovate aggressively.
04:51:17.480 | And I mean, it's just such a huge opportunity
04:51:20.340 | to define the future of medicine
04:51:22.440 | and make a shit ton of money
04:51:25.000 | from a market that's desperate for it in the United States
04:51:28.040 | because of all the overregulation.
04:51:30.160 | - That's right.
04:51:31.000 | And I think basically it's something where the reason,
04:51:33.040 | it needs to be--
04:51:33.880 | - And that would fix the FDA, sorry to interrupt.
04:51:35.720 | - Yes, we fix the FDA by exiting the FDA, right?
04:51:38.440 | And then the FDA would dry out and then it would hopefully--
04:51:42.040 | - It might reform, it might dry out, right?
04:51:44.400 | And this is why people are, for example,
04:51:46.400 | they're traveling across borders,
04:51:48.040 | they're getting orders from Canadian pharmacies.
04:51:50.480 | A lot of this type of stuff,
04:51:52.000 | we can start to build alternatives, right?
04:51:54.120 | I mean, India's generic industry is really important
04:51:56.880 | because it just doesn't enforce American IP there.
04:51:59.380 | So generic drugs are cheaper, right?
04:52:02.600 | And it's quite competent, it's been around for a while.
04:52:04.780 | So there's enough proof points there
04:52:06.760 | where, again, I'm not saying a panacea,
04:52:09.920 | it's gonna be something which will require
04:52:11.600 | like American and Indian collaboration.
04:52:13.160 | I think there's gonna be a lot of other countries
04:52:14.480 | and so on that are involved.
04:52:15.800 | But you can start to see another pole getting set up,
04:52:19.100 | which is a confident enough civilization
04:52:22.280 | that is willing to take another regulatory path, right?
04:52:25.640 | And that is in some ways doing better
04:52:27.840 | on national software than the US is.
04:52:29.720 | And it has enough of a bridge to the US
04:52:31.720 | that it can be that stimulation which you need,
04:52:35.120 | which is kind of that poke from the outside, right?
04:52:37.800 | I wanna talk about India,
04:52:39.000 | but let me just kind of wrap up
04:52:41.000 | on this big FDA biomedical kind of thing, right?
04:52:44.040 | With the book, "The Network State",
04:52:47.040 | the purpose of "The Network State",
04:52:49.600 | I want people to be able to build
04:52:51.680 | different kinds of network states.
04:52:52.720 | I want people to build the vegan village.
04:52:55.480 | I want people to be able to build a,
04:52:58.000 | if they wanna do the Benedict Option,
04:52:59.320 | like a Christian network state.
04:53:01.120 | If people wanna do different kinds of things,
04:53:03.320 | I'm open to many different things
04:53:05.040 | and I will fund lots of different things.
04:53:07.520 | For me, the motivation is just like you needed
04:53:10.960 | to start a new currency, Bitcoin.
04:53:15.280 | It was easier to do that than reform the Fed.
04:53:16.920 | I think it's easier to start a new country
04:53:18.040 | than reform the FDA.
04:53:18.920 | And so I wanna do it to, in particular, get to longevity,
04:53:22.000 | right, meaning longevity enhancement, right?
04:53:24.280 | And what does that mean?
04:53:25.320 | So in an interesting way,
04:53:27.160 | and this will sound like a trite statement,
04:53:28.920 | but I think it's actually a deep statement
04:53:30.480 | or let me hopefully try to convince you it is.
04:53:32.880 | Crypto is to finance sort of what longevity is to medicine,
04:53:36.880 | you know, crypto to currency, longevity to medicine.
04:53:39.240 | It inverts certain fundamental assumptions.
04:53:42.480 | Okay, so at first crypto looks like traditional finance.
04:53:44.880 | It's got the charts and the bands
04:53:47.160 | and you're buying and selling and so on.
04:53:49.000 | But what Satoshi did is he took fundamental premises
04:53:52.040 | and flipped them.
04:53:52.880 | For example, in the traditional macroeconomic worldview,
04:53:57.760 | hyperinflation is bad, but deflation is also bad.
04:54:00.920 | So a little inflation is good, right?
04:54:04.320 | In the traditional macroeconomic worldview,
04:54:06.880 | it's good that there are custodians, banks,
04:54:09.160 | that, you know, kind of intermediate the whole system,
04:54:11.640 | right?
04:54:12.480 | In the traditional, you know, worldview,
04:54:14.520 | every transaction needs to be reversible
04:54:17.840 | because somebody could make a mistake
04:54:19.200 | and so on and so forth, right?
04:54:21.360 | In the traditional worldview,
04:54:22.440 | you don't really have root access over your money.
04:54:24.940 | Satoshi inverted all of those things, okay?
04:54:29.760 | Obviously, you know, the big one is hyperinflation is bad,
04:54:33.200 | but he also thought mild inflation was bad
04:54:35.520 | and deflation was good.
04:54:36.640 | That's just a fundamental shift, okay?
04:54:39.840 | He gave you root access over your money.
04:54:41.520 | You're now a system administrator of your own money.
04:54:42.920 | You can rm -rf your entire fortune
04:54:46.000 | or send millions with a keystroke.
04:54:47.320 | You are now the system administrator of your own money.
04:54:48.840 | That alone is why cryptocurrency is important.
04:54:50.960 | If you want system administration access at times
04:54:53.200 | to computers, you'll want it for currency, right?
04:54:55.480 | To be sovereign.
04:54:57.000 | You know, there's other assumptions, like the assumption
04:55:00.480 | that every transaction is private
04:55:00.480 | in the existing system by default,
04:55:02.880 | or it's visible only to the state.
04:55:04.820 | Whereas at least the initial, you know,
04:55:06.400 | the Bitcoin blockchain, everything is public, right?
04:55:08.640 | There are various kinds of things like this
04:55:10.400 | where he just inverted fundamental premises.
04:55:13.160 | And then the whole crypto system is in,
04:55:16.840 | the crypto economy is in many ways a teasing out
04:55:19.320 | of what that means.
04:55:21.400 | Just to give you one example, the US dollar,
04:55:24.940 | people have seen those graphs where it's like inflating
04:55:26.880 | and so it just like loses value over time
04:55:29.080 | and you've seen that, okay?
04:55:30.780 | Whereas, and most of the time it's just sort of denied
04:55:34.360 | that it's losing any value.
04:55:35.840 | The most highbrow way of defending it
04:55:39.440 | is the US dollar trades off temporary short-term
04:55:44.440 | price stability for long-term depreciation.
04:55:47.460 | And Bitcoin makes the opposite trade-off.
04:55:50.920 | In theory, at least, long-term appreciation
04:55:54.740 | at the expense of short-term price instability.
04:55:58.000 | Because like, you know,
04:55:59.400 | there's the whole plunge protection team and so on.
04:56:01.200 | Basically, there's various ways in which they try
04:56:05.320 | to maintain price stability in the medium term
04:56:08.000 | at the expense of long-term depreciation.
04:56:09.440 | You need like a reserve of assets to keep, you know,
04:56:11.920 | stabilizing the dollar against various things.
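[Editorial illustration, not from the conversation: the trade-off he describes can be made concrete with a toy compounding calculation showing how mild inflation turns into large long-term depreciation.]

```python
# Editor's toy sketch (an assumption for illustration, not Balaji's numbers):
# steady mild inflation compounds into substantial long-term loss of
# purchasing power.
def purchasing_power(annual_inflation: float, years: int) -> float:
    """Fraction of today's purchasing power left after `years` of inflation."""
    return 1.0 / ((1.0 + annual_inflation) ** years)

# At a 2% annual rate, a dollar loses roughly half its purchasing power
# in about 35 years, and roughly three quarters in about 70.
for years in (10, 35, 70):
    print(f"after {years} years at 2%: {purchasing_power(0.02, years):.2f}")
```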
04:56:13.560 | So what does crypto medicine look like
04:56:16.840 | relative to fiat medicine to make the same analogy, right?
04:56:19.580 | The existing medical system,
04:56:21.080 | it assumes that a quick death is bad,
04:56:25.240 | an early death is bad,
04:56:26.420 | but also that living forever is either unrealistic
04:56:30.320 | or impossible or undesirable,
04:56:32.320 | that you should die with dignity or something like that,
04:56:35.040 | okay?
04:56:36.200 | So a little death is good.
04:56:38.000 | That's the existing medical system.
04:56:40.080 | Whereas the concept of life extension
04:56:42.840 | and, you know, David Sinclair and, you know,
04:56:45.440 | what do you call it?
04:56:46.280 | healthspan, rejects that fundamental premise.
04:56:49.480 | And it says, actually, the way to defeat cancer
04:56:52.220 | is to defeat aging.
04:56:53.060 | Aging is actually a programmed biological process.
04:56:56.500 | And we have results that are showing stopping
04:57:00.260 | or even reversing aging in some ways.
04:57:02.420 | And so now, just like with the other thing,
04:57:06.840 | you say a quick death is bad,
04:57:09.780 | and so is actually death itself, right?
04:57:12.700 | So we actually want significant life extension.
04:57:15.620 | This is similar.
04:57:16.620 | It's very similar to what, you know,
04:57:18.840 | the rejection of the fiat system, right?
04:57:22.480 | The fiat system says, a little inflation is good.
04:57:25.360 | Fiat medicine says, a little death is good.
04:57:28.240 | Bitcoin says, actually, no inflation,
04:57:30.900 | just get more valuable over time.
04:57:32.880 | And crypto medicine says, actually, let's, you know,
04:57:35.360 | extend life.
04:57:36.320 | This leads to all kinds of new things
04:57:38.840 | where you start actually thinking about,
04:57:40.200 | all right, how do I maintain my health with,
04:57:43.840 | you know, diagnostics?
04:57:45.160 | How do I, you know, take control of my own health
04:57:48.700 | with the decentralization of medicine?
04:57:50.500 | All the stuff that I've been describing
04:57:52.400 | sort of fits, like, longevity is to traditional medicine
04:57:55.780 | as crypto is to traditional currency.
04:57:58.660 | - If we take those assumptions separately,
04:58:00.420 | so we take cryptocurrency aside,
04:58:02.440 | is that to you obvious that this,
04:58:07.260 | letting go of this assumption about death,
04:58:10.620 | is that an obvious thing?
04:58:12.100 | Is longevity obviously good?
04:58:15.660 | Versus, for example, the devil's advocate to that would be,
04:58:19.660 | what we want is to keep death
04:58:22.220 | and maximize the quality of life up until the end.
04:58:26.260 | - Well-- - Like, so that
04:58:31.560 | you ride into the sunset, healthy.
04:58:31.560 | - Somebody who was listening to the whole podcast
04:58:32.980 | would say, well, Balaji, just a few hours ago,
04:58:35.220 | you were saying this gerontocracy runs the US
04:58:37.460 | and they're all old and they don't get it, blah, blah, blah.
04:58:39.660 | And now you're talking about making people live forever,
04:58:41.660 | so there's never any new blood to wash it out.
04:58:43.260 | Ha ha, what a contradiction, right?
04:58:44.620 | - It's funny that you're so on point
04:58:48.060 | across all the topics we covered
04:58:51.940 | and the possible criticism, I love it.
04:58:53.620 | - Well, just trying to anticipate, you know, some--
04:58:55.260 | - Yeah, good, well done, well done, sir.
04:58:58.220 | - I think the argument on that is,
04:59:00.920 | so long as you have a frontier,
04:59:03.080 | it is okay for someone to live long, okay?
04:59:06.980 | So long as people can exit to a new thing, number one.
04:59:09.900 | Number two is, in order for us to go and colonize
04:59:13.100 | other planets and so on, if you do wanna get to Mars,
04:59:16.340 | if you wanna become Star Trek and what have you,
04:59:19.620 | probably gonna need to have,
04:59:23.240 | just to survive a long flight, so to speak,
04:59:28.620 | multiple light year flight,
04:59:30.260 | you're gonna need to have life extension.
04:59:32.580 | So to become a pioneering interstellar kind of thing.
04:59:35.260 | I know that, it's the kind of thing which sounds like,
04:59:38.540 | okay, yeah, and when we're on the moon,
04:59:40.300 | we're gonna need shovels.
04:59:42.020 | It sounds like piling a fantasy
04:59:47.020 | on top of a fantasy in that sense,
04:59:48.500 | but it's also something where,
04:59:49.720 | if you're talking about the vector of our civilization,
04:59:51.620 | where are we going?
04:59:52.740 | Well, I actually do think it's either anarcho-primitivism
04:59:55.700 | or optimalism/transhumanism.
04:59:58.300 | Either we are shutting down civilization, it's degrowth,
05:00:01.740 | it's Unabomber, et cetera, or it's the stars
05:00:05.380 | and escaping the prime number maze.
05:00:07.020 | - It's like, to me, it's obvious that we're going to,
05:00:12.020 | if we're to survive, expand out into space.
05:00:15.940 | - Yes.
05:00:16.780 | - And it's obvious that once we do,
05:00:20.540 | we'll look back at anyone, which is currently most people,
05:00:24.140 | that didn't think of this future,
05:00:27.380 | didn't anticipate this future,
05:00:28.860 | worked towards this future as Luddites,
05:00:31.700 | like as people who totally didn't get it.
05:00:34.340 | It'll become obvious.
05:00:35.620 | Right now it's impossible, and then it'll become obvious.
05:00:38.300 | - Yes.
05:00:39.140 | - It seems like, yes, longevity in some form,
05:00:42.420 | I mean, there could be a lot of arguments
05:00:43.940 | of the different forms longevity could take,
05:00:46.340 | but in some form, longevity is almost a prerequisite
05:00:50.140 | for the expansion out into the cosmos.
05:00:52.380 | - That's right.
05:00:53.220 | - Expansion of longevity.
05:00:54.060 | - There's also a way to bring it back to Earth
05:00:55.860 | to an extent, which is how were societies used to be judged?
05:00:58.980 | You may remember, people used to talk about life expectancy
05:01:01.900 | as a big thing, right?
05:01:04.100 | Life expectancy is actually a very, very, very good metric.
05:01:07.540 | It's a ratio scale variable.
05:01:09.140 | There's like four different class of variables
05:01:11.220 | statisticians talk about.
05:01:12.100 | Ratio scale is like years or meters or kilograms, okay?
05:01:16.580 | Then you have interval scale,
05:01:18.340 | where plus and minus means something,
05:01:19.700 | but there's no absolute zero.
05:01:21.020 | Then you have ordinal, where there's only ranks,
05:01:23.060 | and plus and minus don't mean anything,
05:01:24.140 | and then you have categorical,
05:01:25.260 | like the Yankees and the Braves are categorical variables,
05:01:29.660 | they're just different,
05:01:30.500 | but all you have is the comparator operator,
05:01:32.380 | whether you have equality, you don't have a rank, okay?
05:01:34.900 | So ratio scale data is the best
05:01:37.300 | because you can compare it across space and time.
05:01:39.460 | If you have a skeleton that is like, you know,
05:01:43.580 | two meters tall, that's from 3000 years ago,
05:01:46.540 | you can compare the height of people
05:01:49.020 | from many, many years ago, different cultures and times,
05:01:51.180 | right?
05:01:52.020 | Whereas their currency is much harder to value,
05:01:53.740 | that's not like a ratio scale variable.
05:01:55.060 | Other things are harder to value across space and time,
05:01:57.460 | right?
05:01:58.300 | So life expectancy is good because,
05:02:01.100 | as a ratio scale variable, it's a very clear definition,
05:02:03.820 | right, like when someone born and died,
05:02:05.660 | those are actually relatively clear.
05:02:07.500 | But most other things aren't like that.
05:02:11.180 | You know, that's why murder or death,
05:02:14.860 | you know, it can be scored, it's unambiguous,
05:02:18.180 | you know, it's done when it's done,
05:02:19.940 | whereas when did somebody get sick?
05:02:21.860 | Oh, well, they were kind of sick,
05:02:23.420 | or they were sick today, they were sick at this hour,
05:02:25.420 | the boundary conditions, many other kinds of things
05:02:27.120 | are not like clear cut like that, right?
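[Editorial sketch of the four classes of variables he lists, using the standard Stevens measurement-scale taxonomy; the operation sets are my framing of his examples, not a quote.]

```python
# Each measurement scale supports a subset of operations. Ratio-scale data
# (years, meters, kilograms) is the richest: it has a true zero, so ratios
# like "twice as long" are meaningful, which is why it compares well across
# space and time.
SCALES = {
    "ratio":       {"==", "<", "+", "-", "/"},  # years, meters, kilograms
    "interval":    {"==", "<", "+", "-"},       # +/- meaningful, no absolute zero
    "ordinal":     {"==", "<"},                 # ranks only
    "categorical": {"=="},                      # e.g. Yankees vs. Braves: equality only
}

def supports(scale: str, op: str) -> bool:
    """Whether an operation is meaningful for data on the given scale."""
    return op in SCALES[scale]

print(supports("ratio", "/"))        # ratios are meaningful for lifespans
print(supports("categorical", "<"))  # team names can't be ranked
```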
05:02:28.700 | - And I should just briefly comment
05:02:30.180 | that life expectancy does have this quirk,
05:02:33.380 | a dark quirk that it, when you just crudely look at it,
05:02:38.500 | is incorporates child mortality,
05:02:41.620 | mortality at age of one or age of five,
05:02:45.020 | and maybe it's better and clearer to look at mortality
05:02:50.020 | after five or whatever.
05:02:54.340 | And that's still, those metrics still hold
05:02:56.540 | in interesting ways and measure the progress
05:02:58.500 | of human civilization in interesting ways.
05:03:00.620 | - That's right.
05:03:01.460 | You actually want longevity biomarkers.
05:03:02.940 | A lot of people are working on this.
05:03:04.580 | There's a book called "The Picture of Dorian Gray,"
05:03:06.580 | and the concept is sell your soul to, you know,
05:03:09.740 | ensure the picture, rather than he, will age and fade, right?
05:03:13.440 | And so the concept is that that thing on the wall
05:03:16.300 | just reflects his age and you can see it, okay?
05:03:18.660 | So there's a premise that's embedded
05:03:20.540 | in a lot of Western culture that
05:03:22.480 | to gain something you must lose.
05:03:24.000 | If you're Icarus and you try to fly,
05:03:26.540 | then you'll fly too high and it'll melt your wings.
05:03:30.780 | But guess what?
05:03:31.620 | We fly every day, commercial air flight, right?
05:03:33.420 | So the opposite of like the Icarus
05:03:36.140 | or "Picture of Dorian Gray" kind of thing
05:03:37.900 | is the movie "Limitless," which I love
05:03:40.060 | because it's so Nietzschean and so unusual
05:03:42.860 | relative to the dystopian sci-fi movies
05:03:45.260 | where there's a, without giving, right?
05:03:46.660 | I mean, the movie's kind of old now,
05:03:47.940 | but there's a drug in it that's a nootropic
05:03:51.140 | that boosts your cognitive abilities
05:03:53.380 | and it's got side effects, but at the end,
05:03:55.740 | he engineers out the side effects.
05:03:57.980 | Amazing, just like, you know,
05:03:59.020 | yeah, there were planes that crashed and we land, right?
05:04:00.980 | Okay, so why did I mention the "Picture of Dorian Gray?"
05:04:02.980 | Well, there's another aspect of it,
05:04:03.980 | which is longevity biomarker.
05:04:06.340 | The point is to kind of estimate
05:04:07.740 | how many years of life you have left
05:04:09.820 | by that Q.bio or Integrome,
05:04:14.140 | or you take all these analyses on somebody, right?
05:04:17.580 | One of the best longevity biomarkers
05:04:19.620 | could be just your face, right?
05:04:22.780 | You image the face and you can sort of tell,
05:04:25.340 | oh, somebody looks like they've aged,
05:04:26.700 | oh, someone looks younger, et cetera, et cetera.
05:04:29.140 | And this is actually data that you've got
05:04:31.740 | on millions and millions of people
05:04:33.580 | where you could probably start having AI predict,
05:04:37.820 | okay, what is somebody's life expectancy
05:04:40.100 | given their current face and other kinds of things, right?
05:04:42.500 | Because you have their name, you have their birth date,
05:04:44.820 | you have the, you know, date they passed away
05:04:46.660 | if they've already passed away, right?
05:04:48.860 | And you have photos of them through their life, right?
05:04:50.900 | So just imaging might give
05:04:52.940 | a reasonably good longevity biomarker,
05:04:55.500 | but then you can supplement that
05:04:56.740 | with a lot of other variables.
05:04:59.020 | And now you can start benchmarking every treatment
05:05:02.060 | by its change in how much time you have left.
05:05:06.020 | If that treatment, that intervention
05:05:08.100 | boosts your estimated life expectancy by five years,
05:05:14.220 | you can see that in the data.
05:05:16.860 | You can get feedback on whether your longevity
05:05:20.180 | is being boosted or not, okay?
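[Editorial sketch of the benchmarking idea just described. `predict_remaining_years` is a hypothetical stand-in for any longevity-biomarker model, face imaging, blood panels, and so on; it is an assumption for illustration, not a real API.]

```python
from typing import Callable, Dict

def treatment_effect(
    predict_remaining_years: Callable[[Dict], float],
    before: Dict,
    after: Dict,
) -> float:
    """Score an intervention by the change in model-predicted remaining years."""
    return predict_remaining_years(after) - predict_remaining_years(before)

# Toy stand-in model (hypothetical): remaining years shrink as estimated
# "biological age" rises.
toy_model = lambda features: 90.0 - features["biological_age"]

# An intervention that drops estimated biological age from 60 to 55 would
# show up as a gain of five predicted years.
gain = treatment_effect(toy_model,
                        {"biological_age": 60.0},
                        {"biological_age": 55.0})
print(gain)  # 5.0
```

Any real version would hinge on the quality of the biomarker model; the point of the sketch is only the feedback loop, benchmarking each treatment by its delta in predicted time left.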
05:05:22.820 | And so what this does,
05:05:24.180 | it just fundamentally changes the assumptions in the system.
05:05:26.660 | Now, with that said, you know,
05:05:29.100 | life extension may be the kind of thing,
05:05:31.220 | I'm not sure if it'll work for our generation.
05:05:33.380 | We may be too late.
05:05:34.580 | It may work for the next generation.
05:05:35.940 | - Wouldn't that make you sad?
05:05:37.740 | - Well, I've got something.
05:05:38.860 | - To the last generation.
05:05:40.420 | - Could be, but I've got something for you,
05:05:42.620 | which is, I call it genomic reincarnation, okay?
05:05:46.380 | This one you probably haven't heard before.
05:05:47.540 | I've tweeted about it, okay?
05:05:49.860 | - By the way, good time to mention that your Twitter
05:05:53.700 | is one of the greatest Twitters of all time,
05:05:55.460 | so people should follow you.
05:05:56.380 | - Well, Lex Friedman has one of the greatest podcasts
05:05:58.500 | of all time.
05:05:59.340 | You guys should listen to the Lex Friedman podcast,
05:06:01.460 | which you may be doing, right?
05:06:03.180 | - Which you may be doing right now.
05:06:04.620 | - Yes. - Yeah.
05:06:05.860 | - Well, thank you.
05:06:06.700 | - What was the term again?
05:06:07.980 | Sorry, genomic--
05:06:09.660 | - Sorry, I call it genomic, not resurrection,
05:06:11.900 | but genomic reincarnation, okay?
05:06:13.420 | So here's the concept.
05:06:15.180 | You may be aware that you can synthesize strands of DNA,
05:06:18.900 | okay?
05:06:19.740 | There's sequencing of DNA, which is reading it,
05:06:22.180 | and synthesizing DNA, which is creating strands of DNA.
05:06:25.360 | What's interesting is you can actually also do that
05:06:28.660 | at the full chromosome level for bacterial chromosomes.
05:06:33.220 | Remember that thing I was saying earlier
05:06:34.340 | about the minimum life form that Craig Venter made?
05:06:37.500 | So people have synthesized entire bacterial chromosomes,
05:06:41.460 | and they work.
05:06:42.300 | Like, they can literally essentially print out
05:06:44.860 | a living organism, all right?
05:06:46.860 | Now, when you go from bacteria to eukaryotes,
05:06:51.500 | which are the kingdom of life that we're a part of, right?
05:06:56.500 | Yeast are part of this kingdom and so on.
05:06:58.060 | It becomes harder because the chromosomes
05:06:59.520 | are more complicated.
05:07:00.840 | But folks are working on eukaryotic chromosome synthesis.
05:07:04.380 | And if you spot me that sci-fi assumption
05:07:06.980 | that eventually we'll be able to take your genome sequence,
05:07:10.620 | and just like we can synthesize a bacterial chromosome,
05:07:14.180 | we can synthesize not just one eukaryotic chromosome,
05:07:16.820 | but your entire complement of chromosomes in the lab,
05:07:18.980 | right, 'cause you have 23 pairs, 46 total, whatever,
05:07:22.100 | you take the pairs.
05:07:23.220 | What you can do is potentially print somebody out from disk,
05:07:27.600 | reincarnate them, insofar as your sequence determines you.
05:07:34.140 | And you can argue with this 'cause there's epigenetics
05:07:36.260 | and other stuff, okay?
05:07:37.460 | But let's just say at a first order,
05:07:39.020 | your DNA sequence is Lex.
05:07:41.380 | You can sequence that, okay?
05:07:43.100 | You can do full genome sequencing and log that to a file.
05:07:47.140 | Then here's the karma part.
05:07:50.380 | Your crypto community, where you've built up
05:07:53.540 | enough karma among them, if when you die,
05:07:57.420 | your karma balance is high enough,
05:08:00.500 | they will spend the money to reincarnate the next Lex,
05:08:04.940 | who can then watch everything that happened
05:08:08.380 | in your past life, and you can tell them something.
05:08:12.180 | - Mm-hmm.
05:08:13.580 | - Everything I described there, I mean,
05:08:14.660 | if you spot me eukaryotic chromosome synthesis,
05:08:18.660 | that's the only part that isn't possible yet, right?
05:08:23.580 | Folks are working on it, I'm sure someone will mention it.
05:08:25.380 | - Right, it's essentially a clone.
05:08:27.540 | - It's like a clone, right?
05:08:29.360 | But it is you in a different time.
05:08:34.360 | - You're in a different time,
05:08:35.200 | but you don't unfortunately have the memories.
05:08:37.980 | Well, you could probably watch the digest of your life,
05:08:42.980 | and it would be pretty interesting, right?
05:08:45.180 | I mean-- - Yeah, that's actually
05:08:47.220 | a process for psychology to study.
05:08:50.180 | If you create a blank mind, what would you need
05:08:53.860 | to show that mind to align it very well
05:08:57.980 | with the experiences, with the fundamental experiences
05:09:02.260 | that define the original version,
05:09:04.900 | such that the resulting clone would have
05:09:08.020 | similar behavior patterns, worldviews, perspectives,
05:09:10.900 | feelings, all those kinds of things.
05:09:14.900 | - Potentially, right?
05:09:16.300 | - Including, sadly, enough traumas and all that.
05:09:18.540 | - Or what have you, right?
05:09:19.500 | But basically, just in a very simple version of it,
05:09:22.380 | by the time one is age 20 or 30,
05:09:26.260 | or sometime through your 20s,
05:09:27.580 | you'll learn your own personal operating system.
05:09:30.100 | You'll be like, "Oh, alcohol really doesn't agree with me,"
05:09:33.500 | or something like that.
05:09:34.340 | Just by trial and error, things that are idiosyncratic
05:09:37.700 | to your own physiology, like, "Oh, I'm totally wrecked
05:09:41.460 | "if I get seven hours of sleep versus nine hours,"
05:09:44.380 | or whatever it is, right?
05:09:45.300 | People will have different kinds of things like this.
05:09:47.980 | That manual can be given to your next self,
05:09:51.100 | so you can go, "Don't do this, do this,
05:09:52.420 | "don't do this, do this," right?
05:09:53.740 | To some extent, personal genomics
05:09:54.820 | already gives you some of this,
05:09:56.060 | where you're like, "Oh, I'm a slow caffeine
05:09:57.860 | "metabolizer, oh, that explains X or Y,"
05:10:01.260 | or, "I have a weird version of alcohol dehydrogenase,
05:10:04.140 | "oh, okay, that explains my alcohol tolerance."
05:10:06.900 | So this is part of the broader category
05:10:08.820 | of what I call practical miracles, right?
05:10:10.660 | So it's longevity, it's genomic reincarnation,
05:10:14.280 | it is restoring sight, and it is curing deafness
05:10:17.900 | with the artificial eyes and artificial ears.
05:10:22.900 | It is the super soldier serum, did I show you that?
05:11:26.800 | So like, myostatin null, I tweeted about this,
05:10:29.740 | basically, X-Men are real.
05:11:32.100 | So here is a study from NEJM from several years ago, okay?
05:10:37.100 | What is this, this is like the mid-2000s.
05:10:43.220 | This was in 2004, okay?
05:10:46.380 | So it's now 17 years later, it's probably,
05:10:48.620 | this is almost certainly a teenager by now.
05:10:51.260 | So this kid basically was just totally built.
05:10:56.260 | - Yeah. - Okay?
05:10:58.340 | Extraordinarily muscular. - Like, very muscular,
05:11:00.620 | at a very young age. - Yes.
05:11:03.740 | So the child's birth weight was in the 73rd percentile,
05:11:05.940 | he appeared extraordinarily muscular,
05:11:07.300 | protruding muscles in his thighs,
05:11:08.540 | motor and mental development has been normal.
05:11:10.380 | Now at four and a half years of age,
05:11:11.500 | he continues to have increased muscle bulk and strength.
05:11:14.100 | And so, essentially, Myostatin mutation
05:11:17.100 | associated with gross muscle hypertrophy in a child.
05:11:19.340 | So this is like real life X-Men, okay?
05:11:22.900 | And-- - And there's pictures
05:11:24.900 | of animals. - Yes.
05:11:27.180 | So there's a company called Variant Bio
05:11:29.020 | that is looking at people
05:11:30.500 | who have exceptional health-related traits,
05:11:33.020 | and is looking for essentially this kind of thing,
05:11:35.180 | but maybe more disease or whatever related, right?
05:11:38.380 | For example, people who have natural immunity to COVID.
05:11:40.500 | Understanding how that works,
05:11:41.500 | perhaps we can give other people
05:11:42.740 | artificial immunity to COVID, right?
05:11:45.220 | If you scroll up, you see my kind of tweet,
05:11:47.340 | "Super soldier serum is real,"
05:11:49.020 | where it's like wild type mouse and a Myostatin null.
05:11:52.060 | And look at the chest on that thing.
05:11:54.540 | You see the before and after.
05:11:56.100 | - Wow. - Okay?
05:11:57.900 | This is what's possible.
05:12:00.180 | This could be us, but you're regulating.
05:12:02.740 | You know, right?
05:12:03.580 | You know the meme, "This could be us, but you playin'."
05:12:04.780 | This could be us, but the FDA's regulating, right?
05:12:07.380 | All this-- (Lex laughing)
05:12:09.580 | Okay? - Oh, yeah.
05:12:11.220 | On steroids.
05:12:12.820 | - But it's not, that's the thing is--
05:12:13.660 | - But it's not steroids.
05:12:15.300 | - Well, that's the thing is people,
05:12:17.860 | again, you go back to the Icarus thing.
05:12:19.540 | They think, "Oh, steroids.
05:12:21.340 | "Well, that's definitely gonna give you cancer,
05:12:23.180 | "screw up your hormones," et cetera, et cetera.
05:12:25.580 | And it could, but you know what?
05:12:27.820 | Like, have we actually put in that much effort
05:12:30.380 | into figuring out like the right way
05:12:33.340 | of doing testosterone supplementation
05:12:35.140 | or the right way of doing this?
05:12:36.820 | Obviously, we've managed to put a lot of effort
05:12:40.020 | into marijuana, increasing the potency of it
05:12:42.740 | or what have you.
05:12:43.580 | Could we put the effort into these kinds of drugs, right?
05:12:46.780 | Or these kinds of compounds?
05:12:48.380 | Maybe.
05:12:49.700 | I think that would actually be a really good thing.
05:12:51.540 | The thing about this is I feel this is just
05:12:53.300 | a massively underexplored area
05:12:56.220 | rather than people drinking caffeine all the time.
05:12:58.500 | That's a very mild enhancing drug, okay?
05:13:02.060 | Nicotine is also arguably kind of like that.
05:13:04.220 | Some people have it even without the cigarettes, right?
05:13:06.920 | Why can't we research this stuff?
05:13:09.300 | One way of thinking about it is Lance Armstrong,
05:13:11.580 | the cyclist, yes, he violated all the rules.
05:13:16.580 | You know, he shouldn't have won the Tour de France
05:13:18.620 | or anything like that.
05:13:20.140 | But his chemists, and I say this somewhat tongue in cheek,
05:13:22.900 | but also his chemists are candidates
05:13:25.460 | for like the Nobel Prize in chemistry
05:13:27.780 | because they brought a man back from like testicular cancer
05:13:31.380 | to like winning Tour de France against a bunch of guys
05:13:34.660 | who probably, you know, a bunch of them
05:13:35.860 | were also juiced or whatever, right?
05:13:37.660 | Whatever was done there,
05:13:40.180 | take it out of the competition framework.
05:13:42.020 | There's a lot of testicular cancer patients
05:13:43.900 | or cancer patients period who would want some of that.
05:13:46.660 | And we should take that seriously.
05:13:50.100 | We should take that pursuit really, really, really seriously.
05:13:52.740 | Yes, except again, just like the Theranos stuff,
05:13:55.340 | all this is pathologized.
05:13:56.300 | Oh, it's a BALCO scandal.
05:13:57.660 | Oh, my God, you know?
05:13:59.340 | And yes, of course,
05:14:00.420 | within the context of that game, they're cheating.
05:14:02.780 | In the context of life, you want to be cheating death.
05:14:06.620 | - Yes. - Right?
05:14:07.660 | So it's just a kind of a reframe on what is good, right?
05:14:12.460 | And it is just taking away these assumptions
05:14:15.060 | that mild inflation is good or mild death is good
05:14:18.380 | and going towards transcendence.
05:14:20.180 | So that gets me done with the giant FDA,
05:14:24.260 | biomedical, et cetera, et cetera.
05:14:25.620 | - Longevity, yeah.
05:14:26.460 | - Okay. - That is beautifully,
05:14:27.780 | beautifully done.
05:14:28.620 | - You have two questions.
05:14:29.980 | One was on Trump and deplatforming
05:14:31.540 | and the other was on crypto and the state of crypto
05:14:33.300 | and the third is on India.
05:14:34.180 | Which one should we do?
05:14:35.580 | - All right, since we talked about how to fix government,
05:14:38.780 | we talked about how to fix health, medicine, FDA, longevity,
05:14:43.780 | let us briefly talk about how to fix social media, perhaps.
05:14:50.820 | - Sure, very important. - Since we kind of talked
05:14:53.420 | about it from different directions,
05:14:54.820 | but it'd be nice to just look at social media.
05:14:57.100 | And if we can perhaps first, as an example,
05:15:00.580 | maybe it's not a useful example,
05:15:02.060 | but to me it was one that kind of shook me a little bit,
05:15:05.120 | is the removal of Trump and since then other major figures,
05:15:10.980 | but Donald Trump was probably the biggest person ever
05:15:15.160 | to be removed from social media.
05:15:16.900 | Do you understand why that was done?
05:15:19.940 | Can you steel man the case for it and against it?
05:15:22.980 | And if there's something broken about that,
05:15:26.140 | how do we fix it?
05:15:27.220 | - Steel man the case for is kind of obvious
05:15:29.780 | in the sense of you are seeing a would-be dictator
05:15:34.220 | who is trying to run a coup against democracy,
05:15:36.660 | who has his supporters go and storming
05:15:40.580 | the seat of government, who could use his app
05:15:44.940 | to whip up his followers across the country
05:15:47.300 | to reject the will of the people.
05:15:50.920 | And so you're an executive and you'll take actions
05:15:55.180 | that while perhaps controversial are still within the law
05:15:58.780 | and you'll make sure that you do your part
05:16:01.180 | to defend democracy by making sure
05:16:03.860 | that at least this guy's megaphone is taken away
05:16:06.220 | and that his supporters cannot organize more riots.
05:16:08.980 | That's basically the case for the deplatforming.
05:16:12.900 | Okay, would you agree with that?
05:16:15.100 | That's fair.
05:16:15.940 | - So it's like really steel manning it.
05:16:18.460 | - You asked to steel man, so I'm giving the for case, yeah.
05:16:20.340 | - Well, I guess I would like to separate
05:16:23.260 | the would-be dictator.
05:16:26.860 | Oh, I guess if you're storming the Capitol,
05:16:30.800 | you are a dictator, I see, I see.
05:16:32.960 | So those are two are interlinked, right?
05:16:37.680 | You have to have somehow a personal judgment of the person.
05:16:41.980 | - Bad enough to be worth this significant step.
05:16:46.140 | - Yeah, it's not just their actions or words
05:16:48.460 | in a particular situation, but broadly,
05:16:50.780 | this person is dangerous. - The context of everything
05:16:52.140 | that led up to this moment and so on, right?
05:16:54.500 | So that's the for case, right?
05:16:56.520 | Now, the against case.
05:16:57.540 | There's actually several against cases, right?
05:16:59.380 | There's obviously the Trump supporters against case.
05:17:02.600 | There is the sort of the libertarian/left libertarian
05:17:07.600 | against case, and there is the rest of world against case.
05:17:15.260 | There's actually three, 'cause it's not just two factions,
05:17:17.980 | there's multiple, right?
05:17:19.140 | So what is the Trump supporter against case?
05:17:21.140 | There's an article called "The Secret History of the Shadow
05:17:23.700 | Campaign That Saved the 2020 Election," right?
05:17:26.100 | Which came out a few weeks after the inauguration,
05:17:30.680 | like February 4th, 2021.
05:17:32.460 | And essentially, the Trump supporter would read this
05:17:35.420 | as basically saying, in the name of defending democracy,
05:17:39.980 | they corrupted democracy,
05:17:41.780 | whether it was actually vote counts
05:17:44.300 | or just changes of all the rules
05:17:45.540 | for mail-in ballots and stuff,
05:17:47.200 | there were regular meetings between the Chamber of Commerce
05:17:49.540 | and AFL and the unions.
05:17:52.740 | In particular, they admit that the BLM riots
05:17:55.260 | of mid-2020 were actually on a string
05:17:58.740 | and they could say, "Stand down," right?
05:18:00.420 | So that's actually, that's a quote from this article
05:18:03.300 | where it's like, "The word went out: 'Stand down.'
05:18:06.500 | Protect the Results announced that it would not be activating
05:18:08.740 | the entire national mobilization network today,
05:18:10.740 | but remains ready to activate if necessary.
05:18:12.860 | Podhorzer credits the activists for their restraint."
05:18:15.640 | So basically, "The activists re-oriented
05:18:18.020 | the Protect the Results protest
05:18:19.340 | towards a weekend of celebration."
05:18:21.360 | So point being that the fact that,
05:18:24.940 | the Trump supporter would say,
05:18:26.380 | the fact that they could tell them to stand down
05:18:28.980 | meant that the previous unrest was in part coordinated.
05:18:33.980 | And so they'd say, "Okay, so that makes it illegitimate
05:18:37.060 | in a different way," right?
05:18:37.940 | Plus, well, there was one riot on Jan 6th
05:18:40.780 | versus the attacks on the White House and stuff,
05:18:43.060 | there was an attempted storming of the White House in mid-2020.
05:18:45.020 | They didn't actually storm the White House,
05:18:46.740 | but they were setting fires outside
05:18:48.160 | and there's quite a lot of stuff, right?
05:18:50.260 | So the second against case is the,
05:18:52.820 | let's say libertarian/left libertarian who'd say,
05:18:55.960 | "Do we really want giant corporations,
05:19:00.900 | regardless of what you think about Trump,
05:19:02.620 | and you don't have to be a Trump supporter,
05:19:04.020 | do you really want giant corporations to be determining
05:19:06.500 | who can say what on the internet?
05:19:08.460 | And if they can de-platform a sitting president
05:19:11.620 | and the quote, "most powerful man in the world,"
05:19:14.460 | he's not the most powerful man in the world.
05:19:16.460 | In fact, the quote, "people" are electing a figurehead,
05:19:21.220 | and actually it's the heads of network
05:19:22.980 | that are more powerful than the heads of state, right?
05:19:26.060 | That the fact that the CEOs of Facebook and Twitter
05:19:30.060 | and Google and Apple and Amazon
05:19:32.500 | all made those decisions at the same time
05:19:35.140 | to not just de-platform Trump from Twitter,
05:19:37.340 | which literally billions of people around the world saw,
05:19:39.820 | but also censor or stop him on Facebook,
05:19:42.540 | and to have Google and Apple pull Parler out of the App Store
05:19:45.740 | and Amazon shut down the backend,
05:19:47.620 | that would be corporate collusion by any other name.
05:19:50.460 | It's actually very similar
05:19:51.380 | to the so-called business plot against FDR.
05:19:54.260 | FDR was a complicated figure
05:19:59.100 | who can in some ways best be thought of
05:20:01.380 | as the least bad communist dictator
05:20:04.860 | or socialist dictator of the 20th century.
05:20:07.620 | Because he nationalized the economy,
05:20:09.760 | repealed the 10th Amendment, right?
05:20:11.700 | Tried to pack the courts.
05:20:13.100 | He sicced the government on all of his enemies
05:20:16.620 | from Huey Long to Andrew Mellon.
05:20:19.060 | Obviously he interned the Japanese,
05:20:20.620 | which shows that he wasn't really totally a good guy, right?
05:20:23.180 | We don't usually think about the same guy
05:20:24.540 | who did this, did that.
05:20:25.960 | Earlier in his life, most people don't know this one,
05:20:28.660 | he led a whole Navy thing to entrap gay sailors.
05:20:33.340 | And did you know about this one?
05:20:34.500 | - No. - Yeah.
05:20:35.340 | Google FDR entrapment of gay sailors.
05:20:36.860 | Basically he got young men to try to find folks
05:20:41.340 | within the Navy who were gay
05:20:42.940 | and then basically entrap them
05:20:46.060 | so that they could be prosecuted and what have you, right?
05:20:49.100 | FDR did a lot of stuff,
05:20:50.980 | but fundamentally nationalized the economy
05:20:53.500 | and set up the alphabet soup,
05:20:55.500 | is what they called it at the time.
05:20:56.640 | And that's like all these agencies or whatever.
05:20:58.760 | And in some sense, he's like continuous,
05:21:01.820 | there'd been a rising trend of centralization.
05:21:05.020 | Woodrow Wilson, obviously centralized,
05:21:07.060 | Lincoln centralized, right?
05:21:08.740 | Even actually 1789 was a degree of centralization
05:21:12.020 | over the looser thing that was 1776 to 1789.
05:21:16.340 | So he was on that trend line,
05:21:17.300 | but he was definitely a huge kind of dog leg up.
05:21:20.140 | So the thing is that because of all the lawsuits
05:21:22.460 | that were flying, many folks, like Amity Shlaes,
05:21:26.500 | who has written a book, "The Forgotten Man."
05:21:28.460 | And essentially her thesis
05:21:29.620 | and thesis of many others at that time,
05:21:31.020 | like John T. Flynn, who's this journalist
05:21:32.980 | who was pro FDR and then was against,
05:21:35.360 | was that FDR made the Great Depression great.
05:21:37.980 | Okay, that it wouldn't have been such a bad thing
05:21:40.020 | without him mucking up the entire economy
05:21:42.500 | and giving it a sickness.
05:21:43.580 | It would have recovered quickly without that, right?
05:21:45.740 | This is a counterfactual,
05:21:46.740 | which people just argue about it
05:21:48.500 | really angrily back and forth.
05:21:49.940 | And you can't actually run the experiment
05:21:52.060 | unless you could fork the economy, right?
05:21:54.140 | Just like, were the bailouts good or bad?
05:21:55.580 | I think they were bad, but how could I prove it?
05:21:57.900 | I'd need to actually be able to fork the economy.
05:21:59.500 | Crypto actually allows you in theory to do that.
05:22:01.660 | Like where folks could actually shift balances.
05:22:03.420 | This is a whole separate thing
05:22:04.260 | where you can actually start to make macroeconomics
05:22:06.140 | into more of an experimental science
05:22:07.580 | rather than simply arguing from authority.
05:22:09.500 | You could argue from experiment.
05:22:11.500 | Some of the virtual economy stuff
05:22:12.700 | that Edward Castronova has done is relevant to this.
05:22:15.220 | We can talk about that.
05:22:16.460 | Point is though with FDR,
05:22:17.860 | there's this thing because he had
05:22:19.820 | breached such a war on private industry at that time
05:22:23.140 | and justified it with this narrative,
05:22:24.540 | quote, bold, persistent experimentation.
05:22:26.140 | There was something called the quote, business plot
05:22:27.900 | where all of these captains of industry
05:22:29.900 | that he'd been beating up.
05:22:31.060 | And again, Teddy Roosevelt had also been doing this
05:22:33.700 | with the trust buster.
05:22:36.400 | The journos at the time, Ida Tarbell had gone
05:22:38.700 | and basically ran all these articles on Rockefeller
05:22:41.500 | and knocked them down, Woodrow Wilson and Skid Row.
05:22:43.660 | But FDR, the CEOs were thinking,
05:22:46.500 | "Oh, bad, this is so terrible."
05:22:48.940 | There's a so-called business plot
05:22:50.940 | to try to take over the government
05:22:52.780 | and stop FDR from pushing the country
05:22:56.100 | in what they thought was a bad direction.
05:22:57.620 | Smedley Butler was a general that they recruited
05:23:00.220 | to try to help them with this,
05:23:01.940 | but he turned on them and he went
05:23:03.980 | and kind of broke the whole thing open
05:23:05.660 | and told Congress and so on.
05:23:07.460 | And so all these guys, the whole plot was broken up.
05:23:12.500 | Now, one way of thinking about today
05:23:14.640 | or the whole aftermath of Jan 6th
05:23:17.200 | is it's a business plot, but in reverse
05:23:18.940 | because the generals and the CEOs both were against Trump
05:23:23.100 | and actually the business plot happened
05:23:25.600 | and now all the CEOs just,
05:23:27.300 | they pushed all the buttons that they needed to.
05:23:30.860 | And now the network was primary over the state.
05:23:34.360 | Now, why is that an interesting way of looking at it?
05:23:37.080 | Because one thing I have in the book
05:23:39.640 | is you can kind of think of 1950
05:23:42.240 | as like the US's peak centralization.
05:23:44.900 | You go forward and backward in time, things decentralize.
05:23:48.600 | And you start getting mirror image events
05:23:51.480 | that happen with the opposite outcome.
05:23:52.920 | For example, 1890, the frontier closes.
05:23:56.200 | 1991, the internet frontier opens.
05:23:58.000 | Internet becomes open for commerce.
05:24:00.160 | You go backwards in time, you have the Spanish flu,
05:24:02.240 | forwards in time, COVID-19.
05:24:03.860 | Backwards in time, you have the captains of industry,
05:24:08.900 | the robber barons.
05:24:10.300 | Forwards in time, you have the tech billionaires.
05:24:12.400 | And there's so many examples of this.
05:24:14.720 | Like another one is backwards in time,
05:24:17.860 | the New York Times is allying with Soviet Russia
05:24:20.440 | to choke out Ukraine.
05:24:21.840 | Now today, they have reinvented themselves
05:24:24.560 | as cheerleaders for Ukraine against nationalist Russia.
05:24:28.060 | And of course, I think you could absolutely support Ukraine
05:24:30.920 | on other measures, but it's pretty hypocritical
05:24:32.520 | for the guys who profited from the Holodomor.
05:24:35.400 | The Ochs-Sulzberger family literally profited
05:24:37.280 | from denying the Holodomor to now make themselves cheerleaders
05:24:40.360 | for Ukraine.
05:24:41.200 | It's actually this insane thing, which we can talk about.
05:24:42.840 | - A tiny tangent on that.
05:24:44.540 | You put it brilliantly.
05:24:46.120 | And a reminder for anyone who listens to me
05:24:49.480 | talk about Ukraine, it is possible to have empathy
05:24:53.140 | for a nation and not be part of the machine
05:24:56.620 | that generates a mainstream narrative.
05:24:58.600 | - Yes, that's right.
05:24:59.600 | Like basically, I was actually one of the first
05:25:02.960 | of three Estonian EU residents, okay?
05:25:06.280 | And I completely understand why Estonia and the Baltics
05:25:09.480 | and all these countries, including Ukraine,
05:25:11.300 | that just recently within living memory
05:25:12.880 | got their independence from the Soviet empire
05:25:15.180 | would not wanna be forcibly reintegrated
05:25:18.320 | into a place that they just escaped from.
05:25:20.660 | And so that is something which is sort of outside
05:25:23.920 | the American left, right, tired kind of thing,
05:25:27.660 | where when you understand it from that point of view,
05:25:29.940 | then there's like a fourth point of view,
05:25:31.380 | which is like India's point of view,
05:25:33.260 | or like much of the developing world,
05:25:35.300 | or what I call parts of it are ascending,
05:25:38.520 | parts are descending, whatever.
05:25:39.720 | But much of the rest of the world
05:25:41.100 | outside of that border region says,
05:25:43.460 | "Look, we're sympathetic to the Ukrainians,
05:25:45.940 | "but we can't allow our people to starve.
05:25:48.340 | "So we're gonna maintain trade."
05:25:50.260 | And guess what, actually, we've got a lot of wars
05:25:52.140 | in our neck of the woods and human rights crises
05:25:54.300 | that Europe just didn't even care about.
05:25:56.020 | So it can't be that Europe's problems
05:25:57.380 | are the world's problems,
05:25:58.220 | but the world's problems are not Europe's problems, right?
05:26:00.020 | So that's like a fourth point of view.
05:26:01.180 | And a fifth point of view is China, which is like,
05:26:03.180 | guess what, we're gonna be the Iran of the Iraq war,
05:26:06.060 | where like who won the Iraq war?
05:26:07.500 | Iran, arguably, because they extended their influence
05:26:09.740 | into Iraq, right?
05:26:10.740 | So China's like, guess what, we're gonna turn Russia
05:26:12.260 | into our gas station and build a pipeline.
05:26:14.500 | They're building, there's Power of Siberia,
05:26:15.420 | that's the name of the Eastern Russia pipeline,
05:26:18.600 | just like Nord Stream is,
05:26:19.980 | Nord Stream One and Nord Stream Two.
05:26:21.620 | I think they're building a new pipeline through Mongolia.
05:26:24.700 | So Xi Jinping and Putin and the Mongolian head of state
05:26:28.060 | were all photographed kind of thumbs upping this pipeline.
05:26:30.420 | We'll see if it goes through,
05:26:31.940 | but it's ironic that Russia wanted to make Ukraine
05:26:36.940 | their colony, but the outcome of this war
05:26:40.100 | may be that Russia becomes China's colony.
05:26:42.460 | So that's at least like five different perspectives, right?
05:26:45.380 | There's like the US establishment perspective,
05:26:47.860 | there's the Tucker-MAGA perspective,
05:26:51.260 | there's the Baltics and Ukrainian perspective,
05:26:54.100 | there's like the Indian and like poor countries perspective
05:26:56.260 | and then there's the Chinese perspective
05:26:57.100 | and then of course there's the Russians, right?
05:26:58.540 | So just with respect to that, by the way,
05:27:02.340 | that's another example of history happening in reverse.
05:27:04.980 | This is the Sino-Soviet partnership,
05:27:07.660 | except this time, China's the senior partner
05:27:10.060 | and Russia's the junior partner,
05:27:11.580 | and this time they're both nationalists rather than communist
05:27:13.820 | and there's so many flips like this
05:27:16.260 | and I'm gonna list a few more actually
05:27:19.600 | because there's so, so, so many of them.
05:27:21.820 | - Do you have an explanation why that happens?
05:27:23.540 | - Yes, let me just list a few of them.
05:27:25.980 | This is in the "The Network State" book,
05:27:27.980 | it's in the chapter called
05:27:29.140 | "Fragmentation, Frontier, Fourth Turning,
05:27:31.740 | Future Is Our Past," right?
05:27:33.460 | So I give this example of like a fluid unmixing,
05:27:38.460 | all right, just watch this for a second, all right?
05:27:41.300 | - This is from "Smarter Every Day,"
05:27:43.540 | "Unmixing Color Machine, Ultra Laminar Reversible Flow,
05:27:47.380 | Smarter Every Day, 217."
05:27:50.100 | - And so you can mix something
05:27:52.700 | and then like this thing that you don't think of
05:27:54.460 | as reversible, you can unmix it,
05:27:56.980 | which is insane, right, that it works.
05:27:59.300 | Okay, the physics of that situation, it just works, right?
05:28:02.100 | - So for people just listening,
05:28:04.300 | that there is whatever the mixture this is,
05:28:07.140 | this is ultra-laminar reversible flow,
05:28:09.580 | so this probably has something to do with the material.
05:28:12.480 | We're used to mixing not being a reversible process.
05:28:17.980 | - Exactly. - And that's what that shows
05:28:19.380 | and then he then reverses the mixing
05:28:22.460 | and is able to do it perfectly.
05:28:24.140 | - That's right, so that's like the Future Is Our Past thesis.
05:28:27.300 | - It shows that free will is an illusion, just kidding.
05:28:29.780 | - Well, basically there's some environments
05:28:33.020 | where the equations are like time symmetrical, right?
05:28:36.160 | And this is one model, sort of,
05:28:39.340 | it's just an interesting visual model
05:28:40.460 | for what's happening in the world
05:28:42.060 | as we re-decentralize after the centralized century, right?
05:28:45.780 | So basically, I mentioned the internet frontier reopens;
05:28:50.180 | back then, the Western frontier closed.
05:28:51.900 | today we experienced COVID-19,
05:28:53.220 | back then we experienced the Spanish flu,
05:28:55.140 | today, tech billionaires; back then, the captains of industry, right?
05:28:56.940 | Today, founders like Elan and Dorsey
05:28:59.500 | are starting to win against establishment journalists.
05:29:02.740 | Back then, Ida Tarbell demagogued and defeated Rockefeller.
05:29:06.000 | I think net-net founders win this time versus the journos.
05:29:09.580 | Back then, the journos won over the founders, okay?
05:29:12.660 | Today we have cryptocurrencies,
05:29:13.900 | back then we had private banking.
05:29:15.620 | Today, this is an amazing one,
05:29:17.340 | we have a populist movement of digital gold advocates.
05:29:21.440 | that is, Bitcoin maximalists and so on,
05:29:23.580 | where gold has become populist
05:29:25.420 | because it's against the printing of money and so on and so forth.
05:29:28.220 | Back then, we had a populist movement against gold
05:29:31.640 | in the form of William Jennings Bryan
05:29:32.940 | and the Cross of Gold speech.
05:29:34.340 | Gold was considered a tool of big business.
05:29:36.660 | Now, gold is the tool against big business
05:29:39.220 | and big government, right?
05:29:40.980 | - Digital gold, yeah.
05:29:41.900 | - Digital gold, right?
05:29:43.580 | Today we have the inflation and cultural conflict
05:29:45.460 | of Weimar-like America.
05:29:47.380 | Back then, we had the inflation
05:29:48.380 | and cultural conflict of Weimar Germany.
05:29:50.500 | Today, in Weimar America,
05:29:51.620 | we have right and left fighting in the streets,
05:29:53.540 | same, unfortunately, in Weimar Germany.
05:29:55.720 | Peter Turchin has written about,
05:29:57.060 | today we have what Turchin considers antebellum-like
05:29:59.540 | polarization, like pre-war polarization.
05:30:01.660 | Back then, if you go further back in time,
05:30:03.720 | we had what we now know to be antebellum polarization, right?
05:30:06.260 | Today we have Airbnb, back then we had flophouses.
05:30:08.380 | Today we have Uber, back then we had gypsy cabs.
05:30:10.780 | So today we see the transition
05:30:12.900 | from neutral to yellow journalism.
05:30:14.420 | Back then, we saw the transition
05:30:15.500 | from yellow to, quote, neutral journalism, right?
05:30:17.900 | And today, figures like Mike Moritz,
05:30:21.180 | he wrote about China being energetic and America laconic.
05:30:24.180 | But back then, Bertrand Russell
05:30:25.780 | actually wrote this whole long book.
05:30:27.500 | Actually, the mathematician, Bertrand Russell, right?
05:30:29.660 | Wrote this whole long book,
05:30:30.940 | which I didn't even realize he wrote
05:30:31.900 | about these kinds of topics,
05:30:34.100 | called "The Problem of China."
05:30:36.220 | And one of his observations was,
05:30:38.340 | again, I'm not saying this is,
05:30:39.860 | I'm just saying he made this observation.
05:30:41.620 | He was saying that America was energetic
05:30:44.100 | and China laconic at the time,
05:30:45.420 | 'cause everybody was in opium dens and so on and so forth.
05:30:49.180 | More examples, the one I just mentioned,
05:30:50.860 | where the Chinese and Russians are again
05:30:52.620 | lining up against the West,
05:30:53.700 | except this time the Chinese are the senior partner
05:30:55.380 | in the relationship rather than the junior partner.
05:30:57.460 | Today, I think in the second Cold War,
05:30:59.540 | there will also be a third world,
05:31:01.220 | but this time I think that third world might come in first,
05:31:04.020 | because it's not the non-aligned movement,
05:31:05.780 | it's the aligned movement around Web3 protocols.
05:31:08.260 | - That's fascinating, yeah.
05:31:10.900 | That's where Nick comes in.
05:31:12.220 | By the way, something we haven't mentioned,
05:31:13.900 | Africa, that there could be very interesting things
05:31:16.100 | in Africa as well.
05:31:17.020 | - Nigeria actually has its first tech unicorn,
05:31:20.660 | and I'm investing there.
05:31:22.620 | And I think it's one of these things where China's risen,
05:31:26.140 | India's like about 10 years behind China,
05:31:29.060 | but I think this is the Indian decade in many ways.
05:31:31.780 | We can come back to that point.
05:31:33.580 | But there's absolutely sparks of light in Africa.
05:31:35.940 | I mean, it's a huge continent.
05:31:37.220 | - It feels like the more behind, sorry to interrupt,
05:31:39.460 | the more behind you are,
05:31:41.780 | the more opportunity you have to leapfrog.
05:31:43.860 | - Sometimes, that's right.
05:31:44.700 | And M-Pesa is a classic example
05:31:46.780 | where they did this in East Africa,
05:31:48.700 | but I think there's more possibility there.
05:31:51.120 | - So what is the fact that there's a kind of symmetry?
05:31:57.380 | - There's a kind of symmetry, right?
05:32:00.060 | - What is that, how did that take us from Trump?
05:32:03.740 | - Trump, okay, yeah.
05:32:06.580 | - The different perspective you took,
05:32:07.980 | the libertarian perspective of it doesn't really matter.
05:32:12.420 | - Yeah, because the libertarian perspective,
05:32:14.020 | or the left libertarian perspective would say,
05:32:17.180 | is it really a good idea to have total corporate power
05:32:20.900 | against the quote elected government,
05:32:23.540 | even if you may disagree?
05:32:26.400 | Do you wanna open the door to total corporate oligarchy?
05:32:29.700 | And it's like the opposite, that's why I mentioned
05:32:31.100 | it's like the opposite of business plots,
05:32:32.380 | and they pulled on that thread, okay?
05:32:34.020 | So the macro explanation that I have for this
05:32:38.340 | is the Future Is Our Past thesis, and there's more,
05:32:39.980 | it also gives some predictions, right?
05:32:41.780 | If you go backwards in time,
05:32:43.060 | the US federalizes into many individual states,
05:32:47.200 | like before the Civil War,
05:32:48.340 | people said the United States are,
05:32:49.740 | and after they said the United States is.
05:32:51.680 | Before FDR, the 10th Amendment,
05:32:53.940 | reserved rights to the states,
05:32:55.180 | afterwards it was just federal regulation of everything.
05:32:58.120 | As we go forwards in time,
05:32:59.380 | you're seeing states break away from the feds
05:33:01.580 | on gun laws, drug laws, sanctuary cities, okay?
05:33:06.580 | Many other kinds of things, you know?
05:33:08.940 | And now Florida, for example, has its own guard
05:33:12.180 | that's like not a national guard, but like a state guard.
05:33:15.020 | Other states are doing this.
05:33:17.660 | - And that's a force of decentralization,
05:33:19.260 | you're saying that parallels in reverse.
05:33:21.820 | - In reverse, right?
05:33:22.820 | So you're heading to make America states again.
05:33:25.860 | - Nice.
05:33:26.700 | - Okay, that's what I think happening, right?
05:33:28.900 | I'm not saying, well, I think there's aspects of that
05:33:31.020 | that are good, there's aspects that are bad,
05:33:32.620 | but just like that's kind of the angle, right?
05:33:36.560 | - But then that's, I mean, from your perspective,
05:33:39.140 | that's probably not enough, right?
05:33:40.340 | That's not--
05:33:42.220 | - It's part of the future.
05:33:43.320 | Let's just say--
05:33:44.160 | - I think you, sorry to interrupt,
05:33:45.900 | you suggested all kinds of ways
05:33:47.420 | to build different countries.
05:33:48.920 | I think that's probably one of them.
05:33:50.360 | You said like start micro-countries or something like this.
05:33:54.020 | I forgot the terminology.
05:33:55.140 | - Yeah, micronations.
05:33:55.980 | Yeah, that's not my,
05:33:57.300 | I actually think of them as,
05:33:58.420 | a better term is microstates
05:33:59.620 | 'cause they're actually not nations.
05:34:00.700 | That's why they don't work,
05:34:01.540 | but microstates are better, right?
05:34:03.140 | Coming back to the difference
05:34:04.860 | between the nation and the state,
05:34:05.700 | the nation is like,
05:34:06.620 | the nation state is a term that people use
05:34:09.660 | without expanding it,
05:34:10.740 | but nation comes from the same root word as like natality.
05:34:13.380 | So it's like common descent, common birth, right?
05:34:15.780 | Common origin, like the Japanese nation.
05:34:17.660 | That's a group of people that have,
05:34:19.100 | come down from history, right?
05:34:20.380 | - Hence nationalism.
05:34:21.700 | - Yeah, whereas the state
05:34:22.820 | is like the administrative layer above them.
05:34:24.300 | It's like labor and capital,
05:34:25.780 | like labor and management, okay?
05:34:27.180 | The American state stood over the Japanese nation
05:34:30.220 | in 1946 after the war.
05:34:32.060 | - Right.
05:34:32.900 | Oh, so you weren't talking about tradition,
05:34:34.260 | and you know that that doesn't matter.
05:34:35.780 | - Tradition what?
05:34:36.620 | - In terms of like,
05:34:37.780 | I thought you meant nation is a thing
05:34:39.660 | that carries across the generations.
05:34:41.420 | There's a tradition, there's a culture, and so on,
05:34:43.540 | and state is just the management, the layer.
05:34:45.940 | - I mean, that's also another way
05:34:47.220 | of thinking about it, right?
05:34:48.060 | - There's a reversal there as well, okay.
05:34:50.060 | - Yeah, so I mean,
05:34:51.100 | one way of thinking about it is,
05:34:53.180 | you know, one nation under God indivisible
05:34:56.100 | is no longer true.
05:34:57.180 | It is, America is at least two nations,
05:34:59.740 | the Democrat and Republican,
05:35:01.380 | in the sense of their own cultures,
05:35:03.500 | where I can show you graph after graph.
05:35:05.300 | You've seen the polarization graphs.
05:35:06.900 | I can show you network diagrams where,
05:35:08.980 | you know, like there's this graph
05:35:10.780 | of polarization in Congress,
05:35:11.940 | where there's red and blue,
05:35:12.980 | they're separate things.
05:35:14.380 | There's this article from 2017 showing
05:35:17.180 | how, you know, shares on Facebook and Twitter
05:35:19.380 | are just separate subgraphs.
05:35:20.420 | They're just separate graphs in the social network,
05:35:22.100 | and they're pulling apart.
05:35:23.540 | Those are two nations.
05:35:25.540 | They're not under God because people in the US
05:35:27.620 | no longer believe in God,
05:35:28.500 | and they're very much divisible because
05:35:31.100 | 96% of Democrats won't marry Republicans
05:35:33.340 | and a high percentage the other way.
05:35:35.100 | And what that means is in one generation,
05:35:37.660 | ideology becomes biology.
05:35:39.500 | These become ethnic groups.
05:35:40.620 | It takes on the character of Hutu and Tutsi
05:35:42.820 | or Protestant and Catholic, Sunni and Shiite.
05:35:44.500 | It's not about ideology.
05:35:45.980 | If you think about all the flips during COVID, right,
05:35:48.100 | where people were on one side versus the other side,
05:35:49.700 | it's tribal, it's just tribe on tribe.
05:35:51.780 | And so it's not universalist. That identity of American
05:35:54.300 | makes less sense than the identity of Democrat
05:35:56.100 | and Republican right now,
05:35:57.660 | or perhaps the identity of individual states.
05:35:59.460 | What I think that's a good or bad thing,
05:36:00.500 | I think that's unfortunately, you know,
05:36:01.900 | whatever it is, it's the arrow of history, right?
05:36:03.940 | On the opposite side of things,
05:36:05.380 | India actually was 562 princely states
05:36:08.300 | at the time of Indian unification.
05:36:12.780 | In 1947, when they got independence from the British,
05:36:15.260 | it was 562 princely states.
05:36:16.660 | Most people,
05:36:17.540 | or at least people outside India, don't know that part.
05:36:19.620 | It got unified into a republic only by like 1950.
05:36:22.500 | And India is like actually a modern creation,
05:36:25.220 | India is like Europe.
05:36:26.220 | It's kind of like the European union in the sense that
05:36:28.900 | we didn't have a unified India in the past.
05:36:30.780 | It was something with a lot of different countries,
05:36:32.420 | like North and South India, or like Gujarat and Tamil Nadu
05:36:36.100 | are as different as Finland and Spain, okay?
05:36:39.460 | But India has moved in the direction
05:36:41.180 | of much more unification, like much more, you know,
05:36:44.620 | centralization or what have you,
05:36:47.380 | whereas the US is decentralized.
05:36:48.540 | And you go, okay, a few more things,
05:36:50.700 | there are flips and I'll finish this off.
05:36:52.220 | Today we're seeing the rise of the pseudonymous founder
05:36:54.060 | in startup societies back all the way back in the 1770s,
05:36:56.740 | we saw pseudonymous founders of startup countries,
05:36:58.700 | namely the US, right?
05:36:59.540 | The Federalist Papers.
05:37:00.660 | Today we're seeing so far unsuccessful calls
05:37:03.700 | for wealth seizures in the US.
05:37:05.260 | Back then we saw FDR's Executive Order 6102,
05:37:08.100 | which was a successful seizure of gold.
05:37:10.220 | I expect we may see something like that,
05:37:11.940 | an attempted seizure of digital gold.
05:37:13.700 | And I think that'll be one of the things
05:37:15.860 | that individual states like Florida or Texas
05:37:18.220 | may not enforce that.
05:37:19.860 | And I think that's actually the kind of thing
05:37:21.900 | where you could see, you know,
05:37:23.860 | like a breakup potential in the future, right?
05:37:26.620 | One other thing that kind of rhymes is,
05:37:29.420 | in many ways like the modern US establishment,
05:37:31.660 | the story that you hear is the victories in 1945 and 1865
05:37:37.100 | legitimate the current establishment.
05:37:38.380 | That is, beating the Nazis, beating the Confederates, right?
05:37:40.980 | So you beat the ethnic nationalists abroad
05:37:43.620 | and they beat the quote, secessionists at home, right?
05:37:46.580 | And the ethnic nationalists were, you know, Aryan Nazis
05:37:50.220 | and the secessionists were, you know,
05:37:52.180 | slave owners and against freedom and so on and so forth.
05:37:54.620 | Okay?
05:37:55.460 | I'm not disputing that, I'm just saying that
05:37:56.300 | that's just like the way people think about it.
05:37:59.220 | There's a possibility,
05:38:00.180 | and I'm not saying it's 100% at all, okay?
05:38:02.260 | But if you're a sci-fi writer,
05:38:03.880 | there's a possibility that the US loses
05:38:07.980 | to the ethnic nationalists abroad,
05:38:09.940 | except this time they're Chinese communists,
05:38:11.580 | non-white communists, as opposed to Aryan Nazis,
05:38:14.380 | which seemed like the total opposite, okay?
05:38:17.100 | And there's a possibility that there is
05:38:19.540 | a financial secession at home,
05:38:23.180 | where it's, you know, Bitcoin maximalist states
05:38:26.420 | that are advocating for freedom, the opposite of slavery.
05:38:29.260 | See what I'm saying?
05:38:31.620 | - Oh boy, that's dark.
05:38:35.940 | You're looking for major things in history
05:38:38.740 | that don't yet have a--
05:38:41.660 | - Cognate going forward, right?
05:38:43.500 | And that's a nice way to think about the future.
05:38:47.460 | - It is only one model,
05:38:48.780 | and any mental model or something like that,
05:38:50.940 | that's why I say as a sci-fi scenario,
05:38:52.740 | it's just like a scenario one could contemplate, right?
05:38:55.100 | Where the new version has, I mean,
05:38:57.780 | the Chinese communists do not think of themselves
05:38:59.660 | as Aryans, right?
05:39:02.140 | But they are ultra-nationalist.
05:39:05.140 | And the Hitler comparisons,
05:39:06.820 | people talk about Hitler endlessly,
05:39:08.180 | like Saddam is the new Hitler,
05:39:10.060 | everybody's the new Hitler, et cetera.
05:39:12.280 | If there is a comparison to quote Nazi Germany,
05:39:16.300 | it is, you know, CCP China in a sense, why?
05:39:20.460 | They are non-English speaking, manufacturing powerhouse
05:39:24.660 | with a massive military build-out under one leader
05:39:28.820 | that is a genuine peer competitor
05:39:31.660 | to the US on many dimensions.
05:39:33.160 | And in fact, you know,
05:39:34.260 | exceeding on some dimensions of technology and science.
05:39:37.060 | That is like, the problem is it's a boy who cried wolf.
05:39:40.720 | People say this a zillion times, right?
05:39:42.720 | And you know, that is like, you know,
05:39:46.500 | I'm not saying this by the way, crucially,
05:39:48.140 | I am just like, I think China is very complicated
05:39:53.140 | and there's hundreds of millions of people,
05:39:55.580 | probably half in China that disagrees
05:39:57.020 | with the current ultra-nationalist kind of thing, right?
05:39:59.380 | And so I kind of hate it when innocent Chinese people
05:40:02.980 | abroad or whatever, are just like attacked on this basis
05:40:06.380 | or what have you.
05:40:07.220 | Plus the other thing is that many Chinese people will say,
05:40:09.420 | well, look, relative to, you know,
05:40:11.880 | where we were when Deng took over in 1978,
05:40:16.100 | we built up the entire country.
05:40:17.480 | We're not starving to death anymore
05:40:18.720 | and the West wants to recolonize us.
05:40:20.380 | And so I understand where that's coming from.
05:40:22.000 | This way you wanna be able to argue different points of view.
05:40:24.200 | With that said, there's one huge difference, right?
05:40:28.320 | Which is Nazi Germany was like 70 million people
05:40:31.540 | and the US was 150 million
05:40:32.960 | and the Soviet Union was 150 million
05:40:34.320 | and the UK was like 50 million.
05:40:35.360 | So they were outnumbered like five to one.
05:40:37.760 | China outnumbers the US four to one.
05:40:42.760 | - This is gonna be a fun century.
05:40:46.040 | - Things are gonna get--
05:40:47.040 | - Under this model.
05:40:48.220 | - Under this model, things are gonna be potentially crazy.
05:40:50.260 | Plus, you know, people are like, oh, I think this is,
05:40:53.000 | you know, again, I have nothing personal,
05:40:55.980 | there's this guy, Peter Zeihan, he writes these books, right?
05:41:00.560 | I probably agree with about 20 or 30%,
05:41:02.420 | but I disagree with a lot of the rest.
05:41:03.860 | And a bunch of it is basically about
05:41:06.300 | how China's really weak and America's really strong
05:41:08.380 | and the rest of the world is screwed.
05:41:09.880 | And, you know, I think there's absolutely problems in China
05:41:14.700 | and, you know, like the current management
05:41:17.560 | is actually messing a lot of things up,
05:41:18.600 | we could talk about that.
05:41:20.220 | But I do think that, you know,
05:41:22.140 | the US is like fighting its factory.
05:41:24.580 | So one thing, you know, Zeihan will talk about is how,
05:41:26.780 | oh, America has this blue water Navy,
05:41:28.400 | all the aircraft carriers and China has nothing,
05:41:30.480 | it's got bupkis, et cetera.
05:41:32.140 | Well, China ships things all around the world, right?
05:41:34.620 | It probably has, you know,
05:41:36.180 | one of the most active fleets out there
05:41:39.100 | in terms of, you know, its commercial shipping.
05:41:41.500 | And in terms of building ships, here's a quote,
05:41:45.860 | "China's merchant shipbuilding industry
05:41:47.300 | is the world's largest,
05:41:48.140 | building more than 23 million gross tons of shipping
05:41:49.940 | in 2020.
05:41:51.020 | US yards built a mere 70,000 tons the same year
05:41:53.620 | though they typically average somewhere in the 200,000s."
05:41:55.840 | That is a 100 to 300X ratio, just in shipbuilding.
05:41:58.660 | Pretty much everything else you can find
05:42:00.060 | in the physical world is like that, okay?
05:42:02.540 | We're not talking like 2X,
05:42:04.360 | we're talking they can put together a subway station
05:42:06.980 | in nine hours with prefab and the US takes three years, okay?
05:42:11.980 | When you have 1000X difference in the physical world,
05:42:17.740 | the reason the US won against, you know,
05:42:20.740 | Nazi Germany in a serious fight is
05:42:22.900 | they had this giant manufacturing plant that was overseas
05:42:26.180 | and they just outproduced, right?
05:42:28.460 | And they supplied the Soviets also with lend-lease
05:42:30.180 | and the Soviets talked about how
05:42:32.380 | they would not have won the war without the Americans.
05:42:34.520 | People are like, "Oh, the Russians fought the Germans."
05:42:38.000 | The Russians armed by Americans fought the Germans.
05:42:41.040 | Like it's a Soviet Union,
05:42:42.440 | they're not actually able to make high quality stuff.
05:42:44.700 | There obviously are individual people in Soviet Russia
05:42:48.320 | who were innovative, right?
05:42:49.520 | I'm not taking that away.
05:42:50.360 | There's a tradition of amazing,
05:42:52.920 | I just wanna be like,
05:42:53.980 | there's individual Russians who obviously I admire,
05:42:56.840 | Mendeleev and, you know, Kolmogorov and so on.
05:43:00.080 | There's amazing Russian scientists and engineers.
05:43:01.640 | So I'm not saying that--
05:43:02.480 | - I mean, in general, from brilliant folks like yourself
05:43:07.100 | that criticize communism,
05:43:08.860 | it's too easy to say nothing communism produces is good,
05:43:13.860 | which of course is not true.
05:43:16.360 | - Yes.
05:43:17.200 | - A lot of brilliant people and a lot of even, you know,
05:43:20.920 | there's a lot of amazing things that have been created.
05:43:23.160 | - Yeah, so they had some amazing mathematicians,
05:43:24.980 | amazing scientists and so on, right?
05:43:26.680 | However--
05:43:27.520 | - Great branding on the, you know, red and yellow.
05:43:30.520 | - Yeah, yeah, so-- - The branding is stellar.
05:43:32.720 | Nazi Germany too, excellent branding.
05:43:35.240 | With the flag and so on, you know?
05:43:37.320 | - So--
05:43:38.160 | - So--
05:43:38.980 | - And it ends there in terms of compliments.
05:43:42.520 | - Yeah, well, actually they copied a lot of stuff
05:43:44.480 | from each other, you know?
05:43:45.380 | Like there's this movie called "The Soviet Story."
05:43:48.000 | It basically shows a lot of Nazi
05:43:50.260 | and Soviet propaganda things next to each other.
05:43:52.320 | And you can see guys almost in like the same pose.
05:43:54.680 | It's almost like, you know how AI will do like
05:43:57.000 | style transfer?
05:43:58.000 | You can almost see, 'cause the socialist realism
05:44:00.480 | style of like the muscular brawny worker,
05:44:02.920 | very similar to like the style of the Aryan Superman,
05:44:05.520 | you know, like pointing at the vermin or whatever.
05:44:07.640 | - And then there's the crappy open source version
05:44:09.880 | that tries to copy, which is Mussolini.
05:44:12.080 | - Yeah.
05:44:12.920 | - That just like, that does the same exact thing,
05:44:15.120 | but does it kind of shittier, so anyway.
05:44:17.200 | - So my main thing about this is basically like
05:44:19.760 | trying to fight your factory in the physical world
05:44:23.360 | is probably not gonna work.
05:44:24.780 | People are, I think, overconfident on this stuff, right?
05:44:28.440 | With that said, I think we want to, at all,
05:44:31.160 | you know, the future is not yet determined, right?
05:44:33.120 | At all costs, you know, we wanna avoid a hot war
05:44:36.120 | between like, I mean, a hot war between the US and China
05:44:39.160 | would be-
05:44:40.720 | - Do you think it's possible that we'll get a war?
05:44:42.920 | - We're doing these things like Pelosi going to Taiwan
05:44:45.400 | and trying to cause something.
05:44:48.620 | Like, look, again, this is one of these things
05:44:51.120 | which is complicated because obviously,
05:44:53.520 | if you're, there's more than one perspective on this, right?
05:44:55.920 | Again, you've got the US establishment,
05:44:57.600 | the US conservative, the Taiwanese perspective,
05:45:00.200 | the Chinese perspective, all the bystanders over there,
05:45:02.600 | there's more than one perspective on this, okay?
05:45:04.840 | If you're, you know, one of China's many neighbors,
05:45:08.500 | you look at China with apprehension.
05:45:10.600 | Like Vietnam, for example, has sort of fallen into,
05:45:13.320 | or not fallen into, is partnering with India
05:45:15.780 | because they're mutually apprehensive of China.
05:45:17.920 | China's not making like great friends with its neighbors.
05:45:20.400 | It's kind of, you know, it's demonizing Japan.
05:45:22.280 | It's so ultra-nationalist nowadays.
05:45:24.760 | And so if you're a Taiwanese, you're like,
05:45:27.000 | yeah, I do not want to be
05:45:28.840 | under the Chinese surveillance state.
05:45:30.280 | I completely understand it.
05:45:31.160 | Some people are pro-reunification, others aren't,
05:45:33.960 | but there's more, you know, trend, you know,
05:45:36.120 | in some ways for independence.
05:45:38.040 | Okay, fine.
05:45:39.360 | - But there's also an increasing temperature
05:45:41.880 | across the entire world.
05:45:43.520 | As we sit here today, there's speeches by Vladimir Putin
05:45:47.960 | about the serious possibility of a nuclear war.
05:45:50.980 | And that escalates kind of the heat in the room
05:45:56.680 | of geopolitics.
05:45:57.840 | - It escalates the heat in the room, of course, right?
05:45:59.560 | And the thing is, people have this belief
05:46:01.840 | that because something hasn't happened,
05:46:03.720 | it won't happen or it can't happen.
05:46:05.960 | But like, there were a lot of measures people took
05:46:08.920 | during the Cold War to make sure a nuclear exchange
05:46:13.920 | didn't happen, the whole mutually assured destruction thing
05:46:16.520 | and communicating that out and like the balance of terror.
05:46:20.160 | There were smart guys on both sides
05:46:21.840 | who thought through this and there were near misses, right?
05:46:25.720 | There, you know, like there's that story
05:46:27.200 | about like the Soviet colonel
05:46:28.920 | who didn't order a nuclear strike
05:46:30.200 | 'cause he thought it was just like an error
05:46:31.600 | in the instruments, right?
05:46:32.840 | Okay, what's the point?
05:46:34.080 | Point is, you know, for example, Pelosi going to Taiwan,
05:46:38.040 | that didn't strengthen Taiwan.
05:46:39.940 | That didn't like, if you're gonna go and provoke China,
05:46:42.920 | I thought Scholar Stage, his Twitter account,
05:46:44.760 | had a good point, which is you should,
05:46:47.520 | if you're actually gonna do it,
05:46:48.360 | then you strengthen Taiwan with like huge battalions
05:46:50.920 | of like arms and materiel and you make them a porcupine
05:46:54.000 | and so on and so forth.
05:46:55.240 | Instead, her kind of going and landing there
05:46:57.600 | and mooning China and then flying back
05:47:00.400 | in the middle of a hot war with Russia,
05:47:02.440 | that's absolutely, you know,
05:47:03.560 | in the middle of an economic crisis or what have you,
05:47:06.560 | it just, you know, can you pick battles or whatever, right?
05:47:10.800 | It's like, you don't have to fight Russia
05:47:13.040 | and China at the same time.
05:47:15.320 | It's like kind of insane to do that, okay?
05:47:17.200 | Plus even with Ukraine, some people were like,
05:47:20.080 | oh, this was like a victory
05:47:21.800 | for the US military policy or something.
05:47:23.840 | There was a guy who, I'm not trying to beat him up
05:47:25.800 | or anything, he's like, this is in March,
05:47:27.880 | thread on US security assistance to Ukraine, it's working.
05:47:30.200 | Ukraine might be one of the biggest successes
05:47:32.000 | of US security assistance.
05:47:33.120 | And the reason is, you know, US didn't focus
05:47:35.480 | on some high-end shiny objects,
05:47:36.800 | but on core military tasks that focus should remain.
05:47:39.160 | And it's like, how is this a success?
05:47:42.200 | The West gave massive arms to Ukraine
05:47:44.320 | only after the invasion, but not enough before to deter.
05:47:48.040 | And now Ukraine is like this Syria-like battleground
05:47:51.520 | with a million refugees or whatever the number is, right?
05:47:55.920 | Their country is blown to smithereens,
05:47:57.600 | thousands of people dead, whatever, thousand-dollar gas
05:48:00.600 | in Europe with like 10X energy, radicalized Russians,
05:48:03.880 | the threat of World War III or even nuclear war, you know?
05:48:07.000 | Shooting somebody isn't, that's not like the point
05:48:10.280 | of the military, the point is, there's a million ways
05:48:12.980 | to smash Humpty Dumpty into pieces
05:48:14.560 | and unleash the blood-drenched tides, right?
05:48:17.680 | And have people all shooting each other and killing each other.
05:48:19.120 | It's really hard to maintain stability.
05:48:21.280 | That's what competence is, is deterrence and stability.
05:48:24.000 | Right?
05:48:24.840 | That's not like a success in any way.
05:48:27.360 | It's like an absolute tragedy for everybody involved, right?
05:48:30.960 | - Yeah, I mean, deterrence of course is the number one thing,
05:48:34.040 | but there's a lot to be said there.
05:48:36.760 | But I'm hugely not a fan of declaring victory
05:48:41.960 | as we've done many times when it's wrong.
05:48:44.920 | - Yeah, I mean, the other thing about this
05:48:46.320 | is the whole mission accomplished thing during Iraq.
05:48:48.360 | - Mission accomplished is what I meant, yeah.
05:48:49.840 | - Exactly, mission accomplished was obviously,
05:48:52.360 | the thing is Russia lives next door to Ukraine.
05:48:55.780 | And so, I mean, just like Iraq lives next door to Iran
05:48:59.560 | and Afghanistan is next door to Pakistan and China.
05:49:01.880 | And so if the US eventually gets tired of it and leaves,
05:49:05.100 | those guys are next door, right?
05:49:07.040 | And so, who knows what's gonna happen here, okay?
05:49:10.840 | But one of the problems is like,
05:49:14.880 | the whole Afghanistan thing or the Iraq thing
05:49:17.120 | is the lesson for people was the uncertainty.
05:49:20.080 | They're like, is the US gonna fight?
05:49:21.620 | Don't know.
05:49:22.620 | Will the US win if it fights?
05:49:23.700 | Don't know, therefore roll the dice.
05:49:25.660 | That uncertainty is itself like tempting to folks,
05:49:28.880 | like Putin over there, right?
05:49:30.480 | So point is coming all the way back up,
05:49:32.400 | we were talking about how, in history, the future is our past
05:49:35.960 | and FDR, like the Business Plot against FDR failed,
05:49:39.660 | but like the tech companies
05:49:41.260 | were able to de-platform Trump, right?
05:49:42.920 | And the left libertarian would say,
05:49:44.320 | do we want that much corporate power?
05:49:45.760 | Okay, and so that's,
05:49:47.920 | so we gave the pro case for Trump de-platforming,
05:49:49.720 | protecting democracy, the Trump supporter case against,
05:49:52.480 | which is on the secret history of the shadow campaign
05:49:54.480 | to save the 2020 election, basically that article,
05:49:56.800 | the left libertarian or libertarian case against.
05:49:59.040 | And then to me, what is, you know,
05:50:01.440 | like I am more sympathetic to the libertarian
05:50:06.400 | slash left libertarian against,
05:50:07.440 | and then also maybe the fourth group,
05:50:09.520 | which is the non-American case, right?
05:50:11.880 | Which is to say every, you know, Amlo,
05:50:16.000 | he's the, he was the head of state of Mexico,
05:50:19.200 | I think at that time, okay.
05:50:21.360 | Amlo, Macron, you know, other folks,
05:50:24.080 | everybody who's watching this around the world
05:50:26.640 | basically saw, let's say,
05:50:29.040 | US establishment or Democrat aligned folks
05:50:32.480 | just decapitate, you know, the head of state digitally,
05:50:37.480 | right, like just boom, gone, okay.
05:50:39.880 | And they're like, well, if they can do that in public
05:50:42.320 | to the US president,
05:50:43.480 | who's ostensibly the most powerful man in the world,
05:51:45.920 | what does the Mexican president stand against that?
05:50:48.800 | Nothing, right?
05:50:50.040 | Like these US media corporations,
05:50:52.000 | these US tech companies are so insanely powerful,
05:50:54.640 | everybody's on Twitter or what have you,
05:50:56.680 | other than China, leaving them aside,
05:50:58.040 | they've got their own root system.
05:50:59.640 | If somebody tried to de-platform
05:51:01.840 | Xi Jinping off of Sina Weibo,
05:51:03.840 | they'd probably just fall through a trap door,
05:51:05.720 | you know, their whole family, right?
05:51:07.600 | But for the rest of the world,
05:51:09.000 | that's on the, that is hosting their business,
05:51:11.760 | their politics on these US tech companies,
05:51:14.240 | they're like, regardless of whether it was justified
05:51:17.520 | on this guy, that means they will do it to anybody.
05:51:20.520 | Now the seal is broken,
05:51:21.600 | just like the bailouts, as exceptional as they were
05:51:23.840 | in the first era, everybody was shocked by them,
05:51:26.000 | then they became a policy instrument.
05:51:28.280 | And now there's bailouts happening,
05:51:30.560 | every single bill is printing another whatever,
05:51:32.600 | billion dollars or something like that, right?
05:51:34.440 | - Can I ask on your thoughts and advice on this topic?
05:51:39.280 | If I or anyone were to have a conversation
05:51:42.960 | with Donald Trump, first of all, should one do so?
05:51:47.360 | And if so, how do you do it?
05:51:52.040 | And it may not necessarily be Trump,
05:51:54.160 | it could be other people like Putin and Xi Jinping
05:51:57.560 | and so on, let's say people that are censored.
05:52:01.040 | - Right.
05:52:01.880 | - Like people that platforms in general see as dangerous.
05:52:06.720 | Hitler, you can go, we keep bringing it up.
05:52:09.560 | - Of course, that's the ultimate edge case, right?
05:52:11.760 | In the sense of, that's saying like,
05:52:13.160 | something must be done, this is something,
05:52:14.880 | therefore this must be done, right?
05:52:16.640 | I've heard that one before.
05:52:18.840 | (laughing)
05:52:20.160 | - No, but I love it.
05:52:22.000 | - So this is just--
05:52:23.800 | - Can I just use that as an explanation
05:52:25.600 | with confidence for everything I do?
05:52:27.280 | - Yeah, sure, there you go, right?
05:52:28.440 | - Something must be done, this is something,
05:52:30.160 | therefore it must be done.
05:52:31.160 | - Therefore this must be done.
05:52:32.120 | So that is the, like, all kinds of regulations,
05:52:35.640 | all kinds of things are kind of justified on that basis,
05:52:37.320 | right, and there's a version of that which is,
05:52:40.240 | punch a Nazi, I decide who's a Nazi, you're a Nazi.
05:52:44.040 | - Therefore I punch you and that's justified, yeah.
05:52:46.560 | - Yeah, and you know, like people say,
05:52:47.840 | oh, how many people are calling Israelis,
05:52:50.200 | you know, like these things, right?
05:52:52.480 | And so the problem with argumentum ad Hitlerum is,
05:52:57.480 | it just, I mean, people will say Obama's a Nazi,
05:53:00.440 | everybody will say everybody's a Nazi, right?
05:53:01.960 | - But there is a social consensus about who,
05:53:04.920 | let's set Nazis aside, but who is dangerous for society.
05:53:08.320 | - Okay, but now let's talk about that, all right?
05:53:10.000 | So basically, I think a more interesting example
05:53:12.640 | than Hitler in this context is Herbert Matthews.
05:53:16.880 | So Fidel Castro, before he became the communist dictator
05:53:20.320 | of Cuba, was on the run.
05:53:22.760 | He was like Osama Bin Laden at the time,
05:53:24.120 | he was like a terrorist that the Cuban regime
05:53:26.080 | had seemingly defeated.
05:53:27.760 | And what Herbert Matthews did is he got an intro to him,
05:53:30.040 | he went to the, you know, place where he was hiding out,
05:53:33.120 | he gave an interview and he printed this hagiography
05:53:35.960 | in the New York Times with this like, you know,
05:53:37.720 | photo of Castro looking all, you know, mighty and so on.
05:53:41.600 | And he's like, Castro is still alive and still fighting,
05:53:45.160 | okay, and there's this book on this called
05:53:48.280 | "The Man Who Invented Fidel," okay?
05:53:50.960 | Where basically, NYT's article was crucial positive press
05:53:55.960 | that got Castro's point of view out to the world
05:53:59.640 | and helped lead to the communist revolution
05:54:02.760 | that actually impoverished Cuba,
05:54:05.880 | led to like gay people being, you know,
05:54:08.120 | like discriminated against there,
05:54:09.800 | led to people fleeing, you know,
05:54:11.760 | and drowning trying to escape, right?
05:54:14.120 | That's an example of where platforming somebody
05:54:16.720 | led to a very bad outcome.
05:54:17.800 | In fact, many of the communist dictators
05:54:20.560 | in the 20th century had like their own personal journalist,
05:54:23.320 | right?
05:54:24.160 | For example, there's a guy, John Reed, he's an American.
05:54:26.040 | He's buried, you know, if I get this right,
05:54:28.360 | I think he's buried at the Kremlin wall, okay?
05:54:31.280 | Why is an American buried there, okay?
05:54:34.840 | Because he wrote a book called "10 Days That Shook the World"
05:54:39.560 | that whitewashed the entire Soviet revolution
05:54:42.680 | and the Russian revolution in 1917, October revolution,
05:54:46.960 | and made these guys out to be the good guys
05:54:49.400 | when they were actually genocidal psychopaths, okay?
05:54:52.240 | He got their point of view out to the world
05:54:54.120 | and it was a totally misleading point of view, all right?
05:54:56.080 | - Do you think, what do you think he was thinking?
05:54:59.760 | Do you think he saw the psychopathy?
05:55:01.880 | You know, sometimes it's not obvious, like--
05:55:03.680 | - Well, the French revolution had already happened.
05:55:05.240 | So people kind of knew that this sort of psychopathic,
05:55:08.080 | you know, killing in the name of equality
05:55:09.560 | could produce bad results, right?
05:55:11.520 | But it's more than that, right?
05:55:14.040 | So it's John Reed, it's Herbert Matthews,
05:55:16.000 | it's Edgar Snow, okay?
05:55:18.440 | So these are all people who should be extremely famous,
05:55:20.280 | right?
05:55:21.320 | So Edgar Snow is Mao's journalist, okay?
05:55:24.480 | So he wrote, you know, there's actually an article in this
05:55:28.200 | how a 1930s reporter from Missouri
05:55:30.360 | became China's ideal journalist, okay?
05:55:33.520 | And he wrote various books,
05:55:34.880 | including like "Red Star Over China," okay?
05:55:38.280 | And it's just a hagiography of Mao, right?
05:55:41.480 | And then of course you've got Duranty,
05:55:43.840 | and he is like Stalin's biographer, right?
05:55:46.720 | Just to recap, John Reed brought Lenin's message
05:55:49.920 | to the world, millions dead.
05:55:51.600 | Duranty helped Stalin starve out the Ukrainians,
05:55:55.560 | Millions dead.
05:55:57.540 | Edgar Snow was Mao's biographer,
05:56:01.060 | and Herbert Matthews was like Castro's.
05:56:05.120 | This guy, David Halberstam in Vietnam,
05:56:07.920 | who was effectively Ho Chi Minh's.
05:56:10.040 | He basically went and took leaks from a communist spy.
05:56:14.600 | I'll give you the exact name.
05:56:16.720 | Pham, I'm gonna mispronounce this,
05:56:18.400 | but it's Perfect Spy,
05:56:20.120 | the incredible double life of Pham Xuan An,
05:56:23.760 | Time Magazine reporter and Vietnamese communist agent.
05:56:26.600 | That guy was the source of many fabricated stories
05:56:29.360 | that David Halberstam printed in the New York Times
05:56:31.280 | that led to the undermining of the South Vietnamese regime.
05:56:34.140 | And for example, stories of Buddhists being killed and so on
05:56:38.140 | Ashley Rindsberg in "The Gray Lady Winked"
05:56:39.840 | writes this whole thing up at length,
05:56:41.040 | so you can go and read it for his account.
05:56:43.600 | But basically all of these communist dictators
05:56:46.920 | had a journalist right alongside them as their biographer.
05:56:50.240 | - Yeah. - Okay?
05:56:51.160 | - But those are tools of the propaganda machine versus-
05:56:54.080 | - Well, so my point is,
05:56:56.040 | these are five examples that are on the far left
05:57:00.040 | that should be balanced also against the Times
05:57:02.200 | running profiles of Hitler on the far right.
05:57:05.360 | We know that basically,
05:57:06.720 | the Times actually also ran a whole thing,
05:57:08.160 | which was Hitler's like mountain retreat
05:57:10.980 | or something like that.
05:57:11.820 | Do you know about that story?
05:57:12.640 | - What year was this?
05:57:13.560 | - I'll tell you in a second.
05:57:15.240 | Hitler at home in the clouds.
05:57:17.560 | - Oh boy, please tell me it's like early 30s.
05:57:20.760 | - I think it's, oh yeah, this is Otto D. Tolischus.
05:57:23.400 | This is actually a guy that Ashley Rindsberg writes up
05:57:26.320 | in "The Gray Lady Winked," right?
05:57:27.520 | - 1937. - '37.
05:57:29.040 | There's another one where I think the date is wrong,
05:57:30.720 | but it's 39, you know,
05:57:32.000 | but essentially these titles are like,
05:57:34.840 | "Where Hitler Dreams and Plans,"
05:57:36.500 | "He Lives Simply," you know, right?
05:57:39.280 | And there's another one,
05:57:40.300 | "Herr Hitler at Home in the Clouds," okay?
05:57:43.000 | The thing about this is,
05:57:44.360 | absolutely there are folks who are hagiographers
05:57:47.160 | of the far right,
05:57:48.720 | but whether you're talking Lenin and John Reed
05:57:56.080 | or Stalin and Walter Duranty of "The New York Times,"
05:57:56.080 | or Castro and Herbert Matthews, again, of "The New York Times,"
05:57:59.560 | or Edgar Snow and Mao,
05:58:02.120 | or David Halberstam and, you know,
05:58:04.640 | Ho Chi Minh, again, of "The New York Times,"
05:58:07.020 | like you start to see a pattern here
05:58:09.240 | where the guys who are being platformed and given a voice
05:58:12.000 | are these guys who end up being like far left,
05:58:14.480 | you know, lunatics, right?
05:58:16.200 | And I think part of the issue here is,
05:58:18.040 | you know, the saying about how communists
05:58:20.200 | don't understand self-interest?
05:58:21.840 | Nationalists don't understand other interest.
05:58:24.080 | And so nationalists are more obvious.
05:58:27.280 | Isn't that good?
05:58:29.120 | I thought it was good.
05:58:29.940 | - It's pretty good. - Right?
05:58:30.780 | - Pretty good.
05:58:31.620 | - So the nationalist is very obvious
05:58:33.360 | in the sense of like, they're for the Aryans.
05:58:35.760 | They're not even for like the Slavs or whatever, right?
05:58:37.760 | Like, you know, basically, you know,
05:58:40.800 | had Hitler constructed a different ideology,
05:58:43.720 | you know, then like he might've gotten more support
05:58:46.160 | in Eastern Europe or whatever, right?
05:58:47.460 | But he also called the Slavs inferior,
05:58:49.060 | not just, you know,
05:58:49.900 | basically everybody was inferior to the Aryans, okay?
05:58:52.340 | Except maybe the English or whatever,
05:58:53.580 | but that was it, right?
05:58:55.500 | Oh, and the Japanese are honorary Aryans or something.
05:58:58.280 | So the nationalist declares the supremacy
05:59:01.180 | of their own race or culture or what have you,
05:59:03.380 | and doesn't understand people's other interest,
05:59:05.580 | but he also pumps up his own guys, okay?
05:59:07.660 | Same with, you know, in some ways China today,
05:59:09.820 | same with Japan back in the day.
05:59:11.900 | Whereas the communist has a message
05:59:14.300 | that sounds more appealing.
05:59:16.300 | It's a universalist message ostensibly,
05:59:19.340 | but it's actually a faux universalism
05:59:22.060 | 'cause it's actually particularism.
05:59:23.300 | Like during the Soviet Union, communism,
05:59:25.780 | this faux universalism was basically a mask
05:59:28.580 | for Russian nationalism, you know,
05:59:29.900 | where, you know, or at least Soviet nationalism
05:59:32.780 | where in particular Russians were pushed
05:59:34.500 | into many territories and, you know,
05:59:36.640 | Russian speakers were, you know,
05:59:38.840 | like privileged in, you know,
05:59:40.460 | the Eastern Europe and the Baltics.
05:59:42.940 | Of course, Russians themselves were oppressed at home
05:59:45.360 | as Solzhenitsyn writes here,
05:59:46.320 | both victim and victimizer of the regime.
05:59:48.320 | Their churches were crushed and so on.
05:59:50.260 | As compensation, they were agents of empire.
05:59:52.140 | It's a tragedy all around, okay?
05:59:53.340 | I'm not, you know, I think Russians have been hard done
05:59:56.460 | in many ways, they've had a very hard, hard century.
05:59:58.180 | They've also done hard by others, okay?
05:59:59.980 | It's complicated.
06:00:00.820 | - Those journalists you mentioned,
06:00:02.300 | just to elaborate, maybe you disagree with me,
06:00:05.140 | I wonder what you think.
06:00:06.180 | - Sure.
06:00:07.020 | - But I think conversation,
06:00:08.500 | like not to sort of glorify any particular medium,
06:00:12.100 | but there's something,
06:00:12.940 | one of the reasons I like long form podcasts
06:00:15.700 | or interviews, long form unedited interviews,
06:00:18.880 | there's been shows throughout the 20th century
06:00:20.720 | that do that kind of thing, but they seem to be rare.
06:00:23.460 | That podcast made it much more popular and common.
06:00:27.240 | It somehow makes it easier not to do
06:00:32.180 | this kind of bullshit journalism that--
06:00:34.380 | - The gotcha stuff, yeah.
06:00:35.860 | - I feel like asking interesting and deep questions allow,
06:00:40.860 | I think you could sit down with Hitler in 1940, 1941,
06:00:45.580 | 1942, and the podcast actually serve a purpose.
06:00:50.580 | - In '41 and '42, mid-World War II?
06:00:54.580 | - Or mid-World War II.
06:00:55.860 | A purpose of, one, which is very important,
06:00:59.860 | get good information for the future
06:01:03.020 | so history can study it.
06:01:05.140 | And two, reveal to the world the way a man thinks
06:01:10.140 | that is beyond the propaganda.
06:01:14.360 | - So all this stuff is complicated,
06:01:17.240 | but today, so in the specific issue
06:01:19.180 | of the folks you were talking about,
06:01:20.340 | like Putin, Xi, Trump, right?
06:01:23.020 | For those folks, they are very clearly out-group
06:01:27.620 | for both the US left and right,
06:01:29.140 | which is, let's say the Western left and right,
06:01:31.060 | which are your audience.
06:01:34.480 | There's folks who are tankies
06:01:36.460 | and there are folks who are MAGA who are sympathetic.
06:01:39.140 | - I'm sorry, what are tankies?
06:01:40.420 | - Tankies are those who are,
06:01:41.860 | they may not call themselves tankies.
06:01:44.180 | Let's say they're anti-imperialist left and MAGA right,
06:01:48.620 | for different reasons, are against the US establishment
06:01:52.940 | and for Putin or Xi or something like that
06:01:56.160 | as an agent against the US establishment.
06:01:59.700 | So leaving those aside,
06:02:01.700 | the point is that most of your audience
06:02:03.180 | is sort of on guard, vaccinated in a sense,
06:02:07.100 | versus Xi and Putin and Trump.
06:02:09.780 | They know the counter-arguments and so on and so forth.
06:02:13.860 | In which case, I wouldn't think interviewing them
06:02:16.740 | would be that big a deal, relatively,
06:02:20.580 | 'cause there's so much other coverage and so on out there.
06:02:23.980 | I think it's probably okay.
06:02:25.420 | However, for something like what John Reed was doing
06:02:30.420 | and so on, when he was the sole source of information
06:02:32.740 | about the Russian Revolution.
06:02:34.220 | - Yes, that's different.
06:02:35.460 | - That's different, right?
06:02:36.540 | So it's something about,
06:02:38.260 | it kind of gets back to the competitive environment
06:02:40.420 | and so on.
06:02:41.260 | The dearth of folks who are writing critical coverage
06:02:44.860 | of these three men.
06:02:47.140 | And so if I felt that that was insufficient,
06:02:50.340 | then you might need more of it.
06:02:51.900 | Just like, for example, nowadays with Stalin,
06:02:55.900 | there are a lot of articles and books and PDFs
06:02:58.940 | and so on on it.
06:02:59.860 | - But at the time, not as much.
06:03:02.020 | - At the time, not as much.
06:03:03.020 | That's why I brought those guys.
06:03:04.580 | Because often, it's kind of like,
06:03:06.780 | have you ever stocked shelves at a supermarket?
06:03:09.580 | This would seem totally out of left field.
06:03:11.460 | - No, but shoes, but the same thing,
06:03:13.780 | Sears, I used to work at Sears.
06:03:14.980 | - The thing that is the most popular
06:03:16.980 | is the thing that's not on the shelf
06:03:18.660 | because it's been sold out.
06:03:20.220 | - Yeah. - Right?
06:03:21.180 | So in some ways, this is similar to that famous photo
06:03:25.660 | that people have, or image that people have on Twitter
06:03:27.820 | of the plane and the parts that are shot versus not, right?
06:03:32.380 | The survivorship bias, right?
06:03:35.120 | And one way of kind of thinking about it is
06:03:37.820 | the guys who you think of as bad guys or possible bad guys
06:03:40.900 | or controversial guys or whatever,
06:03:43.460 | are those you've already got some vaccination to,
06:03:45.060 | that's why you think of them at all.
06:03:46.600 | Whereas the folks that I mentioned,
06:03:48.100 | the regulators, invisible, right?
06:03:50.760 | Sulzberger, you know Zuckerberg,
06:03:52.300 | you know his pros and cons, you know who he is as a person.
06:03:55.040 | You don't even know Sulzberger exists, most people, right?
06:03:57.660 | Despite the fact that he's like, certainly is powerful.
06:04:00.540 | You know, he owns the New York Times, he inherits it.
06:04:02.460 | He also got dual class stock, just like Zuck.
06:04:05.220 | But he's invisible, right?
06:04:06.620 | - Well, that's why I think studying the knowns,
06:04:10.020 | the people that are known can help you generalize
06:04:12.940 | to the way human nature is, and then you start to question,
06:04:17.420 | are the same kind of humans existing in places
06:04:20.700 | that wield power?
06:04:23.260 | - Yeah. - And you can assume
06:04:24.180 | they are, they do exist there,
06:04:26.300 | and then you can start to infer.
06:04:27.940 | - That's right. - And ask questions.
06:04:29.460 | - So this is kind of, what I try to do is I'm like,
06:04:32.580 | what is the dark matter?
06:04:33.560 | What is the question that is not being asked
06:04:35.620 | or what have you, right? - Yes, yes.
06:04:36.860 | - And so, you know, that's not to say
06:04:39.260 | that you need to be so anti-mimetic that you only do that.
06:04:42.780 | But I think you need to do that as well as understand
06:04:45.220 | what is good about the conventional wisdom.
06:04:46.900 | And, you know, for example, if you notice a lot
06:04:48.700 | of what I talk about is like the V1, V2, V3,
06:04:51.780 | where as critical as I am of, let's say, the FDA,
06:04:54.620 | I recognize that people want a regulated marketplace
06:04:57.140 | and how do we do better?
06:04:58.340 | As critical as I can be of the Fed,
06:04:59.980 | I recognize that some kind of monetary policy is necessary.
06:05:03.860 | And Satoshi came up with a better one, right?
06:05:06.580 | As harsh as one can be a critic of the current system,
06:05:11.280 | it is really incumbent, as difficult as that is,
06:05:14.120 | upon one to come up with a better version.
06:05:15.500 | Just like academia, as much as I think
06:05:17.580 | current science is corrupted,
06:05:19.420 | what I propose is a way to actually improve on that.
06:05:22.260 | And actually, any true scientist say,
06:05:24.620 | yes, I want my work to be reproducible.
06:05:26.140 | Yes, I want citations to be important statements
06:05:28.220 | and so on and so forth.
06:05:29.060 | And we don't have to get everybody to agree with that,
06:05:30.580 | but just enough to build that better version
06:05:32.300 | and not regress.
06:05:33.540 | - Yeah, there's an implied optimism
06:05:36.060 | within the V1, V2, V3 framework.
06:05:39.460 | Let me ask you at a high level about social media,
06:05:43.060 | because you are one of its prominent users
06:05:47.820 | to communicate your wisdom.
06:05:49.580 | - I use Twitter, I wouldn't really think of it
06:05:52.220 | as communicate my wisdom per se or anything like that.
06:05:55.300 | I use Twitter like I might use GitHub as a scratch pad
06:05:58.660 | for just kind of floating concepts.
06:06:00.740 | And I've got, okay, here's a frame on things,
06:06:03.200 | let me kind of put it out there
06:06:04.340 | and see what people think, get some feedback and so on.
06:06:06.620 | - Don't you think it has lasting impact, your scratch book?
06:06:10.340 | - I think it's good, but basically like,
06:06:13.220 | if I say what's my primary thing on Twitter, it's that.
06:06:16.380 | It's a scratch pad for me to kind of
06:06:18.900 | put some concepts out there, iterate on them,
06:06:21.620 | get feedback on them and so on and so forth.
06:06:23.300 | - Do you think it's possible that the words you've tweeted
06:06:25.820 | on Twitter is the most impact you will have?
06:06:30.820 | - On the world? - On the world.
06:06:32.420 | - I don't, so-- - Is that possible?
06:06:34.440 | - Is it possible?
06:06:36.040 | Well, my tweets-- - What you gave me.
06:06:37.640 | - It's a good question.
06:06:38.600 | I think the network state will be, I think, important,
06:06:42.760 | or I hope, well, the book--
06:06:44.280 | - The book or the concept?
06:06:45.640 | - Good question. - Sorry, just a quick, right?
06:06:47.320 | - The movement. - The movement.
06:06:48.640 | - Right, in the sense that Zionism shows
06:06:51.720 | that it is possible to have a book and then a conference
06:06:55.760 | and then a fund and eventually in the fullness of time
06:06:58.320 | with a lot of time and effort to actually get a state, right?
06:07:01.360 | And as I mentioned earlier,
06:07:03.700 | a lot of countries are small countries,
06:07:05.140 | but I didn't mention there's a guy
06:07:06.580 | who's the head of Kazakhstan and he made a remark.
06:07:08.580 | He's like, "You know, if we allow every nation
06:07:11.820 | "that wants to have self-determination to have a state,
06:07:15.080 | "we'd have 600 countries rather than 190."
06:07:17.980 | Because there's many opposites of a nation state,
06:07:21.500 | but one of the opposites is the stateless nation.
06:07:24.100 | And so you remember the network state is popular?
06:07:25.840 | In places like Catalonia, Catalonian nationalists,
06:07:29.020 | in Catalonia, guys who are committed Catalonian nationalists.
06:07:31.660 | So Catalonia, this region of Spain, right?
06:07:33.740 | The thing is that, again, V1, V2, V3,
06:07:39.640 | the nation state is V2 and it beat the city state,
06:07:43.140 | which is like V1.
06:07:44.860 | And the network state I think of as a potential V3,
06:07:46.660 | which combines aspects of V1 and V2.
06:07:48.700 | So Catalonia or the Basque region,
06:07:51.860 | these are underneath the quote nation state of Spain,
06:07:55.820 | but many Catalonians think of themselves
06:07:57.380 | as part of a separate nation.
06:07:58.960 | Not all, many, okay?
06:08:01.180 | And so they want a state of their own.
06:08:03.220 | Who doesn't, if you're a nation?
06:08:06.260 | Meaning that they've got a legitimate claim
06:08:08.500 | from history, language, culture, all that stuff, right?
06:08:11.780 | The Basques do as well.
06:08:12.900 | The Kurds do as well, okay?
06:08:14.820 | Lots of ethnic groups around the world do.
06:08:16.540 | So in the game of musical chairs,
06:08:18.660 | that was the formation of current national borders,
06:08:21.340 | they lost out, right?
06:08:23.300 | So what did they do?
06:08:24.400 | Well, one answer is they just submit to the Spanish state
06:08:27.920 | and they just speak Spanish and their culture is erased
06:08:30.220 | and their history is erased and so on.
06:08:32.420 | The second is they do some sort of Ireland-like insurgency,
06:08:36.180 | the troubles to try to get a thing of their own,
06:08:38.240 | which is obviously bad for other kinds of reasons, right?
06:08:40.820 | You know, violent, et cetera.
06:08:42.260 | What this Catalonian nationalist said, he's like,
06:08:44.820 | "Look, while we can't give up on our existing path,
06:08:47.740 | "the network state is a really interesting third option."
06:08:50.280 | I mean, by the way, I hadn't talked to this guy, V. Partal,
06:08:53.820 | okay, and he's got this site called ViaWeb,
06:08:56.620 | and, or V-I-L-A, VilaWeb, sorry.
06:08:59.800 | It can be, meaning the network state
06:09:01.080 | can be especially appealing to us.
06:09:02.260 | Catalans are now embarking on the task
06:09:04.000 | of having a normal and current state in the old way,
06:09:06.160 | and this is a project that we cannot give up.
06:09:07.980 | But this does not mean that at the same time
06:09:09.680 | we are not also attentive to ideas like this
06:09:11.680 | and we do not try to learn and move forward, right?
06:09:14.000 | Meaning, you know, the network state, right?
06:09:16.280 | Because that's the third way, which says,
06:09:18.060 | okay, maybe this particular region
06:09:20.300 | is not something where you're gonna be able to get,
06:09:24.440 | you know, a state.
06:09:25.920 | But just like there's more Irish people
06:09:27.940 | who live outside of Ireland, right?
06:09:29.980 | Just like, you know, the Jewish people, you know,
06:09:32.620 | didn't actually get a state in Poland or what have you,
06:09:35.440 | they had one in Palestine.
06:09:37.340 | Perhaps the Catalonians could crowdfund territory
06:09:40.340 | in other places and have essentially a state of their own
06:09:45.200 | that's distributed, okay?
06:09:46.820 | Now, again, what people are immediately gonna say is,
06:09:49.000 | "Well, that's gonna lead to conflict with the locals
06:09:50.760 | necessarily," and so on and so forth.
06:09:53.020 | But if you're parallel processing,
06:09:55.180 | you don't have the all-in-one bucket aspect of,
06:09:58.860 | I must win here, and the guy on the other side is like,
06:10:01.060 | I must win, you have optionality.
06:10:02.440 | You can have multiple different nodes around the world,
06:10:05.140 | just like there's multiple Chinatowns,
06:10:06.740 | you could have multiple Catalonian towns, right?
06:10:09.540 | And some places you might be able to just buy an island
06:10:12.360 | and that becomes, you know, the New Catalonia, right?
06:10:15.140 | Just like in, I think there's a region called New Caledonia
06:10:19.620 | and that's in the Southwest Pacific.
06:10:22.420 | So maybe New Catalonia is somewhere else, right?
06:10:24.840 | So if you're flexible on that, now, of course,
06:10:26.860 | a bunch of people will immediately say,
06:10:29.060 | they'll have 50 different objections to this.
06:10:30.920 | They'll say, "Oh, you don't get it.
06:10:31.980 | The whole point is the land," and so on.
06:10:34.620 | They've been there for generations.
06:10:36.700 | Say, "I do get it."
06:10:37.940 | But this Catalan nationalist who's like literally written
06:10:42.120 | in Catalonian for, I don't know how many years, right,
06:10:46.380 | is basically saying, "This is worth thinking about."
06:10:51.220 | And so it's a peaceful third way.
06:10:52.580 | - Yeah, but it's interesting.
06:10:53.880 | I mean, it's a good question whether Elon Musk, SpaceX,
06:10:58.300 | and Tesla will be successful without Twitter.
06:11:00.300 | - Yeah, I don't think as successful.
06:11:02.500 | I mean, obviously they existed before Twitter
06:11:05.980 | and a lot of the engineering problems
06:11:08.320 | are obviously non-Twitter things, right?
06:11:09.860 | But Twitter itself has certainly probably helped Musk
06:11:11.700 | with Tesla sales.
06:11:12.700 | - The engineering, no, that's not what I mean.
06:11:15.460 | - Oh, go ahead.
06:11:16.300 | - The best people in the world
06:11:17.620 | solve the engineering problems.
06:11:19.180 | - Yes, but he hires the people to solve them
06:11:20.660 | and he knows enough about engineering to hire those people.
06:11:22.500 | - That's the point I'm making.
06:11:24.440 | On Twitter, the legend of Elon Musk is created.
06:11:28.680 | The vision is communicated
06:11:32.720 | and the best engineers in the world
06:11:35.040 | come to work for the vision.
06:11:37.280 | It's an advertisement of a man of a company
06:11:40.560 | pursuing a vision.
06:11:42.080 | I think Twitter is a great place to make viral ideas
06:11:47.080 | that are compelling to people, whatever those ideas are,
06:11:50.680 | and whether that's the network state
06:11:52.820 | or whether that's humans becoming
06:11:54.900 | a multi-planetary species.
06:11:57.180 | - Here is a remark I had just before the pandemic
06:11:59.720 | related to this, okay, about Twitter helping Elon,
06:12:01.860 | just beyond that for a second.
06:12:03.540 | Maybe centralization is actually also underexplored
06:12:06.420 | in the design space.
06:12:08.140 | For example, today's social networks
06:12:09.420 | are essentially governed by a single CEO,
06:12:10.900 | but that CEO is a background figure.
06:12:12.220 | They aren't leading the users to do anything.
06:12:14.060 | What if they did?
06:12:15.280 | One example, could Elon Musk's then 30 million followers
06:12:17.980 | somehow get us to Mars faster?
06:12:20.360 | Tools for directed collaborative work
06:12:22.260 | by really large groups on the internet
06:12:23.620 | are still in their infancy.
06:12:24.460 | You can see pieces of what I was talking about,
06:12:26.340 | the scratch pad thing,
06:12:27.740 | the network state being a group
06:12:29.300 | which can do collective actions.
06:12:30.460 | This is kind of the thing, right?
06:12:31.740 | So technologies for internet collaboration
06:12:33.380 | that can be very useful to the software
06:12:35.400 | for future network states.
06:12:37.260 | Operational transformation,
06:12:38.300 | so like Google Docs coordinates edits.
06:12:40.660 | Conflict-free replicated data types
06:12:42.940 | is another alternative, easier to code in some ways
06:12:44.780 | than operational transformation.
06:12:46.400 | Microtasks like Mechanical Turk, Scale.ai,
06:12:48.580 | and Earn.com before we sold it.
06:12:50.700 | Blockchains and crypto, obviously.
06:12:51.760 | The Polymath project,
06:12:53.120 | where a bunch of people parallel processed
06:12:54.960 | and were able to solve an open math problem
06:12:57.280 | by collaborating.
06:12:58.980 | Wikipedia with its flaws that we talked about.
06:13:01.560 | Social networks and group messaging,
06:13:02.600 | all these are ways for collaborating.
06:13:04.040 | They're not just simply attacking
06:13:05.440 | or doing something on the internet.
06:13:07.080 | This is something that Elon could use, right?
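The CRDT idea mentioned above can be made concrete with a minimal sketch. This is a standard textbook example (a grow-only counter), not anything from the episode's actual tooling; the class and replica setup here are invented for illustration:

```python
# Minimal sketch of a conflict-free replicated data type (CRDT):
# a grow-only counter. Each replica increments only its own slot,
# and merging takes the element-wise max. Because max is commutative,
# associative, and idempotent, replicas converge to the same value no
# matter the order in which they exchange state -- the property that
# makes CRDTs easier to code than operational transformation.

class GCounter:
    def __init__(self, replica_id, num_replicas):
        self.replica_id = replica_id
        self.counts = [0] * num_replicas

    def increment(self):
        # A replica may only bump its own slot.
        self.counts[self.replica_id] += 1

    def merge(self, other):
        # Element-wise max: safe to apply in any order, any number of times.
        self.counts = [max(a, b) for a, b in zip(self.counts, other.counts)]

    def value(self):
        return sum(self.counts)

# Two replicas diverge offline, then sync in either order and agree.
a = GCounter(0, 2)
b = GCounter(1, 2)
a.increment(); a.increment()
b.increment()
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 3
```

Google Docs-style operational transformation instead rewrites each incoming edit against concurrent ones, which is why it is generally considered harder to implement correctly than the merge-function approach above.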
06:13:09.080 | - What works and what doesn't about Twitter?
06:13:11.680 | If there's something that's broken, how would you fix it?
06:13:14.360 | - A million things I can say here.
06:13:15.360 | So a few things.
06:13:16.200 | First is fact-checking.
06:13:17.880 | I had this kind of fun, I thought it was a funny tweet.
06:13:20.180 | To anyone who wants to quote,
06:13:21.240 | "Ban lying on social media,
06:13:23.040 | "please write down a function that takes in a statement
06:13:25.000 | "and returns whether it is true.
06:13:26.560 | "If you can start with the Riemann hypothesis,"
06:13:28.440 | that would be amazing.
06:13:29.520 | - Yeah.
06:13:30.360 | - Okay?
06:13:31.180 | - Yeah, we'll put--
06:13:32.020 | - That's kind of funny, right?
06:13:32.840 | - That's funny.
06:13:33.680 | - And so now the thing is--
06:13:34.840 | - That joke landed on like five people.
06:13:38.600 | - Sure, you wanna explain the joke?
06:13:40.480 | - Well, no, there's a lot of problems,
06:13:42.400 | decidability where the truth,
06:13:45.440 | that's what proofs in math is.
06:13:47.800 | The truth of the thing is actually
06:13:49.320 | exceptionally difficult to determine.
06:13:51.400 | And that's just a really nice example.
06:13:53.600 | - Right.
06:13:54.440 | - The problems that persist across centuries
06:13:57.040 | that have not been solved by the most brilliant minds,
06:14:00.040 | they're essentially true or false problems.
06:14:02.720 | - That's right.
06:14:03.560 | And so when people are saying,
06:14:04.380 | when they were saying they want to ban lying on social media,
06:14:07.320 | fact-check social media,
06:14:08.580 | the assumption is that they know what is true.
06:14:10.320 | And what do they mean by that?
06:14:11.260 | They really mean the assertion of political power, right?
06:14:14.360 | With that said, do I think it could be useful
06:14:17.740 | to have some kind of quote fact-checking thing?
06:14:19.560 | Yes, but it has to be decentralized and open source.
06:14:21.880 | You could imagine an interesting concept
06:14:23.680 | of coding Trugal, like a Google,
06:14:26.680 | that returned what was true.
06:14:27.680 | It's like a modified version.
06:14:28.520 | - Yes, I like it.
06:14:29.360 | - Right?
06:14:30.180 | So like GPT-3,
06:14:31.020 | but the stable diffusion version where it's open.
06:14:33.560 | Okay.
06:14:34.400 | And so now anybody,
06:14:35.220 | stable diffusion shows it is possible
06:14:36.480 | to take an expensive AI model and put it out there.
06:14:38.680 | Right?
06:14:39.520 | So you have, you know what a knowledge graph is?
06:14:41.480 | Like basically, you wouldn't actually,
06:14:46.080 | whether you have it as RDF
06:14:47.420 | or like a triple store kind of thing,
06:14:49.820 | or some other representation,
06:14:50.860 | it's like an ontology of A is a B and, you know,
06:14:55.140 | B has a C and it's got probabilities on the edges sometimes
06:14:58.500 | and other kinds of metadata.
06:14:59.700 | And this allows Google to show certain kinds
06:15:01.980 | of one box information where it's like,
06:15:04.620 | what is Steve Jobs's, you know,
06:15:07.300 | what is Laurene Powell Jobs's age or birthday?
06:15:10.240 | They can pull that up out of the knowledge graph, right?
06:15:13.740 | And so you can imagine that Trugal
06:15:16.680 | would have both deterministic and statistical components.
06:15:19.880 | And crucially, it would say whether something is true
06:15:22.640 | according to a given knowledge graph.
06:15:24.480 | And so this way, at least what you can do is you can say,
06:15:28.720 | okay, here's the things that are consensus reality,
06:15:32.280 | like the value of the gravitational constant
06:15:34.160 | will be the same in the MAGA knowledge graph
06:15:37.520 | and the US establishment knowledge graph
06:15:39.020 | and the CCP knowledge graph and the, I don't know,
06:15:42.640 | the Brazilian knowledge graph and so on and so forth, okay.
06:15:46.020 | But there's other things that will be quite different.
06:15:48.680 | And at least now you can isolate
06:15:51.400 | where the point of disagreement is.
06:15:53.760 | And so you can have a form of decentralized fact-checking
06:15:55.980 | that is like, according to who, well, here is the authority
06:15:59.400 | and it is this knowledge graph, right?
06:16:00.820 | So that's like a kind of thing, right?
06:16:02.520 | - Yeah, yeah.
06:16:03.500 | - So that is, so that's one concept
06:16:06.100 | of like what next social media looks like.
06:16:08.060 | There's actually so much more.
06:16:09.700 | Another huge thing is decentralized social media, okay.
06:16:13.180 | Social media today is like China under communism
06:16:17.040 | in a really key sense.
06:16:18.780 | There's a great article called
06:16:20.260 | "The Secret Document That Transformed China."
06:16:22.240 | Do you know what China was like before 1978?
06:16:24.500 | - I know about the atrocities.
06:16:25.980 | - Sure, so. - But there.
06:16:26.820 | - To put some flesh on the bone, so to speak, okay.
06:16:30.660 | So basically.
06:16:32.100 | - There's a good book I'm reading
06:16:33.260 | because I think a lot of documents became public recently.
06:16:36.580 | And so.
06:16:37.420 | - There was a window when it opened up.
06:16:38.660 | Now it's probably closing back down again, but.
06:16:40.500 | - But great biographies because of that were written.
06:16:43.260 | Like I'm currently reading "Mao's Great Famine"
06:16:45.860 | by Frank Dicotter.
06:16:47.340 | - Yeah.
06:16:48.180 | - Which is, oh boy.
06:16:50.380 | - It's crazy, okay. - Yeah.
06:16:52.940 | - Here's the thing is capitalism was punishable by death
06:16:57.060 | in living memory in China.
06:16:59.780 | Just to explain what that meant, okay.
06:17:01.180 | I mean, that's what communism was, right?
06:17:03.020 | It was literally the same China that has like the CCP,
06:17:06.260 | you know, the entrepreneurs and Jack Ma and so on and so forth.
06:17:08.900 | 40 something years ago, capitalism was punishable by death.
06:17:11.140 | But to give you a concrete example,
06:17:13.540 | this is a famous story in China.
06:17:14.860 | It may be apocryphal,
06:17:15.980 | but it's what the folks have talked about.
06:17:18.220 | There's a village called Xiaogang
06:17:20.180 | and basically all the grain that you produced
06:17:22.860 | was supposed to go to the collective.
06:17:24.380 | And even one straw belonged to the group.
06:17:26.780 | At one meeting with Communist Party officials,
06:17:28.260 | a farmer asked, "What about the teeth in my head?
06:17:30.060 | "Do I own those?"
06:17:30.900 | Answer, "No, your teeth belong to the collective."
06:17:33.540 | Okay.
06:17:34.380 | Now, the thing is that when you're taking 100%
06:17:38.940 | of everything, okay, work hard, don't work hard,
06:17:41.260 | everyone gets the same,
06:17:42.100 | so people don't wanna work, right?
06:17:43.540 | So what happened?
06:17:45.060 | These farmers gathered in secret
06:17:47.300 | and they did something that was like,
06:17:49.540 | would have gotten them executed.
06:17:50.620 | They wrote a contract amongst themselves and said,
06:17:54.620 | "We all agree that we will be able to keep
06:17:56.700 | "some of our own grain.
06:17:58.500 | "We will give some of them to the regime
06:18:00.440 | "so when it comes to collect grain, they've got something.
06:18:02.620 | "We'll be able to keep some of it."
06:18:04.180 | And if any of us are killed for doing this,
06:18:06.820 | then the contract said that the others
06:18:08.340 | would take care of their children.
06:18:10.020 | Okay?
06:18:11.900 | To keep some of what you earned,
06:18:13.380 | I mean, just think about how-
06:18:14.220 | - They formed a mini capitalism society
06:18:16.940 | within the Communist, a secret capitalism society
06:18:20.220 | amongst five people.
06:18:22.580 | - Right, so now that they could keep some
06:18:24.180 | of what they earned, right?
06:18:25.980 | Keep some of what they earned,
06:18:27.340 | they had a bumper harvest.
06:18:28.860 | And you know what happened with that bumper harvest?
06:18:30.780 | That made the local officials really suspicious and mad.
06:18:34.740 | They weren't happy that there was a bumper harvest.
06:18:36.140 | They're like, "What are you doing?
06:18:37.260 | "You're doing capitalism?"
06:18:38.780 | - Yeah. - Right?
06:18:39.740 | And a few years earlier, they might've just been executed.
06:18:43.260 | And in fact, many were.
06:18:44.100 | That's what it means when you see millions dead.
06:18:46.660 | Millions dead means guys were shot
06:18:48.900 | for keeping some grain for themselves.
06:18:51.020 | Okay, it means like guys came and kicked in the door
06:18:53.260 | of your collective farm and raped your wife
06:18:56.060 | and took you off to a prison camp and so on and so forth.
06:18:58.300 | That's what communism actually was, okay?
06:19:00.300 | It hasn't been depicted in movies.
06:19:01.860 | There's a great post by Ken Billingsley in the year 2000
06:19:05.580 | called, if I get this right, "Hollywood's Missing Movies."
06:19:09.700 | Okay, this is basically here.
06:19:13.420 | I'll paste this link so you can put it in the show notes.
06:19:15.020 | All right, this is worth reading.
06:19:16.340 | It's still applicable today,
06:19:18.100 | but now that we have stable diffusion,
06:19:19.900 | now we have all these people online,
06:19:21.260 | now that Russia and China are America's national bad guys,
06:19:25.400 | as they were before, they are again,
06:19:29.140 | perhaps we'll get some movies on what communism
06:19:30.980 | actually was during the 20th century and how bad it was,
06:19:32.940 | right, and vaccinate people against that
06:19:36.100 | as well as against Nazism, which they should be, okay?
06:19:39.100 | The point of this, go ahead.
06:19:41.120 | - No, 'cause I'm congratulating myself on the nice
06:19:46.100 | because you're sending me excellent links on WhatsApp
06:19:49.780 | and I just saw that there's an export chat feature.
06:19:52.940 | - Yes, great.
06:19:53.780 | - Because we also have disappearing messages on,
06:19:55.780 | so I was like, all right, this is great.
06:19:58.020 | - Great.
06:19:58.860 | - I get it.
06:20:00.180 | Your ability to reference sources is incredible,
06:20:03.740 | so thank you for this.
06:20:04.980 | - This is, otherwise, if I say something,
06:20:07.260 | it sounds too surprising,
06:20:08.420 | so that's why I wanna make sure I have--
06:20:09.580 | - Just on this topic.
06:20:10.580 | - Yeah, so like, yeah, I mean,
06:20:12.220 | people would be like shot for holding some grain.
06:20:15.160 | So what happened though was Deng Xiaoping said,
06:20:19.780 | okay, we're not gonna kill you.
06:20:21.860 | In fact, we're gonna actually set up
06:20:23.820 | the first special economic zone in Shenzhen.
06:20:25.660 | He didn't try to flip the whole country
06:20:27.280 | from communist to capitalist in one go.
06:20:29.620 | Instead, he's like, we can reform in one place.
06:20:32.620 | And in fact, he fenced it off from the rest of China
06:20:35.300 | and it did trade with Hong Kong
06:20:37.380 | and he spent his political capital on this one exception.
06:20:39.640 | It grew so fast, they gave him more political capital.
06:20:42.900 | Some people think actually that the Sino-Vietnamese War
06:20:47.260 | was Deng's way of just distracting the generals
06:20:52.300 | while he was turning China around
06:20:54.040 | to get it back on the capitalist road.
06:20:56.060 | And what he did was the opposite of a rebranding.
06:21:00.940 | He did a reinterpretation.
06:21:02.020 | Like a rebranding is where the substance is the same,
06:21:05.100 | but the logo is changed.
06:21:06.660 | You're now, you were Facebook, you're now meta.
06:21:09.940 | That's a rebranding, right?
06:21:11.780 | Reinterpretation is where the logo
06:21:13.820 | and the branding is the same.
06:21:14.860 | They're still the CCP,
06:21:15.900 | they're still the Chinese Communist Party,
06:21:17.980 | but they're capitalist now, the engine under the hood.
06:21:19.820 | It's deniable, okay?
06:21:21.980 | And this is a very common,
06:21:23.220 | once you realize those are different things,
06:21:24.500 | it's like swap the front end, swap the back end.
06:21:26.940 | - Yeah.
06:21:28.060 | - Go ahead. - Good way to put it.
06:21:29.140 | - Right? - It's really good.
06:21:29.980 | Yeah, yeah, really.
06:21:30.820 | (laughing)
06:21:33.100 | I'm enjoying your metaphors
06:21:34.980 | and way of talking about stuff.
06:21:36.420 | Yeah, so I get, yeah, yeah.
06:21:37.660 | Swap, you could, yeah.
06:21:39.060 | Rebranding is swap the front end,
06:21:40.940 | reinterpretation is swap the back end.
06:21:42.460 | - That's right.
06:21:43.300 | Once you realize that, you're like, okay,
06:21:45.620 | I can just like as an engineer, you can kind of,
06:21:48.340 | okay, sometimes I wanna do this on the front end,
06:21:49.900 | sometimes I wanna do it on the back end,
06:21:50.900 | sometimes it's explicit,
06:21:52.380 | and sometimes the user doesn't need to see it
06:21:53.780 | and it's on the back end.
06:21:54.980 | Lots of political stuff, you know,
06:21:57.460 | is arguably not just best done on the back end,
06:22:00.900 | but always done on the back end.
06:22:02.260 | One of the points I make in the book is,
06:22:04.060 | left is the new right is the new left,
06:22:05.940 | is, you know, if you look through history,
06:22:09.300 | the Christian King, the Republican conservative,
06:22:12.580 | the CCP entrepreneur, the WASP establishment,
06:22:17.580 | these are all examples of a revolutionary left movement
06:22:22.700 | becoming the ruling class right.
06:22:26.500 | Okay, like the Republican conservative,
06:22:28.020 | just as that one example,
06:22:29.060 | I go through extended description of this in the book,
06:22:31.300 | but the Republicans were the radical Republicans,
06:22:34.140 | the left of 1865, they won the revolution
06:22:37.180 | and their moral authority led them to have economic authority
06:22:41.100 | in the late 1800s.
06:22:41.940 | You wouldn't want a Democrat Confederate traitor
06:22:45.380 | as the head of your, you know,
06:22:47.220 | railroad company, would you, right?
06:22:48.380 | So all the Confederate traitors were boxed out
06:22:51.700 | from the plum positions in the late 1800s.
06:22:55.300 | And so what happened was the Republicans
06:22:57.580 | turned their moral authority into economic authority,
06:22:59.540 | made tons of money.
06:23:00.940 | The Democrats then started repositioning,
06:23:02.780 | not as a party of the Southern racists, but the poor, right?
06:23:07.100 | And, you know, the Cross of Gold speech
06:23:09.260 | by William Jennings Bryan was part of that.
06:23:10.620 | There's a gradual process that reaches,
06:23:13.260 | let's say, a crucial mark
06:23:15.900 | with the election of FDR,
06:23:17.700 | where it was actually not the 1932, but 1936 election
06:23:20.420 | that black voters switched over to FDR, okay?
06:23:23.980 | That was actually the, like the major flip to like 70%,
06:23:27.500 | you know, to the Democrats.
06:23:29.860 | Now they'd repositioned as a party of the poor,
06:23:31.660 | not the party of the South, okay?
06:23:33.740 | And Republicans had lost some economic authority,
06:23:38.740 | or rather they had moral authority,
06:23:40.140 | they turned into economic authority.
06:23:41.780 | They started to lose some moral authority.
06:23:43.700 | The loss of moral authority was complete by 1965.
06:23:45.980 | That was actually a mop-up.
06:23:47.020 | People date, you know, the civil rights movement
06:23:48.980 | as the big way where the Republicans lost moral authority.
06:23:51.140 | It's not really, that was a mop-up because 1936,
06:23:54.180 | 30 years earlier was when black voters
06:23:55.620 | switched to the Democrats, okay?
06:23:57.300 | So 1965 was another 10 points moving over
06:23:59.700 | of black voters to Democrats.
06:24:01.620 | Republicans had completely lost moral authority
06:24:03.580 | 100 years after the civil war, okay?
06:24:05.860 | Then the next 50 years, that loss of moral authority
06:24:08.980 | meant that they lost economic authority,
06:24:11.060 | 'cause now you wouldn't want a Republican bigot
06:24:13.380 | as a CEO of your tech company anymore, would you?
06:24:15.260 | Right?
06:24:16.100 | So by 2015, now you have,
06:24:18.980 | it's like two sine waves that are staggered, right?
06:24:21.860 | Moral authority leads to economic authority,
06:24:23.780 | leads to loss of moral authority,
06:24:26.940 | leads to loss of economic authority.
06:24:28.700 | And so now you have the Democrats, you know,
06:24:32.140 | have completed a 155-year arc from the defeated party
06:24:36.940 | in the civil war to the dominant party
06:24:40.180 | in the US establishment.
06:24:41.060 | All the woke capitalists are now at the very top.
06:24:44.340 | And now the same repositioning is happening
06:24:46.340 | where if you're so woke, why are you rich?
06:24:49.460 | You get it, right?
06:24:50.300 | Like, you know, if you're so smart,
06:24:51.140 | why aren't you rich is the normal kind of thing, right?
06:24:53.260 | If you're so woke, if you're so holy,
06:24:55.340 | why is like, for example, the BLM founder,
06:24:56.940 | why do they have this million dollar mansion, right?
06:24:58.620 | If you're so woke and it's all about being good
06:25:00.660 | and you're anti-capitalist,
06:25:01.500 | how come you seem to be raking in the money, et cetera,
06:25:03.620 | right?
06:25:04.700 | This is an argument which I'm not sure how long it will go.
06:25:09.460 | It might take years to play out,
06:25:11.340 | it might take decades to play out.
06:25:12.980 | I think probably on the order of a decade.
06:25:14.740 | You're gonna see, in my view, the repositioning.
06:25:17.060 | If the Democrats are the woke capitalists,
06:25:19.340 | the Republicans will eventually become,
06:25:21.420 | are becoming the Bitcoin maximalists.
06:25:24.260 | Because, you know, if one guy picks left,
06:25:26.460 | the other guy picks right.
06:25:27.300 | It's literally like magnets kind of repelling.
06:25:28.980 | They're sort of forced into the other corner here, right?
06:25:31.460 | And the Bitcoin maximalists will essentially,
06:25:34.140 | where this guy says centralization,
06:25:35.500 | they say decentralization,
06:25:36.860 | where they defend the right of capital to do anything,
06:25:39.700 | the maximalists will say,
06:25:40.780 | actually, you're all Cantillionaires,
06:25:42.780 | you're all benefiting from printed money,
06:25:44.620 | you don't have anything that's legitimate,
06:25:46.660 | you don't actually own anything,
06:25:47.820 | it's all a handout from the government,
06:25:49.180 | and so on and so forth, right?
06:25:50.660 | And so that's a counter positioning
06:25:52.980 | that will basically attack the wokes
06:25:56.780 | by how much money they're making.
06:25:58.100 | They're not contesting the ideology.
06:25:59.900 | So when one guy signals economics, you signal culture.
06:26:04.020 | When the other guy signals culture, you signal economics.
06:26:07.220 | That's actually, that's a whole thing I can talk about.
06:26:08.820 | Should I talk about that for a second?
06:26:09.740 | - Sure.
06:26:11.100 | - Is this integrated into the forces that you talk about?
06:26:15.660 | You've talked about the three forces,
06:26:18.660 | the trifecta of forces that affect our society,
06:26:21.740 | which is the wokes, let's say--
06:26:25.020 | - Woke capital, communist capital, crypto capital.
06:26:27.180 | - You talk so fast, and I think so slow.
06:26:31.940 | - No, no, no.
06:26:32.900 | - Woke capital, communist capital, and crypto capital.
06:26:37.900 | Can you explain each of those three?
06:26:40.460 | We actually talked about each of the three in part,
06:26:43.860 | but it'd be nice to bring them together
06:26:45.820 | in a beautiful triangle.
06:26:46.980 | - Then I will also come back up,
06:26:48.380 | and I'll talk about how the CCP story relates
06:26:51.020 | to social media and decentralized social media, okay?
06:26:53.260 | All right, so NYT-CCP-BTC is woke capital,
06:26:58.260 | communist capital, crypto capital.
06:27:00.060 | And communist capital is, the simplest is, you must submit.
06:27:03.780 | The Communist Party is powerful, CCP is powerful,
06:27:06.180 | and you are not.
06:27:07.020 | If you're in China, you just submit.
06:27:08.060 | - CCP is an embodiment of communist capital
06:27:12.460 | that you're talking about.
06:27:13.300 | - Well, yeah, so basically, and by the,
06:27:14.580 | in China, they call it CPC, you know?
06:27:16.460 | So basically, they don't like it usually if you say CCP,
06:27:18.900 | right, so the Communist Party of China,
06:27:21.580 | as opposed to the Chinese Communist Party.
06:27:23.020 | Basically, that is capitalism,
06:27:25.700 | that is a Chinese pool of capital,
06:27:28.500 | that billion-person pool, okay?
06:27:30.700 | That's WeChat, and it's Alibaba,
06:27:35.460 | and it's that entire kind of thing.
06:27:37.060 | That is one just social network with currency.
06:27:39.820 | The whole thing's vertically integrated.
06:27:40.660 | - When we say communist, what do you mean here?
06:27:43.300 | Why is the word communist important?
06:27:44.980 | Why don't you just say China?
06:27:46.740 | So is communist an important word?
06:27:48.500 | - It just, well--
06:27:50.100 | - Or is it just a catchy label?
06:27:51.540 | - It's a catchy label, but I think it's also important
06:27:53.300 | because it seems, it's paradoxical, right?
06:27:56.300 | So I had a thread on this.
06:27:58.060 | The future is communist capital versus woke capital
06:28:00.100 | versus crypto capital.
06:28:01.500 | Each represents a left-right fusion that's bizarre
06:28:03.500 | by the standards of the 1980s consensus.
06:28:05.500 | It's PRC versus MMT versus BTC, all right?
06:28:08.140 | And why is it bizarre by the standards of 1980s consensus?
06:28:10.620 | Well, in the 1980s, you wouldn't think the communists
06:28:13.420 | would become capitalists, but they did.
06:28:15.780 | You wouldn't think that the wokes, the progressives, right,
06:28:19.660 | would become so enamored with giant corporations
06:28:24.020 | and their power, right?
06:28:25.100 | They've seen something to like in that, right?
06:28:27.420 | And you also wouldn't think that the non-Americans
06:28:32.660 | or the post-Americans or the internationalists
06:28:35.660 | would be the champions of capital
06:28:37.060 | because you'd think it's the American nation, right?
06:28:41.020 | So rather than the conservative American nationalists
06:28:45.300 | being the defenders of capital,
06:28:48.340 | you have the liberal Americans who are with capital,
06:28:53.340 | you have the communist Chinese who are with capital,
06:28:56.740 | and you have the internationalists who are with capital.
06:28:59.540 | And it's the conservative American nationalists
06:29:01.220 | who are in some ways against that,
06:29:03.220 | which is kind of funny, right?
06:29:04.500 | So it's like this weird ideological flippening
06:29:07.300 | that if you take the long lens,
06:29:10.300 | you have these poles that kind of repel each other, okay?
06:29:12.660 | So just on the CCP, NYT, BTC thing.
06:29:16.020 | - NYT, by the way, is woke capital.
06:29:18.260 | - Yeah, what is NYT?
06:29:19.180 | So its formula is a little interesting.
06:29:21.400 | If CCP is just, you must submit because they're powerful,
06:29:24.140 | okay, and then you bow your head
06:29:25.740 | because the Chinese Communist Party is strong.
06:29:27.580 | Woke capital is you must sympathize.
06:29:30.580 | Why do you bow your head, Lex?
06:29:31.940 | Oh, because you're a white male.
06:29:34.400 | Therefore, you're guilty.
06:29:35.380 | You must bow your head because you are powerful.
06:29:38.120 | Yet notice it ends in the same place,
06:29:41.020 | in your head looking to the ground, right?
06:29:44.700 | In China, it's because they're powerful,
06:29:47.540 | so therefore you must bend your head.
06:29:49.620 | For the wokes, it's the left-handed version
06:29:52.300 | where you are powerful and it's shameful,
06:29:54.940 | so you should bow your head, right?
06:29:57.900 | - Right. - Okay?
06:29:58.900 | But it ends in your head bowed.
06:30:00.940 | It's an ideology of submission.
06:30:02.340 | It's not that subtle, but it's somewhat subtle.
06:30:04.660 | And then finally, crypto capital is head held high.
06:30:06.920 | You must be sovereign, okay?
06:30:09.140 | And one of the things I point out in the book
06:31:12.100 | is each of these poles is negative in some ways
06:30:17.100 | when taken to extreme, but also negative in its opposite.
06:30:20.260 | For example, obviously just totally submitting
06:30:22.960 | to total surveillance is bad,
06:30:24.700 | but a society where nobody submits is San Francisco
06:30:27.160 | where people just rob stores and walk out
06:30:29.660 | in the middle shoplifting all these goods
06:30:33.500 | and nothing happens, right?
06:30:35.060 | A society where you have the woke level of sympathy
06:30:37.700 | where you get to the kind of insanity
06:30:39.580 | of math is white supremacist
06:30:41.320 | and whatever nonsense is happening today is terrible,
06:30:45.220 | but a society that's totally stripped of sympathy
06:30:47.860 | is also not one that one would wanna be part of, right?
06:30:50.380 | That's just like the,
06:30:51.380 | whether it's 4chan's actual culture or its feigned culture
06:30:56.460 | or something like that or some weird combination,
06:30:59.180 | that's also not good.
06:31:00.020 | It's like Russia in the '90s,
06:31:00.960 | like nobody trusts anybody, that's also bad.
06:31:03.520 | And being totally sovereign, that sounds good.
06:31:06.460 | And there's a lot that is good about it.
06:31:07.740 | I'm sympathetic to this corner,
06:31:09.020 | but being totally sovereign, you go so capitalist,
06:31:11.780 | so sovereign that you're against the division of labor.
06:31:13.600 | You don't trust anybody.
06:31:14.660 | So you have to pump your own water and so on.
06:31:16.220 | So you actually have a reduced standard
06:31:17.380 | of living over here, okay?
06:31:19.340 | And conversely-
06:31:20.180 | - Like survivalist or whatever.
06:31:21.540 | - Survivalist type of stuff, right?
06:31:23.200 | And you just go kind of too crazy into that corner.
06:31:27.400 | And then of course, though, the other extreme
06:31:29.320 | of having no sovereignty is the,
06:31:32.480 | you will own nothing and be happy.
06:31:33.760 | Everything's in the cloud
06:31:34.680 | and can be deleted at any point, right?
06:31:36.420 | So each of these kind of has badness when it's there,
06:31:39.380 | but also it's total extreme opposite is bad.
06:31:41.520 | And so you wanna kind of carve out
06:31:43.040 | like an intelligent intermediate of these three poles,
06:31:46.560 | and that's the decentralized center
06:31:48.800 | or the recentralized center, I call it.
06:31:50.340 | Now, with that said,
06:31:52.000 | I think there is a repositioning in particular
06:31:53.560 | of woke capital that is happening.
06:31:54.880 | And I think if the 2000s was the global war on terror,
06:31:59.000 | and then the channel just changed to wokeness in the 2010s,
06:32:01.960 | and when I mean channel change,
06:32:03.400 | have you seen Paul Graham's graph
06:32:05.560 | or actually David Rozado's graph that Paul Graham posted?
06:32:08.580 | - No, but this is a good chance to say
06:32:10.920 | that Paul Graham is awesome.
06:32:12.460 | - Okay, yeah.
06:32:13.300 | And so here is this graph, okay?
06:32:14.800 | David Rozado's data analysis, I think,
06:32:17.040 | that put this together.
06:32:18.080 | So basically, this is a graph of the word usage frequency
06:32:23.080 | in New York Times, 1970 to 2018.
06:32:25.480 | And he's got some controls there.
06:32:27.360 | - Paul Graham tweets, "Hypothesis.
06:32:29.520 | "Although some newspapers can survive
06:32:31.560 | "the switch to online subscriptions,
06:32:33.220 | "none can do it and remain politically neutral,
06:32:36.240 | "quote, newspaper or record.
06:32:38.140 | "You have to pick a side to get people to subscribe."
06:32:40.960 | And there's a bunch of plots.
06:32:46.080 | Plots.
06:32:47.200 | On the x-axis is years,
06:32:50.640 | on the y-axis is the frequency of use.
06:32:54.440 | And sexism has been going up,
06:32:56.400 | misogyny has been going up,
06:32:58.080 | sexist, patriarchy, mansplaining,
06:33:01.760 | toxic masculinity, male privilege.
06:33:04.080 | All these terms have been going up
06:33:06.440 | very intensely in the past decade.
06:33:10.920 | - Yeah, but really, 2013 is the exact moment.
06:33:13.680 | You see these things,
06:33:14.640 | they're flat and then just go vertical.
06:33:16.680 | Mansplaining, toxic masculinity.
06:33:18.600 | - What precisely happened in 2013?
06:33:20.480 | - Ah, so I talk about this in the book,
06:33:23.240 | but I think fundamentally what happened was
06:33:25.680 | tech hurt media and their revenue dropped
06:33:29.240 | by about $50 billion over the four years from '08 to 2012.
06:33:33.740 | Tech helped Obama get reelected
06:33:37.000 | and media was positive on tech until December 2012.
06:33:40.560 | They wrote "The Nerds Go Marching In" in The Atlantic.
06:33:43.360 | Then after January 2013, once Obama was ensconced,
06:33:48.360 | then the knives came out
06:33:49.800 | because basically these tech guys were bankrupting them.
06:33:52.360 | They were through supporting them.
06:33:55.240 | And so the journos got extremely nasty
06:33:58.440 | and just basically they couldn't build search engines
06:34:02.480 | or create social networks,
06:34:03.840 | but they could write stories and shape narratives.
06:34:05.160 | So a clear editorial direction went down
06:34:07.960 | that essentially took all of this,
06:34:11.440 | all these weapons that had been developed in academia
06:34:14.040 | to win status competitions in humanities departments.
06:34:17.180 | And then they just deployed them.
06:34:20.200 | And essentially somebody observed that
06:34:22.320 | wokeness is the combination of
06:34:23.920 | Foucauldian deconstruction and civil rights,
06:34:27.000 | where deconstruction takes away
06:34:29.280 | the legitimacy of the old order.
06:34:30.480 | And then civil rights says,
06:34:31.320 | okay, the only thing that's good is this,
06:34:33.240 | which says the old order is also bad in a different way,
06:34:35.640 | but this is what's good.
06:34:36.960 | And that is the underpinning ideology.
06:34:40.720 | All these words have embedded in them an ideology.
06:34:45.360 | Another way of thinking about it is,
06:34:47.520 | this is not my reference, but I'll cite it anyway,
06:34:49.520 | the glossary of the Greek military junta.
06:34:53.440 | The creation and/or use of special terms
06:34:55.240 | are employed by the junta as propaganda tools
06:34:58.200 | because essentially the word itself embeds a concept.
06:35:00.920 | You can Russell conjugate something one way or the other.
06:35:03.080 | Russell conjugation is this concept that
06:35:04.960 | I sweat, you perspire, but she glows.
06:35:07.160 | You can always take something.
06:35:09.040 | You are uncontrollably angry,
06:35:11.880 | but he is righteously indignant.
06:35:14.160 | You have a thin skin, they clap back.
06:35:18.880 | So once you kind of realize that these words
06:35:22.600 | have just been chosen in such a way
06:35:24.080 | as to delegitimize their target,
06:35:26.440 | and they all went vertical in 2013,
06:35:28.720 | and they were suddenly targeted
06:35:29.960 | against their erstwhile allies in tech,
06:35:32.760 | but also just across the country,
06:35:34.960 | you can see that this great awokening,
06:36:37.080 | that's what Yglesias called it,
06:36:39.000 | playing on words with the Great Awakening,
06:35:40.920 | this kind of spasm of quasi-religious extremism.
06:35:43.840 | I wouldn't call it religious because it's not God-centered.
06:35:46.760 | It's really state and network-centered.
06:35:48.560 | So I call it a doctrine,
06:35:49.520 | which is a superset of religion and political doctrine.
06:35:52.360 | These words went vertical and all the terrorism stuff
06:35:57.800 | you just noticed kind of fell off a cliff.
06:35:59.680 | That was the obsession of everything in the 2000s
06:36:02.880 | and just channel change.
06:36:04.360 | It's amazing how that happened.
06:36:07.040 | It's not like any of the pieces got picked up.
06:36:09.720 | Some of those wars are still raging, of course.
06:36:11.800 | - And there's victims to this wokeism movement.
06:36:16.800 | - But in a weird way, even though some parts of it,
06:36:20.200 | just like there's wars in the Middle East
06:36:22.560 | that still keep raging,
06:36:23.520 | there's certainly active fronts of wokeism,
06:36:25.840 | but in a sense, the next shift is already on.
06:36:29.120 | You know why?
06:36:30.640 | It's a pivot from wokeism to statism.
06:36:33.320 | In many ways, NYT is sort of,
06:36:35.000 | and more generally the US establishment
06:36:36.680 | is sort of kind of coming,
06:36:38.360 | you may not believe this,
06:36:39.200 | they're kind of coming back to the center a little bit.
06:36:41.520 | In the same way that Lenin, after the revolution,
06:36:44.840 | implemented the new economic policy,
06:36:46.440 | which you may be aware of, right?
06:36:47.680 | Which was just like X percent more capitalism.
06:36:50.040 | He kind of boot on the neck, take control,
06:36:52.840 | but then ease up for a bit.
06:36:54.360 | And the so-called NEPmen during the '20s
06:36:57.320 | were able to eke out something.
06:36:58.680 | There was like, oh, okay, fine.
06:37:00.120 | He's gonna be easier on us.
06:37:01.760 | Then it intensified again,
06:37:02.840 | because basically by loosening up,
06:37:04.560 | they were able to consolidate control.
06:37:05.840 | They weren't putting as much pressure on, right?
06:37:07.920 | Then it went extremely intense again, right?
06:37:11.240 | Similar to like Mao's like a hundred flowers thing,
06:37:13.680 | let a hundred flowers bloom.
06:37:14.880 | And you know, everybody came out
06:37:16.240 | and then he found out all the people who were against him
06:37:17.720 | and he executed a bunch of them, right?
06:37:19.400 | So what's happening now is NYT is,
06:37:23.200 | and more generally the US establishment
06:37:24.440 | is somewhat tacking back to the center,
06:37:26.800 | where, you know, they're not talking BLM
06:37:29.920 | and abolish the police.
06:37:31.280 | They're saying fund the Capitol police, right?
06:37:34.240 | They've gone from the narrative of 2020,
06:37:36.440 | which was meant to win a domestic contest,
06:37:38.640 | where they said, America is a systemically racist country,
06:37:40.720 | tear down George Washington, we're so evil,
06:37:43.600 | to the rhetoric of 2022,
06:37:45.320 | which is we're the global champion of democracy
06:37:47.240 | and every non-white country is supposed to trust us.
06:37:49.600 | Now, obviously those are inconsistent, right?
06:37:54.640 | If you're in India or you're in Nigeria,
06:37:56.680 | and you just heard that America's calling itself
06:37:59.040 | the same guys, by the way,
06:38:00.360 | saying it's so institutionally racist, systemically racist.
06:38:02.800 | And now you're saying, well,
06:38:03.640 | we're the leader of the free world and the number one.
06:38:05.680 | Obviously there's an inconsistency
06:38:06.840 | between the domestic propaganda
06:38:08.160 | and the foreign propaganda, right?
06:38:10.320 | There's a contrast between abolish the police
06:38:12.120 | and put 2 billion for the Capitol police.
06:38:13.760 | You can reconcile this and you can say,
06:38:16.160 | the US establishment is pro-federal
06:38:18.560 | and anti-local and state.
06:38:20.400 | So abolish the local police,
06:38:22.400 | who tend to be, you know, Republican or rightist,
06:38:25.600 | but fund the FBI, fund the Capitol police,
06:38:27.520 | who tend to be, you know,
06:38:28.440 | just like in the Soviet Union,
06:38:30.600 | is the national things like the KGB, right?
06:38:33.000 | They were for the state,
06:38:36.360 | but there were all these local nationalist,
06:38:38.680 | ethnic insurgencies in like Estonia and other places, right?
06:38:41.840 | So you can reconcile them,
06:38:43.080 | but nevertheless on its face, those are contradictory.
06:38:46.040 | So what are you gonna get, I think?
06:38:48.480 | I think you're gonna get this rotation
06:38:51.160 | where a fair number of the folks
06:38:54.720 | on the sort of authoritarian right
06:38:56.800 | are kind of pulled back into the fold a bit, okay?
06:38:59.520 | These are the cops and the military and whatnot,
06:39:01.880 | some of them, because as this decade progresses,
06:39:05.280 | you're gonna see the signaling on American statism
06:39:07.400 | as opposed to wokeism, okay?
06:39:08.520 | Which is 30 degrees back towards the center, right?
06:39:12.240 | Conversely, on the other side,
06:39:14.240 | you're gonna have the left libertarians
06:39:16.640 | and right libertarians who are signaling crypto
06:39:19.480 | and decentralization and so on, okay?
06:39:22.200 | And so the next one isn't red versus blue,
06:39:25.040 | it's orange versus green.
06:39:26.640 | It's the dollar versus Bitcoin.
06:39:28.280 | And so you have the authoritarians,
06:39:29.720 | the top of the political compass
06:39:30.840 | versus the "libertarians," right?
06:39:33.200 | And here's the visual of that.
06:39:35.580 | So that's why as I wrote the book and after I showed it,
06:39:40.320 | I was like, "You know, I'm already seeing this shift
06:39:43.920 | happening from war on terror to wokeism
06:39:46.720 | to American statism, right?"
06:39:48.960 | And here, just take a look at this visual.
06:39:51.800 | - Interesting.
06:39:52.640 | So the visual is an animation transforming
06:39:57.080 | the left versus right, libertarian versus authoritarian
06:40:02.080 | to the dollar versus Bitcoin and crypto.
06:40:07.640 | - That's right.
06:40:08.480 | And some folks switch sides, right?
06:40:10.240 | 'Cause you have folks like Jack Dorsey
06:40:13.400 | and a lot of the tech founders
06:40:15.480 | in basically the lower left corner, right?
06:40:17.600 | Who were blue but are now gonna become orange or are orange.
06:40:21.080 | And you have folks in the upper right corner
06:40:23.920 | who are going to, at the end of the day,
06:40:26.280 | pick the dollar and the American flag
06:40:28.880 | over the internationalist ideals of cryptocurrency.
06:40:32.680 | - The realigning, as you call it.
06:40:35.000 | Let me ask you, briefly, we do need to get a comment,
06:40:40.000 | your visionary view of things.
06:40:42.780 | We're at a low point in the cryptocurrency space
06:40:47.200 | from a shallow analysis perspective,
06:40:50.200 | or maybe in a deeper sense, if you can enlighten me.
06:40:53.920 | Do you think Bitcoin will rise again?
06:40:57.200 | - Yes.
06:40:58.040 | - Do you think it will go to take on fiat,
06:41:03.040 | to go over a million dollars, to go to these heights?
06:41:07.120 | - I mean, I think it's possible.
06:41:08.640 | And the reason I think it's possible
06:41:09.720 | is I think a lot of things might go to a million dollars
06:41:12.160 | because inflation. - Oh, because of inflation.
06:41:13.560 | - Right.
06:41:14.400 | - Whatever.
06:41:15.240 | - That was an important point, right?
06:41:16.060 | - Yes, it's a very important point, yes.
06:41:17.640 | - Because you're seeing essentially...
06:41:19.800 | (Lex laughing)
06:41:20.760 | - Yes.
06:41:21.600 | - But sort of the choke pointing on energy
06:41:23.760 | is pushing up prices across the board for a lot of things.
06:41:26.720 | China's not doing us any favors with the COVID lockdowns.
06:41:32.320 | Putin's not doing the world any favors
06:41:34.960 | with this giant war.
06:41:36.600 | There's a lot of bad things happening
06:41:38.920 | in the physical world, right?
06:41:40.400 | When China, Russia, and the US are all,
06:41:45.880 | and Europe is, there's folks who are just insane
06:41:48.480 | about degrowth and they're against,
06:41:52.560 | they're pushing for burning coal and wood, right?
06:41:55.320 | So a lot of prices are going up
06:41:57.120 | in a really foundational and fundamental way.
06:41:59.400 | And with that said also,
06:42:01.480 | the dollar is in some ways strengthening
06:42:02.920 | against certain other things
06:42:03.880 | 'cause a lot of other countries are dying harder, right?
06:42:06.720 | And you've got riots in Sri Lanka and riots in Panama
06:42:10.280 | and riots in all these places, right?
06:42:13.560 | So it's very complicated
06:42:15.200 | because you've got multiple different trends
06:42:16.960 | going in the same way.
06:42:17.800 | Your Bitcoin maxis would just say infinity over 21 million.
06:42:22.320 | And so therefore you print all the dollars
06:42:23.880 | with only 21 million Bitcoin, so Bitcoin goes to infinity.
06:42:26.320 | But it can be something where lots of other currencies die
06:42:29.880 | and the dollar is actually exported via stablecoins, okay?
06:42:33.420 | But I do think--
06:42:35.320 | - So it still moves,
06:42:36.160 | fiat still moves somehow into the cryptocurrencies.
06:42:39.200 | - Yeah, yeah, yeah.
06:42:40.040 | I think it's kind of like Microsoft where,
06:42:42.160 | I mean, Windows is still around, right?
06:42:44.120 | Microsoft is still around.
06:42:44.960 | It's still a multi-hundred billion dollar company.
06:42:47.320 | It had-- - He doesn't mean it.
06:42:48.560 | He doesn't mean it, don't worry.
06:42:49.840 | All my machines are Windows
06:42:51.360 | and like still boot, yeah. - Are they?
06:42:52.920 | Okay, okay. - I don't own a single Mac.
06:42:54.760 | - Really?
06:42:55.600 | Okay, you are unusual in that.
06:42:57.160 | - Yeah.
06:42:58.000 | - That's, so at least for our--
06:42:59.480 | - It's not ideology, just convenience.
06:43:02.080 | - Fine, I mean, they actually now, post-Satya,
06:43:04.240 | they do make some good stuff, right?
06:43:05.760 | Like Microsoft Teams is good, right?
06:43:07.840 | - Yeah, there's a lot of incredible stuff.
06:43:09.600 | And, you know, they've done a lot of innovative things, like GitHub.
06:43:13.440 | - Yeah, I mean, well, there's an acquisition,
06:43:14.720 | but still, they give them credit for it.
06:43:16.040 | - The acquisition, the pivoting of vision and motivations
06:43:19.920 | and focus and all that kind of stuff.
06:43:22.320 | So anyway, yes, Microsoft does analogy metaphor
06:43:25.200 | for something?
06:43:26.040 | - Well, yeah, so basically just like,
06:43:27.440 | they didn't need a turnaround,
06:43:28.480 | but they did endure to the present day.
06:43:30.360 | They didn't die from Google Apps.
06:43:31.640 | I mean, for all the massive attacks on them, they didn't die.
06:43:33.800 | They're less powerful, but they make more money, right?
06:43:36.920 | And I think that might be something that,
06:43:40.120 | I mean, our best case scenario is the US establishment
06:43:42.800 | or CCP has more power over fewer people, okay?
06:43:46.280 | - I see.
06:43:47.120 | - And so, but you can't exit.
06:43:49.280 | If you're there, you're kind of knuckling under or whatever,
06:43:52.040 | but you can't exit, right?
06:43:53.560 | And so I mentioned those three poles.
06:43:56.200 | CCP is obviously a billion people, 1.4 aligned
06:44:00.040 | under the digital yuan and so on, right?
06:44:02.480 | NYT is the entire, it's the tech companies,
06:44:06.360 | it's the US dollar, it is the establishment.
06:44:09.080 | And then crypto capital is everybody else.
06:44:11.280 | But I actually think that over time,
06:44:13.560 | that third world is web three this time.
06:44:16.440 | And that's the third pole.
06:44:17.800 | And that's India and it's Israel.
06:44:19.560 | And it's lots of American conservatives
06:44:21.600 | and left libertarians and libertarians.
06:44:24.120 | And it's also lots of Chinese liberals,
06:44:25.880 | all the folks who are trying to get out of China
06:44:28.360 | because it's become so nationalist and crazy
06:44:32.360 | and difficult for capitalism.
06:44:34.360 | And so if you take basically non-establishment Americans
06:44:38.640 | on both left and right, okay?
06:44:40.280 | The bottom two quadrants in the political compass
06:44:43.360 | I talked about, you take the liberal Chinese,
06:44:46.560 | you take the Israelis and the Indians, why?
06:44:49.440 | Because they don't, both of them have a lot of tech talent.
06:44:53.840 | Right, they're the number one and number two demographics
06:44:55.440 | for tech founders.
06:44:56.880 | And they want to, while they are generally sympathetic
06:45:01.880 | to the West, right?
06:45:04.040 | And they have more ties to the West,
06:45:05.720 | they also are more cautious about national interests
06:45:09.560 | rather than just starting fights,
06:45:10.840 | you know, where that's how they would think about it, right?
06:45:12.440 | They just, you know, India thinks of itself
06:45:14.040 | as a poor country,
06:45:14.880 | Israel thinks of itself as a small country.
06:45:16.760 | And so therefore it needs to not just get in every fight
06:45:19.560 | just for the sake of it.
06:45:20.440 | And so they need to maintain a cautious distance with China,
06:45:24.360 | but not like do what Pelosi is doing
06:45:26.640 | and try and start like a big thing, okay?
06:45:28.680 | I think Israel is similar
06:45:30.560 | where it's maintaining diplomatic relations with China.
06:45:32.480 | It's more friendly towards China than the US is.
06:45:35.040 | India and Israel, I think, are two sovereign states
06:45:38.240 | that have a lot of globally mobile tech talent
06:45:40.080 | that obviously have ties to the West
06:45:41.840 | with a large diaspora that are hard to demonize,
06:45:45.640 | you know, in the sense of willing to argue on the internet.
06:45:50.280 | (laughs)
06:45:51.280 | Let's put it like that, in English, right?
06:45:52.760 | It's very important.
06:45:53.960 | And them plus enough Americans plus enough Chinese
06:45:57.600 | can set up another pole that is not for cold war
06:46:02.280 | or military confrontation, but for peace and trade
06:46:05.560 | and freedom and so on and so forth, right?
06:46:08.240 | That's the center as opposed to the, you know,
06:46:12.600 | left of the woke American US establishment
06:46:16.480 | or the right of the ultra-nationalist CCP, right?
06:46:18.920 | That's what I think about.
06:46:20.000 | Now, what I would say here is the reason I think
06:46:23.440 | these are kind of the three poles,
06:46:24.560 | you can argue against this, right?
06:46:25.600 | You can say it's unipolar world.
06:46:27.280 | America's totally dominant.
06:46:28.420 | That's one argument.
06:46:29.400 | You can say it's a bipolar world.
06:46:30.520 | It's just the US versus China.
06:46:31.840 | And everybody else will be forced to align
06:46:34.240 | with one or the other.
06:46:35.360 | Jaishankar, you know, actually explicitly rejected this.
06:46:37.600 | He's like, look, there's a billion people in India.
06:46:39.760 | It's coming up on, it will eventually be like
06:46:41.960 | the number three economy.
06:46:42.880 | It's on the rise.
06:46:44.800 | It's got the history and culture.
06:46:45.920 | He thinks India's entitled
06:46:47.280 | to have its own side, right?
06:46:49.560 | In such a thing.
06:46:50.380 | It's a funny way of putting it, right?
06:46:51.680 | But it's also true.
06:46:53.200 | And so you could say it's unipolar,
06:46:54.800 | you could say it's bipolar,
06:46:55.640 | you could say it's just multipolar.
06:46:56.880 | And everybody is kind of, they're, you know,
06:46:58.640 | India, Israel, these are groups out there.
06:47:00.720 | But I actually think it's gonna be tripolar.
06:47:02.240 | And the reason it's tripolar is these three poles
06:47:05.240 | are the groups that have enough media and money
06:47:08.640 | and scale and whatnot to really kind of
06:47:11.200 | be self-consistent civilizations.
06:47:12.480 | Obviously China's like the vertically integrated,
06:47:14.080 | like Apple or whatever, just like one country.
06:47:14.920 | - Maybe a stable ideology.
06:47:17.200 | - A stable ideology, that's right, right?
06:47:19.300 | Obviously the, you know,
06:47:20.520 | the wokes have control of lots of institutions.
06:47:23.100 | They've got the US establishment,
06:47:24.560 | they've got the tech companies,
06:47:25.400 | they've got the media companies and so on.
06:47:27.300 | But crypto is basically everybody else.
06:47:29.660 | And crucially crypto has inroads in China and America
06:47:33.620 | where it's hard to demonize it as completely foreign
06:47:35.540 | because there's many, many, many huge proponents
06:47:38.020 | of the universalist values of crypto in America and China,
06:47:42.060 | because it is true global rule of law and free speech
06:47:45.460 | and, you know, so on.
06:47:46.760 | It is genuinely universalist in a way where
06:47:49.380 | America can no longer be, you know,
06:47:51.460 | the number one rule of the rules-based order
06:47:52.900 | is America is always number one.
06:47:54.980 | And China doesn't even pretend
06:47:56.340 | to maintain a rules-based order, right?
06:47:58.220 | Whereas for all those countries
06:47:59.860 | that don't wanna either be dominated
06:48:01.100 | by the US media corporations that can,
06:48:02.900 | or social media that can just censor Trump,
06:48:04.700 | nor do they wanna be dominated by China,
06:48:06.420 | this is an attractive alternative platform
06:48:08.220 | they can make their own, right?
06:48:09.660 | So that's where I think, you know,
06:48:10.860 | I wrote an article on this in Foreign Policy on here.
06:48:14.980 | Here's two articles that talk about this a little bit.
06:48:16.980 | It's called Great Protocol Politics.
06:48:19.100 | And then here's another one on the sort of domestic thing,
06:48:25.800 | Bitcoin is Civilization, for Bari Weiss, okay?
06:48:25.800 | But I wanna just come up the stack a little bit
06:48:28.680 | and just return to that original point,
06:48:30.100 | which I diverged on,
06:48:31.060 | which was why, I gave the whole example
06:48:34.380 | of how we got into China
06:48:37.500 | because I talked about how China
06:48:39.300 | had gone from communist to capitalist
06:48:40.700 | and letting people have just a share of what they owned,
06:48:42.780 | right?
06:48:43.620 | With social media,
06:48:44.500 | we're still in kind of the communist era
06:48:46.900 | of social media almost,
06:48:47.740 | where whatever you earn on social media,
06:48:50.420 | like Google takes its cut,
06:48:52.100 | Twitter takes 100%,
06:48:53.260 | you earn nothing for all your tweets or anything like that.
06:48:55.700 | Or you earn a little:
06:48:58.660 | you might get a little rev share on TikTok or YouTube,
06:49:00.980 | and you can do okay, right?
06:49:02.500 | But not only do you earn either nothing or a little bit,
06:49:05.540 | you have no digital property rights,
06:49:07.120 | even more fundamentally.
06:49:08.520 | You are at the,
06:49:10.540 | just the whim of a giant corporation can hit a button
06:49:14.660 | and everything you worked for over years, gone, okay?
06:49:18.460 | That is, even if that is quote,
06:49:19.940 | the current state of events,
06:49:21.300 | the state of affairs rather,
06:49:22.660 | that is not the right balance of power.
06:49:24.980 | To be able to unperson somebody at the touch of a key
06:49:28.780 | and take away everything in the digital world
06:49:30.620 | and we're living more and more in the digital world,
06:49:32.460 | we need to check on that power.
06:49:34.300 | And the check on that power is crypto
06:49:36.740 | and its property rights and its decentralization, right?
06:49:39.420 | Then when I say decentralization,
06:49:41.380 | I mean your money and your digital property
06:49:44.900 | is by default yours.
06:49:47.020 | And there has to be a due process
06:49:49.780 | for someone to take that away from you.
06:49:51.300 | Everything, all work is online.
06:49:53.540 | All your money is online, your presence is online.
06:49:55.700 | That can just be taken away from you
06:49:57.580 | with the press of a key
06:49:59.460 | that just gives bad governments, bad corporations
06:50:02.740 | so much power that that's wrong, right?
06:50:04.740 | That's why I'm a medium and long-term bull on crypto,
06:50:07.500 | simply because it's the check on this thing.
06:50:09.420 | And that if you think about it
06:50:10.740 | in terms of just abstract decentralization is one thing,
06:50:12.620 | but you think about it in terms of property rights,
06:50:14.700 | it's quite another.
06:50:16.060 | And now what that also means is once you have property rights
06:50:21.060 | and you have decentralized social media,
06:50:23.540 | it'll be like the explosion of trade that happened
06:50:27.580 | after China went from communist to capitalist.
06:50:29.460 | Literally billions of people around the world
06:50:31.540 | are no longer giving everything to the collective.
06:50:34.300 | They own the teeth in their head now, finally.
06:50:37.100 | Okay, it's funny, right?
06:50:38.700 | So you're lexfriedman.eth, you own it.
06:50:40.940 | The keys are on your computer.
06:50:42.300 | The bad part is of course they can get hacked
06:50:43.940 | or something like that.
06:50:44.780 | Then you can deal with that with social recovery.
06:50:46.300 | There's ways of securing keys.
06:50:48.180 | But the good part is, ta-da,
06:50:51.140 | you actually have property rights
06:50:52.580 | in the Hernando de Soto sense.
06:50:54.060 | You have something you own, ownership, digital ownership.
06:50:56.980 | It's the cloud is great,
06:50:58.660 | but crypto gives you some of the functionality of the cloud
06:51:02.260 | while also having some of the functionality
06:51:03.780 | of the offline world where you have the keys.
06:51:05.020 | So it's a V3, right?
06:51:06.940 | It's a continuous theme, right?
06:51:08.340 | The V1 was offline, I've got a key, I own it.
06:51:10.660 | I have de facto control.
06:51:12.060 | V2 is the cloud, someone else manages it for me.
06:51:14.020 | It's hosted, I get collaboration and so on.
06:51:16.100 | V3 is the chain where you combine aspects of those, right?
06:51:18.740 | You have the global state of the cloud,
06:51:20.460 | but you have the local permission
06:51:21.740 | and controlling of the private key, okay?
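The "social recovery" idea mentioned above, where losing a device or a key isn't fatal, can be sketched with Shamir secret sharing. This is a toy Python illustration, not any particular wallet's actual scheme; the field size, share counts, and example key are arbitrary demo choices.

```python
import random

# Toy Shamir secret sharing over a prime field: split a key among
# n friends/devices so that any k shares can reconstruct it, while
# fewer than k shares reveal nothing about it.
PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def split(secret: int, n: int, k: int):
    """Return n shares of `secret`; any k of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation of the polynomial
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 0xC0FFEE  # stand-in for a private key
shares = split(key, n=5, k=3)
assert recover(shares[:3]) == key   # any 3 of the 5 shares suffice
assert recover(shares[2:5]) == key
```

Handing out shares like this is safer than a single backup: two shares alone are useless to a thief, but three friends together can restore the key.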
06:51:23.900 | So that's why I'm a medium to long-term ultra bull on crypto
06:51:26.420 | and I've actually, there's a podcast I gave with Asympco
06:51:28.740 | where I talked through how crypto actually
06:51:30.860 | doesn't just go after finance.
06:51:33.260 | So it's gold and it's wire transfers and it's crowdfunding
06:51:36.020 | and it's all finance with DeFi,
06:51:37.740 | but it's actually also search and it's social
06:51:40.460 | and it's messaging.
06:51:41.620 | It's actually even operating systems
06:51:44.100 | and eventually cloud and whatnot.
06:51:46.780 | Do you want me to talk about that briefly?
06:51:48.340 | - Yeah, yeah, if you can briefly say
06:51:49.860 | how broad you see the effect of crypto.
06:51:55.060 | So first, crypto is fundamentally a new way
06:51:57.060 | of building backend systems, right?
06:51:58.900 | So if you think about how big a deal it was to go
06:52:01.300 | from AT&T's corporate Unix to Linux, it's permissionless,
06:52:04.980 | right?
06:52:05.820 | When you went from, as much as I admire a lot of the stuff
06:52:08.100 | that Sam Altman and Greg Brockman have done at OpenAI,
06:52:11.420 | I mean, they're phenomenal in terms of research.
06:52:13.300 | They've pushed the envelope forward.
06:52:14.740 | I give them a ton of credit, right?
06:52:16.500 | Still, it was great to see Stable Diffusion out there,
06:52:19.780 | which was open source AI, right?
06:52:22.980 | And so from a developer, from a power user standpoint,
06:52:26.660 | whenever you have the unlocked version,
06:52:28.260 | like an unlocked cell phone,
06:52:29.140 | it's always gonna be better, right?
06:52:30.580 | So what crypto gives you,
06:52:34.380 | obviously it's every financial thing in the world.
06:52:37.100 | You can do stocks, bonds, et cetera.
06:52:38.980 | It's not, just like the internet wasn't just a channel.
06:52:42.620 | It wasn't like radio and TV and internet.
06:52:44.340 | It was internet radio and internet TV
06:52:46.060 | and internet this and internet that.
06:52:47.560 | Everything was the internet.
06:52:48.460 | All media became the internet.
06:52:49.780 | Crypto is not an asset class.
06:52:50.900 | It's all asset classes.
06:52:52.260 | It's crypto stocks and crypto bonds, et cetera.
06:52:54.260 | In a real sense, like private property
06:52:57.500 | arguably didn't exist in the same way before crypto.
06:53:00.620 | International law didn't exist before crypto.
06:53:02.340 | How are you gonna do a deal between Brazil and Bangladesh?
06:53:06.020 | If a Brazilian company wants to acquire
06:53:08.060 | a Bangladesh company,
06:53:09.180 | they usually have to set up a US adapter in between
06:53:11.260 | because otherwise, what are the tax
06:53:12.740 | or the other obligations between the two?
06:53:14.460 | You set up a US adapter or a Chinese adapter to go between,
06:53:17.260 | but now that Brazilian and Bangladesh can go peer to peer
06:53:20.060 | 'cause they're using blockchain, right?
06:53:21.420 | They can agree on a system of law
06:53:23.100 | that is completely international and that's code.
06:53:27.780 | So each party can diligence it
06:53:29.100 | without speaking Portuguese and Bengali, right?
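The Brazil–Bangladesh point amounts to "law as code": both sides diligence one neutral contract directly instead of routing through a US or Chinese intermediary entity. A minimal escrow sketch in Python illustrates the shape of that; the party names and amounts are made up, and a real version would be a smart contract deployed on chain.

```python
# Toy escrow contract: funds release only when the coded conditions
# hold, so neither party needs to trust the other's legal system.
class Escrow:
    def __init__(self, buyer, seller, price):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.deposited = 0
        self.delivered = False

    def deposit(self, sender, amount):
        # Only the buyer may fund the deal, and only at the agreed price.
        assert sender == self.buyer and amount == self.price
        self.deposited = amount

    def confirm_delivery(self, sender):
        # Only the buyer may confirm the goods arrived.
        assert sender == self.buyer
        self.delivered = True

    def release(self):
        """Pay the seller only if funded AND delivered; else do nothing."""
        if self.deposited == self.price and self.delivered:
            payout, self.deposited = self.deposited, 0
            return {"to": self.seller, "amount": payout}
        return None

deal = Escrow("br_company", "bd_company", 100)
deal.deposit("br_company", 100)
assert deal.release() is None        # funded but not yet delivered
deal.confirm_delivery("br_company")
assert deal.release() == {"to": "bd_company", "amount": 100}
```

Because the rules are code, each side can audit them without a translator or a third-country adapter entity in between.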
06:53:32.340 | So that's why I am a long-term bull on crypto.
06:53:35.620 | I just described the finance case.
06:53:37.680 | Let me go through the others, right?
06:53:39.100 | Social.
06:53:39.940 | So you have the private keys for your ENS.
06:53:43.500 | You have apps like Farcaster.
06:53:45.460 | You basically have decentralized social media
06:53:48.260 | where there's different variants.
06:53:49.700 | Some, you just log in with your crypto username.
06:53:52.660 | Others, the entire social network
06:53:53.980 | and all the likes and posts are on chain like DeSo,
06:53:56.620 | but there's several different versions, right?
06:53:59.060 | Search.
06:53:59.900 | Once you realize block explorers
06:54:04.060 | are an important stealth threat to search,
06:54:06.160 | they're very high traffic sites like blockchain.com
06:54:08.500 | and Etherscan that Google has just totally slept on.
06:54:11.940 | They don't have a block explorer.
06:54:13.340 | You don't have to do anything in terms of trading
06:54:15.320 | or anything like that.
06:54:16.540 | Google does not have a block explorer.
06:54:18.700 | Because they don't think of it as search, but it is search.
06:54:20.580 | It's absolutely search.
06:54:21.660 | It's a very important kind of search engine.
06:54:23.860 | And once you have crypto social,
06:54:26.580 | you see that you're not just indexing
06:54:29.080 | on-chain transactions in a block explorer,
06:54:33.940 | but on-chain communications, okay?
06:54:36.580 | So now you suddenly see, oh, the entire social web
06:54:39.380 | that Google couldn't index.
06:54:40.300 | It could only index the World Wide Web
06:54:42.200 | and not the social web.
06:54:43.340 | Now it's actually the on-chain signed web,
06:54:47.460 | because every post is digitally signed.
06:54:48.900 | It's a new set of signals.
06:54:50.260 | It's way easier to index than either the World Wide Web
06:54:53.040 | or the social web, 'cause it's open and public.
06:54:55.140 | So this is a total disruptive thing to search
06:54:57.140 | in the medium term,
06:54:58.460 | because it's a new kind of data set to index, right?
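The reason a block explorer is "easier to index than the web" is that on-chain data is open, public, and already structured. A toy Python sketch of the indexing step, using hypothetical sample transactions:

```python
from collections import defaultdict

# Hypothetical on-chain transactions -- already structured records,
# unlike web pages that must be crawled and parsed.
chain = [
    {"block": 1, "from": "0xalice", "to": "0xbob",   "value": 5},
    {"block": 2, "from": "0xbob",   "to": "0xcarol", "value": 2},
    {"block": 3, "from": "0xalice", "to": "0xcarol", "value": 1},
]

def build_index(txs):
    """Map each address to every transaction that touches it,
    the core lookup a block explorer serves."""
    index = defaultdict(list)
    for tx in txs:
        index[tx["from"]].append(tx)
        index[tx["to"]].append(tx)
    return index

index = build_index(chain)
assert len(index["0xalice"]) == 2  # two transactions involve 0xalice
```

The same loop works unchanged whether the records are payments or signed posts, which is the "on-chain signed web" point: one open dataset, one index.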
06:55:01.460 | So that's how it's a threat to social, to search.
06:55:04.220 | It is a threat to messaging.
06:55:07.420 | Because, or it's disruptive actually,
06:55:08.900 | because of the ENS name, as I mentioned,
06:55:10.260 | is like a universal identifier.
06:55:11.860 | You can send encrypted messages between people.
06:55:14.000 | That's a better primitive to base it on.
06:55:16.420 | You know, WhatsApp is just claiming
06:55:18.580 | that they're end-to-end encrypted.
06:55:20.100 | But with an ENS name or with a crypto name,
06:55:23.460 | you can be provably, auditably end-to-end encrypted
06:55:26.100 | because you're actually sending it back and forth, right?
06:55:27.820 | 'Cause the private key is local, right?
06:55:29.460 | That itself, given how important that is, right?
06:55:32.220 | You could man-in-the-middle Signal or WhatsApp
06:55:35.180 | because there's a server there, right?
06:55:36.740 | If you have, you know,
06:55:37.580 | so end-to-end encrypted messaging will happen
06:55:39.540 | and with payments and all this other stuff, okay?
06:55:41.220 | So you get the crypto messaging apps.
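The claim above, that two parties can derive a shared secret without trusting the relay server, can be sketched with a plain Diffie–Hellman exchange. This uses demo-sized parameters, not a hardened group; and a real system would additionally authenticate the public values (e.g. by pinning them to an ENS name on chain, as the speaker suggests) to rule out the man-in-the-middle.

```python
import secrets

# Minimal Diffie-Hellman sketch: only public values cross the wire,
# so a relay server never sees the shared secret. Demo parameters.
P = (1 << 127) - 1   # a prime modulus (demo-sized, not production-grade)
G = 3                # generator

a = secrets.randbelow(P - 2) + 1   # Alice's private key, never leaves her device
b = secrets.randbelow(P - 2) + 1   # Bob's private key, never leaves his device

A = pow(G, a, P)   # public values -- these are what could be published
B = pow(G, b, P)   # under each party's on-chain name

# Both ends compute G^(a*b) mod P and arrive at the same secret.
assert pow(B, a, P) == pow(A, b, P)
```

Binding `A` and `B` to on-chain identities is what makes the encryption auditable rather than merely claimed, which is the contrast being drawn with server-mediated messengers.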
06:55:43.240 | You get operating systems.
06:55:45.660 | Well, the frontier of operating systems,
06:55:47.140 | I mean, look, you know, Windows, Linux,
06:55:49.500 | and Mac OS have been around forever.
06:55:51.420 | But if you actually think about, you know,
06:55:53.780 | what is a blockchain?
06:55:54.920 | Well, there's operating systems, there's web browsers.
06:55:57.720 | A blockchain is the most complicated thing
06:55:59.300 | since an operating system or web browser
06:56:00.540 | 'cause it's a kind of operating system.
06:56:02.740 | It's got, you know, something like Ethereum has an EVM.
06:56:05.220 | It's got a programming language.
06:56:07.080 | It's got an ecosystem where people monetize on it.
06:56:09.660 | They build front-end apps and they build back-end apps.
06:56:12.100 | They interoperate between each other.
06:56:13.820 | This is the frontier of operating systems research.
06:56:15.940 | People haven't thought of it that way, right?
06:56:17.660 | It's also the frontier of a lot of things in databases.
06:56:20.460 | You will get a crypto LinkedIn
06:56:22.700 | where there's zero knowledge proofs of various credentials.
06:56:24.900 | Okay?
06:56:25.940 | Basically, every single web two company,
06:56:28.420 | I can probably come up with a web three variant of it, right?
06:56:32.160 | Like Ethereum is, I mean,
06:56:33.680 | and this is high praise for both parties,
06:52:35.780 | but Ethereum is like the crypto Stripe,
06:52:38.420 | or the web three Stripe.
06:56:39.640 | And you will see versions of everything else
06:56:44.440 | that are like this.
06:56:46.240 | But, you know, I kind of described search,
06:56:49.200 | social, messaging, operating systems, the phone, right?
06:56:52.820 | Solana is doing a crypto phone.
06:56:53.980 | Why do you want that?
06:56:54.820 | Again, digital property.
06:56:56.440 | Apple was talking about running some script
06:56:59.000 | to find if people were having, you know, CSAM,
06:57:02.040 | like, you know, child porn or whatever on their phones,
06:57:04.680 | right?
06:57:05.520 | And even NYT actually reported that,
06:57:08.920 | like Google ran something like this
06:57:10.360 | and found false positives.
06:57:11.480 | Some guy had to take a photo of a kid
06:57:13.680 | for, you know, medical diagnosis.
06:57:15.860 | It got false, you know, falsely flagged as CSAM.
06:57:18.840 | He lost access to his account.
06:57:20.780 | Total nightmare.
06:57:21.620 | Imagine just getting locked out of your Google account,
06:57:23.560 | which you're so dependent on, right?
06:57:25.160 | As more and more of your digital life goes online,
06:57:28.760 | you know, is it really that much ethically different
06:57:31.560 | if it's the Chinese state that locks you out
06:57:34.600 | or an American corporation, right?
06:57:37.000 | Basically, it's operationally very similar.
06:57:38.600 | You just have no recourse.
06:57:39.440 | You're unpersoned, right?
06:57:41.160 | So the crypto phone becomes like insanely important
06:57:44.960 | because you have a local set of private keys.
06:57:47.920 | Those are the keys to your currency and your passport
06:57:50.120 | and your services and your life, right?
06:57:52.560 | So like, become something that you just hold on you
06:57:54.580 | with your person at all times, like your normal phone.
06:57:56.500 | You might have backups and stuff,
06:57:57.600 | but you know, the crypto phone
06:57:58.920 | is an insanely important thing, okay?
06:58:01.280 | And so that's search, that's social, that's messaging,
06:58:04.560 | that's operating systems, that's a phone.
06:58:08.380 | That's a lot right there.
06:58:09.280 | - Yeah, that is beautiful.
06:58:10.800 | - Can I have 120 seconds to just finish up
06:58:13.640 | a few more thoughts on social media?
06:58:14.480 | - Yes, please. - Okay.
06:58:15.880 | AI and AR, okay?
06:58:17.640 | This massive impact, obviously, of AI and social media.
06:58:20.000 | You're gonna have completely new social media companies,
06:58:23.040 | gestures, other things, you know, TikTok having, you know,
06:58:26.440 | some of the AI creation tools in there
06:58:28.000 | is just like a V1 of that.
06:58:30.640 | There's this whole thread with everything
06:58:31.840 | Stable Diffusion is unlocking.
06:58:33.600 | But basically, this is gonna melt Hollywood.
06:58:36.640 | US media corporations that took a hit in the 2010s,
06:58:39.700 | we're now gonna be able to have everyone around the world
06:58:41.980 | able to tell their story.
06:58:43.380 | And all the stuff about AI ethics and AI bias,
06:58:45.780 | the ultimate bias is centralized AI.
06:58:48.460 | Only decentralized AI is truly representative.
06:58:51.880 | You cannot be faux representative.
06:58:53.260 | You cannot claim that Google is representing Nigerians
06:58:57.580 | and Indians and Brazilians and Japanese.
06:58:59.140 | Like, those folks need to have access themselves, right?
06:59:01.940 | So that's a fundamental ethical argument
06:59:03.780 | against centralized AI.
06:59:05.300 | It's unethical, and it's like, you know,
06:59:07.760 | this faux thing where you might have like faux diversity
06:59:11.040 | in the interface,
06:59:14.080 | but you haven't actually truly decentralized it.
06:59:15.780 | This is the woke capitalism, right?
06:59:17.320 | You justify it with the wokeness,
06:59:18.520 | and you make the money by centralizing it.
06:59:20.280 | But the actual way of doing it is letting it free
06:59:22.280 | for the world and letting people build their own versions.
06:59:25.000 | If people wanna build a Asian "Lord of the Rings,"
06:59:27.600 | they can do that.
06:59:28.560 | If they wanna build an Indian one, they can do that.
06:59:30.300 | If they, you know, whatever they want, right?
06:59:32.560 | So that is the argument for AI decentralization
06:59:37.560 | and for how that kind of links to this.
06:59:40.360 | - I love that.
06:59:41.200 | AI decentralization fixes the bias problem in AI,
06:59:44.560 | which a lot of people seem to-
06:59:45.980 | - Yeah, centralization-
06:59:47.440 | - Talk about and focus on.
06:59:49.260 | - Yeah, centralization is inherently unrepresentative,
06:59:52.440 | fundamentally.
06:59:54.160 | Like, you can like mathematically show it.
06:59:55.400 | It's not representing the world.
06:59:57.600 | The decentralization allows anybody to pick it up
07:00:00.880 | and make it their own, right?
07:00:02.920 | And centralization is almost always a mask
07:00:05.120 | for like that private corporate interest, right?
07:00:07.760 | It's like, one of the things about the woke capitalism thing,
07:00:09.960 | by the way, is the deplatforming of Trump was political.
07:00:12.880 | Other things are political.
07:00:14.160 | But do you know what deplatforming started with?
07:00:16.200 | In the late 2000s, early 2010s, all the open social stuff
07:00:19.640 | was when deplatforming was being used as a corporate weapon
07:00:22.520 | against Meerkat and Zynga and Teespring, right?
07:00:27.320 | These were companies that were competing with features
07:00:29.400 | of, you know, TweetDeck, et cetera.
07:00:31.000 | They're competing with features of Twitter or Facebook,
07:00:33.160 | and the API was cut off.
07:00:34.320 | And that was when actually progressives
07:00:35.960 | were for net neutrality and an open internet
07:00:39.040 | and open social against the concentration
07:00:40.980 | of corporate power and so on.
07:00:42.120 | Remember that, right?
07:00:43.440 | And so what's gonna happen is both those two things,
07:00:46.280 | the political and the corporate are gonna come together.
07:00:48.880 | In the Soviet Union, denunciation was used as a tool
07:00:53.360 | to, for example, undercut romantic rivals, right?
07:00:56.080 | There's a great article called "The Practice of Denunciation"
07:00:58.440 | in the Soviet Union, right?
07:01:00.400 | Which talks about all these examples
07:01:01.840 | where the ideological argument was used
07:01:04.680 | to kick somebody into the "300"-like pit
07:01:07.460 | that existed at the center of the Soviet Union.
07:01:09.360 | Anybody could be kicked into the pit at any moment.
07:01:11.000 | And ta-da, well, Ivan's out, you know,
07:01:13.800 | and now, you know, hey, Anna, you know, whatever, right?
07:01:16.620 | Okay, that same thing is gonna be used by woke capitalists,
07:01:20.000 | is being used by woke capitalists,
07:01:21.480 | where the woke argument is used to justify
07:01:24.520 | pulling, pushing their competitor out of the app store
07:01:26.680 | or downracking them in search.
07:01:28.140 | Well, again, you wouldn't want a bigot to be in search
07:01:31.000 | who could compete with us or whatever, right?
07:01:32.800 | And conversely, so the wokeness is used to make money
07:01:36.720 | and the money is used to advance the ideology.
07:01:39.840 | It's like this kind of back and forth.
07:01:41.160 | Sometimes, right now, you think of those
07:01:43.320 | as independent things, but then they fuse, okay?
07:01:46.240 | And so that's very clear with the AI bias arguments
07:01:49.880 | where it just so happens that it's so powerful, Lex,
07:01:54.040 | this technology is so powerful in the wrong hands,
07:01:56.000 | it could be used, so we will charge you $99 for every use
07:01:59.100 | of it, how's that?
07:01:59.940 | How altruistic is that?
07:02:01.220 | Is that amazingly altruistic?
07:02:02.380 | It's really good, right?
07:02:03.620 | So once you kind of see that, as I said,
07:02:08.100 | whenever they're positioning in economics,
07:02:09.580 | you can go in culture, when they're positioning on culture,
07:02:11.460 | you can go in economics.
07:02:12.940 | If they're so woke, why are they rich?
07:02:14.560 | If they're so concerned about representation,
07:02:16.380 | why is it centralized?
07:02:17.220 | Answer, they're not actually concerned about it,
07:02:18.700 | they're making money, right?
07:02:19.820 | Okay, so that is, I think, in a few words,
07:02:22.780 | blows up a lot of the AI bias type stuff, right?
07:02:26.520 | Okay, they're basically, they're biasing AI, all right.
07:02:30.240 | So the amount of stuff that can be done with AI now,
07:02:32.760 | like it also helps the pseudonymous economy,
07:02:34.260 | as I was talking about with the AI Zoom.
07:02:35.880 | So you have totally new sites, totally new apps
07:02:38.880 | that are based on that.
07:02:40.040 | I think it may, I mean, it changes,
07:02:46.920 | you're gonna have new Google Docs,
07:02:48.160 | new, all these kinds of things.
07:02:49.240 | You might have, you know, once you can do things
07:02:51.840 | with just a few taps, you might have sites
07:02:54.060 | that are focused more on producing
07:02:55.740 | rather than just consuming, because, you know,
07:02:58.280 | you might, with AI, you can change
07:03:00.060 | the productivity of gestures.
07:03:01.640 | You know, you can have a few gestures,
07:03:03.620 | like, for example, the image-to-image thing
07:03:06.780 | with Stable Diffusion, where you make a little cartoon,
07:03:09.620 | a third grader's painting, and it becomes a real painting.
07:03:12.420 | A lot of user interfaces will be rethought
07:03:14.660 | now that you can actually do this incredible stuff
07:03:16.900 | with AI, it knows what you want it to do, right?
07:03:19.820 | So, and I saw this funny thing,
07:03:21.840 | which was a riff on Peter Thiel's line,
07:03:24.420 | which is AI is centralized and crypto is decentralized.
07:03:27.200 | And somebody was saying, actually,
07:03:28.760 | it turns out crypto is centralized
07:03:30.080 | with the CBDCs and stablecoins and so on,
07:03:32.760 | but AI is getting decentralized with Stable Diffusion,
07:03:34.640 | ha ha, right?
07:03:35.560 | Which is funny, and I think there's centralized
07:03:36.920 | and decentralized versions of each of these, right?
07:03:38.840 | And finally, the third pole that actually,
07:03:41.240 | you know, Thiel, you know, he talks about AI and crypto,
07:03:43.660 | but the third pole is actually,
07:03:44.640 | that's sort of underappreciated
07:03:45.800 | 'cause people think it already exists, is social.
07:03:47.600 | That just is keeping on going, right?
07:03:49.460 | And obviously the next step in social is AR and VR.
07:03:52.760 | And why is it so obvious?
07:03:53.600 | Because it's meta, you know, it's Facebook.
07:03:55.460 | Now I saw this very silly article, it's like,
07:03:57.560 | oh my God, Facebook is so dumb
07:03:58.780 | for putting $10 billion into, you know,
07:04:00.860 | virtual reality, right?
07:04:02.200 | And I'm like, okay, the most predictable innovation
07:04:05.880 | in the world, in my view, is the AR glasses.
07:04:08.360 | Have you talked about this on the podcast before?
07:04:10.760 | - AR and VR, I mean, of course a lot,
07:04:13.080 | but the AR is not as obvious, actually.
07:04:15.120 | - Okay, so AR glasses, what are AR glasses?
07:04:17.920 | So you take Snap's Spectacles, Google Glass,
07:04:21.920 | Apple's ARKit, Facebook's Oculus Quest 2, right?
07:04:26.100 | Or Meta Quest 2, whatever, okay?
07:04:28.980 | You put those together and what do you get?
07:04:31.020 | You get something that has the form factor of glasses
07:04:33.940 | that you'd wear outside, okay?
07:04:35.940 | Which can, with a tap, record
07:04:39.460 | or give you terminator vision on something,
07:04:41.700 | or with another tap go totally dark and become VR glasses.
07:04:45.580 | Okay, so normal glasses, AR glasses, VR glasses, recording.
07:04:49.600 | It's as multifunctional as your phone, but it's hands-free.
07:04:53.400 | And you might actually even wear it more than your phone.
07:04:55.240 | In fact, you might be blind without your AR glasses
07:04:58.040 | because, you know, one of the things I've shown
07:05:01.000 | in the book early on is floating sigils.
07:05:04.240 | Did we talk, did I show you that?
07:05:06.300 | So this is a really important just visual concept.
07:05:11.280 | That right there shows, with ARKit,
07:05:12.880 | you can see a globe floating outside.
07:05:15.440 | Okay, secret societies are returning.
07:05:18.180 | This is what NFTs will become.
07:05:19.540 | The NFT locally on your crypto phone,
07:05:22.580 | if you hold it, you can see the symbol.
07:05:24.420 | And if you don't, you can't.
07:05:26.420 | - By the way, for people just listening,
07:05:28.100 | we're looking at a nice nature scene
07:05:31.700 | where an artificially created globe is floating in the air.
07:05:36.660 | - Yes, but it's invisible
07:05:38.820 | if you're not holding up the ARKit phone, right?
07:05:42.060 | So- - So only you
07:05:43.700 | have a window into this artificial world.
07:05:46.200 | - That's right.
07:05:47.140 | And then here's another thing
07:05:48.840 | which shows you another piece of it.
07:05:50.640 | And this is using ENS to unlock a door.
07:05:55.640 | So this is an NFT used for something different.
07:05:57.280 | So the first one is using the NFT effectively
07:05:59.080 | to see something.
07:06:00.480 | And the second is using the NFT to do something.
07:06:03.100 | Okay, so based on your on-chain communication, right?
07:06:07.600 | You can unlock a door.
07:06:10.080 | That's a door to a room.
07:06:11.800 | Soon it could be a door to a building.
07:06:13.000 | It could be the gates to a community.
07:06:15.080 | It could be your digital login, okay?
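The NFT-unlocks-a-door idea reduces to an ownership check. A toy Python sketch, where the in-memory registry stands in for an ERC-721 contract's `ownerOf` lookup and the addresses are made up:

```python
# Token-gated access: the door opens for whoever currently owns the
# door's NFT, not for whoever a company issued a keycard to.
owner_of = {  # token_id -> address, as an ERC-721 contract tracks it
    1: "0xalice",
    2: "0xbob",
}

DOOR_TOKEN = 1  # the NFT that this particular door honors

def unlock(door_token: int, visitor: str) -> bool:
    """Grant access only to the current owner of the door's token."""
    return owner_of.get(door_token) == visitor

assert unlock(DOOR_TOKEN, "0xalice")        # the owner gets in
assert not unlock(DOOR_TOKEN, "0xmallory")  # a stranger does not
```

Transferring the NFT transfers access, with no issuer in the loop, and the same check generalizes from a door to a building, a community gate, or a login.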
07:06:17.520 | And so- - Amazing.
07:06:19.560 | - What this means is basically a lot of these things
07:06:21.360 | which are like individual pieces get synthesized, right?
07:06:24.640 | And you eventually have a digital,
07:06:27.480 | just like you have a digital currency,
07:06:29.000 | or digital currencies unify concepts like obviously gold,
07:06:33.820 | stocks, bonds, derivatives,
07:06:36.320 | every kind of financial instrument,
07:06:37.960 | plus Chuck E. Cheese tokens, karma,
07:06:39.720 | everything that's fungible and transferable.
07:06:41.940 | The digital passport unifies your Google style login,
07:06:46.100 | your private keys, your API keys, your NFTs,
07:06:49.000 | your ENS name, your domain name,
07:06:50.520 | all of those kinds of things,
07:06:51.800 | and your key card for your door and so on, right?
07:06:54.220 | So the AR glasses are what probably, I don't know,
07:06:57.120 | it'll be Facebook's version three or version four.
07:07:00.280 | Apple is also working on them.
07:07:01.400 | Google's also working on them.
07:07:02.640 | You might just get a bunch of those models at the same time.
07:07:04.840 | It's like predicting the iPhone,
07:07:06.300 | just like Dorsey knew that mobile was gonna be big.
07:07:08.680 | And that's why he had 140 characters for Twitter
07:07:11.360 | 'cause it was like an SMS limitation
07:07:12.960 | and Twitter was started before the iPhone.
07:07:14.800 | AR glasses are an incredibly predictable invention
07:07:18.080 | that you can start thinking about the future of social
07:07:21.620 | is in part in person, okay?
07:07:23.920 | And it also means people might go outside more.
07:07:26.800 | Because you can't see a monitor in the sun,
07:07:29.780 | but you can hit AR and maybe you have a full screen thing
07:07:33.080 | and you just like kind of move your fingers or something
07:07:36.320 | and you can tap.
07:07:37.160 | You have to figure out the gesture.
07:07:38.080 | You don't wanna have gorilla arms.
07:07:39.500 | Maybe you do have a keyboard outside
07:07:41.120 | or just even like a,
07:07:42.400 | you could even have a desk like this.
07:07:46.520 | If you had, if you can touch type,
07:07:49.100 | you can imagine something where you look down
07:07:50.600 | and you can see a keyboard with your AR glasses
07:07:53.800 | and it registers it and then you can type like this, right?
07:07:56.240 | And probably you could have some AI that could,
07:07:58.960 | figure out what you meant rather than what you were doing.
07:08:01.240 | Okay, so that's AI and social media.
07:08:04.640 | That's AR and social media.
07:08:06.240 | But really one last thing I'll say,
07:08:08.440 | which is a non obvious non-technological part
07:08:10.880 | is I think we'll go from very broad networks,
07:08:14.880 | which are hundreds of millions or billions of people
07:08:16.680 | like Twitter and Facebook,
07:08:17.600 | which have many small communities in them
07:08:19.520 | to much smaller networks that have a million
07:08:23.120 | or 10 million people, but are much deeper, right?
07:08:26.560 | In terms of their association, right?
07:08:29.000 | And this is the long-term trend in tech
07:08:30.680 | 'cause you're going from eyeballs in the 1980s,
07:08:33.480 | or I'm sorry, eyeballs in the 1990s
07:08:35.340 | to daily active users in the 2000s
07:08:38.280 | to holders in the 2010s.
07:08:41.320 | So you go from just like, oh, I'm just a looky-loo
07:08:43.200 | to I'm logging in every day
07:08:44.920 | to I'm holding a significant percentage of my net worth.
07:08:47.480 | And then this decade is when the online community
07:08:50.720 | becomes primary.
07:08:51.880 | You're a netizen.
07:08:52.720 | The digital passport is your main identity.
07:08:55.760 | And so this is not,
07:08:57.220 | see the problem with Facebook or Twitter
07:08:58.560 | is it's a bunch of different communities
07:08:59.740 | that don't share the same values fighting each other.
07:09:01.480 | This brings us back to the network state
07:09:02.840 | where you have one community with shared values,
07:09:05.160 | shared currency, and it's full stack.
07:09:07.520 | It's a social network and it's a cryptocurrency
07:09:10.520 | and it's a co-living community and it's a messaging app
07:09:14.080 | and it's a this and it's a that.
07:09:15.280 | And it's like Estonia,
07:09:16.680 | with a million people,
07:09:17.520 | you can actually build a lot of that full stack.
07:09:19.220 | That starts to get to what I call a network state.
07:09:21.680 | - I feel like there should be
07:09:25.680 | like a standing applause line here.
07:09:28.000 | This is brilliant.
07:09:29.600 | You're an incredible person.
07:09:31.840 | This was an incredible conversation.
07:09:33.320 | We covered how to fix our government,
07:09:36.040 | looking at the future of governments,
07:09:38.640 | moving into network state.
07:09:40.160 | We covered how to fix medicine, FDA, longevity.
07:09:45.160 | That was just like a stellar description.
07:09:49.840 | Really, I'll have to listen to that multiple times
07:09:51.760 | to really think and thank you for that,
07:09:53.360 | especially in this time
07:09:55.960 | where the lessons learned from the pandemic
07:09:58.680 | are unclear to at least me.
07:10:02.160 | And there's a lot of thinking that needs to be done there.
07:10:04.800 | And then just a discussion about how to fix social media
07:10:09.440 | and how to fix money.
07:10:12.560 | This was brilliant.
07:10:13.520 | So you're an incredibly successful person yourself.
07:10:16.280 | You taught, co-taught a course at Stanford for startups.
07:10:21.280 | That's a whole nother discussion that we can have,
07:10:24.800 | but let me just ask you,
07:10:26.320 | there's a lot of people that look up to you.
07:10:29.200 | So if there's somebody who's young,
07:10:31.040 | in high school, early college,
07:10:34.500 | trying to figure out what the heck to do with their life,
07:10:36.680 | what to do with their career,
07:10:38.160 | what advice could you give them?
07:10:39.880 | How they can have a career they can be proud of
07:10:42.960 | or how they can have a life they can be proud of.
07:10:46.000 | - At least what I would do,
07:10:47.000 | and then you can take it or leave it or what have you.
07:10:49.000 | (Lex laughing)
07:10:51.840 | - Yeah, maybe to your younger self.
07:10:53.600 | Advice to your younger self.
07:10:55.360 | - My friend, Naval,
07:10:56.360 | this is a lot of what he puts out
07:11:00.280 | is the very practical brass tacks, next steps.
07:11:03.880 | And I tend towards the macro.
07:11:07.100 | Of course, we both do both kind of thing, right?
07:11:10.680 | But let's talk brass tacks and next steps
07:11:12.240 | 'cause I actually am practical,
07:11:14.640 | or at least practical enough to get things done, I think.
07:11:17.760 | - It's just like you said,
07:11:19.160 | you're breaking up the new book into three.
07:11:21.640 | - Yes, it's motivation, theory, and practice.
07:11:23.800 | - Motivation, theory, and practice.
07:11:25.320 | - That's right.
07:11:26.160 | And each of those-- - Let's talk about practice.
07:11:27.680 | - So let's talk practice.
07:11:28.600 | - Especially at the individual scale.
07:11:30.000 | - Right, so first, what skill do you learn
07:11:33.040 | as a young kid, right?
07:11:35.640 | So let me just give what the ideal full stack thing is.
07:11:39.840 | And then you have to say,
07:11:40.680 | okay, I'm good quantitatively, I'm good verbally,
07:11:43.320 | I'm good this, I'm good that, right?
07:11:45.000 | So the ideal is you are full stack engineer
07:11:48.440 | and full stack influencer,
07:11:49.760 | or full stack engineer, full stack creator, okay?
07:11:51.920 | So that's both right brain and left brain, all right?
07:11:54.700 | So what does that mean with engineering?
07:11:56.760 | That means you master computer science and statistics, okay?
07:12:01.220 | And of course, it's also good to know physics
07:12:03.060 | and continuous math and so on.
07:12:04.180 | That's actually quite valuable to know.
07:12:05.520 | And you might need to use a lot of that continuous math
07:12:08.360 | with AI nowadays, right?
07:12:09.600 | 'Cause a lot of that is actually helpful, right?
07:12:11.940 | Gradient descent and whatnot.
07:12:13.200 | But computer science and stats are to this century
07:12:17.500 | what physics was to the last, why?
07:12:20.360 | Because, for example, what percentage of your time
07:12:23.080 | do you spend looking at a screen of some kind?
07:12:25.340 | - A large percentage of the time.
07:12:27.840 | - A large percentage of the time, right?
07:12:28.720 | Probably more than, you know,
07:12:30.520 | for many people it's more than 50% of their waking hours.
07:12:33.040 | If you include laptop, you include cell phone, tablet,
07:12:36.680 | you know, your watch, you know,
07:12:38.040 | maybe a monitor of some kind, right?
07:12:39.980 | All those together is probably, it's a lot, okay?
07:12:42.120 | Which means, and then that's gonna only increase
07:12:44.160 | with AR glasses, okay?
07:12:46.400 | Which means most of the rest of your life
07:12:48.560 | will be spent in a sense in the matrix, okay?
07:12:51.840 | In a constructed digital world,
07:12:54.180 | which is more interesting in some sense
07:12:56.560 | than the offline world.
07:12:57.920 | 'Cause we look at it more, it changes faster, right?
07:13:00.440 | And where the physics are set by programmers, okay?
07:13:03.680 | And what that means is, you know,
07:13:06.840 | physics itself is obviously very important
07:13:08.880 | for the natural world.
07:13:10.060 | Computer science and stats are for the artificial world,
07:13:12.640 | right?
07:13:13.480 | And why is that?
07:13:14.600 | Because every domain has algorithms and data structures,
07:13:19.440 | whether it's aviation, okay?
07:13:21.480 | You go to American Airlines, right?
07:13:23.920 | They're gonna have, you know,
07:13:26.200 | planes and seats and tickets and so on.
07:13:29.480 | So it's data structures,
07:13:30.600 | and you're gonna have algorithms
07:13:31.560 | and functions that connect them.
07:13:32.760 | You're gonna have tables that those data are written to.
07:13:34.800 | If it's Walmart, you're gonna have SKUs,
07:13:37.720 | and you're gonna have shelves,
07:13:38.880 | and you're gonna have, so you have data structures
07:13:40.360 | and you have algorithms to connect them.
07:13:41.800 | So every single area, you have algorithms
07:13:44.520 | and data structures, which is computer science and stats.
07:13:47.160 | And so you're going to collect the data and analyze it, right?
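The airline example above can be made concrete. A minimal Python sketch, where `Flight`, `Seat`, `assign_seat`, and `load_factor` are all illustrative names, not any real airline system: the dataclasses are the data structures, the functions are the algorithms that connect them, and the load factor is the stats done on the data they collect.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Seat:
    row: int
    letter: str
    ticket_id: Optional[str] = None  # None means unassigned

@dataclass
class Flight:
    number: str
    seats: list = field(default_factory=list)

def assign_seat(flight: Flight, ticket_id: str) -> Optional[Seat]:
    """The algorithm connecting the data structures: first open seat wins."""
    for seat in flight.seats:
        if seat.ticket_id is None:
            seat.ticket_id = ticket_id
            return seat
    return None  # flight is full

def load_factor(flight: Flight) -> float:
    """The stats side: analyze the data the structures collect."""
    filled = sum(1 for s in flight.seats if s.ticket_id is not None)
    return filled / len(flight.seats)

flight = Flight("AA100", [Seat(1, "A"), Seat(1, "B"), Seat(2, "A")])
assign_seat(flight, "T-001")
assign_seat(flight, "T-002")
print(load_factor(flight))  # 2 of 3 seats filled
```

Swap the nouns (SKUs and shelves for Walmart) and the shape is the same, which is the point being made.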
07:13:51.800 | And so that means if you have that base of CS and stats,
07:13:54.800 | where you're really strong and you understand, you know,
07:13:57.640 | the theory as well as the practice, right?
07:13:59.560 | And you need both, okay?
07:14:00.480 | Because you need to understand, you know,
07:14:03.280 | obviously the basic stuff like big O notation and whatnot,
07:14:06.240 | and you need to understand
07:14:07.600 | all your probability distributions, okay?
07:14:10.320 | You know, a good exercise, by the way,
07:14:11.520 | is to go from the Bernoulli trials, right?
07:14:16.200 | To everything else,
07:14:17.360 | 'cause you can go Bernoulli trials
07:14:18.680 | to the binomial distribution, to the Gaussian.
07:14:21.640 | You can also go from, you know,
07:14:23.240 | Bernoulli trials to the geometric distribution and so on.
07:14:26.120 | You can derive everything from this, right?
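That derivation exercise can also be checked numerically. A minimal stdlib-only Python sketch (the values of n, p, and the sample sizes are arbitrary): build everything from a single Bernoulli trial, then compare the sample moments of the binomial to the mean np and variance np(1-p) of its Gaussian limit, and the geometric sample mean to 1/p.

```python
import random

def bernoulli(p: float) -> int:
    """One Bernoulli trial: 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

def binomial_sample(n: int, p: float) -> int:
    """Sum of n Bernoulli trials is one Binomial(n, p) draw."""
    return sum(bernoulli(p) for _ in range(n))

def geometric_sample(p: float) -> int:
    """Number of Bernoulli trials up to the first success: Geometric(p)."""
    k = 1
    while bernoulli(p) == 0:
        k += 1
    return k

random.seed(0)
n, p, reps = 100, 0.3, 5000
draws = [binomial_sample(n, p) for _ in range(reps)]
mean = sum(draws) / reps
var = sum((x - mean) ** 2 for x in draws) / reps
# By the CLT, Binomial(n, p) approaches a Gaussian with mean np
# and variance np(1-p); the sample moments should land near both.
print(mean, n * p)               # sample mean vs np = 30
print(var, n * p * (1 - p))      # sample variance vs np(1-p) = 21
geo = [geometric_sample(p) for _ in range(reps)]
print(sum(geo) / reps, 1 / p)    # geometric sample mean vs 1/p
```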
07:14:27.920 | - And computer science includes not just big O,
07:14:30.400 | but software engineering?
07:14:32.040 | - Well, computer science is theory,
07:14:33.120 | software engineering is practice, right?
07:14:34.880 | You could argue probability and stats is theory,
07:14:38.840 | and then data science is practice.
07:14:40.760 | - Sure. - Right?
07:14:41.600 | - Yeah. - And so-
07:14:42.440 | - So you include all of that together.
07:14:43.680 | - I include all of that as a package.
07:14:44.760 | That's theory and practice, right?
07:14:46.200 | I mean, look, it's okay to use libraries
07:14:50.240 | once you know what's going on under the hood, right?
07:14:52.600 | That's fine, but you need to be able to kind of
07:14:54.280 | write out the whole thing yourself.
07:14:55.520 | - I mean, it's...
07:14:56.800 | That could be true, could not be true, I don't know.
07:15:02.480 | Are you sure about that?
07:15:03.720 | Because- - Well, you should-
07:15:05.320 | - You could, you might be able to get quite far
07:15:09.560 | standing on the shoulders of giants.
07:15:11.600 | - You can, but it depends.
07:15:13.320 | Like, you couldn't build, well, okay.
07:15:17.280 | Somebody- - Maybe you could.
07:15:18.640 | However you were gonna finish that sentence,
07:15:21.040 | I could push back before-
07:15:22.160 | - You could probably push back, right?
07:15:23.520 | But here's what I was gonna say.
07:15:24.400 | I was gonna say, you couldn't really,
07:15:26.520 | you couldn't build Google or Facebook or Amazon or Apple
07:15:31.040 | without somebody at the company who understood
07:15:34.640 | like computer architecture and layout of memory
07:15:39.640 | and theory of compilers.
07:15:42.360 | - But you might want to, see, the thing is,
07:15:44.120 | if you just look at libraries,
07:15:45.280 | you might be able to understand the capabilities
07:15:48.560 | and you can build up the intuition of like
07:15:50.760 | what a great specialized engineer could do that you can't.
07:15:55.520 | - Like, for example, at least a while back,
07:15:57.560 | facebook.com, like was literally,
07:15:59.920 | it's just a single C++ compiled binary.
07:16:02.080 | Or sorry, it's not C++, it was HipHop.
07:16:05.320 | They had a PHP-to-C++ compiler where they had just one giant binary.
07:16:10.320 | I may be getting this wrong, but that's what I recall, right?
07:16:13.160 | - Yeah, yeah.
07:16:14.000 | I mean, it should be simple, it should be simple.
07:16:16.560 | And then you have guys like John Carmack
07:16:19.200 | who comes in and does an incredibly optimized implementation
07:16:24.200 | that actually-
07:16:26.160 | - Well, yeah, more than that, right?
07:16:27.400 | Like he's, I mean, yes, right, but go ahead.
07:16:30.160 | - I mean, there's some cases with John Carmack
07:16:32.960 | by being an incredible engineer is able to bring to reality
07:16:37.960 | things that otherwise would have taken
07:16:39.640 | an extra five to 10 years.
07:16:41.560 | - Yeah, or maybe even more than that.
07:16:42.760 | Like, so, you know, this is the great man theory of history
07:16:45.520 | versus like sort of the kind of the determinist,
07:16:50.320 | like, you know, waves of history are pushing things along.
07:16:52.600 | The way I reconcile those is the tech tree model of history.
07:16:55.360 | You know, like Civilization, you ever play the game,
07:16:56.720 | Civilization? - Yeah.
07:16:57.560 | - Yeah, so like Civilization, you got the tech tree
07:16:59.360 | and you can go and be like, okay, I'm gonna get spearmen
07:17:01.480 | or I'm gonna do granaries and pottery, right?
07:17:05.000 | And so you can think of it as something where
07:17:07.640 | here's everything that humanity has right now.
07:17:09.680 | And then Satoshi can push on this dimension
07:17:11.680 | of the tech tree.
07:17:12.600 | So he's a great man because there weren't other,
07:17:14.920 | there wasn't a Leibniz to Satoshi's Newton, right?
07:17:18.880 | Like Vitalik, as amazing as he is,
07:17:20.840 | was five years later or thereabouts, right?
07:17:22.840 | There wasn't contemporaneous, like, you know,
07:17:26.360 | another person that was doing what Satoshi was doing,
07:17:28.560 | it's truly sui generis, right?
07:17:31.080 | And that shows, you know, what one person can do.
07:17:33.080 | Like probably Steve Jobs with Apple, you know,
07:17:35.080 | given how the company was dying before he got there
07:17:36.920 | and he built it into what is,
07:17:38.320 | depending on the day,
07:17:39.200 | the most valuable company in the world.
07:17:40.680 | It shows that there is quote, great man, right?
07:17:43.520 | Maybe more than just being five or 10 years ahead,
07:17:45.480 | like truly shaping where history goes, right?
07:17:48.320 | But on the other hand, of course, that person,
07:17:50.280 | Steve Jobs himself wrote that email
07:17:52.000 | that was released posthumously, saying that, you know,
07:17:53.840 | he doesn't grow his own food and he doesn't, you know,
07:17:57.720 | he didn't even think of the rights that he's got,
07:17:59.600 | someone else thought of those and whatnot.
07:18:01.600 | And so he kind of, it is always a tension
07:18:03.880 | between the individual and society on this, right?
07:18:05.800 | But coming back, so CS and stats,
07:18:08.440 | that's what you wanna learn.
07:18:10.040 | I think physics is also good to know
07:18:11.440 | because you go one level deeper
07:18:12.600 | and of course all these devices,
07:18:14.080 | you're not gonna be able to build, you know,
07:18:17.800 | LIDAR or things like that without understanding physics.
07:18:20.440 | - You mentioned that as one side of the brain,
07:18:22.320 | what about the other?
07:18:23.320 | - Right, so CS and stats is that side.
07:18:25.560 | Okay, and then you can go into any domain,
07:18:26.880 | any company, kick butt, you know, add value, right?
07:18:30.080 | Okay, so now the other side is creator, right?
07:18:32.800 | Becoming a creator.
07:18:33.880 | First, online, you know, like social media
07:18:37.200 | is about to become far, far, far more lucrative
07:18:40.040 | and monetizable.
07:18:41.240 | People are not updated.
07:18:43.680 | They kind of think this is, it's like over
07:18:45.760 | or something like that or it's old or whatever.
07:18:47.960 | But with crypto, once you have property rights
07:18:52.120 | in social media, now it's not what Google
07:18:55.840 | just allows you to have, but it's what you own, right?
07:18:58.960 | You actually have genuine property rights.
07:19:01.880 | And that just completely changes everything,
07:19:03.800 | just like, you know, the introduction of property rights
07:19:05.520 | in China change everything.
07:19:06.360 | It might take some lag for that to happen,
07:19:08.400 | but you can lend against that, borrow against that.
07:19:12.000 | You just, you own the digital property, right?
07:19:15.040 | And you can do NFTs, you can do, you know, investments,
07:19:17.120 | you can do all this other stuff, right?
07:19:18.760 | So in many ways, I think anybody who's listening,
07:19:22.480 | who's like, you know, I want to build a billion dollar
07:19:24.400 | company, I'm like, build a billion dollar company, yes.
07:19:26.760 | Also build a million person media operation
07:19:30.760 | or a million person following or something online, right?
07:19:33.240 | Because a US media company is simply not economically
07:19:37.920 | or socially aligned with your business.
07:19:40.120 | I mean, the big thing that I think, you know,
07:19:43.160 | tech and media actually, it's funny,
07:19:45.480 | there's this collision and sometimes
07:19:46.960 | there's an Adam smashing event
07:19:48.240 | and there's like a repositioning, right?
07:19:50.300 | And media attacked tech really hard in the 2010s,
07:19:54.400 | as well as many other things.
07:19:55.840 | And now, post 2020, I think it's now centralized tech
07:20:00.080 | and media versus decentralized tech and media.
07:20:03.040 | And centralized tech and media is NYT and Google,
07:20:07.440 | which have all become woke-ified,
07:20:08.680 | the establishment companies.
07:20:10.500 | But decentralized tech and media is like Substack,
07:20:12.660 | lots of defectors from the US establishment,
07:20:17.280 | from the NYT have gone to Substack.
07:20:19.520 | But also all the founders and funders
07:20:21.880 | are much more vocal on Twitter,
07:20:23.960 | whether it's Marc Andreessen, Jack Dorsey, Jeff Bezos,
07:20:28.120 | Zuckerberg, Zuck is just cutting out the establishment
07:20:30.400 | and just going direct to posting himself
07:20:33.480 | or posting the jiu-jitsu thing, you know,
07:20:35.160 | which he recently did or going and talking to Rogan, right?
07:20:38.400 | And so you now have this sort of atom smash
07:20:41.960 | and like kind of reconstitution.
07:20:43.760 | Why is that important?
07:20:44.700 | Well, look, once you realize US media companies
07:20:48.080 | are companies and their employees,
07:20:51.880 | Sulzberger's employees are just dogs on a leash, right?
07:20:54.560 | They're hit men for old money,
07:20:56.980 | assassins for the establishment.
07:20:58.400 | They're never gonna investigate him, okay?
07:21:00.360 | There's this thing right now, like some strike
07:21:02.480 | or possible strike that's going at the New York Times.
07:21:04.580 | The obviously, the most obvious rich corporate zillionaire,
07:21:09.580 | the epitome of white privilege is, you know,
07:21:12.640 | and again, I'm not the kind of person
07:21:14.080 | who thinks white is an insult, right?
07:21:15.480 | But the guy who inherited the company
07:21:18.320 | from his father's father's father's father,
07:21:20.260 | in the NFL, right?
07:21:22.040 | You're supposed to have the Rooney rule
07:21:23.000 | where you're supposed to interview diverse candidates
07:21:24.480 | for the top job.
07:21:25.320 | You know, the other competitors for the top job
07:21:27.300 | of the publisher of the New York Times
07:21:28.160 | were two cousins of Sulzberger.
07:21:31.160 | So it's three cis straight white males in 2017
07:21:34.940 | who competed for this top job.
07:21:36.540 | And everybody in media was like silent
07:21:38.420 | about this coronation.
07:21:39.460 | They had this coronation article
07:21:40.940 | in the Times about this, right?
07:21:43.340 | So you have this meritless nepotist, right?
07:21:47.060 | This literally rich cis white man
07:21:49.380 | who makes millions of dollars a year
07:21:52.180 | and it makes like 50 X the salary of other,
07:21:55.980 | you know, NYT journalists, okay?
07:21:59.000 | And, you know, lives in a mansion and so on
07:22:02.140 | while denouncing, this is a born rich guy
07:22:04.300 | who denounces all the built rich guys
07:22:06.420 | at a company which is far whiter
07:22:08.920 | than the tech companies he's been denouncing, okay?
07:22:11.600 | And again, there's a website called
07:22:14.220 | Tech Journalism is Less Diverse Than Tech.com
07:22:16.980 | which actually shows the numbers on this, right?
07:22:18.820 | Here, I can look at this numbers, right?
07:22:21.140 | So why did I say this?
07:22:22.440 | Well, centralized US media has lost a ton of clout.
07:22:25.780 | Engagement is down.
07:22:27.900 | You've seen the crypto prices down,
07:22:29.440 | like stock prices have crashed.
07:22:30.980 | That's very obvious and quantifiable.
07:22:33.420 | Less visible is that media engagement has crashed, right?
07:22:36.460 | By the way, yeah, there's a plot
07:22:38.860 | that shows on the X axis percent white
07:22:41.580 | and then the Y axis are the different companies.
07:22:45.300 | And the tech companies are basically below 50% white
07:22:51.440 | and all the different media,
07:22:55.080 | tech journalism companies are all way above,
07:22:58.520 | you know, 70, 80, 90 plus percent white.
07:23:03.420 | And hypocrisy, ladies and gentlemen.
07:23:05.760 | - I mean, again, I'm not the kind of person
07:23:07.440 | who thinks white is an insult, but these guys are
07:23:10.200 | and they are the wokest whites on the planet, right?
07:23:13.200 | It's like ridiculous, right?
07:23:14.600 | - You know, it's like anyone who's homophobic,
07:23:19.120 | anyone who's, it feels like it's a personal thing
07:23:22.080 | that they're struggling with.
07:23:23.640 | Maybe the journalists are actually the ones who are racist.
07:23:27.280 | - Well, actually, you know, it's funny you say that
07:23:29.280 | because there's this guy, A.M. Rosenthal, okay?
07:23:33.260 | And you know, on his gravestone was,
07:23:35.000 | we kept the, he kept the paper straight, right?
07:23:37.300 | And actually he essentially went and,
07:23:41.600 | this is a managing editor of the New York Times for almost,
07:23:43.660 | you know, from '69 to '77, executive editor from '77 to '86.
07:23:47.880 | And it was a history-- - Oh my Lord.
07:23:50.120 | - Yeah, history of basically keeping,
07:23:52.700 | you know, gay reporters out.
07:23:53.540 | So essentially, the way I think about it is,
07:23:56.240 | New York Post reported that,
07:23:58.160 | just to talk about this for a second
07:24:01.240 | 'cause it's so insane, all right?
07:24:03.080 | New York Post reported,
07:24:03.920 | and I've got some of this in the book, okay?
07:24:05.560 | But, Abe Rosenthal, managing editor of the New York Times
07:24:10.560 | from 1969 to 1977, executive editor from 1977 to 1986.
07:24:16.440 | His gravestone reads, he kept the paper straight.
07:24:19.840 | And then here's Jeet Heer on this.
07:24:22.360 | He kept the paper straight. As it happens,
07:24:23.980 | Rosenthal was a notorious homophobe.
07:24:25.300 | He made it a specific policy of the paper
07:24:26.660 | not to use the term gay.
07:24:27.500 | He denied a plum job to a gay man for being gay.
07:24:29.620 | He minimized AIDS crisis.
07:24:31.020 | So, like, you know, the thing about this is,
07:24:34.580 | this is not like a one-off thing, okay?
07:24:36.620 | The New York Times literally won a Pulitzer
07:24:39.900 | for choking out the Ukrainians,
07:24:41.420 | for helping starve five million Ukrainians to death.
07:24:43.660 | And now has reinvented themselves as like a cheerleader
07:24:46.060 | to stand with Ukraine, right?
07:24:47.460 | They were for, you know, Abe Rosenthal's homophobia
07:24:50.940 | before they were against it, right?
07:24:52.840 | They were like, if you saw the link I just pasted in, okay?
07:24:56.480 | During BLM, you know, it's credibly reported
07:24:59.760 | that, and I haven't seen this refuted,
07:25:01.440 | the family that owns the New York Times were slaveholders.
07:25:03.600 | Somehow that stayed out of 1619 and BLM coverage, right?
07:25:06.640 | So they were literally getting the profits from slavery
07:25:09.640 | to help bootstrap, you know, what was the Times
07:25:12.640 | or, you know, went into it.
07:25:13.800 | They actually did this article on like the compound interest
07:25:16.840 | of slaveholders in Haiti and how much they owed people,
07:25:20.780 | right?
07:25:21.620 | If you apply that to how much money they made off slaves,
07:25:23.100 | I mean, can anyone name one of Sulzberger's slaves?
07:25:25.380 | Like, can we humanize that, put a face on that,
07:25:27.580 | show exactly, you know, who lost such that he may win, right?
07:25:31.340 | And so you stack this up and it's like, you know,
07:25:34.060 | for the Iraq war before they were against it.
07:25:35.820 | And it's like, yeah, sure, Bush, you know,
07:25:37.740 | did a lot of bad stuff there,
07:25:38.740 | but they also reported a lot of negative, you know,
07:25:41.380 | not negative coverage, like false coverage, right?
07:25:43.040 | About WMDs, like, you know, the whole Judith Miller thing.
07:25:45.860 | And so it's like this amazing thing where
07:25:48.480 | if some of the most evil people in history
07:25:50.240 | are the historians, if the, you know,
07:25:52.960 | they actually ran this ad campaign in the 2017 time period
07:25:56.040 | called "The Truth."
07:25:57.200 | So giant Orwellian billboards, right?
07:26:00.780 | Which say, you know, the truth is essential.
07:26:04.920 | Here, it looks like this.
07:26:06.320 | - This was when?
07:26:07.920 | - This was just a few years ago, 2017.
07:26:10.320 | - This is in New York.
07:26:11.980 | A billboard by the New York Times reads,
07:26:15.360 | "The truth is hard to know.
07:26:17.560 | The truth is hard to find.
07:26:19.580 | The truth is hard to hear.
07:26:21.180 | The truth is hard to believe.
07:26:22.620 | The truth is hard to accept.
07:26:24.780 | Truth is hard to deny.
07:26:26.840 | The truth is more important now than ever."
07:26:30.360 | All right, this is like, yeah, this is 1984 type of stuff.
07:26:36.300 | - Yeah, now here's the thing.
07:26:37.140 | Do you know what other-
07:26:38.460 | - Truth, big.
07:26:40.460 | Truth, period, big white board.
07:26:44.940 | - So, okay, what other national newspaper
07:26:48.440 | proclaimed itself the truth in constantly, every day?
07:26:52.700 | You know this one, actually.
07:26:54.840 | - Oh, you mean Pravda?
07:26:56.140 | Yeah, yeah.
07:26:56.980 | - There you go, that's right.
07:26:57.800 | What is the Soviet translation?
07:26:59.740 | What's the Russian translation of Pravda?
07:27:00.980 | - It's truth.
07:27:02.040 | - Yeah.
07:27:02.880 | - That's so sorry, that didn't even connect to my head, yes.
07:27:05.160 | (laughing)
07:27:08.000 | Yeah, truth.
07:27:09.900 | Unironically, huh?
07:27:11.960 | - And again, it just so happens that-
07:27:13.300 | - Is this an Onion article?
07:27:14.640 | - What's that?
07:27:15.480 | Onion article, right.
07:27:16.300 | So like, you know, Pravda, like at least they were communist,
07:27:18.680 | these guys have figured out how to get,
07:27:20.320 | charge people $99 a year or whatever it is for the truth.
07:27:23.320 | Wow, that's actually even amazing, right?
07:27:25.160 | So the corporate truth.
07:27:26.200 | So when you stack all that up, right,
07:27:28.120 | basically legacy media has delegitimized themselves, right?
07:27:32.520 | Every day that those, quote,
07:27:33.600 | "investigative journalists don't investigate Sulzberger,"
07:27:36.240 | shows that they are so courageous
07:27:38.360 | as to investigate your boss, but not their own.
07:27:40.720 | - Yes.
07:27:41.960 | - Ta-da, total mask drop, right?
07:27:44.900 | That's like, just obvious, right?
07:27:47.820 | And now once you realize this,
07:27:49.220 | and you know, every influencer who's coming up,
07:27:52.740 | every creator realizes, okay, well that means
07:27:56.140 | I have to think about these media corporations
07:27:57.980 | as competitors.
07:27:59.560 | They are competitors.
07:28:00.400 | They are competitors for advertisers and influence.
07:28:03.700 | They will try, basically what the media corporations did
07:28:06.220 | partially successfully during the 2010s is
07:28:08.660 | they sort of had this reign of terror over many influencers,
07:28:11.900 | where they'd give them positive coverage
07:28:14.140 | if they supported sort of the party line,
07:28:16.440 | and negative coverage if they didn't, okay?
07:28:18.820 | But now the soft power has just dropped off a cliff, right?
07:28:23.820 | And, you know, many kinds of tactics that, you know,
07:28:27.920 | establishment journalists do,
07:28:29.080 | one way of thinking of them is like as a for-profit Stasi.
07:28:33.100 | Because they may stalk you, dox you, surveil you.
07:28:38.100 | Like, they can literally put, you know,
07:28:41.020 | like two dozen people following somebody around for a year,
07:28:45.900 | and that's not considered stalking, right?
07:28:50.080 | That's not considered spamming.
07:28:52.040 | They are allowed to do this and make money doing this.
07:28:55.300 | Whereas if you so much as criticize them,
07:28:58.340 | oh my God, it's an attack on the free press, blah, blah,
07:29:00.660 | right?
07:29:01.500 | But you are the free press and I'm the free press.
07:29:02.940 | Like, we're the free press.
07:29:04.100 | Again, it goes back to the decentralized, you know,
07:29:06.260 | the free speech is not like some media corporation's thing.
07:29:09.100 | It's everybody's right.
07:29:10.140 | And what actually happened with social media,
07:29:12.260 | what they're against is not that it is an attack
07:29:14.500 | on democracy, it's that it's the ultimate democracy
07:29:16.180 | because people have a voice now
07:29:18.260 | that didn't used to have a voice.
07:29:19.180 | You know what I'm saying?
07:29:20.020 | Freedom of the press belongs to those who own one, right?
07:29:22.420 | That old one, right?
07:29:23.260 | Or never argue with a man who buys ink by the barrel, right?
07:29:25.780 | - Yeah.
07:29:26.620 | - In a real way, all the things that were promised
07:29:30.500 | to people, freedom of speech, free markets, you know,
07:29:33.520 | like a beggar's democracy, it's like, oh yeah,
07:29:35.920 | you can have freedom of speech, but not freedom of reach,
07:29:38.140 | because you're just talking to yourself in your living room
07:29:40.940 | in, you know, Buffalo, New York, right?
07:29:44.140 | Maybe you can gather some friends around.
07:29:45.760 | You didn't have the licenses to get, you know,
07:29:49.380 | like a TV broadcast license, radio license, you know,
07:29:52.100 | the resources to buy a newspaper.
07:29:53.660 | You didn't have practical reach or distribution, okay?
07:29:57.740 | What happened was all these people in the US
07:29:59.860 | and around the world suddenly got voices
07:30:01.300 | and they were suddenly saying things
07:30:02.300 | that the establishment didn't want them to say.
07:30:05.020 | And so that's what this counter decentralization has meant,
07:30:07.280 | both in the US and in China, this crackdown,
07:30:10.300 | but it's as if like a stock went up like 100X
07:30:12.980 | and then dropped like 30%.
07:30:15.340 | All the deplatforming stuff, yes, it's bad, okay?
07:30:18.980 | It's a rearward move, but in the long arc,
07:30:23.980 | I think we're going to have more speech.
07:30:26.740 | I think the counter decentralization may succeed in China,
07:30:29.420 | but I don't think it's gonna succeed outside it,
07:30:31.140 | 'cause you're trying to retrofit speech and thought controls
07:30:34.060 | onto an ostensibly free society, right?
07:30:36.940 | Now that that check got cashed, people actually have a voice.
07:30:39.600 | It's not gonna be taken away from them very easily, right?
07:30:42.000 | So how does this relate to my advice to young kids?
07:30:44.360 | Once you have that context, right?
07:30:46.020 | Once you realize, hey, look,
07:30:48.080 | Apple didn't, like, do deals with BlackBerry, okay?
07:30:53.860 | Amazon didn't collaborate or give free content
07:30:57.600 | to Barnes and Noble.
07:30:59.480 | Netflix was not going and, you know,
07:31:03.080 | socializing with employees of Blockbuster.
07:31:05.480 | These employees of establishment media corporations
07:31:07.940 | are your competitors, okay?
07:31:09.960 | They are out for clicks.
07:31:12.360 | They are out for money.
07:31:13.720 | If they literally choke out the Ukrainians
07:31:18.320 | before making themselves into champions
07:31:20.560 | of the Ukrainian cause, they'll basically do anything.
07:31:23.560 | And so once you realize that, you're like,
07:31:25.760 | okay, I need to build my own voice, okay?
07:31:27.920 | If you're Brazilian, you're Nigerian,
07:31:30.840 | you're in the Midwest or the Middle East, right?
07:31:32.960 | If you're, you know, Japanese, you know, wherever you are,
07:31:37.440 | you need to build your own voice
07:31:38.720 | because outsourcing that voice to somebody else
07:31:41.820 | and having it put through a distorting filter
07:31:43.860 | that maximizes clicks,
07:31:47.680 | it's just not gonna be in one's own interest.
07:31:49.640 | You don't have to, you know,
07:31:51.040 | even agree with everything I'm saying
07:31:52.400 | or even all of it to just be like,
07:31:53.680 | well, look, I'd rather speak for myself.
07:31:55.120 | I'd rather go direct if I could.
07:31:56.440 | Speak unmediated, in my own words, right?
07:31:58.280 | Because the choice of word is actually very important,
07:32:00.360 | right?
07:32:01.200 | So that's the second big thing.
07:32:02.800 | You need to, and this is the thing
07:32:03.800 | that took me a long time to understand, okay?
07:32:06.400 | Because I always got the importance of math and science.
07:32:08.840 | And in fact, I would have been probably
07:32:11.240 | just a career academic or mathematician in another life,
07:32:14.660 | you know, maybe statistician, something like that,
07:32:16.640 | electrical engineer, et cetera.
07:32:18.180 | But the importance of creating your own content
07:32:20.200 | and telling your own stories,
07:32:22.120 | if you don't tell your own story,
07:32:23.920 | the story will be told for you, right?
07:32:25.820 | The sort of flip of winners write history is
07:32:29.120 | if you do not write history, you will not be the winner.
07:32:31.280 | You must write a history, okay?
07:32:33.720 | As kind of a funny way of putting it, right?
07:32:36.120 | - Yeah, chicken and egg.
07:32:38.040 | Yeah. - Contrapositive, right?
07:32:39.560 | And now what does that mean practically, okay?
07:32:41.720 | So in many ways, the program that I'm laying out
07:32:45.720 | is to build alternatives, peaceful alternatives
07:32:48.460 | to all legacy institutions, right?
07:32:52.040 | To obviously to the Fed, right?
07:32:54.520 | With Bitcoin, to Wall Street with DeFi
07:32:57.520 | and with Ethereum and so on.
07:32:59.240 | To academia with the ledger of record
07:33:02.120 | and the on-chain reproducible research that we talked about.
07:33:05.080 | To media with decentralized social media, decentralized AI.
07:33:09.340 | You can melt Hollywood with this, okay?
07:33:11.980 | Melt the RIAA, melt the MPAA.
07:33:14.600 | I mean, there's some good people there,
07:33:16.720 | but everybody should have their own movies.
07:33:19.140 | You know, people should be able to tell their own stories
07:33:21.860 | and not just wait for it to be cast through Hollywood
07:33:24.720 | and Hollywood is just making remakes anyway, okay?
07:33:27.080 | So you can tell original stories
07:33:28.480 | and you can do so online and you can do so by hitting a key
07:33:31.480 | and the production values will be there now
07:33:32.800 | that the AI content creation tools are out there.
07:33:35.280 | I mentioned disrupting or replacing
07:33:37.760 | or building alternatives to the Fed, to Wall Street,
07:33:40.600 | to academia, to media.
07:33:43.680 | I mentioned to Wikipedia, right?
07:33:46.140 | There's things like Golden.
07:33:47.420 | There's things like,
07:33:48.800 | there's a bunch of web three-ish Wikipedia competitors
07:33:51.980 | that are combining both AI and crypto for property rights.
07:33:54.880 | There's, you'll also need alternatives
07:33:57.760 | to all the major tech companies.
07:33:58.960 | That was the list that I went through with,
07:34:01.440 | you know, decentralized search and social and messaging
07:34:06.120 | and operating systems and even the crypto phone, okay?
07:34:09.520 | And then finally, you need alternatives
07:34:10.920 | to US political institutions and, more generally,
07:34:14.480 | to Chinese political institutions.
07:34:16.200 | And what are those?
07:34:18.200 | That's where the network state comes in.
07:34:20.200 | And the fundamental concept is if, you know,
07:34:23.760 | as I mentioned, only 2% of the world
07:34:25.880 | can become president of the United States
07:34:27.200 | that's about the number of Americans who are, you know,
07:34:29.800 | native born and over 35 and so on and so forth.
07:34:32.120 | But 100% of the world can become president
07:34:33.480 | of their own network state.
07:34:34.560 | What that means is,
07:34:36.360 | and this is kind of related to those two points, right?
07:34:38.020 | If you're an individual and you're good at engineering
07:34:40.600 | and you're good at content creation, okay?
07:34:42.960 | Like somebody like Jack Dorsey, for example,
07:34:44.600 | or Marc Andreessen, actually a lot of the founders
07:34:46.680 | are actually quite good at both nowadays.
07:34:48.200 | You look at Bezos, he's actually funny on Twitter
07:34:49.840 | when he allows himself to be.
07:34:51.240 | You know, you don't become a leader of that caliber
07:34:53.640 | without having, you know, some of both, right?
07:34:55.800 | If you've got some of both,
07:34:57.940 | now, no matter where you are, what your ethnicity is,
07:35:00.120 | what your nationality is, whether you can get a US visa,
07:35:03.040 | you can become president of a network state.
07:35:05.440 | And what this is, it's a new path to political power
07:35:09.120 | that does not require going through either the US
07:35:11.040 | or the Chinese establishment.
07:35:12.320 | You don't have to wait till you're 75.
07:35:14.120 | You don't have to become a gerontocrat
07:35:16.160 | or spout the party line and so on.
07:35:19.000 | The V1 of this is like folks like, you know,
07:35:21.720 | Francis Suarez or Nayib Bukele of El Salvador,
07:35:25.680 | but, you know, Suarez is a great example where,
07:35:28.240 | while not a full sovereign or anything like that,
07:35:30.040 | he has, in many ways, the skills of a tech CEO
07:35:32.680 | where he just put up a, you know, a call on Twitter
07:35:36.400 | and helped build Miami,
07:35:38.080 | recruited all these people from all over.
07:35:39.800 | And it wasn't the two-party system, but the N-city system.
07:35:41.880 | He just helped build the city by bringing people in, okay?
07:35:45.320 | And that's, and when I say Suarez is a V1,
07:35:47.920 | you know, I love Francis Suarez, I love what they're doing.
07:35:51.000 | The next iteration of that
07:35:52.560 | is to actually build the community itself
07:35:55.080 | rather than just kind of taking an existing Miami,
07:35:57.080 | you're building something that is potentially
07:35:58.760 | the scale of Miami, but as a digital community.
07:36:00.760 | And how many people is that?
07:36:01.960 | Well, like the Miami population is actually not that large.
07:36:04.840 | It's like 400 something thousand people.
07:36:07.200 | You could build a digital community like that.
07:36:08.920 | So if you have the engineering
07:36:12.920 | and you have the content creation
07:36:14.240 | and you build your own distribution,
07:36:15.240 | you own your own thing,
07:36:16.320 | you can become essentially a new kind of political leader
07:36:19.880 | where you just build a large enough online community
07:36:22.240 | that can crowdfund territory
07:36:23.480 | and you build your vision of the good.
07:36:25.480 | - And anybody could build the vision of the good.
07:36:30.760 | We're talking about eight billion people.
07:36:32.440 | I mean, there's no more inspiring message.
07:36:34.760 | I mean, sometimes when we look at how things are broken,
07:36:38.920 | there could be a cynical paralysis.
07:36:41.720 | - Right.
07:36:42.560 | - But ultimately this is a really empowering message.
07:36:45.360 | - Yes.
07:36:46.200 | I think there is a new birth of global freedom
07:36:51.400 | and that in the fullness of time,
07:36:53.280 | people will look at the internet
07:36:55.000 | as being to the Americas what the Americas were to Europe.
07:37:00.000 | A new world, okay?
07:37:02.300 | In the sense of this cloud continent has just come down,
07:37:06.560 | okay, and people are, you know,
07:37:08.200 | if you spend 50% of your waking hours looking at a screen,
07:37:10.800 | 20%, you're spending all this time commuting up
07:37:12.920 | to the cloud in the morning and coming back down.
07:37:14.680 | You're doing these day trips
07:37:15.520 | and it's got a different geography
07:37:17.080 | and all these people are near each other
07:37:18.320 | that were far in the physical world and vice versa, right?
07:37:21.240 | And so this will, 'cause it's this new domain,
07:37:25.560 | it gives rise to virtual worlds
07:37:26.920 | that eventually become physical.
07:37:28.360 | In the same way that most people don't know this that well,
07:37:30.360 | but, you know, the Americas really shaped the old world.
07:37:33.960 | Many concepts like the ultra capitalism
07:37:36.520 | and ultra democracy of the new world,
07:37:38.240 | the French Revolution was in part,
07:37:39.360 | I mean, that was a bad version, okay?
07:37:40.920 | But that was in part inspired by the American Revolution, okay?
07:37:43.240 | There are many movements that came back
07:37:45.060 | to the old world that started here.
07:37:46.680 | In the same way, you know,
07:37:48.240 | I don't call it the mainstream media anymore.
07:37:50.120 | You know what I call it?
07:37:50.960 | The downstream media,
07:37:52.240 | because it's downstream of the internet.
07:37:54.400 | - That's right. - Right?
07:37:55.560 | - That's right.
07:37:56.400 | - And, you know, there's this guy a while back
07:37:58.000 | who had this meme called
07:37:59.080 | the one-kiloyear American empire,
07:38:00.600 | that everything's American and so on.
07:38:02.280 | And his, I think, fundamental category error
07:38:04.120 | is he considers the internet to be American.
07:38:06.200 | But you know why that's not the case?
07:38:07.600 | Because, and it'll be very obviously so,
07:38:09.840 | I think in five or 10 years.
07:38:12.540 | Because the majority of English speakers online
07:38:15.840 | by about 2030 are gonna be Indian.
07:38:19.960 | Okay?
07:38:21.160 | They just got 5G LTE super cheap internet recently,
07:38:25.280 | the last few years.
07:38:26.120 | It's like one of the biggest stories in the world
07:38:27.520 | that's not really being told that much, okay?
07:38:29.880 | And they've been lurking.
07:38:31.040 | And here's the thing.
07:38:32.120 | And this took me a long time, you know,
07:38:33.840 | not so much to figure out as to communicate.
07:38:36.320 | I actually realized this in 2013,
07:38:37.600 | but these folks don't type with an accent.
07:38:41.940 | Okay, they speak with an accent,
07:38:44.400 | but they don't type with an accent.
07:38:46.240 | And all the way back in 2013,
07:38:47.760 | when I taught this Coursera course,
07:38:50.000 | I was like, who are these folks?
07:38:50.880 | I had hundreds of thousands of people
07:38:51.920 | from around the world sign up.
07:38:52.760 | It was a very popular course even then, okay?
07:38:54.920 | And hundreds of thousands of people signed up.
07:38:57.280 | I was like, who are these folks?
07:38:58.860 | And there were like Polish guys and, you know,
07:39:02.080 | like this lady from Brazil.
07:39:05.600 | And they knew Scumbag Steve and Good Guy Greg,
07:39:09.220 | but they didn't know the Yankees or hot dogs
07:39:13.560 | or all the offline stuff of America.
07:39:15.840 | They didn't know physical America.
07:39:17.000 | They knew the digital conversation, the Reddit conversation
07:39:20.480 | and, you know, what became the Twitter conversation.
07:39:22.280 | For example, I just saw this YouTube video
07:39:25.600 | where there's an Indian founder.
07:39:27.360 | And he just said, just casually like,
07:39:29.200 | "Oh, I slid into his DMs like this," right?
07:39:32.200 | It was kind of a joke, but he said in an Indian accent
07:39:34.760 | and everybody laughed, everybody knew what he meant.
07:39:36.520 | And you're like, wait, that is a piece
07:39:38.320 | of what people think of as American internet slang.
07:39:41.200 | That's actually internet slang,
07:39:43.460 | which will soon be said mostly by non-Americans.
07:39:46.040 | Now, what does that mean?
07:39:46.880 | That means that just like the US was a branch of the UK
07:39:50.640 | and it started with English.
07:39:51.520 | And certainly there's lots of antecedents
07:39:53.120 | you can trace back to England.
07:39:54.560 | But nowadays, most Americans are not English in ancestry.
07:39:58.920 | There's Germans and Italians, Jewish people,
07:40:00.680 | African-Americans, you know, everybody, right?
07:40:03.000 | In the same way, the internet is much more representative
07:40:05.480 | of the world than the USA is.
07:40:07.160 | - It may have started American,
07:40:08.660 | but it got forked by the rest of the world.
07:40:10.960 | - That's right.
07:40:11.780 | And it gives a global equality of opportunity.
07:40:14.960 | It's even more capitalist than America is.
07:40:16.960 | It's even more democratic than America is,
07:40:19.280 | just as America is more capitalist
07:40:20.560 | and democratic than the UK.
07:40:22.600 | - The meme has escaped the cage of its captor.
07:40:26.880 | - And by the way, that doesn't mean I'm,
07:40:28.480 | so I wanna be very clear about something.
07:40:30.760 | When I say this kind of stuff, people will be like,
07:40:32.160 | oh my God, you hate America so much.
07:40:35.000 | And that's not at all what I'm saying.
07:40:36.960 | It's like, first, take Britain, okay?
07:40:39.920 | Would you think of the US or Israel or India or Singapore
07:40:44.320 | as being anti-British?
07:40:45.920 | Not today, they're post-British, right?
07:40:48.700 | In fact, they're quite respectful to,
07:40:50.120 | I mean, look at the Queen and so on.
07:40:51.280 | People respect the UK and so on.
07:40:52.880 | Everyone's coming there to pay their respects.
07:40:54.560 | - That might not be the greatest example, but yes, go on.
07:40:57.080 | - Well, let's put it like this.
07:40:57.920 | - Yes, but yes, broadly speaking.
07:40:59.280 | - They're not like burning the British flag
07:41:00.760 | and effigy or anything.
07:41:01.600 | I mean, essentially, the point is each of these societies
07:41:03.760 | is kind of moving along their own axis.
07:41:05.360 | They're not defining every action
07:41:06.840 | in terms of whether they're pro-British or anti-British.
07:41:09.720 | Like, once you have kind of a healthy distance,
07:41:12.320 | people can respect all the accomplishments of the UK
07:41:15.360 | while also being happy that you're no longer run by them.
07:41:19.480 | And then you can have like a better
07:41:22.320 | kind of arms-length relationship, right?
07:41:24.520 | And that's what post-British means.
07:41:26.000 | It is not anti-British, not at all.
07:41:27.520 | In fact, you can respect it while also being happy
07:41:30.240 | that you've got your own sovereignty, right?
07:41:32.600 | And you're happy that Britain is doing its own thing.
07:41:35.280 | I'm glad they're doing well, right?
07:41:37.640 | Okay, and they're actually doing
07:41:39.740 | some special economic zone stuff now.
07:41:41.600 | And in the same way, if you think of it
07:41:43.960 | as not being pro-American or anti-American,
07:41:46.600 | 'cause that's a with us or against us
07:41:47.960 | formulation of George Bush, you know?
07:41:49.880 | Like, rather than just everything must be scored
07:41:52.000 | as pro-American or anti-American,
07:41:53.600 | you can think of post-American,
07:41:55.000 | that not everything has to be scored on that axis.
07:41:57.520 | Like, you know, there are certain things around the world
07:42:02.520 | which should be able to exist on their own,
07:42:04.320 | and you should be able to move along your own axis.
07:42:06.560 | Like, perhaps an obvious example:
07:42:09.360 | is longevity pro-American or anti-American?
07:42:13.640 | You know, no, it's like, it's on its own axis.
07:42:15.920 | It's moving on its own axis.
07:42:17.320 | And new states and new countries should be able to exist
07:42:20.920 | that do not have to define themselves as anti-American
07:42:24.400 | to do so, they're just post-American.
07:42:26.840 | Friendly to, but different from.
07:42:28.760 | That is totally possible to do,
07:42:30.040 | and we've got examples of that, right?
07:42:31.800 | And so when I talk about this,
07:42:33.440 | what I'm talking about is really in many ways
07:42:35.880 | US and Western ideals, you know?
07:42:38.560 | But manifested in just a different form, right?
07:42:42.120 | And also, crucially, integrative of global ideals.
07:42:45.840 | You know, these, in a sense,
07:42:47.160 | are global human rights, they're global values,
07:42:51.160 | which is freedom of speech, private property,
07:42:53.160 | protection from search and seizure.
07:42:54.960 | And actually, so that's all the Bill of Rights type stuff.
07:42:57.640 | And I saw something that I thought was really good recently
07:42:59.560 | that's a good first cut.
07:43:02.160 | That's something that I might wanna include.
07:43:03.600 | I credit him, of course, in the v2 of the book,
07:43:06.240 | a digital Bill of Rights, okay?
07:43:08.320 | And so this was a really good,
07:43:11.000 | decent first cut at a digital Bill of Rights, okay?
07:43:13.320 | And he talks about the right to encrypt,
07:43:15.880 | the right to compute, the right to repair,
07:43:18.080 | the right to portability, right?
07:43:19.680 | So encrypt is perhaps obvious,
07:43:21.640 | you know, e-commerce and everything.
07:43:23.640 | Compute, like your device, it's not like,
07:43:26.320 | you can't just have somebody intercept it
07:43:28.120 | or shut down your floating points.
07:43:31.600 | That might sound stupid,
07:43:32.920 | but in the EU, they're trying to regulate AI.
07:43:36.280 | And by doing that, they have some regulation
07:43:37.760 | that says like logic is itself regulated.
07:43:42.120 | Did you see this?
07:43:42.960 | - No, it's hilarious.
07:43:43.800 | - Hold on, I'll click the tweet
07:43:45.720 | that I sent you just before this one, right?
07:43:48.400 | So I was like, you know, in woke America,
07:43:52.040 | they're abolishing accelerated math
07:43:53.680 | 'cause math is quote white supremacist.
07:43:56.040 | Not to be outdone, Europe seeks to regulate AI
07:43:58.320 | by regulating logic itself.
07:43:59.840 | You can't reason without a license, right?
07:44:02.520 | Article three, for purposes of this regulation,
07:44:04.240 | the following definitions apply.
07:44:06.680 | AI system is software that's developed
07:44:08.920 | with one or more of the techniques
07:44:09.840 | and approaches listed in Annex 1.
07:44:11.160 | And you know what's in Annex 1?
07:44:12.840 | In Annex 1, logic and knowledge-based approaches.
07:44:16.200 | (laughs)
07:44:17.920 | So step away from the if statement.
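00:00:00.000 | [Editor's note: the "if statement" joke can be made concrete. As a hedged illustration only, not legal analysis, here is a toy forward-chaining rule engine. All names in it are invented for the example. Under a broad reading of Annex I's "logic and knowledge-based approaches," even plain if-style rules like these could arguably fall within the quoted definition of an "AI system."]

```python
# A minimal "knowledge-based approach": forward chaining over
# hand-written rules. This is the classic expert-system pattern,
# and it is also, at bottom, just if statements.

RULES = [
    # (condition over known facts, fact to conclude)
    (lambda f: f.get("temperature_c", 0) > 38.0, "fever"),
    (lambda f: f.get("fever") and f.get("cough"), "see_doctor"),
]

def infer(facts):
    """Apply rules repeatedly until no new facts are derived."""
    facts = dict(facts)  # don't mutate the caller's dict
    changed = True
    while changed:
        changed = False
        for condition, conclusion in RULES:
            if condition(facts) and not facts.get(conclusion):
                facts[conclusion] = True
                changed = True
    return facts

result = infer({"temperature_c": 39.2, "cough": True})
print(sorted(k for k, v in result.items() if v is True))
# prints ['cough', 'fever', 'see_doctor']
```

The punchline of the passage is that this pattern describes everything from 1980s expert systems to a thermostat, which is why defining "AI system" by the techniques in Annex I sweeps in ordinary software.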
07:44:20.680 | - Right.
07:44:21.520 | - Okay, and the thing is, you know,
07:44:24.040 | if you've dealt with these bureaucracies,
07:44:25.560 | the stupidest possible interpretation,
07:44:27.480 | I mean, think about, if you think,
07:44:29.080 | oh no, no, that wouldn't make any sense,
07:44:30.360 | they wouldn't do that.
07:44:31.280 | The entire web has been uglified
07:44:32.960 | by the stupid cookie thing
07:44:34.040 | that does absolutely nothing, right?
07:44:35.600 | The actual way to protect privacy
07:44:36.880 | is with user-local data,
07:44:38.840 | meaning like decentralized systems, right?
07:44:41.600 | Where the private keys are local.
07:44:42.760 | - Now I'm just laughing at the layers of absurdity
07:44:46.280 | in this step away from the if statement.
07:44:49.160 | I mean, it's hilarious.
07:44:50.920 | It's very, very clumsy.
07:44:52.800 | - They wanna be-- - It's us struggling
07:44:56.120 | with how to define, yeah, the digital Bill of Rights, I suppose,
07:44:59.680 | and doing it so extremely clumsily.
07:45:01.560 | - It's funny, you know, the European,
07:45:02.840 | like I heard this thing, which is like,
07:45:04.160 | Europe's like, well, look, the US and China
07:45:05.840 | are way ahead of us in AI,
07:45:06.960 | but we're gonna be a leader in AI regulation.
07:45:09.080 | - Oh yeah, yeah, yeah.
07:45:11.400 | And something we haven't mentioned much of
07:45:13.720 | in this whole conversation,
07:45:15.080 | I think maybe implied between the lines
07:45:18.240 | is the thing that was in the Constitution
07:45:21.400 | of the pursuit of happiness,
07:45:22.760 | and the thing that is in many stories
07:45:25.440 | that we humans conjure up, which is love.
07:45:30.160 | - Oh yeah. - I think the thing
07:45:31.080 | that makes life worth living in many ways.
07:45:33.720 | But for that, you have to have freedom,
07:45:35.880 | you have to have stability,
07:45:38.960 | you have to have a society that's functioning
07:45:40.720 | so that humans can do what humans do,
07:45:42.480 | which is make friends, make family,
07:45:45.840 | make love, make beautiful things together as human beings.
07:45:52.240 | Balaji, this is like an incredible conversation.
07:45:57.000 | Thank you for showing an amazing future.
07:46:03.280 | I think really empowering to people
07:46:05.840 | because we can all be part of creating that future.
07:46:09.320 | And thank you so much for talking to me today.
07:46:12.040 | This was an incredible,
07:46:13.600 | obviously the longest conversation I've ever done,
07:46:15.760 | but also one of the most amazing, enlightening.
07:46:18.200 | Thank you.
07:46:19.040 | Thank you, brother, for everything you do.
07:46:20.840 | Thank you for inspiring all of us.
07:46:23.040 | - Well, Lex, this was great.
07:46:24.760 | And we didn't get through all the questions there.
07:46:28.280 | - We didn't. (Lex laughing)
07:46:29.520 | Just for the record, we didn't get,
07:46:31.540 | I would venture to say we didn't get through 50%.
07:46:35.560 | This is great, this is great.
07:46:37.120 | And I had to stop us from going too deep on any one thing,
07:46:42.120 | even though it was tempting, like those chocolates,
07:46:45.200 | those damn delicious-looking chocolates
07:46:48.160 | that was used as a metaphor about 13 hours ago,
07:46:51.800 | or however long ago we started the conversation.
07:46:55.120 | This was incredible.
07:46:56.360 | It was really brilliant.
07:46:57.200 | You're brilliant throughout on all those different topics.
07:47:00.280 | So yeah, thank you again for talking to me.
07:47:02.380 | - This is great.
07:47:03.220 | I really appreciate being here.
07:47:06.660 | - Thanks for listening to this conversation
07:47:08.220 | with Balaji Srinivasan.
07:47:10.140 | To support this podcast,
07:47:11.500 | please check out our sponsors in the description.
07:47:14.100 | And now let me leave you with some words from Ray Bradbury.
07:47:17.940 | "People ask me to predict the future,
07:47:20.420 | when all I want to do is to prevent it.
07:47:22.940 | Better yet, build it.
07:47:25.540 | Predicting the future is much too easy anyway.
07:47:28.460 | You look at the people around you,
07:47:30.260 | the street you stand on,
07:47:31.920 | the visible air you breathe,
07:47:34.120 | and predict more of the same.
07:47:36.960 | To hell with more.
07:47:38.520 | I want better."
07:47:39.960 | Thank you for listening, and hope to see you next time.
07:47:43.640 | (upbeat music)