
Daniel Schmachtenberger: Steering Civilization Away from Self-Destruction | Lex Fridman Podcast #191


Chapters

0:00 Introduction
1:31 Aliens and UFOs
20:15 Collective intelligence of human civilization
28:12 Consciousness
39:33 How much computation does the human brain perform?
43:12 Humans vs ants
50:30 Humans are apex predators
57:34 Girard's Mimetic Theory of Desire
77:31 We can never completely understand reality
80:54 Self-terminating systems
91:18 Catastrophic risk
121:30 Adding more love to the world
148:55 How to build a better world
166:07 Meaning of life
173:49 Death
179:29 The role of government in society
196:54 Exponential growth of technology
242:35 Lessons from my father
248:11 Even suffering is filled with beauty

Transcript

00:00:00.000 | The following is a conversation with Daniel Schmachtenberger,
00:00:03.080 | a founding member of the Consilience Project
00:00:05.780 | that is aimed at improving public sensemaking and dialogue.
00:00:09.320 | He's interested in understanding how we humans
00:00:12.640 | can be the best version of ourselves as individuals
00:00:15.400 | and as collectives at all scales.
00:00:18.800 | Quick mention of our sponsors,
00:00:20.480 | Ground News, NetSuite, Four Sigmatic,
00:00:23.560 | Magic Spoon, and BetterHelp.
00:00:25.800 | Check them out in the description to support this podcast.
00:00:29.480 | As a side note, let me say that I got a chance
00:00:31.520 | to talk to Daniel on and off the mic for a couple of days.
00:00:34.700 | We took a long walk the day before our conversation.
00:00:37.920 | I really enjoyed meeting him, just on a basic human level.
00:00:40.680 | We talked about the world around us
00:00:42.940 | with words that carried hope for us individual ants
00:00:46.200 | actually contributing something of value to the colony.
00:00:50.000 | These conversations are the reasons I love human beings,
00:00:52.920 | our insatiable striving to lessen the suffering in the world.
00:00:56.640 | But more than that, there's a simple magic
00:00:59.080 | to two strangers meeting for the first time
00:01:01.680 | and sharing ideas, becoming fast friends,
00:01:04.440 | and creating something that is far greater
00:01:06.320 | than the sum of our parts.
00:01:08.280 | I've gotten to experience some of that same magic
00:01:10.040 | here in Austin with a few new friends
00:01:12.440 | and in random bars in my travels across this country,
00:01:16.060 | where a conversation leaves me
00:01:17.800 | with a big stupid smile on my face
00:01:19.760 | and a new appreciation of this too short, too beautiful life.
00:01:24.220 | This is the Lex Fridman Podcast,
00:01:26.280 | and here is my conversation with Daniel Schmachtenberger.
00:01:30.120 | If aliens were observing Earth
00:01:33.680 | through the entire history, just watching us,
00:01:36.900 | and were tasked with summarizing what happened until now,
00:01:40.240 | what do you think they would say?
00:01:41.880 | What do you think they would write up in that summary?
00:01:43.880 | Like it has to be pretty short, less than a page.
00:01:46.960 | Like in "Hitchhiker's Guide,"
00:01:49.040 | (Daniel laughing)
00:01:50.040 | there's I think like a paragraph or a couple sentences.
00:01:52.960 | How would you summarize, sorry,
00:01:56.000 | how would the alien summarize, do you think,
00:01:58.320 | all of human civilization?
00:01:59.780 | - My first thoughts take more than a page.
00:02:04.400 | They'd probably distill it.
00:02:05.760 | 'Cause if they watched, well, I mean, first,
00:02:10.040 | I have no idea if their senses are even attuned
00:02:12.440 | to similar stuff to what our senses are attuned to,
00:02:15.280 | or what the nature of their consciousness is like
00:02:17.380 | relative to ours.
00:02:18.280 | And so let's assume that they're kind of like us,
00:02:21.100 | just technologically more advanced
00:02:22.440 | to get here from wherever they are.
00:02:23.880 | That's the first kind of constraint
00:02:25.200 | on the thought experiment.
00:02:27.280 | And then if they've watched throughout all of history,
00:02:29.780 | they saw the burning of Alexandria,
00:02:32.680 | they saw that 2000 years ago in Greece,
00:02:36.600 | we were producing things like clocks,
00:02:38.080 | the Antikythera mechanism,
00:02:39.360 | and then that technology got lost.
00:02:40.880 | They saw that there wasn't just a steady dialectic
00:02:44.240 | of progress.
00:02:45.280 | - So every once in a while,
00:02:46.200 | there's a giant fire that destroys a lot of things.
00:02:49.760 | There's a giant like commotion
00:02:53.020 | that destroys a lot of things.
00:02:54.260 | - Yeah, and it's usually self-induced.
00:02:56.640 | They would have seen that.
00:03:00.640 | And so as they're looking at us now,
00:03:03.840 | as we move past the nuclear weapons age
00:03:07.040 | into the full globalization,
00:03:08.800 | Anthropocene exponential tech age,
00:03:11.460 | still making our decisions relatively similarly
00:03:15.800 | to how we did in the stone age
00:03:18.280 | as far as rivalry game theory type stuff.
00:03:20.240 | I think they would think that this is probably
00:03:22.960 | most likely one of the planets
00:03:24.380 | that is not gonna make it to being intergalactic
00:03:26.340 | 'cause we blow ourselves up
00:03:27.340 | in the technological adolescence.
00:03:29.460 | And if we are going to,
00:03:30.580 | we're gonna need some major progress rapidly
00:03:35.580 | in the social technologies that can guide
00:03:39.780 | and bind and direct the physical technologies
00:03:42.700 | so that we are safe vessels
00:03:44.120 | for the amount of power we're getting.
00:03:45.980 | - Actually, "Hitchhiker's Guide" has an estimation
00:03:50.820 | about how much of a risk this particular thing poses
00:03:54.780 | to the rest of the galaxy.
00:03:57.540 | And I think, I forget what it was,
00:04:00.020 | I think it was medium or low.
00:04:02.380 | So their estimation would be that
00:04:05.860 | this species of ant-like creatures
00:04:08.540 | is not gonna survive long.
00:04:10.220 | There's ups and downs in terms of technological innovation.
00:04:13.840 | The fundamental nature of their behavior
00:04:16.180 | from a game theory perspective hasn't really changed.
00:04:18.940 | They have not learned in any fundamental way
00:04:21.420 | how to control and properly incentivize
00:04:26.120 | or properly do the mechanism design of games
00:04:30.220 | to ensure long-term survival.
00:04:32.660 | And then they move on to another planet.
00:04:35.660 | Do you think there is,
00:04:38.980 | in a more, slightly more serious question,
00:04:41.660 | do you think there's some number
00:04:44.140 | or perhaps a very, very large number
00:04:46.720 | of intelligent alien civilizations out there?
00:04:49.360 | - Yes, would be hard to think otherwise.
00:04:53.980 | I know, I think Bostrom had a new article not that long ago
00:04:58.000 | on why that might not be the case,
00:04:59.380 | that the Drake equation might not be
00:05:01.760 | the kind of end story on it.
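For reference, the Drake equation mentioned here is just a product of estimated factors. A minimal sketch follows; every numeric value is a purely illustrative placeholder, none of them come from the conversation.

```python
# Drake equation: N = R* · fp · ne · fl · fi · fc · L
# All values below are illustrative placeholders, not sourced estimates.
R_star = 1.5     # average star formation rate in the galaxy (stars/year)
f_p    = 0.9     # fraction of stars with planets
n_e    = 0.5     # potentially habitable planets per star with planets
f_l    = 0.1     # fraction of those where life arises
f_i    = 0.01    # fraction of those that develop intelligence
f_c    = 0.1     # fraction that become detectable/communicative
L      = 10_000  # years a detectable civilization persists

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(N)  # expected number of currently detectable civilizations in the galaxy
```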
00:05:03.780 | But when I look at the total number of Kepler planets
00:05:07.380 | just that we're aware of just galactically,
00:05:09.720 | and also like when those life forms
00:05:13.780 | were discovered in Mono Lake
00:05:15.080 | that didn't have the same six primary atoms,
00:05:17.500 | I think it had arsenic replacing phosphorus
00:05:19.420 | as one of the primary aspects of its energy metabolism,
00:05:23.080 | we get to think about that the building blocks
00:05:24.760 | might be more different.
00:05:26.300 | So the physical constraints
00:05:27.420 | even that the planets have to have might be more different.
00:05:30.660 | It seems really unlikely,
00:05:32.460 | not to mention interesting things that we've observed
00:05:36.860 | that are still unexplained.
00:05:38.200 | As you've had guests on your show discussing Tic Tac and--
00:05:42.500 | - Oh, the ones that have visited.
00:05:44.300 | - Yeah.
00:05:45.140 | - Well, let's dive right into that.
00:05:46.820 | What do you make sense of the rich human psychology
00:05:51.820 | of there being hundreds of thousands,
00:05:55.920 | probably millions of witnesses of UFOs
00:05:58.820 | of different kinds on Earth,
00:06:00.580 | most of which I presume are conjured up by the human mind
00:06:05.100 | through the perceptual system.
00:06:07.180 | Some number might be true,
00:06:09.140 | some number might be reflective of actual physical objects,
00:06:12.680 | whether it's drones or testing military technology
00:06:17.080 | that's secret or other worldly technology.
00:06:20.980 | What do you make sense of all of that?
00:06:22.300 | Because it's gained quite a bit of popularity recently.
00:06:25.460 | There's some sense of which that's us humans being hopeful
00:06:31.380 | and dreaming of other worldly creatures
00:06:38.380 | as a way to escape the dreariness
00:06:40.940 | of the human condition.
00:06:43.180 | But in another sense, it really could be
00:06:47.280 | something truly exciting that science
00:06:49.560 | should turn its eye towards.
00:06:53.520 | So where do you place it?
00:06:56.000 | - Speaking of turning eye towards,
00:06:57.460 | this is one of those super fascinating,
00:07:00.500 | actually super consequential possibly,
00:07:02.480 | topics that I wish I had more time to study
00:07:05.240 | and just haven't allocated.
00:07:06.280 | So I don't have firm beliefs on this
00:07:08.200 | 'cause I haven't got to study it as much as I want.
00:07:09.580 | So what I'm gonna say comes from a superficial assessment.
00:07:12.480 | While we know there are plenty of things
00:07:18.160 | that people thought of as UFO sightings
00:07:20.160 | that we can fully write off,
00:07:21.520 | we have other better explanations for them.
00:07:24.120 | What we're interested in is the ones
00:07:25.640 | that we don't have better explanations for
00:07:27.160 | and then not just immediately jumping
00:07:28.760 | to a theory of what it is,
00:07:31.560 | but holding it as unidentified
00:07:33.160 | and being curious and earnest.
00:07:36.560 | I think the Tic Tac one is quite interesting
00:07:39.900 | and made it in major media recently,
00:07:42.180 | but I don't know if you ever saw the disclosure project.
00:07:45.540 | A guy named Stephen Greer organized
00:07:49.160 | a bunch of mostly US military and some commercial flight,
00:07:53.660 | people who had direct observation
00:07:57.460 | and classified information disclosing it at a CNN briefing.
00:08:01.980 | And so you saw high-ranking generals,
00:08:05.060 | admirals, fighter pilots all describing things
00:08:07.380 | that they saw on radar with visual,
00:08:10.660 | with their own eyes or cameras,
00:08:14.740 | and also describing some phenomena
00:08:17.420 | that had some consistency across different people.
00:08:20.500 | And I find this interesting enough
00:08:23.060 | that I think it would be silly to just dismiss it.
00:08:25.560 | And specifically, we can ask the question,
00:08:30.320 | how much of it is natural phenomena,
00:08:31.860 | ball lightning or something like that?
00:08:34.260 | And this is why I'm more interested
00:08:35.900 | in what fighter pilots and astronauts
00:08:38.940 | and people who are trained in being able
00:08:43.420 | to identify flying objects
00:08:46.260 | and atmospheric phenomena have to say about it.
00:08:49.780 | I think the thing, then you could say,
00:08:53.560 | well, are they more advanced military craft?
00:08:57.220 | Is it some kind of human craft?
00:08:59.140 | The interesting thing that a number of them describe
00:09:02.540 | is something that's kind of like right angles at speed,
00:09:06.220 | or if not right angles, acute angles at speed,
00:09:08.580 | but something that looks like a different relationship
00:09:11.220 | to inertia than physics makes sense for us.
00:09:13.480 | I don't think that there are any human technologies
00:09:17.460 | that are doing that even in really deep
00:09:19.380 | underground black projects.
00:09:22.260 | Now, one could say, okay, well, could it be a hologram?
00:09:25.700 | Well, would it show up on radar if radar is also seeing it?
00:09:28.060 | And so I don't know.
00:09:30.820 | I think there's enough.
00:09:32.220 | I mean, and for that to be a massive coordinated PSYOP
00:09:36.180 | is as interesting and ridiculous in a way
00:09:40.980 | as the idea that it's UFOs
00:09:43.500 | from some extra planetary source.
00:09:45.880 | So it's up there on the interesting topics.
00:09:48.460 | - To me, if it is at all alien technology,
00:09:53.460 | it is the dumbest version of alien technologies.
00:09:57.420 | It's so far away.
00:09:58.460 | It's like the old, old crappy VHS tapes of alien technology.
00:10:02.920 | These are like crappy drones that just floated
00:10:05.620 | or even like space to the level of like space junk
00:10:09.500 | because it is so close to our human technology.
00:10:14.160 | We talk about it moves in ways that's unlike
00:10:17.540 | what we understand about physics,
00:10:18.700 | but it still has very similar kind of geometric notions
00:10:23.700 | and something that we humans can perceive with our eyes,
00:10:27.640 | all those kinds of things.
00:10:28.500 | I feel like alien technology most likely
00:10:31.820 | would be something that we would not be able to perceive,
00:10:35.660 | not because they're hiding,
00:10:36.700 | but because it's so far advanced
00:10:38.820 | that it would be much,
00:10:42.300 | it would be beyond the cognitive capabilities of us humans.
00:10:45.060 | Just as you were saying,
00:10:46.300 | as per your answer for aliens summarizing earth,
00:10:50.240 | the starting assumption
00:10:53.980 | is they have similar perception systems.
00:10:56.200 | They have similar cognitive capabilities
00:10:58.980 | and that very well may not be the case.
00:11:01.420 | Let me ask you about staying in aliens
00:11:05.180 | for just a little longer
00:11:06.900 | because I think it's a good transition
00:11:09.180 | in talking about governments and human societies.
00:11:11.840 | Do you think if a US government or any government
00:11:19.320 | was in possession of an alien spacecraft
00:11:24.460 | or of information related to alien spacecraft,
00:11:28.240 | they would have the capacity structurally?
00:11:33.240 | Would they have the processes?
00:11:38.720 | Would they be able to communicate
00:11:42.720 | that to the public effectively?
00:11:45.840 | Or would they keep it secret in a room
00:11:47.800 | and do nothing with it,
00:11:49.440 | both to try to preserve military secrets,
00:11:53.200 | but also because of the incompetence
00:11:54.900 | that's inherent to bureaucracies or either?
00:11:59.060 | - Well, we can certainly see
00:12:03.500 | when certain things become declassified
00:12:06.260 | 25 or 50 years later,
00:12:07.860 | that there were things
00:12:09.140 | that the public might've wanted to know
00:12:11.500 | that were kept secret for a very long time
00:12:14.340 | for reasons of at least supposedly national security,
00:12:18.160 | which is also a nice source of plausible deniability
00:12:21.940 | for people covering their ass
00:12:25.920 | for doing things that would be problematic
00:12:28.040 | and other purposes.
00:12:30.560 | There's a scientist at Stanford
00:12:37.960 | who supposedly got some material
00:12:42.120 | that was recovered from Area 51 type area,
00:12:45.480 | did analysis on it using, I believe, electron microscopy
00:12:48.720 | and a couple other methods
00:12:50.160 | and came to the idea that it was a nanotech alloy
00:12:54.080 | that was something we didn't currently have the ability
00:12:58.080 | to do, was not naturally occurring.
00:12:59.780 | So I've heard some things.
00:13:03.000 | And again, like I said,
00:13:03.840 | I'm not gonna stand behind any of these
00:13:05.520 | 'cause I haven't done the level of study
00:13:07.040 | to have high confidence.
00:13:08.600 | I think what you said also about
00:13:14.460 | would it be super low-tech alien craft,
00:13:18.920 | like would they necessarily move their atoms around in space
00:13:22.120 | or might they do something more interesting than that?
00:13:24.640 | Might they be able to have a different relationship
00:13:27.920 | to the concept of space or information or consciousness?
00:13:31.280 | One of the things that the craft supposedly do
00:13:35.000 | is not only accelerate and turn in a way
00:13:37.840 | that looks non-inertial, but also disappear.
00:13:40.440 | So there's a question as to,
00:13:42.000 | like the two are not necessarily mutually exclusive
00:13:44.880 | and it could be possible to,
00:13:46.640 | some people run a hypothesis
00:13:48.700 | that they create intentional amounts of exposure
00:13:51.480 | as an invitation of a particular kind.
00:13:54.100 | Who knows?
00:13:57.000 | Interesting field.
00:13:58.680 | - We tend to assume like SETI,
00:14:00.560 | that's listening out for aliens out there.
00:14:03.180 | I've just been recently reading more and more
00:14:06.720 | about gravitational waves
00:14:08.240 | and you have orbiting black holes that orbit each other.
00:14:13.240 | They generate ripples in space-time.
00:14:17.680 | For fun at night when I lay in bed,
00:14:21.520 | I think about what it would be like to ride those waves
00:14:23.920 | when they, not the low magnitude they are
00:14:27.320 | as when they reach Earth,
00:14:28.360 | but get closer to the black holes
00:14:30.400 | because it would basically be shrinking
00:14:32.820 | and expanding us in all dimensions, including time.
00:14:37.820 | So it's actually ripples through space-time
00:14:41.200 | that they generate.
00:14:42.800 | Why is it that you couldn't use that
00:14:46.360 | as it travels at the speed of light?
00:14:48.160 | Travels at a speed, which is a very weird thing to say
00:14:53.660 | when you're morphing space-time.
00:14:57.460 | You could argue it's faster than the speed of light.
00:15:02.080 | So if you're able to communicate by,
00:15:04.500 | to summon enough energy to generate black holes
00:15:08.580 | and to force them to orbit each other,
00:15:12.440 | why not travel as the ripples in space-time,
00:15:17.440 | whatever the hell that means,
00:15:19.600 | somehow combined with wormholes.
00:15:21.400 | So if you're able to communicate through,
00:15:23.320 | like we don't think of gravitational waves
00:15:26.940 | as something you can communicate with
00:15:28.600 | because the radio will have to be
00:15:31.520 | a very large size and very dense, but perhaps that's it.
00:15:35.880 | Perhaps that's one way to communicate.
00:15:39.120 | It's a very effective way.
00:15:40.800 | And that would explain,
00:15:43.640 | we wouldn't even be able to make sense of that,
00:15:46.960 | of the physics that results in an alien species
00:15:50.360 | that's able to control gravity at that scale.
00:15:53.380 | - I think you just jumped up the Kardashev scale so far.
00:15:56.760 | You're not just harnessing the power of a star,
00:15:59.080 | but harnessing the power of mutually rotating black holes.
00:16:02.660 | That's way above my physics pay grade to think about,
00:16:08.900 | including even non-rotating black hole versions
00:16:13.900 | of transwarp travel.
00:16:15.600 | I think, you can talk with Eric more about that.
00:16:21.560 | I think he has better ideas on it than I do.
00:16:23.800 | My hope for the future of humanity
00:16:26.200 | mostly does not rest in the near term
00:16:29.280 | on our ability to get to other habitable planets in time.
00:16:32.840 | - And even more than that,
00:16:34.560 | in the list of possible solutions
00:16:36.280 | of how to improve human civilization,
00:16:39.340 | orbiting black holes is not on the first page for you.
00:16:42.780 | - Not on the first page.
00:16:44.860 | - Okay, I bet you did not expect us
00:16:46.500 | to start this conversation here.
00:16:48.960 | But I'm glad the places it went.
00:16:50.560 | I am excited on a much smaller scale
00:16:55.460 | of Mars, Europa, or Titan,
00:16:59.740 | Venus potentially having very bacteria-like life forms.
00:17:05.660 | Just on a small human level,
00:17:10.620 | it's a little bit scary, but mostly really exciting
00:17:13.220 | that there might be life elsewhere,
00:17:15.880 | in the volcanoes, in the oceans,
00:17:17.960 | all around us, teeming, having little societies.
00:17:22.920 | And whether there's properties about that kind of life,
00:17:26.680 | that's somehow different than ours.
00:17:28.380 | I don't know what would be more exciting
00:17:29.660 | if those colonies of single-cell type organisms,
00:17:35.140 | what would be more exciting,
00:17:36.420 | if they're different or if they're the same?
00:17:39.260 | If they're the same, that means
00:17:41.780 | through the rest of the universe,
00:17:45.420 | there's life forms like us,
00:17:47.420 | something like us everywhere.
00:17:51.240 | If they're different, that's also really exciting
00:17:53.900 | 'cause there's life forms everywhere that are not like us.
00:17:58.860 | That's a little bit scary.
00:18:01.860 | - I don't know what's scarier, actually.
00:18:03.620 | (both laughing)
00:18:04.660 | I think-- - It's both scary
00:18:05.620 | and exciting no matter what, right?
00:18:07.700 | - The idea that they could be very different
00:18:09.940 | is philosophically very interesting
00:18:11.420 | for us to open our aperture on what life and consciousness
00:18:14.780 | and self-replicating possibilities could look like.
00:18:18.220 | The question on are they different or the same,
00:18:21.500 | obviously there's lots of life here
00:18:22.660 | that is the same in some ways and different in other ways.
00:18:25.560 | When you take the thing that we call an invasive species,
00:18:29.300 | it's something that's still pretty the same,
00:18:30.980 | hydrocarbon-based thing,
00:18:32.800 | but co-evolved with co-selective pressures
00:18:35.560 | in a certain environment,
00:18:36.400 | we move it to another environment,
00:18:37.440 | it might be devastating to that whole ecosystem
00:18:39.660 | 'cause it's just different enough
00:18:40.980 | that it messes up the self-stabilizing dynamics
00:18:44.160 | of that ecosystem.
00:18:45.000 | So the question of would they be different
00:18:50.000 | in ways where we could still figure out a way
00:18:53.020 | to inhabit a biosphere together or fundamentally not,
00:18:56.980 | fundamentally the nature of how they operate
00:19:00.480 | and the nature of how we operate
00:19:01.820 | would be incommensurable is a deep question.
00:19:05.020 | - Well, we offline talked about mimetic theory, right?
00:19:09.740 | It seems like if there were sufficiently different
00:19:12.460 | where we would not even,
00:19:14.580 | we can coexist on different planes,
00:19:17.660 | it seems like a good thing.
00:19:19.540 | If we're close enough together to where we'd be competing,
00:19:23.200 | then you're getting into the world of viruses and pathogens
00:19:26.220 | and all those kinds of things to where we would,
00:19:29.820 | one of us would die off quickly
00:19:32.180 | through basically mass murder without--
00:19:36.100 | - Even accidentally.
00:19:38.780 | - Even accidentally.
00:19:40.060 | - If we just had a self-replicating,
00:19:42.180 | single-celled kind of creature
00:19:43.980 | that happened to not work well
00:19:48.980 | for the hydrocarbon life that was here,
00:19:50.900 | that got introduced because it either output something
00:19:54.140 | that was toxic or utilized up the same resource too quickly
00:19:56.620 | and it just replicated faster and mutated faster.
00:19:59.980 | It wouldn't be a mimetic theory,
00:20:03.620 | conflict theory kind of harm,
00:20:04.940 | it would just be a von Neumann machine,
00:20:08.360 | a self-replicating machine
00:20:09.820 | that was fundamentally incompatible
00:20:12.060 | with these kinds of self-replicating systems
00:20:14.460 | with faster OODA loops.
00:20:16.180 | - For one final time, putting your alien/god hat on
00:20:19.900 | and you look at human civilization,
00:20:24.260 | do you think about the 7.8 billion people on Earth
00:20:28.100 | as individual little creatures,
00:20:30.420 | individual little organisms,
00:20:32.500 | or do you think of us as one organism
00:20:36.500 | with a collective intelligence?
00:20:40.240 | What's the proper framework through which to analyze it,
00:20:45.460 | again, as an alien?
00:20:46.540 | - So that I know where you're coming from
00:20:47.940 | when you have asked the question the same way
00:20:50.420 | before the Industrial Revolution,
00:20:51.900 | before the Agricultural Revolution
00:20:53.260 | when there were half a billion people
00:20:54.540 | and no telecommunications connecting them?
00:20:56.640 | - I would indeed ask the question the same way,
00:21:01.860 | but I would be less confident about your conclusions.
00:21:06.660 | It would be an actually more interesting way
00:21:11.020 | to ask the question at that time,
00:21:12.580 | but I would nevertheless ask it the same way, yes.
00:21:15.300 | - Well, let's go back further and smaller then.
00:21:17.980 | Rather than just a single human
00:21:19.780 | or the entire human species,
00:21:21.260 | let's look at a relatively isolated tribe.
00:21:26.260 | In the relatively isolated,
00:21:29.140 | probably sub Dunbar number, sub 150 people tribe,
00:21:34.020 | do I look at that as one entity
00:21:37.260 | where evolution is selecting for it
00:21:38.860 | based on group selection,
00:21:39.940 | or do I think of it as 150 individuals
00:21:42.540 | that are interacting in some way?
00:21:45.140 | Well, could those individuals exist without the group?
00:21:51.820 | The evolutionary adaptiveness of humans
00:21:56.100 | critically involved group selection,
00:21:59.940 | and individual humans alone
00:22:01.980 | trying to figure out stone tools and protection
00:22:04.620 | and whatever aren't what was selected for.
00:22:07.660 | And so I think the or is the wrong frame.
00:22:13.540 | I think it's individuals are affecting
00:22:18.420 | the group that they're a part of.
00:22:20.140 | They're also dependent upon
00:22:22.020 | and being affected by the group that they're part of.
00:22:24.900 | And so this now starts to get in deep
00:22:27.300 | into political theories also,
00:22:28.900 | which is theories that orient towards the collective
00:22:32.340 | at different scales,
00:22:33.180 | whether a tribal scale or an empire
00:22:34.900 | or a nation state or something.
00:22:36.380 | And ones that orient towards the individual liberalism
00:22:38.660 | and stuff like that.
00:22:39.980 | And I think there's very obvious failure modes
00:22:41.860 | on both sides.
00:22:43.180 | And so the relationship between them
00:22:45.260 | is more interesting to me than either of them.
00:22:47.020 | The relationship between the individual
00:22:48.540 | and the collective and the question
00:22:49.540 | around how to have a virtuous process between those.
00:22:52.060 | So a good social system would be one
00:22:54.420 | where the organism of the individual
00:22:56.060 | and the organism of the group of individuals
00:22:59.740 | is they're both synergistic to each other.
00:23:01.940 | So what is best for the individuals
00:23:03.260 | and what's best for the whole is aligned.
00:23:05.700 | - But there is nevertheless an individual.
00:23:08.940 | They're not, it's a matter of degrees, I suppose.
00:23:11.820 | But what defines a human more?
00:23:18.740 | The social network within which they've been brought up
00:23:23.740 | through which they've developed their intelligence
00:23:27.340 | or is it their own sovereign individual self?
00:23:30.880 | Like what's your intuition of how much,
00:23:35.280 | not just for evolutionary survival,
00:23:38.340 | but as intellectual beings,
00:23:41.100 | how much do we need others for our development?
00:23:44.460 | - Yeah, I think we have a weird sense of this today
00:23:48.500 | relative to most previous periods of sapien history.
00:23:53.340 | I think the vast majority of sapien history is tribal.
00:23:56.900 | Like depending upon your early human model,
00:24:00.580 | two or 300,000 years of Homo sapiens in little tribes
00:24:05.500 | where they depended upon that tribe for survival
00:24:08.420 | and excommunication from the tribe was fatal.
00:24:12.820 | I think they, and our whole evolutionary genetic history
00:24:16.120 | is in that environment.
00:24:17.020 | And the amount of time we've been out of it
00:24:18.300 | is relatively so tiny.
00:24:20.660 | And then we still depended upon extended families
00:24:24.460 | and local communities more.
00:24:26.100 | And the big kind of giant market complex
00:24:28.980 | where I can provide something to the market to get money,
00:24:33.020 | to be able to get other things from the market
00:24:34.440 | where it seems like I don't need anyone.
00:24:35.900 | It's almost like disintermediating our sense of need,
00:24:38.140 | even though your and my ability to talk to each other
00:24:43.140 | using these mics and the phones that we coordinated on
00:24:45.900 | took millions of people over six continents
00:24:48.060 | to be able to run the supply chains
00:24:49.540 | that made all the stuff that we depend on,
00:24:50.860 | but we don't notice that we depend upon them.
00:24:52.380 | They all seem fungible.
00:24:53.540 | If you take a baby,
00:24:58.140 | obviously you didn't even get to a baby without a mom.
00:25:00.260 | We depended upon each other, right?
00:25:02.560 | Without two parents at minimum
00:25:04.680 | and they depended upon other people.
00:25:06.480 | But if we take that baby and we put it out in the wild,
00:25:09.780 | it obviously dies.
00:25:11.320 | So if we let it grow up for a little while,
00:25:13.220 | the minimum amount of time
00:25:14.220 | where it starts to have some autonomy
00:25:15.660 | and then we put it out in the wild
00:25:16.780 | and this has happened a few times,
00:25:19.180 | it doesn't learn language
00:25:20.460 | and it doesn't learn the small motor articulation
00:25:26.660 | that we learn.
00:25:27.740 | It doesn't learn the type of consciousness
00:25:31.180 | that we end up having that is socialized.
00:25:33.220 | So I think we take for granted
00:25:37.260 | how much conditioning affects us.
00:25:39.440 | - Is it possible that it affects basically 90% of us
00:25:46.180 | or basically 99.9 or maybe the whole thing?
00:25:50.020 | The whole thing is the connection between us humans
00:25:52.580 | and that we're no better than apes
00:25:56.420 | without our human connections.
00:25:58.980 | Because thinking of it that way
00:26:01.500 | forces us to think very differently about human society
00:26:05.580 | and how to progress forward
00:26:07.580 | if the connections are fundamental.
00:26:09.940 | - I just have to object to the no better than apes
00:26:12.900 | 'cause better here, I think you mean a specific thing
00:26:15.060 | which means have capacities
00:26:16.260 | that are fundamentally different
00:26:17.340 | than I think apes also depend upon troops.
00:26:19.880 | And I think the idea of humans as better than nature
00:26:26.540 | in some kind of ethical sense
00:26:29.140 | ends up having heaps of problems.
00:26:30.300 | We'll table that, we can come back to it.
00:26:32.340 | But when we say what is unique about homo sapien capacity
00:26:35.380 | relative to the other animals
00:26:36.600 | we currently inhabit the biosphere with?
00:26:39.700 | And I'm saying it that way
00:26:40.820 | because there were other early hominids
00:26:42.500 | that had some of these capacities.
00:26:45.200 | We believe our tool creation and our language creation
00:26:50.040 | and our coordination are all kind of the results
00:26:52.120 | of a certain type of capacity for abstraction.
00:26:55.040 | And other animals will use tools
00:26:58.080 | but they don't evolve the tools they use.
00:26:59.760 | They keep using the same types of tools
00:27:01.360 | that they basically can find.
00:27:03.280 | So a chimp will notice that a rock can cut a vine
00:27:06.760 | that it wants to.
00:27:07.600 | And it'll even notice that a sharper rock
00:27:09.240 | will cut it better.
00:27:10.080 | And experientially it'll use the sharper rock.
00:27:12.400 | And if you even give it a knife
00:27:13.560 | it'll probably use the knife
00:27:14.600 | 'cause it's experiencing the effectiveness.
00:27:17.040 | But it doesn't make stone tools
00:27:19.140 | because that requires understanding
00:27:21.240 | why one is sharper than the other.
00:27:22.880 | What is the abstract principle called sharpness
00:27:25.920 | to then be able to invent a sharper thing?
00:27:28.200 | That same abstraction makes language
00:27:30.780 | and the ability for abstract representation
00:27:33.640 | which makes the ability to coordinate
00:27:35.880 | in a more advanced set of ways.
00:27:38.880 | So I do think our ability to coordinate with each other
00:27:41.320 | is pretty fundamental to the selection
00:27:43.440 | of what we are as a species.
00:27:45.640 | - I wonder if that coordination, that connection
00:27:49.320 | is actually the thing that gives birth to consciousness.
00:27:52.080 | That gives birth to, well let's start with self-awareness.
00:27:55.960 | - More like theory of mind.
00:27:57.200 | - Theory of mind, yeah.
00:27:58.640 | I mean I suppose there's experiments
00:28:01.560 | that show that there's other mammals
00:28:03.040 | that have a very crude theory of mind.
00:28:05.480 | I'm not sure, maybe dogs, something like that.
00:28:08.240 | But actually dogs probably has to do with
00:28:09.920 | that they co-evolved with humans.
00:28:12.400 | See it'd be interesting if that theory of mind
00:28:15.120 | is what leads to consciousness in the way we think about it.
00:28:20.120 | Is the richness of the subjective experience
00:28:23.560 | that is consciousness.
00:28:24.920 | I have an inkling sense that that only exists
00:28:28.400 | because we're social creatures.
00:28:30.880 | That doesn't come with the hardware and the software
00:28:34.880 | in the beginning.
00:28:36.400 | That's learned as an effective tool
00:28:41.400 | for communication almost.
00:28:43.160 | I think we think that consciousness is fundamental.
00:28:48.200 | Maybe it's not.
00:28:52.480 | A bunch of folks kind of criticize the idea
00:28:57.640 | that the illusion of consciousness is consciousness.
00:29:00.440 | That it is just a facade we use
00:29:03.240 | to help us construct theories of mind.
00:29:08.240 | You almost put yourself in the world
00:29:10.120 | as a subjective being.
00:29:12.040 | And that experience, you want to richly experience it
00:29:14.800 | as an individual person
00:29:16.440 | so that I could empathize with your experience.
00:29:20.000 | I find that notion compelling.
00:29:22.420 | Mostly because it allows you to then create robots
00:29:25.780 | that become conscious not by being quote unquote conscious
00:29:30.300 | but by just learning to fake it 'til they make it.
00:29:34.160 | Present a facade of consciousness
00:29:39.480 | with the task of making that facade
00:29:43.640 | very convincing to us humans
00:29:45.840 | and thereby it will become conscious.
00:29:48.160 | I have a sense that in some way
00:29:51.640 | that will make them conscious
00:29:53.320 | if they're sufficiently convincing to humans.
00:29:58.780 | Is there some element of that
00:30:00.380 | that you find convincing?
00:30:05.100 | This is a much harder set of questions
00:30:09.420 | and deep end of the pool
00:30:10.900 | than starting with the aliens was.
00:30:12.940 | We went from aliens to consciousness.
00:30:16.860 | This is not the trajectory I was expecting, nor you.
00:30:21.140 | But let us walk a while.
00:30:23.820 | We can walk a while
00:30:24.900 | and I don't think we will do it justice.
00:30:26.980 | So what do we mean by consciousness
00:30:30.720 | versus conscious self-reflective awareness?
00:30:34.240 | What do we mean by awareness, qualia, theory of mind?
00:30:38.060 | There's a lot of terms that we think of
00:30:39.680 | as slightly different things
00:30:41.620 | and subjectivity, first person.
00:30:45.740 | I don't remember exactly the quote
00:30:50.080 | but I remember when reading
00:30:52.300 | when Sam Harris wrote the book "Free Will"
00:30:54.040 | and then Dennett critiqued it.
00:30:56.280 | And then there was some writing back and forth
00:30:58.080 | between the two
00:30:59.160 | because normally they're on the same side
00:31:02.060 | of kind of arguing for critical thinking
00:31:05.960 | and logical fallacies and philosophy of science
00:31:08.560 | against supernatural ideas.
00:31:11.100 | And here Dennett believed
00:31:14.200 | there is something like free will.
00:31:15.600 | He is a determinist compatibilist
00:31:17.740 | but no consciousness, a radical eliminativist.
00:31:21.240 | And Sam was saying, no, there is consciousness
00:31:23.220 | but there's no free will.
00:31:24.240 | And that's like the most fundamental kinds
00:31:26.600 | of axiomatic senses they disagreed on
00:31:28.940 | but neither of them could say
00:31:29.780 | it was 'cause the other one didn't understand
00:31:30.880 | the philosophy of science or logical fallacies.
00:31:33.400 | And they kind of spoke past each other.
00:31:35.400 | And at the end, if I remember correctly,
00:31:36.800 | Sam said something that I thought was quite insightful
00:31:39.360 | which was to the effect of, it seems
00:31:42.200 | 'cause they weren't making any progress
00:31:43.760 | in shared understanding.
00:31:45.100 | It seems that we simply have different intuitions about this.
00:31:48.520 | And what you could see was that
00:31:52.360 | what the words meant, right?
00:31:54.280 | At the level of symbol grounding might be quite different.
00:31:57.400 | One of them might've had deeply different
00:32:01.640 | enough life experiences that what is being referenced
00:32:04.180 | and then also different associations
00:32:05.760 | of what the words mean.
00:32:06.640 | This is why when trying to address these things
00:32:09.280 | Charles Sanders Peirce said,
00:32:10.920 | "The first philosophy has to be semiotics
00:32:13.080 | "because if you don't get semiotics right,
00:32:14.900 | "we end up importing different ideas and bad ideas
00:32:17.680 | "right into the nature of the language that we're using."
00:32:20.240 | And then it's very hard to do epistemology
00:32:21.840 | or ontology together.
00:32:22.920 | So I'm saying this to say
00:32:25.920 | why I don't think we're gonna get very far
00:32:27.400 | is I think we would have to go very slowly
00:32:30.280 | in terms of defining what we mean by words
00:32:32.820 | and fundamental concepts.
00:32:34.760 | - Well, and also allowing our minds to drift together
00:32:38.280 | for a time so that our definitions of these terms align.
00:32:42.840 | I think there's a beauty that some people enjoy with Sam
00:32:49.460 | that he is quite stubborn on his definitions of terms
00:32:54.460 | without often clearly revealing that definition.
00:32:59.540 | So in his mind, he can, like,
00:33:01.460 | you could sense that he can deeply understand
00:33:03.620 | what he means exactly by a term
00:33:06.500 | like free will and consciousness.
00:33:08.180 | And you're right.
00:33:09.020 | He's very specific in fascinating ways
00:33:12.460 | that not only does he think that free will is an illusion,
00:33:18.420 | he thinks he's able, not thinks, he says
00:33:21.820 | he's able to just remove himself
00:33:24.020 | from the experience of free will
00:33:26.020 | and just be like for minutes at a time, hours at a time,
00:33:31.020 | like really experience as if he has no free will.
00:33:35.700 | Like he's a leaf flowing down the river.
00:33:39.480 | And given that,
00:33:43.020 | he's very sure that consciousness is fundamental.
00:33:45.820 | So here's this conscious leaf
00:33:48.300 | that's subjectively experiencing the floating
00:33:51.500 | and yet has no ability to control
00:33:54.260 | and make any decisions for itself.
00:33:56.660 | It's only the decisions have all been made.
00:34:01.300 | There's some aspect to which the terminology there
00:34:04.440 | perhaps is the problem.
00:34:06.500 | - So that's a particular kind of meditative experience.
00:34:09.020 | And the people in the Vedantic tradition
00:34:11.940 | and some of the Buddhist traditions
00:34:13.620 | thousands of years ago described similar experiences
00:34:15.820 | and somewhat similar conclusions,
00:34:17.180 | some slightly different.
00:34:18.380 | There are other types of phenomenal experience
00:34:23.780 | that are the phenomenal experience of pure agency.
00:34:28.660 | And like the Catholic theologian,
00:34:31.780 | but evolutionary theorist,
00:34:33.020 | Teilhard de Chardin describes this.
00:34:35.300 | And that rather than a creator agent God in the beginning,
00:34:38.980 | there's a creative impulse or a creative process.
00:34:41.340 | And he would go into a type of meditation
00:34:43.740 | that identified as the pure essence
00:34:45.380 | of that kind of creative process.
00:34:47.080 | And I think the types of experiences we've had,
00:34:53.140 | and then one, the types of experience we've had
00:34:55.860 | make a big deal to the nature of how we do symbol grounding.
00:34:58.600 | The other thing is the types of experiences we have
00:35:01.300 | can't not be interpreted
00:35:02.980 | through our existing interpretive frames.
00:35:05.220 | And most of the time our interpretive frames
00:35:06.980 | are unknown even to us, some of them.
00:35:09.340 | And so this is a tricky topic.
00:35:14.340 | So I guess there's a bunch of directions
00:35:17.260 | we could go with it,
00:35:18.100 | but I wanna come back to what the impulse was
00:35:21.420 | that was interesting around what is consciousness
00:35:24.180 | and how does it relate to us as social beings?
00:35:26.580 | And how does it relate to the possibility
00:35:29.260 | of consciousness with AIs?
00:35:31.340 | - Right, you're keeping us on track, which is wonderful.
00:35:34.860 | You're a wonderful hiking partner.
00:35:36.500 | - Okay. - Yes.
00:35:37.900 | Let's go back to the initial impulse
00:35:40.180 | of what is consciousness
00:35:41.560 | and how does the social impulse connect to consciousness?
00:35:44.760 | Is consciousness a consequence of that social connection?
00:35:50.700 | - I'm gonna state a position and not argue it
00:35:53.040 | 'cause it's honestly, like it's a long, hard thing to argue
00:35:56.300 | and we can totally do it another time if you want.
00:35:58.800 | I don't subscribe to consciousness
00:36:04.220 | as an emergent property of biology or neural networks.
00:36:10.760 | Obviously, a lot of people do.
00:36:12.340 | Obviously, the philosophy of science
00:36:15.220 | orients towards that in,
00:36:18.520 | not absolutely, but largely.
00:36:21.940 | I think of the nature of first person,
00:36:27.880 | the universe of first person, of qualia,
00:36:31.180 | as experience, sensation, desire, emotion, phenomenology,
00:36:38.900 | but the felt sense, not the, we say emotion
00:36:41.220 | and we think of a neurochemical pattern
00:36:42.800 | or an endocrine pattern.
00:36:44.160 | But all of the physical stuff, the third person stuff,
00:36:47.920 | has position and momentum and charge and stuff like that
00:36:51.600 | that is measurable, repeatable.
00:36:54.460 | I think of the nature of first person and third person
00:36:57.840 | as ontologically orthogonal to each other,
00:37:01.480 | not reducible to each other.
00:37:03.500 | They're different kinds of stuff.
00:37:06.760 | So I think about the evolution of third person
00:37:08.940 | that we're quite used to thinking about
00:37:10.780 | from subatomic particles to atoms to molecules to on and on.
00:37:14.460 | I think about a similar kind of and corresponding evolution
00:37:17.480 | in the domain of first person
00:37:18.680 | from the way Whitehead talked about
00:37:20.840 | kind of prehension or proto qualia
00:37:23.520 | in earlier phases of self-organization
00:37:25.460 | and to higher orders of it and that there's correspondence,
00:37:28.500 | but that neither like the idealists
00:37:31.820 | do we reduce third person to first person,
00:37:34.300 | which is what idealists do.
00:37:35.800 | Or neither like the physicalists
00:37:37.300 | or do we reduce first person to third person.
00:37:40.340 | Obviously, Bohm talked about an implicate order
00:37:44.920 | that was deeper than and gave rise
00:37:46.340 | to the explicate order of both.
00:37:48.380 | Nagel talks about something like that.
00:37:49.960 | I have a slightly different sense of that,
00:37:51.880 | but again, I'll just kind of not argue
00:37:53.980 | how that occurs for a moment.
00:37:56.180 | So rather than say, does consciousness emerge from,
00:37:59.280 | I'll talk about do higher capacities of consciousness
00:38:04.740 | emerge in relationship with?
00:38:07.260 | So it's not first person as a category
00:38:09.560 | emerging from third person,
00:38:10.940 | but increased complexity within the nature of first person
00:38:13.780 | and third person co-evolving.
00:38:15.260 | Do I think that it seems relatively likely
00:38:19.480 | that more advanced neural networks
00:38:21.140 | have deeper phenomenology, more complex,
00:38:24.580 | where it goes just from basic sensation to emotion,
00:38:29.380 | to social awareness, to abstract cognition,
00:38:33.100 | to self-reflexive abstract cognition?
00:38:34.900 | Yeah, but I wouldn't say
00:38:36.520 | that's the emergence of consciousness.
00:38:37.940 | I would say it's increased complexity
00:38:39.680 | within the domain of first person
00:38:41.140 | corresponding to increased complexity.
00:38:43.380 | And the correspondent should not
00:38:44.780 | automatically be seen as causal.
00:38:46.340 | We can get into the arguments
00:38:47.660 | for why that often is the case.
00:38:49.880 | So would I say that obviously the sapien brain
00:38:54.300 | is pretty unique and a single sapien now has that, right?
00:38:58.060 | Even if it took sapiens evolving in tribes
00:39:01.100 | based on group selection to make that brain.
00:39:03.680 | So the group made it, now that brain is there.
00:39:06.020 | Now, if I take a single person with that brain
00:39:08.380 | out of the group and try to raise them in a box,
00:39:10.780 | they'll still not be very interesting even with the brain.
00:39:13.500 | But the brain does give hardware capacities
00:39:16.860 | that if conditioned in relationship
00:39:18.980 | can have interesting things emerge.
00:39:21.860 | So do I think that the human biology,
00:39:26.180 | types of human consciousness
00:39:28.640 | and types of social interaction
00:39:30.500 | all co-emerged and co-evolved?
00:39:33.160 | - As a small aside, as you're talking about the biology,
00:39:36.880 | let me comment that I spent, this is what I do.
00:39:40.140 | This is what I do with my life.
00:39:41.460 | This is why I will never accomplish anything
00:39:43.180 | is I spent much of the morning
00:39:44.940 | trying to do research on how many computations
00:39:49.260 | the brain performs and how much energy it uses
00:39:52.820 | versus the state-of-the-art CPUs and GPUs.
00:39:56.540 | Arriving at about 20 quadrillion.
00:39:59.960 | So that's 2 times 10 to the 16 computations.
00:40:03.480 | So synaptic firings per second that the brain does.
00:40:07.680 | And that's about a million times faster than the,
00:40:11.100 | let's say the 20-thread state-of-the-art Intel CPU,
00:40:17.600 | the 10th generation.
00:40:21.200 | And then there's a similar calculation for the GPU,
00:40:25.760 | and I also ended up trying to compute
00:40:28.700 | that it takes about 10 watts to run the brain.
00:40:32.440 | And then what does that mean in terms of calories per day,
00:40:34.760 | kilocalories, that's about, for an average human brain,
00:40:39.760 | that's 250 to 300 calories a day.
00:40:44.460 | And so it ended up being a calculation
00:40:48.100 | where you're doing about 20 quadrillion calculations
00:40:54.620 | that are fueled by something like,
00:40:56.580 | depending on your diet, three bananas.
00:40:59.180 | So three bananas results in a computation
00:41:03.600 | that's about a million times more powerful
00:41:05.820 | than the current state of the art computers.
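A rough sketch of that arithmetic, using the ballpark figures stated in the conversation; the CPU throughput and calories-per-banana numbers are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope version of the estimate above.
synaptic_events_per_sec = 2e16   # "about 20 quadrillion" per second
cpu_ops_per_sec = 2e10           # assumed rough throughput of a 20-thread CPU
ratio = synaptic_events_per_sec / cpu_ops_per_sec
print(f"brain/CPU ratio ~ {ratio:.0e}")       # ~1e6, the "million times" claim

brain_power_watts = 10           # stated estimate; ~20 W is also commonly cited
joules_per_day = brain_power_watts * 86_400   # seconds in a day
kcal_per_day = joules_per_day / 4_184         # 1 kcal = 4184 J
print(f"~{kcal_per_day:.0f} kcal/day")        # roughly 200-400 kcal/day depending on wattage

kcal_per_banana = 100            # rough nutritional figure (assumption)
print(f"~{kcal_per_day / kcal_per_banana:.1f} bananas/day")
```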
00:41:08.580 | - Now let's take that one step further.
00:41:10.620 | There's some assumptions built in there.
00:41:12.380 | The assumption is that one,
00:41:14.580 | what the brain is doing is just computation.
00:41:17.540 | Two, the relevant computations are synaptic firings
00:41:20.940 | and that there's nothing other than synaptic firings
00:41:22.620 | that we have to factor.
00:41:24.980 | So I'm forgetting his name right now.
00:41:27.940 | There's a very famous neuroscientist at Stanford
00:41:32.820 | just passed away recently
00:41:34.060 | who did a lot of the pioneering work on glial cells
00:41:37.280 | and showed that, in his assessment, glial cells
00:41:40.060 | did a huge amount of the thinking, not just neurons.
00:41:42.260 | And it opened up this entirely different field
00:41:44.300 | of like what the brain is and what consciousness is.
00:41:46.620 | You look at Damasio's work on embodied cognition
00:41:48.960 | and how much of what we would consider consciousness
00:41:52.060 | or feeling is happening
00:41:52.900 | outside of the nervous system completely,
00:41:54.860 | happening in endocrine process
00:41:56.820 | involving lots of other cells and signal communication.
00:41:59.540 | You talk to somebody like Penrose
00:42:01.860 | who you've had on the show.
00:42:03.420 | And even though the Penrose-Hameroff conjecture
00:42:05.940 | is probably not right, is there something like that
00:42:08.320 | that might be the case
00:42:09.340 | where we're actually having to look at stuff happening
00:42:11.180 | at the level of quantum computation and microtubules.
00:42:13.820 | I'm not arguing for any of those.
00:42:16.820 | I'm arguing that we don't know
00:42:18.820 | how big the unknown unknown set is.
00:42:21.160 | - Well, at the very least,
00:42:22.700 | this has become like an infomercial for the human brain.
00:42:25.700 | But wait, there's more.
00:42:28.500 | At the very least, the three bananas
00:42:31.700 | buys you a million times--
00:42:33.380 | - At the very least.
00:42:34.220 | - At the very least. - It's impressive.
00:42:35.620 | - And then you could have,
00:42:38.340 | and then the synaptic firings we're referring to
00:42:40.580 | is strictly the electrical signals.
00:42:42.540 | It could be the mechanical transmission of information,
00:42:44.580 | there's chemical transmission of information,
00:42:46.500 | there's all kinds of other stuff going on.
00:42:49.300 | And there's memory that's built in
00:42:51.060 | that's also all tied in.
00:42:52.380 | Not to mention, which I'm learning more and more about,
00:42:55.400 | it's not just about the neurons.
00:42:58.640 | It's also about the immune system
00:43:00.460 | that's somehow helping with the computation.
00:43:02.260 | So it's the entirety and the entire body
00:43:05.460 | is helping with the computation.
00:43:06.900 | So the three bananas--
00:43:08.840 | - It could buy you a lot.
00:43:10.020 | - It could buy you a lot.
00:43:12.100 | But on the topic of sort of the greater degrees
00:43:18.340 | of complexity emerging in consciousness,
00:43:22.260 | I think few things are as beautiful and inspiring
00:43:26.140 | as taking a step outside of the human brain,
00:43:28.940 | just looking at systems or simple rules
00:43:32.540 | create incredible complexity.
00:43:36.260 | Not create, incredible complexity emerges.
00:43:40.020 | So one of the simplest things to do that with
00:43:44.100 | is cellular automata.
00:43:46.260 | And there's, I don't know what it is,
00:43:49.580 | and maybe you can speak to it.
00:43:51.100 | We can certainly, we will certainly
00:43:54.020 | talk about the implications of this,
00:43:55.500 | but there's so few things that are as awe-inspiring to me
00:44:00.500 | as knowing the rules of a system
00:44:05.860 | and not being able to predict what the heck it looks like.
00:44:08.380 | And it creates incredibly beautiful complexity
00:44:11.740 | that when zoomed out on,
00:44:14.320 | looks like there's actual organisms doing things
00:44:18.140 | that are much, that operate on a scale much higher
00:44:23.140 | than the underlying mechanism.
00:44:27.820 | So with cellular automata,
00:44:28.940 | that's cells that are born and die, born and die,
00:44:31.860 | and they only know about each other's neighbors.
00:44:34.420 | And there's simple rules that govern
00:44:35.780 | that interaction of birth and death.
00:44:38.000 | And then they create, at scale,
00:44:40.580 | organisms that look like they take up
00:44:43.740 | hundreds or thousands of cells, and they're moving.
00:44:46.900 | They're moving around, they're communicating,
00:44:48.580 | they're sending signals to each other.
00:44:50.860 | And you forget, at moments at a time,
00:44:54.380 | before you remember, that the simple rules on cells
00:44:59.060 | is all that it took to create that.
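As a concrete reference point, here is a minimal sketch of one canonical system of this kind, Conway's Game of Life, assuming that is the flavor of cellular automaton being described: each cell looks only at its eight neighbors, yet "gliders" and larger structures emerge that move across the grid.

```python
# Minimal Conway's Game of Life on a wrapping grid.
import numpy as np

def step(grid: np.ndarray) -> np.ndarray:
    # Count live neighbors by summing the eight shifted copies of the grid.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth: dead cell with exactly 3 neighbors. Survival: live cell with 2 or 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# A "glider": five live cells that, under these purely local rules,
# travel diagonally across the grid.
grid = np.zeros((20, 20), dtype=int)
for y, x in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:
    grid[y, x] = 1

for _ in range(40):
    grid = step(grid)
```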
00:45:00.980 | It's sad in that we can't come up
00:45:08.240 | with a simple description of that system
00:45:11.420 | that generalizes the behavior of the large organisms.
00:45:16.420 | We can only come up, we can only hope to come up
00:45:21.060 | with the thing, the fundamental physics,
00:45:23.100 | or the fundamental rules of that system, I suppose.
00:45:25.460 | It's sad that we can't predict.
00:45:27.360 | Everything we know about the mathematics of those systems,
00:45:29.840 | it seems like we can't, really in a nice way,
00:45:32.320 | like economics tries to do,
00:45:34.040 | to predict how this whole thing will unroll.
00:45:37.060 | But it's beautiful because how simple it is
00:45:39.940 | underneath it all.
00:45:40.940 | So what do you make of the emergence of complexity
00:45:47.300 | from simple rules?
00:45:48.900 | What the hell is that about?
00:45:50.820 | - Yeah, well, we can see that something like
00:45:53.180 | flocking behavior, the murmuration, can be computer-coded.
00:45:56.980 | It's not a very hard set of rules to be able to see
00:45:59.580 | some of those really amazing types of complexity.
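A minimal sketch of such flocking rules, in the style of a boids model: each agent reacts only to nearby agents through separation, alignment, and cohesion, and murmuration-like motion emerges. The specific radius and weights are illustrative assumptions, not values from the conversation.

```python
# Boids-style flocking with three local rules.
import numpy as np

N = 100
pos = np.random.rand(N, 2) * 100   # positions in a 100x100 box
vel = np.random.randn(N, 2)        # velocities

def step(pos, vel, radius=10.0, dt=0.1):
    new_vel = vel.copy()
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)
        near = (d < radius) & (d > 0)
        if near.any():
            cohesion   = pos[near].mean(axis=0) - pos[i]   # steer toward neighbors' center
            alignment  = vel[near].mean(axis=0) - vel[i]   # match neighbors' heading
            separation = (pos[i] - pos[near]).sum(axis=0)  # avoid crowding
            new_vel[i] += 0.01 * cohesion + 0.05 * alignment + 0.002 * separation
    return (pos + new_vel * dt) % 100, new_vel             # wrap around the box

for _ in range(200):
    pos, vel = step(pos, vel)
```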
00:46:03.060 | And the whole field of complexity science
00:46:06.420 | and some of the sub-disciplines like stigmergy
00:46:08.620 | are studying how following fairly simple responses
00:46:13.100 | to a pheromone signal do ant colonies do this amazing thing
00:46:16.140 | where what you might describe as the organizational
00:46:18.740 | or computational capacity of the colony
00:46:20.860 | is so profound relative to what each individual ant is doing.
00:46:25.140 | I am not anywhere near as well versed
00:46:29.240 | in the cutting edge of cellular automata.
00:46:31.180 | It is, unfortunately, one of those topics
00:46:33.900 | that I would like to get to and haven't,
00:46:35.260 | like ETs. Wolfram's "A New Kind of Science"
00:46:39.260 | I have only skimmed and read reviews of,
00:46:42.740 | and not read the whole thing or his newer work since.
00:46:45.380 | But his idea of the four basic kind of categories
00:46:50.460 | of emergent phenomena that can come from cellular automata
00:46:53.820 | and that one of them is kind of interesting
00:46:55.420 | and looks a lot like complexity,
00:46:58.900 | rather than just chaos or homogeneity
00:47:01.980 | or self-termination or whatever.
00:47:05.100 | I think this is very interesting.
00:47:10.180 | It does not instantly make me think that biology
00:47:15.460 | is operating on a similarly small set of rules
00:47:18.100 | or that human consciousness is.
00:47:19.340 | I'm not that reductionistly oriented.
00:47:22.060 | And so if you look at say Santa Fe Institute,
00:47:27.060 | one of the co-founders, Stuart Kauffman,
00:47:30.740 | his work, you should really get him on your show.
00:47:33.100 | So a lot of the questions that you like,
00:47:35.420 | one of Kauffman's more recent books after "Investigations"
00:47:39.140 | and some of the real fundamental stuff
00:47:40.420 | was called "Reinventing the Sacred"
00:47:41.740 | and it had to do with some of these exact questions
00:47:44.980 | in a kind of non-reductionist approach,
00:47:46.700 | but that is not just silly hippie-ism.
00:47:49.180 | And he was very interested in highly non-ergodic systems
00:47:53.020 | where you couldn't take a lot of behavior
00:47:55.740 | over a small period of time and predict
00:47:57.140 | what the behavior of subsets
00:47:58.620 | over a longer period of time would do.
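
To make "non-ergodic" concrete, here is a standard toy example sketched in Python (not Kauffman's own, and with assumed payoffs): a multiplicative bet whose ensemble average grows every round even though almost every individual trajectory shrinks, so averaging many short samples does not predict what any one path does over a long horizon.

```python
# Multiplicative coin-flip bet: wealth is multiplied by 1.5 on heads, 0.6 on tails.
# Expected multiplier per round is 0.5*1.5 + 0.5*0.6 = 1.05 (ensemble growth),
# but a typical path compounds at sqrt(1.5*0.6) ~= 0.95 per round (decay),
# so the ensemble average and the typical trajectory diverge badly over time.
import random

def simulate(rounds=100, trials=10000):
    finals = []
    for _ in range(trials):
        wealth = 1.0
        for _ in range(rounds):
            wealth *= 1.5 if random.random() < 0.5 else 0.6
        finals.append(wealth)
    ensemble_avg = sum(finals) / trials      # pulled up by rare huge winners
    typical = sorted(finals)[trials // 2]    # median: what most paths look like
    return ensemble_avg, typical

print(simulate())   # ensemble average far above 1, median far below 1
```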
00:48:01.460 | And then going further,
00:48:03.300 | someone who spent some time at Santa Fe Institute
00:48:05.180 | and then kind of made a whole new field
00:48:06.740 | that you should have on, Dave Snowden,
00:48:09.020 | who some people call the father of anthro-complexity
00:48:12.420 | or what is the complexity unique to humans?
00:48:14.580 | He says something to the effect of
00:48:17.820 | that modeling humans as termites really doesn't cut it.
00:48:20.420 | Like we don't respond exactly identically
00:48:24.100 | to the same pheromone stimulus using stigmergy
00:48:26.980 | like it works for flows of traffic
00:48:28.620 | and some very simple human behaviors,
00:48:30.500 | but it really doesn't work for trying to make sense
00:48:32.900 | of the creation of the Sistine Chapel, Picasso,
00:48:34.660 | and general relativity, and stuff like that.
00:48:37.580 | And it's because the termites are not doing abstraction,
00:48:41.500 | forecasting deep into the future
00:48:43.420 | and making choices now based on forecasts of the future,
00:48:46.100 | not just adaptive signals in the moment
00:48:47.700 | and evolutionary code from history.
00:48:49.660 | That's really different, right?
00:48:51.060 | Like making choices now that can factor
00:48:53.140 | deep modeling of the future.
00:48:56.100 | And with humans, our uniqueness one to the next
00:49:00.220 | in terms of response to similar stimuli
00:49:02.300 | is much higher than it is with a termite.
00:49:04.340 | One of the interesting things there
00:49:07.300 | is that their uniqueness is extremely low.
00:49:08.940 | They're basically fungible within a class, right?
00:49:11.140 | There's different classes,
00:49:12.100 | but within a class they're basically fungible
00:49:13.700 | and their system uses that,
00:49:15.100 | very high numbers and lots of loss, right?
00:49:19.580 | Lots of death and loss.
00:49:20.420 | - Do you think the termite feels that way?
00:49:21.980 | Don't you think we humans are deceiving ourselves
00:49:23.980 | about our uniqueness?
00:49:24.980 | Perhaps it doesn't just,
00:49:27.140 | isn't there some sense in which this emergence
00:49:28.940 | just creates different higher and higher levels
00:49:31.220 | of abstraction where at every layer,
00:49:33.460 | each organism feels unique?
00:49:35.820 | Is that possible?
00:49:36.820 | That we're all equally dumb but at different scales?
00:49:40.340 | - No, I think uniqueness is evolving.
00:49:42.180 | I think that hydrogen atoms are more similar to each other
00:49:48.580 | than cells of the same type are.
00:49:51.260 | And I think that cells are more similar to each other
00:49:53.340 | than humans are.
00:49:54.740 | And I think that highly K-selected species
00:49:58.340 | are more unique than R-selected species.
00:50:00.740 | So they're different evolutionary processes.
00:50:03.020 | The R-selected species where you have a whole,
00:50:05.780 | a lot of death and very high birth rates,
00:50:09.460 | you're not looking for as much individuality within
00:50:14.060 | or individual possible expression
00:50:16.060 | to cover the evolutionary search space within an individual.
00:50:18.660 | You're looking at it more in terms of a numbers game.
00:50:22.740 | So yeah, I would say there's probably more difference
00:50:25.100 | between one orca and the next
00:50:26.820 | than there is between one Cape buffalo and the next.
00:50:29.620 | - Given that, it would be interesting to get your thoughts
00:50:32.340 | about mimetic theory where we're imitating each other
00:50:36.580 | in the context of this idea of uniqueness.
00:50:42.140 | How much truth is there to that?
00:50:46.060 | How compelling is this worldview to you
00:50:49.580 | of Girardian mimetic theory of desire
00:50:54.220 | where maybe you can explain it from your perspective,
00:50:57.940 | but it seems like imitating each other
00:51:00.020 | is the fundamental property of the behavior
00:51:04.140 | of human civilization.
00:51:06.660 | - Well, imitation is not unique to humans, right?
00:51:09.100 | Monkeys imitate.
00:51:10.140 | So a certain amount of learning through observing
00:51:15.740 | is not unique to humans.
00:51:17.660 | Humans do more of it.
00:51:19.100 | It's actually kind of worth speaking to this for a moment.
00:51:24.060 | Monkeys can learn new behaviors, new...
00:51:26.620 | We've even seen teaching an ape sign language
00:51:29.980 | and then the ape teaching other apes sign language.
00:51:33.100 | So that's a kind of mimesis, right?
00:51:34.660 | Kind of learning through imitation.
00:51:36.660 | And that needs to happen if they need to learn
00:51:40.820 | or develop capacities that are not just coded
00:51:42.900 | by their genetics, right?
00:51:44.260 | So within the same genome,
00:51:45.860 | they're learning new things based on the environment.
00:51:48.700 | And so based on someone else having learned something first.
00:51:51.580 | And so let's pick it up.
00:51:53.220 | How much a creature is the result
00:51:57.100 | of just its genetic programming
00:51:58.500 | and how much it's learning is a very interesting question.
00:52:02.180 | And I think this is a place where humans really show up
00:52:04.900 | radically different than everything else.
00:52:06.860 | And you can see it in the neoteny,
00:52:09.660 | how long we're basically fetal.
00:52:13.580 | That the closest ancestors to us, if we look at a chimp,
00:52:17.940 | a chimp can hold on to its mother's fur
00:52:20.300 | while she moves around day one.
00:52:22.700 | And obviously we see horses up
00:52:24.260 | and walking within 20 minutes.
00:52:26.620 | The fact that it takes a human a year to be walking
00:52:29.380 | and it takes a horse 20 minutes
00:52:30.540 | and you say how many multiples of 20 minutes
00:52:32.220 | go into a year?
00:52:33.060 | Like that's a long period of helplessness
00:52:35.660 | that wouldn't work for a horse, right?
00:52:37.300 | Like they or anything else.
00:52:40.980 | And not only can we not hold on to mom in the first day,
00:52:44.980 | it's three months before we can move our head volitionally.
00:52:48.540 | So it's like, why are we embryonic for so long?
00:52:51.940 | Basically it's like it's still fetal on the outside.
00:52:56.780 | It had to be, because we couldn't keep growing inside
00:52:59.420 | and actually ever get out, with big heads
00:53:01.420 | and narrower hips from going upright.
00:53:03.260 | So here's a place where there's a co-evolution
00:53:07.700 | of the pattern of humans,
00:53:09.380 | specifically here, our neoteny
00:53:12.300 | and what that portends for learning,
00:53:15.100 | with our being tool making
00:53:17.660 | and environment modifying creatures.
00:53:19.900 | Which is because we have the abstraction to make tools,
00:53:23.500 | we change our environments more than other creatures
00:53:25.620 | change their environments.
00:53:26.700 | The next most environment modifying creature to us
00:53:29.340 | is like a beaver.
00:53:30.180 | And then you were in LA,
00:53:32.940 | you fly into LAX and you look at the just orthogonal grid
00:53:36.620 | going on forever in all directions.
00:53:38.940 | And we've recently come into the Anthropocene
00:53:41.860 | where the surface of the earth is changing
00:53:43.340 | more from human activity than geological activity
00:53:46.100 | and then beavers.
00:53:46.940 | And you're like, okay, wow,
00:53:47.780 | we're really in a class of our own
00:53:49.460 | in terms of environment modifying.
00:53:51.740 | So as soon as we started tool making,
00:53:57.620 | we were able to change our environments
00:54:00.020 | much more radically.
00:54:02.060 | We could put on clothes and go to a cold place.
00:54:05.060 | And this is really important because we actually went
00:54:07.980 | and became apex predators in every environment.
00:54:10.420 | We functioned like apex predators.
00:54:12.020 | The polar bear can't leave the Arctic, right?
00:54:14.460 | And the lion can't leave the Savannah
00:54:16.900 | and an orca can't leave the ocean.
00:54:18.220 | And we went and became apex predators
00:54:19.540 | in all those environments
00:54:20.500 | because of our tool creation capacity.
00:54:21.940 | We could become better predators
00:54:23.140 | than them adapted to the environment
00:54:24.660 | or at least with our tools adapted to the environment.
00:54:27.260 | - So then, with respect to any organism
00:54:31.380 | in any environment,
00:54:33.060 | we're incredibly good at becoming apex predators.
00:54:35.900 | - Yes, and nothing else can do that kind of thing.
00:54:39.540 | There is no other apex predator that,
00:54:42.980 | but see the other apex predator
00:54:44.260 | is only getting better at being a predator
00:54:45.980 | through evolutionary process that's super slow.
00:54:48.180 | And that super slow process
00:54:49.420 | creates co-selective process with their environment.
00:54:52.140 | So as the predator becomes a tiny bit faster,
00:54:54.420 | it eats more of the slow prey,
00:54:55.820 | the genes of the fast prey then breed,
00:54:57.500 | and the prey becomes faster.
00:54:58.860 | And so there's this kind of balancing.
00:55:01.220 | We in, because of our tool making,
00:55:03.220 | we increased our predatory capacity faster
00:55:05.620 | than anything else could increase its resilience to it.
00:55:08.460 | As a result, we started outstripping the environment
00:55:10.940 | and extincting species following stone tools
00:55:14.060 | and going and becoming apex predator everywhere.
00:55:15.580 | This is why we can't keep applying apex predator theories
00:55:17.700 | 'cause we're not just an apex predator.
00:55:18.740 | We're an apex predator,
00:55:19.580 | but we're something much more than that.
00:55:21.500 | Like just for an example,
00:55:23.860 | the top apex predator in the world, an orca.
00:55:26.140 | An orca can eat one big fish at a time, like one tuna,
00:55:30.180 | and it'll miss most of the time or one seal.
00:55:32.620 | And we can put a mile long drift net out on a single boat
00:55:37.420 | and pull up an entire school of them, right?
00:55:40.140 | We can deplete the entire oceans of them.
00:55:41.940 | That's not an orca, right?
00:55:43.020 | Like that's not an apex predator.
00:55:44.660 | And that's not even including
00:55:46.700 | that we can then genetically engineer different creatures.
00:55:49.620 | We can extinct species, we can devastate whole ecosystems.
00:55:52.860 | We can make built worlds that have no natural things
00:55:55.060 | that are just human built worlds.
00:55:56.260 | We can build new types of natural creatures, synthetic life.
00:55:59.180 | So we are much more like little gods
00:56:01.180 | than we are like apex predators now,
00:56:02.660 | but we're still behaving as apex predators.
00:56:04.260 | And little gods that behave as apex predators
00:56:06.100 | causes a problem, kind of core to my assessment of the world.
00:56:09.780 | - So what does it mean to be a predator?
00:56:13.020 | So a predator is somebody that effectively can mine
00:56:18.020 | the resources from a place, so for their survival,
00:56:22.220 | or is it also just purely like higher level objectives
00:56:27.660 | of violence? And can predators be predators
00:56:31.060 | towards each other, towards the same species?
00:56:34.740 | Like are we using the word predator sort of generally,
00:56:37.980 | which then connects to conflict and military conflict,
00:56:41.900 | violent conflict in the space of human species?
00:56:45.700 | - Obviously we can say that plants are mining
00:56:47.900 | the resources of their environment in a particular way,
00:56:50.260 | using photosynthesis to be able to pull minerals
00:56:52.820 | out of the soil and nitrogen and carbon out of the air
00:56:55.660 | and like that.
00:56:57.300 | And we can say herbivores are being able to mine
00:57:00.460 | and concentrate that.
00:57:01.360 | So I wouldn't say mining the environment
00:57:03.060 | is unique to predator.
00:57:04.300 | Predator is, you know,
00:57:07.460 | - Violence. - Generally being defined
00:57:12.900 | as mining other animals, right?
00:57:16.740 | We don't consider herbivores predators,
00:57:19.740 | but animal, which requires some type of violence capacity
00:57:24.300 | because animals move, plants don't move.
00:57:26.940 | So it requires some capacity to overtake something
00:57:31.060 | that can move and try to get away.
00:57:32.820 | We'll go back to the Girard thing,
00:57:35.740 | then we'll come back here.
00:57:37.500 | Why are we neotenous?
00:57:38.940 | Why are we embryonic for so long?
00:57:40.580 | Because are we, did we just move from the Savannah
00:57:45.780 | to the Arctic and we need to learn new stuff?
00:57:48.100 | If we came genetically programmed,
00:57:49.780 | we would not be able to do that.
00:57:51.580 | Are we throwing spears or are we fishing
00:57:53.820 | or are we running an industrial supply chain
00:57:56.480 | or are we texting?
00:57:57.320 | What is the adaptive behavior?
00:57:59.420 | Horses today in the wild and horses 10,000 years ago
00:58:02.180 | were doing pretty much the same stuff.
00:58:04.300 | And so since we make tools and we evolve our tools
00:58:08.260 | and then change our environment so quickly
00:58:10.380 | and other animals are largely the result
00:58:12.300 | of their environment,
00:58:13.140 | but we're environment modifying so rapidly,
00:58:16.040 | we need to come without too much programming
00:58:18.000 | so we can learn the environment we're in,
00:58:20.100 | learn the language, right?
00:58:21.620 | Which is gonna be very important to learn the toolmaking,
00:58:25.940 | learn the, and so we have a very long period
00:58:29.380 | of relative helplessness
00:58:31.420 | because we aren't coded how to behave yet
00:58:33.340 | because we're imprinting a lot of software
00:58:35.580 | on how to behave that is useful to that particular time.
00:58:38.660 | So our mimesis is not unique to humans,
00:58:41.420 | but the total amount of it is really unique.
00:58:44.340 | And this is also where the uniqueness can go up, right?
00:58:46.980 | Is because we are less just the result of the genetics
00:58:49.660 | and that means the kind of learning through history
00:58:52.460 | that they got coded in genetics
00:58:53.820 | and more the result of,
00:58:55.380 | it's almost like our hardware selected for software, right?
00:59:00.020 | Like if evolution is kind of doing these,
00:59:02.120 | think of as a hardware selection.
00:59:04.020 | I have problems with computer metaphors for biology,
00:59:06.140 | but I'll use this one here.
00:59:07.500 | That we have not had hardware changes
00:59:14.000 | since the beginning of sapiens,
00:59:15.760 | but our world is really, really different.
00:59:18.260 | And that's all changes in software, right?
00:59:20.480 | Changes in on the same fundamental genetic substrate,
00:59:24.840 | what we're doing with these brains and minds and bodies
00:59:27.880 | and social groups and like that.
00:59:30.560 | And so now Girard specifically was looking at
00:59:35.560 | when we watch other people talking, so we learn language,
00:59:41.480 | you and I would have a hard time learning Mandarin today
00:59:43.760 | or it'd take a lot of work.
00:59:44.680 | We'd be learning how to conjugate verbs and stuff,
00:59:46.600 | but a baby learns it instantly
00:59:47.880 | without anyone even really trying to teach it
00:59:49.520 | just through mimesis.
00:59:50.360 | So it's a powerful thing.
00:59:52.440 | They're obviously more neuroplastic than we are
00:59:54.460 | when they're doing that
00:59:55.300 | and all their attention is allocated to that.
00:59:57.160 | But they're also learning how to move their bodies
00:59:59.560 | and they're learning all kinds of stuff through mimesis.
01:00:02.440 | One of the things that Girard says
01:00:03.760 | is they're also learning what to want
01:00:05.600 | and they learn what to want.
01:00:07.880 | They learn desire by watching what other people want.
01:00:10.440 | And so intrinsic to this,
01:00:11.780 | people end up wanting what other people want.
01:00:13.940 | And if we can't have what other people have
01:00:18.020 | without taking it away from them,
01:00:20.040 | then that becomes a source of conflict.
01:00:21.680 | So the mimesis of desire
01:00:24.160 | is the fundamental generator of conflict
01:00:26.120 | and that then the conflict energy
01:00:29.880 | within a group of people will build over time.
01:00:32.840 | This is a very, very crude interpretation of the theory.
01:00:36.140 | - Can we just pause on that?
01:00:37.680 | For people who are not familiar
01:00:39.240 | and for me who hasn't,
01:00:41.360 | I'm loosely familiar but haven't internalized it,
01:00:43.700 | but every time I think about it,
01:00:44.540 | it's a very compelling view of the world,
01:00:46.520 | whether it's true or not.
01:00:48.520 | It's quite, it's like when you take
01:00:51.840 | everything Freud says as truth,
01:00:54.120 | it's a very interesting way to think about the world.
01:00:56.560 | In the same way,
01:00:57.840 | thinking about the mimetic theory of desire,
01:01:03.580 | that everything we want
01:01:05.520 | is imitation of other people's wants.
01:01:11.280 | We don't have any original wants.
01:01:13.320 | We're constantly imitating others.
01:01:15.760 | And so, and not just others,
01:01:18.640 | but others we're exposed to.
01:01:21.360 | So there's these little local pockets,
01:01:23.360 | however defined local,
01:01:25.040 | of people imitating each other.
01:01:27.400 | And one that's super empowering
01:01:29.540 | because then you can pick which group you can join.
01:01:33.160 | Like, what do you wanna imitate?
01:01:35.060 | (laughs)
01:01:35.960 | It's the old, whoever your friends are,
01:01:39.880 | that's what your life is gonna be like.
01:01:42.440 | That's really powerful.
01:01:43.680 | I mean, it's depressing that we're so unoriginal,
01:01:46.360 | but it's also liberating in that,
01:01:49.000 | if this holds true,
01:01:51.040 | that we can choose our life
01:01:52.560 | by choosing the people we hang out with.
01:01:54.600 | - So, okay.
01:01:56.840 | Thoughts that are very compelling,
01:01:58.520 | that seem like they're more absolute
01:02:00.200 | than they actually are,
01:02:01.320 | end up also being dangerous.
01:02:02.800 | We wanna-- - Communism?
01:02:03.840 | (laughs)
01:02:04.680 | - I'm gonna discuss here where I think we need
01:02:07.620 | to amend this particular theory.
01:02:10.240 | But specifically, you just said something
01:02:11.920 | that everyone who's paid attention
01:02:14.120 | knows is true experientially,
01:02:15.760 | which is who you're around affects who you become.
01:02:18.680 | And as libertarian and self-determining
01:02:22.560 | and sovereign as we'd like to be,
01:02:24.620 | everybody, I think, knows that if you got put
01:02:28.640 | in a maximum security prison,
01:02:30.320 | aspects of your personality would have to adapt
01:02:32.660 | or you wouldn't survive there, right?
01:02:34.520 | You would become different.
01:02:35.360 | If you grew up in Darfur versus Finland,
01:02:38.960 | you would be different with your same genetics.
01:02:40.680 | Like, just, there's no real question about that.
01:02:43.360 | And that even today, if you hang out in a place
01:02:47.520 | with ultra marathoners as your roommates
01:02:50.360 | or all people who are obese as your roommates,
01:02:53.660 | the statistical likelihood of what happens
01:02:55.480 | to your fitness is pretty clear, right?
01:02:56.980 | Like the behavioral science of this is pretty clear.
01:02:59.360 | So, the whole saying we are the average
01:03:02.320 | of the five people we spend the most time around.
01:03:04.320 | I think the more self-reflective someone is
01:03:06.560 | and the more time they spend by themselves
01:03:07.920 | in self-reflection, the less this is true,
01:03:09.720 | but it's still true.
01:03:10.720 | So, one of the best things someone can do
01:03:13.980 | to become more self-determined is be self-determined
01:03:16.760 | about the environments they wanna put themselves in.
01:03:18.880 | Because to the degree that there is some self-determination
01:03:21.340 | and some determination by the environment,
01:03:23.360 | don't be fighting an environment
01:03:24.900 | that is predisposing you in bad directions.
01:03:27.040 | Try to put yourself in an environment
01:03:28.500 | that is predisposing the things that you want.
01:03:30.920 | In turn, try to affect the environment
01:03:32.760 | in ways that predispose positive things
01:03:34.080 | for those around you.
01:03:35.120 | - Or perhaps also to, there's probably interesting ways
01:03:38.800 | to play with this.
01:03:39.640 | You could probably put yourself,
01:03:41.480 | like form connections that have this perfect tension
01:03:46.040 | in all directions to where you're actually free
01:03:48.320 | to decide whatever the heck you want,
01:03:49.720 | because the set of wants within your circle of interactions
01:03:54.720 | is so conflicting that you're free to choose whichever one.
01:03:59.140 | So, if there's enough tension,
01:04:00.380 | as opposed to everybody aligned like a flock of birds.
01:04:03.600 | - Yeah, I mean, you definitely want
01:04:05.200 | that all of the dialectics would be balanced.
01:04:09.340 | So, if you have someone who is extremely oriented
01:04:14.340 | to self-empowerment and someone who's extremely oriented
01:04:17.980 | to kind of empathy and compassion,
01:04:19.360 | both the dialectic of those is better
01:04:21.300 | than either of them on their own.
01:04:22.940 | If you have both of them being inhabited better than you
01:04:27.740 | by the same person, spending time around that person
01:04:29.660 | will probably do well for you.
01:04:31.400 | I think the thing you just mentioned is super important
01:04:34.700 | when it comes to cognitive schools,
01:04:36.880 | which is I think one of the fastest things people can do
01:04:41.860 | to improve their learning
01:04:43.380 | and their not just cognitive learning,
01:04:46.260 | but their meaningful problem-solving communication
01:04:50.860 | and civic capacity,
01:04:52.140 | capacity to participate as a citizen with other people
01:04:54.540 | and making the world better,
01:04:56.260 | is to be seeking dialectical synthesis all the time.
01:04:59.340 | And so, in the Hegelian sense, if you have a thesis,
01:05:03.580 | you have an antithesis.
01:05:06.240 | So, maybe we have libertarianism on one side
01:05:08.580 | and Marxist kind of communism on the other side.
01:05:10.700 | And one is arguing that the individual
01:05:13.820 | is the unit of choice.
01:05:16.620 | And so, we want to increase the freedom
01:05:20.060 | and support of individual choice,
01:05:21.480 | because as they make more agentic choices,
01:05:23.420 | it'll produce a better whole for everybody.
01:05:25.260 | The other side saying,
01:05:26.100 | well, the individuals are conditioned by their environment
01:05:27.980 | who would choose to be born into Darfur rather than Finland.
01:05:30.940 | So, we actually need to collectively make environments
01:05:36.380 | that are good,
01:05:37.280 | because that the environment conditions individuals.
01:05:40.000 | So, you have a thesis and an antithesis.
01:05:42.280 | And then Hegel's ideas, you have a synthesis,
01:05:44.560 | which is a kind of higher order truth
01:05:46.040 | that understands how those relate
01:05:48.000 | in a way that neither of them do.
01:05:50.120 | And so, it is actually at a higher order of complexity.
01:05:52.740 | So, the first part would be,
01:05:53.720 | can I steel man each of these?
01:05:55.480 | Can I argue each one well enough
01:05:57.080 | that the proponents of it are like, totally, you got that?
01:05:59.920 | And not just argue it rhetorically,
01:06:01.640 | but can I inhabit it where I can try to see
01:06:04.420 | and feel the world the way someone
01:06:06.280 | seeing and feeling the world that way would?
01:06:08.480 | 'Cause once I do,
01:06:09.740 | then I don't want to screw those people
01:06:11.360 | because there's truth in it, right?
01:06:13.160 | And I'm not gonna go back to war with them.
01:06:14.560 | I'm gonna go to finding solutions
01:06:16.140 | that could actually work at a higher order.
01:06:18.380 | If I don't go to a higher order, then there's war.
01:06:21.380 | And, but then the higher order thing would be,
01:06:23.280 | well, it seems like the individual does affect the commons
01:06:27.720 | and the collective and other people.
01:06:28.880 | It also seems like the collective conditions individuals,
01:06:31.900 | at least statistically.
01:06:33.100 | And I can cherry pick out the one guy
01:06:34.700 | who got out of the ghetto
01:06:36.340 | and pulled himself up by his bootstraps.
01:06:38.480 | But I can also say statistically
01:06:39.880 | that most people born into the ghetto
01:06:41.400 | show up differently than most people born into the Hamptons.
01:06:44.200 | And so, unless you wanna argue that
01:06:47.880 | and have you take your child from the Hamptons
01:06:49.460 | and put them in the ghetto,
01:06:50.300 | then like, come on, be realistic about this thing.
01:06:52.880 | So how do we make,
01:06:54.960 | we don't want social systems
01:06:56.340 | that make weak dependent individuals, right?
01:07:00.040 | The welfare argument,
01:07:01.120 | but we also don't want no social system
01:07:04.000 | that supports individuals to do better.
01:07:06.600 | We don't want individuals
01:07:09.120 | where their self-expression and agency
01:07:11.480 | fucks the environment and everybody else
01:07:13.220 | and employs slave labor and whatever.
01:07:15.600 | So can we make it to where individuals are creating wholes
01:07:20.520 | that are better for conditioning other individuals?
01:07:22.500 | Can we make it to where we have wholes
01:07:23.680 | that are conditioning increased agency and sovereignty?
01:07:26.960 | Right, that would be the synthesis.
01:07:28.240 | So the thing that I'm coming to here is,
01:07:30.820 | if people have that as a frame,
01:07:32.800 | and sometimes it's not just thesis and antithesis,
01:07:34.960 | it's like eight different views, right?
01:07:37.520 | Can I steel man each view?
01:07:39.400 | This is not just, can I take the perspective,
01:07:41.280 | but am I seeking them?
01:07:42.220 | Am I actively trying to inhabit other people's perspective?
01:07:46.400 | Then can I really try to essentialize it
01:07:49.440 | and argue the best points of it,
01:07:51.480 | both the sense-making about reality and the values,
01:07:54.480 | why these values actually matter?
01:07:56.120 | Then, just like I wanna seek those perspectives,
01:07:59.740 | then I wanna seek, is there a higher order
01:08:02.800 | set of understandings that could fulfill the values of
01:08:07.160 | and synthesize the sense-making
01:08:08.680 | of all of them simultaneously?
01:08:10.140 | Maybe I won't get it, but I wanna be seeking it,
01:08:12.000 | and I wanna be seeking progressively better ones.
01:08:14.460 | So this is perspective seeking, driving perspective taking,
01:08:18.740 | and then seeking synthesis.
01:08:22.000 | I think that one cognitive disposition
01:08:27.800 | might be the most helpful thing.
01:08:30.220 | - Would you put a title of dialectic synthesis
01:08:34.020 | on that process?
01:08:34.860 | 'Cause that seems to be such a part,
01:08:36.440 | so like this rigorous empathy.
01:08:38.480 | Like, it's not just empathy, it's empathy with rigor.
01:08:44.040 | Like you really want to understand
01:08:46.380 | and embody different worldviews,
01:08:48.380 | and then try to find a higher order synthesis.
01:08:50.880 | - Okay, so I remember last night you told me,
01:08:54.340 | when we first met, you said
01:08:57.220 | that you looked in somebody's eyes
01:08:59.060 | and you felt that you had suffered in some ways
01:09:01.100 | that they had suffered, and so you could trust them.
01:09:03.580 | Shared pathos, right, creates a certain sense
01:09:05.800 | of kind of shared bonding and shared intimacy.
01:09:07.940 | So empathy is actually feeling the suffering
01:09:10.800 | of somebody else, and feeling the depth of their sentience.
01:09:14.620 | I don't wanna fuck them anymore, I don't wanna hurt them.
01:09:16.880 | I don't want to behave,
01:09:18.180 | I don't want my proposition to go through
01:09:20.940 | when I go and inhabit the perspective of the other people,
01:09:23.140 | if they feel that's really gonna mess them up, right?
01:09:25.700 | And so the rigorous empathy,
01:09:27.900 | it's different than just compassion,
01:09:29.300 | which is I generally care.
01:09:31.060 | Like I have a generalized care,
01:09:32.500 | but I don't know what it's like to be them.
01:09:34.320 | I can never know what it's like to be them perfectly,
01:09:36.340 | and there's a humility you have to have,
01:09:38.200 | which is my most rigorous attempt is still not it.
01:09:42.440 | My most rigorous attempt, mine,
01:09:44.620 | to know what it's like to be a woman is still not it.
01:09:46.860 | I have no question that if I was actually a woman,
01:09:48.540 | it would be different than my best guesses.
01:09:50.460 | I have no question if I was actually black,
01:09:52.100 | it'd be different than my best guesses.
01:09:54.500 | So there's a humility in that which keeps me listening,
01:09:56.980 | 'cause I don't think that I know fully,
01:09:58.740 | but I want to, and I'm gonna keep trying better to.
01:10:01.460 | And then I want to across them,
01:10:03.680 | and then I wanna say,
01:10:04.520 | is there a way we can forward together
01:10:06.140 | and not have to be in war?
01:10:07.880 | It has to be something that could meet the values
01:10:10.540 | that everyone holds,
01:10:11.660 | that could reconcile the partial sensemaking
01:10:13.880 | that everyone holds,
01:10:15.060 | and that could offer a way forward that is more agreeable
01:10:18.820 | than the partial perspectives at war with each other.
01:10:21.220 | - But so the more you succeed at this empathy
01:10:24.100 | with humility,
01:10:25.540 | the more you're carrying the burden
01:10:27.140 | of other people's pain, essentially.
01:10:30.860 | - Now this goes back to the question of,
01:10:32.520 | do I see us as one being or 7.8 billion?
01:10:36.140 | I think the,
01:10:39.900 | if I'm overwhelmed with my own pain,
01:10:44.320 | I can't empathize that much,
01:10:46.600 | because I don't have the bandwidth,
01:10:47.900 | I don't have the capacity.
01:10:49.600 | If I don't feel like I can do something
01:10:51.280 | about a particular problem in the world,
01:10:52.820 | it's hard to feel it 'cause it's just too devastating.
01:10:55.900 | And so a lot of people go numb and even go nihilistic
01:10:58.980 | because they just don't feel the agency.
01:11:01.420 | So as I actually become more empowered as an individual
01:11:04.120 | and have more sense of agency,
01:11:05.660 | I also become more empowered to be more empathetic for others
01:11:08.380 | and be more connected to that shared burden
01:11:10.940 | and want to be able to make choices
01:11:12.740 | on behalf of and in benefit of.
01:11:15.620 | - So this way of living
01:11:19.900 | seems like a way of living
01:11:22.100 | that would solve a lot of problems in society
01:11:25.300 | from a cellular automata perspective.
01:11:27.320 | So if you have a bunch of little agents behaving in this way,
01:11:32.580 | my intuition, there'll be interesting complexities
01:11:35.020 | that emerge, but my intuition is
01:11:37.460 | it will create a society that's very different
01:11:39.780 | and recognizably better than the one we have today.
01:11:43.740 | How much, oh wait, hold that question,
01:11:48.460 | 'cause I want to come back to it,
01:11:49.300 | but this brings us back to Girard, which we didn't answer.
01:11:51.580 | The conflict theory.
01:11:52.580 | - Yes.
01:11:53.400 | - 'Cause about how to get past the conflict theory.
01:11:54.980 | - Yes, you know the Robert Frost poem about the two paths
01:11:57.300 | and you never had time to turn back to the other.
01:11:59.780 | We're gonna have to do that quite a lot.
01:12:01.340 | We're gonna be living that poem over and over again.
01:12:05.540 | But yes, how to, let's return back.
01:12:09.540 | - Okay, so the rest of the argument goes,
01:12:11.700 | you learn to want what other people want,
01:12:13.620 | therefore fundamental conflict based in our desire
01:12:16.580 | because we want the thing that somebody else has.
01:12:18.840 | And then people are, they're in conflict
01:12:22.100 | over trying to get the same stuff,
01:12:23.600 | power, status, attention, physical stuff,
01:12:25.980 | a mate, whatever it is.
01:12:27.620 | And then we learn the conflict by watching.
01:12:30.620 | And so then the conflict becomes mimetic.
01:12:32.180 | So the, and you know, we become on the Palestinian side
01:12:36.180 | or the Israeli side or the communist or capitalist side
01:12:38.100 | or the left or right politically or whatever it is.
01:12:40.860 | And until eventually the conflict energy in the system
01:12:43.860 | builds up so much that some type of violence
01:12:47.420 | is needed to get the bad guy,
01:12:48.980 | whoever it is that we're gonna blame.
01:12:50.460 | And you know, Girard talks about why scapegoating
01:12:52.940 | was kind of a mechanism to minimize the amount of violence.
01:12:55.700 | Let's blame a scapegoat as being more relevant
01:12:59.820 | than they really were.
01:13:00.640 | But if we all believe it, then we can all kind of
01:13:02.060 | calm down with the conflict energy.
01:13:03.820 | - It's a really interesting concept, by the way.
01:13:06.300 | I mean, you went, you beautifully summarized it,
01:13:08.660 | but the idea that there's a scapegoat,
01:13:10.340 | that there's a, this kind of thing
01:13:11.900 | naturally leads to a conflict.
01:13:13.460 | And then they find the other, some group that's the other,
01:13:17.020 | that's either real or artificial
01:13:18.860 | as the cause of the conflict.
01:13:20.940 | - Well, it's always artificial
01:13:22.460 | because the cause of the conflict,
01:13:23.900 | in Girard, is the mimesis of desire itself,
01:13:25.980 | and how do we attack that?
01:13:27.580 | How do we attack that it's our own desire?
01:13:30.180 | So this now gets to something more like Buddha said, right?
01:13:32.760 | Which was desire is the cause of suffering.
01:13:34.860 | Girard and Buddha would kind of agree in this way.
01:13:38.980 | - So, but that explains, I mean, again,
01:13:43.420 | it's a compelling description of human history
01:13:46.780 | that we do tend to come up with the other.
01:13:49.740 | And--
01:13:50.580 | - Okay, kind of.
01:13:51.420 | I just had such a funny experience
01:13:54.380 | with someone critiquing Girard the other day
01:13:56.020 | in such an elegant and beautiful and simple way.
01:13:59.700 | It's a friend who grew up Aboriginal Australian,
01:14:04.700 | and is a scholar of Aboriginal social technologies.
01:14:12.660 | And he's like, nah, man, Girard just made shit up
01:14:15.660 | about how tribes work.
01:14:16.880 | Like we come from a tribe,
01:14:18.140 | we've got tens of thousands of years,
01:14:20.260 | and we didn't have increasing conflict
01:14:22.100 | and then scapegoat and kill someone.
01:14:23.740 | We'd have a little bit of conflict
01:14:24.940 | and then we would dance and then everybody'd be fine.
01:14:27.940 | Like we'd dance around the campfire,
01:14:29.260 | everyone would like kind of physically get the energy out.
01:14:31.100 | We'd look in each other's eyes,
01:14:32.100 | we'd have positive bonding, and then we're fine.
01:14:34.580 | And nobody, no scapegoats.
01:14:36.260 | And--
01:14:37.100 | - I think that's called the Joe Rogan theory of desire,
01:14:40.300 | which is he's like all of human problems
01:14:43.540 | have to do with the fact
01:14:44.420 | that you don't do enough hard shit in your day.
01:14:46.900 | So maybe you could just dance it,
01:14:49.140 | 'cause he says like doing exercise
01:14:50.900 | and running on the treadmill gets all the demons out.
01:14:53.180 | Maybe just dancing gets all the demons out.
01:14:55.220 | - So this is why I say we have to be careful
01:14:57.100 | with taking an idea that seems too explanatory
01:15:00.460 | and then taking it as a given and then saying,
01:15:02.900 | well, now that we're stuck with the fact
01:15:04.500 | that conflict is inexorable because mimetic desire
01:15:08.940 | and therefore how do we deal with the inexorability
01:15:10.940 | of the conflict and how to sublimate violence?
01:15:12.940 | Well, no, the whole thing might be actually gibberish.
01:15:15.740 | Meaning it's only true in certain conditions
01:15:17.780 | and other conditions it's not true.
01:15:19.100 | So the deeper question is under which conditions
01:15:21.620 | is that true?
01:15:22.460 | Under which conditions is it not true?
01:15:23.820 | What do those other conditions make possible and look like?
01:15:26.300 | - And in general, we should stay away
01:15:27.500 | from really compelling models of reality
01:15:30.300 | because there's something about our brains
01:15:33.580 | that these models become sticky
01:15:35.260 | and we can't even think outside of them.
01:15:38.180 | - It's not that we stay away from them,
01:15:39.020 | it's that we know that the model of reality
01:15:40.540 | is never reality.
01:15:42.340 | That's the key thing.
01:15:43.820 | - Humility again, it goes back to just having the humility
01:15:46.340 | that you don't have a perfect model of reality.
01:15:48.380 | - There's an, the model of reality could never be reality.
01:15:52.020 | The process of modeling is inherently information reduction.
01:15:56.020 | And I can never show that the unknown
01:16:00.140 | unknown set has been factored.
01:16:01.840 | - It's back to the cellular automata.
01:16:05.700 | You can't put the genie back in the bottle.
01:16:10.180 | Like when you realize it's unfortunately,
01:16:13.540 | sadly impossible to create a model of a cellular automaton,
01:16:18.540 | even if you know the basic rules,
01:16:22.860 | that predicts to any degree of accuracy
01:16:26.220 | how that system will evolve,
01:16:30.700 | which is fascinating mathematically, sorry.
01:16:32.740 | I think about it quite a lot.
01:16:34.660 | It's very annoying.
01:16:36.100 | Wolfram has this rule 30,
01:16:39.300 | like you should be able to predict it.
01:16:41.860 | It's so simple, but you can't predict what's going to be,
01:16:46.380 | like there's a problem he defines:
01:16:48.860 | try to predict some aspect
01:16:50.680 | of the middle column of the system.
01:16:53.140 | Just anything about it, what's gonna happen in the future.
01:16:55.780 | And you can't, you can't.
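
For reference, simulating rule 30 itself takes only a few lines; the sketch below (width and step count are arbitrary choices) just generates the middle-column sequence that, as discussed, seems to admit no predictive shortcut other than running the system.

```python
# Elementary cellular automaton, Wolfram's rule 30, from a single black cell.
RULE, STEPS = 30, 32
WIDTH = 2 * STEPS + 1                      # wide enough that edges never matter here

def next_state(left, center, right):
    # The 3-cell neighborhood indexes a bit of the rule number.
    return (RULE >> ((left << 2) | (center << 1) | right)) & 1

row = [0] * WIDTH
row[WIDTH // 2] = 1

middle_column = []
for _ in range(STEPS):
    middle_column.append(row[WIDTH // 2])
    row = [next_state(row[i - 1], row[i], row[(i + 1) % WIDTH]) for i in range(WIDTH)]

print("".join(map(str, middle_column)))    # the sequence you seemingly have to compute to know
```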
01:16:58.300 | It sucks.
01:17:01.860 | 'Cause then we can't make sense of this world,
01:17:03.900 | you know, of reality in a definitive way.
01:17:07.620 | It's always like in the striving,
01:17:10.180 | like we're always striving.
01:17:12.620 | - Yeah, I don't think this sucks.
01:17:14.260 | - So that's a feature, not a bug?
01:17:17.500 | - Well, that's assuming a designer.
01:17:20.380 | I would say, I don't think it sucks.
01:17:23.300 | I think it's not only beautiful,
01:17:25.540 | but maybe necessary for beauty.
01:17:27.360 | - The mess.
01:17:29.980 | So you disagree with Jordan Peterson,
01:17:33.580 | that you should clean up your room.
01:17:35.140 | You like the room messy.
01:17:36.580 | - It's essential for the, for beauty.
01:17:39.500 | - It's not, it's not that, it's, okay.
01:17:42.700 | I take, I have no idea if it was intended this way.
01:17:45.860 | And so I'm just interpreting it a way I like.
01:17:48.300 | The commandment about having no false idols.
01:17:51.140 | To me, the way I interpret that is meaningful,
01:17:55.940 | is that reality is sacred to me.
01:17:58.660 | I have a reverence for reality,
01:18:00.580 | but I know my best understanding of it is never complete.
01:18:04.940 | I know my best model of it is a model
01:18:08.100 | where I tried to make some kind of predictive capacity
01:18:12.020 | by reducing the complexity of it
01:18:13.860 | to a set of stuff that I could observe.
01:18:15.640 | And then a subset of that stuff
01:18:17.200 | that I thought was the causal dynamics
01:18:18.860 | and then some set of mechanisms that are involved.
01:18:21.980 | And what we find is that it can be super useful.
01:18:24.740 | Like Newtonian gravity can help us do ballistic curves
01:18:28.380 | and all kinds of super useful stuff.
01:18:30.020 | And then we get to the place where it doesn't explain
01:18:32.960 | what's happening at the cosmological scale
01:18:34.980 | or at a quantum scale.
01:18:36.840 | And at each time, what we're finding is we excluded stuff.
01:18:41.840 | And it also doesn't explain the reconciliation
01:18:44.360 | of gravity with quantum mechanics
01:18:46.000 | and the other kinds of fundamental laws.
01:18:48.240 | So models can be useful,
01:18:50.040 | but they're never true with a capital T.
01:18:52.140 | Meaning they're never an actual real full,
01:18:56.500 | they're never a complete description
01:18:57.980 | of what's happening in real systems.
01:18:59.620 | They can be a complete description of what's happening
01:19:01.460 | in an artificial system that was the result
01:19:03.240 | of applying a model.
01:19:04.600 | So the model of a circuit board
01:19:05.880 | and the circuit board are the same thing.
01:19:07.500 | But I would argue that the model of a cell
01:19:08.940 | and the cell are not the same thing.
01:19:11.180 | And I would say this is key to what we call complexity
01:19:15.120 | versus the complicated,
01:19:16.400 | which is a distinction Dave Snowden made well
01:19:19.220 | in defining the difference between simple,
01:19:23.060 | complicated, complex and chaotic systems.
01:19:25.940 | But one of the definers in complex systems
01:19:28.380 | is that no matter how you model the complex system,
01:19:30.560 | it will still have some emergent behavior
01:19:32.300 | not predicted by the model.
01:19:34.020 | - Can you elaborate on the complex versus the complicated?
01:19:37.100 | - Complicated means we can fully explicate
01:19:40.020 | the phase space of all the things that it can do.
01:19:41.820 | We can program it.
01:19:42.900 | All human, not all, for the most part,
01:19:47.440 | human built things are complicated.
01:19:48.980 | They don't self-organize.
01:19:50.180 | They don't self-repair.
01:19:52.380 | They're not self-evolving.
01:19:53.420 | And we can make a blueprint for them.
01:19:56.300 | - Sorry, for human systems?
01:19:58.020 | - For human technologies.
01:19:59.580 | - Human technologies, I'm sorry.
01:20:01.080 | Okay, so non-biological systems.
01:20:01.920 | - That are basically the application of models.
01:20:04.560 | And engineering is kind of applied science,
01:20:09.080 | science as the modeling process.
01:20:10.980 | But with-
01:20:14.460 | - But humans are complex.
01:20:15.820 | - Complex stuff, with biological type stuff
01:20:18.800 | and sociological type stuff,
01:20:20.520 | it more has generator functions.
01:20:23.060 | And even those can't be fully explicated
01:20:25.300 | than it has, or our explanation can't prove
01:20:28.580 | that it has closure of what would be
01:20:30.300 | in the unknown, unknown set.
01:20:31.500 | Where we keep finding like, oh, it's just the genome.
01:20:33.560 | Oh, well now it's the genome and the epigenome.
01:20:35.220 | And then a recursive change on the epigenome
01:20:37.180 | 'cause of the proteome.
01:20:38.000 | And then there's mitochondrial DNA
01:20:39.260 | and then viruses affect it, and fuck, right?
01:20:41.700 | So it's like we get overexcited
01:20:43.940 | when we think we found the thing.
01:20:45.580 | - So on Facebook, you know how you can list
01:20:48.220 | your relationship as complicated?
01:20:49.700 | It should actually say it's complex.
01:20:52.900 | That's the more accurate description.
01:20:55.540 | Self-terminating is a really interesting idea
01:20:57.540 | that you talk about quite a bit.
01:20:59.180 | First of all, what is a self-terminating system?
01:21:03.860 | And I think you have a sense, correct me if I'm wrong,
01:21:07.540 | that human civilization as it currently is
01:21:11.620 | is a self-terminating system.
01:21:13.320 | Why do you have that intuition?
01:21:17.700 | Combine it with the definition
01:21:18.860 | of what self-terminating means.
01:21:22.760 | - Okay, so if we look at human societies historically,
01:21:27.760 | human civilizations,
01:21:29.840 | it's not that hard to realize
01:21:33.880 | that most of the major civilizations
01:21:35.880 | and empires of the past don't exist anymore.
01:21:37.840 | So they had a life cycle, they died for some reason.
01:21:40.560 | So we don't still have the early Egyptian empire
01:21:44.120 | or Inca or Maya or Aztec or any of those, right?
01:21:47.080 | And so they terminated.
01:21:50.400 | Sometimes it seems like they were terminated
01:21:51.920 | from the outside, and
01:21:53.000 | sometimes it seems like they self-terminate.
01:21:54.520 | When we look at Easter Island, it was a self-termination.
01:21:57.640 | So let's go ahead and take an island situation.
01:22:00.700 | If I have an island and we are consuming the resources
01:22:03.240 | on that island faster than the resources
01:22:05.280 | can replicate themselves and there's a finite space there,
01:22:08.200 | that system is gonna self-terminate.
01:22:09.840 | It's not gonna be able to keep doing that thing
01:22:12.080 | 'cause you'll get to a place of there's no resources left
01:22:15.200 | and then you get a...
01:22:16.840 | So now if I'm utilizing the resources faster
01:22:20.040 | than they can replicate,
01:22:21.920 | or faster than they can replenish,
01:22:23.760 | and I'm actually growing our population in the process,
01:22:25.840 | I'm even increasing the rate
01:22:27.320 | of the utilization of resources,
01:22:29.080 | I might get an exponential curve and then hit a wall
01:22:32.520 | and then just collapse the exponential curve
01:22:34.760 | rather than do an S curve or some other kind of thing.
01:22:37.740 | So self-terminating system is any system
01:22:43.080 | that depends upon a substrate system
01:22:46.360 | that is debasing its own substrate,
01:22:48.700 | that is debasing what it depends upon.
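
A minimal sketch of that overshoot dynamic, with entirely made-up parameters loosely in the spirit of simple island-resource models: population grows when per-capita harvest exceeds subsistence, and whether the trajectory settles or booms and collapses depends on how fast the resource base is drawn down relative to its regeneration.

```python
# Toy resource-overshoot model (all parameters are arbitrary assumptions, not
# calibrated to any real society). A low harvest rate approaches a steady state,
# possibly with damped oscillation; a high one booms, exhausts the resource
# base the population depends on, and then collapses.
def simulate(harvest_rate, steps=400):
    resource, population = 1000.0, 10.0
    r, K = 0.05, 1000.0                      # resource regrowth rate and carrying capacity
    history = []
    for _ in range(steps):
        harvest = harvest_rate * resource * population             # total extraction this step
        resource = max(resource + r * resource * (1 - resource / K) - harvest, 0.0)
        per_capita = harvest / population if population > 0 else 0.0
        # Grow above a subsistence harvest of 1.0 per person, shrink below it.
        population = max(population * (1 + 0.02 * (per_capita - 1.0)), 0.0)
        history.append((round(population, 1), round(resource, 1)))
    return history

sustainable = simulate(harvest_rate=0.002)   # settles toward a steady state
overshoot = simulate(harvest_rate=0.02)      # boom, resource drawdown, collapse
```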
01:22:51.080 | - So you're right that if you look at empires,
01:22:54.800 | they rise and fall throughout human history,
01:22:57.640 | but not this time, bro.
01:22:59.500 | This one's gonna last forever.
01:23:03.440 | - I like that idea.
01:23:06.560 | I think that if we don't understand
01:23:07.920 | why all the previous ones failed, we can't ensure that.
01:23:11.000 | And so I think it's very important to understand it well
01:23:12.840 | so that we can have that be a designed outcome
01:23:16.400 | with somewhat decent probability.
01:23:18.680 | - So where it's sort of in terms of consuming the resources
01:23:22.040 | on the island, we're a clever bunch,
01:23:24.560 | and we keep coming up, especially when on the horizon,
01:23:29.560 | there is a termination point.
01:23:33.720 | We keep coming up with clever ways of avoiding disaster,
01:23:37.440 | of avoiding collapse, of constructing,
01:23:40.640 | this is where technological innovation,
01:23:42.240 | this is where growth comes in,
01:23:43.820 | coming up with different ways to improve productivity
01:23:46.240 | and the way society functions
01:23:48.160 | such that we consume less resources
01:23:50.160 | or get a lot more from the resources we have.
01:23:52.400 | So there's some sense in which there is a,
01:23:58.160 | human ingenuity is a source for optimism
01:24:01.760 | about the future of this particular system
01:24:03.640 | that may not be self-terminating.
01:24:07.240 | If there's more innovation than there is consumption.
01:24:11.160 | - So overconsumption of resources
01:24:15.720 | is just one way a thing can self-terminate.
01:24:17.320 | We're just kind of starting here,
01:24:18.720 | but there are reasons for optimism and pessimism
01:24:23.720 | and they're both worth understanding.
01:24:27.400 | And there's failure modes on understanding
01:24:29.040 | either without the other.
01:24:30.280 | As we mentioned previously,
01:24:33.120 | there's what I would call naive techno-optimism,
01:24:37.720 | naive techno-capital optimism
01:24:39.760 | that says stuff just has been getting better and better
01:24:43.000 | and we wouldn't wanna live in the dark ages
01:24:44.600 | and tech has done all this awesome stuff.
01:24:46.240 | And we know the proponents of those models
01:24:50.400 | and that stuff is gonna kind of keep getting better.
01:24:52.200 | Of course there are problems,
01:24:53.160 | but human ingenuity rises to it.
01:24:54.720 | Supply and demand will solve the problems, whatever.
01:24:56.720 | - Would you put Ray Kurzweil in that bucket?
01:25:01.040 | Is there some specific people you have in mind
01:25:04.240 | or naive optimism is truly naive
01:25:06.440 | to where you essentially just have an optimism
01:25:09.480 | that's blind to any kind of realities
01:25:11.320 | of the way technology progresses?
01:25:14.040 | - I don't think that anyone who thinks about it
01:25:19.040 | and writes about it is perfectly naive.
01:25:22.440 | - Gotcha.
01:25:23.280 | - But there might be--
01:25:24.120 | - It's a platonic ideal.
01:25:25.680 | - There might be a bias in the nature of the assessment.
01:25:29.720 | I would also say there's kind of naive techno-pessimism
01:25:33.160 | and there are critics of technology.
01:25:36.160 | I mean, you read the Unabomber's manifesto
01:25:43.800 | on why technology can't not result in our self-termination.
01:25:47.080 | So we have to take it out before it gets any further.
01:25:49.560 | But also if you read a lot of the X-risk community,
01:25:54.280 | Bostrom and friends,
01:25:56.560 | it's like our total number of existential risks
01:26:00.400 | and the total probability of them is going up.
01:26:03.480 | And so I think that there are,
01:26:07.080 | we have to hold together where our positive possibilities
01:26:11.960 | and our risk possibilities are both increasing
01:26:14.440 | and then say for the positive possibilities
01:26:17.080 | to be realized long-term,
01:26:19.100 | all of the catastrophic risks have to not happen.
01:26:21.560 | Any of the catastrophic risks happening is enough
01:26:25.040 | to keep that positive outcome from occurring.
01:26:27.320 | So how do we ensure that none of them happen?
01:26:29.980 | If we want to say,
01:26:31.160 | let's have a civilization that doesn't collapse.
01:26:33.040 | So again, collapse theory.
01:26:35.480 | It's worth looking at books like
01:26:37.440 | "The Collapse of Complex Societies" by Joseph Tainter.
01:26:40.240 | It does an analysis showing that many of the societies fell
01:26:45.240 | for internal institutional decay,
01:26:48.960 | civilizational decay reasons.
01:26:51.400 | Baudrillard in "Simulacra and Simulation"
01:26:53.960 | looks at a very different way of looking at
01:26:55.600 | how institutional decay in the collective intelligence
01:26:57.960 | of a system happens and it becomes kind of
01:26:59.760 | more internally parasitic on itself.
01:27:02.360 | Obviously, Jared Diamond made a more popular book
01:27:04.560 | called "Collapse."
01:27:05.500 | And as we were mentioning,
01:27:07.640 | the Antikythera mechanism has been getting attention
01:27:10.480 | in the news lately.
01:27:11.320 | It's like a 2000 year old clock, right?
01:27:13.640 | Like metal gears.
01:27:15.600 | And does that mean we lost like 1500 years
01:27:20.600 | of technological progress?
01:27:22.220 | And from a society that was relatively
01:27:25.600 | technologically advanced.
01:27:27.040 | So what I'm interested in here is being able to say,
01:27:32.560 | okay, well, why did previous societies fail?
01:27:37.420 | Can we understand that abstractly enough
01:27:40.660 | that we can make a civilizational model
01:27:44.100 | that isn't just trying to solve one type of failure,
01:27:47.140 | but solve the underlying things
01:27:48.980 | that generate the failures as a whole?
01:27:51.500 | Are there some underlying generator functions or patterns
01:27:55.060 | that would make a system self-terminating?
01:27:57.140 | And can we solve those and have that be the kernel
01:27:59.200 | of a new civilizational model that is not self-terminating?
01:28:02.420 | And can we then be able to actually look at
01:28:05.180 | the categories of x-risks we're aware of
01:28:06.860 | and see that we actually have resilience
01:28:09.220 | in the presence of those,
01:28:10.280 | not just resilience, but anti-fragility.
01:28:12.700 | And I would say for the optimism to be grounded,
01:28:16.820 | it has to actually be able to understand the risk space well
01:28:20.340 | and have adequate solutions for it.
01:28:22.460 | - So can we try to dig into some basic intuitions
01:28:27.460 | about the underlying sources of catastrophic failures
01:28:32.380 | of the system and overconsumption
01:28:35.020 | that's built into self-terminating systems?
01:28:37.420 | So both the overconsumption, which is like the slow death,
01:28:41.000 | and then there's the fast death of nuclear war
01:28:44.220 | and all those kinds of things,
01:28:45.820 | AGI, biotech, bioengineering, nanotechnology,
01:28:49.420 | my favorite, nanobots.
01:28:50.820 | Nanobots are my favorite because it sounds so cool to me
01:28:58.220 | that I could just know that I would be one of the scientists
01:29:01.140 | that would be full steam ahead in building them
01:29:04.020 | without sufficiently thinking
01:29:07.300 | about the negative consequences.
01:29:08.700 | I would definitely be, I would be podcasting
01:29:11.020 | all about the negative consequences,
01:29:12.940 | but when I go back home, I'd just, in my heart,
01:29:17.460 | know the amount of excitement I'd have as a dumb descendant of apes,
01:29:21.180 | no offense to apes.
01:29:22.220 | So I wanna backtrack on my previous
01:29:27.620 | negative comments about apes.
01:29:32.560 | I have that sense of excitement
01:29:36.640 | that would result in problems.
01:29:38.880 | So sorry, a lot of things said,
01:29:40.360 | but can we start to pull it at a thread?
01:29:43.560 | 'Cause you've also provided a kind of a beautiful,
01:29:46.940 | general approach to this, which is this dialectic synthesis
01:29:50.760 | or just rigorous empathy.
01:29:54.360 | Whatever word we wanna put to it,
01:29:56.960 | that seems to be from the individual perspective
01:29:59.080 | as one way to sort of live in the world
01:30:01.560 | as we try to figure out
01:30:02.720 | how to construct non-self-terminating systems.
01:30:06.120 | So what are some underlying sources?
01:30:08.040 | - Yeah, first I have to say,
01:30:10.720 | I actually really respect Drexler for emphasizing Grey Goo
01:30:16.640 | in "Engines of Creation" back in the day
01:30:18.640 | to make sure the world was paying adequate attention
01:30:23.780 | to the risks of the nanotech.
01:30:26.380 | As someone who was right at the cutting edge
01:30:28.680 | of what could be,
01:30:29.680 | there's definitely game theoretic advantage
01:30:35.080 | to those who focus on the opportunities
01:30:36.980 | and don't focus on the risks or pretend there aren't risks
01:30:39.920 | because they get to market first
01:30:45.640 | and then they externalize all of the costs
01:30:49.400 | through limited liability or whatever it is
01:30:51.460 | to the commons or whoever happens to bear them.
01:30:53.240 | Other people are gonna have to solve those,
01:30:54.640 | but now they have the power and capital associated.
01:30:56.640 | The person who looked at the risks
01:30:57.860 | and tried to do better design and go slower
01:31:00.000 | is probably not gonna move into positions
01:31:02.960 | of as much power or influence as quickly.
01:31:04.480 | So this is one of the issues we have to deal with
01:31:05.960 | is some of the bad game theoretic dispositions
01:31:08.800 | in the system relative to its own stability.
01:31:12.680 | - And the key aspect to that, sorry to interrupt,
01:31:15.080 | is the externalities generated.
01:31:17.120 | - Yes.
01:31:18.480 | - What flavors of catastrophic risk
01:31:20.380 | are we talking about here?
01:31:21.680 | What's your favorite flavor in terms of ice cream?
01:31:25.000 | So mine is coconut.
01:31:25.880 | - Nobody seems to like coconut ice cream.
01:31:27.900 | So ice cream aside,
01:31:30.600 | what do you most worry about in terms of catastrophic risk
01:31:35.780 | that will help us kind of make concrete
01:31:40.180 | the discussion we're having about
01:31:42.980 | how to fix this whole thing?
01:31:44.700 | - Yeah, I think it's worth taking
01:31:46.540 | a historical perspective briefly
01:31:48.420 | to just kind of orient everyone to it.
01:31:49.940 | We don't have to go all the way back to the aliens
01:31:53.620 | who've seen all of civilization,
01:31:54.880 | but to just recognize that for all of human history,
01:31:59.580 | as far as we're aware,
01:32:00.880 | there were existential risks to civilizations
01:32:05.840 | and they happened, right?
01:32:07.300 | Like there were civilizations that were killed in war,
01:32:10.600 | there were tribes that were killed in tribal warfare, whatever.
01:32:13.760 | So people faced existential risk
01:32:15.940 | to the group that they identified with.
01:32:18.120 | It's just, those were local phenomena, right?
01:32:21.980 | It wasn't a fully global phenomenon.
01:32:21.980 | So an empire could fall
01:32:23.580 | and surrounding empires didn't fall.
01:32:25.260 | Maybe they came in and filled the space.
01:32:27.260 | The first time that we were able to think
01:32:33.220 | about catastrophic risk, not from like a solar flare
01:32:36.120 | or something that we couldn't control,
01:32:37.380 | but from something that humans would actually create
01:32:39.580 | at a global level was World War II and the bomb.
01:32:42.900 | Because it was the first time that we had tech big enough
01:32:45.180 | that could actually mess up everything at a global level.
01:32:48.740 | It could mess up habitability.
01:32:50.060 | We just weren't powerful enough to do that before.
01:32:52.940 | It's not that we didn't behave in ways
01:32:54.500 | that would have done it.
01:32:55.340 | We just only behaved in those ways
01:32:57.180 | at the scale we could affect.
01:32:59.020 | And so it's important to get
01:33:01.020 | that there's the entire world before World War II,
01:33:04.360 | where we don't have the ability
01:33:05.780 | to make a non-habitable biosphere, non-habitable for us.
01:33:09.220 | And then there's World War II
01:33:10.420 | and the beginning of a completely new phase
01:33:13.040 | where global human induced catastrophic risk
01:33:16.580 | is now a real thing.
01:33:17.580 | And that was such a big deal
01:33:19.820 | that it changed the entire world
01:33:21.400 | in a really fundamental way,
01:33:23.220 | which is when you study history,
01:33:26.420 | it's amazing how big a percentage of history
01:33:28.340 | is studying war, right?
01:33:29.420 | And the history of wars,
01:33:30.700 | let's say European history, whatever.
01:33:32.660 | It's generals and wars and empire expansions.
01:33:35.660 | And so the major empires near each other
01:33:38.580 | never had really long periods of time
01:33:40.540 | where they weren't engaged in war
01:33:41.820 | or preparation for war or something like that.
01:33:44.100 | That was, humans don't have a good precedent
01:33:47.360 | in the post tribal phase,
01:33:49.660 | the civilization phase of being able to solve conflicts
01:33:52.000 | without war for very long.
01:33:53.300 | World War II was the first time
01:33:56.320 | where we could have a war that no one could win.
01:33:59.240 | And so the superpowers couldn't fight again.
01:34:02.800 | They couldn't do a real kinetic war.
01:34:04.200 | They could do diplomatic wars and cold war type stuff,
01:34:07.240 | and they could fight proxy wars through other countries
01:34:09.380 | that didn't have the big weapons.
01:34:11.180 | And so mutually assured destruction
01:34:12.900 | and like coming out of World War II,
01:34:15.240 | we actually realized that nation states
01:34:17.240 | couldn't prevent world war.
01:34:19.520 | And so we needed a new type of supervening government
01:34:22.560 | in addition to nation states,
01:34:23.720 | which was the whole Bretton Woods world,
01:34:25.080 | the United Nations, the World Bank, the IMF,
01:34:28.760 | the globalization trade type agreements,
01:34:31.980 | mutually assured destruction.
01:34:33.640 | That was, how do we have some coordination
01:34:36.320 | beyond just nation states between them
01:34:38.240 | since we have to stop war between at least the superpowers?
01:34:41.800 | And it was pretty successful,
01:34:44.120 | given that we've had like 75 years
01:34:45.920 | of no superpower on superpower war.
01:34:49.340 | We've had lots of proxy wars during that time.
01:34:52.940 | We've had cold war.
01:34:54.360 | And I would say we're in a new phase now
01:34:57.940 | where the Bretton Woods solution is basically over,
01:35:02.020 | or almost over.
01:35:03.300 | - Can you describe the Bretton Woods solution?
01:35:05.260 | - Yeah, so the Bretton Woods,
01:35:07.240 | the series of agreements for how the nations
01:35:12.240 | would be able to engage with each other
01:35:16.100 | in a solution other than war was these IGOs,
01:35:19.780 | these intergovernmental organizations,
01:35:22.020 | and was the idea of globalization.
01:35:24.780 | Since we could have global effects,
01:35:26.020 | we needed to be able to think about things globally,
01:35:28.380 | where we had trade relationships with each other,
01:35:30.360 | where it would not be profitable to war with each other.
01:35:33.340 | It'd be more profitable to actually be able
01:35:34.900 | to trade with each other.
01:35:35.820 | So our own self-interest was gonna drive
01:35:38.420 | our non-war interest.
01:35:40.040 | And so this started to look like,
01:35:44.180 | and obviously this couldn't have happened
01:35:46.620 | that much earlier either,
01:35:47.600 | because industrialization hadn't gotten far enough
01:35:49.740 | to be able to do massive global industrial supply chains
01:35:52.420 | and ship stuff around quickly.
01:35:54.700 | But like we were mentioning earlier,
01:35:56.300 | almost all the electronics that we use today,
01:35:58.900 | just basic cheap stuff for us,
01:36:00.540 | is made on six continents, made in many countries.
01:36:03.100 | There's no single country in the world
01:36:04.380 | that could actually make many of the things that we have,
01:36:06.460 | and from the raw material extraction
01:36:08.740 | to the plastics and polymers and the et cetera.
01:36:12.620 | And so the idea that we made a world
01:36:15.020 | that could do that kind of trade
01:36:17.420 | and create massive GDP growth,
01:36:19.140 | we could all work together to be able
01:36:20.540 | to mine natural resources and grow stuff.
01:36:23.740 | With the rapid GDP growth,
01:36:25.380 | there was the idea that everybody could keep having more
01:36:28.060 | without having to take each other's stuff.
01:36:30.580 | And so that was part of kind of the Bretton Woods
01:36:33.980 | post-World War II model.
01:36:35.460 | The other was that we'd be so economically interdependent
01:36:38.220 | that blowing each other up would never make sense.
01:36:41.000 | That worked for a while.
01:36:42.500 | Now, it also brought us up into planetary boundaries faster,
01:36:48.500 | the non-renewable use of resources
01:36:51.500 | and turning those resources into pollution
01:36:54.040 | on the other side of the supply chain.
01:36:56.580 | So obviously that faster GDP growth
01:36:58.780 | meant the overfishing of the oceans
01:37:02.500 | and the cutting down of the trees and the climate change
01:37:04.980 | and the toxic mining tailings going into the water
01:37:08.980 | and the mountaintop removal mining
01:37:10.780 | and all those types of things.
01:37:11.780 | - That's the overconsumption side
01:37:13.280 | of the risk that we're talking about.
01:37:15.340 | - And so the answer of let's do positive GDP growth
01:37:19.060 | rapidly and exponentially
01:37:23.280 | obviously accelerated the planetary boundary side.
01:37:26.960 | And that was thought about for a long time,
01:37:30.140 | but it started to be modeled with the Club of Rome
01:37:32.940 | and The Limits to Growth.
01:37:34.040 | But it's just very obvious to say
01:37:39.380 | if you have a linear materials economy
01:37:40.940 | where you take stuff out of the earth faster,
01:37:43.400 | whether it's fish or trees or oil,
01:37:46.620 | you take it out of the earth faster
01:37:47.960 | than it can replenish itself.
01:37:49.940 | And you turn it into trash
01:37:51.860 | after using it for a short period of time,
01:37:53.240 | you put the trash in the environment faster
01:37:55.060 | than it can process itself.
01:37:56.740 | And there's toxicity associated with both sides of this.
01:37:59.900 | You can't run an exponentially growing
01:38:02.500 | linear materials economy on a finite planet forever.
01:38:05.080 | That's not a hard thing to figure out.
01:38:06.700 | And it has to be exponential
01:38:08.500 | if there's an exponentiation in the monetary supply
01:38:11.660 | because of interest and then fractional reserve banking.
01:38:14.500 | And to then be able to keep up
01:38:16.140 | with the growing monetary supply,
01:38:17.400 | you have to have growth of goods and services.
01:38:19.300 | So that's that kind of thing that has happened.
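A minimal numerical sketch of that last point, with purely made-up numbers (the function and the figures are illustrative, not anything stated in the conversation): when the draw on a finite stock grows exponentially, the depletion time is governed mostly by the growth rate, so even doubling the stock buys only a handful of extra years.

```python
# Illustrative only: hypothetical numbers, not real resource or monetary data.
# Point: an exponentially growing draw on a finite stock runs out on a timescale
# set mostly by the growth rate, not by the size of the stock.

def years_until_depleted(stock: float, initial_draw: float, growth_rate: float) -> int:
    """Count years until cumulative extraction exceeds the finite stock."""
    year, cumulative, draw = 0, 0.0, initial_draw
    while cumulative < stock:
        cumulative += draw
        draw *= (1 + growth_rate)  # extraction compounds, e.g. tracking GDP growth
        year += 1
    return year

print(years_until_depleted(stock=1_000, initial_draw=1, growth_rate=0.07))  # ~64 years
print(years_until_depleted(stock=2_000, initial_draw=1, growth_rate=0.07))  # ~74 years
```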
01:38:21.700 | But you also see that when you get these supply chains
01:38:27.140 | that are so interconnected across the world,
01:38:29.060 | you get increased fragility
01:38:30.380 | 'cause a collapse or a problem in one area
01:38:32.740 | then affects the whole world in a much bigger area
01:38:35.060 | as opposed to the issues being local.
01:38:37.500 | So we got to see with COVID
01:38:39.540 | and an issue that started in one part of China
01:38:42.780 | affecting the whole world so much more rapidly
01:38:45.660 | than would have happened before Bretton Woods, right?
01:38:48.580 | Before international travel supply chains,
01:38:50.980 | you know, that whole kind of thing.
01:38:52.460 | And with a bunch of second and third order effects
01:38:54.300 | that people wouldn't have predicted, okay?
01:38:55.500 | We have to stop certain kinds of travel
01:38:57.940 | because of viral contaminants,
01:38:59.300 | but the countries doing agriculture
01:39:02.820 | depend upon fertilizer they don't produce
01:39:04.700 | that is shipped into them
01:39:05.660 | and depend upon pesticides they don't produce.
01:39:07.420 | So we got both crop failures
01:39:09.580 | and crops being eaten by locusts
01:39:11.420 | in scale in Northern Africa and Iran and things like that
01:39:13.820 | because they couldn't get the supplies of stuff in.
01:39:15.520 | So then you get massive starvation
01:39:17.460 | or future kind of hunger issues
01:39:19.140 | because of supply chain shutdowns.
01:39:21.980 | So you get this increased fragility and cascade dynamics
01:39:25.100 | where a small problem can end up leading to cascade effects.
01:39:28.900 | And also we went from two superpowers
01:39:33.900 | with one catastrophe weapon
01:39:37.100 | to now that same catastrophe weapon
01:39:40.980 | being held by more countries,
01:39:44.620 | eight or nine countries that have it.
01:39:46.500 | And there's a lot more types of catastrophe weapons.
01:39:50.220 | We now have catastrophe weapons with weaponized drones
01:39:53.580 | that can hit infrastructure targets, with bio, with...
01:39:56.180 | in fact, every new type of tech has created an arms race.
01:39:59.680 | So we have not with the UN
01:40:02.060 | or the other kind of intergovernmental organizations,
01:40:04.140 | we haven't been able to really do nuclear deproliferation.
01:40:07.620 | We've actually had more countries get nukes
01:40:09.860 | and keep getting faster nukes,
01:40:11.260 | the race to hypersonics and things like that.
01:40:13.920 | And every new type of technology that has emerged
01:40:17.700 | has created an arms race.
01:40:19.560 | And so you can't do mutually assured destruction
01:40:23.820 | with multiple agents the way you can with two agents.
01:40:26.780 | Two agents, it's much easier
01:40:28.700 | to create a stable Nash equilibrium that's forced.
01:40:31.620 | But the ability to monitor and say,
01:40:32.780 | if these guys shoot, who do I shoot?
01:40:34.020 | Do I shoot them?
01:40:34.840 | Do I shoot everybody?
01:40:36.100 | And so you get a three body problem.
01:40:37.700 | You get a very complex type of thing
01:40:39.520 | when you have multiple agents
01:40:40.540 | and multiple different types of catastrophe weapons,
01:40:42.900 | including ones that can be much more easily produced
01:40:45.380 | than nukes.
01:40:46.220 | Nukes are really hard to produce.
01:40:47.040 | There's only uranium in a few areas.
01:40:48.500 | Uranium enrichment is hard.
01:40:50.020 | ICBMs are hard.
01:40:51.600 | But weaponized drones hitting smart targets is not so hard.
01:40:55.380 | There's a lot of other things where basically the scale needed
01:40:57.560 | to be able to manufacture them is going way, way down
01:40:59.940 | to where even non-state actors can have them.
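A toy way to see the "two agents versus many agents" point (my own framing, with a hypothetical helper function, not a formal model from the conversation): the number of deterrence relationships an equilibrium has to hold across grows quadratically with the number of actors, and multiplies again with each new class of catastrophe weapon, before even getting to the attribution problem of who to retaliate against.

```python
# Toy illustration of why mutually assured destruction gets harder with more
# actors and more weapon types: the count of pairwise relationships that all
# have to stay stable grows quadratically, before even considering attribution
# ("if these guys shoot, who do I shoot?").

from math import comb

def deterrence_relationships(n_actors: int, weapon_types: int = 1) -> int:
    """Pairwise deterrence relationships, counted once per weapon type."""
    return comb(n_actors, 2) * weapon_types

print(deterrence_relationships(2, 1))  # 1   -> the classic two-superpower case
print(deterrence_relationships(9, 1))  # 36  -> roughly today's nuclear states
print(deterrence_relationships(9, 4))  # 144 -> plus drones, bio, cyber, etc.
```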
01:41:02.680 | And so when we talk about exponential tech
01:41:06.560 | and the decentralization of exponential tech,
01:41:09.500 | what that means is decentralized catastrophe weapon capacity.
01:41:13.060 | And especially in a world of increasing numbers
01:41:16.420 | of people feeling disenfranchised, frantic,
01:41:19.300 | whatever, for different reasons.
01:41:21.340 | So I would say the Bretton Woods world doesn't prepare us
01:41:26.340 | to be able to deal with lots of different agents,
01:41:29.260 | having lots of different types of catastrophe weapons
01:41:31.540 | you can't put mutually assured destruction on,
01:41:33.680 | where you can't keep doing growth of the materials economy
01:41:36.840 | in the same way because of hitting planetary boundaries
01:41:40.900 | and where the fragility dynamics are actually now
01:41:44.060 | their own source of catastrophic risk.
01:41:46.100 | So now we're, so like there was all the world
01:41:48.220 | until World War II.
01:41:49.220 | And World War II is just from a civilization timescale
01:41:52.240 | point of view, it was just a second ago.
01:41:54.620 | It seems like a long time, but it is really not.
01:41:56.700 | We get a short period of relative peace at the level
01:41:58.900 | of superpowers while building up the military capacity
01:42:01.500 | for much, much, much worse war the entire time.
01:42:04.340 | And then now we're at this new phase where the things
01:42:07.600 | that allowed us to make it through the nuclear power
01:42:10.940 | are not the same systems that will let us make it
01:42:13.100 | through the next stage.
01:42:14.660 | So what is this next post Bretton Woods?
01:42:17.460 | How do we become safe vessels, safe stewards
01:42:22.460 | of many different types of exponential technology
01:42:26.220 | is a key question when we're thinking about X-Risk.
01:42:30.020 | - Okay, so, and I'd like to try to answer the how
01:42:35.020 | a few ways, but first on the mutually assured destruction.
01:42:40.980 | Do you give credit to the idea of two superpowers
01:42:46.460 | not blowing each other up with nuclear weapons
01:42:49.740 | to the simple game theoretic model
01:42:51.980 | of mutually assured destruction,
01:42:53.300 | or something you've said previously,
01:42:56.460 | this idea of inverse correlation,
01:42:59.060 | which I tend to believe between,
01:43:02.300 | now you were talking about tech,
01:43:05.820 | but I think it's maybe broadly true.
01:43:09.500 | The inverse correlation between competence
01:43:11.860 | and propensity for destruction.
01:43:13.900 | So the bigger your weapons, the less likely you are to use them,
01:43:18.900 | not because you're afraid of mutually assured
01:43:22.260 | self-destruction, but because we're human beings
01:43:24.460 | and there's a deep moral fortitude there
01:43:28.700 | that somehow aligned with competence
01:43:30.820 | and being good at your job.
01:43:32.260 | That like, it's very hard to be a psychopath
01:43:37.260 | and be good at killing at scale.
01:43:41.860 | Do you share any of that intuition?
01:43:46.420 | - Kind of.
01:43:47.260 | I think most people would say that Alexander the Great
01:43:51.820 | and Genghis Khan and Napoleon were effective people
01:43:55.220 | that were good at their job.
01:43:58.180 | That were actually maybe asymmetrically good
01:44:01.620 | at being able to organize people
01:44:03.820 | and do certain kinds of things
01:44:06.180 | that were pretty oriented
01:44:08.460 | towards certain types of destruction.
01:44:11.100 | Or pretty willing to,
01:44:13.020 | maybe they would say they were oriented
01:44:14.220 | towards empire expansion,
01:44:15.340 | but pretty willing to commit certain acts of destruction
01:44:18.300 | in the name of it.
01:44:19.340 | - What are you worried about?
01:44:20.780 | The Genghis Khan, or you could argue he's not a psychopath.
01:44:27.580 | Are you worried about Genghis Khan,
01:44:30.140 | are you worried about Hitler,
01:44:31.180 | or are you worried about a terrorist
01:44:33.900 | who has a very different ethic,
01:44:37.900 | which is not even for,
01:44:41.100 | it's not trying to preserve and build
01:44:43.820 | and expand my community.
01:44:46.500 | It's more about just the destruction in itself is the goal.
01:44:50.460 | - I think the thing that you're looking at
01:44:53.540 | that I do agree with
01:44:54.620 | is that there's a psychological disposition
01:44:57.740 | towards construction
01:44:59.100 | and a psychological disposition more towards destruction.
01:45:02.740 | Obviously everybody has both and can toggle between both.
01:45:05.900 | And oftentimes one is willing to destroy certain things.
01:45:09.180 | We have this idea of creative destruction, right?
01:45:10.860 | Willing to destroy certain things to create other things.
01:45:13.700 | And utilitarianism and trolley problems
01:45:16.460 | are all about exploring that space.
01:45:18.020 | And the idea of war is all about that.
01:45:20.700 | I am trying to create something for our people
01:45:23.220 | and it requires destroying some other people.
01:45:25.460 | Sociopathy is a funny topic
01:45:30.100 | 'cause it's possible to have very high fealty
01:45:32.060 | to your in-group and work on perfecting the methods
01:45:35.100 | of torture to the out-group at the same time
01:45:38.300 | 'cause you can dehumanize and then remove empathy.
01:45:40.860 | And I would also say that there are types.
01:45:47.780 | So the reason, the thing that gives hope
01:45:51.300 | about the orientation towards construction and destruction
01:45:55.220 | being a little different in psychologies
01:45:56.980 | is what it takes to build really catastrophic tech,
01:46:01.500 | even today where it doesn't take what it took
01:46:03.260 | to make a nuke, a small group of people could do it,
01:46:05.860 | takes still some real technical knowledge
01:46:09.260 | that required having studied for a while
01:46:11.420 | and some then building capacity.
01:46:13.900 | And there's a question of,
01:46:15.500 | is that psychologically inversely correlated
01:46:18.260 | with the desire to damage civilization meaningfully?
01:46:22.100 | A little bit, a little bit, I think.
01:46:26.860 | - I think a lot.
01:46:29.180 | I think it's actually, I mean,
01:46:31.580 | this is the conversation I had,
01:46:33.340 | I think offline with Dan Carlin,
01:46:35.220 | which is like, it's pretty easy to come up with ways
01:46:39.620 | that any competent person could hurt a lot of people. I can come up with a lot of ways
01:46:42.960 | to hurt a lot of people.
01:46:45.000 | And it's pretty easy.
01:46:46.300 | Like I alone could do it.
01:46:48.220 | And there's a lot of people as smart or smarter than me,
01:46:53.220 | at least in their creation of explosives.
01:46:57.820 | Why are we not seeing more insane mass murder?
01:47:02.820 | - I think there is something fascinating
01:47:06.380 | and beautiful about this.
01:47:08.500 | - Yes.
01:47:09.340 | - And it does have to do with some deeply pro-social
01:47:12.700 | types of characteristics in humans.
01:47:17.340 | But when you're dealing with very large numbers,
01:47:20.620 | you don't need a whole lot of a phenomena.
01:47:23.180 | And so then you start to say,
01:47:24.260 | well, what's the probability that X won't happen this year,
01:47:26.980 | then won't happen in the next two years,
01:47:28.500 | three years, four years.
01:47:30.060 | And then how many people are doing destructive things
01:47:33.020 | with lower tech?
01:47:33.940 | And then how many of them can get access to higher tech
01:47:36.220 | that they didn't have to figure out how to build?
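The "probability that X won't happen this year, then the next year..." point is just compounding probabilities. A small sketch with invented per-year numbers (nothing here is an actual risk estimate):

```python
# Illustrative arithmetic only: the per-year probabilities are invented.
# The point is the shape: small yearly risks compound into large long-run risks.

def p_never_happens(p_per_year: float, years: int) -> float:
    """Probability an independent per-year event never occurs over N years."""
    return (1 - p_per_year) ** years

for p in (0.01, 0.02, 0.05):  # hypothetical per-year probabilities
    print(p, round(p_never_happens(p, years=50), 3))
# 0.01 -> 0.605  (so ~40% odds it happens at least once in 50 years)
# 0.02 -> 0.364
# 0.05 -> 0.077
```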
01:47:39.020 | So when I can get commercial tech,
01:47:43.580 | and maybe I don't understand tech very well,
01:47:46.300 | but I understand it well enough to utilize it,
01:47:48.520 | not to create it, and I can repurpose it.
01:47:51.340 | When we saw that commercial drone
01:47:54.860 | with a homemade thermite bomb
01:47:55.980 | hit the Ukrainian munitions factory
01:47:58.340 | and do the equivalent of an incendiary bomb level of damage,
01:48:02.540 | that was just home tech.
01:48:04.180 | That's just simple kind of thing.
01:48:06.340 | And so the question is not,
01:48:09.800 | does it stay being a small percentage of the population?
01:48:14.060 | The question is, can you bind that phenomenon
01:48:17.660 | nearly completely?
01:48:18.980 | And especially now as you start to get into bigger things,
01:48:25.740 | CRISPR gene drive technologies and various things like that,
01:48:29.460 | can you bind it completely long-term?
01:48:33.240 | Over what period of time?
01:48:35.860 | - Not perfectly though.
01:48:37.100 | That's the thing.
01:48:37.940 | I'm trying to say that there is some,
01:48:41.260 | let's call it, let's, a random word, love,
01:48:45.540 | that's inherent and that's core to human nature,
01:48:50.460 | that's preventing destruction at scale.
01:48:54.060 | And you're saying, yeah, but there's a lot of humans.
01:48:57.700 | There's gonna be eight plus billion,
01:48:59.860 | and then there's a lot of seconds in the day
01:49:01.760 | to come up with stuff.
01:49:02.820 | There's a lot of pain in the world
01:49:04.260 | that can lead to a distorted view of the world
01:49:07.420 | such that you want to channel that pain into the destruction.
01:49:10.560 | All those kinds of things.
01:49:11.580 | And it's only a matter of time
01:49:13.180 | that any one individual could do large damage,
01:49:15.460 | especially as we create more and more democratized,
01:49:20.460 | decentralized ways to deliver that damage,
01:49:22.620 | even if you don't know how to build the initial weapon.
01:49:25.740 | But the thing is, it seems like it's a race
01:49:30.060 | between the cheapening of destructive weapons
01:49:36.820 | and the capacity of humans to express their love
01:49:41.820 | towards each other.
01:49:43.160 | And it's a race that so far, I know on Twitter,
01:49:48.160 | it's not popular to say, but love is winning, okay?
01:49:51.900 | So what is the argument that love is going to lose here
01:49:55.200 | against nuclear weapons and biotech and AI and drones?
01:50:00.200 | - Okay, I'm gonna come at the end of this
01:50:05.520 | to a how love wins.
01:50:07.200 | So I just want you to know that that's where I'm oriented.
01:50:09.560 | - That's the end, okay.
01:50:10.400 | - But I'm gonna argue against why that is a given
01:50:14.640 | because it's not a given.
01:50:19.040 | I don't believe.
01:50:20.080 | And I think that-
01:50:20.920 | - This is like a good romantic comedy.
01:50:22.560 | So you're gonna create drama right now,
01:50:25.120 | but it will end in a happy ending.
01:50:27.000 | - Well, it's because it's only a happy ending
01:50:28.720 | if we actually understand the issues well enough
01:50:30.760 | and take responsibility to shift it.
01:50:32.520 | Do I believe, like, there's a reason
01:50:34.480 | why there's so much more dystopic sci-fi
01:50:36.560 | than protopic sci-fi.
01:50:38.040 | And the protopic sci-fi that exists usually requires magic,
01:50:41.880 | or at least magical tech, right?
01:50:46.880 | Dilithium crystals and warp drives and stuff.
01:50:49.360 | Because it's very hard to imagine people
01:50:53.040 | like the people we have been in the history books
01:50:55.480 | with exponential type technology and power
01:51:01.480 | that don't eventually blow themselves up,
01:51:03.920 | that make good enough choices
01:51:05.480 | as stewards of their environment and their commons
01:51:07.600 | and each other and et cetera.
01:51:09.720 | So like, it's easier to think of scenarios
01:51:11.920 | where we blow ourselves up
01:51:12.960 | than it is to think of scenarios
01:51:14.000 | where we avoid every single scenario
01:51:15.520 | where we blow ourselves up.
01:51:16.680 | And when I say blow ourselves up,
01:51:17.680 | I mean the environmental versions,
01:51:20.440 | the terrorist versions, the war versions,
01:51:22.800 | the cumulative externalities versions.
01:51:25.420 | - And I'm sorry if I'm interrupting your flow of thought,
01:51:31.200 | but why is it easier?
01:51:33.640 | Could it be a weird psychological thing
01:51:35.680 | where we either are just more capable
01:51:38.000 | to visualize explosions and destruction?
01:51:41.120 | And then the sicker thought,
01:51:42.760 | which is like we kind of enjoy for some weird reason
01:51:45.760 | thinking about that kind of stuff,
01:51:47.000 | even though we wouldn't actually act on it.
01:51:49.840 | It's almost like some weird,
01:51:51.640 | like I love playing shooter games,
01:51:53.600 | first person shooters.
01:51:56.200 | And like, especially if it's like murdering zombie doom,
01:51:59.720 | you're shooting demons.
01:52:03.400 | I played one of my favorite games, Diablo,
01:52:03.400 | like slashing through different monsters
01:52:05.880 | and the screaming and pain and the hellfire.
01:52:08.360 | And then I go out into the real world
01:52:11.000 | to eat my coconut ice cream and I'm all about love.
01:52:13.360 | So like, can we trust our ability to visualize
01:52:17.560 | how it all goes to shit
01:52:19.720 | as an actual rational way of thinking?
01:52:22.640 | - I think it's a fair question to say to what degree
01:52:25.920 | is there just kind of perverse fantasy
01:52:28.760 | and morbid exploration
01:52:32.000 | and whatever else that happens in our imagination.
01:52:34.520 | But I don't think that's the whole of it.
01:52:38.040 | I think there is also a reality
01:52:41.080 | to the combinatorial possibility space
01:52:43.720 | and the difference in the probabilities
01:52:45.280 | that there's a lot of ways I could try to put
01:52:48.760 | the 70 trillion cells of your body together
01:52:50.920 | that don't make you.
01:52:52.000 | There's not that many ways I can put them together
01:52:54.800 | that make you.
01:52:55.640 | There's a lot of ways I could try
01:52:56.480 | to connect the organs together
01:52:57.720 | that make some weird kind of group of organs
01:53:00.600 | on a desk but that doesn't actually
01:53:02.840 | make a functioning human.
01:53:04.320 | And you can kill an adult human in a second,
01:53:08.400 | but you can't get one in a second.
01:53:09.720 | It takes 20 years to grow one
01:53:11.280 | and a lot of things to happen right.
01:53:12.680 | I could destroy this building
01:53:14.480 | in a couple minutes with demolition,
01:53:17.880 | but it took a year or a couple years to build it.
01:53:20.320 | There is--
01:53:21.160 | - Calm down, Cole, this is just an example.
01:53:25.160 | It's not, he doesn't mean it.
01:53:27.640 | - There's a gradient where entropy is easier
01:53:31.200 | and that there's a lot more ways
01:53:34.000 | to put a set of things together that don't work
01:53:36.840 | than the few that really do produce higher order synergies.
01:53:40.520 | And so,
01:53:42.440 | when we look at a history of war
01:53:46.920 | and then we look at exponentially more powerful warfare,
01:53:51.360 | an arms race that drives that in all these directions,
01:53:53.600 | and when we look at a history of environmental destruction
01:53:56.000 | and exponentially more powerful tech
01:53:57.560 | that makes exponential externalities
01:53:59.400 | multiplied by the total number of agents
01:54:01.120 | that are doing it and the cumulative effects,
01:54:03.520 | there's a lot of ways the whole thing can break,
01:54:05.600 | like a lot of different ways.
01:54:07.400 | And for it to get ahead,
01:54:08.720 | it has to have none of those happen.
01:54:10.520 | And so, there's just a probability space
01:54:15.080 | where it's easier to imagine that thing.
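A tiny numerical companion to the "many ways to break, few ways to work" argument. The setup is invented for illustration: treat a system as an ordered assembly of parts where only one ordering functions.

```python
# Illustrative combinatorics: if only one ordering of n parts "works", the share
# of configurations that work collapses factorially as the system gets bigger.

from math import factorial

for n_parts in (5, 10, 20):
    configs = factorial(n_parts)                   # possible orderings of the parts
    print(n_parts, configs, f"{1 / configs:.1e}")  # chance a random ordering works
# prints roughly: 120 (8.3e-03), 3.6 million (2.8e-07), 2.4e18 (4.1e-19)
```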
01:54:16.480 | So, to say how do we have a protopic future,
01:54:20.000 | we have to say, well, one criteria must be
01:54:23.240 | that it avoids all of the catastrophic risks.
01:54:25.680 | So, can we understand,
01:54:26.840 | can we inventory all the catastrophic risks?
01:54:28.600 | Can we inventory the patterns of human behavior
01:54:30.480 | that give rise to them?
01:54:32.040 | And could we try to solve for that?
01:54:34.920 | And could we have that be the essence
01:54:37.160 | of the social technology that we're thinking about
01:54:39.840 | to be able to guide, bind, and direct
01:54:41.680 | a new physical technology?
01:54:42.800 | 'Cause so far, our physical technology,
01:54:45.200 | like we were talking about the Genghis Khans and like that,
01:54:48.120 | that obviously use certain kinds of physical technology
01:54:51.240 | and armaments and also social technology
01:54:53.640 | and unconventional warfare for a particular set of purposes.
01:54:57.760 | But we have things that don't look like warfare,
01:54:59.880 | like Rockefeller and standard oil.
01:55:03.680 | And it looked like a constructive mindset
01:55:07.040 | to be able to bring this new energy resource to the world.
01:55:12.040 | And it did.
01:55:13.840 | And the second order effects of that are climate change
01:55:18.840 | and all of the oil spills that have happened and will happen.
01:55:23.240 | And all of the wars in the Middle East
01:55:25.880 | over the oil that had been there
01:55:27.720 | and the massive political clusterfuck
01:55:30.800 | and human life issues that are associated with it
01:55:33.520 | and on and on, right?
01:55:35.000 | And so it's also not just the orientation
01:55:41.400 | to construct a thing can have a narrow focus
01:55:44.280 | on what I'm trying to construct,
01:55:45.480 | but be affecting a lot of other things
01:55:47.160 | through second and third order effects
01:55:48.480 | I'm not taking responsibility for.
01:55:50.200 | - And you often, on another tangent,
01:55:53.560 | mentioned second, third, and fourth order effects.
01:55:57.160 | - And nth order.
01:55:58.240 | - And nth order. - Cascading.
01:55:59.560 | - Which is really fascinating.
01:56:01.880 | Like starting with the third order,
01:56:04.680 | plus it gets really interesting.
01:56:08.280 | 'Cause we don't even acknowledge
01:56:10.280 | like the second order effects.
01:56:11.800 | - Right.
01:56:12.960 | - But like thinking, 'cause those,
01:56:14.680 | it could get bigger and bigger and bigger
01:56:17.040 | in ways we were not anticipating.
01:56:18.920 | So how do we make those explicit? So it sounds like
01:56:21.720 | part of the thing that you are thinking through
01:56:26.000 | in terms of a solution, how to create an anti-fragile,
01:56:30.280 | a resilient society, is to make explicit,
01:56:34.920 | acknowledge, understand the externalities,
01:56:39.520 | the second order, third order, fourth order,
01:56:42.560 | and nth order effects.
01:56:44.720 | How do we start to think about those effects?
01:56:47.920 | - Yeah, the war application is harm we're trying to cause,
01:56:50.320 | or that we're aware we're causing, right?
01:56:53.120 | The externality is harm that, at least supposedly,
01:56:55.960 | we're not aware we're causing,
01:56:57.080 | or at minimum, it's not our intention, right?
01:56:59.360 | Maybe we're either totally unaware of it,
01:57:01.280 | or we're aware of it,
01:57:02.120 | but it is a side effect of what our intention is.
01:57:04.480 | It's not the intention itself.
01:57:06.400 | There are catastrophic risks from both types.
01:57:08.920 | The direct application of increased technological power
01:57:12.640 | to a rivalrous intent,
01:57:16.080 | which is gonna cause harm for some out-group
01:57:18.360 | for some in-group to win,
01:57:19.520 | but the out-group is also working on growing the tech,
01:57:21.840 | and if they don't lose completely,
01:57:23.640 | they reverse engineer the tech, upregulate it,
01:57:25.800 | come back with more capacity.
01:57:27.760 | So there's the exponential tech arms race side
01:57:31.520 | of in-group, out-group rivalry using exponential tech
01:57:34.520 | that is one set of risks.
01:57:36.240 | And the other set of risks is the application
01:57:40.480 | of exponentially more powerful tech,
01:57:42.760 | not intentionally to try and beat an out-group,
01:57:46.520 | but to try to achieve some goal that we have,
01:57:48.580 | but to produce a second and third order effects
01:57:51.160 | that do have harm to the commons, to other people,
01:57:55.240 | to environment, to other groups,
01:57:57.220 | that might actually be bigger problems
01:58:01.680 | than the problem we were originally trying to solve
01:58:03.360 | with the thing we were building.
01:58:05.240 | When Facebook was building a dating app
01:58:09.320 | and then building a social app
01:58:10.480 | where people could tag pictures,
01:58:12.920 | they weren't trying to build a democracy destroying app
01:58:17.040 | that would maximize time on site as part of its ad model
01:58:23.200 | through AI optimization of a newsfeed
01:58:27.480 | to the thing that made people spend most time on site,
01:58:29.460 | which is usually them being limbically hijacked
01:58:31.840 | more than something else,
01:58:33.240 | which ends up appealing to people's cognitive biases
01:58:35.920 | and group identities,
01:58:37.440 | and creates no sense of shared reality.
01:58:39.880 | They weren't trying to do that,
01:58:40.880 | but it was a second order effect.
01:58:42.920 | And it's a pretty fucking powerful second order effect,
01:58:46.940 | and a pretty fast one,
01:58:49.920 | 'cause the rate of tech is obviously able to get distributed
01:58:52.760 | to much larger scale, much faster,
01:58:54.520 | and with a bigger jump in terms of total vertical capacity,
01:58:58.560 | then that's what it means to get to the verticalizing part
01:59:00.680 | of an exponential curve.
01:59:01.880 | So just like we can see that oil
01:59:07.440 | had these second order environmental effects,
01:59:09.100 | and also social and political effects,
01:59:10.760 | and war and so much of the whole,
01:59:14.680 | like the total amount of oil used
01:59:17.080 | has a proportionality to total global GDP.
01:59:20.960 | And this is why we have this, the petrodollar,
01:59:23.200 | and so the oil thing also had the externalities
01:59:29.120 | of a major aspect of what happened
01:59:30.940 | with the military-industrial complex and things like that.
01:59:33.640 | So, but we can see the same thing
01:59:35.680 | with more current technologies,
01:59:37.920 | with Facebook and Google and other things.
01:59:41.000 | So I don't think we can run,
01:59:44.800 | and the more powerful the tech is,
01:59:46.780 | we build it for reason X, whatever reason X is.
01:59:51.000 | Maybe X is three things, maybe it's one thing, right?
01:59:53.800 | We're doing the oil thing because we wanna make cars
01:59:57.680 | because it's a better method of individual transportation.
02:00:00.040 | We're building the Facebook thing
02:00:01.120 | 'cause we're gonna connect people socially
02:00:02.800 | in a personal sphere.
02:00:04.440 | But it interacts with complex systems,
02:00:08.760 | with ecologies, economies, psychologies, cultures.
02:00:12.960 | And so it has effects
02:00:14.000 | on other than the thing we're intending.
02:00:16.400 | Some of those effects can end up being negative effects,
02:00:19.800 | but because this technology,
02:00:21.760 | if we make it to solve a problem,
02:00:24.480 | it has to overcome the problem.
02:00:25.920 | The problem's been around for a while,
02:00:27.080 | it's gonna overcome in a short period of time.
02:00:28.340 | So it usually has greater scale,
02:00:30.120 | greater rate of magnitude in some way.
02:00:32.560 | That also means that the externalities that it creates
02:00:35.520 | might be bigger problems.
02:00:37.480 | And you can say, well, but then that's the new problem
02:00:39.960 | and humanity will innovate its way out of that.
02:00:41.740 | Well, I don't think that's paying attention
02:00:43.560 | to the fact that we can't keep up
02:00:44.980 | with exponential curves like that,
02:00:47.440 | nor do finite spaces allow
02:00:49.320 | exponential externalities forever.
02:00:51.200 | And this is why a lot of the smartest people
02:00:55.140 | thinking about this are thinking,
02:00:56.680 | well, no, I think we're totally screwed
02:00:59.840 | unless we can make a benevolent AI singleton
02:01:02.200 | that rules all of us.
02:01:03.260 | Guys like Bostrom and others thinking in those directions,
02:01:08.080 | 'cause they're like, how do humans try to do
02:01:12.080 | multipolarity and make it work?
02:01:14.280 | And I have a different answer of what I think it looks like
02:01:17.880 | that does have more to do with the love,
02:01:20.680 | but some applied social tech aligned with love.
02:01:23.720 | - 'Cause I have a bunch of really dumb ideas.
02:01:26.240 | I'd prefer to hear-
02:01:28.240 | - I'd like to hear some of them first.
02:01:30.080 | - I think the idea I would have is to be a bit more rigorous
02:01:34.240 | in trying to measure the amount of love you add
02:01:39.240 | or subtract from the world
02:01:42.600 | in second, third, fourth, fifth order effects.
02:01:46.400 | It's actually, I think, especially in the world of tech,
02:01:49.600 | quite doable.
02:01:50.580 | You just might not like,
02:01:53.720 | the shareholders may not like that kind of metric,
02:01:58.280 | but it's pretty easy to measure.
02:01:59.980 | That's not even,
02:02:02.120 | I'm perhaps half joking about love,
02:02:06.520 | but we could talk about just happiness and well-being,
02:02:08.680 | long-term well-being.
02:02:09.920 | That's pretty easy for Facebook, for YouTube,
02:02:12.840 | for all these companies to measure that.
02:02:15.240 | They do a lot of kinds of surveys.
02:02:18.840 | I mean, there's very simple solutions here
02:02:21.200 | that you could just survey how,
02:02:23.920 | I mean, servers are in some sense useless
02:02:26.760 | because they're a subset of the population.
02:02:31.280 | You're just trying to get a sense,
02:02:32.520 | it's very loose kind of understanding,
02:02:34.440 | but integrated deeply as part of the technology.
02:02:36.960 | Most of our tech is recommender systems.
02:02:39.280 | Most of the, sorry, not tech,
02:02:41.000 | online interaction is driven by recommender systems
02:02:45.120 | that learn very little data about you
02:02:48.040 | and use that data based on,
02:02:50.680 | mostly based on traces of your previous behavior
02:02:52.960 | to suggest future things.
02:02:54.600 | This is how Twitter, this is how Facebook works,
02:02:56.800 | this is how AdSense for Google, AdSense works,
02:03:00.280 | this is how Netflix, YouTube work, and so on.
02:03:02.520 | And for them to not just track engagement,
02:03:06.440 | how much time you spend on a particular video,
02:03:08.360 | a particular site, but to also track,
02:03:12.040 | to give you the technology to do self-report
02:03:15.520 | of what makes you feel good,
02:03:17.040 | of what makes you grow as a person,
02:03:19.480 | of what makes you,
02:03:23.120 | the best version of yourself,
02:03:24.880 | the Rogan idea of the hero of your own movie.
02:03:30.960 | And just add that little bit of information.
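A minimal sketch of the ranking change being proposed here, assuming entirely hypothetical field names, weights, and numbers (this is not any platform's actual API or objective, just one way a self-report signal could enter a recommender's score):

```python
# Sketch only: field names, weights, and numbers are hypothetical.
# Idea: blend the usual engagement prediction with a self-reported
# "did this make me better?" rating when ranking candidate videos.

from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    predicted_watch_time: float   # 0..1, what engagement-driven systems optimize
    self_reported_benefit: float  # -1..1, the user's own retrospective rating

def rank_score(c: Candidate, benefit_weight: float = 0.5) -> float:
    """Blend engagement with well-being; the weight is a product/ethical choice."""
    return (1 - benefit_weight) * c.predicted_watch_time + benefit_weight * c.self_reported_benefit

candidates = [
    Candidate("lecture", predicted_watch_time=0.4, self_reported_benefit=0.9),
    Candidate("outrage_clip", predicted_watch_time=0.9, self_reported_benefit=-0.6),
]
best = max(candidates, key=rank_score)
print(best.video_id)  # "lecture": with weight on self-report, the ranking flips
```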
02:03:34.040 | If you have people, you have this happiness surveys
02:03:37.560 | of how you feel about the last five days,
02:03:40.140 | how would you report your experience?
02:03:42.840 | You can lay out a set of videos.
02:03:44.920 | This is kind of fascinating to watch.
02:03:45.960 | I don't know if you ever look at YouTube,
02:03:47.400 | the history of videos you've looked at.
02:03:49.540 | It's fascinating, it's very embarrassing for me.
02:03:52.160 | Like, it'll be like a lecture,
02:03:54.000 | and then like a set of videos
02:03:56.480 | that I don't want anyone to know about,
02:03:58.640 | which is, which would be like, I don't know,
02:04:02.400 | maybe like five videos in a row
02:04:04.120 | where it looks like I watched the whole thing,
02:04:05.760 | which I probably did, about like how to cook a steak,
02:04:08.320 | even though, or just like the best chefs in the world
02:04:11.320 | cooking steaks, and I'm just like sitting there watching it
02:04:15.080 | for no purpose whatsoever, wasting away my life,
02:04:17.840 | or like funny cat videos, or like legit,
02:04:21.160 | that's always a good one.
02:04:23.480 | And I could look back and rate which videos
02:04:26.560 | made me a better person and not.
02:04:29.160 | And I mean, on a more serious note,
02:04:31.800 | there's a bunch of conversations, podcasts,
02:04:34.040 | or lectures I've watched which made me a better person,
02:04:37.400 | and some of them made me a worse person.
02:04:40.120 | Quite honestly, not for stupid reasons, like I feel dumber,
02:04:43.720 | but because I do have a sense that that started me
02:04:47.200 | on a path of not being kind to other people.
02:04:52.200 | For example, I'll give you, for my own,
02:04:57.160 | and I'm sorry for ranting, but maybe there's some usefulness
02:04:59.560 | to this kind of exploration of self.
02:05:01.360 | When I focus on creating, on programming, on science,
02:05:07.180 | I become a much deeper thinker
02:05:11.640 | and a kinder person to others.
02:05:14.380 | When I listen to too many, a little bit is good,
02:05:16.960 | but too many podcasts or videos
02:05:20.240 | about how our world is melting down,
02:05:24.280 | or criticizing ridiculous people,
02:05:27.240 | the worst of the quote-unquote woke, for example.
02:05:30.240 | There's all these groups that are misbehaving
02:05:33.560 | in fascinating ways because they've been corrupted by power.
02:05:37.040 | The more I watch criticism of them, the worse I become.
02:05:44.120 | And I'm aware of this, but I'm also aware
02:05:47.080 | that for some reason it's pleasant to watch those sometimes.
02:05:50.920 | And so for me to be able to self-report
02:05:54.040 | that to the YouTube algorithm, to the systems around me,
02:05:57.400 | and they ultimately try to optimize
02:05:59.640 | to make me the best version of myself,
02:06:03.160 | which I personally believe would make YouTube
02:06:05.400 | a lot more money because I'd be much more willing
02:06:07.420 | to spend time on YouTube and give YouTube
02:06:09.440 | a lot more of my money.
02:06:12.160 | That's great for business and great for humanity
02:06:15.880 | because it'll make me a kinder person.
02:06:18.040 | It'll increase the love quotient, the love metric,
02:06:23.000 | and it'll make them a lot of money.
02:06:25.420 | I feel like everything's aligned.
02:06:27.000 | And so you should do that not just for YouTube algorithm,
02:06:30.000 | but also for military strategy
02:06:32.000 | and whether to go to war or not,
02:06:33.740 | because one externality you can think of
02:06:35.880 | about going to war, which I think we talked about offline,
02:06:40.080 | is we often go to war with kind of governments,
02:06:42.800 | not with the people.
02:06:46.000 | You have to think about the kids of countries
02:06:49.960 | that see a soldier and because of what they experience,
02:06:56.840 | their interaction with the soldier, hate is born.
02:07:00.820 | When you're like eight years old, six years old,
02:07:04.200 | you lose your dad, you lose your mom,
02:07:06.200 | you lose a friend, somebody close to you.
02:07:09.160 | That one really powerful externality
02:07:11.920 | that could be reduced to love, positive and negative,
02:07:15.280 | is the hate that's born when you make decisions.
02:07:19.360 | And that's going to come to fruition,
02:07:22.640 | that that little seed is going to become a tree
02:07:25.180 | that then leads to the kind of destruction
02:07:27.120 | that we talk about.
02:07:28.340 | So, but in my sense, it's possible to reduce everything
02:07:33.600 | to a measure of how much love does this add to the world.
02:07:38.560 | All that to say, do you have ideas
02:07:41.160 | of how we practically build systems
02:07:46.000 | that create a resilient society?
02:07:49.300 | - There were a lot of good things that you shared
02:07:51.800 | where there's like 15 different ways
02:07:55.040 | that we could enter this that are all interesting.
02:07:57.140 | So I'm trying to see which one will probably be most useful.
02:07:59.880 | - Pick the one or two things that are least ridiculous.
02:08:03.520 | - When you were mentioning if we could see
02:08:07.240 | some of the second order effects or externalities
02:08:11.440 | that we aren't used to seeing,
02:08:12.320 | specifically the one of a kid being radicalized
02:08:15.200 | somewhere else, which engenders enmity in them towards us,
02:08:18.080 | which decreases our own future security.
02:08:20.240 | Even if you don't care about the kid,
02:08:21.400 | if you care about the kid, it's a whole other thing.
02:08:23.480 | Yeah, I mean, I think when we saw this,
02:08:27.820 | when Jane Fonda and others went to Vietnam
02:08:30.120 | and took photos and videos of what was happening,
02:08:32.760 | and you got to see the pictures
02:08:33.920 | of the kids with napalm on them.
02:08:36.680 | That like the anti-war effort was bolstered by that
02:08:41.600 | in a way it couldn't have been without that.
02:08:43.600 | There's a, until we can see the images,
02:08:46.840 | you can't have a mirror neuron effect in the same way.
02:08:50.080 | And when you can, that starts to have a powerful effect.
02:08:53.300 | I think there's a deep principle
02:08:54.600 | that you're sharing there, which is that if we,
02:08:59.600 | we can have a rivalrous intent where our in-group,
02:09:05.480 | whatever it is, maybe it's our political party
02:09:07.520 | wanting to win within the US,
02:09:08.920 | maybe it's our nation state wanting to win in a war
02:09:12.880 | or an economic war over resource or whatever it is,
02:09:15.660 | that if we don't obliterate the other people completely,
02:09:19.840 | they don't go away.
02:09:21.400 | They're not engendered to like us more.
02:09:24.300 | They didn't become less smart.
02:09:27.160 | So they have more enmity towards us
02:09:28.800 | and whatever technologies we employed to be successful,
02:09:31.280 | they will now reverse engineer,
02:09:33.160 | make iterations on and come back.
02:09:35.380 | And so you drive an arms race,
02:09:37.280 | which is why you can see that the wars were over history,
02:09:42.280 | employing more lethal weaponry.
02:09:45.840 | And not just the kinetic war,
02:09:47.480 | the information war and the narrative war
02:09:51.680 | and the economic war, right?
02:09:53.600 | Like it just increased capacity in all of those fronts.
02:09:56.940 | And so what seems like a win to us on the short term
02:10:02.760 | might actually really produce losses in the long term.
02:10:05.380 | And what's even in our own best interest in the long term
02:10:07.780 | is probably more aligned with everyone else
02:10:09.300 | 'cause we inter-affect each other.
02:10:10.780 | And I think the thing about globalism,
02:10:13.780 | globalization and exponential tech
02:10:16.060 | and the rate at which we affect each other
02:10:17.740 | and the rate at which we affect the biosphere
02:10:19.100 | that we're all affected by
02:10:20.860 | is that this kind of proverbial spiritual idea
02:10:25.860 | that we're all interconnected
02:10:28.100 | and need to think about that in some way,
02:10:30.180 | that it was easy for tribes to get,
02:10:32.620 | because everyone in the tribe so clearly
02:10:34.740 | saw their interconnection and dependence on each other.
02:10:37.720 | But in terms of a global level,
02:10:39.480 | the speed at which we are actually interconnected,
02:10:43.720 | the speed at which the harm happening to something in Wuhan
02:10:46.860 | affects the rest of the world
02:10:48.000 | or a new technology developed somewhere
02:10:50.360 | affects the entire world or an environmental issue
02:10:52.440 | or whatever is making it to where
02:10:54.880 | we either actually all get,
02:10:57.140 | not as a spiritual idea, just even as physics, right?
02:10:59.560 | We all get the interconnectedness of everything
02:11:01.800 | and that we either all consider that
02:11:04.120 | and see how to make it through more effectively together
02:11:06.940 | or failures anywhere end up becoming
02:11:10.160 | decreased quality of life and failures
02:11:11.800 | and increased risk everywhere.
02:11:13.240 | - Don't you think people are beginning to experience that
02:11:15.280 | at the individual level?
02:11:16.480 | So governments are resisting it.
02:11:18.320 | They're trying to make us not empathize with each other,
02:11:20.880 | feel connected.
02:11:21.720 | But don't you think people are beginning
02:11:23.000 | to feel more and more connected?
02:11:25.000 | Like, isn't that exactly what the technology is enabling?
02:11:27.880 | Like social networks, we tend to criticize them,
02:11:30.040 | but isn't there a sense which we're experiencing?
02:11:35.040 | - When you watch those videos that are criticizing,
02:11:39.380 | whether it's the woke Antifa side
02:11:42.680 | or the QAnon Trump supporter side,
02:11:46.040 | does it seem like they have increased empathy
02:11:49.300 | for people that are outside of their ideologic camp?
02:11:51.560 | - No, not at all.
02:11:52.400 | So I may be conflating my own experience
02:11:58.200 | of the world and that of the populace.
02:12:01.200 | I tend to see those videos as feeding something
02:12:09.640 | that's a relic of the past.
02:12:12.060 | They figured out that drama fuels clicks,
02:12:16.360 | but whether I'm right or wrong, I don't know.
02:12:19.440 | But I tend to sense that that is not,
02:12:23.400 | that hunger for drama is not fundamental to human beings.
02:12:27.320 | That we want to actually,
02:12:29.080 | that we want to understand Antifa
02:12:33.240 | and we want to empathize.
02:12:34.840 | We want to take radical ideas
02:12:37.200 | and be able to empathize with them and synthesize it all.
02:12:41.840 | - Okay, let's look at cultural outliers
02:12:46.840 | in terms of violence versus compassion.
02:12:51.920 | We can see that a lot of cultures
02:12:53.660 | have relatively lower in-group violence,
02:12:56.980 | bigger out-group violence.
02:12:58.700 | And there's some variance in them
02:13:00.020 | and variance at different times
02:13:01.180 | based on the scarcity or abundance of resource
02:13:02.980 | and other things.
02:13:04.380 | But you can look at, say, Jains,
02:13:09.060 | whose whole religion is around nonviolence
02:13:11.820 | so much so that they don't even hurt plants.
02:13:13.740 | They only take fruits that fall off them and stuff.
02:13:16.620 | Or to go to a larger population, you take Buddhists,
02:13:19.820 | where for the most part, with a few exceptions,
02:13:22.140 | for the most part, across three millennia
02:13:24.540 | and across lots of different countries
02:13:26.320 | and geographies and whatever,
02:13:28.120 | you have 10 million people, plus or minus,
02:13:30.320 | who don't hurt bugs.
02:13:31.460 | The whole spectrum of genetic variance
02:13:34.820 | that is happening within a culture of that many people
02:13:37.980 | and head traumas and whatever, and nobody hurts bugs.
02:13:41.900 | And then you look at a group where the kids grew up
02:13:44.480 | as child soldiers in Liberia or Darfur,
02:13:47.580 | where to make it to adulthood,
02:13:48.660 | pretty much everybody's killed people, hand to hand,
02:13:51.620 | and killed people who were civilian
02:13:53.260 | or innocent type of people.
02:13:55.460 | And you say, okay, so we were very neotenous.
02:13:58.620 | We can be conditioned by our environment,
02:14:00.260 | and humans can be conditioned,
02:14:02.900 | where almost all the humans show up
02:14:04.940 | in these two different bell curves.
02:14:06.420 | It doesn't mean that the Buddhists had no violence.
02:14:08.200 | It doesn't mean that these people had no compassion,
02:14:09.940 | but they're very different Gaussian distributions.
02:14:13.820 | And so I think one of the important things
02:14:17.940 | that I like to do is look at the examples
02:14:20.640 | of the populations, what Buddhism shows regarding compassion
02:14:24.580 | or what Judaism shows around education,
02:14:28.900 | the average level of education that everybody gets
02:14:31.260 | 'cause of a culture that is really working
02:14:32.540 | on conditioning it or various cultures.
02:16:35.260 | What are the positive deviance cases
02:16:37.940 | outside of the statistical average
02:14:40.440 | to see what is actually possible?
02:14:42.720 | And then say, what are the conditioning factors?
02:14:45.600 | And can we condition those
02:14:46.860 | across a few of them simultaneously?
02:14:48.900 | And could we build a civilization like that?
02:14:50.880 | Becomes a very interesting question.
02:14:53.400 | So there's this kind of realpolitik idea
02:14:55.580 | that humans are violent.
02:14:58.440 | Large groups of humans become violent.
02:15:00.920 | They become irrational, specifically those two things,
02:15:02.980 | rivalrous and violent and irrational.
02:15:05.020 | And so in order to minimize the total amount of violence
02:15:07.900 | and have some good decisions, they need to be ruled somehow.
02:15:10.440 | And that not getting that is some kind of naive utopianism
02:15:13.960 | that doesn't understand human nature yet.
02:15:16.020 | This gets back to like mimesis of desire
02:15:18.100 | as an inexorable thing.
02:15:19.280 | I think the idea of the masses
02:15:22.080 | is actually a kind of propaganda
02:15:24.160 | that is useful for the classes that control
02:15:28.060 | to popularize the idea that most people are too violent,
02:15:34.540 | lazy, undisciplined and irrational to make good choices.
02:15:39.540 | And therefore their choices should be sublimated
02:15:42.040 | in some kind of way.
02:15:43.440 | I think that if we look back
02:15:44.700 | at these conditioning environments,
02:15:47.660 | we can say, okay, so the kids,
02:15:51.060 | they go to a really fancy school
02:15:56.060 | and have a good developmental environment
02:15:57.820 | like Exeter Academy.
02:15:59.620 | There's still a Gaussian distribution
02:16:00.980 | of how well they do on any particular metric,
02:16:03.420 | but on average, they become senators.
02:16:06.100 | And the worst ones become high-end lawyers or whatever.
02:16:09.180 | And then I look at the inner city school
02:16:10.700 | with a totally different set of things.
02:16:12.100 | And I see a very, very differently displaced
02:16:13.980 | Gaussian distribution,
02:16:14.820 | but a very different set of conditioning factors.
02:16:16.400 | So then I say the masses,
02:16:18.020 | well, if all those kids
02:16:18.940 | who were one of the parts of the masses
02:16:20.460 | got to go to Exeter and have that family and whatever,
02:16:22.420 | would they still be the masses?
02:16:23.980 | Could we actually condition more social virtue,
02:16:29.940 | more civic virtue,
02:16:30.900 | more orientation towards dialectical synthesis,
02:16:33.460 | more empathy, more rationality widely?
02:16:38.160 | Would that lead to better capacity
02:16:42.280 | for something like participatory governance,
02:16:44.540 | democracy or republic
02:16:45.680 | or some kind of participatory governance?
02:16:47.920 | - Yes. - Yes.
02:16:49.760 | Is it necessary for it, actually?
02:16:53.500 | And is it good for class interests?
02:16:56.960 | Not really.
02:16:58.140 | - By the way, when you say class interests,
02:17:01.200 | this is the powerful lording over the less powerful,
02:17:04.120 | that kind of idea?
02:17:05.020 | - Anyone that benefits from asymmetries of power
02:17:08.880 | doesn't necessarily benefit
02:17:11.800 | from decreasing those asymmetries of power
02:17:14.100 | and kind of increasing the capacity of people more widely.
02:17:19.100 | And so when we talk about power,
02:17:24.760 | we're talking about asymmetries
02:17:26.940 | in agency, influence and control.
02:17:29.480 | - You think that hunger for power
02:17:31.160 | is fundamental to human nature?
02:17:33.260 | I think we should get that straight
02:17:34.720 | before we talk about other stuff.
02:17:35.980 | So like this pickup line that I use at a bar often,
02:17:41.560 | which is power corrupts and absolute power corrupts,
02:17:44.100 | absolutely.
02:17:45.500 | Is that true or is that just a fancy thing to say?
02:17:48.620 | In modern society, there's something to be said,
02:17:51.460 | have we changed as societies over time
02:17:54.140 | in terms of how much we crave power?
02:17:56.900 | - That there is an impulse towards power
02:18:01.220 | that is innate in people and can be conditioned
02:18:03.340 | one way or the other, yes.
02:18:04.420 | But you can see that Buddhist society
02:18:06.100 | does a very different thing with it at scale.
02:18:08.880 | That you don't end up seeing the emergence
02:18:11.000 | of the same types of sociopathic behavior
02:18:16.000 | and particularly then creating sociopathic institutions.
02:18:19.460 | And so it's like, is eating the foods
02:18:26.300 | that were rare in our evolutionary environment
02:18:28.340 | that give us more dopamine hit because they were rare
02:18:30.220 | and they're not anymore, salt, fat, sugar.
02:18:33.180 | Is there something pleasurable about those
02:18:35.020 | where humans have an orientation to overeat if they can?
02:18:38.260 | Well, the fact that there is that possibility
02:18:40.620 | doesn't mean everyone will obligately be obese
02:18:42.880 | and die of obesity, right?
02:18:43.920 | Like it's possible to have a particular impulse
02:18:48.420 | and to be able to understand it,
02:18:49.600 | have other ones and be able to balance them.
02:18:52.220 | And so to say that power dynamics are obligate in humans
02:18:57.220 | and we can't do anything about it is very similar to me
02:19:01.680 | to saying like everyone is gonna be obligately obese.
02:19:05.880 | - Yeah, so there's some degree to which those,
02:19:08.040 | the control of those impulses has to do
02:19:10.000 | with the conditioning early in life.
02:19:11.440 | - Yes, and the culture that creates the environment
02:19:15.040 | to be able to do that and then the recursion on that.
02:19:17.760 | - Okay, so what if we were to, just bear with me,
02:19:21.640 | just asking for a friend,
02:19:22.880 | if we were to kill all humans on earth and then start over,
02:19:26.780 | is there ideas about how to build up,
02:19:32.100 | okay, we don't have to kill it,
02:19:33.080 | let's leave the humans on earth, they're fine,
02:19:35.320 | and go to Mars and start a new society.
02:19:39.040 | Is there ways to construct systems of conditioning,
02:19:42.080 | education, of how we live with each other
02:19:45.800 | that would incentivize us properly?
02:19:52.020 | To not seek power, to not construct systems
02:19:56.440 | that are of asymmetry of power
02:20:00.200 | and to create systems that are resilient
02:20:01.720 | to all kinds of terrorist attacks,
02:20:03.400 | to all kinds of destructions?
02:20:05.380 | - I believe so.
02:20:08.140 | - So is there some inklings we could,
02:20:10.240 | of course, you probably don't have all the answers,
02:20:13.160 | but you have insights about what that looks like.
02:20:15.640 | I mean, is it just rigorous practice of dialectic synthesis
02:20:20.160 | as essentially conversations with assholes
02:20:23.520 | of various flavors until they're not assholes anymore
02:20:26.400 | because you've become deeply empathetic
02:20:28.520 | with their experience?
02:20:29.600 | - Okay, so there's a lot of things
02:20:33.080 | that we would need to construct to come back to this,
02:20:36.920 | like what is the basis of rivalry?
02:20:39.560 | How do you bind it?
02:20:40.940 | How does it relate to tech?
02:20:43.400 | If you have a culture that is doing less rivalry,
02:20:46.020 | does it always lose in war to those who do war better?
02:20:48.920 | And how do you make something on the enactment
02:20:51.040 | of how to get there from here?
02:20:53.240 | - Great, great.
02:20:54.080 | So what's rivalry?
02:20:55.440 | Is rivalry bad or good?
02:20:57.020 | So is another word for rivalry competition?
02:21:01.680 | - Yes, I think, roughly yes.
02:21:05.460 | I think bad and good are kind of silly concepts here.
02:21:10.460 | Good for some things, bad for other things.
02:21:12.640 | - For resilience. - Some contexts and others.
02:21:15.680 | Even that.
02:21:16.520 | Let me give you an example that relates back
02:21:19.800 | to the Facebook measuring thing
02:21:21.280 | you were mentioning a moment ago.
02:21:22.920 | First, I think what you're saying is actually aligned
02:21:26.320 | with the right direction
02:21:27.200 | and what I wanna get to in a moment,
02:21:29.220 | but it's not, the devil is in the details here.
02:21:32.180 | So-- - I enjoy praise.
02:21:34.360 | It feeds my ego.
02:21:35.360 | I grow stronger, so I appreciate that.
02:21:37.720 | - I will make sure to include one piece
02:21:39.560 | every 15 minutes as we go. - Thank you.
02:21:42.280 | - So it's easier to measure,
02:21:47.280 | there are problems with this argument,
02:21:52.760 | but there's also utility to it.
02:21:54.220 | So let's take it for the utility it has first.
02:21:56.960 | It's harder to measure happiness
02:22:00.800 | than it is to measure comfort.
02:22:04.840 | We can measure with technology
02:22:07.360 | that the shocks in a car are making the car bounce less,
02:22:10.600 | that the bed is softer and material science
02:22:14.900 | and those types of things.
02:22:16.200 | And happiness is actually hard for philosophers to define
02:22:19.660 | because some people find
02:22:22.840 | that there's certain kinds of overcoming suffering
02:22:24.760 | that are necessary for happiness.
02:22:26.120 | There's happiness that feels more like contentment
02:22:27.960 | and happiness that feels more like passion.
02:22:30.080 | Is passion the source of all suffering
02:22:31.580 | or the source of all creativity?
02:22:33.340 | There's deep stuff and it's mostly first person,
02:22:35.800 | not measurable third person stuff,
02:22:37.200 | even if maybe it corresponds to third person stuff
02:22:40.120 | to some degree, but we also see examples of,
02:22:42.460 | some of our favorite examples is people
02:22:44.020 | who are in the worst environments
02:22:45.300 | who end up finding happiness, right?
02:22:46.640 | Where the third person stuff looks to be less conducive
02:22:49.180 | and there's some Viktor Frankl, Nelson Mandela, whatever.
02:22:52.520 | But it's pretty easy to measure comfort
02:22:55.700 | and it's pretty universal.
02:22:57.060 | And I think we can see that the industrial revolution
02:22:59.980 | started to replace happiness with comfort quite heavily
02:23:03.760 | as the thing it was optimizing for.
02:23:05.360 | And we can see that when increased comfort is given,
02:23:07.860 | maybe because of the evolutionary disposition
02:23:10.340 | that expending extra calories
02:23:12.680 | when for the majority of our history,
02:23:14.140 | we didn't have extra calories was not a safe thing to do.
02:23:17.320 | Who knows why?
02:23:18.280 | When extra comfort is given,
02:23:21.280 | it's very easy to take that path,
02:23:24.060 | even if it's not the path that supports
02:23:26.520 | overall wellbeing long-term.
02:23:28.680 | And so we can see that,
02:23:33.680 | when you look at the techno-optimist idea
02:23:37.140 | that we have better lives than Egyptian pharaohs
02:23:39.940 | and kings and whatever, what they're largely looking at
02:23:42.820 | is how comfortable our beds are
02:23:45.960 | and how comfortable the transportation systems are
02:23:48.060 | and things like that,
02:23:48.900 | in which case there's massive improvement.
02:23:50.280 | But we also see that in some of the nations
02:23:52.660 | where people have access to the most comfort,
02:23:54.340 | suicide and mental illness are the highest.
02:23:57.140 | And we also see that some of the happiest cultures
02:24:00.540 | are actually some of the ones
02:24:01.380 | that are in materially lame environments.
02:24:04.080 | And so there's a very interesting question here.
02:24:06.220 | And if I understand correctly, you do cold showers.
02:24:09.980 | And Joe Rogan was talking about how he needs to do
02:24:12.440 | some fairly intensive kind of struggle
02:24:16.720 | that is a non-comfort
02:24:18.080 | to actually induce being better as a person,
02:24:20.520 | this concept of hormesis,
02:24:22.440 | that it's actually stressing an adaptive system
02:24:26.400 | that increases its adaptive capacity
02:24:28.420 | and that there's something that the happiness of a system
02:24:32.180 | has something to do with its adaptive capacity,
02:24:34.540 | its overall resilience, health, wellbeing,
02:24:36.480 | which requires a decent bit of discomfort.
02:24:39.440 | And yet in the presence of the comfort solution,
02:24:44.080 | it's very hard to not choose it.
02:24:45.720 | And then as you're choosing it regularly
02:24:47.600 | to actually down-regulate your overall adaptive capacity.
02:24:51.280 | And so when we start saying,
02:24:55.000 | can we make tech where we're measuring
02:25:00.000 | for the things that it produces
02:25:01.320 | beyond just the measure of GDP
02:25:03.120 | or whatever particular measures look like
02:25:06.600 | the revenue generation or profit generation of my business,
02:25:09.560 | are all the meaningful things measurable?
02:25:14.200 | And what are the right measures?
02:25:17.700 | And what are the externalities of optimizing
02:25:20.140 | for that measurement set?
02:25:21.240 | What meaningful things aren't included
02:25:22.800 | in that measurement set
02:25:23.880 | that might have their own externalities?
02:25:25.960 | These are some of the questions
02:25:26.880 | we actually have to take seriously.
02:25:28.160 | - Yeah, and I think they're answerable questions, right?
02:25:31.120 | - Progressively better, not perfect.
02:25:33.120 | - Right, so first of all,
02:25:34.720 | let me throw out happiness and comfort out of the discussion.
02:25:37.480 | Those seem like useless.
02:25:39.280 | The distinction, 'cause I said they're useful,
02:25:42.320 | wellbeing is useful, but I think I take it back.
02:25:45.780 | I proposed new metrics in this brainstorm session,
02:25:52.400 | which is, so one is like personal growth,
02:25:57.400 | which is intellectual growth.
02:26:01.080 | I think we're able to make that concrete for ourselves.
02:26:05.640 | Like you're a better person than you were a week ago,
02:26:10.360 | or a worse person than you were a week ago.
02:26:12.200 | I think we can ourselves report that
02:26:15.760 | and understand what that means.
02:26:17.840 | It's this gray area, and we try to define it,
02:26:20.480 | but I think we humans are pretty good at that,
02:26:22.680 | because we have a sense, an idealistic sense
02:26:25.040 | of the person we might be able to become.
02:26:27.120 | We all dream of becoming a certain kind of person,
02:26:29.320 | and I think we have a sense of getting closer
02:26:32.600 | or not towards that person.
02:26:34.040 | Maybe this is not a great metric, fine.
02:26:36.220 | The other one is love, actually.
02:26:38.440 | Fuck if you're happy or not, or you're comfortable or not,
02:26:42.900 | how much love do you have towards your fellow human beings?
02:26:47.120 | I feel like if you try to optimize that
02:26:48.920 | and increasing that, that's going to have,
02:26:51.400 | that's a good metric.
02:26:52.500 | How many times a day, sorry, if I can quantify,
02:26:58.800 | how many times a day have you thought positively
02:27:00.920 | of another human being?
02:27:02.560 | Just put that down as a number, and increase that number.
02:27:06.400 | - I think the process of saying,
02:27:08.960 | okay, so let's not take GDP or GDP per capita
02:27:13.200 | as the metric we wanna optimize for,
02:27:14.680 | because GDP goes up during war,
02:27:16.500 | and it goes up with more healthcare spending
02:27:18.500 | from sicker people and various things
02:27:20.040 | that we wouldn't say correlate to quality of life.
02:27:22.700 | Addiction drives GDP awesomely.
02:27:24.680 | - By the way, when I said growth, I wasn't referring to GDP.
02:27:28.200 | - I know, I'm giving an example now
02:27:30.120 | of the primary metric we use
02:27:32.080 | and why it's not an adequate metric,
02:27:33.880 | 'cause we're exploring other ones.
02:27:35.280 | So the idea of saying, what would the metrics
02:27:38.360 | for a good civilization be?
02:27:41.640 | If I had to pick a set of metrics,
02:27:43.160 | what would the best ones be,
02:27:44.080 | if I was gonna optimize for those?
02:27:46.200 | And then really try to run the thought experiment
02:27:49.140 | more deeply and say, okay,
02:27:51.140 | so what happens if we optimize for that?
02:27:53.320 | Try to think through the first and second
02:27:56.360 | and third order effects of what happens that's positive,
02:27:58.880 | and then also say what negative things
02:28:01.580 | can happen from optimizing that?
02:28:02.820 | What actually matters that is not included in that
02:28:05.540 | or in that way of defining it?
02:28:06.820 | Because love versus number of positive thoughts per day,
02:28:09.860 | I could just make a long list of names
02:28:11.740 | and just say positive thing about each one.
02:28:13.780 | It's all very superficial.
02:28:15.580 | Not include animals or the rest of life,
02:28:17.620 | have a very shallow total amount of it,
02:28:20.780 | but I'm optimizing the number,
02:28:21.900 | and I get some credit for the number.
02:28:23.580 | So, and this is when I said
02:28:25.580 | the model of reality isn't reality.
02:28:27.340 | When you make a set of metrics,
02:28:30.700 | say we're gonna optimize for this,
02:28:32.200 | whatever reality is that is not included in those metrics
02:28:35.420 | can be the areas where harm occurs,
02:28:36.800 | which is why I would say that wisdom is something like
02:28:43.780 | the discernment that leads to right choices
02:28:47.280 | beyond what metrics-based optimization would offer.
02:28:52.940 | - Yeah, but another way to say that is
02:28:57.060 | wisdom is a constantly expanding
02:29:03.100 | and evolving set of metrics.
02:29:04.980 | - Which means that there is something in you
02:29:07.940 | that is recognizing a new metric that's important
02:29:10.100 | that isn't part of that metric set.
02:29:11.420 | So, there's a certain kind of connection,
02:29:13.820 | discernment, awareness, and this is--
02:29:17.420 | - It's iterative game theory.
02:29:18.940 | - There's Gödel's incompleteness theorem, right?
02:29:20.700 | Which is if the set of things is consistent,
02:29:23.400 | it won't be complete.
02:29:24.240 | So, we're gonna keep adding to it,
02:29:25.460 | which is why we were saying earlier,
02:29:27.540 | I don't think it's not beautiful.
02:29:29.180 | And especially if you were just saying
02:29:31.340 | one of the metrics you wanna optimize for
02:29:32.700 | at the individual level is becoming, right?
02:29:34.500 | That we're becoming more.
02:29:35.340 | Well, that then becomes true for the civilization
02:29:37.300 | and our metric sets as well.
02:29:39.620 | And our definition of how to think about a meaningful life
02:29:42.420 | and a meaningful civilization.
02:29:44.220 | I can tell you what some of my favorite metrics are.
02:29:46.540 | - What's that?
02:29:47.380 | - Well, love is obviously not a metric.
02:29:52.380 | - Oh, what you can bench?
02:29:53.860 | - Yeah. - It's a good metric.
02:29:55.060 | - Yeah, I wanna optimize that across the entire population,
02:29:57.460 | starting with infants.
02:29:58.560 | So, in the same way that love isn't a metric,
02:30:05.260 | but you could make metrics that look at certain parts of it.
02:30:07.380 | The thing I'm about to say isn't a metric,
02:30:08.980 | but it's a consideration.
02:30:11.260 | 'Cause I thought about this a lot.
02:30:12.340 | I don't think there is a metric, a right one.
02:30:14.700 | I think that every metric by itself,
02:30:18.420 | without this thing we talked about,
02:30:19.660 | of the continuous improvement
02:30:20.740 | becomes a paperclip maximizer.
02:30:22.620 | I think that's what the idea of false idol means
02:30:26.540 | in terms of the model of reality not being reality.
02:30:29.700 | Then my sacred relationship is to reality itself,
02:30:32.460 | which also binds me to the unknown forever.
02:30:34.940 | To the known, but also to the unknown.
02:30:36.420 | And there's a sense of sacredness connected to the unknown
02:30:39.660 | that creates an epistemic humility
02:30:41.300 | that is always seeking not just to optimize the thing I know,
02:30:43.860 | but to learn new stuff.
02:30:45.740 | And to be open to perceive reality directly.
02:30:47.540 | So, my model never becomes sacred.
02:30:48.980 | My model is useful.
02:30:49.940 | - So, the model can't be the false idol.
02:30:52.980 | - Correct.
02:30:54.540 | And this is why the first verse of the Tao Te Ching
02:30:57.780 | is the Tao that is nameable is not the eternal Tao.
02:31:00.540 | The naming then can become the source of the 10,000 things
02:31:03.100 | that if you get too carried away with it,
02:31:04.620 | can actually obscure you from paying attention to reality
02:31:07.260 | beyond the models.
02:31:09.300 | - It sounds a lot like Stephen Wolfram,
02:31:11.540 | but in a different language, much more poetic.
02:31:14.020 | - I can imagine that.
02:31:15.420 | - No, I'm referring to, I'm joking.
02:31:17.820 | But there's echoes of cellular automata,
02:31:19.820 | which you can't name.
02:31:21.260 | You can't construct a good model of cellular automata.
02:31:24.300 | You can only watch in awe.
02:31:25.920 | I apologize.
02:31:27.940 | I'm distracting your train of thought
02:31:29.580 | horribly and miserably.
02:31:31.140 | Making a difference.
02:31:31.980 | By the way, something robots aren't good at.
02:31:34.380 | And dealing with the uncertainty of uneven ground.
02:31:37.420 | You've been okay so far.
02:31:38.980 | You've been doing wonderfully.
02:31:40.300 | So what's your favorite metrics?
02:31:41.660 | - Okay, so-- - That's why I know
02:31:42.500 | you're not a robot.
02:31:43.320 | You passed a Turing test.
02:31:44.220 | - So one metric, and there are problems with this,
02:31:48.220 | but one metric that I like to just,
02:31:50.300 | as a thought experiment to consider,
02:31:52.420 | is 'cause you're actually asking,
02:31:55.380 | we're, I mean, I know you ask your guests
02:31:58.420 | about the meaning of life.
02:31:59.380 | 'Cause ultimately when you're saying
02:32:01.220 | what is a desirable civilization,
02:32:03.660 | you can't answer that without answering
02:32:05.740 | what is a meaningful human life?
02:32:07.220 | And to say what is a good civilization?
02:32:10.140 | 'Cause it's gonna be in relationship to that, right?
02:32:12.740 | And then you have whatever your answer is,
02:32:19.020 | how do you know?
02:32:19.860 | What is the epistemic basis for postulating that?
02:32:23.940 | - There's also a whole 'nother reason
02:32:26.420 | for asking that question.
02:32:27.540 | I don't, I mean, that doesn't even apply to you whatsoever,
02:32:31.700 | which is it's interesting how few people
02:32:34.900 | have been asked questions like it.
02:32:39.300 | We joke about these questions as silly.
02:32:43.820 | - Right.
02:32:44.820 | - It's funny to watch a person.
02:32:47.940 | And if I was more of an asshole,
02:32:49.420 | I would really stick on that question.
02:32:51.620 | - Right.
02:32:52.580 | - It's a silly question in some sense,
02:32:54.580 | but we haven't really considered what it means.
02:32:58.140 | Just a more concrete version of that question
02:33:00.580 | is what is a better world?
02:33:02.900 | What is the kind of world we're trying to create?
02:33:04.900 | Really?
02:33:06.140 | Have you really thought about the kind of world?
02:33:08.460 | - I'll give you some kind of simple answers to that
02:33:10.900 | that are meaningful to me.
02:33:12.940 | But let me do the societal indices first,
02:33:14.580 | 'cause they're fun.
02:33:15.420 | - Yes.
02:33:16.260 | - We should take a note of this meaningful thing,
02:33:19.100 | 'cause it's important to come back to.
02:33:21.180 | - Are you reminding me to ask you
02:33:22.420 | about the meaning of life?
02:33:23.380 | Noted.
02:33:24.220 | - I know.
02:33:25.060 | - Let me jot that down.
02:33:28.460 | - So, 'cause I think I stopped tracking at
02:33:31.540 | like 25 open threads.
02:33:33.180 | Okay.
02:33:35.220 | - Let it all burn.
02:33:36.060 | - One index that I find very interesting
02:33:40.060 | is the inverse correlation of addiction within the society.
02:33:43.020 | The more a society produces addiction
02:33:47.460 | within the people in it,
02:33:49.180 | the less healthy I think the society is
02:33:52.300 | as a pretty fundamental metric.
02:33:54.740 | And so the more the individuals feel
02:33:56.980 | that there are less compulsive things
02:34:00.380 | in compelling them to behave in ways
02:34:02.900 | that are destructive to their own values.
02:34:05.100 | And insofar as a civilization is conditioning
02:34:09.260 | and influencing the individuals within it,
02:34:12.140 | the inverse of addiction.
02:34:13.660 | - Broadly defined.
02:34:15.620 | - Correct.
02:34:16.460 | - Addiction.
02:34:17.300 | What's it?
02:34:18.540 | - Compulsive behavior that is destructive
02:34:21.220 | towards things that we value.
02:34:26.900 | - Yeah.
02:34:28.180 | - I think that's a very interesting one to think about.
02:34:29.940 | - That's a really interesting one, yeah.
02:34:31.100 | - And this is then also where comfort
02:34:32.740 | and addiction start to get very close.
02:34:35.460 | And the ability to go in the other direction from addiction
02:34:38.900 | is the ability to be exposed to hypernormal stimuli
02:34:41.660 | and not go down the path of desensitizing to other stimuli
02:34:46.460 | and needing that hypernormal stimuli,
02:34:48.540 | which does involve a kind of hormesis.
02:34:50.980 | So I do think the civilization of the future
02:34:54.500 | has to create something like ritualized discomfort.
02:34:58.740 | And...
02:35:01.380 | - Ritualized discomfort.
02:35:05.100 | - Yeah.
02:35:06.260 | I think that's what the sweat lodge and the vision quest
02:35:09.500 | and the solo journey and the ayahuasca journey
02:35:11.980 | and the Sundance were.
02:35:13.420 | I think it's even a big part of what yoga asana was,
02:35:16.020 | is to make beings that are resilient and strong,
02:35:21.020 | they have to overcome some things.
02:35:23.060 | To make beings that can control their own mind and fear,
02:35:25.580 | they have to face some fears.
02:35:27.500 | But we don't want to put everybody in war or real trauma.
02:35:31.140 | And yet we can see that the most fucked up people we know
02:35:34.780 | had childhoods of a lot of trauma,
02:35:36.380 | but some of the most incredible people we know
02:35:38.340 | had childhoods of a lot of trauma,
02:35:40.060 | whether or not they happened to make it through
02:35:41.980 | and overcome that or not.
02:35:43.220 | So how do we get the benefits of the steeling of character
02:35:48.220 | and the resilience and the whatever
02:35:49.820 | that happened from the difficulty
02:35:51.140 | without traumatizing people?
02:35:52.940 | A certain kind of ritualized discomfort
02:35:56.300 | that not only has us overcome something by ourselves,
02:36:01.620 | but overcome it together with each other
02:36:03.220 | where nobody bails when it gets hard
02:36:04.820 | 'cause the other people are there.
02:36:05.780 | So it's both a resilience of the individuals
02:36:08.060 | and a resilience of the bonding.
02:36:09.660 | So I think we'll keep getting more and more
02:36:12.780 | comfortable stuff, but we have to also develop resilience
02:36:16.300 | in the presence of that for the anti-addiction direction
02:36:21.100 | and the fullness of character
02:36:23.060 | and the trustworthiness to others.
02:36:24.900 | - So you have to be consistently injecting discomfort
02:36:29.260 | into the system, ritualize.
02:36:31.180 | I mean, this sounds like you have to imagine Sisyphus happy.
02:36:35.300 | You have to imagine Sisyphus with his rock,
02:36:39.140 | optimally resilient from a metrics perspective in society.
02:36:47.020 | So we want to constantly be throwing rocks at ourselves.
02:36:52.020 | - Not constantly.
02:36:53.400 | You didn't have to do- - Frequently.
02:36:56.140 | - Periodically. - Periodically.
02:36:58.860 | - And there's different levels of intensity,
02:37:00.740 | different periodicities.
02:37:01.620 | Now, I do not think this should be imposed by states.
02:37:04.720 | I think it should emerge from cultures.
02:37:07.900 | And I think the cultures are developing people
02:37:11.020 | that understand the value of it.
02:37:13.340 | So there is both a cultural cohesion to it,
02:37:17.340 | but there's also a voluntarism because the people value
02:37:20.340 | the thing that is being developed, they understand it.
02:37:22.380 | - And that's what conditioning, so it's conditioning.
02:37:24.300 | It's conditioning some of these values.
02:37:28.140 | - Conditioning is a bad word
02:37:29.300 | because we like our idea of sovereignty,
02:37:30.900 | but when we recognize the language that we speak
02:37:34.380 | and the words that we think in
02:37:35.980 | and the patterns of thought built into that language
02:37:39.220 | and the aesthetics that we like,
02:37:40.380 | and so much is conditioned in us
02:37:42.340 | just based on where we're born,
02:37:43.780 | you can't not condition people.
02:37:45.380 | So all you can do is take more responsibility
02:37:47.340 | for what the conditioning factors are,
02:37:48.860 | and then you have to think about this question
02:37:50.380 | of what is a meaningful human life?
02:37:51.700 | Because we're, unlike the other animals born into environment
02:37:55.140 | that they're genetically adapted for,
02:37:56.680 | we're building new environments
02:37:58.140 | that we were not adapted for,
02:37:59.700 | and then we're becoming affected by those.
02:38:02.100 | So then we have to say, well, what kinds of environments,
02:38:04.400 | digital environments, physical environments,
02:38:06.460 | social environments would we want to create
02:38:10.820 | that would develop the healthiest, happiest,
02:38:13.820 | most moral, noble, meaningful people?
02:38:16.460 | What are even those sets of things that matter?
02:38:18.220 | So you end up getting deep existential consideration
02:38:21.680 | at the heart of civilization design
02:38:23.380 | when you start to realize how powerful we're becoming
02:38:25.200 | and how much what we're building it
02:38:27.100 | in service towards matters.
02:38:29.180 | - Before I pull, I think, three threads you just laid down,
02:38:33.020 | is there another metric index that you're interested in?
02:38:35.740 | - I'll tell you one more that I really like.
02:38:37.700 | There's a number, but the next one that comes to mind is,
02:38:42.800 | I have to make a very quick model.
02:38:48.200 | Healthy human bonding,
02:38:53.100 | say we were in a tribal type setting,
02:38:56.100 | my positive emotional states
02:38:58.360 | and your positive emotional states
02:39:00.580 | would most of the time be correlated,
02:39:02.740 | your negative emotional states and mine.
02:39:04.480 | And so you start laughing, I start laughing,
02:39:07.160 | you start crying, my eyes might tear up.
02:39:09.780 | And we would call that the compassion-compersion axis.
02:39:14.780 | I would, this is a model I find useful.
02:39:18.780 | So compassion is when you're feeling something negative,
02:39:20.860 | I feel some pain, I feel some empathy,
02:39:22.900 | something in relationship.
02:39:24.400 | Compersion is when you do well, I'm stoked for you.
02:39:27.260 | Right, like I actually feel happiness at your happiness.
02:39:29.740 | - I like compersion.
02:39:31.220 | - Yeah, the fact that it's such an uncommon word in English
02:39:34.060 | is actually a problem culturally.
02:39:36.600 | - 'Cause I feel that often
02:39:37.440 | and I think that's a really good feeling to feel
02:39:39.760 | and maximize for actually.
02:39:40.900 | - That's actually the metric I'm gonna say.
02:39:42.620 | - Oh, wow.
02:39:43.500 | - Is the compassion-compersion axis
02:39:46.020 | is the thing I would optimize for.
02:39:47.540 | Now, there is a state where my emotional states
02:39:51.400 | and your emotional states are just totally decoupled.
02:39:54.680 | And that is like sociopathy.
02:39:57.260 | I don't want to hurt you, but I don't care if I do
02:39:59.700 | or for you to do well or whatever.
02:40:01.480 | But there's a worse state and it's extremely common,
02:40:03.500 | which is where they're inversely coupled,
02:40:05.980 | where my positive emotions correspond to your negative ones
02:40:09.620 | and vice versa.
02:40:11.120 | And that is the, I would call it the jealousy-sadism axis.
02:40:16.060 | The jealousy axis is when you're doing really well,
02:40:19.460 | I feel something bad.
02:40:20.700 | I feel taken away from, less than, upset, envious, whatever.
02:40:25.700 | And that's so common.
02:40:30.580 | But I think of it as kind of a low-grade psychopathology
02:40:34.380 | that we've just normalized.
02:40:35.740 | The idea that I'm actually upset at the happiness
02:40:39.540 | or fulfillment or success of another
02:40:40.920 | is like a profoundly fucked up thing.
02:40:43.360 | No, we shouldn't shame it and repress it so it gets worse.
02:40:45.440 | We should study it.
02:40:46.500 | Where does it come from?
02:40:47.340 | And it comes from our own insecurities and stuff.
02:40:50.580 | But then the next part that everybody knows
02:40:52.480 | is really fucked up is just on the same axis.
02:40:55.100 | It's the same inverted, which is to the jealousy
02:40:58.540 | or the envy is the, I feel badly when you're doing well.
02:41:02.120 | The sadism side is I actually feel good when you lose.
02:41:05.620 | Or when you're in pain,
02:41:06.560 | I feel some happiness that's associated.
02:41:07.980 | And you can see when someone feels jealous,
02:41:10.020 | sometimes they feel jealous with a partner
02:41:12.420 | and then they feel they want that partner to get it.
02:41:14.820 | Revenge comes up or something.
02:41:17.460 | So sadism is really, jealousy is one step on the path
02:41:21.260 | to sadism from the healthy compassion-compersion axis.
02:41:23.940 | So I would like to see a society that is inversely,
02:41:28.240 | that is conditioning sadism and jealousy inversely, right?
02:41:32.660 | The lower that amount
02:41:34.300 | and the more the compassion-compersion.
02:41:36.020 | And if I had to summarize that very simply,
02:41:37.780 | I'd say it would optimize for compersion.
02:41:39.820 | Which is, 'cause notice,
02:41:43.980 | that's not just saying love for you
02:41:46.140 | where I might be self-sacrificing and miserable
02:41:48.800 | and I love people, but I kill myself,
02:41:51.220 | which I don't think anybody thinks a great idea.
02:41:52.780 | Or happiness where I might be sociopathically happy
02:41:55.000 | where I'm causing problems all over the place
02:41:56.860 | or even sadistically happy, but it's a coupling, right?
02:42:00.600 | That I'm actually feeling happiness in relationship to yours
02:42:03.200 | and even in causal relationship where I,
02:42:05.460 | my own agentic desire to get happier
02:42:07.280 | wants to support you too.
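
(A minimal illustrative sketch, not anything proposed by either speaker: Daniel describes the compassion-compersion axis as two people's emotional states being positively coupled, sociopathy as decoupled, and the jealousy-sadism axis as inversely coupled. One crude, hypothetical way to operationalize that coupling is as the Pearson correlation between two self-reported mood series; the function names, threshold, and sample numbers below are all assumptions made up for demonstration.)

```python
# Hypothetical sketch: measure the "coupling" of two people's emotional states
# as a Pearson correlation, then map it onto the axes described above.
from statistics import correlation  # available in Python 3.10+


def emotional_coupling(my_moods, your_moods):
    """Pearson correlation between two equal-length mood series
    (e.g. daily self-reports on a -5..+5 scale)."""
    return correlation(my_moods, your_moods)


def classify_axis(r, threshold=0.3):
    """Map a correlation onto the axes from the conversation (threshold is arbitrary)."""
    if r >= threshold:
        return "compassion-compersion (positively coupled)"
    if r <= -threshold:
        return "jealousy-sadism (inversely coupled)"
    return "decoupled (the sociopathy region)"


if __name__ == "__main__":
    me = [2, 3, -1, 4, 0, -2, 3]    # made-up daily mood ratings
    you = [1, 4, -2, 3, 1, -1, 2]
    r = emotional_coupling(me, you)
    print(f"coupling r = {r:.2f} -> {classify_axis(r)}")
```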
02:42:08.760 | - That's actually, speaking of another pickup line,
02:42:12.260 | that's quite honestly what I,
02:42:15.660 | as a guy who's single,
02:42:17.240 | this is gonna come out very ridiculous
02:42:19.860 | 'cause it's like, oh yeah, where's your girlfriend, bro?
02:42:22.620 | But that's what I look for in a relationship
02:42:27.620 | 'cause it's like, it's so much,
02:42:30.340 | it's so, it's such an amazing life
02:42:33.500 | where you actually get joy from another person's success
02:42:37.920 | and they get joy from your success.
02:42:40.020 | And then it becomes like,
02:42:41.660 | you don't actually need to succeed much
02:42:44.100 | for that to have a, like a,
02:42:45.820 | like a cycle of just like happiness
02:42:49.660 | that just increases like exponentially.
02:42:52.280 | It's weird.
02:42:53.140 | So like just be, just enjoying the happiness of others,
02:42:56.300 | the success of others.
02:42:58.040 | So this is like the, let's call this,
02:43:01.020 | 'cause the first person that drilled this into my head
02:43:03.020 | is Rogan, Joe Rogan.
02:43:04.860 | He was the embodiment of that
02:43:06.180 | 'cause I saw somebody who was successful, rich,
02:43:11.180 | and nonstop, true, I mean,
02:43:15.100 | you could tell when somebody's full of shit
02:43:16.700 | and somebody's not,
02:43:18.180 | really genuinely enjoying the success of his friends.
02:43:21.940 | That was weird to me.
02:43:23.060 | That was interesting.
02:43:23.900 | And I mean, the way you're kind of speaking to it,
02:43:27.340 | the reason Joe stood out to me
02:43:30.020 | is I guess I haven't witnessed genuine expression
02:43:33.020 | of that often in this culture,
02:43:35.380 | of just real joy for others.
02:43:38.660 | I mean, part of that has to do,
02:43:39.660 | there hasn't been many channels where you can watch
02:43:43.640 | or listen to people being their authentic selves.
02:43:46.300 | So I'm sure there's a bunch of people
02:43:47.660 | who live life with compersion.
02:43:49.580 | They probably don't seek public attention also,
02:43:52.420 | but that was, yeah, if there was any word
02:43:56.780 | that could express what I've learned from Joe
02:43:58.740 | and why he's been a really inspiring figure
02:44:01.340 | is that compersion.
02:44:02.820 | And I wish our world had a lot more of that
02:44:07.820 | 'cause then, I mean, my own,
02:44:12.580 | sorry to go in a small tangent,
02:44:14.380 | but you're speaking how society should function,
02:44:19.060 | but I feel like if you optimize for that metric
02:44:21.780 | in your own personal life,
02:44:23.540 | you're going to live a truly fulfilling life.
02:44:27.860 | I don't know what the right word to use,
02:44:30.020 | but that's a really good way to live life.
02:44:32.180 | - You will also learn what gets in the way of it
02:44:36.140 | and how to work with it that if you wanted to help
02:44:38.300 | try to build systems at scale or apply Facebook
02:44:40.900 | or exponential technologies to do that,
02:44:42.820 | you would have more actual depth of real knowledge
02:44:45.540 | of what that takes.
02:44:48.660 | And this is, as you mentioned,
02:44:50.420 | that there's this virtuous cycle
02:44:51.740 | between when you get stoked on other people doing well
02:44:53.740 | and then they have a similar relationship to you
02:44:55.700 | and everyone is in the process of building each other up.
02:44:58.300 | And this is what I would say the healthy version
02:45:02.340 | of competition is versus the unhealthy version.
02:45:04.740 | The healthy version, right, the root,
02:45:08.540 | I believe it's a Latin word that means to strive together.
02:45:11.940 | And it's that impulse of becoming
02:45:14.140 | where I want to become more,
02:45:15.220 | but I recognize that there's actually a hormesis,
02:45:17.740 | there's a challenge that is needed
02:45:19.340 | for me to be able to do that.
02:45:21.260 | But that means that, yes,
02:45:22.980 | there's an impulse where I'm trying to get ahead,
02:45:24.620 | maybe I'm even trying to win,
02:45:25.780 | but I actually want a good opponent
02:45:27.740 | and I want them to get ahead too
02:45:29.260 | because that is where my ongoing becoming happens.
02:45:31.620 | And the win itself will get boring very quickly.
02:45:34.460 | The ongoing becoming is where there's aliveness.
02:45:37.260 | And for the ongoing becoming, they need to have it too.
02:45:39.780 | And that's the strive together.
02:45:41.260 | So in the healthy competition,
02:45:42.460 | I'm stoked when they're doing really well
02:45:44.340 | 'cause my becoming is supported by it.
02:45:47.140 | - Now, this is actually a very nice segue
02:45:48.780 | into a model I like about what a meaningful human life is,
02:45:53.780 | if you want to go there.
02:45:57.900 | - Let's go there.
02:46:00.940 | - We can go somewhere else if you want.
02:46:02.820 | - Well, I have three things I'm going elsewhere with,
02:46:05.820 | but if we were first,
02:46:07.460 | let us take a short stroll
02:46:09.500 | through the park of the meaning of life.
02:46:12.220 | Daniel, what is a meaningful life?
02:46:16.300 | - I think the semantics end up mattering
02:46:20.300 | 'cause a lot of people will take the word meaning
02:46:24.300 | and the word purpose almost interchangeably.
02:46:27.340 | And they'll think kind of what is the meaning of my life?
02:46:30.980 | What is the meaning of human life?
02:46:32.100 | What is the meaning of life?
02:46:32.940 | What's the meaning of the universe?
02:46:34.780 | And what is the meaning of existence rather than non-existence?
02:46:38.620 | So there's a lot of kind of existential considerations there.
02:46:41.780 | And I think there's some cognitive mistakes
02:46:44.340 | that are very easy.
02:46:45.940 | Like taking the idea of purpose.
02:46:48.860 | - Which is like a goal.
02:46:50.300 | - Which is a utilitarian concept.
02:46:51.980 | The purpose of one thing is defined in relationship
02:46:54.500 | to other things that have assumed value.
02:46:56.500 | And to say, what is the purpose of everything?
02:47:00.780 | Well, it's a purpose is too small of a question.
02:47:03.380 | It's fundamentally a relative question within everything.
02:47:05.740 | What is the purpose of one thing relative to another?
02:47:07.660 | What is the purpose of everything?
02:47:08.860 | And there's nothing outside of it with which to say it.
02:47:11.100 | You actually just got to the limits of the utility
02:47:14.100 | of the concept of purpose.
02:47:16.100 | It doesn't mean it's purposeless in the sense
02:47:17.860 | of something inside of it being purposeless.
02:47:19.420 | It means the concept is too small.
02:47:21.380 | Which is why you end up getting to,
02:47:23.180 | like in Taoism talking about the nature of it.
02:47:27.380 | Rather, there's a fundamental what,
02:47:30.940 | where the why can't go deeper.
02:47:32.460 | It is the nature of it.
02:47:33.620 | But I'm gonna try to speak to a much simpler part,
02:47:39.180 | which is when people think about
02:47:40.460 | what is a meaningful human life?
02:47:42.300 | And kind of if we were to optimize for something
02:47:44.780 | at the level of individual life,
02:47:46.580 | but also how does optimizing for this
02:47:49.340 | at the level of the individual life
02:47:50.620 | lead to the best society insofar
02:47:55.380 | as people living that way affects others
02:47:57.460 | and long-term to the world as a whole?
02:47:59.780 | And how would we then make a civilization
02:48:02.300 | that was trying to think about these things?
02:48:04.500 | 'Cause you can see that there are a lot of dialectics
02:48:09.780 | where there's value on two sides,
02:48:13.020 | individualism and collectivism,
02:48:14.620 | or the ability to accept things
02:48:19.460 | and the ability to push harder and whatever.
02:48:21.940 | And there's failure modes on both sides.
02:48:24.260 | And so when you were starting to say,
02:48:27.260 | okay, individual happiness,
02:48:28.780 | and you're like, wait, fuck, sadists can be happy
02:48:30.420 | while hurting people.
02:48:31.260 | It's not individual happiness, it's love.
02:48:32.620 | But wait, some people can self-sacrifice out of love
02:48:35.500 | in a way that actually ends up
02:48:36.500 | just creating codependency for everybody.
02:48:38.980 | Or, okay, so how do we think about
02:48:41.940 | all those things together?
02:48:43.340 | This kind of came to me as a simple way
02:48:51.220 | that I kind of relate to it is
02:48:52.780 | that a meaningful life involves the mode of being,
02:48:57.420 | the mode of doing, and the mode of becoming.
02:49:00.060 | And it involves a virtuous relationship
02:49:02.500 | between those three.
02:49:04.460 | And that any of those modes on their own
02:49:09.100 | also have failure modes that are not a meaningful life.
02:49:12.420 | The mode of being, the way I would describe it,
02:49:15.180 | if we're talking about the essence of it,
02:49:21.180 | is about taking in and appreciating
02:49:23.900 | the beauty of life that is now.
02:49:25.980 | It's a mode that is in the moment,
02:49:28.340 | and that is largely about being with what is.
02:49:32.940 | It's fundamentally grounded in the nature of experience
02:49:35.260 | and the meaningfulness of experience.
02:49:37.340 | The prima facie meaningfulness
02:49:38.940 | of when I'm having this experience,
02:49:40.860 | I'm not actually asking what the meaning of life is.
02:49:44.060 | I'm actually full of it.
02:49:45.020 | I'm full of experiencing it.
02:49:47.020 | - The momentary experience.
02:49:48.260 | - Yes.
02:49:49.100 | So, taking in the beauty of life.
02:49:52.940 | Doing is adding to the beauty of life.
02:49:56.460 | I'm gonna produce some art.
02:49:57.700 | I'm gonna produce some technology
02:49:58.940 | that will make life easier, more beautiful for somebody else.
02:50:01.260 | I'm going to do some science
02:50:05.340 | that will end up leading to better insights
02:50:07.420 | or other people's ability to appreciate the beauty of life
02:50:09.980 | more because they understand more about it,
02:50:11.420 | or whatever it is, or protect it.
02:50:13.260 | I'm gonna protect it in some way.
02:50:14.940 | But that's adding to, or being in service
02:50:17.580 | of the beauty of life through our doing.
02:50:19.700 | And becoming is getting better at both of those.
02:50:22.940 | Being able to deepen our being,
02:50:24.540 | which is to be able to take in the beauty of life
02:50:26.580 | more profoundly, be more moved by it, touched by it.
02:50:31.060 | And increasing our capacity with doing
02:50:32.820 | to add to the beauty of life more.
02:50:34.540 | And so, I hold that a meaningful life
02:50:40.420 | has to be all three of those.
02:50:42.420 | And where they're not in conflict with each other,
02:50:46.420 | ultimately, it grounds in being.
02:50:48.620 | It grounds in the intrinsic meaningfulness of experience.
02:50:52.020 | And then my doing is ultimately something
02:50:54.340 | that will be able to increase the possibility
02:50:57.540 | of the quality of experience for others.
02:51:00.180 | And my becoming is a deepening on those.
02:51:03.140 | So it grounds in experience,
02:51:04.700 | and also the evolutionary possibility of experience.
02:51:07.780 | - And the point is to oscillate between these,
02:51:13.540 | never getting stuck on any one.
02:51:16.360 | - Yeah.
02:51:18.100 | - Or I suppose in parallel,
02:51:18.920 | well, you can't really, attention is a thing.
02:51:22.020 | You can only allocate attention.
02:51:26.900 | I want moments where I am absorbed in the sunset
02:51:30.820 | and I'm not thinking about what to do next.
02:51:32.940 | And then the fullness of that can make it
02:51:37.260 | to where my doing doesn't come from what's in it for me.
02:51:40.780 | 'Cause I actually feel overwhelmingly full already.
02:51:44.260 | And then it's like, how can I make life better
02:51:49.160 | for other people that don't have
02:51:50.300 | as much opportunities I had?
02:51:51.820 | How can I add something wonderful?
02:51:53.540 | How can I just be in the creative process?
02:51:56.460 | And so I think where the doing comes from matters.
02:52:00.460 | And if the doing comes from a fullness of being,
02:52:02.940 | it's inherently going to be paying attention
02:52:04.860 | to externalities.
02:52:05.940 | Or it's more oriented to do that
02:52:08.900 | than if it comes from some emptiness
02:52:10.440 | that is trying to get full in some way
02:52:12.100 | that is willing to cause sacrifices other places
02:52:14.180 | and where a chunk of its attention is internally focused.
02:52:17.020 | And so when Buddha said,
02:52:19.860 | "Desire is the cause of all suffering,"
02:52:21.420 | then later the vow of the Bodhisattva,
02:52:24.620 | which was to show up for all sentient beings
02:52:26.620 | in universe forever,
02:52:28.120 | is a pretty intense thing like desire.
02:52:30.560 | I would say there is a kind of desire,
02:52:34.540 | if we think of desire as a basis for movement,
02:52:36.240 | like a flow or a gradient,
02:52:37.580 | there's a kind of desire that comes
02:52:39.020 | from something missing inside,
02:52:40.420 | seeking fulfillment of that in the world.
02:52:43.060 | That ends up being the cause of actions
02:52:45.100 | that perpetuate suffering.
02:52:46.820 | But there's also not just non-desire,
02:52:49.080 | there's a kind of desire that comes from feeling full
02:52:52.460 | at the beauty of life and wanting to add to it
02:52:54.940 | that is a flow this direction.
02:52:56.580 | And I don't think that is the cause of suffering.
02:52:59.860 | I think that is,
02:53:01.700 | and the Western traditions, right?
02:53:03.360 | The Eastern traditions focused on that
02:53:05.260 | and kind of unconditional happiness outside of them,
02:53:07.860 | in the moment, outside of time.
02:53:09.220 | Western tradition said,
02:53:10.160 | "No, actually desire is the source of creativity
02:53:12.380 | "and we are here to be made in the image
02:53:14.300 | "and likeness of the creator.
02:53:15.480 | "We're here to be fundamentally creative."
02:53:17.560 | But creating from where and in service of what?
02:53:21.120 | Creating from a sense of connection to everything
02:53:23.080 | and wholeness in service of the wellbeing of all of it
02:53:25.540 | is very different.
02:53:26.440 | Which is back to that compassion, compersion axis.
02:53:31.700 | - Being, doing, becoming.
02:53:33.400 | It's pretty powerful.
02:53:35.620 | Also could potentially be algorithmized
02:53:40.580 | into a robot just saying,
02:53:49.540 | where does death come into that?
02:53:51.800 | Being is forgetting, I mean,
02:53:56.000 | the concept of time completely.
02:53:58.880 | There's a sense to doing and becoming
02:54:01.140 | that has a deadline built in,
02:54:05.020 | the urgency built in.
02:54:07.440 | Do you think death is fundamental to this,
02:54:09.860 | to a meaningful life?
02:54:12.980 | Acknowledging or
02:54:18.340 | feeling the terror of death, like Ernest Becker,
02:54:23.340 | or just acknowledging the uncertainty, the mystery,
02:54:26.860 | the melancholy nature of the fact that the ride ends?
02:54:31.300 | Is that part of this equation or it's not necessary?
02:54:34.700 | - Okay, look at how it could be related.
02:54:37.540 | I've experienced fear of death.
02:54:39.240 | I've also experienced
02:54:41.980 | times where I thought I was gonna die.
02:54:46.540 | It felt extremely peaceful and beautiful.
02:54:48.660 | And it's funny because
02:54:54.540 | if we, we can be afraid of death
02:54:58.620 | because we're afraid of hell or bad reincarnation
02:55:00.580 | or the Bardo or some kind of idea of the afterlife we have,
02:55:02.980 | or we're projecting some kind of sentient suffering.
02:55:05.460 | But if we're afraid of just non-experience,
02:55:07.880 | I noticed that every time I stay up late enough
02:55:12.800 | that I'm really tired,
02:55:14.600 | I'm longing for deep sleep and non-experience, right?
02:55:18.040 | Like I'm actually longing for experience to stop.
02:55:21.060 | And it's not morbid, it's not a bummer.
02:55:24.040 | It's, and I don't mind falling asleep.
02:55:27.780 | And I sometimes when I wake up, wanna go back into it.
02:55:30.880 | And then when it's done, I'm happy to come out of it.
02:55:33.980 | So when we think about death and having finite time here,
02:55:42.500 | and we could talk about if we live for a thousand years
02:55:45.680 | instead of a hundred or something like that,
02:55:47.400 | it would still be finite time.
02:55:48.900 | The one bummer with the age we die is that I generally find
02:55:52.340 | that people mostly start to emotionally mature
02:55:55.240 | just shortly before they die.
02:55:56.700 | But there's,
02:56:02.960 | if I get to live forever,
02:56:07.260 | I can just stay focused on what's in it for me forever.
02:56:12.060 | And if life continues and consciousness and sentience
02:56:17.060 | and people appreciating beauty and adding to it
02:56:21.740 | and becoming continues, my life doesn't,
02:56:23.820 | but my life can have effects that continue well beyond it.
02:56:27.080 | Then life with a capital L starts mattering more to me
02:56:31.240 | than my life.
02:56:32.160 | My life gets to be a part of an in service too.
02:56:34.500 | And the whole thing about when old men plant trees,
02:56:38.320 | the shade of which they'll never get to be in.
02:56:41.300 | I remember the first time I read this poem by Hafez,
02:56:45.680 | the Sufi poet written in like 13th century
02:56:50.000 | or something like that.
02:56:51.160 | And he talked about that if you're lonely to think about him
02:56:56.160 | and he was kind of leaning his spirit
02:56:58.320 | into yours across the distance of a millennium
02:57:01.840 | and would come for you with these poems.
02:57:04.640 | And I was thinking about people a millennium from now
02:57:07.080 | and caring about their experience
02:57:08.640 | and what they'd be suffering if they'd be lonely
02:57:10.440 | and could he offer something that could touch them.
02:57:13.000 | And it's just fucking beautiful.
02:57:14.920 | And so like the most beautiful parts of humans
02:57:18.960 | have to do with something that transcends
02:57:20.540 | what's in it for me.
02:57:21.540 | And death forces you to that.
02:57:25.280 | - So not only does death create the urgency,
02:54:27.600 | the urgency of doing it.
02:57:32.600 | You're very right.
02:57:33.520 | It does have a sense in which it incentivizes
02:57:36.960 | the compersion and the compassion.
02:57:39.820 | - And the widening, you remember Einstein had that quote,
02:57:45.040 | "Something to the effect of it's an optical delusion
02:57:47.360 | "of consciousness to believe there are separate things."
02:57:50.160 | There's this one thing we call universe
02:57:52.500 | and something about us being inside of a prison of perception
02:57:57.500 | that can only see a very narrow little bit of it.
02:58:02.080 | But this might be just some weird disposition of mine,
02:58:07.080 | but when I think about the future after I'm dead
02:58:11.360 | and I think about consciousness,
02:58:14.880 | I think about young people falling in love
02:58:19.020 | for the first time and their experience.
02:58:20.760 | And I think about people being awed by sunsets
02:58:22.800 | and I think about all of it, right?
02:58:26.620 | I can't not feel connected to that.
02:58:30.200 | - Do you feel some sadness to the very high likelihood
02:58:35.200 | that you will be forgotten completely
02:58:37.520 | by all of human history?
02:58:39.080 | You, Daniel, the name, that which cannot be named?
02:58:44.080 | - Systems like to self perpetuate.
02:58:48.120 | Egos do that.
02:58:52.460 | The idea that I might do something meaningful
02:58:54.960 | that future people will appreciate,
02:58:56.480 | of course there's like a certain sweetness to that idea.
02:59:00.120 | But I know how many people did something,
02:59:03.480 | did things that I wouldn't be here without
02:59:05.560 | and that my life would be less without
02:59:06.840 | whose names I will never know.
02:59:08.340 | And I feel a gratitude to them.
02:59:12.000 | I feel a closeness.
02:59:12.900 | I feel touched by that.
02:59:14.000 | And I think to the degree that the future people
02:59:17.760 | are conscious enough, there is a,
02:59:20.800 | you know, a lot of traditions had this kind of,
02:59:23.320 | are we being good ancestors and respect for the ancestors
02:59:25.660 | beyond the names?
02:59:27.120 | I think that's a very healthy idea.
02:59:29.860 | - But let me return to a much less beautiful
02:59:32.820 | and much less pleasant conversation.
02:59:36.460 | You mentioned prison.
02:59:37.500 | - Back to X-Risk, okay.
02:59:38.800 | - And conditioning.
02:59:42.160 | You mentioned something about the state.
02:59:45.560 | So what role, let's talk about companies,
02:59:52.580 | governments, parents, all the mechanisms
02:59:56.400 | that can be a source of conditioning.
02:59:58.720 | Which flavor of ice cream do you like?
03:00:01.820 | Do you think the state is the right thing for the future?
03:00:05.620 | So governments that are elected, democratic systems
03:00:08.340 | that are representing representative democracy.
03:00:11.580 | Is there some kind of political system of governance
03:00:16.140 | that you find appealing?
03:00:17.860 | Is it parents, meaning very close-knit tribes
03:00:22.860 | of conditioning that's the most essential?
03:00:25.860 | And then you and Michael Malice would happily agree
03:00:30.860 | that it's anarchy, where the state should be dissolved
03:00:35.580 | or destroyed or burned to the ground,
03:00:38.360 | if you're Michael Malice, giggling,
03:00:40.160 | holding the torch as the fire burns.
03:00:46.040 | So which is it?
03:00:47.400 | Is the state, can the state be good
03:00:49.680 | or is the state bad for the conditioning
03:00:53.760 | of a beautiful world?
03:00:55.820 | A or B, this is like an SAT test.
03:00:58.860 | - You like to give these simplified good or bad things.
03:01:01.700 | Would I like the state that we live in currently,
03:01:06.300 | the United States federal government,
03:01:07.900 | to stop existing today?
03:01:09.700 | No, I would really not like that.
03:01:11.640 | I think that'd be quite bad
03:01:13.060 | for the world in a lot of ways.
03:01:15.340 | Do I think that it's a optimal social system
03:01:20.980 | and maximally just and humane and all those things?
03:01:23.860 | And I want it to continue as is.
03:01:25.060 | No, also not that.
03:01:26.980 | But I am much more interested in it being able to evolve
03:01:29.900 | to a better thing without going through the catastrophe phase
03:01:34.900 | that I think just its non-existence would give.
03:01:38.300 | - So what size of state is good?
03:01:42.520 | In a sense, like, should we as a human society,
03:01:45.480 | as this world becomes more globalized,
03:01:47.100 | should we be constantly striving to reduce
03:01:50.460 | the, we can put on a map, like right now, literally,
03:01:54.940 | like the centers of power in the world.
03:01:59.660 | Some of them are tech companies,
03:02:01.820 | some of them are governments.
03:02:03.420 | Should we be trying to, as much as possible,
03:02:05.780 | decentralize the power to where it's very difficult
03:02:09.180 | to point on the map the centers of power?
03:02:12.500 | And that means making the state,
03:02:15.780 | however, there's a bunch of different ways
03:02:17.500 | to make the government much smaller.
03:02:19.860 | That could be reducing, in the United States,
03:02:24.700 | reducing the funding for the government,
03:02:28.540 | all those kinds of things,
03:02:29.380 | the set of responsibilities, the set of powers.
03:02:33.420 | It could be, I mean, this is far out,
03:02:36.720 | but making more nations, or maybe nations not in the space
03:02:41.220 | that are defined by geographic location,
03:02:43.740 | but rather in the space of ideas,
03:02:45.700 | which is what anarchy is about.
03:02:47.360 | So anarchy is about forming collectives
03:02:49.220 | based on their set of ideas, and doing so dynamically,
03:02:52.860 | not based on where you were born, and so on.
03:02:55.060 | - I think we can say that the natural state of humans,
03:03:00.560 | if we want to describe such a thing,
03:03:03.020 | was to live in tribes that were below the Dunbar number,
03:03:07.820 | meaning that for a few hundred thousand years
03:03:11.520 | of human history, all of the groups of humans
03:03:15.140 | mostly stayed under that size.
03:03:16.900 | And whenever it would get up to that size,
03:03:18.180 | it would end up cleaving.
03:03:19.620 | And so it seems like there's a pretty strong,
03:03:22.260 | but there weren't individual humans out in the wild
03:03:24.260 | doing really well, right?
03:03:25.180 | So we were a group animal,
03:03:26.940 | but with groups that had a specific size.
03:03:28.700 | So we could say, in a way,
03:03:30.580 | humans were being domesticated by those groups.
03:03:32.600 | They were learning how to have certain rules
03:03:34.340 | to participate with the group
03:03:35.620 | without which you'd get kicked out,
03:03:36.860 | but that's still the wild state of people.
03:03:40.300 | - And maybe it's useful to do as a side statement,
03:03:43.660 | which I've recently looked at a bunch of papers
03:03:45.740 | around Dunbar's number, where the mean is actually 150.
03:03:49.260 | If you actually look at the original papers--
03:03:50.820 | - It's a range.
03:03:51.660 | - It's really a range.
03:03:53.140 | So it's actually somewhere under a thousand.
03:03:56.060 | So it's a range of like two to 500 or whatever it is.
03:03:59.060 | But you could argue that the,
03:04:01.300 | I think it actually is exactly two,
03:04:03.940 | the range is two to 520, something like that.
03:04:08.240 | And this is the mean that's taken crudely.
03:04:11.940 | It's not a very good paper,
03:04:14.380 | in terms of the actual numbers, numerically speaking.
03:04:18.260 | But it'd be interesting if there's a bunch
03:04:21.180 | of Dunbar numbers that could be computed
03:04:24.580 | for particular environments, particular conditions, so on.
03:04:26.980 | It is very true that it's likely to be something small,
03:04:30.420 | you know, under a million.
03:04:32.220 | But it'd be interesting if we could expand that number
03:04:34.700 | in interesting ways that will change the fabric
03:04:37.380 | of this conversation.
03:04:38.200 | I just want to kind of throw that in there.
03:04:39.740 | I don't know if the 150 is baked in somehow
03:04:42.460 | into the hardware.
03:04:43.820 | We can talk about some of the things
03:04:45.020 | that it probably has to do with.
03:04:46.980 | Up to a certain number of people,
03:04:50.080 | and this is gonna be variable based
03:04:51.740 | on the social technologies that mediate it to some degree.
03:04:54.540 | We can talk about that in a minute.
03:04:56.300 | Up to a certain number of people,
03:05:01.100 | everybody can know everybody else pretty intimately.
03:05:04.260 | So let's go ahead and just take 150 as an average number.
03:05:12.020 | Everybody can know everyone intimately enough
03:05:14.420 | that if your actions made anyone else do poorly,
03:05:18.740 | it's your extended family,
03:05:20.300 | and you're stuck living with them,
03:05:21.780 | and you know who they are,
03:05:22.700 | and there's no anonymous people.
03:05:24.260 | There's no just them and over there.
03:05:26.640 | And that's one part of what leads
03:05:29.900 | to a kind of tribal process where what's good
03:05:32.260 | for the individual and good for the whole has a coupling.
03:05:34.980 | Also below that scale, everyone is somewhat aware
03:05:39.460 | of what everybody else is doing.
03:05:40.840 | There's not groups that are very siloed.
03:05:44.220 | And as a result, it's actually very hard
03:05:46.140 | to get away with bad behavior.
03:05:47.860 | There's a forced kind of transparency.
03:05:50.540 | And so you don't need kind of like the state in that way,
03:05:55.460 | but lying to people doesn't actually get you ahead.
03:05:57.940 | Sociopathic behavior doesn't get you ahead
03:05:59.740 | because it gets seen.
03:06:01.020 | And so there's a conditioning environment
03:06:04.920 | where the individual's behaving in a way
03:06:06.860 | that is aligned with the interests of the tribe
03:06:09.260 | is what gets conditioned.
03:06:11.320 | When it gets to be a much larger system,
03:06:13.600 | it becomes easier to hide certain things
03:06:16.800 | from the group as a whole,
03:06:18.000 | as well as to be less emotionally bound
03:06:20.040 | to a bunch of anonymous people.
03:06:21.640 | I would say there's also a communication protocol
03:06:26.020 | where up to about that number of people,
03:06:29.300 | we could all sit around a tribal council
03:06:31.400 | and be part of a conversation around a really big decision.
03:06:34.120 | Do we migrate?
03:06:34.960 | Do we not migrate?
03:06:35.780 | Do we, you know, something like that.
03:06:37.200 | Do we get rid of this person?
03:06:38.960 | And why would I want to agree to be a part of a larger group
03:06:43.960 | where everyone can't be part of that council?
03:06:48.980 | And so I am going to now be subject to law
03:06:52.600 | that I have no say in.
03:06:53.820 | If I could be part of a smaller group
03:06:55.140 | that could still survive and I get a say in the law
03:06:57.100 | that I'm subject to.
03:06:58.060 | So I think the cleaving,
03:07:02.740 | and a way we can look at it beyond the Dunbar number, too,
03:07:02.740 | is we can look at that a civilization has binding energy
03:07:06.560 | that is holding them together and has cleaving energy.
03:07:08.660 | And if the binding energy exceeds the cleaving energy,
03:07:10.860 | that civilization will last.
03:07:12.840 | And so there are things that we can do
03:07:14.500 | to decrease the cleaving energy within the society,
03:07:16.480 | things we can do to increase the binding energy.
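[Illustrative aside, not part of the conversation: the binding-versus-cleaving framing reduces to a simple inequality, that a society persists while total binding energy exceeds total cleaving energy. The sketch below is a minimal toy in Python; the factor names and weights are hypothetical, chosen only to show the condition.]

```python
# Toy sketch of the binding vs. cleaving framing; the factor names and weights
# below are hypothetical, used only to illustrate the inequality.

def holds_together(binding: dict, cleaving: dict) -> bool:
    """A society persists while total binding energy exceeds total cleaving energy."""
    return sum(binding.values()) > sum(cleaving.values())

print(holds_together(
    binding={"shared_sensemaking": 0.4, "economic_interdependence": 0.3},
    cleaving={"polarization": 0.5, "resource_conflict": 0.1},
))  # True in this made-up example: 0.7 > 0.6
```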
03:07:18.260 | I think naturally we saw that had certain characteristics
03:07:20.940 | up to a certain size kind of tribalism.
03:07:22.940 | That ended with a few things.
03:07:25.780 | It ended with people having migrated enough
03:07:27.660 | that when you started to get resource wars,
03:07:31.420 | you couldn't just migrate away easily.
03:07:33.260 | And so tribal warfare became more obligated.
03:07:35.380 | It involved the plow
03:07:37.340 | in the beginning of real economic surplus.
03:07:39.540 | So there were a few different kind of forcing functions.
03:07:43.200 | But we're talking about what size should it be, right?
03:07:48.380 | What size should a society be?
03:07:49.900 | And I think the idea,
03:07:51.940 | like if we think about your body for a moment
03:07:54.780 | as a self-organizing complex system that is multi-scaled,
03:07:58.260 | we think about-- - Our body is a wonderland.
03:08:00.380 | - Our body is a wonderland, yeah.
03:08:02.580 | You have--
03:08:05.460 | - That's a John Mayer song, I apologize.
03:08:07.580 | But yes, so if we think about our body
03:08:10.360 | and the trillions of cells that are in it.
03:08:12.260 | - Well, you don't have,
03:08:13.660 | like think about how ridiculous it would be
03:08:15.300 | to try to have all the tens of trillions of cells in it
03:08:18.800 | with no internal organization structure, right?
03:08:21.720 | Just like a sea of protoplasm, it wouldn't work.
03:08:24.540 | - Pure democracy.
03:08:25.660 | - And so you have cells and tissues,
03:08:29.140 | and then you have tissues and organs
03:08:31.260 | and organs and organ systems.
03:08:32.700 | And so you have these layers of organization.
03:08:34.500 | And then obviously the individual and a tribe
03:08:37.180 | and an ecosystem.
03:08:39.260 | And each of the higher layers
03:08:42.000 | are both based on the lower layers,
03:08:43.460 | but also influencing them.
03:08:45.260 | I think the future of civilization will be similar,
03:08:48.300 | which is there's a level of governance
03:08:49.900 | that happens at the level of the individual,
03:08:51.400 | my own governance of my own choice.
03:08:54.620 | I think there's a level that happens
03:08:56.140 | at the level of a family.
03:08:57.620 | We're making decisions together,
03:08:59.140 | we're inter-influencing each other
03:09:00.540 | and affecting each other, taking responsibility for.
03:09:03.780 | The idea of an extended family.
03:09:05.100 | And you can see that for a lot of human history,
03:09:06.940 | we had an extended family.
03:09:08.060 | We had a local community, a local church,
03:09:10.000 | or whatever it was.
03:09:10.980 | We had these intermediate structures.
03:09:13.440 | Whereas right now there's kind of like the individual
03:09:16.100 | producer, consumer, taxpayer, voter,
03:09:19.860 | and the massive nation state global complex.
03:09:22.520 | And not that much in the way of intermediate structures
03:09:25.020 | that we relate with,
03:09:25.860 | and not that much in the way of real personal dynamics,
03:09:27.960 | all impersonalized, made fungible.
03:09:30.900 | And so I think that we have to have global governance.
03:09:35.900 | Meaning, I think we have to have governance
03:09:39.580 | at the scale we affect stuff.
03:09:40.820 | And if anybody is messing up the oceans,
03:09:43.140 | that matters for everybody.
03:09:44.020 | So that can't only be national or only local.
03:09:48.020 | Everyone is scared of the idea of global governance
03:09:49.900 | 'cause we think about some top-down system of imposition
03:09:52.620 | that now has no checks and balances on power.
03:09:54.980 | I'm scared of that same version.
03:09:56.340 | So I'm not talking about that kind of global governance.
03:10:00.020 | It's why I'm even using the word governance as a process
03:10:02.540 | rather than government as an imposed phenomena.
03:10:06.300 | So I think we have to have global governance,
03:10:09.340 | but I think we also have to have local governance.
03:10:11.540 | And there has to be relationships between them
03:10:13.260 | that each, where there are both checks and balances
03:10:16.980 | and power flows of information.
03:10:18.920 | So I think governance at the level of cities
03:10:21.620 | will be a bigger deal in the future
03:10:24.260 | than governance at the level of nation states.
03:10:26.300 | 'Cause I think nation states are largely fictitious things
03:10:30.380 | that are defined by wars and agreements to stop wars
03:10:33.740 | and like that.
03:10:34.580 | I think cities are based on real things
03:10:36.700 | that will keep being real
03:10:37.540 | where the proximity of certain things together,
03:10:40.500 | the physical proximity of things together
03:10:42.140 | gives increased value of those things.
03:10:44.560 | So you look at, like, Geoffrey West's work on scale
03:10:47.500 | and finding that companies and nation states
03:10:50.520 | and things that have a kind of complicated
03:10:52.540 | agreement structure get diminishing returns
03:10:54.940 | of production per capita
03:10:56.860 | as the total number of people increases
03:10:58.380 | beyond about the tribal scale,
03:10:59.600 | but the city actually gets increasing productivity
03:11:01.980 | per capita, but it's not designed.
03:11:04.000 | It's kind of this organic thing, right?
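[Illustrative aside, not part of the conversation: the Geoffrey West result being referenced is a power-law scaling, output Y = c * N^beta, where many socioeconomic city metrics scale superlinearly (beta around 1.15) while companies and infrastructure tend to scale sublinearly. The sketch below just plugs in assumed exponents and made-up sizes to show why per-capita output rises with city size but falls for firm-like structures.]

```python
# Illustrative sketch (assumed exponents, made-up sizes): Geoffrey West-style
# power-law scaling, Y = c * N**beta. beta > 1 means per-capita output grows
# with size; beta < 1 means it shrinks.

def per_capita_output(n_people: int, beta: float, c: float = 1.0) -> float:
    """Per-capita output under Y = c * N**beta, i.e. c * N**(beta - 1)."""
    return c * n_people ** (beta - 1)

for n in (1_000, 100_000, 10_000_000):
    city_like = per_capita_output(n, beta=1.15)  # superlinear: rises with N
    firm_like = per_capita_output(n, beta=0.90)  # sublinear: falls with N
    print(f"N={n:>10,}  city-like: {city_like:.2f}  firm-like: {firm_like:.2f}")
```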
03:11:06.080 | So there should be governance at the level of cities
03:11:08.500 | because people can sense
03:11:09.820 | and actually have some agency there.
03:11:11.940 | Probably neighborhoods and smaller scales within it
03:11:14.140 | and also verticals.
03:11:15.140 | And some of it won't be geographic,
03:11:16.500 | it'll be network based, right?
03:11:17.580 | Networks of affinities.
03:11:18.920 | So I don't think the future is one type of governance.
03:11:22.200 | Now, what we can say more broadly is say,
03:11:25.200 | when we're talking about groups of people
03:11:26.600 | that inter-affect each other,
03:11:27.840 | the idea of a civilization is that we can figure out
03:11:30.360 | how to coordinate our choice-making
03:11:32.800 | to not be at war with each other
03:11:34.480 | and hopefully increase total productive capacity
03:11:38.120 | in a way that's good for everybody.
03:11:39.360 | Division of labor and specialty,
03:11:41.140 | so we all get more, better stuff and whatever.
03:11:44.820 | But it's a coordination of our choice-making.
03:11:49.840 | I think we can look at civilizations failing
03:11:52.520 | on the side of not having enough coordination
03:11:55.320 | of choice-making, so they fail on the side of chaos
03:11:57.440 | and then they cleave
03:11:58.260 | and an internal war comes about or whatever.
03:12:00.840 | Or they can't make smart decisions
03:12:04.100 | and they overuse their resources or whatever.
03:12:06.800 | Or it can fail on the side of trying to get order
03:12:10.720 | via imposition, via force.
03:12:14.000 | And so it fails on the side of oppression,
03:12:15.560 | which ends up being for a while,
03:12:18.240 | functional-ish for the thing as a whole,
03:12:21.360 | but miserable for most people in it
03:12:22.880 | until it fails either because of revolt
03:12:25.260 | or because it can't innovate enough or something like that.
03:12:28.160 | And so there's this toggling
03:12:29.560 | between order via oppression and chaos.
03:12:34.240 | And I think the idea of democracy,
03:12:37.120 | not the way we've implemented it,
03:12:38.580 | but the idea of it,
03:12:39.680 | whether we're talking about a representative democracy
03:12:42.140 | or a direct digital democracy,
03:12:43.640 | liquid democracy, a republic, or whatever,
03:12:46.600 | the idea of an open society, participatory governance,
03:12:50.320 | is can we have order that is emergent rather than imposed
03:12:55.000 | so that we aren't stuck with chaos
03:12:57.800 | and infighting and inability to coordinate,
03:13:00.680 | and we're also not stuck with oppression?
03:13:03.400 | And what would it take to have emergent order?
03:13:07.600 | This is the most kind of central question
03:13:10.520 | for me these days,
03:13:12.020 | because if we look at what different nation states
03:13:17.020 | are doing around the world,
03:13:18.380 | and we see nation states that are more authoritarian,
03:13:21.820 | that in some ways are actually coordinating
03:13:24.740 | much more effectively.
03:13:26.340 | So for instance, we can see that China
03:13:30.960 | has built high-speed rail, not just through its country,
03:13:33.220 | but around the world,
03:13:34.220 | and the US hasn't built any high-speed rail yet.
03:13:36.780 | You can see that it brought 300 million people
03:13:38.820 | out of poverty in a time
03:13:39.960 | where we've had increasing economic inequality happening.
03:13:43.840 | You can see that if there was a single country
03:13:48.100 | that could make all of its own stuff,
03:13:49.380 | if the global supply chains failed,
03:13:51.180 | China would be the closest one to being able
03:13:53.180 | to start to go closed loop on fundamental things.
03:13:56.480 | Belt and Road Initiative,
03:13:59.540 | supply chain on rare earth metals,
03:14:02.700 | transistor manufacturing,
03:14:04.020 | that is like, oh, they're actually coordinating
03:14:06.220 | more effectively in some important ways.
03:14:08.800 | In the last, call it 30 years.
03:14:10.400 | And that's imposed order.
03:14:14.280 | - Imposed order.
03:14:15.620 | - And we can see that if in the US,
03:14:20.620 | let's look at why real quick.
03:14:22.780 | We know why we created term limits
03:14:26.340 | so that we wouldn't have forever monarchs.
03:14:28.940 | That's the thing we were trying to get away from,
03:14:30.860 | and that there would be checks and balances on power
03:14:32.980 | and that kind of thing.
03:14:34.420 | But that also has created a negative second order effect,
03:14:37.580 | which is nobody does long-term planning.
03:14:39.840 | Because somebody comes in who's got four years,
03:14:41.820 | they want to get reelected.
03:14:43.140 | They don't do anything that doesn't create a return
03:14:45.000 | within four years that will end up getting them
03:14:48.240 | reelected.
03:14:49.080 | And so the 30 year industrial development
03:14:51.240 | to build high speed trains or the new kind of fusion energy
03:14:54.720 | or whatever it is, just doesn't get invested in.
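[Illustrative aside, not part of the conversation: the point about short electoral horizons can be made concrete. If benefits are only counted when they land inside the evaluator's horizon, a project whose payoff starts in year ten looks like pure cost to a four-year office holder. All numbers below are hypothetical.]

```python
# Illustrative sketch (hypothetical numbers): benefits only count if they land
# within the decision-maker's horizon, so long-payback projects never clear.

def net_value(cost: float, annual_benefit: float, start_year: int,
              lifetime: int, horizon: int) -> float:
    """Benefits that arrive within the horizon, minus the upfront cost."""
    benefits = sum(annual_benefit
                   for year in range(start_year, start_year + lifetime)
                   if year <= horizon)
    return benefits - cost

# 30-year project: costs 100 now, pays 10/year starting in year 10.
print(net_value(100, 10, start_year=10, lifetime=30, horizon=4))   # -100.0: never approved
print(net_value(100, 10, start_year=10, lifetime=30, horizon=40))  # 200.0: clearly worth it
```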
03:14:57.100 | And then if you have left versus right,
03:15:00.420 | where whatever someone does for four years,
03:15:02.720 | then the other guy gets in and undoes it for four years.
03:15:05.680 | And most of the energy goes into campaigning
03:15:07.740 | against each other.
03:15:08.580 | This system is just dissipating as heat.
03:15:11.300 | It's just burning up as heat.
03:15:12.580 | And the system that has no term limits
03:15:14.300 | and no internal friction or infighting
03:15:15.920 | because they got rid of those people,
03:15:18.100 | can actually coordinate better.
03:15:19.800 | But I would argue it has its own fail states eventually
03:15:24.800 | and dystopic properties that are not the thing we want.
03:15:28.740 | - So the goal is to accomplish,
03:15:30.740 | to create a system that does long-term planning
03:15:33.600 | without the negative effects of a monarch or dictator
03:15:37.140 | that stays there for the long-term
03:15:39.660 | and accomplish that through,
03:15:46.720 | not through the imposition of a single leader,
03:15:52.180 | but through emergence.
03:15:54.100 | So that doesn't, that perhaps, first of all,
03:15:58.580 | the technology in itself seems to maybe disagree,
03:16:02.540 | allow for different possibilities here,
03:16:04.900 | which is make primary the system, not the humans.
03:16:08.280 | So the basic, the medium on which the democracy happens,
03:16:13.280 | like a platform where people can make decisions,
03:16:21.220 | do the choice-making, the coordination of the choice-making,
03:16:26.980 | where emerges some kind of order
03:16:30.980 | to where like something that applies
03:16:32.540 | at the scale of the family, the extended family,
03:16:35.460 | the city, the country, the continent, the whole world.
03:16:40.460 | And then does that so dynamically,
03:16:43.180 | constantly changing based on the needs of the people,
03:16:45.940 | sort of always evolving.
03:16:47.240 | And it would all be owned by Google.
03:16:50.600 | Like doesn't this, is there a way to,
03:16:56.080 | so first of all, you're optimistic
03:16:58.780 | that you could basically create,
03:17:00.780 | that technology can save us.
03:17:02.420 | Technology at creating platforms,
03:17:05.020 | by technology I mean like software network platforms
03:17:08.020 | that allows humans to deliberate,
03:17:11.280 | like make government together dynamically
03:17:14.700 | without the need for a leader
03:17:16.500 | that's on a podium screaming stuff.
03:17:19.220 | That's one.
03:17:20.260 | And two, if you're optimistic about that,
03:17:23.340 | are you also optimistic about the CEOs of such platforms?
03:17:27.600 | The idea that technology is values neutral,
03:17:32.100 | values agnostic, and people can use it
03:17:35.340 | for constructive or destructive purposes,
03:17:37.940 | but it doesn't predispose anything,
03:17:40.100 | is just silly and naive.
03:17:43.460 | Technology elicits patterns of human behavior
03:17:46.700 | because those who utilize it and get ahead
03:17:49.300 | end up behaving differently
03:17:50.380 | because of their utilization of it.
03:17:51.780 | And then other people,
03:17:53.580 | then they end up shaping the world
03:17:54.900 | or other people race to also get the power
03:17:56.740 | of the technology.
03:17:57.740 | And so there's whole schools of anthropology
03:18:00.120 | that look at the effect on social systems
03:18:02.800 | and the minds of people of the change in our tooling.
03:18:05.920 | Marvin Harris's work called "Cultural Materialism"
03:18:08.200 | looked at this deeply.
03:18:09.080 | Obviously, Marshall McLuhan looked specifically
03:18:11.220 | at the way that information technologies
03:18:12.760 | change the nature of our beliefs,
03:18:14.480 | minds, values, social systems.
03:18:17.040 | I will not try to do this rigorously
03:18:21.040 | because there are academics
03:18:22.840 | who will disagree on the subtle details,
03:18:24.260 | but I'll do it kind of like illustratively.
03:18:27.380 | You think about the emergence of the plow,
03:18:29.580 | the ox drawn plow in the beginning of agriculture
03:18:31.660 | that came with it,
03:18:32.500 | where before that you had hunter-gatherer
03:18:34.600 | and then you had horticulture,
03:18:36.260 | kind of a digging stick, but not the plow.
03:18:40.220 | Well, the world changed a lot with that, right?
03:18:43.180 | And a few of the changes
03:18:48.180 | that at least some theorists believe in
03:18:51.580 | is when the ox drawn plow started to proliferate,
03:18:55.580 | any culture that utilized it
03:18:56.800 | was able to start to actually cultivate grain
03:18:58.720 | 'cause just with a digging stick,
03:19:00.040 | you couldn't get enough grain for it to matter.
03:19:01.580 | Grain was a storable caloric surplus.
03:19:03.360 | They could make it through the famines.
03:19:04.480 | They could grow their population.
03:19:05.600 | So the ones that used it got so much ahead
03:19:07.560 | that it became obligate and everybody used it.
03:19:10.400 | Corresponding with the use of the plow,
03:19:13.640 | animism went away everywhere that it existed,
03:19:16.120 | because you can't talk about the spirit of the buffalo
03:19:18.640 | while beating the cow all day long to pull a plow.
03:19:22.000 | So the moment that we do animal husbandry of that kind,
03:19:24.560 | where you have to beat the cow all day,
03:19:25.680 | you have to say, it's just a dumb animal.
03:19:27.560 | Man has dominion over earth.
03:19:28.820 | And the nature of even our religious
03:19:30.200 | and spiritual ideas change.
03:19:31.820 | You went from women primarily using the digging stick
03:19:34.960 | to do the horticulture or gathering before that,
03:19:37.760 | men doing the hunting stuff to now men had to use the plow
03:19:40.240 | because the upper body strength actually really mattered.
03:19:42.440 | Women would have miscarriages when they would do it
03:19:44.160 | when they were pregnant.
03:19:45.160 | So all the caloric supply started to come from men
03:19:47.940 | where it had been from both before
03:19:49.320 | and the ratio of male, female gods changed
03:19:51.820 | to being mostly male gods following that.
03:19:54.540 | Obviously we went from very,
03:19:57.360 | that particular line of thought then also says
03:20:01.080 | that feminism followed the tractor
03:20:03.240 | and that the rise of feminism in the West
03:20:08.760 | started to follow women being able to say,
03:20:10.520 | we can do what men can
03:20:12.200 | because the male upper body strength wasn't differential
03:20:15.560 | once the internal combustion engine was much stronger
03:20:18.400 | and we can drive a tractor.
03:20:20.620 | So I don't think to try to trace complex things
03:20:24.280 | to one cause is a good idea.
03:20:25.700 | So I think this is a reductionist view,
03:20:27.220 | but it has truth in it.
03:20:29.240 | And so the idea that technology is values agnostic is silly.
03:20:34.240 | Technology codes patterns of behavior
03:20:36.640 | that code rationalizing those patterns of behavior
03:20:39.280 | and believing in them.
03:20:40.320 | The plow also is the beginning of the Anthropocene, right?
03:20:43.320 | It was the beginning of us changing the environment
03:20:45.400 | radically to clear cut areas
03:20:47.720 | to just make them useful for people,
03:20:49.280 | which also meant the change of the view
03:20:51.040 | of where the web of life, we're just a part of it, et cetera.
03:20:54.160 | So all those types of things.
03:20:55.720 | - So that's brilliantly put,
03:20:58.720 | but by the way, that was just brilliant.
03:21:01.300 | But the question is, so it's not agnostic, but.
03:21:05.900 | - So we have to look at what the psychological effects
03:21:08.720 | of specific tech applied certain ways are
03:21:11.600 | and be able to say,
03:21:13.640 | it's not just doing the first order thing you intended.
03:21:16.460 | It's doing like the effect on patriarchy and animism
03:21:20.480 | and the end of tribal culture
03:21:22.840 | and the beginning of empire and the class systems
03:21:24.800 | that came with that.
03:21:25.880 | We can go on and on about what the plow did.
03:21:28.640 | The beginning of surplus was inheritance,
03:21:30.640 | which then became the capital model and like lots of things.
03:21:34.440 | So we have to say, when we're looking at the tech,
03:21:36.880 | how is, what are the values built
03:21:39.320 | into the way the tech is being built that are not obvious?
03:21:42.720 | - Right, so you always have to consider externalities.
03:21:44.840 | - Yes.
03:21:45.680 | - And this is no matter what.
03:21:46.500 | - And the externalities are not just physical
03:21:47.340 | to the environment,
03:21:48.180 | they're also to how the people are being conditioned
03:21:49.760 | and how the relationality between them is being conditioned.
03:21:51.920 | - The question I'm asking you,
03:21:53.120 | so I personally would rather be led by a plow
03:21:56.240 | and a tractor than Stalin, okay?
03:21:58.760 | That's the question I'm asking you.
03:22:00.600 | In creating an emergent government where people,
03:22:06.740 | where there's a democracy that's dynamic,
03:22:09.320 | that makes choices, that does governance
03:22:11.600 | at like a very kind of liquid,
03:22:17.520 | like there's a bunch of fine resolution layers
03:22:21.640 | of abstraction of governance happening at all scales,
03:22:26.160 | right, and doing so dynamically
03:22:28.080 | where no one person has power at any one time
03:22:30.720 | that can dominate and impose rule, okay?
03:22:34.200 | That's the Stalin version.
03:22:35.840 | I'm saying isn't the alternative that's emergent,
03:22:40.840 | empowered or made possible by the plow and the tractor,
03:22:48.120 | which is the modern version of that,
03:22:50.920 | is like the internet, the digital space
03:22:54.240 | where we can, the monetary system,
03:22:56.840 | where you have the cryptocurrency and so on,
03:22:58.720 | but you have much more importantly,
03:23:01.280 | to me at least, is just basic social interaction,
03:23:03.360 | the mechanisms of human transacting with each other
03:23:05.960 | in the space of ideas.
03:23:07.200 | So yes, it's not agnostic, definitely not agnostic.
03:23:12.160 | You've had a brilliant rant there.
03:23:14.280 | The tractor has effects,
03:23:16.200 | but isn't that the way we achieve an emergent system
03:23:19.440 | of governance?
03:23:21.200 | - Yes, but I wouldn't say we're on track.
03:23:23.240 | - You haven't seen anything promising.
03:23:29.080 | - It's not that I haven't seen anything promising.
03:23:30.520 | It's that to be on track requires understanding
03:23:32.840 | and guiding some of the things differently
03:23:34.240 | than is currently happening, and it's possible.
03:23:36.480 | That's actually what I really care about.
03:23:38.720 | So you couldn't have had a Stalin
03:23:43.720 | without having certain technologies emerge.
03:23:46.480 | He couldn't have ruled such a big area
03:23:47.920 | without transportation technologies, without the train,
03:23:51.000 | without the communication tech that made it possible.
03:23:55.120 | So when you say you'd rather have a tractor
03:23:58.280 | or a plow than a Stalin,
03:23:59.480 | there's a relationship between them that is more recursive,
03:24:02.720 | which is new physical technologies allow rulers
03:24:07.720 | to rule with more power over larger distances, historically.
03:24:16.680 | Some things are more responsible for that than others.
03:24:19.440 | Like Stalin also ate stuff for breakfast,
03:24:22.160 | but the thing he ate for breakfast is less responsible
03:24:24.680 | for the starvation of millions than the train.
03:24:28.180 | The train is more responsible for that.
03:24:30.160 | And then the weapons of war are more responsible.
03:24:32.400 | So some technology, let's not throw it all in the,
03:24:36.160 | you're saying technology has a responsibility here,
03:24:39.560 | but some is better than others.
03:24:42.080 | - I'm saying that people's use of technology
03:24:45.000 | will change their behavior.
03:24:46.160 | So it has behavioral dispositions built in.
03:24:48.880 | The change of the behavior will also change the values
03:24:51.560 | in the society.
03:24:52.520 | - It's very complicated, right?
03:24:53.520 | - It will also, as a result,
03:24:55.520 | both make people who have different kinds of predispositions
03:24:58.680 | with regard to rulership
03:25:00.320 | and different kinds of new capacities.
03:25:03.260 | And so we have to think about these things.
03:25:06.300 | It's kind of well understood that the printing press
03:25:09.360 | and then in early industrialism ended feudalism
03:25:13.120 | and created kind of nation states.
03:25:15.720 | So one thing I would say as a long trend
03:25:19.320 | that we can look at is that whenever there is a step
03:25:22.520 | function, a major leap in technology, physical technology,
03:25:26.440 | the underlying techno-industrial base
03:25:28.320 | with which we do stuff,
03:25:30.240 | it ends up coding for,
03:25:32.080 | it ends up predisposing a whole bunch
03:25:33.760 | of human behavioral patterns
03:25:36.000 | that the previous social system
03:25:37.800 | had not emerged to try to solve.
03:25:40.320 | And so it usually ends up breaking
03:25:42.100 | the previous social systems,
03:25:43.520 | the way the plow broke the tribal system,
03:25:45.300 | the way that the industrial revolution
03:25:46.640 | broke the feudal system.
03:25:48.080 | And then new social systems have to emerge
03:25:50.320 | that can deal with that,
03:25:51.400 | the new powers, the new dispositions,
03:25:54.040 | whatever with that tech.
03:25:55.360 | Obviously the nuke broke nation state governance
03:25:58.120 | being adequate and said,
03:25:59.920 | we can't ever have that again.
03:26:00.880 | So then it created this international
03:26:03.360 | governance apparatus world.
03:26:05.680 | So I guess what I'm saying is that
03:26:12.680 | the solution is not exponential tech
03:26:17.680 | following the current path
03:26:22.140 | of what the market incentivizes exponential tech to do,
03:26:24.860 | market being a previous social tech.
03:26:27.040 | I would say that exponential tech,
03:26:33.180 | if we look at different types of social tech,
03:26:38.420 | so let's just briefly look at
03:26:41.360 | that democracy tried to do the emergent order thing.
03:26:44.560 | At least that's the story.
03:26:53.060 | And this is why, if you look,
03:26:57.300 | this is an important part to build first.
03:26:57.300 | - It's kind of doing it,
03:26:58.300 | it's just doing it poorly, you're saying.
03:27:00.860 | I mean, it is emergent order in some sense.
03:27:03.320 | I mean, that's the hope of democracy
03:27:04.500 | versus other forms of government.
03:27:06.220 | - Correct.
03:27:07.060 | I mean, I said at least the story
03:27:08.820 | because obviously it didn't do it
03:27:10.420 | for women and slaves early on,
03:27:11.880 | it doesn't do it for all classes equally, et cetera.
03:27:14.420 | But the idea of democracy is participatory governance.
03:27:19.420 | And so you notice that the modern democracies
03:27:22.720 | emerged out of the European enlightenment.
03:27:26.220 | And specifically, because the idea that a lot of people,
03:27:29.900 | some huge number, not a tribal number,
03:27:31.380 | huge number of anonymous people who don't know each other
03:27:33.540 | are not bonded to each other,
03:27:35.660 | who believe different things,
03:27:36.860 | who grew up in different ways
03:27:38.000 | can all work together to make collective decisions
03:27:40.200 | that affect everybody,
03:27:41.940 | and where some of them will make compromises
03:27:43.620 | on the thing that matters to them
03:27:44.620 | for what matters to other strangers.
03:27:46.440 | That's actually wild.
03:27:47.820 | Like it's a wild idea that that would even be possible.
03:27:50.660 | And it was kind of the result
03:27:52.220 | of this high enlightenment idea
03:27:54.460 | that we could all do the philosophy of science
03:27:58.860 | and we could all do the Hegelian dialectic.
03:28:02.860 | Those ideas had emerged, right?
03:28:04.300 | And it was that we could all,
03:28:08.940 | so our choice-making,
03:28:10.160 | 'cause we've said a society
03:28:11.200 | is trying to coordinate choice-making.
03:28:12.520 | The emergent order is the order of the choices
03:28:14.960 | that we're making,
03:28:15.840 | not just at the level of the individuals,
03:28:17.040 | but what groups of individuals,
03:28:18.280 | corporations, nations, states, whatever do.
03:28:20.440 | Our choices are based on,
03:28:23.240 | our choice-making is based on our sense-making
03:28:25.080 | and our meaning-making.
03:28:26.400 | Our sense-making is what do we believe
03:28:27.720 | is happening in the world?
03:28:29.460 | And what do we believe the effects
03:28:30.600 | of a particular thing would be?
03:28:31.600 | Our meaning-making is what do we care about, right?
03:28:33.480 | Our values generation, what do we care about
03:28:34.900 | that we're trying to move the world in the direction of?
03:28:37.200 | If you ultimately are trying to move the world
03:28:39.260 | in a direction that is really, really different
03:28:41.620 | than the direction I'm trying to,
03:28:42.760 | we have very different values,
03:28:44.620 | we're gonna have a hard time.
03:28:46.060 | And if you think the world is a very different world,
03:28:48.180 | right, if you think that systemic racism
03:28:51.220 | is rampant everywhere and one of the worst problems,
03:28:54.220 | and I think it's not even a thing,
03:28:55.860 | if you think climate change is almost existential
03:28:58.740 | and I think it's not even a thing,
03:29:00.220 | we're gonna have a really hard time coordinating.
03:29:02.580 | And so we have to be able to have shared sense-making
03:29:05.700 | of can we come to understand
03:29:07.280 | just what is happening together?
03:29:10.640 | And then can we do shared values generation?
03:29:12.600 | Okay, maybe I'm emphasizing a particular value
03:29:14.600 | more than you, but I can see how,
03:29:16.300 | I can take your perspective
03:29:17.440 | and I can see how the thing that you value
03:29:18.900 | is worth valuing,
03:29:20.000 | and I can see how it's affected by this thing.
03:29:22.240 | So can we take all the values
03:29:23.900 | and try to come up with a proposition
03:29:25.360 | that benefits all of them better
03:29:27.280 | than the proposition I created
03:29:28.480 | just to benefit these ones that harms the ones
03:29:30.480 | that you care about,
03:29:31.860 | which is why you're opposing my proposition.
03:29:34.320 | We don't even try in the process
03:29:35.920 | of crafting a proposition currently to see,
03:29:39.040 | and this is the reason that the proposition
03:29:41.060 | when we vote on it gets half the votes almost all the time.
03:29:43.440 | It almost never gets 90% of the votes
03:29:45.920 | is because it benefits some things and harms other things.
03:29:48.180 | We can say all theory of trade-offs,
03:29:49.760 | but we didn't even try to say,
03:29:51.140 | could we see what everybody cares about
03:29:53.820 | and see if there was a better solution?
03:29:56.620 | - How do we fix that try?
03:29:57.820 | I wonder, is it as simple
03:29:59.880 | as the social technology education?
03:30:02.120 | - Well, no.
03:30:04.280 | The proposition crafting and refinement process
03:30:07.080 | has to be key to a democracy
03:30:09.440 | or to a government, and it's not currently.
03:30:11.840 | - But isn't that the humans creating that situation?
03:30:15.780 | So one way, there's two ways to fix that.
03:30:19.880 | One is to fix the individual humans,
03:30:21.740 | which is the education early in life.
03:30:23.760 | And the second is to create somehow systems that-
03:30:26.480 | - Yeah, it's both.
03:30:28.040 | - So I understand the education part,
03:30:30.440 | but creating systems,
03:30:31.400 | that's why I mentioned the technologies,
03:30:34.040 | is creating social networks, essentially.
03:30:36.240 | - Yes, that's actually necessary.
03:30:37.880 | Okay, so let's go to the first part
03:30:39.160 | and then we'll come to the second part.
03:30:40.760 | So democracy emerged as an enlightenment era idea
03:30:45.440 | that we could all do a dialectic
03:30:49.000 | and come to understand what other people valued.
03:30:51.440 | And so that we could actually come up
03:30:54.160 | with a cooperative solution rather than just,
03:30:57.640 | fuck you, we're gonna get our thing in war, right?
03:31:00.360 | And that we could sense make together.
03:31:01.640 | We could all apply the philosophy of science
03:31:03.280 | and you weren't gonna stick to your guns
03:31:05.160 | on what the speed of sound is if we measured it
03:31:06.840 | and we found out what it was.
03:31:07.800 | And there's a unifying element
03:31:09.680 | to the objectivity in that way.
03:31:11.880 | And so this is why I believe Jefferson said,
03:31:15.600 | if you could give me a perfect newspaper
03:31:17.200 | and a broken government,
03:31:18.080 | or I'm paraphrasing,
03:31:19.280 | or a broken government and perfect newspaper,
03:31:20.900 | I wouldn't hesitate to take the perfect newspaper.
03:31:22.480 | Because if the people understand what's going on,
03:31:24.380 | they can build a new government.
03:31:25.960 | If they don't understand what's going on,
03:31:27.360 | they can't possibly make good choices.
03:31:29.600 | And Washington, I'm paraphrasing again,
03:31:33.680 | first president said,
03:31:34.960 | the number one aim of the federal government
03:31:36.720 | should be the comprehensive education of every citizen
03:31:39.640 | in the science of government.
03:31:41.080 | Science of government was the term of art.
03:31:42.720 | Think about what that means, right?
03:31:43.800 | Science of government would be game theory,
03:31:47.760 | coordination theory, history,
03:31:49.240 | it wouldn't call it game theory yet.
03:31:51.240 | History, sociology, economics, right?
03:31:53.600 | All the things that lead to
03:31:54.880 | how we understand human coordination.
03:31:57.360 | I think it's so profound that he didn't say
03:32:00.400 | the number one aim of the federal government
03:32:02.120 | is rule of law.
03:32:03.020 | And he didn't say it's protecting the border from enemies.
03:32:07.200 | Because if the number one aim
03:32:08.720 | was to protect the border from enemies,
03:32:11.000 | it could do that as military dictatorship quite effectively.
03:32:14.440 | And if the goal was rule of law,
03:32:16.360 | it could do it as a dictatorship, as a police state.
03:32:19.200 | And so if the number one goal is anything other
03:32:23.520 | than the comprehensive education of all the citizens
03:32:25.640 | in the science of government,
03:32:26.480 | it won't stay democracy long.
03:32:28.240 | You can see, so both education and the fourth estate,
03:32:31.960 | the fourth estate being the,
03:32:33.000 | so education, can I make sense of the world?
03:32:34.840 | Am I trained to make sense of the world?
03:32:36.080 | The fourth estate is what's actually going on currently,
03:32:38.320 | the news, do I have good unbiased information about it?
03:32:41.280 | Those are both considered prerequisite institutions
03:32:43.920 | for democracy to even be a possibility.
03:32:46.520 | And then at the scale it was initially suggested here,
03:32:49.340 | the town hall was the key phenomenon,
03:32:51.920 | where it wasn't that a special interest group
03:32:54.440 | crafted a proposition,
03:32:55.440 | the first thing I ever saw was the proposition,
03:32:58.000 | I didn't know anything about it,
03:32:58.840 | and I just got to vote yes or no.
03:33:00.240 | It was in the town hall, we all got to talk about it,
03:33:02.160 | and the proposition could get crafted in real time
03:33:04.640 | through the conversation,
03:33:05.940 | which is why there was that founding father statement
03:33:08.320 | that voting is the death of democracy.
03:33:10.760 | Voting fundamentally is polarizing the population
03:33:13.240 | in some kind of sublimated war,
03:33:15.120 | and we'll do that as the last step.
03:33:17.400 | But what we wanna do first is to say,
03:33:18.920 | how does the thing that you care about
03:33:20.760 | that seems damaged by this proposition,
03:33:22.600 | how could that turn into a solution
03:33:24.560 | to make this proposition better?
03:33:26.400 | Or this proposition still tends to the thing
03:33:27.920 | it's trying to tend to and tends to that better.
03:33:29.560 | Can we work on this together?
03:33:31.000 | And in a town hall, we could have that.
03:33:32.880 | As the scale increased, we lost the ability to do that.
03:33:35.920 | Now, as you mentioned, the internet could change that.
03:33:37.880 | The fact that we had representatives
03:33:40.200 | that had to ride a horse from one town hall
03:33:41.720 | to the other one to see what the colony would do,
03:33:44.680 | meant that we stopped having this kind of
03:33:48.120 | propositional development process when the town hall ended.
03:33:51.960 | The fact that we have not used the internet
03:33:53.660 | to recreate this is somewhere between insane
03:33:58.480 | and aligned with class interests.
03:34:03.480 | - I would push back to say that the internet
03:34:06.920 | has those things, it just has a lot of other things.
03:34:09.760 | I feel like the internet has places
03:34:11.640 | that encourage synthesis of competing ideas
03:34:16.040 | and sense-making, which is what we're talking about.
03:34:19.800 | It's just that it's also flooded
03:34:21.720 | with a bunch of other systems
03:34:23.400 | that perhaps are out competing it under current incentives,
03:34:26.460 | perhaps has to do with capitalism and the market.
03:34:28.860 | - Sure, Linux is awesome, right?
03:34:31.460 | And Wikipedia and places where you have,
03:34:34.680 | and they have problems, but places where you have
03:34:36.280 | open source sharing of information,
03:34:39.060 | vetting of information towards collective building.
03:34:41.620 | Is that building something like,
03:34:44.640 | how much has that affected our court systems
03:34:47.440 | or our policing systems or our military systems or our--
03:34:50.600 | - First of all, I think a lot, but not enough.
03:34:53.000 | I think that's something I told you offline yesterday
03:34:56.200 | is perhaps it's a whole nother discussion,
03:34:59.920 | but I don't think we're quite quantifying
03:35:02.960 | the impact on the world,
03:35:04.840 | the positive impact of Wikipedia.
03:35:06.640 | You said the policing, I mean,
03:35:09.360 | I just think the amount of empathy that,
03:35:14.360 | like knowledge, I think can't help but lead to empathy.
03:35:22.840 | Just knowing, okay, just knowing, okay,
03:35:26.360 | I'll give you some pieces of information.
03:35:28.640 | Knowing how many people died in various wars,
03:35:30.920 | that already, that delta,
03:35:32.680 | when you have millions of people have that knowledge,
03:35:35.480 | it's like, it's a little like slap in the face,
03:35:37.520 | like, oh, like my boyfriend or girlfriend breaking up with me
03:35:42.360 | is not such a big deal
03:35:43.960 | when millions of people were tortured,
03:35:46.560 | you know, like just a little bit.
03:35:48.000 | And when a lot of people know that because of Wikipedia
03:35:51.920 | or the effect, there's second order effects of Wikipedia,
03:35:55.280 | which is it's not that necessarily people read Wikipedia,
03:35:58.680 | it's like YouTubers who don't really know stuff that well
03:36:03.680 | will thoroughly read a Wikipedia article
03:36:07.480 | and create a compelling video
03:36:09.160 | describing that Wikipedia article
03:36:10.600 | that then millions of people watch
03:36:12.880 | and they understand that, holy shit, a lot of,
03:36:15.520 | there was such, first of all,
03:36:16.840 | there was such a thing as World War II and World War I,
03:36:19.600 | okay, like they can at least like learn about it,
03:36:22.680 | they can learn about, this was like recent,
03:36:25.480 | they can learn about slavery,
03:36:26.520 | they can learn about all kinds of injustices in the world.
03:36:30.080 | And that I think has a lot of effects to our,
03:36:33.720 | to the way, whether you're a police officer,
03:36:36.640 | a lawyer, a judge, and the jury,
03:36:40.320 | or just the regular civilian citizen,
03:36:44.520 | the way you approach the,
03:36:47.800 | every other communication you engage in,
03:36:50.080 | even if the system of that communication
03:36:52.000 | is very much flawed.
03:36:53.360 | So I think there's a huge positive effect on Wikipedia.
03:36:55.880 | That's my case for Wikipedia.
03:36:57.160 | So you should donate to Wikipedia.
03:36:59.520 | I'm a huge fan, but there's very few systems like it,
03:37:02.760 | which is sad to me.
03:37:04.720 | - So I think it would be a useful exercise
03:37:08.520 | for any listener of the show
03:37:11.960 | to really try to run the dialectical synthesis process
03:37:16.880 | with regard to a topic like this,
03:37:20.640 | and take the techno concern perspective
03:37:25.640 | with regard to information tech
03:37:29.320 | that folks like Tristan Harris take,
03:37:32.240 | and say, what are all of the things that are getting worse?
03:37:35.320 | And what, and are any of them following an exponential curve
03:37:38.680 | and how much worse, how quickly could that be?
03:37:40.980 | And then, and do that fully without mitigating it.
03:37:46.920 | Then take the techno optimist perspective
03:37:48.680 | and see what things are getting better
03:37:50.320 | in a way that Kurzweil or Diamandis or someone might do,
03:37:54.800 | and try to take that perspective fully
03:37:57.560 | and say, are some of those things exponential?
03:37:59.040 | And what could that portend?
03:38:00.280 | And then try to hold all that at the same time.
03:38:03.080 | And I think there are ways in which,
03:38:07.400 | depending upon the metrics we're looking at,
03:38:10.520 | things are getting worse on exponential curves
03:38:13.160 | and better on exponential curves
03:38:14.760 | for different metrics at the same time,
03:38:16.800 | which I hold as the destabilization of the previous system.
03:38:20.760 | And either an emergence to a better system
03:38:22.720 | or a collapse to a lower order are both possible.
03:38:25.520 | And so I want my optimism not to be about my assessment.
03:38:31.880 | I want my assessment to be just as fucking clear
03:38:34.600 | as it can be.
03:38:35.440 | I want my optimism to be what inspires the solution process
03:38:40.000 | on that clear assessment.
03:38:42.200 | So I never want to apply optimism in the sense making.
03:38:45.720 | I want to just try to be clear.
03:38:47.560 | If anything, I want to make sure
03:38:49.120 | that the challenges are really well understood.
03:38:52.480 | But that's in service of an optimism
03:38:54.680 | that there are good potentials,
03:38:57.480 | even if I don't know what they are, that are worth seeking.
03:39:00.600 | There's kind of a, there is some sense of optimism
03:39:03.800 | that's required to even try
03:39:04.760 | to innovate really hard problems.
03:39:06.460 | But then I want to take my pessimism
03:39:09.400 | and red team my own optimism
03:39:11.200 | to see is that solution not gonna work?
03:39:13.040 | Does it have second order effects?
03:39:14.600 | And then not get upset by that
03:39:16.940 | because I then come back to how to make it better.
03:39:19.600 | So just a relationship between optimism and pessimism
03:39:22.280 | and the dialectic of how they can work.
03:39:25.120 | So when I, of course, we can say
03:39:27.720 | that Wikipedia is a pretty awesome example of a thing.
03:39:32.560 | We can look at the places where it has limits
03:39:34.960 | or has failed where on a celebrity topic
03:39:40.200 | or corporate interest topic,
03:39:42.000 | you can pay Wikipedia editors to edit more frequently
03:39:45.080 | and various things like that.
03:39:46.920 | But you can also see where there's a lot of information
03:39:49.160 | that was kind of decentrally created,
03:39:51.360 | that is good information,
03:39:52.360 | that is more easily accessible to people
03:39:54.060 | than everybody buying their own Encyclopedia Britannica
03:39:56.200 | or walking down to the library
03:39:57.800 | and that can be updated in real time faster.
03:40:00.840 | And I think you're very right
03:40:04.140 | that the business model is a big difference
03:40:08.080 | because Wikipedia is not a for-profit corporation.
03:40:11.400 | It is a, it's tending to the information commons
03:40:15.260 | and it doesn't have an agenda
03:40:16.960 | other than tending to the information commons.
03:40:19.640 | And I think the two masters issue is a tricky one
03:40:23.720 | when I'm trying to optimize
03:40:24.960 | for very different kinds of things
03:40:26.920 | where I have to sacrifice one for the other
03:40:31.360 | and I can't find synergistic satisfiers, which one?
03:40:34.360 | And if I have a fiduciary responsibility to shareholder
03:40:37.840 | profit maximization and what does that end up creating?
03:40:41.780 | I think the ad model that Silicon Valley took,
03:40:46.800 | I think Jaron Lanier,
03:40:49.840 | I don't know if you've had him on the show,
03:40:51.120 | but he has interesting assessment
03:40:52.720 | of the nature of the ad model.
03:40:54.860 | Silicon Valley wanting to support capitalism
03:41:00.280 | and entrepreneurs to make things,
03:41:01.620 | but also the belief that information should be free
03:41:05.280 | and also the network dynamics
03:41:07.480 | where the more people you got on,
03:41:08.680 | you got increased value per user per capita
03:41:11.560 | as more people got on.
03:41:12.440 | So you didn't want to do anything
03:41:13.320 | to slow the rate of adoption.
03:41:15.400 | Some places actually, PayPal paying people money
03:41:18.600 | to join the network because the value of the network
03:41:21.760 | would be, there'd be a Metcalfe-like dynamic
03:41:23.840 | proportional to the square of the total number of users.
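[Illustrative aside, not part of the conversation: the Metcalfe-style claim is that network value grows roughly with the number of possible user-to-user links, n(n-1)/2, so the marginal value of each additional user grows with n. That is why subsidizing early sign-ups, as PayPal did, can be rational. The per-link value below is a made-up constant.]

```python
# Illustrative sketch (made-up per-link value): a Metcalfe-like network value,
# proportional to the n*(n-1)/2 possible links between users.

def network_value(n_users: int, value_per_link: float = 0.01) -> float:
    """Total value proportional to the number of possible user-to-user links."""
    return value_per_link * n_users * (n_users - 1) / 2

for n in (100, 10_000, 1_000_000):
    marginal = network_value(n + 1) - network_value(n)
    print(f"n={n:>9,}  total ~ {network_value(n):,.0f}  "
          f"marginal value of next user ~ {marginal:,.2f}")
```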
03:41:26.600 | So the ad model made sense of how do we make it free,
03:41:31.480 | but also be a business, get everybody on,
03:41:34.120 | but not really thinking about what it would mean to,
03:41:37.600 | and this is now the whole idea
03:41:38.840 | that if you aren't paying for the product,
03:41:40.440 | you are the product.
03:41:41.440 | If they have a fiduciary responsibility
03:41:45.940 | to their shareholder to maximize profit,
03:41:47.700 | their customer is the advertiser,
03:41:50.260 | and the user who it's being built for
03:41:52.500 | is the subject of behavioral mod for the advertisers.
03:41:56.400 | That's a whole different thing
03:41:58.440 | than that same type of tech could have been
03:42:00.680 | if applied with a different business model
03:42:02.400 | or a different purpose.
03:42:03.660 | I think there's, because Facebook and Google
03:42:10.800 | and other information and communication platforms
03:42:14.580 | end up harvesting data about user behavior
03:42:17.180 | that allows them to model who the people are
03:42:19.520 | in a way that gives them more,
03:42:20.920 | sometimes specific information and behavioral information
03:42:25.160 | than even a therapist or a doctor or a lawyer
03:42:29.920 | or a priest might have in a different setting.
03:42:31.820 | They basically are accessing privileged information.
03:42:35.080 | There should be a fiduciary responsibility.
03:42:38.160 | And in normal fiduciary law,
03:42:40.520 | if there's this principal agent thing,
03:42:42.200 | if you are a principal and I'm an agent on your behalf,
03:42:47.200 | I don't have a game theoretic relationship with you.
03:42:49.600 | If you're sharing something with me
03:42:50.840 | and I'm the priest or I'm the therapist,
03:42:52.800 | I'm never gonna use that information
03:42:54.280 | to try to sell you a used car or whatever the thing is.
03:42:58.140 | But Facebook is gathering massive amounts
03:43:00.960 | of privileged information and using it
03:43:02.460 | to modify people's behavior toward behavior
03:43:04.680 | that they didn't sign up for and didn't want,
03:43:07.160 | but that the corporation did.
03:43:08.660 | So I think this is an example of the physical tech
03:43:12.720 | evolving in the context of the previous social tech
03:43:15.960 | where it's being shaped in particular ways.
03:43:18.480 | And here, unlike Wikipedia that evolved
03:43:20.360 | for the information commons,
03:43:22.840 | this evolved for fulfilling particular agentic purpose.
03:43:26.720 | Most people, when they're on Facebook,
03:43:28.120 | think it's just a tool that they're using.
03:43:29.600 | They don't realize it's an agent.
03:43:31.600 | It is a corporation with a profit motive
03:43:33.920 | and as I'm interacting with it,
03:43:36.640 | it has a goal for me different than my goal for myself.
03:43:39.840 | And I might wanna be on for a short period of time.
03:43:41.720 | Its goal is maximize time on site.
03:43:43.600 | And so there is a rivalry that is take,
03:43:46.840 | but where there should be a fiduciary contract.
03:43:50.000 | I think that's actually a huge deal.
03:43:52.200 | And I think if we said,
03:43:54.040 | could we apply Facebook-like technology
03:43:58.080 | to develop people's citizenry capacity, right?
03:44:03.080 | To develop their personal health and wellbeing and habits,
03:44:09.240 | as well as their cognitive understanding,
03:44:13.320 | the complexity with which they can process
03:44:15.880 | the health of their relationships,
03:44:17.600 | that would be amazing to start to explore.
03:44:22.200 | And this is now the thesis
03:44:23.520 | that we started to discuss before is,
03:44:27.140 | every time there is a major step function
03:44:29.720 | in the physical tech,
03:44:30.980 | it obsoletes the previous social tech
03:44:33.880 | and the new social tech has to emerge.
03:44:35.780 | What I would say is that when we look
03:44:38.200 | at the nation state level of the world today,
03:44:40.580 | the more top-down authoritarian nation states
03:44:44.000 | were, as the exponential tech started to emerge,
03:44:47.160 | as the digital technology started to emerge,
03:44:49.960 | in a position for better long-term planning
03:44:53.420 | and better coordination.
03:44:55.300 | And so the authoritarian states started applying
03:44:57.520 | the exponential tech intentionally
03:44:59.240 | to make more effective authoritarian states.
03:45:01.680 | And that's everything from like an internet of things,
03:45:03.960 | surveillance system, going into machine learning systems,
03:45:08.180 | to the Sesame Credit system, to all those types of things.
03:45:11.600 | And so they're upgrading their social tech
03:45:14.080 | using the exponential tech.
03:45:16.200 | Whereas within a nation state like the US,
03:45:19.100 | or other democratic open societies,
03:45:22.800 | the countries, the states are not directing the technology
03:45:26.380 | in a way that makes a better open society,
03:45:28.180 | meaning better emergent order.
03:45:29.980 | They're saying, well, the corporations are doing that
03:45:32.500 | and the state is doing the relatively little thing
03:45:34.820 | it would do aligned with the previous corporate law
03:45:36.880 | that no longer is relevant
03:45:38.060 | 'cause there wasn't fiduciary responsibility
03:45:39.820 | for things like that.
03:45:40.660 | There wasn't antitrust
03:45:42.180 | because this creates functional monopolies
03:45:44.580 | because of network dynamics, right?
03:45:46.080 | Where YouTube has more users than Vimeo
03:45:49.020 | and every other video player together.
03:45:50.580 | Amazon has a bigger percentage of market share
03:45:52.700 | than all of the other marketplaces together.
03:45:54.560 | You get one big dog per vertical because of network effects,
03:45:59.300 | which is a kind of organic monopoly
03:46:00.700 | that the previous antitrust law didn't even have a place for.
03:46:02.900 | That wasn't a thing.
03:46:04.420 | Anti-monopoly was only something that emerged
03:46:06.900 | in the space of government contracts.
03:46:08.780 | So what we see is the new exponential technology
03:46:13.740 | is being directed by authoritarian nation states
03:46:16.680 | to make better authoritarian nation states
03:46:18.200 | and by corporations to make more powerful corporations.
03:46:21.100 | And powerful corporations,
03:46:23.020 | when we think about the Scottish enlightenment,
03:46:24.940 | when the idea of markets was being advanced,
03:46:27.020 | the modern kind of ideas of markets,
03:46:28.980 | the biggest corporation was tiny
03:46:32.260 | compared to what the biggest corporation today is.
03:46:35.020 | So the asymmetry of it relative to people was tiny.
03:46:37.880 | And the asymmetry now in terms of the total technology
03:46:41.840 | it employs, total amount of money,
03:46:43.620 | total amount of information processing
03:46:45.700 | is so many orders of magnitude greater.
03:46:48.380 | And rather than there being demand for an authentic thing
03:46:53.380 | that creates a basis for supply,
03:46:55.740 | as supply started to get way more coordinated and powerful
03:46:58.660 | and the demand wasn't coordinated
03:46:59.780 | 'cause you don't have a labor union
03:47:00.860 | of all the customers working together,
03:47:02.860 | but you do have a coordination on the supply side,
03:47:04.900 | supply started to recognize
03:47:06.240 | that it could manufacture demand.
03:47:08.020 | It could make people want shit that they didn't want before
03:47:09.980 | that maybe wouldn't increase their happiness
03:47:12.000 | in a meaningful way, might increase addiction.
03:47:14.140 | Addiction is a very good way to manufacture demand.
03:47:17.100 | And so as soon as manufactured demand started,
03:47:21.400 | through "this is the cool thing
03:47:23.420 | and you have to have it for status" or whatever it is,
03:47:25.900 | the intelligence of the market was breaking.
03:47:28.720 | Now it's no longer a collective intelligence system
03:47:30.900 | that is upregulating real desire
03:47:32.940 | for things that are really meaningful.
03:47:34.260 | You were able to hijack the lower angels of our nature
03:47:37.200 | rather than the higher ones,
03:47:38.160 | the addictive patterns drive those
03:47:40.260 | and have people want shit
03:47:41.340 | that doesn't actually make them happier
03:47:42.540 | or make the world better.
03:47:44.140 | And so we really also have to update our theory of markets
03:47:48.980 | because behavioral econ showed that homo economicus,
03:47:52.360 | the rational actor is not really a thing,
03:47:54.500 | but particularly at greater and greater scale
03:47:56.860 | can't really be a thing.
03:47:57.980 | Voluntarism isn't a thing where if my company
03:48:00.800 | doesn't wanna advertise on Facebook,
03:48:02.420 | I just will lose to the companies that do
03:48:04.260 | 'cause that's where all the fucking attention is.
03:48:06.420 | And so then I can say it's voluntary,
03:48:08.300 | but it's not really if there's a functional monopoly.
03:48:11.820 | Same if I'm gonna sell on Amazon or things like that.
03:48:14.660 | So what I would say is these corporations
03:48:18.460 | are becoming more powerful than nation states in some ways.
03:48:23.460 | And they are also debasing the integrity
03:48:28.460 | of the nation states, the open societies.
03:48:33.900 | So the democracies are getting weaker
03:48:35.900 | as a result of exponential tech
03:48:38.040 | and the kind of new tech companies
03:48:40.460 | that are kind of a new feudalism, tech feudalism,
03:48:42.860 | 'cause it's not a democracy inside of a tech company
03:48:45.140 | or the supply and demand relationship
03:48:46.940 | when you have manufactured demand
03:48:49.480 | and kind of monopoly type functions.
03:48:52.120 | And so we have basically a new feudalism
03:48:54.820 | controlling exponential tech
03:48:55.980 | and authoritarian nation states controlling it.
03:48:58.060 | And those attractors are both shitty.
03:49:00.060 | And so I'm interested in the application of exponential tech
03:49:05.340 | to making better social tech
03:49:07.300 | that makes emergent order possible.
03:49:10.020 | And where then that emergent order can bind
03:49:12.580 | and direct the exponential tech in fundamentally healthy,
03:49:16.660 | not X risk oriented directions.
03:49:19.100 | I think the relationship of social tech
03:49:21.220 | and physical tech can make it.
03:49:22.700 | I think we can actually use the physical tech
03:49:24.420 | to make better social tech, but it's not given that we do.
03:49:27.860 | If we don't make better social tech,
03:49:30.240 | then I think the physical tech
03:49:31.500 | empowers really shitty social tech
03:49:33.000 | that is not a world that we want.
03:49:35.080 | - I don't know if it's the road we wanna go down,
03:49:37.820 | but I tend to believe that the market
03:49:39.680 | will create exactly the thing you're talking about,
03:49:42.200 | because I feel like there's a lot of money to be made
03:49:44.560 | in creating a social tech that creates a better citizen,
03:49:49.560 | that creates a better human being.
03:49:56.500 | Your description of Facebook and so on,
03:50:02.800 | which is a system that creates addiction,
03:50:05.600 | which manufactures demand, is not obviously
03:50:10.600 | an inherent consequence of markets.
03:50:14.600 | I feel like that's the first stage of us,
03:50:17.400 | like baby deer trying to figure out how to use the internet.
03:50:20.560 | I feel like there's much more money to be made
03:50:23.320 | with something that creates compersion and love, honestly.
03:50:28.320 | I mean, I really, we can have this,
03:50:33.960 | I can make the business case for it.
03:50:35.360 | I don't think we wanna really have that discussion,
03:50:39.040 | but do you have some hope that that's the case?
03:50:41.440 | And I guess if not, then how do we fix the system
03:50:44.720 | of markets that work so well
03:50:46.000 | for the United States for so long?
03:50:48.960 | - Like I said, every social tech worked for a while.
03:50:51.600 | Tribalism worked well for 200,000 or 300,000 years.
03:50:55.480 | I think social tech has to keep evolving.
03:50:57.960 | The social technologies with which we organize
03:51:00.840 | and coordinate our behavior have to keep evolving
03:51:03.440 | as our physical tech does.
03:51:05.720 | So I think the thing that we call markets,
03:51:09.320 | of course we can try to say,
03:51:12.000 | oh, even biology runs on markets,
03:51:14.480 | but the thing that we call markets,
03:51:18.080 | the underlying theory, homo economicus,
03:51:20.600 | demand driving supply, that thing broke.
03:51:23.560 | It broke with scale in particular and a few other things.
03:51:28.160 | So it needs to be updated in a really fundamental way.
03:51:32.600 | I think there's something even deeper
03:51:34.200 | than making money happening
03:51:35.560 | that in some ways will obsolete money-making.
03:51:39.080 | I think capitalism is not about business.
03:51:44.560 | So if you think about business,
03:51:48.280 | I'm gonna produce a good or a service that people want
03:51:50.880 | and bring it to the market
03:51:53.080 | so that people get access to that good or service.
03:51:55.160 | That's the world of business, but that's not capitalism.
03:51:58.280 | Capitalism is the management and allocation of capital.
03:52:03.000 | Financial services, which was a tiny percentage
03:52:06.560 | of the total market, has become a huge percentage
03:52:08.360 | of the total market.
03:52:09.200 | It's a different creature.
03:52:10.360 | So if I was in business
03:52:12.400 | and I was producing a good or service
03:52:13.920 | and I was saving up enough money
03:52:15.080 | that I started to be able to invest that money
03:52:17.120 | and gain interest or do things like that,
03:52:19.840 | I start realizing I'm making more money on my money
03:52:23.600 | than I'm making on producing the goods and services.
03:52:26.120 | So I stop even paying attention to goods and services
03:52:28.240 | and start paying attention to making money on money.
03:52:30.880 | And how do I utilize capital to create more capital?
03:52:34.040 | And capital gives me more optionality
03:52:36.280 | 'cause I can buy anything with it
03:52:37.400 | than a particular good or service that only some people want.
03:52:40.200 | With capitalism, more capital ended up meaning more control.
03:52:47.240 | I could put more people under my employment.
03:52:51.680 | I could buy larger pieces of land,
03:52:54.800 | novel access to resources, mines,
03:52:56.640 | and put more technology under my employment.
03:52:58.200 | So it meant increased agency and also increased control.
03:53:00.960 | I think attentionalism is even more powerful.
03:53:05.300 | So rather than enslave people
03:53:10.680 | where the people kind of always want to get away
03:53:14.000 | and put in the least work they can,
03:53:16.320 | there's a way in which economic servitude
03:53:18.160 | was just more profitable than slavery, right?
03:53:21.640 | Have the people work even harder voluntarily
03:53:24.000 | 'cause they wanna get ahead
03:53:25.640 | and nobody has to be there to whip them
03:53:27.960 | or control them or whatever.
03:53:29.360 | This is a cynical take, but a meaningful take.
03:53:34.380 | So capital ends up being a way
03:53:40.680 | to influence human behavior, right?
03:53:43.160 | And yet where people still feel free in some meaningful way,
03:53:48.160 | they're not feeling like they're gonna be punished
03:53:51.760 | by the state if they don't do something.
03:53:53.480 | It's more like being punished by the market
03:53:54.800 | via homelessness or something.
03:53:56.600 | But the market is this invisible thing
03:53:58.160 | I can't put an agent on, so it feels free.
03:54:01.120 | And so if you want to affect people's behavior
03:54:06.120 | and still have them feel free,
03:54:10.280 | capital ends up being a way to do that.
03:54:12.480 | But I think affecting their attention is even deeper
03:54:15.880 | 'cause if I can affect their attention,
03:54:18.500 | I can both affect what they want
03:54:20.880 | and what they believe and what they feel.
03:54:22.880 | And we statistically know this very clearly.
03:54:24.600 | Facebook has done studies that based on changing the feed,
03:54:27.320 | it can change beliefs, emotional dispositions, et cetera.
03:54:31.160 | And so I think there's a way that the harvesting
03:54:35.920 | and directing of attention is an even more powerful system
03:54:38.960 | than capitalism.
03:54:39.800 | It is effective in capitalism to generate capital,
03:54:42.780 | but I think it also generates influence
03:54:44.680 | beyond what capital can do.
03:54:46.640 | And so do we want to have some groups
03:54:52.800 | utilizing that type of tech
03:54:55.960 | to direct other people's attention?
03:54:57.680 | If so, towards what?
03:55:02.680 | Towards what metrics of what a good civilization
03:55:05.520 | and good human life would be?
03:55:07.160 | What's the oversight process?
03:55:08.800 | What is the-
03:55:09.960 | - Transparency.
03:55:10.800 | I can answer all the things you're mentioning.
03:55:13.480 | I guarantee you, if I'm not such a lazy ass,
03:55:19.520 | I'll be part of the many people doing this.
03:55:21.840 | It's transparency and control,
03:55:23.880 | like giving control to individual people.
03:55:26.240 | - Okay, so maybe the corporation has coordination
03:55:31.240 | on its goals that all of its customers
03:55:36.120 | or users together don't have.
03:55:37.600 | So there's some asymmetry of its goals,
03:55:42.600 | but maybe I could actually help all of the customers
03:55:45.840 | to coordinate almost like a labor union or whatever
03:55:49.000 | by informing and educating them adequately
03:55:51.840 | about the effects, the externalities on them.
03:55:54.840 | This is not toxic waste going into the ocean
03:55:57.320 | or the atmosphere, it's their minds,
03:55:59.160 | their beings, their families, their relationships,
03:56:02.280 | such that they will, as a group, change their behavior.
03:56:05.560 | And I think the,
03:56:08.760 | one way of saying what you're saying, I think,
03:56:13.040 | is that you think that you can rescue homo economicus,
03:56:18.040 | the rational actor
03:56:21.080 | that will pursue all the goods and services
03:56:22.720 | and choose the best one at the best price,
03:56:24.600 | the kind of Rand, von Mises, Hayek idea,
03:56:26.440 | that you can rescue that from Dan Ariely
03:56:28.520 | and behavioral econ that says
03:56:29.840 | that's actually not how people make choices,
03:56:31.400 | they make them based on status hacking,
03:56:32.960 | largely regardless of whether it's good for them in the long term.
03:56:36.080 | And the large asymmetric corporation can run propaganda
03:56:40.320 | and narrative warfare that hits people's status buttons
03:56:42.600 | and their limbic hijacks and lots of other things
03:56:45.640 | in ways that they can't even perceive
03:56:47.880 | are happening.
03:56:48.880 | They're not paying attention to that,
03:56:51.480 | the site is employing psychologists
03:56:53.240 | and split testing and whatever else.
03:56:55.140 | So you're saying, I think we can recover homo economicus.
03:56:59.800 | - And not just through a single mechanism or technology.
03:57:02.400 | There's, not to keep mentioning the guy,
03:57:05.640 | platforms like Joe Rogan and so on
03:57:08.760 | that help the education
03:57:14.960 | about negative externalities
03:57:17.880 | become viral in this world.
03:57:20.840 | - So interestingly, I actually agree with you
03:57:25.240 | that--
03:57:27.120 | - Got 'em, four and a half hours in.
03:57:30.560 | - That we can--
03:57:32.280 | - Tech can do some good.
03:57:33.480 | - Well, see what you're talking about
03:57:35.800 | is the application of tech here, broadcast tech,
03:57:38.640 | where you can speak to a lot of people
03:57:40.280 | and that's not gonna be strong enough
03:57:42.280 | 'cause different people need to be spoken to differently,
03:57:44.160 | which means it has to be different voices
03:57:45.600 | that get amplified to those audiences,
03:57:47.080 | more like Facebook's tech,
03:57:48.160 | but nonetheless, we'll start with broadcast tech.
03:57:50.520 | - It plants the first seed and then the word of mouth
03:57:52.680 | is a powerful thing.
03:57:53.880 | You need to do the first broadcast shotgun
03:57:56.200 | and then it like lands, a catapult or whatever,
03:57:59.480 | I don't know what the right weapon is,
03:58:01.760 | but then it just spreads by word of mouth
03:58:03.480 | through all kinds of tech, including Facebook.
03:58:06.240 | - So let's come back to the fundamental thing.
03:58:08.160 | The fundamental thing is we want a kind of order
03:58:11.280 | at various scales from the conflicting parts of ourself,
03:58:15.680 | actually having more harmony than they might have,
03:58:19.880 | to family, extended family, local, all the way up to global.
03:58:24.880 | We want emergent order where our choices
03:58:28.600 | have more alignment, right?
03:58:32.600 | We want that to be emergent rather than imposed
03:58:36.360 | and rather than us wanting fundamentally different things
03:58:38.640 | or making totally different sense of the world,
03:58:40.240 | where warfare of some kind becomes the only solution.
03:58:43.640 | Emergent order requires us, in our choice-making,
03:58:47.480 | being able to have related sense-making
03:58:51.440 | and related meaning-making processes.
03:58:53.480 | Can we apply digital technologies and exponential tech
03:58:59.720 | in general to try to increase the capacity to do that?
03:59:03.120 | Where the technology called a town hall,
03:59:04.920 | the social tech where we'd all get together and talk,
03:59:06.720 | obviously is very scale limited
03:59:08.600 | and it's also oriented to geography
03:59:10.880 | rather than networks of aligned interest.
03:59:13.080 | Can we build new, better versions of those types of things?
03:59:16.240 | And going back to the idea that a democracy
03:59:20.280 | or participatory governance depends upon
03:59:22.720 | comprehensive education in the science of government,
03:59:24.960 | which includes being able to understand things
03:59:27.000 | like asymmetric information warfare
03:59:28.640 | on the side of governments
03:59:30.000 | and how the people can organize adequately.
03:59:32.200 | Can you utilize some of the technologies now
03:59:35.880 | to be able to support increased comprehensive education
03:59:40.120 | of the people and maybe comprehensive informedness?
03:59:43.240 | So fixing the decay in both education
03:59:45.880 | and the fourth estate that has happened
03:59:47.360 | so that people can start self-organizing
03:59:49.240 | to then influence the corporations,
03:59:52.760 | the nation states to do different things
03:59:55.280 | and/or build new ones themselves.
03:59:57.400 | Yeah, fundamentally, that's the thing that has to happen.
04:00:00.720 | The exponential tech gives us a novel problem landscape
04:00:03.480 | that the world never had.
04:00:04.640 | The nuke gave us a novel problem landscape.
04:00:07.000 | And so that required this whole Bretton Woods world.
04:00:10.480 | The exponential tech gives us novel problem landscape.
04:00:13.480 | Our existing problem-solving processes
04:00:15.440 | aren't doing a good job.
04:00:16.520 | We have had more countries get nukes.
04:00:18.680 | We haven't had nuclear deproliferation.
04:00:20.360 | We haven't achieved any of the UN
04:00:22.200 | sustainable development goals.
04:00:24.600 | We haven't kept any of the new categories of tech
04:00:26.640 | from making arms races.
04:00:27.720 | So our global coordination is not adequate
04:00:29.880 | to the problem landscape.
04:00:32.200 | So we need fundamentally better problem-solving processes,
04:00:35.120 | a market or a state is a problem-solving process.
04:00:37.240 | We need better ones that can do the speed
04:00:39.400 | and scale of the current issues.
04:00:41.440 | Right now, speed is one of the other big things,
04:00:43.360 | is that by the time we regulated DDT out of existence
04:00:46.920 | or regulated cigarettes as not for people under 18,
04:00:49.080 | they'd already killed so many people
04:00:50.800 | and we let the market do the thing.
04:00:52.720 | But as Elon has made the point, that won't work for AI.
04:00:56.280 | By the time we recognize afterwards
04:00:59.460 | that we have an autopoietic AI that's a problem,
04:01:01.440 | you won't be able to reverse it,
04:01:02.600 | that there's a number of things that
04:01:04.720 | when you're dealing with tech
04:01:05.680 | that is either self-replicating
04:01:07.840 | and disintermediates humans to keep going,
04:01:09.760 | doesn't need humans to keep going,
04:01:11.560 | or you have tech that just has exponentially fast effects,
04:01:15.600 | your regulation has to come early.
04:01:17.680 | It can't come after the effects have happened,
04:01:21.640 | the negative effects have happened
04:01:23.120 | 'cause the negative effects could be too big too quickly.
04:01:25.400 | So we basically need new problem-solving processes
04:01:27.800 | that do better at being able to internalize externality,
04:01:32.800 | solve the problems on the right time scale
04:01:34.800 | and the right geographic scale.
04:01:37.680 | And those new processes to not be imposed
04:01:40.480 | have to emerge from people wanting them
04:01:42.520 | and being able to participate in their development,
04:01:46.440 | which is what I would call kind of
04:01:47.720 | a new cultural enlightenment or renaissance
04:01:49.640 | that has to happen,
04:01:50.920 | where people start understanding the new power
04:01:54.400 | that exponential tech offers,
04:01:56.600 | the way that it is actually damaging
04:01:59.720 | current governance structures that we care about
04:02:02.880 | and creating an X-risk landscape,
04:02:04.760 | but could also be redirected towards more protopian purposes
04:02:09.760 | and then saying,
04:02:11.080 | how do we rebuild new social institutions?
04:02:13.080 | What are adequate social institutions
04:02:14.840 | where we can do participatory governance at scale and in time?
04:02:18.780 | And how can the people actually participate
04:02:21.660 | to build those things?
04:02:22.880 | The solution that I see working
04:02:25.680 | requires a process like that.
04:02:27.140 | - And the result maximizes love.
04:02:32.280 | So again, Elon, you'd be right that love is the answer.
04:02:36.120 | Let me take you back from the scale of societies
04:02:39.280 | to the scale that's far, far more important,
04:02:42.640 | which is the scale of family.
04:02:47.300 | You've written a blog post about your dad.
04:02:50.080 | We have various flavors of relationships with our fathers.
04:02:55.920 | What have you learned about life from your dad?
04:02:59.600 | - Well, people can read the blog post
04:03:04.040 | and see a lot of individual things that I learned
04:03:06.680 | that I really appreciated.
04:03:08.800 | If I was to kind of summarize at a high level,
04:03:11.200 | I had a really incredible dad,
04:03:16.360 | a very, very unusually positive set of experiences.
04:03:23.160 | We were homeschooled,
04:03:25.120 | and he was committed to work from home to be available
04:03:27.640 | and prioritize fathering in a really deep way.
04:03:30.340 | And as a super gifted, super loving, very unique man,
04:03:40.240 | he also had his unique issues
04:03:41.820 | that were part of what crafted the unique brilliance
04:03:43.800 | and those things often go together.
04:03:45.680 | And I say that because I think I had some unusual gifts
04:03:50.340 | and also some unusual difficulties.
04:03:52.100 | And I think it's useful for everybody to know
04:03:55.520 | their path probably has both of those.
04:03:57.420 | But if I was to say what was kind of at the essence
04:04:03.640 | of one of the things my dad taught me
04:04:05.720 | across a lot of lessons, it was like
04:04:07.360 | the intersection of self-empowerment,
04:04:11.560 | ideas and practices that self-empower
04:04:14.840 | towards collective good,
04:04:17.960 | towards some virtuous purpose beyond the self.
04:04:20.800 | And he both said that a million different ways,
04:04:23.960 | taught it in a million different ways
04:04:25.560 | when we were doing construction
04:04:26.920 | and he was teaching me how to build a house.
04:04:30.160 | We were putting the wires through the walls
04:04:32.900 | before the drywall went on.
04:04:34.060 | He made sure that the way that we put the wires
04:04:36.440 | through was beautiful.
04:04:37.600 | Like that the height of the holes was similar,
04:04:42.600 | that we twisted the wires in a particular way.
04:04:44.880 | And it's like, no one's ever gonna see it.
04:04:46.920 | And he's like, if a job's worth doing,
04:04:48.720 | it's worth doing well and excellence is its own reward
04:04:50.920 | and those types of ideas.
04:04:52.040 | And if there was a really shitty job to do,
04:04:53.600 | he'd say, see the job, do the job, stay out of the misery.
04:04:55.720 | Just don't indulge any negativity,
04:04:57.160 | do the things that need to be done.
04:04:59.040 | And so there's like a,
04:05:01.160 | there's an empowerment and a nobility together.
04:05:04.180 | And yeah, extraordinarily fortunate.
04:05:10.560 | - Are there ways you think you could have been a better son?
04:05:13.400 | Are there things you regret?
04:05:17.680 | That's an interesting question.
04:05:19.520 | - Let me first say, just as a bit of a criticism,
04:05:23.620 | that what kind of man do you think you are
04:05:28.560 | not wearing a suit and tie?
04:05:30.560 | A real man should.
04:05:31.600 | (laughing)
04:05:33.760 | Exactly, I agree with your dad on that point.
04:05:36.680 | You mentioned offline that he suggested
04:05:39.880 | a real man should wear a suit and tie.
04:05:41.760 | But outside of that,
04:05:45.480 | is there ways you could have been a better son?
04:05:48.440 | - Maybe next time on your show, I'll wear a suit and tie.
04:05:51.040 | (laughing)
04:05:52.560 | My dad would be happy about that.
04:05:54.200 | - Please.
04:06:12.600 | - I can answer the question for later in life, not early.
04:06:15.740 | I had just a huge amount of respect and reverence
04:06:19.560 | for my dad when I was young.
04:06:20.640 | So I was asking myself that question a lot.
04:06:23.840 | So there weren't a lot of things I knew
04:06:26.280 | that I wasn't seeking to apply.
04:06:28.380 | There was a phase when I went through
04:06:34.040 | my kind of individuation and differentiation
04:06:38.040 | where I had to make him excessively wrong
04:06:41.080 | about too many things.
04:06:42.240 | I don't think I had to, but I did.
04:06:45.480 | And he had a lot of kind of non-standard model beliefs
04:06:50.480 | about things, whether about early ancient civilizations
04:06:55.480 | or ideas on evolutionary theory
04:06:58.800 | or alternate models of physics.
04:07:00.360 | They weren't irrational,
04:07:06.200 | but they didn't all have the standard
04:07:08.080 | of epistemic proof that I would need.
04:07:12.000 | And I went through, and some of them
04:07:16.200 | were kind of spiritual ideas as well.
04:07:18.680 | I went through a phase in my early 20s
04:07:23.120 | where I kind of had the attitude that Dawkins
04:07:28.120 | or Christopher Hitchens have, that can kind of be
04:07:36.920 | excessively certain and sanctimonious,
04:07:39.560 | applying their reductionist philosophy of science
04:07:43.200 | to everything and being kind of brutally dismissive.
04:07:46.740 | I'm embarrassed by that phase.
04:07:49.500 | Not to say anything about those men and their path,
04:07:55.100 | but for myself.
04:07:56.840 | And so during that time, I was more dismissive
04:08:01.840 | of my dad's epistemology
04:08:04.640 | than I would have liked to have been.
04:08:06.760 | I got to correct that later, apologize for it.
04:08:08.960 | But that was the first thought that came to mind.
04:08:11.920 | - You've written the following.
04:08:14.280 | I've had the experience countless times,
04:08:17.960 | making love, watching a sunset, listening to music,
04:08:22.280 | feeling the breeze, that I would sign up for this whole life
04:08:27.280 | and all of its pains just to experience this exact moment.
04:08:33.400 | This is a kind of wordless knowing.
04:08:35.600 | It's the most important and real truth I know,
04:08:40.120 | that experience itself is infinitely meaningful
04:08:43.960 | and pain is temporary.
04:08:45.720 | And seen clearly, even the suffering is filled with beauty.
04:08:50.760 | I've experienced countless lives worth of moments
04:08:54.280 | worthy of life, such an unreasonable fortune.
04:09:00.560 | A few words of gratitude from you, beautifully written.
04:09:05.920 | Are there some beautiful moments?
04:09:08.880 | Now, you have experienced countless lives
04:09:10.440 | worth of those moments,
04:09:14.880 | but are there some things that,
04:09:19.540 | in your darker moments, you can go to, to relive,
04:09:22.720 | to remind yourself that the whole ride is worthwhile?
04:09:22.720 | Maybe skip the making love part.
04:09:24.320 | We don't wanna know about that.
04:09:27.920 | - I mean, I feel unreasonably fortunate
04:09:32.840 | that it is such a humongous list because,
04:09:39.920 | I mean, I feel fortunate to have had exposure to practices
04:09:47.080 | and philosophies and ways of seeing things
04:09:48.680 | that make me see things that way.
04:09:50.020 | So I can take responsibility for seeing things in that way
04:09:53.200 | and not taking for granted really wonderful things,
04:09:55.980 | but I can't take credit for being exposed
04:09:57.800 | to the philosophies that even gave me that possibility.
04:10:00.500 | It's not just with my wife,
04:10:06.760 | it's with every person who I really love. When we're talking,
04:10:10.880 | I look at their face.
04:10:11.920 | I, in the context of a conversation,
04:10:14.160 | feel overwhelmed by how lucky I am to get to know them.
04:10:17.600 | And like there's never been someone like them
04:10:19.940 | in all of history and there never will be again.
04:10:22.000 | And they might be gone tomorrow, I might be gone tomorrow.
04:10:23.800 | And like, I get this moment with them.
04:10:25.360 | And when you take in the uniqueness of that fully
04:10:28.320 | and the beauty of it, it's overwhelmingly beautiful.
04:10:30.920 | And I remember the first time I did a big dose of mushrooms
04:10:38.360 | and I was looking at a tree for a long time.
04:10:43.780 | And I was just crying with overwhelm at how beautiful
04:10:46.580 | the tree was.
04:10:47.420 | And it was a tree outside the front of my house
04:10:48.700 | that I'd walked by a million times
04:10:50.000 | and never looked at like this.
04:10:52.200 | And it wasn't the dose of mushrooms where I was hallucinating
04:10:57.080 | like where the tree was purple.
04:10:59.000 | Like the tree still looked like, if I had to describe it,
04:11:01.040 | say it's green and it has leaves, looks like this,
04:11:03.320 | but it was way fucking more beautiful,
04:11:05.720 | like more captivating than it normally was.
04:11:08.560 | And I'm like, why is it so beautiful
04:11:09.840 | if I would describe it the same way?
04:11:12.020 | And I realized I had no thoughts taking me anywhere else.
04:11:15.120 | - Yeah.
04:11:15.960 | - Like what it seemed like the mushrooms were doing
04:11:17.720 | was just actually shutting the narrative off
04:11:19.640 | that would have me be distracted
04:11:20.800 | so I could really see the tree.
04:11:22.840 | And then I'm like, fuck, when I get off these mushrooms,
04:11:24.480 | I'm gonna practice seeing the tree
04:11:26.280 | because it's always that beautiful and I just miss it.
04:11:29.200 | And so I practice being with it
04:11:30.720 | and quieting the rest of the mind
04:11:32.240 | and then being like, wow.
04:11:33.560 | And if it's not mushrooms,
04:11:35.120 | like people will have peak experiences
04:11:37.320 | where they'll see life and how incredible it is.
04:11:40.000 | It's always there.
04:11:41.880 | - It's funny that I had this exact same experience
04:11:44.960 | on quite a lot of mushrooms,
04:11:47.800 | just sitting alone and looking at a tree
04:11:50.320 | and exactly as you described it,
04:11:52.920 | appreciating the undistorted beauty of it.
04:11:56.080 | And it's funny to me that here's two humans,
04:12:00.160 | very different with very different journeys
04:12:02.560 | who, at some moment in time,
04:12:03.840 | were both looking at a tree like idiots for hours
04:12:06.360 | (both laughing)
04:12:07.960 | and just in awe and happy to be alive.
04:12:10.520 | And yeah, even just that moment alone is worth living for.
04:12:14.960 | But you did say humans
04:12:17.240 | and we have a moment together as two humans
04:12:21.120 | and you mentioned shots.
04:12:22.520 | (both laughing)
04:12:23.360 | That's a, I have to ask, what are we looking at?
04:12:26.960 | - When I went to go get a smoothie before coming here,
04:12:30.480 | I got you a keto smoothie that you didn't want
04:12:32.480 | 'cause you're not just keto but fasting.
04:12:34.880 | But I saw the thing with you and your dad
04:12:37.160 | where you did shots together.
04:12:39.520 | And this place happened to have shots
04:12:41.240 | of ginger turmeric cayenne juice of some kind.
04:12:45.920 | - With some Himalayan salt.
04:12:47.400 | - I didn't necessarily plan it for being on the show.
04:12:49.960 | I just brought it.
04:12:51.200 | - Wow.
04:12:52.040 | - But we can do it that way.
04:12:52.880 | - I think we shall toast like heroes, Daniel.
04:12:57.880 | It's a huge honor.
04:12:59.920 | - What do we toast to?
04:13:00.760 | What do we toast to?
04:13:01.640 | - We toast to this moment,
04:13:03.320 | this unique moment that we get to share together.
04:13:07.000 | - I'm very grateful to be here in this moment with you.
04:13:09.040 | And yeah, I'm grateful that you invited me here.
04:13:12.680 | We met for the first time and I will never be the same
04:13:16.200 | for the good and the bad.
04:13:18.880 | - I am.
04:13:19.720 | - That is really interesting.
04:13:24.240 | That feels way healthier than the vodka
04:13:26.480 | my dad and I were drinking.
04:13:27.840 | So I feel like a better man already.
04:13:31.160 | Daniel, this is one of the best conversations I've ever had.
04:13:34.200 | I can't wait to have many more.
04:13:36.120 | - Likewise.
04:13:37.120 | - This has been an amazing experience.
04:13:39.120 | Thank you for wasting all your time today.
04:13:40.960 | I wanna say, in terms of what you were mentioning,
04:13:43.920 | that you work in machine learning
04:13:48.760 | with the optimism that wants to look at the issues,
04:13:52.960 | but also at how this increased technological power
04:13:56.040 | could be applied to solving them.
04:13:57.800 | And even thinking about the broadcast of, like,
04:14:01.760 | can I help people understand the issues better
04:14:04.160 | and help organize them?
04:14:05.520 | Fundamentally, you're oriented like Wikipedia,
04:14:09.400 | from what I see, really trying to tend to the information commons
04:14:13.960 | without another agentic interest distorting it.
04:14:16.800 | And for you to be able to get guys like Lee Smolin
04:14:22.120 | and Roger Penrose and, like, the greatest thinkers
04:14:25.440 | that are alive, have them on the show
04:14:29.040 | when most people would never be exposed to them,
04:14:30.760 | and talk about it in a way that people can understand,
04:14:33.840 | I think it's an incredible service.
04:14:35.760 | I think you're doing great work.
04:14:36.920 | So I was really happy to hear from you.
04:14:39.360 | - Thank you, Daniel.
04:14:41.240 | Thanks for listening to this conversation
04:14:42.680 | with Daniel Schmachtenberger.
04:14:44.080 | And thank you to Ground News, NetSuite,
04:14:47.520 | Four Sigmatic, Magic Spoon and BetterHelp.
04:14:51.640 | Check them out in the description to support this podcast.
04:14:55.480 | And now let me leave you with some words
04:14:57.560 | from Albert Einstein.
04:14:59.600 | "I know not with what weapons World War III will be fought,
04:15:03.320 | "but World War IV will be fought with sticks and stones."
04:15:08.040 | Thank you for listening and hope to see you next time.
04:15:11.120 | (upbeat music)
04:15:13.720 | (upbeat music)