Daniel Schmachtenberger: Steering Civilization Away from Self-Destruction | Lex Fridman Podcast #191
Chapters
0:00 Introduction
1:31 Aliens and UFOs
20:15 Collective intelligence of human civilization
28:12 Consciousness
39:33 How much computation does the human brain perform?
43:12 Humans vs ants
50:30 Humans are apex predators
57:34 Girard's Mimetic Theory of Desire
77:31 We can never completely understand reality
80:54 Self-terminating systems
91:18 Catastrophic risk
121:30 Adding more love to the world
148:55 How to build a better world
166:07 Meaning of life
173:49 Death
179:29 The role of government in society
196:54 Exponential growth of technology
242:35 Lessons from my father
248:11 Even suffering is filled with beauty
The following is a conversation with Daniel Schmachtenberger, a founding member of The Consilience Project 00:00:05.780 |
that is aimed at improving public sensemaking and dialogue. 00:00:09.320 |
He's interested in understanding how we humans 00:00:12.640 |
can be the best version of ourselves as individuals 00:00:25.800 |
Check them out in the description to support this podcast. 00:00:29.480 |
As a side note, let me say that I got a chance 00:00:31.520 |
to talk to Daniel on and off the mic for a couple of days. 00:00:34.700 |
We took a long walk the day before our conversation. 00:00:37.920 |
I really enjoyed meeting him, just on a basic human level. 00:00:42.940 |
with words that carried hope for us individual ants 00:00:46.200 |
actually contributing something of value to the colony. 00:00:50.000 |
These conversations are the reasons I love human beings, 00:00:52.920 |
our insatiable striving to lessen the suffering in the world. 00:01:08.280 |
I've gotten to experience some of that same magic 00:01:12.440 |
and in random bars in my travels across this country, 00:01:19.760 |
and a new appreciation of this too short, too beautiful life. 00:01:26.280 |
and here is my conversation with Daniel Schmachtenberger. 00:01:33.680 |
through the entire history, just watching us, 00:01:36.900 |
and were tasked with summarizing what happened until now, 00:01:41.880 |
What do you think they would write up in that summary? 00:01:43.880 |
Like it has to be pretty short, less than a page. 00:01:50.040 |
there's I think like a paragraph or a couple sentences. 00:02:10.040 |
I have no idea if their senses are even attuned 00:02:12.440 |
to similar stuff to what our senses are attuned to, 00:02:15.280 |
or what the nature of their consciousness is like 00:02:18.280 |
And so let's assume that they're kind of like us, 00:02:27.280 |
And then if they've watched throughout all of history, 00:02:40.880 |
They saw that there wasn't just a steady dialectic 00:02:46.200 |
there's a giant fire that destroys a lot of things. 00:03:11.460 |
still making our decisions relatively similarly 00:03:20.240 |
I think they would think that this is probably 00:03:24.380 |
that is not gonna make it to being intergalactic 00:03:39.780 |
and bind and direct the physical technologies 00:03:45.980 |
- Actually, "Hitchhiker's Guide" has a estimation 00:03:50.820 |
about how much of a risk this particular thing poses 00:04:10.220 |
There's ups and downs in terms of technological innovation. 00:04:16.180 |
from a game theory perspective hasn't really changed. 00:04:46.720 |
of intelligent alien civilizations out there? 00:04:53.980 |
I don't know. I think Bostrom had a new article not that long ago 00:05:03.780 |
But when I look at the total number of Kepler planets 00:05:19.420 |
as one of the primary aspects of its energy metabolism, 00:05:23.080 |
we get to think about that the building blocks 00:05:27.420 |
even that the planets have to have might be more different. 00:05:32.460 |
not to mention interesting things that we've observed 00:05:38.200 |
As you've had guests on your show discussing Tic Tac and-- 00:05:46.820 |
What do you make of the rich human psychology 00:06:00.580 |
most of which I presume are conjured up by the human mind 00:06:09.140 |
some number might be reflective of actual physical objects, 00:06:12.680 |
whether it's drones or testing military technology 00:06:22.300 |
Because it's gained quite a bit of popularity recently. 00:06:25.460 |
There's some sense in which that's us humans being hopeful 00:07:08.200 |
'cause I haven't got to study it as much as I want. 00:07:09.580 |
So what I'm gonna say comes from a superficial assessment. 00:07:42.180 |
but I don't know if you ever saw the Disclosure Project. 00:07:49.160 |
a bunch of mostly US military and some commercial flight, 00:07:57.460 |
and classified information disclosing it at a CNN briefing. 00:08:05.060 |
admirals, fighter pilots all describing things 00:08:17.420 |
that had some consistency across different people. 00:08:23.060 |
that I think it would be silly to just dismiss it. 00:08:46.260 |
and atmospheric phenomena have to say about it. 00:08:59.140 |
The interesting thing that a number of them describe 00:09:02.540 |
is something that's kind of like right angles at speed, 00:09:06.220 |
or if not right angles, acute angles at speed, 00:09:08.580 |
but something that looks like a different relationship 00:09:13.480 |
I don't think that there are any human technologies 00:09:22.260 |
Now, one could say, okay, well, could it be a hologram? 00:09:25.700 |
Well, would it show up on radar if radar is also seeing it? 00:09:32.220 |
I mean, and for that to be a massive coordinated PSYOP 00:09:53.460 |
it is the dumbest version of alien technologies. 00:09:58.460 |
It's like the old, old crappy VHS tapes of alien technology. 00:10:02.920 |
These are like crappy drones that just floated 00:10:05.620 |
or even like space to the level of like space junk 00:10:09.500 |
because it is so close to our human technology. 00:10:18.700 |
but it still has very similar kind of geometric notions 00:10:23.700 |
and something that we humans can perceive with our eyes, 00:10:31.820 |
would be something that we would not be able to perceive, 00:10:42.300 |
it would be beyond the cognitive capabilities of us humans. 00:10:46.300 |
as per your answer for aliens summarizing earth, 00:11:09.180 |
in talking about governments and human societies. 00:11:11.840 |
Do you think if a US government or any government 00:11:24.460 |
or of information related to alien spacecraft, 00:12:14.340 |
for reasons of at least supposedly national security, 00:12:18.160 |
which is also a nice source of plausible deniability 00:12:45.480 |
did analysis on it using, I believe, electron microscopy 00:12:50.160 |
and came to the idea that it was a nanotech alloy 00:12:54.080 |
that was something we didn't currently have the ability 00:13:18.920 |
like would they necessarily move their atoms around in space 00:13:22.120 |
or might they do something more interesting than that? 00:13:24.640 |
Might they be able to have a different relationship 00:13:27.920 |
to the concept of space or information or consciousness? 00:13:31.280 |
One of the things that the craft supposedly do 00:13:42.000 |
like the two are not necessarily mutually exclusive 00:13:48.700 |
that they create intentional amounts of exposure 00:14:03.180 |
I've just been recently reading more and more 00:14:08.240 |
and you have orbiting black holes that orbit each other. 00:14:21.520 |
I think about what it would be like to ride those waves 00:14:32.820 |
and expanding us in all dimensions, including time. 00:14:48.160 |
Travels at a speed, which is a very weird thing to say 00:14:57.460 |
You could argue it's faster than the speed of light. 00:15:04.500 |
to summon enough energy to generate black holes 00:15:31.520 |
a very large size and very dense, but perhaps that's it. 00:15:43.640 |
we wouldn't even be able to make sense of that, 00:15:46.960 |
of the physics that results in an alien species 00:15:50.360 |
that's able to control gravity at that scale. 00:15:53.380 |
- I think you just jumped up the Kardashev scale so far. 00:15:56.760 |
You're not just harnessing the power of a star, 00:15:59.080 |
but harnessing the power of mutually rotating black holes. 00:16:02.660 |
That's way above my physics pay grade to think about, 00:16:08.900 |
including even non-rotating black hole versions 00:16:15.600 |
I think, you can talk with Eric more about that. 00:16:29.280 |
on our ability to get to other habitable planets in time. 00:16:39.340 |
orbiting black holes is not on the first page for you. 00:16:59.740 |
Venus potentially having very bacteria-like life forms. 00:17:10.620 |
it's a little bit scary, but mostly really exciting 00:17:17.960 |
all around us, teeming, having little societies. 00:17:22.920 |
And whether there's properties about that kind of life, 00:17:29.660 |
if those colonies of single-cell type organisms, 00:17:51.240 |
If they're different, that's also really exciting 00:17:53.900 |
'cause there's life forms everywhere that are not like us. 00:18:11.420 |
for us to open our aperture on what life and consciousness 00:18:14.780 |
and self-replicating possibilities could look like. 00:18:18.220 |
The question on are they different or the same, 00:18:22.660 |
that is the same in some ways and different in other ways. 00:18:25.560 |
When you take the thing that we call an invasive species, 00:18:37.440 |
it might be devastating to that whole ecosystem 00:18:40.980 |
that it messes up the self-stabilizing dynamics 00:18:50.000 |
in ways where we could still figure out a way 00:18:53.020 |
to inhabit a biosphere together or fundamentally not, 00:19:05.020 |
- Well, we talked offline about mimetic theory, right? 00:19:09.740 |
It seems like if there were sufficiently different 00:19:19.540 |
If we're close enough together to where we'd be competing, 00:19:23.200 |
then you're getting into the world of viruses and pathogens 00:19:26.220 |
and all those kinds of things to where we would, 00:19:50.900 |
that got introduced because it either output something 00:19:54.140 |
that was toxic or utilized up the same resource too quickly 00:19:56.620 |
and it just replicated faster and mutated faster. 00:20:16.180 |
- For one final time, putting your alien/god hat on 00:20:24.260 |
do you think about the 7.8 billion people on Earth 00:20:40.240 |
What's the proper framework through which to analyze it, 00:20:47.940 |
when you have asked the question the same way 00:20:56.640 |
- I would indeed ask the question the same way, 00:21:01.860 |
but I would be less confident about your conclusions. 00:21:12.580 |
but I would nevertheless ask it the same way, yes. 00:21:15.300 |
- Well, let's go back further and smaller then. 00:21:29.140 |
probably sub Dunbar number, sub 150 people tribe, 00:21:45.140 |
Well, could those individuals exist without the group? 00:22:01.980 |
trying to figure out stone tools and protection 00:22:22.020 |
and being affected by the group that they're part of. 00:22:28.900 |
which is theories that orient towards the collective 00:22:36.380 |
And ones that orient towards the individual, liberalism 00:22:39.980 |
And I think there's very obvious failure modes 00:22:45.260 |
is more interesting to me than either of them. 00:22:49.540 |
around how to have a virtuous process between those. 00:23:08.940 |
They're not, it's a matter of degrees, I suppose. 00:23:18.740 |
The social network within which they've been brought up 00:23:23.740 |
through which they've developed their intelligence 00:23:27.340 |
or is it their own sovereign individual self? 00:23:41.100 |
how much do we need others for our development? 00:23:44.460 |
- Yeah, I think we have a weird sense of this today 00:23:48.500 |
relative to most previous periods of sapien history. 00:23:53.340 |
I think the vast majority of sapien history is tribal. 00:24:00.580 |
two or three hundred thousand years of Homo sapiens in little tribes 00:24:05.500 |
where they depended upon that tribe for survival 00:24:08.420 |
and excommunication from the tribe was fatal. 00:24:12.820 |
I think they, and our whole evolutionary genetic history 00:24:20.660 |
And then we still depended upon extended families 00:24:28.980 |
where I can provide something to the market to get money, 00:24:33.020 |
to be able to get other things from the market 00:24:35.900 |
It's almost like disintermediating our sense of need, 00:24:38.140 |
even though your and my ability to talk to each other 00:24:43.140 |
using these mics and the phones that we coordinated on 00:24:50.860 |
but we don't notice that we depend upon them. 00:24:58.140 |
obviously you don't even get to a baby without a mom. 00:25:00.260 |
We were dependent, we depended upon each other, right? 00:25:06.480 |
But if we take that baby and we put it out in the wild, 00:25:20.460 |
and it doesn't learn the small motor articulation 00:25:39.440 |
- Is it possible that it affects basically 90% of us 00:25:50.020 |
The whole thing is the connection between us humans 00:26:01.500 |
forces us to think very differently about human society 00:26:09.940 |
- I just have to object to the no better than apes 00:26:12.900 |
'cause better here, I think you mean a specific thing 00:26:19.880 |
And I think the idea of humans as better than nature 00:26:32.340 |
But when we say what is unique about homo sapien capacity 00:26:45.200 |
We believe our tool creation and our language creation 00:26:50.040 |
and our coordination are all kind of the results 00:26:52.120 |
of a certain type of capacity for abstraction. 00:27:03.280 |
So a chimp will notice that a rock can cut a vine 00:27:10.080 |
And experientially it'll use the sharper rock. 00:27:22.880 |
What is the abstract principle called sharpness 00:27:38.880 |
So I do think our ability to coordinate with each other 00:27:45.640 |
- I wonder if that coordination, that connection 00:27:49.320 |
is actually the thing that gives birth to consciousness. 00:27:52.080 |
That gives birth to, well let's start with self-awareness. 00:28:05.480 |
I'm not sure, maybe dogs, something like that. 00:28:12.400 |
See it'd be interesting if that theory of mind 00:28:15.120 |
is what leads to consciousness in the way we think about it. 00:28:24.920 |
I have an inkling sense that that only exists 00:28:30.880 |
That doesn't come with the hardware and the software 00:28:43.160 |
I think we think that consciousness is fundamental. 00:28:57.640 |
that the illusion of consciousness is consciousness. 00:29:12.040 |
And that experience, you want to richly experience it 00:29:16.440 |
so that I could empathize with your experience. 00:29:22.420 |
Mostly because it allows you to then create robots 00:29:25.780 |
that become conscious not by being quote unquote conscious 00:29:30.300 |
but by just learning to fake it 'til they make it. 00:29:53.320 |
if they're sufficiently convincing to humans. 00:30:16.860 |
This is not the trajectory I was expecting, nor you. 00:30:34.240 |
What do we mean by awareness, qualia, theory of mind? 00:30:56.280 |
And then there was some writing back and forth 00:31:05.960 |
and logical fallacies and philosophy of science 00:31:21.240 |
And Sam was saying, no, there is consciousness 00:31:29.780 |
it was 'cause the other one didn't understand 00:31:30.880 |
the philosophy of science or logical fallacies. 00:31:36.800 |
Sam said something that I thought was quite insightful 00:31:45.100 |
It seems that we simply have different intuitions about this. 00:31:54.280 |
At the level of symbol grounding might be quite different. 00:32:01.640 |
enough life experiences that what is being referenced 00:32:06.640 |
This is why when trying to address these things 00:32:14.900 |
"we end up importing different ideas and bad ideas 00:32:17.680 |
"right into the nature of the language that we're using." 00:32:34.760 |
- Well, and also allowing our minds to drift together 00:32:38.280 |
for a time so that our definitions of these terms align. 00:32:42.840 |
I think there's a beauty that some people enjoy with Sam 00:32:49.460 |
that he is quite stubborn on his definitions of terms 00:32:54.460 |
without often clearly revealing that definition. 00:33:01.460 |
you could sense that he can deeply understand 00:33:12.460 |
that not only does he think that free will is an illusion, 00:33:26.020 |
and just be like for minutes at a time, hours at a time, 00:33:31.020 |
like really experience as if he has no free will. 00:33:43.020 |
he's very sure that consciousness is fundamental. 00:33:48.300 |
that's subjectively experiencing the floating 00:34:01.300 |
There's some aspect to which the terminology there 00:34:06.500 |
- So that's a particular kind of meditative experience. 00:34:13.620 |
thousands of years ago described similar experiences 00:34:18.380 |
There are other types of phenomenal experience 00:34:23.780 |
that are the phenomenal experience of pure agency. 00:34:35.300 |
And that rather than a creator agent God in the beginning, 00:34:38.980 |
there's a creative impulse or a creative process. 00:34:47.080 |
And I think the types of experiences we've had, 00:34:53.140 |
and then one, the types of experience we've had 00:34:55.860 |
make a big difference to the nature of how we do symbol grounding. 00:34:58.600 |
The other thing is the types of experiences we have 00:35:18.100 |
but I wanna come back to what the impulse was 00:35:21.420 |
that was interesting around what is consciousness 00:35:24.180 |
and how does it relate to us as social beings? 00:35:31.340 |
- Right, you're keeping us on track, which is wonderful. 00:35:41.560 |
and how does the social impulse connect to consciousness? 00:35:44.760 |
Is consciousness a consequence of that social connection? 00:35:50.700 |
- I'm gonna state a position and not argue it 00:35:53.040 |
'cause it's honestly, like it's a long, hard thing to argue 00:35:56.300 |
and we can totally do it another time if you want. 00:36:04.220 |
as an emergent property of biology or neural networks. 00:36:31.180 |
as experience, sensation, desire, emotion, phenomenology, 00:36:44.160 |
But all of the physical stuff, the third person stuff, 00:36:47.920 |
has position and momentum and charge and stuff like that 00:36:54.460 |
I think of the nature of first person and third person 00:37:06.760 |
So I think about the evolution of third person 00:37:10.780 |
from subatomic particles to atoms to molecules to on and on. 00:37:14.460 |
I think about a similar kind of and corresponding evolution 00:37:25.460 |
and to higher orders of it and that there's correspondence, 00:37:37.300 |
or do we reduce first person to third person. 00:37:40.340 |
Obviously, Bohm talked about an implicate order 00:37:56.180 |
So rather than say, does consciousness emerge from, 00:37:59.280 |
I'll talk about do higher capacities of consciousness 00:38:10.940 |
but increased complexity within the nature of first person 00:38:24.580 |
where it goes just from basic sensation to emotion, 00:38:49.880 |
So would I say that obviously the sapien brain 00:38:54.300 |
is pretty unique and a single sapien now has that, right? 00:39:03.680 |
So the group made it, now that brain is there. 00:39:06.020 |
Now, if I take a single person with that brain 00:39:08.380 |
out of the group and try to raise them in a box, 00:39:10.780 |
they'll still not be very interesting even with the brain. 00:39:33.160 |
- As a small aside, as you're talking about the biology, 00:39:36.880 |
let me comment that I spent, this is what I do. 00:39:44.940 |
trying to do research on how many computations 00:39:49.260 |
the brain performs and how much energy it uses 00:39:59.960 |
So that's 2 times 10 to the 16 computations. 00:40:03.480 |
So synaptic firings per second that the brain does. 00:40:07.680 |
And that's about a million times faster than the, 00:40:11.100 |
let's say the 20-thread state-of-the-art Intel CPU, 00:40:21.200 |
And then there's similar calculation for the GPU 00:40:28.700 |
that it takes about 10 watts to run the brain. 00:40:32.440 |
And then what does that mean in terms of calories per day, 00:40:34.760 |
kilocalories, that's about, for an average human brain, 00:40:48.100 |
where you're doing about 20 quadrillion calculations 00:41:17.540 |
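A rough Python sketch of this back-of-envelope arithmetic. The 2 x 10^16 firings per second and the 10-watt figure are the estimates quoted above; the CPU throughput and power draw are assumed round numbers for comparison, not measured values:

```python
# Back-of-envelope brain vs. CPU comparison; all figures are rough estimates.
SYNAPTIC_OPS_PER_SEC = 2e16  # ~20 quadrillion synaptic firings/sec (quoted above)
BRAIN_WATTS = 10             # ~10 W to run the brain (quoted above)
CPU_OPS_PER_SEC = 2e10       # assumed throughput of a 20-thread desktop CPU
CPU_WATTS = 100              # assumed CPU power draw

speedup = SYNAPTIC_OPS_PER_SEC / CPU_OPS_PER_SEC           # ~1e6, "a million times"
brain_joules_per_op = BRAIN_WATTS / SYNAPTIC_OPS_PER_SEC   # ~5e-16 J per firing
cpu_joules_per_op = CPU_WATTS / CPU_OPS_PER_SEC            # ~5e-9 J per op

# Daily energy: 10 W for 86,400 seconds, converted at 4184 J per kilocalorie.
kcal_per_day = BRAIN_WATTS * 86_400 / 4184                 # ~200 kcal/day

print(f"{speedup:.0e}x faster, {brain_joules_per_op:.0e} vs {cpu_joules_per_op:.0e} J/op, "
      f"~{kcal_per_day:.0f} kcal/day")
```

On those assumptions the brain comes out roughly ten million times more energy-efficient per operation, which is the force of the comparison being made here.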
Two, the relevant computations are synaptic firings 00:41:20.940 |
and that there's nothing other than synaptic firings 00:41:27.940 |
There's a very famous neuroscientist at Stanford 00:41:34.060 |
who did a lot of the pioneering work showing that glial cells 00:41:40.060 |
do a huge amount of the thinking, not just neurons. 00:41:42.260 |
And it opened up this entirely different field 00:41:44.300 |
of like what the brain is and what consciousness is. 00:41:46.620 |
You look at Damasio's work on embodied cognition 00:41:48.960 |
and how much of what we would consider consciousness 00:41:56.820 |
involving lots of other cells and signal communication. 00:42:03.420 |
And even though the Penrose-Hameroff conjecture 00:42:05.940 |
is probably not right, is there something like that 00:42:09.340 |
where we're actually having to look at stuff happening 00:42:11.180 |
at the level of quantum computation and microtubules. 00:42:22.700 |
this has become like an infomercial for the human brain. 00:42:38.340 |
and then the synaptic firings we're referring to 00:42:42.540 |
It could be the mechanical transmission of information, 00:42:44.580 |
there's chemical transmission of information, 00:42:52.380 |
Not to mention, which I'm learning more and more about, 00:43:12.100 |
But on the topic of sort of the greater degrees 00:43:22.260 |
I think few things are as beautiful and inspiring 00:43:40.020 |
So one of the simplest things to do that with 00:43:55.500 |
but there's so few things that are as awe-inspiring to me 00:44:05.860 |
and not being able to predict what the heck it looks like. 00:44:08.380 |
And it creates incredibly beautiful complexity 00:44:14.320 |
looks like there's actual organisms doing things 00:44:18.140 |
that operate on a scale much higher 00:44:28.940 |
that's cells that are born and die, born and die, 00:44:31.860 |
and they only know about each other's neighbors. 00:44:43.740 |
hundreds or thousands of cells, and they're moving. 00:44:46.900 |
They're moving around, they're communicating, 00:44:54.380 |
before you remember, that the simple rules on cells 00:45:11.420 |
that generalizes the behavior of the large organisms. 00:45:16.420 |
We can only hope to come up 00:45:23.100 |
or the fundamental rules of that system, I suppose. 00:45:27.360 |
Everything we know about the mathematics of those systems, 00:45:29.840 |
it seems like we can't, really in a nice way, 00:45:40.940 |
So what do you make of the emergence of complexity 00:45:53.180 |
flocking behavior, the murmuration, can be computer-coded. 00:45:56.980 |
It's not a very hard set of rules to be able to see 00:45:59.580 |
some of those really amazing types of complexity. 00:46:06.420 |
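The Game-of-Life-style cellular automata described above really are this small in code. A minimal sketch of one update step using NumPy, where each cell sees only its eight neighbors; the glider is the standard example of the "organisms" the rules give rise to:

```python
import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    """One step of Conway's Game of Life: purely local rules, 8 neighbors per cell."""
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)  # each shifted copy counts one neighbor
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A live cell survives with 2 or 3 neighbors; a dead cell is born with exactly 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# A glider: five cells that "walk" diagonally across the grid under these rules.
grid = np.zeros((20, 20), dtype=int)
grid[1:4, 1:4] = [[0, 1, 0], [0, 0, 1], [1, 1, 1]]
for _ in range(8):
    grid = life_step(grid)
print(int(grid.sum()))  # still 5 live cells, now shifted; nothing "knows" it is a glider
```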
and some of the sub-disciplines like stigmergy 00:46:08.620 |
are studying how following fairly simple responses 00:46:13.100 |
to a pheromone signal do ant colonies do this amazing thing 00:46:16.140 |
where what you might describe as the organizational 00:46:20.860 |
is so profound relative to what each individual ant is doing. 00:46:31.180 |
I would like to, but unfortunately, as with topics 00:46:35.260 |
like ETs, I've only skimmed Wolfram's A New Kind of Science 00:46:42.740 |
and not read the whole thing or his newer work since. 00:46:45.380 |
But his idea of the four basic kind of categories 00:46:50.460 |
of emergent phenomena that can come from cellular automata 00:47:10.180 |
It does not instantly make me think that biology 00:47:15.460 |
is operating on a similarly small set of rules 00:47:22.060 |
And so if you look at say Santa Fe Institute, 00:47:30.740 |
his work, you should really get him on your show. 00:47:35.420 |
one of Kauffman's more recent books after Investigations 00:47:41.740 |
and it had to do with some of these exact questions 00:47:49.180 |
And he was very interested in highly non-ergodic systems 00:48:03.300 |
someone who spent some time at Santa Fe Institute 00:48:09.020 |
who some people call the father of anthro-complexity 00:48:17.820 |
that modeling humans as termites really doesn't cut it. 00:48:24.100 |
to the same pheromone stimulus using stigmergy 00:48:30.500 |
but it really doesn't work for trying to make sense 00:48:34.660 |
and general relativity creation and stuff like that. 00:48:37.580 |
And it's because the termites are not doing abstraction, 00:48:43.420 |
and making choices now based on forecasts of the future, 00:48:56.100 |
And with humans, our uniqueness one to the next 00:49:08.940 |
They're basically fungible within a class, right? 00:49:12.100 |
but within a class they're basically fungible 00:49:21.980 |
Don't you think we humans are deceiving ourselves 00:49:27.140 |
isn't there some sense in which this emergence 00:49:28.940 |
just creates different higher and higher levels 00:49:36.820 |
That we're all equally dumb but at different scales? 00:49:42.180 |
I think that hydrogen atoms are more similar to each other 00:49:51.260 |
And I think that cells are more similar to each other 00:50:03.020 |
The R-selected species where you have a whole, 00:50:09.460 |
you're not looking for as much individuality within 00:50:16.060 |
to cover the evolutionary search space within an individual. 00:50:18.660 |
You're looking at it more in terms of a numbers game. 00:50:22.740 |
So yeah, I would say there's probably more difference 00:50:26.820 |
than there is between one Cape buffalo and the next. 00:50:29.620 |
- Given that, it would be interesting to get your thoughts 00:50:32.340 |
about mimetic theory where we're imitating each other 00:50:54.220 |
where maybe you can explain it from your perspective, 00:51:06.660 |
- Well, imitation is not unique to humans, right? 00:51:10.140 |
So a certain amount of learning through observing 00:51:19.100 |
It's actually kind of worth speaking to this for a moment. 00:51:26.620 |
We've even seen teaching an ape sign language 00:51:29.980 |
and then the ape teaching other apes sign language. 00:51:36.660 |
And that needs to happen if they need to learn 00:51:40.820 |
or develop capacities that are not just coded 00:51:45.860 |
they're learning new things based on the environment. 00:51:48.700 |
And so based on someone else learned something first. 00:51:58.500 |
and how much it's learning is a very interesting question. 00:52:02.180 |
And I think this is a place where humans really show up 00:52:13.580 |
That the closest relatives to us, if we look at a chimp, 00:52:26.620 |
The fact that it takes a human a year to be walking 00:52:40.980 |
And not only can we not hold on to mom in the first day, 00:52:44.980 |
it's three months before we can move our head volitionally. 00:52:48.540 |
So it's like, why are we embryonic for so long? 00:52:51.940 |
Basically it's like it's still fetal on the outside. 00:52:56.780 |
It had to be, because we couldn't keep growing inside 00:53:03.260 |
So here's a place where there's a co-evolution 00:53:19.900 |
Which is because we have the abstraction to make tools, 00:53:23.500 |
we change our environments more than other creatures 00:53:26.700 |
The next most environment modifying creature to us 00:53:32.940 |
you fly into LAX and you look at the just orthogonal grid 00:53:38.940 |
And we've recently come into the Anthropocene 00:53:43.340 |
more from human activity than geological activity 00:54:02.060 |
We could put on clothes and go to a cold place. 00:54:05.060 |
And this is really important because we actually went 00:54:07.980 |
and became apex predators in every environment. 00:54:12.020 |
The polar bear can't leave the Arctic, right? 00:54:24.660 |
or at least with our tools adapted to the environment. 00:54:33.060 |
we're incredibly good at becoming apex predators. 00:54:35.900 |
- Yes, and nothing else can do that kind of thing. 00:54:45.980 |
through evolutionary process that's super slow. 00:54:49.420 |
creates co-selective process with their environment. 00:54:52.140 |
So as the predator becomes a tiny bit faster, 00:55:05.620 |
than anything else could increase its resilience to it. 00:55:08.460 |
As a result, we started outstripping the environment 00:55:14.060 |
and going and becoming apex predator everywhere. 00:55:15.580 |
This is why we can't keep applying apex predator theories 00:55:26.140 |
An orca can eat one big fish at a time, like one tuna, 00:55:32.620 |
And we can put a mile long drift net out on a single boat 00:55:46.700 |
that we can then genetically engineer different creatures. 00:55:49.620 |
We can extinct species, we can devastate whole ecosystems. 00:55:52.860 |
We can make built worlds that have no natural things 00:55:56.260 |
We can build new types of natural creatures, synthetic life. 00:56:04.260 |
And little gods that behave as apex predators 00:56:06.100 |
cause a problem, which is kind of core to my assessment of the world. 00:56:13.020 |
So a predator is somebody that effectively can mine 00:56:18.020 |
the resources from a place for their survival, 00:56:22.220 |
or is it also just purely like higher level objectives 00:56:27.660 |
of violence? And can predators be predators 00:56:31.060 |
towards each other, towards the same species? 00:56:34.740 |
Like are we using the word predator sort of generally, 00:56:37.980 |
which then connects to conflict and military conflict, 00:56:41.900 |
violent conflict in the space of human species? 00:56:45.700 |
- Obviously we can say that plants are mining 00:56:47.900 |
the resources of their environment in a particular way, 00:56:50.260 |
using photosynthesis to be able to pull minerals 00:56:52.820 |
out of the soil and nitrogen and carbon out of the air 00:56:57.300 |
And we can say herbivores are being able to mine 00:57:19.740 |
but animal, which requires some type of violence capacity 00:57:26.940 |
So it requires some capacity to overtake something 00:57:40.580 |
Because are we, did we just move from the Savannah 00:57:45.780 |
to the Arctic and we need to learn new stuff? 00:57:59.420 |
Horses today in the wild and horses 10,000 years ago 00:58:04.300 |
And so since we make tools and we evolve our tools 00:58:21.620 |
Which is gonna be very important to learn the toolmaking, 00:58:35.580 |
on how to behave that is useful to that particular time. 00:58:44.340 |
And this is also where the uniqueness can go up, right? 00:58:46.980 |
Is because we are less just the result of the genetics 00:58:49.660 |
and that means the kind of learning through history 00:58:55.380 |
it's almost like our hardware selected for software, right? 00:59:04.020 |
I have problems with computer metaphors for biology, 00:59:20.480 |
Changes on the same fundamental genetic substrate, 00:59:24.840 |
what we're doing with these brains and minds and bodies 00:59:30.560 |
And so now Girard specifically was looking at 00:59:35.560 |
when we watch other people talking, so we learn language, 00:59:41.480 |
you and I would have a hard time learning Mandarin today 00:59:44.680 |
We'd be learning how to conjugate verbs and stuff, 00:59:47.880 |
without anyone even really trying to teach it 00:59:52.440 |
They're obviously more neuroplastic than we are 00:59:55.300 |
and all their attention is allocated to that. 00:59:57.160 |
But they're also learning how to move their bodies 00:59:59.560 |
and they're learning all kinds of stuff through mimesis. 01:00:07.880 |
They learn desire by watching what other people want. 01:00:11.780 |
people end up wanting what other people want. 01:00:29.880 |
within a group of people will build over time. 01:00:32.840 |
This is a very, very crude interpretation of the theory. 01:00:41.360 |
I'm loosely familiar but haven't internalized it, 01:00:54.120 |
it's a very interesting way to think about the world. 01:01:29.540 |
because then you can pick which group you can join. 01:01:43.680 |
I mean, it's depressing that we're so unoriginal, 01:02:04.680 |
- I'm gonna discuss here where I think we need 01:02:15.760 |
which is who you're around affects who you become. 01:02:24.620 |
everybody, I think, knows that if you got put 01:02:30.320 |
aspects of your personality would have to adapt 01:02:38.960 |
you would be different with your same genetics. 01:02:40.680 |
Like, just, there's no real question about that. 01:02:43.360 |
And that even today, if you hang out in a place 01:02:50.360 |
or all people who are obese as your roommates, 01:02:56.980 |
Like the behavioral science of this is pretty clear. 01:03:02.320 |
of the five people we spend the most time around. 01:03:13.980 |
to become more self-determined is to be self-determined 01:03:16.760 |
about the environments they wanna put themselves in. 01:03:18.880 |
Because to the degree that there is some self-determination 01:03:28.500 |
that is predisposing the things that you want. 01:03:35.120 |
- Or perhaps also to, there's probably interesting ways 01:03:41.480 |
like form connections that have this perfect tension 01:03:46.040 |
in all directions to where you're actually free 01:03:49.720 |
because the set of wants within your circle of interactions 01:03:54.720 |
is so conflicting that you're free to choose whichever one. 01:04:00.380 |
as opposed to everybody aligned like a flock of birds. 01:04:05.200 |
that all of the dialectics would be balanced. 01:04:09.340 |
So, if you have someone who is extremely oriented 01:04:14.340 |
to self-empowerment and someone who's extremely oriented 01:04:22.940 |
If you have both of them being inhabited better than you 01:04:27.740 |
by the same person, spending time around that person 01:04:31.400 |
I think the thing you just mentioned is super important 01:04:36.880 |
which is I think one of the fastest things people can do 01:04:46.260 |
but their meaningful problem-solving communication 01:04:52.140 |
capacity to participate as a citizen with other people 01:04:56.260 |
is to be seeking dialectical synthesis all the time. 01:04:59.340 |
And so, in the Hegelian sense, if you have a thesis, 01:05:08.580 |
and Marxist kind of communism on the other side. 01:05:26.100 |
well, the individuals are conditioned by their environment; 01:05:27.980 |
who would choose to be born into Darfur rather than Finland? 01:05:30.940 |
So, we actually need to collectively make environments 01:05:37.280 |
because that the environment conditions individuals. 01:05:42.280 |
And then Hegel's ideas, you have a synthesis, 01:05:50.120 |
And so, it is actually at a higher order of complexity. 01:05:57.080 |
that the proponents of it are like, totally, you got that? 01:06:18.380 |
If I don't go to a higher order, then there's war. 01:06:21.380 |
And, but then the higher order thing would be, 01:06:23.280 |
well, it seems like the individual does affect the commons 01:06:28.880 |
It also seems like the collective conditions individuals, 01:06:41.400 |
show up differently than most people born into the Hamptons. 01:06:47.880 |
and have you take your child from the Hamptons 01:06:50.300 |
then like, come on, be realistic about this thing. 01:07:15.600 |
So can we make it to where individuals are creating wholes 01:07:20.520 |
that are better for conditioning other individuals? 01:07:23.680 |
that are conditioning increased agency and sovereignty? 01:07:32.800 |
and sometimes it's not just thesis and antithesis, 01:07:39.400 |
This is not just, can I take the perspective, 01:07:42.220 |
Am I actively trying to inhabit other people's perspective? 01:07:51.480 |
both the sense-making about reality and the values, 01:07:56.120 |
Then, just like I wanna seek those perspectives, 01:08:02.800 |
set of understandings that could fulfill the values of 01:08:10.140 |
Maybe I won't get it, but I wanna be seeking it, 01:08:12.000 |
and I wanna be seeking progressively better ones. 01:08:14.460 |
So this is perspective seeking, driving perspective taking, 01:08:30.220 |
- Would you put a title of dialectic synthesis 01:08:38.480 |
Like, it's not just empathy, it's empathy with rigor. 01:08:48.380 |
and then try to find a higher order synthesis. 01:08:50.880 |
- Okay, so I remember last night you told me, 01:08:59.060 |
and you felt that you had suffered in some ways 01:09:01.100 |
that they had suffered, and so you could trust them. 01:09:03.580 |
Shared pathos, right, creates a certain sense 01:09:05.800 |
of kind of shared bonding and shared intimacy. 01:09:10.800 |
of somebody else, and feeling the depth of their sentience. 01:09:14.620 |
I don't wanna fuck them over anymore, I don't wanna hurt them. 01:09:20.940 |
when I go and inhabit the perspective of the other people, 01:09:23.140 |
if they feel that's really gonna mess them up, right? 01:09:34.320 |
I can never know what it's like to be them perfectly, 01:09:38.200 |
which is my most rigorous attempt is still not it. 01:09:44.620 |
to know what it's like to be a woman is still not it. 01:09:46.860 |
I have no question that if I was actually a woman, 01:09:54.500 |
So there's a humility in that which keeps me listening, 01:09:58.740 |
but I want to, and I'm gonna keep trying better to. 01:10:07.880 |
It has to be something that could meet the values 01:10:15.060 |
and that could offer a way forward that is more agreeable 01:10:18.820 |
than the partial perspectives at war with each other. 01:10:21.220 |
- But so the more you succeed at this empathy 01:10:52.820 |
it's hard to feel it 'cause it's just too devastating. 01:10:55.900 |
And so a lot of people go numb and even go nihilistic 01:11:01.420 |
So as I actually become more empowered as an individual 01:11:05.660 |
I also become more empowered to be more empathetic for others 01:11:22.100 |
that would solve a lot of problems in society 01:11:27.320 |
So if you have a bunch of little agents behaving in this way, 01:11:32.580 |
my intuition, there'll be interesting complexities 01:11:37.460 |
it will create a society that's very different 01:11:39.780 |
and recognizably better than the one we have today. 01:11:49.300 |
but this brings us back to Girard, which we didn't answer. 01:11:53.400 |
- 'Cause about how to get past the conflict theory. 01:11:54.980 |
- Yes, you know the Robert Frost poem about the two paths 01:11:57.300 |
and you never had time to turn back to the other. 01:12:01.340 |
We're gonna be living that poem over and over again. 01:12:13.620 |
therefore fundamental conflict based in our desire 01:12:16.580 |
because we want the thing that somebody else has. 01:12:32.180 |
So the, and you know, we become on the Palestinian side 01:12:36.180 |
or the Israeli side or the communist or capitalist side 01:12:38.100 |
or the left or right politically or whatever it is. 01:12:40.860 |
And until eventually the conflict energy in the system 01:12:50.460 |
And you know, Girard talks about why scapegoating 01:12:52.940 |
was kind of a mechanism to minimize the amount of violence. 01:12:55.700 |
Let's blame a scapegoat as being more relevant 01:13:00.640 |
But if we all believe it, then we can all kind of 01:13:03.820 |
- It's a really interesting concept, by the way. 01:13:06.300 |
I mean, you went, you beautifully summarized it, 01:13:13.460 |
And then they find the other, some group that's the other, 01:13:30.180 |
So this now gets to something more like Buddha said, right? 01:13:34.860 |
Girard and Buddha would kind of agree in this way. 01:13:43.420 |
it's a compelling description of human history 01:13:56.020 |
in such an elegant and beautiful and simple way. 01:13:59.700 |
It's a friend who grew up Aboriginal Australian 01:14:04.700 |
and is a scholar of Aboriginal social technologies. 01:14:12.660 |
And he's like, nah, man, Girard just made shit up 01:14:24.940 |
and then we would dance and then everybody'd be fine. 01:14:29.260 |
everyone would like kind of physically get the energy out. 01:14:32.100 |
we'd have positive bonding, and then we're fine. 01:14:37.100 |
- I think that's called the Joe Rogan theory of desire, 01:14:44.420 |
that you don't do enough hard shit in your day. 01:14:50.900 |
and running on the treadmill gets all the demons out. 01:14:57.100 |
with taking an idea that seems too explanatory 01:15:00.460 |
and then taking it as a given and then saying, 01:15:04.500 |
that conflict is inexorable because mimetic desire 01:15:08.940 |
and therefore how do we deal with the inexorability 01:15:10.940 |
of the conflict and how to sublimate violence? 01:15:12.940 |
Well, no, the whole thing might be actually gibberish. 01:15:19.100 |
So the deeper question is under which conditions 01:15:23.820 |
What do those other conditions make possible and look like? 01:15:43.820 |
- Humility again, it goes back to just having the humility 01:15:46.340 |
that you don't have a perfect model of reality. 01:15:48.380 |
- The model of reality could never be reality. 01:15:52.020 |
The process of modeling is inherently information reduction. 01:16:13.540 |
sadly impossible to create a model of cellular automata, 01:16:41.860 |
It's so simple, but you can't predict what's going to be, 01:16:53.140 |
just anything about it, what's gonna happen in the future. 01:17:01.860 |
'Cause then we can't make sense of this world, 01:17:29.980 |
So you disagree with Jordan Peterson, 01:17:29.980 |
I take, I have no idea if it was intended this way. 01:17:45.860 |
And so I'm just interpreting it a way I like. 01:17:51.140 |
To me, the way I interpret that is meaningful, 01:18:00.580 |
but I know my best understanding of it is never complete. 01:18:08.100 |
where I tried to make some kind of predictive capacity 01:18:18.860 |
and then some set of mechanisms that are involved. 01:18:21.980 |
And what we find is that it can be super useful. 01:18:24.740 |
Like Newtonian gravity can help us do ballistic curves 01:18:30.020 |
And then we get to the place where it doesn't explain 01:18:36.840 |
And at each time, what we're finding is we excluded stuff. 01:18:41.840 |
And it also doesn't explain the reconciliation 01:18:59.620 |
They can be a complete description of what's happening 01:19:11.180 |
And I would say this is key to what we call complexity 01:19:16.400 |
which is a distinction Dave Snowden made well 01:19:28.380 |
is that no matter how you model the complex system, 01:19:34.020 |
- Can you elaborate on the complex versus the complicated? 01:19:40.020 |
the phase space of all the things that it can do. 01:20:01.920 |
- That are basically the application of models. 01:20:31.500 |
Where we keep finding like, oh, it's just the genome. 01:20:33.560 |
Oh, well now it's the genome and the epigenome. 01:20:55.540 |
Self-terminating is a really interesting idea 01:20:59.180 |
First of all, what is a self-terminating system? 01:21:03.860 |
And I think you have a sense, correct me if I'm wrong, 01:21:22.760 |
- Okay, so if we look at human societies historically, 01:21:37.840 |
So they had a life cycle, they died for some reason. 01:21:40.560 |
So we don't still have the early Egyptian empire 01:21:44.120 |
or Inca or Maya or Aztec or any of those, right? 01:21:54.520 |
When we look at Easter Island, it was a self-termination. 01:21:57.640 |
So let's go ahead and take an island situation. 01:22:00.700 |
If I have an island and we are consuming the resources 01:22:05.280 |
faster than they can replicate themselves and there's a finite space there, 01:22:09.840 |
It's not gonna be able to keep doing that thing 01:22:12.080 |
'cause you'll get to a place of there's no resources left 01:22:23.760 |
and I'm actually growing our population in the process, 01:22:29.080 |
I might get an exponential curve and then hit a wall 01:22:34.760 |
rather than do an S curve or some other kind of thing. 01:22:51.080 |
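The contrast between hitting a wall and an S-curve is easy to see in a toy model. A sketch with made-up numbers, not an empirical model: exponential growth against a finite, slowly regenerating stock overshoots and crashes, while logistic growth levels off below the carrying capacity:

```python
# Toy island model; every number here is illustrative.
stock, regen = 1000.0, 5.0            # finite resource stock, small regeneration per step
pop, rate, need = 10.0, 0.25, 1.0     # population, growth rate, consumption per person

for _ in range(40):                   # exponential growth until the wall
    stock = min(1000.0, stock + regen)
    demand = pop * need
    if demand <= stock:
        stock -= demand
        pop *= 1 + rate               # keeps exponentiating while resources last
    else:
        pop *= stock / demand         # overshoot: crash to whatever the stock supports
        stock = 0.0
print(f"boom-and-bust endpoint: {pop:.1f}")   # collapses to the regeneration-limited level

K, p = 200.0, 10.0                    # S-curve alternative: logistic growth toward capacity K
for _ in range(40):
    p += rate * p * (1 - p / K)       # growth slows as p approaches K; no crash
print(f"logistic endpoint: {p:.1f}")  # levels off near K
```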
- So you're right that if you look at empires, 01:23:07.920 |
why all the previous ones failed, we can't ensure that. 01:23:11.000 |
And so I think it's very important to understand it well 01:23:12.840 |
so that we can have that be a designed outcome 01:23:18.680 |
- So in terms of consuming the resources, 01:23:24.560 |
especially when disaster is on the horizon, 01:23:33.720 |
We keep coming up with clever ways of avoiding disaster, 01:23:43.820 |
coming up with different ways to improve productivity 01:23:50.160 |
or get a lot more from the resources we have. 01:24:07.240 |
If there's more innovation than there is consumption. 01:24:18.720 |
but there are reasons for optimism and pessimism 01:24:33.120 |
there's what I would call naive techno-optimism, 01:24:39.760 |
that says stuff just has been getting better and better 01:24:50.400 |
and that stuff is gonna kind of keep getting better. 01:24:54.720 |
Supply and demand will solve the problems, whatever. 01:24:56.720 |
- Would you put Ray Kurzweil as well in that bucket? 01:25:01.040 |
Is there some specific people you have in mind 01:25:06.440 |
to where you essentially just have an optimism 01:25:14.040 |
- I don't think that anyone who thinks about it 01:25:25.680 |
- There might be a bias in the nature of the assessment. 01:25:29.720 |
I would also say there's kind of naive techno-pessimism 01:25:43.800 |
on why technology can't not result in our self-termination. 01:25:47.080 |
So we have to take it out before it gets any further. 01:25:49.560 |
But also if you read a lot of the X-risk community, 01:25:56.560 |
it's like our total number of existential risks 01:26:00.400 |
and the total probability of them is going up. 01:26:07.080 |
we have to hold together where our positive possibilities 01:26:11.960 |
and our risk possibilities are both increasing 01:26:19.100 |
all of the catastrophic risks have to not happen. 01:26:21.560 |
Any of the catastrophic risks happening is enough 01:26:25.040 |
to keep that positive outcome from occurring. 01:26:27.320 |
So how do we ensure that none of them happen? 01:26:31.160 |
let's have a civilization that doesn't collapse. 01:26:37.440 |
"The Collapse of Complex Societies" by Joseph Tainter. 01:26:40.240 |
It does an analysis of why many of the societies fell 01:26:55.600 |
how institutional decay in the collective intelligence 01:27:02.360 |
Obviously, Jared Diamond wrote a more popular book 01:27:07.640 |
the Antikythera mechanism has been getting attention 01:27:27.040 |
So what I'm interested in here is being able to say, 01:27:44.100 |
that isn't just trying to solve one type of failure, 01:27:51.500 |
Are there some underlying generator functions or patterns 01:27:57.140 |
And can we solve those and have that be the kernel 01:27:59.200 |
of a new civilizational model that is not self-terminating? 01:28:12.700 |
And I would say for the optimism to be grounded, 01:28:16.820 |
it has to actually be able to understand the risk space well 01:28:22.460 |
- So can we try to dig into some basic intuitions 01:28:27.460 |
about the underlying sources of catastrophic failures 01:28:35.020 |
that's built in into self-terminating systems? 01:28:37.420 |
So both the overconsumption, which is like the slow death, 01:28:41.000 |
and then there's the fast death of nuclear war 01:28:45.820 |
AGI, biotech, bioengineering, nanotechnology, 01:28:50.820 |
Nanobots are my favorite because it sounds so cool to me 01:28:58.220 |
that I could just know that I would be one of the scientists 01:29:01.140 |
that would be full steam ahead in building them 01:29:12.940 |
but when I go back home, I'd just, in my heart, 01:29:12.940 |
know the amount of excitement I feel is that of a dumb descendant of an ape, 01:29:17.460 |
'Cause you've also provided a kind of a beautiful, 01:29:46.940 |
general approach to this, which is this dialectic synthesis 01:29:56.960 |
that seems to be from the individual perspective 01:30:02.720 |
how to construct non-self-terminating systems. 01:30:10.720 |
I actually really respect Drexler for emphasizing Grey Goo 01:30:18.640 |
to make sure the world was paying adequate attention 01:30:36.980 |
and don't focus on the risks or pretend there aren't risks 01:30:51.460 |
to the commons or wherever happen to have it. 01:30:54.640 |
but now they have the power and capital associated. 01:31:04.480 |
So this is one of the issues we have to deal with 01:31:05.960 |
is some of the bad game theoretic dispositions 01:31:12.680 |
- And the key aspect to that, sorry to interrupt, 01:31:21.680 |
What's your favorite flavor in terms of ice cream? 01:31:30.600 |
what do you most worry about in terms of catastrophic risk 01:31:49.940 |
We don't have to go all the way back to the aliens 01:31:54.880 |
but to just recognize that for all of human history, 01:32:00.880 |
there were existential risks to civilizations 01:32:07.300 |
Like there were civilizations that were killed in war, 01:32:10.600 |
tribes that were killed in tribal warfare, whatever. 01:32:18.120 |
It's just, those were local phenomena, right? 01:32:33.220 |
about catastrophic risk, not from like a solar flare 01:32:37.380 |
but from something that humans would actually create 01:32:39.580 |
at a global level was World War II and the bomb. 01:32:42.900 |
Because it was the first time that we had tech big enough 01:32:45.180 |
that could actually mess up everything at a global level. 01:32:50.060 |
We just weren't powerful enough to do that before. 01:33:01.020 |
that there's the entire world before World War II, 01:33:05.780 |
to make a non-habitable biosphere, non-habitable for us. 01:33:32.660 |
It's generals and wars and empire expansions. 01:33:41.820 |
or preparation for war or something like that. 01:33:49.660 |
the civilization phase of being able to solve conflicts 01:33:56.320 |
where we could have a war that no one could win. 01:34:04.200 |
They could do diplomatic wars and cold war type stuff, 01:34:07.240 |
and they could fight proxy wars through other countries 01:34:19.520 |
And so we needed a new type of supervening government 01:34:38.240 |
since we have to stop war between at least the superpowers? 01:34:49.340 |
We've had lots of proxy wars during that time. 01:34:57.940 |
where the Bretton Woods solution is basically over, 01:35:03.300 |
- Can you describe the Bretton Woods solution? 01:35:26.020 |
we needed to be able to think about things globally, 01:35:28.380 |
where we had trade relationships with each other, 01:35:30.360 |
where it would not be profitable to war with each other. 01:35:47.600 |
because industrialization hadn't gotten far enough 01:35:49.740 |
to be able to do massive global industrial supply chains 01:35:56.300 |
almost all the electronics that we use today, 01:36:00.540 |
are made on six continents, made in many countries. 01:36:04.380 |
that could actually make many of the things that we have, 01:36:08.740 |
to the plastics and polymers and the et cetera. 01:36:25.380 |
there was the idea that everybody could keep having more 01:36:30.580 |
And so that was part of kind of the Bretton Woods 01:36:35.460 |
The other was that we'd be so economically interdependent 01:36:38.220 |
that blowing each other up would never make sense. 01:36:42.500 |
Now, it also brought us up into planetary boundaries faster, 01:37:02.500 |
and the cutting down of the trees and the climate change 01:37:04.980 |
and the toxic mining tailings going into the water 01:37:23.280 |
obviously accelerated the planetary boundary side. 01:37:30.140 |
but it started to be modeled with the Club of Rome 01:37:40.940 |
where you take stuff out of the earth faster, 01:37:56.740 |
And there's toxicity associated with both sides of this. 01:38:02.500 |
linear materials economy on a finite planet forever. 01:38:08.500 |
if there's an exponentiation in the monetary supply 01:38:11.660 |
because of interest and then fractional reserve banking, 01:38:11.660 |
you have to have growth of goods and services. 01:38:19.300 |
So that's that kind of thing that has happened. 01:38:21.700 |
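That growth obligation is just compound interest. A one-line sketch, with an arbitrary 3% rate standing in for the interest-driven expansion of the money supply:

```python
# Money supply compounding at rate r: M(t) = M0 * (1 + r)**t; doubles every ~23 years at 3%.
M0, r = 1.0, 0.03  # illustrative rate, not an empirical figure
for t in (10, 50, 100):
    print(t, round(M0 * (1 + r) ** t, 1))  # -> 1.3, 4.4, 19.2
```

If the monetary claims grow like that, the goods and services backing them have to grow with them, which is the point being made here about the materials economy.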
But you also see that when you get these supply chains 01:38:32.740 |
then affects the whole world in a much bigger area 01:38:39.540 |
and an issue that started in one part of China 01:38:42.780 |
affecting the whole world so much more rapidly 01:38:45.660 |
than would have happened before Bretton Woods, right? 01:38:52.460 |
And with a bunch of second and third order effects 01:39:05.660 |
and depend upon pesticides they don't produce. 01:39:11.420 |
in scale in Northern Africa and Iran and things like that 01:39:13.820 |
because they couldn't get the supplies of stuff in. 01:39:21.980 |
So you get this increased fragility and cascade dynamics 01:39:25.100 |
where a small problem can end up leading to cascade effects. 01:39:46.500 |
And there's a lot more types of catastrophe weapons. 01:39:50.220 |
We now have catastrophe weapons with weaponized drones 01:39:53.580 |
that can hit infrastructure targets, with bio, with, 01:39:56.180 |
in fact, every new type of tech has created an arms race. 01:40:02.060 |
or the other kind of intergovernmental organizations, 01:40:04.140 |
we haven't been able to really do nuclear deproliferation. 01:40:11.260 |
the race to hypersonics and things like that. 01:40:13.920 |
And every new type of technology that has emerged 01:40:19.560 |
And so you can't do mutually assured destruction 01:40:23.820 |
with multiple agents the way you can with two agents. 01:40:28.700 |
to create a stable Nash equilibrium that's forced. 01:40:40.540 |
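One way to see why the two-party logic doesn't generalize, offered as an illustration rather than anything claimed in the conversation: deterrence is pairwise, so the number of relationships that all have to stay stable grows quadratically with the number of armed agents, before even counting anonymous actors against whom retaliation has no return address:

```python
# Pairwise deterrence relationships among n armed agents: n choose 2.
def deterrence_pairs(n: int) -> int:
    return n * (n - 1) // 2

for n in (2, 3, 10, 50):
    print(n, deterrence_pairs(n))  # 2->1, 3->3, 10->45, 50->1225
```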
and multiple different types of catastrophe weapons, 01:40:42.900 |
including ones that can be much more easily produced 01:40:51.600 |
But weaponized drones hitting smart targets is not so hard. 01:40:55.380 |
There's a lot of other things where basically the scale 01:40:57.560 |
at being able to manufacture them is going way, way down 01:40:59.940 |
to where even non-state actors can have them. 01:41:06.560 |
and the decentralization of exponential tech, 01:41:09.500 |
what that means is decentralized catastrophe weapon capacity. 01:41:13.060 |
And especially in a world of increasing numbers 01:41:21.340 |
So I would say the Bretton Woods world doesn't prepare us 01:41:26.340 |
to be able to deal with lots of different agents, 01:41:29.260 |
having lots of different types of catastrophe weapons 01:41:31.540 |
you can't put mutually assured destruction on, 01:41:33.680 |
where you can't keep doing growth of the materials economy 01:41:36.840 |
in the same way because of hitting planetary boundaries 01:41:40.900 |
and where the fragility dynamics are actually now 01:41:46.100 |
So now we're, so like there was all the world before World War II, 01:41:49.220 |
and World War II is just recent from a civilization timescale. 01:41:54.620 |
It seems like a long time, but it is really not. 01:41:56.700 |
We get a short period of relative peace at the level 01:41:58.900 |
of superpowers while building up the military capacity 01:42:01.500 |
for much, much, much worse war the entire time. 01:42:04.340 |
And then now we're at this new phase where the things 01:42:07.600 |
that allowed us to make it through the nuclear age 01:42:10.940 |
are not the same systems that will let us make it 01:42:22.460 |
of many different types of exponential technology 01:42:26.220 |
is a key question when we're thinking about X-Risk. 01:42:30.020 |
- Okay, so, and I'd like to try to answer the how 01:42:35.020 |
a few ways, but first on the mutually assured destruction. 01:42:40.980 |
Do you give credit to the idea of two superpowers 01:42:46.460 |
not blowing each other up with nuclear weapons 01:43:18.900 |
not because you're afraid of mutually assured 01:43:22.260 |
self-destruction, but because we're human beings 01:43:47.260 |
I think most people would say that Alexander the Great 01:43:51.820 |
and Genghis Khan and Napoleon were effective people 01:44:15.340 |
but pretty willing to commit certain acts of destruction 01:44:20.780 |
The Genghis Khan, or you could argue he's not a psychopath. 01:44:46.500 |
It's more that the destruction in itself is the goal. 01:44:46.500 |
and a psychological disposition more towards destruction. 01:45:02.740 |
Obviously everybody has both and can toggle between both. 01:45:05.900 |
And oftentimes one is willing to destroy certain things. 01:45:09.180 |
We have this idea of creative destruction, right? 01:45:10.860 |
Willing to destroy certain things to create other things. 01:45:20.700 |
I am trying to create something for our people 01:45:23.220 |
and it requires destroying some other people. 01:45:30.100 |
'cause it's possible to have very high fealty 01:45:32.060 |
to your in-group and work on perfecting the methods 01:45:38.300 |
'cause you can dehumanize and then remove empathy. 01:45:51.300 |
about the orientation towards construction and destruction 01:45:56.980 |
is what it takes to build really catastrophic tech, 01:46:01.500 |
even today where it doesn't take what it took 01:46:03.260 |
to make a nuke, a small group of people could do it, 01:46:18.260 |
with the desire to damage civilization meaningfully? 01:46:35.220 |
which is like, it's pretty easy to come up with ways 01:46:39.620 |
to do real damage; any competent person can, and I can come up with a lot of ways. 01:46:48.220 |
And there's a lot of people as smart or smarter than me. 01:46:57.820 |
Why are we not seeing more insane mass murder? 01:47:09.340 |
- And it does have to do with some deeply pro-social dispositions in most people. 01:47:17.340 |
But when you're dealing with very large numbers, 01:47:24.260 |
well, what's the probability that X won't happen this year, 01:47:30.060 |
And then how many people are doing destructive things 01:47:33.940 |
And then how many of them can get access to higher tech 01:47:36.220 |
that they didn't have to figure out how to build? 01:47:46.300 |
Say I didn't build the technology, but I understand it well enough to utilize it 01:47:58.340 |
and do the equivalent of an incendiary bomb level of damage. 01:48:09.800 |
Does that stay a small percentage of the population? 01:48:18.980 |
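The argument here is just compounding arithmetic, and it can be made concrete. A minimal sketch, with wholly hypothetical rates (nothing below comes from the conversation):

```python
# Illustrative arithmetic for the "very large numbers" point.
# The per-actor rate and actor count are hypothetical assumptions.

def p_at_least_one(p_per_actor_year: float, actors: int, years: int) -> float:
    """P(at least one catastrophic act), assuming independent actors."""
    return 1.0 - (1.0 - p_per_actor_year) ** (actors * years)

# 10,000 capable actors, each with a one-in-a-million chance per year:
print(p_at_least_one(1e-6, 10_000, 1))   # ~0.01 over one year
print(p_at_least_one(1e-6, 10_000, 50))  # ~0.39 over fifty years
```

The particular numbers don't matter; the shape does. The probability compounds toward one as capable actors or years accumulate, which is why "it mostly hasn't happened yet" is not a stable answer.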
And especially now as you start to get into bigger things, 01:48:25.740 |
CRISPR gene drive technologies and various things like that, 01:48:45.540 |
that's inherent and that's core to human nature, 01:48:54.060 |
And you're saying, yeah, but there's a lot of humans, 01:48:54.060 |
and there's pain that can lead to a distorted view of the world 01:49:04.260 |
such that you want to channel that pain into destruction. 01:49:07.420 |
that any one individual could do large damage, 01:49:15.460 |
especially as we create more and more democratized, 01:49:22.620 |
decentralized tech, even if you don't know how to build the initial weapon. 01:49:30.060 |
So it's a race between the cheapening of destructive weapons 01:49:36.820 |
and the capacity of humans to express their love. 01:49:43.160 |
And it's a race that so far, I know on Twitter, 01:49:48.160 |
it's not popular to say, but love is winning, okay? 01:49:51.900 |
So what is the argument that love is going to lose here 01:49:55.200 |
against nuclear weapons and biotech and AI and drones? 01:50:07.200 |
So I just want you to know that that's where I'm oriented. 01:50:10.400 |
- But I'm gonna argue against why that is a given 01:50:27.000 |
- Well, it's because it's only a happy ending 01:50:28.720 |
if we actually understand the issues well enough 01:50:38.040 |
And some protopian sci-fi usually requires magic: 01:50:46.880 |
Dilithium crystals and warp drives and stuff. 01:50:53.040 |
like the people we have been in the history books 01:51:05.480 |
as stewards of their environment and their commons 01:51:25.420 |
- And I'm sorry if I'm interrupting your flow of thought, 01:51:42.760 |
which is like we kind of enjoy, for some weird reason, dystopian visions, 01:51:56.200 |
and like, especially if it's like murdering zombie doom, 01:52:11.000 |
while I get to eat my coconut ice cream and I'm all about love. 01:52:13.360 |
So like, can we trust our ability to visualize the future? 01:52:22.640 |
- I think it's a fair question to say to what degree 01:52:32.000 |
and whatever else that happens in our imagination. 01:52:45.280 |
that there's a lot of ways I could try to put a set of parts together that don't work. 01:52:52.000 |
There's not that many ways I can put them together that do work, 01:53:17.880 |
even if it took a year or a couple years to build the thing. 01:53:34.000 |
There are exponentially more ways to put a set of things together that don't work 01:53:36.840 |
than the few that really do produce higher order synergies. 01:53:46.920 |
and then we look at exponentially more powerful warfare, 01:53:51.360 |
an arms race that drives that in all these directions, 01:53:53.600 |
and when we look at a history of environmental destruction 01:54:01.120 |
with more actors doing it and the cumulative effects, 01:54:03.520 |
there's a lot of ways the whole thing can break, 01:54:23.240 |
and only a few ways of organizing it so that it avoids all of the catastrophic risks. 01:54:28.600 |
Can we inventory the destructive patterns of human behavior as part 01:54:37.160 |
of the social technology that we're thinking about? 01:54:45.200 |
like we were talking about the Genghis Khans and like that, 01:54:48.120 |
that obviously use certain kinds of physical technology 01:54:53.640 |
and unconventional warfare for a particular set of purposes. 01:54:57.760 |
But we have things that don't look like warfare, 01:55:07.040 |
like the oil industry working to bring this new energy resource to the world. 01:55:13.840 |
And the second order effects of that are climate change 01:55:18.840 |
and all of the oil spills that have happened and will happen. 01:55:30.800 |
and the human life issues that are associated with it. 01:55:53.560 |
- You mentioned second, third, and fourth order effects. 01:56:18.920 |
So it sounds like part of the thing 01:56:21.720 |
that you are thinking through 01:56:26.000 |
in terms of a solution, how to create an anti-fragile system, is those effects. 01:56:44.720 |
How do we start to think about those effects? 01:56:47.920 |
- Yeah, the war application is harm we're trying to cause, 01:56:53.120 |
The externality is harm that, at least supposedly, we didn't foresee, 01:56:57.080 |
or at minimum, it's not our intention, right? 01:57:02.120 |
But it is a side effect of what our intention is. 01:57:06.400 |
There are catastrophic risks from both types. 01:57:08.920 |
The direct application of increased technological power is to beat an out-group, 01:57:19.520 |
but the out-group is also working on growing the tech: 01:57:23.640 |
they reverse engineer the tech, upregulate it, 01:57:27.760 |
So there's the exponential tech arms race side 01:57:31.520 |
of in-group, out-group rivalry using exponential tech 01:57:36.240 |
And the other set of risks is the application 01:57:42.760 |
not intentionally to try and beat an out-group, 01:57:46.520 |
but to try to achieve some goal that we have, 01:57:48.580 |
but that produces second and third order effects 01:57:51.160 |
that do harm to the commons, to other people, 01:58:01.680 |
and can end up being worse than the problem we were originally trying to solve. 01:58:12.920 |
They weren't trying to build a democracy-destroying app, 01:58:17.040 |
but one that would maximize time on site as part of its ad model, 01:58:27.480 |
which pointed the optimization to the thing that made people spend most time on site, 01:58:29.460 |
which is usually them being limbically hijacked, 01:58:33.240 |
which ends up appealing to people's cognitive biases. 01:58:42.920 |
And it's a pretty fucking powerful second order effect, 01:58:49.920 |
'cause the rate of tech is obviously able to get distributed 01:58:54.520 |
and with a bigger jump in terms of total vertical capacity, 01:58:58.560 |
then that's what it means to get to the verticalizing part 01:59:07.440 |
had these second order environmental effects, 01:59:23.200 |
and so the oil thing also had the externalities associated 01:59:23.200 |
with the military-industrial complex and things like that. 01:59:30.940 |
we build it for reason X, whatever reason X is. 01:59:51.000 |
Maybe X is three things, maybe it's one thing, right? 01:59:53.800 |
We're doing the oil thing because we wanna make cars 01:59:57.680 |
because it's a better method of individual transportation. 02:00:08.760 |
But the thing we build interacts with ecologies, economies, psychologies, cultures. 02:00:08.760 |
Some of those effects can end up being negative effects, 02:00:27.080 |
it's gonna overcome in a short period of time. 02:00:32.560 |
That also means that the externalities that it creates are gonna be exponential too. 02:00:32.560 |
And you can say, well, but then that's the new problem 02:00:39.960 |
and humanity will innovate its way out of that. 02:01:03.260 |
Guys like Ostrom and others thinking in those directions, 02:01:14.280 |
And I have a different answer for what I think it looks like, 02:01:14.280 |
but it's some applied social tech aligned with love. 02:01:20.680 |
- 'Cause I have a bunch of really dumb ideas. 02:01:30.080 |
I think the idea I would have is to be a bit more rigorous 02:01:34.240 |
in trying to measure the amount of love you add 02:01:42.600 |
in second, third, fourth, fifth order effects. 02:01:46.400 |
It's actually, I think, quite doable, especially in the world of tech. 02:01:46.400 |
The shareholders may not like that kind of metric, 02:01:53.720 |
but we could talk about just happiness and well-being. 02:02:06.520 |
That's pretty easy for Facebook, for YouTube, to measure, 02:02:09.920 |
not as an afterthought but integrated deeply as part of the technology. 02:02:34.440 |
online interaction is driven by recommender systems 02:02:50.680 |
mostly based on traces of your previous behavior 02:02:54.600 |
This is how Twitter, this is how Facebook works, 02:02:56.800 |
this is how Google's AdSense works, 02:03:00.280 |
this is how Netflix, YouTube work, and so on. 02:03:02.520 |
And for them to just track, as opposed to engagement, 02:03:24.880 |
long-term growth, the Rogan idea of being the hero of your own movie. 02:03:34.040 |
You could have these happiness surveys. 02:03:49.540 |
It's fascinating; my own history is very embarrassing for me. 02:04:04.120 |
There are videos where it looks like I watched the whole thing, 02:04:05.760 |
which I probably did, about like how to cook a steak, 02:04:08.320 |
even though, or just like the best chefs in the world 02:04:11.320 |
cooking steaks, and I'm just like sitting there watching it 02:04:15.080 |
for no purpose whatsoever, wasting away my life, 02:04:34.040 |
as opposed to lectures I've watched which made me a better person. 02:04:40.120 |
Quite honestly, not for stupid reasons like feeling dumber, 02:04:43.720 |
but because I do have a sense that that started me down an unproductive path, 02:04:57.160 |
and I'm sorry for ranting, but maybe there's some usefulness to this. 02:05:01.360 |
When I focus on creating, on programming, on science, I become a better person. 02:05:14.380 |
When I listen to too many, a little bit is good, but too many 02:05:27.240 |
videos criticizing the worst of the quote-unquote woke, for example, I become worse. 02:05:30.240 |
There's all these groups that are misbehaving 02:05:33.560 |
in fascinating ways because they've been corrupted by power. 02:05:37.040 |
The more I watch criticism of them, the worse I become. 02:05:47.080 |
The problem is that for some reason it's pleasant to watch those sometimes. 02:05:54.040 |
If I could communicate that to the YouTube algorithm, to the systems around me, 02:06:03.160 |
which I personally believe would make YouTube 02:06:05.400 |
a lot more money, because I'd be much more willing to keep coming back. 02:06:12.160 |
That's great for business and great for humanity. 02:06:18.040 |
It'll increase the love quotient, the love metric. 02:06:27.000 |
And so you should do that not just for the YouTube algorithm, 02:06:35.880 |
but also for decisions about going to war. The thing, which I think we talked about offline, 02:06:40.080 |
is we often go to war with kind of governments, not with people, 02:06:46.000 |
You have to think about the kids of countries 02:06:49.960 |
that see a soldier, and because of what they experience 02:06:56.840 |
in their interaction with the soldier, hate is born. 02:07:00.820 |
When you're like eight years old, six years old, that seed is planted. 02:07:11.920 |
One of the things that could be reduced to love, positive and negative, 02:07:15.280 |
is the hate that's born when you make decisions. 02:07:22.640 |
You have to consider that that little seed is going to become a tree. 02:07:28.340 |
But in my sense, it's possible to reduce everything 02:07:33.600 |
to a measure of how much love does this add to the world. 02:07:49.300 |
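As a deliberately naive sketch of what Lex is proposing here, a platform could blend its usual engagement prediction with a user-reported "this made me better or worse" signal. Everything below is hypothetical: the field names, the weights, and the existence of such a signal at all.

```python
# Hypothetical re-ranking by reported long-term wellbeing rather than
# engagement alone. All names and numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    engagement_score: float    # platform's usual prediction, in [0, 1]
    reported_wellbeing: float  # average user self-report, in [-1, +1]

def blended_score(item: Item, w: float = 0.7) -> float:
    # Weighted blend; a real system would have to handle sparse,
    # delayed, and gameable self-reports, which this sketch ignores.
    return (1 - w) * item.engagement_score + w * item.reported_wellbeing

feed = [
    Item("three hours of chefs cooking steaks", 0.9, -0.4),
    Item("lecture viewers say made them better", 0.4, +0.8),
]
for item in sorted(feed, key=blended_score, reverse=True):
    print(item.title)
# With w = 0.7 the lecture outranks the steak marathon,
# inverting the pure-engagement ordering.
```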
- There were a lot of good things that you shared 02:07:55.040 |
that we could enter this that are all interesting. 02:07:57.140 |
So I'm trying to see which one will probably be most useful. 02:07:59.880 |
- Pick the one or two things that are least ridiculous. 02:08:07.240 |
Let's take some of the second order effects or externalities, 02:08:12.320 |
specifically the one of a kid being radicalized 02:08:15.200 |
somewhere else, which engenders enmity in them towards us. 02:08:21.400 |
And if you care about the kid, it's a whole other thing. 02:08:30.120 |
When journalists went to the war and took photos and videos of what was happening, 02:08:36.680 |
the anti-war effort was bolstered by that. 02:08:46.840 |
Without seeing the people, you can't have a mirror neuron effect in the same way. 02:08:50.080 |
And when you can, that starts to have a powerful effect. 02:08:54.600 |
that you're sharing there, which is that if we, 02:08:59.600 |
we can have a rivalrous intent where our in-group, 02:09:05.480 |
whatever it is, maybe it's our political party 02:09:08.920 |
maybe it's our nation state wanting to win in a war 02:09:12.880 |
or an economic war over resource or whatever it is, 02:09:15.660 |
that if we don't obliterate the other people completely, 02:09:15.660 |
they learn whatever technologies we employed to be successful, 02:09:28.800 |
which is why you can see that the wars escalated over history. 02:09:37.280 |
Like it just increased capacity on all of those fronts. 02:09:53.600 |
And so what seems like a win to us on the short term 02:10:02.760 |
might actually really produce losses in the long term. 02:10:05.380 |
And what's even in our own best interest in the long term 02:10:17.740 |
changes with the rate at which we affect the biosphere. 02:10:20.860 |
The point is that this kind of proverbial spiritual idea, 02:10:34.740 |
that the sages saw their interconnection and dependence on each other, 02:10:39.480 |
is now literal: the speed at which we are actually interconnected, 02:10:43.720 |
the speed at which the harm happening to something in Wuhan 02:10:50.360 |
affects the entire world, or an environmental issue does, is visible 02:10:57.140 |
not as a spiritual idea, just even as physics, right? 02:10:59.560 |
We all get the interconnectedness of everything 02:11:04.120 |
and see how to make it through more effectively together 02:11:13.240 |
- Don't you think people are beginning to experience that interconnectedness, 02:11:18.320 |
even when systems are trying to make us not empathize with each other? 02:11:25.000 |
Like, isn't that exactly what the technology is enabling? 02:11:27.880 |
Like social networks, we tend to criticize them, 02:11:30.040 |
but isn't there a sense in which we're experiencing it? 02:11:35.040 |
- When you watch those videos that are criticizing, 02:11:46.040 |
does it seem like they have increased empathy 02:11:49.300 |
for people that are outside of their ideological camp? 02:12:01.200 |
I tend to see those videos as feeding something dark in me, 02:12:16.360 |
but whether I'm right or wrong, I don't know. 02:12:23.400 |
I'd like to believe that hunger for drama is not fundamental to human beings, 02:12:37.200 |
and that we can hear other views and be able to empathize with them and synthesize it all. 02:13:01.180 |
based on the scarcity or abundance of resource 02:13:13.740 |
They only take fruits that fall off the trees and stuff. 02:13:16.620 |
Or to go to a larger population, you take Buddhists, 02:13:19.820 |
where for the most part, with a few exceptions, they don't kill insects. 02:13:34.820 |
Think of that happening within a culture of that many people, 02:13:37.980 |
with head traumas and whatever, and nobody hurts bugs. 02:13:41.900 |
And then you look at a group where the kids grew up in war, where 02:13:48.660 |
pretty much everybody's killed people, hand to hand. 02:13:55.460 |
And you say, okay, so we are very neotenous. 02:14:06.420 |
It doesn't mean that the Buddhists had no violence. 02:14:08.200 |
It doesn't mean that these people had no compassion, 02:14:09.940 |
but they're very different Gaussian distributions. 02:14:20.640 |
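The "different Gaussian distributions" point can be made numerically: shifting a population's mean disposition even modestly moves the extreme tail, where the violence lives, by a large factor. The scale and threshold below are arbitrary illustrative choices, not measurements.

```python
# A modest shift in the mean of a Gaussian moves its extreme tail a lot.
from statistics import NormalDist

threshold = 3.0  # "willing to do serious violence", arbitrary units
gentle = NormalDist(mu=0.0, sigma=1.0)    # conditioned strongly against violence
hardened = NormalDist(mu=1.5, sigma=1.0)  # conditioned toward violence

print(1 - gentle.cdf(threshold))    # ~0.0013, roughly 1 in 740
print(1 - hardened.cdf(threshold))  # ~0.0668, roughly 1 in 15
```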
So you take the best of the populations, what Buddhism shows regarding compassion, 02:14:28.900 |
what other cultures show regarding the average level of education that everybody gets. 02:14:42.720 |
And then say, what are the conditioning factors? 02:15:00.920 |
The claim is that people become violent and irrational, specifically those two things. 02:15:05.020 |
And so in order to minimize the total amount of violence 02:15:07.900 |
and have some good decisions, they need to be ruled somehow. 02:15:10.440 |
And that not getting that is some kind of naive utopianism. 02:15:28.060 |
It has served the powerful to popularize the idea that most people are too violent, 02:15:34.540 |
lazy, undisciplined and irrational to make good choices, 02:15:39.540 |
and therefore their choices should be sublimated to those who rule them. 02:16:00.980 |
Look at the distribution of how well they do on any particular metric. 02:16:06.100 |
And the worst ones become high-end lawyers or whatever. 02:16:14.820 |
That's not a different gene pool, but a very different set of conditioning factors: 02:16:20.460 |
they got to go to Exeter and have that family and whatever. 02:16:23.980 |
Could we actually condition more social virtue, 02:16:30.900 |
more orientation towards dialectical synthesis, 02:17:01.200 |
this is the powerful leading over the less powerful. 02:17:01.200 |
- Anyone that benefits from asymmetries of power 02:17:05.020 |
is not incented toward kind of increasing the capacity of people more widely. 02:17:14.100 |
So like this pickup line that I use at a bar often, 02:17:41.560 |
which is power corrupts and absolute power corrupts absolutely. 02:17:45.500 |
Is that true or is that just a fancy thing to say? 02:17:48.620 |
In modern society, there's something to be said about how the drive for power, 02:18:01.220 |
that is innate in people and can be conditioned up or down, 02:18:06.100 |
does a very different thing at scale, 02:18:16.000 |
and particularly then creating sociopathic institutions. 02:18:26.300 |
Take foods that were rare in our evolutionary environment, 02:18:28.340 |
that give us more of a dopamine hit because they were rare. 02:18:35.020 |
Is that why humans have an orientation to overeat if they can? 02:18:38.260 |
Well, the fact that there is that possibility 02:18:40.620 |
doesn't mean everyone will obligately be obese. 02:18:43.920 |
Like it's possible to have a particular impulse and learn to regulate it. 02:18:52.220 |
And so to say that power dynamics are obligate in humans 02:18:57.220 |
and we can't do anything about it is very similar to me 02:19:01.680 |
to saying like everyone is gonna be obligately obese. 02:19:05.880 |
- Yeah, so there's some degree to which those impulses can be conditioned. 02:19:11.440 |
- Yes, and the culture that creates the environment 02:19:15.040 |
to be able to do that and then the recursion on that. 02:19:17.760 |
- Okay, so what if we were to, just bear with me, 02:19:22.880 |
if we were to kill all humans on earth and then start over, 02:19:33.080 |
let's leave the humans on earth, they're fine. 02:19:39.040 |
Are there ways to construct systems of conditioning, 02:20:10.240 |
of course, you probably don't have all the answers, 02:20:13.160 |
but you have insights about what that looks like. 02:20:15.640 |
I mean, is it just rigorous practice of dialectical synthesis 02:20:23.520 |
of various flavors until they're not assholes anymore 02:20:33.080 |
There are things we would need to construct; to come back to this, 02:20:43.400 |
If you have a culture that is doing less rivalry, 02:20:46.020 |
does it always lose in war to those who do war better? 02:20:48.920 |
And how do you make something on the enactment 02:21:05.460 |
I think bad and good are kind of silly concepts here. 02:21:12.640 |
- For resilience. - Some contexts and others. 02:21:22.920 |
First, I think what you're saying is actually aligned 02:21:29.220 |
but it's not, the devil is in the details here. 02:21:54.220 |
So let's take it for the utility it has first. 02:22:07.360 |
Comfort is like the shocks in a car making the car bounce less. 02:22:07.360 |
And happiness is actually hard for philosophers to define: 02:22:16.200 |
there's happiness in certain kinds of overcoming suffering; 02:22:22.840 |
there's happiness that feels more like contentment. 02:22:26.120 |
There's deep stuff, and it's mostly first person, 02:22:33.340 |
even if maybe it corresponds to third person stuff we could measure. 02:22:37.200 |
Where the third person stuff looks to be less conducive, 02:22:46.640 |
there's some Viktor Frankl, Nelson Mandela, whatever. 02:22:49.180 |
And I think we can see that the industrial revolution 02:22:59.980 |
started to replace happiness with comfort quite heavily 02:23:05.360 |
And we can see that when increased comfort is given, 02:23:07.860 |
maybe because of the evolutionary disposition 02:23:07.860 |
that expending energy when we didn't have extra calories was not a safe thing to do, 02:23:14.140 |
when people say that we have better lives than Egyptian pharaohs 02:23:39.940 |
and kings and whatever, what they're largely looking at is comfort, 02:23:45.960 |
like how comfortable the transportation systems are. 02:23:52.660 |
But the places where people have access to the most comfort are not the happiest, 02:23:57.140 |
and we also see that some of the happiest cultures have less of it. 02:24:04.080 |
And so there's a very interesting question here. 02:24:06.220 |
And if I understand correctly, you do cold showers. 02:24:09.980 |
And Joe Rogan was talking about how he needs to do hard things. 02:24:22.440 |
The value is that it's actually stressing an adaptive system, 02:24:28.420 |
and that there's something that the happiness of a system 02:24:32.180 |
has something to do with its adaptive capacity, 02:24:39.440 |
And yet in the presence of the comfort solution, it's easy 02:24:47.600 |
to actually down-regulate your overall adaptive capacity. 02:25:06.600 |
the revenue generation or profit generation of my business, 02:25:28.160 |
- Yeah, and I think they're answerable questions, right? 02:25:34.720 |
let me throw happiness and comfort out of the discussion. 02:25:39.280 |
'Cause I said the distinction is useful, 02:25:42.320 |
wellbeing is useful, but I think I take it back. 02:25:45.780 |
I proposed new metrics in this brainstorm session, 02:26:01.080 |
I think we're able to make that concrete for ourselves. 02:26:05.640 |
Like you're a better person than you were a week ago, 02:26:17.840 |
It's this gray area, and we try to define it, 02:26:20.480 |
but I think we humans are pretty good at that, 02:26:27.120 |
We all dream of becoming a certain kind of person, 02:26:29.320 |
and I think we have a sense of getting closer 02:26:38.440 |
Fuck if you're happy or not, or you're comfortable or not, 02:26:42.900 |
how much love do you have towards your fellow human beings? 02:26:52.500 |
How many times a day, sorry, if I can quantify, 02:26:58.800 |
how many times a day have you thought positively about another person? 02:27:02.560 |
Just put that down as a number, and increase that number. 02:27:08.960 |
okay, so let's not take GDP or GDP per capita, 02:27:08.960 |
since it includes plenty of things that we wouldn't say correlate to quality of life. 02:27:20.040 |
- By the way, when I said growth, I wasn't referring to GDP. 02:27:35.280 |
So the idea of saying, what would the metrics of a good civilization be? 02:27:46.200 |
And then really try to run the thought experiment of optimizing them: 02:27:56.360 |
what are the second and third order effects of what happens that's positive, 02:28:02.820 |
and what actually matters that is not included in that metric? 02:28:06.820 |
Because love versus number of positive thoughts per day, those are not the same thing. 02:28:32.200 |
Whatever reality is that is not included in those metrics gets externalized, 02:28:36.800 |
which is why I would say that wisdom is something like the ability to attend to reality 02:28:47.280 |
beyond what metrics-based optimization would offer. 02:29:07.940 |
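The point about metrics externalizing whatever they leave out is essentially Goodhart's law, and it shows up in a few lines. The two-factor "world" below is an invented toy, not anything from the conversation:

```python
# Toy Goodhart's-law demo: optimize a proxy hard enough and the
# unmeasured part of what you actually care about gets sacrificed.

def true_value(measured: float, unmeasured: float) -> float:
    return measured + unmeasured

def policy(effort_on_metric: float) -> tuple[float, float]:
    # Effort spent maximizing the metric is taken from the unmeasured factor.
    return effort_on_metric, 1.0 - 1.5 * effort_on_metric

for effort in (0.0, 0.5, 1.0):
    m, u = policy(effort)
    print(f"effort={effort:.1f}  metric={m:.2f}  true value={true_value(m, u):.2f}")
# The metric rises monotonically (0.00, 0.50, 1.00)
# while the true value falls (1.00, 0.75, 0.50).
```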
that is recognizing a new metric that's important 02:29:18.940 |
- There's Gödel's incompleteness theorem, right? 02:29:35.340 |
Well, that then becomes true for the civilization 02:29:39.620 |
And our definition of how to think about a meaningful life 02:29:44.220 |
I can tell you what some of my favorite metrics are. 02:29:55.060 |
- Yeah, I wanna optimize that across the entire population, 02:29:58.560 |
So, in the same way, love isn't a metric, 02:30:05.260 |
but you could make metrics that look at certain parts of it. 02:30:12.340 |
I don't think there is a metric, a right one. 02:30:22.620 |
I think that's what the idea of false idol means 02:30:26.540 |
in terms of the model of reality not being reality. 02:30:29.700 |
Then my sacred relationship is to reality itself, 02:30:36.420 |
And there's a sense of sacredness connected to the unknown 02:30:41.300 |
that is always seeking not just to optimize the thing I know, 02:30:54.540 |
And this is why the first verse of the Tao Te Ching 02:30:57.780 |
is the Tao that is nameable is not the eternal Tao. 02:31:00.540 |
The naming then can become the source of the 10,000 things, but the names 02:31:00.540 |
can actually obscure you from paying attention to reality itself. 02:31:04.620 |
It's the same idea, but in a different language, much more poetic. 02:31:11.540 |
You can't construct a good model of cellular automata. 02:31:34.380 |
And dealing with the uncertainty of uneven ground. 02:31:44.220 |
- So one metric, and there are problems with this, 02:32:10.140 |
'Cause it's gonna be in relationship to that, right? 02:32:19.860 |
What is the epistemic basis for postulating that? 02:32:27.540 |
I don't, I mean, that doesn't even apply to you whatsoever, 02:32:54.580 |
but we haven't really considered what it means. 02:32:58.140 |
Just a more concrete version of that question is this: 02:33:02.900 |
What is the kind of world we're trying to create? 02:33:06.140 |
Have you really thought about the kind of world? 02:33:08.460 |
- I'll give you some kind of simple answers to that 02:33:16.260 |
- We should take a note of this meaningful thing, 02:33:40.060 |
is the inverse correlation of addiction within the society. 02:34:05.100 |
And insofar as a civilization is conditioning addiction, it is failing its people. 02:34:28.180 |
- I think that's a very interesting one to think about. 02:34:35.460 |
And the ability to go in the other direction from addiction 02:34:38.900 |
is the ability to be exposed to hypernormal stimuli 02:34:41.660 |
and not go down the path of desensitizing to other stimuli 02:34:54.500 |
A healthy culture has to create something like ritualized discomfort. 02:35:06.260 |
I think that's what the sweat lodge and the vision quest 02:35:09.500 |
and the solo journey and the ayahuasca journey were for. 02:35:13.420 |
I think it's even a big part of what yoga asana was, 02:35:16.020 |
is to make beings that are resilient and strong, 02:35:23.060 |
To make beings that can control their own mind and fear, 02:35:27.500 |
But we don't want to put everybody in war or real trauma. 02:35:31.140 |
And yet we can see that the most fucked up people we know went through trauma, 02:35:36.380 |
but some of the most incredible people we know did too, 02:35:40.060 |
depending on whether or not they happened to make it through. 02:35:43.220 |
So how do we get the benefits of the steeling of character 02:35:56.300 |
in a way that not only has us overcome something by ourselves, 02:36:12.780 |
We can keep the comfortable stuff, but we have to also develop resilience 02:36:16.300 |
in the presence of that for the anti-addiction direction 02:36:24.900 |
- So you have to be consistently injecting discomfort into the system. 02:36:31.180 |
I mean, this sounds like you have to imagine Sisyphus happy. 02:36:39.140 |
That's how you stay optimally resilient from a metrics perspective in society. 02:36:47.020 |
So we want to constantly be throwing rocks at ourselves. 02:37:01.620 |
Now, I do not think this should be imposed by states. 02:37:07.900 |
And I think the cultures are developing people 02:37:17.340 |
but there's also a voluntarism because the people value 02:37:20.340 |
the thing that is being developed, they understand it. 02:37:22.380 |
- And that's what conditioning, so it's conditioning. 02:37:30.900 |
but when we recognize the language that we speak 02:37:35.980 |
and the patterns of thought built into that language are conditioning us anyway. 02:37:45.380 |
So all you can do is take more responsibility 02:37:48.860 |
and then you have to think about this question 02:37:51.700 |
Because, unlike the other animals born into an environment they're adapted to, we construct our environments. 02:38:02.100 |
So then we have to say, well, what kinds of environments, 02:38:16.460 |
What are even those sets of things that matter? 02:38:18.220 |
So you end up getting deep existential consideration 02:38:23.380 |
when you start to realize how powerful we're becoming 02:38:29.180 |
- Before I pull, I think, three threads you just laid down, 02:38:33.020 |
is there another metric index that you're interested in? 02:38:37.700 |
- There's a number, but the next one that comes to mind is 02:39:09.780 |
how coupled people's wellbeing is. We would call that the compassion-compersion axis. 02:39:18.780 |
So compassion is when you're feeling something negative, I feel it with you. 02:39:24.400 |
Compersion is when you do well, I'm stoked for you. 02:39:27.260 |
Right, like I actually feel happiness at your happiness. 02:39:31.220 |
- Yeah, the fact that it's such an uncommon word in English says something, 02:39:37.440 |
and I think that's a really good feeling to feel. 02:39:47.540 |
Now, there is a state where my emotional states 02:39:51.400 |
and your emotional states are just totally decoupled. 02:39:57.260 |
I don't want to hurt you, but I don't care if I do 02:40:01.480 |
But there's a worse state and it's extremely common, 02:40:05.980 |
where my positive emotions correspond to your negative ones 02:40:11.120 |
And that is what I would call the jealousy-sadism axis. 02:40:16.060 |
The jealousy axis is when you're doing really well, 02:40:20.700 |
I feel taken away from, less than, upset, envious, whatever. 02:40:30.580 |
But I think of it as kind of a low-grade psychopathology 02:40:35.740 |
The idea that I'm actually upset at the happiness of another person. 02:40:43.360 |
No, we shouldn't shame it and repress it so it gets worse. 02:40:47.340 |
And it comes from our own insecurities and stuff. 02:40:52.480 |
And sadism, which is really fucked up, is just on the same axis. 02:40:55.100 |
It's the same thing inverted: the jealousy 02:40:58.540 |
or the envy is, I feel badly when you're doing well. 02:41:02.120 |
The sadism side is I actually feel good when you lose. 02:41:12.420 |
and then they feel they want that partner to get it. 02:41:17.460 |
So jealousy is really one step on the path 02:41:21.260 |
to sadism, away from the healthy compassion-compersion axis. 02:41:23.940 |
So I would like to see a society 02:41:28.240 |
that is conditioning sadism and jealousy down, right? 02:41:46.140 |
It's not just compassion, where I might be self-sacrificing and miserable, 02:41:51.220 |
which I don't think anybody thinks is a great idea. 02:41:52.780 |
Or happiness where I might be sociopathically happy 02:41:55.000 |
where I'm causing problems all over the place 02:41:56.860 |
or even sadistically happy. But it's a coupling, right? 02:42:00.600 |
That I'm actually feeling happiness in relationship to yours. 02:42:08.760 |
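The two axes can be read as the sign of the coupling between two people's affect over time: compassion-compersion is positive coupling, jealousy-sadism is negative, indifference is near zero. A toy formalization with invented time series (my framing, not Daniel's):

```python
# Coupling between two people's affect, read off a correlation.
# statistics.correlation requires Python 3.10+.
from statistics import correlation

you        = [1.0, -0.5, 0.8, -1.0, 0.6]    # your highs and lows
compersive = [0.9, -0.4, 0.7, -0.8, 0.5]    # feels with you: r near +1
sadistic   = [-0.8, 0.6, -0.9, 1.0, -0.5]   # thrives on your lows: r near -1
decoupled  = [0.55, 0.1, -0.55, 0.0, -0.1]  # indifferent: r near 0

for name, other in [("compersive", compersive),
                    ("sadistic", sadistic),
                    ("decoupled", decoupled)]:
    print(name, round(correlation(you, other), 2))
```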
- That's actually, speaking of another pickup line, 02:42:19.860 |
'cause it's like, oh yeah, where's your girlfriend, bro? 02:42:33.500 |
where you actually get joy from another person's success 02:42:53.140 |
So like just be, just enjoying the happiness of others, 02:43:01.020 |
'cause the first person that drilled this into my head was Joe, 02:43:06.180 |
'cause I saw somebody who was successful, rich, 02:43:18.180 |
really genuinely enjoying the success of his friends. 02:43:23.900 |
And I mean, the way you're kind of speaking to it, 02:43:30.020 |
is I guess I haven't witnessed genuine expression of it much, because 02:43:39.660 |
there haven't been many channels where you can watch 02:43:43.640 |
or listen to people being their authentic selves. 02:43:49.580 |
They probably don't seek public attention also, 02:43:56.780 |
There isn't a good word that could express what I've learned from Joe. 02:44:14.380 |
But you're speaking to how society should function, 02:44:19.060 |
but I feel like if you optimize for that metric 02:44:23.540 |
you're going to live a truly fulfilling life. 02:44:32.180 |
- You will also learn what gets in the way of it 02:44:36.140 |
and how to work with it that if you wanted to help 02:44:38.300 |
try to build systems at scale, or apply it to something like Facebook, 02:44:42.820 |
you would have more actual depth of real knowledge about the dynamic 02:44:51.740 |
between when you get stoked on other people doing well 02:44:53.740 |
and then they have a similar relationship to you 02:44:55.700 |
and everyone is in the process of building each other up. 02:44:58.300 |
And this is what I would say the healthy version 02:45:02.340 |
of competition is versus the unhealthy version. 02:45:08.540 |
The root of 'compete,' I believe, is a Latin word that means to strive together. 02:45:15.220 |
but I recognize that there's actually a hormesis, 02:45:22.980 |
there's an impulse where I'm trying to get ahead, but I want the others striving too, 02:45:29.260 |
because that is where my ongoing becoming happens. 02:45:31.620 |
And the win itself will get boring very quickly. 02:45:34.460 |
The ongoing becoming is where there's aliveness. 02:45:37.260 |
And for the ongoing becoming, they need to have it too. 02:45:48.780 |
into a model I like about what a meaningful human life is, 02:46:02.820 |
- Well, I have three things I'm going elsewhere with, 02:46:20.300 |
'cause a lot of people will take the word meaning 02:46:27.340 |
And they'll think kind of what is the meaning of my life? 02:46:34.780 |
And what is the meaning of existence rather than non-existence? 02:46:38.620 |
So there's a lot of kind of existential considerations there. 02:46:51.980 |
The purpose of one thing is defined in relationship 02:46:56.500 |
And to say, what is the purpose of everything? 02:47:00.780 |
Well, purpose is too small of a question. 02:47:03.380 |
It's fundamentally a relative question within everything. 02:47:05.740 |
What is the purpose of one thing relative to another? 02:47:08.860 |
And there's nothing outside of it with which to say it. 02:47:11.100 |
You actually just got to the limits of the utility of the concept. 02:47:16.100 |
It doesn't mean it's purposeless in the nihilistic sense; it's more 02:47:23.180 |
like in Taoism, talking about the nature of it. 02:47:33.620 |
But I'm gonna try to speak to a much simpler part, 02:47:42.300 |
And kind of if we were to optimize for something 02:48:04.500 |
'Cause you can see that there are a lot of dialectics here, 02:48:28.780 |
and you're like, wait, fuck, sadists can be happy. 02:48:32.620 |
But wait, some people can self-sacrifice out of love and be miserable. 02:48:52.780 |
I'd say that a meaningful life involves the modes of being, doing, and becoming, 02:49:09.100 |
and each of those also has failure modes that are not a meaningful life. 02:49:12.420 |
The mode of being, the way I would describe it, 02:49:28.340 |
and that is largely about being with what is. 02:49:32.940 |
It's fundamentally grounded in the nature of experience 02:49:40.860 |
I'm not actually asking what the meaning of life is. 02:49:58.940 |
The mode of doing is doing things that will make life easier, more beautiful for somebody else, 02:50:07.420 |
adding to life or other people's ability to appreciate the beauty of life. 02:50:19.700 |
And becoming is getting better at both of those. 02:50:24.540 |
Getting better at being is to be able to take in the beauty of life 02:50:26.580 |
more profoundly, be more moved by it, touched by it. 02:50:42.420 |
And where they're not in conflict with each other, that's a meaningful life. 02:50:48.620 |
It grounds in the intrinsic meaningfulness of experience, 02:50:54.340 |
and in doing things that will be able to increase the possibility of experience 02:51:04.700 |
and also the evolutionary possibility of experience. 02:51:07.780 |
- And the point is to oscillate between these, 02:51:18.920 |
- Well, you can't really do them all at once; attention is a thing. 02:51:26.900 |
I want moments where I am absorbed in the sunset 02:51:37.260 |
to where my doing doesn't come from what's in it for me. 02:51:40.780 |
'Cause I actually feel overwhelmingly full already. 02:51:44.260 |
And then it's like, how can I make life better 02:51:56.460 |
And so I think where the doing comes from matters. 02:52:00.460 |
And if the doing comes from a fullness of being, 02:52:12.100 |
it's different from a doing that is willing to cause sacrifices in other places 02:52:14.180 |
and where a chunk of its attention is internally focused. 02:52:34.540 |
if we think of desire as a basis for movement, 02:52:49.080 |
there's a kind of desire that comes from feeling full 02:52:52.460 |
at the beauty of life and wanting to add to it 02:52:56.580 |
And I don't think that is the cause of suffering. 02:53:05.260 |
When people debate desire as the cause of suffering, and kind of unconditional happiness outside of desires, 02:53:05.260 |
versus "No, actually desire is the source of creativity," 02:53:10.160 |
But creating from where and in service of what? 02:53:21.120 |
Creating from a sense of connection to everything 02:53:23.080 |
and wholeness in service of the wellbeing of all of it 02:53:26.440 |
Which is back to that compassion-compersion axis. 02:53:26.440 |
Is feeling the terror of death, like Ernest Becker described, 02:54:18.340 |
or just acknowledging the uncertainty, the mystery, 02:54:23.340 |
the melancholy nature of the fact that the ride ends, 02:54:26.860 |
part of this equation, or is it not necessary? 02:54:31.300 |
A lot of the fear of death is because we're afraid of hell or bad reincarnation 02:55:00.580 |
or the Bardo or some kind of idea of the afterlife we have, 02:55:02.980 |
or we're projecting some kind of sentient suffering. 02:55:07.880 |
I noticed that every time I stay up late enough and get tired enough, 02:55:14.600 |
I'm longing for deep sleep and non-experience, right? 02:55:18.040 |
Like I'm actually longing for experience to stop. 02:55:27.780 |
And sometimes when I wake up, I wanna go back into it. 02:55:30.880 |
And then when it's done, I'm happy to come out of it. 02:55:33.980 |
So when we think about death and having finite time here, 02:55:42.500 |
and we could talk about what would change if we lived for a thousand years. 02:55:48.900 |
The one bummer with the age we die is that I generally find 02:55:52.340 |
that people mostly start to emotionally mature quite late. 02:56:07.260 |
Without that deadline, I can just stay focused on what's in it for me forever. 02:56:12.060 |
And if life continues and consciousness and sentience 02:56:17.060 |
and people appreciating beauty and adding to it continue beyond me, 02:56:23.820 |
and my life can have effects that continue well beyond it, 02:56:27.080 |
then life with a capital L starts mattering more to me. 02:56:32.160 |
My life gets to be a part of, and in service to, it. 02:56:34.500 |
And the whole thing about when old men plant trees, 02:56:38.320 |
the shade of which they'll never get to be in. 02:56:41.300 |
I remember the first time I read this poem by Hafez, 02:56:51.160 |
And he talked about, if you're lonely, to think about him reaching his spirit 02:56:58.320 |
into yours across the distance of a millennium. 02:57:04.640 |
And I was thinking about people a millennium from now 02:57:08.640 |
and what they'd be suffering if they'd be lonely 02:57:10.440 |
and could he offer something that could touch them. 02:57:14.920 |
And so like the most beautiful parts of humans 02:57:33.520 |
It does have a sense in which it incentivizes caring beyond your own life. 02:57:39.820 |
- And the widening, you remember Einstein had that quote, 02:57:45.040 |
"Something to the effect of it's an optical delusion 02:57:47.360 |
"of consciousness to believe there are separate things." 02:57:52.500 |
and something about us being inside of a prison of perception 02:57:57.500 |
that can only see a very narrow little bit of it. 02:58:02.080 |
But this might be just some weird disposition of mine, 02:58:07.080 |
but when I think about the future after I'm dead 02:58:20.760 |
And I think about people being awed by sunsets 02:58:30.200 |
- Do you feel some sadness at the very high likelihood that you'll be forgotten? 02:58:39.080 |
You, Daniel, the name, that which cannot be named? 02:58:52.460 |
- The idea that I might do something meaningful that's remembered, 02:58:56.480 |
of course there's like a certain sweetness to that idea. 02:59:14.000 |
And I think to the degree that the future people 02:59:20.800 |
you know, a lot of traditions had this kind of question of, 02:59:23.320 |
are we being good ancestors, and respect for the ancestors. 03:00:01.820 |
Do you think the state is the right thing for the future? 03:00:05.620 |
So governments that are elected, democratic systems, 03:00:08.340 |
representative democracy. 03:00:11.580 |
Is there some kind of political system of governance that's best? 03:00:17.860 |
Is it parents, meaning very close-knit tribes? 03:00:25.860 |
And then you and Michael Malice would happily agree 03:00:30.860 |
that it's anarchy, where the state should be dissolved 03:00:58.860 |
- You like to give these simplified good or bad things. 03:01:01.700 |
Would I like the state that we live in currently to be better, 03:01:20.980 |
and maximally just and humane and all those things? Sure. 03:01:26.980 |
But I am much more interested in it being able to evolve 03:01:29.900 |
to a better thing without going through the catastrophe phase 03:01:34.900 |
that I think its sudden non-existence would give. 03:01:42.520 |
In a sense, like, should we as a human society 03:01:50.460 |
take the nations we can put on a map, like right now, literally, 03:02:05.780 |
and decentralize the power to where it's very difficult for any one to dominate? 03:02:19.860 |
That could be reducing, in the United States, 03:02:29.380 |
the set of responsibilities, the set of powers, 03:02:36.720 |
or making more nations, or maybe nations not in the space of geography 03:02:49.220 |
but in the space of ideas, based on their set of ideas, and doing so dynamically. 03:02:55.060 |
- I think we can say that the natural state of humans, 03:03:03.020 |
was to live in tribes that were below the Dunbar number, 03:03:07.820 |
meaning that for a few hundred thousand years 03:03:11.520 |
of human history, all of the groups of humans were that small. 03:03:19.620 |
And so it seems like there's a pretty strong basis for that scale, 03:03:22.260 |
but there weren't individual humans out in the wild making it alone; 03:03:30.580 |
humans were being domesticated by those groups. 03:03:40.300 |
- And maybe it's useful, as a side statement: 03:03:43.660 |
I've recently looked at a bunch of papers 03:03:45.740 |
around Dunbar's number, where the mean is actually 150. 03:03:49.260 |
If you actually look at the original papers-- 03:03:56.060 |
So it's a range of like two to 500 or whatever it is. 03:04:03.940 |
the range is two to 520, something like that. 03:04:24.580 |
for particular environments, particular conditions, so on. 03:04:26.980 |
It is very true that it's likely to be something small. 03:04:32.220 |
But it'd be interesting if we could expand that number 03:04:34.700 |
in interesting ways that will change the fabric of society. 03:04:51.740 |
- The number probably depends on the social technologies that mediate it to some degree. 03:05:01.100 |
Below it, everybody can know everybody else pretty intimately. 03:05:04.260 |
So let's go ahead and just take 150 as an average number. 03:05:12.020 |
Everybody can know everyone intimately enough 03:05:14.420 |
that if your actions made anyone else do poorly, you'd be accountable 03:05:29.900 |
to a kind of tribal process, where what's good 03:05:32.260 |
for the individual and good for the whole has a coupling. 03:05:34.980 |
Also below that scale, everyone is somewhat aware of what everyone else is doing. 03:05:50.540 |
And so you don't need kind of like the state in that way, 03:05:55.460 |
and lying to people doesn't actually get you ahead. 03:06:06.860 |
that is aligned with the interests of the tribe 03:06:21.640 |
I would say there's also a communication protocol where everyone can sit in the council 03:06:21.640 |
and be part of a conversation around a really big decision. 03:06:38.960 |
And why would I want to agree to be a part of a larger group 03:06:43.960 |
where everyone can't be part of that council? 03:06:55.140 |
that could still survive and I get a say in the law 03:06:59.860 |
And a way we can look at it beyond the Dunbar number, too, 03:07:02.740 |
is that a civilization has binding energy 03:07:06.560 |
that is holding it together, and has cleaving energy. 03:07:08.660 |
And if the binding energy exceeds the cleaving energy, it holds together. 03:07:14.500 |
There are things we can do to decrease the cleaving energy within the society, 03:07:16.480 |
things we can do to increase the binding energy. 03:07:18.260 |
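One way to formalize the binding/cleaving framing, in notation of my own rather than language from the conversation, is as a simple stability condition:

```latex
% A civilization coheres while binding energy exceeds cleaving energy:
\text{coheres} \iff E_{\mathrm{bind}} > E_{\mathrm{cleave}}
% which exposes exactly the two levers named above: raise E_bind
% (shared sense-making, shared identity) or lower E_cleave
% (less internal rivalry and polarization).
```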
I think naturally we saw that tribes had certain characteristics. 03:07:39.540 |
So there were a few different kind of forcing functions. 03:07:43.200 |
But we're talking about what size should it be, right? 03:07:51.940 |
like if we think about your body for a moment 03:07:54.780 |
as a self-organizing complex system that is multi-scaled, 03:08:15.300 |
to try to have all the tens of trillions of cells in it coordinate directly, 03:08:15.300 |
with no internal organization structure, right? 03:08:21.720 |
Just like a sea of protoplasm, it wouldn't work. 03:08:32.700 |
And so you have these layers of organization. 03:08:34.500 |
And then obviously the individual and a tribe are layers too. 03:08:45.260 |
I think the future of civilization will be similar, 03:09:00.540 |
with nested layers affecting each other, taking responsibility for each other. 03:09:05.100 |
And you can see that for a lot of human history, 03:09:13.440 |
Whereas right now there's kind of like the individual and the state, 03:09:22.520 |
and not that much in the way of intermediate structures 03:09:25.860 |
and not that much in the way of real personal dynamics, 03:09:30.900 |
And so I think that we have to have global governance, because some issues are global. 03:09:44.020 |
So it can't only be national or only local. 03:09:48.020 |
Everyone is scared of the idea of global governance 03:09:49.900 |
'cause we think about some top-down system of imposition 03:09:52.620 |
that now has no checks and balances on power. 03:09:56.340 |
So I'm not talking about that kind of global governance. 03:10:00.020 |
It's why I'm even using the word governance as a process 03:10:02.540 |
rather than government as an imposed phenomenon. 03:10:06.300 |
So I think we have to have global governance, 03:10:09.340 |
but I think we also have to have local governance. 03:10:11.540 |
And there has to be relationships between them 03:10:13.260 |
where there are both checks and balances. 03:10:24.260 |
And I'm more interested in governance at the level of cities than at the level of nation states. 03:10:26.300 |
'Cause I think nation states are largely fictitious things 03:10:30.380 |
that are defined by wars and agreements to stop wars 03:10:37.540 |
whereas cities are real things, where the proximity of certain things together creates value. 03:10:44.560 |
So you look at like Geoffrey West's work on scale, where companies get less productive per capita as they grow, 03:10:59.600 |
but the city actually gets increasing productivity per capita as it grows. 03:11:06.080 |
So there should be governance at the level of cities 03:11:11.940 |
Probably neighborhoods and smaller scales within it 03:11:18.920 |
So I don't think the future is one type of governance. 03:11:27.840 |
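Geoffrey West's published scaling result gives the city point some teeth: socioeconomic outputs grow superlinearly with population, roughly Y proportional to N to the power beta with beta around 1.15, while companies tend to scale sublinearly. The arithmetic below just works out what that exponent implies for a doubling; only the exponent comes from West's work.

```python
# Superlinear city scaling: Y = Y0 * N**beta, beta ~ 1.15 per West's work.
beta = 1.15

def output(n_people: float, y0: float = 1.0) -> float:
    return y0 * n_people ** beta

small, big = 1_000_000, 2_000_000        # double the population
gain_total = output(big) / output(small) # ~2.22x total output
gain_per_capita = gain_total / 2         # ~1.11x per person
print(round(gain_total, 2), round(gain_per_capita, 2))
# Doubling the city yields on the order of 10% more output per capita.
```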
the idea of a civilization is that we can figure out how to coordinate 03:11:34.480 |
and hopefully increase total productive capacity 03:11:41.140 |
so we all get more, better stuff and whatever. 03:11:44.820 |
But it's a coordination of our choice-making. 03:11:52.520 |
Civilizations can fail on the side of not having enough coordination 03:11:55.320 |
of choice-making, so they fail on the side of chaos, 03:12:04.100 |
and they overuse their resources or whatever. 03:12:06.800 |
Or it can fail on the side of trying to get order through imposition, becoming oppressive, 03:12:25.260 |
or because it can't innovate enough or something like that. 03:12:39.680 |
So the question, whether we're talking about a representative democracy 03:12:46.600 |
or the idea of an open society, participatory governance, 03:12:50.320 |
is: can we have order that is emergent rather than imposed? 03:13:03.400 |
And what would it take to have emergent order? 03:13:12.020 |
because if we look at what different nation states are able to do, 03:13:18.380 |
we see nation states that are more authoritarian, like China, which 03:13:30.960 |
has built high-speed rail, not just through its country, 03:13:34.220 |
and the US hasn't built any high-speed rail yet. 03:13:36.780 |
You can see that it brought 300 million people out of poverty 03:13:39.960 |
while we've had increasing economic inequality happening. 03:13:43.840 |
You can see that if there was a single country positioned 03:13:53.180 |
to start to go closed loop on fundamental things, it's that one. 03:14:04.020 |
that is like, oh, they're actually coordinating 03:14:28.940 |
That's the thing we were trying to get away from, 03:14:30.860 |
and that there would be checks and balances on power 03:14:34.420 |
But that also has created a negative second order effect, 03:14:39.840 |
Because somebody comes in who's got four years, 03:14:43.140 |
they don't do anything that doesn't create a return 03:14:45.000 |
within four years that will end up getting them elected, 03:14:51.240 |
so a long-term project to build high speed trains or the new kind of fusion energy 03:14:54.720 |
or whatever it is just doesn't get invested in. 03:15:02.720 |
Or they start something, then the other guy gets in and undoes it for four years. 03:15:19.800 |
But I would argue it has its own fail states eventually 03:15:24.800 |
and dystopic properties that are not the thing we want. 03:15:30.740 |
So the question is how to create a system that does long-term planning 03:15:33.600 |
without the negative effects of a monarch or dictator. 03:15:46.720 |
- Maybe not through the imposition of a single leader; 03:15:58.580 |
the technology in itself seems to maybe suggest an alternative, 03:16:04.900 |
which is: make primary the system, not the humans. 03:16:08.280 |
So the basic, the medium on which the democracy happens, 03:16:13.280 |
like a platform where people can make decisions, 03:16:21.220 |
do the choice-making, the coordination of the choice-making, 03:16:32.540 |
at the scale of the family, the extended family, 03:16:35.460 |
the city, the country, the continent, the whole world. 03:16:43.180 |
constantly changing based on the needs of the people. 03:17:05.020 |
By technology I mean like software network platforms. 03:17:23.340 |
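As a sketch of the multi-scale structure Lex is gesturing at, governance scopes could nest, with each question routed to the smallest scope containing everyone it affects. This is entirely hypothetical, a data-structure doodle rather than anything described in the conversation:

```python
# Hypothetical nested governance scopes: route a decision to the
# smallest scope that contains everyone affected by it.
from dataclasses import dataclass, field

@dataclass
class Scope:
    name: str
    members: set[str]
    children: list["Scope"] = field(default_factory=list)

    def smallest_containing(self, affected: set[str]) -> "Scope":
        for child in self.children:
            if affected <= child.members:
                return child.smallest_containing(affected)
        return self

neighborhood = Scope("neighborhood", {"ann", "bo"})
city = Scope("city", {"ann", "bo", "cy"}, [neighborhood])
world = Scope("world", {"ann", "bo", "cy", "dee"}, [city])

print(world.smallest_containing({"ann", "bo"}).name)   # neighborhood
print(world.smallest_containing({"ann", "dee"}).name)  # world
```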
are you also optimistic about the CEOs of such platforms? 03:17:43.460 |
- Technology elicits patterns of human behavior. 03:18:02.800 |
There are effects on the culture and the minds of people of the change in our tooling. 03:18:05.920 |
Marvin Harris's work called "Cultural Materialism" looks at this. 03:18:09.080 |
Obviously, Marshall McLuhan looked specifically at media. 03:18:29.580 |
Take the ox-drawn plow at the beginning of agriculture. 03:18:40.220 |
Well, the world changed a lot with that, right? 03:18:51.580 |
The story is, when the ox-drawn plow started to proliferate, 03:18:56.800 |
a society was able to start to actually cultivate grain at scale; 03:19:00.040 |
without it, you couldn't get enough grain for it to matter. 03:19:07.560 |
It was so much more productive that it became obligate and everybody used it. 03:19:13.640 |
And animism went away everywhere that it existed, 03:19:16.120 |
because you can't talk about the spirit of the buffalo 03:19:18.640 |
while beating the cow all day long to pull a plow. 03:19:22.000 |
So the moment that we do animal husbandry of that kind, 03:19:31.820 |
You went from women primarily using the digging stick 03:19:34.960 |
to do the horticulture or gathering before that, 03:19:37.760 |
men doing the hunting stuff to now men had to use the plow 03:19:40.240 |
because the upper body strength actually really mattered. 03:19:42.440 |
Women would have miscarriages when they would do it while pregnant. 03:19:45.160 |
So all the caloric supply started to come from men 03:19:57.360 |
That particular line of thought then also says gender roles shifted again with the tractor, 03:20:12.200 |
because the male upper body strength wasn't differential 03:20:15.560 |
once the internal combustion engine was much stronger. 03:20:20.620 |
So I don't think you can trace complex things to single causes perfectly. 03:20:29.240 |
And so the idea that technology is values-agnostic is silly. 03:20:36.640 |
Technologies code for patterns of behavior and the values rationalizing those patterns of behavior. 03:20:40.320 |
The plow also is the beginning of the Anthropocene, right? 03:20:43.320 |
It was the beginning of us changing the environment at scale, stepping out 03:20:51.040 |
of the web of life where we're just a part of it, et cetera. 03:21:01.300 |
But the question is, so it's not agnostic, but. 03:21:05.900 |
- So we have to look at what the psychological effects of a technology are; 03:21:05.900 |
it's not just doing the first order thing you intended. 03:21:13.640 |
It's doing like the effect on patriarchy and animism 03:21:22.840 |
and the beginning of empire and the class systems 03:21:30.640 |
which then became the capital model and like lots of things. 03:21:34.440 |
So we have to say, when we're looking at the tech, what values are being coded 03:21:34.440 |
into the way the tech is being built that are not obvious? 03:21:39.320 |
- Right, so you always have to consider externalities. 03:21:46.500 |
- And the externalities are not just physical, to the environment; 03:21:46.500 |
they're also to how the people are being conditioned 03:21:49.760 |
and how the relationality between them is being conditioned. 03:21:53.120 |
so I personally would rather be led by a plow 03:22:00.600 |
In creating an emergent government where people, 03:22:17.520 |
like there's a bunch of fine resolution layers 03:22:21.640 |
of abstraction of governance happening at all scales, 03:22:28.080 |
where no one person has power at any one time 03:22:35.840 |
I'm saying, isn't the alternative that's emergent, 03:22:40.840 |
empowered or made possible by the plow and the tractor, which, 03:23:01.280 |
to me at least, is just basic social interaction, 03:23:03.360 |
the mechanisms of humans transacting with each other? 03:23:07.200 |
So yes, it's not agnostic, definitely not agnostic, 03:23:16.200 |
but isn't that the way we achieve an emergent system? 03:23:29.080 |
- It's not that I haven't seen anything promising. 03:23:30.520 |
It's that to be on track requires understanding these dynamics much more deeply 03:23:34.240 |
than is currently happening, and it's possible. 03:23:47.920 |
You couldn't have had large nation states without transportation technologies, without the train, 03:23:51.000 |
without the communication tech that made it possible. 03:23:59.480 |
there's a relationship between them that is more recursive, 03:24:02.720 |
which is new physical technologies allow rulers 03:24:07.720 |
to rule with more power over larger distances, historically. 03:24:16.680 |
Some things are more responsible for that than others. 03:24:22.160 |
but the thing he ate for breakfast is less responsible 03:24:24.680 |
for the starvation of millions than the train. 03:24:30.160 |
And then the weapons of war are more responsible. 03:24:32.400 |
So some technology, let's not throw it all in the same bucket, 03:24:36.160 |
you're saying technology has a responsibility here, 03:24:48.880 |
The change of the behavior will also change the values 03:24:55.520 |
both make people who have different kinds of predispositions 03:25:06.300 |
It's kind of well understood that the printing press 03:25:09.360 |
and then early industrialism ended feudalism. 03:25:19.320 |
One pattern that we can look at is that whenever there is a step 03:25:22.520 |
function, a major leap in physical technology, the social tech has to change with it. 03:25:55.360 |
Obviously the nuke broke the previous nation state governance. 03:26:22.140 |
of what the market incentivizes exponential tech to do, 03:26:33.180 |
if we look at different types of social tech, 03:26:41.360 |
that democracy tried to do the emergent order thing. 03:27:11.880 |
it doesn't do it for all classes equally, et cetera. 03:27:14.420 |
But the idea of democracy is participatory governance. 03:27:19.420 |
And so you notice that the modern democracies emerged with certain prerequisites. 03:27:26.220 |
And specifically, because the idea that a lot of people, 03:27:31.380 |
a huge number of anonymous people who don't know each other, 03:27:38.000 |
can all work together to make collective decisions, 03:27:47.820 |
like it's a wild idea that that would even be possible. 03:27:54.460 |
It required believing that we could all do the philosophy of science together. 03:28:12.520 |
The emergent order is the order of the choices we all make. 03:28:23.240 |
And our choice-making is based on our sense-making and our meaning-making. 03:28:31.600 |
Our meaning-making is what do we care about, right? 03:28:34.900 |
What is it that we're trying to move the world in the direction of? 03:28:37.200 |
If you ultimately are trying to move the world 03:28:39.260 |
in a direction that is really, really different than mine, we'll have a hard time. 03:28:46.060 |
And if you think the world is a very different world, where some injustice 03:28:51.220 |
is rampant everywhere and one of the worst problems, or not a problem at all, 03:28:55.860 |
if you think climate change is almost existential, or not a problem at all, 03:29:00.220 |
we're gonna have a really hard time coordinating. 03:29:02.580 |
And so we have to be able to have shared sense-making 03:29:12.600 |
Okay, maybe I'm emphasizing a particular value you're not, 03:29:20.000 |
and I can see how it's affected by this thing. 03:29:28.480 |
A proposition crafted just to benefit these ones harms the ones it leaves out. 03:29:41.060 |
The reason a proposition, when we vote on it, gets half the votes almost all the time 03:29:45.920 |
is because it benefits some things and harms other things. 03:30:04.280 |
The proposition crafting and refinement process is where the synthesis should happen. 03:30:11.840 |
- But isn't that the humans creating that situation? 03:30:23.760 |
And the second is to create somehow systems that- 03:30:40.760 |
So democracy emerged as an enlightenment era idea that people could reason together 03:30:49.000 |
and come to understand what other people valued, 03:30:54.160 |
and could come up with a cooperative solution rather than just, 03:30:57.640 |
fuck you, we're gonna get our thing in war, right? 03:31:05.160 |
Like we could all agree on what the speed of sound is if we measured it. 03:31:19.280 |
And there's that founding father line: given a perfect government and a broken newspaper, 03:31:20.900 |
or a broken government and perfect newspaper, I wouldn't hesitate to take the perfect newspaper. 03:31:22.480 |
Because if the people understand what's going on, they can correct the government. 03:31:36.720 |
The number one aim of the government, he said, should be the comprehensive education of every citizen. 03:32:03.020 |
And he didn't say it's protecting the border from enemies; 03:32:11.000 |
it could do that as a military dictatorship quite effectively. 03:32:16.360 |
He didn't say keeping order; it could do that as a dictatorship, as a police state. 03:32:19.200 |
And so if the number one goal is anything other 03:32:23.520 |
than the comprehensive education of all the citizens, you don't keep a democracy. 03:32:28.240 |
You can see, so both education and the fourth estate matter. 03:32:36.080 |
The fourth estate is what's actually going on currently, 03:32:38.320 |
the news: do I have good unbiased information about it? 03:32:41.280 |
Those are both considered prerequisite institutions for democracy. 03:32:46.520 |
And then at the scale it was initially suggested here, it was participatory. 03:32:55.440 |
The first place anyone ever saw the proposition wasn't the ballot. 03:33:00.240 |
It was in the town hall, we all got to talk about it, 03:33:02.160 |
and the proposition could get crafted in real time 03:33:05.940 |
which is why there was that founding father statement about voting. 03:33:10.760 |
Voting fundamentally is polarizing the population; 03:33:27.920 |
a good synthesis finds what the polarization is trying to tend to, and tends to that better. 03:33:32.880 |
As the scale increased, we lost the ability to do that. 03:33:35.920 |
Now, as you mentioned, the internet could change that. 03:33:41.720 |
to the other one to see what the colony would do, 03:33:44.680 |
The point is that we stopped having this kind of developmental, 03:33:48.120 |
propositional development process when the town hall ended. 03:34:06.920 |
- The internet has those things, it just has a lot of other things. 03:34:11.640 |
There are few places that encourage synthesis of competing ideas 03:34:16.040 |
and sense-making, which is what we're talking about. 03:34:23.400 |
that perhaps are outcompeting it under current incentives, 03:34:26.460 |
perhaps has to do with capitalism and the market. 03:34:34.680 |
and they have problems, but places where you have 03:34:39.060 |
vetting of information towards collective building. 03:34:47.440 |
or our policing systems or our military systems or our-- 03:34:50.600 |
- First of all, I think a lot, but not enough. 03:34:53.000 |
I think that's something I told you offline yesterday 03:35:14.360 |
like knowledge, I think, can't help but lead to empathy. 03:35:28.640 |
Knowing how many people died in various wars, 03:35:32.680 |
when you have millions of people have that knowledge, 03:35:35.480 |
it's like, it's a little like slap in the face, 03:35:37.520 |
like, oh, like my boyfriend or girlfriend breaking up with me 03:35:48.000 |
And when a lot of people know that because of Wikipedia 03:35:51.920 |
or the effect, there are second-order effects of Wikipedia, 03:35:55.280 |
which is, it's not necessarily that people read Wikipedia, 03:35:58.680 |
it's like YouTubers who don't really know stuff that well 03:36:12.880 |
and they understand that, holy shit, a lot of, 03:36:16.840 |
there was such a thing as World War II and World War I, 03:36:19.600 |
okay, like they can at least like learn about it, 03:36:26.520 |
they can learn about all kinds of injustices in the world. 03:36:30.080 |
And that, I think, has a lot of effects on our, 03:36:53.360 |
So I think there's a huge positive effect of Wikipedia. 03:36:59.520 |
I'm a huge fan, but there are very few systems like it, 03:37:11.960 |
to really try to run the dialectical synthesis process 03:37:32.240 |
and say, what are all of the things that are getting worse? 03:37:35.320 |
And are any of them following an exponential curve 03:37:38.680 |
and how much worse, how quickly could that be? 03:37:40.980 |
And then do that fully without mitigating it. 03:37:50.320 |
in a way that Kurzweil or Diamandis or someone might do, 03:37:57.560 |
and say, are some of those things exponential? 03:38:00.280 |
And then try to hold all that at the same time. 03:38:10.520 |
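To make the exponential check concrete, here is a minimal worked identity (a standard doubling-time calculation, offered as an illustration rather than anything Daniel specifies): a quantity growing at a constant rate $r$ per year doubles every

$$ t_{\text{double}} = \frac{\ln 2}{r} $$

so a harm worsening at 7% per year is roughly twice as bad in about ten years ($\ln 2 / 0.07 \approx 9.9$) and around eight times as bad in thirty.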
things are getting worse on exponential curves 03:38:16.800 |
which I hold as the destabilization of the previous system. 03:38:22.720 |
or a collapse to a lower order are both possible. 03:38:25.520 |
And so I want my optimism not to be about my assessment. 03:38:31.880 |
I want my assessment to be just as fucking clear 03:38:35.440 |
I want my optimism to be what inspires the solution process 03:38:42.200 |
So I never want to apply optimism in the sense-making. 03:38:49.120 |
that the challenges are really well understood. 03:38:57.480 |
even if I don't know what they are, that are worth seeking. 03:39:00.600 |
There is some sense of optimism 03:39:16.940 |
because I then come back to how to make it better. 03:39:19.600 |
So just a relationship between optimism and pessimism 03:39:27.720 |
that Wikipedia is a pretty awesome example of a thing. 03:39:32.560 |
We can look at the places where it has limits 03:39:42.000 |
you can pay Wikipedia editors to edit more frequently 03:39:46.920 |
But you can also see where there's a lot of information 03:39:54.060 |
than everybody buying their own Encyclopedia Britannica 03:40:08.080 |
because Wikipedia is not a for-profit corporation. 03:40:11.400 |
It's tending to the information commons 03:40:16.960 |
other than tending to the information commons. 03:40:19.640 |
And I think the two masters issue is a tricky one 03:40:31.360 |
and I can't find synergistic satisfiers, which one? 03:40:34.360 |
And if I have a fiduciary responsibility to shareholder 03:40:37.840 |
profit maximization and what does that end up creating? 03:40:41.780 |
I think the ad model that Silicon Valley took, 03:41:01.620 |
but also the belief that information should be free 03:41:15.400 |
Some places actually, PayPal paying people money 03:41:18.600 |
to join the network because the value of the network 03:41:23.840 |
is proportional to the square of the total number of users. 03:41:26.600 |
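As a minimal sketch of that network-value point (assuming the standard Metcalfe's-law model, which is what the square-of-users phrasing refers to):

$$ V(n) = k\,n^2, \qquad \Delta V = k\big((n+1)^2 - n^2\big) = k(2n + 1) $$

so the marginal value of each additional user grows linearly with the size of the network, which is why subsidizing early sign-ups, as PayPal did, can pay for itself once the network gets large.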
So the ad model made sense of how do we make it free, 03:41:34.120 |
but not really thinking about what it would mean to, 03:41:52.500 |
is to do behavioral mod for them for advertisers. 03:42:10.800 |
and other information and communication platforms 03:42:20.920 |
sometimes specific information and behavioral information 03:42:25.160 |
than even a therapist or a doctor or a lawyer 03:42:29.920 |
or a priest might have in a different setting. 03:42:31.820 |
They basically are accessing privileged information. 03:42:42.200 |
if you are a principal and I'm an agent on your behalf, 03:42:47.200 |
I don't have a game theoretic relationship with you. 03:42:54.280 |
to try to sell you a used car or whatever the thing is. 03:43:04.680 |
that they didn't sign up for wanting the behavior 03:43:08.660 |
So I think this is an example of the physical tech 03:43:12.720 |
evolving in the context of the previous social tech 03:43:22.840 |
this evolved for fulfilling a particular agentic purpose. 03:43:36.640 |
it has a goal for me different than my goal for myself. 03:43:39.840 |
And I might wanna be on for a short period of time. 03:43:46.840 |
but where there should be a fiduciary contract. 03:43:58.080 |
to develop people's citizenry capacity, right? 03:44:03.080 |
To develop their personal health and wellbeing and habits, 03:44:38.200 |
at the nation state level of the world today, 03:44:40.580 |
the more top-down authoritarian nation states 03:44:44.000 |
are as the exponential tech started to emerge, 03:44:49.960 |
They were in a position for better long-term planning 03:44:55.300 |
And so the authoritarian states started applying 03:45:01.680 |
And that's everything from like an internet of things, 03:45:03.960 |
surveillance system, going into machine learning systems, 03:45:08.180 |
to the Sesame Credit system, to all those types of things. 03:45:16.200 |
Otherwise, within a nation state like the US, 03:45:22.800 |
the countries, the states are not directing the technology 03:45:29.980 |
They're saying, well, the corporations are doing that 03:45:32.500 |
and the state is doing the relatively little thing 03:45:34.820 |
it would do aligned with the previous corporate law 03:45:50.580 |
Amazon has a bigger percentage of market share 03:45:54.560 |
You get one big dog per vertical because of network effects, 03:46:00.700 |
that the previous antitrust law didn't even have a place for. 03:46:04.420 |
Anti-monopoly was only something that emerged 03:46:08.780 |
So what we see is the new exponential technology 03:46:13.740 |
is being directed by authoritarian nation states 03:46:18.200 |
and by corporations to make more powerful corporations. 03:46:23.020 |
when we think about the Scottish Enlightenment, 03:46:32.260 |
compared to what the biggest corporation today is. 03:46:35.020 |
So the asymmetry of it relative to people was tiny. 03:46:37.880 |
And the asymmetry now in terms of the total technology 03:46:48.380 |
And rather than there be demand for an authentic thing 03:46:55.740 |
as supply started to get way more coordinated and powerful 03:47:02.860 |
but you do have a coordination on the supply side, 03:47:08.020 |
It could make people want shit that they didn't want before 03:47:12.000 |
in a meaningful way, might increase addiction. 03:47:14.140 |
Addiction is a very good way to manufacture demand. 03:47:17.100 |
And so as soon as manufactured demand started 03:47:23.420 |
and you have to have it for status or whatever it is, 03:47:28.720 |
Now it's no longer a collective intelligence system 03:47:34.260 |
You were able to hijack the lower angels of our nature 03:47:44.140 |
And so we really also have to update our theory of markets 03:47:48.980 |
because behavioral econ showed that homo economicus, 03:47:54.500 |
but particularly at greater and greater scale 03:47:57.980 |
Voluntarism isn't a thing where if my company 03:48:04.260 |
'cause that's where all the fucking attention is. 03:48:08.300 |
but it's not really if there's a functional monopoly. 03:48:11.820 |
Same if I'm gonna sell on Amazon or things like that. 03:48:18.460 |
are becoming more powerful than nation states in some ways. 03:48:40.460 |
that are kind of a new feudalism, tech feudalism, 03:48:42.860 |
'cause it's not a democracy inside of a tech company 03:48:55.980 |
and authoritarian nation states controlling it. 03:49:00.060 |
And so I'm interested in the application of exponential tech 03:49:12.580 |
and direct the exponential tech in fundamentally healthy, 03:49:22.700 |
I think we can actually use the physical tech 03:49:24.420 |
to make better social tech, but it's not given that we do. 03:49:35.080 |
- I don't know if it's the road we wanna go down, 03:49:39.680 |
will create exactly the thing you're talking about, 03:49:42.200 |
which I feel like there's a lot of money to be made 03:49:44.560 |
in creating a social tech that creates a better citizen, 03:50:05.600 |
which manufactures demand, is not obviously inherently 03:50:17.400 |
like baby deer trying to figure out how to use the internet. 03:50:20.560 |
I feel like there's much more money to be made 03:50:23.320 |
with something that creates compersion and love, honestly. 03:50:35.360 |
I don't think we wanna really have that discussion, 03:50:39.040 |
but do you have some hope that that's the case? 03:50:41.440 |
And I guess if not, then how do we fix the system 03:50:48.960 |
- Like I said, every social tech worked for a while. 03:50:51.600 |
Tribalism worked well for 200,000 or 300,000 years. 03:50:57.960 |
The social technologies with which we organize 03:51:00.840 |
and coordinate our behavior have to keep evolving 03:51:23.560 |
It broke with scale in particular and a few other things. 03:51:28.160 |
So it needs to be updated in a really fundamental way. 03:51:35.560 |
that in some ways will obsolete money-making. 03:51:48.280 |
I'm gonna produce a good or a service that people want 03:51:53.080 |
so that people get access to that good or service. 03:51:55.160 |
That's the world of business, but that's not capitalism. 03:51:58.280 |
Capitalism is the management and allocation of capital. 03:52:03.000 |
Financial services, which was a tiny percentage 03:52:06.560 |
of the total market, has become a huge percentage 03:52:15.080 |
that I started to be able to invest that money 03:52:19.840 |
I start realizing I'm making more money on my money 03:52:23.600 |
than I'm making on producing the goods and services. 03:52:26.120 |
So I stop even paying attention to goods and services 03:52:28.240 |
and start paying attention to making money on money. 03:52:30.880 |
And how do I utilize capital to create more capital? 03:52:37.400 |
than a particular good or service that only some people want. 03:52:40.200 |
Capitalism: more capital ended up meaning more control. 03:52:58.200 |
So it meant increased agency and also increased control. 03:53:00.960 |
I think attentionalism is even more powerful. 03:53:10.680 |
where the people kind of always want to get away 03:53:18.160 |
was just more profitable than slavery, right? 03:53:29.360 |
This is a cynical take, but a meaningful take. 03:53:43.160 |
And yet where people still feel free in some meaningful way, 03:53:48.160 |
they're not feeling like they're gonna be punished 03:53:58.160 |
I can't put an agent on, so it feels like free. 03:54:01.120 |
And so if you want to affect people's behavior 03:54:12.480 |
But I think affecting their attention is even deeper 03:54:24.600 |
Facebook has done studies showing that, by changing the feed, 03:54:27.320 |
it can change beliefs, emotional dispositions, et cetera. 03:54:31.160 |
And so I think there's a way that the harvesting 03:54:35.920 |
and directing of attention is even a more powerful system 03:54:39.800 |
It is effective in capitalism to generate capital, 03:55:02.680 |
Towards what metrics of what a good civilization 03:55:10.800 |
I can answer all the things you're mentioning. 03:55:26.240 |
- Okay, so maybe the corporation has coordination 03:55:42.600 |
but maybe I could actually help all of the customers 03:55:45.840 |
to coordinate almost like a labor union or whatever 03:55:51.840 |
about the effects, the externalities on them. 03:55:59.160 |
their beings, their families, their relationships, 03:56:02.280 |
such that they will in group change their behavior. 03:56:08.760 |
one way of saying what you're saying, I think, 03:56:13.040 |
is that you think that you can rescue homo economicus 03:56:32.960 |
largely whether it's good for them or not in the long term. 03:56:36.080 |
And the large asymmetric corporation can run propaganda 03:56:40.320 |
and narrative warfare that hits people's status buttons 03:56:42.600 |
and their limbic hijacks and lots of other things 03:56:55.140 |
So you're saying, I think we can recover homo economicus. 03:56:59.800 |
- And not just through a single mechanism technology, 03:57:20.840 |
- So interestingly, I actually agree with you 03:57:35.800 |
is the application of tech here, broadcast tech, 03:57:42.280 |
'cause different people need to be spoken to differently, 03:57:48.160 |
but nonetheless, we'll start with broadcast tech. 03:57:50.520 |
- Plants the first seed and then the word of mouth 03:57:56.200 |
and then it like lands, a catapult or whatever, 03:58:03.480 |
through all kinds of tech, including Facebook. 03:58:06.240 |
- So let's come back to the fundamental thing. 03:58:08.160 |
The fundamental thing is we want a kind of order 03:58:11.280 |
at various scales from the conflicting parts of ourself, 03:58:15.680 |
actually having more harmony than they might have 03:58:19.880 |
to family, extended family, local, all the way up to global. 03:58:32.600 |
We want that to be emergent rather than imposed 03:58:36.360 |
or rather than we want fundamentally different things 03:58:40.240 |
where warfare of some kind becomes the only solution. 03:58:43.640 |
Emergent order requires us in our choice-making, 03:58:47.480 |
requires us being able to have related sense-making 03:58:53.480 |
Can we apply digital technologies and exponential tech 03:58:59.720 |
in general to try to increase the capacity to do that? 03:59:04.920 |
the social tech that we'd all get together and talk, 03:59:13.080 |
Can we build new, better versions of those types of things? 03:59:22.720 |
comprehensive education in the science of government, 03:59:24.960 |
which includes being able to understand things 03:59:35.880 |
to be able to support increased comprehensive education 03:59:40.120 |
of the people and maybe comprehensive informedness? 03:59:57.400 |
Yeah, fundamentally, that's the thing that has to happen. 04:00:00.720 |
The exponential tech gives us a novel problem landscape 04:00:07.000 |
And so that required this whole Bretton Woods world. 04:00:10.480 |
The exponential tech gives us a novel problem landscape. 04:00:24.600 |
We haven't kept any of the new categories of tech 04:00:32.200 |
So we need fundamentally better problem-solving processes, 04:00:35.120 |
a market or a state as a problem-solving process. 04:00:41.440 |
Right now, speed is one of the other big things, 04:00:43.360 |
is that by the time we regulated DDT out of existence 04:00:52.720 |
But as Elon has made the point, that won't work for AI. 04:00:59.460 |
that we have an autopoietic AI that's a problem, 04:01:11.560 |
or you have tech that just has exponentially fast effects, 04:01:17.680 |
It can't come after the effects have happened, 04:01:23.120 |
'cause the negative effects could be too big too quickly. 04:01:25.400 |
So we basically need new problem-solving processes 04:01:27.800 |
that do better at being able to internalize externalities, 04:01:42.520 |
and being able to participate in their development, 04:01:50.920 |
where people start understanding the new power 04:01:59.720 |
current governance structures that we care about 04:02:04.760 |
but could also be redirected towards more protopian purposes 04:02:14.840 |
where we can do participatory governance at scale and in time? 04:02:32.280 |
So again, Elon, you'd be right that love is the answer. 04:02:36.120 |
Let me take you back from the scale of societies 04:02:50.080 |
We have various flavors of relationships with our fathers. 04:02:55.920 |
What have you learned about life from your dad? 04:03:04.040 |
and see a lot of individual things that I learned 04:03:08.800 |
If I was to kind of summarize at a high level, 04:03:16.360 |
very, very unusually positive set of experiences. 04:03:25.120 |
and he was committed to work from home to be available 04:03:27.640 |
and prioritize fathering in a really deep way. 04:03:30.340 |
And as a super gifted, super loving, very unique man, 04:03:41.820 |
that were part of what crafted the unique brilliance 04:03:45.680 |
And I say that because I think I had some unusual gifts 04:03:52.100 |
And I think it's useful for everybody to know 04:04:17.960 |
towards some virtuous purpose beyond the self. 04:04:20.800 |
And he both said that in a million different ways, 04:04:34.060 |
He made sure that the way that we put the wires 04:04:37.600 |
Like that the height of the holes was similar, 04:04:42.600 |
that we twisted the wires in a particular way. 04:04:48.720 |
it's worth doing well and excellence is its own reward 04:04:53.600 |
he'd say, see the job, do the job, stay out of the misery. 04:05:01.160 |
there's an empowerment and a nobility together. 04:05:10.560 |
- Is there ways you think you could have been a better son? 04:05:19.520 |
- Let me first say, just as a bit of a criticism, 04:05:33.760 |
Exactly, I agree with your dad on that point. 04:05:45.480 |
is there ways you could have been a better son? 04:05:48.440 |
- Maybe next time on your show, I'll wear a suit and tie. 04:06:12.600 |
- I can answer the question later in life, not early. 04:06:15.740 |
I had just a huge amount of respect and reverence 04:06:45.480 |
And he had a lot of kind of non-standard model beliefs 04:06:50.480 |
about things, whether early kind of ancient civilizations 04:07:23.120 |
where I kind of had the attitude that Dawkins 04:07:28.120 |
or Christopher Hitchens has that can kind of be 04:07:39.560 |
applying their reductionist philosophy of science 04:07:43.200 |
to everything and kind of brutally dismissive. 04:07:49.500 |
Not to say anything about those men and their path, 04:07:56.840 |
And so during that time, I was more dismissive 04:08:06.760 |
I got to correct that later, apologize for it. 04:08:08.960 |
But that was the first thought that came to mind. 04:08:17.960 |
making love, watching a sunset, listening to music, 04:08:22.280 |
feeling the breeze, that I would sign up for this whole life 04:08:27.280 |
and all of its pains just to experience this exact moment. 04:08:35.600 |
It's the most important and real truth I know, 04:08:40.120 |
that experience itself is infinitely meaningful 04:08:45.720 |
And seen clearly, even the suffering is filled with beauty. 04:08:50.760 |
I've experienced countless lives worth of moments 04:08:54.280 |
worthy of life, such an unreasonable fortune. 04:09:00.560 |
A few words of gratitude from you, beautifully written. 04:09:14.880 |
in your darker moments, you can go to, to relive, 04:09:19.540 |
to remind yourself that the whole ride is worthwhile? 04:09:39.920 |
I mean, I feel fortunate to have had exposure to practices 04:09:50.020 |
So I can take responsibility for seeing things in that way 04:09:53.200 |
and not taking for granted really wonderful things, 04:09:57.800 |
to the philosophies that even gave me that possibility. 04:10:06.760 |
it's with every person who I really love when we're talking, 04:10:14.160 |
feel overwhelmed by how lucky I am to get to know them. 04:10:17.600 |
And like there's never been someone like them 04:10:19.940 |
in all of history and there never will be again. 04:10:22.000 |
And they might be gone tomorrow, I might be gone tomorrow. 04:10:25.360 |
And when you take in the uniqueness of that fully 04:10:28.320 |
and the beauty of it, it's overwhelmingly beautiful. 04:10:30.920 |
And I remember the first time I did a big dose of mushrooms 04:10:43.780 |
And I was just crying with overwhelm at how beautiful 04:10:47.420 |
And it was a tree outside the front of my house 04:10:52.200 |
And it wasn't the dose of mushrooms where I was hallucinating 04:10:59.000 |
Like the tree still looked like, if I had to describe it, 04:11:01.040 |
say it's green and it has leaves, looks like this, 04:11:12.020 |
And I realized I had no thoughts taking me anywhere else. 04:11:15.960 |
- Like what it seemed like the mushrooms were doing 04:11:22.840 |
And then I'm like, fuck, when I get off these mushrooms, 04:11:26.280 |
because it's always that beautiful and I just miss it. 04:11:37.320 |
where they'll see life and how incredible it is. 04:11:41.880 |
- It's funny that I had this exact same experience 04:12:10.520 |
And yeah, even just that moment alone is worth living for. 04:12:23.360 |
That's a, I have to ask, what are we looking at? 04:12:26.960 |
- When I went to go get a smoothie before coming here, 04:12:30.480 |
I got you a keto smoothie that you didn't want 04:12:41.240 |
of ginger turmeric cayenne juice of some kind. 04:12:47.400 |
- I didn't necessarily plan it for being on the show. 04:12:52.880 |
- I think we shall toast like heroes, Daniel. 04:13:03.320 |
this unique moment that we get to share together. 04:13:07.000 |
- I'm very grateful to be here in this moment with you. 04:13:09.040 |
And yeah, I'm grateful that you invited me here. 04:13:12.680 |
We met for the first time and I will never be the same 04:13:31.160 |
Daniel, this is one of the best conversations I've ever had. 04:13:40.960 |
I wanna say in terms of what you're mentioning about, 04:13:48.760 |
and the optimism that wants to look at the issues, 04:13:52.960 |
but wants to look at how this increased technological power 04:13:57.800 |
And that even thinking about the broadcast of like, 04:14:01.760 |
can I help people understand the issues better 04:14:05.520 |
Like fundamentally you're oriented like Wikipedia. 04:14:09.400 |
What I see is you really trying to tend to the information commons 04:14:13.960 |
without another agentic interest distorting it. 04:14:16.800 |
And for you to be able to get guys like Lee Smolin 04:14:22.120 |
and Roger Penrose and like the greatest thinkers 04:14:29.040 |
And most people would never be exposed to them 04:14:30.760 |
and talk about it in a way that people can understand. 04:14:51.640 |
Check them out in the description to support this podcast. 04:14:59.600 |
"I know not with what weapons World War III will be fought, 04:15:03.320 |
"but World War IV will be fought with sticks and stones." 04:15:08.040 |
Thank you for listening and hope to see you next time.