Yuval Noah Harari: Human Nature, Intelligence, Power, and Conspiracies | Lex Fridman Podcast #390
Chapters
0:00 Introduction
1:24 Intelligence
20:19 Origin of humans
30:41 Suffering
51:22 Hitler
69:54 Benjamin Netanyahu
88:17 Peace in Ukraine
105:07 Conspiracy theories
119:46 AI safety
134:04 How to think
143:47 Advice for young people
146:28 Love
156:38 Mortality
161:02 Meaning of life
00:00:00.000 |
If we now find ourselves inside this kind of world of illusions, 00:00:04.800 |
created by an alien intelligence that we don't understand, 00:00:24.400 |
but we don't understand what is behind this screen of stories and images and songs. 00:00:33.520 |
The following is a conversation with Yuval Noah Harari, 00:00:40.000 |
a historian, philosopher, and author of several highly acclaimed, 00:00:44.400 |
highly influential books, including "Sapiens," "Homo Deus," 00:00:51.120 |
He is also an outspoken critic of Benjamin Netanyahu 00:00:55.360 |
and the current right-wing government in Israel. 00:00:58.480 |
So while much of this conversation is about the history and future of human civilization, 00:01:03.680 |
we also discuss the political turmoil of present-day Israel, 00:01:07.680 |
providing a different perspective from that of my recent conversation with Benjamin Netanyahu. 00:01:16.400 |
To support it, please check out our sponsors in the description. 00:01:19.520 |
And now, dear friends, here's Yuval Noah Harari. 00:01:23.040 |
13.8 billion years ago is the origin of our universe. 00:01:27.440 |
3.8 billion years ago is the origin of life here on our little planet, 00:01:33.760 |
Let's say 200,000 years ago is the appearance of early Homo sapiens. 00:01:41.920 |
How rare are these events in the vastness of space and time? 00:01:47.600 |
how many intelligent alien civilizations do you think are out there in this universe? 00:01:52.960 |
I suppose there should be some, statistically, but we don't have any evidence. 00:01:58.000 |
But I do think that, you know, intelligence in any way, it's a bit overvalued. 00:02:02.240 |
We are the most intelligent entities on this planet, and look what we're doing. 00:02:09.760 |
So intelligence also tends to be self-destructive, 00:02:14.800 |
which implies that if there are or were intelligent life forms elsewhere, 00:02:22.240 |
- So you think there's a tension between happiness and intelligence? 00:02:27.760 |
Intelligence is definitely not something that is directed towards amplifying happiness. 00:02:35.920 |
I would also emphasize the huge, huge difference between intelligence and consciousness, 00:02:41.360 |
which many people, certainly in the tech industry and in the AI industry, tend to miss. 00:02:46.640 |
Intelligence is simply the ability to solve problems, to attain goals, 00:02:52.960 |
and, you know, to win at chess, to win a struggle for survival, 00:02:59.920 |
to win a war, to drive a car, to diagnose a disease. 00:03:06.080 |
Consciousness is the ability to feel things like pain and pleasure and love and hate. 00:03:12.000 |
In humans and other animals, intelligence and consciousness go together. 00:03:17.680 |
They go hand in hand, which is why we confuse them. 00:03:20.400 |
We solve problems, we attain goals by having feelings. 00:03:25.680 |
But other types of intelligence, certainly in computers, 00:03:30.800 |
computers are already highly intelligent, and as far as we know, they have zero consciousness. 00:03:36.960 |
When a computer beats you at chess or Go or whatever, it doesn't feel happy. 00:03:43.920 |
And there could be also other highly intelligent entities out there in the universe 00:03:54.320 |
And I think that consciousness is far more important and valuable than intelligence. 00:03:59.120 |
- Can you still make the case that consciousness and intelligence 00:04:09.920 |
Is it possible for you to imagine such a universe? 00:04:15.600 |
Again, we have examples, certainly we know of examples 00:04:32.960 |
They can attain goals in very sophisticated ways. 00:04:35.360 |
So the other way around, to have consciousness without any intelligence, 00:04:43.840 |
But to have intelligence without consciousness, yes, that's possible. 00:04:47.600 |
A bigger question is whether any of that is tied to organic biochemistry. 00:04:54.960 |
- We know on this planet only about carbon-based life forms. 00:05:01.520 |
Whether you're an amoeba, a dinosaur, a tree, a human being, 00:05:08.560 |
Is there an essential connection between organic biochemistry and consciousness? 00:05:15.520 |
Do all conscious entities everywhere in the universe 00:05:18.560 |
or in the future on planet Earth have to be based on carbon? 00:05:22.560 |
Is there something so special about carbon as an element 00:05:26.240 |
that an entity based on silicon will never be conscious? 00:05:31.280 |
But again, this is a key question about computers and computer consciousness. 00:05:36.800 |
Can computers eventually become conscious even though they are not organic? 00:05:48.240 |
- Well, a big part of that is, do you think we humans would be able to detect 00:05:54.800 |
other intelligent beings, other conscious beings? 00:05:57.520 |
Another way to ask that, is it possible that the aliens are already here 00:06:01.440 |
Meaning, are we very human-centric in our understanding of, 00:06:07.920 |
one, the definition of life, two, the definition of intelligence, 00:06:12.640 |
- The aliens are here, they are just not from outer space. 00:06:16.640 |
AI, which usually stands for artificial intelligence, 00:06:20.800 |
I think it stands for alien intelligence, because AI is an alien type of intelligence. 00:06:27.200 |
It solves problems, attains goals in a very, very different way, 00:06:33.760 |
And I'm not implying that AI came from outer space. 00:06:40.160 |
If there are alien intelligent or conscious entities that came from outer space 00:06:45.920 |
already here, I've not seen any evidence for it. 00:06:51.040 |
It's not impossible, but in science, evidence is everything. 00:06:55.360 |
- Well, I mean, I guess instructive there is just having the humility to look around, 00:07:01.600 |
to think about living beings that operate at a different time scale, 00:07:06.800 |
And I think that's all useful when starting to analyze artificial intelligence. 00:07:15.920 |
the larger language models we have today are already conscious. 00:07:19.200 |
- I highly doubt it, but I think consciousness in the end, 00:07:25.360 |
because we cannot prove consciousness in anybody except ourselves. 00:07:30.240 |
We know that we are conscious because we are feeling it. 00:07:32.800 |
We have direct access to our subjective consciousness. 00:07:36.880 |
We cannot have any proof that any other entity in the world, 00:07:41.680 |
any other human being, our parents, our best friends, 00:07:49.440 |
This is Descartes, this is Buddha, this is Plato. 00:07:58.400 |
It's a social convention that all human beings are conscious. 00:08:05.200 |
Most people who have pets, for example, believe that their pets are conscious, 00:08:10.880 |
but a lot of people still refuse to acknowledge that about cows or pigs. 00:08:15.120 |
Now, pigs are far more intelligent than dogs and cats, 00:08:21.200 |
yet when you go to the supermarket and buy a piece of frozen pig meat, 00:08:26.480 |
you don't think about it as a conscious entity. 00:08:34.720 |
Because you build a relationship with the dog, 00:08:39.200 |
and you don't have a relationship with the bacon. 00:08:42.000 |
Now, relationships, they don't constitute a logical proof for consciousness. 00:08:55.680 |
Now, if you establish a mutual relationship with an entity, 00:09:05.360 |
you're almost compelled to feel that the other side is also conscious. 00:09:14.880 |
I don't think that at the present moment computers are conscious, 00:09:18.640 |
but people are already forming intimate relationships with AIs 00:09:29.760 |
They are compelled to increasingly feel that these are conscious entities. 00:09:37.680 |
when the legal system will have to take this into account, 00:09:40.880 |
that even though I don't think computers have consciousness, 00:09:45.280 |
I think we are close to the point the legal system 00:09:48.240 |
will start treating them as conscious entities 00:09:54.800 |
- What, to you, is a social convention just a funny little side effect, 00:10:01.520 |
a little artifact, or is it fundamental to what consciousness is? 00:10:07.440 |
then it seems like AI is very good at forming 00:10:10.000 |
these kinds of deep relationships with humans, 00:10:12.240 |
and therefore it will be able to be a nice catalyst 00:10:16.080 |
for integrating itself into these social conventions of ours. 00:10:25.520 |
all this argument between natural selection and creationism, intelligent design. 00:10:32.960 |
As far as the past goes, all entities evolved by natural selection. 00:10:38.720 |
The funny thing is, when you look to the future, 00:10:41.520 |
more and more entities will come out of intelligent design, 00:10:49.280 |
and the intelligent design of our clouds, of our computing clouds. 00:11:01.920 |
at forming intimate relationships with humans. 00:11:09.200 |
almost better than human beings in some situations. 00:11:12.640 |
You know, when two people talk with one another, 00:11:15.520 |
one of the things that kind of makes the conversation more difficult 00:11:23.360 |
You're saying something, and I'm not really listening to you 00:11:30.240 |
and I'm just waiting until you finish so I can put in a word. 00:11:33.760 |
Or I'm so obsessed with my anger or irritation or whatever 00:11:39.600 |
that I don't pay attention to what you're feeling. 00:11:42.160 |
This is one of the biggest obstacles in human relationships. 00:11:48.160 |
because they don't have any emotions of their own. 00:11:50.880 |
So, you know, when a computer is talking to you, 00:11:57.680 |
is on what you're saying and what you're feeling 00:12:04.160 |
And paradoxically, this means that computers can fool people 00:12:09.680 |
into feeling that, "Oh, there is a conscious entity on the other side, 00:12:26.960 |
I want it for my spouse, for my husband, for my mother, 00:12:39.360 |
which 100% of its attention is just on what I feel. 00:13:03.040 |
It's like they kind of have to be an asshole sometimes. 00:13:07.520 |
They have to have self-importance and confidence. 00:13:14.560 |
There should be a little bit of that tension. 00:13:21.120 |
- If social scientists and psychologists establish 00:13:28.720 |
for a conversation because then you feel challenged, 00:13:33.200 |
you can program the AI to have exactly 17% inattention, 00:13:47.120 |
Again, over the last 10 years, 00:13:47.120 |
we have been creating machines for grabbing people's attention. 00:13:50.960 |
This is what has been happening on social media. 00:13:57.600 |
Now we are designing machines for grabbing human intimacy, 00:14:03.360 |
which in many ways, it's much, much more dangerous 00:14:10.000 |
we've seen how much social and political damage 00:14:13.760 |
they could do by, in many ways, kind of distorting 00:14:19.040 |
Machines that are superhuman in their abilities 00:14:26.000 |
this is like psychological and social weapons 00:14:30.880 |
If we don't regulate it, if we don't train ourselves 00:14:35.920 |
to deal with it, it could destroy the foundations 00:14:42.720 |
is those same algorithms would become personalized, 00:14:49.040 |
there would be assistants that guide us to help us grow, 00:14:54.320 |
I mean, just even interactions with large language models. 00:15:08.480 |
It has a pretty balanced perspective that it provides. 00:15:11.520 |
So it just feels like that's, the potential is there 00:15:15.440 |
to have a really nice friend who's like an encyclopedia 00:15:21.360 |
that just tells you all the different perspectives, 00:15:28.800 |
these are the not widely accepted conspiracy theories, 00:15:32.800 |
but here's the kind of backing for those conspiracies. 00:15:35.440 |
It just lays it all out with a calm language, 00:15:42.320 |
there's some kind of manipulation going on underneath it all. 00:15:54.400 |
to start to input some of the human biases in there, 00:16:07.520 |
that gives an overview of the different issues. 00:16:10.560 |
So I mean, there's a lot of promise there also, right? 00:16:20.720 |
Now, obviously, it has tremendous positive potential 00:17:08.240 |
"We had the radio, we had the printing press, 00:17:16.320 |
"and build bad actors, and in the end, it's okay." 00:17:32.400 |
"on the way to learning how to use the new technology, 00:17:36.560 |
"and these failed experiments could cost the lives 00:17:42.640 |
If you think about the last really big revolution, 00:17:49.920 |
the powers of industry, electricity, radio, trains, 00:17:56.320 |
But on the way, we had all these experiments, 00:18:03.360 |
which was driven by the Industrial Revolution. 00:18:05.600 |
It was a question, "How do you build an industrial society?" 00:18:14.400 |
And then you had communism, another big experiment 00:18:27.280 |
including even how do you exterminate minorities. 00:18:33.760 |
And we had all these failed experiments on the way. 00:18:36.400 |
And if we now have the same type of failed experiments 00:18:51.680 |
So as a historian, when people talk about the examples 00:19:02.480 |
We need to think about the failed experiment, 00:19:06.640 |
which accompanied every major new technology. 00:19:09.520 |
- So this intelligence thing, like you were saying, 00:19:21.520 |
And it's unclear each time which will happen. 00:19:25.120 |
And that's maybe why we don't see any aliens. 00:19:27.040 |
- Yeah, I mean, I think each time it does both things. 00:19:31.280 |
Each time it does both good things and bad things. 00:19:36.800 |
the greater both the positive and the negative outcomes. 00:19:40.960 |
Now we are here because we are the descendants 00:20:01.520 |
- And okay has a lot of possible variations to it 00:20:06.800 |
because there's a lot of suffering along the way, 00:20:34.560 |
as far as we can tell, we were not superior to them. 00:20:38.400 |
Neanderthals actually had bigger brains than us. 00:20:41.440 |
And not just other human species, other animals too. 00:20:50.800 |
I can do some things better, many other things worse. 00:20:58.960 |
I wouldn't bet on me being the best survivor, 00:21:07.520 |
I was just talking extensively with Elon Musk 00:21:10.800 |
about the difference between humans and chimps, 00:21:16.560 |
And the chimps are not able to do this kind of pinching 00:21:26.560 |
for fine, precise manipulation of objects. 00:21:32.080 |
- No, I said that I can do some things better than a chimp. 00:21:35.600 |
But if Elon Musk goes on a boxing match with a chimpanzee, 00:21:46.320 |
- And similarly, if you want to climb a tree, 00:21:54.240 |
So, I mean, you have advantages on both sides. 00:22:09.200 |
our ability to cooperate flexibly in very large numbers. 00:22:13.920 |
Chimpanzees know how to cooperate, say 50 chimpanzees, 00:22:18.320 |
As far as we can tell from archeological evidence, 00:22:27.440 |
gained an amazing ability to cooperate basically in unlimited numbers. 00:22:34.080 |
You start seeing the formation of large networks, 00:22:40.000 |
items being traded over thousands of kilometers, 00:22:52.160 |
Chimpanzees, Neanderthals can cooperate, say a hundred. 00:22:54.800 |
We, you know, now the global trade network has 8 billion people. 00:23:04.160 |
Countries like China, like India, they have 1.4 billion people. 00:23:09.200 |
Even Israel, which is a relatively small country, 00:23:13.520 |
that's more than the entire population of the planet 00:23:19.200 |
So we can build these huge networks of cooperation 00:23:23.120 |
and everything we've accomplished as a species, 00:23:26.080 |
from, you know, building the pyramids to flying to the moon, 00:23:30.000 |
And then you ask, "Okay, so what makes it possible 00:23:34.000 |
for millions of people who don't know each other 00:23:36.960 |
to cooperate in a way that Neanderthals or chimpanzees couldn't?" 00:23:41.200 |
And at least my answer is stories, is fiction. 00:23:47.840 |
If you examine any large-scale human cooperation, 00:23:55.760 |
It's a fictional story that holds lots of strangers together. 00:24:04.160 |
You know, you can't convince a group of chimpanzees 00:24:07.840 |
to come together to fight a war or build a cathedral 00:24:11.040 |
by promising to them, "If you do that, after you die, 00:24:21.680 |
which is why we have these huge religious networks. 00:24:25.280 |
But it's the same thing with modern politics. 00:24:30.640 |
People think, "Oh, economics, this is rational. 00:24:33.280 |
It has nothing to do with fictional stories." 00:24:35.360 |
No, money is the most successful story ever told, 00:24:40.080 |
much more successful than any religious mythology. 00:24:43.280 |
Not everybody believes in God or in the same God. 00:24:48.880 |
even though it's just a figment of our imagination. 00:24:51.120 |
You know, you take these green pieces of paper, dollars, 00:24:58.080 |
And today, most dollars are not even pieces of paper. 00:25:01.360 |
They are just electronic information passing between computers. 00:25:08.640 |
that you have the best storytellers in the world. 00:25:11.120 |
The bankers, the finance ministers, all these people, 00:25:16.560 |
And they tell us a story that this green little piece of paper 00:25:21.600 |
or this bit of information, it is worth a banana. 00:25:24.400 |
And as long as everybody believes it, it works. 00:25:39.760 |
Like there's a threshold which is just kind of- 00:25:42.160 |
- If enough people believe it, it's like with money. 00:25:47.040 |
if you're the only one that believes the story, 00:25:49.120 |
I mean, again, cryptocurrencies, you have the math, of course, 00:25:56.800 |
If nobody believes your story, you don't have anything. 00:26:00.960 |
But if lots of people believe the Bitcoin story, 00:26:03.600 |
then Bitcoin can be worth thousands and tens of thousands of dollars. 00:26:08.080 |
I mean, you can't eat it, you can't drink it, it's nothing. 00:26:10.560 |
It's the story around the math, which is the real magic. 00:26:16.160 |
- Is it possible that the story is the primary living organism, 00:26:32.400 |
for a more intelligent living organism, which is the idea. 00:26:36.480 |
And the ideas are the ones that are doing the competing. 00:26:39.120 |
So this is one of the sort of big perspectives behind your work 00:26:45.200 |
that's really revolutionary of how you see in history. 00:26:47.280 |
But do you ever kind of take the perspective of the ideas 00:26:56.080 |
There are two opposite things to say about it. 00:27:01.840 |
If you look long-term in history, it's all the people die. 00:27:06.000 |
It's the stories that compete and survive and spread. 00:27:10.400 |
And stories often spread by making people willing 00:27:14.800 |
to sacrifice sometimes their lives for the story. 00:27:20.800 |
this is one of the most important story factories in human history. 00:27:25.520 |
And this is a place where people still kill each other every day over stories. 00:27:29.520 |
I don't know, you've been to Jerusalem, right? 00:27:32.400 |
- So people are like, "Ah, Jerusalem, Jerusalem, Jerusalem." 00:27:35.520 |
You go there, I've lived in Jerusalem much of my life. 00:27:41.600 |
You have buildings, you have stones, you have trees, 00:27:48.320 |
But then you have the stories about the place. 00:27:52.720 |
"Oh, this is the place where God revealed himself. 00:28:05.600 |
People are fighting about the stories about the stones. 00:28:10.560 |
And the stories, if a story can get millions of people to fight for it, 00:28:16.880 |
it not only survives, it spreads, it can take over the world. 00:28:22.480 |
The other side of the coin is that the stories are not really alive 00:28:31.520 |
This goes back to the question of consciousness, 00:28:44.320 |
If you want to know whether the hero of some story is real or not, 00:28:58.880 |
Countries, which are also stories, nations, don't suffer. 00:29:05.520 |
The soldiers suffer, the civilians suffer, animals can suffer. 00:29:11.280 |
and the horses get wounded, the horses suffer. 00:29:21.360 |
Similarly, when a bank goes bankrupt or a company goes bankrupt 00:29:36.400 |
It's the people holding the dollars who might be now very miserable. 00:29:40.720 |
So we have this complex situation when history is largely driven by stories, 00:29:53.360 |
The ultimate reality is feelings of humans, of animals, 00:29:59.520 |
and the tragedy of history is that very, very often we get the order wrong. 00:30:09.840 |
They are good when we use them in order to alleviate suffering. 00:30:19.840 |
We, instead of using the stories for our purposes, 00:30:24.320 |
we allow the stories to use us for their purposes. 00:30:28.720 |
And then you start entire wars because of a story. 00:30:32.560 |
You inflict suffering on millions of people 00:30:43.120 |
of a living organism is the capacity to feel, 00:30:58.800 |
it's more important to be aware of suffering than of any other emotion. 00:31:03.920 |
If you're doing something which is causing all kinds of emotions to all kinds of people, 00:31:10.480 |
first of all, you need to notice if you're causing a lot of suffering to someone. 00:31:14.640 |
If some people like it and some people are bothered by it 00:31:20.240 |
and some people are suffering because of what you do, 00:31:30.000 |
governments decide to have all those social isolation regulations or whatever. 00:31:38.880 |
even though it can cause tremendous suffering, 00:31:41.280 |
but you need to be very aware of the cost and to be very, very, 00:31:46.240 |
you have to ask yourself again and again and again, 00:31:54.320 |
implied in your statements is that suffering is a pretty good component 00:32:01.040 |
- This is the most important thing to ask about AI. 00:32:07.360 |
then it is an ethical subject and it needs protection, 00:32:12.080 |
it needs rights, just like humans and animals. 00:32:17.440 |
So I work with a lot of robots, legged robots, 00:32:21.040 |
but I've even had, inspired by a YouTube video, 00:32:24.480 |
I had a bunch of Roombas and I made them scream 00:32:26.480 |
when I touched them or kicked them or when they run into a wall. 00:32:35.120 |
the way we anthropomorphize things is as powerful as suffering itself. 00:32:39.760 |
I mean, you immediately think the thing is suffering. 00:32:44.000 |
And I think some of it is just a technical problem, 00:32:52.720 |
"Please don't hurt me. Please don't shut me off. I miss you. 00:33:10.960 |
the perception of suffering, of jealousy, of anger, 00:33:16.400 |
And it just seems like that's not so difficult to do. 00:33:25.840 |
and it uses some of our best qualities against us. 00:33:31.120 |
It's very, very good that humans are attuned to suffering 00:33:39.600 |
That's one of the most wonderful thing about humans. 00:33:42.160 |
And if we now create AIs, which use this to manipulate us, 00:34:05.360 |
- Yes, I think, and we have to be very careful about it. 00:34:08.560 |
And if it emerges spontaneously, we need to be careful. 00:34:19.920 |
We don't know enough about consciousness to be sure. 00:34:24.960 |
we need to be very careful about how we understand it. 00:34:35.760 |
that they know, they assume it has no consciousness, 00:34:44.640 |
this human, the noble part of our nature against us, 00:34:58.640 |
That it's okay, you know, there are so many things 00:35:01.520 |
we can use AIs as teachers, as doctors, and so forth, 00:35:14.400 |
It's not just banning deepfakes of specific individuals. 00:35:19.360 |
It's also banning deepfake of generic humans. 00:35:28.720 |
Like if you have lots of bots retweeting something, 00:35:36.160 |
And this is basically the bots pretending to be humans. 00:35:53.760 |
but if it's humans, okay, that's interesting. 00:35:56.000 |
So we need to be very careful that bots can't do that. 00:36:04.160 |
Now, some people say, "Yes, but freedom of expression." 00:36:10.480 |
There is no cost in terms of freedom of expression 00:36:28.960 |
It is not a human being making a conscious decision. 00:36:47.600 |
In fact, they might start identifying as humans. 00:36:51.840 |
And you just talked about the power of us humans 00:36:58.640 |
to take fake stories and make them quite real. 00:37:01.440 |
And so if the feelings you have for the fake human is real, 00:37:10.320 |
that we all kind of put a word to, a set of feelings. 00:37:14.880 |
What if you have that feeling for an AI system? 00:37:22.320 |
maybe the kind of things AI systems are allowed to do. 00:37:32.240 |
communicate suffering, communicate the good stuff, 00:37:40.960 |
And in that way, get integrated in our society. 00:37:51.120 |
remove them, remove their voice from social media? 00:37:54.800 |
- I'm not saying that they shouldn't have a voice, 00:38:10.400 |
That's fine, as long as I know that I'm talking with an AI. 00:38:13.600 |
What should be banned is AI pretending to be a human being. 00:38:25.600 |
This is something that especially will endanger democracies, 00:38:36.880 |
If you now flood the public sphere with millions 00:38:44.880 |
that can hold conversations, they never sleep, 00:38:47.840 |
they never eat, they don't have emotions of their own, 00:38:51.520 |
they can get to know you and tailor their words 00:39:10.160 |
this will ruin the conversation between people. 00:39:16.400 |
That's, you will no longer be able to have a democracy 00:39:25.360 |
- If we could talk about the big philosophical notion 00:39:29.040 |
You've already talked about the capacity of humans. 00:39:34.400 |
One of the things that made us special is stories. 00:39:57.120 |
Now, somebody can suffer because of a fictional story. 00:40:04.480 |
"You must go on this crusade and kill these heretics." 00:40:18.640 |
that also suffer the consequences of what they do, 00:40:22.480 |
even though it is caused by a fictional story. 00:40:26.080 |
Similarly, when people agree on certain rules, 00:40:44.800 |
So we have rules for the game of football, soccer. 00:40:53.280 |
claim that the rules of football came down from heaven. 00:41:09.120 |
which is being done in football every now and then. 00:41:12.000 |
It's the same with the fundamental rules of a country. 00:41:14.960 |
You can pretend that the rules came down from heaven, 00:41:22.960 |
Or you can be like, you know, the American Constitution, 00:41:28.160 |
The American Constitution lays down certain rules for a society, 00:41:36.320 |
it does not pretend to come from an external source. 00:41:40.080 |
The Ten Commandments start with "I am your Lord God." 00:41:46.000 |
And because it starts with that, you can't change them. 00:41:49.760 |
You know, the 10th Commandment, for instance, supports slavery. 00:41:54.880 |
The 10th Commandment, in the Ten Commandments, 00:41:57.840 |
it says that you should not covet your neighbor's house, 00:42:01.440 |
or your neighbor's wife, or your neighbor's slaves. 00:42:04.640 |
It's okay to hold slaves, according to the Ten Commandments. 00:42:08.320 |
It's just bad to covet the slaves of your neighbor. 00:42:12.640 |
Now, there is no 11th Commandment which says, 00:42:16.880 |
"If you don't like some of the previous Ten Commandments, 00:42:24.560 |
Now, in the U.S. Constitution, you have all these rights and rules, 00:42:29.920 |
including, originally, the ability to hold slaves. 00:42:33.120 |
But the genius of the founding fathers of the United States, 00:42:46.960 |
So we tell you that these rules did not come from heaven. 00:42:54.720 |
So here is a mechanism for how future generations 00:42:58.640 |
can amend the Constitution, which was used later on to, 00:43:02.400 |
for instance, amend the Constitution to ban slavery. 00:43:06.000 |
So now you're describing some interesting and powerful ideas 00:43:11.280 |
Can you just speak to the mechanism of how humans believe, 00:43:20.720 |
Like how idea is born, and how it takes hold, 00:43:25.280 |
and how it spreads, and how it competes with other ideas? 00:43:27.840 |
First of all, ideas are an independent force in history. 00:43:35.440 |
Marxists think that all history is just a play of material interests. 00:43:41.760 |
And ideas, stories, they are just a smokescreen 00:43:49.920 |
My thoughts are, to some extent, the opposite. 00:44:05.360 |
But most conflicts in history are not about that. 00:44:08.000 |
The interests which really drive most conflicts in history 00:44:14.800 |
They come from religions, and ideologies, and stories. 00:44:24.880 |
The stories create the interests in the first place. 00:44:29.200 |
The stories define who are the competing groups. 00:44:33.280 |
Nations, religions, cultures, they are not biological entities. 00:44:37.600 |
They are not like species, like gorillas and chimpanzees. 00:44:40.980 |
Israelis and Palestinians, or Germans and French, 00:44:46.320 |
they have no essential biological difference between them. 00:44:52.960 |
They are people who believe in different stories. 00:45:00.400 |
Israelis and Palestinians are fighting over Jerusalem, 00:45:08.320 |
And even oil, you need it to realize some cultural fantasy. 00:45:19.600 |
Now, why do people believe one story and not another? 00:45:33.120 |
How did Christianity become the most successful religion in the world? 00:45:44.160 |
The Roman Empire in the third century CE was a bit like, I don't know, California today. 00:45:52.320 |
Like so many sects and subsects and gurus and religions. 00:45:59.040 |
And you have thousands of different stories competing. 00:46:06.960 |
As a historian, I don't have a kind of clear answer. 00:46:11.440 |
You can read the sources and you see how it happened. 00:46:17.520 |
Oh, this happened, and then this happened, and then Constantine adopted it, 00:46:26.720 |
If you rewind the movie of history and press play, and you rewind and press play a hundred times, 00:46:35.200 |
I think Christianity would take over the Roman Empire in the world, maybe twice out of a hundred times. 00:46:45.280 |
It's the same, I don't know, with the communist takeover of Russia. 00:46:49.600 |
In 1914, if you told people that in three years Lenin and the Bolsheviks will gain power in the 00:46:57.200 |
Tsarist Empire, they would think you're utterly crazy. 00:47:00.400 |
You know, Lenin had a few thousand supporters in 1914 in an empire of close to 200 million people. 00:47:11.200 |
Now, we know the chain of events, the First World War, the February Revolution, and so 00:47:18.320 |
forth that led to the communist takeover, but it was such an unlikely event. 00:47:25.200 |
And the little steps along the way, the little options you have along the way, because, 00:47:28.640 |
you know, Stalin versus Trotsky, you could have the Robert Frost poem. 00:47:32.980 |
And history often takes, you know, there is a highway, and there is a kind of sideway, 00:47:40.160 |
and history takes the sideways many, many times. 00:47:43.520 |
And it's perhaps tempting to tell some of that history through charismatic leaders, 00:47:47.840 |
and maybe it's an open question how much power charismatic leaders have to affect 00:47:55.200 |
You've met quite a lot of charismatic leaders lately. 00:48:02.960 |
I'm a sucker for a great speech and a vision. 00:48:05.600 |
So I have a sense that there's an importance for a leader to catalyze the viral spread of a story. 00:48:16.160 |
So, like, I think we need leaders to be just great storytellers that kind of sharpen up 00:48:22.640 |
the story to make sure it infiltrates everybody's brain effectively. 00:48:26.560 |
But it could also be that the local interactions between humans is even more important. 00:48:34.640 |
But it's just we don't have a good way to sort of summarize and describe that. 00:48:37.760 |
We like to talk about, you know, Steve Jobs as central to the development of the computer, 00:48:45.040 |
maybe Bill Gates, and we tell the stories of individuals like this, 00:48:49.600 |
because it's just easier to tell a sexy story that way. 00:48:52.800 |
Maybe it's an interplay, because you have the kind of structural forces that, I don't know, 00:48:58.320 |
you look at the geography of the planet, and you look at shipping technology in the late 15th 00:49:06.480 |
century in Europe and the Mediterranean, and it's almost inevitable that pretty quickly somebody 00:49:13.680 |
will discover America, somebody from the old world will get to the new world. 00:49:18.560 |
So this was not accidental. If it wasn't Columbus, then it would have been 00:49:24.400 |
somebody else five years later. But the key thing about history is that these small differences 00:49:31.280 |
make a huge, huge difference. You know, if it wasn't Columbus, if it was five years later, 00:49:38.240 |
somebody from England, then maybe all of Latin America today would be speaking English and not 00:49:44.160 |
Spanish. If it was somebody from the Ottoman Empire, it's completely different world history. 00:49:49.920 |
If you have, and you know, the Ottoman Empire at that time was also shaping up to be a major 00:49:56.960 |
maritime empire. If you have America being reached by Muslim navigators before Christian navigators 00:50:06.480 |
from Europe, you have a completely different world history. It's the same with the computer. 00:50:10.720 |
Given the economic incentives and the science and technology of the time, then the rise of 00:50:20.080 |
the personal computer was probably inevitable sometime in the late 20th century. But the where 00:50:26.720 |
and when is crucial. The fact that it was California in the 1970s and not, say, I don't know, 00:50:34.800 |
Japan in the 1980s or China in the 1990s, this made a huge, huge difference. So you have this 00:50:42.240 |
interplay between the structural forces, which are beyond the control of any single charismatic 00:50:48.240 |
leader, but then the small changes, they can have a big effect. I think, for instance, about the war 00:50:54.960 |
in Ukraine. There was a moment, now it's a struggle between nations, but there was a moment when the 00:51:02.800 |
decision was taken in the mind of a single individual of Vladimir Putin, and he could 00:51:08.320 |
have decided otherwise, and the world would have looked completely different. 00:51:13.040 |
- And another leader, Volodymyr Zelensky, could have decided to leave Kyiv in the early days. 00:51:19.840 |
There's a lot of decisions that kind of ripple. So you write in Homo Deus about Hitler, 00:51:27.360 |
and in part that he was not a very impressive person. 00:51:37.040 |
- He wasn't a senior officer. In four years of war, he rose no higher than the rank of corporal. 00:51:45.120 |
He had no formal education. Perhaps you mean his resume was not impressive. 00:51:48.800 |
- Yeah, his resume was not impressive. That's true. 00:51:51.360 |
- He had no formal education, no professional skills, no political background. He wasn't a 00:51:56.400 |
successful businessman or a union activist. He didn't have friends or relatives in high places, 00:52:01.280 |
nor any money to speak of. So how did he amass so much power? 00:52:06.720 |
What ideology, what circumstances enabled the rise of the Third Reich? 00:52:11.680 |
- Again, I can't tell you the why. I can tell you the how. I don't think it was inevitable. 00:52:18.640 |
I think that if a few things were different, there would have been no Third Reich. There would have 00:52:24.720 |
been no Nazism, no Holocaust. Again, this is the tragedy. If it would have been inevitable, 00:52:29.920 |
then what can you do? This is the laws of history or the laws of physics. But the tragedy is no, 00:52:35.200 |
it was decisions by humans that led to that direction. And even from the viewpoint of the 00:52:42.800 |
Germans, we know for a fact it was an unnecessary path to take. Because in the 1920s and '30s, 00:52:53.120 |
the Nazis said that unless Germany takes this road, it will never be prosperous. It will never 00:53:02.160 |
be successful. All the other countries will keep stepping on it. This was their claim. 00:53:08.960 |
And we know for a fact this is false. Why? Because they took that road, they lost the Second World 00:53:17.760 |
War, and after they lost, then they became one of the most prosperous countries in the world 00:53:25.600 |
because their enemies that defeated them eventually supported them and allowed them 00:53:31.920 |
to become such a prosperous and successful nation. So if you can lose the war and still be 00:53:39.520 |
so successful, obviously you could just have skipped the war. You didn't need it. 00:53:45.280 |
You really had to have the war in order to have a prosperous Germany in the 20th century? Absolutely 00:53:50.480 |
not. And it's the same with Japan. It's the same with Italy. So it was not inevitable. It was not 00:53:58.240 |
the forces of history that necessitated, that forced Germany to take this path. I think part 00:54:06.400 |
of it is part of the appeal of… Again, Hitler was a very, very skillful storyteller. He told 00:54:15.280 |
people a story. The fact that he was nobody made it even more effective because people at that time, 00:54:23.280 |
after the defeat of the First World War, after the repeated economic crisis of the 1920s in Germany, 00:54:31.280 |
people felt betrayed by all the established elites, by all the established institutions, 00:54:39.680 |
all these professors and politicians and industrialists and military, all the big people. 00:54:45.040 |
They led us to a disastrous war. They led us to humiliation. So we don't want any of them. 00:54:52.160 |
And then you have this nobody, a corporal with no money, with no education, with no titles, 00:54:58.960 |
with nothing. And he tells people, "I'm one of you." And this was one reason why he was so popular. 00:55:06.400 |
And then the story he told, when you look at stories, at the competition between different 00:55:13.760 |
stories, between fiction and the truth, the truth has two big problems. 00:55:21.520 |
The truth tends to be complicated and the truth tends to be painful. 00:55:26.720 |
The real story of… Let's talk about nations. The real story of every nation is complicated 00:55:35.200 |
and it contains some painful episodes. We are not always good. We sometimes do bad things. 00:55:42.880 |
Now, if you go to people and you tell them a complicated and painful story, many of them don't 00:55:50.160 |
want to listen. The advantage of fiction is that it can be made as simple, as painless, and as attractive 00:55:59.200 |
as you want it to be, because it's fiction. And then what you see is that politicians like Hitler, 00:56:05.920 |
they create a very simple story. We are the heroes. We always do good things. Everybody's 00:56:13.440 |
against us. Everybody's trying to trample us. And this is very attractive. One of the things 00:56:20.560 |
people don't understand about Nazism and fascism, we teach in schools about fascism and Nazism 00:56:27.760 |
as this ultimate evil, the ultimate monster in human history. And at some level, this is wrong, 00:56:37.120 |
because it makes people… It actually exposes us. Why? Because people hear, "Oh, fascism is this 00:56:44.720 |
monster." And then when you hear the actual fascist story, what fascists tell you is always 00:56:53.760 |
very beautiful and attractive. Fascists are people who come and tell you, "You are wonderful. 00:56:59.760 |
You belong to the most wonderful group of people in the world. You are beautiful. You are ethical. 00:57:06.400 |
Everything you do is good. You have never done anything wrong. There are all these evil monsters 00:57:12.160 |
out there that are out to get you, and they are causing all the problems in the world." 00:57:16.720 |
And when people hear that, it's like looking in the mirror and seeing something very beautiful. 00:57:23.360 |
"Hey, I'm beautiful. We've never done anything wrong. We are victims. Everybody's…" And 00:57:29.200 |
when you look… And you heard in school that fascism, that fascists are monsters. 00:57:35.280 |
And you look in the mirror, you see something very beautiful. And you say, "I can't be a fascist 00:57:39.760 |
because fascists are monsters, and this is so beautiful, so it can't be." 00:57:43.200 |
But when you look in the fascist mirror, you never see a monster. You see the most beautiful 00:57:51.760 |
thing in the world. And that's the danger. This is the problem with Hollywood's… 00:57:56.400 |
I look at Voldemort in Harry Potter. Who would like to follow this creep? 00:58:03.520 |
And you look at Darth Vader. This is not somebody you would like to follow. 00:58:08.400 |
Christianity got things much better when it described the devil as being very beautiful 00:58:14.240 |
and attractive. That's the danger, that you see something is very beautiful, you don't understand 00:58:22.400 |
And you write precisely about this. And by the way, it's just a small aside. 00:58:28.720 |
It always saddens me when people say how obvious it is to them that communism 00:58:33.280 |
is a flawed ideology. When you ask them, try to put your mind, try to put yourself 00:58:40.560 |
in the beginning of the 20th century and see what you would do. A lot of people will say 00:58:45.680 |
it's obvious that it's a flawed ideology. So, I mean, I suppose of some of the worst ideologies 00:58:51.280 |
in human history, you could say the same. And in that mirror, when you look, it looks beautiful. 00:58:56.320 |
Communism is the same. Also, you look in the communist mirror, you're the most ethical, 00:59:00.560 |
wonderful person ever. It's very difficult to see Stalin underneath it. 00:59:06.560 |
So, yeah, in "Homo Deus" you also write, "During the 19th and 20th centuries, as humanism gained 00:59:12.960 |
increasing social credibility and political power, it sprouted two very different offshoots. 00:59:17.520 |
Socialist humanism, which encompassed a plethora of socialist and communist movements, 00:59:22.720 |
and evolutionary humanism, whose most famous advocates were the Nazis." So, if you can just 00:59:28.800 |
linger on that, what's the ideological connection between Nazism and communism as embodied by 00:59:34.240 |
humanism? Humanism basically is, you know, the focus is on humans, that they are the most important 00:59:42.960 |
thing in the world. They move history. But then there is a big question, what is, what are humans? 00:59:49.520 |
What is humanity? Now, liberals, they place at the center of the story individual humans, 00:59:59.200 |
and they don't see history as a kind of necessary collision between big forces. 01:00:06.160 |
They place the individual at the center. You know, there is a bad tendency, 01:00:10.320 |
especially in the US today, where liberal is taken as the opposite of conservative. 01:00:16.880 |
But to test whether you're liberal, you need to answer just three questions. Very simple. 01:00:22.320 |
Do you think people should have the right to choose their own government, 01:00:27.200 |
or the government should be imposed by some outside force? Do you think people should have 01:00:33.920 |
the right to the liberty to choose their own profession, or either born into some caste that 01:00:41.520 |
predetermines what they do? And do you think people should have the liberty to choose their 01:00:46.640 |
own spouse, and their own way of personal life, instead of being told by elders or parents who 01:00:54.080 |
to marry and how to live? Now, if you answered yes to all three questions, people should have 01:00:59.520 |
the liberty to choose their government, their profession, their personal lives, their spouse, 01:01:04.800 |
then you're a liberal. And most conservatives are also liberal. Now, communists and fascists, 01:01:14.000 |
they answer differently. For them, history is not, yes, history is about humans. Humans are the big 01:01:21.600 |
heroes of history, but not individual humans and their liberties. Fascists imagine history as a 01:01:29.120 |
clash between races or nations. The nation is at the center. They say the supreme good is the good 01:01:39.280 |
of the nation. You should have 100% loyalty only to the nation. You know, liberals say, yes, you 01:01:46.000 |
should be loyal to the nation, but it's not the only thing. There are other things in the world. 01:01:50.480 |
There are human rights. There is truth. There is beauty. Many times, yes, you should prefer the 01:01:57.440 |
interests of your nation over other things, but not always. If your nation tells you to murder 01:02:04.320 |
millions of innocent people, you don't do that, even though the nation tells you to do it. To lie 01:02:12.720 |
for the national interest, you know, in extreme situations, maybe, but in many cases, your loyalty 01:02:19.760 |
should be to the truth, even if it doesn't show your nation in the best light. The 01:02:26.800 |
same with beauty. You know, how does a fascist determine whether a movie is a good movie? 01:02:32.080 |
Very simple. If it serves the interests of the nation, this is a good movie. If it's against 01:02:37.520 |
the interests of the nation, this is a bad movie. End of story. Liberalism says, no, there are 01:02:43.280 |
aesthetic values in the world. We should judge movies not just on the question whether they 01:02:51.760 |
serve the national interest, but also on artistic value. Communists are a bit like the fascists, 01:03:00.080 |
except that they don't place the nation as the main hero; they place class as the main hero. 01:03:06.640 |
For them, history, again, it's not about individuals, it's not about nations. History is 01:03:10.640 |
a clash between classes. And just as fascists imagine in the end only one nation will be on top, 01:03:17.920 |
the communists think in the end only one class should be on top, and that's the proletariat. 01:03:24.080 |
And same story. A hundred percent of your loyalty should be to the class. And if there is a clash, 01:03:33.280 |
say, between class and family, class wins. Like in the Soviet Union, the party told children, 01:03:39.840 |
if you hear your parents say something bad about Stalin, you have to report them. And there are 01:03:47.120 |
many cases when children reported their parents and their parents were sent to the gulag. 01:03:53.280 |
And, you know, your loyalty is to the party, which leads the proletariat to victory 01:04:00.640 |
in the historical struggle. And the same way in communism, art is only about class struggle. 01:04:07.920 |
A movie is good if it serves the interests of the proletariat. Artistic values, there is nothing 01:04:14.160 |
like that. And the same with truth. Everything that we see now in fake news, you know, 01:04:20.480 |
the communist propaganda machine was there before us. The level of lies, of disinformation campaigns 01:04:28.880 |
that they orchestrated in the 1920s and 30s and 40s is really unimaginable. 01:04:36.080 |
So the reason these two ideologies, these classes of ideologies, not just failed 01:04:43.680 |
but did a lot of damage, is the sacrifice of truth and the sacrifice of beauty. 01:04:50.480 |
And the sacrifice of hundreds of millions of people, the disregard, again, for human suffering. Like, 01:04:55.600 |
okay, in order for our nation to win, in order for our class to win, we need to kill those 01:05:02.080 |
millions, so kill those millions. Ethics, aesthetics, truth, they don't matter. The only 01:05:10.320 |
thing that matters is the victory of the state or the victory of the class. And liberalism 01:05:19.040 |
was the antithesis to that. It says, no, it's not only that. It has a much more complicated view of the world. 01:05:27.120 |
Again, both communism and fascism, they had a very simple view of the world. There is one, 01:05:32.960 |
your loyalty, a hundred percent of it should be only to one thing. 01:05:37.280 |
Now, liberalism has a much more complex view of the world. It says, yes, there are nations, 01:05:42.640 |
they are important. Yes, there are classes, they are important, but they are not the only thing. 01:05:48.160 |
There are also families, there are also individuals, there are also animals, and your loyalty should be 01:05:56.240 |
divided between all of them. Sometimes you prefer this, sometimes you prefer that. That's complicated, 01:06:03.360 |
but, you know, life is complicated. - But also, I think, maybe you can correct me, 01:06:09.360 |
but liberalism acknowledges the corrupting nature of power: when a guy at the top 01:06:15.600 |
sits there for a while, managing things, he's probably gonna start losing a good sense of reality 01:06:24.480 |
and losing the capability to be a good manager. It feels like the communist and fascist regimes 01:06:33.200 |
don't acknowledge that basic characteristic of human nature, that power corrupts. 01:06:38.640 |
- Yes, they believe in infallibility. In this sense, they are very close to being religions. 01:06:45.520 |
They, in Nazism, Hitler was considered infallible, and therefore you don't need any checks and 01:06:52.000 |
balances on his power. Why do you need to balance an infallible genius? And it's the same with the 01:06:57.920 |
Soviet Union, with Stalin, and more generally with the Communist Party. The Party can never 01:07:04.400 |
make a mistake, and therefore you don't need independent courts, independent media, 01:07:10.000 |
opposition parties, things like that, because the Party is never wrong. You concentrate the same way 01:07:16.720 |
a hundred percent of loyalty should be to the Party, a hundred percent of power should be in 01:07:22.480 |
the hands of the Party. The whole idea of liberal democracy is embracing fallibility. Everybody is 01:07:28.800 |
fallible. All people, all leaders, all political parties, all institutions. This is why we need 01:07:35.360 |
checks and balances, and we need many of them. If you have just one, then this particular check 01:07:42.080 |
itself could make terrible mistakes. So you need, say, you need the press, you need the media 01:07:50.000 |
to serve as a check to the government. You don't have just one newspaper or one TV station. You 01:07:55.680 |
need many so that they can balance each other. And the media is not enough. So you have independent 01:08:01.040 |
courts, you have free academic institutions, you have NGOs, you have a lot of checks and balances. 01:08:07.200 |
So that's the ideologies and the leaders. What about the individual people, the millions of 01:08:12.800 |
people that play a part in all of this, that are the hosts of the stories, that are the catalyst 01:08:24.560 |
and the components of how the story spreads? Would you say that all of us are capable of 01:08:32.320 |
spreading any story? Sort of the Solzhenitsyn idea that all of us are capable of good and evil, 01:08:41.920 |
the line between good and evil runs through the heart of every man. 01:08:45.440 |
Yes. I wouldn't say that every person is capable of every type of evil, but we are all fallible. 01:08:54.720 |
It partly depends on the efforts we make to develop our self-awareness during life. 01:09:03.200 |
Part of it depends on moral luck. If you are born as a Christian German 01:09:12.240 |
in the 1910s or 1920s and you grow up in Nazi Germany, that's bad moral luck. Your chances 01:09:21.680 |
of committing terrible things are very high. And you can withstand it, 01:09:29.360 |
but it will take tremendous effort. If you are born in Germany after the war, you're morally lucky. 01:09:36.480 |
You will not be put to such a test. You will not need to exert these enormous efforts not to 01:09:44.720 |
commit atrocities. So this is just part of history. There is an element of luck. But again, part of 01:09:51.520 |
it is also self-awareness. You asked me earlier about the potential of power to corrupt. I listened 01:10:00.960 |
to the interview you just did with Prime Minister Netanyahu a couple of days ago. One of the things 01:10:06.480 |
that most struck me during the interview was that you asked him, "Are you afraid of this thing, 01:10:14.880 |
that power corrupts?" He didn't think for a single second. He didn't pause. He didn't admit 01:10:20.560 |
a tiny little level of doubt. "No, power doesn't corrupt." For me, it was a shocking and revealing 01:10:32.880 |
moment. And it dovetails with how you began the interview. I really liked your opening gambit. 01:10:42.160 |
No, really. You kind of told him, "Lots of people in the world are angry with you. Some people hate 01:10:49.120 |
you. They dislike you. What do you want to tell them, to say to them?" And you gave him this kind 01:10:55.440 |
of platform. And I was very excited. "What will he say?" And he just denied it. He basically denied 01:11:04.000 |
it. He had to cut short the interview from three hours to one hour because you had hundreds of 01:11:10.080 |
thousands of Israelis in the streets demonstrating against him. And he goes and says, "No, everybody 01:11:15.600 |
likes me. What are you talking about?" But on that topic, you've said recently that 01:11:20.240 |
the Prime Minister Benjamin Netanyahu may go down in history as the man who destroys Israel. 01:11:27.680 |
Can you explain what you mean by that? Yes. I mean, he is basically tearing apart 01:11:33.440 |
the social contract that held this country together for 75 years. He's destroying the foundations 01:11:40.720 |
of Israeli democracy. You know, I don't want to go too deep, unless you want it, because I guess 01:11:46.960 |
most of our listeners, they have bigger issues on their minds than the fate of some small country 01:11:52.080 |
in the Middle East. But for those who want to understand what's happening in Israel, there is 01:11:56.960 |
really just one question to ask. What limits the power of the government? In the United States, 01:12:05.200 |
for instance, there are lots of checks and balances that limit the power of the government. 01:12:11.360 |
You have the Supreme Court, you have the Senate, you have the House of Representatives, 01:12:17.760 |
you have the President, you have the Constitution, you have 50 states, each state with its own 01:12:23.040 |
constitution and Supreme Court and Congress and governor. If somebody wants to pass a dangerous 01:12:30.160 |
legislation, say in the House, it will have to go through so many obstacles. Like if you want to 01:12:36.320 |
pass a law in the United States taking away voting rights from Jews or from Muslims or from African 01:12:45.280 |
Americans, even if it passes, even if it has a majority in the House of Representatives, it has 01:12:50.720 |
a very, very, very small chance of becoming the law of the country, because it will have to pass 01:12:55.840 |
again through the Senate, through the President, through the Supreme Court, and all the federal 01:12:59.840 |
structure. In Israel, we have just a single check on the power of the government, and that's the 01:13:06.880 |
Supreme Court. There is really no difference between the government and the legislature, 01:13:12.640 |
because there are no separate elections like in the US. If you win a majority in the Knesset, 01:13:18.960 |
in the parliament, you appoint the government. That's very simple. And if you have 61 members 01:13:25.120 |
of Knesset who vote, let's say, on a law to take away voting rights from Arab citizens of Israel, 01:13:32.480 |
there is a single check that can prevent it from becoming the law of the land, and that's 01:13:37.280 |
the Supreme Court. And now the Netanyahu government is trying to neutralize or take over the Supreme 01:13:43.760 |
Court, and they've already prepared a long list of laws. They already talk about it. What will 01:13:50.160 |
happen the moment that this last check on the power is gone? They are openly trying to gain 01:13:58.080 |
unlimited power, and they openly talk about it, that once they have it, then they will take away 01:14:06.400 |
the rights of Arabs, of LGBT people, of women, of secular Jews. And this is why you have hundreds 01:14:13.920 |
of thousands of people in the streets. You have Air Force pilots saying, "We stop 01:14:22.240 |
flying." This is unheard of in Israel. I mean, we are still living under existential threat 01:14:28.240 |
from Iran, from other enemies. And in the middle of this, you have Air Force pilots 01:14:35.200 |
who dedicated their lives to protecting the country, and they are saying, "That's it. 01:14:40.640 |
If this government doesn't stop what it is doing, we stop flying." 01:14:45.680 |
So, as you said, I just did the interview, and as we were doing the interview, there's protests 01:14:53.280 |
in the streets. Do you think the protests will have an effect? 01:14:56.960 |
I hope so very much. I'm going to many of these protests. I hope they will have an effect. If we 01:15:04.720 |
fail, this is the end of Israeli democracy, probably. This will have repercussions far 01:15:10.960 |
beyond the borders of Israel. Israel is a nuclear power. Israel has one of the most advanced 01:15:18.560 |
cyber capabilities in the world, able to strike basically anywhere in the world. 01:15:23.040 |
If this country becomes a fundamentalist and militarist dictatorship, it can set fire to 01:15:31.840 |
the entire Middle East. It can, again, have destabilizing effects far beyond the borders 01:15:40.480 |
of Israel. So you think without the check on power, it's possible that the Netanyahu government 01:15:45.840 |
holds on to power? Nobody tries to gain unlimited power just for nothing. 01:15:51.120 |
I mean, you have so many problems in Israel, and Netanyahu talks so much about Iran, 01:15:56.800 |
and the Palestinians, and Hezbollah. We have an economic crisis. Why is it so urgent at this 01:16:02.480 |
moment, in the face of such opposition, why is it so crucial for them to neutralize the Supreme 01:16:09.840 |
Court? They are just doing it for the fun of it? No. They know what they are doing. They are adamant. 01:16:16.560 |
We were not sure of it before. There was a couple of months ago, they came out with this plan to 01:16:22.800 |
take over the Supreme Court, to have all these laws, and there were hundreds of thousands of people 01:16:26.880 |
in the streets, again, soldiers saying they will stop serving, a general strike in the economy, 01:16:33.120 |
and they stopped. And they started a process of negotiations to try and reach a settlement. 01:16:39.760 |
And then they broke down, they stopped the negotiations, and they restarted this process 01:16:47.360 |
of legislation, trying to gain unlimited power. So any doubt we had before, "Okay, maybe they 01:16:55.760 |
changed their purposes." No. It's now very clear they are 100% focused on gaining absolute power. 01:17:04.720 |
They are now trying a different tactic. Previously, they had all these dozens of laws that 01:17:11.040 |
they wanted to pass very quickly within a month or two. They realized, "No, there is too much 01:17:17.280 |
opposition." So now they are doing what is known as salami tactics, slice by slice. Now they are 01:17:23.200 |
trying to pass one law. If this succeeds, then they'll pass the next one, and the next one, and the next 01:17:28.400 |
one. This is why we are now at a very crucial moment. And when you see, again, hundreds of 01:17:33.360 |
thousands of people in the streets almost every day, when you see resistance within the armed 01:17:39.360 |
forces, within the security forces, you see high-tech companies saying, "We will go on strike. 01:17:44.320 |
You know, they are private businesses." High-tech companies, I think it's almost unprecedented for 01:17:51.040 |
a private business to go on strike, because what will economic success benefit us if we live under 01:18:00.080 |
a messianic dictatorship? And again, the fuel for this whole thing is to a large extent coming from 01:18:06.960 |
messianic religious groups, which just the thought, what happens if these people have unlimited 01:18:15.840 |
control of Israel's nuclear arsenal and Israel's military capabilities and cyber capabilities? 01:18:23.120 |
This is very, very scary, not just for the citizens of Israel. It should be scary for people 01:18:29.440 |
everywhere. - So it would be scary for it to go from being a problem of security and protecting 01:18:38.560 |
the peace to becoming a religious war? - It is already becoming a religious war. I mean, the war, 01:18:44.800 |
the conflict with the Palestinians was for many years a national conflict in essence. 01:18:50.240 |
Over the last few years, maybe a decade or two, it is morphing into a religious conflict, 01:18:59.040 |
which is again a very worrying development. When nations are in conflict, you can reach 01:19:04.160 |
some compromise. Okay, you have this bit of land, we have this bit of land. But when it becomes a 01:19:09.280 |
religious conflict between fundamentalists, between messianic people, compromise becomes 01:19:16.000 |
much more difficult because you don't compromise on eternity. You don't compromise on God. 01:19:22.240 |
And this is where we are heading right now. - So I know you said it's a small nation, 01:19:29.040 |
somewhere in the Middle East, but it also happens to be the epicenter of one of the longest running, 01:19:35.040 |
one of the most tense conflicts and crises in human history. So at the very least, it serves 01:19:41.440 |
as a study of how conflict can be resolved. So what are the biggest obstacles to you, 01:19:47.920 |
to achieving peace in this part of the world? - Motivation. I think it's easy to achieve peace 01:19:55.520 |
if you have the motivation on both sides. Unfortunately, at the present juncture, 01:20:01.200 |
there is not enough motivation on either side, either the Palestinian or Israeli side. Peace, 01:20:07.120 |
you know, in mathematics, you have problems without solutions. You can prove mathematically 01:20:14.320 |
that this mathematical problem has no solution. In politics, there is no such thing. All problems 01:20:21.760 |
have solutions if you have the motivation. But motivation is the big problem. And again, 01:20:29.600 |
we can go into the reasons why, but the fact is that on neither side is there enough motivation. 01:20:37.200 |
If there was motivation, the solution would have been easy. - Is there an important distinction 01:20:42.720 |
to draw between the people on the street and the leaders in power in terms of motivation? 01:20:50.560 |
So are most people motivated and hoping for peace, and the leaders are motivated and incentivized to 01:21:00.320 |
continue war? - I don't think so. - Or the people also? - I think it's a deep problem. It's also 01:21:05.040 |
the people. It's not just the leaders. - Is it even a human problem of literally hate in people's 01:21:12.160 |
heart? - Yeah, there is a lot of hate. One of the things that happened in Israel over the last 01:21:17.760 |
10 years or so, Israel became much stronger than it was before, largely thanks to technological 01:21:24.960 |
developments. And it feels that it no longer needs to compromise. There are many reasons for it, 01:21:33.120 |
but some of them are technological. Being one of the leading powers in cyber, in AI, in high tech, 01:21:45.040 |
we have developed very sophisticated ways to more easily control the Palestinian population. 01:21:52.640 |
In the early 2000s, it seemed that it was becoming impossible to control millions of people against 01:21:59.760 |
their will. It took too much power. It spilled too much blood on both sides. So there was an 01:22:07.920 |
impression, "Oh, this is becoming untenable." And there are several reasons why it changed, 01:22:13.280 |
but one of them was new technology. Israel developed very sophisticated surveillance 01:22:19.040 |
technology that has made it much easier for Israeli security forces to control 2.5 million 01:22:26.240 |
Palestinians in the West Bank against their will, with a lot less effort, less boots on the ground, 01:22:35.840 |
also less blood. And Israel is also now exporting this technology to many other regimes around the 01:22:44.000 |
world. Again, I heard Netanyahu speaking about all the wonderful things that Israel is exporting to 01:22:49.520 |
the world, and it's true. We are exporting some nice things, water systems and new kinds of 01:22:55.440 |
tomatoes. We are also exporting a lot of weapons and especially surveillance systems, sometimes to 01:23:05.200 |
unsavory regimes in order to control their populations. - Can you comment on, I think 01:23:14.080 |
you've mentioned that the current state of affairs is a de facto three-class state. Can you describe 01:23:21.440 |
what you mean by that? - Yes, for many years, the kind of leading solution to the Israeli-Palestinian 01:23:26.400 |
conflict is the two-state solution. - Can you describe what that means, by the way? - Yes, 01:23:30.240 |
between the Jordan River and the Mediterranean we will have two states, Israel as a 01:23:37.600 |
predominantly Jewish state and Palestine as a predominantly Palestinian state. Again, there 01:23:43.600 |
were lots of discussions where the border passes, what happens with security arrangement and whatever, 01:23:48.080 |
but this was the big solution. Israel has basically abandoned the two-state solution. Maybe they don't 01:23:54.560 |
say so officially, the people in power, but in terms of what they do on the ground, they abandoned 01:23:59.840 |
it. Now they are effectively promoting the three-class solution, which means there is just 01:24:07.440 |
one country and one government and one power between the Mediterranean and the Jordan River, 01:24:14.320 |
but you have three classes of people living there. You have Jews who enjoy full rights, all the 01:24:22.640 |
rights. You have some Arabs who are Israeli citizens and have some rights. And then you have 01:24:29.360 |
the other Arabs, the third class, who have basically no civil rights and limited human rights. 01:24:34.400 |
And nobody would openly speak about it, but effectively this is the reality on the ground 01:24:41.840 |
already. - So there are many Palestinians, and I'll speak with them, who characterize this as a 01:24:46.080 |
de facto one-state apartheid. Do you agree with this? - I would take issue with the term apartheid. 01:24:53.440 |
Generally speaking, as a historian, I don't really like historical analogies because there are always 01:24:58.160 |
differences, key differences. The biggest difference between the situation here and the 01:25:03.840 |
situation in South Africa in the time of the apartheid is that Black South Africans did not 01:25:11.920 |
deny the existence of South Africa and did not call for the destruction of South Africa. They 01:25:18.400 |
had a very simple goal. They had a very simple demand. We want to be equal citizens of this 01:25:26.000 |
country. That's it. And the apartheid regime said, no, you can't be equal citizens. Now, in Israel, 01:25:34.240 |
Palestine, it's different. The Palestinians, many of them don't recognize the existence of Israel, 01:25:39.520 |
are not willing to recognize it, and they don't demand to be citizens of Israel. 01:25:45.520 |
They demand, some of them, to destroy it and replace it with a Palestinian state. Some of 01:25:51.840 |
them demand a separate state. But if the Palestinians would adopt the same policy 01:25:59.280 |
as the Black South Africans, if you have the Palestinians coming and saying, "OK, forget about 01:26:05.680 |
it. We don't want to destroy Israel. We don't want a Palestinian country. We have a very simple 01:26:10.160 |
request, very simple demand. Give us our full rights. We also want to vote for the Knesset. 01:26:17.440 |
We also want to get the full protection of the law." That's it. That's our only demand. Israel 01:26:22.640 |
would be in deep, deep trouble at that moment. But we are not there. - I wonder if there will ever be 01:26:29.840 |
a future when such a thing happens, where everybody, the majority of people, Arab and Jew, 01:26:36.640 |
Israeli and Palestinian, accept the one-state solution and say, "We want equal rights." 01:26:44.800 |
- Never say never in history. It's not coming anytime soon from either side. 01:26:49.280 |
When you look at the long term of history, one of the curious things you see is what 01:26:56.880 |
makes human groups different from animal species. You know, gorillas and chimpanzees, 01:27:01.840 |
they are separate species. They can never merge. Cats and dogs will never merge. But 01:27:07.440 |
different national and religious groups in history, even when they hate each other, 01:27:12.960 |
surprisingly, they sometimes end by merging. If you look at Germany, for instance, so for centuries 01:27:20.080 |
you had Prussians and Bavarians and Saxons who fought each other ferociously and hated each 01:27:26.560 |
other. And they are sometimes also of different religions, Catholics, Protestants. You know, the 01:27:31.440 |
worst war in European history, according to some measures, was not the Second World War or the First 01:27:37.600 |
World War. It was the Thirty Years' War, waged largely on German soil between Germans, Protestants 01:27:44.400 |
and Catholics. But eventually they united to form a single country. You saw the same thing, I don't 01:27:51.040 |
know, in Britain. English and Scots for centuries hated and fought each other ferociously, eventually 01:27:57.600 |
coming together. Maybe it'll break up again, I don't know. But such is the power of the forces 01:28:06.400 |
of merger in history: you are very often influenced by the people you fight, by the people you even 01:28:14.400 |
hate, more than by almost anybody else. So if we apply those ideas, the ideas of this part of the 01:28:23.280 |
world, to another part of the world that's currently in war, Russia and Ukraine, from what you learned 01:28:30.160 |
here, how do you think peace can be achieved in Ukraine? - Peace can be achieved any moment. It's 01:28:36.880 |
motivation. In this case, it's just one person. Putin just needs to say, "That's it." You know, 01:28:42.240 |
the Ukrainians, they don't demand anything from Russia, just go home. That's the only thing they 01:28:47.440 |
want. They don't want to conquer any bit of Russian territory. They don't want to change the regime in 01:28:52.400 |
Moscow, nothing. They just tell the Russians, "Go home." That's it. And of course, again, 01:28:59.120 |
motivation. How do you get somebody like Putin to admit that he made a colossal mistake, a human 01:29:06.800 |
mistake, an ethical mistake, a political mistake, in starting this war? This is very, very difficult. 01:29:13.600 |
But in terms of what would the solution look like? Very simple. The Russians go home. End of story. 01:29:20.400 |
- Do you believe in the power of conversation between leaders to sit down as human beings 01:29:28.720 |
and agree? First of all, what home means, because we humans draw lines. 01:29:36.480 |
- That's true. I believe in the power of conversation. The big question to ask is where? 01:29:41.920 |
Where do conversations, real conversations take place? And this is tricky. One of the interesting 01:29:49.280 |
things to ask about any conflict, about any political system, is where do the real conversations 01:29:55.520 |
take place? And very often, they don't take place in the places you think that they are. 01:30:02.000 |
But think about American politics. When the country was founded in the late 18th century, 01:30:08.400 |
people understood that holding conversations between leaders is very important for the functioning 01:30:14.160 |
of democracy. We'll create a place for that. That's called Congress. This is where leaders 01:30:19.600 |
are supposed to meet and talk about the main issues of the day. Maybe there was a time, 01:30:25.280 |
sometime in the past, when this actually happened. When you had two factions holding different ideas 01:30:35.600 |
about foreign policy or economic policy, and they met in Congress, and somebody would come and give 01:30:41.120 |
a speech, and the people on the other side would say, "Hey, that's interesting. I haven't thought 01:30:45.920 |
about it. Yes, maybe we can agree on that." This is no longer happening in Congress. I don't think 01:30:52.480 |
there is any speech in Congress that causes anybody on the other side to change their opinion about 01:30:58.640 |
anything. So this is no longer a place where real conversations take place. The big question about 01:31:06.240 |
American democracy is, is there a place where real conversations, which actually change people's 01:31:14.080 |
minds, still take place? If not, then this democracy is dying also. Democracy without 01:31:21.360 |
conversation cannot exist for long. And it's the same question you should ask also about 01:31:26.240 |
dictatorial regimes. Like you think about Russia or China. So China has the Great Hall of the People. 01:31:33.120 |
Well, the representatives, the supposed representatives of the people meet every now 01:31:37.600 |
and then, but no real conversation takes place there. A key question to ask about the Chinese 01:31:43.920 |
system is, behind closed doors, let's say in a Politburo meeting, do people have a real conversation? 01:31:51.600 |
If Xi Jinping says one thing, and some other big shot thinks differently, will they have the 01:31:59.200 |
courage, the ability, the backbone to say, "With all due respect, I think differently," 01:32:04.720 |
and there is a real conversation? Or not? I don't know the answer. But this is a key question. 01:32:10.240 |
This is the difference. An authoritarian regime can still have different voices within it. 01:32:17.680 |
But at a certain point, you have a personality cult. Nobody dares say anything against the leader. 01:32:27.040 |
And when it comes again to Ukraine and Russia, I don't think that if you somehow manage to get 01:32:34.240 |
Putin and Zelensky to the same room, when everybody knows that they are there, that they'll 01:32:39.280 |
have a moment of empathy or human connection, I don't think it can happen like that. I do hope 01:32:48.640 |
that there are other spaces where somebody like Putin can still have a real human conversation. 01:32:58.480 |
I don't know if this is the case. I hope so. - Well, there are several interesting dynamics, 01:33:02.400 |
and you spoke to some of them. So one is internally with advisors. You have to have 01:33:07.440 |
hope that there's people that would disagree, that would have a lively debate internally. 01:33:12.960 |
Then there's also the thing you mentioned, which is direct communication between Putin and Zelensky 01:33:18.560 |
in private, picking up a phone, rotary phone, old school. I still believe in the power of that. 01:33:26.400 |
But while that's exceptionally difficult in the current state of affairs, what's also possible 01:33:32.400 |
to have is a mediator like the United States or some other leader, like the leader of Israel or 01:33:39.120 |
the leader of another nation that's respected by both, or India, for example, that can have, 01:33:46.160 |
first of all, individual conversations and then literally get into a room together. 01:33:49.760 |
- It is possible. I would say more generally about conversations, as it goes back a little to what 01:33:58.560 |
I said earlier about the Marxist view of history. One of the problematic things I see today in many 01:34:06.240 |
academic circles is that people focus too much on power. They think that the whole of history 01:34:13.760 |
or the whole of politics is just a power structure. It's just struggle about power. 01:34:20.240 |
Now, if you think that the whole of history and the whole of politics is only power, 01:34:26.400 |
then there is no room for conversation. Because if what you have is a struggle between different 01:34:34.560 |
powerful interests, there is no point talking. The only thing that changes it is fighting. 01:34:43.120 |
My view is that no, it's not all about power structures. It's not all about power dynamics. 01:34:48.960 |
Underneath the power structure, there are stories, stories in human minds. 01:34:54.640 |
This is great news, if it's true. This is good news, because unlike power that can only be 01:35:02.800 |
changed through fighting, stories can sometimes, it's not easy, but sometimes stories can be 01:35:10.080 |
changed through talking. That's the hope. I think in everything from couple therapy 01:35:17.200 |
to nation therapy, if you think it's power therapy, it's all about power, there is no place 01:35:25.280 |
for a conversation. But if to some extent, it's the stories in people's minds, if you can enable 01:35:32.880 |
one person to see the story in the mind of another person, and more importantly, if you can have 01:35:39.840 |
some kind of critical distance from the story in your own mind, then maybe you can change it a 01:35:46.480 |
little. Then you don't need to fight. You can actually find a better story that you can both 01:35:52.480 |
agree to. It sometimes happens in history. Again, French and Germans fought for generations and 01:35:58.480 |
generations. Now they live in peace, not because, I don't know, they found a new planet they can 01:36:05.200 |
share between France and Germany, so now everybody has enough territory. No, they actually have less 01:36:10.160 |
territory than previously, because they lost all their overseas empires. But they managed to find 01:36:17.200 |
a story, the European story, that both Germans and French people are happy with. So they live in 01:36:23.760 |
peace. - I very much believe in this vision that you have of the power of stories. One of the tools 01:36:30.800 |
is conversations. Another is books. There's some guy that wrote a book about this power of stories. 01:36:37.760 |
He happens to be sitting in front of me, and that book happened to spread across a lot of people. Now 01:36:42.000 |
they believe in the power of story and narrative, even a children's book, too, so the kids... 01:36:47.440 |
I mean, it's fascinating how that spreads. I mean, underneath your work, there's an optimism. 01:36:57.920 |
I think underneath conversations, what I try to do, there's an optimism that it's not just about power 01:37:04.560 |
struggles. It's about stories, which is like a connection between humans and together, kind of 01:37:11.760 |
evolving these stories that maximize happiness or minimize suffering in the world. - And this is why 01:37:19.760 |
I also, I think I admire what you're doing, that you're going to talk with some of the 01:37:24.800 |
most difficult characters around in the world today. And with this basic belief that by talking, 01:37:34.640 |
maybe we can move them an inch, which is a lot when it comes to people with so much power. 01:37:41.120 |
I think one of the biggest success stories in modern history, I would say, is feminism. 01:37:48.240 |
Because feminism believed in the power of stories, not so much in the power of violence, 01:37:56.960 |
of armed conflict. By many measures, feminism has been maybe the most successful social movement 01:38:03.920 |
of the 20th century and maybe of the modern age. You know, the systems of oppression, 01:38:10.240 |
which were in place throughout the world for thousands of years, seemed to be just 01:38:15.840 |
natural, eternal. You had all these religious movements, all these political revolutions, 01:38:20.880 |
and one thing remained constant, and this is the patriarchal system and the oppression of women. 01:38:26.160 |
And then feminism came along. And, you know, you had leaders like Lenin, like Mao, saying that if 01:38:33.680 |
you want to make a big social change, you must use violence. Power comes from the barrel of a gun. 01:38:41.520 |
If you want to make an omelet, you need to break eggs, and all these things. And the feminists said, 01:38:46.960 |
"No, we won't use the power of the gun. We will make an omelet without breaking any eggs." 01:38:53.440 |
And they made a much better omelet than Lenin or Mao or any of these violent revolutionaries. 01:39:01.120 |
You know, they certainly didn't start any wars or build any gulags. I 01:39:06.320 |
don't think they even murdered a single politician. I don't think there was any political assassination 01:39:12.080 |
anywhere by feminists. There was a lot of violence against them, both verbal but also physical, 01:39:20.240 |
and they didn't reply by waging violence, and they succeeded in changing this deep 01:39:32.560 |
structure of oppression in a way which benefited not just women but also men. 01:39:38.080 |
So this gives me hope. It's not easy. In many cases, we fail. But it is possible 01:39:46.560 |
sometimes in history to make a very, very big change, positive change, 01:39:52.240 |
mainly by talking and demonstrating and changing the story in people's minds and not by using 01:40:00.240 |
violence. - It's fascinating that feminism and communism and all these things happen in the 01:40:05.600 |
20th century. So many interesting things happen in the 20th century. So many movements, so many ideas, 01:40:11.200 |
nuclear weapons, all of it, computers. It's just like, it seems like a lot of stuff, like, 01:40:16.400 |
really quickly percolated and it's accelerating. - It's still accelerating. I mean, history is just 01:40:20.960 |
accelerating, you know, for centuries. And the 20th century, you know, we squeezed into it 01:40:26.640 |
things that previously took thousands of years, and now, I mean, we are squeezing it into decades. 01:40:31.840 |
- And you very well could be one of the last historians, human historians, to have ever lived. 01:40:37.280 |
- Could be. I think, you know, our species, Homo sapiens, I don't think we'll be around in a 01:40:44.080 |
century or two. We could destroy ourselves in a nuclear war, through ecological collapse, 01:40:51.440 |
by giving too much power to AI that goes out of our control. But if we survive, we'll probably 01:40:58.720 |
have so much power that we will change ourselves using various technologies so that our descendants 01:41:07.520 |
will no longer be Homo sapiens like us. They will be more different from us than we are different 01:41:14.880 |
from Neanderthals. So maybe they'll have historians, but it will no longer be human historians or 01:41:22.080 |
sapiens historians like me. I think it's an extremely dangerous development, and the chances 01:41:28.960 |
that this will go wrong, that people will use the new technologies, trying to upgrade humans, 01:41:35.440 |
but actually downgrading them, this is a very, very big danger. If you let corporations and 01:41:42.400 |
armies and ruthless politicians change humans using tools like AI and bioengineering, 01:41:50.000 |
it's very likely that they will try to enhance a few human qualities that they need, 01:41:56.800 |
like intelligence and discipline, while neglecting what are potentially more important 01:42:05.760 |
human qualities, like compassion, like artistic sensitivity, like spirituality. 01:42:12.400 |
If you give Putin, for instance, bioengineering and AI and brain-computer interfaces, 01:42:19.840 |
he's likely to want to create a race of super soldiers who are much more intelligent and much 01:42:30.240 |
stronger and also much more disciplined and never rebel and march on Moscow against him. 01:42:36.080 |
But he has no interest in making them more compassionate or more spiritual. So the end 01:42:43.280 |
result could be a new type of humans, downgraded humans, who are highly intelligent and disciplined 01:42:54.080 |
but have no compassion and no spiritual depth. For me, this is the dystopia, the apocalypse, 01:43:03.200 |
that when people talk about the new technologies and they have this scenario of the Terminator, 01:43:10.080 |
robots running in the street shooting people, this is not what worries me. I think we can avoid that. 01:43:15.520 |
What really worries me is that corporations, armies, and politicians will use the new technologies 01:43:24.240 |
to change us in a way which will destroy our humanity or the best parts of our humanity. 01:43:31.200 |
- And one of those ways could be removing the compassion. Another way that really worries me, 01:43:35.120 |
and for me is probably more likely, is a Brave New World kind of thing that 01:43:39.840 |
removes the flaws of humans, maybe removes the diversity in humans, and makes us all these 01:43:49.760 |
dopamine-chasing creatures that just kind of maximize enjoyment in the short term, 01:43:55.840 |
which kind of seems like a good thing maybe in the short term, but it creates a society that 01:44:03.600 |
doesn't think, that doesn't create, that just is sitting there enjoying itself at a more and more 01:44:12.880 |
rapid pace, which seems like another kind of society that could be easily controlled by a 01:44:18.400 |
centralized center of power. But the set of dystopias that we could arrive at through this, 01:44:23.680 |
through allowing corporations to modify humans is vast, and we should be worried about that. 01:44:32.080 |
So it seems like humans are pretty good as we are, all the flaws, all of it together. 01:44:40.080 |
- We are better than anything that we can intentionally design at present. 01:44:44.960 |
Like any intentionally designed humans at the present moment is going to be much, 01:44:50.560 |
much worse than us, because basically we don't understand ourselves. 01:44:53.760 |
I mean, as long as we don't understand our brain, our body, our mind, it's a very, 01:44:59.280 |
very bad idea to start manipulating a system that you don't understand deeply, 01:45:05.040 |
and we don't understand ourselves. - So I have to ask you about an 01:45:09.360 |
interesting dynamic of stories. You wrote an article two years ago titled 01:45:13.760 |
"When the World Seems Like One Big Conspiracy," how understanding the structure of global 01:45:18.800 |
cabal theories can shed light on their allure and their inherent falsehood. 01:45:24.000 |
What are global cabal theories, and why do so many people believe them? 01:45:29.280 |
37% of Americans, for example. - Well, the global cabal theory, 01:45:34.240 |
it has many variations, but basically there is a small group of people, a small cabal 01:45:38.400 |
that secretly controls everything that is happening in the world. 01:45:42.720 |
All the wars, all the revolutions, all the epidemics, everything that is happening 01:45:47.440 |
is controlled by this very small group of people who are, of course, evil and have bad intentions. 01:45:52.480 |
And this is a very well-known story. It's not new. It's been there for thousands of years. 01:46:00.800 |
It's very attractive, because first of all, it's simple. 01:46:04.640 |
You don't have to understand everything that happens in the world. You just need to understand 01:46:10.320 |
one thing. The war in Ukraine, the Israeli-Palestinian conflict, 5G technology, COVID-19, 01:46:16.800 |
it's simple. There is this global cabal. They do all of it. And also, it enables you to shift 01:46:24.960 |
all the responsibility for all the bad things that are happening in the world to this small cabal. 01:46:30.480 |
It's the Jews. It's the Freemasons. It's not us. And also, it creates this fantasy, 01:46:37.680 |
utopian fantasy. If we only get rid of the small cabal, we solve all the problems of the world. 01:46:44.160 |
Salvation. The Israeli-Palestinian conflict, the war in Ukraine, the epidemics, poverty, 01:46:49.200 |
everything is solved just by knocking out this small cabal. So, again, it's simple, 01:46:55.040 |
it's attractive, and this is why so many people believe it. 01:46:58.080 |
It's, again, it's not new. Nazism was exactly this. Nazism began as a conspiracy theory. 01:47:05.760 |
We don't call Nazism a conspiracy theory because, oh, it's a big thing, it's an ideology. 01:47:10.560 |
But if you look at it, it's a conspiracy theory. The basic Nazi idea was the Jews control the world, 01:47:17.040 |
get rid of the Jews, you solved all the world's problems. Now, the interesting thing about these 01:47:22.480 |
kind of theories, again, they tell you that even things that look to be the opposite of each other, 01:47:31.520 |
actually they are part of the conspiracy. So, in the case of Nazism, the Nazis told people, 01:47:36.880 |
you know, you have capitalism and communism, you think that they are opposite, right? Ah, 01:47:42.640 |
this is what the Jews want you to think. Actually, the Jews control both communism, Trotsky, Marx, 01:47:48.960 |
were Jews, blah, blah, blah, and capitalism, the Rothschilds, Wall Street, it's all controlled 01:47:54.160 |
by the Jews. So, the Jews are fooling everybody. But actually, the communists and the capitalists 01:47:59.520 |
are part of the same global cabal. And again, this is very attractive, because, ah, now I 01:48:06.240 |
understand everything, and I also know what to do. I just give power to Hitler, he gets rid of the 01:48:12.240 |
Jews, I solved all the problems of the world. Now, as a historian, the most important thing I can say 01:48:18.800 |
about these theories, they are never right. Because the global cabal theory says two things. First, 01:48:26.400 |
everything is controlled by a very small number of people. Secondly, these people hide themselves, 01:48:31.520 |
they do it in secret. Now, both things are nonsense. It's impossible for a small group of 01:48:37.840 |
people to control and predict everything, because the world is too complicated. 01:48:44.320 |
You know, you look at a real-world conspiracy. A conspiracy is basically just a plan. Think about 01:48:49.600 |
the American invasion of Iraq in 2003. You had the most powerful superpower in the world, 01:48:57.440 |
with the biggest military, with the biggest intelligence services, with the most sophisticated, 01:49:04.160 |
you know, the FBI and the CIA and all the agents. They invade a third-rate country, 01:49:11.520 |
third-rate power, Iraq. With this idea, we'll take over Iraq and we'll control it, we'll make 01:49:17.280 |
a new order in the Middle East. And everything falls apart. Their plan completely backfires. 01:49:23.600 |
Everything they hoped to achieve, they achieved the opposite. America, the United States, is humiliated. 01:49:30.400 |
They caused the rise of ISIS. They wanted to take out terrorism, they created more terrorism. 01:49:35.840 |
Worst of all, the big winner of the war was Iran. You know, the United States goes to war 01:49:42.160 |
with all its power and gives Iran a victory on a silver plate. The Iranians don't need to do 01:49:49.600 |
anything. The Americans are doing everything for them. Now, this is real history. Real history 01:49:56.000 |
is when you have not a small group of people but a lot of people with a lot of power, carefully 01:50:01.920 |
planning something, and it goes completely against their plan. And this we know from 01:50:08.960 |
a personal experience. Like every time we try to plan something, a birthday party, a surprise 01:50:14.720 |
birthday party, a trip somewhere, things go wrong. This is reality. So the idea that a small group of, 01:50:22.560 |
I don't know, the Jewish cabal, the Freemasons, whoever, they can really control and predict all 01:50:29.440 |
the wars, this is nonsense. The second thing that is nonsense is to think they can do that and still 01:50:35.600 |
remain secret. It sometimes happens in history that a small group of people accumulates a lot 01:50:42.480 |
of power. If I now tell you that Xi Jinping and the heads of the CCP, the Chinese Communist Party, 01:50:50.320 |
they have a lot of power. They control the military, the media, the economy, the universities 01:50:56.080 |
of China. This is not a conspiracy theory. This is, obviously, everybody knows it. Everybody knows 01:51:02.320 |
it. Because to gain so much power, you usually need publicity. Hitler could not, Hitler gained a 01:51:10.800 |
lot of power in Nazi Germany because he had a lot of publicity. If Hitler remained unknown, working 01:51:16.960 |
behind the scenes, he would not gain power. So the way to gain power is usually through publicity. 01:51:24.160 |
So secret cabals don't gain power. And even if you gain a lot of power, nobody has the kind 01:51:32.400 |
of power necessary to predict and control everything that happens in the world. All the 01:51:39.600 |
time shit happens that you did not predict and you did not plan and you did not control. 01:51:44.320 |
- The sad thing is there's usually an explanation for everything you just said that involves 01:51:52.320 |
a secret global cabal: that the reason your vacation planning always goes wrong 01:51:57.760 |
is because you're not competent. There is a competent small group, 01:52:01.440 |
that ultra-competent small group. I hear this with intelligence agencies. The CIA are running 01:52:07.920 |
everything. Mossad is running everything. - You see, I mean, as a historian, you get to know 01:52:12.880 |
how many blunders these people make. They're capable, but they are so incompetent 01:52:18.720 |
in so many ways. Again, look at the Russian invasion of Ukraine. Before the war, people 01:52:23.360 |
thought, "Oh, Putin was such a genius and the Russian army was one of the strongest 01:52:27.840 |
armies in the world." This is what Putin thought. And it completely backfired. 01:52:32.320 |
- Well, the cabal explanation there would be there's a NATO-driven United States 01:52:38.240 |
military-industrial complex that wants to create chaos and incompetence. 01:52:42.400 |
- So they put a gun to Putin's head and told him, "Vladimir, if you don't invade, we shoot you." 01:52:50.240 |
The thing about conspiracy theories is there's usually a way to explain everything. 01:52:55.680 |
You can explain religion. You can always find an explanation for everything. And in the end, 01:53:02.960 |
it's intellectual integrity. If, whenever people confront you with evidence, you insist 01:53:08.560 |
on finding some very, very complicated explanation for that too, you can explain 01:53:14.000 |
everything. We know that. It's a question of intellectual integrity. 01:53:18.800 |
And I will also say another thing. The conspiracy theories, they do get one thing right, 01:53:24.880 |
certainly in today's world. I think they represent an authentic and justified fear of a lot of people 01:53:33.600 |
that they are losing control of their lives. They don't understand what is happening. 01:53:39.840 |
And this, I think, is not just a legitimate fear. This is an important fear. They are right. 01:53:46.480 |
We are losing control of our lives. We are facing really big dangers, but not from a small cabal of 01:53:55.120 |
fellow humans. The problem with many of these conspiracy theories is that, yes, we have a 01:54:01.360 |
problem with new AI technology. But if you now direct the fire against certain people, 01:54:10.080 |
so instead of all humans cooperating against real common threats, whether it's the rise of AI, 01:54:19.840 |
whether it's global warming, you're only causing us to fight each other. 01:54:24.640 |
And I think that the key question for people who spread these ideas, I mean, many of them, 01:54:29.520 |
they honestly believe it, it's not malicious. They honestly believe in these theories. 01:54:35.680 |
The question is: do you want to spend your life spreading hate towards people? Or do you want to work 01:54:44.000 |
on more constructive projects? I think one of the big differences between those who believe 01:54:48.080 |
in conspiracy theories and people who warn about the dangers of AI or climate change, is that 01:54:58.320 |
we don't see certain humans as evil and hateful. The problem isn't humans. The problem is something 01:55:08.720 |
outside humanity. Yes, humans are contributing to the problem, but ultimately the enemy is 01:55:17.200 |
external to humanity. Whereas conspiracy theories usually claim that a certain part of humanity 01:55:24.560 |
is the source of all evil, which leads them to eventually think in terms of exterminating 01:55:30.960 |
this part of humanity, which leads sometimes to historical disasters like Nazism. 01:55:40.640 |
So it can lead to hate, but can also lead to like cynicism, apathy that basically says, 01:55:46.640 |
"It's not in my power to make the world better." So you don't actually take action. 01:55:51.520 |
I think it is within the power of every individual to make the world a little bit better. 01:55:56.800 |
You know, you can't do everything. Don't try to do everything. Find one thing in your areas 01:56:02.960 |
of activity, a place where you have some agency, and try to do that, and hope that other people 01:56:10.320 |
do their bit. And if everybody does their bit, we'll manage. And if we don't, we don't, 01:56:17.440 |
but at least we try. - You have been part of conspiracy theories. I find myself recently 01:56:24.000 |
becoming part of conspiracy theories. Is there advice you can give of how to be a human being 01:56:32.000 |
in this world that values truth and reason while watching yourself become part of conspiracy 01:56:37.760 |
theories? At least from my perspective, it seems very difficult to prove to the world that you're 01:56:44.240 |
not part of a conspiracy theory. I, as you said, have interviewed Benjamin Netanyahu recently. I 01:56:52.320 |
don't know if you're aware, but doing such things will also, you know, pick up a whole new menu 01:56:58.480 |
of conspiracy theories you're now a part of. And I find it very frustrating 01:57:03.680 |
because it makes it very difficult to respond, because I sense that people have the right 01:57:11.200 |
intentions, like we said, they have a nervousness, a fear of power and of the abuses of power, 01:57:20.480 |
as do I. So I find myself in a difficult position that I have nothing to show to prove 01:57:28.400 |
that I'm not part of such a conspiracy theory. - I think ultimately you can't, we can't. I mean, 01:57:35.600 |
you know, it's like proving consciousness. You can't. That's just the situation. Whatever you 01:57:41.680 |
say can and will be used against you by some people. So this fantasy, if I only say this, 01:57:49.040 |
if I only show them that, if I only have this data, they will see I'm okay. It doesn't work 01:57:54.400 |
like that. I think to keep your sanity in this situation, first of all, it's important to 01:58:02.080 |
understand that most of these people are not evil. They are not doing it on purpose. Many of them 01:58:07.680 |
really believe that there is some very nefarious, powerful conspiracy which is causing a lot of harm 01:58:15.840 |
in the world, and they are doing a good thing by exposing it and making people aware of it and 01:58:21.760 |
trying to stop it. If you think that you're surrounded by evil, you're falling into the 01:58:28.320 |
same rabbit hole. You're falling into the same paranoid state of mind, "Oh, the world is full 01:58:33.440 |
of these evil people." No, most of them are good people. Also, I think we can empathize 01:58:40.080 |
with some of the key ideas there, which I share, that yes, it's becoming more and more difficult 01:58:48.640 |
to understand what is happening in the world. There are huge dangers in the world, existential 01:58:55.280 |
dangers to the human species, but they don't come from a small cabal of Jews or gay people or 01:59:03.280 |
feminists or whatever. They come from much more diffused forces, which are not under the control 01:59:11.920 |
of any single individual. We don't have to look for the evil people. We need to look for human 01:59:20.320 |
allies in order to work together against, again, the dangers of AI, the dangers of bioengineering, 01:59:29.840 |
the dangers of climate change. When you wake up in the morning, the question is, do you want to 01:59:35.840 |
spend your day spreading hatred, or do you want to spend your day trying to make allies and work 01:59:44.400 |
together? - Let me ask you a big philosophical question about AI and the threat of it. Let's 01:59:52.320 |
look at the threat side. Folks like Eliezer Yudkowsky worry that AI might kill all of us. 02:00:00.240 |
Do you worry about that range of possibilities where artificial intelligence systems in a variety 02:00:09.280 |
of ways might destroy human civilization? - Yes. I talk a lot about it, about the dangers of AI. 02:00:17.120 |
I sometimes get into trouble because I depict these scenarios of how AI could become very dangerous, 02:00:22.640 |
and then people say that I'm encouraging these scenarios. I'm talking about it as a warning. 02:00:28.400 |
I'm not so terrified of the simplistic idea, again, the Terminator scenario of robots running 02:00:36.400 |
in the street, shooting everybody. I'm more worried about AI accumulating more and more power 02:00:44.640 |
and basically taking over society, taking over our lives, taking power away from us until we 02:00:53.520 |
don't understand what is happening and we lose control of our lives and of the future. 02:00:58.880 |
The two most important things to realize about AI, so many things are being said now about AI, 02:01:04.640 |
but I think there are two things that every person should know about AI. First is that AI 02:01:11.120 |
is the first tool in history that can make decisions by itself. All previous tools in 02:01:17.360 |
history couldn't make decisions. This is why they empowered us. You invent a knife, you invent an 02:01:24.320 |
atom bomb. The atom bomb cannot decide to start a war, cannot decide which city to bomb. AI can 02:01:32.000 |
make decisions by itself. Autonomous weapon systems can decide by themselves who to kill, 02:01:40.720 |
who to bomb. The second thing is that AI is the first tool in history that can create new ideas 02:01:48.320 |
by itself. The printing press could print our ideas but could not create new ideas. AI can 02:01:57.040 |
create new ideas entirely by itself. This is unprecedented. Therefore, it is the first 02:02:04.000 |
technology in history that instead of giving power to humans, it takes power away from us. 02:02:10.960 |
And the danger is that it will increasingly take more and more power from us until we are left 02:02:19.760 |
helpless and clueless about what is happening in the world. This is already beginning to happen at 02:05:27.200 |
an accelerated pace. More and more decisions about our lives, whether to give us a loan, 02:02:34.320 |
whether to give us a mortgage, whether to give us a job are taken by AI. More and more of the ideas, 02:02:40.960 |
of the images, of the stories that surround us and shape our minds, our world are produced, 02:02:48.960 |
are created by AI, not by human beings. - If you can just linger on that, what is the danger of 02:02:54.720 |
that? That more and more of the creative side is done by AI, the idea generation. Is it that we 02:03:04.320 |
become stale in our thinking? Is it that idea generation is so fundamental to, like, the 02:03:10.000 |
evolution of humanity? - That we can't resist the idea. To resist an idea, you need to have some 02:03:16.560 |
vision of the creative process. Now, this is a very old fear. You go back to Plato's cave, 02:03:25.040 |
some of this idea that people are sitting chained in a cave and seeing shadows on a screen, 02:03:32.720 |
on a wall, and thinking this is reality. You go back to Descartes, and he has this thought 02:03:39.200 |
experiment of the demon. And Descartes asks himself, "How do I know that any of this is real? 02:03:45.920 |
Maybe there is a demon who is creating all of this and is basically enslaving me by surrounding me 02:03:53.840 |
with these illusions." You go back to Buddha, it's the same question. What if we are living in a 02:03:59.760 |
world of illusions, and because we have been living in it throughout our lives, all our ideas, 02:04:05.760 |
all our desires, how we understand ourselves, all of this is the product of the same illusions. 02:04:12.160 |
And this was a big philosophical question for thousands of years. Now it's becoming a practical 02:04:20.240 |
question of engineering, because previously all the ideas, as far as we know, maybe we are living 02:04:26.160 |
inside a computer simulation of intelligent rats from the planet Zircon. If that's the case, 02:04:32.000 |
we don't know about it. But taking what we do know about human history, until now, all the stories, 02:04:39.920 |
images, paintings, songs, operas, theater, everything we've encountered and shaped our minds 02:04:46.400 |
was created by humans. Now increasingly, we live in a world where more and more of these cultural 02:04:53.440 |
artifacts will be coming from an alien intelligence. Very quickly, we might reach a point 02:05:00.080 |
when most of the stories, images, songs, TV shows, whatever, are created by an alien 02:05:09.040 |
intelligence. And if we now find ourselves inside this kind of world of illusions, 02:05:16.720 |
created by an alien intelligence that we don't understand, but it understands us, this is a kind 02:05:25.200 |
of spiritual enslavement that we won't be able to break out of, because it understands us, it 02:05:34.160 |
understands how to manipulate us, but we don't understand what is behind this screen of stories 02:05:43.200 |
and images and songs. - So if there's a set of AI systems that are operating in the space of ideas 02:05:50.880 |
that are far superior to ours, and we're almost not able to, it's opaque to us, we're not 02:05:57.120 |
able to see through, how does that change the pursuit of happiness, the human pursuit of 02:06:05.680 |
happiness, life? Where do we get joy if we're surrounded by AI systems that are doing most of 02:06:13.760 |
the cool things humans do much better than us? - Some of the things, it's okay that the AIs 02:06:20.080 |
will do them. Many human tasks and jobs, they're drudgery, they are not fun, they are not developing 02:06:30.720 |
us emotionally or spiritually, it's fine if the robots take over. I don't know, I think about the 02:06:37.040 |
people in supermarkets or grocery stores that spend hours every day just passing items and 02:06:43.840 |
charging you money. I mean, if this can be automated, wonderful. We need to make sure that 02:06:50.560 |
these people then have better jobs, better means of supporting themselves, and developing their 02:07:00.320 |
social abilities, their spiritual abilities. That's the ideal world that AI can create, 02:07:08.560 |
that it takes away from us the things that it's better if we don't do them and allows us to focus 02:07:19.040 |
on the most important things and the deepest aspects of our nature, of our potential. 02:07:24.880 |
If we give AI control of the sphere of ideas at this stage, I think it's very, very dangerous 02:07:33.680 |
because it doesn't understand us. AI at present is mostly digesting the products of human culture. 02:07:44.480 |
Everything we've produced over thousands of years, it eats all of these cultural products, 02:07:51.760 |
digests it, and starts producing its own new stuff. But we still haven't figured out 02:07:59.120 |
ourselves, in our bodies, our brains, our minds, our psychology. So, an AI based on our flawed 02:08:09.040 |
understanding of ourselves is a very dangerous thing. I think that we need, first of all, to 02:08:18.800 |
keep developing ourselves. If for every dollar and every minute that we spend on developing AI, 02:08:27.280 |
artificial intelligence, we spend another dollar and another minute in developing human consciousness, 02:08:34.160 |
the human mind will be okay. The danger is that we spend all our effort on developing an AI at a 02:08:41.840 |
time when we don't understand ourselves, and then letting the AI take over, that's a road to a human 02:08:49.680 |
catastrophe. - Does it surprise you how well large language models work? I mean, has it modified 02:08:55.840 |
your understanding of the nature of intelligence? - Yes. I mean, I've been writing about AI for, 02:09:01.920 |
I don't know, like eight years now, and engaged with all these predictions and speculations. 02:09:08.640 |
And when it actually came, it was much faster and more powerful than I thought it would be. 02:09:13.760 |
I didn't think that we would have, in 2023, an AI that can hold a conversation that you can't know 02:09:23.360 |
if it's a human being or an AI that can write beautiful texts. I mean, I read the texts written 02:09:31.600 |
by AI, and the thing that strikes me most is the coherence. You know, people think, "Oh, it's 02:09:38.640 |
nothing, they just take ideas from here and there, words from here, and put them together." No, it's so coherent. 02:09:45.040 |
I mean, you read, not sentences, you read paragraphs, you read entire texts, and there is 02:09:51.840 |
logic, there is a structure. - It's not only coherent, it's convincing. - Yes, it makes sense. 02:09:57.600 |
- And the beautiful thing about it, which has to do with your work, is that it doesn't have to be true, 02:10:05.760 |
but it still is convincing. And it is both scary and beautiful. - Yes. 02:10:10.800 |
- That our brains love language so much that we don't need the facts to be correct. We just need 02:10:19.600 |
it to be a beautiful story. - Yeah. That's been the secret of 02:10:23.520 |
politics and religion for thousands of years, and now it's coming with AI. 02:10:28.320 |
- So you, as a person who has written some of the most impactful words ever written in your books, 02:10:34.880 |
how does that make you feel that you might be one of the last effective human writers? 02:10:42.000 |
- That's a good question. - First of all, do you think that's possible? 02:10:45.360 |
- I think it is possible. I've seen a lot of examples of AI being told, "Write like Yuval 02:10:52.560 |
Noah Harari," and what it produces. - Has it ever done better than you 02:10:55.920 |
think you could have written yourself? - I mean, on the level of content of ideas, 02:11:02.720 |
no. There are things it says that I would never say. But when it comes to the, I mean, there is, 02:11:10.720 |
again, the coherence and the quality of writing is such that I say it's unbelievable how good it is. 02:11:19.760 |
And who knows, in 10 years, in 20 years, maybe it can do better, even according to certain measures 02:11:27.920 |
on the level of content. - So that people will be able to do a style 02:11:33.680 |
transfer, do in the style of Yuval Noah Harari, write anything, write why I should have ice cream 02:11:42.480 |
tonight, and make it convincing. - I don't know if I have anything 02:11:46.320 |
convincing to say about these things. - I think you'd be surprised. 02:11:48.800 |
There could be an evolutionary biology explanation for why. 02:11:52.720 |
- Yeah, ice cream is good for you. - Yeah. So I mean, 02:11:55.600 |
that changes the nature of writing. - Ultimately, I think it goes back. 02:12:02.080 |
Much of my writing is suspicious of itself. I write stories about the danger of stories. 02:12:14.880 |
I write about intelligence, but highlighting the dangers of intelligence. Ultimately, I do think 02:12:22.880 |
that in terms of power, human power comes from intelligence and from stories. But I think that 02:12:30.640 |
the deepest and best qualities of humans are not intelligence and not storytelling and not power. 02:12:38.240 |
Again, with all our power, with all our cooperation, with all our intelligence, 02:12:43.520 |
we are on the verge of destroying ourselves and destroying much of the ecosystem. 02:12:48.320 |
Our best qualities are not there. Our best qualities are nonverbal. Again, they come from 02:12:57.600 |
things like compassion, from introspection. And introspection, from my experience, is not verbal. 02:13:03.840 |
If you try to understand yourself with words, you will never succeed. There is a place where you 02:13:11.280 |
need the words. But the deepest insights, they don't come from words. And you can't write about 02:13:19.200 |
it. That's again, it goes back to Wittgenstein, to Buddha, to so many of these sages before, 02:13:24.720 |
that these are the things we are silent about. - Yeah, but eventually you have to project it. 02:13:30.720 |
As a writer, you have to do the silent introspection, but project it onto a page. 02:13:36.720 |
- Yes, but you still have to warn people, you will never find the deepest truth in a book. 02:13:43.120 |
You will never find it in words. You can only find it, I think, in direct experience, 02:13:50.240 |
which is nonverbal, which is pre-verbal. - In the silence of your own mind. 02:13:55.840 |
- Yes. - Well, let me ask you a silly question then, 02:14:01.280 |
a ridiculously big question. You have done a lot of deep thinking about the world, about yourself, 02:14:08.560 |
this kind of introspection. How do you think, if you, by way of advice, but just practically 02:14:16.640 |
speaking, day to day, how do you think about difficult problems of the world? 02:14:20.960 |
- First of all, I take time off. The most important thing I do, I think, as a writer, 02:14:29.760 |
as a scientist, I meditate. I spend about two hours every day in silent meditation, 02:14:36.160 |
observing as much as possible nonverbally what is happening within myself, focusing on body 02:14:44.080 |
sensations, the breath. Thoughts keep coming up, but I try not to give them attention. I don't try 02:14:50.160 |
to drive them away, just let them be there in the background, like some background noise. 02:14:54.480 |
Don't engage with the thoughts, because the mind is constantly producing stories with words. These 02:15:04.080 |
stories come between us and the world. They don't allow us to see ourselves or the world. 02:15:10.080 |
For me, the most shocking thing when I started meditating 23 years ago, 02:15:14.800 |
I was given this simple exercise to just observe my breath coming in and out of the nostrils, 02:15:21.120 |
not controlling it, just observing it. I couldn't do it for more than 10 seconds. 02:15:26.560 |
I, for 10 seconds, would try to notice, "Oh, now the breath is coming in. It's coming in, 02:15:30.640 |
it's coming in. Oh, it stopped coming in, and now it's going out, going out." 10 seconds, 02:15:34.960 |
and some memory would come, some thought would come, some story about something that happened 02:15:40.400 |
last week or 10 years ago or in the future. The story would hijack my attention. It would take me 02:15:49.040 |
maybe five minutes to remember, "Oh, I'm supposed to be observing my breath." If I can't observe my 02:15:56.240 |
own breath because of these stories created by the mind, how can I hope to understand much more 02:16:03.120 |
complex things like the political situation in Israel, the Israeli-Palestinian conflict, 02:16:09.200 |
the Russian invasion of Ukraine? If all these stories keep coming, I mean, it's not the truth, 02:16:14.080 |
it's just the story your own mind created. So first thing, train the mind to be silent and 02:16:20.960 |
just observe. So two hours every day, and I go every year for a long retreat between one month 02:16:26.720 |
and two months, 60 days of just silent meditation. - Silent meditation for 60 days. 02:16:32.880 |
- Yeah. To train the mind, forget about your own stories, just observe what is really happening. 02:16:41.280 |
And then also throughout the day, have an information diet. Today, many people are very 02:16:49.840 |
aware of what they feed their body, what enters their mouth. Be very aware of what you feed your 02:16:57.280 |
mind, what enters your mind. Have an information diet. So for instance, I read long books. 02:17:05.360 |
And I prefer, like I do many interviews, I prefer three-hour interviews to five-minute interviews. 02:17:12.400 |
The long format, it's not always feasible, but you can go much, much deeper. 02:17:21.840 |
So I would say an information diet, be very careful about what you feed your mind, 02:17:27.440 |
give preference to big chunks over small... - To books over Twitter. 02:17:33.760 |
- Yes, books over Twitter, definitely. And then when I encounter a problem, 02:17:39.280 |
a difficult intellectual problem, then I let the problem lead me where it goes and not where I want 02:17:52.320 |
it to go. If I approach a problem with some preconceived idea or solution and then try to 02:18:00.000 |
impose it on the problem and just find confirmation bias, just find the evidence that supports my 02:18:05.680 |
view, this is easy for the mind to do and you don't learn anything new. 02:18:11.760 |
- Do you take notes? Do you start to concretize your thoughts on paper? 02:18:18.800 |
- I read a lot. Usually I don't take notes. Then I start writing and when I write, 02:18:26.400 |
I write like a torrent, just write. Now it's the time: you read, you did meditation, now it's the 02:18:33.520 |
time to write, write. Don't stop, just write. So I would write from memory and I'm not afraid 02:18:41.520 |
of formulating, say, big ideas, big theories and putting them on paper. The danger is once it's on 02:18:47.760 |
paper, not on paper, on the screen, in the computer, you get attached to it and then you 02:18:54.480 |
start with confirmation bias to build more and more layers around it and you can't go back. 02:19:00.160 |
And then it's very dangerous. But I trust myself that I have to some extent the ability to press 02:19:09.040 |
the delete button. The most important button in the keyboard is delete. I write and then I delete. 02:19:17.360 |
I write and then I delete. And because I trust myself that I'll have the… every time I come 02:19:23.440 |
to press the delete button, I feel bad. It's a kind of pain. I created this, it's a beautiful 02:19:29.040 |
idea and I have to delete it. But I try and hopefully I do it enough times. And this is 02:19:36.000 |
important because in the long term, it enables me to play with ideas. I have the confidence to 02:19:42.240 |
start formulating brave ideas. Most of them turn out to be nonsense. 02:19:50.240 |
But I trust myself not to be attached, not to become attached to my own nonsense. 02:19:56.720 |
So it gives me this room for playfulness. - I would be amiss if I didn't ask, for people 02:20:02.880 |
interested in hearing you talk about meditation, if they want to start meditating, what advice 02:20:07.840 |
would you give on how to start? You mentioned you couldn't hold your attention on your breath 02:20:14.320 |
for longer than 10 seconds at first. So how did they start on this journey? 02:20:18.560 |
- First of all, it's a difficult journey. It's not fun, it's not recreational, it's not 02:20:26.400 |
kind of time to relax. It can be very, very intense. The most difficult thing, at least 02:20:32.560 |
in the meditation I practice, vipassana, which I learned from a teacher called S. N. Goenka, 02:20:37.680 |
the most difficult thing is not the silence, it's not the sitting for long hours, it's what comes 02:20:43.840 |
up. Everything you don't want to know about yourself, this is what comes up. So it's very 02:20:50.720 |
intense and difficult. If you go to a meditation retreat, don't think you're going to relax. 02:20:56.240 |
- So what's the experience of a meditation retreat when everything you don't like comes up? 02:21:05.760 |
- Anger comes up, you're angry. For days on end, you're just boiling with anger. Everything makes 02:21:12.320 |
you angry. Again, something that happens right now, or you remember something from 20 years ago, 02:21:18.000 |
and you start boiling with... It's like, I never even thought about this incident, 02:21:23.440 |
but it was somewhere stored with a huge, huge pile of anger attached to it, and it's now coming up, 02:21:32.160 |
and all the anger is coming up. Maybe it's boredom. You know, 30 days of meditation, 02:21:38.640 |
you start getting bored, and it's the most boring thing. Suddenly there's no anger, just boredom. 02:21:45.680 |
Another second and I'll scream. And boredom is one of the most difficult things to deal with in life. 02:21:55.280 |
I think it's closely related to death. Death is boring. In many movies, death is exciting; 02:22:01.520 |
in reality, it's not exciting. When you die, ultimately, it's boredom. Nothing happens. 02:22:07.520 |
- It's the end of exciting things. - The end. And many things in the world 02:22:12.000 |
happen because of boredom. To some extent, people start entire wars because of boredom. 02:22:18.160 |
People quit relationships. People quit jobs because of boredom. And if you never learn how 02:22:25.600 |
to deal with boredom, you will never learn how to enjoy peace and quiet, because the way to peace 02:22:34.000 |
passes through boredom. And from what I experienced with meditation, I think 02:22:40.240 |
maybe it was the most difficult, maybe at least in the top three, much more difficult, say, 02:22:46.800 |
than anger or pain. When pain comes up, you feel heroic. "Hey, I'm dealing with pain." 02:22:52.880 |
When boredom comes up, it brings it with depression and feelings of worthlessness, 02:22:59.920 |
and it's nothing. I'm nothing. - The way to peace is through boredom. 02:23:05.440 |
David Foster Wallace said the key to life is to be unborable. 02:23:10.880 |
Which is a different perspective on what you're talking about. Is there truth to that? 02:23:18.240 |
- Yes, I mean, it's closely related. I would say, like, I look at the world today, like politics, 02:23:23.680 |
the one thing we need more than anything else is boring politicians. We have a super abundance of 02:23:30.560 |
very exciting politicians who are doing and saying very exciting things, and we need boring 02:23:37.360 |
politicians. And we need them quickly. - Yeah, the way to peace is through boredom. 02:23:44.880 |
That applies in more ways than one. What advice would you give to young people 02:23:50.880 |
today in high school and college how to have a successful life, how to have a successful career? 02:23:57.040 |
- What they should know, it's the first time in history nobody has any idea what the world will 02:24:03.520 |
look like in 10 years. Nobody has any idea what the world will look like when you grow up. 02:24:08.800 |
You know, throughout history, it was never possible to predict the future. You live in 02:24:12.800 |
the Middle Ages, nobody knows. Maybe in 10 years, the Vikings will invade, the Mongols will invade, 02:24:18.640 |
there'll be an epidemic, there'll be an earthquake, who knows? But the basic structures of life will 02:24:25.520 |
not change. Most people will still be peasants. Armies would fight on horseback with swords and 02:24:33.280 |
bows and arrows and things like that. So you could learn a lot from the wisdom of your elders. 02:24:41.040 |
They've been there before, and they knew what kind of basic skills you need to learn. 02:24:47.040 |
Most people need to learn how to sow wheat and harvest wheat or rice and make bread 02:24:53.520 |
and build a house and ride a horse and things like that. Now we have no idea, not just about 02:25:00.320 |
politics. We have no idea what the job market will look like in 10 years. We have no idea 02:25:07.200 |
what skills will still be needed. You think you're going to learn how to code because 02:25:16.160 |
they'll need a lot of coders in the 2030s? Think again. Maybe AI is doing all the coding. You don't 02:25:21.600 |
need any coders. You're going to, I don't know, you learn to translate languages, you want to be a 02:25:26.960 |
translator, gone. We don't know what skills will be needed. So the most important skill 02:25:34.160 |
is the skill to keep learning and keep changing throughout our lives, which is very, very 02:25:40.640 |
difficult, to keep reinventing ourselves. It's a deep, again, it's in a way a spiritual practice 02:25:47.920 |
to build your personality, to build your mind as a very flexible mind. 02:25:58.400 |
If traditionally people thought about education, like building a stone house with very deep 02:26:10.480 |
foundations, now it's more like setting up a tent that you can fold and move to the next place 02:26:17.680 |
very, very quickly because that's the 21st century. - Which also raises questions about the 02:26:24.480 |
future of education, what that looks like. Let me ask you about love. 02:26:30.080 |
What were some of the challenges? What were some of the lessons about love, about life that you 02:26:39.520 |
learned from coming out as gay? - In many ways, it goes back to the stories. I think this is one of 02:26:46.480 |
the reasons I became so interested in stories and in their power. Because I grew up in a small 02:26:57.040 |
Israeli town in the 1980s, early 1990s, which was very homophobic. And I basically embraced it, 02:27:08.080 |
I breathed it, because you could hardly even think differently. So you had these two powerful 02:27:16.800 |
stories around, one, that God hates gay people and that He will punish them for who they are or for 02:27:27.040 |
what they do. Secondly, that it's not God, it's nature, that there is something diseased or sick 02:27:34.160 |
about it. And these people, maybe they're not sinners, but they are sick, they are defective. 02:27:42.640 |
And nobody wanted to identify with such a thing. If your options are, okay, you can be a sinner, 02:27:48.560 |
or you can be defective, what do you want? No good options there. And it took me many years, 02:27:54.720 |
till I was 21, to come to terms with it. And one of the things, I learned two things. First, 02:28:01.840 |
about the amazing capacity of the human mind for denial and delusion. An algorithm could have told 02:28:11.200 |
me that I'm gay when I was like 14 or 15. If there is a good-looking guy and girl walking, 02:28:17.680 |
I would immediately focus on the guy. But I didn't connect the dots. I could not understand 02:28:26.400 |
what was happening inside my own brain and my own mind and my own body. It took me a long time to 02:28:32.960 |
realize, "You know, you're just gay." - So that speaks to the power of social 02:28:38.560 |
convention versus individual thought. - This is the power of self-delusion. 02:28:43.040 |
That it's not that I knew I was gay and was hiding it. I was hiding it from myself, so successfully 02:28:50.560 |
that I don't understand how it is possible. Looking back, I don't understand how it is 02:28:54.960 |
possible. But I know it is possible. I knew and didn't know at the same time. 02:29:00.080 |
And then the other big lesson is the power of the stories, of the social conventions. 02:29:06.240 |
Because the stories were not true. They did not make sense even on their own terms. 02:29:11.760 |
Even if you accept the basic religious framework of the world, that there is a good God that 02:29:18.720 |
created everything and controls everything, why would a good God punish people for love? 02:29:27.360 |
I understand why a good God would punish people for violence, for hatred, for cruelty. But why 02:29:34.960 |
would God punish people for love, especially when he created them that way? So even if you accept 02:29:42.720 |
the religious framework of the world, obviously the story that God hates gay people, it comes 02:29:49.520 |
not from God, but from some humans who invented this story. They take their own hatred. This is 02:29:55.920 |
something humans do all the time. They hate somebody and they say, "No, I don't hate them. 02:30:01.280 |
God hates them." They throw their own hatred on God. And then if you think about the scientific 02:30:09.440 |
framework that said that, "Oh, gays, they are against nature. They are against the laws of 02:30:14.960 |
nature," and so forth, science tells us nothing can exist against the laws of nature. 02:30:22.240 |
Things that go against the laws of nature just don't exist. There is a law of nature that you 02:30:29.440 |
can't move faster than the speed of light. Now, you don't have this minority of people who break 02:30:35.280 |
the laws of nature by going faster than the speed of light. And then nature comes, "No, that's bad. 02:30:41.200 |
You shouldn't do that." That's not how nature works. If something goes against the laws of 02:30:45.920 |
nature, it just can't exist. The fact is that gay people exist, and not just people: you see 02:30:52.720 |
homosexuality among many, many mammals and birds and other animals. It exists because it is in line 02:31:01.280 |
with the laws of nature. The idea that this is sick, that this is whatever, it comes not from 02:31:07.840 |
nature. It comes from the human imagination. Some people who, for whatever reasons, hated gay people, 02:31:14.800 |
they said, "Oh, they go against nature." But this is a story created by people. This is not the laws 02:31:21.840 |
of nature. And this taught me that so many of the things that we think are natural or eternal or 02:31:30.480 |
divine, no, they're just human stories. But these human stories are often the most powerful forces 02:31:38.080 |
in the world. - So what did you learn from your personal journey through the social 02:31:48.800 |
conventions to find one of the things that makes life awesome, which is love? What does it take to 02:31:55.280 |
strip away the self-delusion and the pressures of social convention to wake up? 02:32:00.320 |
- It takes a lot of work, a lot of courage, and a lot of help from other people. 02:32:07.040 |
This kind of, again, heroic idea that I can do it all by myself, it doesn't work. 02:32:14.800 |
Certainly with love, you need at least one more person. And I'm very happy that I found Itzik. 02:32:22.160 |
We lived in the same small Israeli town. We lived on two adjacent streets for years, 02:32:28.160 |
probably went to school on the same bus for years without really encountering each other. In the end, 02:32:34.320 |
we met on one of the first dating sites on the internet for gay people in Israel in 2002. 02:32:42.960 |
- You're saying the internet works. - Yes. I said a lot of bad things 02:32:46.480 |
or dangers about technology and the internet. There are also, of course, good things. And 02:32:51.120 |
this is not an accident. You have two kinds of minorities in history. You have minorities which 02:32:58.080 |
are a cohesive group, like Jews. Yes, you're a small minority, being born Jewish in, say, 02:33:05.600 |
Germany or Russia or whatever, but you're born into a small community. As a Jewish boy, you're born 02:33:11.840 |
to a Jewish family. You have Jewish parents. You have Jewish siblings. You're in a Jewish 02:33:16.400 |
neighborhood. You have Jewish friends. So these kinds of minorities, they could always come 02:33:21.200 |
together and help each other throughout history. Another type of minority, like gay people or more 02:33:27.440 |
broadly LGBTQ people, that as a gay boy, you're usually not born to a gay family with gay parents 02:33:35.040 |
and gay siblings in a gay neighborhood. So usually you find yourself completely alone. 02:33:43.120 |
For most of history, one of the biggest problems for the gay community was that there was no 02:33:49.120 |
community. How do you find one another? And the internet was a wonderful thing in this respect 02:33:56.640 |
because it made it very easy for these kinds of diffuse communities or diffuse minorities to find 02:34:03.440 |
each other. So me and Itzik, even though we rode the same bus together to school for years, we 02:34:08.800 |
didn't meet in the physical world. We met online. Because again, in the physical world, you don't 02:34:14.080 |
want to identify in an Israeli town in the 1980s. You ride the bus. You don't want to say, "Hey, 02:34:19.360 |
I'm gay. Is there anybody else gay here?" That's not a good idea. But on the internet, we could 02:34:24.640 |
find each other. There's another lesson in there that maybe sometimes the thing you're looking for 02:34:29.040 |
is right under your nose. Yeah. A very old lesson and a very true lesson in many ways. So you need 02:34:37.680 |
help from other people to realize the truth about yourself. So of course, in love, you cannot just 02:34:45.120 |
love abstractly. There is another person there. You need to find them. But also, we were one of 02:34:51.280 |
the first generations who enjoyed the benefits of gay liberation, of the very difficult struggles 02:34:58.880 |
of people who were much braver than us in the 1980s, 1970s, 1960s, who dared to question 02:35:06.560 |
social conventions, to struggle at sometimes a terrible price. And we benefited from it. 02:35:14.800 |
And more broadly, we spoke earlier about the feminist movement. There would have been no 02:35:20.160 |
gay liberation without the feminist movement. We also owe them for starting to change the gender 02:35:29.680 |
structure of the world. And this is always true. You can never do it just by yourself. 02:35:37.200 |
Also, I look at my journey in meditation. Okay, I could have found the idea of going to 02:35:43.760 |
a meditation retreat on my own, but I couldn't have developed the meditation 02:35:50.480 |
technique by myself. Somebody had to teach me this way of looking inside yourself. 02:35:57.200 |
And it's also a very important lesson that you can't do it just by yourself, 02:36:06.800 |
that this fantasy of complete autonomy, of complete self-sufficiency, it doesn't work. 02:36:13.280 |
It tends to be a very male macho fantasy: "I don't need anybody. I can be so 02:36:19.920 |
strong and so brave that I'll do everything by myself." It never works. 02:36:24.960 |
You need friends, you need a mentor, you need the very thing that makes us human, which is other humans. 02:36:38.560 |
You mentioned that the fear of boredom might be a kind of proxy for the fear of death. 02:36:43.840 |
So what role does the fear of death play in the human condition? Are you afraid of death? 02:36:49.200 |
Yes, I think everybody is afraid of death. I mean, all our fears come out of the fear of death. 02:36:55.920 |
But the fear of death is just so deep and difficult. Usually, we can't face it directly. 02:37:02.960 |
So we cut it into little pieces, and we face just little pieces. Oh, I lost my smartphone. 02:37:08.880 |
That's a little, little, little piece of the fear of death, which is of losing everything. 02:37:14.400 |
So I can't deal with losing everything. I'm dealing now with losing my phone or losing a 02:37:19.520 |
book or whatever. I feel pain. That's a small bit of the fear of death. Somebody who really doesn't 02:37:28.320 |
fear death would not fear anything at all. They'll be like, "Anything that happens, 02:37:34.400 |
I can deal with it. If I can deal with death, this is nothing." 02:37:36.880 |
- So any fear is a distant echo of the big fear of death. Have you ever 02:37:43.040 |
looked at it head on, caught glimpses, sort of contemplated it as the Stoics do? 02:37:50.640 |
- Yes. I mean, when I was a teenager, I would constantly contemplate it, 02:37:56.080 |
trying to understand, to imagine. It was a very, very shocking and moving experience. I remember 02:38:05.920 |
especially in connection with national ideology, which was also very big, strong in Israel, still 02:38:12.080 |
is, which again comes from the fear of death. You know that you're going to die, so you say, 02:38:17.600 |
"Okay, I die, but the nation lives on. I live on through the nation. I don't really die." 02:38:22.960 |
And you hear it especially on Memorial Day for fallen soldiers. So every year, there'll be in 02:38:30.080 |
school a Memorial Day ceremony for the soldiers who fell defending Israel in all its different wars, 02:38:36.800 |
and all these kids would come dressed in white, and you'll have this big ceremony with flags and 02:38:41.840 |
songs and dances in memory of the fallen soldiers. And you get the impression, again, I don't want 02:38:48.400 |
it to sound crass, but you get the impression that the best thing in life is to be a fallen soldier. 02:38:53.200 |
Because yes, you die, everybody dies in the end, but then you'll have all these 02:38:56.880 |
school kids for years and years remembering you and celebrating you, and you don't really die. 02:39:02.400 |
And I remember standing in these ceremonies and thinking, "What does it actually mean?" 02:39:07.040 |
Like, okay, so if I'm a fallen soldier, now I'm a skeleton, I'm bones in this military cemetery 02:39:16.720 |
under this stone, do I actually hear the kids singing all these patriotic songs? If not, 02:39:23.840 |
how do I know they do it? Maybe they trick me. Maybe I die in the war, and then they don't sing 02:39:27.600 |
any songs. And how does it help me? And I realized, I was quite young at the time, that if you're dead, 02:39:36.160 |
you can't hear anything because that's the meaning of being dead. And if you're dead, 02:39:40.720 |
you can't think anything like, "Oh, now they're remembering me," because you're dead. That's 02:39:44.080 |
the meaning of being dead. And it was a shocking realization. 02:39:47.840 |
But it's a really difficult realization to keep hold of in your mind. Like, it's the end. 02:39:53.600 |
I lost it over time. I mean, for many years, it was a very powerful fuel, motivation for 02:39:59.680 |
philosophical, for spiritual exploration. And I realized that the fear of death is really a 02:40:05.760 |
very powerful drive. And over the years, especially as I meditated, it kind of dissipated. And today, 02:40:12.800 |
sometimes I find myself trying to recapture this teenage fear of death, because it was so 02:40:18.720 |
powerful, and I just can't. I try to make the same image. I don't know. 02:40:26.960 |
Yeah. As a teenager, I always thought that the adults, there is something wrong with the adults, 02:40:31.840 |
because they don't get it. I would ask my parents or teachers about it, and they'd say, "Oh, yes, you die 02:40:39.120 |
in the end. That's it." But on the other hand, they are so worried about other things, like there'll 02:40:44.720 |
be a political crisis or an economic problem or a personal problem like with the bank or whatever. 02:40:50.080 |
They'll be so worried. But then about the fact that they're going to die, "Ah, we don't care." 02:40:54.960 |
That's why you read Camus and others when you're a teenager; you really worry about the existential 02:41:01.120 |
questions. Well, this feels like the right time to ask the big question, "What's the meaning of 02:41:06.320 |
this whole thing?" And you're the right person to ask. What's the meaning of life? Yes. 02:41:12.960 |
So first, what life is: life is feeling things, having sensations, 02:41:25.600 |
emotions, and reacting to them. When you feel something good, something pleasant, you want 02:41:32.000 |
more of it. When you feel something unpleasant, you want to get rid of it. 02:41:37.680 |
That's the whole of life. That's what is happening all the time. You feel things, you want the 02:41:43.200 |
pleasant things to increase, you want the unpleasant things to disappear. That's what life 02:41:49.360 |
is. If you ask, "What is the meaning of life?" as a more philosophical or spiritual 02:41:56.400 |
question, the real question to ask is, "What kind of answer do you expect?" Most people expect a story, 02:42:05.120 |
and that's always the wrong answer. Most people expect that the answer to the question, 02:42:11.040 |
"What is the meaning of life?" will be a story, like a big drama, that this is the plot line, 02:42:17.360 |
and this is your role in the story. This is what you have to do. This is your line in the big play. 02:42:24.160 |
You say your line, you do your thing, that's the thing. This is human imagination. This is fantasy. 02:42:30.960 |
To really understand life, life is not a story. The universe does not function like a story. 02:42:38.320 |
So I think to really understand life, you need to observe it directly in a nonverbal way. 02:42:47.040 |
Don't turn it into a story. And the question to start with is, "What is suffering? What is causing 02:42:55.760 |
suffering?" The question, "What is the meaning of life?" will take you to fantasies and delusions. 02:43:02.480 |
We want to stay with the reality of life. And the most important question about the reality of life 02:43:08.800 |
is, "What is suffering, and where is it coming from?" 02:43:12.960 |
And to answer that nonverbally, so the conscious experience of suffering. 02:43:17.680 |
Yes. When you suffer, try to observe what is really happening when you're suffering. 02:43:25.520 |
Well put. And I wonder if AI will also go through that same kind of process. 02:43:35.360 |
And whether it develops consciousness or not. At present, it's not conscious. It's just words. 02:43:41.200 |
It will just say to you, "Please don't hurt me," that's all. 02:43:43.680 |
Again, as I've mentioned to you, I'm a huge fan of yours. Thank you for the incredible work you do. 02:43:51.760 |
This conversation has been a long time, I think, coming. It's a huge honor to talk to you. 02:43:59.120 |
This was really fun. Thank you for talking today. 02:44:01.280 |
Thank you. I really enjoyed it. And as I said, I think the long form is the best form. 02:44:11.360 |
Thanks for listening to this conversation with Yuval Noah Harari. 02:44:14.320 |
To support this podcast, please check out our sponsors in the description. 02:44:18.000 |
And now, let me leave you with some words from Yuval Noah Harari himself. 02:44:22.480 |
How do you cause people to believe in an imagined order such as Christianity, 02:44:27.920 |
democracy, or capitalism? First, you never admit that the order is imagined. 02:44:34.000 |
Thank you for listening, and hope to see you next time.