Judea Pearl: Causal Reasoning, Counterfactuals, and the Path to AGI | Lex Fridman Podcast #56
00:00:00.000 |
The following is a conversation with Judea Pearl, 00:00:03.280 |
professor at UCLA and a winner of the Turing Award 00:00:06.760 |
that's generally recognized as the Nobel Prize of Computing. 00:00:12.960 |
of artificial intelligence, computer science, and statistics. 00:00:16.720 |
He has developed and championed probabilistic approaches 00:00:20.000 |
to AI, including Bayesian networks, and profound ideas 00:00:29.080 |
but to our understanding and practice of science. 00:00:32.800 |
But in the field of AI, the idea of causality, cause 00:00:41.160 |
must be developed in order to build truly intelligent 00:00:50.720 |
I recommend his most recent book called The Book of Why, 00:00:54.200 |
that presents key ideas from a lifetime of work 00:00:57.120 |
in a way that is accessible to the general public. 00:01:05.800 |
give it five stars on Apple Podcasts, support on Patreon, 00:01:09.160 |
or simply connect with me on Twitter, at Lex Fridman, 00:01:12.440 |
spelled F-R-I-D-M-A-N. If you leave a review on Apple 00:01:16.920 |
Podcasts especially, but also CastBox or comment on YouTube, 00:01:20.920 |
consider mentioning topics, people, ideas, questions, 00:01:40.000 |
where he said that the significance of your life 00:01:46.520 |
On most days, the existentialist approach to life 00:01:53.560 |
I recently started doing ads at the end of the introduction. 00:01:56.560 |
I'll do one or two minutes after introducing the episode 00:02:08.200 |
This show is presented by Cash App, the number one finance 00:02:12.960 |
I personally use Cash App to send money to friends, 00:02:15.440 |
but you can also use it to buy, sell, and deposit 00:02:22.720 |
You can buy fractions of a stock, say $1 worth, 00:02:27.960 |
Brokerage services are provided by Cash App Investing, 00:02:36.560 |
to support one of my favorite organizations called FIRST, 00:02:39.880 |
best known for their FIRST Robotics and LEGO competitions. 00:02:43.360 |
They educate and inspire hundreds of thousands 00:02:48.920 |
and have a perfect rating on Charity Navigator, 00:02:55.920 |
When you get Cash App from the App Store or Google Play 00:03:02.640 |
and Cash App will also donate $10 to FIRST, which again, 00:03:12.680 |
And now, here's my conversation with Judea Pearl. 00:03:18.000 |
You mentioned in an interview that science is not 00:03:20.760 |
a collection of facts, but a constant human struggle 00:03:26.720 |
What was the first mystery that you can recall 00:03:37.840 |
- What was it? - I had a fever for three days. 00:03:41.480 |
And when I learned about Descartes, analytic geometry, 00:03:47.080 |
and I found out that you can do all the construction 00:03:58.280 |
- So what kind of world does analytic geometry unlock? 00:04:07.360 |
Okay, so Descartes had the idea that geometrical construction 00:04:16.240 |
can be articulated in the language of algebra, 00:04:19.600 |
which means that all the proof that we did in high school 00:04:24.880 |
and trying to prove that the three bisectors meet 00:04:28.920 |
at one point and that, okay, all this can be proven 00:04:45.200 |
I'm telling you, right? - So it's the connection 00:04:47.000 |
between the different mathematical disciplines 00:04:54.440 |
- So which mathematical discipline is most beautiful? 00:05:02.440 |
- But there's a visual element to geometry being-- 00:05:27.520 |
So but the transition from one to another was really, 00:05:31.560 |
I thought that Descartes was the greatest mathematician 00:05:35.200 |
- So you have been at the, if you think of engineering 00:05:59.600 |
I mean, we got a very solid background in mathematics 00:06:12.320 |
They left their careers in Heidelberg and Berlin 00:06:17.880 |
And we were the beneficiary of that experiment. 00:06:33.360 |
Their cousins and their nieces and their faces. 00:06:45.200 |
- So you're almost educated as a historian of math. 00:06:53.800 |
So every exercise in math was connected with a person. 00:07:35.840 |
which I did at RCA Laboratories in superconductivity. 00:07:51.040 |
to get into software engineering a little bit. 00:08:27.840 |
- You have permanent current swirling around. 00:08:35.360 |
That's what we worked on in the 1960s in RCA. 00:08:39.680 |
And I discovered a few nice phenomena with the vortices. 00:09:01.360 |
I mean, thin film superconductors became important 00:09:06.920 |
So they called it the Pearl vortex without my knowledge. 00:09:13.800 |
- You have footprints in all of the sciences. 00:09:17.560 |
So let's talk about the universe a little bit. 00:09:20.960 |
Is the universe at the lowest level deterministic 00:09:23.880 |
or stochastic in your amateur philosophy view? 00:09:38.880 |
uncertainty principle and we have some experiments 00:09:57.240 |
It's a puzzle that you have the dice flipping machine, 00:10:14.240 |
So, but it only governs microscopic phenomena. 00:10:19.240 |
- So you don't think of quantum mechanics as useful 00:10:38.480 |
and as far as the neuron firing is concerned, 00:10:52.960 |
Free will is an illusion that we AI people are gonna solve. 00:11:08.920 |
A machine that acts as though it has free will. 00:11:17.160 |
and you wouldn't be able to tell the difference 00:11:23.720 |
- So the illusion, it propagates the illusion of free will 00:11:54.040 |
- Yeah, you can't fake it if you don't have it. 00:11:57.600 |
So let's begin at the beginning with probability, 00:12:16.960 |
- It's a degree of uncertainty that an agent has 00:12:22.400 |
- You're still expressing some knowledge in that statement. 00:12:27.840 |
it's absolutely a different kind of knowledge 00:12:47.600 |
versus 10%, it's a piece of useful knowledge. 00:13:01.600 |
- It allows you to predict things with a certain probability 00:13:06.120 |
and computing those probabilities is very useful. 00:13:15.080 |
and you need prediction to be able to survive. 00:13:25.060 |
- And so you've done a lot of work in causation, 00:13:49.360 |
What is it, so probability of something happening 00:13:53.180 |
is something, but then there's a bunch of things happening, 00:13:56.500 |
and sometimes they happen together, sometimes not. 00:14:00.800 |
So how do you think about correlation of things? 00:14:03.660 |
- Correlation occurs when two things vary together 00:14:24.440 |
Things cannot be correlated unless there is a reason 00:14:43.200 |
because we cannot grasp any other logic except causation. 00:15:05.040 |
Now staying the same means that I have chosen 00:15:11.720 |
where the guy has the same value as the previous one. 00:15:29.240 |
and I choose only those flipping experiments 00:15:35.620 |
and the bell rings when at least one of them is a tail, 00:15:44.360 |
because I only look at the cases where the bell rang. 00:15:53.680 |
with my audacity to ignore certain incidents, 00:16:06.760 |
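A minimal simulation of the two-coins-and-bell setup described above (the coin labels, trial count, and code structure are illustrative choices of mine, not from the conversation): two fair coins are flipped independently, a bell rings when at least one shows tails, and we keep only the trials where the bell rang. The coins are independent overall, yet appear dependent in the selected subsample.

```python
import random

random.seed(0)

def flip():
    return random.choice(["heads", "tails"])

all_trials = [(flip(), flip()) for _ in range(100_000)]

# The bell rings when at least one coin shows tails (a common effect of both coins).
bell_rang = [(a, b) for a, b in all_trials if a == "tails" or b == "tails"]

def p_b_tails_given_a(trials, a_value):
    subset = [b for a, b in trials if a == a_value]
    return sum(b == "tails" for b in subset) / len(subset)

# Unconditionally, coin B ignores coin A: both conditionals are ~0.5.
print("all trials:     P(B=tails | A=heads) =", round(p_b_tails_given_a(all_trials, "heads"), 3),
      "  P(B=tails | A=tails) =", round(p_b_tails_given_a(all_trials, "tails"), 3))

# Restricted to trials where the bell rang, a spurious dependence appears:
# if A came up heads, B had to be tails for the bell to ring at all.
print("bell rang only: P(B=tails | A=heads) =", round(p_b_tails_given_a(bell_rang, "heads"), 3),
      "  P(B=tails | A=tails) =", round(p_b_tails_given_a(bell_rang, "tails"), 3))
```

The correlation in the second line is manufactured entirely by the decision to ignore the trials where the bell stayed silent.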
- Right, so that's, you just outlined one of the flaws 00:16:13.040 |
and trying to infer something from the math about the world 00:16:36.320 |
that's what, that has been the majority of science. 00:16:42.660 |
Statisticians know it, statisticians know it, 00:16:59.560 |
that's why they all dismiss the Simpson Paradox, 00:17:02.400 |
ah, we know it, they don't know anything about it. 00:17:09.680 |
where all the variables are hard to account for, 00:17:20.080 |
Who is trying to get causation from correlation? 00:17:31.720 |
implying, sort of hypothesizing with our ability-- 00:17:40.480 |
or if they are outdated, or they're about to get outdated. 00:17:48.200 |
- Psychology, what, is it SEM, structural equation modeling? 00:17:50.800 |
- No, no, I was thinking of applied psychology studying, 00:17:57.240 |
in semi-autonomous vehicles, how people behave, 00:18:17.560 |
- Do they fall asleep, or do they tend to fall asleep 00:18:28.720 |
- And so you measure, you put people in the car, 00:18:32.480 |
because it's real world, you can't conduct an experiment 00:19:04.420 |
So you just observe when they drive it autonomously 00:19:11.200 |
- But maybe they turn it off when they're very tired. 00:19:16.600 |
- Okay, so now you have an uncontrolled experiment. 00:19:36.000 |
So that is an issue that is about 120 years old. 00:19:58.520 |
But the Babylonian king that wanted the exile, 00:20:03.520 |
the people from Israel that were taken in exile 00:20:46.340 |
"Let's take the other guys to eat the king's food, 00:20:50.200 |
"and in about a week's time, we'll test our performance." 00:20:57.800 |
and they were so much better than the others, 00:21:02.120 |
and the king nominated them to a superior position in his court. 00:22:05.080 |
Science has not provided us with the mathematics 00:22:41.520 |
- None of the machine learning people clobbered you? 00:22:48.840 |
- Most people, and this is why today's conversation, 00:22:55.780 |
There's certain aspects that are just effective today, 00:23:20.960 |
something like causation, but they don't, not necessarily. 00:23:55.980 |
- Hypothesis, everything which has to do with causality 00:24:09.080 |
- So it still needs the human expert to propose. 00:24:13.820 |
- Right, you need the human expert to specify 00:24:27.020 |
By who listens to whom, I mean one variable listens to the other. 00:24:31.260 |
So I say, okay, the tide is listening to the moon, 00:24:43.140 |
This is our understanding of the world in which we live. 00:25:01.260 |
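One way to read "who listens to whom" is as a set of structural assignments: each variable is computed from the variables it listens to plus its own noise. The tide-and-moon toy model below is my own minimal sketch of that reading; the functional forms and numbers are made up for illustration.

```python
import random

random.seed(1)

# "Who listens to whom" as structural assignments:
# the moon listens to nothing here (it is exogenous),
# the tide listens to the moon, never the other way around.
def moon_position(noise):
    return noise

def tide_height(moon, noise):
    return 2.0 * moon + noise

samples = []
for _ in range(5):
    moon = moon_position(random.gauss(0.0, 1.0))
    tide = tide_height(moon, random.gauss(0.0, 0.1))
    samples.append((round(moon, 2), round(tide, 2)))

print(samples)
# The arrow moon -> tide is knowledge we put in; the joint data on
# (moon, tide) alone would look the same if the arrow were reversed.
```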
and we certainly do not know how to handle it 00:25:07.260 |
In AI, the slogan is representation first, discovery second. 00:25:12.180 |
But if I give you all the information that you need, 00:25:42.060 |
once you give me a representation for my knowledge. 00:25:50.060 |
how to represent things, how do I discover them? 00:26:02.620 |
has not considered causation, that A causes B. 00:26:06.260 |
Just in anything, that seems like a non-obvious thing 00:26:11.260 |
that you think we would have really acknowledged it, 00:26:21.060 |
So knowledge, how hard is it to create a knowledge 00:26:31.260 |
because we have only four or five major variables. 00:26:36.060 |
And an epidemiologist or an economist can put them down. 00:26:41.560 |
What, minimum wage, unemployment policy, X, Y, Z, 00:26:48.420 |
and start collecting data and quantify the parameters 00:26:57.140 |
that were left unquantified with the initial knowledge. 00:27:12.020 |
everywhere, in the health science, that's a routine thing. 00:27:24.860 |
Once you have that, you have to have a language 00:27:35.780 |
One is how the science of causation is very useful 00:27:47.380 |
And then the other is how do we create intelligence systems 00:27:53.580 |
So if my research question is how do I pick up 00:27:58.660 |
all the knowledge that is required to be able to do that, 00:28:07.980 |
Do we return back to the problem that we didn't solve 00:28:26.580 |
- Task of eliciting knowledge from an expert, 00:28:34.260 |
So automating the building of knowledge as much as possible. 00:28:38.620 |
- It's a different game in the causal domain, 00:28:51.500 |
But you don't enrich it by asking for more rules. 00:28:58.980 |
to look at the data and quantifying and ask queries 00:29:05.500 |
You couldn't because the question is quite complex, 00:29:11.500 |
and it's not within the capability of ordinary cognition. 00:29:16.900 |
Of an ordinary person, even an ordinary expert, to answer. 00:29:32.100 |
- Okay, what's the effect of a drug on recovery? 00:29:35.920 |
Was it the aspirin that caused my headache to be cured? 00:29:46.300 |
This is already, you see, it's a difficult question 00:30:07.980 |
- And the first exercise is express it mathematically. 00:30:26.400 |
I want to find the effect of the drug on my headache. 00:30:40.760 |
It's the difference between association and intervention. 00:30:48.900 |
So the do-calculus is connected to the do-operator itself, 00:31:01.740 |
you're making the choice to change a variable. 00:31:11.860 |
and the mechanism by which we take your query 00:31:15.420 |
and we translate it into something that we can work with 00:31:23.340 |
and you cut off all the incoming arrows into X. 00:31:26.820 |
And you're looking now at the modified, mutilated model, 00:31:40.220 |
from all influences that acted upon them earlier. 00:31:45.220 |
And you subject them to the tyranny of your muscles. 00:31:49.180 |
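A sketch of the surgery being described, on a toy model of my own choosing (Z -> X -> Y with Z also affecting Y): intervening with do(X = x) deletes X's own assignment, which is the "cutting of incoming arrows into X," and leaves every other mechanism intact.

```python
import random

random.seed(2)

def sample(do_x=None):
    """One draw from a toy structural model: Z -> X -> Y and Z -> Y.
    With do_x=None we follow the observational model; otherwise X's own
    mechanism is cut and X is set directly (the 'mutilated' model)."""
    z = random.random() < 0.5
    if do_x is None:
        x = z if random.random() < 0.8 else (not z)   # X listens to Z
    else:
        x = do_x                                       # incoming arrows into X are cut
    y = random.random() < (0.2 + 0.5 * x + 0.2 * z)    # Y listens to X and Z
    return z, x, y

n = 200_000
observational = [sample() for _ in range(n)]
p_y_given_x1 = (sum(y for z, x, y in observational if x)
                / sum(1 for z, x, y in observational if x))
p_y_do_x1 = sum(sample(do_x=True)[2] for _ in range(n)) / n

# Conditioning and intervening answer different questions: seeing X = 1
# also tells us something about Z, while do(X = 1) does not.
print("P(Y=1 | X=1)     ~", round(p_y_given_x1, 3))   # roughly 0.86 here
print("P(Y=1 | do(X=1)) ~", round(p_y_do_x1, 3))      # roughly 0.80 here
```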
- So you remove all the questions about causality 00:31:55.780 |
- No, because there's one level of questions. 00:31:59.020 |
Answer questions about what will happen if you do things. 00:32:24.740 |
- Where we do something, where we drink the coffee 00:32:31.780 |
- To imagine what the experiment will look like 00:32:40.620 |
What is the effect of blood pressure on mortality? 00:32:50.800 |
Which means I can, if I have a model of your body, 00:32:58.620 |
how the blood pressure change will affect your mortality. 00:33:04.700 |
How I go into the model and I conduct this surgery 00:33:12.060 |
even though physically I cannot do it. 00:33:22.200 |
Meaning the surgery of changing the blood pressure is, 00:33:56.820 |
But I don't touch things which do not depend on X. 00:34:14.940 |
But hypothetically, no. - Hypothetically, no. 00:34:17.380 |
- If we have a model, that is what the model is for. 00:34:24.620 |
you take it apart, put it back, that's the idea of a model. 00:34:28.860 |
It's the idea of thinking counterfactually, imagining, 00:34:35.140 |
- So by constructing that model, you can start to infer 00:34:37.940 |
if higher blood pressure leads to mortality, 00:34:47.340 |
- I construct a model, I still cannot answer it. 00:34:50.780 |
I have to see if I have enough information in the model 00:34:53.820 |
that would allow me to find out the effects of intervention 00:35:06.340 |
- You need to have assumptions about who affects whom. 00:35:16.380 |
the answer is yes, you can get it from observational study. 00:35:25.680 |
Then you need to find either different kind of observation 00:35:30.680 |
that you haven't considered, or one experiment. 00:35:34.060 |
- So basically, that puts a lot of pressure on you 00:35:42.940 |
But you don't have to encode more than what you know. 00:35:47.500 |
God forbid, if you put, like economists are doing this, 00:35:52.860 |
They put assumptions, even if they don't prevail in the world 00:35:56.040 |
they put assumptions so they can identify things. 00:36:01.500 |
but the problem is you don't know what you don't know. 00:36:07.540 |
because if you don't know, you say it's possible, 00:36:10.620 |
it's possible that X affects the traffic tomorrow. 00:36:18.660 |
You put down an arrow which says it's possible. 00:36:23.940 |
- So there's not a significant cost to adding arrows that-- 00:36:28.020 |
- The more arrows you add, the less likely you are 00:36:32.220 |
to identify things from purely observational data. 00:36:45.420 |
the answer is, you can answer it ahead of time. 00:36:49.160 |
I cannot answer my query from observational data. 00:37:03.100 |
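When the graph does say the query is answerable from observational data, the recipe is often the back-door adjustment formula, P(Y | do(X = x)) = sum over z of P(Y | X = x, Z = z) P(Z = z). The sketch below reuses the toy confounded model from above (my own numbers, not anything from the conversation) and checks that adjusting over Z recovers the interventional answer from purely observational samples.

```python
import random

random.seed(3)

def observational_sample():
    z = random.random() < 0.5
    x = z if random.random() < 0.8 else (not z)        # X listens to Z (a confounder)
    y = random.random() < (0.2 + 0.5 * x + 0.2 * z)    # Y listens to X and Z
    return z, x, y

data = [observational_sample() for _ in range(300_000)]

def prob(pred):
    return sum(1 for row in data if pred(*row)) / len(data)

# Naive conditioning: biased by the open back-door path X <- Z -> Y.
naive = prob(lambda z, x, y: x and y) / prob(lambda z, x, y: x)

# Back-door adjustment: sum over the confounder's strata.
adjusted = 0.0
for z_val in (False, True):
    p_z = prob(lambda z, x, y: z == z_val)
    p_x1_z = prob(lambda z, x, y: z == z_val and x)
    p_x1_y1_z = prob(lambda z, x, y: z == z_val and x and y)
    adjusted += (p_x1_y1_z / p_x1_z) * p_z

print("naive    P(Y=1 | X=1)     ~", round(naive, 3))     # ~0.86, confounded
print("adjusted P(Y=1 | do(X=1)) ~", round(adjusted, 3))  # ~0.80, the causal effect
```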
and this do calculus is allowing for intervention. 00:37:13.220 |
- And trying to sort of understand the difference 00:37:18.320 |
What's the, first of all, what is counterfactuals, 00:37:29.680 |
as opposed to just reasoning what effect actions have? 00:37:34.680 |
- Counterfactual contains what we normally call explanations. 00:37:39.920 |
- Can you give an example of a counterfactual? 00:37:44.320 |
affects something else, I didn't explain anything yet. 00:37:55.400 |
I'm asking for explanation, what cured my headache? 00:38:20.260 |
I would have a headache, you're thereby saying 00:38:22.760 |
that aspirin is the thing that removes the headache. 00:38:25.960 |
- Yeah, but you have to have another important piece of information. 00:38:40.520 |
- Yeah, by considering what would have happened 00:38:44.400 |
if everything else is the same, but I didn't take aspirin. 00:38:46.960 |
- That's right, so you know that things took place. 00:39:16.600 |
which says had he not shot, you have a logical clash. 00:39:23.820 |
That's the counterfactual, and that is the source 00:39:26.160 |
of our explanation of the idea of responsibility, 00:39:37.220 |
that's the highest level of reasoning, right? 00:39:54.700 |
and you say, had this weight been two kilogram, 00:40:05.560 |
except that mathematics is only in the form of equation, 00:40:09.560 |
equating the weight, proportionality constant, 00:40:18.540 |
So you don't have the asymmetry in the equation of physics, 00:40:23.300 |
although every physicist thinks counterfactually. 00:40:26.820 |
Ask high school kids, had the weight been three kilograms, 00:40:35.160 |
because they do the counterfactual processing in their mind, 00:40:38.900 |
and then they put it into equation, algebraic equation, 00:40:46.700 |
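The spring example can be made concrete with Hooke's law, length = rest_length + weight / k. In the minimal worked version below (the constants are mine, purely for illustration), the counterfactual "had the weight been three kilograms" is answered in the three steps being described: infer the spring's own constant from what actually happened, swap in the hypothetical weight, and replay the same equation.

```python
# Hooke's-law counterfactual in three steps: abduction, action, prediction.
rest_length_m = 0.50        # assumed unloaded length of the spring
observed_weight_kg = 2.0    # the weight that was actually hung
observed_length_m = 0.70    # the length that was actually measured

# Abduction: use the factual observation to pin down the spring's stiffness.
k = observed_weight_kg / (observed_length_m - rest_length_m)   # kg of weight per metre of stretch

# Action: replace the weight with the hypothetical value, keep everything else.
hypothetical_weight_kg = 3.0

# Prediction: replay the same equation under the change.
counterfactual_length_m = rest_length_m + hypothetical_weight_kg / k

print(f"inferred stiffness: {k:.1f} kg per metre of stretch")
print(f"had the weight been {hypothetical_weight_kg:.0f} kg, "
      f"the spring would have stretched to {counterfactual_length_m:.2f} m")
```

The asymmetry the algebraic equation itself lacks is supplied by the reading order: the weight sets the length, not the other way around.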
- How do you make a robot learn these relationships? 00:40:55.580 |
So before you go learning, you have to ask yourself, 00:40:59.380 |
suppose I give him all the information, okay? 00:41:01.780 |
Can the robot perform the task that I ask him to perform? 00:41:07.820 |
Can he reason and say, no, it wasn't the aspirin, 00:41:10.980 |
it was the good news you received on the phone? 00:41:13.320 |
- Right, because, well, unless the robot had a model, 00:41:26.180 |
- But now we have to linger, and we have to say, 00:41:32.220 |
without a team of human experts running around? 00:41:59.500 |
And they learn it by playful manipulation of the world. 00:42:07.660 |
- The simple world involves only toys and balls and chimes. 00:42:11.860 |
But if you think about it, it's a complex world. 00:42:23.740 |
plus parent's guidance, peer wisdom, and hearsay. 00:43:03.060 |
to be able to play in the crib with different objects? 00:43:14.180 |
- Manipulating physical objects on this very, 00:43:25.260 |
Because my sense is the world is extremely complicated. 00:43:42.620 |
It's easy in the sense that you have only 20 variables 00:43:46.980 |
and they are just variables, they're not mechanisms. 00:44:27.180 |
I mean, it seems like you would only have to be able 00:44:37.900 |
- I think it's a matter of combining simple models 00:44:41.220 |
from many, many sources, from many, many disciplines 00:44:48.220 |
Metaphors are the basis of human intelligence. 00:45:15.960 |
The Greeks believed that the sky is an opaque shell. 00:45:25.940 |
It's an opaque shell, and the stars are holes 00:45:29.660 |
poked in the shell through which you see the eternal light. 00:45:36.980 |
Because they understand how you poke holes in shells. 00:46:04.960 |
enabled Eratosthenes to measure the radius of the Earth 00:46:23.040 |
I know the distance, I'll measure the two angles, 00:46:26.400 |
and then I have the radius of the shell of the turtle. 00:46:38.480 |
very close to the measurements we have today, 00:47:00.320 |
the Babylonian astronomers were the machine learning people 00:47:27.520 |
Familiar means that answers to certain questions 00:47:35.600 |
- And they were made explicit because somewhere 00:47:38.520 |
in the past, you've constructed a model of that. 00:47:42.400 |
- You're familiar with, so the child is familiar 00:47:48.360 |
So the child could predict that if you let loose 00:48:16.240 |
but the marriage between the two is a tough thing, 00:48:20.560 |
which we haven't yet been able to algorithmize. 00:48:24.760 |
- So you think of that process of using metaphor 00:48:35.880 |
- It is reasoning by metaphor, metaphorical reasoning. 00:48:47.640 |
- It is, it is, it is definitely a form of learning. 00:48:53.800 |
taking something which theoretically is derivable 00:49:05.580 |
Finding the winning starting move in chess is hard. 00:49:35.440 |
So what does a chess master have that we don't have? 00:49:49.000 |
I don't know about you, I'm not a chess master. 00:49:58.620 |
He has seen it before, or he has seen the pattern before, 00:50:18.960 |
we humans are able to initially derive very effectively 00:50:22.400 |
and then reason by metaphor very effectively, 00:50:25.120 |
and make it look so easy, that it makes one wonder 00:50:42.860 |
All I can tell you is that we are making tremendous progress 00:50:52.160 |
Something that I even dare to call it revolution, 00:50:57.160 |
the causal revolution, because what we have achieved 00:51:08.760 |
dwarfs everything that was derived in the entire history. 00:51:20.600 |
and there's really important good work you're doing 00:51:26.420 |
Where do these worlds collide, and what does that look like? 00:51:32.720 |
- First, they're gonna work without collisions. 00:51:41.760 |
- The human is going to jumpstart the exercise 00:51:48.520 |
by providing qualitative, non-committing models 00:51:56.440 |
of the universe, of how, in reality, the domain of discourse works. 00:52:01.440 |
The machine is gonna take over from that point of view 00:52:06.800 |
and derive whatever the calculus says can be derived. 00:52:11.800 |
Namely, quantitative answers to our questions. 00:52:18.440 |
I'll give you some examples of complex questions 00:52:21.200 |
that would boggle your mind if you think about it. 00:52:26.200 |
You take results of studies in diverse populations 00:52:32.560 |
under diverse conditions, and you infer the cause-effect relations 00:52:38.640 |
for a new population which doesn't even resemble 00:52:45.160 |
And you do that by, do calculus, you do that by generalizing 00:52:57.000 |
Let's ignore the differences and pull out the commonality. 00:53:01.200 |
And you do it over maybe 100 hospitals around the world. 00:53:06.160 |
From that, you can get really mileage from big data. 00:53:20.520 |
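A hedged sketch of that kind of fusion (the re-weighting formula below is the simplest transport case, and the numbers are mine): if a study population and a target population differ only in the distribution of an effect modifier Z, an experimental result can be carried over as P*(Y | do(X)) = sum over z of P(Y | do(X), Z = z) P*(Z = z).

```python
# Transporting an experimental finding to a new population by re-weighting
# over an effect modifier Z (say, age group). Toy numbers, purely illustrative.

# Stratified result from the study population's randomized trial:
# P(recovery | do(treatment), Z = z)
effect_given_z = {"young": 0.90, "old": 0.60}

# How common each stratum is in the study vs. the target population.
study_p_z = {"young": 0.70, "old": 0.30}
target_p_z = {"young": 0.20, "old": 0.80}

def reweighted_effect(effect_by_z, p_z):
    # sum over strata: P(Y | do(X), Z = z) * P(Z = z) for the chosen population
    return sum(effect_by_z[z] * p_z[z] for z in p_z)

print("effect in the study population:", round(reweighted_effect(effect_given_z, study_p_z), 2))
print("effect transported to target  :", round(reweighted_effect(effect_given_z, target_p_z), 2))
```

Richer cases, where populations differ in more than one place or some strata are unobserved, are exactly what the do-calculus machinery is meant to sort out.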
I think, especially for medical applications. 00:53:30.080 |
which is the temporal relationship between things. 00:53:45.240 |
- Is temporal precedence, the arrow of time in physics-- 00:53:55.800 |
- Yes, I've never seen a cause propagate backward. 00:54:07.080 |
I suppose that's still forward in the arrow of time. 00:54:10.400 |
But are there relationships, logical relationships, 00:54:17.160 |
- Sure, the do-calculus is a logical relationship. 00:54:47.960 |
reason about the order of events, the source, the-- 00:54:54.920 |
- Not about, we're not deriving the order of events. 00:55:12.520 |
other causal relationship that could be derived 00:55:28.240 |
And I ask, what if rifleman A declined to shoot? 00:55:37.960 |
If he declined to shoot, it means that he disobeyed orders. 00:55:51.000 |
That's how you start, that's the initial order. 00:55:53.560 |
But now you ask question about breaking the rules. 00:56:20.200 |
But the curiosity, the natural curiosity for me is 00:56:24.320 |
that yes, you're absolutely correct and important. 00:56:27.980 |
And it's hard to believe that we haven't done this 00:56:31.080 |
seriously, extensively, already a long time ago. 00:56:37.000 |
But I also wanna know, maybe you can philosophize 00:56:47.260 |
We put a learning machine that watches execution trials 00:56:56.520 |
All the machine can learn is to see shot or not shot. 00:57:07.340 |
From the fact you don't know who listens to whom. 00:57:13.720 |
listened to the bullets, that the bullets are listening 00:57:19.280 |
All we hear is one command, two shots, dead, okay? 00:57:36.680 |
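The firing-squad story can be written as a tiny structural model: the court orders, the captain signals, each rifleman fires if signaled, and the prisoner dies if either fires. The sketch below (my own encoding of the example) answers "what if rifleman A had declined to shoot?" by keeping the factual situation fixed and surgically overriding A's action, which is exactly the step a facts-only learner has no handle for.

```python
def firing_squad(court_orders, force_a=None):
    """Structural model of the firing squad.
    force_a overrides rifleman A's own mechanism: a counterfactual surgery."""
    captain_signals = court_orders                  # the captain listens to the court
    a_shoots = captain_signals if force_a is None else force_a
    b_shoots = captain_signals                      # B still obeys the captain
    prisoner_dies = a_shoots or b_shoots            # either bullet is enough
    return {"A shoots": a_shoots, "B shoots": b_shoots, "dead": prisoner_dies}

# Fact: one command, two shots, dead. In this model that pins down the court's order.
court_ordered = True

factual = firing_squad(court_ordered)
counterfactual = firing_squad(court_ordered, force_a=False)   # had A declined to shoot

print("factual:       ", factual)
print("counterfactual:", counterfactual)
# The prisoner would still be dead, because B still follows the unchanged order.
# A learner that only records shot/not-shot has no "strings" (who listens to
# whom) with which to even pose, let alone answer, this question.
```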
you can start proposing ideas for humans to review? 00:57:44.360 |
So the robot is watching trials like that, 200 trials, 00:58:03.640 |
It's looking at the facts, but the facts don't give you the strings 00:58:07.160 |
- Absolutely, but do you think of machine learning 00:58:17.600 |
- Right now they only look at the facts, yeah. 00:58:19.160 |
- So is there a way to modify, in your sense-- 00:58:26.960 |
- Doing the interventionist kind of thing, intervention. 00:58:43.600 |
The noise still has to be random to be able to relate it 00:58:55.480 |
from which to infer the strings behind the facts. 00:59:03.000 |
But now that we are expert in what you can do 00:59:06.240 |
once you have a model, we can reason back and say 00:59:39.380 |
- So without imagining what the end goal looks like, 01:00:12.520 |
If robots can communicate with reward and punishment 01:00:19.960 |
among themselves, hitting each other on the wrist 01:00:27.800 |
Playing better soccer because they can do that. 01:00:35.940 |
- Because they can communicate among themselves. 01:00:38.100 |
- Because of the communication they can do this-- 01:00:40.100 |
- Because they communicate like us, reward and punishment. 01:00:44.060 |
Yes, you didn't pass the ball the right time, 01:00:47.580 |
and so therefore you're gonna sit on the bench 01:00:53.660 |
the question is will they play better soccer? 01:00:59.680 |
Without this ability to reason about reward and punishment, 01:01:08.420 |
- So far I can only think about communication. 01:01:11.740 |
- Communication is, not necessarily natural language, 01:01:17.580 |
And that's important to have a quick and effective means 01:01:24.100 |
If the coach tells you you should have passed the ball, 01:01:37.740 |
So how can a coach tell you you should have passed the ball? 01:01:45.620 |
You know your software, you tweak the right module, 01:01:55.240 |
- No, no, no, no, no, they're not well defined. 01:02:14.360 |
do you think this cause and effect type of thinking 01:02:24.220 |
ethics under which the machines make decisions? 01:02:26.420 |
Is the cause effect where the two can come together? 01:02:47.160 |
which should be very much, what is compassion? 01:02:50.900 |
They imagine that you suffer pain as much as me. 01:02:57.020 |
- I do have already a model of myself, right? 01:03:28.080 |
You look at yourself as if you are a part of the environment. 01:03:32.400 |
If you build a model of yourself versus the environment, 01:03:35.440 |
then you can say I need to have a model of myself. 01:03:38.240 |
I have abilities, I have desires and so forth, okay? 01:03:44.360 |
Not a full detail because I cannot get the whole thing 01:03:50.700 |
So on that level of a blueprint, I can modify things. 01:04:11.820 |
so including yourself into the model of the world? 01:04:15.480 |
Some people tell me, no, this is only part of consciousness, 01:04:19.600 |
and then they start telling me what they really mean 01:04:30.240 |
- Do you have concerns about the future of AI, 01:04:36.540 |
all the different trajectories of all of our research? 01:04:40.680 |
- Where's your hope, where the movement heads, 01:04:44.360 |
- I'm concerned because I know we are building 01:04:48.000 |
a new species that has a capability of exceeding us, 01:04:52.400 |
exceeding our capabilities, and can breed itself 01:05:07.640 |
We don't know the degree to which we control it. 01:05:42.720 |
- For us it was, but a few people along the way, 01:05:46.360 |
a few creatures along the way would not agree. 01:06:20.760 |
So sample of one doesn't mean poverty of knowledge. 01:06:34.400 |
But I really feel helpless in contributing to this argument 01:06:59.240 |
- And later served in the Israeli military, the Israel Defense Forces. 01:06:59.240 |
I wanted to be a member of the kibbutz throughout my life 01:08:13.680 |
It tripled its population from 600,000 to a million point 01:08:29.360 |
When you wanted to make an omelet in a restaurant, 01:08:35.680 |
And they imprisoned people from bringing food 01:08:43.120 |
from the farming and from the villages to the city. 01:08:53.400 |
and higher education did not suffer any budget cut. 01:08:59.160 |
They still invested in me, in my wife, in our generation 01:09:17.280 |
It's a miracle that we survived the war of 1948. 01:09:33.600 |
that not many people talk about, the next phase. 01:09:40.280 |
and the country managed to triple its population. 01:09:45.280 |
Imagine the United States going from, what, 350 million 01:10:15.400 |
- In your view, looking back, is religion good for society? 01:10:21.000 |
- That's a good question for robotics, you know? 01:10:34.640 |
that religion is good to you, to keep you in line. 01:10:37.920 |
Should we give the robot the metaphor of a god? 01:10:42.920 |
As a matter of fact, the robot will get it without us also. 01:11:00.360 |
father teaching, father image, and mother image. 01:11:10.800 |
but assuming the robot is gonna have a mother and a father, 01:11:22.440 |
So the robot will have this model of the trainer, 01:11:48.680 |
And so the question is if overall that metaphor 01:11:59.140 |
But as long as you keep in mind it's only a metaphor. 01:12:18.000 |
- The way he's known is he was abducted in Pakistan 01:12:32.000 |
I don't even pay attention to what the pretense was. 01:12:35.180 |
Originally they wanted to have the United States 01:12:55.400 |
But eventually he was executed in front of a camera. 01:13:00.400 |
- At the core of that is hate and intolerance. 01:13:10.000 |
We don't really appreciate the depth of the hate 01:13:27.520 |
I just listened recently to what they teach you 01:13:50.600 |
- We didn't know how, but we knew who did it. 01:14:00.480 |
- Do you think all of us are capable of evil? 01:14:12.240 |
If you're indoctrinated sufficiently long and in depth, 01:14:17.240 |
you're capable of ISIS, you're capable of Nazism. 01:14:28.400 |
after we have gone through some Western education 01:14:32.840 |
and we learn that everything is really relative. 01:14:40.080 |
Whether we are capable now of being transformed 01:14:43.520 |
under certain circumstances to become brutal. 01:14:53.000 |
because some people say yes, given the right circumstances, 01:15:14.540 |
He wrote an article in the Wall Street Journal 01:15:16.780 |
titled Daniel Pearl and the Normalization of Evil. 01:15:31.860 |
- The message was that we are not treating terrorism. 01:15:47.620 |
People have grievance and they go and bomb restaurants. 01:15:55.260 |
Look, you're even not surprised when I tell you that. 01:16:09.820 |
And we have created that to ourselves by normalizing, 01:16:26.780 |
Every terrorist yesterday becomes a freedom fighter today 01:16:37.660 |
- And so we should call out evil when there's evil. 01:16:47.660 |
- Yeah, if we want to separate good from evil. 01:17:12.260 |
Does your heart have anger, sadness, or is it hope? 01:17:17.220 |
- Look, I see some beautiful people coming from Pakistan. 01:17:29.420 |
But I see horrible propagation of evil in this country too. 01:17:42.780 |
can catch the mind of the best intellectuals. 01:18:03.900 |
He had a sense of balance that I didn't have. 01:18:29.380 |
He really grew up with the idea that a foreigner 01:18:54.620 |
and then I just hugged him and said here's a dime, 01:19:17.880 |
What is the best way to arrive at new breakthrough ideas 01:19:42.700 |
If they are really dumb, you will find out quickly 01:19:48.380 |
by trial and error, to see that they're not leading 01:19:51.140 |
any place, but follow them and try to understand 01:20:05.500 |
- There is a lot of inertia in science, in academia. 01:20:18.580 |
- Yeah, those two words, your way, that's a powerful thing. 01:20:28.580 |
I wrote The Book of Why in order to democratize common sense. 01:20:38.660 |
In order to instill rebellious spirit in students 01:20:45.420 |
so they wouldn't wait until the professor gets things right. 01:20:49.920 |
- So you wrote the manifesto of the rebellion 01:21:02.780 |
what ideas do you hope ripple through the next many decades? 01:21:29.800 |
What is counterfactual in terms of a model surgery? 01:21:34.440 |
That's it, because everything follows from that. 01:21:54.360 |
- Judea, thank you so much for talking today. 01:21:57.280 |
- Thank you for being so attentive and instigating. 01:22:07.500 |
Thanks for listening to this conversation with Judea Pearl. 01:22:11.280 |
And thank you to our presenting sponsor, Cash App. 01:22:14.240 |
Download it, use code LEXPODCAST, you'll get $10, 01:22:18.280 |
and $10 will go to FIRST, a STEM education nonprofit 01:22:21.920 |
that inspires hundreds of thousands of young minds 01:22:24.840 |
to learn and to dream of engineering our future. 01:22:28.280 |
If you enjoy this podcast, subscribe on YouTube, 01:22:31.080 |
give it five stars on Apple Podcasts, support on Patreon, 01:22:36.840 |
And now, let me leave you with some words of wisdom 01:22:41.040 |
You cannot answer a question that you cannot ask, 01:22:44.000 |
and you cannot ask a question you have no words for. 01:22:47.380 |
Thank you for listening, and hope to see you next time.