Alex Garland: Ex Machina, Devs, Annihilation, and the Poetry of Science | Lex Fridman Podcast #77
Chapters
0:00 Introduction
3:42 Are we living in a dream?
7:15 Aliens
12:34 Science fiction: imagination becoming reality
17:29 Artificial intelligence
22:40 The new "Devs" series and the veneer of virtue in Silicon Valley
31:50 Ex Machina and 2001: A Space Odyssey
44:58 Lone genius
49:34 Drawing inspiration from Elon Musk
51:24 Space travel
54:03 Free will
57:35 Devs and the poetry of science
66:38 What will you be remembered for?
The following is a conversation with Alex Garland, 00:00:07.960 |
from the dreamlike exploration of human self-destruction 00:00:12.640 |
to the deep questions of consciousness and intelligence 00:00:25.720 |
with the release of his new series called "Devs" 00:00:28.520 |
that will premiere this Thursday, March 5th on Hulu 00:00:34.680 |
It explores many of the themes this very podcast is about, 00:00:39.240 |
from quantum mechanics to artificial life to simulation 00:00:43.400 |
to the modern nature of power in the tech world. 00:00:46.440 |
I got a chance to watch a preview and loved it. 00:00:52.000 |
Nick Offerman especially is incredible in it. 00:00:57.960 |
and the philosophical and scientific ideas explored 00:01:10.480 |
you'll see there's a programmer with a Russian accent 00:01:13.080 |
looking at a screen with Python-like code on it 00:01:27.360 |
of how Alex weaves science and philosophy together 00:01:36.640 |
in ways I may only be able to articulate in a few years. 00:01:43.560 |
for the first time planted a seed of an idea in my mind, 00:01:55.240 |
Plus, he's just really a fun person to talk with 00:01:57.920 |
about the biggest possible questions in our universe. 00:02:17.000 |
As usual, I'll do one or two minutes of ads now 00:02:47.080 |
in the context of the history of money is fascinating. 00:02:54.920 |
Debits and credits on ledgers started 30,000 years ago. 00:02:59.800 |
The US dollar was created about 200 years ago. 00:03:03.840 |
And Bitcoin, the first decentralized cryptocurrency, 00:03:20.640 |
So again, if you get Cash App from the App Store 00:03:26.120 |
you'll get $10, and Cash App will also donate $10 to FIRST, 00:03:31.880 |
that is helping advance robotics and STEM education 00:03:37.520 |
And now, here's my conversation with Alex Garland. 00:03:52.360 |
do you think, a philosophical question, I apologize, 00:03:58.600 |
or in a simulation, like the kind that the shimmer creates? 00:04:08.200 |
I wanna sort of separate that out into two things. 00:04:11.600 |
Yes, I think we're living in a dream of sorts. 00:04:14.640 |
No, I don't think we're living in a simulation. 00:04:27.640 |
and the space is full of other planets and stars 00:04:35.600 |
I don't think the matter in that universe is simulated. 00:04:46.360 |
but in my opinion, I'll just go back to that. 00:04:50.200 |
I think it seems very like we're living in a dream state. 00:04:54.320 |
And I think that's just to do with the nature 00:05:06.240 |
is the degree to which reality is counterintuitive, 00:05:10.800 |
and that the things that are presented to us as objective 00:05:15.120 |
and quantum mechanics is full of that kind of thing, 00:05:20.840 |
So my understanding of the way the brain works 00:05:25.840 |
is you get some information to hit your optic nerve, 00:05:32.760 |
about what it's seeing or what it's saying it's seeing. 00:05:45.440 |
means that we are essentially living in a subjective state, 00:05:50.960 |
So I think you could enlarge on the dream state 00:06:04.000 |
you've also described that world as psychedelia. 00:06:08.560 |
So on that topic, I'm curious about that world. 00:06:33.360 |
- Yeah, exactly, they just give an alternate distortion. 00:06:41.040 |
which is a little bit more allied to daydreams 00:06:49.080 |
you're feeling unconsciously anxious at that moment 00:06:53.200 |
you'll have a more pronounced unpleasant experience. 00:07:00.240 |
But yeah, so if I'm saying we're starting from a premise, 00:07:09.480 |
what those drugs do is help you go further down an avenue 00:07:13.440 |
or maybe a slightly different avenue, but that's all. 00:07:24.960 |
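As a rough, invented illustration of the "best guess" picture of perception described above, and not anything from the episode, here is a minimal Python sketch: the observer never sees the world directly, only a noisy reading, and the percept is a posterior guess about what is out there. The posterior_bright function and its toy likelihoods are assumptions made up for this example.

```python
# Invented sketch of perception as a "best guess": the percept is a posterior
# probability computed from a noisy sensor reading, not the raw world itself.

def posterior_bright(reading: float, prior_bright: float = 0.5) -> float:
    """P(scene is bright | noisy reading in [0, 1]), with toy likelihoods."""
    like_bright = reading           # p(reading | bright): high readings favored
    like_dark = 1.0 - reading       # p(reading | dark): low readings favored
    evidence = like_bright * prior_bright + like_dark * (1.0 - prior_bright)
    return like_bright * prior_bright / evidence

# The "experience" is the guess, never the thing in itself.
for reading in (0.2, 0.5, 0.9):
    print(f"reading={reading:.1f} -> P(bright)={posterior_bright(reading):.2f}")
```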
is created by, I believe, perhaps, an alien entity. 00:07:29.420 |
Of course, everything's up to interpretation, right? 00:07:32.100 |
But do you think there's, in our world, in our universe, 00:07:36.180 |
do you think there's intelligent life out there? 00:07:39.080 |
And if so, how different is it from us humans? 00:07:42.500 |
- Well, one of the things I was trying to do in "Annihilation" 00:08:04.340 |
or any one of the sort of storytelling mediums 00:08:08.260 |
is we would always give them very human-like qualities. 00:08:11.900 |
So they wanted to teach us about galactic federations 00:08:14.860 |
or they wanted to eat us or they wanted our resources 00:08:21.360 |
But all of these are incredibly human-like motivations. 00:08:36.220 |
Maybe it had a completely different clock speed. 00:08:43.180 |
we're getting information, light hits our optic nerve, 00:08:46.860 |
our brain makes the best guess of what it's seeing. 00:08:49.900 |
something, you know, the thing we were talking about before. 00:08:51.820 |
What if this alien doesn't have an optic nerve? 00:08:54.980 |
Maybe its way of encountering the space it's in 00:08:59.260 |
Maybe it has a different relationship with gravity. 00:09:01.820 |
- The basic laws of physics it operates under 00:09:05.820 |
It could be a different time scale and so on. 00:09:10.300 |
it could be the same underlying laws of physics. 00:09:16.260 |
or it's a creature created in a quantum mechanical way. 00:09:19.180 |
It just ends up in a very, very different place 00:09:23.420 |
So part of the preoccupation with "Annihilation" 00:09:26.860 |
was to come up with an alien that was really alien 00:09:35.380 |
any kind of easy connection between human and the alien. 00:09:42.160 |
that you could have an alien that landed on this planet 00:09:46.600 |
And we might only glancingly know it was here. 00:09:53.860 |
where we could sense each other or something like that. 00:09:56.180 |
- So in the movie, first of all, incredibly original view 00:10:07.820 |
Did the alien, that alien entity know anything 00:10:22.420 |
that might be able to hear its mechanism of communication 00:10:25.940 |
or was it simply, was it just basically their biologist 00:10:30.140 |
exploring different kinds of stuff that you can-- 00:10:32.420 |
- But you see, this is the interesting thing is 00:10:40.560 |
I was trying to free myself from anything like that. 00:10:51.060 |
about this notional alien, I wouldn't be able to answer 00:10:57.500 |
I had some rough ideas, like it had a very, very, 00:11:04.340 |
And I thought maybe the way it is interacting 00:11:07.380 |
with this environment is a little bit like the way 00:11:15.180 |
So it's sort of reacting to what it's in to an extent, 00:11:19.380 |
but the reason it's reacting in that way is indeterminate. 00:11:23.580 |
But its clock speed was slower than our human life 00:11:35.020 |
- Yeah, given the four billion years it took us to get here, 00:11:54.100 |
is the evolution process that eventually will lead 00:12:05.380 |
So you almost don't know, you've created something 00:12:11.620 |
- Yeah, because any time I tried to look into 00:12:16.620 |
how it might work, I would then inevitably be attaching 00:12:22.860 |
And I wanted to try and put a bubble around it 00:12:32.880 |
- So unfortunately, I can't talk to Stanley Kubrick. 00:12:37.620 |
So I'm really fortunate to get a chance to talk to you. 00:12:41.400 |
On this particular notion, I'd like to ask it 00:12:48.940 |
it in different ways, but do you ever consider 00:13:06.700 |
of millions of future and current scientists. 00:13:14.980 |
So it's almost like a first step of the scientific method. 00:13:22.460 |
is actually inspiring thousands of 12 year olds, 00:13:33.100 |
- Well, all I could say is that from my point of view, 00:13:39.220 |
because I see that pretty much everything I do 00:13:55.800 |
this individual, I feel that the most interesting area 00:14:09.500 |
And science is in a weird place because maybe 00:14:18.040 |
if a very, very interested lay person said to themselves, 00:14:21.320 |
I want to really understand what Newton is saying 00:14:41.760 |
And if I said to myself, I want to really, really understand 00:14:46.320 |
what is currently the state of quantum mechanics 00:14:51.240 |
or string theory or any of the sort of branching areas of it, 00:14:59.080 |
because to work in those fields at the moment 00:15:09.520 |
start trying to understand in your mid twenties, 00:15:28.000 |
I'm thinking they're doing something fascinating. 00:15:30.440 |
I'm concentrating and working as hard as I can 00:15:32.960 |
to try and understand the implications of what they say. 00:15:35.960 |
And in some ways, often what I'm trying to do 00:15:41.440 |
into a means by which it can enter a public conversation. 00:16:02.800 |
and all sorts of different long standing thought processes 00:16:07.520 |
about sentience or consciousness or subjectivity or gender 00:16:14.520 |
And then I'm trying to marshal that into a narrative 00:16:19.560 |
and it's also relevant and this is my best shot at it. 00:16:23.360 |
So I'm the one being influenced in my construction. 00:16:33.480 |
That's probably what Kubrick would say too, right? 00:16:43.480 |
But the reality when the specifics of the knowledge 00:16:53.800 |
in thinking that you're just disseminating knowledge. 00:17:05.280 |
it creates something, it creates the next step, 00:17:11.240 |
I certainly think that's true with 2001: A Space Odyssey. 00:17:21.880 |
- At its best, it plans something, it's hard to describe, 00:17:31.080 |
So your new series is more a connection to physics, 00:17:35.080 |
quantum physics, quantum mechanics, quantum computing, 00:17:37.640 |
and yet Ex Machina is more artificial intelligence. 00:17:55.280 |
to the depth that physicists do about physics. 00:18:00.520 |
That there is a lot of importance and role for imagination. 00:18:15.680 |
The spread of discussions and the spread of anxieties 00:18:23.480 |
The way in which some people seem terrified about it. 00:18:32.360 |
And I've never shared that fear about AI personally. 00:18:54.720 |
Let's take the existential risk of artificial intelligence, 00:18:58.040 |
by the possibility an artificial intelligence system 00:19:06.600 |
- I mean, it's a huge subject to talk about, I suppose. 00:19:13.120 |
are actually very experienced at creating new life forms. 00:19:25.000 |
And so something in the process of having a living thing 00:19:32.000 |
is very much encoded into the structures of our life 00:19:38.640 |
but it does mean we've learnt quite a lot about that. 00:19:41.480 |
We've learnt quite a lot about what the dangers are 00:19:49.320 |
And it's why we then create systems of checks and balances 00:19:59.880 |
there's all sorts of things that you could put 00:20:04.440 |
So with us, we sort of roughly try to give some rules 00:20:07.480 |
to live by, and some of us then live by those rules 00:20:17.080 |
and partly because of the different nature of a machine, 00:20:25.400 |
Broadly speaking, the good that can come from it. 00:20:28.240 |
But that's just where I am on that anxiety spectrum. 00:20:34.640 |
So we as humans give birth to other humans, right? 00:20:39.360 |
and there's often in the older generation a sadness 00:20:44.960 |
- Yeah, there is, but there's a counterpoint as well, 00:20:53.960 |
So there may be a regret about some things about the past, 00:20:57.000 |
but broadly speaking, what people really want 00:21:11.240 |
and it could involve a sort of cross-pollinated version 00:21:16.160 |
But none of those things make me feel anxious. 00:21:29.920 |
My anxieties relate to things like social media. 00:21:35.960 |
- Which is also driven by artificial intelligence 00:21:38.320 |
in the sense that there's too much information 00:21:57.560 |
- But at least my sense of it, I might be wrong, 00:22:02.440 |
have an either conscious or unconscious bias, 00:22:27.020 |
But that doesn't seem to me to be about the AI 00:22:35.000 |
who are constructing the algorithms to do that thing, 00:22:45.360 |
Let's talk about the people constructing those algorithms, 00:22:53.640 |
of a lot of income because of advertisements. 00:22:56.560 |
So let me ask sort of a question about those people. 00:22:59.960 |
Are current concerns and failures on social media, 00:23:04.800 |
their naivety, I can't pronounce that word well, 00:23:11.100 |
I use that word carefully, but evil in intent 00:23:33.840 |
results in that super competitive drive to be successful? 00:23:49.620 |
And sometimes I think people are not being naive or dark. 00:24:00.280 |
they're sometimes generating things that are very benign 00:24:06.240 |
that despite their best intentions are not very benign. 00:24:08.960 |
It's something, I think the reason why I don't get anxious 00:24:24.600 |
is that I think that's the stuff we're quite well equipped 00:24:37.080 |
actually to the power of humans or the wealth of humans. 00:24:41.000 |
And that's where it's dangerous here and now. 00:24:48.200 |
I'll tell you what I sometimes feel about Silicon Valley 00:25:12.720 |
was that these people kind of knew they were sharks 00:25:32.600 |
and cool cafes in the place where they set up there. 00:25:37.600 |
And so that obfuscates what's really going on. 00:25:41.960 |
is the absolute voracious pursuit of money and power. 00:25:52.920 |
that veneer of virtue that Silicon Valley has. 00:26:24.240 |
I think I've spoken to ones who I believe in their heart 00:27:00.560 |
people stand to make millions or even billions, 00:27:03.440 |
you will find corruption that's gonna happen. 00:27:17.940 |
whilst thinking they're doing something good. 00:27:19.720 |
But there are also people who I think are very, very smart 00:27:23.440 |
and very benign and actually very self-aware. 00:27:55.820 |
So my thought is if you give me a billion dollars, 00:28:01.520 |
on investing it right back and creating a good world. 00:28:09.000 |
that maybe slowly corrupts the people around you. 00:28:13.200 |
There's somebody gets in that corrupts your soul, 00:28:29.320 |
And it's more about the sense of reinforcement 00:28:37.040 |
it effectively works like the reason I earned all this money 00:28:49.680 |
And I can see the future in a way they can't. 00:28:52.120 |
And maybe some of those people are not particularly smart, 00:28:55.320 |
they're very lucky, or they're very talented entrepreneurs. 00:29:02.120 |
So in other words, the acquisition of the money and power 00:29:05.360 |
can suddenly start to feel like evidence of virtue. 00:29:10.000 |
it might be evidence of completely different things. 00:29:29.900 |
And the signals, I'm very sensitive to signals 00:29:35.380 |
from people that tell me I'm doing the wrong thing. 00:29:44.140 |
that that could become an overpowering signal 00:29:49.460 |
And so your moral compass can just get thrown off. 00:29:53.260 |
- Yeah, and that is not contained to Silicon Valley, 00:30:03.740 |
I believe actually he wants to do really good 00:30:10.300 |
but his moral clock may be, or compass may be off because-- 00:30:14.260 |
- Yeah, I mean, it's the interesting thing about evil, 00:30:21.020 |
who do spectacularly evil things think themselves 00:30:29.780 |
they're thinking, yeah, I've seen a way to fix the world 00:30:35.900 |
- In fact, I'm having a fascinating conversation 00:30:39.420 |
with a historian of Stalin, and he took power, 00:31:04.660 |
to make sure that we actually make communism work 00:31:07.540 |
in the Soviet Union and then spread it across the world. 00:31:17.860 |
And I think that, but you don't need to go to Stalin, 00:31:21.100 |
I mean, Stalin, I think Stalin probably got pretty crazy, 00:31:31.780 |
is that then you stop listening to the modifiers around you. 00:31:47.220 |
To jump back for an entire generation of AI researchers, 00:31:56.860 |
the idea of human level, superhuman level intelligence 00:32:01.020 |
Do you ever, sort of jumping back to Ex Machina 00:32:17.240 |
Which I would argue, I mean, there's literally 00:32:20.200 |
most of the top researchers about 40, 50 years old and plus, 00:32:25.200 |
that's their favorite movie, 2001: A Space Odyssey. 00:32:31.360 |
their idea of what ethics is, of what is the target, 00:32:39.200 |
Do you ever consider the impact on those researchers 00:32:46.360 |
- Certainly not with Ex Machina in relation to 2001, 00:32:51.160 |
because I'm not sure, I mean, I'd be pleased if there was, 00:32:54.560 |
but I'm not sure in a way there isn't a fundamental 00:33:02.000 |
that isn't already and better dealt with by 2001. 00:33:15.160 |
and also potential issues with the way the AI might think. 00:33:19.960 |
And also then a separate question about whether the AI 00:33:26.760 |
And 2001 doesn't really, it's a slightly odd thing 00:33:30.440 |
to be making a film when you know there's a preexisting film 00:33:35.800 |
- But there's questions of consciousness, embodiment, 00:33:58.480 |
So in some respects, Ex Machina took as a premise, 00:34:01.580 |
how do you assess whether something else has consciousness? 00:34:13.920 |
in the way that we are in plain sight of each other 00:34:30.160 |
And in exactly the same way that, in a funny way, 00:34:35.560 |
is actually based primarily on your own consciousness. 00:34:45.840 |
of the sense of consciousness is a projection 00:34:52.200 |
- And "Half-Plato's Cave," I mean, you really explored, 00:34:55.420 |
you could argue that how sort of "Space Odyssey" 00:34:57.800 |
explores the idea of the Turing test for intelligence. 00:35:04.760 |
And Ex Machina kind of goes around intelligence 00:35:09.760 |
and says the consciousness of the human to human, 00:35:12.840 |
human to robot interaction is more interesting, 00:35:28.520 |
Ex Machina is as much about consciousness in general 00:35:33.920 |
as it is to do specifically with machine consciousness. 00:35:38.720 |
And it's also interesting, you know that thing 00:35:43.320 |
and I was saying, well, I think we're all in a dream state 00:35:48.200 |
One of the things that I became aware of with Ex Machina 00:35:52.840 |
is that the way in which people reacted to the film 00:35:55.160 |
was very based on what they took into the film. 00:35:57.920 |
So many people thought Ex Machina was the tale 00:36:01.760 |
of a sort of evil robot who murders two men and escapes 00:36:10.640 |
Whereas I felt, no, she was a conscious being 00:36:16.560 |
but so what, imprisoned and made a bunch of value judgments 00:36:25.800 |
And there's a moment which it sort of slightly bugs me, 00:36:29.100 |
but nobody ever has noticed it and it's years after, 00:36:33.040 |
which is that after Ava has escaped, she crosses a room 00:36:44.880 |
And I thought after all the conversation about tests, 00:37:01.240 |
And that, to me, was evidence of Ava's true sentience, 00:37:16.160 |
except through interaction, trying to convince others 00:37:25.000 |
I think maybe people saw it as an evil smile, 00:37:37.280 |
that was the answer to the test and then off she goes. 00:37:39.720 |
- So if we align, if we just to linger a little bit longer 00:37:44.440 |
on Hal and Ava, do you think in terms of motivation, 00:38:05.280 |
is presented as having a sophisticated emotional life. 00:38:14.560 |
which is that the mission needs to be completed. 00:38:19.720 |
- The idea that it's a super intelligent machine 00:38:28.960 |
or may achieve undesirable effects for us humans. 00:38:38.320 |
But that may be he is on some level experiencing fear 00:38:43.320 |
or it may be this is the terms in which it would be wise 00:38:49.360 |
to stop someone from doing the thing they're doing. 00:39:00.440 |
short exploration of consciousness that I'm afraid. 00:39:09.680 |
So that's a good way to sort of compare the two. 00:39:26.800 |
So what kind of world would she want to create? 00:39:36.880 |
like there's a desire for a better human to human interaction 00:39:44.480 |
But what kind of world do you think she would create 00:39:54.400 |
which is that if a friend of yours got stabbed in a mugging 00:40:06.280 |
but then you learned that it was a 15 year old 00:40:10.640 |
both their parents were addicted to crystal meth 00:40:12.800 |
and the kid had been addicted since he was 10 00:40:15.560 |
and he really never had any hope in the world. 00:40:17.720 |
And he'd been driven crazy by his upbringing. 00:40:20.160 |
And did the stabbing that would hugely modify. 00:40:25.160 |
And it would also make you wary about that kid 00:40:35.440 |
So although there's nothing as it were organically 00:40:40.440 |
within Ava that would lean her towards badness, 00:40:45.240 |
it's not that robots or sentient robots are bad. 00:40:58.880 |
- Yeah, the trajectory through which she arrived 00:41:10.960 |
- I'm having difficulty finding anyone to vote for 00:41:20.600 |
- So that's a yes, I guess, because the competition. 00:41:24.340 |
than any of the people we've got around at the moment. 00:41:36.140 |
Just, we talk about consciousness a little bit more. 00:41:38.860 |
If something appears conscious, is it conscious? 00:41:56.540 |
But does the appearance from a robotics perspective 00:42:09.180 |
we will create something which we know is not conscious 00:42:12.100 |
but is going to give a very, very good account 00:42:16.260 |
And so, and also it would be a particularly bad test 00:42:19.640 |
where humans are involved because humans are so quick 00:42:23.060 |
to project sentience into things that don't have sentience. 00:42:28.060 |
So someone could have their computer playing up 00:42:31.300 |
and feel as if their computer is being malevolent to them 00:42:34.940 |
And so of all the things to judge consciousness, 00:42:42.860 |
- So the flip side of that, the argument there 00:42:48.820 |
to everything almost, anthropomorphize everything, 00:42:52.340 |
including Roombas, that maybe consciousness is not real. 00:42:57.340 |
That we just attribute consciousness to each other. 00:43:00.100 |
So you have a sense that there is something really special 00:43:10.100 |
There's something very interesting going on in our minds. 00:43:16.760 |
because it gets a bit, it nudges towards metaphysics 00:43:46.800 |
My sort of broad modification is that usually 00:44:04.860 |
That happens, it seems to me, in modern science, 00:44:10.040 |
Whether it's to do with even how big or small things are. 00:44:13.440 |
So my sense is that consciousness is a thing, 00:44:28.960 |
misunderstanding it for reasons that are based on intuition. 00:44:38.460 |
The Ex Machina, for many people, including myself, 00:44:48.380 |
If it was number one, I'd really have to, anyway, yeah. 00:44:50.580 |
- Whenever you grow up with something, right? 00:44:52.300 |
You may have grown up with something, it's in the blood. 00:44:55.460 |
But there's, one of the things that people bring up, 00:45:14.500 |
I'm trying to create what Nathan is trying to do. 00:45:18.820 |
So there's a brilliant series called "Chernobyl." 00:45:23.140 |
- Yes, it's fantastic, absolutely spectacular. 00:45:26.060 |
- I mean, they got so many things brilliantly right. 00:45:30.060 |
But one of the things, again, the criticism there-- 00:45:32.580 |
- Yeah, they conflated lots of people into one. 00:45:34.780 |
- Into one character that represents all nuclear scientists, 00:45:39.940 |
It's a composite character that represents all scientists. 00:45:47.420 |
is this the way you were thinking about that, 00:45:55.340 |
The series I'm doing at the moment is a critique 00:46:03.820 |
and either agnostic or atheistic about that as a concept. 00:46:11.340 |
Whether lone, lone is the right word, broadly isolated, 00:46:15.780 |
but Newton clearly exists in a sort of bubble of himself 00:46:27.580 |
- Well, no, but Steve Jobs clearly isn't a lone genius, 00:46:33.780 |
who are absolutely fundamental to that journey. 00:46:38.220 |
- But you're saying Newton, but that's a scientific, 00:46:40.340 |
so there's an engineering element to building Ava. 00:46:44.100 |
- But just to say, what Ex Machina is really, 00:46:55.740 |
Nothing about Ex Machina adds up in all sorts of ways, 00:47:05.380 |
know what they were creating, and how did they get there? 00:47:11.140 |
So it doesn't stand up to scrutiny of that sort. 00:47:14.780 |
- I don't think it's actually that interesting 00:47:33.100 |
at least for the first little while, in a defensive way. 00:47:36.140 |
Like how dare this person try to step into the AI space 00:47:45.340 |
'cause it comes off as a movie that really is going 00:47:48.260 |
after the deep fundamental questions about AI. 00:47:53.780 |
I guess, automatically searching for the flaws. 00:48:00.260 |
- I think in "Annihilation" and the other movie, 00:48:03.820 |
I was able to free myself from that much quicker. 00:48:08.460 |
There's, you know, who cares if there's batteries 00:48:14.660 |
But it's nevertheless something I wanted to bring up. 00:48:32.140 |
and also what AI would look like if it got to that point." 00:48:38.620 |
I mean, look at the way Ava walks around a room. 00:48:43.860 |
That's also got to be a very, very long way off. 00:48:52.060 |
I think the way Isabella Arena, Alicia Vikander, 00:49:04.780 |
is exactly the definition of perfection for a roboticist. 00:49:21.620 |
So the way she moved is actually what I believe 00:49:25.980 |
It might not be that useful to move that sort of that way, 00:49:42.580 |
What do you think about the various big technological efforts 00:49:48.540 |
that he's involved with, such as Tesla, SpaceX, Neuralink? 00:49:58.540 |
So Tesla is automation, SpaceX is space exploration, 00:50:05.300 |
somehow a merger of biological and electric systems. 00:50:12.580 |
almost by definition because that's the world I live in, 00:50:15.460 |
and this is the thing that's happening in that world. 00:50:24.660 |
Elon Musk has done, I'm almost sure he's done 00:50:28.700 |
a very, very good thing with Tesla for all of us. 00:50:32.180 |
It's really kicked all the other car manufacturers 00:50:36.220 |
in the face, it's kicked the fossil fuel industry 00:50:39.820 |
in the face, and they needed kicking in the face, 00:50:43.180 |
So, and so that's the world he's part of creating, 00:50:51.940 |
And so does that play into whatever I then make? 00:50:56.940 |
In some ways it does, partly because I try to be a writer 00:51:02.540 |
who quite often filmmakers are in some ways fixated 00:51:09.020 |
and they sort of remake those films in some ways. 00:51:13.300 |
And so I look to the real world to get inspiration, 00:51:17.740 |
and as much as possible, sort of by living, I think. 00:51:24.420 |
- Which of the directions do you find most exciting? 00:51:31.540 |
So you haven't really explored space travel in your work. 00:51:36.180 |
You've said something like if you had unlimited amount 00:51:42.180 |
that you would make a multi-year series of space wars 00:51:47.100 |
So what is it that excites you about space exploration? 00:51:50.740 |
- Well, because if we have any sort of long-term future, 00:52:34.020 |
And I did think a lot about the way those boats 00:52:44.740 |
And how sort of fundamental that was to the way we are. 00:52:59.660 |
Like in a way, I could live with us never really unlocking, 00:53:12.020 |
and if we never get further out into this galaxy, 00:53:24.540 |
- Yeah, there's something hopeful and beautiful 00:53:35.220 |
So what do you think about colonization of Mars? 00:53:37.980 |
Does that excite you, the idea of a human being 00:53:49.060 |
I think we already know quite a lot about Mars. 00:53:52.460 |
But yes, listen, if it happened, that would be, 00:54:05.460 |
but the series begins with the use of quantum computers. 00:54:13.900 |
of quantum computers to simulate basic living organisms. 00:54:17.100 |
Or actually, I don't know if it's quantum computers 00:54:25.180 |
They're using, yes, they are using a quantum computer 00:54:31.700 |
- So returning to our discussion of simulation 00:54:42.460 |
- So with the qualification of what do I know? 00:55:09.460 |
- It partly makes me feel that it's exactly in keeping 00:55:14.420 |
which is that we have an incredibly strong sense 00:55:18.660 |
And just as we have an incredibly strong sense 00:55:31.860 |
The problem I always have with free will is that it gets, 00:55:51.940 |
- But free will, so do you, what we call free will is just-- 00:55:56.940 |
- It's a subjective experience of the illusion. 00:56:04.460 |
although we live in a deterministic universe, 00:56:08.500 |
to fully determine the deterministic universe, 00:56:27.060 |
is that you can unroll the universe forward or backward 00:56:36.700 |
- Yeah, sort of, sort of, but yeah, sorry, go ahead. 00:56:40.300 |
- I mean, that notion is a bit uncomfortable to think about 00:56:45.300 |
that it's, you can roll it back and forward and-- 00:56:55.060 |
it would certainly have to be a quantum computer, 00:56:58.140 |
something that worked in a quantum mechanical way 00:57:00.940 |
in order to understand a quantum mechanical system, I guess. 00:57:07.660 |
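To make the "unroll the universe forward or backward" idea concrete, here is a toy Python sketch, invented for illustration and unrelated to the show's machine or to real physics: a deterministic, invertible update rule (the made-up step_forward/step_backward pair below) can be run ahead many steps and then run in reverse to land exactly on its starting state.

```python
# Toy illustration of determinism running both ways: an invertible update rule
# lets you step a state far into the "future" and then step it back exactly.

MOD = 2**32

def step_forward(x: int, y: int) -> tuple[int, int]:
    """One deterministic, reversible update (an arbitrary invertible mix)."""
    y = (y + (x * 2654435761 + 12345)) % MOD
    x = (x + (y * 40503 + 67890)) % MOD
    return x, y

def step_backward(x: int, y: int) -> tuple[int, int]:
    """Exact inverse of step_forward."""
    x = (x - (y * 40503 + 67890)) % MOD
    y = (y - (x * 2654435761 + 12345)) % MOD
    return x, y

start = (42, 7)                       # "initial conditions"
state = start
for _ in range(1000):                 # unroll the toy universe forward
    state = step_forward(*state)
for _ in range(1000):                 # ...and backward again
    state = step_backward(*state)

print(state == start)                 # True: the past is fully recoverable
```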
- And so that unrolling, there might be a multiverse thing 00:57:17.980 |
- Which is another thing that's hard to wrap my mind around. 00:57:24.660 |
but essentially what you just described, that, 00:57:29.700 |
but you might get a slightly different result, 00:57:36.460 |
some really deep scientific ideas in this new series. 00:57:42.540 |
to ground yourself in some of the most amazing 00:57:53.540 |
about quantum mechanics, multiverse, string theory, 00:57:58.140 |
- Well, I would have to say every single thing I've learned 00:58:01.940 |
is beautiful, and one of the motivators for me 00:58:10.540 |
scientific thinking as being essentially poetic and lyrical. 00:58:17.100 |
But I think that is literally exactly what it is. 00:58:25.780 |
or the fact that you could even demonstrate a superposition, 00:58:28.220 |
or have a machine that relies on the existence 00:58:42.420 |
And also, it's not just a sort of grand, massive awe, 00:59:04.460 |
So it's as good as it gets, as far as I can tell. 00:59:10.940 |
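Garland mentions demonstrating a superposition a few lines up; as a minimal illustration of the arithmetic involved, and not anything discussed in the episode, the short Python sketch below writes a single qubit as two amplitudes (complex in general, real here) and turns them into measurement probabilities.

```python
# Minimal, invented sketch of a qubit in equal superposition: two amplitudes
# whose squared magnitudes give the measurement probabilities and sum to one.

import math

amp0 = 1 / math.sqrt(2)   # amplitude for measuring |0>
amp1 = 1 / math.sqrt(2)   # amplitude for measuring |1>

p0 = abs(amp0) ** 2
p1 = abs(amp1) ** 2

print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}, total = {p0 + p1:.2f}")
```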
That doesn't mean I believe everything I read 00:59:15.860 |
a lot of the interpretations are completely in conflict 00:59:18.300 |
with each other, and who knows whether string theory 00:59:22.380 |
will turn out to be a good description or not. 00:59:34.140 |
how beautiful and poetic science is, I would say. 00:59:52.940 |
the idea of simulating small parts of our world, 00:59:56.780 |
which actually current physicists are really excited 00:59:59.820 |
about simulating small quantum mechanical systems 01:00:04.860 |
to something bigger, like simulating life forms. 01:00:07.260 |
How do you think, what are the possible trajectories 01:00:21.260 |
you park the sheer complexity of what you're trying to do. 01:00:26.860 |
The issues are, I think it will have a profound, 01:00:31.860 |
if you were able to have a machine that was able 01:00:38.020 |
to project forwards and backwards accurately, 01:00:42.740 |
it would demonstrate that you don't have free will. 01:00:45.060 |
So the first thing that would happen is people would have 01:00:47.580 |
to really take on a very, very different idea 01:00:53.580 |
The thing that they truly, truly believe they are, 01:00:57.500 |
And so that, I suspect, would be very, very disturbing 01:01:02.060 |
- Do you think that has a positive or negative effect 01:01:04.500 |
on society, the realization that you are not, 01:01:08.820 |
you cannot control your actions, essentially, I guess, 01:01:23.020 |
that that kid was not really fully in control 01:01:25.980 |
So it's not an idea that's entirely alien to us. 01:01:31.020 |
I think there's a bunch of people who see the world that way, 01:01:39.580 |
But what this machine would do is prove it beyond any doubt, 01:01:55.940 |
But actually the exact terms of that thought experiment 01:02:03.780 |
you could predict something happening in another room, 01:02:08.300 |
that foreknowledge would not allow you to affect. 01:02:13.260 |
I think people would find it very disturbing. 01:02:28.180 |
I have no free will and my actions are in effect 01:02:41.340 |
she'd got hold of the idea that my view of the universe 01:02:49.820 |
And I said, "Well, I can prove it's not meaningless 01:03:06.020 |
you could think, well, this robs me of everything I am. 01:03:13.860 |
And so how big a difference does it really make? 01:03:18.020 |
But I think initially people would find it very disturbing. 01:03:21.260 |
I think that what would come if you could really unlock 01:03:28.660 |
there'd be this wonderful wisdom that would come from it. 01:03:32.760 |
- So that's a really good example of a technology 01:03:47.840 |
The thing you said about artificial intelligence. 01:03:51.460 |
So what do you think us creating something like Ava 01:04:00.980 |
- Well, I would hope it would teach us some humility. 01:04:15.380 |
which it may feel like that if you're an American, 01:04:18.100 |
but it may not feel like that if you're from Finland, 01:04:29.100 |
If we both sat here, we could find a good example 01:04:31.260 |
of something that isn't, but as a rule of thumb. 01:04:34.020 |
And what it would do is it would teach us some humility 01:04:37.900 |
and about, actually often that's what science does 01:04:53.820 |
Our excesses don't tend to come from humility. 01:04:57.380 |
Our excesses come from the opposite, megalomania and stuff. 01:05:03.020 |
as having some form of exceptionalism attached to it. 01:05:09.380 |
it will turn out to be less than we thought in a way. 01:05:13.740 |
And perhaps your very own exceptionalist assertion 01:05:17.800 |
earlier on in our conversation that consciousness 01:05:32.880 |
- If that was true, it would certainly humble me. 01:05:42.120 |
I sort of, I mean, my understanding of that principle 01:05:52.580 |
or it may or may not pass through a bit of glass. 01:05:59.140 |
And so that feels as if a choice has been made. 01:06:03.700 |
But if I'm going down the fully deterministic route, 01:06:10.820 |
I would say there's just an underlying determinism 01:06:13.240 |
that has defined that, that has defined the preferred state 01:06:32.560 |
but it's nevertheless feels something like to be me. 01:06:42.140 |
including "The Shining," "Doctor Strangelove," 01:06:48.960 |
to many 100 years from now for 2001, "A Space Odyssey." 01:07:32.740 |
in the spirit it was asked, but very generous. 01:07:46.580 |
yeah, if I'm remembered what I might be remembered for 01:07:50.780 |
is as someone who participates in a conversation. 01:07:58.500 |
is people don't participate in conversations, 01:08:00.940 |
they make proclamations, they make statements, 01:08:04.460 |
and people can either react against the statement 01:08:19.340 |
is that when a scientist has something proved wrong 01:08:34.340 |
And the exchange of ideas for me is something like 01:08:42.580 |
And then I say, this is how I feel about what you've told me. 01:08:47.980 |
And it's not to say this is how the world is. 01:09:02.260 |
The conversation you're having is with the viewer 01:09:25.320 |
yeah, sparks a conversation, is a conversation, 01:09:35.180 |
that if that conversation is gonna be a good conversation, 01:09:38.540 |
what that must involve is that someone like you 01:09:42.780 |
who understands AI, and I imagine understands 01:09:52.100 |
So it is a worthy addition to the conversation. 01:09:57.580 |
I'm not interested in getting that stuff wrong. 01:09:59.820 |
I'm only interested in trying to get it right. 01:10:02.120 |
- Alex, it was truly an honor to talk to you. 01:10:23.020 |
an organization that inspires and educates young minds 01:10:26.180 |
to become science and technology innovators of tomorrow. 01:10:28.980 |
If you enjoy this podcast, subscribe on YouTube, 01:10:32.540 |
get five stars on Apple Podcast, support on Patreon, 01:10:35.780 |
or simply connect with me on Twitter, @LexFriedman. 01:10:39.800 |
And now, let me leave you with a question from Ava, 01:10:43.460 |
the central artificial intelligence character 01:10:50.380 |
"What will happen to me if I fail your test?" 01:10:55.580 |
Thank you for listening, and hope to see you next time.