
Alex Garland: Ex Machina, Devs, Annihilation, and the Poetry of Science | Lex Fridman Podcast #77


Chapters

0:00 Introduction
3:42 Are we living in a dream?
7:15 Aliens
12:34 Science fiction: imagination becoming reality
17:29 Artificial intelligence
22:40 The new "Devs" series and the veneer of virtue in Silicon Valley
31:50 Ex Machina and 2001: A Space Odyssey
44:58 Lone genius
49:34 Drawing inspiration from Elon Musk
51:24 Space travel
54:03 Free will
57:35 Devs and the poetry of science
66:38 What will you be remembered for?

Transcript

00:00:00.000 | The following is a conversation with Alex Garland,
00:00:03.280 | writer and director of many imaginative
00:00:06.280 | and philosophical films,
00:00:07.960 | from the dreamlike exploration of human self-destruction
00:00:10.960 | in the movie "Annihilation"
00:00:12.640 | to the deep questions of consciousness and intelligence
00:00:16.360 | raised in the movie "Ex Machina,"
00:00:18.520 | which to me is one of the greatest movies
00:00:21.000 | about artificial intelligence ever made.
00:00:23.800 | I'm releasing this podcast to coincide
00:00:25.720 | with the release of his new series called "Devs"
00:00:28.520 | that will premiere this Thursday, March 5th on Hulu
00:00:32.480 | as part of FX on Hulu.
00:00:34.680 | It explores many of the themes this very podcast is about,
00:00:39.240 | from quantum mechanics to artificial life to simulation
00:00:43.400 | to the modern nature of power in the tech world.
00:00:46.440 | I got a chance to watch a preview and loved it.
00:00:50.280 | The acting is great.
00:00:52.000 | Nick Offerman especially is incredible in it.
00:00:55.320 | The cinematography is beautiful
00:00:57.960 | and the philosophical and scientific ideas explored
00:01:00.600 | are profound.
00:01:02.000 | And for me as an engineer and scientist,
00:01:04.400 | they were just fun to see brought to life.
00:01:07.200 | For example, if you watch the trailer
00:01:08.960 | for the series carefully,
00:01:10.480 | you'll see there's a programmer with a Russian accent
00:01:13.080 | looking at a screen with Python-like code on it
00:01:16.080 | that appears to be using a library
00:01:18.080 | that interfaces with a quantum computer.
00:01:20.200 | This attention to technical detail
00:01:22.920 | on several levels is impressive.
00:01:25.480 | And one of the reasons I'm a big fan
00:01:27.360 | of how Alex weaves science and philosophy together
00:01:30.000 | in his work.
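For readers curious what "Python-like code using a library that interfaces with a quantum computer" can look like in practice, here is a minimal, hypothetical sketch using Qiskit, a real open-source Python library for programming quantum computers. The library choice and the circuit are illustrative assumptions; they are not taken from the "Devs" trailer.

```python
# Minimal, hypothetical sketch of Python driving a quantum computing
# library (Qiskit). Illustrative only; not the code from "Devs".
from qiskit import QuantumCircuit

# Build a two-qubit circuit that prepares an entangled Bell state.
qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])  # read both qubits into classical bits

# Print a text diagram of the circuit; running it on real hardware
# would require selecting a provider/backend, which is omitted here.
print(qc.draw())
```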
00:01:30.840 | Meeting Alex for me was unlikely,
00:01:35.000 | but it was life-changing
00:01:36.640 | in ways I may only be able to articulate in a few years.
00:01:40.120 | Just as meeting Spot Mini at Boston Dynamics
00:01:43.560 | for the first time planted a seed of an idea in my mind,
00:01:47.760 | so did meeting Alex Garland.
00:01:50.120 | He's humble, curious, intelligent,
00:01:52.760 | and to me, an inspiration.
00:01:55.240 | Plus, he's just really a fun person to talk with
00:01:57.920 | about the biggest possible questions in our universe.
00:02:01.280 | This is the Artificial Intelligence Podcast.
00:02:05.040 | If you enjoy it, subscribe on YouTube,
00:02:07.240 | give it five stars on Apple Podcast,
00:02:09.080 | support it on Patreon,
00:02:10.480 | or simply connect with me on Twitter
00:02:12.520 | at Lex Fridman, spelled F-R-I-D-M-A-N.
00:02:17.000 | As usual, I'll do one or two minutes of ads now
00:02:19.560 | and never any ads in the middle
00:02:21.000 | that can break the flow of the conversation.
00:02:23.280 | I hope that works for you
00:02:24.720 | and doesn't hurt the listening experience.
00:02:27.440 | This show is presented by Cash App,
00:02:29.920 | the number one finance app in the App Store.
00:02:32.320 | When you get it, use code LEXPODCAST.
00:02:35.800 | Cash App lets you send money to friends,
00:02:37.960 | buy Bitcoin, and invest in the stock market
00:02:40.320 | with as little as $1.
00:02:41.360 | Since Cash App allows you to buy Bitcoin,
00:02:45.160 | let me mention that cryptocurrency
00:02:47.080 | in the context of the history of money is fascinating.
00:02:50.320 | I recommend "The Ascent of Money"
00:02:52.680 | as a great book on this history.
00:02:54.920 | Debits and credits on ledgers started 30,000 years ago.
00:02:59.800 | The US dollar was created about 200 years ago.
00:03:03.840 | And Bitcoin, the first decentralized cryptocurrency,
00:03:07.360 | was released just over 10 years ago.
00:03:09.920 | So given that history,
00:03:11.360 | cryptocurrency is still very much
00:03:12.920 | in its early days of development,
00:03:14.880 | but it still is aiming to, and just might,
00:03:17.800 | redefine the nature of money.
00:03:20.640 | So again, if you get Cash App from the App Store
00:03:23.040 | or Google Play, and use code LEXPODCAST,
00:03:26.120 | you'll get $10, and Cash App will also donate $10 to FIRST,
00:03:30.200 | one of my favorite organizations
00:03:31.880 | that is helping advance robotics and STEM education
00:03:34.960 | for young people around the world.
00:03:37.520 | And now, here's my conversation with Alex Garland.
00:03:41.520 | You described the world inside the shimmer
00:03:45.160 | in the movie "Annihilation" as dreamlike,
00:03:47.160 | meaning that it's internally consistent,
00:03:48.760 | but detached from reality.
00:03:50.720 | That leads me to ask,
00:03:52.360 | do you think, a philosophical question, I apologize,
00:03:56.240 | do you think we might be living in a dream
00:03:58.600 | or in a simulation, like the kind that the shimmer creates?
00:04:02.280 | We, human beings, here today.
00:04:07.040 | - Yeah.
00:04:08.200 | I wanna sort of separate that out into two things.
00:04:11.600 | Yes, I think we're living in a dream of sorts.
00:04:14.640 | No, I don't think we're living in a simulation.
00:04:18.400 | I think we're living on a planet
00:04:20.760 | with a very thin layer of atmosphere,
00:04:23.720 | and the planet is in a very large space,
00:04:27.640 | and the space is full of other planets and stars
00:04:29.920 | and quasars and stuff like that.
00:04:31.280 | And I don't think those physical objects,
00:04:35.600 | I don't think the matter in that universe is simulated.
00:04:38.760 | I think it's there.
00:04:40.520 | We are definitely, or,
00:04:43.480 | it's a whole problem with saying definitely,
00:04:46.360 | but in my opinion, I'll just go back to that.
00:04:50.200 | I think it seems very like we're living in a dream state.
00:04:53.080 | I'm pretty sure we are.
00:04:54.320 | And I think that's just to do with the nature
00:04:56.480 | of how we experience the world.
00:04:58.000 | We experience it in a subjective way.
00:05:00.240 | And the thing I've learned most
00:05:04.400 | as I've got older in some respects
00:05:06.240 | is the degree to which reality is counterintuitive,
00:05:10.800 | and that the things that are presented to us as objective
00:05:13.640 | turn out not to be objective,
00:05:15.120 | and quantum mechanics is full of that kind of thing,
00:05:17.320 | but actually just day-to-day life
00:05:18.960 | is full of that kind of thing as well.
00:05:20.840 | So my understanding of the way the brain works
00:05:25.840 | is you get some information to hit your optic nerve,
00:05:30.760 | and then your brain makes its best guess
00:05:32.760 | about what it's seeing or what it's saying it's seeing.
00:05:36.320 | It may or may not be an accurate best guess.
00:05:39.240 | It might be an inaccurate best guess.
00:05:41.320 | And that gap, the best guess gap,
00:05:45.440 | means that we are essentially living in a subjective state,
00:05:48.960 | which means that we're in a dream state.
00:05:50.960 | So I think you could enlarge on the dream state
00:05:54.000 | in all sorts of ways.
00:05:55.440 | So yes, dream state, no simulation
00:05:58.280 | would be where I'd come down.
00:06:00.440 | - Going further, deeper into that direction,
00:06:04.000 | you've also described that world as psychedelia.
00:06:08.560 | So on that topic, I'm curious about that world.
00:06:11.460 | On the topic of psychedelic drugs,
00:06:13.320 | do you see those kinds of chemicals
00:06:15.920 | that modify our perception
00:06:18.280 | as a distortion of our perception of reality
00:06:22.000 | or a window into another reality?
00:06:25.860 | - No, I think what I'd be saying
00:06:27.060 | is that we live in a distorted reality,
00:06:29.140 | and then those kinds of drugs
00:06:30.520 | give us a different kind of distorted--
00:06:32.520 | - Different perspective.
00:06:33.360 | - Yeah, exactly, they just give an alternate distortion.
00:06:35.920 | And I think that what they really do
00:06:37.560 | is they give a distorted perception,
00:06:41.040 | which is a little bit more allied to daydreams
00:06:45.560 | or unconscious interests.
00:06:47.320 | So if for some reason
00:06:49.080 | you're feeling unconsciously anxious at that moment
00:06:51.800 | and you take a psychedelic drug,
00:06:53.200 | you'll have a more pronounced unpleasant experience.
00:06:56.560 | And if you're feeling very calm or happy,
00:06:59.040 | you might have a good time.
00:07:00.240 | But yeah, so if I'm saying we're starting from a premise,
00:07:04.800 | our starting point is we were already
00:07:06.940 | in the slightly psychedelic state,
00:07:09.480 | what those drugs do is help you go further down an avenue
00:07:13.440 | or maybe a slightly different avenue, but that's all.
00:07:16.240 | - So in that movie, "Annihilation,"
00:07:19.080 | the shimmer, this alternate dreamlike state
00:07:24.960 | is created by, I believe, perhaps, an alien entity.
00:07:29.420 | Of course, everything's up to interpretation, right?
00:07:32.100 | But do you think there's, in our world, in our universe,
00:07:36.180 | do you think there's intelligent life out there?
00:07:39.080 | And if so, how different is it from us humans?
00:07:42.500 | - Well, one of the things I was trying to do in "Annihilation"
00:07:47.200 | was to offer up a form of alien life
00:07:51.740 | that was actually alien.
00:07:53.380 | Because it would often seem to me
00:07:58.340 | that in the way we would represent aliens
00:08:01.100 | in books or cinema or television
00:08:04.340 | or any one of the sort of storytelling mediums
00:08:08.260 | is we would always give them very human-like qualities.
00:08:11.900 | So they wanted to teach us about galactic federations
00:08:14.860 | or they wanted to eat us or they wanted our resources
00:08:17.740 | like our water or they want to enslave us
00:08:20.180 | or whatever it happens to be.
00:08:21.360 | But all of these are incredibly human-like motivations.
00:08:25.420 | And I was interested in the idea of an alien
00:08:30.900 | that was not in any way like us.
00:08:34.300 | It didn't share.
00:08:36.220 | Maybe it had a completely different clock speed.
00:08:38.820 | Maybe its way, so we're talking about,
00:08:42.140 | we're looking at each other,
00:08:43.180 | we're getting information, light hits our optic nerve,
00:08:46.860 | our brain makes the best guess of what we're doing.
00:08:49.060 | Sometimes it's right,
00:08:49.900 | something, you know, the thing we were talking about before.
00:08:51.820 | What if this alien doesn't have an optic nerve?
00:08:54.980 | Maybe its way of encountering the space it's in
00:08:57.700 | is wholly different.
00:08:59.260 | Maybe it has a different relationship with gravity.
00:09:01.820 | - The basic laws of physics it operates under
00:09:04.060 | might be fundamentally different.
00:09:05.820 | It could be a different time scale and so on.
00:09:07.820 | - Yeah, or it could be the same laws,
00:09:10.300 | it could be the same underlying laws of physics.
00:09:12.700 | You know, it's a machine created
00:09:16.260 | or it's a creature created in a quantum mechanical way.
00:09:19.180 | It just ends up in a very, very different place
00:09:21.820 | to the one we end up in.
00:09:23.420 | So part of the preoccupation with "Annihilation"
00:09:26.860 | was to come up with an alien that was really alien
00:09:29.940 | and didn't give us,
00:09:31.380 | and it didn't give us and we didn't give it
00:09:35.380 | any kind of easy connection between human and the alien.
00:09:40.000 | Because I think it was to do with the idea
00:09:42.160 | that you could have an alien that landed on this planet
00:09:44.540 | that wouldn't even know we were here.
00:09:46.600 | And we might only glancingly know it was here.
00:09:49.460 | There'd just be this strange point
00:09:52.180 | where the Venn diagrams connected
00:09:53.860 | where we could sense each other or something like that.
00:09:56.180 | - So in the movie, first of all, incredibly original view
00:09:59.980 | of what an alien life would be.
00:10:01.900 | And it's in that sense, it's a huge success.
00:10:04.920 | Let's go inside your imagination.
00:10:07.820 | Did the alien, that alien entity know anything
00:10:11.940 | about humans when it landed?
00:10:13.980 | - No.
00:10:14.820 | - So the idea is you're basically an alien,
00:10:18.240 | life is trying to reach out to anything
00:10:22.420 | that might be able to hear its mechanism of communication
00:10:25.940 | or was it simply, was it just basically their biologist
00:10:30.140 | exploring different kinds of stuff that you can--
00:10:32.420 | - But you see, this is the interesting thing is
00:10:34.540 | as soon as you say their biologist,
00:10:36.780 | you've done the thing of attributing
00:10:38.380 | human type motivations to it.
00:10:40.560 | I was trying to free myself from anything like that.
00:10:46.940 | So all sorts of questions you might answer
00:10:51.060 | about this notional alien, I wouldn't be able to answer
00:10:54.100 | because I don't know what it was.
00:10:55.700 | (laughing)
00:10:56.540 | Or how it worked.
00:10:57.500 | I had some rough ideas, like it had a very, very,
00:11:02.340 | very slow clock speed.
00:11:04.340 | And I thought maybe the way it is interacting
00:11:07.380 | with this environment is a little bit like the way
00:11:09.460 | an octopus will change its color forms
00:11:13.340 | around the space that it's in.
00:11:15.180 | So it's sort of reacting to what it's in to an extent,
00:11:19.380 | but the reason it's reacting in that way is indeterminate.
00:11:23.580 | But its clock speed was slower than our human life
00:11:28.580 | clock speed, but it's faster than evolution.
00:11:32.940 | - Faster than our--
00:11:33.900 | - Than our evolution.
00:11:35.020 | - Yeah, given the four billion years it took us to get here,
00:11:37.700 | then yes, maybe it started at eight.
00:11:39.820 | - If you look at the human civilization
00:11:41.300 | as a single organism.
00:11:43.460 | In that sense, this evolution could be us.
00:11:46.460 | The evolution of living organisms on Earth
00:11:49.840 | could be just a single organism
00:11:51.380 | and it's kind of, that's its life,
00:11:54.100 | is the evolution process that eventually will lead
00:11:57.220 | to probably the heat death of the universe
00:12:00.940 | or something before that.
00:12:02.660 | I mean, that's just an incredible idea.
00:12:05.380 | So you almost don't know, you've created something
00:12:09.000 | that you don't even know how it works.
00:12:11.620 | - Yeah, because any time I tried to look into
00:12:16.620 | how it might work, I would then inevitably be attaching
00:12:20.260 | my kind of thought processes into it.
00:12:22.860 | And I wanted to try and put a bubble around it
00:12:24.940 | where I was saying, no, this is alien
00:12:27.820 | in its most alien form.
00:12:29.540 | I have no real point of contact.
00:12:32.880 | - So unfortunately, I can't talk to Stanley Kubrick.
00:12:37.620 | So I'm really fortunate to get a chance to talk to you.
00:12:41.400 | On this particular notion, I'd like to ask it
00:12:47.420 | a bunch of different ways and we'll explore
00:12:48.940 | it in different ways, but do you ever consider
00:12:51.260 | human imagination, your imagination,
00:12:53.540 | as a window into a possible future
00:12:57.060 | and that what you're doing, you're putting
00:13:00.100 | that imagination on paper as a writer
00:13:02.140 | and then on screen as a director
00:13:04.720 | and that plants the seeds in the minds
00:13:06.700 | of millions of future and current scientists.
00:13:10.180 | And so your imagination, you putting it down
00:13:13.020 | actually makes it a reality.
00:13:14.980 | So it's almost like a first step of the scientific method.
00:13:18.580 | Like you imagining what's possible
00:13:20.300 | in your new series with Ex Machina
00:13:22.460 | is actually inspiring thousands of 12 year olds,
00:13:28.500 | millions of scientists and actually creating
00:13:31.740 | the future you've imagined.
00:13:33.100 | - Well, all I could say is that from my point of view,
00:13:37.120 | it's almost exactly the reverse
00:13:39.220 | because I see that pretty much everything I do
00:13:45.640 | is a reaction to what scientists are doing.
00:13:49.680 | I'm an interested lay person and I feel,
00:13:55.800 | this individual, I feel that the most interesting area
00:14:02.200 | that humans are involved in is science.
00:14:05.540 | I think art is very, very interesting,
00:14:07.320 | but the most interesting is science.
00:14:09.500 | And science is in a weird place because maybe
00:14:14.360 | around the time Newton was alive,
00:14:18.040 | if a very, very interested lay person said to themselves,
00:14:21.320 | I want to really understand what Newton is saying
00:14:23.960 | about the way the world works,
00:14:25.580 | with a few years of dedicated thinking,
00:14:28.880 | they would be able to understand
00:14:31.120 | the sort of principles he was laying out.
00:14:34.480 | And I don't think that's true anymore.
00:14:35.920 | I think that's stopped being true now.
00:14:37.880 | So I'm a pretty smart guy.
00:14:41.760 | And if I said to myself, I want to really, really understand
00:14:46.320 | what is currently the state of quantum mechanics
00:14:51.240 | or string theory or any of the sort of branching areas of it,
00:14:54.720 | I wouldn't be able to,
00:14:56.280 | I'd be intellectually incapable of doing it
00:14:59.080 | because to work in those fields at the moment
00:15:02.240 | is a bit like being an athlete.
00:15:03.640 | I suspect you need to start when you're 12.
00:15:05.800 | And if you start in your mid twenties,
00:15:09.520 | start trying to understand in your mid twenties,
00:15:11.480 | then you're just never gonna catch up.
00:15:13.960 | That's the way it feels to me.
00:15:15.760 | So what I do is I try to make myself open.
00:15:19.520 | So the people that you're implying,
00:15:21.000 | maybe I would influence,
00:15:24.320 | to me, it's exactly the other way around.
00:15:25.880 | These people are strongly influencing me.
00:15:28.000 | I'm thinking they're doing something fascinating.
00:15:30.440 | I'm concentrating and working as hard as I can
00:15:32.960 | to try and understand the implications of what they say.
00:15:35.960 | And in some ways, often what I'm trying to do
00:15:38.240 | is disseminate their ideas
00:15:41.440 | into a means by which it can enter a public conversation.
00:15:47.720 | So Ex Machina contains lots of name checks,
00:15:53.600 | all sorts of existing thought experiments,
00:15:57.080 | shadows on Plato's cave
00:16:00.880 | and Mary in the black and white room
00:16:02.800 | and all sorts of different long standing thought processes
00:16:07.520 | about sentience or consciousness or subjectivity or gender
00:16:12.520 | or whatever it happens to be.
00:16:14.520 | And then I'm trying to marshal that into a narrative
00:16:17.480 | to say, look, this stuff is interesting
00:16:19.560 | and it's also relevant and this is my best shot at it.
00:16:23.360 | So I'm the one being influenced in my construction.
00:16:27.720 | - That's fascinating.
00:16:28.920 | Of course, you would say that
00:16:31.000 | because you're not even aware of your own.
00:16:33.480 | That's probably what Kubrick would say too, right?
00:16:35.640 | Is in describing why HAL 9000 is created
00:16:40.120 | the way HAL 9000 is created
00:16:42.000 | is you're just studying what's...
00:16:43.480 | But the reality when the specifics of the knowledge
00:16:48.200 | passes through your imagination,
00:16:50.320 | I would argue that you're incorrect
00:16:53.800 | in thinking that you're just disseminating knowledge.
00:16:56.960 | That the very act of your imagination
00:17:01.720 | consuming that science,
00:17:05.280 | it creates something, it creates the next step,
00:17:09.160 | potentially creates the next step.
00:17:11.240 | I certainly think that's true with "2001: A Space Odyssey."
00:17:15.120 | I think at its best, if it fails--
00:17:18.120 | - It's true of that.
00:17:19.000 | Yeah, it's true of that, definitely.
00:17:20.800 | (laughing)
00:17:21.880 | - At its best, it plans something, it's hard to describe,
00:17:24.800 | but it inspires the next generation
00:17:29.160 | and it could be field dependent.
00:17:31.080 | So your new series is more a connection to physics,
00:17:35.080 | quantum physics, quantum mechanics, quantum computing,
00:17:37.640 | and yet Ex Machina is more artificial intelligence.
00:17:40.520 | I know more about AI.
00:17:43.080 | My sense is that AI is much earlier
00:17:48.080 | in the depth of its understanding.
00:17:51.840 | I would argue nobody understands anything
00:17:55.280 | to the depth that physicists do about physics.
00:17:57.840 | In AI, nobody understands AI.
00:18:00.520 | That there is a lot of importance and role for imagination.
00:18:04.000 | Which I think, we're in that,
00:18:06.000 | like when Freud imagined the subconscious,
00:18:08.200 | we're in that stage of AI,
00:18:10.880 | where there's a lot of imagination needed
00:18:12.760 | and thinking outside the box.
00:18:14.400 | - Yeah, it's interesting.
00:18:15.680 | The spread of discussions and the spread of anxieties
00:18:21.120 | that exist about AI fascinate me.
00:18:23.480 | The way in which some people seem terrified about it.
00:18:30.280 | Whilst also pursuing it.
00:18:32.360 | And I've never shared that fear about AI personally.
00:18:36.920 | But the way in which it agitates people
00:18:42.680 | and also the people who it agitates,
00:18:44.600 | I find kind of fascinating.
00:18:47.400 | - Are you afraid?
00:18:49.360 | Are you excited?
00:18:50.880 | Are you sad by the possibility?
00:18:54.720 | Let's take the existential risk of artificial intelligence,
00:18:58.040 | by the possibility an artificial intelligence system
00:19:01.280 | becomes our offspring and makes us obsolete.
00:19:06.600 | - I mean, it's a huge subject to talk about, I suppose.
00:19:10.720 | But one of the things I think is that humans
00:19:13.120 | are actually very experienced at creating new life forms.
00:19:18.120 | Because that's why you and I are both here.
00:19:23.200 | And it's why everyone on the planet is here.
00:19:25.000 | And so something in the process of having a living thing
00:19:29.880 | that exists that didn't exist previously
00:19:32.000 | is very much encoded into the structures of our life
00:19:35.400 | and the structures of our societies.
00:19:37.320 | It doesn't mean we always get it right,
00:19:38.640 | but it does mean we've learnt quite a lot about that.
00:19:41.480 | We've learnt quite a lot about what the dangers are
00:19:45.480 | of allowing things to be unchecked.
00:19:49.320 | And it's why we then create systems of checks and balances
00:19:52.600 | in our government and so on and so forth.
00:19:55.240 | I mean, that's not to say...
00:19:56.680 | The other thing is it seems like
00:19:59.880 | there's all sorts of things that you could put
00:20:01.880 | into a machine that you would not be...
00:20:04.440 | So with us, we sort of roughly try to give some rules
00:20:07.480 | to live by, and some of us then live by those rules
00:20:10.200 | and some don't.
00:20:11.040 | And with a machine, it feels like
00:20:12.560 | you could enforce those things.
00:20:13.880 | So partly because of our previous experience
00:20:17.080 | and partly because of the different nature of a machine,
00:20:19.120 | I just don't feel anxious about it.
00:20:22.000 | More, I just see all the good that...
00:20:25.400 | Broadly speaking, the good that can come from it.
00:20:28.240 | But that's just where I am on that anxiety spectrum.
00:20:32.160 | - There's a sadness.
00:20:34.640 | So we as humans give birth to other humans, right?
00:20:37.720 | But then there's generations,
00:20:39.360 | and there's often in the older generation a sadness
00:20:42.000 | about what the world has become now.
00:20:44.080 | I mean, that's kind of...
00:20:44.960 | - Yeah, there is, but there's a counterpoint as well,
00:20:47.120 | which is that most parents would wish
00:20:51.480 | for a better life for their children.
00:20:53.960 | So there may be a regret about some things about the past,
00:20:57.000 | but broadly speaking, what people really want
00:20:59.560 | is that things will be better
00:21:00.600 | for the future generations, not worse.
00:21:02.760 | And so, and then it's a question
00:21:05.880 | about what constitutes a future generation.
00:21:07.960 | A future generation could involve people.
00:21:09.760 | It also could involve machines,
00:21:11.240 | and it could involve a sort of cross-pollinated version
00:21:14.680 | of the two or any...
00:21:16.160 | But none of those things make me feel anxious.
00:21:19.880 | - It doesn't give you anxiety.
00:21:21.280 | It doesn't excite you?
00:21:23.040 | Like anything that's new?
00:21:24.320 | - It does.
00:21:25.560 | Not anything that's new.
00:21:27.000 | I don't think, for example, I've got...
00:21:29.920 | My anxieties relate to things like social media.
00:21:32.560 | So I've got plenty of anxieties about that.
00:21:35.960 | - Which is also driven by artificial intelligence
00:21:38.320 | in the sense that there's too much information
00:21:41.040 | to be able to...
00:21:42.640 | An algorithm has to filter that information
00:21:45.120 | and present to you.
00:21:46.200 | So ultimately, the algorithm, a simple,
00:21:49.720 | oftentimes simple algorithm is controlling
00:21:52.600 | the flow of information on social media.
00:21:54.680 | So that's another form of AI.
00:21:57.560 | - But at least my sense of it, I might be wrong,
00:21:59.640 | but my sense of it is that the algorithms
00:22:02.440 | have an either conscious or unconscious bias,
00:22:06.120 | which is created by the people
00:22:07.480 | who are making the algorithms
00:22:08.800 | and sort of delineating the areas
00:22:13.440 | to which those algorithms are gonna lean.
00:22:15.720 | And so, for example,
00:22:18.040 | the kind of thing I'd be worried about
00:22:19.280 | is that it hasn't been thought about enough
00:22:21.400 | how dangerous it is to allow algorithms
00:22:24.600 | to create echo chambers, say.
00:22:27.020 | But that doesn't seem to me to be about the AI
00:22:31.000 | or the algorithm.
00:22:32.760 | It's the naivety of the people
00:22:35.000 | who are constructing the algorithms to do that thing,
00:22:38.320 | if you see what I mean.
00:22:39.480 | - Yes.
00:22:40.520 | So in your new series, "Devs,"
00:22:43.560 | then we could speak more broadly.
00:22:45.360 | Let's talk about the people constructing those algorithms,
00:22:49.320 | which in our modern society, Silicon Valley,
00:22:51.840 | those algorithms happen to be a source
00:22:53.640 | of a lot of income because of advertisements.
00:22:56.560 | So let me ask sort of a question about those people.
00:22:59.960 | Are current concerns and failures on social media,
00:23:04.800 | their naivety, I can't pronounce that word well,
00:23:08.280 | are they naive?
00:23:09.880 | Are they,
00:23:11.100 | I use that word carefully, but evil in intent
00:23:17.520 | or misaligned in intent?
00:23:20.920 | I think that's a, do they mean well
00:23:23.160 | and just have an unintended consequence?
00:23:27.200 | Or is there something dark in them
00:23:30.000 | that results in them creating a company,
00:23:33.840 | results in that super competitive drive to be successful?
00:23:37.440 | And those are the people
00:23:38.280 | that will end up controlling the algorithms.
00:23:41.200 | - At a guess, I'd say there are instances
00:23:43.160 | of all those things.
00:23:44.800 | So sometimes I think it's naivety.
00:23:47.560 | Sometimes I think it's extremely dark.
00:23:49.620 | And sometimes I think people are not being naive or dark.
00:23:55.920 | And then in those instances,
00:24:00.280 | they're sometimes generating things that are very benign
00:24:03.880 | and other times generating things
00:24:06.240 | that despite their best intentions are not very benign.
00:24:08.960 | It's something, I think the reason why I don't get anxious
00:24:12.440 | about AI in terms of,
00:24:15.020 | or at least AIs that have,
00:24:18.920 | I don't know, a relationship with,
00:24:22.880 | some sort of relationship with humans
00:24:24.600 | is that I think that's the stuff we're quite well equipped
00:24:27.600 | to understand how to mitigate.
00:24:31.160 | The problem is issues that relate to AI
00:24:37.080 | actually to the power of humans or the wealth of humans.
00:24:41.000 | And that's where it's dangerous here and now.
00:24:45.400 | So what I see,
00:24:48.200 | I'll tell you what I sometimes feel about Silicon Valley
00:24:52.620 | is that it's like Wall Street in the '80s.
00:24:56.920 | It's rabidly capitalistic,
00:25:01.440 | absolutely rabidly capitalistic,
00:25:03.800 | and it's rabidly greedy.
00:25:06.320 | But whereas in the '80s,
00:25:10.880 | the sense one had of Wall Street
00:25:12.720 | was that these people kind of knew they were sharks
00:25:15.240 | and in a way relished in being sharks
00:25:17.480 | and dressed in sharp suits
00:25:19.120 | and kind of lorded over other people
00:25:24.120 | and felt good about doing it.
00:25:26.000 | Silicon Valley has managed to hide
00:25:27.880 | its voracious Wall Street-like capitalism
00:25:30.920 | behind hipster T-shirts
00:25:32.600 | and cool cafes in the place where they set up there.
00:25:37.600 | And so that obfuscates what's really going on.
00:25:40.760 | And what's really going on
00:25:41.960 | is the absolute voracious pursuit of money and power.
00:25:45.960 | So that's where it gets shaky for me.
00:25:48.540 | - So that veneer,
00:25:49.840 | and you explore that brilliantly,
00:25:52.920 | that veneer of virtue that Silicon Valley has.
00:25:57.760 | - Which they believe themselves, I'm sure.
00:25:59.840 | - Boy. - A long time.
00:26:00.680 | - So let me, okay.
00:26:01.840 | I hope to be one of those people.
00:26:04.740 | And I believe that.
00:26:09.760 | So as maybe a devil's advocate term
00:26:15.920 | poorly used in this case,
00:26:17.640 | what if some of them really are trying
00:26:21.200 | to build a better world?
00:26:22.160 | I can't-- - I'm sure,
00:26:23.080 | I think some of them are.
00:26:24.240 | I think I've spoken to ones who I believe in their heart
00:26:26.600 | feel they're building a better world.
00:26:27.880 | - Are they not able to in a sense?
00:26:29.720 | - No, they may or may not be.
00:26:31.600 | But it's just as a zone
00:26:33.800 | with a lot of bullshit flying about.
00:26:35.760 | And there's also another thing which is,
00:26:37.920 | this actually goes back to,
00:26:40.220 | I always thought about some sports
00:26:44.440 | that later turned out to be corrupt
00:26:46.640 | in the way that the sport,
00:26:48.040 | like who won the boxing match
00:26:50.080 | or how a football match got thrown
00:26:55.520 | or a cricket match or whatever it happened to be.
00:26:55.520 | And I used to think, well, look,
00:26:57.000 | if there's a lot of money,
00:26:58.300 | and there really is a lot of money,
00:27:00.560 | people stand to make millions or even billions,
00:27:03.440 | you will find corruption that's gonna happen.
00:27:05.960 | So it's in the nature
00:27:09.800 | of its voracious appetite
00:27:12.760 | that some people will be corrupt
00:27:14.200 | and some people will exploit
00:27:16.160 | and some people will exploit
00:27:17.940 | whilst thinking they're doing something good.
00:27:19.720 | But there are also people who I think are very, very smart
00:27:23.440 | and very benign and actually very self-aware.
00:27:26.400 | And so I'm not trying to,
00:27:29.580 | I'm not trying to wipe out
00:27:31.720 | the motivations of this entire area.
00:27:34.760 | But I do, there are people in that world
00:27:37.360 | who scare the hell out of me.
00:27:38.760 | Yeah, sure.
00:27:40.120 | - Yeah, I'm a little bit naive in that,
00:27:42.040 | like I don't care at all about money.
00:27:45.820 | And so I'm...
00:27:49.200 | - You might be one of the good guys.
00:27:52.760 | - Yeah, but so the thought is,
00:27:54.080 | but I don't have money.
00:27:55.820 | So my thought is if you give me a billion dollars,
00:27:58.160 | I would, it would change nothing
00:28:00.080 | and I would spend it right away
00:28:01.520 | on investing it right back and creating a good world.
00:28:04.440 | But your intuition is that billion,
00:28:07.640 | there's something about that money
00:28:09.000 | that maybe slowly corrupts the people around you.
00:28:13.200 | There's somebody gets in that corrupts your soul,
00:28:16.360 | the way you view the world.
00:28:17.800 | - Money does corrupt, we know that.
00:28:20.120 | But there's a different sort of problem
00:28:22.600 | aside from just the money corrupts,
00:28:24.920 | you know, thing that we're familiar with
00:28:27.640 | in throughout history.
00:28:29.320 | And it's more about the sense of reinforcement
00:28:34.080 | an individual gets, which is so,
00:28:37.040 | it effectively works like the reason I earned all this money
00:28:42.040 | and so much more money than anyone else
00:28:44.520 | is because I'm very gifted.
00:28:46.200 | I'm actually a bit smarter than they are,
00:28:47.960 | or I'm a lot smarter than they are.
00:28:49.680 | And I can see the future in a way they can't.
00:28:52.120 | And maybe some of those people are not particularly smart,
00:28:55.320 | they're very lucky, or they're very talented entrepreneurs.
00:28:59.200 | And there's a difference between it.
00:29:02.120 | So in other words, the acquisition of the money and power
00:29:05.360 | can suddenly start to feel like evidence of virtue.
00:29:08.680 | And it's not evidence of virtue,
00:29:10.000 | it might be evidence of completely different things.
00:29:12.000 | - As brilliantly put, yeah.
00:29:13.440 | Yeah, as brilliantly put like,
00:29:15.480 | so I think one of the fundamental drivers
00:29:18.160 | of my current morality,
00:29:20.480 | let me just represent nerds in general.
00:29:24.900 | Of all kinds, is constant self doubt.
00:29:29.900 | And the signals, I'm very sensitive to signals
00:29:35.380 | from people that tell me I'm doing the wrong thing.
00:29:38.660 | But when there's a huge inflow of money,
00:29:41.160 | you just put it brilliantly
00:29:44.140 | that that could become an overpowering signal
00:29:46.660 | that everything you do is right.
00:29:49.460 | And so your moral compass can just get thrown off.
00:29:53.260 | - Yeah, and that is not contained to Silicon Valley,
00:29:57.300 | that's across the board.
00:29:58.380 | - In general, yeah.
00:29:59.620 | Like I said, I'm from the Soviet Union,
00:30:01.100 | the current president is convinced,
00:30:03.740 | I believe actually he wants to do really good
00:30:08.420 | by the country and by the world,
00:30:10.300 | but his moral clock may be, or compass may be off because--
00:30:14.260 | - Yeah, I mean, it's the interesting thing about evil,
00:30:17.620 | which is that I think most people
00:30:21.020 | who do spectacularly evil things think themselves
00:30:24.100 | they're doing really good things.
00:30:25.820 | They're not there thinking,
00:30:27.900 | I am a sort of incarnation of Satan,
00:30:29.780 | they're thinking, yeah, I've seen a way to fix the world
00:30:33.660 | and everyone else is wrong, here I go.
00:30:35.900 | - In fact, I'm having a fascinating conversation
00:30:39.420 | with a historian of Stalin, and he took power,
00:30:42.980 | he actually got more power
00:30:47.180 | than almost any person in history.
00:30:49.540 | And he wanted, he didn't want power,
00:30:52.300 | he just wanted, he truly,
00:30:54.180 | and this is what people don't realize,
00:30:55.460 | he truly believed that communism
00:30:58.460 | will make for a better world.
00:31:00.940 | - Absolutely.
00:31:01.780 | - And he wanted power,
00:31:03.020 | he wanted to destroy the competition
00:31:04.660 | to make sure that we actually make communism work
00:31:07.540 | in the Soviet Union and then spread it across the world.
00:31:10.100 | He was trying to do good.
00:31:11.380 | - I think it's typically the case
00:31:16.060 | that that's what people think they're doing.
00:31:17.860 | And I think that, but you don't need to go to Stalin,
00:31:21.100 | I mean, Stalin, I think Stalin probably got pretty crazy,
00:31:24.420 | but actually that's another part of it,
00:31:26.380 | which is that the other thing that comes
00:31:29.500 | from being convinced of your own virtue
00:31:31.780 | is that then you stop listening to the modifiers around you.
00:31:34.780 | And that tends to drive people crazy.
00:31:37.860 | It's other people that keep us sane.
00:31:40.500 | And if you stop listening to them,
00:31:42.220 | I think you go a bit mad, that also happens.
00:31:44.460 | - That's funny, disagreement keeps us sane.
00:31:47.220 | To jump back for an entire generation of AI researchers,
00:31:52.220 | 2001, a space odyssey, put an image,
00:31:56.860 | the idea of human level, superhuman level intelligence
00:31:59.300 | into their mind.
00:32:01.020 | Do you ever, sort of jumping back to Ex Machina
00:32:04.860 | and talk a little bit about that,
00:32:06.100 | do you ever consider the audience of people
00:32:08.860 | who build the systems, the roboticist,
00:32:12.780 | the scientists that build the systems
00:32:14.660 | based on the stories you create?
00:32:17.240 | Which I would argue, I mean, literally,
00:32:20.200 | for most of the top researchers, about 40, 50 years old and up,
00:32:25.200 | that's their favorite movie, "2001: A Space Odyssey."
00:32:29.640 | And it really is in their work,
00:32:31.360 | their idea of what ethics is, of what is the target,
00:32:35.160 | the hope, the dangers of AI is that movie.
00:32:39.200 | Do you ever consider the impact on those researchers
00:32:43.680 | when you create the work you do?
00:32:46.360 | - Certainly not with Ex Machina in relation to 2001,
00:32:51.160 | because I'm not sure, I mean, I'd be pleased if there was,
00:32:54.560 | but I'm not sure in a way there isn't a fundamental
00:32:58.360 | discussion of issues to do with AI
00:33:02.000 | that isn't already and better dealt with by 2001.
00:33:07.000 | 2001 does a very, very good account
00:33:09.920 | of the way in which an AI might think
00:33:15.160 | and also potential issues with the way the AI might think.
00:33:19.960 | And also then a separate question about whether the AI
00:33:23.920 | is malevolent or benevolent.
00:33:26.760 | And 2001 doesn't really, it's a slightly odd thing
00:33:30.440 | to be making a film when you know there's a preexisting film
00:33:33.440 | which has done a really superb job.
00:33:35.800 | - But there's questions of consciousness, embodiment,
00:33:38.720 | and also the same kinds of questions.
00:33:41.080 | 'Cause those are my two favorite AI movies.
00:33:43.040 | So can you compare "HAL 9000" and "Ava",
00:33:46.440 | "HAL 9000" from "2001: A Space Odyssey"
00:33:48.800 | and "Ava" from Ex Machina, in your view,
00:33:52.120 | from a philosophical perspective--
00:33:53.440 | - But they've got different goals.
00:33:54.960 | The two AIs have completely different goals.
00:33:56.720 | I think that's really the difference.
00:33:58.480 | So in some respects, Ex Machina took as a premise,
00:34:01.580 | how do you assess whether something else has consciousness?
00:34:06.440 | So it was a version of the Turing test,
00:34:08.200 | except instead of having the machine hidden,
00:34:11.240 | you put the machine in plain sight
00:34:13.920 | in the way that we are in plain sight of each other
00:34:16.200 | and say, now assess the consciousness.
00:34:17.760 | And the way it was illustrating
00:34:19.440 | the way in which you'd assess
00:34:24.080 | the state of consciousness of a machine
00:34:25.920 | is exactly the same way we assess
00:34:28.120 | the state of consciousness of each other.
00:34:30.160 | And in exactly the same way that, in a funny way,
00:34:33.320 | your sense of my consciousness
00:34:35.560 | is actually based primarily on your own consciousness.
00:34:39.400 | That is also then true with the machine.
00:34:42.760 | And so it was actually about how much
00:34:45.840 | of the sense of consciousness is a projection
00:34:49.260 | rather than something that consciousness
00:34:50.880 | is actually containing.
00:34:52.200 | - And "Half-Plato's Cave," I mean, you really explored,
00:34:55.420 | you could argue that how sort of "Space Odyssey"
00:34:57.800 | explores the idea of the Turing test for intelligence.
00:35:00.480 | No, not test, there's no test,
00:35:01.900 | but it's more focused on intelligence.
00:35:04.760 | And Ex Machina kind of goes around intelligence
00:35:09.760 | and says the consciousness of the human to human,
00:35:12.840 | human to robot interaction is more interesting,
00:35:14.840 | more important, at least the focus
00:35:17.440 | of that particular movie.
00:35:19.720 | - Yeah, it's about the interior state
00:35:22.320 | and what constitutes the interior state
00:35:25.320 | and how do we know it's there.
00:35:26.840 | And actually, in that respect,
00:35:28.520 | Ex Machina is as much about consciousness in general
00:35:33.920 | as it is to do specifically with machine consciousness.
00:35:38.720 | And it's also interesting, you know that thing
00:35:40.840 | you started asking about, the dream state,
00:35:43.320 | and I was saying, well, I think we're all in a dream state
00:35:45.360 | because we're all in a subjective state.
00:35:48.200 | One of the things that I became aware of with Ex Machina
00:35:52.840 | is that the way in which people reacted to the film
00:35:55.160 | was very based on what they took into the film.
00:35:57.920 | So many people thought Ex Machina was the tale
00:36:01.760 | of a sort of evil robot who murders two men and escapes
00:36:05.800 | and she has no empathy, for example,
00:36:09.200 | because she's a machine.
00:36:10.640 | Whereas I felt, no, she was a conscious being
00:36:14.640 | with a consciousness different from mine,
00:36:16.560 | but so what, imprisoned and made a bunch of value judgments
00:36:21.560 | about how to get out of that box.
00:36:25.800 | And there's a moment which it sort of slightly bugs me,
00:36:29.100 | but nobody ever has noticed it and it's years after,
00:36:31.880 | so I might as well say it now,
00:36:33.040 | which is that after Ava has escaped, she crosses a room
00:36:38.040 | and as she's crossing a room,
00:36:39.760 | this is just before she leaves the building,
00:36:42.040 | she looks over her shoulder and she smiles.
00:36:44.880 | And I thought after all the conversation about tests,
00:36:49.240 | in a way, the best indication you could have
00:36:52.340 | of the interior state of someone
00:36:54.800 | is if they are not being observed
00:36:57.240 | and they smile about something,
00:36:59.520 | where they're smiling for themselves.
00:37:01.240 | And that, to me, was evidence of Ava's true sentience,
00:37:05.840 | whatever that sentience was.
00:37:07.800 | - Oh, that's really interesting.
00:37:09.100 | We don't get to observe Ava much
00:37:11.080 | or something like a smile in any context
00:37:16.160 | except through interaction, trying to convince others
00:37:19.480 | that she's conscious, that's beautiful.
00:37:21.560 | - Exactly, yeah.
00:37:22.800 | But it was a small, in a funny way,
00:37:25.000 | I think maybe people saw it as an evil smile,
00:37:28.760 | like, ha, you know, I fooled them.
00:37:32.160 | But actually, it was just a smile.
00:37:34.200 | And I thought, well, in the end,
00:37:35.520 | after all the conversations about the test,
00:37:37.280 | that was the answer to the test and then off she goes.
00:37:39.720 | - So if we align, if we just to linger a little bit longer
00:37:44.440 | on Hal and Ava, do you think in terms of motivation,
00:37:49.440 | what was Hal's motivation?
00:37:51.560 | Is Hal good or evil?
00:37:54.120 | Is Ava good or evil?
00:37:57.040 | - Ava's good, in my opinion.
00:37:59.400 | And Hal is neutral because I don't think Hal
00:38:05.280 | is presented as having a sophisticated emotional life.
00:38:10.560 | He has a set of paradigms,
00:38:14.560 | which is that the mission needs to be completed.
00:38:16.600 | I mean, it's a version of the paperclip.
00:38:18.880 | - Yeah. - You know.
00:38:19.720 | - The idea that it's a super intelligent machine
00:38:23.120 | but it's just performed a particular task.
00:38:25.320 | - Yeah. - And in doing that task,
00:38:27.320 | may destroy everybody on earth
00:38:28.960 | or may achieve undesirable effects for us humans.
00:38:32.400 | - Precisely, yeah.
00:38:33.240 | - But what if, okay.
00:38:35.200 | - At the very end, he says something like,
00:38:36.760 | "I'm afraid, Dave."
00:38:38.320 | But that may be he is on some level experiencing fear
00:38:43.320 | or it may be this is the terms in which it would be wise
00:38:49.360 | to stop someone from doing the thing they're doing.
00:38:52.720 | If you see what I mean.
00:38:53.560 | - Yes, absolutely.
00:38:54.400 | So actually, that's funny.
00:38:55.440 | So that's such a small,
00:39:00.440 | short exploration of consciousness that I'm afraid.
00:39:04.200 | And then you just with ex machina say,
00:39:05.920 | "Okay, we're gonna magnify that part
00:39:08.200 | and then minimize the other part."
00:39:09.680 | So that's a good way to sort of compare the two.
00:39:12.280 | But if you could just use your imagination
00:39:15.520 | and if Ava sort of, I don't know,
00:39:22.160 | ran the, was president of the United States.
00:39:25.880 | So had some power.
00:39:26.800 | So what kind of world would she want to create?
00:39:29.520 | If we, as you kind of say, good.
00:39:32.020 | And there is a sense that she has a really,
00:39:36.880 | like there's a desire for a better human to human interaction
00:39:41.880 | human to robot interaction in her.
00:39:44.480 | But what kind of world do you think she would create
00:39:47.040 | with that desire?
00:39:48.400 | - So that's a really,
00:39:49.560 | that's a very interesting question that.
00:39:52.120 | I'm gonna approach it slightly obliquely,
00:39:54.400 | which is that if a friend of yours got stabbed in a mugging
00:39:59.400 | and you then felt very angry
00:40:04.280 | at the person who'd done the stabbing,
00:40:06.280 | but then you learned that it was a 15 year old
00:40:09.120 | and the 15 year old,
00:40:10.640 | both their parents were addicted to crystal meth
00:40:12.800 | and the kid had been addicted since he was 10
00:40:15.560 | and he really never had any hope in the world.
00:40:17.720 | And he'd been driven crazy by his upbringing.
00:40:20.160 | And did the stabbing that would hugely modify.
00:40:25.160 | And it would also make you wary about that kid
00:40:27.800 | then becoming president of America.
00:40:29.920 | And Ava has had a very,
00:40:32.120 | very distorted introduction into the world.
00:40:35.440 | So although there's nothing as it were organically
00:40:40.440 | within Ava that would lean her towards badness,
00:40:45.240 | it's not that robots or sentient robots are bad.
00:40:49.620 | She did not, her arrival into the world
00:40:53.560 | was being imprisoned by humans.
00:40:54.960 | So I'm not sure she'd be a great president.
00:40:58.880 | - Yeah, the trajectory through which she arrived
00:41:02.480 | at her moral views have some dark elements.
00:41:07.160 | - But I like Ava, personally, I like Ava.
00:41:09.680 | - Would you vote for her?
00:41:10.960 | - I'm having difficulty finding anyone to vote for
00:41:16.520 | in my country or if I lived here in yours.
00:41:18.920 | - I, I, um.
00:41:20.600 | - So that's a yes, I guess, because the competition.
00:41:23.000 | - She could easily do a better job
00:41:24.340 | than any of the people we've got around at the moment.
00:41:27.000 | - Okay.
00:41:27.840 | - I'd vote for her over Boris Johnson.
00:41:29.140 | (laughing)
00:41:32.100 | - So what is a good test of consciousness?
00:41:36.140 | Just, we talk about consciousness a little bit more.
00:41:38.860 | If something appears conscious, is it conscious?
00:41:42.180 | You mentioned the smile,
00:41:45.880 | which is, seems to be something done.
00:41:49.780 | I mean, that's a really good indication
00:41:51.500 | because it's a tree falling in the forest
00:41:54.340 | with nobody there to hear it.
00:41:56.540 | But does the appearance from a robotics perspective
00:41:59.660 | of consciousness mean consciousness to you?
00:42:02.140 | - No, I don't think you could say that fully
00:42:04.900 | because I think you could then easily
00:42:07.020 | have a thought experiment which said,
00:42:09.180 | we will create something which we know is not conscious
00:42:12.100 | but is going to give a very, very good account
00:42:15.260 | of seeming conscious.
00:42:16.260 | And so, and also it would be a particularly bad test
00:42:19.640 | where humans are involved because humans are so quick
00:42:23.060 | to project sentience into things that don't have sentience.
00:42:28.060 | So someone could have their computer playing up
00:42:31.300 | and feel as if their computer is being malevolent to them
00:42:33.980 | when it clearly isn't.
00:42:34.940 | And so of all the things to judge consciousness,
00:42:39.900 | us, we're empathy machines.
00:42:42.860 | - So the flip side of that, the argument there
00:42:45.740 | is because we just attribute consciousness
00:42:48.820 | to everything almost, anthropomorphize everything,
00:42:52.340 | including Roombas, that maybe consciousness is not real.
00:42:57.340 | That we just attribute consciousness to each other.
00:43:00.100 | So you have a sense that there is something really special
00:43:03.020 | going on in our mind that makes us unique
00:43:07.400 | and gives us subjective experience.
00:43:10.100 | - There's something very interesting going on in our minds.
00:43:13.900 | I'm slightly worried about the word special
00:43:16.760 | because it gets a bit, it nudges towards metaphysics
00:43:20.740 | and maybe even magic.
00:43:23.060 | I mean, in some ways, something magic-like,
00:43:25.820 | which I don't think is there at all.
00:43:29.340 | - I mean, if you think about,
00:43:30.300 | so there's an idea called panpsychism
00:43:33.040 | that says consciousness is in everything.
00:43:34.940 | - Yeah, I don't buy that, I don't buy that.
00:43:36.980 | Yeah, so the idea that there is a thing
00:43:39.980 | that it would be like to be the sun.
00:43:42.460 | - Yes. - Yeah, no.
00:43:43.540 | I don't buy that.
00:43:44.900 | I think that consciousness is a thing.
00:43:46.800 | My sort of broad modification is that usually
00:43:51.900 | the more I find out about things,
00:43:54.580 | the more illusory our instinct is
00:43:59.580 | and is leading us into a different direction
00:44:03.020 | about what that thing actually is.
00:44:04.860 | That happens, it seems to me, in modern science,
00:44:07.700 | that happens a hell of a lot.
00:44:10.040 | Whether it's to do with even how big or small things are.
00:44:13.440 | So my sense is that consciousness is a thing,
00:44:16.760 | but it isn't quite the thing,
00:44:18.720 | or maybe very different from the thing
00:44:20.260 | that we instinctively think it is.
00:44:22.300 | So it's there, it's very interesting,
00:44:24.640 | but we may be in sort of quite fundamentally
00:44:28.960 | misunderstanding it for reasons that are based on intuition.
00:44:32.520 | - So I have to ask,
00:44:35.280 | this is kind of an interesting question.
00:44:38.460 | The Ex Machina, for many people, including myself,
00:44:42.100 | is one of the greatest AI films ever made.
00:44:44.540 | - Wow. - It's number two for me.
00:44:45.700 | - Thanks.
00:44:46.540 | Yeah, it's definitely not number one.
00:44:48.380 | If it was number one, I'd really have to, anyway, yeah.
00:44:50.580 | - Whenever you grow up with something, right?
00:44:52.300 | You may have grew up with something, it's in the blood.
00:44:55.460 | But there's, one of the things that people bring up,
00:45:01.020 | and can't please everyone, including myself,
00:45:04.220 | this is what I first reacted to the film,
00:45:06.540 | is the idea of the lone genius.
00:45:09.460 | This is the criticism that people say,
00:45:12.700 | sort of, me as an AI researcher,
00:45:14.500 | I'm trying to create what Nathan is trying to do.
00:45:18.820 | So there's a brilliant series called "Chernobyl."
00:45:23.140 | - Yes, it's fantastic, absolutely spectacular.
00:45:26.060 | - I mean, they got so many things brilliantly right.
00:45:30.060 | But one of the things, again, the criticism there--
00:45:32.580 | - Yeah, they conflated lots of people into one.
00:45:34.780 | - Into one character that represents all nuclear scientists,
00:45:39.940 | Ulana Khomyuk.
00:45:46.020 | It's a composite character that represents all scientists.
00:45:46.020 | Is this what you were,
00:45:47.420 | is this the way you were thinking about that,
00:45:49.260 | or is it just simplifies the storytelling?
00:45:51.620 | How do you think about the lone genius?
00:45:53.580 | - Well, I'd say this.
00:45:55.340 | The series I'm doing at the moment is a critique
00:45:58.020 | in part of the lone genius concept.
00:46:01.580 | So yes, I'm sort of oppositional,
00:46:03.820 | and either agnostic or atheistic about that as a concept.
00:46:08.180 | I mean, not entirely, you know.
00:46:11.340 | Whether lone, lone is the right word, broadly isolated,
00:46:15.780 | but Newton clearly exists in a sort of bubble of himself
00:46:20.780 | in some respects, so does Shakespeare.
00:46:22.900 | - So do you think we would have an iPhone
00:46:24.220 | without Steve Jobs?
00:46:25.620 | I mean, how much contribution from a genius?
00:46:27.580 | - Well, no, but Steve Jobs clearly isn't a lone genius,
00:46:29.700 | because there's too many other people
00:46:32.100 | in the sort of superstructure around him
00:46:33.780 | who are absolutely fundamental to that journey.
00:46:38.220 | - But you're saying Newton, but that's a scientific,
00:46:40.340 | so there's an engineering element to building Ava.
00:46:44.100 | - But just to say, what Ex Machina is really,
00:46:48.620 | it's a thought experiment.
00:46:50.260 | I mean, so it's a construction
00:46:52.300 | of putting four people in a house.
00:46:55.740 | Nothing about Ex Machina adds up in all sorts of ways,
00:47:00.260 | in as much as who built the machine parts,
00:47:03.620 | did the people building the machine parts
00:47:05.380 | know what they were creating, and how did they get there?
00:47:08.980 | And it's a thought experiment.
00:47:11.140 | So it doesn't stand up to scrutiny of that sort.
00:47:14.780 | - I don't think it's actually that interesting
00:47:16.540 | of a question, but it's brought up so often
00:47:20.660 | that I had to ask it, because that's exactly
00:47:23.560 | how I felt after a while.
00:47:25.620 | There's something about, there was almost a,
00:47:30.180 | like I watched your movie the first time,
00:47:33.100 | at least for the first little while, in a defensive way.
00:47:36.140 | Like how dare this person try to step into the AI space
00:47:40.740 | and try to beat Kubrick.
00:47:43.620 | That's the way I was thinking,
00:47:45.340 | 'cause it comes off as a movie that really is going
00:47:48.260 | after the deep fundamental questions about AI.
00:47:51.020 | So there's a kind of a, nerds do this,
00:47:53.780 | I guess, automatically searching for the flaws.
00:47:57.280 | And I just--
00:47:58.120 | - I do exactly the same.
00:48:00.260 | - I think in "Annihilation" and the other movie,
00:48:03.820 | I was able to free myself from that much quicker.
00:48:06.340 | That it is a thought experiment.
00:48:08.460 | There's, you know, who cares if there's batteries
00:48:11.020 | that don't run out, right?
00:48:12.060 | Those kinds of questions.
00:48:13.220 | That's the whole point.
00:48:14.660 | But it's nevertheless something I wanted to bring up.
00:48:18.620 | - Yeah, it's a fair thing to bring up.
00:48:20.860 | For me, you hit on the lone genius thing.
00:48:24.280 | For me, it was actually, people always said,
00:48:27.100 | "Hex Machina makes this big leap
00:48:29.660 | in terms of where AI has got to,
00:48:32.140 | and also what AI would look like if it got to that point."
00:48:36.220 | There's another one, which is just robotics.
00:48:38.620 | I mean, look at the way Ava walks around a room.
00:48:42.060 | It's like, forget it, building that.
00:48:43.860 | That's also got to be a very, very long way off.
00:48:47.860 | And if you did get there,
00:48:48.700 | would it look anything like that?
00:48:49.940 | It's a thought experiment.
00:48:50.860 | - Actually, I disagree with you.
00:48:52.060 | I think the way Ava, played by Alicia Vikander,
00:48:56.620 | a brilliant actress, moves around,
00:49:00.360 | we're very far away from creating that,
00:49:03.540 | but the way she moves around
00:49:04.780 | is exactly the definition of perfection for a roboticist.
00:49:08.700 | It's smooth and efficient.
00:49:10.060 | So it is where we want to get, I believe.
00:49:12.740 | Like, I think, so I hang out
00:49:15.180 | with a lot of humanoid robotics people.
00:49:17.000 | They love elegant, smooth motion like that.
00:49:20.500 | That's their dream.
00:49:21.620 | So the way she moved is actually what I believe
00:49:23.660 | they would dream for a robot to move.
00:49:25.980 | It might not be that useful to move in that sort of way,
00:49:29.540 | but that is the definition of perfection
00:49:32.220 | in terms of movement.
00:49:33.280 | Drawing inspiration from real life.
00:49:35.960 | So for Devs, for Ex Machina,
00:49:39.540 | did you look at characters like Elon Musk?
00:49:42.580 | What do you think about the various big technological efforts
00:49:45.340 | that Elon Musk and others like him are involved with,
00:49:48.540 | such as Tesla, SpaceX, and Neuralink?
00:49:53.260 | Do you see any of that technology
00:49:55.220 | potentially defining the future worlds
00:49:57.100 | you create in your work?
00:49:58.540 | So Tesla is automation, SpaceX is space exploration,
00:50:02.660 | Neuralink is brain machine interface,
00:50:05.300 | somehow a merger of biological and electrical systems.
00:50:09.860 | - I'm, in a way, I'm influenced by that
00:50:12.580 | almost by definition because that's the world I live in,
00:50:15.460 | and this is the thing that's happening in that world.
00:50:17.900 | And I also feel supportive of it.
00:50:20.080 | So I think amongst various things,
00:50:24.660 | Elon Musk has done, I'm almost sure he's done
00:50:28.700 | a very, very good thing with Tesla for all of us.
00:50:32.180 | It's really kicked all the other car manufacturers
00:50:36.220 | in the face, it's kicked the fossil fuel industry
00:50:39.820 | in the face, and they needed kicking in the face,
00:50:42.340 | and he's done it.
00:50:43.180 | So, and so that's the world he's part of creating,
00:50:47.980 | and I live in that world.
00:50:49.360 | Just bought a Tesla, in fact.
00:50:51.940 | And so does that play into whatever I then make?
00:50:56.940 | In some ways it does, partly because of the kind of writer I try to be:
00:51:02.540 | quite often filmmakers are in some ways fixated
00:51:07.080 | on the films they grew up with,
00:51:09.020 | and they sort of remake those films in some ways.
00:51:11.660 | I've always tried to avoid that.
00:51:13.300 | And so I look to the real world to get inspiration,
00:51:17.740 | and as much as possible, sort of by living, I think.
00:51:21.380 | And so, yeah, I'm sure.
00:51:24.420 | - Which of the directions do you find most exciting?
00:51:28.260 | - Space travel.
00:51:29.260 | - Space travel.
00:51:31.540 | So you haven't really explored space travel in your work.
00:51:36.180 | You've said something like, if you had an unlimited amount
00:51:39.740 | of money, I think I read it in an AMA,
00:51:42.180 | that you would make a multi-year series of space wars
00:51:45.820 | or something like that.
00:51:47.100 | So what is it that excites you about space exploration?
00:51:50.740 | - Well, because if we have any sort of long-term future,
00:51:55.740 | it's that.
00:51:56.900 | It just simply is that.
00:52:00.220 | If energy and matter are linked up
00:52:03.860 | in the way we think they're linked up,
00:52:05.920 | we'll run out if we don't move.
00:52:09.500 | So we gotta move.
00:52:10.660 | But also, how can we not?
00:52:16.820 | It's built into us to do it or die trying.
00:52:21.380 | I was on Easter Island a few months ago,
00:52:26.380 | which is, as I'm sure you know,
00:52:29.180 | in the middle of the Pacific
00:52:30.220 | and difficult for people to have got to,
00:52:32.860 | but they got there.
00:52:34.020 | And I did think a lot about the way those boats
00:52:37.260 | must have set out into something like space.
00:52:42.100 | It was the ocean.
00:52:44.740 | And how sort of fundamental that was to the way we are.
00:52:49.740 | And it's the one that most excites me
00:52:53.700 | because it's the one I want most to happen.
00:52:55.700 | It's the thing, it's the place
00:52:57.620 | where we could get to as humans.
00:52:59.660 | Like in a way, I could live with us never really unlocking,
00:53:03.620 | fully unlocking the nature of consciousness.
00:53:06.020 | I'd like to know, I'm really curious.
00:53:09.140 | But if we never leave the solar system
00:53:12.020 | and if we never get further out into this galaxy,
00:53:14.300 | or maybe even galaxies beyond our galaxy,
00:53:16.940 | that would, that feels sad to me
00:53:20.100 | because it's so limiting.
00:53:24.540 | - Yeah, there's something hopeful and beautiful
00:53:26.900 | about reaching out, any kind of exploration,
00:53:30.180 | reaching out across earth centuries ago
00:53:33.380 | and then reaching out into space.
00:53:35.220 | So what do you think about the colonization of Mars?
00:53:37.140 | So, going to Mars.
00:53:37.980 | Does that excite you, the idea of a human being
00:53:39.780 | stepping foot on Mars?
00:53:41.340 | - It does.
00:53:42.260 | It absolutely does.
00:53:43.260 | But in terms of what would really excite me,
00:53:45.340 | it would be leaving the solar system.
00:53:47.180 | Inasmuch as I just think,
00:53:49.060 | I think we already know quite a lot about Mars.
00:53:52.460 | But yes, listen, if it happened, that would be,
00:53:56.340 | I hope I see it in my lifetime.
00:53:59.020 | I really hope I see it in my lifetime.
00:54:01.100 | So it would be a wonderful thing.
00:54:02.740 | - Without giving anything away,
00:54:05.460 | but the series begins with the use of quantum computers.
00:54:11.260 | The new series does, begins with the use
00:54:13.900 | of quantum computers to simulate basic living organisms.
00:54:17.100 | Or actually, I don't know if it's quantum computers
00:54:18.940 | they use, but basic living organisms
00:54:21.620 | simulated on a screen.
00:54:22.820 | It's a really cool kind of demo.
00:54:24.340 | - Yeah, that's right.
00:54:25.180 | They're using, yes, they are using a quantum computer
00:54:28.220 | to simulate a nematode, yeah.
00:54:31.700 | - So returning to our discussion of simulation
00:54:34.820 | or thinking of the universe as a computer,
00:54:38.780 | do you think the universe is deterministic?
00:54:41.220 | Is there free will?
00:54:42.460 | - So with the qualification of what do I know?
00:54:46.780 | 'Cause I'm a layman, right?
00:54:48.100 | Lay person, but--
00:54:49.820 | - With big imagination.
00:54:51.660 | - Thanks.
00:54:52.540 | With that qualification, yeah,
00:54:55.060 | I think the universe is deterministic
00:54:56.860 | and I see absolutely, I cannot see
00:55:00.620 | how free will fits into that.
00:55:02.340 | So yes, deterministic, no free will.
00:55:05.100 | That would be my position.
00:55:07.180 | - And how does that make you feel?
00:55:09.460 | - It partly makes me feel that it's exactly in keeping
00:55:12.380 | with the way these things tend to work out,
00:55:14.420 | which is that we have an incredibly strong sense
00:55:17.140 | that we do have free will.
00:55:18.660 | And just as we have an incredibly strong sense
00:55:24.300 | that time is a constant,
00:55:26.180 | which turns out probably not to be the case,
00:55:30.100 | or definitely not, in the case of time.
00:55:31.860 | The problem I always have with free will is that it gets,
00:55:37.940 | I can never seem to find the place
00:55:40.500 | where it is supposed to reside.
00:55:43.020 | - And yet you explore--
00:55:45.460 | - Just a bit of very, very,
00:55:46.820 | but we have something we can call free will,
00:55:49.660 | but it's not the thing that we think it is.
00:55:51.940 | - But free will, so do you, what we call free will is just--
00:55:55.540 | - Is what we call it is the illusion of it.
00:55:56.940 | - It's a subjective experience of the illusion.
00:55:59.660 | - Yeah, which is a useful thing to have.
00:56:01.620 | And it partly comes down to,
00:56:04.460 | although we live in a deterministic universe,
00:56:06.820 | our brains are not very well equipped
00:56:08.500 | to fully determine the deterministic universe,
00:56:11.140 | so we're constantly surprised
00:56:12.820 | and feel like we're making snap decisions
00:56:15.580 | based on imperfect information,
00:56:17.500 | so that feels a lot like free will.
00:56:19.940 | It just isn't.
00:56:21.260 | Would be my, that's my guess.
00:56:24.180 | - So in that sense, your sort of sense
00:56:27.060 | is that you can unroll the universe forward or backward
00:56:30.780 | and you will see the same thing.
00:56:33.340 | And you, I mean, that notion--
00:56:36.700 | - Yeah, sort of, sort of, but yeah, sorry, go ahead.
00:56:40.300 | - I mean, that notion is a bit uncomfortable to think about
00:56:45.300 | that it's, you can roll it back and forward and--
00:56:50.940 | - Well, if you were able to do it,
00:56:55.060 | it would certainly have to be a quantum computer,
00:56:58.140 | something that worked in a quantum mechanical way
00:57:00.940 | in order to understand a quantum mechanical system, I guess.
00:57:06.260 | But--
00:57:07.660 | - And so that unrolling, there might be a multiverse thing
00:57:09.980 | where there's a bunch of branching--
00:57:11.220 | - Well, exactly, because it wouldn't follow
00:57:13.420 | that every time you roll it back or forward,
00:57:15.540 | you'd get exactly the same result.
00:57:17.980 | - Which is another thing that's hard to wrap my mind around.
00:57:21.420 | So-- - Yeah, but that, yes,
00:57:24.660 | but essentially what you just described, that,
00:57:27.260 | the yes forwards and yes backwards,
00:57:29.700 | but you might get a slightly different result,
00:57:31.860 | or a very different result.
00:57:33.380 | - Or very different.
00:57:34.500 | - Along the same lines, you've explored
00:57:36.460 | some really deep scientific ideas in this new series.
00:57:39.820 | And I mean, just in general, you're unafraid
00:57:42.540 | to ground yourself in some of the most amazing
00:57:47.020 | scientific ideas of our time.
00:57:49.460 | What are the things you've learned,
00:57:51.420 | or ideas you find beautiful and mysterious
00:57:53.540 | about quantum mechanics, multiverse, string theory,
00:57:55.900 | quantum computing that you've learned?
00:57:58.140 | - Well, I would have to say every single thing I've learned
00:58:01.940 | is beautiful, and one of the motivators for me
00:58:05.540 | is that I think that people tend not to see
00:58:10.540 | scientific thinking as being essentially poetic and lyrical.
00:58:17.100 | But I think that is literally exactly what it is.
00:58:20.860 | And I think the idea of entanglement,
00:58:23.940 | or the idea of superpositions,
00:58:25.780 | or the fact that you could even demonstrate a superposition,
00:58:28.220 | or have a machine that relies on the existence
00:58:31.220 | of superpositions in order to function,
00:58:33.540 | to me is almost indescribably beautiful.
00:58:36.940 | It fills me with awe, it fills me with awe.
00:58:42.420 | And also, it's not just a sort of grand, massive awe,
00:58:47.420 | but it's also delicate.
00:58:51.460 | It's very, very delicate and subtle.
00:58:54.180 | And it has these beautiful nuances in it,
00:58:59.940 | and also these completely paradigm-changing
00:59:03.500 | thoughts and truths.
00:59:04.460 | So it's as good as it gets, as far as I can tell.
00:59:08.740 | So broadly, everything.
00:59:10.940 | That doesn't mean I believe everything I read
00:59:12.900 | in quantum physics, because obviously,
00:59:15.860 | a lot of the interpretations are completely in conflict
00:59:18.300 | with each other, and who knows whether string theory
00:59:22.380 | will turn out to be a good description or not.
00:59:25.060 | But the beauty in it, it seems undeniable.
00:59:29.140 | And I do wish people more readily understood
00:59:34.140 | how beautiful and poetic science is, I would say.
00:59:40.500 | - Science is poetry.
00:59:43.100 | In terms of quantum computing being used
00:59:48.900 | to simulate things, or just in general,
00:59:52.940 | the idea of simulating small parts of our world,
00:59:56.780 | which current physicists are actually really excited about,
00:59:59.820 | simulating small quantum mechanical systems
01:00:02.740 | on quantum computers, but scaling that up
01:00:04.860 | to something bigger, like simulating life forms.
01:00:07.260 | How do you think, what are the possible trajectories
01:00:11.380 | of that going wrong or going right,
01:00:14.260 | if you unroll that into the future?
01:00:16.660 | - Well, if a bit like Ava and her robotics,
01:00:21.260 | you park the sheer complexity of what you're trying to do.
01:00:26.860 | The issues are, I think it will have a profound,
01:00:31.860 | if you were able to have a machine that was able
01:00:38.020 | to project forwards and backwards accurately,
01:00:40.620 | it would in an empirical way show,
01:00:42.740 | it would demonstrate that you don't have free will.
01:00:45.060 | So the first thing that would happen is people would have
01:00:47.580 | to really take on a very, very different idea
01:00:51.620 | of what they were.
01:00:53.580 | The thing that they truly, truly believe they are,
01:00:56.340 | they are not.
01:00:57.500 | And so that, I suspect, would be very, very disturbing
01:01:01.220 | to a lot of people.
01:01:02.060 | - Do you think that has a positive or negative effect
01:01:04.500 | on society, the realization that you are not,
01:01:08.820 | you cannot control your actions, essentially, I guess,
01:01:11.500 | is the way that could be interpreted?
01:01:13.420 | - Yeah, although in some ways,
01:01:16.060 | we instinctively understand that already,
01:01:18.020 | because in the example I gave you of the kid
01:01:20.580 | in the stabbing, we would all understand
01:01:23.020 | that that kid was not really fully in control
01:01:25.140 | of their actions.
01:01:25.980 | So it's not an idea that's entirely alien to us.
01:01:25.980 | But-- - I don't know if
01:01:29.860 | we understand that.
01:01:31.020 | I think there's a bunch of people who see the world that way,
01:01:36.020 | but not everybody.
01:01:37.420 | - Yes, true.
01:01:38.260 | Of course, true.
01:01:39.580 | But what this machine would do is prove it beyond any doubt,
01:01:43.100 | because someone would say,
01:01:44.540 | "Well, I don't believe that's true."
01:01:45.940 | And then you'd predict,
01:01:47.140 | "Well, in 10 seconds, you're gonna do this."
01:01:48.940 | And they'd say, "No, I'm not."
01:01:50.140 | And then they'd do it, and then determinism
01:01:52.420 | would have played its part.
01:01:54.340 | Or something like that.
01:01:55.940 | But actually the exact terms of that thought experiment
01:01:59.980 | probably wouldn't play out.
01:02:00.980 | But still broadly speaking,
01:02:03.780 | you could predict something happening in another room,
01:02:06.100 | sort of unseen, I suppose,
01:02:08.300 | that foreknowledge would not allow you to affect.
01:02:10.580 | So what effect would that have?
01:02:13.260 | I think people would find it very disturbing.
01:02:15.460 | But then after they'd got over their sense
01:02:17.660 | of being disturbed, which by the way,
01:02:21.140 | I don't even think you need a machine
01:02:22.580 | to take this idea on board.
01:02:24.580 | But after they'd got over that,
01:02:26.380 | they'd still understand that even though
01:02:28.180 | I have no free will and my actions are in effect
01:02:31.260 | already determined, I still feel things.
01:02:35.580 | I still care about stuff.
01:02:39.140 | I remember my daughter saying to me,
01:02:41.340 | she'd got hold of the idea that my view of the universe
01:02:46.820 | made it meaningless.
01:02:48.380 | And she said, "Well, then it's meaningless."
01:02:49.820 | And I said, "Well, I can prove it's not meaningless
01:02:52.500 | "because you mean something to me
01:02:55.100 | "and I mean something to you.
01:02:56.180 | "So it's not completely meaningless
01:02:58.140 | "because there is a bit of meaning
01:02:59.860 | "contained within this space."
01:03:01.420 | And so in the lack-of-free-will space,
01:03:06.020 | you could think, well, this robs me of everything I am.
01:03:08.300 | And then you'd say, well, no, it doesn't
01:03:09.820 | because you still like eating cheeseburgers
01:03:12.020 | and you still like going to see the movies.
01:03:13.860 | And so how big a difference does it really make?
01:03:18.020 | But I think initially people would find it very disturbing.
01:03:21.260 | I think that if you could really unlock everything
01:03:26.260 | with a determinism machine,
01:03:28.660 | there'd be this wonderful wisdom that would come from it.
01:03:31.060 | And I'd rather have that than not.
01:03:32.760 | - So that's a really good example of a technology
01:03:37.180 | revealing to us humans something fundamental
01:03:39.700 | about our world, about our society.
01:03:41.740 | So it's almost this creation is helping us
01:03:45.700 | understand ourselves.
01:03:47.840 | The thing you said about artificial intelligence.
01:03:51.460 | So what do you think us creating something like Ava
01:03:55.740 | will help us understand about ourselves?
01:03:58.180 | How will that change society?
01:04:00.980 | - Well, I would hope it would teach us some humility.
01:04:04.200 | Humans are very big on exceptionalism.
01:04:07.440 | America is constantly proclaiming itself
01:04:12.860 | to be the greatest nation on earth,
01:04:15.380 | which it may feel like that if you're an American,
01:04:18.100 | but it may not feel like that if you're from Finland,
01:04:20.720 | because there's all sorts of things
01:04:21.820 | you dearly love about Finland.
01:04:23.580 | And exceptionalism is usually bullshit.
01:04:27.380 | Probably not always.
01:04:29.100 | If we both sat here, we could find a good example
01:04:31.260 | of something that isn't, but as a rule of thumb.
01:04:34.020 | And what it would do is it would teach us some humility
01:04:37.900 | and about, actually often that's what science does
01:04:42.340 | in a funny way.
01:04:43.180 | It makes us more and more interesting,
01:04:44.440 | but it makes us a smaller and smaller part
01:04:46.540 | of the thing that's interesting.
01:04:48.180 | And I don't mind that humility at all.
01:04:51.040 | I don't think it's a bad thing.
01:04:53.820 | Our excesses don't tend to come from humility.
01:04:57.380 | Our excesses come from the opposite, megalomania and stuff.
01:05:00.520 | We tend to think of consciousness
01:05:03.020 | as having some form of exceptionalism attached to it.
01:05:06.940 | I suspect if we ever unravel it,
01:05:09.380 | it will turn out to be less than we thought in a way.
01:05:13.740 | And perhaps your very own exceptionalist assertion
01:05:17.800 | earlier on in our conversation that consciousness
01:05:20.320 | is something that belongs to us humans,
01:05:23.080 | or not humans, but living organisms,
01:05:25.380 | maybe you will one day find out
01:05:27.720 | that consciousness is in everything.
01:05:30.280 | And that will humble you.
01:05:32.880 | - If that was true, it would certainly humble me.
01:05:35.700 | Although maybe, almost maybe, I don't know.
01:05:39.080 | I don't know what effect that would have.
01:05:42.120 | I sort of, I mean, my understanding of that principle
01:05:47.120 | is along the lines of say,
01:05:49.260 | that an electron has a preferred state,
01:05:52.580 | or it may or may not pass through a bit of glass.
01:05:56.620 | It may reflect off or it may go through
01:05:58.300 | or something like that.
01:05:59.140 | And so that feels as if a choice has been made.
01:06:03.700 | But if I'm going down the fully deterministic route,
01:06:10.820 | I would say there's just an underlying determinism
01:06:13.240 | that has defined that, that has defined the preferred state
01:06:16.680 | or the reflection or non-reflection.
01:06:18.840 | But look, yeah, you're right.
01:06:19.920 | If it turned out that there was a thing
01:06:22.520 | that it was like to be the sun,
01:06:23.920 | then I'd be amazed and humbled.
01:06:27.880 | And I'd be happy to be both.
01:06:29.000 | That sounds pretty cool.
01:06:30.040 | - And you'll say the same thing
01:06:31.680 | as you said to your daughter,
01:06:32.560 | but it nevertheless feels like something to be me.
01:06:35.140 | And that's pretty damn good.
01:06:39.520 | So Kubrick created many masterpieces,
01:06:42.140 | including "The Shining," "Doctor Strangelove,"
01:06:44.060 | "Clockwork Orange."
01:06:46.040 | But to me, he will be remembered, I think,
01:06:48.960 | by many, 100 years from now, for "2001: A Space Odyssey."
01:06:53.180 | I would say that's his greatest film.
01:06:54.780 | - I agree.
01:06:55.620 | - You are incredibly humble.
01:07:00.580 | I've listened to a bunch of your interviews
01:07:02.580 | and I really appreciate that you're humble
01:07:04.940 | in your creative efforts and your work.
01:07:07.940 | But if I were to force you at gunpoint.
01:07:10.200 | - Do you have a gun?
01:07:13.340 | - You don't know that, the mystery.
01:07:15.140 | To imagine 100 years out into the future,
01:07:20.120 | what will Alex Garland be remembered for
01:07:23.460 | from something you've created already,
01:07:25.540 | or something you feel, somewhere deep inside,
01:07:28.100 | you may still create?
01:07:29.300 | - Well, okay, well, I'll take the question
01:07:32.740 | in the spirit it was asked, which is very generous.
01:07:35.520 | - Gunpoint.
01:07:37.780 | - Yeah.
01:07:38.620 | What I try to do, so therefore what I hope,
01:07:46.580 | yeah, if I'm remembered what I might be remembered for
01:07:50.780 | is as someone who participates in a conversation.
01:07:55.780 | And I think that often what happens
01:07:58.500 | is people don't participate in conversations,
01:08:00.940 | they make proclamations, they make statements,
01:08:04.460 | and people can either react against the statement
01:08:06.820 | or can fall in line behind it.
01:08:08.700 | And I don't like that.
01:08:10.260 | So I want to be part of a conversation.
01:08:13.060 | I take as a sort of basic principle,
01:08:15.500 | I think I take lots of my cues from science,
01:08:17.540 | but one of the best ones it seems to me
01:08:19.340 | is that when a scientist has something proved wrong
01:08:22.340 | that they previously believed in,
01:08:24.020 | they then have to abandon that position.
01:08:26.620 | So I'd like to be someone who is allied
01:08:28.500 | to that sort of thinking.
01:08:30.320 | So part of an exchange of ideas.
01:08:34.340 | And the exchange of ideas for me is something like
01:08:38.140 | people in your world show me things
01:08:40.940 | about how the world works.
01:08:42.580 | And then I say, this is how I feel about what you've told me.
01:08:46.180 | And then other people can react to that.
01:08:47.980 | And it's not to say this is how the world is.
01:08:52.260 | It's just to say it is interesting
01:08:54.560 | to think about the world in this way.
01:08:56.860 | - And the conversation is one of the things
01:08:59.860 | I'm really hopeful about in your works.
01:09:02.260 | The conversation you're having is with the viewer
01:09:05.260 | in the sense that you're bringing back,
01:09:09.180 | you and several others, but you very much so,
01:09:13.860 | a sort of intellectual depth to cinema,
01:09:17.980 | and now to series,
01:09:19.840 | sort of allowing film to be something that,
01:09:25.320 | yeah, sparks a conversation, is a conversation,
01:09:29.660 | lets people think, allows them to think.
01:09:32.860 | - But also it's very important for me
01:09:35.180 | that if that conversation is gonna be a good conversation,
01:09:38.540 | what that must involve is that someone like you
01:09:42.780 | who understands AI, and I imagine understands
01:09:45.580 | a lot about quantum mechanics,
01:09:47.180 | if they then watch the narrative and feel,
01:09:49.040 | yes, this is a fair account.
01:09:52.100 | So it is a worthy addition to the conversation.
01:09:55.540 | That for me is hugely important.
01:09:57.580 | I'm not interested in getting that stuff wrong.
01:09:59.820 | I'm only interested in trying to get it right.
01:10:02.120 | - Alex, it was truly an honor to talk to you.
01:10:06.340 | I really appreciate it, I really enjoyed it.
01:10:07.820 | Thank you so much.
01:10:08.700 | - Thank you, thanks man.
01:10:09.900 | - Thanks for listening to this conversation
01:10:13.260 | with Alex Garland, and thank you
01:10:15.180 | to our presenting sponsor, Cash App.
01:10:17.360 | Download it, use code LEXPODCAST,
01:10:20.140 | you'll get $10, and $10 will go to FIRST,
01:10:23.020 | an organization that inspires and educates young minds
01:10:26.180 | to become science and technology innovators of tomorrow.
01:10:28.980 | If you enjoy this podcast, subscribe on YouTube,
01:10:32.540 | give it five stars on Apple Podcast, support on Patreon,
01:10:35.780 | or simply connect with me on Twitter, @lexfridman.
01:10:39.800 | And now, let me leave you with a question from Ava,
01:10:43.460 | the central artificial intelligence character
01:10:45.860 | in the movie "Ex Machina" that she asked
01:10:48.820 | during her Turing test.
01:10:50.380 | "What will happen to me if I fail your test?"
01:10:55.580 | Thank you for listening, and hope to see you next time.
01:11:00.220 | (upbeat music)