Ava's Smile: Ex Machina's Most Important Moment (Alex Garland) | AI Podcast Clips

00:00:00.000 | - For an entire generation of AI researchers,
00:00:04.700 | 2001: A Space Odyssey put an image,
00:00:08.420 | the idea of human-level, superhuman-level intelligence,
00:00:10.840 | into their minds.
00:00:12.520 | Do you ever, sort of jumping back to Ex Machina
00:00:16.400 | to talk a little bit about that,
00:00:17.640 | do you ever consider the audience of people
00:00:20.440 | who build the systems, the roboticists, the scientists
00:00:25.080 | that build the systems based on the stories you create?
00:00:28.760 | Which, I would argue, I mean, literally
00:00:31.760 | most of the top researchers, about 40, 50 years old and older,
00:00:36.760 | you know, that's their favorite movie, 2001: A Space Odyssey,
00:00:41.160 | and it really is in their work.
00:00:42.920 | Their idea of what ethics is, of what is the target,
00:00:46.680 | the hope, the dangers of AI, is that movie, right?
00:00:50.720 | Do you ever consider the impact on those researchers
00:00:55.240 | when you create the work you do?
00:00:57.960 | - Certainly not with Ex Machina in relation to 2001
00:01:02.760 | because I'm not sure, I mean, I'd be pleased if there was,
00:01:06.160 | but I'm not sure, in a way, there isn't a fundamental
00:01:09.960 | discussion of issues to do with AI that isn't already,
00:01:15.420 | and better, dealt with by 2001.
00:01:19.080 | 2001 gives a very, very good account of the way
00:01:25.000 | in which an AI might think and also potential issues
00:01:29.760 | with the way the AI might think.
00:01:31.520 | And also, then a separate question about whether the AI
00:01:35.520 | is malevolent or benevolent.
00:01:38.320 | And 2001 doesn't really... it's a slightly odd thing
00:01:42.020 | to be making a film when you know there's a pre-existing
00:01:44.720 | film which does a really superb job.
00:01:47.360 | - But there are questions of consciousness, embodiment,
00:01:50.280 | and also the same kinds of questions.
00:01:52.640 | 'Cause those are my two favorite AI movies.
00:01:54.620 | So can you compare HAL 9000 and Ava, HAL 9000
00:01:58.880 | from 2001: A Space Odyssey and Ava from Ex Machina,
00:02:02.400 | in your view, from a philosophical perspective--
00:02:05.000 | - But they've got different goals.
00:02:06.520 | The two AIs have completely different goals.
00:02:08.280 | I think that's really the difference.
00:02:10.060 | So in some respects, Ex Machina took as a premise,
00:02:13.140 | how do you assess whether something else has consciousness?
00:02:18.000 | So it was a version of the Turing test,
00:02:19.760 | except instead of having the machine hidden,
00:02:22.800 | you put the machine in plain sight in the way
00:02:25.900 | that we are in plain sight of each other
00:02:27.780 | and say, "Now assess the consciousness."
00:02:29.340 | And the way it was illustrating the way in which
00:02:34.340 | you'd assess the state of consciousness of a machine
00:02:37.480 | is exactly the same way we assess the state of consciousness
00:02:40.700 | of each other.
00:02:41.720 | And in exactly the same way that in a funny way,
00:02:44.900 | your sense of my consciousness is actually based
00:02:48.100 | primarily on your own consciousness.
00:02:50.960 | That is also then true with the machine.
00:02:54.340 | And so it was actually about how much of the sense
00:02:59.260 | of consciousness is a projection, rather than something
00:03:01.780 | that the other consciousness is actually containing.
00:03:03.740 | - And it's a bit of a Plato's cave.
00:03:05.340 | I mean, this is what you really explored.
00:03:06.980 | You could argue that HAL, or Space Odyssey,
00:03:09.340 | explores the idea of the Turing test for intelligence.
00:03:12.060 | There are no tests, there's no test,
00:03:13.460 | but it's more focused on intelligence.
00:03:16.320 | And Ex Machina kind of goes around intelligence
00:03:21.320 | and says the consciousness of the human-to-human,
00:03:24.420 | human-to-robot interaction is more interesting,
00:03:26.380 | more important, or at least the focus
00:03:28.980 | of that particular movie.
00:03:31.260 | - Yeah, it's about the interior state
00:03:33.900 | and what constitutes the interior state
00:03:36.860 | and how do we know it's there.
00:03:38.420 | And actually in that respect,
00:03:40.100 | Ex Machina is as much about consciousness in general
00:03:45.500 | as it is about machine consciousness specifically.
00:03:50.300 | And it's also interesting,
00:03:51.820 | you know that thing you started asking about
00:03:53.900 | the dream state and I was saying,
00:03:55.500 | well, I think we're all in a dream state
00:03:56.900 | because we're all in a subjective state.
00:03:59.740 | One of the things that I became aware of with Ex Machina
00:04:04.380 | is that the way in which people reacted to the film
00:04:06.720 | was very much based on what they took into the film.
00:04:09.500 | So many people thought Ex Machina was the tale
00:04:13.340 | of a sort of evil robot who murders two men and escapes
00:04:17.380 | and she has no empathy, for example,
00:04:20.740 | because she's a machine.
00:04:22.220 | Whereas I felt, no, she was a conscious being
00:04:26.220 | with a consciousness different from mine, but so what?
00:04:29.980 | She was imprisoned and made a bunch of value judgments
00:04:33.700 | about how to get out of that box.
00:04:37.380 | And there's a moment which sort of slightly bugs me,
00:04:40.660 | but nobody has ever noticed it, and it's years later,
00:04:43.420 | so I might as well say it now,
00:04:44.580 | which is that after Ava has escaped,
00:04:47.500 | she crosses a room, and as she's crossing the room,
00:04:51.300 | this is just before she leaves the building,
00:04:53.580 | she looks over her shoulder and she smiles.
00:04:56.460 | And I thought after all the conversation about tests,
00:05:00.780 | in a way, the best indication you could have
00:05:03.900 | of the interior state of someone
00:05:06.380 | is if they are not being observed
00:05:08.780 | and they smile about something,
00:05:11.060 | where they're smiling for themselves.
00:05:12.780 | And that, to me, was evidence of Ava's true sentience,
00:05:17.380 | whatever that sentience was.
00:05:19.340 | - But that's really interesting.
00:05:20.660 | We don't get to observe Ava much,
00:05:22.660 | or something like a smile, in any context
00:05:27.700 | except through interaction, trying to convince others
00:05:31.020 | that she's conscious. That's beautiful.
00:05:33.100 | - Exactly, yeah.
00:05:34.340 | But it was a small thing. In a funny way,
00:05:36.540 | I think maybe people saw it as an evil smile,
00:05:40.340 | like, "Ha,
00:05:41.420 | I fooled them!"
00:05:43.740 | But actually, it was just a smile.
00:05:45.780 | And I thought, well, in the end,
00:05:47.100 | after all the conversations about the test,
00:05:48.860 | that was the answer to the test, and then off she goes.
00:05:51.300 | - So if we could, if we just
00:05:53.540 | linger a little bit longer on HAL and Ava,
00:05:58.460 | do you think, in terms of motivation,
00:06:01.260 | what was HAL's motivation?
00:06:03.140 | Is HAL good or evil?
00:06:05.700 | Is Ava good or evil?
00:06:08.620 | - Ava's good, in my opinion.
00:06:10.980 | And HAL is neutral,
00:06:14.700 | because I don't think HAL is presented
00:06:18.060 | as having a sophisticated emotional life.
00:06:23.060 | He has a set of paradigms,
00:06:26.140 | which is that the mission needs to be completed.
00:06:28.180 | I mean, it's a version of the paperclip maximizer.
00:06:30.460 | - Yeah.
00:06:31.300 | The idea that it's a superintelligent machine,
00:06:34.700 | but it's just performing a particular task.
00:06:36.900 | - Yeah.
00:06:37.740 | - And in doing that task, it may destroy everybody on Earth,
00:06:40.540 | or may achieve undesirable effects for us humans.
00:06:43.980 | - Precisely, yeah.
00:06:44.820 | - But what if, okay.
00:06:46.780 | - At the very end, he says something like,
00:06:48.340 | "I'm afraid, Dave."
00:06:49.900 | But that may be because he is, on some level, experiencing fear,
00:06:54.900 | or it may be that these are the terms in which it would be wise
00:07:00.940 | to stop someone from doing the thing they're doing,
00:07:04.300 | if you see what I mean.
00:07:05.140 | - Yes, absolutely.
00:07:05.980 | So, actually, that's funny.
00:07:07.020 | So, that's such a small,
00:07:12.020 | short exploration of consciousness, that "I'm afraid."
00:07:15.780 | And then you just, with Ex Machina, say,
00:07:17.500 | "Okay, we're gonna magnify that part,
00:07:19.780 | and then minimize the other part."
00:07:21.260 | So, that's a good way to sort of compare the two.
00:07:23.900 | But if you could just use your imagination,
00:07:27.100 | and if Ava, sort of, I don't know,
00:07:33.740 | was President of the United States,
00:07:37.460 | so had some power.
00:07:38.340 | So, what kind of world would she want to create?
00:07:41.100 | If she is, as you kind of say, good.
00:07:43.580 | And there is a sense that she has a really,
00:07:48.460 | like, a desire for better human-to-human
00:07:53.220 | interaction, human-to-robot interaction in her.
00:07:56.040 | But what kind of world do you think she would create
00:07:58.620 | with that desire?
00:07:59.940 | - See, that's a really,
00:08:01.100 | that's a very interesting question, that.
00:08:03.660 | I'm gonna approach it slightly obliquely,
00:08:05.940 | which is that if a friend of yours got stabbed in a mugging,
00:08:10.940 | and you then felt very angry at the person
00:08:16.220 | who'd done the stabbing,
00:08:17.820 | but then you learned that it was a 15-year-old,
00:08:20.660 | and the 15-year-old, both their parents were addicted
00:08:23.500 | to crystal meth, and the kid had been addicted
00:08:26.220 | since he was 10, and he really never had any hope
00:08:28.460 | in the world, and he'd been driven crazy by his upbringing.
00:08:31.700 | And did the stabbing, that would hugely modify your feelings,
00:08:36.700 | and it would also make you wary about that kid
00:08:39.340 | then becoming President of America.
00:08:41.460 | And Ava has had a very, very distorted introduction
00:08:45.940 | into the world.
00:08:46.940 | So although there's nothing, as it were, organically
00:08:51.940 | within Ava that would lean her towards badness,
00:08:56.760 | it's not that robots or sentient robots are bad.
00:09:01.140 | Her arrival into the world
00:09:05.100 | was to be imprisoned by humans.
00:09:06.500 | So I'm not sure she'd be a great president.
00:09:10.420 | - Yeah, the trajectory through which she arrived
00:09:14.020 | at her moral views has some dark elements.
00:09:18.660 | - But I like Ava, personally I like Ava.
00:09:21.220 | - Would you vote for her?
00:09:22.460 | - I'm having difficulty finding anyone to vote for
00:09:28.020 | in my country, or if I lived here in yours.
00:09:30.460 | - So that's a yes, I guess, because of the competition.
00:09:34.580 | - She could easily do a better job than any of the people
00:09:36.620 | we've got around at the moment.
00:09:39.020 | I'd vote for her over Boris Johnson.
00:09:40.820 | - So what is a good test of consciousness?
00:09:48.220 | Let's talk about consciousness a little bit more.
00:09:50.420 | If something appears conscious, is it conscious?
00:09:53.740 | You mentioned the smile, which seems to be something
00:09:58.780 | done unobserved. I mean, that's a really good indication
00:10:03.100 | because it's a tree falling in the forest
00:10:05.980 | with nobody there to hear it.
00:10:08.140 | But does the appearance of consciousness,
00:10:11.260 | from a robotics perspective, mean consciousness to you?
00:10:13.740 | - No, I don't think you could say that fully
00:10:16.500 | because I think you could then easily have a thought
00:10:19.100 | experiment which said we will create something
00:10:22.060 | which we know is not conscious, but is going to give
00:10:25.500 | a very, very good account of seeming conscious.
00:10:27.860 | And also, it would be a particularly bad test
00:10:31.180 | where humans are involved because humans are so quick
00:10:34.620 | to project sentience into things that don't have sentience.
00:10:39.620 | So someone could have their computer playing up
00:10:42.860 | and feel as if their computer is being malevolent
00:10:45.100 | to them when it clearly isn't.
00:10:46.500 | And so, of all the things to judge consciousness with, us...
00:10:51.500 | - Humans are bad at--
00:10:53.300 | - We're empathy machines.
00:10:54.740 | - So the flip side of that, the argument there
00:10:57.300 | is that because we just attribute consciousness
00:11:00.380 | to almost everything, anthropomorphize everything,
00:11:03.900 | including Roombas, maybe consciousness is not real.
00:11:08.900 | That we just attribute consciousness to each other.
00:11:11.660 | So you have a sense that there is something really special
00:11:14.580 | going on in our minds that makes us unique
00:11:18.980 | and gives us subjective experience.
00:11:21.660 | - There's something very interesting going on in our minds.
00:11:25.460 | I'm slightly worried about the word special
00:11:28.340 | because it nudges towards metaphysics and maybe even magic.
00:11:33.340 | I mean, in some ways, something magic-like,
00:11:37.380 | which I don't think is there at all.
00:11:40.900 | - I mean, if you think about,
00:11:41.860 | so there's an idea called panpsychism
00:11:44.620 | that says consciousness is in everything.
00:11:46.860 | - I don't buy that, I don't buy that.
00:11:48.580 | Yeah, so the idea that there is something
00:11:51.540 | that it would be like to be the sun.
00:11:53.940 | - Yes. - Yeah, no.
00:11:55.300 | I don't buy that.
00:11:56.420 | I think that consciousness is a thing.
00:11:58.340 | My sort of broad modification is that usually
00:12:03.420 | the more I find out about things,
00:12:06.060 | the more illusory our instinct is
00:12:11.060 | and it's leading us in a different direction
00:12:14.500 | about what that thing actually is.
00:12:16.340 | That happens, it seems to me, in modern science,
00:12:19.180 | that happens a hell of a lot.
00:12:21.540 | Whether it's to do with even how big or small things are.
00:12:24.940 | So my sense is that consciousness is a thing,
00:12:28.260 | but it isn't quite the thing,
00:12:30.220 | or maybe very different from the thing
00:12:31.740 | that we instinctively think it is.
00:12:33.780 | So it's there, it's very interesting,
00:12:36.140 | but we may be, sort of quite fundamentally,
00:12:40.420 | misunderstanding it for reasons
00:12:42.020 | that are based on intuition.
00:12:44.060 | (upbeat music)