
David Ferrucci: Humor as the Turing Test for Intelligence | AI Podcast Clips


00:00:00.000 | One of the benchmarks for me is humor, right?
00:00:05.000 | That seems to be one of the hardest.
00:00:07.360 | And to me, the biggest contrast is Watson.
00:00:09.900 | So one of the greatest comedy sketches of all time, right,
00:00:14.840 | is the SNL celebrity Jeopardy!
00:00:18.120 | With Alex Trebek and Sean Connery
00:00:21.640 | and Burt Reynolds and so on.
00:00:23.640 | With Sean Connery commentating on Alex Trebek's mother
00:00:27.760 | a lot. And I think all of them
00:00:30.400 | are in the negative, points-wise.
00:00:32.400 | So they're clearly all losing
00:00:34.640 | in terms of the game of Jeopardy!,
00:00:35.860 | but they're winning in terms of comedy.
00:00:37.840 | So what do you think about humor in this whole interaction,
00:00:42.840 | in the dialogue that's productive?
00:00:46.040 | Or even just, what humor represents to me is
00:00:49.720 | the same idea that you're saying about frameworks,
00:00:54.920 | 'cause humor only exists
00:00:55.940 | within a particular human framework.
00:00:57.860 | So what do you think about humor?
00:00:59.100 | What do you think about things like humor
00:01:01.060 | that connect to the kind of creativity
00:01:02.860 | you mentioned that's needed?
00:01:04.660 | - I think there's a couple of things going on there.
00:01:05.940 | So I sort of feel like,
00:01:09.060 | and I might be too optimistic this way,
00:01:11.340 | but I think that there are,
00:01:14.260 | we did a little bit of work with this,
00:01:16.060 | with puns in Jeopardy!
00:01:18.540 | We literally sat down and said,
00:01:20.220 | how do puns work?
00:01:22.700 | And it's like wordplay,
00:01:24.340 | and you could formalize these things.
00:01:25.660 | So I think there are a lot of aspects of humor
00:01:27.820 | that you could formalize.
00:01:29.780 | You could also learn humor.
00:01:31.180 | You could just say, what do people laugh at?
00:01:33.060 | And if you have enough, again,
00:01:34.440 | if you have enough data to represent the phenomenon,
00:01:36.440 | you might be able to weigh the features
00:01:39.020 | and figure out what humans find funny
00:01:40.860 | and what they don't find funny.
00:01:42.260 | The machine might not be able to explain
00:01:44.740 | why the human finds it funny,
00:01:46.180 | unless we sit back and think about that more formally.
00:01:49.740 | I think, again, I think you do a combination of both.
00:01:51.980 | And I'm always a big proponent of that.
00:01:53.460 | I think robust architectures and approaches
00:01:56.260 | are always a little bit of a combination of us reflecting
00:01:59.200 | and being creative about how things are structured,
00:02:02.060 | how to formalize them,
00:02:03.340 | and then taking advantage of large data
00:02:05.100 | and doing learning and figuring out
00:02:06.180 | how to combine these two approaches.
00:02:08.660 | I think there's another aspect to humor though,
00:02:10.980 | which goes to the idea that I feel like I can relate
00:02:13.900 | to the person telling the story.
00:02:18.340 | And I think that's an interesting theme
00:02:21.700 | in the whole AI space, which is,
00:02:23.720 | do I feel differently when I know it's a robot?
00:02:27.980 | And when I know, when I imagine
00:02:30.980 | that the robot is not conscious the way I'm conscious,
00:02:33.740 | when I imagine the robot does not actually
00:02:35.840 | have the experiences that I experience,
00:02:38.240 | do I find it funny?
00:02:40.500 | Or do I not, because it's not as relatable,
00:02:42.580 | because I don't imagine that it's relating to it
00:02:46.060 | the way I relate to it?
00:02:47.380 | I think this also, you see this in the arts
00:02:50.900 | and in entertainment where,
00:02:52.940 | sometimes you have savants who are remarkable at a thing,
00:02:56.900 | whether it's sculpture, it's music or whatever,
00:02:59.340 | but the people who get the most attention
00:03:00.820 | are the people who can evoke a similar emotional response,
00:03:05.820 | who can get you to emote, right,
00:03:10.260 | about it the way they do.
00:03:11.460 | In other words, who can basically make the connection
00:03:13.980 | from the artifact, from the music or the painting
00:03:16.540 | or the sculpture to the emotion
00:03:19.300 | and get you to share that emotion with them.
00:03:21.900 | And then, and that's when it becomes compelling.
00:03:24.220 | So they're communicating at a whole different level.
00:03:26.480 | They're not just communicating the artifact.
00:03:28.860 | They're communicating their emotional response
00:03:30.460 | to the artifact.
00:03:31.300 | And then you feel like, oh, wow,
00:03:32.900 | I can relate to that person.
00:03:34.020 | I can connect to that.
00:03:35.060 | I can connect to that person.
00:03:36.620 | So I think humor has that aspect as well.
00:03:40.180 | - So the idea that you can connect to that person,
00:03:44.300 | person being the critical thing,
00:03:46.780 | but we're also able to anthropomorphize objects,
00:03:51.780 | robots and AI systems pretty well.
00:03:54.660 | So we're almost looking to make them human.
00:03:58.260 | But maybe from your experience with Watson,
00:04:00.300 | maybe you can comment on, did you consider that?
00:04:04.420 | Well, obviously the problem of Jeopardy!
00:04:06.500 | doesn't require anthropomorphization, but nevertheless--
00:04:09.980 | - Well, there was some interest in doing that.
00:04:11.780 | And that's another thing I didn't wanna do
00:04:14.500 | 'cause I didn't wanna distract
00:04:15.740 | from the actual scientific task.
00:04:18.260 | But you're absolutely right.
00:04:19.100 | I mean, humans do anthropomorphize
00:04:22.860 | and without necessarily a lot of work.
00:04:25.380 | I mean, you just put some eyes
00:04:26.580 | and a couple of eyebrow movements
00:04:28.740 | and you're getting humans to react emotionally.
00:04:31.300 | And I think you can do that.
00:04:33.020 | So I didn't mean to suggest that that connection
00:04:37.580 | cannot be mimicked.
00:04:40.120 | I think that connection can be mimicked
00:04:41.740 | and can produce that emotional response.
00:04:46.740 | I just wonder though, if you're told what's really going on,
00:04:51.780 | if you know that the machine is not conscious,
00:04:56.700 | not having the same richness of emotional reactions
00:05:00.260 | and understanding that it doesn't really share
00:05:01.700 | the understanding, but it's essentially
00:05:03.500 | just moving its eyebrow or drooping its eyes
00:05:05.900 | or making them bigger, whatever it's doing,
00:05:07.660 | just getting the emotional response.
00:05:09.660 | Will you still feel it?
00:05:11.080 | Interesting, I think you probably would for a while.
00:05:13.860 | And then when it becomes more important
00:05:15.380 | that there's a deeper shared understanding,
00:05:18.180 | it may fall flat, but I don't know.
00:05:20.100 | - No, I'm pretty confident that for the majority of the world,
00:05:24.780 | even if you tell them how it works,
00:05:26.940 | it will not matter,
00:05:28.580 | especially if the machine herself says
00:05:32.620 | that she is conscious.
00:05:34.940 | - That's very possible.
00:05:35.780 | - So you, the scientist who made the machine,
00:05:38.100 | are saying that this is how the algorithm works.
00:05:42.360 | Everybody will just assume you're lying
00:05:43.960 | and that there's a conscious being there.
00:05:45.640 | - So you're deep into the science fiction genre now.
00:05:48.760 | - I don't think it is, it's actually psychology.
00:05:51.520 | I think it's not science fiction.
00:05:53.280 | I think it's reality.
00:05:54.400 | I think it's a really powerful one
00:05:56.320 | that we'll have to be exploring in the next few decades.
00:06:00.240 | It's a very interesting element of intelligence.
00:06:03.160 | (upbeat music)