Judea Pearl: Concerns for AGI | AI Podcast Clips


00:00:00.000 | - Do you have concerns about the future of AI?
00:00:05.000 | All the different trajectories of all of our research.
00:00:09.940 | - Yes.
00:00:10.980 | - Where's your hope?
00:00:12.060 | Where the movement heads, where are your concerns?
00:00:14.660 | - I'm concerned because I know we are building
00:00:18.300 | a new species that has the capability of exceeding us,
00:00:27.100 | exceeding our capabilities, and can breed itself
00:00:31.580 | and take over the world, absolutely.
00:00:33.860 | It's a new species that is uncontrolled.
00:00:37.940 | We don't know the degree to which we control it;
00:00:40.380 | we don't even understand what it means
00:00:42.940 | to be able to control this new species.
00:00:45.480 | So I'm concerned.
00:00:48.020 | I don't have anything to add to that
00:00:51.420 | because it's such a gray area; it's unknown.
00:00:56.420 | It never happened in history.
00:00:57.980 | The only time anything like it happened
00:01:04.540 | was the evolution of human beings.
00:01:06.280 | And it wasn't very successful, was it?
00:01:09.760 | Some people say it was a great success.
00:01:12.980 | - For us it was, but a few creatures
00:01:16.660 | along the way would not agree.
00:01:18.960 | So because it's such a gray area,
00:01:23.340 | there's nothing else to say.
00:01:25.260 | - We have a sample of one.
00:01:27.100 | - Sample of one.
00:01:28.380 | - That's us.
00:01:29.220 | - But some people would look at you and say,
00:01:35.140 | yeah, but we're looking to you to help us
00:01:40.060 | make sure that sample two works out okay.
00:01:43.460 | - We have more than a sample of one.
00:01:45.140 | We have theories.
00:01:45.980 | And that's good.
00:01:48.980 | We don't need to be statisticians.
00:01:51.020 | So a sample of one doesn't mean poverty of knowledge.
00:01:55.700 | It doesn't.
00:01:56.740 | A sample of one plus a theory,
00:01:59.100 | a conjectural theory of what could happen:
00:02:02.040 | that we do have.
00:02:04.660 | But I really feel helpless in contributing to this argument
00:02:09.660 | because I know so little and my imagination is limited
00:02:15.060 | and I know how much I don't know
00:02:20.380 | But I'm concerned.