Peter Norvig: We Are Seduced by Our Low-Dimensional Metaphors | AI Podcast Clips
Any time you use neural networks, any time you learn from data, form representation from data in an automated way, it's much harder to explain, in terms of how this neural network sees the world, why it succeeds so brilliantly in so many cases and fails so miserably in surprising ways. Can simply more data, better data, more organized data fix some of that?
So I prefer to talk about trust, and validation, and verification, rather than just about explainability. We don't want to use these systems unless we trust them, and an explanation is one tool toward that. We're used to dealing with that with other people, and with organizations, and corporations, and so on. But then you have no guarantee that that explanation is the real reason. Say the bank tells you that you didn't get the loan because you didn't have enough collateral. And that may be true, or it may be true that they just didn't like you for some other reason. And that's true whether the decision was made by an AI system or by a person. So I want to be able to have a conversation, to go back and forth and say, well, you gave this explanation, but what about this? And what would have happened if this had happened instead? And I think we need testing of various kinds. For instance, was the decision made based on my religion, or skin color, or whatever?
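One concrete form that kind of testing can take, as a minimal sketch of my own rather than anything described in the clip, is a counterfactual check: re-score the same application with only a protected attribute changed and see whether the decision moves. The model, feature layout, and values below are all made up for illustration.

```python
# Hypothetical counterfactual check: does flipping a protected attribute
# change a trained model's loan decision? Everything here is illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy data: columns are [income, collateral, protected_attribute].
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # approvals driven only by the first two columns
model = LogisticRegression().fit(X, y)

applicant = np.array([[0.2, -0.1, 0.0]])  # one application
counterfactual = applicant.copy()
counterfactual[0, 2] = 1.0                # change only the protected attribute

print(model.predict(applicant), model.predict(counterfactual))
```

If the two predictions differ, then "you didn't have enough collateral" clearly isn't the whole story behind the decision.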
So we thought we were doing pretty well at object recognition in images. We said, look, we're at pretty close to human-level performance. And then you start seeing these adversarial images. Humans can be fooled too, but in ways that are different than the attacks on the machines. But the attacks on the machines are so striking, they really change the way you think about what we've done.
And the way I think about it is, I think part of the problem is we're seduced by our low-dimensional metaphors. You picture a flat map where the cat images are clustered over here and the dog images are clustered over there, and maybe there's a tiny little spot in the middle where you can't tell the difference. And if you believe that metaphor, then you say, we're almost there, and there's only going to be a couple of adversarial images. And what you should really say is it's not a 2D flat space, it's a very high-dimensional space. And a cat is this string that goes out in this crazy path, and if you step a little bit off the path in any direction, you have no idea what's going to happen. So the adversarial examples weren't really a new failure; they were an understanding of what the models are actually doing. And now we can start exploring how do you fix that.
Yeah, validating the robustness of the system, and so on.
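To make the high-dimensional intuition concrete, here is a small sketch of my own, not something from the conversation: for a toy linear scorer over a million input dimensions, nudging every dimension by a tiny amount in the worst-case direction shifts the score by that amount times the L1 norm of the weights, which is enormous even though each individual change is imperceptible. This is essentially the linear-model argument often used to explain gradient-sign-style attacks; the weights, input, and threshold below are all hypothetical.

```python
# A toy linear "cat vs. dog" scorer in a very high-dimensional input space.
# A per-dimension nudge of size eps, aligned against the weights, moves the
# score by eps * sum(|w|), which dwarfs the typical score when d is large.
import numpy as np

rng = np.random.default_rng(0)
d = 1_000_000                     # "a million-dimensional space"
w = rng.normal(size=d)            # weights of the toy scorer (hypothetical)
x = rng.normal(size=d)            # one flattened input

def score(v):
    return w @ v                  # positive -> "cat", negative -> "dog"

eps = 0.01                        # tiny, imperceptible change per dimension
x_adv = x - eps * np.sign(w)      # step slightly off the path, against the score

print(score(x))                   # typically on the order of +/- 1,000
print(score(x_adv))               # shifted down by eps * sum(|w|), roughly 8,000
```

The same arithmetic is why a perturbation can be invisible to a person and still move an image far across a model's decision boundary.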
Do you think we're a little too hard on our robots in terms of the standards we hold them to? There's a dance in nonverbal and verbal communication between humans. If we apply the same kind of standard in terms of humans, we trust each other pretty quickly. You and I haven't met before, and there's some degree of trust that nothing's going to go crazy wrong. And yet with AI systems, we seem to approach them through skepticism, always, always. It's like they have to prove through a lot of hard work that they're even worthy of an inkling of our trust. How do we break that barrier, close that gap?
Just listening to my friend Mark Moffett, who is a naturalist, he says the most amazing thing about humans is that you can walk into a coffee shop where there are lots of people around you that you've never met before, and nothing bad happens. Other primates can't do that; as soon as there are some that aren't from my tribe, bad things happen. Especially in a coffee shop, where there's delicious food around. We still go to war, we still do terrible things, but for the most part, we've learned to trust each other and live together. So that's going to be important for our AI systems as well.
And also, I think a lot of the emphasis is on AI, but in many cases, AI is part of the technology rather than the whole story. So a lot of what we've seen is more due to communications technology than to AI. But the reason we're able to have any kind of system at all is that we've got the communications, so that we're collecting the data and so that we can reach lots of people. I think that's a bigger change that we're dealing with.