Noam Chomsky: Deepest Property of Language


00:00:00.000 | - What are the most beautiful or fascinating aspects
00:00:04.320 | of language or ideas in linguistics or cognitive science
00:00:07.760 | that you've seen in a lifetime of studying language
00:00:11.080 | and studying the human mind?
00:00:13.200 | - Well, I think the deepest property of language,
00:00:18.200 | and the most puzzling property, that's been discovered
00:00:21.920 | is what is sometimes called structure dependence.
00:00:26.280 | We now understand it pretty well,
00:00:28.640 | but it was puzzling for a long time.
00:00:30.960 | I'll give you a concrete example.
00:00:32.640 | So suppose you say,
00:00:34.680 | the guy who fixed the car carefully packed his tools.
00:00:40.880 | It's ambiguous.
00:00:42.120 | He could fix the car carefully or carefully pack his tools.
00:00:46.960 | Suppose you put carefully in front:
00:00:50.080 | carefully, the guy who fixed the car packed his tools.
00:00:54.880 | Then it's carefully packed, not carefully fixed.
00:00:58.400 | And in fact, you do that even if it makes no sense.
00:01:01.320 | So suppose you say: carefully,
00:01:03.640 | the guy who fixed the car is tall.
00:01:07.160 | You have to interpret it as carefully he's tall,
00:01:10.880 | even though that doesn't make any sense.
00:01:13.280 | And notice that that's a very puzzling fact
00:01:16.200 | because you're relating carefully,
00:01:19.080 | not to the linearly closest verb,
00:01:22.640 | but to the linearly more remote verb.
00:01:26.320 | Linear closeness is an easy computation,
00:01:31.320 | but here you're doing
00:01:32.760 | what looks like a more complex computation.
00:01:35.800 | You're doing something that's taking you
00:01:38.520 | essentially to the more remote thing.
00:01:40.920 | Now, if you look at the actual structure
00:01:45.680 | of the sentence, where the phrases are and so on,
00:01:49.200 | it turns out you're picking out the structurally closest thing,
00:01:53.240 | but the linearly more remote thing.
00:01:57.000 | But notice that what's linear is 100% of what you hear.
00:02:04.200 | You never hear structure; you can't.
00:02:06.960 | And incidentally,
00:02:11.200 | this is universal: all constructions, all languages.
00:02:14.580 | What we're compelled to do is carry out
00:02:14.580 | what looks like the more complex computation
00:02:17.720 | on material that we never hear,
00:02:21.280 | and we ignore 100% of what we hear
00:02:24.440 | and the simplest computation.
00:02:26.560 | But by now there's even a neural basis for this
00:02:29.720 | that's somewhat understood.
00:02:31.920 | And there are good theories by now
00:02:33.440 | that explain why it's true.
00:02:35.660 | That's a deep insight into the surprising nature
00:02:40.040 | of language with many consequences.
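
To make the linear-versus-structural contrast concrete, here is a minimal sketch in Python using a toy, hand-built constituency tree for "carefully, the guy who fixed the car packed his tools." The tree shape and labels are illustrative assumptions, not the output of a real parser or the analysis Chomsky has in mind. It compares each verb's linear distance from the fronted adverb with its depth in the tree, a simple proxy for structural closeness.

```python
# Toy illustration of linear vs. structural closeness.
# Assumption: a hand-built constituency tree, not a real parse.

SENTENCE = "carefully the guy who fixed the car packed his tools".split()

# Tree nodes are (label, children); leaves are (label, word).
TREE = ("S",
        [("ADV", "carefully"),
         ("NP",
          [("DET", "the"),
           ("N", "guy"),
           ("RC",                     # relative clause: "who fixed the car"
            [("PRON", "who"),
             ("V", "fixed"),
             ("NP", [("DET", "the"), ("N", "car")])])]),
         ("VP",
          [("V", "packed"),
           ("NP", [("DET", "his"), ("N", "tools")])])])

def depth_of(node, word, depth=0):
    """Depth (edges from the root) of the leaf carrying `word`, or None."""
    label, body = node
    if isinstance(body, str):
        return depth if body == word else None
    for child in body:
        found = depth_of(child, word, depth + 1)
        if found is not None:
            return found
    return None

for verb in ("fixed", "packed"):
    linear = SENTENCE.index(verb) - SENTENCE.index("carefully")
    structural = depth_of(TREE, verb)
    print(f"{verb}: {linear} words from 'carefully', depth {structural}")

# Prints:
#   fixed: 4 words from 'carefully', depth 3
#   packed: 7 words from 'carefully', depth 2
# "fixed" is the linearly closer verb, but "packed" sits higher in the
# tree, outside the relative clause, so it is structurally closer to the
# clause-level adverb, and it is the verb "carefully" is understood to
# modify.
```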