
Donald Knuth: Algorithms, Complexity, and The Art of Computer Programming | Lex Fridman Podcast #62


Chapters

0:00 Introduction
3:45 Donald Knuth Interview
6:30 Predicting the Future
14:26 Literate programming
15:49 Literary influences
22:01 Informal vs formal
23:04 How difficult is a design
25:42 Using data to construct algorithms
26:50 Accessibility of algorithms
33:01 Graph theory
36:49 Problems in combinatorics
39:16 Thinking and writing process
42:13 Some days harder than others
44:51 Macros
45:53 Writing
48:36 Art
50:26 Transformational surprises
58:03 Algorithms


00:00:00.000 | The following is a conversation with Donald Knuth,
00:00:03.440 | one of the greatest and most impactful computer scientists
00:00:06.640 | and mathematicians ever.
00:00:09.280 | He's the recipient of the 1974 Turing Award,
00:00:13.500 | considered the Nobel Prize of computing.
00:00:16.000 | He's the author of the multi-volume work,
00:00:18.680 | the magnum opus, "The Art of Computer Programming."
00:00:23.320 | He made several key contributions to the rigorous analysis
00:00:26.880 | of computational complexity of algorithms,
00:00:29.320 | including the popularization of asymptotic notation
00:00:33.480 | that we all affectionately know as the big O notation.
00:00:37.560 | He also created the TeX typesetting system,
00:00:40.840 | which most computer scientists, physicists, mathematicians,
00:00:44.460 | and scientists and engineers in general
00:00:47.000 | use to write technical papers and make them look beautiful.
00:00:51.740 | I can imagine no better guest to end 2019 with than Don,
00:00:56.780 | one of the kindest, most brilliant people in our field.
00:01:00.560 | This podcast was recorded many months ago.
00:01:03.120 | It's one I avoided because perhaps counterintuitively,
00:01:06.440 | the conversation meant so much to me.
00:01:08.960 | If you can believe it,
00:01:10.000 | I knew even less about recording back then,
00:01:12.480 | so the camera angle is a bit off.
00:01:14.320 | I hope that's okay with you.
00:01:16.240 | The office space was a bit cramped for filming,
00:01:19.400 | but it was a magical space where Don does most of his work.
00:01:24.120 | It meant a lot to me that he would welcome me into his home.
00:01:27.400 | It was quite a journey to get there.
00:01:28.920 | As many people know, he doesn't check email,
00:01:31.700 | so I had to get creative.
00:01:33.500 | The effort was worth it.
00:01:35.740 | I've been doing this podcast on the side
00:01:37.420 | for just over a year.
00:01:38.700 | Sometimes I had to sacrifice a bit of sleep,
00:01:41.260 | but always happy to do it
00:01:42.740 | and to be part of an amazing community of curious minds.
00:01:46.600 | Thank you for your kind words of support
00:01:48.940 | and for the interesting discussions,
00:01:50.620 | and I look forward to many more of those in 2020.
00:01:54.400 | This is the Artificial Intelligence Podcast.
00:01:57.440 | If you enjoy it, subscribe on YouTube,
00:01:59.800 | give it five stars on Apple Podcast,
00:02:01.820 | follow on Spotify, support on Patreon,
00:02:04.520 | or simply connect with me on Twitter
00:02:06.560 | at Lex Fridman, spelled F-R-I-D-M-A-N.
00:02:10.640 | I recently started doing ads at the end of the introduction.
00:02:13.640 | I'll do one or two minutes after introducing the episode
00:02:16.320 | and never any ads in the middle
00:02:17.920 | that break the flow of the conversation.
00:02:19.960 | I hope that works for you
00:02:21.340 | and doesn't hurt the listening experience.
00:02:23.700 | I provide timestamps for the start of the conversation
00:02:26.020 | that you can skip to,
00:02:27.100 | but it helps if you listen to the ad
00:02:29.460 | and support this podcast
00:02:30.980 | by trying out the product or service being advertised.
00:02:34.260 | This show is presented by Cash App,
00:02:36.540 | the number one finance app in the App Store.
00:02:39.100 | I personally use Cash App to send money to friends,
00:02:41.860 | but you can also use it to buy, sell,
00:02:44.180 | and deposit Bitcoin in just seconds.
00:02:46.620 | Cash App also has a new investing feature.
00:02:49.740 | You can buy fractions of a stock, say $1 worth,
00:02:52.720 | no matter what the stock price is.
00:02:54.960 | Broker services are provided by Cash App Investing,
00:02:57.800 | a subsidiary of Square and a member of SIPC.
00:03:01.520 | I'm excited to be working with Cash App
00:03:03.460 | to support one of my favorite organizations called FIRST,
00:03:06.860 | best known for their FIRST Robotics and Lego competitions.
00:03:10.480 | They educate and inspire hundreds of thousands of students
00:03:14.200 | in over 110 countries
00:03:16.000 | and have a perfect rating on Charity Navigator,
00:03:18.520 | which means that donated money
00:03:19.840 | is used to maximum effectiveness.
00:03:23.240 | When you get Cash App from the App Store or Google Play
00:03:26.460 | and use code LEXPODCAST, you'll get $10
00:03:30.120 | and Cash App will also donate $10 to FIRST,
00:03:32.920 | which again is an organization
00:03:34.800 | that I've personally seen inspire girls and boys
00:03:37.640 | to dream of engineering a better world.
00:03:40.580 | And now here's my conversation with Donald Knuth.
00:03:45.480 | In 1957 at Case Tech, you were once allowed
00:03:50.480 | to spend several evenings with an IBM 650 computer,
00:03:55.480 | as you've talked about in the past,
00:03:57.040 | and you fell in love with computing then.
00:03:59.040 | - Yeah. - Can you take me back
00:04:01.620 | to that moment with the IBM 650?
00:04:05.040 | What was it that grabbed you about that computer?
00:04:09.080 | - So the IBM 650 was this machine
00:04:14.280 | that, well, it didn't fill a room,
00:04:16.760 | but it was big and noisy.
00:04:20.360 | But when I first saw it, it was through a window
00:04:23.040 | and there were just a lot of lights flashing on it.
00:04:26.080 | And I was a freshman.
00:04:28.720 | I had a job with the statistics group
00:04:34.520 | and I was supposed to punch cards for data
00:04:38.760 | and then sort them on another machine.
00:04:40.380 | But then they got this new computer,
00:04:43.400 | came in and it had interesting lights.
00:04:48.120 | Okay, so, well, but I had a key to the building
00:04:51.520 | so I could get in and look at it
00:04:53.520 | and got a manual for it.
00:04:55.840 | And my first experience was based on the fact
00:04:58.600 | that I could punch cards, basically,
00:05:00.400 | which was a big thing for the idea.
00:05:02.440 | But the IBM 650 was big in size,
00:05:06.480 | but incredibly small in power.
00:05:10.920 | - In resources. - In memory.
00:05:12.840 | It had 2,000 words of memory
00:05:16.840 | and a word of memory was 10 decimal digits plus a sign.
00:05:20.360 | And it would do, to add two numbers together,
00:05:23.960 | you could probably expect that would take,
00:05:26.540 | I'll say three milliseconds.
00:05:30.120 | So-- - It's still pretty fast.
00:05:31.920 | It's the memory is the constraint.
00:05:33.560 | The memory is the problem.
00:05:34.680 | - That was why it took three milliseconds,
00:05:36.880 | because it took five milliseconds
00:05:38.640 | for the drum to go around.
00:05:40.200 | And you had to wait, I don't know, five cycle times.
00:05:45.120 | If you have an instruction,
00:05:47.320 | one position on the drum,
00:05:49.040 | then it would be ready to read the data
00:05:50.640 | for the instruction.
00:05:51.720 | You know, go three notches.
00:05:55.720 | The drum is 50 cycles around
00:05:57.840 | and you go three cycles and you can get the data
00:06:00.720 | and then you can go another three cycles
00:06:02.160 | and get to your next instruction,
00:06:04.520 | if the instruction is there.
00:06:05.600 | Otherwise, you spin until you get to the right place.
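The drum arithmetic Don describes here can be sketched as a small model. This is a present-day illustration, not period code; the figures — 50 word positions per revolution and roughly 5 ms per revolution — are simply the ones quoted in the conversation, and the layout offsets are hypothetical:

```python
# Toy model of IBM 650 drum latency, using the figures quoted above:
# 50 word positions per revolution, ~5 ms per revolution.
REVOLUTION_MS = 5.0
POSITIONS = 50
MS_PER_POSITION = REVOLUTION_MS / POSITIONS  # ~0.1 ms per word position

def wait_ms(current_pos, target_pos):
    """Rotational delay until target_pos comes under the read head."""
    gap = (target_pos - current_pos) % POSITIONS
    return gap * MS_PER_POSITION

# Well-placed code: the operand sits 3 positions past the instruction,
# and the next instruction 3 positions past the operand.
best = wait_ms(0, 3) + wait_ms(3, 6)

# Badly placed code: the word you need just went by,
# so you spin for almost a full revolution.
worst = wait_ms(0, POSITIONS - 1)
```

The point of the model is the asymmetry: careful placement costs a fraction of a millisecond per fetch, while careless placement costs nearly a full 5 ms revolution each time.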
00:06:09.360 | And we had no random access memory whatsoever
00:06:13.040 | until my senior year.
00:06:14.440 | Senior year, we got 50 words of random access memory,
00:06:17.400 | which were priceless.
00:06:18.840 | And we would move stuff up to the random access memory
00:06:23.840 | in 60 word chunks and then we would start again.
00:06:28.080 | So, subroutine wanted to go up there and--
00:06:31.120 | - Could you have predicted the future
00:06:34.320 | 60 years later of computing from then?
00:06:37.680 | - You know, in fact, the hardest question
00:06:39.120 | I was ever asked was,
00:06:40.400 | what could I have predicted?
00:06:44.600 | In other words, the interviewer asked me,
00:06:46.920 | she said, you know,
00:06:48.320 | what about computing has surprised you?
00:06:51.200 | You know, and immediately I ran,
00:06:52.720 | I rattled off a couple dozen things
00:06:54.720 | and then she said, okay, so what didn't surprise?
00:06:57.360 | And I tried for five minutes to think of something
00:07:01.200 | that I would have predicted and I couldn't.
00:07:05.360 | But let me say that this machine,
00:07:08.400 | I didn't know, well, there wasn't much else
00:07:12.120 | in the world at that time.
00:07:13.560 | The 650 was the first machine
00:07:16.560 | that there were more than a thousand of ever.
00:07:19.400 | Before that, there were, you know,
00:07:21.040 | there was each machine there might be
00:07:23.520 | a half a dozen examples, maybe--
00:07:25.440 | - The first mass market, mass produced.
00:07:28.160 | - It was the first one, yeah, done in quantity.
00:07:30.640 | And IBM didn't sell them, they rented them,
00:07:35.640 | but they rented them to universities
00:07:39.880 | that had a great deal.
00:07:44.440 | And so that's why a lot of students
00:07:48.120 | learned about computers at that time.
00:07:50.680 | - So you refer to people, including yourself,
00:07:55.760 | who gravitate toward a kind of computational thinking
00:07:59.160 | as geeks, at least I've heard you use that terminology.
00:08:03.560 | - It's true that I think there's something
00:08:06.280 | that happened to me as I was growing up
00:08:08.200 | that made my brain structure in a certain way
00:08:11.720 | that resonates with computers.
00:08:14.160 | - So there's this space of people,
00:08:16.000 | 2% of the population, you empirically estimate.
00:08:19.200 | - That's been-- - Proven?
00:08:22.840 | - Fairly constant over most of my career.
00:08:24.920 | However, it might be different now
00:08:28.200 | because kids have different experiences when they're young.
00:08:30.800 | - So what does the world look like to a geek?
00:08:35.480 | What is this aspect of thinking that is unique to--
00:08:40.480 | - That makes, yeah. - That makes a geek?
00:08:45.360 | - This is a hugely important question.
00:08:49.320 | In the '50s, IBM noticed that there were geeks
00:08:57.600 | and non-geeks, and so they tried to hire geeks
00:08:59.760 | and they put out ads for papers saying,
00:09:02.360 | if you play chess, come to Madison Avenue
00:09:04.640 | for an interview or something like this.
00:09:06.400 | They were trying for some things.
00:09:08.060 | So what is it that I find easy
00:09:12.040 | and other people tend to find harder?
00:09:14.560 | And I think there's two main things.
00:09:17.640 | One is this ability to jump levels of abstraction.
00:09:26.920 | So you see something in the large
00:09:28.560 | and you see something in the small
00:09:31.440 | and you pass between those unconsciously.
00:09:35.840 | So you know that in order to solve some big problem,
00:09:40.440 | what you need to do is add one to a certain register
00:09:44.840 | and that gets you to another step.
00:09:47.280 | And below the, I don't go down to the electron level,
00:09:51.340 | but I knew what those milliseconds were,
00:09:54.200 | what the drum was like on the 650.
00:09:56.040 | I knew how I was gonna factor a number
00:09:59.640 | or find a root of an equation or something
00:10:03.080 | because of what it was doing.
00:10:04.640 | And as I'm debugging, I'm going through,
00:10:07.100 | did I make a key punch error?
00:10:09.880 | Did I write the wrong instruction?
00:10:13.120 | Do I have the wrong thing in a register?
00:10:15.840 | And each level is different.
00:10:20.480 | And this idea of being able to see something
00:10:25.240 | at lots of levels and fluently go between them
00:10:30.000 | seems to me to be much more pronounced
00:10:32.920 | in the people that resonate with computers like I do.
00:10:37.400 | So in my books, I also don't stick just to the high level,
00:10:42.400 | but I mix low level stuff with high level
00:10:47.480 | and this means that some people think
00:10:54.320 | that I should write better books and it's probably true,
00:10:58.720 | but other people say, well, but that's,
00:11:01.520 | if you think like that,
00:11:03.820 | then that's the way to train yourself,
00:11:05.480 | like keep mixing the levels
00:11:06.920 | and learn more and more how to jump between.
00:11:10.400 | So that's the one thing.
00:11:11.480 | The other thing is that it's more of a talent
00:11:15.200 | to be able to deal with non-uniformity
00:11:21.800 | where there's case one, case two, case three,
00:11:24.220 | instead of having one or two rules that govern everything.
00:11:29.180 | So it doesn't bother me if I need,
00:11:34.680 | like an algorithm has 10 steps to it,
00:11:38.520 | each step does something else, that doesn't bother me.
00:11:41.120 | But a lot of pure mathematics is based on one or two rules,
00:11:45.460 | which are universal.
00:11:48.600 | And so this means that people like me
00:11:51.240 | sometimes work with systems
00:11:53.200 | that are more complicated than necessary
00:11:54.840 | because it doesn't bother us
00:11:56.400 | that we didn't figure out the simple rule.
00:11:59.760 | - And you mentioned that while Jacobi, Boole, Abel,
00:12:05.680 | all the mathematicians in the 19th century
00:12:07.680 | may have had symptoms of geek,
00:12:11.640 | the first 100% legit geek was Turing, Alan Turing.
00:12:16.000 | - I think he had, yeah, a lot more of this quality
00:12:21.000 | than any, just from reading the kind of stuff he did.
00:12:26.480 | - So how does Turing, what influence has Turing had on you?
00:12:31.920 | In your way of thinking? - Well, okay,
00:12:34.400 | so I didn't know that aspect of him
00:12:38.120 | until after I graduated some years.
00:12:40.400 | It, as an undergraduate, we had a class
00:12:43.840 | that talked about computability theory and Turing machines.
00:12:47.600 | And it was all, it sounded like a very specific
00:12:52.080 | kind of purely theoretical approach to stuff.
00:12:56.440 | So when, how old was I when I learned
00:12:59.360 | that he had designed machines
00:13:04.360 | and that he wrote the, you know,
00:13:07.440 | he wrote a wonderful manual for the Manchester machine
00:13:12.480 | and he invented all, you know, subroutines
00:13:17.360 | and he was a real hacker, that he had his hands dirty.
00:13:22.360 | I thought for many years
00:13:26.760 | that he had only done purely formal work.
00:13:30.440 | As I started reading his own publications,
00:13:32.960 | I could, you know, I could feel this kinship.
00:13:35.760 | And of course he had a lot of peculiarities.
00:13:41.920 | Like he wrote numbers backwards because,
00:13:44.040 | I mean, left to right instead of right to left
00:13:47.520 | because that's, it was easier for computers
00:13:50.360 | to process them that way.
00:13:51.860 | - What do you mean left to right?
00:13:54.160 | - He would write pi as, you know, nine, five,
00:13:59.160 | one, four, point three, I mean, okay.
00:14:02.400 | - Right, got it.
00:14:07.320 | - Four, one, point three.
00:14:09.600 | On the blackboard, I mean, when he,
00:14:11.400 | he had trained himself to do that
00:14:16.680 | because the computers he was working with
00:14:19.680 | worked that way inside.
00:14:21.000 | - Trained himself to think like a computer.
00:14:22.760 | Well, there you go, that's geek thinking.
00:14:24.920 | You've practiced some of the most elegant formalism
00:14:29.960 | in computer science and yet you're the creator
00:14:33.880 | of a concept like literate programming
00:14:37.000 | which seems to move closer to natural language
00:14:40.320 | type of description of programming.
00:14:43.000 | - Yeah, absolutely.
00:14:44.760 | - So how do you see those two as conflicting
00:14:47.080 | as the formalism of theory
00:14:49.000 | and the idea of literate programming?
00:14:51.480 | - So there we are in a non-uniform system
00:14:54.240 | where I don't think one size fits all
00:14:57.600 | and I don't think all truth lies in one kind of expertise.
00:15:03.320 | And so somehow, in a way you'd say my life
00:15:07.560 | is a convex combination of English and mathematics.
00:15:12.560 | - And you're okay with that.
00:15:14.520 | - And not only that, I think--
00:15:16.360 | - Thriving.
00:15:17.200 | - I wish, you know, I want my kids to be that way.
00:15:18.920 | I want, et cetera, you know.
00:15:20.880 | Use left brain, right brain at the same time,
00:15:23.920 | you got a lot more done.
00:15:25.400 | That was part of the bargain.
00:15:28.760 | - And I've heard that you didn't really read
00:15:32.600 | for pleasure until into your 30s, you know, literature.
00:15:37.120 | - That's true.
00:15:37.960 | You know more about me than I do,
00:15:39.240 | but I'll try to be consistent with what you read.
00:15:41.640 | - Yeah, no, just believe me.
00:15:43.000 | Just go with whatever story I tell you.
00:15:45.800 | It'll be easier that way.
00:15:46.840 | The conversation will be easier.
00:15:47.680 | - Right, yeah, no, that's true.
00:15:49.920 | - So I've heard mention of Philip Roth's
00:15:51.880 | American Pastoral, which I love as a book.
00:15:54.880 | I don't know if, it was mentioned as something,
00:15:59.280 | I think, that was meaningful to you as well.
00:16:02.320 | In either case, what literary books
00:16:05.840 | had a lasting impact on you?
00:16:07.320 | What literature, what poetry?
00:16:08.160 | - Yeah, okay, good question.
00:16:09.800 | So I met Roth.
00:16:11.640 | - Oh, really?
00:16:13.440 | - Well, we both got doctors from Harvard on the same day,
00:16:16.680 | so we were, yeah, we had lunch together
00:16:20.200 | and stuff like that, but he knew that,
00:16:22.400 | you know, computer books would never sell.
00:16:25.040 | Well, all right, so you say you,
00:16:30.040 | you were a teenager when you left Russia,
00:16:33.360 | so I have to say that Tolstoy
00:16:36.800 | was one of the big influences on me.
00:16:39.000 | I especially like Anna Karenina,
00:16:42.600 | not because of particularly of the plot of the story,
00:16:47.600 | but because there's this character who,
00:16:53.160 | you know, the philosophical discussions,
00:16:56.360 | you know, it's a whole way of life
00:17:01.200 | is worked out there among the characters,
00:17:03.480 | and so that I thought was especially beautiful.
00:17:08.480 | On the other hand, Dostoevsky, I didn't like at all,
00:17:12.880 | because I felt that his genius was mostly
00:17:16.200 | because he kept forgetting what he had started out to do,
00:17:19.000 | and he was just sloppy.
00:17:20.160 | I didn't think that he polished his stuff at all,
00:17:26.280 | and I tend to admire somebody
00:17:29.240 | who dots the I's and crosses the T's.
00:17:32.640 | - So the music of the prose is what you admire more than--
00:17:36.800 | - I certainly do admire the music of the language,
00:17:39.440 | which I couldn't appreciate in the Russian original,
00:17:42.360 | but I can in Victor Hugo, because it's close,
00:17:45.360 | French is much, it's closer,
00:17:46.840 | but Tolstoy I like, the same reason I like Herman Wouk
00:17:52.600 | as a novelist, I think, like his book,
00:17:57.600 | Marjorie Morningstar has a similar character in it
00:18:01.960 | who developed his own personal philosophy,
00:18:04.280 | and it goes in--
00:18:06.640 | - Was consistent.
00:18:10.160 | - Yeah, right, and it's worth pondering.
00:18:13.760 | So, yeah.
00:18:15.640 | - So you don't like Nietzsche, and--
00:18:18.040 | - Like what?
00:18:18.880 | - You don't like Friedrich Nietzsche, or--
00:18:20.560 | - Nietzsche, yeah, no, no, yeah, this has,
00:18:23.200 | I keep seeing quotations from Nietzsche,
00:18:26.880 | and they never tempt me to read any further.
00:18:30.240 | - Well, he's full of contradictions,
00:18:31.840 | so you will certainly not appreciate him.
00:18:34.400 | - But Schiller, you know, I'm trying to get across
00:18:37.920 | what I appreciate in literature,
00:18:40.520 | and part of it is, as you say, the music of the language,
00:18:46.560 | the way it flows, and take Raymond Chandler
00:18:51.320 | versus Dashiell Hammett, Dashiell Hammett's sentences
00:18:53.920 | are awful, and Raymond Chandler's are beautiful,
00:18:59.200 | they just flow, so I don't read literature
00:19:04.160 | because it's supposed to be good for me,
00:19:07.360 | or because somebody said it's great,
00:19:09.680 | but I find things that I like, I mean,
00:19:16.200 | you mentioned you were dressed like James Bond,
00:19:18.320 | so I love Ian Fleming, I think he's got,
00:19:21.640 | he had a really great gift for, if he has a golf game,
00:19:25.360 | or a game of bridge, or something,
00:19:27.760 | and this comes into his story,
00:19:29.680 | it'll be the most exciting golf game,
00:19:32.400 | or the absolute best possible hands of bridge that exist,
00:19:37.400 | and he exploits it, and tells it beautifully, so.
00:19:45.720 | - So, in connecting some things here,
00:19:49.440 | looking at literate programming,
00:19:51.280 | and being able to convey,
00:19:55.240 | encode algorithms to a computer
00:19:59.880 | in a way that mimics how humans speak,
00:20:04.180 | what do you think about natural language in general,
00:20:08.680 | and the messiness of our human world,
00:20:11.480 | about trying to express difficult things?
00:20:15.560 | - So, the idea of literate programming
00:20:17.640 | is really to try to
00:20:21.080 | understand something better by seeing it
00:20:25.840 | from at least two perspectives,
00:20:27.920 | the formal and the informal,
00:20:29.900 | if we're trying to understand a complicated thing,
00:20:32.640 | if we can look at it in different ways,
00:20:34.320 | and so, this is in fact the key to technical writing,
00:20:37.920 | a good technical writer,
00:20:40.920 | trying not to be obvious about it,
00:20:42.320 | but says everything twice,
00:20:44.920 | formally and informally, or maybe three times,
00:20:47.520 | but you try to give the reader
00:20:51.600 | a way to put the concept into his own brain,
00:20:56.440 | or her own brain.
00:20:57.760 | - Is that better for the writer, or the reader, or both?
00:21:01.720 | - Well, the writer just tries to understand the reader,
00:21:06.720 | that's the goal of a writer,
00:21:08.120 | is to have a good mental image of the reader,
00:21:12.120 | and to say what the reader expects next,
00:21:15.040 | and to impress the reader with what has impressed the writer,
00:21:20.040 | why something is interesting.
00:21:24.720 | So, when you have a computer program,
00:21:26.100 | we try to, instead of looking at it as something
00:21:29.400 | that we're just trying to give an instruction
00:21:31.120 | to the computer, what we really wanna be
00:21:33.260 | is giving insight to the person
00:21:36.280 | who's gonna be maintaining this program,
00:21:40.840 | or to the programmer himself when he's debugging it,
00:21:44.760 | as to why this stuff is being done.
00:21:47.160 | And so, all the techniques of exposition
00:21:50.640 | that a teacher uses, or a book writer uses,
00:21:54.040 | make you a better programmer,
00:21:55.400 | if your program is gonna be not just a one-shot deal.
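The practice Don describes — saying things both formally and informally, for the maintainer's benefit — can be illustrated with a toy example. This is a sketch in the spirit of literate programming, not his WEB system; the function and its commentary are invented here:

```python
def binary_search(sorted_items, target):
    """Return an index of target in sorted_items, or -1 if absent."""
    # Informally: keep shrinking a window that must contain the target
    # if it is present at all. Formally, the loop invariant is that the
    # target can only lie within sorted_items[lo:hi].
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] < target:
            # Everything at or below mid is too small: informally,
            # "throw away the left half and keep looking right."
            lo = mid + 1
        elif sorted_items[mid] > target:
            # Mirror image: throw away the right half.
            hi = mid
        else:
            return mid
    # The window is empty, so the invariant says the target isn't there.
    return -1
```

Each formal step is paired with the informal "why," so a future reader gets the same idea from two directions — which is exactly the point made above about exposition.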
00:22:00.400 | - So, how difficult is that?
00:22:04.680 | Do you see hope for the combination of informal and formal,
00:22:10.360 | for the programming task?
00:22:12.840 | - Yeah, I'm the wrong person to ask, I guess,
00:22:15.960 | because I'm a geek, but I think for a geek it's easy.
00:22:19.080 | Some people have difficulty writing,
00:22:24.880 | and it might be because there's something
00:22:29.360 | in their brain structure that makes it hard
00:22:31.800 | for them to write, or it might be something,
00:22:34.440 | just that they haven't had enough practice.
00:22:36.600 | I'm not the right one to judge,
00:22:39.840 | but I don't think you can teach any person
00:22:42.080 | any particular skill.
00:22:44.480 | I do think that writing is half of my life,
00:22:49.400 | and so I put it together in the literate program.
00:22:53.040 | Even when I'm writing a one-shot program,
00:22:55.160 | I write it in a literate way,
00:23:00.000 | because I get it right faster that way.
00:23:04.640 | - Now, does it get compiled automatically?
00:23:07.160 | - Or?
00:23:08.600 | So, I guess on the technical side,
00:23:10.440 | my question was how difficult is it to design a system
00:23:14.760 | where much of the programming is done informally?
00:23:18.320 | - Informally?
00:23:20.560 | - Yeah, informally.
00:23:22.120 | - I think whatever works to make it understandable is good,
00:23:27.120 | but then you have to also understand how informal it is.
00:23:33.280 | You have to know the limitations.
00:23:36.560 | So, by putting the formal and informal together,
00:23:40.080 | this is where it gets locked into your brain.
00:23:45.080 | You can say informally,
00:23:49.960 | well, I'm working on a problem right now, so.
00:23:52.880 | - Let's go there.
00:23:53.720 | Can you give me an example
00:23:55.960 | of connecting the informal and the formal?
00:24:00.000 | - Well, it's a little too complicated an example.
00:24:03.400 | There's a puzzle that's self-referential.
00:24:06.920 | It's called a Japanese arrow puzzle,
00:24:09.240 | and you're given a bunch of boxes.
00:24:13.640 | Each one points north, east, south, or west,
00:24:16.760 | and at the end, you're supposed to fill in each box
00:24:19.680 | with the number of distinct numbers that it points to.
00:24:24.680 | So, if I put a three in a box, that means that,
00:24:28.120 | and it's pointing to five other boxes,
00:24:30.020 | that means that there's gonna be three different numbers
00:24:32.180 | in those five boxes.
00:24:33.680 | And those boxes are pointing,
00:24:36.920 | one of them might be pointing to me,
00:24:38.240 | one of them might be pointing the other way,
00:24:40.640 | but anyway, I'm supposed to find a set of numbers
00:24:45.440 | that obeys this complicated condition
00:24:48.800 | that each number counts how many distinct numbers
00:24:52.280 | it points to.
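The rule of the arrow puzzle as Don states it can be captured in a short checker. This sketch is invented here (it is not the formulation of the solver he mentions): boxes are indexed 0..n-1, `targets[i]` lists the boxes that box i's arrow points to, and a filling is valid when every box holds the count of distinct values among its targets:

```python
def is_valid(values, targets):
    """True iff every box's value equals the number of distinct
    values among the boxes its arrow points to."""
    return all(
        values[i] == len({values[j] for j in targets[i]})
        for i in range(len(values))
    )

# Toy instance: four boxes in a row, every arrow pointing east,
# so box i sees all boxes to its right and the last box sees nothing.
targets = [[1, 2, 3], [2, 3], [3], []]
# One valid filling: the boxes see 3, 2, 1, and 0 distinct values.
values = [3, 2, 1, 0]
```

A checker like this also shows why the constraint is self-referential: each box's value depends on the values of the boxes it points at, which may in turn depend on it.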
00:24:53.120 | And so, a guy sent me his solution to this problem
00:24:58.800 | where he presents formal statements
00:25:03.800 | that say either this is true, or this is true,
00:25:07.500 | or this is true, and so I try to render
00:25:10.780 | that formal statement informally,
00:25:13.580 | and I try to say, I contain a three,
00:25:17.380 | and the guys I'm pointing to contain the numbers
00:25:22.380 | one, two, and six.
00:25:25.300 | So, by putting it informally, and also,
00:25:27.220 | I convert it into a dialogue statement,
00:25:30.640 | that helps me understand the logical statement
00:25:35.580 | that he's written down as a string of numbers
00:25:37.660 | in terms of some abstract variables that he had.
00:25:41.100 | - That's really interesting.
00:25:42.100 | So, maybe an extension of that,
00:25:45.340 | there has been a resurgence in computer science
00:25:48.140 | and machine learning and neural networks.
00:25:52.500 | So, using data to construct algorithms.
00:25:56.580 | So, it's another way to construct algorithms, really.
00:25:58.900 | - Yes, exactly.
00:25:59.740 | - If you can think of it that way.
00:26:01.460 | So, as opposed to natural language to construct algorithms,
00:26:06.220 | use data to construct algorithms.
00:26:08.260 | So, what's your view of this branch of computer science
00:26:13.260 | where data is almost more important
00:26:15.900 | than the mechanism of the algorithm?
00:26:18.700 | - It seems to be suited to a certain kind of non-geek,
00:26:24.100 | and which is probably why it's taken off.
00:26:29.100 | It has its own community that really resonates with that.
00:26:34.620 | But it's hard to trust something like that
00:26:38.740 | because nobody, even the people who work with it,
00:26:43.460 | they have no idea what has been learned.
00:26:46.520 | - That's a really interesting thought,
00:26:49.620 | that it makes algorithms more accessible
00:26:54.620 | to a different community, a different type of brain.
00:26:58.660 | - Yep.
00:26:59.580 | - And that's really interesting
00:27:01.100 | 'cause just like literate programming,
00:27:04.340 | perhaps could make programming more accessible
00:27:08.860 | to a certain kind of brain.
00:27:10.620 | - There are people who think it's just a matter of education
00:27:13.700 | and anybody can learn to be a great programmer,
00:27:16.380 | anybody can learn to be a great skier.
00:27:20.760 | I wish that were true, but I know that there's a lot
00:27:26.780 | of things that I've tried to do,
00:27:28.460 | and I was well motivated and I kept trying
00:27:32.220 | to build myself up, and I never got past a certain level.
00:27:36.140 | I can't view, for example, I can't view
00:27:39.100 | three-dimensional objects in my head.
00:27:44.180 | I have to make a model and look at it
00:27:46.580 | and study it from all points of view,
00:27:48.540 | and then I start to get some idea,
00:27:50.700 | but other people are good at four dimensions.
00:27:53.180 | - Physicists.
00:27:55.860 | - Yeah.
00:27:56.700 | - So let's go to the art of computer programming.
00:28:02.900 | In 1962, you set the table of contents
00:28:08.280 | for this magnum opus, right?
00:28:12.980 | - Yep.
00:28:13.860 | - It was supposed to be a single book with 12 chapters.
00:28:16.700 | Now today, what is it, 57 years later,
00:28:21.700 | you're in the middle of volume four of seven.
00:28:25.300 | - In the middle of volume 4B, more precisely.
00:28:30.220 | - Can I ask you for an impossible task,
00:28:33.020 | which is try to summarize the book so far,
00:28:37.620 | maybe by giving a little examples.
00:28:41.660 | So from the sorting and the search
00:28:43.660 | and the combinatorial algorithms,
00:28:46.340 | if you were to give a summary, a quick elevator summary.
00:28:50.540 | - Elevator, that's great, yeah, right.
00:28:52.540 | - But depending how many floors there are in the building.
00:28:54.660 | - Yeah, the first volume called Fundamental Algorithms
00:28:58.480 | talks about something that you can't,
00:29:02.420 | the stuff you can't do without.
00:29:04.040 | You have to know the basic concepts of what is a program
00:29:10.060 | and what is an algorithm.
00:29:11.700 | And it also talks about a low-level machine
00:29:15.940 | so you can have some kind of an idea what's going on.
00:29:19.980 | And it has basic concepts of input/output and subroutines.
00:29:24.980 | - Induction.
00:29:27.860 | - Induction, right, mathematical preliminary.
00:29:32.020 | So the thing that makes my book different
00:29:34.180 | from a lot of others is that I try to,
00:29:39.260 | not only present the algorithm,
00:29:40.820 | but I try to analyze them,
00:29:42.220 | which means that quantitatively I say
00:29:44.500 | not only does it work, but it works this fast.
00:29:46.860 | And so I need math for that.
00:29:49.600 | And then there's the standard way to structure data inside
00:29:52.660 | and represent information in the computer.
00:29:56.100 | So that's all volume one.
00:29:57.580 | Volume two talks, it's called Seminumerical Algorithms.
00:30:01.540 | And here we're writing programs,
00:30:04.220 | but we're also dealing with numbers.
00:30:07.300 | Algorithms deal with any kinds of objects,
00:30:10.540 | but when objects are numbers,
00:30:13.540 | well, then we have certain special paradigms
00:30:17.980 | that apply to things that involve numbers.
00:30:20.780 | And so there's arithmetic on numbers
00:30:24.860 | and there's matrices full of numbers,
00:30:27.500 | there's random numbers,
00:30:29.100 | and there's power series full of numbers.
00:30:31.780 | There's different algebraic concepts
00:30:34.420 | that have numbers in structured ways.
00:30:37.300 | - And arithmetic in the way a computer
00:30:38.860 | would think about arithmetic.
00:30:40.100 | So floating point--
00:30:41.700 | - Floating point arithmetic, high precision arithmetic,
00:30:44.940 | not only addition, subtraction, multiplication,
00:30:47.140 | but also comparison of numbers.
00:30:49.260 | So then volume three talks about--
00:30:53.900 | - I like that one, sort and search.
00:30:55.380 | - Sorting and searching.
00:30:56.220 | - I love sorting.
00:30:57.140 | - Right, so here we're not dealing necessarily with numbers
00:31:01.020 | because you sort letters and other objects
00:31:03.780 | and searching we're doing all the time
00:31:05.340 | with Google nowadays, but I mean, we have to find stuff.
00:31:09.540 | So again, algorithms that underlie
00:31:13.500 | all kinds of applications,
00:31:15.860 | none of these volumes is about a particular application,
00:31:19.140 | but the applications are examples
00:31:20.820 | of why people want to know about sorting,
00:31:23.620 | why people want to know about random numbers.
00:31:26.660 | So then volume four goes into combinatorial algorithms.
00:31:31.020 | This is where we have zillions of things to deal with.
00:31:34.900 | And here we keep finding cases
00:31:39.180 | where one good idea can make something go more
00:31:43.700 | than a million times faster.
00:31:45.220 | And we're dealing with problems
00:31:49.420 | that are probably never gonna be solved efficiently,
00:31:53.780 | but that doesn't mean we give up on them.
00:31:56.940 | And we have this chance to have good ideas
00:32:00.420 | and go much, much faster on them.
00:32:02.140 | So that's combinatorial algorithms.
00:32:04.740 | And those are the ones that are,
00:32:06.820 | I mean, you say sorting is most fun for you.
00:32:09.740 | - Well, and satisfiability too.
00:32:10.580 | - It's true it's fun, but combinatorial algorithms
00:32:13.580 | are the ones that I always enjoyed the most
00:32:18.020 | because that's when my skill at programming
00:32:20.940 | had the most payoff.
00:32:22.100 | The difference between an obvious algorithm
00:32:26.020 | that you think up first thing
00:32:28.380 | and an interesting subtle algorithm that not so obvious,
00:32:33.380 | but run circles around the other one,
00:32:37.380 | that's where computer science really comes in.
00:32:42.380 | And a lot of these combinatorial methods
00:32:46.820 | were found first in applications
00:32:49.660 | to artificial intelligence or cryptography.
00:32:54.140 | And in my case, I just liked them
00:32:58.180 | and it was associated more with puzzles.
00:33:01.380 | - Now, what do you like the most in the domain of graphs
00:33:04.100 | and graph theory?
00:33:04.980 | - Graphs are great because they're terrific models
00:33:08.620 | of so many things in the real world.
00:33:10.620 | And you throw numbers on a graph, you got a network.
00:33:14.420 | And so there you have many more things.
00:33:19.500 | But combinatorial in general is any arrangement of objects
00:33:24.500 | that has some kind of a higher structure,
00:33:30.060 | non-random structure.
00:33:33.260 | And so is it possible to put something together
00:33:37.740 | satisfying all these conditions?
00:33:39.380 | Like I mentioned arrows a minute ago,
00:33:41.820 | is there a way to put these numbers on a bunch of boxes
00:33:46.100 | that are pointing to each other?
00:33:47.140 | Is that gonna be possible at all?
00:33:48.780 | - That's volume four.
00:33:49.980 | - That's volume four.
00:33:52.060 | - What does the future hold?
00:33:52.900 | - Volume 4A was part one.
00:33:54.740 | And what happened was in 1962,
00:33:58.700 | when I started writing down a table of contents,
00:34:03.420 | it wasn't gonna be a book about computer programming
00:34:06.900 | in general, it was gonna be a book
00:34:08.300 | about how to write compilers.
00:34:10.580 | And I was asked to write a book explaining
00:34:14.780 | how to write a compiler.
00:34:16.820 | And at that time, there were only a few dozen people
00:34:21.820 | in the world who had written compilers
00:34:24.140 | and I happened to be one of them.
00:34:25.460 | So, and I also had some experience writing
00:34:30.460 | for like the campus newspaper and things like that.
00:34:35.500 | So I said, okay, great.
00:34:38.660 | I'm the only person I know who's written a compiler
00:34:42.140 | but hasn't invented any new techniques
00:34:43.940 | for writing compilers.
00:34:45.380 | And all the other people I knew had super ideas
00:34:49.740 | but I couldn't see that they would be able to write a book
00:34:52.340 | that would describe anybody else's ideas with their own.
00:34:55.780 | So I could be the journalist and I could explain
00:34:59.780 | what all these cool ideas about compiler writing were.
00:35:04.300 | And then I started putting down, well, yeah,
00:35:07.740 | let me, you need to have a chapter about data structures.
00:35:11.100 | You need to have some introductory material.
00:35:13.860 | I want to talk about searching 'cause a compiler writer
00:35:16.860 | has to look up the variables in a symbol table
00:35:21.860 | and find out which, when you write the name of a variable
00:35:27.340 | in one place, it's supposed to be the same
00:35:31.540 | as the one you put somewhere else.
00:35:32.660 | So you need all these basic techniques
00:35:35.740 | and I kind of know some arithmetic and stuff.
00:35:39.980 | So I threw in these chapters
00:35:42.140 | and I threw in a chapter on combinatorics
00:35:44.540 | because that was what I really enjoyed programming the most
00:35:48.420 | but there weren't many algorithms known
00:35:50.380 | about combinatorial methods in 1962.
00:35:52.760 | So that was a kind of a short chapter
00:35:55.460 | but it was sort of thrown in just for fun.
00:35:58.180 | And chapter 12 was going to be actual compilers,
00:36:01.420 | applying all the stuff in chapters one to 11
00:36:04.460 | to make compilers.
00:36:06.580 | Well, okay, so that was my table of contents from 1962.
00:36:10.640 | And during the '70s, the whole field of combinatorics
00:36:15.380 | went through a huge explosion.
00:36:17.860 | People talk about a combinatorial explosion
00:36:20.020 | and they usually mean by that,
00:36:22.020 | that the number of cases goes up,
00:36:24.260 | n plus one and all of a sudden,
00:36:28.180 | your problem has gotten more than 10 times harder.
00:36:31.620 | But there was an explosion of ideas
00:36:35.340 | about combinatorics in the '70s to the point that,
00:36:40.380 | take 1975, I bet you more than half
00:36:44.600 | of all the journals of computer science
00:36:46.840 | were about combinatorial methods.
00:36:49.540 | - What kind of problems were occupying people's minds?
00:36:52.420 | What kind of problems in combinatorics?
00:36:54.780 | Was it satisfiability, graph theory?
00:36:57.780 | - Yeah, graph theory was quite dominant.
00:36:59.980 | But all of the NP-hard problems
00:37:05.420 | that you have like Hamiltonian path--
00:37:08.820 | - Traveling salesman.
00:37:09.820 | - Going beyond, yeah, going beyond graphs,
00:37:12.460 | you had operation research.
00:37:14.440 | Whenever there was a small class of problems
00:37:17.940 | that had efficient solutions
00:37:19.420 | and they were usually associated with matroid theory,
00:37:21.940 | a special mathematical construction.
00:37:24.640 | But once we went to things that involved
00:37:27.660 | three things at a time instead of two,
00:37:30.460 | all of a sudden, things got harder.
00:37:32.500 | So we had satisfiability problems
00:37:34.300 | where if you have clauses,
00:37:37.900 | every clause has two logical elements in it,
00:37:40.620 | then we can satisfy it in linear time.
00:37:43.140 | We can test for satisfiability in linear time,
00:37:45.180 | but if you allow yourself three variables in the clause,
00:37:49.060 | then nobody knows how to do it.
00:37:52.540 | So these articles were about trying to find better ways
00:37:56.860 | to solve cryptography problems and graph theory problems.
00:38:01.580 | We have lots of data,
00:38:04.320 | but we didn't know how to find the best subsets of the data,
00:38:07.180 | like with sorting, we could get the answer.
00:38:11.340 | It didn't take long.
00:38:12.940 | - So how did it continue to change from the '70s to today?
00:38:16.940 | - Yeah, so now there may be half a dozen conferences
00:38:20.700 | whose topic is combinatorics, of different kinds,
00:38:24.940 | but fortunately, I don't have to rewrite my book every month
00:38:28.540 | like I had to in the '70s.
00:38:31.100 | But still, there's a huge amount of work being done
00:38:34.020 | and people getting better ideas on these problems
00:38:37.700 | that don't seem to have really efficient solutions,
00:38:41.400 | but we can still do a lot more with them.
00:38:44.900 | And so this book that I'm finishing now is,
00:38:48.340 | I've got a whole bunch of brand new methods
00:38:51.300 | that as far as I know, there's no other book
00:38:55.540 | that covers this particular approach.
00:38:59.980 | And so I'm trying to do my best
00:39:03.500 | of exploring the tip of the iceberg
00:39:06.020 | and I try out lots of things and keep rewriting
00:39:11.020 | as I find better methods.
00:39:16.540 | - So what's your writing process like?
00:39:18.740 | What's your thinking and writing process like every day?
00:39:22.060 | What's your routine even?
00:39:25.620 | - Yeah, I guess it's actually the best question
00:39:29.760 | because I spend seven days a week doing it.
00:39:34.760 | - You're the most prepared to answer it.
00:39:36.980 | - Yeah, but okay, so the chair I'm sitting in
00:39:41.980 | is where I do--
00:39:43.460 | - It's where the magic happens.
00:39:46.700 | - Well, reading and writing,
00:39:49.220 | the chair is usually sitting over there
00:39:50.660 | where I have other books, some reference books,
00:39:53.500 | but I found this chair which was designed
00:39:59.260 | by a Swedish guy anyway.
00:40:00.400 | It turns out this is the only chair
00:40:02.040 | I can really sit in for hours and hours
00:40:03.760 | and not know that I'm in a chair.
00:40:05.640 | But then I have the standup desk right next to us
00:40:09.340 | and so after I write something with pencil and eraser,
00:40:13.920 | I get up and I type it and revise and rewrite.
00:40:18.100 | - The kernel of the idea is first put on paper.
00:40:24.580 | - Yeah.
00:40:25.420 | - That's where--
00:40:26.420 | - And I'll write maybe five programs a week,
00:40:29.680 | of course, literate programming.
00:40:33.220 | And these are, before I describe something in my book,
00:40:36.700 | I always program it to see how it's working
00:40:38.820 | and I try it a lot.
00:40:40.320 | So for example, I learned at the end of January,
00:40:45.260 | I learned of a breakthrough by four Japanese people
00:40:50.160 | who had extended one of my methods in a new direction
00:40:54.540 | and so I spent the next five days writing a program
00:40:57.960 | to implement what they did and then I,
00:41:00.300 | they had only generalized part of what I had done
00:41:04.760 | so then I had to see if I could generalize more parts of it
00:41:07.760 | and then I had to take their approach
00:41:10.620 | and I had to try it out on a couple of dozen
00:41:14.060 | of the other problems I had already worked out
00:41:16.640 | with my old methods.
00:41:18.560 | And so that took another couple of weeks
00:41:20.420 | and then I started to see the light
00:41:24.160 | and I started writing the final draft
00:41:29.080 | and then I would type it up,
00:41:32.540 | involve some new mathematical questions
00:41:34.940 | and so I wrote to my friends who might be good
00:41:37.740 | at solving those problems and they solved some of them.
00:41:42.740 | So I put that in as exercises.
00:41:45.140 | And so a month later, I had absorbed one new idea
00:41:50.380 | that I learned and I'm glad I heard about it in time
00:41:54.920 | otherwise I would have put my book out
00:41:56.760 | before I'd heard about the idea.
00:41:59.000 | On the other hand, this book was supposed to come in
00:42:01.560 | at 300 pages and I'm up to 350 now.
00:42:04.340 | That added 10 pages to the book
00:42:06.760 | but if I learn about another one,
00:42:09.280 | my publisher's gonna shoot me.
00:42:10.780 | - Well, so in that process, in that one month process,
00:42:17.060 | are some days harder than others?
00:42:19.280 | - Are some days harder than others?
00:42:20.940 | Well, yeah.
00:42:22.280 | My work is fun but I also work hard
00:42:24.340 | and every big job has parts
00:42:27.120 | that are a lot more fun than others.
00:42:29.280 | And so many days I'll say,
00:42:31.920 | why do I have to have such high standards?
00:42:34.280 | Why couldn't I just be sloppy and not try this out
00:42:37.480 | and just report the answer?
00:42:40.240 | But I know that people are counting on me to do this
00:42:45.520 | and so, okay, so okay, Don, I'll grit my teeth and do it.
00:42:50.080 | And then the joy comes out when I see that actually,
00:42:54.280 | I'm getting good results and I get,
00:42:57.400 | and even more when I see that somebody
00:43:00.160 | has actually read and understood what I wrote
00:43:02.640 | and told me how to make it even better.
00:43:05.680 | I did wanna mention something about the method.
00:43:10.680 | So I got this tablet here where,
00:43:13.760 | (stammering)
00:43:16.480 | - Wow.
00:43:17.320 | - Where I do the first writing of concepts.
00:43:22.320 | Okay, so.
00:43:23.840 | - And what language is that then?
00:43:27.800 | - Right, so take a look at it.
00:43:29.520 | But here, it says, for example,
00:43:31.840 | explain how to draw such skewed pixel diagrams.
00:43:34.960 | Okay, so I got this paper about 40 years ago
00:43:39.600 | when I was visiting my sister in Canada
00:43:42.120 | and they make tablets of paper with this nice large size
00:43:47.120 | and just the right--
00:43:48.080 | - And a very small space between lines.
00:43:49.680 | - Small spaces, yeah, yeah, take a look.
00:43:51.240 | - Maybe also just show it.
00:43:54.400 | - Yeah.
00:43:55.240 | - Yeah, wow.
00:43:58.240 | - You know, I've got these manuscripts
00:44:00.560 | going back to the '60s.
00:44:01.940 | And those are when I'm getting my ideas on paper, okay?
00:44:09.360 | But I'm a good typist.
00:44:10.680 | In fact, I went to typing school when I was in high school
00:44:14.680 | and so I can type faster than I think.
00:44:17.580 | So then when I do the editing, stand up and type,
00:44:21.040 | then I revise this and it comes out a lot different
00:44:24.800 | than what, for style and rhythm and things like that
00:44:28.880 | come out at the typing stage.
00:44:31.040 | - And you type in tech.
00:44:32.880 | - And I type in tech.
00:44:34.200 | - And can you think in tech?
00:44:37.280 | - No.
00:44:38.240 | - So.
00:44:39.080 | - To a certain extent, I have only a small number
00:44:42.440 | of idioms that I use, like, you know,
00:44:44.440 | I'm beginning a theorem, I do something for,
00:44:46.600 | displayed equation, I do something and so on.
00:44:49.880 | But I have to see it.
00:44:52.020 | - In the way that it's on paper here.
00:44:55.240 | - Yeah, right.
00:44:56.080 | - So, for example, going the other direction,
00:44:59.360 | You don't write macros, you don't think in macros.
00:45:04.360 | - Not particularly, but when I need a macro,
00:45:06.000 | I'll go ahead and do it.
00:45:09.840 | But the thing is, I also write to fit.
00:45:12.840 | I mean, I'll change something if I can save a line.
00:45:17.840 | You know, it's like haiku.
00:45:19.800 | I'll figure out a way to rewrite the sentence
00:45:21.880 | so that it'll look better on the page.
00:45:25.200 | And I shouldn't be wasting my time on that,
00:45:27.360 | but I can't resist because I know it's only another
00:45:31.680 | 3% of the time or something like that.
00:45:33.920 | - And it could also be argued that
00:45:36.080 | that is what life is about.
00:45:39.040 | - Ah, yes, in fact, that's true.
00:45:41.520 | Like I work in the garden one day a week
00:45:45.400 | and that's kind of a description of my life
00:45:48.440 | is getting rid of weeds, you know,
00:45:50.840 | removing bugs from programs.
00:45:52.920 | - So you know, a lot of writers talk about,
00:45:55.640 | you know, basically suffering, the writing processes.
00:45:58.760 | - Yeah.
00:45:59.600 | - Having, you know, it's extremely difficult.
00:46:02.120 | And I think of programming, especially,
00:46:05.160 | or technical writing that you're doing can be like that.
00:46:08.800 | Do you find yourself, methodologically,
00:46:13.440 | how do you every day sit down to do the work?
00:46:16.960 | Is it a challenge?
00:46:18.560 | You kind of say it's, you know, it's fun.
00:46:22.740 | But it'd be interesting to hear if there are non-fun parts
00:46:29.120 | that you really struggle with.
00:46:31.120 | - Yeah, so the fun comes when I'm able to put together
00:46:35.640 | ideas of two people who didn't know about each other.
00:46:39.400 | And so I might be the first person
00:46:41.760 | that saw both of their ideas.
00:46:44.240 | And so then, you know, then I get to make the synthesis
00:46:47.880 | and that gives me a chance to be creative.
00:46:52.120 | But the drudge work is where I've got to
00:46:55.640 | chase everything down to its root.
00:46:57.720 | This leads me into really interesting stuff.
00:47:00.180 | I mean, I learned about Sanskrit and I,
00:47:02.680 | you know, I try to give credit to all the authors.
00:47:06.720 | And so I write to people who know the people,
00:47:11.720 | authors if they're dead or I communicate this way.
00:47:16.000 | And I got to get the math right.
00:47:18.440 | And I've got to attack all my programs,
00:47:20.440 | try to find holes in them.
00:47:22.800 | And I rewrite the programs after I get a better idea.
00:47:26.800 | - Is there ever dead ends?
00:47:28.920 | - Dead ends, oh yeah, I throw stuff out, yeah.
00:47:32.040 | One of the things that I, I spent a lot of time
00:47:35.280 | preparing a major example based on the game of baseball.
00:47:39.360 | And I know a lot of people who,
00:47:42.500 | for whom baseball is the most important thing in the world.
00:47:46.840 | But I also know a lot of people
00:47:48.720 | for whom cricket is the most important in the world
00:47:50.920 | or soccer or something, you know.
00:47:53.960 | And I realized that if I had a big example,
00:47:58.040 | I mean, it was gonna have a fold out illustration
00:47:59.840 | and everything.
00:48:00.800 | And I was saying, well, what am I really teaching
00:48:02.640 | about algorithms here where I had this baseball example?
00:48:06.800 | And if I was a person who knew only cricket,
00:48:10.440 | wouldn't they, what would they think about this?
00:48:13.240 | And so I've ripped the whole thing out.
00:48:14.920 | But I, you know, I had something that would have
00:48:18.400 | really appealed to people who grew up with baseball
00:48:20.960 | as a major theme in their life.
00:48:24.160 | - Which is a lot of people, but.
00:48:26.240 | - But, yeah.
00:48:27.080 | - But still a minority.
00:48:28.960 | - Small minority.
00:48:29.800 | I took out bowling too.
00:48:31.140 | - Even a smaller minority.
00:48:34.680 | What's the art in the art of programming?
00:48:40.920 | Why is there, of the few words in the title,
00:48:45.080 | why is art one of them?
00:48:46.440 | - Yeah, well, that's what I wrote my Turing lecture about.
00:48:50.760 | And so when people talk about art,
00:48:54.840 | it really, I mean, what the word means is
00:48:57.840 | something that's not in nature.
00:49:02.480 | So when you have artificial intelligence,
00:49:05.040 | art comes from the same root,
00:49:09.240 | saying that this is something
00:49:11.200 | that was created by human beings.
00:49:13.760 | And then it's gotten a further meaning,
00:49:18.160 | often of fine art, which adds this beauty to the mix
00:49:21.840 | and says, you know, we have things that are artistically
00:49:24.260 | done and this means not only done by humans,
00:49:28.240 | but also done in a way that's elegant
00:49:30.840 | and brings joy and has, I guess,
00:49:35.840 | Tolstoy versus Dostoevsky.
00:49:41.960 | - Right.
00:49:44.280 | - But anyway, it's that part that says that it's done well,
00:49:50.160 | as well as not only different from nature.
00:49:54.060 | In general, then, art is what human beings
00:49:59.060 | are specifically good at.
00:50:01.660 | And when they say artificial intelligence,
00:50:04.380 | well, they're trying to mimic human beings.
00:50:06.500 | - But there's an element of fine art and beauty.
00:50:11.140 | You are one.
00:50:11.980 | - That's what I try to also say,
00:50:14.620 | that you can write a program and make a work of art.
00:50:19.860 | - So now, in terms of surprising,
00:50:24.020 | you know, what ideas in writing from search
00:50:31.740 | to the combinatorial algorithms,
00:50:34.000 | what ideas have you come across
00:50:37.100 | that were particularly surprising to you
00:50:42.100 | that changed the way you see a space of problems?
00:50:48.280 | - I get a surprise every time I have a bug
00:50:49.960 | in my program, obviously.
00:50:51.780 | But that isn't really what you're looking for.
00:50:54.160 | - More transformational than surprising.
00:50:57.120 | - For example, in volume 4A,
00:50:59.360 | I was especially surprised when I learned
00:51:02.260 | about a data structure called BDD,
00:51:04.280 | Binary Decision Diagram.
00:51:06.920 | Because I sort of had the feeling that,
00:51:10.880 | as an old timer, and I'd been programming
00:51:16.520 | since the '50s, and BDDs weren't invented
00:51:21.520 | until 1986, and here comes a brand new idea
00:51:25.520 | that revolutionizes the way to represent
00:51:27.800 | a Boolean function.
00:51:29.600 | And Boolean functions are so basic
00:51:32.080 | to all kinds of things.
00:51:33.400 | I mean, logic underlies everything we can describe,
00:51:40.560 | all of what we know in terms of logic somehow.
00:51:44.160 | And propositional logic, I thought that was
00:51:49.160 | cut and dried and everything was known.
00:51:54.200 | But here comes Randy Bryant and discovers
00:52:00.680 | that BDDs are incredibly powerful.
00:52:05.440 | That means I have a whole new section
00:52:12.680 | to the book that I never would have thought of
00:52:14.360 | until 1986, not even until the 1990s,
00:52:17.440 | when people started to use it
00:52:20.520 | for a billion dollar applications.
00:52:25.520 | And it was the standard way to design computers
00:52:29.240 | for a long time until SAT solvers came along
00:52:32.920 | in the year 2000, so that's another great big surprise.
00:52:36.920 | So a lot of these things have totally changed
00:52:40.440 | the structure of my book.
00:52:42.400 | And the middle third of volume 4B is about SAT solvers,
00:52:47.000 | and that's 300 plus pages, which is all about material,
00:52:52.000 | mostly about material that was discovered in this century.
00:52:57.920 | And I had to start from scratch
00:53:01.960 | and meet all the people in the field
00:53:04.080 | and write 15 different SAT solvers that I wrote
00:53:08.520 | while preparing that, seven of them are described
00:53:11.480 | in the book, others were from my own experience.
00:53:15.200 | - So newly invented data structures
00:53:17.920 | or ways to represent--
00:53:19.640 | - A whole new class of algorithm.
00:53:22.640 | - A whole new class of algorithm.
00:53:23.480 | - Yeah, and the interesting thing about the BDDs
00:53:26.960 | was that the theoreticians started looking at it
00:53:30.440 | and started to describe all the things
00:53:33.880 | you couldn't do with BDDs.
00:53:35.400 | And so they were getting a bad name.
00:53:41.560 | Because, okay, they were useful,
00:53:45.440 | but they didn't solve every problem.
00:53:48.280 | I'm sure that the theoreticians are,
00:53:50.440 | in the next 10 years, are gonna show
00:53:53.080 | why machine learning doesn't solve everything.
00:53:56.360 | But I'm not only worried about the worst case,
00:53:59.680 | I get a huge delight when I can actually solve
00:54:02.960 | a problem that I couldn't solve before,
00:54:05.360 | even though I can't solve the problem
00:54:07.520 | that it suggests as a further problem.
00:54:10.600 | I know that I'm way better than I was before.
00:54:14.160 | And so I found out that BDDs could do
00:54:16.520 | all kinds of miraculous things.
00:54:19.840 | And so I had to spend quite a few years
00:54:24.840 | learning about that territory.
00:54:31.040 | - So in general, what brings you more pleasure?
00:54:36.000 | Improving or showing a worst case analysis of an algorithm,
00:54:40.000 | or showing a good average case,
00:54:43.840 | or just showing a good case?
00:54:45.840 | That something good pragmatically
00:54:47.580 | can be done with this algorithm.
00:54:49.320 | - Yeah, I like a good case that is maybe
00:54:52.560 | only a million times faster than I was able to do before.
00:54:55.440 | And not worry about the fact that
00:54:58.040 | it's still gonna take too long
00:55:02.600 | if I double the size of the problem.
00:55:05.640 | - So that said, you popularized the asymptotic notation
00:55:10.040 | for describing running time.
00:55:13.080 | Obviously in the analysis of algorithms,
00:55:15.560 | worst case is such an important part.
00:55:18.440 | Do you see any aspects of that kind of analysis
00:55:22.360 | is lacking, and notation too?
00:55:26.080 | - Well, the main purpose, we should have notations
00:55:29.480 | that help us for the problems we wanna solve,
00:55:33.240 | and so that they match our intuitions.
00:55:36.200 | And people who worked in number theory
00:55:38.920 | had used asymptotic notation in a certain way,
00:55:43.120 | but it was only known to a small group of people.
00:55:46.360 | And I realized that, in fact, it was very useful
00:55:50.680 | to be able to have a notation for something
00:55:53.200 | that we don't know exactly what it is,
00:55:55.040 | but we only know partial about it.
00:55:57.520 | So for example, instead of big O notation,
00:56:02.240 | let's just take a much simpler notation
00:56:05.400 | where I'd say zero or one, or zero, one, or two.
00:56:10.320 | And suppose that when I had been in high school,
00:56:13.880 | we would be allowed to put in the middle of our formula,
00:56:18.160 | X plus zero, one, or two equals Y, okay?
00:56:23.160 | And then we would learn how to multiply
00:56:27.080 | two such expressions together, and deal with them.
00:56:32.720 | Well, the same thing, big O notation says,
00:56:35.360 | here's something that's, I'm not sure what it is,
00:56:38.960 | but I know it's not too big.
00:56:40.360 | I know it's not bigger than some constant times
00:56:43.640 | N squared or something like that.
00:56:45.520 | So I write big O of N squared.
00:56:47.320 | Now I learn how to add big O of N squared
00:56:50.120 | to big O of N cubed, and I know how to add big O
00:56:52.640 | of N squared to plus one, and square that,
00:56:56.040 | and how to take logarithms, exponentials,
00:56:58.400 | we'll have big O's in the middle of them.
00:57:00.400 | And that turned out to be hugely valuable
00:57:04.640 | in all of the work that I was trying to do,
00:57:06.320 | as I'm trying to figure out how good an algorithm is.
00:57:09.480 | - So have there been algorithms in your journey
00:57:12.800 | that perform very differently in practice
00:57:16.780 | than they do in theory?
00:57:18.800 | - Well, the worst case of a combinatorial algorithm
00:57:21.400 | is almost always horrible.
00:57:23.580 | But we have SAT solvers that are solving,
00:57:28.080 | where one of the last exercises in that part of my book
00:57:33.040 | was figure out a problem that has 100 variables
00:57:37.200 | that's difficult for a SAT solver.
00:57:39.800 | But you would think that a problem
00:57:43.960 | with 100 Boolean variables requires you
00:57:46.640 | to do two to the 100th operations,
00:57:51.000 | because that's the number of possibilities
00:57:52.760 | when you have 100 Boolean variables,
00:57:54.800 | and two to the 100th.
00:57:57.040 | Two to the 100th is way bigger
00:57:58.800 | than we can handle; even 10 to the 17th is a lot.
00:58:02.080 | - You've mentioned over the past few years
00:58:05.160 | that you believe P may be equal to NP,
00:58:08.520 | but that it's not really,
00:58:10.020 | you know, if somebody does prove that P equals NP,
00:58:14.080 | it will not directly lead to an actual algorithm
00:58:16.720 | to solve difficult problems.
00:58:18.580 | Can you explain your intuition here?
00:58:21.600 | Has it been changed?
00:58:23.480 | And in general, on the difference
00:58:25.240 | between easy and difficult problems of P and NP and so on?
00:58:28.800 | - Yeah, so the popular idea is,
00:58:32.260 | if an algorithm exists, then somebody will find it.
00:58:37.000 | And it's just a matter of writing it down.
00:58:42.540 | But many more algorithms exist
00:58:48.480 | than anybody can understand, or ever make use of.
00:58:51.760 | - Or discover, yeah.
00:58:53.240 | - Because they're just way beyond human comprehension.
00:58:56.920 | The total number of algorithms is more than mind-boggling.
00:59:01.920 | So we have situations now where we know
00:59:07.440 | that algorithms exist, but we don't know,
00:59:10.080 | we don't have the foggiest idea what the algorithms are.
00:59:12.920 | There are simple examples based on game playing,
00:59:17.920 | where you have, where you say,
00:59:22.240 | well, there must be an algorithm that exists
00:59:25.160 | for the first player to win in the game of hex,
00:59:29.760 | because hex is always either a win
00:59:33.520 | for the first player or the second player.
00:59:35.240 | - Well, what's the game of hex?
00:59:36.080 | - There's a game of hex, which is based
00:59:38.680 | on putting pebbles onto a hexagonal board,
00:59:42.200 | and the white player tries to get a white path
00:59:45.160 | from left to right, and the black player tries
00:59:47.120 | to get a black path from bottom to top.
00:59:49.720 | - And how does capture occur?
00:59:51.120 | Just so I understand that.
00:59:52.160 | - And there's no capture.
00:59:53.080 | You just put pebbles down one at a time.
00:59:56.200 | But there's no draws, because after all the white
00:59:58.840 | and black are played, there's either gonna be
01:00:00.720 | a white path across from east to west,
01:00:02.720 | or a black path from bottom to top.
01:00:05.760 | So there's always, you know, it's the perfect
01:00:08.720 | information game, and people take turns,
01:00:11.880 | like tic-tac-toe.
01:00:13.920 | And the hex board can be different sizes,
01:00:18.760 | but there's no possibility of a draw,
01:00:21.640 | and players move one at a time.
01:00:24.400 | And so it's gotta be either a first player win
01:00:26.600 | or a second player win.
01:00:27.600 | Mathematically, you follow out all the game trees,
01:00:30.800 | and there's always a win either for the first player
01:00:34.440 | or for the second player, okay.
01:00:36.000 | And it's finite.
01:00:37.240 | The game is finite.
01:00:38.600 | So there's an algorithm that will decide,
01:00:41.200 | you can show it has to be one or the other,
01:00:43.840 | because the second player could mimic the first player
01:00:46.880 | with kind of a pairing strategy.
01:00:48.680 | And so you can show that it has to be one or the other.
01:00:54.960 | But we don't know any algorithm anyway.
01:00:58.520 | We don't know.
01:00:59.360 | In other words, there are cases where you can prove
01:01:02.680 | the existence of a solution, but nobody knows
01:01:07.120 | any way how to find it.
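The existence argument above can be made concrete on toy boards. Here is a brute-force sketch (my own illustration, not something from the conversation) that decides tiny hex boards by exhaustive game-tree search; it confirms the first-player win on small cases, while no practical algorithm is known for real board sizes.

```python
def neighbors(r, c, n):
    """Hex adjacency on an n-by-n rhombus board."""
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1), (-1, 1), (1, -1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < n and 0 <= cc < n:
            yield rr, cc

def connected(stones, n, player):
    """Player 1 joins the left/right columns; player 2 joins the top/bottom rows."""
    axis = 1 if player == 1 else 0
    frontier = [s for s in stones if s[axis] == 0]
    seen = set(frontier)
    while frontier:
        cell = frontier.pop()
        if cell[axis] == n - 1:
            return True
        for nb in neighbors(cell[0], cell[1], n):
            if nb in stones and nb not in seen:
                seen.add(nb)
                frontier.append(nb)
    return False

def winner(p1, p2, mover, n):
    """Exhaustive search: which player wins with perfect play from here."""
    own = p1 if mover == 1 else p2
    empties = [(r, c) for r in range(n) for c in range(n)
               if (r, c) not in p1 and (r, c) not in p2]
    for cell in empties:
        new = own | {cell}
        if connected(new, n, mover):
            return mover
        np1, np2 = (new, p2) if mover == 1 else (p1, new)
        if winner(np1, np2, 3 - mover, n) == mover:
            return mover
    return 3 - mover  # no winning move: the opponent wins (hex has no draws)

def first_player_wins(n):
    return winner(frozenset(), frozenset(), 1, n) == 1
```

Even on tiny boards the tree explodes quickly, which is the point of the passage: the search that settles a 2-by-2 board says nothing about how to play an 11-by-11 one.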
01:01:08.800 | More to the point of the algorithm question,
01:01:10.880 | there's a very powerful theorem in graph theory
01:01:15.400 | by Robertson and Seymour that says that every class
01:01:20.400 | of graphs that is closed under taking minors
01:01:25.160 | has a polynomial time algorithm to determine
01:01:29.640 | whether a given graph is in this class or not.
01:01:31.520 | Now a class of graphs, for example, planar graphs,
01:01:33.720 | these are graphs that you can draw in a plane
01:01:35.520 | without crossing lines.
01:01:37.360 | And for a planar graph, taking minors means that
01:01:41.600 | you can shrink an edge into a point,
01:01:45.920 | or you can delete an edge.
01:01:47.520 | And so you start with a planar graph,
01:01:51.640 | shrink any edge to a point is still planar,
01:01:54.240 | delete an edge is still planar.
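The two minor operations Knuth just named can be written out directly; this is my own small sketch on adjacency sets, not code from the conversation.

```python
def delete_edge(adj, u, v):
    """Remove edge (u, v) from an undirected adjacency-set graph."""
    new = {x: set(nbrs) for x, nbrs in adj.items()}
    new[u].discard(v)
    new[v].discard(u)
    return new

def contract_edge(adj, u, v):
    """Shrink edge (u, v) to a point: merge v into u, keeping the graph simple."""
    new = {x: set(nbrs) for x, nbrs in adj.items() if x != v}
    for x, nbrs in new.items():
        if v in nbrs:
            nbrs.discard(v)
            if x != u:
                nbrs.add(u)   # former neighbors of v now touch u
    new[u] |= {w for w in adj[v] if w not in (u, v)}
    return new
```

Contracting one edge of a 4-cycle yields a triangle, and both operations preserve planarity, which is what makes planar graphs a minor-closed family.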
01:01:56.640 | Okay, now, there are millions of different
01:02:01.640 | ways to describe a family of graphs
01:02:09.600 | that still remains the same under taking minors.
01:02:14.160 | And Robertson and Seymour proved that for
01:02:16.960 | any such family of graphs, there's a finite number
01:02:20.120 | of minimal graphs that are obstructions,
01:02:25.120 | so that if a graph is not in the family,
01:02:29.600 | then there has to be a way
01:02:34.600 | to shrink it down until you get one of these
01:02:37.880 | bad minimal graphs that's not in the family.
01:02:40.600 | In the case of planar graphs,
01:02:43.760 | one minimal graph is a five-pointed star
01:02:46.200 | where every point connects to every other, K5,
01:02:48.480 | and the other is the graph you get trying
01:02:50.800 | to connect three utilities to three houses
01:02:52.800 | without crossing lines, K3,3.
01:02:54.440 | So there are two bad graphs that are not planar.
01:02:58.320 | And every non-planar graph contains one of these
01:03:01.480 | two bad graphs by shrinking and removing edges.
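The two obstructions Knuth is describing are the complete graph K5 and the utilities graph K3,3, as in Kuratowski's and Wagner's characterizations of planarity. A quick Euler-formula edge count, sketched here as a sanity check, shows neither can be drawn in the plane:

```python
def k_edges(n):
    """Edge count of the complete graph on n vertices."""
    return n * (n - 1) // 2

# Planar graphs obey e <= 3v - 6; K5 has 10 edges but 3*5 - 6 = 9.
assert k_edges(5) == 10 and k_edges(5) > 3 * 5 - 6

# Bipartite planar graphs obey e <= 2v - 4; K3,3 has 9 edges but 2*6 - 4 = 8.
assert 3 * 3 > 2 * 6 - 4
```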
01:03:07.080 | - Sorry, can you say that again?
01:03:08.280 | So he proved that there's a finite number
01:03:11.040 | of these bad graphs.
01:03:11.880 | - There's always a finite number.
01:03:13.200 | So somebody says, here's a family--
01:03:15.240 | - It's hard to believe.
01:03:16.240 | (laughing)
01:03:17.440 | - And they proved in this sequence of 20 papers,
01:03:20.840 | I mean, it's deep work.
01:03:23.400 | - Because that's for any arbitrary class.
01:03:27.280 | So it's for any--
01:03:28.120 | - Any arbitrary class that's closed under taking minors.
01:03:31.000 | - That's closed under, maybe I'm not understanding,
01:03:33.920 | because it seems like a lot of them
01:03:35.200 | are closed under taking minors.
01:03:36.760 | - Almost all the important classes of graphs are.
01:03:39.760 | There are tons of such graphs,
01:03:42.120 | but also hundreds of them that arise in applications.
01:03:47.120 | I have a book over here called "Graph Classes,"
01:03:51.520 | and it's amazing how many different classes of graphs
01:03:56.520 | that people have looked at.
01:03:58.960 | - So why do you bring up this theorem, or this proof?
01:04:02.520 | - So now, there's lots of algorithms
01:04:05.560 | that are known for special classes of graphs.
01:04:07.400 | For example, if I have a chordal graph,
01:04:10.760 | then I can color it efficiently.
01:04:12.400 | If I have certain other kinds of graphs,
01:04:15.720 | they'll make a great network, various things.
01:04:18.800 | So you'd like to test, somebody gives you a graph,
01:04:22.720 | oh, is it in this family of graphs?
01:04:24.360 | If so, then I can go to the library
01:04:28.400 | and find an algorithm that's gonna solve my problem
01:04:31.000 | on that graph.
01:04:32.880 | - Okay, so we wanna have an algorithm that says,
01:04:37.240 | you give me a graph,
01:04:39.000 | and I'll tell you
01:04:43.720 | whether it's in this family or not, okay?
01:04:48.720 | And so all I have to do is test whether or not
01:04:53.120 | this given graph has a minor
01:04:55.400 | that's one of the bad ones.
01:04:57.040 | A minor is anything you can get
01:04:59.120 | by shrinking and removing edges.
01:05:01.560 | And given any minor, there's a polynomial time algorithm
01:05:04.720 | saying I can tell whether this is a minor of you.
01:05:09.600 | And there's a finite number of bad cases.
01:05:11.640 | So I just try, does it have this bad case?
01:05:15.080 | Polynomial time, I got the answer.
01:05:17.120 | Does it have this bad case?
01:05:18.200 | Polynomial time, I got the answer.
01:05:19.880 | Total polynomial time.
01:05:23.240 | And so I've solved the problem.
01:05:24.680 | However, all we know is that the number of minors is finite.
01:05:28.600 | We might only know one or two
01:05:32.120 | of those minors, and if we've got 20 of them,
01:05:35.480 | we don't know if there might be a 21st or a 25th.
01:05:37.680 | All we know is that it's finite.
01:05:42.640 | So here we have a polynomial time algorithm
01:05:46.400 | that we don't know how to write down.
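When the obstruction set happens to be known, the test Knuth describes is easy to write down. A toy instance of my own: forests are minor-closed, their single obstruction is K3, and a simple graph has a K3 minor exactly when it contains a cycle.

```python
def has_k3_minor(adj):
    """A simple graph has a K3 minor iff it contains a cycle
    (contract the cycle down to a triangle), so DFS cycle detection suffices."""
    seen = set()
    def dfs(u, parent):
        seen.add(u)
        for v in adj[u]:
            if v == parent:
                continue
            if v in seen or dfs(v, u):
                return True
        return False
    return any(u not in seen and dfs(u, None) for u in list(adj))

def in_minor_closed_family(adj, obstruction_tests):
    """Robertson-Seymour style membership: in the family iff no obstruction occurs."""
    return not any(test(adj) for test in obstruction_tests)

def is_forest(adj):
    return in_minor_closed_family(adj, [has_k3_minor])
```

For planar graphs the same loop would run the K5 and K3,3 minor tests; the remarkable part of the theorem is that such a finite loop exists even when nobody knows what the obstructions are.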
01:05:46.400 | - That's a really great example of what you worry about
01:05:49.840 | or why you think P equals NP won't be useful.
01:05:53.200 | But still, why do you hold the intuition
01:05:56.640 | that P equals NP?
01:05:59.920 | - Because you have to rule out so many possible algorithms
01:06:04.920 | as not working.
01:06:08.260 | You can take the graph and you can represent it
01:06:14.080 | in terms of certain prime numbers,
01:06:18.280 | and then you can multiply those together,
01:06:20.000 | and then you can take the bitwise and,
01:06:24.380 | and construct some certain constant in polynomial time.
01:06:29.380 | And then that's a perfectly valid algorithm.
01:06:33.220 | And there's so many algorithms of that kind.
01:06:36.860 | A lot of times we take random data
01:06:38.680 | and we get coincidences,
01:06:46.140 | some fairly random looking number actually is useful
01:06:51.380 | because it happens to solve a problem.
01:06:56.380 | It's like the hairs on your head.
01:07:04.660 | It seems unlikely that two people
01:07:10.860 | are gonna have the same number of hairs on their head.
01:07:13.460 | But obviously, you can count
01:07:17.540 | how many people there are and how many hairs can be on a head.
01:07:20.660 | There must be people walking around in the country
01:07:22.580 | that have the same number of hairs on their head.
01:07:24.700 | Well, that's a kind of a coincidence
01:07:26.660 | that you might say also,
01:07:28.540 | this particular combination of operations
01:07:31.940 | just happens to prove that a graph has a Hamiltonian path.
01:07:35.940 | I mean, I see lots of cases where unexpected things happen
01:07:40.940 | when you have enough possibilities.
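The hair-counting argument is plain pigeonhole arithmetic; the figures below are rough assumptions of mine, just to show that the collision is forced.

```python
# Roughly 330 million people, at most ~150,000 hairs on a head,
# so hair counts 0..150000 give 150001 "pigeonholes."
people = 330_000_000
possible_counts = 150_001
# Some count is shared by at least ceil(people / possible_counts) people.
most_shared = -(-people // possible_counts)  # ceiling division
assert most_shared >= 2
```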
01:07:44.220 | - So because the space of possibility is so huge,
01:07:46.620 | your intuition just says--
01:07:47.460 | - You have to rule them all out.
01:07:49.220 | And so that's the reason for my intuition
01:07:51.700 | is it by no means a proof.
01:07:53.260 | I mean, some people say,
01:07:54.700 | well, P can't equal NP
01:07:58.580 | because you've had all these smart people.
01:08:00.680 | The smartest designers of algorithms
01:08:05.460 | have been racking their brains for years and years
01:08:09.100 | and there's million dollar prizes out there.
01:08:11.380 | None of them, nobody has thought of the algorithm.
01:08:16.260 | So there must be no such algorithm.
01:08:19.260 | On the other hand, I can use exactly the same logic
01:08:23.060 | and I can say, well, P must be equal to NP
01:08:25.900 | because there's so many smart people out here
01:08:27.780 | have been trying to prove it unequal to NP
01:08:30.580 | and they've all failed.
01:08:31.840 | - This kind of reminds me of the discussion
01:08:36.900 | about the search for aliens.
01:08:38.460 | We've been trying to look for them
01:08:40.580 | and we haven't found them yet, therefore they don't exist.
01:08:43.900 | But you can show that there's so many planets out there
01:08:46.300 | that they very possibly could exist.
01:08:48.220 | - Yeah, right.
01:08:50.300 | And then there's also the possibility that they exist
01:08:54.420 | but they all discovered machine learning or something
01:08:57.700 | and then blew each other up.
01:09:00.940 | - Well, on that small, quick tangent, let me ask,
01:09:03.660 | do you think there's intelligent life
01:09:05.180 | out there in the universe?
01:09:07.040 | - I have no idea.
01:09:08.420 | - Do you hope so?
01:09:09.260 | Do you think about it?
01:09:11.420 | - I don't spend my time thinking about things
01:09:14.940 | that I could never know, really.
01:09:16.900 | - And yet you do enjoy the fact
01:09:18.300 | that there's many things you don't know.
01:09:20.380 | You do enjoy the mystery of things.
01:09:23.980 | - I enjoy the fact that I have limits, yeah.
01:09:27.660 | But I don't take time to answer unsolvable questions.
01:09:32.660 | - Got it.
01:09:35.020 | Well, 'cause you've taken on some tough questions
01:09:38.940 | that may seem unsolvable,
01:09:40.520 | but they're in the space.
01:09:44.860 | - It gives me a thrill when I can get further
01:09:46.660 | than I ever thought I could.
01:09:47.860 | - Right. - Yeah.
01:09:48.940 | - But-- - But I don't--
01:09:50.980 | - Much like with religion, these--
01:09:53.140 | - I'm glad that there's no proof that God exists or not.
01:09:57.460 | I mean, I think--
01:09:59.300 | - It would spoil the mystery.
01:10:00.740 | - It would be too dull, yeah.
01:10:02.960 | - So to quickly talk about the other art
01:10:08.220 | of artificial intelligence,
01:10:10.900 | what is your view, you know,
01:10:14.200 | artificial intelligence community has developed
01:10:16.660 | as part of computer science
01:10:18.000 | and in parallel with computer science since the '60s.
01:10:21.320 | What's your view of the AI community from the '60s to now?
01:10:26.320 | - So all the way through, it was the people
01:10:29.400 | who were inspired by trying to mimic intelligence,
01:10:33.400 | or to do things that were somehow
01:10:37.800 | the greatest achievements of intelligence,
01:10:40.140 | that have been the inspiration to the people
01:10:42.500 | who have pushed the envelope of computer science
01:10:45.400 | maybe more than any other group of people.
01:10:50.320 | So all the way through, it's been a great source
01:10:53.080 | of good problems to sink our teeth into,
01:10:58.080 | getting partial answers
01:11:06.580 | and then more and more successful answers over the years.
01:11:10.580 | So this has been the inspiration
01:11:13.860 | for lots of the great discoveries of computer science.
01:11:16.420 | - Are you yourself captivated by the possibility
01:11:18.860 | of creating, of algorithms having
01:11:21.740 | echoes of intelligence in them?
01:11:24.760 | - Not as much as most of the people in the field,
01:11:29.580 | I guess I would say,
01:11:30.420 | but that's not to say that they're wrong;
01:11:34.700 | it's just that you asked about my own personal preferences.
01:11:37.760 | But the thing that I worry about
01:11:43.660 | is when people start believing
01:11:48.860 | that they've actually succeeded.
01:11:50.460 | Because it seems to me there's a huge gap
01:11:57.540 | between really understanding something
01:12:01.860 | and being able to pretend to understand something
01:12:05.420 | and give the illusion of understanding something.
01:12:08.180 | - Do you think it's possible to create
01:12:10.100 | without understanding?
01:12:12.100 | - Yeah.
01:12:12.940 | - So to--
01:12:13.820 | - I do that all the time too.
01:12:15.220 | I mean, that's why I use random numbers.
01:12:18.840 | But there's still this great gap.
01:12:24.900 | I don't assert that it's impossible,
01:12:27.620 | but I don't see anything coming any closer to really
01:12:32.620 | the kind of stuff that I would consider intelligence.
01:12:38.860 | - So you've mentioned something
01:12:40.980 | on that line of thinking, which I very much agree with:
01:12:45.460 | "The Art of Computer Programming," as a book,
01:12:49.540 | is focused on single processor algorithms,
01:12:53.340 | for the most part.
01:12:55.780 | You mentioned--
01:12:57.620 | - That's only because I set the table of contents in 1962,
01:13:00.900 | you have to remember.
01:13:02.380 | - For sure, there's no--
01:13:04.580 | - I'm glad I didn't wait until 1965 or something.
01:13:08.060 | - That's, one book, maybe we'll touch on the Bible,
01:13:12.620 | but one book can't always cover the entirety of everything.
01:13:17.100 | So I'm glad the table of contents
01:13:22.260 | for the art of computer programming is what it is.
01:13:26.020 | But you did mention that you thought that understanding
01:13:30.420 | of the way ant colonies are able to perform
01:13:33.200 | incredibly organized tasks might well be the key
01:13:36.140 | to understanding human cognition.
01:13:38.480 | So these fundamentally distributed systems.
01:13:42.060 | So what do you think is the difference between the way
01:13:44.860 | Don Knuth would sort a list,
01:13:47.860 | and an ant colony would sort a list,
01:13:49.820 | or perform an algorithm?
01:13:52.860 | - Sorting a list isn't the same as cognition, though,
01:13:55.900 | but I know what you're getting at.
01:13:58.360 | Well, the advantage of ant colony,
01:14:02.060 | at least we can see what they're doing.
01:14:04.580 | We know which ant has talked to which other ant,
01:14:07.620 | and it's much harder with the brains to know
01:14:12.620 | to what extent neurons are passing signal.
01:14:18.260 | So I'm just saying that ant colony might be,
01:14:21.700 | if they have the secret of cognition,
01:14:23.980 | think of an ant colony as a cognitive single being,
01:14:29.820 | rather than as a colony of lots of different ants.
01:14:32.340 | I mean, just like the cells of our brain are,
01:14:36.020 | and the microbiome and all that is interacting entities,
01:14:41.020 | but somehow I consider myself to be a single person.
01:14:48.460 | Well, an ant colony, you can say,
01:14:51.620 | might be cognitive somehow.
01:14:55.420 | - It sums it up.
01:14:56.980 | - Yeah, I mean, okay, I smash a certain ant,
01:15:01.980 | and the organism says, "Hmm, that stung.
01:15:05.100 | "What was that?"
01:15:06.900 | But if we're going to crack the secret of cognition,
01:15:10.660 | it might be that we could do so
01:15:13.020 | by psyching out how ants do it,
01:15:16.980 | because we have a better chance to measure
01:15:19.700 | their communication by pheromones
01:15:21.300 | and by touching each other and sight,
01:15:23.820 | rather than by much more subtle phenomena
01:15:27.660 | like electric currents going through.
01:15:30.300 | - But even a simpler version of that,
01:15:31.860 | what are your thoughts of maybe Conway's Game of Life?
01:15:35.380 | - Okay, so Conway's Game of Life
01:15:37.180 | is able to simulate any computable process,
01:15:42.180 | and any deterministic process is--
01:15:46.860 | - I like how you went there.
01:15:48.020 | I mean, that's not its most powerful thing, I would say.
01:15:53.020 | I mean, it can simulate it,
01:15:56.740 | but the magic is that the individual units
01:16:00.380 | are distributed and extremely simple.
01:16:03.820 | - Yes, we understand exactly what the primitives are.
01:16:06.900 | - The primitives, just like with the ant colony,
01:16:08.780 | even simpler, though.
01:16:09.620 | - But still, it doesn't say that I understand life.
01:16:16.540 | I mean, I understand.
01:16:17.620 | It gives me a better insight into what does it mean
01:16:24.860 | to have a deterministic universe.
01:16:27.960 | What does it mean to have free choice, for example?
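To make the point about how simple the primitives are, here is a minimal Game of Life step of my own; every cell's fate depends only on its eight neighbors, yet patterns built from this rule can simulate arbitrary computation.

```python
from collections import Counter

def step(live):
    """One Game of Life generation: live is a set of (x, y) cells."""
    # Count, for every cell, how many live neighbors it has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(0, 0), (1, 0), (2, 0)}   # a row of three: the classic oscillator
assert step(step(blinker)) == blinker
```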
01:16:34.620 | - Do you think God plays dice?
01:16:37.940 | - Yes, I don't see any reason why God should be forbidden
01:16:41.580 | from using the most efficient ways
01:16:44.180 | to, I mean, we know that dice are extremely important
01:16:49.180 | in efficient algorithms.
01:16:54.780 | There are things that couldn't be done well
01:16:57.060 | without randomness, and so I don't see any reason
01:16:59.300 | why God should be prohibited from--
01:17:03.180 | - When the algorithm requires it,
01:17:05.500 | you don't see why the physics should constrain it.
01:17:10.100 | - Yeah.
01:17:11.460 | - So in 2001, you gave a series of lectures at MIT
01:17:15.780 | about religion and science.
01:17:17.100 | - No, that was 1999.
01:17:18.340 | - You published, sorry.
01:17:20.940 | - The book came out in 2001.
01:17:22.820 | - So in 1999, you spent a little bit of time in Boston,
01:17:26.900 | enough to give those lectures.
01:17:29.860 | - Yeah.
01:17:30.700 | - And I read the 2001 version, most of it.
01:17:35.700 | It's quite fascinating to read.
01:17:37.420 | I recommend people, it's transcription of your lectures.
01:17:40.840 | So what did you learn about how ideas get started
01:17:44.460 | and grow from studying the history of the Bible?
01:17:46.940 | So you've rigorously studied a very particular part
01:17:50.380 | of the Bible.
01:17:51.200 | What did you learn from this process
01:17:53.720 | about the way us human beings as a society
01:17:56.700 | develop and grow ideas, share ideas,
01:17:59.740 | and are defined by those ideas?
01:18:01.540 | - Well, it's hard to summarize that.
01:18:04.060 | I wouldn't say that I learned a great deal
01:18:08.500 | of really definite things
01:18:10.540 | where I could make conclusions,
01:18:12.220 | but I learned more about what I don't know.
01:18:15.420 | You have a complex subject,
01:18:16.800 | which is really beyond human understanding.
01:18:20.060 | So you give up, saying,
01:18:23.460 | I'm never gonna get to the end of the road
01:18:24.980 | and I'm never gonna understand it.
01:18:26.240 | But you say, maybe it might be good for me
01:18:29.780 | to get closer and closer and learn
01:18:33.060 | more and more about something.
01:18:34.460 | And so, how can I do that efficiently?
01:18:39.220 | And the answer is, well, use randomness.
01:18:42.980 | And so try a random subset
01:18:48.300 | that is within my grasp
01:18:52.660 | and study that in detail
01:18:56.940 | instead of just studying parts
01:18:59.340 | that somebody tells me to study
01:19:00.980 | or instead of studying nothing,
01:19:04.060 | because it's too hard.
01:19:07.020 | So I decided for my own amusement,
01:19:12.020 | once that I would take a subset
01:19:17.020 | of the verses of the Bible
01:19:22.020 | and I would try to find out
01:19:25.420 | what the best thinkers have said
01:19:27.100 | about that small subset.
01:19:29.860 | And I had about, let's say, 60 verses out of 31,000.
01:19:34.940 | I think it's one out of 500 or something like this.
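The study method Knuth describes is ordinary random sampling. A sketch of my own (in his actual 3:16 project the rule was deterministic, chapter 3, verse 16 of each book, but the spirit is the same):

```python
import random

# Pick a small random subset of a corpus that is within your grasp,
# rather than studying assigned parts or nothing at all.
random.seed(316)                      # arbitrary seed, for reproducibility
corpus = range(1, 31_001)             # roughly 31,000 verses in the Bible
sample = sorted(random.sample(corpus, 60))
assert len(sample) == 60              # about one verse in 500
```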
01:19:37.540 | And so then I went to the libraries,
01:19:39.340 | which are well indexed.
01:19:40.980 | For example, at the Boston Public Library
01:19:48.140 | I would go once a week for a year.
01:19:52.220 | And I went a half dozen times
01:19:56.540 | to the Andover-Harvard Library
01:19:58.540 | to look at books that weren't in the Boston Public,
01:20:03.460 | where scholars had looked
01:20:05.300 | and you can go down the shelves
01:20:08.540 | and you can look in the index
01:20:11.940 | and say, oh, is this verse mentioned
01:20:14.580 | anywhere in this book?
01:20:15.420 | If so, look at page 105.
01:20:17.580 | So in other words, I could learn
01:20:19.260 | not only about the Bible,
01:20:20.540 | but about the secondary literature about the Bible,
01:20:23.500 | the things that scholars have written about it.
01:20:26.020 | And so that gave me a way
01:20:28.500 | to zoom in on parts of the thing
01:20:33.140 | so that I could get more insight.
01:20:36.020 | And so I look at it as a way of giving me
01:20:40.020 | some firm pegs,
01:20:43.020 | which I could hang pieces of information,
01:20:45.780 | but not as things where I would say,
01:20:48.860 | and therefore this is true.
01:20:50.860 | - In this random approach of sampling the Bible,
01:20:54.940 | what did you learn about the most central,
01:21:02.780 | one of the biggest accumulation of ideas in our--
01:21:05.420 | - It seemed to me that the main thrust
01:21:08.420 | was not the one that most people think of
01:21:11.020 | as saying, you know,
01:21:13.100 | don't have sex or something like this.
01:21:15.020 | But that the main thrust was
01:21:18.020 | to try to figure out
01:21:23.020 | how to live in harmony with God's wishes.
01:21:26.780 | I'm assuming that God exists
01:21:28.420 | and as I say, I'm glad that there's no way to prove this
01:21:32.420 | because that would,
01:21:33.500 | I would run through the proof once
01:21:37.300 | and then I'd forget it.
01:21:38.620 | And I would never speculate about spiritual things
01:21:43.620 | and mysteries otherwise.
01:21:48.420 | And I think my life would be very incomplete.
01:21:51.540 | So I'm assuming that God exists,
01:21:55.980 | but a lot of the people say God doesn't exist,
01:22:01.940 | but that's still important to them.
01:22:03.220 | And so in a way that might still be,
01:22:06.180 | whether God is there or not,
01:22:09.060 | in some sense, God is important to them.
01:22:12.900 | One of the verses I studied,
01:22:15.980 | you can interpret it as saying
01:22:19.180 | it's much better to be an atheist than not to care at all.
01:22:23.540 | - So I would say it's, yeah,
01:22:26.340 | it's similar to the P equals NP discussion.
01:22:29.540 | You mentioned a mental exercise
01:22:32.220 | that I'd love for you to partake in yourself,
01:22:36.780 | mental exercise of being God.
01:22:38.860 | - And so how would you, if you were God, Don Knuth,
01:22:42.260 | how would you present yourself to the people of Earth?
01:22:45.220 | - You mentioned your love of literature
01:22:47.660 | and there was this book that I can really recommend to you.
01:22:52.660 | Yeah, the title I think is "Blasphemy."
01:22:55.780 | It talks about God revealing himself
01:22:58.740 | through a computer in Los Alamos.
01:23:01.900 | And it's the only book that I've ever read
01:23:06.900 | where the punchline was really
01:23:13.180 | the very last word of the book
01:23:15.460 | and it explained the whole idea of the book.
01:23:18.580 | And so I won't give that away,
01:23:20.900 | but it's really very much about this question
01:23:23.460 | that you raised.
01:23:27.540 | But suppose God said, okay,
01:23:31.420 | that my previous means of communication with the world
01:23:36.260 | are not the best for the 21st century,
01:23:38.940 | so what should I do now?
01:23:40.700 | And it's conceivable that God would choose
01:23:45.700 | the way that's described in this book.
01:23:51.020 | - Another way to look at this exercise
01:23:52.660 | is looking at the human mind,
01:23:54.820 | looking at the human spirit,
01:23:56.140 | the human life in a systematic way.
01:23:59.260 | - I think it mostly, you wanna learn humility.
01:24:01.860 | You wanna realize that once we solve one problem,
01:24:04.460 | that doesn't mean that all of a sudden
01:24:07.220 | other problems are gonna drop out.
01:24:08.660 | And we have to realize that there are things
01:24:13.660 | beyond our ability.
01:24:22.260 | I see hubris all around.
01:24:24.980 | - Yeah, well said.
01:24:28.660 | If you were to run program analysis on your own life,
01:24:31.660 | how did you do in terms of correctness,
01:24:36.020 | running time, resource use,
01:24:39.580 | asymptotically speaking, of course?
01:24:40.980 | - Okay, yeah, well, I would say
01:24:44.060 | that question had not been asked me before.
01:24:50.260 | And I started out with library subroutines
01:24:55.260 | and learning how to be an automaton that was obedient.
01:25:06.340 | And I had the great advantage that I didn't have anybody
01:25:13.780 | to blame for my failures.
01:25:18.580 | If I started not understanding something,
01:25:21.860 | I knew that I should stop playing ping pong
01:25:24.460 | and that it was my fault that I wasn't studying hard enough
01:25:28.540 | or something rather than that somebody
01:25:30.660 | was discriminating against me in some way.
01:25:32.860 | And I don't know how to avoid the existence of biases
01:25:37.860 | in the world, but I know that that's an extra burden
01:25:41.780 | that I didn't have to suffer from.
01:25:46.260 | And then I found the,
01:25:51.260 | from parents I learned the idea of service
01:25:58.900 | to other people as being more important
01:26:04.100 | than what I get out of stuff myself.
01:26:09.100 | I know that I need to be happy enough
01:26:15.940 | in order to be able to be of service,
01:26:18.220 | but I came to a philosophy finally
01:26:23.220 | that I phrase as point eight is enough.
01:26:27.580 | There was a TV show once called "Eight is Enough,"
01:26:31.340 | which was about, somebody had eight kids.
01:26:34.820 | But I say point eight is enough,
01:26:39.300 | which means if I can have a way of rating happiness,
01:26:44.540 | I think it's good design to have an organism
01:26:49.540 | that's happy about 80% of the time.
01:26:54.640 | And if it was 100% of the time,
01:26:58.740 | it would be like everybody's on drugs
01:27:01.420 | and everything collapses and nothing works
01:27:06.420 | because everybody's just too happy.
01:27:09.340 | - Do you think you've achieved that point eight
01:27:11.220 | optimal balance?
01:27:13.180 | - There are times when I'm down
01:27:15.900 | and I know that I'm chemically,
01:27:20.300 | I know that I've actually been programmed
01:27:22.340 | to be depressed a certain amount of time.
01:27:27.340 | And if that gets out of kilter
01:27:30.100 | and I'm more depressed than usual,
01:27:31.740 | sometimes I find myself trying to think,
01:27:34.180 | now who should I be mad at today?
01:27:36.060 | There must be a reason why I'm,
01:27:40.020 | but then I realized, it's just my chemistry
01:27:43.700 | telling me that I'm supposed to be mad at somebody.
01:27:45.660 | And so I just trigger it, saying,
01:27:47.500 | okay, go to sleep and get better.
01:27:49.340 | But if I'm not 100% happy,
01:27:55.180 | that doesn't mean that I should find somebody
01:27:57.060 | that's screwing me and try to silence them.
01:28:00.680 | I'm saying, okay, I'm not 100% happy,
01:28:06.140 | but I'm happy enough to be a part
01:28:11.140 | of a sustainable situation.
01:28:14.420 | So that's kind of the numerical analysis I do.
01:28:19.420 | - You've converged towards the optimal,
01:28:23.020 | which for human life is a point eight.
01:28:25.540 | I hope it's okay to talk about,
01:28:28.300 | as you talked about previously,
01:28:29.940 | in 2006 you were diagnosed with prostate cancer.
01:28:34.580 | Has that encounter with mortality changed you in some way,
01:28:38.620 | or the way you see the world?
01:28:40.540 | - Yeah, it did.
01:28:41.740 | The first encounter with mortality was when my dad died,
01:28:45.500 | and I went through a month when I sort of came to
01:28:50.500 | be comfortable with the fact that I was gonna die someday.
01:28:59.380 | And during that month, I don't know,
01:29:03.000 | I felt okay, but I couldn't sing.
01:29:08.000 | And I couldn't do original research either.
01:29:14.000 | I sort of remember after three or four weeks,
01:29:18.800 | the first time I started having a technical thought
01:29:21.720 | that made sense and was maybe slightly creative,
01:29:24.920 | I could sort of feel that something
01:29:27.920 | was starting to move again.
01:29:29.600 | But that was, so I felt very empty
01:29:32.880 | until I came to grips with the,
01:29:36.680 | I learned that this is sort of a standard grief process
01:29:40.800 | that people go through.
01:29:42.640 | Okay, so then now I'm at a point in my life,
01:29:46.400 | even more so than in 2006,
01:29:48.160 | where all of my goals have been fulfilled
01:29:51.400 | except for finishing the art of computer programming.
01:29:58.800 | I had one major unfulfilled goal.
01:30:01.840 | I'd wanted all my life to write a piece of music,
01:30:06.840 | and I had an idea for a certain kind of music
01:30:12.360 | that I thought ought to be written,
01:30:13.600 | at least somebody ought to try to do it.
01:30:15.120 | And I felt that it wasn't gonna be easy,
01:30:20.000 | but I wanted it to be proof of concept.
01:30:24.080 | I wanted to know if it was gonna work or not.
01:30:25.920 | And so I spent a lot of time,
01:30:28.480 | and finally I finished that piece
01:30:30.160 | and we had the world premiere last year
01:30:34.640 | on my 80th birthday,
01:30:35.840 | and we had another premiere in Canada,
01:30:38.040 | and there's talk of concerts in Europe and various things.
01:30:41.840 | So that, but that's done.
01:30:43.280 | It's part of the world's music now,
01:30:45.360 | and it's either good or bad,
01:30:46.400 | but I did what I was hoping to do.
01:30:50.200 | So the only thing that I had to do
01:30:53.440 | that I have on my agenda
01:30:57.440 | is to try to do as well as I can
01:30:59.960 | with the art of computer programming until I go senile.
01:31:02.960 | - Do you think there's an element of 0.8 that might apply?
01:31:07.000 | - 0.8? - Yeah.
01:31:08.440 | - Well, I look at it more that I got actually to 1.0
01:31:13.440 | when that concert was over with.
01:31:19.640 | I mean, you know,
01:31:22.800 | so in 2006, I was at 0.8.
01:31:26.440 | So when I was diagnosed with prostate cancer,
01:31:30.520 | then I said, okay, well, maybe this is,
01:31:32.920 | you know, I've had all kinds of good luck all my life
01:31:37.920 | and there's no, I have nothing to complain about.
01:31:40.880 | So I might die now, and we'll see what happens.
01:31:45.040 | And so, quite seriously,
01:31:48.000 | I went in, and
01:31:50.200 | I had no expectation that I deserved better.
01:31:56.360 | I didn't make any plans for the future.
01:31:59.800 | I had my surgery, I came out of the surgery
01:32:03.320 | and spent some time learning how to walk again
01:32:08.320 | and so on, it was painful for a while.
01:32:15.760 | But I got home and I realized I hadn't really thought
01:32:19.600 | about what to do next.
01:32:21.240 | I hadn't any expectation, you know, I said, okay,
01:32:25.560 | hey, I'm still alive.
01:32:26.680 | Okay, now I can write some more books.
01:32:28.680 | But I didn't come to with the attitude that, you know,
01:32:31.520 | this was terribly unfair.
01:32:36.040 | And I just said, okay, I was accepting whatever turned out.
01:32:44.600 | You know, I'd gotten more than my share already.
01:32:49.600 | So why should I, you know, why should I?
01:32:54.680 | And I didn't, and I really, when I got home,
01:32:58.680 | I realized that I had really not thought
01:33:00.880 | about the next step, what I would do
01:33:02.400 | after I would be able to work again.
01:33:04.840 | I had sort of thought of it as if this might be it,
01:33:08.000 | and I was comfortable with the fact that it was the end.
01:33:14.440 | But I was hoping that I would still, you know,
01:33:18.360 | be able to learn about satisfiability
01:33:23.360 | and also someday write music.
01:33:28.880 | I didn't start seriously on the music project until 2012.
01:33:34.040 | - So I'm gonna be in huge trouble
01:33:36.320 | if I don't talk to you about this.
01:33:38.160 | In the '70s, you've created the tech typesetting system
01:33:43.280 | together with metafont language for font description
01:33:45.800 | and computer modern family of typefaces.
01:33:49.560 | That has basically defined the methodology
01:33:52.200 | and the aesthetic of the countless research fields,
01:33:57.200 | right, math, physics, beyond computer science, so on.
01:34:02.320 | Okay, well, first of all, thank you.
01:34:04.120 | I think I speak for a lot of people in saying that.
01:34:08.280 | But a question in terms of beauty.
01:34:11.680 | There's a beauty to typography that you've created,
01:34:15.040 | and yet beauty is hard to quantify.
01:34:17.900 | - Right.
01:34:18.740 | - How does one create beautiful letters
01:34:22.680 | and beautiful equations?
01:34:24.180 | Like what, what, what?
01:34:26.640 | Perhaps there's no words to be describing,
01:34:31.180 | to be describing the process, but.
01:34:34.340 | - So the great Harvard mathematician,
01:34:40.160 | George D. Birkhoff, wrote a book in the '30s
01:34:44.160 | called "Aesthetic Measure," where he would
01:34:46.560 | have pictures of vases, and underneath would be a number,
01:34:50.000 | and this was how beautiful the vase was.
01:34:52.400 | And he had a formula for this.
01:34:54.560 | And he actually also wrote about music.
01:34:58.240 | And so he could, you know, so I thought maybe I would,
01:35:03.000 | part of my musical composition,
01:35:04.520 | I would try to program his algorithms
01:35:08.560 | and, you know, so that I would write something
01:35:12.440 | that had the highest number by his score.
01:35:14.440 | Well, it wasn't quite rigorous enough for a computer to do.
01:35:19.160 | But anyway, people have tried to put numerical value
01:35:22.920 | on beauty, but, and he did probably the most serious attempt.
01:35:27.920 | And George Gershwin's teacher also wrote two volumes
01:35:34.560 | where he talked about his method of composing music,
01:35:38.440 | but you're talking about another kind of beauty,
01:35:41.520 | and beauty in letters and letter forms.
01:35:43.320 | - Elegance and whatever that curvature is.
01:35:46.120 | - Right, so, and so that's,
01:35:50.080 | and yeah, the beholder, as they say,
01:35:52.760 | but striving for excellence in whatever definition
01:35:56.600 | you want to give to beauty, then you try to get
01:35:58.680 | as close to that as you can somehow with it.
01:36:00.880 | - I guess I'm trying to ask,
01:36:04.120 | and there may not be a good answer,
01:36:06.920 | what loose definitions were you operating under
01:36:10.520 | with the community of people that you were working on?
01:36:12.640 | - Oh, the loose definition, I wanted it to appeal to me.
01:36:17.640 | To me, I mean.
01:36:21.000 | - To you personally.
01:36:21.960 | - Yeah.
01:36:22.800 | - That's a good start.
01:36:23.640 | - Yeah, no, and it failed that test
01:36:25.680 | when I got, volume two came out with the new printing,
01:36:29.920 | and I was expecting it to be the happiest day of my life.
01:36:32.400 | And I felt like a burning,
01:36:37.400 | like how angry I was that I opened the book
01:36:41.040 | and it was in the same beige covers,
01:36:44.680 | but it didn't look right on the page.
01:36:48.200 | The number two was particularly ugly.
01:36:52.640 | I couldn't stand any page that had a two in its page number.
01:36:55.680 | And I was expecting that.
01:36:58.480 | I spent all this time making measurements,
01:37:01.320 | and I had looked at stuff in different ways,
01:37:06.320 | and I had great technology,
01:37:11.440 | but I wasn't done.
01:37:15.240 | I had to retune the whole thing after 1961.
01:37:20.160 | - Has it ever made you happy, finally?
01:37:22.320 | - Oh, yes.
01:37:23.640 | - Or is it a point in age?
01:37:26.880 | - No, and so many books have come out
01:37:29.520 | that would never have been written without this.
01:37:31.920 | It's just a joy, but now I,
01:37:35.760 | I mean, all these pages that are sitting up there,
01:37:39.400 | if I didn't like them, I would change them.
01:37:44.640 | That's my, nobody else has this ability.
01:37:48.400 | They have to stick with what I gave them.
01:37:50.160 | - Yeah, so in terms of the other side of it,
01:37:53.640 | there's the typography, so the look of the type
01:37:56.800 | and the curves and the lines.
01:37:59.960 | What about the spacing?
01:38:01.600 | - What about the?
01:38:02.440 | - The spacing between the white space.
01:38:05.320 | - Yeah.
01:38:06.160 | - It seems like you could be a little bit more systematic
01:38:09.440 | about the layout or technical.
01:38:11.960 | - Oh, yeah, you can always go further.
01:38:13.880 | I didn't stop at .8.
01:38:18.200 | I stopped at about .98.
01:38:20.520 | - It seems like you're not following your own rule
01:38:24.960 | for happiness, or is?
01:38:27.000 | - No, no, no.
01:38:27.840 | (Lex laughing)
01:38:30.280 | There's, okay, of course, there's this,
01:38:32.920 | what is the Japanese word, wabi-sabi or something,
01:38:36.880 | where the most beautiful works of art
01:38:40.760 | are those that have flaws, because then the person
01:38:43.520 | who perceives them adds their own appreciation,
01:38:48.520 | and that gives the viewer more satisfaction, or so on.
01:38:52.400 | But no, no, with typography, I wanted it to look
01:38:57.400 | as good as I could in the vast majority of cases,
01:39:01.960 | and then when it doesn't, then I say, okay,
01:39:06.960 | that's 2% more work for the author.
01:39:10.280 | But I didn't want to say that my job was to get to 100%
01:39:16.760 | and take all the work away from the author.
01:39:20.640 | That's what I meant by that.
01:39:22.560 | - So if you were to venture a guess,
01:39:26.440 | how much of the nature of reality
01:39:28.840 | do you think we humans understand?
01:39:30.800 | So you mentioned you appreciate mystery.
01:39:34.360 | How much of the world about us is shrouded in mystery?
01:39:38.040 | If you were to put a number on it,
01:39:41.840 | what percent of it all do we understand?
01:39:44.960 | Are we totally--
01:39:45.800 | - How many leading zeros, 0.00, I don't know.
01:39:50.320 | I think it's infinitesimal.
01:39:52.600 | - How do we think about that, and what do we do about that?
01:39:55.160 | Do we continue one step at a time?
01:39:57.120 | - Yeah, we muddle through.
01:39:58.640 | We do our best, we realize that nobody's perfect,
01:40:03.480 | and we try to keep advancing,
01:40:07.240 | but we don't spend time saying we're not there,
01:40:12.240 | we're not all the way to the end.
01:40:14.800 | Some mathematicians that would be in the office next to me
01:40:19.320 | when I was in the math department,
01:40:21.360 | they would never think about anything
01:40:23.160 | smaller than countable infinity.
01:40:25.400 | We intersected at countable infinity,
01:40:28.600 | because I rarely got up to countable infinity.
01:40:31.680 | I was always talking about finite stuff.
01:40:33.680 | But even limiting to finite stuff,
01:40:37.160 | which the universe might be,
01:40:42.160 | there's no way to really know
01:40:45.960 | whether the universe isn't just made out of capital N,
01:40:50.960 | whatever units you wanna call them, quarks or whatever,
01:41:00.520 | where capital N is some finite number.
01:41:02.600 | All of the numbers that are comprehensible
01:41:04.400 | are still way smaller than most,
01:41:07.080 | almost all finite numbers.
01:41:08.440 | I got this one paper called "Supernatural Numbers,"
01:41:14.120 | where I guess you probably ran into
01:41:17.240 | something called Knuth arrow notation.
01:41:19.120 | Did you ever run into that?
01:41:21.040 | Anyway, so you take the number,
01:41:22.600 | I think it's like, and I called it super K.
01:41:26.280 | I named it after myself,
01:41:29.040 | but arrow notation is something like 10,
01:41:33.080 | and then four arrows and a three or something like that.
01:41:35.920 | Okay, now, the arrow notation.
01:41:38.600 | If you have no arrows, that means multiplication:
01:41:42.160 | x times y is x plus x plus x, y times.
01:41:47.160 | If you have one arrow, that means exponentiation,
01:41:50.360 | so x one-arrow y means x times x times x, y times,
01:41:53.440 | and x double-arrow y is x to the x to the x, y times.
01:41:56.440 | So I found out, by the way,
01:41:58.040 | that this notation was invented by a guy in 1830,
01:42:03.040 | and he was one of the English nobility
01:42:12.680 | who spent his time thinking about stuff like this.
01:42:15.400 | And it was exactly the same concept that I used arrows,
01:42:21.080 | and he used a slightly different notation.
01:42:23.360 | But anyway, and then this Ackermann's function
01:42:26.480 | is based on the same kind of ideas,
01:42:28.620 | but Ackermann was 1920s.
01:42:31.200 | But anyway, you've got this number 10 quadruple arrow three.
01:42:37.040 | So that says, well, we take 10 to the 10 to the 10
01:42:42.040 | to the 10 to the 10 to the 10th.
01:42:45.280 | How many times do we do that?
01:42:47.120 | Oh, 10 double arrow two times or something.
01:42:49.680 | I mean, how tall is that stack?
01:42:52.520 | But then we do that again,
01:42:54.760 | because that was only 10 quadruple arrow two.
01:42:57.920 | Quadruple arrow three means we have--
01:43:00.000 | - It's gonna be a pretty large number.
01:43:01.600 | - It gets way beyond comprehension, okay?
01:43:06.800 | And so, but it's so small compared
01:43:11.800 | to what finite numbers really are,
01:43:13.720 | because I wanna use four arrows and a 10 and a three.
01:43:17.760 | I mean, let's have that many number of arrows.
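[Editor's aside, not part of the conversation: the arrow hierarchy Knuth describes above can be sketched as a tiny recursion. The function name `arrow` and the zero-arrow base case are my own framing; each extra arrow iterates the previous operation, so even modest inputs like the 10-quadruple-arrow-3 he mentions are hopelessly beyond actual evaluation.]

```python
def arrow(x: int, n: int, y: int) -> int:
    """Knuth's up-arrow notation, x arrow^n y.

    n == 0 is ordinary multiplication; n == 1 is exponentiation;
    n == 2 is the power tower (tetration); and in general each
    added arrow means "apply the (n-1)-arrow operation y times".
    """
    if n == 0:
        return x * y          # base of the ladder: repeated addition
    if y == 0:
        return 1              # empty iteration of any operation
    # x arrow^n y  =  x arrow^(n-1) (x arrow^n (y-1))
    return arrow(x, n - 1, arrow(x, n, y - 1))


# Small sanity checks (anything much larger blows up immediately):
# arrow(3, 1, 3) is 3^3 = 27; arrow(2, 2, 3) is 2^(2^2) = 16;
# arrow(2, 3, 3) is 2 double-arrow 4 = 2^2^2^2 = 65536.
```

The recursion makes Knuth's point concrete: each arrow feeds the entire previous level back in as the iteration count, which is why a number like 10 quadruple-arrow 3 outruns comprehension while still being a perfectly ordinary finite integer.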
01:43:22.760 | - The boundary between infinite and finite
01:43:26.560 | is incomprehensible for us humans anyway.
01:43:29.780 | - Infinity is a good, is a useful way for us
01:43:32.920 | to think about extremely large things.
01:43:37.920 | And we can manipulate it,
01:43:44.040 | but we can never know that the universe
01:43:48.680 | is actually anywhere near that.
01:43:51.840 | So it just, so I realize how little we know.
01:44:02.280 | But we found an awful lot of things
01:44:06.080 | that are too hard for any one person to know,
01:44:12.360 | even in our small universe.
01:44:14.520 | - Yeah, and we did pretty good.
01:44:16.520 | So when you go up to heaven and meet God
01:44:20.800 | and get to ask one question that would get answered,
01:44:25.280 | what question would you ask?
01:44:30.480 | - What kind of browser do you have up here?
01:44:32.680 | (laughing)
01:44:34.920 | No, no, I actually, I don't think it's meaningful
01:44:39.920 | to ask this question, but I certainly hope
01:44:45.560 | we had good internet.
01:44:46.600 | - Okay, on that note, that's beautiful actually.
01:44:53.160 | Don, thank you so much.
01:44:54.000 | It was a huge honor to talk to you.
01:44:55.320 | I really appreciate it.
01:44:56.160 | - Well, thanks for the gamut of questions.
01:44:59.280 | - Yeah, it was fun.
01:45:00.960 | - Thanks for listening to this conversation
01:45:02.760 | with Donald Knuth.
01:45:04.320 | And thank you to our presenting sponsor, Cash App.
01:45:07.560 | Download it, use code LEXPODCAST.
01:45:10.160 | You'll get $10 and $10 will go to FIRST,
01:45:12.960 | a STEM education nonprofit that inspires
01:45:15.360 | hundreds of thousands of young minds
01:45:16.960 | to learn and to dream of engineering our future.
01:45:21.040 | If you enjoy this podcast, subscribe on YouTube,
01:45:23.480 | give it five stars on Apple Podcast,
01:45:25.440 | support it on Patreon, or connect with me on Twitter.
01:45:28.840 | And now let me leave you with some words of wisdom
01:45:31.800 | from Donald Knuth.
01:45:33.000 | We should continually be striving to transform
01:45:36.440 | every art into a science.
01:45:39.120 | And in the process, we advance the art.
01:45:42.320 | Thank you for listening and hope to see you next time.
01:45:46.600 | (upbeat music)