
Charles Isbell: Computing, Interactive AI, and Race in America | Lex Fridman Podcast #135


Chapters

0:00 Introduction
2:36 Top 3 movies of all time
8:45 People are easily predictable
14:27 Breaking out of our bubbles
26:13 Interactive AI
32:45 Lifelong machine learning
41:12 Faculty hiring
48:47 University rankings
56:15 Science communicators
65:39 Hip hop
74:39 Funk
76:03 Computing
91:55 Race
107:59 Cop story
116:20 Racial tensions
125:42 MLK vs Malcolm X
129:03 Will human civilization destroy itself?
133:34 Fear of death and the passing of time


00:00:00.000 | The following is a conversation with Charles Isbell,
00:00:03.120 | Dean of the College of Computing at Georgia Tech,
00:00:06.360 | a researcher and educator
00:00:08.400 | in the field of artificial intelligence,
00:00:10.720 | and someone who deeply thinks about what exactly is
00:00:14.980 | the field of computing and how do we teach it.
00:00:17.880 | He also has a fascinatingly varied set of interests,
00:00:21.440 | including music, books, movies, sports, and history
00:00:25.560 | that make him especially fun to talk with.
00:00:28.240 | When I first saw him speak, his charisma
00:00:30.960 | immediately took over the room,
00:00:32.960 | and I had a stupid excited smile on my face,
00:00:35.640 | and I knew I had to eventually talk to him on this podcast.
00:00:39.400 | Quick mention of each sponsor,
00:00:41.120 | followed by some thoughts related to the episode.
00:00:44.280 | First is Neuro, the maker of functional sugar-free gum
00:00:48.080 | and mints that I use to give my brain a quick caffeine boost.
00:00:52.280 | Second is Decoding Digital,
00:00:54.420 | a podcast on tech and entrepreneurship
00:00:57.080 | that I listen to and enjoy.
00:00:59.240 | Third is Masterclass, online courses that I watch
00:01:02.120 | from some of the most amazing humans in history.
00:01:05.020 | And finally, Cash App, the app I use to send money
00:01:07.840 | to friends for food and drinks.
00:01:10.760 | Please check out these sponsors in the description
00:01:13.080 | to get a discount and to support this podcast.
00:01:16.360 | As a side note, let me say that I'm trying to make it
00:01:18.600 | so that the conversations with Charles, Eric Weinstein,
00:01:22.360 | and Dan Carlin will be published
00:01:24.180 | before Americans vote for president on November 3rd.
00:01:28.280 | There's nothing explicitly political in these conversations,
00:01:31.480 | but they do touch on something in human nature
00:01:34.520 | that I hope can bring context to our difficult time,
00:01:37.880 | and maybe, for a moment, allow us to empathize
00:01:41.240 | with people we disagree with.
00:01:43.260 | With Eric, we talk about the nature of evil.
00:01:45.880 | With Charles, besides AI and music,
00:01:49.240 | we talk a bit about race in America
00:01:51.480 | and how we can bring more love and empathy
00:01:54.380 | to our online communication.
00:01:57.060 | And with Dan Carlin, well, we talk about
00:02:00.340 | Alexander the Great, Genghis Khan, Hitler, Stalin,
00:02:04.820 | and all the complicated parts of human history in between,
00:02:07.980 | with a hopeful eye toward a brighter future
00:02:10.420 | for our humble little civilization here on Earth.
00:02:14.020 | The conversation with Dan will hopefully be posted tomorrow
00:02:17.740 | on Monday, November 2nd.
00:02:20.260 | If you enjoy this thing, subscribe on YouTube,
00:02:22.760 | review it with five stars on Apple Podcasts,
00:02:25.040 | follow on Spotify, support on Patreon,
00:02:27.680 | or connect with me on Twitter @lexfridman.
00:02:31.240 | And now, here's my conversation with Charles Isbell.
00:02:35.120 | You've mentioned that you love movies and TV shows.
00:02:40.040 | Let's ask an easy question,
00:02:43.160 | but you have to be definitively, objectively conclusive.
00:02:46.720 | What's your top three movies of all time?
00:02:49.880 | - So you're asking me to be definitive and to be conclusive.
00:02:52.460 | That's a little hard, I'm gonna tell you why.
00:02:54.460 | It's very simple.
00:02:55.340 | It's because movies is too broad of a category.
00:02:58.340 | I gotta pick sub-genres.
00:02:59.660 | But I will tell you that of those genres,
00:03:02.140 | I'll pick one or two from each of the genres.
00:03:04.180 | I'll get us to three, so I'm not gonna cheat.
00:03:06.060 | So my favorite comedy of all times,
00:03:09.860 | which is probably my favorite movie of all time,
00:03:12.060 | is "His Girl Friday," which is probably a movie
00:03:15.220 | that you've not ever heard of,
00:03:16.540 | but it's based on a play called "The Front Page"
00:03:19.060 | from, I don't know, early 1900s.
00:03:21.660 | And the movie is a fantastic film.
00:03:26.020 | - What's the story?
00:03:26.860 | Is it an independent film?
00:03:28.300 | - No, no, no. - What are we talking about?
00:03:29.700 | - This is one of the movies
00:03:31.180 | that would have been very popular.
00:03:32.260 | It's a screwball comedy.
00:03:33.340 | You ever see "Moonlighting," the TV show?
00:03:35.100 | You know what I'm talking about?
00:03:35.940 | So you've seen these shows where there's a man and a woman,
00:03:38.860 | and they clearly are in love with one another,
00:03:40.380 | and they're constantly fighting
00:03:41.540 | and always talking over each other.
00:03:43.300 | Banter, banter, banter, banter, banter.
00:03:45.300 | This was the movie that started all that
00:03:47.540 | as far as I'm concerned.
00:03:48.980 | It's very much of its time, so it's, I don't know,
00:03:52.340 | must have come out sometime between 1934 and 1939.
00:03:55.260 | I'm not sure exactly when the movie itself came out.
00:03:58.300 | It's black and white.
00:03:59.740 | It's just a fantastic film.
00:04:01.500 | It is hilarious.
00:04:02.420 | - So it's mostly conversation?
00:04:04.380 | - Not entirely, but mostly, mostly.
00:04:06.260 | Just a lot of back and forth.
00:04:07.620 | There's a story there.
00:04:08.940 | Someone's on death row, and they're newspapermen,
00:04:13.940 | including her.
00:04:14.900 | They're all newspapermen.
00:04:16.380 | They were divorced, the editor, the publisher, I guess,
00:04:19.260 | and the reporter, they were divorced.
00:04:22.540 | But they clearly, he's thinking,
00:04:24.260 | trying to get back together,
00:04:25.580 | and there's this whole other thing that's going on.
00:04:27.300 | But none of that matters.
00:04:28.140 | The plot doesn't matter.
00:04:28.980 | - Yeah, it's just a little play in conversation.
00:04:31.660 | - It's fantastic, and I just love everything
00:04:33.980 | about the conversation, because at the end of the day,
00:04:35.860 | sort of narrative and conversation
00:04:37.180 | are the sort of things that drive me,
00:04:38.340 | and so I really like that movie for that reason.
00:04:41.380 | Similarly, I'm now gonna cheat,
00:04:43.140 | and I'm gonna give you two movies as one,
00:04:45.180 | and they're Crouching Tiger, Hidden Dragon, and John Wick.
00:04:49.040 | Both relatively modern.
00:04:50.900 | John Wick, of course, is--
00:04:51.740 | - One, two, or three.
00:04:52.700 | - One.
00:04:53.540 | It gets increasingly, I love them all for different reasons,
00:04:56.220 | and increasingly more ridiculous.
00:04:57.660 | Kind of like loving Alien and Aliens,
00:04:59.660 | despite the fact they're two completely different movies.
00:05:01.780 | But the reason I put Crouching Tiger, Hidden Dragon,
00:05:04.420 | and John Wick together is 'cause I actually think
00:05:06.580 | they're the same movie, or what I like about them,
00:05:08.820 | the same movie, which is both of them create a world
00:05:13.660 | that you're coming in the middle of,
00:05:15.620 | and they don't explain it to you.
00:05:17.740 | But the story is done so well that you pick it up.
00:05:20.100 | So anyone who's seen John Wick, you know,
00:05:22.140 | you have these little coins, and they're headed out,
00:05:24.660 | and there are these rules, and apparently,
00:05:26.340 | every single person in New York City is an assassin.
00:05:29.140 | There's like two people who come through who aren't,
00:05:30.740 | but otherwise they are.
00:05:31.580 | But there's this complicated world,
00:05:33.180 | and everyone knows each other.
00:05:34.220 | They don't sit down and explain it to you,
00:05:35.300 | but you figure it out.
00:05:36.140 | Crouching Tiger, Hidden Dragon's a lot like that.
00:05:38.180 | You get the feeling that this is chapter nine
00:05:40.020 | of a 10-part story, and you've missed
00:05:42.100 | the first eight chapters, and they're not gonna
00:05:43.940 | explain it to you, but there's this sort of
00:05:45.300 | rich world behind you.
00:05:46.140 | - But you get pulled in anyway, like immediately.
00:05:47.460 | - You get pulled in anyway.
00:05:48.300 | So it's just excellent storytelling in both cases,
00:05:50.940 | and very, very different.
00:05:51.780 | - And also you like the outfit, I assume,
00:05:53.460 | the John Wick outfit.
00:05:54.660 | - Oh yeah, of course, of course, yes.
00:05:56.660 | I think John Wick outfit's perfect.
00:05:58.140 | And so that's number two, and then--
00:05:59.860 | - But sorry to pause on that, martial arts,
00:06:02.100 | you have a long list of hobbies.
00:06:04.020 | Like it scrolls off the page, but I didn't see
00:06:06.460 | martial arts as one of them.
00:06:07.820 | - I do not do martial arts, but I certainly--
00:06:09.540 | - Watch it. - Watch martial arts.
00:06:10.380 | Oh, I appreciate it very much.
00:06:11.420 | - Oh, we could talk about every Jackie Chan movie
00:06:13.300 | ever made, and I would be on board with that.
00:06:15.820 | - Rush Hour 2, like that kind of,
00:06:17.660 | the comedy of it, cop.
00:06:18.940 | - Yes, yes.
00:06:20.100 | By the way, my favorite Jackie Chan movie
00:06:21.980 | would be Drunken Master 2, known in the States
00:06:26.140 | usually as Legend of the Drunken Master.
00:06:28.140 | Actually, Drunken Master, the first one,
00:06:30.780 | is the first kung fu movie I ever saw,
00:06:33.500 | but I did not know that.
00:06:35.020 | - First Jackie Chan movie?
00:06:36.140 | - No, first one ever that I saw and remember,
00:06:38.140 | but I had no idea--
00:06:39.660 | - What is this?
00:06:40.500 | - I didn't know what it was, and I didn't know
00:06:41.340 | that was Jackie Chan.
00:06:42.180 | That was like his first major movie.
00:06:44.180 | I was a kid, it was done in the '70s.
00:06:46.460 | I only later rediscovered that that was actually--
00:06:49.820 | - And he creates his own martial art by,
00:06:52.500 | was he actually drinking, or was he play drinking?
00:06:57.500 | - You mean as an actor, or--
00:06:59.660 | - No. (laughing)
00:07:00.980 | I'm sure as an actor he was--
00:07:02.700 | - No, he was-- - It was the '70s,
00:07:03.820 | or whatever.
00:07:04.660 | - He was definitely drinking, and in the end,
00:07:07.620 | he drinks industrial-grade alcohol.
00:07:09.620 | - Ah, yeah.
00:07:10.660 | - Yeah, and has one of the most fantastic fights ever
00:07:13.980 | in that sub-genre.
00:07:15.220 | Anyway, that's my favorite one of his movies,
00:07:17.180 | but I'll tell you the last movie.
00:07:19.540 | It's actually a movie called Nothing But a Man,
00:07:21.660 | which is a 1960s film that starred Ivan Dixon,
00:07:26.380 | who you'll know from Hogan's Heroes, and Abbey Lincoln.
00:07:31.380 | It's just a really small little drama.
00:07:34.140 | It's a beautiful story, but my favorite scenes,
00:07:36.500 | I'm cheating, my favorite,
00:07:38.700 | one of my favorite movies just for the ending
00:07:41.380 | is The Godfather.
00:07:42.780 | I think the last scene of that is just fantastic.
00:07:45.660 | It's the whole movie all summarized in just eight,
00:07:48.100 | nine seconds.
00:07:48.940 | - Godfather Part One?
00:07:49.760 | - Part One.
00:07:50.600 | - How does it end?
00:07:51.500 | I don't think you can, you need to worry about spoilers
00:07:54.020 | if you haven't seen The Godfather.
00:07:55.100 | - Spoiler alert.
00:07:55.940 | It ends with the wife coming to Michael,
00:08:00.700 | and he says, "Just this once,
00:08:02.740 | "I'll let you ask me my business."
00:08:04.180 | And she asks him if he did this terrible thing,
00:08:06.620 | and he looks her in the eye, and he lies,
00:08:07.940 | and he says, "No," and she says, "Thank you,"
00:08:10.180 | and she walks out the door,
00:08:12.620 | and you see him, you see her going out of the door,
00:08:17.620 | and all these people are coming in,
00:08:20.100 | and they're kissing Michael's hands, and Godfather.
00:08:23.420 | And then the camera switches perspectives,
00:08:25.580 | so instead of looking at him, you're looking at her,
00:08:28.360 | and the door closes in her face,
00:08:31.540 | and that's the end of the movie,
00:08:32.460 | and that's the whole movie right there.
00:08:34.620 | - Do you see parallels between that
00:08:36.060 | and your position as dean at Georgia Tech, Charles?
00:08:38.380 | - Ha!
00:08:39.220 | Just kidding, trick question.
00:08:40.380 | (laughing)
00:08:42.140 | Sometimes, certainly if the door gets closed
00:08:43.860 | on me every once in a while.
00:08:45.540 | - Okay, that was a rhetorical question.
00:08:48.120 | You've also mentioned that you, I think,
00:08:50.960 | enjoy all kinds of experiments, including on yourself,
00:08:54.160 | but I saw a video where you said you did an experiment
00:08:56.980 | where you tracked all kinds of information about yourself,
00:09:00.020 | and a few others, sort of wiring up your home,
00:09:05.540 | and this little idea that you mentioned in that video,
00:09:09.060 | which is kind of interesting,
00:09:10.180 | that you thought that two days' worth of data
00:09:14.660 | is enough to capture the majority of the behavior
00:09:17.420 | of the human being.
00:09:18.420 | First, can you describe what the heck you did
00:09:23.140 | to collect all that data, 'cause it's fascinating,
00:09:25.220 | just like little details of how you collect that data,
00:09:27.820 | and also what your intuition behind the two days is.
00:09:31.080 | - So, first off, it has to be the right two days,
00:09:32.540 | but I was thinking of a very specific experiment.
00:09:34.940 | There's actually a suite of them that I've been a part of,
00:09:36.980 | and other people have done this, of course.
00:09:38.140 | I just sort of dabbled in that part of the world.
00:09:40.460 | But to be very clear, the specific thing
00:09:41.980 | that I was talking about had to do with recording
00:09:45.100 | all the IR going on in my, infrared going on in my house.
00:09:48.820 | So this is a long time ago, so everything's being controlled
00:09:52.020 | by pressing buttons on remote controls,
00:09:54.340 | as opposed to speaking to Alexa, or Siri,
00:09:56.900 | or someone like that.
00:09:58.020 | And I was just trying to figure out
00:09:59.480 | if you could get enough data on people
00:10:00.900 | to figure out what they were gonna do
00:10:02.660 | with their TVs or their lights.
00:10:04.200 | My house was completely wired up at the time.
00:10:06.460 | But you know, what, I'm about to look at a movie,
00:10:09.500 | or I'm about to turn on the TV, or whatever,
00:10:10.900 | and just see what I could predict from it.
00:10:12.940 | It was kind of surprising, it shouldn't have been.
00:10:16.500 | But that's all very easy to do, by the way,
00:10:18.340 | just capturing all the little stuff.
00:10:19.580 | I mean, it's a bunch of computer systems.
00:10:20.700 | It's really easy to capture,
00:10:21.540 | if you know what you're looking for.
00:10:23.180 | At Georgia Tech, long before I got there,
00:10:24.620 | we had this thing called the Aware Home,
00:10:26.540 | where everything was wired up,
00:10:27.780 | and you saw, you captured everything that was going on.
00:10:29.780 | Nothing even difficult, not with video,
00:10:31.500 | or anything like that, just the way that the system
00:10:33.580 | was just capturing everything.
00:10:35.700 | So it turns out that, and I did this with myself,
00:10:39.780 | and then I had students, and they worked with
00:10:41.540 | many other people, and it turns out at the end of the day,
00:10:44.500 | people do the same things over and over and over again.
00:10:48.120 | So it has to be the right two days, like a weekend,
00:10:50.100 | but it turns out, not only can you predict
00:10:52.720 | what someone's going to do next,
00:10:53.820 | at the level of what button they're gonna press next
00:10:55.660 | on a remote control, but you can do it
00:10:58.980 | with something really, really simple,
00:11:00.740 | like a, you don't even need a hidden Markov model,
00:11:02.740 | it's like a Markov chain, just simply, I press this,
00:11:05.220 | this is my prediction of the next thing.
00:11:06.420 | And it turns out, you get 93% accuracy,
00:11:09.900 | just by doing something very simple and stupid,
00:11:11.900 | and just counting statistics.
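
A minimal sketch of the kind of "counting statistics" predictor described above, written for illustration rather than taken from the actual experiment: the button names and event stream are invented, and the model is just a table of how often each button follows each other button, predicting the most frequent successor.

```python
from collections import Counter, defaultdict

# Minimal "counting statistics" predictor: for each button, count which button
# tends to follow it, and predict the most frequent successor.
class NextButtonPredictor:
    def __init__(self):
        self.counts = defaultdict(Counter)  # counts[prev][next] = frequency
        self.prev = None

    def observe(self, button):
        if self.prev is not None:
            self.counts[self.prev][button] += 1
        self.prev = button

    def predict(self, button):
        successors = self.counts[button]
        return successors.most_common(1)[0][0] if successors else None

# Hypothetical stream of remote-control events over a couple of days.
events = ["power", "guide", "channel_up", "channel_up", "volume_up",
          "power", "guide", "channel_up", "channel_up", "mute"]

model = NextButtonPredictor()
for e in events:
    model.observe(e)

print(model.predict("guide"))  # -> "channel_up"
```
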
00:11:13.780 | But what was actually more interesting,
00:11:15.220 | is that you could use that information.
00:11:16.580 | This comes up again and again in my work.
00:11:18.520 | If you try to represent people or objects
00:11:21.780 | by the things they do, the things you can measure about them
00:11:25.060 | that have to do with action in the world,
00:11:27.500 | so distribution over actions, and you try to represent them
00:11:30.420 | by the distribution of actions that are done on them,
00:11:34.740 | then you do a pretty good job of sort of
00:11:37.020 | understanding how people are,
00:11:38.380 | and they cluster remarkably well, in fact, irritatingly so.
00:11:43.220 | And so, by clustering people this way, you can,
00:11:47.260 | maybe, you know, I got the 93% accuracy
00:11:49.620 | of what's the next button you're gonna press,
00:11:51.340 | but I can get 99% accuracy, or somewhere thereabouts,
00:11:54.620 | on the collections of things you might press.
00:11:56.460 | And it turns out, the things that you might press
00:11:58.340 | are all related to each other in exactly
00:12:00.700 | the way that you would expect.
00:12:01.540 | So, for example, all the numbers on a keypad,
00:12:05.900 | it turns out, all have the same behavior,
00:12:07.420 | with respect to you as a human being.
00:12:09.140 | And so, you would naturally cluster them together,
00:12:11.420 | and you discover that numbers are all related
00:12:15.140 | to one another in some way, and all these other things.
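
The clustering idea can be sketched in the same toy setting: represent each button by the distribution of what tends to get pressed right after it, and group buttons whose distributions look alike. Everything here, the event stream, the similarity threshold, and the greedy grouping, is an invented stand-in, not the method actually used in that work.

```python
import numpy as np
from collections import Counter, defaultdict

# Each button gets a vector of how often each other button follows it,
# and buttons with similar vectors get grouped together.
events = ["1", "2", "enter", "guide", "3", "2", "enter",
          "volume_up", "volume_up", "mute", "1", "2", "enter"]

succ = defaultdict(Counter)
for prev, nxt in zip(events, events[1:]):
    succ[prev][nxt] += 1

buttons = sorted(set(events))
idx = {b: i for i, b in enumerate(buttons)}

def profile(b):
    # Normalized distribution over which button follows b.
    v = np.zeros(len(buttons))
    for nxt, c in succ[b].items():
        v[idx[nxt]] = c
    return v / v.sum() if v.sum() else v

def cosine(a, b):
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return float(a @ b / (na * nb)) if na and nb else 0.0

# Greedy grouping: join the first group whose exemplar you resemble closely
# enough, otherwise start a new group.
groups = []
for b in buttons:
    for g in groups:
        if cosine(profile(b), profile(g[0])) > 0.8:
            g.append(b)
            break
    else:
        groups.append([b])

print(groups)  # e.g. "1" and "3" land in the same group: the same thing follows them
```
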
00:12:17.100 | And then, and here's the part that I think's important.
00:12:19.700 | I mean, you can see this in all kinds of things.
00:12:22.060 | Every individual is different,
00:12:25.060 | but any given individual is remarkably predictable,
00:12:28.460 | because you keep doing the same things over and over again.
00:12:30.740 | And the two things that I've learned,
00:12:32.460 | in the long time that I've been thinking about this,
00:12:34.100 | is people are easily predictable,
00:12:36.720 | and people hate when you tell them
00:12:37.900 | that they're easily predictable.
00:12:39.580 | But they are, and there you go.
00:12:42.100 | - What about, let me play devil's advocate,
00:12:45.140 | and philosophically speaking, is it possible to say
00:12:48.540 | that what defines humans is the outlier?
00:12:51.580 | So, even though some large percentage of our behaviors,
00:12:55.560 | whatever the signal we measure is the same,
00:12:57.660 | and it would cluster nicely,
00:12:59.380 | but maybe it's the special moments
00:13:01.420 | of when we break out of the routine,
00:13:03.960 | that are the definitive things,
00:13:05.680 | and the way we break out of that routine
00:13:07.540 | for each one of us might be different?
00:13:09.260 | - It's possible.
00:13:10.100 | I would say it a little differently, I think.
00:13:12.660 | I would make two things.
00:13:13.660 | One is, I'm gonna disagree with the premise, I think,
00:13:16.500 | but that's fine.
00:13:17.340 | I think the way I would put it is,
00:13:21.480 | there are people who are very different
00:13:23.140 | from lots of other people, but they're not 0%.
00:13:25.920 | They're closer to 10%, right?
00:13:27.560 | So, in fact, even if you do this kind of clustering
00:13:29.180 | of people, that'll turn out to be
00:13:30.580 | this small number of people.
00:13:31.700 | They all behave like each other,
00:13:33.420 | even if they individually behave very differently
00:13:35.500 | from everyone else.
00:13:37.180 | So, I think that's kind of important.
00:13:38.780 | But what you're really asking, I think,
00:13:40.300 | and I think this is really a question,
00:13:41.620 | is what do you do when you're faced
00:13:44.140 | with a situation you've never seen before?
00:13:46.580 | What do you do when you're faced
00:13:47.580 | with an extraordinary situation
00:13:48.860 | maybe you've seen others do,
00:13:49.900 | and you're actually forced to do something?
00:13:51.520 | You react to that very differently,
00:13:52.720 | and that is the thing that makes you human.
00:13:54.000 | I would agree with that, at least at a philosophical level,
00:13:56.520 | that it's the times when you are faced
00:13:59.800 | with something difficult, a decision that you have to make,
00:14:04.180 | where the answer isn't easy,
00:14:05.760 | even if you know what the right answer is,
00:14:07.680 | that's sort of what defines you as the individual,
00:14:09.560 | and I think what defines people broadly.
00:14:11.840 | It's the hard problem, it's not the easy problem.
00:14:13.440 | It's the thing that's gonna hurt you.
00:14:14.500 | It's not the thing.
00:14:15.440 | It's not even that it's difficult.
00:14:17.520 | It's just that you know that the outcome
00:14:19.700 | is going to be highly suboptimal for you,
00:14:22.300 | and I do think that that's a reasonable place to start
00:14:25.380 | for the question of what makes us human.
00:14:28.140 | - So before we talk about, sort of explore
00:14:30.100 | the different ideas underlying
00:14:31.780 | interactive artificial intelligence,
00:14:33.300 | which we are working on, let me just go along this thread
00:14:36.940 | to skip to kind of our world of social media,
00:14:39.660 | which is something that,
00:14:41.660 | at least on the artificial intelligence side,
00:14:43.500 | you think about there's a popular narrative.
00:14:47.060 | I don't know if it's true,
00:14:48.680 | but that we have these silos in social media,
00:14:53.480 | and we have these clusterings, as you're kind of mentioning,
00:14:56.600 | and the idea is that, along that narrative,
00:15:00.840 | is that we wanna break each other out of those silos
00:15:05.840 | so we can be empathetic to other people.
00:15:11.000 | If you're a Democrat, you'd be empathetic to the Republican.
00:15:14.160 | If you're a Republican, you're empathetic to a Democrat.
00:15:16.420 | Those are just two silly bins
00:15:18.480 | that we seem to be very excited about,
00:15:21.000 | but there's other binnings that we can think about.
00:15:24.120 | Is there, from an artificial intelligence perspective,
00:15:28.120 | 'cause you're just saying we cluster along the data,
00:15:30.920 | but then interactive artificial intelligence
00:15:33.680 | is referring to throwing agents into that mix,
00:15:36.680 | AI systems in that mix,
00:15:38.400 | helping us, interacting with us humans,
00:15:41.100 | and maybe getting us out of those silos.
00:15:43.900 | Is that something that you think is possible?
00:15:46.800 | Do you see a hopeful possibility
00:15:50.240 | for artificial intelligence systems
00:15:52.160 | in these large networks of people
00:15:55.320 | to get us outside of our habits,
00:15:58.320 | in at least the idea space,
00:16:00.200 | to where we can sort of be empathetic
00:16:04.240 | to other people's lived experiences,
00:16:07.900 | other people's points of view, all that kind of stuff?
00:16:11.440 | - Yes, and I actually don't think it's that hard.
00:16:13.320 | Well, it's not hard in this sense.
00:16:15.300 | So imagine that you can,
00:16:17.180 | now let's make life simple for a minute.
00:16:20.600 | Let's assume that you can do a kind of partial ordering
00:16:23.660 | over ideas or clusterings of behavior.
00:16:27.080 | It doesn't even matter what I mean here,
00:16:28.920 | so long as there's some way that this is a cluster,
00:16:30.840 | this is a cluster, there's some edge between them, right?
00:16:33.600 | They don't quite touch even, or maybe they come very close.
00:16:36.020 | If you can imagine that conceptually,
00:16:38.620 | then the way you get from here to here
00:16:40.040 | is not by going from here to here.
00:16:41.100 | The way you get from here to here is you find the edge
00:16:43.020 | and you move slowly together, right?
00:16:44.800 | And I think that machines are actually very good
00:16:46.560 | at that sort of thing, once we can kind of define the problem
00:16:48.720 | either in terms of behavior or ideas or words or whatever.
00:16:51.420 | So it's easy in the sense that
00:16:52.980 | if you already have the network
00:16:54.600 | and you know the relationships,
00:16:56.000 | the edges and sort of the strings on them,
00:16:57.400 | and you kind of have some semantic meaning for them,
00:17:00.720 | the machine doesn't have to, you do as the designer,
00:17:03.800 | then yeah, I think you can kind of move people along
00:17:05.680 | and sort of expand them.
00:17:06.920 | But it's harder than that.
00:17:07.760 | And the reason it's harder than that,
00:17:09.560 | or sort of coming up with the network structure itself
00:17:12.960 | is hard is because I'm gonna tell you a story
00:17:14.500 | that someone else told me, and I don't,
00:17:17.460 | I may get some of the details a little bit wrong,
00:17:19.140 | but it's roughly, it roughly goes like this.
00:17:22.280 | You take two sets of people from the same backgrounds
00:17:25.740 | and you want them to solve a problem.
00:17:27.820 | So you separate them up, which we do all the time,
00:17:29.740 | right, oh, you know, we're gonna break out in the,
00:17:31.380 | we're gonna break out groups,
00:17:32.220 | you're gonna go over there and you're gonna talk about this,
00:17:33.540 | you're gonna go over there and you're gonna talk about this.
00:17:35.120 | And then you have them sort of in this big room,
00:17:37.700 | but far apart from one another,
00:17:38.740 | and you have them sort of interact with one another.
00:17:41.060 | When they come back to talk about what they learned,
00:17:43.440 | you wanna merge what they've done together,
00:17:45.080 | it can be extremely hard because they don't,
00:17:48.160 | they basically don't speak the same language anymore.
00:17:50.080 | Like when you create these problems and you dive into them,
00:17:52.200 | you create your own language.
00:17:53.680 | So the example this one person gave me,
00:17:55.280 | which I found kind of interesting
00:17:57.080 | 'cause we were in the middle of that at the time,
00:17:58.560 | was they're sitting over there
00:18:00.960 | and they're talking about these rooms that you can see,
00:18:03.560 | but you're seeing them from different vantage points,
00:18:05.480 | depending upon what side of the room you're on.
00:18:07.520 | They can see a clock very easily.
00:18:09.840 | And so they start referring to the room
00:18:11.220 | as the one with the clock.
00:18:12.580 | This group over here, looking at the same room,
00:18:15.540 | they can see the clock,
00:18:16.420 | but it's not in their line of sight or whatever,
00:18:18.420 | so they end up referring to it by some other way.
00:18:22.700 | When they get back together
00:18:23.620 | and they're talking about things,
00:18:25.240 | they're referring to the same room
00:18:26.720 | and they don't even realize they're referring
00:18:27.980 | to the same room.
00:18:28.820 | And in fact, this group doesn't even see
00:18:30.060 | that there's a clock there
00:18:30.900 | and this group doesn't see whatever it is.
00:18:32.180 | The clock on the wall is the thing that stuck with me.
00:18:33.980 | So if you create these different silos,
00:18:35.620 | the problem isn't that the ideologies disagree,
00:18:38.340 | it's that you're using the same words
00:18:40.940 | and they mean radically different things.
00:18:42.800 | The hard part is just getting them to agree on the,
00:18:45.720 | well, maybe we'd say the axioms in our world, right?
00:18:48.880 | But just get them to agree on some basic definitions.
00:18:52.560 | Because right now they're talking past each other,
00:18:55.020 | just completely talking past each other.
00:18:56.680 | That's the hard part, getting them to meet,
00:18:58.700 | getting them to interact, that may not be that difficult.
00:19:01.360 | Getting them to see where their language
00:19:04.180 | is leading them to lead past one another,
00:19:05.760 | that's the hard part.
00:19:07.480 | - It's a really interesting question to me.
00:19:09.160 | It could be on the layer of language,
00:19:10.380 | but it feels like there's multiple layers to this.
00:19:12.520 | Like it could be worldview, it could be,
00:19:14.520 | I mean, it all boils down to empathy,
00:19:16.000 | being able to put yourself in the shoes of the other person,
00:19:19.560 | to learn the language, to learn visually
00:19:22.960 | how they see the world, to learn the,
00:19:27.680 | I mean, I experience this now with trolls,
00:19:30.960 | the degree of humor in that world.
00:19:33.500 | For example, I talk about love a lot.
00:19:35.040 | I'm very lucky to have this amazing community
00:19:39.240 | of loving people, but whenever I encounter trolls,
00:19:42.200 | they always roll their eyes at the idea of love
00:19:45.040 | because it's so quote unquote cringe.
00:19:47.360 | - Yeah.
00:19:48.200 | - So they show love by derision, I would say.
00:19:53.200 | And I think about, on the human level,
00:19:58.200 | that's a whole 'nother discussion,
00:19:59.280 | that's psychology, that's sociology, so on.
00:20:01.800 | But I wonder if AI systems can help somehow
00:20:06.080 | and bridge the gap of what is this person's life like?
00:20:11.080 | Encourage me to just ask that question,
00:20:15.280 | to put myself in their shoes,
00:20:17.280 | to experience the agitations, the fears,
00:20:20.280 | the hopes they have, to experience,
00:20:23.420 | even just to think about what was their upbringing like,
00:20:28.440 | like having a single parent home
00:20:31.440 | or a shitty education or all those kinds of things,
00:20:35.400 | just to put myself in that mind space.
00:20:38.440 | It feels like that's really important
00:20:40.480 | for us to bring those clusters together,
00:20:44.200 | to find that similar language,
00:20:45.760 | but it's unclear how AI can help that
00:20:48.240 | because it seems like AI systems need
00:20:49.880 | to understand both parties first.
00:20:52.360 | - So the word 'understand' there is doing a lot of work, right?
00:20:55.040 | - Yeah, yes.
00:20:56.160 | - Do you have to understand it
00:20:57.120 | or do you just simply have to note
00:20:58.960 | that there is something similar as a point to touch, right?
00:21:03.320 | So you use the word empathy,
00:21:06.320 | and I like that word for a lot of reasons.
00:21:08.560 | I think you're right in the way that you're using
00:21:10.200 | and the way that you're describing it,
00:21:11.080 | but let's separate it from sympathy, right?
00:21:13.080 | So sympathy is feeling sort of for someone.
00:21:17.060 | Empathy is kind of understanding where they're coming from
00:21:19.240 | and how they feel, right?
00:21:20.260 | And for most people, those things go hand in hand.
00:21:23.960 | For some people, some are very good at empathy
00:21:25.940 | and very bad at sympathy.
00:21:28.040 | Some people cannot experience,
00:21:29.840 | well, my observation would be, I'm not a psychologist,
00:21:32.320 | my observation would be that some people
00:21:34.280 | seem incapable of feeling sympathy
00:21:36.080 | unless they feel empathy first.
00:21:38.080 | You can understand someone,
00:21:39.320 | understand where they're coming from and still think,
00:21:40.920 | "No, I can't support that," right?
00:21:43.520 | It doesn't mean that the only way,
00:21:45.840 | because if that isn't the case,
00:21:47.640 | then what it requires is that you must,
00:21:51.600 | the only way that you can,
00:21:52.840 | to understand someone means you must agree
00:21:54.820 | with everything that they do.
00:21:56.840 | Which isn't right, right?
00:21:58.880 | And if the only way I can feel for someone
00:22:02.000 | is to completely understand them
00:22:03.520 | and make them like me in some way,
00:22:06.240 | well, then we're lost, right?
00:22:07.800 | Because we're not all exactly like each other.
00:22:09.880 | I don't have to understand everything
00:22:10.880 | that you've gone through.
00:22:11.700 | It helps, clearly.
00:22:12.920 | But they're separable ideas, right?
00:22:14.360 | Even though they get clearly tangled up in one another.
00:22:16.920 | So what I think AI could help you do, actually,
00:22:19.440 | is if, and I'm being quite fanciful, as it were,
00:22:23.180 | but if you think of these as kind of,
00:22:25.280 | I understand how you interact,
00:22:26.660 | the words that you use, the actions you take,
00:22:28.880 | I have some way of doing this,
00:22:29.720 | let's not worry about what that is,
00:22:31.960 | but I can see you as a kind of distribution
00:22:34.320 | of experiences and actions taken upon you,
00:22:36.880 | things you've done, and so on.
00:22:38.040 | And I can do this with someone else,
00:22:39.660 | and I can find the places where there's
00:22:41.680 | some kind of commonality, a mapping, as it were,
00:22:44.700 | even if it's not total.
00:22:46.120 | If I think of it as distribution, right,
00:22:48.520 | then I can take the cosine of the angle between you,
00:22:51.100 | and if it's zero, you've got nothing in common.
00:22:53.840 | If it's one, you're completely the same person.
00:22:56.000 | Well, you're probably not one,
00:22:59.240 | you're almost certainly not zero.
00:23:00.520 | If I can find the place where there's the overlap,
00:23:02.580 | then I might be able to introduce you on that basis,
00:23:04.560 | or connect you in that way,
00:23:07.740 | and make it easier for you to take that step of empathy.
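
A toy version of the "cosine of the angle" idea described here: treat each person as a distribution over things they do or talk about, compute the cosine similarity between the two distributions, and surface the shared items as the basis for an introduction. The people and topic counts below are made up.

```python
from collections import Counter
from math import sqrt

# Each person as a distribution over topics or actions (invented counts).
alice = Counter({"hip_hop": 12, "ml_papers": 30, "hiking": 5, "movies": 8})
bob   = Counter({"country":  9, "ml_papers":  4, "movies": 20, "fishing": 6})

def cosine(p, q):
    # 0 means nothing in common, 1 means identical distributions.
    dot = sum(p[k] * q[k] for k in p.keys() & q.keys())
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

def common_ground(p, q, top=3):
    # Shared interests, ranked by how much weight both people put on them.
    shared = p.keys() & q.keys()
    return sorted(shared, key=lambda k: min(p[k], q[k]), reverse=True)[:top]

print(cosine(alice, bob))        # somewhere between 0 and 1
print(common_ground(alice, bob)) # e.g. ['movies', 'ml_papers']
```
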
00:23:10.880 | It's not impossible to do,
00:23:14.520 | although I wonder if it requires that everyone involved
00:23:18.840 | is at least interested in asking the question.
00:23:21.320 | So maybe the hard part is just getting them interested
00:23:23.040 | in asking the question.
00:23:23.880 | In fact, maybe if you can get them to ask the question,
00:23:25.560 | how are we more alike than we are different,
00:23:27.400 | they'll solve it themselves.
00:23:28.400 | Maybe that's the problem that AI should be working on,
00:23:30.280 | not telling you how you're similar or different,
00:23:32.820 | but just getting you to decide
00:23:34.200 | that it's worthwhile asking the question.
00:23:35.800 | - So, no.
00:23:36.640 | - It feels like an economist's answer, actually.
00:23:39.120 | - Well, people, okay, first of all,
00:23:40.360 | people like it when I disagree.
00:23:42.020 | So let me disagree slightly,
00:23:43.760 | which is I think everything you said is brilliant,
00:23:46.040 | but I tend to believe, philosophically speaking,
00:23:49.840 | that people are interested underneath it all,
00:23:52.640 | and I would say that AI,
00:23:54.400 | the possibility that an AI system
00:23:58.640 | would show the commonality is incredible.
00:24:01.400 | That's a really good starting point.
00:24:02.720 | I would say if on social media,
00:24:05.940 | I could discover the common things,
00:24:09.160 | deep or shallow, between me and a person
00:24:12.000 | who there's tension with,
00:24:14.240 | I think that my basic human nature
00:24:17.080 | would take over from there,
00:24:18.400 | and I think enjoy that commonality
00:24:22.520 | and there's something sticky about that
00:24:25.320 | that my mind will linger on,
00:24:27.040 | and that person in my mind will become warmer and warmer,
00:24:30.880 | and I'll start to feel more and more compassion towards them.
00:24:34.240 | I think for majority of the population, that's true,
00:24:37.560 | but that's a hypothesis.
00:24:39.600 | - Yeah, I mean, it's an empirical question, right?
00:24:41.320 | You'd have to figure it out.
00:24:42.840 | I mean, I want to believe you're right,
00:24:44.200 | and so I'm gonna say that I think you're right.
00:24:46.600 | Of course, some people come to those things
00:24:49.440 | for the purpose of trolling, right?
00:24:52.300 | And it doesn't matter.
00:24:53.140 | They're playing a different game.
00:24:54.880 | - Yeah.
00:24:55.720 | - But I don't know.
00:24:56.540 | My experience is it requires two things.
00:24:59.000 | It requires, in fact, maybe this is really,
00:25:00.960 | at the end, what you're saying,
00:25:01.800 | and I do agree with this for sure.
00:25:03.840 | So it's hard to hold onto that kind of anger
00:25:08.840 | or to hold onto just the desire
00:25:13.400 | to humiliate someone for that long.
00:25:14.880 | It's just difficult to do.
00:25:15.920 | It takes a toll on you,
00:25:17.800 | but more importantly, we know this,
00:25:20.080 | both from people having done studies on it,
00:25:21.960 | but also from our own experiences,
00:25:23.700 | that it is much easier to be dismissive of a person
00:25:27.100 | if they're not in front of you, if they're not real, right?
00:25:29.840 | So much of the history of the world
00:25:33.020 | is about making people other, right?
00:25:35.180 | So if you're on social media, if you're on the web,
00:25:37.760 | if you're doing whatever on the internet,
00:25:39.680 | being forced to deal with someone as a person,
00:25:42.760 | some equivalent to being in the same room,
00:25:46.040 | makes a huge difference, 'cause then you're,
00:25:47.520 | one, you're forced to deal with their humanity
00:25:49.280 | 'cause it's in front of you.
00:25:50.280 | The other is, of course, that they might punch you
00:25:52.460 | in the face if you go too far,
00:25:53.540 | so both of those things kind of work together,
00:25:55.800 | I think, to the right end.
00:25:57.360 | So I think bringing people together
00:25:59.880 | is really a kind of substitute for forcing them
00:26:04.020 | to see the humanity in another person
00:26:06.120 | and to not be able to treat them as bits.
00:26:08.020 | It's hard to troll someone
00:26:09.080 | when you're looking them in the eye.
00:26:10.540 | This is very difficult to do.
00:26:12.760 | - Agreed.
00:26:13.800 | Your broad set of research interests
00:26:15.340 | fall under interactive AI, as I mentioned,
00:26:19.140 | which is a fascinating set of ideas,
00:26:21.460 | and you have some concrete things
00:26:23.240 | that you're particularly interested in,
00:26:25.520 | but maybe could you talk about how you think
00:26:28.760 | about the field of interactive artificial intelligence?
00:26:31.920 | - Sure.
00:26:32.760 | So let me say up front that if you look at,
00:26:34.960 | certainly my early work,
00:26:35.800 | but even if you look at most of it,
00:26:37.600 | I'm a machine learning guy.
00:26:38.920 | Right, I do machine learning.
00:26:40.680 | First paper I ever published was in NIPS.
00:26:43.360 | Back then it was NIPS, now it's NeurIPS.
00:26:45.640 | It's a long story there.
00:26:46.800 | Anyway, that's another thing.
00:26:48.380 | So I'm a machine learning guy.
00:26:49.220 | I believe in data, I believe in statistics
00:26:50.940 | and all those kind of things.
00:26:51.780 | - Yes.
00:26:52.600 | - And the reason I'm bringing that up
00:26:53.440 | is even though I'm a newfangled
00:26:54.900 | statistical machine learning guy
00:26:56.580 | and have been for a very long time,
00:26:58.100 | the problem I really care about is AI.
00:27:00.500 | I care about artificial intelligence.
00:27:01.980 | I care about building some kind of intelligent artifact,
00:27:05.820 | however that gets expressed,
00:27:07.900 | that would be at least as intelligent as humans
00:27:12.060 | and as interesting as humans,
00:27:13.580 | perhaps in their own way.
00:27:16.660 | - So that's the deep underlying love and dream
00:27:18.840 | is the bigger AI.
00:27:20.800 | - Yes.
00:27:21.640 | - It's the bigger, whatever the heck that is.
00:27:22.840 | - Yeah, the machine learning in some ways
00:27:24.080 | is a means to the end.
00:27:25.840 | It is not the end.
00:27:26.880 | And I don't understand how one could be intelligent
00:27:29.480 | without learning, so therefore I gotta figure out
00:27:31.240 | how to do that, right?
00:27:32.320 | So that's important.
00:27:33.160 | But machine learning, by the way, is also a tool.
00:27:35.440 | I said statistical because that's what most people
00:27:37.520 | think of themselves, machine learning people.
00:27:39.200 | That's how they think.
00:27:40.040 | I think Pat Langley might disagree,
00:27:41.440 | or at least 1980s Pat Langley might disagree
00:27:44.520 | with what it takes to do machine learning.
00:27:46.820 | But I care about the AI problem,
00:27:48.460 | which is why it's interactive AI,
00:27:49.620 | not just interactive ML.
00:27:50.980 | I think it's important to understand
00:27:52.620 | that there's a long-term goal here,
00:27:54.500 | which I will probably never live to see,
00:27:56.160 | but I would love to have been a part of,
00:27:58.140 | which is building something truly intelligent
00:28:00.740 | outside of ourselves.
00:28:02.380 | - Can we take a tiny tangent?
00:28:03.900 | - Sure.
00:28:04.740 | - Or am I interrupting?
00:28:05.580 | Which is, is there something you can say concrete
00:28:10.100 | about the mysterious gap between the subset ML
00:28:14.240 | and the bigger AI?
00:28:15.980 | What's missing?
00:28:16.900 | What do you think?
00:28:18.220 | I mean, obviously it's totally unknown,
00:28:21.040 | not totally, but in part unknown at this time,
00:28:23.220 | but is it something like what Pat Langley's,
00:28:25.700 | is it knowledge, like expert system reasoning
00:28:28.460 | type of kind of thing?
00:28:29.820 | - So AI is bigger than ML, but ML is bigger than AI.
00:28:33.060 | This is kind of the real problem here,
00:28:35.620 | is that they're really overlapping things
00:28:36.980 | that are really interested in slightly different problems.
00:28:39.540 | I tend to think of ML, and there are many people out there
00:28:41.500 | are gonna be very upset at me about this,
00:28:42.700 | but I tend to think of ML being much more concerned
00:28:44.600 | with the engineering of solving a problem.
00:28:46.320 | I'm an AI about the sort of more philosophical goal
00:28:48.880 | of true intelligence, and that's the thing
00:28:50.960 | that motivates me, even if I end up finding myself
00:28:53.480 | living in this kind of engineering-ish space.
00:28:56.120 | I've now made Michael Jordan upset,
00:28:57.840 | but you know, it's, to me, they just feel very different.
00:29:01.640 | You're just measuring them differently.
00:29:03.400 | Your sort of goals of where you're trying to be
00:29:06.360 | are somewhat different, but to me, AI is about
00:29:08.600 | trying to build that intelligent thing.
00:29:10.520 | And typically, but not always, for the purpose
00:29:14.500 | of understanding ourselves a little bit better.
00:29:16.820 | Machine learning is, I think, trying to solve the problem,
00:29:19.860 | whatever that problem is.
00:29:21.020 | Now, that's my take.
00:29:22.140 | Others, of course, would disagree.
00:29:23.520 | - So on that note, so with the interactive AI,
00:29:26.860 | do you tend to, in your mind, visualize AI
00:29:29.060 | as a singular system, or is it as a collective,
00:29:32.260 | huge amount of systems interacting with each other?
00:29:34.340 | Like, is the social interaction of us humans
00:29:38.380 | and of AI systems fundamental to intelligence?
00:29:41.920 | - I think, well, it's certainly fundamental
00:29:43.240 | to our kind of intelligence, right?
00:29:44.960 | And I actually think it matters quite a bit.
00:29:46.800 | So the reason the interactive AI part matters to me
00:29:51.040 | is because I don't, this is gonna sound simple,
00:29:55.660 | but I don't care whether a tree makes a sound
00:30:00.180 | when it falls and there's no one around,
00:30:01.720 | because I don't think it matters, right?
00:30:03.720 | If there's no observer, in some sense.
00:30:05.320 | And I think what's interesting about the way
00:30:06.800 | that we're intelligent is we're intelligent
00:30:09.780 | with other people, right, or other things, anyway.
00:30:12.960 | And we go out of our way to make other things intelligent.
00:30:15.600 | We're hardwired to find intention,
00:30:17.600 | even whether there is no intention.
00:30:19.120 | Why, I mean, anthropomorphize everything.
00:30:21.000 | I think, anyway.
00:30:21.840 | I think the interactive AI part is being intelligent
00:30:25.280 | in and of myself in isolation is a meaningless act,
00:30:28.840 | in some sense.
00:30:30.520 | The correct answer is you have to be intelligent
00:30:32.520 | in the way that you interact with others.
00:30:33.520 | That's also efficient, because it allows you
00:30:35.440 | to learn faster, because you can import from past history.
00:30:39.760 | It also allows you to be efficient
00:30:41.160 | in the transmission of that.
00:30:42.440 | So we ask ourselves about me.
00:30:44.920 | Am I intelligent?
00:30:45.800 | Clearly, I think so.
00:30:47.600 | But I'm also intelligent as a part of a larger species
00:30:50.300 | and group of people, and we're trying to move
00:30:52.040 | the species forward as well.
00:30:53.520 | And so I think that notion of being intelligent
00:30:56.320 | with others is kind of the key thing,
00:30:57.920 | because otherwise you come and you go,
00:30:59.840 | and then it doesn't matter.
00:31:01.520 | And so that's why I care about that aspect of it.
00:31:04.600 | And it has lots of other implications.
00:31:07.320 | One is not just building something intelligent with others,
00:31:10.320 | but understanding that you can't always communicate
00:31:12.360 | with those others.
00:31:13.180 | They have been in a room where there's a clock on the wall
00:31:15.480 | that you haven't seen, which means you have to spend
00:31:17.240 | an enormous amount of time communicating
00:31:18.720 | with one another constantly in order to figure out
00:31:21.920 | what each other wants.
00:31:24.000 | So, I mean, this is why people project, right?
00:31:25.620 | You project your own intentions and your own reasons
00:31:28.740 | for doing things onto others as a way of understanding them
00:31:31.040 | so that you know how to behave.
00:31:32.960 | But by the way, you, completely predictable person,
00:31:35.920 | I don't know how you're predictable,
00:31:37.040 | I don't know you well enough, but you probably eat
00:31:38.880 | the same five things over and over again,
00:31:40.400 | or whatever it is that you do, right?
00:31:41.840 | I know I do.
00:31:43.200 | If I'm going to a new Chinese restaurant,
00:31:44.720 | I will get General Gao's chicken,
00:31:45.840 | because that's the thing that's easy to get.
00:31:47.740 | I will get hot and sour soup.
00:31:49.840 | People do the things that they do,
00:31:51.120 | but other people get the chicken and broccoli.
00:31:53.800 | I think I can push this analogy way too far.
00:31:55.840 | The chicken and broccoli--
00:31:56.680 | - I don't know what's wrong with those people.
00:31:57.840 | - I don't know what's wrong with them either.
00:32:00.200 | - That's not good.
00:32:01.040 | - We have all had our trauma.
00:32:02.520 | So they get their chicken and broccoli
00:32:03.720 | and their egg drop soup or whatever.
00:32:06.000 | We got to communicate, and it's going to change, right?
00:32:08.440 | So it's not, interactive AI is not just about
00:32:11.960 | learning to solve a problem or a task.
00:32:14.040 | It's about having to adapt that over time,
00:32:16.440 | over a very long period of time,
00:32:17.720 | and interacting with other people,
00:32:19.620 | who will themselves change.
00:32:20.880 | This is what we mean about things like adaptable models,
00:32:22.840 | right, that you have to have a model,
00:32:24.200 | that model's going to change.
00:32:25.160 | And by the way, it's not just the case
00:32:26.500 | that you're different from that person,
00:32:28.040 | but you're different from the person you were 15 minutes ago,
00:32:30.320 | or certainly 15 years ago,
00:32:31.920 | and I have to assume that you're at least going to drift.
00:32:34.760 | Hopefully not too many discontinuities,
00:32:36.320 | but you're going to drift over time.
00:32:38.040 | And I have to have some mechanism for adapting to that
00:32:42.360 | as you and an individual over time,
00:32:43.680 | and across individuals over time.
00:32:46.040 | - On the topic of adaptive modeling,
00:32:48.020 | and you talk about lifelong learning,
00:32:51.000 | which is, I think, a topic that's understudied,
00:32:56.000 | or maybe because nobody knows what to do with it.
00:32:59.520 | But if you look at Alexa,
00:33:01.520 | or most of our artificial intelligence systems
00:33:04.360 | that are primarily machine learning based systems,
00:33:07.000 | or dialogue systems, all those kinds of things,
00:33:08.960 | they know very little about you,
00:33:12.360 | in the sense of the lifelong learning sense,
00:33:15.640 | that we learn, as humans,
00:33:19.760 | we learn a lot about each other,
00:33:22.080 | not in the quantity of facts,
00:33:24.880 | but the temporally rich set of information
00:33:29.880 | that seems to pick up the crumbs along the way
00:33:34.360 | that somehow seems to capture a person pretty well.
00:33:37.080 | Do you have any ideas how to do lifelong learning?
00:33:42.080 | Because it seems like most of the machine learning community
00:33:44.560 | does not.
00:33:45.480 | - No, well, by the way,
00:33:46.320 | not only does the machine learning community
00:33:47.320 | not spend a lot of time on lifelong learning,
00:33:49.840 | I don't think they spend a lot of time on learning, period,
00:33:52.760 | in the sense that they tend to be very task-focused.
00:33:55.120 | Everybody is over-fitting to whatever problem
00:33:57.520 | is they happen to have.
00:33:58.360 | They're over-engineering their solutions to the task.
00:34:01.480 | Even the people, and I think these people do,
00:34:03.720 | are trying to solve a hard problem of transfer learning,
00:34:05.920 | right, I'm gonna learn on one task,
00:34:07.080 | then learn the other task.
00:34:08.040 | You still end up creating the task.
00:34:09.680 | You know, it's like looking for your keys
00:34:11.060 | where the light is, 'cause that's where the light is, right?
00:34:12.700 | It's not because the keys have to be there.
00:34:14.920 | I mean, one could argue that we tend to do this in general.
00:34:18.400 | We tend to kind of do it as a group.
00:34:20.600 | We tend to hill climb and get stuck in local optima.
00:34:23.280 | And I think we do this in the small as well.
00:34:26.360 | I think it's very hard to do.
00:34:28.360 | Because, so, look, here's the hard thing about AI, right?
00:34:32.240 | The hard thing about AI is it keeps changing on us, right?
00:34:34.600 | You know, what is AI?
00:34:35.440 | AI is the art and science of making computers act
00:34:38.520 | the way they do in the movies, right?
00:34:39.840 | That's what it is, right?
00:34:40.680 | (laughing)
00:34:41.500 | - That's a good definition.
00:34:42.340 | - But beyond that, it's--
00:34:43.920 | - And they keep coming out with new movies.
00:34:45.280 | - Yes, and they just, right, exactly.
00:34:47.300 | We are driven by this kind of need
00:34:49.700 | to the sort of ineffable quality of who we are.
00:34:52.980 | Which means that the moment you understand something
00:34:55.660 | is no longer AI, right?
00:34:57.140 | Well, we understand this.
00:34:58.060 | That's just, you take the derivative and you divide by two
00:35:00.300 | and then you average it out over time in the window.
00:35:02.500 | So therefore, that's no longer AI.
00:35:03.940 | So the problem is unsolvable
00:35:05.140 | because it keeps kind of going away.
00:35:07.020 | This creates a kind of illusion,
00:35:08.020 | which I don't think is an entire illusion,
00:35:09.440 | of either there's very simple task-based things
00:35:11.340 | you can do very well and over-engineer.
00:35:13.700 | There's all of AI, and there's like nothing in the middle.
00:35:16.560 | Like it's very hard to get from here to here,
00:35:18.740 | and it's very hard to see how to get from here to here.
00:35:21.740 | And I don't think that we've done a very good job of it
00:35:24.220 | because we get stuck trying to solve the small problem
00:35:27.020 | that's in front of us, myself included.
00:35:28.140 | I'm not gonna pretend that I'm better at this
00:35:30.340 | than anyone else.
00:35:31.300 | And of course, all the incentives in academia
00:35:34.340 | and in industry are set to make that very hard
00:35:37.140 | 'cause you have to get the next paper out,
00:35:38.960 | you have to get the next product out,
00:35:40.460 | you have to solve this problem,
00:35:41.820 | and it's very sort of naturally incremental.
00:35:43.740 | And none of the incentives are set up
00:35:47.000 | to allow you to take a huge risk
00:35:48.860 | unless you're already so well-established
00:35:51.120 | you can take that big risk.
00:35:52.460 | And if you're that well-established
00:35:54.880 | that you can take that big risk,
00:35:56.280 | then you've probably spent much of your career
00:35:57.840 | taking these little risks, relatively speaking.
00:35:59.920 | And so you have got a lifetime of experience
00:36:02.720 | telling you not to take that particular big risk, right?
00:36:04.760 | So the whole system's set up to make progress very slow.
00:36:07.240 | That's fine.
00:36:08.180 | It's just the way it is.
00:36:09.320 | But it does make this gap seem really big,
00:36:11.160 | which is my long way of saying
00:36:12.620 | I don't have a great answer to it
00:36:14.020 | except that stop doing n equals one.
00:36:17.820 | At least try to get n equal two and maybe n equal seven
00:36:20.820 | so that you can say I'm gonna,
00:36:22.100 | or maybe T is a better variable here.
00:36:24.060 | I'm gonna not just solve this problem,
00:36:25.500 | I'm gonna solve this problem and another problem.
00:36:26.900 | I'm not gonna learn just on you.
00:36:28.300 | I'm gonna keep living out there in the world
00:36:30.180 | and just seeing what happens
00:36:31.580 | and that we'll learn something as designers
00:36:33.180 | and our machine learning algorithms
00:36:34.900 | and our AI algorithms can learn as well.
00:36:37.060 | But unless you're willing to build a system
00:36:39.700 | which you're gonna have live for months at a time
00:36:42.160 | in an environment that is messy and chaotic
00:36:44.100 | you cannot control,
00:36:45.740 | then you're never going to make progress in that direction.
00:36:48.420 | So I guess my answer to you is yes.
00:36:50.300 | My idea is that you should, it's not no.
00:36:52.380 | It's yes, you should be deploying these things
00:36:55.740 | and making them live for months at a time
00:36:58.740 | and be okay with the fact
00:36:59.940 | that it's gonna take you five years to do this.
00:37:01.780 | Not rerunning the same experiment over and over again
00:37:04.060 | and refining the machine
00:37:05.600 | so it's slightly better at whatever,
00:37:07.340 | but actually having it out there
00:37:08.900 | and living in the chaos of the world
00:37:11.580 | and seeing what its learning algorithm, say, can learn,
00:37:14.800 | what data structure it can build
00:37:16.060 | and how it can go from there.
00:37:17.360 | Without that, you're gonna be stuck ultimately.
00:37:19.800 | - What do you think about the possibility of N equals one
00:37:23.840 | growing, it's probably a crude approximation,
00:37:27.240 | but growing like if we look at language models like GPT-3,
00:37:31.040 | if you just make it big enough, it'll swallow the world.
00:37:34.360 | Meaning like it'll solve your T-to-infinity problem
00:37:37.680 | just by growing in size.
00:37:41.120 | Taking the small over-engineered solution
00:37:43.820 | and just pumping it full of steroids in terms of compute,
00:37:47.300 | in terms of size of training data
00:37:49.820 | and the Yann LeCun style self-supervised
00:37:52.340 | or OpenAI self-supervised.
00:37:54.240 | Just throw all of YouTube at it
00:37:57.860 | and it will learn how to reason, how to paint,
00:38:01.880 | how to create music, how to love,
00:38:04.300 | all of that by watching YouTube videos.
00:38:06.180 | - I mean, I can't think of a more terrifying world
00:38:08.340 | to live in than a world that is based on YouTube videos,
00:38:10.880 | but yeah, I think the answer,
00:38:13.100 | I just kind of don't think that'll quite,
00:38:15.280 | well, it won't work that easily.
00:38:16.880 | You will get somewhere and you will learn something,
00:38:20.100 | which means it's probably worth it,
00:38:21.700 | but you won't get there.
00:38:23.580 | You won't solve the, you know, here's the thing.
00:38:25.980 | We build these things and we say we want them to learn,
00:38:30.700 | but what actually happens, and let's say they do learn.
00:38:33.420 | I mean, certainly in every paper I've gotten published
00:38:35.220 | the things learn, I don't know about anyone else,
00:38:37.500 | but they actually change us, right?
00:38:39.820 | We react to it differently, right?
00:38:41.280 | So we keep redefining what it means to be successful,
00:38:44.280 | both in the negative case,
00:38:45.780 | but also in the positive, in that,
00:38:47.060 | oh, well, this is an accomplishment.
00:38:50.160 | I'll give you an example,
00:38:51.980 | which is like the one you just described with YouTube.
00:38:53.580 | Let's get completely out of machine learning.
00:38:55.340 | Well, not completely, but mostly out of machine learning.
00:38:57.460 | Think about Google.
00:38:58.420 | People were trying to solve information retrieval,
00:39:02.180 | the ad hoc information retrieval problem forever.
00:39:04.420 | I mean, first major book I ever read about it was what,
00:39:07.600 | '71, I think was when it came out.
00:39:09.760 | Anyway, it's, you know, we'll treat everything as a vector
00:39:12.600 | and we'll do these vector space models and whatever.
00:39:15.360 | And that was all great.
00:39:16.560 | And we made very little progress.
00:39:18.920 | I mean, we made some progress.
00:39:20.160 | And then Google comes and makes the ad hoc problem
00:39:23.680 | seem pretty easy.
00:39:24.560 | I mean, it's not, there's lots of computers
00:39:26.000 | and databases involved, but you know,
00:39:27.960 | and there's some brilliant algorithmic stuff behind it too,
00:39:31.240 | and some systems building.
00:39:32.540 | But the problem changed, right?
00:39:36.540 | If you've got a world that's that connected
00:39:40.100 | so that you have, you know, there are 10 million answers
00:39:43.140 | quite literally to the question that you're asking,
00:39:46.540 | then the problem wasn't give me the things that are
00:39:49.620 | relevant, the problem is don't give me anything
00:39:51.580 | that's irrelevant, at least in the first page,
00:39:53.900 | because nothing else matters.
00:39:56.260 | So Google is not solving the information retrieval problem,
00:39:59.920 | at least not on this webpage.
00:40:01.700 | Google is minimizing false positives,
00:40:05.940 | which is not the same thing as getting an answer.
00:40:08.180 | It turns out it's good enough for what it is
00:40:10.000 | we wanna use Google for, but it also changes
00:40:12.300 | what the problem was we thought we were trying to solve
00:40:14.680 | in the first place.
00:40:15.520 | You thought you were trying to find the answer,
00:40:17.420 | but you're not;
00:40:19.060 | it turns out you're just trying to find an answer.
00:40:21.100 | Now, yes, it is true, it's also very good
00:40:22.580 | at finding you exactly that webpage.
00:40:24.220 | Of course, you trained yourself to figure out
00:40:25.980 | what the keywords were to get you that webpage.
00:40:29.060 | But in the end, by having that much data,
00:40:31.980 | you've just changed the problem into something else.
00:40:33.820 | You haven't actually learned what you set out to learn.
00:40:35.620 | Now, the counter to that would be
00:40:37.180 | maybe we're not doing that either, we just think we are,
00:40:40.120 | because we're in our own heads, maybe we're learning
00:40:43.180 | the wrong problem in the first place,
00:40:44.420 | but I don't think that matters.
00:40:45.520 | I think the point is that Google has not solved
00:40:48.560 | information retrieval; Google has done an amazing service.
00:40:51.180 | I have nothing bad to say about what they've done.
00:40:52.820 | Lord knows my entire life is better because Google exists,
00:40:55.820 | and if it weren't for Google Maps, I don't think I'd have ever
00:40:57.320 | found this place.
00:40:58.160 | (laughing)
00:40:59.220 | - Where is this?
00:41:00.060 | - Like 95, I see 110 and I see, but where did 95 go?
00:41:04.360 | So I'm very grateful for Google, but they just have
00:41:08.580 | to make certain the first five things are right.
00:41:11.260 | And everything after that is wrong.
00:41:12.780 | Look, we're going off on a totally different topic here,
00:41:15.580 | but think about the way we hire faculty.
00:41:19.780 | It's exactly the same thing.
00:41:21.460 | - Are you getting controversial?
00:41:22.860 | - I'm not getting controversial.
00:41:24.500 | It's exactly the same problem, right?
00:41:27.420 | It's minimizing false positives.
00:41:29.500 | We say things like we want to find the best person
00:41:34.280 | to be an assistant professor at MIT
00:41:36.280 | in the new College of Computing,
00:41:39.280 | which I will point out was founded 30 years after
00:41:42.440 | the College of Computing I'm a part of.
00:41:44.480 | Both of my alma mater--
00:41:46.480 | - No more fighting words.
00:41:47.480 | - I'm just saying I appreciate all that they did
00:41:50.000 | and all that they're doing.
00:41:51.400 | Anyway, so we're gonna try to hire the best person
00:41:57.300 | to be the best professor.
00:41:58.120 | That's what we say, the best person for this job.
00:41:59.620 | But that's not what we do at all, right?
00:42:02.540 | Do you know which percentage of faculty in the top four
00:42:06.020 | earn their PhDs from the top four?
00:42:07.720 | Say in 2017, which is the most recent year
00:42:12.000 | for which I have data.
00:42:12.840 | - Maybe a large percentage.
00:42:14.260 | - About 60%.
00:42:15.100 | - 60.
00:42:15.940 | - 60% of the faculty in the top four earn their PhDs
00:42:18.140 | in the top four.
00:42:18.980 | This is computer science for which there is no top five.
00:42:21.220 | There's only a top four, right?
00:42:22.280 | 'Cause they're all tied for one.
00:42:23.460 | - For people who don't know, by the way,
00:42:24.700 | that would be MIT, Stanford, Berkeley, CMU.
00:42:27.220 | - Yep.
00:42:28.060 | - Georgia Tech.
00:42:30.780 | - Number eight.
00:42:31.620 | - Number eight, you're keeping track.
00:42:34.260 | - Oh yes, it's a large part of my job.
00:42:35.700 | Number five is Illinois.
00:42:37.140 | Number six is a tie with UW and Cornell.
00:42:40.560 | And Princeton and Georgia Tech are tied for eight
00:42:42.460 | and UT Austin is number 10.
00:42:43.960 | Michigan's number 11, by the way.
00:42:47.180 | So if you look at the top 10,
00:42:49.020 | you know what percentage of faculty in the top 10
00:42:51.660 | earn their PhDs from the top 10?
00:42:54.100 | 65, roughly, 65%.
00:42:57.120 | If you look at the top 55 ranked departments,
00:43:00.540 | 50% of the faculty earn their PhDs from the top 10.
00:43:04.300 | There's no universe in which all the best faculty,
00:43:09.300 | even just for R1 universities,
00:43:12.420 | the majority of them come from 10 places.
00:43:14.580 | There's just no way that's true,
00:43:15.420 | especially when you consider how small
00:43:16.980 | some of those universities are
00:43:18.140 | in terms of the number of PhDs they produce.
00:43:20.580 | Now, that's not a negative.
00:43:21.980 | I mean, it is a negative.
00:43:23.060 | It also has a habit of entrenching
00:43:24.940 | certain historical inequities and accidents.
00:43:28.640 | But what it tells you is, well, ask yourself the question,
00:43:32.920 | why is it like that?
00:43:36.020 | Well, because it's easier.
00:43:37.320 | If we go all the way back to the 1980s,
00:43:41.160 | you know, there was a saying that, you know,
00:43:42.520 | nobody ever lost his job buying a computer from IBM.
00:43:45.720 | And it was true.
00:43:47.080 | And nobody ever lost their job hiring a PhD from MIT, right?
00:43:51.920 | If the person turned out to be terrible, well, you know,
00:43:54.200 | they came from MIT, what did you expect me to know?
00:43:56.060 | However, that same person coming from,
00:43:58.880 | pick whichever is your least favorite place
00:44:01.900 | that produces PhDs in say, computer science,
00:44:04.320 | well, you took a risk, right?
00:44:06.240 | So all the incentives,
00:44:07.400 | particularly because you're only gonna hire one this year,
00:44:09.240 | well, now we're hiring 10, but you know,
00:44:10.920 | you're only gonna hire one or two or three this year.
00:44:12.920 | And by the way, when they come in,
00:44:14.240 | you're stuck with them for at least seven years
00:44:16.120 | in most places, because that's before you know
00:44:17.600 | whether they're getting tenure or not.
00:44:18.560 | And if they get tenure, you're stuck with them
00:44:19.960 | for a good 30 years, unless they decide to leave.
00:44:22.240 | That means the pressure to get this right is very high.
00:44:23.960 | So what are you gonna do?
00:44:24.800 | You're gonna minimize false positives.
00:44:27.820 | You don't care about saying no inappropriately.
00:44:30.480 | You only care about saying yes inappropriately.
00:44:33.280 | So all the pressure drives you
00:44:34.840 | into that particular direction.
00:44:36.340 | Google, not to put too fine a point on it,
00:44:40.000 | was in exactly the same situation with their search.
00:44:42.000 | It turns out you just don't wanna give people
00:44:43.560 | the wrong page in the first three or four pages.
00:44:47.120 | And if there's 10 million right answers
00:44:48.940 | and a hundred bazillion wrong answers,
00:44:50.680 | just make certain the wrong answers don't get up there.
00:44:52.280 | And who cares if the right answer
00:44:54.160 | was actually the 13th page?
00:44:55.880 | A right answer, a satisficing answer,
00:44:58.440 | is number one, two, three, or four.
00:44:59.800 | So who cares?
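To make that concrete, here is a minimal sketch in Python, purely illustrative and not anything Google actually runs: when millions of documents count as acceptable answers, the useful score is precision over the first handful of results, that is, how well false positives are kept off the first page, rather than recall over everything relevant.

```python
def precision_at_k(ranked_ids, is_relevant, k=5):
    """Fraction of the top-k results judged relevant.

    With ten million acceptable answers, recall is beside the point;
    what matters is keeping false positives off the first page.
    """
    return sum(1 for doc in ranked_ids[:k] if is_relevant(doc)) / k

def is_acceptable(doc_id):
    # Toy relevance test: pretend any of ten million documents would do.
    return doc_id < 10_000_000

ranking = [3, 42, 17, 10_000_001, 7, 99]            # one irrelevant doc at rank 4
print(precision_at_k(ranking, is_acceptable, k=5))  # 0.8: the false positive is what hurts
```

Under this view, whether the single "best" document appears at all is beside the point, as long as a satisficing one sits in the first few slots.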
00:45:00.680 | - Or an answer that will make you discover
00:45:02.240 | something beautiful, profound to your question.
00:45:05.440 | - Well, that's a different problem, right?
00:45:06.960 | - But isn't that the problem?
00:45:08.240 | Can we linger on this topic
00:45:09.800 | without sort of walking with grace?
00:45:14.800 | How do we get, for hiring faculty,
00:45:19.440 | how do we get that 13th page with a truly special person?
00:45:24.440 | I mean, it depends on the department.
00:45:27.440 | Computer science probably has those kinds of people.
00:45:32.140 | Like you have the Russian guy, Grigori Perelman.
00:45:35.120 | Just these awkward, strange minds
00:45:40.600 | that don't know how to play the little game of etiquette
00:45:44.240 | that faculty have all agreed somehow converged
00:45:48.320 | over the decades how to play with each other.
00:45:50.600 | And also is not, on top of that,
00:45:53.480 | is not from the top four, top whatever numbers, the schools.
00:45:57.520 | And maybe actually just says F-you every once in a while
00:46:01.080 | to the traditions of old
00:46:04.400 | within the computer science community.
00:46:06.000 | Maybe talks trash about how machine learning
00:46:09.200 | is a total waste of time.
00:46:10.680 | And that's there on their resume.
00:46:13.760 | So how do you allow the system
00:46:17.680 | to give those folks a chance?
00:46:19.680 | - Well, you have to be willing to take a certain kind of,
00:46:21.000 | without taking a particular position
00:46:22.200 | on any particular person,
00:46:23.120 | you'd have to take, you have to be willing to take risk.
00:46:26.640 | A small amount of risk.
00:46:27.480 | I mean, if we were treating this as a,
00:46:29.640 | well, as a machine learning problem,
00:46:31.480 | as a search problem, which is what it is,
00:46:32.760 | it's a search problem.
00:46:33.600 | If we were treating it that way,
00:46:34.640 | you would say, oh, well, the main thing is you want,
00:46:36.560 | you know, you've got a prior,
00:46:38.080 | you want some data 'cause I'm Bayesian.
00:46:40.120 | If you don't wanna do it that way,
00:46:41.280 | we'll just inject some randomness in and it'll be okay.
00:46:44.760 | The problem is that feels very, very hard to do with people.
00:46:48.600 | All the incentives are wrong there,
00:46:51.800 | but it turns out, and let's say,
00:46:53.760 | let's say that's the right answer.
00:46:54.640 | Let's just give, for the sake of argument,
00:46:56.440 | that, you know, injecting randomness in the system
00:46:58.600 | at that level for who you hire is just not worth doing
00:47:01.840 | because the price is too high or the cost is too high.
00:47:04.320 | If we had infinite resources, sure, but we don't.
00:47:05.760 | And also you've gotta teach people.
00:47:07.000 | So, you know, you're ruining other people's lives
00:47:08.800 | if you get it too wrong.
00:47:10.840 | But we've taken that principle, even if I grant it,
00:47:14.400 | and pushed it all the way back, right?
00:47:17.240 | So we could have a better pool than we have
00:47:22.240 | of people we look at and give an opportunity to.
00:47:24.960 | If we do that, then we have a better chance of finding that.
00:47:28.320 | Of course, that just pushes the problem back
00:47:30.280 | another level, but let me tell you something else.
00:47:32.200 | You know, I did a sort of study, I call it a study.
00:47:35.080 | I called up eight of my friends and asked them
00:47:36.400 | for all of their data for graduate admissions,
00:47:38.280 | but then someone else followed up and did an actual study.
00:47:41.320 | And it turns out that I can tell you
00:47:43.360 | how everybody gets into grad school, more or less.
00:47:46.440 | More or less.
00:47:48.060 | You basically admit everyone from places
00:47:49.640 | higher ranked than you.
00:47:50.720 | You admit most people from places ranked around you,
00:47:52.600 | and you admit almost no one from places ranked below you,
00:47:54.600 | with the exception of the small liberal arts colleges
00:47:56.560 | that aren't ranked at all, like Harvey Mudd,
00:47:58.160 | 'cause they don't, they don't have PhDs,
00:47:59.280 | so they aren't ranked.
00:48:00.320 | This is all CS.
00:48:01.160 | Which means the decision of whether, you know,
00:48:06.200 | you become a professor at Cornell
00:48:09.360 | was determined when you were 17, right?
00:48:11.760 | By what you knew to go to undergrad to do whatever, right?
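The admissions pattern he is describing can be written down as a toy rule of thumb; the probabilities below are invented purely to show the shape of the rule and are not taken from his informal survey or from the follow-up study.

```python
def admit_probability(applicant_rank, program_rank):
    """Crude sketch of the described pattern (lower number = higher ranked).

    Programs admit nearly everyone from higher-ranked places, many from
    roughly peer places, and almost no one from places ranked well below.
    The numbers are illustrative guesses, not measured values.
    """
    if applicant_rank < program_rank:              # from a higher-ranked place
        return 0.9
    if abs(applicant_rank - program_rank) <= 5:    # roughly a peer institution
        return 0.5
    return 0.05                                    # ranked well below

for applicant_rank in (2, 12, 40):
    print(applicant_rank, admit_probability(applicant_rank, program_rank=10))
```

The point of writing it out is only that the rule keys entirely on rank, which is why the decision is effectively made at seventeen.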
00:48:15.720 | So if we can push these things back a little bit
00:48:17.920 | and just make the pool a little bit bigger,
00:48:19.440 | at least you raise the probability
00:48:20.680 | that you will be able to see someone interesting
00:48:23.120 | and take the risk.
00:48:25.320 | The other answer to that question, by the way,
00:48:28.200 | which you could argue is the same thing,
00:48:30.000 | is you either adjust the pool so the probabilities go up,
00:48:32.520 | which is a way of injecting a little bit of
00:48:36.040 | uniform noise in the system, as it were,
00:48:38.080 | or you change your loss function.
00:48:40.240 | You just let yourself be measured by something
00:48:42.840 | other than whatever it is
00:48:44.200 | that we're measuring ourselves by now.
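As a hedged sketch of what "inject a little uniform noise" could look like in practice, here is an epsilon-greedy style selector over a hypothetical candidate pool; the names, scores, and epsilon value are all made up for illustration. Most of the time it follows whatever prior you already trust, and a small fraction of the time it picks uniformly from the whole pool, so candidates the prior undervalues at least get looked at.

```python
import random

def select_candidate(pool, prior_score, epsilon=0.1, rng=random):
    """With probability epsilon pick uniformly at random from the pool;
    otherwise take the candidate the existing prior scores highest.
    This keeps the prior but guarantees occasional exploration."""
    if rng.random() < epsilon:
        return rng.choice(pool)            # explore: uniform over the whole pool
    return max(pool, key=prior_score)      # exploit: trust the prior

# Hypothetical pool; the scores stand in for a pedigree-driven ranking.
pool = ["A", "B", "C", "D"]
scores = {"A": 0.9, "B": 0.7, "C": 0.4, "D": 0.2}
picks = [select_candidate(pool, scores.get, epsilon=0.2) for _ in range(1000)]
print({c: picks.count(c) for c in pool})   # mostly "A", but everyone shows up sometimes
```

Changing the loss function is the complementary move: keep the same pool, but score outcomes by something other than the current ranking-driven measure.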
00:48:46.640 | I mean, US News and World Report,
00:48:48.680 | every time they change their formula
00:48:50.600 | for determining rankings,
00:48:51.840 | move entire universities to behave differently,
00:48:54.860 | because rankings matter.
00:48:56.440 | - Can you talk trash about those rankings for a second?
00:48:59.520 | No, I'm joking about talking trash.
00:49:01.280 | I actually, it's so funny how, from my perspective,
00:49:05.400 | from a very shallow perspective,
00:49:07.320 | how dogmatic, like how much I trust those rankings.
00:49:11.560 | They're almost ingrained in my head.
00:49:14.040 | I mean, at MIT, everybody kind of,
00:49:16.820 | it's a propagated, mutually agreed upon idea
00:49:22.880 | that those rankings matter.
00:49:25.080 | And I don't think anyone knows what they're,
00:49:27.040 | like most people don't know what they're based on.
00:49:30.320 | And what are they exactly based on
00:49:33.040 | and what are the flaws in that?
00:49:34.600 | - Well, so it depends on which rankings you're talking about.
00:49:38.200 | Do you wanna talk about computer science
00:49:39.320 | or you wanna talk about universities?
00:49:40.280 | - Computer science, US News, isn't that the main one?
00:49:43.480 | - Yeah, it's US News.
00:49:44.320 | The only one that matters is US News.
00:49:45.440 | Nothing else matters.
00:49:46.480 | Sorry, csrankings.org, but nothing else matters
00:49:48.800 | but US News.
00:49:50.200 | So US News has formula that it uses for many things,
00:49:54.320 | but not for computer science,
00:49:55.280 | because computer science is considered a science,
00:49:57.600 | which is absurd.
00:49:58.480 | So the rankings for computer science is 100% reputation.
00:50:04.480 | So two people at each department,
00:50:09.160 | it's not really a department, whatever,
00:50:10.280 | at each department basically rank everybody.
00:50:14.000 | Slightly more complicated than that.
00:50:15.320 | But whatever, they rank everyone.
00:50:16.960 | And then those things are put together
00:50:18.240 | and somehow-- - Oh no.
00:50:20.000 | So that means, how do you improve reputation?
00:50:22.120 | How do you move up and down the space of reputation?
00:50:25.760 | - Yes, that's exactly the question.
00:50:27.320 | - Twitter?
00:50:28.160 | (laughing)
00:50:29.000 | - It can help.
00:50:29.840 | I can tell you how Georgia Tech did it,
00:50:31.960 | or at least how I think Georgia Tech did it,
00:50:33.120 | because Georgia Tech is actually the case to look at.
00:50:36.360 | Not just because I'm at Georgia Tech,
00:50:37.640 | but because Georgia Tech is the only computing unit
00:50:40.440 | that was not in the top 20 that has made it into the top 10.
00:50:43.120 | It's also the only one in the last two decades, I think,
00:50:46.720 | that moved up in the top 10,
00:50:50.960 | as opposed to having someone else move down.
00:50:53.280 | So we used to be number 10,
00:50:54.920 | and then we became number nine,
00:50:56.160 | because UT Austin went down slightly
00:50:58.480 | and now we were tied for ninth,
00:50:59.560 | 'cause that's how rankings work.
00:51:01.400 | And we moved from nine to eight,
00:51:03.040 | because our raw score moved up a point.
00:51:06.680 | So Georgia, something about Georgia Tech,
00:51:09.360 | computer science, or computing anyway.
00:51:11.560 | I think it's because we have shown leadership
00:51:15.640 | at every crisis level, right?
00:51:17.000 | So we created a college of computing, the first public university to do it,
00:51:19.200 | the second university to do it
00:51:21.200 | after CMU, which is number one.
00:51:22.320 | I also think it's no accident that CMU is the largest,
00:51:26.000 | and we're, depending upon how you count
00:51:27.800 | and depending on exactly where MIT ends up
00:51:29.640 | with its final college of computing,
00:51:31.200 | second or third largest.
00:51:32.160 | I don't think that's an accident.
00:51:33.080 | We've been doing this for a long time.
00:51:34.600 | But in the 2000s,
00:51:36.040 | when there was a crisis about undergraduate education,
00:51:39.400 | Georgia Tech took a big risk
00:51:41.000 | and succeeded at rethinking undergrad education
00:51:43.680 | and computing.
00:51:45.160 | I think we created these schools at a time
00:51:47.840 | when most public universities anyway were afraid to do it.
00:51:50.320 | We did the online masters,
00:51:51.820 | and that mattered because people were trying to figure out
00:51:55.880 | what to do with MOOCs and so on.
00:51:57.200 | I think it's about being observed by your peers
00:52:00.760 | and having an impact.
00:52:02.200 | So, I mean, that is what reputation is, right?
00:52:04.240 | So the way you move up in the reputation rankings
00:52:07.140 | is by doing something that makes people turn
00:52:08.960 | and look at you and say, "That's good.
00:52:11.640 | They're better than I thought."
00:52:13.600 | - Yeah. - Beyond that,
00:52:14.440 | it's just inertia.
00:52:15.260 | And there's huge hysteresis in the system, right?
00:52:16.800 | Like, I mean, there was these,
00:52:18.280 | I can't remember this, this may be apocryphal,
00:52:19.760 | but there's a major or a department
00:52:23.360 | that like MIT was ranked number one in,
00:52:25.600 | and they didn't have it.
00:52:27.120 | Right, it's just about what you,
00:52:28.560 | I don't know if that's true,
00:52:29.400 | but someone said that to me anyway.
00:52:31.120 | But it's a thing, right?
00:52:33.160 | It's all about reputation.
00:52:34.240 | Of course MIT is great because MIT is great.
00:52:36.520 | It's always been great.
00:52:37.340 | By the way, because MIT is great,
00:52:39.280 | the best students come, which keeps it being great.
00:52:42.960 | I mean, it's just a positive feedback loop.
00:52:44.840 | It's not surprising.
00:52:45.680 | I don't think it's wrong.
00:52:47.240 | - Yeah, but it's almost like a narrative.
00:52:48.920 | Like, it doesn't actually have to be backed by reality.
00:52:52.080 | And it's, you know, not to say anything about MIT,
00:52:54.480 | but like, it does feel like we're playing
00:52:59.320 | in the space of narratives,
00:53:00.840 | not the space of something grounded in,
00:53:03.840 | like one of the surprising things when I showed up at MIT
00:53:06.280 | and just all the students I've worked with
00:53:08.760 | and all the research I've done is it like,
00:53:13.360 | they're the same people as I've met other places.
00:53:16.640 | - I mean, what MIT has going for it,
00:53:20.440 | well, MIT has many things going for it,
00:53:21.280 | but one of the things MIT has going for it is--
00:53:23.120 | - Nice logo.
00:53:24.000 | - It has a nice logo.
00:53:25.000 | It's a lot better than it was when I was here.
00:53:28.120 | Nice colors too.
00:53:29.200 | Terrible, terrible name for a mascot.
00:53:31.400 | But the thing that MIT has going for it
00:53:34.460 | is it really does get the best students.
00:53:36.520 | It just doesn't get all of the best students.
00:53:39.040 | There are many more best students out there, right?
00:53:41.480 | And the best students want to be here
00:53:42.800 | 'cause it's the best place to be
00:53:44.720 | or one of the best places to be.
00:53:45.800 | And it just kind of, it's a sort of positive feedback.
00:53:47.880 | But you said something earlier,
00:53:49.380 | which I think is worth examining for a moment, right?
00:53:52.820 | You said it's, I forget the word you used.
00:53:54.800 | You said, "We're living in the space of narrative
00:53:58.080 | "as opposed to something objective."
00:54:00.080 | Narrative is objective.
00:54:01.260 | I mean, one could argue that the only thing
00:54:03.400 | that we do as humans is narrative.
00:54:04.920 | We just build stories to explain why we do what we do.
00:54:08.060 | Someone once asked me, "But wait, there's nothing objective."
00:54:10.240 | No, it's completely an objective measure.
00:54:12.920 | It's an objective measure of the opinions of everybody else.
00:54:16.340 | Now, is that physics?
00:54:19.480 | I don't know.
00:54:20.600 | But, you know, what, I mean, tell me something
00:54:23.180 | you think is actually objective and measurable
00:54:24.820 | in a way that makes sense.
00:54:25.780 | Like cameras, they don't, do you know that,
00:54:29.900 | I mean, you're getting me off on something here,
00:54:31.180 | but do you know that cameras,
00:54:34.900 | which are just reflecting light and putting them on film,
00:54:37.780 | like did not work for dark-skinned people
00:54:40.300 | until like the 1970s?
00:54:42.800 | You know why?
00:54:43.700 | Because you were building cameras
00:54:45.340 | for the people who were gonna buy cameras,
00:54:47.260 | who all, at least in the United States
00:54:49.100 | and Western Europe, were relatively light-skinned.
00:54:52.140 | Turns out it took terrible pictures
00:54:53.440 | of people who look like me.
00:54:55.620 | That got fixed with better film and whole processes.
00:54:58.700 | Do you know why?
00:55:00.240 | Because furniture manufacturers
00:55:02.980 | wanted to be able to take pictures of mahogany furniture.
00:55:06.340 | Right, because candy manufacturers
00:55:08.540 | wanted to be able to take pictures of chocolate.
00:55:11.780 | Now, the reason I bring that up
00:55:13.660 | is because you might think that cameras--
00:55:17.900 | - Are objective.
00:55:18.740 | - Are objective, they're just capturing light.
00:55:20.140 | No, they're made, they are doing the things
00:55:23.000 | that they are doing based upon decisions
00:55:24.780 | by real human beings to privilege,
00:55:27.300 | if I may use that word, some physics over others,
00:55:29.900 | because it's an engineering problem,
00:55:31.260 | there are trade-offs, right?
00:55:32.260 | So I can either worry about this part of the spectrum
00:55:34.740 | or this part of the spectrum.
00:55:35.920 | This costs more, that costs less,
00:55:37.340 | this costs the same,
00:55:38.300 | but I have more people paying money over here, right?
00:55:40.380 | And it turns out that if a giant conglomerate
00:55:43.860 | wants, demands that you do something different
00:55:45.900 | and it's gonna involve all kinds of money for you,
00:55:47.700 | suddenly the trade-offs change, right?
00:55:49.460 | And so there you go.
00:55:51.020 | I actually don't know how I ended up there.
00:55:52.220 | Oh, it's because of this notion of objectiveness, right?
00:55:54.020 | So even the objective isn't objective
00:55:57.060 | because at the end you've gotta tell a story,
00:55:58.460 | you've gotta make decisions,
00:55:59.300 | you've gotta make trade-offs,
00:56:00.140 | and what else is engineering other than that?
00:56:01.900 | So I think that the rankings capture something.
00:56:05.340 | They just don't necessarily capture
00:56:07.820 | what people assume they capture.
00:56:09.860 | - Just to linger on this idea,
00:56:15.300 | why is there not more people who just play
00:56:20.020 | with whatever that narrative is, have fun with it,
00:56:22.380 | have like excite the world,
00:56:23.940 | whether it's in the Carl Sagan style
00:56:26.540 | of that calm, sexy voice of explaining the stars
00:56:30.460 | and all the romantic stuff,
00:56:31.660 | or the Elon Musk, dare I even say Donald Trump,
00:56:35.180 | where you're like trolling and shaking up the system
00:56:37.560 | and just saying controversial things.
00:56:39.560 | I talked to Lisa Feldman Barrett,
00:56:43.100 | who's a neuroscientist who just enjoys
00:56:45.820 | playing the controversy,
00:56:47.580 | thinks like, finds the counterintuitive ideas
00:56:51.700 | in the particular science and throws them out there
00:56:54.820 | and sees how they play in the public discourse.
00:56:57.660 | Like why don't we see more of that?
00:57:00.420 | And why doesn't academia attract an Elon Musk type?
00:57:03.620 | - Well, tenure is a powerful thing
00:57:06.480 | that allows you to do whatever you want,
00:57:08.140 | but getting tenure typically requires
00:57:10.660 | you to be relatively narrow, right?
00:57:13.220 | Because people are judging you.
00:57:14.420 | Well, I think the answer is we have told ourselves a story,
00:57:18.660 | a narrative, that that is vulgar,
00:57:22.220 | which you just described as vulgar.
00:57:23.900 | It's certainly unscientific, right?
00:57:26.620 | And it is easy to convince yourself
00:57:31.620 | that in some ways you're the mathematician, right?
00:57:35.100 | The fewer there are in your major,
00:57:38.120 | the more that proves your purity, right?
00:57:40.540 | - Yeah.
00:57:41.380 | - So once you tell yourself that story,
00:57:44.180 | then it is beneath you to do that kind of thing, right?
00:57:49.180 | I think that's wrong.
00:57:51.340 | I think that, and by the way, everyone doesn't have to do this.
00:57:53.380 | Everyone's not good at it,
00:57:54.220 | and not everyone, even if they would be good at it,
00:57:55.260 | would enjoy it.
00:57:56.340 | So it's fine.
00:57:57.700 | But I do think you need some diversity
00:57:59.740 | in the way that people choose to relate
00:58:03.060 | to the world as academics,
00:58:05.620 | because I think the great universities
00:58:08.900 | are ones that engage with the rest of the world.
00:58:12.260 | It is a home for public intellectuals.
00:58:14.940 | And in 2020, being a public intellectual
00:58:18.260 | probably means being on Twitter,
00:58:20.700 | whereas of course that wasn't true 20 years ago,
00:58:23.140 | 'cause Twitter wasn't around 20 years ago.
00:58:25.300 | And if it was, it wasn't around in a meaningful way.
00:58:26.660 | I don't actually know how long Twitter's been around.
00:58:28.740 | As I get older, I find that my notion of time
00:58:31.980 | has gotten worse and worse.
00:58:33.100 | Like Google really has been around that long?
00:58:34.540 | Anyway, the point is that I think that
00:58:38.140 | I think that we sometimes forget
00:58:42.180 | that a part of our job is to impact the people
00:58:44.900 | who aren't in the world that we're in,
00:58:46.380 | and that that's the point of being at a great place
00:58:48.500 | and being a great person, frankly.
00:58:50.340 | - There's an interesting force
00:58:51.660 | in terms of public intellectuals.
00:58:53.660 | Forget Twitter, we could look at just online courses
00:58:56.740 | that are public-facing in some part.
00:58:59.220 | Like there is a kind of force that pulls you back.
00:59:05.860 | I would, let me just call it out
00:59:07.380 | 'cause I don't give a damn at this point.
00:59:09.740 | There's a little bit of, all of us have this,
00:59:11.780 | but certainly faculty have this, which is jealousy.
00:59:15.100 | It's whoever's popular at being a good communicator,
00:59:20.100 | exciting the world with their science.
00:59:23.020 | And of course, when you excite the world with the science,
00:59:27.480 | it's not peer-reviewed, clean.
00:59:30.480 | It all sounds like bullshit.
00:59:33.740 | It's like a TED Talk.
00:59:35.540 | And people roll their eyes,
00:59:37.100 | and they hate that a TED Talk gets millions of views
00:59:40.360 | or something like that.
00:59:41.220 | And then everybody pulls each other back.
00:59:43.300 | There's this force that just kind of,
00:59:45.500 | it's hard to stand out
00:59:46.940 | unless you like win a Nobel Prize or whatever.
00:59:48.740 | Like it's only when you like get senior enough
00:59:51.940 | where you just stop giving a damn.
00:59:54.940 | But just like you said, even when you get tenure,
00:59:57.900 | that was always the surprising thing to me.
01:00:00.380 | I have many colleagues and friends who have gotten tenure,
01:00:04.260 | but there's not a switch.
01:00:06.600 | There's not an F-you money switch
01:00:11.220 | where you're like, you know what?
01:00:13.600 | Now I'm going to be more bold.
01:00:15.300 | It doesn't, I don't see it.
01:00:17.220 | - Well, there's a reason for that.
01:00:18.860 | Tenure isn't a test.
01:00:19.780 | It's a training process.
01:00:20.980 | It teaches you to behave in a certain way,
01:00:23.560 | to think in a certain way, to accept certain values,
01:00:26.200 | and to react accordingly.
01:00:27.940 | And the better you are at that,
01:00:29.260 | the more likely you are to earn tenure.
01:00:30.880 | And by the way, this is not a bad thing.
01:00:32.620 | Most things are like that.
01:00:34.700 | And I think most of my colleagues
01:00:36.400 | are interested in doing great work,
01:00:38.340 | and they're just having impact
01:00:39.380 | in the way that they want to have impact.
01:00:41.340 | I do think that as a field, not just as a field,
01:00:45.420 | as a profession, we have a habit of belittling those
01:00:50.420 | who are popular, as it were,
01:00:55.680 | as if the word itself is a kind of scarlet A, right?
01:00:58.980 | I think it's easy to convince yourself,
01:01:03.980 | and no one is immune to this,
01:01:07.820 | that the people who are better known
01:01:09.540 | are better known for bad reasons.
01:01:11.740 | The people who are out there dumbing it down
01:01:15.260 | are not being pure to whatever the values and ethos is
01:01:19.900 | for your field.
01:01:20.740 | And it's just very easy to do.
01:01:22.900 | Now, having said that, I think that ultimately,
01:01:26.580 | people who are able to be popular and out there
01:01:30.260 | and are touching the world and making a difference,
01:01:32.760 | our colleagues do, in fact,
01:01:35.340 | appreciate that in the long run.
01:01:37.260 | It's just, you have to be very good at it,
01:01:39.380 | or you have to be very interested in pursuing it.
01:01:41.660 | And once you get past a certain level,
01:01:42.940 | I think people accept that for who it is.
01:01:44.820 | I mean, I don't know.
01:01:45.860 | I'd be really interested in how Rod Brooks felt
01:01:47.700 | about how people were interacting with him
01:01:50.500 | when he did "Fast, Cheap, and Out of Control"
01:01:52.860 | way, way, way back when.
01:01:54.300 | - What's "Fast, Cheap, and Out of Control"?
01:01:56.420 | - It was a documentary that involved four people.
01:01:59.820 | I remember nothing about it other than Rod Brooks was in it
01:02:01.980 | and something about naked mole rats.
01:02:04.820 | Can't remember what the other two things were.
01:02:06.460 | It was robots, naked mole rats, and then two other--
01:02:08.580 | - By the way, Rod Brooks used to be the head
01:02:10.260 | of the Artificial Intelligence Laboratory at MIT,
01:02:13.460 | and then launched, I think, iRobot,
01:02:16.860 | and then Think Robotics, Rethink Robotics?
01:02:19.380 | - Yes, sir, yes.
01:02:21.140 | - Think is in the word.
01:02:23.380 | And also is a little bit of a rock star personality
01:02:27.180 | in the AI world, very opinionated, very intelligent.
01:02:30.260 | Anyway, sorry, mole rats and naked.
01:02:32.580 | - Naked mole rats.
01:02:33.420 | Also, he was one of my two advisors for my PhD.
01:02:37.140 | - This explains a lot.
01:02:38.220 | (laughing)
01:02:39.060 | - I don't know how it explains.
01:02:40.940 | I love Rod.
01:02:41.780 | But I also love my other advisor, Paul.
01:02:43.740 | Paul, if you're listening, I love you too.
01:02:45.140 | Both very, very different people.
01:02:46.700 | - Paul Viola.
01:02:47.540 | - Paul Viola, both very interesting people,
01:02:48.820 | very different in many ways.
01:02:50.900 | But I don't know what Rod would say to you
01:02:52.860 | about what the reaction was.
01:02:54.980 | I know that for the students at the time,
01:02:56.980 | 'cause I was a student at the time, it was amazing.
01:02:59.500 | This guy was in a movie, being very much himself.
01:03:04.340 | Actually, the movie version of him
01:03:06.260 | is a little bit more Rod than Rod.
01:03:09.620 | I mean, I think they edited it appropriately for him.
01:03:12.860 | But it was very much Rod, and he did all this
01:03:14.700 | while doing great work.
01:03:15.700 | I mean, he was running, was he running the AI Lab
01:03:17.340 | at that point or not?
01:03:18.180 | I don't know.
01:03:19.000 | I think the AI Lab would be soon.
01:03:21.080 | He was a giant in the field.
01:03:22.040 | He did amazing things, made a lot of his bones
01:03:23.720 | by doing the kind of counterintuitive science, right?
01:03:27.760 | And saying, no, you're doing this all wrong.
01:03:29.680 | Representation is crazy.
01:03:30.800 | The world is your own representation.
01:03:32.120 | You just react to it.
01:03:32.960 | I mean, he did amazing things,
01:03:34.080 | and continues to do those sorts of things
01:03:36.480 | as he's moved on.
01:03:37.880 | I have, I think he might tell you,
01:03:40.880 | I don't know if he would tell you it was good or bad,
01:03:42.080 | but I know that for everyone else out there in the world,
01:03:45.480 | it was a good thing,
01:03:46.320 | and certainly he continued to be respected.
01:03:48.340 | So it's not as if it destroyed his career by being popular.
01:03:52.080 | - All right, let's go into a topic where I'm on thin ice,
01:03:56.280 | because I grew up in the Soviet Union and Russia.
01:03:58.080 | My knowledge of music, this American thing you guys do,
01:04:02.160 | is quite foreign.
01:04:07.160 | So your research group is called,
01:04:10.960 | as we've talked about,
01:04:12.180 | the Lab for Interactive Artificial Intelligence,
01:04:14.660 | but also there's just a bunch of mystery around this.
01:04:17.540 | My research fails me.
01:04:19.200 | Also called PFUNK.
01:04:21.220 | P stands for probabilistic.
01:04:24.020 | And what does FUNK stand for?
01:04:27.880 | - So a lot of my life is about making acronyms.
01:04:30.440 | So if I have one quirk, it's that people will say words,
01:04:32.880 | and I see if they make acronyms.
01:04:34.600 | And if they do, then I'm happy,
01:04:36.920 | and then if they don't, I try to change it
01:04:38.760 | so that they make acronyms.
01:04:39.760 | It's just a thing that I do.
01:04:41.640 | So PFUNK is an acronym.
01:04:43.240 | It has three or four different meanings.
01:04:45.840 | But finally I decided that the P stands for probabilistic
01:04:48.480 | because at the end of the day,
01:04:49.880 | it's machine learning and it's randomness
01:04:51.440 | and it's uncertainty, which is the important thing here.
01:04:54.300 | And the FUNK can be lots of different things,
01:04:56.580 | but I decided I should leave it up to the individual
01:04:58.640 | to figure out exactly what it is.
01:05:00.680 | But I will tell you that when my students graduate,
01:05:03.600 | when they get out, as we say, at Tech,
01:05:06.460 | I hand them, they put on a hat and star glasses
01:05:11.660 | and a medallion from the PFUNK era,
01:05:15.820 | and we take a picture,
01:05:17.380 | and I hand them a pair of fuzzy dice,
01:05:21.040 | which they get to keep.
01:05:22.320 | - So there's a sense to it which is not an acronym,
01:05:25.660 | like literally FUNK.
01:05:27.160 | You have a dark, mysterious past.
01:05:30.780 | - Oh, it's not dark, it's just fun,
01:05:34.460 | as in hip hop and funk.
01:05:38.240 | - Yep.
01:05:39.480 | - So can you educate a Soviet-born Russian
01:05:43.400 | about this thing called hip hop?
01:05:45.840 | Like if you were to give me,
01:05:48.680 | like if we went on a journey together
01:05:50.860 | and you were trying to educate me about,
01:05:53.020 | especially the past couple of decades in the '90s
01:05:58.020 | about hip hop or funk,
01:05:59.800 | what records or artists would you introduce me to?
01:06:04.800 | Would you tell me about,
01:06:09.740 | or maybe what influenced you in your journey,
01:06:12.320 | or what you just love?
01:06:13.520 | Like when the family's gone and you just sit back
01:06:16.600 | and just blast some stuff these days,
01:06:19.440 | what do you listen to?
01:06:20.440 | - Well, so I listen to a lot,
01:06:21.640 | but I will tell you, well, first off,
01:06:22.960 | all great music was made when I was 14,
01:06:24.800 | and that statement is true for all people,
01:06:26.520 | no matter how old they are or where they live.
01:06:28.560 | But for me, the first thing that's worth pointing out
01:06:31.420 | is that hip hop and rap aren't the same thing,
01:06:33.820 | so depending on who you talk to about this,
01:06:35.080 | and there are people who feel very strongly about this,
01:06:37.960 | much more strongly than I do.
01:06:38.800 | - You're offending everybody in this conversation,
01:06:40.440 | so this is great, let's keep going.
01:06:41.840 | - Hip hop is a culture.
01:06:44.040 | - Yeah, I take that.
01:06:44.880 | - It's a whole set of things, of which rap is a part.
01:06:46.680 | So tagging is a part of hip hop.
01:06:48.320 | I don't know why that's true,
01:06:49.160 | but people tell me it's true,
01:06:50.080 | and I'm willing to go along with it,
01:06:51.280 | 'cause they get very angry about it.
01:06:52.600 | But hip hop is-- - Tagging is like graffiti.
01:06:54.040 | - Tagging is like graffiti.
01:06:55.860 | And there's all these, including the popping and the locking
01:06:57.880 | and all the dancing and all those things,
01:06:59.040 | that's all a part of hip hop.
01:07:00.400 | It's a way of life, which I think is true.
01:07:03.040 | And then there's rap, which is this particular--
01:07:05.240 | - It's the music part.
01:07:06.080 | - Yes, or a music part.
01:07:07.800 | - A music part, yeah.
01:07:09.000 | I mean, you wouldn't call the stuff that DJs do
01:07:10.600 | the (imitates scratching) scratching.
01:07:12.000 | That's not rap, right?
01:07:12.920 | But it's a part of hip hop, right?
01:07:14.280 | So given that we understand that hip hop is this whole thing,
01:07:17.480 | what are the rap albums that best touch that for me?
01:07:20.360 | Well, if I were gonna educate you,
01:07:22.040 | I would try to figure out what you liked,
01:07:23.280 | and then I would work you there.
01:07:24.840 | - Lynyrd Skynyrd.
01:07:26.640 | - Oh my God.
01:07:27.480 | - Yes. (laughs)
01:07:28.320 | - I would probably start with--
01:07:29.800 | (laughing)
01:07:32.400 | - Led Zeppelin.
01:07:33.640 | - There's a fascinating, oh, it's okay.
01:07:35.080 | There's a fascinating exercise one can do
01:07:37.320 | by watching old episodes of "I Love the '70s,"
01:07:40.960 | "I Love the '80s," "I Love the '90s"
01:07:43.160 | with a bunch of friends,
01:07:44.200 | and just see where people come in and out of pop culture.
01:07:47.280 | So if you're talking about those people,
01:07:51.080 | then I would actually start you with
01:07:53.680 | where I would hope to start you with anyway,
01:07:55.120 | which is "Public Enemy."
01:07:56.680 | Particularly, It Takes a Nation of Millions to Hold Us Back,
01:07:59.840 | which is clearly the best album ever produced,
01:08:03.560 | and certainly the best hip hop album ever produced,
01:08:06.120 | in part because it was so much
01:08:08.280 | of what was great about the time.
01:08:09.920 | Fantastic lyrics, 'cause to me, it's all about the lyrics.
01:08:12.720 | Amazing music that was coming from,
01:08:14.360 | Rick Rubin was the producer of that,
01:08:16.520 | and he did a lot, very kind of heavy metal-ish,
01:08:19.080 | at least in the '80s sense at the time,
01:08:21.840 | and it was focused on politics in the 1980s,
01:08:26.240 | which was what made hip hop so great then.
01:08:28.320 | I would start you there,
01:08:29.520 | then I would move you up through things
01:08:31.560 | that have been happening more recently.
01:08:33.280 | I'd probably get you to someone like a Mos Def.
01:08:35.280 | I would give you a history lesson, basically.
01:08:37.080 | Mos Def's amazing.
01:08:37.920 | - So he hosted a poetry jam thing on HBO
01:08:40.120 | or something like that?
01:08:41.080 | - Probably, I don't think I've seen it,
01:08:41.920 | but I wouldn't be surprised.
01:08:43.000 | - Yeah, spoken poetry, that guy.
01:08:44.200 | - Yeah, he's amazing.
01:08:45.640 | He's amazing.
01:08:46.880 | And then, after I got you there,
01:08:48.920 | I'd work you back to EPMD,
01:08:50.640 | and eventually, I would take you back to "The Last Poets,"
01:08:56.000 | and particularly their first album, "The Last Poets,"
01:08:58.080 | which was 1970, to give you a sense of history,
01:09:01.400 | and that it actually has been building up
01:09:02.960 | over a very, very long time.
01:09:05.160 | So we would start there,
01:09:06.840 | because that's where your music aligns,
01:09:08.480 | and then we would cycle out,
01:09:09.920 | and I'd move you to the present,
01:09:11.000 | and then I'd take you back to the past.
01:09:12.680 | Because I think a large part of people
01:09:14.280 | who are kind of confused about any kind of music,
01:09:17.040 | you know, the truth is, this is the same thing
01:09:18.680 | we've always been talking about, right?
01:09:19.720 | It's about narrative and being a part of something
01:09:21.400 | and being immersed in something,
01:09:22.960 | so you understand it, right?
01:09:24.560 | Jazz, which I also like,
01:09:27.640 | is, one of the things that's cool about jazz
01:09:29.880 | is that you come and you meet someone
01:09:30.880 | who's talking to you about jazz,
01:09:31.960 | and you have no idea what they're talking about.
01:09:33.480 | And then one day it all clicks,
01:09:35.440 | and you've been so immersed in it,
01:09:36.600 | you go, "Oh yeah, that's a Charlie Parker,"
01:09:38.640 | you start using words that nobody else understands, right?
01:09:40.520 | And it becomes part of, hip-hop's the same way,
01:09:42.680 | everything's the same way, they're all cultural artifacts.
01:09:44.860 | But I would help you to see that there's a history of it,
01:09:46.920 | and how it connects to other genres of music
01:09:48.840 | that you might like, to bring you in,
01:09:50.680 | so that you could kind of see how it connects
01:09:53.680 | to what you already like,
01:09:54.520 | including some of the good work that's been done
01:09:58.640 | with fusions of hip-hop and bluegrass.
01:10:02.640 | - Oh no. - Yes.
01:10:04.040 | Some of it's even good.
01:10:07.000 | Not all of it, but some of it is good.
01:10:09.720 | But I'd start you with "It Takes a Nation of Millions
01:10:11.720 | to Hold Us Back."
01:10:12.560 | - There's an interesting tradition in more modern hip-hop
01:10:16.000 | of integrating almost like classic rock songs or whatever,
01:10:20.000 | like integrating into their music,
01:10:23.820 | into the beat, into the whatever.
01:10:25.580 | It's kind of interesting.
01:10:26.520 | It gives a whole new, not just classic rock,
01:10:30.360 | but what is it, Kanye, Gold Digger, the--
01:10:33.800 | - Mm-hmm, old R&B.
01:10:36.200 | - Taking and pulling old R&B, right.
01:10:38.200 | - Well, that's been true since the beginning.
01:10:39.640 | I mean, in fact, that's in some ways,
01:10:41.360 | that's why the DJ used to get top billing,
01:10:45.480 | 'cause it was the DJ that brought all the records together
01:10:47.760 | and made it work so that people could dance.
01:10:49.440 | If you go back to those days, mostly in New York,
01:10:53.040 | though not exclusively, but mostly in New York,
01:10:54.660 | where it sort of came out of,
01:10:56.600 | it was the DJ that brought all the music together
01:10:58.080 | and the beats and showed that basically music
01:11:00.280 | is itself an instrument, very meta,
01:11:02.360 | and you can bring it together
01:11:03.400 | and then you sort of wrap over it and so on,
01:11:05.120 | and it sort of moved that way.
01:11:07.000 | So that's going way, way back.
01:11:07.960 | Now, in the period of time where I grew up,
01:11:10.400 | when I became really into it, which was most of the '80s,
01:11:14.600 | it was more funk was the back for a lot of the stuff,
01:11:17.960 | Public Enemy at that time notwithstanding,
01:11:21.240 | and so, which is very nice,
01:11:22.960 | 'cause it tied into what my parents listened to
01:11:25.200 | and what I vaguely remember listening to
01:11:26.600 | when I was very small.
01:11:28.160 | And by the way, complete revival of George Clinton
01:11:32.080 | and Parliament and Funkadelic and all of those things
01:11:34.560 | to bring it sort of back into the '80s and into the '90s.
01:11:37.280 | And as we go on, you're gonna see the last decade
01:11:40.600 | and the decade before that being brought in.
01:11:42.460 | And when you don't think
01:11:43.600 | that you're hearing something you've heard,
01:11:44.760 | it's probably because it's being sampled by someone
01:11:47.560 | who referring to something they remembered
01:11:52.560 | when they were young,
01:11:54.120 | perhaps from somewhere else altogether,
01:11:55.880 | and you just didn't realize what it was
01:11:58.200 | because it wasn't a popular song
01:11:59.840 | where you happened to grow up.
01:12:00.840 | So this stuff's been going on for a long time.
01:12:02.280 | It's one of the things that I think is beautiful.
01:12:04.640 | Run DMC, Jam Master Jay used to play, he played piano.
01:12:09.320 | He would record himself playing piano
01:12:11.280 | and then sample that to make it a part
01:12:13.000 | of what was going on rather than play the piano.
01:12:15.360 | - That's how his mind can think more.
01:12:17.080 | - Well, it's pieces, you're putting pieces together.
01:12:18.400 | You're putting pieces of music together
01:12:19.440 | to create new music, right?
01:12:20.680 | Now, that doesn't mean that the root,
01:12:21.760 | I mean, the roots are doing their own thing, right?
01:12:24.640 | - Yeah, those are, that's a whole.
01:12:26.040 | - Yeah, but still, it's the right attitude.
01:12:29.680 | And what else is jazz, right?
01:12:31.760 | Jazz is about putting pieces together
01:12:33.120 | and then putting your own spin on it.
01:12:34.120 | It's all the same, it's all the same thing.
01:12:36.040 | It's all the same thing.
01:12:37.240 | - 'Cause you mentioned lyrics.
01:12:38.240 | It does make me sad, again,
01:12:40.440 | this is me talking trash about modern hip hop.
01:12:42.640 | I haven't investigated, I'm sure people will correct me
01:12:46.320 | that there's a lot of great artists.
01:12:47.680 | That's part of the reason I'm saying it
01:12:49.120 | is they'll leave it in the comments
01:12:50.680 | that you should listen to this person,
01:12:52.360 | is the lyrics went away from talking about
01:12:56.000 | maybe not just politics, but life and so on.
01:13:01.040 | The kind of protest songs,
01:13:03.920 | even if you look at a Bob Marley,
01:13:06.040 | or you see Public Enemy, or Rage Against the Machine
01:13:08.160 | more on the rock side,
01:13:09.360 | that's the place where we go to those lyrics.
01:13:14.520 | Classic rock is all about my woman left me,
01:13:18.720 | or I'm really happy that she's still with me,
01:13:22.200 | or the flip side, it's like love songs of different kinds.
01:13:24.680 | It's all love, but it's less political,
01:13:26.880 | like less interesting, I would say,
01:13:28.720 | in terms of deep, profound knowledge.
01:13:31.240 | It seems like rap is the place where you would find that,
01:13:35.320 | and it's sad that for the most part,
01:13:38.160 | what I see, you look at like mumble rap or whatever,
01:13:41.000 | they're moving away from lyrics
01:13:42.520 | and more towards the beat and the musicality of it.
01:13:45.680 | - I've always been a fan of the lyrics.
01:13:46.880 | In fact, if you go back and you read my reviews,
01:13:49.320 | which I recently was rereading,
01:13:51.320 | man, fuck, I wrote my last review
01:13:54.640 | the month I graduated, when I got my PhD,
01:13:57.360 | which says something about something.
01:13:58.520 | I'm not sure what, though.
01:14:00.120 | I always would, I don't always,
01:14:01.360 | but I often would start with, it's all about the lyrics.
01:14:03.200 | For me, it's about the lyrics.
01:14:04.840 | Someone has already written in the comments
01:14:08.240 | before I've even finished having this conversation
01:14:10.160 | that neither of us knows what we're talking about,
01:14:12.080 | and it's all in the underground hip hop,
01:14:14.120 | and here's who you should go listen to.
01:14:15.640 | And that is true.
01:14:16.480 | Every time I despair for popular rap,
01:14:19.960 | someone points me to where I discover
01:14:22.800 | some underground hip hop song,
01:14:24.240 | and I am made happy and whole again.
01:14:26.560 | So I know it's out there.
01:14:28.640 | I don't listen to it as much as I used to,
01:14:30.640 | because I'm listening to podcasts
01:14:32.000 | and old music from the 1980s and '80s.
01:14:34.000 | - Kind of rap, no beat, though.
01:14:34.960 | - It's a kind of, no, no beat at all,
01:14:36.800 | but there's a little bit of sampling here and there,
01:14:38.600 | I'm sure. (laughing)
01:14:40.840 | - By the way, James Brown is funk or no?
01:14:42.600 | - Yes, and so is Junior Wells, by the way.
01:14:45.120 | - Who's that?
01:14:45.960 | - Ah, Junior Wells, Chicago Blues.
01:14:48.400 | He was James Brown before James Brown was.
01:14:50.440 | - It's hard to imagine somebody being James Brown.
01:14:52.840 | - Go look up Hoodoo Man Blues, Junior Wells,
01:14:55.740 | and just listen to Snatch It Back and Hold It,
01:15:00.060 | and you'll see it.
01:15:02.200 | And they were contemporaries.
01:15:03.440 | - Where do you put Little Richard
01:15:04.840 | or all that kind of stuff, like Ray Charles,
01:15:07.280 | like when they do "Hit the Road Jack,"
01:15:10.720 | and don't you come back?
01:15:11.960 | Isn't that like, there's a funkiness in it.
01:15:14.080 | - Oh, there's definitely a funkiness in it.
01:15:18.240 | I mean, it's all, I mean, it's all a line.
01:15:19.880 | I mean, there's a line
01:15:19.880 | that carries it all together.
01:15:21.240 | You know, it's, I guess I would answer your question
01:15:23.600 | depending on whether I'm thinking about it in 2020
01:15:25.320 | or I'm thinking about it in 1960.
01:15:27.680 | I'd probably give a different answer.
01:15:28.920 | I'm just thinking in terms of, you know, that was rock,
01:15:31.120 | but when you look back on it, it was funky.
01:15:35.080 | But we didn't use those words, or maybe we did,
01:15:36.960 | I wasn't around, but you know,
01:15:38.400 | I don't think we used the word 1960, funk.
01:15:40.760 | Certainly not the way we used it in the '70s and the '80s.
01:15:43.160 | - Do you reject disco?
01:15:44.720 | - I do not reject disco.
01:15:45.640 | I appreciate all the mistakes that we have made
01:15:47.600 | to get to where we are now.
01:15:49.000 | Actually, some of the disco is actually really, really good.
01:15:51.120 | - John Travolta, oh boy, he regrets it probably.
01:15:54.520 | Maybe not.
01:15:55.360 | Well, like it's the mistakes thing.
01:15:56.840 | - Yeah, and it got him to where he's going, where he is.
01:15:59.640 | - Oh, well, thank you for taking that detour.
01:16:03.820 | You've talked about computing,
01:16:05.720 | we've already talked about computing a little bit,
01:16:08.480 | but can you try to describe how you think
01:16:11.320 | about the world of computing,
01:16:13.440 | where it fits into the sets of different disciplines?
01:16:16.120 | We mentioned College of Computing.
01:16:18.360 | What should people, how should they think about computing,
01:16:21.880 | especially from an educational perspective,
01:16:24.220 | of like what is the perfect curriculum
01:16:27.020 | that defines for a young mind what computing is?
01:16:32.900 | - So I don't know about a perfect curriculum,
01:16:34.280 | although that's an important question,
01:16:35.400 | because at the end of the day,
01:16:36.680 | without the curriculum, you don't get anywhere.
01:16:38.520 | Curriculum, to me, is the fundamental data structure.
01:16:41.040 | It's not even the classroom.
01:16:42.080 | - Data structure, I love it.
01:16:43.680 | - Right?
01:16:44.520 | So I think the curriculum is where I like to play.
01:16:48.000 | So I spend a lot of time thinking about this.
01:16:50.400 | But I will tell you, I'll answer your question
01:16:51.840 | by answering a slightly different question first
01:16:53.240 | and getting back to this, which is,
01:16:54.760 | you talked about disciplines,
01:16:56.180 | and what does it mean to be a discipline?
01:16:58.560 | The truth is, what we really educate people in
01:17:01.880 | from the beginning, but certainly through college,
01:17:04.600 | and you've sort of failed if you don't think about it this way,
01:17:06.700 | I think, is this: in the world,
01:17:09.420 | people often think about tools and tool sets,
01:17:13.400 | and when you're really trying to be good,
01:17:14.520 | you think about skills and skill sets.
01:17:16.660 | But disciplines are about mindsets, right?
01:17:18.760 | They're about fundamental ways of thinking,
01:17:21.500 | not just the hammer that you pick up,
01:17:23.940 | whatever that is, to hit the nail,
01:17:26.640 | not just the skill of learning how to hammer well,
01:17:29.160 | or whatever, it's the mindset of, like,
01:17:31.040 | what's the fundamental way to think about the world, right?
01:17:36.040 | And disciplines, different disciplines,
01:17:38.080 | give you different mindsets.
01:17:39.680 | They give you different ways of sort of thinking through.
01:17:41.600 | So, with that in mind, I think that computing,
01:17:44.760 | to even ask the question whether it's a discipline,
01:17:46.240 | you have to decide, does it have a mindset,
01:17:48.040 | does it have a way of thinking about the world
01:17:49.640 | that is different from the scientist who is doing discovery
01:17:53.480 | and using the scientific method as a way of doing it,
01:17:55.280 | or the mathematician who builds abstractions
01:17:57.440 | and tries to find sort of steady-state truth
01:17:59.800 | about the abstractions that may be artificial, but whatever.
01:18:03.720 | Or is it the engineer who's all about
01:18:06.440 | building demonstrably superior technology
01:18:08.460 | with respect to some notion of trade-offs,
01:18:11.080 | whatever that means, right?
01:18:11.920 | That's sort of the world that you live in.
01:18:14.480 | What is computing?
01:18:15.640 | How is computing different?
01:18:16.800 | So, I've thought about this for a long time,
01:18:18.040 | and I've come to a view about what computing actually is,
01:18:22.160 | what the mindset is, and it's a little abstract,
01:18:25.460 | but that would be appropriate for computing.
01:18:27.280 | I think that what distinguishes the computationalist
01:18:30.200 | from others is that he or she understands
01:18:36.200 | that models, languages, and machines are equivalent.
01:18:39.960 | They're the same thing.
01:18:41.520 | And because it's not just a model,
01:18:43.960 | but it's a machine that is an executable thing
01:18:46.520 | that can be described as a language,
01:18:49.140 | that means that it's dynamic.
01:18:51.300 | So, it is mathematical in some sense,
01:18:54.100 | in the kind of sense of abstraction,
01:18:55.960 | but it is fundamentally dynamic and executable.
01:18:58.320 | The mathematician is not necessarily worried
01:19:00.000 | about either the dynamic part.
01:19:01.780 | In fact, whenever I tried to write something
01:19:04.100 | for mathematicians, they invariably demand
01:19:07.040 | that I make it static, and that's not a bad thing.
01:19:09.320 | It's just, it's a way of viewing the world,
01:19:11.240 | that truth is a thing, right?
01:19:12.640 | It's not a process that continually runs, right?
01:19:16.000 | So, that dynamic thing matters,
01:19:18.000 | that self-reflection of the system itself matters,
01:19:20.960 | and that is what computing brought us.
01:19:23.360 | So, it is a science, because the models fundamentally
01:19:26.260 | represent truths in the world.
01:19:27.960 | Information is a scientific thing to discover, right?
01:19:30.900 | Not just a mathematical conceit that gets created.
01:19:33.920 | But of course, it's engineering,
01:19:35.280 | because you're actually dealing with constraints
01:19:37.180 | in the world and trying to execute machines
01:19:39.640 | that actually run.
01:19:41.420 | But it's also a math, because you're actually worrying
01:19:44.120 | about these languages that describe what's happening.
01:19:47.400 | But the fact that regular expressions
01:19:52.400 | and finite state automata, one of which feels like a machine,
01:19:56.160 | or at least an abstract machine,
01:19:57.360 | and the other is a language,
01:19:58.280 | that they're actually the equivalent thing.
01:19:59.720 | I mean, that is not a small thing,
01:20:01.360 | and it permeates everything that we do,
01:20:03.280 | even when we're just trying to figure out
01:20:04.560 | how to do debugging.
01:20:05.720 | So, that idea, I think, is fundamental,
01:20:08.320 | and we would do better if we made that more explicit.
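A minimal sketch, in Python (the language and names are my choice, not anything from the conversation), of the equivalence being described here: the regular expression `ab*`, a description of a language, and a hand-written finite state machine, an executable model, accept exactly the same strings.

```python
import re

# The language written as a regular expression: an 'a' followed by zero or more 'b's.
PATTERN = re.compile(r"^ab*$")

def fsm_accepts(s: str) -> bool:
    """The same language written as an explicit finite state machine."""
    state = "start"
    for ch in s:
        if state == "start" and ch == "a":
            state = "seen_a"      # consumed the leading 'a'
        elif state == "seen_a" and ch == "b":
            state = "seen_a"      # loop on trailing 'b's
        else:
            return False          # any other character rejects
    return state == "seen_a"      # must have consumed at least the 'a'

for s in ["a", "ab", "abbb", "ba", "", "abc"]:
    # The description (language) and the machine agree on every input.
    assert bool(PATTERN.match(s)) == fsm_accepts(s)
print("regular expression and finite state machine agree on all test strings")
```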
01:20:12.220 | How my life has changed in my thinking about this
01:20:15.300 | in the 10 or 15 years it's been
01:20:18.100 | since I tried to put that to paper with some colleagues
01:20:20.680 | is the realization, which comes to a question
01:20:24.360 | you actually asked me earlier,
01:20:27.560 | which has to do with trees falling down
01:20:29.000 | and whether it matters,
01:20:30.160 | is this sort of triangle of equivalence,
01:20:34.660 | it only matters because there's a person
01:20:36.540 | inside the triangle, right?
01:20:38.840 | That what's changed about computing,
01:20:42.020 | computer science, whatever you want to call it,
01:20:44.140 | is we now have so much data
01:20:46.580 | and so much computational power,
01:20:48.820 | we're able to do really, really interesting,
01:20:50.780 | promising things.
01:20:52.040 | But the interesting and the promising
01:20:55.060 | kind of only matters with respect to human beings
01:20:56.960 | and their relationship to it.
01:20:58.340 | So, the triangle exists, that is fundamentally computing.
01:21:01.820 | What makes it worthwhile and interesting,
01:21:03.440 | and potentially world-changing, species-changing,
01:21:08.140 | is that there are human beings inside of it,
01:21:10.100 | and intelligence that has to interact with it
01:21:11.980 | that changes the data, the information that makes sense
01:21:14.540 | and gives meaning to the models,
01:21:17.460 | the languages, and the machines.
01:21:19.180 | So, if the curriculum can convey that
01:21:23.020 | while conveying the tools and the skills
01:21:24.660 | that you need in order to succeed,
01:21:26.740 | then it is a big win.
01:21:28.740 | That's what I think you have to do.
01:21:30.940 | - Do you pull psychology,
01:21:33.100 | it's like these human things into that,
01:21:35.900 | into the idea, into this framework of computing,
01:21:39.100 | do you pull in psychology, neuroscience,
01:21:41.340 | like parts of psychology, parts of neuroscience,
01:21:43.220 | parts of sociology?
01:21:44.780 | What about philosophy, like studies of human nature
01:21:48.180 | from different perspectives?
01:21:49.260 | - Absolutely.
01:21:50.100 | And by the way, it works both ways.
01:21:51.300 | So, let's take biology for a moment.
01:21:53.380 | It turns out a cell is basically
01:21:54.740 | a bunch of if-then statements,
01:21:56.220 | if you look at it the right way,
01:21:57.860 | which is nice because I understand if-then statements.
01:22:00.540 | I never really enjoyed biology,
01:22:01.860 | but I do understand if-then statements.
01:22:03.380 | And if you tell the biologist that
01:22:05.020 | and they begin to understand that,
01:22:06.140 | it actually helps them to think about
01:22:08.220 | a bunch of really cool things.
01:22:09.700 | There'll still be biology involved, but whatever.
01:22:11.980 | On the other hand, the fact of biology,
01:22:15.180 | that, in fact, the cell is a bunch of if-then statements
01:22:17.420 | or whatever, allows the computationalists
01:22:19.420 | to think differently about the language
01:22:20.980 | and the way that we, well,
01:22:22.220 | certainly the way we would do AI and machine learning,
01:22:23.900 | but there's just even the way that we think about,
01:22:25.960 | we think about computation.
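As a toy illustration of the "cell as a bunch of if-then statements" analogy above: this is not real biology, and the signal names and thresholds are invented purely for illustration.

```python
def cell_step(signals: dict) -> dict:
    """One 'tick' of a hypothetical cell: read signals, fire if-then rules."""
    response = {}
    if signals.get("glucose", 0.0) > 0.5:
        response["metabolize"] = True       # enough fuel: metabolism on
    if signals.get("toxin", 0.0) > 0.2:
        response["repair_genes_on"] = True  # stress response kicks in
    if signals.get("growth_factor", 0.0) > 0.8 and not response.get("repair_genes_on"):
        response["divide"] = True           # divide only when healthy and signaled
    return response

print(cell_step({"glucose": 0.9, "toxin": 0.0, "growth_factor": 0.95}))
# -> {'metabolize': True, 'divide': True}
```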
01:22:27.180 | So, the important thing to me is,
01:22:29.740 | as my engineering colleagues
01:22:31.440 | who are not in computer science worry about computer science
01:22:33.780 | eating up engineering colleges
01:22:35.540 | where computer science is trapped,
01:22:37.400 | it's not a worry.
01:22:40.380 | You shouldn't worry about that at all.
01:22:41.520 | Computing is, computer science, computing,
01:22:43.680 | it's not, it's central,
01:22:45.020 | but it's not the most important thing in the world.
01:22:46.880 | It's not more important.
01:22:47.720 | It is just key to helping others
01:22:50.900 | do other cool things they're gonna do.
01:22:52.380 | You're not gonna be a historian in 2030.
01:22:54.740 | You're not gonna get a PhD in history
01:22:55.860 | without understanding some data science and computing
01:22:57.860 | because the way you're gonna get history done, in part,
01:23:00.940 | and I say done,
01:23:01.820 | the way you're gonna get it done
01:23:03.040 | is you're going to look at data
01:23:04.320 | and you're gonna let,
01:23:05.280 | you're gonna have the system
01:23:06.200 | that's gonna help you to analyze things,
01:23:07.480 | to help you to think about a better way to describe history
01:23:10.060 | and to understand what's going on
01:23:11.020 | and what it tells us about where we might be going.
01:23:13.480 | The same is true for psychology,
01:23:14.460 | same is true for all of these things.
01:23:16.120 | The reason I brought that up
01:23:17.040 | is because the philosopher has a lot to say about computing.
01:23:20.200 | The psychologist has a lot to say
01:23:22.060 | about the way humans interact with computing, right?
01:23:24.800 | And certainly a lot about intelligence,
01:23:26.400 | which, at least for me,
01:23:28.960 | ultimately is kind of the goal
01:23:30.660 | of building these computational devices
01:23:32.300 | is to build something intelligent.
01:23:33.620 | - Did you think computing will eat everything
01:23:36.380 | in some certain sense
01:23:37.500 | or almost like disappear because it's part of everything?
01:23:40.460 | - It's so funny you say this.
01:23:41.300 | I wouldn't say it's gonna metastasize,
01:23:42.860 | but there's kind of two ways that fields destroy themselves.
01:23:46.620 | One is they become super narrow,
01:23:48.680 | and I think we can think of fields that might be that way.
01:23:53.500 | They become pure.
01:23:54.860 | And we have that instinct, we have that impulse.
01:23:57.100 | I'm sure you can think of several people
01:23:58.660 | who want computer science to be this pure thing.
01:24:01.860 | The other way is you become everywhere
01:24:04.240 | and you become everything and nothing.
01:24:06.060 | And so everyone says,
01:24:07.440 | I'm gonna teach Fortran for engineers or whatever,
01:24:10.320 | I'm gonna do this.
01:24:11.420 | And then you lose the thing
01:24:12.660 | that makes it worth studying in and of itself.
01:24:15.460 | The thing about computing,
01:24:16.500 | and this is not unique to computing,
01:24:18.380 | though at this point in time,
01:24:19.620 | it is distinctive about computing,
01:24:21.820 | where we happen to be in 2020,
01:24:23.600 | is we are both a thriving major.
01:24:26.660 | In fact, the thriving major, almost every place.
01:24:29.260 | And we're a service unit,
01:24:33.340 | because people need to know the things we need to know.
01:24:36.220 | And our job, much as the mathematician's job,
01:24:38.600 | is to help this person over here
01:24:40.820 | to think like a mathematician,
01:24:41.880 | much the way the point of you taking chemistry as a freshman
01:24:45.840 | is not to learn chemistry,
01:24:47.000 | it's to learn to think like a scientist, right?
01:24:48.820 | Our job is to help them to think like a computationalist,
01:24:51.980 | and we have to take both of those things very seriously.
01:24:54.460 | And I'm not sure that as a field,
01:24:56.820 | we have historically taken the second thing,
01:24:59.180 | that our job is to help them to think a certain way.
01:25:01.940 | People who aren't gonna be our major,
01:25:02.780 | I don't think we've taken that very seriously at all.
01:25:06.020 | - I don't know if you know who Dan Carlin is,
01:25:07.700 | he has this podcast called "Hardcore History."
01:25:09.780 | - Yes.
01:25:10.620 | - I just did an amazing four-hour conversation with him,
01:25:14.380 | mostly about Hitler.
01:25:15.780 | But I bring him up because he talks about this idea
01:25:20.240 | that it's possible that history as a field will become,
01:25:24.340 | like currently, most people study history a little bit,
01:25:29.340 | kind of are aware of it, we have a conversation about it,
01:25:33.140 | different parts of it, I mean,
01:25:34.300 | there's a lot of criticism to say
01:25:35.700 | that some parts of history are being ignored,
01:25:37.300 | blah, blah, blah, so on.
01:25:38.780 | But most people are able to have a curiosity
01:25:42.620 | and able to learn it.
01:25:44.000 | His thought is it's possible,
01:25:47.380 | given the way social media works,
01:25:49.100 | the current way we communicate,
01:25:51.600 | that history becomes a niche field
01:25:53.900 | where literally most people just ignore,
01:25:57.260 | 'cause everything is happening so fast
01:25:59.300 | that the history starts losing its meaning
01:26:01.860 | and then it starts being a thing that only,
01:26:04.880 | like the theoretical computer science part
01:26:08.300 | of computer science, it becomes a niche thing
01:26:10.620 | that only the rare few hold the history of the World Wars
01:26:14.980 | and all the history, the founding of the United States,
01:26:19.500 | all those kinds of things, the Civil War.
01:26:22.420 | And it's a kind of profound thing to think about,
01:26:26.220 | how we can lose track, how we can lose these fields
01:26:31.220 | when they're best, like in the case of history,
01:26:34.340 | it's best for that to be a pervasive thing
01:26:36.940 | that everybody learns and thinks about and so on.
01:26:40.700 | And I would say computing is quite obviously
01:26:44.180 | similar to history in the sense that it seems
01:26:47.100 | like it should be a part of everybody's life to some degree,
01:26:51.540 | especially as we move into the later parts
01:26:54.380 | of the 21st century.
01:26:56.300 | And it's not obvious that that's the way it'll go.
01:26:59.140 | It might be in the hands of the few still.
01:27:02.020 | Depending on how it goes with machine learning,
01:27:04.780 | it's unclear that computing will win out.
01:27:09.100 | It's currently very successful, but it's not,
01:27:11.300 | I would say that's something,
01:27:13.060 | I mean, you're at the leadership level of this,
01:27:15.420 | you're defining the future, so it's in your hands.
01:27:18.540 | - No pressure.
01:27:19.380 | - But it feels like there's multiple ways
01:27:21.420 | this can go, and there's this kind of conversation
01:27:25.020 | of everybody should learn to code, right?
01:27:27.900 | The changing nature of jobs and so on.
01:27:30.160 | Do you have a sense of what your role
01:27:36.100 | in education of computing is here?
01:27:40.860 | Like what's the hopeful path forward?
01:27:42.940 | - There's a lot there.
01:27:43.780 | I will say that, well, first off,
01:27:45.620 | it would be an absolute shame if no one studied history.
01:27:49.740 | On the other hand, as T approaches infinity,
01:27:51.220 | the amount of history is presumably also growing,
01:27:53.740 | at least linearly, and so you have to forget
01:27:58.620 | more and more of history, but history
01:28:00.780 | needs to always be there.
01:28:01.620 | I mean, I can imagine a world where,
01:28:03.340 | if you think of your brains as being outside of your head,
01:28:06.400 | that you can kind of learn the history you need to know
01:28:08.740 | when you need to know it.
01:28:09.900 | That seems fanciful, but it's a kind of way of,
01:28:12.620 | is there a sufficient statistic of history?
01:28:15.720 | No, and there certainly isn't, but there may be
01:28:17.740 | for the particular thing you have to care about,
01:28:19.260 | but those who do not remember.
01:28:20.820 | - For our objective camera discussion, right?
01:28:23.220 | - Yeah, right, and we've already lost lots of history,
01:28:26.060 | and of course, you have your own history,
01:28:27.420 | that some of which will be, it's even lost to you.
01:28:29.980 | You don't even remember whatever it was
01:28:31.780 | you were doing 17 years ago.
01:28:33.420 | - All the ex-girlfriends.
01:28:34.540 | - Yeah.
01:28:35.380 | - Gone.
01:28:36.820 | - Exactly, so history's being lost anyway,
01:28:39.660 | but the big lessons of history shouldn't be,
01:28:41.820 | and I think to take it to the question
01:28:44.620 | of computing and sort of education,
01:28:46.260 | the point is you have to get across those lessons.
01:28:48.340 | You have to get across the way of thinking.
01:28:50.960 | And you have to be able to go back,
01:28:52.640 | and you don't wanna lose the data,
01:28:54.600 | even if you don't necessarily have the information
01:28:56.600 | at your fingertips.
01:28:57.880 | With computing, I think it's somewhat different.
01:29:00.640 | Everyone doesn't have to learn how to code,
01:29:02.800 | but everyone needs to learn how to think in the way
01:29:06.040 | that you can be precise, and I mean precise
01:29:08.240 | in the sense of repeatable, not just in the sense of
01:29:12.600 | resolution, of getting the right number of bits.
01:29:15.600 | In saying what it is you want the machine to do,
01:29:19.520 | and being able to describe a problem in such a way
01:29:22.280 | that it is executable, which we are not,
01:29:26.040 | human beings are not very good at that.
01:29:27.760 | In fact, I think we spend much of our time
01:29:29.240 | talking back and forth just to kind of vaguely understand
01:29:31.480 | what the other person means, and hope we get it good enough
01:29:33.760 | that we can act accordingly.
01:29:35.360 | You can't do that with machines, at least not yet.
01:29:38.320 | And so having to think that precisely
01:29:42.320 | about things is quite important,
01:29:45.240 | and that's somewhat different from coding.
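As a small, hypothetical illustration of "precise in the sense of repeatable": the vague request "pick the biggest ones" only becomes executable once every word is pinned down. The items, the meaning of "biggest," and the cutoff of two are all arbitrary choices made explicit here.

```python
# (name, weight in grams) -- made-up data for illustration
items = [("apple", 150), ("melon", 1200), ("grape", 5), ("pear", 180)]

def pick_biggest(items, how_many=2):
    # "Biggest" = heaviest by grams; "pick" = return exactly how_many of them.
    return sorted(items, key=lambda x: x[1], reverse=True)[:how_many]

print(pick_biggest(items))  # [('melon', 1200), ('pear', 180)]
```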
01:29:47.600 | Coding is a crude means to an end.
01:29:51.160 | On the other hand, the idea of coding,
01:29:53.720 | what that means, that it's a programming language
01:29:55.760 | and it has these sort of things that you fiddle with
01:29:58.160 | in these ways that you express,
01:29:59.560 | that is an incredibly important point.
01:30:01.080 | In fact, I would argue that one of the big holes
01:30:03.520 | in machine learning right now and in AI
01:30:05.080 | is that we forget that we are basically doing
01:30:08.720 | software engineering.
01:30:09.560 | We forget that we are doing, we are using programming.
01:30:13.440 | Like, we're using languages to express what we're doing.
01:30:15.400 | We get just so all caught up in the deep network,
01:30:17.280 | or we get all caught up in whatever,
01:30:18.360 | that we forget that we're making decisions
01:30:22.800 | based upon a set of parameters that we made up.
01:30:25.440 | And if we did slightly different parameters,
01:30:27.080 | we'd have completely different outcomes.
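A minimal sketch of the point about made-up parameters: the same data and the same code, but a slightly different threshold, a number someone simply chose, flips one of the decisions. The scores and thresholds here are arbitrary.

```python
scores = [0.48, 0.51, 0.55, 0.49]   # some model's raw outputs for four cases

def decisions(threshold: float) -> list:
    # The "model" is just: accept anything above the threshold.
    return [s > threshold for s in scores]

print(decisions(threshold=0.50))  # [False, True, True, False]
print(decisions(threshold=0.52))  # [False, False, True, False]
```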
01:30:29.080 | And so the lesson of computing, computer science education,
01:30:32.440 | is to be able to think like that,
01:30:34.760 | and to be aware of it when you're doing it.
01:30:36.760 | Basically, at the end of the day,
01:30:38.200 | it's a way of surfacing your assumptions.
01:30:41.360 | I mean, we call them parameters,
01:30:42.720 | or we call them if-then statements or whatever,
01:30:45.960 | but you're forced to surface those assumptions.
01:30:48.520 | That's the key thing that you should get out
01:30:50.720 | of a computing education, that and that the models,
01:30:52.600 | the languages, and the machines are equivalent.
01:30:54.040 | But it actually follows from that,
01:30:56.140 | that you have to be explicit about
01:30:58.800 | what it is you're trying to do,
01:30:59.760 | because the model you're building
01:31:02.000 | is something you will one day run.
01:31:04.360 | So you better get it right, or at least understand it,
01:31:06.720 | and be able to express roughly what you want to express.
01:31:09.740 | So I think it is key that we figure out
01:31:14.240 | how to educate everyone to think that way,
01:31:19.040 | because at the end, it will not only make them better
01:31:22.400 | at whatever it is that they are doing,
01:31:24.460 | and I emphasize doing,
01:31:27.480 | it'll also make them better citizens.
01:31:29.680 | It'll help them to understand what others are doing to them
01:31:33.080 | so that they can react accordingly,
01:31:35.440 | 'cause you're not gonna solve the problem of social media
01:31:37.720 | insofar as you think of social media as a problem
01:31:40.800 | by just making slightly better code, right?
01:31:43.920 | It only works if people react to it appropriately
01:31:47.640 | and know what's happening,
01:31:49.160 | and therefore take control over what they're doing.
01:31:52.560 | I mean, that's my take on it.
01:31:54.240 | - Okay, let me try to proceed awkwardly
01:31:58.920 | into the topic of race.
01:32:00.440 | - Okay.
01:32:01.600 | - One is because it's a fascinating part of your story,
01:32:03.800 | and you're just eloquent and fun about it,
01:32:05.560 | and then the second is because we're living
01:32:07.720 | through a pretty tense time in terms of race tensions
01:32:13.320 | and discussions and ideas in this time in America.
01:32:17.840 | You grew up in Atlanta, not born in Atlanta.
01:32:22.600 | In some southern state somewhere, Tennessee,
01:32:24.760 | something like that? - Tennessee.
01:32:25.660 | - Nice, okay.
01:32:26.680 | But early on, you moved, you basically,
01:32:31.480 | you identify as an Atlanta native, yeah,
01:32:35.920 | and you've mentioned that you grew up
01:32:39.640 | in a predominantly black neighborhood.
01:32:42.520 | By the way, black, African-American, person of color?
01:32:45.080 | - I prefer black. - Black.
01:32:46.560 | - With a capital B. - With a capital B.
01:32:49.400 | The other letters are-- - The rest of them,
01:32:51.320 | don't matter, it's a capital B.
01:32:52.960 | (Lex laughing)
01:32:54.680 | - Okay, so predominantly black neighborhood,
01:32:57.360 | and so you almost didn't see race.
01:32:59.360 | Maybe you can correct me on that.
01:33:01.720 | And then just in the video you talked about
01:33:04.680 | when you showed up to Georgia Tech for your undergrad,
01:33:08.040 | you were one of the only black folks there,
01:33:12.000 | and that was like, oh, that was a new experience.
01:33:14.520 | So can you take me from just a human perspective,
01:33:19.400 | but also from a race perspective,
01:33:21.100 | your journey growing up in Atlanta,
01:33:23.240 | and then showing up at Georgia Tech?
01:33:24.960 | - All right, that's easy.
01:33:25.840 | And by the way, that story continues through MIT as well.
01:33:28.800 | In fact, it was quite a bit more stark at MIT and Boston.
01:33:33.080 | - So maybe just a quick pause,
01:33:34.800 | Georgia Tech was undergrad, MIT was graduate school.
01:33:38.080 | - Mm-hmm, mm-hmm, and I went directly
01:33:39.640 | to grad school from undergrad,
01:33:40.800 | so I had no distractions in between
01:33:43.880 | my bachelor's and my master's and PhD.
01:33:45.760 | - You didn't go on a backpacking trip in Europe.
01:33:47.880 | - Didn't do any of that, didn't do it.
01:33:48.960 | In fact, I literally went to IBM for three months,
01:33:51.920 | got in a car, and drove straight to Boston with my mother,
01:33:54.720 | or Cambridge.
01:33:55.560 | Moved into an apartment I'd never seen over the Royal East.
01:34:00.840 | Anyway, that's another story.
01:34:02.320 | So let me tell you a little bit about--
01:34:03.640 | - You miss MIT?
01:34:05.120 | - Oh, I loved MIT.
01:34:06.280 | I don't miss Boston at all, but I loved MIT.
01:34:09.200 | - Them's fighting words, okay.
01:34:10.360 | - Oh, so let's back up to this.
01:34:12.440 | So as you said, I was born in Chattanooga, Tennessee.
01:34:14.960 | My earliest memory is arriving in Atlanta
01:34:16.680 | in a moving truck at the age of three and a half,
01:34:18.280 | so I think of myself as being from Atlanta.
01:34:20.360 | I have a very distinct memory of that.
01:34:21.520 | So I grew up in Atlanta,
01:34:22.640 | it's the only place I ever knew as a kid.
01:34:24.960 | I loved it.
01:34:25.800 | Like much of the country, and certainly much of Atlanta
01:34:29.560 | in the '70s and '80s, it was deeply, highly segregated,
01:34:33.240 | though not in a way that I think was obvious to you
01:34:36.200 | unless you were looking at it,
01:34:37.600 | or were old enough to have noticed it,
01:34:39.880 | but you could divide up Atlanta,
01:34:41.040 | and Atlanta's hardly unique in this way, by highway,
01:34:43.540 | and you could get race and class that way.
01:34:45.220 | So I grew up not only in a predominantly black area,
01:34:48.600 | to say the very least, I grew up on the poor side of that.
01:34:52.060 | But I was very much aware of race for a bunch of reasons,
01:34:57.640 | one, that people made certain that I was, my family did,
01:35:00.940 | but also that it would come up.
01:35:02.120 | So in first grade, I had a girlfriend.
01:35:07.120 | I say I had a girlfriend, I didn't have a girlfriend,
01:35:08.960 | I wasn't even entirely sure what girls were
01:35:10.760 | in the first grade, but I do remember she decided
01:35:13.880 | I was her boyfriend, this little white girl named Heather.
01:35:16.640 | And we had a long discussion about how it was okay
01:35:18.920 | for us to be boyfriend and girlfriend,
01:35:20.320 | despite the fact that she was white and I was black.
01:35:22.880 | - Between the two of you, or your parents?
01:35:24.760 | Did your parents know about this?
01:35:26.520 | - Yes, but being a girlfriend and boyfriend in first grade
01:35:29.800 | just basically meant that you spent
01:35:31.280 | slightly more time together during recess.
01:35:33.720 | It had no, I think we Eskimo kissed once.
01:35:35.720 | It doesn't mean, it didn't mean anything.
01:35:38.320 | It was, at the time, it felt very scandalous
01:35:40.400 | 'cause everyone was watching.
01:35:41.300 | I was like, ah, my life is, now my life has changed
01:35:44.080 | in first grade, no one told me elementary school
01:35:45.880 | would be like this.
01:35:46.720 | - Did you write poetry or?
01:35:47.840 | - Not in first grade, that would come later.
01:35:50.240 | That would come during puberty,
01:35:52.080 | when I wrote lots and lots of poetry.
01:35:54.280 | Anyway, so I was aware of it.
01:35:56.320 | I didn't think too much about it, but I was aware of it.
01:35:59.520 | But I was surrounded.
01:36:00.520 | It wasn't that I wasn't aware of race,
01:36:02.220 | it's that I wasn't aware that I was a minority.
01:36:05.480 | Different, and it's because I wasn't.
01:36:09.040 | As far as my world was concerned,
01:36:10.240 | I mean, I'm six years old, five years old in first grade.
01:36:13.120 | The world is the seven people I see every day.
01:36:15.640 | So it didn't feel that way at all.
01:36:18.040 | And by the way, this being Atlanta,
01:36:20.200 | home of the Civil Rights Movement and all the rest,
01:36:22.000 | it meant that when I looked at TV,
01:36:23.480 | which back then one did,
01:36:24.720 | 'cause there were only three, four, five channels, right?
01:36:27.040 | And I saw the news, which my mother might make me watch.
01:36:29.840 | Monica Kaufman was on TV telling me the news.
01:36:35.280 | And they were all black, and the mayor was black,
01:36:37.320 | and always been black.
01:36:38.160 | And so it just never occurred to me.
01:36:39.920 | When I went to Georgia Tech,
01:36:42.200 | I remember the first day walking across campus,
01:36:45.040 | from West Campus to East Campus,
01:36:47.360 | and realizing along the way
01:36:48.960 | that of the hundreds and hundreds and hundreds
01:36:50.560 | and hundreds of students that I was seeing,
01:36:52.400 | I was the only black one.
01:36:54.060 | That was enlightening and very off-putting,
01:36:58.200 | because it occurred to me.
01:36:59.560 | And then of course it continued that way for,
01:37:02.240 | well, for much of the rest of my career at Georgia Tech.
01:37:05.840 | Of course I found lots of other students.
01:37:07.400 | And I met people, 'cause in Atlanta,
01:37:09.100 | you're either black or you're white.
01:37:10.320 | There was nothing else.
01:37:12.200 | So I began to meet students of Asian descent,
01:37:14.480 | and I met students who we would call Hispanic,
01:37:16.680 | and so on and so forth.
01:37:17.640 | And so my world, this is what college is supposed to do,
01:37:19.680 | right, it's supposed to open you up to people, and it did.
01:37:23.040 | But it was a very strange thing to be in the minority.
01:37:28.040 | When I came to Boston, I will tell you a story.
01:37:32.160 | I applied to one place as an undergrad, Georgia Tech,
01:37:36.840 | because I was stupid, I didn't know any better.
01:37:38.800 | I just didn't know any better, right?
01:37:41.320 | No one told me.
01:37:42.640 | When I went to grad school, I applied to three places,
01:37:44.920 | Georgia Tech, because that's where I was, MIT, and CMU.
01:37:47.680 | I got into MIT, I got into CMU,
01:37:54.720 | but I had a friend who went to CMU.
01:37:57.920 | And so I asked him what he thought about it.
01:37:59.560 | He spent his time explaining to me about Pittsburgh,
01:38:01.880 | much less about CMU, but more about Pittsburgh,
01:38:04.720 | of which I developed a strong opinion
01:38:06.520 | based upon his strong opinion,
01:38:07.860 | something about the sun coming out two days out of the year.
01:38:11.080 | And I didn't get a chance to go there
01:38:12.360 | because the timing was wrong.
01:38:14.240 | I think it was because the timing was wrong.
01:38:16.700 | At MIT, I asked 20 people I knew,
01:38:20.240 | either when I visited or I had already known
01:38:23.600 | for a variety of reasons whether they liked Boston.
01:38:26.160 | And 10 of them loved it, and 10 of them hated it.
01:38:29.560 | The 10 who loved it were all white.
01:38:31.120 | The 10 who hated it were all black.
01:38:33.480 | And they explained to me very much why that was the case.
01:38:36.280 | Both sides told me why.
01:38:38.840 | And the stories were remarkably the same
01:38:40.920 | for the two clusters.
01:38:42.240 | And I came up here, and I could see it immediately,
01:38:45.840 | why people would love it and why people would not.
01:38:48.000 | - Why people tell you about the nice coffee shops.
01:38:51.400 | - Well, it wasn't coffee shops.
01:38:52.240 | It was CD, used CD places.
01:38:54.080 | But yeah, it was that kind of a thing.
01:38:56.000 | Nice shops, oh, there's all these students here.
01:38:57.840 | Harvard Square is beautiful.
01:38:59.520 | You can do all these things,
01:39:00.400 | and you can walk in something about the outdoors,
01:39:01.960 | which I wasn't the slightest bit interested in.
01:39:03.480 | The outdoors is for the bugs, it's not for humans.
01:39:05.960 | And the-- (Lex laughs)
01:39:08.320 | - That should be a T-shirt.
01:39:09.560 | - Yeah, I mean, it's the way I feel about it.
01:39:12.520 | And the black folk told me completely different stories
01:39:14.960 | about which part of town you did not wanna be caught in
01:39:18.040 | after dark, and I heard it all, but that was nothing new.
01:39:22.060 | So I decided that MIT was a great place to be
01:39:25.480 | as a university, and I believed it then, I believe it now.
01:39:29.680 | And that whatever it is I wanted to do,
01:39:31.280 | I thought I knew what I wanted to do,
01:39:32.360 | but what if I was wrong?
01:39:34.180 | Someone there would know how to do it.
01:39:36.160 | Of course, then I would pick the one topic
01:39:37.680 | that nobody was working on at the time, but that's okay.
01:39:41.520 | It was great, and so I thought that I would be fine.
01:39:43.200 | And I'd only be there for like four or five years.
01:39:45.640 | I told myself, which turned out not to be true at all.
01:39:48.620 | But I enjoyed my time, I enjoyed my time there.
01:39:50.520 | But I did see a lot of,
01:39:52.060 | I ran across a lot of things
01:39:55.800 | that were driven by what I look like while I was here.
01:39:59.560 | I got asked a lot of questions, I ran into a lot of cops.
01:40:03.000 | I saw a lot about the city.
01:40:06.320 | But at the time, I mean, I haven't been here a long time,
01:40:08.480 | these are the things that I remember.
01:40:09.580 | So this is 1990.
01:40:11.180 | There was not a single black radio station.
01:40:16.360 | Now this is 1990.
01:40:17.800 | I don't know if there are any radio stations anymore.
01:40:19.720 | I'm sure there are, but I don't listen to the radio anymore,
01:40:22.320 | and almost no one does,
01:40:23.480 | at least if you're under a certain age.
01:40:26.800 | But the idea is you could be in a major metropolitan area
01:40:28.840 | and there wasn't a single black radio station,
01:40:30.400 | by which I mean a radio station
01:40:31.560 | that played what we would call black music then,
01:40:33.960 | was absurd, but somehow captured
01:40:37.760 | kind of everything about the city.
01:40:39.840 | I grew up in Atlanta,
01:40:41.160 | and you've heard me tell you about Atlanta.
01:40:43.440 | Boston had no economically viable
01:40:48.240 | or socially cohesive black middle class.
01:40:50.560 | Insofar as it existed, it was uniformly distributed
01:40:55.160 | throughout large parts, not all parts,
01:40:56.600 | but large parts of the city.
01:40:57.940 | And where you had concentrations of black Bostonians,
01:41:02.320 | they tended to be poor.
01:41:04.400 | It was very different from where I grew up.
01:41:06.120 | I grew up on the poor side of town, sure,
01:41:07.960 | but then in high school, well, in ninth grade,
01:41:11.040 | we didn't have middle school.
01:41:12.960 | I went to an eighth grade school
01:41:14.120 | where there was a lot of, let's just say,
01:41:16.280 | we had a riot the year that I was there.
01:41:17.920 | There was at least one major fight every week.
01:41:19.820 | It was an amazing experience.
01:41:24.820 | But when I went to ninth grade, I went to an academy.
01:41:27.680 | - Math and-- - Math and Science Academy,
01:41:30.120 | Mays High, it was a public school.
01:41:31.880 | It was a magnet school, that's why I was able to go there.
01:41:34.640 | It was the first school, high school,
01:41:36.400 | I think, in the state of Georgia
01:41:37.480 | to sweep the state math and science fairs.
01:41:39.840 | It was great.
01:41:40.680 | It had 385 students, all but four of whom were black.
01:41:45.680 | I went to school with the daughter
01:41:50.360 | of the former mayor of Atlanta, Michael Jackson's cousin.
01:41:54.240 | I mean, you know, it was an upper middle class--
01:41:56.760 | - Dropping names. - Whatever.
01:41:58.120 | I just dropped names occasionally.
01:41:59.680 | You know, I dropped the mic, dropped some names,
01:42:01.920 | just to let you know I used to hang out
01:42:02.960 | with Michael Jackson's cousin.
01:42:04.440 | 12th cousin, nine times removed, I don't know.
01:42:06.200 | The point is, we had a parking problem
01:42:08.760 | 'cause the kids had cars.
01:42:10.080 | I did not come from a place where you had cars.
01:42:12.360 | I had my first car when I came to MIT, actually.
01:42:14.840 | So it was just a very different experience for me.
01:42:19.840 | But I'd been to places where, whether you were rich
01:42:24.660 | or whether you were poor, you could be black and rich
01:42:27.320 | or black and poor, and it was there,
01:42:28.440 | and there were places, and they were segregated
01:42:31.000 | by class as well as by race, but that existed.
01:42:34.240 | Here, at least when I was here,
01:42:35.720 | it didn't feel that way at all,
01:42:36.840 | and it felt like a bunch of,
01:42:38.120 | a really interesting contradiction.
01:42:40.260 | It felt like it was the interracial dating capital
01:42:44.240 | of the country.
01:42:47.080 | It really felt that way.
01:42:48.320 | But it also felt like the most racist place
01:42:51.440 | I ever spent any time.
01:42:54.240 | You couldn't go up the Orange Line, at that time.
01:42:56.580 | I mean, again, that was 30 years ago.
01:42:58.940 | I don't know what it's like now.
01:43:00.180 | But there were places you couldn't go,
01:43:01.780 | and you knew it, everybody knew it.
01:43:04.740 | And there were places you couldn't live,
01:43:07.580 | and everybody knew that.
01:43:08.780 | And that was just the greater Boston area in 1992.
01:43:12.820 | - Subtle racism or explicit racism?
01:43:14.900 | - Both, both.
01:43:16.860 | - In terms of within the institutions,
01:43:18.900 | did you feel, was there levels in which you were empowered
01:43:24.140 | to be first, or one of the first black people
01:43:28.080 | in a particular discipline,
01:43:29.620 | in some of these great institutions that you were a part of,
01:43:33.060 | you know, Georgia Tech or MIT?
01:43:35.640 | And was there a part where it was, it felt limiting?
01:43:38.920 | - I always felt empowered.
01:43:41.520 | Some of that was my own delusion, I think.
01:43:44.580 | But it worked out.
01:43:46.380 | So I never felt, in fact, quite the opposite.
01:43:50.260 | Not only did I not feel as if no one was trying to stop me,
01:43:54.340 | I had the distinct impression
01:43:55.420 | that people wanted me to succeed.
01:43:57.580 | By people, I meant the people in power.
01:44:00.900 | Not my fellow students, not that they didn't want me
01:44:02.460 | to succeed, but I felt supported,
01:44:07.020 | or at least that people were happy to see me succeed
01:44:10.420 | at least as much as anyone else.
01:44:12.260 | But you know, 1990, you're dealing with a different set
01:44:14.900 | of problems, right?
01:44:15.740 | You're very early, at least in computer science,
01:44:18.620 | you're very early in the sort of Jackie Robinson period.
01:44:21.540 | You know, there's this thing called
01:44:22.540 | Jackie Robinson syndrome, which is that you have to,
01:44:25.940 | you know, the first one has to be perfect,
01:44:27.860 | or has to be sure to succeed because if that person fails,
01:44:30.700 | no one else comes after for a long time.
01:44:32.440 | So, you know, it was kind of in everyone's best interest.
01:44:35.860 | But I think it came from a sincere place.
01:44:37.460 | I'm completely sure that people went out of their way
01:44:40.700 | to try to make certain that the environment would be good.
01:44:45.220 | Not just for me, but for the other people
01:44:46.780 | who of course were around.
01:44:47.620 | And I was hardly the only, I was the only person
01:44:49.300 | in the AI lab, but I wasn't the only person
01:44:52.220 | at MIT by a long shot.
01:44:54.060 | On the other hand, we're what?
01:44:55.860 | At that point, we would have been what?
01:44:57.220 | Less than 20 years away from the first black PhD
01:45:00.460 | to graduate from MIT, right?
01:45:02.780 | Shirley Jackson, right?
01:45:03.620 | 1971, something like that, somewhere around then.
01:45:07.020 | So we weren't that far away from the first first.
01:45:10.820 | And we were still another eight years away
01:45:12.700 | from the first black PhD in computer science, right?
01:45:15.900 | So we were in a, it was a sort of interesting time.
01:45:19.620 | But I did not feel as if the institutions
01:45:21.880 | of the university were against any of that.
01:45:26.880 | And furthermore, I felt as if there was enough
01:45:29.160 | of a critical mass across the institute
01:45:31.580 | from students and probably faculty,
01:45:34.120 | though I didn't know them, who wanted to make certain
01:45:37.060 | that the right thing happened.
01:45:38.580 | That's very different from the institutions
01:45:40.180 | of the rest of the city, which I think were designed
01:45:44.140 | in such a way that they felt no need to be supportive.
01:45:47.540 | - Let me ask a touchy question on that.
01:45:51.280 | So you kind of said that you didn't feel,
01:45:56.160 | you felt empowered.
01:46:00.180 | Is there some lesson, advice, in the sense
01:46:04.220 | that no matter what, you should feel empowered?
01:46:07.860 | You said, you used the word, I think, illusion or delusion.
01:46:12.060 | Is there a sense from the individual perspective
01:46:15.140 | where you should always kind of ignore, you know,
01:46:20.800 | the, ignore your own eyes,
01:46:23.160 | ignore the little forces that you are able
01:46:32.260 | to observe around you that are like trying
01:46:35.340 | to mess with you of whether it's jealousy,
01:46:37.400 | whether it's hatred in its pure form,
01:46:39.880 | whether it's just hatred in its like deluded form,
01:46:43.820 | all that kind of stuff, and just kind of see yourself
01:46:46.780 | as empowered and confident, all those kinds of things?
01:46:50.340 | - I mean, it certainly helps, but there's a trade-off, right?
01:46:52.540 | You have to be deluded enough to think
01:46:53.880 | that you can succeed.
01:46:55.020 | I mean, you can't get a PhD unless you're crazy enough
01:46:57.020 | to think you can invent something
01:46:58.820 | that no one else has come up with.
01:47:00.140 | I mean, that kind of massive delusion is that.
01:47:02.340 | So you have to be deluded enough to believe
01:47:03.700 | that you can succeed despite whatever odds you see
01:47:05.760 | in front of you, but you can't be so deluded
01:47:07.580 | that you don't think that you need to step out
01:47:09.020 | of the way of the oncoming train.
01:47:10.740 | - Right. - Right.
01:47:11.580 | So it's all a trade-off, right?
01:47:12.940 | You have to kind of believe in yourself.
01:47:14.220 | It helps to have a support group around you
01:47:16.540 | in some way or another.
01:47:18.060 | I was able to find that.
01:47:19.500 | I've been able to find that wherever I've gone,
01:47:21.860 | even if it wasn't necessarily on the floor that I was in.
01:47:24.100 | I had lots of friends when I was here.
01:47:25.500 | Many of them still live here,
01:47:27.660 | and I've kept up with many of them.
01:47:29.140 | So I felt supported, and certainly I had my mother
01:47:31.940 | and my family and those people back home
01:47:34.500 | that I could always lean back on,
01:47:36.980 | even if it were a long-distance call that cost money,
01:47:39.420 | which is not something that any of the kids today
01:47:42.100 | even know what I'm talking about.
01:47:43.100 | But back then it mattered.
01:47:44.580 | Calling my mom was an expensive proposition.
01:47:46.980 | But you have that, and it's fine.
01:47:48.300 | I think it helps.
01:47:49.340 | But you cannot be so deluded that you miss the obvious
01:47:52.540 | because it makes things slower,
01:47:54.140 | and it makes you think you're doing better than you are,
01:47:56.620 | and it will hurt you in the long run.
01:47:58.520 | - You mentioned cops.
01:48:02.540 | You tell a story of being pulled over.
01:48:05.060 | Perhaps it happened more than once.
01:48:07.100 | - More than once, for sure.
01:48:09.020 | - One, could you tell that story?
01:48:10.700 | And in general, can you give me a sense
01:48:14.180 | of what the world looks like
01:48:17.580 | when the law doesn't always look at you
01:48:20.140 | with the blank slate,
01:48:24.920 | with the objective eyes?
01:48:31.700 | I don't know how to say it more poetically.
01:48:33.960 | - Well, I guess the, I don't either.
01:48:35.940 | I guess the answer is it looks exactly the way it looks now
01:48:39.940 | because this is the world that we happen to live in.
01:48:42.620 | It's people clustering and doing the things that they do
01:48:46.220 | and making decisions based on one or two bits
01:48:49.620 | of information they find relevant,
01:48:51.460 | which, by the way, are all positive feedback loops,
01:48:54.340 | which makes it easier for you to believe
01:48:56.500 | what you believed before
01:48:57.340 | because you behave in a certain way that makes it true,
01:48:59.540 | and it goes on and circles and it cycles
01:49:01.100 | and it cycles and it cycles.
01:49:02.460 | So it's just about being on edge.
01:49:06.260 | I do not, despite having made it over 50 now.
01:49:10.160 | Despite-- - Congratulations, by the way.
01:49:13.620 | - God, I have a few gray hairs here and there.
01:49:16.260 | - You did pretty good.
01:49:17.420 | - I think, you know, I don't imagine I will ever
01:49:22.180 | see a police officer and not get very, very tense.
01:49:25.660 | Now, everyone gets a little tense
01:49:28.480 | because it probably means you're being pulled over
01:49:30.460 | for speeding or something,
01:49:32.140 | or you're gonna get a ticket or whatever, right?
01:49:34.340 | I mean, the interesting thing about the law in general
01:49:36.620 | is that most human beings' experience of it
01:49:38.980 | is fundamentally negative, right?
01:49:41.820 | You're only dealing with a lawyer
01:49:43.160 | if you're in trouble,
01:49:44.540 | except in a few very small circumstances, right?
01:49:47.620 | But, so that's just, that's an underlying reality.
01:49:50.220 | Now imagine that that's also
01:49:51.420 | at the hands of the police officer.
01:49:53.060 | I remember the time when I was,
01:49:54.580 | when I got pulled over that time,
01:49:57.540 | halfway between Boston and Wellesley, actually.
01:49:59.940 | I remember thinking,
01:50:02.940 | as he, when he pulled his gun on me,
01:50:06.380 | that if he shot me right now, he'd get away with it.
01:50:11.540 | That was the worst thing that I felt
01:50:13.920 | about that particular moment,
01:50:15.360 | is that if he shoots me now, he will get away with it.
01:50:18.760 | It would be years later when I realized,
01:50:21.400 | actually much worse than that,
01:50:22.960 | is that he'd get away with it,
01:50:27.680 | and if anyone, if it became a thing
01:50:29.760 | that other people knew about,
01:50:31.000 | odds were, would be, of course, that it wouldn't,
01:50:32.940 | but if it became a thing that other people knew about,
01:50:34.400 | if I was living in today's world
01:50:35.840 | as opposed to the world 30 years ago,
01:50:38.380 | that not only would he get away with it,
01:50:40.400 | but that I would be painted a villain.
01:50:43.840 | I was probably big and scary,
01:50:45.300 | and I probably moved too fast,
01:50:46.500 | and if only I'd done what he said,
01:50:47.740 | and da, da, da, da, da, da,
01:50:49.320 | which is somehow worse, right?
01:50:51.240 | You know, that hurts not just you, you're dead,
01:50:53.700 | but your family, and the way people look at you,
01:50:57.420 | and look at your legacy or your history,
01:51:00.020 | that's terrible, and it would work.
01:51:02.300 | I absolutely believe it would've worked had he done it.
01:51:04.860 | Now, he didn't.
01:51:05.700 | I don't think he wanted to shoot me.
01:51:06.540 | I don't think he felt like killing anybody.
01:51:08.180 | He did not go out that night expecting to do that,
01:51:10.260 | or planning on doing it,
01:51:11.240 | and I wouldn't be surprised if he never, ever did that,
01:51:14.280 | or ever even pulled his gun again.
01:51:15.640 | I don't know the man's name.
01:51:16.600 | I don't remember anything about him.
01:51:17.600 | I do remember the gun.
01:51:18.680 | Guns are very big when they're in your face.
01:51:20.120 | I can tell you this much.
01:51:20.960 | They're much larger than they seem.
01:51:23.160 | - And you were basically speeding or something like that?
01:51:25.680 | - He said I ran a light, I think.
01:51:26.840 | - Ran a light, sure.
01:51:27.680 | - I don't think I ran a light,
01:51:29.080 | but in fact, I may not have even gotten a ticket.
01:51:31.740 | I may have just gotten a warning.
01:51:33.280 | I think he was a little spooked, too.
01:51:34.120 | - But he pulled a gun.
01:51:35.240 | - Yeah, apparently I moved too fast or something.
01:51:38.160 | Rolled my window down before I should've.
01:51:39.800 | It was unclear.
01:51:40.640 | I think he thought I was gonna do something,
01:51:42.720 | or at least that's how he behaved.
01:51:44.800 | - So how, if we can take a little walk around your brain,
01:51:49.380 | how do you feel about that guy,
01:51:55.280 | and how do you feel about cops after that experience?
01:51:59.160 | - Well, I don't remember that guy,
01:52:01.160 | but my views on police officers
01:52:02.880 | is the same view I have about lots of things.
01:52:06.480 | Fire is an important and necessary thing in the world,
01:52:10.320 | but you must respect fire because it will burn you.
01:52:13.860 | Fire is a necessary evil in the sense that it can burn you,
01:52:19.000 | necessary in the sense that, you know, heat,
01:52:21.360 | and all the other things that we use fire for.
01:52:23.340 | So when I see a cop, I see a giant ball of flame,
01:52:28.340 | and I just try to avoid it.
01:52:31.880 | - And then some people might see a nice place,
01:52:34.120 | a nice thing to roast marshmallows with a family over.
01:52:37.960 | - Which is fine, I don't roast marshmallows.
01:52:40.120 | - Okay, so let me go a little darker, and I apologize.
01:52:43.120 | Just talked to Dan Carlin about it, we went for four hours.
01:52:45.120 | So sorry if I go dark here a little bit,
01:52:48.640 | but is it easy for this experience
01:52:53.600 | of just being careful with the fire
01:52:55.760 | and avoiding it to turn to hatred?
01:52:58.600 | - Yeah, of course.
01:52:59.760 | And one might even argue that it is a logical conclusion.
01:53:05.100 | On the other hand, you've got to live in the world,
01:53:07.620 | and I don't think it's helpful.
01:53:12.140 | Hate is something one should,
01:53:13.420 | I mean, hate is something that takes a lot of energy.
01:53:18.020 | So one should reserve it for when it is useful
01:53:21.940 | and not carried around with you all the time.
01:53:24.280 | Again, there's a big difference between the happy delusion
01:53:28.140 | that convinces you that you can actually get out of bed
01:53:30.140 | and make it to work today without getting hit by a car,
01:53:33.660 | and the sad delusion that means you can not worry
01:53:37.260 | about this car that is barreling towards you, right?
01:53:39.780 | So we all have to be a little deluded
01:53:41.900 | because otherwise we're paralyzed, right?
01:53:44.060 | But one should not be ridiculous.
01:53:46.460 | If we go all the way back to something you said earlier
01:53:48.580 | about empathy, I think what I would ask other people
01:53:53.580 | to get out of this one of many, many, many stories
01:54:00.140 | is to recognize that it is real.
01:54:03.160 | People would ask me to empathize with the police officer.
01:54:07.260 | I would quote back statistics saying that,
01:54:09.580 | you know, being a police officer isn't even
01:54:12.700 | in the top 10 most dangerous jobs in the United States,
01:54:14.860 | you're much more likely to get killed in a taxi cab.
01:54:17.540 | Half of police officers are actually killed by suicide.
01:54:21.940 | But that means their lives are something,
01:54:25.420 | something's going on there with them.
01:54:27.860 | And I would more than happy to be empathetic
01:54:30.100 | about what it is they go through
01:54:31.140 | and how they see the world.
01:54:32.780 | I think though that if we step back from what I feel,
01:54:37.220 | and we step back from what an individual police officer
01:54:39.500 | feels, you step up a level, and all this,
01:54:42.300 | because all things tie back into interactive AI.
01:54:45.500 | The real problem here is that we've built a narrative,
01:54:47.460 | we built a big structure that has made it easy
01:54:50.340 | for people to put themselves into different pots
01:54:53.240 | in the different clusters, and to basically
01:54:56.820 | forget that the people in the other clusters
01:54:59.540 | are ultimately like them.
01:55:02.300 | It is a useful exercise to ask yourself sometimes,
01:55:04.540 | I think, that if I had grown up in a completely
01:55:07.140 | different house, in a completely different household,
01:55:09.340 | as a completely different person, if I had been a woman,
01:55:12.140 | would I see the world differently?
01:55:13.660 | Would I believe what that crazy person over there believes?
01:55:16.700 | And the answer's probably yes,
01:55:19.820 | because after all, they believe it.
01:55:21.740 | And fundamentally, they're the same as you.
01:55:24.460 | So then what can you possibly do to fix it?
01:55:27.620 | How do you fix Twitter, if you think Twitter
01:55:29.460 | is broken, or Facebook,
01:55:31.180 | if you think Facebook is broken, how do you fix racism?
01:55:34.340 | How do you fix any of these things?
01:55:36.740 | It's all structural, right?
01:55:38.740 | It's not, I mean, individual conversations matter a lot,
01:55:41.720 | but you have to create structures that allow people
01:55:45.420 | to have those individual conversations all the time
01:55:47.720 | in a way that is relatively safe,
01:55:50.460 | and that allows them to understand that other people
01:55:52.700 | have had different experiences, but that ultimately
01:55:54.660 | we're the same, which sounds very,
01:55:56.420 | I don't even know what the right word is.
01:55:59.660 | I'm trying to avoid a word like saccharine.
01:56:01.060 | But it feels very optimistic.
01:56:04.900 | But I think that's okay.
01:56:08.620 | I think that's a part of the delusion,
01:56:09.980 | is you wanna be a little optimistic,
01:56:11.580 | and then recognize that the hard problem
01:56:13.100 | is actually setting up the structures in the first place,
01:56:14.900 | because it's in almost no one's interest
01:56:17.900 | to change the infrastructure.
01:56:20.540 | - Right, I tend to believe that leaders
01:56:22.860 | have a big role to that, of selling that optimistic delusion
01:56:27.100 | to everybody, and that eventually leads
01:56:29.620 | to the building of the structures.
01:56:31.340 | But that requires a leader that unites,
01:56:34.520 | sort of unites everybody on a vision,
01:56:36.740 | as opposed to divides on a vision,
01:56:38.580 | which is this particular moment in history feels
01:56:43.460 | like there's a non-zero probability, if we had to put a number on it,
01:56:49.380 | of something akin to a violent or a non-violent civil war.
01:56:53.620 | This is one of the most divisive periods
01:56:57.100 | of American history in recent,
01:56:58.700 | you can speak to this from perhaps a more knowledgeable
01:57:03.700 | and deeper perspective than me,
01:57:05.340 | but from my naive perspective,
01:57:06.980 | this seems like a very strange time.
01:57:09.740 | There's a lot of anger, and it has to do with people,
01:57:13.920 | I mean, for many reasons.
01:57:15.500 | One, the thing that's not spoken about,
01:57:17.660 | I think, much is the quiet economic pain of millions
01:57:22.660 | that's growing because of COVID,
01:57:28.500 | because of closed businesses, because of lost dreams.
01:57:32.100 | So that's building, whatever that tension is building.
01:57:35.700 | The other is, there seems to be
01:57:38.300 | an elevated level of emotion.
01:57:39.940 | I'm not sure if you can psychoanalyze
01:57:41.940 | where that's coming from, but this sort of,
01:57:44.620 | from which the protests and so on percolated.
01:57:47.300 | It's like, why now?
01:57:48.420 | Why this particular moment in history?
01:57:50.340 | - Oh, because enough time has passed.
01:57:52.220 | I mean, the very first race riots were in Boston,
01:57:54.780 | not to draw anything-- - Really?
01:57:57.380 | When?
01:57:58.220 | - Oh. - This is before--
01:57:59.060 | - Going way, I mean, like the 1700s or whatever, right?
01:58:01.500 | I mean, there was a massive one in New York.
01:58:03.700 | I mean, I'm talking way, way, way back when.
01:58:05.380 | So Boston used to be the hotbed of riots.
01:58:07.900 | It's just what Boston was all about,
01:58:10.220 | or so I'm told from history class.
01:58:12.420 | There's an interesting one in New York.
01:58:14.860 | I don't remember when that was.
01:58:15.700 | Anyway, the point is,
01:58:17.380 | basically you gotta get another generation,
01:58:23.140 | old enough to be angry, but not so old
01:58:25.260 | to remember what happened the last time, right?
01:58:28.140 | And that's sort of what happens.
01:58:29.380 | But you said like two completely,
01:58:32.420 | you said two things there that I think are worth unpacking.
01:58:35.300 | One has to do with this sort of moment in time,
01:58:38.740 | and why, why has this sort of built up?
01:58:43.620 | And the other has to do with a kind of,
01:58:45.620 | sort of the economic reality of COVID.
01:58:47.980 | So I'm actually, I want to separate those things,
01:58:49.940 | because for example, you know,
01:58:51.700 | this happened before COVID happened, right?
01:58:55.420 | So let's separate these two things for a moment.
01:58:57.820 | Now, let me preface all this by saying
01:59:01.140 | that although I am interested in history,
01:59:04.600 | one of my three minors as an undergrad was history,
01:59:07.660 | specifically history of the 1960s.
01:59:09.360 | - Interesting. - The other was Spanish.
01:59:12.260 | - Okay, that's a mistake.
01:59:14.060 | - Oh, I loved, I loved. - Okay.
01:59:16.380 | - And history of, and Spanish history, actually,
01:59:18.180 | but Spanish, and the other was
01:59:19.820 | what we would now call cognitive science,
01:59:21.140 | but at the time--
01:59:22.820 | - Oh, that's fascinating, interesting.
01:59:25.260 | - I minored in cog sci here for grad school.
01:59:28.300 | That was really, that was really fascinating.
01:59:30.820 | It was a very different experience
01:59:32.060 | from all the computer science classes I'd been taking,
01:59:34.260 | even the cog sci classes I was taking as an undergrad.
01:59:37.780 | Anyway, I'm not, I am a, I'm interested in history,
01:59:42.780 | but I'm hardly a historian, right?
01:59:45.060 | So, you know, forgive my, I will ask the audience
01:59:48.660 | to forgive my simplification.
01:59:50.500 | But I think the question that's always worth asking,
01:59:56.620 | and it's the same question,
01:59:59.320 | but a little different, is:
02:00:01.220 | not why now, but why not before, right?
02:00:05.220 | So why the 1950s, '60s civil rights movement
02:00:09.940 | as opposed to the 1930s, 1940s?
02:00:11.620 | Well, first off, there was a civil rights movement
02:00:12.980 | in the '30s and '40s, it just wasn't of the same character
02:00:16.220 | or quite as well-known.
02:00:17.380 | Post-World War II, lots of interesting things
02:00:19.820 | were happening.
02:00:20.660 | It's not as if a switch was turned on
02:00:22.700 | and Brown v. the Board of Education
02:00:24.940 | or the Montgomery Bus Boycott,
02:00:27.500 | and that's when it happened.
02:00:28.340 | These things have been building up forever
02:00:29.500 | and go all the way back and all the way back
02:00:30.980 | and all the way back, and, you know,
02:00:32.100 | Harriet Tubman was not born in 1950, right?
02:00:34.580 | So, you know, we can take these things--
02:00:36.100 | - It could have easily happened right after World War II.
02:00:38.900 | - Yes, I think, and again, I am not a scholar,
02:00:43.900 | I think that the big difference was TV.
02:00:48.100 | These things are visible.
02:00:50.860 | People can see them.
02:00:52.580 | It's hard to avoid, right?
02:00:55.580 | Why not James Farmer?
02:00:56.860 | Why Martin Luther King?
02:00:57.980 | 'Cause one was born 20 years after the other, whatever.
02:01:02.700 | I think it turns out that, you know what King's
02:01:06.180 | biggest failure was in the early days?
02:01:08.780 | It was in Georgia.
02:01:09.680 | You know, they were doing the usual thing,
02:01:14.320 | trying to integrate, and I forget the guy's name,
02:01:18.460 | but you can look this up, but he, a cop,
02:01:21.820 | he was a sheriff, made a deal with the whole state of Georgia.
02:01:24.980 | We're gonna take people and we are going to
02:01:26.500 | non-violently put them in trucks,
02:01:28.820 | and then we are going to take them
02:01:30.140 | and put them in jails very far away from here.
02:01:33.280 | And we're gonna do that, and we're not gonna,
02:01:35.200 | there'll be no reason for the press to hang around.
02:01:37.720 | And they did that, and it worked.
02:01:40.160 | And the press left, and nothing changed.
02:01:43.200 | So, next they went to Birmingham, Alabama,
02:01:45.240 | and Bull Connor, and you got to see on TV
02:01:49.240 | little boys and girls being hit with fire hoses
02:01:52.500 | and being knocked down, and there was outrage,
02:01:54.560 | and things changed, right?
02:01:56.260 | Part of the delusion is pretending that nothing bad
02:01:59.820 | is happening that might force you to do something big
02:02:01.680 | you don't want to do, but sometimes it gets put in your face
02:02:04.200 | and then you kind of can't ignore it.
02:02:05.760 | And a large part, in my view, of what happened, right,
02:02:09.840 | was that it was too public to ignore.
02:02:12.000 | Now, we created other ways of ignoring it.
02:02:14.640 | Lots of change happened in the South,
02:02:16.080 | but part of that delusion was that it wasn't gonna affect
02:02:17.860 | the West or the Northeast, and of course it did,
02:02:20.280 | and that caused its own set of problems,
02:02:21.740 | which went into the late '60s into the '70s,
02:02:23.880 | and in some ways we're living with that legacy now,
02:02:26.440 | and so on.
02:02:28.420 | So, why now, what's happening now?
02:02:30.600 | Why didn't it happen 10 years ago?
02:02:32.600 | I think it's people have more voices,
02:02:35.240 | there's not just more TV, there's social media,
02:02:37.480 | it's very easy for these things
02:02:38.760 | to kind of build on themselves,
02:02:40.320 | and things are just quite visible.
02:02:43.500 | And there's demographic change,
02:02:45.920 | I mean, the world is changing rapidly, right?
02:02:47.600 | And so it's very difficult.
02:02:49.200 | You're now seeing people you could have avoided seeing
02:02:50.920 | most of your life growing up in a particular time,
02:02:53.960 | and it's happening, it's dispersing at a speed
02:02:56.920 | that is fast enough to cause concern for some people,
02:03:00.080 | but not so fast as to cause a massive negative reaction.
02:03:03.240 | So that's that.
02:03:06.120 | On the other hand, and again,
02:03:07.720 | that's a massive oversimplification,
02:03:09.360 | but I think there's something there anyway,
02:03:11.000 | at least something worth exploring.
02:03:11.880 | I'm happy to be yelled at by a real historian.
02:03:14.080 | - Oh yeah, I mean, there's just the obvious thing,
02:03:17.400 | I mean, I guess you're implying, but not saying this,
02:03:20.480 | I mean, it seemed to have percolated the most
02:03:22.760 | with just a single video, for example,
02:03:24.600 | the George Floyd video.
02:03:26.160 | - It makes a huge difference.
02:03:27.960 | It's fascinating to think that whatever the mechanisms
02:03:32.600 | that put injustice in front of our face,
02:03:36.480 | like, directly in front of our face,
02:03:40.320 | those mechanisms are the mechanisms of change.
02:03:43.000 | - Yeah, on the other hand, Rodney King.
02:03:44.640 | So no one remembers this.
02:03:46.160 | I seem to be the only person who remembers this,
02:03:47.680 | but sometime before the Rodney King incident,
02:03:50.560 | there was a guy who was a police officer
02:03:52.640 | who was saying that things were really bad
02:03:55.600 | in Southern California, and he was gonna prove it
02:03:59.040 | by having some news, some camera people follow him around.
02:04:02.840 | And he says, "I'm gonna go into these towns
02:04:04.320 | "and just follow me for a week,
02:04:05.400 | "and you will see that I'll get harassed."
02:04:06.920 | And like the first night, he goes out there,
02:04:09.480 | he crosses into the city, some cops pull him over,
02:04:11.920 | and he's a police officer, remember.
02:04:14.420 | They don't know that, of course.
02:04:15.400 | They like shove his face through a glass window.
02:04:17.800 | This was on the news,
02:04:18.640 | like I distinctly remember watching this as a kid.
02:04:21.620 | Actually, I guess I wasn't a kid,
02:04:22.520 | I was in college at the time, I was in grad school at the time.
02:04:25.200 | - So that's not enough, like just--
02:04:27.880 | - Well, it disappeared.
02:04:28.720 | Like a day later, it didn't go viral.
02:04:30.740 | - Whatever that is, whatever that magic thing is.
02:04:34.040 | - And whatever it was in '92,
02:04:35.320 | it was harder to go viral in '92, right?
02:04:37.680 | Or '91, actually it must have been '90 or '91.
02:04:40.160 | But that happened, and like two days later,
02:04:42.160 | it's like it never happened.
02:04:43.000 | Like nobody, again, nobody remembers this,
02:04:44.720 | but I'm like the only person.
02:04:45.560 | Sometimes I think I must have dreamed it.
02:04:46.920 | Anyway, Rodney King happens, it goes viral,
02:04:50.200 | or the moral equivalent thereof at the time.
02:04:53.160 | And eventually, we get April 29th, right?
02:04:56.240 | And I don't know what the difference was
02:04:58.600 | between the two things,
02:04:59.440 | other than one thing caught on and one thing didn't.
02:05:01.440 | Maybe what's happening now
02:05:03.280 | is two things are feeding into one another.
02:05:06.000 | One is more people are willing to believe.
02:05:09.200 | And the other is there's easier and easier ways
02:05:11.800 | to give evidence.
02:05:13.060 | Cameras, body cams, or whatever.
02:05:15.480 | But we're still finding ourselves telling the same story.
02:05:17.440 | It's the same thing over and over again.
02:05:18.520 | I would invite you to go back and read the op-eds
02:05:21.520 | from what people were saying about
02:05:23.640 | the violence is not the right answer after Rodney King.
02:05:27.280 | And then go back to 1980 and the big riots
02:05:29.040 | that were happening around then,
02:05:30.320 | and read the same op-ed.
02:05:31.720 | It's the same words over and over and over again.
02:05:35.200 | I mean, there's your remembering history right there.
02:05:37.720 | I mean, it's like literally the same words.
02:05:39.080 | Like you could have just copied it,
02:05:39.920 | and I'm surprised no one got flagged for plagiarism.
02:05:43.160 | - It's interesting if you have an opinion
02:05:45.000 | on the question of violence,
02:05:46.720 | and the popular, perhaps, caricature
02:05:49.040 | of Malcolm X versus Martin Luther King.
02:05:53.320 | - You know Malcolm X was older than Martin Luther King?
02:05:55.800 | People kind of have it in their head that he's younger.
02:05:58.040 | Well, he died sooner, right?
02:06:00.960 | But only by a few years, right?
02:06:03.040 | People think of MLK as the older statesman,
02:06:05.640 | and they think of Malcolm X as the young, angry, whatever.
02:06:09.080 | But that's more of a narrative device.
02:06:11.280 | It's not true at all.
02:06:12.500 | I don't, I just, I reject the choice.
02:06:18.480 | I think it's a false choice.
02:06:19.640 | I think they're just things that happen.
02:06:20.880 | You just do, as I said, hatred is not,
02:06:23.540 | it takes a lot of energy.
02:06:25.240 | But every once in a while you have to fight.
02:06:27.320 | One thing I will say, without taking a moral position,
02:06:31.620 | which I will not take on this matter,
02:06:34.440 | violence has worked.
02:06:39.040 | - Yeah, that's the annoying thing.
02:06:42.480 | - That's the annoying thing.
02:06:43.680 | - It seems like over-the-top anger works.
02:06:48.440 | Outrage works.
02:06:50.000 | So you can say being calm and rational,
02:06:53.760 | just talking it out is gonna lead to progress,
02:06:57.120 | but it seems like if you just look through history,
02:07:00.740 | being irrationally upset is the way you make progress.
02:07:06.740 | - Well, it's certainly the way that you
02:07:08.600 | get someone to notice you.
02:07:09.800 | - Yeah, and that's--
02:07:11.240 | - And if they don't notice you,
02:07:12.080 | I mean, what's the difference between that and what,
02:07:13.880 | again, without taking a moral position on this,
02:07:15.600 | I'm just trying to observe history here.
02:07:17.120 | If you, maybe if television didn't exist,
02:07:20.520 | the civil rights movement doesn't happen,
02:07:22.360 | or it takes longer, or it takes a very different form.
02:07:25.000 | Maybe if social media doesn't exist,
02:07:27.220 | a whole host of things, positive and negative,
02:07:29.640 | don't happen, right?
02:07:31.160 | So, and what do any of those things do
02:07:33.720 | other than expose things to people?
02:07:38.320 | Violence is a way of shouting.
02:07:40.800 | I mean, many people far more talented and thoughtful
02:07:43.400 | than I have said this in one form or another, right?
02:07:46.000 | That violence is the voice of the unheard, right?
02:07:49.880 | I mean, it's a thing that people do
02:07:53.400 | when they feel as if they have no other option.
02:07:56.000 | And sometimes we agree, and sometimes we disagree.
02:07:59.680 | Sometimes we think they're justified.
02:08:01.040 | Sometimes we think they are not.
02:08:03.460 | But regardless, it is a way of shouting.
02:08:06.440 | And when you shout, people tend to hear you,
02:08:08.800 | even if they don't necessarily hear
02:08:10.000 | the words that you're saying.
02:08:10.880 | They hear that you were shouting.
02:08:12.800 | I see no way.
02:08:14.360 | So another way of putting it, which I think is less,
02:08:16.760 | let us just say, provocative, but I think is true,
02:08:21.760 | is that all change, particularly change
02:08:27.160 | that impacts power, requires struggle.
02:08:30.060 | The struggle doesn't have to be violent.
02:08:33.320 | But it's a struggle nonetheless.
02:08:37.440 | - The powerful don't give up power easily.
02:08:40.320 | - I mean, why should they?
02:08:43.440 | But even so, it still has to be a struggle.
02:08:45.000 | And by the way, this isn't just about violent, political,
02:08:48.180 | whatever, nonviolent political change, right?
02:08:49.920 | This is true for understanding calculus, right?
02:08:52.240 | I mean, everything requires a struggle.
02:08:53.880 | - We're back to talking about faculty hiring.
02:08:55.560 | - At the end of the day,
02:08:57.160 | it all comes down to faculty hiring.
02:08:58.520 | - And the godfather scene.
02:09:00.360 | - All a metaphor.
02:09:01.400 | Faculty hiring is a metaphor for all of life.
02:09:04.120 | - Let me ask a strange question.
02:09:07.040 | Do you think everything is gonna be okay in the next year?
02:09:13.240 | Do you have a hope that we're gonna be okay?
02:09:16.760 | - I tend to think that everything's gonna be okay,
02:09:18.860 | because I just tend to think
02:09:20.120 | that everything's gonna be okay.
02:09:21.800 | My mother says something to me a lot, and always has,
02:09:25.400 | and I find it quite comforting, which is,
02:09:27.000 | this too shall pass.
02:09:28.600 | And this too shall pass.
02:09:30.480 | Now, this too shall pass is not just
02:09:32.520 | this bad thing is going away.
02:09:35.760 | Everything passes.
02:09:36.760 | I mean, I have a 16-year-old daughter
02:09:38.560 | who's going to go to college,
02:09:41.100 | probably in about 15 minutes,
02:09:42.480 | given how fast she seems to be growing up.
02:09:44.600 | And I get to hang out with her now,
02:09:46.040 | but one day I won't.
02:09:47.560 | She'll ignore me just as much as I ignored my parents
02:09:50.160 | when I was in college and went to grad school.
02:09:51.560 | This too shall pass.
02:09:52.400 | But I think that one day, if we're all lucky,
02:09:55.920 | you live long enough to look back on something
02:09:57.640 | that happened a while ago, even if it was painful,
02:09:59.520 | and mostly, it's a memory.
02:10:02.260 | So yes, I think it'll be okay.
02:10:06.260 | - What about humans?
02:09:07.320 | Do you think we'll live through the 21st century?
02:10:11.520 | - I certainly hope so.
02:10:12.720 | - Are you worried about,
02:10:14.880 | are you worried that we might destroy ourselves
02:10:16.720 | with nuclear weapons, with AGI, with engineering?
02:10:20.080 | - I'm not worried about AGI doing it,
02:10:21.400 | but I am worried, I mean, at any given moment, right?
02:10:23.840 | Also, but you know, at any given moment, a comet could,
02:10:26.040 | I mean, you know, whatever.
02:10:27.320 | I tend to think that outside of things
02:10:30.440 | completely beyond our control,
02:10:32.300 | we have a better chance than not of making it.
02:10:36.960 | - You know, I talked to Alex Filippenko from Berkeley.
02:10:40.360 | He was talking about comets
02:10:42.000 | and that they can come out of nowhere,
02:10:43.540 | and that was a realization to me.
02:10:46.840 | Wow, we're just watching this darkness,
02:10:50.100 | and they can just enter, and then we have less than a month.
02:10:53.320 | - Yeah, and yet, you make it from day to day.
02:10:56.260 | - That one shall not pass.
02:10:59.340 | Well, maybe for Earth it'll pass, but not for humans.
02:11:02.240 | - But I'm just choosing to believe
02:11:04.320 | that it's going to be okay,
02:11:07.480 | and we're not gonna get hit by an asteroid,
02:11:09.620 | at least not while I'm around, and if we are,
02:11:11.840 | well, there's very little I can do about it,
02:11:13.560 | so I might as well assume it's not going to happen.
02:11:16.280 | - It makes food taste better.
02:11:17.800 | - It makes food taste better.
02:11:19.720 | - So you, out of the millions of things
02:11:22.960 | you've done in your life, you've also began
02:11:25.400 | the This Week in Black History calendar of facts.
02:11:28.860 | There's like a million questions I can ask here.
02:11:33.360 | You said you're not a historian, but is there,
02:11:38.200 | let's start at the big history question of,
02:11:41.680 | is there somebody in history, in black history,
02:11:45.260 | that you draw a lot of philosophical
02:11:49.220 | or personal inspiration from, or you just find interesting,
02:11:52.820 | or a moment in history you find interesting?
02:11:55.180 | - Well, I find the entirety of the '40s and the '60s
02:11:59.060 | and the civil rights movement that didn't happen
02:12:01.500 | and did happen at the same time during then
02:12:03.460 | quite inspirational.
02:12:04.540 | I mean, I've read quite a bit of the time period,
02:12:08.020 | at least I did in my younger days
02:12:10.000 | when I had more time to read as many things as I wanted to.
02:12:12.900 | What was quirky about This Week in Black History
02:12:17.140 | when I started in the '80s was how focused it was.
02:12:22.140 | It was because of the sources I was stealing from,
02:12:24.480 | and I was very much stealing from,
02:12:25.640 | so I'd take calendars, anything I could find,
02:12:28.160 | Google didn't exist, right, and I just pulled
02:12:29.600 | as much as I could and just put it together
02:12:30.960 | in one place for other people.
02:12:32.560 | What ended up being quirky about it,
02:12:33.760 | and I started getting people sending me information,
02:12:36.160 | was the inventors, people who,
02:12:40.240 | Garrett Morgan to Benjamin Banneker,
02:12:42.520 | people who were inventing things
02:12:45.040 | at a time when, how in the world
02:12:52.460 | did they manage to invent anything?
02:12:54.160 | Like, all these other things were happening,
02:12:56.280 | mother necessity, right, all these other things
02:12:57.720 | were happening, and there were so many
02:12:59.460 | terrible things happening around them,
02:13:00.760 | and they went to the wrong state at the wrong time,
02:13:02.680 | they may never come back,
02:13:04.160 | but they were inventing things we use, right?
02:13:06.360 | And it was always inspiring to me
02:13:09.080 | that people would still create,
02:13:11.400 | even under those circumstances.
02:13:14.560 | I got a lot out of that.
02:13:15.400 | I also learned a few lessons, I think,
02:13:17.440 | you know, the Charles Richard Drews of the world.
02:13:19.200 | You know, you create things that impact people,
02:13:24.200 | you don't necessarily get credit for them,
02:13:26.500 | and that's not right, but it's also okay.
02:13:30.460 | - You're okay with that?
02:13:31.640 | - Up to a point, yeah.
02:13:33.740 | I mean, look, in our world, all we really have is credit.
02:13:38.740 | - I was always bothered by how much value credit is given.
02:13:44.060 | - That's the only thing you got.
02:13:45.400 | I mean, if you're an academic in some sense,
02:13:47.140 | well, it isn't the only thing you've got,
02:13:48.240 | but it feels that way sometimes.
02:13:49.780 | - But you got the actual, we're all gonna be dead soon.
02:13:53.800 | You got the joy of having created.
02:13:56.520 | You know, the credit with Yann,
02:14:00.780 | I've talked to Jürgen Schmidhuber, right?
02:14:02.980 | The Turing Award given to three people for deep learning,
02:14:07.820 | and you could say that a lot of other people
02:14:10.580 | should be on that list.
02:14:11.660 | It's the Nobel Prize question.
02:14:13.700 | Yeah, it's sad.
02:14:14.820 | It's sad, and people like talking about it,
02:14:16.560 | but I feel like in the long arc of history,
02:14:20.300 | the only person who'll be remembered
02:14:21.860 | is Einstein, Hitler, maybe Elon Musk.
02:14:24.300 | And the rest of us are just like.
02:14:27.500 | - Well, you know, someone asked me about immortality once,
02:14:29.660 | and I said, and I stole this from somebody else,
02:14:32.460 | I don't remember who, but it was, you know,
02:14:35.420 | I asked him, "What's your great-grandfather's name?"
02:14:37.940 | Any of them.
02:14:38.780 | Of course, they don't know.
02:14:40.780 | Most of us do not know.
02:14:42.540 | I mean, I'm not entirely sure I know my grandparents' names,
02:14:45.020 | all my grandparents' names.
02:14:45.940 | I know what I called them, right?
02:14:47.700 | I don't know their middle names, for example.
02:14:49.800 | They did live in living memory, so I could find out.
02:14:53.900 | Actually, my grandfather didn't know when he was born.
02:14:56.540 | Had no idea how old he was, right?
02:14:59.180 | But I definitely don't know
02:15:00.620 | who any of my great-grandparents are.
02:15:02.740 | So in some sense, immortality is doing something,
02:15:06.020 | preferably positive, so that your great-grandchildren
02:15:08.500 | know who you are, right?
02:15:10.740 | And that's kind of what you can hope for,
02:15:12.260 | which is very depressing in some ways.
02:15:14.100 | I could turn it into something uplifting
02:15:16.860 | if you need me to, but it's--
02:15:17.860 | - Yeah, can you do the work here?
02:15:19.260 | - Yeah, it's simple, right?
02:15:21.180 | It doesn't matter.
02:15:22.100 | I don't have to know who my great-grandfather was
02:15:23.980 | to know that I wouldn't be here without him.
02:15:25.980 | - Yeah.
02:15:26.820 | - And I don't know who my great-grandchildren are,
02:15:29.300 | certainly who my great-great-grandchildren are,
02:15:31.180 | and I'll probably never meet them,
02:15:32.900 | although I would very much like to.
02:15:35.060 | But hopefully I'll set the world in motion
02:15:37.900 | in such a way that their lives will be better
02:15:39.700 | than they would have been if I hadn't done that.
02:15:41.340 | Well, certainly they wouldn't have existed
02:15:42.820 | if I hadn't done the things that I did.
02:15:44.420 | So I think that's a good positive thing.
02:15:47.060 | You live on through other people.
02:15:49.500 | - Are you afraid of death?
02:15:51.380 | - I don't know if I'm afraid of death, but I don't like it.
02:15:53.620 | (laughing)
02:15:55.380 | - Another T-shirt.
02:15:56.780 | (laughing)
02:15:58.500 | I mean, do you ponder it?
02:15:59.820 | Do you think about the--
02:16:01.660 | - Yes, the inevitability of oblivion?
02:16:04.420 | - Yes.
02:16:05.260 | - I do occasionally.
02:16:06.500 | This feels like a very Russian conversation, actually.
02:16:08.340 | - It's very, yeah.
02:16:09.740 | - I will tell you a story,
02:16:10.660 | a very, something that happened to me recently.
02:16:13.460 | If you look very carefully, you will see I have a scar.
02:16:15.980 | - Yes.
02:16:17.580 | - Which, by the way, is an interesting story of its own
02:16:20.000 | about why people who have half of their thyroid taken out,
02:16:22.020 | some people get scars and some don't.
02:16:24.300 | But anyway, I had half my thyroid taken out.
02:16:28.100 | The way I got there, by the way,
02:16:29.060 | is its own interesting story, but I won't go into it.
02:16:30.780 | Just suffice it to say, I did what I keep telling people
02:16:33.020 | you should never do, which is never go to the doctor
02:16:34.500 | unless you have to, because there's nothing good
02:16:36.280 | that's ever gonna come out of a doctor's visit, right?
02:16:38.000 | So I went to the doctor to look at one thing,
02:16:40.660 | this little bump I had on the side
02:16:41.940 | that I thought might be something bad
02:16:43.980 | because my mother made me, and I went there,
02:16:45.740 | and he's like, "Oh, it's nothing,
02:16:46.780 | "but by the way, your thyroid is huge.
02:16:48.380 | "Can you breathe?"
02:16:49.220 | "Yes, I can breathe."
02:16:50.040 | "Are you sure?
02:16:50.880 | "'Cause it's pushing on your windpipe.
02:16:51.720 | "You should be dead."
02:16:52.540 | "Ah!"
02:16:53.380 | So I was sitting there, and to look at my thyroid,
02:16:57.740 | it was growing.
02:16:58.820 | I had what's called a goiter,
02:17:00.220 | and he said, "We're gonna have to take it out
02:17:02.060 | "at some point."
02:17:02.900 | "When?"
02:17:03.720 | "Sometime before you're 85, probably,
02:17:05.420 | "but if you wait 'til you're 85, that'll be really bad
02:17:08.260 | "because you don't wanna have surgery
02:17:10.420 | "when you're 85 years old, if you can help it."
02:17:13.160 | Certainly not the kind of surgery it takes
02:17:14.600 | to take out your thyroid.
02:17:16.080 | So I went there, and we decided,
02:17:19.340 | I would decide I would put it off until December 19th
02:17:23.500 | because my birthday's December 18th,
02:17:25.180 | and I wouldn't be able to say I made it to 49 or whatever,
02:17:27.940 | so I said, "I'll wait 'til after my birthday."
02:17:30.220 | In the first six months of that, nothing changed.
02:17:34.980 | Apparently, in the next three months,
02:17:37.000 | it had grown, I hadn't noticed this at all.
02:17:40.380 | I went and had surgery.
02:17:42.660 | They took out half of it.
02:17:43.840 | The other half is still there,
02:17:44.860 | and it's working fine, by the way.
02:17:46.260 | I don't have to take a pill or anything like that.
02:17:48.060 | It's great.
02:17:49.740 | I'm in the hospital room, and the doctor comes in.
02:17:54.740 | I've got these things on my arm.
02:17:56.900 | They're gonna do whatever.
02:17:58.060 | They're talking to me, and the anesthesiologist says,
02:18:01.200 | "Huh, your blood pressure's through the roof.
02:18:02.500 | "Do you have high blood pressure?"
02:18:04.260 | I said, "No, but I'm terrified if that helps you at all."
02:18:08.100 | And the anesthetist, who's the nurse
02:18:10.220 | who supports the anesthesiologist, if I got that right,
02:18:13.460 | said, "Oh, don't worry about it.
02:18:14.300 | "I just put some stuff in your IV.
02:18:16.000 | "You're gonna be feeling pretty good in a couple minutes."
02:18:17.660 | And I remember turning and saying,
02:18:20.240 | "Well, I'm gonna feel pretty good in a couple minutes."
02:18:21.940 | Next thing I know, there's this guy,
02:18:24.700 | and he's moving my bed.
02:18:26.140 | And he's talking to me.
02:18:28.260 | I have this distinct impression that I've met this guy,
02:18:31.260 | and I should know what he's talking about,
02:18:34.060 | but I kind of just don't remember what just happened.
02:18:37.860 | And I look up, and I see the tiles going by,
02:18:39.700 | and I'm like, "Oh, it's just like in the movies
02:18:42.260 | "where you see the tiles go by."
02:18:43.980 | And then I have this brief thought
02:18:47.660 | that I'm in an infinitely long warehouse,
02:18:49.860 | and there's someone sitting next to me.
02:18:52.840 | And I remember thinking, "Oh, she's not talking to me."
02:18:54.980 | And then I'm back in the hospital bed.
02:18:57.260 | And in between the time where the tiles were going by
02:19:02.340 | and I got in the hospital bed,
02:19:03.700 | something like five hours had passed.
02:19:05.580 | Apparently, it had grown so much
02:19:07.020 | that it was a four and a half hour procedure
02:19:08.740 | instead of an hour long procedure.
02:19:10.380 | I lost a neck size and a half.
02:19:14.420 | It was pretty big.
02:19:15.260 | Apparently, it was as big as my heart.
02:19:17.380 | Why am I telling you this?
02:19:18.500 | I'm telling you this because--
02:19:20.460 | - It's a hell of a story already, so--
02:19:21.780 | - Between the tiles going by
02:19:24.660 | and me waking up in my hospital bed, no time passed.
02:19:29.260 | There was no sensation of time passing.
02:19:32.060 | When I go to sleep and I wake up in the morning,
02:19:35.020 | I have this feeling that time has passed,
02:19:36.780 | or this feeling that something
02:19:38.060 | has physically changed about me.
02:19:39.900 | Nothing happened between the time
02:19:42.180 | they put the magic juice in me
02:19:44.180 | and the time that I woke up, nothing.
02:19:46.500 | By the way, my wife was there with me talking.
02:19:48.860 | Apparently, I was also talking.
02:19:50.860 | I don't remember any of this,
02:19:52.340 | but luckily I didn't say anything I wouldn't normally say.
02:19:54.940 | My memory of it is I would talk to her
02:19:57.380 | and she would teleport around the room.
02:19:59.340 | And then I accused her of witchcraft
02:20:01.940 | and that was the end of that.
02:20:02.780 | But her point of view is I would start talking
02:20:05.820 | and then I would fall asleep
02:20:07.220 | and then I would wake up and leave off where I was before.
02:20:09.500 | I had no notion of any time passing.
02:20:11.300 | I kind of imagine that that's death.
02:20:14.380 | - Yeah.
02:20:16.660 | - Is the lack of sensation of time passing.
02:20:19.060 | And on the one hand, I am, I don't know,
02:20:23.700 | soothed by the idea that I won't notice.
02:20:27.060 | On the other hand, I'm very unhappy
02:20:28.780 | at the idea that I won't notice.
02:20:30.860 | So I don't know if I'm afraid of death,
02:20:34.620 | but I'm completely sure that I don't like it
02:20:36.820 | and that I particularly would prefer
02:20:39.180 | to discover on my own whether immortality sucks
02:20:42.860 | and be able to make a decision about it.
02:20:44.780 | That's what I would prefer.
02:20:46.260 | - You'd like to have a choice in the matter.
02:20:48.420 | - I would like to have a choice in the matter.
02:20:49.900 | - Well, again, on the Russian thing,
02:20:51.500 | I think the finiteness of it is the thing
02:20:54.060 | that gives it a little flavor, a little spice.
02:20:56.980 | So, immortality.
02:20:57.820 | - Well, in reinforcement learning, we believe that.
02:20:59.260 | That's why we have discount factors.
02:21:00.420 | Otherwise, it doesn't matter what you do.
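[A brief aside on the reinforcement learning reference, since it isn't unpacked in the conversation: the "discount factor" is the standard γ in the discounted return. A minimal sketch, assuming the usual textbook notation (these symbols are not used by either speaker):

G_t = \sum_{k=0}^{\infty} \gamma^k \, r_{t+k+1}, \qquad 0 \le \gamma < 1

With γ strictly below one, the infinite sum stays finite and near-term rewards carry more weight; with γ = 1 over an infinite horizon, many different behaviors yield the same unbounded return, which is the sense in which "otherwise, it doesn't matter what you do."]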
02:21:02.740 | - Amen.
02:21:03.620 | Well, let me, one last question
02:21:06.580 | sticking on the Russian theme.
02:21:09.360 | You talked about your great-grandparents
02:21:13.680 | not remembering their name.
02:21:15.760 | What do you think is the,
02:21:17.000 | in this kind of Markov chain that is life,
02:21:23.160 | what do you think is the meaning of it all?
02:21:27.000 | What's the meaning of life?
02:21:28.440 | - Well, in a world where eventually
02:21:31.120 | you won't know who your great-grandchildren are,
02:21:35.240 | I'm reminded of something I heard once,
02:21:39.240 | or I read once that I really like,
02:21:42.120 | which is it is well worth remembering
02:21:45.320 | that the entire universe, save for one trifling exception,
02:21:52.480 | is composed entirely of others.
02:21:54.440 | I think that's the meaning of life.
02:21:58.880 | - Charles, this was one of the best conversations
02:22:03.600 | I've ever had, and I get to see you tomorrow again
02:22:06.640 | to hang out with who looks to be one of the most,
02:22:11.640 | how should I say, interesting personalities
02:22:15.760 | that I'll ever get to meet, Michael Littman.
02:22:17.520 | So I can't wait.
02:22:19.180 | I'm excited to have had this opportunity.
02:22:21.440 | Thank you for traveling all the way here.
02:22:23.160 | It was amazing.
02:22:24.780 | I'm excited.
02:22:25.620 | I always loved Georgia Tech.
02:22:26.760 | I'm excited to see with you being involved there
02:22:29.800 | what the future holds.
02:22:30.640 | So thank you for talking today.
02:22:31.800 | - Thank you for having me.
02:22:32.640 | I appreciate every minute of it.
02:22:34.280 | - Thanks for listening to this conversation
02:22:35.720 | with Charles Isbell, and thank you to our sponsors,
02:22:39.000 | Neuro, the maker of functional sugar-free gum and mints
02:22:42.600 | that I use to give my brain a quick caffeine boost,
02:22:46.480 | Decoding Digital, a podcast on tech and entrepreneurship
02:22:49.960 | that I listen to and enjoy, Masterclass,
02:22:53.000 | online courses that I watch from some
02:22:55.620 | of the most amazing humans in history,
02:22:57.900 | and Cash App, the app I use to send money
02:23:00.680 | to friends for food and drinks.
02:23:03.680 | Please check out these sponsors in the description
02:23:06.000 | to get a discount and to support this podcast.
02:23:09.360 | If you enjoy this thing, subscribe on YouTube,
02:23:11.680 | review it with Five Stars on Apple Podcasts,
02:23:13.800 | follow on Spotify, support on Patreon,
02:23:16.360 | or connect with me on Twitter @lexfridman.
02:23:19.600 | And now let me leave you with some poetic words
02:23:22.820 | from Martin Luther King Jr.
02:23:24.540 | There comes a time when people get tired
02:23:28.600 | of being pushed out of the glittering sunlight
02:23:30.840 | of life's July and left standing amid the piercing chill
02:23:35.680 | of an alpine November.
02:23:37.620 | Thank you for listening, and hope to see you next time.
02:23:41.720 | (upbeat music)
02:23:44.300 | (upbeat music)